WorldWideScience

Sample records for source document compilation

  1. Health physics source document for codes of practice

    International Nuclear Information System (INIS)

    Pearson, G.W.; Meggitt, G.C.

    1989-05-01

    Personnel preparing codes of practice often require basic Health Physics information or advice relating to radiological protection problems, and this document is written primarily to supply such information. Certain technical terms used in the text are explained in the extensive glossary. Owing to the pace of change in the field of radiological protection, it is difficult to produce an up-to-date document. This document was compiled during 1988, however, and therefore reflects the principal changes brought about by the introduction of the Ionising Radiations Regulations (1985). The paper covers the nature of ionising radiation, its biological effects and the principles of control. It is hoped that the document will provide a useful source of information for both codes of practice and wider areas, and will stimulate readers to study radiological protection issues in greater depth. (author)

  2. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is a growing need for automated tools that can read, represent, analyze, and transform an application's source code to help carry out testing tasks. However, the support required to compile applications written in common general-purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between the testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize those aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well suited to the testing research community and the scale of the problems that community solves.
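
    As a toy illustration of the kind of source-based analysis such an infrastructure enables, the following sketch uses Python's standard ast module to count the loops in each function of a program, the sort of whole-program query a testing tool might build on. The example is generic and illustrative only; ROSE itself is a C++ infrastructure for C, C++, and Fortran, and its actual API is not shown here.

        import ast
        import textwrap

        # Program text to analyze; a stand-in for application source code.
        SOURCE = textwrap.dedent("""
            def total(xs):
                s = 0
                for x in xs:
                    s += x
                return s

            def identity(x):
                return x
        """)

        class LoopCounter(ast.NodeVisitor):
            """Counts for/while loops per function: a simple whole-program query."""

            def __init__(self):
                self.loops = {}
                self._current = None

            def visit_FunctionDef(self, node):
                self._current = node.name
                self.loops[node.name] = 0
                self.generic_visit(node)
                self._current = None

            def _count_loop(self, node):
                if self._current is not None:
                    self.loops[self._current] += 1
                self.generic_visit(node)

            visit_For = visit_While = _count_loop

        if __name__ == "__main__":
            counter = LoopCounter()
            counter.visit(ast.parse(SOURCE))
            print(counter.loops)  # {'total': 1, 'identity': 0}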

  3. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part IV: Normal and Inverted Letter 'h' and 'H' Architecture.

  4. Discussion on the compilation of document of ISO quality management system for radiation sterilization enterprises

    International Nuclear Information System (INIS)

    Li Chunhong; Ha Yiming; Zhou Hongjie; Feng Zhiguo; Wang Feng

    2006-01-01

    Based on the characteristics of radiation sterilization enterprises and on the requirements of ISO 9001, ISO 13485 and ISO 11137, the compilation of the quality manual, procedure documents and technological documents during certification of an ISO quality management system for radiation sterilization enterprises is discussed. (authors)

  5. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part III: B-Shaped Architecture with Vertical Well in the Upper Layer.

  6. Source list of nuclear data bibliographies, compilations, and evaluations

    International Nuclear Information System (INIS)

    Burrows, T.W.; Holden, N.E.

    1978-10-01

    To aid the user of nuclear data, many specialized bibliographies, compilations, and evaluations have been published. This document is an attempt to bring together a list of such publications with an indication of their availability and cost.

  7. Supplement to nuclear EQ sourcebook: A compilation of documents for nuclear equipment qualification

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    In the nuclear power industry, environmental and seismic qualification of safety-related electrical and instrumentation equipment is collectively known as Equipment Qualification (EQ). Related technology, as well as regulatory requirements, has evolved rapidly during the last 15 years. For environmental qualification, what began in 1971 with one trial-use guide (IEEE Std 323-1971) now stands as a full complement of Nuclear Regulatory Commission (NRC) rules, guides, and industry standards. Just as the original 1992 version of the Nuclear EQ Sourcebook was compiled to serve the user-community need for a complete and exhaustive single source of nuclear EQ documentation, this Supplement is published to provide the user community with the revisions and updates to nuclear EQ documentation that have appeared since the original volume was published in May 1992. The first volume included documents issued before December 1991. It was well received and prompted positive responses from industry on its usefulness, along with recommendations to improve its completeness and ease of use. This Supplement therefore includes new documents, revisions, and updates issued and published from December 1991 up to and including March 1993, and it incorporates various enhancements in order to better serve the user community. One such enhancement is the new Equipment Classification Index described in the User Guide section of this publication. Thirty-seven papers, including Bulletins, Federal rules, Generic Letters, Notices, Regulatory Guides, IEEE Standards, IEEE Recommended Practices, and IEEE Guides, have been processed separately for inclusion in the database.

  8. Reporting session of UWTF operation. Compilation of documents

    International Nuclear Information System (INIS)

    Shimizu, Kaoru; Togashi, Akio; Irinouchi, Shigenori

    1999-07-01

    This is a compilation of the papers and OHP transparencies presented, as well as the discussions and comments made, at the UWTF reporting session. UWTF stands for the Second Uranium Waste Treatment Facility, which was constructed to compress metallic wastes and used filters, part of the uranium-bearing solid wastes generated at the Tokai Works of the Japan Nuclear Cycle Development Institute. UWTF has been processing wastes since June 4, 1998. Based on this first year of UWTF operation, the session mainly discussed the difficulties encountered and suggestions to the waste generators. A brief summary of the UWTF construction, a description of the waste treatment process, and an operation report for fiscal year 1998 are attached. (A. Yamamoto)

  9. Nuclear EQ sourcebook: A compilation of documents for nuclear equipment qualification

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    In the nuclear power industry, environmental and seismic qualification of safety-related electrical and instrumentation equipment is collectively known as Equipment Qualification (EQ). Related technology, as well as regulatory requirements, has evolved rapidly during the last 15 years. For environmental qualification, what began in 1971 with one trial-use guide (IEEE Std 323-1971) now stands as a full complement of Nuclear Regulatory Commission (NRC) rules, guides and industry standards. In addition to the Institute of Electrical and Electronics Engineers (IEEE), the American Society of Mechanical Engineers (ASME) has also undertaken development of its own set of standards for use in qualifying safety-related mechanical equipment. To ensure that the original design and qualification are preserved, engineers need to select and use the correct set of NRC regulations, regulatory guides, industry standards, and generic correspondence. Given that the total number of these documents exceeds 200, this task becomes resource intensive. This compilation is the first known publication available to serve the user-community need for a complete and exhaustive single source. Approximately 180 items (Bulletins, Federal Rules, Generic Letters, Notices, Regulatory Guides, IEEE Standards, IEEE Recommended Practices, and IEEE Guides) have been processed separately for inclusion in the database.

  10. Analysis and classification of oncology activities on the way to workflow based single source documentation in clinical information systems.

    Science.gov (United States)

    Wagner, Stefan; Beckmann, Matthias W; Wullich, Bernd; Seggewies, Christof; Ries, Markus; Bürkle, Thomas; Prokosch, Hans-Ulrich

    2015-12-22

    Today, cancer documentation is still a tedious task involving many different information systems, even within a single institution, and it is rarely supported by appropriate documentation workflows. In a comprehensive 14-step analysis, we compiled diagnostic and therapeutic pathways for 13 cancer entities using a mixed approach of document analysis, workflow analysis, expert interviews, workflow modelling and feedback loops. These pathways were stepwise classified and categorized to create a final set of grouped pathways and workflows, including electronic documentation forms. A total of 73 workflows for the 13 entities, based on 82 paper documentation forms in addition to computer-based documentation systems, were compiled into a 724-page document comprising 130 figures, 94 tables and 23 tumour classifications as well as 12 follow-up tables. Stepwise classification made it possible to derive grouped diagnostic and therapeutic pathways for three major classes: solid entities with surgical therapy, solid entities with surgical and additional therapeutic activities, and non-solid entities. For these classes it was possible to deduce common documentation workflows to support workflow-guided single-source documentation. Clinical documentation activities within a Comprehensive Cancer Center can likely be realized in a set of three documentation workflows with conditional branching in a modern workflow-supporting clinical information system.

  11. Implementing the EuroFIR Document and Data Repositories as accessible resources of food composition information.

    Science.gov (United States)

    Unwin, Ian; Jansen-van der Vliet, Martine; Westenbrink, Susanne; Presser, Karl; Infanger, Esther; Porubska, Janka; Roe, Mark; Finglas, Paul

    2016-02-15

    The EuroFIR Document and Data Repositories are being developed as accessible collections of source documents, including grey literature, and the food composition data reported in them. These Repositories will contain source information available to food composition database compilers when selecting their nutritional data. The Document Repository was implemented as searchable bibliographic records in the Europe PubMed Central database, which links to the documents online. The Data Repository will contain original data from source documents in the Document Repository. Testing confirmed the FoodCASE food database management system as a suitable tool for the input, documentation and quality assessment of Data Repository information. Data management requirements for the input and documentation of reported analytical results were established, including record identification and method documentation specifications. Document access and data preparation using the Repositories will provide information resources for compilers, eliminating duplicated work and supporting unambiguous referencing of data contributing to their compiled data. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Compilation of Sandia Laboratories technical capabilities

    International Nuclear Information System (INIS)

    Lundergan, C.D.; Mead, P.L.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078).

  13. Compilation of Sandia Laboratories technical capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lundergan, C. D.; Mead, P. L. [eds.]

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078). (RWR)

  14. Compiler Feedback using Continuous Dynamic Compilation during Development

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Probst, Christian W.

    2014-01-01

    to optimization. This tool can help programmers understand what the optimizing compiler has done and suggest automatic source code changes in cases where the compiler refrains from optimizing. We have integrated our tool into an integrated development environment, interactively giving feedback as part...

  15. HAL/S-FC compiler system specifications

    Science.gov (United States)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  16. Using Primary Source Documents.

    Science.gov (United States)

    Mintz, Steven

    2003-01-01

    Explores the use of primary sources when teaching about U.S. slavery. Includes primary sources from the Gilder Lehrman Documents Collection (New York Historical Society) to teach about the role of slaves in the Revolutionary War, such as a proclamation from Lord Dunmore offering freedom to slaves who joined his army. (CMK)

  17. OMPC: an Open-Source MATLAB-to-Python Compiler.

    Science.gov (United States)

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.
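
    To make the translation idea concrete, the following sketch pairs a small MATLAB® function with a hand-written Python/NumPy rendering of the kind that syntax adaptation produces. The function and its translation are illustrative assumptions for this example only; they are not OMPC's actual output or API.

        import numpy as np

        # MATLAB original (illustrative):
        #   function y = smooth(x, n)
        #       k = ones(1, n) / n;
        #       y = conv(x, k, 'same');
        #   end

        def smooth(x, n):
            # ones(1, n) / n maps to np.ones(n) / n, and conv(..., 'same')
            # maps to np.convolve(..., mode='same'); a real translator must
            # also reconcile 1-based indexing and row/column-vector semantics.
            k = np.ones(n) / n
            return np.convolve(x, k, mode="same")

        if __name__ == "__main__":
            x = np.array([1.0, 4.0, 2.0, 8.0, 5.0])
            print(smooth(x, 3))  # 3-point moving average of the signal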

  18. Domain-Specific Acceleration and Auto-Parallelization of Legacy Scientific Code in FORTRAN 77 using Source-to-Source Compilation

    OpenAIRE

    Vanderbauwhede, Wim; Davidson, Gavin

    2017-01-01

    Massively parallel accelerators such as GPGPUs, manycores and FPGAs represent a powerful and affordable tool for scientists who look to speed up simulations of complex systems. However, porting code to such devices requires a detailed understanding of heterogeneous programming tools and effective strategies for parallelization. In this paper we present a source to source compilation approach with whole-program analysis to automatically transform single-threaded FORTRAN 77 legacy code into Ope...

  19. 1988 Bulletin compilation and index

    International Nuclear Information System (INIS)

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  20. 1988 Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  21. TWRS configuration management requirement source document

    International Nuclear Information System (INIS)

    Vann, J.M.

    1997-01-01

    The TWRS Configuration Management (CM) Requirement Source document prescribes CM as a basic product life-cycle function by which work and activities are conducted or accomplished. This document serves as the requirements basis for the TWRS CM program. The objective of the TWRS CM program is to establish consistency among requirements, physical/functional configuration, information, and documentation for TWRS and TWRS products, and to maintain this consistency throughout the life-cycle of TWRS and the product, particularly as changes are being made.

  22. Supplemental Information Source Document Waste Management

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Craig [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Halpern, Jonathan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wrons, Ralph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Reiser, Anita [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mond, Michael du [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shain, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

    This Supplemental Information Source Document for Waste Management was prepared in support of future analyses including those that may be performed as part of the Sandia National Laboratories, New Mexico (SNL/NM) Site-Wide Environmental Impact Statement. This document presents information about waste management practices at SNL/NM, including definitions, inventory data, and an overview of current activities.

  23. A Symmetric Approach to Compilation and Decompilation

    DEFF Research Database (Denmark)

    Ager, Mads Sig; Danvy, Olivier; Goldberg, Mayer

    2002-01-01

    Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice of...

  24. Guide to Good Practice in using Open Source Compilers with the AGCC Lexical Analyzer

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Quality software always demands a compromise between users' needs and hardware resources: to be faster means expensive devices like powerful processors and virtually unlimited amounts of RAM, or reengineering the code to adapt that piece of software to the client's hardware architecture. This is the purpose of optimizing code: to get the utmost software performance from a program under given conditions. There are tools for designing and writing code, but the ultimate tool for optimization remains the modest compiler, an often neglected software jewel that is the result of hundreds of working hours by the best specialists in the world. Even so, only two compilers fulfill the needs of professional developers: a proprietary solution from a giant of the IT industry, and the open-source GNU compiler, for which we developed the AGCC lexical analyzer, which helps produce even more efficient software applications. It relies on the most popular hacks and tricks used by professionals and discovered by the author, who is proud to present them below.

  25. A Compilation of Internship Reports - 2012

    Energy Technology Data Exchange (ETDEWEB)

    Stegman M.; Morris, M.; Blackburn, N.

    2012-08-08

    This compilation documents all research projects undertaken by the summer 2012 Department of Energy Workforce Development for Teachers and Scientists interns during their internship program at Brookhaven National Laboratory.

  26. Overview of Historical Earthquake Document Database in Japan and Future Development

    Science.gov (United States)

    Nishiyama, A.; Satake, K.

    2014-12-01

    In Japan, damage and disasters from historical large earthquakes have been documented and preserved. Compilation of historical earthquake documents started in the early 20th century, and 33 volumes of historical document source books (about 27,000 pages) have been published. However, these source books are not effectively utilized by researchers, owing to contamination by low-reliability historical records and the difficulty of searching by keyword and date. To overcome these problems and to promote historical earthquake studies in Japan, construction of text databases began in the 21st century. For historical earthquakes from the beginning of the 7th century to the early 17th century, the "Online Database of Historical Documents in Japanese Earthquakes and Eruptions in the Ancient and Medieval Ages" (Ishibashi, 2009) has already been constructed. Its compilers investigated the source books or original texts of historical literature, emended the descriptions, and assigned a reliability to each historical document on the basis of its written age. Another project compiled the historical documents for seven damaging earthquakes that occurred along the Sea of Japan coast in Honshu, central Japan, in the Edo period (from the beginning of the 17th century to the middle of the 19th century) and constructed text and seismic-intensity databases, which are now available on the web (in Japanese only). However, only about 9% of the earthquake source books have been digitized so far. We therefore plan to digitize all of the remaining historical documents under a research program that started in 2014. The specification of this database will be similar to those of the previous ones. We also plan to combine this database with a liquefaction-traces database, to be constructed by another research program, by adding the location information described in historical documents. The constructed database would be utilized to estimate the distributions of seismic intensities and tsunami

  27. ENDF/B summary documentation

    International Nuclear Information System (INIS)

    Kinsey, R.

    1979-07-01

    This publication provides a localized source of descriptions for the evaluations contained in the ENDF/B Library. The summary documentation presented is intended to be a more detailed description than the (File 1) comments contained in the computer-readable data files, but not so detailed as the formal reports describing each ENDF/B evaluation. The summary documentation was written by the CSEWG (Cross Section Evaluation Working Group) evaluators and compiled by the NNDC (National Nuclear Data Center). This edition includes documentation for materials found on ENDF/B Version V tapes 501 to 516 (General Purpose File), excluding tape 504. ENDF/B-V also includes tapes containing partial evaluations for the Special Purpose Actinide (521, 522), Dosimetry (531), Activation (532), Gas Production (533), and Fission Product (541-546) files. The materials found on these tapes are documented elsewhere. Some of the evaluation descriptions in this report contain cross sections or energy level information.

  28. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature

    International Nuclear Information System (INIS)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  29. Charged particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, C.

    1999-01-01

    We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal reason for setting up the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates, issued by Fowler and collaborators. The main goal of the NACRE network was transparency in the procedure of calculating the rates. More specifically, this compilation aims at: 1. updating the experimental and theoretical data; 2. distinctly identifying the sources of the data used in the rate calculations; 3. evaluating the uncertainties and errors; 4. providing numerically integrated reaction rates; 5. providing reverse reaction rates and analytical approximations of the adopted rates. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given, and the corresponding reaction rates are calculated and given in tabular form. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. The compilation is concerned with reaction rates that are large enough for target lifetimes shorter than the age of the Universe, taken equal to 15 × 10⁹ y. The reaction rates are provided for temperatures lower than T = 10¹⁰ K. In parallel with the rate compilation, a cross section database has been created and located at the site http://pntpm.ulb.ac.be/nacre..htm. (authors)
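
    The numerically integrated rates mentioned above are obtained by folding a cross section with a Maxwell-Boltzmann energy distribution. The following sketch is a minimal illustration in dimensionless units with a toy non-resonant cross section (constant astrophysical S-factor); the functional form and parameter values are assumptions for the example, not the NACRE procedure itself.

        import numpy as np

        def rate(sigma, kT, mu=1.0, n=200000):
            """Maxwellian-averaged reaction rate per particle pair:
            <sigma*v> = sqrt(8/(pi*mu)) * (kT)**-1.5
                        * integral of sigma(E) * E * exp(-E/kT) dE over E >= 0,
            evaluated here in consistent, dimensionless units."""
            E = np.linspace(1e-6, 40.0 * kT, n)  # grid covering the Gamow peak
            integrand = sigma(E) * E * np.exp(-E / kT)
            return np.sqrt(8.0 / (np.pi * mu)) * kT**-1.5 * np.trapz(integrand, E)

        if __name__ == "__main__":
            # Toy non-resonant cross section sigma(E) = (S/E) * exp(-b/sqrt(E)):
            # a Gamow penetration factor with constant S-factor S and barrier
            # parameter b (both hypothetical values).
            b, S = 5.0, 1.0
            sigma = lambda E: (S / E) * np.exp(-b / np.sqrt(E))
            for kT in (0.1, 0.3, 1.0):
                print(f"kT = {kT}: <sigma*v> = {rate(sigma, kT):.3e}")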

  30. Indexed compilation of experimental high energy physics literature

    International Nuclear Information System (INIS)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is presented, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  31. ENDF/B summary documentation

    Energy Technology Data Exchange (ETDEWEB)

    Kinsey, R. (comp.)

    1979-07-01

    This publication provides a localized source of descriptions for the evaluations contained in the ENDF/B Library. The summary documentation presented is intended to be a more detailed description than the (File 1) comments contained in the computer-readable data files, but not so detailed as the formal reports describing each ENDF/B evaluation. The summary documentation was written by the CSEWG (Cross Section Evaluation Working Group) evaluators and compiled by the NNDC (National Nuclear Data Center). This edition includes documentation for materials found on ENDF/B Version V tapes 501 to 516 (General Purpose File), excluding tape 504. ENDF/B-V also includes tapes containing partial evaluations for the Special Purpose Actinide (521, 522), Dosimetry (531), Activation (532), Gas Production (533), and Fission Product (541-546) files. The materials found on these tapes are documented elsewhere. Some of the evaluation descriptions in this report contain cross sections or energy level information. (RWR)

  32. A Note on Compiling Fortran

    Energy Technology Data Exchange (ETDEWEB)

    Busby, L. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-01

    Fortran modules tend to serialize compilation of large Fortran projects by introducing dependencies among the source files. If file A depends on file B (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran and IBM xlf, at least) offer an option to "verify syntax", with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: the first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8× speedup in the case study below.
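
    The two-pass scheme can be driven by a small script outside the compiler. The sketch below is a minimal illustration assuming gfortran, whose -fsyntax-only option performs the fast syntax pass that also emits .mod files, and two hypothetical source files, b.f90 (defining a module) and a.f90 (using it); a real build would derive the file list and dependency order from the project.

        import subprocess
        from concurrent.futures import ThreadPoolExecutor

        # Hypothetical sources: b.f90 defines a module that a.f90 uses.
        SOURCES = ["b.f90", "a.f90"]

        def syntax_pass(src):
            # Pass 1 (serial, fast): syntax check only; as a side effect,
            # gfortran writes the .mod files that dependent sources need.
            subprocess.run(["gfortran", "-fsyntax-only", src], check=True)

        def compile_object(src):
            # Pass 2 (parallel): full code generation. The module files
            # already exist, so module dependencies no longer serialize
            # this pass.
            subprocess.run(["gfortran", "-c", "-O2", src], check=True)

        if __name__ == "__main__":
            for src in SOURCES:                 # pass 1, in dependency order
                syntax_pass(src)
            with ThreadPoolExecutor() as pool:  # pass 2, all files at once
                list(pool.map(compile_object, SOURCES))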

  33. Documentation of methods and inventory of irrigation data collected for the 2000 and 2005 U.S. Geological Survey Estimated use of water in the United States, comparison of USGS-compiled irrigation data to other sources, and recommendations for future compilations

    Science.gov (United States)

    Dickens, Jade M.; Forbes, Brandon T.; Cobean, Dylan S.; Tadayon, Saeid

    2011-01-01

    Every five years since 1950, the U.S. Geological Survey (USGS) National Water Use Information Program (NWUIP) has compiled water-use information in the United States and published a circular report titled "Estimated use of water in the United States," which includes estimates of water withdrawals by State, sources of water withdrawals (groundwater or surface water), and water-use category (irrigation, public supply, industrial, thermoelectric, and so forth). This report discusses the impact of important considerations when estimating irrigated acreage and irrigation withdrawals, including estimates of conveyance loss, irrigation-system efficiencies, pasture, horticulture, golf courses, and double cropping.

  34. National Energy Strategy: A compilation of public comments; Interim Report

    Energy Technology Data Exchange (ETDEWEB)

    1990-04-01

    This Report presents a compilation of what the American people themselves had to say about problems, prospects, and preferences in energy. The Report draws on the National Energy Strategy public hearing record and accompanying documents. In all, 379 witnesses appeared at the hearings to exchange views with the Secretary, Deputy Secretary, and Deputy Under Secretary of Energy, and Cabinet officers of other Federal agencies. Written submissions came from more than 1,000 individuals and organizations. Transcripts of the oral testimony and question-and-answer (Q-and-A) sessions, as well as prepared statements submitted for the record and all other written submissions, form the basis for this compilation. Citations of these sources in this document use a system of identifying symbols explained below and in the accompanying box. The Report is organized into four general subject areas concerning: (1) efficiency in energy use, (2) the various forms of energy supply, (3) energy and the environment, and (4) the underlying foundations of science, education, and technology transfer. Each of these, in turn, is subdivided into sections addressing specific topics --- such as (in the case of energy efficiency) energy use in the transportation, residential, commercial, and industrial sectors, respectively. 416 refs., 44 figs., 5 tabs.

  35. Indexed compilation of experimental high energy physics literature. [Synopsis]

    Energy Technology Data Exchange (ETDEWEB)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is presented, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  36. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    Business file compilation for an enterprise is a distillation and re-creation of its intellectual wealth, through which applicable information can be made available to those who want to use it in a fast, extensive and precise way. Proceeding from the effects of business file compilation on scientific research, production and development, this paper discusses in five points how to define topics, analyze historical materials, search and select data, and process them into an enterprise archives collection. Firstly, it expounds the importance and necessity of business file compilation in the production, operation and development of a company. Secondly, it presents processing methods from topic definition, material searching and data selection through to final examination and correction. Thirdly, it defines principles and classifications so that different categories and levels of processing methods are available for business file compilation. Fourthly, it discusses the specific method of implementing a file compilation through a documentation collection, on the principle that topic definition should match demand. Fifthly, it addresses the application of information technology to business file compilation, in view of the wide need for business files, so as to raise the level of enterprise archives management. The present discussion focuses on the examination and correction principles of enterprise historical material compilation, the basic classifications, and the major forms of business file compilation achievements. (author)

  37. HAL/S-360 compiler test activity report

    Science.gov (United States)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  38. 2011 Addendum to the SNL/NM SWEIS Supplemental Information Source Documents

    Energy Technology Data Exchange (ETDEWEB)

    Dimmick, Ross [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

    This document contains updates to the Supplemental Information Sandia National Laboratories/New Mexico Site-Wide Environmental Impact Statement Source Documents that were developed in 2010. In general, this addendum provides calendar year 2010 data, along with changes or additions to text in the original documents.

  39. Groundwater-quality data associated with abandoned underground coal mine aquifers in West Virginia, 1973-2016: Compilation of existing data from multiple sources

    Science.gov (United States)

    McAdoo, Mitchell A.; Kozar, Mark D.

    2017-11-14

    This report describes a compilation of existing water-quality data associated with groundwater resources originating from abandoned underground coal mines in West Virginia. Data were compiled from multiple sources for the purpose of understanding the suitability of groundwater from abandoned underground coal mines for public supply, industrial, agricultural, and other uses. This compilation includes data collected for multiple individual studies conducted from July 13, 1973 through September 7, 2016. Analytical methods varied by the time period of data collection and requirements of the independent studies. This project identified 770 water-quality samples from 294 sites that could be attributed to abandoned underground coal mine aquifers originating from multiple coal seams in West Virginia.

  40. 1991 OCRWM bulletin compilation and index

    International Nuclear Information System (INIS)

    1992-05-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management, to provide current information about the national program for managing spent fuel and high-level radioactive waste. The document is a compilation of issues from the 1991 calendar year. A table of contents and an index have been provided to reference information contained in this year's Bulletins.

  41. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables. Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees. With the proliferation of open source, understanding these issues is increasingly the res

  42. Compiling the parallel programming language NestStep to the CELL processor

    OpenAIRE

    Holm, Magnus

    2010-01-01

    The goal of this project is to create a source-to-source compiler which will translate NestStep code to C code. The compiler's job is to replace NestStep constructs with a series of function calls to the NestStep runtime system. NestStep is a parallel programming language extension based on the BSP model. It adds constructs for parallel programming on top of an imperative programming language. For this project, only constructs extending the C language are relevant. The output code will compil...

  43. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    in the style of denotational semantics; – the output of the generated compiler is effectively three-address code, in the fashion and efficiency of the Dragon Book; – the generated compiler processes several hundred lines of source code per second. The source language considered in this case study is imperative......, block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs...... by specializing a definitional interpreter with respect to the program. Specialization is carried out using type-directed partial evaluation, which is a mild version of partial evaluation akin to lambda-calculus normalization. Our definitional interpreter follows the format of denotational semantics, with a clear...
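
    The first Futamura projection mentioned above can be shown in miniature: specializing a definitional interpreter with respect to a fixed program yields a "compiled" version of that program in which interpretive dispatch has been eliminated. In the following sketch, the tiny expression language and the closure-building specializer are assumptions for the example, not the paper's type-directed partial evaluator.

        # A definitional interpreter for a tiny expression language over one
        # input variable x. Programs are nested tuples: ("lit", n), ("var",),
        # ("add", e1, e2), or ("mul", e1, e2).

        def interpret(expr, x):
            op = expr[0]
            if op == "lit":
                return expr[1]
            if op == "var":
                return x
            if op == "add":
                return interpret(expr[1], x) + interpret(expr[2], x)
            if op == "mul":
                return interpret(expr[1], x) * interpret(expr[2], x)
            raise ValueError(f"unknown operator: {op}")

        def specialize(expr):
            # Specializing the interpreter to a fixed program walks the
            # program once and builds a closure: all dispatch on the program
            # text happens at "compile time", the essence of the first
            # Futamura projection.
            op = expr[0]
            if op == "lit":
                n = expr[1]
                return lambda x: n
            if op == "var":
                return lambda x: x
            if op in ("add", "mul"):
                f, g = specialize(expr[1]), specialize(expr[2])
                if op == "add":
                    return lambda x: f(x) + g(x)
                return lambda x: f(x) * g(x)
            raise ValueError(f"unknown operator: {op}")

        if __name__ == "__main__":
            program = ("add", ("mul", ("lit", 3), ("var",)), ("lit", 1))  # 3*x + 1
            compiled = specialize(program)
            assert compiled(5) == interpret(program, 5) == 16
            print(compiled(10))  # 31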

  44. Compilation and Review of Supersonic Business Jet Studies from 1963 through 1995

    Science.gov (United States)

    Maglieri, Domenic J.

    2011-01-01

    This document provides a compilation of all known supersonic business jet studies/activities conducted from 1963 through 1995 by universities, industry and NASA. First, an overview is provided which chronologically displays all known supersonic business jet studies/activities along with the key features of the study vehicles relative to configuration, planform, operating parameters, and the source of the study. This is followed by a brief description of each study along with some comments on the study. Mention is made as to whether the studies addressed cost, market needs, and the environmental issues of airport-community noise, sonic boom, and ozone.

  45. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature. [Once-through Cycle and Plutonium Recycle]

    Energy Technology Data Exchange (ETDEWEB)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  46. Description of source term data on contaminated sites and buildings compiled for the waste management programmatic environmental impact statement (WMPEIS)

    International Nuclear Information System (INIS)

    Short, S.M.; Smith, D.E.; Hill, J.G.; Lerchen, M.E.

    1995-10-01

    The U.S. Department of Energy (DOE) and its predecessor agencies have historically had responsibility for carrying out various national missions primarily related to nuclear weapons development and energy research. Recently, these missions have been expanded to include remediation of sites and facilities contaminated as a result of past activities. In January 1990, the Secretary of Energy announced that DOE would prepare a Programmatic Environmental Impact Statement on the DOE's environmental restoration and waste management program; the primary focus was the evaluation of (1) strategies for conducting remediation of all DOE contaminated sites and facilities and (2) potential configurations for waste management capabilities. Several different environmental restoration strategies were identified for evaluation, ranging from doing no remediation to strategies where the level of remediation was driven by such factors as final land use and health effects. A quantitative assessment of the costs and health effects of remediation activities and residual contamination levels associated with each remediation strategy was made. These analyses required that information be compiled on each individual contaminated site and structure located at each DOE installation and that the information compiled include quantitative measurements and/or estimates of contamination levels and extent of contamination. This document provides a description of the types of information and data compiled for use in the analyses. Also provided is a description of the database used to manage the data, a detailed discussion of the methodology and assumptions used in compiling the data, and a summary of the data compiled into the database as of March 1995. As of this date, over 10,000 contaminated sites and structures and over 8,000 uncontaminated structures had been identified across the DOE complex of installations

  47. How Do Open Source Communities Document Software Architecture: An Exploratory Survey

    NARCIS (Netherlands)

    Ding, W.; Liang, P.; Tang, A.; Van Vliet, H.; Shahin, M.

    2014-01-01

    Software architecture (SA) documentation provides a blueprint of a software-intensive system for the communication between stakeholders about the high-level design of the system. In open source software (OSS) development, a lack of SA documentation may hinder the use and further development of OSS,

  48. Nuclear power plant operational data compilation system

    International Nuclear Information System (INIS)

    Silberberg, S.

    1980-01-01

    Electricite de France R and D Division has set up a nuclear power plant operational data compilation system. This data bank, created from American documents, allows results about plant operation and operational material behaviour to be obtained. At present, French units in commercial operation are taken into account. Results obtained after five years of data bank operation are given. (author)

  49. Engineering a compiler

    CERN Document Server

    Cooper, Keith D

    2012-01-01

    As computing has changed, so has the role of both the compiler and the compiler writer. The proliferation of processors, environments, and constraints demands an equally large number of compilers. To adapt, compiler writers retarget code generators, add optimizations, and work on issues such as code space or power consumption. Engineering a Compiler re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in today's practice. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best-practice algorithms for the major problems inside a compiler. · Focuses on the back end of the compiler, reflecting the focus of research and development over the last decade. · Applies the well-developed theory behind scanning and parsing to introduce concepts that play a critical role in optimization and code generation. · Introduces the student to optimization through data-flow analysis, SSA form, and a selection of sc...

  50. Encounters of aircraft with volcanic ash clouds; A compilation of known incidents, 1953-2009

    Science.gov (United States)

    Guffanti, Marianne; Casadevall, Thomas J.; Budding, Karin

    2010-01-01

    Information about reported encounters of aircraft with volcanic ash clouds from 1953 through 2009 has been compiled to document the nature and scope of risks to aviation from volcanic activity. The information, gleaned from a variety of published and other sources, is presented in database and spreadsheet formats; the compilation will be updated as additional encounters occur and as new data and corrections come to light. The effects observed by flight crews and extent of aircraft damage vary greatly among incidents, and each incident in the compilation is rated according to a severity index. Of the 129 reported incidents, 94 incidents are confirmed ash encounters, with 79 of those having various degrees of airframe or engine damage; 20 are low-severity events that involve suspected ash or gas clouds; and 15 have data that are insufficient to assess severity. Twenty-six of the damaging encounters involved significant to very severe damage to engines and (or) airframes, including nine encounters with engine shutdown during flight. The average annual rate of damaging encounters since 1976, when reporting picked up, has been approximately 2 per year. Most of the damaging encounters occurred within 24 hours of the onset of ash production or at distances less than 1,000 kilometers from the source volcanoes. The compilation covers only events of relatively short duration for which aircraft were checked for damage soon thereafter; documenting instances of long-term repeated exposure to ash (or sulfate aerosols) will require further investigation. Of 38 source volcanoes, 8 have caused 5 or more encounters, of which the majority were damaging: Augustine (United States), Chaiten (Chile), Mount St. Helens (United States), Pacaya (Guatemala), Pinatubo (Philippines), Redoubt (United States), Sakura-jima (Japan), and Soufriere Hills (Montserrat, Lesser Antilles, United Kingdom). Aircraft have been damaged by eruptions ranging from small, recurring episodes to very large

  51. Methodology and procedures for compilation of historical earthquake data

    International Nuclear Information System (INIS)

    1987-10-01

    This report was prepared subsequent to the recommendations of the project initiation meeting in Vienna, November 25-29, 1985, under the IAEA Interregional project INT/9/066 Seismic Data for Nuclear Power Plant Siting. The aim of the project is to co-ordinate national efforts of Member States in the Mediterranean region in the compilation and processing of historical earthquake data in the siting of nuclear facilities. The main objective of the document is to assist the participating Member States, especially those who are initiating an NPP siting programme, in their effort to compile and process historical earthquake data and to provide a uniform interregional framework for this task. Although the document is directed mainly to the Mediterranean countries using illustrative examples from this region, the basic procedures and methods herein described may be applicable to other parts of the world such as Southeast Asia, Himalayan belt, Latin America, etc. 101 refs, 7 figs

  52. Perspex machine: V. Compilation of C programs

    Science.gov (United States)

    Spanner, Matthew P.; Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of the Turing machine with projective geometry. The original, constructive proof used four special, perspective transformations to implement the Turing machine in projective geometry. These four transformations are now generalised and applied in a compiler, implemented in Pop11, that converts a subset of the C programming language into perspexes. This is interesting both from a geometrical and a computational point of view. Geometrically, it is interesting that program source can be converted automatically to a sequence of perspective transformations and conditional jumps, though we find that the product of homogeneous transformations with normalisation can be non-associative. Computationally, it is interesting that program source can be compiled for a Reduced Instruction Set Computer (RISC), the perspex machine, that is a Single Instruction, Zero Exception (SIZE) computer.

  53. Compilation of historical information of 300 Area facilities and activities

    International Nuclear Information System (INIS)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information on 300 Area activities and facilities since their beginning. The 300 Area is shown as it looked in 1945, and a more recent (1985) look at the 300 Area is also provided.

  54. Compilation of historical information of 300 Area facilities and activities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information on 300 Area activities and facilities since their beginning. The 300 Area is shown as it looked in 1945, and a more recent (1985) look at the 300 Area is also provided.

  55. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler’s ability to generate loop-parallel code. We use this compilation system to modify two sequential benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should

  56. Compilation of nuclear safety criteria potential application to DOE nonreactor facilities

    International Nuclear Information System (INIS)

    1992-03-01

    This bibliographic document compiles nuclear safety criteria applied to the various areas of nuclear safety addressed in a Safety Analysis Report for a nonreactor nuclear facility (NNF). The criteria listed are derived from federal regulations, Nuclear Regulatory Commission (NRC) guides and publications, DOE and DOE contractor publications, and industry codes and standards. The titles of the chapters and sections of Regulatory Guide 3.26, ''Standard Format and Content of Safety Analysis Reports for Fuel Reprocessing Plants'' were used to format the chapters and sections of this compilation. In each section the criteria are compiled in four groups, namely: (1) Code of Federal Regulations, (2) USNRC Regulatory Guides, (3) Codes and Standards, and (4) Supplementary Information

  57. HAL/S-FC and HAL/S-360 compiler system program description

    Science.gov (United States)

    1976-01-01

    The compiler is a large multi-phase design and can be broken into four phases: Phase 1 inputs the source language and does a syntactic and semantic analysis generating the source listing, a file of instructions in an internal format (HALMAT) and a collection of tables to be used in subsequent phases. Phase 1.5 massages the code produced by Phase 1, performing machine independent optimization. Phase 2 inputs the HALMAT produced by Phase 1 and outputs machine language object modules in a form suitable for the OS-360 or FCOS linkage editor. Phase 3 produces the SDF tables. The four phases described are written in XPL, a language specifically designed for compiler implementation. In addition to the compiler, there is a large library containing all the routines that can be explicitly called by the source language programmer plus a large collection of routines for implementing various facilities of the language.

  58. Advanced Photon Source experimental beamline Safety Assessment Document: Addendum to the Advanced Photon Source Accelerator Systems Safety Assessment Document (APS-3.2.2.1.0)

    International Nuclear Information System (INIS)

    1995-01-01

    This Safety Assessment Document (SAD) addresses commissioning and operation of the experimental beamlines at the Advanced Photon Source (APS). The purpose of this document is to identify and describe the hazards associated with commissioning and operation of these beamlines and to document the measures taken to minimize these hazards and mitigate their consequences. The potential hazards associated with the commissioning and operation of the APS facility have been identified and analyzed. Physical and administrative controls mitigate the identified hazards. No hazard exists in this facility that has not been previously encountered and successfully mitigated in other accelerator and synchrotron radiation research facilities. This document is an updated version of the APS Preliminary Safety Analysis Report (PSAR). During the review of the PSAR in February 1990, the APS was determined to be a Low Hazard Facility. On June 14, 1993, the Acting Director of the Office of Energy Research endorsed the designation of the APS as a Low Hazard Facility, and this Safety Assessment Document supports that designation.

  19. ERES: A PC program for nuclear data compilation in EXFOR format

    International Nuclear Information System (INIS)

    Li Shubing; Liang Qichang; Liu Tingin

    1994-01-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)

  20. ERES: A PC program for nuclear data compilation in EXFOR format

    Energy Technology Data Exchange (ETDEWEB)

    Shubing, Li [NanKai University, Tianjin (China); Qichang, Liang; Tingin, Liu [Chinese Nuclear Data Center, Institute of Atomic Energy, Beijing (China)

    1994-02-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)

  1. VFC: The Vienna Fortran Compiler

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    1999-01-01

    High Performance Fortran (HPF) offers an attractive high-level language interface for programming scalable parallel architectures, providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC), a new source-to-source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influences a program’s performance. This comprises data locality assertions, non-local access specifications and the possibility of reusing runtime-generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high-level data parallel language such as HPF+ a performance close to hand-written message-passing programs can be achieved even for highly irregular codes.
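
    The last of those features, reusing runtime-generated communication schedules for irregular loops, is essentially an inspector/executor scheme. A sketch of the idea in C++ (the names and the simplified gather are illustrative, not VFC internals):

        #include <cstddef>
        #include <vector>

        // Communication schedule for an irregular loop y[i] += a[idx[i]]:
        // the non-local indices this process must gather before executing.
        struct Schedule {
            std::vector<std::size_t> non_local;
        };

        // Inspector: run once per indirection pattern idx.
        Schedule inspect(const std::vector<std::size_t>& idx,
                         std::size_t local_begin, std::size_t local_end) {
            Schedule s;
            for (std::size_t j : idx)
                if (j < local_begin || j >= local_end)
                    s.non_local.push_back(j);
            return s;
        }

        // Executor: run on every time step, reusing the schedule instead
        // of re-analysing idx.
        void execute(const Schedule& s, const std::vector<double>& a,
                     const std::vector<std::size_t>& idx,
                     std::vector<double>& y) {
            (void)s;  // in a distributed run, s drives the (elided) gather
            for (std::size_t i = 0; i < idx.size(); ++i)
                y[i] += a[idx[i]];
        }

    The saving the abstract alludes to is that inspect runs once while execute runs every iteration, so the cost of analysing the irregular access pattern is amortized.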

  2. Parallelizing More Loops with Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    2012-01-01

    We present an interactive compilation feedback system that guides programmers in iteratively modifying their application source code. This helps leverage the compiler’s ability to generate loop-parallel code. We employ our system to modify two sequential benchmarks dealing with image processing and edge detection...

  3. ENDF/B summary documentation

    International Nuclear Information System (INIS)

    Garber, D.

    1975-10-01

    Descriptions of the evaluations contained in the ENDF/B library are given. The summary documentation presented is intended to be a more detailed description than the (File 1) comments contained in the computer-readable data files, but not so detailed as the formal reports describing each ENDF/B evaluation. The documentation was written by the CSEWG evaluators and compiled by NNCSC. The materials covered in this volume range from ¹H to ²⁴⁴Cm

  4. ANDEX. A PC software assisting the nuclear data compilation in EXFOR

    International Nuclear Information System (INIS)

    Osorio, V.

    1991-01-01

    This document describes the use of personal computer software ANDEX which assists the compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request, on a set of two diskettes, free of charge. (author)

  5. Compilation of data and descriptions for United States and foreign liquid metal fast breeder reactors

    International Nuclear Information System (INIS)

    Appleby, E.R.

    1975-08-01

    This document is a compilation of design and engineering information pertaining to liquid metal cooled fast breeder reactors which have operated, are operating, or are currently under construction in the United States and abroad. All data have been taken from publicly available documents, journals, and books

  6. Power and transmission rate orders and related documents. Office of Power Marketing Coordination, data compiled January 1, 1980-December 31, 1981

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-08-01

    This publication contains the power and transmission rate orders and related documents issued by the Department of Energy. It covers calendar years 1980 and 1981. The first publication, DOE/CE-007 covering the period from March through December 1979, was published July 1981. This publication is a compilation of all rate orders issued by the Assistant Secretary for Resource Applications and the Assistant Secretary for Conservation and Renewable Energy during calendar years 1980 and 1981 under Delegation Order No. 0204-33. It also includes all final approvals, remands, and disapprovals by the FERC, and a petition to the FERC for reconsideration by a Power Marketing Administration during 1980 and 1981. Also included are two delegation orders along with an amendment and a supplement to one delegation order, a departmental order on financial reporting, and Power and Transmission Rate Adjustment Procedures relating to federal power marketing.

  7. The Scythe Statistical Library: An Open Source C++ Library for Statistical Computation

    Directory of Open Access Journals (Sweden)

    Daniel Pemstein

    2011-08-01

    The Scythe Statistical Library is an open source C++ library for statistical computation. It includes a suite of matrix manipulation functions, a suite of pseudo-random number generators, and a suite of numerical optimization routines. Programs written using Scythe are generally much faster than those written in commonly used interpreted languages, such as R and MATLAB, and can be compiled on any system with the GNU GCC compiler (and perhaps with other C++ compilers). One of the primary design goals of the Scythe developers has been ease of use for non-expert C++ programmers. Ease of use is provided through three primary mechanisms: (1) operator and function overloading, (2) numerous pre-fabricated utility functions, and (3) clear documentation and example programs. Additionally, Scythe is quite flexible and entirely extensible because the source code is available to all users under the GNU General Public License.
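
    As an illustration of the first of those mechanisms, operator and function overloading, a stripped-down matrix type might look as follows. This mirrors the design style the abstract describes; it is not Scythe's actual API:

        #include <cstddef>
        #include <stdexcept>
        #include <vector>

        // Dense row-major matrix with overloaded element access.
        class Matrix {
        public:
            Matrix(std::size_t r, std::size_t c, double fill = 0.0)
                : rows_(r), cols_(c), data_(r * c, fill) {}
            double& operator()(std::size_t i, std::size_t j) {
                return data_[i * cols_ + j];
            }
            double operator()(std::size_t i, std::size_t j) const {
                return data_[i * cols_ + j];
            }
            std::size_t rows() const { return rows_; }
            std::size_t cols() const { return cols_; }
        private:
            std::size_t rows_, cols_;
            std::vector<double> data_;
        };

        // Element-wise addition via a free overloaded operator.
        Matrix operator+(const Matrix& a, const Matrix& b) {
            if (a.rows() != b.rows() || a.cols() != b.cols())
                throw std::invalid_argument("shape mismatch");
            Matrix out(a.rows(), a.cols());
            for (std::size_t i = 0; i < a.rows(); ++i)
                for (std::size_t j = 0; j < a.cols(); ++j)
                    out(i, j) = a(i, j) + b(i, j);
            return out;
        }

    Overloads like this let statistical code read as C = A + B while remaining compiled C++, which is the source of the speed advantage over interpreted languages that the abstract claims.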

  8. Digital Bedrock Compilation: A Geodatabase Covering Forest Service Lands in California

    Science.gov (United States)

    Elder, D.; de La Fuente, J. A.; Reichert, M.

    2010-12-01

    This digital database contains bedrock geologic mapping for Forest Service lands within California. This compilation began in 2004 and the first version was completed in 2005. The second publication of this geodatabase was completed in 2010 and filled major gaps in the southern Sierra Nevada and Modoc/Medicine Lake/Warner Mountains areas. This digital map database was compiled from previously published and unpublished geologic mapping, with source mapping and review from the California Geological Survey, the U.S. Geological Survey and others. Much of the source data was itself compilation mapping. This geodatabase is huge, containing ~107,000 polygons and ~280,000 arcs. Mapping was compiled from more than one thousand individual sources and covers over 41,000,000 acres (~166,000 km²). It was compiled from source maps at various scales - from ~1:4,000 to 1:250,000 - and represents the best available geologic mapping at the largest scale possible. An estimated 70-80% of the source information was digitized from geologic mapping at 1:62,500 scale or better. The Forest Service ACT2 Enterprise Team compiled the bedrock mapping and developed a geodatabase to store this information. This geodatabase supports feature classes for polygons (e.g., map units), lines (e.g., contacts, boundaries, faults and structural lines) and points (e.g., orientation data, structural symbology). Lookup tables provide detailed information for feature class items. Lookup/type tables contain legal values and hierarchical groupings for geologic ages and lithologies. Type tables link coded values with descriptions for line and point attributes, such as line type, line location and point type. This digital mapping is at the core of many quantitative analyses and derivative map products. Queries of the database are used to produce maps and to quantify rock types of interest. These include the following: (1) ultramafic rocks - where hazards from naturally occurring asbestos are high, (2) granitic rocks - increased

  9. Earth System Documentation (ES-DOC) Preparation for CMIP6

    Science.gov (United States)

    Denvil, S.; Murphy, S.; Greenslade, M. A.; Lawrence, B.; Guilyardi, E.; Pascoe, C.; Treshanksy, A.; Elkington, M.; Hibling, E.; Hassell, D.

    2015-12-01

    During the course of 2015 the Earth System Documentation (ES-DOC) project began its preparations for CMIP6 (Coupled Model Inter-comparison Project 6) by further extending the ES-DOC tooling ecosystem in support of Earth System Model (ESM) documentation creation, search, viewing & comparison. The ES-DOC online questionnaire, the ES-DOC desktop notebook, and the ES-DOC python toolkit will serve as multiple complementary pathways to generating CMIP6 documentation. It is envisaged that institutes will leverage these tools at different points of the CMIP6 lifecycle. Institutes will be particularly interested to know that the documentation burden will be either streamlined or completely automated. As all the tools are tightly integrated with the ES-DOC web-service, institutes can be confident that the latency between documentation creation & publishing will be reduced to a minimum. Published documents will be viewable with the online ES-DOC Viewer (accessible via citable URLs). Model inter-comparison scenarios will be supported using the ES-DOC online Comparator tool. The Comparator is being extended to: (1) support comparison of both Model descriptions & Simulation runs; and (2) greatly streamline the effort involved in compiling official tables. The entire ES-DOC ecosystem is open source and built upon open standards such as the Common Information Model (CIM) (versions 1 and 2).

  10. Compiled MPI: Cost-Effective Exascale Applications Development

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Quinlan, D; Lumsdaine, A; Hoefler, T

    2012-04-10

    the application's lifetime. It includes: (1) New set of source code annotations, inserted either manually or automatically, that will clarify the application's use of MPI to the compiler infrastructure, enabling greater accuracy where needed; (2) A compiler transformation framework that leverages these annotations to transform the original MPI source code to improve its performance and scalability; (3) Novel MPI runtime implementation techniques that will provide a rich set of functionality extensions to be used by applications that have been transformed by our compiler; and (4) A novel compiler analysis that leverages simple user annotations to automatically extract the application's communication structure and synthesize most complex code annotations.
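
    A rough sketch of the annotation idea, using standard MPI calls in C++; the pragma spelling is invented for illustration and is not the project's actual syntax:

        #include <mpi.h>
        #include <vector>

        // A ring halo exchange whose communication pattern is static and
        // repeats every step - exactly the property an annotation could
        // assert to the compiler to license stronger transformations.
        void halo_exchange(std::vector<double>& halo, int left, int right) {
            MPI_Request reqs[2];
            // #pragma cmpi static_pattern(repeats)  <-- illustrative annotation
            MPI_Irecv(halo.data(), 1, MPI_DOUBLE, left, 0,
                      MPI_COMM_WORLD, &reqs[0]);
            MPI_Isend(halo.data() + halo.size() - 1, 1, MPI_DOUBLE, right, 0,
                      MPI_COMM_WORLD, &reqs[1]);
            MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE);
        }

        int main(int argc, char** argv) {
            MPI_Init(&argc, &argv);
            int rank, size;
            MPI_Comm_rank(MPI_COMM_WORLD, &rank);
            MPI_Comm_size(MPI_COMM_WORLD, &size);
            std::vector<double> halo(4, rank);
            halo_exchange(halo, (rank + size - 1) % size, (rank + 1) % size);
            MPI_Finalize();
            return 0;
        }

    A transformation framework that knows the exchange repeats unchanged could, for example, rewrite the request pair into persistent MPI_Send_init/MPI_Recv_init requests - the style of improvement items (2) and (3) describe.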

  11. World Energy Projection System model documentation

    International Nuclear Information System (INIS)

    Hutzler, M.J.; Anderson, A.T.

    1997-09-01

    The World Energy Projection System (WEPS) was developed by the Office of Integrated Analysis and Forecasting within the Energy Information Administration (EIA), the independent statistical and analytical agency of the US Department of Energy. WEPS is an integrated set of personal computer based spreadsheets containing data compilations, assumption specifications, descriptive analysis procedures, and projection models. The WEPS accounting framework incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (the ratio of total energy consumption to gross domestic product, GDP) and about the rate of incremental energy requirements met by natural gas, coal, and renewable energy sources (hydroelectricity, geothermal, solar, wind, biomass, and other renewable resources). Projections produced by WEPS are published in the annual report, International Energy Outlook. This report documents the structure and procedures incorporated in the 1998 version of the WEPS model. It has been written to provide an overview of the structure of the system and technical details about the operation of each component of the model for persons who wish to know how WEPS projections are produced by EIA

  12. World Energy Projection System model documentation

    Energy Technology Data Exchange (ETDEWEB)

    Hutzler, M.J.; Anderson, A.T.

    1997-09-01

    The World Energy Projection System (WEPS) was developed by the Office of Integrated Analysis and Forecasting within the Energy Information Administration (EIA), the independent statistical and analytical agency of the US Department of Energy. WEPS is an integrated set of personal computer based spreadsheets containing data compilations, assumption specifications, descriptive analysis procedures, and projection models. The WEPS accounting framework incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (the ratio of total energy consumption to gross domestic product, GDP) and about the rate of incremental energy requirements met by natural gas, coal, and renewable energy sources (hydroelectricity, geothermal, solar, wind, biomass, and other renewable resources). Projections produced by WEPS are published in the annual report, International Energy Outlook. This report documents the structure and procedures incorporated in the 1998 version of the WEPS model. It has been written to provide an overview of the structure of the system and technical details about the operation of each component of the model for persons who wish to know how WEPS projections are produced by EIA.

  13. Human Rights Texts: Converting Human Rights Primary Source Documents into Data.

    Science.gov (United States)

    Fariss, Christopher J; Linder, Fridolin J; Jones, Zachary M; Crabtree, Charles D; Biek, Megan A; Ross, Ana-Sophia M; Kaur, Taranamol; Tsai, Michael

    2015-01-01

    We introduce and make publicly available a large corpus of digitized primary source human rights documents which are published annually by monitoring agencies that include Amnesty International, Human Rights Watch, the Lawyers Committee for Human Rights, and the United States Department of State. In addition to the digitized text, we also make available and describe document-term matrices, which are datasets that systematically organize the word counts from each unique document by each unique term within the corpus of human rights documents. To contextualize the importance of this corpus, we describe the development of coding procedures in the human rights community and several existing categorical indicators that have been created by human coding of the human rights documents contained in the corpus. We then discuss how the new human rights corpus and the existing human rights datasets can be used with a variety of statistical analyses and machine learning algorithms to help scholars understand how human rights practices and reporting have evolved over time. We close with a discussion of our plans for dataset maintenance, updating, and availability.
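
    The document-term matrix itself is a simple structure: one row per document, one column per unique term, each cell a word count. A toy construction in C++ (whitespace tokenization only, purely illustrative):

        #include <map>
        #include <sstream>
        #include <string>
        #include <vector>

        // One map per document: term -> count. Together the maps form a
        // sparse document-term matrix of the kind the corpus ships with.
        std::vector<std::map<std::string, int>>
        document_term_counts(const std::vector<std::string>& docs) {
            std::vector<std::map<std::string, int>> rows;
            for (const std::string& doc : docs) {
                std::istringstream words(doc);
                std::map<std::string, int> row;
                std::string w;
                while (words >> w) ++row[w];
                rows.push_back(row);
            }
            return rows;
        }
        // e.g. document_term_counts({"rights report", "rights rights law"})
        // yields rows {report:1, rights:1} and {law:1, rights:2}.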

  14. Compilation of reactor physics data of the year 1984, AVR reactor

    International Nuclear Information System (INIS)

    Werner, H.; Bergerfurth, A.; Thomas, F.; Geskes, B.

    1985-12-01

    The 'AVR reactor physics data' is a compilation published once a year; the data presented are obtained by simulating reactor operation with the AVR-80 numerical model. This model is constantly updated and improved in response to new results and developments in the field of reactor theory and thermohydraulics, and in response to theoretical or practical modifications of reactor operation or of the computer system. The large variety of measured data available in the AVR reactor simulation system also makes it an ideal testing system for verification of the computing programs presented in the compilation. A survey of the history of operations in 1984 and a short explanation of the computerized simulation methods are followed by tables and graphs that serve as a source of topical data for readers interested in the physics of high-temperature pebble-bed reactors. (orig./HP)

  15. Compilation of reactor physics data of the year 1983, AVR reactor

    International Nuclear Information System (INIS)

    Werner, H.; Bergerfurth, A.; Thomas, F.; Geskes, B.

    1985-06-01

    The 'AVR reactor physics data' is a compilation published once a year; the data presented are obtained by simulating reactor operation with the AVR-80 numerical model. This model is constantly updated and improved in response to new results and developments in the field of reactor theory and thermohydraulics, and in response to theoretical or practical modifications of reactor operation or of the computer system. The large variety of measured data available in the AVR reactor simulation system also makes it an ideal testing system for verification of the computing programs presented in the compilation. A survey of the history of operations in 1983 and a short explanation of the computerized simulation methods are followed by tables and graphs that serve as a source of topical data for readers interested in the physics of high-temperature pebble-bed reactors. (orig./HP)

  16. Priorities for injury prevention in women's Australian football: a compilation of national data from different sources.

    Science.gov (United States)

    Fortington, Lauren V; Finch, Caroline F

    2016-01-01

    Participation in Australian football (AF) has traditionally been male dominated, and current understanding of injury and priorities for prevention are based solely on reports of injuries in male players. There is evidence in other sports that injury types differ between males and females. With increasing participation in AF by females, it is important to consider their specific injury and prevention needs. This study aimed to provide a first injury profile for female AF from existing sources. Injury data were compiled from four prospectively recorded data sets relating to female AF: (1) hospital admissions in Victoria, 2008/09-13/14, n=500 injuries; (2) emergency department (ED) presentations in Victoria, 2008/09-2012/13, n=1,879 injuries; (3) insurance claims across Australia, 2004-2013, n=522 injuries; (4) West Australian Women's Football League (WAWFL), 2014 season club data, n=49 injuries. Descriptive results are presented as injury frequencies, injury types and injured body parts. Hospital admissions and ED presentations were dominated by upper limb injuries, representing 47% and 51% of all injuries, respectively, primarily to the wrist/hand at 32% and 40%. Most (65%) insurance claim injuries involved the lower limb, 27% of which were for knee ligament damage. A high proportion of concussions (33%) was reported in the club-collected data. The results provide the first compilation of existing data sets of women's AF injuries and highlight the need for a rigorous and systematic injury surveillance system to be instituted.

  17. Compilation of documented computer codes applicable to environmental assessment of radioactivity releases

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.; Shaeffer, D.L.; Garten, C.T. Jr.; Shor, R.W.; Ensminger, J.T.

    1977-04-01

    The objective of this paper is to present a compilation of computer codes for the assessment of accidental or routine releases of radioactivity to the environment from nuclear power facilities. The capabilities of 83 computer codes in the areas of environmental transport and radiation dosimetry are summarized in tabular form. This preliminary analysis clearly indicates that the initial efforts in assessment methodology development have concentrated on atmospheric dispersion, external dosimetry, and internal dosimetry via inhalation. The incorporation of terrestrial and aquatic food chain pathways has been a more recent development and reflects the current requirements of environmental legislation and the needs of regulatory agencies. The characteristics of the conceptual models employed by these codes are reviewed. The appendixes include abstracts of the codes and indexes by author, key words, publication description, and title

  18. Fault-tolerant digital microfluidic biochips compilation and synthesis

    CERN Document Server

    Pop, Paul; Stuart, Elena; Madsen, Jan

    2016-01-01

    This book describes for researchers in the fields of compiler technology, design and test, and electronic design automation the new area of digital microfluidic biochips (DMBs), and thus offers a new application area for their methods. The authors present a routing-based model of operation execution, along with several associated compilation approaches, which progressively relax the assumption that operations execute inside fixed rectangular modules. Since operations can experience transient faults during the execution of a bioassay, the authors show how to use both offline (design time) and online (runtime) recovery strategies. The book also presents methods for the synthesis of fault-tolerant application-specific DMB architectures. It (1) presents the current models used in research on compilation and synthesis techniques for DMBs in a tutorial fashion, and (2) includes a set of “benchmarks”, which are presented in great detail and include the source code of most of the t...

  19. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with equational programming language (EPL). Our approach is based on a program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by the source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by the compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  20. Programme documentation to control programme for Solar-tracker; Programdokumentation til styringsprogram til Solar-tracker

    Energy Technology Data Exchange (ETDEWEB)

    Rudbeck, C.

    1995-07-01

    The report contains the programme documentation for a programme controlling a tracking system and for a programme which uses it to measure the transmittance of cover layers. Both the transmittance measurement programme and the control programme are built in Borland Pascal v7.0 and compiled in Real mode for use on processors of the 80X86 family. The source code for the transmittance measurement programme and the positioning routines is described in Appendix B. (EHS)

  1. Source document for waste area groupings at Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    Osborne, P.L.; Kuhaida, A.J., Jr.

    1996-09-01

    This document serves as a source document for Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and other types of documents developed for and pertaining to Environmental Restoration (ER) Program activities at Oak Ridge National Laboratory (ORNL). It contains descriptions of the (1) regulatory requirements for the ORR ER Program, (2) Oak Ridge Reservation (ORR) ER Program, (3) ORNL site history and characterization, and (4) history and characterization of Waste Area Groupings (WAGs) 1-20. This document was created to save time, effort, and money for persons and organizations drafting documents for the ER Program and to improve consistency in the documents prepared for the program. By eliminating the repetitious use of selected information about the program, this document will help reduce the time and costs associated with producing program documents. By serving as a benchmark for selected information about the ER Program, this reference will help ensure that information presented in future documents is accurate and complete

  2. Advanced compiler design and implementation

    CERN Document Server

    Muchnick, Steven S

    1997-01-01

    From the Foreword by Susan L. Graham: This book takes on the challenges of contemporary languages and architectures, and prepares the reader for the new compiling problems that will inevitably arise in the future. The definitive book on advanced compiler design. This comprehensive, up-to-date work examines advanced issues in the design and implementation of compilers for modern processors. Written for professionals and graduate students, the book guides readers in designing and implementing efficient structures for highly optimizing compilers for real-world languages. Covering advanced issues in fundamental areas of compiler design, this book discusses a wide array of possible code optimizations, determining the relative importance of optimizations, and selecting the most effective methods of implementation. * Lays the foundation for understanding the major issues of advanced compiler design * Treats optimization in-depth * Uses four case studies of commercial compiling suites to illustrate different approache...

  3. Global Compilation of InSAR Earthquake Source Models: Comparisons with Seismic Catalogues and the Effects of 3D Earth Structure

    Science.gov (United States)

    Weston, J. M.; Ferreira, A. M.; Funning, G. J.

    2010-12-01

    While past progress in seismology led to extensive earthquake catalogues such as the Global Centroid Moment Tensor (GCMT) catalogue, recent advances in space geodesy have enabled earthquake parameter estimation from measurements of the deformation of the Earth’s surface, notably using InSAR data. Many earthquakes have now been studied using InSAR, but a full assessment of the quality and of the additional value of these source parameters compared to traditional seismological techniques is still lacking. In this study we present results of systematic comparisons between earthquake CMT parameters determined using InSAR and seismic data, on a global scale. We compiled a large database of source parameters obtained using InSAR data from the literature and estimated the corresponding CMT parameters, forming an ICMT compilation. Here we present results from the analysis of 58 earthquakes that occurred between 1992-2007, drawn from about 80 published InSAR studies. Multiple studies of the same earthquake are included in the archive, as they are valuable for assessing uncertainties. Where faults are segmented, with changes in width along-strike, a weighted average based on the seismic moment in each fault has been used to determine overall earthquake parameters. For variable slip models, we have calculated source parameters taking the spatial distribution of slip into account. The parameters in our ICMT compilation are compared with those taken from the Global CMT (GCMT), ISC, EHB and NEIC catalogues. We find that earthquake fault strike, dip and rake values in the GCMT and ICMT archives are generally compatible with each other. Likewise, the differences in seismic moment in these two archives are relatively small. However, the locations of the centroid epicentres show substantial discrepancies, which are larger when comparing with GCMT locations (10-30 km differences) than with EHB, ISC and NEIC locations (5-15 km differences). Since InSAR data have a high spatial resolution, and thus

  4. Performing the processing required for automatically get a PDF/A version of the CERN Library documentation

    CERN Document Server

    Molina Garcia-Retamero, Antonio

    2015-01-01

    The aim of the project was to perform the processing required to automatically produce a PDF/A version of the CERN Library documentation. For this, it is necessary to extract as much metadata as possible from the source files and to inject the required data into the original source files, creating new ones ready for compilation with all related dependencies. Besides this, I proposed the creation of an HTML version consistent with the PDF and navigable for easy access; I experimented with Natural Language Processing for extracting metadata; and I proposed injecting the CERN Library documentation into the HTML version of the long write-ups where it is referenced (for instance, when a CERN Library function is referenced in sample code). Finally, I designed and implemented a Graphical User Interface in order to simplify the process for the user.

  5. Compilation of neutron flux density spectra and reaction rates in different neutron fields. V.3

    International Nuclear Information System (INIS)

    Ertek, C.

    1980-04-01

    Upon the recommendation of the International Working Group on Reactor Radiation Measurements (IWGRRM), a compilation of documents containing neutron flux density spectra and the reaction rates obtained by activation and fission foils in different neutron fields is presented

  6. JLAPACK – Compiling LAPACK FORTRAN to Java

    Directory of Open Access Journals (Sweden)

    David M. Doolin

    1999-01-01

    The JLAPACK project provides the LAPACK numerical subroutines translated from their subset Fortran 77 source into class files, executable by the Java Virtual Machine (JVM) and suitable for use by Java programmers. This makes it possible for Java applications or applets, distributed on the World Wide Web (WWW), to use established legacy numerical code that was originally written in Fortran. The translation is accomplished using a special-purpose Fortran-to-Java (source-to-source) compiler. The LAPACK API will be considerably simplified to take advantage of Java’s object-oriented design. This report describes the research issues involved in the JLAPACK project, and its current implementation and status.

  7. A Class-Specific Optimizing Compiler

    Directory of Open Access Journals (Sweden)

    Michael D. Sharp

    1993-01-01

    Class-specific optimizations are compiler optimizations specified by the class implementor to the compiler. They allow the compiler to take advantage of the semantics of the particular class so as to produce better code. Optimizations of interest include the strength reduction of class::array address calculations, elimination of large temporaries, and the placement of asynchronous send/recv calls so as to achieve computation/communication overlap. We will outline our progress towards the implementation of a C++ compiler capable of incorporating class-specific optimizations.
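
    Strength reduction of array address calculations, the first optimization listed, replaces a per-iteration multiply in the index expression with a running pointer. A generic C++ illustration (not taken from the paper):

        #include <cstddef>

        // Before: each iteration recomputes the address with a multiply.
        double col_sum_before(const double* a, std::size_t rows,
                              std::size_t cols, std::size_t c) {
            double s = 0.0;
            for (std::size_t i = 0; i < rows; ++i)
                s += a[i * cols + c];      // multiply on every iteration
            return s;
        }

        // After: an induction pointer advances by the stride instead.
        double col_sum_after(const double* a, std::size_t rows,
                             std::size_t cols, std::size_t c) {
            double s = 0.0;
            const double* p = a + c;
            for (std::size_t i = 0; i < rows; ++i, p += cols)
                s += *p;                   // add replaces the multiply
            return s;
        }

    The point of making such optimizations class-specific is that the class implementor, knowing the storage layout, can tell the compiler this rewrite is safe even when the compiler cannot prove it.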

  8. Data compilation of angular distributions of sputtered atoms

    International Nuclear Information System (INIS)

    Yamamura, Yasunori; Takiguchi, Takashi; Tawara, Hiro.

    1990-01-01

    Sputtering on a surface is generally caused by the collision cascade developed near the surface. The process is in principle the same as that causing radiation damage in the bulk of solids. Sputtering has long been regarded as an undesirable dirty effect which destroys the cathodes and grids in gas discharge tubes or ion sources and contaminates plasma and the surrounding walls. However, sputtering is used today for many applications such as sputter ion sources, mass spectrometers and the deposition of thin films. Plasma contamination and the surface erosion of first walls due to sputtering are still major problems in fusion research. The angular distribution of the particles sputtered from solid surfaces can provide detailed information on the collision cascade in the interior of targets. This report presents a compilation of the angular distributions of sputtered atoms at normal and oblique incidence for various combinations of incident ions and target atoms. The angular distributions of atoms sputtered from monatomic solids at normal and oblique incidence, and the compiled data on those distributions, are reported. (K.I.)

  9. IFLA General Conference, 1989. Pre-Session Seminar on Interlending and Document Supply. Papers.

    Science.gov (United States)

    International Federation of Library Associations, The Hague (Netherlands).

    The papers in this compilation address topics relating to interlending and document supply and include the following: (1) "Training in Interlending" (D. E. K. Wijasuriya, Malaysia); (2) "Resource Collections and Document Supply Centre" (Stephney Ferguson, Jamaica); (3) "Organizing for Interlibrary Loan and Document Supply" (Rodrick S. Mabomba);…

  10. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia; Angulo, C.; Arnould, M.

    2000-01-01

    The rapidly growing wealth of nuclear data becomes less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (authors)
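
    The quantity such compilations tabulate is the Maxwellian-averaged reaction rate per particle pair; the standard expression (a textbook relation, stated here for orientation rather than quoted from the NACRE paper) is

      \[
        N_A \langle \sigma v \rangle =
          N_A \left( \frac{8}{\pi \mu} \right)^{1/2} (k_B T)^{-3/2}
          \int_0^{\infty} \sigma(E)\, E\, e^{-E/k_B T}\, \mathrm{d}E ,
      \]

    where μ is the reduced mass of the interacting pair, T the temperature and σ(E) the measured or Hauser-Feshbach cross section; the quoted upper and lower bounds follow from propagating the cross-section uncertainties through this integral.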

  11. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia

    1999-01-01

    The rapidly growing wealth of nuclear data becomes less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (author)

  12. Palaeoecological studies as a source of peat depth data: A discussion and data compilation for Scotland

    Directory of Open Access Journals (Sweden)

    J. Ratcliffe

    2016-06-01

    The regional/national carbon (C) stock of peatlands is often poorly characterised, even for comparatively well-studied areas. A key obstacle to better estimates of landscape C stock is the scarcity of data on peat depth, leading to simplistic assumptions. New measurements of peat depth become unrealistically resource-intensive when considering large areas. Therefore, it is imperative to maximise the use of pre-existing datasets. Here we propose that one potentially valuable and currently unexploited source of peat depth data is palaeoecological studies. We discuss the value of these data and present an initial compilation for Scotland (United Kingdom), which consists of records from 437 sites and yields an average depth of 282 cm per site. This figure is likely to be an over-estimate of true average peat depth and is greater than figures used in current estimates of peatland C stock. Depth data from palaeoecological studies have the advantages of wide distribution, high quality, and often the inclusion of valuable supporting information; but also the disadvantage of spatial bias due to the differing motivations of the original researchers. When combined with other data sources, each with its own advantages and limitations, we believe that palaeoecological datasets can make an important contribution to better-constrained estimates of peat depth which, in turn, will lead to better estimates of peatland landscape carbon stock.
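
    The role depth plays in such estimates can be written, to first order, as a simple product (a generic relation, not a formula from the paper):

      \[
        C \approx A \cdot \bar{d} \cdot \rho_b \cdot f_C ,
      \]

    where A is peatland area, d̄ mean peat depth, ρ_b dry bulk density and f_C the carbon mass fraction of the peat. An error in mean depth propagates linearly into the stock estimate, which is why compiled depth records of the kind presented here matter.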

  13. List of documented bird species from the municipality of Ubatuba, state of São Paulo, Brazil

    Directory of Open Access Journals (Sweden)

    Rick Simpson

    2012-01-01

    Although preliminary surveys have been conducted in the Atlantic Forest of Ubatuba, there is no list of documented bird records from this coastal municipality. To organize such a compilation, we searched the literature and a number of different sources for all documented records of birds from Ubatuba, state of São Paulo. We further carried out a 7-year non-systematic bird inventory in different regions and elevations to document the species within the municipality. The total number of documented bird species is 417, 11% of which are endemic to Brazil. Another 26% are Atlantic Forest endemics, and as many as 60 species are under threat categories, including near-threatened birds, in the state. Some 49 species of 27 families are reported from the municipality but still lack documentation. Considering historical records, no species has disappeared from the municipality. Ubatuba is one of the most studied regions along the Serra do Mar in São Paulo regarding its ornithology, but there are still high-elevation gaps that will yield significant additions of species to the area with increased surveying effort.

  14. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendents; sending to the selected node only the compiled software to be executed by the selected node or selected node's descendent.
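
    The selection step described, forwarding to each child only the binaries destined for its subtree, can be sketched as follows; the types and names are invented for illustration:

        #include <string>
        #include <unordered_map>
        #include <vector>

        struct Node {
            int id;
            std::vector<Node> children;
        };

        // Collects every node id in the subtree rooted at n.
        void subtree_ids(const Node& n, std::vector<int>& out) {
            out.push_back(n.id);
            for (const Node& c : n.children) subtree_ids(c, out);
        }

        // Selects the compiled artifacts whose target lies in child's
        // subtree; this subset is all that child receives.
        std::unordered_map<int, std::string>
        share_for(const Node& child,
                  const std::unordered_map<int, std::string>& binaries) {
            std::vector<int> ids;
            subtree_ids(child, ids);
            std::unordered_map<int, std::string> out;
            for (int id : ids) {
                auto it = binaries.find(id);
                if (it != binaries.end()) out.insert(*it);
            }
            return out;
        }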

  15. Evaluation of HAL/S language compilability using SAMSO's Compiler Writing System (CWS)

    Science.gov (United States)

    Feliciano, M.; Anderson, H. D.; Bond, J. W., III

    1976-01-01

    NASA/Langley is engaged in a program to develop an adaptable guidance and control software concept for spacecraft such as shuttle-launched payloads. It is envisioned that this flight software be written in a higher-order language, such as HAL/S, to facilitate changes or additions. To make this adaptable software transferable to various onboard computers, a compiler writing system capability is necessary. A joint program with the Air Force Space and Missile Systems Organization was initiated to determine if the Compiler Writing System (CWS) owned by the Air Force could be utilized for this purpose. The present study explores the feasibility of including the HAL/S language constructs in CWS and the effort required to implement these constructs. This will determine the compilability of HAL/S using CWS and permit NASA/Langley to identify the HAL/S constructs desired for their applications. The study consisted of comparing the implementation of the Space Programming Language using CWS with the requirements for the implementation of HAL/S. It is the conclusion of the study that CWS already contains many of the language features of HAL/S and that it can be expanded for compiling part or all of HAL/S. It is assumed that persons reading and evaluating this report have a basic familiarity with (1) the principles of compiler construction and operation, and (2) the logical structure and applications characteristics of HAL/S and SPL.

  16. Compiler issues associated with safety-related software

    International Nuclear Information System (INIS)

    Feinauer, L.R.

    1991-01-01

    A critical issue in the quality assurance of safety-related software is the ability of the software to produce identical results, independent of the host machine, operating system, or compiler version under which the software is installed. A study is performed using the VIPRE-01, FREY-01, and RETRAN-02 safety-related codes. Results from an IBM 3083 computer are compared with results from a CYBER 860 computer. All three of the computer programs examined are written in FORTRAN; the VIPRE code uses the FORTRAN 66 compiler, whereas the FREY and RETRAN codes use the FORTRAN 77 compiler. Various compiler options are studied to determine their effect on the output between machines. Since the Control Data Corporation and IBM machines inherently represent numerical data differently, methods of producing equivalent accuracy of data representation were an important focus of the study. This paper identifies particular problems in the automatic double-precision option (AUTODBL) of the IBM FORTRAN 1.4.x series of compilers. The IBM FORTRAN version 2 compilers provide much more stable, reliable compilation for engineering software. Careful selection of compilers and compiler options can help guarantee identical results between different machines. To ensure reproducibility of results, the same compiler and compiler options should be used to install the program as were used in the development and testing of the program
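
    The machine- and option-dependence problem is easy to reproduce in miniature; the C++ sketch below (illustrative only, unrelated to the Fortran codes named) shows how accumulation precision alone changes a result:

        #include <cstdio>

        int main() {
            float  sum_f = 0.0f;
            double sum_d = 0.0;
            for (int i = 0; i < 10000000; ++i) {
                sum_f += 0.1f;   // single precision: rounding error compounds
                sum_d += 0.1;    // double precision: far smaller drift
            }
            // The two sums differ visibly from each other and from 1e6;
            // options that silently change precision change the answer.
            std::printf("float:  %.4f\ndouble: %.4f\n", sum_f, sum_d);
            return 0;
        }

    Options such as IBM's AUTODBL alter exactly this kind of behaviour by automatically promoting precision, which is why the paper insists on installing with the same compiler and options used during development and testing.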

  17. Renewable energy sources. European Commission papers; Energies renouvelables. Documents de la Commission Europeenne

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The ''Directive on the Promotion of Electricity from Renewable Sources of Energy in the Internal Electricity Market'' was adopted in September 2001. Its purpose is to promote an increase in the contribution of renewable energy sources to electricity production in the internal market for electricity and to create a basis for a future Community framework. Energie-Cites provides in this document a summary of its opinion on the Green Paper and on Alterner II and gives a proposal for an Action Plan concerning the White Paper. (A.L.B.)

  18. Supplementary material on passive solar heating concepts. A compilation of published articles

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-05-01

    A compilation of published articles and reports dealing with passive solar energy concepts for heating and cooling buildings is presented. The following are included: fundamentals of passive systems, applications and technical analysis, graphic tools, and information sources. (MHR)

  19. Transitioning Existing Content: inferring organisation-specific documents

    Directory of Open Access Journals (Sweden)

    Arijit Sengupta

    2000-11-01

    A definition for a document type within an organization represents an organizational norm about the way the organizational actors represent products and supporting evidence of organizational processes. Generating a good organization-specific document structure is, therefore, important since it can capture a shared understanding among the organizational actors about how certain business processes should be performed. Current tools that generate document type definitions focus on the underlying technology, emphasizing tags created in a single instance document. The tools thus fall short of capturing the shared understanding between organizational actors about how a given document type should be represented. We propose a method for inferring organization-specific document structures using multiple instance documents as inputs. The method consists of heuristics that combine individual document definitions, which may have been compiled using standard algorithms. We propose a number of heuristics utilizing artificial intelligence and natural language processing techniques. As the research progresses, the heuristics will be tested on a suite of test cases representing multiple instance documents for different document types. The complete methodology will be implemented as a research prototype.

  20. Internal combustion engines for alcohol motor fuels: a compilation of background technical information

    Energy Technology Data Exchange (ETDEWEB)

    Blaser, Richard

    1980-11-01

    This compilation, a draft training manual containing technical background information on internal combustion engines and alcohol motor fuel technologies, is presented in 3 parts. The first is a compilation of facts from the state of the art on internal combustion engine fuels and their characteristics and requisites and provides an overview of fuel sources, fuels technology and future projections for availability and alternatives. Part two compiles facts about alcohol chemistry, alcohol identification, production, and use, examines ethanol as spirit and as fuel, and provides an overview of modern evaluation of alcohols as motor fuels and of the characteristics of alcohol fuels. The final section compiles cross references on the handling and combustion of fuels for I.C. engines, presents basic evaluations of events leading to the use of alcohols as motor fuels, reviews current applications of alcohols as motor fuels, describes the formulation of alcohol fuels for engines and engine and fuel handling hardware modifications for using alcohol fuels, and introduces the multifuel engines concept. (LCL)

  1. Review of the status of validation of the computer codes used in the severe accident source term reassessment study (BMI-2104)

    International Nuclear Information System (INIS)

    Kress, T.S.

    1985-04-01

    The determination of severe accident source terms must, by necessity it seems, rely heavily on the use of complex computer codes. Source term acceptability, therefore, rests on the assessed validity of such codes. Consequently, one element of NRC's recent efforts to reassess LWR severe accident source terms is to provide a review of the status of validation of the computer codes used in the reassessment. The results of this review are the subject of this document. The separate review documents compiled in this report were used as a resource, along with the results of the BMI-2104 study by BCL and the QUEST study by SNL, to arrive at a more-or-less independent appraisal of the status of source term modeling at this time

  2. C to VHDL compiler

    Science.gov (United States)

    Berdychowski, Piotr P.; Zabolotny, Wojciech M.

    2010-09-01

    The main goal of the C to VHDL compiler project is to make the FPGA platform more accessible for scientists and software developers. The FPGA platform offers the unique ability to configure the hardware to implement virtually any dedicated architecture, and modern devices provide a sufficient number of hardware resources to implement parallel execution platforms with complex processing units. All this makes the FPGA platform very attractive for those looking for an efficient, heterogeneous computing environment. The current industry standard in the development of digital systems on the FPGA platform is based on HDLs. Although very effective and expressive in the hands of hardware development specialists, these languages require specific knowledge and experience, unreachable for most scientists and software programmers. The C to VHDL compiler project attempts to remedy that by creating an application that derives an initial VHDL description of a digital system (for further compilation and synthesis) from a purely algorithmic description in the C programming language. This idea itself is not new, and the C to VHDL compiler combines the best approaches from existing solutions developed over many previous years, with the introduction of some new unique improvements.

  3. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of airborne surveys of different sizes, resolutions and years. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is important for, e.g., oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilation of magnetic data and the merging of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected in short- to intermediate-wavelength magnetic anomalies, but also take a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still little recognized. The maximum wavelength that can be resolved by each individual survey is directly related to the survey size, and consequently a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine satellite magnetic data with the compilation from airborne data, which is sufficient only under particular preconditions. A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated

  4. HAL/S-FC compiler system functional specification

    Science.gov (United States)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package and the restrictions and dependencies of the HAL/S-FC system are also considered.

  5. SPARQL compiler for Bobox

    OpenAIRE

    Čermák, Miroslav

    2013-01-01

    The goal of this work is to design and implement a SPARQL compiler for the Bobox system. In addition to lexical and syntactic analysis corresponding to the W3C standard for the SPARQL language, it performs semantic analysis and optimization of queries. The compiler constructs an appropriate model for execution in Bobox, which depends on the physical database schema.

  6. The OCaml system release 4.04: Documentation and user's manual

    OpenAIRE

    Leroy, Xavier; Doligez, Damien; Frisch, Alain; Garrigue, Jacques; Rémy, Didier; Vouillon, Jérôme

    2016-01-01

    This manual documents the release 4.04 of the OCaml system. It is organized as follows. Part I, "An introduction to OCaml", gives an overview of the language. Part II, "The OCaml language", is the reference description of the language. Part III, "The OCaml tools", documents the compilers, toplevel system, and programming utilities. Part IV, "The OCaml library", describes the modules provided in the standard library.

  7. The OCaml system release 4.02: Documentation and user's manual

    OpenAIRE

    Leroy, Xavier; Doligez, Damien; Frisch, Alain; Garrigue, Jacques; Rémy, Didier; Vouillon, Jérôme

    2014-01-01

    This manual documents the release 4.02 of the OCaml system. It is organized as follows. Part I, "An introduction to OCaml", gives an overview of the language. Part II, "The OCaml language", is the reference description of the language. Part III, "The OCaml tools", documents the compilers, toplevel system, and programming utilities. Part IV, "The OCaml library", describes the modules provided in the standard library.

  8. The OCaml system release 4.06: Documentation and user's manual

    OpenAIRE

    Leroy, Xavier; Doligez, Damien; Frisch, Alain; Garrigue, Jacques; Rémy, Didier; Vouillon, Jérôme

    2017-01-01

    This manual documents the release 4.06 of the OCaml system. It is organized as follows. Part I, "An introduction to OCaml", gives an overview of the language. Part II, "The OCaml language", is the reference description of the language. Part III, "The OCaml tools", documents the compilers, toplevel system, and programming utilities. Part IV, "The OCaml library", describes the modules provided in the standard library.

  9. Compilation of the FY 1999 Department of the Navy Working Capital Fund Financial Statements

    National Research Council Canada - National Science Library

    2000-01-01

    ...) Cleveland Center consistently and accurately compiled and consolidated financial data received from Navy field organizations and other sources to prepare the FY 1999 Navy Working Capital Fund financial statements...

  10. Skyline: an open source document editor for creating and analyzing targeted proteomics experiments.

    Science.gov (United States)

    MacLean, Brendan; Tomazela, Daniela M; Shulman, Nicholas; Chambers, Matthew; Finney, Gregory L; Frewen, Barbara; Kern, Randall; Tabb, David L; Liebler, Daniel C; MacCoss, Michael J

    2010-04-01

    Skyline is a Windows client application for targeted proteomics method creation and quantitative data analysis. It is open source and freely available for academic and commercial use. The Skyline user interface simplifies the development of mass spectrometer methods and the analysis of data from targeted proteomics experiments performed using selected reaction monitoring (SRM). Skyline supports using and creating MS/MS spectral libraries from a wide variety of sources to choose SRM filters and verify results based on previously observed ion trap data. Skyline exports transition lists to and imports the native output files from Agilent, Applied Biosystems, Thermo Fisher Scientific and Waters triple quadrupole instruments, seamlessly connecting mass spectrometer output back to the experimental design document. The fast and compact Skyline file format is easily shared, even for experiments requiring many sample injections. A rich array of graphs displays results and provides powerful tools for inspecting data integrity as data are acquired, helping instrument operators to identify problems early. The Skyline dynamic report designer exports tabular data from the Skyline document model for in-depth analysis with common statistical tools. Single-click, self-updating web installation is available at http://proteome.gs.washington.edu/software/skyline. This web site also provides access to instructional videos, a support board, an issues list and a link to the source code project.
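
    Skyline's report designer exports tabular data that can be post-processed with standard statistical tooling. The sketch below assumes a CSV export with hypothetical column names (PeptideSequence, FragmentIon, Area); the actual headers depend on the report template configured in Skyline.

        import pandas as pd

        # Hypothetical column names; real headers depend on the configured
        # Skyline report template.
        report = pd.read_csv("skyline_report.csv")
        cv = (report.groupby(["PeptideSequence", "FragmentIon"])["Area"]
                    .agg(lambda a: a.std(ddof=1) / a.mean()))
        print(cv.describe())  # spread of coefficients of variation across transitions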

  11. Advisory Committee on human radiation experiments. Supplemental Volume 2a, Sources and documentation appendices. Final report

    International Nuclear Information System (INIS)

    1995-01-01

    This large document provides a catalog of the location of large numbers of reports pertaining to the charge of the Presidential Advisory Committee on Human Radiation Research and is arranged as a series of appendices. Titles of the appendices are Appendix A- Records at the Washington National Records Center Reviewed in Whole or Part by DoD Personnel or Advisory Committee Staff; Appendix B- Brief Descriptions of Records Accessions in the Advisory Committee on Human Radiation Experiments (ACHRE) Research Document Collection; Appendix C- Bibliography of Secondary Sources Used by ACHRE; Appendix D- Brief Descriptions of Human Radiation Experiments Identified by ACHRE, and Indexes; Appendix E- Documents Cited in the ACHRE Final Report and other Separately Described Materials from the ACHRE Document Collection; Appendix F- Schedule of Advisory Committee Meetings and Meeting Documentation; and Appendix G- Technology Note

  12. Advisory Committee on human radiation experiments. Supplemental Volume 2a, Sources and documentation appendices. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-01-01

    This large document provides a catalog of the location of large numbers of reports pertaining to the charge of the Presidential Advisory Committee on Human Radiation Research and is arranged as a series of appendices. Titles of the appendices are Appendix A- Records at the Washington National Records Center Reviewed in Whole or Part by DoD Personnel or Advisory Committee Staff; Appendix B- Brief Descriptions of Records Accessions in the Advisory Committee on Human Radiation Experiments (ACHRE) Research Document Collection; Appendix C- Bibliography of Secondary Sources Used by ACHRE; Appendix D- Brief Descriptions of Human Radiation Experiments Identified by ACHRE, and Indexes; Appendix E- Documents Cited in the ACHRE Final Report and other Separately Described Materials from the ACHRE Document Collection; Appendix F- Schedule of Advisory Committee Meetings and Meeting Documentation; and Appendix G- Technology Note.

  13. Algorithmic synthesis using Python compiler

    Science.gov (United States)

    Cieszewski, Radoslaw; Romaniuk, Ryszard; Pozniak, Krzysztof; Linczuk, Maciej

    2015-09-01

    This paper presents a Python-to-VHDL compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and translates it to VHDL. An FPGA combines many benefits of both software and ASIC implementations. Like software, the programmed circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. This can be achieved by using many computational resources at the same time. Creating parallel programs implemented in FPGAs in pure HDL is difficult and time consuming. By using a higher level of abstraction and a high-level synthesis compiler, implementation time can be reduced. The compiler has been implemented using the Python language. This article describes the design, implementation and results of the created tool.
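
    The core translation step of such a compiler can be sketched with Python's standard ast module. The following is a minimal, illustrative sketch, not the authors' tool: it handles only arithmetic expressions over a fixed signed(15 downto 0) type and emits a VHDL-like entity skeleton rather than fully synthesizable code.

        import ast

        OPS = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*"}

        def expr_to_vhdl(node):
            """Translate a Python arithmetic expression AST into VHDL-like text."""
            if isinstance(node, ast.BinOp):
                return f"({expr_to_vhdl(node.left)} {OPS[type(node.op)]} {expr_to_vhdl(node.right)})"
            if isinstance(node, ast.Name):
                return node.id
            raise NotImplementedError(ast.dump(node))

        def compile_func(src):
            fn = ast.parse(src).body[0]           # a single `def` with one `return`
            args = [a.arg for a in fn.args.args]
            body = expr_to_vhdl(fn.body[0].value)
            ports = ";\n    ".join(f"{a} : in signed(15 downto 0)" for a in args)
            return (f"entity {fn.name} is\n  port (\n    {ports};\n"
                    f"    result : out signed(15 downto 0));\nend entity;\n"
                    f"architecture rtl of {fn.name} is\nbegin\n"
                    f"  result <= {body};\nend architecture;")

        print(compile_func("def mac(a, b, c):\n    return a * b + c"))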

  14. Draft report on compilation of generic safety issues for light water reactor nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-01

    A generally accepted approach to characterizing the safety concerns in nuclear power plants is to express them as safety issues which need to be resolved. When such safety issues are applicable to a generation of plants of a particular design or to a family of plants of similar design, they are termed generic safety issues. Examples of generic safety issues are those related to reactor vessel embrittlement, control rod insertion reliability or strainer clogging. The safety issues compiled in this document are based on broad international experience. This compilation is one element in the framework of IAEA activities to assist Member States in reassessing the safety of operating nuclear power plants. Refs.

  15. Draft report on compilation of generic safety issues for light water reactor nuclear power plants

    International Nuclear Information System (INIS)

    1997-07-01

    A generally accepted approach to characterizing the safety concerns in nuclear power plants is to express them as safety issues which need to be resolved. When such safety issues are applicable to a generation of plants of a particular design or to a family of plants of similar design, they are termed generic safety issues. Examples of generic safety issues are those related to reactor vessel embrittlement, control rod insertion reliability or strainer clogging. The safety issues compiled in this document are based on broad international experience. This compilation is one element in the framework of IAEA activities to assist Member States in reassessing the safety of operating nuclear power plants. Refs

  16. 1989 OCRWM [Office of Civilian Radioactive Waste Management] Bulletin compilation and index

    International Nuclear Information System (INIS)

    1990-02-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1989 calendar year. A table of contents and one index have been provided to assist in finding information contained in this year's Bulletins. The pages have been numbered consecutively at the bottom for easy reference. 7 figs

  17. Writing in the workplace: Constructing documents using multiple digital sources

    Directory of Open Access Journals (Sweden)

    Mariëlle Leijten

    2014-02-01

    In today’s workplaces professional communication often involves constructing documents from multiple digital sources, integrating one’s own texts/graphics with ideas based on others’ text/graphics. This article presents a case study of a professional communication designer as he constructs a proposal over several days. Drawing on keystroke and interview data, we map the professional’s overall process, plot the time course of his writing/design, illustrate how he searches for content and switches among optional digital sources, and show how he modifies and reuses others’ content. The case study reveals not only that the professional (1) searches extensively through multiple sources for content and ideas but that he also (2) constructs visual content (charts, graphs, photographs) as well as verbal content, and (3) manages his attention and motivation over this extended task. Since these three activities are not represented in current models of writing, we propose their addition not just to models of communication design, but also to models of writing in general.

  18. 76 FR 1173 - Draft Guidance for Industry on Electronic Source Documentation in Clinical Investigations...

    Science.gov (United States)

    2011-01-07

    ... Web page at http://www.fda.gov/RegulatoryInformation/Guidances/default.htm . FDA guidances are issued and updated regularly. We recommend you check the Web site to ensure that you have the most up-to-date... electronic diaries provided by study subjects. When paper source documents are available for review, tracing...

  19. FINANCIAL REPORTING AND SOURCE DOCUMENTS OF UKRAINIAN ENTERPRISES WHEN APPLYING THE IFRS

    Directory of Open Access Journals (Sweden)

    G. Golubnicha

    2013-08-01

    The theoretical, methodological and practical aspects of changes in financial reporting and source documents specific to Ukrainian enterprises in the new conditions resulting from the application of International Financial Reporting Standards are analyzed. A conceptual approach to defining the patterns of changes in financial reporting and in the elements of the accounting method is proposed. The issue of internal quality control of analytical accounting information at various stages of its formation is also researched.

  20. Compiling a 50-year journey

    DEFF Research Database (Denmark)

    Hutton, Graham; Bahr, Patrick

    2017-01-01

    Fifty years ago, John McCarthy and James Painter published the first paper on compiler verification, in which they showed how to formally prove the correctness of a compiler that translates arithmetic expressions into code for a register-based machine. In this article, we revisit this example
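
    The flavour of McCarthy and Painter's example can be conveyed in a few lines of Python: a compiler from arithmetic expressions to instructions for a register-based machine. This is a sketch in the spirit of the paper, not their formal development.

        # Expressions are either variable names or ('+', e1, e2) tuples.
        def compile_expr(e, reg=0):
            """Emit instructions for a register machine, result left in `reg`."""
            if isinstance(e, str):
                return [("LOAD", reg, e)]
            _, lhs, rhs = e
            return (compile_expr(lhs, reg)
                    + compile_expr(rhs, reg + 1)
                    + [("ADD", reg, reg + 1)])

        print(compile_expr(("+", "x", ("+", "y", "z"))))
        # [('LOAD', 0, 'x'), ('LOAD', 1, 'y'), ('LOAD', 2, 'z'), ('ADD', 1, 2), ('ADD', 0, 1)]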

  1. The systematic profiling of false identity documents: method validation and performance evaluation using seizures known to originate from common and different sources.

    Science.gov (United States)

    Baechler, Simon; Terrasse, Vincent; Pujol, Jean-Philippe; Fritz, Thibaud; Ribaux, Olivier; Margot, Pierre

    2013-10-10

    False identity documents constitute a potentially powerful source of forensic intelligence because they are essential elements of transnational crime and provide cover for organized crime. In previous work, a systematic profiling method using false documents' visual features has been built within a forensic intelligence model. In the current study, the comparison process and metrics lying at the heart of this profiling method are described and evaluated. This evaluation takes advantage of 347 false identity documents of four different types seized in two countries whose sources were known to be common or different (following police investigations and the dismantling of counterfeit factories). Intra-source and inter-source variations were evaluated through the computation of more than 7500 similarity scores. The profiling method could thus be validated and its performance assessed using two complementary approaches to measuring type I and type II error rates: a binary classification and the computation of likelihood ratios. Very low error rates were measured across the four document types, demonstrating the validity and robustness of the method to link documents to a common source or to differentiate them. These results pave the way for an operational implementation of a systematic profiling process integrated in a developed forensic intelligence model. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
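
    The binary-classification side of such an evaluation can be mimicked on synthetic data. The sketch below uses made-up similarity scores and a hypothetical decision threshold, purely to show how type I (false link) and type II (missed link) error rates are obtained.

        import numpy as np

        rng = np.random.default_rng(0)
        # Stand-in similarity scores; in the study these come from comparing
        # visual-feature profiles of seized documents.
        same_source = rng.normal(0.80, 0.08, 4000)   # linked pairs score high
        diff_source = rng.normal(0.45, 0.10, 3500)   # unlinked pairs score low

        threshold = 0.65                              # hypothetical decision threshold
        type_i = np.mean(diff_source >= threshold)    # false link rate
        type_ii = np.mean(same_source < threshold)    # missed link rate
        print(f"type I: {type_i:.3%}, type II: {type_ii:.3%}")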

  2. Title list of documents made publicly available, August 1-31, 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-10-01

    This document is the August 1995 listing of publicly available documents that are issued by the Nuclear Regulatory Commission. The compilation is broken down into docketed and undocketed items, and each of these categories is further broken down into narrower categories. In general, the docketed items pertain to specific NRC licensees, while the undocketed items are of general interest to all licensees of a particular group.

  3. Document understanding for a broad class of documents

    NARCIS (Netherlands)

    Aiello, Marco; Monz, Christof; Todoran, Leon; Worring, Marcel

    2002-01-01

    We present a document analysis system able to assign logical labels and extract the reading order in a broad set of documents. All information sources, from geometric features and spatial relations to the textual features and content are employed in the analysis. To deal effectively with these

  4. ZettaBricks: A Language Compiler and Runtime System for Anyscale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Amarasinghe, Saman [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2015-03-27

    This grant supported the ZettaBricks and OpenTuner projects. ZettaBricks is a new implicitly parallel language and compiler where defining multiple implementations of multiple algorithms to solve a problem is the natural way of programming. ZettaBricks makes algorithmic choice a first class construct of the language. Choices are provided in a way that also allows our compiler to tune at a finer granularity. The ZettaBricks compiler autotunes programs by making both fine-grained as well as algorithmic choices. Choices also include different automatic parallelization techniques, data distributions, algorithmic parameters, transformations, and blocking. Additionally, ZettaBricks introduces novel techniques to autotune algorithms for different convergence criteria. When choosing between various direct and iterative methods, the ZettaBricks compiler is able to tune a program in such a way that delivers near-optimal efficiency for any desired level of accuracy. The compiler has the flexibility of utilizing different convergence criteria for the various components within a single algorithm, providing the user with accuracy choice alongside algorithmic choice. OpenTuner is a generalization of the experience gained in building an autotuner for ZettaBricks. OpenTuner is a new open source framework for building domain-specific multi-objective program autotuners. OpenTuner supports fully-customizable configuration representations, an extensible technique representation to allow for domain-specific techniques, and an easy to use interface for communicating with the program to be autotuned. A key capability inside OpenTuner is the use of ensembles of disparate search techniques simultaneously; techniques that perform well will dynamically be allocated a larger proportion of tests.
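
    A minimal OpenTuner autotuner, closely following the gcc-flags example in the OpenTuner documentation, looks as follows; the class and helper names are taken from that documentation and should be checked against the installed version.

        import opentuner
        from opentuner import ConfigurationManipulator, IntegerParameter
        from opentuner import MeasurementInterface, Result

        class GccFlagsTuner(MeasurementInterface):
            def manipulator(self):
                # The search space: here a single tunable parameter.
                m = ConfigurationManipulator()
                m.add_parameter(IntegerParameter("opt_level", 0, 3))
                return m

            def run(self, desired_result, input, limit):
                # Build and time one candidate configuration.
                cfg = desired_result.configuration.data
                self.call_program(f"gcc -O{cfg['opt_level']} -o bench bench.c")
                run_result = self.call_program("./bench")
                return Result(time=run_result["time"])

        if __name__ == "__main__":
            GccFlagsTuner.main(opentuner.default_argparser().parse_args())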

  5. Forms of contractual documents for public gas distribution; Modeles de documents contractuels pour la distribution publique de gaz

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-12-01

    This document is a compilation of standard forms of concession agreements and of specifications for public gas distribution (general dispositions, granted network and works, connection to the granted network, gas quality, contracts and conditions of supply, gas prices, concession end and control, various dispositions, agreement between the town and the grantee, calculation of profit rate, gas retail prices, general conditions of supply). (J.S.)

  6. Evaluation and compilation of fission product yields 1993

    International Nuclear Information System (INIS)

    England, T.R.; Rider, B.F.

    1995-01-01

    This document is the latest in a series of compilations of fission yield data. Fission yield measurements reported in the open literature and calculated charge distributions have been used to produce a recommended set of yields for the fission products. The original data with reference sources, and the recommended yields axe presented in tabular form. These include many nuclides which fission by neutrons at several energies. These energies include thermal energies (T), fission spectrum energies (F), 14 meV High Energy (H or HE), and spontaneous fission (S), in six sets of ten each. Set A includes U235T, U235F, U235HE, U238F, U238HE, Pu239T, Pu239F, Pu241T, U233T, Th232F. Set B includes U233F, U233HE, U236F, Pu239H, Pu240F, Pu241F, Pu242F, Th232H, Np237F, Cf252S. Set C includes U234F, U237F, Pu240H, U234HE, U236HE, Pu238F, Am241F, Am243F, Np238F, Cm242F. Set D includes Th227T, Th229T, Pa231F, Am241T, Am241H, Am242MT, Cm245T, Cf249T, Cf251T, Es254T. Set E includes Cf250S, Cm244S, Cm248S, Es253S, Fm254S, Fm255T, Fm256S, Np237H, U232T, U238S. Set F includes Cm243T, Cm246S, Cm243F, Cm244F, Cm246F, Cm248F, Pu242H, Np237T, Pu240T, and Pu242T to complete fission product yield evaluations for 60 fissioning systems in all. This report also serves as the primary documentation for the second evaluation of yields in ENDF/B-VI released in 1993

  7. Evaluation and compilation of fission product yields 1993

    Energy Technology Data Exchange (ETDEWEB)

    England, T.R.; Rider, B.F.

    1995-12-31

    This document is the latest in a series of compilations of fission yield data. Fission yield measurements reported in the open literature and calculated charge distributions have been used to produce a recommended set of yields for the fission products. The original data with reference sources, and the recommended yields axe presented in tabular form. These include many nuclides which fission by neutrons at several energies. These energies include thermal energies (T), fission spectrum energies (F), 14 meV High Energy (H or HE), and spontaneous fission (S), in six sets of ten each. Set A includes U235T, U235F, U235HE, U238F, U238HE, Pu239T, Pu239F, Pu241T, U233T, Th232F. Set B includes U233F, U233HE, U236F, Pu239H, Pu240F, Pu241F, Pu242F, Th232H, Np237F, Cf252S. Set C includes U234F, U237F, Pu240H, U234HE, U236HE, Pu238F, Am241F, Am243F, Np238F, Cm242F. Set D includes Th227T, Th229T, Pa231F, Am241T, Am241H, Am242MT, Cm245T, Cf249T, Cf251T, Es254T. Set E includes Cf250S, Cm244S, Cm248S, Es253S, Fm254S, Fm255T, Fm256S, Np237H, U232T, U238S. Set F includes Cm243T, Cm246S, Cm243F, Cm244F, Cm246F, Cm248F, Pu242H, Np237T, Pu240T, and Pu242T to complete fission product yield evaluations for 60 fissioning systems in all. This report also serves as the primary documentation for the second evaluation of yields in ENDF/B-VI released in 1993.

  8. Documentation of Accounting Records in Light of Legislative Innovations

    Directory of Open Access Journals (Sweden)

    K. V. BEZVERKHIY

    2017-05-01

    Full Text Available Legislative reforms in accounting aim to simplify accounting records and compilation of financial reports by business entities, thus increasing the position of Ukraine in the global ranking of Doing Business. This simplification is implied in the changes in the Regulation on Documentation of Accounting Records, entered into force to the Resolution of the Ukrainian Ministry of Finance. The objective of the study is to analyze the legislative innovations involved. The review of changes in documentation of accounting records is made. A comparative analysis of changes in the Regulation on Documentation of Accounting Records is made by sections: 1 General; 2 Primary documents; 3 Accounting records; 4 Correction of errors in primary documents and accounting records; 5 Organization of document circulation; 6 Storage of documents. Methods of analysis and synthesis are used for separating the differences in the editions of the Regulation on Documentation of Accounting Records. The result of the study has theoretical and practical value for the domestic business enterprise sector.

  9. The Application Of Open-Source And Free Photogrammetric Software For The Purposes Of Cultural Heritage Documentation

    Directory of Open Access Journals (Sweden)

    Bartoš Karol

    2014-07-01

    Full Text Available The documentation of cultural heritage is an essential part of appropriate care of historical monuments, representing a part of our history. At present, it represents the current issue, for which considerable funds are being spent, as well as for the documentation of immovable historical monuments in a form of castle ruins, among the others. Non-contact surveying technologies - terrestrial laser scanning and digital photogrammetry belong to the most commonly used technologies, by which suitable documentation can be obtained, however their use may be very costly. In recent years, various types of software products and web services based on the SfM (or MVS method and developed as open-source software, or as a freely available and free service, relying on the basic principles of photogrammetry and computer vision, have started to get into the spotlight. By using the services and software, acquired digital images of a given object can be processed into a point cloud, serving directly as a final output or as a basis for further processing. The aim of this paper, based on images of various objects of the Slanec castle ruins obtained by the DSLR Pentax K5, is to assess the suitability of different types of open-source and free software and free web services and their reliability in terms of surface reconstruction and photo-texture quality for the purposes of castle ruins documentation.

  10. Perspectives on Linguistic Documentation from Sociolinguistic Research on Dialects

    Science.gov (United States)

    Tagliamonte, Sali A.

    2017-01-01

    The goal of the paper is to demonstrate how sociolinguistic research can be applied to endangered language documentation field linguistics. It first provides an overview of the techniques and practices of sociolinguistic fieldwork and the ensuring corpus compilation methods. The discussion is framed with examples from research projects focused on…

  11. A new compiler for the GANIL Data Acquisition description

    International Nuclear Information System (INIS)

    Saillant, F.; Raine, B.

    1997-01-01

    An important feature of the GANIL Data Acquisition System is the description of experiments by means of a language developed at GANIL. The philosophy is to attribute to each element (parameters, spectra, etc.) an operational name which will be used at any level of the system. This language references a library of modules to free the user from the technical details of the hardware. The compiler has recently been entirely re-developed using technologies such as an object-oriented language (C++) and an object-oriented software development method and tool. This enables us to provide new functionality or to support a new electronic module within a very short time and without any deep modification of the application. A new Dynamic Library of Modules has also been developed. Its complete description is available on the GANIL web site http://ganinfo.in2p3.fr/acquisition/homepage.html. This new compiler brings a lot of new functionality, among which the most important is the notion of 'register', whatever the module standard is. All the registers described in the module provider's documentation can now be accessed by their names. Another important new feature is the notion of a 'function' that can be executed on a module. A set of new instructions has also been implemented to execute commands on CAMAC crates. Another possibility of this new compiler is to enable the description of specific interfaces with the GANIL Data Acquisition System. This has been used to describe the coupling of the CHIMERA Data Acquisition System with the INDRA one through a shared memory in the VME crate. (authors)

  12. Proving correctness of compilers using structured graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    it into a compiler implementation using a graph type along with a correctness proof. The implementation and correctness proof of a compiler using a tree type without explicit jumps is simple, but yields code duplication. Our method provides a convenient way of improving such a compiler without giving up the benefits...

  13. 12 CFR 411.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    12 CFR 411.600, Banks and Banking; EXPORT-IMPORT BANK OF THE UNITED STATES; NEW RESTRICTIONS ON LOBBYING; Agency Reports. § 411.600 Semi-annual compilation. (a) The head of each agency shall collect and compile the...

  14. ALGOL compiler. Syntax and semantic analysis

    International Nuclear Information System (INIS)

    Tarbouriech, Robert

    1971-01-01

    In this research thesis, the author reports the development of an ALGOL compiler which performs the following main tasks: systematic scanning of the source programme to recognise its different components (identifiers, reserved words, constants, separators); analysis of the source programme structure to build up its statements and arithmetic expressions; processing of symbolic names (identifiers) to associate them with the values they represent; and memory allocation for data and programme. Several issues are thus addressed: the characteristics of the machine for which the compiler is developed; the exact definition of the language (grammar, identifier and constant formation); and the syntax-processing programme that provides the compiler with the necessary elements (language vocabulary, precedence matrix). The first two phases of compilation, lexical analysis and syntax analysis, are described; the last phase (machine-code generation) is not addressed.
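
    The expression-parsing phase driven by a precedence matrix can be sketched in Python as a lexer followed by a shunting-yard pass. This is an illustrative reconstruction of the technique, not code from the thesis.

        import re

        PREC = {"+": 1, "-": 1, "*": 2, "/": 2}

        def tokenize(src):
            # Lexical analysis: identifiers, constants and separators.
            return re.findall(r"[A-Za-z]\w*|\d+|[-+*/()]", src)

        def to_postfix(tokens):
            # Syntax analysis of expressions driven by operator precedence.
            out, stack = [], []
            for t in tokens:
                if t in PREC:
                    while stack and stack[-1] in PREC and PREC[stack[-1]] >= PREC[t]:
                        out.append(stack.pop())
                    stack.append(t)
                elif t == "(":
                    stack.append(t)
                elif t == ")":
                    while stack[-1] != "(":
                        out.append(stack.pop())
                    stack.pop()
                else:
                    out.append(t)
            return out + stack[::-1]

        print(to_postfix(tokenize("a * (b + 3) - c")))  # ['a', 'b', '3', '+', '*', 'c', '-']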

  15. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1993 annual

    International Nuclear Information System (INIS)

    1994-04-01

    This compilation contains 47 ACRS reports submitted to the Commission, Executive Director for Operations, or to the Office of Nuclear Regulatory Research, during calendar year 1993. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are categorized by the most appropriate generic subject area and by chronological order within subject area

  16. Irradiation of strawberries. A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1994-12-01

    The document contains a compilation of all available scientific and technical data on the irradiation of strawberries. It is intended to assist governments in considering the authorization of this particular application of radiation processing of food and in ensuring its control in the facility and the control of irradiated food products moving in trade. The compilation was prepared in response to the requirement of the Codex General Standard for Irradiated Foods and associated Code that radiation treatment of food be justified on the basis of a technological need or of a need to improve the hygienic quality of food. It was prepared also in response to the recommendations of the FAO/IAEA/WHO/ITC-UNCTAD/GATT International conference on the Acceptance, Control of and Trade in Irradiated Food (Geneva, 1989) concerning the need for regulatory control of radiation processing of food. Refs, 1 tab

  17. Compiling Utility Requirements For New Nuclear Power Plant Project

    International Nuclear Information System (INIS)

    Patrakka, Eero

    2002-01-01

    Teollisuuden Voima Oy (TVO) submitted in November 2000 to the Finnish Government an application for a Decision-in-Principle concerning the construction of a new nuclear power plant in Finland. The actual investment decision can be made only after a positive decision has been made by the Government and the Parliament. In parallel with the licensing process, technical preparedness has been maintained so that the procurement process can commence without delay when needed. This includes the definition of requirements for the plant and the preliminary preparation of bid inquiry specifications. The core of the technical requirements corresponds to the specifications presented in the European Utility Requirement (EUR) document, compiled by major European electricity producers. Quite naturally, a number of modifications to the EUR document are needed to take into account the country- and site-specific conditions as well as the experience gained in the operation of the existing NPP units. Along with the EUR-related requirements concerning the nuclear island and the power generation plant, requirements are specified for the scope of supply as well as for a variety of issues related to project implementation. (author)

  18. A Compilation of Global Bio-Optical in Situ Data for Ocean-Colour Satellite Applications

    Science.gov (United States)

    Valente, Andre; Sathyendranath, Shubha; Brotus, Vanda; Groom, Steve; Grant, Michael; Taberner, Malcolm; Antoine, David; Arnone, Robert; Balch, William M.; Barker, Kathryn; hide

    2016-01-01

    A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GePCO), span between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data were from multi-project archives acquired via open internet services, or from individual projects acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making the metadata available, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).

  19. Compilation of data for radionuclide transport analysis

    International Nuclear Information System (INIS)

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other scenarios selected. The data required by the selected near field, geosphere and biosphere models are given and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed and the uncertainty is qualitatively addressed for data to the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic in order not to underestimate the radionuclide release rates. It is judged that this approach combined with the selected calculation cases will illustrate the effects of uncertainties in processes and events that affect the evolution of the system, as well as in the quantitative data that describe it. The biosphere model allows for probabilistic calculations and the uncertainty in input data is quantified by giving minimum, maximum and mean values as well as the type of probability distribution function

  20. Compilation of data for radionuclide transport analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other scenarios selected. The data required by the selected near field, geosphere and biosphere models are given and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed and the uncertainty is qualitatively addressed for data to the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic in order not to underestimate the radionuclide release rates. It is judged that this approach combined with the selected calculation cases will illustrate the effects of uncertainties in processes and events that affect the evolution of the system, as well as in the quantitative data that describe it. The biosphere model allows for probabilistic calculations and the uncertainty in input data is quantified by giving minimum, maximum and mean values as well as the type of probability distribution function.
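
    The probabilistic treatment of input data described above (minimum, maximum and mean values plus a distribution type) maps directly onto Monte Carlo sampling. A minimal sketch with a hypothetical sorption coefficient, not a value from the report:

        import numpy as np

        rng = np.random.default_rng(42)

        # A hypothetical sorption coefficient given as minimum, most likely and
        # maximum values, the way the report specifies probabilistic input data.
        kd = rng.triangular(left=0.01, mode=0.1, right=1.0, size=100_000)  # m3/kg

        print(f"mean={kd.mean():.3f}, 5th={np.percentile(kd, 5):.3f}, "
              f"95th={np.percentile(kd, 95):.3f}")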

  1. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-10-01

    This is the third issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section approximately every six months. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations

  2. Sources of patient uncertainty when reviewing medical disclosure and consent documentation.

    Science.gov (United States)

    Donovan-Kicken, Erin; Mackert, Michael; Guinn, Trey D; Tollison, Andrew C; Breckinridge, Barbara

    2013-02-01

    Despite evidence that medical disclosure and consent forms are ineffective at communicating the risks and hazards of treatment and diagnostic procedures, little is known about exactly why they are difficult for patients to understand. The objective of this research was to examine what features of the forms increase people's uncertainty. Interviews were conducted with 254 individuals. After reading a sample consent form, participants described what they found confusing in the document. With uncertainty management as a theoretical framework, interview responses were analyzed for prominent themes. Four distinct sources of uncertainty emerged from participants' responses: (a) language, (b) risks and hazards, (c) the nature of the procedure, and (d) document composition and format. Findings indicate the value of simplifying medico-legal jargon, signposting definitions of terms, removing language that addresses multiple readers simultaneously, reorganizing bulleted lists of risks, and adding section breaks or negative space. These findings offer suggestions for providing more straightforward details about risks and hazards to patients, not necessarily through greater amounts of information but rather through more clear and sufficient material and better formatting. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  3. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1991-01-01

    Reliability data are essential in probabilistic safety assessment, with component reliability parameters being particularly important. Plant-specific component failure data would be most appropriate, but such data are rather limited. However, similar components are used in different designs, so generic data, that is, all data not specific to the plant being analyzed but relating to components more generally, are important. The International Atomic Energy Agency has compiled the Generic Component Reliability Data Base from data available in the open literature. It is part of the IAEA computer code package for fault/event tree analysis. The Data Base contains 1010 different records including most of the components used in probabilistic safety analyses of nuclear power plants. The data base input was quality controlled and data sources noted. The data compilation procedure and the problems associated with using generic data are explained. (UK)

  4. Forensic document analysis using scanning microscopy

    Science.gov (United States)

    Shaffer, Douglas K.

    2009-05-01

    The authentication and identification of the source of a printed document(s) can be important in forensic investigations involving a wide range of fraudulent materials, including counterfeit currency, travel and identity documents, business and personal checks, money orders, prescription labels, travelers checks, medical records, financial documents and threatening correspondence. The physical and chemical characterization of document materials - including paper, writing inks and printed media - is becoming increasingly relevant for law enforcement agencies, with the availability of a wide variety of sophisticated commercial printers and copiers which are capable of producing fraudulent documents of extremely high print quality, rendering these difficult to distinguish from genuine documents. This paper describes various applications and analytical methodologies using scanning electron microscopy/energy dispersive (x-ray) spectroscopy (SEM/EDS) and related technologies for the characterization of fraudulent documents, and illustrates how their morphological and chemical profiles can be compared to (1) authenticate and (2) link forensic documents with a common source(s) in their production history.

  5. Gaz de France. Source document

    International Nuclear Information System (INIS)

    2005-01-01

    This document was issued by Gaz de France, the French gas utility, on the occasion of the opening of the capital of the company. It is intended for shareholders and presents some information relative to the stocks admitted to Euronext's Eurolist, some general information about the company and its capital, some information about the activities of the Gaz de France group, about its financial situation and results, about its management, and about its recent evolution and future perspectives. (J.S.)

  6. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.
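
    The low-level view described above rests on derived metrics computed from raw hardware counters. A minimal sketch with PAPI-style counter names and made-up readings (illustrative values, not OpenUH output):

        # Raw hardware counter readings turned into efficiency indicators.
        counters = {"PAPI_TOT_INS": 4.2e9, "PAPI_TOT_CYC": 3.0e9,
                    "PAPI_L2_TCM": 5.1e7, "PAPI_LD_INS": 1.1e9}

        ipc = counters["PAPI_TOT_INS"] / counters["PAPI_TOT_CYC"]          # instructions per cycle
        l2_miss_per_load = counters["PAPI_L2_TCM"] / counters["PAPI_LD_INS"]
        print(f"IPC = {ipc:.2f}, L2 misses/load = {l2_miss_per_load:.4f}")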

  7. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinina, Elena Arkadievna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Samsa, Michael [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-11-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also on an extensive review of the available literature on similar past efforts. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task, since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to "ensure it has heard from as many points of view as possible." The Canadian NWMO study took four years and ample resources, involving national and regional stakeholder dialogues, internet-based dialogues, information and discussion sessions, open houses, workshops, round tables, public attitude research, a website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs

  8. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations

    International Nuclear Information System (INIS)

    Kalinina, Elena Arkadievna; Samsa, Michael

    2015-01-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also on an extensive review of the available literature on similar past efforts. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task, since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to 'ensure it has heard from as many points of view as possible.' The Canadian NWMO study took four years and ample resources, involving national and regional stakeholder dialogues, internet-based dialogues, information and discussion sessions, open houses, workshops, round tables, public attitude research, a website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs conducted by the
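
    The multi-attribute utility methodology that such an attribute hierarchy is meant to feed can be illustrated with an additive utility function. The attributes, weights and scores below are hypothetical, not values from the NWMO or BRC sources.

        # Minimal additive multi-attribute utility sketch.
        weights = {"cost": 0.25, "safety": 0.40, "public_acceptance": 0.35}

        def utility(scores, weights):
            """Additive utility; `scores` holds single-attribute utilities in [0, 1]."""
            assert abs(sum(weights.values()) - 1.0) < 1e-9
            return sum(weights[a] * scores[a] for a in weights)

        alternatives = {
            "centralized_storage": {"cost": 0.6, "safety": 0.8, "public_acceptance": 0.5},
            "on_site_storage":     {"cost": 0.8, "safety": 0.6, "public_acceptance": 0.4},
        }
        for name, scores in alternatives.items():
            print(name, round(utility(scores, weights), 3))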

  9. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-03-01

    This is the second issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations. It excludes references to "mass-chain" evaluations normally published in the "Nuclear Data Sheets" and "Nuclear Physics". The material contained in this compilation is sorted according to eight subject categories: general compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes: half-lives, energies and spectra; nuclear decay processes: gamma-rays; nuclear decay processes: fission products; nuclear decay processes: (others); atomic processes

  10. CAPS OpenACC Compilers: Performance and Portability

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    The announcement in late 2011 of the new OpenACC directive-based programming standard, supported by the CAPS, CRAY and PGI compilers, has opened the door to more scientific applications that can be ported to many-core systems. Following a porting methodology, this talk will first review the principles of programming with OpenACC and then the advanced features available in the CAPS compilers to further optimize OpenACC applications: library integration, and tuning directives with auto-tune mechanisms to build applications adaptive to different GPUs. CAPS compilers use hardware vendors' backends such as NVIDIA CUDA and OpenCL, making them the only OpenACC compilers supporting various many-core architectures. About the speaker: Stéphane Bihan is co-founder and currently Director of Sales and Marketing at CAPS enterprise. He has held several R&D positions in companies such as ARC international plc in London, Canon Research Center France, ACE compiler experts in Amsterdam and the INRIA r...

  11. A compilation of reports of the Advisory Committee on Nuclear Waste, July 1990--June 1991

    International Nuclear Information System (INIS)

    1991-08-01

    This compilation contains 20 reports issued by the Advisory Committee on Nuclear Waste (ACNW) during the third year of its operation. The reports were submitted to the Chairman, US Nuclear Regulatory Commission, or to the Director, Office of Nuclear Material Safety and Safeguards. All reports prepared by the Committee have been made available to the public through the NRC Public Document Room and the US Library of Congress

  12. A compilation of reports of the Advisory Committee on nuclear waste, July 1995 -- June 1996

    International Nuclear Information System (INIS)

    1996-08-01

    This compilation contains 8 reports issued by the Advisory Committee on Nuclear Waste (ACNW) during the eighth year of its operation. The reports were submitted to the Chairman and Commissioners of the U.S. Nuclear Regulatory Commission. All reports prepared by the Committee have been made available to the public through the NRC Public Document Room, the U.S. Library of Congress, and the internet at http://www.nrc.gov/ACRSACNW

  13. A compilation of reports of the Advisory Committee on nuclear waste, July 1996--June 1997

    International Nuclear Information System (INIS)

    1997-08-01

    This compilation contains 11 reports issued by the Advisory Committee on Nuclear Waste (ACNW) during the ninth year of its operation. The reports were submitted to the Chairman and Commissioners of the U.S. Nuclear Regulatory Commission. All reports prepared by the Committee have been made available to the public through the NRC Public Document Room, the U.S. Library of Congress, and the internet at http://www.nrc.gov/ACRSACNW

  14. Animal mortality resulting from uniform exposures to photon radiations: Calculated LD50s and a compilation of experimental data

    International Nuclear Information System (INIS)

    Jones, T.D.; Morris, M.D.; Wells, S.M.; Young, R.W.

    1986-12-01

    Studies conducted during the 1950s and 1960s of radiation-induced mortality in diverse animal species under various exposure protocols were compiled into a mortality data base. Some 24 variables were extracted and recomputed from each of the published studies, which were collected from a variety of available sources, primarily journal articles. Two features of this compilation effort are (1) an attempt to give an estimate of the uniform dose received by the bone marrow in each treatment, so that interspecies differences due to body size were minimized, and (2) a recomputation of the LD50 where sufficient experimental data are available. Exposure rates varied in magnitude from about 10^-2 to 10^3 R/min. This report describes the data base, the sources of data, and the data-handling techniques; presents a bibliography of studies compiled; and tabulates data from each study. 103 refs., 44 tabs
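
    An LD50 recomputation of the kind described can be sketched as a maximum-likelihood fit of a logistic dose-response curve. The dose-mortality numbers below are synthetic, not data from the compiled studies.

        import numpy as np
        from scipy.optimize import minimize

        # Synthetic dose-mortality data: dose in Gy, deaths out of n animals.
        dose   = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
        deaths = np.array([1, 4, 11, 17, 19])
        n      = np.full(5, 20)

        def nll(params):
            a, b = params                      # logit(p) = a + b * dose
            p = 1.0 / (1.0 + np.exp(-(a + b * dose)))
            p = np.clip(p, 1e-9, 1 - 1e-9)
            return -np.sum(deaths * np.log(p) + (n - deaths) * np.log(1 - p))

        a, b = minimize(nll, x0=[-5.0, 1.0]).x
        print(f"LD50 ~ {-a / b:.2f} Gy")       # dose at which mortality = 50%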

  15. Compilation of Quality Assurance Documentation for Analyses Performed for the Resumption of Transient Testing Environmental Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Schafer, Annette L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Sondrup, A. Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-11-01

    This is a companion document to the analyses performed in support of the environmental assessment for the Resumption of Transient Fuels and Materials Testing. It is provided to allow transparency of the supporting calculations. It provides computer code input and output. The basis for the calculations is documented separately in INL (2013) and is referenced, as appropriate. Spreadsheets used to manipulate the code output are not provided.

  16. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    Weston, L.W.; Larson, D.C.

    1993-02-01

    This compilation represents the current needs for nuclear data measurements and evaluations as expressed by interested fission and fusion reactor designers, medical users of nuclear data, nuclear data evaluators, CSEWG members and other interested parties. The requests and justifications are reviewed by the Data Request and Status Subcommittee of CSEWG as well as most of the general CSEWG membership. The basic format and computer programs for the Request List were produced by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory. The NNDC produced the Request List for many years. The Request List is compiled from a computerized data file. Each request has a unique isotope, reaction type, requestor and identifying number. The first two digits of the identifying number are the year in which the request was initiated. Every effort has been made to restrict the notations to those used in common nuclear physics textbooks. Most requests are for individual isotopes as are most ENDF evaluations, however, there are some requests for elemental measurements. Each request gives a priority rating which will be discussed in Section 2, the neutron energy range for which the request is made, the accuracy requested in terms of one standard deviation, and the requested energy resolution in terms of one standard deviation. Also given is the requestor with the comments which were furnished with the request. The addresses and telephone numbers of the requestors are given in Appendix 1. ENDF evaluators who may be contacted concerning evaluations are given in Appendix 2. Experimentalists contemplating making one of the requested measurements are encouraged to contact both the requestor and evaluator who may provide valuable information. This is a working document in that it will change with time. New requests or comments may be submitted to the editors or a regular CSEWG member at any time

  17. HOPE: A Python just-in-time compiler for astrophysical computations

    Science.gov (United States)

    Akeret, J.; Gamper, L.; Amara, A.; Refregier, A.

    2015-04-01

    The Python programming language is becoming increasingly popular for scientific applications due to its simplicity, versatility, and the broad range of its libraries. A drawback of this dynamic language, however, is its low runtime performance which limits its applicability for large simulations and for the analysis of large data sets, as is common in astrophysics and cosmology. While various frameworks have been developed to address this limitation, most focus on covering the complete language set, and either force the user to alter the code or are not able to reach the full speed of an optimised native compiled language. In order to combine the ease of Python and the speed of C++, we developed HOPE, a specialised Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimisation on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. We assess the performance of HOPE by performing a series of benchmarks and compare its execution speed with that of plain Python, C++ and the other existing frameworks. We find that HOPE improves the performance compared to plain Python by a factor of 2 to 120, achieves speeds comparable to that of C++, and often exceeds the speed of the existing solutions. We discuss the differences between HOPE and the other frameworks, as well as future extensions of its capabilities. The fully documented HOPE package is available at http://hope.phys.ethz.ch and is published under the GPLv3 license on PyPI and GitHub.
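
    Usage of the decorator-based interface described above can be sketched as follows; this is based on the usage described in the record and the HOPE documentation, and the hope.jit name should be verified against the installed release.

        import numpy as np
        import hope

        @hope.jit                   # on first call the function is translated
        def kick(x, v, dt):         # to C++ and compiled; subsequent calls run
            return x + v * dt       # the native code

        print(kick(np.ones(5), np.full(5, 0.5), 0.1))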

  18. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

    This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute cycle of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is, however, not trivial. This article describes the design, implementation and first results of the created Python-based compiler.
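
    The paper's input syntax is not reproduced here, so the following is only a hypothetical sketch of the kind of algorithmic Python description an HLS flow could map to VHDL: a small, statically shaped loop that unrolls naturally into a parallel datapath.

        # Hypothetical HLS input: a pure function with a fixed trip count that a
        # high-level synthesis compiler could unroll into multipliers and an adder tree.
        def fir4(x, h):
            acc = 0
            for i in range(4):  # fixed loop bounds -> fully unrollable in hardware
                acc += x[i] * h[i]
            return acc

        print(fir4([1, 2, 3, 4], [4, 3, 2, 1]))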

  19. Automatic Parallelization An Overview of Fundamental Compiler Techniques

    CERN Document Server

    Midkiff, Samuel P

    2012-01-01

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and
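
    A one-file illustration of the dependence analysis mentioned above (not taken from the book): the first loop has no loop-carried dependence and can run in parallel, while the second carries a flow dependence from one iteration to the next.

        n = 8
        a = [0.0] * n
        b = [float(i) for i in range(n)]

        # Independent iterations: a[i] depends only on b[i], so a compiler
        # may execute the iterations in parallel.
        for i in range(n):
            a[i] = 2.0 * b[i]

        # Loop-carried flow dependence: iteration i reads a[i - 1] written by
        # iteration i - 1, so the loop must stay serial or be transformed.
        for i in range(1, n):
            a[i] = a[i - 1] + b[i]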

  20. Market Analysis and Consumer Impacts Source Document. Part III. Consumer Behavior and Attitudes Toward Fuel Efficient Vehicles

    Science.gov (United States)

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part III consists of studies and reviews on: consumer awareness of fuel efficiency issues; consumer acceptance of fuel efficient vehicles; car size ch...

  1. Compiling an OPEC Word List: A Corpus-Informed Lexical Analysis

    Directory of Open Access Journals (Sweden)

    Ebtisam Saleh Aluthman

    2017-01-01

    Full Text Available The present study is conducted within the borders of lexicographic research, where corpora have increasingly become all-pervasive. The overall goal of this study is to compile an open-source OPEC[1] Word List (OWL) that is available for lexicographic research and vocabulary learning related to English language learning for the purpose of oil marketing and oil industries. To achieve this goal, an OPEC Monthly Reports Corpus (OMRC) comprising 1,004,542 words was compiled. The OMRC consists of 40 OPEC monthly reports released between 2003 and 2015. Consideration was given to both range and frequency criteria when compiling the OWL, which consists of 255 word types. Along with this basic goal, this study aims to investigate the coverage of the most well-recognised word lists, the General Service List of English Words (GSL) (West, 1953) and the Academic Word List (AWL) (Coxhead, 2000), in the OMRC corpus. The 255 word types included in the OWL do not overlap with either the AWL or the GSL. Results suggest the necessity of making this discipline-specific word list for ESL students of oil marketing industries. The availability of the OWL has significant pedagogical contributions to curriculum design, learning activities and the overall process of vocabulary learning in the context of teaching English for specific purposes (ESP). OPEC stands for Organisation of Petroleum Exporting Countries.

  2. A software methodology for compiling quantum programs

    Science.gov (United States)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.
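
    The layered lowering the abstract describes can be pictured with a toy rewrite pass (illustrative only, not the authors' software): one layer rewrites gates outside the hardware's native set into native ones.

        # Toy lowering pass: rewrite non-native gates into a native gate set.
        NATIVE = {"H", "CNOT", "RZ"}

        def lower(gate):
            name, qubits = gate
            if name in NATIVE:
                return [gate]
            if name == "CZ":  # identity: CZ(c, t) = H(t) CNOT(c, t) H(t)
                c, t = qubits
                return [("H", (t,)), ("CNOT", (c, t)), ("H", (t,))]
            raise NotImplementedError(name)

        circuit = [("CZ", (0, 1)), ("H", (0,))]
        print([g for gate in circuit for g in lower(gate)])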

  3. Advisory Committee on human radiation experiments. Final report, Supplemental Volume 2. Sources and documentation

    International Nuclear Information System (INIS)

    1995-01-01

    This volume and its appendixes supplement the Advisory Committee's final report by reporting how we went about looking for information concerning human radiation experiments and intentional releases, a description of what we found and where we found it, and a finding aid for the information that we collected. This volume begins with an overview of federal records, including general descriptions of the types of records that have been useful and how the federal government handles these records. This is followed by an agency-by-agency account of the discovery process and descriptions of the records reviewed, together with instructions on how to obtain further information from those agencies. There is also a description of other sources of information that have been important, including institutional records, print resources, and nonprint media and interviews. The third part contains brief accounts of ACHRE's two major contemporary survey projects (these are described in greater detail in the final report and another supplemental volume) and other research activities. The final section describes how the ACHRE information collections were managed and the records that ACHRE created in the course of its work; this constitutes a general finding aid for the materials deposited with the National Archives. The appendices provide brief references to federal records reviewed, descriptions of the accessions that comprise the ACHRE Research Document Collection, and descriptions of the documents selected for individual treatment. Also included are an account of the documentation available for ACHRE meetings, brief abstracts of the almost 4,000 experiments individually described by ACHRE staff, a full bibliography of secondary sources used, and other information.

  4. Advisory Committee on human radiation experiments. Final report, Supplemental Volume 2. Sources and documentation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-01-01

    This volume and its appendixes supplement the Advisory Committee's final report by reporting how we went about looking for information concerning human radiation experiments and intentional releases, a description of what we found and where we found it, and a finding aid for the information that we collected. This volume begins with an overview of federal records, including general descriptions of the types of records that have been useful and how the federal government handles these records. This is followed by an agency-by-agency account of the discovery process and descriptions of the records reviewed, together with instructions on how to obtain further information from those agencies. There is also a description of other sources of information that have been important, including institutional records, print resources, and nonprint media and interviews. The third part contains brief accounts of ACHRE's two major contemporary survey projects (these are described in greater detail in the final report and another supplemental volume) and other research activities. The final section describes how the ACHRE information collections were managed and the records that ACHRE created in the course of its work; this constitutes a general finding aid for the materials deposited with the National Archives. The appendices provide brief references to federal records reviewed, descriptions of the accessions that comprise the ACHRE Research Document Collection, and descriptions of the documents selected for individual treatment. Also included are an account of the documentation available for ACHRE meetings, brief abstracts of the almost 4,000 experiments individually described by ACHRE staff, a full bibliography of secondary sources used, and other information.

  5. A compiler for variational forms

    OpenAIRE

    Kirby, Robert C.; Logg, Anders

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in s...
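
    For readers unfamiliar with form compilers, a Poisson bilinear form in the UFL/FEniCS style that FFC consumes looks roughly as follows (a sketch assuming the legacy ufl package; the form compiler then generates low-level element assembly code from it).

        from ufl import (FiniteElement, TestFunction, TrialFunction,
                         dx, grad, inner, triangle)

        element = FiniteElement("Lagrange", triangle, 1)
        u = TrialFunction(element)  # unknown
        v = TestFunction(element)   # test function

        # Bilinear form of the Poisson problem; the form compiler turns this
        # high-level description into efficient element tensor code.
        a = inner(grad(u), grad(v)) * dx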

  6. Compiler Construction Using Java, JavaCC, and Yacc

    CERN Document Server

    Dos Reis, Anthony J

    2012-01-01

    Broad in scope, involving theory, the application of that theory, and programming technology, compiler construction is a moving target, with constant advances in compiler technology taking place. Today, a renewed focus on do-it-yourself programming makes a quality textbook on compilers, that both students and instructors will enjoy using, of even more vital importance. This book covers every topic essential to learning compilers from the ground up and is accompanied by a powerful and flexible software package for evaluating projects, as well as several tutorials, well-defined projects, and tes

  7. A compilation of reports of the Advisory Committee on Reactor Safeguards. 1994 annual. Volume 16

    International Nuclear Information System (INIS)

    1995-04-01

    This compilation contains 30 ACRS reports submitted to the Commission, or to the Executive Director for Operations, during calendar year 1994. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the U.S. Library of Congress. The reports are categorized by the most appropriate generic subject area and by chronological order within subject area

  8. PIG 3 - A simple compiler for mercury

    Energy Technology Data Exchange (ETDEWEB)

    Bindon, D C [Computer Branch, Technical Assessments and Services Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1961-06-15

    A short machine language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  9. PIG 3 - A simple compiler for mercury

    International Nuclear Information System (INIS)

    Bindon, D.C.

    1961-06-01

    A short machine language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  10. The Einstein Observatory catalog of IPC x ray sources. Volume 1E: Documentation

    Science.gov (United States)

    Harris, D. E.; Forman, W.; Gioia, I. M.; Hale, J. A.; Harnden, F. R., Jr.; Jones, C.; Karakashian, T.; Maccacaro, T.; Mcsweeney, J. D.; Primini, F. A.

    1993-01-01

    The Einstein Observatory (HEAO-2, launched November 13, 1978) achieved radically improved sensitivity over previous x-ray missions through the use of focusing optics, which simultaneously afforded greatly reduced background and produced true images. During its 2.5-yr mission, the Einstein X-Ray Telescope was pointed toward some 5,000 celestial targets, most of which were detected, and discovered several thousand additional 'serendipitous' sources in the observed fields. This catalog contains contour diagrams and source data, obtained with the imaging proportional counter in the 0.16 to 3.5 keV energy band, and describes methods for recovering upper limits for any sky position within the observed images. The main catalog consists of six volumes (numbered 2 through 7) of right ascension ordered pages, each containing data for one observation. Along with the primary documentation describing how the catalog was constructed, volume 1 contains a complete source list, results for merged fields, a reference system to published papers, and data useful for calculating upper limits and fluxes.

  11. Compilation of data on elementary particles

    International Nuclear Information System (INIS)

    Trippe, T.G.

    1984-09-01

    The most widely used data compilation in the field of elementary particle physics is the Review of Particle Properties. The origin, development and current state of this compilation are described with emphasis on the features which have contributed to its success: active involvement of particle physicists; critical evaluation and review of the data; completeness of coverage; regular distribution of reliable summaries including a pocket edition; heavy involvement of expert consultants; and international collaboration. The current state of the Review and new developments such as providing interactive access to the Review's database are described. Problems and solutions related to maintaining a strong and supportive relationship between compilation groups and the researchers who produce and use the data are discussed

  12. Compilation of functional languages using flow graph analysis

    NARCIS (Netherlands)

    Hartel, Pieter H.; Glaser, Hugh; Wild, John M.

    A system based on the notion of a flow graph is used to specify formally and to implement a compiler for a lazy functional language. The compiler takes a simple functional language as input and generates C. The generated C program can then be compiled, and loaded with an extensive run-time system to

  13. Animal mortality resulting from uniform exposures to photon radiations: Calculated LD₅₀s and a compilation of experimental data

    Energy Technology Data Exchange (ETDEWEB)

    Jones, T.D.; Morris, M.D.; Wells, S.M.; Young, R.W.

    1986-12-01

    Studies conducted during the 1950s and 1960s of radiation-induced mortality to diverse animal species under various exposure protocols were compiled into a mortality data base. Some 24 variables were extracted and recomputed from each of the published studies, which were collected from a variety of available sources, primarily journal articles. Two features of this compilation effort are (1) an attempt to give an estimate of the uniform dose received by the bone marrow in each treatment so that interspecies differences due to body size were minimized and (2) a recomputation of the LD₅₀ where sufficient experimental data are available. Exposure rates varied in magnitude from about 10⁻² to 10³ R/min. This report describes the data base, the sources of data, and the data-handling techniques; presents a bibliography of studies compiled; and tabulates data from each study. 103 refs., 44 tabs.

  14. Research and Practice of the News Map Compilation Service

    Science.gov (United States)

    Zhao, T.; Liu, W.; Ma, W.

    2018-04-01

    Based on the needs of the news media for maps, this paper researches the news map compilation service: it conducts demand research on the service, designs and compiles a public-authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps that are timely, highly targeted and cross-regional, constructs a hot-news thematic gallery and news map customization services, conducts research on types of news maps, establishes closer liaison and cooperation methods with the news media, and guides the news media to use correct maps. Through the practice of the news map compilation service, this paper presents two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and development suggestions.

  15. RESEARCH AND PRACTICE OF THE NEWS MAP COMPILATION SERVICE

    Directory of Open Access Journals (Sweden)

    T. Zhao

    2018-04-01

    Full Text Available Based on the needs of the news media for maps, this paper researches the news map compilation service: it conducts demand research on the service, designs and compiles a public-authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps that are timely, highly targeted and cross-regional, constructs a hot-news thematic gallery and news map customization services, conducts research on types of news maps, establishes closer liaison and cooperation methods with the news media, and guides the news media to use correct maps. Through the practice of the news map compilation service, this paper presents two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and development suggestions.

  16. ProteoWizard: open source software for rapid proteomics tools development.

    Science.gov (United States)

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers of the mzML data format, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and Xcode on OS X) is available for download free of charge at http://proteowizard.sourceforge.net. This website also provides code examples and documentation. It is our hope that the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.

  17. abc the aspectBench compiler for aspectJ a workbench for aspect-oriented programming language and compilers research

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    Aspect-oriented programming (AOP) is gaining popularity as a new way of modularising cross-cutting concerns. The aspectbench compiler (abc) is a new workbench for AOP research which provides an extensible research framework for both new language features and new compiler optimisations. This poste...

  18. A compilation of energy costs of physical activities.

    Science.gov (United States)

    Vaz, Mario; Karaolis, Nadine; Draper, Alizon; Shetty, Prakash

    2005-10-01

    There were two objectives: first, to review the existing data on energy costs of specified activities in the light of the recommendations made by the Joint Food and Agriculture Organization/World Health Organization/United Nations University (FAO/WHO/UNU) Expert Consultation of 1985; second, to compile existing data on the energy costs of physical activities for an updated annexure of the current Expert Consultation on Energy and Protein Requirements. An electronic and manual search of the literature (predominantly English) was carried out to obtain published data on the energy costs of physical activities. The majority of the data prior to 1955 were obtained from an earlier compilation by Passmore and Durnin. Energy costs were expressed as the physical activity ratio (PAR): the energy cost of the activity divided by either the measured or predicted basal metabolic rate (BMR). The compilation provides PARs for an expanded range of activities that includes general personal activities, transport, domestic chores, occupational activities, sports and other recreational activities, for men and women separately, where available. The present compilation is largely in agreement with the 1985 compilation for activities that are common to both. The present compilation has been based on the need to provide data on adults for a wide spectrum of human activity. There are, however, lacunae in the available data for many activities, between genders, across age groups and in various physiological states.
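
    The PAR bookkeeping defined above is simple enough to state as code; the numbers below are illustrative only, not values from the compilation.

        def physical_activity_ratio(activity_energy_cost, bmr):
            # PAR = energy cost of the activity / basal metabolic rate,
            # both expressed in the same units (e.g. kJ/min)
            return activity_energy_cost / bmr

        # Illustrative: an activity costing 10 kJ/min against a BMR of 4 kJ/min
        print(physical_activity_ratio(10.0, 4.0))  # PAR = 2.5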

  19. Fiscal 2000 report on advanced parallelized compiler technology. Outlines; 2000 nendo advanced heiretsuka compiler gijutsu hokokusho (Gaiyo hen)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    Research and development was carried out on automatic parallelizing compiler technology, which improves the practical performance, cost/performance ratio, and ease of use of the multiprocessor systems now used for constructing supercomputers and expected to provide a fundamental architecture for microprocessors in the 21st century. Efforts were made to develop an automatic multigrain parallelization technology for extracting multigrain parallelism from a program and making full use of it, and a parallelizing tuning technology for accelerating parallelization by feeding back to the compiler the dynamic information and user knowledge acquired during execution. Moreover, a benchmark program was selected, and studies were made to set execution rules and evaluation indexes for objectively evaluating the performance of parallelizing compilers for existing commercial parallel processing computers; this was achieved through the implementation and evaluation of the 'Advanced parallelizing compiler technology research and development project.' (NEDO)

  20. Regulatory and technical reports compilation for 1980

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzi, L.

    1981-04-01

    This compilation lists formal regulatory and technical reports and conference proceedings issued in 1980 by the US Nuclear Regulatory Commission. The compilation is divided into four major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The second major section of this compilation consists of a key-word index to report titles. The third major section contains an alphabetically arranged listing of contractor report numbers cross-referenced to their corresponding NRC report numbers. Finally, the fourth section is an errata supplement

  1. Synthesis document on the long time behavior of packages: operational document 'bituminous' 2204

    International Nuclear Information System (INIS)

    Tiffreau, C.

    2004-09-01

    This document is realized in the framework of the 1991 law on radioactive waste management. The 2004 synthesis document on the long-time behavior of bituminous sludge packages consists of two documents, the reference document and the operational document. This paper presents the operational model describing the water alteration of the packages and the associated release of radioelements, as well as the gas source term and the swelling associated with self-irradiation and radiolysis of the bitumen. (A.L.B.)

  2. Compiling quantum circuits to realistic hardware architectures using temporal planners

    Science.gov (United States)

    Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy

    2018-04-01

    To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits whose high number of commuting gates allows great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations but also means there is a greater potential win from more optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem, and generate a test suite of compilation problems for QAOA circuits of various sizes targeting a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.
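
    The nearest-neighbour constraint the planners must respect can be made concrete with a toy router (illustrative, not the authors' planner): on a linear architecture, a two-qubit gate between distant qubits first needs SWAPs, and it is exactly the choice and ordering of such moves that a temporal planner optimises.

        def route(gate, placement):
            # Insert SWAPs until the gate's logical qubits sit on adjacent
            # physical positions of a linear nearest-neighbour architecture.
            q1, q2 = gate
            ops = []
            while abs(placement[q1] - placement[q2]) > 1:
                step = 1 if placement[q2] > placement[q1] else -1
                neighbour = next(q for q, p in placement.items()
                                 if p == placement[q1] + step)
                ops.append(("SWAP", q1, neighbour))
                placement[q1], placement[neighbour] = placement[neighbour], placement[q1]
            ops.append(("CZ", q1, q2))
            return ops

        # Qubits a and c are two positions apart, so one SWAP precedes the gate.
        print(route(("a", "c"), {"a": 0, "b": 1, "c": 2}))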

  3. Poster session: Fifth users meeting for the Advanced Photon Source

    International Nuclear Information System (INIS)

    1992-11-01

    The Advanced Photon Source (APS), which is currently under construction as a national user facility at Argonne National Laboratory, is a third-generation synchrotron x-ray source, one of only three in the world. It is expected to produce x-rays that are 10,000 times brighter than any currently produced elsewhere, for use in research in a wide range of scientific areas. Users from industry, national laboratories, universities, and business will be able to come to the APS to conduct research either as members of Collaborative Access Teams (CATs) or as Independent Investigators. Principal users will be members of CATs, which will be building and operating all of the beamlines present in the first phase of APS beamline development. The first set of CATs has been selected through a competitive proposal process involving peer scientific review, thorough technical evaluation, and significant management oversight by the APS. This document is a compilation of posters presented at the Fifth Users Meeting for the Advanced Photon Source, held at Argonne National Laboratory on October 14--15, 1992. All CATs whose scientific cases were approved by the APS Proposal Evaluation Board are included. In addition, this document contains a poster from the Center for Synchrotron Radiation and Research and Instrumentation at the Illinois Institute of Technology.

  4. Compilation of FY 1995 and FY 1996 DOD Financial Statements at the Defense Finance and Accounting Service, Indianapolis Center

    National Research Council Canada - National Science Library

    1996-01-01

    The audit objective was to determine whether the Defense Finance and Accounting Service, Indianapolis Center, consistently and accurately compiled financial data from field entities and other sources...

  5. A Compilation of MATLAB Scripts and Functions for MACGMC Analyses

    Science.gov (United States)

    Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.

    2017-01-01

    The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The MathWorks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for CMCs and PMCs, visualization of RUCs from MACGMC input and output files, and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response. In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte-Carlo simulations to enable probabilistic

  6. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1988-01-01

    Reliability data are an essential part of probabilistic safety assessment. The quality of the data can determine the quality of the study as a whole. Component failure data originating from the plant being analyzed would obviously be most appropriate, but complete reliance on plant experience is possible in only a few cases, mainly because operating experience is rather limited. Nuclear plants, although of different design, often use fairly similar components, so some of the experience can be combined and transferred from one plant to another. In addition, information about component failures is also available from experts with knowledge of component design, manufacturing and operation. This brings us to the importance of assessing generic data. (Here, 'generic' means everything that is not plant-specific with regard to the plant being analyzed.) The generic data available in the open literature can be divided into three broad categories. The first includes data bases used in previous analyses; these can be plant-specific or generic data updated with plant-specific information (the latter case deserves special attention). The second is based on compilations of plants' operating experience, usually drawn from some kind of event reporting system. The third category includes data sources based on expert opinions (single or aggregated) or on combinations of expert opinion with other nuclear and non-nuclear experience. This paper reflects insights gained in compiling data from generic data sources and highlights the advantages and pitfalls of using generic component reliability data in PSAs.

  7. M2Lite: An Open-source, Light-weight, Pluggable and Fast Proteome Discoverer MSF to mzIdentML Tool.

    Science.gov (United States)

    Aiyetan, Paul; Zhang, Bai; Chen, Lily; Zhang, Zhen; Zhang, Hui

    2014-04-28

    Proteome Discoverer is one of many tools used for protein database search and peptide to spectrum assignment in mass spectrometry-based proteomics. However, the inadequacy of conversion tools makes it challenging to compare and integrate its results to those of other analytical tools. Here we present M2Lite, an open-source, light-weight, easily pluggable and fast conversion tool. M2Lite converts proteome discoverer derived MSF files to the proteomics community defined standard - the mzIdentML file format. M2Lite's source code is available as open-source at https://bitbucket.org/paiyetan/m2lite/src and its compiled binaries and documentation can be freely downloaded at https://bitbucket.org/paiyetan/m2lite/downloads.

  8. Regulatory and technical reports: compilation for 1975-1978

    International Nuclear Information System (INIS)

    1982-04-01

    This brief compilation lists formal reports issued by the US Nuclear Regulatory Commission in 1975 through 1978 that were not listed in the Regulatory and Technical Reports Compilation for 1975 to 1978, NUREG-0304, Vol. 3. This compilation is divided into two sections. The first consists of a sequential listing of all reports in report-number order. The second section consists of an index developed from keywords in report titles and abstracts

  9. Third-party brachytherapy source calibrations and physicist responsibilities: Report of the AAPM Low Energy Brachytherapy Source Calibration Working Group

    International Nuclear Information System (INIS)

    Butler, Wayne M.; Bice, William S. Jr.; DeWerd, Larry A.; Hevezi, James M.; Huq, M. Saiful; Ibbott, Geoffrey S.; Palta, Jatinder R.; Rivard, Mark J.; Seuntjens, Jan P.; Thomadsen, Bruce R.

    2008-01-01

    The AAPM Low Energy Brachytherapy Source Calibration Working Group was formed to investigate and recommend quality control and quality assurance procedures for brachytherapy sources prior to clinical use. Compiling and clarifying recommendations established by previous AAPM Task Groups 40, 56, and 64 were among the working group's charges, which also included the role of third-party handlers to perform loading and assay of sources. This document presents the findings of the working group on the responsibilities of the institutional medical physicist and a clarification of the existing AAPM recommendations in the assay of brachytherapy sources. Responsibility for the performance and attestation of source assays rests with the institutional medical physicist, who must use calibration equipment appropriate for each source type used at the institution. Such equipment and calibration procedures shall ensure secondary traceability to a national standard. For each multi-source implant, 10% of the sources or ten sources, whichever is greater, are to be assayed. Procedures for presterilized source packaging are outlined. The mean source strength of the assayed sources must agree with the manufacturer's stated strength to within 3%, or action must be taken to resolve the difference. Third party assays do not absolve the institutional physicist from the responsibility to perform the institutional measurement and attest to the strength of the implanted sources. The AAPM leaves it to the discretion of the institutional medical physicist whether the manufacturer's or institutional physicist's measured value should be used in performing dosimetry calculations

  10. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-10-01

    This is the fourth issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section every year. The material contained in this compilation is sorted according to eight subject categories: General compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes, half-lives, energies and spectra; nuclear decay processes, gamma-rays; nuclear decay processes, fission products; nuclear decay processes (others); atomic processes

  11. OMPC: an Open-Source MATLAB®-to-Python Compiler

    OpenAIRE

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical...

  12. FreeSASA: An open source C library for solvent accessible surface area calculations.

    Science.gov (United States)

    Mitternacht, Simon

    2016-01-01

    Calculating solvent accessible surface areas (SASA) is a run-of-the-mill calculation in structural biology. Although there are many programs available for this calculation, there are no free-standing, open-source tools designed for easy tool-chain integration. FreeSASA is an open source C library for SASA calculations that provides both command-line and Python interfaces in addition to its C API. The library implements both Lee and Richards' and Shrake and Rupley's approximations, and is highly configurable to allow the user to control molecular parameters, accuracy and output granularity. It only depends on standard C libraries and should therefore be easy to compile and install on any platform. The library is well-documented, stable and efficient. The command-line interface can easily replace closed source legacy programs, with comparable or better accuracy and speed, and with some added functionality.
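
    A minimal sketch of the Python interface mentioned in the abstract, assuming the freesasa module is installed and a local PDB file is available (the file name is illustrative):

        import freesasa

        structure = freesasa.Structure("1ubq.pdb")  # illustrative input file
        result = freesasa.calc(structure)           # default algorithm and parameters
        print("Total SASA: %.1f A^2" % result.totalArea())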

  13. Toward Documentation of Program Evolution

    DEFF Research Database (Denmark)

    Vestdam, Thomas; Nørmark, Kurt

    2005-01-01

    The documentation of a program often falls behind the evolution of the program source files. When this happens it may be attractive to shift the documentation mode from updating the documentation to documenting the evolution of the program. This paper describes tools that support the documentation... It is concluded that our approach can help revitalize older documentation, and that discovery of the fine-grained program evolution steps helps the programmer in documenting the evolution of the program.

  14. Code of ethics for the national pharmaceutical system: Codifying and compilation.

    Science.gov (United States)

    Salari, Pooneh; Namazi, Hamidreza; Abdollahi, Mohammad; Khansari, Fatemeh; Nikfar, Shekoufeh; Larijani, Bagher; Araminia, Behin

    2013-05-01

    Pharmacists, as health-care providers, face ethical issues in terms of pharmaceutical care, relationships with patients and cooperation with the health-care team. Beyond pharmacy, pharmaceutical companies in various fields of manufacturing, importing or distributing have their own ethical issues. Pharmacy practice is therefore vulnerable to ethical challenges and needs special codes of conduct. In response to this need, and based on a shared project between ethics experts from relevant research centers, the needs were fully identified and a specific code of conduct for each area was written. The code of conduct was subjected to the comments of all experts involved in the pharmaceutical sector and discussed critically in several meetings. The prepared code of conduct comprises a professional code of ethics for pharmacists, an ethics guideline for pharmaceutical manufacturers, an ethics guideline for pharmaceutical importers, an ethics guideline for pharmaceutical distributors, and an ethics guideline for policy makers. The document was compiled based on the principles of bioethics and professionalism. Compiling the code of ethics for the national pharmaceutical system is the first step in implementing ethics in pharmacy practice, and further efforts toward teaching professionalism and the ethical code as necessary, complementary measures are highly recommended.

  15. Indoor air quality environmental information handbook: Combustion sources

    Energy Technology Data Exchange (ETDEWEB)

    1990-06-01

    This environmental information handbook was prepared to assist both the non-technical reader (i.e., homeowner) and technical persons (such as researchers, policy analysts, and builders/designers) in understanding the current state of knowledge regarding combustion sources of indoor air pollution. Quantitative and descriptive data addressing the emissions, indoor concentrations, factors influencing indoor concentrations, and health effects of combustion-generated pollutants are provided. In addition, a review of the models, controls, and standards applicable to indoor air pollution from combustion sources is presented. The emphasis is on the residential environment. The data presented here have been compiled from government and privately-funded research results, conference proceedings, technical journals, and recent publications. It is intended to provide the technical reader with a comprehensive overview and reference source on the major indoor air quality aspects relating to indoor combustion activities, including tobacco smoking. In addition, techniques for determining potential concentrations of pollutants in residential settings are presented. This is an update of a 1985 study documenting the state of knowledge of combustion-generated pollutants in the indoor environment. 191 refs., 51 figs., 71 tabs.

  16. Documentation of knowledge in the development of CLEO

    International Nuclear Information System (INIS)

    Smith, D.E.

    1987-01-01

    The developer of an expert-based system is a translator between the world of a specialist and the world of computer models and algorithms. In the process of translating, the developer must make explicit the implicit assumptions and actions of the specialist. The process is akin in many ways to that of developing a two-way compiler which takes the input of the specialist, parses it for semantic content, and develops the action routines to be performed based on the semantic content. The process is complicated by the need for it to be two-way, for the specialist to be able to verify that the translation is correct. The developer must act as the inverse function to decompile the action routines into the specialist's language. As the process continues, iteration after iteration, the development of the compiler documents the knowledge of the specialist in a more regular form.

  17. Market Analysis and Consumer Impacts Source Document. Part I. The Motor Vehicle Market in the Late 1970's

    Science.gov (United States)

    1980-12-01

    The source document on motor vehicle market analysis and consumer impact consists of three parts. Part I is an integrated overview of the motor vehicle market in the late 1970's, with sections on the structure of the market, motor vehicle trends, con...

  18. Advanced Air Transportation Technologies Project, Final Document Collection

    Science.gov (United States)

    Mogford, Richard H.; Wold, Sheryl (Editor)

    2008-01-01

    This CD ROM contains a compilation of the final documents of the Advanced Air Transportation Technologies (AATT) project, which was an eight-year (1996 to 2004), $400M project managed by the Airspace Systems Program office, which was part of the Aeronautics Research Mission Directorate at NASA Headquarters. AATT focused on developing advanced automation tools and air traffic management concepts that would help improve the efficiency of the National Airspace System, while maintaining or enhancing safety. The documents contained in the CD are final reports on AATT tasks that serve to document the project's accomplishments over its eight-year term. Documents include information on: Advanced Air Transportation Technologies, Autonomous Operations Planner, Collaborative Arrival Planner, Distributed Air/Ground Traffic Management Concept Elements 5, 6, & 11, Direct-To, Direct-To Technology Transfer, Expedite Departure Path, En Route Data Exchange, Final Approach Spacing Tool - (Active and Passive), Multi-Center Traffic Management Advisor, Multi Center Traffic Management Advisor Technology Transfer, Surface Movement Advisor, Surface Management System, Surface Management System Technology Transfer and Traffic Flow Management Research & Development.

  19. Compilation of the FY 1998 Army General Fund Financial Statements at the Defense Finance and Accounting Service Indianapolis Center

    National Research Council Canada - National Science Library

    1999-01-01

    Our objective was to determine whether the DFAS Indianapolis Center consistently and accurately compiled financial data from field activities and other sources for the FY 1998 Army General Fund financial statements...

  20. Extension of Alvis compiler front-end

    Energy Technology Data Exchange (ETDEWEB)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl [AGH University of Science and Technology, Department of Applied Computer Science, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2015-12-31

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. The semantics of an Alvis model is expressed as an LTS graph (labelled transition system). Execution of any language statement is expressed as a transition between formally defined states of such a model. An LTS graph is generated using a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as part of the Alvis language to define parameter types and operations on them. Thanks to the compiler's modular construction, many aspects of the compilation of an Alvis model may be modified. Providing new plugins for the Alvis compiler that support languages like Java or C makes it possible to use these languages as part of Alvis instead of Haskell. The paper presents the compiler's internal model and describes how the default specification language can be altered by new plugins.
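
    The LTS construction the abstract refers to amounts to an exhaustive exploration of model states; the following is an illustrative sketch of that idea only (the real toolchain generates the LTS from the Haskell representation of the model):

        from collections import deque

        def build_lts(initial, step):
            # step(state) -> iterable of (label, next_state); returns the edge list
            seen, queue, edges = {initial}, deque([initial]), []
            while queue:
                state = queue.popleft()
                for label, nxt in step(state):
                    edges.append((state, label, nxt))
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return edges

        # A two-state toggle process: executing 'flip' moves between states.
        print(build_lts(0, lambda s: [("flip", 1 - s)]))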

  1. Gaz de France. Source document; Gaz de France. Document de base

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    This document was issued by Gaz de France, the French gas utility, on the occasion of the opening of the company's capital. It is intended for shareholders and presents information about the stocks admitted to Euronext's Eurolist, general information about the company and its capital, information about the activities of the Gaz de France group, about its financial situation and results, about its management, and about its recent evolution and future perspectives. (J.S.)

  2. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    The language considered is block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs by specializing an interpreter, in a style of type-directed compilation in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter.

  3. Semantics-based compiling: A case study in type-directed partial evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    The language considered is block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs by specializing an interpreter, in a style of type-directed compilation in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter.
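
    The first Futamura projection mentioned in both records can be shown in miniature (an illustrative sketch, far simpler than the paper's typed language): fixing the program argument of an interpreter yields a residual function that acts as the compiled program.

        def interpret(program, env):
            # Tiny expression interpreter; programs are nested tuples.
            op = program[0]
            if op == "lit":
                return program[1]
            if op == "var":
                return env[program[1]]
            if op == "add":
                return interpret(program[1], env) + interpret(program[2], env)
            raise ValueError(op)

        def specialize(program):
            # First Futamura projection in spirit: fix the static argument.
            # A real partial evaluator would also unfold interpret itself,
            # leaving residual code that no longer mentions the interpreter.
            return lambda env: interpret(program, env)

        double_x = specialize(("add", ("var", "x"), ("var", "x")))
        print(double_x({"x": 21}))  # 42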

  4. Compilation of current high energy physics experiments

    International Nuclear Information System (INIS)

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche

  5. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  6. Documentation of Accounting Records in Light of Legislative Innovations

    OpenAIRE

    K. V. BEZVERKHIY

    2017-01-01

    Legislative reforms in accounting aim to simplify accounting records and the compilation of financial reports by business entities, thus improving Ukraine's position in the global Doing Business ranking. This simplification is implied in the changes to the Regulation on Documentation of Accounting Records, entered into force by a Resolution of the Ukrainian Ministry of Finance. The objective of the study is to analyze the legislative innovations involved. The review of changes in docum...

  7. Writing Compilers and Interpreters A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

    Long-awaited revision to a unique guide that covers both compilers and interpreters Revised, updated, and now focusing on Java instead of C++, this long-awaited, latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  8. Tank waste source term inventory validation. Volume 1. Letter report

    International Nuclear Information System (INIS)

    Brevick, C.H.; Gaddis, L.A.; Johnson, E.D.

    1995-01-01

    The sample data for the selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, arranged in a tabular format, and plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks and the four aging waste tanks. The solid and liquid sample data were placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, D. Braun data base, Wastren (Van Vleet) data base, TRAC and HTCE inventories. This document is Volume I of the Letter Report entitled Tank Waste Source Term Inventory Validation.

  9. Sharing analysis in the Pawns compiler

    Directory of Open Access Journals (Sweden)

    Lee Naish

    2015-09-01

    Full Text Available Pawns is a programming language under development that supports algebraic data types, polymorphism, higher order functions and “pure” declarative programming. It also supports impure imperative features including destructive update of shared data structures via pointers, allowing significantly increased efficiency for some operations. A novelty of Pawns is that all impure “effects” must be made obvious in the source code and they can be safely encapsulated in pure functions in a way that is checked by the compiler. Execution of a pure function can perform destructive updates on data structures that are local to or eventually returned from the function without risking modification of the data structures passed to the function. This paper describes the sharing analysis which allows impurity to be encapsulated. Aspects of the analysis are similar to other published work, but in addition it handles explicit pointers and destructive update, higher order functions including closures and pre- and post-conditions concerning sharing for functions.
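
    To give a flavour of what a sharing analysis tracks (an illustrative toy, not the Pawns implementation), the sketch below propagates may-share sets so that a destructive update through one name is known to affect its aliases:

        def analyse(stmts):
            # stmts: ('alias', x, y) makes x share with y;
            #        ('update', x) marks everything sharing with x as mutated.
            share = {}       # variable -> set of variables it may share with
            mutated = set()
            for stmt in stmts:
                if stmt[0] == "alias":
                    _, x, y = stmt
                    group = share.setdefault(x, {x}) | share.setdefault(y, {y})
                    for v in group:
                        share[v] = group
                elif stmt[0] == "update":
                    mutated |= share.get(stmt[1], {stmt[1]})
            return mutated

        # y aliases x, so a destructive update through y also taints x.
        print(analyse([("alias", "y", "x"), ("update", "y")]))  # {'x', 'y'}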

  10. The Indian Liberation and Social Rights Movement in Kollasuyu (Bolivia). IWGIA Document 30.

    Science.gov (United States)

    Apaza, Julio Tumiri, Ed.

    For some time the Aymara and Quechua Indians have been adopting resolutions and submitting them to the relevant authorities. Compiled by the Centro de Coordinacion y Promocion Campesina "Mink'A" for consideration by the "First Meeting of Anthropologists in the Andean Region" held in September 1975, this document gives a general…

  11. Methodological challenges involved in compiling the Nahua pharmacopeia.

    Science.gov (United States)

    De Vos, Paula

    2017-06-01

    Recent work in the history of science has questioned the Eurocentric nature of the field and sought to include a more global approach that would serve to displace center-periphery models in favor of approaches that take seriously local knowledge production. Historians of Iberian colonial science have taken up this approach, which involves reliance on indigenous knowledge traditions of the Americas. These traditions present a number of challenges to modern researchers, including availability and reliability of source material, issues of translation and identification, and lack of systematization. This essay explores the challenges that emerged in the author's attempt to compile a pre-contact Nahua pharmacopeia, the reasons for these challenges, and the ways they may - or may not - be overcome.

  12. Interpretation, compilation and field verification procedures in the CARETS project

    Science.gov (United States)

    Alexander, Robert H.; De Forth, Peter W.; Fitzpatrick, Katherine A.; Lins, Harry F.; McGinty, Herbert K.

    1975-01-01

    The production of the CARETS map data base involved the development of a series of procedures for interpreting, compiling, and verifying data obtained from remote sensor sources. Level II land use mapping from high-altitude aircraft photography at a scale of 1:100,000 required production of a photomosaic mapping base for each of the 48, 50 x 50 km sheets, and the interpretation and coding of land use polygons on drafting film overlays. CARETS researchers also produced a series of 1970 to 1972 land use change overlays, using the 1970 land use maps and 1972 high-altitude aircraft photography. To enhance the value of the land use sheets, researchers compiled series of overlays showing cultural features, county boundaries and census tracts, surface geology, and drainage basins. In producing Level I land use maps from Landsat imagery, at a scale of 1:250,000, interpreters overlaid drafting film directly on Landsat color composite transparencies and interpreted on the film. They found that such interpretation involves pattern and spectral signature recognition. In studies using Landsat imagery, interpreters identified numerous areas of change but also identified extensive areas of "false change," where Landsat spectral signatures but not land use had changed.

  13. Promising Compilation to ARMv8 POP

    OpenAIRE

    Podkopaev, Anton; Lahav, Ori; Vafeiadis, Viktor

    2017-01-01

    We prove the correctness of compilation of relaxed memory accesses and release-acquire fences from the "promising" semantics of [Kang et al. POPL'17] to the ARMv8 POP machine of [Flur et al. POPL'16]. The proof is highly non-trivial because both the ARMv8 POP and the promising semantics provide some extremely weak consistency guarantees for normal memory accesses; however, they do so in rather different ways. Our proof of compilation correctness to ARMv8 POP strengthens the results of the Kan...

  14. Design of methodology for incremental compiler construction

    Directory of Open Access Journals (Sweden)

    Pavel Haluza

    2011-01-01

    Full Text Available The paper deals with possibilities for incremental compiler construction. It presents construction techniques both for languages with a fixed set of lexical units and for languages with a variable set of lexical units. The proposed methodology for incremental compiler construction is based on known algorithms for standard compiler construction and is derived for both groups of languages. The group of languages with a fixed set of lexical units comprises languages in which each lexical unit has a constant meaning, e.g., common programming languages. For this group the paper addresses incremental semantic analysis, which is based on incremental parsing. In languages with a variable set of lexical units (e.g., the professional typographic system TeX), the meaning of any character in the input can be changed arbitrarily at any time during processing. The change takes effect immediately, and its validity is either explicitly limited or extends to the end of the input. For this group the paper addresses the case of macros that temporarily change the category of arbitrary characters.
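    As an illustration of the latter case, the following minimal sketch (ours, not the paper's; the two-category setup is an assumption for illustration) shows a lexer whose character categories can be reassigned mid-scan, in the spirit of TeX's catcodes:

        # Sketch: a lexer that consults its category table live, so a
        # category change takes effect immediately, as in TeX's \catcode.
        LETTER, ESCAPE = "LETTER", "ESCAPE"

        def lex(text, catcode):
            """Yield (category, char) pairs; `catcode` maps a character to
            its category and may be mutated by the caller between tokens."""
            for ch in text:
                yield (catcode.get(ch, LETTER), ch)

        catcode = {"\\": ESCAPE}
        gen = lex("abc", catcode)
        print(next(gen))        # ('LETTER', 'a')
        catcode["b"] = ESCAPE   # change the category of 'b' mid-scan
        print(next(gen))        # ('ESCAPE', 'b'), the change applies at once

    An incremental compiler for such a language cannot reuse cached token streams across edits unless it also records the category state in force at each input position.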

  15. A compilation of reports of the Advisory Committee on Nuclear Waste, July 1992--June 1993

    International Nuclear Information System (INIS)

    1993-08-01

    This compilation contains 17 reports issued by the Advisory Committee on Nuclear Waste (ACNW) during the fifth year of its operation. The reports were submitted to the Chairman and Commissioners of the US Nuclear Regulatory Commission, the Executive Director for Operations, the Director, Office of Nuclear Material Safety and Safeguards, or to the Director, Division of High Level Waste Management, Office of Nuclear Material Safety and Safeguards. All reports prepared by the Committee have been made available to the public through the NRC Public Document Room and the US Library of Congress

  16. Compiling the First Monolingual Lusoga Dictionary

    Directory of Open Access Journals (Sweden)

    Minah Nabirye

    2011-10-01

    Full Text Available

    Abstract: In this research article a study is made of the approach followed to compile the first-ever monolingual dictionary for Lusoga. Lusoga is a Bantu language spoken in Uganda by slightly over two million people. Since Lusoga is an under-resourced language, an orthography had to be designed, a grammar written, and a corpus built before embarking on the compilation of the dictionary. This compilation was aimed at attaining an academic degree, hence requiring a rigorous research methodology. Firstly, the prevailing methods for compiling dictionaries were mainly practical and insufficient in explaining the theoretical linguistic basis for dictionary compilation. Since dictionaries are based on meaning, the theory of meaning was used to account for all linguistic data considered in dictionaries. However, meaning is considered at a very abstract level, far removed from the process of compiling dictionaries. Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular theory explains how the different modules of a language contribute information to the different parts of the dictionary article or dictionary information in general. Secondly, the research also had to contend with the different approaches for analysing Bantu languages for Bantu and European audiences. A description of the Bantu- and European-centred approaches to Bantu studies was undertaken in respect of (a) the classification of Lusoga words, and (b) the specification of their citations. As a result, Lusoga lexicography deviates from the prevailing Bantu classification and citation of nouns, adjectives and verbs in particular. The dictionary was tested on two separate occasions and all the feedback was considered in the compilation process. This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary

  17. AICPA allows low-cost options for compiled financial statements.

    Science.gov (United States)

    Reinstein, Alan; Luecke, Randall W

    2002-02-01

    The AICPA Accounting and Review Services Committee's (ARSC) SSARS No. 8, Amendment to Statement on Standards for Accounting and Review Services No. 1, Compilation and Review of Financial Statements, issued in October 2000, allows financial managers to provide plain-paper, compiled financial statements for the exclusive use of management. Such financial statements were disallowed in 1979 when the AICPA issued SSARS No. 1, Compilation and Review of Financial Statements. With the issuance of SSARS No. 8, financial managers can prepare plain-paper, compiled financial statements when third parties are not expected to rely on the financial statements, management acknowledges such restrictions in writing, and management acknowledges its primary responsibility for the adequacy of the financial statements.

  18. Electronic circuits for communications systems: A compilation

    Science.gov (United States)

    1972-01-01

    The compilation of electronic circuits for communications systems is divided into thirteen basic categories, each representing an area of circuit design and application. The compilation items are moderately complex and, as such, would appeal to the applications engineer. However, the rationale for the selection criteria was tailored so that the circuits would reflect fundamental design principles and applications, with an additional requirement for simplicity whenever possible.

  19. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1986 annual

    International Nuclear Information System (INIS)

    1987-04-01

    This compilation contains 58 ACRS reports submitted to the Commission or to the Executive Director for Operations during calendar year 1986. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. No classified or other controlled information was prepared in 1986. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and within project name by chronological order. Part 2 categorizes the reports by the most appropriate generic subject area and within subject area by chronological order

  20. Compilation Techniques Specific for a Hardware Cryptography-Embedded Multimedia Mobile Processor

    Directory of Open Access Journals (Sweden)

    Masa-aki FUKASE

    2007-12-01

    Full Text Available The development of single chip VLSI processors is the key technology of ever growing pervasive computing to answer overall demands for usability, mobility, speed, security, etc. We have so far developed a hardware cryptography-embedded multimedia mobile processor architecture, HCgorilla. Since HCgorilla integrates a wide range of techniques from architectures to applications and languages, a one-sided design approach is not always useful. HCgorilla needs a more elaborate strategy, that is, hardware/software (H/S) codesign. Thus, we exploit the software support of HCgorilla, composed of a Java interface and parallelizing compilers. They are assumed to be installed in servers in order to reduce the load and increase the performance of HCgorilla-embedded clients. Since compilers are the essence of the software side of this codesign, we focus in this article on our recent results concerning the design, specification, and prototyping of parallelizing compilers for HCgorilla. The parallelizing compilers are composed of a multicore compiler and a LIW compiler. They are designed to extract parallelism from executable serial code or from the Java interface output, and to emit code that HCgorilla can execute in parallel. The prototype compilers are written in Java. An evaluation using an arithmetic test program shows that the prototype compilers are reasonable in comparison with hand compilation.

  1. Documentation for grants equal to tax model: Volume 3, Source code

    International Nuclear Information System (INIS)

    Boryczka, M.K.

    1986-01-01

    The GETT model is capable of forecasting the amount of tax liability associated with all property owned and all activities undertaken by the US Department of Energy (DOE) in site characterization and repository development. The GETT program is a user-friendly, menu-driven model developed using dBASE III(TM), a relational data base management system. The data base for GETT consists primarily of eight separate dBASE III(TM) files corresponding to each of the eight taxes (real property, personal property, corporate income, franchise, sales, use, severance, and excise) levied by State and local jurisdictions on business property and activity. Additional smaller files help to control model inputs and reporting options. Volume 3 of the GETT model documentation is the source code. The code is arranged primarily by the eight tax types. Other code files include those for JURISDICTION, SIMULATION, VALIDATION, TAXES, CHANGES, REPORTS, GILOT, and GETT. The code has been verified through hand calculations

  2. 49 CFR 801.57 - Records compiled for law enforcement purposes.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Records compiled for law enforcement purposes. 801... compiled for law enforcement purposes. Pursuant to 5 U.S.C. 552(b)(7), any records compiled for law or..., would disclose investigative procedures and practices, or would endanger the life or security of law...

  3. The Italian Geographers' Document on the University Education of Future Primary School Teachers

    Science.gov (United States)

    Giorda, Cristiano; Di Palma, Maria Teresa

    2011-01-01

    This article describes an important document compiled by a group of Italian geographers who teach in the Teaching Sciences faculty. Twenty-two university professors in an online community debated concepts and compared ideas in order to establish content, methods and didactic approaches to be applied when training Primary School teachers (pupils…

  4. Gulf Coast geopressured-geothermal program summary report compilation. Volume 4: Bibliography (annotated only for all major reports)

    Energy Technology Data Exchange (ETDEWEB)

    John, C.J.; Maciasz, G.; Harder, B.J.

    1998-06-01

    This bibliography contains US Department of Energy sponsored Geopressured-Geothermal reports published after 1984. Reports published prior to 1984 are documented in the Geopressured Geothermal bibliography Volumes 1, 2, and 3 that the Center for Energy Studies at the University of Texas at Austin compiled in May 1985. It represents reports, papers and articles covering topics from the scientific and technical aspects of geopressured geothermal reservoirs to the social, environmental, and legal considerations of exploiting those reservoirs for their energy resources.

  5. An Efficient Compiler for Weighted Rewrite Rules

    OpenAIRE

    Mohri, Mehryar; Sproat, Richard

    1996-01-01

    Context-dependent rewrite rules are used in many areas of natural language and speech processing. Work in computational phonology has demonstrated that, given certain conditions, such rewrite rules can be represented as finite-state transducers (FSTs). We describe a new algorithm for compiling rewrite rules into FSTs. We show the algorithm to be simpler and more efficient than existing algorithms. Further, many of our applications demand the ability to compile weighted rules into weighted FST...
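    As a concrete illustration (not the authors' implementation), a context-dependent rule can be compiled into a weighted FST with the open-source pynini library, whose cdrewrite function builds this family of transducers:

        # Sketch, assuming pynini is installed: compile the weighted rule
        # a -> b / __ c (rewrite "a" as "b" before "c", with weight 0.5).
        import pynini

        sigma_star = pynini.union("a", "b", "c").closure()  # closed alphabet
        tau = pynini.cross("a", pynini.accep("b", weight=0.5))
        rule = pynini.cdrewrite(tau, "", "c", sigma_star)

        # Apply the rule FST to an input string by composition.
        lattice = pynini.compose("aac", rule)
        best = pynini.shortestpath(lattice).project("output").rmepsilon()
        print(best.string())  # "abc": only the "a" before "c" is rewritten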

  6. HOPE: Just-in-time Python compiler for astrophysical computations

    Science.gov (United States)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of compiled implementation.
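    The decorator-based usage the abstract describes looks like the following sketch; the decorator name hope.jit follows the HOPE project's published documentation and should be treated as an assumption for other versions:

        # Sketch: mark a numerical function for JIT translation to C++.
        import hope

        @hope.jit
        def sum_of_squares(x, y):
            # Plain numerical Python; HOPE translates the body to C++ on
            # the first call and reuses the compiled version afterwards.
            r = 0.0
            for i in range(len(x)):
                r += x[i] * x[i] + y[i] * y[i]
            return r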

  7. Compiling the First Monolingual Lusoga Dictionary | Nabirye | Lexikos

    African Journals Online (AJOL)

    Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular ... This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary. Keywords: lexicography ...

  8. Microfluidic very large scale integration (VLSI) modeling, simulation, testing, compilation and physical synthesis

    CERN Document Server

    Pop, Paul; Madsen, Jan

    2016-01-01

    This book presents the state-of-the-art techniques for the modeling, simulation, testing, compilation and physical synthesis of mVLSI biochips. The authors describe a top-down modeling and synthesis methodology for the mVLSI biochips, inspired by microelectronics VLSI methodologies. They introduce a modeling framework for the components and the biochip architecture, and a high-level microfluidic protocol language. Coverage includes a topology graph-based model for the biochip architecture, and a sequencing graph to model the biochemical application, showing how the application model can be obtained from the protocol language. The techniques described facilitate programmability and automation, enabling developers in the emerging, large biochip market. · Presents the current models used for the research on compilation and synthesis techniques of mVLSI biochips in a tutorial fashion; · Includes a set of "benchmarks" that are presented in great detail, and includes the source code of several of the techniques p...

  9. New Challenges of the Documentation in Media

    Directory of Open Access Journals (Sweden)

    Antonio García Jiménez

    2015-07-01

    Full Text Available This special issue, presented by index.comunicación, is focused on media-related information and documentation. This field undergoes constant and profound change, especially visible in documentation processes: a situation characterized by tablets, smartphones, and applications, by the nearly complete digitization of traditional documents, and by the crisis of the press business model, which is transforming journalists' tasks and their relationship with documentation. Papers included in this special issue focus on some of the concerns in this domain: journalists' growing autonomy in accessing information sources, the role of press offices as documentation sources, searching for information on the web, the situation of media blogs, the viability of information-architecture elements in smart TV, and the development of social TV and its connection to documentation.

  10. Data compilation for particle-impact desorption, 2

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeutchi, Fujio.

    1985-07-01

    Particle-impact desorption is one of the elementary processes of hydrogen recycling in controlled thermonuclear fusion reactors. We have surveyed the literature on ion-impact desorption and photon-stimulated desorption published through the end of 1984 and compiled the data on desorption cross sections and yields with the aid of a computer. This report presents the results of the compilation in graphs and tables as functions of incident energy, surface temperature and surface coverage. (author)

  11. Tank waste source term inventory validation. Volume II. Letter report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    This document comprises Volume II of the Letter Report entitled Tank Waste Source Term Inventory Validation. This volume contains Appendix C, Radionuclide Tables, and Appendix D, Chemical Analyte Tables. The sample data for a selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, arranged in a tabular format, and plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks and the four aging waste tanks. The solid and liquid sample data were placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, D. Braun data base, Wastren (Van Vleet) data base, TRAC and HTCE inventories.

  12. Tank waste source term inventory validation. Volume II. Letter report

    International Nuclear Information System (INIS)

    1995-04-01

    This document comprises Volume II of the Letter Report entitled Tank Waste Source Term Inventory Validation. This volume contains Appendix C, Radionuclide Tables, and Appendix D, Chemical Analyte Tables. The sample data for a selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, arranged in a tabular format, and plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks and the four aging waste tanks. The solid and liquid sample data were placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, D. Braun data base, Wastren (Van Vleet) data base, TRAC and HTCE inventories

  13. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1992 Annual

    International Nuclear Information System (INIS)

    1993-04-01

    This compilation contains 50 ACRS reports submitted to the Commission, Executive Director for Operations, or to the Office of Nuclear Regulatory Research, during calendar year 1992. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and by chronological order within project name. Part 2 categorizes the reports by the most appropriate generic subject area and by chronological order within subject area

  14. A compilation of reports of the Advisory Committee on Reactor Safeguards, 1990 annual

    International Nuclear Information System (INIS)

    1991-04-01

    This compilation contains 31 Advisory Committee on Reactor Safeguards (ACRS) reports submitted to the Commission or to the Executive Director for Operations during calendar year 1990. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and by chronological order within project name. Part 2 categorizes the reports by the most appropriate generic subject area and by chronological order within subject area

  15. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1987 annual

    International Nuclear Information System (INIS)

    1988-04-01

    This compilation contains 47 ACRS reports submitted to the Commission or to the Executive Director for Operations during calendar year 1987. It also includes a report to the Congress on the NRC Safety Research Program for FY 1988. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and within project name by chronological order. Part 2 categorizes the reports by the most appropriate generic subject area and within subject area by chronological order

  16. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1989 annual

    International Nuclear Information System (INIS)

    1990-04-01

    This compilation contains 54 ACRS reports submitted to the Commission or to the Executive Director for Operations during calendar year 1989. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1 -- ACRS Reports on Project Reviews, and Part 2 -- ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and within project name by chronological order. Part 2 categorizes the reports by the most appropriate generic subject area and within subject area by chronological order

  17. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1988 annual

    International Nuclear Information System (INIS)

    1989-04-01

    This compilation contains 47 ACRS reports submitted to the Commission or to the Executive Director for Operations during calendar year 1988. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1, ACRS Reports on Project Reviews, and Part 2, ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and within project name by chronological order. Part 2 categorizes the reports by the most appropriate generic subject area and within subject area by chronological order. 136 refs., 1 tab

  18. A compilation of reports of the Advisory Committee on reactor safeguards. 1996 Annual report

    International Nuclear Information System (INIS)

    1997-04-01

    This compilation contains 47 ACRS reports submitted to the Commission, or to the Executive Director for Operations, during calendar year 1996. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room, the U.S. Library of Congress, and the Internet at http://www.nrc.gov/ACRSACNW. The reports are divided into two groups: Part 1 contains ACRS reports by project name and by chronological order within project name. Part 2 categorizes the reports by the most appropriate generic subject area and by chronological order within subject area

  19. Compilation of reports of The Advisory Committee on Reactor Safeguards, 1985 annual. Volume 7

    International Nuclear Information System (INIS)

    1986-05-01

    This compilation contains 63 ACRS reports submitted to the Commission or to the Executive Director for Operations during calendar year 1985. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. No classified or other controlled information was prepared in 1985. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and within project name by chronological order. Part 2 categorizes the reports by the most appropriate generic subject area and within subject area by chronological order

  20. Guidance documents relating to landfills and contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Schomaker, N.B.; Zunt, D.A.

    1990-01-01

    The Environmental Protection Agency is developing and updating a series of Technical Guidance Documents to provide best engineering control technology to meet the needs of the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response Compensation and Liability Act (CERCLA), respectively. These documents compile the research efforts to date on containing pollutants from land disposal of waste, as they relate to residuals management. The specific areas of research being conducted under the RCRA land disposal program relate to laboratory, pilot and field validation studies in cover systems, waste leaching and solidification, liner systems and disposal facility evaluation. The specific areas of research being conducted under the CERCLA uncontrolled waste sites (Superfund) program relate to in situ treatment, solidification/stabilization for treating hazardous waste, combustion technologies, best demonstrated available technology (BDAT), on-site treatment technologies, emerging biosystems, expert systems, personnel health protection equipment, and site and situation assessment. The Guidance Documents are intended to provide the regulated community, the permitting authorities, the EPA Program Offices and Regions, the states, and other interested parties with the latest information relevant to waste management.

  1. The management of electronic documents generated from compilation and revision processes of nuclear and radiation safety regulations and standards

    International Nuclear Information System (INIS)

    Wang Wenhai; Fan Yun; Shang Zhaorong

    2010-01-01

    As the Secretary Group of the Regulations and Standards Review Committee on nuclear and radiation safety needs to deal with a large number of electronic documents in the course of the regulation and standard review meetings, the article gives a systematic method covering electronic document file naming and management as well as procedures for file transfer, storage and usage. (authors)

  2. Compilation and analysis of multiple groundwater-quality datasets for Idaho

    Science.gov (United States)

    Hundt, Stephen A.; Hopkins, Candice B.

    2018-05-09

    Groundwater is an important source of drinking and irrigation water throughout Idaho, and groundwater quality is monitored by various Federal, State, and local agencies. The historical, multi-agency records of groundwater quality include a valuable dataset that has yet to be compiled or analyzed on a statewide level. The purpose of this study is to combine groundwater-quality data from multiple sources into a single database, to summarize this dataset, and to perform bulk analyses to reveal spatial and temporal patterns of water quality throughout Idaho. Data were retrieved from the Water Quality Portal (https://www.waterqualitydata.us/), the Idaho Department of Environmental Quality, and the Idaho Department of Water Resources. Analyses included counting the number of times a sample location had concentrations above Maximum Contaminant Levels (MCL), performing trend tests, and calculating correlations between water-quality analytes. The water-quality database and the analysis results are available through USGS ScienceBase (https://doi.org/10.5066/F72V2FBG).
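    A minimal sketch of this kind of bulk analysis (ours, for illustration; the file and column names are assumptions, and 10 mg/L as N is the federal MCL for nitrate):

        # Sketch: combine per-agency tables, then count, for each site,
        # how many samples exceeded the MCL for one analyte.
        import pandas as pd

        frames = [pd.read_csv(f) for f in ("wqp.csv", "ideq.csv", "idwr.csv")]
        gw = pd.concat(frames, ignore_index=True)

        MCL_NITRATE = 10.0  # mg/L as N
        nitrate = gw[gw["analyte"] == "Nitrate"]
        exceedances = (
            (nitrate["value"] > MCL_NITRATE)
            .groupby(nitrate["site_id"])
            .sum()
        )
        print(exceedances.sort_values(ascending=False).head())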

  3. A compilation of experimental burnout data for axial flow of water in rod bundles

    International Nuclear Information System (INIS)

    Chapman, A.G.; Carrard, G.

    1981-02-01

    A compilation has been made of burnout (critical heat flux) data from the results of more than 12,000 tests on 321 electrically-heated, water-cooled experimental assemblies, each simulating, to some extent, the operating or postulated accident conditions in the fuel elements of water-cooled nuclear power reactors. The main geometric characteristics of the assemblies are listed and references are given for the sources of information from which the data were gathered

  4. Technical resource documents and technical handbooks for hazardous-wastes management

    Energy Technology Data Exchange (ETDEWEB)

    Schomaker, N.B.; Bliss, T.M.

    1986-07-01

    The Environmental Protection Agency is preparing a series of Technical Resource Documents (TRD's) and Technical Handbooks to provide best engineering control technology to meet the needs of the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response Compensation and Liability Act (CERCLA) respectively. These documents and handbooks are basically a compilation of the research efforts of the Land Pollution Control Division (LPCD) to date. The specific areas of research being conducted under the RCRA land disposal program relate to laboratory, pilot and field validation studies in cover systems, waste leaching and solidification, liner systems and disposal facility evaluation. The technical handbooks provide the EPA Program Offices and Regions, as well as the states and other interested parties, with the latest information relevant to remedial actions.

  5. An Initial Evaluation of the NAG f90 Compiler

    Directory of Open Access Journals (Sweden)

    Michael Metcalf

    1992-01-01

    Full Text Available A few weeks before the formal publication of the ISO Fortran 90 Standard, NAG announced the world's first f90 compiler. We have evaluated the compiler by using it to assess the impact of Fortran 90 on the CERN Program Library.

  6. Using 137Cs and 210Pbex and other sediment source fingerprints to document suspended sediment sources in small forested catchments in south-central Chile

    International Nuclear Information System (INIS)

    Schuller, P.; Walling, D.E.; Iroumé, A.; Quilodrán, C.; Castillo, A.; Navas, A.

    2013-01-01

    A study of the impact of forest harvesting operations on sediment mobilization from forested catchments has been undertaken in south-central Chile. The study focused on two sets of small paired catchments (treatment and control), with similar soil type, but contrasting mean annual rainfall, located about 400 km apart at Nacimiento (1200 mm yr⁻¹) and Los Ulmos (2500 mm yr⁻¹). The objective was to study the changes in the relative contribution of the primary sources of fine sediment caused by forestry operations. Attention focused on the pre-harvest and post-harvest periods and the post-replanting period was included for the Nacimiento treatment catchment. The sediment source fingerprinting technique was used to document the contributions of the potential sources. Emphasis was placed on discriminating between the forest slopes, forest roads and channel erosion as potential sources of fine sediment and on assessing the relative contributions of these three sources to the sediment yield from the catchments. The fallout radionuclides (FRNs) 137Cs and excess lead-210, the environmental radionuclides 226Ra and 40K and soil organic matter (SOM) were tested as possible fingerprints for discriminating between potential sediment sources. The Kruskal–Wallis test and discriminant function analysis were used to guide the selection of the optimum fingerprint set for each catchment and observation period. Either one or both of the FRNs were selected for inclusion in the optimum fingerprint for all datasets. The relative contribution of each sediment source to the target sediment load was estimated using the selected fingerprint properties, and a mixing model coupled with a Monte Carlo simulation technique that takes account of uncertainty in characterizing sediment source properties. The goodness of fit of the mixing model was tested by comparing the measured and simulated fingerprint properties for the target sediment samples. In the Nacimiento treatment catchment
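    The mixing-model step can be sketched as follows (our illustration, not the study's code; all tracer values are hypothetical). With three sources and two tracers, adding the sum-to-one constraint gives a square linear system for the source proportions, and the Monte Carlo loop propagates the uncertainty in the source fingerprints:

        # Sketch: un-mix a sediment sample into three source proportions.
        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical fingerprint means/sds (rows: two tracers, e.g. 137Cs
        # and 210Pbex; columns: slopes, roads, channel banks).
        means = np.array([[12.0,  1.5, 0.2],
                          [55.0, 10.0, 3.0]])
        sds   = np.array([[ 2.0,  0.4, 0.1],
                          [ 8.0,  2.5, 1.0]])
        target = np.array([4.26, 21.4])  # tracer values in the sediment

        props = []
        for _ in range(5000):
            a = rng.normal(means, sds)                    # sample sources
            A = np.vstack([a, np.ones(3)])                # sum-to-one row
            p = np.linalg.solve(A, np.append(target, 1.0))
            if (p >= 0).all():                            # keep feasible draws
                props.append(p)

        print(np.mean(props, axis=0))  # mean contribution of each source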

  7. Documentation associated with the WESF preparation for receiving 25 cesium capsules from the Applied Radiant Energy Corporation (ARECO)

    Energy Technology Data Exchange (ETDEWEB)

    Pawlak, M.W.

    1996-10-21

    The purpose of this report is to compile all documentation associated with facility preparation of WESF to receive 25 cesium capsules from ARECO. WESF validated its preparedness by completing a facility preparedness review using a performance indicator checklist.

  8. Experimental determination of chosen document elements parameters from raster graphics sources

    Directory of Open Access Journals (Sweden)

    Jiří Rybička

    2010-01-01

    Full Text Available The visual appearance of documents and their formal quality is considered to be as important as the content quality. The formal and typographical quality of documents can be evaluated by an automated system that processes raster images of documents. A document is described by a formal model that treats a page as an object and also as a set of elements, where page elements include text and graphic objects. All elements are described by parameters that depend on the element type. For subsequent evaluation, text objects are the most important. This paper describes the experimental determination of chosen document element parameters from raster images. Image-processing techniques are used, in which an image is represented as a matrix of dots and parameter values are extracted. Algorithms for parameter extraction from raster images were designed, aimed mainly at typographical parameters such as indentation, alignment, font size or spacing. The algorithms were tested on a set of 100 images of paragraphs or pages and provide very good results. The extracted parameters can be used directly for typographical quality evaluation.
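    A simplified version of such an extraction (our sketch, not the article's algorithms) can be written with numpy and Pillow: a horizontal ink projection locates the text lines, and the first inked column of each line gives its indentation relative to the page margin:

        # Sketch: estimate per-line left indentation from a binarized page
        # image; assumes the page starts and ends with blank rows.
        import numpy as np
        from PIL import Image

        page = np.array(Image.open("page.png").convert("L")) < 128  # ink mask

        ink_rows = page.any(axis=1)               # rows containing any ink
        starts = np.flatnonzero(ink_rows[1:] & ~ink_rows[:-1]) + 1
        ends   = np.flatnonzero(ink_rows[:-1] & ~ink_rows[1:]) + 1

        margins = []
        for top, bottom in zip(starts, ends):
            cols = np.flatnonzero(page[top:bottom].any(axis=0))
            if cols.size:
                margins.append(int(cols[0]))      # left edge of this line

        base = min(margins)
        print([m - base for m in margins])        # indentation in pixels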

  9. Solidify, An LLVM pass to compile LLVM IR into Solidity

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-12

    The software currently compiles LLVM IR into Solidity (Ethereum's dominant programming language) using LLVM's pass library. Specifically, this compiler allows us to convert an arbitrary DSL into Solidity. We focus specifically on converting domain-specific languages into Solidity due to their ease of use and provable properties. By creating a toolchain to compile lightweight domain-specific languages into Ethereum's dominant language, Solidity, we allow non-specialists to effectively develop safe and useful smart contracts. For example, lawyers at a firm can have a proprietary DSL that codifies basic legal rules safely converted to Solidity to be securely executed on the blockchain. In another example, a simple provenance-tracking language can be compiled and securely executed on the blockchain.
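    The general shape of such a translator can be sketched with the llvmlite bindings (our illustration only; the Solidify pass itself is written against LLVM's C++ pass library, and the Solidity emission below is a placeholder):

        # Sketch: parse LLVM IR and walk it, the traversal a translator
        # would drive to emit Solidity source for each instruction.
        import llvmlite.binding as llvm

        llvm.initialize()
        llvm.initialize_native_target()
        llvm.initialize_native_asmprinter()

        ir = r"""
        define i32 @add(i32 %a, i32 %b) {
        entry:
          %s = add i32 %a, %b
          ret i32 %s
        }
        """
        mod = llvm.parse_assembly(ir)
        for fn in mod.functions:
            print(f"function {fn.name}: candidate Solidity function")
            for block in fn.blocks:
                for instr in block.instructions:
                    # A real translator would map each opcode ("add",
                    # "ret", ...) to Solidity statements here.
                    print("  opcode:", instr.opcode)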

  10. Compilation of current high-energy-physics experiments

    International Nuclear Information System (INIS)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1980-04-01

    This is the third edition of a compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and ten participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about January 1980, and (2) had not completed taking of data by 1 January 1976

  11. The FORTRAN NALAP code adapted to a microcomputer compiler

    International Nuclear Information System (INIS)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso

    2010-01-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA project (TEcnologia de Reatores Rapidos Avancados), Technology for Advanced Fast Reactors, aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to a microcomputer compiler called Compaq Visual Fortran (Version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly to remove unnecessary routines, to eliminate old statements, to introduce new ones and to include extended precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a loss-of-flow transient; and protected overpower transients. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 80's. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the use of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  12. The FORTRAN NALAP code adapted to a microcomputer compiler

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso, E-mail: plobo.a@uol.com.b, E-mail: eduardo@ieav.cta.b, E-mail: fbraz@ieav.cta.b, E-mail: guimarae@ieav.cta.b [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)

    2010-07-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA project (TEcnologia de Reatores Rapidos Avancados), Technology for Advanced Fast Reactors, aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to a microcomputer compiler called Compaq Visual Fortran (Version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly to remove unnecessary routines, to eliminate old statements, to introduce new ones and to include extended precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a loss-of-flow transient; and protected overpower transients. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 80's. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the use of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  13. Compilation and synthesis for embedded reconfigurable systems an aspect-oriented approach

    CERN Document Server

    Diniz, Pedro; Coutinho, José; Petrov, Zlatko

    2013-01-01

    This book provides techniques to tackle the design challenges raised by the increasing diversity and complexity of emerging, heterogeneous architectures for embedded systems. It describes an approach based on techniques from software engineering called aspect-oriented programming, which allows designers to control today’s sophisticated design tool chains while maintaining a single application source code.  Readers are introduced to the basic concepts of an aspect-oriented, domain-specific language that enables control of a wide range of compilation and synthesis tools in the partitioning and mapping of an application to a heterogeneous (and possibly multi-core) target architecture.  Several examples are presented that illustrate the benefits of the approach developed for applications from avionics and digital signal processing. Using the aspect-oriented programming techniques presented in this book, developers can reuse extensive sections of their designs, while preserving the original application source-...

  14. The national assessment of shoreline change: a GIS compilation of vector cliff edges and associated cliff erosion data for the California coast

    Science.gov (United States)

    Hapke, Cheryl; Reid, David; Borrelli, Mark

    2007-01-01

    The U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector cliff edges and associated rates of cliff retreat along the open-ocean California coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Cliff erosion is a chronic problem along many coastlines of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information, including rates and trends of coastal cliff retreat. There is also a critical need for these data to be consistent from one region to another. One objective of this work is to develop a standard, repeatable methodology for mapping and analyzing cliff edge retreat so that periodic, systematic, and internally consistent updates of cliff edge position and associated rates of erosion can be made at a national scale. This data compilation of open-ocean cliff edges for the California coast is a separate yet related study to that of Hapke and others (2006), which documents shoreline change along sandy shorelines of the California coast and is itself one in a series that includes the Gulf of Mexico and the Southeast Atlantic coast (Morton and others, 2004; Morton and Miller, 2005). Future reports and data compilations will include coverage of the Northeast U.S., the Great Lakes, Hawaii and Alaska. Cliff edge change is determined by comparing the positions of one historical cliff edge digitized from maps with a modern cliff edge derived from topographic LIDAR (light detection and ranging) surveys. Historical cliff edges for the California coast represent the 1920s-1930s time period; the most recent cliff edge was delineated using data collected between 1998 and 2002. End-point rate calculations were used to evaluate rates of erosion between the two cliff edges. Please refer to our full report on cliff edge erosion along the California

  15. SVM Support in the Vienna Fortran Compilation System

    OpenAIRE

    Brezany, Peter; Gerndt, Michael; Sipkova, Viera

    1994-01-01

    Vienna Fortran, a machine-independent language extension to Fortran which allows the user to write programs for distributed-memory systems using global addresses, provides the forall-loop construct for specifying irregular computations that do not cause inter-iteration dependences. Compilers for distributed-memory systems generate code that is based on runtime analysis techniques and is only efficient if, in addition, aggressive compile-time optimizations are applied. Since these optimization...

  16. Market Analysis and Consumer Impacts Source Document. Part II. Review of Motor Vehicle Market and Consumer Expenditures on Motor Vehicle Transportation

    Science.gov (United States)

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part II consists of studies and reviews on: motor vehicle sales trends; motor vehicle fleet life and fleet composition; car buying patterns of the busi...

  17. Compilation of results 1987

    International Nuclear Information System (INIS)

    1987-01-01

    A compilation is carried out which presents, in concentrated form, reports on research and development within the nuclear energy field covering a two-and-a-half-year period. The preceding report was issued in December 1984. The projects are presented with title, project number, responsible unit, contact person and short result reports. The result reports consist of short summaries of each project. (L.F.)

  18. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1991 annual

    International Nuclear Information System (INIS)

    1992-04-01

    This compilation contains 41 Advisory Committee on Reactor Safeguards (ACRS) reports submitted to the Commission, Executive Director for Operations, or to the Office of Nuclear Regulatory Research, during calendar year 1991. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subjects. Part 1 contains ACRS reports alphabetized by project name and by chronological order within project name. Part 2 categorizes the reports by the most appropriate generic subject area and by chronological order within subject area

  19. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 5

    International Nuclear Information System (INIS)

    2003-05-01

    The document consists of two parts: Overview and Country Waste Profile Reports for Reporting Year 2000. The first section contains overview reports that provide assessments of the achievements and shortcomings of the Net Enabled Waste Management Database (NEWMDB) during the first two data collection cycles (July 2001 to March 2002 and July 2002 to February 2003). The second part of the report includes a summary and compilation of waste management data submitted by Agency Member States in both the first and second data collection cycles

  20. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    Science.gov (United States)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique for rapid recovery from transient processor failures and has been implemented in hardware by researchers and also in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data-flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy, while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.
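    The flavor of the compiler-driven hazard removal can be conveyed by a toy sketch (ours; the thesis targets real instruction sets and full data-flow analysis). Within a rollback window of N instructions, overwriting a register that was recently read destroys a value needed for re-execution; renaming the later definition removes the hazard:

        # Sketch: rename destinations that overwrite a register still
        # needed if the last N instructions are rolled back and replayed.
        N = 2  # assumed rollback distance

        prog = [("add", "r1", ("r2", "r3")),   # r1 = r2 + r3
                ("mul", "r2", ("r1", "r1"))]   # r2 = r1 * r1, overwrites r2

        fresh = (f"t{i}" for i in range(100))  # fresh register names
        out, renamed = [], {}
        for i, (op, dst, srcs) in enumerate(prog):
            srcs = tuple(renamed.get(s, s) for s in srcs)
            reads = {s for _, _, ss in prog[max(0, i - N):i] for s in ss}
            if dst in reads:                   # rollback would need old dst
                renamed[dst] = next(fresh)
                dst = renamed[dst]
            out.append((op, dst, srcs))

        print(out)  # [('add','r1',('r2','r3')), ('mul','t0',('r1','r1'))]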

  1. SRS ecology: Environmental information document

    Energy Technology Data Exchange (ETDEWEB)

    Wike, L.D.; Shipley, R.W.; Bowers, J.A. [and others]

    1993-09-01

    The purpose of this document is to provide a source of ecological information based on the existing knowledge gained from research conducted at the Savannah River Site. This document provides a summary and synthesis of ecological research in the three main ecosystem types found at SRS and information on the threatened and endangered species residing there.

  2. SRS ecology: Environmental information document

    International Nuclear Information System (INIS)

    Wike, L.D.; Shipley, R.W.; Bowers, J.A.

    1993-09-01

    The purpose of this document is to provide a source of ecological information based on the existing knowledge gained from research conducted at the Savannah River Site. This document provides a summary and synthesis of ecological research in the three main ecosystem types found at SRS and information on the threatened and endangered species residing there

  3. Thoughts and Views on the Compilation of Monolingual Dictionaries in South Africa

    Directory of Open Access Journals (Sweden)

    N.C.P Golele

    2011-10-01

    Full Text Available Abstract: Developing and documenting the eleven official languages of South Africa on all levels of communication in order to fulfil all the roles and uses characteristic of truly official languages is a great challenge. To meet this need various bodies such as the National Lexicography Units have been established by the Pan South African Language Board (PanSALB). As far as dictionary compilation is concerned, acquaintance with the state-of-the-art developments in the theory and practice of lexicography is necessary. The focus for the African languages should be directed to the compilation of monolingual dictionaries. It is important that these monolingual dictionaries should be usable right from the start on a continuous basis. Continued attention should be given to enlarging the corpora and to actual consultation of these corpora on the macro- and microstructural levels. The end-products should be of a high lexicographic standard, well-balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries can be easily and naturally accomplished. Advanced and continued training in the compilation of monolingual dictionaries should be provided. Keywords: MONOLINGUAL DICTIONARIES, OFFICIAL LANGUAGES, DICTIONARY COMPILATION, CORPORA, NATIONAL LEXICOGRAPHY UNITS, TARGET USERS, DICTIONARY USE, DICTIONARY CULTURE, CORE TERMS Summary: Thoughts and views on the compilation of monolingual dictionaries in South Africa. The development and documentation of the eleven official languages of South Africa at all levels of communication to fulfil all the roles and uses of truly official languages is a great challenge. To meet this need, bodies such as the National Lexicography Units have been established by the Pan South African Language Board (PanSALB) …

  4. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    The material contained in this compilation is sorted according to eight subject categories: 1. General Compilations; 2. Basic Isotopic Properties; 3. Nuclear Structure Properties; 4. Nuclear Decay Processes: Half-lives, Energies and Spectra; 5. Nuclear Decay Processes: Gamma-rays; 6. Nuclear Decay Processes: Fission Products; 7. Nuclear Decay Processes: (Others); 8. Atomic Processes

  5. Automating Visualization Service Generation with the WATT Compiler

    Science.gov (United States)

    Bollig, E. F.; Lyness, M. D.; Erlebacher, G.; Yuen, D. A.

    2007-12-01

    As tasks and workflows become increasingly complex, software developers are devoting increasing attention to automation tools. Among many examples, the Automator tool from Apple collects components of a workflow into a single script, with very little effort on the part of the user. Tasks are most often described as a series of instructions. The granularity of the tasks dictates the tools to use. Compilers translate fine-grained instructions to assembler code, while scripting languages (ruby, perl) are used to describe a series of tasks at a higher level. Compilers can also be viewed as transformational tools: a cross-compiler can translate executable code written on one computer to assembler code understood on another, while transformational tools can translate from one high-level language to another. We are interested in creating visualization web services automatically, starting from stand-alone VTK (Visualization Toolkit) code written in Tcl. To this end, using the OCaml programming language, we have developed a compiler that translates Tcl into C++, including all the stubs, classes and methods to interface with gSOAP, a C++ implementation of the Soap 1.1/1.2 protocols. This compiler, referred to as the Web Automation and Translation Toolkit (WATT), is the first step towards automated creation of specialized visualization web services without input from the user. The WATT compiler seeks to automate all aspects of web service generation, including the transport layer, the division of labor and the details related to interface generation. The WATT compiler is part of ongoing efforts within the NSF funded VLab consortium [1] to facilitate and automate time-consuming tasks for the science related to understanding planetary materials. Through examples of services produced by WATT for the VLab portal, we will illustrate features, limitations and the improvements necessary to achieve the ultimate goal of complete and transparent automation in the generation of web

  6. Compilation of current high energy physics experiments - Sept. 1978

    Energy Technology Data Exchange (ETDEWEB)

    Addis, L.; Odian, A.; Row, G. M.; Ward, C. E. W.; Wanderer, P.; Armenteros, R.; Joos, P.; Groves, T. H.; Oyanagi, Y.; Arnison, G. T. J.; Antipov, Yu; Barinov, N.

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche. (RWR)

  7. DrawCompileEvolve: Sparking interactive evolutionary art with human creations

    DEFF Research Database (Denmark)

    Zhang, Jinhong; Taarnby, Rasmus; Liapis, Antonios

    2015-01-01

    This paper presents DrawCompileEvolve, a web-based drawing tool which allows users to draw simple primitive shapes, group them together or define patterns in their groupings (e.g. symmetry, repetition). The user’s vector drawing is then compiled into an indirectly encoded genetic representation, which can be evolved interactively, allowing the user to change the image’s colors, patterns and ultimately transform it. The human artist has direct control while drawing the initial seed of an evolutionary run and indirect control while interactively evolving it, thus making DrawCompileEvolve a mixed...

  8. PLOTLIB: a computerized nuclear waste source-term library storage and retrieval system

    International Nuclear Information System (INIS)

    Marshall, J.R.; Nowicki, J.A.

    1978-01-01

    The PLOTLIB code was written to provide computer access to the Nuclear Waste Source-Term Library for those users with little previous computer programming experience. The principles of user orientation, quick accessibility, and versatility were extensively employed in the development of the PLOTLIB code to accomplish this goal. The Nuclear Waste Source-Term Library consists of 16 ORIGEN computer runs incorporating a wide variety of differing light water reactor (LWR) fuel cycles and waste streams. The typical isotopic source-term data consist of information on watts, curies, grams, etc., all of which are compiled as a function of time after reactor discharge and unitized on a per metric ton heavy metal basis. The information retrieval code, PLOTLIB, is used to process source-term information requests into computer plots and/or user-specified output tables. This report will serve both as documentation of the current data library and as an operations manual for the PLOTLIB computer code. The accompanying input description, program listing, and sample problems make this code package an easily understood tool for the various nuclear waste studies under way at the Office of Waste Isolation

  9. Digitally Available Interval-Specific Rock-Sample Data Compiled from Historical Records, Nevada Test Site and Vicinity, Nye County, Nevada.

    Energy Technology Data Exchange (ETDEWEB)

    David B. Wood

    2007-10-24

    Between 1951 and 1992, 828 underground tests were conducted on the Nevada Test Site, Nye County, Nevada. Prior to and following these nuclear tests, holes were drilled and mined to collect rock samples. These samples are organized and stored by depth of borehole or drift at the U.S. Geological Survey Core Library and Data Center at Mercury, Nevada, on the Nevada Test Site. From these rock samples, rock properties were analyzed and interpreted and compiled into project files and in published reports that are maintained at the Core Library and at the U.S. Geological Survey office in Henderson, Nevada. These rock-sample data include lithologic descriptions, physical and mechanical properties, and fracture characteristics. Hydraulic properties also were compiled from holes completed in the water table. Rock samples are irreplaceable because pre-test, in-place conditions cannot be recreated and samples cannot be recollected from the many holes destroyed by testing. Documenting these data in a published report will ensure availability for future investigators.

  10. Compilation of benchmark results for fusion related Nuclear Data

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Wada, Masayuki; Oyama, Yukio; Ichihara, Chihiro; Makita, Yo; Takahashi, Akito

    1998-11-01

    This report compiles results of benchmark tests for validation of evaluated nuclear data to be used in nuclear designs of fusion reactors. Parts of the results were obtained under activities of the Fusion Neutronics Integral Test Working Group organized by members of both the Japan Nuclear Data Committee and the Reactor Physics Committee. The following three benchmark experiments were used for the tests: (i) the leakage neutron spectrum measurement experiments from slab assemblies at the D-T neutron source at FNS/JAERI, (ii) the in-situ neutron and gamma-ray measurement experiments (so-called clean benchmark experiments), also at FNS, and (iii) the pulsed sphere experiments for leakage neutron and gamma-ray spectra at the D-T neutron source facility of Osaka University, OKTAVIAN. The evaluated nuclear data tested were JENDL-3.2, JENDL Fusion File, FENDL/E-1.0 and newly selected data for FENDL/E-2.0. Comparisons of benchmark calculations with the experiments for twenty-one elements, i.e., Li, Be, C, N, O, F, Al, Si, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zr, Nb, Mo, W and Pb, are summarized. (author). 65 refs

  11. Multimodal document management in radiotherapy

    International Nuclear Information System (INIS)

    Fahrner, H.; Kirrmann, S.; Roehner, F.; Schmucker, M.; Hall, M.; Heinemann, F.

    2013-01-01

    Background and purpose: After incorporating treatment planning and the organisational model of treatment planning into the operating schedule system (BAS, 'Betriebsablaufsystem'), all document types were embedded in the digital environment. The aim of this project was to integrate all documents, independent of their source (paper-bound or digital), and to make content from the BAS available in a structured manner. As many workflow steps as possible were to be automated, e.g. assigning a document to a patient in the BAS. Additionally, it had to be guaranteed that it could be traced at all times who imported documents into the departmental system, and when, how and from which source. Furthermore, work procedures were to be changed so that documentation conducted either directly in the departmental system or in external systems could be incorporated digitally and paper documents completely avoided (e.g. treatment certificates, treatment plans or other documentation). A further aim was, where possible, to automate the removal of paper documents from the departmental workflow, or even to make such paper documents superfluous. In this way, patient letters for follow-up appointments would be generated automatically from the BAS. Similarly, patient record extracts in the form of PDF files would be enabled, e.g. for controlling purposes. Method: The available document types were analysed in detail by a multidisciplinary working group (BAS-AG) and, after examination and assessment of the possibility of modelling them in our departmental workflow (BAS), were transcribed into a flow diagram. The gathered specifications were implemented in a test environment by the clinical and administrative IT group of the department of radiation oncology and, following a detailed analysis, introduced into clinical routine. Results: Under the aforementioned criteria, the department has succeeded in embedding all relevant documents in the departmental system.

  12. International survey of environmental programmes - a compilation of information from twelve countries received in response to a questionnaire distributed in 1992

    International Nuclear Information System (INIS)

    Gyllander, C.; Karlberg, O.; Luening, M.; Larsson, C.M.; Johansson, G.

    1995-11-01

    The report compiles information from Cuba, Finland, Germany, Japan, South Korea, Lithuania, Luxembourg, Malaysia, Romania, Sweden, Switzerland and United Kingdom, relevant to the organisation and execution of programmes for environmental surveillance of nuclear facilities (source and environmental monitoring). 28 refs, 19 tabs

  13. International survey of environmental programmes - a compilation of information from twelve countries received in response to a questionnaire distributed in 1992

    Energy Technology Data Exchange (ETDEWEB)

    Gyllander, C; Karlberg, O; Luening, M; Larsson, C M; Johansson, G

    1995-11-01

    The report compiles information from Cuba, Finland, Germany, Japan, South Korea, Lithuania, Luxembourg, Malaysia, Romania, Sweden, Switzerland and United Kingdom, relevant to the organisation and execution of programmes for environmental surveillance of nuclear facilities (source and environmental monitoring). 28 refs, 19 tabs.

  14. High level waste storage tanks 242-A evaporator standards/requirement identification document

    International Nuclear Information System (INIS)

    Biebesheimer, E.

    1996-01-01

    This document, the Standards/Requirements Identification Document (S/RIDS) for the subject facility, represents the necessary and sufficient requirements to provide an adequate level of protection of the worker, public health and safety, and the environment. It lists those source documents from which requirements were extracted, and those requirements documents that were considered but from which no requirements were taken. Documents considered as source documents included State and Federal Regulations, DOE Orders, and DOE Standards

  15. Vectorization vs. compilation in query execution

    NARCIS (Netherlands)

    J. Sompolski (Juliusz); M. Zukowski (Marcin); P.A. Boncz (Peter)

    2011-01-01

    Compiling database queries into executable (sub-)programs provides substantial benefits compared with traditional interpreted execution. Many of these benefits, such as reduced interpretation overhead, better instruction code locality, and opportunities to use SIMD instructions...
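
    The trade-off named here is easy to make concrete: a tuple-at-a-time interpreter pays dispatch overhead on every row, while a vectorized executor amortizes it over blocks of values. The NumPy sketch below illustrates the contrast only; it is not the authors' system, which compiles and vectorizes query fragments inside a DBMS.

        import numpy as np

        prices = np.random.rand(1_000_000)
        qty = np.random.randint(1, 10, size=prices.size)

        def interpreted(prices, qty):
            # Tuple-at-a-time: one dispatch per row, as in a classic iterator model.
            total = 0.0
            for p, q in zip(prices, qty):
                total += p * q
            return total

        def vectorized(prices, qty):
            # Block-at-a-time: multiply and sum run over whole arrays, amortizing
            # interpretation overhead and enabling SIMD in the underlying kernels.
            return float(np.dot(prices, qty))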

  16. FreeSASA: An open source C library for solvent accessible surface area calculations [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Simon Mitternacht

    2016-02-01

    Calculating solvent accessible surface areas (SASA) is a run-of-the-mill calculation in structural biology. Although there are many programs available for this calculation, there are no free-standing, open-source tools designed for easy tool-chain integration. FreeSASA is an open source C library for SASA calculations that provides both command-line and Python interfaces in addition to its C API. The library implements both Lee and Richards’ and Shrake and Rupley’s approximations, and is highly configurable to allow the user to control molecular parameters, accuracy and output granularity. It only depends on standard C libraries and should therefore be easy to compile and install on any platform. The library is well-documented, stable and efficient. The command-line interface can easily replace closed source legacy programs, with comparable or better accuracy and speed, and with some added functionality.
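
    For readers who want to try the library, the Python interface mentioned above can be exercised in a few lines. The calls below (freesasa.Structure, freesasa.calc, classifyResults) follow the library's published Python API at the time of writing; treat them as indicative and consult the FreeSASA documentation for current signatures. The PDB filename is a placeholder.

        import freesasa  # pip install freesasa

        # Build a structure from a PDB file and run the default (Lee & Richards)
        # SASA calculation; "protein.pdb" is a placeholder path.
        structure = freesasa.Structure("protein.pdb")
        result = freesasa.calc(structure)

        print("Total SASA: %.2f A^2" % result.totalArea())

        # Per-class breakdown (polar/apolar) as reported by the library.
        for name, area in freesasa.classifyResults(result, structure).items():
            print("%s: %.2f A^2" % (name, area))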

  17. Production compilation : A simple mechanism to model complex skill acquisition

    NARCIS (Netherlands)

    Taatgen, N.A.; Lee, F.J.

    2003-01-01

    In this article we describe production compilation, a mechanism for modeling skill acquisition. Production compilation has been developed within the ACT-Rational (ACT-R; J. R. Anderson, D. Bothell, M. D. Byrne, & C. Lebiere, 2002) cognitive architecture and consists of combining and specializing production rules...
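
    The core move of production compilation can be shown schematically: two rules that fire in sequence are collapsed into one, with the intermediate memory retrieval specialized away. The dictionaries below are toy stand-ins, not ACT-R's actual production syntax.

        # Toy rules: "?x" marks a variable bound via a (simulated) declarative-memory
        # retrieval that links the two firings.
        rule1 = {"if": {"goal": "add ?x"}, "then": {"retrieve": "sum-of ?x"}}
        rule2 = {"if": {"retrieved": "sum-of ?x"}, "then": {"answer": "sum-of ?x"}}

        def compile_rules(r1, r2, binding):
            """Combine r1 and r2, substituting the retrieved value so that the
            intermediate retrieval disappears (the essence of the mechanism)."""
            subst = lambda d: {k: v.replace("?x", binding) for k, v in d.items()}
            return {"if": subst(r1["if"]), "then": subst(r2["then"])}

        # After practice on the binding "3+4", the specialized rule answers directly.
        print(compile_rules(rule1, rule2, "3+4"))
        # {'if': {'goal': 'add 3+4'}, 'then': {'answer': 'sum-of 3+4'}}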

  18. 32 CFR 806b.19 - Information compiled in anticipation of civil action.

    Science.gov (United States)

    2010-07-01

    32 CFR 806b.19 (Title 32, National Defense; Department of Defense, Department of the Air Force): Information compiled in anticipation of civil action. Withhold records compiled in connection with a civil action or...

  19. Documenting open source migration processes for re-use

    CSIR Research Space (South Africa)

    Gerber, A

    2010-10-01

    There are several sources that indicate a remarkable increase in the adoption of open source software (OSS) into the technology infrastructure of organizations. In fact, the number of medium to large organizations without some OSS installations...

  20. Documents on Disarmament.

    Science.gov (United States)

    Arms Control and Disarmament Agency, Washington, DC.

    This publication, latest in a series of volumes issued annually since 1960, contains primary source documents on arms control and disarmament developments during 1969. The main chronological arrangement is supplemented by both chronological and topical lists of contents. Other reference aids include a subject/author index, and lists of…

  1. Student-Centered Pedagogy and Real-World Research: Using Documents as Sources of Data in Teaching Social Science Skills and Methods

    Science.gov (United States)

    Peyrefitte, Magali; Lazar, Gillian

    2018-01-01

    This teaching note describes the design and implementation of an activity in a 90-minute teaching session that was developed to introduce a diverse cohort of first-year criminology and sociology students to the use of documents as sources of data. This approach was contextualized in real-world research through scaffolded, student-centered tasks…

  2. A methodology to compile food metrics related to diet sustainability into a single food database: Application to the French case.

    Science.gov (United States)

    Gazan, Rozenn; Barré, Tangui; Perignon, Marlène; Maillot, Matthieu; Darmon, Nicole; Vieux, Florent

    2018-01-01

    The holistic approach required to assess diet sustainability is hindered by the lack of comprehensive databases compiling relevant food metrics. Those metrics are generally scattered across different data sources with various levels of aggregation, hampering their matching. The objective was to develop a general methodology to compile food metrics describing diet sustainability dimensions into a single database, and to apply it to the French context. Each step of the methodology is detailed: identification and selection of indicators and food metrics, food list definition, food matching and value assignment. For the French case, nutrient and contaminant content, bioavailability factors, distribution of dietary intakes, portion sizes, food prices, and greenhouse gas emission, acidification and marine eutrophication estimates were allocated to 212 commonly consumed generic foods. This generic database compiling 279 metrics will allow the simultaneous evaluation of the four dimensions of diet sustainability, namely the health, economic, social and environmental dimensions. Copyright © 2016 Elsevier Ltd. All rights reserved.
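
    Stripped to its skeleton, the matching step is a sequence of joins of heterogeneous metric tables onto a single list of generic foods. The pandas sketch below uses invented column names and values purely to show the shape of the operation.

        import pandas as pd

        # Invented mini-tables standing in for the scattered source data.
        foods = pd.DataFrame({"food_id": [1, 2], "name": ["apple", "bread"]})
        nutrients = pd.DataFrame({"food_id": [1, 2], "kcal_100g": [52, 265]})
        environment = pd.DataFrame({"food_id": [1, 2], "ghge_kg_co2eq": [0.4, 1.3]})
        prices = pd.DataFrame({"food_id": [1, 2], "eur_per_kg": [2.5, 1.8]})

        # Compile all metrics into one database keyed on the generic food list.
        compiled = foods
        for table in (nutrients, environment, prices):
            compiled = compiled.merge(table, on="food_id", how="left")

        print(compiled)  # one row per generic food, one column per metric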

  3. Asian collaboration on nuclear reaction data compilation

    International Nuclear Information System (INIS)

    Aikawa, Masayuki; Furutachi, Naoya; Kato, Kiyoshi; Makinaga, Ayano; Devi, Vidya; Ichinkhorloo, Dagvadorj; Odsuren, Myagmarjav; Tsubakihara, Kohsuke; Katayama, Toshiyuki; Otuka, Naohiko

    2013-01-01

    Nuclear reaction data are essential for research and development in nuclear engineering, radiation therapy, nuclear physics and astrophysics. Experimental data must be compiled in a database and be accessible to nuclear data users. One of the nuclear reaction databases is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC) under the auspices of the International Atomic Energy Agency. Recently, collaboration among the Asian NRDC members is being further developed under the support of the Asia-Africa Science Platform Program of the Japan Society for the Promotion of Science. We report the activity for three years to develop the Asian collaboration on nuclear reaction data compilation. (author)

  4. PGHPF – An Optimizing High Performance Fortran Compiler for Distributed Memory Machines

    Directory of Open Access Journals (Sweden)

    Zeki Bozkus

    1997-01-01

    High Performance Fortran (HPF) is the first widely supported, efficient, and portable parallel programming language for shared and distributed memory systems. HPF is realized through a set of directive-based extensions to Fortran 90. It enables application developers and Fortran end-users to write compact, portable, and efficient software that will compile and execute on workstations, shared memory servers, clusters, traditional supercomputers, or massively parallel processors. This article describes a production-quality HPF compiler for a set of parallel machines. Compilation techniques such as data and computation distribution, communication generation, run-time support, and optimization issues are elaborated as the basis for an HPF compiler implementation on distributed memory machines. The performance of this compiler on benchmark programs demonstrates that high efficiency can be achieved executing HPF code on parallel architectures.
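
    The heart of the data-distribution phase is index arithmetic: for a directive such as !HPF$ DISTRIBUTE A(BLOCK), the compiler must decide which processor owns each element and what each processor's local bounds are. The sketch below reproduces that arithmetic in Python for illustration; a real HPF compiler does this symbolically and also generates the communication.

        import math

        def block_owner(i, n, p):
            """Owner of global index i (0-based) for an n-element array
            distributed BLOCK over p processors."""
            return i // math.ceil(n / p)

        def local_bounds(rank, n, p):
            """Global [lo, hi) index range owned by processor `rank`."""
            block = math.ceil(n / p)
            lo = rank * block
            return lo, min(lo + block, n)

        n, p = 10, 4  # 10 elements over 4 processors -> blocks of 3, 3, 3, 1
        for r in range(p):
            print(r, local_bounds(r, n, p))
        print(block_owner(7, n, p))  # element 7 lives on processor 2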

  5. A survey of compiler development aids. [concerning lexical, syntax, and semantic analysis

    Science.gov (United States)

    Buckles, B. P.; Hodges, B. C.; Hsia, P.

    1977-01-01

    A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five selected phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph-theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools that were developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention - SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed, some of their shortcomings were identified, and some areas of research currently in progress were examined.
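
    Of the five phases, lexical analysis is the easiest to show in miniature. The toy scanner below tokenizes arithmetic expressions; it is illustrative only, and far simpler than what the surveyed generators (e.g. XPL or SIMCMP/STAGE2) produce.

        import re

        # Toy lexical analyzer: the first of the five compilation phases.
        TOKEN_SPEC = [
            ("NUMBER", r"\d+(?:\.\d+)?"),
            ("IDENT",  r"[A-Za-z_]\w*"),
            ("OP",     r"[+\-*/=()]"),
            ("SKIP",   r"\s+"),
        ]
        MASTER = re.compile("|".join(f"(?P<{n}>{p})" for n, p in TOKEN_SPEC))

        def tokenize(src):
            for m in MASTER.finditer(src):
                if m.lastgroup != "SKIP":
                    yield (m.lastgroup, m.group())

        print(list(tokenize("rate = base + 12.5 * hours")))
        # [('IDENT', 'rate'), ('OP', '='), ('IDENT', 'base'), ('OP', '+'),
        #  ('NUMBER', '12.5'), ('OP', '*'), ('IDENT', 'hours')]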

  6. The MetaLex Document Server : Legal Documents as Versioned Linked Data

    NARCIS (Netherlands)

    Hoekstra, R.; Aroyo, L.; Welty, C.; Alani, H.; Taylor, J.; Bernstein, A.; Kagal, L.; Noy, N.; Blomqvist, E.

    2011-01-01

    This paper introduces the MetaLex Document Server (MDS), an ongoing project to improve access to legal sources (regulations, court rulings) by means of a generic legal XML syntax (CEN MetaLex) and Linked Data. The MDS defines a generic conversion mechanism from legacy legal XML syntaxes to CEN MetaLex.

  7. Global data bases on distribution, characteristics and methane emission of natural wetlands: Documentation of archived data tape

    Science.gov (United States)

    Matthews, Elaine

    1989-01-01

    Global digital data bases on the distribution and environmental characteristics of natural wetlands, compiled by Matthews and Fung (1987), were archived for public use. These data bases were developed to evaluate the role of wetlands in the annual emission of methane from terrestrial sources. Five global 1 deg latitude by 1 deg longitude arrays are included on the archived tape. The arrays are: (1) wetland data source, (2) wetland type, (3) fractional inundation, (4) vegetation type, and (5) soil type. The first three data bases on wetland locations were published by Matthews and Fung (1987). The last two arrays contain ancillary information about these wetland locations: vegetation type is from the data of Matthews (1983) and soil type from the data of Zobler (1986). Users should consult original publications for complete discussion of the data bases. This short paper is designed only to document the tape, and briefly explain the data sets and their initial application to estimating the annual emission of methane from natural wetlands. Included is information about array characteristics such as dimensions, read formats, record lengths, blocksizes and value ranges, and descriptions and translation tables for the individual data bases.
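
    As a reading aid only: each archived array is a 1 deg by 1 deg global grid, i.e. 180 rows by 360 columns. The snippet below shows the shape bookkeeping with synthetic data; the tape's actual read formats, record lengths and blocksizes are those documented in the report and are not reproduced here.

        import numpy as np

        NLAT, NLON = 180, 360  # 1-degree global grid, as in the archived arrays

        # Synthetic stand-in for the fractional-inundation array (values 0..1).
        frac_inundation = np.random.rand(NLAT, NLON)

        # Grid-cell areas shrink as cos(latitude); centers run 89.5N .. 89.5S.
        lat_centers = np.linspace(89.5, -89.5, NLAT)
        weights = np.cos(np.radians(lat_centers))[:, None] * np.ones(NLON)

        mean_fraction = (frac_inundation * weights).sum() / weights.sum()
        print(f"area-weighted inundated fraction: {mean_fraction:.3f}")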

  8. Fifth Baltic Sea pollution load compilation (PLC-5)

    Energy Technology Data Exchange (ETDEWEB)

    Knuuttila, S.; Svendsen, L. M.; Staaf, H.; Kotilainen, P.; Boutrup, S.; Pyhala, M.; Durkin, M.

    2011-07-01

    This report includes the main results from the Fifth Pollution Load Compilation, abbreviated PLC-5. It includes quantified annual waterborne total loads (from rivers, unmonitored and coastal areas, as well as direct point and diffuse sources discharging directly to the Baltic Sea) from 1994 to 2008 to provide a basis for evaluating any decreasing (or increasing) trends in the total waterborne inputs to the Baltic Sea. Chapter 1 contains the objectives of the PLC and the framework for classification of inputs and sources. Chapter 2 includes a short description of the Baltic Sea catchment area, while the methods for quantification and analysis together with quality assurance topics are briefly introduced in Chapter 3. More detailed information on methodologies is presented in the PLC-5 guidelines (HELCOM 2006). Chapter 4 reports the total inputs of nutrients and selected heavy metals to the Baltic Sea. Furthermore, the results of the quantification of discharges and losses of nitrogen and phosphorus from point and diffuse sources into inland surface waters within the Baltic Sea catchment area (source-oriented approach, or gross loads), as well as the total load to the maritime area (load-oriented approach, or net loads), in 2006 are shown. Typically, results are presented by country and by main Baltic Sea sub-region. In Chapter 5, flow normalization is introduced and the results of trend analyses on 1994-2008 time series of total waterborne loads of nitrogen and phosphorus are given, together with a first evaluation of progress towards the provisional reduction targets by country and by main Baltic Sea sub-region. Chapter 6 includes discussion of some of the main conclusions and advice for future PLCs. The annexes contain the flow-normalized annual load data and figures and tables with results from the PLC-5.

  9. abc: The AspectBench Compiler for AspectJ

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    abc is an extensible, optimising compiler for AspectJ. It has been designed as a workbench for experimental research in aspect-oriented programming languages and compilers. We outline a programme of research in these areas, and we review how abc can help in achieving those research goals...

  10. Compilation of Requirements for Safe Handling of Fluorine and Fluorine-Containing Products of Uranium Hexafluoride Conversion

    International Nuclear Information System (INIS)

    Ferrada, J.J.; Hightower, J.R.; Begovich, J.M.

    2000-01-01

    Public Law (PL) 105--204 requires the U.S. Department of Energy to develop a plan for inclusion in the fiscal year 2000 budget for conversion of the Department's stockpile of depleted uranium hexafluoride (DUF6) to a more stable form over an extended period. The conversion process into a more stable form will produce fluorine compounds (e.g., elemental fluorine or hydrofluoric acid) that need to be handled safely. This document compiles the requirements necessary to handle these materials within health and safety standards, which may apply in order to ensure protection of the environment and the safety and health of workers and the public

  11. Guide to NRC reporting and recordkeeping requirements. Compiled from requirements in Title 10 of the U.S. Code of Federal Regulations as codified on December 31, 1993; Revision 1

    International Nuclear Information System (INIS)

    Collins, M.; Shelton, B.

    1994-07-01

    This compilation includes, in the first two sections, the reporting and recordkeeping requirements applicable to US Nuclear Regulatory Commission (NRC) licensees and applicants and to members of the public. It includes those requirements codified in Title 10 of the Code of Federal Regulations, Chapter 1, on December 31, 1993. It also includes, in a separate section, any of those requirements that were superseded or discontinued between January 1992 and December 1993. Finally, the appendix lists mailing and delivery addresses for NRC Headquarters and Regional Offices mentioned in the compilation. The Office of Information Resources Management staff compiled this listing of reporting and recordkeeping requirements, briefly describing each in a single document, primarily to help licensees readily identify the requirements. The compilation is not a substitute for the regulations and is not intended to impose any new requirements or technical positions. It is part of NRC's continuing efforts to comply with the Paperwork Reduction Act of 1980 and the Office of Management and Budget regulations that mandate effective and efficient Federal information resources management programs

  12. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1995 annual. Volume 17

    International Nuclear Information System (INIS)

    1996-04-01

    This compilation contains 44 ACRS reports submitted to the Commission, or to the Executive Director for Operations, during calendar year 1995. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subjects. Part 1 contains ACRS reports by project name and by chronological order within project name. Part 2 categorizes the reports by the most appropriate generic subject area and by chronological order within subject area

  13. Sandia National Laboratories/New Mexico Environmental Information Document - Volume II

    Energy Technology Data Exchange (ETDEWEB)

    GUERRERO, JOSEPH V.; KUZIO, KENNETH A.; JOHNS, WILLIAM H.; BAYLISS, LINDA S.; BAILEY-WHITE, BRENDA E.

    1999-09-01

    This Sandia National Laboratories/New Mexico Environmental Information Document (EID) compiles information on the existing environment, or environmental baseline, for SNL/NM. Much of the information is drawn from existing reports and databases, supplemented by new research and data. The SNL/NM EID, together with the Sandia National Laboratories/New Mexico Facilities and Safety Information Document, provides a basis for assessing the environment, safety, and health aspects of operating selected facilities at SNL/NM. The environmental baseline provides a record of the existing physical, biological, and socioeconomic environment at SNL/NM prior to being altered (beneficially or adversely) by proposed programs or projects. More specifically, the EID provides information on the following topics: Geology; Land Use; Hydrology and Water Resources; Air Quality and Meteorology; Ecology; Noise and Vibration; Cultural Resources; Visual Resources; Socioeconomic and Community Services; Transportation; Material Management; Waste Management; and Regulatory Requirements.

  14. Sandia National Laboratories/New Mexico Environmental Information Document - Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    BAYLISS, LINDA S.; GUERRERO, JOSEPH V.; JOHNS, WILLIAM H.; KUZIO, KENNETH A.; BAILEY-WHITE, BRENDA E.

    1999-09-01

    This Sandia National Laboratories/New Mexico Environmental Information Document (EID) compiles information on the existing environment, or environmental baseline, for SNL/NM. Much of the information is drawn from existing reports and databases, supplemented by new research and data. The SNL/NM EID, together with the Sandia National Laboratories/New Mexico Facilities and Safety Information Document, provides a basis for assessing the environment, safety, and health aspects of operating selected facilities at SNL/NM. The environmental baseline provides a record of the existing physical, biological, and socioeconomic environment at SNL/NM prior to being altered (beneficially or adversely) by proposed programs or projects. More specifically, the EID provides information on the following topics: Geology; Land Use; Hydrology and Water Resources; Air Quality and Meteorology; Ecology; Noise and Vibration; Cultural Resources; Visual Resources; Socioeconomic and Community Services; Transportation; Material Management; Waste Management; and Regulatory Requirements.

  15. The Katydid system for compiling KEE applications to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    Components of a system known as Katydid are developed in an effort to compile knowledge-based systems developed in a multimechanism integrated environment (KEE) to Ada. The Katydid core is an Ada library supporting KEE object functionality, and the other elements include a rule compiler, a LISP-to-Ada translator, and a knowledge-base dumper. Katydid employs translation mechanisms that convert LISP knowledge structures and rules to Ada and utilizes basic prototypes of a run-time KEE object-structure library module for Ada. Preliminary results include the semiautomatic compilation of portions of a simple expert system to run in an Ada environment with the described algorithms. It is suggested that Ada can be employed for AI programming and implementation, and the Katydid system is being developed to include concurrency and synchronization mechanisms.

  16. OpenMP-accelerated SWAT simulation using Intel C and FORTRAN compilers: Development and benchmark

    Science.gov (United States)

    Ki, Seo Jin; Sugimura, Tak; Kim, Albert S.

    2015-02-01

    We developed a practical method to accelerate execution of the Soil and Water Assessment Tool (SWAT) using open (free) computational resources. The SWAT source code (rev 622) was recompiled using a non-commercial Intel FORTRAN compiler on the Ubuntu 12.04 LTS Linux platform, and newly named iOMP-SWAT in this study. The GNU utilities make, gprof, and diff were used to develop the iOMP-SWAT package, profile memory usage, and check the identicalness of parallel and serial simulations. Among 302 SWAT subroutines, the slowest routines were identified using GNU gprof and later modified using the Open Multi-Processing (OpenMP) library on an 8-core shared memory system. In addition, a C wrapping function was used to rapidly set large arrays to zero by cross compiling with the original SWAT FORTRAN package. A universal speedup ratio of 2.3 was achieved using input data sets with a large number of hydrological response units. As we specifically focus on the acceleration of a single SWAT run, the use of iOMP-SWAT for parameter calibrations will significantly improve the performance of SWAT optimization.
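
    The workflow here, profile first and parallelize only the hottest routines, is language-independent even though the study applied it to SWAT's Fortran with gprof and OpenMP. The sketch below renders the same two-step workflow in Python with cProfile and multiprocessing; it is an illustration of the approach, not the iOMP-SWAT code.

        import cProfile, pstats
        from multiprocessing import Pool

        def hot_routine(chunk):
            # Stand-in for one of the slow subroutines a profiler would flag.
            return sum(x * x for x in chunk)

        def serial(data):
            return sum(hot_routine(c) for c in data)

        def parallel(data, workers=8):
            # Step 2: parallelize only the identified hotspot.
            with Pool(workers) as pool:
                return sum(pool.map(hot_routine, data))

        if __name__ == "__main__":
            data = [range(200_000)] * 32
            cProfile.run("serial(data)", "prof.out")  # step 1: find the hotspot
            pstats.Stats("prof.out").sort_stats("cumtime").print_stats(3)
            print(parallel(data))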

  17. Thirty years of progress in harmonizing and compiling food data as a result of the establishment of INFOODS.

    Science.gov (United States)

    Murphy, Suzanne P; Charrondiere, U Ruth; Burlingame, Barbara

    2016-02-15

    The International Network of Food Data Systems (INFOODS) has provided leadership on the development and use of food composition data for over 30 years. The mission of INFOODS is the promotion of international participation, cooperation and harmonization in the generation, compilation and dissemination of adequate and reliable data on the composition of foods, beverages, and their ingredients, in forms appropriate to meet the needs of various users. Achievements include the development of guidelines and standards, increased capacity development in generating and compiling food composition data, a food composition database management system, improvements in laboratory quality assurance, and the development of several food composition databases and tables. Recently, INFOODS has led efforts to define and document food biodiversity. As new foods and food components come into prominence, and as analytical methods evolve, the activities of INFOODS will continue to advance the quality and quantity of food composition data globally. Copyright © 2015 Food and Agriculture Organization of the United Nations. Published by Elsevier Ltd. All rights reserved.

  18. The NASA earth resources spectral information system: A data compilation, second supplement

    Science.gov (United States)

    Vincent, R. K.

    1973-01-01

    The NASA Earth Resources Spectral Information System (ERSIS) and the information contained therein are described. It is intended for use as a second supplement to the NASA Earth Resources Spectral Information System: A Data Compilation, NASA CR-31650-24-T, May 1971. The current supplement includes approximately 100 rock and mineral, and 375 vegetation directional reflectance spectral curves in the optical region from 0.2 to 22.0 microns. The data were categorized by subject and each curve plotted on a single graph. Each graph is fully titled to indicate curve source and indexed by subject to facilitate user retrieval from ERSIS magnetic tape records.

  19. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.; Tick, E. (comp.)

    1991-11-01

    This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31--November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purpose of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working on compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led different communities to emphasize different species of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.

  20. A compilation of reports of The Advisory Committee on Nuclear Waste, July 1988--June 1990

    International Nuclear Information System (INIS)

    1990-08-01

    This compilation contains 37 reports issued by the Advisory Committee on Nuclear Waste (ACNW) during the first two years of its operation. The reports were submitted to the Chairman or to the Executive Director for Operations, US Nuclear Regulatory Commission (NRC). Topics include the NRC analysis of the US Department of Energy Site Characterization Plan for the high-level radioactive waste repository, the standards promulgated by the US Environmental Protection Agency for the disposal of high-level waste, the NRC policy statement on Below Regulatory Concern, technical documents prepared by the NRC Staff relative to the decommissioning of nuclear power plants, the stabilization of uranium mill tailings piles, and environmental monitoring. All reports prepared by the Committee have been made available to the public through the NRC Public Document Room and the US Library of Congress. Included in an Appendix is a listing of references to related reports on nuclear waste matters that were issued by the Advisory Committee on Reactor Safeguards prior to the establishment of the ACNW

  1. DLVM: A modern compiler infrastructure for deep learning systems

    OpenAIRE

    Wei, Richard; Schwartz, Lane; Adve, Vikram

    2017-01-01

    Deep learning software demands reliability and performance. However, many of the existing deep learning frameworks are software libraries that act as an unsafe DSL in Python and a computation graph interpreter. We present DLVM, a design and implementation of a compiler infrastructure with a linear algebra intermediate representation, algorithmic differentiation by adjoint code generation, domain-specific optimizations and a code generator targeting GPU via LLVM. Designed as a modern compiler ...
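
    Algorithmic differentiation by adjoint code generation, one of the transformations listed, can be shown on a toy straight-line program: walk the instructions in reverse and emit an adjoint statement for each. This is a schematic of the technique only, not DLVM's IR or code generator.

        # Toy straight-line program: (result, op, operands) triples.
        program = [
            ("t1", "mul", ("x", "x")),   # t1 = x * x
            ("y",  "add", ("t1", "x")),  # y  = t1 + x
        ]

        def adjoint(program):
            """Emit adjoint assignments in reverse order (seed: d_y = 1)."""
            lines, grads = ["d_y = 1.0"], {"y": "d_y"}
            for res, op, (a, b) in reversed(program):
                g = grads[res]
                if op == "add":
                    rules = [(a, g), (b, g)]
                elif op == "mul":
                    rules = [(a, f"{g} * {b}"), (b, f"{g} * {a}")]
                else:
                    raise ValueError(op)
                for var, expr in rules:
                    d = f"d_{var}"
                    lines.append(f"{d} += {expr}" if var in grads else f"{d} = {expr}")
                    grads.setdefault(var, d)
            return lines

        print("\n".join(adjoint(program)))
        # Accumulates d_x = d_y + 2*x*d_t1, i.e. dy/dx = 2x + 1, as expected.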

  2. Cross-compilation of ATLAS online software to the PowerPC-VxWorks system

    International Nuclear Information System (INIS)

    Tian Yuren; Li Jin; Ren Zhengyu; Zhu Kejun

    2005-01-01

    BES III selected the ATLAS online software as the framework of its run-control system. BES III uses the PowerPC-VxWorks system in its front-end readout system, so it was necessary to cross-compile this software to the PowerPC-VxWorks system. The article covers several aspects of this project, such as the structure and organization of the ATLAS online software, the application of the CMT tool in cross-compiling, the selection and configuration of the cross-compiler, and methods of solving various problems arising from differences between the compilers and operating systems. After cross-compilation, the software runs normally and, together with the software running on the Linux system, makes up a complete run-control system. (authors)

  3. Transportation legislative data base: State radioactive materials transportation statute compilation, 1989--1993

    International Nuclear Information System (INIS)

    1994-04-01

    The Transportation Legislative Data Base (TLDB) is a computer-based information service containing summaries of federal, state and certain local government statutes and regulations relating to the transportation of radioactive materials in the United States. The TLDB has been operated by the National Conference of State Legislatures (NCSL) under cooperative agreement with the US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management since 1992. The data base system serves the legislative and regulatory information needs of federal, state, tribal and local governments, the affected private sector and interested members of the general public. Users must be approved by DOE and NCSL. This report is a state statute compilation that updates the 1989 compilation produced by Battelle Memorial Institute, the previous manager of the data base. This compilation includes statutes not included in the prior compilation, as well as newly enacted laws. Statutes not included in the prior compilation show an enactment date prior to 1989. Statutes that deal with low-level radioactive waste transportation are included in the data base as are statutes from the states of Alaska and Hawaii. Over 155 new entries to the data base are summarized in this compilation

  4. A Language for Specifying Compiler Optimizations for Generic Software

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, Jeremiah J. [Indiana Univ., Bloomington, IN (United States)

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.
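
    A library-level optimization of the kind described is, at bottom, a rewrite rule: a pattern over program constructs plus a condition for when it applies. The Python toy below, which is unrelated to Pavilion's actual syntax, shows the flavor of a user-specified rewrite operating on source text rather than on a compiler IR.

        import re

        # Toy user-specified optimization: c.size() == 0  ->  c.empty().
        # Real systems express such rules over an IR with applicability
        # conditions (e.g. that the receiver models a standard container).
        PATTERN, REPLACEMENT = re.compile(r"(\w+)\.size\(\)\s*==\s*0"), r"\1.empty()"

        def optimize(source):
            return PATTERN.sub(REPLACEMENT, source)

        print(optimize("if (names.size() == 0) { return; }"))
        # if (names.empty()) { return; }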

  5. Sources for charged particles; Les sources de particules chargees

    Energy Technology Data Exchange (ETDEWEB)

    Arianer, J.

    1997-09-01

    This document is a basic course on charged particle sources for post-graduate students and thematic schools on large facilities and accelerator physics. A simple but precise description of the creation and emission of charged particles is presented. The course relies on reference documents that are updated every year. The following topics are covered: electronic emission processes, technological and practical considerations on electron guns, positron sources, production of neutral atoms, ionization, plasma and discharge, different types of positive and negative ion sources, polarized particle sources, materials for the construction of ion sources, and low energy beam production and transport. (N.T.).

  6. INDC list of correspondents for the exchange of nuclear data information and compilation of national nuclear data committees

    International Nuclear Information System (INIS)

    1986-04-01

    This list of INDC Correspondents, including information on currently existing National Nuclear Data Committees and their memberships, is compiled and published at the request of the International Nuclear Data Committee with the objective of promoting interaction and enhancing awareness of nuclear data activities in IAEA Member States. It also serves as a basis for the distribution of documents originated by or for the International Nuclear Data Committee and includes the names of all recipients of INDC documents. The report is presented in five sections. The first section contains a detailed description of the INDC distribution categories, distribution codes and document designator codes. The second section describes the aims, organization and objectives of individual national nuclear data committees. The third section lists names and addresses in alphabetical order within each state or international organization, together with the assigned INDC document distribution code(s); where applicable, committee membership and/or area of specialization are indicated. This is followed by four shorter lists, indicating the names of individuals in each distribution category, sorted by country or international organization, and the total number of individuals in each category. The final section provides the names of nuclear data committee members, also listed by country or international organization

  7. Compilation of solar abundance data

    International Nuclear Information System (INIS)

    Hauge, Oe.; Engvold, O.

    1977-01-01

    Interest in the previous compilations of solar abundance data by the same authors (ITA--31 and ITA--39) has led to this third, revised edition. Solar abundance data of 67 elements are tabulated and in addition upper limits for the abundances of 5 elements are listed. References are made to 167 papers. A recommended abundance value is given for each element. (JIW)

  8. Standard guide for formats for collection and compilation of corrosion data for metals for computerized database input

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1995-01-01

    1.1 This guide covers the data categories and specific data elements (fields) considered necessary to accommodate desired search strategies and reliable data comparisons in computerized corrosion databases. The data entries are designed to accommodate data relative to the basic forms of corrosion and to serve as guides for structuring multiple source database compilations capable of assessing compatibility of metals and alloys for a wide range of environments and exposure conditions.

  9. Compilation of piping benchmark problems - Cooperative international effort

    Energy Technology Data Exchange (ETDEWEB)

    McAfee, W.J. [comp.]

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations.

  10. Compiling knowledge-based systems from KEE to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications - most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBSs developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights into Ada as an artificial intelligence programming language, potential solutions to some of the engineering difficulties encountered in early work, and inspiration for future system development.

  11. Compilation of piping benchmark problems - Cooperative international effort

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations

  12. Standards for Documenting Finite‐Fault Earthquake Rupture Models

    KAUST Repository

    Mai, Paul Martin

    2016-04-06

    In this article, we propose standards for documenting and disseminating finite‐fault earthquake rupture models, and related data and metadata. A comprehensive documentation of the rupture models, a detailed description of the data processing steps, and facilitating the access to the actual data that went into the earthquake source inversion are required to promote follow‐up research and to ensure interoperability, transparency, and reproducibility of the published slip‐inversion solutions. We suggest a formatting scheme that describes the kinematic rupture process in an unambiguous way to support subsequent research. We also provide guidelines on how to document the data, metadata, and data processing. The proposed standards and formats represent a first step to establishing best practices for comprehensively documenting input and output of finite‐fault earthquake source studies.

  13. Standards for Documenting Finite‐Fault Earthquake Rupture Models

    KAUST Repository

    Mai, Paul Martin; Shearer, Peter; Ampuero, Jean‐Paul; Lay, Thorne

    2016-01-01

    In this article, we propose standards for documenting and disseminating finite‐fault earthquake rupture models, and related data and metadata. A comprehensive documentation of the rupture models, a detailed description of the data processing steps, and facilitating the access to the actual data that went into the earthquake source inversion are required to promote follow‐up research and to ensure interoperability, transparency, and reproducibility of the published slip‐inversion solutions. We suggest a formatting scheme that describes the kinematic rupture process in an unambiguous way to support subsequent research. We also provide guidelines on how to document the data, metadata, and data processing. The proposed standards and formats represent a first step to establishing best practices for comprehensively documenting input and output of finite‐fault earthquake source studies.

  14. Compilation and analysis of Escherichia coli promoter DNA sequences.

    OpenAIRE

    Hawley, D K; McClure, W R

    1983-01-01

    The DNA sequences of 168 promoter regions (-50 to +10) for Escherichia coli RNA polymerase were compiled. The complete listing was divided into two groups depending upon whether or not the promoter had been defined by genetic (promoter mutations) or biochemical (5' end determination) criteria. A consensus promoter sequence based on homologies among 112 well-defined promoters was determined that was in substantial agreement with previous compilations. In addition, we have tabulated 98 promoter ...
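
    The consensus-building step is simple to state: at each aligned position, take the most frequent base across the compiled promoters. A minimal sketch with toy sequences (the actual compilation aligned 112 well-defined promoters over positions -50 to +10):

        from collections import Counter

        # Toy aligned fragments around the -10 region.
        aligned = ["TATAAT", "TACGAT", "TATACT", "GATAAT", "TATGAT"]

        consensus = "".join(
            Counter(column).most_common(1)[0][0]  # majority base per column
            for column in zip(*aligned)
        )
        print(consensus)  # TATAAT for these toy sequences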

  15. Installation of a new Fortran compiler and effective programming method on the vector supercomputer

    International Nuclear Information System (INIS)

    Nemoto, Toshiyuki; Suzuki, Koichiro; Watanabe, Kenji; Machida, Masahiko; Osanai, Seiji; Isobe, Nobuo; Harada, Hiroo; Yokokawa, Mitsuo

    1992-07-01

    The Fortran compiler, version 10, was replaced with the new version 12 (V12) compiler on the Fujitsu computer system at JAERI in May 1992. A benchmark test of the compiler's performance was carried out with 16 representative nuclear codes in advance of the installation. The V12 compiler improved performance by a factor of 1.13 on average. The effect of the enhanced functions of the compiler and its compatibility with the nuclear codes were also examined. TOP10EX, an assistant tool for vectorization, was developed. In this report, the results of the evaluation of the V12 compiler and the usage of the tools for vectorization are presented. (author)

  16. Multiple sclerosis documentation system (MSDS): moving from documentation to management of MS patients.

    Science.gov (United States)

    Ziemssen, Tjalf; Kempcke, Raimar; Eulitz, Marco; Großmann, Lars; Suhrbier, Alexander; Thomas, Katja; Schultheiss, Thorsten

    2013-09-01

    The long disease duration of multiple sclerosis and the increasing therapeutic options require an individualized therapeutic approach which should be carefully documented over years of observation. To switch from MS documentation to innovative MS management, new computer- and internet-based tools can be implemented, as we demonstrate with the novel computer-based patient management system "multiple sclerosis management system 3D" (MSDS 3D). MSDS 3D allows documentation and management of visit schedules and mandatory examinations via defined study modules by integrating data input from various sources (patients, attending physicians and MS nurses). It provides forms for the documentation of patient visits as well as clinical and diagnostic findings. Information can be collected via interactive touch screens. Specific modules allow the management of highly efficacious treatments such as natalizumab or fingolimod. MSDS can be used to transfer the documented data to databases such as the registry of the German MS society or REGIMS. MSDS has already been implemented successfully in clinical practice and is currently being evaluated in a multicenter setting. High-quality management and documentation are crucial for improvements in clinical practice and research work.

  17. Compiler-Agnostic Function Detection in Binaries

    NARCIS (Netherlands)

    Andriesse, D.A.; Slowinska, J.M.; Bos, H.J.

    2017-01-01

    We propose Nucleus, a novel function detection algorithm for binaries. In contrast to prior work, Nucleus is compiler-agnostic, and does not require any learning phase or signature information. Instead of scanning for signatures, Nucleus detects functions at the Control Flow Graph-level, making it

  18. Compilation of contract research for the Chemical Engineering Branch, Division of Engineering Technology. Annual report for FY 1985

    International Nuclear Information System (INIS)

    1986-07-01

    This compilation of annual research reports by the contractors to the Chemical Engineering Branch, DET, is published to disseminate information from ongoing programs and covers research conducted during fiscal year 1985. The programs covered in this document include research on: (1) engineered safety feature (ESF) system effectiveness in terms of fission product retention under severe accident conditions; (2) effectiveness and safety aspects of selected decontamination methods; (3) decontamination impacts on solidification and waste disposal; (4) evaluation of nuclear facility decommissioning projects and concepts, and (5) operational schemes to prevent or mitigate the effects of hydrogen combustion during LWR accidents

  19. Summary report of the 1. research co-ordination meeting on compilation and evaluation of photonuclear data for applications

    International Nuclear Information System (INIS)

    1997-04-01

    The present report contains the summary of the first Research Co-ordination Meeting on "Compilation and Evaluation of Photonuclear Data for Applications", held in Obninsk, Russia, from 3 to 6 December 1996. The project aims to produce a Technical Document on a Photonuclear Data Library for Applications and to develop an IAEA Photonuclear Data Library. Summarized are the conclusions and recommendations of the meeting, together with a detailed list of actions. Attached are the information sheet on the project, the agenda of the meeting and the list of participants, along with extended abstracts of their presentations. Refs, figs, tabs

  20. Integrated system for automated financial document processing

    Science.gov (United States)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
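
    The blackboard pattern described here is easy to sketch: a shared store of scored hypotheses that independent knowledge sources post to, with control selecting the best-supported answer. The Python rendering below is schematic, not the authors' system.

        # Schematic blackboard: recognition engines post scored hypotheses for a
        # field (e.g. the courtesy amount); control picks the best-supported one.
        class Blackboard:
            def __init__(self):
                self.hypotheses = []  # (value, confidence, source)

            def post(self, value, confidence, source):
                self.hypotheses.append((value, confidence, source))

            def best(self):
                return max(self.hypotheses, key=lambda h: h[1])

        def courtesy_engine(image, bb):  # e.g. a numeral recognizer
            bb.post("125.00", 0.82, "courtesy_engine")

        def legal_engine(image, bb):     # e.g. a handwritten-words recognizer
            bb.post("125.00", 0.74, "legal_engine")

        bb = Blackboard()
        for engine in (courtesy_engine, legal_engine):
            engine(image=None, bb=bb)  # each source contributes opportunistically
        print(bb.best())               # ('125.00', 0.82, 'courtesy_engine')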

  1. SEGY to ASCII: Conversion and Plotting Program

    Science.gov (United States)

    Goldman, Mark R.

    1999-01-01

    This report documents a computer program to convert standard 4-byte, IBM floating point SEGY files to ASCII xyz format. The program then optionally plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded by any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using GNU's g++ version 2.8.1; use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high-quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu. Plotting the seismic data requires the GMT plotting package, which may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
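
    The only nontrivial step in such a converter is decoding the 4-byte IBM System/360 floats used by classic SEGY: a sign bit, a 7-bit base-16 exponent biased by 64, and a 24-bit fraction. The report's program is C++; the Python function below shows the same decoding for one big-endian word.

        import struct

        def ibm32_to_float(word: bytes) -> float:
            """Decode one big-endian 4-byte IBM System/360 float:
            value = (-1)^sign * 0.fraction * 16^(exponent - 64)."""
            (u,) = struct.unpack(">I", word)
            sign = -1.0 if u & 0x80000000 else 1.0
            exponent = (u >> 24) & 0x7F           # 7-bit base-16 exponent, bias 64
            fraction = (u & 0x00FFFFFF) / float(1 << 24)
            return sign * fraction * 16.0 ** (exponent - 64)

        # 0x42640000: exponent 0x42 = 66, fraction 0x640000 / 2**24 = 0.390625
        print(ibm32_to_float(bytes.fromhex("42640000")))  # 100.0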

  2. Regulatory and technical reports (abstract index journal): Annual compilation for 1987

    International Nuclear Information System (INIS)

    1988-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  3. Digitally Available Interval-Specific Rock-Sample Data Compiled from Historical Records, Nevada Test Site and Vicinity, Nye County, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    David B. Wood

    2009-10-08

    Between 1951 and 1992, underground nuclear weapons testing was conducted at 828 sites on the Nevada Test Site, Nye County, Nevada. Prior to and following these nuclear tests, holes were drilled and mined to collect rock samples. These samples are organized and stored by depth of borehole or drift at the U.S. Geological Survey Core Library and Data Center at Mercury, Nevada, on the Nevada Test Site. From these rock samples, rock properties were analyzed and interpreted and compiled into project files and in published reports that are maintained at the Core Library and at the U.S. Geological Survey office in Henderson, Nevada. These rock-sample data include lithologic descriptions, physical and mechanical properties, and fracture characteristics. Hydraulic properties also were compiled from holes completed in the water table. Rock samples are irreplaceable because pre-test, in-place conditions cannot be recreated and samples cannot be recollected from the many holes destroyed by testing. Documenting these data in a published report will ensure availability for future investigators.

  4. Regulatory and technical reports. Compilation for second quarter 1982, April to June

    International Nuclear Information System (INIS)

    1982-08-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. A detailed explanation of the entries precedes each index

  5. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  6. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    We demonstrate the ability of our tool to transform code and to suggest code refactorings that increase its amenability to optimization. The preliminary results show that, with our tool-set, automatic loop parallelization with the GNU C compiler, gcc, yields an 8.6x best-case speedup over...

  7. Regulatory and technical reports: compilation for third quarter 1982 July-September

    International Nuclear Information System (INIS)

    1982-11-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. These precede the following indexes: Contractor Report Number Index; Personal Author Index; Subject Index; NRC Originating Organization Index (Staff Reports); NRC Contract Sponsor Index (Contractor Reports); Contractor Index; and Licensed Facility Index

  8. Freshwater Biological Traits Database (Traits)

    Science.gov (United States)

    The traits database was compiled for a project on climate change effects on river and stream ecosystems. The traits data, gathered from multiple sources, focused on information published or otherwise well-documented by trustworthy sources.

  9. ProjectQ: an open source software framework for quantum computing

    Directory of Open Access Journals (Sweden)

    Damian S. Steiger

    2018-01-01

    We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through simulation and enables running them on actual quantum hardware using a back-end connecting to the IBM Quantum Experience cloud service. Through extension mechanisms, users can provide back-ends to further quantum hardware, and scientists working on quantum compilation can provide plug-ins for additional compilation, optimization, gate synthesis, and layout strategies.
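
    A minimal ProjectQ program in the spirit of this description looks as follows: a standard single-qubit "coin flip", where the default MainEngine compiles the circuit for the bundled simulator back-end.

        from projectq import MainEngine
        from projectq.ops import H, Measure

        eng = MainEngine()              # compiler engine with the default back-end
        qubit = eng.allocate_qubit()
        H | qubit                       # put the qubit into an equal superposition
        Measure | qubit                 # collapse it to 0 or 1
        eng.flush()                     # run the full compilation pipeline
        print(int(qubit))               # prints 0 or 1 at random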

  10. Ada Compiler Validation Summary Report: Certificate Number: 940325S1.11352, DDC-I DACS Sun SPARC/Solaris to Pentium PM Bare Ada Cross Compiler System, Version 4.6.4, Sun SPARCclassic => Intel Pentium (Operated as Bare Machine) Based in Xpress Desktop (Intel Product Number: XBASE6E4F-B)

    Science.gov (United States)

    1994-03-25

    [Report documentation page garbled in extraction; only fragments are recoverable. The legible fragments cite the Reference Manual for the Ada Programming Language, ANSI/MIL-STD-1815A, February 1983 and ISO 8652-1987; the Ada Compiler Validation Procedures, Version 3.1; test objectives found to be irrelevant for the given Ada implementation; and glossary entries for ISO (International Organization for Standardization) and LRM (the Ada standard).]

  11. Swedish deep repository siting programme. Guide to the documentation of 25 years of geoscientific research (1976-2000)

    Energy Technology Data Exchange (ETDEWEB)

    Milnes, Alan Geoffrey [GEA Consulting, Uppsala (Sweden)

    2002-03-01

    Since the mid-1970s, the Swedish Nuclear Fuel and Waste Management Company (SKB) has been carrying out geoscientific research and feasibility studies aimed at identifying suitable sites for deep repositories in the Precambrian basement of the Baltic Shield. The documentation of this research effort forms an extensive body of material which is exceptionally wide-ranging and which is generally little known outside the Swedish nuclear waste community. This has now been compiled in the form of a 'documentation guide' in order to make the research results more easily accessible to the scientific community at large, and to show how they relate to their 'nearest surroundings', i.e. the relevant academic scientific literature and the documentation of similar research by other institutions, in Sweden and in other countries (Finland, Canada). The documentation covers the period 1976-2000 and contains ca. 850 citations, of which about half are technical reports published by SKB and its forerunners. In the main body of the guide (Chapters 2-9), the material is arranged thematically and the scope of the documentation in each theme is described and commented in short texts, showing the interrelationships between the individual reports and scientific papers, with appropriate cross-references. Early chapters (2-5, and 7) cover general themes: bedrock geology, fracturing, glaciation and crustal dynamics, deep groundwater, and geosphere transport, each subdivided into citation groups under headings which are of particular interest to the Swedish deep repository siting programme. Later chapters (6, and 8-9) include thumbnail sketches of the Swedish study sites (Finnsjoen, Fjaellveden, Gideaa, Kamlunge, Klipperaas, Sternoe), the underground laboratory sites of Stripa and Aespoe, and comparable sites in Finland and Canada, as well as the complete documentation to the feasibility studies carried out in eight Swedish municipalities between 1993 and 2000 (Storuman

  12. Swedish deep repository siting programme. Guide to the documentation of 25 years of geoscientific research (1976-2000)

    International Nuclear Information System (INIS)

    Milnes, Alan Geoffrey

    2002-03-01

    Since the mid-1970s, the Swedish Nuclear Fuel and Waste Management Company (SKB) has been carrying out geoscientific research and feasibility studies aimed at identifying suitable sites for deep repositories in the Precambrian basement of the Baltic Shield. The documentation of this research effort forms an extensive body of material which is exceptionally wide-ranging and which is generally little known outside the Swedish nuclear waste community. This has now been compiled in the form of a 'documentation guide' in order to make the research results more easily accessible to the scientific community at large, and to show how they relate to their 'nearest surroundings', i.e. the relevant academic scientific literature and the documentation of similar research by other institutions, in Sweden and in other countries (Finland, Canada). The documentation covers the period 1976-2000 and contains ca. 850 citations, of which about half are technical reports published by SKB and its forerunners. In the main body of the guide (Chapters 2-9), the material is arranged thematically and the scope of the documentation in each theme is described and commented in short texts, showing the interrelationships between the individual reports and scientific papers, with appropriate cross-references. Early chapters (2-5, and 7) cover general themes: bedrock geology, fracturing, glaciation and crustal dynamics, deep groundwater, and geosphere transport, each subdivided into citation groups under headings which are of particular interest to the Swedish deep repository siting programme. Later chapters (6, and 8-9) include thumbnail sketches of the Swedish study sites (Finnsjoen, Fjaellveden, Gideaa, Kamlunge, Klipperaas, Sternoe), the underground laboratory sites of Stripa and Aespoe, and comparable sites in Finland and Canada, as well as the complete documentation to the feasibility studies carried out in eight Swedish municipalities between 1993 and 2000 (Storuman, Malaa, Nykoeping

  13. Managing the consistency of distributed documents

    OpenAIRE

    Nentwich, C.

    2005-01-01

    Many businesses produce documents as part of their daily activities: software engineers produce requirements specifications, design models, source code, build scripts and more; business analysts produce glossaries, use cases, organisation charts, and domain ontology models; service providers and retailers produce catalogues, customer data, purchase orders, invoices and web pages. What these examples have in common is that the content of documents is often semantically relate...

  14. Compiling gate networks on an Ising quantum computer

    International Nuclear Information System (INIS)

    Bowdrey, M.D.; Jones, J.A.; Knill, E.; Laflamme, R.

    2005-01-01

    Here we describe a simple mechanical procedure for compiling a quantum gate network into the natural gates (pulses and delays) for an Ising quantum computer. The aim is not necessarily to generate the most efficient pulse sequence, but rather to develop an efficient compilation algorithm that can be easily implemented in large spin systems. The key observation is that it is not always necessary to refocus all the undesired couplings in a spin system. Instead, the coupling evolution can simply be tracked and then corrected at some later time. Although described within the language of NMR, the algorithm is applicable to any design of quantum computer based on Ising couplings
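
    The key observation, tracking undesired coupling evolution rather than refocusing it immediately, amounts to simple bookkeeping. The Python sketch below is illustrative only, assuming pure ZZ couplings and units in which accumulated phase equals J*t; the paper's compilation algorithm is more involved.

        import numpy as np

        class CouplingTracker:
            def __init__(self, J):
                self.J = np.asarray(J, dtype=float)  # symmetric coupling matrix
                self.phase = np.zeros_like(self.J)   # accumulated ZZ phase per spin pair

            def delay(self, t):
                # free evolution: every coupled pair accrues phase J_ij * t
                self.phase += self.J * t

            def residual(self):
                # coupling phase still to be corrected at some later time, mod 2*pi
                return np.mod(self.phase, 2 * np.pi)

        tracker = CouplingTracker([[0.0, 0.3], [0.3, 0.0]])
        tracker.delay(1.0)         # let the undesired coupling evolve...
        print(tracker.residual())  # ...and compensate for it at a later stage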

  15. An Introduction to Document Imaging in the Financial Aid Office.

    Science.gov (United States)

    Levy, Douglas A.

    2001-01-01

    First describes the components of a document imaging system in general and then addresses this technology specifically in relation to financial aid document management: its uses and benefits, considerations in choosing a document imaging system, and additional sources for information. (EV)

  16. A compilation of information on the {sup 31}P(p,{alpha}){sup 28}Si reaction and properties of excited levels in the compound nucleus {sup 32}S

    Energy Technology Data Exchange (ETDEWEB)

    Miller, R.E.; Smith, D.L. [Argonne National Lab., IL (United States). Technology Development Div.

    1997-11-01

    This report documents a survey of the literature, and provides a compilation of data contained therein, for the {sup 31}P(p,{alpha}){sup 28}Si reaction. Attention is paid here to resonance states in the compound-nuclear system {sup 32}S formed by {sup 31}P + p, with emphasis on the alpha-particle decay channels, {sup 28}Si + {alpha} which populate specific levels in {sup 28}Si. The energy region near the proton separation energy for {sup 32}S is especially important in this context for applications in nuclear astrophysics. Properties of the excited states in {sup 28}Si are also considered. Summaries of all the located references are provided and numerical data contained in them are compiled in EXFOR format where applicable.

  17. Using MaxCompiler for High Level Synthesis of Trigger Algorithms

    CERN Document Server

    Summers, Sioni Paris; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  18. INDC list of correspondents for the exchange of nuclear data information and compilation of national nuclear data committees

    International Nuclear Information System (INIS)

    1987-09-01

    This list of INDC Correspondents, including information on currently existing National Nuclear Data Committees and their memberships, is compiled and published upon the request of the International Nuclear Data Committee with the objective to promote the interaction and enhance the awareness of nuclear data activities in IAEA Member States. It also serves as a basis for the distribution of documents originated by or for the International Nuclear Data Committee and includes the names of all recipients of INDC documents. The INDC Secretariat tries to maintain this list up-to-date in order to facilitate an efficient interchange of information on nuclear data topics. The report is presented in five sections. The first section contains a detailed description of the INDC distribution categories, distribution codes and document designator codes. The second section describes the aims, organization and objectives of individual national nuclear data committees. The third section lists names and addresses in alphabetical order within each state or international organization together with the assigned INDC document distribution code(s); where applicable committee membership and/or area of specialization are indicated. This is followed by four shorter lists, indicating the names of individuals in each distribution category, sorted by country or international organization, and the total number of individuals in each category. The final section provides the names of nuclear data committee members also listed by country or international organization

  19. Compilation status and research topics in Hokkaido University Nuclear Reaction Data Centre

    International Nuclear Information System (INIS)

    Aikawa, M.; Furutachi, N.; Katō, K.; Ebata, S.; Ichinkhorloo, D.; Imai, S.; Sarsembayeva, A.; Zhou, B.; Otuka, N.

    2015-01-01

    Nuclear reaction data are necessary and applicable for many application fields. The nuclear reaction data must be compiled into a database for convenient availability. One such database is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC). As a member of the NRDC, the Hokkaido University Nuclear Reaction Data Centre (JCPRG) compiles charged-particle induced reaction data and contributes about 10 percent of the EXFOR database. In this paper, we show the recent compilation status and related research topics of JCPRG. (author)

  20. 36 CFR 902.57 - Investigatory files compiled for law enforcement purposes.

    Science.gov (United States)

    2010-07-01

    Title 36, Parks, Forests, and Public Property, Volume 3 (revised as of 2010-07-01): Pennsylvania Avenue Development Corporation, Freedom of Information Act, Exemptions From Public Access to Corporation Records, § 902.57 Investigatory files compiled for law enforcement purposes...

  1. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point
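
    Knowledge compiling in this sense, generating efficient run-time code from declarative models ahead of time, can be pictured with a toy example. The sketch below is hypothetical and far simpler than KSE: a declarative threshold model is emitted once as straight-line Python source, so the generated monitor performs no rule interpretation at run time.

        RULES = [("pressure", 155.0, "open relief valve"),
                 ("temperature", 620.0, "start auxiliary cooling")]

        def compile_rules(rules):
            lines = ["def monitor(sensors):", "    alarms = []"]
            for key, limit, action in rules:   # emit one direct test per rule
                lines.append(f"    if sensors[{key!r}] > {limit}: alarms.append({action!r})")
            lines.append("    return alarms")
            namespace = {}
            exec("\n".join(lines), namespace)  # "compile" the generated source
            return namespace["monitor"]

        monitor = compile_rules(RULES)
        print(monitor({"pressure": 160.0, "temperature": 500.0}))  # ['open relief valve']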

  2. Regular expressions compiler and some applications

    International Nuclear Information System (INIS)

    Saldana A, H.

    1978-01-01

    We deal with the high-level programming language of a Regular Expressions Compiler (REC). The first chapter is an introduction in which the history of REC's development and the problems related to its numerous applications are described. The syntactic and semantic rules as well as the language features are discussed just after the introduction. Concerning the applications, as examples, one adaptation is given for solving numerical problems and another for data manipulation. The last chapter is an exposition of ideas and techniques about compiler construction. Examples of the adaptation to numerical problems show applications to education, vector analysis, quantum mechanics, physics, mathematics and other sciences. The rudiments of an operating system for a minicomputer exemplify the adaptation to symbolic data manipulation. REC is a programming language that could be applied to solve problems in almost any human activity. Handling of computer graphics, control equipment, research on languages, microprocessors and general research are some of the fields in which this programming language can be applied and developed. (author)
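
    As a generic illustration of the regular-expression machinery such a compiler builds on (this sketch is not REC's actual design), the following Python code matches strings by Brzozowski derivatives over a small expression tree.

        EPS, NUL = ("eps",), ("nul",)

        def nullable(r):
            tag = r[0]
            if tag in ("eps", "star"): return True
            if tag in ("nul", "chr"): return False
            if tag == "cat": return nullable(r[1]) and nullable(r[2])
            return nullable(r[1]) or nullable(r[2])  # alt

        def deriv(r, c):
            tag = r[0]
            if tag in ("eps", "nul"): return NUL
            if tag == "chr": return EPS if r[1] == c else NUL
            if tag == "alt": return ("alt", deriv(r[1], c), deriv(r[2], c))
            if tag == "star": return ("cat", deriv(r[1], c), r)
            d = ("cat", deriv(r[1], c), r[2])  # cat: d(rs) = d(r)s + [nullable(r)]d(s)
            return ("alt", d, deriv(r[2], c)) if nullable(r[1]) else d

        def match(r, s):
            for c in s: r = deriv(r, c)
            return nullable(r)

        # (ab)*a matches "ababa"
        r = ("cat", ("star", ("cat", ("chr", "a"), ("chr", "b"))), ("chr", "a"))
        print(match(r, "ababa"))  # True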

  3. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  4. Compilation of Requirements for Safe Handling of Fluorine and Fluorine-Containing Products of Uranium Hexafluoride Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Ferrada, J.J.

    2000-04-03

    Public Law (PL) 105-204 requires the U.S. Department of Energy to develop a plan for inclusion in the fiscal year 2000 budget for conversion of the Department's stockpile of depleted uranium hexafluoride (DUF{sub 6}) to a more stable form over an extended period. The conversion process will produce fluorine compounds (e.g., elemental fluorine or hydrofluoric acid) that need to be handled safely. This document compiles the requirements, including the health and safety standards that may apply, necessary to handle these materials so as to ensure protection of the environment and the safety and health of workers and the public. Fluorine is a pale-yellow gas with a pungent, irritating odor. It is the most reactive nonmetal and will react vigorously with most oxidizable substances at room temperature, frequently with ignition. Fluorine is a severe irritant of the eyes, mucous membranes, skin, and lungs. In humans, the inhalation of high concentrations causes laryngeal spasm and bronchospasm, followed by the delayed onset of pulmonary edema. At sublethal levels, severe local irritation and laryngeal spasm will preclude voluntary exposure to high concentrations, unless the individual is trapped or incapacitated. A blast of fluorine gas on the shaved skin of a rabbit causes a second-degree burn. Lower concentrations cause severe burns of insidious onset, resulting in ulceration, similar to the effects produced by hydrogen fluoride. Hydrofluoric acid is a colorless, fuming liquid or gas with a pungent odor. It is soluble in water with release of heat. Ingestion of an estimated 1.5 grams produced sudden death without gross pathological damage. Repeated ingestion of small amounts resulted in moderately advanced hardening of the bones. Contact of skin with the anhydrous liquid produces severe burns. Inhalation of AHF (anhydrous hydrogen fluoride) or aqueous hydrofluoric acid mist or vapors can cause severe respiratory tract irritation that may be fatal. Based on the extreme chemical

  5. Rubus: A compiler for seamless and extensible parallelism

    Science.gov (United States)

    Adnan, Muhammad; Aslam, Faisal; Sarwar, Syed Mansoor

    2017-01-01

    Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special purpose processing unit called the Graphic Processing Unit (GPU), originally designed for 2D/3D games, is now available for general purpose use in computers and mobile devices. However, the traditional programming languages, which were designed to work with machines having single-core CPUs, cannot utilize the parallelism available on multi-core processors efficiently. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages like CUDA and OpenCL have been introduced, which can be used to write code with parallelism. The main shortcoming of these languages is that the programmer needs to specify all the complex details manually in order to parallelize the code across multiple cores. Therefore, the code written in these languages is difficult to understand, debug and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of it in CUDA or OpenCL, which can consume significant time and resources. Thus, the amount of parallelism achieved is proportional to the skills of the programmer and the time spent in code optimizations. This paper proposes a new open source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details. It analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedup and better utilization of the underlying hardware without requiring a programmer's expertise in parallel programming. For five different benchmarks, on average a speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores, whereas for a matrix multiplication benchmark an average execution speedup of 84 times has been achieved.
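
    The kind of rewrite such a compiler performs can be pictured in a few lines. The sketch below is conceptual only (Rubus itself operates on Java programs and targets GPUs): a loop whose iterations are provably independent is replaced by a parallel map.

        from multiprocessing import Pool

        def body(i):
            return i * i              # iterations are independent: safe to parallelize

        def sequential(n):
            return [body(i) for i in range(n)]

        def parallelized(n):          # the compiler-inserted parallel version
            with Pool() as pool:
                return pool.map(body, range(n))

        if __name__ == "__main__":
            assert sequential(1000) == parallelized(1000)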

  6. Rubus: A compiler for seamless and extensible parallelism.

    Directory of Open Access Journals (Sweden)

    Muhammad Adnan

    Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special purpose processing unit called the Graphic Processing Unit (GPU), originally designed for 2D/3D games, is now available for general purpose use in computers and mobile devices. However, the traditional programming languages, which were designed to work with machines having single-core CPUs, cannot utilize the parallelism available on multi-core processors efficiently. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages like CUDA and OpenCL have been introduced, which can be used to write code with parallelism. The main shortcoming of these languages is that the programmer needs to specify all the complex details manually in order to parallelize the code across multiple cores. Therefore, the code written in these languages is difficult to understand, debug and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of it in CUDA or OpenCL, which can consume significant time and resources. Thus, the amount of parallelism achieved is proportional to the skills of the programmer and the time spent in code optimizations. This paper proposes a new open source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details. It analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedup and better utilization of the underlying hardware without requiring a programmer's expertise in parallel programming. For five different benchmarks, on average a speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores, whereas for a matrix multiplication benchmark an average execution speedup of 84 times has been achieved.

  7. Integrated criteria document mercury

    International Nuclear Information System (INIS)

    Sloof, W.; Beelan, P. van; Annema, J.A.; Janus, J.A.

    1995-01-01

    The document contains a systematic review and a critical evaluation of the most relevant data on the priority substance mercury for the purpose of effect-oriented environmental policy. Chapter headings are: properties and existing standards; production, application, sources and emissions (natural sources, industry, energy, households, agriculture, dental use, waste); distribution and transformation (cinnabar; Hg²⁺, Hg₂²⁺, elemental mercury, methylmercury, behavior in soil, water, air, biota); concentrations and fluxes in the environment and exposure levels (sampling and measuring methods, occurrence in soil, water, air etc.); effects (toxicity to humans and aquatic and terrestrial systems); emissions reduction (from industrial sources, energy, waste processing etc.); and evaluation (risks, standards, emission reduction objectives, measuring strategies). 395 refs

  8. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...
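
    Evaluating a compiled arithmetic circuit is a single bottom-up pass. The toy Python sketch below uses a hypothetical one-variable circuit, far smaller than anything PRIMULA emits: it computes P(A = true) by setting evidence indicators and evaluating sum and product nodes.

        # circuit for a single Boolean variable A with P(A=true) = 0.3
        circuit = ("+",
                   ("*", ("ind", "A=t"), ("par", 0.3)),
                   ("*", ("ind", "A=f"), ("par", 0.7)))

        def evaluate(node, indicators):
            tag = node[0]
            if tag == "ind": return indicators[node[1]]  # evidence indicator
            if tag == "par": return node[1]               # network parameter
            vals = [evaluate(child, indicators) for child in node[1:]]
            return sum(vals) if tag == "+" else vals[0] * vals[1]

        # evidence A=true: the matching indicator is 1, all others are 0
        print(evaluate(circuit, {"A=t": 1.0, "A=f": 0.0}))  # 0.3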

  9. ProjectQ: An Open Source Software Framework for Quantum Computing

    OpenAIRE

    Steiger, Damian S.; Häner, Thomas; Troyer, Matthias

    2016-01-01

    We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through...

  10. Safety and maintenance engineering: A compilation

    Science.gov (United States)

    1974-01-01

    A compilation is presented for the dissemination of information on technological developments which have potential utility outside the aerospace and nuclear communities. Safety of personnel engaged in the handling of hazardous materials and equipment, protection of equipment from fire, high wind, or careless handling by personnel, and techniques for the maintenance of operating equipment are reported.

  11. Compilation of cross-sections. Pt. 1

    International Nuclear Information System (INIS)

    Flaminio, V.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1983-01-01

    A compilation of integral cross-sections for hadronic reactions is presented. This is an updated version of CERN/HERA 79-1, 79-2, 79-3. It contains all data published up to the beginning of 1982, but some more recent data have also been included. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  12. Compilation of information on melter modeling

    International Nuclear Information System (INIS)

    Eyler, L.L.

    1996-03-01

    The objective of the task described in this report is to compile information on modeling capabilities for the High-Temperature Melter and the Cold Crucible Melter and issue a modeling capabilities letter report summarizing existing modeling capabilities. The report is to include strategy recommendations for future modeling efforts to support the High-Level Waste (HLW) melter development

  13. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    [Cover pages and reference list garbled in extraction. Recoverable information: a final technical report, Purdue University, November 2017, on the verified compilation of concurrent managed languages, approved for public release; one legible citation is Viktor Vafeiadis, Modular fine-grained concurrency verification, Technical Report UCAM-CL-TR-726, University of Cambridge Computer Laboratory, 2007.]

  14. Compilation of cross-sections. Pt. 4

    International Nuclear Information System (INIS)

    Alekhin, S.I.; Ezhela, V.V.; Lugovsky, S.B.; Tolstenkov, A.N.; Yushchenko, O.P.; Baldini, A.; Cobal, M.; Flaminio, V.; Capiluppi, P.; Giacomelli, G.; Mandrioli, G.; Rossi, A.M.; Serra, P.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1987-01-01

    This is the fourth volume in our series of data compilations on integrated cross-sections for weak, electromagnetic, and strong interaction processes. This volume covers data on reactions induced by photons, neutrinos, hyperons, and K{sub L}{sup 0}. It contains all data published up to June 1986. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  15. Compilation of radiometric age and trace-element geochemical data, Yucca Mountain and surrounding areas of southwestern Nevada

    International Nuclear Information System (INIS)

    Weiss, S.I.; Noble, D.C.; Larson, L.T.

    1994-01-01

    This document is a compilation of available radiometric age and trace-element geochemical data for volcanic rocks and episodes of hydrothermal activity in Yucca Mountain and the surrounding region of southwestern Nevada. Only the age determinations considered to be geologically reasonable (consistent with stratigraphic relations) are listed below. A number of the potassium-argon (K-Ar) ages of volcanic rocks given by Kistler, Marvin et al., Noble et al., Weiss et al., and Noble et al. are not included as these ages have been shown to be incorrect or disturbed by hydrothermal alteration based on subsequent stratigraphic and/or petrographic data and the recognition of errors in K-Ar age determinations related to incomplete extraction of argon. In cases where absolute ages are tightly constrained by high-precision {sup 40}Ar/{sup 39}Ar ages and unequivocal stratigraphic relations, we have omitted the less precise K-Ar age data. Similarly, the more precise single-crystal laser-fusion {sup 40}Ar/{sup 39}Ar age determinations of certain units are reported and less precise ages by multi-grain bulk-fusion {sup 40}Ar/{sup 39}Ar methods are not included. This compilation does not include age data for basaltic rocks of Pliocene and Quaternary age in the Yucca Mountain region

  16. Compilation and evaluation of a Paso del Norte emission inventory

    Energy Technology Data Exchange (ETDEWEB)

    Funk, T.H.; Chinkin, L.R.; Roberts, P.T. [Sonoma Technology, Inc., 1360 Redwood Way, Suite C, 94954-1169 Petaluma, CA (United States); Saeger, M.; Mulligan, S. [Pacific Environmental Services, 5001 S. Miami Blvd., Suite 300, 27709 Research Triangle Park, NC (United States); Paramo Figueroa, V.H. [Instituto Nacional de Ecologia, Avenue Revolucion 1425, Nivel 10, Col. Tlacopac San Angel, Delegacion Alvaro Obregon, C.P., 01040, D.F. Mexico (Mexico); Yarbrough, J. [US Environmental Protection Agency - Region 6, 1445 Ross Avenue, Suite 1200, 75202-2733 Dallas, TX (United States)

    2001-08-10

    Emission inventories of ozone precursors are routinely used as input to comprehensive photochemical air quality models. Photochemical model performance and the development of effective control strategies rely on the accuracy and representativeness of an underlying emission inventory. This paper describes the tasks undertaken to compile and evaluate an ozone precursor emission inventory for the El Paso/Ciudad Juarez/Southern Dona Ana region. Point, area and mobile source emission data were obtained from local government agencies and were spatially and temporally allocated to a gridded domain using region-specific demographic and land-cover information. The inventory was then processed using the US Environmental Protection Agency (EPA) recommended Emissions Preprocessor System 2.0 (UAM-EPS 2.0) which generates emissions files compatible with the Urban Airshed Model (UAM). A top-down evaluation of the emission inventory was performed to examine how well the inventory represented ambient pollutant compositions. The top-down evaluation methodology employed in this study compares emission inventory ratios of non-methane hydrocarbon (NMHC)/nitrogen oxide (NO{sub x}) and carbon monoxide (CO)/NO{sub x} ratios to corresponding ambient ratios. Detailed NMHC species comparisons were made in order to investigate the relative composition of individual hydrocarbon species in the emission inventory and in the ambient data. The emission inventory compiled during this effort has since been used to model ozone in the Paso del Norte airshed (Emery et al., CAMx modeling of ozone and carbon monoxide in the Paso del Norte airshed. In: Proc of Ninety-Third Annual Meeting of Air and Waste Management Association, 18-22 June 2000, Air and Waste Management Association, Pittsburgh, PA, 2000)

  17. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  18. Data compilation and assessment for water resources in Pennsylvania state forest and park lands

    Science.gov (United States)

    Galeone, Daniel G.

    2011-01-01

    As a result of a cooperative study between the U.S. Geological Survey and the Pennsylvania Department of Conservation and Natural Resources (PaDCNR), available electronic data were compiled for Pennsylvania state lands (state forests and parks) to allow PaDCNR to initially determine if data exist to make an objective evaluation of water resources for specific basins. The data compiled included water-quantity and water-quality data and sample locations for benthic macroinvertebrates within state-owned lands (including a 100-meter buffer around each land parcel) in Pennsylvania. In addition, internet links or contacts for geographic information system coverages pertinent to water-resources studies also were compiled. Water-quantity and water-quality data primarily available through January 2007 were compiled and summarized for site types that included streams, lakes, ground-water wells, springs, and precipitation. Data were categorized relative to 35 watershed boundaries defined by the Pennsylvania Department of Environmental Protection for resource-management purposes. The primary sources of continuous water-quantity data for Pennsylvania state lands were the U.S. Geological Survey (USGS) and the National Weather Service (NWS). The USGS has streamflow data for 93 surface-water sites located in state lands; 38 of these sites have continuous-recording data available. As of January 2007, 22 of these 38 streamflow-gaging stations were active; the majority of active gaging stations have over 40 years of continuous record. The USGS database also contains continuous ground-water elevation data for 32 wells in Pennsylvania state lands, 18 of which were active as of January 2007. Sixty-eight active precipitation stations (primarily from the NWS network) are located in state lands. The four sources of available water-quality data for Pennsylvania state lands were the USGS, U.S. Environmental Protection Agency, Pennsylvania Department of Environmental Protection (PaDEP), and

  19. Applying Satellite Data Sources in the Documentation and Landscape Modelling for Graeco-Roman Fortified Sites in the TŪR Abdin Area, Eastern Turkey

    Science.gov (United States)

    Silver, K.; Silver, M.; Törmä, M.; Okkonen, J.; Okkonen, T.

    2017-08-01

    In 2015-2016 the Finnish-Swedish Archaeological Project in Mesopotamia (FSAPM) initiated a pilot study of an unexplored area in the Tūr Abdin region in Northern Mesopotamia (present-day Mardin Province in southeastern Turkey). FSAPM is reliant on satellite image data sources for prospecting, identifying, recording, and mapping largely unknown archaeological sites as well as studying their landscapes in the region. The purpose is to record and document sites in this endangered area for saving its cultural heritage. The sites in question consist of fortified architectural remains in an ancient border zone between the Graeco-Roman/Byzantine world and Parthia/Persia. The location of the archaeological sites in the terrain and the visible archaeological remains, as well as their dimensions and sizes, were determined from the orthorectified satellite images, which also provided coordinates. In addition, field documentation was carried out in situ with photographs and notes. The applicability of various satellite data sources for the archaeological documentation of the project was evaluated. Satellite photographs from three 1968 CORONA missions, i.e. the declassified US government satellite photograph archives, were acquired. Furthermore, satellite images included a recent GeoEye-1 Satellite Sensor Image from 2010 with a resolution of 0.5 m. Its applicability for prospecting archaeological sites, studying the terrain and producing landscape models in 3D was confirmed. The GeoEye-1 revealed the ruins of a fortified town and a fortress for their documentation and study. Landscape models for the area of these sites were constructed fusing GeoEye-1 with EU-DEM (European Digital Elevation Model data using SRTM and ASTER GDEM data) in order to understand their locations in the terrain.

  20. Deep knowledge and knowledge compilation for dynamic systems

    International Nuclear Information System (INIS)

    Mizoguchi, Riichiro

    1994-01-01

    Expert systems are viewed as knowledge-based systems which efficiently solve real-world problems based on the expertise contained in their knowledge bases elicited from domain experts. Although such expert systems, which depend on the heuristics of domain experts, have contributed to the current success, they are known to be brittle and hard to build. This paper is concerned with research on model-based diagnosis and knowledge compilation for dynamic systems conducted by the author's group to overcome these difficulties. Firstly, we summarize the advantages and shortcomings of expert systems. Secondly, deep knowledge and knowledge compilation are discussed. Then, the latest results of our research on model-based diagnosis are reviewed. The future direction of knowledge base technology research is also discussed. (author)

  1. Renewable energy sources. Erneuerbare Energien

    Energy Technology Data Exchange (ETDEWEB)

    1988-01-01

    To judge future trends in work on the exploitation of renewable energy sources for overall energy supply, it is necessary to know the following: the rules that nature abides by, the principles of technical exploitation of these energies, and the basic data for the current state of development. The above information is compiled in this publication for those renewable energy sources on which topical discussion centres: solar radiation and wind. For the remaining renewable energy sources (e.g. biomass, tidal power, geothermal energy), some examples of use are mentioned and advanced literature is indicated. (orig./HSCH).

  2. Basic freight forwarding and transport documentation in freight forwarder’s work

    Directory of Open Access Journals (Sweden)

    Adam Salomon

    2014-09-01

    The purpose of the article is to present the basic documentation in an international freight forwarder's work, in particular insurance documents and transport documents in the various modes of transport. An additional goal is to identify sources that can be used in properly completing the individual documents.

  3. Fiscal year 1999 waste information requirements document

    International Nuclear Information System (INIS)

    Adams, M.R.

    1998-01-01

    The Waste Information Requirements Document (WIRD) has the following purposes: To describe the overall drivers that require characterization information and to document their source; To define how characterization is going to satisfy the drivers, close issues, and measure and report progress; and To describe deliverables and acceptance criteria for characterization. Characterization information is required to maintain regulatory compliance, perform operations and maintenance, resolve safety issues, and prepare for disposal of waste. Commitments addressing these requirements are derived from the Hanford Federal Facility Agreement and Consent Order, also known as the Tri-Party Agreement; the Recommendation 93-5 Implementation Plan (DOE-RL 1996a) to the Defense Nuclear Facilities Safety Board (DNFSB); and other requirement sources listed in Section 2.0. The Waste Information Requirements Document replaces the tank waste analysis plans and the tank characterization plan previously required by the Tri-Party Agreement, Milestone M-44-01 and M-44-02 series

  4. Integration of clinical research documentation in electronic health records.

    Science.gov (United States)

    Broach, Debra

    2015-04-01

    Clinical trials of investigational drugs and devices are often conducted within healthcare facilities concurrently with clinical care. With implementation of electronic health records, new communication methods are required to notify nonresearch clinicians of research participation. This article reviews clinical research source documentation, the electronic health record and the medical record, areas in which the research record and electronic health record overlap, and implications for the research nurse coordinator in documentation of the care of the patient/subject. Incorporation of clinical research documentation in the electronic health record will lead to a more complete patient/subject medical record in compliance with both research and medical records regulations. A literature search provided little information about the inclusion of clinical research documentation within the electronic health record. Although regulations and guidelines define both source documentation and the medical record, integration of research documentation in the electronic health record is not clearly defined. At minimum, the signed informed consent(s), investigational drug or device usage, and research team contact information should be documented within the electronic health record. Institutional policies should define a standardized process for this integration in the absence of federal guidance. Nurses coordinating clinical trials are in an ideal position to define this integration.

  5. National energetic balance. Statistical compilation 1985-1991

    International Nuclear Information System (INIS)

    1992-01-01

    Compiles the statistical information supplied by the governmental and private institutions that make up the national energy sector in Paraguay. The first part covers the overall energy supply; the second, energy transformation centres; and the last part presents energy flows, consolidated balances and other energy-economy indicators

  6. 13 CFR 146.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    Title 13, Business Credit and Assistance, Volume 1 (revised as of 2010-01-01): Small Business Administration, New Restrictions on Lobbying, § 146.600 Semi-annual compilation... (c) Information that involves intelligence matters shall be reported only to the Select Committee on...

  7. Research on the Maritime Communication Cryptographic Chip’s Compiler Optimization

    Directory of Open Access Journals (Sweden)

    Sheng Li

    2017-08-01

    In the process of ocean development, the technology for maritime communication systems is an active research field; information security is vital to the normal operation of the whole system and is also one of the difficult problems in maritime communication research. In this paper, a maritime communication cryptographic SoC (system on chip) is introduced, and its compiler framework is put forward through analysis of its working mode and the problems faced by the compiler front end. Then, a loop-unrolling-factor calculating algorithm based on queueing theory, named UFBOQ (unrolling factor based on queue), is proposed to perform parallel optimization in the compiler front end while respecting the instruction memory capacity limit. Finally, the scalar replacement method is used to optimize the unrolled code and hide memory access latency, exploiting the contiguous data-storage characteristics of cryptographic algorithms. The UFBOQ algorithm and scalar replacement prove effective and appropriate, achieving linear speedup.
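
    Loop unrolling combined with scalar replacement, the two optimizations named above, can be shown in miniature. The Python sketch below is a generic illustration rather than the paper's cryptographic kernel: the loop body is replicated four times and the accumulator is split into local scalars.

        def dot(a, b):
            s0 = s1 = s2 = s3 = 0.0          # scalar replacement: four local accumulators
            n4 = len(a) - len(a) % 4
            for i in range(0, n4, 4):        # unrolled by a factor of 4
                s0 += a[i] * b[i]
                s1 += a[i + 1] * b[i + 1]
                s2 += a[i + 2] * b[i + 2]
                s3 += a[i + 3] * b[i + 3]
            s = s0 + s1 + s2 + s3
            for i in range(n4, len(a)):      # epilogue for leftover iterations
                s += a[i] * b[i]
            return s

        print(dot([1.0, 2.0, 3.0, 4.0, 5.0], [1.0, 1.0, 1.0, 1.0, 1.0]))  # 15.0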

  8. Fifth Baltic Sea pollution load compilation (PLC-5). An executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Svendsen, L.M.; Staaf, H.; Pyhala, M.; Kotilainen, P.; Bartnicki, J.; Knuuttila, S.; Durkin, M.

    2012-07-01

    This report summarizes and combines the main results of the Fifth Baltic Sea Pollution Load Compilation (HELCOM 2011) which covers waterborne loads to the sea and data on atmospheric loads which are submitted by countries to the co-operative programme for monitoring and evaluation of the long range transmission of air pollutants in Europe (EMEP), which subsequently compiles and reports this information to HELCOM.

  9. Compilation of data relating to the erosive response of 608 recently-burned basins in the western United States

    Science.gov (United States)

    Gartner, Joseph E.; Cannon, Susan H.; Bigio, Erica R.; Davis, Nicole K.; Parrett, Charles; Pierce, Kenneth L.; Rupert, Michael G.; Thurston, Brandon L.; Trebesch, Matthew J.; Garcia, Steve P.; Rea, Alan H.

    2005-01-01

    This report presents a compilation of data on the erosive response, debris-flow initiation processes, basin morphology, burn severity, event-triggering rainfall, rock type, and soils for 608 basins recently burned by 53 fires located throughout the Western United States.  The data presented here are a combination of those collected during our own field research and those reported in the literature.  In some cases, data from a Geographic Information System (GIS) and Digital Elevation Models (DEMs) were used to supplement the data from the primary source.  Due to gaps in the information available, not all parameters are characterized for all basins. This database provides a resource for researchers and land managers interested in examining relations between the runoff response of recently burned basins and their morphology, burn severity, soils and rock type, and triggering rainfall.  The purpose of this compilation is to provide a single resource for future studies addressing problems associated with wildfire-related erosion.  For example, data in this compilation have been used to develop a model for debris flow probability from recently burned basins using logistic multiple regression analysis (Cannon and others, 2004).  This database provides a convenient starting point for other studies.  For additional information on estimated post-fire runoff peak discharges and debris-flow volumes, see Gartner and others (2004).

  10. Using MaxCompiler for the high level synthesis of trigger algorithms

    International Nuclear Information System (INIS)

    Summers, S.; Rose, A.; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  11. Using MaxCompiler for the high level synthesis of trigger algorithms

    Science.gov (United States)

    Summers, S.; Rose, A.; Sanders, P.

    2017-02-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  12. Languages, compilers and run-time environments for distributed memory machines

    CERN Document Server

    Saltz, J

    1992-01-01

    Papers presented within this volume cover a wide range of topics related to programming distributed memory machines. Distributed memory architectures, although having the potential to supply the very high levels of performance required to support future computing needs, present awkward programming problems. The major issue is to design methods which enable compilers to generate efficient distributed memory programs from relatively machine independent program specifications. This book is the compilation of papers describing a wide range of research efforts aimed at easing the task of programming.

  13. Combining Compile-Time and Run-Time Parallelization

    Directory of Open Access Journals (Sweden)

    Sungdo Moon

    1999-01-01

    This paper demonstrates that significant improvements to automatic parallelization technology require that existing systems be extended in two ways: (1) they must combine high-quality compile-time analysis with low-cost run-time testing; and (2) they must take control flow into account during analysis. We support this claim with the results of an experiment that measures the safety of parallelization at run time for loops left unparallelized by the Stanford SUIF compiler's automatic parallelization system. We present results of measurements on programs from two benchmark suites – SPECFP95 and NAS sample benchmarks – which identify inherently parallel loops in these programs that are missed by the compiler. We characterize remaining parallelization opportunities, and find that most of the loops require run-time testing, analysis of control flow, or some combination of the two. We present a new compile-time analysis technique that can be used to parallelize most of these remaining loops. This technique is designed to not only improve the results of compile-time parallelization, but also to produce low-cost, directed run-time tests that allow the system to defer binding of parallelization until run time when safety cannot be proven statically. We call this approach predicated array data-flow analysis. We augment array data-flow analysis, which the compiler uses to identify independent and privatizable arrays, by associating predicates with array data-flow values. Predicated array data-flow analysis allows the compiler to derive "optimistic" data-flow values guarded by predicates; these predicates can be used to derive a run-time test guaranteeing the safety of parallelization.
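
    A compiler-inserted run-time test of this kind can be sketched directly. The Python code below is hypothetical, in the spirit of predicated array data-flow analysis rather than the SUIF implementation: the loop writes a[idx[i]], which is safe to parallelize only when idx contains no duplicates, a fact that often cannot be proven statically.

        import numpy as np
        from concurrent.futures import ThreadPoolExecutor

        def run_loop(a, idx):
            if len(np.unique(idx)) == len(idx):   # run-time test: all write targets distinct
                def body(i):
                    a[idx[i]] = 2 * i             # no write-write conflicts, so parallel is safe
                with ThreadPoolExecutor() as pool:
                    list(pool.map(body, range(len(idx))))
            else:                                 # predicate failed: fall back to sequential
                for i in range(len(idx)):
                    a[idx[i]] = 2 * i

        a = np.zeros(4)
        run_loop(a, np.array([2, 0, 3, 1]))
        print(a)                                  # [2. 6. 0. 4.]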

  14. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  15. Sources for charged particles

    International Nuclear Information System (INIS)

    Arianer, J.

    1997-01-01

    This document is a basic course on charged-particle sources for post-graduate students and thematic schools on large facilities and accelerator physics. A simple but precise description of the creation and the emission of charged particles is presented. The course relies on reference documents that are updated every year. The following relevant topics are considered: electronic emission processes, technological and practical considerations on electron guns, positron sources, production of neutral atoms, ionization, plasma and discharge, different types of positive and negative ion sources, polarized particle sources, materials for the construction of ion sources, and low energy beam production and transport. (N.T.)

  16. On the performance of the HAL/S-FC compiler. [for space shuttles

    Science.gov (United States)

    Martin, F. H.

    1975-01-01

    The HAL/S compilers which will be used in the space shuttles are described. Acceptance test objectives and procedures are described, the raw results are presented and analyzed, and conclusions and observations are drawn. An appendix is included containing an illustrative set of compiler listings and results for one of the test cases.

  17. Expectation Levels in Dictionary Consultation and Compilation ...

    African Journals Online (AJOL)

    Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of the structure ...

  18. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1981-03-01

    A request list for nuclear data which was produced from a computerized data file by the National Nuclear Data Center is presented. The request list is given by target nucleus (isotope) and then reaction type. The purpose of the compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. Requesters are identified by laboratory, last name, and sponsoring US government agency

  19. APPLYING SATELLITE DATA SOURCES IN THE DOCUMENTATION AND LANDSCAPE MODELLING FOR GRAECO-ROMAN/BYZANTINE FORTIFIED SITES IN THE TŪR ABDIN AREA, EASTERN TURKEY

    Directory of Open Access Journals (Sweden)

    K. Silver

    2017-08-01

    Full Text Available In 2015-2016 the Finnish-Swedish Archaeological Project in Mesopotamia (FSAPM) initiated a pilot study of an unexplored area in the Tūr Abdin region in Northern Mesopotamia (present-day Mardin Province in southeastern Turkey). FSAPM is reliant on satellite image data sources for prospecting, identifying, recording, and mapping largely unknown archaeological sites as well as studying their landscapes in the region. The purpose is to record and document sites in this endangered area in order to save its cultural heritage. The sites in question consist of fortified architectural remains in an ancient border zone between the Graeco-Roman/Byzantine world and Parthia/Persia. The location of the archaeological sites in the terrain and the visible archaeological remains, as well as their dimensions and sizes, were determined from the orthorectified satellite images, which also provided coordinates. In addition, field documentation was carried out in situ with photographs and notes. The applicability of various satellite data sources for the archaeological documentation of the project was evaluated. Satellite photographs from three 1968 CORONA missions, i.e. from the declassified US government satellite photograph archives, were acquired. Furthermore, satellite images included a recent GeoEye-1 Satellite Sensor Image from 2010 with a resolution of 0.5 m. Its applicability for prospecting archaeological sites, studying the terrain and producing landscape models in 3D was confirmed. The GeoEye-1 revealed the ruins of a fortified town and a fortress for their documentation and study. Landscape models for the area of these sites were constructed by fusing GeoEye-1 with EU-DEM (European Digital Elevation Model) data, using SRTM and ASTER GDEM data, in order to understand their locations in the terrain.

  20. Chandra Source Catalog: User Interface

    Science.gov (United States)

    Bonaventura, Nina; Evans, Ian N.; Rots, Arnold H.; Tibbetts, Michael S.; van Stone, David W.; Zografou, Panagoula; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; He, Helen; Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Winkelman, Sherry L.

    2009-09-01

    The Chandra Source Catalog (CSC) is intended to be the definitive catalog of all X-ray sources detected by Chandra. For each source, the CSC provides positions and multi-band fluxes, as well as derived spatial, spectral, and temporal source properties. Full-field and source region data products are also available, including images, photon event lists, light curves, and spectra. The Chandra X-ray Center CSC website (http://cxc.harvard.edu/csc/) is the place to visit for high-level descriptions of each source property and data product included in the catalog, along with other useful information, such as step-by-step catalog tutorials, answers to FAQs, and a thorough summary of the catalog statistical characterization. Eight categories of detailed catalog documents may be accessed from the navigation bar on most of the 50+ CSC pages; these categories are: About the Catalog, Creating the Catalog, Using the Catalog, Catalog Columns, Column Descriptions, Documents, Conferences, and Useful Links. There are also prominent links to CSCview, the CSC data access GUI, and related help documentation, as well as a tutorial for using the new CSC/Google Earth interface. Catalog source properties are presented in seven scientific categories, within two table views: the Master Source and Source Observations tables. Each X-ray source has one "master source" entry and one or more "source observation" entries, the details of which are documented on the CSC "Catalog Columns" pages. The master source properties represent the best estimates of the properties of a source; these are extensively described on the following pages of the website: Position and Position Errors, Source Flags, Source Extent and Errors, Source Fluxes, Source Significance, Spectral Properties, and Source Variability. The eight tutorials ("threads") available on the website serve as a collective guide for accessing, understanding, and manipulating the source properties and data products provided by the catalog.

  1. Expectation Levels in Dictionary Consultation and Compilation*

    African Journals Online (AJOL)

    Abstract: Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of ...

  2. Documenting the Earliest Chinese Journals

    Directory of Open Access Journals (Sweden)

    Jian-zhong (Joe Zhou

    2001-10-01

    Full Text Available

    Pages: 19-24

    According to various authoritative sources, the English word "journal" was first used in the 16th century, but the existence of the journal in its original meaning as a daily record can be traced back to the Acta Diurna (Daily Events) of ancient Roman cities as early as 59 B.C. This article documents the first appearance of Chinese daily records, which came much earlier than 59 B.C.

    The evidence of these earlier Chinese daily records came from some important archaeological discoveries in the 1970s, but they were also documented by Sima Qian (145 B.C. - 85 B.C.), the grand historian of the Han Dynasty imperial court. Sima's lifetime contribution was the publication of Shi Ji (史記) (The Grand Scribe's Records; hereafter the Records). The Records is a book of history of grand scope. It encompasses all Chinese history from the 30th century B.C. through the end of the second century B.C. in 130 chapters and over 525,000 Chinese characters.

  3. Methods for the Compilation of a Core List of Journals in Toxicology.

    Science.gov (United States)

    Kuch, T. D. C.

    Previously reported methods for the compilation of core lists of journals in multidisciplinary areas are first examined, with toxicology used as an example of such an area. Three approaches to the compilation of a core list of journals in toxicology were undertaken and the results analyzed with the aid of models. Analysis of the results of the…

  4. 21 CFR 20.64 - Records or information compiled for law enforcement purposes.

    Science.gov (United States)

    2010-04-01

    Records or information compiled for law enforcement purposes. Section 20.64, Title 21 (Food and Drugs), FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES, GENERAL PUBLIC INFORMATION, Exemptions. § 20.64 Records or information compiled for law enforcement purposes. (a) Records or...

  5. Mode automata and their compilation into fault trees

    International Nuclear Information System (INIS)

    Rauzy, Antoine

    2002-01-01

    In this article, we advocate the use of mode automata as a high-level representation language for reliability studies. Mode automata are states/transitions-based representations with the additional notion of flow. They can be seen as a generalization of both finite capacity Petri nets and block diagrams. They can be assembled into hierarchies by means of composition operations. The contribution of this article is twofold. First, we introduce mode automata and discuss their relationship with other formalisms. Second, we propose an algorithm to compile mode automata into Boolean equations (fault trees). Such a compilation is of interest for two reasons. First, assessment tools for Boolean models are much more efficient than those for states/transitions models. Second, the automated generation of fault trees from higher-level representations makes their maintenance easier through the life cycle of the systems under study
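
    To picture the compilation step, the following is a minimal sketch under assumed component semantics (an output is lost if the component has failed or an input is lost); it is an illustration, not the article's actual algorithm.

```python
# Minimal sketch of flattening a small system description into Boolean
# fault-tree equations. Component semantics are assumed for illustration.

def compile_component(name, inputs):
    operands = [f"fail_{name}"] + [f"loss_{inp}" for inp in inputs]
    return f"loss_{name} := " + " OR ".join(operands)

# Two redundant pumps feed one valve; the shared supply is lost only if
# BOTH pump outputs are lost, which compiles to an AND gate.
equations = [
    compile_component("pumpA", []),
    compile_component("pumpB", []),
    "loss_supply := loss_pumpA AND loss_pumpB",
    compile_component("valve", ["supply"]),
]
for eq in equations:
    print(eq)
```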

  6. Comprehensive trends assessment of nitrogen sources and loads to estuaries of the coterminous United States

    Science.gov (United States)

    Sources of nitrogen and phosphorus to estuaries and estuarine watersheds of the coterminous United States have been compiled from a variety of publicly available data sources (1985–2015). Atmospheric loading was obtained from two sources. Modelled and interpolated meas...

  7. Fusing a Transformation Language with an Open Compiler

    NARCIS (Netherlands)

    Kalleberg, K.T.; Visser, E.

    2007-01-01

    Program transformation systems provide powerful analysis and transformation frameworks as well as concise languages for language processing, but instantiating them for every subject language is an arduous task, most often resulting in half-completed front ends. Compilers provide mature front ends with

  8. Title list of documents made publicly available

    International Nuclear Information System (INIS)

    1990-04-01

    This document is a monthly publication containing descriptions of information received and generated by the US Nuclear Regulatory Commission (NRC). This information includes (1) docketed material associated with civilian nuclear power plants and other uses of radioactive materials, and (2) nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. The following indexes are included: Personal Author, Corporate Source, Report Number, and Cross Reference to Principal Documents

  9. Notes on Compiling a Corpus- Based Dictionary

    Directory of Open Access Journals (Sweden)

    František Čermák

    2011-10-01

    Full Text Available

    ABSTRACT: On the basis of sample analysis of a Czech adjective, a definition based on the data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing out the drawbacks of definitions found in traditional dictionaries. The steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics, briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. These are supplemented by additional remarks and caveats useful in the compilation of a dictionary. Thus, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified form.

    SUMMARY: Notes on compiling a corpus-based dictionary. On the basis of a sample analysis of a Czech adjective, a definition based on data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing out the shortcomings of definitions found in traditional dictionaries. The steps undertaken here are then generalized and used, in an ordered sequence (similar to a workflow ordering), as topics, briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. They are supplemented by additional remarks and caveats useful in the compilation of a dictionary. In this way, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified, form.

    Keywords: MONOLINGUAL DICTIONARIES, CORPUS LEXICOGRAPHY, SYNTAGMATICS AND PARADIGMATICS IN DICTIONARIES, DICTIONARY ENTRY, TYPES OF LEMMAS, PRAGMATICS, TREATMENT OF

  10. 31 CFR 501.724 - Documents that may be withheld.

    Science.gov (United States)

    2010-07-01

    Documents that may be withheld. Section 501.724, Title 31 (Money and Finance: Treasury), Regulations Relating to Money and Finance (Continued)... privileged; (2) The document would disclose the identity of a confidential source; or (3) The Administrative...

  11. Compilation of data from hadronic atoms

    International Nuclear Information System (INIS)

    Poth, H.

    1979-01-01

    This compilation is a survey of the existing data on hadronic atoms (pionic atoms, kaonic atoms, antiprotonic atoms, sigmonic atoms). It collects measurements of the energies, intensities and line widths of X-rays from hadronic atoms. Averaged values for each hadronic atom are given and the data are summarized. The listing contains data on 58 pionic atoms, 54 kaonic atoms, 23 antiprotonic atoms and 20 sigmonic atoms. (orig./HB) [de

  12. Strontium-90 fluoride data sheet

    Energy Technology Data Exchange (ETDEWEB)

    Fullam, H.T.

    1981-06-01

    This report is a compilation of available data and appropriate literature references on the properties of strontium-90 fluoride and nonradioactive strontium fluoride. The objective of the document is to compile in a single source pertinent data to assist potential users in the development, licensing, and use of ⁹⁰SrF₂-fueled radioisotope heat sources for terrestrial power conversion and thermal applications. The report is an update of the Strontium-90 Fluoride Data Sheet (BNWL-2284) originally issued in April 1977.

  13. Borrowing and Dictionary Compilation: The Case of the Indigenous ...

    African Journals Online (AJOL)

    rbr

    Keywords: BORROWING, DICTIONARY COMPILATION, INDIGENOUS LANGUAGES,. LEXICON, MORPHEME, VOCABULARY, DEVELOPING LANGUAGES, LOAN WORDS, TER-. MINOLOGY, ETYMOLOGY, LEXICOGRAPHY. Opsomming: Ontlening en woordeboeksamestelling: Die geval van in- heemse Suid-Afrikaanse ...

  14. Scientific Programming with High Performance Fortran: A Case Study Using the xHPF Compiler

    Directory of Open Access Journals (Sweden)

    Eric De Sturler

    1997-01-01

    Full Text Available Recently, the first commercial High Performance Fortran (HPF) subset compilers have appeared. This article reports on our experiences with the xHPF compiler of Applied Parallel Research, version 1.2, for the Intel Paragon. At this stage, we do not expect very high performance from our HPF programs, even though performance will eventually be of paramount importance for the acceptance of HPF. Instead, our primary objective is to study how to convert large Fortran 77 (F77) programs to HPF such that the compiler generates reasonably efficient parallel code. We report on a case study that identifies several problems when parallelizing code with HPF; most of these problems affect current HPF compiler technology in general, although some are specific to the xHPF compiler. We discuss our solutions from the perspective of the scientific programmer, and present timing results on the Intel Paragon. The case study comprises three programs of different complexity with respect to parallelization. We use the dense matrix-matrix product to show that the distribution of arrays and the order of nested loops significantly influence the performance of the parallel program. We use Gaussian elimination with partial pivoting to study the parallelization strategy of the compiler. There are various ways to structure this algorithm for a particular data distribution. This example shows how much effort may be demanded from the programmer to support the compiler in generating an efficient parallel implementation. Finally, we use a small application to show that the more complicated structure of a larger program may introduce problems for the parallelization, even though all subroutines of the application are easy to parallelize by themselves. The application consists of a finite volume discretization on a structured grid and a nested iterative solver. Our case study shows that it is possible to obtain reasonably efficient parallel programs with xHPF, although the compiler
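
    The influence of loop order is easy to reproduce outside HPF. The following small experiment, in plain Python and unrelated to the Paragon study itself, times two loop orders of a dense matrix-matrix product:

```python
# Illustrative timing experiment: swapping two loops of a dense
# matrix-matrix product changes the memory access pattern, and with it
# the running time.
import random
import time

n = 120
A = [[random.random() for _ in range(n)] for _ in range(n)]
B = [[random.random() for _ in range(n)] for _ in range(n)]

def matmul_ijk(A, B):
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(n):        # walks down a column of B: poor locality
                s += A[i][k] * B[k][j]
            C[i][j] = s
    return C

def matmul_ikj(A, B):
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for k in range(n):
            a_ik = A[i][k]
            for j in range(n):        # streams along rows of B and C
                C[i][j] += a_ik * B[k][j]
    return C

for f in (matmul_ijk, matmul_ikj):
    start = time.perf_counter()
    f(A, B)
    print(f.__name__, round(time.perf_counter() - start, 3), "s")
```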

  15. Approximate Compilation of Constraints into Multivalued Decision Diagrams

    DEFF Research Database (Denmark)

    Hadzic, Tarik; Hooker, John N.; O’Sullivan, Barry

    2008-01-01

    We present an incremental refinement algorithm for approximate compilation of constraint satisfaction models into multivalued decision diagrams (MDDs). The algorithm uses a vertex splitting operation that relies on the detection of equivalent paths in the MDD. Although the algorithm is quite general…
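
    Vertex splitting can be pictured on a toy example. In the sketch below, the MDD encoding and its semantics are assumptions made for illustration, not the paper's data structure: a node shared by two paths is duplicated so each copy can be constrained independently during refinement.

```python
# Toy MDD over two variables x0, x1 with domain {0, 1}; nodes map domain
# values to child ids, and terminal "T" means "possibly satisfiable".
mdd = {
    "root": {0: "n1", 1: "n1"},   # both x0-paths share node n1 (approximation)
    "n1":   {0: "T", 1: "T"},
}

def split_vertex(mdd, parent, value, node, fresh):
    """Give the edge (parent --value--> node) its own copy of `node`."""
    mdd[fresh] = dict(mdd[node])
    mdd[parent][value] = fresh
    return fresh

split_vertex(mdd, "root", 1, "n1", fresh="n1b")

# With the paths separated, the constraint x0 != x1 can now be enforced
# by pruning each copy independently, refining the approximation:
mdd["n1"][0] = None    # forbids x0 = 0, x1 = 0
mdd["n1b"][1] = None   # forbids x0 = 1, x1 = 1
print(mdd)
```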

  16. Utility-preserving privacy protection of textual healthcare documents.

    Science.gov (United States)

    Sánchez, David; Batet, Montserrat; Viejo, Alexandre

    2014-12-01

    The adoption of ITs by medical organisations makes possible the compilation of large amounts of healthcare data, which quite often need to be released to third parties for research or business purposes. Many of these data are of a sensitive nature, because they may include patient-related documents such as electronic healthcare records. In order to protect the privacy of individuals, several pieces of legislation on healthcare data management, which state the kind of information that should be protected, have been defined. Traditionally, to comply with current legislation, a manual redaction process is applied to patient-related documents in order to remove or black out sensitive terms. This process is costly and time-consuming and has the undesired side effect of severely reducing the utility of the released content. Automatic methods available in the literature usually propose ad hoc solutions that are limited to protecting specific types of structured information (e.g. e-mail addresses, social security numbers, etc.); as a result, they are hardly applicable to the sensitive entities named in current regulations that do not present those structural regularities (e.g. diseases, symptoms, treatments, etc.). To tackle these limitations, in this paper we propose an automatic sanitisation method for textual medical documents (e.g. electronic healthcare records) that is able to protect, regardless of their structure, sensitive entities (e.g. diseases) and also those semantically related terms (e.g. symptoms) that may disclose the former ones. Contrary to redaction schemes based on term removal, our approach improves the utility of the protected output by replacing sensitive terms with appropriate generalisations retrieved from several medical and general-purpose knowledge bases. Experiments conducted on highly sensitive documents and in coherency with current regulations on healthcare data privacy show promising results in terms of the practical privacy and utility of the
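
    The replacement step can be pictured as a lookup against a taxonomy. The toy sketch below uses an invented three-entry mini-ontology in place of the knowledge bases the authors draw on:

```python
# Toy sketch of generalisation-based sanitisation. The mini-ontology is
# invented for illustration.
import re

GENERALISATIONS = {
    "AIDS": "immune system disorder",   # sensitive entity
    "HIV": "virus",                     # related term that could disclose it
    "zidovudine": "antiviral agent",    # treatment that could disclose it
}

def sanitise(text, generalisations):
    for term, broader in generalisations.items():
        # Replace whole-word occurrences so the text stays readable.
        text = re.sub(rf"\b{re.escape(term)}\b", broader, text,
                      flags=re.IGNORECASE)
    return text

record = "Patient diagnosed with AIDS; HIV load high; started zidovudine."
print(sanitise(record, GENERALISATIONS))
```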

  17. Compiler-Enforced Cache Coherence Using a Functional Language

    Directory of Open Access Journals (Sweden)

    Rich Wolski

    1996-01-01

    Full Text Available The cost of hardware cache coherence, both in terms of execution delay and operational cost, is substantial for scalable systems. Fortunately, compiler-generated cache management can reduce program serialization due to cache contention; increase execution performance; and reduce the cost of parallel systems by eliminating the need for more expensive hardware support. In this article, we use the Sisal functional language system as a vehicle to implement and investigate automatic, compiler-based cache management. We describe our implementation of Sisal for the IBM Power/4. The Power/4, briefly available as a product, represents an early attempt to build a shared memory machine that relies strictly on the language system for cache coherence. We discuss the issues associated with deterministic execution and program correctness on a system without hardware coherence, and demonstrate how Sisal (as a functional language) is able to address those issues.

  18. A GIS-based time-dependent seismic source modeling of Northern Iran

    Science.gov (United States)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2017-01-01

    The first step in any seismic hazard study is the definition of seismogenic sources and the estimation of magnitude-frequency relationships for each source. There is as yet no standard methodology for source modeling, and many researchers have worked on this topic. This study is an effort to define linear and area seismic sources for Northern Iran. The linear or fault sources are developed based on tectonic features and characteristic earthquakes, while the area sources are developed based on the spatial distribution of small to moderate earthquakes. Time-dependent recurrence relationships are developed for fault sources using a renewal approach, while time-independent frequency-magnitude relationships are proposed for area sources based on a Poisson process. GIS functionalities are used in this study to introduce and incorporate spatial-temporal and geostatistical indices in delineating area seismic sources. The proposed methodology is used to model seismic sources for an area of about 500 by 400 kilometers around Tehran. Previous research and reports were studied to compile an earthquake/fault catalog that is as complete as possible. All events are transformed to a uniform magnitude scale; duplicate events and dependent shocks are removed. Completeness and time distribution of the compiled catalog are taken into account. The proposed area and linear seismic sources in conjunction with the defined recurrence relationships can be used to develop a time-dependent probabilistic seismic hazard analysis of Northern Iran.
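
    For the area sources, a time-independent model of this kind pairs a Gutenberg-Richter magnitude-frequency law with Poisson occurrence; the sketch below uses invented coefficients purely for illustration:

```python
# Sketch of a time-independent recurrence model for an area source:
# Gutenberg-Richter rates plus Poisson occurrence. Coefficients are
# hypothetical.
import math

a_gr, b_gr = 4.0, 1.0   # hypothetical Gutenberg-Richter coefficients

def annual_rate(m):
    """Annual rate of events with magnitude >= m: log10 N(m) = a - b*m."""
    return 10.0 ** (a_gr - b_gr * m)

def poisson_exceedance(m, t_years):
    """P(at least one event of magnitude >= m within t years)."""
    return 1.0 - math.exp(-annual_rate(m) * t_years)

print(round(poisson_exceedance(6.0, 50), 3))   # 50-year probability, M >= 6
```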

  19. Title list of documents made publicly available. Documents from October-December 1978 for Dockets 50-334 through STN 50-597

    International Nuclear Information System (INIS)

    1978-01-01

    This special edition of the Title List contains Docket 50 material from 1978 that has not appeared in previous issues of the Title List. The documents in this supplement are indexed by personal author, corporate source, and report number. The listings of this material on the domestic licensing of production and utilization facilities, divided into the categories used for filing and searching in the NRC Public Document Room, are presented

  20. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This report presents a compilation of both fracture properties and hydrogeological parameters relevant to the flow of groundwater in fractured rock systems. Methods of data acquisition as well as the scale of and conditions during the measurement are recorded. Measurements and analytical techniques for each of the parameters under consideration have been reviewed with respect to their methodology, assumptions and accuracy. Both the rock type and geologic setting associated with these measurements have also been recorded. 373 refs

  1. Mercury in fish and macroinvertebrates from New York's streams and rivers: A compendium of data sources

    Science.gov (United States)

    Riva-Murray, Karen; Burns, Douglas A.

    2016-01-01

    The U.S. Geological Survey has compiled a list of existing data sets, from selected sources, containing mercury (Hg) concentration data in fish and macroinvertebrate samples that were collected from flowing waters of New York State from 1970 through 2014. Data sets selected for inclusion in this report were limited to those that contain fish and (or) macroinvertebrate data that were collected across broad areas, cover relatively long time periods, and (or) were collected as part of a broader-scale (e.g. national) study or program. In addition, all data sets listed were collected, processed, and analyzed with documented methods, and contain critical sample information (e.g. fish species, fish size, Hg species) that is needed to analyze and interpret the reported Hg concentration data. Fourteen data sets, all from state or federal agencies, are listed in this report, along with selected descriptive information regarding each data source and data set contents. Together, these 14 data sets contain Hg and related data for more than 7,000 biological samples collected from more than 700 unique stream and river locations between 1970 and 2014.

  2. SimVascular 2.0: an Integrated Open Source Pipeline for Image-Based Cardiovascular Modeling and Simulation

    Science.gov (United States)

    Lan, Hongzhi; Merkow, Jameson; Updegrove, Adam; Schiavazzi, Daniele; Wilson, Nathan; Shadden, Shawn; Marsden, Alison

    2015-11-01

    SimVascular (www.simvascular.org) is currently the only fully open source software package that provides a complete pipeline from medical image-based modeling to patient-specific blood flow simulation and analysis. It was initially released in 2007 and has contributed to numerous advances in fundamental hemodynamics research, surgical planning, and medical device design. However, early versions had several major barriers preventing wider adoption by new users, large-scale application in clinical and research studies, and educational access. In recent years, SimVascular 2.0 has made significant progress by integrating open source alternatives to the expensive commercial libraries previously required for anatomic modeling, mesh generation and the linear solver. In addition, it simplified the cross-platform compilation process, improved the graphical user interface and launched a comprehensive documentation website. Many enhancements and new features have been incorporated for the whole pipeline, such as 3-D segmentation, Boolean operations for discrete triangulated surfaces, and multi-scale coupling for closed loop boundary conditions. In this presentation we will briefly overview the modeling/simulation pipeline and the advances of the new SimVascular 2.0.

  3. QMODULE: CAMAC modules recognized by the QAL compiler

    International Nuclear Information System (INIS)

    Kellogg, M.; Minor, M.M.; Shlaer, S.; Spencer, N.; Thomas, R.F. Jr.; van der Beken, H.

    1977-10-01

    The compiler for the Q Analyzer Language, QAL, recognizes a certain set of CAMAC modules as having known characteristics. The conventions and procedures used to describe these modules are discussed as well as the tools available to the user for extending this set as required

  4. PHOTOGRAPHY AS DOCUMENT: OTLET AND BRIET’S CONSIDERATIONS

    Directory of Open Access Journals (Sweden)

    Izângela Maria Sansoni Tonello

    2018-04-01

    Full Text Available Introduction: The amount and variety of information conveyed in different media and formats raise concerns, especially in relation to photographic documents, since they are currently a focus of interest in the Information Science field. In this context, this paper emphasizes the role of photographs as sources of information capable of generating knowledge, as well as an important aid for research in different areas. Objective: The main goal of this study was to research the concepts and definitions underpinning the photograph as a document in information units. Methodology: Bibliographic and documentary research. Results: From the meanings of the term document discussed in the literature by the authors studied, it can be affirmed that the photograph satisfies the assumptions needed to join document and photograph into a photographic document. Conclusions: It is understood that this study clarifies some issues related to the photograph as a document; however, this proposition raises reflections on the importance of the production context as well as its essential relationship with other documents, so that it can be indisputably consolidated as a photographic document.

  5. Clustering document fragments using background color and texture information

    Science.gov (United States)

    Chanda, Sukalpa; Franke, Katrin; Pal, Umapada

    2012-01-01

    Forensic analysis of questioned documents can be highly data-intensive. A forensic expert may need to analyze a heap of document fragments, and in such cases, to ensure reliability, he or she should focus only on the relevant evidence hidden in those fragments. Retrieving relevant documents requires finding similar document fragments. One way to obtain such similar fragments is to use a fragment's physical characteristics, such as color and texture. In this article we propose an automatic scheme to retrieve similar document fragments based on the visual appearance of the document paper and its texture. Multispectral color characteristics are captured using biologically inspired color differentiation techniques, by projecting document color characteristics into the Lab color space. Gabor filter-based texture analysis is used to identify document texture. The expectation is that document fragments from the same source will have similar color and texture. For clustering similar document fragments in our test dataset we use a Self-Organizing Map (SOM) of dimension 5×5, with the document color and texture information as features. We obtained an encouraging accuracy of 97.17% on 1063 test images.
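
    The clustering step can be pictured with a tiny self-organising map, as in the following sketch (random vectors stand in for the colour and texture features; only the 5×5 grid size and the fragment count are taken from the article):

```python
# Compact numpy sketch of SOM clustering of fragment feature vectors.
import numpy as np

rng = np.random.default_rng(0)
features = rng.random((1063, 8))            # one feature vector per fragment

grid = (5, 5)
weights = rng.random(grid + (features.shape[1],))   # 5x5 map of prototypes

def best_matching_unit(weights, x):
    d = np.linalg.norm(weights - x, axis=2)         # distance to every cell
    return np.unravel_index(np.argmin(d), d.shape)

for t, x in enumerate(features):            # one pass of on-line SOM training
    lr = 0.5 * (1.0 - t / len(features))    # decaying learning rate
    bi, bj = best_matching_unit(weights, x)
    for i in range(grid[0]):
        for j in range(grid[1]):            # pull the BMU's neighbourhood
            h = np.exp(-((i - bi) ** 2 + (j - bj) ** 2) / 2.0)
            weights[i, j] += lr * h * (x - weights[i, j])

# Fragments mapped to the same cell are treated as visually similar paper.
clusters = [best_matching_unit(weights, x) for x in features]
```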

  6. Compilation of selected deep-sea biological data for the US subseabed disposal project

    International Nuclear Information System (INIS)

    Gomez, L.S.; Marietta, M.G.; Jackson, D.W.

    1987-03-01

    The US Subseabed Disposal Project (SDP) has compiled an extensive deep-sea biological data base to be used in calculating biological parameters of state and rate included in mathematical models of oceanographic transport of radionuclides. The data base is organized around a model deep-sea ecosystem which includes the following components: zooplankton, fish and other nekton, invertebrate benthic megafauna, benthic macrofauna, benthic meiofauna, heterotrophic microbiota, as well as suspended and sediment particulate organic carbon. Measurements of abundance and activity rates (e.g., respiration, production, sedimentation, etc.) reported in the international oceanographic literature are summarized in 23 tables. Included in these tables are the latitudinal position of the studies, as well as information describing sampling techniques and any special notes needed to better assess the data presented. This report has been prepared primarily as a resource document to be used in calculating parameter values for various modeling applications, and for preparing historical data reviews for other SDP reports. Depending on the intended use, these data will require further reduction and unit conversion

  7. Civilian Radioactive Waste Management System Requirements Document

    International Nuclear Information System (INIS)

    1992-12-01

    This document specifies the top-level requirements for the Civilian Radioactive Waste Management System (CRWMS). The document is referred to herein as the CRD, for CRWMS Requirements document. The OCRWM System Engineering Management Plan (SEMP) establishes the technical document hierarchy (hierarchy of technical requirements and configuration baseline documents) for the CRWMS program. The CRD is the top-level document in this hierarchy. The immediate subordinate documents are the System Requirements Documents (SRDS) for the four elements of the CRWMS and the Interface Specification (IFS). The four elements of the CRWMS are the Waste Acceptance System, the Transportation System, the Monitored Retrievable Storage (MRS) System and the Mined Geologic Disposal System (MGDS). The Interface Specification describes the six inter-element interfaces between the four elements. This hierarchy establishes the requirements to be addressed by the design of the system elements. Many of the technical requirements for the CRWMS are documented in a variety of Federal regulations, DOE directives and other Government documentation. It is the purpose of the CRD to establish the technical requirements for the entire program. In doing so, the CRD summarizes source documentation for requirements that must be addressed by the program, specifies particular requirements, and documents derived requirements that are not covered in regulatory and other Government documentation, but are necessary to accomplish the mission of the CRWMS. The CRD defines the CRWMS by identifying the top-level functions the elements must perform (These top-level functions were derived using functional analysis initially documented in the Physical System Requirements (PSR) documents). The CRD also defines the top-level physical architecture of the system and allocates the functions and requirements to the architectural elements of the system

  8. REVEAL: Software Documentation and Platform Migration

    Science.gov (United States)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This report specifically describes the actions taken over a ten-week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validates the efforts.

  9. Technical Document on Control of Nitrogen Oxides From Municipal Waste Combustors

    Science.gov (United States)

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  10. Every document and picture tells a story: using internal corporate document reviews, semiotics, and content analysis to assess tobacco advertising.

    Science.gov (United States)

    Anderson, S J; Dewhirst, T; Ling, P M

    2006-06-01

    In this article we present communication theory as a conceptual framework for conducting documents research on tobacco advertising strategies, and we discuss two methods for analysing advertisements: semiotics and content analysis. We provide concrete examples of how we have used tobacco industry documents archives and tobacco advertisement collections iteratively in our research to yield a synergistic analysis of these two complementary data sources. Tobacco promotion researchers should consider adopting these theoretical and methodological approaches.

  11. Improving the Product Documentation Process of a Small Software Company

    Science.gov (United States)

    Valtanen, Anu; Ahonen, Jarmo J.; Savolainen, Paula

    Documentation is an important part of the software process, even though it is often neglected in software companies. The eternal question is how much documentation is enough. In this article, we present a practical implementation of a lightweight product documentation process resulting from SPI efforts in a small company. Small companies' financial and human resources are often limited. The documentation process described here offers a template for creating adequate documentation while consuming a minimal amount of resources. The key element of the documentation process is an open source web-based bug-tracking system that was customized to be used as a documentation tool. The use of the tool enables iterative and well-structured documentation. The solution best serves the needs of a small company with off-the-shelf software products and striving for SPI.

  12. HAL/S-360 compiler system specification

    Science.gov (United States)

    Johnson, A. E.; Newbold, P. N.; Schulenberg, C. W.; Avakian, A. E.; Varga, S.; Helmers, P. H.; Helmers, C. T., Jr.; Hotz, R. L.

    1974-01-01

    A three-phase language compiler is described which produces IBM 360/370-compatible object modules and a set of simulation tables to aid in run-time verification. A link edit step augments the standard OS linkage editor. A comprehensive run-time system and library provide the HAL/S operating environment, error handling, a pseudo-real-time executive, and an extensive set of mathematical, conversion, I/O, and diagnostic routines. The specifications of the information flow and content for this system are also considered.

  13. T.J. Kriel (original compiler), D.J. Prinsloo and B.P. Sathekge (compilers revised edition). Popular Northern Sotho Dictionary

    OpenAIRE

    Kwena J. Mashamaite

    2011-01-01

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the case with this dictionary.

  14. Compiler-Directed Transformation for Higher-Order Stencils

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-20

    As the cost of data movement increasingly dominates performance, developers of finite-volume and finite-difference solutions for partial differential equations (PDEs) are exploring novel higher-order stencils that increase numerical accuracy and computational intensity. This paper describes a new compiler reordering transformation applied to stencil operators that performs partial sums in buffers, and reuses the partial sums in computing multiple results. This optimization has multiple effects on improving stencil performance that are particularly important to higher-order stencils: it exploits data reuse, reduces floating-point operations, and exposes efficient SIMD parallelism to backend compilers. We study the benefit of this optimization in the context of Geometric Multigrid (GMG), a widely used method to solve PDEs, using four different Jacobi smoothers built from 7-, 13-, 27-, and 125-point stencils. We quantify performance, speedup, and numerical accuracy, and use the Roofline model to qualify our results. Ultimately, we obtain over 4× speedup on the smoothers themselves and up to a 3× speedup on the multigrid solver. Finally, we demonstrate that high-order multigrid solvers have the potential of reducing total data movement and energy by several orders of magnitude.
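
    The heart of the transformation can be shown on a one-dimensional analogue; the sketch below is plain NumPy rather than compiler-generated code:

```python
# 1-D analogue of the partial-sums transformation: buffering running
# sums lets neighbouring stencil outputs share work instead of
# re-summing their inputs.
import numpy as np

x = np.arange(16, dtype=float)

def stencil_naive(x):
    # 5-point average: every output re-reads and re-adds five inputs.
    return np.array([x[i - 2:i + 3].sum() / 5.0
                     for i in range(2, len(x) - 2)])

def stencil_partial_sums(x):
    # Buffer prefix sums once; each output then costs two loads and one
    # subtraction, cutting flops and vectorising trivially.
    p = np.concatenate(([0.0], np.cumsum(x)))
    return (p[5:] - p[:-5]) / 5.0

assert np.allclose(stencil_naive(x), stencil_partial_sums(x))
```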

  15. Getting more out of biomedical documents with GATE's full lifecycle open source text analytics.

    Science.gov (United States)

    Cunningham, Hamish; Tablan, Valentin; Roberts, Angus; Bontcheva, Kalina

    2013-01-01

    This software article describes the GATE family of open source text analysis tools and processes. GATE is one of the most widely used systems of its type with yearly download rates of tens of thousands and many active users in both academic and industrial contexts. In this paper we report three examples of GATE-based systems operating in the life sciences and in medicine. First, in genome-wide association studies which have contributed to discovery of a head and neck cancer mutation association. Second, medical records analysis which has significantly increased the statistical power of treatment/outcome models in the UK's largest psychiatric patient cohort. Third, richer constructs in drug-related searching. We also explore the ways in which the GATE family supports the various stages of the lifecycle present in our examples. We conclude that the deployment of text mining for document abstraction or rich search and navigation is best thought of as a process, and that with the right computational tools and data collection strategies this process can be made defined and repeatable. The GATE research programme is now 20 years old and has grown from its roots as a specialist development tool for text processing to become a rather comprehensive ecosystem, bringing together software developers, language engineers and research staff from diverse fields. GATE now has a strong claim to cover a uniquely wide range of the lifecycle of text analysis systems. It forms a focal point for the integration and reuse of advances that have been made by many people (the majority outside of the authors' own group) who work in text processing for biomedicine and other areas. GATE is available online under GNU open source licences and runs on all major operating systems. Support is available from an active user and developer community and also on a commercial basis.

  16. Getting more out of biomedical documents with GATE's full lifecycle open source text analytics.

    Directory of Open Access Journals (Sweden)

    Hamish Cunningham

    Full Text Available This software article describes the GATE family of open source text analysis tools and processes. GATE is one of the most widely used systems of its type with yearly download rates of tens of thousands and many active users in both academic and industrial contexts. In this paper we report three examples of GATE-based systems operating in the life sciences and in medicine. First, in genome-wide association studies which have contributed to discovery of a head and neck cancer mutation association. Second, medical records analysis which has significantly increased the statistical power of treatment/outcome models in the UK's largest psychiatric patient cohort. Third, richer constructs in drug-related searching. We also explore the ways in which the GATE family supports the various stages of the lifecycle present in our examples. We conclude that the deployment of text mining for document abstraction or rich search and navigation is best thought of as a process, and that with the right computational tools and data collection strategies this process can be made defined and repeatable. The GATE research programme is now 20 years old and has grown from its roots as a specialist development tool for text processing to become a rather comprehensive ecosystem, bringing together software developers, language engineers and research staff from diverse fields. GATE now has a strong claim to cover a uniquely wide range of the lifecycle of text analysis systems. It forms a focal point for the integration and reuse of advances that have been made by many people (the majority outside of the authors' own group who work in text processing for biomedicine and other areas. GATE is available online under GNU open source licences and runs on all major operating systems. Support is available from an active user and developer community and also on a commercial basis.

  17. New Tools to Document and Manage Data/Metadata: Example NGEE Arctic and UrbIS

    Science.gov (United States)

    Crow, M. C.; Devarakonda, R.; Hook, L.; Killeffer, T.; Krassovski, M.; Boden, T.; King, A. W.; Wullschleger, S. D.

    2016-12-01

    Tools used for documenting, archiving, cataloging, and searching data are critical pieces of informatics. This discussion describes tools being used in two different projects at Oak Ridge National Laboratory (ORNL), but at different stages of the data lifecycle. The Metadata Entry and Data Search Tool is being used for the documentation, archival, and data discovery stages for the Next Generation Ecosystem Experiment - Arctic (NGEE Arctic) project while the Urban Information Systems (UrbIS) Data Catalog is being used to support indexing, cataloging, and searching. The NGEE Arctic Online Metadata Entry Tool [1] provides a method by which researchers can upload their data and provide original metadata with each upload. The tool is built upon a Java SPRING framework to parse user input into, and from, XML output. Many aspects of the tool require use of a relational database including encrypted user-login, auto-fill functionality for predefined sites and plots, and file reference storage and sorting. The UrbIS Data Catalog is a data discovery tool supported by the Mercury cataloging framework [2] which aims to compile urban environmental data from around the world into one location, and be searchable via a user-friendly interface. Each data record conveniently displays its title, source, and date range, and features: (1) a button for a quick view of the metadata, (2) a direct link to the data and, for some data sets, (3) a button for visualizing the data. The search box incorporates autocomplete capabilities for search terms and sorted keyword filters are available on the side of the page, including a map for searching by area. References: [1] Devarakonda, Ranjeet, et al. "Use of a metadata documentation and search tool for large data volumes: The NGEE arctic example." Big Data (Big Data), 2015 IEEE International Conference on. IEEE, 2015. [2] Devarakonda, R., Palanisamy, G., Wilson, B. E., & Green, J. M. (2010). Mercury: reusable metadata management, data discovery

  18. Who has used internal company documents for biomedical and public health research and where did they find them?

    Science.gov (United States)

    Wieland, L Susan; Rutkow, Lainie; Vedula, S Swaroop; Kaufmann, Christopher N; Rosman, Lori M; Twose, Claire; Mahendraratnam, Nirosha; Dickersin, Kay

    2014-01-01

    To describe the sources of internal company documents used in public health and healthcare research. We searched PubMed and Embase for articles using internal company documents to address a research question about a health-related topic. Our primary interest was where authors obtained internal company documents for their research. We also extracted information on type of company, type of research question, type of internal documents, and funding source. Our searches identified 9,305 citations of which 357 were eligible. Scanning of reference lists and consultation with colleagues identified 4 additional articles, resulting in 361 included articles. Most articles examined internal tobacco company documents (325/361; 90%). Articles using documents from pharmaceutical companies (20/361; 6%) were the next most common. Tobacco articles used documents from repositories; pharmaceutical documents were from a range of sources. Most included articles relied upon internal company documents obtained through litigation (350/361; 97%). The research questions posed were primarily about company strategies to promote or position the company and its products (326/361; 90%). Most articles (346/361; 96%) used information from miscellaneous documents such as memos or letters, or from unspecified types of documents. When explicit information about study funding was provided (290/361 articles), the most common source was the US-based National Cancer Institute. We developed an alternative and more sensitive search targeted at identifying additional research articles using internal pharmaceutical company documents, but the search retrieved an impractical number of citations for review. Internal company documents provide an excellent source of information on health topics (e.g., corporate behavior, study data) exemplified by articles based on tobacco industry documents. Pharmaceutical and other industry documents appear to have been less used for research, indicating a need for funding for

  19. Expert Programmer versus Parallelizing Compiler: A Comparative Study of Two Approaches for Distributed Shared Memory

    Directory of Open Access Journals (Sweden)

    M. F. P. O'Boyle

    1996-01-01

    Full Text Available This article critically examines current parallel programming practice and optimizing compiler development. The general strategies employed by compiler and programmer to optimize a Fortran program are described, and then illustrated for a specific case by applying them to a well-known scientific program, TRED2, using the KSR-1 as the target architecture. Extensive measurement is applied to the resulting versions of the program, which are compared with a version produced by a commercial optimizing compiler, KAP. The compiler strategy significantly outperforms KAP and does not fall far short of the performance achieved by the programmer. Following the experimental section each approach is critiqued by the other. Perceived flaws, advantages, and common ground are outlined, with an eye to improving both schemes.

  20. Foreign electronic information sources about environment in the Internet

    International Nuclear Information System (INIS)

    Svrsek, L.

    2005-01-01

    This presentation deals with external electronic information sources (e-sources), i.e. databases that are created not by users or their institutions but by data producers, who publish them in various forms and offer them to users through various channels. The first part of the contribution describes e-sources in general terms. The second part describes in more detail some of the most significant online databases about the environment available on the Internet.

  1. Briefs for Parents in Ready-To-Copy Form: English and Spanish. 1993 Compilation.

    Science.gov (United States)

    Howley, Craig; Cahape, Pat

    This document contains English and Spanish versions of six one-page reports for parents. Each brief provides background, suggestions, and sources of further information on educational and child-rearing topics of common interest to parents. Titles are: "The Best and Worst of Times: Support Groups Help" ("Los tiempos mejores y peores: Los grupos…

  2. Semantic Document Library: A Virtual Research Environment for Documents, Data and Workflows Sharing

    Science.gov (United States)

    Kotwani, K.; Liu, Y.; Myers, J.; Futrelle, J.

    2008-12-01

    The Semantic Document Library (SDL) was driven by use cases from the environmental observatory communities and is designed to provide conventional document repository features of uploading, downloading, editing and versioning of documents as well as value-adding features of tagging, querying, sharing, annotating, ranking, provenance, social networking and geo-spatial mapping services. It allows users to organize a catalogue of watershed observation data, model output, workflows, as well as publications and documents related to the same watershed study through the tagging capability. Users can tag all relevant materials using the same watershed name and easily find all of them later using this tag. The underpinning semantic content repository can store materials from other cyberenvironments such as workflow or simulation tools, and SDL provides an effective interface to query and organize materials from various sources. Advanced features of the SDL allow users to visualize the provenance of the materials, such as the source and how the output data were derived. Other novel features include visualizing all geo-referenced materials on a geospatial map. SDL, as a component of a cyberenvironment portal (the NCSA Cybercollaboratory), has the goal of efficient management of information and relationships between published artifacts (validated models, vetted data, workflows, annotations, best practices, reviews and papers) produced from raw research artifacts (data, notes, plans etc.) through agents (people, sensors etc.). The tremendous scientific potential of artifacts is achieved through mechanisms of sharing, reuse and collaboration - empowering scientists to spread their knowledge and protocols and to benefit from the knowledge of others. SDL successfully implements web 2.0 technologies and design patterns along with a semantic content management approach that enables use of multiple ontologies and dynamic evolution (e.g. folksonomies) of terminology. Scientific documents involved with
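
    The tagging mechanism can be reduced to a small sketch; the API below is hypothetical, not SDL's actual interface:

```python
# Toy sketch of tag-based cataloguing: heterogeneous materials share one
# catalogue, and a common tag such as a watershed name later retrieves
# them all together.
from collections import defaultdict

class Catalog:
    def __init__(self):
        self._by_tag = defaultdict(set)

    def add(self, item, *tags):
        for tag in tags:
            self._by_tag[tag].add(item)

    def query(self, tag):
        return sorted(self._by_tag[tag])

catalog = Catalog()
catalog.add("sensor_readings.csv", "ClearCreek", "observation")
catalog.add("model_output.nc", "ClearCreek", "simulation")
catalog.add("methods_paper.pdf", "ClearCreek", "publication")
print(catalog.query("ClearCreek"))   # everything tied to the watershed study
```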

  3. Hanford spent nuclear fuel project recommended path forward, volume III: Alternatives and path forward evaluation supporting documentation

    International Nuclear Information System (INIS)

    Fulton, J.C.

    1994-10-01

    Volume I of the Hanford Spent Nuclear Fuel Project - Recommended Path Forward constitutes an aggressive series of projects to construct and operate systems and facilities to safely retrieve, package, transport, process, and store K Basins fuel and sludge. Volume II provided a comparative evaluation of four Alternatives for the Path Forward and an evaluation for the Recommended Path Forward. Although Volume II contained extensive appendices, six supporting documents have been compiled in Volume III to provide additional background for Volume II

  4. Airborne release fractions/rates and respirable fractions for nonreactor nuclear facilities. Volume 2, Appendices

    International Nuclear Information System (INIS)

    1994-12-01

    This document contains compiled data from the DOE Handbook on Airborne Release Fractions/Rates and Respirable Fractions for Nonreactor Nuclear facilities. Source data and example facilities utilized, such as the Plutonium Recovery Facility, are included

  5. T.J. Kriel (original compiler), D.J. Prinsloo and B.P. Sathekge (compilers revised edition). Popular Northern Sotho Dictionary

    Directory of Open Access Journals (Sweden)

    Kwena J. Mashamaite

    2011-10-01

    Full Text Available The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the case with this dictionary.

  6. Phase I Contaminant Transport Parameters for the Groundwater Flow and Contaminant Transport Model of Corrective Action Unit 97: Yucca Flat/Climax Mine, Nevada Test Site, Nye County, Nevada, Revision 0

    International Nuclear Information System (INIS)

    John McCord

    2007-01-01

    This report documents transport data and data analyses for Yucca Flat/Climax Mine CAU 97. The purpose of the data compilation and related analyses is to provide the primary reference to support parameterization of the Yucca Flat/Climax Mine CAU transport model. Specific task objectives were as follows: (1) Identify and compile currently available transport parameter data and supporting information that may be relevant to the Yucca Flat/Climax Mine CAU. (2) Assess the level of quality of the data and associated documentation. (3) Analyze the data to derive expected values and estimates of the associated uncertainty and variability. The scope of this document includes the compilation and assessment of data and information relevant to transport parameters for the Yucca Flat/Climax Mine CAU subsurface within the context of unclassified source-term contamination. Data types of interest include mineralogy, aqueous chemistry, matrix and effective porosity, dispersivity, matrix diffusion, matrix and fracture sorption, and colloid-facilitated transport parameters

  7. Phase I Contaminant Transport Parameters for the Groundwater Flow and Contaminant Transport Model of Corrective Action Unit 97: Yucca Flat/Climax Mine, Nevada Test Site, Nye County, Nevada, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    John McCord

    2007-09-01

    This report documents transport data and data analyses for Yucca Flat/Climax Mine CAU 97. The purpose of the data compilation and related analyses is to provide the primary reference to support parameterization of the Yucca Flat/Climax Mine CAU transport model. Specific task objectives were as follows: • Identify and compile currently available transport parameter data and supporting information that may be relevant to the Yucca Flat/Climax Mine CAU. • Assess the level of quality of the data and associated documentation. • Analyze the data to derive expected values and estimates of the associated uncertainty and variability. The scope of this document includes the compilation and assessment of data and information relevant to transport parameters for the Yucca Flat/Climax Mine CAU subsurface within the context of unclassified source-term contamination. Data types of interest include mineralogy, aqueous chemistry, matrix and effective porosity, dispersivity, matrix diffusion, matrix and fracture sorption, and colloid-facilitated transport parameters.

  8. Design Choices in a Compiler Course or How to Make Undergraduates Love Formal Notation

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    2008-01-01

    The undergraduate compiler course offers a unique opportunity to combine many aspects of the Computer Science curriculum. We discuss the many design choices that are available for the instructor and present the current compiler course at the University of Aarhus, the design of which displays at l...

  9. A Coarse-Grained Reconfigurable Architecture with Compilation for High Performance

    Directory of Open Access Journals (Sweden)

    Lu Wan

    2012-01-01

    Full Text Available We propose a fast data relay (FDR) mechanism to enhance existing CGRA (coarse-grained reconfigurable architecture). FDR can not only provide multicycle data transmission concurrently with computations but also convert resource-demanding inter-processing-element global data accesses into local data accesses to avoid communication congestion. We also propose the supporting compiler techniques that can efficiently utilize the FDR feature to achieve higher performance for a variety of applications. Our results on FDR-based CGRA are compared with two other works in this field: ADRES and RCP. Experimental results for various multimedia applications show that FDR combined with the new compiler delivers up to 29% and 21% higher performance than ADRES and RCP, respectively.

  10. Construction experiences from underground works at Forsmark. Compilation Report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders [Vattenfall Power Consultant AB, Stockholm (Sweden)]; Christiansson, Rolf [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)]

    2007-02-15

    The main objective with this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily on rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  11. Construction experiences from underground works at Forsmark. Compilation Report

    International Nuclear Information System (INIS)

    Carlsson, Anders; Christiansson, Rolf

    2007-02-01

    The main objective with this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily on rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible

  12. Specification and Compilation of Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Geuns, S.J.

    2015-01-01

    This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. Examples of such applications are software-defined radio applications.

  13. Semantic Similarity between Web Documents Using Ontology

    Science.gov (United States)

    Chahal, Poonam; Singh Tomer, Manjeet; Kumar, Suresh

    2018-06-01

    The World Wide Web is a source of information available in the form of interlinked web pages. However, extracting significant information with the assistance of a search engine is critical, because web information is written mainly in natural language and is addressed to human readers. Several efforts have been made at semantic similarity computation between documents using words, concepts and concept relationships, but the results still fall short of user requirements. This paper proposes a novel technique for the computation of semantic similarity between documents that takes into account not only the concepts available in the documents but also the relationships between those concepts. In our approach, documents are processed by building an ontology for each document from a base ontology and a dictionary containing concept records. Each such record is made up of the probable words that represent a given concept. Finally, the document ontologies are compared to find their semantic similarity, taking the relationships among concepts into account. Relevant concepts and relations between the concepts are identified by capturing author and user intention. The proposed semantic analysis technique provides improved results as compared to the existing techniques.
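
    The core idea, comparing documents on both shared concepts and shared concept relationships, can be sketched as follows. This is only an illustration of the general approach, not the authors' algorithm; the equal weighting of the two overlaps is an assumption:

        def jaccard(a, b):
            """Set overlap in [0, 1]; two empty sets count as identical."""
            return len(a & b) / len(a | b) if (a | b) else 1.0

        def doc_similarity(concepts_a, relations_a, concepts_b, relations_b, w=0.5):
            """Blend concept overlap with relationship overlap.

            Relations are (concept, relation, concept) triples; the 50/50
            weighting is an illustrative assumption, not the paper's choice.
            """
            return (w * jaccard(concepts_a, concepts_b)
                    + (1 - w) * jaccard(relations_a, relations_b))

        # Two documents sharing concepts but relating them differently.
        c1 = {"virus", "vaccine", "immunity"}
        r1 = {("vaccine", "induces", "immunity")}
        c2 = {"virus", "vaccine", "infection"}
        r2 = {("virus", "causes", "infection")}
        print(doc_similarity(c1, r1, c2, r2))   # 0.25 = 0.5*0.5 + 0.5*0.0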

  14. Semantic Similarity between Web Documents Using Ontology

    Science.gov (United States)

    Chahal, Poonam; Singh Tomer, Manjeet; Kumar, Suresh

    2018-03-01

    The World Wide Web is a source of information available in the form of interlinked web pages. However, extracting significant information with the assistance of a search engine is critical, because web information is written mainly in natural language and is addressed to human readers. Several efforts have been made at semantic similarity computation between documents using words, concepts and concept relationships, but the results still fall short of user requirements. This paper proposes a novel technique for the computation of semantic similarity between documents that takes into account not only the concepts available in the documents but also the relationships between those concepts. In our approach, documents are processed by building an ontology for each document from a base ontology and a dictionary containing concept records. Each such record is made up of the probable words that represent a given concept. Finally, the document ontologies are compared to find their semantic similarity, taking the relationships among concepts into account. Relevant concepts and relations between the concepts are identified by capturing author and user intention. The proposed semantic analysis technique provides improved results as compared to the existing techniques.

  15. Compilation of a preliminary checklist for the differential diagnosis of neurogenic stuttering

    Directory of Open Access Journals (Sweden)

    Mariska Lundie

    2014-06-01

    Objectives: The aim of this study was to describe and highlight the characteristics of neurogenic stuttering (NS) in order to compile a preliminary checklist for accurate diagnosis and intervention. Method: An explorative, applied mixed-method, multiple case study research design was followed. Purposive sampling was used to select four participants. A comprehensive assessment battery was compiled for data collection. Results: The results revealed a distinct pattern of core stuttering behaviours in NS, although discrepancies existed regarding stuttering severity and frequency. It was also found that developmental stuttering (DS) and NS can co-occur. The case history and the core stuttering pattern are important considerations during differential diagnosis, as these are the only consistent characteristics in people with NS. Conclusion: It is unlikely that all the symptoms of NS are present in an individual. The researchers scrutinised the findings of this study and the findings of previous literature to compile a potentially workable checklist.

  16. Survey of Forensic Document Examination Habit Areas: Degree of Use and Discriminatory Power

    Energy Technology Data Exchange (ETDEWEB)

    G Sperry; PA Manzolillo; RC Hanlan; RJ Muehlberger

    1999-09-07

    Beginning in 1998, the Pacific Northwest National Laboratory (PNL), US Postal Inspection Service Forensic Laboratory (USPIS), and the Data Fusion Laboratory, Drexel University (DFL) have been collaborating on a large-scale research project, ''Handwriting Individuality--Moving From Art to Science''. In April 1998 a survey was distributed to the community of forensic document examiners (FDEs) requesting input on the habit areas used and their utility in distinguishing handwriting. The information obtained from this survey was intended to provide the data necessary to select the criteria and begin the evaluation of the handwriting samples currently in the project. Preliminary results of the survey were made available to the community at the American Society of Questioned Document Examiners (ASQDE) meeting in August 1998 and the American Academy of Forensic Sciences (AAFS) meeting in February 1999. This report provides final documentation of the survey and its results. The survey has two objectives: (1) to compile a list of handwriting features and characteristics used by professional forensic document examiners in the examination and comparison of handwriting and (2) to gather information about the significance of these features and characteristics. These objectives are met by having the FDEs indicate how frequently they evaluate each habit area and how useful each habit area is for discriminating between writers.

  17. Not mere lexicographic cosmetics: the compilation and structural ...

    African Journals Online (AJOL)

    This article offers a brief overview of the compilation of the Ndebele music terms dictionary, Isichazamazwi SezoMculo (henceforth the ISM), paying particular attention to its structural features. It emphasises that the reference needs of the users as well as their reference skills should be given a determining role in all ...

  18. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1997, July--September

    International Nuclear Information System (INIS)

    Stevenson, L.L.

    1998-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. This report contains the third quarter 1997 abstracts

  19. Title List of documents made publicly available

    International Nuclear Information System (INIS)

    1982-05-01

    This document contains descriptions of the information received and generated by the US NRC. This information includes: (1) docketed material associated with civilian nuclear power plants and other uses of radioactive materials and (2) nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. As used here, docketed does not refer to court dockets; it refers to the system by which NRC maintains its regulatory records. This series of documents is indexed by a Personal Author Index, a Corporate Source Index, and a Report Number Index

  20. Data compilation for particle impact desorption

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeuchi, Fujio.

    1984-05-01

    The desorption of gases from solid surfaces by incident electrons, ions and photons is one of the important processes of hydrogen recycling in the controlled thermonuclear reactors. We have surveyed the literature concerning the particle impact desorption published through 1983 and compiled the data on the desorption cross sections and desorption yields with the aid of a computer. This report presents the results obtained for electron stimulated desorption, the desorption cross sections and yields being given in graphs and tables as functions of incident electron energy, surface temperature and gas exposure. (author)

  1. Compilation of actinide neutron nuclear data

    International Nuclear Information System (INIS)

    1979-01-01

    The Swedish nuclear data committee has compiled a selected set of neutron cross section data for the 16 most important actinide isotopes. The aim of the report is to present available data in a comprehensible way to allow a comparison between different evaluated libraries and to judge the reliability of these libraries from the experimental data. The data are given in graphical form below about 1 eV and above about 10 keV, while the 2200 m/s cross sections and resonance integrals are given in numerical form. (G.B.)

  2. Completed Research in Health, Physical Education, Recreation & Dance; Including International Sources. Volume 27. 1985 Edition.

    Science.gov (United States)

    Freedson, Patty S., Ed.

    This compilation lists research completed in the areas of health, physical education, recreation, dance, and allied areas during 1984. The document is arranged in two parts. In the index, references are arranged under the subject headings in alphabetical order. Abstracts of master's and doctor's theses from institutions offering graduate programs…

  3. Recent Efforts in Data Compilations for Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Dillmann, Iris

    2008-01-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on 'Nuclear Physics Data Compilation for Nucleosynthesis Modeling' held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The 'JINA Reaclib Database' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel University (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A ''high priority list'' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. 'Workflow tools' aim to make the evaluation process transparent and allow users to follow the progress

  4. Recent Efforts in Data Compilations for Nuclear Astrophysics

    Science.gov (United States)

    Dillmann, Iris

    2008-05-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on ``Nuclear Physics Data Compilation for Nucleosynthesis Modeling'' held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The ``JINA Reaclib Database'' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A ``high priority list'' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. ``Workflow tools'' aim to make the evaluation process transparent and allow users to follow the progress.

  5. Design parameters and source terms: Volume 3, Source terms

    International Nuclear Information System (INIS)

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled ''Design Parameters and Source Terms for a Two-Phase Repository in Salt,'' 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible Salt Repository sites. 11 refs., 9 tabs

  6. Milestones of mathematical model for business process management related to cost estimate documentation in petroleum industry

    Science.gov (United States)

    Khamidullin, R. I.

    2018-05-01

    The paper is devoted to milestones of an optimal mathematical model for a business process related to cost estimate documentation compiled during the construction and reconstruction of oil and gas facilities. It describes the study and analysis of fundamental issues in the petroleum industry that are caused by economic instability and the deterioration of business strategy. Business process management is presented as business process modeling aimed at the improvement of the studied business process, covering the main optimization criteria and recommendations for the improvement of the above-mentioned business model.

  7. Document clustering methods, document cluster label disambiguation methods, document clustering apparatuses, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio [Richland, WA]; Calapristi, Augustin J [West Richland, WA]; Crow, Vernon L [Richland, WA]; Hetzler, Elizabeth G [Kennewick, WA]; Turner, Alan E [Kennewick, WA]

    2009-12-22

    Document clustering methods, document cluster label disambiguation methods, document clustering apparatuses, and articles of manufacture are described. In one aspect, a document clustering method includes providing a document set comprising a plurality of documents, providing a cluster comprising a subset of the documents of the document set, using a plurality of terms of the documents, providing a cluster label indicative of subject matter content of the documents of the cluster, wherein the cluster label comprises a plurality of word senses, and selecting one of the word senses of the cluster label.
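
    The label-disambiguation step can be illustrated with a minimal sketch: choose, for an ambiguous cluster label, the word sense whose gloss terms best overlap the cluster's own terms. The sense inventory below is invented for illustration, and the method is a generic gloss-overlap heuristic, not the patented procedure:

        # Hypothetical sense inventory: word -> {sense name: gloss terms}.
        SENSES = {
            "bank": {
                "finance": {"money", "loan", "deposit", "interest"},
                "river":   {"water", "shore", "erosion", "flood"},
            }
        }

        def disambiguate_label(label, cluster_terms):
            """Choose the sense of an ambiguous cluster label by gloss overlap."""
            senses = SENSES.get(label, {})
            if not senses:
                return None
            return max(senses, key=lambda s: len(senses[s] & cluster_terms))

        # A cluster of hydrology documents that happens to be labelled "bank".
        terms = {"water", "flood", "levee", "sediment"}
        print(disambiguate_label("bank", terms))   # -> "river"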

  8. Design parameters and source terms: Volume 2, Source terms: Revision 0

    International Nuclear Information System (INIS)

    1987-09-01

    The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report to the level of the Site Characterization Plan - Conceptual Design Report (SCP-CDR). The previous study identifies the data needs for the Environmental Assessment effort for seven possible salt repository sites. Volume 2 contains tables of source terms

  9. Who has used internal company documents for biomedical and public health research and where did they find them?

    Directory of Open Access Journals (Sweden)

    L Susan Wieland

    Full Text Available OBJECTIVE: To describe the sources of internal company documents used in public health and healthcare research. METHODS: We searched PubMed and Embase for articles using internal company documents to address a research question about a health-related topic. Our primary interest was where authors obtained internal company documents for their research. We also extracted information on type of company, type of research question, type of internal documents, and funding source. RESULTS: Our searches identified 9,305 citations of which 357 were eligible. Scanning of reference lists and consultation with colleagues identified 4 additional articles, resulting in 361 included articles. Most articles examined internal tobacco company documents (325/361; 90%). Articles using documents from pharmaceutical companies (20/361; 6%) were the next most common. Tobacco articles used documents from repositories; pharmaceutical documents were from a range of sources. Most included articles relied upon internal company documents obtained through litigation (350/361; 97%). The research questions posed were primarily about company strategies to promote or position the company and its products (326/361; 90%). Most articles (346/361; 96%) used information from miscellaneous documents such as memos or letters, or from unspecified types of documents. When explicit information about study funding was provided (290/361 articles), the most common source was the US-based National Cancer Institute. We developed an alternative and more sensitive search targeted at identifying additional research articles using internal pharmaceutical company documents, but the search retrieved an impractical number of citations for review. CONCLUSIONS: Internal company documents provide an excellent source of information on health topics (e.g., corporate behavior, study data), exemplified by articles based on tobacco industry documents.

  10. ON EXPERIENCE OF THE ELECTRONIC DOCUMENT MANAGEMENT SYSTEM IMPLEMENTATION IN THE MEDICAL UNIVERSITY

    OpenAIRE

    A. V. Semenets; V. Yu. Kovalok

    2015-01-01

    The importance of applying electronic document management to Ukrainian healthcare is shown. An overview of the electronic document management systems market is presented. An example of the use of an open-source electronic document management system at the Ternopil State Medical University by I. Ya. Horbachevsky is shown. The possibilities of implementing an electronic document management system within cloud services are shown. The electronic document management features of the Mi...

  11. GPC Single Source Letter

    Science.gov (United States)

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  12. A Forth interpreter and compiler's study for computer aided design

    International Nuclear Information System (INIS)

    Djebbar, F. Zohra Widad

    1986-01-01

    The wide field of utilization of FORTH led us to develop an interpreter. It has been implemented on an MC 68000 microprocessor-based computer, with ASTERIX, a UNIX-like operating system (a real-time system written by C.E.A.). This work has been done in two different versions: - The first one, fully written in C language, ensures good portability on a wide variety of microprocessors. But the performance estimations show excessive execution times, which led to a new optimized version. - This new version is characterized by the compilation of the most frequently used words of the FORTH basis. This allows us to obtain an interpreter with good performance and an execution speed close to that of code produced by the C compiler. (author) [fr
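
    The basic machinery of such an interpreter, a data stack and a dictionary of words, together with the compilation of new words from existing ones, can be sketched in a few lines (a toy illustration in Python, not the C.E.A. implementation):

        # Toy FORTH-like interpreter: a data stack and a dictionary of words.
        stack = []
        words = {
            "+":   lambda: stack.append(stack.pop() + stack.pop()),
            "*":   lambda: stack.append(stack.pop() * stack.pop()),
            "dup": lambda: stack.append(stack[-1]),
            ".":   lambda: print(stack.pop()),
        }

        def interpret(source):
            tokens = source.split()
            i = 0
            while i < len(tokens):
                tok = tokens[i]
                if tok == ":":                      # define a new word: : name body ;
                    end = tokens.index(";", i)
                    name, body = tokens[i + 1], " ".join(tokens[i + 2:end])
                    words[name] = lambda b=body: interpret(b)
                    i = end
                elif tok in words:
                    words[tok]()
                else:
                    stack.append(int(tok))          # literals are pushed
                i += 1

        interpret(": square dup * ; 7 square .")    # prints 49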

  13. Regulatory and technical reports (abstract index journal). Annual compilation for 1984. Volume 9, No. 4

    International Nuclear Information System (INIS)

    1985-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  14. Compilation of the nuclear codes available in CTA

    International Nuclear Information System (INIS)

    D'Oliveira, A.B.; Moura Neto, C. de; Amorim, E.S. do; Ferreira, W.J.

    1979-07-01

    The present work is a compilation of some nuclear codes available in the Divisao de Estudos Avancados of the Instituto de Atividades Espaciais (EAV/IAE/CTA). The codes are organized according to the classification given by the Argonne National Laboratory. For each code the following are given: author, institution of origin, abstract, programming language and existing bibliography. (Author) [pt

  15. Environmental Restoration Remedial Action quality assurance requirements document

    International Nuclear Information System (INIS)

    1991-01-01

    This document defines the quality assurance requirements for the US Department of Energy-Richland Operations Office Environmental Restoration Remedial Action program at the Hanford Site. The Environmental Restoration Remedial Action program implements significant commitments made by the US Department of Energy in the Hanford Federal Facility Agreement and Consent Order entered into with the Washington State Department of Ecology and the US Environmental Protection Agency. This document combines quality assurance requirements from various source documents into one set of requirements for use by the US Department of Energy-Richland Operations Office and other Environmental Restoration Remedial Action program participants. This document will serve as the basis for developing Quality Assurance Program Plans and implementing procedures by the participants. The requirements of this document will be applied to activities affecting quality, using a graded approach based on the importance of the item, service, or activity to the program objectives. The Quality Assurance Program that will be established using this document as the basis, together with other program and technical documents, form an integrated management control system for conducting the Environmental Restoration Remedial Action program activities in a manner that provides safety and protects the environment and public health

  16. Annotated bibliography National Environmental Policy Act (NEPA) documents for Sandia National Laboratories

    International Nuclear Information System (INIS)

    Harris, J.M.

    1995-04-01

    The following annotated bibliography lists documents prepared by the Department of Energy (DOE), and predecessor agencies, to meet the requirements of the National Environmental Policy Act (NEPA) for activities and facilities at Sandia National Laboratories sites. For each NEPA document, summary information and a brief discussion of content are provided. This information may be used to reduce the amount of time or cost associated with NEPA compliance for future Sandia National Laboratories projects. This summary may be used to identify model documents, documents to use as sources of information, or documents from which to tier additional NEPA documents

  17. Annotated bibliography National Environmental Policy Act (NEPA) documents for Sandia National Laboratories

    Energy Technology Data Exchange (ETDEWEB)

    Harris, J.M.

    1995-04-01

    The following annotated bibliography lists documents prepared by the Department of Energy (DOE), and predecessor agencies, to meet the requirements of the National Environmental Policy Act (NEPA) for activities and facilities at Sandia National Laboratories sites. For each NEPA document, summary information and a brief discussion of content are provided. This information may be used to reduce the amount of time or cost associated with NEPA compliance for future Sandia National Laboratories projects. This summary may be used to identify model documents, documents to use as sources of information, or documents from which to tier additional NEPA documents.

  18. Regulatory and technical reports: (Abstract index journal). Compilation for first quarter 1997, January--March

    International Nuclear Information System (INIS)

    Sheehan, M.A.

    1997-06-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation is published quarterly and cumulated annually. Reports consist of staff-originated reports, NRC-sponsored conference reports, NRC contractor-prepared reports, and international agreement reports

  19. 1993 Annual PCB Document for Los Alamos National Laboratory EPA Region VI, January 1, 1993 through December 31, 1993

    International Nuclear Information System (INIS)

    Wechsler, R.J.; Sandoval, T.M.; Bryant, D.E.; Hupke, L.; Esquibel, L.

    1995-01-01

    This document, the ''1993 Annual PCB Document for Los Alamos National Laboratory'', was prepared to fulfill the requirements of the federal PCB (Polychlorinated Biphenyl) regulation: 40 CFR 761 Subpart J General Records and Reports. The PCB Management Program at Los Alamos National Laboratory (LANL), Environmental Protection Group, compiled this 1993 Annual PCB Document. The overall format generally follows the sequence of the applicable regulations. Subsection 1.2 cross-references those regulatory requirements with the applicable Document Section. The scope of this document also includes status summaries of various aspects of LANL's PCB Management Program. The intent of this approach to the Annual Document is to provide an overview of LANL's PCB Management Program and to increase the usefulness of this document as a management tool. Section 2.0, ''Status of the PCB Management Program'', discusses the use, generation of waste, and storage of PCBs at LANL. Section 3.0 is the 1993 Annual Document Log required by 761.180(a). This Section also discusses the PCB Management Program's policies for reporting under those regulatory requirements. Sections 4.0 and 5.0 contain the 1993 Annual Records for off-site and on-site disposal as required by 761.180(b). There is a tab for each manifest and its associated continuation sheets, receipt letters, and certificates of disposal

  20. A Journey from Interpreters to Compilers and Virtual Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We review a simple sequence of steps to stage a programming-language interpreter into a compiler and virtual machine. We illustrate the applicability of this derivation with a number of existing virtual machines, mostly for functional languages. We then outline its relevance for today's language...
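
    The first step of such a derivation can be illustrated on a tiny expression language: staging the interpreter so that the syntax tree is dissected once and evaluation becomes a residual closure, i.e. "compiled code". This sketch shows the general idea only, not the paper's derivation:

        # Expressions: ("lit", n), ("add", e1, e2), ("var",) for a single parameter.

        def interpret(e, x):
            """Direct interpreter: walks the syntax tree on every call."""
            tag = e[0]
            if tag == "lit": return e[1]
            if tag == "add": return interpret(e[1], x) + interpret(e[2], x)
            if tag == "var": return x

        def compile_(e):
            """Staged version: dissect syntax once, return a closure as residual code."""
            tag = e[0]
            if tag == "lit":
                n = e[1]
                return lambda x: n
            if tag == "add":
                f, g = compile_(e[1]), compile_(e[2])
                return lambda x: f(x) + g(x)
            if tag == "var":
                return lambda x: x

        expr = ("add", ("var",), ("lit", 1))    # x + 1
        code = compile_(expr)                   # syntax traversed only once
        print(interpret(expr, 41), code(41))    # 42 42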

  1. Updated site compilation of the Latin American Pollen Database

    NARCIS (Netherlands)

    Flantua, S.G.A.; Hooghiemstra, H.; Grimm, E.C.; Behling, H.; Bush, M.B; González-Arrango, C.; Gosling, W.D.; Ledru, M.-P.; Lozano-Garciá, S.; Maldonado, A.; Prieto, A.R.; Rull, V.; van Boxel, J.H.

    2015-01-01

    The updated inventory of the Latin American Pollen Database (LAPD) offers a wide range of new insights. This paper presents a systematic compilation of palynological research in Latin America. A comprehensive inventory of publications in peer-reviewed and grey literature shows a major expansion of

  2. Use of cartography in historical seismicity analysis: a reliable tool to better apprehend the contextualization of the historical documents

    Science.gov (United States)

    Thibault, Fradet; Grégory, Quenet; Kevin, Manchuel

    2014-05-01

    Historical studies, including historical seismicity analysis, deal with historical documents. Numerous factors, such as culture, social conditions, demography, political situations and opinions, or religious ones, influence the way events are transcribed in the archives. As a consequence, it is crucial to contextualize and compare the historical documents reporting on a given event in order to reduce the uncertainties affecting their analysis and interpretation. When studying historical seismic events it is often tricky to get a global view of all the information provided by the historical documents. It is also difficult to extract cross-correlated information from the documents and draw a precise historical context. Use of cartographic and geographic tools in GIS software is the best approach for the synthesis, interpretation and contextualization of the historical material. The main goal is to produce the most complete dataset of available information, in order to take into account all the components of the historical context and consequently improve the macroseismic analysis. The Entre-Deux-Mers earthquake (1759, Iepc= VII-VIII) [SISFRANCE 2013 - EDF-IRSN-BRGM] is well documented but has never benefited from a cross-analysis of historical documents and historical context elements. The map of available intensity data from SISFRANCE highlights a gap in macroseismic information within the estimated epicentral area. The aim of this study is to understand the origin of this gap by making a cartographic compilation of both archive information and historical context elements. The results support the hypothesis that the lack of documents and macroseismic data in the epicentral area is related to low human activity rather than low seismic effects in this zone. Topographic features, geographical position, flood hazard, the locations of roads and pathways, the distribution of vineyards and the forest coverage, mentioned in the archives and reported on Cassini's map, confirm this hypothesis

  3. Compilation of accident statistics in PSE

    International Nuclear Information System (INIS)

    Jobst, C.

    1983-04-01

    The objective of the investigations on transportation carried out within the framework of the 'Project - Studies on Safety in Waste Management (PSE II)' is the determination of the risk of accidents in the transportation of radioactive materials by rail. Fault tree analysis is used for the determination of risks in the transportation system. This method makes it possible to determine the frequency and consequences of accidents which could lead to an unintended release of radionuclides. The study presented compiles all data obtained from the accident statistics of the Federal German Railways. (orig./RB) [de

  4. abc: An extensible AspectJ compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    ... checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its frontend is built, using the Polyglot framework, as a modular extension of the Java language. The use of Polyglot gives flexibility of syntax and type checking. The backend is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general overview...

  5. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    account for the multilingual concept literacy glossaries being compiled under the auspices of .... a theory, i.e. the set of premises, arguments and conclusions required for explaining ... fully address cognitive and communicative needs, especially of laypersons. ..... tion at UCT, and in indigenous languages as auxiliary media.

  6. Individual risk. A compilation of recent British data

    International Nuclear Information System (INIS)

    Grist, D.R.

    1978-08-01

    A compilation of data is presented on individual risk obtained from recent British population and mortality statistics. Risk data presented include: risk of death, as a function of age, due to several important natural causes and due to accidents and violence; risk of death as a function of location of accident; and risk of death from various accidental causes. (author)

  7. Shear-wave velocity compilation for Northridge strong-motion recording sites

    Science.gov (United States)

    Borcherdt, Roger D.; Fumal, Thomas E.

    2002-01-01

    Borehole and other geotechnical information collected at the strong-motion recording sites of the Northridge earthquake of January 17, 1994 provide an important new basis for the characterization of local site conditions. These geotechnical data, when combined with analysis of strong-motion recordings, provide an empirical basis to evaluate site coefficients used in current versions of US building codes. Shear-wave-velocity estimates to a depth of 30 meters are derived for 176 strong-motion recording sites. The estimates are based on borehole shear-velocity logs, physical property logs, correlations with physical properties and digital geologic maps. Surface-wave velocity measurements and standard penetration data are compiled as additional constraints. These data as compiled from a variety of databases are presented via GIS maps and corresponding tables to facilitate use by other investigators.
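
    The 30-meter estimates referred to here are conventionally computed as a travel-time average over the layers in the upper 30 meters, Vs30 = 30 / sum(d_i / v_i). A minimal sketch (the borehole log values are invented for illustration):

        def vs30(layers):
            """Travel-time-averaged shear-wave velocity over the top 30 m.

            layers: (thickness_m, velocity_m_per_s) pairs from the surface down;
            assumes the log reaches at least 30 m, and clips the last layer there.
            """
            depth, travel_time = 0.0, 0.0
            for thickness, velocity in layers:
                used = min(thickness, 30.0 - depth)   # clip at the 30 m horizon
                travel_time += used / velocity
                depth += used
                if depth >= 30.0:
                    break
            return 30.0 / travel_time

        # Hypothetical borehole log: 5 m of fill over stiff soil over soft rock.
        print(round(vs30([(5, 180), (15, 320), (20, 600)])))   # 329 m/s (approx.)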

  8. Renewable Energy Monitoring Protocol. Update 2010. Methodology for the calculation and recording of the amounts of energy produced from renewable sources in the Netherlands

    Energy Technology Data Exchange (ETDEWEB)

    Te Buck, S.; Van Keulen, B.; Bosselaar, L.; Gerlagh, T.; Skelton, T.

    2010-07-15

    This is the fifth, updated edition of the Dutch Renewable Energy Monitoring Protocol. The protocol, compiled on behalf of the Ministry of Economic Affairs, can be considered a policy document that provides a uniform calculation method for determining the amount of energy produced in the Netherlands in a renewable manner. Because all governments and organisations use the calculation methods described in this protocol, developments in this field can be monitored well and consistently. The introduction of this protocol outlines the history and describes its set-up, validity and relationship with other similar documents and agreements. The Dutch Renewable Energy Monitoring Protocol is compiled by NL Agency, and all relevant parties were given the chance to provide input, which has been incorporated as far as possible. Statistics Netherlands (CBS) uses this protocol to calculate the amount of renewable energy produced in the Netherlands. These data are then used by the Ministry of Economic Affairs to gauge the realisation of policy objectives. In June 2009 the European Directive for energy from renewable sources was published with renewable energy targets for the Netherlands. This directive uses a different calculation method - the gross energy end-use method - whilst the Dutch definition is based on the so-called substitution method. NL Agency was asked to add the calculation according to the gross end-use method, although this method is not clearly defined on a number of points. In describing the method, the unanswered questions become clear, as do, for example, the points the Netherlands should bring up in international discussions.
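
    The two calculation methods contrasted above can be sketched side by side. The formulas below reflect the generally published definitions, i.e. the substitution method credits renewable output with the fossil primary energy it displaces at a reference plant efficiency, while the gross end-use method divides renewable final energy by gross final consumption; they are illustrative, not the protocol's exact rules:

        def share_substitution(renewable_elec_gj, fossil_efficiency, primary_use_gj):
            """Substitution method (sketch): renewable output is credited with
            the primary fossil energy it avoids at a reference efficiency."""
            avoided_primary = renewable_elec_gj / fossil_efficiency
            return avoided_primary / primary_use_gj

        def share_gross_end_use(renewable_final_gj, gross_final_gj):
            """Gross end-use method (sketch): renewable final energy over gross
            final energy consumption, as in the 2009 EU directive."""
            return renewable_final_gj / gross_final_gj

        # Invented figures: 36 GJ of renewable electricity, 40% reference efficiency.
        print(share_substitution(36, 0.40, 3000))   # 0.03
        print(share_gross_end_use(36, 2400))        # 0.015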

  9. A Source Book for Taxation: Myths and Realities.

    Science.gov (United States)

    Hellman, Mary A.

    This sourcebook is one of two supplementary materials for a newspaper course about taxes and tax reform. Program ideas and sources of related resources compiled in the sourcebook are designed to help civic and group leaders and educators plan educational community programs based on the course topics. Section one describes ways in which the program…

  10. Digital compilation bedrock geologic map of the Milton quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-8A Dorsey, R, Doolan, B, Agnew, PC, Carter, CM, Rosencrantz, EJ, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the Milton...

  11. Guidance and Control Software Project Data - Volume 2: Development Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the development documents from the GCS project. Volume 2 contains three appendices: A. Guidance and Control Software Development Specification; B. Design Description for the Pluto Implementation of the Guidance and Control Software; and C. Source Code for the Pluto Implementation of the Guidance and Control Software

  12. The RHNumtS compilation: Features and bioinformatics approaches to locate and quantify Human NumtS

    Directory of Open Access Journals (Sweden)

    Saccone Cecilia

    2008-06-01

    Full Text Available Background: To a greater or lesser extent, eukaryotic nuclear genomes contain fragments of their mitochondrial genome counterpart, deriving from the random insertion of damaged mtDNA fragments. NumtS (Nuclear mt Sequences) are not equally abundant in all species, and are redundant and polymorphic in terms of copy number. In population and clinical genetics, it is important to have a complete overview of NumtS quantity and location. Searching PubMed for NumtS or mitochondrial pseudo-genes yields hundreds of papers reporting Human NumtS compilations produced by in silico or wet-lab approaches. A comparison of published compilations clearly shows significant discrepancies among data, due both to unwise application of bioinformatics methods and to a not yet correctly assembled nuclear genome. To optimize quantification and location of NumtS, we produced a consensus compilation of Human NumtS by applying various bioinformatics approaches. Results: Location and quantification of NumtS may be achieved by applying database similarity searching methods: we have applied various methods such as Blastn, MegaBlast and BLAT, changing both parameters and databases; the results were compared, further analysed and checked against the already published compilations, thus producing the Reference Human Numt Sequences (RHNumtS) compilation. The resulting NumtS total 190. Conclusion: The RHNumtS compilation represents a highly reliable reference basis, which may allow designing a lab protocol to test the actual existence of each NumtS. Here we report preliminary results based on PCR amplification and sequencing of 41 NumtS selected from RHNumtS among those with lower scores. In parallel, we are currently designing the RHNumtS database structure for implementation in the HmtDB resource. In the future, the same database will host NumtS compilations from other organisms.
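
    One concrete step in deriving a consensus compilation from several similarity searches is merging the overlapping hit intervals reported by the different tools into single candidate regions. A minimal sketch of such a merge (illustrative only, not the authors' pipeline):

        def merge_hits(hits):
            """Merge overlapping (chrom, start, end) hit intervals.

            Hits from different searches (e.g. Blastn, MegaBlast, BLAT runs)
            that overlap on the same chromosome collapse into one candidate NumtS.
            """
            merged = []
            for chrom, start, end in sorted(hits):
                if merged and merged[-1][0] == chrom and start <= merged[-1][2]:
                    merged[-1][2] = max(merged[-1][2], end)   # extend current interval
                else:
                    merged.append([chrom, start, end])
            return [tuple(m) for m in merged]

        hits = [("chr1", 100, 450), ("chr1", 400, 900), ("chr5", 10, 80)]
        print(merge_hits(hits))   # [('chr1', 100, 900), ('chr5', 10, 80)]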

  13. Design parameters and source terms: Volume 2, Source terms: Revision 0

    International Nuclear Information System (INIS)

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled ''Design Parameters and Source Terms for a Two-Phase Repository in Salt,'' 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible Salt Repository sites. 2 tabs

  14. Who Has Used Internal Company Documents for Biomedical and Public Health Research and Where Did They Find Them?

    OpenAIRE

    Wieland, L. Susan; Rutkow, Lainie; Vedula, S. Swaroop; Kaufmann, Christopher N.; Rosman, Lori M.; Twose, Claire; Mahendraratnam, Nirosha; Dickersin, Kay

    2014-01-01

    OBJECTIVE: To describe the sources of internal company documents used in public health and healthcare research. METHODS: We searched PubMed and Embase for articles using internal company documents to address a research question about a health-related topic. Our primary interest was where authors obtained internal company documents for their research. We also extracted information on type of company, type of research question, type of internal documents, and funding source. RESULTS: Our search...

  15. Establishing Trustworthiness When Students Read Multiple Documents Containing Conflicting Scientific Evidence

    Science.gov (United States)

    Bråten, Ivar; Braasch, Jason L. G.; Strømsø, Helge I.; Ferguson, Leila E.

    2015-01-01

    Students read six documents that varied in terms of their perspectives on a scientific issue and the trustworthiness of the source features. After reading, students wrote essays, rank-ordered the documents according to perceived trustworthiness, and provided reasons for their rank-order decisions. Students put the most trust in a textbook and a…

  16. Digital compilation bedrock geologic map of the Lincoln quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-5A Stanley, R, DelloRusso, V, Haydock, S, Lapp, E, O'Loughlin, S, Prewitt, J,and Tauvers, PR, 1995, Digital compilation bedrock geologic map...

  17. Fuel Summary for Peach Bottom Unit 1 High-Temperature Gas-Cooled Reactor Cores 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    Karel I. Kingrey

    2003-04-01

    This fuel summary report contains background and summary information for the Peach Bottom Unit 1, High-Temperature, Gas-Cooled Reactor Cores 1 and 2. This report contains detailed information about the fuel in the two cores, the Peach Bottom Unit 1 operating history, nuclear parameters, physical and chemical characteristics, and shipping and storage canister related data. The data in this document have been compiled from a large number of sources and are not qualified beyond the qualification of the source documents. This report is intended to provide an overview of the existing data pertaining to spent fuel management and point to pertinent reference source documents. For design applications, the original source documentation must be used. While all referenced sources are available as records or controlled documents at the Idaho National Engineering and Environmental Laboratory (INEEL), some of the sources were marked as informal or draft reports. This is noted where applicable. In some instances, source documents are not consistent. Where they are known, this document identifies those instances and provides clarification where possible. However, as stated above, this document has not been independently qualified and such clarifications are only included for information purposes. Some of the information in this summary is available in multiple source documents. An effort has been made to clearly identify at least one record document as the source for the information included in this report.

  18. Ada Integrated Environment III Computer Program Development Specification. Volume III. Ada Optimizing Compiler.

    Science.gov (United States)

    1981-12-01

    Symbol Map: library-file.library-unit{.subunit}.SYMAP; Statement Map: library-file.library-unit{.subunit}.SMAP; Type Map: library-file.library-unit{.subunit}.TMAP. Code generator outputs: SYMAP (Symbol Map), SMAP (Updated Statement Map), TMAP (Type Map). A.3.5 The PUNIT Command ... NAME Tmap (Core.Typemap) END. (Example A-3: Compiler Command Stream for the Code Generator; Texas Instruments, Ada Optimizing Compiler)

  19. Title list of documents made publicly available, July 1-31, 1979

    International Nuclear Information System (INIS)

    1979-09-01

    This document is a monthly publication containing descriptions of information received and generated by the US NRC. This information includes (1) docketed material associated with civilian nuclear power plants and other uses of radioactive materials and (2) nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. The docketed information includes information formerly issued through US DOE's Technical Information Center under the title Power Reactor Docket Information (PRDI). This document replaces PRDI, which will no longer be prepared. It is indexed by a Personal Author Index, a Corporate Source Index and a US NRC Organizational Source Index

  20. A quantum CISC compiler and scalable assembler for quantum computing on large systems

    Energy Technology Data Exchange (ETDEWEB)

    Schulte-Herbrueggen, Thomas; Spoerl, Andreas; Glaser, Steffen [Dept. Chemistry, Technical University of Munich (TUM), 85747 Garching (Germany)]

    2008-07-01

    Using the cutting-edge high-speed parallel cluster HLRB-II (with a total LINPACK performance of 63.3 TFlops/s) we present a quantum CISC compiler into time-optimised or decoherence-protected complex instruction sets. They comprise effective multi-qubit interactions with up to 10 qubits. We show how to assemble these medium-sized CISC modules in a scalable way for quantum computation on large systems. Extending the toolbox of universal gates by optimised complex multi-qubit instruction sets paves the way to fight decoherence in realistic Markovian and non-Markovian settings. The advantage of quantum CISC compilation over standard RISC compilation into one- and two-qubit universal gates is demonstrated inter alia for the quantum Fourier transform (QFT) and for multiply-controlled NOT gates. The speed-up is up to a factor of six, thus giving significantly better performance under decoherence. Implications for upper limits to time complexities are also derived.

  1. Interpreting XML documents via an RDF schema

    NARCIS (Netherlands)

    Klein, Michel; Handschuh, Siegfried; Staab, Steffen

    2003-01-01

    One of the major problems in the realization of the vision of the ``Semantic Web'' is the transformation of existing web data into sources that can be processed and used by machines. This paper presents a procedure that can be used to turn XML documents into knowledge structures, by interpreting them via an RDF schema.
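
    The kind of transformation at issue, reading an XML document and emitting subject-predicate-object triples, can be sketched with a naive element-to-triple mapping. Real mappings are driven by an RDF schema; the rule below (one triple per child element) is an assumption for illustration:

        import xml.etree.ElementTree as ET

        def xml_to_triples(xml_text):
            """Naive interpretation: each child element of an item yields a triple
            (item id, child tag, child text). Schema-driven mappings refine this."""
            root = ET.fromstring(xml_text)
            triples = []
            for item in root:
                subject = item.get("id", item.tag)
                for child in item:
                    triples.append((subject, child.tag, child.text))
            return triples

        doc = """<catalog>
          <book id="b1"><title>Semantics</title><author>Doe</author></book>
        </catalog>"""
        for t in xml_to_triples(doc):
            print(t)   # ('b1', 'title', 'Semantics') and ('b1', 'author', 'Doe')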

  2. Regulatory and technical reports (abstract index journal): Annual compilation for 1994. Volume 19, Number 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order. These precede the following indexes: secondary report number index, personal author index, subject index, NRC originating organization index (staff reports), NRC originating organization index (international agreements), NRC contract sponsor index (contractor reports), contractor index, international organization index, and licensed facility index. A detailed explanation of the entries precedes each index.

  3. Digital compilation bedrock geologic map of the Warren quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-4A Walsh, GJ, Haydock, S, Prewitt, J, Kraus, J, Lapp, E, O'Loughlin, S, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the...

  4. Fiscal 1998 research report on super compiler technology; 1998 nendo super konpaira technology no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    For next-generation supercomputing systems, research was conducted on parallel and distributed compiler technology for raising effective performance, and on the software and architectures that raise performance in coordination with such compilers. For parallel compiler technology, scalable automatic parallelising compilers, parallel tuning tools, and an operating system that uses multi-processor resources effectively are identified as the important concrete development issues. The report further points out that applying these results to single-chip multi-processor architectures could expand the PC, workstation and HPC (high-performance computer) markets and create new industries. Although wide-area distributed computing is attracting attention as a next-generation computing industry, the industrial fields that would actually use such computing are not yet clear, and research remains exploratory. (NEDO)

  5. Hanford science and technology needs statements document

    Energy Technology Data Exchange (ETDEWEB)

    Piper, L.L.

    1997-12-31

    This document is a compilation of the Hanford science and technology needs statements for FY 1998. The needs were developed by the Hanford Site Technology Coordination Group (STCG) with full participation and endorsement of site user organizations, stakeholders, and regulators. The purpose of this document is to: (a) provide a comprehensive listing of Hanford science and technology needs, and (b) identify partnering and commercialization opportunities with industry, other federal and state agencies, and the academic community. The Hanford STCG reviews and updates the needs annually. Once completed, the needs are communicated to DOE for use in the development and prioritization of their science and technology programs, including the Focus Areas, Cross-Cutting Programs, and the Environmental Management Science Program. The needs are also transmitted to DOE through the Accelerating Cleanup: 2006 Plan. The public may access the need statements on the Internet on: the Hanford Home Page (www.hanford.gov), the Pacific Rim Enterprise Center's web site (www2.pacific-rim.org/pacific rim), or the STCG web site at DOE headquarters (em-52.em.doegov/ifd/stcg/stcg.htm). This page includes links to science and technology needs for many DOE sites. Private industry is encouraged to review the need statements and contact the Hanford STCG if they can provide technologies that meet these needs. On-site points of contact are included at the end of each need statement. The Pacific Rim Enterprise Center (206-224-9934) can also provide assistance to businesses interested in marketing technologies to the DOE.

  6. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    In order to support concept literacy, especially for students for whom English is not their native language, a number of universities in South Africa are compiling multilingual glossaries through which languages other than English may be employed as auxiliary media. Terminologies in languages other than English ...

  7. Repository of not readily available documents for project W-320

    Energy Technology Data Exchange (ETDEWEB)

    Conner, J.C.

    1997-04-18

    The purpose of this document is to provide a readily available source of the technical reports needed for the development of the safety documentation provided for the waste retrieval sluicing system (WRSS), designed to remove the radioactive and chemical sludge from tank 241-C-106, and transport that material to double-shell tank 241-AY-102 via a new, temporary, shielded, encased transfer line.

  8. Repository of not readily available documents for project W-320

    International Nuclear Information System (INIS)

    Conner, J.C.

    1997-01-01

    The purpose of this document is to provide a readily available source of the technical reports needed for the development of the safety documentation provided for the waste retrieval sluicing system (WRSS), designed to remove the radioactive and chemical sludge from tank 241-C-106, and transport that material to double-shell tank 241-AY-102 via a new, temporary, shielded, encased transfer line

  9. A Conceptual Model for Multidimensional Analysis of Documents

    Science.gov (United States)

    Ravat, Franck; Teste, Olivier; Tournier, Ronan; Zurfluh, Gilles

    Data warehousing and OLAP are mainly used for the analysis of transactional data. Nowadays, with the evolution of the Internet and the development of semi-structured data exchange formats (such as XML), it is possible to consider entire fragments of data, such as documents, as analysis sources. As a consequence, an adapted multidimensional analysis framework needs to be provided. In this paper, we introduce an OLAP multidimensional conceptual model without facts. This model is based on the single concept of dimensions and is adapted for multidimensional document analysis. We also provide a set of manipulation operations.
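
    A fact-free, dimensions-only analysis can be illustrated in a few lines of Python: there is no fact table, so an analysis simply crosses one dimension with a measure computed from the documents themselves. The dimension names and the keyword-count aggregation below are illustrative assumptions, not the paper's formal model or operations.

        from collections import Counter

        documents = [
            {"year": 2003, "author": "Klein", "text": "xml rdf schema xml"},
            {"year": 2004, "author": "Staab", "text": "rdf ontology"},
        ]

        def analyse(docs, row_dim, keyword):
            """Cross one dimension against a keyword-frequency measure."""
            table = Counter()
            for d in docs:
                table[d[row_dim]] += d["text"].split().count(keyword)
            return dict(table)

        print(analyse(documents, "year", "rdf"))    # {2003: 1, 2004: 1}
        print(analyse(documents, "author", "xml"))  # {'Klein': 2, 'Staab': 0}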

  10. Studying Wind Energy/Bird Interactions: A Guidance Document

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, R. [California Energy Commission (US); Morrison, M. [California State Univ., Sacramento, CA (US); Sinclair, K. [Dept. of Energy/National Renewable Energy Lab. (US); Strickland, D. [WEST, Inc. (US)

    1999-12-01

    This guidance document is a product of the Avian Subcommittee of the National Wind Coordinating Committee (NWCC). The NWCC was formed to better understand and promote responsible, credible, and comparable avian/wind energy interaction studies. Bird mortality is a concern, and wind power is a potential clean and green source of electricity, making the study of wind energy/bird interactions essential. This document provides an overview for regulators and stakeholders concerned with wind energy/bird interactions, as well as a more technical discussion of the basic concepts and tools for studying such interactions.

  11. ON EXPERIENCE OF THE ELECTRONIC DOCUMENT MANAGEMENT SYSTEM IMPLEMENTATION IN THE MEDICAL UNIVERSITY

    Directory of Open Access Journals (Sweden)

    A. V. Semenets

    2015-05-01

    The importance of applying electronic document management in Ukrainian healthcare is shown. An overview of the electronic document management systems market is presented. The use of an open-source electronic document management system at the I. Ya. Horbachevsky Ternopil State Medical University is described. The possibilities for implementing an electronic document management system within cloud services are shown. The electronic document management features of Microsoft Office 365 and Google Apps for Education are compared. Some results of using Google Apps for Education in TSMU as an electronic document management system are presented.

  12. Thoughts and views on the compilation of monolingual dictionaries ...

    African Journals Online (AJOL)

    The end-products should be of a high lexicographic standard, well-balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries could be easily ...

  13. Monte Carlo programs and other utilities for high energy physics

    International Nuclear Information System (INIS)

    Palounek, A.P.T.; Youssef, S.

    1990-05-01

    The Software Standards and Documentation Group of the Workshop on Physics and Detector Simulation for SSC Experiments has compiled a list of physics generators, detector simulations, and related programs. This is not meant to be an exhaustive compilation, nor is any judgment made about program quality; it is a starting point for a more complete bibliography. Where possible we have included an author and source for the code. References for most programs are in the final section

  14. 12 CFR 503.2 - Exemptions of records containing investigatory material compiled for law enforcement purposes.

    Science.gov (United States)

    2010-01-01

    12 CFR § 503.2 (2010-01-01 edition), Title 12, Banks and Banking, Office of Thrift Supervision, Department of the Treasury, Privacy Act: Exemptions of records containing investigatory material compiled for law enforcement purposes.

  15. Title list of documents made publicly available

    International Nuclear Information System (INIS)

    1994-06-01

    The Title List of Documents Made Publicly Available is a monthly publication. It contains descriptions of the information received and generated by the US Nuclear Regulatory Commission (NRC). This information includes (1) docketed material associated with civilian nuclear power plants and other uses of radioactive materials and (2) nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. As used here, docketed does not refer to Court dockets; it refers to the system by which NRC maintains its regulatory records. This series of documents is indexed by a Personal Author Index, a Corporate Source Index, and a Report Number Index

  16. Title list of documents made publicly available

    International Nuclear Information System (INIS)

    1982-03-01

    The Title List of Documents Made Publicly Available is a monthly publication. It contains descriptions of the information received and generated by the US Nuclear Regulatory Commission (NRC). This information includes (1) docketed material associated with civilian nuclear power plants and other uses of radioactive materials and (2) nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. As used here, docketed does not refer to Court dockets; it refers to the system by which NRC maintains its regulatory records. This series of documents is indexed by a Personal Author Index, a Corporate Source Index, and a Report Number Index

  17. Title list of documents made publicly available

    International Nuclear Information System (INIS)

    1991-01-01

    The Title List of Documents Made Publicly Available is a monthly publication. It contains descriptions of the information received and generated by the US Nuclear Regulatory Commission (NRC). This information includes docketed material associated with civilian nuclear power plants and other uses of radioactive materials and nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. This series of documents is indexed by a Personal Author Index, a Corporate Source Index, and a Report Number Index. The docketed information contained in the Title List includes the information formerly issued through the Department of Energy publication Power Reactor Docket Information, last published in January 1979

  18. Understanding "Animal Farm": A Student Casebook to Issues, Sources, and Historical Documents.

    Science.gov (United States)

    Rodden, John

    "Animal Farm" is a political allegory of the USSR written in the form of a fable. Its stinging moral warning against the abuse of power is demonstrated in this casebook through a wide variety of historical, political, and literary documents that are directly applicable to George Orwell's novel. Included in the casebook are passages from…

  19. NoSQL: collection document and cloud by using a dynamic web query form

    Science.gov (United States)

    Abdalla, Hemn B.; Lin, Jinzhao; Li, Guoquan

    2015-07-01

    MongoDB (from "humongous") is an open-source document database and the leading NoSQL database. NoSQL ("Not Only SQL") denotes a next generation of databases that are non-relational, distributed, open-source and horizontally scalable, providing a mechanism for the storage and retrieval of documents. Previously, we stored and retrieved data using SQL queries. Here we use MongoDB instead, so no MySQL or SQL queries are involved: document files are imported directly into our drive (folder) and retrieved from it without SQL, using an IO BufferedReader to import the document files into the folder and a BufferedWriter to retrieve them from that folder or drive. We also provide security for the stored files, since documents kept in a plain local folder could be viewed and modified by anyone. To prevent this, the original document files are converted to another format, in this paper a binary format, before being stored in our folder; at storage time a private key is issued for accessing the file. If any user tries to open a document file, its data appears only in binary format; the file's owner alone can recover the original format with the personal key, receiving the secret key from the cloud.
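
    As a rough illustration of this storage scheme, the following Python sketch reads a document file as raw bytes (its "binary format"), stores it under a randomly generated private key, and returns it only when that key is presented. The use of pymongo against a local MongoDB instance and the helper names are assumptions made for the sketch; the prototype described above uses Java BufferedReader/BufferedWriter and a cloud-issued secret key, and larger files would need GridFS rather than a plain document.

        import secrets
        from pymongo import MongoClient

        client = MongoClient("mongodb://localhost:27017")
        files = client["drive"]["files"]  # one collection acts as the "drive"

        def store(path: str) -> str:
            """Store a file's raw bytes and return its private access key."""
            with open(path, "rb") as f:
                data = f.read()  # binary content, no SQL rows involved
            key = secrets.token_hex(16)  # per-file private key
            files.insert_one({"name": path, "key": key, "data": data})
            return key

        def retrieve(path: str, key: str) -> bytes:
            """Return the original bytes only if the correct key is given."""
            doc = files.find_one({"name": path, "key": key})
            if doc is None:
                raise PermissionError("wrong key or unknown file")
            return doc["data"]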

  20. Identification of documented medication non-adherence in physician notes.

    Science.gov (United States)

    Turchin, Alexander; Wheeler, Holly I; Labreche, Matthew; Chu, Julia T; Pendergrass, Merri L; Einbinder, Jonathan S; Einbinder, Jonathan Seth

    2008-11-06

    Medication non-adherence is common, and the physician's awareness of it may be an important factor in clinical decision making. Few sources of data on physician awareness of medication non-adherence are available. We have designed an algorithm to identify documentation of medication non-adherence in the text of physician notes. The algorithm recognizes eight semantic classes of documentation of medication non-adherence. We evaluated the algorithm against manual ratings of 200 randomly selected notes of hypertensive patients. The algorithm detected 89% of the notes with documented medication non-adherence, with a specificity of 84.7% and a positive predictive value of 80.2%. In a larger dataset of 1,000 documents, notes that documented medication non-adherence were more likely to report significantly elevated systolic (15.3% vs. 9.0%; p = 0.002) and diastolic (4.1% vs. 1.9%; p = 0.03) blood pressure. This novel, clinically validated tool expands the range of information on medication non-adherence available to researchers.
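
    The flavour of such an algorithm can be suggested with a small Python sketch that searches note text for non-adherence language. The patterns below are illustrative stand-ins only; the published algorithm distinguishes eight semantic classes and was validated against manual review, which simple keyword matching does not reproduce.

        import re

        # Hypothetical example patterns; the real algorithm's eight
        # semantic classes are richer than these.
        NONADHERENCE_PATTERNS = [
            r"\b(not|stopped) taking (his|her|the) \w+",
            r"\bran out of (his|her|the )?med",
            r"\bnon-?complian(t|ce) with med",
            r"\bmissed doses\b",
        ]

        def documents_nonadherence(note: str) -> bool:
            """True if the note text contains documented non-adherence."""
            return any(re.search(p, note, re.IGNORECASE)
                       for p in NONADHERENCE_PATTERNS)

        print(documents_nonadherence(
            "Patient admits she stopped taking her lisinopril."))  # True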