WorldWideScience

Sample records for data compilation

  1. Compilation of data on elementary particles

    International Nuclear Information System (INIS)

    Trippe, T.G.

    1984-09-01

    The most widely used data compilation in the field of elementary particle physics is the Review of Particle Properties. The origin, development and current state of this compilation are described with emphasis on the features which have contributed to its success: active involvement of particle physicists; critical evaluation and review of the data; completeness of coverage; regular distribution of reliable summaries including a pocket edition; heavy involvement of expert consultants; and international collaboration. The current state of the Review and new developments such as providing interactive access to the Review's database are described. Problems and solutions related to maintaining a strong and supportive relationship between compilation groups and the researchers who produce and use the data are discussed.

  2. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-10-01

    This is the third issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section approximately every six months. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations.

  3. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-03-01

    This is the second issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations. It excludes references to "mass-chain" evaluations normally published in "Nuclear Data Sheets" and "Nuclear Physics". The material contained in this compilation is sorted according to eight subject categories: general compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes: half-lives, energies and spectra; nuclear decay processes: gamma-rays; nuclear decay processes: fission products; nuclear decay processes (others); atomic processes.

  4. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-10-01

    This is the fourth issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section every year. The material contained in this compilation is sorted according to eight subject categories: general compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes: half-lives, energies and spectra; nuclear decay processes: gamma-rays; nuclear decay processes: fission products; nuclear decay processes (others); atomic processes.

  5. Asian collaboration on nuclear reaction data compilation

    International Nuclear Information System (INIS)

    Aikawa, Masayuki; Furutachi, Naoya; Kato, Kiyoshi; Makinaga, Ayano; Devi, Vidya; Ichinkhorloo, Dagvadorj; Odsuren, Myagmarjav; Tsubakihara, Kohsuke; Katayama, Toshiyuki; Otuka, Naohiko

    2013-01-01

    Nuclear reaction data are essential for research and development in nuclear engineering, radiation therapy, nuclear physics and astrophysics. Experimental data must be compiled in a database and be accessible to nuclear data users. One of the nuclear reaction databases is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC) under the auspices of the International Atomic Energy Agency. Recently, collaboration among the Asian NRDC members has been further developed with the support of the Asia-Africa Science Platform Program of the Japan Society for the Promotion of Science. We report on three years of activity to develop the Asian collaboration on nuclear reaction data compilation. (author)

  6. Compilation of solar abundance data

    International Nuclear Information System (INIS)

    Hauge, Oe.; Engvold, O.

    1977-01-01

    Interest in the previous compilations of solar abundance data by the same authors (ITA--31 and ITA--39) has led to this third, revised edition. Solar abundance data for 67 elements are tabulated and, in addition, upper limits for the abundances of 5 elements are listed. References are made to 167 papers. A recommended abundance value is given for each element. (JIW)

  7. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  8. Data compilation for particle-impact desorption, 2

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeutchi, Fujio.

    1985-07-01

    Particle impact desorption is one of the elementary processes of hydrogen recycling in controlled thermonuclear fusion reactors. We have surveyed the literature concerning ion impact desorption and photon stimulated desorption published through the end of 1984 and compiled the data on desorption cross sections and yields with the aid of a computer. This report presents the results of the compilation in graphs and tables as functions of incident energy, surface temperature and surface coverage. (author)

  9. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1981-03-01

    A request list for nuclear data, produced from a computerized data file by the National Nuclear Data Center, is presented. The request list is ordered by target nucleus (isotope) and then by reaction type. The purpose of the compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. Requesters are identified by laboratory, last name, and sponsoring US government agency.

  10. Compilation of data from hadronic atoms

    International Nuclear Information System (INIS)

    Poth, H.

    1979-01-01

    This compilation is a survey of the existing data on hadronic atoms (pionic atoms, kaonic atoms, antiprotonic atoms, sigmonic atoms). It collects measurements of the energies, intensities and line widths of X-rays from hadronic atoms. Averaged values for each hadronic atom are given and the data are summarized. The listing contains data on 58 pionic atoms, 54 kaonic atoms, 23 antiprotonic atoms and 20 sigmonic atoms. (orig./HB)

  11. Compilation of actinide neutron nuclear data

    International Nuclear Information System (INIS)

    1979-01-01

    The Swedish Nuclear Data Committee has compiled a selected set of neutron cross section data for the 16 most important actinide isotopes. The aim of the report is to present the available data in a comprehensible way, to allow a comparison between different evaluated libraries and to allow the reliability of these libraries to be judged against the experimental data. The data are given in graphical form below about 1 eV and above about 10 keV, while the 2200 m/s cross sections and resonance integrals are given in numerical form. (G.B.)
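
    To make the tabulated quantities concrete, the sketch below computes the two numerical values the report lists, the 2200 m/s cross section and the resonance integral RI = ∫σ(E) dE/E above the cadmium cut-off, from a pointwise cross-section table. It is an illustration only: the energy grid, the cross-section values and the log-log interpolation choice are all assumptions, not data from the report.

      import math

      energies = [0.01, 0.0253, 0.1, 0.5, 1.0, 10.0, 100.0, 1e4, 1e5]   # eV (placeholder grid)
      sigmas   = [30.0, 20.0, 10.0, 5.0, 3.0, 8.0, 2.0, 1.5, 1.2]       # barn (invented values)

      def interp_sigma(e):
          """Log-log interpolation of the pointwise cross section."""
          for (e1, s1), (e2, s2) in zip(zip(energies, sigmas),
                                        zip(energies[1:], sigmas[1:])):
              if e1 <= e <= e2:
                  f = (math.log(e) - math.log(e1)) / (math.log(e2) - math.log(e1))
                  return math.exp(math.log(s1) + f * (math.log(s2) - math.log(s1)))
          raise ValueError("energy outside table")

      sigma_2200 = interp_sigma(0.0253)   # 2200 m/s corresponds to 0.0253 eV

      # Resonance integral: integral of sigma(E) dE/E above the 0.5 eV cut-off,
      # approximated by the trapezoidal rule on a logarithmic energy grid.
      lo, hi, n = 0.5, 1e5, 2000
      ri, last = 0.0, None
      for i in range(n + 1):
          e = lo * (hi / lo) ** (i / n)
          s = interp_sigma(e)
          if last is not None:
              ri += 0.5 * (s + last[1]) * (math.log(e) - math.log(last[0]))
          last = (e, s)

      print(f"sigma(2200 m/s) = {sigma_2200:.2f} b, RI = {ri:.2f} b")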

  12. Nuclear power plant operational data compilation system

    International Nuclear Information System (INIS)

    Silberberg, S.

    1980-01-01

    Electricite de France R and D Division has set up a nuclear power plant operational data compilation system. This data bank, built initially from American documents, provides results on plant operation and the operational behaviour of equipment. At present, French units in commercial operation are taken into account. Results obtained after five years of data bank operation are given. (author)

  13. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  14. Transportation legislative data base: State radioactive materials transportation statute compilation, 1989--1993

    International Nuclear Information System (INIS)

    1994-04-01

    The Transportation Legislative Data Base (TLDB) is a computer-based information service containing summaries of federal, state and certain local government statutes and regulations relating to the transportation of radioactive materials in the United States. The TLDB has been operated by the National Conference of State Legislatures (NCSL) under cooperative agreement with the US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management since 1992. The data base system serves the legislative and regulatory information needs of federal, state, tribal and local governments, the affected private sector and interested members of the general public. Users must be approved by DOE and NCSL. This report is a state statute compilation that updates the 1989 compilation produced by Battelle Memorial Institute, the previous manager of the data base. This compilation includes statutes not included in the prior compilation, as well as newly enacted laws. Statutes not included in the prior compilation show an enactment date prior to 1989. Statutes that deal with low-level radioactive waste transportation are included in the data base, as are statutes from the states of Alaska and Hawaii. Over 155 new entries to the data base are summarized in this compilation.

  15. Compilation status and research topics in Hokkaido University Nuclear Reaction Data Centre

    International Nuclear Information System (INIS)

    Aikawa, M.; Furutachi, N.; Katō, K.; Ebata, S.; Ichinkhorloo, D.; Imai, S.; Sarsembayeva, A.; Zhou, B.; Otuka, N.

    2015-01-01

    Nuclear reaction data are needed in many fields of application. Such data must be compiled into a database to be conveniently available. One such database is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC). As a member of the NRDC, the Hokkaido University Nuclear Reaction Data Centre (JCPRG) compiles charged-particle induced reaction data and contributes about 10 percent of the EXFOR database. In this paper, we show the recent compilation status and related research topics of JCPRG. (author)

  16. ccPDB: compilation and creation of data sets from Protein Data Bank.

    Science.gov (United States)

    Singh, Harinder; Chauhan, Jagat Singh; Gromiha, M Michael; Raghava, Gajendra P S

    2012-01-01

    ccPDB (http://crdd.osdd.net/raghava/ccpdb/) is a database of data sets compiled from the literature and Protein Data Bank (PDB). First, we collected and compiled data sets from the literature used for developing bioinformatics methods to annotate the structure and function of proteins. Second, data sets were derived from the latest release of PDB using standard protocols. Third, we developed a powerful module for creating a wide range of customized data sets from the current release of PDB. This is a flexible module that allows users to create data sets using a simple six step procedure. In addition, a number of web services have been integrated in ccPDB, which include submission of jobs on PDB-based servers, annotation of protein structures and generation of patterns. This database maintains >30 types of data sets such as secondary structure, tight-turns, nucleotide interacting residues, metals interacting residues, DNA/RNA binding residues and so on.
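
    As an illustration of the kind of customized data set creation the ccPDB module automates, the sketch below filters a small table of PDB-derived records by resolution, chain length and DNA-binding annotation. The record fields, criteria and sample entries are hypothetical; they are not ccPDB's actual schema or protocols.

      from dataclasses import dataclass

      @dataclass
      class PDBRecord:                 # invented fields, for illustration only
          pdb_id: str
          resolution: float            # Angstrom
          chain_length: int
          binds_dna: bool

      records = [
          PDBRecord("1ABC", 1.8, 250, True),
          PDBRecord("2XYZ", 3.2, 120, False),
          PDBRecord("3DEF", 2.0, 310, True),
      ]

      def make_dataset(recs, max_resolution=2.5, min_length=50, dna_binding=True):
          """Select records meeting simple user-chosen criteria."""
          return [r for r in recs
                  if r.resolution <= max_resolution
                  and r.chain_length >= min_length
                  and r.binds_dna == dna_binding]

      print([r.pdb_id for r in make_dataset(records)])   # -> ['1ABC', '3DEF']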

  17. Data compilation for radiation effects on ceramic insulators

    International Nuclear Information System (INIS)

    Fukuya, Koji; Terasawa, Mititaka; Nakahigashi, Shigeo; Ozawa, Kunio.

    1986-08-01

    Data on radiation effects on ceramic insulators were compiled from the literature and summarized from the viewpoint of fast neutron irradiation effects. The data were classified according to property and ceramic type. The properties are dimensional stability, mechanical properties, thermal properties, and electrical and dielectric properties. Data sheets were prepared for each table or graph in the literature. The characteristic features of the data base are briefly described. (author)

  18. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    Weston, L.W.; Larson, D.C.

    1993-02-01

    This compilation represents the current needs for nuclear data measurements and evaluations as expressed by interested fission and fusion reactor designers, medical users of nuclear data, nuclear data evaluators, CSEWG members and other interested parties. The requests and justifications are reviewed by the Data Request and Status Subcommittee of CSEWG as well as most of the general CSEWG membership. The basic format and computer programs for the Request List were produced by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory, which produced the Request List for many years. The Request List is compiled from a computerized data file. Each request has a unique isotope, reaction type, requestor and identifying number. The first two digits of the identifying number are the year in which the request was initiated. Every effort has been made to restrict the notations to those used in common nuclear physics textbooks. Most requests are for individual isotopes, as are most ENDF evaluations; however, there are some requests for elemental measurements. Each request gives a priority rating, which is discussed in Section 2; the neutron energy range for which the request is made; the accuracy requested in terms of one standard deviation; and the requested energy resolution in terms of one standard deviation. Also given is the requestor, with the comments which were furnished with the request. The addresses and telephone numbers of the requestors are given in Appendix 1. ENDF evaluators who may be contacted concerning evaluations are given in Appendix 2. Experimentalists contemplating making one of the requested measurements are encouraged to contact both the requestor and the evaluator, who may provide valuable information. This is a working document in that it will change with time. New requests or comments may be submitted to the editors or a regular CSEWG member at any time.
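
    The abstract spells out the fields each request record carries; a minimal sketch of such a record is given below. The field names and sample values are mine, chosen only to mirror the description above, not the NNDC file layout.

      from dataclasses import dataclass

      @dataclass
      class NuclearDataRequest:
          request_id: str        # first two digits encode the year the request was initiated
          isotope: str           # target nucleus
          quantity: str          # reaction type
          requestor: str         # laboratory, last name, sponsoring US government agency
          priority: int          # priority rating (discussed in Section 2 of the report)
          e_min_ev: float        # requested neutron energy range, lower bound (eV)
          e_max_ev: float        # upper bound (eV)
          accuracy_pct: float    # requested accuracy, one standard deviation (%)
          resolution_pct: float  # requested energy resolution, one standard deviation (%)

      req = NuclearDataRequest("93001", "U-235", "(n,f)", "ORNL Weston DOE",
                               1, 1.0e3, 2.0e7, 3.0, 10.0)
      assert req.request_id[:2] == "93"   # year of initiation, per the compilation's convention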

  19. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This report presents a compilation of both fracture properties and hydrogeological parameters relevant to the flow of groundwater in fractured rock systems. Methods of data acquisition as well as the scale of and conditions during the measurement are recorded. Measurements and analytical techniques for each of the parameters under consideration have been reviewed with respect to their methodology, assumptions and accuracy. Both the rock type and geologic setting associated with these measurements have also been recorded. 373 refs

  20. Recent Efforts in Data Compilations for Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Dillmann, Iris

    2008-01-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on 'Nuclear Physics Data Compilation for Nucleosynthesis Modeling' held at the ECT* in Trento/Italy from May 29th to June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The 'JINA Reaclib Database' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel University (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A 'high priority list' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. 'Workflow tools' aim to make the evaluation process transparent and allow users to follow the progress.
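
    For orientation, reaction-rate libraries in the REACLIB family typically store each rate as a seven-coefficient fit in the temperature variable T9 = T/10^9 K; the sketch below evaluates one such fit. The functional form is the commonly used REACLIB parameterization (an assumption here, since the abstract does not spell it out), and the coefficients are invented placeholders, not a real rate.

      import math

      def reaclib_rate(t9, a):
          """Evaluate exp(a0 + a1/T9 + a2*T9^(-1/3) + a3*T9^(1/3)
          + a4*T9 + a5*T9^(5/3) + a6*ln(T9))."""
          return math.exp(a[0] + a[1] / t9 + a[2] * t9 ** (-1.0 / 3.0)
                          + a[3] * t9 ** (1.0 / 3.0) + a[4] * t9
                          + a[5] * t9 ** (5.0 / 3.0) + a[6] * math.log(t9))

      coeffs = [12.0, -1.5, 0.0, -3.0, 0.2, -0.01, 1.5]   # placeholder numbers
      for t9 in (0.1, 1.0, 3.0):
          print(f"T9 = {t9}: rate = {reaclib_rate(t9, coeffs):.3e}")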

  21. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1991-01-01

    Reliability data are essential in probabilistic safety assessment, with component reliability parameters being particularly important. Plant-specific component failure data would be most appropriate, but such data are rather limited. However, similar components are used in different designs, so generic data, that is, all data that are not specific to the plant being analyzed but which relate to the components more generally, are important. The International Atomic Energy Agency has compiled the Generic Component Reliability Data Base from data available in the open literature. It is part of the IAEA computer code package for fault/event tree analysis. The Data Base contains 1010 different records, covering most of the components used in probabilistic safety analyses of nuclear power plants. The data base input was quality controlled and data sources noted. The data compilation procedure and problems associated with using generic data are explained. (UK)

  22. VizieR Online Data Catalog: Compilation of stellar rotation data (Kovacs, 2018)

    Science.gov (United States)

    Kovacs, G.

    2018-03-01

    The three datasets, included in table1-1.dat, table1-2.dat and table1-6.dat respectively, correspond to the types of stars listed in lines 1 [Praesepe], 2 [HJ_host] and 6 [Field(C)] of Table 1. These data result from the compilation of rotational and other stellar data from the literature. (4 data files).

  23. Individual risk. A compilation of recent British data

    International Nuclear Information System (INIS)

    Grist, D.R.

    1978-08-01

    A compilation of individual-risk data obtained from recent British population and mortality statistics is presented. Risk data presented include: risk of death, as a function of age, due to several important natural causes and due to accidents and violence; risk of death as a function of location of accident; and risk of death from various accidental causes. (author)

  24. Compilation of data on γγ → hadrons

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1986-06-01

    Data on γγ → hadrons extracted from e+e- reactions are compiled. The review includes inclusive cross-sections, structure functions, exclusive cross-sections and resonance widths. Data up to 1st July 1986 are included. All the data in this review can be found and retrieved in the Durham-RAL HEP database, together with a wide range of other reaction data. Users throughout Europe can interactively access the database through CMS on the RAL computer. (author)

  25. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1983-01-01

    The purpose of this compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. It is the result of a biennial review in which the Department of Energy (DOE) and contractors, Department of Defense Laboratories and contractors, and other interested groups have been asked to review and revise their requests for nuclear data. It was felt that the evaluators of cross section data and the users of these evaluations should be involved in the review of the data requests to make this compilation more useful. This request list is ordered by target nucleus (Isotope) and then reaction type (Quantity). Each request is assigned a unique identifying number. The first two digits of this number give the year the request was initiated. All requests for a given Isotope and Quantity are grouped (or blocked) together. The requests in a block are followed by any status comments. Each request has a unique Isotope, Quantity and Requester. The requester is identified by laboratory, last name, and sponsoring US government agency, e.g., BET, DEI, DNR. All requesters, together with their addresses and phone numbers, are given in appendix B. A list of the evaluators responsible for ENDF/B-V evaluations with their affiliation appears in appendix C. All requests must give the energy (or range of energy) for the incident particle when appropriate. The accuracy needed in percent is also given. The error quoted is assumed to be 1-sigma at each measured point in the energy range requested unless a comment specifies otherwise. Sometimes a range of accuracy, indicated by two values, is given, or a statement is made in the free-text comments. An incident particle energy resolution in percent is sometimes given.

  26. Data compilation for particle impact desorption

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeuchi, Fujio.

    1984-05-01

    The desorption of gases from solid surfaces by incident electrons, ions and photons is one of the important processes of hydrogen recycling in controlled thermonuclear reactors. We have surveyed the literature concerning particle impact desorption published through 1983 and compiled the data on desorption cross sections and desorption yields with the aid of a computer. This report presents the results obtained for electron stimulated desorption, the desorption cross sections and yields being given in graphs and tables as functions of incident electron energy, surface temperature and gas exposure. (author)

  27. Source list of nuclear data bibliographies, compilations, and evaluations

    International Nuclear Information System (INIS)

    Burrows, T.W.; Holden, N.E.

    1978-10-01

    To aid the user of nuclear data, many specialized bibliographies, compilations, and evaluations have been published. This document is an attempt to bring together a list of such publications with an indication of their availability and cost.

  28. Methodology and procedures for compilation of historical earthquake data

    International Nuclear Information System (INIS)

    1987-10-01

    This report was prepared subsequent to the recommendations of the project initiation meeting in Vienna, November 25-29, 1985, under the IAEA Interregional Project INT/9/066, Seismic Data for Nuclear Power Plant Siting. The aim of the project is to co-ordinate national efforts of Member States in the Mediterranean region in the compilation and processing of historical earthquake data for use in the siting of nuclear facilities. The main objective of the document is to assist the participating Member States, especially those who are initiating an NPP siting programme, in their efforts to compile and process historical earthquake data and to provide a uniform interregional framework for this task. Although the document is directed mainly at the Mediterranean countries, using illustrative examples from this region, the basic procedures and methods described herein may be applicable to other parts of the world, such as Southeast Asia, the Himalayan belt, Latin America, etc. 101 refs, 7 figs

  29. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Ukai, K.; Nakamura, T.

    1984-09-01

    An updated data compilation on single pion photoproduction experiments below 2 GeV is presented. This data bank includes not only the data of the single pion photoproduction processes but also those of proton Compton scattering (γp → γp) and of the inverse process π-p → γn. The total numbers of data points are 6240 for γp → π+n, 5715 for γp → π0p, 2835 for γn → π-p, 177 for γn → π0n, 669 for γp → γp, and 112 for π-p → γn. The compiled data are stored in the central computer (FACOM M-380R) of the Institute for Nuclear Study, University of Tokyo, for direct use of the data bank, and on magnetic tapes with the standard label for other laboratories. The FACOM computer is compatible with IBM 370 series or IBM 303X or 308X series machines. The data on the magnetic tapes are available on request. (Kato, T.)

  30. User's manual for the computer-aided plant transient data compilation

    International Nuclear Information System (INIS)

    Langenbuch, S.; Gill, R.; Lerchl, G.; Schwaiger, R.; Voggenberger, T.

    1984-01-01

    The objective of this project is the compilation of data for nuclear power plants needed for transient analyses. The concept has already been described. This user's manual gives a detailed description of all functions of the dialogue system that supports data acquisition and retrieval. (orig.)

  31. Engineering a compiler

    CERN Document Server

    Cooper, Keith D

    2012-01-01

    As computing has changed, so has the role of both the compiler and the compiler writer. The proliferation of processors, environments, and constraints demands an equally large number of compilers. To adapt, compiler writers retarget code generators, add optimizations, and work on issues such as code space or power consumption. Engineering a Compiler re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in today's practice. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best-practice algorithms for the major problems inside a compiler. The book focuses on the back end of the compiler, reflecting the focus of research and development over the last decade; applies the well-developed theory behind scanning and parsing to introduce concepts that play a critical role in optimization and code generation; and introduces the student to optimization through data-flow analysis, SSA form, and a selection of sc...

  32. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Inagaki, Y.; Nakamura, T.; Ukai, K.

    1976-01-01

    A compilation of data from single pion photoproduction experiments below 2 GeV is presented, with keywords that specify each experiment. These data are written on a magnetic tape. The data format and the indices for the keywords are given. Various programs for using this tape are also presented. The results of the compilation are divided into two types: one is the reference card, which carries the information about the experiment; the other is the data card. These reference and data cards are written entirely in A-type format on an original tape. Copy tapes, written in various formats, are available on request. There are two kinds of copy tape: one is identical to the original tape, and the other differs in the data cards, which are written in F-type format according to the data type. Each experiment on such a tape is represented by three kinds of cards: one reference card in A-type format, multiple data cards in F-type format and one identifying card. Various programs written in FORTRAN are available for these original and copy tapes. (Kato, T.)
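
    To illustrate the card-image organisation the abstract describes, the sketch below parses one alphanumeric (A-type) reference card and one floating-point (F-type) data card from fixed columns. The column layout, field names and sample cards are invented; the report's actual card formats are not reproduced in the abstract.

      REFERENCE_CARD = "R PHOTOPROD  GP->PI+N   1975 LAB-X"
      DATA_CARD      = "D  0.450  90.0   12.34   0.56"

      def parse_card(card):
          kind = card[0]
          if kind == "R":                      # reference card: text (A-type) fields
              return {"kind": "ref",
                      "process": card[2:24].strip(),
                      "year": int(card[24:29]),
                      "lab": card[29:40].strip()}
          if kind == "D":                      # data card: floating (F-type) fields
              return {"kind": "data",
                      "e_gamma_gev": float(card[2:9]),
                      "angle_deg": float(card[9:16]),
                      "dsigma": float(card[16:25]),
                      "error": float(card[25:32])}
          raise ValueError("unknown card type")

      print(parse_card(REFERENCE_CARD))
      print(parse_card(DATA_CARD))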

  33. A new compiler for the GANIL Data Acquisition description

    International Nuclear Information System (INIS)

    Saillant, F.; Raine, B.

    1997-01-01

    An important feature of the GANIL Data Acquisition System is the description of experiments by means of a language developed at GANIL. The philosophy is to attribute to each element (parameters, spectra, etc.) an operational name which will be used at any level of the system. This language references a library of modules to free the user from the technical details of the hardware. The compiler has recently been entirely re-developed using technologies such as the object-oriented language C++ and an object-oriented software development method and tool. This enables us to provide new functionality or to support a new electronic module within a very short delay and without any deep modification of the application. A new Dynamic Library of Modules has also been developed. Its complete description is available on the GANIL WEB site http://ganinfo.in2p3.fr/acquisition/homepage.html. This new compiler brings many new capabilities, the most important of which is the notion of a 'register', whatever the module standard. All the registers described in the module provider's documentation can now be accessed by their names. Another important new feature is the notion of a 'function' that can be executed on a module. A set of new instructions has also been implemented to execute commands on CAMAC crates. The new compiler also enables the description of specific interfaces with the GANIL Data Acquisition System. This has been used to describe the coupling of the CHIMERA Data Acquisition System with the INDRA one through a shared memory in the VME crate. (authors)

  34. CRECTJ: a computer program for compilation of evaluated nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-09-01

    In order to compile evaluated nuclear data in the ENDF format, the computer program CRECTJ has been developed. CRECTJ has two versions: CRECTJ5 treats data in the ENDF/B-IV and ENDF/B-V formats, and CRECTJ6 data in the ENDF-6 format. These programs have been frequently used to produce the Japanese Evaluated Nuclear Data Library (JENDL). This report describes the input data and gives examples of CRECTJ usage. (author)
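
    As a glimpse of the bookkeeping a compilation code like CRECTJ must perform, the sketch below extracts the material (MAT), file (MF) and section (MT) numbers that every line of an ENDF-6 file carries in fixed columns 67-75. The sample line is fabricated for illustration and is not taken from JENDL.

      SAMPLE = (" 2.925500+4 5.545400+1          0          0"
                "          0          02925 1451    1")

      def endf_ids(line):
          """Return (MAT, MF, MT) from columns 67-70, 71-72 and 73-75."""
          line = line.ljust(80)
          return int(line[66:70]), int(line[70:72]), int(line[72:75])

      mat, mf, mt = endf_ids(SAMPLE)
      print(mat, mf, mt)   # -> 2925 1 451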

  35. ERES: A PC program for nuclear data compilation in EXFOR format

    International Nuclear Information System (INIS)

    Li Shubing; Liang Qichang; Liu Tingin

    1994-01-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)
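
    To make the target format concrete, the sketch below parses a toy entry with the keyword-oriented shape of EXFOR (a BIB section of bibliographic keywords followed by a DATA table). The mini-entry is schematic, for illustration only, and is not a valid EXFOR subentry; real EXFOR has stricter column rules that ERES enforces.

      ENTRY = """\
      BIB
      REACTION   (26-FE-56(N,P)25-MN-56,,SIG)
      AUTHOR     (A.EXAMPLE)
      ENDBIB
      DATA
      EN-MEV     DATA-MB    ERR-MB
      14.1       105.0      5.0
      14.8       98.0       4.5
      ENDDATA
      """

      def parse_mini_exfor(text):
          """Split a toy entry into BIB keywords and a numeric DATA table."""
          bib, table, section = {}, [], None
          for raw in text.splitlines():
              line = raw.strip()
              if not line:
                  continue
              token = line.split()[0]
              if token in ("BIB", "DATA"):
                  section = token
              elif token in ("ENDBIB", "ENDDATA"):
                  section = None
              elif section == "BIB":
                  key, _, value = line.partition(" ")
                  bib[key] = value.strip()
              elif section == "DATA":
                  table.append(line.split())
          header, rows = table[0], [[float(x) for x in r] for r in table[1:]]
          return bib, header, rows

      bib, header, rows = parse_mini_exfor(ENTRY)
      print(bib["REACTION"], header, rows[0])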

  36. NEA contributions to the worldwide collection, compilation and dissemination of nuclear reaction data

    International Nuclear Information System (INIS)

    Dupont, E.

    2012-01-01

    The NEA Data Bank is an international centre of reference for basic nuclear tools used in the analysis and prediction of phenomena in different nuclear applications. The Data Bank collects and compiles computer codes and scientific data and contributes to their improvement for the benefit of scientists in its member countries. In line with this mission, the Data Bank is a core centre of the International Network of Nuclear Reaction Data Centres (NRDC), which co-ordinates the worldwide collection, compilation and dissemination of nuclear reaction data. The NRDC network was established in 1976 from the earlier Four-Centres' Network created in 1966 by the United States, the NEA, the International Atomic Energy Agency (IAEA) and the former Soviet Union. Today, the NRDC is a worldwide co-operation network under the auspices of the IAEA, with 14 nuclear data centres from 8 countries and 2 international organisations belonging to the network. The main objective of the NRDC is to preserve, update and disseminate experimental nuclear reaction data that have been compiled for more than 40 years in a shared database (EXFOR). The EXFOR database contains basic nuclear data on low- to medium-energy experiments for incident neutron, photon and various charged-particle-induced reactions on a wide range of isotopes, natural elements and compounds. Today, with more than 140 000 data sets from approximately 20 000 experiments, EXFOR is by far the most important and complete experimental nuclear reaction database in the world and is widely used in the field of nuclear science and technology. The Data Bank is responsible for the collection and compilation of nuclear reaction data measured in its geographical area. Since 1966, the Data Bank has contributed around 5 000 experiments to the EXFOR database, and it continues to compile new data while maintaining the highest level of quality throughout the database. NRDC co-ordination meetings are held on a biennial basis. Recent meetings

  37. A compilation of Sr and Nd isotope data on Mexico

    International Nuclear Information System (INIS)

    Verma, S.P.; Verma, M.P.

    1986-01-01

    A compilation is given of the available Sr and Nd isotope data on Mexican volcanic-plutonic terranes which cover about one-third of Mexico's territory. The available data are arranged according to a subdivision of the Mexican territory in terms of geological provinces. Furthermore, site and province averages and standard deviations are calculated and their petrogenetic implications are pointed out. (author)

  38. A program-compiling method of nuclear data on-line fast analysis

    International Nuclear Information System (INIS)

    Li Shangbai

    1990-01-01

    This paper discusses how to perform assembly floating-point operations using subroutines of the Applesoft system, and introduces a program-compiling method for fast on-line analysis of nuclear data on an Apple microcomputer.

  39. A Compilation of Global Bio-Optical in Situ Data for Ocean-Colour Satellite Applications

    Science.gov (United States)

    Valente, Andre; Sathyendranath, Shubha; Brotus, Vanda; Groom, Steve; Grant, Michael; Taberner, Malcolm; Antoine, David; Arnone, Robert; Balch, William M.; Barker, Kathryn; et al.

    2016-01-01

    A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GePCO), span between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data came either from multi-project archives accessible via open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making the metadata available, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).
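
    One homogenisation step the abstract mentions, averaging observations that are close in time and space, can be sketched as below. The binning thresholds, record layout and sample values are invented for illustration; they are not the OC-CCI protocol.

      from collections import defaultdict

      # (days since epoch, latitude, longitude, chlorophyll-a in mg m-3)
      obs = [(100.10, 45.001, -30.002, 0.52),
             (100.11, 45.002, -30.001, 0.48),   # near-duplicate of the first
             (250.00, 10.500, -140.000, 0.11)]

      def merge(observations, dt=0.25, dxy=0.01):
          """Group observations within dt days and dxy degrees, then average."""
          bins = defaultdict(list)
          for t, lat, lon, chl in observations:
              key = (round(t / dt), round(lat / dxy), round(lon / dxy))
              bins[key].append((t, lat, lon, chl))
          return [tuple(sum(v[i] for v in group) / len(group) for i in range(4))
                  for group in bins.values()]

      for rec in merge(obs):
          print(rec)   # two records remain: the close pair has been averaged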

  40. ERES: A PC software for nuclear data compilation in EXFOR format

    International Nuclear Information System (INIS)

    Li Shubing; Liang Qichang; Liu Tingjin

    1993-01-01

    The major functions and implementation of the software ERES (EXFOR Edit System) are introduced. ERES was developed for nuclear data compilation in the EXFOR (EXchange FORmat) format, running on IBM-PC/XT or IBM-PC/AT. EXFOR is the format for the exchange of experimental neutron data accepted by the four neutron data centers in the world.

  41. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1988-01-01

    Reliability data are an essential part of probabilistic safety assessment. The quality of the data can determine the quality of the study as a whole. It is obvious that component failure data originating from the plant being analyzed would be most appropriate. However, in few cases is complete reliance on plant experience possible, mainly because of the rather limited operating experience. Nuclear plants, although of different design, often use fairly similar components, so some of the experience can be combined and transferred from one plant to another. In addition, information about component failures is also available from experts with knowledge of component design, manufacturing and operation. That brings us to the importance of assessing generic data. (Generic here means everything that is not specific to the plant being analyzed.) The generic data available in the open literature can be divided into three broad categories. The first includes data bases used in previous analyses; these can be plant specific or generic data updated with plant-specific information (the latter case deserves special attention). The second is based on compilations of plant operating experience, usually drawn from some kind of event reporting system. The third category includes data sources based on expert opinions (single or aggregate) or a combination of expert opinions and other nuclear and non-nuclear experience. This paper reflects insights gained in compiling data from generic data sources and highlights advantages and pitfalls of using generic component reliability data in PSAs.
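
    The "generic data updated with plant-specific information" case can be illustrated with a standard Bayesian gamma-Poisson update of a failure rate, a common PSA technique assumed here for illustration (the paper itself does not prescribe it). All numbers are invented.

      # Generic prior: failure rate ~ Gamma(alpha, beta), mean alpha/beta per hour.
      alpha, beta = 2.0, 4.0e5          # prior mean 5.0e-6 / h (placeholder)

      # Plant-specific evidence: n failures observed over T component-hours.
      n_failures, hours = 1, 2.0e5

      # Conjugate update: the posterior is Gamma(alpha + n, beta + T).
      post_alpha = alpha + n_failures
      post_beta = beta + hours

      print(f"prior mean     : {alpha / beta:.2e} per hour")
      print(f"posterior mean : {post_alpha / post_beta:.2e} per hour")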

  42. Self-diffusion in electrolyte solutions: a critical examination of data compiled from the literature

    CERN Document Server

    Mills, R

    1989-01-01

    This compilation - the first of its kind - fills a real gap in the field of electrolyte data. Virtually all self-diffusion data in electrolyte solutions as reported in the literature have been examined and the book contains over 400 tables covering diffusion in binary and ternary aqueous solutions, in mixed solvents, and of non-electrolytes in various solvents. An important feature of the compilation is that all data have been critically examined and their accuracy assessed. Other features are an introductory chapter in which the methods of measurement are reviewed; appendices containing tables

  43. Data compilation of angular distributions of sputtered atoms

    International Nuclear Information System (INIS)

    Yamamura, Yasunori; Takiguchi, Takashi; Tawara, Hiro.

    1990-01-01

    Sputtering on a surface is generally caused by the collision cascade developed near the surface. The process is in principle the same as that causing radiation damage in the bulk of solids. Sputtering has long been regarded as an undesirable dirty effect which destroys the cathodes and grids in gas discharge tubes or ion sources and contaminates plasma and the surrounding walls. However, sputtering is used today for many applications such as sputter ion sources, mass spectrometers and the deposition of thin films. Plasma contamination and the surface erosion of first walls due to sputtering are still major problems in fusion research. The angular distribution of the particles sputtered from solid surfaces can provide detailed information on the collision cascade in the interior of targets. This report presents a compilation of the angular distributions of sputtered atoms at normal and oblique incidence for various combinations of incident ions and target atoms, with emphasis on monatomic solids. (K.I.)

  44. ANDEX. A PC software assisting the nuclear data compilation in EXFOR

    International Nuclear Information System (INIS)

    Osorio, V.

    1991-01-01

    This document describes the use of personal computer software ANDEX which assists the compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request, on a set of two diskettes, free of charge. (author)

  45. Neutron data compilation at the International Atomic Energy Agency

    International Nuclear Information System (INIS)

    Lemmel, H.D.; Attree, P.M.; Byer, T.A.; Good, W.M.; Hjaerne, L.; Konshin, V.A.; Lorens, A.

    1968-03-01

    The paper describes the present status of the neutron data compilation center of the IAEA Nuclear Data Unit, which is now in full operation. An outline is given of the principles and objectives, the working routines, and the services available within the two-fold functions of the Unit: a) to promote cooperation and international neutron data exchange between the four major centers at Brookhaven, Saclay, Obninsk and Vienna, which share responsibilities in a geographical distribution of labour; b) to collect systematically the neutron data arising from countries in East Europe, Asia, Australia, Africa, South and Central America and to offer certain services to these countries. A brief description of DASTAR, the DAta STorage And Retrieval system, and of CINDU, the data Catalog of the IAEA Nuclear Data Unit, is given. (author)

  46. Groundwater-quality data associated with abandoned underground coal mine aquifers in West Virginia, 1973-2016: Compilation of existing data from multiple sources

    Science.gov (United States)

    McAdoo, Mitchell A.; Kozar, Mark D.

    2017-11-14

    This report describes a compilation of existing water-quality data associated with groundwater resources originating from abandoned underground coal mines in West Virginia. Data were compiled from multiple sources for the purpose of understanding the suitability of groundwater from abandoned underground coal mines for public supply, industrial, agricultural, and other uses. This compilation includes data collected for multiple individual studies conducted from July 13, 1973 through September 7, 2016. Analytical methods varied by the time period of data collection and the requirements of the independent studies. This project identified 770 water-quality samples from 294 sites that could be attributed to abandoned underground coal mine aquifers originating from multiple coal seams in West Virginia.

  47. Compilation of nuclear decay data used for dose calculations. Data for radionuclides not listed in ICRP Publication 38

    Energy Technology Data Exchange (ETDEWEB)

    Endo, Akira; Yamaguchi, Yasuhiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Tamura, Tsutomu

    1999-07-01

    Nuclear decay data used for dose calculations were compiled for 162 nuclides with half-lives greater than or equal to 10 min that are not listed in ICRP Publication 38 (Publ. 38) and their 28 daughter nuclides. An additional 14 nuclides considered to be important in fusion reactor facilities were also included. The data were compiled using decay data sets of the Evaluated Nuclear Structure Data File (ENSDF), the latest version as of August 1997. The data sets were checked for consistency by referring to recent literature and NUBASE, the database for nuclear and decay properties of nuclides, and by using the ENSDF utility programs. Revisions of the data sets were made for format and syntax errors, level schemes, normalization records, and so on. The revised data sets were processed by EDISTR in order to calculate the energies and intensities of α particles, β particles, γ rays including annihilation photons, internal conversion electrons, X rays, and Auger electrons emitted in nuclear transformations of the radionuclides. For spontaneously fissioning nuclides, the average energies and intensities of neutrons, fission fragments, prompt γ rays, delayed γ rays, and β particles were also calculated. The compiled data are presented in two formats: the Publ. 38 format and the NUCDECAY format. This report provides the decay data in the Publ. 38 format along with decay scheme drawings. The data will be widely used for internal and external dose calculations in radiation protection. (author)
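
    A quantity such decay files ultimately feed into dosimetry is the mean energy emitted per nuclear transformation, the sum of intensity times energy over all emissions; the sketch below computes it for a made-up emission list (the radiation types, yields and energies are placeholders, not values from the report).

      emissions = [                    # (radiation type, yield per decay, energy in MeV)
          ("beta-", 0.89, 0.196),      # mean beta energy
          ("gamma", 0.85, 0.662),
          ("ce-K",  0.078, 0.624),     # internal conversion electron
      ]

      def mean_energy_per_decay(lines, kinds=None):
          """Sum yield * energy (MeV per transformation) over selected emissions."""
          return sum(y * e for kind, y, e in lines if kinds is None or kind in kinds)

      print(f"all emissions : {mean_energy_per_decay(emissions):.3f} MeV/decay")
      print(f"photons only  : {mean_energy_per_decay(emissions, {'gamma'}):.3f} MeV/decay")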

  48. Compilation of MCNP data library based on JENDL-3T and test through analysis of benchmark experiment

    International Nuclear Information System (INIS)

    Sakurai, K.; Sasamoto, N.; Kosako, K.; Ishikawa, T.; Sato, O.; Oyama, Y.; Narita, H.; Maekawa, H.; Ueki, K.

    1989-01-01

    Based on the evaluated nuclear data library JENDL-3T, a temporary version of JENDL-3, a pointwise neutron cross section library for the MCNP code was compiled, covering 39 nuclides from H-1 to Am-241 that are important for shielding calculations. The compilation was performed with a code system consisting of the nuclear data processing code NJOY-83 and the library compilation code MACROS. The validity of the code system and the reliability of the library are verified by analysing benchmark experiments. (author)

  49. Report of the Panel on Neutron Data Compilation. Brookhaven National Laboratory, USA, 10-14 February 1969

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1969-05-15

    After surveying current world needs for bibliographic and compilation activities in the field of neutron data, the report of this Panel of 31 individual technical experts considers the immediate and future role of the world's neutron data centres in this task. In Chapter V the Panel's findings are summarized in the form of recommendations directed to the centres and their associated national and international advisory committees, together with all users of the centres. The Panel's recommendations can be summarised as follows: a) The need for bibliographic indexing and numerical compilation of neutron data on an international basis has been clearly demonstrated and should continue for the foreseeable future; b) The operation of CINDA has been extremely satisfactory; c) Neutron data should be compiled at all energies by all centres subject to any mutually agreed exceptions and priorities; d) A fine-meshed classification scheme for neutron reactions should be formulated and put into use before the end of 1969 in accordance with the timetable; e) A scheme for associating a detailed statement of the main characteristics of each experiment with compilations of the resulting data should be formulated and put into preliminary operation before the end of 1969; f) The immediate primary tasks of the principal data centres are to complete the compilation of existing numerical data, whilst keeping abreast of new data, and to agree and implement an improved compilation, storage and retrieval system; g) Input of experimental data can be facilitated by specific measures; h) Centres should publish review publications which they believe will serve the user community; i) The centres should provide data to users in a variety of media: printed listings, graphs, paper tape, punched cards and magnetic tape - but should encourage standardization within each medium so as to free effort to meet special requirements of users having limited computer facilities; j) Centres should hold and

  50. Compilation of nucleon-nucleon and nucleon-antinucleon elastic scattering data

    International Nuclear Information System (INIS)

    Carter, M.K.; Collins, P.D.B.; Whalley, M.R.

    1986-01-01

    A compilation of the data on pp, pn, nn, p̄p, p̄n, n̄p, and n̄n is presented, in both tabular and graphical form, including when available the total and elastic cross sections, the differences of the total cross section in different spin states, the ratio of the real to imaginary part of the forward scattering amplitude, the elastic differential cross sections, the polarization asymmetry and the spin correlation parameters, for all laboratory-frame momenta ≥ 2 GeV/c. All the data in this review can be found in and retrieved from the Durham-RAL HEP data base together with data on a wide variety of other reactions. (author)

  51. JiTTree: A Just-in-Time Compiled Sparse GPU Volume Data Structure

    KAUST Repository

    Labschütz, Matthias; Bruckner, Stefan; Gröller, M. Eduard; Hadwiger, Markus; Rautek, Peter

    2015-08-12

    Sparse volume data structures enable the efficient representation of large but sparse volumes in GPU memory for computation and visualization. However, the choice of a specific data structure for a given data set depends on several factors, such as the memory budget, the sparsity of the data, and data access patterns. In general, there is no single optimal sparse data structure, but a set of several candidates with individual strengths and drawbacks. One solution to this problem is hybrid data structures which locally adapt themselves to the sparsity. However, they typically suffer from increased traversal overhead which limits their utility in many applications. This paper presents JiTTree, a novel sparse hybrid volume data structure that uses just-in-time compilation to overcome these problems. By combining multiple sparse data structures and reducing traversal overhead we leverage their individual advantages. We demonstrate that hybrid data structures adapt well to a large range of data sets. They are especially superior to other sparse data structures for data sets that locally vary in sparsity. Possible optimization criteria are memory, performance and a combination thereof. Through just-in-time (JIT) compilation, JiTTree reduces the traversal overhead of the resulting optimal data structure. As a result, our hybrid volume data structure enables efficient computations on the GPU, while being superior in terms of memory usage when compared to non-hybrid data structures.
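
    The core idea, choosing a per-block representation according to local sparsity, can be shown with a toy CPU sketch (block size, thresholds and the random data are arbitrary; JiTTree's actual contribution, GPU code generation via just-in-time compilation, is not reproduced here).

      import random

      BLOCK = 8                                    # 8 x 8 x 8 brick

      def build_block(values):
          """Represent a brick as constant-empty, dense list, or sparse dict."""
          nonzero = {i: v for i, v in enumerate(values) if v != 0.0}
          if not nonzero:
              return ("empty", None)
          if len(nonzero) / len(values) > 0.5:     # mostly filled: store densely
              return ("dense", list(values))
          return ("sparse", nonzero)               # mostly empty: store a dict

      def sample(block, index):
          kind, data = block
          if kind == "empty":
              return 0.0
          return data[index] if kind == "dense" else data.get(index, 0.0)

      random.seed(0)
      values = [random.random() if random.random() < 0.1 else 0.0
                for _ in range(BLOCK ** 3)]
      block = build_block(values)
      print(block[0], sample(block, 5) == values[5])   # -> sparse True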

  16. JiTTree: A Just-in-Time Compiled Sparse GPU Volume Data Structure

    KAUST Repository

    Labschutz, Matthias; Bruckner, Stefan; Groller, M. Eduard; Hadwiger, Markus; Rautek, Peter

    2015-01-01

    Sparse volume data structures enable the efficient representation of large but sparse volumes in GPU memory for computation and visualization. However, the choice of a specific data structure for a given data set depends on several factors, such as the memory budget, the sparsity of the data, and data access patterns. In general, there is no single optimal sparse data structure, but a set of several candidates with individual strengths and drawbacks. One solution to this problem is hybrid data structures, which locally adapt themselves to the sparsity. However, they typically suffer from increased traversal overhead, which limits their utility in many applications. This paper presents JiTTree, a novel sparse hybrid volume data structure that uses just-in-time compilation to overcome these problems. By combining multiple sparse data structures and reducing traversal overhead, we leverage their individual advantages. We demonstrate that hybrid data structures adapt well to a large range of data sets. They are especially superior to other sparse data structures for data sets that locally vary in sparsity. Possible optimization criteria are memory, performance and a combination thereof. Through just-in-time (JIT) compilation, JiTTree reduces the traversal overhead of the resulting optimal data structure. As a result, our hybrid volume data structure enables efficient computations on the GPU, while being superior in terms of memory usage when compared to non-hybrid data structures.

  17. JiTTree: A Just-in-Time Compiled Sparse GPU Volume Data Structure.

    Science.gov (United States)

    Labschütz, Matthias; Bruckner, Stefan; Gröller, M Eduard; Hadwiger, Markus; Rautek, Peter

    2016-01-01

    Sparse volume data structures enable the efficient representation of large but sparse volumes in GPU memory for computation and visualization. However, the choice of a specific data structure for a given data set depends on several factors, such as the memory budget, the sparsity of the data, and data access patterns. In general, there is no single optimal sparse data structure, but a set of several candidates with individual strengths and drawbacks. One solution to this problem is hybrid data structures, which locally adapt themselves to the sparsity. However, they typically suffer from increased traversal overhead, which limits their utility in many applications. This paper presents JiTTree, a novel sparse hybrid volume data structure that uses just-in-time compilation to overcome these problems. By combining multiple sparse data structures and reducing traversal overhead, we leverage their individual advantages. We demonstrate that hybrid data structures adapt well to a large range of data sets. They are especially superior to other sparse data structures for data sets that locally vary in sparsity. Possible optimization criteria are memory, performance and a combination thereof. Through just-in-time (JIT) compilation, JiTTree reduces the traversal overhead of the resulting optimal data structure. As a result, our hybrid volume data structure enables efficient computations on the GPU, while being superior in terms of memory usage when compared to non-hybrid data structures.

  18. Compilation of nuclear decay data used for dose calculation. Revised data for radionuclides listed in ICRP Publication 38

    International Nuclear Information System (INIS)

    Endo, Akira; Yamaguchi, Yasuhiro

    2001-03-01

    New nuclear decay data used for dose calculation have been compiled for 817 radionuclides that are listed in ICRP Publication 38 (Publ. 38) and for 6 additional isomers. The decay data were prepared using decay data sets from the Evaluated Nuclear Structure Data File (ENSDF), the latest version as of August 1997. Basic nuclear properties in the decay data sets that are particularly important for calculating energies and intensities of emissions were examined and updated by referring to NUBASE, the database for nuclear and decay properties of nuclides. The reviewed and updated data were the half-life, decay mode and its branching ratio, spin and parity of the ground and isomeric states, excitation energy of isomers, and Q value. In addition, partial and incomplete decay data sets were revised where possible for format and syntax errors, level schemes, normalization records, and so on. After that, the decay data sets were processed by EDISTR in order to compute the energies and intensities of α particles, β particles, γ rays, internal conversion electrons, X rays, and Auger electrons emitted in nuclear transformation. For spontaneously fissioning nuclides, the average energies and intensities of neutrons, fission fragments, prompt γ rays, delayed γ rays, and β particles were also calculated. The compiled data were prepared in two different formats: the Publ. 38 and NUCDECAY formats. A comparison of the compiled decay data with those in Publ. 38 is also presented. The decay data will be widely used for internal and external dose calculations in radiation protection and will be beneficial to a future revision of ICRP Publ. 38. (author)
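
    A minimal sketch of the kind of quantity a code such as EDISTR tabulates from a decay data set: the mean emission energy per decay, summed over an emission list of (energy, intensity) pairs. The values below are illustrative only, loosely based on the Cs-137/Ba-137m decay; they are not taken from the compilation itself.

      # Illustrative mean photon energy per decay from (energy in keV,
      # intensity per decay) pairs; numbers are approximate examples.
      emissions = [
          (661.66, 0.851),  # gamma ray
          (32.19, 0.036),   # Ba K X-ray (intensity here is illustrative)
      ]
      mean_energy_kev = sum(energy * intensity for energy, intensity in emissions)
      print(f"mean photon energy per decay: {mean_energy_kev:.1f} keV")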

  19. Compilation of nuclear decay data used for dose calculation. Revised data for radionuclides listed in ICRP Publication 38

    Energy Technology Data Exchange (ETDEWEB)

    Endo, Akira; Yamaguchi, Yasuhiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]

    2001-03-01

    New nuclear decay data used for dose calculation have been compiled for 817 radionuclides that are listed in ICRP Publication 38 (Publ. 38) and for 6 additional isomers. The decay data were prepared using decay data sets from the Evaluated Nuclear Structure Data File (ENSDF), the latest version as of August 1997. Basic nuclear properties in the decay data sets that are particularly important for calculating energies and intensities of emissions were examined and updated by referring to NUBASE, the database for nuclear and decay properties of nuclides. The reviewed and updated data were the half-life, decay mode and its branching ratio, spin and parity of the ground and isomeric states, excitation energy of isomers, and Q value. In addition, partial and incomplete decay data sets were revised where possible for format and syntax errors, level schemes, normalization records, and so on. After that, the decay data sets were processed by EDISTR in order to compute the energies and intensities of α particles, β particles, γ rays, internal conversion electrons, X rays, and Auger electrons emitted in nuclear transformation. For spontaneously fissioning nuclides, the average energies and intensities of neutrons, fission fragments, prompt γ rays, delayed γ rays, and β particles were also calculated. The compiled data were prepared in two different formats: the Publ. 38 and NUCDECAY formats. A comparison of the compiled decay data with those in Publ. 38 is also presented. The decay data will be widely used for internal and external dose calculations in radiation protection and will be beneficial to a future revision of ICRP Publ. 38. (author)

  20. Compiler issues associated with safety-related software

    International Nuclear Information System (INIS)

    Feinauer, L.R.

    1991-01-01

    A critical issue in the quality assurance of safety-related software is the ability of the software to produce identical results, independent of the host machine, operating system, or compiler version under which the software is installed. A study is performed using the VIPRE-01, FREY-01, and RETRAN-02 safety-related codes. Results from an IBM 3083 computer are compared with results from a CYBER 860 computer. All three of the computer programs examined are written in FORTRAN; the VIPRE code uses the FORTRAN 66 compiler, whereas the FREY and RETRAN codes use the FORTRAN 77 compiler. Various compiler options are studied to determine their effect on the output between machines. Since the Control Data Corporation and IBM machines inherently represent numerical data differently, methods of producing equivalent accuracy of data representation were an important focus of the study. This paper identifies particular problems in the automatic double-precision option (AUTODBL) of the IBM FORTRAN 1.4.x series of compilers. The IBM FORTRAN version 2 compilers provide much more stable, reliable compilation for engineering software. Careful selection of compilers and compiler options can help guarantee identical results between different machines. To ensure reproducibility of results, the same compiler and compiler options should be used to install the program as were used in the development and testing of the program
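
    A small reminder, not taken from the paper itself, of one reason different compilers and machines produce different numerical results: floating-point addition is not associative, so any optimization or precision option that changes evaluation order can change the output.

      # Same three numbers, two summation orders, two different results.
      vals_a = [1e16, 1.0, -1e16]
      vals_b = [1e16, -1e16, 1.0]
      print(sum(vals_a))  # 0.0 (the 1.0 is absorbed into 1e16 and lost)
      print(sum(vals_b))  # 1.0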

  1. Compilations and evaluations of data on the interaction of electromagnetic radiation with matter

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-05-01

    The material contained in this report deals with data on the interaction of electromagnetic radiation with matter, listing major compilations of X-ray, photon and gamma-ray cross sections and attenuation coefficients, as well as selected reports featuring data on Compton scattering, photoelectric absorption and pair production

  2. HAL/S-FC compiler system functional specification

    Science.gov (United States)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package, as well as restrictions and dependencies of the HAL/S-FC system, are also considered.

  3. Data compilation of respiration, feeding, and growth rates of marine pelagic organisms

    DEFF Research Database (Denmark)

    2013-01-01

    's adaptation to the environment, with consequently less universal mass scaling properties. Data on body mass, maximum ingestion and clearance rates, respiration rates and maximum growth rates of animals living in the ocean epipelagic were compiled from the literature, mainly from original papers but also from...
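
    The mass-scaling analysis this record refers to is conventionally done by fitting the allometric law rate = a * mass**b in log-log space. The sketch below performs such a fit on synthetic data; the exponent, noise level and sample size are invented for illustration and are not values from the compilation.

      import numpy as np

      # Fit the allometric exponent b by least squares in log-log space.
      rng = np.random.default_rng(0)
      mass = np.logspace(-3, 3, 50)                         # body mass (arbitrary units)
      rate = 2.0 * mass**0.75 * rng.lognormal(0.0, 0.2, 50) # noisy 3/4-power law
      b, log_a = np.polyfit(np.log(mass), np.log(rate), 1)
      print(f"fitted exponent b = {b:.2f}, prefactor a = {np.exp(log_a):.2f}")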

  4. Compilation of properties data for Li2TiO3

    Energy Technology Data Exchange (ETDEWEB)

    Roux, N. [CEA Centre d'Etudes de Saclay, 91 - Gif-sur-Yvette (France)]

    1998-03-01

    Properties data obtained at CEA for Li2TiO3 are reported. The compilation includes: stability of the Li2TiO3 β phase, specific heat, thermal diffusivity, thermal conductivity, linear thermal expansion, thermal creep, and interaction with water and acid. (author)

  5. Data compilation and assessment for water resources in Pennsylvania state forest and park lands

    Science.gov (United States)

    Galeone, Daniel G.

    2011-01-01

    As a result of a cooperative study between the U.S. Geological Survey and the Pennsylvania Department of Conservation and Natural Resources (PaDCNR), available electronic data were compiled for Pennsylvania state lands (state forests and parks) to allow PaDCNR to initially determine if data exist to make an objective evaluation of water resources for specific basins. The data compiled included water-quantity and water-quality data and sample locations for benthic macroinvertebrates within state-owned lands (including a 100-meter buffer around each land parcel) in Pennsylvania. In addition, internet links or contacts for geographic information system coverages pertinent to water-resources studies also were compiled. Water-quantity and water-quality data primarily available through January 2007 were compiled and summarized for site types that included streams, lakes, ground-water wells, springs, and precipitation. Data were categorized relative to 35 watershed boundaries defined by the Pennsylvania Department of Environmental Protection for resource-management purposes. The primary sources of continuous water-quantity data for Pennsylvania state lands were the U.S. Geological Survey (USGS) and the National Weather Service (NWS). The USGS has streamflow data for 93 surface-water sites located in state lands; 38 of these sites have continuous-recording data available. As of January 2007, 22 of these 38 streamflow-gaging stations were active; the majority of active gaging stations have over 40 years of continuous record. The USGS database also contains continuous ground-water elevation data for 32 wells in Pennsylvania state lands, 18 of which were active as of January 2007. Sixty-eight active precipitation stations (primarily from the NWS network) are located in state lands. The four sources of available water-quality data for Pennsylvania state lands were the USGS, U.S. Environmental Protection Agency, Pennsylvania Department of Environmental Protection (PaDEP), and

  6. A compilation of structural property data for computer impact calculation (1/5)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi; Nagata, Norio.

    1988-10-01

    The paper describes structural property data for computer impact calculations of nuclear fuel shipping casks. Data for four kinds of material (mild steel, stainless steel, lead and wood) are compiled; these materials are the main structural elements of shipping casks. Structural data such as the coefficient of thermal expansion, the modulus of longitudinal elasticity, the modulus of transverse elasticity, the Poisson's ratio, and stress-strain relationships have been tabulated against temperature or strain rate. This volume (1/5) contains the structural property data and the data-processing computer program. (author)
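
    In an impact code, tabulated properties of this kind are typically evaluated by interpolation against temperature or strain rate. The sketch below interpolates Young's modulus from a small temperature table; the numbers are rough illustrative values for mild steel, not data from the report.

      import numpy as np

      # Linear interpolation of a tabulated material property.
      temp_c = np.array([20.0, 100.0, 200.0, 300.0, 400.0])   # temperature, degC
      e_gpa = np.array([206.0, 201.0, 193.0, 185.0, 175.0])   # Young's modulus, approx.
      print(np.interp(250.0, temp_c, e_gpa))                  # modulus at 250 degC, ~189 GPa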

  7. A compilation of structural property data for computer impact calculation (5/5)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1988-10-01

    The paper describes structural property data for computer impact calculations of nuclear fuel shipping casks. Data for four kinds of material (mild steel, stainless steel, lead and wood) are compiled; these materials are the main structural elements of shipping casks. Structural data such as the coefficient of thermal expansion, the modulus of longitudinal elasticity, the modulus of transverse elasticity, the Poisson's ratio, and stress-strain relationships have been tabulated against temperature or strain rate. This volume (5/5) contains the structural property data for wood. (author)

  8. A compilation of structural property data for computer impact calculation (3/5)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1988-10-01

    The paper describes structural property data for computer impact calculations of nuclear fuel shipping casks. Data for four kinds of material (mild steel, stainless steel, lead and wood) are compiled; these materials are the main structural elements of shipping casks. Structural data such as the coefficient of thermal expansion, the modulus of longitudinal elasticity, the modulus of transverse elasticity, the Poisson's ratio, and stress-strain relationships have been tabulated against temperature or strain rate. This volume (3/5) contains the structural property data for stainless steel. (author)

  9. A compilation of structural property data for computer impact calculation (2/5)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1988-10-01

    The paper describes structural property data for computer impact calculations of nuclear fuel shipping casks. Data for four kinds of material (mild steel, stainless steel, lead and wood) are compiled; these materials are the main structural elements of shipping casks. Structural data such as the coefficient of thermal expansion, the modulus of longitudinal elasticity, the modulus of transverse elasticity, the Poisson's ratio, and stress-strain relationships have been tabulated against temperature or strain rate. This volume (2/5) contains the structural property data for mild steel. (author)

  10. Hydrogeological conditions in the Finnsjoen area. Compilation of data and conceptual model

    International Nuclear Information System (INIS)

    Andersson, J.E.; Nordqvist, R.; Nyberg, G.; Smellie, J.; Tiren, S.

    1991-02-01

    In the present report, all available data gathered from the Finnsjoen area that are of potential use for numerical modelling are compiled and discussed. The data were collected during different phases of the period 1977-1989. This inevitably means that the quality of the measured and interpreted data varies in accordance with the continuous development of improved equipment and interpretation techniques. The present report is an updated version of the SKB progress report 89-24 with the same title and authors, see introduction. (au)

  11. Compilation and evaluation of atomic and molecular data relevant to controlled thermonuclear research needs: USA programs

    International Nuclear Information System (INIS)

    Barnett, C.F.

    1976-01-01

    The U.S. role in the compilation and evaluation of atomic data for controlled thermonuclear research is discussed in the following three areas: (1) atomic structure data, (2) atomic collision data, and (3) surface data

  12. A compilation of energy costs of physical activities.

    Science.gov (United States)

    Vaz, Mario; Karaolis, Nadine; Draper, Alizon; Shetty, Prakash

    2005-10-01

    There were two objectives: first, to review the existing data on energy costs of specified activities in the light of the recommendations made by the Joint Food and Agriculture Organization/World Health Organization/United Nations University (FAO/WHO/UNU) Expert Consultation of 1985; second, to compile existing data on the energy costs of physical activities for an updated annexure of the current Expert Consultation on Energy and Protein Requirements. An electronic and manual search of the literature (predominantly English) was made to obtain published data on the energy costs of physical activities. The majority of the data prior to 1955 were obtained from an earlier compilation by Passmore and Durnin. Energy costs were expressed as the physical activity ratio (PAR): the energy cost of the activity divided by either the measured or predicted basal metabolic rate (BMR). The compilation provides PARs for an expanded range of activities that include general personal activities, transport, domestic chores, occupational activities, sports and other recreational activities for men and women, separately, where available. The present compilation is largely in agreement with the 1985 compilation for activities that are common to both. It has been based on the need to provide data on adults for a wide spectrum of human activity. There are, however, lacunae in the available data for many activities, between genders, across age groups and in various physiological states.
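
    A worked example of the PAR definition given above, with hypothetical numbers: if an activity costs 14.7 kJ/min and the subject's BMR is 4.9 kJ/min, the activity's PAR is 3.0, i.e. three times the basal rate.

      # PAR = energy cost of activity / basal metabolic rate (hypothetical values).
      bmr_kj_per_min = 4.9
      activity_kj_per_min = 14.7
      par = activity_kj_per_min / bmr_kj_per_min
      print(f"PAR = {par:.1f}")  # 3.0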

  13. Principal facts for gravity data collected in the southern Albuquerque Basin area and a regional compilation, central New Mexico

    Science.gov (United States)

    Gillespie, Cindy L.; Grauch, V.J.S.; Oshetski, Kim; Keller, Gordon R.

    2000-01-01

    Principal facts for 156 new gravity stations in the southern Albuquerque basin are presented. These data fill a gap in existing data coverage. The compilation of the new data and two existing data sets into a regional data set of 5562 stations that covers the Albuquerque basin and vicinity is also described. Bouguer anomaly and isostatic residual gravity data for this regional compilation are available in digital form from ftp://greenwood.cr.usgs.gov/pub/openfile-reports/ofr-00-490.
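
    For orientation, the simple Bouguer anomaly mentioned above follows the standard textbook reduction sketched below (free-air gradient 0.3086 mGal/m and Bouguer slab factor 2*pi*G*rho). The USGS processing applies further terrain and isostatic corrections not shown here, and the station values in the call are invented.

      # Simple Bouguer anomaly in mGal; elevation in m, density in g/cm^3.
      def simple_bouguer_anomaly(g_obs, g_theoretical, elev_m, density=2.67):
          free_air_corr = 0.3086 * elev_m          # free-air correction, mGal
          slab_corr = 0.04193 * density * elev_m   # Bouguer slab, mGal
          return g_obs - g_theoretical + free_air_corr - slab_corr

      print(simple_bouguer_anomaly(978940.0, 979000.0, 1500.0))  # invented station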

  14. Animal mortality resulting from uniform exposures to photon radiations: Calculated LD50s and a compilation of experimental data

    International Nuclear Information System (INIS)

    Jones, T.D.; Morris, M.D.; Wells, S.M.; Young, R.W.

    1986-12-01

    Studies conducted during the 1950s and 1960s of radiation-induced mortality to diverse animal species under various exposure protocols were compiled into a mortality data base. Some 24 variables were extracted and recomputed from each of the published studies, which were collected from a variety of available sources, primarily journal articles. Two features of this compilation effort are (1) an attempt to give an estimate of the uniform dose received by the bone marrow in each treatment so that interspecies differences due to body size were minimized and (2) a recomputation of the LD50 where sufficient experimental data are available. Exposure rates varied in magnitude from about 10^-2 to 10^3 R/min. This report describes the data base, the sources of data, and the data-handling techniques; presents a bibliography of studies compiled; and tabulates data from each study. 103 refs., 44 tabs
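
    A minimal sketch of one way an LD50 can be recomputed from dose-mortality pairs: interpolate the 50% point on a log-dose scale. The data below are synthetic, and published analyses of this kind often use probit or logistic fits rather than simple interpolation; the report's exact statistical method may differ.

      import numpy as np

      # Synthetic dose (bone-marrow dose, Gy) versus observed mortality fraction.
      dose_gy = np.array([1.0, 2.0, 3.0, 4.0, 6.0])
      mortality = np.array([0.05, 0.20, 0.50, 0.80, 0.98])
      # Interpolate log-dose at 50% mortality.
      ld50 = np.exp(np.interp(0.5, mortality, np.log(dose_gy)))
      print(f"LD50 ~ {ld50:.2f} Gy")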

  15. Compilation of data for radionuclide transport analysis

    International Nuclear Information System (INIS)

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other selected scenarios. The data required by the selected near-field, geosphere and biosphere models are given, and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate-level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed, and the uncertainty is qualitatively addressed for data to the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic, in order not to underestimate the radionuclide release rates. It is judged that this approach, combined with the selected calculation cases, will illustrate the effects of uncertainties in the processes and events that affect the evolution of the system, as well as in the quantitative data that describe it. The biosphere model allows for probabilistic calculations, and the uncertainty in input data is quantified by giving minimum, maximum and mean values as well as the type of probability distribution function
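
    A hedged sketch of the probabilistic treatment described above: a biosphere parameter given as (minimum, best estimate, maximum) with a stated distribution type can be sampled and propagated through the dose model. The distribution choice, parameter values and the linear dose model below are all invented for illustration.

      import numpy as np

      rng = np.random.default_rng(42)
      # Hypothetical parameter given as (min, mode, max), sampled from a
      # triangular distribution and pushed through a toy linear dose model.
      transfer_factor = rng.triangular(left=0.01, mode=0.1, right=1.0, size=10_000)
      release_bq_per_year = 1.0e6
      dose_sv = transfer_factor * release_bq_per_year * 1.0e-10
      print(f"mean {dose_sv.mean():.2e} Sv, 95th percentile {np.percentile(dose_sv, 95):.2e} Sv")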

  16. Compilation of data for radionuclide transport analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other selected scenarios. The data required by the selected near-field, geosphere and biosphere models are given, and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate-level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed, and the uncertainty is qualitatively addressed for data to the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic, in order not to underestimate the radionuclide release rates. It is judged that this approach, combined with the selected calculation cases, will illustrate the effects of uncertainties in the processes and events that affect the evolution of the system, as well as in the quantitative data that describe it. The biosphere model allows for probabilistic calculations, and the uncertainty in input data is quantified by giving minimum, maximum and mean values as well as the type of probability distribution function.

  17. Compilation of floristic and herbarium specimen data in Iran: proposal to data structure

    Directory of Open Access Journals (Sweden)

    Majid Sharifi-Tehrani

    2013-09-01

    Floristic databases constitute the second level of plant information systems, after taxonomic-nomenclatural databases. This paper provides the details of a data structure and the available data resources for developing a floristic database, along with some explanation of taxonomic and floristic databases. It also proposes a shortcut to constructing a national floristic database by unifying and compiling the dispersed floristic data held in the various botanical centers of Iran. Iran could thereby become the second country in the SW Asia region to have a national floristic database, and the resulting services could be offered to the national scientific community.

  18. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    The compilation of business files for an enterprise is a distillation and re-creation of its spiritual wealth, from which applicable information can be made available to those who want to use it in a fast, extensive and precise way. Starting from the effects of business files compilation on scientific research, production construction and development, this paper discusses in five points how to define topics, analyse historical materials, search and select data, and process them into an enterprise archives collection: first, it expounds the importance and necessity of business files compilation in the production, operation and development of a company; second, it presents processing methods from topic definition, material searching and data selection to final examination and correction; third, it defines principles and classifications so that different categories and levels of processing methods are available for business files compilation; fourth, it discusses the specific method of implementing a file compilation through a documentation collection, on the principle that topic definition should be geared to demand; fifth, it addresses the application of information technology to business files compilation, in view of the wide need for business files, so as to raise the level of enterprise archives management. The present discussion focuses on the examination and correction principles of enterprise historical material compilation and on the basic classifications as well as the major forms of business files compilation achievements. (author)

  19. Mineralogy and geochemistry of rocks and fracture fillings from Forsmark and Oskarshamn: Compilation of data for SR-Can

    Energy Technology Data Exchange (ETDEWEB)

    Drake, Henrik; Sandstroem, Bjoern [Isochron GeoConsulting HB, Goeteborg (Sweden)]; Tullborg, Eva-Lena [Terralogica AB, Graabo (Sweden)]

    2006-11-15

    This report is a compilation of the data available so far for the safety assessment SR-Can carried out by SKB. The data consist of mineralogy, geochemistry, porosity, density and redox properties for both the dominating rock types and the fracture fillings at the Forsmark and Oskarshamn candidate areas. In addition to the compilation of existing information, the aim has been to identify missing data and to clarify some concepts, e.g. that of deformation zones. The objective of the report is to present the available data requested for the modelling of the chemical stability of the two sites. The report includes no interpretation of the data.

  20. GfW-handbook for data compilation of irradiation tested electronic components

    International Nuclear Information System (INIS)

    Wulf, F.; Braeunig, D.; Gaebler, W.

    1981-06-01

    The present second edition of the Data Compilation of Irradiation Tested Electronic Components is a continuation of the first edition and is published as a loose-leaf handbook. In addition to the 190 reports provided in the first issue, the present handbook contains a further 44 test reports on currently used semiconductor devices, in a comprehensive but easy-to-handle graphical and tabular presentation. Statistical values are given in order to facilitate the evaluation of component lifetimes in a radiation environment. (orig.) [de

  1. Los Alamos geostationary orbit synoptic data set: a compilation of energetic particle data

    International Nuclear Information System (INIS)

    Baker, D.N.; Higbie, P.R.; Belian, R.D.; Aiello, W.P.; Hones, E.W. Jr.; Tech, E.R.; Halbig, M.F.; Payne, J.B.; Robinson, R.; Kedge, S.

    1981-08-01

    Energetic electron (30 to 2000 keV) and proton (145 keV to 150 MeV) measurements made by Los Alamos National Laboratory sensors at geostationary orbit (6.6 R_E) are summarized. The data are plotted in terms of daily average spectra, 3-h local time averages, and in a variety of statistical formats. The data summarize conditions from mid-1976 through 1978 (S/C 1976-059) and from early 1977 through 1978 (S/C 1977-007). The compilations correspond to measurements at 35°W, 70°W, and 135°W geographic longitude and, thus, are indicative of conditions at 9°, 11°, and 4.8° geomagnetic latitude, respectively. Most of this report is comprised of data plots that are organized according to Carrington solar rotations so that the data can be easily compared to solar rotation-dependent interplanetary data. As shown in prior studies, variations in solar wind conditions modulate particle intensity within the terrestrial magnetosphere. The effects of these variations are demonstrated and discussed. Potential uses of the Synoptic Data Set by the scientific and applications-oriented communities are also discussed

  2. VFC: The Vienna Fortran Compiler

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    1999-01-01

    High Performance Fortran (HPF) offers an attractive high-level language interface for programming scalable parallel architectures, providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC), a new source-to-source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influences a program's performance. This comprises data locality assertions, non-local access specifications and the possibility of reusing runtime-generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high-level data-parallel language such as HPF+ performance close to that of hand-written message-passing programs can be achieved even for highly irregular codes.
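
    A minimal sketch of the index arithmetic behind the kind of BLOCK data distribution an HPF-style compiler derives from a distribution directive: which contiguous slice of a length-n array each of p processors owns. The function name is ours for illustration, not part of VFC.

      # Contiguous BLOCK distribution of n elements over p processors.
      def block_ranges(n, p):
          base, rem = divmod(n, p)
          ranges, start = [], 0
          for rank in range(p):
              size = base + (1 if rank < rem else 0)  # first `rem` ranks get one extra
              ranges.append((start, start + size))
              start += size
          return ranges

      print(block_ranges(10, 3))  # [(0, 4), (4, 7), (7, 10)]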

  3. Compiling the functional data-parallel language SaC for Microgrids of Self-Adaptive Virtual Processors

    NARCIS (Netherlands)

    Grelck, C.; Herhut, S.; Jesshope, C.; Joslin, C.; Lankamp, M.; Scholz, S.-B.; Shafarenko, A.

    2009-01-01

    We present preliminary results from compiling the high-level, functional and data-parallel programming language SaC into a novel multi-core design: Microgrids of Self-Adaptive Virtual Processors (SVPs). The side-effect free nature of SaC in conjunction with its data-parallel foundation make it an

  4. Compilation of benchmark results for fusion related Nuclear Data

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Wada, Masayuki; Oyama, Yukio; Ichihara, Chihiro; Makita, Yo; Takahashi, Akito

    1998-11-01

    This report compiles the results of benchmark tests for validation of evaluated nuclear data to be used in nuclear designs of fusion reactors. Some of the results were obtained through the activities of the Fusion Neutronics Integral Test Working Group, organized by members of both the Japan Nuclear Data Committee and the Reactor Physics Committee. The following three benchmark experiments were employed for the tests: (i) the leakage neutron spectrum measurement experiments from slab assemblies at the D-T neutron source at FNS/JAERI, (ii) in-situ neutron and gamma-ray measurement experiments (so-called clean benchmark experiments), also at FNS, and (iii) the pulsed sphere experiments for leakage neutron and gamma-ray spectra at the D-T neutron source facility of Osaka University, OKTAVIAN. The evaluated nuclear data tested were JENDL-3.2, JENDL Fusion File, FENDL/E-1.0 and newly selected data for FENDL/E-2.0. Comparisons of benchmark calculations with the experiments for twenty-one elements, i.e., Li, Be, C, N, O, F, Al, Si, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zr, Nb, Mo, W and Pb, are summarized. (author). 65 refs

  5. Status on the compilation of nuclear data for medical radioisotopes produced by accelerators

    International Nuclear Information System (INIS)

    Gandarias-Cruz, D.; Okamoto, K.

    1988-10-01

    The status of data on excitation functions and thick-target yields for medical radioisotopes produced by accelerators is summarized. Most of the information was extracted from the compiled data in EXFOR (EXCHANGE FORMAT), which is a common format used by the co-operating nuclear data centres in the world. The nuclear decay mode, half-life, production method, Q-value, maximum cross-section value and the energy at this maximum are tabulated. For some commonly used reactions, the available excitation functions are plotted in graphs. (author). 353 refs.

  6. Animal mortality resulting from uniform exposures to photon radiations: Calculated LD50s and a compilation of experimental data

    Energy Technology Data Exchange (ETDEWEB)

    Jones, T.D.; Morris, M.D.; Wells, S.M.; Young, R.W.

    1986-12-01

    Studies conducted during the 1950s and 1960s of radiation-induced mortality to diverse animal species under various exposure protocols were compiled into a mortality data base. Some 24 variables were extracted and recomputed from each of the published studies, which were collected from a variety of available sources, primarily journal articles. Two features of this compilation effort are (1) an attempt to give an estimate of the uniform dose received by the bone marrow in each treatment so that interspecies differences due to body size were minimized and (2) a recomputation of the LD50 where sufficient experimental data are available. Exposure rates varied in magnitude from about 10^-2 to 10^3 R/min. This report describes the data base, the sources of data, and the data-handling techniques; presents a bibliography of studies compiled; and tabulates data from each study. 103 refs., 44 tabs.

  7. Extending R packages to support 64-bit compiled code: An illustration with spam64 and GIMMS NDVI3g data

    Science.gov (United States)

    Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard

    2017-07-01

    Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is, besides performance, the limited access to low-level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. On the R side, users do not need to change existing code and may not even notice the extension. On the other hand, interfacing 64-bit compiled code efficiently is challenging. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.
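
    The same general idea, shown here in Python's ctypes rather than R's foreign function interface: when an interpreted language calls compiled code, the 64-bit types must be declared explicitly at the boundary. This sketch calls libc's memset and assumes a Unix-like system where the C library can be located; it is an analogy, not the spam64 mechanism itself.

      import ctypes
      import ctypes.util

      # Locate and load the C library (Unix-like systems).
      libc = ctypes.CDLL(ctypes.util.find_library("c"))

      # Declare the interface with an explicit 64-bit size type:
      # memset(void *s, int c, size_t n); size_t is 64-bit on LP64 platforms.
      libc.memset.argtypes = [ctypes.c_void_p, ctypes.c_int, ctypes.c_size_t]
      libc.memset.restype = ctypes.c_void_p

      buf = ctypes.create_string_buffer(16)
      libc.memset(buf, ord("A"), ctypes.c_size_t(16))
      print(buf.raw)  # 16 bytes of b'A'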

  8. Compilation of reactor-physical data of the AVR experimental reactor for 1982

    International Nuclear Information System (INIS)

    Werner, H.; Wawrzik, U.; Grotkamp, T.; Buettgen, I.

    1983-12-01

    Since the end of 1981 the calculation model AVR-80 has been taken as the basis for compiling reactor-physical data of the AVR experimental reactor. A brief outline of the operating history of 1982 is given, including the beginning of a large-scale experiment dealing with the change-over from highly enriched to low-enriched uranium. Calculations relating to spectral shift, diffusion, temperature, burnup, and recirculation of the fuel elements are described in brief. The essential results of neutron-physical and thermodynamic calculations and the characteristic data of the various types of fuel used are shown in tables and illustrations. (RF) [de

  9. Irradiation of strawberries. A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1994-12-01

    The document contains a compilation of all available scientific and technical data on the irradiation of strawberries. It is intended to assist governments in considering the authorization of this particular application of radiation processing of food and in ensuring its control in the facility and the control of irradiated food products moving in trade. The compilation was prepared in response to the requirement of the Codex General Standard for Irradiated Foods and associated Code that radiation treatment of food be justified on the basis of a technological need or of a need to improve the hygienic quality of food. It was prepared also in response to the recommendations of the FAO/IAEA/WHO/ITC-UNCTAD/GATT International conference on the Acceptance, Control of and Trade in Irradiated Food (Geneva, 1989) concerning the need for regulatory control of radiation processing of food. Refs, 1 tab

  10. Compilation of reactor physics data of the year 1984, AVR reactor

    International Nuclear Information System (INIS)

    Werner, H.; Bergerfurth, A.; Thomas, F.; Geskes, B.

    1985-12-01

    The 'AVR reactor physics data' is a compilation published once a year; the data presented are obtained by a simulation of reactor operation using the AVR-80 numerical model. This model is constantly updated and improved in response to new results and developments in the field of reactor theory and thermohydraulics, and in response to theoretical or practical modifications of reactor operation or of the computer system. The large variety of measured data available in the AVR reactor simulation system also makes it an ideal testing system for verification of the computing programs presented in the compilation. A survey of the history of operations in 1984 and a short explanation of the computerized simulation methods are followed by tables and graphs that serve as a source of topical data for readers interested in the physics of high-temperature pebble-bed reactors. (orig./HP) [de

  11. Compilation of reactor physics data of the year 1983, AVR reactor

    International Nuclear Information System (INIS)

    Werner, H.; Bergerfurth, A.; Thomas, F.; Geskes, B.

    1985-06-01

    The 'AVR reactor physics data' is a compilation published once a year; the data presented are obtained by a simulation of reactor operation using the AVR-80 numerical model. This model is constantly updated and improved in response to new results and developments in the field of reactor theory and thermohydraulics, and in response to theoretical or practical modifications of reactor operation or of the computer system. The large variety of measured data available in the AVR reactor simulation system also makes it an ideal testing system for verification of the computing programs presented in the compilation. A survey of the history of operations in 1983 and a short explanation of the computerized simulation methods are followed by tables and graphs that serve as a source of topical data for readers interested in the physics of high-temperature pebble-bed reactors. (orig./HP) [de

  12. Neutron data compilation. Report of a Panel sponsored by the International Atomic Energy Agency and held in Brookhaven, 10-14 February 1969

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1969-02-15

    The IAEA organized and convened a Panel on Neutron Data Compilation. This Panel was organized by the Agency following the recommendations made by the International Nuclear Data Committee (INDC), which agreed that a general review of world neutron data compilation activities was desirable. In this context, neutron data compilation encompasses the collection, storage and dissemination of bibliographic information and of qualitative and numerical data on the interaction of neutrons with nuclei and atoms for all incident energies. Such information and data have important applications in low energy neutron physics and many important areas of nuclear technology. The principal objective of the Panel on Neutron Data Compilation, which was held at Brookhaven National Laboratory during 10-14 February 1969, was to review how the world's principal data centres located at Brookhaven, Saclay, Obninsk and Vienna could ideally meet the demands and needs of experimental and theoretical neutron physicists, evaluators, reactor physicists, as well as other existing and potential users. Fourteen papers were considered during formal sessions of the Panel and are reported on the following pages. The members of the Panel separated into five working groups to consider specific terms of reference and make recommendations. Their reports were discussed.

  13. Compiler Feedback using Continuous Dynamic Compilation during Development

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Probst, Christian W.

    2014-01-01

    to optimization. This tool can help programmers understand what the optimizing compiler has done and suggest automatic source code changes in cases where the compiler refrains from optimizing. We have integrated our tool into an integrated development environment, interactively giving feedback as part...

  14. ALGOL compiler. Syntax and semantic analysis

    International Nuclear Information System (INIS)

    Tarbouriech, Robert

    1971-01-01

    In this research thesis, the author reports the development of an ALGOL compiler which performs the following main tasks: systematic scan of the source programme to recognise its different components (identifiers, reserved words, constants, separators); analysis of the source programme structure to build up its statements and arithmetic expressions; processing of symbolic names (identifiers) to associate them with the values they represent; and memory allocation for data and programme. Several issues are thus addressed: the characteristics of the machine for which the compiler is developed; the exact definition of the language (grammar, identifier and constant formation); the syntax-processing programme that provides the compiler with the necessary elements (language vocabulary, precedence matrix); and a description of the first two phases of compilation, lexical analysis and syntax analysis. The last phase (machine-code generation) is not addressed
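
    A toy illustration of the lexical phase described in this record: classifying source text into reserved words, identifiers, constants and separators. The reserved-word list below is a small ALGOL-like subset chosen for illustration, not the thesis's actual table.

      import re

      RESERVED = {"begin", "end", "if", "then", "else", "for", "do"}
      # Three alternatives: integer constant, word, or separator symbol.
      TOKEN = re.compile(r"(\d+)|([A-Za-z][A-Za-z0-9]*)|(:=|[+\-*/;()])")

      def tokenize(src):
          for number, word, sep in TOKEN.findall(src):
              if number:
                  yield ("constant", number)
              elif word:
                  yield ("reserved" if word in RESERVED else "identifier", word)
              else:
                  yield ("separator", sep)

      print(list(tokenize("begin x := x + 1; end")))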

  15. A compilation of experimental burnout data for axial flow of water in rod bundles

    International Nuclear Information System (INIS)

    Chapman, A.G.; Carrard, G.

    1981-02-01

    A compilation has been made of burnout (critical heat flux) data from the results of more than 12,000 tests on 321 electrically heated, water-cooled experimental assemblies, each simulating, to some extent, the operating or postulated accident conditions in the fuel elements of water-cooled nuclear power reactors. The main geometric characteristics of the assemblies are listed, and references are given for the sources of information from which the data were gathered

  16. Radioactive waste management profiles. Compilation of data from the waste management data base. No. 2, April 1994

    International Nuclear Information System (INIS)

    1994-01-01

    In 1989, the International Atomic Energy Agency began development of the Waste Management Data Base (WMDB) to, primarily, establish a mechanism for the collection, integration, storage, and retrieval of information relevant to radioactive waste management in Member States. This current report is a summary and compilation of the data received during the 1991 biennial update, which covers the period of January 1991 through March 1993. This Profile report is divided into two main parts. One part describes the Waste Management Data Base system and the type of information it contains. The second part contains data provided by Member States in response to the IAEA's 1991 WMDB Questionnaire. This report also contains data on Member States that did not report to the Questionnaire

  17. The NASA earth resources spectral information system: A data compilation, second supplement

    Science.gov (United States)

    Vincent, R. K.

    1973-01-01

    The NASA Earth Resources Spectral Information System (ERSIS) and the information contained therein are described. It is intended for use as a second supplement to the NASA Earth Resources Spectral Information System: A Data Compilation, NASA CR-31650-24-T, May 1971. The current supplement includes approximately 100 rock and mineral, and 375 vegetation directional reflectance spectral curves in the optical region from 0.2 to 22.0 microns. The data were categorized by subject and each curve plotted on a single graph. Each graph is fully titled to indicate curve source and indexed by subject to facilitate user retrieval from ERSIS magnetic tape records.

  18. Automatic Parallelization An Overview of Fundamental Compiler Techniques

    CERN Document Server

    Midkiff, Samuel P

    2012-01-01

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and
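
    In miniature, the distinction that the dependence analysis mentioned above draws between loops (plain Python loops for illustration): the first loop's iterations are independent and could run in parallel, while the second carries a dependence from iteration i-1 to iteration i and cannot, as written.

      n = 8
      a = [0.0] * n
      b = [float(i) for i in range(n)]

      for i in range(n):          # no loop-carried dependence: parallelizable
          a[i] = 2.0 * b[i]

      for i in range(1, n):       # loop-carried (flow) dependence on a[i-1]
          a[i] = a[i - 1] + b[i]

      print(a)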

  19. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    Science.gov (United States)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique to provide rapid recovery from transient processor failures; it was implemented in hardware by researchers and in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data-flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy, while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations were conducted which indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.

  20. Compilation of current high energy physics experiments

    International Nuclear Information System (INIS)

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978 and had not completed data-taking by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving the names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche

  1. Irradiation of red meat. A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1996-08-01

    The aim of this monograph is to provide the rationale and justification for treating red meats with ionizing radiation for improving microbiological safety, parasite control and extending non-frozen shelf-life. It is intended to complement a previous publication ''Irradiation of Poultry Meat and its Products - A compilation of Technical Data for its Authorization and Control''. 146 refs

  2. Irradiation of red meat. A compilation of technical data for its authorization and control

    Energy Technology Data Exchange (ETDEWEB)

    International consultative group on food irradiation

    1996-08-01

    The aim of this monograph is to provide the rationale and justification for treating red meats with ionizing radiation for improving microbiological safety, parasite control and extending non-frozen shelf-life. It is intended to complement a previous publication ''Irradiation of Poultry Meat and its Products - A compilation of Technical Data for its Authorization and Control''. 146 refs.

  3. Compilation of data and descriptions for United States and foreign liquid metal fast breeder reactors

    International Nuclear Information System (INIS)

    Appleby, E.R.

    1975-08-01

    This document is a compilation of design and engineering information pertaining to liquid metal cooled fast breeder reactors which have operated, are operating, or are currently under construction in the United States and abroad. All data have been taken from publicly available documents, journals, and books

  4. Radioactive waste management profiles. Compilation of data from the waste management data base. No. 2, April 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    In 1989, the International Atomic Energy Agency began development of the Waste Management Data Base (WMDB) to, primarily, establish a mechanism for the collection, integration, storage, and retrieval of information relevant to radioactive waste management in Member States. This current report is a summary and compilation of the data received during the 1991 biennial update, which covers the period of January 1991 through March 1993. This Profile report is divided into two main parts. One part describes the Waste Management Data Base system and the type of information it contains. The second part contains data provided by Member States in response to the IAEA's 1991 WMDB Questionnaire. This report also contains data on Member States that did not report to the Questionnaire. 3 figs, 5 tabs

  5. Compilation of data relating to the erosive response of 608 recently-burned basins in the western United States

    Science.gov (United States)

    Gartner, Joseph E.; Cannon, Susan H.; Bigio, Erica R.; Davis, Nicole K.; Parrett, Charles; Pierce, Kenneth L.; Rupert, Michael G.; Thurston, Brandon L.; Trebesch, Matthew J.; Garcia, Steve P.; Rea, Alan H.

    2005-01-01

    This report presents a compilation of data on the erosive response, debris-flow initiation processes, basin morphology, burn severity, event-triggering rainfall, rock type, and soils for 608 basins recently burned by 53 fires located throughout the Western United States.  The data presented here are a combination of those collected during our own field research and those reported in the literature.  In some cases, data from a Geographic Information System (GIS) and Digital Elevation Models (DEMs) were used to supplement the data from the primary source.  Due to gaps in the information available, not all parameters are characterized for all basins. This database provides a resource for researchers and land managers interested in examining relations between the runoff response of recently burned basins and their morphology, burn severity, soils and rock type, and triggering rainfall.  The purpose of this compilation is to provide a single resource for future studies addressing problems associated with wildfire-related erosion.  For example, data in this compilation have been used to develop a model for debris flow probability from recently burned basins using logistic multiple regression analysis (Cannon and others, 2004).  This database provides a convenient starting point for other studies.  For additional information on estimated post-fire runoff peak discharges and debris-flow volumes, see Gartner and others (2004).

  6. Depleted uranium hexafluoride management program : data compilation for the Paducah site

    International Nuclear Information System (INIS)

    Hartmann, H.

    2001-01-01

    This report is a compilation of data and analyses for the Paducah site, near Paducah, Kentucky. The data were collected and the analyses were done in support of the U.S. Department of Energy (DOE) 1999 Programmatic Environmental Impact Statement for Alternative Strategies for the Long-Term Management and Use of Depleted Uranium Hexafluoride (DOE/EIS-0269). The report describes the affected environment at the Paducah site and summarizes potential environmental impacts that could result from conducting the following depleted uranium hexafluoride (UF6) activities at the site: continued cylinder storage, preparation of cylinders for shipment, conversion, and long-term storage. DOE's preferred alternative is to begin converting the depleted UF6 inventory as soon as possible to either uranium oxide, uranium metal, or a combination of both, while allowing for use of as much of this inventory as possible

  7. Depleted uranium hexafluoride management program : data compilation for the Portsmouth site

    International Nuclear Information System (INIS)

    Hartmann, H. M.

    2001-01-01

    This report is a compilation of data and analyses for the Portsmouth site, near Portsmouth, Ohio. The data were collected and the analyses were done in support of the U.S. Department of Energy (DOE) 1999 Programmatic Environmental Impact Statement for Alternative Strategies for the Long-Term Management and Use of Depleted Uranium Hexafluoride (DOE/EIS-0269). The report describes the affected environment at the Portsmouth site and summarizes potential environmental impacts that could result from conducting the following depleted uranium hexafluoride (UF6) management activities at the site: continued cylinder storage, preparation of cylinders for shipment, conversion, and long-term storage. DOE's preferred alternative is to begin converting the depleted UF6 inventory as soon as possible to either uranium oxide, uranium metal, or a combination of both, while allowing for use of as much of this inventory as possible

  8. Description of source term data on contaminated sites and buildings compiled for the waste management programmatic environmental impact statement (WMPEIS)

    International Nuclear Information System (INIS)

    Short, S.M.; Smith, D.E.; Hill, J.G.; Lerchen, M.E.

    1995-10-01

    The U.S. Department of Energy (DOE) and its predecessor agencies have historically had responsibility for carrying out various national missions, primarily related to nuclear weapons development and energy research. Recently, these missions have been expanded to include remediation of sites and facilities contaminated as a result of past activities. In January 1990, the Secretary of Energy announced that DOE would prepare a Programmatic Environmental Impact Statement on DOE's environmental restoration and waste management program; the primary focus was the evaluation of (1) strategies for conducting remediation of all DOE contaminated sites and facilities and (2) potential configurations for waste management capabilities. Several different environmental restoration strategies were identified for evaluation, ranging from doing no remediation to strategies where the level of remediation was driven by such factors as final land use and health effects. A quantitative assessment was made of the costs and health effects of the remediation activities and residual contamination levels associated with each remediation strategy. These analyses required that information be compiled on each individual contaminated site and structure located at each DOE installation, and that the compiled information include quantitative measurements and/or estimates of contamination levels and of the extent of contamination. This document describes the types of information and data compiled for use in the analyses. Also provided are a description of the database used to manage the data, a detailed discussion of the methodology and assumptions used in compiling the data, and a summary of the data compiled into the database as of March 1995. As of this date, over 10,000 contaminated sites and structures and over 8,000 uncontaminated structures had been identified across the DOE complex of installations.

  9. Standard guide for formats for collection and compilation of corrosion data for metals for computerized database input

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1995-01-01

    1.1 This guide covers the data categories and specific data elements (fields) considered necessary to accommodate desired search strategies and reliable data comparisons in computerized corrosion databases. The data entries are designed to accommodate data relative to the basic forms of corrosion and to serve as guides for structuring multiple source database compilations capable of assessing compatibility of metals and alloys for a wide range of environments and exposure conditions.
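
    As a rough illustration of the record structure such a guide implies, the sketch below defines a corrosion-data record with typical fields. The field names are illustrative assumptions, not the data elements actually specified in the standard.

```python
# Hypothetical record layout for a computerized corrosion database, in the
# spirit of the guide; the standard defines its own official data elements.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CorrosionRecord:
    alloy_designation: str          # e.g. a UNS number
    product_form: str               # plate, pipe, weldment, ...
    environment: str                # chemical environment description
    temperature_c: float            # exposure temperature
    exposure_time_h: float          # duration of the test or exposure
    test_method: str                # immersion, electrochemical, field exposure
    corrosion_form: str             # uniform, pitting, crevice, SCC, ...
    corrosion_rate_mm_y: Optional[float] = None  # absent for pass/fail tests
    data_source: str = ""           # literature citation or lab report ID
```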

  10. Compilation of current high-energy-physics experiments

    International Nuclear Information System (INIS)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1980-04-01

    This is the third edition of a compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and ten participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about January 1980, and (2) had not completed data taking by 1 January 1976.

  11. Assessment of the current status of basic nuclear data compilations

    International Nuclear Information System (INIS)

    1987-03-01

    The Panel on Basic Nuclear Data Compilations believes that it is of paramount importance to achieve as short a cycle time as is reasonably possible in the evaluation and publication of the A-chains. The panel, therefore, has concentrated its efforts on identifying those factors that have tended to increase the cycle time and on finding ways to remove the obstacles. An important step was made during the past year to address reduction of the size of the published evaluations - another factor that can reduce cycle time. The Nuclear Structure and Decay Data (NSDD) network adopted new format guidelines, which generated a 30% reduction by eliminating redundancy and/or duplication. A current problem appears to be the rate at which the A-chains are being evaluated, which, on the average, is only about one-half of what it could be. It is hoped that the situation will improve with an increase in the number of foreign centers and an increase in efficiency as more A-chains are recycled by the same evaluator who did the previous evaluation. Progress has been made in the area of on-line access to the nuclear data files in that a subcommittee report describing the requirements of an on-line system has been produced. 2 tabs

  12. Thirty years of progress in harmonizing and compiling food data as a result of the establishment of INFOODS.

    Science.gov (United States)

    Murphy, Suzanne P; Charrondiere, U Ruth; Burlingame, Barbara

    2016-02-15

    The International Network of Foods Data Systems (INFOODS) has provided leadership on the development and use of food composition data for over 30 years. The mission of INFOODS is the promotion of international participation, cooperation and harmonization in the generation, compilation and dissemination of adequate and reliable data on the composition of foods, beverages, and their ingredients in forms appropriate to meet the needs of various users. Achievements include the development of guidelines and standards, increased capacity development in generating and compiling food composition data, a food composition database management system, improvements in laboratory quality assurance, and development of several food composition databases and tables. Recently, INFOODS has led efforts to define and document food biodiversity. As new foods and food components come into prominence, and as analytical methods evolve, the activities of INFOODS will continue to advance the quality and quantity of food composition data globally into the future. Copyright © 2015 Food and Agriculture Organization of the United Nations. Published by Elsevier Ltd. All rights reserved.

  13. Compilation of data used for the analysis of the geological and hydrogeological DFN models. Site descriptive modelling SDM-Site Laxemar

    International Nuclear Information System (INIS)

    Hermanson, Jan; Fox, Aaron; Oehman, Johan; Rhen, Ingvar

    2008-08-01

    This report provides an overview and compilation of the various data that constitute the basis for construction of the geological and hydrogeological discrete fracture network (DFN) models as part of model version SDM-Site Laxemar. This includes a review of fracture data from boreholes and outcrops. Furthermore, the basis for the construction of lineament maps is given, as well as a review of the hydraulic test data from cored and percussion-drilled boreholes. Emphasis is placed on graphical representation of borehole logs in the form of composites of geological, hydrogeological and, in the case of cored boreholes, hydrogeochemical data. One major contribution is a compilation of the characteristics of minor local deformation zones (MDZs) identified in cored boreholes. Basic orientation data and fracture intensity data are presented as functions of depth for individual boreholes. The coupling between hydrogeological and geological data is further refined in plots of Posiva flow log (PFL) data versus geological single-hole interpretation data.

  14. Data compilation and evaluation of U(IV) and U(VI) for thermodynamic reference database THEREDA

    Energy Technology Data Exchange (ETDEWEB)

    Richter, Anke; Bok, Frank; Brendler, Vinzenz

    2015-07-01

    THEREDA (Thermodynamic Reference Database) is a collaborative project set up to address this challenge. The partners are Helmholtz-Zentrum Dresden-Rossendorf, Karlsruhe Institute of Technology (KIT-INE), Gesellschaft fuer Anlagen- und Reaktorsicherheit Braunschweig mbH (GRS), TU Bergakademie Freiberg (TUBAF) and AF-Consult Switzerland AG (Baden, Switzerland). The aim of the project is the establishment of a consistent and quality-assured database for all safety-relevant elements, temperature and pressure ranges, with its focus on saline systems. This implies the use of the Pitzer approach to compute activity coefficients suitable for such conditions. Data access is possible via commonly available internet browsers under the address http://www.thereda.de. One part of the project, the data collection and evaluation for uranium, was a task of the Helmholtz-Zentrum Dresden-Rossendorf. The aquatic chemistry and thermodynamics of U(VI) and U(IV) are of great importance for geochemical modelling in repository-relevant systems. The OECD/NEA Thermochemical Database (NEA TDB) compilation is the major source of thermodynamic data for the aqueous and solid uranium species, even though this data selection does not utilize the Pitzer model for the correction of ionic strength effects. As a result of its very stringent quality demands, the NEA TDB is rather restrictive and therefore incomplete for extensive modelling calculations of real systems. Therefore, the THEREDA compilation includes additional thermodynamic data for solid secondary phases formed in the waste material, the backfill and the host rock, though falling into quality assessment (QA) categories of lower accuracy. The data review process prefers log K values from solubility experiments (if available) to those calculated from thermochemical data.
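
    The stated preference for log K values over values calculated from thermochemical data rests on the standard thermodynamic identity linking the two routes (a textbook relation, not anything specific to THEREDA):

```latex
\log_{10} K = -\frac{\Delta_r G^\circ}{RT \ln 10},
\qquad
\Delta_r G^\circ = \sum_i \nu_i\, \Delta_f G^\circ_i
```

    A log K measured directly in a solubility experiment bypasses the accumulated uncertainty of the formation Gibbs energies on the right-hand side.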

  16. Advanced compiler design and implementation

    CERN Document Server

    Muchnick, Steven S

    1997-01-01

    From the Foreword by Susan L. Graham: This book takes on the challenges of contemporary languages and architectures, and prepares the reader for the new compiling problems that will inevitably arise in the future. The definitive book on advanced compiler design. This comprehensive, up-to-date work examines advanced issues in the design and implementation of compilers for modern processors. Written for professionals and graduate students, the book guides readers in designing and implementing efficient structures for highly optimizing compilers for real-world languages. Covering advanced issues in fundamental areas of compiler design, this book discusses a wide array of possible code optimizations, determining the relative importance of optimizations, and selecting the most effective methods of implementation. * Lays the foundation for understanding the major issues of advanced compiler design * Treats optimization in-depth * Uses four case studies of commercial compiling suites to illustrate different approache...

  17. Compilation of current high energy physics experiments - Sept. 1978

    Energy Technology Data Exchange (ETDEWEB)

    Addis, L.; Odian, A.; Row, G. M.; Ward, C. E. W.; Wanderer, P.; Armenteros, R.; Joos, P.; Groves, T. H.; Oyanagi, Y.; Arnison, G. T. J.; Antipov, Yu; Barinov, N.

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed data taking by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche. (RWR)

  18. Charged particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, C.

    1999-01-01

    We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal reason for setting up the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates, issued by Fowler and collaborators. The main goal of the NACRE network was transparency in the procedure of calculating the rates. More specifically, this compilation aims at: 1. updating the experimental and theoretical data; 2. distinctly identifying the sources of the data used in rate calculation; 3. evaluating the uncertainties and errors; 4. providing numerically integrated reaction rates; 5. providing reverse reaction rates and analytical approximations of the adopted rates. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given, and the corresponding reaction rates are calculated and given in tabular form. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. The compilation is concerned with reaction rates that are large enough for the target lifetimes to be shorter than the age of the Universe, taken equal to 15 x 10^9 y. The reaction rates are provided for temperatures lower than T = 10^10 K. In parallel with the rate compilation, a cross section database has been created and located at the site http://pntpm.ulb.ac.be/nacre..htm. (authors)
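
    The numerically integrated reaction rates mentioned above are, for a Maxwellian plasma, given by the standard rate per particle pair (a textbook expression, not one peculiar to NACRE):

```latex
N_A \langle \sigma v \rangle
= N_A \left( \frac{8}{\pi \mu} \right)^{1/2} (kT)^{-3/2}
  \int_0^{\infty} \sigma(E)\, E\, e^{-E/kT}\, dE
```

    Here \mu is the reduced mass of the interacting pair, \sigma(E) the cross section and T the temperature; the compilation tabulates this quantity over a grid of temperatures.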

  19. Compilation of cross-sections. Pt. 4

    International Nuclear Information System (INIS)

    Alekhin, S.I.; Ezhela, V.V.; Lugovsky, S.B.; Tolstenkov, A.N.; Yushchenko, O.P.; Baldini, A.; Cobal, M.; Flaminio, V.; Capiluppi, P.; Giacomelli, G.; Mandrioli, G.; Rossi, A.M.; Serra, P.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1987-01-01

    This is the fourth volume in our series of data compilations on integrated cross-sections for weak, electromagnetic, and strong interaction processes. This volume covers data on reactions induced by photons, neutrinos, hyperons, and K_L^0. It contains all data published up to June 1986. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  20. Assessment of the current status of basic nuclear data compilations

    International Nuclear Information System (INIS)

    Riemer, R.L.

    1992-01-01

    The Panel on Basic Nuclear Data Compilations believes that it is important to provide the user with an evaluated nuclear database of the highest quality, dependability, and currency. It is also important that the evaluated nuclear data are easily accessible to the user. In the past the panel concentrated its concern on the cycle time for the publication of A-chain evaluations. However, the panel now recognizes that publication cycle time is no longer the appropriate goal. Sometime in the future, publication of the evaluated A-chains will evolve from the present hard-copy Nuclear Data Sheets on library shelves to purely electronic publication, with the advent of universal access to terminals and the nuclear databases. Therefore, the literature cut-off date in the Evaluated Nuclear Structure Data File (ENSDF) is rapidly becoming the only important measure of the currency of an evaluated A-chain. Also, it has become exceedingly important to ensure that access to the databases is as user-friendly as possible and to enable electronic publication of the evaluated data files. Considerable progress has been made in these areas: use of the on-line systems has almost doubled in the past year, and there has been initial development of tools for electronic evaluation, publication, and dissemination. Currently, the nuclear data effort is in transition between the traditional and future methods of dissemination of the evaluated data. Also, many of the factors that adversely affect the publication cycle time simultaneously affect the currency of the evaluated nuclear database. Therefore, the panel continues to examine factors that can influence cycle time: the number of evaluators, the frequency with which an evaluation can be updated, the review of the evaluation, and the production of the evaluation, which currently exists as a hard-copy issue of Nuclear Data Sheets

  1. Priorities for injury prevention in women's Australian football: a compilation of national data from different sources.

    Science.gov (United States)

    Fortington, Lauren V; Finch, Caroline F

    2016-01-01

    Participation in Australian football (AF) has traditionally been male dominated and current understanding of injury and priorities for prevention are based solely on reports of injuries in male players. There is evidence in other sports that indicates that injury types differ between males and females. With increasing participation in AF by females, it is important to consider their specific injury and prevention needs. This study aimed to provide a first injury profile from existing sources for female AF. Compilation of injury data from four prospectively recorded data sets relating to female AF: (1) hospital admissions in Victoria, 2008/09-13/14, n=500 injuries; (2) emergency department (ED) presentations in Victoria, 2008/09-2012/13, n=1,879 injuries; (3) insurance claims across Australia 2004-2013, n=522 injuries; (4) West Australian Women's Football League (WAWFL), 2014 season club data, n=49 injuries. Descriptive results are presented as injury frequencies, injury types and injury to body parts. Hospital admissions and ED presentations were dominated by upper limb injuries, representing 47% and 51% of all injuries, respectively, primarily to the wrist/hand at 32% and 40%. Most (65%) insurance claim injuries involved the lower limb, 27% of which were for knee ligament damage. A high proportion of concussions (33%) were reported in the club-collected data. The results provide the first compilation of existing data sets of women's AF injuries and highlight the need for a rigorous and systematic injury surveillance system to be instituted.

  2. Compilation of radiometric age and trace-element geochemical data, Yucca Mountain and surrounding areas of southwestern Nevada

    International Nuclear Information System (INIS)

    Weiss, S.I.; Noble, D.C.; Larson, L.T.

    1994-01-01

    This document is a compilation of available radiometric age and trace-element geochemical data for volcanic rocks and episodes of hydrothermal activity in Yucca Mountain and the surrounding region of southwestern Nevada. Only the age determinations considered to be geologically reasonable (consistent with stratigraphic relations) are listed below. A number of the potassium-argon (K-Ar) ages of volcanic rocks given by Kistler, Marvin et al., Noble et al., Weiss et al., and Noble et al. are not included, as these ages have been shown to be incorrect or disturbed by hydrothermal alteration on the basis of subsequent stratigraphic and/or petrographic data and the recognition of errors in K-Ar age determinations related to incomplete extraction of argon. In cases where absolute ages are tightly constrained by high-precision 40Ar/39Ar ages and unequivocal stratigraphic relations, we have omitted the less precise K-Ar age data. Similarly, the more precise single-crystal laser-fusion 40Ar/39Ar age determinations of certain units are reported, and less precise ages obtained by multi-grain bulk-fusion 40Ar/39Ar methods are not included. This compilation does not include age data for basaltic rocks of Pliocene and Quaternary age in the Yucca Mountain region.

  3. Irradiation of bulbs and tuber crops. A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1997-04-01

    This publication contains a compilation of available scientific and technical data on the irradiation of bulbs and tuber crops. It is intended to assist governments in considering the authorization of this particular application of radiation processing of food and in ensuring its control in the facility and the control of irradiated food products moving in trade. The compilation was prepared in response to the requirement of the Codex General Standard for Irradiated Foods and associated Code that radiation treatment of food be justified on the basis of a technological need or of a need to improve the hygienic quality of the food. It was also in response to the recommendations of the FAO/IAEA/WHO/ITC-UNCTAD/GATT International Conference on the Acceptance, Control of and Trade in Irradiated Food (Geneva, 1989) concerning the need for regulatory control of radiation processing of food. 448 refs, 6 tabs

  4. Depleted uranium hexafluoride management program : data compilation for the K-25 site

    International Nuclear Information System (INIS)

    Hartmann, H. M.

    2001-01-01

    This report is a compilation of data and analyses for the K-25 site on the Oak Ridge Reservation, Oak Ridge, Tennessee. The data were collected and the analyses were done in support of the U.S. Department of Energy (DOE) 1999 Programmatic Environmental Impact Statement for Alternative Strategies for the Long-Term Management and Use of Depleted Uranium Hexafluoride (DOE/EIS-0269). The report describes the affected environment at the K-25 site and summarizes the potential environmental impacts that could result from continued cylinder storage and preparation of cylinders for shipment at the site. It is probable that the cylinders at the K-25 site will be shipped to another site for conversion. Because conversion and long-term storage of the entire inventory at the K-25 site are highly unlikely, these data are not presented in this report. DOE's preferred alternative is to begin converting the depleted uranium hexafluoride inventory as soon as possible to either uranium oxide, uranium metal, or a combination of both, while allowing for use of as much of this inventory as possible

  5. Compilation of selected deep-sea biological data for the US subseabed disposal project

    International Nuclear Information System (INIS)

    Gomez, L.S.; Marietta, M.G.; Jackson, D.W.

    1987-03-01

    The US Subseabed Disposal Project (SDP) has compiled an extensive deep-sea biological data base to be used in calculating biological parameters of state and rate included in mathematical models of oceanographic transport of radionuclides. The data base is organized around a model deep-sea ecosystem which includes the following components: zooplankton, fish and other nekton, invertebrate benthic megafauna, benthic macrofauna, benthic meiofauna, heterotrophic microbiota, as well as suspended and sediment particulate organic carbon. Measurements of abundance and activity rates (e.g., respiration, production, sedimentation, etc.) reported in the international oceanographic literature are summarized in 23 tables. Included in these tables are the latitudinal position of the studies, as well as information describing sampling techniques and any special notes needed to better assess the data presented. This report has been prepared primarily as a resource document to be used in calculating parameter values for various modeling applications, and for preparing historical data reviews for other SDP reports. Depending on the intended use, these data will require further reduction and unit conversion

  6. Digitally Available Interval-Specific Rock-Sample Data Compiled from Historical Records, Nevada Test Site and Vicinity, Nye County, Nevada.

    Energy Technology Data Exchange (ETDEWEB)

    David B. Wood

    2007-10-24

    Between 1951 and 1992, 828 underground tests were conducted on the Nevada Test Site, Nye County, Nevada. Prior to and following these nuclear tests, holes were drilled and mined to collect rock samples. These samples are organized and stored by depth of borehole or drift at the U.S. Geological Survey Core Library and Data Center at Mercury, Nevada, on the Nevada Test Site. From these rock samples, rock properties were analyzed and interpreted and compiled into project files and in published reports that are maintained at the Core Library and at the U.S. Geological Survey office in Henderson, Nevada. These rock-sample data include lithologic descriptions, physical and mechanical properties, and fracture characteristics. Hydraulic properties also were compiled from holes completed in the water table. Rock samples are irreplaceable because pre-test, in-place conditions cannot be recreated and samples cannot be recollected from the many holes destroyed by testing. Documenting these data in a published report will ensure availability for future investigators.

  7. Discipline, Dilemmas, Decisions and Data Distribution in the Planning and Compilation of Monolingual Dictionaries

    Directory of Open Access Journals (Sweden)

    Rufus H Gouws

    2011-10-01

    Abstract: Bilingual dictionaries play an important role in the standardisation of a language and are often the first dictionary type to be compiled for a given speech community. However, this may never lead to an underestimation of the role and importance of monolingual descriptive dictionaries in the early lexicographic development of a language. In the planning of first descriptive dictionaries the choice of the proper subtype and a consistent application of theoretical principles should be regarded as of extreme importance. Even the compilation of a restricted descriptive dictionary should be done according to similar theoretical principles as those applying to comprehensive dictionaries. This contribution indicates a number of dilemmas confronting the lexicographer during the compilation of restricted monolingual descriptive dictionaries. Attention is given to the role of lexicographic functions and the choice and presentation of lexicographic data, with special reference to the presentation of certain types of polysemous senses which are subjected to frequency of use restrictions. Emphasis is placed on the value of a heterogeneous article structure and a micro-architecture in the articles of restricted dictionaries.

    Keywords: ACCESS STRUCTURE, DATA DISTRIBUTION, FRAME STRUCTURE, FREQUENCY OF USE, HETEROGENEOUS ARTICLE STRUCTURE, LEXICOGRAPHIC FUNCTIONS, LEXICOGRAPHIC PROCESS, MICRO-ARCHITECTURE, MONOLINGUAL DICTIONARY, POLYSEMY, SEMANTIC DATA, TEXT BLOCK, USER-FRIENDLINESS, USER-PERSPECTIVE, VERTICAL ARCHITECTONIC EXTENSION

    Opsomming (Summary, translated from Afrikaans): Discipline, dilemmas, decisions and data distribution in the planning and compilation of monolingual dictionaries. Bilingual dictionaries play an important role in the standardisation of a language and are often the first dictionary type to be compiled for a particular speech community. However, this may not lead to an underestimation of the role and value of monolingual descriptive dictionaries in the

  8. HAL/S-FC compiler system specifications

    Science.gov (United States)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  9. Data compilations for primary production, herbivory, decomposition, and export for different types of marine communities, 1962-2002 (NODC Accession 0054500)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset is a compilation of published data on primary production, herbivory, and nutrient content of primary producers in pristine communities of...

  10. A Class-Specific Optimizing Compiler

    Directory of Open Access Journals (Sweden)

    Michael D. Sharp

    1993-01-01

    Class-specific optimizations are compiler optimizations specified by the class implementor to the compiler. They allow the compiler to take advantage of the semantics of the particular class so as to produce better code. Optimizations of interest include the strength reduction of class::array address calculations, elimination of large temporaries, and the placement of asynchronous send/recv calls so as to achieve computation/communication overlap. We will outline our progress towards the implementation of a C++ compiler capable of incorporating class-specific optimizations.
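
    Strength reduction of array address calculations, the first optimization named above, replaces a per-element multiply with an incremental add. The Python sketch below is a language-neutral illustration of the transformation; a real compiler applies it to intermediate code, not source.

```python
# Before: each access recomputes base + i*row_stride + j (a multiply per step).
def row_sum_naive(mem, base, row_stride, i, ncols):
    total = 0
    for j in range(ncols):
        total += mem[base + i * row_stride + j]  # multiply inside the loop
    return total

# After strength reduction: the multiply is hoisted out of the loop and a
# running address is kept, so the loop body is pure increment arithmetic.
def row_sum_reduced(mem, base, row_stride, i, ncols):
    total = 0
    addr = base + i * row_stride  # computed once
    for _ in range(ncols):
        total += mem[addr]
        addr += 1                 # add replaces the multiply
    return total
```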

  11. Compilation of field-scale caisson data on solute transport in the unsaturated zone

    International Nuclear Information System (INIS)

    Polzer, W.L.; Essington, E.H.; Fuentes, H.R.; Nyhan, J.W.

    1986-11-01

    Los Alamos National Laboratory has conducted technical support studies to assess siting requirements mandated by Nuclear Regulatory Commission in 10 CFR Part 61. Field-scale transport studies were conducted under unsaturated moisture conditions and under steady and unsteady flow conditions in large caissons located and operated in a natural (field) environment. Moisture content, temperature, flow rate, base-line chemical, tracer influent, and tracer breakthrough data collected during tracer migration studies in the caisson are compiled in tables and graphs. Data suggest that the imposition of a period of drainage (influent solution flow was stopped) may cause an increase in tracer concentration in the soil solution at various sampling points in the caisson. Evaporation during drainage and diffusion of the tracers from immobile to mobile water are two phenomena that could explain the increase. Data also suggest that heterogeneity of sorption sites may increase the variability in transport of sorbing tracers compared with nonsorbing tracers

  12. Compilation of cross-sections. Pt. 1

    International Nuclear Information System (INIS)

    Flaminio, V.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1983-01-01

    A compilation of integral cross-sections for hadronic reactions is presented. This is an updated version of CERN/HERA 79-1, 79-2, 79-3. It contains all data published up to the beginning of 1982, but some more recent data have also been included. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  13. A Note on Compiling Fortran

    Energy Technology Data Exchange (ETDEWEB)

    Busby, L. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-01

    Fortran modules tend to serialize compilation of large Fortran projects, by introducing dependencies among the source files. If file A depends on file B (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran and IBM xlf, at least) offer an option to "verify syntax", with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: the first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8× speedup in the case study below.
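
    A minimal sketch of the two-pass scheme, written as a Python build driver around gfortran, using -fsyntax-only as the "verify syntax" option the note describes; the source file names are hypothetical.

```python
# Two-pass Fortran build: pass 1 runs syntax-only mode serially, which is
# cheap and emits the .mod files; pass 2 compiles object files in parallel.
import subprocess
from concurrent.futures import ProcessPoolExecutor

SOURCES = ["modules_a.f90", "modules_b.f90", "main.f90"]  # dependency order

def compile_obj(src: str) -> None:
    # Pass 2: full object-code generation; compile order no longer matters.
    subprocess.run(["gfortran", "-c", "-O2", src], check=True)

if __name__ == "__main__":
    # Pass 1: serial; produces the .mod files the other files depend on.
    for src in SOURCES:
        subprocess.run(["gfortran", "-fsyntax-only", src], check=True)
    # Pass 2: all object files in parallel.
    with ProcessPoolExecutor() as pool:
        list(pool.map(compile_obj, SOURCES))
```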

  14. Energy crops. Data for planning of energy crop cultivation. KTBL data compilation with internet services; Energiepflanzen. Daten fuer die Planung des Energiepflanzenanbaus. KTBL-Datensammlung mit Internetangebot

    Energy Technology Data Exchange (ETDEWEB)

    Eckel, H.; Grube, J.; Zimmer, E. (comps.)

    2006-07-01

    Based on the KTBL data compilation "Betriebsplanung Landwirtschaft", this data compilation ("Datensammlung Energiepflanzen") provides comprehensive information on the cultivation of energy crops and on production planning. Production techniques are outlined up to the final step of delivery to the consumer, so that full cost calculations are possible. Cultivation hints are presented that take into account the differences from food and fodder crop cultivation. Rare crops for which little practical experience is available, but which have great potential for use in agriculture, are also covered. Energetic utilisation opens the field to a wider range of crops and to new options for crop rotation; these are discussed in two separate chapters. There is also information on legal aspects of energy crop production, relevant standards, and quality requirements on substrates for energetic use and for secondary harvesting. (orig.)

  15. Palaeoecological studies as a source of peat depth data: A discussion and data compilation for Scotland

    Directory of Open Access Journals (Sweden)

    J. Ratcliffe

    2016-06-01

    The regional/national carbon (C) stock of peatlands is often poorly characterised, even for comparatively well-studied areas. A key obstacle to better estimates of landscape C stock is the scarcity of data on peat depth, leading to simplistic assumptions. New measurements of peat depth become unrealistically resource-intensive when considering large areas. Therefore, it is imperative to maximise the use of pre-existing datasets. Here we propose that one potentially valuable and currently unexploited source of peat depth data is palaeoecological studies. We discuss the value of these data and present an initial compilation for Scotland (United Kingdom) which consists of records from 437 sites and yields an average depth of 282 cm per site. This figure is likely to be an over-estimate of true average peat depth and is greater than figures used in current estimates of peatland C stock. Depth data from palaeoecological studies have the advantages of wide distribution, high quality, and often the inclusion of valuable supporting information; but also the disadvantage of spatial bias due to the differing motivations of the original researchers. When combined with other data sources, each with its own advantages and limitations, we believe that palaeoecological datasets can make an important contribution to better-constrained estimates of peat depth which, in turn, will lead to better estimates of peatland landscape carbon stock.
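
    To see why average depth dominates landscape C-stock estimates, the arithmetic below converts the compilation's 282 cm average into a per-hectare carbon stock. The bulk density and carbon fraction are typical literature values assumed for illustration, not figures from this study.

```python
# Illustrative peat C-stock arithmetic; bulk density and carbon fraction
# are generic assumed values, not results from the paper.
area_m2 = 1.0e4          # one hectare of peatland
depth_m = 2.82           # the compilation's average depth (282 cm)
bulk_density = 100.0     # kg dry peat per m^3 (assumed)
c_fraction = 0.5         # carbon mass fraction of dry peat (assumed)

c_stock_kg = area_m2 * depth_m * bulk_density * c_fraction
print(f"~{c_stock_kg / 1000:.0f} t C per hectare")  # ~1410 t C/ha
```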

  16. Regulatory and technical reports (abstract index journal): Annual compilation for 1987

    International Nuclear Information System (INIS)

    1988-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  17. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendents; sending to the selected node only the compiled software to be executed by the selected node or selected node's descendent.
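
    A toy model of the selection-and-forwarding step described above: each node keeps the artifacts destined for itself and passes to each child only what that child's subtree needs. Node names and artifact names are invented, and the compilation step itself is elided.

```python
# Toy model of hierarchical distribution: a node keeps the artifacts
# destined for itself and forwards to each child only the artifacts
# needed somewhere in that child's subtree.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    children: list = field(default_factory=list)

def subtree_names(node):
    yield node.name
    for child in node.children:
        yield from subtree_names(child)

def distribute(node, artifacts):
    """artifacts: dict mapping target node name -> compiled software."""
    kept = {artifacts[node.name]} if node.name in artifacts else set()
    print(f"{node.name} keeps {kept or 'nothing'}")
    for child in node.children:
        wanted = set(subtree_names(child)) & artifacts.keys()
        # send only the subset relevant to this child's subtree
        distribute(child, {n: artifacts[n] for n in wanted})

root = Node("root", [Node("io", [Node("io-0"), Node("io-1")]), Node("compute")])
distribute(root, {"io-0": "io0.so", "compute": "sim.so"})
```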

  18. Digitally Available Interval-Specific Rock-Sample Data Compiled from Historical Records, Nevada Test Site and Vicinity, Nye County, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    David B. Wood

    2009-10-08

    Between 1951 and 1992, underground nuclear weapons testing was conducted at 828 sites on the Nevada Test Site, Nye County, Nevada. Prior to and following these nuclear tests, holes were drilled and mined to collect rock samples. These samples are organized and stored by depth of borehole or drift at the U.S. Geological Survey Core Library and Data Center at Mercury, Nevada, on the Nevada Test Site. From these rock samples, rock properties were analyzed and interpreted and compiled into project files and in published reports that are maintained at the Core Library and at the U.S. Geological Survey office in Henderson, Nevada. These rock-sample data include lithologic descriptions, physical and mechanical properties, and fracture characteristics. Hydraulic properties also were compiled from holes completed in the water table. Rock samples are irreplaceable because pre-test, in-place conditions cannot be recreated and samples cannot be recollected from the many holes destroyed by testing. Documenting these data in a published report will ensure availability for future investigators.

  19. Evaluation of HAL/S language compilability using SAMSO's Compiler Writing System (CWS)

    Science.gov (United States)

    Feliciano, M.; Anderson, H. D.; Bond, J. W., III

    1976-01-01

    NASA/Langley is engaged in a program to develop an adaptable guidance and control software concept for spacecraft such as shuttle-launched payloads. It is envisioned that this flight software be written in a higher-order language, such as HAL/S, to facilitate changes or additions. To make this adaptable software transferable to various onboard computers, a compiler writing system capability is necessary. A joint program with the Air Force Space and Missile Systems Organization was initiated to determine if the Compiler Writing System (CWS) owned by the Air Force could be utilized for this purpose. The present study explores the feasibility of including the HAL/S language constructs in CWS and the effort required to implement these constructs. This will determine the compilability of HAL/S using CWS and permit NASA/Langley to identify the HAL/S constructs desired for their applications. The study consisted of comparing the implementation of the Space Programming Language using CWS with the requirements for the implementation of HAL/S. It is the conclusion of the study that CWS already contains many of the language features of HAL/S and that it can be expanded for compiling part or all of HAL/S. It is assumed that persons reading and evaluating this report have a basic familiarity with (1) the principles of compiler construction and operation, and (2) the logical structure and applications characteristics of HAL/S and SPL.

  20. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 5

    International Nuclear Information System (INIS)

    2003-05-01

    The document consists of two parts: Overview and Country Waste Profile Reports for Reporting Year 2000. The first section contains overview reports that provide assessments of the achievements and shortcomings of the Net Enabled Waste Management Database (NEWMDB) during the first two data collection cycles (July 2001 to March 2002 and July 2002 to February 2003). The second part of the report includes a summary and compilation of waste management data submitted by Agency Member States in both the first and second data collection cycles

  1. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of different airborne surveys of different sizes, different resolutions and different vintages. Airborne magnetic acquisition is a fast and economical method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for e.g. oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilation of magnetic data and the merger of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still little recognized. The maximum wavelength that can be resolved by each individual survey is directly related to the survey size, and consequently a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine the airborne compilation with satellite magnetic data, which is sufficient only under particular preconditions. A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated
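
    One common way to realize the airborne-satellite combination sketched above is a spectral merge: high-pass the airborne compilation and low-pass the satellite-derived field at a shared crossover wavelength, then sum. The numpy sketch below is schematic; the grid spacing, crossover wavelength and filter shape are illustrative assumptions, not the authors' workflow.

```python
# Schematic spectral merge of an airborne magnetic compilation with a
# satellite-derived field: short wavelengths from the airborne grid,
# long wavelengths from the satellite grid. Parameters are illustrative.
import numpy as np

def wavelength_split_merge(airborne, satellite, dx_km, crossover_km=250.0):
    """Both grids: same shape and spacing dx_km; returns the merged grid."""
    ny, nx = airborne.shape
    kx = np.fft.fftfreq(nx, d=dx_km)            # cycles per km
    ky = np.fft.fftfreq(ny, d=dx_km)
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
    kc = 1.0 / crossover_km                     # crossover wavenumber
    lowpass = 1.0 / (1.0 + (k / kc) ** 4)       # smooth Butterworth-like taper
    A, S = np.fft.fft2(airborne), np.fft.fft2(satellite)
    merged = (1.0 - lowpass) * A + lowpass * S  # high-pass air + low-pass sat
    return np.real(np.fft.ifft2(merged))
```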

  2. Compilation of monitoring data on environmental concentration and pharmaceuticals; Zusammenstellung von Monitoringdaten zu Umweltkonzentrationen und Arzneimitteln

    Energy Technology Data Exchange (ETDEWEB)

    Bergmann, Axel; Fohrmann, Reinhard [IWW Rheinisch-Westfaelisches Institut fuer Wasser Beratungs- und Entwicklungsgesellschaft mbH, Muelheim an der Ruhr (Germany); Weber, Frank-Andreas [IWW Rheinisch-Westfaelisches Institut fuer Wasser Beratungs- und Entwicklungsgesellschaft mbH, Biebesheim am Rhein (Germany)

    2011-10-15

    In a comprehensive literature review we compiled an inventory of German and European monitoring data on the occurrence and behavior of pharmaceuticals in the environment. Environmental concentrations measured in various field campaigns and results of ecotoxicological and physico-chemical investigations were integrated in three databases. The analysis of these databases was used to identify priority pharmaceuticals and to suggest strategies for further monitoring. The database MEC reports 274 pharmaceuticals (both human and veterinary pharmaceuticals, of which 27 are metabolites) for which measured concentrations were available for one of the matrices sewage effluent, surface water, groundwater, drinking water, sewage sludge, manure, soil or sediment (10,150 database entries). The database OeKOTOX compiles 251 pharmaceuticals for which ecotoxicological effect concentrations for at least one test organism are available in the literature, and the database "Umweltverhalten" includes physico-chemical parameters of 183 compounds. The compiled citations of the relevant literature (1,382 citations) were provided for further use in the bibliographic software Reference Manager. The analysis of the databases shows that measured concentrations can be evaluated against ecotoxicological effect concentrations for only a subset of 70 pharmaceuticals. The estimation of PNEC values (Predicted No Effect Concentration) allowed for the identification of 19 pharmaceuticals with sufficient and 9 pharmaceuticals with poor ecotoxicological data which presumably endanger ecosystems in at least one river section in Germany. Special attention should be paid to "novel" pharmaceuticals, for which missing environmental and/or ecotoxicological data prevent a reliable risk assessment, but dramatically increasing consumption rates point to a high risk potential. The prioritization of pharmaceuticals presented by the authors considers the ecotoxicological effect concentrations, the occurrence
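
    The screening described here amounts to computing a risk quotient MEC/PNEC per substance and flagging quotients above 1. A minimal sketch, assuming hypothetical input files and column names:

```python
# Hedged sketch of the MEC/PNEC screening implied above: flag substances
# whose maximum measured environmental concentration (MEC) exceeds the
# predicted no-effect concentration (PNEC). File and column names are
# hypothetical, not the project's actual database layout.
import pandas as pd

mec = pd.read_csv("mec_surface_water.csv")   # substance, mec_ug_per_l
pnec = pd.read_csv("pnec_values.csv")        # substance, pnec_ug_per_l

merged = mec.merge(pnec, on="substance")
merged["risk_quotient"] = merged["mec_ug_per_l"] / merged["pnec_ug_per_l"]
priority = merged[merged["risk_quotient"] > 1.0].sort_values(
    "risk_quotient", ascending=False)
print(priority[["substance", "risk_quotient"]])
```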

  3. C to VHDL compiler

    Science.gov (United States)

    Berdychowski, Piotr P.; Zabolotny, Wojciech M.

    2010-09-01

    The main goal of the C to VHDL compiler project is to make the FPGA platform more accessible to scientists and software developers. The FPGA platform offers the unique ability to configure the hardware to implement virtually any dedicated architecture, and modern devices provide a sufficient number of hardware resources to implement parallel execution platforms with complex processing units. All this makes the FPGA platform very attractive for those looking for an efficient heterogeneous computing environment. The current industry standard in the development of digital systems on the FPGA platform is based on HDLs. Although very effective and expressive in the hands of hardware development specialists, these languages require specific knowledge and experience that are out of reach for most scientists and software programmers. The C to VHDL compiler project attempts to remedy that by creating an application that derives an initial VHDL description of a digital system (for further compilation and synthesis) from a purely algorithmic description in the C programming language. This idea itself is not new, and the C to VHDL compiler combines the best approaches from existing solutions developed over many previous years with the introduction of some new unique improvements.

  4. SPARQL compiler for Bobox

    OpenAIRE

    Čermák, Miroslav

    2013-01-01

    The goal of this work is to design and implement a SPARQL compiler for the Bobox system. In addition to the lexical and syntactic analysis corresponding to the W3C standard for the SPARQL language, it performs semantic analysis and optimization of queries. The compiler constructs an appropriate model for execution in Bobox that depends on the physical database schema.
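
    For readers unfamiliar with the input language, the snippet below runs a small SPARQL query with the rdflib Python library, simply to show the kind of query such a compiler translates into an execution plan; Bobox compiles to its own parallel runtime, not to rdflib.

```python
# Illustrative only: executes a two-hop SPARQL query over a tiny in-memory
# graph. This shows the input language, not Bobox's compilation pipeline.
from rdflib import Graph

g = Graph()
g.parse(data="""
@prefix ex: <http://example.org/> .
ex:alice ex:knows ex:bob .
ex:bob   ex:knows ex:carol .
""", format="turtle")

q = """
PREFIX ex: <http://example.org/>
SELECT ?a ?c WHERE { ?a ex:knows ?b . ?b ex:knows ?c . }
"""
for row in g.query(q):
    print(row.a, row.c)   # ex:alice ex:carol
```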

  5. INDC list of correspondents for the exchange of nuclear data information and compilation of national nuclear data committees

    International Nuclear Information System (INIS)

    1986-04-01

    This list of INDC Correspondents, including information on currently existing National Nuclear Data Committees and their memberships, is compiled and published upon the request of the International Nuclear Data Committee with the objective of promoting interaction and enhancing awareness of nuclear data activities in IAEA Member States. It also serves as a basis for the distribution of documents originated by or for the International Nuclear Data Committee and includes the names of all recipients of INDC documents. The report is presented in five sections. The first section contains a detailed description of the INDC distribution categories, distribution codes and document designator codes. The second section describes the aims, organization and objectives of individual national nuclear data committees. The third section lists names and addresses in alphabetical order within each state or international organization together with the assigned INDC document distribution code(s); where applicable, committee membership and/or area of specialization are indicated. This is followed by four shorter lists, indicating the names of individuals in each distribution category, sorted by country or international organization, and the total number of individuals in each category. The final section provides the names of nuclear data committee members, also listed by country or international organization.

  6. Compilation of kinetic data for geochemical calculations

    International Nuclear Information System (INIS)

    Arthur, R.C.; Savage, D.; Sasamoto, Hiroshi; Shibata, Masahiro; Yui, Mikazu

    2000-01-01

    Kinetic data, including rate constants, reaction orders and activation energies, are compiled for 34 hydrolysis reactions involving feldspars, sheet silicates, zeolites, oxides, pyroxenes and amphiboles, and for similar reactions involving calcite and pyrite. The data are compatible with a rate law consistent with surface-reaction control and transition-state theory, which is incorporated in the geochemical software packages EQ3/6 and GWB. Kinetic data for the reactions noted above are strictly compatible with the transition-state rate law only under far-from-equilibrium conditions. It is possible that the data are conceptually consistent with this rate law under both far-from-equilibrium and near-to-equilibrium conditions, but this should be confirmed whenever possible through analysis of original experimental results. Due to limitations in the availability of kinetic data for mineral-water reactions, and in order to simplify evaluations of geochemical models of groundwater evolution, it is convenient to assume local equilibrium in such models whenever possible. To assess whether this assumption is reasonable, a modeling approach accounting for coupled fluid flow and water-rock interaction is described that can be used to estimate the spatial and temporal scales of local equilibrium. The approach is demonstrated for conditions involving groundwater flow in fractures at JNC's Kamaishi in-situ test site, and is also used to estimate the travel time necessary for oxidizing surface waters to migrate to the level of a HLW repository in crystalline rock. The question of whether local equilibrium is a reasonable assumption must be addressed using an appropriate modeling approach. To be appropriate for conditions at the Kamaishi site using the modeling approach noted above, the fracture fill must closely approximate a porous medium, groundwater flow must be purely advective, and diffusion of solutes across the fracture-host rock boundary must not occur. Moreover, the mineralogical and
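
    The transition-state-theory rate law referred to above, as implemented in codes such as EQ3/6, is commonly written in the general form below (a standard expression; the exponents and catalytic terms vary between studies):

```latex
r = k\, S\, \prod_i a_i^{\,n_i}
    \left[\, 1 - \left( \frac{Q}{K} \right)^{p} \right]^{q}
```

    Here r is the reaction rate, k the rate constant, S the reactive surface area, a_i the activities of catalysing or inhibiting species with orders n_i, Q the ion activity product and K the equilibrium constant. Far from equilibrium, Q/K approaches zero and the rate reduces to kS times the activity product, which is the regime in which the compiled data strictly apply.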

  7. Regulatory and technical reports. Compilation for second quarter 1982, April to June

    International Nuclear Information System (INIS)

    1982-08-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. A detailed explanation of the entries precedes each index

  8. A Coarse-Grained Reconfigurable Architecture with Compilation for High Performance

    Directory of Open Access Journals (Sweden)

    Lu Wan

    2012-01-01

    We propose a fast data relay (FDR) mechanism to enhance existing CGRAs (coarse-grained reconfigurable architectures). FDR can not only provide multicycle data transmission concurrently with computations but also convert resource-demanding inter-processing-element global data accesses into local data accesses to avoid communication congestion. We also propose the supporting compiler techniques that can efficiently utilize the FDR feature to achieve higher performance for a variety of applications. Our results on the FDR-based CGRA are compared with two other works in this field: ADRES and RCP. Experimental results for various multimedia applications show that FDR combined with the new compiler delivers up to 29% and 21% higher performance than ADRES and RCP, respectively.

  9. Regular expressions compiler and some applications

    International Nuclear Information System (INIS)

    Saldana A, H.

    1978-01-01

    We deal with the high-level programming language of a Regular Expressions Compiler (REC). The first chapter is an introduction in which the history of the REC development and the problems related to its numerous applications are described. The syntactic and semantic rules as well as the language features are discussed just after the introduction. Concerning the applications, one adaptation is given as an example for solving numerical problems and another for data manipulation. The last chapter is an exposition of ideas and techniques for compiler construction. Examples of the adaptation to numerical problems show applications to education, vector analysis, quantum mechanics, physics, mathematics and other sciences. The rudiments of an operating system for a minicomputer are the example of the adaptation to symbolic data manipulation. REC is a programming language that could be applied to solve problems in almost any human activity. Handling of computer graphics, control equipment, research on languages, microprocessors and general research are some of the fields in which this programming language can be applied and developed. (author)

  10. PGHPF – An Optimizing High Performance Fortran Compiler for Distributed Memory Machines

    Directory of Open Access Journals (Sweden)

    Zeki Bozkus

    1997-01-01

    High Performance Fortran (HPF) is the first widely supported, efficient, and portable parallel programming language for shared and distributed memory systems. HPF is realized through a set of directive-based extensions to Fortran 90. It enables application developers and Fortran end-users to write compact, portable, and efficient software that will compile and execute on workstations, shared memory servers, clusters, traditional supercomputers, or massively parallel processors. This article describes a production-quality HPF compiler for a set of parallel machines. Compilation techniques such as data and computation distribution, communication generation, run-time support, and optimization issues are elaborated as the basis for an HPF compiler implementation on distributed memory machines. The performance of this compiler on benchmark programs demonstrates that high efficiency can be achieved executing HPF code on parallel architectures.
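
    The data distribution the article refers to is exemplified by HPF's DISTRIBUTE(BLOCK) directive: each processor owns one contiguous block of an array, and the compiler rewrites global indices as (owner, local index) pairs. The sketch below shows that mapping; it illustrates the concept only and is not PGHPF's internal code.

```python
# Owner-computes mapping for a BLOCK-distributed 1-D array, the core idea
# behind HPF-style data distribution. Illustrative, not PGHPF internals.
def block_owner(global_i, n, nprocs):
    """Return (owner rank, local index) for global index global_i."""
    block = -(-n // nprocs)            # ceil(n / nprocs)
    return global_i // block, global_i % block

n, nprocs = 10, 4                      # array A(10) distributed onto 4 procs
for i in range(n):
    print(f"A({i}) lives on processor {block_owner(i, n, nprocs)}")
```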

  11. Summary report of the 1. research co-ordination meeting on compilation and evaluation of photonuclear data for applications

    International Nuclear Information System (INIS)

    1997-04-01

    The present report contains the summary of the first Research Co-ordination Meeting on "Compilation and Evaluation of Photonuclear Data for Applications", held in Obninsk, Russia, from 3 to 6 December 1996. The project aims to produce a Technical Document on a Photonuclear Data Library for Applications and to develop an IAEA Photonuclear Data Library. Summarized are the conclusions and recommendations of the meeting, together with a detailed list of actions. Attached are the information sheet on the project, the agenda of the meeting and the list of participants, along with extended abstracts of their presentations. Refs, figs, tabs

  12. INDC list of correspondents for the exchange of nuclear data information and compilation of national nuclear data committees

    International Nuclear Information System (INIS)

    1987-09-01

    This list of INDC Correspondents, including information on currently existing National Nuclear Data Committees and their memberships, is compiled and published at the request of the International Nuclear Data Committee with the objective of promoting interaction and enhancing awareness of nuclear data activities in IAEA Member States. It also serves as a basis for the distribution of documents originated by or for the International Nuclear Data Committee and includes the names of all recipients of INDC documents. The INDC Secretariat tries to keep this list up to date in order to facilitate an efficient interchange of information on nuclear data topics. The report is presented in five sections. The first section contains a detailed description of the INDC distribution categories, distribution codes and document designator codes. The second section describes the aims, organization and objectives of individual national nuclear data committees. The third section lists names and addresses in alphabetical order within each state or international organization together with the assigned INDC document distribution code(s); where applicable, committee membership and/or area of specialization are indicated. This is followed by four shorter lists, indicating the names of individuals in each distribution category, sorted by country or international organization, and the total number of individuals in each category. The final section provides the names of nuclear data committee members, also listed by country or international organization

  13. Compilation of monitoring data on environmental concentration and pharmaceuticals; Zusammenstellung von Monitoringdaten zu Umweltkonzentrationen und Arzneimitteln

    Energy Technology Data Exchange (ETDEWEB)

    Bergmann, Axel; Fohrmann, Reinhard [IWW Rheinisch-Westfaelisches Institut fuer Wasser Beratungs- und Entwicklungsgesellschaft mbH, Muelheim an der Ruhr (Germany); Weber, Frank-Andreas [IWW Rheinisch-Westfaelisches Institut fuer Wasser Beratungs- und Entwicklungsgesellschaft mbH, Biebesheim am Rhein (Germany)

    2011-10-15

    In a comprehensive literature review we compiled an inventory of German and European monitoring data on the occurrence and behavior of pharmaceuticals in the environment. Environmental concentrations measured in various field campaigns and results of ecotoxicological and physico-chemical investigations were integrated into three databases. The analysis of these databases was used to identify priority pharmaceuticals and to suggest strategies for further monitoring. The database MEC reports 274 pharmaceuticals (both human and veterinary pharmaceuticals, of which 27 are metabolites) for which measured concentrations were available for at least one of the matrices sewage effluent, surface water, groundwater, drinking water, sewage sludge, manure, soil or sediment (10,150 database entries). The database OeKOTOX compiles 251 pharmaceuticals for which ecotoxicological effect concentrations for at least one test organism are available in the literature, and the database ''Umweltverhalten'' includes physico-chemical parameters of 183 compounds. The compiled citations of the relevant literature (1,382 citations) were provided for further use in the bibliographic software Reference Manager. The analysis of the databases shows that measured concentrations can be evaluated against ecotoxicological effect concentrations for only a subset of 70 pharmaceuticals. The estimation of PNEC values (Predicted No Effect Concentration) allowed for the identification of 19 pharmaceuticals with sufficient and 9 pharmaceuticals with poor ecotoxicological data which presumably endanger ecosystems in at least one river section in Germany. Special attention should be paid to ''novel'' pharmaceuticals, for which missing environmental and/or ecotoxicological data prevent a reliable risk assessment, but dramatically increasing consumption rates point to a high risk potential. The prioritization of pharmaceuticals presented by the authors considers the
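    A sketch of the PNEC-based screening described above: a compound is flagged when its measured environmental concentration (MEC) exceeds the predicted no-effect concentration (PNEC), i.e. when the risk quotient MEC/PNEC reaches 1. All names and values below are invented placeholders, not figures from the report:

    ```python
    # Risk-quotient screening: RQ = MEC / PNEC; RQ >= 1 flags potential risk.
    pnec_ng_per_l = {"compound_A": 50.0, "compound_B": 2500.0}   # placeholders
    mec_ng_per_l = {"compound_A": 120.0, "compound_B": 900.0}    # placeholders

    for compound, mec in mec_ng_per_l.items():
        rq = mec / pnec_ng_per_l[compound]
        flag = "potential risk" if rq >= 1 else "no indicated risk"
        print(f"{compound}: RQ = {rq:.2f} ({flag})")
    ```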

  14. Progress in fission product nuclear data. Information about activities in the field of measurements and compilations/evaluations of fission product nuclear data (FPND)

    International Nuclear Information System (INIS)

    Lammer, G.

    1978-07-01

    This is the fourth issue of a report series on Fission Product Nuclear Data (FPND) which is published by the Nuclear Data Section (NDS) of the International Atomic Energy Agency (IAEA). The purpose of this series is to inform scientists working on FPND, or using such data, about all activities in this field which are planned, ongoing, or have recently been completed. The main part of this report consists of unaltered original contributions which the authors have sent to IAEA/NDS. The types of activities included in this report are measurements, compilations and evaluations of: fission product yields (neutron induced and spontaneous fission); neutron reaction cross sections of fission products; data related to the radioactive decay of fission products; delayed neutron data of fission products; and lumped fission product data (decay heat, absorption etc.)

  15. Algorithmic synthesis using Python compiler

    Science.gov (United States)

    Cieszewski, Radoslaw; Romaniuk, Ryszard; Pozniak, Krzysztof; Linczuk, Maciej

    2015-09-01

    This paper presents a Python-to-VHDL compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and translates it into VHDL. An FPGA combines many benefits of both software and ASIC implementations. Like software, the programmed circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. This can be achieved by using many computational resources at the same time. Creating parallel programs implemented in FPGAs in pure HDL is difficult and time consuming. By using a higher level of abstraction and a high-level synthesis compiler, implementation time can be reduced. The compiler has been implemented using the Python language. This article describes the design, implementation and results of the tools created.
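    As a hedged sketch (not the authors' compiler) of the front-end step such a tool performs, the following uses Python's standard ast module to parse a simple arithmetic function and emit a VHDL-flavoured signal assignment:

    ```python
    import ast

    OPS = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*"}

    def emit_expr(node: ast.expr) -> str:
        """Translate an arithmetic expression subtree to VHDL-style text."""
        if isinstance(node, ast.BinOp):
            return f"({emit_expr(node.left)} {OPS[type(node.op)]} {emit_expr(node.right)})"
        if isinstance(node, ast.Name):
            return node.id
        if isinstance(node, ast.Constant):
            return str(node.value)
        raise NotImplementedError(ast.dump(node))

    source = "def f(a, b, c):\n    return a * b + c\n"
    func = ast.parse(source).body[0]
    ret = func.body[0]                            # the return statement
    print(f"result <= {emit_expr(ret.value)};")   # result <= ((a * b) + c);
    ```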

  16. Regulatory and technical reports: compilation for third quarter 1982 July-September

    International Nuclear Information System (INIS)

    1982-11-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. These precede the following indexes: Contractor Report Number Index; Personal Author Index; Subject Index; NRC Originating Organization Index (Staff Reports); NRC Contract Sponsor Index (Contractor Reports); Contractor Index; and Licensed Facility Index

  17. Notes on Compiling a Corpus- Based Dictionary

    Directory of Open Access Journals (Sweden)

    František Čermák

    2011-10-01

    Full Text Available

    ABSTRACT: On the basis of a sample analysis of a Czech adjective, a definition based on the data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing out the drawbacks of definitions found in traditional dictionaries. The steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. These are supplemented by additional remarks and caveats useful in the compilation of a dictionary. Thus, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified, form.

    SUMMARY: Notes on compiling a corpus-based dictionary. On the basis of a sample analysis of a Czech adjective, a definition based on data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally presented, pointing out the shortcomings of definitions found in traditional dictionaries. The steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. They are supplemented by additional remarks and warnings that are useful for the compilation of a dictionary. In this way, a brief survey of some of the main steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified, form.

    Keywords: MONOLINGUAL DICTIONARIES, CORPUS LEXICOGRAPHY, SYNTAGMATICS AND PARADIGMATICS IN DICTIONARIES, DICTIONARY ENTRY, TYPES OF LEMMAS, PRAGMATICS, TREATMENT OF

  18. Compiling a 50-year journey

    DEFF Research Database (Denmark)

    Hutton, Graham; Bahr, Patrick

    2017-01-01

    Fifty years ago, John McCarthy and James Painter published the first paper on compiler verification, in which they showed how to formally prove the correctness of a compiler that translates arithmetic expressions into code for a register-based machine. In this article, we revisit this example...
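    The McCarthy-Painter setting can be sketched compactly in Python: compile arithmetic expressions to a tiny machine and check on sample inputs that running the compiled code agrees with direct evaluation (a test, not a proof). The sketch targets a stack machine for brevity, whereas the original paper used a register machine:

    ```python
    # Expressions: an int literal, a variable name, or ("+", e1, e2).

    def evaluate(expr, env):
        if isinstance(expr, str):                 # variable
            return env[expr]
        if isinstance(expr, int):                 # literal
            return expr
        _, lhs, rhs = expr                        # only "+" in this sketch
        return evaluate(lhs, env) + evaluate(rhs, env)

    def compile_expr(expr):
        if isinstance(expr, str):
            return [("LOAD", expr)]
        if isinstance(expr, int):
            return [("PUSH", expr)]
        _, lhs, rhs = expr
        return compile_expr(lhs) + compile_expr(rhs) + [("ADD", None)]

    def run(code, env):
        stack = []
        for op, arg in code:
            if op == "PUSH":
                stack.append(arg)
            elif op == "LOAD":
                stack.append(env[arg])
            else:                                 # ADD
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
        return stack[0]

    expr = ("+", ("+", "x", 1), "y")
    env = {"x": 2, "y": 39}
    assert evaluate(expr, env) == run(compile_expr(expr), env) == 42
    ```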

  19. USGS compilation of geographic information system (GIS) data of coal mines and coal-bearing areas in Mongolia

    Science.gov (United States)

    Trippi, Michael H.; Belkin, Harvey E.

    2015-09-10

    Geographic information system (GIS) information may facilitate energy studies, which in turn provide input for energy policy decisions. The U.S. Geological Survey (USGS) has compiled GIS data representing coal mines, deposits (including those with and without coal mines), occurrences, areas, basins, and provinces of Mongolia as of 2009. These data are now available for download, and may be used in a GIS for a variety of energy resource and environmental studies of Mongolia. Chemical data for 37 coal samples from a previous USGS study of Mongolia (Tewalt and others, 2010) are included in a downloadable GIS point shapefile and shown on the map of Mongolia. A brief report summarizes the methodology used for creation of the shapefiles and the chemical analyses run on the samples.

  20. Toward a Last Interglacial Compilation Using a Tephra-based Chronology: a Future Reference For Model-data Comparison

    Science.gov (United States)

    Bazin, L.; Govin, A.; Capron, E.; Nomade, S.; Lemieux-Dudon, B.; Landais, A.

    2017-12-01

    The Last Interglacial (LIG, 129-116 ka) is a key period to decipher the interactions between the different components of the climate system under warmer-than-preindustrial conditions. Modelling the LIG climate is now part of the CMIP6/PMIP4 targeted simulations. As a result, recent efforts have been made to propose surface temperature compilations focusing on the spatio-temporal evolution of the LIG climate, and not only on its peak warmth as previously proposed. However, the major limitation of these compilations remains the climatic alignment of records (e.g. temperature, foraminiferal δ18O) that is performed to define the sites' chronologies. Such methods prevent the proper discussion of phase relationships between the different sites. Thanks to recent developments of the Bayesian Datice dating tool, we are now able to build coherent multi-archive chronologies with a proper propagation of the associated uncertainties. We make the best use of common tephra layers identified in well-dated continental archives and marine sediment cores of the Mediterranean region to propose a coherent chronological framework for the LIG independent of any climatic assumption. We then extend this precise chronological context to the North Atlantic as a first step toward a global coherent compilation of surface temperature and stable isotope records. Based on this synthesis, we propose guidelines for the interpretation of different proxies measured from different archives that will be compared with climate model parameters. Finally, we present time-slices (e.g. 127 ka) of the preliminary regional synthesis of temperature reconstructions and stable isotopes to serve as a reference for future model-data comparison of the upcoming CMIP6/PMIP4 LIG simulations.

  1. Fifth Baltic Sea pollution load compilation (PLC-5). An executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Svendsen, L.M.; Staaf, H.; Pyhala, M.; Kotilainen, P.; Bartnicki, J.; Knuuttila, S.; Durkin, M.

    2012-07-01

    This report summarizes and combines the main results of the Fifth Baltic Sea Pollution Load Compilation (HELCOM 2011), which covers waterborne loads to the sea, with data on atmospheric loads that countries submit to the Co-operative Programme for Monitoring and Evaluation of the Long-range Transmission of Air Pollutants in Europe (EMEP), which subsequently compiles and reports this information to HELCOM.

  2. Compiler-Directed Transformation for Higher-Order Stencils

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-20

    As the cost of data movement increasingly dominates performance, developers of finite-volume and finite-difference solutions for partial differential equations (PDEs) are exploring novel higher-order stencils that increase numerical accuracy and computational intensity. This paper describes a new compiler reordering transformation applied to stencil operators that performs partial sums in buffers and reuses the partial sums in computing multiple results. This optimization has multiple effects on improving stencil performance that are particularly important to higher-order stencils: it exploits data reuse, reduces floating-point operations, and exposes efficient SIMD parallelism to backend compilers. We study the benefit of this optimization in the context of Geometric Multigrid (GMG), a widely used method to solve PDEs, using four different Jacobi smoothers built from 7-, 13-, 27-, and 125-point stencils. We quantify performance, speedup, and numerical accuracy, and use the Roofline model to qualify our results. Ultimately, we obtain over 4× speedup on the smoothers themselves and up to a 3× speedup on the multigrid solver. Finally, we demonstrate that high-order multigrid solvers have the potential of reducing total data movement and energy by several orders of magnitude.
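    A simplified one-dimensional illustration of the partial-sums idea (not the paper's actual buffer transformation): the sum of each 3-point window is derived from the previous window's sum with one add and one subtract instead of being recomputed from scratch:

    ```python
    def stencil_naive(a):
        # Recompute every 3-point window sum from scratch.
        return [a[i - 1] + a[i] + a[i + 1] for i in range(1, len(a) - 1)]

    def stencil_partial_sums(a):
        out = []
        window = a[0] + a[1] + a[2]           # first window, computed once
        out.append(window)
        for i in range(2, len(a) - 1):
            window += a[i + 1] - a[i - 2]     # slide the window, reusing the sum
            out.append(window)
        return out

    data = [3, 1, 4, 1, 5, 9, 2, 6]
    assert stencil_naive(data) == stencil_partial_sums(data)
    ```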

  3. Proving correctness of compilers using structured graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    it into a compiler implementation using a graph type along with a correctness proof. The implementation and correctness proof of a compiler using a tree type without explicit jumps is simple, but yields code duplication. Our method provides a convenient way of improving such a compiler without giving up the benefits...

  4. 12 CFR 411.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Semi-annual compilation. 411.600 Section 411.600 Banks and Banking EXPORT-IMPORT BANK OF THE UNITED STATES NEW RESTRICTIONS ON LOBBYING Agency Reports § 411.600 Semi-annual compilation. (a) The head of each agency shall collect and compile the...

  5. Compilation of data for the analysis of radionuclide migration from SFL 3-5

    International Nuclear Information System (INIS)

    Skagius, K.; Pettersson, Michael; Wiborgh, M.; Albinsson, Yngve; Holgersson, Stellan

    1999-12-01

    A preliminary safety assessment of the deep repository for long-lived, low and intermediate level waste, SFL 3-5, has been made. This report contains a compilation of data selected for the calculations of the migration of radionuclides and toxic metals from the waste to the biosphere. It also contains the data needed for the next step, which is to calculate dose to man from the far-field release figures. In the preliminary safety assessment it is assumed that SFL 3-5 is located in connection to the deep repository for spent fuel. This makes it possible to utilise site-specific information derived within the safety assessment of the deep repository for spent fuel, SR 97, for the sites Aberg, Beberg and Ceberg. When information from SR 97 is utilised, the values selected are as far as possible those proposed as a 'reasonable estimate' for the migration calculations in SR 97. The selection of values for parameters specific for the calculation of migration from the SFL 3-5 repository is in general on the pessimistic side. The uncertainty in the selected values is discussed and if possible also quantified

  6. Compilation of data for the analysis of radionuclide migration from SFL 3-5

    Energy Technology Data Exchange (ETDEWEB)

    Skagius, K.; Pettersson, Michael; Wiborgh, M. [Kemakta Konsult AB, Stockholm (Sweden); Albinsson, Yngve; Holgersson, Stellan [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Nuclear Chemistry

    1999-12-01

    A preliminary safety assessment of the deep repository for long-lived, low and intermediate level waste, SFL 3-5, has been made. This report contains a compilation of data selected for the calculations of the migration of radionuclides and toxic metals from the waste to the biosphere. It also contains the data needed for the next step, which is to calculate dose to man from the far-field release figures. In the preliminary safety assessment it is assumed that SFL 3-5 is located in connection to the deep repository for spent fuel. This makes it possible to utilise site-specific information derived within the safety assessment of the deep repository for spent fuel, SR 97, for the sites Aberg, Beberg and Ceberg. When information from SR 97 is utilised, the values selected are as far as possible those proposed as a 'reasonable estimate' for the migration calculations in SR 97. The selection of values for parameters specific for the calculation of migration from the SFL 3-5 repository is in general on the pessimistic side. The uncertainty in the selected values is discussed and if possible also quantified.

  7. Compilation of piping benchmark problems - Cooperative international effort

    Energy Technology Data Exchange (ETDEWEB)

    McAfee, W J [comp.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations.

  8. Compilation of piping benchmark problems - Cooperative international effort

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations

  9. Compilation of data concerning known and suspected water hammer events in nuclear power plants, CY 1969

    International Nuclear Information System (INIS)

    Chapman, R.L.; Christensen, D.D.; Dafoe, R.E.; Hanner, O.M.; Wells, M.E.

    1981-05-01

    This report compiles data concerning known and suspected water hammer events reported by BWR and PWR power plants in the United States from January 1, 1969, to May 1, 1981. This information is summarized for each event and is tabulated for all events by plant, plant type, year of occurrence, type of water hammer, system affected, basis/cause for the event, and damage incurred. Information is also included from other events not specifically identified as water hammer related. These other events involved vibration and/or system components similar to those involved in the water hammer events. The other events are included to ensure completeness of the report, but are not used to point out particular facts or trends. This report does not evaluate findings abstracted from the data

  10. Deuterium in the water cycle of the Schirmacher Oasis (Dronning Maud Land, East Antarctica). A data compilation

    International Nuclear Information System (INIS)

    Kowski, P.; Richter, W.

    1988-01-01

    The Schirmacher Oasis (Dronning Maud Land) - one of the rock deserts of the South Polar region - is situated on the coast of the Antarctic continent, between inland and shelf ice. The data compilation contains results of deuterium studies from different parts of the local water cycle and is arranged according to the main parts: precipitation and atmospheric moisture, both collected near Novolazarevskaya Station; lake water; surface snow and ice; shallow drill cores of snow and ice; and melt water runoff. Finally, monthly means of precipitation and atmospheric moisture are given. (author)

  11. Compiling the First Monolingual Lusoga Dictionary

    Directory of Open Access Journals (Sweden)

    Minah Nabirye

    2011-10-01

    Full Text Available

    Abstract: In this research article a study is made of the approach followed to compile the first-ever monolingual dictionary for Lusoga. Lusoga is a Bantu language spoken in Uganda by slightly over two million people. Being an under-resourced language, the Lusoga orthography had to be designed, a grammar written, and a corpus built, before embarking on the compilation of the dictionary. This compilation was aimed at attaining an academic degree, hence requiring a rigorous research methodology. Firstly, the prevailing methods for compiling dictionaries were mainly practical and insufficient in explaining the theoretical linguistic basis for dictionary compilation. Since dictionaries are based on meaning, the theory of meaning was used to account for all linguistic data considered in dictionaries. However, meaning is considered at a very abstract level, far removed from the process of compiling dictionaries. Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular theory explains how the different modules of a language contribute information to the different parts of the dictionary article or dictionary information in general. Secondly, the research also had to contend with the different approaches for analysing Bantu languages for Bantu and European audiences. A description of the Bantu- and European-centred approaches to Bantu studies was undertaken in respect of (a) the classification of Lusoga words, and (b) the specification of their citations. As a result, Lusoga lexicography deviates from the prevailing Bantu classification and citation of nouns, adjectives and verbs in particular. The dictionary was tested on two separate occasions and all the feedback was considered in the compilation process. This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary

  12. The national assessment of shoreline change: a GIS compilation of vector cliff edges and associated cliff erosion data for the California coast

    Science.gov (United States)

    Hapke, Cheryl; Reid, David; Borrelli, Mark

    2007-01-01

    The U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector cliff edges and associated rates of cliff retreat along the open-ocean California coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Cliff erosion is a chronic problem along many coastlines of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of coastal cliff retreat. There is also a critical need for these data to be consistent from one region to another. One objective of this work is to develop a standard, repeatable methodology for mapping and analyzing cliff edge retreat so that periodic, systematic, and internally consistent updates of cliff edge position and associated rates of erosion can be made at a national scale. This data compilation of open-ocean cliff edges for the California coast is a separate yet related study to that of Hapke and others (2006), which documents shoreline change along sandy shorelines of the California coast and is itself one in a series that includes the Gulf of Mexico and the Southeast Atlantic coast (Morton and others, 2004; Morton and Miller, 2005). Future reports and data compilations will include coverage of the Northeast U.S., the Great Lakes, Hawaii and Alaska. Cliff edge change is determined by comparing the positions of one historical cliff edge digitized from maps with a modern cliff edge derived from topographic LIDAR (light detection and ranging) surveys. Historical cliff edges for the California coast represent the 1920s-1930s time period; the most recent cliff edge was delineated using data collected between 1998 and 2002. End-point rate calculations were used to evaluate rates of erosion between the two cliff edges. Please refer to our full report on cliff edge erosion along the California
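    The end-point rate calculation mentioned above reduces to dividing the distance between the two digitized cliff-edge positions by the time elapsed between surveys; the numbers in this sketch are invented for illustration:

    ```python
    def end_point_rate(pos_old_m, year_old, pos_new_m, year_new):
        """Cliff retreat rate in metres per year (positive = landward retreat)."""
        return (pos_new_m - pos_old_m) / (year_new - year_old)

    # Cliff edge mapped 12.6 m farther landward in 2000 than in 1930:
    print(f"{end_point_rate(0.0, 1930, 12.6, 2000):.2f} m/yr")   # 0.18 m/yr
    ```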

  13. Shear-wave velocity compilation for Northridge strong-motion recording sites

    Science.gov (United States)

    Borcherdt, Roger D.; Fumal, Thomas E.

    2002-01-01

    Borehole and other geotechnical information collected at the strong-motion recording sites of the Northridge earthquake of January 17, 1994 provide an important new basis for the characterization of local site conditions. These geotechnical data, when combined with analysis of strong-motion recordings, provide an empirical basis to evaluate site coefficients used in current versions of US building codes. Shear-wave-velocity estimates to a depth of 30 meters are derived for 176 strong-motion recording sites. The estimates are based on borehole shear-velocity logs, physical property logs, correlations with physical properties and digital geologic maps. Surface-wave velocity measurements and standard penetration data are compiled as additional constraints. These data as compiled from a variety of databases are presented via GIS maps and corresponding tables to facilitate use by other investigators.
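    A sketch of the standard time-averaged shear-wave velocity to 30 meters depth (Vs30) computed from a layered borehole log, Vs30 = 30 / Σ(d_i / v_i); the layer values below are invented for illustration:

    ```python
    def vs30(layers):
        """layers: list of (thickness_m, velocity_m_per_s) summing to 30 m."""
        travel_time = sum(d / v for d, v in layers)   # one-way travel time (s)
        return 30.0 / travel_time

    log = [(5.0, 180.0), (10.0, 300.0), (15.0, 450.0)]   # invented layers
    print(f"Vs30 = {vs30(log):.0f} m/s")                 # ~318 m/s
    ```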

  14. Compilation of new and previously published geochemical and modal data for Mesoproterozoic igneous rocks of the St. Francois Mountains, southeast Missouri

    Science.gov (United States)

    du Bray, Edward A.; Day, Warren C.; Meighan, Corey J.

    2018-04-16

    The purpose of this report is to present recently acquired as well as previously published geochemical and modal petrographic data for igneous rocks in the St. Francois Mountains, southeast Missouri, as part of an ongoing effort to understand the regional geology and ore deposits of the Mesoproterozoic basement rocks of southeast Missouri, USA. The report includes geochemical data that are (1) newly acquired by the U.S. Geological Survey and (2) compiled from numerous sources published during the last fifty-five years. These data are required for ongoing petrogenetic investigations of these rocks. Voluminous Mesoproterozoic igneous rocks in the St. Francois Mountains of southeast Missouri constitute the basement buried beneath Paleozoic sedimentary rock that is over 600 meters thick in places. The Mesoproterozoic rocks of southeast Missouri represent a significant component of the approximately 1.4 billion-year-old (Ga) igneous rocks that crop out extensively in North America along the southeast margin of Laurentia. Subsequent researchers have suggested that iron oxide-copper deposits in the St. Francois Mountains are genetically associated with ca. 1.4 Ga magmatism in this region. The geochemical and modal data sets described herein were compiled to support investigations concerning the tectonic setting and petrologic processes responsible for the associated magmatism.

  15. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia; Angulo, C.; Arnould, M.

    2000-01-01

    The rapidly growing wealth of nuclear data becomes less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates, issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (authors)
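    Such compilations tabulate the rate per particle pair obtained by folding the cross section σ(E) with a Maxwell-Boltzmann energy distribution; a standard textbook form of this relation (not a formula quoted from the NACRE paper itself) is:

    ```latex
    % Thermonuclear reaction rate per particle pair at temperature T,
    % with reduced mass \mu, Boltzmann constant k, Avogadro's number N_A:
    N_A \langle \sigma v \rangle
        = N_A \left( \frac{8}{\pi \mu} \right)^{1/2} (kT)^{-3/2}
          \int_0^{\infty} \sigma(E)\, E \, e^{-E/kT} \, dE
    ```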

  16. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia

    1999-01-01

    The rapidly growing wealth of nuclear data becomes less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates, issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (author)

  17. MIRNA-DISTILLER: a stand-alone application to compile microRNA data from databases

    Directory of Open Access Journals (Sweden)

    Jessica K. Rieger

    2011-07-01

    Full Text Available MicroRNAs (miRNA) are small non-coding RNA molecules of ~22 nucleotides which regulate large numbers of genes by binding to seed sequences at the 3’-UTR of target gene transcripts. The target mRNA is then usually degraded or its translation is inhibited, thus resulting in posttranscriptional down-regulation of gene expression at the mRNA and/or protein level. Due to the bioinformatic difficulties in predicting functional miRNA binding sites, several publicly available databases have been developed that predict miRNA binding sites based on different algorithms. The parallel use of different databases is currently indispensable, but highly inconvenient and time consuming, especially when working with numerous genes of interest. We have therefore developed a new stand-alone program, termed MIRNA-DISTILLER, which allows the user to compile miRNA data for given target genes from public databases. Currently implemented are TargetScan, microCosm, and miRDB, which may be queried independently, pairwise, or together to calculate the respective intersections. Data are stored locally for application of further analysis tools including freely definable biological parameter filters, customized output lists for both miRNAs and target genes, and various graphical facilities. The software, a data example file and a tutorial are freely available at http://www.ikp-stuttgart.de/content/language1/html/10415.asp
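    The intersection feature described above amounts to set operations over per-database target predictions; this sketch illustrates the idea with invented miRNA names, not real database output:

    ```python
    # Predictions for one target gene from three sources (placeholders).
    targetscan = {"miR-21", "miR-34a", "miR-155", "miR-200b"}
    microcosm = {"miR-21", "miR-155", "miR-16"}
    mirdb = {"miR-21", "miR-34a", "miR-155"}

    print(targetscan & microcosm)            # pairwise intersection
    print(targetscan & microcosm & mirdb)    # consensus of all three sources
    ```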

  18. MIRNA-DISTILLER: A Stand-Alone Application to Compile microRNA Data from Databases.

    Science.gov (United States)

    Rieger, Jessica K; Bodan, Denis A; Zanger, Ulrich M

    2011-01-01

    MicroRNAs (miRNA) are small non-coding RNA molecules of ∼22 nucleotides which regulate large numbers of genes by binding to seed sequences at the 3'-untranslated region of target gene transcripts. The target mRNA is then usually degraded or its translation is inhibited, thus resulting in posttranscriptional down-regulation of gene expression at the mRNA and/or protein level. Due to the bioinformatic difficulties in predicting functional miRNA binding sites, several publicly available databases have been developed that predict miRNA binding sites based on different algorithms. The parallel use of different databases is currently indispensable, but highly inconvenient and time consuming, especially when working with numerous genes of interest. We have therefore developed a new stand-alone program, termed MIRNA-DISTILLER, which allows the user to compile miRNA data for given target genes from public databases. Currently implemented are TargetScan, microCosm, and miRDB, which may be queried independently, pairwise, or together to calculate the respective intersections. Data are stored locally for application of further analysis tools including freely definable biological parameter filters, customized output lists for both miRNAs and target genes, and various graphical facilities. The software, a data example file and a tutorial are freely available at http://www.ikp-stuttgart.de/content/language1/html/10415.asp.

  19. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    Full Text Available There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with the equational programming language (EPL). Our approach is based on a program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by the source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by the compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  20. Thermodynamic data for modeling acid mine drainage problems: compilation and estimation of data for selected soluble iron-sulfate minerals

    Science.gov (United States)

    Hemingway, Bruch S.; Seal, Robert R.; Chou, I-Ming

    2002-01-01

    Enthalpy of formation, Gibbs energy of formation, and entropy values have been compiled from the literature for the hydrated ferrous sulfate minerals melanterite, rozenite, and szomolnokite, and a variety of other hydrated sulfate compounds. On the basis of this compilation, it appears that there is no evidence for an excess enthalpy of mixing for sulfate-H2O systems, except for the first H2O molecule of crystallization. The enthalpy and Gibbs energy of formation of each H2O molecule of crystallization, except the first, in the iron(II) sulfate - H2O system is -295.15 and -238.0 kJ·mol-1, respectively. The absence of an excess enthalpy of mixing is used as the basis for estimating thermodynamic values for a variety of ferrous, ferric, and mixed-valence sulfate salts of relevance to acid-mine drainage systems.
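    The reported additivity suggests a simple estimation scheme: beyond the first H2O of crystallization, each additional H2O contributes a fixed -238.0 kJ/mol to the Gibbs energy of formation in the iron(II) sulfate-H2O system. The monohydrate value below is a placeholder, not a figure from the compilation:

    ```python
    DELTA_G_PER_H2O = -238.0       # kJ/mol per H2O beyond the first (from abstract)

    def estimate_gibbs(g_monohydrate_kj, n_waters):
        """Estimate Gibbs energy of formation of FeSO4·nH2O from FeSO4·H2O."""
        return g_monohydrate_kj + (n_waters - 1) * DELTA_G_PER_H2O

    g_szomolnokite = -1240.0       # placeholder value for FeSO4·H2O
    print(estimate_gibbs(g_szomolnokite, 7))   # estimate for melanterite, FeSO4·7H2O
    ```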

  1. EPOCA/EUR-OCEANS data compilation on the biological and biogeochemical responses to ocean acidification

    Directory of Open Access Journals (Sweden)

    A.-M. Nisumaa

    2010-07-01

    Full Text Available The uptake of anthropogenic CO2 by the oceans has led to a rise in the oceanic partial pressure of CO2, and to a decrease in pH and carbonate ion concentration. This modification of the marine carbonate system is referred to as ocean acidification. Numerous papers report the effects of ocean acidification on marine organisms and communities but few have provided details concerning full carbonate chemistry and complementary observations. Additionally, carbonate system variables are often reported in different units, calculated using different sets of dissociation constants and on different pH scales. Hence the direct comparison of experimental results has been problematic and often misleading. The need was identified to (1) gather data on carbonate chemistry, biological and biogeochemical properties, and other ancillary data from published experimental data, (2) transform the information into a common framework, and (3) make the data freely available. The present paper is the outcome of an effort to integrate ocean carbonate chemistry data from the literature which has been supported by the European Network of Excellence for Ocean Ecosystems Analysis (EUR-OCEANS) and the European Project on Ocean Acidification (EPOCA). A total of 185 papers were identified, 100 contained enough information to readily compute carbonate chemistry variables, and 81 data sets were archived at PANGAEA – The Publishing Network for Geoscientific & Environmental Data. This data compilation is regularly updated as an ongoing mission of EPOCA.

    Data access: http://doi.pangaea.de/10.1594/PANGAEA.735138

  2. Annual accumulation over the Greenland ice sheet interpolated from historical and newly compiled observation data

    Science.gov (United States)

    Shen, Dayong; Liu, Yuling; Huang, Shengli

    2012-01-01

    The estimation of ice/snow accumulation is of great significance in quantifying the mass balance of ice sheets and variation in water resources. Improving the accuracy and reducing uncertainty has been a challenge for the estimation of annual accumulation over the Greenland ice sheet. In this study, we kriged and analyzed the spatial pattern of accumulation based on an observation data series including 315 points used in recent research, plus 101 ice cores and snow pits and newly compiled data from 23 coastal weather stations. The estimated annual accumulation over the Greenland ice sheet is 31.2 g cm−2 yr−1, with a standard error of 0.9 g cm−2 yr−1. The main differences between the improved map developed in this study and the recently published accumulation maps are in the coastal areas, especially the southeast and southwest regions. The analysis of accumulation versus elevation reveals the distribution patterns of accumulation over the Greenland ice sheet.
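    The study interpolates point observations by kriging; as a lightweight stand-in that shares the basic idea of weighting nearby observations more heavily, this sketch uses inverse-distance weighting with invented coordinates and accumulation values:

    ```python
    import math

    def idw(x, y, points, power=2.0):
        """Interpolate a value at (x, y) from (xi, yi, zi) observations."""
        num = den = 0.0
        for xi, yi, zi in points:
            d = math.hypot(x - xi, y - yi)
            if d == 0:
                return zi              # exactly on an observation point
            w = 1.0 / d ** power
            num += w * zi
            den += w
        return num / den

    obs = [(0.0, 0.0, 25.0), (10.0, 0.0, 40.0), (0.0, 10.0, 30.0)]  # invented
    print(f"{idw(4.0, 4.0, obs):.1f} g/cm^2/yr")
    ```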

  3. Compilation of criticality data involving thorium or 233U and light water moderation

    Energy Technology Data Exchange (ETDEWEB)

    Gore, B.F.

    1978-07-01

    The literature has been searched for criticality data for light water moderated systems which contain thorium or 233U, and the data found are compiled herein. They are from critical experiments, extrapolations, and exponential experiments performed with homogeneous solutions and metal spheres of 233U; with lattices of fuel rods containing highly enriched 235UO2-ThO2 and 233UO2-ThO2; and with arrays of cylinders of 233U solutions. The extent of existing criticality data has been compared with that necessary to implement a thorium-based fuel cycle. No experiments have been performed with any solutions containing thorium. Neither do data exist for homogeneous 233U systems with H/U < 34, except for solid metal systems. Arrays of solution cylinders up to 3 x 3 x 3 have been studied. Data for solutions containing fixed or soluble poisons are very limited. All critical lattices using 233UO2-ThO2 fuels (LWBR program) were zoned radially, and in most cases axially also. Only lattice experiments using 235UO2-ThO2 fuels have been performed using a single fuel rod type. Critical lattices of 235UO2-ThO2 rods poisoned with boron have been measured, but only exponential experiments have been performed using boron-poisoned lattices of 233UO2-ThO2 rods. No criticality data exist for denatured fuels (containing significant amounts of 238U) in either solution or lattice configurations.

  4. CAPS OpenACC Compilers: Performance and Portability

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    The announcement in late 2011 of the new OpenACC directive-based programming standard, supported by the CAPS, CRAY and PGI compilers, has opened the door to porting more scientific applications to many-core systems. Following a porting methodology, this talk will first review the principles of programming with OpenACC and then the advanced features available in the CAPS compilers to further optimize OpenACC applications: library integration and tuning directives with auto-tune mechanisms to build applications adaptive to different GPUs. CAPS compilers use hardware vendors' backends such as NVIDIA CUDA and OpenCL, making them the only OpenACC compilers supporting various many-core architectures. About the speaker Stéphane Bihan is co-founder and currently Director of Sales and Marketing at CAPS enterprise. He has held several R&D positions in companies such as ARC international plc in London, Canon Research Center France, ACE compiler experts in Amsterdam and the INRIA r...

  5. HAL/S-360 compiler test activity report

    Science.gov (United States)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  6. Irradiation of spices, herbs and other vegetable seasonings: A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1992-02-01

    This publication contains a compilation of all available scientific and technical data on the irradiation of spices, herbs and other vegetable seasonings. It is intended to assist governments in considering the authorization of this particular application of radiation processing of food and in ensuring its control in the facility and the control of irradiated food products moving in trade. The Compilation was prepared in response to the requirement of the Codex General Standard for Irradiated Foods and associated Code that radiation treatment of food be justified on the basis of a technological need or of a need to improve the hygienic quality of the food. It was prepared also in response to the recommendations of the FAO/IAEA/WHO/ITC-UNCTAD/GATT International Conference on the Acceptance, Control of and Trade in Irradiated Food (Geneva, 1989) concerning the need for regulatory control of radiation processing of food. It is hoped that the information contained in this publication will assist governments in considering requests for the approval of radiation treatment of spices, herbs and other vegetable seasonings, or requests for authorization to import such irradiated products. Refs and tabs

  7. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

    This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. An FPGA combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation and first results of the Python-based compiler created.

  8. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1997, July--September

    International Nuclear Information System (INIS)

    Stevenson, L.L.

    1998-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. This report contains the third quarter 1997 abstracts

  9. Compilation of excitation cross sections for He atoms by electron impact

    International Nuclear Information System (INIS)

    Kato, T.; Itikawa, Y.; Sakimoto, K.

    1992-03-01

    Experimental and theoretical data are compiled on the cross section for the excitation of He atoms by electron impact. The available data are compared graphically. The survey of the literature has been made through the end of 1991. (author)

  10. A software methodology for compiling quantum programs

    Science.gov (United States)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.
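    As a toy illustration of one optimization layer in such a stack (not the authors' toolchain), the pass below peephole-cancels adjacent self-inverse gates in a circuit represented as a list of (gate, qubits) tuples:

    ```python
    SELF_INVERSE = {"H", "X", "Z", "CNOT"}

    def cancel_pass(circuit):
        out = []
        for gate in circuit:
            if out and out[-1] == gate and gate[0] in SELF_INVERSE:
                out.pop()              # G then G on the same qubits is identity
            else:
                out.append(gate)
        return out

    circ = [("H", (0,)), ("H", (0,)), ("CNOT", (0, 1)), ("X", (1,)),
            ("X", (1,)), ("CNOT", (0, 1))]
    prev = None
    while prev != circ:                # iterate the pass to a fixed point
        prev, circ = circ, cancel_pass(circ)
    print(circ)                        # [] -- the whole circuit cancels
    ```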

  11. Regulatory and technical reports (abstract index journal). Annual compilation for 1984. Volume 9, No. 4

    International Nuclear Information System (INIS)

    1985-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  12. A compiler for variational forms

    OpenAIRE

    Kirby, Robert C.; Logg, Anders

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in s...

  13. Compiler Construction Using Java, JavaCC, and Yacc

    CERN Document Server

    Dos Reis, Anthony J

    2012-01-01

    Broad in scope, involving theory, the application of that theory, and programming technology, compiler construction is a moving target, with constant advances in compiler technology taking place. Today, a renewed focus on do-it-yourself programming makes a quality textbook on compilers, that both students and instructors will enjoy using, of even more vital importance. This book covers every topic essential to learning compilers from the ground up and is accompanied by a powerful and flexible software package for evaluating projects, as well as several tutorials, well-defined projects, and tes

  14. Compilation of TFTR materials data

    International Nuclear Information System (INIS)

    Havener, W.J.

    1975-12-01

    In order to document the key thermophysical property data used in the conceptual design of Tokamak Fusion Test Reactor (TFTR) systems and components, a series of data packages has been prepared. It is expected that data for additional materials will be added and the information already provided will be updated to provide a project-wide data base

  15. PIG 3 - A simple compiler for mercury

    Energy Technology Data Exchange (ETDEWEB)

    Bindon, D C [Computer Branch, Technical Assessments and Services Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1961-06-15

    A short machine-language compilation scheme is described, which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  16. PIG 3 - A simple compiler for mercury

    International Nuclear Information System (INIS)

    Bindon, D.C.

    1961-06-01

    A short machine-language compilation scheme is described, which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  17. Lye-data-compiled-scihub

    Data.gov (United States)

    U.S. Environmental Protection Agency — The data contained in this worksheet provide the quantitative detection of potentially pathogenic fungi in treated and untreated rainwater samples. This dataset is...

  18. Compilation of functional languages using flow graph analysis

    NARCIS (Netherlands)

    Hartel, Pieter H.; Glaser, Hugh; Wild, John M.

    A system based on the notion of a flow graph is used to specify formally and to implement a compiler for a lazy functional language. The compiler takes a simple functional language as input and generates C. The generated C program can then be compiled, and loaded with an extensive run-time system to

  19. Hydrochemical investigation at the Mizunami Underground Research Laboratory. Compilation of groundwater chemistry data in Mizunami group and Toki granite. Fiscal year 2012

    International Nuclear Information System (INIS)

    Ohmori, Kazuaki; Iwatsuki, Teruki; Shingu, Shinya; Masuda, Kaoru; Aosai, Daisuke; Inui, Michiharu

    2014-03-01

    Japan Atomic Energy Agency has been investigating groundwater chemistry during the excavation and maintenance of underground facilities as part of the Mizunami Underground Research Laboratory (MIU) Project in Mizunami, Gifu, Japan. In this report, we compiled data on groundwater chemistry obtained at the MIU in the fiscal year 2012. To ensure traceability of the data, basic information (e.g. sampling location, sampling time, sampling method, analytical method) and the methodology for quality control are described. (author)

  20. Indexed compilation of experimental high energy physics literature

    International Nuclear Information System (INIS)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is presented, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given

  1. Research and Practice of the News Map Compilation Service

    Science.gov (United States)

    Zhao, T.; Liu, W.; Ma, W.

    2018-04-01

    Based on the needs of the news media for maps, this paper researches the news map compilation service: it conducts demand research on the service of compiling news maps, designs and compiles a public-authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps with timeliness, strong pertinence, and cross-regional characteristics, constructs a hot-news thematic gallery and news map customization services, conducts research on types of news maps, establishes closer liaison and cooperation methods with news media, and guides news media in using correct maps. Through the practice of the news map compilation service, this paper lists two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the research situation of the news map compilation service, and puts forward outstanding problems and development suggestions for the news map compilation service.

  2. RESEARCH AND PRACTICE OF THE NEWS MAP COMPILATION SERVICE

    Directory of Open Access Journals (Sweden)

    T. Zhao

    2018-04-01

    Full Text Available Based on the needs of the news media for maps, this paper investigates the news map compilation service: it conducts demand research on the service, designs and compiles a public-authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps that are timely, strongly targeted and cross-regional in character; constructs a hot-news thematic gallery and news map customization services; conducts research on types of news maps; establishes closer liaison and cooperation methods with news media; and guides news media in using correct maps. Through the practice of the news map compilation service, the paper presents two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and development suggestions.

  3. Digital compilation bedrock geologic map of the Milton quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-8A Dorsey, R, Doolan, B, Agnew, PC, Carter, CM, Rosencrantz, EJ, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the Milton...

  4. Research on the Maritime Communication Cryptographic Chip’s Compiler Optimization

    Directory of Open Access Journals (Sweden)

    Sheng Li

    2017-08-01

    Full Text Available In the process of ocean development, maritime communication system technology is a hot research field, in which information security is vital to the normal operation of the whole system and is also one of the difficulties in maritime communication research. In this paper, a maritime communication cryptographic SOC (system on chip) is introduced, and its compiler framework is put forward through analysis of the working mode and of the problems faced by the compiler front end. Then, a loop unrolling factor calculating algorithm based on queueing theory, named UFBOQ (unrolling factor based on queue), is proposed to perform parallel optimization in the compiler front end while respecting the instruction memory capacity limit. Finally, the scalar replacement method is used to optimize the unrolled code, reducing the effect of memory access latency on parallel computing efficiency, which suits the continuous data storage characteristics of cryptographic algorithms. The UFBOQ algorithm and scalar replacement prove effective and appropriate, achieving near-linear speedup.
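
    For readers unfamiliar with capacity-limited loop unrolling, the Python sketch below illustrates the general idea; it is not the paper's UFBOQ algorithm, and the instruction-size units and function name are hypothetical. The factor is simply the number of copies of the loop body that fit in the remaining instruction-memory budget.

        def unrolling_factor(body_size, loop_overhead, imem_capacity, trip_count):
            # All sizes are in instructions (hypothetical units). The factor is
            # how many copies of the loop body fit in the remaining budget,
            # capped by the iteration count so we never over-unroll.
            budget = imem_capacity - loop_overhead
            factor = max(1, budget // body_size)
            return min(factor, trip_count)

        print(unrolling_factor(body_size=12, loop_overhead=4,
                               imem_capacity=128, trip_count=64))  # -> 10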

  5. abc the aspectBench compiler for aspectJ a workbench for aspect-oriented programming language and compilers research

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    Aspect-oriented programming (AOP) is gaining popularity as a new way of modularising cross-cutting concerns. The aspectbench compiler (abc) is a new workbench for AOP research which provides an extensible research framework for both new language features and new compiler optimisations. This poste...

  6. The compiled catalogue of galaxies in machine-readable form and its statistical investigation

    International Nuclear Information System (INIS)

    Kogoshvili, N.G.

    1982-01-01

    The compilation of a machine-readable catalogue of relatively bright galaxies was undertaken in Abastumani Astrophysical Observatory in order to facilitate the statistical analysis of a large body of observational material on galaxies from the Palomar Sky Survey. In compiling the catalogue of galaxies the following problems were considered: the collection of existing information for each galaxy; a critical approach to data aimed at the selection of the most important features of the galaxies; the recording of data in computer-readable form; and the permanent updating of the catalogue. (Auth.)

  7. Digital compilation bedrock geologic map of the Lincoln quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-5A Stanley, R, DelloRusso, V, Haydock, S, Lapp, E, O'Loughlin, S, Prewitt, J,and Tauvers, PR, 1995, Digital compilation bedrock geologic map...

  8. Compiled data set of exact NOE distance limits, residual dipolar couplings and scalar couplings for the protein GB3

    Directory of Open Access Journals (Sweden)

    Beat Vögeli

    2015-12-01

    Full Text Available We compiled an NMR data set consisting of exact nuclear Overhauser enhancement (eNOE) distance limits, residual dipolar couplings (RDCs) and scalar (J) couplings for GB3, which together form one of the largest and most diverse data sets for the structural characterization of a protein to date. All data have small experimental errors, which are carefully estimated. We use the data in the research article Vögeli et al., 2015, Complementarity and congruence between exact NOEs and traditional NMR probes for spatial decoding of protein dynamics, J. Struct. Biol., 191, 3, 306–317, doi:10.1016/j.jsb.2015.07.008 [1] for cross-validation in multiple-state structural ensemble calculation. We advocate this set as an ideal test case for molecular dynamics simulations and structure calculations.

  9. Regulatory and technical reports: (Abstract index journal). Compilation for first quarter 1997, January--March

    International Nuclear Information System (INIS)

    Sheehan, M.A.

    1997-06-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation is published quarterly and cumulated annually. Reports consist of staff-originated reports, NRC-sponsored conference reports, NRC contractor-prepared reports, and international agreement reports

  10. Digital compilation bedrock geologic map of the Warren quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-4A Walsh, GJ, Haydock, S, Prewitt, J, Kraus, J, Lapp, E, O'Loughlin, S, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the...

  11. Fiscal 2000 report on advanced parallelized compiler technology. Outlines; 2000 nendo advanced heiretsuka compiler gijutsu hokokusho (Gaiyo hen)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    Research and development was carried out on automatic parallelizing compiler technology, which improves the practical performance, cost/performance ratio, and ease of operation of the multiprocessor systems now used to construct supercomputers and expected to provide a fundamental architecture for microprocessors in the 21st century. Efforts were made to develop an automatic multigrain parallelization technology for extracting multigrain parallelism from a program and making full use of it, and a parallelizing tuning technology for accelerating parallelization by feeding back to the compiler the dynamic information and user knowledge acquired during execution. Moreover, a benchmark program was selected and studies were made to set execution rules and evaluation indexes for establishing technologies to evaluate the performance of parallelizing compilers for the existing commercial parallel processing computers, which was achieved through the implementation and evaluation of the 'Advanced parallelizing compiler technology research and development project.' (NEDO)

  12. Regulatory and technical reports compilation for 1980

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzi, L.

    1981-04-01

    This compilation lists formal regulatory and technical reports and conference proceedings issued in 1980 by the US Nuclear Regulatory Commission. The compilation is divided into four major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The second major section of this compilation consists of a key-word index to report titles. The third major section contains an alphabetically arranged listing of contractor report numbers cross-referenced to their corresponding NRC report numbers. Finally, the fourth section is an errata supplement

  13. The remote sensing of ocean primary productivity - Use of a new data compilation to test satellite algorithms

    Science.gov (United States)

    Balch, William; Evans, Robert; Brown, Jim; Feldman, Gene; Mcclain, Charles; Esaias, Wayne

    1992-01-01

    Global pigment and primary productivity algorithms based on a new data compilation of over 12,000 stations occupied mostly in the Northern Hemisphere, from the late 1950s to 1988, were tested. The results showed high variability of the fraction of total pigment contributed by chlorophyll, which is required for subsequent predictions of primary productivity. Two models, which predict pigment concentration normalized to an attenuation length of euphotic depth, were checked against 2,800 vertical profiles of pigments. Phaeopigments consistently showed maxima at about one optical depth below the chlorophyll maxima. CZCS data coincident with the sea truth data were also checked. A regression of satellite-derived pigment vs ship-derived pigment had a coefficient of determination. The satellite underestimated the true pigment concentration in mesotrophic and oligotrophic waters and overestimated the pigment concentration in eutrophic waters. The error in the satellite estimate showed no trends with time between 1978 and 1986.

  14. Aeromagnetic map compilation: procedures for merging and an example from Washington

    Directory of Open Access Journals (Sweden)

    C. Finn

    2000-06-01

    Full Text Available Rocks in Antarctica and offshore have widely diverse magnetic properties. Consequently, aeromagnetic data collected there can improve knowledge of the geologic, tectonic and geothermal characteristics of the region. Aeromagnetic data can map concealed structures such as faults, folds and dikes, ascertain basin thickness, and locate buried volcanic as well as some intrusive and metamorphic rocks. Gridded, composite data sets allow a view of continental-scale trends that individual data sets do not provide and link widely separated areas of outcrop and disparate geologic studies. Individual magnetic surveys must be processed so that they match adjacent surveys prior to merging. A consistent representation of the Earth's magnetic field (the International Geomagnetic Reference Field, IGRF) must be removed from each data set. All data sets need to be analytically continued to the same flight elevation, with their datums shifted to match adjacent data. I advocate minimal processing to best represent the individual surveys in the merged compilation. An example of a compilation of aeromagnetic surveys from Washington illustrates the utility of aeromagnetic maps for providing synoptic views of regional tectonic features.

  15. Compiling quantum circuits to realistic hardware architectures using temporal planners

    Science.gov (United States)

    Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy

    2018-04-01

    To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest-neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits, whose high number of commuting gates allows great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations, but also means there is a greater potential win from more optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem, and generate a test suite of compilation problems for QAOA circuits of various sizes on a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.
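
    To make the hardware constraint concrete, here is a minimal Python sketch of the routing sub-problem such compilers face; this is a greedy illustration, not the paper's temporal-planning encoding, and the qubit names and linear layout are invented. A two-qubit gate can only act on adjacent physical qubits, so the compiler inserts SWAPs until the operands are neighbours.

        def route_gate(layout, q1, q2):
            # layout maps logical qubit -> position on a linear qubit array.
            # Greedily move q1 toward q2, recording the SWAPs inserted.
            swaps = []
            while abs(layout[q1] - layout[q2]) > 1:
                step = 1 if layout[q1] < layout[q2] else -1
                neighbour = next(q for q, p in layout.items()
                                 if p == layout[q1] + step)
                layout[q1], layout[neighbour] = layout[neighbour], layout[q1]
                swaps.append((q1, neighbour))
            return swaps

        layout = {"a": 0, "c": 1, "d": 2, "b": 3}
        print(route_gate(layout, "a", "b"))  # -> [('a', 'c'), ('a', 'd')]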

  16. Regulatory and technical reports: compilation for 1975-1978

    International Nuclear Information System (INIS)

    1982-04-01

    This brief compilation lists formal reports issued by the US Nuclear Regulatory Commission in 1975 through 1978 that were not listed in the Regulatory and Technical Reports Compilation for 1975 to 1978, NUREG-0304, Vol. 3. This compilation is divided into two sections. The first consists of a sequential listing of all reports in report-number order. The second section consists of an index developed from keywords in report titles and abstracts

  17. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables.Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees.With the proliferation of open source, understanding these issues is increasingly the res

  18. Report on the second consultants' meeting of nuclear reaction data centers Kiev, USSR, 11-16 April 1977. Including the thirteenth four-center meeting and the third meeting on charged particle nuclear data compilation

    International Nuclear Information System (INIS)

    Lemmel, H.D.

    1977-10-01

    This second "NRDC meeting" combined the 13th "four centers meeting" (consultants' meeting of the four neutron nuclear data centers) with the third "CPND meeting" (consultants' meeting on charged particle nuclear data compilation). In Part I of the meeting, the neutron data centers held a special session on neutron data matters, in particular on the jointly operated neutron data index CINDA, whereas all items of more general interest, in particular the data exchange system EXFOR, were treated in Part II of the meeting

  19. Compilation of Sandia Laboratories technical capabilities

    International Nuclear Information System (INIS)

    Lundergan, C.D.; Mead, P.L.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078)

  20. Indexed compilation of experimental high energy physics literature. [Synopsis

    Energy Technology Data Exchange (ETDEWEB)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is presented, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  1. Final report: Compiled MPI. Cost-Effective Exascale Application Development

    Energy Technology Data Exchange (ETDEWEB)

    Gropp, William Douglas [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-12-21

    This is the final report on Compiled MPI: Cost-Effective Exascale Application Development, and summarizes the results of the project. The project investigated runtime environments that improve the performance of MPI (Message-Passing Interface) programs; work at Illinois in the last period of the project looked at optimizing data accesses expressed with MPI datatypes.

  2. A compilation of structure functions in deep-inelastic scattering

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1991-01-01

    A compilation of data on the structure functions F2, xF3, and R = σL/σT from lepton deep-inelastic scattering off protons and nuclei is presented. The relevant experiments at CERN, Fermilab and SLAC from 1985 are covered. All the data in this review can be found in and retrieved from the Durham-RAL HEP Databases (HEPDATA on the RAL and CERN VM systems and on DURPDG VAX/VMS), together with data on a wide variety of other reactions. (author)

  3. Compilation of Sandia Laboratories technical capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lundergan, C. D.; Mead, P. L. [eds.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078). (RWR)

  4. Extension of Alvis compiler front-end

    Energy Technology Data Exchange (ETDEWEB)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl [AGH University of Science and Technology, Department of Applied Computer Science, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2015-12-31

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. The semantics of an Alvis model is expressed as an LTS graph (labelled transition system): execution of any language statement is a transition between formally defined states of the model. The LTS graph is generated from a middle-stage Haskell representation of the Alvis model. Moreover, Haskell is used as part of the Alvis language to define parameter types and operations on them. Thanks to the compiler's modular construction, many aspects of the compilation of an Alvis model may be modified. Providing new plugins for the Alvis Compiler that support languages like Java or C makes it possible to use these languages as part of Alvis instead of Haskell. The paper presents the compiler's internal model and describes how the default specification language can be altered by new plugins.

  5. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley...

  6. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley and...

  7. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    ... block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs ... -directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter.
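
    The first Futamura projection mentioned above can be illustrated in a few lines of Python; this is a toy sketch, not the paper's type-directed partial evaluator, and the expression language and function names are invented. Specializing an interpreter to a fixed program removes all interpretive dispatch, leaving a directly executable residue.

        def compile_expr(ast):
            # "Compilation" by specialization: walk the program once and
            # return a closure; dispatch on the AST happens only here,
            # never while the compiled code runs.
            op = ast[0]
            if op == "lit":
                n = ast[1]
                return lambda env: n
            if op == "var":
                name = ast[1]
                return lambda env: env[name]
            if op == "add":
                f, g = compile_expr(ast[1]), compile_expr(ast[2])
                return lambda env: f(env) + g(env)
            raise ValueError(f"unknown operator {op!r}")

        prog = ("add", ("var", "x"), ("lit", 41))
        code = compile_expr(prog)  # dispatch resolved at "compile time"
        print(code({"x": 1}))      # -> 42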

  8. Semantics-based compiling: A case study in type-directed partial evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    ... block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs ... -directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter.

  9. A Symmetric Approach to Compilation and Decompilation

    DEFF Research Database (Denmark)

    Ager, Mads Sig; Danvy, Olivier; Goldberg, Mayer

    2002-01-01

    Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice of...

  10. ZettaBricks: A Language Compiler and Runtime System for Anyscale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Amarasinghe, Saman [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2015-03-27

    This grant supported the ZettaBricks and OpenTuner projects. ZettaBricks is a new implicitly parallel language and compiler where defining multiple implementations of multiple algorithms to solve a problem is the natural way of programming. ZettaBricks makes algorithmic choice a first-class construct of the language. Choices are provided in a way that also allows our compiler to tune at a finer granularity. The ZettaBricks compiler autotunes programs by making both fine-grained as well as algorithmic choices. Choices also include different automatic parallelization techniques, data distributions, algorithmic parameters, transformations, and blocking. Additionally, ZettaBricks introduces novel techniques to autotune algorithms for different convergence criteria. When choosing between various direct and iterative methods, the ZettaBricks compiler is able to tune a program in such a way that it delivers near-optimal efficiency for any desired level of accuracy. The compiler has the flexibility of utilizing different convergence criteria for the various components within a single algorithm, providing the user with accuracy choice alongside algorithmic choice. OpenTuner is a generalization of the experience gained in building an autotuner for ZettaBricks. OpenTuner is a new open source framework for building domain-specific multi-objective program autotuners. OpenTuner supports fully-customizable configuration representations, an extensible technique representation to allow for domain-specific techniques, and an easy-to-use interface for communicating with the program to be autotuned. A key capability inside OpenTuner is the use of ensembles of disparate search techniques simultaneously; techniques that perform well will dynamically be allocated a larger proportion of tests.
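
    A toy Python sketch of the algorithmic-choice idea follows; it is illustrative only, not the ZettaBricks language or the OpenTuner API, and the candidate algorithms and sample input are invented. The "tuner" times each interchangeable implementation on representative input and selects the fastest.

        import timeit

        def insertion_sort(xs):
            xs = list(xs)
            for i in range(1, len(xs)):
                j, key = i, xs[i]
                while j > 0 and xs[j - 1] > key:
                    xs[j] = xs[j - 1]
                    j -= 1
                xs[j] = key
            return xs

        # Two implementations of the same abstract operation; the tuner
        # picks between them empirically rather than statically.
        candidates = {"insertion": insertion_sort, "builtin": sorted}
        sample = list(range(1000, 0, -1))
        best = min(candidates,
                   key=lambda name: timeit.timeit(lambda: candidates[name](sample),
                                                  number=5))
        print("selected:", best)  # 'builtin' wins at this input size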

  11. Digital compilation bedrock geologic map of the Mt. Ellen quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-6A Stanley, RS, Walsh, G, Tauvers, PR, DiPietro, JA, and DelloRusso, V, 1995, Digital compilation bedrock geologic map of the Mt. Ellen...

  12. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  13. Digital compilation bedrock geologic map of the South Mountain quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-3A Stanley, R.S., DelloRusso, V., Tauvers, P.R., DiPietro, J.A., Taylor, S., and Prahl, C., 1995, Digital compilation bedrock geologic map of...

  14. Compilation of Water-Resources Data and Hydrogeologic Setting for Brunswick County, North Carolina, 1933-2000

    Science.gov (United States)

    Fine, Jason M.; Cunningham, William L.

    2001-01-01

    Water-resources data were compiled for Brunswick County, North Carolina, to describe the hydrologic conditions of the County. Hydrologic data collected by the U.S. Geological Survey as well as data collected by other governmental agencies and reviewed by the U.S. Geological Survey are presented. Data from four weather stations and two surface-water stations are summarized. Data also are presented for land use and land cover, soils, geology, hydrogeology, 12 continuously monitored ground-water wells, 73 periodically measured ground-water wells, and water-quality measurements from 39 ground-water wells. Mean monthly precipitation at the Longwood, Shallotte, Southport, and Wilmington Airport weather stations ranged from 2.19 to 7.94 inches for the periods of record, and mean monthly temperatures at the Longwood, Southport, and Wilmington Airport weather stations ranged from 43.4 to 80.1 degrees Fahrenheit for the periods of record. An evaluation of land-use and land-cover data for Brunswick County indicated that most of the County is either forested land (about 57 percent) or wetlands (about 29 percent). Cross sections are presented to illustrate the general hydrogeology beneath Brunswick County. Water-level data for Brunswick County indicate that water levels ranged from about 110 feet above mean sea level to about 22 feet below mean sea level. Chloride concentrations measured in aquifers in Brunswick County ranged from near 0 to 15,000 milligrams per liter. Chloride levels in the Black Creek and Cape Fear aquifers were measured at well above the potable limit for ground water of 250 milligrams per liter set by the U.S. Environmental Protection Agency for safe drinking water.

  15. Compilation of accident statistics in PSE

    International Nuclear Information System (INIS)

    Jobst, C.

    1983-04-01

    The objective of the investigations on transportation carried out within the framework of the 'Project - Studies on Safety in Waste Management (PSE II)' is the determination of the risk of accidents in the transportation of radioactive materials by rail. The fault tree analysis is used for the determination of risks in the transportation system. This method offers a possibility for the determination of frequency and consequences of accidents which could lead to an unintended release of radionuclides. The study presented compiles all data obtained from the accident statistics of the Federal German Railways. (orig./RB) [de

  16. abc: An extensible AspectJ compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    ... checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its frontend is built, using the Polyglot framework, as a modular extension of the Java language. The use of Polyglot gives flexibility of syntax and type checking. The backend is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general overview ...

  17. Digital Bedrock Compilation: A Geodatabase Covering Forest Service Lands in California

    Science.gov (United States)

    Elder, D.; de La Fuente, J. A.; Reichert, M.

    2010-12-01

    This digital database contains bedrock geologic mapping for Forest Service lands within California. This compilation began in 2004 and the first version was completed in 2005. The second publication of this geodatabase was completed in 2010 and filled major gaps in the southern Sierra Nevada and Modoc/Medicine Lake/Warner Mountains areas. This digital map database was compiled from previously published and unpublished geologic mapping, with source mapping and review from the California Geological Survey, the U.S. Geological Survey and others. Much of the source data was itself compilation mapping. This geodatabase is huge, containing ~107,000 polygons and ~280,000 arcs. Mapping was compiled from more than one thousand individual sources and covers over 41,000,000 acres (~166,000 km2). It was compiled from source maps at various scales - from ~1:4,000 to 1:250,000 - and represents the best available geologic mapping at the largest scale possible. An estimated 70-80% of the source information was digitized from geologic mapping at 1:62,500 scale or better. The Forest Service ACT2 Enterprise Team compiled the bedrock mapping and developed a geodatabase to store this information. This geodatabase supports feature classes for polygons (e.g., map units), lines (e.g., contacts, boundaries, faults and structural lines) and points (e.g., orientation data, structural symbology). Lookup tables provide detailed information for feature class items. Lookup/type tables contain legal values and hierarchical groupings for geologic ages and lithologies. Type tables link coded values with descriptions for line and point attributes, such as line type, line location and point type. This digital mapping is at the core of many quantitative analyses and derivative map products. Queries of the database are used to produce maps and to quantify rock types of interest. These include the following: (1) ultramafic rocks - where hazards from naturally occurring asbestos are high, (2) granitic rocks - increased

  18. Writing Compilers and Interpreters A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

    Long-awaited revision to a unique guide that covers both compilers and interpreters Revised, updated, and now focusing on Java instead of C++, this long-awaited, latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  19. Northern hemisphere mid-latitude geomagnetic anomaly revealed from Levantine Archaeomagnetic Compilation (LAC).

    Science.gov (United States)

    Shaar, R.; Tauxe, L.; Agnon, A.; Ben-Yosef, E.; Hassul, E.

    2015-12-01

    The rich archaeological heritage of Israel and nearby Levantine countries provides a unique opportunity for archaeomagnetic investigation in high resolution. Here we present a summary of our ongoing effort to reconstruct geomagnetic variations of the past several millennia in the Levant at decadal to millennial resolution. This effort at the Southern Levant, namely the "Levantine Archaeomagnetic Compilation" (LAC), presently consists of data from over 650 well-dated archaeological objects including pottery, slag, ovens, and furnaces. In this talk we review the methodological challenges in achieving a robust master secular variation curve with realistic error estimations from a large number of different datasets. We present the current status of the compilation, including the southern and western Levant LAC data (Israel, Cyprus, and Jordan) and other published north-eastern Levant data (Syria and southern Turkey), and outline the main findings emerging from these data. The main feature apparent from the new compilation is an extraordinary intensity high that developed over the Levant region during the first two millennia BCE. The climax of this event is a double-peak intensity maximum starting at ca. 1000 BCE and ending at ca. 735 BCE, accompanied by at least two events of geomagnetic spikes. Paleomagnetic directions from this period demonstrate anomalies of up to 20 degrees away from the averaged GAD field. This leads us to postulate that the maximum in intensity is a manifestation of an intense mid-latitude local positive geomagnetic anomaly that persisted for over two centuries.

  20. Sharing analysis in the Pawns compiler

    Directory of Open Access Journals (Sweden)

    Lee Naish

    2015-09-01

    Full Text Available Pawns is a programming language under development that supports algebraic data types, polymorphism, higher order functions and “pure” declarative programming. It also supports impure imperative features including destructive update of shared data structures via pointers, allowing significantly increased efficiency for some operations. A novelty of Pawns is that all impure “effects” must be made obvious in the source code and they can be safely encapsulated in pure functions in a way that is checked by the compiler. Execution of a pure function can perform destructive updates on data structures that are local to or eventually returned from the function without risking modification of the data structures passed to the function. This paper describes the sharing analysis which allows impurity to be encapsulated. Aspects of the analysis are similar to other published work, but in addition it handles explicit pointers and destructive update, higher order functions including closures and pre- and post-conditions concerning sharing for functions.
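
    The encapsulation property described above can be illustrated in Python, though only as an analogy: Pawns' sharing checks are static and compiler-enforced, whereas Python offers no such guarantee, and the functions below are invented. A function that destructively updates a structure shared with its caller leaks an effect; one that updates only a local copy does not.

        def sneaky_sort(xs):
            xs.sort()        # destructive update of a shared structure:
            return xs        # the caller's list is silently modified

        def safe_sort(xs):
            ys = list(xs)    # update only a local copy; the effect cannot
            ys.sort()        # leak, so the function is observably pure
            return ys

        data = [3, 1, 2]
        print(safe_sort(data), data)    # [1, 2, 3] [3, 1, 2]
        print(sneaky_sort(data), data)  # [1, 2, 3] [1, 2, 3]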

  1. Regulatory and technical reports (Abstract Index Journal). Compilation for third quarter 1985, July-September. Volume 10, No. 3

    International Nuclear Information System (INIS)

    1985-10-01

    This compilation consists of bibliographic data and abstracts for the formal Regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation covers the period from July through September, 1985

  2. Compilation of a preliminary checklist for the differential diagnosis of neurogenic stuttering

    Directory of Open Access Journals (Sweden)

    Mariska Lundie

    2014-06-01

    Objectives: The aim of this study was to describe and highlight the characteristics of neurogenic stuttering (NS) in order to compile a preliminary checklist for accurate diagnosis and intervention. Method: An explorative, applied, mixed-method, multiple case study research design was followed. Purposive sampling was used to select four participants. A comprehensive assessment battery was compiled for data collection. Results: The results revealed a distinct pattern of core stuttering behaviours in NS, although discrepancies existed regarding stuttering severity and frequency. It was also found that developmental stuttering (DS) and NS can co-occur. The case history and the core stuttering pattern are important considerations during differential diagnosis, as these are the only consistent characteristics in people with NS. Conclusion: It is unlikely that all the symptoms of NS will be present in an individual. The researchers scrutinised the findings of this study and of previous literature to compile a potentially workable checklist.

  3. Promising Compilation to ARMv8 POP

    OpenAIRE

    Podkopaev, Anton; Lahav, Ori; Vafeiadis, Viktor

    2017-01-01

    We prove the correctness of compilation of relaxed memory accesses and release-acquire fences from the "promising" semantics of [Kang et al. POPL'17] to the ARMv8 POP machine of [Flur et al. POPL'16]. The proof is highly non-trivial because both the ARMv8 POP and the promising semantics provide some extremely weak consistency guarantees for normal memory accesses; however, they do so in rather different ways. Our proof of compilation correctness to ARMv8 POP strengthens the results of the Kan...

  4. Design of methodology for incremental compiler construction

    Directory of Open Access Journals (Sweden)

    Pavel Haluza

    2011-01-01

    Full Text Available The paper deals with possibilities for incremental compiler construction. It presents compiler construction techniques both for languages with a fixed set of lexical units and for languages with a variable set of lexical units. The methodology for incremental compiler construction is based on the known algorithms for standard compiler construction, derived for both groups of languages. The group of languages with a fixed set of lexical units comprises languages in which each lexical unit has a constant meaning, e.g., common programming languages. For this group the paper addresses the problem of incremental semantic analysis, which is based on incremental parsing. In the group of languages with a variable set of lexical units (e.g., the professional typographic system TeX), the meaning of each character in the input file can be changed arbitrarily at any time during processing. The change takes effect immediately, and its validity may be limited in some way or extend to the end of the input. For this group the paper addresses the case of macros that temporarily change the category of arbitrary characters.
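
    The variable-lexical-unit problem can be made concrete with a small Python sketch; it is illustrative only, TeX's actual category-code machinery is far richer, and the category table below is invented. The scanner consults a mutable category table on every character, so reassigning a category changes how subsequent input is tokenized.

        catcode = {"%": "comment"}            # mutable character-category table

        def scan(text):
            tokens, i = [], 0
            while i < len(text):
                ch = text[i]
                if catcode.get(ch) == "comment":   # category looked up per character
                    while i < len(text) and text[i] != "\n":
                        i += 1                     # skip the rest of the line
                    continue
                tokens.append(ch)
                i += 1
            return tokens

        print(scan("ab%cd"))                  # -> ['a', 'b']
        catcode["%"] = None                   # '%' is now an ordinary character
        print(scan("ab%cd"))                  # -> ['a', 'b', '%', 'c', 'd']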

  5. Compiling a corpus-based dictionary grammar: an example for ...

    African Journals Online (AJOL)

    In this article it is shown how a corpus-based dictionary grammar may be compiled — that is, a mini-grammar fully based on corpus data and specifically written for use in and integrated with a dictionary. Such an effort is, to the best of our knowledge, a world's first. We exemplify our approach for a Northern Sotho ...

  6. AICPA allows low-cost options for compiled financial statements.

    Science.gov (United States)

    Reinstein, Alan; Luecke, Randall W

    2002-02-01

    The AICPA Accounting and Review Services Committee's (ARSC) SSARS No. 8, Amendment to Statement on Standards for Accounting and Review Services No. 1, Compilation and Review of Financial Statements, issued in October 2000, allows financial managers to provide plain-paper, compiled financial statements for the exclusive use of management. Such financial statements were disallowed in 1979 when the AICPA issued SSARS No. 1, Compilation and Review of Financial Statements. With the issuance of SSARS No. 8, financial managers can prepare plain-paper, compiled financial statements when third parties are not expected to rely on the financial statements, management acknowledges such restrictions in writing, and management acknowledges its primary responsibility for the adequacy of the financial statements.

  7. Electronic circuits for communications systems: A compilation

    Science.gov (United States)

    1972-01-01

    The compilation of electronic circuits for communications systems is divided into thirteen basic categories, each representing an area of circuit design and application. The compilation items are moderately complex and, as such, would appeal to the applications engineer. However, the rationale for the selection criteria was tailored so that the circuits would reflect fundamental design principles and applications, with an additional requirement for simplicity whenever possible.

  8. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 6, November 2004 (last updated 2004.12.16)

    International Nuclear Information System (INIS)

    2005-03-01

    This Radioactive Waste Management Profiles report is a compilation of data collected by the Net Enabled Waste Management Database (NEWMDB) from March to July 2004. The report contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories. It provides or references details of the scope of NEWMDB data collections and it explains the formats of individual NEWMDB report pages

  9. Hydrochemical investigation at the Mizunami Underground Research Laboratory. Compilation of groundwater chemistry data in the Mizunami Group and the Toki Granite. Fiscal year 2014

    International Nuclear Information System (INIS)

    Hayashida, Kazuki; Munemoto, Takashi; Iwatsuki, Teruki; Aosai, Daisuke; Inui, Michiharu

    2016-06-01

    Japan Atomic Energy Agency has been investigating groundwater chemistry to understand the effect of excavation and maintenance of underground facilities as part of the Mizunami Underground Research Laboratory (MIU) Project in Mizunami, Gifu, Japan. In this report, we compiled data of groundwater chemistry obtained at the MIU in the fiscal year 2014. In terms of ensuring traceability of data, basic information (e.g. sampling location, sampling time, sampling method, analytical method) and methodology for quality control are described. (author)

  10. Hydrochemical investigation at the Mizunami Underground Research Laboratory. Compilation of groundwater chemistry data in the Mizunami group and the Toki granite. Fiscal year 2013

    International Nuclear Information System (INIS)

    Ohmori, Kazuaki; Hasegawa, Takashi; Munemoto, Takashi; Iwatsuki, Teruki; Masuda, Kaoru; Aosai, Daisuke; Inui, Michiharu

    2014-12-01

    Japan Atomic Energy Agency has been investigating groundwater chemistry to understand the effect of excavation and maintenance of underground facilities as part of the Mizunami Underground Research Laboratory (MIU) Project in Mizunami, Gifu, Japan. In this report, we compiled data of groundwater chemistry obtained at the MIU in the fiscal year 2013. In terms of ensuring traceability of data, basic information (e.g. sampling location, sampling time, sampling method, analytical method) and methodology for quality control are described. (author)

  11. Regulatory and technical reports (abstract index journal): Annual compilation for 1994. Volume 19, Number 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order. These precede the following indexes: secondary report number index, personal author index, subject index, NRC originating organization index (staff reports), NRC originating organization index (international agreements), NRC contract sponsor index (contractor reports), contractor index, international organization index, and licensed facility index. A detailed explanation of the entries precedes each index.

  12. Compilation Techniques Specific for a Hardware Cryptography-Embedded Multimedia Mobile Processor

    Directory of Open Access Journals (Sweden)

    Masa-aki FUKASE

    2007-12-01

    Full Text Available The development of single-chip VLSI processors is the key technology of ever-growing pervasive computing, answering overall demands for usability, mobility, speed, security, etc. We have so far developed a hardware cryptography-embedded multimedia mobile processor architecture, HCgorilla. Since HCgorilla integrates a wide range of techniques, from architectures to applications and languages, a one-sided design approach is not always useful. HCgorilla needs a more involved strategy, that is, hardware/software (H/S) codesign. Thus, we exploit the software support of HCgorilla, composed of a Java interface and parallelizing compilers. These are assumed to be installed in servers in order to reduce the load and increase the performance of HCgorilla-embedded clients. Since compilers are the essence of the software's responsibility, we focus in this article on our recent results on the design, specification, and prototyping of parallelizing compilers for HCgorilla. The parallelizing compilers comprise a multicore compiler and a LIW compiler. They abstract parallelism from executable serial code or from the Java interface output, and emit code executable in parallel by HCgorilla. The prototype compilers are written in Java. An evaluation using an arithmetic test program shows the reasonableness of the prototype compilers compared with hand compilation.

  13. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler's ability to generate loop-parallel code. We use this compilation system to modify two sequential benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should ...
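
    A minimal Python illustration of the kind of obstruction meant above follows; it is a generic example, not taken from the paper's benchmarks. Iterations that funnel all writes through one shared location carry a dependence, while a reduction formulation leaves iterations independent and hence parallelizable.

        def dot_obstructed(a, b, out):
            # Every iteration reads and writes out[0]: a loop-carried
            # dependence that blocks automatic parallelization.
            for i in range(len(a)):
                out[0] += a[i] * b[i]

        def dot_parallelizable(a, b):
            # Refactored as a reduction: iterations are independent, so a
            # compiler or runtime is free to evaluate them in parallel.
            return sum(x * y for x, y in zip(a, b))

        out = [0.0]
        dot_obstructed([1, 2, 3], [4, 5, 6], out)
        print(out[0], dot_parallelizable([1, 2, 3], [4, 5, 6]))  # 32.0 32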

  14. Regulatory and technical reports (abstract index journal). Volume 20, No. 2: Compilation for second quarter April--June 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually.

  15. Regulatory and technical reports (abstract index journal). Volume 20, No. 2: Compilation for second quarter April--June 1995

    International Nuclear Information System (INIS)

    1995-09-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  16. Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG96-03 Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont: VGS Open-File Report VG96-3A, 2 plates, scale...

  17. 49 CFR 801.57 - Records compiled for law enforcement purposes.

    Science.gov (United States)

    2010-10-01

    Pursuant to 5 U.S.C. 552(b)(7), any records compiled for law enforcement purposes ... would disclose investigative procedures and practices, or would endanger the life or security of law ...

  18. Mineralogy, geochemistry, porosity and redox properties of rocks from Forsmark. Compilation of data from the regional model volume for SR-Site

    Energy Technology Data Exchange (ETDEWEB)

    Sandstroem, Bjoern (WSP Sverige AB, Stockholm (Sweden)); Stephens, Michael B. (Geological Survey of Sweden, Uppsala (Sweden))

    2009-11-15

    This report is a compilation of the data acquired during the Forsmark site investigation programme on the mineralogy, geochemistry, redox properties and porosity of different rock types at Forsmark. The aim is to provide a final summary of the available data for use during the SR-Site modelling work. Data presented in this report represent the regional model volume and have previously been published in various SKB reports. The data have been extracted from the SKB database Sicada and are presented as calculated median values, data range and lower/upper quartiles. The representativity of all samples used for the calculations has been evaluated, and data from samples where there is insufficient control on the rock type have been omitted. Rock samples affected by alteration have been excluded from the unaltered set and are presented separately based on the type of alteration (e.g. oxidised or albitized rock)

  19. Compilation of selected marine radioecological data for the US Subseabed Program: Summaries of available radioecological concentration factors and biological half-lives

    International Nuclear Information System (INIS)

    Gomez, L.S.; Marietta, M.G.; Jackson, D.W.

    1987-04-01

    The US Subseabed Disposal Program has compiled an extensive concentration factor and biological half-life data base from the international marine radioecological literature. A microcomputer-based data management system has been implemented to provide statistical and graphic summaries of these data. The data base is constructed in a manner which allows subsets to be sorted using a number of interstudy variables such as organism category, tissue/organ category, geographic location (for in situ studies), and several laboratory-related conditions (e.g., exposure time and exposure concentration). This report updates earlier reviews and provides summaries of the tabulated data. In addition to the concentration factor/biological half-life data base, we provide an outline of other published marine radioecological works. Our goal is to present these data in a form that enables those concerned with predictive assessment of radiation dose in the marine environment to make a more judicious selection of data for a given application. 555 refs., 19 figs., 7 tabs

  20. Compilation of selected marine radioecological data for the US Subseabed Program: Summaries of available radioecological concentration factors and biological half-lives

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, L.S.; Marietta, M.G.; Jackson, D.W.

    1987-04-01

    The US Subseabed Disposal Program has compiled an extensive concentration factor and biological half-life data base from the international marine radioecological literature. A microcomputer-based data management system has been implemented to provide statistical and graphic summaries of these data. The data base is constructed in a manner which allows subsets to be sorted using a number of interstudy variables such as organism category, tissue/organ category, geographic location (for in situ studies), and several laboratory-related conditions (e.g., exposure time and exposure concentration). This report updates earlier reviews and provides summaries of the tabulated data. In addition to the concentration factor/biological half-life data base, we provide an outline of other published marine radioecological works. Our goal is to present these data in a form that enables those concerned with predictive assessment of radiation dose in the marine environment to make a more judicious selection of data for a given application. 555 refs., 19 figs., 7 tabs.

  1. Hydrochemical investigation at the Mizunami Underground Research Laboratory. Compilation of groundwater chemistry data in the Mizunami group and the Toki granite. Fiscal year 2015

    International Nuclear Information System (INIS)

    Hayashida, Kazuki; Kato, Toshihiro; Munemoto, Takashi; Kubota, Mitsuru; Iwatsuki, Teruki; Aosai, Daisuke; Inui, Michiharu

    2017-03-01

    Japan Atomic Energy Agency has been investigating groundwater chemistry to understand the effect of excavation and maintenance of underground facilities as part of the Mizunami Underground Research Laboratory (MIU) Project in Mizunami, Gifu, Japan. In this report, we compiled data of groundwater chemistry obtained at the MIU in the fiscal year 2015. In terms of ensuring traceability of data, basic information (e.g. sampling location, sampling time, sampling method and analytical method) and methodology for quality control are described. (author)

  2. Data compilation for the 1984 interim report of the Scientific Advisory Council on Forest Decline/Air Pollutions of the Federal German Government and Laender

    International Nuclear Information System (INIS)

    1984-01-01

    The data compilation contains contributions towards an inventory of damage and countermeasures at the forestry level, as well as investigations into causes and effects, whether through direct impact on the vegetation, via the soil, or through interactions between trees, together with material on air pollutants and technical measures to reduce emissions (individual entries for parts C2). (DG) [de

  3. An Efficient Compiler for Weighted Rewrite Rules

    OpenAIRE

    Mohri, Mehryar; Sproat, Richard

    1996-01-01

    Context-dependent rewrite rules are used in many areas of natural language and speech processing. Work in computational phonology has demonstrated that, given certain conditions, such rewrite rules can be represented as finite-state transducers (FSTs). We describe a new algorithm for compiling rewrite rules into FSTs. We show the algorithm to be simpler and more efficient than existing algorithms. Further, many of our applications demand the ability to compile weighted rules into weighted FST...
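
    For readers unfamiliar with context-dependent rewrite rules, the Python sketch below applies a toy rule of the form "a -> b / l _ r" (rewrite 'a' as 'b' between 'l' and 'r') directly, using regular-expression context conditions; the compilation into (weighted) finite-state transducers that the paper describes is the industrial-strength version of the same idea, and the rule here is invented.

        import re

        def rewrite(s, target="a", replacement="b", left="l", right="r"):
            # Lookbehind/lookahead assert the context without consuming it,
            # mirroring how the rule leaves its environment unchanged.
            return re.sub(f"(?<={left}){target}(?={right})", replacement, s)

        print(rewrite("lar tar lab"))  # -> 'lbr tar lab'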

  4. HOPE: Just-in-time Python compiler for astrophysical computations

    Science.gov (United States)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of a compiled implementation.
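
    Usage is intended to look like the following sketch, based on the project's published examples; treat the hope.jit decorator name as an assumption if your installed version differs, and note that the hope package must be installed for this to run. The decorated function is translated to C++ and compiled on first call; subsequent calls dispatch to the compiled version.

        import hope

        @hope.jit
        def polynomial(x):
            # Plain numerical Python in the supported subset; HOPE translates
            # the body to C++ and optimizes the expression at first call.
            return 3.0 * x * x + 2.0 * x + 1.0

        print(polynomial(2.0))  # -> 17.0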

  5. Compiling the First Monolingual Lusoga Dictionary | Nabirye | Lexikos

    African Journals Online (AJOL)

    Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular ... This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary. Keywords: lexicography ...

  6. SERKON program for compiling a multigroup library to be used in BETTY calculation

    International Nuclear Information System (INIS)

    Nguyen Phuoc Lan.

    1982-11-01

    A SERKON-type program was written to compile data sets generated by FEDGROUP-3 into a multigroup library for BETTY calculation. A multigroup library was generated from the ENDF/B-IV data file and tested against the TRX-1 and TRX-2 lattices with good results. (author)

  7. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    Rose, P.F.; Daly, A.

    1987-01-01

    This request list summarizes the current needs of the US nuclear energy programs and other applied technologies for experimentally measured nuclear data. The request list is ordered by target nucleus (isotope) and then by reaction type (quantity). An attempt has been made to describe each quantity in standard notation. An appendix contains a glossary of the symbols used, with a short explanatory text. Because of the changing and continuing character of the need for data request information, as well as the probability that current measurements may satisfy a portion of the requests, this report is to be regarded as a working document. In fact, it is maintained as a data base by the National Nuclear Data Center. Procedures for submitting data requests, priority assignments, and the DOE/NDC Committee membership are included

  8. Scientific Programming with High Performance Fortran: A Case Study Using the xHPF Compiler

    Directory of Open Access Journals (Sweden)

    Eric De Sturler

    1997-01-01

    Full Text Available Recently, the first commercial High Performance Fortran (HPF) subset compilers have appeared. This article reports on our experiences with the xHPF compiler of Applied Parallel Research, version 1.2, for the Intel Paragon. At this stage, we do not expect very High Performance from our HPF programs, even though performance will eventually be of paramount importance for the acceptance of HPF. Instead, our primary objective is to study how to convert large Fortran 77 (F77) programs to HPF such that the compiler generates reasonably efficient parallel code. We report on a case study that identifies several problems when parallelizing code with HPF; most of these problems affect current HPF compiler technology in general, although some are specific for the xHPF compiler. We discuss our solutions from the perspective of the scientific programmer, and present timing results on the Intel Paragon. The case study comprises three programs of different complexity with respect to parallelization. We use the dense matrix-matrix product to show that the distribution of arrays and the order of nested loops significantly influence the performance of the parallel program. We use Gaussian elimination with partial pivoting to study the parallelization strategy of the compiler. There are various ways to structure this algorithm for a particular data distribution. This example shows how much effort may be demanded from the programmer to support the compiler in generating an efficient parallel implementation. Finally, we use a small application to show that the more complicated structure of a larger program may introduce problems for the parallelization, even though all subroutines of the application are easy to parallelize by themselves. The application consists of a finite volume discretization on a structured grid and a nested iterative solver. Our case study shows that it is possible to obtain reasonably efficient parallel programs with xHPF, although the compiler
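
    The loop-order effect mentioned for the dense matrix-matrix product can be reproduced outside HPF; a plain-Python sketch (illustrative only; in HPF the analogous levers are the array distribution directives and the nesting of the distributed loops):

        import random, time

        n = 150
        A = [[random.random() for _ in range(n)] for _ in range(n)]
        B = [[random.random() for _ in range(n)] for _ in range(n)]

        def matmul_ijk(A, B, n):
            C = [[0.0] * n for _ in range(n)]
            for i in range(n):
                for j in range(n):
                    s = 0.0
                    for k in range(n):
                        s += A[i][k] * B[k][j]  # walks down a column of B
                    C[i][j] = s
            return C

        def matmul_ikj(A, B, n):
            C = [[0.0] * n for _ in range(n)]
            for i in range(n):
                for k in range(n):
                    a = A[i][k]
                    for j in range(n):
                        C[i][j] += a * B[k][j]  # runs along rows of B and C
            return C

        for f in (matmul_ijk, matmul_ikj):
            t0 = time.perf_counter()
            f(A, B, n)
            print(f.__name__, round(time.perf_counter() - t0, 3), "s")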

  9. The National Assessment of Shoreline Change: A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the U.S. Gulf of Mexico

    Science.gov (United States)

    Miller, Tara L.; Morton, Robert A.; Sallenger, Asbury H.; Moore, Laura J.

    2004-01-01

    Introduction The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive database of digital vector shorelines and shoreline change rates for the U.S. Gulf of Mexico. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along most open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard repeatable methods for mapping and analyzing shoreline movement so that periodic updates regarding coastal erosion and land loss can be made nationally in a systematic and internally consistent way. This data compilation for open-ocean, sandy shorelines of the Gulf of Mexico is the first in a series that will eventually include the Atlantic Coast, Pacific Coast, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are based on merging three historical shorelines with a modern shoreline derived from lidar (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1800s, 1920s-1930s, and 1970s. The most recent shoreline is derived from data collected over the period of 1998-2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are simple end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change in the Gulf of Mexico, National Assessment of Shoreline Change: Part 1, Historical Shoreline Changes and Associated Coastal Land Loss Along the U.S. Gulf of Mexico (USGS Open File
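
    The two rate definitions in the record reduce to simple arithmetic; a sketch with invented shoreline positions (the real analysis operates on measurement transects, not four scalars):

        # Hypothetical cross-shore positions (m) at the four survey epochs.
        years     = [1857.0, 1934.0, 1971.0, 2000.0]
        positions = [120.0, 95.0, 88.0, 70.0]  # illustrative values only

        # Long-term rate: least-squares slope through all four shorelines.
        n = len(years)
        mt = sum(years) / n
        mx = sum(positions) / n
        slope = (sum((t - mt) * (x - mx) for t, x in zip(years, positions))
                 / sum((t - mt) ** 2 for t in years))
        print("long-term rate: %.2f m/yr" % slope)

        # Short-term rate: end-point rate from the two most recent shorelines.
        epr = (positions[-1] - positions[-2]) / (years[-1] - years[-2])
        print("short-term rate: %.2f m/yr" % epr)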

  10. Run-Time and Compiler Support for Programming in Adaptive Parallel Environments

    Directory of Open Access Journals (Sweden)

    Guy Edjlali

    1997-01-01

    Full Text Available For better utilization of computing resources, it is important to consider parallel programming environments in which the number of available processors varies at run-time. In this article, we discuss run-time support for data-parallel programming in such an adaptive environment. Executing programs in an adaptive environment requires redistributing data when the number of processors changes, and also requires determining new loop bounds and communication patterns for the new set of processors. We have developed a run-time library to provide this support. We discuss how the run-time library can be used by compilers of High Performance Fortran (HPF)-like languages to generate code for an adaptive environment. We present performance results for a Navier-Stokes solver and a multigrid template run on a network of workstations and an IBM SP-2. Our experiments show that if the number of processors is not varied frequently, the cost of data redistribution is not significant compared to the time required for the actual computation. Overall, our work establishes the feasibility of compiling HPF for a network of nondedicated workstations, which are likely to be an important resource for parallel programming in the future.
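
    Recomputing loop bounds after a change in processor count, as described above, is the core of such run-time support; a sketch for a block distribution (generic illustration, not the API of the library in the record):

        def block_bounds(n, p, rank):
            # Split n iterations over p processors as evenly as possible;
            # the first (n % p) ranks receive one extra iteration.
            base, extra = divmod(n, p)
            lo = rank * base + min(rank, extra)
            hi = lo + base + (1 if rank < extra else 0)
            return lo, hi

        # When the machine shrinks from 4 to 3 processors, every rank
        # recomputes its bounds (and the runtime redistributes the data).
        for p in (4, 3):
            print(p, [block_bounds(10, p, r) for r in range(p)])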

  11. Compilation of the FY 1999 Department of the Navy Working Capital Fund Financial Statements

    National Research Council Canada - National Science Library

    2000-01-01

    ...) Cleveland Center consistently and accurately compiled and consolidated financial data received from Navy field organizations and other sources to prepare the FY 1999 Navy Working Capital Fund financial statements...

  12. An Initial Evaluation of the NAG f90 Compiler

    Directory of Open Access Journals (Sweden)

    Michael Metcalf

    1992-01-01

    Full Text Available A few weeks before the formal publication of the ISO Fortran 90 Standard, NAG announced the world's first f90 compiler. We have evaluated the compiler by using it to assess the impact of Fortran 90 on the CERN Program Library.

  13. Compiled MPI: Cost-Effective Exascale Applications Development

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Quinlan, D; Lumsdaine, A; Hoefler, T

    2012-04-10

    The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has a significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over

  14. Interpretation, compilation and field verification procedures in the CARETS project

    Science.gov (United States)

    Alexander, Robert H.; De Forth, Peter W.; Fitzpatrick, Katherine A.; Lins, Harry F.; McGinty, Herbert K.

    1975-01-01

    The production of the CARETS map data base involved the development of a series of procedures for interpreting, compiling, and verifying data obtained from remote sensor sources. Level II land use mapping from high-altitude aircraft photography at a scale of 1:100,000 required production of a photomosaic mapping base for each of the 48 sheets of 50 x 50 km each, and the interpretation and coding of land use polygons on drafting film overlays. CARETS researchers also produced a series of 1970 to 1972 land use change overlays, using the 1970 land use maps and 1972 high-altitude aircraft photography. To enhance the value of the land use sheets, researchers compiled a series of overlays showing cultural features, county boundaries and census tracts, surface geology, and drainage basins. In producing Level I land use maps from Landsat imagery, at a scale of 1:250,000, interpreters overlaid drafting film directly on Landsat color composite transparencies and interpreted on the film. They found that such interpretation involves pattern and spectral signature recognition. In studies using Landsat imagery, interpreters identified numerous areas of change but also identified extensive areas of "false change," where Landsat spectral signatures but not land use had changed.

  15. Solidify, An LLVM pass to compile LLVM IR into Solidity

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-12

    The software currently compiles LLVM IR into Solidity (Ethereum’s dominant programming language) using LLVM’s pass library. Specifically, this compiler allows us to convert an arbitrary DSL into Solidity. We focus specifically on converting Domain Specific Languages into Solidity due to their ease of use, and provable properties. By creating a toolchain to compile lightweight domain-specific languages into Ethereum's dominant language, Solidity, we allow non-specialists to effectively develop safe and useful smart contracts. For example, lawyers from a certain firm can have a proprietary DSL that codifies basic laws safely converted to Solidity to be securely executed on the blockchain. In another example, a simple provenance tracking language can be compiled and securely executed on the blockchain.

  16. SVM Support in the Vienna Fortran Compilation System

    OpenAIRE

    Brezany, Peter; Gerndt, Michael; Sipkova, Viera

    1994-01-01

    Vienna Fortran, a machine-independent language extension to Fortran which allows the user to write programs for distributed-memory systems using global addresses, provides the forall-loop construct for specifying irregular computations that do not cause inter-iteration dependences. Compilers for distributed-memory systems generate code that is based on runtime analysis techniques and is only efficient if, in addition, aggressive compile-time optimizations are applied. Since these optimization...

  17. A Compilation of Internship Reports - 2012

    Energy Technology Data Exchange (ETDEWEB)

    Stegman M.; Morris, M.; Blackburn, N.

    2012-08-08

    This compilation documents all research projects undertaken by the 2012 summer Department of Energy - Workforce Development for Teachers and Scientists interns during their internship program at Brookhaven National Laboratory.

  18. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    in the style of denotational semantics; – the output of the generated compiler is effectively three-address code, in the fashion and efficiency of the Dragon Book; – the generated compiler processes several hundred lines of source code per second. The source language considered in this case study is imperative..., block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs... by specializing a definitional interpreter with respect to the program. Specialization is carried out using type-directed partial evaluation, which is a mild version of partial evaluation akin to lambda-calculus normalization. Our definitional interpreter follows the format of denotational semantics, with a clear...
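
    The first Futamura projection named in the record can be shown in miniature: specializing a definitional interpreter with respect to a fixed program removes the interpretive dispatch. A toy sketch (closure-based staging; not type-directed partial evaluation itself):

        def interpret(expr, env):
            # Definitional interpreter for a tiny expression language.
            op = expr[0]
            if op == "lit":
                return expr[1]
            if op == "var":
                return env[expr[1]]
            if op == "add":
                return interpret(expr[1], env) + interpret(expr[2], env)

        def specialize(expr):
            # Walk the program once, emitting a closure: the "compiled" code.
            op = expr[0]
            if op == "lit":
                c = expr[1]
                return lambda env: c
            if op == "var":
                name = expr[1]
                return lambda env: env[name]
            if op == "add":
                f, g = specialize(expr[1]), specialize(expr[2])
                return lambda env: f(env) + g(env)

        prog = ("add", ("var", "x"), ("lit", 1))
        compiled = specialize(prog)  # dispatch cost paid once, up front
        print(interpret(prog, {"x": 41}), compiled({"x": 41}))  # 42 42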

  19. Compilation of results 1987

    International Nuclear Information System (INIS)

    1987-01-01

    This compilation presents, in concentrated form, reports on research and development within the nuclear energy field covering a period of two and a half years. The preceding report was edited in December 1984. The projects are presented with title, project number, responsible unit, contact person and short result reports. The result reports consist of short summaries of each project. (L.F.)

  20. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  1. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1.- A detailed description in intermediate coupling for all the levels belonging to the 20 configurations 3p5 ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2.- Calculation of the electron collision excitation cross sections in Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p5 ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3.- Comparison and discussion of the compiled data. These are the experimental and theoretical values available from the literature, and those from this work. 4.- Analysis of the regularities and systematic behaviors in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose. In some cases it has been possible to obtain in this way approximate analytical expressions interpolating the experimental data. 5.- All the experimental and theoretical values studied are graphically presented and compared. 6.- The last part of the work includes a listing of several general purpose programs for Atomic Physics calculations developed for this work. (Author) 35 refs

  2. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco Ramos, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1.- A detailed description in intermediate coupling for all the levels belonging to the 20 configurations 3p5 ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2.- Calculation of the electron collision excitation cross sections in Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p5 ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3.- Comparison and discussion of the compiled data. These are the experimental and theoretical values available from the literature, and those from this work. 4.- Analysis of the regularities and systematic behaviors in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose. In some cases it has been possible to obtain in this way approximate analytical expressions interpolating the experimental data. 5.- All the experimental and theoretical values studied are graphically presented and compared. 6.- The last part of the work includes a listing of several general purpose programs for Atomic Physics calculations developed for this work. (Author)

  3. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    The material contained in this compilation is sorted according to eight subject categories: 1. General Compilations; 2. Basic Isotopic Properties; 3. Nuclear Structure Properties; 4. Nuclear Decay Processes: Half-lives, Energies and Spectra; 5. Nuclear Decay Processes: Gamma-rays; 6. Nuclear Decay Processes: Fission Products; 7. Nuclear Decay Processes: (Others); 8. Atomic Processes

  4. WHO GLOBAL TUBERCULOSIS REPORTS: COMPILATION AND INTERPRETATION

    Directory of Open Access Journals (Sweden)

    I. A. Vasilyeva

    2017-01-01

    Full Text Available The purpose of the article is to inform national specialists involved in tuberculosis control about the methods used to compile WHO global tuberculosis statistics, which are drawn upon when developing strategies and programmes for tuberculosis control and evaluating their efficiency. The article explains in detail the main WHO epidemiological rates used in international publications on tuberculosis, along with data on their registered values, and the new approaches to compiling the lists of countries with the highest burden of tuberculosis, drug-resistant tuberculosis and tuberculosis with concurrent HIV infection. The article compares the rates in the Russian Federation with global data as well as data from countries within the WHO European Region and countries with the highest TB burden. It presents materials on the achievement of global goals in tuberculosis control and the main provisions of the WHO End TB Strategy for 2015-2035, adopted as part of the UN Sustainable Development Goals.

  5. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This volume contains the storage coefficient, porosity, compressibility and fracture data for the research sites discussed in Volume 1 which have been studied in sufficient detail to allow for analysis. These sites are the following: Stripa Mine, Sweden; Finnsjon, Kamlunge, Fjallveden, Gidea, Svartboberget, Sweden; Olkiluoto, Loviisa, Lavia, Finland; Climax Granite Nevada Test Site; OCRD Room, Colorado School of Mines; Savannah River Plant, Aiken, South Carolina; Oracle, Arizona; Basalt Waste Isolation Project (BWIP), Hanford, Washington; Underground Research Laboratory, AECL, Canada; Atikokan Research Area, AECL; Chalk River Research Area, AECL; Whiteshell Research Area, AECL. Other sources of information have been included where sufficient site specific geologic and hydrogeologic information is provided. The fracture data for the first three of the sites listed above are contained in this volume. The fracture data for the remaining research sites are discussed in Volume 4

  6. 1988 Bulletin compilation and index

    International Nuclear Information System (INIS)

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information

  7. 1988 Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  8. Automating Visualization Service Generation with the WATT Compiler

    Science.gov (United States)

    Bollig, E. F.; Lyness, M. D.; Erlebacher, G.; Yuen, D. A.

    2007-12-01

    As tasks and workflows become increasingly complex, software developers are devoting increasing attention to automation tools. Among many examples, the Automator tool from Apple collects components of a workflow into a single script, with very little effort on the part of the user. Tasks are most often described as a series of instructions. The granularity of the tasks dictates the tools to use. Compilers translate fine-grained instructions to assembler code, while scripting languages (ruby, perl) are used to describe a series of tasks at a higher level. Compilers can also be viewed as transformational tools: a cross-compiler can translate executable code written on one computer to assembler code understood on another, while transformational tools can translate from one high-level language to another. We are interested in creating visualization web services automatically, starting from stand-alone VTK (Visualization Toolkit) code written in Tcl. To this end, using the OCaml programming language, we have developed a compiler that translates Tcl into C++, including all the stubs, classes and methods to interface with gSOAP, a C++ implementation of the Soap 1.1/1.2 protocols. This compiler, referred to as the Web Automation and Translation Toolkit (WATT), is the first step towards automated creation of specialized visualization web services without input from the user. The WATT compiler seeks to automate all aspects of web service generation, including the transport layer, the division of labor and the details related to interface generation. The WATT compiler is part of ongoing efforts within the NSF funded VLab consortium [1] to facilitate and automate time-consuming tasks for the science related to understanding planetary materials. Through examples of services produced by WATT for the VLab portal, we will illustrate features, limitations and the improvements necessary to achieve the ultimate goal of complete and transparent automation in the generation of web

  9. Compilation of a soil map for Nigeria: a nation-wide soil resource ...

    African Journals Online (AJOL)

    This paper presents the results of a nation-wide soil and land form inventory of Nigeria. The data compilation was conducted in the framework of two projects with the objective to calculate agricultural production potential under different input levels and assess the water erosion hazard. The information on spatial distribution ...

  10. DrawCompileEvolve: Sparking interactive evolutionary art with human creations

    DEFF Research Database (Denmark)

    Zhang, Jinhong; Taarnby, Rasmus; Liapis, Antonios

    2015-01-01

    This paper presents DrawCompileEvolve, a web-based drawing tool which allows users to draw simple primitive shapes, group them together or define patterns in their groupings (e.g. symmetry, repetition). The user’s vector drawing is then compiled into an indirectly encoded genetic representation..., which can be evolved interactively, allowing the user to change the image’s colors, patterns and ultimately transform it. The human artist has direct control while drawing the initial seed of an evolutionary run and indirect control while interactively evolving it, thus making DrawCompileEvolve a mixed...

  11. The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States

    Science.gov (United States)

    Horton, John D.; San Juan, Carma A.; Stoeser, Douglas B.

    2017-06-30

    The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States (https://doi.org/10.5066/F7WH2N65) represents a seamless, spatial database of 48 State geologic maps that range from 1:50,000 to 1:1,000,000 scale. A national digital geologic map database is essential in interpreting other datasets that support numerous types of national-scale studies and assessments, such as those that provide geochemistry, remote sensing, or geophysical data. The SGMC is a compilation of the individual U.S. Geological Survey releases of the Preliminary Integrated Geologic Map Databases for the United States. The SGMC geodatabase also contains updated data for seven States and seven entirely new State geologic maps that have been added since the preliminary databases were published. Numerous errors have been corrected and enhancements added to the preliminary datasets using thorough quality assurance/quality control procedures. The SGMC is not a truly integrated geologic map database because geologic units have not been reconciled across State boundaries. However, the geologic data contained in each State geologic map have been standardized to allow spatial analyses of lithology, age, and stratigraphy at a national scale.

  12. Vectorization vs. compilation in query execution

    NARCIS (Netherlands)

    J. Sompolski (Juliusz); M. Zukowski (Marcin); P.A. Boncz (Peter)

    2011-01-01

    Compiling database queries into executable (sub-)programs provides substantial benefits compared to traditional interpreted execution. Many of these benefits, such as reduced interpretation overhead, better instruction code locality, and providing opportunities to use SIMD
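
    The contrast the record draws can be sketched in a few lines: tuple-at-a-time interpretation pays per-row dispatch overhead, while a vectorized primitive processes a whole batch (and a compiled query would fuse such primitives into one loop). Illustrative only:

        import numpy as np

        col = np.random.rand(1_000_000)

        def select_interpreted(col, threshold):
            # Tuple-at-a-time: interpretation overhead on every row.
            out = []
            for v in col:
                if v < threshold:
                    out.append(v)
            return out

        def select_vectorized(col, threshold):
            # One primitive per batch: overhead amortized, SIMD-friendly.
            return col[col < threshold]

        assert len(select_interpreted(col, 0.5)) == len(select_vectorized(col, 0.5))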

  13. Production compilation : A simple mechanism to model complex skill acquisition

    NARCIS (Netherlands)

    Taatgen, N.A.; Lee, F.J.

    2003-01-01

    In this article we describe production compilation, a mechanism for modeling skill acquisition. Production compilation has been developed within the ACT-Rational (ACT-R; J. R. Anderson, D. Bothell, M. D. Byrne, & C. Lebiere, 2002) cognitive architecture and consists of combining and specializing
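
    The record breaks off at "combining and specializing"; in production compilation, two rules that fire in sequence, where the first requests a memory retrieval and the second uses the result, are merged into one rule with the retrieved fact baked in. A toy sketch (invented rule format; ACT-R's actual mechanism operates on its own buffer and slot syntax):

        def rule_request(goal):
            # Rule 1: ask declarative memory for the answer to the goal.
            return {"retrieve": goal}

        def rule_harvest(retrieved):
            # Rule 2: report the retrieved sum as the answer.
            return {"answer": retrieved["sum"]}

        def compile_rules(goal, fact):
            # Compiled rule: answers directly, skipping the retrieval step.
            def compiled(g):
                if g == goal:
                    return {"answer": fact["sum"]}
            return compiled

        fact = {"problem": "3+4", "sum": 7}
        fast_rule = compile_rules("3+4", fact)
        print(fast_rule("3+4"))  # {'answer': 7}, with no retrieval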

  14. 32 CFR 806b.19 - Information compiled in anticipation of civil action.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Information compiled in anticipation of civil action. 806b.19 Section 806b.19 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR... compiled in anticipation of civil action. Withhold records compiled in connection with a civil action or...

  15. abc: An Extensible AspectJ Compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie J.

    2006-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its front end is built using the Polyglot framework, as a modular extension of the Java language. The use of Polyglot gives flexibility of syntax and type checking. The back end is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general...

  16. Regulatory and technical reports (abstract index journal): Annual compilation for 1996, Volume 21, No. 4

    Energy Technology Data Exchange (ETDEWEB)

    Sheehan, M.A.

    1997-04-01

    This compilation is the annual cumulation of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors.

  17. Regulatory and technical reports (abstract index journal): Annual compilation for 1996, Volume 21, No. 4

    International Nuclear Information System (INIS)

    Sheehan, M.A.

    1997-04-01

    This compilation is the annual cumulation of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors

  18. Fault-tolerant digital microfluidic biochips compilation and synthesis

    CERN Document Server

    Pop, Paul; Stuart, Elena; Madsen, Jan

    2016-01-01

    This book describes for researchers in the fields of compiler technology, design and test, and electronic design automation the new area of digital microfluidic biochips (DMBs), and thus offers a new application area for their methods. The authors present a routing-based model of operation execution, along with several associated compilation approaches, which progressively relax the assumption that operations execute inside fixed rectangular modules. Since operations can experience transient faults during the execution of a bioassay, the authors show how to use both offline (design time) and online (runtime) recovery strategies. The book also presents methods for the synthesis of fault-tolerant application-specific DMB architectures. · Presents the current models used for the research on compilation and synthesis techniques of DMBs in a tutorial fashion; · Includes a set of “benchmarks”, which are presented in great detail and includes the source code of most of the t...

  19. The KNK II/1 fuel assembly NY-205: Compilation of the irradiation history and the fuel and fuel pin fabrication data of the INTERATOM data bank system BESEX

    International Nuclear Information System (INIS)

    Patzer, G.; Geier, F.

    1988-01-01

    The fuel assembly NY-205 has been irradiated during the first and the second core of KNK II with a total residence time of 832 equivalent full-power days. A maximum burnup of 175,000 MWd/tHM or 18.6 % was reached with a maximum steel damage of 66 dpa-NRT. For the cladding the materials 1.4970 and 1.4981 have been used in different metallurgical conditions, and for the Uranium/Plutonium mixed-oxide fuel the most important variants of the major fabrication parameters had been realized. The assembly will be brought to the Hot Cells of the KfK Karlsruhe for post-irradiation examination in February 1988, so that knowledge of the fabrication data is of interest for the selection of fuel pins and for the evaluation of the examination results. Therefore this report compiles the fuel and fuel pin fabrication data from the INTERATOM data bank system BESEX, and additionally an overview of the irradiation history of the assembly is given. [de]

  20. A survey of compiler development aids. [concerning lexical, syntax, and semantic analysis

    Science.gov (United States)

    Buckles, B. P.; Hodges, B. C.; Hsia, P.

    1977-01-01

    A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five selected phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools that were developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention - SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed; some of their shortcomings and some of the areas of research currently in progress were inspected.

  1. The National Assessment of Shoreline Change:A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the Sandy Shorelines of the California Coast

    Science.gov (United States)

    Hapke, Cheryl J.; Reid, David

    2006-01-01

    Introduction The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector shorelines and shoreline change rates for the sandy shoreline along the California open coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along many open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard, repeatable methods for mapping and analyzing shoreline movement so that periodic, systematic, and internally consistent updates of shorelines and shoreline change rates can be made at a national scale. This data compilation for open-ocean, sandy shorelines of the California coast is one in a series that already includes the Gulf of Mexico and the Southeast Atlantic Coast (Morton et al., 2004; Morton et al., 2005) and will eventually cover Washington, Oregon, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are determined by comparing the positions of three historical shorelines digitized from maps, with a modern shoreline derived from LIDAR (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1850s-1880s, 1920s-1930s, and late 1940s-1970s. The most recent shoreline is from data collected between 1997 and 2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change of the

  2. abc: The AspectBench Compiler for AspectJ

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    abc is an extensible, optimising compiler for AspectJ. It has been designed as a workbench for experimental research in aspect-oriented programming languages and compilers. We outline a programme of research in these areas, and we review how abc can help in achieving those research goals...

  3. The RHNumtS compilation: Features and bioinformatics approaches to locate and quantify Human NumtS

    Directory of Open Access Journals (Sweden)

    Saccone Cecilia

    2008-06-01

    Full Text Available Abstract Background To a greater or lesser extent, eukaryotic nuclear genomes contain fragments of their mitochondrial genome counterpart, deriving from the random insertion of damaged mtDNA fragments. NumtS (Nuclear mt Sequences) are not equally abundant in all species, and are redundant and polymorphic in terms of copy number. In population and clinical genetics, it is important to have a complete overview of NumtS quantity and location. Searching PubMed for NumtS or Mitochondrial pseudo-genes yields hundreds of papers reporting Human NumtS compilations produced by in silico or wet-lab approaches. A comparison of published compilations clearly shows significant discrepancies among data, due both to unwise application of Bioinformatics methods and to a not yet correctly assembled nuclear genome. To optimize quantification and location of NumtS, we produced a consensus compilation of Human NumtS by applying various bioinformatics approaches. Results Location and quantification of NumtS may be achieved by applying database similarity searching methods: we have applied various methods such as Blastn, MegaBlast and BLAT, changing both parameters and database; the results were compared, further analysed and checked against the already published compilations, thus producing the Reference Human Numt Sequences (RHNumtS) compilation. The resulting NumtS total 190. Conclusion The RHNumtS compilation represents a highly reliable reference basis, which may allow designing a lab protocol to test the actual existence of each NumtS. Here we report preliminary results based on PCR amplification and sequencing on 41 NumtS selected from RHNumtS among those with lower score. In parallel, we are currently designing the RHNumtS database structure for implementation in the HmtDB resource. In the future, the same database will host NumtS compilations from other organisms, but these will be generated only when the nuclear genome of a specific organism has reached a high

  4. The Katydid system for compiling KEE applications to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    Components of a system known as Katydid are developed in an effort to compile knowledge-based systems developed in a multimechanism integrated environment (KEE) to Ada. The Katydid core is an Ada library supporting KEE object functionality, and the other elements include a rule compiler, a LISP-to-Ada translator, and a knowledge-base dumper. Katydid employs translation mechanisms that convert LISP knowledge structures and rules to Ada and utilizes basic prototypes of a run-time KEE object-structure library module for Ada. Preliminary results include the semiautomatic compilation of portions of a simple expert system to run in an Ada environment with the described algorithms. It is suggested that Ada can be employed for AI programming and implementation, and the Katydid system is being developed to include concurrency and synchronization mechanisms.

  5. Compilation of carbon-14 data

    International Nuclear Information System (INIS)

    Paasch, R.A.

    1985-01-01

    A review and critical analysis was made of the original sources of carbon-14 in the graphite moderator and reflector zones of the eight Hanford production reactors, the present physical and chemical state of the carbon-14, pathways (other than direct combustion) by which the carbon-14 could be released to the biosphere, and the maximum rate at which it might be released under circumstances which idealistically favor the release. Areas of uncertainty are noted and recommendations are made for obtaining additional data in three areas: (1) release rate of carbon-14 from irradiated graphite saturated with aerated water; (2) characterization of carbon-14 deposited outside the moderator and reflector zones; and (3) corrosion/release rate of carbon-14 from irradiated steel and aluminum alloys

  6. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.; Tick, E. (comp.)

    1991-11-01

    This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31--November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purpose of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working in compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led to different communities emphasizing different species of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.

  7. DLVM: A modern compiler infrastructure for deep learning systems

    OpenAIRE

    Wei, Richard; Schwartz, Lane; Adve, Vikram

    2017-01-01

    Deep learning software demands reliability and performance. However, many of the existing deep learning frameworks are software libraries that act as an unsafe DSL in Python and a computation graph interpreter. We present DLVM, a design and implementation of a compiler infrastructure with a linear algebra intermediate representation, algorithmic differentiation by adjoint code generation, domain-specific optimizations and a code generator targeting GPU via LLVM. Designed as a modern compiler ...

  8. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1996 July--September. Volume 21, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-02-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: secondary report number index; personal author index; subject index; NRC originating organization index (staff reports); NRC originating organization index (international agreements); NRC contract sponsor index (contractor reports); contractor index; international organization index; and licensed facility index. A detailed explanation of the entries precedes each index.

  9. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1994, July--September. Volume 19, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    None

    1994-12-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: Secondary Report Number Index, Personal Author Index, Subject Index, NRC Originating Organization Index (Staff Reports), NRC Originating Organization Index (International Agreements), NRC Contract Sponsor Index (Contractor Reports), Contractor Index, International Organization Index, Licensed Facility Index. A detailed explanation of the entries precedes each index.

  10. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1996 July--September. Volume 21, Number 3

    International Nuclear Information System (INIS)

    1997-02-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: secondary report number index; personal author index; subject index; NRC originating organization index (staff reports); NRC originating organization index (international agreements); NRC contract sponsor index (contractor reports); contractor index; international organization index; and licensed facility index. A detailed explanation of the entries precedes each index

  11. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1994, July--September. Volume 19, Number 3

    International Nuclear Information System (INIS)

    1994-12-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: Secondary Report Number Index, Personal Author Index, Subject Index, NRC Originating Organization Index (Staff Reports), NRC Originating Organization Index (International Agreements), NRC Contract Sponsor Index (Contractor Reports), Contractor Index, International Organization Index, Licensed Facility Index. A detailed explanation of the entries precedes each index

  12. 1976 compilation of national nuclear data committees

    International Nuclear Information System (INIS)

    1977-01-01

    This list of currently existing National Nuclear Data Committees, and their memberships, is published with the object of promoting the interaction and enhance the awareness of nuclear data activities in IAEA Member States. The following Member States have indicated the existence of a nuclear data committee in their countries: Bangladesh, Bolivia, Bulgaria, France, Hungary, India, Japan, Romania, Sweden, USSR, United Kingdom, USA, Yugoslavia

  13. Cross-compilation of ATLAS online software to the PowerPC-VxWorks system

    International Nuclear Information System (INIS)

    Tian Yuren; Li Jin; Ren Zhengyu; Zhu Kejun

    2005-01-01

    BES III selected the ATLAS online software as the framework of its run-control system. BES III uses the PowerPC-VxWorks system in its front-end readout system, so it is necessary to cross-compile this software to the PowerPC-VxWorks system. The article demonstrates several aspects of this project, such as the structure and organization of the ATLAS online software, the application of the CMT tool for cross-compiling, the selection and configuration of the cross-compiler, and methods to solve various problems arising from the differences between compilers and operating systems. The software, after cross-compiling, runs normally and, together with the software running on the Linux system, makes up a complete run-control system. (authors)

  14. A Language for Specifying Compiler Optimizations for Generic Software

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, Jeremiah J. [Indiana Univ., Bloomington, IN (United States)

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.

  15. Compiling the parallel programming language NestStep to the CELL processor

    OpenAIRE

    Holm, Magnus

    2010-01-01

    The goal of this project is to create a source-to-source compiler which will translate NestStep code to C code. The compiler's job is to replace NestStep constructs with a series of function calls to the NestStep runtime system. NestStep is a parallel programming language extension based on the BSP model. It adds constructs for parallel programming on top of an imperative programming language. For this project, only constructs extending the C language are relevant. The output code will compil...
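
    A toy source-to-source rewrite in the spirit described (the construct and runtime-call names are invented for illustration; the real compiler parses C rather than pattern-matching text):

        import re

        SRC = """
        int main(void) {
            step {
                x = x + 1;
            }
            return 0;
        }
        """

        def lower_steps(src):
            # Replace a hypothetical "step { ... }" superstep construct
            # with bracketing calls into a runtime system.
            return re.sub(
                r"step\s*\{(.*?)\}",
                lambda m: "NestStep_begin_step();%s NestStep_end_step();"
                          % m.group(1),
                src,
                flags=re.DOTALL,
            )

        print(lower_steps(SRC))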

  16. Encounters of aircraft with volcanic ash clouds; A compilation of known incidents, 1953-2009

    Science.gov (United States)

    Guffanti, Marianne; Casadevall, Thomas J.; Budding, Karin

    2010-01-01

    Information about reported encounters of aircraft with volcanic ash clouds from 1953 through 2009 has been compiled to document the nature and scope of risks to aviation from volcanic activity. The information, gleaned from a variety of published and other sources, is presented in database and spreadsheet formats; the compilation will be updated as additional encounters occur and as new data and corrections come to light. The effects observed by flight crews and extent of aircraft damage vary greatly among incidents, and each incident in the compilation is rated according to a severity index. Of the 129 reported incidents, 94 incidents are confirmed ash encounters, with 79 of those having various degrees of airframe or engine damage; 20 are low-severity events that involve suspected ash or gas clouds; and 15 have data that are insufficient to assess severity. Twenty-six of the damaging encounters involved significant to very severe damage to engines and (or) airframes, including nine encounters with engine shutdown during flight. The average annual rate of damaging encounters since 1976, when reporting picked up, has been approximately 2 per year. Most of the damaging encounters occurred within 24 hours of the onset of ash production or at distances less than 1,000 kilometers from the source volcanoes. The compilation covers only events of relatively short duration for which aircraft were checked for damage soon thereafter; documenting instances of long-term repeated exposure to ash (or sulfate aerosols) will require further investigation. Of 38 source volcanoes, 8 have caused 5 or more encounters, of which the majority were damaging: Augustine (United States), Chaiten (Chile), Mount St. Helens (United States), Pacaya (Guatemala), Pinatubo (Philippines), Redoubt (United States), Sakura-jima (Japan), and Soufriere Hills (Montserrat, Lesser Antilles, United Kingdom). Aircraft have been damaged by eruptions ranging from small, recurring episodes to very large

  17. Compiling for Novel Scratch Pad Memory based Multicore Architectures for Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shrivastava, Aviral

    2016-02-05

    The objective of this proposal is to develop tools and techniques (in the compiler) to manage data of a task and communication among tasks on the scratch pad memory (SPM) of the core, so that any application (a set of tasks) can be executed efficiently on an SPM based manycore architecture.

  18. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1984, July-September. Volume 9, No. 3

    International Nuclear Information System (INIS)

    1984-11-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. These precede the following indexes: Contractor Report Number, Personal Author, Subject, NRC Originating Organization (Staff Reports), NRC Contract Sponsor (Contractor Reports), Contractor, and Licensed Facility

  19. Compiling knowledge-based systems from KEE to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications - most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBSs developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights into Ada as an artificial intelligence programming language, potential solutions to some of the engineering difficulties encountered in early work, and inspiration for future system development.

  20. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part IV: Normal and Inverted Letter 'h' and 'H' Architecture.

  1. Compilation and analysis of Escherichia coli promoter DNA sequences.

    OpenAIRE

    Hawley, D K; McClure, W R

    1983-01-01

    The DNA sequences of 168 promoter regions (-50 to +10) for Escherichia coli RNA polymerase were compiled. The complete listing was divided into two groups depending upon whether or not the promoter had been defined by genetic (promoter mutations) or biochemical (5' end determination) criteria. A consensus promoter sequence based on homologies among 112 well-defined promoters was determined that was in substantial agreement with previous compilations. In addition, we have tabulated 98 promoter ...
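
    Deriving a consensus from aligned sequences, as done here for the 112 well-defined promoters, amounts to taking the most frequent base at each position; a sketch on toy input:

        from collections import Counter

        # Toy aligned -35 hexamers; the actual compilation used 112
        # well-defined E. coli promoter sequences.
        seqs = ["TTGACA", "TTGACT", "TAGACA", "TTGATA"]

        consensus = "".join(
            Counter(col).most_common(1)[0][0] for col in zip(*seqs)
        )
        print(consensus)  # TTGACA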

  2. Installation of a new Fortran compiler and effective programming method on the vector supercomputer

    International Nuclear Information System (INIS)

    Nemoto, Toshiyuki; Suzuki, Koichiro; Watanabe, Kenji; Machida, Masahiko; Osanai, Seiji; Isobe, Nobuo; Harada, Hiroo; Yokokawa, Mitsuo

    1992-07-01

    The Fortran compiler, version 10, was replaced with the new version 12 (V12) on the Fujitsu computer system at JAERI in May 1992. A benchmark test of the performance of the V12 compiler was carried out with 16 representative nuclear codes in advance of the installation of the compiler. On average, the new compiler improved performance by a factor of 1.13. The effect of the enhanced functions of the compiler and its compatibility with the nuclear codes were also examined. An assistant tool for vectorization, TOP10EX, was developed. In this report, the results of the evaluation of the V12 compiler and the usage of the tools for vectorization are presented. (author)

  3. Compiler-Agnostic Function Detection in Binaries

    NARCIS (Netherlands)

    Andriesse, D.A.; Slowinska, J.M.; Bos, H.J.

    2017-01-01

    We propose Nucleus, a novel function detection algorithm for binaries. In contrast to prior work, Nucleus is compiler-agnostic, and does not require any learning phase or signature information. Instead of scanning for signatures, Nucleus detects functions at the Control Flow Graph-level, making it

  4. A compilation of structure functions in deep inelastic scattering

    International Nuclear Information System (INIS)

    Gehrmann, T.; Roberts, R.G.; Whalley, M.R.

    1999-01-01

    A compilation of all the available data on the unpolarized structure functions F₂ and xF₃, R = σ_L/σ_T, the virtual photon asymmetries A₁ and A₂, and the polarized structure functions g₁ and g₂, from deep inelastic lepton scattering off protons, deuterium and nuclei is presented. The relevant experiments at CERN, DESY, Fermilab and SLAC from 1991, the date of our earlier review [1], to the present day are covered. A brief general theoretical introduction is given, followed by the data presented both in tabular and graphical form and, for the F₂ and xF₃ data, the predictions based on the MRST98 and CTEQ4 parton distribution functions are also displayed. All the data in this review, together with data on a wide variety of other reactions, can be found in and retrieved from the Durham-RAL HEP Databases on the World-Wide-Web (http://durpdg.dur.ac.uk/HEPDATA). (author)

  5. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part III: B-Shaped Architecture with Vertical Well in the Upper Layer.

  6. Neutron-induced activation measurements and EXFOR compilations in the energy range up to 20 MeV

    International Nuclear Information System (INIS)

    Semkova, V.; Otuka, N.

    2015-01-01

    Accurate neutron-induced activation cross-section data are of interest in many fields of science and applications. Such data are needed for calculations and analysis of neutron transport, activation of materials, gas production and radiation damage, dose rates, etc. Experimental data provide the basis for the parameterization of reaction cross-section calculations, and for the assessment of nuclear models and evaluated data libraries. The activation technique in combination with gamma spectrometry is a well-known and widely used method for neutron-induced reaction cross-section measurements. However, in some cases considerable differences exist between the results from different experiments. A careful consideration of all the factors that may affect each particular measurement is needed in order to obtain reliable data. Measured data are of little value until they are made conveniently available to users and evaluators. The International Network of Nuclear Reaction Data Centres (NRDC) collaborates in the collection, compilation and dissemination of experimental nuclear reaction data in the EXFOR data library. In the present work, some aspects of the ⁵⁸Ni(n,p)⁵⁸Co activation cross-section measurements at two different experimental facilities and EXFOR compilation files will be presented

  7. HOPE: A Python just-in-time compiler for astrophysical computations

    Science.gov (United States)

    Akeret, J.; Gamper, L.; Amara, A.; Refregier, A.

    2015-04-01

    The Python programming language is becoming increasingly popular for scientific applications due to its simplicity, versatility, and the broad range of its libraries. A drawback of this dynamic language, however, is its low runtime performance which limits its applicability for large simulations and for the analysis of large data sets, as is common in astrophysics and cosmology. While various frameworks have been developed to address this limitation, most focus on covering the complete language set, and either force the user to alter the code or are not able to reach the full speed of an optimised native compiled language. In order to combine the ease of Python and the speed of C++, we developed HOPE, a specialised Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimisation on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. We assess the performance of HOPE by performing a series of benchmarks and compare its execution speed with that of plain Python, C++ and the other existing frameworks. We find that HOPE improves the performance compared to plain Python by a factor of 2 to 120, achieves speeds comparable to that of C++, and often exceeds the speed of the existing solutions. We discuss the differences between HOPE and the other frameworks, as well as future extensions of its capabilities. The fully documented HOPE package is available at http://hope.phys.ethz.ch and is published under the GPLv3 license on PyPI and GitHub.
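
    A minimal sketch of the decorator workflow described above follows; it assumes the package exposes the decorator as hope.jit, as in the package documentation, and that the function body stays within the supported language subset.

    ```python
    import numpy as np
    import hope  # assumed importable after installing the package from PyPI

    @hope.jit  # first call translates the body to C++ and compiles it natively
    def sum_sq(x, n):
        s = 0.0
        for i in range(n):
            s += x[i] * x[i]
        return s

    x = np.random.rand(1000)
    print(sum_sq(x, 1000))  # subsequent calls run the compiled version
    ```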

  8. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    We demonstrate the ability of our tool to transform code, and suggest code refactorings that increase its amenability to optimization. The preliminary results show that, with our tool-set, automatic loop parallelization with the GNU C compiler, gcc, yields 8.6x best-case speedup over...

  9. Perspex machine: V. Compilation of C programs

    Science.gov (United States)

    Spanner, Matthew P.; Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of the Turing machine with projective geometry. The original, constructive proof used four special, perspective transformations to implement the Turing machine in projective geometry. These four transformations are now generalised and applied in a compiler, implemented in Pop11, that converts a subset of the C programming language into perspexes. This is interesting both from a geometrical and a computational point of view. Geometrically, it is interesting that program source can be converted automatically to a sequence of perspective transformations and conditional jumps, though we find that the product of homogeneous transformations with normalisation can be non-associative. Computationally, it is interesting that program source can be compiled for a Reduced Instruction Set Computer (RISC), the perspex machine, that is a Single Instruction, Zero Exception (SIZE) computer.

  10. Compiling gate networks on an Ising quantum computer

    International Nuclear Information System (INIS)

    Bowdrey, M.D.; Jones, J.A.; Knill, E.; Laflamme, R.

    2005-01-01

    Here we describe a simple mechanical procedure for compiling a quantum gate network into the natural gates (pulses and delays) for an Ising quantum computer. The aim is not necessarily to generate the most efficient pulse sequence, but rather to develop an efficient compilation algorithm that can be easily implemented in large spin systems. The key observation is that it is not always necessary to refocus all the undesired couplings in a spin system. Instead, the coupling evolution can simply be tracked and then corrected at some later time. Although described within the language of NMR, the algorithm is applicable to any design of quantum computer based on Ising couplings
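
    The "track rather than refocus" bookkeeping can be pictured with a toy model like the one below; the couplings, units, and correction period are invented for illustration and are not taken from the paper.

    ```python
    # Accumulate the Ising ZZ phase J*t for every spin pair during delays, and
    # only insert a correcting delay when a gate actually needs a pair clean.
    couplings = {("q0", "q1"): 1.0, ("q1", "q2"): 0.7}   # J, arbitrary units
    phase = {pair: 0.0 for pair in couplings}

    def delay(t):
        for pair, J in couplings.items():
            phase[pair] += J * t          # evolution is tracked, not refocused

    def correct(pair, period=2.0):
        # wait just long enough for the tracked phase to reach a multiple of
        # the period; other pairs keep evolving and stay tracked
        delay((-phase[pair]) % period)

    delay(0.3)
    correct(("q0", "q1"))
    print(phase)   # ("q0","q1") is back at a multiple of the period
    ```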

  11. Using MaxCompiler for High Level Synthesis of Trigger Algorithms

    CERN Document Server

    Summers, Sioni Paris; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  12. Molecular dynamics and diffusion a compilation

    CERN Document Server

    Fisher, David

    2013-01-01

    The molecular dynamics technique was developed in the 1960s as the outgrowth of attempts to model complicated systems by either (a) direct physical simulation or, following the great success of Monte Carlo methods, (b) computer techniques. Computer simulation soon won out over clumsy physical simulation, and the ever-increasing speed and sophistication of computers has naturally made molecular dynamics simulation into a more and more successful technique. One of its most popular applications is the study of diffusion, and some experts now even claim that molecular dynamics simulation is, in the case of situations involving well-characterised elements and structures, more accurate than experimental measurement. The present double volume includes a compilation (over 600 items) of predicted solid-state diffusion data, for all of the major materials groups, dating back nearly four decades. The double volume also includes some original papers: "Determination of the Activation Energy for Formation and ...

  13. 36 CFR 902.57 - Investigatory files compiled for law enforcement purposes.

    Science.gov (United States)

    2010-07-01

    36 CFR 902.57 (2010-07-01), Parks, Forests, and Public Property: Pennsylvania Avenue Development Corporation, Freedom of Information Act, Exemptions From Public Access to Corporation Records. Section 902.57, Investigatory files compiled...

  14. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  16. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...
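
    The online-inference step can be sketched as a bottom-up evaluation of the compiled circuit; the node encoding and the one-variable example below are invented for illustration and are not PRIMULA's actual representation.

    ```python
    import math

    def evaluate(node, indicators):
        """Bottom-up evaluation of an arithmetic circuit (invented encoding)."""
        kind = node[0]
        if kind == "p":                  # leaf: network parameter
            return node[1]
        if kind == "i":                  # leaf: evidence indicator, 0/1 per query
            return indicators[node[1]]
        vals = [evaluate(child, indicators) for child in node[1]]
        return sum(vals) if kind == "+" else math.prod(vals)

    # Circuit for a single binary variable A with P(A=a) = 0.3:
    circuit = ("+", [("*", [("i", "a"), ("p", 0.3)]),
                     ("*", [("i", "not_a"), ("p", 0.7)])])
    print(evaluate(circuit, {"a": 1, "not_a": 1}))  # no evidence  -> 1.0
    print(evaluate(circuit, {"a": 1, "not_a": 0}))  # evidence A=a -> 0.3
    ```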

  17. Safety and maintenance engineering: A compilation

    Science.gov (United States)

    1974-01-01

    A compilation is presented for the dissemination of information on technological developments which have potential utility outside the aerospace and nuclear communities. Safety of personnel engaged in the handling of hazardous materials and equipment, protection of equipment from fire, high wind, or careless handling by personnel, and techniques for the maintenance of operating equipment are reported.

  18. Compilation of information on melter modeling

    International Nuclear Information System (INIS)

    Eyler, L.L.

    1996-03-01

    The objective of the task described in this report is to compile information on modeling capabilities for the High-Temperature Melter and the Cold Crucible Melter and issue a letter report summarizing existing modeling capabilities. The report is to include strategy recommendations for future modeling efforts to support High-Level Waste (HLW) melter development

  19. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    VERIFIED COMPILATION OF CONCURRENT MANAGED LANGUAGES. Final technical report, Purdue University, November 2017. Approved for public release; published by the Information Directorate in the interest of scientific and technical information exchange.

  20. Compilation of nuclear safety criteria potential application to DOE nonreactor facilities

    International Nuclear Information System (INIS)

    1992-03-01

    This bibliographic document compiles nuclear safety criteria applied to the various areas of nuclear safety addressed in a Safety Analysis Report for a nonreactor nuclear facility (NNF). The criteria listed are derived from federal regulations, Nuclear Regulatory Commission (NRC) guides and publications, DOE and DOE contractor publications, and industry codes and standards. The titles of the chapters and sections of Regulatory Guide 3.26, ''Standard Format and Content of Safety Analysis Reports for Fuel Reprocessing Plants'' were used to format the chapters and sections of this compilation. In each section the criteria are compiled in four groups, namely: (1) Code of Federal Regulations, (2) USNRC Regulatory Guides, (3) Codes and Standards, and (4) Supplementary Information

  1. The National Assessment of Shoreline Change: a GIS compilation of vector shorelines and associated shoreline change data for the U.S. southeast Atlantic coast

    Science.gov (United States)

    Miller, Tara L.; Morton, Robert A.; Sallenger, Asbury H.

    2006-01-01

    The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive database of digital vector shorelines and shoreline change rates for the U.S. Southeast Atlantic Coast (Florida, Georgia, South Carolina, North Carolina). These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along most open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard repeatable methods for mapping and analyzing shoreline movement so that periodic updates of shorelines and shoreline change rates can be made nationally that are systematic and internally consistent. This data compilation for open-ocean, sandy shorelines of the U.S. Southeast Atlantic Coast is the second in a series that already includes the Gulf of Mexico, and will eventually include the Pacific Coast, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are based on merging three historical shorelines with a modern shoreline derived from lidar (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1800s, 1920s-1930s, and 1970s. The most recent shoreline is derived from data collected over the period of 1997-2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are simple end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change for the U.S. Southeast Atlantic Coast at http://pubs.usgs.gov/of/2005/1401/ to get additional

  2. Parallelizing More Loops with Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    2012-01-01

    We present an interactive compilation feedback system that guides programmers in iteratively modifying their application source code. This helps leverage the compiler's ability to generate loop-parallel code. We employ our system to modify two sequential benchmarks dealing with image processing and edge detection...

  3. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  4. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This volume contains a continuation of the fracture data for the research sites discussed in Volume 1 which have been studied in sufficient detail to allow for analysis. The sites discussed in this volume are the following: Climax Granite, Nevada Test Site; OCRD Room, Colorado School of Mines; Savannah River Plant, Aiken, South Carolina; Oracle, Arizona; Basalt Waste Isolation Project (BWIP), Hanford, Washington; Underground Research Laboratory, AECL, Canada; Atikokan Research Area, AECL; Chalk River Research Area, AECL; Whiteshell Research Area, AECL. Other sources of information have been included where sufficient site-specific geologic and hydrogeologic information is provided

  5. Deep knowledge and knowledge compilation for dynamic systems

    International Nuclear Information System (INIS)

    Mizoguchi, Riichiro

    1994-01-01

    Expert systems are viewed as knowledge-based systems which efficiently solve real-world problems based on the expertise contained in their knowledge bases elicited from domain experts. Although expert systems that depend on the heuristics of domain experts have contributed to the current success, they are known to be brittle and hard to build. This paper is concerned with research on model-based diagnosis and knowledge compilation for dynamic systems conducted by the author's group to overcome these difficulties. First, we summarize the advantages and shortcomings of expert systems. Second, deep knowledge and knowledge compilation are discussed. Then, the latest results of our research on model-based diagnosis are reviewed. The future direction of knowledge base technology research is also discussed. (author)

  6. 1991 OCRWM bulletin compilation and index

    International Nuclear Information System (INIS)

    1992-05-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management, to provide current information about the national program for managing spent fuel and high-level radioactive waste. The document is a compilation of issues from the 1991 calendar year. A table of contents and an index have been provided to reference information contained in this year's Bulletins

  7. Compilation of data on the release of radioactive substances in the vent air of nuclear power plants in the Federal Republic of Germany in 1975

    International Nuclear Information System (INIS)

    Winkelmann, I.; Endrulat, H.J.; Haubelt, R.; Westpfahl, U.

    1976-04-01

    The present compilation of data on the release of radioactive substances in the vent air of nuclear power plants in the FRG is a continuation of a report series on aerosol filter and iodine filter samples from the exhaust air control systems of the nuclear power plants Gundremmingen, Obrigheim, Wuergassen, Stade, Lingen and Biblis A. The reports have been issued by the Federal public health office since 1972. This report is supplemented by annual release values for radioactive noble gases, for short- and long-lived aerosols, and for gaseous ¹³¹I, supplied, as in previous years, by the individual nuclear power plants on uniform questionnaires. Data on the release of tritium are also available from some nuclear power plants. (orig.)

  8. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This volume contains the permeability data for the research sites discussed in Volume 1 which have been studied in sufficient detail to allow for analysis. These sites are the following: Stripa Mine, Sweden; Finnsjon, Kamlunge, Fjallveden, Gidea, Svartboberget, Sweden; Olkiluoto, Loviisa, Lavia, Finland; Climax Granite Nevada Test Site; OCRD Room, Colorado School of Mines; Savannah River Plant, Aiken, South Carolina; Oracle, Arizona; Basalt Waste Isolation Project (BWIP), Hanford, Washington; Underground Research Laboratory, AECL, Canada; Atikokan Research Area, AECL; Chalk River Research Area, AECL; Whiteshell Research Area, AECL. Other sources of information have been included where sufficient site specific geologic and hydrogeologic information is provided

  9. National energetic balance. Statistical compilation 1985-1991

    International Nuclear Information System (INIS)

    1992-01-01

    This document compiles the statistical information supplied by the governmental and private institutions that make up the national energy sector in Paraguay. The first part covers total energy supply; the second, energy transformation centres; and the last part presents energy flows, consolidated balances and other energy-economic indicators

  10. 13 CFR 146.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    13 CFR 146.600 (2010-01-01), Business Credit and Assistance: Small Business Administration, New Restrictions on Lobbying, Semi-annual compilation.... (c) Information that involves intelligence matters shall be reported only to the Select Committee on...

  11. Compilation of FY 1995 and FY 1996 DOD Financial Statements at the Defense Finance and Accounting Service, Indianapolis Center

    National Research Council Canada - National Science Library

    1996-01-01

    The audit objective was to determine whether the Defense Finance and Accounting Service, Indianapolis Center, consistently and accurately compiled financial data from field entities and other sources...

  12. Compilation of radiation damage test data

    International Nuclear Information System (INIS)

    Schoenbacher, H.; Tavlet, M.

    1989-01-01

    This report summarizes radiation damage test data on commercially available organic cable insulation and jacket materials: Ethylene-propylene rubbers, polyethylenes, polyurethanes, silicone rubbers, and copolymers based on polyethylene. The materials have been irradiated either in a nuclear reactor, or with a cobalt-60 source, or in the CERN accelerators, at different dose rates. The absorbed doses were between 10³ and 5×10⁶ Gy. Mechanical properties, e.g. tensile strength, elongation at break, and hardness, have been tested on irradiated and non-irradiated samples, according to the recommendations of the International Electrotechnical Commission. The results are presented in the form of tables and graphs to show the effect of the absorbed dose on the measured properties. (orig.)

  13. Using MaxCompiler for the high level synthesis of trigger algorithms

    International Nuclear Information System (INIS)

    Summers, S.; Rose, A.; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  14. Using MaxCompiler for the high level synthesis of trigger algorithms

    Science.gov (United States)

    Summers, S.; Rose, A.; Sanders, P.

    2017-02-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  15. Languages, compilers and run-time environments for distributed memory machines

    CERN Document Server

    Saltz, J

    1992-01-01

    Papers presented within this volume cover a wide range of topics related to programming distributed memory machines. Distributed memory architectures, although having the potential to supply the very high levels of performance required to support future computing needs, present awkward programming problems. The major issue is to design methods which enable compilers to generate efficient distributed memory programs from relatively machine independent program specifications. This book is the compilation of papers describing a wide range of research efforts aimed at easing the task of programmin

  16. Combining Compile-Time and Run-Time Parallelization

    Directory of Open Access Journals (Sweden)

    Sungdo Moon

    1999-01-01

    This paper demonstrates that significant improvements to automatic parallelization technology require that existing systems be extended in two ways: (1) they must combine high-quality compile-time analysis with low-cost run-time testing; and (2) they must take control flow into account during analysis. We support this claim with the results of an experiment that measures the safety of parallelization at run time for loops left unparallelized by the Stanford SUIF compiler's automatic parallelization system. We present results of measurements on programs from two benchmark suites – SPECFP95 and NAS sample benchmarks – which identify inherently parallel loops in these programs that are missed by the compiler. We characterize remaining parallelization opportunities, and find that most of the loops require run-time testing, analysis of control flow, or some combination of the two. We present a new compile-time analysis technique that can be used to parallelize most of these remaining loops. This technique is designed to not only improve the results of compile-time parallelization, but also to produce low-cost, directed run-time tests that allow the system to defer binding of parallelization until run time when safety cannot be proven statically. We call this approach predicated array data-flow analysis. We augment array data-flow analysis, which the compiler uses to identify independent and privatizable arrays, by associating predicates with array data-flow values. Predicated array data-flow analysis allows the compiler to derive "optimistic" data-flow values guarded by predicates; these predicates can be used to derive a run-time test guaranteeing the safety of parallelization.
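
    The flavor of such a compiler-emitted guard can be sketched as follows; the predicate (distinct subscripts in an index array) and all names are invented, and a real system would emit the test inline in the generated code rather than in a library routine.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def run_loop(a, idx):
        """Run a[idx[i]] += 1 for every i; parallel only when safe at run time."""
        def body(i):
            a[idx[i]] += 1.0

        # Cheap run-time predicate the compiler might emit: if no subscript
        # repeats, no two iterations write the same element, so the loop
        # carries no output dependence and may run in parallel.
        if len(set(idx)) == len(idx):
            with ThreadPoolExecutor() as pool:
                list(pool.map(body, range(len(idx))))
        else:
            for i in range(len(idx)):        # otherwise fall back to sequential
                body(i)
    ```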

  17. On the performance of the HAL/S-FC compiler. [for space shuttles

    Science.gov (United States)

    Martin, F. H.

    1975-01-01

    The HAL/S compilers which will be used in the space shuttles are described. Acceptance test objectives and procedures are described, the raw results are presented and analyzed, and conclusions and observations are drawn. An appendix is included containing an illustrative set of compiler listings and results for one of the test cases.

  18. COMPILATION OF LABORATORY SCALE ALUMINUM WASH AND LEACH REPORT RESULTS

    International Nuclear Information System (INIS)

    Harrington, S.J.

    2011-01-01

    This report compiles and analyzes all known wash and caustic leach laboratory studies. As further data are produced, this report will be updated. Included are aluminum mineralogical analysis results as well as a summation of the wash and leach procedures and results. Of the 177 underground storage tanks at Hanford, information was available for only five individual double-shell tanks, forty-one individual single-shell tanks (i.e., thirty-nine 100-series and two 200-series tanks), and twelve grouped tank wastes. Seven of the individual single-shell tank studies provided data for the percent of aluminum removal as a function of time for various caustic concentrations and leaching temperatures. It was determined that in most cases increased leaching temperature, caustic concentration, and leaching time lead to increased dissolution of leachable aluminum solids.

  19. COMPILATION OF LABORATORY SCALE ALUMINUM WASH AND LEACH REPORT RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    HARRINGTON SJ

    2011-01-06

    This report compiles and analyzes all known wash and caustic leach laboratory studies. As further data are produced, this report will be updated. Included are aluminum mineralogical analysis results as well as a summation of the wash and leach procedures and results. Of the 177 underground storage tanks at Hanford, information was available for only five individual double-shell tanks, forty-one individual single-shell tanks (i.e., thirty-nine 100-series and two 200-series tanks), and twelve grouped tank wastes. Seven of the individual single-shell tank studies provided data for the percent of aluminum removal as a function of time for various caustic concentrations and leaching temperatures. It was determined that in most cases increased leaching temperature, caustic concentration, and leaching time lead to increased dissolution of leachable aluminum solids.

  20. Expectation Levels in Dictionary Consultation and Compilation ...

    African Journals Online (AJOL)

    Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of the structure ...

  1. Compilation and evaluation of a Paso del Norte emission inventory

    Energy Technology Data Exchange (ETDEWEB)

    Funk, T.H.; Chinkin, L.R.; Roberts, P.T. [Sonoma Technology, Inc., 1360 Redwood Way, Suite C, 94954-1169 Petaluma, CA (United States); Saeger, M.; Mulligan, S. [Pacific Environmental Services, 5001 S. Miami Blvd., Suite 300, 27709 Research Triangle Park, NC (United States); Paramo Figueroa, V.H. [Instituto Nacional de Ecologia, Avenue Revolucion 1425, Nivel 10, Col. Tlacopac San Angel, Delegacion Alvaro Obregon, C.P., 01040, D.F. Mexico (Mexico); Yarbrough, J. [US Environmental Protection Agency - Region 6, 1445 Ross Avenue, Suite 1200, 75202-2733 Dallas, TX (United States)

    2001-08-10

    Emission inventories of ozone precursors are routinely used as input to comprehensive photochemical air quality models. Photochemical model performance and the development of effective control strategies rely on the accuracy and representativeness of an underlying emission inventory. This paper describes the tasks undertaken to compile and evaluate an ozone precursor emission inventory for the El Paso/Ciudad Juarez/Southern Dona Ana region. Point, area and mobile source emission data were obtained from local government agencies and were spatially and temporally allocated to a gridded domain using region-specific demographic and land-cover information. The inventory was then processed using the US Environmental Protection Agency (EPA) recommended Emissions Preprocessor System 2.0 (UAM-EPS 2.0) which generates emissions files compatible with the Urban Airshed Model (UAM). A top-down evaluation of the emission inventory was performed to examine how well the inventory represented ambient pollutant compositions. The top-down evaluation methodology employed in this study compares emission inventory non-methane hydrocarbon (NMHC)/nitrogen oxide (NOₓ) and carbon monoxide (CO)/NOₓ ratios to corresponding ambient ratios. Detailed NMHC species comparisons were made in order to investigate the relative composition of individual hydrocarbon species in the emission inventory and in the ambient data. The emission inventory compiled during this effort has since been used to model ozone in the Paso del Norte airshed (Emery et al., CAMx modeling of ozone and carbon monoxide in the Paso del Norte airshed. In: Proc of Ninety-Third Annual Meeting of Air and Waste Management Association, 18-22 June 2000, Air and Waste Management Association, Pittsburgh, PA, 2000)
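
    A top-down ratio check of this kind reduces to a few arithmetic comparisons, as in the sketch below; the emission totals and ambient ratios are invented placeholders, not values from the Paso del Norte inventory.

    ```python
    inventory = {"NMHC": 120.0, "NOx": 40.0, "CO": 480.0}   # tons/day (invented)
    ambient   = {"NMHC/NOx": 3.6, "CO/NOx": 13.0}           # ambient ratios (invented)

    for num, den, key in [("NMHC", "NOx", "NMHC/NOx"), ("CO", "NOx", "CO/NOx")]:
        inv_ratio = inventory[num] / inventory[den]
        bias = 100.0 * (inv_ratio - ambient[key]) / ambient[key]
        print(f"{key}: inventory {inv_ratio:.1f} vs ambient {ambient[key]:.1f} "
              f"({bias:+.0f}%)")
    ```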

  2. Expectation Levels in Dictionary Consultation and Compilation*

    African Journals Online (AJOL)

    Abstract: Dictionary consultation and compilation is a two-way engagement between two par- ties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their con- sultation skills, their knowledge of ...

  3. Compilation of streamflow statistics calculated from daily mean streamflow data collected during water years 1901–2015 for selected U.S. Geological Survey streamgages

    Science.gov (United States)

    Granato, Gregory E.; Ries, Kernell G.; Steeves, Peter A.

    2017-10-16

    Streamflow statistics are needed by decision makers for many planning, management, and design activities. The U.S. Geological Survey (USGS) StreamStats Web application provides convenient access to streamflow statistics for many streamgages by accessing the underlying StreamStatsDB database. In 2016, non-interpretive streamflow statistics were compiled for streamgages located throughout the Nation and stored in StreamStatsDB for use with StreamStats and other applications. Two previously published USGS computer programs that were designed to help calculate streamflow statistics were updated to better support StreamStats as part of this effort. These programs are named “GNWISQ” (Get National Water Information System Streamflow (Q) files), updated to version 1.1.1, and “QSTATS” (Streamflow (Q) Statistics), updated to version 1.1.2.Statistics for 20,438 streamgages that had 1 or more complete years of record during water years 1901 through 2015 were calculated from daily mean streamflow data; 19,415 of these streamgages were within the conterminous United States. About 89 percent of the 20,438 streamgages had 3 or more years of record, and about 65 percent had 10 or more years of record. Drainage areas of the 20,438 streamgages ranged from 0.01 to 1,144,500 square miles. The magnitude of annual average streamflow yields (streamflow per square mile) for these streamgages varied by almost six orders of magnitude, from 0.000029 to 34 cubic feet per second per square mile. About 64 percent of these streamgages did not have any zero-flow days during their available period of record. The 18,122 streamgages with 3 or more years of record were included in the StreamStatsDB compilation so they would be available via the StreamStats interface for user-selected streamgages. All the statistics are available in a USGS ScienceBase data release.
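
    The non-interpretive statistics involved are simple aggregates of the daily record, as in this sketch; the seven-day series and drainage area are invented, and real inputs would be NWIS daily-value files spanning complete water years.

    ```python
    flows = [12.0, 0.0, 3.5, 8.1, 0.0, 22.4, 5.9]   # daily mean flow, cfs (invented)
    drainage_area = 10.0                             # square miles (invented)

    mean_flow = sum(flows) / len(flows)              # average-flow analogue
    zero_days = flows.count(0.0)                     # zero-flow day count
    yield_per_mi2 = mean_flow / drainage_area        # streamflow yield

    print(f"mean {mean_flow:.2f} cfs, {zero_days} zero-flow days, "
          f"yield {yield_per_mi2:.3f} cfs per square mile")
    ```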

  4. Compilation of the FY 1998 Army General Fund Financial Statements at the Defense Finance and Accounting Service Indianapolis Center

    National Research Council Canada - National Science Library

    1999-01-01

    Our objective was to determine whether the DFAS Indianapolis Center consistently and accurately compiled financial data from field activities and other sources for the FY 1998 Army General Fund financial statements...

  5. Methods for the Compilation of a Core List of Journals in Toxicology.

    Science.gov (United States)

    Kuch, T. D. C.

    Previously reported methods for the compilation of core lists of journals in multidisciplinary areas are first examined, with toxicology used as an example of such an area. Three approaches to the compilation of a core list of journals in toxicology were undertaken and the results analyzed with the aid of models. Analysis of the results of the…

  6. 21 CFR 20.64 - Records or information compiled for law enforcement purposes.

    Science.gov (United States)

    2010-04-01

    21 CFR 20.64 (2010-04-01), Food and Drugs: Food and Drug Administration, Department of Health and Human Services, General Public Information, Exemptions. Section 20.64, Records or information compiled for law enforcement purposes. (a) Records or...

  7. Mode automata and their compilation into fault trees

    International Nuclear Information System (INIS)

    Rauzy, Antoine

    2002-01-01

    In this article, we advocate the use of mode automata as a high level representation language for reliability studies. Mode automata are states/transitions based representations with the additional notion of flow. They can be seen as a generalization of both finite capacity Petri nets and block diagrams. They can be assembled into hierarchies by means of composition operations. The contribution of this article is twofold. First, we introduce mode automata and we discuss their relationship with other formalisms. Second, we propose an algorithm to compile mode automata into Boolean equations (fault trees). Such a compilation is of interest for two reasons. First, assessment tools for Boolean models are much more efficient than those for states/transitions models. Second, the automated generation of fault trees from higher level representations makes easier their maintenance through the life cycle of systems under study
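
    The compilation idea can be miniaturized as follows: each transition is guarded by a basic event, and the Boolean failure expression for reaching a mode is the OR, over paths, of the AND of guards along the path. The two-mode pump example is invented and the sketch assumes an acyclic automaton.

    ```python
    transitions = {                      # mode -> list of (guard_event, next_mode)
        "nominal":  [("pump_fails", "degraded")],
        "degraded": [("backup_fails", "lost")],
    }

    def failure_expr(start, target, path=()):
        """Enumerate AND-terms over guards along paths from start to target."""
        if start == target:
            return [" & ".join(path)] if path else ["TRUE"]
        terms = []
        for guard, nxt in transitions.get(start, []):   # assumes no cycles
            terms += failure_expr(nxt, target, path + (guard,))
        return terms

    print(" | ".join(failure_expr("nominal", "lost")))
    # -> pump_fails & backup_fails
    ```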

  8. Fusing a Transformation Language with an Open Compiler

    NARCIS (Netherlands)

    Kalleberg, K.T.; Visser, E.

    2007-01-01

    Program transformation systems provide powerful analysis and transformation frameworks as well as concise languages for language processing, but instantiating them for every subject language is an arduous task, most often resulting in half-completed frontends. Compilers provide mature frontends with...

  9. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.

  10. HAL/S-FC and HAL/S-360 compiler system program description

    Science.gov (United States)

    1976-01-01

    The compiler is a large multi-phase design and can be broken into four phases: Phase 1 inputs the source language and does a syntactic and semantic analysis generating the source listing, a file of instructions in an internal format (HALMAT) and a collection of tables to be used in subsequent phases. Phase 1.5 massages the code produced by Phase 1, performing machine independent optimization. Phase 2 inputs the HALMAT produced by Phase 1 and outputs machine language object modules in a form suitable for the OS-360 or FCOS linkage editor. Phase 3 produces the SDF tables. The four phases described are written in XPL, a language specifically designed for compiler implementation. In addition to the compiler, there is a large library containing all the routines that can be explicitly called by the source language programmer plus a large collection of routines for implementing various facilities of the language.

  11. Low-temperature geothermal water in Utah: A compilation of data for thermal wells and springs through 1993

    Energy Technology Data Exchange (ETDEWEB)

    Blackett, R.E.

    1994-07-01

    The Geothermal Division of DOE initiated the Low-Temperature Geothermal Resources and Technology Transfer Program, following a special appropriation by Congress in 1991, to encourage wider use of lower-temperature geothermal resources through direct-use, geothermal heat-pump, and binary-cycle power conversion technologies. The Oregon Institute of Technology (OIT), the University of Utah Research Institute (UURI), and the Idaho Water Resources Research Institute organized the federally-funded program and enlisted the help of ten western states to carry out phase one. This first phase involves updating the inventory of thermal wells and springs with the help of the participating state agencies. The state resource teams inventory thermal wells and springs, and compile relevant information on each source. OIT and UURI cooperatively administer the program. OIT provides overall contract management while UURI provides technical direction to the state teams. Phase one of the program focuses on replacing part of GEOTHERM by building a new database of low- and moderate-temperature geothermal systems for use on personal computers. For Utah, this involved (1) identifying sources of geothermal data; (2) designing a database structure; (3) entering the new data; (4) checking for errors, inconsistencies, and duplicate records; (5) organizing the data into reporting formats; and (6) generating a map (1:750,000 scale) of Utah showing the locations and record identification numbers of thermal wells and springs.

  12. Internal combustion engines for alcohol motor fuels: a compilation of background technical information

    Energy Technology Data Exchange (ETDEWEB)

    Blaser, Richard

    1980-11-01

    This compilation, a draft training manual containing technical background information on internal combustion engines and alcohol motor fuel technologies, is presented in 3 parts. The first is a compilation of facts from the state of the art on internal combustion engine fuels and their characteristics and requisites and provides an overview of fuel sources, fuels technology and future projections for availability and alternatives. Part two compiles facts about alcohol chemistry, alcohol identification, production, and use, examines ethanol as spirit and as fuel, and provides an overview of modern evaluation of alcohols as motor fuels and of the characteristics of alcohol fuels. The final section compiles cross references on the handling and combustion of fuels for I.C. engines, presents basic evaluations of events leading to the use of alcohols as motor fuels, reviews current applications of alcohols as motor fuels, describes the formulation of alcohol fuels for engines and engine and fuel handling hardware modifications for using alcohol fuels, and introduces the multifuel engines concept. (LCL)

  13. Borrowing and Dictionary Compilation: The Case of the Indigenous ...

    African Journals Online (AJOL)

    Keywords: BORROWING, DICTIONARY COMPILATION, INDIGENOUS LANGUAGES, LEXICON, MORPHEME, VOCABULARY, DEVELOPING LANGUAGES, LOAN WORDS, TERMINOLOGY, ETYMOLOGY, LEXICOGRAPHY. Summary: Borrowing and dictionary compilation: The case of the indigenous South African ...

  14. Atomic data for controlled fusion research. Volume IV. Spectroscopic data for iron

    Energy Technology Data Exchange (ETDEWEB)

    Wiese, W.L. (ed.)

    1985-02-01

    Comprehensive spectroscopic data tables are presented for all ions of Fe. Tables of ionization potentials, wavelengths of spectral lines, atomic energy levels, and transition probabilities are given which were excerpted from general critical compilations. All utilized compilations are less than five years old and include data on electric dipole as well as magnetic dipole transitions.

  15. Atomic data for controlled fusion research. Volume IV. Spectroscopic data for iron

    International Nuclear Information System (INIS)

    Wiese, W.L.

    1985-02-01

    Comprehensive spectroscopic data tables are presented for all ions of Fe. Tables of ionization potentials, wavelengths of spectral lines, atomic energy levels, and transition probabilities are given which were excerpted from general critical compilations. All utilized compilations are less than five years old and include data on electric dipole as well as magnetic dipole transitions

  16. Approximate Compilation of Constraints into Multivalued Decision Diagrams

    DEFF Research Database (Denmark)

    Hadzic, Tarik; Hooker, John N.; O’Sullivan, Barry

    2008-01-01

    We present an incremental refinement algorithm for approximate compilation of constraint satisfaction models into multivalued decision diagrams (MDDs). The algorithm uses a vertex splitting operation that relies on the detection of equivalent paths in the MDD. Although the algorithm is quite general...
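
    The path-equivalence detection at the heart of the splitting step is easiest to see in its inverse direction, merging vertices whose outgoing edge maps coincide; the tiny MDD below is invented for illustration.

    ```python
    def reduce_mdd(nodes):
        """Merge vertices with identical edge maps; nodes: id -> {value: child_id}.

        Terminals are the strings "T"/"F". Vertex splitting, as used in the
        refinement algorithm, is the inverse of this merge.
        """
        changed = True
        while changed:
            changed = False
            canonical, remap = {}, {}
            for nid, edges in nodes.items():
                key = tuple(sorted(edges.items()))
                if key in canonical:
                    remap[nid] = canonical[key]   # nid duplicates an earlier vertex
                    changed = True
                else:
                    canonical[key] = nid
            for nid in remap:
                del nodes[nid]
            for edges in nodes.values():          # redirect edges into merged nodes
                for value, child in edges.items():
                    edges[value] = remap.get(child, child)
        return nodes

    mdd = {"root": {0: "n1", 1: "n2", 2: "n2"},
           "n1": {0: "T", 1: "F"},
           "n2": {0: "T", 1: "F"}}
    print(reduce_mdd(mdd))  # n2 folds into n1; all root edges end at n1
    ```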

  17. Remeasurement and compilation of excitation function of proton induced reactions on iron for activation techniques

    International Nuclear Information System (INIS)

    Takacs, S.; Vasvary, L.; Tarkanyi, F.

    1994-01-01

    Excitation functions of the natFe(p,xn)⁵⁶Co reaction have been remeasured in the energy region up to 18 MeV using the stacked-foil technique and standard high-resolution gamma-ray spectrometry at the Debrecen MGC-20E cyclotron. A compilation of the data available from measurements made between 1959 and 1993 has been produced. The corresponding excitation functions have been reviewed, and a critical comparison of all the available data was carried out to obtain the most accurate data set. The feasibility of the evaluated data set was checked by reproducing experimental calibration curves for thin-layer activation (TLA) by calculation. (orig.)

  18. Compiler-Enforced Cache Coherence Using a Functional Language

    Directory of Open Access Journals (Sweden)

    Rich Wolski

    1996-01-01

    The cost of hardware cache coherence, both in terms of execution delay and operational cost, is substantial for scalable systems. Fortunately, compiler-generated cache management can reduce program serialization due to cache contention; increase execution performance; and reduce the cost of parallel systems by eliminating the need for more expensive hardware support. In this article, we use the Sisal functional language system as a vehicle to implement and investigate automatic, compiler-based cache management. We describe our implementation of Sisal for the IBM Power/4. The Power/4, briefly available as a product, represents an early attempt to build a shared memory machine that relies strictly on the language system for cache coherence. We discuss the issues associated with deterministic execution and program correctness on a system without hardware coherence, and demonstrate how Sisal (as a functional language) is able to address those issues.

  19. Significance of data acquisition for a risk concept

    International Nuclear Information System (INIS)

    Hoemke, P.

    1975-01-01

    A risk assessment for the safety of nuclear power plants is now being developed in the FRG. This paper deals with the influence of input data on the estimation of risks and presents the compilation of reliability data in the prototype compilation IRS-RWE. It is argued that data compilation in power plants gives reasonable results and is an essential requirement for the introduction of risk estimation in nuclear safety. (orig.)

  20. QMODULE: CAMAC modules recognized by the QAL compiler

    International Nuclear Information System (INIS)

    Kellogg, M.; Minor, M.M.; Shlaer, S.; Spencer, N.; Thomas, R.F. Jr.; van der Beken, H.

    1977-10-01

    The compiler for the Q Analyzer Language, QAL, recognizes a certain set of CAMAC modules as having known characteristics. The conventions and procedures used to describe these modules are discussed as well as the tools available to the user for extending this set as required

  1. Evaluation of the FIR Example using Xilinx Vivado High-Level Synthesis Compiler

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Zheming [Argonne National Lab. (ANL), Argonne, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Yoshii, Kazutomo [Argonne National Lab. (ANL), Argonne, IL (United States); Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-07-28

    Compared to central processing units (CPUs) and graphics processing units (GPUs), field programmable gate arrays (FPGAs) have major advantages in reconfigurability and performance achieved per watt. The FPGA development flow has been augmented with a high-level synthesis (HLS) flow that can convert programs written in a high-level programming language to a hardware description language (HDL). Using high-level programming languages such as C, C++, and OpenCL for FPGA-based development allows software developers, who have little FPGA knowledge, to take advantage of FPGA-based application acceleration. This improves developer productivity and makes FPGA-based acceleration accessible to hardware and software developers. The Xilinx Vivado HLS compiler is a high-level synthesis tool that enables C, C++ and SystemC specifications to be directly targeted into Xilinx FPGAs without the need to create RTL manually. The white paper [1] published recently by Xilinx uses a finite impulse response (FIR) example to demonstrate the variable-precision features in the Vivado HLS compiler and the resource and power benefits of converting a design from floating point to fixed point. To get a better understanding of the variable-precision features in terms of resource usage and performance, this report presents the experimental results of evaluating the FIR example using Vivado HLS 2017.1 and a Kintex UltraScale FPGA. In addition, we evaluated the half-precision floating-point data type against the double-precision and single-precision data types and present the detailed results.
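
    The float-to-fixed conversion at issue can be mimicked in a few lines; the coefficients and the 12-bit fractional format below are invented, chosen only to show the quantization trade-off, and bear no relation to the white paper's actual design.

    ```python
    FRAC_BITS = 12                                    # invented Q-format scaling

    coeffs = [0.1, 0.25, 0.3, 0.25, 0.1]
    coeffs_fx = [round(c * (1 << FRAC_BITS)) for c in coeffs]

    def fir_float(x):
        return [sum(c * x[n - k] for k, c in enumerate(coeffs) if n - k >= 0)
                for n in range(len(x))]

    def fir_fixed(x):
        xi = [round(v * (1 << FRAC_BITS)) for v in x]   # quantize the input
        acc = [sum(c * xi[n - k] for k, c in enumerate(coeffs_fx) if n - k >= 0)
               for n in range(len(x))]                  # integer MACs only
        return [a / float(1 << (2 * FRAC_BITS)) for a in acc]

    impulse = [0.0, 1.0, 0.0, 0.0, 0.0]
    print(fir_float(impulse))   # impulse response equals the coefficients
    print(fir_fixed(impulse))   # the same, up to quantization error
    ```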

  2. Japan Nuclear Reaction Data Centre (JCPRG) Progress Report

    International Nuclear Information System (INIS)

    2011-01-01

    In this report, we give a brief review of the activities carried out by the ''Japan Nuclear Reaction Data Centre (JCPRG)'' since the last NRDC meeting in 2009. The main subjects of our activities are: (1) reaction data compilation, (2) evaluation of the astrophysical nuclear reaction data for light nuclei, and (3) cooperation in nuclear data activities in Asia. Our activities in detail are as follows:
    a) New reaction data compilation (NRDF and EXFOR)
    b) Conversion of old NRDF to EXFOR
    c) Bibliography compilation (CINDA)
    d) Evaluation of astrophysical nuclear reaction data based on theoretical calculations for light nuclei
    e) Collaboration among nuclear data physicists in Asia for the EXFOR compilation to form a stable base
    f) Database maintenance and services (NRDF, EXFOR/ENDF and CINDA)
    g) Development of software systems (GSYS)
    h) Customer services

  3. Experiences in Data-Parallel Programming

    Directory of Open Access Journals (Sweden)

    Terry W. Clark

    1997-01-01

    To efficiently parallelize a scientific application with a data-parallel compiler requires certain structural properties in the source program, and conversely, the absence of others. A recent parallelization effort of ours reinforced this observation and motivated this correspondence. Specifically, we have transformed a Fortran 77 version of GROMOS, a popular dusty-deck program for molecular dynamics, into Fortran D, a data-parallel dialect of Fortran. During this transformation we have encountered a number of difficulties that probably are neither limited to this particular application nor do they seem likely to be addressed by improved compiler technology in the near future. Our experience with GROMOS suggests a number of points to keep in mind when developing software that may at some time in its life cycle be parallelized with a data-parallel compiler. This note presents some guidelines for engineering data-parallel applications that are compatible with Fortran D or High Performance Fortran compilers.

  4. Documentation of methods and inventory of irrigation data collected for the 2000 and 2005 U.S. Geological Survey Estimated use of water in the United States, comparison of USGS-compiled irrigation data to other sources, and recommendations for future compilations

    Science.gov (United States)

    Dickens, Jade M.; Forbes, Brandon T.; Cobean, Dylan S.; Tadayon, Saeid

    2011-01-01

    Every five years since 1950, the U.S. Geological Survey (USGS) National Water Use Information Program (NWUIP) has compiled water-use information in the United States and published a circular report titled "Estimated use of water in the United States," which includes estimates of water withdrawals by State, sources of water withdrawals (groundwater or surface water), and water-use category (irrigation, public supply, industrial, thermoelectric, and so forth). This report discusses important considerations in estimating irrigated acreage and irrigation withdrawals, including estimates of conveyance loss and irrigation-system efficiencies, and the treatment of pasture, horticulture, golf courses, and double cropping.

  5. Radioactive waste management profiles. Compilation from the Waste Management Database. No. 3

    International Nuclear Information System (INIS)

    2000-07-01

    In 1989, the International Atomic Energy Agency began development of the Waste Management Data Base (WMDB) primarily to establish a mechanism for the collection, archiving and dissemination of information about radioactive waste management in Member States. The current report is a summary and compilation of waste management data collected from Member States from February 1998 to December 1999 in response to the Agency's 1997/98 WMDB Questionnaire. Member States were asked to report waste accumulations up to the end of 1996 and to predict waste accumulations up to the end of 2014.

  6. HAL/S-360 compiler system specification

    Science.gov (United States)

    Johnson, A. E.; Newbold, P. N.; Schulenberg, C. W.; Avakian, A. E.; Varga, S.; Helmers, P. H.; Helmers, C. T., Jr.; Hotz, R. L.

    1974-01-01

    A three-phase language compiler is described which produces IBM 360/370-compatible object modules and a set of simulation tables to aid in run-time verification. A link-edit step augments the standard OS linkage editor. A comprehensive run-time system and library provide the HAL/S operating environment, error handling, a pseudo-real-time executive, and an extensive set of mathematical, conversion, I/O, and diagnostic routines. The specifications of the information flow and content for this system are also considered.

  7. T.J. Kriel (original compiler), D.J. Prinsloo and B.P. Sathekge (compilers revised edition). Popular Northern Sotho Dictionary

    OpenAIRE

    Kwena J. Mashamaite

    2011-01-01

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the case with this dictionary.

  8. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature

    International Nuclear Information System (INIS)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included

  9. Compiled records of carbon isotopes in atmospheric CO2 for historical simulations in CMIP6

    Directory of Open Access Journals (Sweden)

    H. Graven

    2017-12-01

    The isotopic composition of carbon (Δ14C and δ13C) in atmospheric CO2 and in oceanic and terrestrial carbon reservoirs is influenced by anthropogenic emissions and by natural carbon exchanges, which can respond to and drive changes in climate. Simulations of 14C and 13C in the ocean and terrestrial components of Earth system models (ESMs) present opportunities for model evaluation and for investigation of carbon cycling, including anthropogenic CO2 emissions and uptake. The use of carbon isotopes in novel evaluation of the ESMs' component ocean and terrestrial biosphere models and in new analyses of historical changes may improve predictions of future changes in the carbon cycle and climate system. We compile existing data to produce records of Δ14C and δ13C in atmospheric CO2 for the historical period 1850–2015. The primary motivation for this compilation is to provide the atmospheric boundary condition for historical simulations in the Coupled Model Intercomparison Project 6 (CMIP6) for models simulating carbon isotopes in the ocean or terrestrial biosphere. The data may also be useful for other carbon cycle modelling activities.

  10. Compiled records of carbon isotopes in atmospheric CO2 for historical simulations in CMIP6

    Science.gov (United States)

    Graven, Heather; Allison, Colin E.; Etheridge, David M.; Hammer, Samuel; Keeling, Ralph F.; Levin, Ingeborg; Meijer, Harro A. J.; Rubino, Mauro; Tans, Pieter P.; Trudinger, Cathy M.; Vaughn, Bruce H.; White, James W. C.

    2017-12-01

    The isotopic composition of carbon (Δ14C and δ13C) in atmospheric CO2 and in oceanic and terrestrial carbon reservoirs is influenced by anthropogenic emissions and by natural carbon exchanges, which can respond to and drive changes in climate. Simulations of 14C and 13C in the ocean and terrestrial components of Earth system models (ESMs) present opportunities for model evaluation and for investigation of carbon cycling, including anthropogenic CO2 emissions and uptake. The use of carbon isotopes in novel evaluation of the ESMs' component ocean and terrestrial biosphere models and in new analyses of historical changes may improve predictions of future changes in the carbon cycle and climate system. We compile existing data to produce records of Δ14C and δ13C in atmospheric CO2 for the historical period 1850-2015. The primary motivation for this compilation is to provide the atmospheric boundary condition for historical simulations in the Coupled Model Intercomparison Project 6 (CMIP6) for models simulating carbon isotopes in the ocean or terrestrial biosphere. The data may also be useful for other carbon cycle modelling activities.

  11. Parallelizing Compiler Framework and API for Power Reduction and Software Productivity of Real-Time Heterogeneous Multicores

    Science.gov (United States)

    Hayashi, Akihiro; Wada, Yasutaka; Watanabe, Takeshi; Sekiguchi, Takeshi; Mase, Masayoshi; Shirako, Jun; Kimura, Keiji; Kasahara, Hironori

    Heterogeneous multicores have been attracting much attention as a way to attain high performance while keeping power consumption low in a wide range of areas. However, heterogeneous multicores are very difficult to program, and the resulting long application development periods lower product competitiveness. In order to overcome this situation, this paper proposes a compilation framework which bridges the gap between programmers and heterogeneous multicores. In particular, this paper describes a compilation framework based on the OSCAR compiler. It realizes coarse-grain task parallel processing, data transfer using a DMA controller, and power reduction control from user programs with DVFS and clock gating on various heterogeneous multicores from different vendors. This paper also evaluates the processing performance and power reduction attained by the proposed framework on a newly developed 15-core heterogeneous multicore chip named RP-X, integrating 8 general-purpose processor cores and 3 types of accelerator cores, which was developed by Renesas Electronics, Hitachi, Tokyo Institute of Technology and Waseda University. The framework attains speedups of up to 32x for an optical flow program with eight general-purpose processor cores and four DRP (Dynamically Reconfigurable Processor) accelerator cores against sequential execution by a single processor core, and an 80% power reduction for real-time AAC encoding.

  12. Expert Programmer versus Parallelizing Compiler: A Comparative Study of Two Approaches for Distributed Shared Memory

    Directory of Open Access Journals (Sweden)

    M. F. P. O'Boyle

    1996-01-01

    This article critically examines current parallel programming practice and optimizing compiler development. The general strategies employed by compiler and programmer to optimize a Fortran program are described, and then illustrated for a specific case by applying them to a well-known scientific program, TRED2, using the KSR-1 as the target architecture. Extensive measurement is applied to the resulting versions of the program, which are compared with a version produced by a commercial optimizing compiler, KAP. The compiler strategy significantly outperforms KAP and does not fall far short of the performance achieved by the programmer. Following the experimental section each approach is critiqued by the other. Perceived flaws, advantages, and common ground are outlined, with an eye to improving both schemes.

  13. An international neutron data system

    International Nuclear Information System (INIS)

    1969-01-01

    The report gives the results of group deliberations based on fourteen papers contributed by data centres and by experimenters and evaluators as data-centre users, at the Panel on Neutron Data Compilation organized by the IAEA and held at Brookhaven on 10-14 February 1969. The report records the Panel's views on the current and future needs for nuclear data compilation and on the role of the world's principal neutron data centres.

  14. Research at GANIL. A compilation 1996-1997

    Energy Technology Data Exchange (ETDEWEB)

    Balanzat, E.; Bex, M.; Galin, J.; Geswend, S. [eds.]

    1998-12-01

    The present compilation gives an overview of experimental results obtained with the GANIL facility during the period 1996-1997. It includes nuclear physics activities as well as interdisciplinary research. The scientific domain presented here extends well beyond traditional nuclear physics and includes atomic physics, condensed matter physics, nuclear astrophysics, radiation chemistry and radiobiology, as well as applied physics. In the nuclear physics field, many new results have been obtained concerning nuclear structure as well as the dynamics of nuclear collisions and the nuclear disassembly of complex systems. The results presented deal in particular with the problems of energy equilibration, timescales and the origin of multifragmentation. Nuclear structure studies using both stable and radioactive beams deal with halo systems, shell closures far from stability and the existence of nuclear molecules, as well as measurements of fundamental data such as half-lives, nuclear masses, nuclear radii, and quadrupole and magnetic moments. In addition to the traditional fields of atomic and solid state physics, new themes such as radiation chemistry and radiobiology are progressively being tackled. (K.A.)

  15. T.J. Kriel (original compiler), D.J. Prinsloo and B.P. Sathekge (compilers revised edition). Popular Northern Sotho Dictionary

    Directory of Open Access Journals (Sweden)

    Kwena J. Mashamaite

    2011-10-01

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the case with this dictionary.

  16. Design Choices in a Compiler Course or How to Make Undergraduates Love Formal Notation

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    2008-01-01

    The undergraduate compiler course offers a unique opportunity to combine many aspects of the Computer Science curriculum. We discuss the many design choices that are available for the instructor and present the current compiler course at the University of Aarhus, the design of which displays at l...

  17. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinina, Elena Arkadievna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Samsa, Michael [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-11-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also an extensive review of the available literature for similar and past efforts as well. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and the descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to "ensure it has heard from as many points of view as possible." The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs

  18. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations

    International Nuclear Information System (INIS)

    Kalinina, Elena Arkadievna; Samsa, Michael

    2015-01-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also an extensive review of the available literature for similar and past efforts as well. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and the descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to 'ensure it has heard from as many points of view as possible.' The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs conducted by the

  19. Construction experiences from underground works at Forsmark. Compilation Report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders [Vattenfall Power Consultant AB, Stockholm (Sweden); Christiansson, Rolf [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)

    2007-02-15

    The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low- and intermediate-level reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily regarding rock support solutions. The authors of this report separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material, as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  20. Construction experiences from underground works at Forsmark. Compilation Report

    International Nuclear Information System (INIS)

    Carlsson, Anders; Christiansson, Rolf

    2007-02-01

    The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low- and intermediate-level reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily regarding rock support solutions. The authors of this report separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material, as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  1. Specification and Compilation of Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Geuns, S.J.

    2015-01-01

    This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. Examples of such applications are software-defined radio applications. These applications typically

  2. Reactor fuel performance data file, 1985 edition

    International Nuclear Information System (INIS)

    Harayama, Yasuo; Fujita, Misao; Watanabe, Kohji.

    1986-07-01

    In safety evaluation and integrity studies of reactor fuel, data on fuel performance are the most basic material. The Fuel Reliability Laboratory No. 1 has obtained fuel performance data by joining international programs to study the safety and integrity of fuel. Those data have so far been used only for studies in the above two fields. However, if the data are rearranged and compiled in an easily usable form, they can be utilized in other fields of study. A 'data file' on fuel performance is therefore being compiled by adding data from the open literature to those obtained in international programs. The present report is prepared on the basis of the data file as compiled by March 1986. (author)

  3. A methodology to compile food metrics related to diet sustainability into a single food database: Application to the French case.

    Science.gov (United States)

    Gazan, Rozenn; Barré, Tangui; Perignon, Marlène; Maillot, Matthieu; Darmon, Nicole; Vieux, Florent

    2018-01-01

    The holistic approach required to assess diet sustainability is hindered by the lack of comprehensive databases compiling relevant food metrics. Those metrics are generally scattered across different data sources with various levels of aggregation, hampering their matching. The objective was to develop a general methodology to compile food metrics describing the dimensions of diet sustainability into a single database and to apply it to the French context. Each step of the methodology is detailed: identification and selection of indicators and food metrics, definition of the food list, food matching and assignment of values. For the French case, nutrient and contaminant content, bioavailability factors, distribution of dietary intakes, portion sizes, food prices, and greenhouse gas emission, acidification and marine eutrophication estimates were allocated to 212 commonly consumed generic foods. This generic database compiling 279 metrics will allow the simultaneous evaluation of the four dimensions of diet sustainability, namely the health, economic, social and environmental dimensions.

  4. Not mere lexicographic cosmetics: the compilation and structural ...

    African Journals Online (AJOL)

    This article offers a brief overview of the compilation of the Ndebele music terms dictionary, Isichazamazwi SezoMculo (henceforth the ISM), paying particular attention to its structural features. It emphasises that the reference needs of the users as well as their reference skills should be given a determining role in all ...

  5. Compilation of radiation damage test data. I

    International Nuclear Information System (INIS)

    Schoenbacher, H.; Stolarz-Izycka, A.

    1979-01-01

    This report summarizes radiation damage test data on commercially available organic cable insulation and jacket materials: ethylene-propylene rubber, Hypalon, neoprene rubber, polyethylene, polyurethane, polyvinylchloride, silicone rubber, etc. The materials have been irradiated in a nuclear reactor to integrated absorbed doses from 5 x 10^5 to 5 x 10^6 Gy. Mechanical properties, e.g. tensile strength, elongation at break, and hardness, have been tested on irradiated and non-irradiated samples. The results are presented in the form of tables and graphs, to show the effect of the absorbed dose on the measured properties. (Auth.)

  6. Compiling for Application Specific Computational Acceleration in Reconfigurable Architectures Final Report CRADA No. TSB-2033-01

    Energy Technology Data Exchange (ETDEWEB)

    De Supinski, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Caliga, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-28

    The primary objective of this project was to develop memory optimization technology to efficiently deliver data to, and distribute data within, the SRC-6's Field Programmable Gate Array ("FPGA")-based Multi-Adaptive Processors (MAPs). The hardware/software approach was to explore efficient MAP configurations and generate the compiler technology to exploit those configurations. This memory accessing technology represents an important step towards making reconfigurable symmetric multi-processor (SMP) architectures a cost-effective solution for large-scale scientific computing.

  7. Fatigue data compilation and evaluation of fatigue on design

    International Nuclear Information System (INIS)

    Nyilas, A.

    1985-05-01

    The aim of this report is to review the available fatigue data for the various materials necessary for the design of large superconducting magnets for fusion. One of its primary objectives is to present a broad outline of the low-temperature fatigue data of the relevant materials, within the scope of available data. Besides the classical fatigue data of materials, fatigue crack propagation measurements are reviewed extensively. The existing recommendations for the design of cryogenic structures are described. A brief introduction to fracture mechanics is given, as well as a historical background to the development of our present-day understanding of fatigue. (orig.) [de]

  8. Compilation and analysis of multiple groundwater-quality datasets for Idaho

    Science.gov (United States)

    Hundt, Stephen A.; Hopkins, Candice B.

    2018-05-09

    Groundwater is an important source of drinking and irrigation water throughout Idaho, and groundwater quality is monitored by various Federal, State, and local agencies. The historical, multi-agency records of groundwater quality constitute a valuable dataset that has yet to be compiled or analyzed on a statewide level. The purpose of this study is to combine groundwater-quality data from multiple sources into a single database, to summarize this dataset, and to perform bulk analyses to reveal spatial and temporal patterns of water quality throughout Idaho. Data were retrieved from the Water Quality Portal (https://www.waterqualitydata.us/), the Idaho Department of Environmental Quality, and the Idaho Department of Water Resources. Analyses included counting the number of times a sample location had concentrations above Maximum Contaminant Levels (MCLs), performing trend tests, and calculating correlations between water-quality analytes. The water-quality database and the analysis results are available through USGS ScienceBase (https://doi.org/10.5066/F72V2FBG).
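
    The first of those analyses is simple to state in code. The sketch below is ours, not the study's tooling; the Sample type, the per-analyte MCL table, and all names are hypothetical.

      #include <map>
      #include <string>
      #include <vector>

      struct Sample { std::string site; std::string analyte; double conc; };

      // Count, per sampling site, how many measurements exceed the analyte's
      // Maximum Contaminant Level. Real MCL values would come from EPA standards.
      std::map<std::string, int> countExceedances(
          const std::vector<Sample>& samples,
          const std::map<std::string, double>& mcl) {
          std::map<std::string, int> counts;
          for (const Sample& s : samples) {
              auto it = mcl.find(s.analyte);
              if (it != mcl.end() && s.conc > it->second)
                  ++counts[s.site];
          }
          return counts;
      }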

  9. OpenMP-accelerated SWAT simulation using Intel C and FORTRAN compilers: Development and benchmark

    Science.gov (United States)

    Ki, Seo Jin; Sugimura, Tak; Kim, Albert S.

    2015-02-01

    We developed a practical method to accelerate execution of the Soil and Water Assessment Tool (SWAT) using open (free) computational resources. The SWAT source code (rev 622) was recompiled using a non-commercial Intel FORTRAN compiler on the Ubuntu 12.04 LTS Linux platform, and newly named iOMP-SWAT in this study. The GNU utilities make, gprof, and diff were used to develop the iOMP-SWAT package, profile memory usage, and check that parallel and serial simulations were identical. Among 302 SWAT subroutines, the slowest routines were identified using GNU gprof and later modified using the Open Multi-Processing (OpenMP) library on an 8-core shared memory system. In addition, a C wrapping function was used to rapidly set large arrays to zero by cross-compiling with the original SWAT FORTRAN package. A universal speedup ratio of 2.3 was achieved using input data sets with a large number of hydrological response units. As we specifically focus on acceleration of a single SWAT run, the use of iOMP-SWAT for parameter calibrations will significantly improve the performance of SWAT optimization.
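
    The abstract names two concrete techniques: OpenMP work-sharing in the hot loops and a C helper for zeroing large arrays. A minimal sketch of both, with hypothetical names and a stand-in loop body (the modified SWAT routines themselves are not reproduced in the abstract):

      #include <cstring>
      #include <omp.h>

      // C-linkage helper that zeroes a large double array in one call, the kind
      // of wrapper that can be cross-compiled and linked with FORTRAN code.
      extern "C" void zero_array(double* a, long n) {
          std::memset(a, 0, static_cast<std::size_t>(n) * sizeof(double));
      }

      // Work-share an independent per-unit loop across shared-memory cores.
      void per_hru_update(double* out, const double* in, long n_hru) {
      #pragma omp parallel for schedule(static)
          for (long i = 0; i < n_hru; ++i)
              out[i] = 0.8 * in[i];   // stand-in for per-HRU hydrology work
      }

    A speedup of 2.3 on eight cores is far from linear, which is typical when only the slowest routines are parallelized and the serial remainder of the program dominates (Amdahl's law).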

  10. Activities of the cross-section compilation and evaluation centers at the Brookhaven National Laboratory

    International Nuclear Information System (INIS)

    Chernick, J.

    1967-01-01

    The growth of the compilation and evaluation efforts at the Brookhaven National Laboratory is reviewed. The current work of the Sigma Center is discussed, including the status of the publication of supplements to BNL-325 and the current state of the SCISRS-I tape. Future needs for BNL-325-type publications and SCISRS-II cross-section tapes are outlined. The history of the Cross-Section Evaluation Center at the Brookhaven National Laboratory is similarly reviewed. The status of current work is discussed, including the growth of the ENDF/A tape. The status of US efforts to produce a cross-section tape (ENDF/B) at an early date to satisfy the needs of US reactor designers is discussed. The continued importance of integral experiments and their accurate analysis to provide checks of the cross-section tapes is pointed out. The role of the Brookhaven National Laboratory in collaboration on an international basis is reviewed, including its current relationship to the ENEA Neutron Data Compilation Centre, the International Atomic Energy Agency and other nuclear centres. (author)

  11. A Forth interpreter and compiler's study for computer aided design

    International Nuclear Information System (INIS)

    Djebbar, F. Zohra Widad

    1986-01-01

    The wide field of application of FORTH led us to develop an interpreter. It has been implemented on an MC 68000 microprocessor-based computer, with ASTERIX, a UNIX-like operating system (a real-time system written by the C.E.A.). This work has been done in two different versions: - The first one, fully written in C language, ensures good portability on a wide variety of microprocessors. But the performance measurements showed excessive execution times, and led to a new optimized version. - This new version is characterized by the compilation of the most frequently used words of the FORTH basic vocabulary. This allows us to obtain an interpreter with good performance and an execution speed close to that resulting from the C compiler. (author) [fr]
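
    The two versions differ in how dictionary words are executed. A toy sketch of the idea (ours, not the thesis code): an interpreter looks each word up and dispatches it at run time, and "compiling" a frequently used word amounts to binding it to native code once instead of re-interpreting it on every use.

      #include <functional>
      #include <map>
      #include <stack>
      #include <string>
      #include <vector>

      // A miniature FORTH-style inner loop: a data stack plus a dictionary
      // mapping words to native actions.
      int main() {
          std::stack<int> s;
          std::map<std::string, std::function<void()>> dict = {
              {"dup", [&] { s.push(s.top()); }},
              {"+",   [&] { int b = s.top(); s.pop(); s.top() += b; }},
          };
          s.push(21);
          std::vector<std::string> program = {"dup", "+"};  // 21 dup +  ->  42
          for (const std::string& w : program) dict.at(w)();
          return s.top() == 42 ? 0 : 1;
      }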

  12. Using Multi-Disciplinary Data to Compile a Hydrocarbon Budget for GC600, a Natural Seep in the Gulf of Mexico

    Science.gov (United States)

    MacDonald, I. R.; Johansen, C.; Marty, E.; Natter, M.; Silva, M.; Hill, J. C.; Viso, R. F.; Lobodin, V.; Diercks, A. R.; Woolsey, M.; Macelloni, L.; Shedd, W. W.; Joye, S. B.; Abrams, M.

    2016-12-01

    Fluid exchange between the deep subsurface and the overlying ocean and atmosphere occurs at hydrocarbon seeps along continental margins. Seeps are key features that alter the seafloor morphology and geochemically affect the sediments that support chemosynthetic communities. However, the dynamics and discharge rates of hydrocarbons at cold seeps remain largely unconstrained. Here we merge complementary geochemical (oil fingerprinting), geophysical (seismic, subbottom, backscatter, multibeam) and video/imaging (Video Time Lapse Camera, DSV ALVIN video) data sets to constrain pathways and magnitudes of hydrocarbon fluxes from the source rock to the seafloor at a well-studied, prolific seep site in the Northern Gulf of Mexico (GC600). Oil fingerprinting showed compositional similarities for samples from the following collections: the reservoir, an active vent, and the sea surface. This was consistent with reservoir structures and pathways identified in seismic data. Video data, which showed the spatial distribution of seep indicators such as bacteria mats or hydrate outcrops at the sediment interface, were combined with known hydrocarbon fluxes from the literature and used to quantify the total hydrocarbon fluxes in the seep domain. Using a systems approach, we combined data sets and published values at various scales and resolutions to compile a preliminary hydrocarbon budget for the GC600 seep site. The total estimated in-flow of hydrocarbons was 2.07 x 10^9 mol/yr. The combined total of out-flow and sequestration amounted to 7.56 x 10^6 mol/yr, leaving a potential excess (in-flow - out-flow) of 2.06 x 10^9 mol/yr. Thus the quantification of potential out-flow from the seep domain based on observable processes does not equilibrate with the theoretical inputs from the reservoir. Processes that might balance this budget include accumulation of gas hydrate and sediment free gas, as well as greater efficiency of biological sinks.

  13. Compilation of the nuclear codes available in CTA

    International Nuclear Information System (INIS)

    D'Oliveira, A.B.; Moura Neto, C. de; Amorim, E.S. do; Ferreira, W.J.

    1979-07-01

    The present work is a compilation of some nuclear codes available in the Divisao de Estudos Avancados of the Instituto de Atividades Espaciais (EAV/IAE/CTA). The codes are organized according to the classification given by the Argonne National Laboratory. For each code, the author, institution of origin, abstract, programming language and existing bibliography are given. (Author) [pt]

  14. Compilation of radiation damage test data. II

    International Nuclear Information System (INIS)

    Schoenbacher, H.; Stolarz-Izycka, A.

    1979-01-01

    This report summarizes radiation damage test data on thermosetting and thermoplastic resins, with the main emphasis on epoxy resins used for magnet coil insulations. Also, other materials such as polyesters, phenolics, polyurethanes, silicones, etc., are represented. The materials have been irradiated in a nuclear reactor to integrated absorbed doses between 5 x 10^6 Gy and 1 x 10^8 Gy. The mechanical properties, e.g. the flexural strength, deflection at break, and tangent modulus of elasticity, have been measured on irradiated and non-irradiated samples. The results are given as variation of these parameters versus absorbed dose and are presented in the form of tables and graphs. The tested materials are catalogued in alphabetical order. (Auth.)

  15. Development of a computerized data base for low-level radioactive waste leaching data: Topical report

    International Nuclear Information System (INIS)

    Dougherty, D.R.; Colombo, P.

    1986-09-01

    This report documents the development of a computerized data base (db) of leaching data for solidified low-level radioactive waste (LLW) forms. Brookhaven National Lab performed this work under contract with the US Department of Energy's Low-Level Waste Management Program as part of an effort to develop an accelerated leach test(s) that can be used to predict leachabilities of LLW forms over long time periods, i.e., hundreds of years. The accelerated leach test(s) is (are) to be developed based on knowledge of leaching mechanisms and factors that affect leaching. Although developed specifically for the Accelerated Leach Test(s) Program, this db may be useful to others concerned with the management of low-level waste. The db is being developed to provide efficient data compilation and analysis capabilities. The data compiled in the db, which include data from the Accelerated Leach Test(s) Program and selected data from the literature, have been selected to elucidate leaching mechanisms and factors that affect leaching and are not meant to be a comprehensive compilation of leaching data. This report presents the data compilation aspect of the db. It does not present the programmatic results obtained from analysis of the data regarding leaching mechanisms and factors that affect leaching, which will be presented in reports from the Accelerated Leach Test(s) Program. 6 refs

  16. Regulatory and technical reports (Abstract Index Journal). Compilation for first quarter 1986, January-March. Volume 11, No. 1

    International Nuclear Information System (INIS)

    1986-04-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission staff and its contractors, as well as conference proceedings. Entries are indexed by contractor report number, personal author, subject, NRC originating organization, NRC contract sponsor, contractor, and licensed facility

  17. A Journey from Interpreters to Compilers and Virtual Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We review a simple sequence of steps to stage a programming-language interpreter into a compiler and virtual machine. We illustrate the applicability of this derivation with a number of existing virtual machines, mostly for functional languages. We then outline its relevance for today's language...

  18. Updated site compilation of the Latin American Pollen Database

    NARCIS (Netherlands)

    Flantua, S.G.A.; Hooghiemstra, H.; Grimm, E.C.; Behling, H.; Bush, M.B; González-Arrango, C.; Gosling, W.D.; Ledru, M.-P.; Lozano-Garciá, S.; Maldonado, A.; Prieto, A.R.; Rull, V.; van Boxel, J.H.

    2015-01-01

    The updated inventory of the Latin American Pollen Database (LAPD) offers a wide range of new insights. This paper presents a systematic compilation of palynological research in Latin America. A comprehensive inventory of publications in peer-reviewed and grey literature shows a major expansion of

  19. IDEAS international contamination database: a compilation of published internal contamination cases. A tool for the internal dosimetry community

    International Nuclear Information System (INIS)

    Hurtgen, C.

    2007-01-01

    The aim of the IDEAS project was to develop General Guidelines for the Assessment of Internal Dose from Monitoring Data. The project was divided into 5 Work Packages for the major tasks. Work Package 1, entitled 'Collection of incorporation cases', was devoted to the collection of data by means of bibliographic research (a survey of the open literature), contacting and collecting data from specific organisations, and using information from existing databases on incorporation cases. To ensure that the guidelines would be applicable to a wide range of practical situations, a database of cases of internal contamination, including monitoring data suitable for dose assessment, was compiled. The IDEAS Bibliography database and the IDEAS Internal Contamination database were prepared, and some reference cases were selected for use in Work Package 3. The other Work Packages of the IDEAS project (WP-2 Preparation of evaluation software, WP-3 Evaluation of incorporation cases, WP-4 Development of the general guidelines and WP-5 Practical testing of the general guidelines) have been described in detail elsewhere and can be found on the IDEAS website. References from the open literature containing information on cases of internal contamination, from which intakes and committed doses could be assessed, were compiled into a database. The IDEAS Bibliography Database includes references to papers which might (but were not certain to) contain such information, or which included references to papers which contained such information. This database contains the usual bibliographical information: authors' name(s), year of publication, title of publication and the journal or report number. Up to now, a comprehensive Bibliography Database containing 563 references has been compiled. Not surprisingly, more than half of the references are from the Health Physics and Radiation Protection Dosimetry journals. The next step was for the partners of the IDEAS project to obtain the references

  20. Compiling an OPEC Word List: A Corpus-Informed Lexical Analysis

    Directory of Open Access Journals (Sweden)

    Ebtisam Saleh Aluthman

    2017-01-01

    The present study is conducted within the borders of lexicographic research, where corpora have increasingly become all-pervasive. The overall goal of this study is to compile an open-source OPEC[1] Word List (OWL) that is available for lexicographic research and vocabulary learning related to English language learning for the purposes of oil marketing and the oil industries. To achieve this goal, an OPEC Monthly Reports Corpus (OMRC) comprising 1,004,542 words was compiled. The OMRC consists of 40 OPEC monthly reports released between 2003 and 2015. Consideration was given to both range and frequency criteria when compiling the OWL, which consists of 255 word types. Along with this basic goal, this study investigates the coverage of the most well-recognised word lists, the General Service List of English Words (GSL) (West, 1953) and the Academic Word List (AWL) (Coxhead, 2000), in the OMRC corpus. The 255 word types included in the OWL overlap with neither the AWL nor the GSL. Results suggest the necessity of making this discipline-specific word list for ESL students of the oil marketing industries. The availability of the OWL has significant pedagogical contributions to curriculum design, learning activities and the overall process of vocabulary learning in the context of teaching English for specific purposes (ESP). [1] OPEC stands for the Organisation of Petroleum Exporting Countries.
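
    The range and frequency criteria can be expressed compactly. The following sketch is a generic illustration of ours (the thresholds and names are arbitrary, not the study's actual procedure): a word type enters the list only if its total frequency across the corpus and the number of distinct reports it appears in both clear minimum values.

      #include <map>
      #include <set>
      #include <sstream>
      #include <string>
      #include <vector>

      // Keep a word type only if it is frequent overall (frequency criterion)
      // and occurs in enough distinct reports (range criterion).
      std::set<std::string> buildWordList(const std::vector<std::string>& reports,
                                          int minFreq, std::size_t minRange) {
          std::map<std::string, int> freq;
          std::map<std::string, std::set<std::size_t>> range;
          for (std::size_t r = 0; r < reports.size(); ++r) {
              std::istringstream in(reports[r]);
              std::string w;
              while (in >> w) { ++freq[w]; range[w].insert(r); }
          }
          std::set<std::string> wordList;
          for (const auto& entry : freq)
              if (entry.second >= minFreq && range[entry.first].size() >= minRange)
                  wordList.insert(entry.first);
          return wordList;
      }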

  1. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    account for the multilingual concept literacy glossaries being compiled under the auspices of .... a theory, i.e. the set of premises, arguments and conclusions required for explaining ... fully address cognitive and communicative needs, especially of laypersons. ..... tion at UCT, and in indigenous languages as auxiliary media.

  2. Compilation and evaluation of fission yield nuclear data

    International Nuclear Information System (INIS)

    Lammer, M.

    1991-09-01

    The task of this meeting was to review the progress made since the previous meeting on fission yield evaluation and to define the tasks for an IAEA Co-ordinated Research Programme in detail. Improvements have been noted in measured data, model calculations and the situation of fission yield evaluation. Tabs

  3. Ada Integrated Environment III Computer Program Development Specification. Volume III. Ada Optimizing Compiler.

    Science.gov (United States)

    1981-12-01

    Symbol Map: library-file.library-unit{.subunit}.SYMAP; Statement Map: library-file.library-unit{.subunit}.SMAP; Type Map: library-file.library-unit{.subunit}.TMAP. Code generator outputs: SYMAP Symbol Map code generator, SMAP updated Statement Map code generator, TMAP Type Map code generator. A.3.5 The PUNIT Command ... Core.Stmtmap) NAME Tmap (Core.Typemap) END (Example A-3: Compiler Command Stream for the Code Generator; Texas Instruments, Ada Optimizing Compiler)

  4. A quantum CISC compiler and scalable assembler for quantum computing on large systems

    Energy Technology Data Exchange (ETDEWEB)

    Schulte-Herbrueggen, Thomas; Spoerl, Andreas; Glaser, Steffen [Dept. Chemistry, Technical University of Munich (TUM), 85747 Garching (Germany)

    2008-07-01

    Using the cutting-edge high-speed parallel cluster HLRB-II (with a total LINPACK performance of 63.3 TFlops) we present a quantum CISC compiler into time-optimised or decoherence-protected complex instruction sets. These comprise effective multi-qubit interactions with up to 10 qubits. We show how to assemble these medium-sized CISC modules in a scalable way for quantum computation on large systems. Extending the toolbox of universal gates by optimised complex multi-qubit instruction sets paves the way to fighting decoherence in realistic Markovian and non-Markovian settings. The advantage of quantum CISC compilation over standard RISC compilation into one- and two-qubit universal gates is demonstrated, inter alia, for the quantum Fourier transform (QFT) and for multiply-controlled NOT gates. The speed-up is by up to a factor of six, thus giving significantly better performance under decoherence. Implications for upper limits to time complexities are also derived.

  5. NACRE II: an update of the NACRE compilation of charged-particle-induced thermonuclear reaction rates for nuclei with mass number A<16

    International Nuclear Information System (INIS)

    Xu, Y.; Takahashi, K.; Goriely, S.; Arnould, M.; Ohta, M.; Utsunomiya, H.

    2013-01-01

    An update of the NACRE compilation [3] is presented. This new compilation, referred to as NACRE II, reports thermonuclear reaction rates for 34 charged-particle-induced, two-body exoergic reactions on nuclides with mass number A < 16, in the 10^6 ≲ T ⩽ 10^10 K range. Along with the 'adopted' rates, their low and high limits are provided. The new rates are available in electronic form as part of the Brussels Library (BRUSLIB) of nuclear data. The NACRE II rates also supersede the previous NACRE rates in the Nuclear Network Generator (NETGEN) for astrophysics. (http://www.astro.ulb.ac.be/databases.html)

  6. Argonne Nuclear Data Program

    Energy Technology Data Exchange (ETDEWEB)

    Kondev, F. [US Nuclear Data Program, U.S. DOE/SC (United States)

    2013-08-15

    Nuclear Data Compilations and Evaluations: - Nuclear structure and decay data compilations and evaluations for the International NSDD network (ENSDF and XUNDL); - AME12 and NuBase12 - in collaboration with G. Audi and M. MacCormick, CSNSM (Orsay), M. Wang, IMP (Lanzhou) and B. Pfeiffer, GSI (Darmstadt) - presentation by M. Wang; - DDEP coordinator - completed; - Horizontal nuclear data evaluation activities -IAEA CRP's, Isomers, Medical Isotopes; Complementary ND research Activities: - CARIBU, FRIB and other RIB facilities, Gretina, IAEA-CRP - emphasis on nuclear structure physics and astrophysics, and their intersection with applied nuclear physics programs.

  7. NNDC [National Nuclear Data Center] support for fusion nuclear data needs

    International Nuclear Information System (INIS)

    Dunford, C.L.

    1988-01-01

    The National Nuclear Data Center (NNDC), located at Brookhaven National Laboratory, is an outgrowth of the Sigma Center founded by D.J. Hughes to compile low-energy neutron reaction data in the 1950s. The center has played a lead role in the production of evaluated nuclear data (ENDF/B) for the United States nuclear power program. This data file, now in its sixth version, is produced as a cooperative effort of many DOE-funded organizations via the Cross Section Evaluation Working Group (CSEWG). The NNDC's role, in addition to providing the structure and leadership for CSEWG, is to supply compiled bibliographic and experimental data and to provide file processing, checking, distribution and documentation services. In the past, the NNDC has also produced nuclear data evaluations.

  8. Fiscal 1998 research report on super compiler technology; 1998 nendo super konpaira technology no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    For next-generation supercomputing systems, research was carried out on parallel and distributed compiler technology for enhancing effective performance, and on related software and architectures for enhancing performance in coordination with compilers. As for parallel compiler technology, research on scalable automatic parallelizing compiler technology, parallel tuning tools, and operating systems that use multiprocessor resources effectively is identified as an important concrete technical development issue. In addition, by extending these research results to the architecture technology of single-chip multiprocessors, the possibility of developing and expanding the PC, WS and HPC (high-performance computer) markets and of creating new industries is pointed out. Although wide-area distributed computing is attracting attention as a next-generation computing industry, the concrete industrial fields that will use such computing are not yet clear, and research remains at an exploratory stage. (NEDO)

  9. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    In order to support concept literacy, especially for students for whom English is not the native language, a number of universities in South Africa are compiling multilingual glossaries through which the use of languages other than English may be employed as auxiliary media. Terminologies in languages other than English ...

  10. Exploratory study of nuclear reaction data utility framework of Japan charged particle reaction data group (JCPRG)

    International Nuclear Information System (INIS)

    Masui, Hiroshi; Ohnishi, Akira; Kato, Kiyoshi; Ohbayasi, Yosihide; Aoyama, Shigeyoshi; Chiba, Masaki

    2002-01-01

    Compilation, evaluation and dissemination are essential parts of nuclear data activities. We, the Japan Charged Particle Reaction Data Group (JCPRG), have investigated a utility framework for nuclear reaction data on the basis of recent progress in computer and network technologies. These technologies serve not only data dissemination but also assistance with compilation and evaluation among the many collaborating researchers all over the world. In this paper, the current progress of our research and development is presented. (author)

  11. Migration chemistry and behaviour of iodine relevant to geological disposal of radioactive wastes. A literature review with a compilation of sorption data

    International Nuclear Information System (INIS)

    Liu, Y.; Gunten, H.R. von

    1988-09-01

    This report reviews the literature on iodine migration, chemistry and behaviour in the environment up to November 1987. It deals mainly with 129I released from a land repository, with particular consideration of the Swiss scenario for the disposal of low- and medium-level radioactive waste. As a background to this review, the basic properties of radioiodine, its distribution, circulation in nature and radiological impact are presented. A large number of sorption and diffusion data for iodine on rocks, sediments, minerals, cements and other materials have been compiled from many different laboratories. Based on this information, an assessment of the sorption and retardation of radioiodine in geomedia is made and methodologies for obtaining sorption distribution ratios (R_D values) are discussed. The review also covers natural analogue studies of 129I, retardation of iodine by cement barriers and the possible influences of organic compounds and microorganisms on the behaviour of iodine. Some possibilities for further research on diffusion measurements and near-field chemistry of radioiodine are outlined. (author) 259 refs., 9 figs., 32 tabs

  12. Thoughts and views on the compilation of monolingual dictionaries ...

    African Journals Online (AJOL)

    The end-products should be of a high lexicographic standard, well-balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries could be easily ...

  13. 12 CFR 503.2 - Exemptions of records containing investigatory material compiled for law enforcement purposes.

    Science.gov (United States)

    2010-01-01

    Title 12, Banks and Banking (revised as of 2010-01-01). Exemptions of records containing investigatory material compiled for law enforcement purposes. Section 503.2, Office of Thrift Supervision, Department of the Treasury, Privacy Act. § 503.2 Exemptions of records containing investigatory material compiled for law enforcement...

  14. Compilation of historical information of 300 Area facilities and activities

    International Nuclear Information System (INIS)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information of the 300 Area activities and facilities since the beginning. The 300 Area is shown as it looked in 1945, and also a more recent (1985) look at the 300 Area is provided

  15. Compilation of historical information of 300 Area facilities and activities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information of the 300 Area activities and facilities since the beginning. The 300 Area is shown as it looked in 1945, and also a more recent (1985) look at the 300 Area is provided.

  16. Development of automatic cross section compilation system for MCNP

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Sakurai, Kiyoshi

    1999-01-01

    The development of a code system to automatically convert cross-sections for MCNP is in progress. The NJOY code is, in general, used to convert the data compiled in the ENDF format (Evaluated Nuclear Data Files by BNL) into the cross-section libraries required by various reactor physics codes. While the cross-section library FSXLIB-J3R2 was already converted from the JENDL-3.2 version of the Japanese Evaluated Nuclear Data Library for the continuous-energy Monte Carlo code MCNP, that library contains only cross-sections at room temperature (300 K). In response to users' requirements for cross-sections at higher temperatures, say 600 K or 900 K, a code system named 'autonj' is under development to provide cross-section libraries at arbitrary temperatures for the MCNP code. This system can accept any of the data formats adopted in JENDL, including those that cannot be treated by the NJOY code. The input preparation that NJOY requires repeatedly for every nuclide is greatly reduced by allowing as many nuclides as the user wants to be converted in one execution. A few MCNP runs were performed for verification purposes using the two libraries FSXLIB-J3R2 and the 'autonj' output. The almost identical MCNP results, within statistical errors, show that the 'autonj' output library is correct. In FY 1998 the system will be completed, and in FY 1999 the user's manual will be published. (K. Tsuchihashi)

  17. Compilation of LLNL CUP-2 Data

    Energy Technology Data Exchange (ETDEWEB)

    Eppich, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kips, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lindvall, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-07-31

    The CUP-2 uranium ore concentrate (UOC) standard reference material, a powder, was produced at the Blind River uranium refinery of Eldorado Resources Ltd. in Canada in 1986. This material was produced as part of a joint effort by the Canadian Certified Reference Materials Project and the Canadian Uranium Producers Metallurgical Committee to develop a certified reference material for uranium concentration and the concentration of several impurity constituents. This standard was developed to satisfy the requirements of the UOC mining and milling industry, and was characterized with this purpose in mind. To produce CUP-2, approximately 25 kg of UOC derived from the Blind River uranium refinery was blended, homogenized, and assessed for homogeneity by X-ray fluorescence (XRF) analysis. The homogenized material was then packaged into bottles, containing 50 g of material each, and distributed for analysis to laboratories in 1986. The CUP-2 UOC standard was characterized by an interlaboratory analysis program involving eight member laboratories, six commercial laboratories, and three additional volunteer laboratories. Each laboratory provided five replicate results on up to 17 analytes, including total uranium concentration and moisture content. The selection of analytical technique was left to each participating laboratory. Uranium was reported on an "as-received" basis; all other analytes (besides moisture content) were reported on a "dry-weight" basis. A bottle of 25 g of the CUP-2 UOC standard as described above was purchased by LLNL and characterized by the LLNL Nuclear Forensics Group. Non-destructive and destructive analytical techniques were applied to the UOC sample. Information obtained from short-term techniques such as photography, gamma spectrometry, and scanning electron microscopy was used to guide the performance of longer-term techniques such as ICP-MS. Some techniques, such as XRF and ICP-MS, provided complementary types of data. The results

  18. Just-In-Time compilation of OCaml byte-code

    OpenAIRE

    Meurer, Benedikt

    2010-01-01

    This paper presents various improvements that were applied to OCamlJIT2, a Just-In-Time compiler for the OCaml byte-code virtual machine. OCamlJIT2 currently runs on various Unix-like systems with x86 or x86-64 processors. The improvements, including the new x86 port, are described in detail, and performance measures are given, including a direct comparison of OCamlJIT2 to OCamlJIT.

  19. Assessment of the current status of basic nuclear data compilations

    International Nuclear Information System (INIS)

    1985-01-01

    Topics discussed include: the status of mass-chain evaluations, remote terminal access, other US Nuclear Data Network publications, formats and procedures subcommittee report, keyword follow-up (Phys. Rev. C), and atomic data and nuclear data tables

  20. GRESS, FORTRAN Pre-compiler with Differentiation Enhancement

    International Nuclear Information System (INIS)

    1999-01-01

    1 - Description of program or function: The GRESS FORTRAN pre-compiler (SYMG) and run-time library are used to enhance conventional FORTRAN-77 programs with analytic differentiation of arithmetic statements for automatic differentiation in either forward or reverse mode. GRESS 3.0 is functionally equivalent to GRESS 2.1. GRESS 2.1 is an improved and updated version of the previously released GRESS 1.1. Improvements in the implementation of the CHAIN option have resulted in a 70 to 85% reduction in execution time and up to a 50% reduction in memory required for forward chaining applications. 2 - Method of solution: GRESS uses a pre-compiler to analyze FORTRAN statements and determine the mathematical operations embodied in them. As each arithmetic assignment statement in a program is analyzed, SYMG generates the partial derivatives of the term on the left with respect to each floating-point variable on the right. The result of the pre-compilation step is a new FORTRAN program that can produce derivatives for any REAL (i.e., single or double precision) variable calculated by the model. Consequently, GRESS enhances FORTRAN programs or subprograms by adding the calculation of derivatives along with the original output. Derivatives from a GRESS-enhanced model can be used internally (e.g., iteration acceleration) or externally (e.g., sensitivity studies). By calling GRESS run-time routines, derivatives can be propagated through the code via the chain rule (referred to as the CHAIN option) or accumulated to create an adjoint matrix (referred to as the ADGEN option). A third option, GENSUB, makes it possible to process a subset of a program (i.e., a do loop, subroutine, function, a sequence of subroutines, or a whole program) for calculating derivatives of dependent variables with respect to independent variables. A code enhanced with the GENSUB option can use forward mode, reverse mode, or a hybrid of the two modes. 3 - Restrictions on the complexity of the problem: GRESS
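
    GRESS itself works by source-to-source translation of Fortran, but the underlying idea, attaching a derivative to every floating-point value and updating it with the chain rule at each assignment, is easy to illustrate. The Python sketch below implements a minimal forward-mode dual number; it is a generic illustration of the technique, not GRESS's generated code.

      class Dual:
          """Value paired with its derivative; chain rule applied per operation."""
          def __init__(self, value, deriv=0.0):
              self.value, self.deriv = value, deriv

          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value + other.value, self.deriv + other.deriv)

          __radd__ = __add__

          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              # product rule: d(uv) = u'v + uv'
              return Dual(self.value * other.value,
                          self.deriv * other.value + self.value * other.deriv)

          __rmul__ = __mul__

      # df/dx of f(x, y) = 3*x*x + x*y at (2, 5): expect 6x + y = 17
      x = Dual(2.0, 1.0)   # seed dx/dx = 1
      y = Dual(5.0, 0.0)
      f = 3 * x * x + x * y
      print(f.value, f.deriv)  # 22.0 17.0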

  1. Hydrocarbon solvent exposure data: compilation and analysis of the literature.

    Science.gov (United States)

    Caldwell, D J; Armstrong, T W; Barone, N J; Suder, J A; Evans, M J

    2000-01-01

    An occupational exposure database for hydrocarbon solvent end-use applications was constructed from the published literature. The database provides exposure assessment information for such purposes as regulatory risk assessments, support of industry product stewardship initiatives, and identification of applications in which limited exposure data are available. It is quantitative, documented, and based on credible data. Approximately 350 articles containing quantitative hydrocarbon solvent exposure data were identified using a search of computer databases of published literature. Many articles did not report sufficient details of the exposure data for inclusion in the database (e.g., full-shift exposure or task-based exposure data). Others were excluded because only limited summary statistics were provided, which precluded statistical analysis of the data (e.g., arithmetic mean concentration presented, but no sample number). Following evaluation, 16,880 hydrocarbon solvent exposure measurements from 99 articles were entered into a database for analysis. Methods used to identify and evaluate published solvent exposure data are described along with more detailed analysis of worker exposure to hydrocarbon solvents in three major end-use applications: painting and coating, printing, and adhesives. Solvent exposures were evaluated against current ACGIH threshold limit values (TLVs) and trends were identified. Limited quantitative data are available prior to 1970. In general, reported hydrocarbon solvent exposures decreased fourfold from 1960 to 1998, were below the TLVs applicable to specific hydrocarbon solvents at the time, and on average have been below 40% of the TLV since 1980. The database already has proved valuable; however, the utility of published exposure data could be further improved if authors consistently reported essential data elements and supporting information.
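
    Trend statements such as "below 40% of the TLV since 1980" come from normalizing each measurement by the exposure limit in force at the time and aggregating by period. A small, hypothetical pandas sketch of that normalization step (records and column names invented for illustration):

      import pandas as pd

      # Toy records: year, measured concentration, and the TLV in force at the time.
      df = pd.DataFrame({
          "year": [1962, 1975, 1984, 1996],
          "conc_ppm": [180.0, 90.0, 35.0, 20.0],
          "tlv_ppm": [200.0, 150.0, 100.0, 100.0],
      })
      df["fraction_of_tlv"] = df["conc_ppm"] / df["tlv_ppm"]
      df["decade"] = (df["year"] // 10) * 10
      print(df.groupby("decade")["fraction_of_tlv"].mean())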

  2. Strontium-90 fluoride data sheet

    Energy Technology Data Exchange (ETDEWEB)

    Fullam, H.T.

    1981-06-01

    This report is a compilation of available data and appropriate literature references on the properties of strontium-90 fluoride and nonradioactive strontium fluoride. The objective of the document is to compile in a single source pertinent data to assist potential users in the development, licensing, and use of ⁹⁰SrF₂-fueled radioisotope heat sources for terrestrial power conversion and thermal applications. The report is an update of the Strontium-90 Fluoride Data Sheet (BNWL-2284) originally issued in April 1977.

  3. Columbia River Component Data Evaluation Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    C.S. Cearlock

    2006-08-02

    The purpose of the Columbia River Component Data Compilation and Evaluation task was to compile, review, and evaluate existing information for constituents that may have been released to the Columbia River due to Hanford Site operations. Through this effort an extensive compilation of information pertaining to Hanford Site-related contaminants released to the Columbia River has been completed for almost 965 km of the river.

  4. Compilation of contract research for the Materials Engineering Branch, Division of Engineering: Annual report for FY 1988

    International Nuclear Information System (INIS)

    1989-05-01

    This compilation of annual reports by contractors to the Materials Engineering Branch of the NRC Office of Research concentrates on achievements in safety research for the primary system of commercial light water power reactors, particularly with regard to reactor vessels, primary system piping, steam generators, nondestructive examination of primary components, and in safety research for decommissioning and decontamination, on-site storage and engineered safety features. The Materials Engineering Branch assembles abbreviated reports from all the branch contractors and publishes them in a single annual report as soon after the end of the year as possible so that the information developed throughout the year can be promptly used in the safety-regulatory process. This report, covering research conducted during Fiscal Year 1988, is the seventh volume of the series of NUREG-0975, ''Compilation of Contractor Research for the Materials Engineering Branch, Division of Engineering.'' Individual projects are processed separately for the data bases

  5. Compilation of a global inventory of emissions of nitrous oxide

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

    A global inventory with 1° × 1° resolution was compiled of emissions of nitrous oxide (N₂O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing,

  6. Data Compilation for AGR-1 Baseline Coated Particle Composite LEU01-46T

    International Nuclear Information System (INIS)

    Hunn, John D.; Lowden, Richard Andrew

    2006-01-01

    This document is a compilation of characterization data for the AGR-1 baseline coated particle composite LEU01-46T, a composite of four batches of TRISO-coated 350 µm, 19.7% low-enriched uranium oxide/uranium carbide (LEUCO) kernels. The AGR-1 TRISO-coated particles consist of a spherical kernel coated with a ∼50% dense carbon buffer layer (100 µm nominal thickness), followed by a dense inner pyrocarbon layer (40 µm nominal thickness), a SiC layer (35 µm nominal thickness), and another dense outer pyrocarbon layer (40 µm nominal thickness). The coated particles were produced by ORNL for the Advanced Gas Reactor Fuel Development and Qualification (AGR) program, to be put into compacts for insertion in the first irradiation test capsule, AGR-1. The kernels were obtained from BWXT and identified as composite G73D-20-69302. The BWXT kernel lot G73D-20-69302 was riffled into sublots for characterization and coating by ORNL and identified as LEU01-?? (where ?? is a series of integers beginning with 01). Additional particle batches were coated with only buffer or buffer plus inner pyrocarbon (IPyC) layers using process conditions similar to those used for the full TRISO batches comprising the LEU01-46T composite. These batches were fabricated in order to qualify that the process conditions used for buffer and IPyC would produce acceptable densities, as described in sections 8 and 9. These qualifying batches used 350 µm natural uranium oxide/uranium carbide (NUCO) kernels. The kernels were obtained from BWXT and identified as composite G73B-NU-69300. The use of NUCO surrogate kernels is not expected to significantly affect the densities of the buffer and IPyC coatings. Confirmatory batches using LEUCO kernels from G73D-20-69302 were coated and characterized to verify this assumption. The AGR-1 Fuel Product Specification and Characterization Guidance (INL EDF-4380, Rev. 6) provides the requirements necessary for acceptance

  7. Architectural and compiler techniques for energy reduction in high-performance microprocessors

    Science.gov (United States)

    Bellas, Nikolaos

    1999-11-01

    The microprocessor industry has started viewing power, along with area and performance, as a decisive design factor in today's microprocessors. The increasing cost of packaging and cooling systems poses stringent requirements on the maximum allowable power dissipation. Most of the research in recent years has focused on the circuit, gate, and register-transfer (RT) levels of the design. In this research, we focus on the software running on a microprocessor and we view the program as a power consumer. Our work concentrates on the role of the compiler in the construction of "power-efficient" code, and especially its interaction with the hardware so that unnecessary processor activity is saved. We propose techniques that use extra hardware features and compiler-driven code transformations that specifically target activity reduction in certain parts of the CPU which are known to be large power and energy consumers. Design for low power/energy at this level of abstraction entails larger energy gains than in the lower stages of the design hierarchy in which the design team has already made the most important design commitments. The role of the compiler in generating code which exploits the processor organization is also fundamental in energy minimization. Hence, we propose a hardware/software co-design paradigm, and we show what code transformations are necessary by the compiler so that "wasted" power in a modern microprocessor can be trimmed. More specifically, we propose a technique that uses an additional mini cache located between the instruction cache (I-Cache) and the CPU core; the mini cache buffers instructions that are nested within loops and are continuously fetched from the I-Cache. This mechanism can create very substantial energy savings, since the I-Cache unit is one of the main power consumers in most of today's high-performance microprocessors. Results are reported for the SPEC95 benchmarks in the R-4400 processor which implements the MIPS2 instruction
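
    The energy argument for the mini cache rests on a simple observation: inside a loop, the same instructions are fetched over and over, so serving them from a tiny buffer avoids most I-Cache activity. A hypothetical back-of-the-envelope simulation in Python (buffer size and per-access energies are invented for illustration, not the paper's measured values):

      # Instruction fetch trace for a loop of 8 instructions executed 1000 times.
      trace = list(range(8)) * 1000

      BUFFER_SIZE = 16                 # mini-cache capacity, in instructions
      E_ICACHE, E_BUFFER = 1.0, 0.1    # assumed relative energy per access

      buffer_tags, energy = set(), 0.0
      for pc in trace:
          if pc in buffer_tags:
              energy += E_BUFFER              # hit in the mini cache
          else:
              energy += E_ICACHE + E_BUFFER   # fetch from I-Cache, fill buffer
              if len(buffer_tags) >= BUFFER_SIZE:
                  buffer_tags.pop()           # crude eviction
              buffer_tags.add(pc)

      baseline = len(trace) * E_ICACHE        # every fetch from the I-Cache
      print(f"energy ratio vs. I-Cache only: {energy / baseline:.2f}")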

  8. An Optimizing Compiler for Petascale I/O on Leadership Class Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Choudhary, Alok [Northwestern Univ., Evanston, IL (United States); Kandemir, Mahmut [Pennsylvania State Univ., State College, PA (United States)

    2015-03-18

    In high-performance computing systems, parallel I/O architectures usually have very complex hierarchies, with multiple layers that collectively constitute an I/O stack: high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our project explored automated instrumentation and compiler support for I/O-intensive applications. The project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and designing and implementing state-of-the-art compiler/runtime technology that targets I/O-intensive HPC applications on leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions.

  9. Statistical Compilation of the ICT Sector and Policy Analysis | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  10. Statistical Compilation of the ICT Sector and Policy Analysis | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  11. EXFOR basics: A short guide to the nuclear reaction data exchange format

    International Nuclear Information System (INIS)

    McLane, V.

    1996-07-01

    This manual is intended as a guide to users of nuclear reaction data compiled in the EXFOR format, and is not intended as a complete guide to the EXFOR System. EXFOR is the exchange format designed to allow transmission of nuclear data between the Nuclear Reaction Data Centers. In addition to the data and their bibliographic information, experimental information, including the source of uncertainties, is also compiled. The status and history of each data set is included as well, e.g., the source of the data, any updates which have been made, and correlations to other data sets. EXFOR is designed for flexibility in order to meet the diverse needs of the nuclear data compilation centers. This format should not be confused with a center-to-user format. Although users may obtain data from the centers in the EXFOR format, other center-to-user formats have been developed to meet the needs of the users within each center's own sphere of responsibility. The exchange format, as outlined, allows a large variety of numerical data tables with explanatory and bibliographic information to be transmitted in an easily machine-readable format (for checking and indicating possible errors) and a format that can be read by personnel (for passing judgment on and correcting any errors indicated by the machine). The data presently included in the EXFOR exchange include: a complete compilation of experimental neutron-induced reaction data, a selected compilation of charged-particle-induced reaction data, and a selected compilation of photon-induced reaction data.

  12. Just-in-Time Compilation-Inspired Methodology for Parallelization of Compute Intensive Java Code

    Directory of Open Access Journals (Sweden)

    GHULAM MUSTAFA

    2017-01-01

    Compute intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop-level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated into the front-end of a JIT compiler to parallelize sequential code just before native translation; however, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system.
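
    A DOALL loop is one whose iterations carry no cross-iteration dependences, so they can run in any order or in parallel. A minimal Python illustration of the parallelize-the-loop idea (the hotspot detection and JIT integration described in the paper are out of scope here):

      from concurrent.futures import ProcessPoolExecutor

      def body(i):
          # Loop body with no cross-iteration dependence: each result
          # depends only on i, which is what makes the loop DOALL.
          return i * i + 3

      def doall(n, workers=8):
          with ProcessPoolExecutor(max_workers=workers) as pool:
              return list(pool.map(body, range(n), chunksize=1024))

      if __name__ == "__main__":
          out = doall(100_000)
          print(out[:5])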

  13. Just-in-time compilation-inspired methodology for parallelization of compute intensive java code

    International Nuclear Information System (INIS)

    Mustafa, G.; Ghani, M.U.

    2017-01-01

    Compute intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop-level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated into the front-end of a JIT compiler to parallelize sequential code just before native translation; however, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system. (author)

  14. Recent advances in PC-Linux systems for electronic structure computations by optimized compilers and numerical libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Yu, Chin-Hui

    2002-01-01

    One of the most frequently used packages for electronic structure research, GAUSSIAN 98, is compiled on Linux systems with various hardware configurations, including AMD Athlon (with the "Thunderbird" core), AthlonMP, and AthlonXP (with the "Palomino" core) systems as well as Intel Pentium 4 (with the "Willamette" core) machines. The default PGI FORTRAN compiler (pgf77) and the Intel FORTRAN compiler (ifc) are employed with different architectural optimization options to compile GAUSSIAN 98 and test the performance improvement. In addition to the BLAS library included in revision A.11 of this package, the Automatically Tuned Linear Algebra Software (ATLAS) library is linked against the binary executables to improve the performance. Various Hartree-Fock, density-functional theory, and MP2 calculations are run for benchmarking purposes. It is found that the combination of ifc with the ATLAS library gives the best performance for GAUSSIAN 98 on all of these PC-Linux computers, including AMD and Intel CPUs. Even on AMD systems, the Intel FORTRAN compiler invariably produces binaries with better performance than pgf77. The enhancement provided by the ATLAS library is more significant for post-Hartree-Fock calculations. The performance on a single CPU is potentially as good as that of an Alpha 21264A workstation or an SGI supercomputer. The SpecFP2000 floating-point scores show trends similar to the GAUSSIAN 98 results.
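
    The performance gap the authors attribute to the linked BLAS can be reproduced in miniature by timing a large matrix multiply, which is where optimized BLAS implementations earn their keep. A small NumPy sketch; which BLAS it exercises depends on how the installed NumPy was built:

      import time
      import numpy as np

      # np.show_config() reports which BLAS/LAPACK NumPy was linked against.
      np.show_config()

      n = 2000
      a = np.random.rand(n, n)
      b = np.random.rand(n, n)

      t0 = time.perf_counter()
      c = a @ b                       # dispatched to the linked BLAS (dgemm)
      dt = time.perf_counter() - t0

      # A dense matrix multiply costs 2*n**3 floating-point operations.
      print(f"{2 * n**3 / dt / 1e9:.1f} GFLOP/s")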

  15. Arctic temperature and moisture trends during the past 2000 years - Progress from multiproxy-paleoclimate data compilations

    Science.gov (United States)

    Kaufman, Darrell; Routson, Cody; McKay, Nicholas; Beltrami, Hugo; Jaume-Santero, Fernando; Konecky, Bronwen; Saenger, Casey

    2017-04-01

    Instrumental climate data and climate-model projections show that Arctic-wide surface temperature and precipitation are positively correlated. Higher temperatures coincide with greater moisture by: (1) expanding the duration and source area for evaporation as sea ice retreats, (2) enhancing the poleward moisture transport, and (3) increasing the water-vapor content of the atmosphere. Higher temperature also influences the evaporation rate, and therefore precipitation minus evaporation (P-E), the climate variable often sensed by paleo-hydroclimate proxies. Here, we test whether Arctic temperature and moisture also correlate on centennial timescales over the Common Era (CE). We use the new PAGES2k multiproxy-temperature dataset along with a first-pass compilation of moisture-sensitive proxy records to calculate century-scale composite time series, with a focus on longer records that extend back through the first millennium CE. We present a new Arctic borehole temperature reconstruction as a check on the magnitude of Little Ice Age cooling inferred from the proxy records, and we investigate the spatial pattern of centennial-scale variability. Similar to previous reconstructions, v2 of the PAGES2k proxy temperature dataset shows that, prior to the 20th century, mean annual Arctic-wide temperature decreased over the CE. The millennial-scale cooling trend is most prominent in proxy records from glacier ice, but is also registered in lake and marine sediment, and trees. In contrast, the composite of moisture-sensitive (primarily P-E) records does not exhibit a millennial-scale trend. Determining whether fluctuations in the mean state of Arctic temperature and moisture were in fact decoupled is hampered by the difficulty of detecting a significant trend within the relatively small number of spatially heterogeneous multi-proxy moisture-sensitive records. A decoupling of temperature and moisture would indicate that evaporation had a strong counterbalancing effect on precipitation
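
    Century-scale composites of heterogeneous proxies are typically built by standardizing each record and averaging the standardized series per time bin. A schematic Python version of that step (record structure and binning choices are illustrative, not the authors' exact protocol):

      import numpy as np

      def composite(records, years):
          """records: list of (year_array, value_array) proxy time series."""
          grid = np.full((len(records), len(years)), np.nan)
          for i, (yr, val) in enumerate(records):
              z = (val - val.mean()) / val.std()        # standardize each record
              grid[i] = np.interp(years, yr, z, left=np.nan, right=np.nan)
          return np.nanmean(grid, axis=0)               # mean across records

      years = np.arange(0, 2001)
      rng = np.random.default_rng(0)
      fake = [(np.sort(rng.choice(years, 200, replace=False)),
               rng.normal(size=200)) for _ in range(5)]
      series = composite(fake, years)
      century_means = np.nanmean(series[:2000].reshape(20, 100), axis=1)
      print(np.round(century_means, 2))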

  16. Nuclear data evaluation and group constant generation for reactor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Do; Lee, Jong Tae; Min, Byung Joo; Gil, Choong Sup [Korea Atomic Energy Research Inst., Daeduk (Korea, Republic of)

    1991-01-01

    In nuclear and shielding design analyses for reactors and other facilities, nuclear data are of primary importance. A research project on nuclear data evaluation and its effective application has been carried out continuously. The objectives of this project are (1) to compile the latest evaluated nuclear data files, (2) to establish their processing code systems, and (3) to evaluate the multi-group constant library using the newly compiled data files and the code systems. As the results of this project, the ENDF/B-VI Supplementary File including important nuclides, JENDL-3.1 and JEF-1 were compiled, and the ENDF-6 international computer file format for evaluated nuclear data and its processing system, NJOY89.31, were tested with ENDF/B-VI data. In order to test the applicability of the newly released data to thermal reactor problems, a number of benchmark calculations were performed, and the results were analyzed. Since preliminary benchmark tests on thermal reactor problems have been completed, the newly compiled data are expected to be put to use in developing advanced reactors. (Author).

  17. Nuclear data evaluation and group constant generation for reactor analysis

    International Nuclear Information System (INIS)

    Kim, Jung Do; Lee, Jong Tae; Min, Byung Joo; Gil, Choong Sup

    1991-01-01

    In nuclear and shielding design analyses for reactors and other facilities, nuclear data are of primary importance. A research project on nuclear data evaluation and its effective application has been carried out continuously. The objectives of this project are (1) to compile the latest evaluated nuclear data files, (2) to establish their processing code systems, and (3) to evaluate the multi-group constant library using the newly compiled data files and the code systems. As the results of this project, the ENDF/B-VI Supplementary File including important nuclides, JENDL-3.1 and JEF-1 were compiled, and the ENDF-6 international computer file format for evaluated nuclear data and its processing system, NJOY89.31, were tested with ENDF/B-VI data. In order to test the applicability of the newly released data to thermal reactor problems, a number of benchmark calculations were performed, and the results were analyzed. Since preliminary benchmark tests on thermal reactor problems have been completed, the newly compiled data are expected to be put to use in developing advanced reactors. (Author)

  18. Engineering Amorphous Systems, Using Global-to-Local Compilation

    Science.gov (United States)

    Nagpal, Radhika

    Emerging technologies are making it possible to assemble systems that incorporate myriads of information-processing units at almost no cost: smart materials, self-assembling structures, vast sensor networks, pervasive computing. How does one engineer robust and prespecified global behavior from the local interactions of immense numbers of unreliable parts? We discuss organizing principles and programming methodologies that have emerged from Amorphous Computing research, which allow us to compile a specification of global behavior into a robust program for local behavior.

  19. amamutdb.no: A relational database for MAN2B1 allelic variants that compiles genotypes, clinical phenotypes, and biochemical and structural data of mutant MAN2B1 in α-mannosidosis.

    Science.gov (United States)

    Riise Stensland, Hilde Monica Frostad; Frantzen, Gabrio; Kuokkanen, Elina; Buvang, Elisabeth Kjeldsen; Klenow, Helle Bagterp; Heikinheimo, Pirkko; Malm, Dag; Nilssen, Øivind

    2015-06-01

    α-Mannosidosis is an autosomal recessive lysosomal storage disorder caused by mutations in the MAN2B1 gene, encoding lysosomal α-mannosidase. The disorder is characterized by a range of clinical phenotypes of which the major manifestations are mental impairment, hearing impairment, skeletal changes, and immunodeficiency. Here, we report an α-mannosidosis mutation database, amamutdb.no, which has been constructed as a publicly accessible online resource for recording and analyzing MAN2B1 variants (http://amamutdb.no). Our aim has been to offer structured and relational information on MAN2B1 mutations and genotypes along with associated clinical phenotypes. Classifying missense mutations as pathogenic or benign is a challenge; therefore, they have been given special attention, and we have compiled all available data relating to their biochemical, functional, and structural properties. The α-mannosidosis mutation database is comprehensive and relational in the sense that information can be retrieved and compiled across datasets; hence, it will facilitate diagnostics and increase our understanding of the clinical and molecular aspects of α-mannosidosis. We believe that the amamutdb.no structure and architecture will be applicable to the development of databases for any monogenic disorder. © 2015 WILEY PERIODICALS, INC.
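
    The database's defining feature is that variants, genotypes, and phenotypes live in separate tables that can be joined. A toy relational sketch with sqlite3; the schema and columns are invented for illustration and are not amamutdb.no's actual schema (the example variant, c.2248C>T / p.Arg750Trp, is a known MAN2B1 mutation):

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
          CREATE TABLE variant (
              id INTEGER PRIMARY KEY,
              cdna TEXT, protein TEXT, pathogenicity TEXT);
          CREATE TABLE patient (
              id INTEGER PRIMARY KEY,
              allele1 INTEGER REFERENCES variant(id),
              allele2 INTEGER REFERENCES variant(id),
              phenotype TEXT);
      """)
      con.execute("INSERT INTO variant VALUES (1, 'c.2248C>T', 'p.Arg750Trp', 'pathogenic')")
      con.execute("INSERT INTO patient VALUES (1, 1, 1, 'moderate')")

      # Retrieve phenotypes together with both alleles via joins.
      rows = con.execute("""
          SELECT p.phenotype, v1.protein, v2.protein
          FROM patient p
          JOIN variant v1 ON p.allele1 = v1.id
          JOIN variant v2 ON p.allele2 = v2.id
      """).fetchall()
      print(rows)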

  20. A compilation of K⁺p → K⁰Δ⁺⁺ cross sections below 2 GeV/c

    CERN Document Server

    Giacomelli, G; Piccinini, M; Rimondi, F; Serra-Lugaresi, P

    1976-01-01

    Data published up to June 1976 on the quasi-two-body reaction K⁺p → K⁰Δ⁺⁺, with Δ⁺⁺ → pπ⁺, are compiled for laboratory momenta from 0.7 to 2 GeV/c. They include integrated cross-sections, differential cross-sections, average and differential density matrix elements, as well as coefficients of the Legendre polynomial expansions of the production differential distributions. The data are presented in the form of graphs and computer-produced tables. The method of computation is the same as in a previous report (CERN-HERA-75-1) on K⁺N cross-sections below 2 GeV/c, to which the reader is referred for details on card formats, notations, etc.

  1. NEA Data Bank progress report to the nuclear reaction data centres

    International Nuclear Information System (INIS)

    1988-01-01

    The report briefly presents the activities of the NEA Data Bank concerning neutron data compilation, the Joint Evaluated File, computer program services, validation of nuclear model codes, nuclear waste management, and reactor safety

  2. Compilation and evaluation of high energy γ-ray standards from nuclear reactions. Work performed under the coordinated research project 'Update of X- and γ-ray decay data standards for detector calibration'

    International Nuclear Information System (INIS)

    Marcinkowski, A.; Marianski, B.

    1999-02-01

    The report presents the following aspects needed for the compilation and evaluation of high energy γ-ray standards from nuclear reactions: evaluation of the emission probabilities of γ-rays with energies 4.44 MeV and 15.11 MeV from ¹²C*, preparation of the list of reactions suitable for production of the above-mentioned excited nucleus, and compilation and evaluation of cross sections for these reactions, including inelastic proton scattering on ¹²C and radiative capture on ¹¹B

  3. Development of a computerized data base for low-level waste leaching data

    International Nuclear Information System (INIS)

    Dougherty, D.R.; Colombo, P.

    1987-01-01

    A computerized data base (db) of low-level waste (LLW) leaching data is being compiled by Brookhaven National Laboratory under contract to the DOE Low-Level Waste Management Program. Although this db is being compiled as part of an effort to develop accelerated leach test procedures for LLW forms, others involved in LLW management may find it useful. The db is implemented on an IBM PC XT and is self-contained in that its data manipulation and analysis programs are not proprietary (i.e., need not be purchased). The db includes data from the Accelerated Leach Test(s) Program plus selected literature data, chosen on criteria that include completeness of the experimental description and elucidation of leaching mechanisms. 6 references, 5 figures, 3 tables

  4. Regulatory and technical reports, compilation for 1979. Volume 4. Bibliographical report Jan-Dec 79

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzie, L.; Aragon, R.

    1980-07-01

    The compilation lists formal regulatory and technical reports issued in 1979 by the U.S. Nuclear Regulatory Commission (NRC) staff and by NRC contractors. The compilation is divided into three major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The first portion of this sequential section lists staff reports, the second portion lists NRC-sponsored conference proceedings, and the third lists contractor reports. Each report citation in the sequential section contains full bibliographic information

  5. Physics data bases and their use

    International Nuclear Information System (INIS)

    Gault, F.D.

    1981-01-01

    The physics data base is examined as a passive archive, an active marketing device for new data, and as a resource centre producing informal commentary through the work of its compilers. The relative importance to its function of wide availability through networks, ease of retrieval of the desired data, and packages for manipulating and displaying the information retrieved is considered. Throughout, examples are drawn from various compilations, including the elementary particle data bases maintained by the Particle Data Group in the UK at the Rutherford Appleton Laboratory. The future of the physics data base in supporting but not supplanting the publication of papers on experimental physics is discussed. (orig.)

  6. The Compilation of a Shona Children's Dictionary: Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Peniah Mabaso

    2011-10-01

    This article outlines the challenges encountered by the African Languages Research Institute (ALRI) team members in the compilation of the monolingual Shona Children's Dictionary. The focus is mainly on the problems met in headword selection. Solutions by the team members when dealing with these problems are also presented.

  7. SAR data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Compilation of the data that go into the annual marine mammal stock assessment reports, including abundance and bycatch estimates, PBR calculations, bibliographic...

  8. Statistical Compilation of the ICT Sector and Policy Analysis | Page 5 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The project is designed to expand the scope of conventional investigation beyond the telecommunications industry to include other vertically integrated components of the ICT sector such as manufacturing and services. ... Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia.

  9. Springer Handbook of Condensed Matter and Materials Data

    CERN Document Server

    Martienssen, Werner

    2005-01-01

    Condensed Matter and Materials Science are two of the most active fields of applied physics, with a stream of discoveries in areas from superconductivity and magnetism to the optical, electronic and mechanical properties of materials. While a huge amount of data has been compiled and spread over numerous reference works, no single volume compiles the most used information. Springer Handbook of Condensed Matter and Materials Data provides a concise compilation of data and functional relationships from the fields of solid-state physics and materials in this 1200-page volume. The data, encapsulated in over 750 tables and 1025 illustrations, have been selected and extracted primarily from the extensive high-quality data collection Landolt-Börnstein and also from other systematic data sources and recent publications of physical and technical property data. Many chapters are authored by Landolt-Börnstein editors, including the editors of this Springer Handbook. Key topics: fundamental constants; the International S...

  10. Compilation of the abstracts of nuclear computer codes available at CPD/IPEN

    International Nuclear Information System (INIS)

    Granzotto, A.; Gouveia, A.S. de; Lourencao, E.M.

    1981-06-01

    A compilation of all computer codes available at IPEN in S. Paulo is presented. These computer codes are classified according to the Argonne National Laboratory and Nuclear Energy Agency scheme. (E.G.) [pt

  11. An Optimizing Compiler for Petascale I/O on Leadership-Class Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Kandemir, Mahmut Taylan [PSU; Choudary, Alok [Northwestern; Thakur, Rajeev [ANL

    2014-03-01

    In high-performance computing (HPC), parallel I/O architectures usually have very complex hierarchies, with multiple layers that collectively constitute an I/O stack: high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our DOE project explored automated instrumentation and compiler support for I/O-intensive applications. The project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and designing and implementing state-of-the-art compiler/runtime technology that targets I/O-intensive HPC applications on leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions. Two sections new in this report compared to the previous one cover IOGenie and SSD/NVM-specific optimizations.

  12. Programming time-multiplexed reconfigurable hardware using a scalable neuromorphic compiler.

    Science.gov (United States)

    Minkovich, Kirill; Srinivasa, Narayan; Cruz-Albrecht, Jose M; Cho, Youngkwan; Nogin, Aleksey

    2012-06-01

    Scalability and connectivity are two key challenges in designing neuromorphic hardware that can match biological levels. In this paper, we describe a neuromorphic system architecture designed to meet these challenges using traditional complementary metal-oxide-semiconductor (CMOS) hardware. A key requirement in realizing such neural architectures in hardware is the ability to automatically configure the hardware to emulate any neural architecture or model. The focus of this paper is the details of such a programmable front-end. This programmable front-end is composed of a neuromorphic compiler and a digital memory, and is designed based on the concept of synaptic time-multiplexing (STM). The neuromorphic compiler automatically translates any given neural architecture to hardware switch states, and these states are stored in digital memory to realize the desired neural architecture. STM enables the proposed architecture to address scalability and connectivity using traditional CMOS hardware. We describe the details of the proposed design and the programmable front-end, and provide examples to illustrate its capabilities. We also provide perspectives for future extensions and potential applications.
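
    Synaptic time-multiplexing trades time for wiring: a physical crossbar smaller than the network is reconfigured over several time slots until every logical synapse has been realized once per cycle. A schematic Python sketch of the compiler's core scheduling step (data structures invented for illustration):

      def compile_stm(synapses, crossbar_capacity):
          """Partition logical synapses into time slots that each fit the crossbar.

          synapses: list of (pre_neuron, post_neuron) connections.
          Returns a list of slots; each slot is the switch-state set for one
          time-multiplexing phase.
          """
          slots = []
          for i in range(0, len(synapses), crossbar_capacity):
              slots.append(set(synapses[i:i + crossbar_capacity]))
          return slots

      # 10 logical synapses on hardware that realizes 4 at a time -> 3 slots.
      net = [(p, q) for p in range(2) for q in range(5)]
      for t, slot in enumerate(compile_stm(net, 4)):
          print(f"slot {t}: {sorted(slot)}")

    A real compiler would also have to respect row and column port conflicts within each slot; the greedy chunking above ignores that detail.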

  13. Can Experimental Scientists, Data Evaluators and Compilers, and Nuclear Data Users Understand One Another?

    Energy Technology Data Exchange (ETDEWEB)

    Usachev, L. N. [Institute of Physics and Energetics, Obninsk, USSR (Russian Federation)

    1966-07-01

    The International Atomic Energy Agency organizes conferences on a wide variety of scientific subjects, all of which are of fundamental importance for the development of nuclear power. These include the technology of fuel elements, their stability in neutron fields, and chemical reprocessing as well as reactor physics, mathematical computational methods and the problems of protection and dosimetry. The problem of microscopic nuclear data, an essential aspect of reactor work, is just one of these many subjects. On the other hand, it should be remembered that the possibility of releasing nuclear energy was established in the first place by obtaining nuclear data on the fission process occurring in the uranium nucleus following the capture of a neutron and on the escape of the 2-3 secondary fission neutrons. In early nuclear power work the information provided by nuclear data was of considerable, even of decisive, importance. For example, the information available on the neutron balance in fast reactors showed that such reactors could operate as breeders and thus that it was worth while developing them. Strictly speaking, it is of course difficult to speak of a knowledge of nuclear data at this early period. It is perhaps more accurate to speak of the understanding of and the feeling for such data which grew up on the basis of the existing physical ideas on the fission of the nucleus, radiative capture and neutron scattering. Experimental data were very scanty but for that reason they were particularly valuable.

  14. Can Experimental Scientists, Data Evaluators and Compilers, and Nuclear Data Users Understand One Another?

    International Nuclear Information System (INIS)

    Usachev, L.N.

    1966-01-01

    The International Atomic Energy Agency organizes conferences on a wide variety of scientific subjects, all of which are of fundamental importance for the development of nuclear power. These include the technology of fuel elements, their stability in neutron fields, and chemical reprocessing as well as reactor physics, mathematical computational methods and the problems of protection and dosimetry. The problem of microscopic nuclear data, an essential aspect of reactor work, is just one of these many subjects. On the other hand, it should be remembered that the possibility of releasing nuclear energy was established in the first place by obtaining nuclear data on the fission process occurring in the uranium nucleus following the capture of a neutron and on the escape of the 2-3 secondary fission neutrons. In early nuclear power work the information provided by nuclear data was of considerable, even of decisive, importance. For example, the information available on the neutron balance in fast reactors showed that such reactors could operate as breeders and thus that it was worth while developing them. Strictly speaking, it is of course difficult to speak of a knowledge of nuclear data at this early period. It is perhaps more accurate to speak of the understanding of and the feeling for such data which grew up on the basis of the existing physical ideas on the fission of the nucleus, radiative capture and neutron scattering. Experimental data were very scanty but for that reason they were particularly valuable

  15. Numerical performance and throughput benchmark for electronic structure calculations in PC-Linux systems with new architectures, updated compilers, and libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Hwang, Jenn-Kang; Tang, Chuan Yi; Yu, Chin-Hui

    2004-01-01

    A number of recently released numerical libraries, including the Automatically Tuned Linear Algebra Software (ATLAS) library, the Intel Math Kernel Library (MKL), the GOTO numerical library, and the AMD Core Math Library (ACML) for AMD Opteron processors, are linked against the executables of the Gaussian 98 electronic structure calculation package, which is compiled with updated versions of Fortran compilers such as the Intel Fortran compiler (ifc/efc) 7.1 and the PGI Fortran compiler (pgf77/pgf90) 5.0. The ifc 7.1 delivers about a 3% improvement on 32-bit machines compared with the former version 6.0. The performance improvement from pgf77 3.3 to 5.0 is also around 3% when the original, unmodified optimization options enclosed in the software are used. Nevertheless, if extensive compiler tuning options are used, the speed can be further increased by about 25%. The performances of these fully optimized numerical libraries are similar. The double-precision floating-point (FP) instruction set (SSE2) is also functional on AMD Opteron processors operated in 32-bit compilation, and the Intel Fortran compiler performs the better optimization. Hardware-level tuning can improve memory bandwidth by adjusting the DRAM timing, and efficiency in the CL2 mode is a further 2.6% higher than in the CL2.5 mode. The FP throughput is measured by simultaneous execution of two identical copies of each of the test jobs. The resulting performance impact suggests that the IA64 and AMD64 architectures are able to deliver significantly higher throughput than IA32, which is consistent with the SpecFPrate2000 benchmarks.

  16. JLAPACK – Compiling LAPACK FORTRAN to Java

    Directory of Open Access Journals (Sweden)

    David M. Doolin

    1999-01-01

    The JLAPACK project provides the LAPACK numerical subroutines translated from their subset Fortran 77 source into class files, executable by the Java Virtual Machine (JVM) and suitable for use by Java programmers. This makes it possible for Java applications or applets, distributed on the World Wide Web (WWW), to use established legacy numerical code that was originally written in Fortran. The translation is accomplished using a special-purpose Fortran-to-Java (source-to-source) compiler. The LAPACK API will be considerably simplified to take advantage of Java's object-oriented design. This report describes the research issues involved in the JLAPACK project, and its current implementation and status.

  17. Compilation of anatomical, physiological and metabolic characteristics for a Reference Asian Man. Volume 1: data summary and conclusions. Results of a co-ordinated research programme 1988-1993

    International Nuclear Information System (INIS)

    1998-02-01

    The Co-ordinated Research Programme (CRP) on Compilation of Anatomical, Physiological and Metabolic Characteristics for a Reference Asian Man has been conducted as a programme of the IAEA Regional Co-operative Agreement (RCA) for Asia and the Pacific. The CRP was conducted to provide data for radiation protection purposes that is relevant to the biokinetic and dosimetric characteristics of the ethnic populations in the Asian region. The radiological protection decisions that had to be made in the RCA member States following the Chernobyl accident were a significant motivation for establishing the CRP. Eleven RCA Member States participated in the CRP. Research co-ordination meetings (RCMs) for the CRP were held in Mito City, Japan, 17-21 October 1988 and Bhabha Atomic Research Centre, India, 8-12 April 1991. The concluding meeting was held in Tianjin, China, 25-29 October 1993. This publication is divided into two volumes: Volume 1 contains a summary of the data and conclusions from the project and Volume 2 the reports from participating countries

  18. Compilation of anatomical, physiological and metabolic characteristics for a Reference Asian Man. Volume 1: data summary and conclusions. Results of a co-ordinated research programme 1988-1993

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-02-01

    The Co-ordinated Research Programme (CRP) on Compilation of Anatomical, Physiological and Metabolic Characteristics for a Reference Asian Man has been conducted as a programme of the IAEA Regional Co-operative Agreement (RCA) for Asia and the Pacific. The CRP was conducted to provide data for radiation protection purposes that is relevant to the biokinetic and dosimetric characteristics of the ethnic populations in the Asian region. The radiological protection decisions that had to be made in the RCA member States following the Chernobyl accident were a significant motivation for establishing the CRP. Eleven RCA Member States participated in the CRP. Research co-ordination meetings (RCMs) for the CRP were held in Mito City, Japan, 17-21 October 1988 and Bhabha Atomic Research Centre, India, 8-12 April 1991. The concluding meeting was held in Tianjin, China, 25-29 October 1993. This publication is divided into two volumes: Volume 1 contains a summary of the data and conclusions from the project and Volume 2 the reports from participating countries. Refs, figs, tabs.

  19. Statistical Compilation of the ICT Sector and Policy Analysis | Page 2 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... to widen and deepen, so too does its impact on economic development. ... The outcomes of such efforts will subsequently inform policy discourse and ... Studies. Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia ... Asian outlook: New growth dependent on new productivity.

  20. Herbal hepatotoxicity: a tabular compilation of reported cases.

    Science.gov (United States)

    Teschke, Rolf; Wolff, Albrecht; Frenzel, Christian; Schulze, Johannes; Eickhoff, Axel

    2012-11-01

    Herbal hepatotoxicity is a field that has grown rapidly over the last few years along with the increased use of herbal products worldwide. To summarize the various facets of this disease, we undertook a literature search for herbs, herbal drugs and herbal supplements with reported cases of herbal hepatotoxicity. A selective literature search was performed to identify published case reports, spontaneous case reports, case series and review articles regarding herbal hepatotoxicity. A total of 185 publications were identified and the results compiled. They show 60 different herbs, herbal drugs and herbal supplements with reported potential hepatotoxicity; additional information, including synonyms of individual herbs, botanical names and cross-references, is provided. If known, details are presented for specific ingredients and chemicals in herbal products, and for references with authors that can be matched to each herbal product and to its effect on the liver. Based on stringent causality assessment methods and/or positive re-exposure tests, causality was highly probable or probable for Ayurvedic herbs, Chaparral, Chinese herbal mixture, Germander, Greater Celandine, green tea, a few Herbalife products, Jin Bu Huan, Kava, Ma Huang, Mistletoe, Senna, Syo Saiko To and Venencapsan(®). In many other publications, however, causality was not properly evaluated by a liver-specific causality assessment method validated for hepatotoxicity, such as the scale of CIOMS (Council for International Organizations of Medical Sciences). This compilation presents details of herbal hepatotoxicity, thereby assisting the clinical assessments of involved physicians in the future. © 2012 John Wiley & Sons A/S.

  1. Guide to Good Practice in using Open Source Compilers with the AGCC Lexical Analyzer

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Quality software always demands a compromise between users' needs and hardware resources. Being faster may mean expensive devices such as powerful processors and virtually unlimited amounts of RAM, or it may mean re-engineering the code to adapt the software to the client's hardware architecture. This is the purpose of code optimization: getting the utmost performance from a program under given conditions. There are tools for designing and writing code, but the ultimate optimization tool remains the modest compiler, an often neglected software jewel that is the result of hundreds of working hours by some of the best specialists in the world. Even so, only two compilers fulfil the needs of professional developers: a proprietary solution from a giant of the IT industry, and the open-source GNU compiler, for which we developed the AGCC lexical analyzer that helps produce even more efficient software applications. It relies on popular hacks and tricks used by professionals and discovered by the authors, who present them below.

  2. Computer and compiler effects on code results: status report

    International Nuclear Information System (INIS)

    1996-01-01

    Within the framework of the international effort on the assessment of computer codes, which are designed to describe the overall reactor coolant system (RCS) thermalhydraulic response, core damage progression, and fission product release and transport during severe accidents, there has been a continuous debate as to whether the code results are influenced by different code users or by different computers or compilers. The first aspect, the 'Code User Effect', has been investigated already. In this paper the other aspects are discussed, and proposals are given for making large system codes insensitive to different computers and compilers. Hardware errors and memory problems are not considered in this report. The codes investigated herein are integrated code systems (e.g. ESTER, MELCOR) and thermalhydraulic system codes with extensions for severe accident simulation (e.g. SCDAP/RELAP, ICARE/CATHARE, ATHLET-CD), and codes to simulate fission product transport (e.g. TRAPMELT, SOPHAEROS). Since all of these codes are programmed in Fortran 77, the discussion herein is based on this programming language, although some remarks are made about Fortran 90. Some observations of different code results obtained on different computers are reported, and possible reasons for this unexpected behaviour are listed. Methods are then discussed for avoiding portability problems
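
    One classic source of computer- and compiler-dependent results is floating-point non-associativity: optimizers may reorder a reduction, and different orders round differently. A short Python demonstration of the effect itself (Python will not reorder on its own, so the two orders are written out explicitly):

      import math

      values = [1e16, 1.0, -1e16] * 1000 + [1.0]

      s_forward = 0.0
      for v in values:          # one accumulation order
          s_forward += v

      s_sorted = 0.0
      for v in sorted(values):  # another order, as a reordering compiler might use
          s_sorted += v

      # Expect three different answers; math.fsum gives the correctly rounded sum.
      print(s_forward, s_sorted, math.fsum(values))  # 1.0 0.0 1001.0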

  3. Compilation of climate data from heterogeneous networks across the Hawaiian Islands

    Science.gov (United States)

    Longman, Ryan J.; Giambelluca, Thomas W.; Nullet, Michael A.; Frazier, Abby G.; Kodama, Kevin; Crausbay, Shelley D.; Krushelnycky, Paul D.; Cordell, Susan; Clark, Martyn P.; Newman, Andy J.; Arnold, Jeffrey R.

    2018-02-01

    Long-term, accurate observations of atmospheric phenomena are essential for a myriad of applications, including historic and future climate assessments, resource management, and infrastructure planning. In Hawai'i, climate data are available from individual researchers, local, State, and Federal agencies, and from large electronic repositories such as the National Centers for Environmental Information (NCEI). Researchers attempting to make use of available data are faced with a series of challenges that include: (1) identifying potential data sources; (2) acquiring data; (3) establishing data quality assurance and quality control (QA/QC) protocols; and (4) implementing robust gap filling techniques. This paper addresses these challenges by providing: (1) a summary of the available climate data in Hawai'i including a detailed description of the various meteorological observation networks and data accessibility, and (2) a quality controlled meteorological dataset across the Hawaiian Islands for the 25-year period 1990-2014. The dataset draws on observations from 471 climate stations and includes rainfall, maximum and minimum surface air temperature, relative humidity, wind speed, downward shortwave and longwave radiation data.
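
    Of the four challenges listed, QA/QC and gap filling are the most mechanical. A compressed pandas sketch of the kind of range check and short-gap interpolation such a compilation applies (thresholds and column names are illustrative, not the paper's exact protocol):

      import numpy as np
      import pandas as pd

      # Hourly air temperature with a spike and a short gap.
      idx = pd.date_range("2014-01-01", periods=8, freq="h")
      t = pd.Series([21.1, 21.3, 55.0, 21.0, np.nan, np.nan, 20.7, 20.6], index=idx)

      # QA/QC: flag values outside a plausible climatological range.
      t = t.where(t.between(-5, 40))            # the 55.0 spike becomes NaN

      # Gap filling: linear-in-time interpolation, but only across short gaps.
      filled = t.interpolate(method="time", limit=3)
      print(filled)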

  4. Northern Oklahoma Gravity Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (710 records) were compiled by Professor Ahern. This data base was received in June 1992. Principal gravity parameters include latitude,...

  5. Computer programmes development for environment variables setting for use with C compiler

    International Nuclear Information System (INIS)

    Sriyotha, P.; Prasertchiewcharn, N.; Yamkate, P.

    1994-01-01

    Compilers generally need special environment variables that the operating system does not provide at start-up. Environment variables such as COMSPEC, PATH, TMP, LIB and INCLUDE can be used to exchange data among programmes. These variables occupy memory, and in some cases an 'Out of Environment Space' error occurs when the user sets a new variable. Rather than give up when one variable has grown too large, or destroy all environment variables, the programmes described here save the old environment setting, clear it, and set a new one; later, the new setting is cleared and the old one is restored from the saved copy
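
    The save/clear/set/restore cycle maps naturally onto a context manager. A minimal Python sketch of the same idea (the original programmes targeted DOS batch environments; this is a generic re-expression, and the paths shown are invented):

      import os
      from contextlib import contextmanager

      @contextmanager
      def scoped_environment(new_vars):
          """Save the current environment, replace it, and restore it on exit."""
          saved = dict(os.environ)          # save the old setting
          try:
              os.environ.clear()            # clear it ...
              os.environ.update(new_vars)   # ... and set the new one
              yield
          finally:
              os.environ.clear()            # clear the new setting
              os.environ.update(saved)      # restore the old one

      with scoped_environment({"INCLUDE": r"C:\TC\INCLUDE", "LIB": r"C:\TC\LIB"}):
          print(os.environ["INCLUDE"])      # the compiler would run here
      print("restored:", "INCLUDE" in os.environ)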

  6. Compiling Planning into Quantum Optimization Problems: A Comparative Study

    Science.gov (United States)

    2015-06-07

    ...to SAT, and then reduces higher-order terms to quadratic terms through a series of gadgets. Our mappings allow both positive and negative preconditions... and likely benefits from a homogeneous parameter setting (Venturelli et al. 2014), as it generates a...
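
    The quadratization gadget the fragment alludes to is standard: a cubic term x·y·z is reduced by introducing an auxiliary binary variable w intended to equal x·y, enforced by the penalty xy − 2xw − 2yw + 3w, which is zero when w = xy and at least one otherwise. A short Python check of that penalty (this is the textbook Rosenberg reduction, not necessarily the paper's exact gadget set):

      from itertools import product

      def penalty(x, y, w):
          # 0 iff w == x*y, >= 1 otherwise, so a large multiplier enforces w = x*y.
          return x*y - 2*x*w - 2*y*w + 3*w

      for x, y, w in product((0, 1), repeat=3):
          ok = (w == x * y)
          print(x, y, w, penalty(x, y, w), "consistent" if ok else "penalized")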

  7. Towards droplet size-aware biochemical application compilation for AM-EWOD biochips

    DEFF Research Database (Denmark)

    Pop, Paul; Alistar, Mirela

    2015-01-01

    a droplet size-aware compilation by proposing a routing algorithm that considers the droplet size. Our routing algorithm is developed for a novel digital microfluidic biochip architecture based on Active Matrix Electrowetting on Dielectric, which uses a thin film transistor array for the electrodes. We also...

  8. 27 CFR 478.24 - Compilation of State laws and published ordinances.

    Science.gov (United States)

    2010-04-01

    (a) The Director shall annually revise and furnish Federal firearms licensees with a compilation of State laws and published ordinances. The Director annually revises the compilation and publishes it as “State Laws and Published Ordinances—Firearms” (27 CFR 478.24, Alcohol, Tobacco Products, and Firearms).

  9. Statistical Compilation of the ICT Sector and Policy Analysis

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  11. A compilation of information on the {sup 31}P(p,{alpha}){sup 28}Si reaction and properties of excited levels in the compound nucleus {sup 32}S

    Energy Technology Data Exchange (ETDEWEB)

    Miller, R.E.; Smith, D.L. [Argonne National Lab., IL (United States). Technology Development Div.]

    1997-11-01

    This report documents a survey of the literature, and provides a compilation of data contained therein, for the {sup 31}P(p,{alpha}){sup 28}Si reaction. Attention is paid here to resonance states in the compound-nuclear system {sup 32}S formed by {sup 31}P + p, with emphasis on the alpha-particle decay channels, {sup 28}Si + {alpha} which populate specific levels in {sup 28}Si. The energy region near the proton separation energy for {sup 32}S is especially important in this context for applications in nuclear astrophysics. Properties of the excited states in {sup 28}Si are also considered. Summaries of all the located references are provided and numerical data contained in them are compiled in EXFOR format where applicable.

  12. Site locality identification study: Hanford Site. Volume II. Data cataloging

    International Nuclear Information System (INIS)

    1980-07-01

    Data compilation and cataloging for the candidate site locality identification study were conducted in order to provide a retrievable data cataloging system for the present siting study and future site evaluation and licensing processes. This task occurred concurrently with, and also independently of, other tasks of the candidate site locality identification study. Work in this task provided the data utilized primarily in the development and application of screening and ranking processes to identify candidate site localities on the Hanford Site. The overall approach included two steps: (1) data acquisition and screening; and (2) data compilation and cataloging. Data acquisition and screening formed the basis for preliminary review of data sources with respect to their probable utilization in the candidate site locality identification study, and with respect to the level of completeness and detail of the data. The important working assumption was that the data used in the study be based on existing and available published and unpublished literature. Data compilation and cataloging provided the basic product of the task: a retrievable data cataloging system in the form of an annotated reference list, a key word index, and an index of compiled data. The annotated reference list and key word index are cross-referenced and can be used to trace and retrieve the data sources utilized in the candidate site locality identification study.

  13. Idaho State Gravity Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (24,284 records) were compiled by the U. S. Geological Survey. This data base was received on February 23, 1993. Principal gravity...

  14. DJ Prinsloo and BP Sathekge (compilers — revised edition).

    African Journals Online (AJOL)

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the ...

  15. Technique to increase performance of C-program for control systems. Compiler technique for low-cost CPU; Seigyoyo C gengo program no kosokuka gijutsu. Tei cost CPU no tame no gengo compiler gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Y [Mazda Motor Corp., Hiroshima (Japan)

    1997-10-01

    The software of automotive control systems has become increasingly large and complex. High-level languages (primarily C) and their compilers have therefore become more important for reducing coding time. Most compilers represent real numbers in the floating-point format specified by IEEE Standard 754. Because of cost requirements, most microprocessors in the automotive industry have no hardware support for IEEE 754 arithmetic, resulting in slow execution speed and large code size. Alternative formats to increase execution speed and reduce code size are proposed. Experimental results for the alternative formats show the improvement in execution speed and code size (a fixed-point sketch of the general idea follows). 4 refs., 3 figs., 2 tabs.
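
    The record does not identify the alternative formats the paper proposes; a common choice on FPU-less, low-cost microprocessors is fixed point, sketched here in C purely as an illustration (the Q16.16 layout is an assumption, not the paper's format):

        /* Illustrative Q16.16 fixed point: 16 integer bits, 16
         * fractional bits, so value = raw / 65536. All arithmetic is
         * plain integer work, so no floating-point hardware or
         * emulation library is needed. */
        #include <stdint.h>
        #include <stdio.h>

        typedef int32_t fix16_16;
        #define FIX_ONE (1 << 16)

        static fix16_16 fix_from_double(double d)
        {
            return (fix16_16)(d * FIX_ONE);
        }

        static double fix_to_double(fix16_16 f)
        {
            return (double)f / FIX_ONE;
        }

        /* Widen to 64 bits for the product, then shift back. */
        static fix16_16 fix_mul(fix16_16 a, fix16_16 b)
        {
            return (fix16_16)(((int64_t)a * b) >> 16);
        }

        int main(void)
        {
            fix16_16 gain   = fix_from_double(1.5);
            fix16_16 sensor = fix_from_double(20.25);
            printf("%f\n", fix_to_double(fix_mul(gain, sensor)));
            /* prints 30.375000 */
            return 0;
        }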

  16. Truly nested data-parallelism: compiling SaC for the Microgrid architecture

    NARCIS (Netherlands)

    Herhut, S.; Joslin, C.; Scholz, S.-B.; Grelck, C.; Morazan, M.

    2009-01-01

    Data-parallel programming facilitates elegant specification of concurrency. However, the composability of data-parallel operations so far has been constrained by the requirement to have only flat data-parallel operations at runtime. In this paper, we present early results on our work to exploit ...

  17. Supplementary material on passive solar heating concepts. A compilation of published articles

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-05-01

    A compilation of published articles and reports dealing with passive solar energy concepts for heating and cooling buildings is presented. The following are included: fundamentals of passive systems, applications and technical analysis, graphic tools, and information sources. (MHR)

  18. National Geothermal Data System State Contributions by Data Type (Appendix A1-b)

    Energy Technology Data Exchange (ETDEWEB)

    Love, Diane [Executive Office of the State of Arizona (Arizona Geological Survey)

    2015-12-20

    Multipage spreadsheet listing an inventory of data submissions to the State Contributions to the National Geothermal Data System project, by service, by state, and by metadata compilations, metadata, and map count, including a summary of the information.

  19. Andes 1997 Gravity Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Central Andes gravity data (6,151 records) were compiled by Professor Gotze and the MIGRA Group. This data base was received in April, 1997. Principal gravity...

  20. The Athena data dictionary and description language

    International Nuclear Information System (INIS)

    Bazan, A.; Ghez, P.; Le Flour, T.; Lieunard, S.; Tull, C.

    2001-01-01

    The authors have developed a data object description tool suite and service for Athena consisting of: a language grammar based upon an extended proper subset of IDL 2.0; a compiler front end based upon this language grammar, JavaCC, and a Java Reflection API-like interface; and several compiler back ends which meet specific needs in ATLAS, such as automatic generation of object converters and data object scripting interfaces. The authors present here details of the work and experience to date on the Athena Definition Language and the Athena Data Dictionary.