WorldWideScience

Sample records for data compilation

  1. Compilation of solar abundance data

    International Nuclear Information System (INIS)

    Hauge, Oe.; Engvold, O.

    1977-01-01

    Interest in the previous compilations of solar abundance data by the same authors (ITA--31 and ITA--39) has led to this third, revised edition. Solar abundance data of 67 elements are tabulated and in addition upper limits for the abundances of 5 elements are listed. References are made to 167 papers. A recommended abundance value is given for each element. (JIW)

  2. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  3. Compilation of TFTR materials data

    International Nuclear Information System (INIS)

    Havener, W.J.

    1975-12-01

    In order to document the key thermophysical property data used in the conceptual design of Tokamak Fusion Test Reactor (TFTR) systems and components, a series of data packages has been prepared. It is expected that data for additional materials will be added and the information already provided will be updated to provide a project-wide data base

  4. Lye-data-compiled-scihub

    Data.gov (United States)

    U.S. Environmental Protection Agency — The data contained in this worksheet provides the quantitative detection of potentially pathogenic fungi in treated and untreated rainwater samples. This dataset is...

  5. Compilation of carbon-14 data

    International Nuclear Information System (INIS)

    Paasch, R.A.

    1985-01-01

    A review and critical analysis was made of the original sources of carbon-14 in the graphite moderator and reflector zones of the eight Hanford production reactors, the present physical and chemical state of the carbon-14, pathways (other than direct combustion) by which the carbon-14 could be released to the biosphere, and the maximum rate at which it might be released under circumstances which idealistically favor the release. Areas of uncertainty are noted and recommendations are made for obtaining additional data in three areas: (1) release rate of carbon-14 from irradiated graphite saturated with aerated water; (2) characterization of carbon-14 deposited outside the moderator and reflector zones; and (3) corrosion/release rate of carbon-14 from irradiated steel and aluminum alloys

  6. Asian collaboration on nuclear reaction data compilation

    International Nuclear Information System (INIS)

    Aikawa, Masayuki; Furutachi, Naoya; Kato, Kiyoshi; Makinaga, Ayano; Devi, Vidya; Ichinkhorloo, Dagvadorj; Odsuren, Myagmarjav; Tsubakihara, Kohsuke; Katayama, Toshiyuki; Otuka, Naohiko

    2013-01-01

    Nuclear reaction data are essential for research and development in nuclear engineering, radiation therapy, nuclear physics and astrophysics. Experimental data must be compiled in a database and be accessible to nuclear data users. One such nuclear reaction database is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC) under the auspices of the International Atomic Energy Agency. Recently, collaboration among the Asian NRDC members has been further developed with the support of the Asia-Africa Science Platform Program of the Japan Society for the Promotion of Science. We report on three years of activity to develop the Asian collaboration on nuclear reaction data compilation. (author)

  7. Nuclear power plant operational data compilation system

    International Nuclear Information System (INIS)

    Silberberg, S.

    1980-01-01

    Electricite de France R and D Division has set up a nuclear power plant operational data compilation system. This data bank, created from American documents, provides results on plant operation and on the behaviour of operating materials. At present, French units in commercial operation are taken into account. Results obtained after five years of data bank operation are given. (author)

  8. Compilation of data on elementary particles

    International Nuclear Information System (INIS)

    Trippe, T.G.

    1984-09-01

    The most widely used data compilation in the field of elementary particle physics is the Review of Particle Properties. The origin, development and current state of this compilation are described with emphasis on the features which have contributed to its success: active involvement of particle physicists; critical evaluation and review of the data; completeness of coverage; regular distribution of reliable summaries including a pocket edition; heavy involvement of expert consultants; and international collaboration. The current state of the Review and new developments such as providing interactive access to the Review's database are described. Problems and solutions related to maintaining a strong and supportive relationship between compilation groups and the researchers who produce and use the data are discussed

  9. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1981-03-01

    A request list for nuclear data which was produced from a computerized data file by the National Nuclear Data Center is presented. The request list is given by target nucleus (isotope) and then reaction type. The purpose of the compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. Requesters are identified by laboratory, last name, and sponsoring US government agency

  10. Compilation of data from hadronic atoms

    International Nuclear Information System (INIS)

    Poth, H.

    1979-01-01

    This compilation is a survey of the existing data on hadronic atoms (pionic atoms, kaonic atoms, antiprotonic atoms, sigmonic atoms). It collects measurements of the energies, intensities and line widths of X-rays from hadronic atoms. Averaged values for each hadronic atom are given and the data are summarized. The listing contains data on 58 pionic atoms, 54 kaonic atoms, 23 antiprotonic atoms and 20 sigmonic atoms. (orig./HB)

  11. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-10-01

    This is the third issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section approximately every six months. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations

  12. Compilation of actinide neutron nuclear data

    International Nuclear Information System (INIS)

    1979-01-01

    The Swedish nuclear data committee has compiled a selected set of neutron cross section data for the 16 most important actinide isotopes. The aim of the report is to present the available data in a comprehensible way, to allow a comparison between different evaluated libraries and to judge the reliability of these libraries against the experimental data. The data are given in graphical form below about 1 eV and above about 10 keV, while the 2200 m/s cross sections and resonance integrals are given in numerical form. (G.B.)
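
    For orientation, the two quantities tabulated in numerical form are conventionally defined as follows; these are standard textbook definitions, not taken from the report itself, and the report may use slightly different cut-off energies:

        \sigma_{2200} = \sigma(E_0), \qquad E_0 = 0.0253\ \mathrm{eV} \quad (v_0 = 2200\ \mathrm{m/s})

        I_{\mathrm{res}} = \int_{E_{\mathrm{Cd}}}^{\infty} \sigma(E)\,\frac{\mathrm{d}E}{E}, \qquad E_{\mathrm{Cd}} \approx 0.5\ \mathrm{eV}

    Here the 2200 m/s cross section is the value at the thermal reference energy 0.0253 eV, and the resonance integral weights the cross section by 1/E above the cadmium cut-off.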

  13. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This report presents a compilation of both fracture properties and hydrogeological parameters relevant to the flow of groundwater in fractured rock systems. Methods of data acquisition as well as the scale of and conditions during the measurement are recorded. Measurements and analytical techniques for each of the parameters under consideration have been reviewed with respect to their methodology, assumptions and accuracy. Both the rock type and geologic setting associated with these measurements have also been recorded. 373 refs

  14. Data compilation for particle impact desorption

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeuchi, Fujio.

    1984-05-01

    The desorption of gases from solid surfaces by incident electrons, ions and photons is one of the important processes of hydrogen recycling in the controlled thermonuclear reactors. We have surveyed the literature concerning the particle impact desorption published through 1983 and compiled the data on the desorption cross sections and desorption yields with the aid of a computer. This report presents the results obtained for electron stimulated desorption, the desorption cross sections and yields being given in graphs and tables as functions of incident electron energy, surface temperature and gas exposure. (author)

  15. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    Weston, L.W.; Larson, D.C.

    1993-02-01

    This compilation represents the current needs for nuclear data measurements and evaluations as expressed by interested fission and fusion reactor designers, medical users of nuclear data, nuclear data evaluators, CSEWG members and other interested parties. The requests and justifications are reviewed by the Data Request and Status Subcommittee of CSEWG as well as most of the general CSEWG membership. The basic format and computer programs for the Request List were produced by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory, which produced the Request List for many years. The Request List is compiled from a computerized data file. Each request has a unique isotope, reaction type, requestor and identifying number; the first two digits of the identifying number are the year in which the request was initiated. Every effort has been made to restrict the notations to those used in common nuclear physics textbooks. Most requests are for individual isotopes, as are most ENDF evaluations; however, there are some requests for elemental measurements. Each request gives a priority rating, which will be discussed in Section 2, the neutron energy range for which the request is made, the accuracy requested in terms of one standard deviation, and the requested energy resolution in terms of one standard deviation. Also given is the requestor, with the comments that were furnished with the request. The addresses and telephone numbers of the requestors are given in Appendix 1. ENDF evaluators who may be contacted concerning evaluations are listed in Appendix 2. Experimentalists contemplating one of the requested measurements are encouraged to contact both the requestor and the evaluator, who may provide valuable information. This is a working document in that it will change with time. New requests or comments may be submitted to the editors or a regular CSEWG member at any time
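
    As a rough illustration of the record structure described above (a unique isotope, reaction type and requestor, an identifying number whose first two digits encode the year, plus priority, energy range, 1-sigma accuracy and resolution), a minimal Python sketch follows; the field and method names are hypothetical and do not reproduce the actual NNDC file layout.

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class NuclearDataRequest:
            """Illustrative request-list record; field names are hypothetical."""
            request_id: str       # e.g. "91-042"; first two digits give the year the request was initiated
            isotope: str          # target nucleus, e.g. "Pu-239"
            quantity: str         # reaction type, e.g. "(n,f)"
            requestor: str        # laboratory, last name and sponsoring agency
            priority: int         # priority rating (discussed in Section 2 of the report)
            energy_min_ev: float  # lower bound of the requested neutron energy range
            energy_max_ev: float  # upper bound of the requested neutron energy range
            accuracy_pct_1sigma: float                     # requested accuracy, one standard deviation, percent
            resolution_pct_1sigma: Optional[float] = None  # requested energy resolution, if specified
            comments: str = ""    # free-text justification supplied with the request

            @property
            def year_initiated(self) -> int:
                """Decode the year from the first two digits of the identifying number."""
                yy = int(self.request_id.split("-")[0])
                return 1900 + yy if yy >= 50 else 2000 + yy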

  16. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1983-01-01

    The purpose of this compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. It is the result of a biennial review in which the Department of Energy (DOE) and contractors, Department of Defense laboratories and contractors, and other interested groups have been asked to review and revise their requests for nuclear data. It was felt that the evaluators of cross section data and the users of these evaluations should be involved in the review of the data requests to make this compilation more useful. This request list is ordered by target nucleus (Isotope) and then reaction type (Quantity). Each request is assigned a unique identifying number; the first two digits of this number give the year the request was initiated. All requests for a given Isotope and Quantity are grouped (or blocked) together, and the requests in a block are followed by any status comments. Each request has a unique Isotope, Quantity and Requester. The requester is identified by laboratory, last name, and sponsoring US government agency, e.g., BET, DEI, DNR. All requesters, together with their addresses and phone numbers, are given in appendix B. A list of the evaluators responsible for ENDF/B-V evaluations, with their affiliations, appears in appendix C. All requests must give the energy (or range of energy) of the incident particle when appropriate. The accuracy needed, in percent, is also given. The error quoted is assumed to be 1-sigma at each measured point in the energy range requested unless a comment specifies otherwise. Sometimes a range of accuracy, indicated by two values, is given, or a statement is made in the free-text comments. An incident-particle energy resolution in percent is sometimes given

  17. Compilation of data for radionuclide transport analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and the other scenarios selected. The data required by the selected near-field, geosphere and biosphere models are given, and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate-level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed, and the uncertainty is qualitatively addressed for the data input to the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic, in order not to underestimate the radionuclide release rates. It is judged that this approach, combined with the selected calculation cases, will illustrate the effects of uncertainties in the processes and events that affect the evolution of the system, as well as in the quantitative data that describe it. The biosphere model allows for probabilistic calculations, and the uncertainty in input data is quantified by giving minimum, maximum and mean values as well as the type of probability distribution function.

  18. Compilation of data for radionuclide transport analysis

    International Nuclear Information System (INIS)

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and the other scenarios selected. The data required by the selected near-field, geosphere and biosphere models are given, and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate-level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed, and the uncertainty is qualitatively addressed for the data input to the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic, in order not to underestimate the radionuclide release rates. It is judged that this approach, combined with the selected calculation cases, will illustrate the effects of uncertainties in the processes and events that affect the evolution of the system, as well as in the quantitative data that describe it. The biosphere model allows for probabilistic calculations, and the uncertainty in input data is quantified by giving minimum, maximum and mean values as well as the type of probability distribution function

  19. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-03-01

    This is the second issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations. It excludes references to 'mass-chain' evaluations normally published in 'Nuclear Data Sheets' and 'Nuclear Physics'. The material contained in this compilation is sorted according to eight subject categories: general compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes: half-lives, energies and spectra; nuclear decay processes: gamma-rays; nuclear decay processes: fission products; nuclear decay processes: (others); atomic processes

  20. Compilation of kinetic data for geochemical calculations

    International Nuclear Information System (INIS)

    Arthur, R.C.; Savage, D.; Sasamoto, Hiroshi; Shibata, Masahiro; Yui, Mikazu

    2000-01-01

    Kinetic data, including rate constants, reaction orders and activation energies, are compiled for 34 hydrolysis reactions involving feldspars, sheet silicates, zeolites, oxides, pyroxenes and amphiboles, and for similar reactions involving calcite and pyrite. The data are compatible with a rate law consistent with surface reaction control and transition-state theory, which is incorporated in the geochemical software packages EQ3/6 and GWB. Kinetic data for the reactions noted above are strictly compatible with the transition-state rate law only under far-from-equilibrium conditions. It is possible that the data are conceptually consistent with this rate law under both far-from-equilibrium and near-to-equilibrium conditions, but this should be confirmed whenever possible through analysis of original experimental results. Due to limitations in the availability of kinetic data for mineral-water reactions, and in order to simplify evaluations of geochemical models of groundwater evolution, it is convenient to assume local equilibrium in such models whenever possible. To assess whether this assumption is reasonable, a modeling approach accounting for coupled fluid flow and water-rock interaction is described that can be used to estimate the spatial and temporal scales of local equilibrium. The approach is demonstrated for conditions involving groundwater flow in fractures at JNC's Kamaishi in-situ test site, and is also used to estimate the travel time necessary for oxidizing surface waters to migrate to the level of a HLW repository in crystalline rock. The question of whether local equilibrium is a reasonable assumption must be addressed using an appropriate modeling approach. For the modeling approach noted above to be appropriate for conditions at the Kamaishi site, the fracture fill must closely approximate a porous medium, groundwater flow must be purely advective, and diffusion of solutes across the fracture-host rock boundary must not occur. Moreover, the mineralogical and
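
    For orientation, a commonly quoted general form of the transition-state-theory rate law referred to above (of the kind implemented in codes such as EQ3/6) is sketched here; the symbols and exponents are generic, and the specific form adopted in the report should be taken from the report itself:

        r = k\,S\,\prod_i a_i^{n_i}\left[1 - \left(\frac{Q}{K}\right)^{p}\right]^{q}

    where r is the reaction rate, k the rate constant, S the reactive surface area, a_i the activities of catalysing or inhibiting species with empirical orders n_i, Q the ion activity product, K the equilibrium constant, and p and q empirical exponents (often taken as 1). Far from equilibrium, Q/K tends to zero and the bracketed affinity term approaches unity, which is why rate constants determined under far-from-equilibrium conditions are strictly valid only in that limit.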

  1. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-10-01

    This is the fourth issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section every year. The material contained in this compilation is sorted according to eight subject categories: General compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes, half-lives, energies and spectra; nuclear decay processes, gamma-rays; nuclear decay processes, fission products; nuclear decay processes (others); atomic processes

  2. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  3. 1976 compilation of national nuclear data committees

    International Nuclear Information System (INIS)

    1977-01-01

    This list of currently existing National Nuclear Data Committees, and their memberships, is published with the object of promoting interaction and enhancing awareness of nuclear data activities in IAEA Member States. The following Member States have indicated the existence of a nuclear data committee in their countries: Bangladesh, Bolivia, Bulgaria, France, Hungary, India, Japan, Romania, Sweden, USSR, United Kingdom, USA, Yugoslavia

  4. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    Rose, P.F.; Daly, A.

    1987-01-01

    This request list summarizes the current needs of the US nuclear energy programs and other applied technologies for experimentally measured nuclear data. The request list is ordered by target nucleus (isotope) and then reaction type (quantity). An attempt has been made to describe each quantity in standard notation. An appendix contains a glossary of the symbols used, with a short explanatory text. Because of the changing and continuing character of the need for data request information, as well as the probability that current measurements may satisfy a portion of the requests, this report is to be regarded as a working document. In fact, it is maintained as a data base by the National Nuclear Data Center. Procedures for submitting data requests, priority assignments, and the DOE/NDC Committee membership are included

  5. Source list of nuclear data bibliographies, compilations, and evaluations

    International Nuclear Information System (INIS)

    Burrows, T.W.; Holden, N.E.

    1978-10-01

    To aid the user of nuclear data, many specialized bibliographies, compilations, and evaluations have been published. This document is an attempt to bring together a list of such publications with an indication of their availability and cost

  6. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This volume contains the storage coefficient, porosity, compressibility and fracture data for the research sites discussed in Volume 1 which have been studied in sufficient detail to allow for analysis. These sites are the following: Stripa Mine, Sweden; Finnsjon, Kamlunge, Fjallveden, Gidea, Svartboberget, Sweden; Olkiluoto, Loviisa, Lavia, Finland; Climax Granite Nevada Test Site; OCRD Room, Colorado School of Mines; Savannah River Plant, Aiken, South Carolina; Oracle, Arizona; Basalt Waste Isolation Project (BWIP), Hanford, Washington; Underground Research Laboratory, AECL, Canada; Atikokan Research Area, AECL; Chalk River Research Area, AECL; Whiteshell Research Area, AECL. Other sources of information have been included where sufficient site specific geologic and hydrogeologic information is provided. The fracture data for the first three of the sites listed above are contained in this volume. The fracture data for the remaining research sites are discussed in Volume 4

  7. Compilation of radiation damage test data. I

    International Nuclear Information System (INIS)

    Schoenbacher, H.; Stolarz-Izycka, A.

    1979-01-01

    This report summarizes radiation damage test data on commercially available organic cable insulation and jacket materials: ethylene-propylene rubber, Hypalon, neoprene rubber, polyethylene, polyurethane, polyvinylchloride, silicone rubber, etc. The materials have been irradiated in a nuclear reactor to integrated absorbed doses from 5 × 10⁵ to 5 × 10⁶ Gy. Mechanical properties, e.g. tensile strength, elongation at break, and hardness, have been tested on irradiated and non-irradiated samples. The results are presented in the form of tables and graphs, to show the effect of the absorbed dose on the measured properties. (Auth.)

  8. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This volume contains the permeability data for the research sites discussed in Volume 1 which have been studied in sufficient detail to allow for analysis. These sites are the following: Stripa Mine, Sweden; Finnsjon, Kamlunge, Fjallveden, Gidea, Svartboberget, Sweden; Olkiluoto, Loviisa, Lavia, Finland; Climax Granite Nevada Test Site; OCRD Room, Colorado School of Mines; Savannah River Plant, Aiken, South Carolina; Oracle, Arizona; Basalt Waste Isolation Project (BWIP), Hanford, Washington; Underground Research Laboratory, AECL, Canada; Atikokan Research Area, AECL; Chalk River Research Area, AECL; Whiteshell Research Area, AECL. Other sources of information have been included where sufficient site specific geologic and hydrogeologic information is provided

  9. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This volume contains a continuation of the fracture data for the research sites discussed in Volume 1 which have been studied in sufficient detail to allow for analysis. The sites discussed in this volume are the following: Climax Granite Nevada Test Site; OCRD Room, Colorado School of Mines; Savannah River Plant, Aiken, South Carolina; Oracle, Arizona; Basalt Waste Isolation Project (BWIP), Hanford, Washington; Underground Research Laboratory, AECL, Canada; Atikokan Research Area, AECL; Chalk River Research Area, AECL; Whiteshell Research Area, AECL. Other sources of information have been included where sufficient site specific geologic and hydrogeologic information is provided

  10. Compilation of radiation damage test data

    International Nuclear Information System (INIS)

    Schoenbacher, H.; Tavlet, M.

    1989-01-01

    This report summarizes radiation damage test data on commercially available organic cable insulation and jacket materials: ethylene-propylene rubbers, polyethylenes, polyurethanes, silicone rubbers, and copolymers based on polyethylene. The materials have been irradiated either in a nuclear reactor, or with a cobalt-60 source, or in the CERN accelerators, at different dose rates. The absorbed doses were between 10³ and 5 × 10⁶ Gy. Mechanical properties, e.g. tensile strength, elongation at break, and hardness, have been tested on irradiated and non-irradiated samples, according to the recommendations of the International Electrotechnical Commission. The results are presented in the form of tables and graphs to show the effect of the absorbed dose on the measured properties. (orig.)

  11. Compilation of radiation damage test data. II

    International Nuclear Information System (INIS)

    Schoenbacher, H.; Stolarz-Izycka, A.

    1979-01-01

    This report summarizes radiation damage test data on thermosetting and thermoplastic resins, with the main emphasis on epoxy resins used for magnet coil insulations. Also, other materials such as polyesters, phenolics, polyurethanes, silicones, etc., are represented. The materials have been irradiated in a nuclear reactor to integrated absorbed doses between 5 × 10⁶ Gy and 1 × 10⁸ Gy. The mechanical properties, e.g. the flexural strength, deflection at break, and tangent modulus of elasticity, have been measured on irradiated and non-irradiated samples. The results are given as variation of these parameters versus absorbed dose and are presented in the form of tables and graphs. The tested materials are catalogued in alphabetical order. (Auth.)

  12. Data compilation for particle-impact desorption, 2

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeutchi, Fujio.

    1985-07-01

    The particle impact desorption is one of the elementary processes of hydrogen recycling in controlled thermonuclear fusion reactors. We have surveyed the literature concerning the ion impact desorption and photon stimulated desorption published through the end of 1984 and compiled the data on the desorption cross sections and yields with the aid of a computer. This report presents the results of the compilation in graphs and tables as functions of incident energy, surface temperature and surface coverage. (author)

  13. Compilation of LLNL CUP-2 Data

    Energy Technology Data Exchange (ETDEWEB)

    Eppich, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kips, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lindvall, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-07-31

    The CUP-2 uranium ore concentrate (UOC) standard reference material, a powder, was produced at the Blind River uranium refinery of Eldorado Resources Ltd. in Canada in 1986. This material was produced as part of a joint effort by the Canadian Certified Reference Materials Project and the Canadian Uranium Producers Metallurgical Committee to develop a certified reference material for uranium concentration and the concentration of several impurity constituents. This standard was developed to satisfy the requirements of the UOC mining and milling industry, and was characterized with this purpose in mind. To produce CUP-2, approximately 25 kg of UOC derived from the Blind River uranium refinery was blended, homogenized, and assessed for homogeneity by X-ray fluorescence (XRF) analysis. The homogenized material was then packaged into bottles, containing 50 g of material each, and distributed for analysis to laboratories in 1986. The CUP-2 UOC standard was characterized by an interlaboratory analysis program involving eight member laboratories, six commercial laboratories, and three additional volunteer laboratories. Each laboratory provided five replicate results on up to 17 analytes, including total uranium concentration and moisture content. The selection of analytical technique was left to each participating laboratory. Uranium was reported on an “as-received” basis; all other analytes (besides moisture content) were reported on a “dry-weight” basis. A bottle containing 25 g of the CUP-2 UOC standard described above was purchased by LLNL and characterized by the LLNL Nuclear Forensics Group. Non-destructive and destructive analytical techniques were applied to the UOC sample. Information obtained from short-term techniques such as photography, gamma spectrometry, and scanning electron microscopy was used to guide the performance of longer-term techniques such as ICP-MS. Some techniques, such as XRF and ICP-MS, provided complementary types of data. The results

  14. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1991-01-01

    Reliability data are essential in probabilistic safety assessment, with component reliability parameters being particularly important. Plant-specific component failure data would be most appropriate, but such data are rather limited. However, similar components are used in different designs, so generic data, that is, all data that are not specific to the plant being analyzed but relate to components more generally, are important. The International Atomic Energy Agency has compiled the Generic Component Reliability Data Base from data available in the open literature. It is part of the IAEA computer code package for fault/event tree analysis. The Data Base contains 1010 different records covering most of the components used in probabilistic safety analyses of nuclear power plants. The data base input was quality controlled and the data sources noted. The data compilation procedure and the problems associated with using generic data are explained. (UK)

  15. Individual risk. A compilation of recent British data

    International Nuclear Information System (INIS)

    Grist, D.R.

    1978-08-01

    A compilation of data is presented on individual risk obtained from recent British population and mortality statistics. Risk data presented include: risk of death, as a function of age, due to several important natural causes and due to accidents and violence; risk of death as a function of location of accident; and risk of death from various accidental causes. (author)

  16. A compilation of Sr and Nd isotope data on Mexico

    International Nuclear Information System (INIS)

    Verma, S.P.; Verma, M.P.

    1986-01-01

    A compilation is given of the available Sr and Nd isotope data on Mexican volcanic-plutonic terranes which cover about one-third of Mexico's territory. The available data are arranged according to a subdivision of the Mexican territory in terms of geological provinces. Furthermore, site and province averages and standard deviations are calculated and their petrogenetic implications are pointed out. (author)

  17. CRECTJ: a computer program for compilation of evaluated nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-09-01

    In order to compile evaluated nuclear data in the ENDF format, the computer program CRECTJ has been developed. CRECTJ has two versions: CRECTJ5 treats data in the ENDF/B-IV and ENDF/B-V formats, and CRECTJ6 data in the ENDF-6 format. These programs have been used frequently to produce the Japanese Evaluated Nuclear Data Library (JENDL). This report describes the input data for CRECTJ and gives examples of its use. (author)
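
    As background on the fixed-column ENDF record layout that a compilation code such as CRECTJ processes, the Python sketch below reads the standard identification fields of an ENDF-6 line (material number MAT in columns 67-70, file number MF in columns 71-72, section number MT in columns 73-75). This is an illustrative reader only and is not taken from CRECTJ itself.

        def parse_endf_ids(record: str) -> tuple:
            """Extract (MAT, MF, MT) from an 80-column ENDF-6 record.

            Columns 1-66 carry six 11-character data fields; MAT, MF and MT
            follow in columns 67-70, 71-72 and 73-75, with a sequence number
            in columns 76-80. Illustrative sketch, not part of CRECTJ.
            """
            return int(record[66:70]), int(record[70:72]), int(record[72:75])

        # Build a synthetic record in the ENDF-6 style: U-235 (MAT=9228), MF=3 (cross sections), MT=18 (fission)
        data66 = "".join(f"{x:>11}" for x in ["9.223500+4", "2.330248+2", 0, 0, 0, 0])
        record = f"{data66}{9228:>4}{3:>2}{18:>3}{1:>5}"
        print(parse_endf_ids(record))   # -> (9228, 3, 18)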

  18. Compilation of data on γγ → hadrons

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1986-06-01

    Data on γγ → hadrons extracted from e⁺e⁻ reactions are compiled. The review includes inclusive cross-sections, structure functions, exclusive cross-sections and resonance widths. Data up to 1st July 1986 are included. All the data in this review can be found and retrieved in the Durham-RAL HEP database, together with a wide range of other reaction data. Users throughout Europe can interactively access the database through CMS on the RAL computer. (author)

  19. Data compilation for radiation effects on ceramic insulators

    International Nuclear Information System (INIS)

    Fukuya, Koji; Terasawa, Mititaka; Nakahigashi, Shigeo; Ozawa, Kunio.

    1986-08-01

    Data on radiation effects in ceramic insulators were compiled from the literature and summarized from the viewpoint of fast neutron irradiation effects. The data were classified according to property and type of ceramic. The properties covered are dimensional stability, mechanical properties, thermal properties, and electrical and dielectric properties. Data sheets were prepared for each table or graph in the literature. The characteristic features of the data base are briefly described. (author)

  20. Methodology and procedures for compilation of historical earthquake data

    International Nuclear Information System (INIS)

    1987-10-01

    This report was prepared subsequent to the recommendations of the project initiation meeting in Vienna, November 25-29, 1985, under the IAEA Interregional project INT/9/066, Seismic Data for Nuclear Power Plant Siting. The aim of the project is to co-ordinate the national efforts of Member States in the Mediterranean region in the compilation and processing of historical earthquake data for use in the siting of nuclear facilities. The main objective of the document is to assist the participating Member States, especially those who are initiating an NPP siting programme, in their efforts to compile and process historical earthquake data, and to provide a uniform interregional framework for this task. Although the document is directed mainly at the Mediterranean countries, using illustrative examples from this region, the basic procedures and methods described herein may be applicable to other parts of the world such as Southeast Asia, the Himalayan belt, Latin America, etc. 101 refs, 7 figs

  1. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1988-01-01

    Reliability data are an essential part of probabilistic safety assessment. The quality of the data can determine the quality of the study as a whole. It is obvious that component failure data originating from the plant being analyzed would be most appropriate. However, complete reliance on plant experience is possible in only a few cases, mainly because of the rather limited operating experience. Nuclear plants, although of different design, often use fairly similar components, so some of the experience could be combined and transferred from one plant to another. In addition, information about component failures is also available from experts with knowledge of component design, manufacturing and operation. That brings us to the importance of assessing generic data. (Generic here means everything that is not specific to the plant being analyzed.) The generic data available in the open literature can be divided into three broad categories. The first includes data bases used in previous analyses; these can be plant specific, or generic data updated with plant-specific information (the latter case deserves special attention). The second is based on compilations of plants' operating experience, usually drawn from some kind of event reporting system. The third category includes data sources based on expert opinions (single or aggregated) or on a combination of expert opinions and other nuclear and non-nuclear experience. This paper reflects insights gained in compiling data from generic data sources and highlights advantages and pitfalls of using generic component reliability data in PSAs
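
    The abstract notes that generic data are sometimes updated with plant-specific information. One common way of doing this in PSA practice, shown below as a minimal Python sketch with illustrative numbers, is a Bayesian update of a generic failure rate using a gamma prior and plant-specific Poisson evidence; this is a standard textbook procedure and is not claimed to be the method used for the IAEA data base.

        def gamma_poisson_update(alpha: float, beta_hours: float,
                                 failures: int, exposure_hours: float):
            """Conjugate update of a gamma(alpha, beta) prior on a failure rate (per hour)
            with plant-specific evidence of `failures` events in `exposure_hours`.
            Returns the posterior (alpha, beta); the posterior mean is alpha/beta.
            Illustrative sketch only."""
            return alpha + failures, beta_hours + exposure_hours

        # Hypothetical generic prior with mean 1e-5 per hour (alpha=0.5, beta=5e4 h),
        # updated with 2 plant-specific failures observed over 8e4 component-hours.
        post_a, post_b = gamma_poisson_update(0.5, 5.0e4, failures=2, exposure_hours=8.0e4)
        print(f"posterior mean failure rate: {post_a / post_b:.2e} per hour")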

  2. Recent Efforts in Data Compilations for Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Dillmann, Iris

    2008-01-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on 'Nuclear Physics Data Compilation for Nucleosynthesis Modeling' held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The 'JINA Reaclib Database' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between ¹H and ²¹⁰Bi, over 80% of them deduced from experimental data. A 'high priority list' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. 'Workflow tools' aim to make the evaluation process transparent and allow users to follow the progress

  3. Recent Efforts in Data Compilations for Nuclear Astrophysics

    Science.gov (United States)

    Dillmann, Iris

    2008-05-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on 'Nuclear Physics Data Compilation for Nucleosynthesis Modeling' held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The 'JINA Reaclib Database' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between ¹H and ²¹⁰Bi, over 80% of them deduced from experimental data. A 'high priority list' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. 'Workflow tools' aim to make the evaluation process transparent and allow users to follow the progress.
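
    Since both of the preceding records concern REACLIB-style reaction-rate libraries, the standard seven-parameter temperature fit used in JINA REACLIB rate sets is recalled here for orientation (a widely used convention, quoted for context rather than taken from the papers above):

        \lambda(T_9) = \exp\left[a_0 + \sum_{i=1}^{5} a_i\,T_9^{(2i-5)/3} + a_6 \ln T_9\right]

    where T_9 is the temperature in units of 10⁹ K and a_0, ..., a_6 are the fitted coefficients stored for each reaction; the sum expands to the familiar terms in T_9⁻¹, T_9^(-1/3), T_9^(1/3), T_9 and T_9^(5/3).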

  4. Neutron data compilation at the International Atomic Energy Agency

    International Nuclear Information System (INIS)

    Lemmel, H.D.; Attree, P.M.; Byer, T.A.; Good, W.M.; Hjaerne, L.; Konshin, V.A.; Lorens, A.

    1968-03-01

    The paper describes the present status of the neutron data compilation center of the IAEA Nuclear Data Unit, which is now in full operation. An outline is given of the principles and objectives, the working routines, and the services available within the two-fold functions of the Unit: a) to promote cooperation and international neutron data exchange between the four major centers at Brookhaven, Saclay, Obninsk and Vienna, which share responsibilities in a geographical distribution of labour; b) to collect systematically the neutron data arising from countries in Eastern Europe, Asia, Australia, Africa, South and Central America, and to offer certain services to these countries. A brief description of DASTAR, the DAta STorage And Retrieval system, and of CINDU, the data Catalog of the IAEA Nuclear Data Unit, is given. (author)

  5. Neutron data compilation at the International Atomic Energy Agency

    Energy Technology Data Exchange (ETDEWEB)

    Lemmel, H D; Attree, P M; Byer, T A; Good, W M; Hjaerne, L; Konshin, V A; Lorens, A [Nuclear Data Unit, International Atomic Energy Agency, Vienna (Austria)

    1968-03-15

    The paper describes the present status of the neutron data compilation center of the IAEA Nuclear Data Unit, which is now in full operation. An outline is given of the principles and objectives, the working routines, and the services available within the two-fold functions of the Unit: a) to promote cooperation and international neutron data exchange between the four major centers at Brookhaven, Saclay, Obninsk and Vienna, which share responsibilities in a geographical distribution of labour; b) to collect systematically the neutron data arising from countries in Eastern Europe, Asia, Australia, Africa, South and Central America, and to offer certain services to these countries. A brief description of DASTAR, the DAta STorage And Retrieval system, and of CINDU, the data Catalog of the IAEA Nuclear Data Unit, is given. (author)

  6. A new compiler for the GANIL Data Acquisition description

    International Nuclear Information System (INIS)

    Saillant, F.; Raine, B.

    1997-01-01

    An important feature of the GANIL Data Acquisition System is the description of experiments by means of a language developed at GANIL. The philosophy is to attribute to each element (parameters, spectra, etc.) an operational name which will be used at any level of the system. This language references a library of modules to free the user from the technical details of the hardware. The compiler has recently been entirely re-developed using technologies such as an object-oriented language (C++) and object-oriented software development methods and tools. This enables us to provide new functionality or to support a new electronic module within a very short delay and without any deep modification of the application. A new Dynamic Library of Modules has also been developed; its complete description is available on the GANIL WEB site http://ganinfo.in2p3.fr/acquisition/homepage.html. The new compiler brings many new functionalities, the most important of which is the notion of a 'register', whatever the module standard. All the registers described in the module provider's documentation can now be accessed by their names. Another important new feature is the notion of a 'function' that can be executed on a module. A set of new instructions has also been implemented to execute commands on CAMAC crates. The new compiler also makes it possible to describe specific interfaces with the GANIL Data Acquisition System; this has been used to describe the coupling of the CHIMERA Data Acquisition System with the INDRA one through a shared memory in the VME crate. (authors)

  7. Data compilation of angular distributions of sputtered atoms

    International Nuclear Information System (INIS)

    Yamamura, Yasunori; Takiguchi, Takashi; Tawara, Hiro.

    1990-01-01

    Sputtering from a surface is generally caused by the collision cascade developed near the surface. The process is in principle the same as that causing radiation damage in the bulk of solids. Sputtering has long been regarded as an undesirable dirty effect which destroys the cathodes and grids in gas discharge tubes or ion sources and contaminates plasma and the surrounding walls. However, sputtering is used today for many applications such as sputter ion sources, mass spectrometers and the deposition of thin films. Plasma contamination and the surface erosion of first walls due to sputtering are still major problems in fusion research. The angular distribution of the particles sputtered from solid surfaces can provide detailed information on the collision cascade in the interior of targets. This report presents a compilation of the angular distributions of sputtered atoms at normal and oblique incidence for various combinations of incident ions and target atoms. The angular distributions of atoms sputtered from monatomic solids at normal and oblique incidence, and the compilation of the data on these distributions, are reported. (K.I.)

  8. Compilation of benchmark results for fusion related Nuclear Data

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Wada, Masayuki; Oyama, Yukio; Ichihara, Chihiro; Makita, Yo; Takahashi, Akito

    1998-11-01

    This report compiles results of benchmark tests for the validation of evaluated nuclear data to be used in the nuclear design of fusion reactors. Part of the results was obtained through the activities of the Fusion Neutronics Integral Test Working Group, organized by members of both the Japan Nuclear Data Committee and the Reactor Physics Committee. The following three benchmark experiments were used for the tests: (i) leakage neutron spectrum measurements from slab assemblies at the D-T neutron source at FNS/JAERI, (ii) in-situ neutron and gamma-ray measurement experiments (so-called clean benchmark experiments), also at FNS, and (iii) pulsed sphere experiments for leakage neutron and gamma-ray spectra at the D-T neutron source facility of Osaka University, OKTAVIAN. The evaluated nuclear data tested were JENDL-3.2, JENDL Fusion File, FENDL/E-1.0 and newly selected data for FENDL/E-2.0. Comparisons of benchmark calculations with the experiments for twenty-one elements, i.e. Li, Be, C, N, O, F, Al, Si, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zr, Nb, Mo, W and Pb, are summarized. (author). 65 refs

  9. Assessment of the current status of basic nuclear data compilations

    International Nuclear Information System (INIS)

    Riemer, R.L.

    1992-01-01

    The Panel on Basic Nuclear Data Compilations believes that it is important to provide the user with an evaluated nuclear database of the highest quality, dependability, and currency. It is also important that the evaluated nuclear data are easily accessible to the user. In the past the panel concentrated its concern on the cycle time for the publication of A-chain evaluations. However, the panel now recognizes that publication cycle time is no longer the appropriate goal. Sometime in the future, publication of the evaluated A-chains will evolve from the present hard-copy Nuclear Data Sheets on library shelves to purely electronic publication, with the advent of universal access to terminals and the nuclear databases. Therefore, the literature cut-off date in the Evaluated Nuclear Structure Data File (ENSDF) is rapidly becoming the only important measure of the currency of an evaluated A-chain. Also, it has become exceedingly important to ensure that access to the databases is as user-friendly as possible and to enable electronic publication of the evaluated data files. Considerable progress has been made in these areas: use of the on-line systems has almost doubled in the past year, and there has been initial development of tools for electronic evaluation, publication, and dissemination. Currently, the nuclear data effort is in transition between the traditional and future methods of dissemination of the evaluated data. Also, many of the factors that adversely affect the publication cycle time simultaneously affect the currency of the evaluated nuclear database. Therefore, the panel continues to examine factors that can influence cycle time: the number of evaluators, the frequency with which an evaluation can be updated, the review of the evaluation, and the production of the evaluation, which currently exists as a hard-copy issue of Nuclear Data Sheets

  10. Assessment of the current status of basic nuclear data compilations

    International Nuclear Information System (INIS)

    1987-03-01

    The Panel on Basic Nuclear Data Compilations believes that it is of paramount importance to achieve as short a cycle time as is reasonably possible in the evaluation and publication of the A-chains. The panel, therefore, has concentrated its efforts on identifying those factors that have tended to increase the cycle time and on finding ways to remove the obstacles. An important step was made during the past year to address reduction of the size of the published evaluations - another factor that can reduce cycle time. The Nuclear Structure and Decay Data (NSDD) network adopted new format guidelines, which generated a 30% reduction by eliminating redundancy and/or duplication. A current problem appears to be the rate at which the A-chains are being evaluated, which, on the average, is only about one-half of what it could be. It is hoped that the situation will improve with an increase in the number of foreign centers and an increase in efficiency as more A-chains are recycled by the same evaluator who did the previous evaluation. Progress has been made in the area of on-line access to the nuclear data files in that a subcommittee report describing the requirements of an on-line system has been produced. 2 tabs

  11. VizieR Online Data Catalog: Compilation of stellar rotation data (Kovacs, 2018)

    Science.gov (United States)

    Kovacs, G.

    2018-03-01

    The three datasets, included in table1-1.dat, table1-2.dat and table1-6.dat respectively, correspond to the types of stars listed in Table 1, lines 1 [Praesepe], 2 [HJ_host] and 6 [Field(C)]. These data result from the compilation of rotational and other stellar data from the literature. (4 data files).

  12. Compilation and evaluation of atomic and molecular data relevant to controlled thermonuclear research needs: USA programs

    International Nuclear Information System (INIS)

    Barnett, C.F.

    1976-01-01

    The U.S. role in the compilation and evaluation of atomic data for controlled thermonuclear research is discussed in the following three areas: (1) atomic structure data, (2) atomic collision data, and (3) surface data

  13. ccPDB: compilation and creation of data sets from Protein Data Bank.

    Science.gov (United States)

    Singh, Harinder; Chauhan, Jagat Singh; Gromiha, M Michael; Raghava, Gajendra P S

    2012-01-01

    ccPDB (http://crdd.osdd.net/raghava/ccpdb/) is a database of data sets compiled from the literature and the Protein Data Bank (PDB). First, we collected and compiled data sets from the literature that have been used for developing bioinformatics methods to annotate the structure and function of proteins. Second, data sets were derived from the latest release of PDB using standard protocols. Third, we developed a powerful module for creating a wide range of customized data sets from the current release of PDB. This is a flexible module that allows users to create data sets using a simple six-step procedure. In addition, a number of web services have been integrated into ccPDB, including submission of jobs to PDB-based servers, annotation of protein structures and generation of patterns. This database maintains >30 types of data sets, such as secondary structure, tight turns, nucleotide-interacting residues, metal-interacting residues, DNA/RNA-binding residues and so on.

  14. Compilation status and research topics in Hokkaido University Nuclear Reaction Data Centre

    International Nuclear Information System (INIS)

    Aikawa, M.; Furutachi, N.; Katō, K.; Ebata, S.; Ichinkhorloo, D.; Imai, S.; Sarsembayeva, A.; Zhou, B.; Otuka, N.

    2015-01-01

    Nuclear reaction data are necessary for many fields of application. To be conveniently available, nuclear reaction data must be compiled into a database. One such database is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC). As a member of the NRDC, the Hokkaido University Nuclear Reaction Data Centre (JCPRG) compiles charged-particle induced reaction data and contributes about 10 percent of the EXFOR database. In this paper, we show the recent compilation status and related research topics of JCPRG. (author)

  15. ERES: A PC program for nuclear data compilation in EXFOR format

    International Nuclear Information System (INIS)

    Li Shubing; Liang Qichang; Liu Tingin

    1994-01-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)

  16. ERES: A PC program for nuclear data compilation in EXFOR format

    Energy Technology Data Exchange (ETDEWEB)

    Shubing, Li [NanKai University, Tianjin (China); Qichang, Liang; Tingin, Liu [Chinese Nuclear Data Center, Institute of Atomic Energy, Beijing (China)

    1994-02-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)

  17. Assessment of the current status of basic nuclear data compilations

    International Nuclear Information System (INIS)

    1985-01-01

    Topics discussed include: the status of mass-chain evaluations, remote terminal access, other US Nuclear Data Network publications, formats and procedures subcommittee report, keyword follow-up (Phys. Rev. C), and atomic data and nuclear data tables

  18. Fatigue data compilation and evaluation of fatigue on design

    International Nuclear Information System (INIS)

    Nyilas, A.

    1985-05-01

    The aim of this report is to review the fatigue data available for the various materials needed in the design of large superconducting magnets for fusion. One of the primary objectives of this work is to present a broad outline of the low-temperature fatigue data of the relevant materials within the scope of the available data. In addition to classical fatigue data, fatigue crack propagation measurements are covered extensively. The existing recommendations for the design of cryogenic structures are described. A brief introduction to fracture mechanics and a historical overview of the development of the present-day understanding of fatigue are also given. (orig.) [de

  19. Hydrocarbon solvent exposure data: compilation and analysis of the literature.

    Science.gov (United States)

    Caldwell, D J; Armstrong, T W; Barone, N J; Suder, J A; Evans, M J

    2000-01-01

    An occupational exposure database for hydrocarbon solvent end-use applications was constructed from the published literature. The database provides exposure assessment information for such purposes as regulatory risk assessments, support of industry product stewardship initiatives, and identification of applications in which limited exposure data are available. It is quantitative, documented, and based on credible data. Approximately 350 articles containing quantitative hydrocarbon solvent exposure data were identified using a search of computer databases of published literature. Many articles did not report sufficient details of the exposure data for inclusion in the database (e.g., full-shift exposure or task-based exposure data). Others were excluded because only limited summary statistics were provided, which precluded statistical analysis of the data (e.g., arithmetic mean concentration presented, but no sample number). Following evaluation, 16,880 hydrocarbon solvent exposure measurements from 99 articles were entered into a database for analysis. Methods used to identify and evaluate published solvent exposure data are described along with more detailed analysis of worker exposure to hydrocarbon solvents in three major end-use applications: painting and coating, printing, and adhesives. Solvent exposures were evaluated against current ACGIH threshold limit values (TLVs) and trends were identified. Limited quantitative data are available prior to 1970. In general, reported hydrocarbon solvent exposures decreased fourfold from 1960 to 1998, were below the TLVs applicable to specific hydrocarbon solvents at the time, and on average have been below 40% of the TLV since 1980. The database already has proved valuable; however, the utility of published exposure data could be further improved if authors consistently reported essential data elements and supporting information.
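    A minimal sketch of the kind of trend analysis described above, expressing each measurement as a fraction of the applicable TLV and averaging by decade; the column names and the demonstration values below are hypothetical, not taken from the database itself.

```python
# Illustrative trend analysis: each measurement as a fraction of the TLV in
# force at the time, summarised by decade. Column names ("year",
# "concentration_ppm", "tlv_ppm") and the demo values are made up.
import pandas as pd

def tlv_fraction_by_decade(df: pd.DataFrame) -> pd.Series:
    frac = df["concentration_ppm"] / df["tlv_ppm"]
    decade = (df["year"] // 10) * 10
    return frac.groupby(decade).mean()

if __name__ == "__main__":
    demo = pd.DataFrame({
        "year": [1962, 1975, 1984, 1991, 1997],
        "concentration_ppm": [220.0, 130.0, 60.0, 45.0, 30.0],   # placeholder data
        "tlv_ppm": [200.0, 200.0, 150.0, 150.0, 100.0],          # placeholder TLVs
    })
    print(tlv_fraction_by_decade(demo))
```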

  20. A program-compiling method of nuclear data on-line fast analysis

    International Nuclear Information System (INIS)

    Li Shangbai

    1990-01-01

    This paper discusses how to perform assembly-level floating-point operations using subroutines of the Applesoft system, and introduces a program-compiling method for fast on-line analysis of nuclear data on an Apple microcomputer

  1. Compilation and evaluation of fission yield nuclear data

    International Nuclear Information System (INIS)

    Lammer, M.

    1991-09-01

    The task of this meeting was to review the progress made since the previous meeting on fission yield evaluation and to define the tasks for an IAEA Co-ordinated Research Programme in detail. Improvements have been noted in measured data, model calculations and the situation of fission yield evaluation. Tabs

  2. Compilation of radiation damage test data cable insulating materials

    CERN Document Server

    Schönbacher, H; CERN. Geneva

    1979-01-01

    This report summarizes radiation damage test data on commercially available organic cable insulation and jacket materials: ethylene-propylene rubber, Hypalon, neoprene rubber, polyethylene, polyurethane, polyvinylchloride, silicone rubber, etc. The materials have been irradiated in a nuclear reactor to integrated absorbed doses from 5×10⁵ to 5×10⁶ Gy. Mechanical properties, e.g. tensile strength, elongation at break, and hardness, have been tested on irradiated and non-irradiated samples. The results are presented in the form of tables and graphs, to show the effect of the absorbed dose on the measured properties. (13 refs).

  3. Transportation legislative data base: State radioactive materials transportation statute compilation, 1989--1993

    International Nuclear Information System (INIS)

    1994-04-01

    The Transportation Legislative Data Base (TLDB) is a computer-based information service containing summaries of federal, state and certain local government statutes and regulations relating to the transportation of radioactive materials in the United States. The TLDB has been operated by the National Conference of State Legislatures (NCSL) under cooperative agreement with the US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management since 1992. The data base system serves the legislative and regulatory information needs of federal, state, tribal and local governments, the affected private sector and interested members of the general public. Users must be approved by DOE and NCSL. This report is a state statute compilation that updates the 1989 compilation produced by Battelle Memorial Institute, the previous manager of the data base. This compilation includes statutes not included in the prior compilation, as well as newly enacted laws. Statutes not included in the prior compilation show an enactment date prior to 1989. Statutes that deal with low-level radioactive waste transportation are included in the data base as are statutes from the states of Alaska and Hawaii. Over 155 new entries to the data base are summarized in this compilation

  4. Compilation of radiation damage test data. Pt. 3

    International Nuclear Information System (INIS)

    Beynel, P.; Maier, P.; Schoenbacher, H.

    1982-01-01

    This handbook gives the results of radiation damage tests on various engineering materials and components intended for installation in radiation areas of the CERN high-energy particle accelerators. It complements two previous volumes covering organic cable-insulating materials and thermoplastic and thermosetting resins. The irradiations have been carried out at various radiation sources and the results of the different tests are reported, sometimes illustrated by tables and graphs to show the variation of the measured property with absorbed radiation dose. For each entry, an appreciation of the radiation resistance is given, based on measurement data, indicating the range of damage (moderate to severe) for doses from 10 to 10⁸ Gy. Also included are tables, selected from published reports, of general relative radiation effects for several groups of materials, to which there are systematic cross-references in the alphabetical part. This third and last volume contains cross-references to all the materials presented up to now, so that it can be used as a guide to the three volumes. (orig.)

  5. ERES--a PC software for nuclear data compilation in EXFOR format

    International Nuclear Information System (INIS)

    Li Shubing; Liang Qichang; Liu Tingjin

    1993-01-01

    The major functions and implementation of the software ERES (EXFOR Edit System) are introduced. The ERES is developed for nuclear data compilation in EXFOR (EXchange FORmat) format, running on IBM-PC/XT or IBM-PC/AT. EXFOR is the format for the exchange of experimental neutron data accepted by four neutron data centers in the world
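    For orientation only, the sketch below writes a schematic, EXFOR-flavoured plain-text record (a BIB section followed by a DATA table). The field widths, keywords and identifiers are illustrative assumptions and do not reproduce the official EXFOR rules that compilation tools such as ERES must follow.

```python
# Schematic sketch of writing an EXFOR-flavoured entry for one data set.
# Field widths, keywords and identifiers below are illustrative only and are
# NOT a validated implementation of the official EXFOR format maintained by
# the NRDC; a real compilation tool (such as ERES) must follow those rules.

def exfor_like_entry(entry_id, author, reference, reaction, points):
    """Return a plain-text record: a BIB section followed by a DATA table."""
    lines = [f"ENTRY      {entry_id}",
             f"SUBENT     {entry_id}001",
             "BIB",
             f"AUTHOR     ({author})",
             f"REFERENCE  ({reference})",
             f"REACTION   ({reaction})",
             "ENDBIB",
             "DATA",
             f"{'EN':>11}{'DATA':>11}",
             f"{'MEV':>11}{'MB':>11}"]
    for energy_mev, sigma_mb in points:
        lines.append(f"{energy_mev:>11.4g}{sigma_mb:>11.4g}")
    lines += ["ENDDATA", "ENDSUBENT", "ENDENTRY"]
    return "\n".join(lines)

if __name__ == "__main__":
    # made-up data points; the reaction string mimics EXFOR style only loosely
    print(exfor_like_entry("D0001", "A.Author+", "J,NP/A,123,45,1994",
                           "(26-FE-56(P,N)27-CO-56,,SIG)",
                           [(5.0, 12.3), (7.5, 301.0), (10.0, 412.5)]))
```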

  6. Self-diffusion in electrolyte solutions a critical examination of data compiled from the literature

    CERN Document Server

    Mills, R

    1989-01-01

    This compilation - the first of its kind - fills a real gap in the field of electrolyte data. Virtually all self-diffusion data in electrolyte solutions as reported in the literature have been examined and the book contains over 400 tables covering diffusion in binary and ternary aqueous solutions, in mixed solvents, and of non-electrolytes in various solvents. An important feature of the compilation is that all data have been critically examined and their accuracy assessed. Other features are an introductory chapter in which the methods of measurement are reviewed; appendices containing tables

  7. Compilations and evaluations of data on the interaction of electromagnetic radiation with matter

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-05-01

    The material contained in this report deals with data on the interaction of electromagnetic radiation with matter, listing major compilations of X-ray, photon and gamma-ray cross sections and attenuation coefficients, as well as selected reports featuring data on Compton scattering, photoelectric absorption and pair production

  8. User's manual for the computer-aided plant transient data compilation

    International Nuclear Information System (INIS)

    Langenbuch, S.; Gill, R.; Lerchl, G.; Schwaiger, R.; Voggenberger, T.

    1984-01-01

    The objective of this project is the compilation of data for nuclear power plants needed for transient analyses. The concept has already been described. This user's manual gives a detailed description of all functions of the dialogue system that supports data acquisition and retrieval. (orig.) [de

  9. Los Alamos geostationary orbit synoptic data set: a compilation of energetic particle data

    International Nuclear Information System (INIS)

    Baker, D.N.; Higbie, P.R.; Belian, R.D.; Aiello, W.P.; Hones, E.W. Jr.; Tech, E.R.; Halbig, M.F.; Payne, J.B.; Robinson, R.; Kedge, S.

    1981-08-01

    Energetic electron (30 to 2000 keV) and proton (145 keV to 150 MeV) measurements made by Los Alamos National Laboratory sensors at geostationary orbit (6.6 R_E) are summarized. The data are plotted in terms of daily average spectra, 3-h local time averages, and in a variety of statistical formats. The data summarize conditions from mid-1976 through 1978 (S/C 1976-059) and from early 1977 through 1978 (S/C 1977-007). The compilations correspond to measurements at 35°W, 70°W, and 135°W geographic longitude and, thus, are indicative of conditions at 9°, 11°, and 4.8° geomagnetic latitude, respectively. Most of this report is comprised of data plots that are organized according to Carrington solar rotations so that the data can be easily compared to solar rotation-dependent interplanetary data. As shown in prior studies, variations in solar wind conditions modulate particle intensity within the terrestrial magnetosphere. The effects of these variations are demonstrated and discussed. Potential uses of the Synoptic Data Set by the scientific and applications-oriented communities are also discussed

  10. Palaeoecological studies as a source of peat depth data: A discussion and data compilation for Scotland

    Directory of Open Access Journals (Sweden)

    J. Ratcliffe

    2016-06-01

    The regional/national carbon (C) stock of peatlands is often poorly characterised, even for comparatively well-studied areas. A key obstacle to better estimates of landscape C stock is the scarcity of data on peat depth, leading to simplistic assumptions. New measurements of peat depth become unrealistically resource-intensive when considering large areas. Therefore, it is imperative to maximise the use of pre-existing datasets. Here we propose that one potentially valuable and currently unexploited source of peat depth data is palaeoecological studies. We discuss the value of these data and present an initial compilation for Scotland (United Kingdom) which consists of records from 437 sites and yields an average depth of 282 cm per site. This figure is likely to be an over-estimate of true average peat depth and is greater than figures used in current estimates of peatland C stock. Depth data from palaeoecological studies have the advantages of wide distribution, high quality, and often the inclusion of valuable supporting information; but also the disadvantage of spatial bias due to the differing motivations of the original researchers. When combined with other data sources, each with its own advantages and limitations, we believe that palaeoecological datasets can make an important contribution to better-constrained estimates of peat depth which, in turn, will lead to better estimates of peatland landscape carbon stock.
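    As a worked illustration of how such a mean depth feeds into a landscape carbon-stock estimate, the sketch below multiplies area, mean depth, dry bulk density and carbon fraction. The bulk density (0.1 g/cm³) and carbon fraction (0.5) are typical literature values used here only as assumptions, and the peatland area is a made-up placeholder rather than a figure from the study.

```python
# Minimal worked example: C stock = area x mean depth x dry bulk density x
# carbon fraction. Bulk density (0.1 g/cm^3) and carbon fraction (0.5) are
# assumed typical values; the 100 km^2 area is a placeholder, not study data.

def peat_carbon_stock_tonnes(area_km2, mean_depth_cm,
                             bulk_density_g_cm3=0.1, carbon_fraction=0.5):
    area_cm2 = area_km2 * 1e10          # 1 km^2 = 1e10 cm^2
    volume_cm3 = area_cm2 * mean_depth_cm
    carbon_g = volume_cm3 * bulk_density_g_cm3 * carbon_fraction
    return carbon_g / 1e6               # grams -> tonnes

if __name__ == "__main__":
    # a hypothetical 100 km^2 peatland with the 282 cm mean depth quoted above
    print(f"{peat_carbon_stock_tonnes(100, 282):.3e} t C")
```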

  11. A compilation of structural property data for computer impact calculation (5/5)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1988-10-01

    The paper describes structural property data for computer impact calculations of nuclear fuel shipping casks. Four kinds of material data, mild steel, stainless steel, lead and wood, are compiled. These materials are main structural elements of shipping casks. Structural data such as the coefficient of thermal expansion, the modulus of longitudinal elasticity, the modulus of transverse elasticity, the Poisson's ratio and stress and strain relationships have been tabulated against temperature or strain rate. This volume 5 contains the structural property data for wood. (author)

  12. Irradiation of red meat. A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1996-08-01

    The aim of this monograph is to provide the rationale and justification for treating red meats with ionizing radiation for improving microbiological safety, parasite control and extending non-frozen shelf-life. It is intended to complement a previous publication "Irradiation of Poultry Meat and its Products - A compilation of Technical Data for its Authorization and Control". 146 refs

  13. ANDEX. A PC software assisting the nuclear data compilation in EXFOR

    International Nuclear Information System (INIS)

    Osorio, V.

    1991-01-01

    This document describes the use of personal computer software ANDEX which assists the compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request, on a set of two diskettes, free of charge. (author)

  14. Data compilation of respiration, feeding, and growth rates of marine pelagic organisms

    DEFF Research Database (Denmark)

    2013-01-01

    's adaptation to the environment, with consequently less universal mass scaling properties. Data on body mass, maximum ingestion and clearance rates, respiration rates and maximum growth rates of animals living in the ocean epipelagic were compiled from the literature, mainly from original papers but also from...

  15. Compilation of data and descriptions for United States and foreign liquid metal fast breeder reactors

    International Nuclear Information System (INIS)

    Appleby, E.R.

    1975-08-01

    This document is a compilation of design and engineering information pertaining to liquid metal cooled fast breeder reactors which have operated, are operating, or are currently under construction, in the United States and abroad. All data have been taken from publicly available documents, journals, and books

  16. Compilation of properties data for Li₂TiO₃

    Energy Technology Data Exchange (ETDEWEB)

    Roux, N [CEA Centre d'Etudes de Saclay, 91 - Gif-sur-Yvette (France)

    1998-03-01

    Properties data obtained at CEA for Li₂TiO₃ are reported. The compilation includes: stability of the Li₂TiO₃ β phase, specific heat, thermal diffusivity, thermal conductivity, linear thermal expansion, thermal creep, interaction with water and acid. (author)

  17. Irradiation of red meat. A compilation of technical data for its authorization and control

    Energy Technology Data Exchange (ETDEWEB)

    International consultative group on food irradiation

    1996-08-01

    The aim of this monograph is to provide the rationale and justification for treating red meats with ionizing radiation for improving microbiological safety, parasite control and extending non-frozen shelf-life. It is intended to complement a previous publication "Irradiation of Poultry Meat and its Products - A compilation of Technical Data for its Authorization and Control". 146 refs.

  18. A Compilation of Global Bio-Optical in Situ Data for Ocean-Colour Satellite Applications

    Science.gov (United States)

    Valente, Andre; Sathyendranath, Shubha; Brotus, Vanda; Groom, Steve; Grant, Michael; Taberner, Malcolm; Antoine, David; Arnone, Robert; Balch, William M.; Barker, Kathryn

    2016-01-01

    A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GePCO), span between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data were from multi-project archives acquired via the open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making available the metadata, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).
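    A hedged sketch of the averaging step mentioned above (observations close in time and space combined before matchup); the grouping thresholds and column names below are illustrative assumptions, not the OC-CCI protocol.

```python
# Sketch of averaging in situ observations that fall close together in time
# and space. The 0.25-degree / 1-hour bins and the column names ("lat",
# "lon", "time", "chl") are illustrative assumptions only.
import pandas as pd

def average_close_observations(df, deg=0.25, hours=1):
    grid_lat = (df["lat"] / deg).round() * deg
    grid_lon = (df["lon"] / deg).round() * deg
    time_bin = df["time"].dt.floor(f"{hours}h")
    return (df.assign(grid_lat=grid_lat, grid_lon=grid_lon, time_bin=time_bin)
              .groupby(["grid_lat", "grid_lon", "time_bin"], as_index=False)
              .agg(chl=("chl", "mean"), n_obs=("chl", "size")))

if __name__ == "__main__":
    demo = pd.DataFrame({
        "lat": [45.01, 45.02, 52.30],
        "lon": [-20.00, -20.01, 3.10],
        "time": pd.to_datetime(["2010-06-01 10:05", "2010-06-01 10:40",
                                "2010-06-02 12:00"]),
        "chl": [0.31, 0.35, 1.20],   # chlorophyll-a, mg m-3 (made-up values)
    })
    print(average_close_observations(demo))
```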

  19. A compilation of experimental burnout data for axial flow of water in rod bundles

    International Nuclear Information System (INIS)

    Chapman, A.G.; Carrard, G.

    1981-02-01

    A compilation has been made of burnout (critical heat flux) data from the results of more than 12,000 tests on 321 electrically-heated, water-cooled experimental assemblies, each simulating, to some extent, the operating or postulated accident conditions in the fuel elements of water-cooled nuclear power reactors. The main geometric characteristics of the assemblies are listed and references are given for the sources of information from which the data were gathered

  20. JiTTree: A Just-in-Time Compiled Sparse GPU Volume Data Structure

    KAUST Repository

    Labschutz, Matthias

    2015-08-12

    Sparse volume data structures enable the efficient representation of large but sparse volumes in GPU memory for computation and visualization. However, the choice of a specific data structure for a given data set depends on several factors, such as the memory budget, the sparsity of the data, and data access patterns. In general, there is no single optimal sparse data structure, but a set of several candidates with individual strengths and drawbacks. One solution to this problem is hybrid data structures, which locally adapt themselves to the sparsity. However, they typically suffer from increased traversal overhead, which limits their utility in many applications. This paper presents JiTTree, a novel sparse hybrid volume data structure that uses just-in-time compilation to overcome these problems. By combining multiple sparse data structures and reducing traversal overhead we leverage their individual advantages. We demonstrate that hybrid data structures adapt well to a large range of data sets. They are especially superior to other sparse data structures for data sets that locally vary in sparsity. Possible optimization criteria are memory, performance and a combination thereof. Through just-in-time (JIT) compilation, JiTTree reduces the traversal overhead of the resulting optimal data structure. As a result, our hybrid volume data structure enables efficient computations on the GPU, while being superior in terms of memory usage when compared to non-hybrid data structures.
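    To make the hybrid idea concrete, the CPU-side Python sketch below splits a volume into bricks, picks a per-brick representation from its occupancy (empty, dense, or coordinate list) and looks values up through a small dispatcher. It is a toy illustration of the concept only; the paper's GPU implementation and its just-in-time compilation of a specialised traversal kernel are not reproduced here.

```python
# Toy CPU-side sketch of a hybrid sparse volume: choose a representation per
# brick from its occupancy, then dispatch lookups accordingly. The JIT aspect
# of JiTTree (generating an optimised traversal kernel) is not reproduced.
import numpy as np

BRICK = 8

def build_bricks(volume, threshold=0.0):
    bricks = {}
    nz, ny, nx = volume.shape
    for z in range(0, nz, BRICK):
        for y in range(0, ny, BRICK):
            for x in range(0, nx, BRICK):
                b = volume[z:z+BRICK, y:y+BRICK, x:x+BRICK]
                occ = np.count_nonzero(b > threshold) / b.size
                if occ == 0.0:
                    bricks[(z, y, x)] = ("empty", None)
                elif occ > 0.5:
                    bricks[(z, y, x)] = ("dense", b.copy())
                else:  # store only the non-empty voxels (coordinate list)
                    idx = np.argwhere(b > threshold)
                    bricks[(z, y, x)] = ("coo", (idx, b[tuple(idx.T)]))
    return bricks

def sample(bricks, z, y, x):
    key = (z - z % BRICK, y - y % BRICK, x - x % BRICK)
    kind, payload = bricks.get(key, ("empty", None))
    if kind == "empty":
        return 0.0
    if kind == "dense":
        return float(payload[z % BRICK, y % BRICK, x % BRICK])
    idx, vals = payload
    hit = np.all(idx == (z % BRICK, y % BRICK, x % BRICK), axis=1)
    return float(vals[hit][0]) if hit.any() else 0.0

if __name__ == "__main__":
    vol = np.zeros((16, 16, 16), dtype=np.float32)
    vol[2, 3, 4] = 1.5           # a mostly-empty (sparse) brick
    vol[8:16, 8:16, 8:16] = 2.0  # a fully occupied (dense) brick
    bricks = build_bricks(vol)
    print(sample(bricks, 2, 3, 4), sample(bricks, 9, 9, 9), sample(bricks, 0, 0, 0))
```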

  1. JiTTree: A Just-in-Time Compiled Sparse GPU Volume Data Structure

    KAUST Repository

    Labschutz, Matthias; Bruckner, Stefan; Groller, M. Eduard; Hadwiger, Markus; Rautek, Peter

    2015-01-01

    Sparse volume data structures enable the efficient representation of large but sparse volumes in GPU memory for computation and visualization. However, the choice of a specific data structure for a given data set depends on several factors, such as the memory budget, the sparsity of the data, and data access patterns. In general, there is no single optimal sparse data structure, but a set of several candidates with individual strengths and drawbacks. One solution to this problem is hybrid data structures, which locally adapt themselves to the sparsity. However, they typically suffer from increased traversal overhead, which limits their utility in many applications. This paper presents JiTTree, a novel sparse hybrid volume data structure that uses just-in-time compilation to overcome these problems. By combining multiple sparse data structures and reducing traversal overhead we leverage their individual advantages. We demonstrate that hybrid data structures adapt well to a large range of data sets. They are especially superior to other sparse data structures for data sets that locally vary in sparsity. Possible optimization criteria are memory, performance and a combination thereof. Through just-in-time (JIT) compilation, JiTTree reduces the traversal overhead of the resulting optimal data structure. As a result, our hybrid volume data structure enables efficient computations on the GPU, while being superior in terms of memory usage when compared to non-hybrid data structures.

  2. JiTTree: A Just-in-Time Compiled Sparse GPU Volume Data Structure.

    Science.gov (United States)

    Labschütz, Matthias; Bruckner, Stefan; Gröller, M Eduard; Hadwiger, Markus; Rautek, Peter

    2016-01-01

    Sparse volume data structures enable the efficient representation of large but sparse volumes in GPU memory for computation and visualization. However, the choice of a specific data structure for a given data set depends on several factors, such as the memory budget, the sparsity of the data, and data access patterns. In general, there is no single optimal sparse data structure, but a set of several candidates with individual strengths and drawbacks. One solution to this problem is hybrid data structures, which locally adapt themselves to the sparsity. However, they typically suffer from increased traversal overhead, which limits their utility in many applications. This paper presents JiTTree, a novel sparse hybrid volume data structure that uses just-in-time compilation to overcome these problems. By combining multiple sparse data structures and reducing traversal overhead we leverage their individual advantages. We demonstrate that hybrid data structures adapt well to a large range of data sets. They are especially superior to other sparse data structures for data sets that locally vary in sparsity. Possible optimization criteria are memory, performance and a combination thereof. Through just-in-time (JIT) compilation, JiTTree reduces the traversal overhead of the resulting optimal data structure. As a result, our hybrid volume data structure enables efficient computations on the GPU, while being superior in terms of memory usage when compared to non-hybrid data structures.

  3. Compilation of MCNP data library based on JENDL-3T and test through analysis of benchmark experiment

    International Nuclear Information System (INIS)

    Sakurai, K.; Sasamoto, N.; Kosako, K.; Ishikawa, T.; Sato, O.; Oyama, Y.; Narita, H.; Maekawa, H.; Ueki, K.

    1989-01-01

    Based on an evaluated nuclear data library, JENDL-3T, a temporary version of JENDL-3, a pointwise neutron cross-section library for the MCNP code has been compiled, covering 39 nuclides from H-1 to Am-241 that are important for shielding calculations. The compilation is performed with a code system consisting of the nuclear data processing code NJOY-83 and the library compilation code MACROS. The validity of the code system and the reliability of the library are verified by analysing benchmark experiments. (author)

  4. A compilation of structural property data for computer impact calculation (1/5)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi; Nagata, Norio.

    1988-10-01

    The paper describes structural property data for computer impact calculations of nuclear fuel shipping casks. Four kinds of material data, mild steel, stainless steel, lead and wood, are compiled. These materials are main structural elements of shipping casks. Structural data such as the coefficient of thermal expansion, the modulus of longitudinal elasticity, the modulus of transverse elasticity, the Poisson's ratio and stress and strain relationships have been tabulated against temperature or strain rate. This volume 1 contains the structural property data and a data processing computer program. (author)

  5. GfW-handbook for data compilation of irradiation tested electronic components

    International Nuclear Information System (INIS)

    Wulf, F.; Braeunig, D.; Gaebler, W.

    1981-06-01

    The present second edition of the Data Compilation of Irradiation Tested Electronic Components is a continuation of the first edition and is published as a loose-leaf handbook. In addition to the 190 reports provided in the first edition, the present handbook contains a further 44 test reports on currently used semiconductor devices, in a comprehensive but easy-to-handle graphical and tabular presentation. Statistical values are given in order to facilitate the evaluation of component lifetime in a radiation environment. (orig.) [de

  6. NEA contributions to the worldwide collection, compilation and dissemination of nuclear reaction data

    International Nuclear Information System (INIS)

    Dupont, E.

    2012-01-01

    The NEA Data Bank is an international centre of reference for basic nuclear tools used in the analysis and prediction of phenomena in different nuclear applications. The Data Bank collects and compiles computer codes and scientific data and contributes to their improvement for the benefit of scientists in its member countries. In line with this mission, the Data Bank is a core centre of the International Network of Nuclear Reaction Data Centres (NRDC), which co-ordinates the worldwide collection, compilation and dissemination of nuclear reaction data. The NRDC network was established in 1976 from the earlier Four-Centres' Network created in 1966 by the United States, the NEA, the International Atomic Energy Agency (IAEA) and the former Soviet Union. Today, the NRDC is a worldwide co-operation network under the auspices of the IAEA, with 14 nuclear data centres from 8 countries and 2 international organisations belonging to the network. The main objective of the NRDC is to preserve, update and disseminate experimental nuclear reaction data that have been compiled for more than 40 years in a shared database (EXFOR). The EXFOR database contains basic nuclear data on low- to medium-energy experiments for incident neutron, photon and various charged-particle-induced reactions on a wide range of isotopes, natural elements and compounds. Today, with more than 140 000 data sets from approximately 20 000 experiments, EXFOR is by far the most important and complete experimental nuclear reaction database in the world and is widely used in the field of nuclear science and technology. The Data Bank is responsible for the collection and compilation of nuclear reaction data measured in its geographical area. Since 1966, the Data Bank has contributed around 5 000 experiments to the EXFOR database, and it continues to compile new data while maintaining the highest level of quality throughout the database. NRDC co-ordination meetings are held on a biennial basis. Recent meetings

  7. A compilation of structural property data for computer impact calculation (3/5)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1988-10-01

    The paper describes structural property data for computer impact calculations of nuclear fuel shipping casks. Four kinds of material data, mild steel, stainless steel, lead and wood, are compiled. These materials are main structural elements of shipping casks. Structural data such as the coefficient of thermal expansion, the modulus of longitudinal elasticity, the modulus of transverse elasticity, the Poisson's ratio and stress and strain relationships have been tabulated against temperature or strain rate. This volume 3 contains the structural property data for stainless steel. (author)

  8. A compilation of structural property data for computer impact calculation (2/5)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1988-10-01

    The paper describes structural property data for computer impact calculations of nuclear fuel shipping casks. Four kinds of material data, mild steel, stainless steel, lead and wood, are compiled. These materials are main structural elements of shipping casks. Structural data such as the coefficient of thermal expansion, the modulus of longitudinal elasticity, the modulus of transverse elasticity, the Poisson's ratio and stress and strain relationships have been tabulated against temperature or strain rate. This volume 2 contains the structural property data for mild steel. (author)

  9. Irradiation of bulbs and tuber crops. A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1997-04-01

    This publication contains a compilation of available scientific and technical data on the irradiation of bulbs and tuber crops. It is intended to assist governments in considering the authorization of this particular application of radiation processing of food and in ensuring its control in the facility and the control of irradiated food products moving in trade. The compilation was prepared in response to the requirement of the Codex General Standard for Irradiated Foods and associated Code that radiation treatment of food be justified on the basis of a technological need or of a need to improve the hygienic quality of the food. It was also in response to the recommendations of the FAO/IAEA/WHO/ITC-UNCTAD/GATT International Conference on the Acceptance, Control of and Trade in Irradiated Food (Geneva, 1989) concerning the need for regulatory control of radiation processing of food. 448 refs, 6 tabs

  10. Irradiation of strawberries. A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1994-12-01

    The document contains a compilation of all available scientific and technical data on the irradiation of strawberries. It is intended to assist governments in considering the authorization of this particular application of radiation processing of food and in ensuring its control in the facility and the control of irradiated food products moving in trade. The compilation was prepared in response to the requirement of the Codex General Standard for Irradiated Foods and associated Code that radiation treatment of food be justified on the basis of a technological need or of a need to improve the hygienic quality of food. It was prepared also in response to the recommendations of the FAO/IAEA/WHO/ITC-UNCTAD/GATT International conference on the Acceptance, Control of and Trade in Irradiated Food (Geneva, 1989) concerning the need for regulatory control of radiation processing of food. Refs, 1 tab

  11. Hydrogeological conditions in the Finnsjoen area. Compilation of data and conceptual model

    International Nuclear Information System (INIS)

    Andersson, J.E.; Nordqvist, R.; Nyberg, G.; Smellie, J.; Tiren, S.

    1991-02-01

    In the present report all available data gathered from the Finnsjoen area that are of potential use for numerical modelling are compiled and discussed. The data have been collected during different phases of the period 1977-1989. This inevitably means that the quality of the measured and interpreted data varies with the continuous development of improved equipment and interpretation techniques. The present report is an updated version of the SKB progress report 89-24 with the same title and authors, see introduction. (au)

  12. Status on the compilation of nuclear data for medical radioisotopes produced by accelerators

    International Nuclear Information System (INIS)

    Gandarias-Cruz, D.; Okamoto, K.

    1988-10-01

    The status of data on excitation functions and thick target yields for medical radioisotopes produced by accelerators is summarized. Most of the information was extracted from the compiled data in EXFOR (EXchange FORmat), a common format used by the co-operating nuclear data centres in the world. The nuclear decay mode, half-life, production method, Q-value, maximum cross-section value and the energy at this maximum are tabulated. For some commonly used reactions, the available excitation functions are plotted in graphs. (author). 353 refs
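    Excitation functions and thick-target yields are linked by a standard integral, Y(E₀) = (N_A/A) ∫₀^{E₀} σ(E)/S(E) dE, where S(E) is the mass stopping power; the sketch below evaluates it numerically for a made-up cross section and stopping power, not for any evaluated data set.

```python
# Hedged numerical sketch of a thick-target yield from an excitation function:
# Y(E0) = (N_A / A) * integral of sigma(E) / S(E) dE, with S the mass stopping
# power. The sigma and S tables below are made-up numbers for illustration.
import numpy as np

N_A = 6.02214076e23          # Avogadro constant, 1/mol

def thick_target_yield(e_grid_mev, sigma_mb, stopping_mev_cm2_per_g, a_g_per_mol):
    """Atoms produced per incident projectile, beam fully stopped in the target."""
    sigma_cm2 = np.asarray(sigma_mb) * 1e-27                 # 1 mb = 1e-27 cm^2
    integrand = sigma_cm2 / np.asarray(stopping_mev_cm2_per_g)
    de = np.diff(e_grid_mev)                                  # trapezoidal rule
    integral = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * de))
    return (N_A / a_g_per_mol) * integral

if __name__ == "__main__":
    e = np.linspace(5.0, 20.0, 16)                     # MeV; sigma assumed 0 below 5 MeV
    sigma = 300.0 * np.exp(-((e - 12.0) / 3.0) ** 2)   # mb, made-up excitation function
    stopping = 60.0 * (e / 10.0) ** -0.8               # MeV cm^2/g, made-up stopping power
    y = thick_target_yield(e, sigma, stopping, a_g_per_mol=56.0)
    print(f"{y:.3e} atoms per incident particle")
```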

  13. Compilation of floristic and herbarium specimen datain Iran: proposal to data structure

    Directory of Open Access Journals (Sweden)

    Majid Sharifi-Tehrani

    2013-09-01

    Floristic databases constitute the second level of plant information systems, after taxonomic-nomenclatural databases. This paper provides the details of the data structure and the available data resources needed to develop a floristic database, along with some explanations of taxonomic and floristic databases. It also proposes a shortcut to constructing a national floristic database by standardizing and compiling the dispersed floristic data held at the various botanical centres of Iran. Iran could thereby become the second country in the SW Asia region to have a national floristic database, and the resulting services could be offered to the national scientific community.

  14. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Inagaki, Y.; Nakamura, T.; Ukai, K.

    1976-01-01

    A compilation of data from single pion photoproduction experiments below 2 GeV is presented, together with the keywords that specify each experiment. The data are written on a magnetic tape; the data format and the keyword indices are given, and various programs for using the tape are also presented. The compiled records are of two types: the reference card, which carries the information about the experiment, and the data card. On the original tape, both reference and data cards are written entirely in A-type format. Copy tapes, written in various formats, are available on request. There are two kinds of copy tape: one is the same as the original tape, and the other differs in the data cards, which are written in F-type format following the data type. Each experiment on this tape is represented by three kinds of card: one reference card in A-type format, many data cards in F-type format, and one identifying card. Various programs written in FORTRAN are available for both the original and the copy tapes. (Kato, T.)

  15. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Ukai, K.; Nakamura, T.

    1984-09-01

    An updated data compilation on single pion photoproduction experiments below 2 GeV is presented. This data bank includes not only the data of single pion photoproduction processes but also those of proton Compton scattering (γp → γp) and of the inverse process of γn → π⁻p (π⁻p → γn). The total numbers of data points are 6240 for γp → π⁺n, 5715 for γp → π⁰p, 2835 for γn → π⁻p, 177 for γn → π⁰n, 669 for γp → γp, and 112 for π⁻p → γn. The compiled data are stored in the central computer (FACOM M-380R) of the Institute for Nuclear Study, University of Tokyo, for direct use of this data bank, and on magnetic tapes with the standard label for other laboratories. The FACOM computer is compatible with IBM 370 series or IBM 303X or 308X series machines. The data on the magnetic tapes are available on request. (Kato, T.)

  16. Principal facts for gravity data collected in the southern Albuquerque Basin area and a regional compilation, central New Mexico

    Science.gov (United States)

    Gillespie, Cindy L.; Grauch, V.J.S.; Oshetski, Kim; Keller, Gordon R.

    2000-01-01

    Principal facts for 156 new gravity stations in the southern Albuquerque basin are presented. These data fill a gap in existing data coverage. The compilation of the new data and two existing data sets into a regional data set of 5562 stations that cover the Albuquerque basin and vicinity is also described. Bouguer anomaly and isostatic residual gravity data for this regional compilation are available in digital form from ftp://greenwood.cr.usgs.gov/pub/openfile-reports/ofr-00-490.
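    For readers unfamiliar with the reported quantities, the sketch below applies the standard simple Bouguer reduction (free-air gradient 0.3086 mGal/m, Bouguer slab 0.04193·ρ mGal per metre with ρ in g/cm³, conventional reduction density 2.67 g/cm³, GRS 1980 normal gravity). Terrain and isostatic corrections, needed for the isostatic residual, are omitted, and the station values are made up for illustration.

```python
# Simple Bouguer reduction: BA = g_obs - gamma(lat) + 0.3086*h - 0.04193*rho*h
# (mGal, h in metres, rho in g/cm^3). Normal gravity uses the GRS 1980
# closed-form formula. Station values below are illustrative only.
import math

def normal_gravity_mgal(lat_deg):
    s = math.sin(math.radians(lat_deg))
    s2 = math.sin(2.0 * math.radians(lat_deg))
    return 978032.7 * (1.0 + 0.0053024 * s * s - 0.0000058 * s2 * s2)

def simple_bouguer_anomaly(g_obs_mgal, lat_deg, elev_m, rho=2.67):
    free_air = 0.3086 * elev_m
    bouguer_slab = 0.04193 * rho * elev_m
    return g_obs_mgal - normal_gravity_mgal(lat_deg) + free_air - bouguer_slab

if __name__ == "__main__":
    # hypothetical station near the Albuquerque basin (made-up numbers)
    print(f"{simple_bouguer_anomaly(979200.0, 34.8, 1520.0):.2f} mGal")
```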

  17. Compilation of reactor-physical data of the AVR experimental reactor for 1982

    International Nuclear Information System (INIS)

    Werner, H.; Wawrzik, U.; Grotkamp, T.; Buettgen, I.

    1983-12-01

    Since the end of 1981 the calculation model AVR-80 has been taken as a basis for compiling reactor-physical data of the AVR experimental reactor. A brief outline of the operation history of 1982 is given, including the beginning of a large-scale experiment dealing with change-over from high enriched uranium to low enriched uranium. Calculations relative to spectral shift, diffusion, temperature, burnup, and recirculation of the fuel elements are described in brief. The essential results of neutron-physical and thermodynamic calculations and the characteristical data of the various types of fuel used are shown in tables and illustrations. (RF) [de

  18. The NASA earth resources spectral information system: A data compilation, second supplement

    Science.gov (United States)

    Vincent, R. K.

    1973-01-01

    The NASA Earth Resources Spectral Information System (ERSIS) and the information contained therein are described. It is intended for use as a second supplement to the NASA Earth Resources Spectral Information System: A Data Compilation, NASA CR-31650-24-T, May 1971. The current supplement includes approximately 100 rock and mineral, and 375 vegetation directional reflectance spectral curves in the optical region from 0.2 to 22.0 microns. The data were categorized by subject and each curve plotted on a single graph. Each graph is fully titled to indicate curve source and indexed by subject to facilitate user retrieval from ERSIS magnetic tape records.

  19. Compilation of nuclear decay data used for dose calculations. Data for radionuclides not listed in ICRP publication 38

    Energy Technology Data Exchange (ETDEWEB)

    Endo, Akira; Yamaguchi, Yasuhiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Tamura, Tsutomu

    1999-07-01

    Nuclear decay data used for dose calculations were compiled for 162 nuclides with half-lives greater than or equal to 10 min that are not listed in ICRP Publication 38 (Publ. 38), and for their 28 daughter nuclides. An additional 14 nuclides that are considered to be important in fusion reactor facilities were also included. The data were compiled using decay data sets of the Evaluated Nuclear Structure Data File (ENSDF), the latest version in August 1997. Investigations of the data sets were performed to check their consistency by referring to recent literature and NUBASE, the database for nuclear and decay properties of nuclides, and by using the utility programs of ENSDF. Possible revisions of the data sets were made for their format and syntax errors, level schemes, normalization records, and so on. The revised data sets were processed by EDISTR in order to calculate the energies and intensities of α particles, β particles, γ rays including annihilation photons, internal conversion electrons, X rays, and Auger electrons emitted in nuclear transformations of the radionuclides. For spontaneously fissioning nuclides, the average energies and intensities of neutrons, fission fragments, prompt γ rays, delayed γ rays, and β particles were also calculated. The compiled data were presented in two types of format: Publ. 38 and NUCDECAY formats. This report provides the decay data in the Publ. 38 format along with decay scheme drawings. The data will be widely used for internal and external dose calculations in radiation protection. (author)
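    Quantities for dose calculations follow directly from such tabulations; for example, the mean energy emitted per nuclear transformation for one radiation type is the sum of yield times energy over the emission list. The sketch below computes this for a small, roughly Cs-137-like emission list that is illustrative only and not taken from Publ. 38 or ENSDF.

```python
# Minimal sketch of a quantity derived from tabulated decay data: the mean
# energy emitted per nuclear transformation for a given radiation type,
# sum_i (yield_i * energy_i). The emission list is illustrative (roughly
# Cs-137-like), not evaluated data from Publ. 38 or ENSDF.

def mean_energy_per_decay(emissions, kind):
    """emissions: iterable of (type, yield_per_decay, energy_MeV)."""
    return sum(y * e for t, y, e in emissions if t == kind)

if __name__ == "__main__":
    demo = [("gamma", 0.85, 0.662),   # illustrative gamma line
            ("beta-", 0.946, 0.174),  # illustrative mean beta energies
            ("beta-", 0.054, 0.416)]
    print(f"mean gamma energy per decay: {mean_energy_per_decay(demo, 'gamma'):.3f} MeV")
    print(f"mean beta energy per decay:  {mean_energy_per_decay(demo, 'beta-'):.3f} MeV")
```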

  20. Priorities for injury prevention in women's Australian football: a compilation of national data from different sources.

    Science.gov (United States)

    Fortington, Lauren V; Finch, Caroline F

    2016-01-01

    Participation in Australian football (AF) has traditionally been male-dominated, and the current understanding of injury and the priorities for prevention are based solely on reports of injuries in male players. There is evidence from other sports that injury types differ between males and females. With increasing participation in AF by females, it is important to consider their specific injury and prevention needs. This study aimed to provide a first injury profile for female AF from existing sources. Injury data were compiled from four prospectively recorded data sets relating to female AF: (1) hospital admissions in Victoria, 2008/09-13/14, n=500 injuries; (2) emergency department (ED) presentations in Victoria, 2008/09-2012/13, n=1,879 injuries; (3) insurance claims across Australia, 2004-2013, n=522 injuries; (4) West Australian Women's Football League (WAWFL), 2014 season club data, n=49 injuries. Descriptive results are presented as injury frequencies, injury types and injured body parts. Hospital admissions and ED presentations were dominated by upper limb injuries, representing 47% and 51% of all injuries, respectively, primarily to the wrist/hand at 32% and 40%. Most (65%) insurance claim injuries involved the lower limb, 27% of which were for knee ligament damage. A high proportion of concussions (33%) was reported in the club-collected data. The results provide the first compilation of existing data sets of women's AF injuries and highlight the need for a rigorous and systematic injury surveillance system to be instituted.

  1. Compilation of reactor physics data of the year 1984, AVR reactor

    International Nuclear Information System (INIS)

    Werner, H.; Bergerfurth, A.; Thomas, F.; Geskes, B.

    1985-12-01

    The 'AVR reactor physics data' is a documentation published once a year, the data presented being obtained by a simulation of reactor operation using the AVR-80 numerical model. This model is constantly updated and improved in response to new results and developments in the field of reactor theory and thermohydraulics, and in response to theoretical or practical modifications of reactor operation or in the computer system. The large variety of measured data available in the AVR reactor simulation system also makes it an ideal testing system for verification of the computing programs presented in the compilation. A survey of the history of operations in 1984 and a short explanation of the computerized simulation methods are followed by tables and graphs that serve as a source of topical data for readers interested in the physics of high-temperature pebble-bed reactors. (orig./HP) [de

  2. Compilation of reactor physics data of the year 1983, AVR reactor

    International Nuclear Information System (INIS)

    Werner, H.; Bergerfurth, A.; Thomas, F.; Geskes, B.

    1985-06-01

    The 'AVR reactor physics data' is a documentation published once a year, the data presented being obtained by a simulation of reactor operation using the AVR-80 numerical model. This model is constantly updated and improved in response to new results and developments in the field of reactor theory and thermohydraulics, and in response to theoretical or practical modifications of reactor operation or in the computer system. The large variety of measured data available in the AVR reactor simulation system also makes it an ideal testing system for verification of the computing programs presented in the compilation. A survey of the history of operations in 1983 and a short explanation of the computerized simulation methods are followed by tables and graphs that serve as a source of topical data for readers interested in the physics of high-temperature pebble-bed reactors. (orig./HP) [de

  3. Compilation of nucleon-nucleon and nucleon-antinucleon elastic scattering data

    International Nuclear Information System (INIS)

    Carter, M.K.; Collins, P.D.B.; Whalley, M.R.

    1986-01-01

    A compilation of the data on pp, pn, nn, p̄p, p̄n, n̄p, and n̄n is presented, in both tabular and graphical form, including when available the total and elastic cross sections, the differences of the total cross section in different spin states, the ratio of the real to imaginary part of the forward scattering amplitude, the elastic differential cross sections, the polarization asymmetry and the spin correlation parameters, for all laboratory-frame momenta ≥ 2 GeV/c. All the data in this review can be found in and retrieved from the Durham-RAL HEP data base together with data on a wide variety of other reactions. (author)

  4. Data compilation and assessment for water resources in Pennsylvania state forest and park lands

    Science.gov (United States)

    Galeone, Daniel G.

    2011-01-01

    As a result of a cooperative study between the U.S. Geological Survey and the Pennsylvania Department of Conservation and Natural Resources (PaDCNR), available electronic data were compiled for Pennsylvania state lands (state forests and parks) to allow PaDCNR to initially determine if data exist to make an objective evaluation of water resources for specific basins. The data compiled included water-quantity and water-quality data and sample locations for benthic macroinvertebrates within state-owned lands (including a 100-meter buffer around each land parcel) in Pennsylvania. In addition, internet links or contacts for geographic information system coverages pertinent to water-resources studies also were compiled. Water-quantity and water-quality data primarily available through January 2007 were compiled and summarized for site types that included streams, lakes, ground-water wells, springs, and precipitation. Data were categorized relative to 35 watershed boundaries defined by the Pennsylvania Department of Environmental Protection for resource-management purposes. The primary sources of continuous water-quantity data for Pennsylvania state lands were the U.S. Geological Survey (USGS) and the National Weather Service (NWS). The USGS has streamflow data for 93 surface-water sites located in state lands; 38 of these sites have continuous-recording data available. As of January 2007, 22 of these 38 streamflow-gaging stations were active; the majority of active gaging stations have over 40 years of continuous record. The USGS database also contains continuous ground-water elevation data for 32 wells in Pennsylvania state lands, 18 of which were active as of January 2007. Sixty-eight active precipitation stations (primarily from the NWS network) are located in state lands. The four sources of available water-quality data for Pennsylvania state lands were the USGS, U.S. Environmental Protection Agency, Pennsylvania Department of Environmental Protection (PaDEP), and

  5. Compilation of field-scale caisson data on solute transport in the unsaturated zone

    International Nuclear Information System (INIS)

    Polzer, W.L.; Essington, E.H.; Fuentes, H.R.; Nyhan, J.W.

    1986-11-01

    Los Alamos National Laboratory has conducted technical support studies to assess siting requirements mandated by the Nuclear Regulatory Commission in 10 CFR Part 61. Field-scale transport studies were conducted under unsaturated moisture conditions and under steady and unsteady flow conditions in large caissons located and operated in a natural (field) environment. Moisture content, temperature, flow rate, base-line chemical, tracer influent, and tracer breakthrough data collected during tracer migration studies in the caisson are compiled in tables and graphs. Data suggest that the imposition of a period of drainage (influent solution flow was stopped) may cause an increase in tracer concentration in the soil solution at various sampling points in the caisson. Evaporation during drainage and diffusion of the tracers from immobile to mobile water are two phenomena that could explain the increase. Data also suggest that heterogeneity of sorption sites may increase the variability in transport of sorbing tracers compared with nonsorbing tracers

  6. Compilation of nuclear decay data used for dose calculation. Revised data for radionuclides listed in ICRP Publication 38

    International Nuclear Information System (INIS)

    Endo, Akira; Yamaguchi, Yasuhiro

    2001-03-01

    New nuclear decay data used for dose calculation have been compiled for 817 radionuclides that are listed in ICRP Publication 38 (Publ. 38) and for 6 additional isomers. The decay data were prepared using decay data sets from the Evaluated Nuclear Structure Data File (ENSDF), the latest version in August 1997. Basic nuclear properties in the decay data sets that are particularly important for calculating energies and intensities of emissions were examined and updated by referring to NUBASE, the database for nuclear and decay properties of nuclides. The reviewed and updated data were half-life, decay mode and its branching ratio, spin and parity of the ground and isomeric states, excitation energy of isomers, and Q value. In addition, possible revisions of partial and incomplete decay data sets were done for their format and syntax errors, level schemes, normalization records, and so on. After that, the decay data sets were processed by EDISTR in order to compute the energies and intensities of α particles, β particles, γ rays, internal conversion electrons, X rays, and Auger electrons emitted in nuclear transformation. For spontaneously fissioning nuclides, the average energies and intensities of neutrons, fission fragments, prompt γ rays, delayed γ rays, and β particles were also calculated. The compiled data were prepared in two different types of format: Publ. 38 and NUCDECAY formats. Comparison of the compiled decay data with those in Publ. 38 was also presented. The decay data will be widely used for internal and external dose calculations in radiation protection and will be beneficial to a future revision of ICRP Publ. 38. (author)

  7. Compilation of nuclear decay data used for dose calculation. Revised data for radionuclides listed in ICRP Publication 38

    Energy Technology Data Exchange (ETDEWEB)

    Endo, Akira; Yamaguchi, Yasuhiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-03-01

    New nuclear decay data used for dose calculation have been compiled for 817 radionuclides that are listed in ICRP Publication 38 (Publ. 38) and for 6 additional isomers. The decay data were prepared using decay data sets from the Evaluated Nuclear Structure Data File (ENSDF), the latest version in August 1997. Basic nuclear properties in the decay data sets that are particularly important for calculating energies and intensities of emissions were examined and updated by referring to NUBASE, the database for nuclear and decay properties of nuclides. The reviewed and updated data were half-life, decay mode and its branching ratio, spin and parity of the ground and isomeric states, excitation energy of isomers, and Q value. In addition, possible revisions of partial and incomplete decay data sets were done for their format and syntax errors, level schemes, normalization records, and so on. After that, the decay data sets were processed by EDISTR in order to compute the energies and intensities of α particles, β particles, γ rays, internal conversion electrons, X rays, and Auger electrons emitted in nuclear transformation. For spontaneously fissioning nuclides, the average energies and intensities of neutrons, fission fragments, prompt γ rays, delayed γ rays, and β particles were also calculated. The compiled data were prepared in two different types of format: Publ. 38 and NUCDECAY formats. Comparison of the compiled decay data with those in Publ. 38 was also presented. The decay data will be widely used for internal and external dose calculations in radiation protection and will be beneficial to a future revision of ICRP Publ. 38. (author)

  8. Compilation of monitoring data on environmental concentration and pharmaceuticals; Zusammenstellung von Monitoringdaten zu Umweltkonzentrationen und Arzneimitteln

    Energy Technology Data Exchange (ETDEWEB)

    Bergmann, Axel; Fohrmann, Reinhard [IWW Rheinisch-Westfaelisches Institut fuer Wasser Beratungs- und Entwicklungsgesellschaft mbH, Muelheim an der Ruhr (Germany); Weber, Frank-Andreas [IWW Rheinisch-Westfaelisches Institut fuer Wasser Beratungs- und Entwicklungsgesellschaft mbH, Biebesheim am Rhein (Germany)

    2011-10-15

    In a comprehensive literature review we compiled an inventory of German and European monitoring data on the occurrence and behavior of pharmaceuticals in the environment. Environmental concentrations measured in various field campaigns and results of ecotoxicological and physico-chemical investigations were integrated in three databases. The analysis of these databases was used to identify priority pharmaceuticals and to suggest strategies for further monitoring. The database MEC reports 274 pharmaceuticals (both human and veterinary pharmaceuticals, of which 27 are metabolites), for which measured concentrations were available for one of the matrices sewage effluent, surface water, groundwater, drinking water, sewage sludge, manure, soil or sediment (10,150 database entries). The database OeKOTOX compiles 251 pharmaceuticals, for which ecotoxicological effect concentrations for at least one test organism are available in the literature, and the database "Umweltverhalten" includes physico-chemical parameters of 183 compounds. The compiled citations of the relevant literature (1,382 citations) were provided for further use in the bibliographic software Reference Manager. The analysis of the databases shows that for only a subset of 70 pharmaceuticals measured concentrations can be evaluated based on ecotoxicological effect concentrations. The estimation of PNEC-values (Predicted No Effect Concentration) allowed for the identification of 19 pharmaceuticals with sufficient and 9 pharmaceuticals with poor ecotoxicological data which presumably endanger ecosystems in at least one river section in Germany. Special attention should be paid to "novel" pharmaceuticals, for which missing environmental and/or ecotoxicological data prevent a reliable risk assessment, but dramatically increasing consumption rates point to a high risk potential. The prioritization of pharmaceuticals presented by the authors considers the ecotoxicological effect concentrations, the occurrence
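    The screening step implied by comparing measured concentrations with PNEC values is usually expressed as a risk quotient RQ = MEC/PNEC, flagged when RQ ≥ 1; the sketch below illustrates this with hypothetical compounds and values, not entries from the MEC or OeKOTOX databases.

```python
# Sketch of the standard screening step: a risk quotient RQ = MEC / PNEC per
# compound, flagged when RQ >= 1. The substances and numbers below are
# hypothetical placeholders, not values from the databases described above.

def risk_quotients(mec_ug_per_l, pnec_ug_per_l):
    rq = {}
    for compound, mec in mec_ug_per_l.items():
        pnec = pnec_ug_per_l.get(compound)
        rq[compound] = None if pnec is None else mec / pnec
    return rq

if __name__ == "__main__":
    mec = {"compound_A": 0.50, "compound_B": 0.02, "compound_C": 0.10}
    pnec = {"compound_A": 0.10, "compound_B": 1.00}   # no PNEC for compound_C
    for name, rq in risk_quotients(mec, pnec).items():
        status = "no PNEC -> cannot assess" if rq is None else ("flag" if rq >= 1 else "ok")
        print(f"{name}: RQ={rq} ({status})")
```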

  10. Annual accumulation over the Greenland ice sheet interpolated from historical and newly compiled observation data

    Science.gov (United States)

    Shen, Dayong; Liu, Yuling; Huang, Shengli

    2012-01-01

    The estimation of ice/snow accumulation is of great significance in quantifying the mass balance of ice sheets and variation in water resources. Improving the accuracy and reducing uncertainty has been a challenge for the estimation of annual accumulation over the Greenland ice sheet. In this study, we kriged and analyzed the spatial pattern of accumulation based on an observation data series comprising 315 points used in a recent study, plus 101 ice cores and snow pits and newly compiled data from 23 coastal weather stations. The estimated annual accumulation over the Greenland ice sheet is 31.2 g cm^-2 yr^-1, with a standard error of 0.9 g cm^-2 yr^-1. The main differences between the improved map developed in this study and the recently published accumulation maps are in the coastal areas, especially the southeast and southwest regions. The analysis of accumulation versus elevation reveals the distribution patterns of accumulation over the Greenland ice sheet.
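
    The interpolation step can be pictured with an ordinary-kriging routine; the sketch below assumes the pykrige package and uses a handful of hypothetical observation points rather than the study's compiled data series.

```python
# Sketch of kriging sparse accumulation observations onto a regular grid,
# assuming the pykrige package is available. The observation points below are
# hypothetical placeholders; a real workflow would project the geographic
# coordinates (or use a geographic distance metric) before kriging.
import numpy as np
from pykrige.ok import OrdinaryKriging

# Hypothetical observations: x, y (projected km) and accumulation (g cm^-2 yr^-1)
x = np.array([120.0, 340.0, 510.0, 205.0, 430.0])
y = np.array([900.0, 760.0, 640.0, 820.0, 700.0])
acc = np.array([45.0, 30.0, 22.0, 55.0, 18.0])

ok = OrdinaryKriging(x, y, acc, variogram_model="spherical")

grid_x = np.arange(100.0, 600.0, 50.0)
grid_y = np.arange(600.0, 950.0, 50.0)
z, var = ok.execute("grid", grid_x, grid_y)  # estimates and kriging variance

print(z.shape, float(z.mean()))
```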

  11. Depleted uranium hexafluoride management program : data compilation for the Paducah site

    International Nuclear Information System (INIS)

    Hartmann, H.

    2001-01-01

    This report is a compilation of data and analyses for the Paducah site, near Paducah, Kentucky. The data were collected and the analyses were done in support of the U.S. Department of Energy (DOE) 1999 Programmatic Environmental Impact Statement for Alternative Strategies for the Long-Term Management and Use of Depleted Uranium Hexafluoride (DOE/EIS-0269). The report describes the affected environment at the Paducah site and summarizes potential environmental impacts that could result from conducting the following depleted uranium hexafluoride (UF6) activities at the site: continued cylinder storage, preparation of cylinders for shipment, conversion, and long-term storage. DOE's preferred alternative is to begin converting the depleted UF6 inventory as soon as possible to either uranium oxide, uranium metal, or a combination of both, while allowing for use of as much of this inventory as possible

  12. Depleted uranium hexafluoride management program : data compilation for the Portsmouth site

    International Nuclear Information System (INIS)

    Hartmann, H. M.

    2001-01-01

    This report is a compilation of data and analyses for the Portsmouth site, near Portsmouth, Ohio. The data were collected and the analyses were done in support of the U.S. Department of Energy (DOE) 1999 Programmatic Environmental Impact Statement for Alternative Strategies for the Long-Term Management and Use of Depleted Uranium Hexafluoride (DOE/EIS-0269). The report describes the affected environment at the Portsmouth site and summarizes potential environmental impacts that could result from conducting the following depleted uranium hexafluoride (UF6) management activities at the site: continued cylinder storage, preparation of cylinders for shipment, conversion, and long-term storage. DOE's preferred alternative is to begin converting the depleted UF6 inventory as soon as possible to either uranium oxide, uranium metal, or a combination of both, while allowing for use of as much of this inventory as possible

  13. Compilation of data concerning known and suspected water hammer events in nuclear power plants, CY 1969

    International Nuclear Information System (INIS)

    Chapman, R.L.; Christensen, D.D.; Dafoe, R.E.; Hanner, O.M.; Wells, M.E.

    1981-05-01

    This report compiles data concerning known and suspected water hammer events reported by BWR and PWR power plants in the United States from January 1, 1969, to May 1, 1981. This information is summarized for each event and is tabulated for all events by plant, plant type, year of occurrence, type of water hammer, system affected, basis/cause for the event, and damage incurred. Information is also included from other events not specifically identified as water hammer related. These other events involved vibration and/or system components similar to those involved in the water hammer events. The other events are included to ensure completeness of the report, but are not used to point out particular facts or trends. This report does not evaluate findings abstracted from the data

  14. Compilation of selected deep-sea biological data for the US subseabed disposal project

    International Nuclear Information System (INIS)

    Gomez, L.S.; Marietta, M.G.; Jackson, D.W.

    1987-03-01

    The US Subseabed Disposal Project (SDP) has compiled an extensive deep-sea biological data base to be used in calculating biological parameters of state and rate included in mathematical models of oceanographic transport of radionuclides. The data base is organized around a model deep-sea ecosystem which includes the following components: zooplankton, fish and other nekton, invertebrate benthic megafauna, benthic macrofauna, benthic meiofauna, heterotrophic microbiota, as well as suspended and sediment particulate organic carbon. Measurements of abundance and activity rates (e.g., respiration, production, sedimentation, etc.) reported in the international oceanographic literature are summarized in 23 tables. Included in these tables are the latitudinal positions of the studies, as well as information describing sampling techniques and any special notes needed to better assess the data presented. This report has been prepared primarily as a resource document to be used in calculating parameter values for various modeling applications, and for preparing historical data reviews for other SDP reports. Depending on the intended use, these data will require further reduction and unit conversion

  15. Mineralogy and geochemistry of rocks and fracture fillings from Forsmark and Oskarshamn: Compilation of data for SR-Can

    Energy Technology Data Exchange (ETDEWEB)

    Drake, Henrik; Sandstroem, Bjoern [Isochron GeoConsulting HB, Goeteborg (Sweden); Tullborg, Eva-Lena [Terralogica AB, Graabo (Sweden)

    2006-11-15

    This report is a compilation of the data so far available for the safety assessment SR-Can carried out by SKB. The data consist of mineralogy, geochemistry, porosity, density and redox properties for both the dominating rock types and fracture fillings at the Forsmark and Oskarshamn candidate areas. In addition to the compilation of existing information, the aim has been to identify missing data and to clarify some concepts, e.g. regarding deformation zones. The objective of the report is to present the available data requested for the modelling of the chemical stability of the two sites. The report includes no interpretation of the data.

  16. Animal mortality resulting from uniform exposures to photon radiations: Calculated LD50s and a compilation of experimental data

    International Nuclear Information System (INIS)

    Jones, T.D.; Morris, M.D.; Wells, S.M.; Young, R.W.

    1986-12-01

    Studies conducted during the 1950s and 1960s of radiation-induced mortality in diverse animal species under various exposure protocols were compiled into a mortality data base. Some 24 variables were extracted and recomputed from each of the published studies, which were collected from a variety of available sources, primarily journal articles. Two features of this compilation effort are (1) an attempt to give an estimate of the uniform dose received by the bone marrow in each treatment so that interspecies differences due to body size were minimized and (2) a recomputation of the LD50 where sufficient experimental data are available. Exposure rates varied in magnitude from about 10^-2 to 10^3 R/min. This report describes the data base, the sources of data, and the data-handling techniques; presents a bibliography of studies compiled; and tabulates data from each study. 103 refs., 44 tabs
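
    A classical way to recompute an LD50 from dose-mortality pairs is a probit fit against log dose; the sketch below is a generic illustration with hypothetical dose-response values, not the recomputation procedure used for this data base.

```python
# Generic probit-style LD50 estimate: regress the probit (inverse normal CDF)
# of the observed mortality fraction on log10(dose) and solve for the dose at
# which the fitted probit is zero (50% mortality). All values are hypothetical.
import numpy as np
from scipy.stats import norm

dose = np.array([2.0, 4.0, 6.0, 8.0])            # Gy, hypothetical
mortality = np.array([0.10, 0.35, 0.70, 0.95])   # fraction dying, hypothetical

log_dose = np.log10(dose)
probit = norm.ppf(mortality)

slope, intercept = np.polyfit(log_dose, probit, 1)  # simple unweighted fit
ld50 = 10 ** (-intercept / slope)
print(f"estimated LD50 = {ld50:.2f} Gy")
```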

  17. EPOCA/EUR-OCEANS data compilation on the biological and biogeochemical responses to ocean acidification

    Directory of Open Access Journals (Sweden)

    A.-M. Nisumaa

    2010-07-01

    The uptake of anthropogenic CO2 by the oceans has led to a rise in the oceanic partial pressure of CO2, and to a decrease in pH and carbonate ion concentration. This modification of the marine carbonate system is referred to as ocean acidification. Numerous papers report the effects of ocean acidification on marine organisms and communities but few have provided details concerning full carbonate chemistry and complementary observations. Additionally, carbonate system variables are often reported in different units, calculated using different sets of dissociation constants and on different pH scales. Hence the direct comparison of experimental results has been problematic and often misleading. The need was identified to (1) gather data on carbonate chemistry, biological and biogeochemical properties, and other ancillary data from published experimental data, (2) transform the information into a common framework, and (3) make the data freely available. The present paper is the outcome of an effort to integrate ocean carbonate chemistry data from the literature which has been supported by the European Network of Excellence for Ocean Ecosystems Analysis (EUR-OCEANS) and the European Project on Ocean Acidification (EPOCA). A total of 185 papers were identified, 100 contained enough information to readily compute carbonate chemistry variables, and 81 data sets were archived at PANGAEA – The Publishing Network for Geoscientific & Environmental Data. This data compilation is regularly updated as an ongoing mission of EPOCA.

    Data access: http://doi.pangaea.de/10.1594/PANGAEA.735138
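
    One small piece of the common framework described above is plain unit harmonization; the sketch below converts carbonate chemistry variables reported per litre into the per-kilogram units used for calculations, with an assumed seawater density (conversions between pH scales or dissociation-constant sets need dedicated tools and are not shown).

```python
# Minimal illustration of one harmonization step: converting dissolved
# inorganic carbon (DIC) and total alkalinity (TA) from umol per litre to
# umol per kilogram of seawater. The density and concentrations below are
# assumed placeholder values; pH-scale and constant-set conversions are
# deliberately left out.

rho_sw = 1.025  # kg per litre, assumed in-situ seawater density

reported_umol_per_L = {
    "DIC": 2150.0,  # hypothetical literature value
    "TA": 2350.0,   # hypothetical literature value
}

for name, value in reported_umol_per_L.items():
    print(f"{name}: {value / rho_sw:.1f} umol/kg-sw")
```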

  18. Discipline, Dilemmas, Decisions and Data Distribution in the Planning and Compilation of Monolingual Dictionaries

    Directory of Open Access Journals (Sweden)

    Rufus H Gouws

    2011-10-01

    Abstract: Bilingual dictionaries play an important role in the standardisation of a language and are often the first dictionary type to be compiled for a given speech community. However, this should never lead to an underestimation of the role and importance of monolingual descriptive dictionaries in the early lexicographic development of a language. In the planning of first descriptive dictionaries, the choice of the proper subtype and a consistent application of theoretical principles should be regarded as of extreme importance. Even the compilation of a restricted descriptive dictionary should be done according to theoretical principles similar to those applying to comprehensive dictionaries. This contribution indicates a number of dilemmas confronting the lexicographer during the compilation of restricted monolingual descriptive dictionaries. Attention is given to the role of lexicographic functions and the choice and presentation of lexicographic data, with special reference to the presentation of certain types of polysemous senses which are subjected to frequency-of-use restrictions. Emphasis is placed on the value of a heterogeneous article structure and a micro-architecture in the articles of restricted dictionaries.

    Keywords: ACCESS STRUCTURE, DATA DISTRIBUTION, FRAME STRUCTURE, FREQUENCY OF USE, HETEROGENEOUS ARTICLE STRUCTURE, LEXICOGRAPHIC FUNCTIONS, LEXICOGRAPHIC PROCESS, MICRO-ARCHITECTURE, MONOLINGUAL DICTIONARY, POLYSEMY, SEMANTIC DATA, TEXT BLOCK, USER-FRIENDLINESS, USER-PERSPECTIVE, VERTICAL ARCHITECTONIC EXTENSION

    Summary: Discipline, dilemmas, decisions and data distribution in the planning and compilation of monolingual dictionaries. Bilingual dictionaries play an important role in the standardisation of a language and are often the first dictionary type to be compiled for a given speech community. However, this should not lead to an underestimation of the role and value of monolingual descriptive dictionaries in the

  19. Depleted uranium hexafluoride management program : data compilation for the K-25 site

    International Nuclear Information System (INIS)

    Hartmann, H. M.

    2001-01-01

    This report is a compilation of data and analyses for the K-25 site on the Oak Ridge Reservation, Oak Ridge, Tennessee. The data were collected and the analyses were done in support of the U.S. Department of Energy (DOE) 1999 Programmatic Environmental Impact Statement for Alternative Strategies for the Long-Term Management and Use of Depleted Uranium Hexafluoride (DOE/EIS-0269). The report describes the affected environment at the K-25 site and summarizes the potential environmental impacts that could result from continued cylinder storage and preparation of cylinders for shipment at the site. It is probable that the cylinders at the K-25 site will be shipped to another site for conversion. Because conversion and long-term storage of the entire inventory at the K-25 site are highly unlikely, these data are not presented in this report. DOE's preferred alternative is to begin converting the depleted uranium hexafluoride inventory as soon as possible to either uranium oxide, uranium metal, or a combination of both, while allowing for use of as much of this inventory as possible

  20. MIRNA-DISTILLER: a stand-alone application to compile microRNA data from databases

    Directory of Open Access Journals (Sweden)

    Jessica K. Rieger

    2011-07-01

    MicroRNAs (miRNA) are small non-coding RNA molecules of ~22 nucleotides which regulate large numbers of genes by binding to seed sequences at the 3’-UTR of target gene transcripts. The target mRNA is then usually degraded or its translation is inhibited, thus resulting in posttranscriptional down-regulation of gene expression at the mRNA and/or protein level. Due to the bioinformatic difficulties in predicting functional miRNA binding sites, several publicly available databases have been developed that predict miRNA binding sites based on different algorithms. The parallel use of different databases is currently indispensable, but cumbersome and time-consuming, especially when working with numerous genes of interest. We have therefore developed a new stand-alone program, termed MIRNA-DISTILLER, which compiles miRNA data for given target genes from public databases. Currently implemented are TargetScan, microCosm, and miRDB, which may be queried independently, pairwise, or together to calculate the respective intersections. Data are stored locally for application of further analysis tools including freely definable biological parameter filters, customized output lists for both miRNAs and target genes, and various graphical facilities. The software, a data example file and a tutorial are freely available at http://www.ikp-stuttgart.de/content/language1/html/10415.asp

  1. MIRNA-DISTILLER: A Stand-Alone Application to Compile microRNA Data from Databases.

    Science.gov (United States)

    Rieger, Jessica K; Bodan, Denis A; Zanger, Ulrich M

    2011-01-01

    MicroRNAs (miRNA) are small non-coding RNA molecules of ∼22 nucleotides which regulate large numbers of genes by binding to seed sequences at the 3'-untranslated region of target gene transcripts. The target mRNA is then usually degraded or its translation is inhibited, thus resulting in posttranscriptional down-regulation of gene expression at the mRNA and/or protein level. Due to the bioinformatic difficulties in predicting functional miRNA binding sites, several publicly available databases have been developed that predict miRNA binding sites based on different algorithms. The parallel use of different databases is currently indispensable, but cumbersome and time-consuming, especially when working with numerous genes of interest. We have therefore developed a new stand-alone program, termed MIRNA-DISTILLER, which compiles miRNA data for given target genes from public databases. Currently implemented are TargetScan, microCosm, and miRDB, which may be queried independently, pairwise, or together to calculate the respective intersections. Data are stored locally for application of further analysis tools including freely definable biological parameter filters, customized output lists for both miRNAs and target genes, and various graphical facilities. The software, a data example file and a tutorial are freely available at http://www.ikp-stuttgart.de/content/language1/html/10415.asp.
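
    The independent, pairwise, and three-way queries mentioned above can be pictured as set operations over the per-database prediction lists; the sketch below uses hypothetical miRNA identifiers and is not output from MIRNA-DISTILLER itself.

```python
# Sketch of the intersection logic for one target gene: each database
# contributes a set of predicted miRNAs, reported individually, pairwise,
# or as a three-way intersection. The miRNA names below are hypothetical.
from itertools import combinations

predictions = {
    "TargetScan": {"miR-1", "miR-21", "miR-155", "miR-200a"},
    "microCosm":  {"miR-21", "miR-155", "miR-34a"},
    "miRDB":      {"miR-21", "miR-155", "miR-200a"},
}

for db_a, db_b in combinations(predictions, 2):
    common = predictions[db_a] & predictions[db_b]
    print(f"{db_a} & {db_b}: {sorted(common)}")

all_three = set.intersection(*predictions.values())
print("predicted by all three:", sorted(all_three))
```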

  2. Radioactive waste management profiles. Compilation of data from the waste management data base. No. 2, April 1994

    International Nuclear Information System (INIS)

    1994-01-01

    In 1989, the International Atomic Energy Agency began development of the Waste Management Data Base (WMDB), primarily to establish a mechanism for the collection, integration, storage, and retrieval of information relevant to radioactive waste management in Member States. The current report is a summary and compilation of the data received during the 1991 biennial update, which covers the period of January 1991 through March 1993. This Profile report is divided into two main parts. One part describes the Waste Management Data Base system and the type of information it contains. The second part contains data provided by Member States in response to the IAEA's 1991 WMDB Questionnaire. This report also contains data of Member States that did not report to the Questionnaire

  3. Radioactive waste management profiles. Compilation of data from the waste management data base. No. 2, April 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    In 1989, the International Atomic Energy Agency began development of the Waste Management Data Base (WMDB), primarily to establish a mechanism for the collection, integration, storage, and retrieval of information relevant to radioactive waste management in Member States. The current report is a summary and compilation of the data received during the 1991 biennial update, which covers the period of January 1991 through March 1993. This Profile report is divided into two main parts. One part describes the Waste Management Data Base system and the type of information it contains. The second part contains data provided by Member States in response to the IAEA's 1991 WMDB Questionnaire. This report also contains data of Member States that did not report to the Questionnaire. 3 figs, 5 tabs

  4. Compilation of criticality data involving thorium or 233U and light water moderation

    Energy Technology Data Exchange (ETDEWEB)

    Gore, B.F.

    1978-07-01

    The literature has been searched for criticality data for light water moderated systems which contain thorium or 233U, and the data found are compiled herein. They are from critical experiments, extrapolations, and exponential experiments performed with homogeneous solutions and metal spheres of 233U; with lattices of fuel rods containing highly enriched 235UO2-ThO2 and 233UO2-ThO2; and with arrays of cylinders of 233U solutions. The extent of existing criticality data has been compared with that necessary to implement a thorium-based fuel cycle. No experiments have been performed with any solutions containing thorium. Neither do data exist for homogeneous 233U systems with H/U < 34, except for solid metal systems. Arrays of solution cylinders up to 3 x 3 x 3 have been studied. Data for solutions containing fixed or soluble poisons are very limited. All critical lattices using 233UO2-ThO2 fuels (LWBR program) were zoned radially, and in most cases axially also. Only lattice experiments using 235UO2-ThO2 fuels have been performed using a single fuel rod type. Critical lattices of 235UO2-ThO2 rods poisoned with boron have been measured, but only exponential experiments have been performed using boron-poisoned lattices of 233UO2-ThO2 rods. No criticality data exist for denatured fuels (containing significant amounts of 238U) in either solution or lattice configurations.

  5. Compilation of data for the analysis of radionuclide migration from SFL 3-5

    International Nuclear Information System (INIS)

    Skagius, K.; Pettersson, Michael; Wiborgh, M.; Albinsson, Yngve; Holgersson, Stellan

    1999-12-01

    A preliminary safety assessment of the deep repository for long-lived, low and intermediate level waste, SFL 3-5, has been made. This report contains a compilation of data selected for the calculations of the migration of radionuclides and toxic metals from the waste to the biosphere. It also contains the data needed for the next step, which is to calculate dose to man from the far-field release figures. In the preliminary safety assessment it is assumed that SFL 3-5 is located in connection to the deep repository for spent fuel. This makes it possible to utilise site-specific information derived within the safety assessment of the deep repository for spent fuel, SR 97, for the sites Aberg, Beberg and Ceberg. When information from SR 97 is utilised, the values selected are as far as possible those proposed as a 'reasonable estimate' for the migration calculations in SR 97. The selection of values for parameters specific for the calculation of migration from the SFL 3-5 repository is in general on the pessimistic side. The uncertainty in the selected values is discussed and if possible also quantified

  6. Compilation of data for the analysis of radionuclide migration from SFL 3-5

    Energy Technology Data Exchange (ETDEWEB)

    Skagius, K.; Pettersson, Michael; Wiborgh, M. [Kemakta Konsult AB, Stockholm (Sweden); Albinsson, Yngve; Holgersson, Stellan [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Nuclear Chemistry

    1999-12-01

    A preliminary safety assessment of the deep repository for long-lived, low and intermediate level waste, SFL 3-5, has been made. This report contains a compilation of data selected for the calculations of the migration of radionuclides and toxic metals from the waste to the biosphere. It also contains the data needed for the next step, which is to calculate dose to man from the far-field release figures. In the preliminary safety assessment it is assumed that SFL 3-5 is located in connection to the deep repository for spent fuel. This makes it possible to utilise site-specific information derived within the safety assessment of the deep repository for spent fuel, SR 97, for the sites Aberg, Beberg and Ceberg. When information from SR 97 is utilised, the values selected are as far as possible those proposed as a 'reasonable estimate' for the migration calculations in SR 97. The selection of values for parameters specific for the calculation of migration from the SFL 3-5 repository is in general on the pessimistic side. The uncertainty in the selected values is discussed and if possible also quantified.

  7. Can Experimental Scientists, Data Evaluators and Compilers, and Nuclear Data Users Understand One Another?

    International Nuclear Information System (INIS)

    Usachev, L.N.

    1966-01-01

    The International Atomic Energy Agency organizes conferences on a wide variety of scientific subjects, all of which are of fundamental importance for the development of nuclear power. These include the technology of fuel elements, their stability in neutron fields, and chemical reprocessing as well as reactor physics, mathematical computational methods and the problems of protection and dosimetry. The problem of microscopic nuclear data, an essential aspect of reactor work, is just one of these many subjects. On the other hand, it should be remembered that the possibility of releasing nuclear energy was established in the first place by obtaining nuclear data on the fission process occurring in the uranium nucleus following the capture of a neutron and on the escape of the 2-3 secondary fission neutrons. In early nuclear power work the information provided by nuclear data was of considerable, even of decisive, importance. For example, the information available on the neutron balance in fast reactors showed that such reactors could operate as breeders and thus that it was worth while developing them. Strictly speaking, it is of course difficult to speak of a knowledge of nuclear data at this early period. It is perhaps more accurate to speak of the understanding of and the feeling for such data which grew up on the basis of the existing physical ideas on the fission of the nucleus, radiative capture and neutron scattering. Experimental data were very scanty but for that reason they were particularly valuable

  8. Can Experimental Scientists, Data Evaluators and Compilers, and Nuclear Data Users Understand One Another?

    Energy Technology Data Exchange (ETDEWEB)

    Usachev, L. N. [Institute of Physics and Energetics, Obninsk, USSR (Russian Federation)

    1966-07-01

    The International Atomic Energy Agency organizes conferences on a wide variety of scientific subjects, all of which are of fundamental importance for the development of nuclear power. These include the technology of fuel elements, their stability in neutron fields, and chemical reprocessing as well as reactor physics, mathematical computational methods and the problems of protection and dosimetry. The problem of microscopic nuclear data, an essential aspect of reactor work, is just one of these many subjects. On the other hand, it should be remembered that the possibility of releasing nuclear energy was established in the first place by obtaining nuclear data on the fission process occurring in the uranium nucleus following the capture of a neutron and on the escape of the 2-3 secondary fission neutrons. In early nuclear power work the information provided by nuclear data was of considerable, even of decisive, importance. For example, the information available on the neutron balance in fast reactors showed that such reactors could operate as breeders and thus that it was worth while developing them. Strictly speaking, it is of course difficult to speak of a knowledge of nuclear data at this early period. It is perhaps more accurate to speak of the understanding of and the feeling for such data which grew up on the basis of the existing physical ideas on the fission of the nucleus, radiative capture and neutron scattering. Experimental data were very scanty but for that reason they were particularly valuable.

  9. Consolidating duodenal and small bowel toxicity data via isoeffective dose calculations based on compiled clinical data.

    Science.gov (United States)

    Prior, Phillip; Tai, An; Erickson, Beth; Li, X Allen

    2014-01-01

    To consolidate duodenum and small bowel toxicity data from clinical studies with different dose fractionation schedules using the modified linear quadratic (MLQ) model. A methodology of adjusting the dose-volume (D,v) parameters to different levels of normal tissue complication probability (NTCP) was presented. A set of NTCP model parameters for duodenum toxicity was estimated by the χ2 fitting method using literature-based tolerance dose and generalized equivalent uniform dose (gEUD) data. These model parameters were then used to convert (D,v) data into the isoeffective dose in 2 Gy per fraction, (D_MLQED2, v), and to convert these parameters to an isoeffective dose at another NTCP, (D_MLQED2', v). The literature search yielded 5 reports useful in making estimates of duodenum and small bowel toxicity. The NTCP model parameters were found to be TD50(1) = 60.9 ± 7.9 Gy, m = 0.21 ± 0.05, and δ = 0.09 ± 0.03 Gy^-1. Isoeffective dose calculations and toxicity rates associated with hypofractionated radiation therapy reports were found to be consistent with clinical data having different fractionation schedules. Values of (D_MLQED2', v) between different NTCP levels remain consistent over a range of 5%-20%. MLQ-based isoeffective calculations of dose-response data corresponding to grade ≥2 duodenum toxicity were found to be consistent with one another within the calculation uncertainty. The (D_MLQED2, v) data could be used to determine duodenum and small bowel dose-volume constraints for new dose escalation strategies. Copyright © 2014 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
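
    For orientation, the familiar linear-quadratic form of the 2 Gy-per-fraction isoeffective dose is given below; the MLQ model used in the study modifies the quadratic term at large doses per fraction (via the fitted parameter δ), which is not reproduced here.

```latex
% Standard LQ isoeffective dose in 2 Gy fractions (not the full MLQ expression).
% D = n d is the physical dose delivered in n fractions of size d.
\[
  \mathrm{EQD}_{2} \;=\; D \, \frac{d + \alpha/\beta}{2\,\mathrm{Gy} + \alpha/\beta}
\]
```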

  10. Extending R packages to support 64-bit compiled code: An illustration with spam64 and GIMMS NDVI3g data

    Science.gov (United States)

    Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard

    2017-07-01

    Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user interface and calls the compiled code for computationally demanding operations. The price paid for the user-friendliness of the interpreted component is, besides performance, the limited access to low-level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. On the R side, users do not need to change existing code and may not even notice the extension. On the other hand, interfacing 64-bit compiled code efficiently is challenging. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.

  11. Compiling the functional data-parallel language SaC for Microgrids of Self-Adaptive Virtual Processors

    NARCIS (Netherlands)

    Grelck, C.; Herhut, S.; Jesshope, C.; Joslin, C.; Lankamp, M.; Scholz, S.-B.; Shafarenko, A.

    2009-01-01

    We present preliminary results from compiling the high-level, functional and data-parallel programming language SaC into a novel multi-core design: Microgrids of Self-Adaptive Virtual Processors (SVPs). The side-effect free nature of SaC in conjunction with its data-parallel foundation make it an

  12. Standard guide for formats for collection and compilation of corrosion data for metals for computerized database input

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1995-01-01

    1.1 This guide covers the data categories and specific data elements (fields) considered necessary to accommodate desired search strategies and reliable data comparisons in computerized corrosion databases. The data entries are designed to accommodate data relative to the basic forms of corrosion and to serve as guides for structuring multiple source database compilations capable of assessing compatibility of metals and alloys for a wide range of environments and exposure conditions.
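
    As a rough picture of what a record-oriented corrosion-data format implies, the sketch below defines a hypothetical record type; the field names are illustrative and do not reproduce the data elements defined in the guide.

```python
# Hypothetical record structure for a computerized corrosion-database entry.
# The fields are illustrative only; they do not reproduce the guide's fields.
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class CorrosionRecord:
    material: str                       # e.g. alloy designation
    environment: str                    # exposure medium
    temperature_c: float                # exposure temperature
    exposure_time_h: float              # duration of exposure
    corrosion_form: str                 # e.g. "uniform", "pitting", "crevice"
    corrosion_rate_mm_per_y: Optional[float] = None
    data_source: Optional[str] = None   # literature or laboratory reference

record = CorrosionRecord(
    material="hypothetical stainless steel",
    environment="3.5% NaCl solution",
    temperature_c=25.0,
    exposure_time_h=720.0,
    corrosion_form="pitting",
    corrosion_rate_mm_per_y=0.02,
    data_source="placeholder reference",
)
print(asdict(record))
```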

  13. INDC list of correspondents for the exchange of nuclear data information and compilation of national nuclear data committees

    International Nuclear Information System (INIS)

    1987-09-01

    This list of INDC Correspondents, including information on currently existing National Nuclear Data Committees and their memberships, is compiled and published upon the request of the International Nuclear Data Committee with the objective to promote the interaction and enhance the awareness of nuclear data activities in IAEA Member States. It also serves as a basis for the distribution of documents originated by or for the International Nuclear Data Committee and includes the names of all recipients of INDC documents. The INDC Secretariat tries to maintain this list up-to-date in order to facilitate an efficient interchange of information on nuclear data topics. The report is presented in five sections. The first section contains a detailed description of the INDC distribution categories, distribution codes and document designator codes. The second section describes the aims, organization and objectives of individual national nuclear data committees. The third section lists names and addresses in alphabetical order within each state or international organization together with the assigned INDC document distribution code(s); where applicable committee membership and/or area of specialization are indicated. This is followed by four shorter lists, indicating the names of individuals in each distribution category, sorted by country or international organization, and the total number of individuals in each category. The final section provides the names of nuclear data committee members also listed by country or international organization

  14. INDC list of correspondents for the exchange of nuclear data information and compilation of national nuclear data committees

    International Nuclear Information System (INIS)

    1986-04-01

    This list of INDC Correspondents, including information on currently existing National Nuclear Data Committees and their memberships, is compiled and published upon the request of the International Nuclear Data Committee with the objective to promote the interaction and enhance the awareness of nuclear data activities in IAEA Member States. It also serves as a basis for the distribution of documents originated by or for the International Nuclear Data Committee and includes the names of all recipients of INDC documents. The report is presented in five sections. The first section contains a detailed description of the INDC distribution categories, distribution codes and document designator codes. The second section describes the aims, organization and objectives of individual national nuclear data committees. The third section lists names and addresses in alphabetical order within each state or international organization together with the assigned INDC document distribution code(s); where applicable committee membership and/or area of specialization are indicated. This is followed by four shorter lists, indicating the names of individuals in each distribution category, sorted by country or international organization, and the total number of individuals in each category. The final section provides the names of nuclear data committee members also listed by country or international organization

  15. Description of source term data on contaminated sites and buildings compiled for the waste management programmatic environmental impact statement (WMPEIS)

    International Nuclear Information System (INIS)

    Short, S.M.; Smith, D.E.; Hill, J.G.; Lerchen, M.E.

    1995-10-01

    The U.S. Department of Energy (DOE) and its predecessor agencies have historically had responsibility for carrying out various national missions primarily related to nuclear weapons development and energy research. Recently, these missions have been expanded to include remediation of sites and facilities contaminated as a result of past activities. In January 1990, the Secretary of Energy announced that DOE would prepare a Programmatic Environmental Impact Statement on the DOE's environmental restoration and waste management program; the primary focus was the evaluation of (1) strategies for conducting remediation of all DOE contaminated sites and facilities and (2) potential configurations for waste management capabilities. Several different environmental restoration strategies were identified for evaluation, ranging from doing no remediation to strategies where the level of remediation was driven by such factors as final land use and health effects. A quantitative assessment of the costs and health effects of remediation activities and residual contamination levels associated with each remediation strategy was made. These analyses required that information be compiled on each individual contaminated site and structure located at each DOE installation and that the information compiled include quantitative measurements and/or estimates of contamination levels and extent of contamination. This document provides a description of the types of information and data compiled for use in the analyses. Also provided is a description of the database used to manage the data, a detailed discussion of the methodology and assumptions used in compiling the data, and a summary of the data compiled into the database as of March 1995. As of this date, over 10,000 contaminated sites and structures and over 8,000 uncontaminated structures had been identified across the DOE complex of installations

  16. Report of the Panel on Neutron Data Compilation. Brookhaven National Laboratory, USA, 10-14 February 1969

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1969-05-15

    After surveying current world needs for bibliographic and compilation activities in the field of neutron data, the report of this Panel of 31 individual technical experts considers the immediate and future role of the world's neutron data centres in this task. In Chapter V the Panel's findings are summarized in the form of recommendations directed to the centres and their associated national and international advisory committees together with all users of the centres. The Panel's recommendations can be summarised as follows: a) The need for bibliographic indexing and numerical compilation of neutron data on an international basis has been clearly demonstrated and should continue for the foreseeable future; b) The operation of CINDA has been extremely satisfactory; c) Neutron data should be compiled at all energies by all centres subject to any mutually agreed exceptions and priorities; d) A fine-meshed classification scheme for neutron reactions should be formulated and put into use before the end of 1969 in accordance with the timetable; e) A scheme for associating a detailed statement of the main characteristics of each experiment with compilations of the resulting data should be formulated and put into preliminary operation before the end of 1969; f) The immediate primary tasks of the principal data centres are to complete the compilation of existing numerical data, whilst keeping abreast of new data, and to agree and implement an improved compilation, storage and retrieval system; g) Input of experimental data can be facilitated by specific measures; h) Centres should publish review publications which they believe will serve the user community; i) The centres should provide data to users in a variety of media: printed listings, graphs, paper tape, punched cards and magnetic tape - but should encourage standardization within each medium so as to free effort to meet special requirements of users having limited computer facilities; j) Centres should hold and

  17. Report of the Panel on Neutron Data Compilation. Brookhaven National Laboratory, USA, 10-14 February 1969

    International Nuclear Information System (INIS)

    1969-05-01

    After surveying current world needs for bibliographic and compilation activities in the field of neutron data, the report of this Panel of 31 individual technical experts considers the immediate and future role of the world's neutron data centres in this task. In Chapter V the Panel's findings are summarized in the form of recommendations directed to the centres and their associated national and international advisory committees together with all users of the centres. The Panel's recommendations can be summarised as follows: a) The need for bibliographic indexing and numerical compilation of neutron data on an international basis has been clearly demonstrated and should continue for the foreseeable future; b) The operation of CINDA has been extremely satisfactory; c) Neutron data should be compiled at all energies by all centres subject to any mutually agreed exceptions and priorities; d) A fine-meshed classification scheme for neutron reactions should be formulated and put into use before the end of 1969 in accordance with the timetable; e) A scheme for associating a detailed statement of the main characteristics of each experiment with compilations of the resulting data should be formulated and put into preliminary operation before the end of 1969; f) The immediate primary tasks of the principal data centres are to complete the compilation of existing numerical data, whilst keeping abreast of new data, and to agree and implement an improved compilation, storage and retrieval system; g) Input of experimental data can be facilitated by specific measures; h) Centres should publish review publications which they believe will serve the user community; i) The centres should provide data to users in a variety of media: printed listings, graphs, paper tape, punched cards and magnetic tape - but should encourage standardization within each medium so as to free effort to meet special requirements of users having limited computer facilities; j) Centres should hold and

  18. Compilation of data relating to the erosive response of 608 recently-burned basins in the western United States

    Science.gov (United States)

    Gartner, Joseph E.; Cannon, Susan H.; Bigio, Erica R.; Davis, Nicole K.; Parrett, Charles; Pierce, Kenneth L.; Rupert, Michael G.; Thurston, Brandon L.; Trebesch, Matthew J.; Garcia, Steve P.; Rea, Alan H.

    2005-01-01

    This report presents a compilation of data on the erosive response, debris-flow initiation processes, basin morphology, burn severity, event-triggering rainfall, rock type, and soils for 608 basins recently burned by 53 fires located throughout the Western United States. The data presented here are a combination of those collected during our own field research and those reported in the literature. In some cases, data from a Geographic Information System (GIS) and Digital Elevation Models (DEMs) were used to supplement the data from the primary source. Due to gaps in the information available, not all parameters are characterized for all basins. This database provides a resource for researchers and land managers interested in examining relations between the runoff response of recently burned basins and their morphology, burn severity, soils and rock type, and triggering rainfall. The purpose of this compilation is to provide a single resource for future studies addressing problems associated with wildfire-related erosion. For example, data in this compilation have been used to develop a model for debris flow probability from recently burned basins using logistic multiple regression analysis (Cannon and others, 2004). This database provides a convenient starting point for other studies. For additional information on estimated post-fire runoff peak discharges and debris-flow volumes, see Gartner and others (2004).
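
    A logistic multiple regression of the kind cited (Cannon and others, 2004) can be sketched as follows; the predictor names and all values are hypothetical and do not reproduce the published model.

```python
# Sketch of a logistic multiple regression relating basin attributes to
# debris-flow occurrence, in the spirit of the analysis cited above.
# Predictor names and all values are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: basin gradient (%), fraction burned at high severity,
# triggering-storm rainfall intensity (mm/h) -- all hypothetical.
X = np.array([
    [25.0, 0.60, 18.0],
    [12.0, 0.20,  6.0],
    [30.0, 0.75, 22.0],
    [ 8.0, 0.10,  4.0],
    [20.0, 0.50, 15.0],
    [15.0, 0.30,  9.0],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = debris flow observed (hypothetical)

model = LogisticRegression().fit(X, y)
new_basin = np.array([[22.0, 0.55, 16.0]])
print("debris-flow probability:", float(model.predict_proba(new_basin)[0, 1]))
```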

  19. Data Compilation for AGR-1 Baseline Coated Particle Composite LEU01-46T

    International Nuclear Information System (INIS)

    Hunn, John D.; Lowden, Richard Andrew

    2006-01-01

    This document is a compilation of characterization data for the AGR-1 baseline coated particle composite LEU01-46T, a composite of four batches of TRISO-coated 350 µm 19.7% low enrichment uranium oxide/uranium carbide kernels (LEUCO). The AGR-1 TRISO-coated particles consist of a spherical kernel coated with a ~50% dense carbon buffer layer (100 µm nominal thickness), followed by a dense inner pyrocarbon layer (40 µm nominal thickness), followed by a SiC layer (35 µm nominal thickness), followed by another dense outer pyrocarbon layer (40 µm nominal thickness). The coated particles were produced by ORNL for the Advanced Gas Reactor Fuel Development and Qualification (AGR) program to be put into compacts for insertion in the first irradiation test capsule, AGR-1. The kernels were obtained from BWXT and identified as composite (G73D-20-69302). The BWXT kernel lot G73D-20-69302 was riffled into sublots for characterization and coating by ORNL and identified as LEU01-?? (where ?? is a series of integers beginning with 01). Additional particle batches were coated with only buffer or buffer plus inner pyrocarbon (IPyC) layers using similar process conditions as used for the full TRISO batches comprising the LEU01-46T composite. These batches were fabricated in order to qualify that the process conditions used for buffer and IPyC would produce acceptable densities, as described in sections 8 and 9. These qualifying batches used 350 µm natural uranium oxide/uranium carbide kernels (NUCO). The kernels were obtained from BWXT and identified as composite G73B-NU-69300. The use of NUCO surrogate kernels is not expected to significantly affect the densities of the buffer and IPyC coatings. Confirmatory batches using LEUCO kernels from G73D-20-69302 were coated and characterized to verify this assumption. The AGR-1 Fuel Product Specification and Characterization Guidance (INL EDF-4380, Rev. 6) provides the requirements necessary for acceptance

  20. Engineering a compiler

    CERN Document Server

    Cooper, Keith D

    2012-01-01

    As computing has changed, so has the role of both the compiler and the compiler writer. The proliferation of processors, environments, and constraints demands an equally large number of compilers. To adapt, compiler writers retarget code generators, add optimizations, and work on issues such as code space or power consumption. Engineering a Compiler re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in today's practice. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best practice algorithms for the major problems inside a compiler. · Focuses on the back end of the compiler, reflecting the focus of research and development over the last decade. · Applies the well-developed theory behind scanning and parsing to introduce concepts that play a critical role in optimization and code generation. · Introduces the student to optimization through data-flow analysis, SSA form, and a selection of sc...

  1. Groundwater-quality data associated with abandoned underground coal mine aquifers in West Virginia, 1973-2016: Compilation of existing data from multiple sources

    Science.gov (United States)

    McAdoo, Mitchell A.; Kozar, Mark D.

    2017-11-14

    This report describes a compilation of existing water-quality data associated with groundwater resources originating from abandoned underground coal mines in West Virginia. Data were compiled from multiple sources for the purpose of understanding the suitability of groundwater from abandoned underground coal mines for public supply, industrial, agricultural, and other uses. This compilation includes data collected for multiple individual studies conducted from July 13, 1973 through September 7, 2016. Analytical methods varied by the time period of data collection and requirements of the independent studies. This project identified 770 water-quality samples from 294 sites that could be attributed to abandoned underground coal mine aquifers originating from multiple coal seams in West Virginia.

  2. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 5

    International Nuclear Information System (INIS)

    2003-05-01

    The document consists of two parts: Overview and Country Waste Profile Reports for Reporting Year 2000. The first section contains overview reports that provide assessments of the achievements and shortcomings of the Net Enabled Waste Management Database (NEWMDB) during the first two data collection cycles (July 2001 to March 2002 and July 2002 to February 2003). The second part of the report includes a summary and compilation of waste management data submitted by Agency Member States in both the first and second data collection cycles

  3. Summary report of the 1. research co-ordination meeting on compilation and evaluation of photonuclear data for applications

    International Nuclear Information System (INIS)

    1997-04-01

    The present report contains the summary of the first Research Co-ordination Meeting on ''Compilation and Evaluation of Photonuclear Data for Applications'', held in Obninsk, Russia, from 3 to 6 December 1996. The project aims to produce a Technical Document on Photonuclear Data Library for Applications and to develop an IAEA Photonuclear Data Library. Summarized are the conclusions and recommendations of the meeting together with a detailed list of actions. Attached is the information sheet on the project, the agenda of the meeting and the list of participants along with extended abstracts of their presentations. Refs, figs, tabs

  4. Surface data for fusion devices. Progress report on data compilation and assessment by the US, Japanese, and IAEA data centers

    International Nuclear Information System (INIS)

    Thomas, E.W.; Itoh, N.; Langley, R.A.

    1982-01-01

    Besides presenting data in a format useful to plasma modellers, these data collection activities also serve the function of disclosing gaps in the available data base. The IAEA review panel has pointed out that information on processes of electron ejection and reflection is sparse and is generally unsatisfactory for purposes of modelling sheath effects. The US and Japanese Data Centers recently held a joint workshop where it was concluded that data on trapping and reemission was in an unsatisfactory state. In this case the parameters used to record the phenomena are closely related to the model adopted to describe the process. Existing handbooks on materials are generally weak in the areas closely related to metallurgical properties. All three data centers continue to pursue different aspects of the data collection and review process with a close interaction to avoid significant overlap of activities

  5. Thirty years of progress in harmonizing and compiling food data as a result of the establishment of INFOODS.

    Science.gov (United States)

    Murphy, Suzanne P; Charrondiere, U Ruth; Burlingame, Barbara

    2016-02-15

    The International Network of Food Data Systems (INFOODS) has provided leadership on the development and use of food composition data for over 30 years. The mission of INFOODS is the promotion of international participation, cooperation and harmonization in the generation, compilation and dissemination of adequate and reliable data on the composition of foods, beverages, and their ingredients in forms appropriate to meet the needs of various users. Achievements include the development of guidelines and standards, increased capacity development in generating and compiling food composition data, a food composition database management system, improvements in laboratory quality assurance, and development of several food composition databases and tables. Recently, INFOODS has led efforts to define and document food biodiversity. As new foods and food components come into prominence, and as analytical methods evolve, the activities of INFOODS will continue to advance the quality and quantity of food composition data globally into the future. Copyright © 2015 Food and Agriculture Organization of the United Nations. Published by Elsevier Ltd. All rights reserved.

  6. Thermodynamic data for modeling acid mine drainage problems: compilation and estimation of data for selected soluble iron-sulfate minerals

    Science.gov (United States)

    Hemingway, Bruce S.; Seal, Robert R.; Chou, I-Ming

    2002-01-01

    Enthalpy of formation, Gibbs energy of formation, and entropy values have been compiled from the literature for the hydrated ferrous sulfate minerals melanterite, rozenite, and szomolnokite, and a variety of other hydrated sulfate compounds. On the basis of this compilation, it appears that there is no evidence for an excess enthalpy of mixing for sulfate-H2O systems, except for the first H2O molecule of crystallization. The enthalpy and Gibbs energy of formation of each H2O molecule of crystallization, except the first, in the iron(II) sulfate - H2O system is -295.15 and -238.0 kJ·mol^-1, respectively. The absence of an excess enthalpy of mixing is used as the basis for estimating thermodynamic values for a variety of ferrous, ferric, and mixed-valence sulfate salts of relevance to acid-mine drainage systems.
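
    Written out, the additive scheme described above gives the following estimate for the iron(II) sulfate hydrates, using the per-water increments quoted in the abstract:

```latex
% Additivity of H2O of crystallization beyond the first (iron(II) sulfate series)
\[
  \Delta_f H^{\circ}\!\left(\mathrm{FeSO_4 \cdot n\,H_2O}\right)
  \;\approx\;
  \Delta_f H^{\circ}\!\left(\mathrm{FeSO_4 \cdot H_2O}\right)
  + (n-1)\,(-295.15\ \mathrm{kJ\,mol^{-1}})
\]
\[
  \Delta_f G^{\circ}\!\left(\mathrm{FeSO_4 \cdot n\,H_2O}\right)
  \;\approx\;
  \Delta_f G^{\circ}\!\left(\mathrm{FeSO_4 \cdot H_2O}\right)
  + (n-1)\,(-238.0\ \mathrm{kJ\,mol^{-1}})
\]
```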

  7. Compilation of radiometric age and trace-element geochemical data, Yucca Mountain and surrounding areas of southwestern Nevada

    International Nuclear Information System (INIS)

    Weiss, S.I.; Noble, D.C.; Larson, L.T.

    1994-01-01

    This document is a compilation of available radiometric age and trace-element geochemical data for volcanic rocks and episodes of hydrothermal activity in Yucca Mountain and the surrounding region of southwestern Nevada. Only the age determinations considered to be geologically reasonable (consistent with stratigraphic relations) are listed below. A number of the potassium-argon (K-Ar) ages of volcanic rocks given by Kistler, Marvin et al., Noble et al., Weiss et al., and Noble et al. are not included as these ages have been shown to be incorrect or disturbed by hydrothermal alteration based on subsequent stratigraphic and/or petrographic data and the recognition of errors in K-Ar age determinations related to incomplete extraction of argon. In cases where absolute ages are tightly constrained by high precision 40Ar/39Ar ages and unequivocal stratigraphic relations, we have omitted the less precise K-Ar age data. Similarly, the more precise single-crystal laser-fusion 40Ar/39Ar age determinations of certain units are reported and less precise ages by multi-grain bulk-fusion 40Ar/39Ar methods are not included. This compilation does not include age data for basaltic rocks of Pliocene and Quaternary age in the Yucca Mountain region

  8. Deuterium in the water cycle of the Schirmacher Oasis (Dronning Maud Land, East Antarctica). A data compilation

    International Nuclear Information System (INIS)

    Kowski, P.; Richter, W.

    1988-01-01

    The Schirmacher Oasis (Dronning Maud Land) - one of the rock deserts of the South Polar region - is situated on the coast of the Antarctic continent, between inland and shelf ice. The data compilation contains results of deuterium studies from different parts of the local water cycle and is arranged according to the main parts: precipitation and atmospheric moisture, both collected near Novolazarevskaya Station, lake water, surface snow and ice, shallow drill cores of snow and ice, and from melt water runoff. Finally, monthly means of precipitation and atmospheric moisture are given. (author)

  9. Compilation of elemental concentration data for NBS Biological and Environmental Standard Reference Materials

    International Nuclear Information System (INIS)

    Gladney, E.S.

    1980-07-01

    Concentration data on up to 76 elements in 19 NBS Standard Reference Materials have been collected from 325 journal articles and technical reports. These data are summarized into mean ± one standard deviation values and compared with available data from NBS and other review articles. Data are presented on the analytical procedures employed and all raw data are presented in appendixes
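
    A minimal sketch (Python; values invented) of the summary statistic described above, reducing literature results for one element in one SRM to a mean plus or minus one standard deviation:

        import statistics

        # Hypothetical reported Fe concentrations (ug/g) for one SRM, drawn from
        # several literature sources.
        reported_fe = [82.1, 79.5, 84.0, 80.7, 83.2]

        mean = statistics.mean(reported_fe)
        sd = statistics.stdev(reported_fe)   # sample standard deviation
        print(f"Fe: {mean:.1f} +- {sd:.1f} ug/g (n = {len(reported_fe)})")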

  10. Compiled data set of exact NOE distance limits, residual dipolar couplings and scalar couplings for the protein GB3

    Directory of Open Access Journals (Sweden)

    Beat Vögeli

    2015-12-01

    We compiled an NMR data set consisting of exact nuclear Overhauser enhancement (eNOE) distance limits, residual dipolar couplings (RDCs) and scalar (J) couplings for GB3, which forms one of the largest and most diverse data sets for structural characterization of a protein to date. All data have small experimental errors, which are carefully estimated. We use the data in the research article Vogeli et al., 2015, Complementarity and congruence between exact NOEs and traditional NMR probes for spatial decoding of protein dynamics, J. Struct. Biol., 191, 3, 306–317, doi:10.1016/j.jsb.2015.07.008 [1] for cross-validation in multiple-state structural ensemble calculation. We advocate this set to be an ideal test case for molecular dynamics simulations and structure calculations.

  11. Data Blocks : Hybrid OLTP and OLAP on compressed storage using both vectorization and compilation

    NARCIS (Netherlands)

    Lang, Harald; Mühlbauer, Tobias; Funke, Florian; Boncz, Peter; Neumann, Thomas; Kemper, Alfons

    2016-01-01

    This work aims at reducing the main-memory footprint in high performance hybrid OLTP & OLAP databases, while retaining high query performance and transactional throughput. For this purpose, an innovative compressed columnar storage format for cold data, called Data Blocks, is introduced. Data Blocks ...
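
    The record above names the technique only in outline; as a generic, hedged illustration of compressed columnar storage with scan pruning (not the Data Blocks format itself), the Python sketch below dictionary-encodes a column chunk and skips it when a predicate falls outside the stored min/max:

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class ColumnChunk:
            dictionary: List[int]   # sorted distinct values
            codes: List[int]        # per-row index into the dictionary
            vmin: int
            vmax: int

        def compress(values: List[int]) -> ColumnChunk:
            dictionary = sorted(set(values))
            index = {v: i for i, v in enumerate(dictionary)}
            return ColumnChunk(dictionary, [index[v] for v in values],
                               min(values), max(values))

        def scan_eq(chunk: ColumnChunk, needle: int) -> List[int]:
            """Return matching row positions; skip the whole chunk if the
            predicate cannot match its min/max range."""
            if needle < chunk.vmin or needle > chunk.vmax:
                return []
            try:
                code = chunk.dictionary.index(needle)
            except ValueError:
                return []
            return [row for row, c in enumerate(chunk.codes) if c == code]

        chunk = compress([5, 7, 5, 9, 7, 7])
        print(scan_eq(chunk, 7))   # -> [1, 4, 5]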

  12. Truly nested data-parallelism: compiling SaC for the Microgrid architecture

    NARCIS (Netherlands)

    Herhut, S.; Joslin, C.; Scholz, S.-B.; Grelck, C.; Morazan, M.

    2009-01-01

    Data-parallel programming facilitates elegant specification of concurrency. However, the composability of data-parallel operations so far has been constrained by the requirement to have only flat data-parallel operations at runtime. In this paper, we present early results on our work to exploit ...

  13. Toward a Last Interglacial Compilation Using a Tephra-based Chronology: a Future Reference For Model-data Comparison

    Science.gov (United States)

    Bazin, L.; Govin, A.; Capron, E.; Nomade, S.; Lemieux-Dudon, B.; Landais, A.

    2017-12-01

    The Last Interglacial (LIG, 129-116 ka) is a key period to decipher the interactions between the different components of the climate system under warmer-than-preindustrial conditions. Modelling the LIG climate is now part of the CMIP6/PMIP4 targeted simulations. As a result, recent efforts have been made to propose surface temperature compilations focusing on the spatio-temporal evolution of the LIG climate, and not only on its peak warmth as previously proposed. However, the major limitation of these compilations remains in the climatic alignment of records (e.g. temperature, foraminiferal δ18O) that is performed to define the sites' chronologies. Such methods prevent the proper discussion of phase relationship between the different sites. Thanks to recent developments of the Bayesian Datice dating tool, we are now able to build coherent multi-archive chronologies with a proper propagation of the associated uncertainties. We make the best use of common tephra layers identified in well-dated continental archives and marine sediment cores of the Mediterranean region to propose a coherent chronological framework for the LIG independent of any climatic assumption. We then extend this precise chronological context to the North Atlantic as a first step toward a global coherent compilation of surface temperature and stable isotope records. Based on this synthesis, we propose guidelines for the interpretation of different proxies measured from different archives that will be compared with climate model parameters. Finally, we present time-slices (e.g. 127 ka) of the preliminary regional synthesis of temperature reconstructions and stable isotopes to serve as reference for future model-data comparison of the up-coming CMIP6/PMIP4 LIG simulations.

  14. Data compilation and evaluation of U(IV) and U(VI) for thermodynamic reference database THEREDA

    Energy Technology Data Exchange (ETDEWEB)

    Richter, Anke; Bok, Frank; Brendler, Vinzenz

    2015-07-01

    THEREDA (Thermodynamic Reference Database) is a collaborative project which addresses this challenge. The partners are Helmholtz-Zentrum Dresden-Rossendorf, Karlsruhe Institute of Technology (KIT-INE), Gesellschaft fuer Anlagen- und Reaktorsicherheit Braunschweig mbH (GRS), TU Bergakademie Freiberg (TUBAF) and AF-Consult Switzerland AG (Baden, Switzerland). The aim of the project is the establishment of a consistent and quality assured database for all safety relevant elements, temperature and pressure ranges, with its focus on saline systems. This implied the use of the Pitzer approach to compute activity coefficients suitable for such conditions. Data access is possible via commonly available internet browsers under the address http://www.thereda.de. One part of the project - the data collection and evaluation for uranium - was a task of the Helmholtz-Zentrum Dresden-Rossendorf. The aquatic chemistry and thermodynamics of U(VI) and U(IV) are of great importance for geochemical modelling in repository-relevant systems. The OECD/NEA Thermochemical Database (NEA TDB) compilation is the major source for thermodynamic data of the aqueous and solid uranium species, even though this data selection does not utilize the Pitzer model for the ionic strength effect correction. As a result of the very stringent quality demands, the NEA TDB is rather restrictive and therefore incomplete for extensive modelling calculations of real systems. Therefore, the THEREDA compilation includes additional thermodynamic data of solid secondary phases formed in the waste material, the backfill and the host rock, though falling into quality assessment (QA) categories of lower accuracy. The data review process prefers log K values from solubility experiments (if available) to those calculated from thermochemical data.
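
    Where the abstract mentions log K values calculated from thermochemical data, the underlying relationship is the standard one, log10 K = -ΔrG° / (RT ln 10); a minimal Python sketch with an illustrative (not THEREDA) reaction Gibbs energy:

        import math

        R = 8.314462618e-3   # kJ/(mol*K)
        T = 298.15           # K

        def log10_K(delta_rG_kJ_per_mol: float) -> float:
            """log10 of the equilibrium constant from the standard reaction Gibbs energy."""
            return -delta_rG_kJ_per_mol / (R * T * math.log(10))

        print(round(log10_K(-30.0), 2))   # hypothetical dissolution reaction -> 5.26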

  15. Data compilation and evaluation of U(IV) and U(VI) for thermodynamic reference database THEREDA

    International Nuclear Information System (INIS)

    Richter, Anke; Bok, Frank; Brendler, Vinzenz

    2015-01-01

    THEREDA (Thermodynamic Reference Database) is a collaborative project which addresses this challenge. The partners are Helmholtz-Zentrum Dresden-Rossendorf, Karlsruhe Institute of Technology (KIT-INE), Gesellschaft fuer Anlagen- und Reaktorsicherheit Braunschweig mbH (GRS), TU Bergakademie Freiberg (TUBAF) and AF-Consult Switzerland AG (Baden, Switzerland). The aim of the project is the establishment of a consistent and quality assured database for all safety relevant elements, temperature and pressure ranges, with its focus on saline systems. This implied the use of the Pitzer approach to compute activity coefficients suitable for such conditions. Data access is possible via commonly available internet browsers under the address http://www.thereda.de. One part of the project - the data collection and evaluation for uranium - was a task of the Helmholtz-Zentrum Dresden-Rossendorf. The aquatic chemistry and thermodynamics of U(VI) and U(IV) are of great importance for geochemical modelling in repository-relevant systems. The OECD/NEA Thermochemical Database (NEA TDB) compilation is the major source for thermodynamic data of the aqueous and solid uranium species, even though this data selection does not utilize the Pitzer model for the ionic strength effect correction. As a result of the very stringent quality demands, the NEA TDB is rather restrictive and therefore incomplete for extensive modelling calculations of real systems. Therefore, the THEREDA compilation includes additional thermodynamic data of solid secondary phases formed in the waste material, the backfill and the host rock, though falling into quality assessment (QA) categories of lower accuracy. The data review process prefers log K values from solubility experiments (if available) to those calculated from thermochemical data.

  16. USGS compilation of geographic information system (GIS) data of coal mines and coal-bearing areas in Mongolia

    Science.gov (United States)

    Trippi, Michael H.; Belkin, Harvey E.

    2015-09-10

    Geographic information system (GIS) information may facilitate energy studies, which in turn provide input for energy policy decisions. The U.S. Geological Survey (USGS) has compiled GIS data representing coal mines, deposits (including those with and without coal mines), occurrences, areas, basins, and provinces of Mongolia as of 2009. These data are now available for download, and may be used in a GIS for a variety of energy resource and environmental studies of Mongolia. Chemical data for 37 coal samples from a previous USGS study of Mongolia (Tewalt and others, 2010) are included in a downloadable GIS point shapefile and shown on the map of Mongolia. A brief report summarizes the methodology used for creation of the shapefiles and the chemical analyses run on the samples.
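
    A hedged sketch of how such a downloadable point shapefile might be used in Python with GeoPandas; the file name and attribute column are assumptions, not the actual USGS schema:

        import geopandas as gpd

        # Hypothetical local copy of the USGS coal-sample point shapefile.
        samples = gpd.read_file("mongolia_coal_samples.shp")
        print(samples.crs)        # coordinate reference system of the layer
        print(samples.head())     # first few records with their attributes

        # Example filter on an assumed ash-content attribute, if present.
        if "ASH_PCT" in samples.columns:
            high_ash = samples[samples["ASH_PCT"] > 20]
            print(len(high_ash), "samples with more than 20 percent ash")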

  17. The remote sensing of ocean primary productivity - Use of a new data compilation to test satellite algorithms

    Science.gov (United States)

    Balch, William; Evans, Robert; Brown, Jim; Feldman, Gene; Mcclain, Charles; Esaias, Wayne

    1992-01-01

    Global pigment and primary productivity algorithms based on a new data compilation of over 12,000 stations occupied mostly in the Northern Hemisphere, from the late 1950s to 1988, were tested. The results showed high variability of the fraction of total pigment contributed by chlorophyll, which is required for subsequent predictions of primary productivity. Two models, which predict pigment concentration normalized to an attenuation length of euphotic depth, were checked against 2,800 vertical profiles of pigments. Phaeopigments consistently showed maxima at about one optical depth below the chlorophyll maxima. CZCS data coincident with the sea truth data were also checked. A regression of satellite-derived pigment vs ship-derived pigment had a coefficient of determination. The satellite underestimated the true pigment concentration in mesotrophic and oligotrophic waters and overestimated the pigment concentration in eutrophic waters. The error in the satellite estimate showed no trends with time between 1978 and 1986.
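
    The kind of regression described above can be reproduced in a few lines; the sketch below (invented data, log-transformed concentrations, not the study's actual values) fits satellite-derived against ship-derived pigment and reports the coefficient of determination:

        import numpy as np

        ship      = np.array([0.05, 0.1, 0.3, 0.8, 2.0, 5.0])    # mg m-3, hypothetical
        satellite = np.array([0.09, 0.15, 0.35, 0.7, 1.6, 3.5])  # mg m-3, hypothetical

        x, y = np.log10(ship), np.log10(satellite)
        slope, intercept = np.polyfit(x, y, 1)
        r2 = np.corrcoef(x, y)[0, 1] ** 2
        print(f"slope = {slope:.2f}, intercept = {intercept:.2f}, r^2 = {r2:.2f}")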

  18. Data compilation report: Gas and liquid samples from K West Basin fuel storage canisters

    International Nuclear Information System (INIS)

    Trimble, D.J.

    1995-01-01

    Forty-one gas and liquid samples were taken from spent fuel storage canisters in the K West Basin during a March 1995 sampling campaign. (Spent fuel from the N Reactor is stored in sealed canisters at the bottom of the K West Basin.) A description of the sampling process, gamma energy analysis data, and quantitative gas mass spectroscopy data are documented. This documentation does not include data analysis

  19. Data Blocks: hybrid OLTP and OLAP on compressed storage using both vectorization and compilation

    NARCIS (Netherlands)

    H. Lang (Harald); T. Mühlbauer; F. Funke; P.A. Boncz (Peter); T. Neumann (Thomas); A. Kemper (Alfons)

    2016-01-01

    This work aims at reducing the main-memory footprint in high performance hybrid OLTP & OLAP databases, while retaining high query performance and transactional throughput. For this purpose, an innovative compressed columnar storage format for cold data, called Data Blocks, is introduced.

  20. Compilation of climate data from heterogeneous networks across the Hawaiian Islands

    Science.gov (United States)

    Longman, Ryan J.; Giambelluca, Thomas W.; Nullet, Michael A.; Frazier, Abby G.; Kodama, Kevin; Crausbay, Shelley D.; Krushelnycky, Paul D.; Cordell, Susan; Clark, Martyn P.; Newman, Andy J.; Arnold, Jeffrey R.

    2018-02-01

    Long-term, accurate observations of atmospheric phenomena are essential for a myriad of applications, including historic and future climate assessments, resource management, and infrastructure planning. In Hawai'i, climate data are available from individual researchers, local, State, and Federal agencies, and from large electronic repositories such as the National Centers for Environmental Information (NCEI). Researchers attempting to make use of available data are faced with a series of challenges that include: (1) identifying potential data sources; (2) acquiring data; (3) establishing data quality assurance and quality control (QA/QC) protocols; and (4) implementing robust gap filling techniques. This paper addresses these challenges by providing: (1) a summary of the available climate data in Hawai'i including a detailed description of the various meteorological observation networks and data accessibility, and (2) a quality controlled meteorological dataset across the Hawaiian Islands for the 25-year period 1990-2014. The dataset draws on observations from 471 climate stations and includes rainfall, maximum and minimum surface air temperature, relative humidity, wind speed, downward shortwave and longwave radiation data.
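
    As a minimal illustration of the gap-filling step mentioned above (not the authors' actual procedure), short gaps in a daily station series can be filled by time interpolation with pandas; the station values here are invented:

        import numpy as np
        import pandas as pd

        idx = pd.date_range("2014-01-01", periods=8, freq="D")
        tmax = pd.Series([28.1, 27.9, np.nan, np.nan, 28.6, 29.0, np.nan, 28.4], index=idx)

        filled = tmax.interpolate(method="time", limit=3)   # fill gaps up to 3 days long
        print(filled)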

  1. Compilation of data for thermomechanical analyses of four potential salt repositories

    International Nuclear Information System (INIS)

    Tammemagi, H.Y.; Loken, M.C.; Osnes, J.D.; Wagner, R.A.

    1986-01-01

    This report includes a collection and summarization of the data which are necessary to perform thermomechanical analyses of four potential salt repository sites: Paradox Basin, Utah; Permian Basin, Texas; Richton Dome, Mississippi; and Vacherie Dome, Louisiana. Thermal, mechanical, and hydrogeological material properties are presented so that the numerical analyses can be subdivided into three geometric regions: canister, disposal room, and repository site. Data are presented for the salt formations, the surrounding geological units, and for human-made materials placed in the repository such as the nuclear waste and its protective steel liner. Wherever possible, site-specific data are used which have been determined from laboratory testing of drill core or from interpretation of geophysical logs. Although much effort has been made to obtain the most appropriate data, there are deficiencies because some of the required site-specific data are either not available or are inconsistent with anticipated values

  2. Irradiation of spices, herbs and other vegetable seasonings: A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1992-02-01

    This publication contains a compilation of all available scientific and technical data on the irradiation of spices, herbs and other vegetable seasonings. It is intended to assist governments in considering the authorization of this particular application of radiation processing of food and in ensuring its control in the facility and the control of irradiated food products moving in trade. The Compilation was prepared in response to the requirement of the Codex General Standard for Irradiated Foods and associated Code that radiation treatment of food be justified on the basis of a technological need or of a need to improve the hygienic quality of the food. It was prepared also in response to the recommendations of the FAO/IAEA/WHO/ITC-UNCTAD/GATT International Conference on the Acceptance, Control of and Trade in Irradiated Food (Geneva, 1989) concerning the need for regulatory control of radiation processing of food. It is hoped that the information contained in this publication will assist governments in considering requests for the approval of radiation treatment of spices, herbs and other vegetable seasonings, or requests for authorization to import such irradiated products. Refs and tabs

  3. Data compilations for primary production, herbivory, decomposition, and export for different types of marine communities, 1962-2002 (NODC Accession 0054500)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset is a compilation of published data on primary production, herbivory, and nutrient content of primary producers in pristine communities of...

  4. A Compilation of Global Soil Microbial Biomass Carbon, Nitrogen, and Phosphorus Data

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides the concentrations of soil microbial biomass carbon (C), nitrogen (N) and phosphorus (P), soil organic carbon, total nitrogen, and total...

  5. Transportation legislative data base : state radioactive materials transportation statute compilation, 1989-1993

    Science.gov (United States)

    1994-04-30

    The Transportation Legislative Data Base (TLDB) is a computer-based information service containing summaries of federal, state and certain local government statutes and regulations relating to the transportation of radioactive materials in the United...

  6. Industry Contributions to Seafloor Mapping: Building Partnerships for Collecting, Sharing, and Compiling Data

    Science.gov (United States)

    Brumley, K. J.; Mitchell, G. A.; Millar, D.; Saade, E. J.; Gharib, J. J.

    2017-12-01

    In an effort to map the remaining 85% of the world's seafloor, The Nippon Foundation and GEBCO have launched Seabed 2030 to provide high-resolution bathymetry for all ocean waters by the year 2030. This ambitious effort will require sharing of bathymetric information to build a global baseline bathymetry database. Multibeam echosounder (MBES) data is a promising source of data for Seabed 2030. These data benefit multiple users and include not only bathymetric information, but also valuable backscatter data (useful for determining seafloor characteristics), as well as water column data, which can be used to explore other aspects of the marine environment and potentially help constrain some of the ocean's methane flux estimates. Fugro provides global survey services for clients in the oil and gas, telecommunications, and infrastructure industries, and for state and federal agencies. With a global fleet of survey vessels and autonomous vehicles equipped with state-of-the-art MBES systems, Fugro has performed some of the world's largest offshore surveys over the past several years, mapping close to 1,000,000 km2 of seafloor per year with high-resolution MBES data using multi-vessel operational models and new methods for merging datasets from different multibeam sonar systems. Although most of these data are proprietary, Fugro is working with clients in the private sector to make data available to the Seabed 2030 project at a decimated resolution of 100 m. The company is also contributing the MBES data acquired during transits to survey locations. Fugro has also partnered with Shell Ocean Discovery XPRIZE to support development of new rapid, unmanned, high-resolution ocean mapping technologies that can benefit understanding of the world's oceans. Collaborative approaches such as these are helping to establish a new standard for other industry contributions, and to facilitate a new outlook for data sharing among the public and private sectors. Recognizing the importance of an ...

  7. Porosity, sorption and diffusivity data compiled for the SKB 91 study

    International Nuclear Information System (INIS)

    Brandberg, F.; Skagius, K.

    1991-04-01

    The SKB 91 study is an integrated safety analysis of the KBS-3 concept of a repository located in the Finnsjoen area. For this study, values of important transport parameters in the bentonite backfill and in the rock are proposed. Kd-values, diffusivities and diffusion porosity for different elements in compacted MX-80 bentonite are based on experimental data found in the literature. With regard to sorption, both a best estimate and a conservative value are given. Because sorption on bentonite is very much dependent on the prevailing conditions, and experimental data are limited and not necessarily representative for the conditions expected in the repository, the proposed best estimate values may include large uncertainties. Data proposed for rock are matrix diffusivities, matrix porosity and diffusivity in mobile bulk water. These values are based on experimental results on Finnsjoe rock. (au)

  8. Mapping and prediction of schistosomiasis in Nigeria using compiled survey data and Bayesian geospatial modelling

    DEFF Research Database (Denmark)

    Ekpo, Uwem F.; Hürlimann, Eveline; Schur, Nadine

    2013-01-01

    Schistosomiasis prevalence data for Nigeria were extracted from peer-reviewed journals and reports, geo-referenced and collated in a nationwide geographical information system database for the generation of point prevalence maps. This exercise revealed that the disease is endemic in 35 of the ...

  9. Compilation of climate data from heterogeneous networks across the Hawaiian Islands

    Science.gov (United States)

    Ryan J. Longman; Thomas W. Giambelluca; Michael A. Nullet; Abby G. Frazier; Kevin Kodama; Shelley D. Crausbay; Paul D. Krushelnycky; Susan Cordell; Martyn P. Clark; Andy J. Newman; Jeffrey R. Arnold

    2018-01-01

    Long-term, accurate observations of atmospheric phenomena are essential for a myriad of applications, including historic and future climate assessments, resource management, and infrastructure planning. In Hawai‘i, climate data are available from individual researchers, local, State, and Federal agencies, and from large electronic repositories such as the National...

  10. Data compilation for radiation effects on hydrogen recycle in fusion reactor materials

    International Nuclear Information System (INIS)

    Ozawa, Kunio; Fukushima, Kimichika; Ebisawa, Katsuyuki.

    1984-05-01

    Irradiation tests of materials by hydrogen isotopes are under way to investigate the hydrogen recycling process, in which exchange of fuel particles takes place between the plasma and the wall of the nuclear fusion reactor. In this report, data on hydrogen irradiation are collected and reviewed from the viewpoint of irradiation effects. Data are classified into (1) Re-emission, (2) Retention (retained hydrogen isotopes, depth profile in the materials, and thermal desorption spectroscopy), (3) Permeation and (4) Ion impact desorption. Research activities in each area are arranged according to date of publication, research institute and materials investigated, so that an overview of the present status can be obtained. Institute, author and reference are then shown for each classification in tables. The list of literature is also attached. (author)

  11. Mineralogy, petrology and whole-rock chemistry data compilation for selected samples of Yucca Mountain tuffs

    International Nuclear Information System (INIS)

    Connolly, J.R.

    1991-12-01

    Petrologic, bulk chemical, and mineralogic data are presented for 49 samples of tuffaceous rocks from core holes USW G-1 and UE-25a #1 at Yucca Mountain, Nevada. Included, in descending stratigraphic order, are 11 samples from the Topopah Spring Member of the Paintbrush Tuff, 12 samples from the Tuffaceous Beds of Calico Hills, 3 samples from the Prow Pass Member of the Crater Flat Tuff, 20 samples from the Bullfrog Member of the Crater Flat Tuff and 3 samples from the Tram Member of the Crater Flat Tuff. The suite of samples contains a wide variety of petrologic types, including zeolitized, glassy, and devitrified tuffs. Data vary considerably between groups of samples, and include thin section descriptions (some with modal analyses for which uncertainties are estimated), electron microprobe analyses of mineral phases and matrix, mineral identifications by X-ray diffraction, and major element analyses with uncertainty estimates

  12. Methods to Collect, Compile, and Analyze Observed Short-lived Fission Product Gamma Data

    Energy Technology Data Exchange (ETDEWEB)

    Finn, Erin C.; Metz, Lori A.; Payne, Rosara F.; Friese, Judah I.; Greenwood, Lawrence R.; Kephart, Jeremy D.; Pierson, Bruce D.; Ellis, Tere A.

    2011-09-29

    A unique set of fission product gamma spectra was collected at short times (4 minutes to 1 week) on various fissionable materials. Gamma spectra were collected from the neutron-induced fission of uranium, neptunium, and plutonium isotopes at thermal, epithermal, fission spectrum, and 14-MeV neutron energies. This report describes the experimental methods used to produce and collect the gamma data, defines the experimental parameters for each method, and demonstrates the consistency of the measurements.

  13. Closing the light sbottom mass window from a compilation of $e^+ e^- \\to$ hadron data

    CERN Document Server

    AUTHOR|(CDS)2051271

    2004-01-01

    The e+e- -> hadron cross section data from PEP, PETRA, TRISTAN, SLC and LEP, at centre-of-mass energies between 20 and 209 GeV, are analysed to search for the production of a pair of light sbottoms decaying hadronically via R-parity-violating couplings. This analysis allows the 95% C.L. exclusion of such a particle if its mass is below 7.5 GeV/c2. The light sbottom mass window is closed.

  14. Energy crops. Data for planning of energy crop cultivation. KTBL data compilation with internet services; Energiepflanzen. Daten fuer die Planung des Energiepflanzenanbaus. KTBL-Datensammlung mit Internetangebot

    Energy Technology Data Exchange (ETDEWEB)

    Eckel, H.; Grube, J.; Zimmer, E. (comps.)

    2006-07-01

    Based on the KTBL data compilation ''Betriebsplanung Landwirtschaft'', this data compilation (''Datensammlung Energiepflanzen'') provides comprehensive information on the cultivation of energy crops and on production planning. Production techniques are described up to the final step of delivery to the consumer, so that full-scale cost calculation is possible. Cultivation guidance is given that takes into account the differences from food and fodder crop cultivation. Crops that are rarely grown, for which little practical experience is available but which have great potential for use in agriculture, are also covered. Energetic utilisation opens the field to a wider range of crops and to new options for crop rotation; these are discussed in two separate chapters. Information is also given on legal aspects of energy crop production, relevant standards, and quality requirements on substrates for energetic use and for secondary harvesting. (orig.)

  15. Compilation of historical radiological data collected in the vicinity of the WIPP site

    International Nuclear Information System (INIS)

    Bradshaw, P.L.; Louderbough, E.T.

    1987-01-01

    The Radiological Baseline Program (RBP) at the Waste Isolation Pilot Plant (WIPP) has been implemented to characterize the radiological conditions at the site prior to receipt of radioactive wastes. Because southeastern New Mexico was the site of an underground nuclear test in 1961, various sampling programs have intermittently monitored background and elevated radiation levels in the vicinity of the WIPP. In addition, radiological characterization of the site region was performed during the 1970's in support of the WIPP Environmental Impact Statement. The historical data are drawn primarily from monitoring activities of the US Public Health Service (PHS), the Environmental Protection Agency (EPA), US Geological Survey (USGS) and Sandia National Laboratories, Albuquerque (SNLA). Information on air and water quality, meat, milk, biota and vegetation is included in the report. This survey is intended to provide a source of reference for historical data on radiological conditions in the vicinity of the WIPP site prior to the establishment of a systematic Radiological Baseline Program. 31 refs., 1 fig

  16. Annual compilation and analysis of hydrologic data for Escondido Creek, San Antonio River basin, Texas

    Science.gov (United States)

    Reddy, D.R.

    1971-01-01

    Introduction - History of Small Watershed Projects in Texas: The U.S. Soil Conservation Service is actively engaged in the installation of flood and soil erosion reducing measures in Texas under the authority of the "Flood Control Act of 1936 and 1944" and the "Watershed Protection and Flood Prevention Act" (Public Law 566), as amended. The Soil Conservation Service has found a total of approximately 3,500 floodwater-retarding structures to be physically and economically feasible in Texas. As of September 30, 1970, 1,439 of these structures had been built. This watershed-development program will have varying but important effects on the surface and ground-water resources of river basins, especially where a large number of the floodwater-retarding structures are built. Basic hydrologic data under natural and developed conditions are needed to appraise the effects of the structures on the yield and mode of occurrence of runoff. Hydrologic investigations of these small watersheds were begun by the Geological Survey in 1951 and are now being made in 12 study areas (fig. 1). These investigations are being made in cooperation with the Texas Water Development Board, the Soil Conservation Service, the San Antonio River Authority, the city of Dallas, and the Tarrant County Water Control and Improvement District No. 1. The 12 study areas were chosen to sample watersheds having different rainfall, topography, geology, and soils. In five of the study areas (North, Little Elm, Mukewater, Little Pond-North Elm, and Pin Oak Creeks), streamflow and rainfall records were collected prior to construction of the floodwater-retarding structures, thus affording the opportunity for analyses of the conditions "before and after" development. A summary of the development of the floodwater-retarding structures in each study area as of September 30, 1970, is shown in table 1. Objectives of the Texas Small Watersheds Project: The purpose of these investigations is to collect sufficient data to meet the ...

  17. Compilation of radiation damage test data. Pt. 2. Thermoset and thermoplastic resins, composite materials

    International Nuclear Information System (INIS)

    Tavlet, M.; Fontaine, A.; Schoenbacher, H.

    1998-01-01

    This catalogue summarizes radiation damage test data on thermoplastic and thermoset resins and composites. Most of them are epoxy resins used as insulator for magnet coils. Many results are also given for new engineering thermoplastics which can be used either for their electrical properties or for their mechanical properties. The materials have been irradiated either in a 60Co source, up to integrated absorbed doses between 200 kGy and a few megagrays, at dose rates of the order of 1 Gy/s, or in a nuclear reactor at dose rates of the order of 50 Gy/s, up to doses of 100 MGy. The flexural strength, the deformation and the modulus of elasticity have been measured on irradiated and non-irradiated samples, according to the recommendations of the International Electrotechnical Commission. The results are presented in the form of tables and graphs to show the effect of the absorbed dose on the measured properties. (orig.)

  18. Compilation of radiation damage test data. Pt. 2. Thermoset and thermoplastic resins, composite materials

    Energy Technology Data Exchange (ETDEWEB)

    Tavlet, M; Fontaine, A; Schoenbacher, H

    1998-05-18

    This catalogue summarizes radiation damage test data on thermoplastic and thermoset resins and composites. Most of them are epoxy resins used as insulator for magnet coils. Many results are also given for new engineering thermoplastics which can be used either for their electrical properties or for their mechanical properties. The materials have been irradiated either in a 60Co source, up to integrated absorbed doses between 200 kGy and a few megagrays, at dose rates of the order of 1 Gy/s, or in a nuclear reactor at dose rates of the order of 50 Gy/s, up to doses of 100 MGy. The flexural strength, the deformation and the modulus of elasticity have been measured on irradiated and non-irradiated samples, according to the recommendations of the International Electrotechnical Commission. The results are presented in the form of tables and graphs to show the effect of the absorbed dose on the measured properties. (orig.)

  19. Evaluation and compilation of DOE waste package test data: Biannual report, February 1987--July 1987

    International Nuclear Information System (INIS)

    Interrante, C.; Escalante, E.; Fraker, A.

    1988-05-01

    The waste package is a proposed engineering barrier that is part of a permanent repository for HLW. Metal alloys are the principal barriers within the engineered system. Technical discussions are given for the corrosion of metals proposed for the canister, particularly carbon steels, stainless steels, and copper. The current level of understanding of several canister materials is questioned for the candidate repository in tuff. Three issues are addressed: the possibility of the stress-induced failure of Zircaloy, the possible corrosion of copper and copper alloys, and the lack of site-specific characterization data. Discussions are given on problems concerning localized corrosion and environmentally assisted cracking of AISI 1020 steel at elevated temperatures (150 °C). For the proposed salt site, the importance of the duration of corrosion tests and some of the conditions that may preclude prompt initiation of needed long-term testing are two issues that are discussed. 31 refs., 5 figs

  20. Irradiation of fish, shellfish and frog legs. A compilation of technical data for authorization and control

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-06-01

    The International Consultative Group on Food Irradiation (ICGFI) was established on 9 May 1984 under the aegis of FAO, IAEA and WHO. ICGFI is composed of experts and other representatives designated by governments which have accepted the terms of the 'Declaration' establishing ICGFI and have pledged to make voluntary contributions, in cash or in kind, to carry out the activities of ICGFI. The functions of ICGFI are as follows: (a) To evaluate global developments in the field of food irradiation; (b) To provide a focal point of advice on the application of food irradiation to Member States and the Organization; and (c) To furnish information as required, through the Organization, to the Joint FAO/IAEA/WHO Expert Committee on the Wholesomeness of Irradiated Food, and to the Codex Alimentarius Commission. This publication contains the most up to date data on irradiation of fish, shellfish and frog legs. It is intended to assist governments in considering the authorization of this particular application of radiation processing of food and in ensuring its control in the facility and the control of irradiated food products moving in trade. It was prepared at the request of the International Consultative Group on Food Irradiation (ICGFI) in response to the increasing acceptance and application of irradiation to ensure hygienic quality of food, especially those of animal origin.

  1. Irradiation of fish, shellfish and frog legs. A compilation of technical data for authorization and control

    International Nuclear Information System (INIS)

    2000-06-01

    The International Consultative Group on Food Irradiation (ICGFI) was established on 9 May 1984 under the aegis of FAO, IAEA and WHO. ICGFI is composed of experts and other representatives designated by governments which have accepted the terms of the 'Declaration' establishing ICGFI and have pledged to make voluntary contributions, in cash or in kind, to carry out the activities of ICGFI. The functions of ICGFI are as follows: (a) To evaluate global developments in the field of food irradiation; (b) To provide a focal point of advice on the application of food irradiation to Member States and the Organization; and (c) To furnish information as required, through the Organization, to the Joint FAO/IAEA/WHO Expert Committee on the Wholesomeness of Irradiated Food, and to the Codex Alimentarius Commission. This publication contains the most up to date data on irradiation of fish, shellfish and frog legs. It is intended to assist governments in considering the authorization of this particular application of radiation processing of food and in ensuring its control in the facility and the control of irradiated food products moving in trade. It was prepared at the request of the International Consultative Group on Food Irradiation (ICGFI) in response to the increasing acceptance and application of irradiation to ensure hygienic quality of food, especially those of animal origin

  2. Compilation of fastener testing data received in response to NRC Compliance Bulletin 87-02

    International Nuclear Information System (INIS)

    Cwalina, G.C.; Conway, J.T.; Parker, L.B.

    1989-06-01

    On November 6, 1987, the Nuclear Regulatory Commission (NRC) issued Bulletin 87-02, ''Fastener Testing to Determine Conformance With Applicable Material Specifications,'' to all holders of operating licenses or construction permits for nuclear power reactors (licensees). The bulletin was issued so that the NRC staff could gather data to determine whether counterfeit fasteners are a problem in the nuclear power industry. The bulletin requested nuclear power plant owners to determine whether fasteners obtained from suppliers and/or manufacturers for use in their facilities meet the mechanical and chemical specifications stipulated in the procurement documents. The licensees were requested to sample a minimum of 10 safety-related and 10 non-safety-related fasteners (studs, bolts, and/or cap screws) and a sample of typical nuts that would be used with each fastener and to report the testing results to the NRC. The results of this study did not indicate a safety concern relating to the use of mismarked or counterfeit fasteners in the nuclear industry, but they did indicate a nonconformance rate of 8 to 12 percent for fasteners. The NRC staff is considering taking action to improve the effectiveness of receipt inspection and testing programs for all materials at nuclear power plants

  3. Bearing tester data compilation analysis, and reporting and bearing math modeling

    Science.gov (United States)

    Cody, J. C.

    1986-01-01

    Integration of heat transfer coefficients, modified to account for local vapor quality, into the 45 mm bearing model has been completed. The model has been evaluated with two flow rates and subcooled and saturated coolant. The evaluation showed that by increasing the flow from 3.6 to 7.0 lbs/sec the average ball temperature was decreased by 102 F, using a coolant temperature of -230 F. The average ball temperature was decreased by 63 F by decreasing the inlet coolant temperature from saturated to -230 F at a flow rate of 7.0 lbs/sec. Since other factors such as friction, cage heating, etc., affect bearing temperatures, the above bearing temperature effects should be considered as trends and not absolute values. The two phase heat transfer modification has been installed in the 57 mm bearing model and the effects on bearing temperatures have been evaluated. The average ball temperature was decreased by 60 F by increasing the flow rate from 4.6 to 9.0 lbs/sec for the subcooled case. By decreasing the inlet coolant temperature from saturation to -24 F, the average ball temperature was decreased 57 F for a flow rate of 9.0 lbs/sec. The technique of relating the two phase heat transfer coefficient to local vapor quality will be applied to the tester model and compared with test data.

  4. Evaluation and compilation of DOE waste package test data: Biannual report, August 1986-January 1987

    International Nuclear Information System (INIS)

    Interrante, C.; Escalante, E.; Fraker, A.; Harrison, S.; Shull, R.; Linzer, M.; Ricker, R.; Ruspi, J.

    1987-10-01

    This report summarizes results of the National Bureau of Standards (NBS) evaluations of Department of Energy (DOE) activities on waste packages designed for containment of radioactive high-level nuclear waste (HLW). The waste package is a proposed engineered barrier that is part of a permanent repository for HLW. Metal alloys are the principal barriers within the engineered system. Technical discussions are given for the corrosion of metals proposed for the canister, particularly carbon and stainless steels, and copper. In the section on tuff, the current level of understanding of several canister materials is questioned. Within the Basalt Waste Isolation Project (BWIP) section, discussions are given on problems concerning groundwater, materials for use in the metallic overpack, and diffusion through the packing. For the proposed salt site, questions are raised on the work on both ASTM A216 Steel and Ti-Code 12. NBS work related to the vitrification of HLW borosilicate glass at the West Valley Demonstration Project (WVDP) and the Defense Waste Processing Facility (DWPF) is covered. NBS reviews of selected DOE technical reports and a summary of current waste-package activities of the Materials Characterization Center (MCC) is presented. Using a database management system, a computerized database for storage and retrieval of reviews and evaluations of HLW data has been developed and is described. 17 refs., 2 figs., 2 tabs

  5. Life expectancy of the 20th century Venda: a compilation of skeletal and cemetery data.

    Science.gov (United States)

    L'Abbé, E N; Steyn, M; Loots, M

    2008-01-01

    Little information is available on the 20th century mortality rates of rural black South African groups, such as the Venda. The purpose of this study was to apply abridged life tables in order to estimate life expectancy from both skeletal remains and death registry information of modern South African communities. Comparisons were also made with prehistoric and contemporary groups as a means to better evaluate life expectancy for this time period. The sample consisted of 160 skeletons of known Venda origin and burial registry information for 1364 black South Africans from the Rebecca Street and Mamelodi Cemeteries in Pretoria, South Africa. Standard anthropological techniques were applied to determine sex and estimate age from the skeletal remains. The stationary and non-stationary life table models were used to analyse the data. A high rate of child mortality and low juvenile and adult mortality, with a steady increase in mortality after the age of 30 years, were observed for both the Venda and the cemetery samples. Throughout the 20th century, life expectancy was shown to increase for black South Africans. However, due to the widespread HIV infection/AIDS of the 21st century, infant and young adult mortality rates continue to rise at such a speed that the decline in mortality seen for South Africans in the last 50 years will most likely be lost in the next decade due to this disease.
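
    A sketch of a stationary abridged life table of the kind applied above; the age-specific death rates (nMx) and nax values below are invented, so the resulting life expectancy is illustrative only:

        ages   = [0, 1, 5, 15, 30, 45, 60, 75]          # start of each age interval
        widths = [1, 4, 10, 15, 15, 15, 15, None]       # None marks the open-ended interval
        nMx    = [0.060, 0.010, 0.003, 0.004, 0.006, 0.015, 0.045, 0.140]  # deaths per person-year
        nax    = [0.3, 1.5, 5.0, 7.5, 7.5, 7.5, 7.5, None]  # mean years lived by those dying

        lx, Lx = [100000.0], []
        for i, (n, m, a) in enumerate(zip(widths, nMx, nax)):
            if n is None:                      # open interval: survivors all die here
                Lx.append(lx[i] / m)
                break
            qx = n * m / (1 + (n - a) * m)     # probability of dying in the interval
            dx = lx[i] * qx
            lx.append(lx[i] - dx)
            Lx.append(n * lx[i + 1] + a * dx)  # person-years lived in the interval

        e0 = sum(Lx) / lx[0]                   # life expectancy at birth
        print(f"e0 = {e0:.1f} years")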

  6. Literature search, review, and compilation of data for chemical and radiochemical sensors: Task 1 report

    International Nuclear Information System (INIS)

    1993-01-01

    During the next several decades, the US Department of Energy is expected to spend tens of billions of dollars in the characterization, cleanup, and monitoring of DOE's current and former installations that have various degrees of soil and groundwater contamination made up of both hazardous and mixed wastes. Each of these phases will require site surveys to determine type and quantity of hazardous and mixed wastes. It is generally recognized that these required survey and monitoring efforts cannot be performed using traditional chemistry methods based on laboratory evaluation of samples from the field. For that reason, a tremendous push during the past decade or so has been made on research and development of sensors. This report contains the results of an extensive literature search on sensors that are used or have applicability in environmental and waste management. While restricting the search to a relatively small part of the total chemistry spectrum, a sizable body of reference material is included. Results are presented in tabular form for general references obtained from data base searches, as narrative reviews of relevant chapters from proceedings, as book reviews, and as reviews of journal articles with particular relevance to the review. Four broad sensor types are covered: electrochemical processes, piezoelectric devices, fiber optics, and radiochemical processes. The topics of surface chemistry processes and biosensors are not treated separately because they often are an adjunct to one of the four sensors listed. About 1,000 tabular entries are listed, including selected journal articles, reviews of conference/meeting proceedings, and books. Literature to about mid-1992 is covered

  7. Subsurface temperature maps in French sedimentary basins: new data compilation and interpolation

    International Nuclear Information System (INIS)

    Bonte, D.; Guillou-Frottier, L.; Garibaldi, C.; Bourgine, B.; Lopez, S.; Bouchot, V.; Garibaldi, C.; Lucazeau, F.

    2010-01-01

    Assessment of the underground geothermal potential requires the knowledge of deep temperatures (1-5 km). Here, we present new temperature maps obtained from oil boreholes in the French sedimentary basins. Because of their origin, the data need to be corrected, and their local character necessitates spatial interpolation. Previous maps were obtained in the 1970's using empirical corrections and manual interpolation. In this study, we update the number of measurements by using values collected during the last thirty years, correct the temperatures for transient perturbations and carry out statistical analyses before modelling the 3D distribution of temperatures. This dataset provides 977 temperatures corrected for transient perturbations in 593 boreholes located in the French sedimentary basins. An average temperature gradient of 30.6 deg. C/km is obtained for a representative surface temperature of 10 deg. C. When surface temperature is not accounted for, deep measurements are best fitted with a temperature gradient of 25.7 deg. C/km. We perform a geostatistical analysis on a residual temperature dataset (using a drift of 25.7 deg. C/km) to constrain the 3D interpolation kriging procedure with horizontal and vertical models of variograms. The interpolated residual temperatures are added to the country-scale averaged drift in order to get a three dimensional thermal structure of the French sedimentary basins. The 3D thermal block enables us to extract isothermal surfaces and 2D sections (iso-depth maps and iso-longitude cross-sections). A number of anomalies with a limited depth and spatial extension have been identified, from shallow in the Rhine graben and Aquitanian basin, to deep in the Provence basin. Some of these anomalies (Paris basin, Alsace, south of the Provence basin) may be partly related to thick insulating sediments, while for some others (southwestern Aquitanian basin, part of the Provence basin) large-scale fluid circulation may explain superimposed
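
    A minimal sketch of the residual used above before the kriging step: the measured borehole temperature minus the country-scale linear drift of 25.7 deg. C/km (the fit quoted for the case where surface temperature is not accounted for). The borehole values are invented:

        GRADIENT = 25.7   # deg C per km, drift used for the residual dataset

        def residual_temperature(measured_c: float, depth_km: float) -> float:
            """Temperature anomaly relative to the linear drift."""
            return measured_c - GRADIENT * depth_km

        # Hypothetical corrected bottom-hole measurement at 2.8 km depth:
        print(round(residual_temperature(95.0, 2.8), 1))   # -> 23.0 deg C above the drift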

  8. Compilation of hydrologic data, Little Elm Creek, Trinity River basin, Texas, 1968

    Science.gov (United States)

    ,

    1972-01-01

    The U.S. Soil Conservation Service is actively engaged in the installation of flood and soil erosion reducing measures in Texas under the authority of "The Flood Control Act of 1936 and 1944" and the "Watershed Protection and Flood Prevention Act" (Public Law 566), as amended. In June 1968, the Soil Conservation Service estimated approximately 3,500 structures to be physically and economically feasible for installation in Texas. As of September 30, 1968, 1,271 of these structures had been built. This watershed-development program will have varying but important effects on the surface- and ground-water resources of river basins, especially where a large number of the floodwater-retarding structures are built. Basic hydrologic data are needed to appraise the effects of the structures on water yield and the mode of occurrence of runoff. Hydrologic investigations of these small watersheds were begun by the Geological Survey in 1951 and are now being made in 11 areas (fig. 1). These studies are being made in cooperation with the Texas Water Development Board, the Soil Conservation Service, the San Antonio River Authority, the city of Dallas, and the Tarrant County Water Control and Improvement District No. 1. The 11 study areas were chosen to sample watersheds having different rainfall, topography, geology, and soils. In four of the study areas (Mukewater, North, Little Elm, and Pin Oak Creeks), streamflow and rainfall records were collected prior to construction of the floodwater-retarding structures, thus affording the opportunity for analyses of the conditions "before and after" development. Structures have now been built in three of these study areas. A summary of the development of the floodwater-retarding structures on each study area as of September 30, 1968, is shown in table 1.

  9. Literature search, review, and compilation of data for chemical and radiochemical sensors: Task 1 report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-01-01

    During the next several decades, the US Department of Energy is expected to spend tens of billions of dollars in the characterization, cleanup, and monitoring of DOE's current and former installations that have various degrees of soil and groundwater contamination made up of both hazardous and mixed wastes. Each of these phases will require site surveys to determine type and quantity of hazardous and mixed wastes. It is generally recognized that these required survey and monitoring efforts cannot be performed using traditional chemistry methods based on laboratory evaluation of samples from the field. For that reason, a tremendous push during the past decade or so has been made on research and development of sensors. This report contains the results of an extensive literature search on sensors that are used or have applicability in environmental and waste management. While restricting the search to a relatively small part of the total chemistry spectrum, a sizable body of reference material is included. Results are presented in tabular form for general references obtained from data base searches, as narrative reviews of relevant chapters from proceedings, as book reviews, and as reviews of journal articles with particular relevance to the review. Four broad sensor types are covered: electrochemical processes, piezoelectric devices, fiber optics, and radiochemical processes. The topics of surface chemistry processes and biosensors are not treated separately because they often are an adjunct to one of the four sensors listed. About 1,000 tabular entries are listed, including selected journal articles, reviews of conference/meeting proceedings, and books. Literature to about mid-1992 is covered.

  10. Animal mortality resulting from uniform exposures to photon radiations: Calculated LD50s and a compilation of experimental data

    Energy Technology Data Exchange (ETDEWEB)

    Jones, T.D.; Morris, M.D.; Wells, S.M.; Young, R.W.

    1986-12-01

    Studies conducted during the 1950s and 1960s of radiation-induced mortality to diverse animal species under various exposure protocols were compiled into a mortality data base. Some 24 variables were extracted and recomputed from each of the published studies, which were collected from a variety of available sources, primarily journal articles. Two features of this compilation effort are (1) an attempt to give an estimate of the uniform dose received by the bone marrow in each treatment so that interspecies differences due to body size were minimized and (2) a recomputation of the LD50 where sufficient experimental data are available. Exposure rates varied in magnitude from about 10^-2 to 10^3 R/min. This report describes the data base, the sources of data, and the data-handling techniques; presents a bibliography of studies compiled; and tabulates data from each study. 103 refs., 44 tabs.
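
    The report's LD50 recomputation method is not restated here; as a generic, hedged sketch, an LD50 can be recovered from grouped dose-mortality data by fitting a logistic curve (doses and counts below are invented):

        import numpy as np
        from scipy.optimize import curve_fit

        dose      = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])   # Gy marrow dose, hypothetical
        killed    = np.array([1, 4, 9, 15, 18, 20])
        exposed   = np.array([20, 20, 20, 20, 20, 20])
        mortality = killed / exposed

        def logistic(d, ld50, slope):
            return 1.0 / (1.0 + np.exp(-slope * (d - ld50)))

        (ld50, slope), _ = curve_fit(logistic, dose, mortality, p0=[4.5, 1.0])
        print(f"LD50 ~ {ld50:.2f} Gy (slope {slope:.2f})")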

  11. Compilation of Water-Resources Data and Hydrogeologic Setting for Brunswick County, North Carolina, 1933-2000

    Science.gov (United States)

    Fine, Jason M.; Cunningham, William L.

    2001-01-01

    Water-resources data were compiled for Brunswick County, North Carolina, to describe the hydrologic conditions of the County. Hydrologic data collected by the U.S. Geological Survey as well as data collected by other governmental agencies and reviewed by the U.S. Geological Survey are presented. Data from four weather stations and two surface-water stations are summarized. Data also are presented for land use and land cover, soils, geology, hydrogeology, 12 continuously monitored ground-water wells, 73 periodically measured ground-water wells, and water-quality measurements from 39 ground-water wells. Mean monthly precipitation at the Longwood, Shallotte, Southport, and Wilmington Airport weather stations ranged from 2.19 to 7.94 inches for the periods of record, and mean monthly temperatures at the Longwood, Southport, and Wilmington Airport weather stations ranged from 43.4 to 80.1 degrees Fahrenheit for the periods of record. An evaluation of land-use and land-cover data for Brunswick County indicated that most of the County is either forested land (about 57 percent) or wetlands (about 29 percent). Cross sections are presented to illustrate the general hydrogeology beneath Brunswick County. Water-level data for Brunswick County indicate that water levels ranged from about 110 feet above mean sea level to about 22 feet below mean sea level. Chloride concentrations measured in aquifers in Brunswick County ranged from near 0 to 15,000 milligrams per liter. Chloride levels in the Black Creek and Cape Fear aquifers were measured at well above the potable limit for ground water of 250 milligrams per liter set by the U.S. Environmental Protection Agency for safe drinking water.

  12. Progress in fission product nuclear data. Information about activities in the field of measurements and compilations/evaluations of fission product nuclear data (FPND)

    International Nuclear Information System (INIS)

    Lammer, G.

    1978-07-01

    This is the fourth issue of a report series on Fission Product Nuclear Data (FPND) which is published by the Nuclear Data Section (NDS) of the International Atomic Energy Agency (IAEA). The purpose of this series is to inform scientists working on FPND, or using such data, about all activities in this field which are planned, ongoing, or have recently been completed. The main part of this report consists of unaltered original contributions which the authors have sent to IAEA/NDS. The types of activities being included in this report are measurements, compilations and evaluations of: Fission product yields (neutron induced and spontaneous fission); neutron reaction cross sections of fission products; data related to the radioactive decay of fission products; delayed neutron data of fission products; and lumped fission product data (decay heat, absorption etc.)

  13. Low-temperature geothermal water in Utah: A compilation of data for thermal wells and springs through 1993

    Energy Technology Data Exchange (ETDEWEB)

    Blackett, R.E.

    1994-07-01

    The Geothermal Division of DOE initiated the Low-Temperature Geothermal Resources and Technology Transfer Program, following a special appropriation by Congress in 1991, to encourage wider use of lower-temperature geothermal resources through direct-use, geothermal heat-pump, and binary-cycle power conversion technologies. The Oregon Institute of Technology (OIT), the University of Utah Research Institute (UURI), and the Idaho Water Resources Research Institute organized the federally-funded program and enlisted the help of ten western states to carry out phase one. This first phase involves updating the inventory of thermal wells and springs with the help of the participating state agencies. The state resource teams inventory thermal wells and springs, and compile relevant information on each source. OIT and UURI cooperatively administer the program. OIT provides overall contract management while UURI provides technical direction to the state teams. Phase one of the program focuses on replacing part of GEOTHERM by building a new database of low- and moderate-temperature geothermal systems for use on personal computers. For Utah, this involved (1) identifying sources of geothermal data; (2) designing a database structure; (3) entering the new data; (4) checking for errors, inconsistencies, and duplicate records; (5) organizing the data into reporting formats; and (6) generating a map (1:750,000 scale) of Utah showing the locations and record identification numbers of thermal wells and springs.
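
    Step (4) above, checking for errors, inconsistencies, and duplicate records, is easy to illustrate with a few lines of code. The sketch below flags repeated well/spring entries with pandas; the column names and values are hypothetical and do not reflect the actual structure of the Utah database.

```python
# Illustrative duplicate-record check for a thermal well/spring table.
# Column names and rows are invented for this sketch.
import pandas as pd

wells = pd.DataFrame({
    "site_name": ["Crystal Hot Springs", "Crystal Hot Springs", "Monroe Hot Springs"],
    "latitude": [40.485, 40.485, 38.645],
    "longitude": [-111.910, -111.910, -112.105],
    "temperature_c": [55.0, 55.0, 76.5],
})

# Flag rows that repeat an earlier (name, location) combination.
dupes = wells[wells.duplicated(subset=["site_name", "latitude", "longitude"], keep="first")]
print(dupes)
```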

  14. Neutron data compilation. Report of a Panel sponsored by the International Atomic Energy Agency and held in Brookhaven, 10-14 February 1969

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1969-02-15

    The IAEA organized and convened a Panel on Neutron Data Compilation. This Panel was organized by the Agency following the recommendations made by the International Nuclear Data Committee (INDC) which agreed that a general review of world neutron data compilation activities was desirable. In this context neutron data compilation encompasses the collection, storage and dissemination of bibliographic information and of qualitative and numerical data on the interaction of neutrons with nuclei and atoms for all incident energies. Such information and data have important applications in low energy neutron physics and many important areas of nuclear technology. The principal objective of the Panel on Neutron Data Compilation, which was held at Brookhaven National Laboratory during 10-14 February 1969, was to review how the world's principal data centers located at Brookhaven, Saclay, Obninsk and Vienna could ideally meet the demands and needs of experimental and theoretical neutron physicists, evaluators, and reactor physicists, as well as other existing and potential users. Fourteen papers were considered during formal sessions of the Panel and are reported on the following pages. The members of the Panel separated into five working groups to consider specific terms of reference and make recommendations. Their reports were discussed.

  15. INDC list of correspondents for the exchange of nuclear data information and compilation of national nuclear data committees

    International Nuclear Information System (INIS)

    1989-05-01

    The document includes information on currently existing National Nuclear Data Committees and their memberships. It contains five sections. The first section contains a detailed description of the INDC distribution categories, distribution codes and document designator codes. The second section describes the aims, organization and objectives of individual national nuclear data committees. The third section lists names and addresses in alphabetical order within each state or international organization together with the assigned INDC document distribution code(s). This is followed by four shorter lists, indicating the names of individuals in each distribution category, sorted by country or international organization, and the total number of individuals in each category. The final section provides the names of nuclear data committee members also listed by country or international organization

  16. INDC list of correspondents for the exchange of nuclear data information and compilation of National Nuclear Data Committees

    International Nuclear Information System (INIS)

    1991-01-01

    The document includes information on currently existing National Nuclear Data Committees and their memberships. The report is presented in five sections. The first section contains a detailed description of the INDC distribution categories, distribution codes and document designator codes. The second section describes the aims, organization and objectives of individual national nuclear data committees. The third section lists names and addresses in alphabetical order within each state or international organization together with the assigned INDC document distribution code(s). This is followed by four shorter lists, indicating the names of individuals in each distribution category, sorted by country or international organization, and the total number of individuals in each category. The final section provides the names of nuclear data committee members also listed by country or international organization

  17. INDC list of correspondents for the exchange of nuclear data information and compilation of National Nuclear Data Committees

    International Nuclear Information System (INIS)

    1984-07-01

    The report is presented in five sections. In the first section a detailed description of the INDC distribution categories, distribution codes and document designator codes is given. The second section describes the aims, organization and objectives of individual national nuclear data committees. The third section lists names and addresses in alphabetical order within each state or international organization; INDC document distribution code(s), committee membership and/or area of specialization are indicated, where applicable, for each person. This is followed by four shorter lists, indicating the names of individuals in each distribution category, sorted by country or international organization, and the total number of individuals in each category. The final section provides the names of nuclear data committee members also listed by country or international organization

  18. Arctic temperature and moisture trends during the past 2000 years - Progress from multiproxy-paleoclimate data compilations

    Science.gov (United States)

    Kaufman, Darrell; Routson, Cody; McKay, Nicholas; Beltrami, Hugo; Jaume-Santero, Fernando; Konecky, Bronwen; Saenger, Casey

    2017-04-01

    Instrumental climate data and climate-model projections show that Arctic-wide surface temperature and precipitation are positively correlated. Higher temperatures coincide with greater moisture by: (1) expanding the duration and source area for evaporation as sea ice retracts, (2) enhancing the poleward moisture transport, and (3) increasing the water-vapor content of the atmosphere. Higher temperature also influences evaporation rate, and therefore precipitation minus evaporation (P-E), the climate variable often sensed by paleo-hydroclimate proxies. Here, we test whether Arctic temperature and moisture also correlate on centennial timescales over the Common Era (CE). We use the new PAGES2k multiproxy-temperature dataset along with a first-pass compilation of moisture-sensitive proxy records to calculate century-scale composite timeseries, with a focus on longer records that extend back through the first millennium CE. We present a new Arctic borehole temperature reconstruction as a check on the magnitude of Little Ice Age cooling inferred from the proxy records, and we investigate the spatial pattern of centennial-scale variability. Similar to previous reconstructions, v2 of the PAGES2k proxy temperature dataset shows that, prior to the 20th century, mean annual Arctic-wide temperature decreased over the CE. The millennial-scale cooling trend is most prominent in proxy records from glacier ice, but is also registered in lake and marine sediment, and trees. In contrast, the composite of moisture-sensitive (primarily P-E) records does not exhibit a millennial-scale trend. Determining whether fluctuations in the mean state of Arctic temperature and moisture were in fact decoupled is hampered by the difficulty in detecting a significant trend within the relatively small number of spatially heterogeneous multi-proxy moisture-sensitive records. A decoupling of temperature and moisture would indicate that evaporation had a strong counterbalancing effect on precipitation
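
    The abstract above refers to calculating century-scale composite timeseries from multiproxy records. The sketch below shows one generic way to form such a composite (standardize each record, bin by century, average across records) using synthetic data; it does not reproduce the PAGES2k compositing or screening methodology.

```python
# Generic century-scale composite from annually resolved proxy records:
# standardize each record, bin by century, and average across records.
# The two "proxy" series below are synthetic random walks, not real data.
import numpy as np
import pandas as pd

years = np.arange(1, 2001)                     # 1-2000 CE
rng = np.random.default_rng(0)
records = pd.DataFrame({
    "proxy_a": rng.normal(size=years.size).cumsum() * 0.01,
    "proxy_b": rng.normal(size=years.size).cumsum() * 0.01,
}, index=years)

standardized = (records - records.mean()) / records.std()
century = (records.index - 1) // 100 + 1       # century number, 1..20
composite = standardized.groupby(century).mean().mean(axis=1)
print(composite.round(2))
```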

  19. Data compilation for the 1984 interim report of the Scientific Advisory Council on Forest Decline/Air Pollutions of the Federal German Government and Laender

    International Nuclear Information System (INIS)

    1984-01-01

    The data compilation contains contributions towards an inventory of damage and counter-measures at the forestry level, or investigations into causes and effects in the case of a direct impact on the vegetation, via the soil or interactions between trees, as well as regarding air pollutants and technical measures to reduce emissions (individual entries for parts C2). (DG) [de

  20. Organic carbon fluxes in the Atlantic and the Southern Ocean: relationship to primary production compiled from satellite radiometer data

    Science.gov (United States)

    Fischer, G.; Ratmeyer, V.; Wefer, G.

    Fluxes of organic carbon normalised to a depth of 1000 m from 18 sites in the Atlantic and the Southern Ocean are presented, comprising nine biogeochemical provinces as defined by Longhurst et al. (1995. Journal of Plankton Research 17, 1245-1271). For comparison with primary production, we used a recent compilation of primary production values derived from CZCS data (Antoine et al., 1996. Global Biogeochemical Cycles 10, 57-69). In most cases, the seasonal patterns agreed reasonably well with the carbon fluxes. In particular, organic carbon flux records from two coastal sites off northwest and southwest Africa correlated more distinctly with the primary production in 1°×1° sectors situated closer to the coastal environments. This was primarily caused by large upwelling filaments streaming far offshore, resulting in cross-shelf carbon transport. With respect to primary production, organic carbon export to a water depth of 1000 m, and the fraction of primary production exported to a depth of 1000 m (export fraction = EF 1000), we were able to distinguish between: (1) the coastal environments with the highest values (EF 1000 = 1.75-2.0%), (2) the eastern equatorial upwelling area with moderately high values (EF 1000 = 0.8-1.1%), and (3) the subtropical oligotrophic gyres, which yielded the lowest values (EF 1000 = 0.6%). Carbon export in the Southern Ocean was low to moderate, and the EF 1000 value seems to be quite low in general. Annual organic carbon fluxes were proportional to primary production, and the export fraction EF 1000 increased with primary production up to 350 g C m⁻² yr⁻¹. Latitudinal variations in primary production were reflected in the carbon flux pattern. A high temporal variability of primary production rates and a pronounced seasonality of carbon export were observed in the polar environments, in particular in coastal domains, although primary production (according to Antoine et al., 1996. Global Biogeochemical Cycles 10, 57
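
    The export fraction used above is simply the ratio of the organic carbon flux at 1000 m depth to primary production. A worked example with illustrative numbers (not values from the paper):

```latex
\[
  \mathrm{EF}_{1000} \;=\; \frac{F_{\mathrm{org},\,1000\,\mathrm{m}}}{PP}\times 100\%,
  \qquad\text{e.g.}\quad
  \frac{2\ \mathrm{g\,C\,m^{-2}\,yr^{-1}}}{200\ \mathrm{g\,C\,m^{-2}\,yr^{-1}}}\times 100\% \;=\; 1\%.
\]
```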

  1. Hydrochemical investigation at the Mizunami Underground Research Laboratory. Compilation of groundwater chemistry data in the Mizunami Group and the Toki Granite. Fiscal year 2014

    International Nuclear Information System (INIS)

    Hayashida, Kazuki; Munemoto, Takashi; Iwatsuki, Teruki; Aosai, Daisuke; Inui, Michiharu

    2016-06-01

    Japan Atomic Energy Agency has been investigating groundwater chemistry to understand the effects of excavation and maintenance of underground facilities as part of the Mizunami Underground Research Laboratory (MIU) Project in Mizunami, Gifu, Japan. In this report, we compiled groundwater chemistry data obtained at the MIU in the fiscal year 2014. In terms of ensuring traceability of data, basic information (e.g. sampling location, sampling time, sampling method, analytical method) and methodology for quality control are described. (author)

  2. Hydrochemical investigation at the Mizunami Underground Research Laboratory. Compilation of groundwater chemistry data in the Mizunami group and the Toki granite. Fiscal year 2015

    International Nuclear Information System (INIS)

    Hayashida, Kazuki; Kato, Toshihiro; Munemoto, Takashi; Kubota, Mitsuru; Iwatsuki, Teruki; Aosai, Daisuke; Inui, Michiharu

    2017-03-01

    Japan Atomic Energy Agency has been investigating groundwater chemistry to understand the effect of excavation and maintenance of underground facilities as part of the Mizunami Underground Research Laboratory (MIU) Project in Mizunami, Gifu, Japan. In this report, we compiled data of groundwater chemistry obtained at the MIU in the fiscal year 2015. In terms of ensuring traceability of data, basic information (e.g. sampling location, sampling time, sampling method and analytical method) and methodology for quality control are described. (author)

  3. Hydrochemical investigation at the Mizunami Underground Research Laboratory. Compilation of groundwater chemistry data in the Mizunami group and the Toki granite. Fiscal year 2013

    International Nuclear Information System (INIS)

    Ohmori, Kazuaki; Hasegawa, Takashi; Munemoto, Takashi; Iwatsuki, Teruki; Masuda, Kaoru; Aosai, Daisuke; Inui, Michiharu

    2014-12-01

    Japan Atomic Energy Agency has been investigating groundwater chemistry to understand the effects of excavation and maintenance of underground facilities as part of the Mizunami Underground Research Laboratory (MIU) Project in Mizunami, Gifu, Japan. In this report, we compiled groundwater chemistry data obtained at the MIU in the fiscal year 2013. In terms of ensuring traceability of data, basic information (e.g. sampling location, sampling time, sampling method, analytical method) and methodology for quality control are described. (author)

  4. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 6, November 2004 (last updated 2004.12.16)

    International Nuclear Information System (INIS)

    2005-03-01

    This Radioactive Waste Management Profiles report is a compilation of data collected by the Net Enabled Waste Management Database (NEWMDB) from March to July 2004. The report contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories. It provides or references details of the scope of NEWMDB data collections, and it explains the formats of individual NEWMDB report pages

  5. Hydrochemical investigation at the Mizunami Underground Research Laboratory. Compilation of groundwater chemistry data in Mizunami group and Toki granite. Fiscal year 2012

    International Nuclear Information System (INIS)

    Ohmori, Kazuaki; Iwatsuki, Teruki; Shingu, Shinya; Masuda, Kaoru; Aosai, Daisuke; Inui, Michiharu

    2014-03-01

    Japan Atomic Energy Agency has been investigating groundwater chemistry during excavation and maintenance of underground facilities as part of the Mizunami Underground Research Laboratory (MIU) Project in Mizunami, Gifu, Japan. In this report, we compiled groundwater chemistry data obtained at the MIU in the fiscal year 2012. In terms of ensuring traceability of data, basic information (e.g. sampling location, sampling time, sampling method, analytical method) and methodology for quality control are described. (author)

  6. Compilation and evaluation of fission yield nuclear data. Final report of a co-ordinated research project 1991-1996

    International Nuclear Information System (INIS)

    2000-12-01

    Fission product yields are required at several stages of the nuclear fuel cycle and are therefore included in all large international data files for reactor calculations and related applications. Such files are maintained and disseminated by the Nuclear Data Section of the IAEA as a member of an international data centres network. Users of these data are from the fields of reactor design and operation, waste management and nuclear materials safeguards, all of which are essential parts of the IAEA programme. In the 1980s, the number of measured fission yields increased so drastically that the manpower available for evaluating them to meet specific user needs was insufficient. To cope with this task, it was concluded in several meetings on fission product nuclear data, some of them convened by the IAEA, that international co-operation was required, and an IAEA co-ordinated research project (CRP) was recommended. This recommendation was endorsed by the International Nuclear Data Committee, an advisory body for the nuclear data programme of the IAEA. As a consequence, the CRP on the Compilation and Evaluation of Fission Yield Nuclear Data was initiated in 1991, after its scope, objectives and tasks had been defined by a preparatory meeting. The different tasks, such as special evaluations and development of improved methods, were distributed among participants. The results of the research work were discussed and approved by all participants in research co-ordination meetings. For the successful development of theoretical and empirical models, experiments had to be recommended and their results awaited, which necessitated a two-year extension of the CRP. This TECDOC is the result of a joint effort of all participants in this CRP. The individual sections, some of which comprise significant new scientific developments, represent CRP tasks and were prepared by the participants responsible for the research. The appendices to this book contain voluminous

  7. The KNK II/1 fuel assembly NY-205: Compilation of the irradiation history and the fuel and fuel pin fabrication data of the INTERATOM data bank system BESEX

    International Nuclear Information System (INIS)

    Patzer, G.; Geier, F.

    1988-01-01

    The fuel assembly NY-205 was irradiated during the first and second cores of KNK II with a total residence time of 832 equivalent full-power days. A maximum burnup of 175,000 MWd/tHM, or 18.6%, was reached with a maximum steel damage of 66 dpa-NRT. For the cladding, the materials 1.4970 and 1.4981 were used in different metallurgical conditions, and for the uranium/plutonium mixed-oxide fuel the most important variants of the major fabrication parameters were realized. The assembly will be brought to the Hot Cells of KfK Karlsruhe for post-irradiation examination in February 1988, so knowledge of the fabrication data is of interest for the selection of fuel pins and for the evaluation of the examination results. Therefore, this report compiles the fuel and fuel pin fabrication data from the INTERATOM data bank system BESEX; additionally, an overview of the irradiation history of the assembly is given [de

  8. Compilation of data used for the analysis of the geological and hydrogeological DFN models. Site descriptive modelling SDM-Site Laxemar

    International Nuclear Information System (INIS)

    Hermanson, Jan; Fox, Aaron; Oehman, Johan; Rhen, Ingvar

    2008-08-01

    This report provides an overview and compilation of the various data that constitutes the basis for construction of the geological and hydrogeological discrete feature network (DFN) models as part of model version SDM-Site Laxemar. This includes a review of fracture data in boreholes and in outcrop. Furthermore, the basis for the construction of lineament maps is given as well as a review of the hydraulic test data from cored and percussion-drilled boreholes. An emphasis is put on graphical representation of borehole logs in the form of composites of geological, hydrogeological and even hydrogeochemical data in the case of cored boreholes. One major contribution is a compilation of characteristics of minor local deformation zones (MDZs) identified in cored boreholes. Basic orientation data and fracture intensity data are presented as a function of depth for individual boreholes. The coupling between hydrogeological data and geological data is further refined in plots of Posiva flow log (PFL) data vs. geological single hole interpretation data

  9. Report on the second consultants' meeting of nuclear reaction data centers Kiev, USSR, 11-16 April 1977. Including the thirteenth four-center meeting and the third meeting on charged particle nuclear data compilation

    International Nuclear Information System (INIS)

    Lemmel, H.D.

    1977-10-01

    This second ''NRDC meeting'' combined the 13th ''four centers meeting'' (consultants' meeting of the four neutron nuclear data centers) with the third ''CPND meeting'' (consultants' meeting on charged particle nuclear data compilation). In Part I of the meeting, the neutron data centers held a special session on neutron data matters, in particular on the jointly operated neutron data index CINDA, whereas all items of more general interest, in particular the data exchange system EXFOR, were treated in Part II of the meeting

  10. Mineralogy, geochemistry, porosity and redox properties of rocks from Forsmark. Compilation of data from the regional model volume for SR-Site

    Energy Technology Data Exchange (ETDEWEB)

    Sandstroem, Bjoern (WSP Sverige AB, Stockholm (Sweden)); Stephens, Michael B. (Geological Survey of Sweden, Uppsala (Sweden))

    2009-11-15

    This report is a compilation of the data acquired during the Forsmark site investigation programme on the mineralogy, geochemistry, redox properties and porosity of different rock types at Forsmark. The aim is to provide a final summary of the available data for use during the SR-Site modelling work. Data presented in this report represent the regional model volume and have previously been published in various SKB reports. The data have been extracted from the SKB database Sicada and are presented as calculated median values, data range and lower/upper quartile. The representativity of all samples used for the calculations has been evaluated, and data from samples with insufficient control on the rock type have been omitted. Rock samples affected by alteration have been omitted from the unaltered samples and are presented separately based on type of alteration (e.g. oxidised or albitized rock)

  11. Compilation of new and previously published geochemical and modal data for Mesoproterozoic igneous rocks of the St. Francois Mountains, southeast Missouri

    Science.gov (United States)

    du Bray, Edward A.; Day, Warren C.; Meighan, Corey J.

    2018-04-16

    The purpose of this report is to present recently acquired as well as previously published geochemical and modal petrographic data for igneous rocks in the St. Francois Mountains, southeast Missouri, as part of an ongoing effort to understand the regional geology and ore deposits of the Mesoproterozoic basement rocks of southeast Missouri, USA. The report includes geochemical data that are (1) newly acquired by the U.S. Geological Survey and (2) compiled from numerous sources published during the last fifty-five years. These data are required for ongoing petrogenetic investigations of these rocks. Voluminous Mesoproterozoic igneous rocks in the St. Francois Mountains of southeast Missouri constitute the basement buried beneath Paleozoic sedimentary rock that is over 600 meters thick in places. The Mesoproterozoic rocks of southeast Missouri represent a significant component of approximately 1.4 billion-year-old (Ga) igneous rocks that crop out extensively in North America along the southeast margin of Laurentia; subsequent researchers have suggested that iron oxide-copper deposits in the St. Francois Mountains are genetically associated with ca. 1.4 Ga magmatism in this region. The geochemical and modal data sets described herein were compiled to support investigations concerning the tectonic setting and petrologic processes responsible for the associated magmatism.

  12. Digitally Available Interval-Specific Rock-Sample Data Compiled from Historical Records, Nevada Test Site and Vicinity, Nye County, Nevada.

    Energy Technology Data Exchange (ETDEWEB)

    David B. Wood

    2007-10-24

    Between 1951 and 1992, 828 underground tests were conducted on the Nevada Test Site, Nye County, Nevada. Prior to and following these nuclear tests, holes were drilled and mined to collect rock samples. These samples are organized and stored by depth of borehole or drift at the U.S. Geological Survey Core Library and Data Center at Mercury, Nevada, on the Nevada Test Site. From these rock samples, rock properties were analyzed and interpreted and compiled into project files and in published reports that are maintained at the Core Library and at the U.S. Geological Survey office in Henderson, Nevada. These rock-sample data include lithologic descriptions, physical and mechanical properties, and fracture characteristics. Hydraulic properties also were compiled from holes completed in the water table. Rock samples are irreplaceable because pre-test, in-place conditions cannot be recreated and samples cannot be recollected from the many holes destroyed by testing. Documenting these data in a published report will ensure availability for future investigators.

  13. Digitally Available Interval-Specific Rock-Sample Data Compiled from Historical Records, Nevada Test Site and Vicinity, Nye County, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    David B. Wood

    2009-10-08

    Between 1951 and 1992, underground nuclear weapons testing was conducted at 828 sites on the Nevada Test Site, Nye County, Nevada. Prior to and following these nuclear tests, holes were drilled and mined to collect rock samples. These samples are organized and stored by depth of borehole or drift at the U.S. Geological Survey Core Library and Data Center at Mercury, Nevada, on the Nevada Test Site. From these rock samples, rock properties were analyzed and interpreted and compiled into project files and in published reports that are maintained at the Core Library and at the U.S. Geological Survey office in Henderson, Nevada. These rock-sample data include lithologic descriptions, physical and mechanical properties, and fracture characteristics. Hydraulic properties also were compiled from holes completed in the water table. Rock samples are irreplaceable because pre-test, in-place conditions cannot be recreated and samples cannot be recollected from the many holes destroyed by testing. Documenting these data in a published report will ensure availability for future investigators.

  14. Geodatabase compilation of hydrogeologic, remote sensing, and water-budget-component data for the High Plains aquifer, 2011

    Science.gov (United States)

    Houston, Natalie A.; Gonzales-Bradford, Sophia L.; Flynn, Amanda T.; Qi, Sharon L.; Peterson, Steven M.; Stanton, Jennifer S.; Ryter, Derek W.; Sohl, Terry L.; Senay, Gabriel B.

    2013-01-01

    The High Plains aquifer underlies almost 112 million acres in the central United States. It is one of the largest aquifers in the Nation in terms of annual groundwater withdrawals and provides drinking water for 2.3 million people. The High Plains aquifer has gained national and international attention as a highly stressed groundwater supply primarily because it has been appreciably depleted in some areas. The U.S. Geological Survey has an active program to monitor the changes in groundwater levels for the High Plains aquifer and has documented substantial water-level changes since predevelopment. The High Plains Groundwater Availability Study is part of a series of regional groundwater availability studies conducted to evaluate the availability and sustainability of major aquifers across the Nation. The goals of the regional groundwater studies are to quantify current groundwater resources in an aquifer system, evaluate how these resources have changed over time, and provide tools to better understand a system's response to future demands and environmental stresses. The purpose of this report is to present selected data developed and synthesized for the High Plains aquifer as part of the High Plains Groundwater Availability Study. The High Plains Groundwater Availability Study includes the development of a water-budget-component analysis for the High Plains completed in 2011 and development of a groundwater-flow model for the northern High Plains aquifer. Both of these tasks require large amounts of data about the High Plains aquifer. Data pertaining to the High Plains aquifer were collected, synthesized, and then organized into digital data containers called geodatabases. There are 8 geodatabases (1 file geodatabase and 7 personal geodatabases) that have been grouped into three categories: hydrogeologic data, remote sensing data, and water-budget-component data. The hydrogeologic data pertaining to the northern High Plains aquifer are included in three separate

  15. SPARQL compiler for Bobox

    OpenAIRE

    Čermák, Miroslav

    2013-01-01

    The goal of the work is to design and implement a SPARQL compiler for the Bobox system. In addition to lexical and syntactic analysis conforming to the W3C standard for the SPARQL language, it performs semantic analysis and query optimization. The compiler constructs an appropriate model for execution in Bobox that depends on the physical database schema.
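
    As a rough illustration of the front-end stages mentioned in the abstract (lexical and syntactic analysis producing a model for execution), the toy parser below handles a tiny SELECT subset of a SPARQL-like language. It is not the Bobox compiler and omits semantic analysis and optimization.

```python
# Toy SPARQL-like front end: tokenize a minimal SELECT query and build a simple
# "execution model" (projected variables plus a list of triple patterns).
import re
from dataclasses import dataclass

TOKEN = re.compile(r"\?\w+|<[^>]*>|SELECT|WHERE|[{}.]|\S+")

@dataclass
class TriplePattern:
    subject: str
    predicate: str
    obj: str

def parse_select(query: str):
    """Return (projected variables, triple patterns) for a minimal SELECT query."""
    tokens = TOKEN.findall(query)
    assert tokens[0].upper() == "SELECT"
    i, variables = 1, []
    while tokens[i].upper() != "WHERE":          # lexical + syntactic analysis
        variables.append(tokens[i]); i += 1
    assert tokens[i + 1] == "{"
    patterns, i = [], i + 2
    while tokens[i] != "}":                      # basic graph pattern
        patterns.append(TriplePattern(tokens[i], tokens[i + 1], tokens[i + 2]))
        i += 3
        if tokens[i] == ".":
            i += 1
    return variables, patterns

vars_, bgp = parse_select(
    "SELECT ?name WHERE { ?person <http://xmlns.com/foaf/0.1/name> ?name . }")
print(vars_, bgp)
```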

  16. Compilation of Cooperative Data Element Dictionary of Five Federal Agencies’ Systems for Processing of Technical Report Literature

    Science.gov (United States)

    1983-03-01

    investigator participated with Library of Congress (LC) staff and Government Printing Office (GPO) catalogers in identifying which data elements...National Standards Institute standard format, ANSI Z39.2). This standard includes identification of specific data elements by means of 3-digit tags...the closer they can come to efficient exchange. But tags assigned to the core data elements differ: they are all 3-digit tags, but the same 3 digits

  17. The national assessment of shoreline change: a GIS compilation of vector cliff edges and associated cliff erosion data for the California coast

    Science.gov (United States)

    Hapke, Cheryl; Reid, David; Borrelli, Mark

    2007-01-01

    The U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector cliff edges and associated rates of cliff retreat along the open-ocean California coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Cliff erosion is a chronic problem along many coastlines of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of coastal cliff retreat. There is also a critical need for these data to be consistent from one region to another. One objective of this work is to develop a standard, repeatable methodology for mapping and analyzing cliff edge retreat so that periodic, systematic, and internally consistent updates of cliff edge position and associated rates of erosion can be made at a national scale. This data compilation for open-ocean cliff edges of the California coast is a separate but related study to that of Hapke and others (2006), which documents shoreline change along sandy shorelines of the California coast and is itself one in a series that includes the Gulf of Mexico and the Southeast Atlantic coast (Morton and others, 2004; Morton and Miller, 2005). Future reports and data compilations will include coverage of the Northeast U.S., the Great Lakes, Hawaii and Alaska. Cliff edge change is determined by comparing the positions of one historical cliff edge digitized from maps with a modern cliff edge derived from topographic LIDAR (light detection and ranging) surveys. Historical cliff edges for the California coast represent the 1920s-1930s time-period; the most recent cliff edge was delineated using data collected between 1998 and 2002. End-point rate calculations were used to evaluate rates of erosion between the two cliff edges. Please refer to our full report on cliff edge erosion along the California

  18. k0-measurements and related nuclear data compilation for (n, γ) reactor neutron activation analysis Pt. 3b

    International Nuclear Information System (INIS)

    Corte, F. de; Simonits, A.

    1989-01-01

    k0-factors and related nuclear data are tabulated for 112 radionuclides of interest in (n, γ) reactor neutron activation analysis. Whenever relevant, critical comments are made with respect to the accuracy of literature data for e.g. isotopic abundances, half-lives, absolute gamma-intensities and 2200 m·s⁻¹ (n, γ) cross sections. As to the latter, a comparison is made with the values calculated from the experimentally determined k0-factors, by introduction of selected literature data for the input parameters. References to the table (79 pages) include 156 items. (author) 7 refs.; 1 tab
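
    For context, the k0 factor tabulated in such compilations is commonly defined relative to the gold comparator in terms of exactly the nuclear data mentioned above (molar mass M, isotopic abundance θ, absolute gamma intensity γ, and 2200 m·s⁻¹ cross section σ0). The expression below is the standard textbook form, given here only as a reminder and not quoted from the paper.

```latex
% Commonly cited definition of the k0 factor of analyte a relative to the
% gold comparator (values are nuclear data, not measured quantities):
\[
  k_{0,\mathrm{Au}}(a) \;=\;
  \frac{M_{\mathrm{Au}}\,\theta_a\,\gamma_a\,\sigma_{0,a}}
       {M_a\,\theta_{\mathrm{Au}}\,\gamma_{\mathrm{Au}}\,\sigma_{0,\mathrm{Au}}}
\]
```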

  19. Massachusetts shoreline change project: a GIS compilation of vector shorelines and associated shoreline change data for the 2013 update

    Science.gov (United States)

    Smith, Theresa L.; Himmelstoss, Emily A.; Thieler, E. Robert

    2013-01-01

    Identifying the rates and trends associated with the position of the shoreline through time presents vital information on potential impacts these changes may have on coastal populations and infrastructure, and supports informed coastal management decisions. This report publishes the historical shoreline data used to assess the scale and timing of erosion and accretion along the Massachusetts coast from New Hampshire to Rhode Island including all of Cape Cod, Martha’s Vineyard, Nantucket and the Elizabeth Islands. This data is an update to the Massachusetts Office of Coastal Zone Management Shoreline Change Project. Shoreline positions from the past 164 years (1845 to 2009) were used to compute the shoreline change rates. These data include a combined length of 1,804 kilometers of new shoreline data derived from color orthophoto imagery collected in 2008 and 2009, and topographic lidar collected in 2007. These new shorelines have been added to previously published historic shoreline data from the Massachusetts Office of Coastal Zone Management and the U.S. Geological Survey. A detailed report containing a discussion of the shoreline change data presented here and a summary of the resulting rates is available and cited at the end of the Introduction section of this report.

  20. Compilation of data on the release of radioactive substances in the vent air of nuclear power plants in the Federal Republic of Germany in 1975

    International Nuclear Information System (INIS)

    Winkelmann, I.; Endrulat, H.J.; Haubelt, R.; Westpfahl, U.

    1976-04-01

    The present compilation of data on the release of radioactive substances in the vent air of nuclear power plants in the FRG is a continuation of a report series on aerosol filter and iodine filter samples from the exhaust air control systems of the nuclear power plants Gundremmingen, Obrigheim, Wuergassen, Stade, Lingen and Biblis A. The reports have been issued by the Federal public health office since 1972. This report is supplemented by annual release values on radioactive noble gases, on short- and long-lived aerosols, and on gaseous 131I, supplied by the individual nuclear power plants as in previous years on uniform questionnaires. Data on the release of tritium are also available from some nuclear power plants. (orig.) [de

  1. Compilation, assessment and expansion of the strong earthquake ground motion data base. Seismic Safety Margins Research Program (SSMRP)

    International Nuclear Information System (INIS)

    Crouse, C.B.; Hileman, J.A.; Turner, B.E.; Martin, G.R.

    1980-09-01

    A catalog has been prepared which contains information on (1) worldwide ground-motion accelerograms, (2) the accelerograph sites where these records were obtained, and (3) the seismological parameters of the causative earthquakes. The catalog is limited to data for those accelerograms which have been digitized and published. In addition, the quality and completeness of these data are assessed. This catalog is unique because it is the only publication which contains comprehensive information on the recording conditions of all known digitized accelerograms. However, information for many accelerograms is missing. Although some literature may have been overlooked, most of the missing data have not been published. Nevertheless, the catalog provides a convenient reference and useful tool for earthquake engineering research and applications. (author)

  2. Statistical results 1988-1990 of the Official Personal Dosimetry Service and data compilation 1980-1990

    International Nuclear Information System (INIS)

    Boerner, E.; Drexler, G.; Scheibe, D.; Schraube, H.

    1994-01-01

    The report summarizes relevant statistical data on official personal dosimetry in 1988-1990 for the Federal States of Bavaria, Hesse, Schleswig-Holstein and, since 1989, Baden-Wuerttemberg. The data are based on a survey of more than 8000 institutions with over 100000 occupationally exposed persons and are derived from more than one million individual measurements. The report covers information on the institutions and the persons as well as the dosimetric values. The measuring method is described briefly with respect to the dosimeters used, their range and the interpretation of values. Information on notional doses and the interpolation of values near the detection limits is given. (HP) [de

  3. Customer satisfaction at US Army Corps of Engineers-administered lakes: a compilation of two years of performance data

    Science.gov (United States)

    Robert C. Burns; Alan R. Graefe; John P. Titre

    1998-01-01

    The purpose of this paper was to demonstrate the application of a model which can be used to predict the overall customer satisfaction levels of water-based recreationists. Data were collected from two distinctly different user groups; boat ramp users and campground users. Results indicated that each user group had different satisfaction attributes that impacted their...

  4. Compilation of geospatial data for the mineral industries and related infrastructure of Latin America and the Caribbean

    Science.gov (United States)

    Baker, Michael S.; Buteyn, Spencer D.; Freeman, Philip A.; Trippi, Michael H.; Trimmer III, Loyd M.

    2017-07-31

    This report describes the U.S. Geological Survey’s (USGS) ongoing commitment to its mission of understanding the nature and distribution of global mineral commodity supply chains by updating and publishing the georeferenced locations of mineral commodity production and processing facilities, mineral exploration and development sites, and mineral commodity exporting ports in Latin America and the Caribbean. The report includes an overview of data sources and an explanation of the geospatial PDF map format. The geodatabase and geospatial data layers described in this report create a new geographic information product in the form of a geospatial portable document format (PDF) map. The geodatabase contains additional data layers from USGS, foreign governmental, and open-source sources as follows: (1) coal occurrence areas, (2) electric power generating facilities, (3) electric power transmission lines, (4) hydrocarbon resource cumulative production data, (5) liquefied natural gas terminals, (6) oil and gas concession leasing areas, (7) oil and gas field center points, (8) oil and gas pipelines, (9) USGS petroleum provinces, (10) railroads, (11) recoverable proven plus probable hydrocarbon resources, (12) major cities, (13) major rivers, and (14) undiscovered porphyry copper tracts.

  5. Radiation dose responses for chemoradiation therapy of pancreatic cancer: an analysis of compiled clinical data using biophysical models.

    Science.gov (United States)

    Moraru, Ion C; Tai, An; Erickson, Beth; Li, X Allen

    2014-01-01

    We analyzed recent clinical data obtained from chemoradiation of unresectable, locally advanced pancreatic cancer (LAPC) in order to examine possible benefits from radiation therapy dose escalation. A modified linear quadratic model was used to fit clinical tumor response and survival data of chemoradiation treatments for LAPC reported from 20 institutions. Biophysical radiosensitivity parameters were extracted from the fits. Examination of the clinical data demonstrated an enhancement in tumor response with higher irradiation dose, an important clinical result for palliation and quality of life. Little indication of improvement in 1-year survival with increased radiation dose was observed. Possible dose escalation schemes are proposed based on calculations of the biologically effective dose required for a 50% tumor response rate. Based on the evaluation of tumor response data, the escalation of radiation dose presents potential clinical benefits which, when combined with normal tissue complication analyses, may result in improved treatment outcomes for locally advanced pancreatic cancer patients.
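
    For reference, the biologically effective dose in the standard (unmodified) linear-quadratic model for n fractions of size d is given below, with an illustrative arithmetic example; the paper itself uses a modified LQ model whose exact form is not reproduced here.

```latex
% Standard LQ biologically effective dose (illustrative numbers only):
\[
  \mathrm{BED} \;=\; n\,d\left(1 + \frac{d}{\alpha/\beta}\right),
  \qquad\text{e.g. } 28 \times 1.8\ \mathrm{Gy},\ \alpha/\beta = 10\ \mathrm{Gy}
  \;\Rightarrow\; \mathrm{BED} = 50.4\,(1 + 0.18) \approx 59.5\ \mathrm{Gy}.
\]
```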

  6. Compiling a national resistivity atlas of Denmark based on airborne and ground-based transient electromagnetic data

    DEFF Research Database (Denmark)

    Barfod, Adrian; Møller, Ingelise; Christiansen, Anders Vest

    2016-01-01

    in the large-scale resistivity-lithology relations, reflecting geological details such as available source material for tills. The resistivity maps also reveal a clear ambiguity in the resistivity values for different lithologies. The Resistivity Atlas is highly useful when geophysical data are to be used...

  7. Annual compilation and analysis of hydrologic data for urban studies in the Bryan, Texas, metropolitan area, 1969

    Science.gov (United States)

    Robbins, W.D.

    1972-01-01

    Hydrologic investigations of urban areas in Texas were begun by the U.S. Geological Survey in 1954. These studies are now in progress in Austin, Houston, Dallas, Dallas County, Fort Worth, San Antonio, and Bryan. The objectives of these studies are: 1. To determine, on the basis of historical data and hydrologic analyses, the magnitude and frequency of floods. 2. To document and define the areal extent of floods of greater than ordinary magnitude. 3. To determine the effect of urban development on flood peaks and volume. 4. To provide applied research facilities for studies at Texas A & M University at College Station. This report, the first in a series of reports to be published annually, is primarily applicable to objectives 2, 3, and 4. The report presents the basic hydrologic data collected in two study areas during the 1969 water year (October 1, 1968, to September 30, 1969) and basic hydrologic data collected during part of the 1968 water year (April 5, 1968, to September 30, 1968). The locations of the two basins within the study area, Burton Creek and Hudson Creek, are shown on figure 1.

  8. Compilation of selected marine radioecological data for the US Subseabed Program: Summaries of available radioecological concentration factors and biological half-lives

    International Nuclear Information System (INIS)

    Gomez, L.S.; Marietta, M.G.; Jackson, D.W.

    1987-04-01

    The US Subseabed Disposal Program has compiled an extensive concentration factor and biological half-life data base from the international marine radioecological literature. A microcomputer-based data management system has been implemented to provide statistical and graphic summaries of these data. The data base is constructed in a manner which allows subsets to be sorted using a number of interstudy variables such as organism category, tissue/organ category, geographic location (for in situ studies), and several laboratory-related conditions (e.g., exposure time and exposure concentration). This report updates earlier reviews and provides summaries of the tabulated data. In addition to the concentration factor/biological half-life data base, we provide an outline of other published marine radioecological works. Our goal is to present these data in a form that enables those concerned with predictive assessment of radiation dose in the marine environment to make a more judicious selection of data for a given application. 555 refs., 19 figs., 7 tabs

  9. Compilation of selected marine radioecological data for the US Subseabed Program: Summaries of available radioecological concentration factors and biological half-lives

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, L.S.; Marietta, M.G.; Jackson, D.W.

    1987-04-01

    The US Subseabed Disposal Program has compiled an extensive concentration factor and biological half-life data base from the international marine radioecological literature. A microcomputer-based data management system has been implemented to provide statistical and graphic summaries of these data. The data base is constructed in a manner which allows subsets to be sorted using a number of interstudy variables such as organism category, tissue/organ category, geographic location (for in situ studies), and several laboratory-related conditions (e.g., exposure time and exposure concentration). This report updates earlier reviews and provides summaries of the tabulated data. In addition to the concentration factor/biological half-life data base, we provide an outline of other published marine radioecological works. Our goal is to present these data in a form that enables those concerned with predictive assessment of radiation dose in the marine environment to make a more judicious selection of data for a given application. 555 refs., 19 figs., 7 tabs.

  10. VizieR Online Data Catalog: UY UMa and EF Boo compiled time of minima (Yu+, 2017)

    Science.gov (United States)

    Yu, Y.-X.; Zhang, X.-D.; Hu, K.; Xiang, F.-Y.

    2017-11-01

    In order to construct the (O-C) diagram to analyze the period change of UY UMa, we have performed a careful search for all available times of light minima. A total of 76 times of light minima were collected and listed in Table 2. From the literature and two well-known databases (i.e., the O-C gateway (http://var.astro.cz/ocgate) and the Lichtenknecker database of the BAV (http://www.bav-astro.de/LkDB/index.php)), we have collected a total of 75 available times of light minima for EF Boo, which are summarized in Table 3. (3 data files).
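
    An (O-C) diagram compares observed times of minimum light with those computed from a linear ephemeris, HJD_min = T0 + P·E. The sketch below shows the arithmetic with invented ephemeris constants and minima; it does not use the values derived in the paper.

```python
# Sketch of forming (O-C) values from a linear ephemeris HJD_min = T0 + P*E.
# The epoch, period, and observed minima below are hypothetical.
import numpy as np

T0, P = 2452500.1234, 0.375000                    # epoch (HJD) and period (days)
observed = np.array([2452500.1230, 2453100.1219, 2453700.1280])  # observed minima (HJD)

E = np.rint((observed - T0) / P)                  # nearest integer cycle numbers
computed = T0 + P * E                             # predicted times of minimum
o_minus_c = observed - computed
for e, oc in zip(E.astype(int), o_minus_c):
    print(f"E = {e:6d}   O-C = {oc:+.5f} d")
```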

  11. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 9, May 2008

    International Nuclear Information System (INIS)

    2008-05-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) is an Internet-based application which contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories in IAEA Member States. It can be accessed via the following Internet address: http://www-newmdb.iaea.org. The Country Waste Profiles provide a concise summary of the information entered into the NEWMDB system by each participating Member State. This Profiles report is based on data collected using the NEWMDB from May to December 2007

  12. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 8, August 2007

    International Nuclear Information System (INIS)

    2007-08-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) is an Internet-based application which contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories in IAEA Member States. It can be accessed via the following Internet address: http://www-newmdb.iaea.org. The Country Waste Profiles provide a concise summary of the information entered into the NEWMDB system by each participating Member State. This Profiles report is based on data collected using the NEWMDB from May to December 2006

  13. Compilation of the Dakota Aquifer Project isotope data and publications: The Isotope Hydrology Program of the Isotope Sciences Division

    International Nuclear Information System (INIS)

    Davisson, M.L.; Smith, D.K.; Hudson, G.B.; Niemeyer, S.; Macfarlane, P.A.; Whittemore, D.O.

    1995-01-01

    In FY92 the then Nuclear Chemistry Division embarked on a scientific collaboration with the Kansas Geological Survey (KGS) to characterize groundwater of the Dakota Formation of Kansas using isotope techniques. The Dakota Formation is a Cretaceous-aged marine sandstone hosting potable groundwater in most regions of Kansas, whose use will serve to partially offset the severe overdraft problems in the overlying Ogallala Formation. The isotope characterization of the Dakota groundwater has generated data that delineate sources, ages, and subsurface controls on the water quality. Initial interpretations of the data have been published in abstract volumes of (1) the 1993 Geological Society of America National Meeting, (2) the 8th International Conference on Geochronology, Cosmochronology and Isotope Geology, and (3) the 1994 Dakota Aquifer Workshop and Clinic. Copies of all abstracts are included in this brief review. One report will focus on the sources and ages of the groundwater, and the other will focus on the subsurface controls on the natural water quality

  14. Compilation of data to estimate groundwater migration potential for constituents in active liquid discharges at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Ames, L.L.; Serne, R.J.

    1991-03-01

    A preliminary characterization of the constituents present in the 33 liquid waste streams at the US Department of Energy's Hanford Site has been completed by Westinghouse Hanford Company. In addition, Westinghouse Hanford has summarized the soil characteristics based on drill logs collected at each site that receives these liquid wastes. Literature searches were conducted and available Hanford-specific data were tabulated and reviewed. General literature on organic chemicals present in the liquid waste streams was also reviewed. Using all of this information, Pacific Northwest Laboratory has developed a best estimate of the transport characteristics (water solubility and soil adsorption properties) for those radionuclides and inorganic and organic chemicals identified in the various waste streams. We assume that the potential for transport is qualified through the four geochemical parameters: solubility, distribution coefficient, persistence (radiogenic or biochemical half-life), and volatility. Summary tables of these parameters are presented for more than 50 inorganic and radioactive species and more than 50 organic compounds identified in the liquid waste streams. Brief descriptions of the chemical characteristics of Hanford sediments, solubility, and adsorption processes, and of how geochemical parameters are used to estimate migration in groundwater-sediment environments are also presented. Groundwater monitoring data are tabulated for wells neighboring the facilities that receive the liquid wastes. 91 refs., 16 figs., 23 tabs.
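
    One standard way a distribution coefficient (Kd) enters groundwater migration estimates is through the retardation factor of a sorbing constituent relative to the pore water. The relation below is given only as general background and is not necessarily the exact formulation used in this report; the example values are illustrative.

```latex
% Retardation factor of a sorbing constituent (rho_b = bulk density,
% theta = water-filled porosity). Example: rho_b = 1.6 g/cm^3, theta = 0.3,
% Kd = 10 mL/g gives R = 1 + (1.6/0.3)*10 ~ 54.
\[
  R \;=\; 1 + \frac{\rho_b}{\theta}\,K_d
\]
```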

  15. Compilation of data to estimate groundwater migration potential for constituents in active liquid discharges at the Hanford Site

    International Nuclear Information System (INIS)

    Ames, L.L.; Serne, R.J.

    1991-03-01

    A preliminary characterization of the constituents present in the 33 liquid waste streams at the US Department of Energy's Hanford Site has been completed by Westinghouse Hanford Company. In addition, Westinghouse Hanford has summarized the soil characteristics based on drill logs collected at each site that receives these liquid wastes. Literature searches were conducted and available Hanford-specific data were tabulated and reviewed. General literature on organic chemicals present in the liquid waste streams was also reviewed. Using all of this information, Pacific Northwest Laboratory has developed a best estimate of the transport characteristics (water solubility and soil adsorption properties) for those radionuclides and inorganic and organic chemicals identified in the various waste streams. We assume that the potential for transport is qualified through the four geochemical parameters: solubility, distribution coefficient, persistence (radiogenic or biochemical half-life), and volatility. Summary tables of these parameters are presented for more than 50 inorganic and radioactive species and more than 50 organic compounds identified in the liquid waste streams. Brief descriptions of the chemical characteristics of Hanford sediments, solubility, and adsorption processes, and of how geochemical parameters are used to estimate migration in groundwater-sediment environments are also presented. Groundwater monitoring data are tabulated for wells neighboring the facilities that receive the liquid wastes. 91 refs., 16 figs., 23 tabs

  16. The National Assessment of Shoreline Change: A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the U.S. Gulf of Mexico

    Science.gov (United States)

    Miller, Tara L.; Morton, Robert A.; Sallenger, Asbury H.; Moore, Laura J.

    2004-01-01

    Introduction The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive database of digital vector shorelines and shoreline change rates for the U.S. Gulf of Mexico. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along most open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard repeatable methods for mapping and analyzing shoreline movement so that periodic updates regarding coastal erosion and land loss can be made nationally that are systematic and internally consistent. This data compilation for open-ocean, sandy shorelines of the Gulf of Mexico is the first in a series that will eventually include the Atlantic Coast, Pacific Coast, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are based on merging three historical shorelines with a modern shoreline derived from lidar (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1800s, 1920s-1930s, and 1970s. The most recent shoreline is derived from data collected over the period of 1998-2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are simple end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change in the Gulf of Mexico, National Assessment of Shoreline Change: Part 1, Historical Shoreline Changes and Associated Coastal Land Loss Along the U.S. Gulf of Mexico (USGS Open File
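
    The two rate calculations described above (end-point rates from the two most recent shorelines and linear-regression rates through all four) reduce to simple arithmetic at each shore-perpendicular transect. The sketch below uses invented transect positions and dates, not values from the assessment.

```python
# Shoreline change rates at one hypothetical transect: cross-shore positions
# (metres seaward of an arbitrary baseline) at four survey dates.
import numpy as np

years     = np.array([1858.0, 1934.0, 1973.0, 2001.0])
positions = np.array([152.0, 131.0, 118.0, 104.0])     # m from baseline

end_point_rate = (positions[-1] - positions[-2]) / (years[-1] - years[-2])
regression_rate = np.polyfit(years, positions, 1)[0]   # slope of best-fit line, m/yr

print(f"end-point rate:         {end_point_rate:+.2f} m/yr")
print(f"linear-regression rate: {regression_rate:+.2f} m/yr")
```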

  17. Compilation of streamflow statistics calculated from daily mean streamflow data collected during water years 1901–2015 for selected U.S. Geological Survey streamgages

    Science.gov (United States)

    Granato, Gregory E.; Ries, Kernell G.; Steeves, Peter A.

    2017-10-16

    Streamflow statistics are needed by decision makers for many planning, management, and design activities. The U.S. Geological Survey (USGS) StreamStats Web application provides convenient access to streamflow statistics for many streamgages by accessing the underlying StreamStatsDB database. In 2016, non-interpretive streamflow statistics were compiled for streamgages located throughout the Nation and stored in StreamStatsDB for use with StreamStats and other applications. Two previously published USGS computer programs that were designed to help calculate streamflow statistics were updated to better support StreamStats as part of this effort. These programs are named “GNWISQ” (Get National Water Information System Streamflow (Q) files), updated to version 1.1.1, and “QSTATS” (Streamflow (Q) Statistics), updated to version 1.1.2.Statistics for 20,438 streamgages that had 1 or more complete years of record during water years 1901 through 2015 were calculated from daily mean streamflow data; 19,415 of these streamgages were within the conterminous United States. About 89 percent of the 20,438 streamgages had 3 or more years of record, and about 65 percent had 10 or more years of record. Drainage areas of the 20,438 streamgages ranged from 0.01 to 1,144,500 square miles. The magnitude of annual average streamflow yields (streamflow per square mile) for these streamgages varied by almost six orders of magnitude, from 0.000029 to 34 cubic feet per second per square mile. About 64 percent of these streamgages did not have any zero-flow days during their available period of record. The 18,122 streamgages with 3 or more years of record were included in the StreamStatsDB compilation so they would be available via the StreamStats interface for user-selected streamgages. All the statistics are available in a USGS ScienceBase data release.
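
    The "annual average streamflow yield" quoted above is simply mean streamflow divided by drainage area. The snippet below shows the arithmetic, along with a count of zero-flow days, using hypothetical values rather than NWIS data; it is not the GNWISQ/QSTATS code.

```python
# Streamflow yield = mean daily streamflow / drainage area (hypothetical values).
import numpy as np

daily_mean_cfs = np.array([12.0, 15.5, 9.8, 30.2, 22.1])   # ft^3/s, a few sample days
drainage_area_mi2 = 46.3                                    # mi^2 (hypothetical)

yield_cfsm = daily_mean_cfs.mean() / drainage_area_mi2      # ft^3/s per mi^2
zero_flow_days = int((daily_mean_cfs == 0).sum())

print(f"average yield: {yield_cfsm:.4f} cfs/mi^2, zero-flow days: {zero_flow_days}")
```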

  18. HAL/S-FC compiler system functional specification

    Science.gov (United States)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package, as well as restrictions and dependencies of the HAL/S-FC system, are also considered.

  19. Data compilation task report for the source investigation of the 300-FF-1 operable unit phase 1 remedial investigation

    International Nuclear Information System (INIS)

    Young, J.S.; Fruland, R.M.; Fruchter, J.S.

    1990-02-01

    This report provides additional information on facility and waste characteristics for the 300-FF-1 operable unit. The additional information gathered and reported includes meetings and on-site visits with current and past personnel having knowledge of operations in the operable unit, a more precise determination of the location of the Process Sewer lines and Retired Radioactive Liquid Waste Sewer, a better understanding of the phosphoric acid spill at the 340 Complex, and a search for engineering plans and environmental reports related to the operable unit. As a result of this data-gathering effort, recommendations for further investigation include characterization of the 307 Trenches to determine the origin of an underlying uranium plume in the groundwater, more extensive sampling of near-surface and dike sediments in the North and South Process Ponds to better define the extent of horizontal contamination, and detection of possible leaks in the abandoned Radioactive Waste Sewer by either electromagnetic induction or remote television camera inspection techniques. 16 refs., 4 figs., 5 tabs

  20. Compilation and evaluation of radioecological data on soil/plant transfer in consideration of local variabilities in Germany

    International Nuclear Information System (INIS)

    Cierjacks, A.; Albers, B.

    2004-01-01

    Publications on soil-to-plant transfer factors (TFs) for radiocesium, radiostrontium, plutonium and iodine-129 in Germany were evaluated. Over 100 publications with relevant TFs were identified, of which 54 were intensively analyzed and rated according to quality criteria. A database was created which gives a comprehensive survey of the transfer factors, important related soil and plant parameters and peculiarities of sampling and analyses. For better comparability, TFs were standardized and expressed in units of [Bq kg⁻¹ plant dry matter / Bq kg⁻¹ soil dry matter]. To enable statistical analyses, soil and plant parameters were standardized, too. Standardization also prepares the data as input for modelling. The database contains 4800 records which represent singular and aggregated values of more than 7300 samples taken in Germany. Mean values of individual radionuclide/crop combinations can be queried easily using a special software module. Additional information about the experimental design, nuclide contents in plants and soil, important parameters and detailed remarks allows a classification of each record.
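
    The standardized transfer factor defined above is a dry-matter activity ratio, so a worked example takes only a few lines. The sketch below is illustrative only; the function and sample values are hypothetical and do not come from the database itself.

    # Illustrative only: TF = activity in plant / activity in soil, both per kg dry matter.
    def transfer_factor(plant_bq_per_kg_fresh, dry_matter_fraction, soil_bq_per_kg_dry):
        plant_bq_per_kg_dry = plant_bq_per_kg_fresh / dry_matter_fraction
        return plant_bq_per_kg_dry / soil_bq_per_kg_dry

    # Hypothetical sample: 40 Bq/kg fresh grass with 20 % dry matter, 500 Bq/kg dry soil.
    print(f"TF = {transfer_factor(40.0, 0.20, 500.0):.2f}")   # -> 0.40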

  1. The National Assessment of Shoreline Change:A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the Sandy Shorelines of the California Coast

    Science.gov (United States)

    Hapke, Cheryl J.; Reid, David

    2006-01-01

    Introduction: The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector shorelines and shoreline change rates for the sandy shoreline along the California open coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along many open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard, repeatable methods for mapping and analyzing shoreline movement so that periodic, systematic, and internally consistent updates of shorelines and shoreline change rates can be made at a national scale. This data compilation for open-ocean, sandy shorelines of the California coast is one in a series that already includes the Gulf of Mexico and the Southeast Atlantic Coast (Morton et al., 2004; Morton et al., 2005) and will eventually cover Washington, Oregon, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are determined by comparing the positions of three historical shorelines digitized from maps with a modern shoreline derived from LIDAR (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1850s-1880s, 1920s-1930s, and late 1940s-1970s. The most recent shoreline is from data collected between 1997 and 2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change of the

  2. The National Assessment of Shoreline Change: a GIS compilation of vector shorelines and associated shoreline change data for the U.S. southeast Atlantic coast

    Science.gov (United States)

    Miller, Tara L.; Morton, Robert A.; Sallenger, Asbury H.

    2006-01-01

    The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive database of digital vector shorelines and shoreline change rates for the U.S. Southeast Atlantic Coast (Florida, Georgia, South Carolina, North Carolina). These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along most open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard repeatable methods for mapping and analyzing shoreline movement so that periodic updates of shorelines and shoreline change rates can be made nationally that are systematic and internally consistent. This data compilation for open-ocean, sandy shorelines of the U.S. Southeast Atlantic Coast is the second in a series that already includes the Gulf of Mexico, and will eventually include the Pacific Coast, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are based on merging three historical shorelines with a modern shoreline derived from lidar (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1800s, 1920s-1930s, and 1970s. The most recent shoreline is derived from data collected over the period of 1997-2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are simple end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change for the U.S. Southeast Atlantic Coast at http://pubs.usgs.gov/of/2005/1401/ to get additional

  3. VFC: The Vienna Fortran Compiler

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    1999-01-01

    High Performance Fortran (HPF) offers an attractive high-level language interface for programming scalable parallel architectures, providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC), a new source-to-source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influences a program's performance. This comprises data locality assertions, non-local access specifications and the possibility of reusing runtime-generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high-level data parallel language such as HPF+ a performance close to hand-written message-passing programs can be achieved even for highly irregular codes.

  4. C to VHDL compiler

    Science.gov (United States)

    Berdychowski, Piotr P.; Zabolotny, Wojciech M.

    2010-09-01

    The main goal of the C to VHDL compiler project is to make the FPGA platform more accessible for scientists and software developers. The FPGA platform offers the unique ability to configure the hardware to implement virtually any dedicated architecture, and modern devices provide a sufficient number of hardware resources to implement parallel execution platforms with complex processing units. All this makes the FPGA platform very attractive for those looking for an efficient, heterogeneous computing environment. The current industry standard in the development of digital systems on the FPGA platform is based on HDLs. Although very effective and expressive in the hands of hardware development specialists, these languages require specific knowledge and experience, unreachable for most scientists and software programmers. The C to VHDL compiler project attempts to remedy that by creating an application that derives an initial VHDL description of a digital system (for further compilation and synthesis) from a purely algorithmic description in the C programming language. This idea itself is not new, and the C to VHDL compiler combines the best approaches from existing solutions developed over many previous years, with the introduction of some new unique improvements.

  5. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    Business files compilation for an enterprise is a distillation and recreation of its spiritual wealth, from which applicable information can be made available to those who want to use it in a fast, extensive and precise way. Proceeding from the effects of business files compilation on scientific research, production, construction and development, this paper discusses in five points how to define topics, analyze historical materials, search for and select data, and process them into an enterprise archives collection. Firstly, it expounds the importance and necessity of business files compilation in the production, operation and development of a company; secondly, it presents processing methods from topic definition, material searching and data selection to final examination and correction; thirdly, it defines principles and classifications so that different categories and levels of processing methods are available for business files compilation; fourthly, it discusses the specific method of implementing a file compilation through a documentation collection, on the principle that topic definition is geared to demand; fifthly, it addresses the application of information technology to business files compilation in view of the wide-ranging need for business files, so as to improve enterprise archives management. The present discussion focuses on the examination and correction principles of enterprise historical material compilation and on the basic classifications as well as the major forms of business files compilation achievements. (author)

  6. A compilation of U.S. Geological Survey pesticide concentration data for water and sediment in the Sacramento–San Joaquin Delta region: 1990–2010

    Science.gov (United States)

    Orlando, James L.

    2013-01-01

    Beginning around 2000, abundance indices of four pelagic fishes (delta smelt, striped bass, longfin smelt, and threadfin shad) within the San Francisco Bay and Sacramento–San Joaquin Delta began to decline sharply (Sommer and others, 2007). These declines collectively became known as the pelagic organism decline (POD). No single cause has been linked to this decline, and current theories suggest that combinations of multiple stressors are likely to blame. Contaminants (including current-use pesticides) are one potential stressor being investigated for its role in the POD (Anderson, 2007). Pesticide concentration data collected by the U.S. Geological Survey (USGS) at multiple sites in the delta region over the past two decades are critical to understanding the potential effects of current-use pesticides on species of concern as well as the overall health of the delta ecosystem. In April 2010, a compilation of contaminant data for the delta region was published by the State Water Resources Control Board (Johnson and others, 2010). Pesticide occurrence was the major focus of this report, which concluded that “there was insufficient high quality data available to make conclusions about the potential role of specific contaminants in the POD.” The report cited multiple sources; however, data collected by the USGS were not included in the publication even though these data met all criteria listed for inclusion in the report. What follows is a summary of publicly available USGS data for pesticide concentrations in surface water and sediments within the Sacramento–San Joaquin Delta region from the years 1990 through 2010. Data were retrieved through the USGS National Water Information System (NWIS) database, a publicly available online data repository (U.S. Geological Survey, 1998), and from published USGS reports (also available online at http://pubs.er.usgs.gov/). The majority of the data were collected in support of two long-term USGS monitoring programs

  7. Using Multi-Disciplinary Data to Compile a Hydrocarbon Budget for GC600, a Natural Seep in the Gulf of Mexico

    Science.gov (United States)

    MacDonald, I. R.; Johansen, C.; Marty, E.; Natter, M.; Silva, M.; Hill, J. C.; Viso, R. F.; Lobodin, V.; Diercks, A. R.; Woolsey, M.; Macelloni, L.; Shedd, W. W.; Joye, S. B.; Abrams, M.

    2016-12-01

    Fluid exchange between the deep subsurface and the overlying ocean and atmosphere occurs at hydrocarbon seeps along continental margins. Seeps are key features that alter the seafloor morphology and geochemically affect the sediments that support chemosynthetic communities. However, the dynamics and discharge rates of hydrocarbons at cold seeps remain largely unconstrained. Here we merge complementary geochemical (oil fingerprinting), geophysical (seismic, subbottom, backscatter, multibeam) and video/imaging (Video Time Lapse Camera, DSV ALVIN video) data sets to constrain pathways and magnitudes of hydrocarbon fluxes from the source rock to the seafloor at a well-studied, prolific seep site in the Northern Gulf of Mexico (GC600). Oil fingerprinting showed compositional similarities for samples from the following collections: the reservoir, an active vent, and the sea-surface. This was consistent with reservoir structures and pathways identified in seismic data. Video data, which showed the spatial distribution of seep indicators such as bacteria mats or hydrate outcrops at the sediment interface, were combined with known hydrocarbon fluxes from the literature and used to quantify the total hydrocarbon fluxes in the seep domain. Using a systems approach, we combined data sets and published values at various scales and resolutions to compile a preliminary hydrocarbon budget for the GC600 seep site. Total estimated in-flow of hydrocarbons was 2.07 × 10⁹ mol/yr. The combined total of out-flow and sequestration amounted to 7.56 × 10⁶ mol/yr, leaving a potential excess (in-flow - out-flow) of 2.06 × 10⁹ mol/yr. Thus, quantification of the potential out-flow from the seep domains based on observable processes does not equilibrate with the theoretical inputs from the reservoir. Processes that might balance this budget include accumulation of gas hydrate and sediment free-gas, as well as greater efficiency of biological sinks.
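
    The budget arithmetic quoted above is a straightforward balance of the estimated in-flow against the observed out-flow and sequestration. A trivial check of the numbers, for illustration only:

    # Reproducing the budget arithmetic quoted above (all values in mol/yr).
    inflow = 2.07e9                        # estimated in-flow from the reservoir
    outflow_plus_sequestration = 7.56e6    # combined observed out-flow and sequestration
    excess = inflow - outflow_plus_sequestration
    print(f"potential excess: {excess:.3g} mol/yr")   # ~2.06e9 mol/yr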

  8. Migration chemistry and behaviour of iodine relevant to geological disposal of radioactive wastes. A literature review with a compilation of sorption data

    International Nuclear Information System (INIS)

    Liu, Y.; Gunten, H.R. von

    1988-09-01

    This report reviews the literature on iodine migration, chemistry and behaviour in the environment up to November 1987. It deals mainly with ¹²⁹I released from a land repository, with particular consideration of the Swiss scenario for the disposal of low- and medium-level radioactive waste. As a background to this review, the basic properties of radioiodine, its distribution, circulation in nature and radiological impact are presented. A large number of sorption and diffusion data for iodine on rocks, sediments, minerals, cements and other materials have been compiled from many different laboratories. Based on this information, an assessment of the sorption and retardation of radioiodine in geomedia is made and methodologies for obtaining sorption distribution ratios (R_D values) are discussed. The review also covers natural analogue studies of ¹²⁹I, retardation of iodine by cement barriers and the possible influences of organic compounds and microorganisms on the behaviour of iodine. Some possibilities for further research on diffusion measurements and near-field chemistry of radioiodine are outlined. (author) 259 refs., 9 figs., 32 tabs

  9. Compilation of results 1987

    International Nuclear Information System (INIS)

    1987-01-01

    A compilation is carried out which in concentrated form presents reports on research and development within the nuclear energy field covering a two and a half years period. The foregoing report was edited in December 1984. The projects are presendted with title, project number, responsible unit, person to contact and short result reports. The result reports consist of short summaries over each project. (L.F.)

  10. Neutron Elastic Scattering Cross Sections Experimental Data and Optical Model Cross Section Calculations. A Compilation of Neutron Data from the Studsvik Neutron Physics Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Holmqvist, B; Wiedling, T

    1969-06-15

    Neutron elastic scattering cross section measurements have been going on for a long period at the Studsvik Van de Graaff laboratory. The cross sections of a range of elements have been investigated in the energy interval 1.5 to 8 MeV. The experimental data have been compared with cross sections calculated with the optical model when using a local nuclear potential.

  11. A compilation of empirical data and variations in data concerning radiocesium in water, sediments and fish in European lakes after Chernobyl

    International Nuclear Information System (INIS)

    Hakanson, L.

    1999-01-01

    This work concerns the variability of radiocesium within lakes. The focus is on a broad set of data concerning radiocesium after the Chernobyl accident in lake water, sediments and different species of fish. Data are available to the author from three European data bases. Basic questions are: Are there any general patterns to be found concerning the variability of ¹³⁷Cs in lakes? Is it possible to give any recommendations concerning CV values (coefficient of variation; CV=SD/MV; SD=standard deviation, MV=mean value) for radiocesium in lake water, sediments and different species of fish? The analysis can be summarised as: 1. The CV values for lake water vary around 0.3 and are rather independent of the time after fallout. 2. One can use a CV of 0.6 as a reference for the typical uncertainty in radiocesium concentration in surficial lake sediments. CV values are generally higher (up to CV=1) for bottom areas dominated by erosion and transport processes (for fine material following Stokes' law) and lower (CV approx. 0.2) for areas dominated by continuous sedimentation and fine deposits (accumulation areas). CV values for radiocesium in sediments are also likely to increase with contamination and the size of the lake. 3. A reference CV of 0.22 would be a reasonable general CV value for lake fish. CV values are typically larger just after fallout (CV approx. 0.3) and decrease with time after fallout (to about 0.15). CV values are likely to increase with trophic level, from about 0.1 for planktivores to about 0.3 for piscivores, but these CVs are based on limited data and are quite uncertain. The benefit of general empirical CV values is evident in modelling, e.g., when empirical data are compared to modelled values. The CV values for water, sediments and fish can be used to set empirical uncertainty bands for model predictions to enable meaningful discussions about predictive success. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)
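
    The coefficient of variation used throughout the summary above is simply the standard deviation divided by the mean. A minimal sketch, with made-up ¹³⁷Cs concentrations for a single lake and fish species:

    # CV = SD / MV, as defined above. Sample values are hypothetical.
    from statistics import mean, stdev

    def cv(values):
        return stdev(values) / mean(values)

    cs137_in_fish = [420.0, 510.0, 465.0, 390.0, 480.0]   # hypothetical Bq/kg
    print(f"CV = {cv(cs137_in_fish):.2f}")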

  12. ALGOL compiler. Syntax and semantic analysis

    International Nuclear Information System (INIS)

    Tarbouriech, Robert

    1971-01-01

    In this research thesis, the author reports the development of an ALGOL compiler which performs the following main tasks: systematic scan of the source programme to recognise the different components (identifiers, reserved words, constants, separators), analysis of the source programme structure to build up its statements and arithmetic expressions, processing of symbolic names (identifiers) to associate them with the values they represent, and memory allocation for data and programme. Several issues are thus addressed: characteristics of the machine for which the compiler is developed, exact definition of the language (grammar, identifier and constant formation), the syntax processing programme that provides the compiler with the necessary elements (language vocabulary, precedence matrix), and a description of the first two phases of compilation: lexical analysis and syntax analysis. The last phase (machine-code generation) is not addressed

  13. Advanced compiler design and implementation

    CERN Document Server

    Muchnick, Steven S

    1997-01-01

    From the Foreword by Susan L. Graham: This book takes on the challenges of contemporary languages and architectures, and prepares the reader for the new compiling problems that will inevitably arise in the future. The definitive book on advanced compiler design. This comprehensive, up-to-date work examines advanced issues in the design and implementation of compilers for modern processors. Written for professionals and graduate students, the book guides readers in designing and implementing efficient structures for highly optimizing compilers for real-world languages. Covering advanced issues in fundamental areas of compiler design, this book discusses a wide array of possible code optimizations, determining the relative importance of optimizations, and selecting the most effective methods of implementation. * Lays the foundation for understanding the major issues of advanced compiler design * Treats optimization in-depth * Uses four case studies of commercial compiling suites to illustrate different approache...

  14. Use of innovative projects in the Agenda process: heating - ventilation - sanitary. Compilation of selected projects from the database of the Deutsche Bundesstiftung Umwelt (CD-ROM of May 2000) as well as from further consulted sources.; Nutzung innovativer Projekte im Agenda Prozess: Heizung - Lueftung - Sanitaer. Zusammenstellung ausgewaehlter Projekte aus der Datenbank der Deutschen Bundesstiftung Umwelt. CD-ROM vom Mai 2000 sowie weiterer hinzugezogene Quellen

    Energy Technology Data Exchange (ETDEWEB)

    Schefe, W. (comp.)

    2001-05-01

    Compilation of selected projects from the database of the Deutsche Bundesstiftung Umwelt. (orig.) [German] Zusammenstellung ausgewaehlter Projekte aus der Datenbank der Deutschen Bundesstiftung Umwelt. (orig.)

  15. Compilation of cross-sections. Pt. 4

    International Nuclear Information System (INIS)

    Alekhin, S.I.; Ezhela, V.V.; Lugovsky, S.B.; Tolstenkov, A.N.; Yushchenko, O.P.; Baldini, A.; Cobal, M.; Flaminio, V.; Capiluppi, P.; Giacomelli, G.; Mandrioli, G.; Rossi, A.M.; Serra, P.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1987-01-01

    This is the fourth volume in our series of data compilations on integrated cross-sections for weak, electromagnetic, and strong interaction processes. This volume covers data on reactions induced by photons, neutrinos, hyperons, and long-lived neutral kaons (K⁰L). It contains all data published up to June 1986. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  16. Compiler issues associated with safety-related software

    International Nuclear Information System (INIS)

    Feinauer, L.R.

    1991-01-01

    A critical issue in the quality assurance of safety-related software is the ability of the software to produce identical results, independent of the host machine, operating system, or compiler version under which the software is installed. A study is performed using the VIPRE-01, FREY-01, and RETRAN-02 safety-related codes. Results from an IBM 3083 computer are compared with results from a CYBER 860 computer. All three of the computer programs examined are written in FORTRAN; the VIPRE code uses the FORTRAN 66 compiler, whereas the FREY and RETRAN codes use the FORTRAN 77 compiler. Various compiler options are studied to determine their effect on the output between machines. Since the Control Data Corporation and IBM machines inherently represent numerical data differently, methods of producing equivalent accuracy of data representation were an important focus of the study. This paper identifies particular problems in the automatic double-precision option (AUTODBL) of the IBM FORTRAN 1.4.x series of compilers. The IBM FORTRAN version 2 compilers provide much more stable, reliable compilation for engineering software. Careful selection of compilers and compiler options can help guarantee identical results between different machines. To ensure reproducibility of results, the same compiler and compiler options should be used to install the program as were used in the development and testing of the program

  17. Compiler Feedback using Continuous Dynamic Compilation during Development

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Probst, Christian W.

    2014-01-01

    to optimization. This tool can help programmers understand what the optimizing compiler has done and suggest automatic source code changes in cases where the compiler refrains from optimizing. We have integrated our tool into an integrated development environment, interactively giving feedback as part...

  18. Compilation of cross-sections. Pt. 1

    International Nuclear Information System (INIS)

    Flaminio, V.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1983-01-01

    A compilation of integral cross-sections for hadronic reactions is presented. This is an updated version of CERN/HERA 79-1, 79-2, 79-3. It contains all data published up to the beginning of 1982, but some more recent data have also been included. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  19. Compilation of current high energy physics experiments

    International Nuclear Information System (INIS)

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche

  20. Power and transmission rate orders and related documents. Office of Power Marketing Coordination, data compiled January 1, 1980-December 31, 1981

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-08-01

    This publication contains the power and transmission rate orders and related documents issued by the Department of Energy. It covers calendar years 1980 and 1981. The first publication, DOE/CE-007 covering the period from March through December 1979, was published July 1981. This publication is a compilation of all rate orders issued by the Assistant Secretary for Resource Applications and the Assistant Secretary for Conservation and Renewable Energy during calendar years 1980 and 1981 under Delegation Order No. 0204-33. It also includes all final approvals, remands, and disapprovals by the FERC, and a petition to the FERC for reconsideration by a Power Marketing Administration during 1980 and 1981. Also included are two delegation orders along with an amendment and a supplement to one delegation order, a departmental order on financial reporting, and Power and Transmission Rate Adjustment Procedures relating to federal power marketing.

  1. Compiling a 50-year journey

    DEFF Research Database (Denmark)

    Hutton, Graham; Bahr, Patrick

    2017-01-01

    Fifty years ago, John McCarthy and James Painter published the first paper on compiler verification, in which they showed how to formally prove the correctness of a compiler that translates arithmetic expressions into code for a register-based machine. In this article, we revisit this example...

  2. Compiling Data from Geological, Mineralogical and Geophysical (IP/RS Studies on Mahour Deposit, Northwest of Deh-salm, Lut Block

    Directory of Open Access Journals (Sweden)

    Arash Gorabjeiri Puor

    2015-10-01

    Introduction: The Mahour exploration area is a polymetallic system containing copper, zinc and silver. The mineralization occurs in two forms, veins and disseminations. This area is structurally within the Lut block, west of Deh-salm Village. Recent exploration work and studies carried out by geologists on this volcanic-plutonic area of Lut demonstrate its importance, indicating new reserves of copper, gold, lead and zinc. Several articles have been published on the Mahour deposit in recent years, including work on fluid inclusions (Mirzaei et al., 2012a; Mirzaei et al., 2012b). The present report aims at the completion of previous studies on Mahour. During the course of this research, the IP/RS geophysical methods were used to locate the extent and depth of sulfide veins in order to select drill sites. The IP/RS method has been used extensively worldwide in locating sulfide mineralization at deposits such as Olympic Dam in Australia (Esdale et al., 1987), the Hishikari epithermal gold deposit in Kagoshima, Japan (Okada, 1995) and the Cadia-Ridgeway copper and gold deposit in New South Wales, Australia (Rutley et al., 2001). Materials and Methods: 1. Determination of the mineralogy of ore and alteration by examination of 70 thin sections and 45 polished sections. 2. Compilation of geological and mineralization maps of the studied area at a scale of 1:1000. 3. Geological, alteration, mineralization and trace element geochemical studies of 6 drill holes. 4. IP/RS measurements for 2585 points on a rectangular grid with profile intervals of 50 meters and electrode intervals of 20 meters. 5. Interpretation of IP/RS results. Discussion: The Mahour area is covered by a volcanic sequence of basalt, andesite, dacite, rhyolite and pyroclastics. During the Late Eocene through Early Oligocene this volcanic complex was intruded by several diorite and quartz-diorite bodies, which were responsible for the mineralization of the area. Mineralized veins hosted by dacite show NNE

  3. A Note on Compiling Fortran

    Energy Technology Data Exchange (ETDEWEB)

    Busby, L. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-01

    Fortran modules tend to serialize compilation of large Fortran projects by introducing dependencies among the source files. If file A depends on file B (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran and IBM xlf, at least) offer an option to "verify syntax", with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: The first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8× speedup in the case study below.
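
    A minimal sketch of the two-pass idea, driving the compiler from Python rather than from a build system. It assumes gfortran, whose -fsyntax-only option also writes the .mod files as the note describes; the file names are hypothetical, and pass 1 must still run in dependency order.

    # Two-pass build sketch (assumes gfortran; file names are hypothetical).
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    sources = ["b.f90", "a.f90"]   # a.f90 uses a module defined in b.f90

    # Pass 1 (serial, fast): check syntax and emit .mod files, in dependency order.
    for src in sources:
        subprocess.run(["gfortran", "-fsyntax-only", src], check=True)

    # Pass 2 (parallel): generate object code now that every module file exists.
    def compile_object(src):
        subprocess.run(["gfortran", "-c", "-O2", src], check=True)

    with ThreadPoolExecutor() as pool:
        list(pool.map(compile_object, sources))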

  4. FY 1998 annual report on the compilation of the data related to new energy technology development. Photovoltaic power generation; 1998 nendo shin energy gijutsu kaihatsu kankei data shu sakusei chosa hokokusho. Taiyoko hatsuden

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    The data, centered by those related to generation of photovoltaic (PV) power as one branch of new energy, are collected and systematically compiled under the following fields. (Significance of adopting PV power systems) describes, e.g., CO2 reduction effect and energy pay-back time for different customers, i.e., nation, local governments, industries and individual households. (Present status of solar cell markets) describes solar cell production by region, cell technology and industry; shipments by application; production values; and prices. (PV system policies overseas) compares the policies of the industrialized countries for PV power systems with those of Japan. (Introduction of PV power systems overseas) compares situations of various countries in PV power system introduction with those of Japan with respect to estimated quantities of PV systems installed and target quantities. (Financial supports for PV power system installation) describes subsidies, tax benefits and loans adopted in Japan. Other items covered herein include transition of PV-related budgets, flow of PV system introduction, measures taken by central and local governments, and contacts for PV-related enterprises. (NEDO)

  5. Algorithmic synthesis using Python compiler

    Science.gov (United States)

    Cieszewski, Radoslaw; Romaniuk, Ryszard; Pozniak, Krzysztof; Linczuk, Maciej

    2015-09-01

    This paper presents a Python-to-VHDL compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and translates it into VHDL. An FPGA combines many benefits of both software and ASIC implementations. Like software, the programmed circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. This can be achieved by using many computational resources at the same time. Creating parallel programs implemented in FPGAs in pure HDL is difficult and time consuming. By using a higher level of abstraction and a high-level synthesis compiler, implementation time can be reduced. The compiler has been implemented using the Python language. This article describes the design, implementation and results of the created tools.

  6. A compiler for variational forms

    OpenAIRE

    Kirby, Robert C.; Logg, Anders

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in s...

  7. Arctic industrial activities compilation

    International Nuclear Information System (INIS)

    1991-01-01

    Most industrial activities in the Beaufort Sea region are directly or indirectly associated with the search for oil and gas. Activities in marine areas include dredging, drilling, seismic and sounding surveys, island/camp maintenance, vessel movements, helicopter and fixed-wing flights, and ice-breaking. This inventory contains a summary of chemical usage at 119 offshore drilling locations in the Beaufort Sea, Arctic Islands and Davis Strait of the Canadian Arctic between 1973 and 1987. Data are graphically displayed for evaluating patterns of drill waste discharge in the three offshore drilling areas. These displays include a comparison of data obtained from tour sheets and well history records, summaries of drilling mud chemicals used by year, well and oil company, frequency of wells drilled as a function of water depth, and offshore drilling activity by year, company, and platform. 21 refs., 104 figs., 2 tabs

  8. Compilation of colony forming unit data for Bacillus anthracis and B. atrophaeus before and after exposure to various fogging treatments using peracetic acid or hydrogen peroxide

    Data.gov (United States)

    U.S. Environmental Protection Agency — Data set contains CFU data for positive controls and test coupons for each test, for each material, and for each microorganism used. Also included are efficacy data...

  9. Automatic Parallelization An Overview of Fundamental Compiler Techniques

    CERN Document Server

    Midkiff, Samuel P

    2012-01-01

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and

  10. Distributed picture compilation demonstration

    Science.gov (United States)

    Alexander, Richard; Anderson, John; Leal, Jeff; Mullin, David; Nicholson, David; Watson, Graham

    2004-08-01

    A physical demonstration of distributed surveillance and tracking is described. The demonstration environment is an outdoor car park overlooked by a system of four rooftop cameras. The cameras extract moving objects from the scene, and these objects are tracked in a decentralized way, over a real communication network, using the information form of the standard Kalman filter. Each node therefore has timely access to the complete global picture and because there is no single point of failure in the system, it is robust. The demonstration system and its main components are described here, with an emphasis on some of the lessons we have learned as a result of applying a corpus of distributed data fusion theory and algorithms in practice. Initial results are presented and future plans to scale up the network are also outlined.

  11. Documentation of methods and inventory of irrigation data collected for the 2000 and 2005 U.S. Geological Survey Estimated use of water in the United States, comparison of USGS-compiled irrigation data to other sources, and recommendations for future compilations

    Science.gov (United States)

    Dickens, Jade M.; Forbes, Brandon T.; Cobean, Dylan S.; Tadayon, Saeid

    2011-01-01

    Every five years since 1950, the U.S. Geological Survey (USGS) National Water Use Information Program (NWUIP) has compiled water-use information in the United States and published a circular report titled "Estimated use of water in the United States," which includes estimates of water withdrawals by State, sources of water withdrawals (groundwater or surface water), and water-use category (irrigation, public supply, industrial, thermoelectric, and so forth). This report discusses the impact of important considerations when estimating irrigated acreage and irrigation withdrawals, including estimates of conveyance loss, irrigation-system efficiencies, pasture, horticulture, golf courses, and double cropping.

  12. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables.Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees.With the proliferation of open source, understanding these issues is increasingly the res

  13. Compilation, quality control, analysis, and summary of discrete suspended-sediment and ancillary data in the United States, 1901-2010

    Science.gov (United States)

    Lee, Casey J.; Glysson, G. Douglas

    2013-01-01

    Human-induced and natural changes to the transport of sediment and sediment-associated constituents can degrade aquatic ecosystems and limit human uses of streams and rivers. The lack of a dedicated, easily accessible, quality-controlled database of sediment and ancillary data has made it difficult to identify sediment-related water-quality impairments and has limited understanding of how human actions affect suspended-sediment concentrations and transport. The purpose of this report is to describe the creation of a quality-controlled U.S. Geological Survey suspended-sediment database, provide guidance for its use, and summarize characteristics of suspended-sediment data through 2010. The database is provided as an online application at http://cida.usgs.gov/sediment to allow users to view, filter, and retrieve available suspended-sediment and ancillary data. A data recovery, filtration, and quality-control process was performed to expand the availability, representativeness, and utility of existing suspended-sediment data collected by the U.S. Geological Survey in the United States before January 1, 2011. Information on streamflow condition, sediment grain size, and upstream landscape condition were matched to sediment data and sediment-sampling sites to place data in context with factors that may influence sediment transport. Suspended-sediment and selected ancillary data are presented from across the United States with respect to time, streamflow, and landscape condition. Examples of potential uses of this database for identifying sediment-related impairments, assessing trends, and designing new data collection activities are provided. This report and database can support local and national-level decision making, project planning, and data mining activities related to the transport of suspended-sediment and sediment-associated constituents.

  14. amamutdb.no: A relational database for MAN2B1 allelic variants that compiles genotypes, clinical phenotypes, and biochemical and structural data of mutant MAN2B1 in α-mannosidosis.

    Science.gov (United States)

    Riise Stensland, Hilde Monica Frostad; Frantzen, Gabrio; Kuokkanen, Elina; Buvang, Elisabeth Kjeldsen; Klenow, Helle Bagterp; Heikinheimo, Pirkko; Malm, Dag; Nilssen, Øivind

    2015-06-01

    α-Mannosidosis is an autosomal recessive lysosomal storage disorder caused by mutations in the MAN2B1 gene, encoding lysosomal α-mannosidase. The disorder is characterized by a range of clinical phenotypes of which the major manifestations are mental impairment, hearing impairment, skeletal changes, and immunodeficiency. Here, we report an α-mannosidosis mutation database, amamutdb.no, which has been constructed as a publicly accessible online resource for recording and analyzing MAN2B1 variants (http://amamutdb.no). Our aim has been to offer structured and relational information on MAN2B1 mutations and genotypes along with associated clinical phenotypes. Classifying missense mutations, as pathogenic or benign, is a challenge. Therefore, they have been given special attention as we have compiled all available data that relate to their biochemical, functional, and structural properties. The α-mannosidosis mutation database is comprehensive and relational in the sense that information can be retrieved and compiled across datasets; hence, it will facilitate diagnostics and increase our understanding of the clinical and molecular aspects of α-mannosidosis. We believe that the amamutdb.no structure and architecture will be applicable for the development of databases for any monogenic disorder. © 2015 WILEY PERIODICALS, INC.

  15. Compilation of ocean circulation and other data from ADCP current meters, CTD casts, tidal gauges, and other instruments from a World-Wide distribution by Oregon State University and other institutions as part of World Ocean Circulation Experiment (WOCE) and other projects from 24 November 1985 to 30 December 2000 (NODC Accession 0000649)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Compilation of ocean circulation and other data were collected from a World-Wide distribution by Oregon State University (OSU) and other institutions as part of...

  16. Integrating Emerging Data Sources into Operational Practice: Capabilities and Limitations of Devices to Collect, Compile, Save, and Share Messages from CAVs and Connected Travelers

    Science.gov (United States)

    2018-03-01

    Connected and automated vehicles (CAVs) and connected travelers will be providing substantially increased levels of data which will be available for agencies to consider using to improve the management and operation of the surface transportation syst...

  17. A listing of cosmogenic, optically-stimulated-luminescence, (U-Th)/He, and fission-track sample locations, analyses, and age data compiled as part of Marsden contract GNS002 (2000-2004) : subduction initiation beneath Fiordland, southwest New Zealand

    International Nuclear Information System (INIS)

    Sutherland, R.

    2007-01-01

    This report is an archive of data relevant to Fiordland tectonic evolution. Specifically these are the data that were collected and compiled as part of Marsden contract GNS002 (2000-2004); samples were collected between 1987 and 2002. The principal investigator of the Marsden Fund project was Rupert Sutherland (GNS Science). Additional named investigators (with responsibilities and affiliations) were: Kyeong Kim (cosmogenic isotope dating; GNS Science); Peter Kamp (fission track dating; Waikato University); Martha House (U-Th/He dating; Caltech, USA); and Michael Gurnis (numerical models of subduction initiation; Caltech, USA). After the project began, Uwe Rieser from Victoria University agreed to undertake optically-stimulated luminescence (OSL) dating as part of the project. (author). 6 refs

  18. Seismic Safety Margins Research Program, Phase I. Project II: seismic input. Compilation, assessment and expansion of the strong earthquake ground motion data base

    Energy Technology Data Exchange (ETDEWEB)

    Crouse, C B; Hileman, J A; Turner, B E; Martin, G R

    1980-04-01

    A catalog has been prepared which contains information for: (1) world-wide, ground-motion accelerograms, (2) the accelerograph sites where these records were obtained, and (3) the seismological parameters of the causative earthquakes. The catalog is limited to data for those accelerograms which have been digitized and published. In addition, the quality and completeness of these data are assessed. This catalog is unique because it is the only publication which contains comprehensive information on the recording conditions of all known digitized accelerograms. However, information for many accelerograms is missing. Although some literature may have been overlooked, most of the missing data has not been published. Nevertheless, the catalog provides a convenient reference and useful tool for earthquake engineering research and applications.

  19. 1988 Bulletin compilation and index

    International Nuclear Information System (INIS)

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information

  20. 1988 Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  1. A compilation of energy costs of physical activities.

    Science.gov (United States)

    Vaz, Mario; Karaolis, Nadine; Draper, Alizon; Shetty, Prakash

    2005-10-01

    There were two objectives: first, to review the existing data on energy costs of specified activities in the light of the recommendations made by the Joint Food and Agriculture Organization/World Health Organization/United Nations University (FAO/WHO/UNU) Expert Consultation of 1985. Second, to compile existing data on the energy costs of physical activities for an updated annexure of the current Expert Consultation on Energy and Protein Requirements. Electronic and manual search of the literature (predominantly English) to obtain published data on the energy costs of physical activities. The majority of the data prior to 1955 were obtained using an earlier compilation of Passmore and Durnin. Energy costs were expressed as physical activity ratio (PAR); the energy cost of the activity divided by either the measured or predicted basal metabolic rate (BMR). The compilation provides PARs for an expanded range of activities that include general personal activities, transport, domestic chores, occupational activities, sports and other recreational activities for men and women, separately, where available. The present compilation is largely in agreement with the 1985 compilation, for activities that are common to both compilations. The present compilation has been based on the need to provide data on adults for a wide spectrum of human activity. There are, however, lacunae in the available data for many activities, between genders, across age groups and in various physiological states.
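
    The physical activity ratio used in the compilation is the energy cost of an activity divided by the basal metabolic rate, so values can be reconstructed or checked with one line of arithmetic. The values below are hypothetical and are not taken from the annexure.

    # PAR = energy cost of activity / BMR, as defined above. Values are hypothetical.
    def par(activity_kj_per_min, bmr_kj_per_min):
        return activity_kj_per_min / bmr_kj_per_min

    print(f"PAR (walking) = {par(16.8, 4.8):.1f}")   # hypothetical subject -> 3.5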

  2. Hydrothermal alteration maps of the central and southern Basin and Range province of the United States compiled from Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data

    Science.gov (United States)

    Mars, John L.

    2013-01-01

    Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data and Interactive Data Language (IDL) logical operator algorithms were used to map hydrothermally altered rocks in the central and southern parts of the Basin and Range province of the United States. The hydrothermally altered rocks mapped in this study include (1) hydrothermal silica-rich rocks (hydrous quartz, chalcedony, opal, and amorphous silica), (2) propylitic rocks (calcite-dolomite and epidote-chlorite mapped as separate mineral groups), (3) argillic rocks (alunite-pyrophyllite-kaolinite), and (4) phyllic rocks (sericite-muscovite). A series of hydrothermal alteration maps, which identify the potential locations of hydrothermal silica-rich, propylitic, argillic, and phyllic rocks on Landsat Thematic Mapper (TM) band 7 orthorectified images, and geographic information systems shape files of hydrothermal alteration units are provided in this study.
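
    The mapping described above applies logical (threshold) tests to combinations of ASTER bands. The sketch below shows the general shape of such a test in Python/numpy rather than IDL; the band numbers, ratios, and thresholds are placeholders, not the published algorithms.

    # Placeholder band-ratio logical test; not the published IDL algorithms.
    import numpy as np

    def alteration_mask(b4, b6, b7, ratio1_min=1.3, ratio2_min=1.1):
        """True where two placeholder band ratios both exceed their thresholds."""
        r1 = np.divide(b4, b6, out=np.zeros_like(b4), where=b6 != 0)
        r2 = np.divide(b4, b7, out=np.zeros_like(b4), where=b7 != 0)
        return (r1 > ratio1_min) & (r2 > ratio2_min)

    # Hypothetical 2x2 reflectance arrays for three ASTER SWIR bands.
    b4 = np.array([[0.30, 0.22], [0.28, 0.10]])
    b6 = np.array([[0.20, 0.21], [0.19, 0.11]])
    b7 = np.array([[0.25, 0.19], [0.22, 0.10]])
    print(alteration_mask(b4, b6, b7))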

  3. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 7, December 2005 (last updated 2005.12.15)

    International Nuclear Information System (INIS)

    2005-12-01

    This Profiles report is based on data collected using the NEWMDB from May to October 2005. The report was first published on line within the NEWMDB December 2005. Starting with NEWMDB version II (introduced January 2004) individual reports can be updated after a Profiles report is published. Please refer to the Profiles bookmark; the page that is accessed via this bookmark lists revisions to individual Profiles (if there are any)

  4. Compilation of three-dimensional coordinates and specific data of the instrumentation of the prestressed concrete pressure vessel/high temperature helium test rig

    International Nuclear Information System (INIS)

    Klausinger, D.

    1977-04-01

    The positions of the thermocouples, strain gauges of various types, and Gloetzl instruments installed by SGAE in the model vessel of the Common Project Prestressed Concrete Pressure Vessel/High Temperature Helium Test Rig are defined in cylindrical coordinates. The specific data of the instruments are given, such as the configuration of multiple instruments; the type, group and number of each instrument; cable and channel numbers; calibration factors; and the resistances of instruments and cables. (author)

  5. SU-E-T-256: Radiation Dose Responses for Chemoradiation Therapy of Pancreatic Cancer: An Analysis of Compiled Clinical Data Using Biophysical Models.

    Science.gov (United States)

    Moraru, I; Tai, A; Erickson, B; Li, X

    2012-06-01

    We have analyzed recent clinical data obtained from chemoradiation of unresectable, locally advanced pancreatic cancer in order to examine possible benefits from radiotherapy (RT) dose escalation as well as to propose possible dose-escalated fractionation schemes. A modified linear quadratic (LQ) model was used to fit clinical tumor response data from chemoradiation treatments using different fractionations. Biophysical radiosensitivity parameters, α and α/β, tumor potential doubling time, Td, and delay time for tumor doubling during treatment, Tk, were extracted from the fits and were used to calculate feasible fractionation schemes for dose escalations. Examination of published data from 20 institutions showed no clear indication of improved survival with raised radiation dose. However, an enhancement in tumor response was observed for higher irradiation doses, an important and promising clinical result with respect to palliation and quality of life. The radiobiological parameter estimates obtained from the analysis are: α/β = 10 ± 3 Gy, α = 0.010 ± 0.003 Gy⁻¹, Td = 56 ± 5 days and Tk = 7 ± 2 days. Possible dose escalation schemes are proposed based on the calculation of the biologically equivalent dose (BED) required for a 50% tumor response rate. From the point of view of tumor response, escalation of the administered radiation dose leads to a potential clinical benefit, which when combined with normal tissue complication analyses may result in improved treatments for certain patients with advanced pancreatic cancer. Based on this analysis, a dose escalation trial with 2.25 Gy/fraction up to 69.75 Gy is being initiated for unresectable pancreatic cancer at our institution. Partially supported by MCW Cancer Center Meinerz Foundation. © 2012 American Association of Physicists in Medicine.
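
    For orientation, a common time-corrected form of the BED is BED = n·d·(1 + d/(α/β)) − ln(2)·(T − Tk)/(α·Td). The sketch below plugs the fitted parameters quoted above into that form for the proposed 2.25 Gy × 31 fraction scheme; the formula and the assumed overall treatment time are illustrative and may differ from the authors' modified LQ model.

    # One common time-corrected BED formula, evaluated with the fitted parameters above.
    # The overall treatment time T (~43 days for 31 fractions) is an assumption.
    import math

    def bed(n, d, alpha_beta, alpha, Td, Tk, T):
        repopulation = math.log(2) * max(T - Tk, 0.0) / (alpha * Td)
        return n * d * (1.0 + d / alpha_beta) - repopulation

    print(f"BED ~ {bed(n=31, d=2.25, alpha_beta=10.0, alpha=0.010, Td=56.0, Tk=7.0, T=43.0):.1f} Gy")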

  6. HAL/S-FC compiler system specifications

    Science.gov (United States)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  7. Compilation and preliminary interpretation of hydrologic data for the Weldon Spring radioactive waste-disposal sites, St Charles County, Missouri; a progress report

    Science.gov (United States)

    Kleeschulte, M.J.; Emmett, L.F.

    1986-01-01

    The Weldon Spring Chemical Plant is located just north of the drainage divide separating the Mississippi River and the Missouri River in St. Charles County, Missouri. From 1957 to 1966 the plant converted uranium-ore concentrates and recycled scrap to pure uranium trioxide, uranium tetrafluoride, and uranium metal. Residues from these operations were pumped to four large pits that had been excavated near the plant. Small springs and losing streams are present in the area. Water overlying the residue in the pits has a large concentration of dissolved solids and a different chemical composition compared to the native groundwater and surface water. This difference is indicated by the concentrations of calcium, sodium, sulfate, nitrate, fluoride, uranium, radium, lithium, molybdenum, strontium, and vanadium, all of which are greater than natural or background concentrations. Water from Burgermeister Spring, located about 1.5 miles north of the chemical plant area, contains uranium and nitrate concentrations greater than background concentrations. Groundwater in the shallow bedrock aquifer moves northward from the vicinity of the chemical plant toward Dardenne Creek. An abandoned limestone quarry several miles southwest of the chemical plant also has been used for the disposal of radioactive waste and rubble. Groundwater flow from the quarry area is southward through the alluvium, away from the quarry and toward the Missouri River. The St. Charles County well field is located in the Missouri River flood plain near the quarry and the large yield wells are open to the Missouri River alluvial aquifer. Water from a well 4,000 ft southeast of the quarry was analyzed; there was no indication of contamination from the quarry. Additional water quality and water level data are needed to determine if water from the quarry moves toward the well field. Observation wells need to be installed in the area between the chemical plant, pits, and Dardenne Creek. The wells would be used to

  8. Digital compilation bedrock geologic map of the Warren quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-4A Walsh, GJ, Haydock, S, Prewitt, J, Kraus, J, Lapp, E, O'Loughlin, S, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the...

  9. Digital compilation bedrock geologic map of the Milton quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-8A Dorsey, R, Doolan, B, Agnew, PC, Carter, CM, Rosencrantz, EJ, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the Milton...

  10. Digital compilation bedrock geologic map of the Lincoln quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-5A Stanley, R, DelloRusso, V, Haydock, S, Lapp, E, O'Loughlin, S, Prewitt, J,and Tauvers, PR, 1995, Digital compilation bedrock geologic map...

  11. NanoData Landscape Compilation. Construction

    NARCIS (Netherlands)

    Allan, J.E.M.; Bakker, B.J.; Buist, H.E.; Flament, G.; Hartmann, C.; Jawad, I.; Kuijpers, L.T.; Kuittinen, H.; Noyons, E.; Stolwijk, C.C.M.; Olaeta, X.U.; Yegros, A.

    2017-01-01

    This report offers a snapshot of the status of the environment for nanotechnology in the context of construction. The construction industry covers the building, maintaining and repairing of buildings and infrastructures for living, working and transport, including providing materials for those

  12. Compilation of requests for nuclear data. Addendum

    International Nuclear Information System (INIS)

    1984-09-01

    Each request is assigned a unique identifying number. The first two digits of this number give the year the request was initiated. All requests for a given Isotope and Quantity are grouped (or blocked) together. The requests in a block are followed by any status comments. Each request has a unique Isotope, Quantity and Requester. The requester is identified by laboratory, last name, and sponsoring US government agency, e.g., BET, Steen, DNR. All requests must give the energy (or range of energy) for the incident particle when appropriate. The accuracy needed, in percent, is also given. The error quoted is assumed to be 1-sigma at each measured point in the energy range requested unless a comment specifies otherwise. Sometimes a range of accuracy indicated by two values is given, or a statement is made in the free-text comments. An incident energy resolution in percent is sometimes given.

  13. Notes on Compiling a Corpus- Based Dictionary

    Directory of Open Access Journals (Sweden)

    František Čermák

    2011-10-01

    ABSTRACT: On the basis of sample analysis of a Czech adjective, a definition based on the data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing at the drawbacks of definitions found in traditional dictionaries. Steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics, briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. These are supplemented by additional remarks and caveats useful in the compilation of a dictionary. Thus, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified form.

  14. Compilation of accident statistics in PSE

    International Nuclear Information System (INIS)

    Jobst, C.

    1983-04-01

    The objective of the investigations on transportation carried out within the framework of the 'Project - Studies on Safety in Waste Management (PSE II)' is to determine the risk of accidents in the transportation of radioactive materials by rail. Fault tree analysis is used to determine risks in the transportation system. This method makes it possible to determine the frequency and consequences of accidents which could lead to an unintended release of radionuclides. The study presented compiles all data obtained from the accident statistics of the Federal German Railways. (orig./RB) [de

  15. abc: An extensible AspectJ compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its frontend is built, using the Polyglot framework, as a modular extension of the Java...... language. The use of Polyglot gives flexibility of syntax and type checking. The backend is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general overview...

  16. PIG 3 - A simple compiler for mercury

    International Nuclear Information System (INIS)

    Bindon, D.C.

    1961-06-01

    A short machine language compilation scheme is described, which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  17. PIG 3 - A simple compiler for mercury

    Energy Technology Data Exchange (ETDEWEB)

    Bindon, D C [Computer Branch, Technical Assessments and Services Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1961-06-15

    A short machine language compilation scheme is described, which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  18. Compilation of anatomical, physiological and metabolic characteristics for a Reference Asian Man. Volume 1: data summary and conclusions. Results of a co-ordinated research programme 1988-1993

    International Nuclear Information System (INIS)

    1998-02-01

    The Co-ordinated Research Programme (CRP) on Compilation of Anatomical, Physiological and Metabolic Characteristics for a Reference Asian Man has been conducted as a programme of the IAEA Regional Co-operative Agreement (RCA) for Asia and the Pacific. The CRP was conducted to provide data for radiation protection purposes that is relevant to the biokinetic and dosimetric characteristics of the ethnic populations in the Asian region. The radiological protection decisions that had to be made in the RCA member States following the Chernobyl accident were a significant motivation for establishing the CRP. Eleven RCA Member States participated in the CRP. Research co-ordination meetings (RCMs) for the CRP were held in Mito City, Japan, 17-21 October 1988 and Bhabha Atomic Research Centre, India, 8-12 April 1991. The concluding meeting was held in Tianjin, China, 25-29 October 1993. This publication is divided into two volumes: Volume 1 contains a summary of the data and conclusions from the project and Volume 2 the reports from participating countries

  19. Compilation of anatomical, physiological and metabolic characteristics for a Reference Asian Man. Volume 1: data summary and conclusions. Results of a co-ordinated research programme 1988-1993

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-02-01

    The Co-ordinated Research Programme (CRP) on Compilation of Anatomical, Physiological and Metabolic Characteristics for a Reference Asian Man has been conducted as a programme of the IAEA Regional Co-operative Agreement (RCA) for Asia and the Pacific. The CRP was conducted to provide data for radiation protection purposes that is relevant to the biokinetic and dosimetric characteristics of the ethnic populations in the Asian region. The radiological protection decisions that had to be made in the RCA member States following the Chernobyl accident were a significant motivation for establishing the CRP. Eleven RCA Member States participated in the CRP. Research co-ordination meetings (RCMs) for the CRP were held in Mito City, Japan, 17-21 October 1988 and Bhabha Atomic Research Centre, India, 8-12 April 1991. The concluding meeting was held in Tianjin, China, 25-29 October 1993. This publication is divided into two volumes: Volume 1 contains a summary of the data and conclusions from the project and Volume 2 the reports from participating countries. Refs, figs, tabs.

  20. Proving correctness of compilers using structured graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    it into a compiler implementation using a graph type along with a correctness proof. The implementation and correctness proof of a compiler using a tree type without explicit jumps is simple, but yields code duplication. Our method provides a convenient way of improving such a compiler without giving up the benefits...

  1. Improvements in PIE-techniques at the IFE hot-laboratory. 'Neutron radiography, three dimensional profilometry and image compilation of PIE data for visualization in an image based user-interface'

    International Nuclear Information System (INIS)

    Jenssen, H.K.; Oberlaender, B.C.

    2002-01-01

    The PIE-techniques used at IFE are continuously improved through upgrading of equipment and methods, e.g. image handling techniques and components utilized in data acquisition and editing techniques. To improve the quality or spatial resolution of neutron radiographs the normal technique was complemented with another method, i.e. the dysprosium foil/X ray film technique is supplemented with a track-etch recorder consisting of a cellulose nitrate film. For further examination of the neutron radiographs the cellulose nitrate film can be digitized to allow electronic image treatment. Promising results were obtained with this technique on neutron radiographs, namely higher spatial resolution compared to the normal technique, high contrast and sharp neutron radiography images. The traditional uniaxial profilometry of fuel rods was modified so that diameter/bow measurements are possible at several angular orientations during one acquisition sequence. This extension is very useful in several ways, for instance the built-in data symmetry of the method is used to check the correctness of the measurement results. Diameter and bow measurements give information of cladding irregularities and fuel rod profiles. Implementation of electronic image handling techniques is particularly useful in PIE when data are collected and compiled in an image file. Inspection and examination of the file contents (examination results) are possible through an ideal user-interface, i.e. Adobe Photoshop software with navigator possibilities. Examples incorporating PIE data acquired from neutron radiography, visual inspection and ceramography are utilized for illustration of the user-interface and some of its possibilities. (author)

  2. Compilation, evaluation and assessment of the existing data on the pollution load affecting the water quality of the central stretch of the river Elbe on the basis of uniform common criteria (preliminary study). Vol. 1

    International Nuclear Information System (INIS)

    Guhr, H.; Buettner, O.; Dreyer, U.; Krebs, D.; Spott, D.; Suhr, U.; Weber, E.

    1993-01-01

    The data (concentrations, pollution loads) measured along the 566 km stretch of the river Elbe on the territory of the former GDR were compiled and evaluated according to primary statistical criteria. The longitudinal section was prepared for major variables with regard to the flows Q50% and Q5%. The distribution of concentrations at the various measuring points was shown by means of box plots. Interdependencies between concentration and flow as well as water temperature were investigated and used for assessing diffuse matter input. In addition to determining the saprobic index, the biological monitoring of water pollution comprised the assessment of the macrozoobenthos in the areas of sewage discharge, chlorophyll measurements, an inventory of the fish population, analysis of pollutant accumulation in fish and in zoobenthos, as well as virus detection. The water quality of the river Elbe was evaluated in compliance with the binding E.C. guidelines and national regulations/recommendations, revealing an extreme pollution level which impairs or excludes various uses of the Elbe water. (orig.) [de

  3. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with the equational programming language (EPL). Our approach is based on program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by the source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by the compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  4. Regular expressions compiler and some applications

    International Nuclear Information System (INIS)

    Saldana A, H.

    1978-01-01

    We deal with the high-level programming language of a Regular Expressions Compiler (REC). The first chapter is an introduction in which the history of the REC development and the problems related to its numerous applications are described. The syntactic and semantic rules as well as the language features are discussed just after the introduction. Concerning the applications, as examples an adaptation is given for solving numerical problems and another for data manipulation. The last chapter is an exposition of ideas and techniques about the compiler construction. Examples of the adaptation to numerical problems show applications to education, vector analysis, quantum mechanics, physics, mathematics and other sciences. The rudiments of an operating system for a minicomputer are the examples of the adaptation to symbolic data manipulation. REC is a programming language that could be applied to solve problems in almost any human activity. Handling of computer graphics, control equipment, research on languages, microprocessors and general research are some of the fields in which this programming language can be applied and developed. (author)

  5. WHO GLOBAL TUBERCULOSIS REPORTS: COMPILATION AND INTERPRETATION

    Directory of Open Access Journals (Sweden)

    I. A. Vаsilyevа

    2017-01-01

    The purpose of the article is to inform national specialists involved in tuberculosis control about the methods used to compile WHO global tuberculosis statistics, which are used when developing strategies and programmes for tuberculosis control and evaluating their efficiency. The article explains in detail the main WHO epidemiological rates used in international publications on tuberculosis, along with data on their registered values, and the new approaches to compiling the lists of countries with the highest burden of tuberculosis, drug-resistant tuberculosis and tuberculosis with concurrent HIV infection. The article compares the rates in the Russian Federation with global data as well as data from countries within the WHO European Region and countries with the highest TB burden. It presents material on the achievement of the global goals in tuberculosis control and the main provisions of the WHO End TB Strategy for 2015-2035, adopted as part of the UN Sustainable Development Goals.

  6. Compiling the First Monolingual Lusoga Dictionary

    Directory of Open Access Journals (Sweden)

    Minah Nabirye

    2011-10-01

    Abstract: In this research article a study is made of the approach followed to compile the first-ever monolingual dictionary for Lusoga. Lusoga is a Bantu language spoken in Uganda by slightly over two million people. Being an under-resourced language, the Lusoga orthography had to be designed, a grammar written, and a corpus built, before embarking on the compilation of the dictionary. This compilation was aimed at attaining an academic degree, hence requiring a rigorous research methodology. Firstly, the prevailing methods for compiling dictionaries were mainly practical and insufficient in explaining the theoretical linguistic basis for dictionary compilation. Since dictionaries are based on meaning, the theory of meaning was used to account for all linguistic data considered in dictionaries. However, meaning is considered at a very abstract level, far removed from the process of compiling dictionaries. Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular theory explains how the different modules of a language contribute information to the different parts of the dictionary article or dictionary information in general. Secondly, the research also had to contend with the different approaches for analysing Bantu languages for Bantu and European audiences. A description of the Bantu- and European-centred approaches to Bantu studies was undertaken in respect of (a) the classification of Lusoga words, and (b) the specification of their citations. As a result, Lusoga lexicography deviates from the prevailing Bantu classification and citation of nouns, adjectives and verbs in particular. The dictionary was tested on two separate occasions and all the feedback was considered in the compilation process. This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary.

  7. Compilation and evaluation of high energy γ-ray standards from nuclear reactions. Work performed under the coordinated research project 'Update of X- and γ-ray decay data standards for detector calibration'

    International Nuclear Information System (INIS)

    Marcinkowski, A.; Marianski, B.

    1999-02-01

    The report presents the following aspects needed for the compilation and evaluation of high energy γ-ray standards from nuclear reactions: evaluation of emission probabilities of γ-rays with energies 4.44 MeV and 15.11 MeV from ¹²C*, preparation of the list of reactions suitable for production of the above-mentioned excited nucleus, and compilation and evaluation of cross sections for these reactions, including inelastic proton scattering on ¹²C and radiative capture on ¹¹B

  8. A Class-Specific Optimizing Compiler

    Directory of Open Access Journals (Sweden)

    Michael D. Sharp

    1993-01-01

    Class-specific optimizations are compiler optimizations specified by the class implementor to the compiler. They allow the compiler to take advantage of the semantics of the particular class so as to produce better code. Optimizations of interest include the strength reduction of class::array address calculations, elimination of large temporaries, and the placement of asynchronous send/recv calls so as to achieve computation/communication overlap. We will outline our progress towards the implementation of a C++ compiler capable of incorporating class-specific optimizations.
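
    As a generic illustration of the first optimization named above, strength reduction of an array address calculation replaces a per-iteration multiplication by a running addition; the sketch below is our own illustration in Python, not code from the compiler described:

        # Strength reduction, illustrated generically (not taken from the paper):
        # the address base + i*stride is computed with a multiplication per
        # iteration before the transformation, and with an addition after it.
        base, stride, n = 0x1000, 8, 5

        addrs_mul = [base + i * stride for i in range(n)]   # before: multiply each time

        addrs_add, addr = [], base                          # after: add the stride
        for _ in range(n):
            addrs_add.append(addr)
            addr += stride

        assert addrs_mul == addrs_add   # identical addresses, cheaper arithmetic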

  9. Molecular dynamics and diffusion a compilation

    CERN Document Server

    Fisher, David

    2013-01-01

    The molecular dynamics technique was developed in the 1960s as the outgrowth of attempts to model complicated systems by using either (a) direct physical simulation or (b), following the great success of Monte Carlo methods, computer techniques. Computer simulation soon won out over clumsy physical simulation, and the ever-increasing speed and sophistication of computers has naturally made molecular dynamics simulation into a more and more successful technique. One of its most popular applications is the study of diffusion, and some experts now even claim that molecular dynamics simulation is, in the case of situations involving well-characterised elements and structures, more accurate than experimental measurement. The present double volume includes a compilation (over 600 items) of predicted solid-state diffusion data, for all of the major materials groups, dating back nearly four decades. The double volume also includes some original papers: "Determination of the Activation Energy for Formation and ...

  10. Compilation of current high-energy-physics experiments

    International Nuclear Information System (INIS)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1980-04-01

    This is the third edition of a compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and ten participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about January 1980, and (2) had not completed taking of data by 1 January 1976

  11. Compilation of current high energy physics experiments - Sept. 1978

    Energy Technology Data Exchange (ETDEWEB)

    Addis, L.; Odian, A.; Row, G. M.; Ward, C. E. W.; Wanderer, P.; Armenteros, R.; Joos, P.; Groves, T. H.; Oyanagi, Y.; Arnison, G. T. J.; Antipov, Yu; Barinov, N.

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche. (RWR)

  12. A Symmetric Approach to Compilation and Decompilation

    DEFF Research Database (Denmark)

    Ager, Mads Sig; Danvy, Olivier; Goldberg, Mayer

    2002-01-01

    Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice of...

  13. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    Science.gov (United States)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique to provide rapid recovery from transient processor failures and was implemented in hardware by researchers and also in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations were conducted which indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.

  14. Charged particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, C.

    1999-01-01

    We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal reason for setting up the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates issued by Fowler and collaborators. The main goal of the NACRE network was the transparency in the procedure of calculating the rates. More specifically this compilation aims at: 1. updating the experimental and theoretical data; 2. distinctly identifying the sources of the data used in rate calculation; 3. evaluating the uncertainties and errors; 4. providing numerically integrated reaction rates; 5. providing reverse reaction rates and analytical approximations of the adopted rates. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. The compilation is concerned with the reaction rates that are large enough for the target lifetimes shorter than the age of the Universe, taken equal to 15 x 10⁹ y. The reaction rates are provided for temperatures lower than T = 10¹⁰ K. In parallel with the rate compilation a cross section data base has been created and located at the site http://pntpm.ulb.ac.be/nacre..htm. (authors)
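
    For reference (standard textbook notation, not reproduced from the compilation itself), the numerically integrated reaction rate per particle pair referred to in item 4 is the thermal average of the cross section over a Maxwell-Boltzmann energy distribution:

        N_A\langle\sigma v\rangle \;=\; N_A\left(\frac{8}{\pi\mu}\right)^{1/2}(kT)^{-3/2}\int_0^{\infty}\sigma(E)\,E\,\exp\!\left(-\frac{E}{kT}\right)\,\mathrm{d}E

    where μ is the reduced mass of the interacting pair, k the Boltzmann constant, T the temperature, σ(E) the cross section and N_A Avogadro's number.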

  15. Sharing analysis in the Pawns compiler

    Directory of Open Access Journals (Sweden)

    Lee Naish

    2015-09-01

    Pawns is a programming language under development that supports algebraic data types, polymorphism, higher order functions and “pure” declarative programming. It also supports impure imperative features including destructive update of shared data structures via pointers, allowing significantly increased efficiency for some operations. A novelty of Pawns is that all impure “effects” must be made obvious in the source code and they can be safely encapsulated in pure functions in a way that is checked by the compiler. Execution of a pure function can perform destructive updates on data structures that are local to or eventually returned from the function without risking modification of the data structures passed to the function. This paper describes the sharing analysis which allows impurity to be encapsulated. Aspects of the analysis are similar to other published work, but in addition it handles explicit pointers and destructive update, higher order functions including closures and pre- and post-conditions concerning sharing for functions.

  16. Regulatory and technical reports (Abstract Index Journal). Compilation for third quarter 1985, July-September. Volume 10, No. 3

    International Nuclear Information System (INIS)

    1985-10-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation covers the period from July through September 1985.

  17. Compilation of hydrologic data for White Sands pupfish habitat and nonhabitat areas, northern Tularosa Basin, White Sands Missile Range and Holloman Air Force Base, New Mexico, 1911-2008

    Science.gov (United States)

    Naus, C.A.; Myers, R.G.; Saleh, D.K.; Myers, N.C.

    2014-01-01

    The White Sands pupfish (Cyprinodon tularosa), listed as threatened by the State of New Mexico and as a Federal species of concern, is endemic to the Tularosa Basin, New Mexico. Because water quality can affect pupfish and the environmental conditions of their habitat, a comprehensive compilation of hydrologic data for pupfish habitat and nonhabitat areas in the northern Tularosa Basin was undertaken by the U.S. Geological Survey in cooperation with White Sands Missile Range. The four locations within the Tularosa Basin that are known pupfish habitat areas are the Salt Creek, Malpais Spring and Malpais Salt Marsh, Main Mound Spring, and Lost River habitat areas. Streamflow data from the Salt Creek near Tularosa streamflow-gaging station indicated that the average annual mean streamflow and average annual total streamflow for water years 1995–2008 were 1.35 cubic feet per second (ft3/s) and 983 acre-feet, respectively. Periods of no flow were observed in water years 2002 through 2006. Dissolved-solids concentrations in Salt Creek samples collected from 1911 through 2007 ranged from 2,290 to 66,700 milligrams per liter (mg/L). The average annual mean streamflow and average annual total streamflow at the Malpais Spring near Oscura streamflow-gaging station for water years 2003–8 were 6.81 ft3/s and 584 acre-feet, respectively. Dissolved-solids concentrations for 16 Malpais Spring samples ranged from 3,882 to 5,500 mg/L. Isotopic data for a Malpais Spring near Oscura water sample collected in 1982 indicated that the water was more than 27,900 years old. Streamflow from Main Mound Spring was estimated at 0.007 ft3/s in 1955 and 1957 and ranged from 0.02 to 0.07 ft3/s from 1996 to 2001. Dissolved-solids concentrations in samples collected between 1955 and 2007 ranged from an estimated 3,760 to 4,240 mg/L in the upper pond and 4,840 to 5,120 mg/L in the lower pond. Isotopic data for a Main Mound Spring water sample collected in 1982 indicated that the water was about

  18. Indexed compilation of experimental high energy physics literature. [Synopsis

    Energy Technology Data Exchange (ETDEWEB)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is given, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  19. Final report: Compiled MPI. Cost-Effective Exascale Application Development

    Energy Technology Data Exchange (ETDEWEB)

    Gropp, William Douglas [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-12-21

    This is the final report on Compiled MPI: Cost-Effective Exascale Application Development, and it summarizes the results of this project. The project investigated runtime environments that improve the performance of MPI (Message-Passing Interface) programs; work at Illinois in the last period of the project looked at optimizing data accesses expressed with MPI datatypes.
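
    As a generic illustration of a data access expressed with MPI datatypes (a hypothetical mpi4py sketch, not code from the project), a strided column of a row-major array can be described once as a derived datatype and handed to the MPI library, which is then free to optimize the transfer:

        # Hypothetical mpi4py sketch: describe a strided access (one column of a
        # row-major matrix) with an MPI derived datatype instead of packing it
        # by hand.  Run with: mpiexec -n 2 python datatype_sketch.py
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        nrows, ncols = 4, 8
        grid = np.arange(nrows * ncols, dtype='d').reshape(nrows, ncols)

        # One column: nrows blocks of length 1, separated by a stride of ncols.
        column_t = MPI.DOUBLE.Create_vector(nrows, 1, ncols)
        column_t.Commit()

        if comm.Get_rank() == 0:
            comm.Send([grid, 1, column_t], dest=1, tag=0)   # sends column 0
        elif comm.Get_rank() == 1:
            col = np.empty(nrows, dtype='d')
            comm.Recv(col, source=0, tag=0)
            print(col)                                      # column 0: 0, 8, 16, 24

        column_t.Free()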

  20. Indexed compilation of experimental high energy physics literature

    International Nuclear Information System (INIS)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is given, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  1. Regulatory and technical reports compilation for 1980

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzi, L.

    1981-04-01

    This compilation lists formal regulatory and technical reports and conference proceedings issued in 1980 by the US Nuclear Regulatory Commission. The compilation is divided into four major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The second major section of this compilation consists of a key-word index to report titles. The third major section contains an alphabetically arranged listing of contractor report numbers cross-referenced to their corresponding NRC report numbers. Finally, the fourth section is an errata supplement

  2. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

    This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation and first results of the Python-based compiler.

  3. Vectorization vs. compilation in query execution

    NARCIS (Netherlands)

    J. Sompolski (Juliusz); M. Zukowski (Marcin); P.A. Boncz (Peter)

    2011-01-01

    Compiling database queries into executable (sub-) programs provides substantial benefits compared to traditional interpreted execution. Many of these benefits, such as reduced interpretation overhead, better instruction code locality, and providing opportunities to use SIMD

  4. A Compilation of Internship Reports - 2012

    Energy Technology Data Exchange (ETDEWEB)

    Stegman M.; Morris, M.; Blackburn, N.

    2012-08-08

    This compilation documents all research projects undertaken by the summer 2012 Department of Energy Workforce Development for Teachers and Scientists interns during their internship program at Brookhaven National Laboratory.

  5. A software methodology for compiling quantum programs

    Science.gov (United States)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.

  6. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part III: B-Shaped Architecture with Vertical Well in the Upper Layer.

  7. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part IV: Normal and Inverted Letter 'h' and 'H' Architecture.

  8. Regulatory and technical reports. Compilation for second quarter 1982, April to June

    International Nuclear Information System (INIS)

    1982-08-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. A detailed explanation of the entries precedes each index

  9. An Efficient Compiler for Weighted Rewrite Rules

    OpenAIRE

    Mohri, Mehryar; Sproat, Richard

    1996-01-01

    Context-dependent rewrite rules are used in many areas of natural language and speech processing. Work in computational phonology has demonstrated that, given certain conditions, such rewrite rules can be represented as finite-state transducers (FSTs). We describe a new algorithm for compiling rewrite rules into FSTs. We show the algorithm to be simpler and more efficient than existing algorithms. Further, many of our applications demand the ability to compile weighted rules into weighted FST...

  10. Compilation of Sandia Laboratories technical capabilities

    International Nuclear Information System (INIS)

    Lundergan, C.D.; Mead, P.L.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078)

  11. Compilation of Sandia Laboratories technical capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lundergan, C. D.; Mead, P. L. [eds.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078). (RWR)

  12. Electronic circuits for communications systems: A compilation

    Science.gov (United States)

    1972-01-01

    The compilation of electronic circuits for communications systems is divided into thirteen basic categories, each representing an area of circuit design and application. The compilation items are moderately complex and, as such, would appeal to the applications engineer. However, the rationale for the selection criteria was tailored so that the circuits would reflect fundamental design principles and applications, with an additional requirement for simplicity whenever possible.

  13. Radiation data definitions and compilation for equipment qualification data bank

    International Nuclear Information System (INIS)

    Bouquet, F.L.; Winslow, J.W.

    1986-01-01

    Dose definitions, physical properties, mechanical properties, electrical properties, and particle definitions are listed for insulators and dielectrics, elastomeric seals and gaskets, lubricants, adhesives, and coatings

  14. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendents; sending to the selected node only the compiled software to be executed by the selected node or selected node's descendent.
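
    Read as a procedure, the steps enumerated above amount to a recursive walk down the node hierarchy in which each compiling node keeps the artifacts it must execute itself and forwards to each child only those destined for the child's subtree. A minimal sketch with invented names (our illustration, not text from the patent record):

        # Hypothetical sketch of the distribution scheme described above.
        from dataclasses import dataclass, field
        from typing import List, Set

        @dataclass
        class Unit:
            target: str            # name of the node that should execute this unit
            code: bytes = b""

        @dataclass
        class Node:
            name: str
            children: List["Node"] = field(default_factory=list)
            local_units: List[Unit] = field(default_factory=list)

            def subtree_names(self) -> Set[str]:
                names = {self.name}
                for child in self.children:
                    names |= child.subtree_names()
                return names

        def distribute(node: Node, units: List[Unit]) -> None:
            # Keep any compiled software to be executed on this compiling node.
            node.local_units = [u for u in units if u.target == node.name]
            # Select nodes in the next tier and send only what their subtrees need;
            # the recursive call stands in for the actual network transfer.
            for child in node.children:
                relevant = [u for u in units if u.target in child.subtree_names()]
                if relevant:
                    distribute(child, relevant)

        # Example: a root -> mid -> leaf0 hierarchy with one unit per node.
        leaf = Node("leaf0")
        root = Node("root", children=[Node("mid", children=[leaf])])
        distribute(root, [Unit("root"), Unit("mid"), Unit("leaf0")])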

  15. Regulatory and technical reports: (Abstract index journal). Compilation for first quarter 1997, January--March

    International Nuclear Information System (INIS)

    Sheehan, M.A.

    1997-06-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation is published quarterly and cumulated annually. Reports consist of staff-originated reports, NRC-sponsored conference reports, NRC contractor-prepared reports, and international agreement reports.

  16. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1997, July--September

    International Nuclear Information System (INIS)

    Stevenson, L.L.

    1998-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. This report contains the third quarter 1997 abstracts

  17. Regulatory and technical reports (abstract index journal). Annual compilation for 1984. Volume 9, No. 4

    International Nuclear Information System (INIS)

    1985-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  18. Regulatory and technical reports (abstract index journal): Annual compilation for 1987

    International Nuclear Information System (INIS)

    1988-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  19. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities...... for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler’s ability to generate loop-parallel code. We use this compilation system to modify two sequential...... benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should...

  20. Extension of Alvis compiler front-end

    Energy Technology Data Exchange (ETDEWEB)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl [AGH University of Science and Technology, Department of Applied Computer Science, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2015-12-31

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. The semantics of an Alvis model is expressed as an LTS graph (labelled transition system). Execution of any language statement is expressed as a transition between formally defined states of such a model. An LTS graph is generated using a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as a part of the Alvis language to define parameters’ types and the operations on them. Thanks to the compiler’s modular construction, many aspects of the compilation of an Alvis model may be modified. Providing new plugins for the Alvis Compiler that support languages like Java or C makes it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler's internal model and describes how the default specification language can be altered by new plugins.

  1. Promising Compilation to ARMv8 POP

    OpenAIRE

    Podkopaev, Anton; Lahav, Ori; Vafeiadis, Viktor

    2017-01-01

    We prove the correctness of compilation of relaxed memory accesses and release-acquire fences from the "promising" semantics of [Kang et al. POPL'17] to the ARMv8 POP machine of [Flur et al. POPL'16]. The proof is highly non-trivial because both the ARMv8 POP and the promising semantics provide some extremely weak consistency guarantees for normal memory accesses; however, they do so in rather different ways. Our proof of compilation correctness to ARMv8 POP strengthens the results of the Kan...

  2. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of different airborne surveys of different sizes, resolutions and years. Airborne magnetic acquisition is a fast and economical method to map and gain geological and tectonic information over large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters such as altitude and profile spacing are adjusted to match the purpose of the investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for, e.g., oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilations of magnetic data and the merger of adjacent magnetic surveys. Such studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still little recognized. The maximum wavelength that can be resolved by each individual survey is directly related to the survey size, and consequently a merger will introduce erroneous long-wavelength components into the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine the airborne compilation with satellite magnetic data, which is sufficient only under particular preconditions. A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated

  3. Compilation of excitation cross sections for He atoms by electron impact

    International Nuclear Information System (INIS)

    Kato, T.; Itikawa, Y.; Sakimoto, K.

    1992-03-01

    Experimental and theoretical data are compiled on the cross section for the excitation of He atoms by electron impact. The available data are compared graphically. The survey of the literature has been made through the end of 1991. (author)

  4. Compiling a corpus-based dictionary grammar: an example for ...

    African Journals Online (AJOL)

    In this article it is shown how a corpus-based dictionary grammar may be compiled — that is, a mini-grammar fully based on corpus data and specifically written for use in and integrated with a dictionary. Such an effort is, to the best of our knowledge, a world's first. We exemplify our approach for a Northern Sotho ...

  5. Parallelizing More Loops with Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    2012-01-01

    an interactive compilation feedback system that guides programmers in iteratively modifying their application source code. This helps leverage the compiler’s ability to generate loop-parallel code. We employ our system to modify two sequential benchmarks dealing with image processing and edge detection...

  6. Safety and maintenance engineering: A compilation

    Science.gov (United States)

    1974-01-01

    A compilation is presented for the dissemination of information on technological developments which have potential utility outside the aerospace and nuclear communities. Safety of personnel engaged in the handling of hazardous materials and equipment, protection of equipment from fire, high wind, or careless handling by personnel, and techniques for the maintenance of operating equipment are reported.

  7. Compilation of information on melter modeling

    International Nuclear Information System (INIS)

    Eyler, L.L.

    1996-03-01

    The objective of the task described in this report is to compile information on modeling capabilities for the High-Temperature Melter and the Cold Crucible Melter and issue a modeling capabilities letter report summarizing existing modeling capabilities. The report is to include strategy recommendations for future modeling efforts to support the High Level Waste (HLW) melter development.

  8. Design of methodology for incremental compiler construction

    Directory of Open Access Journals (Sweden)

    Pavel Haluza

    2011-01-01

    The paper deals with possibilities for incremental compiler construction. It presents compiler construction possibilities both for languages with a fixed set of lexical units and for languages with a variable set of lexical units. The methodology for incremental compiler construction is based on the known algorithms for standard compiler construction and is derived for both groups of languages. The group of languages with a fixed set of lexical units comprises languages in which each lexical unit has a constant meaning, e.g., common programming languages. For this group the paper addresses the problem of incremental semantic analysis, which is based on incremental parsing. In the group of languages with a variable set of lexical units (e.g., the professional typographic system TeX), it is possible to change the meaning of any character in the input file arbitrarily at any time during processing. The change takes effect immediately, and its validity can be limited in some way or extends to the end of the input. For this group the paper addresses the case in which macros temporarily change the category of arbitrary characters.

  9. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    Final technical report, Purdue University, November 2017; approved for public release.

  10. Compiler-Agnostic Function Detection in Binaries

    NARCIS (Netherlands)

    Andriesse, D.A.; Slowinska, J.M.; Bos, H.J.

    2017-01-01

    We propose Nucleus, a novel function detection algorithm for binaries. In contrast to prior work, Nucleus is compiler-agnostic, and does not require any learning phase or signature information. Instead of scanning for signatures, Nucleus detects functions at the Control Flow Graph-level, making it

  11. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    We demonstrate the ability of our tool to transform code, and suggest code refactoring that increases its amenability to optimization. The preliminary results show that, with our tool-set, automatic loop parallelization with the GNU C compiler, gcc, yields 8.6x best-case speedup over...

  12. Expectation Levels in Dictionary Consultation and Compilation ...

    African Journals Online (AJOL)

    Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of the structure ...

  13. Expectation Levels in Dictionary Consultation and Compilation*

    African Journals Online (AJOL)

    Abstract: Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of ...

  14. 1991 OCRWM bulletin compilation and index

    International Nuclear Information System (INIS)

    1992-05-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management, to provide current information about the national program for managing spent fuel and high-level radioactive waste. The document is a compilation of issues from the 1991 calendar year. A table of contents and an index have been provided to reference information contained in this year's Bulletins

  15. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  16. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...

  17. A compilation of structure functions in deep-inelastic scattering

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1991-01-01

    A compilation of data on the structure functions F_2, xF_3, and R = σ_L/σ_T from lepton deep-inelastic scattering off protons and nuclei is presented. The relevant experiments at CERN, Fermilab and SLAC from 1985 are covered. All the data in this review can be found in and retrieved from the Durham-RAL HEP Databases (HEPDATA on the RAL and CERN VM systems and on DURPDG VAX/VMS) together with data on a wide variety of other reactions. (author)

  18. PGHPF – An Optimizing High Performance Fortran Compiler for Distributed Memory Machines

    Directory of Open Access Journals (Sweden)

    Zeki Bozkus

    1997-01-01

    Full Text Available High Performance Fortran (HPF) is the first widely supported, efficient, and portable parallel programming language for shared and distributed memory systems. HPF is realized through a set of directive-based extensions to Fortran 90. It enables application developers and Fortran end-users to write compact, portable, and efficient software that will compile and execute on workstations, shared memory servers, clusters, traditional supercomputers, or massively parallel processors. This article describes a production-quality HPF compiler for a set of parallel machines. Compilation techniques such as data and computation distribution, communication generation, run-time support, and optimization issues are elaborated as the basis for an HPF compiler implementation on distributed memory machines. The performance of this compiler on benchmark programs demonstrates that high efficiency can be achieved executing HPF code on parallel architectures.
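
    As a rough illustration of the data-distribution side of such a compiler, the following Python sketch computes which processor owns each element of an array under an HPF-style BLOCK distribution, and how a global index maps to a local one. It is not HPF code and does not reflect the internals of PGHPF; the helper names are our own.

```python
def block_size(n, p):
    """Block length used by an HPF-style BLOCK distribution: ceil(n / p)."""
    return -(-n // p)

def owner(i, n, p):
    """Processor that owns global index i of an n-element array over p processors."""
    return i // block_size(n, p)

def to_local(i, n, p):
    """Map a global index to its (owner, local index) pair."""
    b = block_size(n, p)
    return i // b, i % b

n, p = 10, 4
print([owner(i, n, p) for i in range(n)])   # [0, 0, 0, 1, 1, 1, 2, 2, 2, 3]
print(to_local(7, n, p))                    # (2, 1): element 7 sits in local slot 1 on processor 2
```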

  19. Compilation of FY 1995 and FY 1996 DOD Financial Statements at the Defense Finance and Accounting Service, Indianapolis Center

    National Research Council Canada - National Science Library

    1996-01-01

    The audit objective was to determine whether the Defense Finance and Accounting Service, Indianapolis Center, consistently and accurately compiled financial data from field entities and other sources...

  20. Perspex machine: V. Compilation of C programs

    Science.gov (United States)

    Spanner, Matthew P.; Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of the Turing machine with projective geometry. The original, constructive proof used four special, perspective transformations to implement the Turing machine in projective geometry. These four transformations are now generalised and applied in a compiler, implemented in Pop11, that converts a subset of the C programming language into perspexes. This is interesting both from a geometrical and a computational point of view. Geometrically, it is interesting that program source can be converted automatically to a sequence of perspective transformations and conditional jumps, though we find that the product of homogeneous transformations with normalisation can be non-associative. Computationally, it is interesting that program source can be compiled for a Reduced Instruction Set Computer (RISC), the perspex machine, that is a Single Instruction, Zero Exception (SIZE) computer.
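
    The remark about non-associativity can be illustrated numerically. The toy NumPy sketch below applies two homogeneous transformations to a point and normalises either once at the end or after every step; the matrices are invented so that the intermediate normalisation divides by zero, which changes the outcome. This is only an illustration of the general phenomenon, not a reproduction of the perspex machine's arithmetic.

```python
import numpy as np

def normalize(v):
    """Renormalise a homogeneous point (x, y, w) to (x/w, y/w, 1)."""
    return v / v[-1]

# Invented matrices: B sends the chosen point onto the plane w = 0,
# so normalising the intermediate result divides by zero.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
B = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0]])
x = np.array([0.0, 1.0, 1.0])

with np.errstate(divide="ignore", invalid="ignore"):
    once  = normalize(A @ (B @ x))           # normalise only at the end
    twice = normalize(A @ normalize(B @ x))  # normalise after every step

print(once)   # [0. 1. 1.]  -- a finite point
print(twice)  # nan-contaminated: the intermediate normalisation divided by zero
```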

  1. Compilation of piping benchmark problems - Cooperative international effort

    Energy Technology Data Exchange (ETDEWEB)

    McAfee, W J [comp.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations.

  2. Compilation of piping benchmark problems - Cooperative international effort

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations

  3. Regulatory and technical reports (abstract index journal): Annual compilation for 1996, Volume 21, No. 4

    Energy Technology Data Exchange (ETDEWEB)

    Sheehan, M.A.

    1997-04-01

    This compilation is the annual cumulation of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors.

  4. Regulatory and technical reports (abstract index journal): Annual compilation for 1996, Volume 21, No. 4

    International Nuclear Information System (INIS)

    Sheehan, M.A.

    1997-04-01

    This compilation is the annual cumulation of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors

  5. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley...

  6. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley and...

  7. Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG96-03�Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont: VGS Open-File Report VG96-3A, 2 plates, scale...

  8. Compilation of the FY 1999 Department of the Navy Working Capital Fund Financial Statements

    National Research Council Canada - National Science Library

    2000-01-01

    ...) Cleveland Center consistently and accurately compiled and consolidated financial data received from Navy field organizations and other sources to prepare the FY 1999 Navy Working Capital Fund financial statements...

  9. Regulatory and technical reports (abstract index journal): Annual compilation for 1994. Volume 19, Number 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order. These precede the following indexes: secondary report number index, personal author index, subject index, NRC originating organization index (staff reports), NRC originating organization index (international agreements), NRC contract sponsor index (contractor reports), contractor index, international organization index, and licensed facility index. A detailed explanation of the entries precedes each index.

  10. Regulatory and technical reports: compilation for third quarter 1982 July-September

    International Nuclear Information System (INIS)

    1982-11-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. These precede the following indexes: Contractor Report Number Index; Personal Author Index; Subject Index; NRC Originating Organization Index (Staff Reports); NRC Contract Sponsor Index (Contractor Reports); Contractor Index; and Licensed Facility Index

  11. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    Full Text Available We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.
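
    As a minimal sketch of the "derived metrics" idea in the low-level view, the snippet below turns raw hardware-counter values into a few standard efficiency ratios. The counter names follow the common PAPI presets and the sample numbers are made up; the actual metrics used by the OpenUH-based environment may differ.

```python
def derived_metrics(counters):
    """Compute simple efficiency metrics from raw hardware counter values.

    Counter names follow the usual PAPI presets and are illustrative; the
    available set depends on the platform and the measurement tool."""
    ipc = counters["PAPI_TOT_INS"] / counters["PAPI_TOT_CYC"]
    l1_miss_rate = counters["PAPI_L1_DCM"] / counters["PAPI_LD_INS"]
    flops_per_cycle = counters["PAPI_FP_OPS"] / counters["PAPI_TOT_CYC"]
    return {"IPC": ipc, "L1 miss rate": l1_miss_rate,
            "FLOPs/cycle": flops_per_cycle}

sample = {"PAPI_TOT_INS": 4.2e9, "PAPI_TOT_CYC": 3.0e9,
          "PAPI_L1_DCM": 1.5e7, "PAPI_LD_INS": 1.1e9, "PAPI_FP_OPS": 9.0e8}
print(derived_metrics(sample))
```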

  12. Evaluation of HAL/S language compilability using SAMSO's Compiler Writing System (CWS)

    Science.gov (United States)

    Feliciano, M.; Anderson, H. D.; Bond, J. W., III

    1976-01-01

    NASA/Langley is engaged in a program to develop an adaptable guidance and control software concept for spacecraft such as shuttle-launched payloads. It is envisioned that this flight software be written in a higher-order language, such as HAL/S, to facilitate changes or additions. To make this adaptable software transferable to various onboard computers, a compiler writing system capability is necessary. A joint program with the Air Force Space and Missile Systems Organization was initiated to determine if the Compiler Writing System (CWS) owned by the Air Force could be utilized for this purpose. The present study explores the feasibility of including the HAL/S language constructs in CWS and the effort required to implement these constructs. This will determine the compilability of HAL/S using CWS and permit NASA/Langley to identify the HAL/S constructs desired for their applications. The study consisted of comparing the implementation of the Space Programming Language using CWS with the requirements for the implementation of HAL/S. It is the conclusion of the study that CWS already contains many of the language features of HAL/S and that it can be expanded for compiling part or all of HAL/S. It is assumed that persons reading and evaluating this report have a basic familiarity with (1) the principles of compiler construction and operation, and (2) the logical structure and applications characteristics of HAL/S and SPL.

  13. HAL/S-360 compiler system specification

    Science.gov (United States)

    Johnson, A. E.; Newbold, P. N.; Schulenberg, C. W.; Avakian, A. E.; Varga, S.; Helmers, P. H.; Helmers, C. T., Jr.; Hotz, R. L.

    1974-01-01

    A three phase language compiler is described which produces IBM 360/370 compatible object modules and a set of simulation tables to aid in run time verification. A link edit step augments the standard OS linkage editor. A comprehensive run time system and library provide the HAL/S operating environment, error handling, a pseudo real time executive, and an extensive set of mathematical, conversion, I/O, and diagnostic routines. The specifications of the information flow and content for this system are also considered.

  14. abc: An Extensible AspectJ Compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie J.

    2006-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its front end is built using the Polyglot framework, as a modular extension of the Java language. The use of Polyglot gives flexibility of syntax and type checking. The back end is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general...

  15. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  16. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  17. Fifth Baltic Sea pollution load compilation (PLC-5). An executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Svendsen, L.M.; Staaf, H.; Pyhala, M.; Kotilainen, P.; Bartnicki, J.; Knuuttila, S.; Durkin, M.

    2012-07-01

    This report summarizes and combines the main results of the Fifth Baltic Sea Pollution Load Compilation (HELCOM 2011) which covers waterborne loads to the sea and data on atmospheric loads which are submitted by countries to the co-operative programme for monitoring and evaluation of the long range transmission of air pollutants in Europe (EMEP), which subsequently compiles and reports this information to HELCOM.

  18. 12 CFR 411.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    Title 12, Banks and Banking; Export-Import Bank of the United States; New Restrictions on Lobbying; Agency Reports. § 411.600 Semi-annual compilation. (a) The head of each agency shall collect and compile the...

  19. Compilation of functional languages using flow graph analysis

    NARCIS (Netherlands)

    Hartel, Pieter H.; Glaser, Hugh; Wild, John M.

    A system based on the notion of a flow graph is used to specify formally and to implement a compiler for a lazy functional language. The compiler takes a simple functional language as input and generates C. The generated C program can then be compiled, and loaded with an extensive run-time system to

  20. Compiler-Directed Transformation for Higher-Order Stencils

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-20

    As the cost of data movement increasingly dominates performance, developers of finite-volume and finite-difference solutions for partial differential equations (PDEs) are exploring novel higher-order stencils that increase numerical accuracy and computational intensity. This paper describes a new compiler reordering transformation applied to stencil operators that performs partial sums in buffers, and reuses the partial sums in computing multiple results. This optimization has multiple effects on improving stencil performance that are particularly important to higher-order stencils: it exploits data reuse, reduces floating-point operations, and exposes efficient SIMD parallelism to backend compilers. We study the benefit of this optimization in the context of Geometric Multigrid (GMG), a widely used method to solve PDEs, using four different Jacobi smoothers built from 7-, 13-, 27-, and 125-point stencils. We quantify performance, speedup, and numerical accuracy, and use the Roofline model to qualify our results. Ultimately, we obtain over 4× speedup on the smoothers themselves and up to a 3× speedup on the multigrid solver. Finally, we demonstrate that high-order multigrid solvers have the potential of reducing total data movement and energy by several orders of magnitude.
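
    A minimal sketch of the partial-sums idea, applied to a simple 2D 9-point box smoother in NumPy: row sums are computed once into a buffer and reused by the three output rows that need them, cutting the adds per point roughly in half. This is our own illustration of the general technique, not the compiler transformation from the paper.

```python
import numpy as np

def box9_naive(a):
    """Straightforward 9-point (3x3) box smoother: 8 adds per output point."""
    out = np.zeros_like(a)
    out[1:-1, 1:-1] = sum(a[1 + di:a.shape[0] - 1 + di, 1 + dj:a.shape[1] - 1 + dj]
                          for di in (-1, 0, 1) for dj in (-1, 0, 1)) / 9.0
    return out

def box9_partial_sums(a):
    """Same smoother using a buffer of partial row sums.

    Each 3-wide row sum is computed once and reused by the three output
    rows that need it, trading 8 adds per point for roughly 4."""
    rowsum = a[:, :-2] + a[:, 1:-1] + a[:, 2:]          # partial sums, reused
    out = np.zeros_like(a)
    out[1:-1, 1:-1] = (rowsum[:-2] + rowsum[1:-1] + rowsum[2:]) / 9.0
    return out

a = np.random.rand(64, 64)
assert np.allclose(box9_naive(a), box9_partial_sums(a))
```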

  1. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1.- A detailed description in intermediate coupling for all the levels belonging to the 20 configurations 3p^5 ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2.- Calculation of the electron collision excitation cross sections in the Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p^5 ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3.- Comparison and discussion of the compiled data. These are the experimental and theoretical values available from the literature, and those from this work. 4.- Analysis of the regularities and systematic behaviors in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose. In some cases it has been possible to obtain in this way approximate analytical expressions interpolating the experimental data. 5.- All the experimental and theoretical values studied are graphically presented and compared. 6.- The last part of the work includes a listing of several general-purpose programs for Atomic Physics calculations developed for this work. (Author) 35 refs

  2. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco Ramos, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1.- A detailed description in intermediate coupling for all the levels belonging to the 20 configurations 3p^5 ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2.- Calculation of the electron collision excitation cross sections in the Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p^5 ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3.- Comparison and discussion of the compiled data. These are the experimental and theoretical values available from the literature, and those from this work. 4.- Analysis of the regularities and systematic behaviors in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose. In some cases it has been possible to obtain in this way approximate analytical expressions interpolating the experimental data. 5.- All the experimental and theoretical values studied are graphically presented and compared. 6.- The last part of the work includes a listing of several general-purpose programs for Atomic Physics calculations developed for this work. (Author)

  3. Collection methods, data compilation, and lessons learned from a study of stream geomorphology associated with riparian cattle grazing along the Fever River, University of Wisconsin- Platteville Pioneer Farm, Wisconsin, 2004–11

    Science.gov (United States)

    Peppler, Marie C.; Fitzpatrick, Faith A.

    2018-03-09

    Stream geomorphic characteristics were monitored along a 0.8-mile reach of the Fever River in the Driftless Area of southwestern Wisconsin from 2004 to 2011 where cattle grazed in paddocks along the riverbank at the University of Wisconsin-Platteville’s Pioneer Farm. The study reach encompassed seven paddocks that covered a total of 30 acres on both sides of the river. Monitoring data included channel cross-section surveys, eroding bank measurements and photograph points, erosion-pin measurements, longitudinal profile surveys, measurements of the volume of soft sediment in the channel, and repeated time-lapse photographs. Characteristics were summarized into subreaches by use of a geographic information system. From 2004 to 2007, baseline monitoring was done to identify geomorphic conditions prior to evaluating the effects of management alternatives for riparian grazing. Subsequent to the full-scale baseline monitoring, additional data were collected from 2007 to 2011. Samples of eroding bank and in-channel soft sediment were collected and analyzed for dry bulk density in 2008 for use in a sediment budget. One of the pastures was excluded from cattle grazing in the fall of 2007; in 2009 channel cross sections, longitudinal profiles, erosion-pin measurements, photographs, and a soft sediment survey were again collected along the full 0.8-mile reach for a comparison to baseline monitoring data. Channel cross sections were surveyed a final time in 2011. Lessons learned from bank monitoring with erosion pins were most numerous and included the need for consistent tracking of each pin and whether there was deposition or erosion, timing of measurements and bank conditions during measurements (frozen, post-flood), and awareness of pins loosening in place. Repeated freezing and thawing of banks and consequential mass wasting and jointing enhance fluvial erosion. Monitoring equipment in the paddocks was kept flush to the ground or located high on posts to avoid injuring the

  4. JLAPACK – Compiling LAPACK FORTRAN to Java

    Directory of Open Access Journals (Sweden)

    David M. Doolin

    1999-01-01

    Full Text Available The JLAPACK project provides the LAPACK numerical subroutines translated from their subset Fortran 77 source into class files, executable by the Java Virtual Machine (JVM) and suitable for use by Java programmers. This makes it possible for Java applications or applets, distributed on the World Wide Web (WWW), to use established legacy numerical code that was originally written in Fortran. The translation is accomplished using a special purpose Fortran‐to‐Java (source‐to‐source) compiler. The LAPACK API will be considerably simplified to take advantage of Java’s object‐oriented design. This report describes the research issues involved in the JLAPACK project, and its current implementation and status.

  5. Compilation of Existing Neutron Screen Technology

    Directory of Open Access Journals (Sweden)

    N. Chrysanthopoulou

    2014-01-01

    Full Text Available The presence of fast neutron spectra in new reactors is expected to induce a strong impact on the contained materials, including structural materials, nuclear fuels, neutron reflecting materials, and tritium breeding materials. Therefore, introduction of these reactors into operation will require extensive testing of their components, which must be performed under neutronic conditions representative of those expected to prevail inside the reactor cores when in operation. Due to limited availability of fast reactors, testing of future reactor materials will mostly take place in water cooled material test reactors (MTRs) by tailoring the neutron spectrum via neutron screens. The latter rely on the utilization of materials capable of absorbing neutrons at specific energies. A large but fragmented experience is available on that topic. In this work a comprehensive compilation of the existing neutron screen technology is attempted, focusing on neutron screens developed in order to locally enhance the fast over thermal neutron flux ratio in a reactor core.

  6. Interpretation, compilation and field verification procedures in the CARETS project

    Science.gov (United States)

    Alexander, Robert H.; De Forth, Peter W.; Fitzpatrick, Katherine A.; Lins, Harry F.; McGinty, Herbert K.

    1975-01-01

    The production of the CARETS map data base involved the development of a series of procedures for interpreting, compiling, and verifying data obtained from remote sensor sources. Level II land use mapping from high-altitude aircraft photography at a scale of 1:100,000 required production of a photomosaic mapping base for each of the 48 sheets (50 x 50 km each), and the interpretation and coding of land use polygons on drafting film overlays. CARETS researchers also produced a series of 1970 to 1972 land use change overlays, using the 1970 land use maps and 1972 high-altitude aircraft photography. To enhance the value of the land use sheets, researchers compiled a series of overlays showing cultural features, county boundaries and census tracts, surface geology, and drainage basins. In producing Level I land use maps from Landsat imagery, at a scale of 1:250,000, interpreters overlaid drafting film directly on Landsat color composite transparencies and interpreted on the film. They found that such interpretation involves pattern and spectral signature recognition. In studies using Landsat imagery, interpreters identified numerous areas of change but also identified extensive areas of "false change," where Landsat spectral signatures but not land use had changed.

  7. Research and Practice of the News Map Compilation Service

    Science.gov (United States)

    Zhao, T.; Liu, W.; Ma, W.

    2018-04-01

    Based on the needs of the news media for maps, this paper investigates the news map compilation service: it conducts demand research on the service, designs and compiles a public-authority base map suitable for media publication, and builds a news base map material library. It studies the compilation of timely, highly targeted and cross-regional domestic and international news maps, constructs a hot-news thematic gallery and news map customization services, investigates types of news maps, establishes closer liaison and cooperation with news media, and guides news media in using correct maps. Drawing on the practice of the news map compilation service, the paper presents two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and development suggestions for the service.

  8. Length of Stay After Childbirth in 92 Countries and Associated Factors in 30 Low- and Middle-Income Countries: Compilation of Reported Data and a Cross-sectional Analysis from Nationally Representative Surveys.

    Directory of Open Access Journals (Sweden)

    Oona M R Campbell

    2016-03-01

    Full Text Available Following childbirth, women need to stay sufficiently long in health facilities to receive adequate care. Little is known about length of stay following childbirth in low- and middle-income countries or its determinants. We described length of stay after facility delivery in 92 countries. We then created a conceptual framework of the main drivers of length of stay, and explored factors associated with length of stay in 30 countries using multivariable linear regression. Finally, we used multivariable logistic regression to examine the factors associated with stays that were "too short" (<24 h for vaginal deliveries and <72 h for cesarean-section deliveries). Across countries, the mean length of stay ranged from 1.3 to 6.6 d: 0.5 to 6.2 d for singleton vaginal deliveries and 2.5 to 9.3 d for cesarean-section deliveries. The percentage of women staying too short ranged from 0.2% to 83% for vaginal deliveries and from 1% to 75% for cesarean-section deliveries. Our conceptual framework identified three broad categories of factors that influenced length of stay: need-related determinants that required an indicated extension of stay, and health-system and woman/family dimensions that were drivers of inappropriately short or long stays. The factors identified as independently important in our regression analyses included cesarean-section delivery, birthweight, multiple birth, and infant survival status. Older women and women whose infants were delivered by doctors had extended lengths of stay, as did poorer women. Reliance on factors captured in secondary data that were self-reported by women up to 5 y after a live birth was the main limitation. Length of stay after childbirth is very variable between countries. Substantial proportions of women stay too short to receive adequate postnatal care. We need to ensure that facilities have skilled birth attendants and effective elements of care, but also that women stay long enough to benefit from these. The
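
    A hedged sketch of the kind of analysis described, in Python with pandas and statsmodels: the binary "too short" outcome is derived from the stated thresholds and a multivariable logistic regression is fitted. The file name and column names are hypothetical; the covariates roughly follow those listed in the abstract.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey extract; the file and column names are illustrative only.
df = pd.read_csv("postnatal_stays.csv")

# "Too short": < 24 h after a vaginal delivery, < 72 h after a cesarean section.
df["too_short"] = (((df["cesarean"] == 0) & (df["stay_hours"] < 24)) |
                   ((df["cesarean"] == 1) & (df["stay_hours"] < 72))).astype(int)

# Multivariable logistic regression with covariates similar to those in the abstract.
model = smf.logit(
    "too_short ~ cesarean + birthweight + multiple_birth + infant_survived"
    " + maternal_age + doctor_delivery + wealth_quintile",
    data=df,
).fit()
print(model.summary())
```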

  9. Regulatory and technical reports (abstract index journal). Volume 20, No. 2: Compilation for second quarter April--June 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually.

  10. Regulatory and technical reports (abstract index journal). Volume 20, No. 2: Compilation for second quarter April--June 1995

    International Nuclear Information System (INIS)

    1995-09-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  11. Scientific Programming with High Performance Fortran: A Case Study Using the xHPF Compiler

    Directory of Open Access Journals (Sweden)

    Eric De Sturler

    1997-01-01

    Full Text Available Recently, the first commercial High Performance Fortran (HPF) subset compilers have appeared. This article reports on our experiences with the xHPF compiler of Applied Parallel Research, version 1.2, for the Intel Paragon. At this stage, we do not expect very high performance from our HPF programs, even though performance will eventually be of paramount importance for the acceptance of HPF. Instead, our primary objective is to study how to convert large Fortran 77 (F77) programs to HPF such that the compiler generates reasonably efficient parallel code. We report on a case study that identifies several problems when parallelizing code with HPF; most of these problems affect current HPF compiler technology in general, although some are specific to the xHPF compiler. We discuss our solutions from the perspective of the scientific programmer, and present timing results on the Intel Paragon. The case study comprises three programs of different complexity with respect to parallelization. We use the dense matrix-matrix product to show that the distribution of arrays and the order of nested loops significantly influence the performance of the parallel program. We use Gaussian elimination with partial pivoting to study the parallelization strategy of the compiler. There are various ways to structure this algorithm for a particular data distribution. This example shows how much effort may be demanded from the programmer to support the compiler in generating an efficient parallel implementation. Finally, we use a small application to show that the more complicated structure of a larger program may introduce problems for the parallelization, even though all subroutines of the application are easy to parallelize by themselves. The application consists of a finite volume discretization on a structured grid and a nested iterative solver. Our case study shows that it is possible to obtain reasonably efficient parallel programs with xHPF, although the compiler
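
    For reference, the second case study's algorithm, Gaussian elimination with partial pivoting, is shown below as a plain sequential NumPy sketch. An HPF version would add distribution directives so that rows or columns of the matrix are spread across processors; this sketch only fixes the algorithm being parallelized and is not taken from the article.

```python
import numpy as np

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))     # partial pivoting: largest |a_ik|
        if p != k:
            A[[k, p]] = A[[p, k]]
            b[[k, p]] = b[[p, k]]
        factors = A[k + 1:, k] / A[k, k]
        A[k + 1:, k:] -= np.outer(factors, A[k, k:])
        b[k + 1:] -= factors * b[k]
    x = np.empty(n)
    for i in range(n - 1, -1, -1):              # back substitution
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.random.rand(5, 5) + 5 * np.eye(5)
b = np.random.rand(5)
assert np.allclose(gauss_solve(A, b), np.linalg.solve(A, b))
```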

  12. Compiling quantum circuits to realistic hardware architectures using temporal planners

    Science.gov (United States)

    Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy

    2018-04-01

    To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits whose high number of commuting gates allows great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations but also means there is a greater potential win from more optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem, and generate a test suite of compilation problems for QAOA circuits of various sizes targeting a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.
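
    The nearest-neighbor constraint mentioned above boils down to a routing sub-problem: two-qubit gates may only act on adjacent physical qubits, so SWAPs must be inserted. The sketch below shows a greedy version of that sub-problem on a linear qubit array; it is only an illustration, not the temporal-planning formulation used in the paper.

```python
def route_cnot(control, target, layout):
    """Insert SWAPs so a two-qubit gate acts on adjacent physical qubits.

    `layout` maps logical qubit -> physical position on a linear array.
    Greedy illustration only; the paper casts routing as temporal planning."""
    ops = []
    while abs(layout[control] - layout[target]) > 1:
        # Move the control one step towards the target.
        step = 1 if layout[target] > layout[control] else -1
        neighbour_pos = layout[control] + step
        neighbour = next(q for q, p in layout.items() if p == neighbour_pos)
        ops.append(("SWAP", control, neighbour))
        layout[control], layout[neighbour] = layout[neighbour], layout[control]
    ops.append(("CNOT", control, target))
    return ops

layout = {"q0": 0, "q1": 1, "q2": 2, "q3": 3}
print(route_cnot("q0", "q3", layout))
# [('SWAP', 'q0', 'q1'), ('SWAP', 'q0', 'q2'), ('CNOT', 'q0', 'q3')]
```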

  13. COMPILATION OF LABORATORY SCALE ALUMINUM WASH AND LEACH REPORT RESULTS

    International Nuclear Information System (INIS)

    Harrington, S.J.

    2011-01-01

    This report compiles and analyzes all known wash and caustic leach laboratory studies. As further data is produced, this report will be updated. Included are aluminum mineralogical analysis results as well as a summation of the wash and leach procedures and results. Of the 177 underground storage tanks at Hanford, information was only available for five individual double-shell tanks, forty-one individual single-shell tanks (e.g. thirty-nine 100 series and two 200 series tanks), and twelve grouped tank wastes. Seven of the individual single-shell tank studies provided data for the percent of aluminum removal as a function of time for various caustic concentrations and leaching temperatures. It was determined that in most cases increased leaching temperature, caustic concentration, and leaching time leads to increased dissolution of leachable aluminum solids.

  14. COMPILATION OF LABORATORY SCALE ALUMINUM WASH AND LEACH REPORT RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    HARRINGTON SJ

    2011-01-06

    This report compiles and analyzes all known wash and caustic leach laboratory studies. As further data is produced, this report will be updated. Included are aluminum mineralogical analysis results as well as a summation of the wash and leach procedures and results. Of the 177 underground storage tanks at Hanford, information was only available for five individual double-shell tanks, forty-one individual single-shell tanks (e.g. thirty-nine 100 series and two 200 series tanks), and twelve grouped tank wastes. Seven of the individual single-shell tank studies provided data for the percent of aluminum removal as a function of time for various caustic concentrations and leaching temperatures. It was determined that in most cases increased leaching temperature, caustic concentration, and leaching time leads to increased dissolution of leachable aluminum solids.

  15. Compiling a Monolingual Dictionary for Native Speakers

    Directory of Open Access Journals (Sweden)

    Patrick Hanks

    2011-10-01

    Full Text Available

    ABSTRACT: This article gives a survey of the main issues confronting the compilers of monolingual dictionaries in the age of the Internet. Among others, it discusses the relationship between a lexical database and a monolingual dictionary, the role of corpus evidence, historical principles in lexicography vs. synchronic principles, the instability of word meaning, the need for full vocabulary coverage, principles of definition writing, the role of dictionaries in society, and the need for dictionaries to give guidance on matters of disputed word usage. It concludes with some questions about the future of dictionary publishing.

    SUMMARY: Compiling a monolingual dictionary for native speakers. This article gives an overview of the main issues confronting the compilers of monolingual dictionaries in the age of the Internet. Among other things, it discusses the relationship between a lexical database and a monolingual dictionary, the role of corpus evidence, historical versus synchronic principles in lexicography, the instability of word meaning, the need for full vocabulary coverage, principles of definition writing, the role of dictionaries in society, and the need for dictionaries to give guidance on matters of disputed word usage. It concludes with a number of questions about the future of dictionary publishing.

    Keywords: MONOLINGUAL DICTIONARIES, LEXICAL DATABASE, DICTIONARY STRUCTURE, WORD MEANING, MEANING CHANGE, USAGE, USAGE NOTES, HISTORICAL PRINCIPLES OF LEXICOGRAPHY, SYNCHRONIC PRINCIPLES OF LEXICOGRAPHY, REGISTER, SLANG, STANDARD ENGLISH, VOCABULARY COVERAGE, CONSISTENCY OF COLLECTIONS, PHRASEOLOGY, SYNTAGMATIC PATTERNS, PROBLEMS OF COMPOSITIONALITY, LINGUISTIC PRESCRIPTIVISM, LEXICAL EVIDENCE

  16. A Critical Compilation of Compressible Turbulent Boundary Layer Data

    Science.gov (United States)

    1977-06-01

    Tape general specification: seven-track, half-inch tape, recorded at 800 BPI, coded BCD with even parity (Barrow Hall, Athens, Georgia 30602). Table 5 of the report lists the profile series presented.

  17. Evaluation and compilation of DOE waste package test data

    International Nuclear Information System (INIS)

    Interrante, C.G.; Escalante, E.; Fraker, A.C.

    1990-11-01

    This report summarizes evaluations by the National Institute of Standards and Technology (NIST) of Department of Energy (DOE) activities on waste packages designed for containment of radioactive high-level nuclear waste (HLW) for the six-month period August 1988 through January 1989. Included are reviews of related materials research and plans, activities for the DOE Materials Characterization Center, information on the Yucca Mountain Project, and other information regarding supporting research and special assistance. NIST comments are given on the Yucca Mountain Consultation Draft Site Characterization Plan (CDSCP) and on the Waste Compliance Plan for the West Valley Demonstration Project (WVDP) High-Level Waste (HLW) Form. 3 figs

  18. Evaluation and compilation of DOE waste package test data

    International Nuclear Information System (INIS)

    Interrante, C.G.; Fraker, A.C.; Escalante, E.

    1991-12-01

    This report summarizes evaluations by the National Institute of Standards and Technology (NIST) of Department of Energy (DOE) activities on waste packages designed for containment of radioactive high-level nuclear waste (HLW) for the six-month period, February through July 1989. This includes reviews of related materials research and plans, information on the Yucca Mountain, Nevada disposal site activities, and other information regarding supporting research and special assistance. Outlines for planned interpretative reports on the topics of aqueous corrosion of copper, mechanisms of stress corrosion cracking and internal failure modes of Zircaloy cladding are included. For the publications reviewed during this reporting period, short discussions are given to supplement the completed reviews and evaluations. Included in this report is an overall review of a 1984 report on glass leaching mechanisms, as well as reviews for each of the seven chapters of this report

  19. Evaluation and compilation of DOE waste package test data

    International Nuclear Information System (INIS)

    Interrante, C.G.; Fraker, A.C.; Escalante, E.

    1993-06-01

    This report summarizes evaluations by the National Institute of Standards and Technology (NIST) of some of the Department of Energy (DOE) activities on waste packages designed for containment of radioactive high-level nuclear waste (HLW) for the six-month period, August 1989--January 1990. This includes reviews of related materials research and plans, information on the Yucca Mountain, Nevada disposal site activities, and other information regarding supporting research and special assistance. Short discussions are given relating to the publications reviewed and complete reviews and evaluations are included. Reports of other work are included in the Appendices

  20. COMPILATION OF AVAILABLE DATA ON BUILDING DECONTAMINATION ALTERNATIVES

    Science.gov (United States)

    The report presents an analysis of selected technologies that have been tested for their potential effectiveness in decontaminating a building that has been attacked using biological or chemical warfare agents, or using toxic industrial compounds. The technologies selected to be ...

  1. 12 CFR 203.4 - Compilation of loan data.

    Science.gov (United States)

    2010-01-01

    ... a merger or acquisition, or as part of the acquisition of all of the assets and liabilities of a... updates). (12)(i) For originated loans subject to Regulation Z, 12 CFR part 226, the difference between... as of the date the interest rate is set, if that difference is equal to or greater than 1.5...

  2. HAL/S-360 compiler test activity report

    Science.gov (United States)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  3. Regulatory and technical reports: compilation for 1975-1978

    International Nuclear Information System (INIS)

    1982-04-01

    This brief compilation lists formal reports issued by the US Nuclear Regulatory Commission in 1975 through 1978 that were not listed in the Regulatory and Technical Reports Compilation for 1975 to 1978, NUREG-0304, Vol. 3. This compilation is divided into two sections. The first consists of a sequential listing of all reports in report-number order. The second section consists of an index developed from keywords in report titles and abstracts

  4. HOPE: Just-in-time Python compiler for astrophysical computations

    Science.gov (United States)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of compiled implementation.
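
    Assuming the decorator is exposed as hope.jit (the abstract only says a decorator is added to the function definition), typical usage would look roughly like the sketch below; the function and the arrays are illustrative.

```python
import numpy as np
import hope  # assumption: the JIT decorator is exposed as hope.jit

@hope.jit
def kinetic_energy(m, v):
    # Plain numerical Python; HOPE translates it to C++ the first time it is called.
    return 0.5 * m * v ** 2

m = np.random.rand(10000)
v = np.random.rand(10000)
print(kinetic_energy(m, v)[:3])
```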

  5. Compiler Construction Using Java, JavaCC, and Yacc

    CERN Document Server

    Dos Reis, Anthony J

    2012-01-01

    Broad in scope, involving theory, the application of that theory, and programming technology, compiler construction is a moving target, with constant advances in compiler technology taking place. Today, a renewed focus on do-it-yourself programming makes a quality textbook on compilers, that both students and instructors will enjoy using, of even more vital importance. This book covers every topic essential to learning compilers from the ground up and is accompanied by a powerful and flexible software package for evaluating projects, as well as several tutorials, well-defined projects, and tes

  6. SERKON program for compiling a multigroup library to be used in BETTY calculation

    International Nuclear Information System (INIS)

    Nguyen Phuoc Lan.

    1982-11-01

    A SERKON-type program was written to compile data sets generated by FEDGROUP-3 into a multigroup library for BETTY calculation. A multigroup library was generated from the ENDF/B-IV data file and tested against the TRX-1 and TRX-2 lattices with good results. (author)

  7. Compilation and evaluation of a Paso del Norte emission inventory

    Energy Technology Data Exchange (ETDEWEB)

    Funk, T.H.; Chinkin, L.R.; Roberts, P.T. [Sonoma Technology, Inc., 1360 Redwood Way, Suite C, 94954-1169 Petaluma, CA (United States); Saeger, M.; Mulligan, S. [Pacific Environmental Services, 5001 S. Miami Blvd., Suite 300, 27709 Research Triangle Park, NC (United States); Paramo Figueroa, V.H. [Instituto Nacional de Ecologia, Avenue Revolucion 1425, Nivel 10, Col. Tlacopac San Angel, Delegacion Alvaro Obregon, C.P., 01040, D.F. Mexico (Mexico); Yarbrough, J. [US Environmental Protection Agency - Region 6, 1445 Ross Avenue, Suite 1200, 75202-2733 Dallas, TX (United States)

    2001-08-10

    Emission inventories of ozone precursors are routinely used as input to comprehensive photochemical air quality models. Photochemical model performance and the development of effective control strategies rely on the accuracy and representativeness of an underlying emission inventory. This paper describes the tasks undertaken to compile and evaluate an ozone precursor emission inventory for the El Paso/Ciudad Juarez/Southern Dona Ana region. Point, area and mobile source emission data were obtained from local government agencies and were spatially and temporally allocated to a gridded domain using region-specific demographic and land-cover information. The inventory was then processed using the US Environmental Protection Agency (EPA) recommended Emissions Preprocessor System 2.0 (UAM-EPS 2.0) which generates emissions files compatible with the Urban Airshed Model (UAM). A top-down evaluation of the emission inventory was performed to examine how well the inventory represented ambient pollutant compositions. The top-down evaluation methodology employed in this study compares emission inventory ratios of non-methane hydrocarbon (NMHC)/nitrogen oxide (NO{sub x}) and carbon monoxide (CO)/NO{sub x} ratios to corresponding ambient ratios. Detailed NMHC species comparisons were made in order to investigate the relative composition of individual hydrocarbon species in the emission inventory and in the ambient data. The emission inventory compiled during this effort has since been used to model ozone in the Paso del Norte airshed (Emery et al., CAMx modeling of ozone and carbon monoxide in the Paso del Norte airshed. In: Proc of Ninety-Third Annual Meeting of Air and Waste Management Association, 18-22 June 2000, Air and Waste Management Association, Pittsburgh, PA, 2000)
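
    A rough sketch of the top-down check described above: compute NMHC/NOx and CO/NOx ratios from inventory totals and compare them with the corresponding ambient ratios. File names and column names are hypothetical, and the unit conversion between mass emissions and ambient mixing ratios, which a real comparison must handle, is glossed over here.

```python
import pandas as pd

# Hypothetical inputs: daily inventory totals and morning ambient measurements.
inv = pd.read_csv("inventory_totals.csv")     # columns: pollutant, emissions
amb = pd.read_csv("ambient_morning.csv")      # columns: NMHC, NOx, CO

def inv_total(pollutant):
    return inv.loc[inv["pollutant"] == pollutant, "emissions"].sum()

inv_ratios = {
    "NMHC/NOx": inv_total("NMHC") / inv_total("NOx"),
    "CO/NOx":   inv_total("CO") / inv_total("NOx"),
}
amb_ratios = {
    "NMHC/NOx": (amb["NMHC"] / amb["NOx"]).median(),
    "CO/NOx":   (amb["CO"] / amb["NOx"]).median(),
}
for key in inv_ratios:
    print(f"{key}: inventory {inv_ratios[key]:.2f} vs ambient {amb_ratios[key]:.2f}")
```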

  8. The compiled catalogue of galaxies in machine-readable form and its statistical investigation

    International Nuclear Information System (INIS)

    Kogoshvili, N.G.

    1982-01-01

    The compilation of a machine-readable catalogue of relatively bright galaxies was undertaken in Abastumani Astrophysical Observatory in order to facilitate the statistical analysis of a large observational material on galaxies from the Palomar Sky Survey. In compiling the catalogue of galaxies the following problems were considered: the collection of existing information for each galaxy; a critical approach to data aimed at the selection of the most important features of the galaxies; the recording of data in computer-readable form; and the permanent updating of the catalogue. (Auth.)

  9. Global compilation of marine varve records

    Science.gov (United States)

    Schimmelmann, Arndt; Lange, Carina B.; Schieber, Juergen; Francus, Pierre; Ojala, Antti E. K.; Zolitschka, Bernd

    2017-04-01

    Marine varves contain highly resolved records of geochemical and other paleoceanographic and paleoenvironmental proxies with annual to seasonal resolution. We present a global compilation of marine varved sedimentary records throughout the Holocene and Quaternary covering more than 50 sites worldwide. Marine varve deposition and preservation typically depend on environmental and sedimentological conditions, such as a sufficiently high sedimentation rate, severe depletion of dissolved oxygen in bottom water to exclude bioturbation by macrobenthos, and a seasonally varying sedimentary input to yield a recognizable rhythmic varve pattern. Additional oceanographic factors may include the strength and depth range of the Oxygen Minimum Zone (OMZ) and regional anthropogenic eutrophication. Modern to Quaternary marine varves are not only found in those parts of the open ocean that comply with these conditions, but also in fjords, embayments and estuaries with thermohaline density stratification, and nearshore 'marine lakes' with strong hydrologic connections to ocean water. Marine varves have also been postulated in pre-Quaternary rocks. In the case of non-evaporitic laminations in fine-grained ancient marine rocks, such as banded iron formations and black shales, laminations may not be varves but instead may have multiple alternative origins such as event beds or formation via bottom currents that transported and sorted silt-sized particles, clay floccules, and organic-mineral aggregates in the form of migrating bedload ripples. Modern marine ecosystems on continental shelves and slopes, in coastal zones and in estuaries are susceptible to stress by anthropogenic pressures, for example in the form of eutrophication, enhanced OMZs, and expanding ranges of oxygen-depletion in bottom waters. Sensitive laminated sites may play the important role of a 'canary in the coal mine' where monitoring the character and geographical extent of laminations/varves serves as a diagnostic

  10. An Initial Evaluation of the NAG f90 Compiler

    Directory of Open Access Journals (Sweden)

    Michael Metcalf

    1992-01-01

    A few weeks before the formal publication of the ISO Fortran 90 Standard, NAG announced the world's first f90 compiler. We have evaluated the compiler by using it to assess the impact of Fortran 90 on the CERN Program Library.

  11. Compiling the First Monolingual Lusoga Dictionary | Nabirye | Lexikos

    African Journals Online (AJOL)

    Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular ... This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary. Keywords: lexicography ...

  12. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    The material contained in this compilation is sorted according to eight subject categories: 1. General Compilations; 2. Basic Isotopic Properties; 3. Nuclear Structure Properties; 4. Nuclear Decay Processes: Half-lives, Energies and Spectra; 5. Nuclear Decay Processes: Gamma-rays; 6. Nuclear Decay Processes: Fission Products; 7. Nuclear Decay Processes: (Others); 8. Atomic Processes

  13. Production compilation : A simple mechanism to model complex skill acquisition

    NARCIS (Netherlands)

    Taatgen, N.A.; Lee, F.J.

    2003-01-01

    In this article we describe production compilation, a mechanism for modeling skill acquisition. Production compilation has been developed within the Adaptive Control of Thought-Rational (ACT-R; J. R. Anderson, D. Bothell, M. D. Byrne, & C. Lebiere, 2002) cognitive architecture and consists of combining and specializing

  14. A compilation of structure functions in deep inelastic scattering

    International Nuclear Information System (INIS)

    Gehrmann, T.; Roberts, R.G.; Whalley, M.R.

    1999-01-01

    A compilation of all the available data on the unpolarized structure functions F2 and xF3, R = (σL/σT), the virtual photon asymmetries A1 and A2, and the polarized structure functions g1 and g2, from deep inelastic lepton scattering off protons, deuterium and nuclei is presented. The relevant experiments at CERN, DESY, Fermilab and SLAC from 1991, the date of our earlier review [1], to the present day are covered. A brief general theoretical introduction is given, followed by the data presented both in tabular and graphical form and, for the F2 and xF3 data, the predictions based on the MRST98 and CTEQ4 parton distribution functions are also displayed. All the data in this review, together with data on a wide variety of other reactions, can be found in and retrieved from the Durham-RAL HEP Databases on the World-Wide-Web (http://durpdg.dur.ac.uk/HEPDATA). (author)

  15. Cosmetics Europe compilation of historical serious eye damage/eye irritation in vivo data analysed by drivers of classification to support the selection of chemicals for development and evaluation of alternative methods/strategies: the Draize eye test Reference Database (DRD).

    Science.gov (United States)

    Barroso, João; Pfannenbecker, Uwe; Adriaens, Els; Alépée, Nathalie; Cluzel, Magalie; De Smedt, Ann; Hibatallah, Jalila; Klaric, Martina; Mewes, Karsten R; Millet, Marion; Templier, Marie; McNamee, Pauline

    2017-02-01

    A thorough understanding of which of the effects assessed in the in vivo Draize eye test are responsible for driving UN GHS/EU CLP classification is critical for an adequate selection of chemicals to be used in the development and/or evaluation of alternative methods/strategies and for properly assessing their predictive capacity and limitations. For this reason, Cosmetics Europe has compiled a database of Draize data (Draize eye test Reference Database, DRD) from external lists that were created to support past validation activities. This database contains 681 independent in vivo studies on 634 individual chemicals representing a wide range of chemical classes. A description of all the ocular effects observed in vivo, i.e. degree of severity and persistence of corneal opacity (CO), iritis, and/or conjunctiva effects, was added for each individual study in the database, and the studies were categorised according to their UN GHS/EU CLP classification and the main effect driving the classification. An evaluation of the various in vivo drivers of classification compiled in the database was performed to establish which of these are most important from a regulatory point of view. These analyses established that the most important drivers for Cat 1 Classification are (1) CO mean ≥ 3 (days 1-3) (severity) and (2) CO persistence on day 21 in the absence of severity, and those for Cat 2 classification are (3) CO mean ≥ 1 and (4) conjunctival redness mean ≥ 2. Moreover, it is shown that all classifiable effects (including persistence and CO = 4) should be present in ≥60 % of the animals to drive a classification. As a consequence, our analyses suggest the need for a critical revision of the UN GHS/EU CLP decision criteria for the Cat 1 classification of chemicals. Finally, a number of key criteria are identified that should be taken into consideration when selecting reference chemicals for the development, evaluation and/or validation of alternative methods and

  16. Compilation of the FY 1998 Army General Fund Financial Statements at the Defense Finance and Accounting Service Indianapolis Center

    National Research Council Canada - National Science Library

    1999-01-01

    Our objective was to determine whether the DFAS Indianapolis Center consistently and accurately compiled financial data from field activities and other sources for the FY 1998 Army General Fund financial statements...

  17. Development of automatic cross section compilation system for MCNP

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Sakurai, Kiyoshi

    1999-01-01

    Development of a code system to automatically convert cross-sections for MCNP is in progress. The NJOY code is generally used to convert data compiled in the ENDF format (Evaluated Nuclear Data File, BNL) into the cross-section libraries required by various reactor physics codes. The cross-section library FSXLIB-J3R2 was already converted from the JENDL-3.2 version of the Japanese Evaluated Nuclear Data Library for the continuous-energy Monte Carlo code MCNP, but that library contains only cross-sections at room temperature (300 K). In response to user requests for cross-sections at higher temperatures, say 600 K or 900 K, a code system named 'autonj' is under development to provide MCNP cross-section libraries at arbitrary temperatures. The system can accept data formats adopted in JENDL that cannot be treated by the NJOY code. The input preparation that NJOY otherwise requires for every nuclide is greatly reduced, because the system converts as many nuclides as the user wants in a single execution. A few MCNP runs were performed for verification using the two libraries FSXLIB-J3R2 and the 'autonj' output. The MCNP results were identical within statistical errors, showing that the 'autonj' output library is correct. In FY 1998 the system will be completed, and in FY 1999 the user's manual will be published. (K. Tsuchihashi)
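
    The batching idea behind such a system can be sketched as follows; the request structure and the conversion routine below are hypothetical placeholders, not the actual autonj or NJOY interface.

        # Hypothetical sketch only: the real 'autonj' input format and NJOY driver
        # are not reproduced here.  The point is the batching idea: one request
        # listing nuclides and temperatures replaces repeated per-nuclide setups.
        from itertools import product

        def run_conversion(nuclide, temperature_k):
            # Placeholder for invoking the evaluated-data -> MCNP library chain.
            print(f"processing {nuclide} at {temperature_k} K")

        request = {
            "nuclides": ["U-235", "U-238", "Fe-56"],      # hypothetical request list
            "temperatures_k": [300.0, 600.0, 900.0],
        }

        for nuclide, temp in product(request["nuclides"], request["temperatures_k"]):
            run_conversion(nuclide, temp)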

  18. Compilation of a preliminary checklist for the differential diagnosis of neurogenic stuttering

    Directory of Open Access Journals (Sweden)

    Mariska Lundie

    2014-06-01

    Objectives: The aim of this study was to describe and highlight the characteristics of neurogenic stuttering (NS) in order to compile a preliminary checklist for accurate diagnosis and intervention. Method: An explorative, applied mixed-method, multiple case study research design was followed. Purposive sampling was used to select four participants. A comprehensive assessment battery was compiled for data collection. Results: The results revealed a distinct pattern of core stuttering behaviours in NS, although discrepancies existed regarding stuttering severity and frequency. It was also found that developmental stuttering (DS) and NS can co-occur. The case history and the core stuttering pattern are important considerations during differential diagnosis, as these are the only consistent characteristics in people with NS. Conclusion: It is unlikely that all the symptoms of NS are present in an individual. The researchers scrutinised the findings of this study and the findings of previous literature to compile a potentially workable checklist.

  19. RESEARCH AND PRACTICE OF THE NEWS MAP COMPILATION SERVICE

    Directory of Open Access Journals (Sweden)

    T. Zhao

    2018-04-01

    Based on the needs of the news media for maps, this paper examines the news map compilation service. It surveys the demand for news map compilation, designs and compiles a public, authoritative base map suitable for media publication, and builds a library of news base map materials. It studies the compilation of timely, highly targeted and cross-regional domestic and international news maps, builds a thematic gallery of hot news and news map customization services, investigates the types of news maps, establishes closer liaison and cooperation with the news media, and guides the news media in using correct maps. Drawing on the practice of the news map compilation service, the paper presents two cases in which different media used the service, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and suggestions for its development.

  20. Compiling for Novel Scratch Pad Memory based Multicore Architectures for Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shrivastava, Aviral

    2016-02-05

    The objective of this proposal is to develop tools and techniques (in the compiler) to manage the data of a task and the communication among tasks on the scratch pad memory (SPM) of the core, so that any application (a set of tasks) can be executed efficiently on an SPM-based manycore architecture.

  1. Digital compilation bedrock geologic map of the Mt. Ellen quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-6A Stanley, RS, Walsh, G, Tauvers, PR, DiPietro, JA, and DelloRusso, V, 1995, Digital compilation bedrock geologic map of the Mt. Ellen...

  2. Digital compilation bedrock geologic map of the South Mountain quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-3A Stanley, R.S., DelloRusso, V., Tauvers, P.R., DiPietro, J.A., Taylor, S., and Prahl, C., 1995, Digital compilation bedrock geologic map of...

  3. Compilation of a soil map for Nigeria: a nation-wide soil resource ...

    African Journals Online (AJOL)

    This paper presents the results of a nation-wide soil and land form inventory of Nigeria. The data compilation was conducted in the framework of two projects with the objective to calculate agricultural production potential under different input levels and assess the water erosion hazard. The information on spatial distribution ...

  4. Writing Compilers and Interpreters A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

    Long-awaited revision to a unique guide that covers both compilers and interpreters. Revised, updated, and now focusing on Java instead of C++, this latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  5. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  6. Fusing a Transformation Language with an Open Compiler

    NARCIS (Netherlands)

    Kalleberg, K.T.; Visser, E.

    2007-01-01

    Program transformation systems provide powerful analysis and transformation frameworks as well as concise languages for language processing, but instantiating them for every subject language is an arduous task, most often resulting in half-completed frontends. Compilers provide mature frontends with

  7. AICPA allows low-cost options for compiled financial statements.

    Science.gov (United States)

    Reinstein, Alan; Luecke, Randall W

    2002-02-01

    The AICPA Accounting and Review Services Committee's (ARSC) SSARS No. 8, Amendment to Statement on Standards for Accounting and Review Services No. 1, Compilation and Review of Financial Statements, issued in October 2000, allows financial managers to provide plain-paper, compiled financial statements for the exclusive use of management. Such financial statements were disallowed in 1979 when the AICPA issued SSARS No. 1, Compilation and Review of Financial Statements. With the issuance of SSARS No. 8, financial managers can prepare plain-paper, compiled financial statements when third parties are not expected to rely on the financial statements, management acknowledges such restrictions in writing, and management acknowledges its primary responsibility for the adequacy of the financial statements.

  8. Specification and Compilation of Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Geuns, S.J.

    2015-01-01

    This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. An example of such applications are software defined radio applications. These applications typically

  9. Borrowing and Dictionary Compilation: The Case of the Indigenous ...

    African Journals Online (AJOL)

    rbr

    Keywords: BORROWING, DICTIONARY COMPILATION, INDIGENOUS LANGUAGES, LEXICON, MORPHEME, VOCABULARY, DEVELOPING LANGUAGES, LOAN WORDS, TERMINOLOGY, ETYMOLOGY, LEXICOGRAPHY. Summary: Borrowing and dictionary compilation: The case of indigenous South African ...

  10. Solidify, An LLVM pass to compile LLVM IR into Solidity

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-12

    The software currently compiles LLVM IR into Solidity (Ethereum’s dominant programming language) using LLVM’s pass library. Specifically, this compiler allows us to convert an arbitrary DSL into Solidity. We focus specifically on converting domain-specific languages into Solidity due to their ease of use and provable properties. By creating a toolchain to compile lightweight domain-specific languages into Ethereum's dominant language, Solidity, we allow non-specialists to effectively develop safe and useful smart contracts. For example, lawyers from a certain firm can have a proprietary DSL that codifies basic laws safely converted to Solidity to be securely executed on the blockchain. In another example, a simple provenance tracking language can be compiled and securely executed on the blockchain.
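
    The following toy Python sketch is not Solidify itself (which is an LLVM pass), but it illustrates the general idea of lowering a small expression DSL into the text of a Solidity function; the DSL and the function names are invented.

        # Toy illustration (not Solidify): lower a tiny arithmetic DSL into the
        # source text of a Solidity function.  The DSL is invented for this sketch.
        def compile_expr(expr):
            """expr is either a variable name or a nested tuple like
            ('add', ('mul', 'a', 'b'), 'c')."""
            if isinstance(expr, str):
                return expr
            op, lhs, rhs = expr
            symbol = {"add": "+", "sub": "-", "mul": "*"}[op]
            return f"({compile_expr(lhs)} {symbol} {compile_expr(rhs)})"

        def emit_solidity(name, params, expr):
            args = ", ".join(f"uint256 {p}" for p in params)
            return (f"function {name}({args}) public pure returns (uint256) {{\n"
                    f"    return {compile_expr(expr)};\n}}")

        # Emits a valid Solidity function computing (a * b) + c.
        print(emit_solidity("fee", ["a", "b", "c"], ("add", ("mul", "a", "b"), "c")))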

  11. SVM Support in the Vienna Fortran Compilation System

    OpenAIRE

    Brezany, Peter; Gerndt, Michael; Sipkova, Viera

    1994-01-01

    Vienna Fortran, a machine-independent language extension to Fortran which allows the user to write programs for distributed-memory systems using global addresses, provides the forall-loop construct for specifying irregular computations that do not cause inter-iteration dependences. Compilers for distributed-memory systems generate code that is based on runtime analysis techniques and is only efficient if, in addition, aggressive compile-time optimizations are applied. Since these optimization...

  12. DLVM: A modern compiler infrastructure for deep learning systems

    OpenAIRE

    Wei, Richard; Schwartz, Lane; Adve, Vikram

    2017-01-01

    Deep learning software demands reliability and performance. However, many of the existing deep learning frameworks are software libraries that act as an unsafe DSL in Python and a computation graph interpreter. We present DLVM, a design and implementation of a compiler infrastructure with a linear algebra intermediate representation, algorithmic differentiation by adjoint code generation, domain-specific optimizations and a code generator targeting GPU via LLVM. Designed as a modern compiler ...

  13. Compilation and analysis of Escherichia coli promoter DNA sequences.

    OpenAIRE

    Hawley, D K; McClure, W R

    1983-01-01

    The DNA sequences of 168 promoter regions (-50 to +10) for Escherichia coli RNA polymerase were compiled. The complete listing was divided into two groups depending upon whether or not the promoter had been defined by genetic (promoter mutations) or biochemical (5' end determination) criteria. A consensus promoter sequence based on homologies among 112 well-defined promoters was determined that was in substantial agreement with previous compilations. In addition, we have tabulated 98 promoter ...
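
    The consensus-building step described above can be illustrated with a minimal Python sketch; the three aligned hexamers are toy stand-ins for the compiled -10 regions.

        # Minimal sketch of deriving a consensus sequence from aligned promoter
        # fragments.  The three 6-mers below are illustrative only.
        from collections import Counter

        aligned = ["TATAAT", "TACAAT", "TATACT"]   # toy aligned -10 hexamers

        def consensus(seqs):
            # For each alignment column, take the most frequent base.
            return "".join(Counter(col).most_common(1)[0][0] for col in zip(*seqs))

        print(consensus(aligned))   # -> 'TATAAT', the canonical -10 consensus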

  14. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1984, July-September. Volume 9, No. 3

    International Nuclear Information System (INIS)

    1984-11-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. These precede the following indexes: Contractor Report Number, Personal Author, Subject, NRC Originating Organization (Staff Reports), NRC Contract Sponsor (Contractor Reports), Contractor, and Licensed Facility

  15. Properties of lanthanum hexaboride a compilation

    CERN Document Server

    Fisher, D J

    2013-01-01

    Lanthanum hexaboride is useful because it possesses a high melting point (2210 °C), a low work function, one of the highest known electron emissivities, and is stable in vacuum. This volume summarises the extant data on the properties of this material, including the: bulk modulus, conductivity, crystal structure, Debye temperature, defect structure, elastic constants, electronic structure, emissivity, Fermi surface, hardness, heat capacity, magnetoresistance, reflectivity, resistivity, specific heat, surface structure, thermal conductivity, thermoelectric power, toughness and work function. The

  16. Fiscal 2000 report on advanced parallelized compiler technology. Outlines; 2000 nendo advanced heiretsuka compiler gijutsu hokokusho (Gaiyo hen)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    Research and development was carried out on automatic parallelizing compiler technology, which improves the practical performance, cost/performance ratio and ease of use of the multiprocessor systems now used for constructing supercomputers and expected to provide a fundamental architecture for microprocessors in the 21st century. Efforts were made to develop an automatic multigrain parallelization technology for extracting multigrain parallelism from a program and making full use of it, and a parallelizing tuning technology for accelerating parallelization by feeding back to the compiler the dynamic information and user knowledge acquired during execution. Moreover, a benchmark program was selected, and studies were made to set execution rules and evaluation indexes for the establishment of technologies for evaluating the performance of parallelizing compilers on existing commercial parallel processing computers, which was achieved through the implementation and evaluation of the 'Advanced parallelizing compiler technology research and development project.' (NEDO)

  17. OMPC: an Open-Source MATLAB-to-Python Compiler.

    Science.gov (United States)

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.
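
    The kind of syntax adaptation mentioned above can be illustrated, very loosely, with a toy rewrite of a single MATLAB construct into numpy-based Python; this is an invented example and not OMPC's actual translation machinery.

        # Not OMPC's actual machinery; just a toy illustration of "syntax
        # adaptation": rewriting one common MATLAB construct into numpy-based Python.
        import re

        def adapt_zeros(matlab_line):
            # e.g.  "A = zeros(3,4);"  ->  "A = numpy.zeros((3, 4))"
            # (the generated line assumes numpy is imported in the target program)
            m = re.match(r"\s*(\w+)\s*=\s*zeros\((\d+)\s*,\s*(\d+)\);?", matlab_line)
            if not m:
                raise ValueError("unsupported line in this toy example")
            name, rows, cols = m.groups()
            return f"{name} = numpy.zeros(({rows}, {cols}))"

        print(adapt_zeros("A = zeros(3,4);"))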

  18. Research at GANIL. A compilation 1996-1997

    Energy Technology Data Exchange (ETDEWEB)

    Balanzat, E.; Bex, M.; Galin, J.; Geswend, S. [eds.

    1998-12-01

    The present compilation gives an overview of experimental results obtained with the GANIL facility during the period 1996-1997. It includes nuclear physics activities as well as interdisciplinary research. The scientific domain presented here extends well beyond traditional nuclear physics and includes atomic physics, condensed matter physics, nuclear astrophysics, radiation chemistry, radiobiology as well as applied physics. In the nuclear physics field, many new results have been obtained concerning nuclear structure as well as the dynamics of nuclear collisions and the nuclear disassembly of complex systems. Results presented deal in particular with the problem of energy equilibration, timescales and the origin of multifragmentation. Nuclear structure studies using both stable and radioactive beams deal with halo systems, the study of shell closures far from stability, the existence of nuclear molecules, as well as measurements of fundamental data such as half-lives, nuclear masses, nuclear radii, and quadrupole and magnetic moments. In addition to traditional fields of atomic and solid state physics, new themes such as radiation chemistry and radiobiology are progressively being tackled. (K.A.)

  19. Licensee Event Report (LER) compilation, December 1991

    International Nuclear Information System (INIS)

    1992-01-01

    This monthly report contains licensee event report (LER) operational information that was processed into the LER data file of the Nuclear Safety Information Center (NSIC) during the one month period identified on the cover of the document. The LERs, from which this information is derived, are submitted to the Nuclear Regulatory Commission (NRC) by nuclear power plant licensees in accordance with federal regulations. Procedures for LER reporting for revisions to those events occurring prior to 1984 are described in NRC Regulatory Guide 1.16 and NUREG-0161, Instructions for Preparation of Data Entry Sheets for Licensee Event Reports. For those events occurring on and after January 1, 1984, LERs are being submitted in accordance with the revised rule contained in Title 10 Part 50.73 of the Code of Federal Regulations (10 CFR 50.73 - Licensee Event Report System) which was published in the Federal Register (Vol. 48, No. 144) on July 26, 1983. NUREG-1022, Licensee Event Report System - Description of Systems and Guidelines for Reporting, provides supporting guidance and information on the revised LER rule

  20. Licensee Event Report (LER) compilation, March 1992

    International Nuclear Information System (INIS)

    1992-05-01

    This monthly report contains Licensee Event Report (LER) operational information that was processed into the LER data file of the Nuclear Operations Analysis Center (NOAC) during the one month period identified on the cover of the document. The LERs, from which this information is derived, are submitted to the Nuclear Regulatory Commission (NRC) by nuclear power plant licensees in accordance with federal regulations. Procedures for LER reporting for revisions to those events occurring prior to 1984 are described in NRC Regulatory Guide 1.16 and NUREG-0161, Instructions for Preparation of Data Entry Sheets for Licensee Event Reports. For those events occurring on and after January 1, 1984, LERs are being submitted in accordance with the revised rule contained in Title 10 Part 50.73 of the Code of Federal Regulations (10 CFR 50.73 -- Licensee Event Report System) which was published in the Federal Register (Vol. 48, No. 144) on July 26, 1983. NUREG-1022, Licensee Event Report System -- Description of Systems and Guidelines for Reporting, provides supporting guidance and information on the revised LER rule. The LER summaries in this report are arranged alphabetically by facility name and then chronologically by event date for each facility. Component, system, keyword, and component vendor indexes follow the summaries. Vendors are those identified by the utility when the LER form is initiated; the keywords for the component, system, and general keyword indexes are assigned by the computer using correlation tables from the Sequence Coding and Search System

  1. Licensee Event Report (LER) compilation, April 1991

    International Nuclear Information System (INIS)

    1991-05-01

    This monthly report contains Licensee Event Report (LER) operational information that was processed into the LER data file of the Nuclear Safety Information Center (NSIC) during the one month period identified on the cover of the document. The LERs, from which this information is derived, are submitted to the Nuclear Regulatory Commission (NRC) by nuclear power plant licensees in accordance with federal regulations. Procedures for LER reporting for revisions to those events occurring prior to 1984 are described in NRC Regulatory Guide 1.16 and NUREG-0161, Instructions for Preparation of Data Entry Sheets for Licensee Event Reports. For those events occurring on and after January 1, 1984, LERs are being submitted in accordance with the revised rule contained in Title 10 Part 50.73 of the Code of Federal Regulations (10 CFR 50.73 -- Licensee Event Report System) which was published in the Federal Register (Vol. 48, No. 144) on July 26, 1983. NUREG-1022, Licensee Event Report System -- Description of Systems and Guidelines for Reporting, provides supporting guidance and information on the revised LER rule. The LER summaries in this report are arranged alphabetically by facility name and then chronologically by event date for each facility. Component, system, keyword, and component vendor indexes follow the summaries. Vendors are those identified by the utility when the LER form is initiated; the keywords for the component, system, and general keyword indexes are assigned by the computer using correlation tables from the Sequence Coding and Search System

  2. The discovery of isotopes a complete compilation

    CERN Document Server

    Thoennessen, Michael

    2016-01-01

    This book describes the exciting discovery of every isotope observed on earth to date, which currently numbers some 3000. For each isotope a short essay highlights the authors of the first publication for the isotope, the laboratory and year where and when the isotope was discovered, as well as details about the production and detection methods used. In controversial cases previous claims are also discussed. At the end a comprehensive table lists all isotopes sorted by element, together with a complete list of references. Preliminary versions of these paragraphs have been published over the last few years as separate articles in the journal "Atomic Data and Nuclear Data Tables". The work re-evaluates all assignments, judging them with a uniform set of criteria. In addition, the author includes over 100 new isotopes which have been discovered since those articles were published. This book is a source of information for researchers and enthusiastic laymen alike. From the prepublication review: “The explanations focus ...

  3. A Coarse-Grained Reconfigurable Architecture with Compilation for High Performance

    Directory of Open Access Journals (Sweden)

    Lu Wan

    2012-01-01

    We propose a fast data relay (FDR) mechanism to enhance existing CGRAs (coarse-grained reconfigurable architectures). FDR can not only provide multicycle data transmission concurrently with computations but also convert resource-demanding inter-processing-element global data accesses into local data accesses to avoid communication congestion. We also propose supporting compiler techniques that can efficiently utilize the FDR feature to achieve higher performance for a variety of applications. Our results on the FDR-based CGRA are compared with two other works in this field: ADRES and RCP. Experimental results for various multimedia applications show that FDR combined with the new compiler delivers up to 29% and 21% higher performance than ADRES and RCP, respectively.

  4. Statistical compilation of NAPAP chemical erosion observations

    Science.gov (United States)

    Mossotti, Victor G.; Eldeeb, A. Raouf; Reddy, Michael M.; Fries, Terry L.; Coombs, Mary Jane; Schmiermund, Ron L.; Sherwood, Susan I.

    2001-01-01

    In the mid 1980s, the National Acid Precipitation Assessment Program (NAPAP), in cooperation with the National Park Service (NPS) and the U.S. Geological Survey (USGS), initiated a Materials Research Program (MRP) that included a series of field and laboratory studies with the broad objective of providing scientific information on acid rain effects on calcareous building stone. Among the several effects investigated, the chemical dissolution of limestone and marble by rainfall was given particular attention because of the pervasive appearance of erosion effects on cultural materials situated outdoors. In order to track the chemical erosion of stone objects in the field and in the laboratory, the Ca2+ ion concentration was monitored in the runoff solution from a variety of test objects located both outdoors and under more controlled conditions in the laboratory. This report provides a graphical and statistical overview of the Ca2+ chemistry in the runoff solutions from (1) five urban and rural sites (DC, NY, NJ, NC, and OH) established by the MRP for materials studies over the period 1984 to 1989, (2) subevent study at the New York MRP site, (3) in situ study of limestone and marble monuments at Gettysburg, (4) laboratory experiments on calcite dissolution conducted by Baedecker, (5) laboratory simulations by Schmiermund, and (6) laboratory investigation of the surface reactivity of calcareous stone conducted by Fries and Mossotti. The graphical representations provided a means for identifying erroneous data that can randomly appear in a database when field operations are semi-automated; a purged database suitable for the evaluation of quantitative models of stone erosion is appended to this report. An analysis of the sources of statistical variability in the data revealed that the rate of stone erosion is weakly dependent on the type of calcareous stone, the ambient temperature, and the H+ concentration delivered in the incident rain. The analysis also showed

  5. CAPS OpenACC Compilers: Performance and Portability

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    The announcement late 2011 of the new OpenACC directive-based programming standard, supported by the CAPS, CRAY and PGI compilers, has opened the door to more scientific applications that can be ported to many-core systems. Following a porting methodology, this talk will first review the principles of programming with OpenACC and then the advanced features available in the CAPS compilers to further optimize OpenACC applications: library integration, tuning directives with auto-tune mechanisms to build applications adaptive to different GPUs. CAPS compilers use hardware vendors' backends such as NVIDIA CUDA and OpenCL, making them the only OpenACC compilers supporting various many-core architectures. About the speaker: Stéphane Bihan is co-founder and currently Director of Sales and Marketing at CAPS enterprise. He has held several R&D positions in companies such as ARC International plc in London, Canon Research Center France, ACE compiler experts in Amsterdam and the INRIA r...

  6. Shear-wave velocity compilation for Northridge strong-motion recording sites

    Science.gov (United States)

    Borcherdt, Roger D.; Fumal, Thomas E.

    2002-01-01

    Borehole and other geotechnical information collected at the strong-motion recording sites of the Northridge earthquake of January 17, 1994 provides an important new basis for the characterization of local site conditions. These geotechnical data, when combined with analysis of strong-motion recordings, provide an empirical basis to evaluate the site coefficients used in current versions of US building codes. Shear-wave-velocity estimates to a depth of 30 meters are derived for 176 strong-motion recording sites. The estimates are based on borehole shear-velocity logs, physical property logs, correlations with physical properties and digital geologic maps. Surface-wave velocity measurements and standard penetration data are compiled as additional constraints. These data, compiled from a variety of databases, are presented via GIS maps and corresponding tables to facilitate use by other investigators.
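
    The 30-m time-averaged shear-wave velocity (Vs30) that underlies such site characterizations is conventionally computed as 30 m divided by the shear-wave travel time through the top 30 m; a small Python sketch with a hypothetical layer profile is shown below.

        # Sketch of the standard time-averaged shear-wave velocity to 30 m (Vs30)
        # computed from borehole layer thicknesses and velocities; the profile
        # values below are hypothetical.
        def vs30(layers):
            """layers: list of (thickness_m, vs_m_per_s), summing to >= 30 m."""
            depth, travel_time = 0.0, 0.0
            for thickness, vs in layers:
                use = min(thickness, 30.0 - depth)     # clip at 30 m depth
                travel_time += use / vs
                depth += use
                if depth >= 30.0:
                    break
            return 30.0 / travel_time

        profile = [(5.0, 180.0), (10.0, 250.0), (20.0, 400.0)]   # hypothetical log
        print(f"Vs30 = {vs30(profile):.0f} m/s")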

  7. Evaluation and compilation of fission product yields 1993

    International Nuclear Information System (INIS)

    England, T.R.; Rider, B.F.

    1995-01-01

    This document is the latest in a series of compilations of fission yield data. Fission yield measurements reported in the open literature and calculated charge distributions have been used to produce a recommended set of yields for the fission products. The original data with reference sources, and the recommended yields are presented in tabular form. These include many nuclides which fission by neutrons at several energies. These energies include thermal energies (T), fission spectrum energies (F), 14 MeV High Energy (H or HE), and spontaneous fission (S), in six sets of ten each. Set A includes U235T, U235F, U235HE, U238F, U238HE, Pu239T, Pu239F, Pu241T, U233T, Th232F. Set B includes U233F, U233HE, U236F, Pu239H, Pu240F, Pu241F, Pu242F, Th232H, Np237F, Cf252S. Set C includes U234F, U237F, Pu240H, U234HE, U236HE, Pu238F, Am241F, Am243F, Np238F, Cm242F. Set D includes Th227T, Th229T, Pa231F, Am241T, Am241H, Am242MT, Cm245T, Cf249T, Cf251T, Es254T. Set E includes Cf250S, Cm244S, Cm248S, Es253S, Fm254S, Fm255T, Fm256S, Np237H, U232T, U238S. Set F includes Cm243T, Cm246S, Cm243F, Cm244F, Cm246F, Cm248F, Pu242H, Np237T, Pu240T, and Pu242T to complete fission product yield evaluations for 60 fissioning systems in all. This report also serves as the primary documentation for the second evaluation of yields in ENDF/B-VI released in 1993.

  8. Evaluation and compilation of fission product yields 1993

    Energy Technology Data Exchange (ETDEWEB)

    England, T.R.; Rider, B.F.

    1995-12-31

    This document is the latest in a series of compilations of fission yield data. Fission yield measurements reported in the open literature and calculated charge distributions have been used to produce a recommended set of yields for the fission products. The original data with reference sources, and the recommended yields are presented in tabular form. These include many nuclides which fission by neutrons at several energies. These energies include thermal energies (T), fission spectrum energies (F), 14 MeV High Energy (H or HE), and spontaneous fission (S), in six sets of ten each. Set A includes U235T, U235F, U235HE, U238F, U238HE, Pu239T, Pu239F, Pu241T, U233T, Th232F. Set B includes U233F, U233HE, U236F, Pu239H, Pu240F, Pu241F, Pu242F, Th232H, Np237F, Cf252S. Set C includes U234F, U237F, Pu240H, U234HE, U236HE, Pu238F, Am241F, Am243F, Np238F, Cm242F. Set D includes Th227T, Th229T, Pa231F, Am241T, Am241H, Am242MT, Cm245T, Cf249T, Cf251T, Es254T. Set E includes Cf250S, Cm244S, Cm248S, Es253S, Fm254S, Fm255T, Fm256S, Np237H, U232T, U238S. Set F includes Cm243T, Cm246S, Cm243F, Cm244F, Cm246F, Cm248F, Pu242H, Np237T, Pu240T, and Pu242T to complete fission product yield evaluations for 60 fissioning systems in all. This report also serves as the primary documentation for the second evaluation of yields in ENDF/B-VI released in 1993.

  9. Fifth Baltic Sea pollution load compilation (PLC-5)

    Energy Technology Data Exchange (ETDEWEB)

    Knuuttila, S.; Svendsen, L. M.; Staaf, H.; Kotilainen, P.; Boutrup, S.; Pyhala, M.; Durkin, M.

    2011-07-01

    This report includes the main results from the Fifth Pollution Load Compilation, abbreviated PLC-5. It includes quantified annual waterborne total loads (from rivers, unmonitored and coastal areas, as well as direct point and diffuse sources discharging directly to the Baltic Sea) from 1994 to 2008, to provide a basis for evaluating any decreasing (or increasing) trends in the total waterborne inputs to the Baltic Sea. Chapter 1 contains the objectives of the PLC and the framework for the classification of inputs and sources. Chapter 2 includes a short description of the Baltic Sea catchment area, while the methods for quantification and analysis, together with quality assurance topics, are briefly introduced in Chapter 3. More detailed information on methodologies is presented in the PLC-5 guidelines (HELCOM 2006). Chapter 4 reports the total inputs of nutrients and selected heavy metals to the Baltic Sea. Furthermore, the results of the quantification of discharges and losses of nitrogen and phosphorus from point and diffuse sources into inland surface waters within the Baltic Sea catchment area (source-oriented approach, or gross loads), as well as the total load to the maritime area (load-oriented approach, or net loads) in 2006, are shown. Typically, results are presented by country and by main Baltic Sea sub-region. In Chapter 5, flow normalization is introduced and the results of trend analyses on the 1994-2008 time series of total waterborne loads of nitrogen and phosphorus are given, together with a first evaluation of progress towards the provisional reduction targets by country and by main Baltic Sea sub-region. Chapter 6 includes discussion of some of the main conclusions and advice for future PLCs. The annexes contain the flow-normalized annual load data and figures and tables with results from the PLC-5.
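
    Flow normalization, mentioned in connection with Chapter 5, can be illustrated with a simple regression-based sketch; this is a generic approach with invented annual values, not necessarily the exact PLC-5 procedure.

        # Illustrative flow normalization (not necessarily the PLC-5 procedure):
        # regress annual load on annual runoff and remove the runoff-driven part,
        # so that year-to-year trends reflect changes in sources rather than in
        # wet or dry years.  The annual values below are invented.
        def flow_normalize(loads, flows):
            n = len(loads)
            mean_q = sum(flows) / n
            mean_l = sum(loads) / n
            slope = (sum((q - mean_q) * (l - mean_l) for q, l in zip(flows, loads))
                     / sum((q - mean_q) ** 2 for q in flows))
            return [l - slope * (q - mean_q) for l, q in zip(loads, flows)]

        loads = [410.0, 520.0, 380.0, 470.0]   # kt N/year, hypothetical
        flows = [420.0, 560.0, 390.0, 500.0]   # km3/year, hypothetical
        print([round(x) for x in flow_normalize(loads, flows)])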

  10. The Katydid system for compiling KEE applications to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    Components of a system known as Katydid are developed in an effort to compile knowledge-based systems developed in a multimechanism integrated environment (KEE) to Ada. The Katydid core is an Ada library supporting KEE object functionality, and the other elements include a rule compiler, a LISP-to-Ada translator, and a knowledge-base dumper. Katydid employs translation mechanisms that convert LISP knowledge structures and rules to Ada and utilizes basic prototypes of a run-time KEE object-structure library module for Ada. Preliminary results include the semiautomatic compilation of portions of a simple expert system to run in an Ada environment with the described algorithms. It is suggested that Ada can be employed for AI programming and implementation, and the Katydid system is being developed to include concurrency and synchronization mechanisms.

  11. ZettaBricks: A Language Compiler and Runtime System for Anyscale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Amarasinghe, Saman [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2015-03-27

    This grant supported the ZettaBricks and OpenTuner projects. ZettaBricks is a new implicitly parallel language and compiler where defining multiple implementations of multiple algorithms to solve a problem is the natural way of programming. ZettaBricks makes algorithmic choice a first class construct of the language. Choices are provided in a way that also allows our compiler to tune at a finer granularity. The ZettaBricks compiler autotunes programs by making both fine-grained as well as algorithmic choices. Choices also include different automatic parallelization techniques, data distributions, algorithmic parameters, transformations, and blocking. Additionally, ZettaBricks introduces novel techniques to autotune algorithms for different convergence criteria. When choosing between various direct and iterative methods, the ZettaBricks compiler is able to tune a program in such a way that delivers near-optimal efficiency for any desired level of accuracy. The compiler has the flexibility of utilizing different convergence criteria for the various components within a single algorithm, providing the user with accuracy choice alongside algorithmic choice. OpenTuner is a generalization of the experience gained in building an autotuner for ZettaBricks. OpenTuner is a new open source framework for building domain-specific multi-objective program autotuners. OpenTuner supports fully-customizable configuration representations, an extensible technique representation to allow for domain-specific techniques, and an easy to use interface for communicating with the program to be autotuned. A key capability inside OpenTuner is the use of ensembles of disparate search techniques simultaneously; techniques that perform well will dynamically be allocated a larger proportion of tests.
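
    The ensemble idea, allocating more trials to search techniques that perform well, can be sketched in a few lines of Python; this toy is not the OpenTuner API, and the cost function and techniques are invented.

        # Toy sketch of an ensemble of search techniques (this is not the OpenTuner
        # API): techniques whose proposals improve the best-known configuration are
        # given a larger share of future trials.
        import random

        def cost(cfg):                      # stand-in for running the real program
            return (cfg["block"] - 48) ** 2 + 10 * (cfg["algo"] == "iterative")

        def random_search(_best):
            return {"block": random.randint(1, 128),
                    "algo": random.choice(["direct", "iterative"])}

        def hill_climb(best):
            return {"block": max(1, best["block"] + random.randint(-4, 4)),
                    "algo": best["algo"]}

        techniques = {"random": random_search, "hill": hill_climb}
        weights = {name: 1.0 for name in techniques}
        best = {"block": 64, "algo": "direct"}

        for _ in range(200):
            name = random.choices(list(weights), weights=list(weights.values()))[0]
            candidate = techniques[name](best)
            if cost(candidate) < cost(best):
                best = candidate
                weights[name] += 1.0        # reward the technique that improved things
        print(best, cost(best))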

  12. Compiling gate networks on an Ising quantum computer

    International Nuclear Information System (INIS)

    Bowdrey, M.D.; Jones, J.A.; Knill, E.; Laflamme, R.

    2005-01-01

    Here we describe a simple mechanical procedure for compiling a quantum gate network into the natural gates (pulses and delays) for an Ising quantum computer. The aim is not necessarily to generate the most efficient pulse sequence, but rather to develop an efficient compilation algorithm that can be easily implemented in large spin systems. The key observation is that it is not always necessary to refocus all the undesired couplings in a spin system. Instead, the coupling evolution can simply be tracked and then corrected at some later time. Although described within the language of NMR, the algorithm is applicable to any design of quantum computer based on Ising couplings
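
    The "track now, correct later" bookkeeping can be sketched as follows; the spin labels, coupling constants and delays are invented, and the sketch only records the residual ZZ phases that a later correction step would have to remove.

        # Bookkeeping sketch of the track-and-correct idea: accumulate the
        # ZZ-coupling phase picked up by every spin pair during free-evolution
        # delays instead of refocusing it immediately.  Values are invented.
        import math

        couplings_hz = {("s1", "s2"): 50.0, ("s2", "s3"): 35.0, ("s1", "s3"): 5.0}
        accumulated = {pair: 0.0 for pair in couplings_hz}

        def delay(seconds):
            # During a delay every pair evolves under its Ising coupling.
            for pair, j_hz in couplings_hz.items():
                accumulated[pair] += 2.0 * math.pi * j_hz * seconds

        delay(0.004)          # delays interleaved with pulses in the real sequence
        delay(0.0025)

        # Later, a correction step only needs the residual phase modulo 2*pi.
        for pair, phase in accumulated.items():
            print(pair, f"residual ZZ phase = {phase % (2 * math.pi):.3f} rad")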

  13. Northern hemisphere mid-latitude geomagnetic anomaly revealed from Levantine Archaeomagnetic Compilation (LAC).

    Science.gov (United States)

    Shaar, R.; Tauxe, L.; Agnon, A.; Ben-Yosef, E.; Hassul, E.

    2015-12-01

    The rich archaeological heritage of Israel and nearby Levantine countries provides a unique opportunity for archaeomagnetic investigation at high resolution. Here we present a summary of our ongoing effort to reconstruct geomagnetic variations of the past several millennia in the Levant at decadal to millennial resolution. This effort in the Southern Levant, namely the "Levantine Archaeomagnetic Compilation" (LAC), presently consists of data from over 650 well-dated archaeological objects including pottery, slag, ovens, and furnaces. In this talk we review the methodological challenges in achieving a robust master secular variation curve, with realistic error estimates, from a large number of different datasets. We present the current status of the compilation, including the southern and western Levant LAC data (Israel, Cyprus, and Jordan) and other published north-eastern Levant data (Syria and southern Turkey), and outline the main findings emerging from these data. The main feature apparent from the new compilation is an extraordinary intensity high that developed over the Levant region during the first two millennia BCE. The climax of this event is a double-peaked intensity maximum starting at ca. 1000 BCE and ending at ca. 735 BCE, accompanied by at least two geomagnetic spike events. Paleomagnetic directions from this period show anomalies of up to 20 degrees away from the averaged GAD field. This leads us to postulate that the intensity maximum is a manifestation of an intense mid-latitude local positive geomagnetic anomaly that persisted for over two centuries.

  14. Nuclear physics at Ganil. A compilation 1989-1991

    International Nuclear Information System (INIS)

    1991-01-01

    This compilation covers experimental and theoretical work performed at GANIL during 1989-1991 on nuclear structure and nuclear reactions. During this period, the performance of the accelerator was strongly increased, both in the delivered energies and intensities and in the range of accelerated ions. In the experimental areas, a totally new data acquisition system was set up, and the addition of a Wien filter to the LISE spectrometer has turned it into a versatile and efficient isotope separator, called LISE III. The time structure and the large intensity of the beam were decisive in identifying, for the first time, kaon production in heavy-ion collisions at GANIL subthreshold energies. Nucleons have to undergo several collisions before inducing such a process, and the strange-particle emission should be very sensitive to the physical conditions of the hot and compressed interacting zone. Lead and uranium beams, now available at the Fermi energy, have been used to study the nuclear disassembly of very large and heavy systems. New results have been obtained on the collective flow in heavy-ion reactions, giving new insights into the equation-of-state problem. In the field of nuclear structure, the magnetic spectrometer SPEG, coupled with large particle or gamma detectors, has shed light on new aspects of giant resonance excitations. Exotic nuclei are extensively studied, with particular emphasis on the 11Li nucleus. A new method of mass measurement, using the CSS2 as a mass separator, has been successfully tested; it will greatly improve the accuracy achieved on intermediate and heavy nuclei. Last but not least, the theory group is actively working to include fluctuations in the description of nuclear dynamics and to characterise the onset of the multifragmentation process in heavy-ion collisions. Author index and publication list are added

  15. A compiled checklist of seaweeds of Sudanese Red Sea coast

    Directory of Open Access Journals (Sweden)

    Nahid Abdel Rahim Osman

    2016-02-01

    Objective: To present an updated, compiled checklist of Sudanese seaweeds as an example for the region, for conservation as well as development purposes. Methods: The checklist was developed from both field investigations, using the line transect method at 4 sites along the Red Sea coast of Sudan, and a review of the available studies on Sudanese seaweeds. Results: In total, 114 macroalgal names were recorded, distributed in 16 orders, 34 families and 62 genera. The Rhodophyceae comprised 8 orders, 17 families, 32 genera and 47 species. The Phaeophyceae comprised 4 orders, 5 families, 17 genera and 28 species. The 39 species of Chlorophyceae belong to 2 classes, 4 orders, 12 families and 14 genera. The present paper proposes the addition of 11 macroalgal taxa to the Sudan seaweed species list: 3 red, 1 brown and 7 green seaweed species. Conclusions: This list is not yet exhaustive and only represents the macroalgal species common to the intertidal areas of the Sudanese Red Sea coast. Further investigation may reveal the presence of more species. While significant levels of diversity and endemism have been revealed for other groups of organisms in the Red Sea region, similar work still has to be performed for seaweeds. Considering the impact of climate change on community structure and composition and the growing risks from maritime transportation through the Red Sea, particularly from oil tankers, as well as from oil exploration, baseline data on seaweeds are highly required for management purposes.

  16. Compilation of historical information of 300 Area facilities and activities

    International Nuclear Information System (INIS)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information on 300 Area facilities and activities since their beginning. The 300 Area is shown as it looked in 1945, and a more recent (1985) look at the 300 Area is also provided

  17. Statistical Compilation of the ICT Sector and Policy Analysis | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  18. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is a growing need for automated tools that can read, represent, analyze, and transform an application's source code to help carry out testing tasks. However, the support required to compile applications written in common general-purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  19. The Compilation of a Shona Children's Dictionary: Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Peniah Mabaso

    2011-10-01

    Abstract: This article outlines the challenges encountered by the African Languages Research Institute (ALRI) team members in the compilation of the monolingual Shona Children's Dictionary. The focus is mainly on the problems met in headword selection. Solutions by the team members when dealing with these problems are also presented.

  20. Compilation of a global inventory of emissions of nitrous oxide

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

    A global inventory with 1°x1° resolution was compiled of emissions of nitrous oxide (N2O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing,

  1. A Journey from Interpreters to Compilers and Virtual Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We review a simple sequence of steps to stage a programming-language interpreter into a compiler and virtual machine. We illustrate the applicability of this derivation with a number of existing virtual machines, mostly for functional languages. We then outline its relevance for today's language
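
    A minimal Python illustration of this staging, not the paper's own derivation, is an interpreter for a toy arithmetic language alongside the corresponding compiler to a small stack machine and the virtual machine that runs the compiled code.

        # Toy illustration of the interpreter -> compiler + virtual machine staging.
        def interpret(expr):
            if isinstance(expr, int):
                return expr
            op, lhs, rhs = expr
            return interpret(lhs) + interpret(rhs) if op == "+" else interpret(lhs) * interpret(rhs)

        def compile_expr(expr, code=None):
            """Stage the interpreter: emit stack-machine instructions instead of values."""
            code = [] if code is None else code
            if isinstance(expr, int):
                code.append(("PUSH", expr))
            else:
                op, lhs, rhs = expr
                compile_expr(lhs, code)
                compile_expr(rhs, code)
                code.append(("ADD",) if op == "+" else ("MUL",))
            return code

        def run(code):
            """The virtual machine: execute the compiled stack-machine program."""
            stack = []
            for instr in code:
                if instr[0] == "PUSH":
                    stack.append(instr[1])
                else:
                    b, a = stack.pop(), stack.pop()
                    stack.append(a + b if instr[0] == "ADD" else a * b)
            return stack.pop()

        e = ("+", 1, ("*", 2, 3))
        assert interpret(e) == run(compile_expr(e)) == 7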

  2. Not mere lexicographic cosmetics: the compilation and structural ...

    African Journals Online (AJOL)

    This article offers a brief overview of the compilation of the Ndebele music terms dictionary, Isichazamazwi SezoMculo (henceforth the ISM), paying particular attention to its structural features. It emphasises that the reference needs of the users as well as their reference skills should be given a determining role in all ...

  3. Compilation of the nuclear codes available in CTA

    International Nuclear Information System (INIS)

    D'Oliveira, A.B.; Moura Neto, C. de; Amorim, E.S. do; Ferreira, W.J.

    1979-07-01

    The present work is a compilation of some of the nuclear codes available in the Divisao de Estudos Avancados of the Instituto de Atividades Espaciais (EAV/IAE/CTA). The codes are organized according to the classification given by the Argonne National Laboratory. For each code the following are given: author, institution of origin, abstract, programming language and existing bibliography. (Author) [pt

  4. Compilation of historical information of 300 Area facilities and activities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information on 300 Area facilities and activities since their beginning. The 300 Area is shown as it looked in 1945, and a more recent (1985) look at the 300 Area is also provided.

  5. DJ Prinsloo and BP Sathekge (compilers — revised edition).

    African Journals Online (AJOL)

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform pro- spective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the ...

  6. Statistical Compilation of the ICT Sector and Policy Analysis | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  7. Updated site compilation of the Latin American Pollen Database

    NARCIS (Netherlands)

    Flantua, S.G.A.; Hooghiemstra, H.; Grimm, E.C.; Behling, H.; Bush, M.B; González-Arrango, C.; Gosling, W.D.; Ledru, M.-P.; Lozano-Garciá, S.; Maldonado, A.; Prieto, A.R.; Rull, V.; van Boxel, J.H.

    2015-01-01

    The updated inventory of the Latin American Pollen Database (LAPD) offers a wide range of new insights. This paper presents a systematic compilation of palynological research in Latin America. A comprehensive inventory of publications in peer-reviewed and grey literature shows a major expansion of

  8. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    In order to support concept literacy, especially for students for whom English is not the native language, a number of universities in South Africa are compiling multilingual glossaries through which the use of languages other than English may be employed as auxiliary media. Terminologies in languages other than English ...

  9. Approximate Compilation of Constraints into Multivalued Decision Diagrams

    DEFF Research Database (Denmark)

    Hadzic, Tarik; Hooker, John N.; O’Sullivan, Barry

    2008-01-01

    We present an incremental refinement algorithm for approximate compilation of constraint satisfaction models into multivalued decision diagrams (MDDs). The algorithm uses a vertex splitting operation that relies on the detection of equivalent paths in the MDD. Although the algorithm is quite gene...
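
    The sketch below illustrates, in Python, the basic idea of compiling a constraint into a layered decision diagram by sharing nodes that have identical sets of remaining solutions. It performs exact compilation of an explicitly enumerated solution set; it is not the paper's incremental refinement / vertex-splitting algorithm, and the constraint and names are illustrative only.

        # Illustrative exact MDD compilation: nodes with identical sets of
        # remaining solution suffixes are shared via a cache.
        from itertools import product

        def compile_mdd(suffixes, cache):
            """Build a node (dict: value -> child) for a set of solution suffixes."""
            key = frozenset(suffixes)
            if key in cache:
                return cache[key]                  # reuse an equivalent node
            branches = {}
            for s in suffixes:
                if s:                              # branch on first value, keep the rest
                    branches.setdefault(s[0], set()).add(s[1:])
            node = {v: compile_mdd(rest, cache) for v, rest in branches.items()}
            cache[key] = node
            return node

        domains = [range(3)] * 4
        solutions = {t for t in product(*domains) if sum(t) == 4}   # toy constraint
        cache = {}
        root = compile_mdd(solutions, cache)
        print(len(solutions), "solutions stored in", len(cache), "shared MDD nodes")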

  10. National energetic balance. Statistical compilation 1985-1991

    International Nuclear Information System (INIS)

    1992-01-01

    This publication compiles the statistical information supplied by the governmental and private institutions that make up the national energy sector in Paraguay. The first part covers overall energy supply; the second, energy transformation centres; and the last part presents the energy flows, consolidated balances and other economic-energy indicators.

  11. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    account for the multilingual concept literacy glossaries being compiled under the auspices of .... a theory, i.e. the set of premises, arguments and conclusions required for explaining ... fully address cognitive and communicative needs, especially of laypersons. ..... tion at UCT, and in indigenous languages as auxiliary media.

  12. Thoughts and views on the compilation of monolingual dictionaries ...

    African Journals Online (AJOL)

    The end-products should be of a high lexicographic standard, well-balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries could be easily ...

  13. 13 CFR 146.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    13 CFR 146.600 (2010), Business Credit and Assistance, Small Business Administration, New Restrictions on Lobbying: Semi-annual compilation. ... (c) Information that involves intelligence matters shall be reported only to the Select Committee on...

  14. QMODULE: CAMAC modules recognized by the QAL compiler

    International Nuclear Information System (INIS)

    Kellogg, M.; Minor, M.M.; Shlaer, S.; Spencer, N.; Thomas, R.F. Jr.; van der Beken, H.

    1977-10-01

    The compiler for the Q Analyzer Language, QAL, recognizes a certain set of CAMAC modules as having known characteristics. The conventions and procedures used to describe these modules are discussed, as well as the tools available to the user for extending this set as required.

  15. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia; Angulo, C.; Arnould, M.

    2000-01-01

    The rapidly growing wealth of nuclear data becomes less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (authors)
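
    For orientation, a tabulated rate of this kind is essentially the thermal average <sigma*v>, obtained by folding the cross section with a Maxwell-Boltzmann energy distribution: <sigma*v> = sqrt(8/(pi*mu)) (kT)^(-3/2) times the integral of sigma(E) E exp(-E/kT) dE. The Python sketch below performs that folding numerically for a toy cross section; it uses illustrative units and placeholder data, not the NACRE cross sections.

        # Schematic Maxwell-Boltzmann folding of a cross section into a reaction
        # rate per particle pair.  Toy cross section and units, not NACRE data.
        import numpy as np

        def rate_sigma_v(sigma, kT, mu, n=20000):
            """<sigma v> = sqrt(8/(pi*mu)) * (kT)**-1.5 * Int sigma(E) E exp(-E/kT) dE."""
            E = np.linspace(1e-6, 50.0 * kT, n)          # energy grid covering the tail
            integrand = sigma(E) * E * np.exp(-E / kT)
            return np.sqrt(8.0 / (np.pi * mu)) * kT ** -1.5 * np.trapz(integrand, E)

        toy_sigma = lambda E: np.exp(-5.0 / np.sqrt(E)) / E   # crude sub-barrier shape
        for kT in (0.01, 0.03, 0.1):                          # "temperatures" in toy units
            print(kT, rate_sigma_v(toy_sigma, kT, mu=1.0))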

  16. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia

    1999-01-01

    The rapidly growing wealth of nuclear data becomes less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (author)

  17. Remeasurement and compilation of excitation function of proton induced reactions on iron for activation techniques

    International Nuclear Information System (INIS)

    Takacs, S.; Vasvary, L.; Tarkanyi, F.

    1994-01-01

    Excitation functions of the proton induced reaction natFe(p,xn)56Co have been remeasured in the energy region up to 18 MeV using the stacked foil technique and standard high resolution gamma-ray spectrometry at the Debrecen MGC-20E cyclotron. A compilation of the available data measured between 1959 and 1993 has been made. The corresponding excitation functions have been reviewed, and a critical comparison of all the available data was made to obtain the most accurate data set. The feasibility of the evaluated data set was checked by reproducing experimental calibration curves for TLA by calculation. (orig.)
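
    One simple, generic way to turn several measured data sets into a single recommended excitation function is to interpolate them onto a common energy grid and take an inverse-variance weighted mean, as in the Python sketch below. This only illustrates the idea of data compilation; it is not the evaluation procedure used by the authors, and all numbers are invented.

        # Generic illustration: merge several (energy, cross section, uncertainty)
        # data sets onto a common grid with an inverse-variance weighted mean.
        import numpy as np

        def merge_excitation_functions(datasets, grid):
            num = np.zeros_like(grid)
            den = np.zeros_like(grid)
            for E, sig, dsig in datasets:
                s = np.interp(grid, E, sig)                # interpolate onto the grid
                w = 1.0 / np.interp(grid, E, dsig) ** 2    # inverse-variance weights
                num += w * s
                den += w
            return num / den

        grid = np.linspace(6.0, 17.0, 23)                  # MeV
        set_a = (np.array([6.0, 10.0, 14.0, 17.0]),
                 np.array([90.0, 350.0, 310.0, 260.0]),    # cross sections (made up)
                 np.array([9.0, 30.0, 28.0, 22.0]))
        set_b = (np.array([7.0, 11.0, 15.0, 16.5]),
                 np.array([140.0, 345.0, 300.0, 270.0]),
                 np.array([15.0, 32.0, 27.0, 24.0]))
        print(merge_excitation_functions([set_a, set_b], grid))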

  18. Automating Visualization Service Generation with the WATT Compiler

    Science.gov (United States)

    Bollig, E. F.; Lyness, M. D.; Erlebacher, G.; Yuen, D. A.

    2007-12-01

    As tasks and workflows become increasingly complex, software developers are devoting increasing attention to automation tools. Among many examples, the Automator tool from Apple collects components of a workflow into a single script, with very little effort on the part of the user. Tasks are most often described as a series of instructions. The granularity of the tasks dictates the tools to use. Compilers translate fine-grained instructions to assembler code, while scripting languages (ruby, perl) are used to describe a series of tasks at a higher level. Compilers can also be viewed as transformational tools: a cross-compiler can translate executable code written on one computer to assembler code understood on another, while transformational tools can translate from one high-level language to another. We are interested in creating visualization web services automatically, starting from stand-alone VTK (Visualization Toolkit) code written in Tcl. To this end, using the OCaml programming language, we have developed a compiler that translates Tcl into C++, including all the stubs, classes and methods to interface with gSOAP, a C++ implementation of the Soap 1.1/1.2 protocols. This compiler, referred to as the Web Automation and Translation Toolkit (WATT), is the first step towards automated creation of specialized visualization web services without input from the user. The WATT compiler seeks to automate all aspects of web service generation, including the transport layer, the division of labor and the details related to interface generation. The WATT compiler is part of ongoing efforts within the NSF funded VLab consortium [1] to facilitate and automate time-consuming tasks for the science related to understanding planetary materials. Through examples of services produced by WATT for the VLab portal, we will illustrate features, limitations and the improvements necessary to achieve the ultimate goal of complete and transparent automation in the generation of web

  19. Digital Bedrock Compilation: A Geodatabase Covering Forest Service Lands in California

    Science.gov (United States)

    Elder, D.; de La Fuente, J. A.; Reichert, M.

    2010-12-01

    This digital database contains bedrock geologic mapping for Forest Service lands within California. This compilation began in 2004 and the first version was completed in 2005. The second publication of this geodatabase was completed in 2010 and filled major gaps in the southern Sierra Nevada and Modoc/Medicine Lake/Warner Mountains areas. This digital map database was compiled from previously published and unpublished geologic mapping, with source mapping and review from the California Geological Survey, the U.S. Geological Survey and others. Much of the source data was itself compilation mapping. This geodatabase is huge, containing ~107,000 polygons and ~280,000 arcs. Mapping was compiled from more than one thousand individual sources and covers over 41,000,000 acres (~166,000 km2). It was compiled from source maps at various scales - from ~1:4,000 to 1:250,000 - and represents the best available geologic mapping at the largest scale possible. An estimated 70-80% of the source information was digitized from geologic mapping at 1:62,500 scale or better. The Forest Service ACT2 Enterprise Team compiled the bedrock mapping and developed a geodatabase to store this information. This geodatabase supports feature classes for polygons (e.g., map units), lines (e.g., contacts, boundaries, faults and structural lines) and points (e.g., orientation data, structural symbology). Lookup tables provide detailed information for feature class items. Lookup/type tables contain legal values and hierarchical groupings for geologic ages and lithologies. Type tables link coded values with descriptions for line and point attributes, such as line type, line location and point type. This digital mapping is at the core of many quantitative analyses and derivative map products. Queries of the database are used to produce maps and to quantify rock types of interest. These include the following: (1) ultramafic rocks - where hazards from naturally occurring asbestos are high, (2) granitic rocks - increased

  20. A Language for Specifying Compiler Optimizations for Generic Software

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, Jeremiah J. [Indiana Univ., Bloomington, IN (United States)

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.

  1. Compilation and network analyses of cambrian food webs.

    Directory of Open Access Journals (Sweden)

    Jennifer A Dunne

    2008-04-01

    Full Text Available A rich body of empirically grounded theory has developed about food webs--the networks of feeding relationships among species within habitats. However, detailed food-web data and analyses are lacking for ancient ecosystems, largely because of the low resolution of taxa coupled with uncertain and incomplete information about feeding interactions. These impediments appear insurmountable for most fossil assemblages; however, a few assemblages with excellent soft-body preservation across trophic levels are candidates for food-web data compilation and topological analysis. Here we present plausible, detailed food webs for the Chengjiang and Burgess Shale assemblages from the Cambrian Period. Analyses of degree distributions and other structural network properties, including sensitivity analyses of the effects of uncertainty associated with Cambrian diet designations, suggest that these early Paleozoic communities share remarkably similar topology with modern food webs. Observed regularities reflect a systematic dependence of structure on the numbers of taxa and links in a web. Most aspects of Cambrian food-web structure are well-characterized by a simple "niche model," which was developed for modern food webs and takes into account this scale dependence. However, a few aspects of topology differ between the ancient and recent webs: longer path lengths between species and more species in feeding loops in the earlier Chengjiang web, and higher variability in the number of links per species for both Cambrian webs. Our results are relatively insensitive to the exclusion of low-certainty or random links. The many similarities between Cambrian and recent food webs point toward surprisingly strong and enduring constraints on the organization of complex feeding interactions among metazoan species. The few differences could reflect a transition to more strongly integrated and constrained trophic organization within ecosystems following the rapid
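
    A few of the structural properties mentioned above (links per species, characteristic path length, generality and vulnerability) are easy to compute once a food web is represented as a directed graph. The Python sketch below does so with networkx for a made-up five-taxon web; it does not use the Chengjiang or Burgess Shale data.

        # Toy example (not the Cambrian data): structural properties of a small
        # feeding network, with edges oriented resource -> consumer.
        import networkx as nx

        links = [("alga", "grazer"), ("grazer", "predator"),
                 ("detritus", "scavenger"), ("scavenger", "predator"),
                 ("grazer", "scavenger")]
        web = nx.DiGraph(links)

        links_per_species = web.number_of_edges() / web.number_of_nodes()
        char_path_length = nx.average_shortest_path_length(web.to_undirected())
        generality = dict(web.in_degree())      # resources eaten per consumer
        vulnerability = dict(web.out_degree())  # consumers per resource
        print(links_per_species, char_path_length)
        print(generality, vulnerability)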

  2. Compilation and network analyses of cambrian food webs.

    Science.gov (United States)

    Dunne, Jennifer A; Williams, Richard J; Martinez, Neo D; Wood, Rachel A; Erwin, Douglas H

    2008-04-29

    A rich body of empirically grounded theory has developed about food webs--the networks of feeding relationships among species within habitats. However, detailed food-web data and analyses are lacking for ancient ecosystems, largely because of the low resolution of taxa coupled with uncertain and incomplete information about feeding interactions. These impediments appear insurmountable for most fossil assemblages; however, a few assemblages with excellent soft-body preservation across trophic levels are candidates for food-web data compilation and topological analysis. Here we present plausible, detailed food webs for the Chengjiang and Burgess Shale assemblages from the Cambrian Period. Analyses of degree distributions and other structural network properties, including sensitivity analyses of the effects of uncertainty associated with Cambrian diet designations, suggest that these early Paleozoic communities share remarkably similar topology with modern food webs. Observed regularities reflect a systematic dependence of structure on the numbers of taxa and links in a web. Most aspects of Cambrian food-web structure are well-characterized by a simple "niche model," which was developed for modern food webs and takes into account this scale dependence. However, a few aspects of topology differ between the ancient and recent webs: longer path lengths between species and more species in feeding loops in the earlier Chengjiang web, and higher variability in the number of links per species for both Cambrian webs. Our results are relatively insensitive to the exclusion of low-certainty or random links. The many similarities between Cambrian and recent food webs point toward surprisingly strong and enduring constraints on the organization of complex feeding interactions among metazoan species. The few differences could reflect a transition to more strongly integrated and constrained trophic organization within ecosystems following the rapid diversification of species, body

  3. Compiled MPI: Cost-Effective Exascale Applications Development

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Quinlan, D; Lumsdaine, A; Hoefler, T

    2012-04-10

    The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over

  4. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    , block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs......-directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter....
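
    The first Futamura projection referred to above states that specializing an interpreter with respect to a fixed source program yields a compiled version of that program. The Python sketch below shows only the most naive, closure-based form of that specialization for an invented toy instruction language; a real partial evaluator would also unfold the interpreter's dispatch into residual code.

        # First Futamura projection in its most naive form: fixing the program
        # argument of an interpreter yields a "compiled" program.
        from functools import partial

        def interp(program, x):
            """Run a list of ("add", n) / ("mul", n) instructions on the input x."""
            for op, n in program:
                x = x + n if op == "add" else x * n
            return x

        source = [("add", 2), ("mul", 5)]
        target = partial(interp, source)        # interpreter specialized to `source`
        assert target(3) == interp(source, 3) == 25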

  5. Semantics-based compiling: A case study in type-directed partial evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    , block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs......-directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter....

  6. Compiler-Enforced Cache Coherence Using a Functional Language

    Directory of Open Access Journals (Sweden)

    Rich Wolski

    1996-01-01

    Full Text Available The cost of hardware cache coherence, both in terms of execution delay and operational cost, is substantial for scalable systems. Fortunately, compiler-generated cache management can reduce program serialization due to cache contention; increase execution performance; and reduce the cost of parallel systems by eliminating the need for more expensive hardware support. In this article, we use the Sisal functional language system as a vehicle to implement and investigate automatic, compiler-based cache management. We describe our implementation of Sisal for the IBM Power/4. The Power/4, briefly available as a product, represents an early attempt to build a shared memory machine that relies strictly on the language system for cache coherence. We discuss the issues associated with deterministic execution and program correctness on a system without hardware coherence, and demonstrate how Sisal (as a functional language) is able to address those issues.

  7. A Forth interpreter and compiler's study for computer aided design

    International Nuclear Information System (INIS)

    Djebbar, F. Zohra Widad

    1986-01-01

    The wide field of application of FORTH led us to develop an interpreter. It has been implemented on an MC 68000 microprocessor based computer, under ASTERIX, a UNIX-like real-time operating system written by the C.E.A. The work was done in two versions: - The first, fully written in C, ensures good portability across a wide variety of microprocessors, but performance estimates revealed excessive execution times and led to a new, optimized version. - This new version is characterized by the compilation of the most frequently used words of the basic FORTH vocabulary. This yields an interpreter with good performance and an execution speed close to that of code produced by the C compiler. (author) [fr
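
    For readers unfamiliar with the language, the Python sketch below shows the core of a FORTH-style interpreter: a data stack plus a dictionary of words, with ":" ... ";" defining new words. It is only a toy illustration of the word-dictionary mechanism, unrelated to the MC 68000 / C implementation described in the record; a compiling version would replace frequently used word definitions with host-level code at the marked point.

        # Toy FORTH-style interpreter: a stack and a dictionary of words.
        def forth(src):
            stack, words = [], {
                "+":   lambda s: s.append(s.pop() + s.pop()),
                "*":   lambda s: s.append(s.pop() * s.pop()),
                "dup": lambda s: s.append(s[-1]),
                ".":   lambda s: print(s.pop()),
            }
            tokens = src.split()
            i = 0
            while i < len(tokens):
                tok = tokens[i]
                if tok == ":":                       # ": name ... ;" defines a word;
                    end = tokens.index(";", i)       # a compiler would plug in here
                    name, body = tokens[i + 1], tokens[i + 2:end]
                    words[name] = lambda s, body=body: forth_run(body, s, words)
                    i = end
                else:
                    forth_run([tok], stack, words)
                i += 1

        def forth_run(tokens, stack, words):
            for tok in tokens:
                if tok in words:
                    words[tok](stack)
                else:
                    stack.append(int(tok))           # unknown tokens are numbers

        forth(": square dup * ; 7 square 1 + .")     # prints 50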

  8. Mode automata and their compilation into fault trees

    International Nuclear Information System (INIS)

    Rauzy, Antoine

    2002-01-01

    In this article, we advocate the use of mode automata as a high level representation language for reliability studies. Mode automata are state/transition-based representations with the additional notion of flow. They can be seen as a generalization of both finite capacity Petri nets and block diagrams. They can be assembled into hierarchies by means of composition operations. The contribution of this article is twofold. First, we introduce mode automata and we discuss their relationship with other formalisms. Second, we propose an algorithm to compile mode automata into Boolean equations (fault trees). Such a compilation is of interest for two reasons. First, assessment tools for Boolean models are much more efficient than those for state/transition models. Second, the automated generation of fault trees from higher level representations makes it easier to maintain them through the life cycle of the systems under study.
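
    The payoff of compiling to Boolean equations is that standard fault-tree assessment applies directly. The Python sketch below illustrates this on an invented system failure condition by brute-force enumeration of minimal cut sets; it is generic fault-tree code, not the mode-automata compilation algorithm of the article.

        # Once the failure condition is a Boolean function of basic events,
        # standard fault-tree analysis (here: minimal cut sets) can be applied.
        from itertools import combinations

        EVENTS = ["pump_a", "pump_b", "valve", "power"]

        def system_fails(failed):
            # TOP = power OR valve OR (pump_a AND pump_b)  -- illustrative structure
            return ("power" in failed or "valve" in failed
                    or ("pump_a" in failed and "pump_b" in failed))

        def minimal_cut_sets(events, top):
            cuts = []
            for r in range(1, len(events) + 1):
                for combo in combinations(events, r):
                    s = set(combo)
                    if top(s) and not any(c <= s for c in cuts):  # keep minimal sets only
                        cuts.append(s)
            return cuts

        print(minimal_cut_sets(EVENTS, system_fails))
        # -> {'valve'}, {'power'}, {'pump_a', 'pump_b'} (in some order)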

  9. Deep knowledge and knowledge compilation for dynamic systems

    International Nuclear Information System (INIS)

    Mizoguchi, Riichiro

    1994-01-01

    Expert systems are viewed as knowledge-based systems which efficiently solve real-world problems based on the expertise contained in their knowledge bases, elicited from domain experts. Although such expert systems, which depend on the heuristics of domain experts, have contributed to the current success, they are known to be brittle and hard to build. This paper is concerned with research on model-based diagnosis and knowledge compilation for dynamic systems conducted by the author's group to overcome these difficulties. Firstly, we summarize the advantages and shortcomings of expert systems. Secondly, deep knowledge and knowledge compilation are discussed. Then, the latest results of our research on model-based diagnosis are reviewed. The future direction of knowledge base technology research is also discussed. (author)

  10. Just-In-Time compilation of OCaml byte-code

    OpenAIRE

    Meurer, Benedikt

    2010-01-01

    This paper presents various improvements that were applied to OCamlJIT2, a Just-In-Time compiler for the OCaml byte-code virtual machine. OCamlJIT2 currently runs on various Unix-like systems with x86 or x86-64 processors. The improvements, including the new x86 port, are described in detail, and performance measures are given, including a direct comparison of OCamlJIT2 to OCamlJIT.

  11. Engineering Amorphous Systems, Using Global-to-Local Compilation

    Science.gov (United States)

    Nagpal, Radhika

    Emerging technologies are making it possible to assemble systems that incorporate myriads of information-processing units at almost no cost: smart materials, self-assembling structures, vast sensor networks, pervasive computing. How does one engineer robust and prespecified global behavior from the local interactions of immense numbers of unreliable parts? We discuss organizing principles and programming methodologies that have emerged from Amorphous Computing research that allow us to compile a specification of global behavior into a robust program for local behavior.

  12. Compiling Planning into Quantum Optimization Problems: A Comparative Study

    Science.gov (United States)

    2015-06-07

    to SAT, and then reduces higher-order terms to quadratic terms through a series of gadgets. Our mappings allow both positive and negative preconditions...to its being specific to this type of problem) and likely benefits from a homogeneous parameter setting (Venturelli et al. 2014), as it generates a...Guzik, A. 2013. Resource efficient gadgets for compiling adiabatic quantum optimization problems. Annalen der Physik 525(10-11):877–888. Blum, A
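
    A standard example of the kind of gadget mentioned in this snippet is the quadratization of a product of binary variables: an auxiliary variable w stands in for x*y, enforced by the penalty x*y - 2(x+y)w + 3w, which is zero exactly when w = x*y and at least one otherwise. The Python sketch below verifies that this reduces a cubic term to a quadratic one over all assignments; it is a generic gadget, not necessarily the one used by the authors.

        # Rosenberg-style quadratization gadget: replace x*y by an auxiliary w
        # with penalty x*y - 2*(x+y)*w + 3*w, so x*y*z becomes w*z + penalty.
        from itertools import product

        def penalty(x, y, w):
            return x * y - 2 * (x + y) * w + 3 * w

        M = 2  # penalty weight; any M >= 1 suffices for a 0/1-valued objective
        for x, y, z in product((0, 1), repeat=3):
            cubic = x * y * z
            quadratic = min(w * z + M * penalty(x, y, w) for w in (0, 1))
            assert quadratic == cubic, (x, y, z)
        print("cubic term x*y*z reproduced by its quadratic gadget on all assignments")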

  13. MoviCompile : An LLVM based compiler for heterogeneous SIMD code generation

    NARCIS (Netherlands)

    Diken, E.; Jordans, R.; O'Riordan, M.

    2015-01-01

    Numerous applications in communication and multimedia domains show significant data-level parallelism (DLP). The amount of DLP varies between applications in the same domain or even within a single application. Most architectures support a single vector (SIMD) width, which may not be optimal. This

  14. Construction experiences from underground works at Forsmark. Compilation Report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders [Vattenfall Power Consultant AB, Stockholm (Sweden); Christiansson, Rolf [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)

    2007-02-15

    The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR on primarily rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  15. Compiling knowledge-based systems from KEE to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications - most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Here we describe the first efforts to develop a system for compiling KBSs developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights on Ada as an artificial intelligence programming language, potential solutions of some of the engineering difficulties encountered in early work, and inspiration on future system development.

  16. Construction experiences from underground works at Forsmark. Compilation Report

    International Nuclear Information System (INIS)

    Carlsson, Anders; Christiansson, Rolf

    2007-02-01

    The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR on primarily rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  17. Fault-tolerant digital microfluidic biochips compilation and synthesis

    CERN Document Server

    Pop, Paul; Stuart, Elena; Madsen, Jan

    2016-01-01

    This book describes for researchers in the fields of compiler technology, design and test, and electronic design automation the new area of digital microfluidic biochips (DMBs), and thus offers a new application area for their methods. The authors present a routing-based model of operation execution, along with several associated compilation approaches, which progressively relax the assumption that operations execute inside fixed rectangular modules. Since operations can experience transient faults during the execution of a bioassay, the authors show how to use both offline (design time) and online (runtime) recovery strategies. The book also presents methods for the synthesis of fault-tolerant application-specific DMB architectures. · Presents the current models used for the research on compilation and synthesis techniques of DMBs in a tutorial fashion; · Includes a set of “benchmarks”, which are presented in great detail and include the source code of most of the t...

  18. Research on the Maritime Communication Cryptographic Chip’s Compiler Optimization

    Directory of Open Access Journals (Sweden)

    Sheng Li

    2017-08-01

    Full Text Available In the process of ocean development, maritime communication system technology is an active research field in which information security is vital to the normal operation of the whole system, and is also one of the main research difficulties. In this paper, a maritime communication cryptographic SOC (system on chip) is introduced, and its compiler framework is put forward through analysis of the working mode and the problems faced by the compiler front end. Then, a loop unrolling factor calculating algorithm based on queueing theory, named UFBOQ (unrolling factor based on queue), is proposed to perform parallel optimization in the compiler front end while respecting the instruction memory capacity limit. Finally, the scalar replacement method is used to optimize the unrolled code, reducing the effect of memory access latency on parallel computing efficiency, given the continuous data storage characteristics of cryptographic algorithms. The UFBOQ algorithm and scalar replacement prove effective and appropriate, achieving linear speedup.
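
    The UFBOQ algorithm itself is not spelled out in the abstract, but the two transformations it feeds, loop unrolling by a fixed factor and scalar replacement of repeated loads, are easy to show at source level. The Python sketch below is a purely illustrative before/after of a dot product unrolled by a factor of 4; it has nothing to do with the cryptographic SOC's instruction set.

        # Source-level illustration (toy, not UFBOQ): loop unrolling by 4 with
        # scalar replacement, so each element is fetched once per unrolled body.
        def dot_plain(a, b):
            acc = 0
            for i in range(len(a)):
                acc += a[i] * b[i]
            return acc

        def dot_unrolled4(a, b):
            acc0 = acc1 = acc2 = acc3 = 0
            n = len(a) - len(a) % 4
            for i in range(0, n, 4):                           # unrolled by factor 4
                a0, a1, a2, a3 = a[i], a[i+1], a[i+2], a[i+3]  # scalar replacement
                b0, b1, b2, b3 = b[i], b[i+1], b[i+2], b[i+3]
                acc0 += a0 * b0
                acc1 += a1 * b1
                acc2 += a2 * b2
                acc3 += a3 * b3
            for i in range(n, len(a)):                         # remainder loop
                acc0 += a[i] * b[i]
            return acc0 + acc1 + acc2 + acc3

        xs, ys = list(range(10)), list(range(10, 20))
        assert dot_plain(xs, ys) == dot_unrolled4(xs, ys)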

  19. Aeromagnetic map compilation: procedures for merging and an example from Washington

    Directory of Open Access Journals (Sweden)

    C. Finn

    2000-06-01

    Full Text Available Rocks in Antarctica and offshore have widely diverse magnetic properties. Consequently, aeromagnetic data collected there can improve knowledge of the geologic, tectonic and geothermal characteristics of the region. Aeromagnetic data can map concealed structures such as faults, folds and dikes, ascertain basin thickness and locate buried volcanic rocks, as well as some intrusive and metamorphic rocks. Gridded, composite data sets allow a view of continental-scale trends that individual data sets do not provide and link widely-separated areas of outcrop and disparate geologic studies. Individual magnetic surveys must be processed so that they match adjacent surveys prior to merging. A consistent representation of the Earth's magnetic field (International Geomagnetic Reference Field, IGRF) must be removed from each data set. All data sets need to be analytically continued to the same flight elevation with their datums shifted to match adjacent data. I advocate minimal processing to best represent the individual surveys in the merged compilation. An example of a compilation of aeromagnetic surveys from Washington illustrates the utility of aeromagnetic maps for providing synoptic views of regional tectonic features.
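
    A toy numeric version of the levelling step described above: subtract the reference field from each survey, shift one survey's datum by the mean difference over the overlap, and merge. The Python sketch below assumes both grids are already on a common mesh and omits the analytic continuation to a common flight elevation; the grid and field values are invented.

        # Toy levelling and merging of two anomaly grids on the same mesh.
        import numpy as np

        def level_and_merge(grid_a, grid_b, igrf_a, igrf_b):
            """grid_a/grid_b: 2-D total-field grids with NaN outside coverage."""
            a = grid_a - igrf_a                            # remove reference field values
            b = grid_b - igrf_b
            overlap = ~np.isnan(a) & ~np.isnan(b)
            b = b - (b[overlap] - a[overlap]).mean()       # shift datum of B to match A
            return np.where(np.isnan(a), b, a)             # keep A where covered, else B

        a = np.full((4, 6), np.nan); a[:, :4] = 100.0      # survey A covers the left part
        b = np.full((4, 6), np.nan); b[:, 2:] = 130.0      # survey B covers the right part
        print(level_and_merge(a, b, igrf_a=50.0, igrf_b=60.0))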

  20. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1994, July--September. Volume 19, Number 3

    International Nuclear Information System (INIS)

    1994-12-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: Secondary Report Number Index, Personal Author Index, Subject Index, NRC Originating Organization Index (Staff Reports), NRC Originating Organization Index (International Agreements), NRC Contract Sponsor Index (Contractor Reports), Contractor Index, International Organization Index, Licensed Facility Index. A detailed explanation of the entries precedes each index.