WorldWideScience

Sample records for reference resources compiled

  1. Compilation of anatomical, physiological and dietary characteristics for a Filipino Reference Man

    International Nuclear Information System (INIS)

    Natera, E.S.; Cuevas, C.D.; Azanon, E.M.; Palattao, M.B.; Espiritu, R.T.; Cobar, M.C.; Palad, L.H.; Torres, B.; Shiraishi, K.

    1998-01-01

    The Asian Reference Man is a study of the biological characteristics of the different ethnic populations in the Asian region. Its aim is to update the existing international reference values, known as the ICRP Reference Man, which are used for the calculation of radiation exposure. The Philippines is a participant in the formulation of the Asian Reference Man and is represented by the Philippine Nuclear Research Institute. The biological parameters included in the study are the physical, anatomical, physiological and dietary characteristics representing the Filipino race and customs. The normal Filipino values were obtained from past nationwide and regional surveys, from medical records of private and government institutions, and from random sampling of the population. Results of the study are presented in tabulations by gender and age group. Statistical analysis of the data is presented as the mean, standard deviation and median, computed using Microsoft Excel software and a compiled Clipper program. (author)

  2. Compilation of anatomical, physiological and dietary characteristics for a Filipino reference man

    International Nuclear Information System (INIS)

    Natera, E.S.; Cuevas, G.D.; Azanon, E.M.; Palattao, M.B.; Espiritu, R.T.; Cobar, M.C.; Palad, L.H.; Torres, B.; Kawamura, H.; Shiraishi, K.

    1995-01-01

    The Asian Reference Man is a study of the biological characteristics of the different ethnic populations in the Asian region. Its aim is to update the existing international values, known as the ICRP Reference Man, which are used for the calculation of radiation exposure. The Philippines is a participant in the formulation of the Asian Reference Man and is represented by the Philippine Nuclear Research Institute. The biological parameters included in this study are the physical, anatomical, physiological and dietary characteristics representing the Filipino race and customs. The normal Filipino values were obtained from past nationwide and regional surveys, from medical records of private and government institutions, and from random sampling of the population. Results of the study are presented in tabulations by gender and age group. Statistical analysis of the data is presented as the mean, standard deviation and median, computed using Microsoft Excel software and a compiled Clipper program. (author). 18 refs., 12 tabs., 1 fig.

  3. World Reference Base for Soil Resources

    NARCIS (Netherlands)

    Deckers, J.A.; Driessen, P.M.; Nachtergaele, F.O.; Spaargaren, O.C.

    2002-01-01

    In 1998, the International Union of Soil Sciences (IUSS) officially adopted the world reference base for soil resources (WRB) as the Union's system for soil correlation. The structure, concepts, and definitions of the WRB are strongly influenced by the FAO-UNESCO legend of the soil map of the world.

  4. The NASA earth resources spectral information system: A data compilation, second supplement

    Science.gov (United States)

    Vincent, R. K.

    1973-01-01

    The NASA Earth Resources Spectral Information System (ERSIS) and the information contained therein are described. It is intended for use as a second supplement to the NASA Earth Resources Spectral Information System: A Data Compilation, NASA CR-31650-24-T, May 1971. The current supplement includes approximately 100 rock and mineral, and 375 vegetation directional reflectance spectral curves in the optical region from 0.2 to 22.0 microns. The data were categorized by subject and each curve plotted on a single graph. Each graph is fully titled to indicate curve source and indexed by subject to facilitate user retrieval from ERSIS magnetic tape records.

  5. REM-3D Reference Datasets: Reconciling large and diverse compilations of travel-time observations

    Science.gov (United States)

    Moulik, P.; Lekic, V.; Romanowicz, B. A.

    2017-12-01

    A three-dimensional Reference Earth model (REM-3D) should ideally represent the consensus view of long-wavelength heterogeneity in the Earth's mantle through the joint modeling of large and diverse seismological datasets. This requires reconciliation of datasets obtained using various methodologies and identification of consistent features. The goal of REM-3D datasets is to provide a quality-controlled and comprehensive set of seismic observations that would not only enable construction of REM-3D, but also allow identification of outliers and assist in more detailed studies of heterogeneity. The community response to data solicitation has been enthusiastic, with several groups across the world contributing recent measurements of normal modes, (fundamental mode and overtone) surface waves, and body waves. We present results from ongoing work with body and surface wave datasets analyzed in consultation with a Reference Dataset Working Group. We have formulated procedures for reconciling travel-time datasets that include: (1) quality control for salvaging missing metadata; (2) identification of and reasons for discrepant measurements; (3) homogenization of coverage through the construction of summary rays; and (4) inversions of structure at various wavelengths to evaluate inter-dataset consistency. In consultation with the Reference Dataset Working Group, we retrieved the station and earthquake metadata in several legacy compilations and codified several guidelines that would facilitate easy storage and reproducibility. We find strong agreement between the dispersion measurements of fundamental-mode Rayleigh waves, particularly when made using supervised techniques. The agreement deteriorates substantially in surface-wave overtones, for which discrepancies vary with frequency and overtone number. A half-cycle band of discrepancies is attributed to reversed instrument polarities at a limited number of stations, which are not reflected in the instrument response history.

  6. Reference Inflow Characterization for River Resource Reference Model (RM2)

    Energy Technology Data Exchange (ETDEWEB)

    Neary, Vincent S [ORNL

    2011-12-01

    Sandia National Laboratories (SNL) is leading an effort to develop reference models for marine and hydrokinetic technologies and wave and current energy resources. This effort will allow the refinement of technology design tools, accurate estimates of a baseline levelized cost of energy (LCoE), and the identification of the main cost drivers that need to be addressed to achieve a competitive LCoE. As part of this effort, Oak Ridge National Laboratory was charged with examining and reporting reference river inflow characteristics for reference model 2 (RM2). Published turbulent flow data from large rivers, a water supply canal, and laboratory flumes are reviewed to determine the range of velocities, turbulence intensities and turbulent stresses acting on hydrokinetic technologies, and also to evaluate the validity of classical models that describe the depth variation of the time-mean velocity and turbulent normal Reynolds stresses. The classical models are found to generally perform well in describing river inflow characteristics. A potential challenge in river inflow characterization, however, is the high variability of depth and flow over the design life of a hydrokinetic device. This variation can have significant effects on the inflow mean velocity and turbulence intensity experienced by stationary and bottom-mounted hydrokinetic energy conversion devices; these effects require further investigation but are expected to be minimal for surface-mounted devices like the vertical axis turbine designed for RM2. A simple methodology for obtaining an approximate inflow characterization for surface-deployed devices is developed using the relation umax=(7/6)V, where V is the bulk velocity and umax is assumed to be the near-surface velocity. The application of this expression is recommended for deriving the local inflow velocity acting on the energy extraction planes of the RM2 vertical axis rotors, where V=Q/A can be calculated given a USGS gage flow time
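    The abstract's inflow approximation reduces to two arithmetic steps: compute the bulk velocity V = Q/A from gaged discharge and flow area, then scale by 7/6 to estimate the near-surface velocity. A minimal sketch, with made-up discharge and area values (the function name and numbers are illustrative, not from the report):

```python
def near_surface_velocity(discharge_m3s: float, flow_area_m2: float) -> float:
    """Estimate near-surface velocity u_max from bulk velocity V = Q/A,
    using the relation u_max = (7/6) * V quoted in the abstract."""
    bulk_velocity = discharge_m3s / flow_area_m2  # V = Q/A
    return (7.0 / 6.0) * bulk_velocity            # u_max = (7/6) V

# Hypothetical example: Q = 8400 m^3/s, A = 2400 m^2 gives V = 3.5 m/s,
# so the estimated near-surface velocity is about 4.08 m/s.
u_max = near_surface_velocity(8400.0, 2400.0)
print(round(u_max, 3))
```

In practice Q would come from a USGS gage record and A from a surveyed cross-section at the deployment site.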

  7. Data compilation and assessment for water resources in Pennsylvania state forest and park lands

    Science.gov (United States)

    Galeone, Daniel G.

    2011-01-01

    As a result of a cooperative study between the U.S. Geological Survey and the Pennsylvania Department of Conservation and Natural Resources (PaDCNR), available electronic data were compiled for Pennsylvania state lands (state forests and parks) to allow PaDCNR to initially determine if data exist to make an objective evaluation of water resources for specific basins. The data compiled included water-quantity and water-quality data and sample locations for benthic macroinvertebrates within state-owned lands (including a 100-meter buffer around each land parcel) in Pennsylvania. In addition, internet links or contacts for geographic information system coverages pertinent to water-resources studies also were compiled. Water-quantity and water-quality data primarily available through January 2007 were compiled and summarized for site types that included streams, lakes, ground-water wells, springs, and precipitation. Data were categorized relative to 35 watershed boundaries defined by the Pennsylvania Department of Environmental Protection for resource-management purposes. The primary sources of continuous water-quantity data for Pennsylvania state lands were the U.S. Geological Survey (USGS) and the National Weather Service (NWS). The USGS has streamflow data for 93 surface-water sites located in state lands; 38 of these sites have continuous-recording data available. As of January 2007, 22 of these 38 streamflow-gaging stations were active; the majority of active gaging stations have over 40 years of continuous record. The USGS database also contains continuous ground-water elevation data for 32 wells in Pennsylvania state lands, 18 of which were active as of January 2007. Sixty-eight active precipitation stations (primarily from the NWS network) are located in state lands. The four sources of available water-quality data for Pennsylvania state lands were the USGS, U.S. Environmental Protection Agency, Pennsylvania Department of Environmental Protection (PaDEP), and

  8. Compilation of anatomical, physiological and metabolic characteristics of Reference Asian Man in Pakistan

    International Nuclear Information System (INIS)

    Manzoor A. Atta; Perveen Akhter; Malik, G.M.

    1998-01-01

    A research programme was initiated in collaboration with IAEA/RCA to establish local sex-specific data and later to contribute to defining a Reference Asian man/woman in the age ranges of 5, 10, 15, 20-29, 30-39, 40-49 and 20-50 years, in order to strengthen the radiation protection infrastructure of the country. Physical data on height, weight, chest and head circumference, and food consumption data of reference Pakistani men/women were collected from various socioeconomic strata residing in different ecological areas of Pakistan. The present study revealed that our daily nutritional status and all the physical parameters are significantly lower than those of the ICRP reference man of Caucasian origin, except the standing height of males. Since the anatomical organs are roughly proportional to body size, approximations can be made for internal dosimetry purposes using the same ratios as defined by those countries that experimentally established their values. (author)

  9. Compilation of elemental concentration data for NBS Biological and Environmental Standard Reference Materials

    International Nuclear Information System (INIS)

    Gladney, E.S.

    1980-07-01

    Concentration data on up to 76 elements in 19 NBS Standard Reference Materials have been collected from 325 journal articles and technical reports. These data are summarized as mean ± one standard deviation values and compared with available data from NBS and other review articles. Information on the analytical procedures employed is given, and all raw data are presented in appendixes.

  10. IAEA-RCA co-ordinated research programme compilation of anatomical, physiological and metabolic characteristics for a reference asian man

    International Nuclear Information System (INIS)

    Kawamura, H.; Tanaka, G.; Miah, F.K.

    1996-01-01

    The world-wide radiation protection concerns that followed the Chernobyl accident created an increased need for basic data to assess internal doses to members of the public in each country. Therefore, the Reference Asian Man Programme was proposed by Japan and initiated as a part of the IAEA Regional Cooperative Agreement Project for Strengthening Radiation Protection Infrastructure to coincide with the revision of Reference Man data by ICRP Committee 2. The first priority was given to compilation of data on physique, and mass of internal organs where possible, for populations throughout the Asian region. Data on daily consumption of foods and nutrients were also emphasized since they are important components of the ingestion pathway for uptake of radionuclides by humans. Pulmonary function and water balance parameters were considered important in relation to the inhalation pathway and water contamination. The programme was also endorsed by ICRP. (author)

  11. Toward a Last Interglacial Compilation Using a Tephra-based Chronology: a Future Reference For Model-data Comparison

    Science.gov (United States)

    Bazin, L.; Govin, A.; Capron, E.; Nomade, S.; Lemieux-Dudon, B.; Landais, A.

    2017-12-01

    The Last Interglacial (LIG, 129-116 ka) is a key period to decipher the interactions between the different components of the climate system under warmer-than-preindustrial conditions. Modelling the LIG climate is now part of the CMIP6/PMIP4 targeted simulations. As a result, recent efforts have been made to propose surface temperature compilations focusing on the spatio-temporal evolution of the LIG climate, and not only on its peak warmth as previously proposed. However, the major limitation of these compilations remains the climatic alignment of records (e.g. temperature, foraminiferal δ18O) that is performed to define the sites' chronologies. Such methods prevent a proper discussion of phase relationships between the different sites. Thanks to recent developments of the Bayesian Datice dating tool, we are now able to build coherent multi-archive chronologies with a proper propagation of the associated uncertainties. We make the best use of common tephra layers identified in well-dated continental archives and marine sediment cores of the Mediterranean region to propose a coherent chronological framework for the LIG independent of any climatic assumption. We then extend this precise chronological context to the North Atlantic as a first step toward a global coherent compilation of surface temperature and stable isotope records. Based on this synthesis, we propose guidelines for the interpretation of different proxies measured from different archives that will be compared with climate model parameters. Finally, we present time-slices (e.g. 127 ka) of the preliminary regional synthesis of temperature reconstructions and stable isotopes to serve as a reference for future model-data comparison of the upcoming CMIP6/PMIP4 LIG simulations.

  12. Compilation of a soil map for Nigeria: a nation-wide soil resource ...

    African Journals Online (AJOL)

    This paper presents the results of a nation-wide soil and land form inventory of Nigeria. The data compilation was conducted in the framework of two projects with the objective to calculate agricultural production potential under different input levels and assess the water erosion hazard. The information on spatial distribution ...

  13. Data compilation and evaluation of U(IV) and U(VI) for thermodynamic reference database THEREDA

    Energy Technology Data Exchange (ETDEWEB)

    Richter, Anke; Bok, Frank; Brendler, Vinzenz

    2015-07-01

    THEREDA (Thermodynamic Reference Database) is a collaborative project which addresses this challenge. The partners are Helmholtz-Zentrum Dresden-Rossendorf, Karlsruhe Institute of Technology (KIT-INE), Gesellschaft fuer Anlagen- und Reaktorsicherheit Braunschweig mbH (GRS), TU Bergakademie Freiberg (TUBAF) and AF-Consult Switzerland AG (Baden, Switzerland). The aim of the project is the establishment of a consistent and quality-assured database for all safety-relevant elements, temperature and pressure ranges, with its focus on saline systems. This implies the use of the Pitzer approach to compute activity coefficients suitable for such conditions. Data access is possible via commonly available internet browsers under the address http://www.thereda.de. One part of the project - the data collection and evaluation for uranium - was a task of the Helmholtz-Zentrum Dresden-Rossendorf. The aquatic chemistry and thermodynamics of U(VI) and U(IV) are of great importance for geochemical modelling in repository-relevant systems. The OECD/NEA Thermochemical Database (NEA TDB) compilation is the major source for thermodynamic data of the aqueous and solid uranium species, even though this data selection does not utilize the Pitzer model for the ionic strength effect correction. As a result of the very stringent quality demands, the NEA TDB is rather restrictive and therefore incomplete for extensive modelling calculations of real systems. Therefore, the THEREDA compilation includes additional thermodynamic data of solid secondary phases formed in the waste material, the backfill and the host rock, though falling into quality assessment (QA) categories of lower accuracy. The data review process prefers log K values from solubility experiments (if available) to those calculated from thermochemical data.

  14. Data compilation and evaluation of U(IV) and U(VI) for thermodynamic reference database THEREDA

    International Nuclear Information System (INIS)

    Richter, Anke; Bok, Frank; Brendler, Vinzenz

    2015-01-01

    THEREDA (Thermodynamic Reference Database) is a collaborative project which addresses this challenge. The partners are Helmholtz-Zentrum Dresden-Rossendorf, Karlsruhe Institute of Technology (KIT-INE), Gesellschaft fuer Anlagen- und Reaktorsicherheit Braunschweig mbH (GRS), TU Bergakademie Freiberg (TUBAF) and AF-Consult Switzerland AG (Baden, Switzerland). The aim of the project is the establishment of a consistent and quality-assured database for all safety-relevant elements, temperature and pressure ranges, with its focus on saline systems. This implies the use of the Pitzer approach to compute activity coefficients suitable for such conditions. Data access is possible via commonly available internet browsers under the address http://www.thereda.de. One part of the project - the data collection and evaluation for uranium - was a task of the Helmholtz-Zentrum Dresden-Rossendorf. The aquatic chemistry and thermodynamics of U(VI) and U(IV) are of great importance for geochemical modelling in repository-relevant systems. The OECD/NEA Thermochemical Database (NEA TDB) compilation is the major source for thermodynamic data of the aqueous and solid uranium species, even though this data selection does not utilize the Pitzer model for the ionic strength effect correction. As a result of the very stringent quality demands, the NEA TDB is rather restrictive and therefore incomplete for extensive modelling calculations of real systems. Therefore, the THEREDA compilation includes additional thermodynamic data of solid secondary phases formed in the waste material, the backfill and the host rock, though falling into quality assessment (QA) categories of lower accuracy. The data review process prefers log K values from solubility experiments (if available) to those calculated from thermochemical data.

  15. Using XML Technologies to Organize Electronic Reference Resources

    OpenAIRE

    Huser, Vojtech; Del Fiol, Guilherme; Rocha, Roberto A.

    2005-01-01

    Provision of access to reference electronic resources to clinicians is becoming increasingly important. We have created a framework for librarians to manage access to these resources at an enterprise level, rather than at the individual hospital libraries. We describe initial project requirements, implementation details, and some preliminary results.

  16. June, 2015 Utilization of Reference Resources and Services

    African Journals Online (AJOL)

    Department of Library and Information Science, MAUTECH, Yola ... reference resources and services mostly for their course work and research works. ... business settings; reference services provided ... Table 5: Strategies to be adopted to overcome the problems of provision and ... American Library Association, p.782.

  17. Compilation Of An Econometric Human Resource Efficiency Model For Project Management Best Practices

    OpenAIRE

    G. van Zyl; P. Venier

    2006-01-01

    The aim of the paper is to introduce a human resource efficiency model in order to rank the most important human resource driving forces for project management best practices. The results of the model will demonstrate how the human resource component of project management acts as the primary function to enhance organizational performance, codified through improved logical end-state programmes, work ethics and process contributions. Given the hypothesis that project management best practices i...

  18. Compilation Of An Econometric Human Resource Efficiency Model For Project Management Best Practices

    Directory of Open Access Journals (Sweden)

    G. van Zyl

    2006-11-01

    The aim of the paper is to introduce a human resource efficiency model in order to rank the most important human resource driving forces for project management best practices. The results of the model will demonstrate how the human resource component of project management acts as the primary function to enhance organizational performance, codified through improved logical end-state programmes, work ethics and process contributions. Given the hypothesis that project management best practices involve significant human resource and organizational changes, one would reasonably expect this process to influence and resonate throughout all the dimensions of an organisation.

  19. Nursing Reference Center: a point-of-care resource.

    Science.gov (United States)

    Vardell, Emily; Paulaitis, Gediminas Geddy

    2012-01-01

    Nursing Reference Center is a point-of-care resource designed for the practicing nurse, as well as nursing administrators, nursing faculty, and librarians. Users can search across multiple resources, including topical Quick Lessons, evidence-based care sheets, patient education materials, practice guidelines, and more. Additional features include continuing education modules, e-books, and a new iPhone application. A sample search and comparison with similar databases were conducted.

  20. The resource theory of quantum reference frames: manipulations and monotones

    International Nuclear Information System (INIS)

    Gour, Gilad; Spekkens, Robert W

    2008-01-01

    Every restriction on quantum operations defines a resource theory, determining how quantum states that cannot be prepared under the restriction may be manipulated and used to circumvent the restriction. A superselection rule (SSR) is a restriction that arises through the lack of a classical reference frame and the states that circumvent it (the resource) are quantum reference frames. We consider the resource theories that arise from three types of SSRs, associated respectively with lacking: (i) a phase reference, (ii) a frame for chirality, and (iii) a frame for spatial orientation. Focusing on pure unipartite quantum states (and in some cases restricting our attention even further to subsets of these), we explore single-copy and asymptotic manipulations. In particular, we identify the necessary and sufficient conditions for a deterministic transformation between two resource states to be possible and, when these conditions are not met, the maximum probability with which the transformation can be achieved. We also determine when a particular transformation can be achieved reversibly in the limit of arbitrarily many copies and find the maximum rate of conversion. A comparison of the three resource theories demonstrates that the extent to which resources can be interconverted decreases as the strength of the restriction increases. Along the way, we introduce several measures of frameness and prove that these are monotonically non-increasing under various classes of operations that are permitted by the SSR.

  1. The science, technology and research network (STARNET) a searchable thematic compilation of web resources

    Science.gov (United States)

    Blados, W.R.; Cotter, G.A.; Hermann, T.

    2007-01-01

    International alliances in space efforts have resulted in a more rapid diffusion of space technology. This, in turn, increases pressure on organizations to push forward with technological developments and to take steps to maximize their inclusion into the research and development (R&D) process and the overall advancement and enhancement of space technology. To cope with the vast and rapidly growing amount of data and information that is vital to the success of innovation, the Information Management Committee (IMC) of the Research and Technology Agency (RTA) developed the science, technology and research network (STARNET). The purpose of this network is to facilitate access to worldwide information elements in terms of science, technology and overall research. It provides a virtual library with special emphasis on international security; a "one stop" information resource for policy makers, program managers, scientists, engineers, researchers and others. © 2007 IEEE.

  2. A compilation of silicon, rare earth element and twenty-one other trace element concentrations in the natural river water reference material SLRS-5 (NRC-CNRC)

    International Nuclear Information System (INIS)

    Yeghicheyan, Delphine; Cloquet, Christophe; Bossy, Cecile; Bouhnik Le Coz, Martine; Douchet, Chantal; Granier, Guy; Heimburger, Alexie; Losno, Remi; Lacan, Francois; Labatut, Marie; Pradoux, Catherine; Lanzanova, Aurelie; Candaudap, Frederic; Chmeleff, Jerome; Rousseau, Tristan C.C.; Seidel, Jean-Luc; Delpoux, Sophie; Tharaud, Mickael; Sivry, Yann; Sonke, Jeroen E.

    2013-01-01

    The natural river water certified reference material SLRS-5 (NRC-CNRC) was routinely analysed in this study for major and trace elements by ten French laboratories. Most of the measurements were made using ICP-MS. Because no certified values are assigned by NRC-CNRC for silicon and 35 trace element concentrations (rare earth elements, Ag, B, Bi, Cs, Ga, Ge, Li, Nb, P, Rb, Rh, Re, S, Sc, Sn, Th, Ti, Tl, W, Y and Zr), or for isotopic ratios, we provide a compilation of the concentrations and related uncertainties obtained by the participating laboratories. Strontium isotopic ratios are also given. (authors)

  3. Compilation of Water-Resources Data and Hydrogeologic Setting for Brunswick County, North Carolina, 1933-2000

    Science.gov (United States)

    Fine, Jason M.; Cunningham, William L.

    2001-01-01

    Water-resources data were compiled for Brunswick County, North Carolina, to describe the hydrologic conditions of the County. Hydrologic data collected by the U.S. Geological Survey as well as data collected by other governmental agencies and reviewed by the U.S. Geological Survey are presented. Data from four weather stations and two surface-water stations are summarized. Data also are presented for land use and land cover, soils, geology, hydrogeology, 12 continuously monitored ground-water wells, 73 periodically measured ground-water wells, and water-quality measurements from 39 ground-water wells. Mean monthly precipitation at the Longwood, Shallotte, Southport, and Wilmington Airport weather stations ranged from 2.19 to 7.94 inches for the periods of record, and mean monthly temperatures at the Longwood, Southport, and Wilmington Airport weather stations ranged from 43.4 to 80.1 degrees Fahrenheit for the periods of record. An evaluation of land-use and land-cover data for Brunswick County indicated that most of the County is either forested land (about 57 percent) or wetlands (about 29 percent). Cross sections are presented to illustrate the general hydrogeology beneath Brunswick County. Water-level data for Brunswick County indicate that water levels ranged from about 110 feet above mean sea level to about 22 feet below mean sea level. Chloride concentrations measured in aquifers in Brunswick County ranged from near 0 to 15,000 milligrams per liter. Chloride levels in the Black Creek and Cape Fear aquifers were measured at well above the potable limit for ground water of 250 milligrams per liter set by the U.S. Environmental Protection Agency for safe drinking water.

  4. Compilation of anatomical, physiological and metabolic characteristics for a Reference Asian Man. Volume 2: Country reports. Results of a co-ordinated research programme 1988-1993

    International Nuclear Information System (INIS)

    1998-02-01

    The Coordinated Research Programme (CRP) on Compilation of Anatomical, Physiological and Metabolic Characteristics for a Reference Asian Man has been conducted as a programme of the IAEA Regional Co-operative Agreement (RCA) for Asia and the Pacific. The CRP was conducted to provide data for radiation protection purposes that are relevant to the biokinetic and dosimetric characteristics of the ethnic populations in the Asian region. The radiological protection decisions that had to be made in the RCA member States following the Chernobyl accident were a significant motivation for establishing the CRP. Funding for the RCM by the Government of Japan is gratefully acknowledged. The IAEA wishes to thank S. Kobayashi for his efforts in support of the CRP. The IAEA extends its appreciation to the Japanese National Institute of Radiological Sciences for acting as the technical secretariat to co-ordinate the work of data compilation. Specifically, the IAEA acknowledges the contributions of H. Kawamura, G. Tanaka and T. Koyanagi. Appreciation is also extended to the National Institute of Radiological Sciences, Japan, the Bhabha Atomic Research Centre, India, and the Chinese Academy of Medical Sciences for the valuable contribution they made to the CRP as hosts for the RCMs. The IAEA officers responsible for this publication were A. Moiseev and R.V. Griffith of the Division of Radiation and Waste Safety. This publication is divided into two volumes: Volume 1 contains a summary of the data and conclusions from the project and Volume 2 the reports from participating countries

  5. Overexploitation of fishery resources, with particular reference to Goa

    Digital Repository Service at National Institute of Oceanography (India)

    Ansari, Z.A.; Achuthankutty, C.T.; Dalal, S.G.

    consideration of physical reforms such as resource rent from which economic benefits can be derived in the form of value-added local employment, income, and food security, although these are highly location specific. Opportunities exist to review progress...

  6. News from the Library: Online particle physics information: a unique compilation of information resources in particle physics

    CERN Multimedia

    CERN Library

    2012-01-01

    Are you looking for some specific information in particle physics? For example, the main literature databases, data repositories or laboratories... Just go to the Library's Online Particle Physics Information page. There you'll find a wide selection of relevant information, as well as resources in particle physics and related areas. The collection covers all aspects of the discipline - in addition to traditional scientific information resources you can find, for example, a selection of relevant blogs and art websites. This webpage is an extended and regularly updated version of the chapter on Online Particle Physics Information in the Review of Particle Physics. It is maintained by the CERN Library team, which welcomes suggestions for additions and updates: library.desk@cern.ch.

  7. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials. Supplement 34, 1988.

    Science.gov (United States)

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    The Environmental Quality Instructional Resources Center in Columbus, Ohio, acquires, reviews, indexes, and announces both print (books, modules, units, etc.) and non-print (films, slides, video tapes, etc.) materials related to water quality and water resources education and instruction. In addition some materials related to pesticides, hazardous…

  8. Water Quality Instructional Resources Information System (IRIS): A Compilation of Abstracts to Water Quality and Water Resources Materials. Supplement 32, 1987.

    Science.gov (United States)

    Ohio State Univ., Columbus, OH. Information Reference Center for Science, Mathematics, and Environmental Education.

    The Environmental Quality Instructional Resources Center in Columbus, Ohio, acquires, reviews, indexes, and announces both print (books, modules, units, etc.) and non-print (films, slides, video tapes, etc.) materials related to water quality and water resources education and instruction. In addition some materials related to pesticides, hazardous…

  9. C to VHDL compiler

    Science.gov (United States)

    Berdychowski, Piotr P.; Zabolotny, Wojciech M.

    2010-09-01

    The main goal of the C to VHDL compiler project is to make the FPGA platform more accessible to scientists and software developers. The FPGA platform offers the unique ability to configure the hardware to implement virtually any dedicated architecture, and modern devices provide a sufficient number of hardware resources to implement parallel execution platforms with complex processing units. All this makes the FPGA platform very attractive for those looking for an efficient heterogeneous computing environment. The current industry standard in the development of digital systems on the FPGA platform is based on HDLs. Although very effective and expressive in the hands of hardware development specialists, these languages require specific knowledge and experience that is out of reach for most scientists and software programmers. The C to VHDL compiler project attempts to remedy that by creating an application that derives an initial VHDL description of a digital system (for further compilation and synthesis) from a purely algorithmic description in the C programming language. The idea itself is not new, and the C to VHDL compiler combines the best approaches from existing solutions developed over many previous years with the introduction of some new, unique improvements.

  10. Braille Literacy: Resources for Instruction, Writing Equipment, and Supplies. NLS Reference Circulars

    Science.gov (United States)

    Peaco, Freddie L., Comp.

    2004-01-01

    This reference circular lists instructional materials, supplies, and equipment currently available for learning braille, and cites sources about braille literacy. The resources given are intended to assist sighted individuals who are interested in learning braille or want to transcribe print materials into braille; instructors who teach braille;…

  11. Algorithmic synthesis using Python compiler

    Science.gov (United States)

    Cieszewski, Radoslaw; Romaniuk, Ryszard; Pozniak, Krzysztof; Linczuk, Maciej

    2015-09-01

    This paper presents a Python-to-VHDL compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and translates it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the programmed circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors and possibly exploiting a greater level of parallelism. This can be achieved by using many computational resources at the same time. Creating parallel programs implemented in FPGAs in pure HDL is difficult and time consuming. Using a higher level of abstraction and a High-Level Synthesis compiler, implementation time can be reduced. The compiler has been implemented in the Python language. This article describes the design, implementation and results of the created tools.
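    As a hedged illustration of what such a tool consumes (the function and its style are invented for this note, not taken from the paper or its actual API), an "algorithmic description" in plain Python with fixed loop bounds is the kind of input a high-level synthesis flow can map to parallel hardware:

```python
# Hypothetical example of an algorithmic description in plain Python, of the
# kind a Python-to-VHDL compiler might translate into hardware. The names and
# style are illustrative assumptions, not the tool's actual interface.

def moving_average(samples, window=4):
    """Fixed-window moving average: the inner loop has a static bound,
    so a synthesis tool could unroll it into parallel adders, and the
    divide by a power of two reduces to a shift."""
    out = []
    for i in range(len(samples) - window + 1):
        acc = 0
        for j in range(window):    # fixed bound: maps to unrolled hardware
            acc += samples[i + j]
        out.append(acc // window)  # power-of-two divide -> shift in hardware
    return out

print(moving_average([1, 2, 3, 4, 5, 6, 7, 8]))  # -> [2, 3, 4, 5, 6]
```

    The restriction to static bounds and integer arithmetic is what makes such descriptions synthesizable at all; fully dynamic Python (unbounded lists, recursion, reflection) has no direct hardware counterpart.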

  12. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-10-01

    This is the third issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section approximately every six months. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations

  13. Compilation of solar abundance data

    International Nuclear Information System (INIS)

    Hauge, Oe.; Engvold, O.

    1977-01-01

    Interest in the previous compilations of solar abundance data by the same authors (ITA--31 and ITA--39) has led to this third, revised edition. Solar abundance data of 67 elements are tabulated and in addition upper limits for the abundances of 5 elements are listed. References are made to 167 papers. A recommended abundance value is given for each element. (JIW)

  14. Geologic utility of improved orbital measurement capabilities in reference to non-renewable resources

    Science.gov (United States)

    Stewart, H.; Marsh, S.

    1982-01-01

    Spectral and spatial characteristics necessary for future orbital remote sensing systems are defined. The conclusions are based on the past decade of experience in exploring for non-renewable resources with reference to data from ground, aircraft, and orbital systems. Two principal areas of investigation are used in the discussion: a structural interpretation in a basin area for hydrocarbon exploration, and a discrimination of altered areas in the Cuprite district in Nevada.

  15. Consideration of reference points for the management of renewable resources under an adaptive management paradigm

    Science.gov (United States)

    Irwin, Brian J.; Conroy, Michael J.

    2013-01-01

    The success of natural resource management depends on monitoring, assessment and enforcement. In support of these efforts, reference points (RPs) are often viewed as critical values of management-relevant indicators. This paper considers RPs from the standpoint of objective-driven decision making in dynamic resource systems, guided by principles of structured decision making (SDM) and adaptive resource management (AM). During the development of natural resource policy, RPs have been variously treated as either ‘targets’ or ‘triggers’. Under a SDM/AM paradigm, target RPs correspond approximately to value-based objectives, which may in turn be either of fundamental interest to stakeholders or intermediaries to other central objectives. By contrast, trigger RPs correspond to decision rules that are presumed to lead to desirable outcomes (such as the programme targets). Casting RPs as triggers or targets within a SDM framework is helpful towards clarifying why (or whether) a particular metric is appropriate. Further, the benefits of a SDM/AM process include elucidation of underlying untested assumptions that may reveal alternative metrics for use as RPs. Likewise, a structured decision-analytic framework may also reveal that failure to achieve management goals is not because the metrics are wrong, but because the decision-making process in which they are embedded is insufficiently robust to uncertainty, is not efficiently directed at producing a resource objective, or is incapable of adaptation to new knowledge.
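    The target/trigger distinction the paper draws can be made concrete in a small sketch (all names, thresholds and the halving rule below are invented for illustration, not drawn from the paper): a target RP is a value the manager steers toward, while a trigger RP sits inside a decision rule that fires a presumed-beneficial action.

```python
# Illustrative sketch of target vs. trigger reference points (RPs).
# All names and numbers are hypothetical, not taken from the paper.

TARGET_BIOMASS = 1000.0   # target RP: a value-based management objective
TRIGGER_BIOMASS = 400.0   # trigger RP: threshold inside a decision rule

def harvest_decision(estimated_biomass, current_quota):
    """Trigger-style use: cut the quota when the stock estimate
    falls below the trigger reference point."""
    if estimated_biomass < TRIGGER_BIOMASS:
        return current_quota * 0.5  # presumed-beneficial action
    return current_quota

def distance_to_target(estimated_biomass):
    """Target-style use: measure progress toward the objective."""
    return TARGET_BIOMASS - estimated_biomass

print(harvest_decision(350.0, 100.0))   # below trigger -> quota halved
print(distance_to_target(350.0))        # shortfall relative to the target
```

    In SDM/AM terms, the fixed 0.5 multiplier embodies an untested assumption about system response; an adaptive process would treat it as a quantity to be learned rather than a constant.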

  16. Evaluation of a fungal collection as certified reference material producer and as a biological resource center

    Directory of Open Access Journals (Sweden)

    Tatiana Forti

    2016-06-01

    Considering the absence of standards for culture collections, and more specifically for biological resource centers, worldwide, in addition to the absence of certified biological material in Brazil, this study aimed to evaluate a fungal collection from Fiocruz as a producer of certified reference material and as a Biological Resource Center (BRC). For this evaluation, a checklist based on the requirements of ABNT ISO GUIA 34:2012, correlated with ABNT NBR ISO/IEC 17025:2005, was designed and applied. Complementing the application of the checklist, an internal audit was performed. An evaluation of this collection as a BRC was also conducted following the requirements of NIT-DICLA-061, the Brazilian internal standard from Inmetro, based on ABNT NBR ISO/IEC 17025:2005, ABNT ISO GUIA 34:2012 and the OECD Best Practice Guidelines for BRCs. This was the first time that NIT-DICLA-061 was applied in a culture collection during an internal audit. The assessments enabled a proposal for the adequacy of this collection to assure the implementation of the management system for its future accreditation by Inmetro as a certified reference material producer, as well as its future accreditation as a Biological Resource Center according to NIT-DICLA-061.

  17. Evaluation of a fungal collection as certified reference material producer and as a biological resource center.

    Science.gov (United States)

    Forti, Tatiana; Souto, Aline da S S; do Nascimento, Carlos Roberto S; Nishikawa, Marilia M; Hubner, Marise T W; Sabagh, Fernanda P; Temporal, Rosane Maria; Rodrigues, Janaína M; da Silva, Manuela

    2016-01-01

    Considering the absence of standards for culture collections, and more specifically for biological resource centers, worldwide, in addition to the absence of certified biological material in Brazil, this study aimed to evaluate a fungal collection from Fiocruz as a producer of certified reference material and as a Biological Resource Center (BRC). For this evaluation, a checklist based on the requirements of ABNT ISO GUIA 34:2012, correlated with ABNT NBR ISO/IEC 17025:2005, was designed and applied. Complementing the application of the checklist, an internal audit was performed. An evaluation of this collection as a BRC was also conducted following the requirements of NIT-DICLA-061, the Brazilian internal standard from Inmetro, based on ABNT NBR ISO/IEC 17025:2005, ABNT ISO GUIA 34:2012 and the OECD Best Practice Guidelines for BRCs. This was the first time that NIT-DICLA-061 was applied in a culture collection during an internal audit. The assessments enabled a proposal for the adequacy of this collection to assure the implementation of the management system for its future accreditation by Inmetro as a certified reference material producer, as well as its future accreditation as a Biological Resource Center according to NIT-DICLA-061. Copyright © 2016 Sociedade Brasileira de Microbiologia. Published by Elsevier Editora Ltda. All rights reserved.

  18. References:

    African Journals Online (AJOL)

    brain drain”'. Globalization and Health 2006, 2:12 doi: 10.1186/1744-8603-2-12. 3. Zijlstra, E., Broadhead, R. 2007. The College of Medicine in the. Republic of Malawi: towards sustainable staff development, Human. Resources for Health 2007, ...

  19. Engineering a compiler

    CERN Document Server

    Cooper, Keith D

    2012-01-01

    As computing has changed, so has the role of both the compiler and the compiler writer. The proliferation of processors, environments, and constraints demands an equally large number of compilers. To adapt, compiler writers retarget code generators, add optimizations, and work on issues such as code space or power consumption. Engineering a Compiler re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in today's practice. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best-practice algorithms for the major problems inside a compiler.
    - Focuses on the back end of the compiler, reflecting the focus of research and development over the last decade
    - Applies the well-developed theory behind scanning and parsing to introduce concepts that play a critical role in optimization and code generation
    - Introduces the student to optimization through data-flow analysis, SSA form, and a selection of sc...

  20. SPARQL compiler for Bobox

    OpenAIRE

    Čermák, Miroslav

    2013-01-01

    The goal of this work is to design and implement a SPARQL compiler for the Bobox system. In addition to lexical and syntactic analysis corresponding to the W3C standard for the SPARQL language, it performs semantic analysis and optimization of queries. The compiler constructs an appropriate model for execution in Bobox, which depends on the physical database schema.

  1. The artifactualization of reference and "substances" on the Web. : Why (HTTP) URIs do not (always) refer nor resources hold by themselves (post-print)

    OpenAIRE

    Monnin , Alexandre

    2012-01-01

    ISSN 2155-9708, http://www.apaonline.org/APAOnline/Publication_Info/Newsletters/APAOnline/Publicatio; International audience; In this paper we show that URIs, sometimes dubbed "philosophical proper names", in fact do not always refer as proper names do. We provide an account explaining why, centered on the notion of "resource", which is central to Web architecture (webarch) and which we qualify ontologically.

  2. Experiences on current national income measures with reference to environmental and natural resources

    International Nuclear Information System (INIS)

    Franzese, R.; Gaudioso, D.

    1995-06-01

    The environment provides both a source of goods and services and a 'sink' for residues of the production and consumption processes. This is not reflected in conventional estimates of GDP (gross domestic product), the most commonly used measure of aggregate income. The purpose of this paper is to explore whether environmentally adjusted national income measures can be derived. In the first part, the authors discuss both the shortcomings of the current national income measures, with reference to environmental and natural resources, and the debate on these issues; they then analyse the existing experiences in providing environmentally adjusted indicators of national accounts. In the second part, the authors present an evaluation of the costs of environmental degradation in Italy in the period 1988-1990, based on the methodologies adopted in a pilot study carried out by UNSO (United Nations Statistical Office) and the World Bank for Mexico.

  3. Radioisotope techniques in water resources research and management with special reference to India

    International Nuclear Information System (INIS)

    Banerji, S.

    1977-01-01

    Nuclear techniques using radioisotopes that find application in the research and management of water resources are described briefly, with special reference to, and representative illustrations of, their applications in hydrologic studies in India. Because environmental isotopes, including man-made ones (i.e. those released in nuclear explosions), are intimately tied to the moisture and water circulation patterns in nature, measurement of their variation provides diagnostic information about the hydrologic parameters of the three phases of the hydrologic cycle, namely the atmospheric, surface and subsurface phases. Artificial radioisotopes are used for the measurement of water flow, sediment transport and seepage. Sealed radioisotope sources are employed in snow gauging, suspended sediment gauging and hydrologic logging. Areas for further research are suggested, and the need for emphasis on their use in India is indicated. (M.G.B.)

  4. Gulf Coast geopressured-geothermal program summary report compilation. Volume 2-A: Resource description, program history, wells tested, university and company based research, site restoration

    Energy Technology Data Exchange (ETDEWEB)

    John, C.J.; Maciasz, G.; Harder, B.J.

    1998-06-01

    The US Department of Energy established a geopressured-geothermal energy program in the mid-1970s as one response to America's need to develop alternate energy resources in view of the increasing dependence on imported fossil fuel energy. This program continued for 17 years and approximately two hundred million dollars were expended for various types of research and well testing to thoroughly investigate this alternative energy source. This volume describes the following studies: Geopressured-geothermal resource description; Resource origin and sediment type; Gulf Coast resource extent; Resource estimates; Project history; Authorizing legislation; Program objectives; Perceived constraints; Program activities and structure; Well testing; Program management; Program cost summary; Funding history; Resource characterization; Wells of opportunity; Edna Delcambre No. 1 well; Edna Delcambre well recompletion; Fairfax Foster Sutter No. 2 well; Beulah Simon No. 2 well; P.E. Girouard No. 1 well; Prairie Canal No. 1 well; Crown Zellerbach No. 2 well; Alice C. Plantation No. 2 well; Tenneco Fee N No. 1 well; Pauline Kraft No. 1 well; Saldana well No. 2; G.M. Koelemay well No. 1; Willis Hulin No. 1 well; Investigations of other wells of opportunity; Clovis A. Kennedy No. 1 well; Watkins-Miller No. 1 well; Lucien J. Richard et al No. 1 well; and the C and K-Frank A. Godchaux, III, well No. 1.

  5. Compilation of anatomical, physiological and metabolic characteristics for a Reference Asian Man. Volume 1: data summary and conclusions. Results of a co-ordinated research programme 1988-1993

    International Nuclear Information System (INIS)

    1998-02-01

    The Co-ordinated Research Programme (CRP) on Compilation of Anatomical, Physiological and Metabolic Characteristics for a Reference Asian Man has been conducted as a programme of the IAEA Regional Co-operative Agreement (RCA) for Asia and the Pacific. The CRP was conducted to provide data for radiation protection purposes that is relevant to the biokinetic and dosimetric characteristics of the ethnic populations in the Asian region. The radiological protection decisions that had to be made in the RCA member States following the Chernobyl accident were a significant motivation for establishing the CRP. Eleven RCA Member States participated in the CRP. Research co-ordination meetings (RCMs) for the CRP were held in Mito City, Japan, 17-21 October 1988 and Bhabha Atomic Research Centre, India, 8-12 April 1991. The concluding meeting was held in Tianjin, China, 25-29 October 1993. This publication is divided into two volumes: Volume 1 contains a summary of the data and conclusions from the project and Volume 2 the reports from participating countries

  6. Compilation of anatomical, physiological and metabolic characteristics for a Reference Asian Man. Volume 1: data summary and conclusions. Results of a co-ordinated research programme 1988-1993

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-02-01

    The Co-ordinated Research Programme (CRP) on Compilation of Anatomical, Physiological and Metabolic Characteristics for a Reference Asian Man has been conducted as a programme of the IAEA Regional Co-operative Agreement (RCA) for Asia and the Pacific. The CRP was conducted to provide data for radiation protection purposes that is relevant to the biokinetic and dosimetric characteristics of the ethnic populations in the Asian region. The radiological protection decisions that had to be made in the RCA member States following the Chernobyl accident were a significant motivation for establishing the CRP. Eleven RCA Member States participated in the CRP. Research co-ordination meetings (RCMs) for the CRP were held in Mito City, Japan, 17-21 October 1988 and Bhabha Atomic Research Centre, India, 8-12 April 1991. The concluding meeting was held in Tianjin, China, 25-29 October 1993. This publication is divided into two volumes: Volume 1 contains a summary of the data and conclusions from the project and Volume 2 the reports from participating countries. Refs, figs, tabs.

  7. Report compiled by Research Center for Carbonaceous Resources, Institute for Chemical Reaction Science, Tohoku University; Tohoku Daigaku Hanno Kagaku Kenkyusho tanso shigen hanno kenkyu center hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-04-01

    The Research Center for Carbonaceous Resources was established in April 1991 for the purpose of developing a comprehensive process for converting carbonaceous resources into clean fuels or into materials equipped with advanced functions. In this report, the track records etc. of the center are introduced. Under study in the conversion process research department is the organization of a comprehensive coal conversion process which will be a combination of solvent extraction, catalytic decomposition, and catalytic gasification, whose goal is to convert coal in a clean way at high efficiency. Under study in the conversion catalyst research department are the development of a coal denitrogenation method, development of a low-temperature gasification method by use of inexpensive catalysts, synthesis of C₂ hydrocarbons in a methane/carbon dioxide reaction, etc. Other endeavors under way involve the designing and development of new organic materials such as new carbon materials and a study of the foundation on which such efforts stand, that is, the study of the control of reactions between solids. Furthermore, in the study of interfacial reaction control, the contact gasification of coal, brown coal ion exchange capacity and surface conditions, carbonization of cation exchanged brown coal, etc., are being developed. (NEDO)

  8. Next Generation Computer Resources: Reference Model for Project Support Environments (Version 2.0)

    National Research Council Canada - National Science Library

    Brown, Alan

    1993-01-01

    The objective of the Next Generation Computer Resources (NGCR) program is to restructure the Navy's approach to acquisition of standard computing resources to take better advantage of commercial advances and investments...

  9. Gulf Coast geopressured-geothermal program summary report compilation. Volume 2-B: Resource description, program history, wells tested, university and company based research, site restoration

    Energy Technology Data Exchange (ETDEWEB)

    John, C.J.; Maciasz, G.; Harder, B.J.

    1998-06-01

    The US Department of Energy established a geopressured-geothermal energy program in the mid-1970s as one response to America's need to develop alternate energy resources in view of the increasing dependence on imported fossil fuel energy. This program continued for 17 years and approximately two hundred million dollars were expended for various types of research and well testing to thoroughly investigate this alternative energy source. This volume describes the following studies: Design well program; LaFourche Crossing; MG-T/DOE Amoco Fee No. 1 (Sweet Lake); Environmental monitoring at Sweet Lake; Air quality; Water quality; Microseismic monitoring; Subsidence; Dow/DOE L.R. Sweezy No. 1 well; Reservoir testing; Environmental monitoring at Parcperdue; Air monitoring; Water runoff; Groundwater; Microseismic events; Subsidence; Environmental consideration at site; Gladys McCall No. 1 well; Test results of Gladys McCall; Hydrocarbons in production gas and brine; Environmental monitoring at the Gladys McCall site; Pleasant Bayou No. 2 well; Pleasant Bayou hybrid power system; Environmental monitoring at Pleasant Bayou; and Plug abandonment and well site restoration of three geopressured-geothermal test sites. 197 figs., 64 tabs.

  10. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-03-01

    This is the second issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations. It excludes references to "mass-chain" evaluations normally published in the "Nuclear Data Sheets" and "Nuclear Physics". The material contained in this compilation is sorted according to eight subject categories: general compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes: half-lives, energies and spectra; nuclear decay processes: gamma-rays; nuclear decay processes: fission products; nuclear decay processes: (others); atomic processes.

  11. Effects of resource activities upon repository siting and waste containment with reference to bedded salt

    International Nuclear Information System (INIS)

    Ashby, J.; Rowe, J.

    1980-02-01

    The primary consideration for the suitability of a nuclear waste repository site is the overall ability of the repository to safely contain radioactive waste. This report is a discussion of the past, present, and future effects of resource activities on waste containment. Past and present resource activities which provide release pathways (i.e., leaky boreholes, adjacent mines) will receive initial evaluation during the early stages of any repository site study. However, other resource activities which may have subtle effects on containment (e.g., long-term pumping causing increased groundwater gradients, invasion of saline water causing lower retardation) and all potential future resource activities must also be considered during the site evaluation process. Resource activities will affect both the siting and the designing of repositories. Ideally, sites should be located in areas of low resource activity and low potential for future activity, and repository design should seek to eliminate or minimize the adverse effects of any resource activity. Buffer zones should be created to provide areas in which resource activities that might adversely affect containment can be restricted or curtailed. This could mean removing large areas of land from resource development. The impact of these frozen assets should be assessed in terms of their economic value and of their effect upon resource reserves. This step could require a major effort in data acquisition and analysis followed by extensive numerical modeling of regional fluid flow and mass transport. Numerical models should be used to assess the effects of resource activity upon containment and should include the cumulative effects of different resource activities. Analysis by other methods is probably not possible except for relatively simple cases

  12. Prioritisation, resources and search terms: a study of decision-making at the virtual reference desk.

    OpenAIRE

    Attfield, Simon; Makri, Stephann; Kalbach, James; Blandford, Ann; De Gabrielle, Stephen; Edwards, Mark

    2008-01-01

    The reinterpretation of the traditional reference service in an online context is the virtual reference desk. Placing reference services into an online setting, however, presents many challenges. We report a study and analytic framework which addresses support for decision-making during virtual enquiry work. Focusing on specialist law-libraries, the study shows that enquirers do not volunteer important information to the service and that asynchronous communication media and some social obstac...

  13. Compilation of results 1987

    International Nuclear Information System (INIS)

    1987-01-01

    A compilation is presented which, in concentrated form, reports on research and development within the nuclear energy field covering a two-and-a-half-year period. The preceding report was edited in December 1984. The projects are presented with title, project number, responsible unit, contact person and short result reports. The result reports consist of short summaries of each project. (L.F.)

  14. Deaf-Blindness: National Organizations and Resources. Reference Circular No. 93-1.

    Science.gov (United States)

    Library of Congress, Washington, DC. National Library Service for the Blind and Physically Handicapped.

    This circular lists national organizations and print and audiovisual resources on areas of service to persons with deaf blindness, including rehabilitation, education, information and referral, recreation, and sources for adaptive devices and products. Section I is an alphabetical list of 40 national organizations and resources, including…

  15. 1991 OCRWM bulletin compilation and index

    International Nuclear Information System (INIS)

    1992-05-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management, to provide current information about the national program for managing spent fuel and high-level radioactive waste. The document is a compilation of issues from the 1991 calendar year. A table of contents and an index have been provided to reference information contained in this year's Bulletins

  16. Geological and geochemical aspects of uranium deposits: a selected, annotated bibliography. [474 references

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, J.M.; Garland, P.A.; White, M.B.; Daniel, E.W.

    1980-09-01

    This bibliography, a compilation of 474 references, is the fourth in a series compiled from the National Uranium Resource Evaluation (NURE) Bibliographic Data Base. This data base was created for the Grand Junction Office of the Department of Energy's National Uranium Resource Evaluation Project by the Ecological Sciences Information Center, Oak Ridge National Laboratory. The references in the bibliography are arranged by subject category: (1) geochemistry, (2) exploration, (3) mineralogy, (4) genesis of deposits, (5) geology of deposits, (6) uranium industry, (7) geology of potential uranium-bearing areas, and (8) reserves and resources. The references are indexed by author, geographic location, quadrangle name, geoformational feature, and keyword.

  17. Reference framework for integrating web resources as e-learning services in .LRN

    Directory of Open Access Journals (Sweden)

    Fabinton Sotelo Gómez

    2015-11-01

    Learning management systems (LMS) such as dotLRN (.LRN) have been widely disseminated and used as teaching tools. However, despite their great potential, most of these platforms do not allow easy integration of common Web services. Integration of external resources into an LMS is critical to extending the quantity and quality of the educational services the LMS provides. This article presents a set of criteria and architectural guidelines for the integration of Web resources for e-learning into the .LRN platform. To this end, three steps are performed: first, the possible integration technologies to be used are described; second, the Web resources that provide educational services and can be integrated into LMS platforms are analyzed; finally, some architectural aspects of the platform relevant to integration are identified. The main contributions of this paper are: a characterization of the Web resources and educational services available today on the Web; and the definition of criteria and guidelines for the integration of Web resources into .LRN.

  18. Anaerobic Digestion. Selected Instructional Activities and References. Instructional Resources Monograph Series.

    Science.gov (United States)

    Townsend, Robert D., Comp.

    Focusing specifically on the wastewater treatment process of anaerobic digestion, this document identifies instructional and reference materials for use by professionals in the field in the development and implementation of new programs or in the updating of existing programs. It is designed to help trainers, plant operators, educators, engineers,…

  19. Religious Studies: The Shaping of a Field and a Guide to Reference Resources.

    Science.gov (United States)

    Lippy, Charles H.

    1992-01-01

    Discusses the development of religious studies as an academic discipline. Examines the work of leading thinkers in the field, including anthropologists Sir James Frazer and Edward Burnett Tylor, sociologist Max Weber, and psychologist Erik Erikson. Identifies some of the many reference works that deal with religious studies. (SG)

  20. Reference Manual on Making School Climate Improvements. School Climate Improvement Resource Package, 2017

    Science.gov (United States)

    Yoder, N.; Darling-Churchill, K.; Colombi, G. D.; Ruddy, S.; Neiman, S.; Chagnon, E.; Mayo, R.

    2017-01-01

    This reference manual identifies five overarching sets of activities for improving school climate, with the goal of improving student outcomes (e.g., achievement, attendance, behaviors, and skills). These sets of activities help to initiate, implement, and sustain school climate improvements. For each activity set, the manual presents a clear…

  1. Electronic Resource Expenditure and the Decline in Reference Transaction Statistics in Academic Libraries

    Science.gov (United States)

    Dubnjakovic, Ana

    2012-01-01

    The current study investigates factors influencing the increase in reference transactions in a typical week in academic libraries across the United States of America. Employing multiple regression analysis and general linear modeling, variables of interest from the "Academic Library Survey (ALS) 2006" survey (sample size 3960 academic libraries) were…

  2. Delivering Electronic Resources with Web OPACs and Other Web-based Tools: Needs of Reference Librarians.

    Science.gov (United States)

    Bordeianu, Sever; Carter, Christina E.; Dennis, Nancy K.

    2000-01-01

    Describes Web-based online public access catalogs (Web OPACs) and other Web-based tools as gateway methods for providing access to library collections. Addresses solutions for overcoming barriers to information, such as through the implementation of proxy servers and other authentication tools for remote users. (Contains 18 references.)…

  3. A Resource and Reference Bibliography in Early Childhood Education and Developmental Psychology: The Affective Domain.

    Science.gov (United States)

    Feldman, Ronald, Comp.; Coopersmith, Stanley, Comp.

    This bibliography provides a comprehensive listing of the reference literature in early childhood (ages 2-9) psychology and education dealing with the affective domain. Categories such as achievement motivation; aggression; anger and frustration; character and moral development; creativity; games; and social behavior are included. One of the 27…

  4. Advanced compiler design and implementation

    CERN Document Server

    Muchnick, Steven S

    1997-01-01

    From the Foreword by Susan L. Graham: This book takes on the challenges of contemporary languages and architectures, and prepares the reader for the new compiling problems that will inevitably arise in the future. The definitive book on advanced compiler design. This comprehensive, up-to-date work examines advanced issues in the design and implementation of compilers for modern processors. Written for professionals and graduate students, the book guides readers in designing and implementing efficient structures for highly optimizing compilers for real-world languages. Covering advanced issues in fundamental areas of compiler design, this book discusses a wide array of possible code optimizations, determining the relative importance of optimizations, and selecting the most effective methods of implementation. * Lays the foundation for understanding the major issues of advanced compiler design * Treats optimization in-depth * Uses four case studies of commercial compiling suites to illustrate different approache...

  5. Improving amphibian genomic resources: a multitissue reference transcriptome of an iconic invader.

    Science.gov (United States)

    Richardson, Mark F; Sequeira, Fernando; Selechnik, Daniel; Carneiro, Miguel; Vallinoto, Marcelo; Reid, Jack G; West, Andrea J; Crossland, Michael R; Shine, Richard; Rollins, Lee A

    2018-01-01

    Cane toads (Rhinella marina) are an iconic invasive species introduced to 4 continents and well utilized for studies of rapid evolution in introduced environments. Despite the long introduction history of this species, its profound ecological impacts, and its utility for demonstrating evolutionary principles, genetic information is sparse. Here we produce a de novo transcriptome spanning multiple tissues and life stages to enable investigation of the genetic basis of previously identified rapid phenotypic change over the introduced range. Using approximately 1.9 billion reads from developing tadpoles and 6 adult tissue-specific cDNA libraries, as well as a transcriptome assembly pipeline encompassing 100 separate de novo assemblies, we constructed 62 202 transcripts, of which we functionally annotated ∼50%. Our transcriptome assembly exhibits 90% full-length completeness of the Benchmarking Universal Single-Copy Orthologs data set. Robust assembly metrics and comparisons with several available anuran transcriptomes and genomes indicate that our cane toad assembly is one of the most complete anuran genomic resources available. This comprehensive anuran transcriptome will provide a valuable resource for investigation of genes under selection during invasion in cane toads, but will also greatly expand our general knowledge of anuran genomes, which are underrepresented in the literature. The data set is publicly available in NCBI and GigaDB to serve as a resource for other researchers. © The Authors 2017. Published by Oxford University Press.

  6. The Planteome database: an integrated resource for reference ontologies, plant genomics and phenomics

    Science.gov (United States)

    Cooper, Laurel; Meier, Austin; Laporte, Marie-Angélique; Elser, Justin L; Mungall, Chris; Sinn, Brandon T; Cavaliere, Dario; Carbon, Seth; Dunn, Nathan A; Smith, Barry; Qu, Botong; Preece, Justin; Zhang, Eugene; Todorovic, Sinisa; Gkoutos, Georgios; Doonan, John H; Stevenson, Dennis W; Arnaud, Elizabeth

    2018-01-01

    Abstract The Planteome project (http://www.planteome.org) provides a suite of reference and species-specific ontologies for plants and annotations to genes and phenotypes. Ontologies serve as common standards for semantic integration of a large and growing corpus of plant genomics, phenomics and genetics data. The reference ontologies include the Plant Ontology, Plant Trait Ontology and the Plant Experimental Conditions Ontology developed by the Planteome project, along with the Gene Ontology, Chemical Entities of Biological Interest, Phenotype and Attribute Ontology, and others. The project also provides access to species-specific Crop Ontologies developed by various plant breeding and research communities from around the world. We provide integrated data on plant traits, phenotypes, and gene function and expression from 95 plant taxa, annotated with reference ontology terms. The Planteome project is developing a plant gene annotation platform, Planteome Noctua, to facilitate community engagement. All the Planteome ontologies are publicly available and are maintained at the Planteome GitHub site (https://github.com/Planteome) for sharing, tracking revisions and new requests. The annotated data are freely accessible from the ontology browser (http://browser.planteome.org/amigo) and our data repository. PMID:29186578

  7. The Enterprise Resource Planning (ERP) in small businesses: facing theoretical references and the business world

    OpenAIRE

    Mendes, Juliana Veiga; Escrivão Filho, Edmundo

    2002-01-01

    In recent years, integrated management systems, or ERP (Enterprise Resource Planning) systems, have come to be widely used by companies. They are presented as the "solution" to most business problems: generic systems capable of integrating all the information that flows through a company by means of a single database. The literature on the subject reports a series of positive results and benefits to be obtained by adopting these systems. However, the ...

  8. A User-Friendly, Keyword-Searchable Database of Geoscientific References Through 2007 for Afghanistan

    Science.gov (United States)

    Eppinger, Robert G.; Sipeki, Julianna; Scofield, M.L.

    2008-01-01

    This report includes a document and accompanying Microsoft Access 2003 database of geoscientific references for the country of Afghanistan. The reference compilation is part of a larger joint study of Afghanistan's energy, mineral, and water resources, and geologic hazards currently underway by the U.S. Geological Survey, the British Geological Survey, and the Afghanistan Geological Survey. The database includes both published (n = 2,489) and unpublished (n = 176) references compiled through calendar year 2007. The references comprise two separate tables in the Access database. The reference database includes a user-friendly, keyword-searchable interface and only minimum knowledge of the use of Microsoft Access is required.
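The keyword-search idea behind such a reference database can be sketched with Python's standard-library sqlite3 (the actual database is Microsoft Access with its own search form; the schema, column names, and example rows below are hypothetical, not taken from the real compilation):

```python
import sqlite3

# Hypothetical schema echoing the report's published/unpublished split.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE refs (
    author TEXT, year INTEGER, title TEXT,
    keywords TEXT, published INTEGER)""")
rows = [
    ("Doe", 1977, "Chromite occurrences of the Hindu Kush", "chromite;mineral", 1),
    ("Roe", 2005, "Groundwater resources near Kabul", "water;aquifer", 1),
    ("Poe", 1999, "Unpublished field notes, Helmand basin", "water;geology", 0),
]
conn.executemany("INSERT INTO refs VALUES (?, ?, ?, ?, ?)", rows)

def keyword_search(term):
    """Return (author, year, title) rows whose keywords or title match."""
    like = f"%{term}%"
    cur = conn.execute(
        "SELECT author, year, title FROM refs "
        "WHERE keywords LIKE ? OR title LIKE ? ORDER BY year",
        (like, like))
    return cur.fetchall()

print(keyword_search("water"))
```

SQLite's `LIKE` is case-insensitive for ASCII by default, so a single lowercase search term matches titles in any capitalization, much as a user-friendly search form would.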

  9. Pathbase: A new reference resource and database for laboratory mouse pathology

    International Nuclear Information System (INIS)

    Schofield, P. N.; Bard, J. B. L.; Boniver, J.; Covelli, V.; Delvenne, P.; Ellender, M.; Engstrom, W.; Goessner, W.; Gruenberger, M.; Hoefler, H.; Hopewell, J. W.; Mancuso, M.; Mothersill, C.; Quintanilla-Martinez, L.; Rozell, B.; Sariola, H.; Sundberg, J. P.; Ward, A.

    2004-01-01

    Pathbase (http://www.pathbase.net) is a web-accessible database of histopathological images of laboratory mice, developed as a resource for the coding and archiving of data derived from the analysis of mutant or genetically engineered mice and their background strains. The metadata for the images, which allows retrieval and inter-operability with other databases, is derived from a series of orthogonal ontologies and controlled vocabularies. One of these controlled vocabularies, MPATH, was developed by the Pathbase Consortium as a formal description of the content of mouse histopathological images. The database currently has over 1000 images on-line with 2000 more under curation and presents a paradigm for the development of future databases dedicated to aspects of experimental biology. (authors)

  10. A contemporary Colombian skeletal reference collection: A resource for the development of population specific standards.

    Science.gov (United States)

    Sanabria-Medina, Cesar; González-Colmenares, Gretel; Restrepo, Hadaluz Osorio; Rodríguez, Juan Manuel Guerrero

    2016-09-01

    Several authors who have discussed human variability and its impact on the forensic identification of bodies point to the need for regional studies documenting global variation in the osteological characteristics that aid in establishing the biological profile (sex, ancestry, biological age, and stature). This is primarily accomplished by studying documented human skeletal collections, which make it possible to investigate, among other topics, secular trends in skeletal development and aging in the Colombian population. The purpose of this paper is to disclose the details of the new "Contemporary Colombian Skeletal Reference Collection", which currently comprises 600 identified skeletons of both sexes of individuals who died between 2005 and 2008, with information on cause of death. This collection has great potential for research, is open to the national and international community, and offers opportunities to address a variety of topics such as osteopathology, bone trauma, and taphonomy. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. A comprehensive reference transcriptome resource for the common house spider Parasteatoda tepidariorum.

    Directory of Open Access Journals (Sweden)

    Nico Posnien

    Full Text Available Parasteatoda tepidariorum is an increasingly popular model for the study of spider development and the evolution of development more broadly. However, fully understanding the regulation and evolution of P. tepidariorum development in comparison to other animals requires a genomic perspective. Although research on P. tepidariorum has provided major new insights, gene analysis to date has been limited to candidate gene approaches. Furthermore, the few available EST collections are based on embryonic transcripts, which have not been systematically annotated and are unlikely to contain transcripts specific to post-embryonic stages of development. We therefore generated cDNA from pooled embryos representing all described embryonic stages, as well as post-embryonic stages including nymphs, larvae and adults, and using Illumina HiSeq technology obtained a total of 625,076,514 100-bp paired-end reads. We combined these data with 24,360 ESTs available in GenBank, and 1,040,006 reads newly generated from 454 pyrosequencing of a mixed-stage embryo cDNA library. The combined sequence data were assembled using a custom de novo assembly strategy designed to optimize assembly product length, number of predicted transcripts, and proportion of raw reads incorporated into the assembly. The de novo assembly generated 446,427 contigs with an N50 of 1,875 bp. These sequences obtained 62,799 unique BLAST hits against the NCBI non-redundant protein database, including putative orthologs to 8,917 Drosophila melanogaster genes based on best reciprocal BLAST hit identity compared with the D. melanogaster proteome. Finally, we explored the utility of the transcriptome for RNA-Seq studies, and showed that this resource can be used as a mapping scaffold to detect differential gene expression in different cDNA libraries. This resource will therefore provide a platform for future genomic, gene expression and functional approaches using P. tepidariorum.
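The assembly statistics quoted above include an N50 of 1,875 bp. N50 is the contig length L such that contigs of length ≥ L together contain at least half of the total assembled bases; a minimal sketch (function name mine):

```python
def n50(contig_lengths):
    """Return the N50: the length L such that contigs of length >= L
    together cover at least half of the total assembled bases."""
    lengths = sorted(contig_lengths, reverse=True)
    half = sum(lengths) / 2
    running = 0
    for length in lengths:
        running += length
        if running >= half:
            return length
    return 0

# Toy example (not the real assembly): total = 100, half = 50.
print(n50([40, 30, 20, 10]))  # 30, since 40 + 30 = 70 >= 50
```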

  12. Compiler Feedback using Continuous Dynamic Compilation during Development

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Probst, Christian W.

    2014-01-01

    to optimization. This tool can help programmers understand what the optimizing compiler has done and suggest automatic source code changes in cases where the compiler refrains from optimizing. We have integrated our tool into an integrated development environment, interactively giving feedback as part...

  13. Pick up a book or "google it?" a survey of radiologist and trainee-preferred references and resources.

    Science.gov (United States)

    Niederhauser, Blake D; Liaw, Kevin; McDonald, Robert J; Thomas, Kristen B; Hudson, Kathleen T; Kallmes, David F

    2014-02-01

    The purpose of this study was to investigate radiologist and trainee-preferred sources for solving imaging questions. The institutional review board determined this study to be exempt from informed consent requirements. Web-based surveys were distributed to radiology staff and trainees at 16 academic institutions. Surveys queried ownership and use of tablet computers and habits of utilization of various electronic and hardcopy resources for general reference. For investigating specific cases, respondents identified a single primary resource. Comparisons were performed using Fisher's exact test. For staff, use of Google and online journals was nearly universal for general imaging questions (93 % [103/111] and 94 % [104/111], respectively). For trainees, Google and resident-generated study materials were commonly utilized for such questions (82 % [111/135] and 74 % [100/135], respectively). For specific imaging questions, online journals and PubMed were rarely chosen as a primary resource; the most common primary resources were STATdx for trainees and Google for staff (44 % [55/126] and 52 % [51/99], respectively). Use of hard copy journals was nearly absent among trainees. Sixty percent of trainees (78/130) own a tablet computer versus 41 % of staff (46/111; p = 0.005), and 71 % (55/78) of those trainees reported at least weekly use of radiology-specific tablet applications, compared to 48 % (22/46) of staff. Staff radiologists rely heavily on Google for both general and specific imaging queries, while residents utilize customized, radiology-focused products and apps. Interestingly, residents note continued use of hard copy books but have replaced hard copy journals with online resources.

  14. Hawaii demand-side management resource assessment. Final report, Reference Volume 1: Building prototype analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    This report provides a detailed description of, and the baseline assumptions and simulation results for, the building prototype simulations conducted for the building types designated in the Work Plan for Demand-side Management Assessment of Hawaii's Demand-Side Resources (HES-4, Phase 2). This report represents the second revision to the initial building prototype description report provided to DBEDT early in the project. Modifications and revisions to the prototypes, based on further calibration efforts and on comments received from DBEDT Staff, have been incorporated into this final version. These baseline prototypes form the basis upon which the DSM measure impact estimates and the DSM measure data base were developed for this project. This report presents detailed information for each of the 17 different building prototypes developed for use with the DOE-2.1E program (23 buildings in total, including resorts and hotels defined separately for each island) to estimate the impact of the building technologies and measures included in this project. The remainder of this section presents some nomenclature and terminology utilized in the reports, tables, and data bases developed from this project to denote building type and vintage. Section 2 contains a more detailed discussion of the data sources, the definition of the residential sector building prototypes, and results of the DOE-2 analysis. Section 3 provides a similar discussion for the commercial sector. The prototype and baseline simulation results are presented in a separate section for each building type. Where possible, comparison of the baseline simulation results with benchmark data from the ENERGY 2020 model or other demand forecasting models specific to Hawaii is included for each building. Appendix A contains a detailed listing of the commercial sector baseline indoor lighting technologies included in the existing and new prototypes by building type.

  15. Compiling a 50-year journey

    DEFF Research Database (Denmark)

    Hutton, Graham; Bahr, Patrick

    2017-01-01

    Fifty years ago, John McCarthy and James Painter published the first paper on compiler verification, in which they showed how to formally prove the correctness of a compiler that translates arithmetic expressions into code for a register-based machine. In this article, we revisit this example...
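The McCarthy and Painter result proves that running compiled code yields the same value as directly evaluating the source expression. A minimal sketch of that correctness statement, under stated assumptions: the expression language has only constants, variables, and addition; the original paper targets a register-based machine, whereas a stack machine keeps this illustration short; all names are mine, not the paper's.

```python
def evaluate(expr, env):
    """Direct evaluation of an expression: int, variable name, or ('+', l, r)."""
    if isinstance(expr, int):
        return expr
    if isinstance(expr, str):          # variable lookup
        return env[expr]
    op, left, right = expr
    assert op == '+'
    return evaluate(left, env) + evaluate(right, env)

def compile_expr(expr):
    """Compile an expression to a list of stack-machine instructions."""
    if isinstance(expr, int):
        return [('PUSH', expr)]
    if isinstance(expr, str):
        return [('LOAD', expr)]
    op, left, right = expr
    assert op == '+'
    return compile_expr(left) + compile_expr(right) + [('ADD',)]

def run(code, env):
    """Execute compiled code on a tiny stack machine."""
    stack = []
    for instr in code:
        if instr[0] == 'PUSH':
            stack.append(instr[1])
        elif instr[0] == 'LOAD':
            stack.append(env[instr[1]])
        elif instr[0] == 'ADD':
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack[-1]

# The correctness statement: compiled code agrees with direct evaluation.
expr = ('+', 'x', ('+', 2, 'y'))
env = {'x': 1, 'y': 3}
assert run(compile_expr(expr), env) == evaluate(expr, env) == 6
```

The paper's formal proof proceeds by structural induction on the expression, exactly the shape of the recursion in `compile_expr` above.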

  16. A Note on Compiling Fortran

    Energy Technology Data Exchange (ETDEWEB)

    Busby, L. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-01

    Fortran modules tend to serialize compilation of large Fortran projects, by introducing dependencies among the source files. If file A depends on file B (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran and IBM xlf, at least) offer an option to "verify syntax", with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: The first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8× speedup in the case study below.

  17. Contaminant Hazard Reviews (compilation)

    Science.gov (United States)

    Eisler, R.; Munro, R.E.; Loges, L.M.; Boone, K.; Paul, M.M.; Garrett, L.J.

    2000-01-01

    This compact disc (CD) contains the 35 reports in the Contaminant Hazard Reviews (CHR) that were published originally between 1985 and 1999 in the U.S. Department of the Interior Biological Report series. The CD was produced because printed supplies of these reviews--a total of 105,000--became exhausted and demand remained high. Each review was prepared at the request of environmental specialists of the U.S. Fish and Wildlife Service and each contained specific information on the following: mirex, cadmium, carbofuran, toxaphene, selenium, chromium, polychlorinated biphenyls, dioxins, diazinon, mercury, polycyclic aromatic hydrocarbons, arsenic, chlorpyrifos, lead, tin, index issue, pentachlorophenol, atrazine, molybdenum, boron, chlordane, paraquat, cyanide, fenvalerate, diflubenzuron, zinc, famphur, acrolein, radiation, sodium monofluoroacetate, planar PCBs, silver, copper, nickel, and a cumulative index to chemicals and species. Each report reviewed and synthesized the technical literature on a single contaminant and its effects on terrestrial plants and invertebrates, aquatic plants and animals, avian and mammalian wildlife, and other natural resources. The subtopics include contaminant sources and uses; physical, chemical, and metabolic properties; concentrations in field collections of abiotic materials and living organisms; deficiency effects, where appropriate; lethal and sublethal effects, including effects on survival, growth, reproduction, metabolism, mutagenicity, teratogenicity, and carcinogenicity; proposed criteria for the protection of human health and sensitive natural resources; and recommendations for additional research.

  18. Internet resources for dentistry: computer, Internet, reference, and sites for enhancing personal productivity of the dental professional.

    Science.gov (United States)

    Guest, G F

    2000-08-15

    At the onset of the new millennium the Internet has become the new standard means of distributing information. In the last two to three years there has been an explosion of e-commerce with hundreds of new web sites being created every minute. For most corporate entities, a web site is as essential as the phone book listing used to be. Twenty years ago technologists directed how computer-based systems were utilized. Now it is the end users of personal computers that have gained expertise and drive the functionality of software applications. The computer, initially invented for mathematical functions, has transitioned from this role to an integrated communications device that provides the portal to the digital world. The Web needs to be used by healthcare professionals, not only for professional activities, but also for instant access to information and services "just when they need it." This will facilitate the longitudinal use of information as society continues to gain better information access skills. With the demand for current "just in time" information and the standards established by Internet protocols, reference sources of information may be maintained in dynamic fashion. News services have been available through the Internet for several years, but now reference materials such as online journals and digital textbooks have become available and have the potential to change the traditional publishing industry. The pace of change should make us consider Will Rogers' advice, "It isn't good enough to be moving in the right direction. If you are not moving fast enough, you can still get run over!" The intent of this article is to complement previous articles on Internet Resources published in this journal, by presenting information about web sites that present information on computer and Internet technologies, reference materials, news information, and information that lets us improve personal productivity. Neither the author, nor the Journal endorses any of the

  19. A compiler for variational forms

    OpenAIRE

    Kirby, Robert C.; Logg, Anders

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in s...

  20. Not mere lexicographic cosmetics: the compilation and structural ...

    African Journals Online (AJOL)

    This article offers a brief overview of the compilation of the Ndebele music terms dictionary, Isichazamazwi SezoMculo (henceforth the ISM), paying particular attention to its structural features. It emphasises that the reference needs of the users as well as their reference skills should be given a determining role in all ...

  1. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables. Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees. With the proliferation of open source, understanding these issues is increasingly the res

  2. 1988 Bulletin compilation and index

    International Nuclear Information System (INIS)

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  3. 1988 Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  4. The provision of Primary Health Care in two rural districts of the Eastern Cape Province with particular reference to human resources and accessibility

    Directory of Open Access Journals (Sweden)

    M. Thipanyana

    1998-09-01

    Full Text Available The provision of Primary Health Care (PHC) services is still a problem in developing countries like South Africa. Some countries have sufficient human resources, while others have sufficient material resources. A combined qualitative and quantitative study was conducted in Mqanduli and part of the Eastern Elliotdale district in the Eastern Cape Province, with the aim of investigating the provision of Primary Health Care services, with particular reference to the availability of human resources and the accessibility of PHC services.

  5. HAL/S-FC compiler system specifications

    Science.gov (United States)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  6. Cosmetics Europe compilation of historical serious eye damage/eye irritation in vivo data analysed by drivers of classification to support the selection of chemicals for development and evaluation of alternative methods/strategies: the Draize eye test Reference Database (DRD).

    Science.gov (United States)

    Barroso, João; Pfannenbecker, Uwe; Adriaens, Els; Alépée, Nathalie; Cluzel, Magalie; De Smedt, Ann; Hibatallah, Jalila; Klaric, Martina; Mewes, Karsten R; Millet, Marion; Templier, Marie; McNamee, Pauline

    2017-02-01

    A thorough understanding of which of the effects assessed in the in vivo Draize eye test are responsible for driving UN GHS/EU CLP classification is critical for an adequate selection of chemicals to be used in the development and/or evaluation of alternative methods/strategies and for properly assessing their predictive capacity and limitations. For this reason, Cosmetics Europe has compiled a database of Draize data (Draize eye test Reference Database, DRD) from external lists that were created to support past validation activities. This database contains 681 independent in vivo studies on 634 individual chemicals representing a wide range of chemical classes. A description of all the ocular effects observed in vivo, i.e. degree of severity and persistence of corneal opacity (CO), iritis, and/or conjunctiva effects, was added for each individual study in the database, and the studies were categorised according to their UN GHS/EU CLP classification and the main effect driving the classification. An evaluation of the various in vivo drivers of classification compiled in the database was performed to establish which of these are most important from a regulatory point of view. These analyses established that the most important drivers for Cat 1 Classification are (1) CO mean ≥ 3 (days 1-3) (severity) and (2) CO persistence on day 21 in the absence of severity, and those for Cat 2 classification are (3) CO mean ≥ 1 and (4) conjunctival redness mean ≥ 2. Moreover, it is shown that all classifiable effects (including persistence and CO = 4) should be present in ≥60 % of the animals to drive a classification. As a consequence, our analyses suggest the need for a critical revision of the UN GHS/EU CLP decision criteria for the Cat 1 classification of chemicals. Finally, a number of key criteria are identified that should be taken into consideration when selecting reference chemicals for the development, evaluation and/or validation of alternative methods and
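The drivers of classification identified above amount to a small decision procedure. As a hedged sketch only: the thresholds are taken from the abstract, but the function name, argument names, and the exact ordering of the checks are my own assumptions, not an algorithm given in the paper.

```python
def ghs_eye_category(co_mean, co_persistent_day21, conj_redness_mean,
                     fraction_of_animals_affected):
    """Illustrative sketch of the UN GHS/EU CLP drivers from the abstract.

    co_mean: mean corneal opacity (CO) score over days 1-3
    co_persistent_day21: CO still present on day 21
    conj_redness_mean: mean conjunctival redness score
    fraction_of_animals_affected: fraction of animals showing the effect
    """
    # Classifiable effects should be present in >= 60% of animals.
    if fraction_of_animals_affected < 0.6:
        return "No Cat"
    # Cat 1 drivers: severity (CO mean >= 3), or persistence on day 21
    # in the absence of severity.
    if co_mean >= 3 or co_persistent_day21:
        return "Cat 1"
    # Cat 2 drivers: CO mean >= 1 or conjunctival redness mean >= 2.
    if co_mean >= 1 or conj_redness_mean >= 2:
        return "Cat 2"
    return "No Cat"
```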

  7. VFC: The Vienna Fortran Compiler

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    1999-01-01

    Full Text Available High Performance Fortran (HPF) offers an attractive high‐level language interface for programming scalable parallel architectures providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC), a new source‐to‐source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influence a program’s performance. This comprises data locality assertions, non‐local access specifications and the possibility of reusing runtime‐generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high‐level data parallel language such as HPF+ a performance close to hand‐written message‐passing programs can be achieved even for highly irregular codes.

  8. Use of Validation by Enterprises for Human Resource and Career Development Purposes. Cedefop Reference Series No 96

    Science.gov (United States)

    Cedefop - European Centre for the Development of Vocational Training, 2014

    2014-01-01

    European enterprises give high priority to assessing skills and competences, seeing this as crucial for recruitment and human resource management. Based on a survey of 400 enterprises, 20 in-depth case studies and interviews with human resource experts in 10 countries, this report analyses the main purposes of competence assessment, the standards…

  9. National energetic balance. Statistical compilation 1985-1991

    International Nuclear Information System (INIS)

    1992-01-01

    Compiles the statistical information supplied by the governmental and private institutions that make up the national energy sector in Paraguay. The first part covers overall energy supply; the second, energy transformation centres; and the last presents energy flows, consolidated balances and other energy-economic indicators

  10. There is a Relationship between Resource Expenditures and Reference Transactions in Academic Libraries. A Review of: Dubnjakovic, A. (2012). Electronic resource expenditure and the decline in reference transaction statistics in academic libraries. Journal of Academic Librarianship, 38(2), 94-100. doi:10.1016/j.acalib.2012.01.001

    Directory of Open Access Journals (Sweden)

    Annie M. Hughes

    2013-03-01

    Full Text Available Objective – To provide an analysis of the impact of expenditures on electronic resources and gate counts on the increase or decrease in reference transactions. Design – Analysis of results of existing survey data from the National Center for Educational Statistics (NCES) 2006 Academic Library Survey (ALS). Setting – Academic libraries in the United States. Subjects – 3925 academic library respondents. Methods – The author chose to use survey data collected from the 2006 ALS conducted by the NCES. The survey included data on various topics related to academic libraries, but in the case of this study, the author chose to analyze three of the 193 variables included. The three variables: electronic books expenditure, computer hardware and software, and expenditures on bibliographic utilities, were combined into one variable called electronic resource expenditure. Gate counts were also considered as a variable. Electronic resource expenditure was also split as a variable into three groups: low, medium, and high. Multiple regression analysis and general linear modeling, along with tests of reliability, were employed. Main Results – The author determined that low, medium, and high spenders with regard to electronic resources exhibited differences in gate counts, and gate counts have an effect on reference transactions in any given week. Gate counts tend to not have much of an effect on reference transactions for the higher spenders, and higher spenders tend to have a higher number of reference transactions overall. Low spenders have lower gate counts and also a lower amount of reference transactions. Conclusion – The findings from this study show that academic libraries spending more on electronic resources also tend to have an increase with regard to reference transactions. The author also concludes that library spaces are no longer the determining factor with regard to number of reference transactions. Spending more on electronic resources is
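
As a rough illustration of the kind of modelling the study describes (regressing reference transactions on a usage measure), here is a minimal ordinary-least-squares fit in pure Python; the data values are invented for the example and are not from the ALS survey:

```python
# Toy illustration of the regression idea: modelling weekly reference
# transactions as a function of gate count. The study itself used multiple
# regression and general linear modelling on 2006 ALS survey variables;
# the numbers below are synthetic.

def ols(x, y):
    """Ordinary least squares fit y = a + b*x (pure Python)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b          # intercept, slope

gate_counts = [100, 200, 300, 400]   # hypothetical weekly gate counts
transactions = [45, 80, 130, 160]    # hypothetical reference transactions
a, b = ols(gate_counts, transactions)
print(f"predicted transactions at 250 visitors: {a + 250 * b:.0f}")
```

The study's actual analysis additionally split libraries into low/medium/high expenditure groups and tested interactions, which a single-predictor fit like this does not capture.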

  11. Management strategies of marine food resources under multiple stressors with particular reference to the Yellow Sea large marine ecosystem

    Directory of Open Access Journals (Sweden)

    Qisheng TANG

    2014-02-01

    Full Text Available In this study two main management strategies are discussed: one is to develop resource conservation-based capture fisheries, and the other is to develop environmentally friendly aquaculture. During the resource recovery period, the development of environmentally friendly aquaculture should be encouraged, especially integrated multi-trophic aquaculture, which is adaptive, efficient and sustainable. For future development and a better understanding of the ecosystem, it is necessary to further strengthen basic research.

  12. PIG 3 - A simple compiler for mercury

    International Nuclear Information System (INIS)

    Bindon, D.C.

    1961-06-01

    A short machine language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  13. PIG 3 - A simple compiler for mercury

    Energy Technology Data Exchange (ETDEWEB)

    Bindon, D C [Computer Branch, Technical Assessments and Services Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1961-06-15

    A short machine language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  14. Proving correctness of compilers using structured graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    it into a compiler implementation using a graph type along with a correctness proof. The implementation and correctness proof of a compiler using a tree type without explicit jumps is simple, but yields code duplication. Our method provides a convenient way of improving such a compiler without giving up the benefits...

  15. Compiling the First Monolingual Lusoga Dictionary

    Directory of Open Access Journals (Sweden)

    Minah Nabirye

    2011-10-01

    Full Text Available

    Abstract: In this research article a study is made of the approach followed to compile the first-ever monolingual dictionary for Lusoga. Lusoga is a Bantu language spoken in Uganda by slightly over two million people. Being an under-resourced language, the Lusoga orthography had to be designed, a grammar written, and a corpus built, before embarking on the compilation of the dictionary. This compilation was aimed at attaining an academic degree, hence requiring a rigorous research methodology. Firstly, the prevailing methods for compiling dictionaries were mainly practical and insufficient in explaining the theoretical linguistic basis for dictionary compilation. Since dictionaries are based on meaning, the theory of meaning was used to account for all linguistic data considered in dictionaries. However, meaning is considered at a very abstract level, far removed from the process of compiling dictionaries. Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular theory explains how the different modules of a language contribute information to the different parts of the dictionary article or dictionary information in general. Secondly, the research also had to contend with the different approaches for analysing Bantu languages for Bantu and European audiences. A description of the Bantu- and European-centred approaches to Bantu studies was undertaken in respect of (a) the classification of Lusoga words, and (b) the specification of their citations. As a result, Lusoga lexicography deviates from the prevailing Bantu classification and citation of nouns, adjectives and verbs in particular. The dictionary was tested on two separate occasions and all the feedback was considered in the compilation process. This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary

  16. The RHNumtS compilation: Features and bioinformatics approaches to locate and quantify Human NumtS

    Directory of Open Access Journals (Sweden)

    Saccone Cecilia

    2008-06-01

    Full Text Available Abstract Background To a greater or lesser extent, eukaryotic nuclear genomes contain fragments of their mitochondrial genome counterpart, deriving from the random insertion of damaged mtDNA fragments. NumtS (Nuclear mt Sequences) are not equally abundant in all species, and are redundant and polymorphic in terms of copy number. In population and clinical genetics, it is important to have a complete overview of NumtS quantity and location. Searching PubMed for NumtS or Mitochondrial pseudo-genes yields hundreds of papers reporting Human NumtS compilations produced by in silico or wet-lab approaches. A comparison of published compilations clearly shows significant discrepancies among data, due both to unwise application of Bioinformatics methods and to a not yet correctly assembled nuclear genome. To optimize quantification and location of NumtS, we produced a consensus compilation of Human NumtS by applying various bioinformatics approaches. Results Location and quantification of NumtS may be achieved by applying database similarity searching methods: we have applied various methods such as Blastn, MegaBlast and BLAT, changing both parameters and database; the results were compared, further analysed and checked against the already published compilations, thus producing the Reference Human Numt Sequences (RHNumtS) compilation. The resulting NumtS total 190. Conclusion The RHNumtS compilation represents a highly reliable reference basis, which may allow designing a lab protocol to test the actual existence of each NumtS. Here we report preliminary results based on PCR amplification and sequencing on 41 NumtS selected from RHNumtS among those with lower score. In parallel, we are currently designing the RHNumtS database structure for implementation in the HmtDB resource.
In the future, the same database will host NumtS compilations from other organisms, but these will be generated only when the nuclear genome of a specific organism has reached a high

  17. Distribution, utilization structure and potential of biomass resources in rural China: With special references of crop residues

    Energy Technology Data Exchange (ETDEWEB)

    Liu, H [Laboratory of Quantitative Vegetation Ecology, Institute of Botany, Chinese Academy of Sciences, 20 Nanxincun, Xiangshan, Beijing 100093 (China); Graduate University of Chinese Academy of Sciences, Beijing 100049 (China); Jiang, G M [Laboratory of Quantitative Vegetation Ecology, Institute of Botany, Chinese Academy of Sciences, 20 Nanxincun, Xiangshan, Beijing 100093 (China); Agronomy Department, Shandong Agricultural University, Tai' an 271018, Shandong Province (China); Zhuang, H Y [National Bio-Energy CO., LTD, No. 26B, Financial Street, Xicheng District, Beijing 100032 (China); Shandong Academy of Sciences, No. 19, Keyuan Road, Ji' nan 250014, Shandong Province (China); Wang, K J [Agronomy Department, Shandong Agricultural University, Tai' an 271018, Shandong Province (China)

    2008-06-15

    As the largest developing country in the world, China is urgently short of energy and natural resources. However, biological resources such as crop residues are burnt in the field, which causes serious environmental pollution. It is still unclear how large the stock and potential of these crop residues are in China. This paper first reports the distribution, utilization structure and potential of crop biomass, providing tangible information on crop residues in rural China through careful collection and recalculation of data. From 1995 to 2005, China produced some 630 million tons of crop residues per year, 50% of which came from the east and central south of China. The amount of crop residues is 1.3 times the total yield of crops and 2 times the total fodder from grassland, which covers 41% of China's territory. Crop residues of corn, wheat and rice amounted to 239, 137 and 116 million tons, respectively, accounting for nearly 80% of the total crop residues. Unfortunately, the utilization structure of this abundant biomass resource is seriously inefficient. Although 23% of the crop residues are used for forage, 4% for industry materials and 0.5% for biogas, the larger part is used with low efficiency or wasted, with 37% being directly combusted by farmers, 15% lost during collection and the remaining 20.5% discarded or directly burnt in the field. Reasonable adjustment of the utilization pattern and popularization of recycling agriculture are essential outlets for residues, with the development of the forage industry being the breakthrough point. We suggest that utilizing the abandoned 20.5% of the total residues for forage and combining agriculture and stock raising can greatly improve the farm system and cut down fertilizer pollution. Through the development of forage industries, the use efficiency of crop residues could be greatly enhanced. Commercializing and popularizing technologies of biomass gasification and liquefaction might be substitute
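
The utilization shares quoted above can be sanity-checked in a few lines of Python (figures taken directly from the abstract; the variable names are ours):

```python
# Sanity check of the utilization shares reported for China's roughly
# 630 million tons/year of crop residues (1995-2005 average).

total_mt = 630
shares = {
    "forage": 23.0, "industry materials": 4.0, "biogas": 0.5,
    "directly combusted by farmers": 37.0, "lost during collection": 15.0,
    "discarded or burnt in the field": 20.5,
}
assert sum(shares.values()) == 100.0   # the reported shares are exhaustive

discarded_mt = total_mt * shares["discarded or burnt in the field"] / 100
print(f"residues available for re-use: {discarded_mt:.1f} million tons")
# 630 * 20.5% ≈ 129 million tons that the authors propose diverting to forage
```

The corn, wheat and rice figures also check out: 239 + 137 + 116 = 492 million tons, about 78% of 630, matching the "nearly 80%" claim.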

  18. Compiling Planning into Quantum Optimization Problems: A Comparative Study

    Science.gov (United States)

    2015-06-07

    to SAT, and then reduces higher order terms to quadratic terms through a series of gadgets. Our mappings allow both positive and negative preconditions...to its being specific to this type of problem) and likely benefits from a homogeneous parameter setting (Venturelli et al. 2014), as it generates a...Guzik, A. 2013. Resource efficient gadgets for compiling adiabatic quantum optimization problems. Annalen der Physik 525(10-11):877–888. Blum, A

  19. A Class-Specific Optimizing Compiler

    Directory of Open Access Journals (Sweden)

    Michael D. Sharp

    1993-01-01

    Full Text Available Class-specific optimizations are compiler optimizations specified by the class implementor to the compiler. They allow the compiler to take advantage of the semantics of the particular class so as to produce better code. Optimizations of interest include the strength reduction of class::array address calculations, elimination of large temporaries, and the placement of asynchronous send/recv calls so as to achieve computation/communication overlap. We will outline our progress towards the implementation of a C++ compiler capable of incorporating class-specific optimizations.

  20. Geological and geochemical aspects of uranium deposits: a selected, annotated bibliography. Vol. 2, Rev. 1. [490 references

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, J.M.; Brock, M.L.; Garland, P.A.; White, M.B.; Daniel, E.W. (comps.)

    1979-07-01

    This bibliography, a compilation of 490 references, is the second in a series compiled from the National Uranium Resource Evaluation (NURE) Bibliographic Data Base. This data base is one of six data bases created by the Ecological Sciences Information Center, Oak Ridge National Laboratory, for the Grand Junction Office of the Department of Energy. Major emphasis for this volume has been placed on uranium geology, encompassing deposition, genesis of ore deposits, and ore controls; and prospecting techniques, including geochemistry and aerial reconnaissance. The following indexes are provided to aid the user in locating references of interest: author, geographic location, quadrangle name, geoformational feature, taxonomic name, and keyword.

  1. A Symmetric Approach to Compilation and Decompilation

    DEFF Research Database (Denmark)

    Ager, Mads Sig; Danvy, Olivier; Goldberg, Mayer

    2002-01-01

    Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice of...

  2. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    Business file compilation for an enterprise is a distillation and recreation of its intellectual wealth, from which applicable information can be made available to those who want to use it in a fast, extensive and precise way. Proceeding from the effects of business file compilation on scientific research, productive construction and development, this paper discusses in five points how to define topics, analyze historical materials, search or select data and process it into an enterprise archives collection. Firstly, it expounds the importance and necessity of business file compilation in the production, operation and development of a company; secondly, it presents processing methods from topic definition, material searching and data selection to final examination and correction; thirdly, it defines principles and classifications so that different categories and levels of processing methods are available for business file compilation; fourthly, it discusses the specific method of implementing a file compilation through a documentation collection, on the principle that topic definition should gear with demand; fifthly, it addresses the application of information technology to business file compilation, in view of the wide need for business files, so as to raise the level of enterprise archives management. The present discussion focuses on the examination and correction principles of enterprise historical material compilation and the basic classifications as well as the major forms of business file compilation achievements. (author)

  3. Implicit computational complexity and compilers

    DEFF Research Database (Denmark)

    Rubiano, Thomas

    Complexity theory helps us predict and control resources, usually time and space, consumed by programs. Static analysis on specific syntactic criteria allows us to categorize some programs. A common approach is to observe the behavior of the program's data. For instance, the detection of non...... evolution and a lot of research came from this theory. Until now, these implicit complexity theories were essentially applied to more or less toy languages. This thesis applies implicit computational complexity methods to “real life” programs by manipulating intermediate representation languages...

  4. Water resources of southeastern Florida, with special reference to geology and ground water of the Miami area

    Science.gov (United States)

    Parker, Garald G.; Ferguson, G.E.; Love, S.K.

    1955-01-01

    The circulation of water, in any form, from the surface of the earth to the atmosphere and back again is called the hydrologic cycle. A comprehensive study of the water resources of any area must, therefore, include data on the climate of the area. The humid subtropical climate of southeastern Florida is characterized by relatively high temperatures, alternating semi-annual wet and dry seasons, and usually light but persistent winds. The recurrence of drought in an area having relatively large rainfall such as southeastern Florida indicates that the agencies that remove water are especially effective. Two of the most important of the agencies associated with climate are evaporation and transpiration, or 'evapotranspiration'. Evaporation losses from permanent water areas are believed to average between 40 and 45 inches per year. Over land areas indirect methods must be used to determine losses by evapotranspiration; necessarily, these values are not precise. Because of their importance in the occurrence and movement of both surface and ground waters, detailed studies were made of the geology and geomorphology of southern Florida. As a result of widespread crustal movements, southern Florida emerged from the sea in late Pliocene time and probably was slightly tilted to the west. At the beginning of the Pleistocene the continent emerged still farther as a result of the lowering of sea level attending the first widespread glaciation. During this epoch, south Florida may have stood several hundred feet above sea level. During the interglacial ages the sea repeatedly flooded southern Florida. The marine members of the Fort Thompson formation in the Lake Okeechobee-Everglades depression and the Caloosahatchee River valley apparently are the deposits of the interglacial invasions by the sea. The fresh-water marls, sands, and organic deposits of the Fort Thompson formation appear to have accumulated during glacial ages when sea level was low and the area was a land surface

  5. Regulatory and technical reports compilation for 1980

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzi, L.

    1981-04-01

    This compilation lists formal regulatory and technical reports and conference proceedings issued in 1980 by the US Nuclear Regulatory Commission. The compilation is divided into four major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The second major section of this compilation consists of a key-word index to report titles. The third major section contains an alphabetically arranged listing of contractor report numbers cross-referenced to their corresponding NRC report numbers. Finally, the fourth section is an errata supplement

  6. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

    This paper presents a Python based High-Level Synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible, and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation and first results of the created Python based compiler.
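
As a toy illustration of the Python-to-VHDL mapping idea (not the compiler described in the paper), the standard-library ast module can turn a Python function signature into a VHDL entity skeleton; port widths and naming are our own assumptions:

```python
# Toy HLS front-end sketch: derive a VHDL entity declaration from a
# Python function signature. A real HLS compiler would also translate
# the function body into a datapath and control logic.
import ast

def to_vhdl_entity(src):
    fn = ast.parse(src).body[0]            # first top-level def
    ports = [f"    {a.arg} : in  std_logic_vector(31 downto 0);"
             for a in fn.args.args]
    ports.append("    result : out std_logic_vector(31 downto 0)")
    return "\n".join([f"entity {fn.name} is", "  port ("] + ports +
                     ["  );", f"end entity {fn.name};"])

print(to_vhdl_entity("def adder(a, b):\n    return a + b"))
```

Running this prints an `entity adder` declaration with two 32-bit input ports and one output port, the sort of interface scaffolding an HLS tool emits before scheduling the body.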

  7. Experiences on current national income measures with reference to environmental and natural resources; Esperienze e proposte relative alla correzione in senso ambientale delle misure del reddito nazionale

    Energy Technology Data Exchange (ETDEWEB)

    Franzese, R; Gaudioso, D [ENEA, Casaccia (Italy). Dipt. Ambiente

    1995-06-01

    The environment provides both a source of goods and services and a 'sink' for residues of the production and consumption processes. This is not reflected in conventional estimates of GDP (gross domestic product), the most commonly used measure of aggregate income. The purpose of this paper is to explore whether environmentally-adjusted national income measures can be derived. In the first part, the authors discuss both the shortcomings of the current national income measures, with reference to environmental and natural resources, and the debate on these issues; then they analyse the existing experiences in providing environmentally-adjusted indicators of national accounts. In the second part, the authors present an evaluation of the costs of environmental degradation in Italy in the period 1988-1990, based on the methodologies adopted in a pilot study carried out by UNSO (United Nations Statistical Office) and the World Bank for Mexico.

  8. Automating Visualization Service Generation with the WATT Compiler

    Science.gov (United States)

    Bollig, E. F.; Lyness, M. D.; Erlebacher, G.; Yuen, D. A.

    2007-12-01

    As tasks and workflows become increasingly complex, software developers are devoting increasing attention to automation tools. Among many examples, the Automator tool from Apple collects components of a workflow into a single script, with very little effort on the part of the user. Tasks are most often described as a series of instructions. The granularity of the tasks dictates the tools to use. Compilers translate fine-grained instructions to assembler code, while scripting languages (ruby, perl) are used to describe a series of tasks at a higher level. Compilers can also be viewed as transformational tools: a cross-compiler can translate executable code written on one computer to assembler code understood on another, while transformational tools can translate from one high-level language to another. We are interested in creating visualization web services automatically, starting from stand-alone VTK (Visualization Toolkit) code written in Tcl. To this end, using the OCaml programming language, we have developed a compiler that translates Tcl into C++, including all the stubs, classes and methods to interface with gSOAP, a C++ implementation of the Soap 1.1/1.2 protocols. This compiler, referred to as the Web Automation and Translation Toolkit (WATT), is the first step towards automated creation of specialized visualization web services without input from the user. The WATT compiler seeks to automate all aspects of web service generation, including the transport layer, the division of labor and the details related to interface generation. The WATT compiler is part of ongoing efforts within the NSF funded VLab consortium [1] to facilitate and automate time-consuming tasks for the science related to understanding planetary materials. Through examples of services produced by WATT for the VLab portal, we will illustrate features, limitations and the improvements necessary to achieve the ultimate goal of complete and transparent automation in the generation of web

  9. Vectorization vs. compilation in query execution

    NARCIS (Netherlands)

    J. Sompolski (Juliusz); M. Zukowski (Marcin); P.A. Boncz (Peter)

    2011-01-01

    textabstractCompiling database queries into executable (sub-)programs provides substantial benefits compared to traditional interpreted execution. Many of these benefits, such as reduced interpretation overhead, better instruction code locality, and providing opportunities to use SIMD

  10. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  11. A Compilation of Internship Reports - 2012

    Energy Technology Data Exchange (ETDEWEB)

    Stegman M.; Morris, M.; Blackburn, N.

    2012-08-08

    This compilation documents all research projects undertaken by the 2012 summer Department of Energy - Workforce Development for Teachers and Scientists interns during their internship program at Brookhaven National Laboratory.

  12. ALGOL compiler. Syntax and semantic analysis

    International Nuclear Information System (INIS)

    Tarbouriech, Robert

    1971-01-01

    In this research thesis, the author reports the development of an ALGOL compiler which performs the following main tasks: systematic scan of the source programme to recognise its different components (identifiers, reserved words, constants, separators), analysis of the source programme structure to build up its statements and arithmetic expressions, processing of symbolic names (identifiers) to associate them with the values they represent, and memory allocation for data and programme. Several issues are thus addressed: characteristics of the machine for which the compiler is developed, exact definition of the language (grammar, identifier and constant formation), the syntax processing programme that provides the compiler with the necessary elements (language vocabulary, precedence matrix), and a description of the first two phases of compilation: lexicographic analysis and syntax analysis. The last phase (machine-code generation) is not addressed

  13. A software methodology for compiling quantum programs

    Science.gov (United States)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.

  14. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part III: B-Shaped Architecture with Vertical Well in the Upper Layer.

  15. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part IV: Normal and Inverted Letter 'h' and 'H' Architecture.

  16. An Efficient Compiler for Weighted Rewrite Rules

    OpenAIRE

    Mohri, Mehryar; Sproat, Richard

    1996-01-01

    Context-dependent rewrite rules are used in many areas of natural language and speech processing. Work in computational phonology has demonstrated that, given certain conditions, such rewrite rules can be represented as finite-state transducers (FSTs). We describe a new algorithm for compiling rewrite rules into FSTs. We show the algorithm to be simpler and more efficient than existing algorithms. Further, many of our applications demand the ability to compile weighted rules into weighted FST...
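
A context-dependent rewrite rule has the form φ → ψ / λ _ ρ: rewrite φ as ψ only between left context λ and right context ρ. Its unweighted semantics can be sketched with regex lookaround in Python; note this only mimics rule application and is not the FST-compilation algorithm the paper describes (the rule below is a made-up toy example):

```python
# phi -> psi / left _ right: rewrite phi as psi only when preceded by
# `left` and followed by `right`. Lookaround assertions consume no input,
# so the contexts themselves are left untouched.
import re

def apply_rule(phi, psi, left, right, s):
    pattern = (f"(?<={re.escape(left)})"
               f"{re.escape(phi)}"
               f"(?={re.escape(right)})")
    return re.sub(pattern, psi, s)

# Toy flapping-style rule: t -> d between 'a' and 'e'.
print(apply_rule("t", "d", "a", "e", "water"))   # -> wader
```

Compiling such rules into transducers instead of applying them one at a time is what lets whole cascades of rules be composed into a single efficient FST, which is the point of the paper's algorithm.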

  17. Compilation of Sandia Laboratories technical capabilities

    International Nuclear Information System (INIS)

    Lundergan, C.D.; Mead, P.L.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078)

  18. Compilation of Sandia Laboratories technical capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lundergan, C. D.; Mead, P. L. [eds.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078). (RWR)

  19. Electronic circuits for communications systems: A compilation

    Science.gov (United States)

    1972-01-01

    The compilation of electronic circuits for communications systems is divided into thirteen basic categories, each representing an area of circuit design and application. The compilation items are moderately complex and, as such, would appeal to the applications engineer. However, the rationale for the selection criteria was tailored so that the circuits would reflect fundamental design principles and applications, with an additional requirement for simplicity whenever possible.

  20. Reference List About Implicit and Unconscious Bias

    DEFF Research Database (Denmark)

    Munar, Ana Maria; Villeseche, Florence; Wiedemann, Cecilie Dam

    The compilation of this reference list is one of the initiatives of the action plan developed by the Council for Diversity and Inclusion at Copenhagen Business School (CBS). This reference list is the first in a series of efforts initiated by this Council to develop an academic resource pool...... to publications accessible through the CBS library website and/or specifications of where and how to access each publication. In addition, as part of this effort and in line with the task list of the Council for Diversity and Inclusion, the report “Gender and Leadership Practices at Copenhagen Business School...... everyday human thought and activity” (Hardin and Banaji, 2013, pp. 13-14). Research also indicates that it is possible to implement procedures and strategic actions that help reduce implicit biases (Devine, Forscher, Austin, & Cox, 2012). Although extensive, this list does not include all existing academic...

  1. Neptunium: a bibliographic reference

    International Nuclear Information System (INIS)

    Mosley, R.E.

    1979-06-01

    A comprehensive bibliography of the literature on the element neptunium published prior to January 1976 is presented. A short abstract is given for each listed reference, with a few exceptions. The references are divided into sections categorized as General, Man-Made Sources (Reactors), Man-Made Sources (Fuel Reprocessing), Chemistry (Solubility), Chemistry (Compounds), Chemistry (Isotopes), Analyses (Instrumental), Analyses (Chemical), Chemical (Animal), Biological (Effects), Biological (Animal-Metabolism-Retention), Biological (Air Movement), Biological (Human Inhalation), Measurement, and Dosimetry. The bibliography contains author and keyword indexes and was compiled to serve as a quick reference source for neptunium-related work. 184 citations

  2. Construction experiences from underground works at Forsmark. Compilation Report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders [Vattenfall Power Consultant AB, Stockholm (Sweden); Christiansson, Rolf [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)

    2007-02-15

    The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily on rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  3. Construction experiences from underground works at Forsmark. Compilation Report

    International Nuclear Information System (INIS)

    Carlsson, Anders; Christiansson, Rolf

    2007-02-01

    The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily on rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  4. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendants; sending to the selected node only the compiled software to be executed by the selected node or the selected node's descendants.
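    The distribution scheme in the abstract above can be sketched as follows. This is a hypothetical Python illustration (the tree/artifact encoding and all names are invented here, not taken from the patent): each node keeps its own compiled unit and forwards to each child only the units destined for that child's subtree.

```python
def descendants(tree, node):
    """All nodes in the subtree rooted at node (tree: node -> list of children)."""
    out, stack = set(), [node]
    while stack:
        n = stack.pop()
        out.add(n)
        stack.extend(tree.get(n, []))
    return out

def distribute(tree, root, compiled):
    """compiled: node -> artifact. Returns child -> {node: artifact} shipments.

    The root's own artifact is retained locally and never shipped; each child
    receives only the artifacts for nodes in its own subtree.
    """
    shipments = {}
    for child in tree.get(root, []):
        sub = descendants(tree, child)
        shipments[child] = {n: a for n, a in compiled.items() if n in sub}
    return shipments

tree = {"root": ["a", "b"], "a": ["a1"], "b": []}
compiled = {"root": "R.o", "a": "A.o", "a1": "A1.o", "b": "B.o"}
print(distribute(tree, "root", compiled))
# child 'a' receives A.o and A1.o; child 'b' receives only B.o; R.o stays put
```

    Applied recursively at each tier, this keeps network traffic proportional to the artifacts a subtree actually executes.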

  5. MSeqDR mvTool: A mitochondrial DNA Web and API resource for comprehensive variant annotation, universal nomenclature collation, and reference genome conversion.

    Science.gov (United States)

    Shen, Lishuang; Attimonelli, Marcella; Bai, Renkui; Lott, Marie T; Wallace, Douglas C; Falk, Marni J; Gai, Xiaowu

    2018-06-01

    Accurate mitochondrial DNA (mtDNA) variant annotation is essential for the clinical diagnosis of diverse human diseases. Substantial challenges to this process include the inconsistency in mtDNA nomenclatures, the existence of multiple reference genomes, and a lack of reference population frequency data. Clinicians need a simple bioinformatics tool that is user-friendly, and bioinformaticians need a powerful informatics resource for programmatic usage. Here, we report the development and functionality of the MSeqDR mtDNA Variant Tool set (mvTool), a one-stop mtDNA variant annotation and analysis Web service. mvTool is built upon the MSeqDR infrastructure (https://mseqdr.org), with contributions of expert curated data from MITOMAP (https://www.mitomap.org) and HmtDB (https://www.hmtdb.uniba.it/hmdb). mvTool supports all mtDNA nomenclatures, converts variants to standard rCRS- and HGVS-based nomenclatures, and annotates novel mtDNA variants. Besides generic annotations from dbNSFP and the Variant Effect Predictor (VEP), mvTool provides allele frequencies in more than 47,000 germline mitogenomes, and disease and pathogenicity classifications from MSeqDR, Mitomap, HmtDB and ClinVar (Landrum et al., 2013). mvTool also provides mtDNA somatic variant annotations. The "mvTool API" is implemented for programmatic access using inputs in VCF, HGVS, or classical mtDNA variant nomenclatures. Results are reported as hyperlinked HTML tables and in JSON, Excel, and VCF formats. MSeqDR mvTool is freely accessible at https://mseqdr.org/mvtool.php. © 2018 Wiley Periodicals, Inc.

  6. Nevada low-temperature geothermal resource assessment: 1994. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Garside, L.J.

    1994-12-31

    Data compilation for the low-temperature program is being done by State Teams in two western states. Final products of the study include: a geothermal database, in hardcopy and as digital data (diskette) listing information on all known low- and moderate- temperature springs and wells in Nevada; a 1:1,000,000-scale map displaying these geothermal localities, and a bibliography of references on Nevada geothermal resources.

  7. Global compilation of marine varve records

    Science.gov (United States)

    Schimmelmann, Arndt; Lange, Carina B.; Schieber, Juergen; Francus, Pierre; Ojala, Antti E. K.; Zolitschka, Bernd

    2017-04-01

    Marine varves contain highly resolved records of geochemical and other paleoceanographic and paleoenvironmental proxies with annual to seasonal resolution. We present a global compilation of marine varved sedimentary records throughout the Holocene and Quaternary covering more than 50 sites worldwide. Marine varve deposition and preservation typically depend on environmental and sedimentological conditions, such as a sufficiently high sedimentation rate, severe depletion of dissolved oxygen in bottom water to exclude bioturbation by macrobenthos, and a seasonally varying sedimentary input to yield a recognizable rhythmic varve pattern. Additional oceanographic factors may include the strength and depth range of the Oxygen Minimum Zone (OMZ) and regional anthropogenic eutrophication. Modern to Quaternary marine varves are not only found in those parts of the open ocean that comply with these conditions, but also in fjords, embayments and estuaries with thermohaline density stratification, and nearshore 'marine lakes' with strong hydrologic connections to ocean water. Marine varves have also been postulated in pre-Quaternary rocks. In the case of non-evaporitic laminations in fine-grained ancient marine rocks, such as banded iron formations and black shales, laminations may not be varves but instead may have multiple alternative origins such as event beds or formation via bottom currents that transported and sorted silt-sized particles, clay floccules, and organic-mineral aggregates in the form of migrating bedload ripples. Modern marine ecosystems on continental shelves and slopes, in coastal zones and in estuaries are susceptible to stress by anthropogenic pressures, for example in the form of eutrophication, enhanced OMZs, and expanding ranges of oxygen-depletion in bottom waters. Sensitive laminated sites may play the important role of a 'canary in the coal mine' where monitoring the character and geographical extent of laminations/varves serves as a diagnostic

  8. Compilation of data on elementary particles

    International Nuclear Information System (INIS)

    Trippe, T.G.

    1984-09-01

    The most widely used data compilation in the field of elementary particle physics is the Review of Particle Properties. The origin, development and current state of this compilation are described with emphasis on the features which have contributed to its success: active involvement of particle physicists; critical evaluation and review of the data; completeness of coverage; regular distribution of reliable summaries including a pocket edition; heavy involvement of expert consultants; and international collaboration. The current state of the Review and new developments such as providing interactive access to the Review's database are described. Problems and solutions related to maintaining a strong and supportive relationship between compilation groups and the researchers who produce and use the data are discussed

  9. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler’s ability to generate loop-parallel code. We use this compilation system to modify two sequential benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should...

  10. Compilation of current high energy physics experiments

    International Nuclear Information System (INIS)

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche

  11. Extension of Alvis compiler front-end

    Energy Technology Data Exchange (ETDEWEB)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl [AGH University of Science and Technology, Department of Applied Computer Science, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2015-12-31

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. The semantics of an Alvis model is expressed as an LTS graph (labelled transition system). Execution of any language statement is expressed as a transition between formally defined states of such a model. An LTS graph is generated using a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as a part of the Alvis language to define parameters' types and operations on them. Thanks to the compiler's modular construction, many aspects of the compilation of an Alvis model may be modified. Providing new plugins for the Alvis Compiler that support languages like Java or C makes it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler's internal model and describes how the default specification language can be altered by new plugins.

  12. Asian collaboration on nuclear reaction data compilation

    International Nuclear Information System (INIS)

    Aikawa, Masayuki; Furutachi, Naoya; Kato, Kiyoshi; Makinaga, Ayano; Devi, Vidya; Ichinkhorloo, Dagvadorj; Odsuren, Myagmarjav; Tsubakihara, Kohsuke; Katayama, Toshiyuki; Otuka, Naohiko

    2013-01-01

    Nuclear reaction data are essential for research and development in nuclear engineering, radiation therapy, nuclear physics and astrophysics. Experimental data must be compiled in a database and be accessible to nuclear data users. One of the nuclear reaction databases is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC) under the auspices of the International Atomic Energy Agency. Recently, collaboration among the Asian NRDC members is being further developed under the support of the Asia-Africa Science Platform Program of the Japan Society for the Promotion of Science. We report the activity for three years to develop the Asian collaboration on nuclear reaction data compilation. (author)

  13. Promising Compilation to ARMv8 POP

    OpenAIRE

    Podkopaev, Anton; Lahav, Ori; Vafeiadis, Viktor

    2017-01-01

    We prove the correctness of compilation of relaxed memory accesses and release-acquire fences from the "promising" semantics of [Kang et al. POPL'17] to the ARMv8 POP machine of [Flur et al. POPL'16]. The proof is highly non-trivial because both the ARMv8 POP and the promising semantics provide some extremely weak consistency guarantees for normal memory accesses; however, they do so in rather different ways. Our proof of compilation correctness to ARMv8 POP strengthens the results of the Kan...

  14. Herbal hepatotoxicity: a tabular compilation of reported cases.

    Science.gov (United States)

    Teschke, Rolf; Wolff, Albrecht; Frenzel, Christian; Schulze, Johannes; Eickhoff, Axel

    2012-11-01

    Herbal hepatotoxicity is a field that has rapidly grown over the last few years along with increased use of herbal products worldwide. To summarize the various facets of this disease, we undertook a literature search for herbs, herbal drugs and herbal supplements with reported cases of herbal hepatotoxicity. A selective literature search was performed to identify published case reports, spontaneous case reports, case series and review articles regarding herbal hepatotoxicity. A total of 185 publications were identified and the results compiled. They show 60 different herbs, herbal drugs and herbal supplements with reported potential hepatotoxicity; additional information, including synonyms of individual herbs, botanical names and cross references, is provided. If known, details are presented for specific ingredients and chemicals in herbal products, and for references with authors that can be matched to each herbal product and to its effect on the liver. Based on stringent causality assessment methods and/or positive re-exposure tests, causality was highly probable or probable for Ayurvedic herbs, Chaparral, Chinese herbal mixture, Germander, Greater Celandine, green tea, a few Herbalife products, Jin Bu Huan, Kava, Ma Huang, Mistletoe, Senna, Syo Saiko To and Venencapsan(®). In many other publications, however, causality was not properly evaluated by a liver-specific and for hepatotoxicity-validated causality assessment method such as the scale of CIOMS (Council for International Organizations of Medical Sciences). This compilation presents details of herbal hepatotoxicity, thereby assisting the clinical assessment of involved physicians in the future. © 2012 John Wiley & Sons A/S.

  15. Parallelizing More Loops with Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    2012-01-01

    We present an interactive compilation feedback system that guides programmers in iteratively modifying their application source code. This helps leverage the compiler’s ability to generate loop-parallel code. We employ our system to modify two sequential benchmarks dealing with image processing and edge detection...

  16. Safety and maintenance engineering: A compilation

    Science.gov (United States)

    1974-01-01

    A compilation is presented for the dissemination of information on technological developments which have potential utility outside the aerospace and nuclear communities. Safety of personnel engaged in the handling of hazardous materials and equipment, protection of equipment from fire, high wind, or careless handling by personnel, and techniques for the maintenance of operating equipment are reported.

  17. Compilation of information on melter modeling

    International Nuclear Information System (INIS)

    Eyler, L.L.

    1996-03-01

    The objective of the task described in this report is to compile information on modeling capabilities for the High-Temperature Melter and the Cold Crucible Melter and to issue a modeling capabilities letter report summarizing existing modeling capabilities. The report is to include strategy recommendations for future modeling efforts to support High-Level Waste (HLW) melter development

  18. Design of methodology for incremental compiler construction

    Directory of Open Access Journals (Sweden)

    Pavel Haluza

    2011-01-01

    The paper deals with possibilities of incremental compiler construction. It presents compiler construction possibilities both for languages with a fixed set of lexical units and for languages with a variable set of lexical units. The methodology design for incremental compiler construction is based on the known algorithms for standard compiler construction and is derived for both groups of languages. The group of languages with a fixed set of lexical units comprises languages in which each lexical unit has a constant meaning, e.g., common programming languages. For this group of languages, the paper addresses the problem of incremental semantic analysis, which is based on incremental parsing. In the group of languages with a variable set of lexical units (e.g., the professional typographic system TeX), it is possible to change arbitrarily the meaning of each character of the input file at any time during processing. The change takes effect immediately and its validity can be somehow limited or is given by the end of the input. For this group of languages, the paper addresses the case in which macros temporarily change the category of arbitrary characters.
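    The variable-lexical-unit case can be illustrated with a toy lexer whose character categories may change mid-input, loosely in the spirit of TeX catcodes. The directive syntax ("%L<c>" makes character <c> a letter) and all names below are invented for illustration:

```python
def lex(text, catcode):
    """Tokenize text under a mutable category table (char -> 'letter'/'other').

    Runs of 'letter' characters form one token; everything else is a token by
    itself. The directive '%L<c>' mutates the table mid-scan, so the same
    substring can tokenize differently before and after the directive.
    """
    tokens, i, word = [], 0, ""
    while i < len(text):
        if text[i:i + 2] == "%L" and i + 2 < len(text):
            catcode[text[i + 2]] = "letter"   # category change takes effect now
            i += 3
            continue
        ch = text[i]
        if catcode.get(ch, "other") == "letter":
            word += ch
        else:
            if word:
                tokens.append(word)
                word = ""
            tokens.append(ch)
        i += 1
    if word:
        tokens.append(word)
    return tokens

cats = {c: "letter" for c in "abcdefghijklmnopqrstuvwxyz"}
print(lex("a1b %L1 a1b", cats))  # ['a', '1', 'b', ' ', ' ', 'a1b']
```

    Before the directive, '1' splits the identifier into three tokens; afterwards, the same characters lex as the single token 'a1b', which is exactly what makes incremental recompilation of such languages hard.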

  19. Compilation of cross-sections. Pt. 1

    International Nuclear Information System (INIS)

    Flaminio, V.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1983-01-01

    A compilation of integral cross-sections for hadronic reactions is presented. This is an updated version of CERN/HERA 79-1, 79-2, 79-3. It contains all data published up to the beginning of 1982, but some more recent data have also been included. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  20. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    This report is published in the interest of scientific and technical information exchange. Verified Compilation of Concurrent Managed Languages, Purdue University, November 2017: final technical report, approved for public release.

  1. Nuclear power plant operational data compilation system

    International Nuclear Information System (INIS)

    Silberberg, S.

    1980-01-01

    Electricite de France R and D Division has set up a nuclear power plant operational data compilation system. This data bank, built from American documents, provides results on plant operation and on the behaviour of operational materials. At present, French units in commercial operation are taken into account. Results obtained after five years of data bank operation are given. (author)

  2. Compiler-Agnostic Function Detection in Binaries

    NARCIS (Netherlands)

    Andriesse, D.A.; Slowinska, J.M.; Bos, H.J.

    2017-01-01

    We propose Nucleus, a novel function detection algorithm for binaries. In contrast to prior work, Nucleus is compiler-agnostic, and does not require any learning phase or signature information. Instead of scanning for signatures, Nucleus detects functions at the Control Flow Graph-level, making it

  3. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    We demonstrate the ability of our tool to transform code, and suggest code refactorings that increase its amenability to optimization. The preliminary results show that, with our tool-set, automatic loop parallelization with the GNU C compiler, gcc, yields 8.6x best-case speedup over...

  4. Expectation Levels in Dictionary Consultation and Compilation ...

    African Journals Online (AJOL)

    Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of the structure ...

  5. Expectation Levels in Dictionary Consultation and Compilation*

    African Journals Online (AJOL)

    Abstract: Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of ...

  6. Compilation of cross-sections. Pt. 4

    International Nuclear Information System (INIS)

    Alekhin, S.I.; Ezhela, V.V.; Lugovsky, S.B.; Tolstenkov, A.N.; Yushchenko, O.P.; Baldini, A.; Cobal, M.; Flaminio, V.; Capiluppi, P.; Giacomelli, G.; Mandrioli, G.; Rossi, A.M.; Serra, P.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1987-01-01

    This is the fourth volume in our series of data compilations on integrated cross-sections for weak, electromagnetic, and strong interaction processes. This volume covers data on reactions induced by photons, neutrinos, hyperons, and K_L^0. It contains all data published up to June 1986. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  7. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  8. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...
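    The compile-then-evaluate idea behind the two records above can be illustrated on a two-node network A -> B. The sketch below simply enumerates the network polynomial rather than building a genuine arithmetic circuit, and all parameters are made up for illustration:

```python
import itertools

# Made-up parameters for a two-node network A -> B (binary variables).
P_A = {0: 0.3, 1: 0.7}
P_B_given_A = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}

def circuit_eval(evidence):
    """Evaluate the network polynomial under evidence.

    Sums, over all joint states consistent with the evidence, the product of
    local parameters. A compiled arithmetic circuit computes the same value,
    but shares subexpressions instead of enumerating states as done here.
    """
    total = 0.0
    for a, b in itertools.product((0, 1), repeat=2):
        if evidence.get("A", a) != a or evidence.get("B", b) != b:
            continue  # indicator variable for this state is 0
        total += P_A[a] * P_B_given_A[(a, b)]
    return total

# Two evaluations answer a conditional query: P(A=1 | B=1).
p_b1 = circuit_eval({"B": 1})
p_a1_b1 = circuit_eval({"A": 1, "B": 1})
print(round(p_a1_b1 / p_b1, 3))  # 0.949
```

    The payoff of compilation is that the expensive structure analysis happens once, after which each online query is just a (pair of) circuit evaluations, as the abstract describes.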

  9. The Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI): A Completed Reference Database of Lung Nodules on CT Scans

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-02-15

    Purpose: The development of computer-aided diagnostic (CAD) methods for lung nodule detection, classification, and quantitative assessment can be facilitated through a well-characterized repository of computed tomography (CT) scans. The Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) completed such a database, establishing a publicly available reference for the medical imaging research community. Initiated by the National Cancer Institute (NCI), further advanced by the Foundation for the National Institutes of Health (FNIH), and accompanied by the Food and Drug Administration (FDA) through active participation, this public-private partnership demonstrates the success of a consortium founded on a consensus-based process. Methods: Seven academic centers and eight medical imaging companies collaborated to identify, address, and resolve challenging organizational, technical, and clinical issues to provide a solid foundation for a robust database. The LIDC/IDRI Database contains 1018 cases, each of which includes images from a clinical thoracic CT scan and an associated XML file that records the results of a two-phase image annotation process performed by four experienced thoracic radiologists. In the initial blinded-read phase, each radiologist independently reviewed each CT scan and marked lesions belonging to one of three categories ("nodule≥3 mm," "nodule<3 mm," and "non-nodule≥3 mm"). In the subsequent unblinded-read phase, each radiologist independently reviewed their own marks along with the anonymized marks of the three other radiologists to render a final opinion. The goal of this process was to identify as completely as possible all lung nodules in each CT scan without requiring forced consensus. Results: The Database contains 7371 lesions marked "nodule" by at least one radiologist. 2669 of these lesions were marked "nodule≥3 mm" by at least one radiologist

  10. The Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI): A Completed Reference Database of Lung Nodules on CT Scans

    International Nuclear Information System (INIS)

    2011-01-01

    Purpose: The development of computer-aided diagnostic (CAD) methods for lung nodule detection, classification, and quantitative assessment can be facilitated through a well-characterized repository of computed tomography (CT) scans. The Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) completed such a database, establishing a publicly available reference for the medical imaging research community. Initiated by the National Cancer Institute (NCI), further advanced by the Foundation for the National Institutes of Health (FNIH), and accompanied by the Food and Drug Administration (FDA) through active participation, this public-private partnership demonstrates the success of a consortium founded on a consensus-based process. Methods: Seven academic centers and eight medical imaging companies collaborated to identify, address, and resolve challenging organizational, technical, and clinical issues to provide a solid foundation for a robust database. The LIDC/IDRI Database contains 1018 cases, each of which includes images from a clinical thoracic CT scan and an associated XML file that records the results of a two-phase image annotation process performed by four experienced thoracic radiologists. In the initial blinded-read phase, each radiologist independently reviewed each CT scan and marked lesions belonging to one of three categories ("nodule≥3 mm," "nodule<3 mm," and "non-nodule≥3 mm"). In the subsequent unblinded-read phase, each radiologist independently reviewed their own marks along with the anonymized marks of the three other radiologists to render a final opinion. The goal of this process was to identify as completely as possible all lung nodules in each CT scan without requiring forced consensus. Results: The Database contains 7371 lesions marked "nodule" by at least one radiologist. 2669 of these lesions were marked "nodule≥3 mm" by at least one radiologist, of which 928 (34.7%) received such marks from all

  11. Fiscal 1998 research report on super compiler technology; 1998 nendo super konpaira technology no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    For next-generation supercomputing systems, research was conducted on parallel and distributed compiler technology for enhancing effective performance, and on related software and architectures that enhance performance in coordination with compilers. For parallel compiler technology, the development of scalable automated parallel compiler technology, parallel tuning tools, and an operating system that uses multi-processor resources effectively are identified as important concrete issues. In addition, by extending these research results to single-chip multi-processor architecture technology, the report points to the possibility of developing and expanding the PC, WS and HPC (high-performance computer) markets and of creating new industries. Although wide-area distributed computing is attracting attention as a next-generation computing industry, the concrete industrial fields that will use such computing are not yet clear, and research remains at an exploratory stage. (NEDO)

  12. HAL/S-FC compiler system functional specification

    Science.gov (United States)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package, as well as the restrictions and dependencies of the HAL/S-FC system, is also considered.

  13. Compilation of LLNL CUP-2 Data

    Energy Technology Data Exchange (ETDEWEB)

    Eppich, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kips, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lindvall, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-07-31

    The CUP-2 uranium ore concentrate (UOC) standard reference material, a powder, was produced at the Blind River uranium refinery of Eldorado Resources Ltd. in Canada in 1986. This material was produced as part of a joint effort by the Canadian Certified Reference Materials Project and the Canadian Uranium Producers Metallurgical Committee to develop a certified reference material for uranium concentration and the concentration of several impurity constituents. This standard was developed to satisfy the requirements of the UOC mining and milling industry, and was characterized with this purpose in mind. To produce CUP-2, approximately 25 kg of UOC derived from the Blind River uranium refinery was blended, homogenized, and assessed for homogeneity by X-ray fluorescence (XRF) analysis. The homogenized material was then packaged into bottles, containing 50 g of material each, and distributed for analysis to laboratories in 1986. The CUP-2 UOC standard was characterized by an interlaboratory analysis program involving eight member laboratories, six commercial laboratories, and three additional volunteer laboratories. Each laboratory provided five replicate results on up to 17 analytes, including total uranium concentration and moisture content. The selection of analytical technique was left to each participating laboratory. Uranium was reported on an "as-received" basis; all other analytes (besides moisture content) were reported on a "dry-weight" basis. A bottle of 25 g of the CUP-2 UOC standard as described above was purchased by LLNL and characterized by the LLNL Nuclear Forensics Group. Non-destructive and destructive analytical techniques were applied to the UOC sample. Information obtained from short-term techniques such as photography, gamma spectrometry, and scanning electron microscopy was used to guide the performance of longer-term techniques such as ICP-MS. Some techniques, such as XRF and ICP-MS, provided complementary types of data. The results

  14. Perspex machine: V. Compilation of C programs

    Science.gov (United States)

    Spanner, Matthew P.; Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of the Turing machine with projective geometry. The original, constructive proof used four special, perspective transformations to implement the Turing machine in projective geometry. These four transformations are now generalised and applied in a compiler, implemented in Pop11, that converts a subset of the C programming language into perspexes. This is interesting both from a geometrical and a computational point of view. Geometrically, it is interesting that program source can be converted automatically to a sequence of perspective transformations and conditional jumps, though we find that the product of homogeneous transformations with normalisation can be non-associative. Computationally, it is interesting that program source can be compiled for a Reduced Instruction Set Computer (RISC), the perspex machine, that is a Single Instruction, Zero Exception (SIZE) computer.

  15. Compilation of data from hadronic atoms

    International Nuclear Information System (INIS)

    Poth, H.

    1979-01-01

    This compilation is a survey of the existing data of hadronic atoms (pionic atoms, kaonic atoms, antiprotonic atoms, sigmonic atoms). It collects measurements of the energies, intensities and line widths of X-rays from hadronic atoms. Averaged values for each hadronic atom are given and the data are summarized. The listing contains data on 58 pionic atoms, on 54 kaonic atoms, on 23 antiprotonic atoms and on 20 sigmonic atoms. (orig./HB) [de

  16. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This report presents a compilation of both fracture properties and hydrogeological parameters relevant to the flow of groundwater in fractured rock systems. Methods of data acquisition as well as the scale of and conditions during the measurement are recorded. Measurements and analytical techniques for each of the parameters under consideration have been reviewed with respect to their methodology, assumptions and accuracy. Both the rock type and geologic setting associated with these measurements have also been recorded. 373 refs

  17. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1981-03-01

    A request list for nuclear data which was produced from a computerized data file by the National Nuclear Data Center is presented. The request list is given by target nucleus (isotope) and then reaction type. The purpose of the compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. Requesters are identified by laboratory, last name, and sponsoring US government agency

  18. Using MaxCompiler for High Level Synthesis of Trigger Algorithms

    CERN Document Server

    Summers, Sioni Paris; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  19. Using MaxCompiler for the high level synthesis of trigger algorithms

    International Nuclear Information System (INIS)

    Summers, S.; Rose, A.; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  20. Using MaxCompiler for the high level synthesis of trigger algorithms

    Science.gov (United States)

    Summers, S.; Rose, A.; Sanders, P.

    2017-02-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  1. A new compiler for the GANIL Data Acquisition description

    International Nuclear Information System (INIS)

    Saillant, F.; Raine, B.

    1997-01-01

    An important feature of the GANIL Data Acquisition System is the description of experiments by means of a language developed at GANIL. The philosophy is to attribute to each element (parameters, spectra, etc.) an operational name which will be used at any level of the system. This language references a library of modules to free the user from the technical details of the hardware. The compiler has recently been entirely redeveloped using technologies such as an object-oriented language (C++) and an object-oriented software development method and tool. This enables us to provide a new functionality or to support a new electronic module within a very short time and without any deep modification of the application. A new Dynamic Library of Modules has also been developed. Its complete description is available on the GANIL WEB site http://ganinfo.in2p3.fr/acquisition/homepage.html. The new compiler brings many new functionalities, the most important of which is the notion of 'register', whatever the module standard. All the registers described in the module provider's documentation can now be accessed by their names. Another important new feature is the notion of a 'function' that can be executed on a module. A set of new instructions has also been implemented to execute commands on CAMAC crates. Finally, the new compiler enables the description of specific interfaces with the GANIL Data Acquisition System. This has been used to describe the coupling of the CHIMERA Data Acquisition System with the INDRA one through a shared memory in the VME crate. (authors)

  2. Building and Managing Electronic Resources in Digital Era in India with Special Reference to IUCAA and NIV, Pune: A Comparative Case Study

    Science.gov (United States)

    Sahu, H. K.; Singh, S. N.

    2015-04-01

    This paper discusses and presents a comparative case study of two libraries in Pune, India, Inter-University Centre for Astronomy and Astrophysics and Information Centre and Library of National Institute of Virology (Indian Council of Medical Research). It compares how both libraries have managed their e-resource collections, including acquisitions, subscriptions, and consortia arrangements, while also developing a collection of their own resources, including pre-prints and publications, video lectures, and other materials in an institutional repository. This study illustrates how difficult it is to manage electronic resources in a developing country like India, even though electronic resources are used more than print resources. Electronic resource management can be daunting, but with a systematic approach, various problems can be solved, and use of the materials will be enhanced.

  3. GRESS, FORTRAN Pre-compiler with Differentiation Enhancement

    International Nuclear Information System (INIS)

    1999-01-01

    1 - Description of program or function: The GRESS FORTRAN pre-compiler (SYMG) and run-time library are used to enhance conventional FORTRAN-77 programs with analytic differentiation of arithmetic statements for automatic differentiation in either forward or reverse mode. GRESS 3.0 is functionally equivalent to GRESS 2.1. GRESS 2.1 is an improved and updated version of the previously released GRESS 1.1. Improvements in the implementation of the CHAIN option have resulted in a 70 to 85% reduction in execution time and up to a 50% reduction in memory required for forward chaining applications. 2 - Method of solution: GRESS uses a pre-compiler to analyze FORTRAN statements and determine the mathematical operations embodied in them. As each arithmetic assignment statement in a program is analyzed, SYMG generates the partial derivatives of the term on the left with respect to each floating-point variable on the right. The result of the pre-compilation step is a new FORTRAN program that can produce derivatives for any REAL (i.e., single or double precision) variable calculated by the model. Consequently, GRESS enhances FORTRAN programs or subprograms by adding the calculation of derivatives along with the original output. Derivatives from a GRESS enhanced model can be used internally (e.g., iteration acceleration) or externally (e.g., sensitivity studies). By calling GRESS run-time routines, derivatives can be propagated through the code via the chain rule (referred to as the CHAIN option) or accumulated to create an adjoint matrix (referred to as the ADGEN option). A third option, GENSUB, makes it possible to process a subset of a program (i.e., a do loop, subroutine, function, a sequence of subroutines, or a whole program) for calculating derivatives of dependent variables with respect to independent variables. A code enhanced with the GENSUB option can use forward mode, reverse mode, or a hybrid of the two modes. 
3 - Restrictions on the complexity of the problem: GRESS
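    The per-statement partial derivatives that GRESS emits are an instance of the forward-mode chain rule. A minimal dual-number sketch in Python (a conceptual illustration, not the GRESS Fortran implementation) showing how each arithmetic operation propagates a value together with its derivative:

```python
# Minimal forward-mode automatic differentiation with dual numbers.
# Each Dual carries (value, derivative); arithmetic applies the chain rule,
# mirroring what a pre-compiler like GRESS generates per assignment statement.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: d(uv) = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__

x = Dual(3.0, 1.0)       # independent variable, so dx/dx = 1
y = x * x + 2 * x + 1    # y = x^2 + 2x + 1
print(y.val, y.der)      # 16.0 8.0  (y(3) = 16, y'(3) = 2*3 + 2 = 8)
```

    Reverse (adjoint) mode, as in the ADGEN option, instead records the operations and accumulates derivatives backwards from the outputs.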

  4. Mentoring in Early Childhood Education: A Compilation of Thinking, Pedagogy and Practice

    Science.gov (United States)

    Murphy, Caterina, Ed.; Thornton, Kate, Ed.

    2015-01-01

    Mentoring is a fundamental and increasingly important part of professional learning and development for teachers in Aotearoa New Zealand. This book is a much-needed resource for mentors, leaders and teachers in early childhood education. It is the first of its kind: a wide ranging compilation that explores the thinking, pedagogy and practice of…

  5. Comparison of Capability of Digitizing Methods to Predict Soil classification According to the Soil Taxonomy and World Reference Base for Soil Resources

    Directory of Open Access Journals (Sweden)

    zohreh mosleh

    2017-02-01

    Full Text Available Introduction: Soil classification generally aims to establish a taxonomy based on breaking the soil continuum into homogeneous groups that can highlight the essential differences in soil properties and functions between classes. The two most widely used modern soil classification schemes are Soil Taxonomy (ST) and the World Reference Base for Soil Resources (WRB). With the development of computers and technology, digital and quantitative approaches have been developed. These new techniques, which include the spatial prediction of soil properties or classes, rely on finding the relationships between soil and the auxiliary information that explains the soil-forming factors or processes, and finally predict soil patterns on the landscape. These approaches are commonly referred to as digital soil mapping (DSM) (14). A key component of any DSM activity is the method used to define the relationship between soil observations and auxiliary information (4). Several types of machine learning approaches have been applied for digital soil mapping of soil classes, such as logistic and multinomial logistic regressions (10, 12), random forests (15), neural networks (3, 13) and classification trees (22, 4). Many decisions about soil use and management are based on soil differences that cannot be captured by higher taxonomic levels (i.e., order, suborder and great group) (4). In low-relief areas such as plains, the soil-forming factors are expected to be more homogeneous, so auxiliary information explaining soil-forming factors may have low variation and cannot show the soil variability. Materials and Methods: The study area is located in the Shahrekord plain of Chaharmahal-Va-Bakhtiari province. According to the semi-detailed soil survey (16), 120 pedons with an approximate spacing of 750 m were excavated and described according to the "field book for describing and sampling soils" (19). Soil samples were taken from different genetic horizons, air dried and

  6. Notes on Compiling a Corpus- Based Dictionary

    Directory of Open Access Journals (Sweden)

    František Čermák

    2011-10-01

    Full Text Available

    ABSTRACT: On the basis of a sample analysis of a Czech adjective, a definition based on the data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing out the drawbacks of definitions found in traditional dictionaries. The steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics, briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. These are supplemented by additional remarks and caveats useful in the compilation of a dictionary. Thus, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified, form.

    SUMMARY [translated from Afrikaans]: Notes on compiling a corpus-based dictionary. On the basis of a sample analysis of a Czech adjective, a definition based on data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing out the shortcomings of definitions found in traditional dictionaries. The steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics, briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. They are supplemented by additional remarks and caveats useful in the compilation of a dictionary. In this way, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified, form.

    Keywords [translated from Afrikaans]: MONOLINGUAL DICTIONARIES, CORPUS LEXICOGRAPHY, SYNTAGMATICS AND PARADIGMATICS IN DICTIONARIES, DICTIONARY ENTRY, TYPES OF LEMMAS, PRAGMATICS, TREATMENT OF

  7. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    Full Text Available We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.

  8. Evaluation of HAL/S language compilability using SAMSO's Compiler Writing System (CWS)

    Science.gov (United States)

    Feliciano, M.; Anderson, H. D.; Bond, J. W., III

    1976-01-01

    NASA/Langley is engaged in a program to develop an adaptable guidance and control software concept for spacecraft such as shuttle-launched payloads. It is envisioned that this flight software be written in a higher-order language, such as HAL/S, to facilitate changes or additions. To make this adaptable software transferable to various onboard computers, a compiler writing system capability is necessary. A joint program with the Air Force Space and Missile Systems Organization was initiated to determine if the Compiler Writing System (CWS) owned by the Air Force could be utilized for this purpose. The present study explores the feasibility of including the HAL/S language constructs in CWS and the effort required to implement these constructs. This will determine the compilability of HAL/S using CWS and permit NASA/Langley to identify the HAL/S constructs desired for their applications. The study consisted of comparing the implementation of the Space Programming Language using CWS with the requirements for the implementation of HAL/S. It is the conclusion of the study that CWS already contains many of the language features of HAL/S and that it can be expanded for compiling part or all of HAL/S. It is assumed that persons reading and evaluating this report have a basic familiarity with (1) the principles of compiler construction and operation, and (2) the logical structure and applications characteristics of HAL/S and SPL.

  9. Compilation of accident statistics in PSE

    International Nuclear Information System (INIS)

    Jobst, C.

    1983-04-01

    The objective of the investigations on transportation carried out within the framework of the 'Project - Studies on Safety in Waste Management (PSE II)' is the determination of the risk of accidents in the transportation of radioactive materials by rail. The fault tree analysis is used for the determination of risks in the transportation system. This method offers a possibility for the determination of frequency and consequences of accidents which could lead to an unintended release of radionuclides. The study presented compiles all data obtained from the accident statistics of the Federal German Railways. (orig./RB) [de

  10. HAL/S-360 compiler system specification

    Science.gov (United States)

    Johnson, A. E.; Newbold, P. N.; Schulenberg, C. W.; Avakian, A. E.; Varga, S.; Helmers, P. H.; Helmers, C. T., Jr.; Hotz, R. L.

    1974-01-01

    A three phase language compiler is described which produces IBM 360/370 compatible object modules and a set of simulation tables to aid in run time verification. A link edit step augments the standard OS linkage editor. A comprehensive run time system and library provide the HAL/S operating environment, error handling, a pseudo real time executive, and an extensive set of mathematical, conversion, I/O, and diagnostic routines. The specifications of the information flow and content for this system are also considered.

  11. abc: An extensible AspectJ compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its frontend is built, using the Polyglot framework, as a modular extension of the Java...... language. The use of Polyglot gives flexibility of syntax and type checking. The backend is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general overview...

  12. Compilation of actinide neutron nuclear data

    International Nuclear Information System (INIS)

    1979-01-01

    The Swedish nuclear data committee has compiled a selected set of neutron cross section data for the 16 most important actinide isotopes. The aim of the report is to present available data in a comprehensible way to allow a comparison between different evaluated libraries and to judge the reliability of these libraries from the experimental data. The data are given in graphical form below about 1 eV and above about 10 keV, while the 2200 m/s cross sections and resonance integrals are given in numerical form. (G.B.)
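    The 2200 m/s reference speed quoted above corresponds to the conventional thermal neutron energy of 0.0253 eV; a quick check of that arithmetic:

```python
# Kinetic energy of a neutron at the conventional 2200 m/s reference speed.
m_n = 1.674927e-27   # neutron mass, kg
e   = 1.602177e-19   # J per eV
v   = 2200.0         # m/s

E_eV = 0.5 * m_n * v**2 / e   # E = (1/2) m v^2, converted to eV
print(round(E_eV, 4))  # 0.0253
```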

  13. Data compilation for particle impact desorption

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeuchi, Fujio.

    1984-05-01

    The desorption of gases from solid surfaces by incident electrons, ions and photons is one of the important processes of hydrogen recycling in the controlled thermonuclear reactors. We have surveyed the literature concerning the particle impact desorption published through 1983 and compiled the data on the desorption cross sections and desorption yields with the aid of a computer. This report presents the results obtained for electron stimulated desorption, the desorption cross sections and yields being given in graphs and tables as functions of incident electron energy, surface temperature and gas exposure. (author)

  14. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  15. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    Full Text Available There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with equational programming language (EPL. Our approach is based on a program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by the source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by the compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  16. Regular expressions compiler and some applications

    International Nuclear Information System (INIS)

    Saldana A, H.

    1978-01-01

    We deal with the high-level programming language of a Regular Expressions Compiler (REC). The first chapter is an introduction in which the history of the REC development and the problems related to its numerous applications are described. The syntactic and semantic rules as well as the language features are discussed just after the introduction. Concerning the applications, as examples, an adaptation is given in order to solve numerical problems and another for data manipulation. The last chapter is an exposition of ideas and techniques about the compiler construction. Examples of the adaptation to numerical problems show the applications to education, vector analysis, quantum mechanics, physics, mathematics and other sciences. The rudiments of an operating system for a minicomputer are the examples of the adaptation to symbolic data manipulation. REC is a programming language that could be applied to solve problems in almost any human activity. Handling of computer graphics, control equipment, research on languages, microprocessors and general research are some of the fields in which this programming language can be applied and developed. (author)
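    The abstract does not reproduce REC's own syntax, but the core idea of interpreting a regular expression over input can be illustrated with a classic minimal matcher (after Pike/Kernighan, written here in Python as a sketch, not REC code) supporting literals, '.', '*', '^' and '$':

```python
# A tiny regular-expression matcher: literals, '.' (any char),
# '*' (zero or more of the previous char), '^' and '$' anchors.
# Classic recursive formulation, for illustration only.
def match(pattern, text):
    if pattern.startswith("^"):
        return match_here(pattern[1:], text)
    # try every starting position, including the empty suffix
    return any(match_here(pattern, text[i:]) for i in range(len(text) + 1))

def match_here(pattern, text):
    if not pattern:
        return True
    if len(pattern) > 1 and pattern[1] == "*":
        return match_star(pattern[0], pattern[2:], text)
    if pattern == "$":
        return text == ""
    if text and pattern[0] in (text[0], "."):
        return match_here(pattern[1:], text[1:])
    return False

def match_star(c, pattern, text):
    # greedy-free: try matching the rest after 0, 1, 2, ... copies of c
    i = 0
    while True:
        if match_here(pattern, text[i:]):
            return True
        if i < len(text) and c in (text[i], "."):
            i += 1
        else:
            return False

print(match("^ab*c$", "abbbc"))  # True
print(match("^ab*c$", "adc"))    # False
```

    A production compiler such as REC would instead translate the expression into executable control flow once, rather than re-interpreting it on every match.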

  17. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  18. Introduction to selected references on fossil fuels of the central and southern Appalachian basin: Chapter H.1 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    Science.gov (United States)

    Ruppert, Leslie F.; Lentz, Erika E.; Tewalt, Susan J.; Román Colón, Yomayra A.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    The Appalachian basin contains abundant coal and petroleum resources that have been studied and extracted for at least 150 years. In this volume, U.S. Geological Survey (USGS) scientists describe the geologic framework and geochemical character of the fossil-fuel resources of the central and southern Appalachian basin. Separate subchapters (some previously published) contain geologic cross sections; seismic profiles; burial history models; assessments of Carboniferous coalbed methane and Devonian shale gas; distribution information for oil, gas, and coal fields; data on the geochemistry of natural gas and oil; and the fossil-fuel production history of the basin. Although each chapter and subchapter includes references cited, many historical or other important references on Appalachian basin and global fossil-fuel science were omitted because they were not directly applicable to the chapters.

  19. Virtual Reference Services.

    Science.gov (United States)

    Brewer, Sally

    2003-01-01

    As the need to access information increases, school librarians must create virtual libraries. Linked to reliable reference resources, the virtual library extends the physical collection and library hours and lets students learn to use Web-based resources in a protected learning environment. The growing number of virtual schools increases the need…

  20. 12 CFR 411.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Semi-annual compilation. 411.600 Section 411.600 Banks and Banking EXPORT-IMPORT BANK OF THE UNITED STATES NEW RESTRICTIONS ON LOBBYING Agency Reports § 411.600 Semi-annual compilation. (a) The head of each agency shall collect and compile the...

  1. Compilation of functional languages using flow graph analysis

    NARCIS (Netherlands)

    Hartel, Pieter H.; Glaser, Hugh; Wild, John M.

    A system based on the notion of a flow graph is used to specify formally and to implement a compiler for a lazy functional language. The compiler takes a simple functional language as input and generates C. The generated C program can then be compiled, and loaded with an extensive run-time system to

  2. Discussion on water resources value accounting and its application

    Science.gov (United States)

    Guo, Biying; Huang, Xiaorong; Ma, Kai; Gao, Linyun; Wang, Yanqiu

    2018-06-01

    The compilation of natural resources balance sheets has been explored since it was first proposed in 2013. Several elements of the water resources balance sheet have been actively discussed in China, including the basic concept, framework and accounting methods; these discussions focused on calculating the amount of water resources with statistical methods but lacked an analysis of the interrelationship between physical volume and value. Based on the study of physical accounting for the water resources balance sheet, the connotation of water resources value is analyzed in combination with research on the value of water resources worldwide. Moreover, the theoretical framework, form of measurement and research methods of water resources value accounting are further explored. Taking Chengdu, China as an example, an index system for the water resources balance sheet of Chengdu, covering both physical volume and value, is established to account for the depletion of water resources, environmental damage and ecological water occupation caused by economic and social water use. A water resources balance sheet for the region, reflecting the negative impact of the economy on the environment, is thereby established. This provides a reference for advancing water resources management, improving government and social investment, and realizing the scientific and rational allocation of water resources.
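
    The physical-volume side of such a balance sheet reduces to a simple stock-flow identity. The sketch below is illustrative only; the paper's Chengdu index system is far richer, and the flow categories named here are assumptions, not the paper's.

```java
// Toy sketch of the physical side of a water resources balance sheet
// (illustrative stock-flow identity only; the flow categories are assumed):
// closing stock = opening stock + recharge - withdrawals - ecological use.
public final class WaterBalance {
    public static double closingStock(double opening, double recharge,
                                      double withdrawals, double ecologicalUse) {
        return opening + recharge - withdrawals - ecologicalUse;
    }
}
```

    The value side of the accounting would then attach prices or damage costs to each of these physical flows, which is where the interrelationship discussed above comes in.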

  3. WHO GLOBAL TUBERCULOSIS REPORTS: COMPILATION AND INTERPRETATION

    Directory of Open Access Journals (Sweden)

    I. A. Vаsilyevа

    2017-01-01

    Full Text Available The purpose of the article is to inform national specialists involved in tuberculosis control about the methods used to compile WHO global tuberculosis statistics, which serve in developing strategies and programmes for tuberculosis control and in evaluating their efficiency. The article explains in detail the main WHO epidemiological indicators used in international publications on tuberculosis, along with their registered values, and the new approaches to compiling the lists of countries with the highest burden of tuberculosis, drug-resistant tuberculosis, and tuberculosis with concurrent HIV infection. The article compares the rates in the Russian Federation with global data as well as with data from countries of the WHO European Region and the highest-burden countries. It presents material on the achievement of the global goals in tuberculosis control and the main provisions of the WHO End TB Strategy for 2015-2035, adopted as part of the UN Sustainable Development Goals.

  4. JLAPACK – Compiling LAPACK FORTRAN to Java

    Directory of Open Access Journals (Sweden)

    David M. Doolin

    1999-01-01

    Full Text Available The JLAPACK project provides the LAPACK numerical subroutines translated from their subset Fortran 77 source into class files, executable by the Java Virtual Machine (JVM) and suitable for use by Java programmers. This makes it possible for Java applications or applets, distributed on the World Wide Web (WWW), to use established legacy numerical code that was originally written in Fortran. The translation is accomplished using a special-purpose Fortran-to-Java (source-to-source) compiler. The LAPACK API will be considerably simplified to take advantage of Java's object-oriented design. This report describes the research issues involved in the JLAPACK project, and its current implementation and status.
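
    The flavour of such a source-to-source translation can be shown by hand. The sketch below is not JLAPACK output; it illustrates how a BLAS-style Fortran 77 DAXPY loop (y ← a·x + y) might be rendered as an idiomatic Java static method, with Fortran's 1-based indexing shifted to Java's 0-based arrays.

```java
// Hand-written illustration of a Fortran 77 -> Java translation, in the
// spirit of (but not produced by) the JLAPACK compiler.
//
// Fortran source being mimicked:
//       SUBROUTINE DAXPY(N, A, X, Y)
//       DO 10 I = 1, N
//  10     Y(I) = A * X(I) + Y(I)
public final class Daxpy {
    // Fortran arrays are 1-based; the translation shifts to 0-based indexing.
    public static void daxpy(int n, double a, double[] x, double[] y) {
        for (int i = 0; i < n; i++) {
            y[i] = a * x[i] + y[i];
        }
    }
}
```

    A real translator must also handle Fortran features with no direct Java analogue, such as COMMON blocks and array sections, which is where most of the engineering effort in such a compiler lies.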

  5. Molecular dynamics and diffusion a compilation

    CERN Document Server

    Fisher, David

    2013-01-01

    The molecular dynamics technique was developed in the 1960s as the outgrowth of attempts to model complicated systems either by (a) direct physical simulation or, following the great success of Monte Carlo methods, by (b) computer techniques. Computer simulation soon won out over clumsy physical simulation, and the ever-increasing speed and sophistication of computers has naturally made molecular dynamics simulation a more and more successful technique. One of its most popular applications is the study of diffusion, and some experts now even claim that, for situations involving well-characterised elements and structures, molecular dynamics simulation is more accurate than experimental measurement. The present double volume includes a compilation (over 600 items) of predicted solid-state diffusion data, for all of the major materials groups, dating back nearly four decades. The double volume also includes some original papers: "Determination of the Activation Energy for Formation and ...
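
    Solid-state diffusion data of the kind compiled here are conventionally reported as Arrhenius parameters, a pre-exponential factor D0 and an activation energy Q, from which the diffusivity at temperature T follows as D(T) = D0·exp(-Q/RT). A minimal sketch of that evaluation (illustrative parameter values only, not taken from the compilation):

```java
// Evaluate an Arrhenius diffusivity D(T) = D0 * exp(-Q / (R*T)).
// Parameter values passed in are illustrative, not from the compilation.
public final class Arrhenius {
    static final double R = 8.314; // gas constant, J/(mol*K)

    public static double diffusivity(double d0, double qJPerMol, double tKelvin) {
        return d0 * Math.exp(-qJPerMol / (R * tKelvin));
    }
}
```

    Fitting measured or simulated diffusivities to this form is what yields the (D0, Q) pairs a compilation like this one tabulates.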

  6. Compilation of Existing Neutron Screen Technology

    Directory of Open Access Journals (Sweden)

    N. Chrysanthopoulou

    2014-01-01

    Full Text Available The presence of fast neutron spectra in new reactors is expected to induce a strong impact on the contained materials, including structural materials, nuclear fuels, neutron reflecting materials, and tritium breeding materials. Therefore, introduction of these reactors into operation will require extensive testing of their components, which must be performed under neutronic conditions representative of those expected to prevail inside the reactor cores when in operation. Due to the limited availability of fast reactors, testing of future reactor materials will mostly take place in water-cooled material test reactors (MTRs) by tailoring the neutron spectrum via neutron screens. The latter rely on the utilization of materials capable of absorbing neutrons at specific energies. A large but fragmented body of experience is available on this topic. In this work a comprehensive compilation of the existing neutron screen technology is attempted, focusing on neutron screens developed to locally enhance the fast-over-thermal neutron flux ratio in a reactor core.

  7. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1983-01-01

    The purpose of this compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. It is the result of a biennial review in which the Department of Energy (DOE) and contractors, Department of Defense laboratories and contractors, and other interested groups have been asked to review and revise their requests for nuclear data. It was felt that the evaluators of cross section data and the users of these evaluations should be involved in the review of the data requests to make this compilation more useful. This request list is ordered by target nucleus (Isotope) and then reaction type (Quantity). Each request is assigned a unique identifying number. The first two digits of this number give the year the request was initiated. All requests for a given Isotope and Quantity are grouped (or blocked) together. The requests in a block are followed by any status comments. Each request has a unique Isotope, Quantity and Requester. The requester is identified by laboratory, last name, and sponsoring US government agency, e.g., BET, DEI, DNR. All requesters, together with their addresses and phone numbers, are given in appendix B. A list of the evaluators responsible for ENDF/B-V evaluations with their affiliation appears in appendix C. All requests must give the energy (or range of energy) for the incident particle when appropriate. The accuracy needed in percent is also given. The error quoted is assumed to be 1-sigma at each measured point in the energy range requested unless a comment specifies otherwise. Sometimes a range of accuracy indicated by two values is given or some statement is given in the free text comments. An incident particle energy resolution in percent is sometimes given

  8. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    Weston, L.W.; Larson, D.C.

    1993-02-01

    This compilation represents the current needs for nuclear data measurements and evaluations as expressed by interested fission and fusion reactor designers, medical users of nuclear data, nuclear data evaluators, CSEWG members and other interested parties. The requests and justifications are reviewed by the Data Request and Status Subcommittee of CSEWG as well as most of the general CSEWG membership. The basic format and computer programs for the Request List were produced by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory. The NNDC produced the Request List for many years. The Request List is compiled from a computerized data file. Each request has a unique isotope, reaction type, requestor and identifying number. The first two digits of the identifying number are the year in which the request was initiated. Every effort has been made to restrict the notations to those used in common nuclear physics textbooks. Most requests are for individual isotopes as are most ENDF evaluations, however, there are some requests for elemental measurements. Each request gives a priority rating which will be discussed in Section 2, the neutron energy range for which the request is made, the accuracy requested in terms of one standard deviation, and the requested energy resolution in terms of one standard deviation. Also given is the requestor with the comments which were furnished with the request. The addresses and telephone numbers of the requestors are given in Appendix 1. ENDF evaluators who may be contacted concerning evaluations are given in Appendix 2. Experimentalists contemplating making one of the requested measurements are encouraged to contact both the requestor and evaluator who may provide valuable information. This is a working document in that it will change with time. New requests or comments may be submitted to the editors or a regular CSEWG member at any time

  9. Compiler issues associated with safety-related software

    International Nuclear Information System (INIS)

    Feinauer, L.R.

    1991-01-01

    A critical issue in the quality assurance of safety-related software is the ability of the software to produce identical results, independent of the host machine, operating system, or compiler version under which the software is installed. A study is performed using the VIPRE-01, FREY-01, and RETRAN-02 safety-related codes. Results from an IBM 3083 computer are compared with results from a CYBER 860 computer. All three of the computer programs examined are written in FORTRAN; the VIPRE code uses the FORTRAN 66 compiler, whereas the FREY and RETRAN codes use the FORTRAN 77 compiler. Various compiler options are studied to determine their effect on the output between machines. Since the Control Data Corporation and IBM machines inherently represent numerical data differently, methods of producing equivalent accuracy of data representation were an important focus of the study. This paper identifies particular problems in the automatic double-precision option (AUTODBL) of the IBM FORTRAN 1.4.x series of compilers. The IBM FORTRAN version 2 compilers provide much more stable, reliable compilation for engineering software. Careful selection of compilers and compiler options can help guarantee identical results between different machines. To ensure reproducibility of results, the same compiler and compiler options should be used to install the program as were used in the development and testing of the program
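
    The precision effects at the heart of this study are easy to reproduce. The sketch below (written in Java for portability, not FORTRAN) accumulates the same series in single and double precision; the sums diverge measurably, which is the kind of representation-dependent drift that options such as AUTODBL were meant to control.

```java
// Accumulating the same series in single vs. double precision gives
// measurably different sums: an illustration (in Java, not FORTRAN) of why
// data representation and precision options affected cross-machine results.
public final class PrecisionDemo {
    public static float sumSingle(int n) {
        float s = 0.0f;
        for (int i = 0; i < n; i++) s += 0.1f; // 0.1 is inexact in binary; errors accumulate
        return s;
    }

    public static double sumDouble(int n) {
        double s = 0.0;
        for (int i = 0; i < n; i++) s += 0.1; // same series, 53-bit significand
        return s;
    }
}
```

    One million additions of 0.1 drift by hundreds in single precision while staying essentially exact in double precision, so two builds that differ only in default precision can disagree badly.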

  10. Children with Disabilities Are Often Misdiagnosed Initially and Children with Neuropsychiatric Disorders Are Referred to Adequate Resources 30 Months Later than Children with Other Disabilities

    Science.gov (United States)

    Tuominen-Eriksson, Alli-Marie; Svensson, Yvonne; Gunnarsson, Ronny K.

    2013-01-01

    Disabilities in a child may lead to low self-esteem and social problems. The lives of parents and siblings are also affected. Early intervention may decrease these consequences. To promote early intervention early referral to adequate resources is essential. In a longitudinal retrospective observational study it was found that children with…

  11. Research and Practice of the News Map Compilation Service

    Science.gov (United States)

    Zhao, T.; Liu, W.; Ma, W.

    2018-04-01

    Based on the needs of the news media for maps, this paper researches the news map compilation service: it conducts demand research on the service, designs and compiles a public-authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps that are timely, strongly pertinent and cross-regional in character, constructs a hot-news thematic gallery and news map customization services, conducts research on types of news maps, establishes closer liaison and cooperation methods with the news media, and guides the news media to use correct maps. Through the practice of the news map compilation service, this paper presents two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the research situation of the news map compilation service, and puts forward outstanding problems and development suggestions for the service.

  12. Rubus: A compiler for seamless and extensible parallelism.

    Directory of Open Access Journals (Sweden)

    Muhammad Adnan

    Full Text Available Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special-purpose processing unit called the Graphics Processing Unit (GPU), originally designed for 2D/3D games, is now available for general-purpose use in computers and mobile devices. However, the traditional programming languages, which were designed to work with machines having single-core CPUs, cannot utilize the parallelism available on multi-core processors efficiently. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages like CUDA and OpenCL have been introduced, which can be used to write code with parallelism. The main shortcoming of these languages is that the programmer needs to specify all the complex details manually in order to parallelize the code across multiple cores. Therefore, the code written in these languages is difficult to understand, debug and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of it in CUDA or OpenCL, which can consume significant time and resources. Thus, the amount of parallelism achieved is proportional to the skills of the programmer and the time spent in code optimizations. This paper proposes a new open-source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details. It analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedup and better utilization of the underlying hardware without requiring a programmer's expertise in parallel programming. For five different benchmarks, on average a speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores.
Whereas, for a matrix multiplication benchmark the average execution speedup of 84
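
    The transformation Rubus automates can be pictured with a hand-written example. This is not Rubus output, and it uses Java parallel streams on CPU cores rather than a GPU: a loop whose iterations are independent is rewritten so the iterations run concurrently.

```java
import java.util.stream.IntStream;

// Hand-written illustration of the sequential -> parallel rewriting a
// compiler like Rubus performs automatically (Rubus targets GPUs; this
// sketch uses Java parallel streams purely to show the idea).
public final class ParallelSketch {
    // Sequential element-wise square.
    public static long[] squaresSequential(int n) {
        long[] out = new long[n];
        for (int i = 0; i < n; i++) out[i] = (long) i * i;
        return out;
    }

    // The same computation with its independent iterations run in parallel;
    // each iteration writes a distinct index, so no synchronization is needed.
    public static long[] squaresParallel(int n) {
        long[] out = new long[n];
        IntStream.range(0, n).parallel().forEach(i -> out[i] = (long) i * i);
        return out;
    }
}
```

    Proving that iterations really are independent, so that this rewrite is safe, is exactly the analysis such a compiler must perform.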

  13. Construction experiences from underground works at Oskarshamn. Compilation report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders (Vattenfall Power Consultant AB, Stockholm (SE)); Christiansson, Rolf (Swedish Nuclear Fuel and Waste Management Co., Stockholm (SE))

    2007-12-15

    The main objective with this report is to compile experiences from the underground works carried out at Oskarshamn, primarily construction experiences from the tunnelling of the cooling water tunnels of the Oskarshamn nuclear power units 1, 2 and 3, from the underground excavations of Clab 1 and 2 (Central Interim Storage Facility for Spent Nuclear Fuel), and Aespoe Hard Rock Laboratory. In addition, an account is given of the operational experience of Clab 1 and 2 and of the Aespoe HRL, primarily on scaling and rock support solutions. This report, being a compilation report, is in substance based on earlier published material as presented in the list of references. Approximately 8,000 m of tunnels including three major rock caverns with a total volume of about 550,000 m3 have been excavated. The excavation works of the various tunnels and rock caverns were carried out during the period 1966-2000. In addition, minor excavation works were carried out at the Aespoe HRL in 2003. The depth location of the underground structures varies from near surface down to 450 m. As an overall conclusion it may be said that the rock mass conditions in the area are well suited for underground construction. This conclusion is supported by the experiences from the rock excavation works in the Simpevarp and Aespoe area. These works have shown that no major problems occurred during the excavation works; nor have any stability or other rock engineering problems of significance been identified after the commissioning of the Oskarshamn nuclear power units O1, O2 and O3, BFA, Clab 1 and 2, and Aespoe Hard Rock Laboratory. The underground structures of these facilities were built according to plan, and have since then been operated as planned. Thus, the quality of the rock mass within the construction area is such that it lends itself to excavation of large rock caverns with a minimum of rock support

  14. Construction experiences from underground works at Oskarshamn. Compilation report

    International Nuclear Information System (INIS)

    Carlsson, Anders; Christiansson, Rolf

    2007-12-01

    The main objective with this report is to compile experiences from the underground works carried out at Oskarshamn, primarily construction experiences from the tunnelling of the cooling water tunnels of the Oskarshamn nuclear power units 1, 2 and 3, from the underground excavations of Clab 1 and 2 (Central Interim Storage Facility for Spent Nuclear Fuel), and Aespoe Hard Rock Laboratory. In addition, an account is given of the operational experience of Clab 1 and 2 and of the Aespoe HRL, primarily on scaling and rock support solutions. This report, being a compilation report, is in substance based on earlier published material as presented in the list of references. Approximately 8,000 m of tunnels including three major rock caverns with a total volume of about 550,000 m3 have been excavated. The excavation works of the various tunnels and rock caverns were carried out during the period 1966-2000. In addition, minor excavation works were carried out at the Aespoe HRL in 2003. The depth location of the underground structures varies from near surface down to 450 m. As an overall conclusion it may be said that the rock mass conditions in the area are well suited for underground construction. This conclusion is supported by the experiences from the rock excavation works in the Simpevarp and Aespoe area. These works have shown that no major problems occurred during the excavation works; nor have any stability or other rock engineering problems of significance been identified after the commissioning of the Oskarshamn nuclear power units O1, O2 and O3, BFA, Clab 1 and 2, and Aespoe Hard Rock Laboratory. The underground structures of these facilities were built according to plan, and have since then been operated as planned. Thus, the quality of the rock mass within the construction area is such that it lends itself to excavation of large rock caverns with a minimum of rock support

  15. Proceedings of the fifth international groundwater conference on the assessment and management of groundwater resources in hard rock systems with special reference to basaltic terrain

    International Nuclear Information System (INIS)

    Thangarajan, M.; Mayilswami, C.; Kulkarni, P.S.; Singh, V.P.

    2012-01-01

    Groundwater resources in hard rock regions with limited renewable potential have to be managed judiciously to ensure adequate supplies of dependable quantity and quality. Groundwater is a natural resource with economic, strategic and environmental value, and it is under stress due to both changing climatic and anthropogenic factors. Management strategies therefore need to be aimed at sustaining this limited resource. In India, and elsewhere in the world, major parts of the semi-arid regions are characterized by hard rocks, and it is of vital importance to understand the nature of the aquifer systems and their current stress conditions. Though the achievements of scientific development in exploration and exploitation are commendable, that exploitation has adversely affected the hard rock aquifer system, both in terms of quantity and quality, which is a major concern today. In order to reverse the situation, better management strategies for groundwater resources need to be devised to prevent further degradation of quality and to meet future demand. This necessitates: understanding the flow mechanism, evaluating the potential and evolving optimal utilization schemes, and assessing and monitoring quality in the changing scenario of anthropogenically induced agricultural, urban, industrial and climatic change. The groundwater flow mechanism through fractures in hard rocks is yet to be fully understood in terms of fracture geometry and its relation to groundwater flow. The characterization of flow geometry in basaltic aquifers is yet to be fully explored. Groundwater pollution due to anthropogenic factors is a very slow process with long-term impacts on the carbon cycle and global climatic change on one hand and on quality on the other. It is generally recognized that preventing groundwater pollution is cheaper in the long run than remediating it. Furthermore, because of the nature of groundwater flow and the complexity and management uncertainty of

  16. Radio-contaminated patients intake at the Nancy teaching hospital: study of the resources and implication possibilities of radiation protection referent personnel

    International Nuclear Information System (INIS)

    Guionnet, Christophe

    2011-01-01

    In events involving radioactive materials, the organization of care in health facilities requires both the implementation of skills that go beyond the usual trade and professional duties of each of the partners, and the identification of the departments and referent staff likely to be most affected by these risks. This study, conducted within the Radiologie and Medecine Nucleaire departments of the Centre Hospitalier Universitaire de Nancy, highlights the existence of reference equipment for detecting radiation, although the conditions under which it will be deployed remain to be defined. Staff can be called upon for missions of information, detection, patient screening and decontamination. Volunteers with competence in radiation protection (Personnes Competentes en Radioprotection) and radiographers can carry out these missions after practical training. (author) [fr

  17. Compiling quantum circuits to realistic hardware architectures using temporal planners

    Science.gov (United States)

    Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy

    2018-04-01

    To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest-neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits, whose high number of commuting gates allows great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations but also means there is a greater potential win from more optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem, and generate a test suite of compilation problems for QAOA circuits of various sizes on a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.
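
    The combinatorial core of the compilation problem can be seen in a deliberately naive sketch (not the temporal-planning method of the paper): on a linear nearest-neighbor architecture, a two-qubit gate may only act on adjacent qubits, so a router must insert SWAP gates to bring operands together. Minimizing such insertions, and overlapping them in time, is what the planners optimize.

```java
// Deliberately naive illustration of the routing subproblem in quantum
// circuit compilation: on a linear nearest-neighbor architecture, move
// non-adjacent gate operands together with SWAPs, one slot at a time.
// Real compilers (and the paper's temporal planners) minimize and overlap
// these insertions; this greedy version just counts operations.
public final class NaiveRouter {
    // gates[i] = {a, b}: a two-qubit gate on logical qubits a and b.
    // Returns total operation count: one op per inserted SWAP plus one per gate.
    public static int route(int numQubits, int[][] gates) {
        int[] pos = new int[numQubits];         // pos[q] = physical slot of qubit q
        int[] slotToQubit = new int[numQubits]; // inverse mapping
        for (int q = 0; q < numQubits; q++) { pos[q] = q; slotToQubit[q] = q; }
        int ops = 0;
        for (int[] g : gates) {
            int a = g[0], b = g[1];
            // Greedily swap qubit a one slot toward b until they are adjacent.
            while (Math.abs(pos[a] - pos[b]) > 1) {
                int step = pos[a] < pos[b] ? 1 : -1;
                int neighborSlot = pos[a] + step;
                int other = slotToQubit[neighborSlot];
                slotToQubit[pos[a]] = other;   // other moves into a's old slot
                slotToQubit[neighborSlot] = a;
                pos[other] = pos[a];
                pos[a] = neighborSlot;
                ops++;                         // the inserted SWAP
            }
            ops++;                             // the gate itself
        }
        return ops;
    }
}
```

    Note how routing is stateful: after the first long-range gate, the qubit layout has changed, so later gates may become cheaper or dearer depending on gate order, which is why the commuting gates of QAOA give the planner so much room to optimize.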

  18. One hundred prime references on hydrogeochemical and stream sediment surveying for uranium as internationally practiced, including 60 annotated references

    International Nuclear Information System (INIS)

    Sharp, R.R. Jr.; Bolivar, S.L.

    1981-04-01

    The United States Department of Energy (DOE), formerly the US ERDA, has initiated a nationwide Hydrogeochemical and Stream Sediment Reconnaissance (HSSR). This program is part of the US National Uranium Resource Evaluation, designed to provide an improved estimate of the availability and economics of nuclear fuel resources and to make information available to industry for use in the exploration and development of uranium resources. The Los Alamos National Laboratory is responsible for completing the HSSR in the Rocky Mountain states of New Mexico, Colorado, Wyoming, and Montana and in the state of Alaska. This report contains a compilation of 100 prime references on uranium hydrogeochemical and stream sediment reconnaissance as internationally practiced prior to 1977. The major emphasis in selecting these references was directed toward constructing an HSSR program for identifying uranium in the Los Alamos National Laboratory area of responsibility. The annotated abstracts reflect the authors' view of what each article contains relative to uranium geochemistry and hydrogeochemical and stream sediment surveying. Consequently, in many cases, significant portions of the original articles are not discussed. The text consists of two parts. Part I contains the 100 prime references, alphabetically arranged. Part II contains 60 selected annotated abstracts, listed in chronological order

  19. Identification, Verification, and Compilation of Produced Water Management Practices for Conventional Oil and Gas Production Operations

    Energy Technology Data Exchange (ETDEWEB)

    Rachel Henderson

    2007-09-30

    The project is titled 'Identification, Verification, and Compilation of Produced Water Management Practices for Conventional Oil and Gas Production Operations'. The Interstate Oil and Gas Compact Commission (IOGCC), headquartered in Oklahoma City, Oklahoma, is the principal investigator, and the IOGCC has partnered with ALL Consulting, Inc., headquartered in Tulsa, Oklahoma, in this project. State agencies that have also partnered in the project are the Wyoming Oil and Gas Conservation Commission, the Montana Board of Oil and Gas Conservation, the Kansas Oil and Gas Conservation Division, the Oklahoma Oil and Gas Conservation Division and the Alaska Oil and Gas Conservation Commission. The objective is to characterize produced water quality and management practices for the handling, treating, and disposing of produced water from conventional oil and gas operations throughout the industry nationwide. Water produced from these operations varies greatly in quality and quantity and is often the single largest barrier to the economic viability of wells. The lack of data, coupled with renewed emphasis on domestic oil and gas development, has prompted many experts to speculate that the number of wells drilled over the next 20 years will approach 3 million, or near the number of current wells. This level of exploration and development undoubtedly will draw the attention of environmental communities, focusing their concerns on produced water management based on perceived potential impacts to fresh water resources. Therefore, it is imperative that produced water management practices be performed in a manner that best minimizes environmental impacts. This is being accomplished by compiling current best management practices for produced water from conventional oil and gas operations and by developing an analysis tool based on a geographic information system (GIS) to assist in the understanding of watershed-issued permits. That would allow management costs to be kept in

  20. Compiling a Monolingual Dictionary for Native Speakers

    Directory of Open Access Journals (Sweden)

    Patrick Hanks

    2011-10-01

    Full Text Available

    ABSTRACT: This article gives a survey of the main issues confronting the compilers of monolingual dictionaries in the age of the Internet. Among others, it discusses the relationship between a lexical database and a monolingual dictionary, the role of corpus evidence, historical principles in lexicography vs. synchronic principles, the instability of word meaning, the need for full vocabulary coverage, principles of definition writing, the role of dictionaries in society, and the need for dictionaries to give guidance on matters of disputed word usage. It concludes with some questions about the future of dictionary publishing.

    SUMMARY [translated from Afrikaans]: Compiling a monolingual dictionary for native speakers. This article gives an overview of the main issues confronting the compilers of monolingual dictionaries in the age of the Internet. Among other things, it discusses the relationship between a lexical database and a monolingual dictionary, the role of corpus evidence, historical vs. synchronic principles in lexicography, the instability of word meaning, the need for full vocabulary coverage, principles of definition writing, the role of dictionaries in society, and the need for dictionaries to give guidance on matters of disputed word usage. It concludes with a number of questions about the future of dictionary publishing.

    Keywords [translated from Afrikaans]: MONOLINGUAL DICTIONARIES, LEXICAL DATABASE, DICTIONARY STRUCTURE, WORD MEANING, MEANING CHANGE, USAGE, USAGE NOTES, HISTORICAL PRINCIPLES OF LEXICOGRAPHY, SYNCHRONIC PRINCIPLES OF LEXICOGRAPHY, REGISTER, SLANG, STANDARD ENGLISH, VOCABULARY COVERAGE, CONSISTENCY OF COLLECTIONS, PHRASEOLOGY, SYNTAGMATIC PATTERNS, PROBLEMS OF COMPOSITIONALITY, LINGUISTIC PRESCRIPTIVISM, LEXICAL EVIDENCE

  1. Sharing analysis in the Pawns compiler

    Directory of Open Access Journals (Sweden)

    Lee Naish

    2015-09-01

    Full Text Available Pawns is a programming language under development that supports algebraic data types, polymorphism, higher order functions and “pure” declarative programming. It also supports impure imperative features including destructive update of shared data structures via pointers, allowing significantly increased efficiency for some operations. A novelty of Pawns is that all impure “effects” must be made obvious in the source code and they can be safely encapsulated in pure functions in a way that is checked by the compiler. Execution of a pure function can perform destructive updates on data structures that are local to or eventually returned from the function without risking modification of the data structures passed to the function. This paper describes the sharing analysis which allows impurity to be encapsulated. Aspects of the analysis are similar to other published work, but in addition it handles explicit pointers and destructive update, higher order functions including closures and pre- and post-conditions concerning sharing for functions.
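    The kind of sharing (alias) analysis the abstract describes can be caricatured in a few lines: variables that may reach the same memory cell are kept in one group, so a compiler can tell which bindings a destructive update may affect. The union-find sketch below is an illustrative toy, not the actual Pawns algorithm.

```python
class SharingAnalysis:
    """Toy sharing (alias) analysis: variables that may share memory
    are merged into one group via union-find."""
    def __init__(self):
        self.parent = {}

    def find(self, v):
        # Follow parent links to the group representative.
        self.parent.setdefault(v, v)
        while self.parent[v] != v:
            v = self.parent[v]
        return v

    def share(self, a, b):
        # Record that b is bound to (part of) a, e.g. "b = a".
        self.parent[self.find(a)] = self.find(b)

    def may_alias(self, a, b):
        return self.find(a) == self.find(b)

s = SharingAnalysis()
s.share("xs", "ys")             # ys is bound to (part of) xs
s.share("zs", "ws")
print(s.may_alias("xs", "ys"))  # → True: destructively updating ys may change xs
print(s.may_alias("xs", "zs"))  # → False: update of zs cannot affect xs
```

    A compiler with this information can allow an impure update inside a function whenever the updated structure's group contains only local variables.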

  2. HAL/S-360 compiler test activity report

    Science.gov (United States)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  3. Regulatory and technical reports: compilation for 1975-1978

    International Nuclear Information System (INIS)

    1982-04-01

    This brief compilation lists formal reports issued by the US Nuclear Regulatory Commission in 1975 through 1978 that were not listed in the Regulatory and Technical Reports Compilation for 1975 to 1978, NUREG-0304, Vol. 3. This compilation is divided into two sections. The first consists of a sequential listing of all reports in report-number order. The second section consists of an index developed from keywords in report titles and abstracts

  4. HOPE: Just-in-time Python compiler for astrophysical computations

    Science.gov (United States)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of a compiled implementation.
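    The decorator-based interface described above can be illustrated with a toy sketch. The real HOPE translates the function body to C++ on first call; here simple memoization stands in for the compiled fast path, since the single-decorator usage pattern is the point, not the code generation.

```python
import functools

def jit(func):
    """Toy stand-in for a JIT decorator: the wrapped function is
    'compiled' (here: just evaluated and cached) on first call."""
    compiled = {}
    @functools.wraps(func)
    def wrapper(*args):
        if args not in compiled:
            compiled[args] = func(*args)  # stand-in for "compile, then run fast"
        return compiled[args]
    return wrapper

@jit
def polynomial(x):
    return 2.0 * x ** 3 - x + 1.0

print(polynomial(2.0))  # → 15.0
```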

  5. Growth references

    NARCIS (Netherlands)

    Buuren, S. van

    2007-01-01

    A growth reference describes the variation of an anthropometric measurement within a group of individuals. A reference is a tool for grouping and analyzing data and provides a common basis for comparing populations. A well-known type of reference is the age-conditional growth diagram.

  6. Compiler Construction Using Java, JavaCC, and Yacc

    CERN Document Server

    Dos Reis, Anthony J

    2012-01-01

    Broad in scope, involving theory, the application of that theory, and programming technology, compiler construction is a moving target, with constant advances in compiler technology taking place. Today, a renewed focus on do-it-yourself programming makes a quality textbook on compilers, that both students and instructors will enjoy using, of even more vital importance. This book covers every topic essential to learning compilers from the ground up and is accompanied by a powerful and flexible software package for evaluating projects, as well as several tutorials, well-defined projects, and tes

  7. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-10-01

    This is the fourth issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section every year. The material contained in this compilation is sorted according to eight subject categories: General compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes, half-lives, energies and spectra; nuclear decay processes, gamma-rays; nuclear decay processes, fission products; nuclear decay processes (others); atomic processes

  8. Automatic Parallelization An Overview of Fundamental Compiler Techniques

    CERN Document Server

    Midkiff, Samuel P

    2012-01-01

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and
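    One of the dependence analyses mentioned above can be shown concretely. The classic GCD test decides whether two affine array accesses in a loop can ever touch the same element; when they cannot, the loop iterations are independent and the loop is a parallelization candidate. A minimal sketch:

```python
from math import gcd

def gcd_test(a, b, c, d):
    """GCD dependence test for accesses A[a*i + b] (write) and
    A[c*i + d] (read) in the same loop: a dependence is possible
    only if gcd(a, c) divides d - b. Returns False when the loop
    is provably free of this dependence."""
    g = gcd(a, c)
    if g == 0:            # both strides zero: dependence iff same offset
        return b == d
    return (d - b) % g == 0

# A[2*i] = A[2*i + 1]: offsets differ by 1, gcd is 2 -> independent
print(gcd_test(2, 0, 2, 1))  # → False
# A[2*i] = A[2*i + 4]: gcd 2 divides 4 -> dependence possible
print(gcd_test(2, 0, 2, 4))  # → True
```

    The test is conservative: a True answer only means a dependence *may* exist, which is exactly the kind of may-information the transformations described in the book must respect.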

  9. 1989 OCRWM [Office of Civilian Radioactive Waste Management] Bulletin compilation and index

    International Nuclear Information System (INIS)

    1990-02-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1989 calendar year. A table of contents and one index have been provided to assist in finding information contained in this year's Bulletins. The pages have been numbered consecutively at the bottom for easy reference. 7 figs

  10. A compilation of experimental burnout data for axial flow of water in rod bundles

    International Nuclear Information System (INIS)

    Chapman, A.G.; Carrard, G.

    1981-02-01

    A compilation has been made of burnout (critical heat flux) data from the results of more than 12,000 tests on 321 electrically heated, water-cooled experimental assemblies, each simulating, to some extent, the operating or postulated accident conditions in the fuel elements of water-cooled nuclear power reactors. The main geometric characteristics of the assemblies are listed and references are given for the sources of information from which the data were gathered

  11. WalRB project: translating the legend of the Soil Map of Belgium into the World Reference Base for Soil Resources (WRB) system

    Directory of Open Access Journals (Sweden)

    Bouhon, A.

    2011-01-01

    Full Text Available WalRB project: translation of the legend of the soil map of Belgium into the World Reference Base for Soil Resources (WRB). Soil maps are among the most important reference maps in the environmental and agricultural fields. Determination of land, agricultural potential, erosion threat, land management or soil pollution are some topics that need spatial soil data. Attention to cross-border environmental matters, such as soil protection, has become an international concern that requires harmonized soil information. This is why the World Reference Base for Soil Resources has been selected by the European Union as the official soil classification system (IUSS Working Group WRB, 2007). Belgium is one of the first nations to have completed a soil survey of the whole country at a large scale (1:20,000). The legend of the soil map of Belgium is based on three or four main soil specifications: texture, drainage class, profile development and stoniness nature (for stony soils), each one represented by a letter. Those three or four letters together form the main soil series. A prefix and suffix may be added to detail it further. The WRB system, based on soil morphology, is formed of two levels: 32 Reference Soil Groups (RSGs) and various qualifiers (prefix, suffix or both). A common methodology between Flanders, Luxembourg and Wallonia (which use the same soil map legend) is required to carry out the translation. Data from different databases, digital soil maps, soil profile descriptions, soil analytical data, a Digital Elevation Model and other thematic maps (e.g. flooding hazard areas) are collected and organized in a common PostgreSQL database [Belgian Soil Profile Database (BSP)], with the PostGIS geographical extension, hosted on a dedicated server. Data validation is proposed to be done under the auspices of the National Soil Committee of the Royal Academy for Sciences and Arts of Belgium. Algorithms are implemented in the Perl and R languages.

  12. Compilation of data for radionuclide transport analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other scenarios selected. The data required by the selected near field, geosphere and biosphere models are given and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed and the uncertainty is qualitatively addressed for data to the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic in order not to underestimate the radionuclide release rates. It is judged that this approach combined with the selected calculation cases will illustrate the effects of uncertainties in processes and events that affects the evolution of the system as well as in quantitative data that describes this. The biosphere model allows for probabilistic calculations and the uncertainty in input data are quantified by giving minimum, maximum and mean values as well as the type of probability distribution function.

  13. Compilation of data for radionuclide transport analysis

    International Nuclear Information System (INIS)

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other scenarios selected. The data required by the selected near field, geosphere and biosphere models are given and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed and the uncertainty is qualitatively addressed for data to the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic in order not to underestimate the radionuclide release rates. It is judged that this approach combined with the selected calculation cases will illustrate the effects of uncertainties in processes and events that affects the evolution of the system as well as in quantitative data that describes this. The biosphere model allows for probabilistic calculations and the uncertainty in input data are quantified by giving minimum, maximum and mean values as well as the type of probability distribution function

  14. An Initial Evaluation of the NAG f90 Compiler

    Directory of Open Access Journals (Sweden)

    Michael Metcalf

    1992-01-01

    Full Text Available A few weeks before the formal publication of the ISO Fortran 90 Standard, NAG announced the world's first f90 compiler. We have evaluated the compiler by using it to assess the impact of Fortran 90 on the CERN Program Library.

  15. Compiling the First Monolingual Lusoga Dictionary | Nabirye | Lexikos

    African Journals Online (AJOL)

    Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular ... This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary. Keywords: lexicography ...

  16. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    The material contained in this compilation is sorted according to eight subject categories: 1. General Compilations; 2. Basic Isotopic Properties; 3. Nuclear Structure Properties; 4. Nuclear Decay Processes: Half-lives, Energies and Spectra; 5. Nuclear Decay Processes: Gamma-rays; 6. Nuclear Decay Processes: Fission Products; 7. Nuclear Decay Processes: (Others); 8. Atomic Processes

  17. Production compilation : A simple mechanism to model complex skill acquisition

    NARCIS (Netherlands)

    Taatgen, N.A.; Lee, F.J.

    2003-01-01

    In this article we describe production compilation, a mechanism for modeling skill acquisition. Production compilation has been developed within the ACT-Rational (ACT-R; J. R. Anderson, D. Bothell, M. D. Byrne, & C. Lebiere, 2002) cognitive architecture and consists of combining and specializing
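    The combine-and-specialize step that production compilation performs can be caricatured as follows. Rules are shown as sets of conditions and additions; when one rule's output feeds the next rule's input, the pair collapses into a single specialized rule with the intermediate step removed. This is an illustrative toy representation, not ACT-R's actual rule format.

```python
def compose(rule1, rule2):
    """Toy production compilation: merge two rules that fire in
    sequence into one, dropping the intermediate results that
    rule2 consumes from rule1."""
    intermediate = rule1["adds"] & rule2["needs"]
    return {
        "needs": rule1["needs"] | (rule2["needs"] - intermediate),
        "adds":  (rule1["adds"] - intermediate) | rule2["adds"],
    }

# A retrieval rule followed by a use-the-result rule...
r1 = {"needs": {"goal:add 3 4"}, "adds": {"retrieved:7"}}
r2 = {"needs": {"retrieved:7"}, "adds": {"answer:7"}}
# ...compiles into one rule that answers directly, skipping retrieval.
print(compose(r1, r2))
```

    Repeated composition of this kind is what turns a slow, step-by-step novice strategy into a single fast expert production.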

  18. Internal combustion engines for alcohol motor fuels: a compilation of background technical information

    Energy Technology Data Exchange (ETDEWEB)

    Blaser, Richard

    1980-11-01

    This compilation, a draft training manual containing technical background information on internal combustion engines and alcohol motor fuel technologies, is presented in 3 parts. The first is a compilation of facts from the state of the art on internal combustion engine fuels and their characteristics and requisites and provides an overview of fuel sources, fuels technology and future projections for availability and alternatives. Part two compiles facts about alcohol chemistry, alcohol identification, production, and use, examines ethanol as spirit and as fuel, and provides an overview of modern evaluation of alcohols as motor fuels and of the characteristics of alcohol fuels. The final section compiles cross references on the handling and combustion of fuels for I.C. engines, presents basic evaluations of events leading to the use of alcohols as motor fuels, reviews current applications of alcohols as motor fuels, describes the formulation of alcohol fuels for engines and engine and fuel handling hardware modifications for using alcohol fuels, and introduces the multifuel engines concept. (LCL)

  19. Rubus: A compiler for seamless and extensible parallelism

    Science.gov (United States)

    Adnan, Muhammad; Aslam, Faisal; Sarwar, Syed Mansoor

    2017-01-01

    Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special-purpose processing unit called the Graphics Processing Unit (GPU), originally designed for 2D/3D games, is now available for general-purpose use in computers and mobile devices. However, traditional programming languages, which were designed to work with machines having single-core CPUs, cannot efficiently utilize the parallelism available on multi-core processors. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages like CUDA and OpenCL have been introduced, which can be used to write parallel code. The main shortcoming of these languages is that the programmer needs to specify all the complex details manually in order to parallelize the code across multiple cores. Therefore, code written in these languages is difficult to understand, debug and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of it in CUDA or OpenCL, which can consume significant time and resources. Thus, the amount of parallelism achieved is proportional to the skills of the programmer and the time spent on code optimizations. This paper proposes a new open-source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details. It analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedup and better utilization of the underlying hardware without a programmer's expertise in parallel programming. For five different benchmarks, an average speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores, while for a matrix multiplication benchmark an average execution speedup of 84 times has been

  20. [Reference citation].

    Science.gov (United States)

    Brkić, Silvija

    2013-01-01

    Scientific and professional papers represent the information basis for scientific research and professional work. References important for the paper should be cited within the text, and listed at the end of the paper. This paper deals with different styles of reference citation. Special emphasis was placed on the Vancouver Style for reference citation in biomedical journals established by the International Committee of Medical Journal Editors. It includes original samples for citing various types of articles, both printed and electronic, as well as recommendations related to reference citation in accordance with the methodology and ethics of scientific research and guidelines for preparing manuscripts for publication.
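    The Vancouver pattern for a journal article mentioned above (authors, title, abbreviated journal, year;volume(issue):pages, with at most six authors listed before "et al.") is mechanical enough to sketch as a small formatter. This is an illustrative simplification covering only the basic journal-article case.

```python
def vancouver(authors, title, journal, year, volume, issue, pages):
    """Minimal Vancouver-style formatter for a journal article:
    list up to six authors, then 'et al.' (simplified sketch)."""
    names = authors[:6] + ["et al."] if len(authors) > 6 else authors
    return (f"{', '.join(names)}. {title}. {journal}. "
            f"{year};{volume}({issue}):{pages}.")

print(vancouver(["Halpern SD", "Ubel PA", "Caplan AL"],
                "Solid-organ transplantation in HIV-infected patients",
                "N Engl J Med", 2002, 347, 4, "284-7"))
# → Halpern SD, Ubel PA, Caplan AL. Solid-organ transplantation in
#   HIV-infected patients. N Engl J Med. 2002;347(4):284-7.
```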
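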

  1. A Hybrid Approach to Proving Memory Reference Monotonicity

    KAUST Repository

    Oancea, Cosmin E.; Rauchwerger, Lawrence

    2013-01-01

    Array references indexed by non-linear expressions or subscript arrays represent a major obstacle to compiler analysis and to automatic parallelization. Most previous proposed solutions either enhance the static analysis repertoire to recognize more

  2. Reference Assessment

    Science.gov (United States)

    Bivens-Tatum, Wayne

    2006-01-01

    This article presents interesting articles that explore several different areas of reference assessment, including practical case studies and theoretical articles that address a range of issues such as librarian behavior, patron satisfaction, virtual reference, or evaluation design. They include: (1) "Evaluating the Quality of a Chat Service"…

  3. Compilation of kinetic data for geochemical calculations

    International Nuclear Information System (INIS)

    Arthur, R.C.; Savage, D.; Sasamoto, Hiroshi; Shibata, Masahiro; Yui, Mikazu

    2000-01-01

    Kinetic data, including rate constants, reaction orders and activation energies, are compiled for 34 hydrolysis reactions involving feldspars, sheet silicates, zeolites, oxides, pyroxenes and amphiboles, and for similar reactions involving calcite and pyrite. The data are compatible with a rate law consistent with surface-reaction control and transition-state theory, which is incorporated in the geochemical software packages EQ3/6 and GWB. Kinetic data for the reactions noted above are strictly compatible with the transition-state rate law only under far-from-equilibrium conditions. It is possible that the data are conceptually consistent with this rate law under both far-from-equilibrium and near-to-equilibrium conditions, but this should be confirmed whenever possible through analysis of original experimental results. Due to limitations in the availability of kinetic data for mineral-water reactions, and in order to simplify evaluations of geochemical models of groundwater evolution, it is convenient to assume local equilibrium in such models whenever possible. To assess whether this assumption is reasonable, a modeling approach accounting for coupled fluid flow and water-rock interaction is described that can be used to estimate the spatial and temporal scales of local equilibrium. The approach is demonstrated for conditions involving groundwater flow in fractures at JNC's Kamaishi in-situ test site, and is also used to estimate the travel time necessary for oxidizing surface waters to migrate to the level of a HLW repository in crystalline rock. The question of whether local equilibrium is a reasonable assumption must be addressed using an appropriate modeling approach. To be appropriate for conditions at the Kamaishi site using the modeling approach noted above, the fracture fill must closely approximate a porous medium, groundwater flow must be purely advective, and diffusion of solutes across the fracture-host rock boundary must not occur. Moreover, the mineralogical and
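    A transition-state-theory rate law of the general form used in codes like EQ3/6, r = k(T) · A · (1 - Q/K), can be sketched numerically. The rate constant k25, activation energy, surface area and units below are assumed placeholder values for illustration, not data from the compilation.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def dissolution_rate(k25, Ea, T, surface_area, Q_over_K):
    """Sketch of a TST rate law: r = k(T) * A * (1 - Q/K), with k(T)
    scaled from the 25 degC rate constant k25 via the Arrhenius
    relation. Assumed units: k in mol/m^2/s, A in m^2, Ea in J/mol."""
    kT = k25 * math.exp(-(Ea / R) * (1.0 / T - 1.0 / 298.15))
    return kT * surface_area * (1.0 - Q_over_K)

# Far from equilibrium (Q/K ~ 0) the rate equals k(T)*A;
# at equilibrium (Q/K = 1) the driving force, and the rate, vanish.
print(dissolution_rate(1e-12, 60000.0, 298.15, 1.0, 0.0))  # → 1e-12
print(dissolution_rate(1e-12, 60000.0, 298.15, 1.0, 1.0))  # → 0.0
```

    The (1 - Q/K) factor is exactly why the abstract stresses that the compiled data are strictly valid only far from equilibrium, where that factor is close to one.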

  4. Genetics Home Reference

    Science.gov (United States)

    Genetics Home Reference provides consumer-friendly information about the effects of genetic variation on human health, covering more than 1,200 health conditions.

  5. Compilation of anatomical, physiological and metabolic characteristics for a reference Vietnamese man

    International Nuclear Information System (INIS)

    Nguyen Manh Lien

    1998-01-01

    In general, over the course of time, a phenomenon of acceleration in physical development may be observed, i.e. the children and adults of the next generation are taller and heavier than those of the former generation. The data presented in this paper show a regular trend of acceleration in the development of the Vietnamese, but the trend is still slow and was most probably influenced by the hardships of a long period of war. It is hoped that the acceleration in development will increase in the future, following the economic growth of the country; however, it is known that the ratios between the lengths of different parts of the human body are specific characteristics of a human race, sex and age group. We may therefore estimate these ratios to prolong the utilization of our measured physical data. The results of studies on the water balance of Vietnamese living in comfortable ambient air temperature conditions and working in hot environments with different levels of energy expenditure, and the elemental composition of the sweat of workers in hot environments, are also presented, as well as the masses of the major internal organs of the Vietnamese. The food consumption data of the Viet Nam National Institute of Nutrition (1986) show an unbalanced state and deficient food intake in the nutrition of the Vietnamese. However, after the economic reconstruction of recent years, the data on food consumption and food supply have varied. The quantities of protein, fat and milk products in people's diets increase steadily. (author)

  6. Compilation of references, data sources and analysis methods for LMFBR primary piping system components

    International Nuclear Information System (INIS)

    Reich, M.; Esztergar, E.P.; Ellison, E.G.; Erdogan, F.; Gray, T.G.F.; Wells, C.W.

    1977-03-01

    A survey and review program for application of fracture mechanics methods in elevated temperature design and safety analysis has been initiated in December of 1976. This is the first of a series of reports, the aim of which is to provide a critical review of the theories of fracture and the application of fracture mechanics methods to life prediction, reliability and safety analysis of piping components in nuclear plants undergoing sub-creep and elevated temperature service conditions

  7. Using Compilers to Enhance Cryptographic Product Development

    Science.gov (United States)

    Bangerter, E.; Barbosa, M.; Bernstein, D.; Damgård, I.; Page, D.; Pagter, J. I.; Sadeghi, A.-R.; Sovio, S.

    Developing high-quality software is hard in the general case, and it is significantly more challenging in the case of cryptographic software. A high degree of new skill and understanding must be learnt and applied without error to avoid vulnerability and inefficiency. This is often beyond the financial, manpower or intellectual resources available. In this paper we present the motivation for the European-funded CACE (Computer Aided Cryptography Engineering) project. The main objective of CACE is to provide engineers (with limited or no expertise in cryptography) with a toolbox that allows them to generate robust and efficient implementations of cryptographic primitives. We also present some preliminary results already obtained in the early stages of this project, and discuss the relevance of the project as perceived by stakeholders in the mobile device arena.

  8. The technical results of the Swedish nuclear weapons programme - a compilation of FOAs annual reports 1945-1972

    International Nuclear Information System (INIS)

    Oliver, L.; Stenholm, L.

    2002-02-01

    The aim of this report is to summarise FOA's nuclear weapons related research performed between 1945 and 1972. The report is a compilation of FOA's annual reports, which were originally classified but have now mostly been declassified. References to separate reports in the different research areas are included in the report

  9. Compilation of comments concerning the 3rd draft revision of the IAEA regulations for the safe transport of radioactive materials

    International Nuclear Information System (INIS)

    1983-08-01

    The report contains comments made by Member States and international organizations on the third draft revision of the International Atomic Energy Agency's regulations for the safe transport of radioactive materials. The comments are compiled in logical groups referring to various aspects of the regulations

  10. Oil Well Bottom Hole Locations, This GIS data set was produced as a general reference for the Department of Natural Resources, the oil and gas industry, environmental and regulatory agencies, landowners, and the public., Published in 2007, 1:24000 (1in=2000ft) scale, Louisiana State University (LSU).

    Data.gov (United States)

    NSGIC Education | GIS Inventory — Oil Well Bottom Hole Locations dataset current as of 2007. This GIS data set was produced as a general reference for the Department of Natural Resources, the oil and...

  11. A compilation of energy costs of physical activities.

    Science.gov (United States)

    Vaz, Mario; Karaolis, Nadine; Draper, Alizon; Shetty, Prakash

    2005-10-01

    There were two objectives: first, to review the existing data on energy costs of specified activities in the light of the recommendations made by the Joint Food and Agriculture Organization/World Health Organization/United Nations University (FAO/WHO/UNU) Expert Consultation of 1985; second, to compile existing data on the energy costs of physical activities for an updated annexure of the current Expert Consultation on Energy and Protein Requirements. Electronic and manual searches of the literature (predominantly English) were made to obtain published data on the energy costs of physical activities. The majority of the data prior to 1955 were obtained from an earlier compilation by Passmore and Durnin. Energy costs were expressed as the physical activity ratio (PAR): the energy cost of the activity divided by either the measured or predicted basal metabolic rate (BMR). The compilation provides PARs for an expanded range of activities that include general personal activities, transport, domestic chores, occupational activities, sports and other recreational activities for men and women, separately, where available. The present compilation is largely in agreement with the 1985 compilation for activities that are common to both. The present compilation has been based on the need to provide data on adults for a wide spectrum of human activity. There are, however, lacunae in the available data for many activities, between genders, across age groups and in various physiological states.
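    The PAR definition used in the compilation is a simple ratio and can be shown directly. The numbers below are purely illustrative, not values from the compilation itself.

```python
def par(activity_kj_per_min, bmr_kj_per_min):
    """Physical activity ratio: energy cost of an activity divided
    by the (measured or predicted) basal metabolic rate."""
    return activity_kj_per_min / bmr_kj_per_min

# Illustrative numbers only: an activity costing 15 kJ/min for a
# person with a BMR of 5 kJ/min has a PAR of 3.
print(par(15.0, 5.0))  # → 3.0
```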

  12. RESEARCH AND PRACTICE OF THE NEWS MAP COMPILATION SERVICE

    Directory of Open Access Journals (Sweden)

    T. Zhao

    2018-04-01

    Full Text Available Based on the news media's needs for maps, this paper researches the news map compilation service. It conducts demand research on the service, designs and compiles a public-authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps with timeliness, strong pertinence and cross-regional characteristics, constructs a hot-news thematic gallery and news map customization services, conducts research on the types of news maps, establishes closer liaison and cooperation methods with the news media, and guides the news media in using correct maps. Through the practice of the news map compilation service, this paper lists two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and development suggestions for the service.

  13. Recent references

    International Nuclear Information System (INIS)

    Ramavataram, S.

    1991-01-01

    In support of a continuing program of systematic evaluation of nuclear structure data, the National Nuclear Data Center maintains a complete computer file of references to the nuclear physics literature. Each reference is tagged by a keyword string, which indicates the kinds of data contained in the article. This master file of Nuclear Structure References (NSR) contains complete keyword indexes to literature published since 1969, with partial indexing of older references. Any reader who finds errors in the keyword descriptions is urged to report them to the National Nuclear Data Center so that the master NSR file can be corrected. In 1966, the first collection of Recent References was published as a separate issue of Nuclear Data Sheets. Every four months since 1970, a similar indexed bibliography to new nuclear experiments has been prepared from additions to the NSR file and published. Beginning in 1978, Recent References was cumulated annually, with the third issue completely superseding the two issues previously published during a given year. Due to publication policy changes, cumulation of Recent References was discontinued in 1986. The volume and issue number of all the cumulative issues published to date are given. NNDC will continue to respond to individual requests for special bibliographies on nuclear physics topics, in addition to those easily obtained from Recent References. If the required information is available from the keyword string, a reference list can be prepared automatically from the computer files. This service can be provided on request, in exchange for the timely communication of new nuclear physics results (e.g., preprints). A current copy of the NSR file may also be obtained in a standard format on magnetic tape from NNDC. Requests for special searches of the NSR file may also be directed to the National Nuclear Data Center
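    The keyword-string retrieval the abstract describes amounts to filtering tagged records. The record format and keyword strings below are invented for illustration; they are not the actual NSR file layout.

```python
# Toy model of NSR keyword retrieval: each reference carries a
# keyword string, and a bibliography is just the set of references
# whose keyword string matches the requested topic.
nsr = [
    {"ref": "1991AB01", "keywords": "NUCLEAR STRUCTURE 56Fe; measured levels"},
    {"ref": "1991CD02", "keywords": "RADIOACTIVITY 60Co; measured T1/2"},
]

def bibliography(topic):
    """Return the reference keys whose keyword string mentions topic."""
    return [r["ref"] for r in nsr if topic in r["keywords"]]

print(bibliography("RADIOACTIVITY"))  # → ['1991CD02']
```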

  14. Writing Compilers and Interpreters A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

Long-awaited revision to a unique guide that covers both compilers and interpreters Revised, updated, and now focusing on Java instead of C++, this long-awaited, latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  15. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  16. A Coarse-Grained Reconfigurable Architecture with Compilation for High Performance

    Directory of Open Access Journals (Sweden)

    Lu Wan

    2012-01-01

Full Text Available We propose a fast data relay (FDR) mechanism to enhance existing CGRA (coarse-grained reconfigurable architecture) designs. FDR can not only provide multicycle data transmission concurrently with computations but also convert resource-demanding inter-processing-element global data accesses into local data accesses to avoid communication congestion. We also propose supporting compiler techniques that can efficiently utilize the FDR feature to achieve higher performance for a variety of applications. Our results on FDR-based CGRA are compared with two other works in this field: ADRES and RCP. Experimental results for various multimedia applications show that FDR combined with the new compiler delivers up to 29% and 21% higher performance than ADRES and RCP, respectively.

  17. The discovery of isotopes a complete compilation

    CERN Document Server

    Thoennessen, Michael

    2016-01-01

This book describes the exciting discovery of every isotope observed on earth to date, which currently numbers some 3000. For each isotope a short essay highlights the authors of the first publication for the isotope, the laboratory and year where and when the isotope was discovered, as well as details about the production and detection methods used. In controversial cases, previous claims are also discussed. At the end, a comprehensive table lists all isotopes sorted by elements and a complete list of references. Preliminary versions of these paragraphs have been published over the last few years as separate articles in the journal "Atomic Data and Nuclear Data Tables". The work re-evaluates all assignments, judging them with a uniform set of criteria. In addition, the author includes over 100 new isotopes which have been discovered since those articles were published. This book is a source of information for researchers as well as enthusiastic laymen alike. From the prepublication review: “The explanations focus ...

  18. Compilation of Cognitive and Personality Norms for Military Aviators.

    Science.gov (United States)

    Carretta, Thomas R; King, Raymond E; Ree, Malcolm James; Teachout, Mark S; Barto, Erica

    2016-09-01

The assessment of individuals on abilities or other characteristics is based on comparison to a representative sample. General population norms provide an appropriate reference group when the distribution of scores in the sample can be expected to be similar to those for the general population (e.g., comparing high school students at a particular school to national high school norms on a college entrance test). Specialized norms are needed, however, when subsets of the population differ from the population at large. Military pilot trainees represent a special population; they are highly screened on cognitive ability and other characteristics thought to be related to job performance. Other characteristics (e.g., personality) are thought to be "self-selected," resulting in distinctive profiles. Normative tables were developed for U.S. Air Force pilot trainees for two widely used tests, the Multidimensional Aptitude Battery-II (MAB-II) and NEO Personality Inventory-Revised (NEO PI-R). The MAB-II and NEO PI-R were administered to large samples of USAF cadets, ROTC students, and officers selected for pilot training. The mean MAB-II full-scale IQ was about 1.5 SD above the adult population norm and was much less variable, supporting the need for specialized norms. Tables showing the percentile equivalents are provided for use by clinicians. Use of these tables, in addition to, or in lieu of, commercially published norms, will prove helpful when clinical psychologists perform assessments on pilots, in particular when evaluating them for return-to-duty status following a disqualifying condition that may have affected cognitive functioning or emotional stability. Carretta TR, King RE, Ree MJ, Teachout MS, Barto E. Compilation of cognitive and personality norms for military aviators. Aerosp Med Hum Perform. 2016; 87(9):764-771.
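The point about specialized norms can be illustrated with a small sketch: under a normal approximation, the same raw score maps to very different percentile equivalents against general-population norms versus a restricted, highly screened group. The means and SDs below are illustrative only, not the published MAB-II values:

```python
from math import erf, sqrt

def percentile(score, mean, sd):
    """Percentile equivalent of a score under a normal approximation."""
    z = (score - mean) / sd
    return 50.0 * (1.0 + erf(z / sqrt(2.0)))

# Illustrative only: general adult IQ norms (mean 100, SD 15) versus a
# hypothetical restricted trainee group (mean 120, SD 8). The same raw
# score of 120 ranks near the 91st percentile in one group but at the
# 50th in the other.
general = percentile(120, 100, 15)
trainee = percentile(120, 120, 8)
```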

  19. Evaluation and compilation of fission product yields 1993

    International Nuclear Information System (INIS)

    England, T.R.; Rider, B.F.

    1995-01-01

This document is the latest in a series of compilations of fission yield data. Fission yield measurements reported in the open literature and calculated charge distributions have been used to produce a recommended set of yields for the fission products. The original data with reference sources, and the recommended yields are presented in tabular form. These include many nuclides which fission by neutrons at several energies. These energies include thermal energies (T), fission spectrum energies (F), 14 MeV high energy (H or HE), and spontaneous fission (S), in six sets of ten each. Set A includes U235T, U235F, U235HE, U238F, U238HE, Pu239T, Pu239F, Pu241T, U233T, Th232F. Set B includes U233F, U233HE, U236F, Pu239H, Pu240F, Pu241F, Pu242F, Th232H, Np237F, Cf252S. Set C includes U234F, U237F, Pu240H, U234HE, U236HE, Pu238F, Am241F, Am243F, Np238F, Cm242F. Set D includes Th227T, Th229T, Pa231F, Am241T, Am241H, Am242MT, Cm245T, Cf249T, Cf251T, Es254T. Set E includes Cf250S, Cm244S, Cm248S, Es253S, Fm254S, Fm255T, Fm256S, Np237H, U232T, U238S. Set F includes Cm243T, Cm246S, Cm243F, Cm244F, Cm246F, Cm248F, Pu242H, Np237T, Pu240T, and Pu242T to complete fission product yield evaluations for 60 fissioning systems in all. This report also serves as the primary documentation for the second evaluation of yields in ENDF/B-VI released in 1993

  20. Evaluation and compilation of fission product yields 1993

    Energy Technology Data Exchange (ETDEWEB)

    England, T.R.; Rider, B.F.

    1995-12-31

This document is the latest in a series of compilations of fission yield data. Fission yield measurements reported in the open literature and calculated charge distributions have been used to produce a recommended set of yields for the fission products. The original data with reference sources, and the recommended yields are presented in tabular form. These include many nuclides which fission by neutrons at several energies. These energies include thermal energies (T), fission spectrum energies (F), 14 MeV high energy (H or HE), and spontaneous fission (S), in six sets of ten each. Set A includes U235T, U235F, U235HE, U238F, U238HE, Pu239T, Pu239F, Pu241T, U233T, Th232F. Set B includes U233F, U233HE, U236F, Pu239H, Pu240F, Pu241F, Pu242F, Th232H, Np237F, Cf252S. Set C includes U234F, U237F, Pu240H, U234HE, U236HE, Pu238F, Am241F, Am243F, Np238F, Cm242F. Set D includes Th227T, Th229T, Pa231F, Am241T, Am241H, Am242MT, Cm245T, Cf249T, Cf251T, Es254T. Set E includes Cf250S, Cm244S, Cm248S, Es253S, Fm254S, Fm255T, Fm256S, Np237H, U232T, U238S. Set F includes Cm243T, Cm246S, Cm243F, Cm244F, Cm246F, Cm248F, Pu242H, Np237T, Pu240T, and Pu242T to complete fission product yield evaluations for 60 fissioning systems in all. This report also serves as the primary documentation for the second evaluation of yields in ENDF/B-VI released in 1993.
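The fissioning-system labels listed in both records (e.g. U235T, Am242MT, U238HE) follow a compact grammar: nuclide, an optional metastable flag, and an energy code. A small parsing sketch, with the grammar inferred from the sets listed above:

```python
import re

# Split a fissioning-system label into element, mass number, isomer flag
# and fission type. Grammar inferred from the labels in the abstract;
# "HE" must be tried before "H" in the alternation.
LABEL = re.compile(r"^([A-Z][a-z]?)(\d+)(M?)(HE|T|F|H|S)$")
TYPES = {"T": "thermal", "F": "fission spectrum", "H": "14 MeV",
         "HE": "14 MeV", "S": "spontaneous"}

def parse_system(label):
    m = LABEL.match(label)
    if m is None:
        raise ValueError(f"unrecognized label: {label}")
    elem, a, meta, code = m.groups()
    return {"element": elem, "A": int(a), "isomer": bool(meta),
            "energy": TYPES[code]}
```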

  1. Digital Bedrock Compilation: A Geodatabase Covering Forest Service Lands in California

    Science.gov (United States)

    Elder, D.; de La Fuente, J. A.; Reichert, M.

    2010-12-01

This digital database contains bedrock geologic mapping for Forest Service lands within California. This compilation began in 2004 and the first version was completed in 2005. The second publication of this geodatabase was completed in 2010 and filled major gaps in the southern Sierra Nevada and Modoc/Medicine Lake/Warner Mountains areas. This digital map database was compiled from previously published and unpublished geologic mapping, with source mapping and review from the California Geological Survey, the U.S. Geological Survey and others. Much of the source data was itself compilation mapping. This geodatabase is huge, containing ~107,000 polygons and ~280,000 arcs. Mapping was compiled from more than one thousand individual sources and covers over 41,000,000 acres (~166,000 km2). It was compiled from source maps at various scales - from ~1:4,000 to 1:250,000 - and represents the best available geologic mapping at the largest scale possible. An estimated 70-80% of the source information was digitized from geologic mapping at 1:62,500 scale or better. The Forest Service ACT2 Enterprise Team compiled the bedrock mapping and developed a geodatabase to store this information. This geodatabase supports feature classes for polygons (e.g., map units), lines (e.g., contacts, boundaries, faults and structural lines) and points (e.g., orientation data, structural symbology). Lookup tables provide detailed information for feature class items. Lookup/type tables contain legal values and hierarchical groupings for geologic ages and lithologies. Type tables link coded values with descriptions for line and point attributes, such as line type, line location and point type. This digital mapping is at the core of many quantitative analyses and derivative map products. Queries of the database are used to produce maps and to quantify rock types of interest. These include the following: (1) ultramafic rocks - where hazards from naturally occurring asbestos are high, (2) granitic rocks - increased
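The kind of query described (for example, isolating ultramafic-rock polygons to quantify asbestos hazard) can be sketched with a toy stand-in. The field names and lithology values below are invented for illustration and are not the geodatabase's actual schema:

```python
# Toy stand-in for querying the polygon feature class by lithology.
# Field names and values are hypothetical, not the actual schema.
polygons = [
    {"unit": "um1", "lithology": "serpentinite", "acres": 1200.0},
    {"unit": "gr1", "lithology": "granodiorite", "acres": 5400.0},
    {"unit": "um2", "lithology": "peridotite",   "acres": 300.0},
]

ULTRAMAFIC = {"serpentinite", "peridotite", "dunite"}

def total_acres(polygons, lithologies):
    """Sum the areas of polygons whose lithology is in the requested set."""
    return sum(p["acres"] for p in polygons if p["lithology"] in lithologies)
```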

  2. Exploring Global Exposure Factors Resources URLs

    Data.gov (United States)

    U.S. Environmental Protection Agency — The dataset is a compilation of hyperlinks (URLs) for resources (databases, compendia, published articles, etc.) useful for exposure assessment specific to consumer...

  3. Fusing a Transformation Language with an Open Compiler

    NARCIS (Netherlands)

    Kalleberg, K.T.; Visser, E.

    2007-01-01

Program transformation systems provide powerful analysis and transformation frameworks as well as concise languages for language processing, but instantiating them for every subject language is an arduous task, most often resulting in half-completed frontends. Compilers provide mature frontends with

  4. Digital compilation bedrock geologic map of the Warren quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-4A Walsh, GJ, Haydock, S, Prewitt, J, Kraus, J, Lapp, E, O'Loughlin, S, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the...

  5. AICPA allows low-cost options for compiled financial statements.

    Science.gov (United States)

    Reinstein, Alan; Luecke, Randall W

    2002-02-01

    The AICPA Accounting and Review Services Committee's (ARSC) SSARS No. 8, Amendment to Statement on Standards for Accounting and Review Services No. 1, Compilation and Review of Financial Statements, issued in October 2000, allows financial managers to provide plain-paper, compiled financial statements for the exclusive use of management. Such financial statements were disallowed in 1979 when the AICPA issued SSARS No. 1, Compilation and Review of Financial Statements. With the issuance of SSARS No. 8, financial managers can prepare plain-paper, compiled financial statements when third parties are not expected to rely on the financial statements, management acknowledges such restrictions in writing, and management acknowledges its primary responsibility for the adequacy of the financial statements.

  6. Specification and Compilation of Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Geuns, S.J.

    2015-01-01

This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. Examples of such applications are software-defined radio applications. These applications typically

  7. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  8. Borrowing and Dictionary Compilation: The Case of the Indigenous ...

    African Journals Online (AJOL)

    rbr

Keywords: BORROWING, DICTIONARY COMPILATION, INDIGENOUS LANGUAGES, LEXICON, MORPHEME, VOCABULARY, DEVELOPING LANGUAGES, LOAN WORDS, TERMINOLOGY, ETYMOLOGY, LEXICOGRAPHY. Summary: Borrowing and dictionary compilation: The case of the indigenous South African ...

  9. Digital compilation bedrock geologic map of the Milton quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-8A Dorsey, R, Doolan, B, Agnew, PC, Carter, CM, Rosencrantz, EJ, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the Milton...

  10. Digital compilation bedrock geologic map of the Lincoln quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-5A Stanley, R, DelloRusso, V, Haydock, S, Lapp, E, O'Loughlin, S, Prewitt, J,and Tauvers, PR, 1995, Digital compilation bedrock geologic map...

  11. Source list of nuclear data bibliographies, compilations, and evaluations

    International Nuclear Information System (INIS)

    Burrows, T.W.; Holden, N.E.

    1978-10-01

    To aid the user of nuclear data, many specialized bibliographies, compilations, and evaluations have been published. This document is an attempt to bring together a list of such publications with an indication of their availability and cost

  12. Solidify, An LLVM pass to compile LLVM IR into Solidity

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-12

The software currently compiles LLVM IR into Solidity (Ethereum’s dominant programming language) using LLVM’s pass library. Specifically, this compiler allows us to convert an arbitrary DSL into Solidity. We focus specifically on converting domain-specific languages into Solidity due to their ease of use and provable properties. By creating a toolchain to compile lightweight domain-specific languages into Ethereum's dominant language, Solidity, we allow non-specialists to effectively develop safe and useful smart contracts. For example, lawyers from a certain firm can have a proprietary DSL that codifies basic laws safely converted to Solidity to be securely executed on the blockchain. In another example, a simple provenance-tracking language can be compiled and securely executed on the blockchain.
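The lowering idea (mapping IR instructions to Solidity source text) can be sketched conceptually. This is a toy Python illustration, not the actual LLVM pass; the tiny three-address IR and the emitted Solidity are invented for illustration:

```python
# Conceptual sketch of lowering a tiny three-address IR into Solidity
# source text. This is NOT the Solidify pass itself; the IR tuple format
# and the emitted statements are hypothetical.
SOLIDITY_OPS = {"add": "+", "sub": "-", "mul": "*"}

def lower(instrs):
    """Translate (dest, op, lhs, rhs) tuples into Solidity statements."""
    lines = []
    for dest, op, lhs, rhs in instrs:
        lines.append(f"uint256 {dest} = {lhs} {SOLIDITY_OPS[op]} {rhs};")
    return "\n".join(lines)
```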

  13. SVM Support in the Vienna Fortran Compilation System

    OpenAIRE

    Brezany, Peter; Gerndt, Michael; Sipkova, Viera

    1994-01-01

    Vienna Fortran, a machine-independent language extension to Fortran which allows the user to write programs for distributed-memory systems using global addresses, provides the forall-loop construct for specifying irregular computations that do not cause inter-iteration dependences. Compilers for distributed-memory systems generate code that is based on runtime analysis techniques and is only efficient if, in addition, aggressive compile-time optimizations are applied. Since these optimization...

  14. Data compilation for particle-impact desorption, 2

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeutchi, Fujio.

    1985-07-01

Particle impact desorption is one of the elementary processes of hydrogen recycling in controlled thermonuclear fusion reactors. We have surveyed the literature concerning ion impact desorption and photon-stimulated desorption published through the end of 1984 and compiled the data on desorption cross sections and yields with the aid of a computer. This report presents the results of the compilation in graphs and tables as functions of incident energy, surface temperature and surface coverage. (author)

  15. DLVM: A modern compiler infrastructure for deep learning systems

    OpenAIRE

    Wei, Richard; Schwartz, Lane; Adve, Vikram

    2017-01-01

    Deep learning software demands reliability and performance. However, many of the existing deep learning frameworks are software libraries that act as an unsafe DSL in Python and a computation graph interpreter. We present DLVM, a design and implementation of a compiler infrastructure with a linear algebra intermediate representation, algorithmic differentiation by adjoint code generation, domain-specific optimizations and a code generator targeting GPU via LLVM. Designed as a modern compiler ...

  16. Compilation and analysis of Escherichia coli promoter DNA sequences.

    OpenAIRE

    Hawley, D K; McClure, W R

    1983-01-01

The DNA sequences of 168 promoter regions (-50 to +10) for Escherichia coli RNA polymerase were compiled. The complete listing was divided into two groups depending upon whether or not the promoter had been defined by genetic (promoter mutations) or biochemical (5' end determination) criteria. A consensus promoter sequence based on homologies among 112 well-defined promoters was determined that was in substantial agreement with previous compilations. In addition, we have tabulated 98 promoter ...
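The consensus-determination step the abstract describes (taking the most frequent base at each aligned position) can be sketched in a few lines. The toy alignment below is illustrative; the real consensus came from the 112 well-defined promoters:

```python
from collections import Counter

def consensus(seqs):
    """Most frequent base at each position of equal-length aligned sequences."""
    return "".join(Counter(col).most_common(1)[0][0] for col in zip(*seqs))

# Toy alignment around a -10-like hexamer, illustrative only.
hexamers = ["TATAAT", "TATGAT", "TACAAT"]
```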

  17. Uranium 2009 resources, production and demand

    CERN Document Server

    Organisation for Economic Cooperation and Development. Paris

    2010-01-01

    With several countries currently building nuclear power plants and planning the construction of more to meet long-term increases in electricity demand, uranium resources, production and demand remain topics of notable interest. In response to the projected growth in demand for uranium and declining inventories, the uranium industry – the first critical link in the fuel supply chain for nuclear reactors – is boosting production and developing plans for further increases in the near future. Strong market conditions will, however, be necessary to trigger the investments required to meet projected demand. The "Red Book", jointly prepared by the OECD Nuclear Energy Agency and the International Atomic Energy Agency, is a recognised world reference on uranium. It is based on information compiled in 40 countries, including those that are major producers and consumers of uranium. This 23rd edition provides a comprehensive review of world uranium supply and demand as of 1 January 2009, as well as data on global ur...

  18. Internet Resources for Reference: Finance and Investment.

    Science.gov (United States)

    Mai, Brent Alan

    1997-01-01

    When called upon to aid in filtering through finance and investment information on the Internet, the business librarian is also faced with knowing what is available and how to find it. Web sites are identified that provide information about stocks and their exchanges, mutual funds, bonds, company annual reports, and taxes. (Author/AEF)

  19. Fiscal 2000 report on advanced parallelized compiler technology. Outlines; 2000 nendo advanced heiretsuka compiler gijutsu hokokusho (Gaiyo hen)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

Research and development was carried out concerning the automatic parallelizing compiler technology which improves on the practical performance, cost/performance ratio, and ease of operation of the multiprocessor system now used for constructing supercomputers and expected to provide a fundamental architecture for microprocessors for the 21st century. Efforts were made to develop an automatic multigrain parallelization technology for extracting multigrain parallelism from a program and making full use of it, and a parallelizing tuning technology for accelerating parallelization by feeding back to the compiler the dynamic information and user knowledge acquired during execution. Moreover, a benchmark program was selected and studies were made to set execution rules and evaluation indexes for the establishment of technologies for subjectively evaluating the performance of parallelizing compilers for the existing commercial parallel processing computers, which was achieved through the implementation and evaluation of the 'Advanced parallelizing compiler technology research and development project.' (NEDO)

  20. JANUS: A Compilation System for Balancing Parallelism and Performance in OpenVX

    Science.gov (United States)

    Omidian, Hossein; Lemieux, Guy G. F.

    2018-04-01

Embedded systems typically do not have enough on-chip memory for an entire image buffer. Programming systems like OpenCV operate on entire image frames at each step, making them use excessive memory bandwidth and power. In contrast, the paradigm used by OpenVX is much more efficient; it uses image tiling, and the compilation system is allowed to analyze and optimize the operation sequence, specified as a compute graph, before doing any pixel processing. In this work, we are building a compilation system for OpenVX that can analyze and optimize the compute graph to take advantage of parallel resources in many-core systems or FPGAs. Using a database of prewritten OpenVX kernels, it automatically adjusts the image tile size as well as using kernel duplication and coalescing to meet a defined area (resource) target, or to meet a specified throughput target. This allows a single compute graph to target implementations with a wide range of performance needs or capabilities, e.g., from handheld to datacenter, that use minimal resources and power to reach the performance target.

  1. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Inagaki, Y.; Nakamura, T.; Ukai, K.

    1976-01-01

The compilation of data from single-pion photoproduction experiments below 2 GeV is presented, with keywords that specify each experiment. These data are written on a magnetic tape; the data format and the indices for the keywords are given, and various programs for using the tape are also presented. The results of the compilation are divided into two types: one is the reference card, which carries the information about the experiment, and the other is the data card. Both reference and data cards are written entirely in A-type format on the original tape. Copy tapes, written in various formats on request, are available. There are two kinds of copy tape: one is identical to the original, and the other differs in the data card, which is written in F-type format following the data type. One experiment on the tape is represented by three kinds of cards: one reference card in A-type format, many data cards in F-type format, and one identifying card. Various programs written in FORTRAN are ready for these original and copy tapes. (Kato, T.)

  2. Guide to Good Practice in using Open Source Compilers with the AGCC Lexical Analyzer

    Directory of Open Access Journals (Sweden)

    2009-01-01

Full Text Available Quality software always demands a compromise between users' needs and hardware resources. To be faster means expensive devices like powerful processors and virtually unlimited amounts of RAM memory, or else reengineering the code to adapt that piece of software to the client's hardware architecture. This is the purpose of optimizing code: to get the utmost software performance from a program under certain given conditions. There are tools for designing and writing the code, but the ultimate tool for optimizing remains the modest compiler, this often neglected software jewel, the result of hundreds of working hours by the best specialists in the world. Even so, only two compilers fulfill the needs of professional developers: a proprietary solution from a giant in the IT industry, and the open-source GNU compiler, for which we developed the AGCC lexical analyzer that helps produce even more efficient software applications. It relies on the most popular hacks and tricks used by professionals and discovered by the authors, who are proud to present them below.

  3. Compilation of floristic and herbarium specimen data in Iran: proposal to data structure

    Directory of Open Access Journals (Sweden)

    Majid Sharifi-Tehrani

    2013-09-01

Full Text Available Floristic databases constitute the second level of plant information systems, after taxonomic-nomenclatural databases. This paper provides the details of the data structure and the available data resources for developing a floristic database, along with some explanation of taxonomic and floristic databases. It also proposes a shortcut to constructing a national floristic database by unifying and compiling the dispersed floristic data held in various botanical centers of Iran. Iran could thereby become the second country in the SW Asia region to have a national floristic database, and the resulting services could be presented to the national scientific community.

  4. CAPS OpenACC Compilers: Performance and Portability

    CERN Multimedia

    CERN. Geneva

    2013-01-01

The announcement in late 2011 of the new OpenACC directive-based programming standard, supported by the CAPS, CRAY and PGI compilers, has opened the door to more scientific applications that can be ported to many-core systems. Following a porting methodology, this talk will first review the principles of programming with OpenACC and then the advanced features available in the CAPS compilers to further optimize OpenACC applications: library integration, tuning directives with auto-tune mechanisms to build applications adaptive to different GPUs. CAPS compilers use hardware vendors' backends such as NVIDIA CUDA and OpenCL, making them the only OpenACC compilers supporting various many-core architectures. About the speaker Stéphane Bihan is co-founder and currently Director of Sales and Marketing at CAPS enterprise. He has held several R&D positions in companies such as ARC international plc in London, Canon Research Center France, ACE compiler experts in Amsterdam and the INRIA r...

  5. Sensitivity of annual and seasonal reference crop ...

    Indian Academy of Sciences (India)

    scheduling and water resources management. Ref- ... time, and refers to evapotranspiration rate from a reference ... variable per unit increase in independent variable. Sensitivity ...... Pereira L S 2007 Relating water productivity and crop.

  6. Hawaii demand-side management resource assessment. Final report, Reference Volume 3 -- Residential and commercial sector DSM analyses: Detailed results from the DBEDT DSM assessment model; Part 1, Technical potential

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

The Hawaii Demand-Side Management Resource Assessment was the fourth of seven projects in the Hawaii Energy Strategy (HES) program. HES was designed by the Department of Business, Economic Development, and Tourism (DBEDT) to produce an integrated energy strategy for the State of Hawaii. The purpose of Project 4 was to develop a comprehensive assessment of Hawaii's demand-side management (DSM) resources. To meet this objective, the project was divided into two phases. The first phase included development of a DSM technology database and the identification of Hawaii commercial building characteristics through on-site audits. These Phase 1 products were then used in Phase 2 to identify expected energy impacts from DSM measures in typical residential and commercial buildings in Hawaii. The building energy simulation model DOE-2.1E was utilized to identify the DSM energy impacts. More detailed information on the typical buildings and the DOE-2.1E modeling effort is available in Reference Volume 1, "Building Prototype Analysis". In addition to the DOE-2.1E analysis, estimates of residential and commercial sector gas and electric DSM potential for the four counties of Honolulu, Hawaii, Maui, and Kauai through 2014 were forecasted by the new DBEDT DSM Assessment Model. Results from DBEDT's energy forecasting model, ENERGY 2020, were linked with results from DOE-2.1E building energy simulation runs and estimates of DSM measure impacts, costs, lifetime, and anticipated market penetration rates in the DBEDT DSM Model. Through its algorithms, estimates of DSM potential for each forecast year were developed. Using the load shape information from the DOE-2.1E simulation runs, estimates of electric peak demand impacts were developed. Numerous tables and figures illustrating the technical potential for demand-side management are included.

  7. The Katydid system for compiling KEE applications to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    Components of a system known as Katydid are developed in an effort to compile knowledge-based systems developed in a multimechanism integrated environment (KEE) to Ada. The Katydid core is an Ada library supporting KEE object functionality, and the other elements include a rule compiler, a LISP-to-Ada translator, and a knowledge-base dumper. Katydid employs translation mechanisms that convert LISP knowledge structures and rules to Ada and utilizes basic prototypes of a run-time KEE object-structure library module for Ada. Preliminary results include the semiautomatic compilation of portions of a simple expert system to run in an Ada environment with the described algorithms. It is suggested that Ada can be employed for AI programming and implementation, and the Katydid system is being developed to include concurrency and synchronization mechanisms.

  8. Compilation of current high energy physics experiments - Sept. 1978

    Energy Technology Data Exchange (ETDEWEB)

    Addis, L.; Odian, A.; Row, G. M.; Ward, C. E. W.; Wanderer, P.; Armenteros, R.; Joos, P.; Groves, T. H.; Oyanagi, Y.; Arnison, G. T. J.; Antipov, Yu; Barinov, N.

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche. (RWR)

  9. Charged particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, C.

    1999-01-01

    We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal reason for setting up the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates, issued by Fowler and collaborators. The main goal of the NACRE network was transparency in the procedure of calculating the rates. More specifically, this compilation aims at: 1. updating the experimental and theoretical data; 2. distinctly identifying the sources of the data used in rate calculation; 3. evaluating the uncertainties and errors; 4. providing numerically integrated reaction rates; 5. providing reverse reaction rates and analytical approximations of the adopted rates. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. The compilation is concerned with reaction rates that are large enough for the corresponding target lifetimes to be shorter than the age of the Universe, taken equal to 15 x 10^9 y. The reaction rates are provided for temperatures lower than T = 10^10 K. In parallel with the rate compilation, a cross section database has been created and located at the site http://pntpm.ulb.ac.be/nacre..htm. (authors)
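
The "numerically integrated reaction rates" mentioned in the abstract are Maxwellian averages of the cross section. As a rough illustration only (the constant, units, and quadrature choices below are schematic assumptions, not NACRE's actual code), the standard integral can be evaluated with a simple trapezoidal rule:

```python
import math

K_BOLTZ = 8.617333262e-11  # Boltzmann constant in MeV per kelvin (CODATA value)

def rate_per_pair(sigma, T, mu, n=8000):
    """Maxwellian-averaged <sigma*v> for a charged-particle reaction.

    <sigma*v> = sqrt(8/(pi*mu)) * (kT)**-1.5 * Int_0^inf sigma(E) * E * exp(-E/kT) dE

    sigma: cross section as a function of energy E (schematic, consistent units)
    T:     temperature in kelvin
    mu:    reduced mass of the reacting pair (schematic units)
    """
    kT = K_BOLTZ * T
    e_max = 40.0 * kT                 # the integrand is negligible beyond a few tens of kT
    h = e_max / n
    total = 0.0
    for i in range(n + 1):            # composite trapezoidal rule on [0, e_max]
        E = i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * sigma(E) * E * math.exp(-E / kT)
    return math.sqrt(8.0 / (math.pi * mu)) * kT ** -1.5 * total * h
```

For a constant cross section the energy integral reduces analytically to (kT)^2, which gives a quick sanity check on the quadrature before feeding in tabulated data.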

  10. Compiling gate networks on an Ising quantum computer

    International Nuclear Information System (INIS)

    Bowdrey, M.D.; Jones, J.A.; Knill, E.; Laflamme, R.

    2005-01-01

    Here we describe a simple mechanical procedure for compiling a quantum gate network into the natural gates (pulses and delays) for an Ising quantum computer. The aim is not necessarily to generate the most efficient pulse sequence, but rather to develop an efficient compilation algorithm that can be easily implemented in large spin systems. The key observation is that it is not always necessary to refocus all the undesired couplings in a spin system. Instead, the coupling evolution can simply be tracked and then corrected at some later time. Although described within the language of NMR, the algorithm is applicable to any design of quantum computer based on Ising couplings

  11. Compilation of current high-energy-physics experiments

    International Nuclear Information System (INIS)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1980-04-01

    This is the third edition of a compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and ten participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about January 1980, and (2) had not finished taking data by 1 January 1976

  12. Resource management plan for the Oak Ridge Reservation. Volume 30, Oak Ridge National Environmental Research Park natural areas and reference areas--Oak Ridge Reservation environmentally sensitive sites containing special plants, animals, and communities

    Energy Technology Data Exchange (ETDEWEB)

    Pounds, L.R. [Univ. of Tennessee, Knoxville, TN (US); Parr, P.D.; Ryon, M.G. [Oak Ridge National Lab., TN (United States)

    1993-08-01

    Areas on the Oak Ridge Reservation (ORR) that contain rare plant or animal species or are special habitats are protected through National Environmental Research Park Natural Area (NA) or Reference Area (RA) designations. The US Department of Energy's Oak Ridge National Environmental Research Park program is responsible for identifying species of vascular plants that are endangered, threatened, or rare and, as much as possible, for conserving those areas in which such species grow. This report includes a listing of Research Park NAs and RAs with general habitat descriptions and a computer-generated map with the areas identified. These are the locations of rare plant or animal species or special habitats that are known at this time. As the Reservation continues to be surveyed, it is expected that additional sites will be designated as Research Park NAs or RAs. This document is a component of a larger effort to identify environmentally sensitive areas on ORR. This report identifies the currently known locations of rare plant species, rare animal species, and special biological communities. Floodplains, wetlands (except those in RAs or NAs), and cultural resources are not included in this report.

  13. Run-Time and Compiler Support for Programming in Adaptive Parallel Environments

    Directory of Open Access Journals (Sweden)

    Guy Edjlali

    1997-01-01

    For better utilization of computing resources, it is important to consider parallel programming environments in which the number of available processors varies at run-time. In this article, we discuss run-time support for data-parallel programming in such an adaptive environment. Executing programs in an adaptive environment requires redistributing data when the number of processors changes, and also requires determining new loop bounds and communication patterns for the new set of processors. We have developed a run-time library to provide this support. We discuss how the run-time library can be used by compilers of High Performance Fortran (HPF)-like languages to generate code for an adaptive environment. We present performance results for a Navier-Stokes solver and a multigrid template run on a network of workstations and an IBM SP-2. Our experiments show that if the number of processors is not varied frequently, the cost of data redistribution is not significant compared to the time required for the actual computation. Overall, our work establishes the feasibility of compiling HPF for a network of nondedicated workstations, which are likely to be an important resource for parallel programming in the future.
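
The bookkeeping such a run-time library must do when the processor count changes — new loop bounds for each processor, plus a redistribution plan for block-distributed data — can be sketched as follows. The function names and the block distribution are illustrative assumptions, not the article's actual API:

```python
def block_bounds(n, p, rank):
    """Half-open iteration range [lo, hi) owned by `rank` out of `p`
    processors under a block distribution of n iterations."""
    base, extra = divmod(n, p)
    lo = rank * base + min(rank, extra)
    hi = lo + base + (1 if rank < extra else 0)
    return lo, hi

def redistribution_plan(n, p_old, p_new):
    """Messages (old_rank, new_rank, lo, hi) needed to move a block-distributed
    array when the processor count changes from p_old to p_new."""
    plan = []
    for old in range(p_old):
        olo, ohi = block_bounds(n, p_old, old)
        for new in range(p_new):
            nlo, nhi = block_bounds(n, p_new, new)
            lo, hi = max(olo, nlo), min(ohi, nhi)
            if lo < hi:                  # ownership ranges overlap => transfer
                plan.append((old, new, lo, hi))
    return plan
```

After redistribution, each processor simply re-queries `block_bounds` with the new processor count to obtain its new loop bounds.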

  14. Evaluation of the FIR Example using Xilinx Vivado High-Level Synthesis Compiler

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Zheming [Argonne National Lab. (ANL), Argonne, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Yoshii, Kazutomo [Argonne National Lab. (ANL), Argonne, IL (United States); Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-07-28

    Compared to central processing units (CPUs) and graphics processing units (GPUs), field programmable gate arrays (FPGAs) have major advantages in reconfigurability and performance achieved per watt. The FPGA development flow has been augmented with a high-level synthesis (HLS) flow that can convert programs written in a high-level programming language to a Hardware Description Language (HDL). Using high-level programming languages such as C, C++, and OpenCL for FPGA-based development could allow software developers, who have little FPGA knowledge, to take advantage of FPGA-based application acceleration. This improves developer productivity and makes FPGA-based acceleration accessible to hardware and software developers alike. The Xilinx Vivado HLS compiler is a high-level synthesis tool that enables C, C++, and SystemC specifications to be directly targeted at Xilinx FPGAs without the need to create RTL manually. A white paper [1] published recently by Xilinx uses a finite impulse response (FIR) example to demonstrate the variable-precision features in the Vivado HLS compiler and the resource and power benefits of converting a design from floating point to fixed point. To get a better understanding of the variable-precision features in terms of resource usage and performance, this report presents the experimental results of evaluating the FIR example using Vivado HLS 2017.1 and a Kintex UltraScale FPGA. In addition, we evaluated the half-precision floating-point data type against the double-precision and single-precision data types and present the detailed results.
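
The float-to-fixed trade-off the white paper demonstrates can be previewed in plain Python before touching an HLS tool: quantize the coefficients to a fixed-point grid and measure the output error. The taps and bit widths below are arbitrary examples, not the values from the Xilinx report:

```python
def fir(x, coeffs):
    """Direct-form FIR filter: y[n] = sum_k coeffs[k] * x[n-k]."""
    return [sum(c * (x[n - k] if n - k >= 0 else 0.0)
                for k, c in enumerate(coeffs))
            for n in range(len(x))]

def quantize(value, frac_bits):
    """Round to a fixed-point grid with `frac_bits` fractional bits,
    mimicking what an HLS tool does when a coefficient becomes ap_fixed."""
    scale = 1 << frac_bits
    return round(value * scale) / scale

coeffs = [0.1, 0.25, 0.3, 0.25, 0.1]      # arbitrary symmetric low-pass taps
fixed_coeffs = [quantize(c, 8) for c in coeffs]

signal = [1.0] + [0.0] * 7                # unit impulse
float_out = fir(signal, coeffs)
fixed_out = fir(signal, fixed_coeffs)
max_err = max(abs(a - b) for a, b in zip(float_out, fixed_out))
# worst-case per-coefficient rounding error with 8 fractional bits is 2**-9
```

Sweeping `frac_bits` against an error budget is a cheap software analogue of the resource/precision exploration the report performs in hardware.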

  15. Enterprise Reference Library

    Science.gov (United States)

    Bickham, Grandin; Saile, Lynn; Havelka, Jacque; Fitts, Mary

    2011-01-01

    Introduction: Johnson Space Center (JSC) offers two extensive libraries that contain journals, research literature and electronic resources. Searching capabilities are available to those individuals residing onsite or through a librarian's search. Many individuals have rich collections of references, but no mechanisms to share reference libraries across researchers, projects, or directorates exist. Likewise, information regarding which references are provided to which individuals is not available, resulting in duplicate requests, redundant labor costs and associated copying fees. In addition, this tends to limit collaboration between colleagues and promotes the establishment of individual, unshared silos of information. The Integrated Medical Model (IMM) team has utilized a centralized reference management tool during the development, test, and operational phases of this project. The Enterprise Reference Library project expands the capabilities developed for IMM to address the above issues and enhance collaboration across JSC. Method: After significant market analysis for a multi-user reference management tool, no available commercial tool was found to meet this need, so a software program was built around a commercial tool, Reference Manager 12 by The Thomson Corporation. A use case approach guided the requirements development phase. The premise of the design is that individuals use their own reference management software and export to SharePoint when their library is incorporated into the Enterprise Reference Library. This results in a searchable user-specific library application. An accompanying share folder will warehouse the electronic full-text articles, which allows the global user community to access full-text articles. Discussion: An enterprise reference library solution can provide a multidisciplinary collection of full-text articles. This approach improves efficiency in obtaining and storing reference material while greatly reducing labor, purchasing and

  16. Compilation of historical information of 300 Area facilities and activities

    International Nuclear Information System (INIS)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information on 300 Area activities and facilities since their inception. The 300 Area is shown as it looked in 1945, and a more recent (1985) view of the area is also provided

  17. Statistical Compilation of the ICT Sector and Policy Analysis | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  18. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is a growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  19. The Compilation of a Shona Children's Dictionary: Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Peniah Mabaso

    2011-10-01

    Abstract: This article outlines the challenges encountered by the African Languages Research Institute (ALRI team members in the compilation of the monolingual Shona Children's Dictionary. The focus is mainly on the problems met in headword selection. Solutions by the team members when dealing with these problems are also presented.

  20. Compilation of a global inventory of emissions of nitrous oxide

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

    A global inventory with 1°x1° resolution was compiled of emissions of nitrous oxide (N2O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing,

  1. A Journey from Interpreters to Compilers and Virtual Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We review a simple sequence of steps to stage a programming-language interpreter into a compiler and virtual machine. We illustrate the applicability of this derivation with a number of existing virtual machines, mostly for functional languages. We then outline its relevance for today's language...

  2. Indexed compilation of experimental high energy physics literature. [Synopsis

    Energy Technology Data Exchange (ETDEWEB)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is presented, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  3. Final report: Compiled MPI. Cost-Effective Exascale Application Development

    Energy Technology Data Exchange (ETDEWEB)

    Gropp, William Douglas [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-12-21

    This is the final report on Compiled MPI: Cost-Effective Exascale Application Development, and summarizes the results under this project. The project investigated runtime environments that improve the performance of MPI (Message-Passing Interface) programs; work at Illinois in the last period of this project looked at optimizing data accesses expressed with MPI datatypes.

  4. Compilation of the nuclear codes available in CTA

    International Nuclear Information System (INIS)

    D'Oliveira, A.B.; Moura Neto, C. de; Amorim, E.S. do; Ferreira, W.J.

    1979-07-01

    The present work is a compilation of some nuclear codes available in the Divisao de Estudos Avancados of the Instituto de Atividades Espaciais (EAV/IAE/CTA). The codes are organized according to the classification given by the Argonne National Laboratory. For each code, the author, institution of origin, abstract, programming language, and existing bibliography are given. (Author) [pt

  5. Compilation of historical information of 300 Area facilities and activities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information on 300 Area activities and facilities since their inception. The 300 Area is shown as it looked in 1945, and a more recent (1985) view of the area is also provided.

  6. DJ Prinsloo and BP Sathekge (compilers — revised edition).

    African Journals Online (AJOL)

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the ...

  7. Statistical Compilation of the ICT Sector and Policy Analysis | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  8. Updated site compilation of the Latin American Pollen Database

    NARCIS (Netherlands)

    Flantua, S.G.A.; Hooghiemstra, H.; Grimm, E.C.; Behling, H.; Bush, M.B; González-Arrango, C.; Gosling, W.D.; Ledru, M.-P.; Lozano-Garciá, S.; Maldonado, A.; Prieto, A.R.; Rull, V.; van Boxel, J.H.

    2015-01-01

    The updated inventory of the Latin American Pollen Database (LAPD) offers a wide range of new insights. This paper presents a systematic compilation of palynological research in Latin America. A comprehensive inventory of publications in peer-reviewed and grey literature shows a major expansion of

  9. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    In order to support concept literacy, especially for students for whom English is not the native language, a number of universities in South Africa are compiling multilingual glossaries through which the use of languages other than English may be employed as auxiliary media. Terminologies in languages other than English ...

  10. Approximate Compilation of Constraints into Multivalued Decision Diagrams

    DEFF Research Database (Denmark)

    Hadzic, Tarik; Hooker, John N.; O’Sullivan, Barry

    2008-01-01

    We present an incremental refinement algorithm for approximate compilation of constraint satisfaction models into multivalued decision diagrams (MDDs). The algorithm uses a vertex splitting operation that relies on the detection of equivalent paths in the MDD. Although the algorithm is quite gene...

  11. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    account for the multilingual concept literacy glossaries being compiled under the auspices of .... a theory, i.e. the set of premises, arguments and conclusions required for explaining ... fully address cognitive and communicative needs, especially of laypersons. ..... tion at UCT, and in indigenous languages as auxiliary media.

  12. Thoughts and views on the compilation of monolingual dictionaries ...

    African Journals Online (AJOL)

    The end-products should be of a high lexicographic standard, well-balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries could be easily ...

  13. 13 CFR 146.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    13 CFR 146.600, Semi-annual compilation. Business Credit and Assistance; Small Business Administration; New Restrictions on Lobbying. ... (c) Information that involves intelligence matters shall be reported only to the Select Committee on...

  14. Indexed compilation of experimental high energy physics literature

    International Nuclear Information System (INIS)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is presented, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given

  15. Individual risk. A compilation of recent British data

    International Nuclear Information System (INIS)

    Grist, D.R.

    1978-08-01

    A compilation of data is presented on individual risk obtained from recent British population and mortality statistics. Risk data presented include: risk of death, as a function of age, due to several important natural causes and due to accidents and violence; risk of death as a function of location of accident; and risk of death from various accidental causes. (author)

  16. A compilation of Sr and Nd isotope data on Mexico

    International Nuclear Information System (INIS)

    Verma, S.P.; Verma, M.P.

    1986-01-01

    A compilation is given of the available Sr and Nd isotope data on Mexican volcanic-plutonic terranes which cover about one-third of Mexico's territory. The available data are arranged according to a subdivision of the Mexican territory in terms of geological provinces. Furthermore, site and province averages and standard deviations are calculated and their petrogenetic implications are pointed out. (author)

  17. QMODULE: CAMAC modules recognized by the QAL compiler

    International Nuclear Information System (INIS)

    Kellogg, M.; Minor, M.M.; Shlaer, S.; Spencer, N.; Thomas, R.F. Jr.; van der Beken, H.

    1977-10-01

    The compiler for the Q Analyzer Language, QAL, recognizes a certain set of CAMAC modules as having known characteristics. The conventions and procedures used to describe these modules are discussed as well as the tools available to the user for extending this set as required

  18. IDEAS international contamination database: a compilation of published internal contamination cases. A tool for the internal dosimetry community

    International Nuclear Information System (INIS)

    Hurtgen, C.

    2007-01-01

    The aim of the IDEAS project was to develop General Guidelines for the Assessment of Internal Dose from Monitoring Data. The project was divided into 5 Work Packages for the major tasks. Work Package 1, entitled Collection of incorporation cases, was devoted to the collection of data by means of bibliographic research (a survey of the open literature), contacting and collecting data from specific organisations, and using information from existing databases on incorporation cases. To ensure that the guidelines would be applicable to a wide range of practical situations, a database of cases of internal contamination, including monitoring data suitable for dose assessment, was compiled. The IDEAS Bibliography database and the IDEAS Internal Contamination database were prepared and some reference cases were selected for use in Work Package 3. The other Work Packages of the IDEAS project (WP-2 Preparation of evaluation software, WP-3 Evaluation of incorporation cases, WP-4 Development of the general guidelines and WP-5 Practical testing of general guidelines) have been described in detail elsewhere and can be found on the IDEAS website. References from the open literature containing information on cases of internal contamination, from which intakes and committed doses could be assessed, have been compiled into a database. The IDEAS Bibliography Database includes references to papers which might (but were not certain to) contain such information, or which included references to papers which contained such information. This database contains the usual bibliographical information: authors' name(s), year of publication, title of publication and the journal or report number. Up to now, a comprehensive Bibliography Database containing 563 references has been compiled. Not surprisingly, more than half of the references are from the journals Health Physics and Radiation Protection Dosimetry. The next step was for the partners of the IDEAS project to obtain the references

  19. A Language for Specifying Compiler Optimizations for Generic Software

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, Jeremiah J. [Indiana Univ., Bloomington, IN (United States)

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.

  20. Gulf Coast geopressured-geothermal program summary report compilation. Volume 4: Bibliography (annotated only for all major reports)

    Energy Technology Data Exchange (ETDEWEB)

    John, C.J.; Maciasz, G.; Harder, B.J.

    1998-06-01

    This bibliography contains US Department of Energy sponsored Geopressured-Geothermal reports published after 1984. Reports published prior to 1984 are documented in the Geopressured Geothermal bibliography Volumes 1, 2, and 3 that the Center for Energy Studies at the University of Texas at Austin compiled in May 1985. It represents reports, papers and articles covering topics from the scientific and technical aspects of geopressured geothermal reservoirs to the social, environmental, and legal considerations of exploiting those reservoirs for their energy resources.

  1. Uranium. Resources, production and demand

    International Nuclear Information System (INIS)

    1997-01-01

    The events characterising the world uranium market in the last several years illustrate the persistent uncertainty faced by uranium producers and consumers worldwide. With world nuclear capacity expanding and uranium production satisfying only about 60 per cent of demand, uranium stockpiles continue to be depleted at a high rate. The uncertainty related to the remaining levels of world uranium stockpiles and to the amount of surplus defence material that will be entering the market makes it difficult to determine when a closer balance between uranium supply and demand will be reached. Information in this report provides insights into changes expected in uranium supply and demand until well into the next century. The 'Red Book', jointly prepared by the OECD Nuclear Energy Agency and the International Atomic Energy Agency, is the foremost reference on uranium. This world report is based on official information from 59 countries and includes compilations of statistics on resources, exploration, production and demand as of 1 January 1997. It provides substantial new information from all of the major uranium producing centres in Africa, Australia, Eastern Europe, North America and the New Independent States, including the first-ever official reports on uranium production in Estonia, Mongolia, the Russian Federation and Uzbekistan. It also contains an international expert analysis of industry statistics and worldwide projections of nuclear energy growth, uranium requirements and uranium supply

  2. Geology in coal resource utilization

    International Nuclear Information System (INIS)

    Peters, D.C.

    1991-01-01

    The 37 papers in this book were compiled with an overriding theme in mind: to provide the coal industry with a comprehensive source of information on how geology and geologic concepts can be applied to the many facets of coal resource location, extraction, and utilization. The chapters have been arranged to address the major coal geology subfields of Exploration and Reserve Definition, Reserve Estimation, Coalbed Methane, Underground Coal Gasification, Mining, Coal Quality Concerns, and Environmental Impacts, with papers distributed on the basis of their primary emphasis. To help guide one through the collection, the author has included prefaces at the beginning of each chapter. They are intended as a brief lead-in to the subject of the chapter and an acknowledgement of the papers' connections to the subject and contributions to the chapter. In addition, a brief cross-reference section has been included in each preface to help one find papers of interest in other chapters. The subfields of coal geology are intimately intertwined, and investigations in one area may impact problems in another area. Some subfields tend to blur at their edges, such as with reserve definition and reserve estimation. Papers have been processed separately for inclusion in the database

  3. Summary of the mineral- and energy-resource endowment, BLM roswell resource area, east-central New Mexico

    Science.gov (United States)

    Bartsch-Winkler, S.; Sutphin, D.M.; Ball, M.M.; Korzeb, S.L.; Kness, R.F.; Dutchover, J.T.

    1993-01-01

    In this summary of two comprehensive resource reports produced by the U.S. Bureau of Mines and the U.S. Geological Survey for the U.S. Bureau of Land Management, we discuss the mineral- and energy-resource endowment of the 14-million-acre Roswell Resource Area, New Mexico, managed by the Bureau of Land Management. The Bureau and Survey reports result from separate studies that are compilations of published and unpublished data and integrate new findings on the geology, geochemistry, geophysics, mineral, industrial, and energy commodities, and resources for the seven-county area. The reports have been used by the Bureau of Land Management in preparation of the Roswell Resource Area Resource Management Plan, and will have future use in nationwide mineral- and energy-resource inventories and assessments, as reference and training documents, and as public-information tools. In the Roswell Resource Area, many metals, industrial mineral commodities, and energy resources are being, or have been, produced or prospected. These include metals and high-technology materials, such as copper, gold, silver, thorium, uranium and/or vanadium, rare-earth element minerals, iron, manganese, tungsten, lead, zinc, and molybdenum; industrial mineral resources, including barite, limestone/dolomite, caliche, clay, fluorspar, gypsum, scoria, aggregate, and sand and gravel; and fuels and associated resources, such as oil, gas, tar sand and heavy oil, coal, and gases associated with hydrocarbons. Other commodities that have yet to be identified in economic concentrations include potash, halite, polyhalite, anhydrite, sulfur, feldspar, building stone and decorative rock, brines, various gases associated with oil and gas exploration, and carbon dioxide. © 1993 Oxford University Press.

  4. Compiled MPI: Cost-Effective Exascale Applications Development

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Quinlan, D; Lumsdaine, A; Hoefler, T

    2012-04-10

    The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over
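
    The coupling this abstract describes, where parallelization details are locked together with application logic, can be illustrated with a toy sketch. The following Python stands in for MPI (it uses threads and queues, not the MPI API), and the ring-reduction pattern is invented for illustration.

```python
# Toy sketch (NOT real MPI): explicit send/recv interleaved with the
# numerical logic, mimicking the coupling described in the abstract.
# Threads and queues stand in for MPI ranks and messages.
import threading
import queue

def ring_sum(nranks, local_values):
    """Each 'rank' adds its value to a partial sum passed around a ring."""
    links = [queue.Queue() for _ in range(nranks)]  # one inbox per rank
    result = {}

    def rank_body(r):
        acc = local_values[r]
        if r == 0:
            links[1].put(acc)                 # explicit 'send' to rank 1
            result[0] = links[0].get()        # final 'recv' closes the ring
        else:
            acc += links[r].get()             # 'recv' from the left neighbor
            links[(r + 1) % nranks].put(acc)  # 'send' to the right neighbor

    threads = [threading.Thread(target=rank_body, args=(r,))
               for r in range(nranks)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return result[0]

assert ring_sum(4, [1, 2, 3, 4]) == 10
```

    Note how the ring topology is hard-wired into the individual send and receive calls: changing the communication structure means rewriting the numerical code itself, which is the difficulty a compiler-driven approach like CoMPI targets.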

  5. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    [...], block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs [...]-directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter.
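
    The first Futamura projection named in this abstract (compiling a program by specializing an interpreter with respect to it) can be sketched in a few lines of Python. The toy arithmetic language below is invented for illustration; it is not the typed, block-structured language of the paper.

```python
# Toy illustration of the first Futamura projection: specializing an
# interpreter with respect to a fixed program yields "compiled" code.

def interpret(expr, env):
    """A tiny interpreter for arithmetic expressions over variables."""
    op = expr[0]
    if op == "lit":
        return expr[1]
    if op == "var":
        return env[expr[1]]
    if op == "add":
        return interpret(expr[1], env) + interpret(expr[2], env)
    if op == "mul":
        return interpret(expr[1], env) * interpret(expr[2], env)
    raise ValueError(op)

def specialize(expr):
    """Partially evaluate the interpreter for a fixed expr, producing a
    residual function of the environment alone (the 'compiled' program)."""
    op = expr[0]
    if op == "lit":
        v = expr[1]
        return lambda env: v
    if op == "var":
        name = expr[1]
        return lambda env: env[name]
    if op == "add":
        f, g = specialize(expr[1]), specialize(expr[2])
        return lambda env: f(env) + g(env)
    if op == "mul":
        f, g = specialize(expr[1]), specialize(expr[2])
        return lambda env: f(env) * g(env)
    raise ValueError(op)

# (x + 2) * y
program = ("mul", ("add", ("var", "x"), ("lit", 2)), ("var", "y"))
compiled = specialize(program)   # syntax dispatch paid once, up front
assert interpret(program, {"x": 3, "y": 4}) == compiled({"x": 3, "y": 4}) == 20
```

    All dispatch on the program's syntax happens at specialization time; the residual closure contains only the operations themselves, which is the essence of compiling by specializing an interpreter.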

  7. OSH technical reference manual

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    In an evaluation of the Department of Energy (DOE) Occupational Safety and Health programs for government-owned contractor-operated (GOCO) activities, the Department of Labor's Occupational Safety and Health Administration (OSHA) recommended a technical information exchange program. The intent was to share written safety and health programs, plans, training manuals, and materials within the entire DOE community. The OSH Technical Reference (OTR) helps support the secretary's response to the OSHA finding by providing a one-stop resource and referral for technical information that relates to safe operations and practice. It also serves as a technical information exchange tool to reference DOE-wide materials pertinent to specific safety topics and, with some modification, as a training aid. The OTR bridges the gap between general safety documents and very specific requirements documents. It is tailored to the DOE community and incorporates DOE field experience.

  8. Coal Data: A reference

    International Nuclear Information System (INIS)

    1991-01-01

    The purpose of Coal Data: A Reference is to provide basic information on the mining and use of coal, an important source of energy in the United States. The report is written for a general audience. The goal is to cover basic material and strike a reasonable compromise between overly generalized statements and detailed analyses. The section ''Coal Terminology and Related Information'' provides additional information about terms mentioned in the text and introduces new terms. Topics covered are US coal deposits, resources and reserves, mining, production, employment and productivity, health and safety, preparation, transportation, supply and stocks, use, coal and the environment, and more. (VC)

  9. Compiler-Enforced Cache Coherence Using a Functional Language

    Directory of Open Access Journals (Sweden)

    Rich Wolski

    1996-01-01

    The cost of hardware cache coherence, both in terms of execution delay and operational cost, is substantial for scalable systems. Fortunately, compiler-generated cache management can reduce program serialization due to cache contention, increase execution performance, and reduce the cost of parallel systems by eliminating the need for more expensive hardware support. In this article, we use the Sisal functional language system as a vehicle to implement and investigate automatic, compiler-based cache management. We describe our implementation of Sisal for the IBM Power/4. The Power/4, briefly available as a product, represents an early attempt to build a shared memory machine that relies strictly on the language system for cache coherence. We discuss the issues associated with deterministic execution and program correctness on a system without hardware coherence, and demonstrate how Sisal (as a functional language) is able to address those issues.

  10. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1991-01-01

    Reliability data are essential in probabilistic safety assessment, with component reliability parameters being particularly important. Plant-specific component failure data would be most appropriate, but such data are rather limited. However, similar components are used in different designs, so generic data, that is, data not specific to the plant being analyzed but relating to components more generally, are important. The International Atomic Energy Agency has compiled the Generic Component Reliability Data Base from data available in the open literature. It is part of the IAEA computer code package for fault/event tree analysis. The Data Base contains 1010 different records covering most of the components used in probabilistic safety analyses of nuclear power plants. The data base input was quality controlled and data sources noted. The data compilation procedure and problems associated with using generic data are explained. (UK)

  11. Methodology and procedures for compilation of historical earthquake data

    International Nuclear Information System (INIS)

    1987-10-01

    This report was prepared subsequent to the recommendations of the project initiation meeting in Vienna, November 25-29, 1985, under the IAEA Interregional project INT/9/066 Seismic Data for Nuclear Power Plant Siting. The aim of the project is to co-ordinate national efforts of Member States in the Mediterranean region in the compilation and processing of historical earthquake data for use in the siting of nuclear facilities. The main objective of the document is to assist the participating Member States, especially those who are initiating an NPP siting programme, in their effort to compile and process historical earthquake data and to provide a uniform interregional framework for this task. Although the document is directed mainly to the Mediterranean countries, using illustrative examples from this region, the basic procedures and methods herein described may be applicable to other parts of the world such as Southeast Asia, Himalayan belt, Latin America, etc. 101 refs, 7 figs

  12. A Forth interpreter and compiler's study for computer aided design

    International Nuclear Information System (INIS)

    Djebbar, F. Zohra Widad

    1986-01-01

    The wide field of utilization of FORTH led us to develop an interpreter. It has been implemented on a MC 68000 microprocessor based computer, with ASTERIX, a UNIX-like operating system (a real-time system written by C.E.A.). This work has been done in two different versions: - The first one, fully written in C language, ensures good portability on a wide variety of microprocessors. But the performance estimations show excessive execution times, leading to a new optimized version. - This new version is characterized by the compilation of the most frequently used words of the FORTH basis. This allows us to obtain an interpreter with good performance and an execution speed close to that of compiled C. (author) [fr
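
    The optimization described, compiling the most frequently used words rather than re-interpreting them token by token, can be sketched in Python (standing in for the thesis's C implementation; the word set and representation are invented):

```python
# Minimal Forth-style sketch: primitive words operate on a stack, and a
# colon definition is "compiled" once into a flat list of primitives, so
# executing it skips per-token dictionary lookups.
PRIMITIVES = {
    "+":   lambda s: s.append(s.pop() + s.pop()),
    "*":   lambda s: s.append(s.pop() * s.pop()),
    "dup": lambda s: s.append(s[-1]),
}
DICTIONARY = {}  # user words -> compiled lists of primitive callables

def compile_word(name, source):
    """Resolve every token ahead of time; inline already-compiled words."""
    code = []
    for tok in source.split():
        if tok in PRIMITIVES:
            code.append(PRIMITIVES[tok])
        elif tok in DICTIONARY:
            code.extend(DICTIONARY[tok])
        else:
            n = int(tok)
            code.append(lambda s, n=n: s.append(n))
    DICTIONARY[name] = code

def run(source):
    """Interpret a line, dispatching token by token (the slow path)."""
    stack = []
    for tok in source.split():
        if tok in PRIMITIVES:
            PRIMITIVES[tok](stack)
        elif tok in DICTIONARY:
            for prim in DICTIONARY[tok]:   # compiled word: no lookups inside
                prim(stack)
        else:
            stack.append(int(tok))
    return stack

compile_word("square", "dup *")
assert run("3 square 2 +") == [11]
```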

  13. Mode automata and their compilation into fault trees

    International Nuclear Information System (INIS)

    Rauzy, Antoine

    2002-01-01

    In this article, we advocate the use of mode automata as a high level representation language for reliability studies. Mode automata are states/transitions based representations with the additional notion of flow. They can be seen as a generalization of both finite capacity Petri nets and block diagrams. They can be assembled into hierarchies by means of composition operations. The contribution of this article is twofold. First, we introduce mode automata and we discuss their relationship with other formalisms. Second, we propose an algorithm to compile mode automata into Boolean equations (fault trees). Such a compilation is of interest for two reasons. First, assessment tools for Boolean models are much more efficient than those for states/transitions models. Second, the automated generation of fault trees from higher level representations makes their maintenance easier through the life cycle of the systems under study.
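
    A minimal sketch of the compilation idea, assuming a toy automaton format invented here (the article's mode automata also carry flows and hierarchy, which this omits): reachability paths to the failed state become the cut sets of an OR-of-ANDs fault tree.

```python
# Toy sketch: "compile" a small state/transition model into a Boolean
# fault tree. The system fails if some set of transition events leads from
# the initial state to the failed state; paths OR together and the events
# along one path AND together.

def compile_to_fault_tree(transitions, initial, failed):
    """transitions: {state: [(event, next_state), ...]}.
    Returns the minimal cut sets: an OR of AND-gates over basic events."""
    cut_sets = []
    frontier = [(initial, frozenset())]
    while frontier:
        state, events = frontier.pop()
        if state == failed:
            cut_sets.append(events)
            continue
        for event, nxt in transitions.get(state, []):
            if event not in events:        # no event fires twice on a path
                frontier.append((nxt, events | {event}))
    # keep only minimal cut sets
    return [c for c in cut_sets if not any(o < c for o in cut_sets)]

def top_event(cut_sets, event_true):
    """Evaluate the compiled Boolean equation for given basic events."""
    return any(all(event_true[e] for e in cs) for cs in cut_sets)

# Hypothetical system: fails on a pipe rupture, or on a pump failure
# followed by a backup failure.
transitions = {
    "nominal":  [("pump_fail", "degraded"), ("pipe_rupture", "failed")],
    "degraded": [("backup_fail", "failed")],
}
cuts = compile_to_fault_tree(transitions, "nominal", "failed")
```

    Once compiled, the Boolean form can be handed to standard fault-tree assessment tools, which is the efficiency argument the article makes.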

  14. Deep knowledge and knowledge compilation for dynamic systems

    International Nuclear Information System (INIS)

    Mizoguchi, Riichiro

    1994-01-01

    Expert systems are viewed as knowledge-based systems which efficiently solve real-world problems based on the expertise contained in their knowledge bases elicited from domain experts. Although such expert systems that depend on heuristics of domain experts have contributed to the current success, they are known to be brittle and hard to build. This paper is concerned with research on model-based diagnosis and knowledge compilation for dynamic systems conducted by the author's group to overcome these difficulties. Firstly, we summarize the advantages and shortcomings of expert systems. Secondly, deep knowledge and knowledge compilation are discussed. Then, the latest results of our research on model-based diagnosis are reviewed. The future direction of knowledge base technology research is also discussed. (author)

  15. Just-In-Time compilation of OCaml byte-code

    OpenAIRE

    Meurer, Benedikt

    2010-01-01

    This paper presents various improvements that were applied to OCamlJIT2, a Just-In-Time compiler for the OCaml byte-code virtual machine. OCamlJIT2 currently runs on various Unix-like systems with x86 or x86-64 processors. The improvements, including the new x86 port, are described in detail, and performance measures are given, including a direct comparison of OCamlJIT2 to OCamlJIT.

  16. Compilation of data on γ - γ → hadrons

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1986-06-01

    Data on γγ → hadrons extracted from e⁺e⁻ reactions are compiled. The review includes inclusive cross-sections, structure functions, exclusive cross-sections and resonance widths. Data up to 1st July 1986 are included. All the data in this review can be found and retrieved in the Durham-RAL HEP database, together with a wide range of other reaction data. Users throughout Europe can interactively access the database through CMS on the RAL computer. (author)

  17. Data compilation for radiation effects on ceramic insulators

    International Nuclear Information System (INIS)

    Fukuya, Koji; Terasawa, Mititaka; Nakahigashi, Shigeo; Ozawa, Kunio.

    1986-08-01

    Data on radiation effects on ceramic insulators were compiled from the literature and summarized from the viewpoint of fast neutron irradiation effects. The data were classified according to property and ceramic type. The properties are dimensional stability, mechanical properties, thermal properties, and electrical and dielectric properties. Data sheets were prepared for each table or graph in the literature. The characteristic features of the data base are briefly described. (author)

  18. Compiling a corpus-based dictionary grammar: an example for ...

    African Journals Online (AJOL)

    In this article it is shown how a corpus-based dictionary grammar may be compiled — that is, a mini-grammar fully based on corpus data and specifically written for use in and integrated with a dictionary. Such an effort is, to the best of our knowledge, a world first. We exemplify our approach for a Northern Sotho ...

  19. Engineering Amorphous Systems, Using Global-to-Local Compilation

    Science.gov (United States)

    Nagpal, Radhika

    Emerging technologies are making it possible to assemble systems that incorporate myriads of information-processing units at almost no cost: smart materials, self-assembling structures, vast sensor networks, pervasive computing. How does one engineer robust and prespecified global behavior from the local interactions of immense numbers of unreliable parts? We discuss organizing principles and programming methodologies that have emerged from Amorphous Computing research, that allow us to compile a specification of global behavior into a robust program for local behavior.

  20. CRECTJ: a computer program for compilation of evaluated nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-09-01

    In order to compile evaluated nuclear data in the ENDF format, the computer program CRECTJ has been developed. CRECTJ has two versions; CRECTJ5 treats the data in the ENDF/B-IV and ENDF/B-V format, and CRECTJ6 the data in the ENDF-6 format. These programs have been frequently used to produce the Japanese Evaluated Nuclear Data Library (JENDL). This report describes input data and examples of CRECTJ. (author)

  1. Compiling knowledge-based systems from KEE to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications - most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBSs developed in KEE into Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights into Ada as an artificial intelligence programming language, potential solutions to some of the engineering difficulties encountered in early work, and inspiration for future system development.
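
    The kind of translation involved can be sketched with a toy example, here in Python rather than Ada, with all names hypothetical: a declarative rule is 'compiled' into a plain procedural function, leaving no rule engine behind at run time.

```python
# Toy sketch: a declarative rule (data) 'compiled' into a plain procedural
# function, the kind of translation a KBS-to-conventional-language compiler
# performs. Python stands in for Ada; names are hypothetical.

def compile_rule(conditions, action):
    """conditions: list of (slot, predicate); action: (slot, value).
    Returns a closure that tests and fires without any rule engine."""
    def fire(frame):
        if all(pred(frame.get(slot)) for slot, pred in conditions):
            slot, value = action
            frame[slot] = value
            return True
        return False
    return fire

# IF temperature > 100 AND pressure > 5 THEN alarm := "overpressure"
rule = compile_rule(
    [("temperature", lambda v: v is not None and v > 100),
     ("pressure",    lambda v: v is not None and v > 5)],
    ("alarm", "overpressure"),
)

frame = {"temperature": 120, "pressure": 7}
rule(frame)
assert frame["alarm"] == "overpressure"
```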

  2. Compilation of piping benchmark problems - Cooperative international effort

    Energy Technology Data Exchange (ETDEWEB)

    McAfee, W J [comp.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analysis results. Of the problem descriptions submitted, three were selected for use. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations.

  3. Fault-tolerant digital microfluidic biochips compilation and synthesis

    CERN Document Server

    Pop, Paul; Stuart, Elena; Madsen, Jan

    2016-01-01

    This book describes for researchers in the fields of compiler technology, design and test, and electronic design automation the new area of digital microfluidic biochips (DMBs), and thus offers a new application area for their methods. The authors present a routing-based model of operation execution, along with several associated compilation approaches, which progressively relax the assumption that operations execute inside fixed rectangular modules. Since operations can experience transient faults during the execution of a bioassay, the authors show how to use both offline (design time) and online (runtime) recovery strategies. The book also presents methods for the synthesis of fault-tolerant application-specific DMB architectures. • Presents the current models used for the research on compilation and synthesis techniques of DMBs in a tutorial fashion; • Includes a set of “benchmarks”, which are presented in great detail and includes the source code of most of the t...

  4. Compilation of piping benchmark problems - Cooperative international effort

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analysis results. Of the problem descriptions submitted, three were selected for use. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations

  5. Year 2000 Readiness Kit: A Compilation of Y2K Resources for Schools, Colleges and Universities.

    Science.gov (United States)

    Department of Education, Washington, DC.

    This kit was developed to assist the postsecondary education community's efforts to resolve the Year 2000 (Y2K) computer problem. The kit includes a description of the Y2K problem, an assessment of the readiness of colleges and universities, a checklist for institutions, a Y2K communications strategy, articles on addressing the problem in academic…

  6. Final Report Low-temperature Resource Assessment Program

    Energy Technology Data Exchange (ETDEWEB)

    Lienau, P.J. [Geo-Heat Center, Oregon Institute of Technology, Klamath Falls, OR (US); Ross, H. [Earth Sciences and Resources Institute, University of Utah

    1996-02-01

    The U.S. Department of Energy - Geothermal Division (DOE/GD) recently sponsored the Low-Temperature Resource Assessment project to update the inventory of the nation's low- and moderate-temperature geothermal resources and to encourage development of these resources. A database of 8,977 thermal wells and springs that are in the temperature range of 20 degrees Celsius to 150 degrees Celsius has been compiled for ten western states, an impressive increase of 82% compared to the previous assessments. The database includes location, descriptive data, physical parameters, water chemistry and references for sources of data. Computer-generated maps are also available for each state. State Teams have identified 48 high-priority areas for near-term comprehensive resource studies and development. Resources with temperatures greater than 50 degrees Celsius located within 8 km of a population center were identified for 271 collocated cities. Geothermal energy cost-evaluation software has been developed to quickly identify the cost of geothermally supplied heat to these areas in a fashion similar to that used for conventionally fueled heat sources.

  7. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of airborne surveys of different sizes, resolutions and vintages. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal magnetic field grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for e.g. oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilations of magnetic data and the merging of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still little recognized. The maximum wavelength that can be resolved in each individual survey is directly related to the survey size, and consequently a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine the airborne compilation with satellite magnetic data, which is sufficient only under particular preconditions.
A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated

  8. Online Resources

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Genetics; Online Resources. Journal of Genetics. Online Resources. Volume 97. 2018 | Online resources. Volume 96. 2017 | Online resources. Volume 95. 2016 | Online resources. Volume 94. 2015 | Online resources. Volume 93. 2014 | Online resources. Volume 92. 2013 | Online resources ...

  9. OpenMP-accelerated SWAT simulation using Intel C and FORTRAN compilers: Development and benchmark

    Science.gov (United States)

    Ki, Seo Jin; Sugimura, Tak; Kim, Albert S.

    2015-02-01

    We developed a practical method to accelerate execution of the Soil and Water Assessment Tool (SWAT) using open (free) computational resources. The SWAT source code (rev 622) was recompiled using a non-commercial Intel FORTRAN compiler on the Ubuntu 12.04 LTS Linux platform, and newly named iOMP-SWAT in this study. GNU utilities of make, gprof, and diff were used to develop the iOMP-SWAT package, profile memory usage, and verify that parallel and serial simulations produce identical results. Among 302 SWAT subroutines, the slowest routines were identified using GNU gprof, and later modified using the Open Multi-Processing (OpenMP) library on an 8-core shared memory system. In addition, a C wrapping function was used to rapidly set large arrays to zero by cross-compiling with the original SWAT FORTRAN package. A universal speedup ratio of 2.3 was achieved using input data sets with a large number of hydrological response units. As we specifically focus on acceleration of a single SWAT run, the use of iOMP-SWAT for parameter calibration will significantly improve the performance of SWAT optimization.
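
    The profile-then-parallelize workflow (find the hottest routine, split its independent iterations across workers, then check the parallel output against the serial output, as the authors did with diff) can be mimicked with Python's standard library. The routine below is a hypothetical stand-in, and threads replace OpenMP only for brevity of the sketch; real speedup for compute-bound FORTRAN loops comes from OpenMP threads.

```python
# Sketch of the profile-then-parallelize workflow: identify a hot routine,
# split its independent iterations across workers, then check that the
# parallel run matches the serial run exactly (the 'diff' step).
# The routine is a hypothetical stand-in; threads replace OpenMP here.
from concurrent.futures import ThreadPoolExecutor

def route_hru(hru_id):
    """Stand-in for one hydrological response unit's computation."""
    return sum((hru_id * k) % 97 for k in range(1000))

def run_serial(hru_ids):
    return [route_hru(h) for h in hru_ids]

def run_parallel(hru_ids, workers=4):
    # pool.map preserves input order, so results line up with run_serial
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(route_hru, hru_ids))

hrus = list(range(32))
assert run_serial(hrus) == run_parallel(hrus)  # identical results required
```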

  10. Aeromagnetic map compilation: procedures for merging and an example from Washington

    Directory of Open Access Journals (Sweden)

    C. Finn

    2000-06-01

    Rocks in Antarctica and offshore have widely diverse magnetic properties. Consequently, aeromagnetic data collected there can improve knowledge of the geologic, tectonic and geothermal characteristics of the region. Aeromagnetic data can map concealed structures such as faults, folds and dikes, ascertain basin thickness and locate buried volcanic rocks, as well as some intrusive and metamorphic rocks. Gridded, composite data sets allow a view of continental-scale trends that individual data sets do not provide and link widely-separated areas of outcrop and disparate geologic studies. Individual magnetic surveys must be processed so that they match adjacent surveys prior to merging. A consistent representation of the Earth's magnetic field (International Geomagnetic Reference Field, IGRF) must be removed from each data set. All data sets need to be analytically continued to the same flight elevation, with their datums shifted to match adjacent data. I advocate minimal processing to best represent the individual surveys in the merged compilation. An example of a compilation of aeromagnetic surveys from Washington illustrates the utility of aeromagnetic maps for providing synoptic views of regional tectonic features.
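
    Two of the per-survey steps described (removing a reference field, then shifting datums so adjacent surveys agree) can be sketched on toy 1-D profiles. All numbers and the constant 'IGRF' value are invented, and analytic continuation to a common flight elevation is omitted.

```python
# Toy 1-D sketch of two per-survey merge steps: subtract a main-field
# (IGRF-like) value, then shift a survey's datum so it agrees with its
# neighbor over their overlapping stations.

def remove_reference_field(readings, main_field):
    """Crustal anomaly = total field minus the main-field value."""
    return [r - main_field for r in readings]

def level_to_neighbor(survey, neighbor, overlap):
    """Add a constant shift so the surveys' means agree on the overlap."""
    shift = (sum(neighbor[i] for i in overlap) -
             sum(survey[i] for i in overlap)) / len(overlap)
    return [v + shift for v in survey]

# two adjacent surveys observing the same anomaly with different datums
survey_a = remove_reference_field([50120.0, 50140.0, 50130.0], 50000.0)
survey_b = remove_reference_field([50175.0, 50165.0, 50185.0], 50050.0)
survey_b = level_to_neighbor(survey_b, survey_a, overlap=[0, 1])
# survey_b now shares survey_a's datum over the overlapping stations
```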

  11. AREVA - 2013 Reference document

    International Nuclear Information System (INIS)

    2014-01-01

    This Reference Document contains information on the AREVA group's objectives, prospects and development strategies, as well as estimates of the markets, market shares and competitive position of the AREVA group. Content: 1 - Person responsible for the Reference Document; 2 - Statutory auditors; 3 - Selected financial information; 4 - Description of major risks confronting the company; 5 - Information about the issuer; 6 - Business overview; 7 - Organizational structure; 8 - Property, plant and equipment; 9 - Situation and activities of the company and its subsidiaries; 10 - Capital resources; 11 - Research and development programs, patents and licenses; 12 - Trend information; 13 - Profit forecasts or estimates; 14 - Management and supervisory bodies; 15 - Compensation and benefits; 16 - Functioning of the management and supervisory bodies; 17 - Human resources information; 18 - Principal shareholders; 19 - Transactions with related parties; 20 - Financial information concerning assets, financial positions and financial performance; 21 - Additional information; 22 - Major contracts; 23 - Third party information, statements by experts and declarations of interest; 24 - Documents on display; 25 - Information on holdings; Appendix 1: report of the supervisory board chairman on the preparation and organization of the board's activities and internal control procedures; Appendix 2: statutory auditors' reports; Appendix 3: environmental report; Appendix 4: non-financial reporting methodology and independent third-party report on social, environmental and societal data; Appendix 5: ordinary and extraordinary general shareholders' meeting; Appendix 6: values charter; Appendix 7: table of concordance of the management report; glossaries

  12. Compilation of data relating to the erosive response of 608 recently-burned basins in the western United States

    Science.gov (United States)

    Gartner, Joseph E.; Cannon, Susan H.; Bigio, Erica R.; Davis, Nicole K.; Parrett, Charles; Pierce, Kenneth L.; Rupert, Michael G.; Thurston, Brandon L.; Trebesch, Matthew J.; Garcia, Steve P.; Rea, Alan H.

    2005-01-01

    This report presents a compilation of data on the erosive response, debris-flow initiation processes, basin morphology, burn severity, event-triggering rainfall, rock type, and soils for 608 basins recently burned by 53 fires located throughout the Western United States.  The data presented here are a combination of those collected during our own field research and those reported in the literature.  In some cases, data from a Geographic Information System (GIS) and Digital Elevation Models (DEMs) were used to supplement the data from the primary source.  Due to gaps in the information available, not all parameters are characterized for all basins. This database provides a resource for researchers and land managers interested in examining relations between the runoff response of recently burned basins and their morphology, burn severity, soils and rock type, and triggering rainfall.  The purpose of this compilation is to provide a single resource for future studies addressing problems associated with wildfire-related erosion.  For example, data in this compilation have been used to develop a model for debris flow probability from recently burned basins using logistic multiple regression analysis (Cannon and others, 2004).  This database provides a convenient starting point for other studies.  For additional information on estimated post-fire runoff peak discharges and debris-flow volumes, see Gartner and others (2004).
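
    For illustration of the modeling use mentioned above, the general shape of a logistic multiple regression for debris-flow probability can be sketched as follows. The predictors and coefficients here are invented and do not reproduce the Cannon and others (2004) model.

```python
# Illustrative only: the general shape of a logistic-regression model for
# debris-flow probability. The predictors and coefficients below are
# invented and do NOT reproduce the Cannon and others (2004) model.
import math

def debris_flow_probability(burned_fraction, gradient_deg, storm_rain_mm,
                            b0=-3.0, b_burn=2.5, b_grad=0.08, b_rain=0.05):
    """P = 1 / (1 + exp(-(b0 + b1*x1 + b2*x2 + b3*x3)))."""
    z = (b0 + b_burn * burned_fraction + b_grad * gradient_deg
         + b_rain * storm_rain_mm)
    return 1.0 / (1.0 + math.exp(-z))

p = debris_flow_probability(burned_fraction=0.9, gradient_deg=25,
                            storm_rain_mm=30)
# probability rises monotonically with each predictor
```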

  13. Reference nuclear data for space technology

    International Nuclear Information System (INIS)

    Burrows, T.W.; Holden, N.E.; Pearlstein, S.

    1977-01-01

    Specialized bibliographic searches, data compilations, and data evaluations help the basic and applied research scientist in his work. The National Nuclear Data Center (NNDC) collates and analyzes nuclear physics information, and is concerned with the timely production and revision of reference nuclear data. A frequently revised reference data base in computerized form has the advantage of large quantities of data available without publication delays. The information normally handled by coordinated efforts of NNDC consists of neutron, charged-particle, nuclear structure, radioactive decay, and photonuclear data. 2 figures

  14. 2007 Survey of Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-09-15

    This WEC study is a unique comprehensive compilation of global energy resources. Complementing the BP Statistical Review and the World Energy Outlook, it details 16 key energy resources with the latest data provided by 96 WEC Member Committees worldwide. This highly regarded publication is an essential tool for governments, NGOs, industry, academia and the finance community. This 21st edition is the latest in a long series of reviews of the status of the world's major energy resources. It covers not only the fossil fuels but also the major types of traditional and novel sources of energy.

  16. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinina, Elena Arkadievna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Samsa, Michael [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-11-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), supplemented by an extensive review of the available literature on similar past efforts. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and the descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to "ensure it has heard from as many points of view as possible." The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, a website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs.

  17. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations

    International Nuclear Information System (INIS)

    Kalinina, Elena Arkadievna; Samsa, Michael

    2015-01-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), supplemented by an extensive review of the available literature on similar past efforts. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and the descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to 'ensure it has heard from as many points of view as possible.' The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, a website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs conducted by the

  18. Selected Resources and Bibliography

    Science.gov (United States)

    New Directions for Higher Education, 2011

    2011-01-01

    This chapter provides an annotated bibliography of resources pertaining to international branch campuses (IBCs). This collection of references has been selected to represent the breadth of emerging scholarship on cross-border higher education and is intended to provide further resources on a range of concerns surrounding cross-border higher…

  19. Annual accumulation over the Greenland ice sheet interpolated from historical and newly compiled observation data

    Science.gov (United States)

    Shen, Dayong; Liu, Yuling; Huang, Shengli

    2012-01-01

    The estimation of ice/snow accumulation is of great significance in quantifying the mass balance of ice sheets and variation in water resources. Improving accuracy and reducing uncertainty has been a challenge for the estimation of annual accumulation over the Greenland ice sheet. In this study, we kriged and analyzed the spatial pattern of accumulation based on an observation data series including 315 points used in a recent study, plus 101 ice cores and snow pits and newly compiled data from 23 coastal weather stations. The estimated annual accumulation over the Greenland ice sheet is 31.2 g cm⁻² yr⁻¹, with a standard error of 0.9 g cm⁻² yr⁻¹. The main differences between the improved map developed in this study and the recently published accumulation maps are in the coastal areas, especially the southeast and southwest regions. The analysis of accumulation versus elevation reveals the distribution patterns of accumulation over the Greenland ice sheet.
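
    The record above interpolates scattered observations with kriging; as a much simpler stand-in for illustrating spatial interpolation from scattered samples, the sketch below uses inverse-distance weighting (IDW). The function name, the numbers, and the method itself are illustrative assumptions, not the study's actual procedure.

```python
# Hypothetical sketch: estimate a value at an unsampled (x, y) location as a
# distance-weighted mean of scattered observations (IDW), a deliberately
# simpler alternative to the kriging used in the study above.
def idw(points, values, query, power=2.0):
    """Estimate the value at `query` from (x, y) sample points by IDW."""
    num = den = 0.0
    for (x, y), v in zip(points, values):
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return float(v)              # query coincides with a sample point
        w = d2 ** (-power / 2.0)         # weight decays with distance**power
        num += w * v
        den += w
    return num / den
```

    A query point midway between two samples valued 10 and 20, for instance, receives their mean, 15.0; points nearer one sample are pulled toward its value.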

  20. Uranium 2009: Resources, Production and Demand

    International Nuclear Information System (INIS)

    2010-01-01

    With several countries currently building nuclear power plants and planning the construction of more to meet long-term increases in electricity demand, uranium resources, production and demand remain topics of notable interest. In response to the projected growth in demand for uranium and declining inventories, the uranium industry - the first critical link in the fuel supply chain for nuclear reactors - is boosting production and developing plans for further increases in the near future. Strong market conditions will, however, be necessary to trigger the investments required to meet projected demand. The 'Red Book', jointly prepared by the OECD Nuclear Energy Agency and the International Atomic Energy Agency, is a recognised world reference on uranium. It is based on information compiled in 40 countries, including those that are major producers and consumers of uranium. This 23rd edition provides a comprehensive review of world uranium supply and demand as of 1 January 2009, as well as data on global uranium exploration, resources, production and reactor-related requirements. It provides substantive new information from major uranium production centres around the world, as well as from countries developing production centres for the first time. Projections of nuclear generating capacity and reactor-related uranium requirements through 2035 are also featured, along with an analysis of long-term uranium supply and demand issues.

  1. NEA contributions to the worldwide collection, compilation and dissemination of nuclear reaction data

    International Nuclear Information System (INIS)

    Dupont, E.

    2012-01-01

    The NEA Data Bank is an international centre of reference for basic nuclear tools used in the analysis and prediction of phenomena in different nuclear applications. The Data Bank collects and compiles computer codes and scientific data and contributes to their improvement for the benefit of scientists in its member countries. In line with this mission, the Data Bank is a core centre of the International Network of Nuclear Reaction Data Centres (NRDC), which co-ordinates the worldwide collection, compilation and dissemination of nuclear reaction data. The NRDC network was established in 1976 from the earlier Four-Centres' Network created in 1966 by the United States, the NEA, the International Atomic Energy Agency (IAEA) and the former Soviet Union. Today, the NRDC is a worldwide co-operation network under the auspices of the IAEA, with 14 nuclear data centres from 8 countries and 2 international organisations belonging to the network. The main objective of the NRDC is to preserve, update and disseminate experimental nuclear reaction data that have been compiled for more than 40 years in a shared database (EXFOR). The EXFOR database contains basic nuclear data on low- to medium-energy experiments for incident neutron, photon and various charged-particle-induced reactions on a wide range of isotopes, natural elements and compounds. Today, with more than 140 000 data sets from approximately 20 000 experiments, EXFOR is by far the most important and complete experimental nuclear reaction database in the world and is widely used in the field of nuclear science and technology. The Data Bank is responsible for the collection and compilation of nuclear reaction data measured in its geographical area. Since 1966, the Data Bank has contributed around 5 000 experiments to the EXFOR database, and it continues to compile new data while maintaining the highest level of quality throughout the database. NRDC co-ordination meetings are held on a biennial basis. Recent meetings

  2. Recent Efforts in Data Compilations for Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Dillmann, Iris

    2008-01-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on 'Nuclear Physics Data Compilation for Nucleosynthesis Modeling' held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The 'JINA Reaclib Database' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A 'high priority list' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. 'Workflow tools' aim to make the evaluation process transparent and allow users to follow the progress.

  3. Recent Efforts in Data Compilations for Nuclear Astrophysics

    Science.gov (United States)

    Dillmann, Iris

    2008-05-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on "Nuclear Physics Data Compilation for Nucleosynthesis Modeling" held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The "JINA Reaclib Database" on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A "high priority list" for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. "Workflow tools" aim to make the evaluation process transparent and allow users to follow the progress.

  4. Combining Compile-Time and Run-Time Parallelization

    Directory of Open Access Journals (Sweden)

    Sungdo Moon

    1999-01-01

    This paper demonstrates that significant improvements to automatic parallelization technology require that existing systems be extended in two ways: (1) they must combine high‐quality compile‐time analysis with low‐cost run‐time testing; and (2) they must take control flow into account during analysis. We support this claim with the results of an experiment that measures the safety of parallelization at run time for loops left unparallelized by the Stanford SUIF compiler's automatic parallelization system. We present results of measurements on programs from two benchmark suites – SPECFP95 and NAS sample benchmarks – which identify inherently parallel loops in these programs that are missed by the compiler. We characterize remaining parallelization opportunities, and find that most of the loops require run‐time testing, analysis of control flow, or some combination of the two. We present a new compile‐time analysis technique that can be used to parallelize most of these remaining loops. This technique is designed not only to improve the results of compile‐time parallelization, but also to produce low‐cost, directed run‐time tests that allow the system to defer binding of parallelization until run time when safety cannot be proven statically. We call this approach predicated array data‐flow analysis. We augment array data‐flow analysis, which the compiler uses to identify independent and privatizable arrays, by associating predicates with array data‐flow values. Predicated array data‐flow analysis allows the compiler to derive "optimistic" data‐flow values guarded by predicates; these predicates can be used to derive a run‐time test guaranteeing the safety of parallelization.
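
    The idea of deferring the parallelization decision to a cheap run-time test can be illustrated with a loose sketch (hypothetical names and structure, not the SUIF system's predicated array data-flow analysis): a guard checks that an indirection array causes no two iterations to write the same element, and only then runs the loop in parallel.

```python
# Hypothetical sketch of a run-time safety test guarding parallel execution.
# If the test fails, the original sequential loop runs unchanged.
from concurrent.futures import ThreadPoolExecutor

def writes_are_independent(index):
    """Run-time predicate: no duplicate write targets means no conflicts."""
    return len(set(index)) == len(index)

def guarded_parallel_update(dst, src, index):
    """dst[index[i]] = src[i], parallelized only when provably safe."""
    if writes_are_independent(index):
        with ThreadPoolExecutor() as pool:
            # Each task writes a distinct slot, so execution order is irrelevant.
            list(pool.map(lambda i: dst.__setitem__(index[i], src[i]),
                          range(len(src))))
    else:
        for i in range(len(src)):        # fall back to sequential semantics
            dst[index[i]] = src[i]
    return dst
```

    The test itself is the "low-cost, directed run-time test" in spirit: it costs one pass over the index array, far less than the loop body it protects.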

  5. The Concept of "Simultaneous Feedback": Towards a New Methodology for Compiling Dictionaries

    Directory of Open Access Journals (Sweden)

    Gilles-Maurice de Schryver

    2011-10-01

    Abstract: Good lexicographers are constantly striving to enhance the quality of their dictionaries. Since dictionaries are ultimately judged by their target users, there is an urgency to provide for the target users' needs. In order to determine such needs more accurately, it has become common practice to submit users of a dictionary to a series of tests to monitor their success in information retrieval. In most cases such feedback unfortunately comes too late, so that it can at best be considered for implementation in the next or revised edition of the dictionary. In this article it is argued that feedback from the target users should be obtained while the compilation of the dictionary is still in progress, a process referred to as "simultaneous feedback". This concept, which offers a new methodology for compiling dictionaries, overcomes the major problem of creating and publishing entire dictionaries before feedback from target users can be obtained. By this new methodology, the release of several small-scale parallel dictionaries triggers feedback that is immediately channelled to the compilation process of a main dictionary. As such, the target users constantly guide the compilers during the entire compilation process. After a theoretical presentation of the new concept, the feasibility of simultaneous feedback is illustrated with reference to the creation of a bilingual Ciluba-Dutch learner's dictionary. It is shown how this main project has been successfully complemented by three parallel projects.

    Keywords: SIMULTANEOUS FEEDBACK, NEW METHODOLOGY, MAIN DICTIONARY, PARALLEL DICTIONARIES, TARGET USERS' DESIRES, QUESTIONNAIRES, ELECTRONIC CORPORA, WORD-FREQUENCY STUDIES, CONCORDANCES, AFRICAN LANGUAGES, CILUBÀ

    Summary: The concept of "simultaneous feedback": Towards a new methodology for the compilation of dictionaries. Good lexicographers constantly strive to improve the quality of their dictionaries

  6. abc: An Extensible AspectJ Compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie J.

    2006-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its front end is built using the Polyglot framework, as a modular extension of the Java language. The use of Polyglot gives flexibility of syntax and type checking. The back end is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general

  7. A compilation of structure functions in deep-inelastic scattering

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1991-01-01

    A compilation of data on the structure functions F₂, xF₃, and R = σ_L/σ_T from lepton deep-inelastic scattering off protons and nuclei is presented. The relevant experiments at CERN, Fermilab and SLAC from 1985 are covered. All the data in this review can be found in and retrieved from the Durham-RAL HEP Databases (HEPDATA on the RAL and CERN VM systems and on DURPDG VAX/VMS) together with data on a wide variety of other reactions. (author)

  8. Herpes - resources

    Science.gov (United States)

    Genital herpes - resources; Resources - genital herpes ... following organizations are good resources for information on genital herpes : March of Dimes -- www.marchofdimes.org/complications/sexually- ...

  9. Interpretation, compilation and field verification procedures in the CARETS project

    Science.gov (United States)

    Alexander, Robert H.; De Forth, Peter W.; Fitzpatrick, Katherine A.; Lins, Harry F.; McGinty, Herbert K.

    1975-01-01

    The production of the CARETS map data base involved the development of a series of procedures for interpreting, compiling, and verifying data obtained from remote sensor sources. Level II land use mapping from high-altitude aircraft photography at a scale of 1:100,000 required production of a photomosaic mapping base for each of the 48 sheets (each 50 x 50 km), and the interpretation and coding of land use polygons on drafting film overlays. CARETS researchers also produced a series of 1970 to 1972 land use change overlays, using the 1970 land use maps and 1972 high-altitude aircraft photography. To enhance the value of the land use sheets, researchers compiled series of overlays showing cultural features, county boundaries and census tracts, surface geology, and drainage basins. In producing Level I land use maps from Landsat imagery, at a scale of 1:250,000, interpreters overlaid drafting film directly on Landsat color composite transparencies and interpreted on the film. They found that such interpretation involves pattern and spectral signature recognition. In studies using Landsat imagery, interpreters identified numerous areas of change but also identified extensive areas of "false change," where Landsat spectral signatures but not land use had changed.

  10. Compiler-Directed Transformation for Higher-Order Stencils

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-20

    As the cost of data movement increasingly dominates performance, developers of finite-volume and finite-difference solutions for partial differential equations (PDEs) are exploring novel higher-order stencils that increase numerical accuracy and computational intensity. This paper describes a new compiler reordering transformation applied to stencil operators that performs partial sums in buffers and reuses the partial sums in computing multiple results. This optimization has multiple effects on stencil performance that are particularly important to higher-order stencils: it exploits data reuse, reduces floating-point operations, and exposes efficient SIMD parallelism to backend compilers. We study the benefit of this optimization in the context of Geometric Multigrid (GMG), a widely used method to solve PDEs, using four different Jacobi smoothers built from 7-, 13-, 27-, and 125-point stencils. We quantify performance, speedup, and numerical accuracy, and use the Roofline model to qualify our results. Ultimately, we obtain over 4× speedup on the smoothers themselves and up to a 3× speedup on the multigrid solver. Finally, we demonstrate that high-order multigrid solvers have the potential of reducing total data movement and energy by several orders of magnitude.
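
    The partial-sum idea can be shown on a 1-D analogue (an illustrative sketch only, not the paper's actual compiler transformation): a 5-point stencil sum computed naively, and again by buffering a running sum that slides across the array so each output reuses its neighbor's work.

```python
# Illustrative 1-D analogue of partial-sum reuse in stencil computations.
def stencil_naive(a):
    """5-point sums, recomputing every window from scratch (4 adds each)."""
    return [sum(a[i - 2:i + 3]) for i in range(2, len(a) - 2)]

def stencil_partial_sums(a):
    """Same results, but each new window is derived from the previous one."""
    out = []
    s = sum(a[0:5])                      # partial sum for the first window
    for i in range(2, len(a) - 2):
        out.append(s)
        if i + 3 < len(a):
            s += a[i + 3] - a[i - 2]     # slide the window: 2 ops, not 4
    return out
```

    On integer input both variants agree exactly; with floating-point data the sliding version can accumulate rounding differently, one reason such reorderings interact with the numerical accuracy the abstract quantifies.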

  11. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1988-01-01

    Reliability data are an essential part of probabilistic safety assessment. The quality of the data can determine the quality of the study as a whole. Component failure data originating from the plant being analyzed would obviously be most appropriate. However, complete reliance on plant experience is rarely possible, mainly because operating experience is rather limited. Nuclear plants, although of different design, often use fairly similar components, so some of the experience can be combined and transferred from one plant to another. In addition, information about component failures is also available from experts with knowledge of component design, manufacturing and operation. That brings us to the importance of assessing generic data. (Here, 'generic' means everything that is not specific to the plant being analyzed.) The generic data available in the open literature can be divided into three broad categories. The first includes data bases used in previous analyses; these can be plant specific, or generic data updated with plant-specific information (the latter case deserves special attention). The second is based on compilations of plants' operating experience, usually drawn from some kind of event reporting system. The third category includes data sources based on expert opinion (single or aggregate) or on a combination of expert opinion and other nuclear and non-nuclear experience. This paper reflects insights gained in compiling data from generic data sources and highlights advantages and pitfalls of using generic component reliability data in PSAs.

  12. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1.- A detailed description in intermediate coupling for all the levels belonging to the 20 configurations 3p⁵ ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2.- Calculation of the electron collision excitation cross sections in Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p⁵ ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3.- Comparison and discussion of the compiled data. These are the experimental and theoretical values available from the literature, and those from this work. 4.- Analysis of the regularities and systematic behaviors in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose. In some cases it has been possible to obtain in this way approximate analytical expressions interpolating the experimental data. 5.- All the experimental and theoretical values studied are graphically presented and compared. 6.- The last part of the work includes a listing of several general purpose programs for Atomic Physics calculations developed for this work. (Author) 35 refs

  13. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco Ramos, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1.- A detailed description in intermediate coupling for all the levels belonging to the 20 configurations 3p⁵ ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2.- Calculation of the electron collision excitation cross sections in Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p⁵ ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3.- Comparison and discussion of the compiled data. These are the experimental and theoretical values available from the literature, and those from this work. 4.- Analysis of the regularities and systematic behaviors in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose. In some cases it has been possible to obtain in this way approximate analytical expressions interpolating the experimental data. 5.- All the experimental and theoretical values studied are graphically presented and compared. 6.- The last part of the work includes a listing of several general purpose programs for Atomic Physics calculations developed for this work. (Author)

  14. Data compilation of angular distributions of sputtered atoms

    International Nuclear Information System (INIS)

    Yamamura, Yasunori; Takiguchi, Takashi; Tawara, Hiro.

    1990-01-01

    Sputtering on a surface is generally caused by the collision cascade developed near the surface. The process is in principle the same as that causing radiation damage in the bulk of solids. Sputtering has long been regarded as an undesirable dirty effect which destroys the cathodes and grids in gas discharge tubes or ion sources and contaminates plasma and the surrounding walls. However, sputtering is used today for many applications such as sputter ion sources, mass spectrometers and the deposition of thin films. Plasma contamination and the surface erosion of first walls due to sputtering are still the major problems in fusion research. The angular distribution of the particles sputtered from solid surfaces can possibly provide the detailed information on the collision cascade in the interior of targets. This report presents a compilation of the angular distribution of sputtered atoms at normal incidence and oblique incidence in the various combinations of incident ions and target atoms. The angular distribution of sputtered atoms from monatomic solids at normal incidence and oblique incidence, and the compilation of the data on the angular distribution of sputtered atoms are reported. (K.I.)

  15. National Energy Strategy: A compilation of public comments; Interim Report

    Energy Technology Data Exchange (ETDEWEB)

    1990-04-01

    This Report presents a compilation of what the American people themselves had to say about problems, prospects, and preferences in energy. The Report draws on the National Energy Strategy public hearing record and accompanying documents. In all, 379 witnesses appeared at the hearings to exchange views with the Secretary, Deputy Secretary, and Deputy Under Secretary of Energy, and Cabinet officers of other Federal agencies. Written submissions came from more than 1,000 individuals and organizations. Transcripts of the oral testimony and question-and-answer (Q-and-A) sessions, as well as prepared statements submitted for the record and all other written submissions, form the basis for this compilation. Citations of these sources in this document use a system of identifying symbols explained below and in the accompanying box. The Report is organized into four general subject areas concerning: (1) efficiency in energy use, (2) the various forms of energy supply, (3) energy and the environment, and (4) the underlying foundations of science, education, and technology transfer. Each of these, in turn, is subdivided into sections addressing specific topics --- such as (in the case of energy efficiency) energy use in the transportation, residential, commercial, and industrial sectors, respectively. 416 refs., 44 figs., 5 tabs.

  16. Computer and compiler effects on code results: status report

    International Nuclear Information System (INIS)

    1996-01-01

    Within the framework of the international effort to assess computer codes designed to describe the overall reactor coolant system (RCS) thermalhydraulic response, core damage progression, and fission product release and transport during severe accidents, there has been a continuous debate as to whether code results are influenced by different code users or by different computers or compilers. The first aspect, the 'Code User Effect', has already been investigated. This paper discusses the other aspects and proposes how to make large system codes insensitive to different computers and compilers. Hardware errors and memory problems are not considered in this report. The codes investigated herein are integrated code systems (e.g. ESTER, MELCOR), thermalhydraulic system codes with extensions for severe accident simulation (e.g. SCDAP/RELAP, ICARE/CATHARE, ATHLET-CD), and codes that simulate fission product transport (e.g. TRAPMELT, SOPHAEROS). Since all of these codes are programmed in Fortran 77, the discussion herein is based on this programming language, although some remarks are made about Fortran 90. Some observations of differing code results obtained on different computers are reported and possible reasons for this unexpected behaviour are listed. Methods to avoid such portability problems are then discussed.
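    The portability problem described above usually traces back to floating-point arithmetic: compilers and optimization levels may legally reorder operations, and floating-point addition is not associative. A minimal sketch of the effect, here forced by hand in Python rather than by a Fortran compiler:

```python
# Floating-point addition is not associative, so a compiler that
# re-associates a sum during optimization can change the low-order
# digits of a result even though the source code is unchanged.
terms = (0.1, 0.2, 0.3)

left_to_right = (terms[0] + terms[1]) + terms[2]
re_associated = terms[0] + (terms[1] + terms[2])

# 0.1 + 0.2 already rounds to 0.30000000000000004, so the two
# groupings land on different representable doubles.
print(left_to_right == re_associated)  # False
```

    In a large system code, millions of such reorderings can accumulate into visibly different results, which is one reason the same source can behave differently on different computers and compilers.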

  17. The FORTRAN NALAP code adapted to a microcomputer compiler

    International Nuclear Information System (INIS)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso

    2010-01-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA project (TEcnologia de Reatores Rapidos Avancados, Technology for Advanced Fast Reactors), aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to the microcomputer compiler Compaq Visual Fortran (version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses of sodium-cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly to remove unnecessary routines, eliminate obsolete statements, introduce new ones, and include extended precision mode. The source program was able to solve three sample cases under protected transient conditions suggested in the literature: normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a loss-of-flow transient; and protected overpower transients. Comparisons were made with results from the time the NALAP code was acquired by the IEAv, back in the 1980s. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the use of gas as the reactor coolant to allow a Closed Brayton Cycle Loop (CBCL) to be used as a heat/electric converter. (author)

  18. The FORTRAN NALAP code adapted to a microcomputer compiler

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso, E-mail: plobo.a@uol.com.b, E-mail: eduardo@ieav.cta.b, E-mail: fbraz@ieav.cta.b, E-mail: guimarae@ieav.cta.b [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)

    2010-07-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA project (TEcnologia de Reatores Rapidos Avancados, Technology for Advanced Fast Reactors), aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to the microcomputer compiler Compaq Visual Fortran (version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses of sodium-cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly to remove unnecessary routines, eliminate obsolete statements, introduce new ones, and include extended precision mode. The source program was able to solve three sample cases under protected transient conditions suggested in the literature: normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a loss-of-flow transient; and protected overpower transients. Comparisons were made with results from the time the NALAP code was acquired by the IEAv, back in the 1980s. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the use of gas as the reactor coolant to allow a Closed Brayton Cycle Loop (CBCL) to be used as a heat/electric converter. (author)

  19. NACRE II: an update of the NACRE compilation of charged-particle-induced thermonuclear reaction rates for nuclei with mass number A<16

    International Nuclear Information System (INIS)

    Xu, Y.; Takahashi, K.; Goriely, S.; Arnould, M.; Ohta, M.; Utsunomiya, H.

    2013-01-01

    An update of the NACRE compilation [3] is presented. This new compilation, referred to as NACRE II, reports thermonuclear reaction rates for 34 charged-particle-induced, two-body exoergic reactions on nuclides with mass number A < 16, over the temperature range 10^6 ≲ T ⩽ 10^10 K. Along with the 'adopted' rates, their low and high limits are provided. The new rates are available in electronic form as part of the Brussels Library (BRUSLIB) of nuclear data. The NACRE II rates also supersede the previous NACRE rates in the Nuclear Network Generator (NETGEN) for astrophysics (http://www.astro.ulb.ac.be/databases.html)
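    Compilations of this kind tabulate the adopted rate and its low/high limits on a temperature grid, and users interpolate between grid points. A small sketch of log-log interpolation over such a table in Python; the table values below are illustrative placeholders, not actual NACRE II rates:

```python
import math

# Hypothetical excerpt of a tabulated reaction rate versus temperature,
# mimicking the layout of compiled rate tables: T9 is the temperature
# in units of 10^9 K; columns 1-3 are adopted, low and high rates.
table = [
    (0.01, 1.0e-10, 8.0e-11, 1.2e-10),
    (0.1,  3.0e-4,  2.5e-4,  3.6e-4),
    (1.0,  7.0e+2,  6.0e+2,  8.0e+2),
]

def rate(t9, column=1):
    """Log-log interpolation between tabulated points, a common
    practice for smoothly varying thermonuclear rates."""
    xs = [math.log(row[0]) for row in table]
    ys = [math.log(row[column]) for row in table]
    x = math.log(t9)
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            f = (x - xs[i]) / (xs[i + 1] - xs[i])
            return math.exp(ys[i] + f * (ys[i + 1] - ys[i]))
    raise ValueError("temperature outside tabulated range")
```

    Passing `column=2` or `column=3` interpolates the low or high limit instead of the adopted rate.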

  20. 32 CFR 806b.19 - Information compiled in anticipation of civil action.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Information compiled in anticipation of civil action. 806b.19 Section 806b.19 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR... compiled in anticipation of civil action. Withhold records compiled in connection with a civil action or...

  1. 49 CFR 801.57 - Records compiled for law enforcement purposes.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Records compiled for law enforcement purposes. 801... compiled for law enforcement purposes. Pursuant to 5 U.S.C. 552(b)(7), any records compiled for law or..., would disclose investigative procedures and practices, or would endanger the life or security of law...

  2. A Compilation and Review of over 500 Geoscience Misconceptions

    Science.gov (United States)

    Francek, Mark

    2013-01-01

    This paper organizes and analyses over 500 geoscience misconceptions relating to earthquakes, earth structure, geologic resources, glaciers, historical geology, karst (limestone terrains), plate tectonics, rivers, rocks and minerals, soils, volcanoes, and weathering and erosion. Journal and reliable web resources were reviewed to discover (1) the…

  3. Groundwater-quality data associated with abandoned underground coal mine aquifers in West Virginia, 1973-2016: Compilation of existing data from multiple sources

    Science.gov (United States)

    McAdoo, Mitchell A.; Kozar, Mark D.

    2017-11-14

    This report describes a compilation of existing water-quality data associated with groundwater resources originating from abandoned underground coal mines in West Virginia. Data were compiled from multiple sources for the purpose of understanding the suitability of groundwater from abandoned underground coal mines for public supply, industrial, agricultural, and other uses. This compilation includes data collected for multiple individual studies conducted from July 13, 1973 through September 7, 2016. Analytical methods varied by the time period of data collection and requirements of the independent studies. This project identified 770 water-quality samples from 294 sites that could be attributed to abandoned underground coal mine aquifers originating from multiple coal seams in West Virginia.

  4. Global peat resources

    Energy Technology Data Exchange (ETDEWEB)

    Lappalainen, E. [ed.] [Geological Survey of Finland (Finland)

    1996-12-31

    The book provides a detailed review of the world's peat and peatland resources and their role in the biosphere. It was compiled by 68 peat experts. Reports present the valuable mire ecosystem, its characteristics, and the use of peatlands. Maps and photographs illustrate the distribution of mires and their special characteristics, including raised bogs, aapa mires, blanket bogs, mangrove swamps, swamp forests, etc. The book contains a total of 57 chapters, the bulk of them giving surveys of peat resources and use in individual countries. They are grouped under the headings: peatlands in the biosphere; general review; Europe; Asia; Africa; North America; Central and South America; Australia (and New Zealand); and use of peatlands. One chapter has been abstracted separately for the IEA Coal Research CD-ROM. 7 apps.

  5. AREVA - 2012 Reference document

    International Nuclear Information System (INIS)

    2013-03-01

    After a presentation of the person responsible for this Reference Document, of the statutory auditors, and of a summary of financial information, this report addresses the different risk factors: risk management and coverage, legal risk, industrial and environmental risk, operational risk, risk related to major projects, liquidity and market risk, and other risks (related to political and economic conditions, to the Group's structure, and to human resources). The next parts provide information about the issuer, a business overview (markets for nuclear power and renewable energies, customers and suppliers, the Group's strategy, operations), a brief presentation of the organizational structure, a presentation of properties, plants and equipment (principal sites, environmental issues which may affect these items), analysis of and comments on the Group's financial position and performance, a presentation of capital resources, a presentation of research and development activities (programs, patents and licenses), a brief description of financial objectives and profit forecasts or estimates, a presentation of administration, management and supervision bodies, a description of the operation of corporate bodies, and an overview of personnel, principal shareholders, and transactions with related parties, followed by a more detailed presentation of financial information concerning assets, financial positions and financial performance. Additional information regarding share capital is given, as well as an indication of major contracts, third-party information, available documents, and information on holdings

  6. Methodological challenges involved in compiling the Nahua pharmacopeia.

    Science.gov (United States)

    De Vos, Paula

    2017-06-01

    Recent work in the history of science has questioned the Eurocentric nature of the field and sought to include a more global approach that would serve to displace center-periphery models in favor of approaches that take seriously local knowledge production. Historians of Iberian colonial science have taken up this approach, which involves reliance on indigenous knowledge traditions of the Americas. These traditions present a number of challenges to modern researchers, including availability and reliability of source material, issues of translation and identification, and lack of systematization. This essay explores the challenges that emerged in the author's attempt to compile a pre-contact Nahua pharmacopeia, the reasons for these challenges, and the ways they may - or may not - be overcome.

  7. Reporting session of UWTF operation. Compilation of documents

    International Nuclear Information System (INIS)

    Shimizu, Kaoru; Togashi, Akio; Irinouchi, Shigenori

    1999-07-01

    This is a compilation of the papers and OHP transparencies presented, as well as the discussions and comments, on the occasion of the UWTF reporting session. UWTF stands for the Second Uranium Waste Treatment Facility, which was constructed for the compression of metallic wastes and used filters, which are part of the uranium-bearing solid wastes generated at the Tokai Works, Japan Nuclear Cycle Development Institute. UWTF has been processing wastes since June 4, 1998. In the session, based on one year of experience of UWTF operation, the difficulties met and suggestions to the waste sources are mainly discussed. A brief summary of the UWTF construction, a description of the waste treatment process, and the operation report for fiscal year 1998 are attached. (A. Yamamoto)

  8. COMPILATION OF LABORATORY SCALE ALUMINUM WASH AND LEACH REPORT RESULTS

    International Nuclear Information System (INIS)

    Harrington, S.J.

    2011-01-01

    This report compiles and analyzes all known wash and caustic leach laboratory studies. As further data are produced, this report will be updated. Included are aluminum mineralogical analysis results as well as a summary of the wash and leach procedures and results. Of the 177 underground storage tanks at Hanford, information was available for only five individual double-shell tanks, forty-one individual single-shell tanks (thirty-nine 100-series and two 200-series tanks), and twelve grouped tank wastes. Seven of the individual single-shell tank studies provided data for the percentage of aluminum removed as a function of time for various caustic concentrations and leaching temperatures. It was determined that in most cases increased leaching temperature, caustic concentration, and leaching time lead to increased dissolution of leachable aluminum solids.
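    The qualitative trend reported here (more aluminum dissolves at higher temperature, caustic concentration, and leaching time) can be sketched with a simple first-order dissolution model. This is an illustrative assumption about the shape of the curves, not a correlation taken from the report:

```python
import math

def fraction_leached(t_hr, k_per_hr, f_max):
    """First-order approach to a leachable-fraction plateau:
    f(t) = f_max * (1 - exp(-k * t)).

    k_per_hr : effective rate constant, assumed here to increase with
               leaching temperature and caustic concentration.
    f_max    : fraction of the aluminum that is leachable at all.
    """
    return f_max * (1.0 - math.exp(-k_per_hr * t_hr))

# Toy comparison after 24 hours: a hotter, more caustic leach
# (modelled only as a larger k) dissolves more of the aluminum.
mild = fraction_leached(24.0, 0.05, 0.8)
aggressive = fraction_leached(24.0, 0.20, 0.8)
```

    Both curves approach the same plateau `f_max`; the aggressive conditions simply reach it sooner, matching the time-temperature-concentration trend noted above.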

  9. [Version and compilation of Harikyuuhousouyou of Vietnamese medical book].

    Science.gov (United States)

    Du, Fengjuan; Xiao, Yongzhi

    2018-02-12

    Harikyuuhousouyou (《》) was written in 1827; its author is unknown. The book survives in only one version, held by the National Library of Vietnam. It consists of one volume and covers contraindications of acupuncture and moxibustion, meridian points, point locations, indications, and the therapeutic methods at extraordinary points. These are mainly cited from Zhen Jiu Da Quan (《》, Great Compendium on Acupuncture and Moxibustion) by XU Feng, Yi Xue Ru Men (《》, Elementary Medicine) by LI Chan, and Shou Shi Bao Yuan (《》, Longevity and Health Preservation) by GONG Tingxian, all of the Ming Dynasty. This paper introduces the hand-copied book with regard to the characteristics of its version and compilation. It shows that Vietnamese acupuncture absorbed Chinese medicine and emphasized clinical practice rather than theoretical statement.

  10. COMPILATION OF LABORATORY SCALE ALUMINUM WASH AND LEACH REPORT RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    HARRINGTON SJ

    2011-01-06

    This report compiles and analyzes all known wash and caustic leach laboratory studies. As further data are produced, this report will be updated. Included are aluminum mineralogical analysis results as well as a summary of the wash and leach procedures and results. Of the 177 underground storage tanks at Hanford, information was available for only five individual double-shell tanks, forty-one individual single-shell tanks (thirty-nine 100-series and two 200-series tanks), and twelve grouped tank wastes. Seven of the individual single-shell tank studies provided data for the percentage of aluminum removed as a function of time for various caustic concentrations and leaching temperatures. It was determined that in most cases increased leaching temperature, caustic concentration, and leaching time lead to increased dissolution of leachable aluminum solids.

  11. Neutron data compilation at the International Atomic Energy Agency

    International Nuclear Information System (INIS)

    Lemmel, H.D.; Attree, P.M.; Byer, T.A.; Good, W.M.; Hjaerne, L.; Konshin, V.A.; Lorens, A.

    1968-03-01

    The paper describes the present status of the neutron data compilation center of the IAEA Nuclear Data Unit, which is now in full operation. An outline is given of the principles and objectives, the working routines, and the services available within the two-fold functions of the Unit: a) to promote cooperation and international neutron data exchange between the four major centers at Brookhaven, Saclay, Obninsk and Vienna, which share responsibilities in a geographical distribution of labour; b) to collect systematically the neutron data arising from countries in East Europe, Asia, Australia, Africa, South and Central America and to offer certain services to these countries. A brief description of DASTAR, the DAta STorage And Retrieval system, and of CINDU, the data Catalog of the IAEA Nuclear Data Unit, is given. (author)

  12. Neutron data compilation at the International Atomic Energy Agency

    Energy Technology Data Exchange (ETDEWEB)

    Lemmel, H D; Attree, P M; Byer, T A; Good, W M; Hjaerne, L; Konshin, V A; Lorens, A [Nuclear Data Unit, International Atomic Energy Agency, Vienna (Austria)

    1968-03-15

    The paper describes the present status of the neutron data compilation center of the IAEA Nuclear Data Unit, which is now in full operation. An outline is given of the principles and objectives, the working routines, and the services available within the two-fold functions of the Unit: a) to promote cooperation and international neutron data exchange between the four major centers at Brookhaven, Saclay, Obninsk and Vienna, which share responsibilities in a geographical distribution of labour; b) to collect systematically the neutron data arising from countries in East Europe, Asia, Australia, Africa, South and Central America and to offer certain services to these countries. A brief description of DASTAR, the DAta STorage And Retrieval system, and of CINDU, the data Catalog of the IAEA Nuclear Data Unit, is given. (author)

  13. Compilation and evaluation of a Paso del Norte emission inventory

    Energy Technology Data Exchange (ETDEWEB)

    Funk, T.H.; Chinkin, L.R.; Roberts, P.T. [Sonoma Technology, Inc., 1360 Redwood Way, Suite C, 94954-1169 Petaluma, CA (United States); Saeger, M.; Mulligan, S. [Pacific Environmental Services, 5001 S. Miami Blvd., Suite 300, 27709 Research Triangle Park, NC (United States); Paramo Figueroa, V.H. [Instituto Nacional de Ecologia, Avenue Revolucion 1425, Nivel 10, Col. Tlacopac San Angel, Delegacion Alvaro Obregon, C.P., 01040, D.F. Mexico (Mexico); Yarbrough, J. [US Environmental Protection Agency - Region 6, 1445 Ross Avenue, Suite 1200, 75202-2733 Dallas, TX (United States)

    2001-08-10

    Emission inventories of ozone precursors are routinely used as input to comprehensive photochemical air quality models. Photochemical model performance and the development of effective control strategies rely on the accuracy and representativeness of an underlying emission inventory. This paper describes the tasks undertaken to compile and evaluate an ozone precursor emission inventory for the El Paso/Ciudad Juarez/Southern Dona Ana region. Point, area and mobile source emission data were obtained from local government agencies and were spatially and temporally allocated to a gridded domain using region-specific demographic and land-cover information. The inventory was then processed using the US Environmental Protection Agency (EPA) recommended Emissions Preprocessor System 2.0 (UAM-EPS 2.0) which generates emissions files compatible with the Urban Airshed Model (UAM). A top-down evaluation of the emission inventory was performed to examine how well the inventory represented ambient pollutant compositions. The top-down evaluation methodology employed in this study compares emission inventory ratios of non-methane hydrocarbon (NMHC)/nitrogen oxide (NO{sub x}) and carbon monoxide (CO)/NO{sub x} ratios to corresponding ambient ratios. Detailed NMHC species comparisons were made in order to investigate the relative composition of individual hydrocarbon species in the emission inventory and in the ambient data. The emission inventory compiled during this effort has since been used to model ozone in the Paso del Norte airshed (Emery et al., CAMx modeling of ozone and carbon monoxide in the Paso del Norte airshed. In: Proc of Ninety-Third Annual Meeting of Air and Waste Management Association, 18-22 June 2000, Air and Waste Management Association, Pittsburgh, PA, 2000)
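    The top-down evaluation described above reduces to comparing two kinds of ratios: emission ratios from the inventory and concentration ratios from ambient measurements. A compact sketch with hypothetical numbers (not values from the Paso del Norte inventory):

```python
# Compare emission-inventory ratios with ambient concentration ratios.
# If the inventory is representative, NMHC/NOx and CO/NOx should be
# similar in both data sets. All numbers below are hypothetical.
inventory = {"NMHC": 120.0, "NOx": 40.0, "CO": 800.0}    # tons/day
ambient = {"NMHC": 450.0, "NOx": 150.0, "CO": 3200.0}    # ppbC, ppb

def ratio(data, numerator, denominator):
    return data[numerator] / data[denominator]

for species in ("NMHC", "CO"):
    inv = ratio(inventory, species, "NOx")
    amb = ratio(ambient, species, "NOx")
    print(f"{species}/NOx: inventory {inv:.2f}, ambient {amb:.2f}, "
          f"relative difference {(inv - amb) / amb:+.0%}")
```

    A large disagreement in either ratio flags a likely under- or over-estimated source category, which can then be investigated bottom-up, for example through the detailed NMHC species comparisons mentioned above.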

  14. abc the aspectBench compiler for aspectJ a workbench for aspect-oriented programming language and compilers research

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    Aspect-oriented programming (AOP) is gaining popularity as a new way of modularising cross-cutting concerns. The aspectbench compiler (abc) is a new workbench for AOP research which provides an extensible research framework for both new language features and new compiler optimisations. This poste...

  15. Compilation of nuclear decay data used for dose calculations. Data for radionuclides not listed in ICRP publication 38

    Energy Technology Data Exchange (ETDEWEB)

    Endo, Akira; Yamaguchi, Yasuhiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Tamura, Tsutomu

    1999-07-01

    Nuclear decay data used for dose calculations were compiled for 162 nuclides with half-lives greater than or equal to 10 min that are not listed in ICRP Publication 38 (Publ. 38), together with their 28 daughter nuclides. An additional 14 nuclides considered important for fusion reactor facilities were also included. The data were compiled using decay data sets of the Evaluated Nuclear Structure Data File (ENSDF), the latest version as of August 1997. The data sets were checked for consistency by referring to recent literature and to NUBASE, the database of nuclear and decay properties of nuclides, and by using the ENSDF utility programs. Where necessary, the data sets were revised for format and syntax errors, level schemes, normalization records, and so on. The revised data sets were processed by EDISTR in order to calculate the energies and intensities of α particles, β particles, γ rays including annihilation photons, internal conversion electrons, X rays, and Auger electrons emitted in nuclear transformations of the radionuclides. For spontaneously fissioning nuclides, the average energies and intensities of neutrons, fission fragments, prompt γ rays, delayed γ rays, and β particles were also calculated. The compiled data are presented in two formats: the Publ. 38 format and the NUCDECAY format. This report provides the decay data in the Publ. 38 format along with decay scheme drawings. The data will be widely used for internal and external dose calculations in radiation protection. (author)
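    One of the basic quantities a processing code such as EDISTR derives from a decay data set is the mean energy emitted per decay for each radiation type: the intensity-weighted sum over the individual emission lines. A minimal sketch in Python, with two illustrative gamma-ray lines (placeholder values, not an evaluated data set):

```python
# Mean gamma-ray energy per decay: sum over emission lines of
# (line energy) * (emission probability per decay).
gamma_lines = [  # (energy in keV, intensity per decay) - placeholders
    (661.7, 0.851),
    (32.0, 0.058),
]

mean_energy_keV = sum(energy * intensity for energy, intensity in gamma_lines)
print(f"mean gamma-ray energy per decay: {mean_energy_keV:.1f} keV")
```

    The same intensity-weighted sum, applied per radiation type, is what feeds internal and external dose calculations downstream.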

  16. Library resources on the Internet

    Science.gov (United States)

    Buchanan, Nancy L.

    1995-07-01

    Library resources are prevalent on the Internet. Library catalogs, electronic books, electronic periodicals, periodical indexes, reference sources, and U.S. Government documents are available by telnet, Gopher, World Wide Web, and FTP. Comparatively few copyrighted library resources are available freely on the Internet. Internet implementations of library resources can add useful features, such as full-text searching. There are discussion lists, Gophers, and World Wide Web pages to help users keep up with new resources and changes to existing ones. The future will bring more library resources, more types of library resources, and more integrated implementations of such resources to the Internet.

  17. Compilation of monitoring data on environmental concentration and pharmaceuticals; Zusammenstellung von Monitoringdaten zu Umweltkonzentrationen und Arzneimitteln

    Energy Technology Data Exchange (ETDEWEB)

    Bergmann, Axel; Fohrmann, Reinhard [IWW Rheinisch-Westfaelisches Institut fuer Wasser Beratungs- und Entwicklungsgesellschaft mbH, Muelheim an der Ruhr (Germany); Weber, Frank-Andreas [IWW Rheinisch-Westfaelisches Institut fuer Wasser Beratungs- und Entwicklungsgesellschaft mbH, Biebesheim am Rhein (Germany)

    2011-10-15

    In a comprehensive literature review we compiled an inventory of German and European monitoring data on the occurrence and behavior of pharmaceuticals in the environment. Environmental concentrations measured in various field campaigns and results of ecotoxicological and physico-chemical investigations were integrated in three databases. The analysis of these databases was used to identify priority pharmaceuticals and to suggest strategies for further monitoring. The database MEC reports 274 pharmaceuticals (both human and veterinary pharmaceuticals, of which 27 are metabolites), for which measured concentrations were available for one of the matrices sewage effluent, surface water, groundwater, drinking water, sewage sludge, manure, soil or sediment (10,150 database entries). The database OeKOTOX compiles 251 pharmaceuticals, for which ecotoxicological effect concentrations for at least one test organism are available in the literature, and the database "Umweltverhalten" includes physico-chemical parameters of 183 compounds. The compiled citations of the relevant literature (1,382 citations) were provided for further use in the bibliographic software Reference Manager. The analysis of the databases shows that for only a subset of 70 pharmaceuticals measured concentrations can be evaluated based on ecotoxicological effect concentrations. The estimation of PNEC values (Predicted No Effect Concentration) allowed for the identification of 19 pharmaceuticals with sufficient and 9 pharmaceuticals with poor ecotoxicological data which presumably endanger ecosystems in at least one river section in Germany. Special attention should be paid to "novel" pharmaceuticals, for which missing environmental and/or ecotoxicological data prevent a reliable risk assessment, but dramatically increasing consumption rates point to a high risk potential. The prioritization of pharmaceuticals presented by the authors considers the ecotoxicological effect concentrations, the occurrence

  18. Compilation of monitoring data on environmental concentration and pharmaceuticals; Zusammenstellung von Monitoringdaten zu Umweltkonzentrationen und Arzneimitteln

    Energy Technology Data Exchange (ETDEWEB)

    Bergmann, Axel; Fohrmann, Reinhard [IWW Rheinisch-Westfaelisches Institut fuer Wasser Beratungs- und Entwicklungsgesellschaft mbH, Muelheim an der Ruhr (Germany); Weber, Frank-Andreas [IWW Rheinisch-Westfaelisches Institut fuer Wasser Beratungs- und Entwicklungsgesellschaft mbH, Biebesheim am Rhein (Germany)

    2011-10-15

    In a comprehensive literature review we compiled an inventory of German and European monitoring data on the occurrence and behavior of pharmaceuticals in the environment. Environmental concentrations measured in various field campaigns and results of ecotoxicological and physico-chemical investigations were integrated in three databases. The analysis of these databases was used to identify priority pharmaceuticals and to suggest strategies for further monitoring. The database MEC reports 274 pharmaceuticals (both human and veterinary pharmaceuticals, of which 27 are metabolites), for which measured concentrations were available for one of the matrices sewage effluent, surface water, groundwater, drinking water, sewage sludge, manure, soil or sediment (10,150 database entries). The database OeKOTOX compiles 251 pharmaceuticals, for which ecotoxicological effect concentrations for at least one test organism are available in the literature, and the database "Umweltverhalten" includes physico-chemical parameters of 183 compounds. The compiled citations of the relevant literature (1,382 citations) were provided for further use in the bibliographic software Reference Manager. The analysis of the databases shows that for only a subset of 70 pharmaceuticals measured concentrations can be evaluated based on ecotoxicological effect concentrations. The estimation of PNEC values (Predicted No Effect Concentration) allowed for the identification of 19 pharmaceuticals with sufficient and 9 pharmaceuticals with poor ecotoxicological data which presumably endanger ecosystems in at least one river section in Germany. Special attention should be paid to "novel" pharmaceuticals, for which missing environmental and/or ecotoxicological data prevent a reliable risk assessment, but dramatically increasing consumption rates point to a high risk potential. The prioritization of pharmaceuticals presented by the authors considers the

  19. The Compilation of the Shona–English Biomedical Dictionary: Problems and Challenges

    Directory of Open Access Journals (Sweden)

    Nomalanga Mpofu

    2011-10-01

    Full Text Available

    ABSTRACT: The bilingual Shona–English dictionary of biomedical terms, Duramazwi reUrapi neUtano, was compiled with the aim of improving the efficiency of communication between doctor and patient. The dictionary is composed of terms from both modern and traditional medicinal practices. The article seeks to look at the methods of production of the dictionary, the presentation of entries in the dictionary, and the problems and challenges encountered in the compilation process, namely, developing Shona medical terminology in the cultural context and especially the aspect of equivalence between English and Shona biomedical terms.

    Keywords: BIOMEDICAL, ADOPTIVES, ENTRIES, SYNONYMS, CROSS-REFERENCES, IDIOMS, CIRCUMLOCUTION, STANDARDISATION, HEADWORD, EQUIVALENCE, VARIANTS, DEFINITION, CULTURE, EUPHEMISMS, MODERN, TRADITIONAL, MONOLINGUAL, BILINGUAL, CORPUS, BORROWING, SHONA, COMMUNICATION

    *****

    SUMMARY: The compilation of the Shona–English biomedical dictionary: problems and challenges. The bilingual Shona–English dictionary of biomedical terms, Duramazwi reUrapi neUtano, was compiled with the aim of improving the effectiveness of communication between doctor and patient. The dictionary consists of terms from both modern and traditional medicinal practices. The article examines the methods by which the dictionary came into being, the presentation of the entries in the dictionary, and the problems and challenges encountered in the compilation process, namely the development of Shona medical terminology within the cultural context and especially the aspect of equivalence between English and Shona biomedical terms.

    Sleutelwoorde: BIOMEDIES, LEENWOORDE, INSKRYWINGS, SINONIEME, KRUISVERWYSINGS, IDIOME, OMSKRYWING, STANDAARDISASIE, TREFWOORD, EKWIVALENSIE, WISSELVORME, DEFINISIE, KULTUUR, EUFEMISMES, MODERN, TRADISIONEEL, EENTALIG, TWEETALIG, KORPUS, ONTLENING, KOMMUNIKASIE, SJONA

  20. Electrical engineering a pocket reference

    CERN Document Server

    Schmidt-Walter, Heinz

    2007-01-01

    This essential reference offers you a well-organized resource for accessing the basic electrical engineering knowledge you need for your work. Whether you're an experienced engineer who appreciates an occasional refresher in key areas, or a student preparing to enter the field, Electrical Engineering: A Pocket Reference provides quick and easy access to fundamental principles and their applications. You also find an extensive collection of time-saving equations that help simplify your daily projects.Supported with more than 500 diagrams and figures, 60 tables, and an extensive index, this uniq

  1. Nuclear science references coding manual

    International Nuclear Information System (INIS)

    Ramavataram, S.; Dunford, C.L.

    1996-08-01

    This manual is intended as a guide to Nuclear Science References (NSR) compilers. The basic conventions followed at the National Nuclear Data Center (NNDC), which are compatible with the maintenance and updating of and retrieval from the Nuclear Science References (NSR) file, are outlined. In Section II, the structure of the NSR file, such as the valid record identifiers, record contents and text fields, as well as the major TOPICS for which [KEYWORDS] are prepared, are enumerated. Relevant comments regarding a new entry into the NSR file, assignment of [KEYNO], generation of [SELECTRS] and linkage characteristics are also given in Section II. In Section III, a brief definition of the Keyword abstract is given, followed by specific examples; for each TOPIC, the criteria for inclusion of an article as an entry into the NSR file as well as coding procedures are described. Authors preparing Keyword abstracts either to be published in a Journal (e.g., Nucl. Phys. A) or to be sent directly to NNDC (e.g., Phys. Rev. C) should follow the illustrations in Section III. The scope of the literature covered at the NNDC, the categorization into Primary and Secondary sources, etc., is discussed in Section IV. Useful information regarding permitted character sets, recommended abbreviations, etc., is given in Section V as Appendices

  2. Nuclear structure references coding manual

    International Nuclear Information System (INIS)

    Ramavataram, S.; Dunford, C.L.

    1984-02-01

    This manual is intended as a guide to Nuclear Structure References (NSR) compilers. The basic conventions followed at the National Nuclear Data Center (NNDC), which are compatible with the maintenance and updating of and retrieval from the Nuclear Structure References (NSR) file, are outlined. The structure of the NSR file such as the valid record identifiers, record contents, text fields as well as the major topics for which [KEYWORDS] are prepared are enumerated. Relevant comments regarding a new entry into the NSR file, assignment of [KEYNO ], generation of [SELECTRS] and linkage characteristics are also given. A brief definition of the Keyword abstract is given followed by specific examples; for each TOPIC, the criteria for inclusion of an article as an entry into the NSR file as well as coding procedures are described. Authors submitting articles to Journals which require Keyword abstracts should follow the illustrations. The scope of the literature covered at NNDC, the categorization into Primary and Secondary sources, etc. is discussed. Useful information regarding permitted character sets, recommended abbreviations, etc. is given

  3. Bibliographical compilation of PhD theses produced at the Faculty ...

    African Journals Online (AJOL)

    USER

    2015-06-01

    Jun 1, 2015 ... Faculty of Science, Bayero University, Kano from 2004 to 2014. Aim was to enable the ... bibliography refers to the systematic list of books and other information ... Project (West Zone) and Management. Issues. Unpublished ...

  4. Not Mere Lexicographic Cosmetics: The Compilation and Structural ...

    African Journals Online (AJOL)

    Riette Ruthven

    ...REFERENCE SKILLS, OUTER TEXTS, LEMMATA, DICTIONARY ARTICLE ..... lexicographic conventions is explained and illustrated by means of articles extracted ..... effect paradigmatic relations as in itshelo (cello) where users are referred to all.

  5. Compilation of PRF Canyon Floor Pan Sample Analysis Results

    Energy Technology Data Exchange (ETDEWEB)

    Pool, Karl N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Minette, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wahl, Jon H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Greenwood, Lawrence R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coffey, Deborah S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McNamara, Bruce K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bryan, Samuel A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Scheele, Randall D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Delegard, Calvin H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sinkov, Sergey I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Soderquist, Chuck Z. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fiskum, Sandra K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brown, Garrett N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clark, Richard A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-06-30

    On September 28, 2015, debris collected from the PRF (236-Z) canyon floor, Pan J, was observed to exhibit chemical reaction. The material had been transferred from the floor pan to a collection tray inside the canyon the previous Friday. Work in the canyon was stopped to allow Industrial Hygiene to perform monitoring of the material reaction. Canyon floor debris that had been sealed out was sequestered at the facility, a recovery plan was developed, and drum inspections were initiated to verify no additional reactions had occurred. On October 13, in-process drums containing other Pan J material were inspected and showed some indication of chemical reaction, limited to discoloration and degradation of inner plastic bags. All Pan J material was sealed back into the canyon and returned to collection trays. Based on the high airborne levels in the canyon during physical debris removal, ETGS (Encapsulation Technology Glycerin Solution) was used as a fogging/lock-down agent. On October 15, subject matter experts confirmed a reaction had occurred between nitrates (both Plutonium Nitrate and Aluminum Nitrate Nonahydrate (ANN) are present) in the Pan J material and the ETGS fixative used to lower airborne radioactivity levels during debris removal. Management stopped the use of fogging/lock-down agents containing glycerin on bulk materials, declared a Management Concern, and initiated the Potential Inadequacy in the Safety Analysis determination process. Additional drum inspections and laboratory analysis of both reacted and unreacted material are planned. This report compiles the results of many different sample analyses conducted by the Pacific Northwest National Laboratory on samples collected from the Plutonium Reclamation Facility (PRF) floor pans by the CH2MHill’s Plateau Remediation Company (CHPRC). Revision 1 added Appendix G that reports the results of the Gas Generation Rate and methodology. The scope of analyses requested by CHPRC includes the determination of

  6. Contents and assessment of basic tourism resources

    OpenAIRE

    Knezevic, Rade

    2008-01-01

    The article looks at the attractive factors of basic tourism resources and the structure of their attractions. The general term ‘resource’ refers to both natural and anthropogenic resources, while the content of this concept refers to elements used in creating a tourism product. Basic tourism resources are the most important factors of tourism processes, with a vital attribute of direct and indirect tourism resources being their substitutability. Natural (biotropic) resources are consid...

  7. Technology: Trigger for Change in Reference Librarianship.

    Science.gov (United States)

    Hallman, Clark N.

    1990-01-01

    Discussion of the influence of technological developments on social change focuses on the effects of information technology on academic reference librarianship. Highlights include reference skills; electronic resources; microcomputer technology; online catalogs; interaction and communication with users; the need to teach information skills; and…

  8. Resource Guide for Crisis Management in Schools.

    Science.gov (United States)

    LaPointe, Richard T.; And Others

    A crisis can occur at any time, whether or not a school's staff plans for it. This resource guide is a compilation of user-friendly examples of policies, procedures, guidelines, checklists, and forms to help Virginia schools develop and implement a systematic crisis-management plan. Chapter 1 provides an introductory overview of the essential…

  9. Resources on Academic Bargaining and Governance.

    Science.gov (United States)

    Tice, Terrence N.

    In recent years several bibliographies have been compiled on the subject of collective bargaining in higher education. This publication is an attempt to provide laymen with an up-to-date and comprehensive bibliography. Citations are presented in three categories: (1) agencies, bibliographies, periodicals, and other basic resources; (2) public…

  10. Resource Papers No. 4-12.

    Science.gov (United States)

    National Council for Resource Development, Washington, DC.

    This document compiles nine papers issued by the National Council for Resource Development. Papers include: (1) "How to Be Successful at Grantsmanship--Guidelines for Proposal Writing--Foundation Proposals"; (2) "A Federal Glossary" (acronyms); (3) "Special Projects" (a working model for an institutional development office); (4) "The Role of the…

  11. Immigration Stress: Families in Crisis. Resource Guide.

    Science.gov (United States)

    Leon County Schools, Tallahassee, FL.

    This resource guide has been compiled to assist teachers of English for Speakers of Other Languages (ESOL) in meeting the needs of immigrant families. Its purpose is to help reduce immigrant stress by making important information readily available to immigrant families. The guide is divided into the major categories of socialization, education,…

  12. Forest Genetic Resources Conservation and Management

    DEFF Research Database (Denmark)

    Ukendt, FAO; Ukendt, DFSC; Ukendt, ICRAF

    FAO, IPGRI/SAFORGEN, DFSC and ICRAF have cooperated on the compilation of 17 booklets on the state of Forest Genetic Resources for the countries listed below. When ordering your book please remember to write the country required on the email. Benin, Burkina Faso, Cote d'Ivoire, Ethiopia, Gambia, Guinee, Ghana, Kenya, Mali, Mauritania, Niger, North of Nigeria, North Cameroon, Senegal, Sudan, Tchad and Togo.

  13. Silicon compilation: From the circuit to the system

    Science.gov (United States)

    Obrien, Keven

    The methodology used for the compilation of silicon from a behavioral level to a system level is presented. The aim was to link the heretofore unrelated areas of high-level synthesis and system-level design. This link will play an important role in the development of future design automation tools, as it will allow hardware/software co-designs to be synthesized. A design methodology is presented that, through the use of an intermediate representation, SOLAR, allows a System level Design Language (SDL) to be combined with a Hardware Description Language (VHDL). Two main steps are required in order to transform this specification into a synthesizable one. Firstly, a system-level synthesis step including partitioning and communication synthesis is required in order to split the model into a set of interconnected subsystems, each of which will be processed by a high-level synthesis tool. For this latter step AMICAL is used; it allows powerful scheduling techniques to be applied that accept very abstract descriptions of control-flow-dominated circuits as input, and it generates interconnected RTL blocks that may feed existing logic-level synthesis tools.

  14. Development of automatic cross section compilation system for MCNP

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Sakurai, Kiyoshi

    1999-01-01

    Development of a code system to automatically convert cross-sections for MCNP is in progress. The NJOY code is, in general, used to convert the data compiled in the ENDF format (Evaluated Nuclear Data Files by BNL) into the cross-section libraries required by various reactor physics codes. While the cross-section library FSXLIB-J3R2 was already converted from the JENDL-3.2 version of the Japanese Evaluated Nuclear Data Library for the continuous energy Monte Carlo code MCNP, that library contains only cross-sections at room temperature (300 K). In response to user requests for cross-sections at higher temperatures, say 600 K or 900 K, a code system named 'autonj' is under development to provide cross-section libraries of arbitrary temperature for the MCNP code. This system can accept any of the data formats adopted in JENDL, including those that cannot be treated by the NJOY code. The input preparation that is otherwise required for every nuclide in an NJOY execution is greatly reduced by allowing as many nuclides as the user wants to be converted in one execution. A few MCNP runs were made for verification purposes using the two libraries FSXLIB-J3R2 and the output of 'autonj'. The almost identical MCNP results, within statistical errors, show that the 'autonj' output library is correct. In FY 1998, the system will be completed, and in FY 1999, the user's manual will be published. (K. Tsuchihashi)
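    The input-preparation saving described in the abstract, preparing conversion input once for a whole list of nuclides instead of once per nuclide, can be sketched as follows. Every name here (the function, the nuclide labels, the deck keywords) is an illustrative assumption; the keywords only gesture at NJOY-like modules and are not real NJOY input, and 'autonj' itself is not specified in this record.

    ```python
    # Sketch of batching cross-section conversion over many nuclides in one pass,
    # in the spirit of 'autonj'. Deck contents are invented for illustration.

    def build_decks(nuclides, temperature_k):
        """Generate one conversion deck per nuclide for a requested temperature."""
        decks = {}
        for nuc in nuclides:
            decks[nuc] = "\n".join([
                "reconr",                   # reconstruct pointwise cross-sections
                f"broadr {temperature_k}",  # Doppler-broaden to the requested temperature
                f"acer {nuc}",              # write the MCNP (ACE-format) library entry
            ])
        return decks

    decks = build_decks(["Fe-56", "O-16"], 600)
    print(len(decks))  # -> 2 (one deck per nuclide, prepared in a single run)
    ```

    The point of the sketch is the single loop: one invocation covers every requested nuclide at the requested temperature, which is the repeated-input-preparation saving the abstract attributes to 'autonj'.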

  15. Mars Pathfinder and Mars Global Surveyor Outreach Compilation

    Science.gov (United States)

    1999-09-01

    This videotape is a compilation of the best NASA JPL (Jet Propulsion Laboratory) videos of the Mars Pathfinder and Mars Global Surveyor missions. The mission is described using animation and narration as well as some actual footage of the entire sequence of mission events. Included within these animations are the spacecraft orbit insertion; descent to the Mars surface; deployment of the airbags and instruments; and exploration by Sojourner, the Mars rover. JPL activities at spacecraft control during significant mission events are also included at the end. The spacecraft cameras pan the surrounding Mars terrain and film Sojourner traversing the surface and inspecting rocks. A single, brief, processed image of the Cydonia region (Mars face) at an oblique angle from the Mars Global Surveyor is presented. A description of the Mars Pathfinder mission, instruments, landing and deployment process, Mars approach, spacecraft orbit insertion, rover operation are all described using computer animation. Actual color footage of Sojourner as well as a 360 deg pan of the Mars terrain surrounding the spacecraft is provided. Lower quality black and white photography depicting Sojourner traversing the Mars surface and inspecting Martian rocks also is included.

  16. The significant event compilation tree-sect: Theory and application

    International Nuclear Information System (INIS)

    Ishack, G.A.

    1990-01-01

    The Significant Event Compilation Tree (SECT) is a computer programme that was developed by staff of the Canadian Atomic Energy Control Board during the period 1984-86. Its primary purpose is to link seemingly unrelated events, or parts of events, that could have occurred at different points in time at various nuclear power plants. Using such a software tool aids in the identification of potential paths and/or scenarios that: a. may not have been foreseen in the accident analysis (including fault tree verification), b. could lead to a certain failure; or c. could have been caused by a certain initiating event (which may have ended or been terminated at an earlier stage). This paper describes: a. the basic idea of SECT; b. the criteria whereby events are selected and coded; c. the options available to the user; d. an example of the programme's application in Canada; and e. a demonstration of its possible use in conjunction with the NEA-IRS
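    The linking of seemingly unrelated events that SECT performs can be pictured as chain-finding over coded events. The sketch below invents a trivial (cause, effect) event coding purely for illustration; SECT's actual selection and coding criteria are those described in the paper, not this toy scheme.

    ```python
    from collections import defaultdict

    # Toy SECT-style linking: events coded as (initiating condition, outcome);
    # chains are found by following outcomes that match later initiating
    # conditions. The event coding here is invented for illustration only.

    def find_chains(events, start, goal, path=None):
        """Depth-first search for chains of events leading from start to goal."""
        graph = defaultdict(list)
        for cause, effect in events:
            graph[cause].append(effect)
        path = [start] if path is None else path
        if start == goal:
            return [path]
        chains = []
        for nxt in graph[start]:
            if nxt not in path:  # avoid revisiting an event within one chain
                chains.extend(find_chains(events, nxt, goal, path + [nxt]))
        return chains

    events = [("valve stuck", "pressure rise"), ("pressure rise", "relief opens"),
              ("valve stuck", "alarm")]
    print(find_chains(events, "valve stuck", "relief opens"))
    # -> [['valve stuck', 'pressure rise', 'relief opens']]
    ```

    Each recovered chain is a candidate scenario of the kind the abstract mentions: a path that may not have been foreseen in the accident analysis, assembled from event fragments recorded at different plants and times.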

  17. Assessment of the current status of basic nuclear data compilations

    International Nuclear Information System (INIS)

    1987-03-01

    The Panel on Basic Nuclear Data Compilations believes that it is of paramount importance to achieve as short a cycle time as is reasonably possible in the evaluation and publication of the A-chains. The panel, therefore, has concentrated its efforts on identifying those factors that have tended to increase the cycle time and on finding ways to remove the obstacles. An important step was made during the past year to address reduction of the size of the published evaluations - another factor that can reduce cycle time. The Nuclear Structure and Decay Data (NSDD) network adopted new format guidelines, which generated a 30% reduction by eliminating redundancy and/or duplication. A current problem appears to be the rate at which the A-chains are being evaluated, which, on the average, is only about one-half of what it could be. It is hoped that the situation will improve with an increase in the number of foreign centers and an increase in efficiency as more A-chains are recycled by the same evaluator who did the previous evaluation. Progress has been made in the area of on-line access to the nuclear data files in that a subcommittee report describing the requirements of an on-line system has been produced. 2 tabs

  18. Compilation of benchmark results for fusion related Nuclear Data

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Wada, Masayuki; Oyama, Yukio; Ichihara, Chihiro; Makita, Yo; Takahashi, Akito

    1998-11-01

    This report compiles results of benchmark tests for validation of evaluated nuclear data to be used in nuclear designs of fusion reactors. Parts of the results were obtained under activities of the Fusion Neutronics Integral Test Working Group organized by the members of both the Japan Nuclear Data Committee and the Reactor Physics Committee. The following three benchmark experiments were used for the tests: (i) the leakage neutron spectrum measurement experiments from slab assemblies at the D-T neutron source at FNS/JAERI, (ii) in-situ neutron and gamma-ray measurement experiments (so-called clean benchmark experiments) also at FNS, and (iii) the pulsed sphere experiments for leakage neutron and gamma-ray spectra at the D-T neutron source facility of Osaka University, OKTAVIAN. Evaluated nuclear data tested were JENDL-3.2, JENDL Fusion File, FENDL/E-1.0 and newly selected data for FENDL/E-2.0. Comparisons of benchmark calculations with the experiments for twenty-one elements, i.e., Li, Be, C, N, O, F, Al, Si, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zr, Nb, Mo, W and Pb, are summarized. (author). 65 refs

  19. Compiling Utility Requirements For New Nuclear Power Plant Project

    International Nuclear Information System (INIS)

    Patrakka, Eero

    2002-01-01

    Teollisuuden Voima Oy (TVO) submitted in November 2000 to the Finnish Government an application for a Decision-in-Principle concerning the construction of a new nuclear power plant in Finland. The actual investment decision can be made only after a positive decision has been made by the Government and the Parliament. Parallel to the licensing process, technical preparedness has been maintained so that the procurement process can commence without delay, when needed. This includes the definition of requirements for the plant and preliminary preparation of bid inquiry specifications. The core of the technical requirements corresponds to the specifications presented in the European Utility Requirement (EUR) document, compiled by major European electricity producers. Quite naturally, a number of modifications to the EUR document are needed to take into account the country- and site-specific conditions as well as the experience gained in the operation of the existing NPP units. Along with the EUR-related requirements concerning the nuclear island and power generation plant, requirements are specified for the scope of supply as well as for a variety of issues related to project implementation. (author)

  20. A compilation of structure functions in deep inelastic scattering

    International Nuclear Information System (INIS)

    Gehrmann, T.; Roberts, R.G.; Whalley, M.R.

    1999-01-01

    A compilation of all the available data on the unpolarized structure functions F2 and xF3, R = σ_L/σ_T, the virtual photon asymmetries A1 and A2, and the polarized structure functions g1 and g2, from deep inelastic lepton scattering off protons, deuterium and nuclei is presented. The relevant experiments at CERN, DESY, Fermilab and SLAC from 1991, the date of our earlier review [1], to the present day are covered. A brief general theoretical introduction is given followed by the data presented both in tabular and graphical form and, for the F2 and xF3 data, the predictions based on the MRST98 and CTEQ4 parton distribution functions are also displayed. All the data in this review, together with data on a wide variety of other reactions, can be found in and retrieved from the Durham-RAL HEP Databases on the World-Wide-Web (http://durpdg.dur.ac.uk/HEPDATA). (author)

  1. OMPC: an Open-Source MATLAB-to-Python Compiler.

    Science.gov (United States)

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB(R), the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose the Open-source MATLAB(R)-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB(R) functions into Python programs. The imported MATLAB(R) modules will run independently of MATLAB(R), relying on Python's numerical and scientific libraries. Python offers a stable and mature open-source platform that, in many respects, surpasses commonly used, expensive commercial closed-source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB(R). OMPC is available at http://ompc.juricap.com.
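    The "syntax adaptation" OMPC describes can be illustrated with a toy source-to-source rewrite. The regex sketch below is my own simplification covering only a few surface forms; OMPC's real translator is a full parser with runtime emulation, not a handful of regexes.

    ```python
    import re

    def matlab_to_python(expr):
        """Toy translation of a few MATLAB surface forms into Python syntax."""
        expr = expr.replace("^", "**")           # power operator
        expr = re.sub(r"%.*$", "", expr)         # strip a MATLAB end-of-line comment
        expr = re.sub(r"(\w+)\s*=\s*\[(.*?)\]",  # normalize `name = [...]` spacing
                      r"\1 = [\2]", expr)        # (MATLAB row vector, Python list)
        expr = expr.replace(";", "")             # MATLAB's output suppression
        return expr.strip()

    print(matlab_to_python("y = x^2;  % square"))  # -> y = x**2
    ```

    Real MATLAB-to-Python translation also has to emulate semantics (1-based indexing, copy-on-write arrays, `end` in subscripts), which is why OMPC pairs syntax adaptation with an emulation layer rather than stopping at rewrites like these.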

  2. Research at GANIL. A compilation 1996-1997

    Energy Technology Data Exchange (ETDEWEB)

    Balanzat, E.; Bex, M.; Galin, J.; Geswend, S. [eds.]

    1998-12-01

    The present compilation gives an overview of experimental results obtained with the GANIL facility during the period 1996-1997. It includes nuclear physics activities as well as interdisciplinary research. The scientific domain presented here extends well beyond traditional nuclear physics and includes atomic physics, condensed matter physics, nuclear astrophysics, radiation chemistry, radiobiology as well as applied physics. In the nuclear physics field, many new results have been obtained concerning nuclear structure as well as the dynamics of nuclear collisions and nuclear disassembly of complex systems. Results presented deal in particular with the problem of energy equilibration, timescales and the origin of multifragmentation. Nuclear structure studies using both stable and radioactive beams deal with halo systems, the study of shell closures far from stability, the existence of nuclear molecules, as well as measurements of fundamental data such as half-lives, nuclear masses, nuclear radii, and quadrupole and magnetic moments. In addition to the traditional fields of atomic and solid state physics, new themes such as radiation chemistry and radiobiology are progressively being tackled. (K.A.)

  3. Sustainability of common pool resources

    OpenAIRE

    Timilsina, Raja Rajendra; Kotani, Koji; Kamijo, Yoshio

    2017-01-01

    Sustainability has become a key issue in managing natural resources together with growing concerns for capitalism, environmental and resource problems. We hypothesize that the ongoing modernization of competitive societies, which we refer to as "capitalism," affects human nature for utilizing common pool resources, thus compromising sustainability. To test this hypothesis, we design and implement a set of dynamic common pool resource games and experiments in the following two types of Nepales...

  4. 2007 Electronic Reference Services In Nigerian Law Libraries.

    African Journals Online (AJOL)

    ezra

    interaction or interviewing techniques. The ... change. Abid, (2002), observed it is now common to find reference resources such as dictionaries .... Com,) Live Helper http:www. Wehelper. ..... sources, resources and strategies for legal research ...

  5. Evaluation of users' perception and satisfaction with reference ...

    African Journals Online (AJOL)

    Evaluation of users' perception and satisfaction with reference services in Olusegun ... limitations in good internet facility for searching online reference resources, ... It is recommended that there is need for short courses in customer care to be ...

  6. Standard Reference Tables -

    Data.gov (United States)

    Department of Transportation — The Standard Reference Tables (SRT) provide consistent reference data for the various applications that support Flight Standards Service (AFS) business processes and...

  7. Areva, reference document 2006

    International Nuclear Information System (INIS)

    2006-01-01

    This reference document contains information on the AREVA group's objectives, prospects and development strategies, particularly in Chapters 4 and 7. It contains information on the markets, market shares and competitive position of the AREVA group. Content: - 1 Person responsible for the reference document and persons responsible for auditing the financial statements; - 2 Information pertaining to the transaction (Not applicable); - 3 General information on the company and its share capital: Information on AREVA, on share capital and voting rights, Investment certificate trading, Dividends, Organization chart of AREVA group companies, Equity interests, Shareholders' agreements; - 4 Information on company operations, new developments and future prospects: Overview and strategy of the AREVA group, The Nuclear Power and Transmission and Distribution markets, The energy businesses of the AREVA group, Front End division, Reactors and Services division, Back End division, Transmission and Distribution division, Major contracts, The principal sites of the AREVA group, AREVA's customers and suppliers, Sustainable Development and Continuous Improvement, Capital spending programs, Research and development programs, intellectual property and trademarks, Risk and insurance; - 5 Assets - Financial position - Financial performance: Analysis of and comments on the group's financial position and performance, 2006 Human Resources Report, Environmental Report, Consolidated financial statements, Notes to the consolidated financial statements, AREVA SA financial statements, Notes to the corporate financial statements; 6 - Corporate Governance: Composition and functioning of corporate bodies, Executive compensation, Profit-sharing plans, AREVA Values Charter, Annual Combined General Meeting of Shareholders of May 3, 2007; 7 - Recent developments and future prospects: Events subsequent to year-end closing for 2006, Outlook; 8 - Glossary; 9 - Table of concordance

  8. Areva reference document 2007

    International Nuclear Information System (INIS)

    2008-01-01

    This reference document contains information on the AREVA group's objectives, prospects and development strategies, particularly in Chapters 4 and 7. It also contains information on the markets, market shares and competitive position of the AREVA group. Content: 1 - Person responsible for the reference document and persons responsible for auditing the financial statements; 2 - Information pertaining to the transaction (not applicable); 3 - General information on the company and its share capital: Information on Areva, Information on share capital and voting rights, Investment certificate trading, Dividends, Organization chart of AREVA group companies, Equity interests, Shareholders' agreements; 4 - Information on company operations, new developments and future prospects: Overview and strategy of the AREVA group, The Nuclear Power and Transmission and Distribution markets, The energy businesses of the AREVA group, Front End division, Reactors and Services division, Back End division, Transmission and Distribution division, Major contracts, Principal sites of the AREVA group, AREVA's customers and suppliers, Sustainable Development and Continuous Improvement, Capital spending programs, Research and Development programs, Intellectual Property and Trademarks, Risk and insurance; 5 - Assets - Financial position - Financial performance: Analysis of and comments on the group's financial position and performance, Human Resources report, Environmental report, Consolidated financial statements 2007, Notes to the consolidated financial statements, Annual financial statements 2007, Notes to the corporate financial statements; 6 - Corporate governance: Composition and functioning of corporate bodies, Executive compensation, Profit-sharing plans, AREVA Values Charter, Annual Ordinary General Meeting of Shareholders of April 17, 2008; 7 - Recent developments and future prospects: Events subsequent to year-end closing for 2007, Outlook; Glossary; Table of concordance

  9. JavaScript programmer's reference

    CERN Document Server

    Valentine, Thomas

    2013-01-01

    JavaScript Programmer's Reference is an invaluable resource that won't stray far from your desktop (or your tablet!). It contains detailed information on every JavaScript object and command, and combines that reference with practical examples showcasing how you can use those commands in the real world. Whether you're just checking the syntax of a method or you're starting out on the road to JavaScript mastery, the JavaScript Programmer's Reference will be an essential aid.  With a detailed and informative tutorial section giving you the ins and outs of programming with JavaScript and the DOM f

  10. Uranium 1999. Resources, production and demand

    International Nuclear Information System (INIS)

    2000-01-01

    In recent years, the world uranium market has been characterised by an imbalance between demand and supply and persistently depressed uranium prices. World uranium production currently satisfies between 55 and 60 per cent of the total reactor-related requirements, while the rest of the demand is met by secondary sources including the conversion of excess defence material and stockpiles, primarily from Eastern Europe. Although the future availability of these secondary sources remains unclear, projected low-cost production capability is expected to satisfy a considerable part of demand through to 2015. Information in this report provides insights into changes expected in uranium supply and demand over the next 15 years. The 'Red Book', jointly prepared by the OECD Nuclear Energy Agency and the International Atomic Energy Agency, is the foremost world reference on uranium. It is based on official information from 49 countries and includes compilations of statistics on resources, exploration, production and demand as of 1 January 1999. It provides substantial new information from all of the major uranium producing centres in Africa, Australia, Eastern Europe, North America and the New Independent States. It also contains an international expert analysis of industry statistics and world-wide projections of nuclear energy growth, uranium requirements and uranium supply. (authors)

  11. Applied Sciences Department (R&D) Patents; a Compilation

    Science.gov (United States)

    1978-12-28

    References Cited: UNITED STATES PATENTS. ...the first ampoule so that the two ampoules can be broken simultaneously, and, upon mixing of the chemiluminescent material... one ampoule contains a chemiluminescent material and the other ampoule contains an activator material. The ampoules can be broken by... PYROTECHNIC SIGNALING DEVICE...

  12. Installation of a new Fortran compiler and effective programming method on the vector supercomputer

    International Nuclear Information System (INIS)

    Nemoto, Toshiyuki; Suzuki, Koichiro; Watanabe, Kenji; Machida, Masahiko; Osanai, Seiji; Isobe, Nobuo; Harada, Hiroo; Yokokawa, Mitsuo

    1992-07-01

    The Fortran compiler, version 10, was replaced with the new version 12 (V12) on the Fujitsu computer system at JAERI in May 1992. A benchmark test of the V12 compiler's performance was carried out with 16 representative nuclear codes in advance of the installation. On average, the new compiler improved performance by a factor of 1.13. The effects of the compiler's enhanced functions and its compatibility with the nuclear codes were also examined. An assistant tool for vectorization, TOP10EX, was developed. This report presents the results of the evaluation of the V12 compiler and the usage of the vectorization tools. (author)
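    The reported average factor of 1.13 is a mean of per-code speedups across the benchmark suite. A minimal sketch of how such a summary is computed; the code names and timings below are illustrative, not the actual JAERI benchmark data:

```python
# Hypothetical per-code run times (seconds) under the old (V10) and new (V12)
# compilers; the three codes and their timings are illustrative only.
old_times = {"code01": 120.0, "code02": 75.0, "code03": 210.0}
new_times = {"code01": 104.0, "code02": 68.0, "code03": 190.0}

# Per-code speedup = old time / new time; the benchmark summary is the mean.
speedups = {name: old_times[name] / new_times[name] for name in old_times}
average_speedup = sum(speedups.values()) / len(speedups)
print(f"average speedup factor: {average_speedup:.2f}")
```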

  13. Regulatory and technical reports. Compilation for second quarter 1982, April to June

    International Nuclear Information System (INIS)

    1982-08-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. A detailed explanation of the entries precedes each index

  14. AREVA 2009 reference document

    International Nuclear Information System (INIS)

    2009-01-01

    This Reference Document contains information on the AREVA group's objectives, prospects and development strategies. It contains information on the markets, market shares and competitive position of the AREVA group. This information provides an adequate picture of the size of these markets and of the AREVA group's competitive position. Content: 1 - Person responsible for the Reference Document and Attestation by the person responsible for the Reference Document; 2 - Statutory and Deputy Auditors; 3 - Selected financial information; 4 - Risks: Risk management and coverage, Legal risk, Industrial and environmental risk, Operating risk, Risk related to major projects, Liquidity and market risk, Other risk; 5 - Information about the issuer: History and development, Investments; 6 - Business overview: Markets for nuclear power and renewable energies, AREVA customers and suppliers, Overview and strategy of the group, Business divisions, Discontinued operations: AREVA Transmission and Distribution; 7 - Organizational structure; 8 - Property, plant and equipment: Principal sites of the AREVA group, Environmental issues that may affect the issuer's; 9 - Analysis of and comments on the group's financial position and performance: Overview, Financial position, Cash flow, Statement of financial position, Events subsequent to year-end closing for 2009; 10 - Capital Resources; 11 - Research and development programs, patents and licenses; 12 - Trend information: Current situation, Financial objectives; 13 - Profit forecasts or estimates; 14 - Administrative, management and supervisory bodies and senior management; 15 - Compensation and benefits; 16 - Functioning of corporate bodies; 17 - Employees; 18 - Principal shareholders; 19 - Transactions with related parties: French state, CEA, EDF group; 20 - Financial information concerning assets, financial positions and financial performance; 21 - Additional information: Share capital, Certificate of incorporation and by-laws; 22 - Major

  15. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    Science.gov (United States)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique for rapid recovery from transient processor failures that was originally implemented in hardware in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data-flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy, while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard-removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.

  16. Compilation Techniques Specific for a Hardware Cryptography-Embedded Multimedia Mobile Processor

    Directory of Open Access Journals (Sweden)

    Masa-aki FUKASE

    2007-12-01

    The development of single-chip VLSI processors is a key technology of ever-growing pervasive computing, answering overall demands for usability, mobility, speed, security, etc. We have so far developed a hardware cryptography-embedded multimedia mobile processor architecture, HCgorilla. Since HCgorilla integrates a wide range of techniques, from architectures to applications and languages, a one-sided design approach is not always useful. HCgorilla needs a more elaborate strategy, namely hardware/software (H/S) codesign. Thus, we exploit the software support of HCgorilla, composed of a Java interface and parallelizing compilers. They are assumed to be installed in servers in order to reduce the load and increase the performance of HCgorilla-embedded clients. Since compilers are the essence of the software side, we focus in this article on our recent results on the design, specifications, and prototyping of parallelizing compilers for HCgorilla. The parallelizing compilers are composed of a multicore compiler and a LIW compiler. They are specified to extract parallelism from executable serial codes or from the Java interface output, and to emit codes executable in parallel by HCgorilla. The prototype compilers are written in Java. Evaluation using an arithmetic test program shows that the prototype compilers produce reasonable results compared with hand compilation.

  17. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 6, November 2004 (last updated 2004.12.16)

    International Nuclear Information System (INIS)

    2005-03-01

    This Radioactive Waste Management Profiles report is a compilation of data collected by the Net Enabled Waste Management Database (NEWMDB) from March to July 2004. The report contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories. It provides or references details of the scope of NEWMDB data collections, and it explains the formats of individual NEWMDB report pages

  18. Information resources

    Science.gov (United States)

    Friend, Milton; Franson, J. Christian; Friend, Milton; Gibbs, Samantha E.J.; Wild, Margaret A.

    2015-10-19

    During recent decades, natural resources agency personnel and others involved with the management and stewardship of wildlife have experienced an increasing need to access information and obtain technical assistance for addressing a diverse array of wildlife disease issues. This Chapter provides a broad overview of selected sources for obtaining supplemental information and technical assistance for addressing wildlife disease issues in North America. Specifically, examples of existing major wildlife disease programs focusing on free-ranging wildlife populations are highlighted; training opportunities for enhancing within-agency wildlife disease response are identified; a selected reading list of wildlife disease references is provided; and selected Web sites providing timely information on wildlife disease are highlighted. No attempt is made to detail all the North American programs and capabilities that address disease in free-ranging wildlife populations. Instead, this Chapter is focused on enhancing awareness of the types of capabilities that exist as potential sources for assistance and collaboration between wildlife conservation agency personnel and others in addressing wildlife disease issues.

  19. Nuclear physics at Ganil. A compilation 1989-1991

    International Nuclear Information System (INIS)

    1991-01-01

    This compilation covers experimental and theoretical work performed at GANIL during the years 1989-1991 on nuclear structure and nuclear reactions. During this period, the accelerator's performance was strongly increased, both in the delivered energies and intensities and in the span of accelerated ions. In the experimental areas, a totally new data acquisition system has been set up, and the addition of a Wien filter to the Lise spectrometer now results in a versatile and efficient isotope separator, called LISE III. The time structure and the large intensity of the beam were decisive in identifying, for the first time, kaon production in heavy-ion collisions at GANIL's subthreshold energies. Nucleons have to undergo several collisions before inducing such a process, and the strange-particle emission should be very sensitive to the physical conditions of the hot and compressed interacting zone. Lead and uranium beams, now available at the Fermi energy, have been used to study the nuclear disassembly of very large and heavy systems. New results have been obtained on the collective flow in heavy-ion reactions, giving new insights into the Equation of State problem. In the field of nuclear structure, the magnetic spectrometer SPEG, coupled with large particle or gamma detectors, shed light on new aspects of giant resonance excitations. Exotic nuclei are extensively studied, with particular emphasis on the 11Li nucleus. A new method of mass measurement, using the CSS2 as a mass separator, has been successfully tested; it will greatly improve the accuracy achieved for intermediate and heavy nuclei. Last but not least, the theory group is actively working to include fluctuations in the description of nuclear dynamics and to characterise the onset of the multifragmentation process in heavy-ion collisions. An author index and publication list are added

  20. Assessment of the current status of basic nuclear data compilations

    International Nuclear Information System (INIS)

    Riemer, R.L.

    1992-01-01

    The Panel on Basic Nuclear Data Compilations believes that it is important to provide the user with an evaluated nuclear database of the highest quality, dependability, and currency. It is also important that the evaluated nuclear data are easily accessible to the user. In the past the panel concentrated its concern on the cycle time for the publication of A-chain evaluations. However, the panel now recognizes that publication cycle time is no longer the appropriate goal. Sometime in the future, publication of the evaluated A-chains will evolve from the present hard-copy Nuclear Data Sheets on library shelves to purely electronic publication, with the advent of universal access to terminals and the nuclear databases. Therefore, the literature cut-off date in the Evaluated Nuclear Structure Data File (ENSDF) is rapidly becoming the only important measure of the currency of an evaluated A-chain. Also, it has become exceedingly important to ensure that access to the databases is as user-friendly as possible and to enable electronic publication of the evaluated data files. Considerable progress has been made in these areas: use of the on-line systems has almost doubled in the past year, and there has been initial development of tools for electronic evaluation, publication, and dissemination. Currently, the nuclear data effort is in transition between the traditional and future methods of dissemination of the evaluated data. Also, many of the factors that adversely affect the publication cycle time simultaneously affect the currency of the evaluated nuclear database. Therefore, the panel continues to examine factors that can influence cycle time: the number of evaluators, the frequency with which an evaluation can be updated, the review of the evaluation, and the production of the evaluation, which currently exists as a hard-copy issue of Nuclear Data Sheets

  1. A compiled checklist of seaweeds of Sudanese Red Sea coast

    Directory of Open Access Journals (Sweden)

    Nahid Abdel Rahim Osman

    2016-02-01

    Objective: To present an updated and compiled checklist of Sudanese seaweeds as an example for the region, for conservational as well as developmental purposes. Methods: The checklist was developed based on both field investigations, using the line-transect method at 4 sites along the Red Sea coast of Sudan, and a review of available studies on Sudanese seaweeds. Results: In total, 114 macroalgal names were recorded, distributed in 16 orders, 34 families, and 62 genera. The Rhodophyceae comprised 8 orders, 17 families, 32 genera and 47 species. The Phaeophyceae comprised 4 orders, 5 families, 17 genera, and 28 species. The 39 species of Chlorophyceae belong to 2 classes, 4 orders, 12 families, and 14 genera. The present paper proposes the addition of 11 macroalgal taxa to the Sudanese seaweed species list: 3 red, 1 brown and 7 green seaweed species. Conclusions: This list is not yet exhaustive; it only represents the macroalgal species common to the intertidal areas of the Sudanese Red Sea coast, and further investigation may reveal the presence of more species. While significant levels of diversity and endemism have been revealed for other groups of organisms in the Red Sea region, similar work still has to be performed for seaweeds. Considering the impact of climate change on community structure and composition, and the growing risks from maritime transportation through the Red Sea, particularly from oil tankers and oil exploration, baseline data on seaweeds are highly required for management purposes.
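    The per-class species tallies reported in the abstract are internally consistent with the overall total of 114 recorded names; a quick check:

```python
# Species tallies as reported in the checklist abstract.
species_by_class = {
    "Rhodophyceae": 47,   # red seaweeds
    "Phaeophyceae": 28,   # brown seaweeds
    "Chlorophyceae": 39,  # green seaweeds
}

total = sum(species_by_class.values())
print(total)  # 47 + 28 + 39 = 114 macroalgal names recorded
```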

  2. Fifth Baltic Sea pollution load compilation (PLC-5)

    Energy Technology Data Exchange (ETDEWEB)

    Knuuttila, S.; Svendsen, L. M.; Staaf, H.; Kotilainen, P.; Boutrup, S.; Pyhala, M.; Durkin, M.

    2011-07-01

    This report includes the main results from the Fifth Pollution Load Compilation, abbreviated PLC-5. It includes quantified annual waterborne total loads (from rivers, unmonitored and coastal areas, as well as direct point and diffuse sources discharging directly to the Baltic Sea) from 1994 to 2008, to provide a basis for evaluating any decreasing (or increasing) trends in the total waterborne inputs to the Baltic Sea. Chapter 1 contains the objectives of PLC and the framework for classification of inputs and sources. Chapter 2 includes a short description of the Baltic Sea catchment area, while the methods for quantification and analysis, together with quality assurance topics, are briefly introduced in Chapter 3. More detailed information on methodologies is presented in the PLC-5 guidelines (HELCOM 2006). Chapter 4 reports the total inputs of nutrients and selected heavy metals to the Baltic Sea. Furthermore, it shows the results of the quantification of discharges and losses of nitrogen and phosphorus from point and diffuse sources into inland surface waters within the Baltic Sea catchment area (the source-oriented approach, or gross loads), as well as the total load to the maritime area (the load-oriented approach, or net loads) in 2006. Typically, results are presented by country and by main Baltic Sea sub-region. In Chapter 5, flow normalization is introduced and the results of trend analyses on 1994-2008 time series of total waterborne loads of nitrogen and phosphorus are given, together with a first evaluation of progress towards the provisional reduction targets by country and by main Baltic Sea sub-region. Chapter 6 includes discussion of some of the main conclusions and advice for future PLCs. The annexes contain the flow-normalized annual load data and figures and tables with results from the PLC-5.

  3. Overconsumption? Our use of the world's natural resources

    International Nuclear Information System (INIS)

    Giljum, S.; Hinterberger, F.; Bruckner, M.; Burger, E.; Fruehmann, J.; Lutter, S.; Pirgmaier, E.; Polzin, C.; Waxwender, H.; Kernegger, L.; Warhurst, M.

    2009-09-01

    It is essential to start a debate about European resource use and its environmental and social impacts around the world. In order to help facilitate this debate, this report aims to provide a compilation of information on current trends in European and global resource use.

  4. Technology, safety, and costs of decommissioning a reference pressurized water reactor power station. Appendices

    International Nuclear Information System (INIS)

    Smith, R.I.; Konzek, G.J.; Kennedy, W.E. Jr.

    1978-05-01

    Detailed appendices are presented under the following headings: reference PWR facility description, reference PWR site description, estimates of residual radioactivity, alternative methods for financing decommissioning, radiation dose methodology, generic decommissioning activities, intermediate dismantlement activities, safe storage and deferred dismantlement activities, compilation of unit cost factors, and safety assessment details

  5. Elemental distributions in surficial sediments and potential offshore mineral resources from the western continental margin of India. Part 2. Potential offshore mineral resources

    Digital Repository Service at National Institute of Oceanography (India)

    Paropkari, A.L.; Mascarenhas, A.; Rao, Ch.M.; PrakashBabu, C.; Murty, P.S.N.

    patterns of ten selected elements in surficial sediments. Part 2 projects the potential offshore mineral resources. Target areas for future exploration are indicated and exploration strategies are recommended. Appendix 1 is a compilation of the bibliography...

  6. A reference guide to quality assurance for diagnostic radiography

    International Nuclear Information System (INIS)

    1986-01-01

    The College of Radiographers, through its Professional and Technical Committee, set up a small Working Party to compile a list of references which would help radiographers to become experts at Quality Assurance in Diagnostic Imaging departments. The list is not comprehensive, but includes references which the Working Party have experience of and have found useful. The references provided should aid in the objectives of: 1. determining acceptable standards of new equipment; 2. making comparisons during use with accepted baseline performance; 3. establishing loss of cost effectiveness and the need for replacement. (author)

  7. Southern California Water Bulletin for 1953: General review of the water resources of Southern California for the water year of 1952-53 with special reference to the surface runoff for the water year of 1951-52

    Science.gov (United States)

    Hofman, Walter; Briggs, R.C.; Littlefield, W.M.

    1954-01-01

    This WATER BULLETIN is one of a series issued annually since June 1944. Its main purpose is to present a brief analysis of those phases of the local water supply associated with the work of the Geological Survey. The first part of this review deals with the water resources for the water year ending September 30, 1953. It contains a brief analysis of the annual precipitation, the provisional runoff at a few stations, the changes in water reserves both in surface reservoirs and underground, and the imported waters. It concludes by pointing out the deficiencies in the local water reserves. This bulletin has been prepared by the Surface Water Branch; the section on ground-water conditions was prepared chiefly from information supplied by the Ground Water Branch.

  8. Palaeoecological studies as a source of peat depth data: A discussion and data compilation for Scotland

    Directory of Open Access Journals (Sweden)

    J. Ratcliffe

    2016-06-01

    The regional/national carbon (C) stock of peatlands is often poorly characterised, even for comparatively well-studied areas. A key obstacle to better estimates of landscape C stock is the scarcity of data on peat depth, which leads to simplistic assumptions. New measurements of peat depth become unrealistically resource-intensive when considering large areas; it is therefore imperative to maximise the use of pre-existing datasets. Here we propose that one potentially valuable and currently unexploited source of peat depth data is palaeoecological studies. We discuss the value of these data and present an initial compilation for Scotland (United Kingdom), which consists of records from 437 sites and yields an average depth of 282 cm per site. This figure is likely to be an over-estimate of true average peat depth and is greater than figures used in current estimates of peatland C stock. Depth data from palaeoecological studies have the advantages of wide distribution, high quality, and often the inclusion of valuable supporting information, but also the disadvantage of spatial bias due to the differing motivations of the original researchers. When combined with other data sources, each with its own advantages and limitations, we believe that palaeoecological datasets can make an important contribution to better-constrained estimates of peat depth which, in turn, will lead to better estimates of peatland landscape carbon stock.

  9. Nuclear EQ sourcebook: A compilation of documents for nuclear equipment qualification

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    In the nuclear power industry, environmental and seismic qualification of safety-related electrical and instrumentation equipment is collectively known as Equipment Qualification (EQ). The related technology, as well as the regulatory requirements, have evolved rapidly during the last 15 years. For environmental qualification, what began in 1971 with one trial-use guide (IEEE Std 323-1971) now stands as a full complement of Nuclear Regulatory Commission (NRC) rules, guides and industry standards. In addition to the Institute of Electrical and Electronics Engineers (IEEE), the American Society of Mechanical Engineers (ASME) has also undertaken development of its own set of standards for use in qualifying safety-related mechanical equipment. To ensure that the original design and qualification is preserved, engineers need to select and use the correct set of NRC regulations, regulatory guides, industry standards, and generic correspondence. Given that the total number of these documents exceeds 200, this task becomes resource-intensive. This compilation is the first known publication available to serve the user community's need for a complete and exhaustive single source. Approximately 180 items (Bulletins, Federal Rules, Generic Letters, Notices, Regulatory Guides, IEEE Standards, IEEE Recommended Practices, and IEEE Guides) have been processed separately for inclusion in the database

  10. Compilation and analysis of multiple groundwater-quality datasets for Idaho

    Science.gov (United States)

    Hundt, Stephen A.; Hopkins, Candice B.

    2018-05-09

    Groundwater is an important source of drinking and irrigation water throughout Idaho, and groundwater quality is monitored by various Federal, State, and local agencies. The historical, multi-agency records of groundwater quality include a valuable dataset that has yet to be compiled or analyzed on a statewide level. The purpose of this study is to combine groundwater-quality data from multiple sources into a single database, to summarize this dataset, and to perform bulk analyses to reveal spatial and temporal patterns of water quality throughout Idaho. Data were retrieved from the Water Quality Portal (https://www.waterqualitydata.us/), the Idaho Department of Environmental Quality, and the Idaho Department of Water Resources. Analyses included counting the number of times a sample location had concentrations above Maximum Contaminant Levels (MCL), performing trends tests, and calculating correlations between water-quality analytes. The water-quality database and the analysis results are available through USGS ScienceBase (https://doi.org/10.5066/F72V2FBG).
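    The bulk analyses described above (counting exceedances of Maximum Contaminant Levels and testing for temporal trends) can be sketched as follows. The sample values, the nitrate MCL of 10 mg/L, and the use of a simple Mann-Kendall S statistic are illustrative assumptions, not the study's actual code:

```python
from itertools import combinations

# Hypothetical time-ordered nitrate concentrations (mg/L) at one well;
# 10 mg/L (nitrate as N) is the EPA drinking-water MCL.
samples = [4.2, 5.1, 6.0, 9.8, 10.4, 11.2, 12.5]
MCL_NITRATE = 10.0

# Count samples above the Maximum Contaminant Level.
exceedances = sum(1 for c in samples if c > MCL_NITRATE)

# Mann-Kendall S statistic: +1 for each later sample higher than an earlier
# one, -1 for each lower; a strongly positive S suggests an upward trend.
s_stat = sum((b > a) - (b < a) for a, b in combinations(samples, 2))
print(exceedances, s_stat)
```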

  11. Compilation of selected deep-sea biological data for the US subseabed disposal project

    International Nuclear Information System (INIS)

    Gomez, L.S.; Marietta, M.G.; Jackson, D.W.

    1987-03-01

    The US Subseabed Disposal Project (SDP) has compiled an extensive deep-sea biological data base to be used in calculating biological parameters of state and rate included in mathematical models of oceanographic transport of radionuclides. The data base is organized around a model deep-sea ecosystem which includes the following components: zooplankton, fish and other nekton, invertebrate benthic megafauna, benthic macrofauna, benthic meiofauna, heterotrophic microbiota, as well as suspended and sediment particulate organic carbon. Measurements of abundance and activity rates (e.g., respiration, production, sedimentation, etc.) reported in the international oceanographic literature are summarized in 23 tables. Included in these tables are the latitudinal position of the studies, as well as information describing sampling techniques and any special notes needed to better assess the data presented. This report has been prepared primarily as a resource document to be used in calculating parameter values for various modeling applications, and for preparing historical data reviews for other SDP reports. Depending on the intended use, these data will require further reduction and unit conversion

  12. A compilation of reports of The Advisory Committee on Nuclear Waste, July 1988--June 1990

    International Nuclear Information System (INIS)

    1990-08-01

    This compilation contains 37 reports issued by the Advisory Committee on Nuclear Waste (ACNW) during the first two years of its operation. The reports were submitted to the Chairman or to the Executive Director for Operations, US Nuclear Regulatory Commission (NRC). Topics include the NRC analysis of the US Department of Energy Site Characterization Plan for the high-level radioactive waste repository, the standards promulgated by the US Environmental Protection Agency for the disposal of high-level waste, the NRC policy statement on Below Regulatory Concern, technical documents prepared by the NRC Staff relative to the decommissioning of nuclear power plants, the stabilization of uranium mill tailings piles, and environmental monitoring. All reports prepared by the Committee have been made available to the public through the NRC Public Document Room and the US Library of Congress. Included in an Appendix is a listing of references to related reports on nuclear waste matters that were issued by the Advisory Committee on Reactor Safeguards prior to the establishment of the ACNW

  13. HotpathVM: An Effective JIT for Resource-constrained Devices

    DEFF Research Database (Denmark)

    Gal, Andreas; Franz, Michael; Probst, Christian

    2006-01-01

    We present a just-in-time compiler for a Java VM that is small enough to fit on resource-constrained devices, yet surprisingly effective. Our system dynamically identifies traces of frequently executed bytecode instructions (which may span several basic blocks across several methods) and compiles...
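    Tracing JITs of this kind typically detect hot traces by counting executions of potential trace heads (e.g. backward-branch targets). A minimal, language-agnostic sketch of that counter-based detection; the threshold and addresses are illustrative, not HotpathVM's actual implementation:

```python
# Minimal sketch of hot-trace detection in a tracing JIT: backward-branch
# targets are potential trace heads; once a target's counter crosses a
# threshold, recording (and later compilation) of a trace would begin.
HOT_THRESHOLD = 3

def find_hot_heads(branch_targets, threshold=HOT_THRESHOLD):
    """branch_targets: sequence of bytecode addresses taken by backward branches."""
    counters = {}
    hot = []
    for target in branch_targets:
        counters[target] = counters.get(target, 0) + 1
        if counters[target] == threshold:  # becomes hot exactly once
            hot.append(target)
    return hot

# The loop head at address 0x10 executes repeatedly; 0x40 is taken only twice.
print([hex(h) for h in find_hot_heads([0x10, 0x40, 0x10, 0x10, 0x40, 0x10])])
```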

  14. Compiler and Runtime Support for Programming in Adaptive Parallel Environments

    Science.gov (United States)

    1998-10-15

    ...no other job is waiting for resources, and use a smaller number of processors when other jobs need resources. Setia et al. [15, 20] have shown that such... [15] Vijay K. Naik, Sanjeev Setia, and Mark Squillante. Performance analysis of job scheduling policies in parallel supercomputing environments. In... on networks of heterogeneous workstations. Technical Report CSE-94-012, Oregon Graduate Institute of Science and Technology, 1994. [20] Sanjeev Setia

  15. Connecticut's forest resources, 2010

    Science.gov (United States)

    Brett J. Butler; Cassandra Kurtz; Christopher Martin; W. Keith Moser

    2011-01-01

    This publication provides an overview of forest resource attributes for Connecticut based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...

  16. Connecticut's forest resources, 2009

    Science.gov (United States)

    Brett J. Butler; Christopher Martin

    2011-01-01

    This publication provides an overview of forest resource attributes for Connecticut based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...

  17. Pennsylvania's forest resources, 2012

    Science.gov (United States)

    G.L. McCaskill; W.H. McWilliams; C.J. Barnett

    2013-01-01

    This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of...

  18. Pennsylvania's forest resources, 2009

    Science.gov (United States)

    G.L. McCaskill; W.H. McWilliams; B.J. Butler; D.M. Meneguzzo; C.J. Barnett; M.H. Hansen

    2011-01-01

    This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of...

  19. Pennsylvania's forest resources, 2011

    Science.gov (United States)

    G.L. McCaskill; W.H. McWilliams; C.J. Barnett

    2012-01-01

    This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of...

  20. Pennsylvania's forest resources, 2008

    Science.gov (United States)

    G.L. McCaskill; W.H. McWilliams; B.J. Butler; D.M. Meneguzzo; C.J. Barnett; M.H. Hansen

    2011-01-01

    This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of...

  1. Pennsylvania's forest resources, 2010

    Science.gov (United States)

    G.L. McCaskill; W.H. McWilliams; C.J. Barnett

    2011-01-01

    This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of...

  2. Pennsylvania's forest resources, 2007

    Science.gov (United States)

    G.L. McCaskill; W.H. McWilliams; B.J. Butler; D.M. Meneguzzo; C.J. Barnett; M.H. Hansen

    2011-01-01

    This publication provides an overview of forest resource attributes for Pennsylvania based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These annual estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 6 of...

  3. Wisconsin's forest resources, 2010

    Science.gov (United States)

    C.H. Perry

    2011-01-01

    This publication provides an overview of forest resource attributes for Wisconsin based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information please refer to page 4 of this report...

  4. Wisconsin's Forest Resources, 2007

    Science.gov (United States)

    C.H. Perry; V.A. Everson

    2008-01-01

    This publication provides an overview of forest resource attributes for Wisconsin based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station. These estimates, along with web-posted core tables, are updated annually. For more information please refer to page 4 of this report.

  5. Wisconsin's forest resources, 2009

    Science.gov (United States)

    C.H. Perry

    2011-01-01

    This publication provides an overview of forest resource attributes for Wisconsin based on an annual inventory conducted by the Forest Inventory and Analysis (FIA) program at the Northern Research Station of the U.S. Forest Service. These estimates, along with web-posted core tables, will be updated annually. For more information, please refer to page 4 of this report...

  6. Evaluation of the ground-water resources of coastal Georgia: preliminary report of the data available as of July 1983

    Science.gov (United States)

    Krause, Richard E.

    1984-01-01

    A compilation of ground-water data that have been collected for nearly 100 years in the coastal area of Georgia is presented in this report. The compilation of pertinent data indicates what information is available for use in the evaluation of the ground-water resources of the 13 counties of coastal Georgia. Also included in this report is a fairly complete discussion of previous and ongoing investigations and monitoring networks, and an extensive list of references. Maps at 1:24,000 and 1:1,000,000 scales contain well locations and identifiers for all wells in the Ground Water Site Inventory (GWSI) database of the National Water Data Storage and Retrieval System (WATSTORE). Tabular summaries of selected site information from GWSI, including well identifiers and names, latitude-longitude location, depth of well, altitude of land surface, and use of water are presented. Water-use data from the National Water Use Data System, and water use for irrigation from the University of Georgia, Department of Agriculture survey, also are tabulated. Also included is pertinent information on geophysical surveys and the data obtained, as well as proposed project activities, particularly test-monitor well drilling. The data in this report were collected and compiled as part of the cooperative activities between the U.S. Geological Survey and other agencies.

  7. 2002 reference document; Document de reference 2002

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    This 2002 reference document of the Areva group provides information on the company. Organized in seven chapters, it presents the persons responsible for the reference document and for auditing the financial statements; information pertaining to the transaction; general information on the company and share capital; information on company operation, changes, and future prospects; assets, financial position, and financial performance; information on company management, executive board, and supervisory board; and recent developments and future prospects. (A.L.B.)

  8. A Compilation of MATLAB Scripts and Functions for MACGMC Analyses

    Science.gov (United States)

    Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.

    2017-01-01

    The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The Mathworks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for CMCs and PMCs, visualization of RUCs from MACGMC input and output files, and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response. In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte-Carlo simulations to enable probabilistic

  9. Compilation and network analyses of Cambrian food webs.

    Directory of Open Access Journals (Sweden)

    Jennifer A Dunne

    2008-04-01

    Full Text Available A rich body of empirically grounded theory has developed about food webs--the networks of feeding relationships among species within habitats. However, detailed food-web data and analyses are lacking for ancient ecosystems, largely because of the low resolution of taxa coupled with uncertain and incomplete information about feeding interactions. These impediments appear insurmountable for most fossil assemblages; however, a few assemblages with excellent soft-body preservation across trophic levels are candidates for food-web data compilation and topological analysis. Here we present plausible, detailed food webs for the Chengjiang and Burgess Shale assemblages from the Cambrian Period. Analyses of degree distributions and other structural network properties, including sensitivity analyses of the effects of uncertainty associated with Cambrian diet designations, suggest that these early Paleozoic communities share remarkably similar topology with modern food webs. Observed regularities reflect a systematic dependence of structure on the numbers of taxa and links in a web. Most aspects of Cambrian food-web structure are well-characterized by a simple "niche model," which was developed for modern food webs and takes into account this scale dependence. However, a few aspects of topology differ between the ancient and recent webs: longer path lengths between species and more species in feeding loops in the earlier Chengjiang web, and higher variability in the number of links per species for both Cambrian webs. Our results are relatively insensitive to the exclusion of low-certainty or random links. The many similarities between Cambrian and recent food webs point toward surprisingly strong and enduring constraints on the organization of complex feeding interactions among metazoan species. The few differences could reflect a transition to more strongly integrated and constrained trophic organization within ecosystems following the rapid

  10. Compilation and network analyses of Cambrian food webs.

    Science.gov (United States)

    Dunne, Jennifer A; Williams, Richard J; Martinez, Neo D; Wood, Rachel A; Erwin, Douglas H

    2008-04-29

    A rich body of empirically grounded theory has developed about food webs--the networks of feeding relationships among species within habitats. However, detailed food-web data and analyses are lacking for ancient ecosystems, largely because of the low resolution of taxa coupled with uncertain and incomplete information about feeding interactions. These impediments appear insurmountable for most fossil assemblages; however, a few assemblages with excellent soft-body preservation across trophic levels are candidates for food-web data compilation and topological analysis. Here we present plausible, detailed food webs for the Chengjiang and Burgess Shale assemblages from the Cambrian Period. Analyses of degree distributions and other structural network properties, including sensitivity analyses of the effects of uncertainty associated with Cambrian diet designations, suggest that these early Paleozoic communities share remarkably similar topology with modern food webs. Observed regularities reflect a systematic dependence of structure on the numbers of taxa and links in a web. Most aspects of Cambrian food-web structure are well-characterized by a simple "niche model," which was developed for modern food webs and takes into account this scale dependence. However, a few aspects of topology differ between the ancient and recent webs: longer path lengths between species and more species in feeding loops in the earlier Chengjiang web, and higher variability in the number of links per species for both Cambrian webs. Our results are relatively insensitive to the exclusion of low-certainty or random links. The many similarities between Cambrian and recent food webs point toward surprisingly strong and enduring constraints on the organization of complex feeding interactions among metazoan species. The few differences could reflect a transition to more strongly integrated and constrained trophic organization within ecosystems following the rapid diversification of species, body
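    The "niche model" referenced in the abstract above is a standard generative model for food-web topology: each species gets a niche value on [0,1] and eats every species whose niche value falls inside a randomly placed feeding interval. A minimal sketch (the function name and parameters are illustrative, not taken from the paper):

    ```python
    import random

    def niche_model(S, C, seed=0):
        """Generate a food web with S species and target connectance C
        using the niche-model construction (Williams & Martinez, 2000)."""
        rng = random.Random(seed)
        beta = (1.0 - 2.0 * C) / (2.0 * C)           # chosen so E[range] = 2C
        n = sorted(rng.random() for _ in range(S))   # niche values on [0, 1]
        links = set()
        for i in range(S):
            # Feeding-range width: fraction of n[i], drawn from Beta(1, beta)
            # via inverse-CDF sampling: x = 1 - (1 - u)^(1/beta).
            r = n[i] * (1.0 - (1.0 - rng.random()) ** (1.0 / beta))
            c = rng.uniform(r / 2.0, n[i])           # centre of feeding range
            for j in range(S):
                if c - r / 2.0 <= n[j] <= c + r / 2.0:
                    links.add((i, j))                # species i eats species j
        return n, links
    ```

    The scale dependence the abstract mentions enters through S and C: structural statistics of the generated web (path lengths, loops, link variability) depend systematically on the numbers of taxa and links, which is why the model can be fit to webs of very different sizes.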

  11. Regulatory and technical reports: (Abstract index journal). Compilation for first quarter 1997, January--March

    International Nuclear Information System (INIS)

    Sheehan, M.A.

    1997-06-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation is published quarterly and cumulated annually. Reports consist of staff-originated reports, NRC-sponsored conference reports, NRC contractor-prepared reports, and international agreement reports

  12. On the performance of the HAL/S-FC compiler. [for space shuttles

    Science.gov (United States)

    Martin, F. H.

    1975-01-01

    The HAL/S compilers which will be used in the space shuttles are described. Acceptance test objectives and procedures are described, the raw results are presented and analyzed, and conclusions and observations are drawn. An appendix is included containing an illustrative set of compiler listings and results for one of the test cases.

  13. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1997, July--September

    International Nuclear Information System (INIS)

    Stevenson, L.L.

    1998-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. This report contains the third quarter 1997 abstracts

  14. Methods for the Compilation of a Core List of Journals in Toxicology.

    Science.gov (United States)

    Kuch, T. D. C.

    Previously reported methods for the compilation of core lists of journals in multidisciplinary areas are first examined, with toxicology used as an example of such an area. Three approaches to the compilation of a core list of journals in toxicology were undertaken and the results analyzed with the aid of models. Analysis of the results of the…

  15. Regulatory and technical reports (abstract index journal). Annual compilation for 1984. Volume 9, No. 4

    International Nuclear Information System (INIS)

    1985-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  16. 12 CFR 503.2 - Exemptions of records containing investigatory material compiled for law enforcement purposes.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Exemptions of records containing investigatory material compiled for law enforcement purposes. 503.2 Section 503.2 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY PRIVACY ACT § 503.2 Exemptions of records containing investigatory material compiled for law enforcement...

  17. 21 CFR 20.64 - Records or information compiled for law enforcement purposes.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Records or information compiled for law enforcement purposes. 20.64 Section 20.64 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL PUBLIC INFORMATION Exemptions § 20.64 Records or information compiled for law enforcement purposes. (a) Records or...

  18. 36 CFR 902.57 - Investigatory files compiled for law enforcement purposes.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Investigatory files compiled for law enforcement purposes. 902.57 Section 902.57 Parks, Forests, and Public Property PENNSYLVANIA AVENUE DEVELOPMENT CORPORATION FREEDOM OF INFORMATION ACT Exemptions From Public Access to Corporation Records § 902.57 Investigatory files compiled...

  19. Regulatory and technical reports (abstract index journal): Annual compilation for 1987

    International Nuclear Information System (INIS)

    1988-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  20. abc: The AspectBench Compiler for AspectJ

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    abc is an extensible, optimising compiler for AspectJ. It has been designed as a workbench for experimental research in aspect-oriented programming languages and compilers. We outline a programme of research in these areas, and we review how abc can help in achieving those research goals...

  1. Design Choices in a Compiler Course or How to Make Undergraduates Love Formal Notation

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    2008-01-01

    The undergraduate compiler course offers a unique opportunity to combine many aspects of the Computer Science curriculum. We discuss the many design choices that are available for the instructor and present the current compiler course at the University of Aarhus, the design of which displays at l...

  2. Neutron data compilation. Report of a Panel sponsored by the International Atomic Energy Agency and held in Brookhaven, 10-14 February 1969

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1969-02-15

    The IAEA organized and convened a Panel on Neutron Data Compilation. This Panel was organized by the Agency following the recommendations made by the International Nuclear Data Committee (INDC), which agreed that a general review of world neutron data compilation activities was desirable. In this context neutron data compilation encompasses the collection, storage and dissemination of bibliographic information and of qualitative and numerical data on the interaction of neutrons with nuclei and atoms for all incident energies. Such information and data have important applications in low energy neutron physics and many important areas of nuclear technology. The principal objective of the Panel on Neutron Data Compilation, which was held at Brookhaven National Laboratory during 10-14 February 1969, was to review how the world's principal data centers located at Brookhaven, Saclay, Obninsk and Vienna could ideally meet the demands and needs of experimental and theoretical neutron physicists, evaluators, reactor physicists as well as other existing and potential users. Fourteen papers were considered during formal sessions of the Panel and are reported on the following pages. The members of the Panel separated into five working groups to consider specific terms of reference and make recommendations. Their reports were discussed.

  3. PGHPF – An Optimizing High Performance Fortran Compiler for Distributed Memory Machines

    Directory of Open Access Journals (Sweden)

    Zeki Bozkus

    1997-01-01

    Full Text Available High Performance Fortran (HPF) is the first widely supported, efficient, and portable parallel programming language for shared and distributed memory systems. HPF is realized through a set of directive-based extensions to Fortran 90. It enables application developers and Fortran end-users to write compact, portable, and efficient software that will compile and execute on workstations, shared memory servers, clusters, traditional supercomputers, or massively parallel processors. This article describes a production-quality HPF compiler for a set of parallel machines. Compilation techniques such as data and computation distribution, communication generation, run-time support, and optimization issues are elaborated as the basis for an HPF compiler implementation on distributed memory machines. The performance of this compiler on benchmark programs demonstrates that high efficiency can be achieved executing HPF code on parallel architectures.

  4. A survey of compiler development aids. [concerning lexical, syntax, and semantic analysis

    Science.gov (United States)

    Buckles, B. P.; Hodges, B. C.; Hsia, P.

    1977-01-01

    A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five selected phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools that were developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention - SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed, and some of their shortcomings and some of the areas of research currently in progress were examined.

  5. Cross-compilation of ATLAS online software to the PowerPC-VxWorks system

    International Nuclear Information System (INIS)

    Tian Yuren; Li Jin; Ren Zhengyu; Zhu Kejun

    2005-01-01

    BES III selected the ATLAS online software as the framework of its run-control system. BES III applies the PowerPC-VxWorks system on its front-end readout system, so it is necessary to cross-compile this software to the PowerPC-VxWorks system. The article demonstrates several aspects of this project, such as the structure and organization of the ATLAS online software, the application of the CMT tool while cross-compiling, the selection and configuration of the cross-compiler, and methods to solve various problems caused by differences in compiler and operating system. After cross-compilation, the software runs normally and, together with the software running on the Linux system, makes up a complete run-control system. (authors)

  6. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    in the style of denotational semantics; – the output of the generated compiler is effectively three-address code, in the fashion and efficiency of the Dragon Book; – the generated compiler processes several hundred lines of source code per second. The source language considered in this case study is imperative......, block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs...... by specializing a definitional interpreter with respect to the program. Specialization is carried out using type-directed partial evaluation, which is a mild version of partial evaluation akin to lambda-calculus normalization. Our definitional interpreter follows the format of denotational semantics, with a clear...

  7. Drift and diffusion of electrons in gases: A compilation

    International Nuclear Information System (INIS)

    Peisert, A.; Sauli, F.

    1984-01-01

    This report is organized in two sections. The first contains an elementary introduction to the theory of electron transport in gases under the action of electric and magnetic fields, and gives indications on the use of two programs to compute drift and diffusion properties of electrons in gas mixtures. The second section contains an extensive collection of experimental and computed data on electron drift velocity and diffusion, as a function of electric field; an index allows one to find the data referring to any given gas mixture. (orig.)

  8. Hardware Compilation of Application-Specific Memory-Access Interconnect

    DEFF Research Database (Denmark)

    Venkataramani, Girish; Bjerregaard, Tobias; Chelcea, Tiberiu

    2006-01-01

    operations dependent on memory reads. More fundamental is that dependences between accesses may not be statically provable (e.g., if the specification language permits pointers), which introduces memory-consistency problems. Addressing these issues with static scheduling results in overly conservative...... enables specifications to include arbitrary memory references (e.g., pointers) and allows the memory system to incorporate features that might cause the latency of a memory access to vary dynamically. This results in raising the level of abstraction in the input specification, enabling faster design times...

  9. Low-temperature geothermal water in Utah: A compilation of data for thermal wells and springs through 1993

    Energy Technology Data Exchange (ETDEWEB)

    Blackett, R.E.

    1994-07-01

    The Geothermal Division of DOE initiated the Low-Temperature Geothermal Resources and Technology Transfer Program, following a special appropriation by Congress in 1991, to encourage wider use of lower-temperature geothermal resources through direct-use, geothermal heat-pump, and binary-cycle power conversion technologies. The Oregon Institute of Technology (OIT), the University of Utah Research Institute (UURI), and the Idaho Water Resources Research Institute organized the federally-funded program and enlisted the help of ten western states to carry out phase one. This first phase involves updating the inventory of thermal wells and springs with the help of the participating state agencies. The state resource teams inventory thermal wells and springs, and compile relevant information on each source. OIT and UURI cooperatively administer the program. OIT provides overall contract management while UURI provides technical direction to the state teams. Phase one of the program focuses on replacing part of GEOTHERM by building a new database of low- and moderate-temperature geothermal systems for use on personal computers. For Utah, this involved (1) identifying sources of geothermal data; (2) designing a database structure; (3) entering the new data; (4) checking for errors, inconsistencies, and duplicate records; (5) organizing the data into reporting formats; and (6) generating a map (1:750,000 scale) of Utah showing the locations and record identification numbers of thermal wells and springs.

  10. Compilation of nuclear decay data used for dose calculation. Revised data for radionuclides listed in ICRP Publication 38

    International Nuclear Information System (INIS)

    Endo, Akira; Yamaguchi, Yasuhiro

    2001-03-01

    New nuclear decay data used for dose calculation have been compiled for 817 radionuclides that are listed in ICRP Publication 38 (Publ. 38) and for 6 additional isomers. The decay data were prepared using decay data sets from the Evaluated Nuclear Structure Data File (ENSDF), the latest version in August 1997. Basic nuclear properties in the decay data sets that are particularly important for calculating energies and intensities of emissions were examined and updated by referring to NUBASE, the database for nuclear and decay properties of nuclides. The reviewed and updated data were half-life, decay mode and its branching ratio, spin and parity of the ground and isomeric states, excitation energy of isomers, and Q value. In addition, possible revisions of partial and incomplete decay data sets were done for their format and syntax errors, level schemes, normalization records, and so on. After that, the decay data sets were processed by EDISTR in order to compute the energies and intensities of α particles, β particles, γ rays, internal conversion electrons, X rays, and Auger electrons emitted in nuclear transformation. For spontaneously fissioning nuclides, the average energies and intensities of neutrons, fission fragments, prompt γ rays, delayed γ rays, and β particles were also calculated. The compiled data were prepared in two different types of format: Publ. 38 and NUCDECAY formats. Comparison of the compiled decay data with those in Publ. 38 was also presented. The decay data will be widely used for internal and external dose calculations in radiation protection and will be beneficial to a future revision of ICRP Publ. 38. (author)

  11. Compilation of nuclear decay data used for dose calculation. Revised data for radionuclides listed in ICRP Publication 38

    Energy Technology Data Exchange (ETDEWEB)

    Endo, Akira; Yamaguchi, Yasuhiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-03-01

    New nuclear decay data used for dose calculation have been compiled for 817 radionuclides that are listed in ICRP Publication 38 (Publ. 38) and for 6 additional isomers. The decay data were prepared using decay data sets from the Evaluated Nuclear Structure Data File (ENSDF), the latest version in August 1997. Basic nuclear properties in the decay data sets that are particularly important for calculating energies and intensities of emissions were examined and updated by referring to NUBASE, the database for nuclear and decay properties of nuclides. The reviewed and updated data were half-life, decay mode and its branching ratio, spin and parity of the ground and isomeric states, excitation energy of isomers, and Q value. In addition, possible revisions of partial and incomplete decay data sets were done for their format and syntax errors, level schemes, normalization records, and so on. After that, the decay data sets were processed by EDISTR in order to compute the energies and intensities of {alpha} particles, {beta} particles, {gamma} rays, internal conversion electrons, X rays, and Auger electrons emitted in nuclear transformation. For spontaneously fissioning nuclides, the average energies and intensities of neutrons, fission fragments, prompt {gamma} rays, delayed {gamma} rays, and {beta} particles were also calculated. The compiled data were prepared in two different types of format: Publ. 38 and NUCDECAY formats. Comparison of the compiled decay data with those in Publ. 38 was also presented. The decay data will be widely used for internal and external dose calculations in radiation protection and will be beneficial to a future revision of ICRP Publ. 38. (author)

  12. Genetics Home Reference: adermatoglyphia

    Science.gov (United States)

    ... a feature of several rare syndromes classified as ectodermal dysplasias, including a condition called Naegeli-Franceschetti-Jadassohn syndrome/ ... and Advocacy Resources (2 links) National Foundation for Ectodermal Dysplasias Resource List from the University of Kansas Medical ...

  13. VBE reference framework

    NARCIS (Netherlands)

    Afsarmanesh, H.; Camarinha-Matos, L.M.; Ermilova, E.; Camarinha-Matos, L.M.; Afsarmanesh, H.; Ollus, M.

    2008-01-01

    Defining a comprehensive and generic "reference framework" for Virtual organizations Breeding Environments (VBEs), addressing all their features and characteristics, is challenging. While the definition and modeling of VBEs has become more formalized during the last five years, "reference models"

  14. CMS Statistics Reference Booklet

    Data.gov (United States)

    U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...

  15. Areva - 2014 Reference document

    International Nuclear Information System (INIS)

    2015-01-01

    Areva supplies high added-value products and services to support the operation of the global nuclear fleet. The company is present throughout the entire nuclear cycle, from uranium mining to used fuel recycling, including nuclear reactor design and operating services. Areva is recognized by utilities around the world for its expertise, its skills in cutting-edge technologies and its dedication to the highest level of safety. Areva's 44,000 employees are helping build tomorrow's energy model: supplying ever safer, cleaner and more economical energy to the greatest number of people. This Reference Document contains information on Areva's objectives, prospects and development strategies. It contains estimates of the markets, market shares and competitive position of Areva. Contents: 1 - Person responsible; 2 - Statutory auditors; 3 - Selected financial information; 4 - Risk factors; 5 - Information about the issuer; 6 - Business overview; 7 - Organizational structure; 8 - Property, plant and equipment; 9 - Analysis of and comments on the group's financial position and performance; 10 - Capital resources; 11 - Research and development programs, patents and licenses; 12 - Trend information; 13 - Profit forecasts; 14 - Administrative, management and supervisory bodies and senior management; 15 - Compensation and benefits; 16 - Functioning of administrative, management and supervisory bodies and senior management; 17 - Employees; 18 - Principal shareholders; 19 - Transactions with related parties; 20 - Financial information concerning assets, financial positions and financial performance; 21 - Additional information; 22 - Major contracts; 23 - Third party information, statements by experts and declarations of interest; 24 - Documents on display; 25 - information on holdings; appendix: Report of the Chairman of the Board of Directors on governance, internal control procedures and risk management, Statutory Auditors' report, Corporate social

  16. Compilation of radiation damage test data. Pt. 3

    International Nuclear Information System (INIS)

    Beynel, P.; Maier, P.; Schoenbacher, H.

    1982-01-01

    This handbook gives the results of radiation damage tests on various engineering materials and components intended for installation in radiation areas of the CERN high-energy particle accelerators. It complements two previous volumes covering organic cable-insulating materials and thermoplastic and thermosetting resins. The irradiations were carried out at various radiation sources, and the results of the different tests are reported, sometimes illustrated by tables and graphs showing the variation of the measured property with absorbed radiation dose. For each entry, an appreciation of the radiation resistance is given, based on measurement data, indicating the range of damage (moderate to severe) for doses from 10 to 10^8 Gy. Also included are tables, selected from published reports, of general relative radiation effects for several groups of materials, to which there are systematic cross-references in the alphabetical part. This third and last volume contains cross-references to all the materials presented up to now, so that it can be used as a guide to the three volumes. (orig.)

  17. Changing quantum reference frames

    OpenAIRE

    Palmer, Matthew C.; Girelli, Florian; Bartlett, Stephen D.

    2013-01-01

    We consider the process of changing reference frames in the case where the reference frames are quantum systems. We find that, as part of this process, decoherence is necessarily induced on any quantum system described relative to these frames. We explore this process with examples involving reference frames for phase and orientation. Quantifying the effect of changing quantum reference frames serves as a first step in developing a relativity principle for theories in which all objects includ...

  18. Discipline, Dilemmas, Decisions and Data Distribution in the Planning and Compilation of Monolingual Dictionaries

    Directory of Open Access Journals (Sweden)

    Rufus H Gouws

    2011-10-01

    Full Text Available

    Abstract: Bilingual dictionaries play an important role in the standardisation of a language and are often the first dictionary type to be compiled for a given speech community. However, this may never lead to an underestimation of the role and importance of monolingual descriptive dictionaries in the early lexicographic development of a language. In the planning of first descriptive dictionaries the choice of the proper subtype and a consistent application of theoretical principles should be regarded as of extreme importance. Even the compilation of a restricted descriptive dictionary should be done according to similar theoretical principles as those applying to comprehensive dictionaries. This contribution indicates a number of dilemmas confronting the lexicographer during the compilation of restricted monolingual descriptive dictionaries. Attention is given to the role of lexicographic functions and the choice and presentation of lexicographic data, with special reference to the presentation of certain types of polysemous senses which are subjected to frequency of use restrictions. Emphasis is placed on the value of a heterogeneous article structure and a micro-architecture in the articles of restricted dictionaries.

    Keywords: ACCESS STRUCTURE, DATA DISTRIBUTION, FRAME STRUCTURE, FREQUENCY OF USE, HETEROGENEOUS ARTICLE STRUCTURE, LEXICOGRAPHIC FUNCTIONS, LEXICOGRAPHIC PROCESS, MICRO-ARCHITECTURE, MONOLINGUAL DICTIONARY, POLYSEMY, SEMANTIC DATA, TEXT BLOCK, USER-FRIENDLINESS, USER-PERSPECTIVE, VERTICAL ARCHITECTONIC EXTENSION

    Summary: Discipline, dilemmas, decisions and data distribution in the planning and compilation of monolingual dictionaries. Bilingual dictionaries play an important role in the standardisation of a language and are often the first dictionary type compiled for a given speech community. However, this should never lead to an underestimation of the role and value of monolingual descriptive dictionaries in the

  19. Dynamic HTML The Definitive Reference

    CERN Document Server

    Goodman, Danny

    2007-01-01

    Packed with information on the latest web specifications and browser features, this new edition is your ultimate one-stop resource for HTML, XHTML, CSS, Document Object Model (DOM), and JavaScript development. Here is the comprehensive reference for designers of Rich Internet Applications who need to operate in all modern browsers, including Internet Explorer 7, Firefox 2, Safari, and Opera. With this book, you can instantly see browser support for the latest standards-based technologies, including CSS Level 3, DOM Level 3, Web Forms 2.0, XMLHttpRequest for AJAX applications, JavaScript 1.7

  20. MIPS: curated databases and comprehensive secondary data resources in 2010.

    Science.gov (United States)

    Mewes, H Werner; Ruepp, Andreas; Theis, Fabian; Rattei, Thomas; Walter, Mathias; Frishman, Dmitrij; Suhre, Karsten; Spannagl, Manuel; Mayer, Klaus F X; Stümpflen, Volker; Antonov, Alexey

    2011-01-01

    The Munich Information Center for Protein Sequences (MIPS at the Helmholtz Center for Environmental Health, Neuherberg, Germany) has many years of experience in providing annotated collections of biological data. Selected data sets of high relevance, such as model genomes, are subjected to careful manual curation, while the bulk of high-throughput data is annotated by automatic means. High-quality reference resources developed in the past and still actively maintained include Saccharomyces cerevisiae, Neurospora crassa and Arabidopsis thaliana genome databases as well as several protein interaction data sets (MPACT, MPPI and CORUM). More recent projects are PhenomiR, the database on microRNA-related phenotypes, and MIPS PlantsDB for integrative and comparative plant genome research. The interlinked resources SIMAP and PEDANT provide homology relationships as well as up-to-date and consistent annotation for 38,000,000 protein sequences. PPLIPS and CCancer are versatile tools for proteomics and functional genomics interfacing to a database of compilations from gene lists extracted from literature. A novel literature-mining tool, EXCERBT, gives access to structured information on classified relations between genes, proteins, phenotypes and diseases extracted from Medline abstracts by semantic analysis. All databases described here, as well as the detailed descriptions of our projects can be accessed through the MIPS WWW server (http://mips.helmholtz-muenchen.de).

  1. Understanding brittle deformation at the Olkiluoto site. Literature compilation for site characterization and geological modelling

    International Nuclear Information System (INIS)

    Milnes, A.G.

    2006-07-01

    The present report arose from the belief that geological modelling at Olkiluoto, Finland, where an underground repository for spent nuclear fuel is at present under construction, could be significantly improved by an increased understanding of the phenomena being modelled, in conjunction with the more sophisticated data acquisition and processing methods which are now being introduced. Since the geological model is the necessary basis for the rock engineering and hydrological models, which in turn provide the foundation for identifying suitable rock volumes underground and for demonstrating long-term safety, its scientific basis is of critical importance. As a contribution to improving this scientific basis, the literature on brittle deformation in the Earth's crust has been reviewed, and key references chosen and arranged, with the particular geology of the Olkiluoto site in mind. The result is a compilation of scientific articles, reports and books on some of the key topics, which are of significance for an improved understanding of brittle deformation of hard, crystalline rocks, such as those typical for Olkiluoto. The report is subdivided into six Chapters, covering (1) background information, (2) important aspects of the fabric of intact rock, (3) fracture mechanics and brittle microtectonics, (4) fracture data acquisition and processing, for the statistical characterisation and modelling of fracture systems, (5) the characterisation of brittle deformation zones for deterministic modelling, and (6) the regional geological framework of the Olkiluoto site. The Chapters are subdivided into a number of Sections, and each Section into a number of Topics. The citations are mainly collected under each Topic, embedded in a short explanatory text or listed chronologically without comment. The systematic arrangement of Chapters, Sections and Topics is such that the Table of Contents can be used to focus quickly on the theme of interest without the necessity of looking

  2. Modern water resources engineering

    CERN Document Server

    Yang, Chih

    2014-01-01

    The Handbook of Environmental Engineering series is an incredible collection of methodologies that study the effects of pollution and waste in their three basic forms: gas, solid, and liquid. This exciting new addition to the series, Volume 15: Modern Water Resources Engineering , has been designed to serve as a water resources engineering reference book as well as a supplemental textbook. We hope and expect it will prove of equal high value to advanced undergraduate and graduate students, to designers of water resources systems, and to scientists and researchers. A critical volume in the Handbook of Environmental Engineering series, chapters employ methods of practical design and calculation illustrated by numerical examples, include pertinent cost data whenever possible, and explore in great detail the fundamental principles of the field. Volume 15: Modern Water Resources Engineering, provides information on some of the most innovative and ground-breaking advances in the field today from a panel of esteemed...

  3. Programming languages and compiler design for realistic quantum hardware

    Science.gov (United States)

    Chong, Frederic T.; Franklin, Diana; Martonosi, Margaret

    2017-09-01

    Quantum computing sits at an important inflection point. For years, high-level algorithms for quantum computers have shown considerable promise, and recent advances in quantum device fabrication offer hope of utility. A gap still exists, however, between the hardware size and reliability requirements of quantum computing algorithms and the physical machines foreseen within the next ten years. To bridge this gap, quantum computers require appropriate software to translate and optimize applications (toolflows) and abstraction layers. Given the stringent resource constraints in quantum computing, information passed between layers of software and implementations will differ markedly from that in classical computing. Quantum toolflows must expose more physical details between layers, so the challenge is to find abstractions that expose key details while hiding enough complexity.

  5. The technical results of the Swedish nuclear weapons programme - a compilation of FOAs annual reports 1945-1972; Det svenska kaernvapenprogrammets tekniska resultat - en sammanstaellning av FOAs aarsrapporter 1945-1972

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, L.; Stenholm, L.

    2002-02-01

    The aim of this report is to summarise the nuclear weapons related research performed at FOA during 1945-1972. The report is a compilation of FOA's annual reports, which were originally classified but have now, for the most part, been declassified. References to separate reports in the different research areas are included in the report.

  6. Preliminary research on quantitative methods of water resources carrying capacity based on water resources balance sheet

    Science.gov (United States)

    Wang, Yanqiu; Huang, Xiaorong; Gao, Linyun; Guo, Biying; Ma, Kai

    2018-06-01

    Water resources are not only basic natural resources, but also strategic economic resources and ecological control factors. Water resources carrying capacity constrains the sustainable development of regional economy and society. Studies of water resources carrying capacity can provide helpful information about how the socioeconomic system is both supported and restrained by the water resources system. Based on the research of different scholars, the major problems in the study of water resources carrying capacity can be summarized as follows: the definition of water resources carrying capacity is not yet unified; quantification methods built on these inconsistent definitions are difficult to apply in practice; current quantitative methods do not fully reflect the principles of sustainable development; and it is difficult to quantify the relationships among water resources, the economy and society, and the ecological environment. It is therefore necessary to develop a better quantitative evaluation method for determining regional water resources carrying capacity. This paper proposes a new approach to quantifying water resources carrying capacity, namely the compilation of a water resources balance sheet, to assess regional water resources depletion and water environmental degradation (as well as regional water resources stock assets and liabilities), characterize the pressure that socioeconomic activities exert on the environment, and discuss quantitative calculation methods and a technical route for water resources carrying capacity that embody the substance of sustainable development.
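    The balance-sheet idea described above can be illustrated with a minimal calculation: treat renewable supply as assets, treat abstractions plus an ecological reserve as liabilities, and read the carrying margin from the net position. All category names and figures below are invented for illustration; they are not values from the paper.

    ```python
    # Minimal illustrative sketch of a water resources "balance sheet".
    # All quantities (in million m^3/yr) are hypothetical.

    def water_balance(assets, liabilities):
        """Return (total assets, total liabilities, net position)."""
        total_assets = sum(assets.values())
        total_liabilities = sum(liabilities.values())
        return total_assets, total_liabilities, total_assets - total_liabilities

    assets = {"surface_runoff": 820.0, "groundwater_recharge": 310.0}
    liabilities = {
        "agricultural_use": 540.0,
        "industrial_use": 180.0,
        "domestic_use": 120.0,
        "ecological_reserve": 200.0,  # flow reserved for the environment
    }

    a, l, net = water_balance(assets, liabilities)
    print(f"assets={a:.0f}, liabilities={l:.0f}, net={net:.0f}")
    # A positive net position suggests remaining carrying capacity;
    # a deficit signals overdraft of the regional water stock.
    ```

    In this reading, carrying capacity is the size of additional liabilities the region can absorb before the net position turns negative.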

  7. IAEA biological reference materials

    International Nuclear Information System (INIS)

    Parr, R.M.; Schelenz, R.; Ballestra, S.

    1988-01-01

    The Analytical Quality Control Services programme of the IAEA encompasses a wide variety of intercomparisons and reference materials. This paper reviews only those aspects of the subject having to do with biological reference materials. The 1988 programme foresees 13 new intercomparison exercises, one for major, minor and trace elements, five for radionuclides, and seven for stable isotopes. Twenty-two natural matrix biological reference materials are available: twelve for major, minor and trace elements, six for radionuclides, and four for chlorinated hydrocarbons. Seven new intercomparisons and reference materials are in preparation or under active consideration. Guidelines on the correct use of reference materials are being prepared for publication in 1989 in consultation with other major international producers and users of biological reference materials. The IAEA database on available reference materials is being updated and expanded in scope, and a new publication is planned for 1989. (orig.)

  8. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.; Tick, E. (comp.)

    1991-11-01

    This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31--November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purposes of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working on compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led different communities to emphasize different facets of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.

  9. Standard and reference materials for environmental science. Part 1. Technical memo

    Energy Technology Data Exchange (ETDEWEB)

    Cantillo, A.Y.

    1995-11-01

    This is the fourth edition of the catalog of reference materials suited for use in environmental science, originally compiled in 1986 for NOAA, IOC, and UNEP. The catalog lists more than 1200 reference materials from 28 producers and contains information about their proper use, sources, availability, and analyte concentrations. Indices are included for elements, isotopes, and organic compounds, as are cross references to CAS registry numbers, alternate names, and chemical structures of selected organic compounds.

  10. Standard and reference materials for environmental science. Part 2. Technical memo

    Energy Technology Data Exchange (ETDEWEB)

    Cantillo, A.Y.

    1995-11-01

    This is the fourth edition of the catalog of reference materials suited for use in environmental science, originally compiled in 1986 for NOAA, IOC, and UNEP. The catalog lists more than 1200 reference materials from 28 producers and contains information about their proper use, sources, availability, and analyte concentrations. Indices are included for elements, isotopes, and organic compounds, as are cross references to CAS registry numbers, alternate names, and chemical structures of selected organic compounds.

  11. A Hybrid Approach to Proving Memory Reference Monotonicity

    KAUST Repository

    Oancea, Cosmin E.

    2013-01-01

    Array references indexed by non-linear expressions or subscript arrays represent a major obstacle to compiler analysis and to automatic parallelization. Most previously proposed solutions either enhance the static analysis repertoire to recognize more patterns, infer array-value properties, and refine the mathematical support, or apply expensive run-time analysis of memory reference traces to disambiguate these accesses. This paper presents an automated solution based on static construction of access summaries, in which the reference non-linearity problem can be solved for a large class of reference patterns by extracting arbitrarily shaped predicates that can (in)validate the reference monotonicity property and thus (dis)prove loop independence. Experiments on six benchmarks show that our general technique for dynamic validation of the monotonicity property covers a large class of codes, incurs minimal run-time overhead and obtains good speedups. © 2013 Springer-Verlag.
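    The core observation exploited here is that a strictly monotone subscript sequence touches each memory location at most once, so iterations write disjoint locations and the loop is independent. A toy run-time check of that property, as an illustrative reduction of the paper's predicate-based approach rather than its actual implementation:

    ```python
    def strictly_monotonic(subscripts):
        """True if the subscript sequence is strictly increasing or decreasing."""
        pairs = list(zip(subscripts, subscripts[1:]))
        increasing = all(a < b for a, b in pairs)
        decreasing = all(a > b for a, b in pairs)
        return increasing or decreasing

    def loop_independent(subscripts):
        # A strictly monotonic subscript array indexes each array element at
        # most once, so iterations of "A[idx[i]] = ..." write disjoint
        # locations and the loop can run in parallel. (Monotonicity is a
        # sufficient condition here, not a necessary one.)
        return strictly_monotonic(subscripts)

    # Subscripts produced by a non-linear expression, e.g. idx[i] = i*i + 3:
    idx = [i * i + 3 for i in range(8)]
    print(loop_independent(idx))       # True: safe to parallelize

    idx_bad = [0, 2, 2, 5]             # repeated subscript -> possible conflict
    print(loop_independent(idx_bad))   # False: cannot prove independence
    ```

    The paper's contribution is to establish such monotonicity predicates statically from access summaries wherever possible, falling back to cheap dynamic validation like the above only when needed.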

  12. World uranium exploration, resources, production and related activities

    International Nuclear Information System (INIS)

    Hanly, A.

    2014-01-01

    A Nuclear Energy Series publication entitled “World Uranium Exploration, Resources, Production and Related Activities” (WUERPRA) will soon be published by the IAEA. The objective of the publication is to provide a comprehensive compilation of historic uranium exploration, resources, production and related activities based primarily on information from the 1966 to 2009 editions of the publication “Uranium Resources, Production and Demand”, a joint publication of the International Atomic Energy Agency and the Nuclear Energy Agency/Organization for Economic Cooperation and Development commonly known as the ‘Red Book’. This has been supplemented by historic information from other reliable sources. The publication also includes, where enough information was available, descriptions of the relative potential for discovery of new uranium resources on a per-country basis. To recover complete historic information it is frequently necessary to refer to earlier editions of the Red Book, many of which may not be readily available. This publication aims to provide one comprehensive source for much of this type of information, which will reduce the effort required to prepare future editions of the Red Book, as well as make the historic Red Book information, together with select related information from other sources, more readily available to all users with an interest in uranium. WUERPRA comprises 6 volumes containing 164 country reports, organized by region: Volume 1: Africa (53 countries); Volume 2: Central, Eastern and Southeastern Europe (25 countries); Volume 3: Southeastern Asia, Pacific, East Asia (18 countries); Volume 4: Western Europe (22 countries); Volume 5: Middle East, Central and Southern Asia (19 countries); and Volume 6: North America, Central America and South America (27 countries). The report also contains information on countries that have not reported to the Red Book. The poster will summarize select major highlights from each volume

  13. DrawCompileEvolve: Sparking interactive evolutionary art with human creations

    DEFF Research Database (Denmark)

    Zhang, Jinhong; Taarnby, Rasmus; Liapis, Antonios

    2015-01-01

    This paper presents DrawCompileEvolve, a web-based drawing tool which allows users to draw simple primitive shapes, group them together or define patterns in their groupings (e.g. symmetry, repetition). The user’s vector drawing is then compiled into an indirectly encoded genetic representation, which can be evolved interactively, allowing the user to change the image’s colors, patterns and ultimately transform it. The human artist has direct control while drawing the initial seed of an evolutionary run and indirect control while interactively evolving it, thus making DrawCompileEvolve a mixed...

  14. Compilation status and research topics in Hokkaido University Nuclear Reaction Data Centre

    International Nuclear Information System (INIS)

    Aikawa, M.; Furutachi, N.; Katō, K.; Ebata, S.; Ichinkhorloo, D.; Imai, S.; Sarsembayeva, A.; Zhou, B.; Otuka, N.

    2015-01-01

    Nuclear reaction data are necessary and applicable for many application fields. The nuclear reaction data must be compiled into a database for convenient availability. One such database is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC). As a member of the NRDC, the Hokkaido University Nuclear Reaction Data Centre (JCPRG) compiles charged-particle induced reaction data and contributes about 10 percent of the EXFOR database. In this paper, we show the recent compilation status and related research topics of JCPRG. (author)

  15. Compiling the parallel programming language NestStep to the CELL processor

    OpenAIRE

    Holm, Magnus

    2010-01-01

    The goal of this project is to create a source-to-source compiler which will translate NestStep code to C code. The compiler's job is to replace NestStep constructs with a series of function calls to the NestStep runtime system. NestStep is a parallel programming language extension based on the BSP model. It adds constructs for parallel programming on top of an imperative programming language. For this project, only constructs extending the C language are relevant. The output code will compil...
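    A source-to-source compiler of the kind described replaces language constructs with calls into a runtime system. The toy rewriter below shows that general shape using a simplified "step { ... }" construct and invented runtime function names (rt_step_begin/rt_step_end); it does not reflect NestStep's actual syntax or runtime API, and a real compiler would use a proper parser rather than a regex.

    ```python
    import re

    # Toy source-to-source pass: rewrite a simplified "step { ... }" construct
    # into bracketing calls to a hypothetical runtime system, leaving the
    # surrounding C code untouched.

    def rewrite_steps(source: str) -> str:
        pattern = re.compile(r"step\s*\{(.*?)\}", re.DOTALL)

        def repl(match):
            body = match.group(1).strip()
            # Replace the construct with runtime calls around its body.
            return f"rt_step_begin();\n{body}\nrt_step_end();"

        return pattern.sub(repl, source)

    src = "int x = 0;\nstep { x = compute(x); }\n"
    print(rewrite_steps(src))
    ```

    The essential property is that the output is plain C again, compilable by an ordinary C compiler and linked against the runtime library that implements the superstep semantics.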

  16. Regulatory and technical reports (abstract index journal): Annual compilation for 1994. Volume 19, Number 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order. These precede the following indexes: secondary report number index, personal author index, subject index, NRC originating organization index (staff reports), NRC originating organization index (international agreements), NRC contract sponsor index (contractor reports), contractor index, international organization index, and licensed facility index. A detailed explanation of the entries precedes each index.

  17. Regulatory and technical reports: compilation for third quarter 1982 July-September

    International Nuclear Information System (INIS)

    1982-11-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. These precede the following indexes: Contractor Report Number Index; Personal Author Index; Subject Index; NRC Originating Organization Index (Staff Reports); NRC Contract Sponsor Index (Contractor Reports); Contractor Index; and Licensed Facility Index

  18. USGS compilation of geographic information system (GIS) data of coal mines and coal-bearing areas in Mongolia

    Science.gov (United States)

    Trippi, Michael H.; Belkin, Harvey E.

    2015-09-10

    Geographic information system (GIS) information may facilitate energy studies, which in turn provide input for energy policy decisions. The U.S. Geological Survey (USGS) has compiled GIS data representing coal mines, deposits (including those with and without coal mines), occurrences, areas, basins, and provinces of Mongolia as of 2009. These data are now available for download, and may be used in a GIS for a variety of energy resource and environmental studies of Mongolia. Chemical data for 37 coal samples from a previous USGS study of Mongolia (Tewalt and others, 2010) are included in a downloadable GIS point shapefile and shown on the map of Mongolia. A brief report summarizes the methodology used for creation of the shapefiles and the chemical analyses run on the samples.

  19. A compilation of K+p --> K0 DELTA++ cross sections below 2 GeV/c

    CERN Document Server

    Giacomelli, G; Piccinini, M; Rimondi, F; Serra-Lugaresi, P

    1976-01-01

    Data published up to June 1976 on the quasi-two-body reaction K+p --> K0 DELTA++, with DELTA++ --> p pi+, are compiled for laboratory momenta from 0.7 to 2 GeV/c. They include integrated cross-sections, differential cross-sections, average and differential density matrix elements, as well as coefficients of the Legendre polynomial expansions of the production differential distributions. The data are presented in the form of graphs and computer-produced tables. The method of computation is the same as in a previous report (CERN-HERA-75-1) on K+N cross-sections below 2 GeV/c, to which the reader is referred for details on card formats, notations, etc.
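    Compilations of this kind report angular distributions as Legendre coefficients A_l, from which the differential cross-section is reconstructed as dsigma/dOmega(theta) = sum_l A_l P_l(cos theta). A short sketch of that reconstruction, using the Bonnet recurrence for P_l and purely hypothetical coefficient values (not numbers from this compilation):

    ```python
    def legendre(l, x):
        """Evaluate the Legendre polynomial P_l(x) via the Bonnet recurrence:
        (n+1) P_{n+1}(x) = (2n+1) x P_n(x) - n P_{n-1}(x)."""
        p_prev, p = 1.0, x  # P_0(x), P_1(x)
        if l == 0:
            return p_prev
        for n in range(1, l):
            p_prev, p = p, ((2 * n + 1) * x * p - n * p_prev) / (n + 1)
        return p

    def dsigma_domega(cos_theta, coeffs):
        """Differential cross-section from Legendre coefficients A_l."""
        return sum(a * legendre(l, cos_theta) for l, a in enumerate(coeffs))

    # Hypothetical coefficients in mb/sr, for illustration only.
    A = [1.20, 0.45, -0.10]
    for ct in (-1.0, 0.0, 1.0):
        print(f"cos(theta)={ct:+.1f}  dsigma/dOmega={dsigma_domega(ct, A):.3f}")
    ```

    The highest l needed grows with beam momentum, since higher partial waves contribute; truncating the expansion too early washes out forward/backward structure in the reconstructed distribution.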

  20. A compilation of information on the ³¹P(p,α)²⁸Si reaction and properties of excited levels in the compound nucleus ³²S

    Energy Technology Data Exchange (ETDEWEB)

    Miller, R.E.; Smith, D.L. [Argonne National Lab., IL (United States). Technology Development Div.

    1997-11-01

    This report documents a survey of the literature, and provides a compilation of data contained therein, for the ³¹P(p,α)²⁸Si reaction. Attention is paid here to resonance states in the compound-nuclear system ³²S formed by ³¹P + p, with emphasis on the alpha-particle decay channels, ²⁸Si + α, which populate specific levels in ²⁸Si. The energy region near the proton separation energy for ³²S is especially important in this context for applications in nuclear astrophysics. Properties of the excited states in ²⁸Si are also considered. Summaries of all the located references are provided and numerical data contained in them are compiled in EXFOR format where applicable.