WorldWideScience

Sample records for unit map compiled

  1. The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States

    Science.gov (United States)

    Horton, John D.; San Juan, Carma A.; Stoeser, Douglas B.

    2017-06-30

    The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States (https://doi.org/10.5066/F7WH2N65) represents a seamless, spatial database of 48 State geologic maps that range from 1:50,000 to 1:1,000,000 scale. A national digital geologic map database is essential in interpreting other datasets that support numerous types of national-scale studies and assessments, such as those that provide geochemistry, remote sensing, or geophysical data. The SGMC is a compilation of the individual U.S. Geological Survey releases of the Preliminary Integrated Geologic Map Databases for the United States. The SGMC geodatabase also contains updated data for seven States and seven entirely new State geologic maps that have been added since the preliminary databases were published. Numerous errors have been corrected and enhancements added to the preliminary datasets using thorough quality assurance/quality control procedures. The SGMC is not a truly integrated geologic map database because geologic units have not been reconciled across State boundaries. However, the geologic data contained in each State geologic map have been standardized to allow spatial analyses of lithology, age, and stratigraphy at a national scale.

  2. Karst in the United States: a digital map compilation and database

    Science.gov (United States)

    Weary, David J.; Doctor, Daniel H.

    2014-01-01

    This report describes new digital maps delineating areas of the United States, including Puerto Rico and the U.S. Virgin Islands, having karst or the potential for development of karst and pseudokarst. These maps show areas underlain by soluble rocks and also by volcanic rocks, sedimentary deposits, and permafrost that have potential for karst or pseudokarst development. All 50 States contain rocks with potential for karst development, and about 18 percent of their area is underlain by soluble rocks having karst or the potential for development of karst features. The areas of soluble rocks shown are based primarily on selection from State geologic maps of rock units containing significant amounts of carbonate or evaporite minerals. Areas underlain by soluble rocks are further classified by general climate setting, degree of induration, and degree of exposure. Areas having potential for volcanic pseudokarst are those underlain chiefly by basaltic-flow rocks no older than Miocene in age. Areas with potential for pseudokarst features in sedimentary rocks are in relatively unconsolidated rocks from which pseudokarst features, such as piping caves, have been reported. Areas having potential for development of thermokarst features, mapped exclusively in Alaska, contain permafrost in relatively thick surficial deposits containing ground ice. This report includes a GIS database with links from the map unit polygons to online geologic unit descriptions.

  3. Geospatial compilation and digital map of center-pivot irrigated areas in the mid-Atlantic region, United States

    Science.gov (United States)

    Finkelstein, Jason S.; Nardi, Mark R.

    2015-01-01

    To evaluate water availability within the Northern Atlantic Coastal Plain, the U.S. Geological Survey, in cooperation with the University of Delaware Agricultural Extension, created a dataset that maps the number of acres under center-pivot irrigation in the Northern Atlantic Coastal Plain study area. For this study, the extent of the Northern Atlantic Coastal Plain falls within areas of the States of New York, New Jersey, Delaware, Maryland, Virginia, and North Carolina. The irrigation dataset maps about 271,900 acres operated primarily under center-pivot irrigation in 57 counties. Manual digitizing was performed against aerial imagery in a process where operators used observable center-pivot irrigation signatures—such as irrigation arms, concentric wheel paths through cropped areas, and differential colors—to identify and map irrigated areas. The aerial imagery used for digitizing came from a variety of sources and seasons. The imagery contained a variety of spatial resolutions and included online imagery from the U.S. Department of Agriculture National Agricultural Imagery Program, Microsoft Bing Maps, and the Google Maps mapping service. The dates of the source images ranged from 2010 to 2012 for the U.S. Department of Agriculture imagery, whereas maps from the other mapping services were from 2013.

  4. Research and Practice of the News Map Compilation Service

    Science.gov (United States)

    Zhao, T.; Liu, W.; Ma, W.

    2018-04-01

    Based on the needs of news media for maps, this paper investigates the news map compilation service. It surveys demand for the service, designs and compiles an authoritative public base map suitable for media publication, and builds a news base-map material library. It examines the compilation of timely, highly targeted, and cross-regional domestic and international news maps; builds a thematic gallery of hot news and news map customization services; studies the types of news maps; establishes closer liaison and cooperation with news media; and guides news media in using correct maps. Drawing on practical experience with the service, the paper presents two cases of news map preparation for different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and identifies outstanding problems and suggestions for its development.

  6. Hydrothermal alteration maps of the central and southern Basin and Range province of the United States compiled from Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data

    Science.gov (United States)

    Mars, John L.

    2013-01-01

    Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data and Interactive Data Language (IDL) logical operator algorithms were used to map hydrothermally altered rocks in the central and southern parts of the Basin and Range province of the United States. The hydrothermally altered rocks mapped in this study include (1) hydrothermal silica-rich rocks (hydrous quartz, chalcedony, opal, and amorphous silica), (2) propylitic rocks (calcite-dolomite and epidote-chlorite mapped as separate mineral groups), (3) argillic rocks (alunite-pyrophyllite-kaolinite), and (4) phyllic rocks (sericite-muscovite). A series of hydrothermal alteration maps, which identify the potential locations of hydrothermal silica-rich, propylitic, argillic, and phyllic rocks on Landsat Thematic Mapper (TM) band 7 orthorectified images, and geographic information systems shape files of hydrothermal alteration units are provided in this study.
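    The study's mapping is driven by logical-operator tests on band ratios, implemented in IDL. As a rough illustration of the idea (not the published algorithm), the sketch below flags candidate argillic pixels in Python; the ratio choices and thresholds are hypothetical placeholders.

```python
import numpy as np

def argillic_mask(b4, b5, b6, t1=1.3, t2=1.05):
    """Flag pixels whose ASTER band ratios suggest argillic
    (alunite-pyrophyllite-kaolinite) absorption features.
    The ratios and thresholds t1, t2 are illustrative placeholders,
    not the parameters used in the published algorithms."""
    b4, b5, b6 = (np.asarray(b, dtype=float) for b in (b4, b5, b6))
    # A pixel passes only if both ratio tests exceed their thresholds.
    return (b4 / b6 > t1) & (b5 / b6 > t2)
```

    Real alteration-mapping algorithms chain several such ratio tests with logical AND/OR operators, one chain per mineral group (silica, propylitic, argillic, phyllic).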

  7. Digital compilation bedrock geologic map of the Warren quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-4A Walsh, GJ, Haydock, S, Prewitt, J, Kraus, J, Lapp, E, O'Loughlin, S, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the...

  8. Digital compilation bedrock geologic map of the Milton quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-8A Dorsey, R, Doolan, B, Agnew, PC, Carter, CM, Rosencrantz, EJ, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the Milton...

  9. Digital compilation bedrock geologic map of the Lincoln quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-5A Stanley, R, DelloRusso, V, Haydock, S, Lapp, E, O'Loughlin, S, Prewitt, J,and Tauvers, PR, 1995, Digital compilation bedrock geologic map...

  10. Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG96-03 Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont: VGS Open-File Report VG96-3A, 2 plates, scale...

  11. Digital compilation bedrock geologic map of the Mt. Ellen quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-6A Stanley, RS, Walsh, G, Tauvers, PR, DiPietro, JA, and DelloRusso, V, 1995, Digital compilation bedrock geologic map of the Mt. Ellen...

  12. Digital compilation bedrock geologic map of the South Mountain quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-3A Stanley, R.S., DelloRusso, V., Tauvers, P.R., DiPietro, J.A., Taylor, S., and Prahl, C., 1995, Digital compilation bedrock geologic map of...

  13. Maps compiled by the ESSO Minerals Company during their exploration program for uranium in South Africa

    International Nuclear Information System (INIS)

    Bertolini, A.; Pretorius, L.; Weideman, M.; Scheepers, T.

    1985-09-01

    The report is a bibliography of approximately one thousand maps. The maps contain information on ESSO Minerals Company's prospecting activities, mainly for uranium, in South Africa. ESSO explored for uranium in the Karoo, Northwestern Cape, and the Bushveld. The bibliography contains two indexes: one is a list of prospects and projects per geological province, and the other is an alphabetical list of projects and prospects. Three geological provinces are distinguished, namely the Bushveld province, Karoo province, and Namaqualand province. The annotations contain information on the location and geographic area of the map, the name of the project or prospect, the title, a statement of responsibility (this includes the compilers, i.e., geologists and/or draftsmen), the statement of scale, which is always expressed as a ratio, the date of compilation and/or revision, and a few keywords to indicate the topical subject matter.

  14. The Research and Compilation of City Maps in the National Geomatics Atlas of the PEOPLE'S Republic of China

    Science.gov (United States)

    Wang, G.; Wang, D.; Zhou, W.; Chen, M.; Zhao, T.

    2018-04-01

    The research and compilation of the new-century edition of the National Huge Atlas of the People's Republic of China is a special basic-work project of the Ministry of Science and Technology of the People's Republic of China; the research and compilation of the National Geomatics Atlas of the People's Republic of China is its main content. The National Geomatics Atlas of China consists of four groups of maps and a place-name index. The four map groups are the nationwide thematic map group, the provincial fundamental geographical map group, the land-cover map group, and the city map group. The city map group is an important component of the National Geomatics Atlas of China and mainly shows the process of urbanization in China. Focusing on the design and compilation of 39 city-wide maps, this paper briefly introduces mapping-area research and scale design, the mapping technical route, content selection and cartographic generalization, symbol design and map visualization, etc.

  15. Geologic map of the Shaida deposit and Misgaran prospect, Herat Province, Afghanistan, modified from the 1973 original map compilation of V.I. Tarasenko and others

    Science.gov (United States)

    Tucker, Robert D.; Stettner, Will R.; Masonic, Linda M.; Moran, Thomas W.

    2014-01-01

    This map is a modified version of Geological map and map of useful minerals, Shaida area, scale 1:50,000, which was compiled by V.I. Tarasenko, N.I. Borozenets, and others in 1973. Scientists from the U.S. Geological Survey, in cooperation with the Afghan Geological Survey and the Task Force for Business and Stability Operations of the U.S. Department of Defense, studied the original document and related reports and also visited the field area in August 2010. This modified map illustrates the geological structure of the Shaida copper-lead-zinc deposit and Misgaran copper-lead-zinc prospect in western Afghanistan and includes cross sections of the same area. The map reproduces the topology (contacts, faults, and so forth) of the original Soviet map and cross sections and includes modifications based on our examination of these documents and on observations made during our field visit. Elevations on the cross sections are derived from the original Soviet topography and might not match the newer topography used on the current map. We have attempted to translate the original Russian terminology and rock classification into modern English geologic usage as literally as possible without changing any genetic or process-oriented implications in the original descriptions. We also use the age designations from the original map. The unit colors on the map and cross sections differ from the colors shown on the original version. The units are colored according to the color and pattern scheme of the Commission for the Geological Map of the World (CGMW) (http://www.ccgm.org).

  16. Genetic maps and physical units

    International Nuclear Information System (INIS)

    Karunakaran, V.; Holt, G.

    1976-01-01

    The relationships between physical and genetic units are examined. Genetic mapping involves the detection of linkage of genes and the measurement of recombination frequencies. The genetic distance is measured in map units and is proportional to the recombination frequencies between linked markers. Physical mapping of genophores, particularly the simple genomes of bacteriophages and bacterial plasmids, can be achieved through heteroduplex analysis. Genetic distances are dependent on recombination frequencies and, therefore, can only be correlated accurately with physical unit lengths if the recombination frequency is constant throughout the entire genome. Methods are available to calculate the equivalent length of DNA per average map unit in different organisms. Such estimates indicate significant differences from one organism to another. Gene lengths can also be calculated from the number of amino acids in a specified polypeptide, relating this to the number of nucleotides required to code for such a polypeptide. Many attempts have been made to relate microdosimetric measurements to radiobiological data. For irradiation effects involving deletion of genetic material, such a detailed correlation may be possible in systems where heteroduplex analysis or amino acid sequencing can be performed. The problems that DNA packaging and other functional associations within the cell pose for interpreting such data are discussed.
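    The gene-length calculation mentioned above is simple arithmetic: each amino acid is encoded by a nucleotide triplet (codon), so a polypeptide of n residues requires 3n coding nucleotides, plus one stop codon. A minimal sketch:

```python
def coding_length_nt(num_amino_acids, include_stop=True):
    """Nucleotides needed to encode a polypeptide: one codon (3 nt)
    per amino acid, plus an optional stop codon (3 nt)."""
    return 3 * num_amino_acids + (3 if include_stop else 0)

# e.g. a 300-residue protein needs 900 coding nucleotides, 903 with stop
```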

  17. Compilation of data and descriptions for United States and foreign liquid metal fast breeder reactors

    International Nuclear Information System (INIS)

    Appleby, E.R.

    1975-08-01

    This document is a compilation of design and engineering information pertaining to liquid metal cooled fast breeder reactors which have operated, are operating, or are currently under construction in the United States and abroad. All data have been taken from publicly available documents, journals, and books.

  18. Bedrock Geologic Map of Vermont - Units

    Data.gov (United States)

    Vermont Center for Geographic Information — The bedrock geology was last mapped at a statewide scale 50 years ago at a scale of 1:250,000 (Doll and others, 1961). The 1961 map was compiled from 1:62,500-scale...

  19. Aeromagnetic map compilation: procedures for merging and an example from Washington

    Directory of Open Access Journals (Sweden)

    C. Finn

    2000-06-01

    Rocks in Antarctica and offshore have widely diverse magnetic properties. Consequently, aeromagnetic data collected there can improve knowledge of the geologic, tectonic and geothermal characteristics of the region. Aeromagnetic data can map concealed structures such as faults, folds and dikes, ascertain basin thickness and locate buried volcanic rocks, as well as some intrusive and metamorphic rocks. Gridded, composite data sets allow a view of continental-scale trends that individual data sets do not provide and link widely separated areas of outcrop and disparate geologic studies. Individual magnetic surveys must be processed so that they match adjacent surveys prior to merging. A consistent representation of the Earth's magnetic field (the International Geomagnetic Reference Field, IGRF) must be removed from each data set. All data sets need to be analytically continued to the same flight elevation, with their datums shifted to match adjacent data. I advocate minimal processing to best represent the individual surveys in the merged compilation. An example of a compilation of aeromagnetic surveys from Washington illustrates the utility of aeromagnetic maps for providing synoptic views of regional tectonic features.
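    Two of the merging steps described above, analytic (upward) continuation to a common flight elevation and datum shifting against an overlap region, can be sketched numerically. The following is an illustrative NumPy sketch, not the processing chain used in the paper; it assumes square grid spacing and uses the standard Fourier-domain upward-continuation filter exp(-|k| dz).

```python
import numpy as np

def upward_continue(grid, dx, dz):
    """Analytically continue a gridded magnetic anomaly upward by dz metres
    using the Fourier-domain filter exp(-|k| dz); dx is the (square) grid
    spacing in metres. The DC level (grid mean) is unchanged."""
    ny, nx = grid.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)   # angular wavenumbers, x
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)   # angular wavenumbers, y
    k = np.sqrt(kx[None, :]**2 + ky[:, None]**2)
    return np.real(np.fft.ifft2(np.fft.fft2(grid) * np.exp(-k * dz)))

def merge_surveys(a, b, overlap_mask):
    """Shift survey b's datum so its mean matches survey a over the overlap
    region, then average the two grids where both are defined (NaN = no data)."""
    shift = np.nanmean(a[overlap_mask]) - np.nanmean(b[overlap_mask])
    b_adj = b + shift
    return np.where(np.isnan(a), b_adj,
                    np.where(np.isnan(b_adj), a, 0.5 * (a + b_adj)))
```

    Continuing upward attenuates short-wavelength content, which is why surveys flown at different elevations must be brought to a common datum before their grids can be stitched without artificial discontinuities.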

  20. Basement domain map of the conterminous United States and Alaska

    Science.gov (United States)

    Lund, Karen; Box, Stephen E.; Holm-Denoma, Christopher S.; San Juan, Carma A.; Blakely, Richard J.; Saltus, Richard W.; Anderson, Eric D.; DeWitt, Ed

    2015-01-01

    The basement-domain map is a compilation of basement domains in the conterminous United States and Alaska designed to be used at 1:5,000,000 scale, particularly as a base layer for national-scale mineral resource assessments. Seventy-seven basement domains are represented as eighty-three polygons on the map. The domains are based on interpretations of basement composition, origin, and architecture and were developed from a variety of sources. Analysis of previously published basement, lithotectonic, and terrane maps, as well as models of planetary development, was used to formulate the concept of basement and the methodology of defining domains that span the ages of Archean to present but formed through different processes. The preliminary compilations for the study areas utilized these maps, national-scale gravity and aeromagnetic data, published and limited new age and isotopic data, limited new field investigations, and conventional geologic maps. Citations of the relevant source data for the compilations, and of the source and types of original interpretation as derived from different types of data, are provided in supporting descriptive text and tables.

  1. Karst mapping in the United States: Past, present and future

    Science.gov (United States)

    Weary, David J.; Doctor, Daniel H.

    2015-01-01

    The earliest known comprehensive karst map of the entire USA was published by Stringfield and LeGrand (1969), based on compilations of William E. Davies of the U.S. Geological Survey (USGS). Various versions of essentially the same map have been published since. The USGS recently published new digital maps and databases depicting the extent of known karst, potential karst, and pseudokarst areas of the United States of America including Puerto Rico and the U.S. Virgin Islands (Weary and Doctor, 2014). These maps are based primarily on the extent of potentially karstic soluble rock types, and rocks with physical properties conducive to the formation of pseudokarst features. These data were compiled and refined from multiple sources at various spatial resolutions, mostly as digital data supplied by state geological surveys. The database includes polygons delineating areas with potential for karst and that are tagged with attributes intended to facilitate classification of karst regions. Approximately 18% of the surface of the fifty United States is underlain by significantly soluble bedrock. In the eastern United States the extent of outcrop of soluble rocks provides a good first-approximation of the distribution of karst and potential karst areas. In the arid western states, the extent of soluble rock outcrop tends to overestimate the extent of regions that might be considered as karst under current climatic conditions, but the new dataset encompasses those regions nonetheless. This database will be revised as needed, and the present map will be updated as new information is incorporated.

  2. A Lithology Based Map Unit Schema For Onegeology Regional Geologic Map Integration

    Science.gov (United States)

    Moosdorf, N.; Richard, S. M.

    2012-12-01

    A system of lithogenetic categories for a global lithological map (GLiM, http://www.ifbm.zmaw.de/index.php?id=6460&L=3) has been compiled based on analysis of lithology/genesis categories for regional geologic maps for the entire globe. The scheme is presented for discussion and comment. Analysis of units on a variety of regional geologic maps indicates that units are defined based on assemblages of rock types, as well as their genetic type. In this compilation of continental geology, outcropping surface materials are dominantly sediment/sedimentary rock; major subdivisions of the sedimentary category include clastic sediment, carbonate sedimentary rocks, clastic sedimentary rocks, mixed carbonate and clastic sedimentary rock, colluvium and residuum. Significant areas of mixed igneous and metamorphic rock are also present. A system of global categories to characterize the lithology of regional geologic units is important for Earth System models of matter fluxes to soils, ecosystems, rivers and oceans, and for regional analysis of Earth surface processes at global scale. Because different applications of the classification scheme will focus on different lithologic constituents in mixed units, an ontology-type representation of the scheme that assigns properties to the units in an analyzable manner will be pursued. The OneGeology project is promoting deployment of geologic map services at million scale for all nations. Although initial efforts are commonly simple scanned map WMS services, the intention is to move towards data-based map services that categorize map units with standard vocabularies to allow use of a common map legend for better visual integration of the maps (e.g. see OneGeology Europe, http://onegeology-europe.brgm.fr/geoportal/viewer.jsp). Current categorization of regional units with a single lithology from the CGI SimpleLithology (http://resource.geosciml.org/201202/Vocab2012html/SimpleLithology201012.html) vocabulary poorly captures the

  3. Compilation of a soil map for Nigeria: a nation-wide soil resource ...

    African Journals Online (AJOL)

    This paper presents the results of a nation-wide soil and land form inventory of Nigeria. The data compilation was conducted in the framework of two projects with the objective to calculate agricultural production potential under different input levels and assess the water erosion hazard. The information on spatial distribution ...

  4. Post-Columbia River Basalt Group stratigraphy and map compilation of the Columbia Plateau, Oregon

    International Nuclear Information System (INIS)

    Farooqui, S.M.; Bunker, R.C.; Thoms, R.E.; Clayton, D.C.; Bela, J.L.

    1981-01-01

    This report presents the results of reconnaissance mapping of sedimentary deposits and volcanic rocks overlying the Columbia River Basalt Group. The project area covers parts of The Dalles, Pendleton, Grangeville, Baker, Canyon City, and Bend quadrangles. The mapping was done to provide stratigraphic data on the sedimentary deposits and volcanic rocks overlying the Columbia River Basalt Group. 160 refs., 16 figs., 1 tab.

  5. Geologic map of metallic and nonmetallic mineral deposits, Badakhshan Province, Afghanistan, modified from the 1967 original map compilation of G.G. Semenov and others

    Science.gov (United States)

    Peters, Stephen G.; Stettner, Will R.; Mathieux, Donald P.; Masonic, Linda M.; Moran, Thomas W.

    2014-01-01

    This geologic map of central Badakhshan Province, Afghanistan, is a combined, redrafted, and modified version of the Geological map of central Badakhshan, scale 1:200,000 (sheet 217), and Map of minerals of central Badakhshan, scale 1:200,000 (also sheet 217) from Semenov and others (1967) (Soviet report no. R0815). That unpublished Soviet report contains the original maps and cross sections, which were prepared in cooperation with the Ministry of Mines and Industries of the Republic of Afghanistan in 1967 under contract no. 1378 (Technoexport, USSR). This USGS publication also includes the gold metallogeny summarized in Abdullah and others (1977) and Peters and others (2007, 2011), and additional compilations from Guguev and others (1967).

  6. A compilation of radioelement concentrations in granitic rocks of the contiguous United States

    International Nuclear Information System (INIS)

    Stuckless, J.S.; VanTrump, G. Jr.

    1982-01-01

    Concentration data for uranium, thorium, and potassium have been compiled for approximately 2,500 granitic samples from the contiguous United States. Uranium and thorium concentrations and ratios involving these elements exhibit a log-normal distribution with statistical parameters. In order to check for a bias in the results due to high concentrations of data in anomalous or heavily sampled areas, the data were reevaluated by averaging all analyses within a 0.5° latitude by 0.5° longitude grid. The resulting data set contains 330 entries for which radioelements are log-normally distributed. Mean values are not significantly different from those of the ungridded data, but standard deviations are lower by as much as nearly 50 percent. The areal distribution of anomalously high values (more than one standard deviation greater than the geometric mean) does not delineate large uranium districts by either treatment of the data. There is sufficient information for approximately 1,500 samples to permit subdivision of the granites by degree of alumina saturation. Relative to the six variables listed above, peraluminous samples have slightly lower mean values, but the differences are not statistically significant. Standard deviations are also largest for the peraluminous granites, with σ for Th/U nearly 3 times larger for peraluminous granite than for metaluminous granite. Examination of the variations in Th/U ratios for a few specific granites for which isotopic data are available suggests that variability is caused by late-stage magmatic or secondary processes that may be associated with ore-forming processes. Therefore, although anomalous radioelement concentrations in granitic rocks do not seem to be useful in delineating large uranium provinces with sediment-hosted deposits, highly variable uranium concentrations or Th/U ratios in granitic rocks may be helpful in the search for uranium deposits.
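    The two statistical treatments described above, cell averaging to de-weight heavily sampled areas and log-normal summary statistics, are straightforward to reproduce. A minimal Python sketch (illustrative, not the authors' code):

```python
import numpy as np

def grid_average(lat, lon, values, cell=0.5):
    """Average all analyses falling in each cell-degree latitude/longitude
    cell, reducing the weight of anomalously dense sampling. Returns one
    averaged value per occupied cell."""
    cells = {}
    for la, lo, v in zip(lat, lon, values):
        key = (np.floor(la / cell), np.floor(lo / cell))
        cells.setdefault(key, []).append(v)
    return np.array([np.mean(vs) for vs in cells.values()])

def lognormal_stats(values):
    """Geometric mean and multiplicative standard deviation, the natural
    summary statistics for log-normally distributed concentrations."""
    logs = np.log(np.asarray(values, dtype=float))
    return np.exp(logs.mean()), np.exp(logs.std())
```

    Cell averaging shrinks the spread (here, the multiplicative standard deviation) without much affecting the geometric mean, which is the behaviour the abstract reports for the gridded data set.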

  7. Mapping and prediction of schistosomiasis in Nigeria using compiled survey data and Bayesian geospatial modelling

    DEFF Research Database (Denmark)

    Ekpo, Uwem F.; Hürlimann, Eveline; Schur, Nadine

    2013-01-01

    Schistosomiasis prevalence data for Nigeria were extracted from peer-reviewed journals and reports, geo-referenced and collated in a nationwide geographical information system database for the generation of point prevalence maps. This exercise revealed that the disease is endemic in 35 of the cou...

  8. Compilation of functional soil maps for the support of spatial planning and land management in Hungary

    Science.gov (United States)

    Pásztor, László; Laborczi, Annamária; Takács, Katalin; Szatmári, Gábor; Fodor, Nándor; Illés, Gábor; Bakacsi, Zsófia; Szabó, József

    2015-04-01

    The main objective of the DOSoReMI.hu (Digital, Optimized, Soil Related Maps and Information in Hungary) project is to significantly extend the ways in which demands for spatial soil-related information can be satisfied in Hungary. Although a great amount of soil information is available from former mappings and surveys, discrepancies between the available and the expected data emerge more and more frequently. The gaps are planned to be filled with optimized digital soil mapping (DSM) products heavily based on legacy soil data. Delineation of Areas with Excellent Productivity in the framework of the National Regional Development Plan, or delimitation of Areas with Natural Constraints in Hungary according to the common European biophysical criteria, are primary issues in national-level spatial planning. Impact assessment of the forecasted climate change and analysis of the possibilities for adaptation in agriculture and forestry can be supported by scenario-based land-management modelling, whose results can also be incorporated in spatial planning. All these challenges require adequate, preferably timely and spatially detailed, knowledge of the soil cover. To satisfy these demands, the soil conditions of Hungary have been digitally mapped based on the most detailed available recent and legacy soil data, applying proper DSM techniques. Various soil-related information was mapped in three distinct approaches: (i) basic soil properties determining agri-environmental conditions (e.g., soil type according to the Hungarian genetic classification, rootable depth, sand, silt and clay content by soil layers, pH, OM and carbonate content for the plough layer); (ii) biophysical criteria of natural handicaps (e.g., poor drainage, unfavourable texture and stoniness, shallow rooting depth, poor chemical properties and soil moisture balance) defined by the common European system; and (iii) agro-meteorologically modelled yield values for different crops, meteorological

  9. Subsurface temperature maps in French sedimentary basins: new data compilation and interpolation

    International Nuclear Information System (INIS)

    Bonte, D.; Guillou-Frottier, L.; Garibaldi, C.; Bourgine, B.; Lopez, S.; Bouchot, V.; Garibaldi, C.; Lucazeau, F.

    2010-01-01

    Assessment of the underground geothermal potential requires knowledge of deep temperatures (1-5 km). Here, we present new temperature maps obtained from oil boreholes in the French sedimentary basins. Because of their origin, the data need to be corrected, and their local character necessitates spatial interpolation. Previous maps were obtained in the 1970s using empirical corrections and manual interpolation. In this study, we update the number of measurements with values collected during the last thirty years, correct the temperatures for transient perturbations, and carry out statistical analyses before modelling the 3D distribution of temperatures. This dataset provides 977 temperatures corrected for transient perturbations in 593 boreholes located in the French sedimentary basins. An average temperature gradient of 30.6 °C/km is obtained for a representative surface temperature of 10 °C. When surface temperature is not accounted for, deep measurements are best fitted with a temperature gradient of 25.7 °C/km. We perform a geostatistical analysis on a residual temperature dataset (using a drift of 25.7 °C/km) to constrain the 3D kriging interpolation procedure with horizontal and vertical variogram models. The interpolated residual temperatures are added to the country-scale averaged drift to obtain a three-dimensional thermal structure of the French sedimentary basins. The 3D thermal block enables us to extract isothermal surfaces and 2D sections (iso-depth maps and iso-longitude cross-sections). A number of anomalies with limited depth and spatial extent have been identified, from shallow in the Rhine graben and Aquitanian basin to deep in the Provence basin.
Some of these anomalies (Paris basin, Alsace, south of the Provence basin) may be partly related to thick insulating sediments, while for some others (southwestern Aquitanian basin, part of the Provence basin) large-scale fluid circulation may explain superimposed
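The residual-drift workflow described in this record (subtract a country-scale linear drift, interpolate the residuals in 3-D, add the drift back) can be sketched in a few lines. The borehole values below are invented for illustration, and plain inverse-distance weighting stands in for the study's variogram-constrained kriging:

```python
import math

# Illustrative borehole data (x_km, y_km, depth_km, T_degC) -- not values
# from the study.
wells = [
    (0.0, 0.0, 1.0, 38.0),
    (10.0, 0.0, 2.0, 62.0),
    (0.0, 10.0, 3.0, 85.0),
]

GRADIENT = 25.7   # deg C/km, country-scale drift used in the study
T_SURF = 10.0     # deg C, representative surface temperature (assumed intercept)

def drift(depth_km):
    """Country-scale linear temperature drift."""
    return T_SURF + GRADIENT * depth_km

# Step 1: residuals = observed temperature minus drift.
residuals = [(x, y, z, t - drift(z)) for (x, y, z, t) in wells]

def interp_residual(x, y, z, pts, power=2.0):
    """Inverse-distance weighting in 3-D, a simplified stand-in for the
    variogram-constrained kriging used in the study."""
    num = den = 0.0
    for (px, py, pz, r) in pts:
        d = math.dist((x, y, z), (px, py, pz))
        if d == 0.0:
            return r  # exact interpolator at data points
        w = d ** (-power)
        num += w * r
        den += w
    return num / den

def temperature(x, y, z):
    # Step 2: add the interpolated residual back to the drift.
    return drift(z) + interp_residual(x, y, z, residuals)

print(round(temperature(0.0, 0.0, 1.0), 1))  # reproduces the observed 38.0
```

Because the interpolator is exact at data points, the reconstructed field honors every corrected borehole temperature while reverting to the regional drift far from the wells.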

  10. Industry Contributions to Seafloor Mapping: Building Partnerships for Collecting, Sharing, and Compiling Data

    Science.gov (United States)

    Brumley, K. J.; Mitchell, G. A.; Millar, D.; Saade, E. J.; Gharib, J. J.

    2017-12-01

    In an effort to map the remaining 85% of the world's seafloor, The Nippon Foundation and GEBCO have launched Seabed 2030 to provide high-resolution bathymetry for all ocean waters by the year 2030. This ambitious effort will require sharing of bathymetric information to build a global baseline bathymetry database. Multibeam echosounder (MBES) data are a promising source of data for Seabed 2030. These data benefit multiple users: they include not only bathymetric information but also valuable backscatter data (useful for determining seafloor characteristics), as well as water column data, which can be used to explore other aspects of the marine environment and potentially help constrain some of the ocean's methane flux estimates. Fugro provides global survey services for clients in the oil and gas, telecommunications, and infrastructure industries, and for state and federal agencies. With a global fleet of survey vessels and autonomous vehicles equipped with state-of-the-art MBES systems, Fugro has performed some of the world's largest offshore surveys over the past several years, mapping close to 1,000,000 km2 of seafloor per year with high-resolution MBES data using multi-vessel operational models and new methods for merging datasets from different multibeam sonar systems. Although most of these data are proprietary, Fugro is working with clients in the private sector to make data available to the Seabed 2030 project at a decimated resolution of 100 m. The company is also contributing the MBES data acquired during transits to survey locations. Fugro has also partnered with the Shell Ocean Discovery XPRIZE to support development of new rapid, unmanned, high-resolution ocean mapping technologies that can benefit understanding of the world's oceans. Collaborative approaches such as these are helping to establish a new standard for other industry contributions, and to facilitate a new outlook for data sharing among the public and private sectors.
Recognizing the importance of an

  11. Compilation of data relating to the erosive response of 608 recently-burned basins in the western United States

    Science.gov (United States)

    Gartner, Joseph E.; Cannon, Susan H.; Bigio, Erica R.; Davis, Nicole K.; Parrett, Charles; Pierce, Kenneth L.; Rupert, Michael G.; Thurston, Brandon L.; Trebesch, Matthew J.; Garcia, Steve P.; Rea, Alan H.

    2005-01-01

    This report presents a compilation of data on the erosive response, debris-flow initiation processes, basin morphology, burn severity, event-triggering rainfall, rock type, and soils for 608 basins recently burned by 53 fires located throughout the Western United States.  The data presented here are a combination of those collected during our own field research and those reported in the literature.  In some cases, data from a Geographic Information System (GIS) and Digital Elevation Models (DEMs) were used to supplement the data from the primary source.  Due to gaps in the information available, not all parameters are characterized for all basins. This database provides a resource for researchers and land managers interested in examining relations between the runoff response of recently burned basins and their morphology, burn severity, soils and rock type, and triggering rainfall.  The purpose of this compilation is to provide a single resource for future studies addressing problems associated with wildfire-related erosion.  For example, data in this compilation have been used to develop a model for debris flow probability from recently burned basins using logistic multiple regression analysis (Cannon and others, 2004).  This database provides a convenient starting point for other studies.  For additional information on estimated post-fire runoff peak discharges and debris-flow volumes, see Gartner and others (2004).

  12. Compilation of 137Cs concentrations at selected sites in the continental United States

    International Nuclear Information System (INIS)

    Mohr, R.A.; Franks, L.A.

    1982-01-01

    This report summarizes results of cesium-137 analyses of soil samples obtained at 21 locations throughout the continental United States. The sites were all in the vicinity of operating nuclear power reactors, or those scheduled for operation. Selected fallout and meteorological data are also included

  13. Ada Integrated Environment III Computer Program Development Specification. Volume III. Ada Optimizing Compiler.

    Science.gov (United States)

    1981-12-01

    library-file.library-unit{.subunit}.SYMAP Statement Map: library-file.library-unit{.subunit}.SMAP Type Map: library-file.library-unit{.subunit}.TMAP The library...generator SYMAP Symbol Map code generator SMAP Updated Statement Map code generator TMAP Type Map code generator A.3.5 The PUNIT Command The PUNIT...Core.Stmtmap) NAME Tmap (Core.Typemap) END Example A-3 Compiler Command Stream for the Code Generator Texas Instruments A-5 Ada Optimizing Compiler

  14. Geologic map of the Dusar area, Herat Province, Afghanistan; Modified from the 1973 original map compilations of V.I. Tarasenko and others

    Science.gov (United States)

    Tucker, Robert D.; Stettner, Will R.; Masonic, Linda M.; Bogdanow, Anya K.

    2017-10-24

    approximately concordant with the plane of the metamorphic fabric. The veins consist mostly of quartz, with minor carbonate and sulfide minerals, and display weak alteration halos along their margins. The gossans are locally anomalous in these metals, but their size and extent make them attractive exploration targets for potential massive sulfide mineralization. The Dusar gossan zone is a massive, ochreous, and siliceous limonitic rock, approximately 2,200 meters long, 30 to 250 meters wide, and 2.0 to 7.2 meters thick. Drilling below the Dusar gossan intersected a siliceous, sericitic, and limonitic rock underlain by quartz keratophyre with abundant disseminated pyrite. Mineralized sections grade 0.06 weight percent copper and up to 0.05 weight percent zinc. The Namak-sory gossan zone contains a similar deposit with anomalous concentrations of copper, zinc, and gold. The redrafted maps and cross sections reproduce the topology of rock units, contacts, and faults of the original Soviet maps and cross sections, and include minor modifications based on examination of the originals and on observations made during two brief field visits by USGS staff in August 2010 and June 2013.

  15. Mapping severe fire potential across the contiguous United States

    Science.gov (United States)

    Brett H. Davis

    2016-01-01

    The Fire Severity Mapping System (FIRESEV) project is an effort to provide critical information and tools to fire managers that enhance their ability to assess potential ecological effects of wildland fire. A major component of FIRESEV is the development of a Severe Fire Potential Map (SFPM), a geographic dataset covering the contiguous United States (CONUS) that...

  16. COMPILATION OF GEOMORPHOLOGICAL MAP FOR RECONSTRUCTING THE DEGLACIATION OF ICE-FREE AREAS IN THE MARTEL INLET, KING GEORGE ISLAND, ANTARCTICA

    Directory of Open Access Journals (Sweden)

    Kátia Kellem Rosa

    2014-03-01

    We compiled a geomorphological map and a reconstruction map of glacier extent and ice-free areas in the Martel Inlet, located on King George Island, South Shetlands, Antarctica. Glacier extent data were derived by digitizing over an orthophotomosaic (2003) and over SPOT (February 1988; March 1995 and 2000), Quickbird (October 2006), and Cosmo-Skymed (February 2011) images. The mapping was supported by fieldwork carried out in the summers of 2007, 2010, and 2011, and by topographic surveys and geomorphic mapping in the proglacial area. Several types of glacial deposits were identified in the study area, such as frontal and lateral moraines, flutes, and meltwater channels, along with erosional features such as roches moutonnées, striations, and U-shaped valleys. These features allowed reconstruction of the evolution of the deglaciation environment in the Martel Inlet ice-free areas, which has been affected by a regional climate warming trend. The mapped data indicated that the glaciers in the study area lost about 0.71 km² of their ice masses (13.2% of the 50.3 km² total area), without any advances, during 1979-2011. Over that period these glaciers receded by an average of 25.9 m a⁻¹. These ice-free areas were susceptible to rapid post-depositional changes.

  17. Quaternary Geologic Map of the Lake of the Woods 4 Degrees x 6 Degrees Quadrangle, United States and Canada

    Science.gov (United States)

    Sado, Edward V.; Fullerton, David S.; Goebel, Joseph E.; Ringrose, Susan M.; Edited and Integrated by Fullerton, David S.

    1995-01-01

    The Quaternary Geologic Map of the Lake of the Woods 4 deg x 6 deg Quadrangle, United States and Canada, was mapped as part of the U.S. Geological Survey Quaternary Geologic Atlas of the United States map series (Miscellaneous Investigations Series I-1420, NM-15). The atlas was begun as an effort to depict the areal distribution of surficial geologic deposits and other materials that accumulated or formed during the past 2+ million years, the period that includes all activities of the human species. These materials are at the surface of the earth. They make up the 'ground' on which we walk, the 'dirt' in which we dig foundations, and the 'soil' in which we grow crops. Most of our human activity is related in one way or another to these surface materials that are referred to collectively by many geologists as regolith, the mantle of fragmental and generally unconsolidated material that overlies the bedrock foundation of the continent. The maps were compiled at 1:1,000,000 scale. This map is a product of collaboration of the Ontario Geological Survey, the Minnesota Geological Survey, the Manitoba Department of Energy and Mines, and the U.S. Geological Survey, and is designed for both scientific and practical purposes. It was prepared in two stages. First, separate maps and map explanations were prepared by the compilers. Second, the maps were combined, integrated, and supplemented by the editor. Map unit symbols were revised to a uniform system of classification and the map unit descriptions were prepared by the editor from information received from the compilers and from additional sources listed under Sources of Information. Diagrams accompanying the map were prepared by the editor. 
For scientific purposes, the map differentiates Quaternary surficial deposits on the basis of lithology or composition, texture or particle size, structure, genesis, stratigraphic relationships, engineering geologic properties, and relative age, as shown on the correlation diagram and

  18. USGS Governmental Unit Boundaries Overlay Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Governmental Unit Boundaries service from The National Map (TNM) represents major civil areas for the Nation, including States or Territories, counties (or...

  19. Quaternary Geologic Map of the Lake Nipigon 4 Degrees x 6 Degrees Quadrangle, United States and Canada

    Science.gov (United States)

    Sado, Edward V.; Fullerton, David S.; Farrand, William R.; Edited and Integrated by Fullerton, David S.

    1994-01-01

    The Quaternary Geologic Map of the Lake Nipigon 4 degree x 6 degree Quadrangle was mapped as part of the Quaternary Geologic Atlas of the United States. The atlas was begun as an effort to depict the areal distribution of surficial geologic deposits and other materials that accumulated or formed during the past 2+ million years, the period that includes all activities of the human species. These materials are at the surface of the earth. They make up the 'ground' on which we walk, the 'dirt' in which we dig foundations, and the 'soil' in which we grow crops. Most of our human activity is related in one way or another to these surface materials that are referred to collectively by many geologists as regolith, the mantle of fragmental and generally unconsolidated material that overlies the bedrock foundation of the continent. The maps were compiled at 1:1,000,000 scale. This map is a product of collaboration of the Ontario Geological Survey, the University of Michigan, and the U.S. Geological Survey, and is designed for both scientific and practical purposes. It was prepared in two stages. First, separate maps and map explanations were prepared by the compilers. Second, the maps were combined, integrated, and supplemented by the editor. Map unit symbols were revised to a uniform system of classification and the map unit descriptions were prepared by the editor from information received from the compilers and from additional sources listed under Sources of Information. Diagrams accompanying the map were prepared by the editor. For scientific purposes, the map differentiates Quaternary surficial deposits on the basis of lithology or composition, texture or particle size, structure, genesis, stratigraphic relationships, engineering geologic properties, and relative age, as shown on the correlation diagram and indicated in the map unit descriptions. Deposits of some constructional landforms, such as kame moraine deposits, are distinguished as map units. Deposits of

  20. Geologic map of the Zarkashan-Anguri copper and gold deposits, Ghazni Province, Afghanistan, modified from the 1968 original map compilation of E.P. Meshcheryakov and V.P. Sayapin

    Science.gov (United States)

    Peters, Stephen G.; Stettner, Will R.; Masonic, Linda M.; Moran, Thomas W.

    2011-01-01

    This map is a modified version of Geological map of the area of Zarkashan-Anguri gold deposits, scale 1:50,000, which was compiled by E.P. Meshcheryakov and V.P. Sayapin in 1968. Scientists from the U.S. Geological Survey, in cooperation with the Afghan Geological Survey and the Task Force for Business and Stability Operations of the U.S. Department of Defense, studied the original document and related reports and also visited the field area in April 2010. This modified map, which includes a cross section, illustrates the geologic setting of the Zarkashan-Anguri copper and gold deposits. The map reproduces the topology (contacts, faults, and so forth) of the original Soviet map and cross section and includes modifications based on our examination of that and other documents, and based on observations made and sampling undertaken during our field visit. (Refer to the Introduction and the References in the Map PDF for an explanation of our methodology and for complete citations of the original map and related reports.) Elevations on the cross section are derived from the original Soviet topography and may not match the newer topography used on the current map.

  1. Improved method for drawing of a glycan map, and the first page of glycan atlas, which is a compilation of glycan maps for a whole organism.

    Directory of Open Access Journals (Sweden)

    Shunji Natsuka

    Glycan Atlas is a set of glycan maps over the whole body of an organism. A glycan map, which includes data on glycan structure and quantity, displays the micro-heterogeneity of the glycans in a tissue, an organ, or cells. Two-dimensional glycan mapping is widely used for structure analysis of N-linked oligosaccharides on glycoproteins. In this study we developed a comprehensive method for the mapping of both N- and O-glycans, with and without sialic acid. Mapping data for 150 standard pyridylaminated (PA-) glycans were collected. The empirical additivity rule proposed in earlier reports could be adapted to this extended glycan map. The adapted rule is that the elution time of a PA-glycan on high-performance liquid chromatography (HPLC) is expected to be the simple sum of the partial elution times assigned to each monosaccharide residue. The comprehensive mapping method developed in this study is a powerful tool for describing the micro-heterogeneity of glycans. Furthermore, we prepared 42 PA-glycans from human serum and were able to draw the map of human serum N- and O-glycans as an initial step of Glycan Atlas editing.
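The additivity rule in this record can be expressed directly in code: predicted elution time is just a sum over the glycan's monosaccharide composition. The per-residue partial elution times below are hypothetical placeholders, not the calibrated values from the 150-glycan map:

```python
# Hypothetical partial elution times per residue (arbitrary units);
# real values would come from the study's 150 standard PA-glycans.
UNIT_TIME = {"Man": 1.8, "GlcNAc": 2.4, "Gal": 2.1, "Fuc": 1.2, "NeuAc": 3.0}

def predicted_elution(composition):
    """Additivity rule: the HPLC elution time of a PA-glycan is the simple
    sum of the partial elution times of its monosaccharide residues."""
    return sum(UNIT_TIME[residue] * count for residue, count in composition.items())

# Illustrative composition of an asialo, core-fucosylated biantennary N-glycan.
glycan = {"Man": 3, "GlcNAc": 4, "Gal": 2, "Fuc": 1}
print(round(predicted_elution(glycan), 2))  # 20.4 with the placeholder values
```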

  2. Geologic map of Kundelan ore deposits and prospects, Zabul Province, Afghanistan; modified from the 1971 original map compilations of K.I. Litvinenko and others

    Science.gov (United States)

    Tucker, Robert D.; Peters, Stephen G.; Stettner, Will R.; Masonic, Linda M.; Moran, Thomas W.

    2015-10-26

    This map and cross sections are redrafted modified versions of the Geological map of the Kundelan ore deposit area, scale 1:10,000 (graphical supplement no. 18) and the Geological map of the Kundelan deposits, scale 1:2,000 (graphical supplement no. 3) both contained in an unpublished Soviet report by Litvinenko and others (1971) (report no. 0540). The unpublished Soviet report was prepared in cooperation with the Ministry of Mines and Industries of the Royal Government of Afghanistan in Kabul during 1971. This redrafted map and cross sections illustrate the geology of the main Kundelan copper-gold skarn deposit, located within the Kundelan copper and gold area of interest (AOI), Zabul Province, Afghanistan. Areas of interest (AOIs) of non-fuel mineral resources within Afghanistan were first described and defined by Peters and others (2007) and later by the work of Peters and others (2011a). The location of the main Kundelan copper-gold skarn deposit (area of this map) and the Kundelan copper and gold AOI is shown on the index map provided on this map sheet.

  3. Quaternary geologic map of the Austin 4° x 6° quadrangle, United States

    Science.gov (United States)

    State compilations by Moore, David W.; Wermund, E.G.; edited and integrated by Moore, David W.; Richmond, Gerald Martin; Christiansen, Ann Coe; Bush, Charles A.

    1993-01-01

    This map is part of the Quaternary Geologic Atlas of the United States (I-1420). It was first published as a printed edition in 1993. The geologic data have now been captured digitally and are presented here along with images of the printed map sheet and component parts as PDF files. The Quaternary Geologic Map of the Austin 4° x 6° Quadrangle was mapped as part of the Quaternary Geologic Atlas of the United States. The atlas was begun as an effort to depict the areal distribution of surficial geologic deposits and other materials that accumulated or formed during the past 2+ million years, the period that includes all activities of the human species. These materials are at the surface of the Earth. They make up the ground on which we walk, the dirt in which we dig foundations, and the soil in which we grow crops. Most of our human activity is related in one way or another to these surface materials that are referred to collectively by many geologists as regolith, the mantle of fragmental and generally unconsolidated material that overlies the bedrock foundation of the continent. The maps were compiled at 1:1,000,000 scale. In recent years, surficial deposits and materials have become the focus of much interest by scientists, environmentalists, governmental agencies, and the general public. They are the foundations of ecosystems, the materials that support plant growth and animal habitat, and the materials through which travels much of the water required for our agriculture, our industry, and our general well being. They also are materials that easily can become contaminated by pesticides, fertilizers, and toxic wastes. In this context, the value of the surficial geologic map is evident.

  4. Environmental aspects of engineering geological mapping in the United States

    Science.gov (United States)

    Radbruch-Hall, Dorothy H.

    1979-01-01

    Many engineering geological maps at different scales have been prepared for various engineering and environmental purposes in regions of diverse geological conditions in the United States. They include maps of individual geological hazards and maps showing the effect of land development on the environment. An approach to assessing the environmental impact of land development that is used increasingly in the United States is the study of a single area by scientists from several disciplines, including geology. A study of this type has been made for the National Petroleum Reserve in northern Alaska. In the San Francisco Bay area, a technique has been worked out for evaluating the cost of different types of construction and land development in terms of the costs associated with a number of earth-science factors. © 1979 International Association of Engineering Geology.

  5. Geomorphic Unit Tool (GUT): Applications of Fluvial Mapping

    Science.gov (United States)

    Kramer, N.; Bangen, S. G.; Wheaton, J. M.; Bouwes, N.; Wall, E.; Saunders, C.; Bennett, S.; Fortney, S.

    2017-12-01

    Geomorphic units are the building blocks of rivers and represent distinct habitat patches for many fluvial organisms. We present the Geomorphic Unit Toolkit (GUT), a flexible GIS geomorphic unit mapping tool, to generate maps of fluvial landforms from topography. GUT applies attributes to landforms based on flow stage (Tier 1), topographic signatures (Tier 2), geomorphic characteristics (Tier 3) and patch characteristics (Tier 4) to derive attributed maps at the level of detail required by analysts. We hypothesize that if more rigorous and consistent geomorphic mapping is conducted, better correlations between physical habitat units and ecohydraulic model results will be obtained compared to past work. Using output from GUT for coarse bed tributary streams in the Columbia River Basin, we explore relationships between salmonid habitat and geomorphic spatial metrics. We also highlight case studies of how GUT can be used to showcase geomorphic impact from large wood restoration efforts. Provided high resolution topography exists, this tool can be used to quickly assess changes in fluvial geomorphology in watersheds impacted by human activities.
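GUT's four attribute tiers map naturally onto a simple record type that analysts can query at whatever level of detail they need. The attribute names and vocabularies below are illustrative assumptions, not GUT's actual schema:

```python
from dataclasses import dataclass

@dataclass
class GeomorphicUnit:
    """A mapped landform carrying GUT-style tiered attributes.
    Attribute vocabularies here are illustrative, not GUT's actual ones."""
    flow_stage: str       # Tier 1, e.g. "in-channel" vs "out-of-channel"
    topo_signature: str   # Tier 2, e.g. "concavity", "convexity", "planar"
    geomorphic_type: str  # Tier 3, e.g. "pool", "bar", "riffle"
    patch_class: str      # Tier 4, e.g. a size or roughness class

units = [
    GeomorphicUnit("in-channel", "concavity", "pool", "large"),
    GeomorphicUnit("in-channel", "convexity", "bar", "small"),
    GeomorphicUnit("out-of-channel", "planar", "floodplain", "large"),
]

# Query habitat-relevant units at the tier of detail the analysis requires,
# e.g. all in-channel concave units (candidate pools):
pools = [u for u in units
         if u.flow_stage == "in-channel" and u.topo_signature == "concavity"]
print(len(pools))  # 1
```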

  6. Compiling and Mapping Global Permeability of the Unconsolidated and Consolidated Earth: GLobal HYdrogeology MaPS 2.0 (GLHYMPS 2.0)

    Science.gov (United States)

    Huscroft, Jordan; Gleeson, Tom; Hartmann, Jens; Börker, Janine

    2018-02-01

    The spatial distribution of subsurface parameters such as permeability is increasingly relevant for regional to global climate, land surface, and hydrologic models that are integrating groundwater dynamics and interactions. Despite the large fraction of unconsolidated sediments on Earth's surface with a wide range of permeability values, previous global high-resolution permeability maps distinguished solely fine-grained and coarse-grained unconsolidated sediments. Representative permeability values are derived for a wide variety of unconsolidated sediments and applied to a new global map of unconsolidated sediments to produce the first geologically constrained, two-layer global map of shallower and deeper permeability. The new mean logarithmic permeability of the Earth's surface is −12.7 ± 1.7 (log10 k, with k in m²), one order of magnitude higher than that derived from previous maps, which is consistent with the dominance of the coarser sediments. The new data set will benefit a variety of scientific applications, including the next generation of climate, land surface, and hydrology models at regional to global scales.
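As a back-of-envelope use of the reported mean, the log-permeability can be converted to an intrinsic permeability and, under assumed water properties, to a hydraulic conductivity via the standard relation K = k·ρ·g/μ (the fluid-property values below are textbook assumptions, not from the study):

```python
# Convert the reported mean log10 permeability to intrinsic permeability
# and to hydraulic conductivity of water via K = k * rho * g / mu.
log10_k = -12.7            # mean log10 permeability reported above (k in m^2)
k = 10.0 ** log10_k        # intrinsic permeability, m^2 (~2.0e-13)

RHO = 1000.0               # water density, kg/m^3 (assumed)
G = 9.81                   # gravitational acceleration, m/s^2
MU = 1.0e-3                # dynamic viscosity of water at ~20 deg C, Pa*s (assumed)

K = k * RHO * G / MU       # hydraulic conductivity, m/s (~2e-6 m/s)
```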

  7. MAPPING A BASIC HEALTH UNIT: AN EXPERIENCE REPORT

    Directory of Open Access Journals (Sweden)

    Bárbara Carvalho Malheiros

    2015-01-01

    Background and Objectives: This study is an experience report on the construction of a map of a Basic Health Unit (BHU). The objective was to understand the relevance and importance of mapping a BHU, to acquire more knowledge on the health-disease status of the registered population, and to identify the importance of cartography as a working tool. Case description: After reading selected texts, evaluating information systems, and making on-site visits, it was possible to identify the health status of the population of the neighborhoods. The proposed objectives were considered achieved: the mapping of the assessed population's health-disease situation provided a closer-to-reality viewpoint, identifying the number of individuals, the diseases, the living situation, and the health care available. Conclusion: The mapping approach is a powerful working tool because it allows the planning of strategic interventions and enables the development of assistance activities aimed at promoting health and preventing disease. KEYWORDS: Mapping; Basic Health Unit; Health Planning.

  8. Elaboration of a framework for the compilation of countrywide, digital maps for the satisfaction of recent demands on spatial, soil related information in Hungary

    Science.gov (United States)

    Pásztor, László; Dobos, Endre; Szabó, József; Bakacsi, Zsófia; Laborczi, Annamária

    2013-04-01

    imperfection as for the accuracy and reliability of the delivered products. Since, as in most of the world, large-scale comprehensive new surveys cannot be expected in the near future, the available legacy data must be relied on. With a recently started project we intend to significantly extend how countrywide soil information requirements can be satisfied. In the frame of the project we plan spatial and thematic data mining of a significant amount of soil-related information available in the form of legacy soil data as well as digital databases and spatial soil information systems. In the course of the analyses we will draw on auxiliary spatial data themes related to environmental elements. Based on the established relationships we will convert and integrate the specific data sets for the regionalization of the various derived soil parameters. With GIS and geostatistical tools we will carry out the spatial extension of selected pedological variables characterizing the state (including degradation), processes, or functions of soils. We plan to compile digital soil maps that optimally fulfil national and international demands in terms of thematic, spatial, and temporal accuracy. The targeted spatial resolution of the proposed countrywide digital thematic soil property and function maps is at least 1:50,000 (approximately 50-100 meter raster). A key objective is the regionalization of the information collected in two recent national, systematic soil data collections (not designed for mapping purposes) on the recent state of soils, in order to produce countrywide maps of selected soil properties, processes, and functions with sufficient accuracy and reliability.

  9. Map Database for Surficial Materials in the Conterminous United States

    Science.gov (United States)

    Soller, David R.; Reheis, Marith C.; Garrity, Christopher P.; Van Sistine, D. R.

    2009-01-01

    The Earth's bedrock is overlain in many places by a loosely compacted and mostly unconsolidated blanket of sediments in which soils commonly are developed. These sediments generally were eroded from underlying rock and then transported and deposited. In places, they exceed 1,000 ft (about 305 m) in thickness. Where the sediment blanket is absent, bedrock is either exposed or has been weathered to produce a residual soil. For the conterminous United States, a map by Soller and Reheis (2004, scale 1:5,000,000; http://pubs.usgs.gov/of/2003/of03-275/) shows these sediments and the weathered, residual material; for ease of discussion, these are referred to as 'surficial materials'. That map was produced as a PDF file from an Adobe Illustrator-formatted version of the provisional GIS database. The provisional GIS files were further processed, without modifying the content of the published map, and are published here.

  10. An updated stress map of the continental United States reveals heterogeneous intraplate stress

    Science.gov (United States)

    Levandowski, Will; Herrmann, Robert B.; Briggs, Rich; Boyd, Oliver; Gold, Ryan

    2018-06-01

    Knowledge of the state of stress in Earth's crust is key to understanding the forces and processes responsible for earthquakes. Historically, low rates of natural seismicity in the central and eastern United States have complicated efforts to understand intraplate stress, but recent improvements in seismic networks and the spread of human-induced seismicity have greatly improved data coverage. Here, we compile a nationwide stress map based on formal inversions of focal mechanisms that challenges the idea that deformation in continental interiors is driven primarily by broad, uniform stress fields derived from distant plate boundaries. Despite plate-boundary compression, extension dominates roughly half of the continent, and second-order forces related to lithospheric structure appear to control extension directions. We also show that the states of stress in several active eastern United States seismic zones differ significantly from those of surrounding areas and that these anomalies cannot be explained by transient processes, suggesting that earthquakes are focused by persistent, locally derived sources of stress. Such spatially variable intraplate stress appears to justify the current, spatially variable estimates of seismic hazard. Future work to quantify sources of stress, stressing-rate magnitudes and their relationship with strain and earthquake rates could allow prospective mapping of intraplate hazard.

  11. Compilation of results 1987

    International Nuclear Information System (INIS)

    1987-01-01

    This compilation presents, in concentrated form, reports on research and development within the nuclear energy field covering a period of two and a half years; the previous report was issued in December 1984. The projects are presented with title, project number, responsible unit, contact person, and short result reports. The result reports consist of short summaries of each project. (L.F.)

  12. C to VHDL compiler

    Science.gov (United States)

    Berdychowski, Piotr P.; Zabolotny, Wojciech M.

    2010-09-01

    The main goal of the C to VHDL compiler project is to make the FPGA platform more accessible to scientists and software developers. The FPGA platform offers the unique ability to configure the hardware to implement virtually any dedicated architecture, and modern devices provide a sufficient number of hardware resources to implement parallel execution platforms with complex processing units. All this makes the FPGA platform very attractive for those looking for an efficient heterogeneous computing environment. The current industry standard in the development of digital systems on the FPGA platform is based on HDLs. Although very effective and expressive in the hands of hardware development specialists, these languages require specific knowledge and experience that are out of reach for most scientists and software programmers. The C to VHDL compiler project attempts to remedy that by creating an application that derives an initial VHDL description of a digital system (for further compilation and synthesis) from a purely algorithmic description in the C programming language. The idea itself is not new, and the C to VHDL compiler combines the best approaches from existing solutions developed over many previous years with the introduction of some new unique improvements.

  13. The Holdridge life zones of the conterminous United States in relation to ecosystem mapping

    Science.gov (United States)

    A.E. Lugo; S. L. Brown; R. Dodson; T. S Smith; H. H. Shugart

    1999-01-01

Aim: Our main goals were to develop a map of the life zones for the conterminous United States, based on the Holdridge Life Zone system, as a tool for ecosystem mapping, and to compare the map of Holdridge life zones with other global vegetation classification and mapping efforts. Location: The area of interest is the forty-eight contiguous states of the United States....

  14. Data compilation task report for the source investigation of the 300-FF-1 operable unit phase 1 remedial investigation

    International Nuclear Information System (INIS)

    Young, J.S.; Fruland, R.M.; Fruchter, J.S.

    1990-02-01

    This report provides additional information on facility and waste characteristics for the 300-FF-1 operable unit. The additional information gathered and reported includes meetings and on-site visits with current and past personnel having knowledge of operations in the operable unit, a more precise determination of the location of the Process Sewer lines and Retired Radioactive Liquid Waste Sewer, a better understanding of the phosphoric acid spill at the 340 Complex, and a search for engineering plans and environmental reports related to the operable unit. As a result of this data-gathering effort, recommendations for further investigation include characterization of the 307 Trenches to determine the origin of an underlying uranium plume in the groundwater, more extensive sampling of near-surface and dike sediments in the North and South Process Ponds to better define the extent of horizontal contamination, and detection of possible leaks in the abandoned Radioactive Waste Sewer by either electromagnetic induction or remote television camera inspection techniques. 16 refs., 4 figs., 5 tabs

  15. Using historical aerial photography and softcopy photogrammetry for waste unit mapping in L Lake

    International Nuclear Information System (INIS)

    Christel, L.M.

    1997-10-01

L Lake was developed as a cooling water reservoir for the L Reactor at the Savannah River Site. The construction of the lake, which began in the fall of 1984, altered the structure and function of Steel Creek. Completed in the fall of 1985, L Lake has a capacity of 31 million cubic meters and a normal pool of 58 meters. When L Reactor operations ceased in 1988, the water level in the lake still had to be maintained. Site managers are currently trying to determine the feasibility of draining or drawing down the lake in order to save tax dollars. To understand the full repercussions of such an undertaking, it was necessary to compile a comprehensive inventory of what the lake bottom looked like prior to filling. Aerial photographs, acquired nine days before the filling of the lake began, were scanned and used for softcopy photogrammetry processing. A one-meter digital elevation model was generated, and a digital orthophoto mosaic was created as the base map for the project. Seven categories of features, including the large waste units used to contain the contaminated soil removed from the dam site, were screen digitized and used to generate accurate maps. Other map features include vegetation waste piles, where contaminated vegetation from the flood plain was contained, and ash piles, which are sites where vegetation debris was burned and then covered with clean soil. For all seven categories, the area of disturbance totaled just over 63 hectares. When the screen digitizing was completed, the elevation at the centroid of each disturbance was determined. When the information is used in the Savannah River Site Geographical Information System, it can be used to visualize the various L Lake draw-down scenarios suggested by site managers and, hopefully, to support evaluations of the cost-effectiveness of each proposed activity.

  16. Digital Bedrock Compilation: A Geodatabase Covering Forest Service Lands in California

    Science.gov (United States)

    Elder, D.; de La Fuente, J. A.; Reichert, M.

    2010-12-01

This digital database contains bedrock geologic mapping for Forest Service lands within California. This compilation began in 2004 and the first version was completed in 2005. The second release of this geodatabase was completed in 2010 and filled major gaps in the southern Sierra Nevada and Modoc/Medicine Lake/Warner Mountains areas. This digital map database was compiled from previously published and unpublished geologic mapping, with source mapping and review from the California Geological Survey, the U.S. Geological Survey, and others. Much of the source data was itself compilation mapping. This geodatabase is large, containing ~107,000 polygons and ~280,000 arcs. Mapping was compiled from more than one thousand individual sources and covers over 41,000,000 acres (~166,000 km2). It was compiled from source maps at various scales, from ~1:4,000 to 1:250,000, and represents the best available geologic mapping at the largest scale possible. An estimated 70-80% of the source information was digitized from geologic mapping at 1:62,500 scale or better. The Forest Service ACT2 Enterprise Team compiled the bedrock mapping and developed a geodatabase to store this information. This geodatabase supports feature classes for polygons (e.g., map units), lines (e.g., contacts, boundaries, faults and structural lines) and points (e.g., orientation data, structural symbology). Lookup tables provide detailed information for feature class items. Lookup/type tables contain legal values and hierarchical groupings for geologic ages and lithologies. Type tables link coded values with descriptions for line and point attributes, such as line type, line location and point type. This digital mapping is at the core of many quantitative analyses and derivative map products. Queries of the database are used to produce maps and to quantify rock types of interest. These include the following: (1) ultramafic rocks - where hazards from naturally occurring asbestos are high, (2) granitic rocks - increased
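A query of the kind the abstract describes, quantifying a rock type of interest such as ultramafic bodies, can be sketched as a simple attribute filter over map-unit polygons. The field names (`UNIT_LITH`, `AREA_KM2`) and the sample records below are hypothetical stand-ins for the geodatabase's actual schema, which the abstract does not spell out.

```python
# Hypothetical polygon attribute records; real queries would run
# against the geodatabase's own feature-class tables.
polygons = [
    {"unit": "um1", "UNIT_LITH": "ultramafic", "AREA_KM2": 12.5},
    {"unit": "gr1", "UNIT_LITH": "granitic",   "AREA_KM2": 40.0},
    {"unit": "um2", "UNIT_LITH": "ultramafic", "AREA_KM2": 7.5},
]

def area_by_lithology(polys, lith):
    """Total mapped area (km2) of polygons matching a lithology."""
    return sum(p["AREA_KM2"] for p in polys if p["UNIT_LITH"] == lith)

ultramafic_km2 = area_by_lithology(polygons, "ultramafic")
```

The same filter-and-sum pattern underlies derivative map products: select polygons by a lookup-table attribute, then symbolize or tally them.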

  17. Geologic map of outcrop areas of sedimentary units in the eastern part of the Hailey 1 degree x 2 degrees quadrangle and part of the southern part of the Challis 1 degree x 2 degrees quadrangle, south-central Idaho

    Science.gov (United States)

    Link, P.K.; Mahoney, J.B.; Bruner, D.J.; Batatian, L.D.; Wilson, Eric; Williams, F.J.C.

    1995-01-01

The paper version of the Geologic map of outcrop areas of sedimentary units in the eastern part of the Hailey 1x2 Quadrangle and part of the southern part of the Challis 1x2 Quadrangle, south-central Idaho, was compiled by Paul Link and others in 1995. The plate was compiled on a 1:100,000-scale topographic base map. TechniGraphic System, Inc., of Fort Collins, Colorado, digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a GIS database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.

  18. A mitotically inheritable unit containing a MAP kinase module.

    Science.gov (United States)

    Kicka, Sébastien; Bonnet, Crystel; Sobering, Andrew K; Ganesan, Latha P; Silar, Philippe

    2006-09-05

Prions are novel kinds of hereditary units, relying solely on proteins, that are infectious and inherited in a non-Mendelian fashion. To date, they are based either on autocatalytic modification of a 3D conformation or on autocatalytic cleavage. Here, we provide further evidence that in the filamentous fungus Podospora anserina, a MAP kinase cascade is probably able to self-activate and generate C, a hereditary unit that bears many similarities to prions and triggers cell degeneration. We show that in addition to the MAPKKK gene, both the MAPKK and MAPK genes are necessary for the propagation of C, and that overexpression of the MAPK, like that of the MAPKKK, facilitates the appearance of C. We also show that a correlation exists between the presence of C and localization of the MAPK inside nuclei. These data emphasize the resemblance between prions and a self-positively regulated cascade in terms of their transmission. This further expands the concept of protein-based inheritance to regulatory networks that have the ability to self-activate.

  19. Construction experiences from underground works at Forsmark. Compilation Report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders [Vattenfall Power Consultant AB, Stockholm (Sweden); Christiansson, Rolf [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)

    2007-02-15

The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily concerning rock support solutions. The authors of this report separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open-cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material, as presented in the list of references. But it stands to reason that, during the course of the work on this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  20. Construction experiences from underground works at Forsmark. Compilation Report

    International Nuclear Information System (INIS)

    Carlsson, Anders; Christiansson, Rolf

    2007-02-01

The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily concerning rock support solutions. The authors of this report separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open-cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material, as presented in the list of references. But it stands to reason that, during the course of the work on this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  1. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software by bypassing the fetch-decode-execute operations of traditional processors and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation and first results of the Python-based compiler.
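As an illustration of the kind of input such an HLS flow typically targets, here is a small algorithmic function written in a restricted, integer-only style with fixed loop bounds. The specific Python subset this compiler accepts is not stated in the abstract, so treat the constraints shown in the comments as assumptions about typical HLS front ends, not as this compiler's documented rules.

```python
def fir4(samples, coeffs):
    """4-tap FIR filter in a restricted style an HLS front end might
    accept (assumed subset: integer arithmetic, fixed loop bounds)."""
    out = []
    for i in range(3, len(samples)):
        acc = 0
        for k in range(4):  # fixed bound -> fully unrollable in hardware
            acc += coeffs[k] * samples[i - k]
        out.append(acc)
    return out

y = fir4([1, 2, 3, 4, 5], [1, 1, 1, 1])
```

The fixed inner loop is the kind of construct an HLS compiler can unroll into four parallel multiply-accumulate units, which is where the parallelism advantage over a fetch-decode-execute processor comes from.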

  2. Risk maps for targeting exotic plant pest detection programs in the United States

    Science.gov (United States)

    R.D. Magarey; D.M. Borchert; J.S. Engle; M Garcia-Colunga; Frank H. Koch; et al

    2011-01-01

    In the United States, pest risk maps are used by the Cooperative Agricultural Pest Survey for spatial and temporal targeting of exotic plant pest detection programs. Methods are described to create standardized host distribution, climate and pathway risk maps for the top nationally ranked exotic pest targets. Two examples are provided to illustrate the risk mapping...

  3. Reflections on the Value of Mapping the Final Theory Examination in a Molecular Biochemistry Unit

    OpenAIRE

    Eri, Rajaraman; Cook, Anthony; Brown, Natalie

    2014-01-01

    This article assesses the impact of examination mapping as a tool to enhancing assessment and teaching quality in a second-year biochemistry unit for undergraduates. Examination mapping is a process where all questions in a written examination paper are assessed for links to the unit’s intended learning outcomes. We describe how mapping a final written examination helped visualise the impact of the assessment task on intended learning outcomes and skills for that biochemistry unit. The method...

  4. Documentation of methods and inventory of irrigation data collected for the 2000 and 2005 U.S. Geological Survey Estimated use of water in the United States, comparison of USGS-compiled irrigation data to other sources, and recommendations for future compilations

    Science.gov (United States)

    Dickens, Jade M.; Forbes, Brandon T.; Cobean, Dylan S.; Tadayon, Saeid

    2011-01-01

    Every five years since 1950, the U.S. Geological Survey (USGS) National Water Use Information Program (NWUIP) has compiled water-use information in the United States and published a circular report titled "Estimated use of water in the United States," which includes estimates of water withdrawals by State, sources of water withdrawals (groundwater or surface water), and water-use category (irrigation, public supply, industrial, thermoelectric, and so forth). This report discusses the impact of important considerations when estimating irrigated acreage and irrigation withdrawals, including estimates of conveyance loss, irrigation-system efficiencies, pasture, horticulture, golf courses, and double cropping.

  5. 2014 Update of the United States National Seismic Hazard Maps

    Science.gov (United States)

    Petersen, M.D.; Mueller, C.S.; Haller, K.M.; Moschetti, M.; Harmsen, S.C.; Field, E.H.; Rukstales, K.S.; Zeng, Y.; Perkins, D.M.; Powers, P.; Rezaeian, S.; Luco, N.; Olsen, A.; Williams, R.

    2012-01-01

The U.S. National Seismic Hazard Maps are revised every six years, corresponding with the update cycle of the International Building Code. These maps cover the conterminous U.S. and will be updated in 2014 using the best available science obtained from colleagues at regional and topical workshops convened in 2012-2013. Maps for Alaska and Hawaii will be updated shortly after this update. Alternative seismic hazard models discussed at the workshops will be implemented in a logic-tree framework and used to develop the seismic hazard maps and associated products. In this paper we describe the plan to update the hazard maps, the issues raised in workshops up to March 2012, and topics to be discussed at future workshops. An advisory panel will guide the development of the hazard maps and ensure that the maps are acceptable to a broad segment of the science and engineering communities. These updated maps will then be considered by end users for inclusion in building codes, risk models, and public policy documents.
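The logic-tree framework mentioned in the abstract can be illustrated with a toy weighted combination: each branch carries an alternative hazard model's estimate and a weight expressing its credibility, and the branch-weighted mean gives the combined hazard. The weights and exceedance probabilities below are invented for illustration and are not values from the actual hazard model.

```python
# Illustrative logic-tree branches: (weight, annual exceedance
# probability at some ground-motion level). Values are made up.
branches = [
    (0.5, 0.010),  # alternative model A
    (0.3, 0.014),  # alternative model B
    (0.2, 0.006),  # alternative model C
]

# Branch weights must sum to 1 for a valid logic tree.
assert abs(sum(w for w, _ in branches) - 1.0) < 1e-12

# Weighted mean hazard across the alternative models.
hazard = sum(w * p for w, p in branches)
```

In practice the tree has many levels (source models, recurrence rates, ground-motion models), but each level combines alternatives in exactly this weighted fashion.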

  6. Engineering a compiler

    CERN Document Server

    Cooper, Keith D

    2012-01-01

As computing has changed, so has the role of both the compiler and the compiler writer. The proliferation of processors, environments, and constraints demands an equally large number of compilers. To adapt, compiler writers retarget code generators, add optimizations, and work on issues such as code space or power consumption. Engineering a Compiler re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in today's practice. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best-practice algorithms for the major problems inside a compiler.
· Focuses on the back end of the compiler, reflecting the focus of research and development over the last decade.
· Applies the well-developed theory behind scanning and parsing to introduce concepts that play a critical role in optimization and code generation.
· Introduces the student to optimization through data-flow analysis, SSA form, and a selection of sc...

  7. Compilation of watershed models for tributaries to the Great Lakes, United States, as of 2010, and identification of watersheds for future modeling for the Great Lakes Restoration Initiative

    Science.gov (United States)

    Coon, William F.; Murphy, Elizabeth A.; Soong, David T.; Sharpe, Jennifer B.

    2011-01-01

    As part of the Great Lakes Restoration Initiative (GLRI) during 2009–10, the U.S. Geological Survey (USGS) compiled a list of existing watershed models that had been created for tributaries within the United States that drain to the Great Lakes. Established Federal programs that are overseen by the National Oceanic and Atmospheric Administration (NOAA) and the U.S. Army Corps of Engineers (USACE) are responsible for most of the existing watershed models for specific tributaries. The NOAA Great Lakes Environmental Research Laboratory (GLERL) uses the Large Basin Runoff Model to provide data for the management of water levels in the Great Lakes by estimating United States and Canadian inflows to the Great Lakes from 121 large watersheds. GLERL also simulates streamflows in 34 U.S. watersheds by a grid-based model, the Distributed Large Basin Runoff Model. The NOAA National Weather Service uses the Sacramento Soil Moisture Accounting model to predict flows at river forecast sites. The USACE created or funded the creation of models for at least 30 tributaries to the Great Lakes to better understand sediment erosion, transport, and aggradation processes that affect Federal navigation channels and harbors. Many of the USACE hydrologic models have been coupled with hydrodynamic and sediment-transport models that simulate the processes in the stream and harbor near the mouth of the modeled tributary. Some models either have been applied or have the capability of being applied across the entire Great Lakes Basin; they are (1) the SPAtially Referenced Regressions On Watershed attributes (SPARROW) model, which was developed by the USGS; (2) the High Impact Targeting (HIT) and Digital Watershed models, which were developed by the Institute of Water Research at Michigan State University; (3) the Long-Term Hydrologic Impact Assessment (L–THIA) model, which was developed by researchers at Purdue University; and (4) the Water Erosion Prediction Project (WEPP) model, which was

  8. SPARQL compiler for Bobox

    OpenAIRE

    Čermák, Miroslav

    2013-01-01

The goal of the work is to design and implement a SPARQL compiler for the Bobox system. In addition to lexical and syntactic analysis corresponding to the W3C standard for the SPARQL language, it performs semantic analysis and optimization of queries. The compiler constructs an appropriate model for execution in Bobox, which depends on the physical database schema.

  9. Keeping it wild: mapping wilderness character in the United States.

    Science.gov (United States)

    Carver, Steve; Tricker, James; Landres, Peter

    2013-12-15

    A GIS-based approach is developed to identify the state of wilderness character in US wilderness areas using Death Valley National Park (DEVA) as a case study. A set of indicators and measures are identified by DEVA staff and used as the basis for developing a flexible and broadly applicable framework to map wilderness character using data inputs selected by park staff. Spatial data and GIS methods are used to map the condition of four qualities of wilderness character: natural, untrammelled, undeveloped, and solitude or primitive and unconfined recreation. These four qualities are derived from the US 1964 Wilderness Act and later developed by Landres et al. (2008a) in "Keeping it Wild: An Interagency Strategy to Monitor Trends in Wilderness Character Across the National Wilderness Preservation System." Data inputs are weighted to reflect their importance in relation to other data inputs and the model is used to generate maps of each of the four qualities of wilderness character. The combined map delineates the range of quality of wilderness character in the DEVA wilderness revealing the majority of wilderness character to be optimal quality with the best areas in the northern section of the park. This map will serve as a baseline for monitoring change in wilderness character and for evaluating the spatial impacts of planning alternatives for wilderness and backcountry stewardship plans. The approach developed could be applied to any wilderness area, either in the USA or elsewhere in the world. Copyright © 2013 Elsevier Ltd. All rights reserved.
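The weighted combination of data inputs described in the abstract can be sketched as a simple weighted overlay: each quality of wilderness character is a raster of normalized scores, and the combined map is the weight-averaged sum cell by cell. The grids and weights below are invented for illustration and are not DEVA data.

```python
# Each "layer" is a tiny grid of normalized quality scores in [0, 1];
# real inputs would be rasters derived from spatial data.
natural      = [[0.9, 0.7], [0.8, 0.4]]
untrammelled = [[1.0, 0.6], [0.9, 0.5]]
undeveloped  = [[0.8, 0.8], [0.7, 0.3]]
solitude     = [[0.9, 0.5], [0.6, 0.2]]

layers  = [natural, untrammelled, undeveloped, solitude]
weights = [0.3, 0.3, 0.2, 0.2]  # illustrative; in the paper, chosen by park staff

rows, cols = len(natural), len(natural[0])
character = [[sum(w * layer[r][c] for w, layer in zip(weights, layers))
              for c in range(cols)] for r in range(rows)]
```

Because the weights sum to 1 and every input is normalized, the combined score stays in [0, 1], which makes cells directly comparable when delineating the range of wilderness character quality.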

  10. Quaternary Geologic Map of the Regina 4 Degrees x 6 Degrees Quadrangle, United States and Canada

    Science.gov (United States)

    Fullerton, David S.; Christiansen, Earl A.; Schreiner, Bryan T.; Colton, Roger B.; Clayton, Lee; Bush, Charles A.; Fullerton, David S.

    2007-01-01

    For scientific purposes, the map differentiates Quaternary surficial deposits and materials on the basis of clast lithology or composition, matrix texture or particle size, structure, genesis, stratigraphic relations, engineering geologic properties, and relative age, as shown on the correlation diagram and indicated in the 'Description of Map Units'. Deposits of some constructional landforms, such as end moraines, are distinguished as map units. Deposits of erosional landforms, such as outwash terraces, are not distinguished, although glaciofluvial, ice-contact, fluvial, and lacustrine deposits that are mapped may be terraced. Differentiation of sequences of fluvial and glaciofluvial deposits at this scale is not possible. For practical purposes, the map is a surficial materials map. Materials are distinguished on the basis of lithology or composition, texture or particle size, and other physical, chemical, and engineering characteristics. It is not a map of soils that are recognized and classified in pedology or agronomy. Rather, it is a generalized map of soils as recognized in engineering geology, or of substrata or parent materials in which pedologic or agronomic soils are formed. As a materials map, it serves as a base from which a variety of maps for use in planning engineering, land-use planning, or land-management projects can be derived and from which a variety of maps relating to earth surface processes and Quaternary geologic history can be derived.

  11. Radiation field mapping in mammography units with TLDs

    Energy Technology Data Exchange (ETDEWEB)

    Castro, J.C.O.; Silva, J.O., E-mail: jonas.silva@ufg.br [Universidade Federal de Goiás (IFG), Goiânia (Brazil). Instituto de Física; Veneziani, G.R. [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo-SP (Brazil). Centro de Metrologia das Radiações

    2017-07-01

Mammography is the most common imaging technique for breast cancer detection and follow-up. For dosimetry, it is important to know the variation in field intensity. In this work, TLD-100 dosimeters were used to map the radiation field of a mammography system at a hospital in Goiânia/GO. The maximum radiation intensity was found 8 cm from the chest wall. The results obtained could be used to optimize the dosimetry of the equipment used in this work. (author)

  12. 12 CFR 411.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Semi-annual compilation. 411.600 Section 411.600 Banks and Banking EXPORT-IMPORT BANK OF THE UNITED STATES NEW RESTRICTIONS ON LOBBYING Agency Reports § 411.600 Semi-annual compilation. (a) The head of each agency shall collect and compile the...

  13. A Compilation of Boiling Water Reactor Operational Experience for the United Kingdom's Office for Nuclear Regulation's Advanced Boiling Water Reactor Generic Design Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, Timothy A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Liao, Huafei [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

United States nuclear power plant Licensee Event Reports (LERs), submitted to the United States Nuclear Regulatory Commission (NRC) as required by 10 CFR 50.72 and 50.73, were evaluated for relevance to the United Kingdom’s Health and Safety Executive – Office for Nuclear Regulation’s (ONR) generic design assessment of the Advanced Boiling Water Reactor (ABWR) design. An NRC compendium of LERs, compiled by Idaho National Laboratory covering the period January 1, 2000 through March 31, 2014, was sorted by BWR safety system into two categories: events leading to a SCRAM, and events that constituted a safety system failure. The LERs were then evaluated for the relevance of the operational experience to the ABWR design.

  14. A Method for Mapping Future Urbanization in the United States

    Directory of Open Access Journals (Sweden)

    Lahouari Bounoua

    2018-04-01

    Full Text Available Cities are poised to absorb additional people. Their sustainability, or ability to accommodate a population increase without depleting resources or compromising future growth, depends on whether they harness the efficiency gains from urban land management. Population is often projected as a bulk national number without details about spatial distribution. We use Landsat and population data in a methodology to project and map U.S. urbanization for the year 2020 and document its spatial pattern. This methodology is important to spatially disaggregate projected population and assist land managers to monitor land use, assess infrastructure and distribute resources. We found the U.S. west coast urban areas to have the fastest population growth with relatively small land consumption resulting in future decrease in per capita land use. Except for Miami (FL, most other U.S. large urban areas, especially in the Midwest, are growing spatially faster than their population and inadvertently consuming land needed for ecosystem services. In large cities, such as New York, Chicago, Houston and Miami, land development is expected more in suburban zones than urban cores. In contrast, in Los Angeles land development within the city core is greater than in its suburbs.

  15. Quaternary Geologic Map of the Lake Superior 4° x 6° Quadrangle, United States and Canada

    Data.gov (United States)

    Department of the Interior — The Quaternary Geologic Map of the Lake Superior 4° x 6° Quadrangle was mapped as part of the Quaternary Geologic Atlas of the United States. The atlas was begun as...

  16. Quaternary allostratigraphy of surficial deposit map units at Yucca Mountain, Nevada: A progress report

    International Nuclear Information System (INIS)

    Lundstrom, S.C.; Wesling, J.R.; Swan, F.H.; Taylor, E.M.; Whitney, J.W.

    1993-01-01

    Surficial geologic mapping at Yucca Mountain, Nevada, is relevant to site characterization studies of paleoclimate, tectonics, erosion, flood hazards, and water infiltration. Alluvial, colluvial, and eolian allostratigraphic map units are defined on the basis of age-related surface characteristics and soil development, as well as lithology and sedimentology indicative of provenance and depositional mode. In gravelly alluvial units, which include interbedded debris flows, the authors observe a useful qualitative correlation between surface and soil properties. Map units of estimated middle Pleistocene age typically have a well-developed, varnished desert pavement, and minimal erosional and preserved depositional microrelief, associated with a soil with a reddened Bt horizon and stage 3 carbonate and silica morphology. Older units have greater erosional relief, an eroded argillic horizon and stage 4 carbonate morphology, whereas younger units have greater preservation of depositional morphology, but lack well-developed pavements, rock varnish, and Bt and Kqm soil horizons. Trench and gully-wall exposures show that alluvial, colluvial and eolian dominated surface units are underlain by multiple buried soils separating sedimentologically similar deposits; this stratigraphy increases the potential for understanding the long-term Quaternary paleoenvironmental history of Yucca Mountain. Age estimates for allostratigraphic units, presently based on uranium-trend dating and regional correlation using soil development, will be further constrained by ongoing dating studies that include tephra identification, uranium-series disequilibrium, and thermoluminescence methods

  17. Mapping variation in radon potential both between and within geological units

    International Nuclear Information System (INIS)

    Miles, J C H; Appleton, J D

    2005-01-01

Previously, the potential for high radon levels in UK houses has been mapped either on the basis of grouping the results of radon measurements in houses by grid squares or by geological units. In both cases, lognormal modelling of the distribution of radon concentrations was applied to allow the estimated proportion of houses above the UK radon Action Level (AL, 200 Bq m⁻³) to be mapped. This paper describes a method of combining the grid square and geological mapping methods to give more accurate maps than either method can provide separately. The land area is first divided up using a combination of bedrock and superficial geological characteristics derived from digital geological map data. Each different combination of geological characteristics may appear at the land surface in many discontinuous locations across the country. HPA has a database of over 430 000 houses in which long-term measurements of radon concentration have been made, and whose locations are accurately known. Each of these measurements is allocated to the appropriate bedrock-superficial geological combination underlying it. Taking each geological combination in turn, the spatial variation of radon potential is mapped, treating the combination as if it were continuous over the land area. All of the maps of radon potential within different geological combinations are then combined to produce a map of variation in radon potential over the whole land surface.
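The lognormal modelling step in the abstract can be sketched as follows: if indoor radon in a mapping unit is lognormally distributed with geometric mean GM and geometric standard deviation GSD, the estimated proportion of houses above the Action Level is the lognormal survival function evaluated at 200 Bq m⁻³. The GM and GSD values below are illustrative, not parameters from the paper.

```python
import math

def fraction_above(action_level, gm, gsd):
    """Fraction of houses exceeding `action_level` (Bq/m3) under a
    lognormal model with geometric mean `gm` and geometric standard
    deviation `gsd` (both in Bq/m3; gsd is dimensionless > 1)."""
    z = (math.log(action_level) - math.log(gm)) / math.log(gsd)
    # Standard normal survival function via the complementary error function.
    return 0.5 * math.erfc(z / math.sqrt(2))

# Illustrative parameters only (not from the paper): GM = 40, GSD = 2.5.
p = fraction_above(200.0, 40.0, 2.5)  # roughly a few percent above the AL
```

Mapping then amounts to evaluating this fraction for the GM and GSD fitted to the measurements within each bedrock-superficial combination (or grid square) and shading the map accordingly.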

  18. Computer Program Development Specification for Ada Integrated Environment. Ada Compiler Phases B5-AIE (1). COMP (1).

    Science.gov (United States)

    1982-11-05

    [Scanned-document text; largely illegible.] Legible fragments refer to compiler-generated code structure (an exception-handler map address in the unit header and the entry point to a unit's code body) and to implementation limits, including a limit of 200 subdomains accessible at once, which bounds the number of units that may be WITH'd.

  19. Development of a new USDA plant hardiness zone map for the United States

    Science.gov (United States)

    C. Daly; M.P. Widrlechner; M.D. Halbleib; J.I. Smith; W.P. Gibson

    2012-01-01

    In many regions of the world, the extremes of winter cold are a major determinant of the geographic distribution of perennial plant species and of their successful cultivation. In the United States, the U.S. Department of Agriculture (USDA) Plant Hardiness Zone Map (PHZM) is the primary reference for defining geospatial patterns of extreme winter cold for the...

  20. Using the Large Fire Simulator System to map wildland fire potential for the conterminous United States

    Science.gov (United States)

    LaWen Hollingsworth; James Menakis

    2010-01-01

    This project mapped wildland fire potential (WFP) for the conterminous United States by using the large fire simulation system developed for Fire Program Analysis (FPA) System. The large fire simulation system, referred to here as LFSim, consists of modules for weather generation, fire occurrence, fire suppression, and fire growth modeling. Weather was generated with...

  1. Design of methodology for incremental compiler construction

    Directory of Open Access Journals (Sweden)

    Pavel Haluza

    2011-01-01

    The paper deals with possibilities of incremental compiler construction, for languages with a fixed set of lexical units and for languages with a variable set of lexical units. The methodology for incremental compiler construction is based on known algorithms for standard compiler construction and is derived for both groups of languages. The group of languages with a fixed set of lexical units comprises languages in which each lexical unit has a constant meaning, e.g., common programming languages. For this group, the paper addresses the problem of incremental semantic analysis, which is based on incremental parsing. In the group of languages with a variable set of lexical units (e.g., the professional typographic system TeX), the meaning of each character in the input can be changed arbitrarily at any time during processing. The change takes effect immediately, and its validity is either limited in some way or extends to the end of the input. For this group, the paper addresses the case in which macros temporarily change the category of arbitrary characters.
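The variable-lexical-unit problem can be illustrated with a minimal sketch (hypothetical, not the paper's algorithm): a lexer that consults a mutable character-category table, in the spirit of TeX catcodes. Reassigning a category changes the token stream for identical input, which is what makes incremental re-analysis difficult for this group of languages.

```python
# Hypothetical sketch (not the paper's algorithm): a lexer whose character
# categories live in a mutable table, as with TeX catcodes.
ESCAPE, LETTER, OTHER = "escape", "letter", "other"

def lex(text, categories):
    """Tokenize text; `categories` maps characters to category names."""
    tokens, i = [], 0
    while i < len(text):
        ch = text[i]
        cat = categories.get(ch, LETTER if ch.isalpha() else OTHER)
        if cat == ESCAPE:
            # An escape character starts a control word: consume letters.
            j = i + 1
            while j < len(text) and text[j].isalpha():
                j += 1
            tokens.append(("control", text[i + 1:j]))
            i = j
        else:
            tokens.append((cat, ch))
            i += 1
    return tokens

# The same input tokenizes differently once a category is reassigned,
# which is why incremental re-analysis is hard for such languages.
with_escape = lex("\\section", {"\\": ESCAPE})  # [("control", "section")]
without = lex("\\section", {})                  # backslash is just "other"
```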

  2. A geomorphic approach to 100-year floodplain mapping for the Conterminous United States

    Science.gov (United States)

    Jafarzadegan, Keighobad; Merwade, Venkatesh; Saksena, Siddharth

    2018-06-01

    Floodplain mapping using hydrodynamic models is difficult in data-scarce regions. Additionally, using hydrodynamic models to map floodplains over large stream networks can be computationally challenging. Some of these limitations can be overcome by developing computationally efficient statistical methods to identify floodplains in large and ungauged watersheds using publicly available data. This paper proposes a geomorphic model to generate probabilistic 100-year floodplain maps for the Conterminous United States (CONUS). The proposed model first categorizes the watersheds in the CONUS into three classes based on the height of the water surface corresponding to the 100-year flood above the streambed. Next, the probability that any watershed in the CONUS belongs to one of these three classes is computed through supervised classification using watershed characteristics related to topography, hydrography, land use, and climate. The result of this classification is then fed into a probabilistic threshold binary classifier (PTBC) to generate the probabilistic 100-year floodplain maps. The supervised classification algorithm is trained using the 100-year Flood Insurance Rate Maps (FIRMs) from the U.S. Federal Emergency Management Agency (FEMA). FEMA FIRMs are also used to validate the performance of the proposed model in areas not included in the training. Additionally, HEC-RAS-generated flood inundation extents are used to validate the model performance at fifteen sites that lack FEMA maps. Validation results show that the probabilistic 100-year floodplain maps generated by the proposed model match well with both the FEMA and HEC-RAS maps. On average, the error of predicted flood extents is around 14% across the CONUS. The high accuracy of the validation results shows the reliability of the geomorphic model as an alternative approach for fast and cost-effective delineation of 100-year floodplains for the CONUS.
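The PTBC step can be illustrated with a small sketch. The class names and threshold values below are invented for illustration; the idea is that a cell's floodplain probability combines the watershed's class-membership probabilities with a per-class binary test on the cell's height above the nearest stream.

```python
# Hypothetical sketch of the PTBC idea; class names and thresholds are
# invented for illustration, not taken from the paper.
CLASS_THRESHOLDS_M = {"low": 3.0, "medium": 8.0, "high": 15.0}

def floodplain_probability(height_above_stream_m, class_probs):
    """Probability a cell is in the 100-year floodplain: the sum of the
    watershed's class-membership probabilities for every class whose
    height threshold the cell falls under."""
    return sum(
        p for cls, p in class_probs.items()
        if height_above_stream_m <= CLASS_THRESHOLDS_M[cls]
    )

# A watershed judged mostly "medium": a cell 5 m above the stream passes
# the medium and high thresholds but not the low one.
class_probs = {"low": 0.2, "medium": 0.7, "high": 0.1}
p_cell = floodplain_probability(5.0, class_probs)  # 0.8
```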

  3. Nuclear power plant operational data compilation system

    International Nuclear Information System (INIS)

    Silberberg, S.

    1980-01-01

    Electricite de France R and D Division has set up a nuclear power plant operational data compilation system. This data bank, created on the basis of American documents, provides results on plant operation and on operational material behaviour. At present, French units in commercial operation are taken into account. Results obtained after five years of data bank operation are given. (author)

  4. Compilation, quality control, analysis, and summary of discrete suspended-sediment and ancillary data in the United States, 1901-2010

    Science.gov (United States)

    Lee, Casey J.; Glysson, G. Douglas

    2013-01-01

    Human-induced and natural changes to the transport of sediment and sediment-associated constituents can degrade aquatic ecosystems and limit human uses of streams and rivers. The lack of a dedicated, easily accessible, quality-controlled database of sediment and ancillary data has made it difficult to identify sediment-related water-quality impairments and has limited understanding of how human actions affect suspended-sediment concentrations and transport. The purpose of this report is to describe the creation of a quality-controlled U.S. Geological Survey suspended-sediment database, provide guidance for its use, and summarize characteristics of suspended-sediment data through 2010. The database is provided as an online application at http://cida.usgs.gov/sediment to allow users to view, filter, and retrieve available suspended-sediment and ancillary data. A data recovery, filtration, and quality-control process was performed to expand the availability, representativeness, and utility of existing suspended-sediment data collected by the U.S. Geological Survey in the United States before January 1, 2011. Information on streamflow condition, sediment grain size, and upstream landscape condition was matched to sediment data and sediment-sampling sites to place data in context with factors that may influence sediment transport. Suspended-sediment and selected ancillary data are presented from across the United States with respect to time, streamflow, and landscape condition. Examples of potential uses of this database for identifying sediment-related impairments, assessing trends, and designing new data collection activities are provided. This report and database can support local and national-level decision making, project planning, and data mining activities related to the transport of suspended sediment and sediment-associated constituents.

  5. A multicriteria framework for producing local, regional, and national insect and disease risk maps

    Science.gov (United States)

    Frank J. Jr. Krist; Frank J. Sapio

    2010-01-01

    The construction of the 2006 National Insect and Disease Risk Map, compiled by the USDA Forest Service, State and Private Forestry Area, Forest Health Protection Unit, resulted in the development of a GIS-based, multicriteria approach for insect and disease risk mapping that can account for regional variations in forest health concerns and threats. This risk mapping...

  6. Geochemical landscapes of the conterminous United States; new map presentations for 22 elements

    Science.gov (United States)

    Gustavsson, N.; Bolviken, B.; Smith, D.B.; Severson, R.C.

    2001-01-01

    Geochemical maps of the conterminous United States have been prepared for seven major elements (Al, Ca, Fe, K, Mg, Na, and Ti) and 15 trace elements (As, Ba, Cr, Cu, Hg, Li, Mn, Ni, Pb, Se, Sr, V, Y, Zn, and Zr). The maps are based on an ultra low-density geochemical survey consisting of 1,323 samples of soils and other surficial materials collected from approximately 1960-1975. The data were published by Boerngen and Shacklette (1981) and black-and-white point-symbol geochemical maps were published by Shacklette and Boerngen (1984). The data have been reprocessed using weighted-median and Bootstrap procedures for interpolation and smoothing.

  7. Advanced compiler design and implementation

    CERN Document Server

    Muchnick, Steven S

    1997-01-01

    From the Foreword by Susan L. Graham: This book takes on the challenges of contemporary languages and architectures, and prepares the reader for the new compiling problems that will inevitably arise in the future. The definitive book on advanced compiler design This comprehensive, up-to-date work examines advanced issues in the design and implementation of compilers for modern processors. Written for professionals and graduate students, the book guides readers in designing and implementing efficient structures for highly optimizing compilers for real-world languages. Covering advanced issues in fundamental areas of compiler design, this book discusses a wide array of possible code optimizations, determining the relative importance of optimizations, and selecting the most effective methods of implementation. * Lays the foundation for understanding the major issues of advanced compiler design * Treats optimization in-depth * Uses four case studies of commercial compiling suites to illustrate different approaches...

  8. Non-Markovianity Measure Based on Brukner-Zeilinger Invariant Information for Unital Quantum Dynamical Maps

    Science.gov (United States)

    He, Zhi; Zhu, Lie-Qiang; Li, Li

    2017-03-01

    A non-Markovianity measure based on Brukner-Zeilinger invariant information to characterize the non-Markovian effects of open systems undergoing unital dynamical maps is proposed. The method takes advantage of the non-increasing property of the Brukner-Zeilinger invariant information under completely positive and trace-preserving unital maps. The proposed measure is simple to compute because the Brukner-Zeilinger invariant information depends mainly on the purity of the quantum state. The measure effectively captures the characteristics of non-Markovianity of unital dynamical maps. As concrete applications, we consider two typical non-Markovian noise channels, i.e., the phase damping channel and the random unitary channel, to show the sensitivity of the proposed measure. We find that the conditions for detecting non-Markovianity in the phase damping channel are consistent with the results of existing measures of non-Markovianity, i.e., information flow, divisibility, and quantum mutual information. For the random unitary channel, however, the non-Markovianity conditions are the same as those of the information flow but differ from those of the divisibility and quantum mutual information. Supported by the National Natural Science Foundation of China under Grant No. 61505053, the Natural Science Foundation of Hunan Province under Grant No. 2015JJ3092, the Research Foundation of Education Bureau of Hunan Province, China under Grant No. 16B177, and the School Foundation of the Hunan University of Arts and Science under Grant No. 14ZD01.
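The purity-based idea can be illustrated with a short sketch (not the paper's code; the channel below is the textbook single-qubit phase-damping form): under a Markovian, unital phase-damping channel, the purity of an initially coherent qubit state decreases monotonically with the damping parameter, so a purity-based witness flags no non-Markovianity in this case.

```python
import numpy as np

# Textbook phase-damping (unital) channel on one qubit; illustrative only.
def phase_damping(rho, lam):
    k0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - lam)]], dtype=complex)
    k1 = np.array([[0.0, 0.0], [0.0, np.sqrt(lam)]], dtype=complex)
    return k0 @ rho @ k0.conj().T + k1 @ rho @ k1.conj().T

def purity(rho):
    """Tr(rho^2); for a qubit, the Brukner-Zeilinger invariant information
    is a simple function of this quantity."""
    return float(np.real(np.trace(rho @ rho)))

# For the coherent state |+><+|, purity falls monotonically as the damping
# parameter grows, so the purity-based witness reports no revival here.
rho_plus = 0.5 * np.ones((2, 2), dtype=complex)
purities = [purity(phase_damping(rho_plus, lam)) for lam in (0.0, 0.2, 0.4, 0.6)]
```

A non-Markovian channel would instead show a temporary increase of this purity over time, which is what the proposed measure accumulates.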

  9. Non-Markovianity Measure Based on Brukner–Zeilinger Invariant Information for Unital Quantum Dynamical Maps

    International Nuclear Information System (INIS)

    He Zhi; Zhu Lie-Qiang; Li Li

    2017-01-01

    A non-Markovianity measure based on Brukner–Zeilinger invariant information to characterize the non-Markovian effects of open systems undergoing unital dynamical maps is proposed. The method takes advantage of the non-increasing property of the Brukner–Zeilinger invariant information under completely positive and trace-preserving unital maps. The proposed measure is simple to compute because the Brukner–Zeilinger invariant information depends mainly on the purity of the quantum state. The measure effectively captures the characteristics of non-Markovianity of unital dynamical maps. As concrete applications, we consider two typical non-Markovian noise channels, i.e., the phase damping channel and the random unitary channel, to show the sensitivity of the proposed measure. We find that the conditions for detecting non-Markovianity in the phase damping channel are consistent with the results of existing measures of non-Markovianity, i.e., information flow, divisibility, and quantum mutual information. For the random unitary channel, however, the non-Markovianity conditions are the same as those of the information flow but differ from those of the divisibility and quantum mutual information. (paper)

  10. Documentation for the 2014 update of the United States national seismic hazard maps

    Science.gov (United States)

    Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter M.; Mueller, Charles S.; Haller, Kathleen M.; Frankel, Arthur D.; Zeng, Yuehua; Rezaeian, Sanaz; Harmsen, Stephen C.; Boyd, Oliver S.; Field, Edward; Chen, Rui; Rukstales, Kenneth S.; Luco, Nico; Wheeler, Russell L.; Williams, Robert A.; Olsen, Anna H.

    2014-01-01

    The national seismic hazard maps for the conterminous United States have been updated to account for new methods, models, and data that have been obtained since the 2008 maps were released (Petersen and others, 2008). The input models are improved from those implemented in 2008 by using new ground motion models that have incorporated about twice as many earthquake strong ground shaking data and by incorporating many additional scientific studies that indicate broader ranges of earthquake source and ground motion models. These time-independent maps are shown for 2-percent and 10-percent probability of exceedance in 50 years for peak horizontal ground acceleration as well as 5-hertz and 1-hertz spectral accelerations with 5-percent damping on a uniform firm rock site condition (760 meters per second shear wave velocity in the upper 30 m, VS30). In this report, the 2014 updated maps are compared with the 2008 version of the maps and indicate changes of plus or minus 20 percent over wide areas, with larger changes locally, caused by the modifications to the seismic source and ground motion inputs.
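The exceedance probabilities shown on the maps relate to mean return periods through a Poisson occurrence assumption, T_R = -T / ln(1 - p). A minimal sketch of the conversion (the formula is standard; it is not taken from this report):

```python
import math

# Converting a map's probability of exceedance over a time horizon into a
# mean return period, assuming Poissonian earthquake occurrence.
def return_period_years(p_exceed, horizon_years=50.0):
    return -horizon_years / math.log(1.0 - p_exceed)

rp_2pct = return_period_years(0.02)   # about 2,475 years
rp_10pct = return_period_years(0.10)  # about 475 years
```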

  11. Compiling quantum circuits to realistic hardware architectures using temporal planners

    Science.gov (United States)

    Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy

    2018-04-01

    To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest-neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits, whose high number of commuting gates allows great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations, but it also means there is a greater potential win from more optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem and generate a test suite of compilation problems for QAOA circuits of various sizes targeting a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.
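The mapping to temporal planning can be illustrated with a hypothetical PDDL-style fragment (invented for illustration, not the paper's encoding): a two-qubit gate becomes a durative action that requires its logical qubits to be mapped to adjacent physical qubits for its whole duration, SWAP actions move the mapping around, and the planner minimizes makespan.

```
;; Hypothetical PDDL-style sketch (not the paper's encoding): a two-qubit
;; gate can be scheduled only while its logical qubits sit on adjacent
;; hardware qubits; the gate duration (3 time units here) is invented.
(:durative-action apply-cz
  :parameters (?a ?b - logical ?p ?q - physical)
  :duration (= ?duration 3)
  :condition (and (over all (mapped ?a ?p))
                  (over all (mapped ?b ?q))
                  (over all (adjacent ?p ?q)))
  :effect (at end (done-cz ?a ?b)))
```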

  12. Looking for an old map

    Science.gov (United States)

    ,

    1996-01-01

    Many people want maps that show an area of the United States as it existed many years ago. These are called historical maps, and there are two types. The most common type consists of special maps prepared by commercial firms to show such historical features as battle-fields, military routes, or the paths taken by famous travelers. Typically, these maps are for sale to tourists at the sites of historical events. The other type is the truly old map--one compiled by a surveyor or cartographer many years ago. Lewis and Clark, for example, made maps of their journeys into the Northwest Territories in 1803-6, and originals of some of these maps still exist.

  13. Floodplain Mapping for the Continental United States Using Machine Learning Techniques and Watershed Characteristics

    Science.gov (United States)

    Jafarzadegan, K.; Merwade, V.; Saksena, S.

    2017-12-01

    Using conventional hydrodynamic methods for floodplain mapping in large-scale and data-scarce regions is problematic due to the high cost of these methods, the lack of reliable data, and uncertainty propagation. In this study a new framework is proposed to generate 100-year floodplains for any gauged or ungauged watershed across the United States (U.S.). This framework uses Flood Insurance Rate Maps (FIRMs) and topographic, climatic, and land use data, all freely available for the entire U.S. The framework consists of three components: a Random Forest classifier for watershed classification, a Probabilistic Threshold Binary Classifier (PTBC) for generating the floodplains, and a lookup table for linking the Random Forest classifier to the PTBC. The effectiveness and reliability of the proposed framework is tested on 145 watersheds from various geographical locations in the U.S. The validation results show that around 80 percent of the watersheds are predicted well, 14 percent have an acceptable fit, and less than five percent are predicted poorly compared to FIRMs. Another advantage of this framework is its ability to generate floodplains for all small rivers and tributaries. Due to its high accuracy and efficiency, this framework can be used as a preliminary decision-making tool to generate 100-year floodplain maps for data-scarce regions and for all tributaries where hydrodynamic methods are difficult to use.

  14. Mapping landscape units in Galicia (Spain: A first step for assessment and management?

    Directory of Open Access Journals (Sweden)

    Corbelle-Rico Eduardo

    2017-12-01

    At the beginning of 2015, the Regional Administration of Galicia (NW Spain) set the requirements for a map of landscape units: it had to be produced in less than 3 months, it should cover the whole territory of the region (29,574 km²), and it should be useful for management at a scale of 1:25,000. With these objectives in mind, we proposed a semiautomatic mapping methodology entirely based on the use of free software (GRASS GIS) and already available cartographic information. Semi-automatic classification of different land-use patterns was at the heart of the proposed process. Consultation with experts from different academic backgrounds took place throughout the project. This consultation process made it possible to identify both problems and opportunities. As could be expected, the diverse epistemic community represented by the expert panel meant that one of the main challenges was to reach consensus on the understanding of the concept of landscape and on the decisions leading to the mapping methodology proposed in this paper. This initiated a very interesting debate that, in our view, centred on three main issues: the approach to the landscape, the purpose of the mapping exercise, and the ability to include subjectivity in the analysis.

  15. Dose mapping in working space of KORI unit 1 using MCNPX code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C. W.; Shin, C. H.; Kim, J. G. [Hanyang University, Seoul (Korea, Republic of); Kim, S. Y. [Innovative Techonology Center for Radiation Safety, Seoul (Korea, Republic of)

    2004-07-01

    Radiation field analysis in nuclear power plants mainly depends on actual measurements. In this study, a computational analysis is performed to overcome the limits of measurement and to provide initial information for unfolding. Radiation field mapping is performed, which makes it possible to analyze the trends of the radiation field over the whole space. Using the MCNPX code, the containment building interior is modeled for KORI unit 1, cycle 21, under operation. Applying the neutron spectrum from the operating reactor as a radiation source, the ambient doses are calculated throughout the containment building interior for neutron and photon fields. Dose mapping is performed for three spaces: 6-20, 20-44, and 44-70 ft from the bottom of the containment building. The radiation distribution in the dose maps shows the effects of the structures and materials of the components. With these dose maps, the radiation field analysis covers the region near the detector positions. Analysis and prediction are possible for radiation fields from other radiation sources or operating cycles.

  16. Compiler Feedback using Continuous Dynamic Compilation during Development

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Probst, Christian W.

    2014-01-01

    to optimization. This tool can help programmers understand what the optimizing compiler has done and suggest automatic source code changes in cases where the compiler refrains from optimizing. We have integrated our tool into an integrated development environment, interactively giving feedback as part...

  17. Classification of hyperspectral imagery using MapReduce on a NVIDIA graphics processing unit (Conference Presentation)

    Science.gov (United States)

    Ramirez, Andres; Rahnemoonfar, Maryam

    2017-04-01

    A hyperspectral image provides a multidimensional data cube consisting of hundreds of spectral dimensions. Analyzing the spectral and spatial information of such an image with linear and non-linear algorithms results in high computational time. To overcome this problem, this research presents a system using a MapReduce-Graphics Processing Unit (GPU) model that helps analyze a hyperspectral image through the use of parallel hardware and a parallel programming model that is simpler to handle than other low-level parallel programming models. Additionally, Hadoop was used as an open-source implementation of the MapReduce parallel programming model. This research compared classification accuracy and timing results between the Hadoop and GPU system and the following test cases: a combined CPU and GPU test case, a CPU-only test case, and a test case where no dimensionality reduction was applied.
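The MapReduce decomposition can be sketched in plain Python (illustrative only, not the Hadoop/GPU implementation): mappers emit partial per-band statistics for chunks of pixels, and a reducer merges them into whole-image results.

```python
from functools import reduce

# Illustrative MapReduce-style decomposition (plain Python, not the
# Hadoop/GPU implementation): compute per-band means of a hyperspectral
# image whose pixels are distributed across chunks.
def mapper(chunk):
    """Emit per-band sums and the pixel count for one chunk."""
    n_bands = len(chunk[0])
    sums = [sum(pixel[b] for pixel in chunk) for b in range(n_bands)]
    return sums, len(chunk)

def reducer(part_a, part_b):
    """Merge two partial (sums, count) results."""
    sums_a, n_a = part_a
    sums_b, n_b = part_b
    return [x + y for x, y in zip(sums_a, sums_b)], n_a + n_b

chunks = [[(1.0, 10.0), (3.0, 30.0)], [(5.0, 50.0)]]  # two bands, 3 pixels
total_sums, total_n = reduce(reducer, map(mapper, chunks))
band_means = [s / total_n for s in total_sums]  # [3.0, 30.0]
```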

  18. Compilation of Thesis Abstracts

    Science.gov (United States)

    2005-09-01

    [Scanned text; only fragments are legible.] One recoverable thesis entry: "Towards a Strategic Approach to Special Events Management in the Post-9/11 World" (G.B. Jones, Chief, FBI Special Events Management Unit), which addresses counterterrorism and law enforcement planning for major special events and identifies some of the strategic issues that have emerged in special events management.

  19. Combining forest inventory, satellite remote sensing, and geospatial data for mapping forest attributes of the conterminous United States

    Science.gov (United States)

    Mark Nelson; Greg Liknes; Charles H. Perry

    2009-01-01

    Analysis and display of forest composition, structure, and pattern provides information for a variety of assessments and management decision support. The objective of this study was to produce geospatial datasets and maps of conterminous United States forest land ownership, forest site productivity, timberland, and reserved forest land. Satellite image-based maps of...

  20. Utilizing Multi-Sensor Fire Detections to Map Fires in the United States

    Science.gov (United States)

    Howard, S. M.; Picotte, J. J.; Coan, M. J.

    2014-11-01

    In 2006, the Monitoring Trends in Burn Severity (MTBS) project began as a cooperative effort between the US Forest Service (USFS) and the U.S. Geological Survey (USGS) to map and assess burn severity for all large fires that have occurred in the United States since 1984. Using Landsat imagery, MTBS is mandated to map wildfires and prescribed fires that meet specific size criteria: greater than 1,000 acres in the west and 500 acres in the east, regardless of ownership. Relying mostly on federal and state fire occurrence records, the project has mapped over 15,300 individual fires. While mapping recorded fires, an additional 2,700 "unknown" or undocumented fires were discovered and assessed. It has become apparent that there are perhaps thousands of undocumented fires in the US that have yet to be mapped. Fire occurrence records alone are inadequate if MTBS is to provide a comprehensive accounting of fire across the US. Additionally, the sheer number of fires to assess has overwhelmed current manual procedures. To address these problems, the National Aeronautics and Space Administration (NASA) Applied Sciences Program is helping to fund the efforts of the USGS and its MTBS partners (USFS, National Park Service) to develop and implement a system to automatically identify fires using satellite data. In near real time, the USGS will combine active-fire satellite detections from the MODIS, AVHRR, and GOES satellites with Landsat acquisitions. Newly acquired Landsat imagery will be routinely scanned to identify freshly burned area pixels, derive an initial perimeter, and tag the burned area with the satellite date and time of detection. Landsat imagery from the early archive will be scanned to identify undocumented fires. Additional automated fire assessment processes will be developed. The USGS will develop these processes using open-source software packages in order to provide freely available tools to local land managers, giving them the capability to assess fires at the local level.

  1. Compiling a 50-year journey

    DEFF Research Database (Denmark)

    Hutton, Graham; Bahr, Patrick

    2017-01-01

    Fifty years ago, John McCarthy and James Painter published the first paper on compiler verification, in which they showed how to formally prove the correctness of a compiler that translates arithmetic expressions into code for a register-based machine. In this article, we revisit this example...

  2. A Note on Compiling Fortran

    Energy Technology Data Exchange (ETDEWEB)

    Busby, L. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-01

    Fortran modules tend to serialize compilation of large Fortran projects, by introducing dependencies among the source files. If file A depends on file B, (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran and IBM xlf, at least) offer an option to ‘‘verify syntax’’, with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: The first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8× speedup in the case study below.
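The two-pass scheme can be sketched as a Makefile fragment (illustrative; file names are invented, and the flag behavior should be verified for your compiler version; recipe lines must begin with tabs):

```make
# Sketch of the two-pass build: pass 1 runs -fsyntax-only serially and
# emits the .mod files quickly; pass 2 then compiles objects in parallel.
FC   = gfortran
SRCS = b.f90 a.f90            # b defines a module that a uses
OBJS = $(SRCS:.f90=.o)

all: modules
	$(MAKE) -j objects

modules:                      # pass 1: serial, module files only
	$(FC) -fsyntax-only $(SRCS)

objects: $(OBJS)              # pass 2: parallel object compilation

%.o: %.f90
	$(FC) -c -O2 $<
```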

  3. MAPPING GLAUCONITE UNITS USING REMOTE SENSING TECHNIQUES IN NORTH EAST OF IRAN

    Directory of Open Access Journals (Sweden)

    R. Ahmadirouhani

    2014-10-01

    Glauconite is a greenish ferric-iron silicate mineral with a micaceous structure, characteristically formed in shallow marine environments. Glauconite has been used as a pigmentation agent in oil paint, as a contaminant remover in environmental studies, as a source of potassium in plant fertilizers, and in other industries. The Koppeh-dagh basin extends across Iran, Afghanistan, and Turkmenistan, and glauconite units exist in this basin. In this research, to enhance and map glauconitic units in the Koppeh-dagh structural zone in northeast Iran, remote sensing techniques such as Spectral Angle Mapper (SAM) classification, band ratios, and band composition methods were applied to SPOT, ASTER, and Landsat data in three steps.
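The core of SAM classification is the spectral angle between a pixel spectrum and a reference (endmember) spectrum: the arccosine of their normalized dot product, which is insensitive to overall illumination scaling. A minimal sketch, with invented reflectance values:

```python
import math

# Minimal SAM sketch: the spectral angle between a pixel spectrum and a
# reference (endmember) spectrum. Reflectance values are invented.
def spectral_angle(target, reference):
    """Angle in radians; insensitive to overall illumination scaling."""
    dot = sum(t * r for t, r in zip(target, reference))
    norm_t = math.sqrt(sum(t * t for t in target))
    norm_r = math.sqrt(sum(r * r for r in reference))
    # Clamp for numerical safety before acos.
    return math.acos(max(-1.0, min(1.0, dot / (norm_t * norm_r))))

glauconite_ref = [0.12, 0.18, 0.25, 0.22]             # invented endmember
bright_pixel = [0.24, 0.36, 0.50, 0.44]               # same shape, 2x brighter
angle = spectral_angle(bright_pixel, glauconite_ref)  # ~0: same class
```

A pixel is assigned to the endmember with the smallest angle, typically subject to a maximum-angle threshold.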

  4. Mapping the world: cartographic and geographic visualization by the United Nations Geospatial Information Section (formerly Cartographic Section)

    Science.gov (United States)

    Kagawa, Ayako; Le Sourd, Guillaume

    2018-05-01

    Within the United Nations Secretariat, mapping began in 1946, and by 1951 the need for maps had increased and an office with a team of cartographers was established. Since then, with the development of technologies including the internet, remote sensing, unmanned aerial systems, relational database management, and information systems, geospatial information provides an ever-increasing variety of support to the work of the Organization for the planning of operations, decision-making, and the monitoring of crises. However, the need for maps has remained intact. This presentation aims to highlight some of the cartographic representation styles used over the decades by reviewing the evolution of selected maps produced by the office, noting the changing cognitive and semiotic aspects of cartographic and geographic visualization required by the United Nations. The presentation and analysis of these maps reflect the changing dynamics of the Organization in information management and serve as a reminder of the continuing and expanding deconstructionist role of the cartographer, now a geospatial information management expert.

  5. Geologic map of the greater Denver area, Front Range urban corridor, Colorado

    Science.gov (United States)

    Trimble, Donald E.; Machette, Michael N.

    1979-01-01

    This digital map shows the areal extent of surficial deposits and rock stratigraphic units (formations) as compiled by Trimble and Machette from 1973 to 1977 and published in 1979 under the Front Range Urban Corridor Geology Program. Trimble and Machette compiled their geologic map from published geologic maps and unpublished geologic mapping having varied map unit schemes. A convenient feature of the compiled map is its uniform classification of geologic units that mostly matches those of companion maps to the north (USGS I-855-G) and to the south (USGS I-857-F). Published as a color paper map, the Trimble and Machette map was intended for land-use planning in the Front Range Urban Corridor. This map recently (1997-1999) was digitized under the USGS Front Range Infrastructure Resources Project. In general, the mountainous areas in the western part of the map exhibit various igneous and metamorphic bedrock units of Precambrian age, major faults, and fault brecciation zones at the east margin (5-20 km wide) of the Front Range. The eastern and central parts of the map (Colorado Piedmont) depict a mantle of unconsolidated deposits of Quaternary age and interspersed outcroppings of Cretaceous or Tertiary-Cretaceous sedimentary bedrock. The Quaternary mantle comprises eolian deposits (quartz sand and silt), alluvium (gravel, sand, and silt of variable composition), colluvium, and a few landslides. At the mountain front, north-trending, dipping Paleozoic and Mesozoic sandstone, shale, and limestone bedrock formations form hogbacks and intervening valleys.

  6. Geologic Map of the Derain (H-10) Quadrangle on Mercury: The Challenges of Consistently Mapping the Intercrater Plains Unit

    Science.gov (United States)

    Whitten, J. L.; Fassett, C. I.; Ostrach, L. R.

    2018-06-01

We present the initial mapping of the H-10 quadrangle on Mercury, a region that was imaged for the first time by MESSENGER. The geologic map will assist with further characterization of the intercrater plains and their possible formation mechanism(s).

  7. Algorithmic synthesis using Python compiler

    Science.gov (United States)

    Cieszewski, Radoslaw; Romaniuk, Ryszard; Pozniak, Krzysztof; Linczuk, Maciej

    2015-09-01

This paper presents a Python-to-VHDL compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and translates it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the programmed circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software by bypassing the fetch-decode-execute cycle of traditional processors and exploiting a greater level of parallelism, achieved by using many computational resources at the same time. Creating parallel programs for FPGAs in pure HDL is difficult and time consuming; by using a higher level of abstraction and a high-level synthesis compiler, implementation time can be reduced. The compiler has been implemented in the Python language. This article describes the design, implementation, and results of the created tools.
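To illustrate the idea, the sketch below shows the kind of algorithmic Python description such a flow might accept: a pure function on fixed-width integers with a fixed trip count, which a high-level synthesis back end could unroll into parallel combinational logic. The function and its name are illustrative assumptions, not taken from the paper.

```python
# Hypothetical input for a Python-to-VHDL flow: a pure function on an
# 8-bit value. The fixed trip count lets a synthesis tool fully unroll
# the loop into a tree of parallel adders.

def popcount8(x: int) -> int:
    """Count the set bits in an 8-bit value."""
    x &= 0xFF
    count = 0
    for i in range(8):          # fixed bounds -> fully unrollable in hardware
        count += (x >> i) & 1
    return count

print(popcount8(0b1011_0110))  # -> 5
```

Because the function is side-effect free and its loop bounds are static, each iteration can be mapped to independent hardware rather than sequential instructions.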

  8. A compiler for variational forms

    OpenAIRE

    Kirby, Robert C.; Logg, Anders

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in s...

  9. The General Urban Plan of Casimcea territorial administrative unit, map of natural and anthropogenic risks

    Directory of Open Access Journals (Sweden)

    Sorin BĂNICĂ

    2013-08-01

Full Text Available The General Urban Plan represents the legal ground for any proposed development action. After endorsement and approval as required by law, the GUP is an act of authority of the local government for the area in which it applies. Its aim is to establish priority regulations applied in land-use planning and in the construction of structures. In terms of geographical location, the administrative territory of Casimcea, Tulcea county, falls in the central part of the northwestern Casimcea Plateau, which is the second unit of the Central Dobrogea Plateau. Its geographical location in southeastern Romania, its climatic and relief conditions, and anthropogenic pressure expose the territorial administrative unit of Casimcea to permanent susceptibility to natural and anthropogenic risks. In this context, we identified the following categories of natural and anthropogenic risks: (i) natural risk phenomena (earthquakes, strong winds, heavy rains, floods caused by overflowing or precipitation, erosion of river banks and torrents, gravitational processes, raindrop erosion and surface soil erosion); and (ii) anthropogenic risk phenomena (overgrazing, chemical use in agriculture, road transport and electricity infrastructure, wind turbines for electricity production, waste deposits, agro-zootechnical complexes, and human cemeteries). Their extent was materialized by creating a map of natural and anthropogenic risks for the Casimcea territorial administrative unit, showing the share of potentially affected areas in the territorial balance.

  10. Data layer integration for the national map of the united states

    Science.gov (United States)

    Usery, E.L.; Finn, M.P.; Starbuck, M.

    2009-01-01

The integration of geographic data layers in multiple raster and vector formats, from many different organizations and at a variety of resolutions and scales, is a significant problem for The National Map of the United States being developed by the U.S. Geological Survey. Our research has examined data integration from a layer-based approach for five of The National Map data layers: digital orthoimages, elevation, land cover, hydrography, and transportation. An empirical approach has included visual assessment by a set of respondents with statistical analysis to establish the meaning of various types of integration. A separate theoretical approach with established hypotheses tested against actual data sets has resulted in an automated procedure for integration of specific layers and is being tested. The empirical analysis has established resolution bounds on meanings of integration with raster datasets and distance bounds for vector data. The theoretical approach has used a combination of theories on cartographic transformation and generalization, such as Töpfer's radical law, and additional research concerning optimum viewing scales for digital images to establish a set of guiding principles for integrating data of different resolutions.
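Töpfer's radical law mentioned above has a compact closed form; the sketch below is a minimal illustration (the function name and example scales are ours, not from the study):

```python
import math

def topfer_radical_law(n_source: int, scale_source: float, scale_target: float) -> int:
    """Töpfer's radical law: n_f = n_a * sqrt(M_a / M_f).

    Estimates how many of n_source features survive generalization from a
    source map at scale 1:scale_source to a target map at 1:scale_target.
    """
    return round(n_source * math.sqrt(scale_source / scale_target))

# e.g. 1000 features on a 1:24,000 map generalized to 1:100,000
print(topfer_radical_law(1000, 24_000, 100_000))  # -> 490
```

The square-root dependence formalizes the intuition that a map at a coarser scale can carry proportionally fewer features without becoming cluttered.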

  11. Very High Resolution Tree Cover Mapping for Continental United States using Deep Convolutional Neural Networks

    Science.gov (United States)

    Ganguly, Sangram; Kalia, Subodh; Li, Shuang; Michaelis, Andrew; Nemani, Ramakrishna R.; Saatchi, Sassan A

    2017-01-01

Uncertainties in input land cover estimates contribute to a significant bias in modeled above ground biomass (AGB) and carbon estimates from satellite-derived data. The resolution of most currently used passive remote sensing products is not sufficient to capture tree canopy cover of less than ca. 10-20 percent, limiting their utility to estimate canopy cover and AGB for trees outside of forest land. In our study, we created a first-of-its-kind Continental United States (CONUS) tree cover map at a spatial resolution of 1 m for the 2010-2012 epoch using USDA NAIP imagery to address the present uncertainties in AGB estimates. The process involves different tasks, from data acquisition/ingestion to pre-processing and running a state-of-the-art encoder-decoder based deep convolutional neural network (CNN) algorithm for automatically generating a tree/non-tree map for almost a quarter million scenes. The entire processing chain, including generation of the largest existing open source aerial/satellite image training database, was performed at the NEX supercomputing and storage facility. We believe the resulting forest cover product will substantially contribute to filling the gaps in ongoing carbon and ecological monitoring research and help quantify the errors and uncertainties in derived products.

  12. Very High Resolution Tree Cover Mapping for Continental United States using Deep Convolutional Neural Networks

    Science.gov (United States)

    Ganguly, S.; Kalia, S.; Li, S.; Michaelis, A.; Nemani, R. R.; Saatchi, S.

    2017-12-01

Uncertainties in input land cover estimates contribute to a significant bias in modeled above ground biomass (AGB) and carbon estimates from satellite-derived data. The resolution of most currently used passive remote sensing products is not sufficient to capture tree canopy cover of less than ca. 10-20 percent, limiting their utility to estimate canopy cover and AGB for trees outside of forest land. In our study, we created a first-of-its-kind Continental United States (CONUS) tree cover map at a spatial resolution of 1 m for the 2010-2012 epoch using USDA NAIP imagery to address the present uncertainties in AGB estimates. The process involves different tasks, from data acquisition/ingestion to pre-processing and running a state-of-the-art encoder-decoder based deep convolutional neural network (CNN) algorithm for automatically generating a tree/non-tree map for almost a quarter million scenes. The entire processing chain, including generation of the largest existing open source aerial/satellite image training database, was performed at the NEX supercomputing and storage facility. We believe the resulting forest cover product will substantially contribute to filling the gaps in ongoing carbon and ecological monitoring research and help quantify the errors and uncertainties in derived products.
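Once a tree/non-tree raster exists, canopy cover and area summaries follow directly from pixel counts; a toy sketch with invented values (the real product tiles cover NAIP scenes, not 4x4 grids):

```python
# Toy 4x4 tree/non-tree tile at 1-m resolution (1 = tree, 0 = non-tree).
tile = [
    [1, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]

pixel_area_m2 = 1.0 * 1.0                  # 1-m spatial resolution
n_pixels = sum(len(row) for row in tile)
n_tree = sum(sum(row) for row in tile)

tree_fraction = n_tree / n_pixels          # canopy cover fraction
tree_area_m2 = n_tree * pixel_area_m2      # tree-covered area

print(tree_fraction, tree_area_m2)  # -> 0.4375 7.0
```

At 1-m resolution such per-pixel counts resolve canopy cover well below the ca. 10-20 percent threshold that limits coarser passive products.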

  13. Transient electromagnetic mapping of clay units in the San Luis Valley, Colorado

    Science.gov (United States)

    Fitterman, David V.; Grauch, V.J.S.

    2010-01-01

Transient electromagnetic soundings were used to obtain information needed to refine hydrologic models of the San Luis Valley, Colorado. The soundings were able to map an aquitard called the blue clay that separates an unconfined surface aquifer from a deeper confined aquifer. The blue clay forms a conductor with an average resistivity of 6.9 ohm-m. Above the conductor is a mixture of gray clay and sand. The gray clay has an average resistivity of 21 ohm-m, while the sand has a resistivity of greater than 100 ohm-m. The large difference in resistivity of these units makes mapping them with a surface geophysical method relatively easy. The blue clay was deposited at the bottom of Lake Alamosa, which filled most of the San Luis Valley during the Pleistocene. The geometry of the blue clay is influenced by a graben on the eastern side of the valley; the depth to the blue clay is greater over the graben. Along the eastern edge of the valley, the blue clay appears to be truncated by faults.

  14. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  15. Quantitative analysis of terrain units mapped in the northern quarter of Venus from Venera 15/16 data

    Science.gov (United States)

    Schaber, G. G.

    1991-01-01

    The contacts between 34 geological/geomorphic terrain units in the northern quarter of Venus mapped from Venera 15/16 data were digitized and converted to a Sinusoidal Equal-Area projection. The result was then registered with a merged Pioneer Venus/Venera 15/16 altimetric database, root mean square (rms) slope values, and radar reflectivity values derived from Pioneer Venus. The resulting information includes comparisons among individual terrain units and terrain groups to which they are assigned in regard to percentage of map area covered, elevation, rms slopes, distribution of suspected craters greater than 10 km in diameter.
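The Sinusoidal Equal-Area projection used for the digitized contacts has a simple forward form; a minimal sketch (the Venus mean-radius value and central meridian are our assumptions for illustration):

```python
import math

def sinusoidal(lat_deg: float, lon_deg: float,
               radius: float = 6051.8, lon0_deg: float = 0.0):
    """Forward Sinusoidal (equal-area) projection.

    x = R * (lon - lon0) * cos(lat),  y = R * lat  (angles in radians).
    radius defaults to Venus' mean radius in km.
    """
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg - lon0_deg)
    return radius * lon * math.cos(lat), radius * lat

x, y = sinusoidal(60.0, 30.0)  # a point at 60 deg N, 30 deg E
print(round(x, 1), round(y, 1))
```

Because parallels keep their true length, pixel counts in this projection translate directly into the percentage-of-map-area comparisons reported for the terrain units.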

  16. Preliminary geologic map of the Lathrop Wells volcanic center

    International Nuclear Information System (INIS)

    Crowe, B.; Harrington, C.; McFadden, L.; Perry, F.; Wells, S.; Turrin, B.; Champion, D.

    1988-12-01

    A preliminary geologic map has been compiled for the bedrock geology of the Lathrop Wells volcanic center. The map was completed through use of a combination of stereo photographic interpretation and field mapping on color aerial photographs. These photographs (scale 1:4000) were obtained from American Aerial Surveys, Inc. They were flown on August 18, 1987, at the request of the Yucca Mountain Project (then Nevada Nuclear Waste Storage Investigations). The photographs are the Lathrop Wells VC-Area 25 series, numbers 1--32. The original negatives for these photographs are on file with American Aerial Surveys, Inc. Copies of the negatives have been archived at the Los Alamos National Laboratory, Group N-5. The preliminary geologic map is a bedrock geologic map. It does not show alluvial deposits, eolian sands, or scoria fall deposits from the youngest eruptive events. The units will be compiled on separate maps when the geomorphic and soils studies are more advanced

  17. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables.Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees.With the proliferation of open source, understanding these issues is increasingly the res

  18. Navigating Without Road Maps: The Early Business of Automobile Route Guide Publishing in the United States

    Science.gov (United States)

    Bauer, John T.

    2018-05-01

In the United States, automobile route guides were important precursors to the road maps that Americans are familiar with today. Listing turn-by-turn directions between cities, they helped drivers navigate unmarked, local roads. This paper examines the early business of route guide publishing through the Official Automobile Blue Book series of guides. It focuses specifically on the expansion, contraction, and eventual decline of the Blue Book publishing empire and also the work of professional "pathfinders" that formed the company's data-gathering infrastructure. Beginning in 1901 with only one volume, the series steadily grew until 1920, when thirteen volumes were required to record thousands of routes throughout the country. Bankruptcy and corporate restructuring in 1921 forced the publishers to condense the guide into a four-volume set in 1922. Competition from emerging sheet maps, along with the nationwide standardization of highway numbers, pushed a switch to an atlas format in 1926. Blue Books, however, could not remain competitive and disappeared after 1937. "Pathfinders" were employed by the publishers and equipped with reliable automobiles. Soon they developed a shorthand notation system for recording field notes and efficiently incorporating them into the development workflow. Although pathfinders did not call themselves cartographers, they were geographical data field collectors and considered their work to be an "art and a science," much the same as modern-day cartographers. The paper concludes with some comments about the place of route guides in the history of American commercial cartography and draws some parallels between "pathfinders" and the digital road mappers of today.

  19. Detailed mapping of surface units on Mars with HRSC color data

    Science.gov (United States)

    Combe, J.-Ph.; Wendt, L.; McCord, T. B.; Neukum, G.

    2008-09-01

Introduction: Making use of HRSC color data. Mapping outcrops of clays, sulfates and ferric oxides provides basic information for deriving the climatic, tectonic and volcanic evolution of Mars, especially the episodes related to the presence of liquid water. The challenge is to resolve the outcrops spatially and to distinguish these components from globally-driven deposits like the iron oxide-rich bright red dust and the basaltic dark sands. The High Resolution Stereo Camera (HRSC) onboard Mars Express has five color filters in the visible and near infrared that are designed for visual interpretation and the mapping of various surface units [1]. It also provides information on topography at sub-pixel scale (roughness) thanks to the different observation geometry of each color channel. The HRSC dataset is the only one that combines global coverage, a spatial resolution of 200 m/pixel or better, and color filtering of light. The present abstract is a work in progress (to be submitted to Planetary and Space Science) that shows the potential and limitations of HRSC color data as visual support and as multispectral images. Various methods are described, from the simplest to more complex ones, in order to demonstrate how to make use of the spectra, because of the specific processing steps they require [2-4]. The objective is to broaden the popularity of HRSC color data, as they could be used more widely by the scientific community. Results prove that imaging spectrometry and HRSC color data complement each other for mapping outcrop types. Example regions of interest. HRSC is theoretically sensitive to materials with absorption features in the visible and near-infrared up to 1 μm. Therefore, oxide-rich red dust and basalts (pyroxenes) can be mapped, as well as very bright components like water ice [5, 6]. Possible detection of other materials still has to be demonstrated. We first explore regions where unusual mineralogy appears clearly from spectral data. 
Hematite

  20. 1988 Bulletin compilation and index

    International Nuclear Information System (INIS)

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information

  1. Compilation of solar abundance data

    International Nuclear Information System (INIS)

    Hauge, Oe.; Engvold, O.

    1977-01-01

    Interest in the previous compilations of solar abundance data by the same authors (ITA--31 and ITA--39) has led to this third, revised edition. Solar abundance data of 67 elements are tabulated and in addition upper limits for the abundances of 5 elements are listed. References are made to 167 papers. A recommended abundance value is given for each element. (JIW)

  2. 1988 Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  3. Geodesy- and geology-based slip-rate models for the Western United States (excluding California) national seismic hazard maps

    Science.gov (United States)

    Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne

    2014-01-01

The 2014 National Seismic Hazard Maps for the conterminous United States incorporate greater uncertainty in the fault slip-rate parameters that control earthquake-activity rates than was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. The models considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert-opinion model and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight, split equally, between the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. Resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.

  4. LATITUDE-LONGITUDE GRID MAPS OF AFRICA (CCTA ...

    African Journals Online (AJOL)

of mapping see de Meillon, B., Davis, D. H. S., and Hardy, F., Plague in Southern Africa. I. The Siphonaptera. Government Printer, Pretoria, 1961, or consult CCTA/CSA Publication. No. 29, referred to above. * Climatological Atlas of Africa, compiled and edited in the African Climatology Unit, University of the Witwatersrand ...

  5. High Resolution Map of Water Supply and Demand for North East United States

    Science.gov (United States)

    Ehsani, N.; Vorosmarty, C. J.; Fekete, B. M.

    2012-12-01

Accurate estimates of water supply and demand are crucial elements in water resources management and modeling. As part of our NSF-funded EaSM effort to build a Northeast Regional Earth System Model (NE-RESM) as a framework to improve our understanding and capacity to forecast the implications of planning decisions on the region's environment, ecosystem services, energy and economic systems through the 21st century, we are producing a high resolution map (3' x 3' lat/long) of estimated water supply and use for the northeast region of the United States. Focusing on water demand, results from this study enable us to quantify how demand sources affect the hydrology and thermal-chemical water pollution across the region. To generate this 3-minute resolution map, in which each grid cell has specific estimated monthly domestic, agricultural, thermoelectric and industrial water use, Estimated Use of Water in the United States in 2005 (Kenny et al., 2009) is being coupled to high resolution land cover and land use, irrigation, power plant and population data sets. In addition to water demands, we have tried to improve estimates of water supply from the WBM model by improving the way it controls discharge from reservoirs. Reservoirs are key features of the modern hydrologic system, with a particular impact on altering natural stream flow, thermal characteristics, and biogeochemical fluxes of rivers. Depending on dam characteristics, watershed characteristics and the purpose of building a dam, each reservoir has a specific optimum operating rule. This means that the 84,000 dams in the National Inventory of Dams potentially follow 84,000 different sets of rules for storing and releasing water, which must somehow be accounted for in our modeling exercise. In reality, there is no comprehensive observational dataset depicting these operating rules. Thus, we will simulate these rules. 
Our perspective is not to find the optimum operating rule per se but to find
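Simulated operating rules can start from a simple mass balance; the sketch below is a generic single-reservoir time step (the rule and all names are illustrative, not the WBM model's actual scheme):

```python
def reservoir_step(storage, inflow, demand, capacity, dead_storage=0.0):
    """One time step of a toy operating rule (all quantities in the same
    volume units): meet demand if water above dead storage allows, then
    spill anything above capacity."""
    available = storage + inflow
    release = min(demand, max(available - dead_storage, 0.0))
    carryover = available - release
    spill = max(carryover - capacity, 0.0)
    return carryover - spill, release + spill  # (new storage, total outflow)

s, out = reservoir_step(storage=80.0, inflow=30.0, demand=20.0, capacity=100.0)
print(s, out)  # -> 90.0 20.0
```

Real operating rules add seasonal targets and purpose-specific constraints (flood control, hydropower, irrigation), which is why 84,000 dams imply potentially 84,000 distinct rule sets.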

  6. POLARIS: A 30-meter probabilistic soil series map of the contiguous United States

    Science.gov (United States)

    Chaney, Nathaniel W; Wood, Eric F; McBratney, Alexander B; Hempel, Jonathan W; Nauman, Travis; Brungard, Colby W.; Odgers, Nathan P

    2016-01-01

    A new complete map of soil series probabilities has been produced for the contiguous United States at a 30 m spatial resolution. This innovative database, named POLARIS, is constructed using available high-resolution geospatial environmental data and a state-of-the-art machine learning algorithm (DSMART-HPC) to remap the Soil Survey Geographic (SSURGO) database. This 9 billion grid cell database is possible using available high performance computing resources. POLARIS provides a spatially continuous, internally consistent, quantitative prediction of soil series. It offers potential solutions to the primary weaknesses in SSURGO: 1) unmapped areas are gap-filled using survey data from the surrounding regions, 2) the artificial discontinuities at political boundaries are removed, and 3) the use of high resolution environmental covariate data leads to a spatial disaggregation of the coarse polygons. The geospatial environmental covariates that have the largest role in assembling POLARIS over the contiguous United States (CONUS) are fine-scale (30 m) elevation data and coarse-scale (~ 2 km) estimates of the geographic distribution of uranium, thorium, and potassium. A preliminary validation of POLARIS using the NRCS National Soil Information System (NASIS) database shows variable performance over CONUS. In general, the best performance is obtained at grid cells where DSMART-HPC is most able to reduce the chance of misclassification. The important role of environmental covariates in limiting prediction uncertainty suggests including additional covariates is pivotal to improving POLARIS' accuracy. This database has the potential to improve the modeling of biogeochemical, water, and energy cycles in environmental models; enhance availability of data for precision agriculture; and assist hydrologic monitoring and forecasting to ensure food and water security.

  7. A compilation of energy costs of physical activities.

    Science.gov (United States)

    Vaz, Mario; Karaolis, Nadine; Draper, Alizon; Shetty, Prakash

    2005-10-01

    There were two objectives: first, to review the existing data on energy costs of specified activities in the light of the recommendations made by the Joint Food and Agriculture Organization/World Health Organization/United Nations University (FAO/WHO/UNU) Expert Consultation of 1985. Second, to compile existing data on the energy costs of physical activities for an updated annexure of the current Expert Consultation on Energy and Protein Requirements. Electronic and manual search of the literature (predominantly English) to obtain published data on the energy costs of physical activities. The majority of the data prior to 1955 were obtained using an earlier compilation of Passmore and Durnin. Energy costs were expressed as physical activity ratio (PAR); the energy cost of the activity divided by either the measured or predicted basal metabolic rate (BMR). The compilation provides PARs for an expanded range of activities that include general personal activities, transport, domestic chores, occupational activities, sports and other recreational activities for men and women, separately, where available. The present compilation is largely in agreement with the 1985 compilation, for activities that are common to both compilations. The present compilation has been based on the need to provide data on adults for a wide spectrum of human activity. There are, however, lacunae in the available data for many activities, between genders, across age groups and in various physiological states.
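The PAR definition used in the compilation reduces to a one-line calculation; a minimal sketch with illustrative numbers:

```python
def physical_activity_ratio(activity_cost_kj_min: float, bmr_kj_min: float) -> float:
    """PAR = energy cost of an activity divided by the (measured or
    predicted) basal metabolic rate, both in the same units."""
    return activity_cost_kj_min / bmr_kj_min

# e.g. an activity costing 14.7 kJ/min for a person with a BMR of 4.9 kJ/min
print(round(physical_activity_ratio(14.7, 4.9), 2))  # -> 3.0
```

Expressing costs as dimensionless PAR multiples of BMR is what lets a single table serve people of different body sizes.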

  8. Radiological mapping of functional transcription units of bacteriophage phiX174 and S13

    International Nuclear Information System (INIS)

    Pollock, T.J.; Tessman, I.; Tessman, E.S.

    1978-01-01

    It has been found that the nearest promoter is not always the primary promoter for making translatable message. The technique of ultraviolet mapping was used to determine the location of promoter sites for translated mRNA coded for by bacteriophages phiX174 and S13. The method is based on the theory that the 'target size' for u.v. inactivation of expression of a gene is proportional to the distance between the promoter and the 3' end of the gene. This method has revealed an expected and some unexpected locations for the promoters responsible for gene expression. Ultraviolet-survival curves for expression of phage genes were interpreted in the following way. The contiguous genes D, F, G and H are expressed as a unit under the control of a promoter located near gene D. However, gene B (and probably the adjacent genes K and C) are controlled by a promoter distant from gene B, possibly in the region of gene H, rather than from a promoter located just before gene B. Likewise, gene A is controlled by a promoter distant from gene A. (author)

  9. A terrain-based site characterization map of California with implications for the contiguous United States

    Science.gov (United States)

    Yong, Alan K.; Hough, Susan E.; Iwahashi, Junko; Braverman, Amy

    2012-01-01

    We present an approach based on geomorphometry to predict material properties and characterize site conditions using the VS30 parameter (time‐averaged shear‐wave velocity to a depth of 30 m). Our framework consists of an automated terrain classification scheme based on taxonomic criteria (slope gradient, local convexity, and surface texture) that systematically identifies 16 terrain types from 1‐km spatial resolution (30 arcsec) Shuttle Radar Topography Mission digital elevation models (SRTM DEMs). Using 853 VS30 values from California, we apply a simulation‐based statistical method to determine the mean VS30 for each terrain type in California. We then compare the VS30 values with models based on individual proxies, such as mapped surface geology and topographic slope, and show that our systematic terrain‐based approach consistently performs better than semiempirical estimates based on individual proxies. To further evaluate our model, we apply our California‐based estimates to terrains of the contiguous United States. Comparisons of our estimates with 325 VS30 measurements outside of California, as well as estimates based on the topographic slope model, indicate our method to be statistically robust and more accurate. Our approach thus provides an objective and robust method for extending estimates of VS30 for regions where in situ measurements are sparse or not readily available.
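The VS30 parameter itself is a travel-time (harmonic) average over the top 30 m; a minimal sketch (the layer values are invented for illustration):

```python
def vs30(thicknesses_m, velocities_m_s):
    """Time-averaged shear-wave velocity over the top 30 m:
    VS30 = 30 / sum(h_i / v_i), with layer thicknesses summing to 30 m."""
    assert abs(sum(thicknesses_m) - 30.0) < 1e-9, "layers must total 30 m"
    travel_time_s = sum(h / v for h, v in zip(thicknesses_m, velocities_m_s))
    return 30.0 / travel_time_s

# two layers: 10 m at 160 m/s over 20 m at 320 m/s
print(vs30([10.0, 20.0], [160.0, 320.0]))  # -> 240.0
```

Note the result (240 m/s) is below the thickness-weighted mean (about 267 m/s): slow near-surface layers dominate the travel time, which is why VS30 is a useful site-amplification proxy.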

  10. Land suitability maps for waste disposal siting

    International Nuclear Information System (INIS)

    Hrasna, M.

    1996-01-01

The suitability of the geoenvironment for waste disposal depends mainly on its stability and on the danger of groundwater pollution. In addition, land suitability maps for this purpose should take into account those factors of the geoenvironment and the landscape that enable other uses of the land, such as mineral resources, water resources, fertile soils, nature reserves, etc. On the basis of an evaluation of the influence of the relevant factors, suitable, moderately suitable and unsuitable territorial units are delimited on the maps. Different approaches are applied when compiling maps at various scales, taking into account their different representational capabilities. (authors)

  11. HAL/S-FC compiler system specifications

    Science.gov (United States)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  12. VFC: The Vienna Fortran Compiler

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    1999-01-01

    Full Text Available High Performance Fortran (HPF offers an attractive high‐level language interface for programming scalable parallel architectures providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC, a new source‐to‐source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influence a program’s performance. This comprises data locality assertions, non‐local access specifications and the possibility of reusing runtime‐generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high‐level data parallel language such as HPF+ a performance close to hand‐written message‐passing programs can be achieved even for highly irregular codes.

  13. Improved predictive mapping of indoor radon concentrations using ensemble regression trees based on automatic clustering of geological units

    International Nuclear Information System (INIS)

    Kropat, Georg; Bochud, Francois; Jaboyedoff, Michel; Laedermann, Jean-Pascal; Murith, Christophe; Palacios, Martha; Baechler, Sébastien

    2015-01-01

Purpose: According to estimates, around 230 people die as a result of radon exposure in Switzerland. This public health concern makes reliable indoor radon prediction and mapping methods necessary in order to improve risk communication to the public. The aim of this study was to develop an automated method to classify lithological units according to their radon characteristics and to develop mapping and predictive tools in order to improve local radon prediction. Method: About 240 000 indoor radon concentration (IRC) measurements in about 150 000 buildings were available for our analysis. The automated classification of lithological units was based on k-medoids clustering via pairwise Kolmogorov distances between IRC distributions of lithological units. For IRC mapping and prediction we used random forests and Bayesian additive regression trees (BART). Results: The automated classification groups lithological units well in terms of their IRC characteristics. In particular, the IRC differences among metamorphic rocks such as gneiss are well revealed by this method. The maps produced by random forests soundly represent the regional differences of IRCs in Switzerland and improve the spatial detail compared to existing approaches. We could explain 33% of the variation in IRC data with random forests. Additionally, variable importance as evaluated by random forests shows that building characteristics are less important predictors of IRCs than spatial/geological influences. BART could explain 29% of IRC variability and produced maps that indicate the prediction uncertainty. Conclusion: Ensemble regression trees are a powerful tool to model and understand the multidimensional influences on IRCs. Automatic clustering of lithological units complements this method by facilitating the interpretation of radon properties of rock types. This study provides an important element for radon risk communication. Future approaches should consider taking into account further variables
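The clustering step the abstract describes can be sketched compactly: compute pairwise two-sample Kolmogorov distances between the units' measurement distributions, then run k-medoids on that distance matrix. The IRC samples below are invented stand-ins (lognormal, to mimic skewed radon data), and the k-medoids variant (farthest-point initialization, alternating assign/update) is a generic one, not necessarily the paper's:

```python
import numpy as np

def ks_distance(a, b):
    """Two-sample Kolmogorov distance: the largest gap between the
    empirical CDFs, checked at every pooled sample point."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

def k_medoids(dist, k, n_iter=100):
    """Plain k-medoids on a precomputed distance matrix,
    with farthest-point initialization."""
    medoids = [int(np.argmax(dist.sum(axis=1)))]
    while len(medoids) < k:
        medoids.append(int(np.argmax(dist[:, medoids].min(axis=1))))
    medoids = np.array(medoids)
    for _ in range(n_iter):
        labels = np.argmin(dist[:, medoids], axis=1)
        new = []
        for c in range(k):
            members = np.flatnonzero(labels == c)
            within = dist[np.ix_(members, members)].sum(axis=1)
            new.append(members[np.argmin(within)])
        new = np.array(new)
        if np.array_equal(new, medoids):
            break
        medoids = new
    return labels, medoids

# Hypothetical IRC samples for six lithological units: three low-radon,
# three high-radon.
rng = np.random.default_rng(1)
units = [rng.lognormal(4.0, 0.5, 200) for _ in range(3)] + \
        [rng.lognormal(5.5, 0.5, 200) for _ in range(3)]
D = np.array([[ks_distance(a, b) for b in units] for a in units])
labels, _ = k_medoids(D, k=2)
print(labels)  # the two radon regimes land in separate clusters
```

Working directly with distribution distances, rather than summary statistics, is what lets units with similar IRC shapes group together regardless of lithological labels.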

  14. Population-Based Trachoma Mapping in Six Evaluation Units of Papua New Guinea.

    Science.gov (United States)

    Ko, Robert; Macleod, Colin; Pahau, David; Sokana, Oliver; Keys, Drew; Burnett, Anthea; Willis, Rebecca; Wabulembo, Geoffrey; Garap, Jambi; Solomon, Anthony W

    2016-01-01

We sought to determine the prevalence of trachomatous inflammation - follicular (TF) in children aged 1-9 years, and trachomatous trichiasis (TT) in those aged ≥15 years, in suspected trachoma-endemic areas of Papua New Guinea (PNG). We carried out six population-based prevalence surveys using the protocol developed as part of the Global Trachoma Mapping Project. A total of 19,013 individuals were sampled for inclusion, with 15,641 (82.3%) consenting to participate. Four evaluation units had prevalences of TF in children ≥10%, above which threshold the World Health Organization (WHO) recommends mass drug administration (MDA) of azithromycin for at least three years; Western Province (South Fly/Daru) 11.2% (95% confidence interval, CI, 6.9-17.0%), Southern Highlands (East) 12.2% (95% CI 9.6-15.0%), Southern Highlands (West) 11.7% (95% CI 8.5-15.3%), and West New Britain 11.4% (95% CI 8.7-13.9%). TF prevalence was 5.0-9.9% in Madang (9.4%, 95% CI 6.1-13.0%) and National Capital District (6.0%, 95% CI 3.2-9.1%) where consideration of a single round of MDA is warranted. Cases of TT were not found outside West New Britain, in which four cases were seen, generating an estimated population-level prevalence of TT in adults of 0.10% (95% CI 0.00-0.40%) for West New Britain, below the WHO elimination threshold of 0.2% of those aged ≥15 years. Trachoma is a public health issue in PNG. However, other than in West New Britain, there are few data to support the idea that trachoma is a cause of blindness in PNG. Further research is needed to understand the stimulus for the active trachoma phenotype in these populations.
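Prevalence figures like those above come with 95% confidence intervals; the surveys here use design-based CIs that account for cluster sampling (and are wider than the textbook form). As a simpler illustration of the basic calculation only, a Wilson score interval for a proportion under simple random sampling, with hypothetical counts:

```python
import math

def wilson_ci(cases, n, z=1.96):
    """Wilson score 95% CI for a proportion (simple random sampling;
    cluster-survey CIs as used in the record are wider)."""
    p = cases / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical numbers: 112 TF cases among 1000 examined children.
lo, hi = wilson_ci(112, 1000)
print(f"prevalence 11.2%, 95% CI {lo:.1%}-{hi:.1%}")
```

The point estimate sits near the middle of the interval; whether the lower bound clears the 10% MDA threshold is what drives the treatment decision.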

  15. What Is the Unit of Visual Attention? Object for Selection, but Boolean Map for Access

    Science.gov (United States)

    Huang, Liqiang

    2010-01-01

    In the past 20 years, numerous theories and findings have suggested that the unit of visual attention is the object. In this study, I first clarify 2 different meanings of unit of visual attention, namely the unit of access in the sense of measurement and the unit of selection in the sense of division. In accordance with this distinction, I argue…

  16. Heel Effect: Dose Mapping And Profiling For Mobile C-Arm Fluoroscopy Unit Toshiba SXT-1000A

    International Nuclear Information System (INIS)

    Husaini Salleh; Mohd Khalid Matori; Muhammad Jamal Md Isa; Mohd Ramli Arshad; Shahrul Azlan Azizan; Mohd Firdaus Abdul Rahman; Md Khairusalih Md Zin

    2014-01-01

The heel effect is a well-known phenomenon in x-ray production. It affects image formation as well as scattered radiation, yet studies of the heel effect remain scarce. This study maps and profiles the dose on the surface of a water phantom using the mobile C-arm unit Toshiba SXT-1000A. The results show that the dose profile increases by up to about 57% from the anode to the cathode bound of the irradiated area. This result and information can be used as a guide to exploiting the phenomenon for better image quality and radiation safety with this specific and dedicated fluoroscopy unit. (author)

  17. PIG 3 - A simple compiler for mercury

    International Nuclear Information System (INIS)

    Bindon, D.C.

    1961-06-01

A short machine-language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  18. PIG 3 - A simple compiler for mercury

    Energy Technology Data Exchange (ETDEWEB)

    Bindon, D C [Computer Branch, Technical Assessments and Services Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1961-06-15

A short machine-language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  19. Seep Detection using E/V Nautilus Integrated Seafloor Mapping and Remotely Operated Vehicles on the United States West Coast

    Science.gov (United States)

    Gee, L. J.; Raineault, N.; Kane, R.; Saunders, M.; Heffron, E.; Embley, R. W.; Merle, S. G.

    2017-12-01

Exploration Vessel (E/V) Nautilus has been mapping the seafloor off the west coast of the United States, from Washington to California, for the past three years with a Kongsberg EM302 multibeam sonar. This system simultaneously collects bathymetry, seafloor backscatter, and water column backscatter data, allowing an integrated approach to mapping that more completely characterizes a region, and has identified over 1,000 seafloor seeps. Hydrographic multibeam sonars like the EM302 were designed for mapping bathymetry; only in the last decade have major mapping projects taken an integrated approach that utilizes the seabed and water column backscatter information in addition to the bathymetry. Nautilus mapping in the Eastern Pacific over the past three years has included a number of seep-specific expeditions, and has utilized and adapted the preliminary mapping guidelines that have emerged from research. The likelihood of seep detection is affected by many factors: the environment (seabed geomorphology, surficial sediment, seep location/depth, and regional oceanography and biology); the nature of the seeps themselves (size variation, varying flux, depth, and transience); the detection system (the design of hydrographic multibeam sonars limits their use for water column detection); and the platform (variations in the vessel and its operations, such as noise, speed, and swath overlap). Nautilus integrated seafloor mapping provided multiple indicators of seep locations, but it remains difficult to assess the probability of seep detection. Even when seeps were detected, they have not always been located during ROV dives. However, the presence of associated features (methane hydrate and bacterial mats) serves as evidence of potential seep activity and reinforces the transient nature of the seeps. 
Not detecting a seep in the water column data does not necessarily indicate that there is no seep at a given location, but with multiple passes over an area and by the use of other contextual data, an area may

  20. Combined landslide inventory and susceptibility assessment based on different mapping units: an example from the Flemish Ardennes, Belgium

    Directory of Open Access Journals (Sweden)

    M. Van Den Eeckhaut

    2009-03-01

Full Text Available For a 277 km² study area in the Flemish Ardennes, Belgium, a landslide inventory and two landslide susceptibility zonations were combined to obtain an optimal landslide susceptibility assessment, in five classes. For the experiment, a regional landslide inventory, a 10 m × 10 m digital representation of topography, and lithological and soil hydrological information obtained from 1:50 000 scale maps, were exploited. In the study area, the regional inventory shows 192 landslides of the slide type, including 158 slope failures that occurred before 1992 (model calibration set) and 34 failures that occurred after 1992 (model validation set). The study area was partitioned into 2.78×10⁶ grid cells and into 1927 topographic units. The latter are hydro-morphological units obtained by subdividing slope units based on terrain gradient. Independent models were prepared for the two terrain subdivisions using discriminant analysis. For grid cells, a single pixel was identified as representative of the landslide depletion area, and geo-environmental information for the pixel was obtained from the thematic maps. The landslide and geo-environmental information was used to model the propensity of the terrain to host landslide source areas. For topographic units, morphologic and hydrologic information and the proportion of lithologic and soil hydrological types in each unit were used to evaluate landslide susceptibility, including the depletion and depositional areas. Uncertainty associated with the two susceptibility models was evaluated, and the model performance was tested using the independent landslide validation set. A heuristic procedure was adopted to combine the landslide inventory and the susceptibility zonations. The procedure makes optimal use of the available landslide and susceptibility information, minimizing the limitations inherent in the inventory and the susceptibility maps. For the established susceptibility classes, regulations to

  1. Combined landslide inventory and susceptibility assessment based on different mapping units: an example from the Flemish Ardennes, Belgium

    Science.gov (United States)

    van den Eeckhaut, M.; Reichenbach, P.; Guzzetti, F.; Rossi, M.; Poesen, J.

    2009-03-01

For a 277 km² study area in the Flemish Ardennes, Belgium, a landslide inventory and two landslide susceptibility zonations were combined to obtain an optimal landslide susceptibility assessment, in five classes. For the experiment, a regional landslide inventory, a 10 m × 10 m digital representation of topography, and lithological and soil hydrological information obtained from 1:50 000 scale maps, were exploited. In the study area, the regional inventory shows 192 landslides of the slide type, including 158 slope failures that occurred before 1992 (model calibration set) and 34 failures that occurred after 1992 (model validation set). The study area was partitioned into 2.78×10⁶ grid cells and into 1927 topographic units. The latter are hydro-morphological units obtained by subdividing slope units based on terrain gradient. Independent models were prepared for the two terrain subdivisions using discriminant analysis. For grid cells, a single pixel was identified as representative of the landslide depletion area, and geo-environmental information for the pixel was obtained from the thematic maps. The landslide and geo-environmental information was used to model the propensity of the terrain to host landslide source areas. For topographic units, morphologic and hydrologic information and the proportion of lithologic and soil hydrological types in each unit were used to evaluate landslide susceptibility, including the depletion and depositional areas. Uncertainty associated with the two susceptibility models was evaluated, and the model performance was tested using the independent landslide validation set. A heuristic procedure was adopted to combine the landslide inventory and the susceptibility zonations. The procedure makes optimal use of the available landslide and susceptibility information, minimizing the limitations inherent in the inventory and the susceptibility maps. 
For the established susceptibility classes, regulations to link terrain domains to appropriate land
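The discriminant-analysis step described above can be sketched with synthetic data: fit a two-class Fisher discriminant on per-unit terrain variables, score units, and bin the scores into five susceptibility classes. The predictor names, values, and class breaks below are invented for illustration and are not the study's variables:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical predictors per mapping unit: [slope gradient (deg),
# clay fraction, topographic wetness index].
stable = rng.normal([10.0, 0.20, 4.0], [3.0, 0.05, 1.0], size=(150, 3))
slid   = rng.normal([18.0, 0.35, 6.0], [3.0, 0.05, 1.0], size=(150, 3))

def fit_lda(x0, x1):
    """Two-class Fisher discriminant: direction w and midpoint threshold c."""
    mu0, mu1 = x0.mean(axis=0), x1.mean(axis=0)
    sw = np.cov(x0, rowvar=False) + np.cov(x1, rowvar=False)
    w = np.linalg.solve(sw, mu1 - mu0)
    c = w @ (mu0 + mu1) / 2
    return w, c

w, c = fit_lda(stable, slid)
scores = np.concatenate([stable, slid]) @ w - c  # >0 leans "landslide"

# Five susceptibility classes from score quintiles, echoing the study's
# five-class zonation (the breaks here are illustrative only).
classes = np.digitize(scores, np.quantile(scores, [0.2, 0.4, 0.6, 0.8]))
```

The same fitted direction can score every unit in a study area, which is what turns a calibration inventory into a wall-to-wall susceptibility map.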

  2. Proving correctness of compilers using structured graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    it into a compiler implementation using a graph type along with a correctness proof. The implementation and correctness proof of a compiler using a tree type without explicit jumps is simple, but yields code duplication. Our method provides a convenient way of improving such a compiler without giving up the benefits...

  3. Compiling Planning into Quantum Optimization Problems: A Comparative Study

    Science.gov (United States)

    2015-06-07

to SAT, and then reduces higher-order terms to quadratic terms through a series of gadgets. Our mappings allow both positive and negative preconditions...to its being specific to this type of problem) and likely benefits from a homogeneous parameter setting (Venturelli et al. 2014), as it generates a...Guzik, A. 2013. Resource efficient gadgets for compiling adiabatic quantum optimization problems. Annalen der Physik 525(10-11):877–888. Blum, A

  4. A Fire Severity Mapping System (FSMS) for real-time management applications and long term planning: Developing a map of the landscape potential for severe fire in the western United States

    Science.gov (United States)

    Gregory K. Dillon; Zachary A. Holden; Penny Morgan; Bob Keane

    2009-01-01

    The Fire Severity Mapping System project is geared toward providing fire managers across the western United States with critical information for dealing with and planning for the ecological effects of wildfire at multiple levels of thematic, spatial, and temporal detail. For this project, we are developing a comprehensive, west-wide map of the landscape potential for...

  5. Using NASA Satellite Observations to Map Wildfire Risk in the United States for Allocation of Fire Management Resources

    Science.gov (United States)

    Farahmand, A.; Reager, J. T., II; Behrangi, A.; Stavros, E. N.; Randerson, J. T.

    2017-12-01

Fires are a key disturbance globally, acting as a catalyst for terrestrial ecosystem change and contributing significantly to both carbon emissions and changes in surface albedo. The socioeconomic impacts of wildfires are also significant, with wildfire activity resulting in billions of dollars of losses every year. Fire size, area burned and frequency are increasing, so predicting fire danger, defined by the United States National Interagency Fire Center (NIFC) as the demand for fire management resources as a function of how far fuel flammability (a function of ignitability, consumability and availability) departs from normal, is an important step toward reducing the costs associated with wildfires. Numerous studies have aimed to predict the likelihood of fire danger, but few studies use remote sensing data to map fire danger at scales commensurate with regional management decisions (e.g., deployment of resources nationally throughout fire season with seasonal and monthly prediction). Here, we use NASA Gravity Recovery And Climate Experiment (GRACE) assimilated surface soil moisture, NASA Atmospheric Infrared Sounder (AIRS) vapor pressure deficit, and NASA Moderate Resolution Imaging Spectroradiometer (MODIS) enhanced vegetation index and land cover products, along with US Forest Service historical fire activity data, to generate probabilistic monthly fire potential maps in the United States. These maps can be useful not only in the operational allocation of fire management resources by government, but also in improving understanding of the Earth system and how it is changing, in order to refine predictions of fire extremes.
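One plausible way to combine such predictors into a probabilistic map is a logistic model trained on historical fire occurrence. The sketch below uses synthetic data and invented weights; the predictor names merely mirror the record (soil moisture, vapor pressure deficit, EVI), and the record does not state that logistic regression is the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000                          # grid cell-months in a training set
soil = rng.uniform(0.0, 1.0, n)   # GRACE-style surface soil moisture
vpd  = rng.uniform(0.0, 4.0, n)   # AIRS-style vapor pressure deficit, kPa
evi  = rng.uniform(0.0, 0.8, n)   # MODIS-style enhanced vegetation index

# Synthetic "historical fire" labels: drier soil and higher VPD raise odds.
true_logit = -2.0 - 3.0 * soil + 1.2 * vpd + 1.5 * evi
y = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-true_logit))

X = np.column_stack([np.ones(n), soil, vpd, evi])
beta = np.zeros(4)
for _ in range(5000):             # plain gradient ascent on the log-likelihood
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (y - p) / n

fire_potential = 1.0 / (1.0 + np.exp(-X @ beta))  # probability per cell-month
```

Reshaped back onto the spatial grid, `fire_potential` is exactly the kind of monthly probability surface the record describes for allocating fire management resources.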

  6. Elaboration Of A Classification Of Geomorphologic Units And The Basis Of A Digital Data-Base For Establishing Geomorphologic Maps In Egypt

    International Nuclear Information System (INIS)

    EI Gammal, E.A.; Cherif, O.H.; Abdel Aleem, E.

    2003-01-01

A database for the classification and description of basic geomorphologic landform units has been prepared for establishing geomorphologic maps of Egyptian terrains. This database includes morpho-structural, lithological, denudational and depositional units. The database is organized in tables with proper coding so that the colors, symbols and legends of the maps can be generated automatically. The system also includes descriptions of the various geomorphic units. The system is designed to be used with the ArcMap software, and the AutoCAD 2000 software was used to trace the maps. The database has been applied to produce five new geomorphologic maps at a scale of 1:100 000: the Wadi Feiran sheet, the Wadi Kid sheet and the Gabal Katherina sheet in South Sinai, the Shelattein area (South Eastern Desert) and the Baharia Oasis area (Western Desert)

  7. Interpretation, compilation and field verification procedures in the CARETS project

    Science.gov (United States)

    Alexander, Robert H.; De Forth, Peter W.; Fitzpatrick, Katherine A.; Lins, Harry F.; McGinty, Herbert K.

    1975-01-01

The production of the CARETS map data base involved the development of a series of procedures for interpreting, compiling, and verifying data obtained from remote sensor sources. Level II land use mapping from high-altitude aircraft photography at a scale of 1:100,000 required production of a photomosaic mapping base for each of the 48 sheets (50 x 50 km each), and the interpretation and coding of land use polygons on drafting film overlays. CARETS researchers also produced a series of 1970 to 1972 land use change overlays, using the 1970 land use maps and 1972 high-altitude aircraft photography. To enhance the value of the land use sheets, researchers compiled a series of overlays showing cultural features, county boundaries and census tracts, surface geology, and drainage basins. In producing Level I land use maps from Landsat imagery, at a scale of 1:250,000, interpreters overlaid drafting film directly on Landsat color composite transparencies and interpreted on the film. They found that such interpretation involves pattern and spectral signature recognition. In studies using Landsat imagery, interpreters identified numerous areas of change but also identified extensive areas of "false change," where Landsat spectral signatures but not land use had changed.

  8. Material Units, Structures/Landforms, and Stratigraphy for the Global Geologic Map of Ganymede (1:15M)

    Science.gov (United States)

    Patterson, G. Wesley; Head, James W.; Collins, Geoffrey C.; Pappalardo, Robert T.; Prockter, Louis M.; Lucchitta, Baerbel K.

    2008-01-01

    In the coming year a global geological map of Ganymede will be completed that represents the most recent understanding of the satellite on the basis of Galileo mission results. This contribution builds on important previous accomplishments in the study of Ganymede utilizing Voyager data and incorporates the many new discoveries that were brought about by examination of Galileo data. Material units have been defined, structural landforms have been identified, and an approximate stratigraphy has been determined utilizing a global mosaic of the surface with a nominal resolution of 1 km/pixel assembled by the USGS. This mosaic incorporates the best available Voyager and Galileo regional coverage and high resolution imagery (100-200 m/pixel) of characteristic features and terrain types obtained by the Galileo spacecraft. This map has given us a more complete understanding of: 1) the major geological processes operating on Ganymede, 2) the characteristics of the geological units making up its surface, 3) the stratigraphic relationships of geological units and structures, and 4) the geological history inferred from these relationships. A summary of these efforts is provided here.

  9. Translation of Bernstein Coefficients Under an Affine Mapping of the Unit Interval

    Science.gov (United States)

    Alford, John A., II

    2012-01-01

    We derive an expression connecting the coefficients of a polynomial expanded in the Bernstein basis to the coefficients of an equivalent expansion of the polynomial under an affine mapping of the domain. The expression may be useful in the calculation of bounds for multi-variate polynomials.
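The abstract does not reproduce the closed-form expression it derives, so the sketch below only illustrates the statement numerically: given the Bernstein coefficients of p on [0, 1], recover the Bernstein coefficients of q(t) = p(s·t + c) by collocation at n+1 nodes. The collocation route is an assumption for illustration, not the paper's formula:

```python
import numpy as np
from math import comb

def bernstein_matrix(n, pts):
    """B[i, k] = C(n, k) * t_i**k * (1 - t_i)**(n - k) for each point t_i."""
    pts = np.asarray(pts, dtype=float)
    return np.array([[comb(n, k) * t**k * (1 - t)**(n - k)
                      for k in range(n + 1)] for t in pts])

def remap_coeffs(b, s, c):
    """Bernstein coefficients of q(t) = p(s*t + c), where p has
    Bernstein coefficients b on [0, 1]. Solved numerically by
    interpolating q at n+1 equally spaced nodes."""
    b = np.asarray(b, dtype=float)
    n = len(b) - 1
    nodes = np.linspace(0.0, 1.0, n + 1)
    rhs = bernstein_matrix(n, s * nodes + c) @ b   # q sampled at the nodes
    return np.linalg.solve(bernstein_matrix(n, nodes), rhs)

# p has Bernstein coefficients [1, 3, 2]; affine map x = 0.5*t + 0.25.
a = remap_coeffs([1.0, 3.0, 2.0], 0.5, 0.25)
```

Because the remapped coefficients bound q on [0, 1] just as b bounds p, this transformation is what makes Bernstein-based range bounding work on subintervals, the application the abstract mentions.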

  10. Okeanos Explorer (EX1606): CAPSTONE Wake Island Unit PRIMNM (ROV & Mapping)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Operations will use the ship’s deep water mapping systems (Kongsberg EM302 multibeam sonar, EK60 split-beam fisheries sonars, ADCPs, and Knudsen 3260 chirp...

  11. Next-generation forest change mapping across the United States: the landscape change monitoring system (LCMS)

    Science.gov (United States)

    Sean P. Healey; Warren B. Cohen; Yang Zhiqiang; Ken Brewer; Evan Brooks; Noel Gorelick; Mathew Gregory; Alexander Hernandez; Chengquan Huang; Joseph Hughes; Robert Kennedy; Thomas Loveland; Kevin Megown; Gretchen Moisen; Todd Schroeder; Brian Schwind; Stephen Stehman; Daniel Steinwand; James Vogelmann; Curtis Woodcock; Limin Yang; Zhe. Zhu

    2015-01-01

    Forest change information is critical in forest planning, ecosystem modeling, and in updating forest condition maps. The Landsat satellite platform has provided consistent observations of the world’s ecosystems since 1972. A number of innovative change detection algorithms have been developed to use the Landsat archive to identify and characterize forest change. The...

  12. Geologic quadrangle maps of the United States: geology of the Casa Diablo Mountain quadrangle, California

    Science.gov (United States)

    Rinehart, C. Dean; Ross, Donald Clarence

    1957-01-01

    The Casa Diablo Mountain quadrangle was mapped in the summers of 1952 and 1953 by the U.S. Geological Survey in cooperation with the California State Division of Mines as part of a study of potential tungsten-bearing areas.

  13. Taxonomic classification of world map units in crop producing areas of Argentina and Brazil with representative US soil series and major land resource areas in which they occur

    Science.gov (United States)

    Huckle, H. F. (Principal Investigator)

    1980-01-01

The most probable current U.S. taxonomic classifications of the soils estimated to dominate world soil map (WSM) units in selected crop-producing states of Argentina and Brazil are presented. Representative U.S. soil series for the units are given. The map units occurring in each state are listed with their areal extent and the major U.S. land resource areas in which similar soils most probably occur. The soil series sampled in LARS Technical Report 111579, and the major land resource areas in which they occur, are given with the corresponding similar WSM units at the taxonomic subgroup level.

  14. Mapping critical levels of ozone, sulfur dioxide and nitrogen oxide for crops, forests and natural vegetation in the United States

    International Nuclear Information System (INIS)

    Rosenbaum, B.J.; Strickland, T.C.; McDowell, M.K.

    1994-01-01

Air pollution abatement strategies for controlling nitrogen dioxide, sulfur dioxide, and ozone emissions in the United States focus on a 'standards-based' approach. This approach places limits on air pollution by maintaining a baseline value for air quality, no matter what the ecosystem can or cannot withstand. This paper presents example critical levels maps for the conterminous U.S. developed using the 'effects-based' mapping approach as defined by the United Nations Economic Commission for Europe's Convention on Long-Range Transboundary Air Pollution, Task Force on Mapping. This approach emphasizes the pollution level or load capacity an ecosystem can accommodate before degradation occurs, and allows for analysis of cumulative effects. The paper presents the first stage of an analysis that reports the distribution of exceedances of critical levels for NO2, SO2, and O3 in sensitive forest, crop, and natural vegetation ecosystems in the contiguous United States. It is concluded that extrapolation to surrounding geographic areas requires the analysis of diverse and compounding factors that preclude simple extrapolation methods. Pollutant data depicted in this analysis are limited to locationally specific data, and would be enhanced by utilizing spatial statistics, along with converging associated anthropogenic and climatological factors. Values used for critical levels were derived from current scientific knowledge. While not intended to be definitive values, adjustments will occur as the scientific community gains new insight into pollutant/receptor relationships. We recommend future analyses to include a refinement of sensitive receptor data coverages and to report relative proportions of exceedances at varying grid scales. 27 refs., 4 figs., 1 tab
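The effects-based approach reduces, in miniature, to comparing a pollutant field against the level a sensitive receptor can tolerate and mapping where that level is exceeded. The concentration field and critical level below are synthetic and assumed, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
so2 = rng.gamma(shape=2.0, scale=4.0, size=(50, 50))  # ppb, synthetic grid
critical_level = 15.0                                  # assumed tolerance, ppb

exceedance = so2 - critical_level   # positive where degradation is expected
mask = exceedance > 0
print(f"{mask.mean():.1%} of grid cells exceed the critical level")
```

In a standards-based scheme the threshold would instead be a single air-quality limit applied everywhere; here it can vary by ecosystem, which is the distinction the abstract draws.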

  15. Assessment and mapping of slope stability based on slope units: A ...

    Indian Academy of Sciences (India)

Keywords: shallow landslide; infinite slope stability equation; return period precipitation; assessment; slope unit.

  16. Mapping Investments and Published Outputs in Norovirus Research: A Systematic Analysis of Research Funded in the United States and United Kingdom During 1997-2013.

    Science.gov (United States)

    Head, Michael G; Fitchett, Joseph R; Lichtman, Amos B; Soyode, Damilola T; Harris, Jennifer N; Atun, Rifat

    2016-02-01

    Norovirus accounts for a considerable portion of the global disease burden. Mapping national or international investments relating to norovirus research is limited. We analyzed the focus and type of norovirus research funding awarded to institutions in the United States and United Kingdom during 1997-2013. Data were obtained from key public and philanthropic funders across both countries, and norovirus-related research was identified from study titles and abstracts. Included studies were further categorized by the type of scientific investigation, and awards related to vaccine, diagnostic, and therapeutic research were identified. Norovirus publication trends are also described using data from Scopus. In total, US and United Kingdom funding investment for norovirus research was £97.6 million across 349 awards; 326 awards (amount, £84.9 million) were received by US institutions, and 23 awards (£12.6 million) were received by United Kingdom institutions. Combined, £81.2 million of the funding (83.2%) was for preclinical research, and £16.4 million (16.8%) was for translational science. Investments increased from £1.7 million in 1997 to £11.8 million in 2013. Publication trends showed a consistent temporal increase from 48 in 1997 to 182 in 2013. Despite increases over time, trends in US and United Kingdom funding for norovirus research clearly demonstrate insufficient translational research and limited investment in diagnostics, therapeutics, or vaccine research. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  17. A Class-Specific Optimizing Compiler

    Directory of Open Access Journals (Sweden)

    Michael D. Sharp

    1993-01-01

Full Text Available Class-specific optimizations are compiler optimizations specified by the class implementor to the compiler. They allow the compiler to take advantage of the semantics of the particular class so as to produce better code. Optimizations of interest include the strength reduction of class::array address calculations, elimination of large temporaries, and the placement of asynchronous send/recv calls so as to achieve computation/communication overlap. We will outline our progress towards the implementation of a C++ compiler capable of incorporating class-specific optimizations.

  18. Geoelectric hazard maps for the Mid-Atlantic United States: 100 year extreme values and the 1989 magnetic storm

    Science.gov (United States)

    Love, Jeffrey J.; Lucas, Greg M.; Kelbert, Anna; Bedrosian, Paul A.

    2018-01-01

    Maps of extreme value geoelectric field amplitude are constructed for the Mid‐Atlantic United States, a region with high population density and critically important power grid infrastructure. Geoelectric field time series for the years 1983–2014 are estimated by convolving Earth surface impedances obtained from 61 magnetotelluric survey sites across the Mid‐Atlantic with historical 1 min (2 min Nyquist) measurements of geomagnetic variation obtained from a nearby observatory. Statistical models are fitted to the maximum geoelectric amplitudes occurring during magnetic storms, and extrapolations made to estimate threshold amplitudes only exceeded, on average, once per century. For the Mid‐Atlantic region, 100 year geoelectric exceedance amplitudes have a range of almost 3 orders of magnitude (from 0.04 V/km at a site in southern Pennsylvania to 24.29 V/km at a site in central Virginia), and they have significant geographic granularity, all of which is due to site‐to‐site differences in magnetotelluric impedance. Maps of these 100 year exceedance amplitudes resemble those of the estimated geoelectric amplitudes attained during the March 1989 magnetic storm, and, in that sense, the March 1989 storm resembles what might be loosely called a “100 year” event. The geoelectric hazard maps reported here stand in stark contrast with the 100 year geoelectric benchmarks developed for the North American Electric Reliability Corporation.
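The extrapolation step the abstract describes, fitting a statistical model to storm maxima and reading off the level exceeded on average once per century, can be sketched with a generalized extreme-value fit to block maxima. The synthetic annual maxima below stand in for the 1983-2014 geoelectric amplitudes, and the paper's actual statistical model may differ:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
annual_max = rng.gumbel(loc=0.5, scale=0.3, size=32)  # V/km, one per year

# Fit a GEV to the block maxima, then take the 99th percentile: the
# level exceeded on average once per 100 years (the return level).
shape, loc, scale = genextreme.fit(annual_max)
level_100yr = genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc, scale)
print(f"100 yr return level ~ {level_100yr:.2f} V/km")
```

Repeating this fit site by site, with each site's own magnetotelluric impedance driving the amplitudes, is what produces the geographic granularity in the hazard maps.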

  19. Geoelectric Hazard Maps for the Mid-Atlantic United States: 100 Year Extreme Values and the 1989 Magnetic Storm

    Science.gov (United States)

    Love, Jeffrey J.; Lucas, Greg M.; Kelbert, Anna; Bedrosian, Paul A.

    2018-01-01

    Maps of extreme value geoelectric field amplitude are constructed for the Mid-Atlantic United States, a region with high population density and critically important power grid infrastructure. Geoelectric field time series for the years 1983-2014 are estimated by convolving Earth surface impedances obtained from 61 magnetotelluric survey sites across the Mid-Atlantic with historical 1 min (2 min Nyquist) measurements of geomagnetic variation obtained from a nearby observatory. Statistical models are fitted to the maximum geoelectric amplitudes occurring during magnetic storms, and extrapolations made to estimate threshold amplitudes only exceeded, on average, once per century. For the Mid-Atlantic region, 100 year geoelectric exceedance amplitudes have a range of almost 3 orders of magnitude (from 0.04 V/km at a site in southern Pennsylvania to 24.29 V/km at a site in central Virginia), and they have significant geographic granularity, all of which is due to site-to-site differences in magnetotelluric impedance. Maps of these 100 year exceedance amplitudes resemble those of the estimated geoelectric amplitudes attained during the March 1989 magnetic storm, and, in that sense, the March 1989 storm resembles what might be loosely called a "100 year" event. The geoelectric hazard maps reported here stand in stark contrast with the 100 year geoelectric benchmarks developed for the North American Electric Reliability Corporation.

  20. Mapping water availability, cost and projected consumptive use in the eastern United States with comparisons to the west

    Science.gov (United States)

    Tidwell, Vincent C.; Moreland, Barbie D.; Shaneyfelt, Calvin R.; Kobos, Peter

    2018-01-01

    The availability of freshwater supplies to meet future demand is a growing concern. Water availability metrics are needed to inform future water development decisions. With the help of water managers, water availability was mapped for over 1300 watersheds throughout the 31 contiguous states in the eastern US, complementing a prior study of the west. The compiled set of water availability data is unique in that it considers multiple sources of water (fresh surface and groundwater, wastewater and brackish groundwater); accommodates institutional controls placed on water use; is accompanied by cost estimates to access, treat and convey each unique source of water; and is compared to projected future growth in consumptive water use to 2030. Although few administrative limits have been set on water availability in the east, water managers have identified 315 fresh surface water and 398 fresh groundwater basins (with 151 overlapping basins) as areas of concern (AOCs) where water supply challenges exist due to drought-related concerns, environmental flows, groundwater overdraft, or salt water intrusion. This highlights a difference in management: AOCs identified in the east simply require additional permitting, while in the west strict administrative limits are established. Although the east is generally considered ‘water rich’, roughly a quarter of the basins were identified as AOCs; this is still in strong contrast to the west, where 78% of the surface water basins are operating at or near their administrative limit. Little effort was noted on the part of eastern or western water managers to quantify non-fresh water resources.
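The basin counts quoted above can be cross-checked with inclusion-exclusion: 315 surface-water AOCs and 398 groundwater AOCs overlapping in 151 basins give the number of distinct AOC basins, and the "roughly a quarter" figure presumably refers to the surface-water count against the ~1300 mapped watersheds (that reading is an assumption, not stated in the abstract).

```python
# Inclusion-exclusion check of the AOC counts quoted in the abstract.
surface, ground, both = 315, 398, 151
distinct_aocs = surface + ground - both      # basins flagged for either source
print(distinct_aocs, round(surface / 1300, 2))  # 562 0.24
```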

  1. Geologic map of Harrat Hutaymah, with petrologic classification and distribution of ultramafic inclusions, Saudi Arabia

    Science.gov (United States)

    Thornber, Carl R.

    1990-01-01

    This map shows detailed geology of the Quaternary and Tertiary volcanic deposits that comprise Harrat Hutaymah and an updated and generalized compilation of the underlying Proterozoic and Paleozoic basement rocks. Quaternary alluvial cover and details of basement geology (that is, faults, dikes, and other features) are not shown. Volcanic unit descriptions and contact relations are based upon field investigation by the author and on compilation and revision of mapping by Kellogg (1984; northern half of area) and Pallister (1984; southern half of area). A single K-Ar date of 1.80 ± 0.05 Ma for an alkali olivine basalt flow transected by the Al Hutaymah tuff ring (Pallister, 1984) provides the basis for an estimated late Tertiary to Quaternary age range for all harrat volcanic units other than unit Qtr (tuff reworked during Quaternary time). Contact relations and unit descriptions for the basement rocks were compiled from Pallister (1984), Kellogg (1984 and 1985), DuBray (1984), Johnson and Williams (1984), Vaslet and others (1987), Cole and Hedge (1986), and Richter and others (1984). All rock unit names in this report are informal and capitalization follows Saudi Arabian stratigraphic nomenclature (Fitch, 1980). Geographic information was compiled from Pallister (1984), Kellogg (1984), and Fuller (in Johnson and Williams, 1984) and from field investigation by the author in 1986. The pie diagrams on the map show the distribution and petrology of ultramafic xenoliths of Harrat Hutaymah. The pie diagrams are explained by a detailed classification of ultramafic xenoliths that is introduced in this report.

  2. Mapping marginal croplands suitable for cellulosic feedstock crops in the Great Plains, United States

    Science.gov (United States)

    Gu, Yingxin; Wylie, Bruce K.

    2016-01-01

    Growing cellulosic feedstock crops (e.g., switchgrass) for biofuel is more environmentally sustainable than corn-based ethanol. Specifically, this practice can reduce soil erosion and water quality impairment from pesticides and fertilizer, improve ecosystem services and sustainability (e.g., serve as carbon sinks), and minimize impacts on global food supplies. The main goal of this study was to identify high-risk marginal croplands that are potentially suitable for growing cellulosic feedstock crops (e.g., switchgrass) in the US Great Plains (GP). Satellite-derived growing season Normalized Difference Vegetation Index, a switchgrass biomass productivity map obtained from a previous study, US Geological Survey (USGS) irrigation and crop masks, and US Department of Agriculture (USDA) crop indemnity maps for the GP were used in this study. Our hypothesis was that croplands with relatively low crop yield but high productivity potential for switchgrass may be suitable for converting to switchgrass. Areas with relatively low crop indemnity (crop indemnity marginal croplands in the GP are potentially suitable for switchgrass development. The total estimated switchgrass biomass productivity gain from these suitable areas is about 5.9 million metric tons. Switchgrass can be cultivated in either lowland or upland regions in the GP depending on the local soil and environmental conditions. This study improves our understanding of ecosystem services and the sustainability of cropland systems in the GP. Results from this study provide useful information to land managers for making informed decisions regarding switchgrass development in the GP.
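The selection logic described above (croplands with low current-crop performance but high modeled switchgrass productivity) is essentially a raster overlay with thresholds. The sketch below is a toy version: all rasters, thresholds, and units are invented, and the real study used NDVI, crop indemnity, and irrigation/crop masks rather than these stand-ins.

```python
import numpy as np

# Toy raster overlay of the suitability rule: cropland cells with low
# current-crop performance but high modeled switchgrass productivity.
rng = np.random.default_rng(1)
crop_ndvi = rng.uniform(0.2, 0.9, (50, 50))          # growing-season NDVI (proxy)
switchgrass_prod = rng.uniform(0.0, 12.0, (50, 50))  # modeled biomass, t/ha (invented)
is_cropland = rng.random((50, 50)) < 0.6             # crop mask (invented)

suitable = is_cropland & (crop_ndvi < 0.45) & (switchgrass_prod > 8.0)
gain = switchgrass_prod[suitable].sum()  # potential biomass from converted cells
print(int(suitable.sum()), round(float(gain), 1))
```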

  3. Mapping water availability, projected use and cost in the western United States

    Science.gov (United States)

    Tidwell, Vincent C.; Moreland, Barbara D.; Zemlick, Katie M.; Roberts, Barry L.; Passell, Howard D.; Jensen, Daniel; Forsgren, Christopher; Sehlke, Gerald; Cook, Margaret A.; King, Carey W.; Larsen, Sara

    2014-05-01

    New demands for water can be satisfied through a variety of source options. In some basins surface and/or groundwater may be available through permitting with the state water management agency (termed unappropriated water), alternatively water might be purchased and transferred out of its current use to another (termed appropriated water), or non-traditional water sources can be captured and treated (e.g., wastewater). The relative availability and cost of each source are key factors in the development decision. Unfortunately, these measures are location dependent with no consistent or comparable set of data available for evaluating competing water sources. With the help of western water managers, water availability was mapped for over 1200 watersheds throughout the western US. Five water sources were individually examined, including unappropriated surface water, unappropriated groundwater, appropriated water, municipal wastewater and brackish groundwater. Also mapped was projected change in consumptive water use from 2010 to 2030. Associated costs to acquire, convey and treat the water, as necessary, for each of the five sources were estimated. These metrics were developed to support regional water planning and policy analysis with initial application to electric transmission planning in the western US.

  4. A Symmetric Approach to Compilation and Decompilation

    DEFF Research Database (Denmark)

    Ager, Mads Sig; Danvy, Olivier; Goldberg, Mayer

    2002-01-01

    Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice of...

  5. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    Business files compilation for an enterprise is a distillation and recreation of its spiritual wealth, from which applicable information can be made available to those who want to use it in a fast, extensive and precise way. Proceeding from the effects of business files compilation on scientific research, productive construction and development, this paper discusses in five points how to define topics, analyze historical materials, search and select data, and process them into an enterprise archives collection. Firstly, it expounds the importance and necessity of business files compilation in the production, operation and development of a company; secondly, it presents processing methods from topic definition, material searching and data selection to final examination and correction; thirdly, it defines principles and classifications so that different categories and levels of processing methods are available for business files compilation; fourthly, it discusses the specific method of implementing a file compilation through a documentation collection, upon the principle of gearing topic definition to demand; fifthly, it addresses the application of information technology to business files compilation, in view of the wide need for business files, so as to raise the level of enterprise archives management. The present discussion focuses on the examination and correction principles of enterprise historical material compilation, the basic classifications, and the major forms of business files compilation achievements. (author)

  6. Hierarchical Object-Based Mapping of Riverscape Units and in-Stream Mesohabitats Using LiDAR and VHR Imagery

    Directory of Open Access Journals (Sweden)

    Luca Demarchi

    2016-01-01

    Full Text Available In this paper, we present a new, semi-automated methodology for mapping hydromorphological indicators of rivers at a regional scale using multisource remote sensing (RS) data. This novel approach is based on the integration of spectral and topographic information within a multilevel, geographic, object-based image analysis (GEOBIA). Different segmentation levels were generated based on the two sources of RS data, namely very-high spatial resolution, near-infrared imagery (VHR) and high-resolution LiDAR topography. At each level, different input object features were tested with machine learning classifiers for mapping riverscape units and in-stream mesohabitats. The GEOBIA approach proved to be a powerful tool for analyzing the river system at different levels of detail and for coupling spectral and topographic datasets, allowing for the delineation of the natural fluvial corridor with its primary riverscape units (e.g., water channel, unvegetated sediment bars, riparian densely-vegetated units, etc.) and in-stream mesohabitats, with a high level of accuracy (K = 0.91 and K = 0.83, respectively). This method is flexible and can be adapted to different sources of data, with the potential to be implemented at regional scales in the future. The analyzed dataset, composed of VHR imagery and LiDAR data, is now increasingly available at larger scales, notably through European Member States. At the same time, this methodology provides a tool for monitoring and characterizing the hydromorphological status of river systems continuously along the entire channel network and coherently through time, opening novel and significant perspectives for river science and management, notably for planning and targeting actions.
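The multilevel object-based idea (segment a scene, derive per-object features from imagery and LiDAR, classify each object) can be caricatured in a few lines. Everything below is invented: the rasters, the crude two-segment partition, the thresholds, and the two class names are illustrative stand-ins, not the paper's segmentation or classifiers.

```python
import numpy as np

# Cartoon GEOBIA pipeline: partition a scene into objects, compute per-object
# features from two sources (NIR imagery + LiDAR DEM), classify by rules.
nir = np.full((8, 8), 0.1); nir[:, 4:] = 0.6   # water (low NIR) vs vegetation
dem = np.zeros((8, 8));     dem[:, 4:] = 2.0   # vegetated bank sits higher

objects = (nir > 0.3).astype(int) + 1          # crude 2-segment partition
classes = {}
for obj_id in np.unique(objects):
    mask = objects == obj_id
    mean_nir, mean_dem = nir[mask].mean(), dem[mask].mean()
    classes[int(obj_id)] = ("riparian vegetation"
                            if mean_nir > 0.3 and mean_dem > 1.0 else "water")
print(classes)  # {1: 'water', 2: 'riparian vegetation'}
```

The point of the object-based step is that the classification rule sees per-object statistics (coupling spectral and topographic evidence) rather than single pixels.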

  7. Geologic map and map database of northeastern San Francisco Bay region, California, [including] most of Solano County and parts of Napa, Marin, Contra Costa, San Joaquin, Sacramento, Yolo, and Sonoma Counties

    Science.gov (United States)

    Graymer, Russell Walter; Jones, David Lawrence; Brabb, Earl E.

    2002-01-01

    This digital map database, compiled from previously published and unpublished data, and new mapping by the authors, represents the general distribution of bedrock and surficial deposits in the mapped area. Together with the accompanying text file (nesfmf.ps, nesfmf.pdf, nesfmf.txt), it provides current information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:62,500 or smaller.

  8. Engineering Amorphous Systems, Using Global-to-Local Compilation

    Science.gov (United States)

    Nagpal, Radhika

    Emerging technologies are making it possible to assemble systems that incorporate myriads of information-processing units at almost no cost: smart materials, self-assembling structures, vast sensor networks, pervasive computing. How does one engineer robust and prespecified global behavior from the local interactions of immense numbers of unreliable parts? We discuss organizing principles and programming methodologies that have emerged from Amorphous Computing research that allow us to compile a specification of global behavior into a robust program for local behavior.
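A classic building block of such global-to-local compilation is the hop-count gradient: each cell repeats the purely local rule "my value = 1 + minimum of my neighbors' values", and the emergent global result is the distance field from a source. The simulation below is a minimal sketch of that primitive (the graph and synchronous update scheme are illustrative, not from the article).

```python
# Local rule -> global behavior: a hop-count gradient, a standard
# amorphous-computing primitive. Cells know only their neighbors.
def gradient(neighbors, source):
    dist = {cell: float("inf") for cell in neighbors}
    dist[source] = 0
    changed = True
    while changed:                  # synchronous rounds of the local rule
        changed = False
        for cell in neighbors:
            if cell == source:
                continue
            best = 1 + min(dist[n] for n in neighbors[cell])
            if best < dist[cell]:
                dist[cell] = best
                changed = True
    return dist

# 1-D chain of 5 cells: 0 - 1 - 2 - 3 - 4
chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(gradient(chain, 0))  # {0: 0, 1: 1, 2: 2, 3: 3, 4: 4}
```

The fixpoint is the globally correct distance field even though no cell ever sees the whole network, which is the flavor of robustness the article's compilation methodology targets.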

  9. A spatial database of bedding attitudes to accompany Geologic map of the greater Denver area, Front Range Urban Corridor, Colorado

    Science.gov (United States)

    Trimble, Donald E.; Machette, Michael N.; Brandt, Theodore R.; Moore, David W.; Murray, Kyle E.

    2003-01-01

    This digital map shows bedding attitude symbols displayed over the geographic extent of surficial deposits and rock stratigraphic units (formations) as compiled by Trimble and Machette from 1973 to 1977 and published in 1979 (U.S. Geological Survey Map I-856-H) under the Front Range Urban Corridor Geology Program. Trimble and Machette compiled their geologic map from published geologic maps and unpublished geologic mapping having varied map unit schemes. A convenient feature of the compiled map is its uniform classification of geologic units, which mostly matches those of companion maps to the north (USGS I-855-G) and to the south (USGS I-857-F). Published as a color paper map, the Trimble and Machette map was intended for land-use planning in the Front Range Urban Corridor. This map was recently (1997-1999) digitized under the USGS Front Range Infrastructure Resources Project (see cross-reference). In general, the mountainous areas in the west part of the map exhibit various igneous and metamorphic bedrock units of Precambrian age, major faults, and fault brecciation zones at the east margin (5-20 km wide) of the Front Range. The eastern and central parts of the map (Colorado Piedmont) depict a mantle of unconsolidated deposits of Quaternary age and interspersed outcroppings of Cretaceous or Tertiary-Cretaceous sedimentary bedrock. The Quaternary mantle comprises eolian deposits (quartz sand and silt), alluvium (gravel, sand, and silt of variable composition), colluvium, and a few landslides. At the mountain front, north-trending, dipping Paleozoic and Mesozoic sandstone, shale, and limestone bedrock formations form hogbacks and intervening valleys.

  10. Circum-North Pacific tectonostratigraphic terrane map

    Science.gov (United States)

    Nokleberg, Warren J.; Parfenov, Leonid M.; Monger, James W.H.; Baranov, Boris B.; Byalobzhesky, Stanislav G.; Bundtzen, Thomas K.; Feeney, Tracey D.; Fujita, Kazuya; Gordey, Steven P.; Grantz, Arthur; Khanchuk, Alexander I.; Natal'in, Boris A.; Natapov, Lev M.; Norton, Ian O.; Patton, William W.; Plafker, George; Scholl, David W.; Sokolov, Sergei D.; Sosunov, Gleb M.; Stone, David B.; Tabor, Rowland W.; Tsukanov, Nickolai V.; Vallier, Tracy L.; Wakita, Koji

    1994-01-01

    The companion tectonostratigraphic terrane and overlap assemblage map of the Circum-North Pacific presents a modern description of the major geologic and tectonic units of the region. The map illustrates both the onshore terranes and overlap volcanic assemblages of the region, and the major offshore geologic features. The map is the first collaborative compilation of the geology of the region at a scale of 1:5,000,000 by geologists of the Russian Far East, Japan, Alaska, Canada, and the U.S.A. Pacific Northwest. The map is designed to be a source of geologic information for all scientists interested in the region, and is designed to be used for several purposes, including regional tectonic analyses, mineral resource and metallogenic analyses (Nokleberg and others, 1993, 1994a), petroleum analyses, neotectonic analyses, and analyses of seismic hazards and volcanic hazards. This text contains an introduction, tectonic definitions, acknowledgments, descriptions of postaccretion stratified rock units, descriptions and stratigraphic columns for tectonostratigraphic terranes in onshore areas, and references for the companion map (Sheets 1 to 5). This map is the result of extensive geologic mapping and associated tectonic studies in the Russian Far East, Hokkaido Island of Japan, Alaska, the Canadian Cordillera, and the U.S.A. Pacific Northwest in the last few decades. Geologic mapping suggests that most of this region can be interpreted as a collage of fault-bounded tectonostratigraphic terranes that were accreted onto continental margins around the Circum-

  11. Advanced competencies mapping of critical care nursing: a qualitative research in two Intensive Care Units.

    Science.gov (United States)

    Alfieri, Emanuela; Mori, Marina; Barbui, Valentina; Sarli, Leopoldo

    2017-07-18

    Nowadays, in Italy, the nursing profession has undergone important changes in response to citizens' health needs and to improve the quality of the country's health service. Underlying this development is an increase in nurses' knowledge, competencies and responsibilities. Currently, the presence of nurses who have followed post-basic training paths, with the subsequent acquisition of advanced clinical knowledge and specializations, has made competency mappings essential for each specialty, also to differentiate these nurses from general care nurses. The objective is to map the individual competencies of nurses working in critical care, analyzing the context of the Parma Hospital and comparing it with the Lebanon Heart Hospital in Lebanon. The survey was conducted through a series of interviews involving some of the hospital staff, in order to collect opinions about ICU nurses' competencies. What emerged from the data allowed us to compile a list of important abilities, competencies, character traits and intensive care nurse activities. Italian and Lebanese nurses appear to be well prepared from a technical point of view, with a desire for improvement through specializations, master's programs and enabling courses in advanced health maneuvers. The respondents convey a strong desire for professional improvement. At the end of our research we were able to draw up a list of different individual competencies and behavioral and moral characteristics. The nurse figure has high potential and large prospects for professional improvement, if taken more into account by the health system.

  12. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of airborne surveys of different sizes, resolutions, and acquisition years. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for, e.g., oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilation of magnetic data and the merger of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still very little recognized. The maximum wavelength that can be resolved by each individual survey is directly related to the survey size, and consequently a merger will introduce erroneous long-wavelength components into the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine the airborne compilation with satellite magnetic data, which is sufficient only under particular preconditions.
A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated
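The merging idea, taking long wavelengths from satellite data and short wavelengths from the airborne patchwork, can be shown in a 1-D frequency-domain toy. The profiles, the bias on the airborne long wavelengths, and the cutoff below are all invented; real compilations use tapered transitions and 2-D grids rather than this hard cutoff.

```python
import numpy as np

# 1-D sketch of wavelength merging: long wavelengths from a satellite-like
# profile, short wavelengths from an airborne-like one, blended in the
# frequency domain with an (arbitrary) hard cutoff.
n = 512
x = np.arange(n)
truth = np.sin(2 * np.pi * x / 256) + 0.3 * np.sin(2 * np.pi * x / 16)
airborne = truth - 0.5 * np.sin(2 * np.pi * x / 256)  # biased long wavelengths
satellite = np.sin(2 * np.pi * x / 256)               # long wavelengths only

k = np.fft.rfftfreq(n)                                # cycles per sample
lowpass = k < 1 / 64                                  # cutoff wavelength: 64 samples
merged_f = np.where(lowpass, np.fft.rfft(satellite), np.fft.rfft(airborne))
merged = np.fft.irfft(merged_f, n=n)
print(round(float(np.abs(merged - truth).max()), 3))  # 0.0: exact recovery here
```

Recovery is exact only because both sine components fall on FFT bins; with real data, spectral leakage and the frequency gap between the two datasets make the blend much more delicate, which is the pitfall the abstract emphasizes.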

  13. Mapping grasslands suitable for cellulosic biofuels in the Greater Platte River Basin, United States

    Science.gov (United States)

    Wylie, Bruce K.; Gu, Yingxin

    2012-01-01

    Biofuels are an important component in the development of alternative energy supplies, which is needed to achieve national energy independence and security in the United States. The most common biofuel product today in the United States is corn-based ethanol; however, its development is limited because of concerns about global food shortages, livestock and food price increases, and water demand increases for irrigation and ethanol production. Corn-based ethanol also potentially contributes to soil erosion, and pesticides and fertilizers affect water quality. Studies indicate that future potential production of cellulosic ethanol is likely to be much greater than grain- or starch-based ethanol. As a result, economics and policy incentives could, in the near future, encourage expansion of cellulosic biofuels production from grasses, forest woody biomass, and agricultural and municipal wastes. If production expands, cultivation of cellulosic feedstock crops, such as switchgrass (Panicum virgatum L.) and miscanthus (Miscanthus species), is expected to increase dramatically. The main objective of this study is to identify grasslands in the Great Plains that are potentially suitable for cellulosic feedstock (such as switchgrass) production. Producing ethanol from noncropland holdings (such as grassland) will minimize the effects of biofuel developments on global food supplies. Our pilot study area is the Greater Platte River Basin, which includes a broad range of plant productivity from semiarid grasslands in the west to the fertile corn belt in the east. The Greater Platte River Basin was the subject of related U.S. Geological Survey (USGS) integrated research projects.

  14. Mapping the potential distribution of the invasive Red Shiner, Cyprinella lutrensis (Teleostei: Cyprinidae) across waterways of the conterminous United States

    Science.gov (United States)

    Poulos, Helen M.; Chernoff, Barry; Fuller, Pam L.; Butman, David

    2012-01-01

    Predicting the future spread of non-native aquatic species continues to be a high priority for natural resource managers striving to maintain biodiversity and ecosystem function. Modeling the potential distributions of alien aquatic species through spatially explicit mapping is an increasingly important tool for risk assessment and prediction. Habitat modeling also facilitates the identification of key environmental variables influencing species distributions. We modeled the potential distribution of an aggressive invasive minnow, the red shiner (Cyprinella lutrensis), in waterways of the conterminous United States using maximum entropy (Maxent). We used inventory records from the USGS Nonindigenous Aquatic Species Database, native records for C. lutrensis from museum collections, and a geographic information system of 20 raster climatic and environmental variables to produce a map of potential red shiner habitat. Summer climatic variables were the most important environmental predictors of C. lutrensis distribution, which was consistent with the high temperature tolerance of this species. Results from this study provide insights into the locations and environmental conditions in the US that are susceptible to red shiner invasion.
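The study itself used Maxent; as a hedged stand-in, the sketch below fits a minimal presence/background classifier (logistic regression by plain gradient descent) on a single invented summer-temperature predictor, to illustrate how climate variables map to a habitat-suitability score. Nothing here reproduces the paper's model or data.

```python
import numpy as np

# Not Maxent: a minimal logistic-regression suitability model on one
# synthetic predictor, illustrating climate-driven habitat scoring.
rng = np.random.default_rng(2)
n = 400
summer_temp = rng.uniform(10, 35, n)             # invented predictor, deg C
presence = (summer_temp > 22).astype(float)      # warm-water species signal
X = np.column_stack([np.ones(n),
                     (summer_temp - summer_temp.mean()) / summer_temp.std()])

w = np.zeros(2)
for _ in range(2000):                            # plain gradient descent
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - presence) / n

suitability = 1 / (1 + np.exp(-X @ w))
print(round(float(suitability[summer_temp > 30].mean()), 2))
```

Warm sites score near 1 and cool sites near 0, matching the abstract's finding that summer climatic variables dominate the predicted distribution of this high-temperature-tolerant species.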

  15. Regulatory and technical reports compilation for 1980

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzi, L.

    1981-04-01

    This compilation lists formal regulatory and technical reports and conference proceedings issued in 1980 by the US Nuclear Regulatory Commission. The compilation is divided into four major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The second major section of this compilation consists of a key-word index to report titles. The third major section contains an alphabetically arranged listing of contractor report numbers cross-referenced to their corresponding NRC report numbers. Finally, the fourth section is an errata supplement

  16. Mapping and modeling the biogeochemical cycling of turf grasses in the United States.

    Science.gov (United States)

    Milesi, Cristina; Running, Steven W; Elvidge, Christopher D; Dietz, John B; Tuttle, Benjamin T; Nemani, Ramakrishna R

    2005-09-01

    Turf grasses are ubiquitous in the urban landscape of the United States and are often associated with various types of environmental impacts, especially on water resources, yet there have been limited efforts to quantify their total surface area and ecosystem functioning, such as their total impact on the continental water budget and potential net ecosystem exchange (NEE). In this study, relating turf grass area to an estimate of fractional impervious surface area, it was calculated that potentially 163,800 km2 (+/- 35,850 km2) of land are cultivated with turf grasses in the continental United States, an area three times larger than that of any irrigated crop. Using the Biome-BGC ecosystem process model, the growth of warm-season and cool-season turf grasses was modeled at a number of sites across the 48 conterminous states under different management scenarios, simulating potential carbon and water fluxes as if the entire turf surface were managed like a well-maintained lawn. The results indicate that well-watered and fertilized turf grasses act as a carbon sink. The potential NEE that could derive from the total surface potentially under turf (up to 17 Tg C/yr with the simulated scenarios) would require 695 to 900 liters of water per person per day, depending on the modeled irrigation practices, suggesting that outdoor water conservation practices such as xeriscaping and irrigation with recycled wastewater may need to be extended as many municipalities continue to face increasing pressures on freshwater.
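To get a feel for the per-capita irrigation figure, it can be converted to a national yearly volume. The population figure below (~296 million, roughly the mid-2005 U.S. population) is an assumption added for illustration; only the 900 L/person/day value comes from the abstract.

```python
# Illustrative unit conversion: liters per person per day -> km^3 per year.
# Population is an assumed figure (~mid-2005 U.S.), not from the abstract.
liters_per_person_day = 900
population = 296e6
km3_per_year = liters_per_person_day * population * 365 / 1e12  # 1 km^3 = 1e12 L
print(round(km3_per_year, 1))  # 97.2
```

On the order of 100 km^3/yr, which conveys why the authors flag outdoor water conservation.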

  17. Mapping and Assessment of the United States Ocean Wave Energy Resource

    Energy Technology Data Exchange (ETDEWEB)

    Jacobson, Paul T; Hagerman, George; Scott, George

    2011-12-01

    This project estimates the naturally available and technically recoverable U.S. wave energy resources, using a 51-month Wavewatch III hindcast database developed especially for this study by the National Oceanic and Atmospheric Administration's (NOAA's) National Centers for Environmental Prediction. For total resource estimation, wave power density in terms of kilowatts per meter is aggregated across a unit diameter circle. This approach is fully consistent with accepted global practice and includes the resource made available by the lateral transfer of wave energy along wave crests, which enables wave diffraction to substantially reestablish wave power densities within a few kilometers of a linear array, even for fixed terminator devices. The total available wave energy resource along the U.S. continental shelf edge, based on accumulating unit circle wave power densities, is estimated to be 2,640 TWh/yr, broken down as follows: 590 TWh/yr for the West Coast, 240 TWh/yr for the East Coast, 80 TWh/yr for the Gulf of Mexico, 1,570 TWh/yr for Alaska, 130 TWh/yr for Hawaii, and 30 TWh/yr for Puerto Rico. The total recoverable wave energy resource, as constrained by an array capacity packing density of 15 megawatts per kilometer of coastline, with a 100-fold operating range between threshold and maximum operating conditions in terms of input wave power density available to such arrays, yields a total recoverable resource along the U.S. continental shelf edge of 1,170 TWh/yr, broken down as follows: 250 TWh/yr for the West Coast, 160 TWh/yr for the East Coast, 60 TWh/yr for the Gulf of Mexico, 620 TWh/yr for Alaska, 80 TWh/yr for Hawaii, and 20 TWh/yr for Puerto Rico.
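The packing-density constraint lends itself to a back-of-envelope conversion: capacity in MW per km of coastline, times coastline length and an average capacity factor, gives annual energy. The coastline length and capacity factor below are illustrative assumptions, not values from the study, so the output is not meant to reproduce the regional totals above.

```python
# Back-of-envelope: 15 MW per km of coastline at an assumed capacity
# factor -> TWh/yr. Coastline length and capacity factor are invented.
def annual_twh(coast_km, mw_per_km=15.0, capacity_factor=0.3):
    hours = 8760.0                       # hours per year
    return coast_km * mw_per_km * capacity_factor * hours / 1e6

print(round(annual_twh(2000.0), 1))  # 78.8 (for a hypothetical 2000 km coast)
```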

  18. Resonating, Rejecting, Reinterpreting: Mapping the Stabilization Discourse in the United Nations Security Council, 2000–14

    Directory of Open Access Journals (Sweden)

    David Curran

    2015-10-01

    Full Text Available This article charts the evolution of the conceptualisation of stabilization in the UN Security Council (UNSC) during the period 2001–2014. UNSC open meetings provide an important dataset for a critical review of stabilization discourse and an opportunity to chart the positions of permanent Members, rotating Members and the UN Secretariat towards this concept. This article is the first to conduct an analysis of this material to map the evolution of stabilization in this critical chamber of the UN. This dataset of official statements will be complemented by a review of open source reporting on UNSC meetings and national stabilization doctrines of the ‘P3’ – France, the UK and the US. These countries have developed national stabilization doctrines predominantly to deal with cross-governmental approaches to counterinsurgency operations conducted during the 2000s. The article therefore presents a genealogy of the concept of stabilization in the UNSC to help understand implications for its future development in this multilateral setting. This article begins by examining efforts by the P3 to ‘upload’ their conceptualisations of stabilization into UN intervention frameworks. Secondly, the article uses a content analysis of UNSC debates during 2000–2014 to explore the extent to which the conceptualisation of stabilization resonated with other Council members, were rejected in specific contexts or in general, or were re-interpreted by member states to suit alternative security agendas and interests. Therefore, the article not only examines the UNSC debates surrounding existing UN ‘stabilization operations’ (MONUSCO, MINUSTAH, MINUSCA, MINUSMA, which could be regarded as evidence that this ‘western’ concept has resonated with other UNSC members and relevant UN agencies, but also documents the appearance of stabilization in other contexts too. The article opens new avenues of research into concepts of stabilization within the UN, and

  19. Auxiliary variables for the mapping of the drainage network: spatial correlation between relief units, lithotypes and springs in Benevente River basin-ES

    Directory of Open Access Journals (Sweden)

    Tony Vinicius Moreira Sampaio

    2014-12-01

    Full Text Available Processes of drainage network mapping present methodological limitations resulting in inaccurate maps, restricting their use in environmental studies. Such problems demand long field surveys to verify the errors, and a search for auxiliary variables to optimize this work and make analysis of map accuracy possible. This research aims at measuring the correlation between springs, lithotypes and relief units, characterized by the Roughness Concentration Index (RCI), in the Benevente River basin-ES, focusing on operations of map algebra and the use of spatial statistical techniques. These procedures identified classes of RCI and lithotypes that present the highest and the lowest correlation with the spatial distribution of springs, indicating their potential use as auxiliary variables to verify map accuracy.
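The map-algebra step described above, overlaying spring points on a categorical raster and comparing per-class spring density against the map-wide average, can be sketched as follows. The raster, the three "RCI classes", and the spring probabilities are synthetic; a ratio above 1 indicates a positive association of that class with springs.

```python
import numpy as np

# Sketch of the overlay: per-class spring density vs the map-wide average
# on a synthetic categorical raster (3 hypothetical RCI classes).
rng = np.random.default_rng(3)
classes = rng.integers(1, 4, size=(100, 100))        # class raster, values 1..3
springs = rng.random((100, 100)) < 0.01 * classes    # density rises with class

overall = springs.mean()
for c in (1, 2, 3):
    density = springs[classes == c].mean()
    print(c, round(float(density / overall), 2))     # >1: positive association
```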

  20. Vectorization vs. compilation in query execution

    NARCIS (Netherlands)

    J. Sompolski (Juliusz); M. Zukowski (Marcin); P.A. Boncz (Peter)

    2011-01-01

    textabstractCompiling database queries into executable (sub-)programs provides substantial benefits compared to traditional interpreted execution. Many of these benefits, such as reduced interpretation overhead, better instruction code locality, and providing opportunities to use SIMD

  1. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  2. A Compilation of Internship Reports - 2012

    Energy Technology Data Exchange (ETDEWEB)

    Stegman, M.; Morris, M.; Blackburn, N.

    2012-08-08

    This compilation documents all research projects undertaken by the 2012 summer Department of Energy - Workforce Development for Teachers and Scientists interns during their internship program at Brookhaven National Laboratory.

  3. ALGOL compiler. Syntax and semantic analysis

    International Nuclear Information System (INIS)

    Tarbouriech, Robert

    1971-01-01

    In this research thesis, the author reports the development of an ALGOL compiler which performs the following main tasks: systematic scan of the source program to recognise its different components (identifiers, reserved words, constants, separators); analysis of the source program structure to build up its statements and arithmetic expressions; processing of symbolic names (identifiers) to associate them with the values they represent; and memory allocation for data and program. Several issues are thus addressed: characteristics of the machine for which the compiler is developed; exact definition of the language (grammar, identifier and constant formation); the syntax processing program that provides the compiler with the necessary elements (language vocabulary, precedence matrix); and a description of the first two phases of compilation: lexicographic analysis and syntax analysis. The last phase (machine-code generation) is not addressed

  4. A software methodology for compiling quantum programs

    Science.gov (United States)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.

  5. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part III: B-Shaped Architecture with Vertical Well in the Upper Layer.

  6. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part IV: Normal and Inverted Letter 'h' and 'H' Architecture.

  7. Agricultural production in the United States by county: a compilation of information from the 1974 census of agriculture for use in terrestrial food-chain transport and assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Shor, R.W.; Baes, C.F. III; Sharp, R.D.

    1982-01-01

    Terrestrial food-chain models that simulate the transport of environmentally released radionuclides incorporate parameters describing agricultural production and practice. Often a single set of default parameters, such as that listed in USNRC Regulatory Guide 1.109, is used in lieu of site-specific information. However, the geographical diversity of agricultural practice in the United States suggests the limitations of a single set of default parameters for assessment models. This report documents default parameters with a county-wide resolution based on analysis of the 1974 US Census of Agriculture for use in terrestrial food chain models. Data reported by county, together with state-based information from the US Department of Agriculture, Economic and Statistics Service, provided the basis for estimates of model input parameters. This report also describes these data bases, their limitations, and lists default parameters by county. Vegetable production is described for four categories: leafy vegetables; vegetables and fruits exposed to airborne material; vegetables, fruits, and nuts protected from airborne materials; and grains. Livestock feeds were analyzed in categories of hay, silage, pasture, and grains. Pasture consumption was estimated from cattle and sheep inventories, their feed requirements, and reported quantities of harvested forage. The results were compared with assumed yields of the pasture areas reported. In addition, non-vegetable food production estimates including milk, beef, pork, lamb, poultry, eggs, goat milk, and honey are described. The agricultural parameters and land use information - in all 47 items - are tabulated in four appendices for each of the 3067 counties of the US reported to the Census of Agriculture, excluding those in Hawaii and Alaska.

  8. Agricultural production in the United States by county: a compilation of information from the 1974 census of agriculture for use in terrestrial food-chain transport and assessment models

    International Nuclear Information System (INIS)

    Shor, R.W.; Baes, C.F. III; Sharp, R.D.

    1982-01-01

    Terrestrial food-chain models that simulate the transport of environmentally released radionuclides incorporate parameters describing agricultural production and practice. Often a single set of default parameters, such as that listed in USNRC Regulatory Guide 1.109, is used in lieu of site-specific information. However, the geographical diversity of agricultural practice in the United States suggests the limitations of a single set of default parameters for assessment models. This report documents default parameters with a county-wide resolution based on analysis of the 1974 US Census of Agriculture for use in terrestrial food chain models. Data reported by county, together with state-based information from the US Department of Agriculture, Economic and Statistics Service, provided the basis for estimates of model input parameters. This report also describes these data bases, their limitations, and lists default parameters by county. Vegetable production is described for four categories: leafy vegetables; vegetables and fruits exposed to airborne material; vegetables, fruits, and nuts protected from airborne materials; and grains. Livestock feeds were analyzed in categories of hay, silage, pasture, and grains. Pasture consumption was estimated from cattle and sheep inventories, their feed requirements, and reported quantities of harvested forage. The results were compared with assumed yields of the pasture areas reported. In addition, non-vegetable food production estimates including milk, beef, pork, lamb, poultry, eggs, goat milk, and honey are described. The agricultural parameters and land use information - in all 47 items - are tabulated in four appendices for each of the 3067 counties of the US reported to the Census of Agriculture, excluding those in Hawaii and Alaska

  9. An Efficient Compiler for Weighted Rewrite Rules

    OpenAIRE

    Mohri, Mehryar; Sproat, Richard

    1996-01-01

    Context-dependent rewrite rules are used in many areas of natural language and speech processing. Work in computational phonology has demonstrated that, given certain conditions, such rewrite rules can be represented as finite-state transducers (FSTs). We describe a new algorithm for compiling rewrite rules into FSTs. We show the algorithm to be simpler and more efficient than existing algorithms. Further, many of our applications demand the ability to compile weighted rules into weighted FST...

  10. Compilation of Sandia Laboratories technical capabilities

    International Nuclear Information System (INIS)

    Lundergan, C.D.; Mead, P.L.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078)

  11. Compilation of Sandia Laboratories technical capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lundergan, C. D.; Mead, P. L. [eds.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078). (RWR)

  12. Electronic circuits for communications systems: A compilation

    Science.gov (United States)

    1972-01-01

    The compilation of electronic circuits for communications systems is divided into thirteen basic categories, each representing an area of circuit design and application. The compilation items are moderately complex and, as such, would appeal to the applications engineer. However, the rationale for the selection criteria was tailored so that the circuits would reflect fundamental design principles and applications, with an additional requirement for simplicity whenever possible.

  13. Mapping Antimicrobial Stewardship in Undergraduate Medical, Dental, Pharmacy, Nursing and Veterinary Education in the United Kingdom.

    Directory of Open Access Journals (Sweden)

    Enrique Castro-Sánchez

    Full Text Available To investigate the teaching of antimicrobial stewardship (AS) in undergraduate healthcare educational degree programmes in the United Kingdom (UK). Cross-sectional survey of undergraduate programmes in human and veterinary medicine, dentistry, pharmacy and nursing in the UK. The main outcome measures included prevalence of AS teaching; stewardship principles taught; estimated hours apportioned; mode of content delivery and teaching strategies; evaluation methodologies; and frequency of multidisciplinary learning. 80% (112/140) of programmes responded adequately. The majority of programmes teach AS principles (88/109, 80.7%). 'Adopting necessary infection prevention and control precautions' was the most frequently taught principle (83/88, 94.3%), followed by 'timely collection of microbiological samples for microscopy, culture and sensitivity' (73/88, 82.9%) and 'minimisation of unnecessary antimicrobial prescribing' (72/88, 81.8%). The 'use of intravenous administration only in patients who are severely ill, or unable to tolerate oral treatment' was reported in ~50% of courses. Only 32/88 (36.3%) programmes included all recommended principles. Antimicrobial stewardship principles are included in most undergraduate healthcare and veterinary degree programmes in the UK. However, future professionals responsible for using antimicrobials receive disparate education. Education may be boosted by standardisation and strengthening of less frequently discussed principles.

  14. Mapping watershed potential to contribute phosphorus from geologic materials to receiving streams, southeastern United States

    Science.gov (United States)

    Terziotti, Silvia; Hoos, Anne B.; Harned, Douglas; Garcia, Ana Maria

    2010-01-01

    As part of the southeastern United States SPARROW (SPAtially Referenced Regressions On Watershed attributes) water-quality model implementation, the U.S. Geological Survey created a dataset to characterize the contribution of phosphorus to streams from weathering and erosion of surficial geologic materials. SPARROW provides estimates of total nitrogen and phosphorus loads in surface waters from point and nonpoint sources. The characterization of the contribution of phosphorus from geologic materials is important to help separate the effects of natural or background sources of phosphorus from anthropogenic sources of phosphorus, such as municipal wastewater or agricultural practices. The potential of a watershed to contribute phosphorus from naturally occurring geologic materials to streams was characterized by using geochemical data from bed-sediment samples collected from first-order streams in relatively undisturbed watersheds as part of the multiyear U.S. Geological Survey National Geochemical Survey. The spatial pattern of bed-sediment phosphorus concentration is offered as a tool to represent the best available information at the regional scale. One issue may weaken the use of bed-sediment phosphorus concentration as a surrogate for the potential for geologic materials in the watershed to contribute to instream levels of phosphorus: an unknown part of the variability in bed-sediment phosphorus concentration may be due to the rates of net deposition and processing of phosphorus in the streambed rather than to variability in the potential of the watershed's geologic materials to contribute phosphorus to the stream. Two additional datasets were created to represent the potential of a watershed to contribute phosphorus from geologic materials disturbed by mining activities from active mines and inactive mines.

  15. Mapping integration of midwives across the United States: Impact on access, equity, and outcomes.

    Directory of Open Access Journals (Sweden)

    Saraswathi Vedam

    Full Text Available Our multidisciplinary team examined published regulatory data to inform a 50-state database describing the environment for midwifery practice and interprofessional collaboration. Items (110) detailed differences across jurisdictions in scope of practice, autonomy, governance, and prescriptive authority, as well as restrictions that can affect patient safety, quality, and access to maternity providers across birth settings. A nationwide survey of state regulatory experts (n = 92) verified the 'on the ground' relevance, importance, and realities of local interpretation of these state laws. Using a modified Delphi process, we selected 50/110 key items to include in a weighted, composite Midwifery Integration Scoring (MISS) system. Higher scores indicate greater integration of midwives across all settings. We ranked states by MISS scores; and, using reliable indicators in the CDC Vital Statistics Database, we calculated correlation coefficients between MISS scores and maternal-newborn outcomes by state, as well as state density of midwives and place of birth. We conducted hierarchical linear regression analysis to control for confounding effects of race. MISS scores ranged from lowest at 17 (North Carolina) to highest at 61 (Washington), out of 100 points. Higher MISS scores were associated with significantly higher rates of spontaneous vaginal delivery, vaginal birth after cesarean, and breastfeeding, and significantly lower rates of cesarean, preterm birth, low birth weight infants, and neonatal death. MISS scores also correlated with density of midwives and access to care across birth settings. Significant differences in newborn outcomes accounted for by MISS scores persisted after controlling for proportion of African American births in each state. The MISS scoring system assesses the level of integration of midwives and evaluates regional access to high quality maternity care. In the United States, higher MISS scores were associated with significantly

  16. Heat Maps of Hypertension, Diabetes Mellitus, and Smoking in the Continental United States.

    Science.gov (United States)

    Loop, Matthew Shane; Howard, George; de Los Campos, Gustavo; Al-Hamdan, Mohammad Z; Safford, Monika M; Levitan, Emily B; McClure, Leslie A

    2017-01-01

    Geographic variations in cardiovascular mortality are substantial, but descriptions of geographic variations in major cardiovascular risk factors have relied on data aggregated to counties. Herein, we provide the first description of geographic variation in the prevalence of hypertension, diabetes mellitus, and smoking within and across US counties. We conducted a cross-sectional analysis of baseline risk factor measurements and latitude/longitude of participant residence collected from 2003 to 2007 in the REGARDS study (Reasons for Geographic and Racial Differences in Stroke). Of the 30 239 participants, all risk factor measurements and location data were available for 28 887 (96%). The mean (±SD) age of these participants was 64.8 (±9.4) years; 41% were black; 55% were female; 59% were hypertensive; 22% were diabetic; and 15% were current smokers. In logistic regression models stratified by race, the median (range) predicted prevalences of the risk factors were as follows: for hypertension, 49% (45%-58%) among whites and 72% (68%-78%) among blacks; for diabetes mellitus, 14% (10%-20%) among whites and 31% (28%-41%) among blacks; and for current smoking, 12% (7%-16%) among whites and 18% (11%-22%) among blacks. Hypertension was most prevalent in the central Southeast among whites, but in the west Southeast among blacks. Diabetes mellitus was most prevalent in the west and central Southeast among whites but in south Florida among blacks. Current smoking was most prevalent in the west Southeast and Midwest among whites and in the north among blacks. Geographic disparities in prevalent hypertension, diabetes mellitus, and smoking exist within states and within counties in the continental United States, and the patterns differ by race. © 2017 American Heart Association, Inc.

  17. Mapping integration of midwives across the United States: Impact on access, equity, and outcomes

    Science.gov (United States)

    Stoll, Kathrin; MacDorman, Marian; Declercq, Eugene; Cramer, Renee; Cheyney, Melissa; Fisher, Timothy; Butt, Emma; Yang, Y. Tony; Powell Kennedy, Holly

    2018-01-01

    birth settings. Significant differences in newborn outcomes accounted for by MISS scores persisted after controlling for proportion of African American births in each state. Conclusion The MISS scoring system assesses the level of integration of midwives and evaluates regional access to high quality maternity care. In the United States, higher MISS Scores were associated with significantly higher rates of physiologic birth, less obstetric interventions, and fewer adverse neonatal outcomes. PMID:29466389

  18. Assessment and Mapping of the Riverine Hydrokinetic Resource in the Continental United States

    Energy Technology Data Exchange (ETDEWEB)

    Jacobson, Paul T. [Electric Power Research Institute; Ravens, Thomas M. [University of Alaska Anchorage; Cunningham, Keith W. [University of Alaska Fairbanks; Scott, George [National Renewable Energy Laboratory

    2012-12-14

    The U.S. Department of Energy (DOE) funded the Electric Power Research Institute and its collaborative partners, University of Alaska Anchorage, University of Alaska Fairbanks, and the National Renewable Energy Laboratory, to provide an assessment of the riverine hydrokinetic resource in the continental United States. The assessment benefited from input obtained during two workshops attended by individuals with relevant expertise and from a National Research Council panel commissioned by DOE to provide guidance to this and other concurrent, DOE-funded assessments of water-based renewable energy. These sources of expertise provided valuable advice regarding data sources and assessment methodology. The assessment of the hydrokinetic resource in the 48 contiguous states is derived from spatially explicit data contained in NHDPlus, a GIS-based database containing river segment-specific information on discharge characteristics and channel slope. 71,398 river segments with mean annual flow greater than 1,000 cubic feet per second (cfs) were included in the assessment. Segments with discharge less than 1,000 cfs were dropped from the assessment, as were river segments with hydroelectric dams. The results for the theoretical and technical resource in the 48 contiguous states were found to be relatively insensitive to the cutoff chosen. Raising the cutoff to 1,500 cfs had no effect on the estimate of the technically recoverable resource, and the theoretical resource was reduced by 5.3%. The segment-specific theoretical resource was estimated from these data using the standard hydrological engineering equation that relates theoretical hydraulic power (Pth, Watts) to discharge (Q, m3 s-1) and hydraulic head or change in elevation (Δh, m) over the length of the segment, where γ is the specific weight of water (9800 N m-3): Pth = γ Q Δh. For Alaska, which is not encompassed by NHDPlus, hydraulic head and discharge data were manually obtained from Idaho National
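
    The power relation described in this record, Pth = γ·Q·Δh with γ = 9800 N m-3, is simple enough to sketch directly. The function and variable names below are illustrative, not from the assessment's actual code; the cfs-to-SI conversion constant is standard.

    ```python
    # Sketch of the segment-level theoretical hydrokinetic power estimate,
    # Pth = gamma * Q * dh, and the mean-discharge screening cutoff.

    GAMMA = 9800.0          # specific weight of water, N/m^3
    CFS_TO_M3S = 0.0283168  # cubic feet per second -> cubic metres per second

    def theoretical_power_watts(discharge_m3s, head_m):
        """Theoretical hydraulic power of a river segment, in Watts."""
        return GAMMA * discharge_m3s * head_m

    def passes_cutoff(mean_annual_flow_cfs, cutoff_cfs=1000.0):
        """Screening rule: keep only segments above the discharge cutoff."""
        return mean_annual_flow_cfs >= cutoff_cfs
    ```

    For example, a segment with Q = 50 m3/s and a 2 m elevation drop carries a theoretical 9800 × 50 × 2 = 980 kW.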

  19. Complete Bouguer gravity anomaly map of the state of Colorado

    Science.gov (United States)

    Abrams, Gerda A.

    1993-01-01

    The Bouguer gravity anomaly map is part of a folio of maps of Colorado cosponsored by the National Mineral Resources Assessment Program (NAMRAP) and the National Geologic Mapping Program (COGEOMAP) and was produced to assist in studies of the mineral resource potential and tectonic setting of the State. This map updates previous compilations of about 12,000 gravity stations by Behrendt and Bajwa (1974a,b). The data were reduced using a density of 2.67 g/cm3, and the grid was contoured at a 3 mGal interval. This map will aid in the mineral resource assessment by indicating buried intrusive complexes, volcanic fields, major faults and shear zones, and sedimentary basins; helping to identify concealed geologic units; and identifying localities that might be hydrothermally altered or mineralized.
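
    For context on the 2.67 g/cm3 reduction density, the Bouguer reduction subtracts the attraction of a rock slab of the assumed density between station and datum. Below is a minimal sketch of the standard infinite-slab term (2πGρh, which works out to 0.04193 mGal per g/cm3 per metre), not the map's actual processing chain.

    ```python
    # Sketch of the simple (infinite-slab) Bouguer correction applied when
    # reducing gravity data at an assumed crustal density. The constant
    # 0.04193 mGal/(g/cm^3 * m) is 2*pi*G expressed in these mixed units.

    def bouguer_slab_mgal(density_g_cm3, elevation_m):
        """Attraction of an infinite slab of the given density and thickness."""
        return 0.04193 * density_g_cm3 * elevation_m

    # At the 2.67 g/cm^3 reduction density, a station 1000 m above the datum
    # carries a slab correction of roughly 112 mGal.
    ```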

  20. Global Geological Map of Venus

    Science.gov (United States)

    Ivanov, M. A.

    2008-09-01

    Introduction: The Magellan SAR images provide sufficient data to compile a geological map of nearly the entire surface of Venus. Such a global and self-consistent map serves as the base to address the key questions of the geologic history of Venus. 1) What is the spectrum of units and structures that makes up the surface of Venus [1-3]? 2) What volcanic/tectonic processes do they characterize [4-7]? 3) Did these processes operate locally, regionally, or globally [8-11]? 4) What are the relationships of relative time among the units [8]? 5) At which length-scale do these relationships appear to be consistent [8-10]? 6) What is the absolute timing of formation of the units [12-14]? 7) What are the histories of volcanism, tectonics and the long-wavelength topography on Venus? 8) What model(s) of heat loss and lithospheric evolution [15-21] do these histories correspond to? The ongoing USGS program of Venus mapping has already resulted in a series of published maps at the scale 1:5M [e.g. 22-30]. These maps have a patch-like distribution, however, and are compiled by authors with differing mapping philosophies. This situation does not always result in perfect agreement between neighboring areas and thus does not permit testing geological hypotheses that could be addressed with a self-consistent map. Here, the results of global geological mapping of Venus at the scale 1:10M are presented. The map represents a contiguous area extending from 82.5°N to 82.5°S and comprises ~99% of the planet. Mapping procedure: The map was compiled on C2-MIDR sheets, the resolution of which permits identifying the basic characteristics of previously defined units. Higher resolution images were used during the mapping to clarify geologic relationships. When the map was completed, its quality was checked against published USGS maps [e.g., 22-30] and the catalogue of impact craters [31]. The results suggest that mapping on the C2 base provided a high-quality map product. Units and

  1. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system, including: providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more other nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendants; and sending to the selected node only the compiled software to be executed by the selected node or the selected node's descendants.
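
    As a rough illustration (not the patented implementation), the selective forwarding step this record describes amounts to a tree walk: a compiling node keeps the units targeted at itself and sends each next-tier child only the units destined for that child's subtree. Node names and the unit-to-target mapping below are hypothetical.

    ```python
    # Hypothetical sketch of hierarchical compile-and-dispatch: keep locally
    # targeted compiled units, forward the rest only toward their subtrees.

    from dataclasses import dataclass, field

    @dataclass
    class Node:
        name: str
        children: list = field(default_factory=list)

    def subtree_names(node):
        """All node names in the subtree rooted at `node`, including itself."""
        names = {node.name}
        for child in node.children:
            names |= subtree_names(child)
        return names

    def dispatch(compiling_node, compiled_units):
        """Split compiled units (a unit -> target-node-name mapping) into those
        kept on the compiling node and those forwarded to each child."""
        kept = {u for u, t in compiled_units.items() if t == compiling_node.name}
        forwarded = {}
        for child in compiling_node.children:
            reachable = subtree_names(child)
            forwarded[child.name] = {
                u for u, t in compiled_units.items() if t in reachable
            }
        return kept, forwarded
    ```

    A child receiving its forwarded set would apply the same split recursively, so each unit travels only down the branch that contains its target node.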

  2. Bedrock Geologic Map of the Jay Peak, VT Quadrangle

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital data from VG99-1 Compilation bedrock geologic map of the Jay Peak quadrangle, Compiled by B. Doolan, 1999: VGS Open-File Report VG99-1, 1 plate, scale...

  3. Removing non-urban roads from the National Land Cover Database to create improved urban maps for the United States, 1992-2011

    Science.gov (United States)

    Soulard, Christopher E.; Acevedo, William; Stehman, Stephen V.

    2018-01-01

    Quantifying change in urban land provides important information to create empirical models examining the effects of human land use. Maps of developed land from the National Land Cover Database (NLCD) of the conterminous United States include rural roads in the developed land class and therefore overestimate the amount of urban land. To better map the urban class and understand how urban lands change over time, we removed rural roads and small patches of rural development from the NLCD developed class and created four wall-to-wall maps (1992, 2001, 2006, and 2011) of urban land. Removing rural roads from the NLCD developed class involved a multi-step filtering process, data fusion using geospatial road and developed land data, and manual editing. Reference data classified as urban or not urban from a stratified random sample was used to assess the accuracy of the 2001 and 2006 urban and NLCD maps. The newly created urban maps had higher overall accuracy (98.7 percent) than the NLCD maps (96.2 percent). More importantly, the urban maps resulted in lower commission error of the urban class (23 percent versus 57 percent for the NLCD in 2006) with the trade-off of slightly inflated omission error (20 percent for the urban map, 16 percent for NLCD in 2006). The removal of approximately 230,000 km2 of rural roads from the NLCD developed class resulted in maps that better characterize the urban footprint. These urban maps are more suited to modeling applications and policy decisions that rely on quantitative and spatially explicit information regarding urban lands.
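
    The commission and omission errors quoted in this record follow the usual per-class map-accuracy definitions, which can be sketched as below; the counts in the test are illustrative, not the study's.

    ```python
    # Sketch of per-class map accuracy measures for a binary (urban / not
    # urban) map assessed against reference data.

    def commission_error(true_pos, false_pos):
        """Fraction of pixels mapped urban that are not urban in the reference."""
        return false_pos / (true_pos + false_pos)

    def omission_error(true_pos, false_neg):
        """Fraction of reference urban pixels the map failed to label urban."""
        return false_neg / (true_pos + false_neg)

    def overall_accuracy(true_pos, true_neg, total):
        """Fraction of all assessed pixels labelled correctly."""
        return (true_pos + true_neg) / total
    ```

    The trade-off the record describes falls out of these definitions: filtering rural roads out of the developed class removes false positives (lowering commission error) but can also drop some genuinely urban pixels (raising omission error).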

  4. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-10-01

    This is the third issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section approximately every six months. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations

  5. Compilation of data on elementary particles

    International Nuclear Information System (INIS)

    Trippe, T.G.

    1984-09-01

    The most widely used data compilation in the field of elementary particle physics is the Review of Particle Properties. The origin, development and current state of this compilation are described with emphasis on the features which have contributed to its success: active involvement of particle physicists; critical evaluation and review of the data; completeness of coverage; regular distribution of reliable summaries including a pocket edition; heavy involvement of expert consultants; and international collaboration. The current state of the Review and new developments such as providing interactive access to the Review's database are described. Problems and solutions related to maintaining a strong and supportive relationship between compilation groups and the researchers who produce and use the data are discussed

  6. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler’s ability to generate loop-parallel code. We use this compilation system to modify two sequential benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should…

  7. Compilation of current high energy physics experiments

    International Nuclear Information System (INIS)

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche

  8. Extension of Alvis compiler front-end

    Energy Technology Data Exchange (ETDEWEB)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl [AGH University of Science and Technology, Department of Applied Computer Science, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2015-12-31

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. The semantics of an Alvis model is expressed as an LTS graph (labelled transition system). Execution of any language statement is expressed as a transition between formally defined states of such a model. An LTS graph is generated using a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as a part of the Alvis language to define parameters’ types and the operations on them. Thanks to the compiler’s modular construction, many aspects of the compilation of an Alvis model may be modified. Providing new plugins for the Alvis compiler that support languages like Java or C makes it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler's internal model and describes how the default specification language can be altered by new plugins.
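The LTS generation described above can be sketched generically: starting from an initial state, enumerate all reachable states and labelled transitions by breadth-first search. The toy transition rules below are hypothetical (a counter with "inc" and "reset" steps), not the Alvis semantics; only the shape of LTS construction is illustrated.

```python
from collections import deque

def successors(state):
    # Hypothetical transition rules: "inc" advances a counter
    # modulo 3, "reset" returns a non-zero counter to 0.
    yield ("inc", (state + 1) % 3)
    if state != 0:
        yield ("reset", 0)

def build_lts(initial):
    """Enumerate reachable states and labelled transitions (BFS),
    mirroring how an LTS records state --label--> state edges."""
    states, edges = {initial}, []
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        for label, t in successors(s):
            edges.append((s, label, t))
            if t not in states:
                states.add(t)
                queue.append(t)
    return states, edges

states, edges = build_lts(0)
```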

  9. Neutron data compilation at the International Atomic Energy Agency

    International Nuclear Information System (INIS)

    Lemmel, H.D.; Attree, P.M.; Byer, T.A.; Good, W.M.; Hjaerne, L.; Konshin, V.A.; Lorens, A.

    1968-03-01

    The paper describes the present status of the neutron data compilation center of the IAEA Nuclear Data Unit, which is now in full operation. An outline is given of the principles and objectives, the working routines, and the services available within the two-fold functions of the Unit: a) to promote cooperation and international neutron data exchange between the four major centers at Brookhaven, Saclay, Obninsk and Vienna, which share responsibilities in a geographical distribution of labour; b) to collect systematically the neutron data arising from countries in East Europe, Asia, Australia, Africa, South and Central America and to offer certain services to these countries. A brief description of DASTAR, the DAta STorage And Retrieval system, and of CINDU, the data Catalog of the IAEA Nuclear Data Unit, is given. (author)

  10. Neutron data compilation at the International Atomic Energy Agency

    Energy Technology Data Exchange (ETDEWEB)

    Lemmel, H D; Attree, P M; Byer, T A; Good, W M; Hjaerne, L; Konshin, V A; Lorens, A [Nuclear Data Unit, International Atomic Energy Agency, Vienna (Austria)

    1968-03-15

    The paper describes the present status of the neutron data compilation center of the IAEA Nuclear Data Unit, which is now in full operation. An outline is given of the principles and objectives, the working routines, and the services available within the two-fold functions of the Unit: a) to promote cooperation and international neutron data exchange between the four major centers at Brookhaven, Saclay, Obninsk and Vienna, which share responsibilities in a geographical distribution of labour; b) to collect systematically the neutron data arising from countries in East Europe, Asia, Australia, Africa, South and Central America and to offer certain services to these countries. A brief description of DASTAR, the DAta STorage And Retrieval system, and of CINDU, the data Catalog of the IAEA Nuclear Data Unit, is given. (author)

  11. Asian collaboration on nuclear reaction data compilation

    International Nuclear Information System (INIS)

    Aikawa, Masayuki; Furutachi, Naoya; Kato, Kiyoshi; Makinaga, Ayano; Devi, Vidya; Ichinkhorloo, Dagvadorj; Odsuren, Myagmarjav; Tsubakihara, Kohsuke; Katayama, Toshiyuki; Otuka, Naohiko

    2013-01-01

    Nuclear reaction data are essential for research and development in nuclear engineering, radiation therapy, nuclear physics and astrophysics. Experimental data must be compiled in a database and be accessible to nuclear data users. One of the nuclear reaction databases is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC) under the auspices of the International Atomic Energy Agency. Recently, collaboration among the Asian NRDC members is being further developed under the support of the Asia-Africa Science Platform Program of the Japan Society for the Promotion of Science. We report the activity for three years to develop the Asian collaboration on nuclear reaction data compilation. (author)

  12. Promising Compilation to ARMv8 POP

    OpenAIRE

    Podkopaev, Anton; Lahav, Ori; Vafeiadis, Viktor

    2017-01-01

    We prove the correctness of compilation of relaxed memory accesses and release-acquire fences from the "promising" semantics of [Kang et al. POPL'17] to the ARMv8 POP machine of [Flur et al. POPL'16]. The proof is highly non-trivial because both the ARMv8 POP and the promising semantics provide some extremely weak consistency guarantees for normal memory accesses; however, they do so in rather different ways. Our proof of compilation correctness to ARMv8 POP strengthens the results of the Kan...

  13. Parallelizing More Loops with Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    2012-01-01

    an interactive compilation feedback system that guides programmers in iteratively modifying their application source code. This helps leverage the compiler’s ability to generate loop-parallel code. We employ our system to modify two sequential benchmarks dealing with image processing and edge detection...

  14. Safety and maintenance engineering: A compilation

    Science.gov (United States)

    1974-01-01

    A compilation is presented for the dissemination of information on technological developments which have potential utility outside the aerospace and nuclear communities. Safety of personnel engaged in the handling of hazardous materials and equipment, protection of equipment from fire, high wind, or careless handling by personnel, and techniques for the maintenance of operating equipment are reported.

  15. Compilation of information on melter modeling

    International Nuclear Information System (INIS)

    Eyler, L.L.

    1996-03-01

    The objective of the task described in this report is to compile information on modeling capabilities for the High-Temperature Melter and the Cold Crucible Melter and to issue a modeling capabilities letter report summarizing existing modeling capabilities. The report is to include strategy recommendations for future modeling efforts to support High-Level Waste (HLW) melter development.

  16. Compilation of cross-sections. Pt. 1

    International Nuclear Information System (INIS)

    Flaminio, V.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1983-01-01

    A compilation of integral cross-sections for hadronic reactions is presented. This is an updated version of CERN/HERA 79-1, 79-2, 79-3. It contains all data published up to the beginning of 1982, but some more recent data have also been included. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  17. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    Verified Compilation of Concurrent Managed Languages. Purdue University, final technical report, November 2017. Approved for public release; this report is published in the interest of scientific and technical information exchange.

  18. Compiler-Agnostic Function Detection in Binaries

    NARCIS (Netherlands)

    Andriesse, D.A.; Slowinska, J.M.; Bos, H.J.

    2017-01-01

    We propose Nucleus, a novel function detection algorithm for binaries. In contrast to prior work, Nucleus is compiler-agnostic, and does not require any learning phase or signature information. Instead of scanning for signatures, Nucleus detects functions at the Control Flow Graph-level, making it
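A toy sketch of the CFG-level idea (not Nucleus's actual algorithm, and with hypothetical addresses): treat a basic block as a function entry if it is the target of a call, or if no intraprocedural control-flow edge reaches it.

```python
def detect_function_entries(blocks, flow_edges, call_targets):
    """Toy CFG-level function detection: a block is a function
    entry if it is a call target, or if no intraprocedural
    (jump/fall-through) edge reaches it."""
    has_incoming = {dst for _, dst in flow_edges}
    entries = set(call_targets)
    entries |= {b for b in blocks if b not in has_incoming}
    return entries

# Hypothetical binary layout: basic-block addresses, jump edges,
# and one call site targeting 0x2000.
blocks = [0x1000, 0x1010, 0x1020, 0x2000, 0x2010]
flow_edges = [(0x1000, 0x1010), (0x1010, 0x1020), (0x2000, 0x2010)]
call_targets = [0x2000]

assert detect_function_entries(blocks, flow_edges, call_targets) == {0x1000, 0x2000}
```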

  19. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    ... We demonstrate the ability of our tool to transform code, and suggest code refactorings that increase its amenability to optimization. The preliminary results show that, with our tool-set, automatic loop parallelization with the GNU C compiler, gcc, yields 8.6x best-case speedup over...

  20. Expectation Levels in Dictionary Consultation and Compilation ...

    African Journals Online (AJOL)

    Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of the structure ...

  1. Expectation Levels in Dictionary Consultation and Compilation*

    African Journals Online (AJOL)

    Abstract: Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of ...

  2. Compilation of cross-sections. Pt. 4

    International Nuclear Information System (INIS)

    Alekhin, S.I.; Ezhela, V.V.; Lugovsky, S.B.; Tolstenkov, A.N.; Yushchenko, O.P.; Baldini, A.; Cobal, M.; Flaminio, V.; Capiluppi, P.; Giacomelli, G.; Mandrioli, G.; Rossi, A.M.; Serra, P.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1987-01-01

    This is the fourth volume in our series of data compilations on integrated cross-sections for weak, electromagnetic, and strong interaction processes. This volume covers data on reactions induced by photons, neutrinos, hyperons, and K0L. It contains all data published up to June 1986. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  3. 1991 OCRWM bulletin compilation and index

    International Nuclear Information System (INIS)

    1992-05-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management, to provide current information about the national program for managing spent fuel and high-level radioactive waste. The document is a compilation of issues from the 1991 calendar year. A table of contents and an index have been provided to reference information contained in this year's Bulletins

  4. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  5. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...
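The compilation idea can be illustrated on a hypothetical two-node network A → B: inference reduces to evaluating the network polynomial under evidence indicators, which is what the compiled arithmetic circuit computes. The sketch below evaluates the polynomial directly rather than a factored circuit, and all parameters are made up.

```python
# A tiny Bayesian network A -> B with hypothetical parameters.
def P(a):
    return {True: 0.3, False: 0.7}[a]

def P_b_given(a, b):
    table = {(True, True): 0.9, (True, False): 0.1,
             (False, True): 0.2, (False, False): 0.8}
    return table[(a, b)]

def circuit(evidence):
    """Evaluate the network polynomial: evidence indicators zero
    out inconsistent states, so the sum equals P(evidence)."""
    def ind(var, val):
        # 1.0 if val is consistent with the evidence, else 0.0.
        return 1.0 if evidence.get(var, val) == val else 0.0
    total = 0.0
    for a in (True, False):
        for b in (True, False):
            total += ind("A", a) * ind("B", b) * P(a) * P_b_given(a, b)
    return total

# P(B=True) = 0.3*0.9 + 0.7*0.2 = 0.41
assert abs(circuit({"B": True}) - 0.41) < 1e-9
```

Arithmetic-circuit compilation factors this polynomial into a reusable sum/product DAG, so repeated queries avoid re-enumerating all states.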

  6. Experience from the Inspection of Licensees' Outage Activities, Including Fire Protection Programmes, Event Response Inspections, and the Impact of the Fukushima Daiichi NPP Accident on Inspection Programmes. Workshop Proceedings, Chattanooga, Tennessee, United States, 7-10 April 2014 - Appendix: Compilation of Survey Responses

    International Nuclear Information System (INIS)

    2014-10-01

    This appendix provides the complete compilation of responses received to the questionnaire issued in conjunction with the workshop announcements. The responses are provided as received, with changes made only to the formatting. The OECD Nuclear Energy Agency (NEA) Committee on Nuclear Regulatory Activities (CNRA) Working Group on Inspection Practices (WGIP) sponsored the 12th International Workshop on Nuclear Regulatory Inspection Activities. The workshop was hosted by the U.S. NRC in Chattanooga, Tennessee, United States of America on 7-10 April 2014. The three workshop topics addressed were as follows: - Inspection of Outage Activities Including Fire Protection Programmes. - Event Response Inspections. - The Impact of the Fukushima Daiichi NPP Accident on Inspection Programmes. Each of the respondents was given the following instructions in relation to their response: - Only one response per country is required. If more than one person from your country is participating, please co-ordinate the responses accordingly. - Please provide responses on a separate sheet and clearly identify the questionnaire part and topic. In preparation for the workshop, participants were invited to supply the national inspection approaches used in the inspection of events and incidents, according to the surveys. The actual issues discussed during the workshop were generated by the topic leaders based on the responses submitted by participants with their registration forms. This format helps ensure that the issues considered most important by the workshop participants are covered during the group discussions. (authors)

  7. Geologic Map of the Thaumasia Region, Mars

    Science.gov (United States)

    Dohm, James M.; Tanaka, Kenneth L.; Hare, Trent M.

    2001-01-01

    The geology of the Thaumasia region (fig. 1, sheet 3) includes a wide array of rock materials, depositional and erosional landforms, and tectonic structures. The region is dominated by the Thaumasia plateau, which includes central high lava plains ringed by highly deformed highlands; the plateau may comprise the ancestral center of Tharsis tectonism (Frey, 1979; Plescia and Saunders, 1982). The extensive structural deformation of the map region, which is without parallel on Mars in both complexity and diversity, occurred largely throughout the Noachian and Hesperian periods (Tanaka and Davis, 1988; Scott and Dohm, 1990a). The deformation produced small and large extensional and contractional structures (fig. 2, sheet 3) that resulted from stresses related to the formation of Tharsis (Frey, 1979; Wise and others, 1979; Plescia and Saunders, 1982; Banerdt and others, 1982, 1992; Watters and Maxwell, 1986; Tanaka and Davis, 1988; Francis, 1988; Watters, 1993; Schultz and Tanaka, 1994), from magmatic-driven uplifts, such as at Syria Planum (Tanaka and Davis, 1988; Dohm and others, 1998; Dohm and Tanaka, 1999) and central Valles Marineris (Dohm and others, 1998, Dohm and Tanaka, 1999), and from the Argyre impact (Wilhelms, 1973; Scott and Tanaka, 1986). In addition, volcanic, eolian, and fluvial processes have highly modified older surfaces in the map region. Local volcanic and tectonic activity often accompanied episodes of valley formation. Our mapping depicts and describes the diverse terrains and complex geologic history of this unique ancient tectonic region of Mars. The geologic (sheet 1), paleotectonic (sheet 2), and paleoerosional (sheet 3) maps of the Thaumasia region were compiled on a Viking 1:5,000,000-scale digital photomosaic base. The base is a combination of four quadrangles: the southeast part of Phoenicis Lacus (MC–17), most of the southern half of Coprates (MC–18), a large part of Thaumasia (MC–25), and the northwest margin of Argyre (MC–26

  8. HAL/S-FC compiler system functional specification

    Science.gov (United States)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package, and the restrictions and dependencies of the HAL/S-FC system, are also considered.

  9. Continuous bedside pressure mapping and rates of hospital-associated pressure ulcers in a medical intensive care unit.

    Science.gov (United States)

    Behrendt, Robert; Ghaznavi, Amir M; Mahan, Meredith; Craft, Susan; Siddiqui, Aamir

    2014-03-01

    Critically ill patients are vulnerable to the development of hospital-associated pressure ulcers (HAPUs). Positioning of patients is an essential component of pressure ulcer prevention because it off-loads areas of high pressure. However, the effectiveness of such positioning is debatable. A continuous bedside pressure mapping (CBPM) device can provide real-time feedback of optimal body position through a pressure-sensing mat that displays pressure images at a patient's bedside, allowing off-loading of high-pressure areas and possibly preventing HAPU formation. A prospective controlled study was designed to determine if CBPM would reduce the number of HAPUs in patients treated in our medical intensive care unit. In 2 months, 422 patients were enrolled and assigned to beds equipped with or without a CBPM device. Patients' skin was assessed daily and weekly to determine the presence and progress of HAPUs. All patients were turned every 2 hours. CBPM patients were repositioned to off-load high-pressure points during turning, according to a graphic display. The number of newly formed HAPUs was the primary outcome measured. A χ² test was then used to compare the occurrence of HAPUs between groups. HAPUs developed in 2 of 213 patients in the CBPM group (0.9%; both stage II) compared with 10 of 209 in the control group (4.8%; all stage II; P = .02). Significantly fewer HAPUs occurred in the CBPM group than in the control group, indicating the effectiveness of real-time visual feedback in repositioning patients to prevent the formation of new HAPUs.
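The reported significance test can be reproduced from the counts in the abstract. The sketch below assumes a Pearson χ² statistic on the 2x2 table (2 of 213 CBPM patients vs. 10 of 209 controls) without continuity correction:

```python
def chi_square_2x2(a, n1, b, n2):
    """Pearson chi-square statistic (no continuity correction)
    for events a of n1 in group 1 vs. events b of n2 in group 2."""
    table = [(a, n1 - a), (b, n2 - b)]
    col = [a + b, (n1 - a) + (n2 - b)]
    n = n1 + n2
    stat = 0.0
    for i, row_n in enumerate((n1, n2)):
        for j in range(2):
            expected = row_n * col[j] / n
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

stat = chi_square_2x2(2, 213, 10, 209)
# stat is about 5.65, which with 1 degree of freedom corresponds to
# p of roughly 0.02, matching the P = .02 reported above.
```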

  10. Modeled changes in 100 year Flood Risk and Asset Damages within Mapped Floodplains of the Contiguous United States

    Science.gov (United States)

    Wobus, C. W.; Gutmann, E. D.; Jones, R.; Rissing, M.; Mizukami, N.; Lorie, M.; Mahoney, H.; Wood, A.; Mills, D.; Martinich, J.

    2017-12-01

    A growing body of recent work suggests that the extreme weather events that drive inland flooding are likely to increase in frequency and magnitude in a warming climate, thus increasing monetary damages from flooding in the future. We use hydrologic projections based on the Coupled Model Intercomparison Project Phase 5 (CMIP5) to estimate changes in the frequency of modeled 1% annual exceedance probability flood events at 57,116 locations across the contiguous United States (CONUS). We link these flood projections to a database of assets within mapped flood hazard zones to model changes in inland flooding damages throughout the CONUS over the remainder of the 21st century, under two greenhouse gas (GHG) emissions scenarios. Our model generates early 21st century flood damages that reasonably approximate the range of historical observations, and trajectories of future damages that vary substantially depending on the GHG emissions pathway. The difference in modeled flood damages between higher and lower emissions pathways approaches $4 billion per year by 2100 (in undiscounted 2014 dollars), suggesting that aggressive GHG emissions reductions could generate significant monetary benefits over the long-term in terms of reduced flood risk. Although the downscaled hydrologic data we used have been applied to flood impacts studies elsewhere, this research expands on earlier work to quantify changes in flood risk by linking future flood exposure to assets and damages at a national scale. Our approach relies on a series of simplifications that could ultimately affect damage estimates (e.g., use of statistical downscaling, reliance on a nationwide hydrologic model, and linking damage estimates only to 1% AEP floods). Although future work is needed to test the sensitivity of our results to these methodological choices, our results suggest that monetary damages from inland flooding could be substantially reduced through more aggressive GHG mitigation policies.

  11. Climate change impacts on flood risk and asset damages within mapped 100-year floodplains of the contiguous United States

    Science.gov (United States)

    Wobus, Cameron; Gutmann, Ethan; Jones, Russell; Rissing, Matthew; Mizukami, Naoki; Lorie, Mark; Mahoney, Hardee; Wood, Andrew W.; Mills, David; Martinich, Jeremy

    2017-12-01

    A growing body of work suggests that the extreme weather events that drive inland flooding are likely to increase in frequency and magnitude in a warming climate, thus potentially increasing flood damages in the future. We use hydrologic projections based on the Coupled Model Intercomparison Project Phase 5 (CMIP5) to estimate changes in the frequency of modeled 1 % annual exceedance probability (1 % AEP, or 100-year) flood events at 57 116 stream reaches across the contiguous United States (CONUS). We link these flood projections to a database of assets within mapped flood hazard zones to model changes in inland flooding damages throughout the CONUS over the remainder of the 21st century. Our model generates early 21st century flood damages that reasonably approximate the range of historical observations and trajectories of future damages that vary substantially depending on the greenhouse gas (GHG) emissions pathway. The difference in modeled flood damages between higher and lower emissions pathways approaches USD 4 billion per year by 2100 (in undiscounted 2014 dollars), suggesting that aggressive GHG emissions reductions could generate significant monetary benefits over the long term in terms of reduced flood damages. Although the downscaled hydrologic data we used have been applied to flood impacts studies elsewhere, this research expands on earlier work to quantify changes in flood risk by linking future flood exposure to assets and damages on a national scale. Our approach relies on a series of simplifications that could ultimately affect damage estimates (e.g., use of statistical downscaling, reliance on a nationwide hydrologic model, and linking damage estimates only to 1 % AEP floods). Although future work is needed to test the sensitivity of our results to these methodological choices, our results indicate that monetary damages from inland flooding could be significantly reduced through substantial GHG mitigation.

  12. Climate change impacts on flood risk and asset damages within mapped 100-year floodplains of the contiguous United States

    Directory of Open Access Journals (Sweden)

    C. Wobus

    2017-12-01

    Full Text Available A growing body of work suggests that the extreme weather events that drive inland flooding are likely to increase in frequency and magnitude in a warming climate, thus potentially increasing flood damages in the future. We use hydrologic projections based on the Coupled Model Intercomparison Project Phase 5 (CMIP5) to estimate changes in the frequency of modeled 1 % annual exceedance probability (1 % AEP, or 100-year) flood events at 57 116 stream reaches across the contiguous United States (CONUS). We link these flood projections to a database of assets within mapped flood hazard zones to model changes in inland flooding damages throughout the CONUS over the remainder of the 21st century. Our model generates early 21st century flood damages that reasonably approximate the range of historical observations and trajectories of future damages that vary substantially depending on the greenhouse gas (GHG) emissions pathway. The difference in modeled flood damages between higher and lower emissions pathways approaches USD 4 billion per year by 2100 (in undiscounted 2014 dollars), suggesting that aggressive GHG emissions reductions could generate significant monetary benefits over the long term in terms of reduced flood damages. Although the downscaled hydrologic data we used have been applied to flood impacts studies elsewhere, this research expands on earlier work to quantify changes in flood risk by linking future flood exposure to assets and damages on a national scale. Our approach relies on a series of simplifications that could ultimately affect damage estimates (e.g., use of statistical downscaling, reliance on a nationwide hydrologic model, and linking damage estimates only to 1 % AEP floods). Although future work is needed to test the sensitivity of our results to these methodological choices, our results indicate that monetary damages from inland flooding could be significantly reduced through substantial GHG mitigation.
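The damage model's core accounting can be reduced to a sketch: expected annual damage is exceedance probability times loss if the event occurs. All numbers below are hypothetical stand-ins for the study's asset database and hydrologic projections; the actual model sums over many locations and flood thresholds.

```python
def expected_annual_damage(assets_value, damage_fraction, aep):
    """Expected annual damage for a single flood threshold:
    annual exceedance probability times the loss if it occurs."""
    return aep * assets_value * damage_fraction

# Hypothetical floodplain: $500M of exposed assets, 30% loss in a
# 1% AEP flood.  If climate change doubles the frequency of the
# historical 100-year event, expected annual damage doubles too.
baseline = expected_annual_damage(500e6, 0.30, 0.01)
future = expected_annual_damage(500e6, 0.30, 0.02)
assert abs(baseline - 1.5e6) < 1.0
assert abs(future - 2 * baseline) < 1.0
```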

  13. Perspex machine: V. Compilation of C programs

    Science.gov (United States)

    Spanner, Matthew P.; Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of the Turing machine with projective geometry. The original, constructive proof used four special, perspective transformations to implement the Turing machine in projective geometry. These four transformations are now generalised and applied in a compiler, implemented in Pop11, that converts a subset of the C programming language into perspexes. This is interesting both from a geometrical and a computational point of view. Geometrically, it is interesting that program source can be converted automatically to a sequence of perspective transformations and conditional jumps, though we find that the product of homogeneous transformations with normalisation can be non-associative. Computationally, it is interesting that program source can be compiled for a Reduced Instruction Set Computer (RISC), the perspex machine, that is a Single Instruction, Zero Exception (SIZE) computer.

  14. Compilation of data from hadronic atoms

    International Nuclear Information System (INIS)

    Poth, H.

    1979-01-01

    This compilation is a survey of the existing data of hadronic atoms (pionic atoms, kaonic atoms, antiprotonic atoms, sigmonic atoms). It collects measurements of the energies, intensities and line widths of X-rays from hadronic atoms. Averaged values for each hadronic atom are given and the data are summarized. The listing contains data on 58 pionic atoms, 54 kaonic atoms, 23 antiprotonic atoms and 20 sigmonic atoms. (orig./HB)

  15. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This report presents a compilation of both fracture properties and hydrogeological parameters relevant to the flow of groundwater in fractured rock systems. Methods of data acquisition as well as the scale of and conditions during the measurement are recorded. Measurements and analytical techniques for each of the parameters under consideration have been reviewed with respect to their methodology, assumptions and accuracy. Both the rock type and geologic setting associated with these measurements have also been recorded. 373 refs

  16. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1981-03-01

    A request list for nuclear data which was produced from a computerized data file by the National Nuclear Data Center is presented. The request list is given by target nucleus (isotope) and then reaction type. The purpose of the compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. Requesters are identified by laboratory, last name, and sponsoring US government agency

  17. Generating a Danish raster-based topsoil property map combining choropleth maps and point information

    DEFF Research Database (Denmark)

    Greve, Mogens H.; Greve, Mette B.; Bøcher, Peder K.

    2007-01-01

    The Danish environmental authorities have posed a soil type dependent restriction on the application of nitrogen. The official Danish soil map is a choropleth topsoil map classifying the agricultural land into eight classes. The use of the soil map has shown that the maps have serious ... classification flaws. The objective of this work is to compile a continuous national topsoil texture map to replace the old topsoil map. Approximately 45,000 point samples were interpolated using ordinary kriging in 250 m x 250 m cells. To reduce variability and to obtain more homogeneous strata, the samples ... were stratified according to landscape types. Five new soil texture maps were compiled, one for each of the five textural classes, and a new categorical soil type map was compiled using the old classification system. Both the old choropleth map and the new continuous soil maps were compared to 354...
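The interpolation step can be sketched with a simpler method: inverse-distance weighting is used here purely as a stand-in for the study's ordinary kriging (kriging additionally fits a variogram and solves for optimal weights). Sample coordinates and values are hypothetical.

```python
def idw(points, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from
    (px, py, value) samples -- a simplified stand-in for the
    ordinary kriging used in the study."""
    num = den = 0.0
    for px, py, v in points:
        d2 = (px - x) ** 2 + (py - y) ** 2
        if d2 == 0.0:
            return v  # exact hit on a sample point
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical clay-content samples (x, y, percent) interpolated
# onto the centre of one 250 m x 250 m cell.
samples = [(0.0, 0.0, 10.0), (250.0, 0.0, 14.0),
           (0.0, 250.0, 12.0), (250.0, 250.0, 16.0)]
cell_centre = idw(samples, 125.0, 125.0)
```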

  18. Mapping urban geology of the city of Girona, Catalonia

    Science.gov (United States)

    Vilà, Miquel; Torrades, Pau; Pi, Roser; Monleon, Ona

    2016-04-01

    A detailed and systematic geological characterization of the urban area of Girona has been conducted under the project '1:5000 scale Urban geological map of Catalonia' of the Catalan Geological Survey (Institut Cartogràfic i Geològic de Catalunya). The results of this characterization are organized into: i) a geological information system that includes all the information acquired; ii) a stratigraphic model focused on identification, characterization and correlation of the geological materials and structures present in the area; and iii) a detailed geological map that represents a synthesis of all the collected information. The mapping project integrates, in a GIS environment, pre-existing cartographic documentation (geological and topographical), core data from compiled boreholes, descriptions of geological outcrops within the urban network and neighbouring areas, physico-chemical characterisation of representative samples of geological materials, detailed geological mapping of Quaternary sediments, subsurface bedrock and artificial deposits, and 3D modelling of the main geological surfaces. The stratigraphic model is structured as a system of geological units that, from a chronostratigraphic point of view, are grouped into Palaeozoic, Paleogene, Neogene, Quaternary and Anthropocene. The description of the geological units follows a systematic procedure. It includes the main lithological and structural features of the units that constitute the geological substratum and represents the conceptual base of the 1:5000 urban geological map of the Girona metropolitan area, which is organized into 6 map sheets. These map sheets are composed of a principal map, geological cross sections and several complementary maps, charts and tables.
Regardless of the geological map units, the principal map also represents the main artificial deposits, features related to geohistorical processes, contours of outcrop areas, information obtained in stations, borehole data, and contour

  19. Digital Geologic Map of the Nevada Test Site and Vicinity, Nye, Lincoln, and Clark Counties, Nevada, and Inyo County, California

    Science.gov (United States)

    Slate, Janet L.; Berry, Margaret E.; Rowley, Peter D.; Fridrich, Christopher J.; Morgan, Karen S.; Workman, Jeremiah B.; Young, Owen D.; Dixon, Gary L.; Williams, Van S.; McKee, Edwin H.; Ponce, David A.; Hildenbrand, Thomas G.; Swadley, W.C.; Lundstrom, Scott C.; Ekren, E. Bartlett; Warren, Richard G.; Cole, James C.; Fleck, Robert J.; Lanphere, Marvin A.; Sawyer, David A.; Minor, Scott A.; Grunwald, Daniel J.; Laczniak, Randell J.; Menges, Christopher M.; Yount, James C.; Jayko, Angela S.

    1999-01-01

    This digital geologic map of the Nevada Test Site (NTS) and vicinity, as well as its accompanying digital geophysical maps, are compiled at 1:100,000 scale. The map compilation presents new polygon (geologic map unit contacts), line (fault, fold axis, metamorphic isograd, dike, and caldera wall) and point (structural attitude) vector data for the NTS and vicinity, Nye, Lincoln, and Clark Counties, Nevada, and Inyo County, California. The map area covers two 30 x 60-minute quadrangles (the Pahute Mesa quadrangle to the north and the Beatty quadrangle to the south) plus a strip of 7.5-minute quadrangles on the east side, 72 quadrangles in all. In addition to the NTS, the map area includes the rest of the southwest Nevada volcanic field, part of the Walker Lane, most of the Amargosa Desert, part of the Funeral and Grapevine Mountains, some of Death Valley, and the northern Spring Mountains. This geologic map improves on previous geologic mapping of the same area (Wahl and others, 1997) by providing new and updated Quaternary and bedrock geology, new geophysical interpretations of faults beneath the basins, and improved GIS coverages. Concurrent publications to this one include a new isostatic gravity map (Ponce and others, 1999) and a new aeromagnetic map (Ponce, 1999).

  20. Ogallala Aquifer Mapping Program

    International Nuclear Information System (INIS)

    1984-10-01

    A computerized data file has been established which can be used efficiently by the contour-plotting program SURFACE II to produce maps of the Ogallala aquifer in 17 counties of the Texas Panhandle. The data collected have been evaluated and compiled into three sets, from which SURFACE II can generate maps of well control, aquifer thickness, saturated thickness, water level, and the difference between virgin (pre-1942) and recent (1979 to 1981) water levels. 29 figures, 1 table
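The mapped quantities in the record above reduce to simple arithmetic over well records. A minimal sketch, assuming hypothetical well IDs, water levels (in feet), and field layout (none of this is from the actual SURFACE II data file):

```python
# Hypothetical sketch of the arithmetic behind two of the mapped quantities.
# Well IDs, levels (ft), and the field layout are invented; they are not
# taken from the actual SURFACE II data file.

def decline(virgin_level, recent_level):
    """Water-level decline between the virgin (pre-1942) and recent (1979-81) surveys."""
    return virgin_level - recent_level

def saturated_thickness(water_level, aquifer_base):
    """Saturated thickness: water-level elevation minus base-of-aquifer elevation."""
    return water_level - aquifer_base

wells = [
    # (well_id, virgin level, recent level, base of aquifer), all in feet
    ("A-1", 3200.0, 3150.0, 3000.0),
    ("A-2", 3350.0, 3275.0, 3100.0),
]

report = {
    well_id: (saturated_thickness(recent, base), decline(virgin, recent))
    for well_id, virgin, recent, base in wells
}
```

Point values like these, gridded over the 17-county area, are what a contouring package such as SURFACE II would interpolate into map form.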

  1. Notes on Compiling a Corpus- Based Dictionary

    Directory of Open Access Journals (Sweden)

    František Čermák

    2011-10-01

    Full Text Available

ABSTRACT: On the basis of a sample analysis of a Czech adjective, a definition based on the data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing out the drawbacks of definitions found in traditional dictionaries. The steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. These are supplemented by additional remarks and caveats useful in the compilation of a dictionary. Thus, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified form.

SUMMARY: Notes on compiling a corpus-based dictionary. On the basis of a sample analysis of a Czech adjective, a definition based on data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing out the shortcomings of definitions found in traditional dictionaries. The steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. They are supplemented by additional remarks and warnings useful for the compilation of a dictionary. In this way, a brief overview of some of the main steps of dictionary compilation is presented here, supplemented by the original Czech data, analysed in their raw, though semiotically classified, form.

Keywords: MONOLINGUAL DICTIONARIES, CORPUS LEXICOGRAPHY, SYNTAGMATICS AND PARADIGMATICS IN DICTIONARIES, DICTIONARY ENTRY, TYPES OF LEMMAS, PRAGMATICS, TREATMENT OF

  2. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    Full Text Available We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.
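The low-level view described above rests on derived metrics computed from raw hardware-counter values. A minimal illustration with generic counter names (placeholders, not the actual event names or tooling of the OpenUH environment):

```python
# Illustrative only: deriving efficiency metrics from raw hardware-counter
# values, as in the low-level performance view described above. The counter
# names are generic placeholders, not any tool's actual event names.

def derived_metrics(counters):
    """Compute simple derived metrics from raw counter values."""
    return {
        "ipc": counters["instructions"] / counters["cycles"],
        "l1_miss_rate": counters["l1_misses"] / counters["loads"],
    }

sample = {"instructions": 8.0e9, "cycles": 1.0e10,
          "l1_misses": 2.0e8, "loads": 4.0e9}
metrics = derived_metrics(sample)  # ipc = 0.8, l1_miss_rate = 0.05
```

Limiting the number of raw counters that must be collected per run, then deriving such ratios offline, is one way to keep measurement overhead low, which is the trade-off the abstract emphasizes.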

  3. Evaluation of HAL/S language compilability using SAMSO's Compiler Writing System (CWS)

    Science.gov (United States)

    Feliciano, M.; Anderson, H. D.; Bond, J. W., III

    1976-01-01

    NASA/Langley is engaged in a program to develop an adaptable guidance and control software concept for spacecraft such as shuttle-launched payloads. It is envisioned that this flight software be written in a higher-order language, such as HAL/S, to facilitate changes or additions. To make this adaptable software transferable to various onboard computers, a compiler writing system capability is necessary. A joint program with the Air Force Space and Missile Systems Organization was initiated to determine if the Compiler Writing System (CWS) owned by the Air Force could be utilized for this purpose. The present study explores the feasibility of including the HAL/S language constructs in CWS and the effort required to implement these constructs. This will determine the compilability of HAL/S using CWS and permit NASA/Langley to identify the HAL/S constructs desired for their applications. The study consisted of comparing the implementation of the Space Programming Language using CWS with the requirements for the implementation of HAL/S. It is the conclusion of the study that CWS already contains many of the language features of HAL/S and that it can be expanded for compiling part or all of HAL/S. It is assumed that persons reading and evaluating this report have a basic familiarity with (1) the principles of compiler construction and operation, and (2) the logical structure and applications characteristics of HAL/S and SPL.

  4. Assessing Accessibility and Transport Infrastructure Inequities in Administrative Units in Serbia’s Danube Corridor Based on Multi-Criteria Analysis and Gis Mapping Tools

    Directory of Open Access Journals (Sweden)

    Ana VULEVIC

    2018-02-01

Full Text Available The Danube Regions, especially the sub-national units of governance, must be ready to play an active role in spatial development policies. A precondition for this is good accessibility and the coordinated development of all transport systems in the Danube corridor. The main contribution of this paper is to provide a multi-criteria model for potential decision making related to the evaluation of transportation accessibility in Serbia’s Danube Corridor. Geographic Information Systems (GIS)-based maps indicate the existing inequities in the counties’ transport infrastructure (between well-connected and isolated counties) in terms of accessibility to central places. Through the research, relevant indicators have been identified. This provides an outline of transportation perspectives regarding the development achieved and also fosters the increase of transportation accessibility in some peripheral Serbian Danube administrative units – counties (Nomenclature of Territorial Units for Statistics level 3 – NUTS 3).
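The weighted-sum style of multi-criteria evaluation can be sketched as follows; the indicator names, weights, and county values below are hypothetical, not the paper's actual indicators:

```python
# A minimal weighted-sum multi-criteria sketch for ranking administrative
# units by accessibility. Indicator names, weights, and county values are
# hypothetical, invented for illustration.

def mca_score(indicators, weights):
    """Weighted sum of normalized indicators (each in [0, 1], higher = better)."""
    return sum(weights[k] * indicators[k] for k in weights)

weights = {"road_density": 0.4, "rail_access": 0.3, "center_proximity": 0.3}

counties = {
    "County A": {"road_density": 0.8, "rail_access": 0.6, "center_proximity": 0.9},
    "County B": {"road_density": 0.3, "rail_access": 0.2, "center_proximity": 0.4},
}

# Best-connected county first; isolated counties sort to the bottom.
ranked = sorted(counties, key=lambda c: mca_score(counties[c], weights),
                reverse=True)
```

Joining such scores back to county polygons is then a straightforward GIS operation, which is how maps of well-connected versus isolated counties are typically produced.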

  5. Compilation of accident statistics in PSE

    International Nuclear Information System (INIS)

    Jobst, C.

    1983-04-01

The objective of the investigations on transportation carried out within the framework of the 'Project - Studies on Safety in Waste Management (PSE II)' is to determine the risk of accidents in the transportation of radioactive materials by rail. Fault tree analysis is used to determine risks in the transportation system. This method makes it possible to determine the frequency and consequences of accidents that could lead to an unintended release of radionuclides. The study presented compiles all data obtained from the accident statistics of the Federal German Railways. (orig./RB) [de
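Fault tree analysis combines basic-event probabilities through AND/OR gates. A toy sketch, assuming independent events; the gate structure and all probabilities are invented, not values from the Federal German Railways statistics:

```python
# Toy fault-tree evaluation assuming independent basic events. The gate
# structure and all probabilities are invented for illustration; they are
# not values from the Federal German Railways accident statistics.

def and_gate(probs):
    """AND gate: all input events must occur (independence assumed)."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(probs):
    """OR gate: at least one input event occurs (independence assumed)."""
    q = 1.0
    for x in probs:
        q *= (1.0 - x)
    return 1.0 - q

# Hypothetical top event: unintended release = (derailment OR collision)
# AND containment failure given an accident.
p_accident = or_gate([1e-4, 5e-5])
p_release = and_gate([p_accident, 1e-2])
```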

  6. HAL/S-360 compiler system specification

    Science.gov (United States)

    Johnson, A. E.; Newbold, P. N.; Schulenberg, C. W.; Avakian, A. E.; Varga, S.; Helmers, P. H.; Helmers, C. T., Jr.; Hotz, R. L.

    1974-01-01

    A three phase language compiler is described which produces IBM 360/370 compatible object modules and a set of simulation tables to aid in run time verification. A link edit step augments the standard OS linkage editor. A comprehensive run time system and library provide the HAL/S operating environment, error handling, a pseudo real time executive, and an extensive set of mathematical, conversion, I/O, and diagnostic routines. The specifications of the information flow and content for this system are also considered.

  7. Compiling the First Monolingual Lusoga Dictionary

    Directory of Open Access Journals (Sweden)

    Minah Nabirye

    2011-10-01

    Full Text Available

Abstract: In this research article a study is made of the approach followed to compile the first-ever monolingual dictionary for Lusoga. Lusoga is a Bantu language spoken in Uganda by slightly over two million people. Being an under-resourced language, the Lusoga orthography had to be designed, a grammar written, and a corpus built, before embarking on the compilation of the dictionary. This compilation was aimed at attaining an academic degree, hence requiring a rigorous research methodology. Firstly, the prevailing methods for compiling dictionaries were mainly practical and insufficient in explaining the theoretical linguistic basis for dictionary compilation. Since dictionaries are based on meaning, the theory of meaning was used to account for all linguistic data considered in dictionaries. However, meaning is considered at a very abstract level, far removed from the process of compiling dictionaries. Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular theory explains how the different modules of a language contribute information to the different parts of the dictionary article or dictionary information in general. Secondly, the research also had to contend with the different approaches for analysing Bantu languages for Bantu and European audiences. A description of the Bantu- and European-centred approaches to Bantu studies was undertaken in respect of (a) the classification of Lusoga words, and (b) the specification of their citations. As a result, Lusoga lexicography deviates from the prevailing Bantu classification and citation of nouns, adjectives and verbs in particular. The dictionary was tested on two separate occasions and all the feedback was considered in the compilation process. This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary.

  8. abc: An extensible AspectJ compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its frontend is built, using the Polyglot framework, as a modular extension of the Java language. The use of Polyglot gives flexibility of syntax and type checking. The backend is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general overview ...

  9. Compilation of actinide neutron nuclear data

    International Nuclear Information System (INIS)

    1979-01-01

The Swedish nuclear data committee has compiled a selected set of neutron cross section data for the 16 most important actinide isotopes. The aim of the report is to present the available data in a comprehensible way, to allow a comparison between different evaluated libraries, and to judge the reliability of these libraries against the experimental data. The data are given in graphical form below about 1 eV and above about 10 keV, while the 2200 m/s cross sections and resonance integrals are given in numerical form. (G.B.)

  10. Data compilation for particle impact desorption

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeuchi, Fujio.

    1984-05-01

The desorption of gases from solid surfaces by incident electrons, ions and photons is one of the important processes of hydrogen recycling in controlled thermonuclear reactors. We have surveyed the literature concerning particle impact desorption published through 1983 and compiled the data on desorption cross sections and desorption yields with the aid of a computer. This report presents the results obtained for electron stimulated desorption, the desorption cross sections and yields being given in graphs and tables as functions of incident electron energy, surface temperature and gas exposure. (author)

  11. Transportation legislative data base: State radioactive materials transportation statute compilation, 1989--1993

    International Nuclear Information System (INIS)

    1994-04-01

    The Transportation Legislative Data Base (TLDB) is a computer-based information service containing summaries of federal, state and certain local government statutes and regulations relating to the transportation of radioactive materials in the United States. The TLDB has been operated by the National Conference of State Legislatures (NCSL) under cooperative agreement with the US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management since 1992. The data base system serves the legislative and regulatory information needs of federal, state, tribal and local governments, the affected private sector and interested members of the general public. Users must be approved by DOE and NCSL. This report is a state statute compilation that updates the 1989 compilation produced by Battelle Memorial Institute, the previous manager of the data base. This compilation includes statutes not included in the prior compilation, as well as newly enacted laws. Statutes not included in the prior compilation show an enactment date prior to 1989. Statutes that deal with low-level radioactive waste transportation are included in the data base as are statutes from the states of Alaska and Hawaii. Over 155 new entries to the data base are summarized in this compilation

  12. Geologic and geophysical maps of the eastern three-fourths of the Cambria 30' x 60' quadrangle, central California Coast Ranges

    Science.gov (United States)

    Graymer, R.W.; Langenheim, V.E.; Roberts, M.A.; McDougall, Kristin

    2014-01-01

The Cambria 30' x 60' quadrangle comprises southwestern Monterey County and northwestern San Luis Obispo County. The land area includes rugged mountains of the Santa Lucia Range extending from the northwest to the southeast part of the map; the southern part of the Big Sur coast in the northwest; broad marine terraces along the southwest coast; and broad valleys, rolling hills, and modest mountains in the northeast. This report contains geologic, gravity anomaly, and aeromagnetic anomaly maps of the eastern three-fourths of the 1:100,000-scale Cambria quadrangle and the associated geologic and geophysical databases (ArcMap databases), as well as complete descriptions of the geologic map units and the structural relations in the mapped area. A cross section is based on both the geologic map and potential-field geophysical data. The maps are presented as an interactive, multilayer PDF, rather than more traditional pre-formatted map-sheet PDFs. Various geologic, geophysical, paleontological, and base map elements are placed on separate layers, which allows the user to combine elements interactively to create map views beyond the traditional map sheets. Four traditional map sheets (geologic map, gravity map, aeromagnetic map, paleontological locality map) are easily compiled by choosing the associated data layers or by choosing the desired map under Bookmarks.

  13. Performance of USGS one-year earthquake hazard map for natural and induced seismicity in the central and eastern United States

    Science.gov (United States)

    Brooks, E. M.; Stein, S.; Spencer, B. D.; Salditch, L.; Petersen, M. D.; McNamara, D. E.

    2017-12-01

Seismicity in the central United States has dramatically increased since 2008 due to the injection of wastewater produced by oil and gas extraction. In response, the USGS created a one-year probabilistic hazard model and map for 2016 to describe the increased hazard posed to the central and eastern United States. Using the intensity of shaking reported to the "Did You Feel It?" system during 2016, we assess the performance of this model. Assessing the performance of earthquake hazard maps for natural and induced seismicity is conceptually similar but has practical differences. Maps that have return periods of hundreds or thousands of years, as commonly used for natural seismicity, can be assessed using historical intensity data that also span hundreds or thousands of years. Several different features stand out when assessing the USGS 2016 seismic hazard model for the central and eastern United States from induced and natural earthquakes. First, the model can be assessed as a forecast in one year, because event rates are sufficiently high to permit evaluation with one year of data. Second, because these models are projections from the previous year, thus implicitly assuming that fluid injection rates remain the same, misfit may reflect changes in human activity. Our results suggest that the model was very successful by the metric implicit in probabilistic seismic hazard assessment: namely, that the fraction of sites at which the maximum shaking exceeded the mapped value is comparable to that expected. The model also did well by a misfit metric that compares the spatial patterns of predicted and maximum observed shaking. This was true both for the central and eastern United States as a whole, and for the region within it with the highest amount of seismicity, Oklahoma and its surrounding area. The model performed least well in northern Texas, over-stating hazard, presumably because lower oil and gas prices and regulatory action reduced the water injection volume.
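The exceedance metric implicit in probabilistic hazard assessment can be sketched with synthetic numbers (the values below are invented, not "Did You Feel It?" data):

```python
# Sketch of the exceedance metric described above: the fraction of sites
# whose observed maximum shaking exceeded the mapped value, to be compared
# with the fraction expected for the map's probability level. All values
# below are synthetic, invented for illustration.

def exceedance_fraction(observed_max, mapped_value):
    """Fraction of sites where observed maximum shaking exceeds the mapped value."""
    pairs = list(zip(observed_max, mapped_value))
    return sum(obs > mapped for obs, mapped in pairs) / len(pairs)

observed = [0.12, 0.30, 0.05, 0.22]   # e.g. peak ground acceleration, in g
mapped   = [0.20, 0.25, 0.10, 0.25]
frac = exceedance_fraction(observed, mapped)  # 0.25: one of four sites exceeded
```

A map performs well by this metric when `frac` is close to the fraction expected from the map's stated exceedance probability.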

  14. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point.

  15. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    Full Text Available There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with equational programming language (EPL. Our approach is based on a program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by the source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by the compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  16. Regular expressions compiler and some applications

    International Nuclear Information System (INIS)

    Saldana A, H.

    1978-01-01

We deal with the high-level programming language of a Regular Expressions Compiler (REC). The first chapter is an introduction in which the history of the REC development and the problems related to its numerous applications are described. The syntactic and semantic rules as well as the language features are discussed just after the introduction. Concerning the applications, examples are given of an adaptation for solving numerical problems and another for data manipulation. The last chapter is an exposition of ideas and techniques about the compiler construction. Examples of the adaptation to numerical problems show the applications to education, vector analysis, quantum mechanics, physics, mathematics and other sciences. The rudiments of an operating system for a minicomputer are the examples of the adaptation to symbolic data manipulation. REC is a programming language that could be applied to solve problems in almost any human activity. Handling of computer graphics, control equipment, research on languages, microprocessors and general research are some of the fields in which this programming language can be applied and developed. (author)
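No REC source is reproduced in the abstract; as a general illustration of the compile-once, apply-many idea behind any regular-expressions compiler, Python's standard `re` module can stand in:

```python
# General illustration of regular-expression compilation (Python's standard
# re module, not the REC language itself): the pattern is compiled once into
# an executable object, then applied repeatedly.

import re

# Compile once; the resulting object is the "executable" form of the pattern.
number = re.compile(r"[+-]?\d+(\.\d+)?")

def extract_numbers(text):
    """Return all signed integer or decimal literals found in text."""
    return [m.group(0) for m in number.finditer(text)]

values = extract_numbers("x = -3.5, y = 42")  # ["-3.5", "42"]
```

Separating compilation from application is the core design choice of such systems: the costly analysis of the pattern is paid once, and matching is then fast on every input.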

  17. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point.

  18. Digital geologic map of the Thirsty Canyon NW quadrangle, Nye County, Nevada

    Science.gov (United States)

    Minor, S.A.; Orkild, P.P.; Sargent, K.A.; Warren, R.G.; Sawyer, D.A.; Workman, J.B.

    1998-01-01

    This digital geologic map compilation presents new polygon (i.e., geologic map unit contacts), line (i.e., fault, fold axis, dike, and caldera wall), and point (i.e., structural attitude) vector data for the Thirsty Canyon NW 7 1/2' quadrangle in southern Nevada. The map database, which is at 1:24,000-scale resolution, provides geologic coverage of an area of current hydrogeologic and tectonic interest. The Thirsty Canyon NW quadrangle is located in southern Nye County about 20 km west of the Nevada Test Site (NTS) and 30 km north of the town of Beatty. The map area is underlain by extensive layers of Neogene (about 14 to 4.5 million years old [Ma]) mafic and silicic volcanic rocks that are temporally and spatially associated with transtensional tectonic deformation. Mapped volcanic features include part of a late Miocene (about 9.2 Ma) collapse caldera, a Pliocene (about 4.5 Ma) shield volcano, and two Pleistocene (about 0.3 Ma) cinder cones. Also documented are numerous normal, oblique-slip, and strike-slip faults that reflect regional transtensional deformation along the southern part of the Walker Lane belt. The Thirsty Canyon NW map provides new geologic information for modeling groundwater flow paths that may enter the map area from underground nuclear testing areas located in the NTS about 25 km to the east. The geologic map database comprises six component ArcINFO map coverages that can be accessed after decompressing and unbundling the data archive file (tcnw.tar.gz). These six coverages (tcnwpoly, tcnwflt, tcnwfold, tcnwdike, tcnwcald, and tcnwatt) are formatted here in ArcINFO EXPORT format. Bundled with this database are two PDF files for readily viewing and printing the map, accessory graphics, and a description of map units and compilation methods.

  19. Compilation of functional languages using flow graph analysis

    NARCIS (Netherlands)

    Hartel, Pieter H.; Glaser, Hugh; Wild, John M.

    A system based on the notion of a flow graph is used to specify formally and to implement a compiler for a lazy functional language. The compiler takes a simple functional language as input and generates C. The generated C program can then be compiled, and loaded with an extensive run-time system to

  20. Construction experiences from underground works at Oskarshamn. Compilation report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders (Vattenfall Power Consultant AB, Stockholm (SE)); Christiansson, Rolf (Swedish Nuclear Fuel and Waste Management Co., Stockholm (SE))

    2007-12-15

The main objective of this report is to compile experiences from the underground works carried out at Oskarshamn, primarily construction experiences from the tunnelling of the cooling water tunnels of the Oskarshamn nuclear power units 1, 2 and 3, from the underground excavations of Clab 1 and 2 (Central Interim Storage Facility for Spent Nuclear Fuel), and the Aespoe Hard Rock Laboratory. In addition, an account is given of the operational experience of Clab 1 and 2 and of the Aespoe HRL, primarily regarding scaling and rock support solutions. This report, being a compilation report, is in substance based on earlier published material as presented in the list of references. Approximately 8,000 m of tunnels, including three major rock caverns with a total volume of about 550,000 m3, have been excavated. The excavation works of the various tunnels and rock caverns were carried out during the period 1966-2000. In addition, minor excavation works were carried out at the Aespoe HRL in 2003. The depth location of the underground structures varies from near surface down to 450 m. As an overall conclusion it may be said that the rock mass conditions in the area are well suited for underground construction. This conclusion is supported by the experiences from the rock excavation works in the Simpevarp and Aespoe area. These works have shown that no major problems occurred during the excavation works; nor have any stability or other rock engineering problems of significance been identified after the commissioning of the Oskarshamn nuclear power units O1, O2 and O3, BFA, Clab 1 and 2, and the Aespoe Hard Rock Laboratory. The underground structures of these facilities were built according to plan and have since then been operated as planned. Thus, the quality of the rock mass within the construction area is such that it lends itself to excavation of large rock caverns with a minimum of rock support.

  1. Construction experiences from underground works at Oskarshamn. Compilation report

    International Nuclear Information System (INIS)

    Carlsson, Anders; Christiansson, Rolf

    2007-12-01

The main objective of this report is to compile experiences from the underground works carried out at Oskarshamn, primarily construction experiences from the tunnelling of the cooling water tunnels of the Oskarshamn nuclear power units 1, 2 and 3, from the underground excavations of Clab 1 and 2 (Central Interim Storage Facility for Spent Nuclear Fuel), and the Aespoe Hard Rock Laboratory. In addition, an account is given of the operational experience of Clab 1 and 2 and of the Aespoe HRL, primarily regarding scaling and rock support solutions. This report, being a compilation report, is in substance based on earlier published material as presented in the list of references. Approximately 8,000 m of tunnels, including three major rock caverns with a total volume of about 550,000 m3, have been excavated. The excavation works of the various tunnels and rock caverns were carried out during the period 1966-2000. In addition, minor excavation works were carried out at the Aespoe HRL in 2003. The depth location of the underground structures varies from near surface down to 450 m. As an overall conclusion it may be said that the rock mass conditions in the area are well suited for underground construction. This conclusion is supported by the experiences from the rock excavation works in the Simpevarp and Aespoe area. These works have shown that no major problems occurred during the excavation works; nor have any stability or other rock engineering problems of significance been identified after the commissioning of the Oskarshamn nuclear power units O1, O2 and O3, BFA, Clab 1 and 2, and the Aespoe Hard Rock Laboratory. The underground structures of these facilities were built according to plan and have since then been operated as planned. Thus, the quality of the rock mass within the construction area is such that it lends itself to excavation of large rock caverns with a minimum of rock support.

  2. Rubus: A compiler for seamless and extensible parallelism.

    Directory of Open Access Journals (Sweden)

    Muhammad Adnan

    Full Text Available Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special purpose processing unit called Graphic Processing Unit (GPU, originally designed for 2D/3D games, is now available for general purpose use in computers and mobile devices. However, the traditional programming languages which were designed to work with machines having single core CPUs, cannot utilize the parallelism available on multi-core processors efficiently. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages like CUDA and OpenCL have been introduced, which can be used to write code with parallelism. The main shortcoming of these languages is that programmer needs to specify all the complex details manually in order to parallelize the code across multiple cores. Therefore, the code written in these languages is difficult to understand, debug and maintain. Furthermore, to parallelize legacy code can require rewriting a significant portion of code in CUDA or OpenCL, which can consume significant time and resources. Thus, the amount of parallelism achieved is proportional to the skills of the programmer and the time spent in code optimizations. This paper proposes a new open source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details. It analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedup and better utilization of the underlying hardware without a programmer's expertise in parallel programming. For five different benchmarks, on average a speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores. 
For a matrix multiplication benchmark, the average execution speedup was 84 times.
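The manual-parallelization burden described above can be illustrated in miniature. Below is a hedged Python sketch (Rubus itself operates on Java programs and targets GPUs): the sequential loop is what the programmer writes, and the threaded version stands in for the explicitly parallel code that, without a compiler like Rubus, would have to be written and maintained by hand.

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

def sequential_map(xs):
    # What the programmer writes: a plain sequential loop.
    return [square(x) for x in xs]

def parallel_map(xs, workers=4):
    # A stand-in for what an auto-parallelizing compiler emits:
    # the same loop body, distributed over a pool of workers,
    # with results gathered back in the original order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(square, xs))

assert sequential_map(range(8)) == parallel_map(range(8))
```

The point mirrored from the abstract: both versions compute the same result, but only the first keeps the source readable; Rubus aims to derive the second form from the first automatically.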

  3. An Alternative Approach to Mapping Thermophysical Units from Martian Thermal Inertia and Albedo Data Using a Combination of Unsupervised Classification Techniques

    Directory of Open Access Journals (Sweden)

    Eriita Jones

    2014-06-01

    Full Text Available Thermal inertia and albedo provide information on the distribution of surface materials on Mars. These parameters have been mapped globally on Mars by the Thermal Emission Spectrometer (TES) onboard the Mars Global Surveyor. Two-dimensional clusters of thermal inertia and albedo reflect the thermophysical attributes of the dominant materials on the surface. In this paper three automated, non-deterministic, algorithmic classification methods are employed for defining thermophysical units: Expectation Maximisation of a Gaussian Mixture Model; the Iterative Self-Organizing Data Analysis Technique (ISODATA); and Maximum Likelihood. We analyse the behaviour of the thermophysical classes resulting from the three classifiers, operating on the 2007 TES thermal inertia and albedo datasets. Producing a rigorous mapping of thermophysical classes at ~3 km/pixel resolution remains important for constraining the geologic processes that have shaped the Martian surface on a regional scale, and for choosing appropriate landing sites. The results from applying these algorithms are compared to geologic maps, surface data from lander missions, features derived from imaging, and previous classifications of thermophysical units which utilized manual (and potentially more time-consuming) classification methods. These comparisons comprise data suitable for validation of our classifications. Our work shows that a combination of the algorithms—ISODATA and Maximum Likelihood—optimises the sensitivity to the underlying dataspace, and that new information on Martian surface materials can be obtained by using these methods. We demonstrate that the algorithms used here can be applied to define a finer partitioning of albedo and thermal inertia for a more detailed mapping of surface materials, grain sizes and thermal behaviour of the Martian surface and shallow subsurface, at the ~3 km scale.
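As a hedged illustration of the unsupervised step these classifiers share, the sketch below runs a minimal k-means (the iterative core of ISODATA, minus its split/merge heuristics) on synthetic, invented (thermal inertia, albedo) samples for two notional surface units; none of the numbers come from TES.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic (thermal inertia, albedo) samples for two invented units:
# bright dust (low inertia, high albedo) and rock (high inertia, low albedo).
dust = rng.normal([50.0, 0.27], [10.0, 0.02], size=(100, 2))
rock = rng.normal([400.0, 0.12], [40.0, 0.02], size=(100, 2))
X = np.vstack([dust, rock])
Z = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize so both bands count equally

def kmeans(Z, k=2, iters=25):
    # Plain Lloyd's iterations: assign to nearest center, recompute means.
    centers = Z[rng.choice(len(Z), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((Z[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        centers = np.array([Z[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(Z)  # each point is tagged with a thermophysical cluster id
```

With well-separated synthetic units the two clusters are recovered cleanly; on real TES data the class boundaries, and the ISODATA split/merge rules, do the actual work.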

  4. Shear-wave velocity compilation for Northridge strong-motion recording sites

    Science.gov (United States)

    Borcherdt, Roger D.; Fumal, Thomas E.

    2002-01-01

    Borehole and other geotechnical information collected at the strong-motion recording sites of the Northridge earthquake of January 17, 1994 provide an important new basis for the characterization of local site conditions. These geotechnical data, when combined with analysis of strong-motion recordings, provide an empirical basis to evaluate site coefficients used in current versions of US building codes. Shear-wave-velocity estimates to a depth of 30 meters are derived for 176 strong-motion recording sites. The estimates are based on borehole shear-velocity logs, physical property logs, correlations with physical properties and digital geologic maps. Surface-wave velocity measurements and standard penetration data are compiled as additional constraints. These data as compiled from a variety of databases are presented via GIS maps and corresponding tables to facilitate use by other investigators.
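The 30-meter estimates referred to here are conventionally time-averaged velocities (total depth divided by vertical travel time), not arithmetic means of layer velocities. A minimal Python sketch with a purely hypothetical layered profile:

```python
def vs30(layers):
    """Time-averaged shear-wave velocity over the top 30 m.

    layers: (thickness_m, velocity_m_per_s) tuples, surface downward.
    The profile used below is hypothetical, not from the Northridge dataset.
    """
    depth, travel_time = 0.0, 0.0
    for thickness, velocity in layers:
        use = min(thickness, 30.0 - depth)  # only material above 30 m counts
        if use <= 0.0:
            break
        travel_time += use / velocity
        depth += use
    return depth / travel_time

# 5 m of soil, 10 m of alluvium, thick rock below:
v = vs30([(5, 200), (10, 400), (100, 760)])  # ≈ 430.2 m/s
```

Note that slow near-surface layers dominate the result, which is why the time average, rather than a thickness-weighted mean, is used for site coefficients.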

  5. Geologic map of the Hasty Quadrangle, Boone and Newton Counties, Arkansas

    Science.gov (United States)

    Hudson, Mark R.; Murray, Kyle E.

    2004-01-01

    This digital geologic map compilation presents new polygon (for example, geologic map unit contacts), line (for example, fault, fold axis, and structure contour), and point (for example, structural attitude, contact elevations) vector data for the Hasty 7.5-minute quadrangle in northern Arkansas. The map database, which is at 1:24,000-scale resolution, provides geologic coverage of an area of current hydrogeologic, tectonic, and stratigraphic interest. The Hasty quadrangle is located in northern Newton and southern Boone Counties about 20 km south of the town of Harrison. The map area is underlain by sedimentary rocks of Ordovician, Mississippian, and Pennsylvanian age that were mildly deformed by a series of normal and strike-slip faults and folds. The area is representative of the stratigraphic and structural setting of the southern Ozark Dome. The Hasty quadrangle map provides new geologic information for better understanding groundwater flow paths in and adjacent to the Buffalo River watershed.

  6. WHO GLOBAL TUBERCULOSIS REPORTS: COMPILATION AND INTERPRETATION

    Directory of Open Access Journals (Sweden)

    I. A. Vasilyeva

    2017-01-01

    Full Text Available The purpose of the article is to inform national specialists involved in tuberculosis control about the methods used to compile WHO global tuberculosis statistics, which inform the development of strategies and programmes for tuberculosis control and the evaluation of their efficiency. The article explains in detail the main WHO epidemiological rates used in international publications on tuberculosis, along with their registered values and the new approaches to compiling the lists of countries with the highest burden of tuberculosis, drug-resistant tuberculosis, and tuberculosis with concurrent HIV infection. The article compares the rates in the Russian Federation with global data as well as data from countries within the WHO European Region and countries with the highest TB burden. It presents materials on the achievement of global goals in tuberculosis control and the main provisions of the WHO End TB Strategy for 2015-2035, adopted as part of the UN Sustainable Development Goals.

  7. JLAPACK – Compiling LAPACK FORTRAN to Java

    Directory of Open Access Journals (Sweden)

    David M. Doolin

    1999-01-01

    Full Text Available The JLAPACK project provides the LAPACK numerical subroutines translated from their subset Fortran 77 source into class files, executable by the Java Virtual Machine (JVM) and suitable for use by Java programmers. This makes it possible for Java applications or applets, distributed on the World Wide Web (WWW), to use established legacy numerical code that was originally written in Fortran. The translation is accomplished using a special-purpose Fortran-to-Java (source-to-source) compiler. The LAPACK API will be considerably simplified to take advantage of Java's object-oriented design. This report describes the research issues involved in the JLAPACK project, and its current implementation and status.

  8. Molecular dynamics and diffusion a compilation

    CERN Document Server

    Fisher, David

    2013-01-01

    The molecular dynamics technique was developed in the 1960s as the outgrowth of attempts to model complicated systems by using either (a) direct physical simulation or, following the great success of Monte Carlo methods, (b) computer techniques. Computer simulation soon won out over clumsy physical simulation, and the ever-increasing speed and sophistication of computers has naturally made molecular dynamics simulation into a more and more successful technique. One of its most popular applications is the study of diffusion, and some experts now even claim that molecular dynamics simulation is, in the case of situations involving well-characterised elements and structures, more accurate than experimental measurement. The present double volume includes a compilation (over 600 items) of predicted solid-state diffusion data, for all of the major materials groups, dating back nearly four decades. The double volume also includes some original papers: "Determination of the Activation Energy for Formation and ...
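The diffusion predictions compiled here ultimately rest on the Einstein relation, MSD(t) = 2dDt in d dimensions. A hedged, self-contained Python sketch estimating D for an idealized 1-D random walk (unit steps at unit time intervals, so the true value is 0.5):

```python
import random

random.seed(42)

def msd_1d(n_walkers=2000, n_steps=200):
    # Mean squared displacement of unbiased, unit-step random walkers.
    total = 0.0
    for _ in range(n_walkers):
        x = 0
        for _ in range(n_steps):
            x += random.choice((-1, 1))
        total += x * x
    return total / n_walkers

# Einstein relation in one dimension: MSD = 2 * D * t,
# so the estimated diffusion coefficient should be near 0.5.
D = msd_1d() / (2 * 200)
```

Molecular dynamics codes do exactly this bookkeeping, only with atomic trajectories from integrated equations of motion rather than coin-flip steps.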

  9. Compilation of Existing Neutron Screen Technology

    Directory of Open Access Journals (Sweden)

    N. Chrysanthopoulou

    2014-01-01

    Full Text Available The presence of fast neutron spectra in new reactors is expected to have a strong impact on the contained materials, including structural materials, nuclear fuels, neutron-reflecting materials, and tritium-breeding materials. Therefore, introduction of these reactors into operation will require extensive testing of their components, which must be performed under neutronic conditions representative of those expected to prevail inside the reactor cores when in operation. Due to the limited availability of fast reactors, testing of future reactor materials will mostly take place in water-cooled material test reactors (MTRs) by tailoring the neutron spectrum via neutron screens. The latter rely on the utilization of materials capable of absorbing neutrons at specific energies. A large but fragmented body of experience is available on this topic. In this work a comprehensive compilation of the existing neutron screen technology is attempted, focusing on neutron screens developed to locally enhance the fast-to-thermal neutron flux ratio in a reactor core.
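The screening principle reduces to exponential beam attenuation, I/I0 = exp(−Σx), applied very differently to thermal and fast neutrons. The Python sketch below uses illustrative, made-up macroscopic cross sections for a cadmium-like absorber; real values must come from nuclear data tables.

```python
import math

def transmitted_fraction(sigma_per_cm, thickness_cm):
    # Simple exponential beam attenuation: I/I0 = exp(-sigma * x).
    return math.exp(-sigma_per_cm * thickness_cm)

# Illustrative (not tabulated) macroscopic cross sections for a 1 mm
# cadmium-like screen: strong thermal absorption, weak fast absorption.
thermal = transmitted_fraction(115.0, 0.1)  # thermal neutrons: mostly absorbed
fast = transmitted_fraction(0.05, 0.1)      # fast neutrons: mostly transmitted

# Behind the screen, the fast-to-thermal flux ratio is boosted by roughly:
enhancement = fast / thermal
```

Even this toy calculation shows why a thin absorber with a strong thermal resonance can raise the fast-to-thermal ratio by orders of magnitude.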

  10. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1983-01-01

    The purpose of this compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. It is the result of a biennial review in which the Department of Energy (DOE) and contractors, Department of Defense laboratories and contractors, and other interested groups have been asked to review and revise their requests for nuclear data. It was felt that the evaluators of cross-section data and the users of these evaluations should be involved in the review of the data requests to make this compilation more useful. This request list is ordered by target nucleus (Isotope) and then reaction type (Quantity). Each request is assigned a unique identifying number. The first two digits of this number give the year the request was initiated. All requests for a given Isotope and Quantity are grouped (or blocked) together. The requests in a block are followed by any status comments. Each request has a unique Isotope, Quantity and Requester. The requester is identified by laboratory, last name, and sponsoring US government agency, e.g., BET, DEI, DNR. All requesters, together with their addresses and phone numbers, are given in appendix B. A list of the evaluators responsible for ENDF/B-V evaluations with their affiliations appears in appendix C. All requests must give the energy (or range of energy) for the incident particle when appropriate. The accuracy needed in percent is also given. The error quoted is assumed to be 1-sigma at each measured point in the energy range requested unless a comment specifies otherwise. Sometimes a range of accuracy indicated by two values is given, or some statement is given in the free-text comments. An incident-particle energy resolution in percent is sometimes given.
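The record layout described (Isotope, Quantity, Requester, and an identifying number whose first two digits encode the year) can be sketched as a small data structure. The field values below are hypothetical, not taken from the actual request list, and the 1900 century base is an assumption appropriate to the document's era.

```python
from dataclasses import dataclass

@dataclass
class DataRequest:
    request_id: str           # first two digits encode the initiation year
    isotope: str              # target nucleus
    quantity: str             # reaction type
    requester: str            # laboratory / sponsoring-agency code
    energy_range_mev: tuple   # incident-particle energy range
    accuracy_pct: float       # requested 1-sigma accuracy

    def year_initiated(self):
        # Two-digit year prefix; 1900 base is an assumption for this era.
        return 1900 + int(self.request_id[:2])

# Hypothetical record, not an entry from the real list:
req = DataRequest("83017", "U-235", "(n,f)", "BET", (0.1, 20.0), 5.0)
```

Grouping such records by (isotope, quantity) reproduces the "blocked" ordering the compilation describes.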

  11. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    Weston, L.W.; Larson, D.C.

    1993-02-01

    This compilation represents the current needs for nuclear data measurements and evaluations as expressed by interested fission and fusion reactor designers, medical users of nuclear data, nuclear data evaluators, CSEWG members and other interested parties. The requests and justifications are reviewed by the Data Request and Status Subcommittee of CSEWG as well as most of the general CSEWG membership. The basic format and computer programs for the Request List were produced by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory. The NNDC produced the Request List for many years. The Request List is compiled from a computerized data file. Each request has a unique isotope, reaction type, requestor and identifying number. The first two digits of the identifying number are the year in which the request was initiated. Every effort has been made to restrict the notations to those used in common nuclear physics textbooks. Most requests are for individual isotopes as are most ENDF evaluations, however, there are some requests for elemental measurements. Each request gives a priority rating which will be discussed in Section 2, the neutron energy range for which the request is made, the accuracy requested in terms of one standard deviation, and the requested energy resolution in terms of one standard deviation. Also given is the requestor with the comments which were furnished with the request. The addresses and telephone numbers of the requestors are given in Appendix 1. ENDF evaluators who may be contacted concerning evaluations are given in Appendix 2. Experimentalists contemplating making one of the requested measurements are encouraged to contact both the requestor and evaluator who may provide valuable information. This is a working document in that it will change with time. New requests or comments may be submitted to the editors or a regular CSEWG member at any time

  12. Compiler issues associated with safety-related software

    International Nuclear Information System (INIS)

    Feinauer, L.R.

    1991-01-01

    A critical issue in the quality assurance of safety-related software is the ability of the software to produce identical results, independent of the host machine, operating system, or compiler version under which the software is installed. A study is performed using the VIPRE-01, FREY-01, and RETRAN-02 safety-related codes. Results from an IBM 3083 computer are compared with results from a CYBER 860 computer. All three of the computer programs examined are written in FORTRAN; the VIPRE code uses the FORTRAN 66 compiler, whereas the FREY and RETRAN codes use the FORTRAN 77 compiler. Various compiler options are studied to determine their effect on the output between machines. Since the Control Data Corporation and IBM machines inherently represent numerical data differently, methods of producing equivalent accuracy of data representation were an important focus of the study. This paper identifies particular problems in the automatic double-precision option (AUTODBL) of the IBM FORTRAN 1.4.x series of compilers. The IBM FORTRAN version 2 compilers provide much more stable, reliable compilation for engineering software. Careful selection of compilers and compiler options can help guarantee identical results between different machines. To ensure reproducibility of results, the same compiler and compiler options should be used to install the program as were used in the development and testing of the program.
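The cross-machine discrepancies described here stem largely from finite precision and evaluation order. A small Python/NumPy sketch of the same effect, summing identical data in single versus double precision:

```python
import numpy as np

# Identical data, summed with different precisions, as a stand-in for
# the machine- and compiler-dependent arithmetic discussed above.
values = np.array([1e8, 1.0, -1e8] * 1000, dtype=np.float32)

single_fwd = np.float32(0.0)
for v in values:
    single_fwd += v  # 32-bit running sum: 1e8 + 1 rounds back to 1e8

double = float(np.sum(values.astype(np.float64)))  # 64-bit: exact here

# single_fwd ends at 0.0, silently dropping every one of the 1.0s;
# the double-precision sum recovers the true total of 1000.0.
```

An option like AUTODBL changes exactly this kind of behavior, which is why compiler and options must match between development and installation.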

  13. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-03-01

    This is the second issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations. It excludes references to "mass-chain" evaluations normally published in the "Nuclear Data Sheets" and "Nuclear Physics". The material contained in this compilation is sorted according to eight subject categories: general compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes: half-lives, energies and spectra; nuclear decay processes: gamma-rays; nuclear decay processes: fission products; nuclear decay processes: (others); atomic processes.

  14. Seismic activity maps for the Armenian Highlands

    Energy Technology Data Exchange (ETDEWEB)

    Karapetyan, N.K.; Manukyan, Zh.O.

    1976-01-01

    Seismic activity maps for the periods 1952 to 1967 and 1952 to 1968 were compiled for the Armenian Highlands in order to study the spatial distribution of earthquake recurrence and to construct maps in isolines of seismic activity. Diagrams are presented illustrating such seismic activity maps for the indicated periods. 4 references, 3 figures, 1 table.

  15. Application of a GIS-Based Slope Unit Method for Landslide Susceptibility Mapping along the Longzi River, Southeastern Tibetan Plateau, China

    Directory of Open Access Journals (Sweden)

    Fei Wang

    2017-06-01

    Full Text Available The Longzi River Basin in Tibet is located along the edge of the Himalaya Mountains and is characterized by complex geological conditions and numerous landslides. To evaluate the landslide susceptibility of this area, eight basic factors were analyzed comprehensively to obtain a final susceptibility map: slope angle, slope aspect, plan curvature, distance-to-fault, distance-to-river, topographic relief, annual precipitation, and lithology. Except for the rainfall factor, which was extracted from the grid cell, all the factors were extracted and classified by the slope unit, the basic unit of geological disaster development. The eight factors were superimposed using the information content method (ICM), and the weight of each factor was acquired through an analytic hierarchy process (AHP). Landslide susceptibility was divided into four categories: low, moderate, high, and very high, respectively accounting for 22.76%, 38.64%, 27.51%, and 11.09% of the study area. The accuracies, measured as the area under the curve (AUC), are 82.6% using slope units and 84.2% using grid cells, which means that both methods are accurate in predicting landslide occurrence. The results show that the high and very high susceptibility areas are distributed throughout the vicinity of the river, with a large component in the north as well as a small portion in the middle and the south. Therefore, it is necessary to conduct landslide warnings in these areas, where the rivers are vast and the population is dense. The susceptibility map reflects the comprehensive risk of each slope unit, which provides an important reference for later detailed investigations, including research and warning studies.
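The AHP weighting step can be sketched in a few lines. The pairwise judgments below are hypothetical, and the geometric-mean-of-rows method is a standard approximation to the principal eigenvector of the comparison matrix:

```python
import numpy as np

# Hypothetical AHP pairwise comparison matrix for three factors
# (e.g. lithology, distance-to-river, slope angle): A[i, j] states how
# much more important factor i is judged to be than factor j.
A = np.array([
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   3.0],
    [1/5.0, 1/3.0, 1.0],
])

# Geometric-mean-of-rows approximation to the principal eigenvector.
gm = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()  # normalized factor weights, summing to 1
```

In the study's workflow these weights scale the per-factor information values from the ICM before the layers are summed into the susceptibility score.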

  16. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    Science.gov (United States)

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

    Spatial information on physical soil properties is in strong demand in order to support environment-related and land-use management decisions. One of the most widely used properties to characterize soils physically is particle size distribution (PSD), which determines soil water management and cultivability. According to their size, different particles can be categorized as clay, silt, or sand. The size intervals are defined by national or international textural classification systems. The relative percentages of sand, silt, and clay in the soil constitute the textural class, which is likewise specified differently in various national and/or specialty systems. The most commonly used is the classification system of the United States Department of Agriculture (USDA). Soil texture information is essential input data in meteorological, hydrological and agricultural prediction modelling. Although Hungary has a great deal of legacy soil maps and other relevant soil information, it often occurs that maps do not exist for a certain characteristic with the required thematic and/or spatial representation. The recent developments in digital soil mapping (DSM), however, provide wide opportunities for the elaboration of object-specific soil maps (OSSM) with predefined parameters (resolution, accuracy, reliability etc.). Due to the simultaneous richness of available Hungarian legacy soil data, spatial inference methods and auxiliary environmental information, there is a high versatility of possible approaches for the compilation of a given soil map. This suggests the opportunity of optimization. For the creation of an OSSM one might intend to identify the optimum set of soil data, method and auxiliary co-variables optimized for the resources (data costs, computation requirements etc.). We started a comprehensive analysis of the effects of the various DSM components on the accuracy of the output maps on pilot areas. The aim of this study is to compare and evaluate different
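As a toy illustration of deriving a textural class from a PSD, here is a Python sketch; the thresholds are a simplified stand-in for the full USDA texture triangle, not the official class boundaries.

```python
def texture_class(sand, silt, clay):
    """Toy textural classifier from a particle size distribution.

    The thresholds below are a simplified stand-in for the USDA
    texture triangle, not the official class boundaries.
    """
    assert abs(sand + silt + clay - 100.0) < 1e-6, "fractions must sum to 100%"
    if clay >= 40.0:
        return "clay"
    if sand >= 85.0:
        return "sand"
    if silt >= 80.0:
        return "silt"
    return "loam-like"
```

A real DSM pipeline applies the full triangle lookup per grid cell to the predicted sand/silt/clay surfaces.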

  17. Mapping of lithologic and structural units using multispectral imagery. [Afar-Triangle/Ethiopia and adjacent areas (Ethiopian Plateau, Somali Plateau, and parts of Yemen and Saudi Arabia)

    Science.gov (United States)

    Kronberg, P. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. ERTS-1 MSS imagery covering the Afar Triangle/Ethiopia and adjacent regions (Ethiopian Plateau, Somali Plateau, and parts of Yemen and Saudi Arabia) was applied to the mapping of lithologic and structural units of the test area at a scale of 1:1,000,000. Results of the geological evaluation of the ERTS-1 imagery of the Afar have proven the usefulness of this type of satellite data for regional geological mapping. Evaluation of the ERTS images also resulted in new aspects of the structural setting and tectonic development of the Afar Triangle, where three large rift systems, the oceanic rifts of the Red Sea and Gulf of Aden and the continental East African rift system, seem to meet each other. Surface structures mapped by ERTS do not indicate that the oceanic rift of the Gulf of Aden (Sheba Ridge) continues into the area of continental crust west of the Gulf of Tadjura. ERTS data show that the Wonji fault belt of the African rift system does not enter or cut through the central Afar. The Aysha Horst is not a horst but an autochthonous spur of the Somali Plateau.

  18. Evaluation of Electromagnetic Induction to Characterize and Map Sodium-Affected Soils in the Northern Great Plains of the United States

    Science.gov (United States)

    Brevik, E. C.; Heilig, J.; Kempenich, J.; Doolittle, J.; Ulmer, M.

    2012-04-01

    Sodium-affected soils (SAS) cover over 4 million hectares in the Northern Great Plains of the United States. Improving the classification, interpretation, and mapping of SAS is a major goal of the United States Department of Agriculture-Natural Resources Conservation Service (USDA-NRCS) as Northern Great Plains soil surveys are updated. Apparent electrical conductivity (ECa) as measured with ground conductivity meters has shown promise for mapping SAS; however, the use of this geophysical tool needs additional evaluation. This study used an EM-38 MK2-2 meter (Geonics Limited, Mississauga, Ontario), a Trimble AgGPS 114 L-band DGPS (Trimble, Sunnyvale, CA) and the RTmap38MK2 program (Geomar Software, Inc., Mississauga, Ontario) on an Allegro CX field computer (Juniper Systems, North Logan, UT) to collect, observe, and interpret ECa data in the field. The ECa map generated on-site was then used to guide the collection of soil samples for characterization and to evaluate the influence of soil properties in SAS on ECa as measured with the EM-38 MK2-2. Stochastic models contained in the ESAP software package were used to estimate SAR and salinity levels from the measured ECa data in 30 cm depth intervals to a depth of 90 cm and for the bulk soil (0 to 90 cm). This technique showed promise, with meaningful spatial patterns apparent in the ECa data. However, many of the stochastic models used for salinity and SAR for individual depth intervals and for the bulk soil had low R-squared values. At both sites, significant variability in soil clay and water contents, along with the small number of soil samples taken to calibrate the ECa values to soil properties, likely contributed to these low R-squared values.
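The calibration step behind those R-squared values is, at its simplest, a least-squares fit of a lab-measured property against ECa. A hedged Python sketch on synthetic, invented paired data (the coefficients and noise level are arbitrary, not from this study):

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented paired field data: apparent conductivity (ECa, mS/m) and
# lab-measured sodium adsorption ratio (SAR) at the same sample points.
eca = rng.uniform(10.0, 120.0, size=40)
sar = 0.15 * eca + 2.0 + rng.normal(0.0, 1.0, size=40)  # synthetic truth + noise

# Least-squares calibration line and its coefficient of determination.
slope, intercept = np.polyfit(eca, sar, 1)
pred = slope * eca + intercept
r_squared = 1.0 - np.sum((sar - pred) ** 2) / np.sum((sar - sar.mean()) ** 2)
```

With few calibration samples and extra variance from clay and water content, the same fit on field data can easily yield the low R-squared values the study reports.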

  19. Shape indexes for semi-automated detection of windbreaks in thematic tree cover maps from the central United States

    Science.gov (United States)

    Greg C. Liknes; Dacia M. Meneguzzo; Todd A. Kellerman

    2017-01-01

    Windbreaks are an important ecological resource across the large expanse of agricultural land in the central United States and are often planted in straight-line or L-shaped configurations to serve specific functions. As high-resolution (i.e., <5 m) land cover datasets become more available for these areas, semi- or fully-automated methods for distinguishing...

  20. Genome-Wide Mapping of Transcriptional Regulation and Metabolism Describes Information-Processing Units in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Daniela Ledezma-Tejeida

    2017-08-01

    Full Text Available In the face of changes in their environment, bacteria adjust gene expression levels and produce appropriate responses. The individual layers of this process have been widely studied: the transcriptional regulatory network describes the regulatory interactions that produce changes in the metabolic network, both of which are coordinated by the signaling network, but the interplay between them has never been described in a systematic fashion. Here, we formalize the process of detection and processing of environmental information mediated by individual transcription factors (TFs), using a concept termed genetic sensory response units (GENSOR units), which are composed of four components: (1) a signal, (2) signal transduction, (3) a genetic switch, and (4) a response. We used experimentally validated data sets from two databases to assemble a GENSOR unit for each of the 189 local TFs of Escherichia coli K-12 contained in the RegulonDB database. Further analysis suggested that feedback is a common occurrence in signal processing, and that there is a gradient of functional complexity in the response mediated by each TF, as opposed to a one-regulator/one-pathway rule. Finally, we provide examples of other GENSOR unit applications, such as hypothesis generation, detailed description of cellular decision making, and elucidation of indirect regulatory mechanisms.
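The four-component decomposition lends itself to a simple data structure. The sketch below is illustrative, with a simplified (not RegulonDB-derived) description of the well-known arabinose regulator AraC:

```python
from dataclasses import dataclass

@dataclass
class GensorUnit:
    # The four components of a genetic sensory response unit.
    signal: str               # environmental cue that is detected
    signal_transduction: str  # how the cue reaches the transcription factor
    genetic_switch: str       # the TF and the genes it switches
    response: str             # metabolic / physiological outcome

# Illustrative, simplified contents (not extracted from RegulonDB):
araC_unit = GensorUnit(
    signal="extracellular arabinose",
    signal_transduction="arabinose import and binding to AraC",
    genetic_switch="AraC activates transcription of araBAD",
    response="arabinose catabolism, which depletes the signal (feedback)",
)
```

Note how the response feeds back on the signal, the pattern the authors report as a common occurrence across the 189 units.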

  1. Compiling a Monolingual Dictionary for Native Speakers

    Directory of Open Access Journals (Sweden)

    Patrick Hanks

    2011-10-01

    Full Text Available

    ABSTRACT: This article gives a survey of the main issues confronting the compilers of monolingual dictionaries in the age of the Internet. Among others, it discusses the relationship between a lexical database and a monolingual dictionary, the role of corpus evidence, historical principles in lexicography vs. synchronic principles, the instability of word meaning, the need for full vocabulary coverage, principles of definition writing, the role of dictionaries in society, and the need for dictionaries to give guidance on matters of disputed word usage. It concludes with some questions about the future of dictionary publishing.

    SUMMARY: Compiling a monolingual dictionary for native speakers. This article gives an overview of the main issues confronting the compilers of monolingual dictionaries in the age of the Internet. Among others, it discusses the relationship between a lexical database and a monolingual dictionary, the role of corpus evidence, historical vs. synchronic principles in lexicography, the instability of word meaning, the need for full vocabulary coverage, principles of definition writing, the role of dictionaries in society, and the need for dictionaries to give guidance on matters of disputed word usage. It concludes with a number of questions about the future of dictionary publishing.

    Keywords: MONOLINGUAL DICTIONARIES, LEXICAL DATABASE, DICTIONARY STRUCTURE, WORD MEANING, MEANING CHANGE, USAGE, USAGE NOTES, HISTORICAL PRINCIPLES OF LEXICOGRAPHY, SYNCHRONIC PRINCIPLES OF LEXICOGRAPHY, REGISTER, SLANG, STANDARD ENGLISH, VOCABULARY COVERAGE, CONSISTENCY OF SETS, PHRASEOLOGY, SYNTAGMATIC PATTERNS, PROBLEMS OF COMPOSITIONALITY, LINGUISTIC PRESCRIPTIVISM, LEXICAL EVIDENCE

  2. Sharing analysis in the Pawns compiler

    Directory of Open Access Journals (Sweden)

    Lee Naish

    2015-09-01

    Full Text Available Pawns is a programming language under development that supports algebraic data types, polymorphism, higher order functions and “pure” declarative programming. It also supports impure imperative features including destructive update of shared data structures via pointers, allowing significantly increased efficiency for some operations. A novelty of Pawns is that all impure “effects” must be made obvious in the source code and they can be safely encapsulated in pure functions in a way that is checked by the compiler. Execution of a pure function can perform destructive updates on data structures that are local to or eventually returned from the function without risking modification of the data structures passed to the function. This paper describes the sharing analysis which allows impurity to be encapsulated. Aspects of the analysis are similar to other published work, but in addition it handles explicit pointers and destructive update, higher order functions including closures and pre- and post-conditions concerning sharing for functions.
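The encapsulation property that Pawns' sharing analysis verifies can be mimicked in any language with mutable structures. A Python analogy (not Pawns syntax) of a function that is pure from the outside despite an internal destructive update:

```python
def sorted_copy(xs):
    # Pure from the caller's point of view: the destructive update
    # (in-place sort) only touches a list that is local to this
    # function, which is what a sharing analysis would have to prove.
    local = list(xs)  # fresh structure; shares no mutable state with xs
    local.sort()      # impure operation, safely encapsulated
    return local

data = [3, 1, 2]
assert sorted_copy(data) == [1, 2, 3]
assert data == [3, 1, 2]  # the caller's structure is untouched
```

Python checks none of this; Pawns' contribution is that the compiler statically verifies `local` shares nothing with the caller's data before accepting the function as pure.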

  3. HAL/S-360 compiler test activity report

    Science.gov (United States)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  4. Regulatory and technical reports: compilation for 1975-1978

    International Nuclear Information System (INIS)

    1982-04-01

    This brief compilation lists formal reports issued by the US Nuclear Regulatory Commission in 1975 through 1978 that were not listed in the Regulatory and Technical Reports Compilation for 1975 to 1978, NUREG-0304, Vol. 3. This compilation is divided into two sections. The first consists of a sequential listing of all reports in report-number order. The second section consists of an index developed from keywords in report titles and abstracts

  5. HOPE: Just-in-time Python compiler for astrophysical computations

    Science.gov (United States)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of a compiled implementation.
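HOPE's usage pattern (decorate the function, then call it as normal Python) can be mimicked with a toy decorator. The `jit` below only memoizes rather than compiling to C++, and its name and behaviour are illustrative, not HOPE's actual API:

```python
import functools

# HOPE translates the decorated function to C++; this toy "jit" only
# caches results, to mimic the decorate-and-call usage pattern.
def jit(func):
    cache = {}
    @functools.wraps(func)
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)  # a real JIT would compile, not memoize
        return cache[args]
    return wrapper

@jit
def poly(x):
    return 3.0 * x * x + 2.0 * x + 1.0

result = poly(2.0)  # → 17.0
```

The appeal of the decorator interface is exactly this: the call site and the function body stay plain Python.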

  6. Compiler Construction Using Java, JavaCC, and Yacc

    CERN Document Server

    Dos Reis, Anthony J

    2012-01-01

    Broad in scope, involving theory, the application of that theory, and programming technology, compiler construction is a moving target, with constant advances in compiler technology taking place. Today, a renewed focus on do-it-yourself programming makes a quality textbook on compilers, that both students and instructors will enjoy using, of even more vital importance. This book covers every topic essential to learning compilers from the ground up and is accompanied by a powerful and flexible software package for evaluating projects, as well as several tutorials, well-defined projects, and tes

  7. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-10-01

    This is the fourth issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section every year. The material contained in this compilation is sorted according to eight subject categories: General compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes, half-lives, energies and spectra; nuclear decay processes, gamma-rays; nuclear decay processes, fission products; nuclear decay processes (others); atomic processes

  8. Automatic Parallelization An Overview of Fundamental Compiler Techniques

    CERN Document Server

    Midkiff, Samuel P

    2012-01-01

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and
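    The distinction these analyses draw can be illustrated with a pair of loops. Python is used here only for brevity; dependence analysis itself applies to any imperative language:

    ```python
    def prefix_sum(a):
        # Loop-carried flow dependence: iteration i reads out[i-1],
        # which iteration i-1 wrote, so a dependence analysis would
        # report that this loop cannot be parallelized as written.
        out = a[:]
        for i in range(1, len(out)):
            out[i] = out[i - 1] + out[i]
        return out

    def scale(a, c):
        # No cross-iteration dependence: each iteration reads and
        # writes only its own element, so a parallelizing compiler is
        # free to run the iterations in any order.
        return [c * x for x in a]

    print(prefix_sum([1, 2, 3, 4]))  # [1, 3, 6, 10]
    print(scale([1, 2, 3, 4], 2))    # [2, 4, 6, 8]
    ```

    Transformations such as loop distribution or scan parallelization can sometimes remove or restructure the first kind of dependence, which is exactly what the analyses described above are meant to establish.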

  9. Compilation of data for radionuclide transport analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other scenarios selected. The data required by the selected near field, geosphere and biosphere models are given and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed and the uncertainty is qualitatively addressed for data to the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic in order not to underestimate the radionuclide release rates. It is judged that this approach combined with the selected calculation cases will illustrate the effects of uncertainties in processes and events that affects the evolution of the system as well as in quantitative data that describes this. The biosphere model allows for probabilistic calculations and the uncertainty in input data are quantified by giving minimum, maximum and mean values as well as the type of probability distribution function.
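    As a rough illustration of how tabulated minimum, mean and maximum values feed a probabilistic biosphere calculation, the sketch below samples a hypothetical parameter from a triangular distribution. The parameter values and the choice of treating the reported mean as the mode are assumptions for illustration, not values from the report:

    ```python
    import random

    def sample_parameter(minimum, mean, maximum, n=10000, seed=1):
        # Illustrative only: treat the reported mean as the mode of a
        # triangular distribution, one common choice when a range and
        # a central value are published without a full distribution.
        rng = random.Random(seed)
        return [rng.triangular(minimum, maximum, mean) for _ in range(n)]

    # Hypothetical sorption coefficient (m3/kg): min, mean, max.
    samples = sample_parameter(0.001, 0.01, 0.05)
    print(0.001 <= min(samples) and max(samples) <= 0.05)  # True
    ```

    Propagating many such sampled parameters through the release and dose models yields the probabilistic results the report describes.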

  10. Compilation of data for radionuclide transport analysis

    International Nuclear Information System (INIS)

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other scenarios selected. The data required by the selected near field, geosphere and biosphere models are given and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed and the uncertainty is qualitatively addressed for data to the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic in order not to underestimate the radionuclide release rates. It is judged that this approach combined with the selected calculation cases will illustrate the effects of uncertainties in processes and events that affects the evolution of the system as well as in quantitative data that describes this. The biosphere model allows for probabilistic calculations and the uncertainty in input data are quantified by giving minimum, maximum and mean values as well as the type of probability distribution function

  11. Rubus: A compiler for seamless and extensible parallelism

    Science.gov (United States)

    Adnan, Muhammad; Aslam, Faisal; Sarwar, Syed Mansoor

    2017-01-01

    Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special-purpose processing unit called the Graphics Processing Unit (GPU), originally designed for 2D/3D games, is now available for general-purpose use in computers and mobile devices. However, traditional programming languages, which were designed to work with machines having single-core CPUs, cannot efficiently utilize the parallelism available on multi-core processors. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages like CUDA and OpenCL have been introduced, which can be used to write code with parallelism. The main shortcoming of these languages is that the programmer needs to specify all the complex details manually in order to parallelize the code across multiple cores. As a result, code written in these languages is difficult to understand, debug and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of it in CUDA or OpenCL, which can consume significant time and resources. Thus, the amount of parallelism achieved is proportional to the skills of the programmer and the time spent on code optimizations. This paper proposes a new open-source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details. It analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedup and better utilization of the underlying hardware without requiring a programmer's expertise in parallel programming. For five different benchmarks, an average speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores, while for a matrix multiplication benchmark the average execution speedup of 84 times has been

  12. Site investigation SFR. Rock type coding, overview geological mapping and identification of rock units and possible deformation zones in drill cores from the construction of SFR

    Energy Technology Data Exchange (ETDEWEB)

    Petersson, Jesper (Vattenfall Power Consultant AB, Stockholm (Sweden)); Curtis, Philip; Bockgaard, Niclas (Golder Associates AB (Sweden)); Mattsson, Haakan (GeoVista AB, Luleaa (Sweden))

    2011-01-15

    This report presents the rock type coding, overview lithological mapping and identification of rock units and possible deformation zones in drill cores from 32 boreholes associated with the construction of SFR. This work can be seen as complementary to single-hole interpretations of other older SFR boreholes earlier reported in /Petersson and Andersson 2010/: KFR04, KFR08, KFR09, KFR13, KFR35, KFR36, KFR54, KFR55, KFR7A, KFR7B and KFR7C. Due to deficiencies in the available material, the necessary activities have deviated somewhat from the established methodologies used during the recent Forsmark site investigations for the final repository for spent nuclear fuel. The aim of the current work has been, wherever possible, to allow the incorporation of all relevant material from older boreholes in the ongoing SFR geological modelling work in spite of the deficiencies. The activities include: - Rock type coding of the original geological mapping according to the nomenclature used during the preceding Forsmark site investigation. As part of the Forsmark site investigation such rock type coding has already been performed on most of the old SFR boreholes if the original geological mapping results were available. This earlier work has been complemented by rock type coding on two further boreholes: KFR01 and KFR02. - Lithological overview mapping, including documentation of (1) rock types, (2) ductile and brittle-ductile deformation and (3) alteration for drill cores from eleven of the boreholes for which no original geological borehole mapping was available (KFR31, KFR32, KFR34, KFR37, KFR38, KFR51, KFR69, KFR70, KFR71, KFR72 and KFR89). - Identification of possible deformation zones and merging of similar rock types into rock units. This follows SKB's established criteria and methodology of the geological Single-hole interpretation (SHI) process wherever possible. Deviations from the standard SHI process are associated with the lack of data, for example BIPS images

  13. Site investigation SFR. Rock type coding, overview geological mapping and identification of rock units and possible deformation zones in drill cores from the construction of SFR

    International Nuclear Information System (INIS)

    Petersson, Jesper; Curtis, Philip; Bockgaard, Niclas; Mattsson, Haakan

    2011-01-01

    This report presents the rock type coding, overview lithological mapping and identification of rock units and possible deformation zones in drill cores from 32 boreholes associated with the construction of SFR. This work can be seen as complementary to single-hole interpretations of other older SFR boreholes earlier reported in /Petersson and Andersson 2010/: KFR04, KFR08, KFR09, KFR13, KFR35, KFR36, KFR54, KFR55, KFR7A, KFR7B and KFR7C. Due to deficiencies in the available material, the necessary activities have deviated somewhat from the established methodologies used during the recent Forsmark site investigations for the final repository for spent nuclear fuel. The aim of the current work has been, wherever possible, to allow the incorporation of all relevant material from older boreholes in the ongoing SFR geological modelling work in spite of the deficiencies. The activities include: - Rock type coding of the original geological mapping according to the nomenclature used during the preceding Forsmark site investigation. As part of the Forsmark site investigation such rock type coding has already been performed on most of the old SFR boreholes if the original geological mapping results were available. This earlier work has been complemented by rock type coding on two further boreholes: KFR01 and KFR02. - Lithological overview mapping, including documentation of (1) rock types, (2) ductile and brittle-ductile deformation and (3) alteration for drill cores from eleven of the boreholes for which no original geological borehole mapping was available (KFR31, KFR32, KFR34, KFR37, KFR38, KFR51, KFR69, KFR70, KFR71, KFR72 and KFR89). - Identification of possible deformation zones and merging of similar rock types into rock units. This follows SKB's established criteria and methodology of the geological Single-hole interpretation (SHI) process wherever possible. Deviations from the standard SHI process are associated with the lack of data, for example BIPS images, or a

  14. Preliminary Geologic Map of the Cook Inlet Region, Alaska-Including Parts of the Talkeetna, Talkeetna Mountains, Tyonek, Anchorage, Lake Clark, Kenai, Seward, Iliamna, Seldovia, Mount Katmai, and Afognak 1:250,000-scale Quadrangles

    Science.gov (United States)

    Wilson, Frederic H.; Hults, Chad P.; Schmoll, Henry R.; Haeussler, Peter J.; Schmidt, Jeanine M.; Yehle, Lynn A.; Labay, Keith A.; Shew, Nora B.

    2009-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for digital geologic maps that have been attributed with information about age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This report is part of a series of integrated geologic map databases that cover the entire United States. Three national-scale geologic maps that portray most or all of the United States already exist; for the conterminous U.S., King and Beikman (1974a,b) compiled a map at a scale of 1:2,500,000, Beikman (1980) compiled a map for Alaska at 1:2,500,000 scale, and for the entire U.S., Reed and others (2005a,b) compiled a map at a scale of 1:5,000,000. A digital version of the King and Beikman map was published by Schruben and others (1994). Reed and Bush (2004) produced a digital version of the Reed and others (2005a) map for the conterminous U.S. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. The digital geologic maps presented here are in a standardized format as ARC/INFO export files and as ArcView shape files. The files named __geol contain geologic polygons and line (contact) attributes; files named __fold contain fold axes; files named __lin contain lineaments; and files named __dike contain dikes as lines. Data tables that relate the map units to detailed lithologic and age information accompany these GIS files. The map is delivered as a set of 1:250,000-scale quadrangle files. To the best of our ability, these quadrangle files are edge-matched with respect to geology. When the maps are merged, the combined attribute tables can be used directly with the merged maps to make

  15. 1976 compilation of national nuclear data committees

    International Nuclear Information System (INIS)

    1977-01-01

    This list of currently existing National Nuclear Data Committees, and their memberships, is published with the object of promoting interaction and enhancing awareness of nuclear data activities in IAEA Member States. The following Member States have indicated the existence of a nuclear data committee in their countries: Bangladesh, Bolivia, Bulgaria, France, Hungary, India, Japan, Romania, Sweden, USSR, United Kingdom, USA, Yugoslavia

  16. Application of the Lean Office philosophy and mapping of the value stream in the process of designing the banking units of a financial company

    Directory of Open Access Journals (Sweden)

    Nelson Antônio Calsavara

    2016-09-01

    The purpose of this study is to conduct a critical analysis of the effects of the Lean Office on the design process of the banking units of a financial company, and of how implementing this philosophy may contribute to productivity by reducing implementation time. A literature review of the Toyota Production System was conducted, as well as studies on its methods, with advancement to lean thinking and consistent application of Lean philosophies in services and office work. A bibliographic and documentary survey of the Lean processes and procedures for opening bank branches was also carried out. A Current State Map was developed, modeling the current operating procedures. After the identification and analysis of waste, proposals were presented for reducing deadlines and for eliminating and grouping stages, with consequent development of the Future State Map, implementation and monitoring of stages, and measurement of the estimated time gains in the operation. These showed an estimated 45% reduction, in days, from the start to the end of the process, leading to the conclusion that implementation of the Lean Office philosophy contributed to the process.
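    A value-stream lead-time comparison of the kind described reduces to summing step durations in the current- and future-state maps. The step names and durations below are invented purely to reproduce the reported order of magnitude:

    ```python
    # Hypothetical step durations (days) in the branch-opening process.
    current_state = {"approval": 30, "design": 45, "fit-out": 60, "handover": 15}
    future_state  = {"approval": 20, "design": 25, "fit-out": 30, "handover": 7}

    lead_current = sum(current_state.values())  # 150 days end to end
    lead_future  = sum(future_state.values())   # 82 days end to end
    reduction = 100 * (lead_current - lead_future) / lead_current
    print(f"{reduction:.0f}% reduction")  # 45% reduction
    ```

    In practice each step would also be split into value-adding and waiting time, which is where the eliminated and grouped stages show up on the map.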

  17. Exploiting differential vegetation phenology for satellite-based mapping of semiarid grass vegetation in the southwestern United States and northern Mexico

    Science.gov (United States)

    Dye, Dennis G.; Middleton, Barry R.; Vogel, John M.; Wu, Zhuoting; Velasco, Miguel G.

    2016-01-01

    We developed and evaluated a methodology for subpixel discrimination and large-area mapping of the perennial warm-season (C4) grass component of vegetation cover in mixed-composition landscapes of the southwestern United States and northern Mexico. We describe the methodology within a general, conceptual framework that we identify as the differential vegetation phenology (DVP) paradigm. We introduce a DVP index, the Normalized Difference Phenometric Index (NDPI) that provides vegetation type-specific information at the subpixel scale by exploiting differential patterns of vegetation phenology detectable in time-series spectral vegetation index (VI) data from multispectral land imagers. We used modified soil-adjusted vegetation index (MSAVI2) data from Landsat to develop the NDPI, and MSAVI2 data from MODIS to compare its performance relative to one alternate DVP metric (difference of spring average MSAVI2 and summer maximum MSAVI2), and two simple, conventional VI metrics (summer average MSAVI2, summer maximum MSAVI2). The NDPI in a scaled form (NDPIs) performed best in predicting variation in perennial C4 grass cover as estimated from landscape photographs at 92 sites (R2 = 0.76, p landscapes of the Southwest, and potentially for monitoring of its response to drought, climate change, grazing and other factors, including land management. With appropriate adjustments, the method could potentially be used for subpixel discrimination and mapping of grass or other vegetation types in other regions where the vegetation components of the landscape exhibit contrasting seasonal patterns of phenology.
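    The excerpt does not give the exact formula of the NDPI, but a normalized difference of two seasonal phenometrics, in the spirit of the alternate metric described, might look like the sketch below. The specific form and the sample MSAVI2 values are assumptions for illustration:

    ```python
    def ndpi(spring_avg, summer_max):
        # Hypothetical form: a normalized difference of two seasonal
        # MSAVI2 phenometrics; the published index may differ in detail.
        return (spring_avg - summer_max) / (spring_avg + summer_max)

    # Cool-season vegetation peaks in spring; warm-season (C4) grass
    # greens up with summer rain, pushing the index negative.
    print(round(ndpi(0.25, 0.45), 3))  # -0.286
    ```

    The sign and magnitude of such an index then discriminate, at the subpixel scale, pixels whose greenness is dominated by the warm-season grass component.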

  18. ABOUT SYSTEM MAPPING OF BIOLOGICAL RESOURCES FOR SUBSTANTIATION OF ENVIRONMENTAL MANAGEMENT OF THE ADMINISTRATED UNIT ON THE EXAMPLE OF NOVOSIBIRSK REGION

    Directory of Open Access Journals (Sweden)

    O. N. Nikolaeva

    2017-01-01

    The article considers the systematization, modeling and presentation of regional biological-resources data. The problem of providing regional state authorities with up-to-date biological-resources data and an analysis tool is stated, and the need for complex analysis of heterogeneous biological-resources data in connection with landscape factors is articulated. A system of biological resources' cartographic models (BRCM) is proposed as a tool for regional authorities to apply in practice. The goal and the target audience of the system are named, the principles of cartographic visualization of information in the BRCM are formulated, and the main sources of biological-resources data are listed: state cadastres, monitoring and statistics. The scales for regional and topical biological resources' cartographic models are stated; these comprise two scale groups, for depicting the region itself and for its units of internal administrative division. The specifics of cartographic modeling and visualization of relief according to legal requirements on public cartographic data are described. Various options for presenting biological resources' cartographic models, such as digital maps, 3D models and cartographic animation, are described. Examples of maps and cartographic 3D models of Novosibirsk Region forests are shown. A conclusion about the practical challenges solved with the BRCM is made.

  19. Global compilation of marine varve records

    Science.gov (United States)

    Schimmelmann, Arndt; Lange, Carina B.; Schieber, Juergen; Francus, Pierre; Ojala, Antti E. K.; Zolitschka, Bernd

    2017-04-01

    Marine varves contain highly resolved records of geochemical and other paleoceanographic and paleoenvironmental proxies with annual to seasonal resolution. We present a global compilation of marine varved sedimentary records throughout the Holocene and Quaternary covering more than 50 sites worldwide. Marine varve deposition and preservation typically depend on environmental and sedimentological conditions, such as a sufficiently high sedimentation rate, severe depletion of dissolved oxygen in bottom water to exclude bioturbation by macrobenthos, and a seasonally varying sedimentary input to yield a recognizable rhythmic varve pattern. Additional oceanographic factors may include the strength and depth range of the Oxygen Minimum Zone (OMZ) and regional anthropogenic eutrophication. Modern to Quaternary marine varves are not only found in those parts of the open ocean that comply with these conditions, but also in fjords, embayments and estuaries with thermohaline density stratification, and nearshore 'marine lakes' with strong hydrologic connections to ocean water. Marine varves have also been postulated in pre-Quaternary rocks. In the case of non-evaporitic laminations in fine-grained ancient marine rocks, such as banded iron formations and black shales, laminations may not be varves but instead may have multiple alternative origins such as event beds or formation via bottom currents that transported and sorted silt-sized particles, clay floccules, and organic-mineral aggregates in the form of migrating bedload ripples. Modern marine ecosystems on continental shelves and slopes, in coastal zones and in estuaries are susceptible to stress by anthropogenic pressures, for example in the form of eutrophication, enhanced OMZs, and expanding ranges of oxygen-depletion in bottom waters. 
Sensitive laminated sites may play the important role of a 'canary in the coal mine' where monitoring the character and geographical extent of laminations/varves serves as a diagnostic

  20. An Initial Evaluation of the NAG f90 Compiler

    Directory of Open Access Journals (Sweden)

    Michael Metcalf

    1992-01-01

    A few weeks before the formal publication of the ISO Fortran 90 Standard, NAG announced the world's first f90 compiler. We have evaluated the compiler by using it to assess the impact of Fortran 90 on the CERN Program Library.

  1. Compiling the First Monolingual Lusoga Dictionary | Nabirye | Lexikos

    African Journals Online (AJOL)

    Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular ... This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary. Keywords: lexicography ...

  2. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    The material contained in this compilation is sorted according to eight subject categories: 1. General Compilations; 2. Basic Isotopic Properties; 3. Nuclear Structure Properties; 4. Nuclear Decay Processes: Half-lives, Energies and Spectra; 5. Nuclear Decay Processes: Gamma-rays; 6. Nuclear Decay Processes: Fission Products; 7. Nuclear Decay Processes: (Others); 8. Atomic Processes

  3. Production compilation : A simple mechanism to model complex skill acquisition

    NARCIS (Netherlands)

    Taatgen, N.A.; Lee, F.J.

    2003-01-01

    In this article we describe production compilation, a mechanism for modeling skill acquisition. Production compilation has been developed within the ACT-Rational (ACT-R; J. R. Anderson, D. Bothell, M. D. Byrne, & C. Lebiere, 2002) cognitive architecture and consists of combining and specializing
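    Schematically, production compilation collapses a retrieve-then-act pair of rules into one specialized rule with the retrieved fact baked in. The Python sketch below is an illustration of the idea only, not ACT-R code; the rule names and the addition-fact example are assumptions:

    ```python
    # Declarative memory: a single stored fact, as in an addition task.
    addition_facts = {(3, 4): 7}

    def rule_request_fact(goal):
        # Production 1: issue a memory retrieval for the needed fact.
        return addition_facts[(goal["a"], goal["b"])]

    def rule_use_fact(goal, retrieved):
        # Production 2: harvest the retrieval and produce the answer.
        return {"answer": retrieved}

    def compile_productions(a, b):
        # Production compilation: the two rules are combined and the
        # retrieved fact is substituted in, yielding one specialized
        # rule that skips the memory retrieval entirely.
        answer = addition_facts[(a, b)]
        def compiled_rule(goal):
            return {"answer": answer}
        return compiled_rule

    fast_rule = compile_productions(3, 4)
    print(fast_rule({"a": 3, "b": 4}))  # {'answer': 7}
    ```

    The speedup in the model comes from replacing a slow retrieval-plus-match cycle with a single specialized firing, which mirrors how practice turns deliberate steps into fluent skill.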

  4. Feasibility and utility of mapping disease risk at the neighbourhood level within a Canadian public health unit: an ecological study

    Directory of Open Access Journals (Sweden)

    Wanigaratne Susitha

    2010-05-01

    Background: We conducted spatial analyses to determine the geographic variation of cancer at the neighbourhood level (dissemination areas or DAs) within the area of a single Ontario public health unit, Wellington-Dufferin-Guelph, covering a population of 238,326 inhabitants. Cancer incidence data between 1999 and 2003 were obtained from the Ontario Cancer Registry and were geocoded down to the level of DA using the enhanced Postal Code Conversion File. The 2001 Census of Canada provided information on the size and age-sex structure of the population at the DA level, in addition to information about selected census covariates, such as average neighbourhood income. Results: Age standardized incidence ratios for cancer and the prevalence of census covariates were calculated for each of 331 dissemination areas in Wellington-Dufferin-Guelph. The standardized incidence ratios (SIR) for cancer varied dramatically across the dissemination areas. However, application of the Moran's I statistic, a popular index of spatial autocorrelation, suggested significant spatial patterns for only two cancers, lung and prostate, both in males (p Conclusion: This paper demonstrates the feasibility and utility of a systematic approach to identifying neighbourhoods, within the area served by a public health unit, that have significantly higher risks of cancer. This exploratory, ecologic study suggests several hypotheses for these spatial patterns that warrant further investigation. To the best of our knowledge, this is the first Canadian study published in the peer-reviewed literature estimating the risk of relatively rare public health outcomes at a very small areal level, namely dissemination areas.
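    The standardized incidence ratios at the core of this analysis are simply observed over expected counts, with expected counts obtained by applying reference age-specific rates to the neighbourhood's person-years at risk. A minimal sketch with hypothetical person-years and rates:

    ```python
    def standardized_incidence_ratio(observed, person_years_by_age, ref_rates):
        # SIR = observed cases / expected cases, where expected counts
        # come from applying reference (e.g. provincial) age-specific
        # rates to the area's person-years at risk in each age stratum.
        expected = sum(person_years_by_age[age] * ref_rates[age]
                       for age in person_years_by_age)
        return observed / expected

    # Hypothetical dissemination area over a 5-year window.
    py = {"40-59": 2000.0, "60-79": 1000.0}       # person-years at risk
    rates = {"40-59": 0.002, "60-79": 0.010}      # cases per person-year
    print(round(standardized_incidence_ratio(21, py, rates), 2))  # 1.5
    ```

    An SIR above 1 flags excess incidence relative to the reference population; spatial statistics such as Moran's I then test whether high-SIR areas cluster geographically.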

  5. Compiling Utility Requirements For New Nuclear Power Plant Project

    International Nuclear Information System (INIS)

    Patrakka, Eero

    2002-01-01

    Teollisuuden Voima Oy (TVO) submitted to the Finnish Government in November 2000 an application for a Decision-in-Principle concerning the construction of a new nuclear power plant in Finland. The actual investment decision can be made only after a positive decision has been made by the Government and the Parliament. Parallel to the licensing process, technical preparedness has been maintained so that the procurement process can be commenced without delay when needed. This includes the definition of requirements for the plant and preliminary preparation of bid inquiry specifications. The core of the technical requirements corresponds to the specifications presented in the European Utility Requirements (EUR) document, compiled by major European electricity producers. Quite naturally, a number of modifications to the EUR document are needed to take into account the country- and site-specific conditions as well as the experience gained in operating the existing NPP units. Along with the EUR-related requirements concerning the nuclear island and the power generation plant, requirements are specified for the scope of supply as well as for a variety of issues related to project implementation. (author)

  6. Renewable Energy Atlas of the United States

    Energy Technology Data Exchange (ETDEWEB)

    Kuiper, J. [Environmental Science Division; Hlava, K. [Environmental Science Division; Greenwood, H. [Environmental Science Division; Carr, A. [Environmental Science Division

    2013-12-13

    The Renewable Energy Atlas (Atlas) of the United States is a compilation of geospatial data focused on renewable energy resources, federal land ownership, and base map reference information. This report explains how to add the Atlas to your computer and install the associated software. The report also includes: A description of each of the components of the Atlas; Lists of the Geographic Information System (GIS) database content and sources; and A brief introduction to the major renewable energy technologies. The Atlas includes the following: A GIS database organized as a set of Environmental Systems Research Institute (ESRI) ArcGIS Personal GeoDatabases, and ESRI ArcReader and ArcGIS project files providing an interactive map visualization and analysis interface.

  7. Gravity and isostatic anomaly maps of Greece produced

    Science.gov (United States)

    Lagios, E.; Chailas, S.; Hipkin, R. G.

    A gravity anomaly map of Greece was first compiled in the early 1970s [Makris and Stavrou, 1984] from all available gravity data collected by different Hellenic institutions. However, to compose this map the data had to be smoothed to the point that many of the smaller-wavelength gravity anomalies were lost. New work begun in 1987 has resulted in the publication of an updated map [Lagios et al., 1994] and an isostatic anomaly map derived from it. The gravity data cover the area between east longitudes 19° and 27° and north latitudes 32° and 42°, organized in files of 100-km squares and grouped in 10-km squares using UTM zone 34 coordinates. Most of the data on land come from the gravity observations of Makris and Stavrou [1984], with additional data from the Institute of Geology and Mining Exploration, the Public Oil Corporation of Greece, and Athens University. These data were checked using techniques similar to those used in compiling the gravity anomaly map of the United States, but the horizontal gradient was used as a check rather than the gravity difference. Marine data were digitized from the maps of Morelli et al. [1975a, 1975b]. All gravity anomaly values are referred to the IGSN-71 system, reduced with the standard Bouguer density of 2.67 Mg/m3. We estimate the errors of the anomalies in the continental part of Greece to be ±0.9 mGal; the error is expected to be smaller over fairly flat regions. For stations whose height has been determined by leveling, the error is only ±0.3 mGal. For the marine areas, the errors are about ±5 mGal [Morelli, 1990].
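    The Bouguer reduction mentioned here applies the infinite-slab correction 2πGρh with ρ = 2.67 Mg/m³; a minimal sketch of that one term (the station height is an arbitrary example value):

    ```python
    import math

    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

    def bouguer_correction_mgal(height_m, density=2670.0):
        # Simple Bouguer slab correction 2*pi*G*rho*h, converted from
        # m/s^2 to milligal (1 mGal = 1e-5 m/s^2). Density default is
        # the standard reduction density of 2.67 Mg/m^3.
        return 2.0 * math.pi * G * density * height_m / 1e-5

    # A station 100 m above the datum: roughly 0.112 mGal per metre.
    print(round(bouguer_correction_mgal(100.0), 2))  # 11.2
    ```

    The full anomaly computation also involves the free-air and terrain corrections, which are omitted here.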

  8. Disaggregating and mapping crop statistics using hypertemporal remote sensing

    Science.gov (United States)

    Khan, M. R.; de Bie, C. A. J. M.; van Keulen, H.; Smaling, E. M. A.; Real, R.

    2010-02-01

    Governments compile their agricultural statistics in tabular form by administrative area, which gives no clue to the exact locations where specific crops are actually grown. Such data are poorly suited for early warning and assessment of crop production. 10-daily satellite image time series of Andalucia, Spain, acquired since 1998 by the SPOT Vegetation instrument, were used in combination with reported crop area statistics to produce the required crop maps. Firstly, the 10-daily (1998-2006) 1-km resolution SPOT-Vegetation NDVI images were used to stratify the study area into 45 map units through an iterative unsupervised classification process. Each unit represents an NDVI profile showing changes in vegetation greenness over time, which is assumed to relate to the types of land cover and land use present. Secondly, the areas of the NDVI units and the reported cropped areas by municipality were used to disaggregate the crop statistics. Adjusted R-squares were 98.8% for rainfed wheat, 97.5% for rainfed sunflower, and 76.5% for barley. Relating statistical data on areas cropped by municipality with the NDVI-based unit map showed that the selected crops were significantly related to specific NDVI-based map units. Other NDVI profiles did not relate to the studied crops and represented other types of land use or land cover. The results were validated using primary field data, collected by the Spanish government from 2001 to 2005 through grid sampling within agricultural areas; each grid (block) contains three 700 m × 700 m segments. The validation showed 68%, 31% and 23% variability explained (adjusted R-squares) between the three produced maps and the thousands of segment data. The relatively low values are mainly caused by variability within the delineated NDVI units, which are internally heterogeneous; variability between units is properly captured. The maps must accordingly be considered "small scale maps". These maps can be used to monitor crop performance of
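    The disaggregation step amounts to a least-squares fit of reported crop areas against the areas of the NDVI units within each municipality; the fitted coefficients estimate the fraction of each unit carrying the crop. A pure-Python sketch with two units and synthetic data (the unit areas and crop fractions are invented):

    ```python
    def disaggregate(unit_areas, crop_areas):
        # Least-squares fit of crop_area ~ c_A * area_A + c_B * area_B,
        # solved via the 2x2 normal equations; c_A, c_B estimate the
        # crop fraction within each NDVI unit.
        xtx = [[sum(r[i] * r[j] for r in unit_areas) for j in range(2)]
               for i in range(2)]
        xty = [sum(r[i] * y for r, y in zip(unit_areas, crop_areas))
               for i in range(2)]
        det = xtx[0][0] * xtx[1][1] - xtx[0][1] * xtx[1][0]
        c_a = (xty[0] * xtx[1][1] - xty[1] * xtx[0][1]) / det
        c_b = (xty[1] * xtx[0][0] - xty[0] * xtx[1][0]) / det
        return c_a, c_b

    # Hypothetical municipalities: hectares of NDVI unit A and unit B,
    # and the wheat area each municipality reported.
    units = [(100.0, 50.0), (200.0, 20.0), (50.0, 80.0)]
    wheat = [0.8 * a + 0.1 * b for a, b in units]  # synthetic truth
    c_a, c_b = disaggregate(units, wheat)
    print(round(c_a, 2), round(c_b, 2))  # 0.8 0.1
    ```

    Multiplying the fitted fractions back onto the unit map places the tabulated crop areas at their likely locations, which is exactly the spatial detail the administrative tables lack.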

  9. The current status of mapping karst areas and availability of public sinkhole-risk resources in karst terrains of the United States

    Science.gov (United States)

    Kuniansky, Eve L.; Weary, David J.; Kaufmann, James E.

    2016-01-01

    Subsidence from sinkhole collapse is a common occurrence in areas underlain by water-soluble rocks such as carbonate and evaporite rocks, typical of karst terrain. Almost all 50 States within the United States (excluding Delaware and Rhode Island) have karst areas, with sinkhole damage highest in Florida, Texas, Alabama, Missouri, Kentucky, Tennessee, and Pennsylvania. A conservative estimate of losses to all types of ground subsidence was $125 million per year in 1997. This estimate may now be low, as review of cost reports from the last 15 years indicates that the cost of karst collapses in the United States averages more than $300 million per year. Knowing when a catastrophic event will occur is not possible; however, understanding where such occurrences are likely is possible. The US Geological Survey has developed and maintains national-scale maps of karst areas and areas prone to sinkhole formation. Several States provide additional resources for their citizens; Alabama, Colorado, Florida, Indiana, Iowa, Kentucky, Minnesota, Missouri, Ohio, and Pennsylvania maintain databases of sinkholes or karst features, with Florida, Kentucky, Missouri, and Ohio providing sinkhole reporting mechanisms for the public.

  10. Compilation of kinetic data for geochemical calculations

    International Nuclear Information System (INIS)

    Arthur, R.C.; Savage, D.; Sasamoto, Hiroshi; Shibata, Masahiro; Yui, Mikazu

    2000-01-01

    Kinetic data, including rate constants, reaction orders and activation energies, are compiled for 34 hydrolysis reactions involving feldspars, sheet silicates, zeolites, oxides, pyroxenes and amphiboles, and for similar reactions involving calcite and pyrite. The data are compatible with a rate law consistent with surface reaction control and transition-state theory, which is incorporated in the geochemical software packages EQ3/6 and GWB. Kinetic data for the reactions noted above are strictly compatible with the transition-state rate law only under far-from-equilibrium conditions. It is possible that the data are conceptually consistent with this rate law under both far-from-equilibrium and near-to-equilibrium conditions, but this should be confirmed whenever possible through analysis of the original experimental results. Due to limitations in the availability of kinetic data for mineral-water reactions, and in order to simplify evaluations of geochemical models of groundwater evolution, it is convenient to assume local equilibrium in such models whenever possible. To assess whether this assumption is reasonable, a modeling approach accounting for coupled fluid flow and water-rock interaction is described that can be used to estimate the spatial and temporal scales of local equilibrium. The approach is demonstrated for conditions involving groundwater flow in fractures at JNC's Kamaishi in-situ test site, and is also used to estimate the travel time necessary for oxidizing surface waters to migrate to the level of a HLW repository in crystalline rock. The question of whether local equilibrium is a reasonable assumption must be addressed using an appropriate modeling approach. For the approach noted above to be appropriate for conditions at the Kamaishi site, the fracture fill must closely approximate a porous medium, groundwater flow must be purely advective, and diffusion of solutes across the fracture-host rock boundary must not occur. 
Moreover, the mineralogical and
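The transition-state-theory rate law referred to above is commonly written in the following form (a standard textbook form, assumed here; the exact parameterization used in EQ3/6 and GWB may differ):

```latex
\frac{dn}{dt} \;=\; -\,k\,S\,\prod_i a_i^{\,n_i}
\left[\,1-\left(\frac{Q}{K}\right)^{\!\sigma}\,\right],
\qquad
k \;=\; A\,\exp\!\left(-\frac{E_a}{RT}\right)
```

where $S$ is the reactive surface area, the $a_i^{n_i}$ are activities of catalyzing or inhibiting species raised to their reaction orders, $Q/K$ is the saturation ratio, and $E_a$ is the Arrhenius activation energy. Far from equilibrium $Q/K \to 0$, so the bracketed affinity term tends to 1; this is why the compiled constants are strictly compatible with the rate law only in that limit.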

  11. Mapping information exposure on social media to explain differences in HPV vaccine coverage in the United States.

    Science.gov (United States)

    Dunn, Adam G; Surian, Didi; Leask, Julie; Dey, Aditi; Mandl, Kenneth D; Coiera, Enrico

    2017-05-25

    Together with access, acceptance of vaccines affects human papillomavirus (HPV) vaccine coverage, yet little is known about the media's role. Our aim was to determine whether measures of information exposure derived from Twitter could be used to explain differences in coverage in the United States. We conducted an analysis of exposure to information about HPV vaccines on Twitter, derived from 273.8 million exposures to 258,418 tweets posted between 1 October 2013 and 30 October 2015. Tweets were classified by topic using machine learning methods. Proportional exposure to each topic was used to construct multivariable models for predicting state-level HPV vaccine coverage, and compared to multivariable models constructed using socioeconomic factors: poverty, education, and insurance. Outcome measures included correlations between coverage and the individual topics and socioeconomic factors, and differences in the predictive performance of the multivariable models. Topics corresponding to media controversies were most closely correlated with coverage (both positively and negatively); among socioeconomic indicators, education and insurance were the most highly correlated. Measures of information exposure explained 68% of the variance in one-dose 2015 HPV vaccine coverage in females (males: 63%). In comparison, models based on socioeconomic factors explained 42% of the variance in females (males: 40%). Measures of information exposure derived from Twitter explained differences in coverage that were not explained by socioeconomic factors. Vaccine coverage was lower in states where safety concerns, misinformation, and conspiracies made up higher proportions of exposures, suggesting that negative representations of vaccines in the media may reflect or influence vaccine acceptance. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
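The modeling step can be sketched as a multivariable linear regression of state-level coverage on topic exposure proportions. A toy numpy version with synthetic data (topic names, effect sizes, and noise levels are hypothetical, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
n_states = 50
# Hypothetical state-level exposure shares for three tweet topics
# (safety concerns, media controversy, promotional content); rows sum to 1.
shares = rng.dirichlet([2.0, 2.0, 2.0], size=n_states)
# Synthetic coverage: lower where "safety concerns" dominate exposure.
coverage = 60 - 30 * shares[:, 0] + 10 * shares[:, 2] + rng.normal(0, 1.5, n_states)

# Multivariable linear model: coverage ~ intercept + topic shares
# (one share is dropped because the shares sum to 1).
X = np.column_stack([np.ones(n_states), shares[:, :2]])
beta, *_ = np.linalg.lstsq(X, coverage, rcond=None)
pred = X @ beta
r2 = 1 - ((coverage - pred) ** 2).sum() / ((coverage - coverage.mean()) ** 2).sum()
print(np.round(beta, 2), round(r2, 2))
```

The fitted coefficient on the safety-concern share is negative, mirroring the abstract's finding that coverage is lower where such topics dominate exposure.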

  12. Mapping the Relative Probability of Common Toad Occurrence in Terrestrial Lowland Farm Habitat in the United Kingdom.

    Directory of Open Access Journals (Sweden)

    Rosie D Salazar

    Full Text Available The common toad (Bufo bufo) is of increasing conservation concern in the United Kingdom (UK) due to dramatic population declines occurring in the past century. Many of these population declines coincided with reductions in both terrestrial and aquatic habitat availability and quality and have been primarily attributed to the effect of agricultural land conversion (of natural and semi-natural habitats to arable and pasture fields) and pond drainage. However, there is little evidence available to link habitat availability with common toad population declines, especially when examined at a broad landscape scale. Assessing such patterns of population decline at the landscape scale requires, for instance, an understanding of how this species uses terrestrial habitat. We intensively studied the terrestrial resource selection of a large population of common toads in Oxfordshire, England, UK. Adult common toads were fitted with passive integrated transponder (PIT) tags to allow detection in the terrestrial environment using a portable PIT antenna once toads left the pond and before going into hibernation (April/May-October 2012 and 2013). We developed a population-level resource selection function (RSF) to assess the relative probability of toad occurrence in the terrestrial environment by collecting location data for 90 recaptured toads. The predicted relative probability of toad occurrence for this population was greatest in wooded habitat near to water bodies; the relative probability of occurrence declined dramatically > 50 m from these habitats. Toads also tended to select habitat near to their breeding pond, and toad occurrence was negatively related to urban environments.
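A population-level RSF of this kind is commonly fitted as a use-availability logistic regression. A minimal numpy sketch on synthetic data (covariates, effect sizes, and sample sizes are invented for illustration, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
# Hypothetical covariates for used (toad) and available (random) locations:
# distance to water in hundreds of metres, and a woodland indicator.
dist100 = rng.uniform(0.0, 2.0, n)
woodland = rng.integers(0, 2, n).astype(float)
# Synthetic use-availability labels: use declines with distance to water
# and rises in woodland (effect sizes are made up for the sketch).
true_logit = 1.5 - 3.0 * dist100 + 1.2 * woodland
used = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

X = np.column_stack([np.ones(n), dist100, woodland])

# Fit the RSF as a logistic regression via Newton's method (IRLS).
beta = np.zeros(3)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = p * (1.0 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (used - p))

print(np.round(beta, 2))  # signs: negative for distance, positive for woodland
```

Exponentiating `X @ beta` over a habitat grid would then yield a relative-probability-of-occurrence surface like the one mapped in the study.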

  13. Mapping out Map Libraries

    Directory of Open Access Journals (Sweden)

    Ferjan Ormeling

    2008-09-01

    Full Text Available Discussing the requirements for map data quality, map users and their library/archives environment, the paper focuses on the metadata the user would need for a correct and efficient interpretation of the map data. For such a correct interpretation, knowledge of the rules and guidelines according to which the topographers/cartographers work (such as the kind of data categories to be collected), and of the degree to which these rules and guidelines were indeed followed, is essential. This is not only valid for the old maps stored in our libraries and archives, but perhaps even more so for the new digital files as the format in which we now have to access our geospatial data. As this would be too much to ask from map librarians/curators, some sort of web 2.0 environment is sought where comments about data quality, completeness and up-to-dateness from knowledgeable map users regarding the specific maps or map series studied can be collected and tagged to scanned versions of these maps on the web. In order not to be subject to the same disadvantages as Wikipedia, where the 'communis opinio', rather than scholarship, seems to be decisive, some checking by map curators of this tagged map-use information would still be needed. Cooperation between map curators and the International Cartographic Association (ICA) map and spatial data use commission to this end is suggested.

  14. Globes, Maps, Photographs: Geographic Tools.

    Science.gov (United States)

    McDermott, Paul D.; And Others

    This compilation of reprinted articles that originally appeared in the Journal of Geography from September 1969 through the May 1970 issues, is intended to help teachers use globes, maps, and photographs with skill and understanding. The articles were designed with several objectives in mind: 1) to provide information regarding the design,…

  15. Maps of estimated nitrate and arsenic concentrations in basin-fill aquifers of the southwestern United States

    Science.gov (United States)

    Beisner, Kimberly R.; Anning, David W.; Paul, Angela P.; McKinney, Tim S.; Huntington, Jena M.; Bexfield, Laura M.; Thiros, Susan A.

    2012-01-01

    Human-health concerns and economic considerations associated with meeting drinking-water standards motivated a study of the vulnerability of basin-fill aquifers to nitrate contamination and arsenic enrichment in the southwestern United States. Statistical models were developed by using the random forest classifier algorithm to predict concentrations of nitrate and arsenic across a model grid representing about 190,600 square miles of basin-fill aquifers in parts of Arizona, California, Colorado, Nevada, New Mexico, and Utah. The statistical models, referred to as classifiers, reflect natural and human-related factors that affect aquifer vulnerability to contamination and relate nitrate and arsenic concentrations to explanatory variables representing local- and basin-scale measures of source and aquifer susceptibility conditions. Geochemical variables were not used in concentration predictions because they were not available for the entire study area. The models were calibrated to assess model accuracy on the basis of measured values. Only 2 percent of the area underlain by basin-fill aquifers in the study area was predicted to equal or exceed the U.S. Environmental Protection Agency drinking-water standard for nitrate as N (10 milligrams per liter), whereas 43 percent of the area was predicted to equal or exceed the standard for arsenic (10 micrograms per liter). Areas predicted to equal or exceed the drinking-water standard for nitrate include basins in central Arizona near Phoenix; the San Joaquin Valley, the Santa Ana Inland, and San Jacinto Basins of California; and the San Luis Valley of Colorado. Much of the area predicted to equal or exceed the drinking-water standard for arsenic is within a belt of basins along the western portion of the Basin and Range Physiographic Province that includes almost all of Nevada and parts of California and Arizona. 
Predicted nitrate and arsenic concentrations are substantially lower than the drinking-water standards in much of
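The prediction step can be illustrated with scikit-learn's RandomForestClassifier (the study used the random forest algorithm; the library choice, explanatory variables, and numbers below are illustrative assumptions, not the study's data):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
n = 600
# Hypothetical basin-scale explanatory variables (stand-ins for the source
# and susceptibility factors in the study): aridity, depth to water table,
# and fraction of volcanic rock in the basin.
aridity = rng.uniform(0, 1, n)
depth = rng.uniform(1, 100, n)
volcanic = rng.uniform(0, 1, n)
# Synthetic exceedance label: arsenic >= 10 ug/L more likely in arid,
# volcanic-influenced basins with deep water tables.
score = 2 * aridity + 0.01 * depth + 1.5 * volcanic + rng.normal(0, 0.3, n)
exceeds = score > np.median(score)

X = np.column_stack([aridity, depth, volcanic])
# Train on the first 500 grid cells, check accuracy on the held-out 100.
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:500], exceeds[:500])
acc = clf.score(X[500:], exceeds[500:])
print(round(acc, 2))
```

Applying `clf.predict` over the model grid would produce the exceedance maps summarized in the abstract.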

  16. Writing Compilers and Interpreters A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

    Long-awaited revision to a unique guide that covers both compilers and interpreters Revised, updated, and now focusing on Java instead of C++, this long-awaited, latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  17. Bedrock Geologic Map of Vermont - Dikes

    Data.gov (United States)

    Vermont Center for Geographic Information — The bedrock geology was last mapped at a statewide scale 50 years ago at a scale of 1:250,000 (Doll and others, 1961). The 1961 map was compiled from 1:62,500-scale...

  18. Angola Seismicity MAP

    Science.gov (United States)

    Neto, F. A. P.; Franca, G.

    2014-12-01

    The purpose of this work was to study and document the natural seismicity of Angola and to establish the first seismic database for the country, facilitating consultation and searches for information on its seismic activity. The study was based on reports produced by the National Institute of Meteorology and Geophysics (INAMET) from 1968 to 2014, with emphasis on the work presented by Moreira (1968), which defined six seismogenic zones from macroseismic data. The most important of these is the Sá da Bandeira (Lubango)-Chibemba-Oncócua-Iona zone, covering the epicentral Quihita and Iona regions. It is geologically characterized by a transcontinental structure of Mesozoic tectono-magmatic activation, with the emplacement of a wide variety of intrusive rocks of ultrabasic-alkaline, basic and alkaline composition, kimberlites and carbonatites, strongly marked by intense tectonism and cut by several faults and fractures (locally called the corredor de Lucapa). The earthquake of May 9, 1948 reached intensity VI on the Mercalli-Sieberg scale (MCS) in the locality of Quihita, and during the Iona seismic activity of January 15, 1964, the main shock reached intensity VI-VII. Although they do not have significant seismicity rates, the other five zones cannot be neglected: Cassongue-Ganda-Massano de Amorim; Lola-Quilengues-Caluquembe; the Gago Coutinho zone; Cuima-Cachingues-Cambândua; and the Upper Zambezi zone. We also analyzed technical reports on the seismicity of the middle Kwanza region produced by Hidroproekt (GAMEK), as well as international seismic bulletins of the International Seismological Centre (ISC) and the United States Geological Survey (USGS); these data served for instrumental location of the epicenters. All the compiled information made possible the creation of the first database of seismic data for Angola, the preparation of the seismicity map with reconfirmation of the main seismic zones defined by Moreira (1968), and the identification of a new seismic

  19. Geologic Map of the State of Hawai`i

    Science.gov (United States)

    Sherrod, David R.; Sinton, John M.; Watkins, Sarah E.; Brunt, Kelly M.

    2007-01-01

    1983 and the Universal Transverse Mercator system projection to zone 4. 'This digital statewide map allows engineers, consultants, and scientists from many different fields to take advantage of the geologic database,' said John Sinton, a geology professor at the University of Hawai`i, whose new mapping of the Wai`anae Range (West O`ahu) appears on the map. Indeed, when a testing version was first made available, most requests came from biologists, archaeologists, and soil scientists interested in applying the map's GIS database to their ongoing investigations. Another area newly depicted on the map, in addition to the Wai`anae Range, is Haleakala volcano, East Maui. So too for the active lava flows of Kilauea volcano, Island of Hawai`i, where the landscape has continued to evolve in the ten years since publication of the Big Island's revised geologic map. For the other islands, much of the map is compiled from mapping published in the 1930-1960s. This reliance stems partly from shortage of funding to undertake entirely new mapping but is warranted by the exemplary mapping of those early experts. The boundaries of all map units are digitized to show correctly on modern topographic maps.

  20. Fusing a Transformation Language with an Open Compiler

    NARCIS (Netherlands)

    Kalleberg, K.T.; Visser, E.

    2007-01-01

    Program transformation systems provide powerful analysis and transformation frameworks as well as concise languages for language processing, but instantiating them for every subject language is an arduous task, most often resulting in half-completed frontends. Compilers provide mature frontends with

  1. AICPA allows low-cost options for compiled financial statements.

    Science.gov (United States)

    Reinstein, Alan; Luecke, Randall W

    2002-02-01

    The AICPA Accounting and Review Services Committee's (ARSC) SSARS No. 8, Amendment to Statement on Standards for Accounting and Review Services No. 1, Compilation and Review of Financial Statements, issued in October 2000, allows financial managers to provide plain-paper, compiled financial statements for the exclusive use of management. Such financial statements were disallowed in 1979 when the AICPA issued SSARS No. 1, Compilation and Review of Financial Statements. With the issuance of SSARS No. 8, financial managers can prepare plain-paper, compiled financial statements when third parties are not expected to rely on the financial statements, management acknowledges such restrictions in writing, and management acknowledges its primary responsibility for the adequacy of the financial statements.

  2. Specification and Compilation of Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Geuns, S.J.

    2015-01-01

    This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. An example of such applications are software defined radio applications. These applications typically

  3. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  4. Borrowing and Dictionary Compilation: The Case of the Indigenous ...

    African Journals Online (AJOL)

    rbr

    Keywords: BORROWING, DICTIONARY COMPILATION, INDIGENOUS LANGUAGES, LEXICON, MORPHEME, VOCABULARY, DEVELOPING LANGUAGES, LOAN WORDS, TERMINOLOGY, ETYMOLOGY, LEXICOGRAPHY. Summary: Borrowing and dictionary compilation: The case of the indigenous South African ...

  5. Source list of nuclear data bibliographies, compilations, and evaluations

    International Nuclear Information System (INIS)

    Burrows, T.W.; Holden, N.E.

    1978-10-01

    To aid the user of nuclear data, many specialized bibliographies, compilations, and evaluations have been published. This document is an attempt to bring together a list of such publications with an indication of their availability and cost.

  6. Solidify, An LLVM pass to compile LLVM IR into Solidity

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-12

    The software currently compiles LLVM IR into Solidity (Ethereum's dominant programming language) using LLVM's pass library. Specifically, this compiler allows us to convert an arbitrary DSL into Solidity. We focus specifically on converting Domain Specific Languages into Solidity due to their ease of use, and provable properties. By creating a toolchain to compile lightweight domain-specific languages into Ethereum's dominant language, Solidity, we allow non-specialists to effectively develop safe and useful smart contracts. For example lawyers from a certain firm can have a proprietary DSL that codifies basic laws safely converted to Solidity to be securely executed on the blockchain. In another example, a simple provenance tracking language can be compiled and securely executed on the blockchain.
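The described pipeline is an LLVM pass written against LLVM's pass library; as a language-neutral illustration of the final lowering step only, here is a toy translator from a made-up three-address IR to Solidity source. Everything below (the IR, names, and output shape) is hypothetical and unrelated to the actual tool's code:

```python
# Toy "IR": (opcode, destination, lhs operand, rhs operand) tuples.
IR = [
    ("add", "t0", "a", "b"),      # t0 = a + b
    ("mul", "t1", "t0", "c"),     # t1 = t0 * c
    ("ret", "t1", None, None),
]

OPS = {"add": "+", "mul": "*"}

def lower_to_solidity(ir, fn_name="eval", params=("a", "b", "c")):
    """Lower the toy IR to the body of a Solidity pure function."""
    body = []
    for op, dst, lhs, rhs in ir:
        if op == "ret":
            body.append(f"return {dst};")
        else:
            body.append(f"uint256 {dst} = {lhs} {OPS[op]} {rhs};")
    args = ", ".join(f"uint256 {p}" for p in params)
    lines = [f"function {fn_name}({args}) public pure returns (uint256) {{"]
    lines += ["    " + s for s in body]
    lines.append("}")
    return "\n".join(lines)

print(lower_to_solidity(IR))
```

The real pass performs this kind of instruction-by-instruction lowering on genuine LLVM IR, which is what lets any DSL with an LLVM frontend target Solidity.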

  7. SVM Support in the Vienna Fortran Compilation System

    OpenAIRE

    Brezany, Peter; Gerndt, Michael; Sipkova, Viera

    1994-01-01

    Vienna Fortran, a machine-independent language extension to Fortran which allows the user to write programs for distributed-memory systems using global addresses, provides the forall-loop construct for specifying irregular computations that do not cause inter-iteration dependences. Compilers for distributed-memory systems generate code that is based on runtime analysis techniques and is only efficient if, in addition, aggressive compile-time optimizations are applied. Since these optimization...

  8. Data compilation for particle-impact desorption, 2

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeutchi, Fujio.

    1985-07-01

    Particle impact desorption is one of the elementary processes of hydrogen recycling in controlled thermonuclear fusion reactors. We have surveyed the literature concerning ion impact desorption and photon stimulated desorption published through the end of 1984 and compiled the data on the desorption cross sections and yields with the aid of a computer. This report presents the results of the compilation in graphs and tables as functions of incident energy, surface temperature and surface coverage. (author)

  9. DLVM: A modern compiler infrastructure for deep learning systems

    OpenAIRE

    Wei, Richard; Schwartz, Lane; Adve, Vikram

    2017-01-01

    Deep learning software demands reliability and performance. However, many of the existing deep learning frameworks are software libraries that act as an unsafe DSL in Python and a computation graph interpreter. We present DLVM, a design and implementation of a compiler infrastructure with a linear algebra intermediate representation, algorithmic differentiation by adjoint code generation, domain-specific optimizations and a code generator targeting GPU via LLVM. Designed as a modern compiler ...

  10. Compilation and analysis of Escherichia coli promoter DNA sequences.

    OpenAIRE

    Hawley, D K; McClure, W R

    1983-01-01

    The DNA sequences of 168 promoter regions (-50 to +10) for Escherichia coli RNA polymerase were compiled. The complete listing was divided into two groups depending upon whether or not the promoter had been defined by genetic (promoter mutations) or biochemical (5' end determination) criteria. A consensus promoter sequence based on homologies among 112 well-defined promoters was determined that was in substantial agreement with previous compilations. In addition, we have tabulated 98 promoter ...
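The consensus-building step can be sketched directly: align the promoter sequences and take the most frequent base at each position. A toy example with invented -10 hexamer sequences (the real compilation used 112 aligned promoter regions):

```python
from collections import Counter

# Toy alignment of promoter -10 hexamers (synthetic sequences); the consensus
# is the most frequent base at each aligned position.
seqs = [
    "TATAAT",
    "TACGAT",
    "TATACT",
    "GATAAT",
    "TATGTT",
]

consensus = "".join(
    Counter(col).most_common(1)[0][0] for col in zip(*seqs)
)
print(consensus)  # for these toy sequences: TATAAT
```

These toy sequences were chosen so the majority vote recovers TATAAT, the canonical E. coli -10 consensus; the published compilation derives its consensus the same way, position by position over the aligned set.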

  11. Fire effects on soils in Lake States forests: A compilation of published research to facilitate long-term investigations

    Science.gov (United States)

    Jessica Miesel; P. Goebel; R. Corace; David Hix; Randall Kolka; Brian Palik; David. Mladenoff

    2012-01-01

    Fire-adapted forests of the Lake States region are poorly studied relative to those of the western and southeastern United States and our knowledge base of regional short- and long-term fire effects on soils is limited. We compiled and assessed the body of literature addressing fire effects on soils in Lake States forests to facilitate the re-measurement of previous...

  12. Defining biological assemblages (biotopes) of conservation interest in the submarine canyons of the South West Approaches (offshore United Kingdom) for use in marine habitat mapping

    Science.gov (United States)

    Davies, Jaime S.; Howell, Kerry L.; Stewart, Heather A.; Guinan, Janine; Golding, Neil

    2014-06-01

    In 2007, the upper part of a submarine canyon system located in water depths between 138 and 1165 m in the South West (SW) Approaches (North East Atlantic Ocean) was surveyed over a 2-week period. High-resolution multibeam echosounder data covering 1106 km2, and 44 ground-truthing video and image transects were acquired to characterise the biological assemblages of the canyons. The SW Approaches is an area of complex terrain, and intensive ground-truthing revealed the canyons to be dominated by soft sediment assemblages. A combination of multivariate analysis of seabed photographs (184-1059 m) and visual assessment of video ground-truthing identified 12 megabenthic assemblages (biotopes) at an appropriate scale to act as mapping units. Of these biotopes, 5 adhered to current definitions of habitats of conservation concern, 4 of which were classed as Vulnerable Marine Ecosystems. Some of the biotopes correspond to descriptions of communities from other megahabitat features (for example the continental shelf and seamounts), although it appears that the canyons host modified versions, possibly due to the inferred high rates of sedimentation in the canyons. Other biotopes described appear to be unique to canyon features, particularly the sea pen biotope consisting of Kophobelemnon stelliferum and cerianthids.

  13. Fiscal 2000 report on advanced parallelized compiler technology. Outlines; 2000 nendo advanced heiretsuka compiler gijutsu hokokusho (Gaiyo hen)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    Research and development was carried out on automatic parallelizing compiler technology, which improves the practical performance, cost/performance ratio, and ease of operation of the multiprocessor systems now used for constructing supercomputers and expected to provide a fundamental architecture for microprocessors in the 21st century. Efforts were made to develop an automatic multigrain parallelization technology for extracting multigrain parallelism from a program and making full use of it, and a parallelizing tuning technology for accelerating parallelization by feeding back to the compiler the dynamic information and user knowledge acquired during execution. Moreover, a benchmark program was selected and studies were made to set execution rules and evaluation indexes for the establishment of technologies for objectively evaluating the performance of parallelizing compilers on existing commercial parallel processing computers, achieved through the implementation and evaluation of the 'Advanced parallelizing compiler technology research and development project.' (NEDO)

  14. Annual accumulation over the Greenland ice sheet interpolated from historical and newly compiled observation data

    Science.gov (United States)

    Shen, Dayong; Liu, Yuling; Huang, Shengli

    2012-01-01

    The estimation of ice/snow accumulation is of great significance in quantifying the mass balance of ice sheets and variation in water resources. Improving the accuracy and reducing uncertainty has been a challenge for the estimation of annual accumulation over the Greenland ice sheet. In this study, we kriged and analyzed the spatial pattern of accumulation based on an observation data series including 315 points used in a recent research, plus 101 ice cores and snow pits and newly compiled 23 coastal weather station data. The estimated annual accumulation over the Greenland ice sheet is 31.2 g cm−2 yr−1, with a standard error of 0.9 g cm−2 yr−1. The main differences between the improved map developed in this study and the recently published accumulation maps are in the coastal areas, especially southeast and southwest regions. The analysis of accumulations versus elevation reveals the distribution patterns of accumulation over the Greenland ice sheet.
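The interpolation described above can be sketched as ordinary kriging with an assumed semivariogram (the study's variogram model and parameters are not stated in the abstract; the variogram form, site coordinates, and values below are synthetic):

```python
import numpy as np

# Assumed exponential semivariogram (sill and range are illustrative).
def gamma(h, sill=1.0, rng_=100.0):
    return sill * (1.0 - np.exp(-h / rng_))

def krige(xy, z, x0):
    """Ordinary kriging estimate at x0 from observations z at points xy."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    # Ordinary-kriging system: semivariogram matrix bordered by the
    # unbiasedness (weights-sum-to-one) constraint.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)
    return w[:n] @ z          # kriged accumulation estimate at x0

# Synthetic accumulation observations (g/cm^2/yr) at core/station sites (km).
xy = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
z = np.array([30.0, 34.0, 28.0, 33.0])
print(round(krige(xy, z, np.array([25.0, 25.0])), 2))
```

With a nugget-free variogram, ordinary kriging is an exact interpolator: estimating at an observation site returns the observed value, while intermediate sites get a weighted average honoring the assumed spatial correlation.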

  15. Geologic Maps as the Foundation of Mineral-Hazards Maps in California

    Science.gov (United States)

    Higgins, C. T.; Churchill, R. K.; Downey, C. I.; Clinkenbeard, J. P.; Fonseca, M. C.

    2010-12-01

    The basic geologic map is essential to the development of products that help planners, engineers, government officials, and the general public make decisions concerning natural hazards. Such maps are the primary foundation that the California Geological Survey (CGS) uses to prepare maps that show potential for mineral hazards. Examples of clients that request these maps are the California Department of Transportation (Caltrans) and California Department of Public Health (CDPH). Largely because of their non-catastrophic nature, mineral hazards have received much less public attention compared to earthquakes, landslides, volcanic eruptions, and floods. Nonetheless, mineral hazards can be a major concern locally when considering human health and safety and potential contamination of the environment by human activities such as disposal of earth materials. To address some of these concerns, the CGS has focused its mineral-hazards maps on naturally occurring asbestos (NOA), radon, and various potentially toxic metals as well as certain artificial features such as mines and oil and gas wells. The maps range in scope from statewide to counties and Caltrans districts to segments of selected highways. To develop the hazard maps, the CGS begins with traditional paper and digital versions of basic geologic maps, which are obtained from many sources such as its own files, the USGS, USDA Forest Service, California Department of Water Resources, and counties. For each study area, these maps present many challenges of compilation related to vintage, scale, definition of units, and edge-matching across map boundaries. The result of each CGS compilation is a digital geologic layer that is subsequently reinterpreted and transformed into new digital layers (e.g., lithologic) that focus on the geochemical and mineralogical properties of the area’s earth materials and structures. These intermediate layers are then integrated with other technical data to derive final digital layers

  16. CAPS OpenACC Compilers: Performance and Portability

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    The announcement late 2011 of the new OpenACC directive-based programming standard supported by CAPS, CRAY and PGI compilers has opened the door to more scientific applications that can be ported to many-core systems. Following a porting methodology, this talk will first review the principles of programming with OpenACC and then the advanced features available in the CAPS compilers to further optimize OpenACC applications: library integration, tuning directives with auto-tune mechanisms to build applications adaptive to different GPUs. CAPS compilers use hardware vendors' backends such as NVIDIA CUDA and OpenCL, making them the only OpenACC compilers supporting various many-core architectures. About the speaker Stéphane Bihan is co-founder and currently Director of Sales and Marketing at CAPS enterprise. He has held several R&D positions in companies such as ARC international plc in London, Canon Research Center France, ACE compiler experts in Amsterdam and the INRIA r...

  17. Compiling for Application Specific Computational Acceleration in Reconfigurable Architectures Final Report CRADA No. TSB-2033-01

    Energy Technology Data Exchange (ETDEWEB)

    De Supinski, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Caliga, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-28

    The primary objective of this project was to develop memory optimization technology to efficiently deliver data to, and distribute data within, the SRC-6's Field Programmable Gate Array- ("FPGA") based Multi-Adaptive Processors (MAPs). The hardware/software approach was to explore efficient MAP configurations and generate the compiler technology to exploit those configurations. This memory accessing technology represents an important step towards making reconfigurable symmetric multi-processor (SMP) architectures that will be a cost-effective solution for large-scale scientific computing.

  18. Geologic Map of the Shakespeare Quadrangle (H03), Mercury

    Science.gov (United States)

    Guzzetta, L.; Galluzzi, V.; Ferranti, L.; Palumbo, P.

    2018-05-01

A 1:3M geological map of the H03 Shakespeare quadrangle of Mercury has been compiled through photointerpretation of the MESSENGER images. The most prominent geomorphological feature is the Caloris basin, the largest impact basin on Mercury.

  19. Preliminary surficial geologic map database of the Amboy 30 x 60 minute quadrangle, California

    Science.gov (United States)

    Bedford, David R.; Miller, David M.; Phelps, Geoffrey A.

    2006-01-01

The surficial geologic map database of the Amboy 30x60 minute quadrangle presents characteristics of surficial materials for an area of approximately 5,000 km2 in the eastern Mojave Desert of California. This map consists of new surficial mapping conducted between 2000 and 2005, as well as compilations of previous surficial mapping. Surficial geologic units are mapped and described based on depositional process and age categories that reflect the mode of deposition, pedogenic effects occurring post-deposition, and, where appropriate, the lithologic nature of the material. The physical properties recorded in the database focus on those that drive hydrologic, biologic, and physical processes, such as particle size distribution (PSD) and bulk density. This version of the database is distributed with point data representing locations of samples for both laboratory-determined physical properties and semi-quantitative field-based information. Future publications will include the field and laboratory data as well as maps of distributed physical properties across the landscape, tied to physical process models where appropriate. The database is distributed in three parts: documentation, spatial map-based data, and printable map graphics of the database. Documentation includes this file, which provides a discussion of the surficial geology and describes the format and content of the map data; a database 'readme' file, which describes the database contents; and FGDC metadata for the spatial map information. Spatial data are distributed as an Arc/Info coverage in ESRI interchange (e00) format, or as tabular data in dBASE III (.dbf) format. Map graphics files are distributed as PostScript and Adobe Portable Document Format (PDF) files, and are appropriate for representing a view of the spatial database at the mapped scale.

  20. TOXMAP®: Environmental Health Maps

    Data.gov (United States)

    U.S. Department of Health & Human Services — TOXMAP® is a Geographic Information System (GIS) that uses maps of the United States and Canada to help users visually explore data primarily from the EPA's Toxics...

  1. The Katydid system for compiling KEE applications to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    Components of a system known as Katydid are developed in an effort to compile knowledge-based systems developed in a multimechanism integrated environment (KEE) to Ada. The Katydid core is an Ada library supporting KEE object functionality, and the other elements include a rule compiler, a LISP-to-Ada translator, and a knowledge-base dumper. Katydid employs translation mechanisms that convert LISP knowledge structures and rules to Ada and utilizes basic prototypes of a run-time KEE object-structure library module for Ada. Preliminary results include the semiautomatic compilation of portions of a simple expert system to run in an Ada environment with the described algorithms. It is suggested that Ada can be employed for AI programming and implementation, and the Katydid system is being developed to include concurrency and synchronization mechanisms.

  2. Compilation of current high energy physics experiments - Sept. 1978

    Energy Technology Data Exchange (ETDEWEB)

    Addis, L.; Odian, A.; Row, G. M.; Ward, C. E. W.; Wanderer, P.; Armenteros, R.; Joos, P.; Groves, T. H.; Oyanagi, Y.; Arnison, G. T. J.; Antipov, Yu; Barinov, N.

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche. (RWR)

  3. Biomass Maps | Geospatial Data Science | NREL

    Science.gov (United States)

These maps illustrate the biomass resource in the United States by county. Biomass feedstock data are analyzed both statistically and graphically using a geographic information system (GIS); the maps include solid biomass resources and total biomass resources in the United States.

  4. Charged particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, C.

    1999-01-01

We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal reason for setting up the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates, issued by Fowler and collaborators. The main goal of the NACRE network was transparency in the procedure of calculating the rates. More specifically, this compilation aims at: 1. updating the experimental and theoretical data; 2. distinctly identifying the sources of the data used in rate calculation; 3. evaluating the uncertainties and errors; 4. providing numerically integrated reaction rates; 5. providing reverse reaction rates and analytical approximations of the adopted rates. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. The compilation is concerned with the reaction rates that are large enough for target lifetimes shorter than the age of the Universe, taken equal to 15 × 10^9 y. The reaction rates are provided for temperatures lower than T = 10^10 K. In parallel with the rate compilation, a cross section database has been created and located at the site http://pntpm.ulb.ac.be/nacre.htm. (authors)
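
The numerically integrated rates mentioned in aim 4 come from the standard Maxwellian average over a thermal energy distribution. As a hedged illustration (not the NACRE procedure itself), the sketch below integrates a toy cross section in dimensionless units with the trapezoidal rule; the function names and the constant-cross-section check are assumptions for demonstration only.

```python
import math

def rate_integral(sigma, kT, n=4000, e_max_factor=40.0):
    """Trapezoidal estimate of I = integral of sigma(E) * E * exp(-E/kT) dE."""
    e_max = e_max_factor * kT          # the exponential tail beyond this is negligible
    h = e_max / n
    total = 0.0
    for i in range(n + 1):
        e = i * h
        f = sigma(e) * e * math.exp(-e / kT)
        total += f / 2.0 if i in (0, n) else f
    return total * h

def sigma_v(sigma, kT, mu=1.0):
    """Maxwellian-averaged rate <sigma*v> = sqrt(8/(pi*mu)) * (kT)**-1.5 * I."""
    return math.sqrt(8.0 / (math.pi * mu)) * kT ** -1.5 * rate_integral(sigma, kT)

# Sanity check: for a constant cross section sigma0, I = sigma0 * (kT)**2
# exactly, so <sigma*v> reduces to sqrt(8/(pi*mu)) * sigma0 * sqrt(kT).
```

In practice a compilation like NACRE integrates measured cross sections (or S-factors) and adds resonance contributions; the toy integrand above only shows the quadrature step.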

  5. Compiling gate networks on an Ising quantum computer

    International Nuclear Information System (INIS)

    Bowdrey, M.D.; Jones, J.A.; Knill, E.; Laflamme, R.

    2005-01-01

    Here we describe a simple mechanical procedure for compiling a quantum gate network into the natural gates (pulses and delays) for an Ising quantum computer. The aim is not necessarily to generate the most efficient pulse sequence, but rather to develop an efficient compilation algorithm that can be easily implemented in large spin systems. The key observation is that it is not always necessary to refocus all the undesired couplings in a spin system. Instead, the coupling evolution can simply be tracked and then corrected at some later time. Although described within the language of NMR, the algorithm is applicable to any design of quantum computer based on Ising couplings
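
The key observation, that undesired couplings can be tracked and corrected later rather than refocused immediately, can be sketched in a toy model. The code below (an illustrative assumption, not the authors' algorithm) treats each Ising coupling as a ZZ phase that accrues during delays and emits one deferred correction per pair.

```python
import math

TWO_PI = 2.0 * math.pi

def compile_delays(delays, couplings):
    """Track the ZZ phase each coupled spin pair accrues during free
    evolution, instead of refocusing every coupling inside every delay,
    and emit a single deferred correction per pair at the end."""
    accumulated = {pair: 0.0 for pair in couplings}
    for t in delays:
        for pair, J in couplings.items():
            accumulated[pair] = (accumulated[pair] + J * t) % TWO_PI
    corrections = {pair: (TWO_PI - phi) % TWO_PI
                   for pair, phi in accumulated.items()}
    return accumulated, corrections

# Two couplings (toy angular-frequency units) over three delays:
couplings = {("a", "b"): 1.3, ("b", "c"): 0.7}
acc, corr = compile_delays([0.5, 1.2, 2.0], couplings)
# For each pair, acc + corr is 0 modulo 2*pi: one correction at the end
# undoes all the tracked evolution.
```

The deferred correction replaces many interleaved refocusing pulses with bookkeeping, which is why the approach scales to large spin systems.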

  6. Compilation of current high-energy-physics experiments

    International Nuclear Information System (INIS)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1980-04-01

    This is the third edition of a compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and ten participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about January 1980, and (2) had not completed taking of data by 1 January 1976

  7. The national assessment of shoreline change: a GIS compilation of vector cliff edges and associated cliff erosion data for the California coast

    Science.gov (United States)

    Hapke, Cheryl; Reid, David; Borrelli, Mark

    2007-01-01

The U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector cliff edges and associated rates of cliff retreat along the open-ocean California coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Cliff erosion is a chronic problem along many coastlines of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of coastal cliff retreat. There is also a critical need for these data to be consistent from one region to another. One objective of this work is to develop a standard, repeatable methodology for mapping and analyzing cliff edge retreat so that periodic, systematic, and internally consistent updates of cliff edge position and associated rates of erosion can be made at a national scale. This data compilation for open-ocean cliff edges for the California coast is a separate yet related study to that of Hapke and others (2006), which documents shoreline change along sandy shorelines of the California coast, and which is itself one in a series that includes the Gulf of Mexico and the Southeast Atlantic coast (Morton and others, 2004; Morton and Miller, 2005). Future reports and data compilations will include coverage of the Northeast U.S., the Great Lakes, Hawaii and Alaska. Cliff edge change is determined by comparing the positions of one historical cliff edge digitized from maps with a modern cliff edge derived from topographic LIDAR (light detection and ranging) surveys. Historical cliff edges for the California coast represent the 1920s-1930s time period; the most recent cliff edge was delineated using data collected between 1998 and 2002. End-point rate calculations were used to evaluate rates of erosion between the two cliff edges. Please refer to our full report on cliff edge erosion along the California
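
The end-point rate calculation mentioned above is simply the net movement between the two mapped edges divided by the elapsed time. A minimal sketch, with a hypothetical transect and sign convention assumed for illustration:

```python
def end_point_rate(pos_early, year_early, pos_late, year_late):
    """End-point rate: net movement of the cliff edge between two surveys
    divided by the elapsed time. With seaward-positive positions, a
    negative rate indicates landward retreat (erosion)."""
    return (pos_late - pos_early) / (year_late - year_early)

# Hypothetical transect: the edge mapped in 1932 lies 12 m seaward of the
# edge delineated from the 1998-2002 LIDAR surveys (here dated 2000).
rate = end_point_rate(0.0, 1932, -12.0, 2000)   # about -0.18 m/yr of retreat
```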

  8. A Compilation of MATLAB Scripts and Functions for MACGMC Analyses

    Science.gov (United States)

    Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.

    2017-01-01

The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The MathWorks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for CMCs and PMCs, visualization of RUCs from MACGMC input and output files, and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response. In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte-Carlo simulations to enable probabilistic

  9. Architectural and compiler techniques for energy reduction in high-performance microprocessors

    Science.gov (United States)

    Bellas, Nikolaos

    1999-11-01

    The microprocessor industry has started viewing power, along with area and performance, as a decisive design factor in today's microprocessors. The increasing cost of packaging and cooling systems poses stringent requirements on the maximum allowable power dissipation. Most of the research in recent years has focused on the circuit, gate, and register-transfer (RT) levels of the design. In this research, we focus on the software running on a microprocessor and we view the program as a power consumer. Our work concentrates on the role of the compiler in the construction of "power-efficient" code, and especially its interaction with the hardware so that unnecessary processor activity is saved. We propose techniques that use extra hardware features and compiler-driven code transformations that specifically target activity reduction in certain parts of the CPU which are known to be large power and energy consumers. Design for low power/energy at this level of abstraction entails larger energy gains than in the lower stages of the design hierarchy in which the design team has already made the most important design commitments. The role of the compiler in generating code which exploits the processor organization is also fundamental in energy minimization. Hence, we propose a hardware/software co-design paradigm, and we show what code transformations are necessary by the compiler so that "wasted" power in a modern microprocessor can be trimmed. More specifically, we propose a technique that uses an additional mini cache located between the instruction cache (I-Cache) and the CPU core; the mini cache buffers instructions that are nested within loops and are continuously fetched from the I-Cache. This mechanism can create very substantial energy savings, since the I-Cache unit is one of the main power consumers in most of today's high-performance microprocessors. Results are reported for the SPEC95 benchmarks in the R-4400 processor which implements the MIPS2 instruction
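
The mini-cache idea above is easy to quantify with a toy fetch simulation. The sketch below is an assumption-laden illustration (FIFO replacement, fabricated addresses), not the paper's model: fetches that hit the small buffer never reach the I-Cache, so a loop body that fits in the buffer touches the I-Cache only on its first iteration.

```python
from collections import OrderedDict

def icache_accesses(fetch_stream, mini_size):
    """Count fetches that reach the I-Cache when a small FIFO 'mini cache'
    sits between the I-Cache and the CPU core; hits in the mini cache
    leave the larger (and hungrier) I-Cache idle."""
    mini = OrderedDict()   # insertion-ordered: oldest entry evicted first
    accesses = 0
    for addr in fetch_stream:
        if addr in mini:
            continue                   # mini-cache hit: no I-Cache activity
        accesses += 1                  # miss: fetch from the I-Cache
        mini[addr] = None
        if len(mini) > mini_size:
            mini.popitem(last=False)   # FIFO eviction
    return accesses

# A 6-instruction loop body executed 100 times (600 fetches):
loop = list(range(0x400, 0x400 + 6))
stream = loop * 100
# If the loop fits in the mini cache, only the first iteration touches the
# I-Cache; if it does not fit, every fetch does.
```

Under these assumptions the 16-entry buffer cuts I-Cache activity from 600 accesses to 6, which is the intuition behind the reported energy savings.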

  10. Preliminary geologic map of the late Cenozoic sediments of the western half of the Pasco Basin

    International Nuclear Information System (INIS)

    Lillie, J.T.; Tallman, A.M.; Caggiano, J.A.

    1978-09-01

    The U.S. Department of Energy, through the Basalt Waste Isolation Program within the Rockwell Hanford Operations, is investigating the feasibility of terminal storage of radioactive waste in deep caverns constructed in Columbia River Basalt. This report represents a portion of the geological work conducted during fiscal year 1978 to assess the geological conditions in the Pasco Basin. The surficial geology of the western half of the Pasco Basin was studied and mapped in a reconnaissance fashion at a scale of 1:62,500. The map was produced through a compilation of existing geologic mapping publications and additional field data collected during the spring of 1978. The map was produced primarily to: (1) complement other mapping work currently being conducted in the Pasco Basin and in the region by Rockwell Hanford Operations and its subcontractors; and, (2) to provide a framework for more detailed late Cenozoic studies within the Pasco Basin. A description of procedures used to produce the surficial geologic map and geologic map units is summarized in this report

  11. TESTING TREE-CLASSIFIER VARIANTS AND ALTERNATE MODELING METHODOLOGIES IN THE EAST GREAT BASIN MAPPING UNIT OF THE SOUTHWEST REGIONAL GAP ANALYSIS PROJECT (SW REGAP)

    Science.gov (United States)

    We tested two methods for dataset generation and model construction, and three tree-classifier variants to identify the most parsimonious and thematically accurate mapping methodology for the SW ReGAP project. Competing methodologies were tested in the East Great Basin mapping un...

  12. On palaeogeographic map

    Directory of Open Access Journals (Sweden)

    Zeng-Zhao Feng

    2016-01-01

The palaeogeographic map is a graphic representation of physical geographical characteristics in geological history periods and human history periods. It is the most important result of palaeogeographic study. The author, as the Editor-in-Chief of the Journal of Palaeogeography (Chinese Edition and English Edition), wrote this paper in response to the problems seen in articles submitted to and published in the Journal of Palaeogeography in recent years and in the relevant papers and books of others, integrating his own practice of palaeogeographic study and mapping. The content mainly includes: the data of palaeogeographic mapping; the problems of palaeogeographic mapping method; the "single factor analysis and multifactor comprehensive mapping method", a methodology of quantitative lithofacies palaeogeography also known as the "4 steps mapping method"; the nomenclature and explanation of each palaeogeographic unit in a palaeogeographic map; the significance of palaeogeographic maps and palaeogeographic articles; the evaluative standards of palaeogeographic maps and palaeogeographic articles; and a self-evaluation. Criticisms and corrections are welcome.

  13. BAC-HAPPY mapping (BAP mapping): a new and efficient protocol for physical mapping.

    Directory of Open Access Journals (Sweden)

    Giang T H Vu

    2010-02-01

Physical and linkage mapping underpin efforts to sequence and characterize the genomes of eukaryotic organisms by providing a skeleton framework for whole genome assembly. Hitherto, linkage and physical "contig" maps were generated independently prior to merging. Here, we develop a new and easy method, BAC-HAPPY mapping (BAP mapping), that utilizes BAC library pools as a HAPPY mapping panel together with an Mbp-sized DNA panel to integrate the linkage and physical mapping efforts into one pipeline. Using Arabidopsis thaliana as an exemplar, a set of 40 Sequence Tagged Site (STS) markers spanning approximately 10% of chromosome 4 were simultaneously assembled onto a BAP map compiled using both a series of BAC pools each comprising 0.7x genome coverage and dilute (0.7x genome) samples of sheared genomic DNA. The resultant BAP map overcomes the need for polymorphic loci to separate genetic loci by recombination and allows physical mapping in segments of suppressed recombination that are difficult to analyze using traditional mapping techniques. Even virtual "BAC-HAPPY mapping" to convert BAC landing data into BAC linkage contigs is possible.

  14. Compilation of historical information of 300 Area facilities and activities

    International Nuclear Information System (INIS)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information of the 300 Area activities and facilities since the beginning. The 300 Area is shown as it looked in 1945, and also a more recent (1985) look at the 300 Area is provided

  15. Statistical Compilation of the ICT Sector and Policy Analysis | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  16. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

Testing forms a critical part of the development process for large-scale software, and there is a growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general-purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran 90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.
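
The flavor of a source-based testing query, reading a program's syntax tree and reporting facts useful to testers, can be shown with Python's own `ast` module. This is an analogy only: ROSE itself targets C, C++, and Fortran, and the query below (finding functions without docstrings, over a made-up snippet) is an assumed example, not a ROSE API.

```python
import ast

SOURCE = '''
def scale(x, factor):
    return x * factor

def checked_div(a, b):
    """Divide a by b, raising ValueError when b is zero."""
    if b == 0:
        raise ValueError("division by zero")
    return a / b
'''

def undocumented_functions(source):
    """Whole-program query: names of function definitions lacking a
    docstring, a small example of the source-level analyses a testing
    tool might automate."""
    tree = ast.parse(source)
    return [node.name for node in ast.walk(tree)
            if isinstance(node, ast.FunctionDef)
            and ast.get_docstring(node) is None]
```

An infrastructure like ROSE provides the analogous parse/traverse/transform machinery for compiled languages, plus source regeneration, which is what makes customized testing tools feasible at the million-line scale.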

  17. The Compilation of a Shona Children's Dictionary: Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Peniah Mabaso

    2011-10-01

This article outlines the challenges encountered by the African Languages Research Institute (ALRI) team members in the compilation of the monolingual Shona Children's Dictionary. The focus is mainly on the problems met in headword selection. Solutions by the team members when dealing with these problems are also presented.

  18. Compilation of a global inventory of emissions of nitrous oxide

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

    A global inventory with 1°x1° resolution was compiled of emissions of nitrous oxide (N 2 O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing,

  19. A Journey from Interpreters to Compilers and Virtual Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

We review a simple sequence of steps to stage a programming-language interpreter into a compiler and virtual machine. We illustrate the applicability of this derivation with a number of existing virtual machines, mostly for functional languages. We then outline its relevance for today's language...
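
The endpoints of such a derivation can be sketched for a toy expression language: a direct interpreter, the compiler obtained by making it emit instructions instead of values, and the stack machine that consumes those instructions. The sketch below is an illustrative assumption in Python, not the paper's formal staging steps.

```python
def interpret(expr):
    """Step 0: a direct interpreter for a tiny expression language,
    where an expression is ('lit', n) or ('add', e1, e2)."""
    if expr[0] == "lit":
        return expr[1]
    return interpret(expr[1]) + interpret(expr[2])

def compile_expr(expr):
    """A later stage: the same traversal, but emitting postfix
    instructions instead of computing the value."""
    if expr[0] == "lit":
        return [("push", expr[1])]
    return compile_expr(expr[1]) + compile_expr(expr[2]) + [("add",)]

def run(code):
    """The matching virtual machine: a stack interpreter for the
    compiled instruction list."""
    stack = []
    for op in code:
        if op[0] == "push":
            stack.append(op[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack.pop()

expr = ("add", ("lit", 1), ("add", ("lit", 2), ("lit", 3)))
# interpret(expr) and run(compile_expr(expr)) agree on every expression.
```

The point of the derivation is that the compiler and virtual machine are not invented separately: both fall out of systematically transforming the interpreter.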

  20. Indexed compilation of experimental high energy physics literature. [Synopsis

    Energy Technology Data Exchange (ETDEWEB)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is presented, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  1. Final report: Compiled MPI. Cost-Effective Exascale Application Development

    Energy Technology Data Exchange (ETDEWEB)

    Gropp, William Douglas [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-12-21

This is the final report on Compiled MPI: Cost-Effective Exascale Application Development, and summarizes the results under this project. The project investigated runtime environments that improve the performance of MPI (Message-Passing Interface) programs; work at Illinois in the last period of this project looked at optimizing data accesses expressed with MPI datatypes.

  2. Not mere lexicographic cosmetics: the compilation and structural ...

    African Journals Online (AJOL)

This article offers a brief overview of the compilation of the Ndebele music terms dictionary, Isichazamazwi SezoMculo (henceforth the ISM), paying particular attention to its structural features. It emphasises that the reference needs of the users as well as their reference skills should be given a determining role in all ...

  3. Compilation of the nuclear codes available in CTA

    International Nuclear Information System (INIS)

    D'Oliveira, A.B.; Moura Neto, C. de; Amorim, E.S. do; Ferreira, W.J.

    1979-07-01

    The present work is a compilation of some nuclear codes available in the Divisao de Estudos Avancados of the Instituto de Atividades Espaciais, (EAV/IAE/CTA). The codes are organized as the classification given by the Argonne National Laboratory. In each code are given: author, institution of origin, abstract, programming language and existent bibliography. (Author) [pt

  4. Compilation of historical information of 300 Area facilities and activities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information of the 300 Area activities and facilities since the beginning. The 300 Area is shown as it looked in 1945, and also a more recent (1985) look at the 300 Area is provided.

  5. DJ Prinsloo and BP Sathekge (compil- ers — revised edition).

    African Journals Online (AJOL)

The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the ...

  6. Statistical Compilation of the ICT Sector and Policy Analysis | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  7. Updated site compilation of the Latin American Pollen Database

    NARCIS (Netherlands)

    Flantua, S.G.A.; Hooghiemstra, H.; Grimm, E.C.; Behling, H.; Bush, M.B; González-Arrango, C.; Gosling, W.D.; Ledru, M.-P.; Lozano-Garciá, S.; Maldonado, A.; Prieto, A.R.; Rull, V.; van Boxel, J.H.

    2015-01-01

    The updated inventory of the Latin American Pollen Database (LAPD) offers a wide range of new insights. This paper presents a systematic compilation of palynological research in Latin America. A comprehensive inventory of publications in peer-reviewed and grey literature shows a major expansion of

  8. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    In order to support concept literacy, especially for students for whom English is not the native language, a number of universities in South Africa are compiling multilingual glossaries through which the use of languages other than English may be employed as auxiliary media. Terminologies in languages other than English ...

  9. Approximate Compilation of Constraints into Multivalued Decision Diagrams

    DEFF Research Database (Denmark)

    Hadzic, Tarik; Hooker, John N.; O’Sullivan, Barry

    2008-01-01

    We present an incremental refinement algorithm for approximate compilation of constraint satisfaction models into multivalued decision diagrams (MDDs). The algorithm uses a vertex splitting operation that relies on the detection of equivalent paths in the MDD. Although the algorithm is quite gene...
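
The target representation can be illustrated with a minimal exact compilation of one constraint into a layered diagram, where nodes reached by different paths but carrying the same state merge. This sketch is an assumption for illustration only: it shows state merging via memoization, not the paper's incremental vertex-splitting refinement (which works in the opposite direction, splitting nodes of a coarse relaxed MDD).

```python
from functools import lru_cache

# Compile the constraint x1 + x2 + x3 <= 2, each xi in {0, 1}, into a
# layered decision diagram: a node is (layer, partial_sum), and nodes
# reached by different paths with the same partial sum merge into one.
N_VARS, BOUND = 3, 2

@lru_cache(maxsize=None)
def paths(layer, partial_sum):
    """Accepting paths below the node (layer, partial_sum); memoization
    plays the role of merging equivalent diagram nodes."""
    if partial_sum > BOUND:
        return 0          # infeasible node: prune this branch
    if layer == N_VARS:
        return 1          # terminal node: one accepting path
    return paths(layer + 1, partial_sum) + paths(layer + 1, partial_sum + 1)

solutions = paths(0, 0)   # satisfying assignments of the constraint
```

Exact diagrams like this can blow up in width, which is why approximate compilation starts from a width-bounded relaxation and refines it by splitting vertices whose merged paths are not truly equivalent.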

  10. National energetic balance. Statistical compilation 1985-1991

    International Nuclear Information System (INIS)

    1992-01-01

This report compiles the statistical information supplied by the governmental and private institutions that make up the national energy sector in Paraguay. The first part covers total energy supply; the second, energy transformation centres; and the last part presents energy flows, consolidated balances and other energy-economic indicators

  11. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    account for the multilingual concept literacy glossaries being compiled under the auspices of .... a theory, i.e. the set of premises, arguments and conclusions required for explaining ... fully address cognitive and communicative needs, especially of laypersons. ..... tion at UCT, and in indigenous languages as auxiliary media.

  12. Thoughts and views on the compilation of monolingual dictionaries ...

    African Journals Online (AJOL)

    The end-products should be of a high lexicographic standard, well-balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries could be easily ...

  13. 13 CFR 146.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

13 CFR 146.600: Semi-annual compilation. Business Credit and Assistance; Small Business Administration; New Restrictions on Lobbying. ... (c) Information that involves intelligence matters shall be reported only to the Select Committee on...

  14. Indexed compilation of experimental high energy physics literature

    International Nuclear Information System (INIS)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is presented, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given

  15. Individual risk. A compilation of recent British data

    International Nuclear Information System (INIS)

    Grist, D.R.

    1978-08-01

    A compilation of data is presented on individual risk obtained from recent British population and mortality statistics. Risk data presented include: risk of death, as a function of age, due to several important natural causes and due to accidents and violence; risk of death as a function of location of accident; and risk of death from various accidental causes. (author)

  16. A compilation of Sr and Nd isotope data on Mexico

    International Nuclear Information System (INIS)

    Verma, S.P.; Verma, M.P.

    1986-01-01

    A compilation is given of the available Sr and Nd isotope data on Mexican volcanic-plutonic terranes which cover about one-third of Mexico's territory. The available data are arranged according to a subdivision of the Mexican territory in terms of geological provinces. Furthermore, site and province averages and standard deviations are calculated and their petrogenetic implications are pointed out. (author)

  17. QMODULE: CAMAC modules recognized by the QAL compiler

    International Nuclear Information System (INIS)

    Kellogg, M.; Minor, M.M.; Shlaer, S.; Spencer, N.; Thomas, R.F. Jr.; van der Beken, H.

    1977-10-01

    The compiler for the Q Analyzer Language, QAL, recognizes a certain set of CAMAC modules as having known characteristics. The conventions and procedures used to describe these modules are discussed, as well as the tools available to the user for extending this set as required.

  18. Automating Visualization Service Generation with the WATT Compiler

    Science.gov (United States)

    Bollig, E. F.; Lyness, M. D.; Erlebacher, G.; Yuen, D. A.

    2007-12-01

    As tasks and workflows become increasingly complex, software developers are devoting increasing attention to automation tools. Among many examples, the Automator tool from Apple collects components of a workflow into a single script, with very little effort on the part of the user. Tasks are most often described as a series of instructions, and the granularity of the tasks dictates the tools to use. Compilers translate fine-grained instructions to assembler code, while scripting languages (Ruby, Perl) are used to describe a series of tasks at a higher level. Compilers can also be viewed as transformational tools: a cross-compiler can translate executable code written on one computer to assembler code understood on another, while transformational tools can translate from one high-level language to another. We are interested in creating visualization web services automatically, starting from stand-alone VTK (Visualization Toolkit) code written in Tcl. To this end, using the OCaml programming language, we have developed a compiler that translates Tcl into C++, including all the stubs, classes and methods needed to interface with gSOAP, a C++ implementation of the SOAP 1.1/1.2 protocols. This compiler, referred to as the Web Automation and Translation Toolkit (WATT), is the first step towards the automated creation of specialized visualization web services without input from the user. The WATT compiler seeks to automate all aspects of web service generation, including the transport layer, the division of labor and the details related to interface generation. The WATT compiler is part of ongoing efforts within the NSF-funded VLab consortium [1] to facilitate and automate time-consuming tasks for the science of planetary materials. Through examples of services produced by WATT for the VLab portal, we will illustrate features, limitations and the improvements necessary to achieve the ultimate goal of complete and transparent automation in the generation of web services.
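The core idea of such a translator can be conveyed with a deliberately tiny sketch. The snippet below is not WATT (which is written in OCaml and emits C++ with gSOAP bindings); it is a hypothetical Python toy that maps two Tcl commands onto C++ statements, just to show the shape of a source-to-source translation pass.

```python
import re

def translate_tcl_to_cpp(tcl_lines):
    """Translate a tiny, invented subset of Tcl (set/puts) into C++ statements.

    A toy illustration of source-to-source translation in the spirit of WATT;
    the real compiler handles full VTK/Tcl scripts and generates gSOAP
    interface code, which this sketch does not attempt.
    """
    out = []
    for line in tcl_lines:
        line = line.strip()
        m = re.match(r"set (\w+) (\S+)$", line)
        if m:  # Tcl variable assignment -> C++ declaration
            out.append(f"auto {m.group(1)} = {m.group(2)};")
            continue
        m = re.match(r"puts (\w+)$", line)
        if m:  # Tcl output -> C++ stream output
            out.append(f"std::cout << {m.group(1)} << std::endl;")
    return out
```

For example, `translate_tcl_to_cpp(["set x 42", "puts x"])` yields the two corresponding C++ statements.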

  19. A Language for Specifying Compiler Optimizations for Generic Software

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, Jeremiah J. [Indiana Univ., Bloomington, IN (United States)

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.

  20. Integrating laser-range finding, electronic compass measurements and GPS to rapidly map vertical changes in volcanic stratigraphy and constrain unit thicknesses and volumes: two examples from the northern Cordilleran volcanic province

    Science.gov (United States)

    Nogier, M.; Edwards, B. R.; Wetherell, K.

    2005-12-01

    We present preliminary results of laser-range finding-GPS surveys from two separate locations in northern British Columbia, in the south-central northern Cordilleran volcanic province: Hoodoo Mountain volcano and Craven Lake cone. This technique, described in detail below, is appropriate for rapidly measuring changes in the vertical thicknesses of units that would be difficult or impossible to measure by most other techniques. The ability to accurately measure thicknesses of geologic units in otherwise difficult-to-access locations will aid in generating better quantitative estimates of deposit geometries and eruption volumes. Such data are particularly important for constraining quantitative models of magma production and eruption dynamics. The deposits of interest in this study comprised at least partly inaccessible, largely pyroclastic units, although the technique could be used to map any vertical surfaces. The first field location was the northern side of Hoodoo Mountain volcano (56°47′23.72″N, 131°17′36.97″W, 1208 m asl), where a sequence of welded to unwelded, trachytic-phonolitic tephra was deposited in a paleovalley. This deposit is informally referred to as the Pointer Ridge deposit, and it comprises at least 7 distinct subunits. The horizontal limit of the exposures is approximately 1.5 km, and the vertical limit is approximately 250 m. Three different GPS base stations were used to map the lateral and vertical variations in the deposit. The second field location is north of Craven Lake (56°54′44.55″N, 129°21′42.17″W, 1453 m asl), along Craven Creek, where a sequence of basaltic tephra is overlain by pillow lava and glacial diamicton. This exposure is 200 m long and approximately 30 m high, much smaller than the area mapped at Hoodoo Mountain. The basaltic tephra appears to comprise 4 distinct sequences (measured thicknesses vary from 3-4 m), not including the overlying pillow lava (measured thickness varies from 2 to 10 m), and measurements of the

  1. Evaluation of the FIR Example using Xilinx Vivado High-Level Synthesis Compiler

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Zheming [Argonne National Lab. (ANL), Argonne, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Yoshii, Kazutomo [Argonne National Lab. (ANL), Argonne, IL (United States); Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-07-28

    Compared to central processing units (CPUs) and graphics processing units (GPUs), field-programmable gate arrays (FPGAs) have major advantages in reconfigurability and performance achieved per watt. The traditional FPGA development flow has been augmented with a high-level synthesis (HLS) flow that can convert programs written in a high-level programming language to a hardware description language (HDL). Using high-level programming languages such as C, C++, and OpenCL for FPGA-based development allows software developers with little FPGA knowledge to take advantage of FPGA-based application acceleration. This improves developer productivity and makes FPGA-based acceleration accessible to both hardware and software developers. The Xilinx Vivado HLS compiler is a high-level synthesis tool that enables C, C++, and SystemC specifications to be targeted directly at Xilinx FPGAs without the need to create RTL manually. A white paper [1] published recently by Xilinx uses a finite impulse response (FIR) example to demonstrate the variable-precision features in the Vivado HLS compiler and the resource and power benefits of converting a design from floating point to fixed point. To better understand the variable-precision features in terms of resource usage and performance, this report presents experimental results from evaluating the FIR example using Vivado HLS 2017.1 and a Kintex UltraScale FPGA. In addition, we evaluated the half-precision floating-point data type against the double-precision and single-precision data types and present the detailed results.
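The floating- to fixed-point conversion discussed above can be illustrated independently of any HLS tool. The following sketch is a hypothetical Python model, not anything from the Xilinx white paper: it quantizes FIR coefficients and samples to a Q1.15-style format and lets the fixed-point result be compared against the floating-point reference.

```python
def fir_float(samples, coeffs):
    """Direct-form FIR filter in floating point: y[i] = sum_k h[k] * x[i-k]."""
    n = len(coeffs)
    padded = [0.0] * (n - 1) + list(samples)
    return [sum(coeffs[k] * padded[i + n - 1 - k] for k in range(n))
            for i in range(len(samples))]

def to_fixed(x, frac_bits):
    """Quantize a real value to a signed integer with frac_bits fractional bits."""
    return round(x * (1 << frac_bits))

def fir_fixed(samples, coeffs, frac_bits=15):
    """Same filter in Q1.15-style fixed-point arithmetic (illustrative only)."""
    qc = [to_fixed(c, frac_bits) for c in coeffs]
    qs = [to_fixed(s, frac_bits) for s in samples]
    n = len(qc)
    padded = [0] * (n - 1) + qs
    out = []
    for i in range(len(samples)):
        # Integer multiply-accumulate; the product carries 2*frac_bits fraction bits.
        acc = sum(qc[k] * padded[i + n - 1 - k] for k in range(n))
        out.append(acc / float(1 << (2 * frac_bits)))  # scale back to real units
    return out
```

Feeding a unit impulse through both versions recovers the coefficients, with the fixed-point version differing only by quantization error.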

  2. Estimating hourly direct and diffuse solar radiation for the compilation of solar radiation distribution maps

    International Nuclear Information System (INIS)

    Ueyama, H.

    2005-01-01

    This paper presents a new method for estimating hourly direct and diffuse solar radiation. The essence of the method is the estimation of two important factors related to solar radiation, atmospheric transmittance and a dimensionless parameter, using empirical and physical equations and data from general meteorological observation stations. An equation for the atmospheric transmittance of direct solar radiation and a dimensionless parameter representing diffuse solar radiation are developed. The equation is based on multiple regression analysis and uses three parameters as explanatory variates: calculated hourly extraterrestrial solar radiation on a horizontal plane, observed hourly sunshine duration, and hourly precipitation as observed at a local meteorological observatory. The dimensionless parameter for estimating diffuse solar radiation is then determined by linear least squares using observed hourly solar radiation at a local meteorological observatory. The estimated root mean square error (RMSE) of hourly direct and diffuse solar radiation is about 0.0-0.2 MJ·m⁻²·h⁻¹ in each mean period. The RMSE of the ten-day and monthly means of these quantities is also about 0.0-0.2 MJ·m⁻²·h⁻¹, based on comparisons with data from an AMeDAS station located at a distance of 6 km
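The multiple-regression step described above can be sketched as follows. The data and coefficients here are entirely synthetic, invented for illustration; the real method calibrates against records from local meteorological observatories.

```python
import numpy as np

# Synthetic illustration of the fitting step: regress an hourly target
# quantity on three explanatory variates (extraterrestrial radiation,
# sunshine duration, precipitation), then report the RMSE of the fit.
rng = np.random.default_rng(0)
n = 200
extraterrestrial = rng.uniform(0.5, 4.0, n)   # MJ m^-2 h^-1 (made up)
sunshine = rng.uniform(0.0, 1.0, n)           # h (made up)
precipitation = rng.uniform(0.0, 5.0, n)      # mm (made up)

true_beta = np.array([0.1, 0.05, 0.6, -0.04])  # intercept + three slopes, invented
X = np.column_stack([np.ones(n), extraterrestrial, sunshine, precipitation])
y = X @ true_beta + rng.normal(0.0, 0.01, n)   # add observation noise

# Ordinary least squares: recover the regression coefficients.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
rmse = np.sqrt(np.mean((X @ beta - y) ** 2))
```

With this little noise the fitted slopes land close to the invented values, and `rmse` is on the order of the noise standard deviation.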

  3. Compiled MPI: Cost-Effective Exascale Applications Development

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Quinlan, D; Lumsdaine, A; Hoefler, T

    2012-04-10

    The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over

  4. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    , block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs......-directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter....
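The first Futamura projection mentioned above can be shown in miniature: specializing an interpreter with respect to a fixed program yields a compiled residual program that takes only the runtime input. The sketch below is a Python toy for a tiny invented expression language, not the type-directed partial evaluator of the paper.

```python
def interpret(prog, env):
    """A tiny expression interpreter; prog is a nested tuple AST."""
    op = prog[0]
    if op == 'lit':
        return prog[1]
    if op == 'var':
        return env[prog[1]]
    if op == 'add':
        return interpret(prog[1], env) + interpret(prog[2], env)
    if op == 'mul':
        return interpret(prog[1], env) * interpret(prog[2], env)
    raise ValueError(op)

def specialize(prog):
    """Specialize the interpreter to a fixed prog, emitting Python source.

    This is the first Futamura projection in miniature: partially evaluating
    interpret() with respect to prog leaves a residual program over env.
    """
    def emit(p):
        op = p[0]
        if op == 'lit':
            return repr(p[1])
        if op == 'var':
            return f'env[{p[1]!r}]'
        if op == 'add':
            return f'({emit(p[1])} + {emit(p[2])})'
        if op == 'mul':
            return f'({emit(p[1])} * {emit(p[2])})'
        raise ValueError(op)
    return eval(f'lambda env: {emit(prog)}')

prog = ('add', ('mul', ('var', 'x'), ('lit', 3)), ('lit', 1))  # x*3 + 1
compiled = specialize(prog)
```

The interpreted and compiled versions agree on every input; the compiled one no longer pays the cost of AST dispatch.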

  6. Compilation and synthesis for embedded reconfigurable systems an aspect-oriented approach

    CERN Document Server

    Diniz, Pedro; Coutinho, José; Petrov, Zlatko

    2013-01-01

    This book provides techniques to tackle the design challenges raised by the increasing diversity and complexity of emerging, heterogeneous architectures for embedded systems. It describes an approach based on aspect-oriented programming, a set of techniques from software engineering that allows designers to control today's sophisticated design tool chains while maintaining a single application source code. Readers are introduced to the basic concepts of an aspect-oriented, domain-specific language that enables control of a wide range of compilation and synthesis tools in the partitioning and mapping of an application to a heterogeneous (and possibly multi-core) target architecture. Several examples are presented that illustrate the benefits of the approach for applications from avionics and digital signal processing. Using the aspect-oriented programming techniques presented in this book, developers can reuse extensive sections of their designs, while preserving the original application source-...

  7. Compiler-Enforced Cache Coherence Using a Functional Language

    Directory of Open Access Journals (Sweden)

    Rich Wolski

    1996-01-01

    The cost of hardware cache coherence, both in terms of execution delay and operational cost, is substantial for scalable systems. Fortunately, compiler-generated cache management can reduce program serialization due to cache contention, increase execution performance, and reduce the cost of parallel systems by eliminating the need for more expensive hardware support. In this article, we use the Sisal functional language system as a vehicle to implement and investigate automatic, compiler-based cache management. We describe our implementation of Sisal for the IBM Power/4. The Power/4, briefly available as a product, represents an early attempt to build a shared memory machine that relies strictly on the language system for cache coherence. We discuss the issues associated with deterministic execution and program correctness on a system without hardware coherence, and demonstrate how Sisal (as a functional language) is able to address those issues.

  8. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1991-01-01

    Reliability data are essential in probabilistic safety assessment, with component reliability parameters being particularly important. Plant-specific component failure data would be most appropriate, but such data are rather limited. However, similar components are used in different designs, so generic data, that is, data not specific to the plant being analyzed but relating to components more generally, are important. The International Atomic Energy Agency has compiled the Generic Component Reliability Data Base from data available in the open literature. It is part of the IAEA computer code package for fault/event tree analysis. The Data Base contains 1010 different records covering most of the components used in probabilistic safety analyses of nuclear power plants. The data base input was quality controlled and data sources were noted. The data compilation procedure and the problems associated with using generic data are explained. (UK)

  9. Methodology and procedures for compilation of historical earthquake data

    International Nuclear Information System (INIS)

    1987-10-01

    This report was prepared subsequent to the recommendations of the project initiation meeting in Vienna, November 25-29, 1985, under the IAEA Interregional Project INT/9/066, Seismic Data for Nuclear Power Plant Siting. The aim of the project is to co-ordinate the national efforts of Member States in the Mediterranean region in the compilation and processing of historical earthquake data for the siting of nuclear facilities. The main objective of the document is to assist the participating Member States, especially those initiating an NPP siting programme, in their efforts to compile and process historical earthquake data, and to provide a uniform interregional framework for this task. Although the document is directed mainly to the Mediterranean countries, with illustrative examples drawn from this region, the basic procedures and methods described herein may be applicable to other parts of the world, such as Southeast Asia, the Himalayan belt, Latin America, etc. 101 refs, 7 figs

  10. A Forth interpreter and compiler's study for computer aided design

    International Nuclear Information System (INIS)

    Djebbar, F. Zohra Widad

    1986-01-01

    The wide field of utilization of FORTH led us to develop an interpreter. It has been implemented on an MC 68000 microprocessor-based computer running ASTERIX, a UNIX-like real-time operating system written by C.E.A. This work was done in two different versions: - The first, fully written in the C language, ensures good portability across a wide variety of microprocessors, but performance estimates revealed excessive execution times and led to a new, optimized version. - This new version is characterized by the compilation of the most frequently used words of the FORTH basis. This yields an interpreter with good performance and an execution speed close to that of code produced by the C compiler. (author) [fr
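The flavor of such an interpreter can be conveyed with a minimal sketch. The following toy FORTH-style evaluator in Python supports only a handful of invented primitives operating on a data stack; it is, of course, unrelated to the MC 68000 implementation described above.

```python
def forth_eval(source, stack=None):
    """Evaluate whitespace-separated FORTH-style words over a data stack.

    A deliberately tiny sketch: numbers are pushed, and four primitive
    words manipulate the stack. A real FORTH adds a dictionary of
    user-defined words, a return stack, and compilation semantics.
    """
    stack = [] if stack is None else stack
    words = {
        '+':    lambda s: s.append(s.pop() + s.pop()),
        '*':    lambda s: s.append(s.pop() * s.pop()),
        'dup':  lambda s: s.append(s[-1]),
        'swap': lambda s: s.extend([s.pop(), s.pop()]),  # exchange top two items
    }
    for tok in source.split():
        if tok in words:
            words[tok](stack)
        else:
            stack.append(int(tok))  # anything else is a number literal
    return stack
```

For example, `forth_eval("2 3 + 4 *")` leaves `[20]` on the stack.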

  11. Mode automata and their compilation into fault trees

    International Nuclear Information System (INIS)

    Rauzy, Antoine

    2002-01-01

    In this article, we advocate the use of mode automata as a high-level representation language for reliability studies. Mode automata are state/transition-based representations with the additional notion of flow. They can be seen as a generalization of both finite-capacity Petri nets and block diagrams. They can be assembled into hierarchies by means of composition operations. The contribution of this article is twofold. First, we introduce mode automata and discuss their relationship with other formalisms. Second, we propose an algorithm to compile mode automata into Boolean equations (fault trees). Such a compilation is of interest for two reasons. First, assessment tools for Boolean models are much more efficient than those for state/transition models. Second, the automated generation of fault trees from higher-level representations makes their maintenance easier through the life cycle of the systems under study.
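The target of the compilation described above, a fault tree expressed as a set of Boolean equations, can be sketched as follows. The automaton-to-equation algorithm itself is not reproduced here, and the events and gates are invented purely for illustration.

```python
# A fault tree as Boolean equations: each intermediate event is an AND/OR
# gate over basic events or other intermediate events. This shows only the
# target representation of the compilation, evaluated for a given
# assignment of basic-event truth values.
AND, OR = all, any

fault_tree = {
    # hypothetical events, for illustration only
    'system_failure': (OR, ['pump_subsystem', 'valve_stuck']),
    'pump_subsystem': (AND, ['pump_a_fails', 'pump_b_fails']),
}

def evaluate(event, basic_events, tree=fault_tree):
    """Recursively evaluate a fault-tree event over basic-event truth values."""
    if event not in tree:
        return basic_events[event]     # leaf: a basic event
    gate, children = tree[event]
    return gate(evaluate(c, basic_events, tree) for c in children)
```

With redundant pumps, the top event requires both pumps to fail (or the valve to stick), which the evaluation reproduces.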

  12. Deep knowledge and knowledge compilation for dynamic systems

    International Nuclear Information System (INIS)

    Mizoguchi, Riichiro

    1994-01-01

    Expert systems are viewed as knowledge-based systems that efficiently solve real-world problems based on the expertise contained in their knowledge bases, elicited from domain experts. Although expert systems that depend on the heuristics of domain experts have contributed to the current success, they are known to be brittle and hard to build. This paper is concerned with research on model-based diagnosis and knowledge compilation for dynamic systems conducted by the author's group to overcome these difficulties. Firstly, we summarize the advantages and shortcomings of expert systems. Secondly, deep knowledge and knowledge compilation are discussed. Then, the latest results of our research on model-based diagnosis are reviewed. The future direction of knowledge-base technology research is also discussed. (author)

  13. Geological mapping of the Kuiper quadrangle (H06) of Mercury

    Science.gov (United States)

    Giacomini, Lorenza; Massironi, Matteo; Galluzzi, Valentina

    2017-04-01

    Kuiper quadrangle (H06) is located in the equatorial zone of Mercury and encompasses the area between longitudes 288°E - 360°E and latitudes 22.5°N - 22.5°S. The quadrangle was previously mapped in large part by De Hon et al. (1981), who, using Mariner 10 data, produced a final 1:5M-scale map of the area. In this work we present the preliminary results of a more detailed geological map (1:3M scale) of the Kuiper quadrangle, compiled using the higher-resolution MESSENGER data. The main basemap used for the mapping is the MDIS (Mercury Dual Imaging System) 166 m/pixel BDR (map-projected Basemap reduced Data Record) mosaic. Additional datasets were also taken into account, such as the DLR stereo-DEM of the region (Preusker et al., 2016), global mosaics with high-incidence illumination from the east and west (Chabot et al., 2016) and the MDIS global color mosaic (Denevi et al., 2016). The preliminary geological map shows that the western part of the quadrangle is characterized by a prevalence of crater materials (i.e. crater floor, crater ejecta), which were distinguished into three classes on the basis of their degree of degradation (Galluzzi et al., 2016). Different plains units were also identified and classified as: (i) intercrater plains, represented by densely cratered terrains; (ii) intermediate plains, terrains with a moderate density of superposed craters; and (iii) smooth plains, poorly cratered volcanic deposits emplaced mainly on the larger crater floors. Finally, several structures were mapped all over the quadrangle. Most of these features are thrusts, some of which appear to form systematic alignments. In particular, two main thrust systems have been identified: i) the "Thakur" system, a 1500 km-long system including several scarps with a NNE-SSW orientation, located at the edge between the Kuiper and Beethoven (H07) quadrangles; ii) the "Santa Maria" system, located at the centre of the quadrangle. It is a 1700 km

  14. Just-In-Time compilation of OCaml byte-code

    OpenAIRE

    Meurer, Benedikt

    2010-01-01

    This paper presents various improvements that were applied to OCamlJIT2, a Just-In-Time compiler for the OCaml byte-code virtual machine. OCamlJIT2 currently runs on various Unix-like systems with x86 or x86-64 processors. The improvements, including the new x86 port, are described in detail, and performance measures are given, including a direct comparison of OCamlJIT2 to OCamlJIT.

  15. Compilation of data on γγ → hadrons

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1986-06-01

    Data on γγ → hadrons extracted from e⁺e⁻ reactions are compiled. The review includes inclusive cross-sections, structure functions, exclusive cross-sections and resonance widths. Data up to 1 July 1986 are included. All the data in this review can be found and retrieved in the Durham-RAL HEP database, together with a wide range of other reaction data. Users throughout Europe can interactively access the database through CMS on the RAL computer. (author)

  16. Data compilation for radiation effects on ceramic insulators

    International Nuclear Information System (INIS)

    Fukuya, Koji; Terasawa, Mititaka; Nakahigashi, Shigeo; Ozawa, Kunio.

    1986-08-01

    Data on radiation effects on ceramic insulators were compiled from the literature and summarized from the viewpoint of fast-neutron irradiation effects. The data were classified according to property and type of ceramic. The properties are dimensional stability, mechanical properties, thermal properties, and electrical and dielectric properties. Data sheets were prepared for each table or graph in the literature. The characteristic features of the data base are briefly described. (author)

  17. Compiling a corpus-based dictionary grammar: an example for ...

    African Journals Online (AJOL)

    In this article it is shown how a corpus-based dictionary grammar may be compiled, that is, a mini-grammar fully based on corpus data and specifically written for use in and integrated with a dictionary. Such an effort is, to the best of our knowledge, a world first. We exemplify our approach for a Northern Sotho ...

  18. CRECTJ: a computer program for compilation of evaluated nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-09-01

    In order to compile evaluated nuclear data in the ENDF format, the computer program CRECTJ has been developed. CRECTJ has two versions: CRECTJ5 treats data in the ENDF/B-IV and ENDF/B-V formats, and CRECTJ6 data in the ENDF-6 format. These programs have been used frequently to produce the Japanese Evaluated Nuclear Data Library (JENDL). This report describes the input data and gives examples of CRECTJ usage. (author)
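The record layout such a program must handle can be sketched with a minimal reader. The column layout assumed below (six 11-character data fields, then MAT, MF and MT control numbers) follows the ENDF-6 convention; this is an illustrative toy, not part of CRECTJ.

```python
def parse_endf_record(line):
    """Split one 80-column ENDF-6 style record into content and control fields.

    Assumed layout (1-based columns): 1-66 hold six 11-character data fields,
    67-70 the material number MAT, 71-72 the file number MF, 73-75 the
    section number MT. A minimal reader for illustration only.
    """
    line = line.rstrip('\n').ljust(80)
    fields = [line[i:i + 11].strip() for i in range(0, 66, 11)]
    return {
        'fields': fields,
        'MAT': int(line[66:70]),
        'MF': int(line[70:72]),
        'MT': int(line[72:75]),
    }
```

For example, a record whose control columns read ` 125 3  1` parses to MAT 125, MF 3, MT 1.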

  19. Declarative and Scalable Selection for Map Visualizations

    DEFF Research Database (Denmark)

    Kefaloukos, Pimin Konstantin Balic

    and is itself a source and cause of prolific data creation. This calls for scalable map processing techniques that can handle the data volume and which play well with the predominant data models on the Web. (4) Maps are now consumed around the clock by a global audience. While historical maps were singleuser......-defined constraints as well as custom objectives. The purpose of the language is to derive a target multi-scale database from a source database according to holistic specifications. (b) The Glossy SQL compiler allows Glossy SQL to be scalably executed in a spatial analytics system, such as a spatial relational......, there are indications that the method is scalable for databases that contain millions of records, especially if the target language of the compiler is substituted by a cluster-ready variant of SQL. While several realistic use cases for maps have been implemented in CVL, additional non-geographic data visualization uses...

  20. Compiling knowledge-based systems from KEE to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications; most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBS developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights into Ada as an artificial intelligence programming language, potential solutions to some of the engineering difficulties encountered in early work, and inspiration for future system development.

  1. Compilation of piping benchmark problems - Cooperative international effort

    Energy Technology Data Exchange (ETDEWEB)

    McAfee, W J [comp.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations.

  2. Fault-tolerant digital microfluidic biochips compilation and synthesis

    CERN Document Server

    Pop, Paul; Stuart, Elena; Madsen, Jan

    2016-01-01

    This book describes for researchers in the fields of compiler technology, design and test, and electronic design automation the new area of digital microfluidic biochips (DMBs), and thus offers a new application area for their methods. The authors present a routing-based model of operation execution, along with several associated compilation approaches, which progressively relax the assumption that operations execute inside fixed rectangular modules. Since operations can experience transient faults during the execution of a bioassay, the authors show how to use both offline (design time) and online (runtime) recovery strategies. The book also presents methods for the synthesis of fault-tolerant application-specific DMB architectures. Presents the current models used for the research on compilation and synthesis techniques of DMBs in a tutorial fashion; includes a set of "benchmarks", which are presented in great detail and includes the source code of most of the t...

  3. Compilation of piping benchmark problems - Cooperative international effort

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations.

  4. Renewable energy atlas of the United States.

    Energy Technology Data Exchange (ETDEWEB)

    Kuiper, J.A.; Hlava, K.; Greenwood, H.; Carr, A. (Environmental Science Division)

    2012-05-01

    The Renewable Energy Atlas (Atlas) of the United States is a compilation of geospatial data focused on renewable energy resources, federal land ownership, and base map reference information. It is designed for the U.S. Department of Agriculture Forest Service (USFS) and other federal land management agencies to evaluate existing and proposed renewable energy projects. Much of the content of the Atlas was compiled at Argonne National Laboratory (Argonne) to support recent and current energy-related Environmental Impact Statements and studies, including the following projects: (1) West-wide Energy Corridor Programmatic Environmental Impact Statement (PEIS) (BLM 2008); (2) Draft PEIS for Solar Energy Development in Six Southwestern States (DOE/BLM 2010); (3) Supplement to the Draft PEIS for Solar Energy Development in Six Southwestern States (DOE/BLM 2011); (4) Upper Great Plains Wind Energy PEIS (WAPA/USFWS 2012, in progress); and (5) Energy Transport Corridors: The Potential Role of Federal Lands in States Identified by the Energy Policy Act of 2005, Section 368(b) (in progress). This report explains how to add the Atlas to your computer and install the associated software; describes each of the components of the Atlas; lists the Geographic Information System (GIS) database content and sources; and provides a brief introduction to the major renewable energy technologies.

  5. Encounters of aircraft with volcanic ash clouds; A compilation of known incidents, 1953-2009

    Science.gov (United States)

    Guffanti, Marianne; Casadevall, Thomas J.; Budding, Karin

    2010-01-01

    Information about reported encounters of aircraft with volcanic ash clouds from 1953 through 2009 has been compiled to document the nature and scope of risks to aviation from volcanic activity. The information, gleaned from a variety of published and other sources, is presented in database and spreadsheet formats; the compilation will be updated as additional encounters occur and as new data and corrections come to light. The effects observed by flight crews and extent of aircraft damage vary greatly among incidents, and each incident in the compilation is rated according to a severity index. Of the 129 reported incidents, 94 incidents are confirmed ash encounters, with 79 of those having various degrees of airframe or engine damage; 20 are low-severity events that involve suspected ash or gas clouds; and 15 have data that are insufficient to assess severity. Twenty-six of the damaging encounters involved significant to very severe damage to engines and (or) airframes, including nine encounters with engine shutdown during flight. The average annual rate of damaging encounters since 1976, when reporting picked up, has been approximately 2 per year. Most of the damaging encounters occurred within 24 hours of the onset of ash production or at distances less than 1,000 kilometers from the source volcanoes. The compilation covers only events of relatively short duration for which aircraft were checked for damage soon thereafter; documenting instances of long-term repeated exposure to ash (or sulfate aerosols) will require further investigation. Of 38 source volcanoes, 8 have caused 5 or more encounters, of which the majority were damaging: Augustine (United States), Chaiten (Chile), Mount St. Helens (United States), Pacaya (Guatemala), Pinatubo (Philippines), Redoubt (United States), Sakura-jima (Japan), and Soufriere Hills (Montserrat, Lesser Antilles, United Kingdom). Aircraft have been damaged by eruptions ranging from small, recurring episodes to very large

  6. Surficial Geologic Map of the Worcester North-Oxford-Wrentham-Attleboro Nine-Quadrangle Area in South-Central Massachusetts

    Science.gov (United States)

    Stone, Byron D.; Stone, Janet R.; DiGiacomo-Cohen, Mary L.

    2008-01-01

    The surficial geologic map layer shows the distribution of nonlithified earth materials at land surface in an area of nine 7.5-minute quadrangles (417 mi2 total) in south-central Massachusetts (fig. 1). Across Massachusetts, these materials range from a few feet to more than 500 ft in thickness. They overlie bedrock, which crops out in upland hills and in resistant ledges in valley areas. The geologic map differentiates surficial materials of Quaternary age on the basis of their lithologic characteristics (such as grain size and sedimentary structures), constructional geomorphic features, stratigraphic relationships, and age. Surficial materials also are known in engineering classifications as unconsolidated soils, which include coarse-grained soils, fine-grained soils, or organic fine-grained soils. Surficial materials underlie and are the parent materials of modern pedogenic soils, which have developed in them at the land surface. Surficial earth materials significantly affect human use of the land, and an accurate description of their distribution is particularly important for water resources, construction aggregate resources, earth-surface hazards assessments, and land-use decisions. The mapped distribution of surficial materials that lie between the land surface and the bedrock surface is based on detailed geologic mapping of 7.5-minute topographic quadrangles, produced as part of an earlier (1938-1982) cooperative statewide mapping program between the U.S. Geological Survey and the Massachusetts Department of Public Works (now Massachusetts Highway Department) (Page, 1967; Stone, 1982). Each published geologic map presents a detailed description of local geologic map units, the genesis of the deposits, and age correlations among units. Previously unpublished field compilation maps exist on paper or mylar sheets and these have been digitally rendered for the present map compilation. Regional summaries based on the Massachusetts surficial geologic mapping

  7. NEA contributions to the worldwide collection, compilation and dissemination of nuclear reaction data

    International Nuclear Information System (INIS)

    Dupont, E.

    2012-01-01

    The NEA Data Bank is an international centre of reference for basic nuclear tools used in the analysis and prediction of phenomena in different nuclear applications. The Data Bank collects and compiles computer codes and scientific data and contributes to their improvement for the benefit of scientists in its member countries. In line with this mission, the Data Bank is a core centre of the International Network of Nuclear Reaction Data Centres (NRDC), which co-ordinates the worldwide collection, compilation and dissemination of nuclear reaction data. The NRDC network was established in 1976 from the earlier Four-Centres' Network created in 1966 by the United States, the NEA, the International Atomic Energy Agency (IAEA) and the former Soviet Union. Today, the NRDC is a worldwide co-operation network under the auspices of the IAEA, with 14 nuclear data centres from 8 countries and 2 international organisations belonging to the network. The main objective of the NRDC is to preserve, update and disseminate experimental nuclear reaction data that have been compiled for more than 40 years in a shared database (EXFOR). The EXFOR database contains basic nuclear data on low- to medium-energy experiments for incident neutron, photon and various charged-particle-induced reactions on a wide range of isotopes, natural elements and compounds. Today, with more than 140 000 data sets from approximately 20 000 experiments, EXFOR is by far the most important and complete experimental nuclear reaction database in the world and is widely used in the field of nuclear science and technology. The Data Bank is responsible for the collection and compilation of nuclear reaction data measured in its geographical area. Since 1966, the Data Bank has contributed around 5 000 experiments to the EXFOR database, and it continues to compile new data while maintaining the highest level of quality throughout the database. NRDC co-ordination meetings are held on a biennial basis. Recent meetings

  8. The National Assessment of Shoreline Change: A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the Sandy Shorelines of the California Coast

    Science.gov (United States)

    Hapke, Cheryl J.; Reid, David

    2006-01-01

    The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector shorelines and shoreline change rates for the sandy shoreline along the California open coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along many open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard, repeatable methods for mapping and analyzing shoreline movement so that periodic, systematic, and internally consistent updates of shorelines and shoreline change rates can be made at a national scale. This data compilation for open-ocean, sandy shorelines of the California coast is one in a series that already includes the Gulf of Mexico and the Southeast Atlantic Coast (Morton et al., 2004; Morton et al., 2005) and will eventually cover Washington, Oregon, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are determined by comparing the positions of three historical shorelines digitized from maps, with a modern shoreline derived from LIDAR (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time-periods: 1850s-1880s, 1920s-1930s, and late 1940s-1970s. The most recent shoreline is from data collected between 1997 and 2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are end-point rate calculations using the two most recent shorelines.
Please refer to our full report on shoreline change of the

  9. A Preliminary Study on the Use of Mind Mapping as a Visual-Learning Strategy in General Education Science Classes for Arabic Speakers in the United Arab Emirates

    Science.gov (United States)

    Wilson, Kenesha; Copeland-Solas, Eddia; Guthrie-Dixon, Natalie

    2016-01-01

    Mind mapping was introduced as a culturally relevant pedagogy aimed at enhancing the teaching and learning experience in a general education, Environmental Science class for mostly Emirati English Language Learners (ELL). Anecdotal evidence suggests that the students are very artistic and visual and enjoy group-based activities. It was decided to…

  10. Bedrock Geologic Map of Vermont - Faults and Contacts

    Data.gov (United States)

    Vermont Center for Geographic Information — The bedrock geology was last mapped at a statewide scale 50 years ago at a scale of 1:250,000 (Doll and others, 1961). The 1961 map was compiled from 1:62,500-scale...

  11. Bedrock Geologic Map of Vermont - Geochronology Sample Locations

    Data.gov (United States)

    Vermont Center for Geographic Information — The bedrock geology was last mapped at a statewide scale 50 years ago at a scale of 1:250,000 (Doll and others, 1961). The 1961 map was compiled from 1:62,500-scale...

  12. Multi-hazard Non-regulatory Risk Maps for Resilient Coastal Communities of Washington State in Pacific Northwest Region of the United States

    Science.gov (United States)

    Cakir, R.; Walsh, T. J.; Zou, Y.; Gufler, T.; Norman, D. K.

    2015-12-01

    The Washington Department of Natural Resources - Division of Geology and Earth Resources (WADNR-DGER) partnered with FEMA through the FEMA Cooperating Technical Partners (CTP) program to assess annualized losses from flood and other hazards and prepare supportive risk-related data for FEMA's coastal RiskMAP projects. We used HAZUS-MH analysis to assess losses from earthquake, flood, and other potential hazards such as landslide and tsunami in the project areas: Grays Harbor, Pacific, Skagit, Whatcom, Island, Mason, Clallam, Jefferson, and San Juan counties, on the shorelines of the Pacific Ocean and Puget Sound of Washington. FEMA's Hazus-MH tool was applied to estimate losses and damages for each building due to floods and earthquakes. User-defined facilities (UDF) inventory data were prepared and used for individual building damage estimations and updating general building stocks. Flood depth grids were used to determine which properties are most impacted by flooding. For example, the HAZUS-MH (flood model) run based on the 1% annual chance event (or 100-year flood) for Grays Harbor County resulted in a total of $161 million in losses to buildings, including residential and commercial properties and other building and occupancy types. A likely M9 megathrust Cascadia earthquake scenario USGS ShakeMap was used for the HAZUS-MH earthquake model. For example, the HAZUS-MH (earthquake model) run based on the Cascadia M9 earthquake for Grays Harbor County resulted in a total of $1.15 billion in losses to the building inventory. We produced GIS-based overlay maps of properties exposed to tsunami, landslide, and liquefaction hazards within the communities. This multi-hazard approach is an essential component in producing non-regulatory maps for FEMA's RiskMAP project, and it helps further improve local and regional mitigation efforts, emergency response plans, and the overall resiliency plans of the coastal communities in western Washington.

  13. OpenMP-accelerated SWAT simulation using Intel C and FORTRAN compilers: Development and benchmark

    Science.gov (United States)

    Ki, Seo Jin; Sugimura, Tak; Kim, Albert S.

    2015-02-01

    We developed a practical method to accelerate execution of the Soil and Water Assessment Tool (SWAT) using open (free) computational resources. The SWAT source code (rev 622) was recompiled using a non-commercial Intel FORTRAN compiler on an Ubuntu 12.04 LTS Linux platform, and newly named iOMP-SWAT in this study. GNU utilities of make, gprof, and diff were used to develop the iOMP-SWAT package, profile memory usage, and check identicalness of parallel and serial simulations. Among 302 SWAT subroutines, the slowest routines were identified using GNU gprof, and later modified using the Open Multi-Processing (OpenMP) library in an 8-core shared memory system. In addition, a C wrapping function was used to rapidly set large arrays to zero by cross compiling with the original SWAT FORTRAN package. A universal speedup ratio of 2.3 was achieved using input data sets of a large number of hydrological response units. As we specifically focus on acceleration of a single SWAT run, the use of iOMP-SWAT for parameter calibrations will significantly improve the performance of SWAT optimization.

  14. Report on the lands of the arid region of the United States with a more detailed account of the land of Utah with maps

    Science.gov (United States)

    Powell, John Wesley

    1879-01-01

    A report from Maj. J. W. Powell, geologist in charge of the United States Geographical and Geological Survey of the Rocky Mountain Region, upon the lands of the Arid Region of the United States, setting forth the extent of said region, and making suggestions as to the conditions under which the lands embraced within its limit may be rendered available for agricultural and grazing purposes. With the report is transmitted a statement of the rainfall of the western portion of the United States, with reports upon the subject of irrigation by Capt. C. E. Dutton, U. S. A., Prof. A. H. Thompson, and Mr. G. K. Gilbert.

  15. A new compiler for the GANIL Data Acquisition description

    International Nuclear Information System (INIS)

    Saillant, F.; Raine, B.

    1997-01-01

    An important feature of the GANIL Data Acquisition System is the description of experiments by means of a language developed at GANIL. The philosophy is to attribute to each element (parameters, spectra, etc.) an operational name which will be used at any level of the system. This language references a library of modules to free the user from the technical details of the hardware. This compiler has recently been entirely re-developed using technologies such as the object-oriented language C++ and an object-oriented software development method and tool. This enables us to provide a new functionality or to support a new electronic module within a very short time and without any deep modification of the application. A new Dynamic Library of Modules has also been developed. Its complete description is available on the GANIL WEB site http://ganinfo.in2p3.fr/acquisition/homepage.html. This new compiler brings many new functionalities, the most important of which is the notion of a 'register', whatever the module standard. All the registers described in the module provider's documentation can now be accessed by their names. Another important new feature is the notion of a 'function' that can be executed on a module. A set of new instructions has also been implemented to execute commands on CAMAC crates. This new compiler also enables the description of specific interfaces with the GANIL Data Acquisition System. This has been used to describe the coupling of the CHIMERA Data Acquisition System with the INDRA one through a shared memory in the VME crate. (authors)

  16. Recent Efforts in Data Compilations for Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Dillmann, Iris

    2008-01-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on 'Nuclear Physics Data Compilation for Nucleosynthesis Modeling' held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The 'JINA Reaclib Database' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A 'high priority list' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. 'Workflow tools' aim to make the evaluation process transparent and allow users to follow the progress.

  17. Recent Efforts in Data Compilations for Nuclear Astrophysics

    Science.gov (United States)

    Dillmann, Iris

    2008-05-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on ``Nuclear Physics Data Compilation for Nucleosynthesis Modeling'' held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The ``JINA Reaclib Database'' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A ``high priority list'' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. ``Workflow tools'' aim to make the evaluation process transparent and allow users to follow the progress.

  18. Combining Compile-Time and Run-Time Parallelization

    Directory of Open Access Journals (Sweden)

    Sungdo Moon

    1999-01-01

    This paper demonstrates that significant improvements to automatic parallelization technology require that existing systems be extended in two ways: (1) they must combine high-quality compile-time analysis with low-cost run-time testing; and (2) they must take control flow into account during analysis. We support this claim with the results of an experiment that measures the safety of parallelization at run time for loops left unparallelized by the Stanford SUIF compiler's automatic parallelization system. We present results of measurements on programs from two benchmark suites – SPECFP95 and NAS sample benchmarks – which identify inherently parallel loops in these programs that are missed by the compiler. We characterize remaining parallelization opportunities, and find that most of the loops require run-time testing, analysis of control flow, or some combination of the two. We present a new compile-time analysis technique that can be used to parallelize most of these remaining loops. This technique is designed to not only improve the results of compile-time parallelization, but also to produce low-cost, directed run-time tests that allow the system to defer binding of parallelization until run time when safety cannot be proven statically. We call this approach predicated array data-flow analysis. We augment array data-flow analysis, which the compiler uses to identify independent and privatizable arrays, by associating predicates with array data-flow values. Predicated array data-flow analysis allows the compiler to derive “optimistic” data-flow values guarded by predicates; these predicates can be used to derive a run-time test guaranteeing the safety of parallelization.

  19. Application of ecological mapping

    International Nuclear Information System (INIS)

    Sherk, J.A.

    1982-01-01

    The US Fish and Wildlife Service has initiated the production of a comprehensive ecological inventory map series for use as a major new planning tool. Important species data along with special land use designations are displayed on 1:250,000 scale topographic base maps. Sets of maps have been published for the Atlantic and Pacific coastal areas of the United States. Preparation of a map set for the Gulf of Mexico is underway at the present time. Potential application of ecological inventory map series information to a typical land disposal facility could occur during the narrowing of the number of possible disposal sites, the design of potential disposal site studies of ecological resources, the preparation of the environmental report, and the regulatory review of license applications. 3 figures, 3 tables

  20. abc: An Extensible AspectJ Compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie J.

    2006-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its front end is built using the Polyglot framework, as a modular extension of the Java language. The use of Polyglot gives flexibility of syntax and type checking. The back end is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general...

  1. A compilation of structure functions in deep-inelastic scattering

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1991-01-01

    A compilation of data on the structure functions F2, xF3, and R = σL/σT from lepton deep-inelastic scattering off protons and nuclei is presented. The relevant experiments at CERN, Fermilab and SLAC from 1985 are covered. All the data in this review can be found in and retrieved from the Durham-RAL HEP Databases (HEPDATA on the RAL and CERN VM systems and on DURPDG VAX/VMS) together with data on a wide variety of other reactions. (author)

  2. Environmental sensitivity maps for onshore pipeline rights-of-way; Mapas de sensibilidade ambiental para faixas de dutos terrestres

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Wilson J. de [PETROBRAS Engenharia, Rio de Janeiro, RJ (Brazil). Engenharia de Avaliacao Ambiental IEGEN/EGE/EAMB; Ferreira Filho, Aluisio Teles; Ferreira, Vanderlei Cardoso [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil). SMS - Seguranca, Meio Ambiente e Saude; Braun, Oscar P.G.; Pereira, Junior, Edson Rodrigues [Geodatum, Rio de Janeiro, RJ (Brazil)

    2003-07-01

    To support its contingency plans for oil spills, TRANSPETRO, a PETROBRAS subsidiary, set up an extensive internal program to produce Environmental Sensitivity Maps covering a strip twenty kilometers wide along more than five thousand kilometers of pipelines. Because this thematic cartography survey was pioneering work, a first approach integrating the information at 1:50,000 scale was adopted. Within a Geographical Information System (GIS), supervised geoprocessing resources were used to compile, first, the elevation, soil, and geological maps from which the physical-environment vulnerability units were generated. Through a weighted-average combination, ten vulnerability units were generated; these were then aggregated into five units to reduce the complexity of the map representation. These classes represent combinations of physical-environment variables that can be recognized by their corresponding landscapes. Based on interpretation of orbital LANDSAT TM images, aided by checks on aerial photographs and a systematic survey of notable environmental observation points (PVAs) along the pipelines, a general map of land use and vegetation cover was elaborated. The classes of this theme were combined with the physical-vulnerability classes to generate five classes of Environmental Sensitivity (the Environmental Sensitivity Maps). On this theme were overlaid the main types of vegetation cover and land occupation, as well as the fauna and other socioeconomic aspects, yielding a map with all the essential information for guiding environmental protection measures. (author)

  3. Mapping of wine industry

    Directory of Open Access Journals (Sweden)

    Віліна Пересадько

    2016-10-01

    Having reviewed a variety of approaches to understanding the essence of the wine industry, studied modern ideas about its future, and analyzed more than 50 maps from the Internet, we identified the following trends and special features of wine-industry mapping in the world: the vast majority of maps display the development of the industry at the regional or national level, whereas world maps are practically absent; wine-growing regions are represented on maps very unevenly; all existing maps of the industry can be classified as analytical ascertaining inventory maps; the dominant methods of cartographic representation are the area method and the qualitative background method, while the sign method and collation maps are rarely used; basically all the Internet maps are of low quality, being scanned images with poor resolution; and a special feature of recently published maps is the lack of a geographical base (except for state borders and coastlines). We created the wine production and consumption world map 'Wine Industry' at a scale of 1:60,000,000 with a simple geographical base (state names, state borders, major rivers, coastline). We conclude that, from a methodological point of view, it is incorrect to omit the geographical base from maps of the wine industry. Analysis of this map allowed us to identify areas of traditional wine-making, potential wine-making areas, and countries that claim to be world leaders in wine production. We also found an imbalance between wine production and wine consumption: production is increasing in South America, China, and the United States, while consumption is increasing (mainly through imported products) in countries where the grape is not a primary agricultural product.

  4. Herbal hepatotoxicity: a tabular compilation of reported cases.

    Science.gov (United States)

    Teschke, Rolf; Wolff, Albrecht; Frenzel, Christian; Schulze, Johannes; Eickhoff, Axel

    2012-11-01

    Herbal hepatotoxicity is a field that has grown rapidly over the last few years along with the increased use of herbal products worldwide. To summarize the various facets of this disease, we undertook a literature search for herbs, herbal drugs and herbal supplements with reported cases of herbal hepatotoxicity. A selective literature search was performed to identify published case reports, spontaneous case reports, case series and review articles regarding herbal hepatotoxicity. A total of 185 publications were identified and the results compiled. They show 60 different herbs, herbal drugs and herbal supplements with reported potential hepatotoxicity; additional information, including synonyms of individual herbs, botanical names and cross-references, is provided. If known, details are presented for specific ingredients and chemicals in herbal products, and for references with authors that can be matched to each herbal product and to its effect on the liver. Based on stringent causality assessment methods and/or positive re-exposure tests, causality was highly probable or probable for Ayurvedic herbs, Chaparral, Chinese herbal mixture, Germander, Greater Celandine, green tea, a few Herbalife products, Jin Bu Huan, Kava, Ma Huang, Mistletoe, Senna, Syo Saiko To and Venencapsan®. In many other publications, however, causality was not properly evaluated by a liver-specific causality assessment method validated for hepatotoxicity, such as the scale of CIOMS (Council for International Organizations of Medical Sciences). This compilation presents details of herbal hepatotoxicity, thereby assisting the clinical assessments of involved physicians in the future. © 2012 John Wiley & Sons A/S.

  5. Compiler-Directed Transformation for Higher-Order Stencils

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-20

    As the cost of data movement increasingly dominates performance, developers of finite-volume and finite-difference solutions for partial differential equations (PDEs) are exploring novel higher-order stencils that increase numerical accuracy and computational intensity. This paper describes a new compiler reordering transformation applied to stencil operators that performs partial sums in buffers and reuses the partial sums in computing multiple results. This optimization has multiple effects on improving stencil performance that are particularly important to higher-order stencils: it exploits data reuse, reduces floating-point operations, and exposes efficient SIMD parallelism to backend compilers. We study the benefit of this optimization in the context of Geometric Multigrid (GMG), a widely used method to solve PDEs, using four different Jacobi smoothers built from 7-, 13-, 27-, and 125-point stencils. We quantify performance, speedup, and numerical accuracy, and use the Roofline model to qualify our results. Ultimately, we obtain over 4× speedup on the smoothers themselves and up to a 3× speedup on the multigrid solver. Finally, we demonstrate that high-order multigrid solvers have the potential of reducing total data movement and energy by several orders of magnitude.
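
    The partial-sums idea can be illustrated outside any compiler. For a simple 9-point box smoother, each column sum can be computed once and reused by the three output points that need it. The NumPy sketch below is an independent illustration of this reuse pattern, not the paper's transformation itself, which targets higher-order 3D stencils inside a compiler:

    ```python
    import numpy as np

    def box9_naive(a):
        """Naive 9-point box smoother: sums all 9 neighbors per output point."""
        out = np.zeros_like(a)
        for i in range(1, a.shape[0] - 1):
            for j in range(1, a.shape[1] - 1):
                out[i, j] = a[i-1:i+2, j-1:j+2].sum() / 9.0
        return out

    def box9_partial_sums(a):
        """Same stencil using a buffer of reusable column partial sums.

        Each sum a[i-1,j] + a[i,j] + a[i+1,j] is computed once and reused by
        the three output points that need it, cutting floating-point
        operations and exposing stride-1, SIMD-friendly inner loops.
        """
        col = np.zeros_like(a)
        for i in range(1, a.shape[0] - 1):
            col[i, :] = a[i-1, :] + a[i, :] + a[i+1, :]  # partial sums
        out = np.zeros_like(a)
        for i in range(1, a.shape[0] - 1):
            for j in range(1, a.shape[1] - 1):
                out[i, j] = (col[i, j-1] + col[i, j] + col[i, j+1]) / 9.0
        return out
    ```

    Both routines produce identical results on the interior; the second trades a small buffer for fewer additions, which is the data-movement-versus-flops balance the paper exploits for much larger stencils.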

  6. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1988-01-01

    Reliability data are an essential part of probabilistic safety assessment. The quality of the data can determine the quality of the study as a whole. Component failure data originating from the plant being analyzed would obviously be most appropriate. However, complete reliance on plant experience is possible in only a few cases, mainly because operating experience is rather limited. Nuclear plants, although of different design, often use fairly similar components, so some of the experience can be combined and transferred from one plant to another. In addition, information about component failures is also available from experts with knowledge of component design, manufacturing and operation. That brings us to the importance of assessing generic data. (Here, 'generic' means everything that is not plant specific with regard to the plant being analyzed.) The generic data available in the open literature can be divided into three broad categories. The first includes data bases used in previous analyses; these can be plant specific or generic data updated with plant-specific information (the latter case deserves special attention). The second is based on compilations of plants' operating experience, usually drawn from some kind of event reporting system. The third category includes data sources based on expert opinions (single or aggregated) or on a combination of expert opinions and other nuclear and non-nuclear experience. This paper reflects insights gained in compiling data from generic data sources and highlights advantages and pitfalls of using generic component reliability data in PSAs.
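
    A standard way to combine a generic prior with plant-specific evidence in PSA practice (the abstract alludes to updating generic data with plant-specific information) is a conjugate gamma-Poisson Bayesian update. The prior parameters and event counts below are invented for illustration:

    ```python
    def update_failure_rate(alpha, beta, failures, hours):
        """Conjugate gamma-Poisson update for a component failure rate.

        Prior: rate ~ Gamma(alpha, beta), typically fitted to generic data.
        Evidence: `failures` events observed in `hours` of plant operation.
        Returns the posterior parameters and the posterior mean rate (per hour).
        """
        a_post = alpha + failures
        b_post = beta + hours
        return a_post, b_post, a_post / b_post

    # Hypothetical generic prior: mean 1e-5 /h (alpha=0.5, beta=5e4),
    # updated with 2 plant-specific failures in 1e5 operating hours.
    a_post, b_post, mean = update_failure_rate(0.5, 5.0e4, failures=2, hours=1.0e5)
    ```

    The posterior mean moves from the generic value toward the plant-specific evidence in proportion to how much operating experience the plant contributes, which is exactly the balance between generic and plant-specific data the paper discusses.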

  7. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1.- A detailed description in intermediate coupling of all the levels belonging to the 20 configurations 3p5 ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2.- Calculation of the electron collision excitation cross sections in the Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p5 ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3.- Comparison and discussion of the compiled data, comprising the experimental and theoretical values available in the literature and those from this work. 4.- Analysis of the regularities and systematic behaviors in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose; in some cases it has been possible to obtain in this way approximate analytical expressions interpolating the experimental data. 5.- Graphical presentation and comparison of all the experimental and theoretical values studied. 6.- The last part of the work includes a listing of several general-purpose programs for Atomic Physics calculations developed for this work. (Author) 35 refs

  8. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco Ramos, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1.- A detailed description in intermediate coupling of all the levels belonging to the 20 configurations 3p5 ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2.- Calculation of the electron collision excitation cross sections in the Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p5 ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3.- Comparison and discussion of the compiled data, comprising the experimental and theoretical values available in the literature and those from this work. 4.- Analysis of the regularities and systematic behaviors in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose; in some cases it has been possible to obtain in this way approximate analytical expressions interpolating the experimental data. 5.- Graphical presentation and comparison of all the experimental and theoretical values studied. 6.- The last part of the work includes a listing of several general-purpose programs for Atomic Physics calculations developed for this work. (Author)

  9. Data compilation of angular distributions of sputtered atoms

    International Nuclear Information System (INIS)

    Yamamura, Yasunori; Takiguchi, Takashi; Tawara, Hiro.

    1990-01-01

    Sputtering of a surface is generally caused by the collision cascade developed near the surface; the process is in principle the same as that causing radiation damage in the bulk of solids. Sputtering has long been regarded as an undesirable dirty effect which destroys the cathodes and grids in gas discharge tubes or ion sources and contaminates the plasma and the surrounding walls. Today, however, sputtering is used for many applications, such as sputter ion sources, mass spectrometers and the deposition of thin films. Plasma contamination and the surface erosion of first walls due to sputtering are still major problems in fusion research. The angular distribution of the particles sputtered from solid surfaces can provide detailed information on the collision cascade in the interior of targets. This report presents a compilation of the angular distributions of atoms sputtered from monatomic solids at normal and oblique incidence, for various combinations of incident ions and target atoms. (K.I.)
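
    Measured angular distributions of sputtered atoms at normal incidence are often approximated by a cos^n law with respect to the surface normal. As an illustration of how such a distribution can be sampled (the cos^n form and the exponent are modeling assumptions, not data from this compilation):

    ```python
    import math
    import random

    def sample_polar_angle(n=1.0, rng=random.random):
        """Draw a polar emission angle theta (radians from the surface normal)
        from p(theta) proportional to cos^n(theta) * sin(theta).

        The CDF is F(theta) = 1 - cos^(n+1)(theta); inverting it for a
        uniform random u gives cos(theta) = u^(1/(n+1)).
        """
        u = rng()
        return math.acos(u ** (1.0 / (n + 1.0)))
    ```

    For the plain cosine law (n = 1) the expected value of cos(theta) is 2/3; sharper, more forward-peaked emission corresponds to larger n, which is one of the qualitative features such compilations let one compare across ion-target combinations.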

  10. National Energy Strategy: A compilation of public comments; Interim Report

    Energy Technology Data Exchange (ETDEWEB)

    1990-04-01

    This Report presents a compilation of what the American people themselves had to say about problems, prospects, and preferences in energy. The Report draws on the National Energy Strategy public hearing record and accompanying documents. In all, 379 witnesses appeared at the hearings to exchange views with the Secretary, Deputy Secretary, and Deputy Under Secretary of Energy, and Cabinet officers of other Federal agencies. Written submissions came from more than 1,000 individuals and organizations. Transcripts of the oral testimony and question-and-answer (Q-and-A) sessions, as well as prepared statements submitted for the record and all other written submissions, form the basis for this compilation. Citations of these sources in this document use a system of identifying symbols explained below and in the accompanying box. The Report is organized into four general subject areas concerning: (1) efficiency in energy use, (2) the various forms of energy supply, (3) energy and the environment, and (4) the underlying foundations of science, education, and technology transfer. Each of these, in turn, is subdivided into sections addressing specific topics --- such as (in the case of energy efficiency) energy use in the transportation, residential, commercial, and industrial sectors, respectively. 416 refs., 44 figs., 5 tabs.

  11. Computer and compiler effects on code results: status report

    International Nuclear Information System (INIS)

    1996-01-01

    Within the framework of the international effort to assess computer codes designed to describe the overall reactor coolant system (RCS) thermal-hydraulic response, core damage progression, and fission product release and transport during severe accidents, there has been a continuous debate as to whether code results are influenced by different code users or by different computers or compilers. The first aspect, the 'code user effect', has already been investigated. In this paper the other aspects are discussed, and proposals are given on how to make large system codes insensitive to different computers and compilers. Hardware errors and memory problems are not considered in this report. The codes investigated herein are integrated code systems (e.g. ESTER, MELCOR), thermal-hydraulic system codes with extensions for severe accident simulation (e.g. SCDAP/RELAP, ICARE/CATHARE, ATHLET-CD), and codes that simulate fission product transport (e.g. TRAPMELT, SOPHAEROS). Since all of these codes are programmed in Fortran 77, the discussion herein is based on this programming language, although some remarks are made about Fortran 90. Some observations of different code results obtained on different computers are reported, and possible reasons for this unexpected behaviour are listed. Methods to avoid portability problems are then discussed.
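
    One root cause of compiler-dependent results is that floating-point addition is not associative, so a compiler that reorders, fuses, or vectorizes a summation can legally change the outcome of otherwise identical source code. A two-line illustration of the effect (shown in Python for brevity; the same arithmetic applies to Fortran):

    ```python
    # Floating-point addition is not associative: the grouping chosen by
    # the compiler or optimizer determines whether the small term survives.
    a, b, c = 1.0e16, -1.0e16, 1.0

    left = (a + b) + c   # a + b cancels exactly, then adding c gives 1.0
    right = a + (b + c)  # 1.0 is below the rounding step near 1e16, so
                         # b + c rounds back to b and the total is 0.0
    ```

    In a long reduction inside a thermal-hydraulic code, such reorderings change only the last bits per operation, but iterative solvers can amplify them into visibly different transients, which is consistent with the unexpected behaviour the report describes.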

  12. The FORTRAN NALAP code adapted to a microcomputer compiler

    International Nuclear Information System (INIS)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso

    2010-01-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA (TEcnologia de Reatores Rapidos Avancados, Technology for Advanced Fast Reactors) project, aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to a microcomputer compiler, Compaq Visual Fortran (Version 6.6), is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium-cooled fast reactors. The strategy to run the code on a PC involved several steps, mainly removing unnecessary routines, eliminating obsolete statements, introducing new ones, and including extended precision mode. The source program was able to solve three sample cases under protected-transient conditions suggested in the literature: normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a loss-of-flow transient; and protected overpower transients. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 1980s. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the use of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop (CBCL) to be used as a heat/electric converter. (author)

  13. The FORTRAN NALAP code adapted to a microcomputer compiler

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso, E-mail: plobo.a@uol.com.b, E-mail: eduardo@ieav.cta.b, E-mail: fbraz@ieav.cta.b, E-mail: guimarae@ieav.cta.b [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)

    2010-07-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA (TEcnologia de Reatores Rapidos Avancados, Technology for Advanced Fast Reactors) project, aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to a microcomputer compiler, Compaq Visual Fortran (Version 6.6), is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium-cooled fast reactors. The strategy to run the code on a PC involved several steps, mainly removing unnecessary routines, eliminating obsolete statements, introducing new ones, and including extended precision mode. The source program was able to solve three sample cases under protected-transient conditions suggested in the literature: normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a loss-of-flow transient; and protected overpower transients. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 1980s. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the use of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop (CBCL) to be used as a heat/electric converter. (author)

  14. Thoughts and Views on the Compilation of Monolingual Dictionaries in South Africa

    Directory of Open Access Journals (Sweden)

    N.C.P Golele

    2011-10-01

    Full Text Available Abstract: Developing and documenting the eleven official languages of South Africa on all levels of communication, in order to fulfil all the roles and uses characteristic of truly official languages, is a great challenge. To meet this need, various bodies such as the National Lexicography Units have been established by the Pan South African Language Board (PanSALB). As far as dictionary compilation is concerned, acquaintance with state-of-the-art developments in the theory and practice of lexicography is necessary. The focus for the African languages should be directed onto the compilation of monolingual dictionaries. It is important that these monolingual dictionaries be usable right from the start on a continuous basis. Continued attention should be given to enlarging the corpora and to actual consultation of these corpora on the macro- and microstructural levels. The end-products should be of a high lexicographic standard, well balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space, etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries can be easily and naturally accomplished. Advanced and continued training in the compilation of monolingual dictionaries should be presented. Keywords: MONOLINGUAL DICTIONARIES, OFFICIAL LANGUAGES, DICTIONARY COMPILATION, CORPORA, NATIONAL LEXICOGRAPHY UNITS, TARGET USERS, DICTIONARY USE, DICTIONARY CULTURE, CORE TERMS. Summary (translated from Afrikaans): Thoughts and views on the compilation of monolingual dictionaries in South Africa. The development and documentation of the eleven official languages of South Africa at all levels of communication, in order to fulfil all the roles and uses of truly official languages, is a great challenge. To meet this need, bodies such as the National Lexicography Units were established by the Pan South African Language Board (PanSALB).

  15. Decoupled Vector-Fetch Architecture with a Scalarizing Compiler

    OpenAIRE

    Lee, Yunsup

    2016-01-01

    As we approach the end of conventional technology scaling, computer architects are forced to incorporate specialized and heterogeneous accelerators into general-purpose processors for greater energy efficiency. Among the prominent accelerators that have recently become more popular are data-parallel processing units, such as classic vector units, SIMD units, and graphics processing units (GPUs). Surveying a wide range of data-parallel architectures and their parallel programming models and ...

  16. Compilation of data concerning known and suspected water hammer events in nuclear power plants, CY 1969

    International Nuclear Information System (INIS)

    Chapman, R.L.; Christensen, D.D.; Dafoe, R.E.; Hanner, O.M.; Wells, M.E.

    1981-05-01

    This report compiles data concerning known and suspected water hammer events reported by BWR and PWR power plants in the United States from January 1, 1969, to May 1, 1981. This information is summarized for each event and is tabulated for all events by plant, plant type, year of occurrence, type of water hammer, system affected, basis/cause for the event, and damage incurred. Information is also included from other events not specifically identified as water hammer related. These other events involved vibration and/or system components similar to those involved in the water hammer events. The other events are included to ensure completeness of the report, but are not used to point out particular facts or trends. This report does not evaluate findings abstracted from the data

  17. ShoreZone Mapped Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set is a polyline file of mapped ShoreZone units which correspond with data records found in the Unit, Xshr, BioUnit, and BioBand tables of this...

  18. Geologic map of the Ponca quadrangle, Newton, Boone, and Carroll Counties, Arkansas

    Science.gov (United States)

    Hudson, Mark R.; Murray, Kyle E.

    2003-01-01

    This digital geologic map compilation presents new polygon (i.e., geologic map unit contacts), line (i.e., fault, fold axis, and structure contour), and point (i.e., structural attitude, contact elevations) vector data for the Ponca 7 1/2' quadrangle in northern Arkansas. The map database, which is at 1:24,000-scale resolution, provides geologic coverage of an area of current hydrogeologic, tectonic, and stratigraphic interest. The Ponca quadrangle is located in Newton, Boone, and Carroll Counties about 20 km southwest of the town of Harrison. The map area is underlain by sedimentary rocks of Ordovician, Mississippian, and Pennsylvanian age that were mildly deformed by a series of normal and strike-slip faults and folds. The area is representative of the stratigraphic and structural setting of the southern Ozark Dome. The Ponca quadrangle map provides new geologic information for better understanding groundwater flow paths and development of karst features in and adjacent to the Buffalo River watershed.

  19. International survey of environmental programmes - a compilation of information from twelve countries received in response to a questionnaire distributed in 1992

    International Nuclear Information System (INIS)

    Gyllander, C.; Karlberg, O.; Luening, M.; Larsson, C.M.; Johansson, G.

    1995-11-01

    The report compiles information from Cuba, Finland, Germany, Japan, South Korea, Lithuania, Luxembourg, Malaysia, Romania, Sweden, Switzerland and United Kingdom, relevant to the organisation and execution of programmes for environmental surveillance of nuclear facilities (source and environmental monitoring). 28 refs, 19 tabs

  20. International survey of environmental programmes - a compilation of information from twelve countries received in response to a questionnaire distributed in 1992

    Energy Technology Data Exchange (ETDEWEB)

    Gyllander, C; Karlberg, O; Luening, M; Larsson, C M; Johansson, G

    1995-11-01

    The report compiles information from Cuba, Finland, Germany, Japan, South Korea, Lithuania, Luxembourg, Malaysia, Romania, Sweden, Switzerland and United Kingdom, relevant to the organisation and execution of programmes for environmental surveillance of nuclear facilities (source and environmental monitoring). 28 refs, 19 tabs.

  1. 32 CFR 806b.19 - Information compiled in anticipation of civil action.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Information compiled in anticipation of civil action. 806b.19 Section 806b.19 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR... compiled in anticipation of civil action. Withhold records compiled in connection with a civil action or...

  2. 49 CFR 801.57 - Records compiled for law enforcement purposes.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Records compiled for law enforcement purposes. 801... compiled for law enforcement purposes. Pursuant to 5 U.S.C. 552(b)(7), any records compiled for law or..., would disclose investigative procedures and practices, or would endanger the life or security of law...

  3. Preliminary isostatic gravity map of the Grouse Creek and east part of the Jackpot 30 by 60 quadrangles, Box Elder County, Utah, and Cassia County, Idaho

    Science.gov (United States)

    Langenheim, Victoria; Willis, H.; Athens, N.D.; Chuchel, Bruce A.; Roza, J.; Hiscock, H.I.; Hardwick, C.L.; Kraushaar, S.M.; Knepprath, N.E.; Rosario, Jose J.

    2013-01-01

    A new isostatic residual gravity map of the northwest corner of Utah is based on a compilation of preexisting data and new data collected by the Utah Geological Survey and the U.S. Geological Survey. Pronounced gravity lows occur over Junction, Grouse Creek, and upper Raft River Valleys, indicating a significant thickness of low-density Tertiary sedimentary rocks and deposits. Gravity highs coincide with exposures of dense pre-Cenozoic rocks in the Raft River Mountains. Higher values in the eastern part of the map may be produced in part by deeper crustal density variations or crustal thinning. Steep linear gravity gradients coincide with mapped Neogene normal faults near Goose Creek and may define basin-bounding faults concealed beneath Junction and Upper Raft River Valleys.

  4. GRESS, FORTRAN Pre-compiler with Differentiation Enhancement

    International Nuclear Information System (INIS)

    1999-01-01

    1 - Description of program or function: The GRESS FORTRAN pre-compiler (SYMG) and run-time library are used to enhance conventional FORTRAN-77 programs with analytic differentiation of arithmetic statements for automatic differentiation in either forward or reverse mode. GRESS 3.0 is functionally equivalent to GRESS 2.1. GRESS 2.1 is an improved and updated version of the previously released GRESS 1.1. Improvements in the implementation of the CHAIN option have resulted in a 70 to 85% reduction in execution time and up to a 50% reduction in the memory required for forward chaining applications. 2 - Method of solution: GRESS uses a pre-compiler to analyze FORTRAN statements and determine the mathematical operations embodied in them. As each arithmetic assignment statement in a program is analyzed, SYMG generates the partial derivatives of the term on the left with respect to each floating-point variable on the right. The result of the pre-compilation step is a new FORTRAN program that can produce derivatives for any REAL (i.e., single or double precision) variable calculated by the model. Consequently, GRESS enhances FORTRAN programs or subprograms by adding the calculation of derivatives along with the original output. Derivatives from a GRESS-enhanced model can be used internally (e.g., iteration acceleration) or externally (e.g., sensitivity studies). By calling GRESS run-time routines, derivatives can be propagated through the code via the chain rule (referred to as the CHAIN option) or accumulated to create an adjoint matrix (referred to as the ADGEN option). A third option, GENSUB, makes it possible to process a subset of a program (i.e., a do loop, subroutine, function, a sequence of subroutines, or a whole program) for calculating derivatives of dependent variables with respect to independent variables. A code enhanced with the GENSUB option can use forward mode, reverse mode, or a hybrid of the two modes.
3 - Restrictions on the complexity of the problem: GRESS
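
    The per-statement derivative generation that GRESS performs in forward mode can be mimicked at run time, rather than by source transformation, with a minimal dual-number class. This sketch is an independent illustration of forward-mode differentiation, not GRESS's actual FORTRAN implementation:

    ```python
    class Dual:
        """Forward-mode autodiff value: tracks (value, derivative).

        Each arithmetic operation applies the chain rule, mirroring how a
        source-transformation tool emits one derivative statement per
        arithmetic assignment.
        """
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.dot + other.dot)
        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val * other.val,
                        self.dot * other.val + self.val * other.dot)
        __rmul__ = __mul__

    def f(x):
        return 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2

    x = Dual(2.0, 1.0)   # seed dx/dx = 1
    y = f(x)             # carries both f(2) and f'(2)
    ```

    Seeding one input with derivative 1 and propagating gives one column of the Jacobian per pass (forward mode); accumulating an adjoint instead, as the ADGEN option does, gives one row per pass (reverse mode).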

  5. ON PHYTOCOENOTICAL MAPPING OF CASPIAN DESERT REGION

    Directory of Open Access Journals (Sweden)

    I. SAFRONOVA

    2004-05-01

    Full Text Available The phytoecological map (1:2,500,000) for the Desert Region, including the Caspian Lowland and the Mangyshlak, has been compiled. It gives an idea of the latitudinal differentiation of vegetation, of edaphic variants, and of lithological composition in low mountains. The legend has been constructed according to the zonal-typological principle using an ecological-phytocoenotic classification. Heterogeneity of vegetation is reflected by means of territorial units (complexes, series, combinations) and additional marks above the vegetation background. In the northern subzone the vegetation is fairly monotonous and characterized by the prevalence of wormwood communities (Artemisia of subgenus Seriphidium), joined in three formations: Artemisia lerchiana, A. arenaria, and A. pauciflora. Small areas are occupied by shrub deserts of Calligonum aphyllum and Tamarix ramosissima. Southward of 47° N, in the middle subzone on the Caspian Lowland, communities of halophytic perennial saltworts essentially dominate and, to a lesser extent, wormwood communities of the hemipsammophytic Artemisia terrae-albae and the psammophytic Artemisia arenaria and A. lerchiana. The deserts of Mangyshlak are much more diverse. Dwarf semishrubs are represented by species of perennial saltworts (Anabasis salsa, Nanophyton erinaceum, Arthrophytum lehmannianum, Salsola orientalis) and wormwood (Artemisia terrae-albae, A. gurganica, A. santolina). Southward of 43° N, in the southern subzone, dwarf semishrub Salsola gemmascens and Artemisia kemrudica communities prevail.

  6. ON PHYTOCOENOTICAL MAPPING OF CASPIAN DESERT REGION

    Directory of Open Access Journals (Sweden)

    I. SAFRONOVA

    2004-01-01

    Full Text Available The phytoecological map (1:2,500,000) for the Desert Region, including the Caspian Lowland and the Mangyshlak, has been compiled. It gives an idea of the latitudinal differentiation of vegetation, of edaphic variants, and of lithological composition in low mountains. The legend has been constructed according to the zonal-typological principle using an ecological-phytocoenotic classification. Heterogeneity of vegetation is reflected by means of territorial units (complexes, series, combinations) and additional marks above the vegetation background. In the northern subzone the vegetation is fairly monotonous and characterized by the prevalence of wormwood communities (Artemisia of subgenus Seriphidium), joined in three formations: Artemisia lerchiana, A. arenaria, and A. pauciflora. Small areas are occupied by shrub deserts of Calligonum aphyllum and Tamarix ramosissima. Southward of 47° N, in the middle subzone on the Caspian Lowland, communities of halophytic perennial saltworts essentially dominate and, to a lesser extent, wormwood communities of the hemipsammophytic Artemisia terrae-albae and the psammophytic Artemisia arenaria and A. lerchiana. The deserts of Mangyshlak are much more diverse. Dwarf semishrubs are represented by species of perennial saltworts (Anabasis salsa, Nanophyton erinaceum, Arthrophytum lehmannianum, Salsola orientalis) and wormwood (Artemisia terrae-albae, A. gurganica, A. santolina). Southward of 43° N, in the southern subzone, dwarf semishrub Salsola gemmascens and Artemisia kemrudica communities prevail.

  7. The National Assessment of Shoreline Change: A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the U.S. Gulf of Mexico

    Science.gov (United States)

    Miller, Tara L.; Morton, Robert A.; Sallenger, Asbury H.; Moore, Laura J.

    2004-01-01

    Introduction The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive database of digital vector shorelines and shoreline change rates for the U.S. Gulf of Mexico. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along most open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard repeatable methods for mapping and analyzing shoreline movement so that periodic updates regarding coastal erosion and land loss can be made nationally that are systematic and internally consistent. This data compilation for open-ocean, sandy shorelines of the Gulf of Mexico is the first in a series that will eventually include the Atlantic Coast, Pacific Coast, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are based on merging three historical shorelines with a modern shoreline derived from lidar (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1800s, 1920s-1930s, and 1970s. The most recent shoreline is derived from data collected over the period of 1998-2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are simple end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change in the Gulf of Mexico, National Assessment of Shoreline Change: Part 1, Historical Shoreline Changes and Associated Coastal Land Loss Along the U.S. Gulf of Mexico (USGS Open File
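
    The two rate definitions named above, an end-point rate from the two most recent shorelines and a long-term rate by linear regression through all four, can be sketched as follows. The transect positions and survey years below are invented for illustration, not data from the assessment:

    ```python
    # Hypothetical shoreline positions (metres seaward of a baseline) at
    # survey years roughly matching the report's four shoreline eras.
    years     = [1855, 1930, 1975, 2000]
    positions = [120.0, 95.0, 80.0, 70.0]   # made-up transect data

    def end_point_rate(years, pos):
        """Short-term rate: change between the two most recent shorelines."""
        return (pos[-1] - pos[-2]) / (years[-1] - years[-2])

    def linear_regression_rate(years, pos):
        """Long-term rate: ordinary least-squares slope through all shorelines."""
        n = len(years)
        my, mp = sum(years) / n, sum(pos) / n
        num = sum((y - my) * (p - mp) for y, p in zip(years, pos))
        den = sum((y - my) ** 2 for y in years)
        return num / den
    ```

    A negative rate indicates erosion (the shoreline retreating toward the baseline); the regression rate damps the influence of any single survey, which is why it is preferred for long-term trends while the end-point rate captures recent change.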

  8. Geologic map of the Maumee quadrangle, Searcy and Marion Counties, Arkansas

    Science.gov (United States)

    Turner, Kenzie J.; Hudson, Mark R.

    2010-01-01

    This map summarizes the geology of the Maumee 7.5-minute quadrangle in northern Arkansas. The map area is in the Ozark plateaus region on the southern flank of the Ozark dome. The Springfield Plateau, composed of Mississippian cherty limestone, overlies the Salem Plateau, composed of Ordovician carbonate and clastic rocks, with areas of Silurian rocks in between. Erosion related to the Buffalo River and its tributaries, Tomahawk, Water, and Dry Creeks, has exposed a 1,200-ft-thick section of Mississippian, Silurian, and Ordovician rocks mildly deformed by faults and folds. An approximately 130-mile-long corridor along the Buffalo River forms the Buffalo National River, which is administered by the National Park Service. McKnight (1935) mapped the geology of the Maumee quadrangle as part of a larger 1:125,000-scale map focused on understanding the lead and zinc deposits common in the area. Detailed new mapping for this study was compiled using a Geographic Information System (GIS) at 1:24,000 scale. Site location and elevation were obtained by using a Global Positioning System (GPS) receiver in conjunction with a U.S. Geological Survey 7.5-minute topographic map and a barometric altimeter. U.S. Geological Survey 10-m digital elevation model data were used to derive a hill-shade relief map, used along with digital orthophotographs to map ledge-forming units between field sites. Bedding attitudes were measured in drainage bottoms and on well-exposed ledges. Bedding measured at less than 2 degrees dip is indicated as horizontal. Structure contours constructed for the base of the Boone Formation are constrained by field-determined elevations on both upper and lower formation contacts.

  9. USGS Elevation Contours Overlay Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Elevation Contours service from The National Map (TNM) consists of contours generated for the conterminous United States from 1- and 1/3 arc-second...

  10. Methodological challenges involved in compiling the Nahua pharmacopeia.

    Science.gov (United States)

    De Vos, Paula

    2017-06-01

    Recent work in the history of science has questioned the Eurocentric nature of the field and sought to include a more global approach that would serve to displace center-periphery models in favor of approaches that take seriously local knowledge production. Historians of Iberian colonial science have taken up this approach, which involves reliance on indigenous knowledge traditions of the Americas. These traditions present a number of challenges to modern researchers, including availability and reliability of source material, issues of translation and identification, and lack of systematization. This essay explores the challenges that emerged in the author's attempt to compile a pre-contact Nahua pharmacopeia, the reasons for these challenges, and the ways they may - or may not - be overcome.

  11. Reporting session of UWTF operation. Compilation of documents

    International Nuclear Information System (INIS)

    Shimizu, Kaoru; Togashi, Akio; Irinouchi, Shigenori

    1999-07-01

This is a compilation of the papers and OHP transparencies presented, as well as the discussions and comments, on the occasion of the UWTF reporting session. UWTF stands for the Second Uranium Waste Treatment Facility, which was constructed for compression of metallic wastes and used filters, both parts of the uranium-bearing solid wastes generated from Tokai Works, Japan Nuclear Cycle Development Institute. UWTF has been processing wastes since June 4, 1998. In the session, based on the one year of experience of UWTF operation, the difficulties encountered and suggestions to the waste sources are mainly discussed. A brief summary of the UWTF construction, a description of the waste treatment process, and the operation report for fiscal year 1998 are attached. (A. Yamamoto)

  12. COMPILATION OF LABORATORY SCALE ALUMINUM WASH AND LEACH REPORT RESULTS

    International Nuclear Information System (INIS)

    Harrington, S.J.

    2011-01-01

This report compiles and analyzes all known wash and caustic leach laboratory studies. As further data are produced, this report will be updated. Included are aluminum mineralogical analysis results as well as a summation of the wash and leach procedures and results. Of the 177 underground storage tanks at Hanford, information was only available for five individual double-shell tanks, forty-one individual single-shell tanks (i.e., thirty-nine 100-series and two 200-series tanks), and twelve grouped tank wastes. Seven of the individual single-shell tank studies provided data for the percent of aluminum removal as a function of time for various caustic concentrations and leaching temperatures. It was determined that in most cases increased leaching temperature, caustic concentration, and leaching time lead to increased dissolution of leachable aluminum solids.

  13. [Version and compilation of Harikyuuhousouyou of Vietnamese medical book].

    Science.gov (United States)

    Du, Fengjuan; Xiao, Yongzhi

    2018-02-12

Harikyuuhousouyou (《》) was written in 1827, and the author is unknown. The book has only one version, which is held by the National Library of Vietnam. The book comprises one volume and includes contraindications of acupuncture and moxibustion, meridian points, point locations, indications, and the therapeutic methods at extraordinary points. These are mainly cited from Zhen Jiu Da Quan (《》 Great Compendium on Acupuncture and Moxibustion ) by XU Feng, Yi Xue Ru Men (《》 Elementary Medicine ) by LI Chan, and Shou Shi Bao Yuan (《》 Longevity and Health Preservation ) by GONG Tingxian of the Ming Dynasty. In this paper, the hand-copied book is introduced with respect to the characteristics of its version and compilation. It is shown that Vietnamese acupuncture absorbed Chinese medicine and emphasized clinical practice rather than theoretical statements.

  14. COMPILATION OF LABORATORY SCALE ALUMINUM WASH AND LEACH REPORT RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    HARRINGTON SJ

    2011-01-06

This report compiles and analyzes all known wash and caustic leach laboratory studies. As further data are produced, this report will be updated. Included are aluminum mineralogical analysis results as well as a summation of the wash and leach procedures and results. Of the 177 underground storage tanks at Hanford, information was only available for five individual double-shell tanks, forty-one individual single-shell tanks (i.e., thirty-nine 100-series and two 200-series tanks), and twelve grouped tank wastes. Seven of the individual single-shell tank studies provided data for the percent of aluminum removal as a function of time for various caustic concentrations and leaching temperatures. It was determined that in most cases increased leaching temperature, caustic concentration, and leaching time lead to increased dissolution of leachable aluminum solids.

  15. Compilation and evaluation of a Paso del Norte emission inventory

    Energy Technology Data Exchange (ETDEWEB)

    Funk, T.H.; Chinkin, L.R.; Roberts, P.T. [Sonoma Technology, Inc., 1360 Redwood Way, Suite C, 94954-1169 Petaluma, CA (United States); Saeger, M.; Mulligan, S. [Pacific Environmental Services, 5001 S. Miami Blvd., Suite 300, 27709 Research Triangle Park, NC (United States); Paramo Figueroa, V.H. [Instituto Nacional de Ecologia, Avenue Revolucion 1425, Nivel 10, Col. Tlacopac San Angel, Delegacion Alvaro Obregon, C.P., 01040, D.F. Mexico (Mexico); Yarbrough, J. [US Environmental Protection Agency - Region 6, 1445 Ross Avenue, Suite 1200, 75202-2733 Dallas, TX (United States)

    2001-08-10

Emission inventories of ozone precursors are routinely used as input to comprehensive photochemical air quality models. Photochemical model performance and the development of effective control strategies rely on the accuracy and representativeness of the underlying emission inventory. This paper describes the tasks undertaken to compile and evaluate an ozone precursor emission inventory for the El Paso/Ciudad Juarez/Southern Dona Ana region. Point, area, and mobile source emission data were obtained from local government agencies and were spatially and temporally allocated to a gridded domain using region-specific demographic and land-cover information. The inventory was then processed using the US Environmental Protection Agency (EPA) recommended Emissions Preprocessor System 2.0 (UAM-EPS 2.0), which generates emissions files compatible with the Urban Airshed Model (UAM). A top-down evaluation of the emission inventory was performed to examine how well the inventory represented ambient pollutant compositions. The top-down evaluation methodology employed in this study compares emission inventory ratios of non-methane hydrocarbon (NMHC) to nitrogen oxides (NO{sub x}) and of carbon monoxide (CO) to NO{sub x} with the corresponding ambient ratios. Detailed NMHC species comparisons were made in order to investigate the relative composition of individual hydrocarbon species in the emission inventory and in the ambient data. The emission inventory compiled during this effort has since been used to model ozone in the Paso del Norte airshed (Emery et al., CAMx modeling of ozone and carbon monoxide in the Paso del Norte airshed. In: Proc. of the Ninety-Third Annual Meeting of the Air and Waste Management Association, 18-22 June 2000, Air and Waste Management Association, Pittsburgh, PA, 2000).
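The top-down check described in this abstract reduces to comparing two ratios from the inventory against the same ratios measured in ambient air. The sketch below shows only that arithmetic; the emission totals and ambient ratios are invented placeholders, and a real comparison also requires converting mass emissions to the units of the ambient measurements, which is omitted here.

```python
# Hypothetical daily emission totals (tons/day) and ambient ratios.
inventory = {"NMHC": 240.0, "NOx": 80.0, "CO": 900.0}
ambient   = {"NMHC/NOx": 3.4, "CO/NOx": 12.0}

def ratio(numerator, denominator):
    return numerator / denominator

def discrepancy(inv_ratio, amb_ratio):
    """Relative difference; large magnitudes flag possible inventory bias."""
    return (inv_ratio - amb_ratio) / amb_ratio

inv_nmhc_nox = ratio(inventory["NMHC"], inventory["NOx"])
inv_co_nox   = ratio(inventory["CO"], inventory["NOx"])

d_nmhc = discrepancy(inv_nmhc_nox, ambient["NMHC/NOx"])
d_co   = discrepancy(inv_co_nox, ambient["CO/NOx"])
```

If the inventory NMHC/NOx ratio falls well below the ambient ratio, for example, that is commonly read as evidence of underestimated hydrocarbon emissions or overestimated NOx emissions.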

  16. abc the aspectBench compiler for aspectJ a workbench for aspect-oriented programming language and compilers research

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    Aspect-oriented programming (AOP) is gaining popularity as a new way of modularising cross-cutting concerns. The aspectbench compiler (abc) is a new workbench for AOP research which provides an extensible research framework for both new language features and new compiler optimisations. This poste...

  17. Quantum Programs as Kleisli Maps

    Directory of Open Access Journals (Sweden)

    Abraham Westerbaan

    2017-01-01

Furber and Jacobs have shown in their study of quantum computation that the category of commutative C*-algebras and PU-maps (positive linear maps which preserve the unit) is isomorphic to the Kleisli category of a comonad on the category of commutative C*-algebras with MIU-maps (linear maps which preserve multiplication, involution, and unit) [Furber and Jacobs, 2013]. In this paper, we prove a non-commutative variant of this result: the category of C*-algebras and PU-maps is isomorphic to the Kleisli category of a comonad on the subcategory of MIU-maps. A variation on this result has been used to construct a model of Selinger and Valiron's quantum lambda calculus using von Neumann algebras [Cho and Westerbaan, 2016].
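The Kleisli construction this abstract builds on can be illustrated in a far simpler setting than C*-algebras: for any monad (here the Maybe/Optional monad), the Kleisli category has the same objects but its arrows are maps into the monad's image, composed by threading the effect. This sketch is purely illustrative of the general notion and says nothing about the comonad on MIU-maps used in the paper; all function names are hypothetical.

```python
from typing import Callable, Optional, TypeVar

A = TypeVar("A"); B = TypeVar("B"); C = TypeVar("C")

def kleisli_compose(f: Callable[[A], Optional[B]],
                    g: Callable[[B], Optional[C]]) -> Callable[[A], Optional[C]]:
    """Compose two Kleisli arrows A -> Optional[B] and B -> Optional[C]."""
    def h(a: A) -> Optional[C]:
        b = f(a)
        return None if b is None else g(b)
    return h

# Two partial operations expressed as Kleisli arrows for Optional.
def safe_sqrt(x: float) -> Optional[float]:
    return x ** 0.5 if x >= 0 else None

def safe_inv(x: float) -> Optional[float]:
    return 1 / x if x != 0 else None

inv_sqrt = kleisli_compose(safe_sqrt, safe_inv)  # x -> 1 / sqrt(x), when defined
```

The point of the abstract's result is analogous in spirit: PU-maps compose like Kleisli arrows over a (co)monadic structure built from the better-behaved MIU-maps.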

  18. Semantics-informed cartography: the case of Piemonte Geological Map

    Science.gov (United States)

    Piana, Fabrizio; Lombardo, Vincenzo; Mimmo, Dario; Giardino, Marco; Fubelli, Giandomenico

    2016-04-01

In modern digital geological maps, namely those supported by a large geo-database and devoted to dynamical, interactive representation on WMS-WebGIS services, there is the need to provide, in an explicit form, the geological assumptions used for the design and compilation of the database of the Map, and to get a definition and/or adoption of semantic representation and taxonomies, in order to achieve a formal and interoperable representation of the geologic knowledge. These approaches are fundamental for the integration and harmonisation of geological information and services across cultural (e.g. different scientific disciplines) and/or physical barriers (e.g. administrative boundaries). Initiatives such as GeoScience Markup Language (last version is GeoSciML 4.0, 2015, http://www.geosciml.org) and the INSPIRE "Data Specification on Geology" http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_GE_v3.0rc3.pdf (an operative simplification of GeoSciML, last version is 3.0 rc3, 2013), as well as the recent terminological shepherding of the Geoscience Terminology Working Group (GTWG), have been promoting information exchange of the geologic knowledge. Grounded on these standard vocabularies, schemas, and data models, we provide a shared semantic classification of geological data referring to the case study of the synthetic digital geological map of the Piemonte region (NW Italy), named "GEOPiemonteMap", developed by the CNR Institute of Geosciences and Earth Resources, Torino (CNR IGG TO) and hosted as a dynamical interactive map on the geoportal of the ARPA Piemonte Environmental Agency. The Piemonte Geological Map is grounded on a regional-scale geo-database consisting of some hundreds of GeologicUnits whose thousands of instances (Mapped Features, polygon geometry) widely occur in the Piemonte region, each one bounded by GeologicStructures (Mapped Features, line geometry). GeologicUnits and GeologicStructures have been spatially

  19. Topographic mapping

    Science.gov (United States)

    ,

    2008-01-01

    The U.S. Geological Survey (USGS) produced its first topographic map in 1879, the same year it was established. Today, more than 100 years and millions of map copies later, topographic mapping is still a central activity for the USGS. The topographic map remains an indispensable tool for government, science, industry, and leisure. Much has changed since early topographers traveled the unsettled West and carefully plotted the first USGS maps by hand. Advances in survey techniques, instrumentation, and design and printing technologies, as well as the use of aerial photography and satellite data, have dramatically improved mapping coverage, accuracy, and efficiency. Yet cartography, the art and science of mapping, may never before have undergone change more profound than today.

  20. Waste management units - Savannah River Site

    International Nuclear Information System (INIS)

    1989-10-01

This report is a compilation of worksheets from the waste management units of the Savannah River Plant. Information is presented on the following: Solid Waste Management Units having received hazardous waste or hazardous constituents with a known release to the environment; Solid Waste Management Units having received hazardous waste or hazardous constituents with no known release to the environment; Solid Waste Management Units having received no hazardous waste or hazardous constituents; and Waste Management Units having received source, special nuclear, or byproduct material only.

  1. Single-edition quadrangle maps

    Science.gov (United States)

    ,

    1998-01-01

    In August 1993, the U.S. Geological Survey's (USGS) National Mapping Division and the U.S. Department of Agriculture's Forest Service signed an Interagency Agreement to begin a single-edition joint mapping program. This agreement established the coordination for producing and maintaining single-edition primary series topographic maps for quadrangles containing National Forest System lands. The joint mapping program saves money by eliminating duplication of effort by the agencies and results in a more frequent revision cycle for quadrangles containing national forests. Maps are revised on the basis of jointly developed standards and contain normal features mapped by the USGS, as well as additional features required for efficient management of National Forest System lands. Single-edition maps look slightly different but meet the content, accuracy, and quality criteria of other USGS products. The Forest Service is responsible for the land management of more than 191 million acres of land throughout the continental United States, Alaska, and Puerto Rico, including 155 national forests and 20 national grasslands. These areas make up the National Forest System lands and comprise more than 10,600 of the 56,000 primary series 7.5-minute quadrangle maps (15-minute in Alaska) covering the United States. The Forest Service has assumed responsibility for maintaining these maps, and the USGS remains responsible for printing and distributing them. Before the agreement, both agencies published similar maps of the same areas. The maps were used for different purposes, but had comparable types of features that were revised at different times. Now, the two products have been combined into one so that the revision cycle is stabilized and only one agency revises the maps, thus increasing the number of current maps available for National Forest System lands. This agreement has improved service to the public by requiring that the agencies share the same maps and that the maps meet a

  2. Surficial sediment character of the New York-New Jersey offshore continental shelf region: a GIS compilation

    Science.gov (United States)

    Williams, S. Jeffress; Arsenault, Matthew A.; Poppe, Lawrence J.; Reid, Jane A.; Reid, Jamey M.; Jenkins, Chris J.

    2007-01-01

    Broad continental shelf regions such as the New York Bight are the product of a complex geologic history and dynamic oceanographic processes, dominated by the Holocene marine transgression (>100 m sea-level rise) following the end of the last Pleistocene ice advance ~ 20,000 years ago. The area of the U.S. Exclusive Economic Zone (U.S. EEZ) territory, extending 200 nautical miles seaward from the coast, is larger than the continental U.S. and contains submerged landforms that provide a variety of natural functions and societal benefits, such as: critical habitats for fisheries, ship navigation and homeland security, and engineering activities (i.e. oil and gas platforms, pipeline and cable routes, potential wind-energy-generation sites). Some parts of the continental margins, particularly inner-continental shelf regions, also contain unconsolidated hard-mineral deposits such as sand and gravel that are regarded as potential aggregate resources to meet or augment needs not met by onshore deposits (Williams, 1992). The present distribution of surficial sediment off the northeastern United States is shaped from the deposits left by the last glaciation and reflects the cumulative effects of sediment erosion, transport, sorting, and deposition by storm and tidal processes during the Holocene rise in sea level. As a result, the sediments on the sea floor represent both an historical record of former conditions and a guide to possible future sedimentary environments. The U.S. Geological Survey (USGS) through the Coastal and Marine Geology Program, in cooperation with the University of Colorado and other partners, has compiled extant sediment character and textural data as well as other geologic information on the sea floor from all regions around the U.S. into the usSEABED data system (Reid and others, 2005; Buczkowski and others, 2006; Reid and others, 2006). The usSEABED system, which contains information on sediment grain size and lithology for more than 340

  3. The National Assessment of Shoreline Change: a GIS compilation of vector shorelines and associated shoreline change data for the U.S. southeast Atlantic coast

    Science.gov (United States)

    Miller, Tara L.; Morton, Robert A.; Sallenger, Asbury H.

    2006-01-01

    The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive database of digital vector shorelines and shoreline change rates for the U.S. Southeast Atlantic Coast (Florida, Georgia, South Carolina, North Carolina). These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along most open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard repeatable methods for mapping and analyzing shoreline movement so that periodic updates of shorelines and shoreline change rates can be made nationally that are systematic and internally consistent. This data compilation for open-ocean, sandy shorelines of the U.S. Southeast Atlantic Coast is the second in a series that already includes the Gulf of Mexico, and will eventually include the Pacific Coast, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are based on merging three historical shorelines with a modern shoreline derived from lidar (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1800s, 1920s-1930s, and 1970s. The most recent shoreline is derived from data collected over the period of 1997-2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are simple end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change for the U.S. 
Southeast Atlantic Coast at http://pubs.usgs.gov/of/2005/1401/ to get additional

  4. Compilation of PRF Canyon Floor Pan Sample Analysis Results

    Energy Technology Data Exchange (ETDEWEB)

    Pool, Karl N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Minette, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wahl, Jon H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Greenwood, Lawrence R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coffey, Deborah S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McNamara, Bruce K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bryan, Samuel A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Scheele, Randall D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Delegard, Calvin H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sinkov, Sergey I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Soderquist, Chuck Z. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fiskum, Sandra K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brown, Garrett N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clark, Richard A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-06-30

    On September 28, 2015, debris collected from the PRF (236-Z) canyon floor, Pan J, was observed to exhibit chemical reaction. The material had been transferred from the floor pan to a collection tray inside the canyon the previous Friday. Work in the canyon was stopped to allow Industrial Hygiene to perform monitoring of the material reaction. Canyon floor debris that had been sealed out was sequestered at the facility, a recovery plan was developed, and drum inspections were initiated to verify no additional reactions had occurred. On October 13, in-process drums containing other Pan J material were inspected and showed some indication of chemical reaction, limited to discoloration and degradation of inner plastic bags. All Pan J material was sealed back into the canyon and returned to collection trays. Based on the high airborne levels in the canyon during physical debris removal, ETGS (Encapsulation Technology Glycerin Solution) was used as a fogging/lock-down agent. On October 15, subject matter experts confirmed a reaction had occurred between nitrates (both Plutonium Nitrate and Aluminum Nitrate Nonahydrate (ANN) are present) in the Pan J material and the ETGS fixative used to lower airborne radioactivity levels during debris removal. Management stopped the use of fogging/lock-down agents containing glycerin on bulk materials, declared a Management Concern, and initiated the Potential Inadequacy in the Safety Analysis determination process. Additional drum inspections and laboratory analysis of both reacted and unreacted material are planned. This report compiles the results of many different sample analyses conducted by the Pacific Northwest National Laboratory on samples collected from the Plutonium Reclamation Facility (PRF) floor pans by the CH2MHill’s Plateau Remediation Company (CHPRC). Revision 1 added Appendix G that reports the results of the Gas Generation Rate and methodology. The scope of analyses requested by CHPRC includes the determination of

  5. Geologic map of the Murray Quadrangle, Newton County, Arkansas

    Science.gov (United States)

    Hudson, Mark R.; Turner, Kenzie J.

    2016-07-06

This map summarizes the geology of the Murray quadrangle in the Ozark Plateaus region of northern Arkansas. Geologically, the area is on the southern flank of the Ozark dome, an uplift that has the oldest rocks exposed at its center, in Missouri. Physiographically, the Murray quadrangle is within the Boston Mountains, a high plateau region underlain by Pennsylvanian sandstones and shales. Valleys of the Buffalo River and Little Buffalo River and their tributaries expose an approximately 1,600-ft-thick (488-meter-thick) sequence of Ordovician, Mississippian, and Pennsylvanian carbonate and clastic sedimentary rocks that have been mildly deformed by a series of faults and folds. The Buffalo National River, a park that encompasses the Buffalo River and adjacent land and is administered by the National Park Service, is present at the northwestern edge of the quadrangle. Mapping for this study was carried out by field inspection of numerous sites and was compiled as a 1:24,000 geographic information system (GIS) database. Locations and elevations of sites were determined with the aid of a Global Positioning System (GPS) receiver and a hand-held barometric altimeter that was frequently recalibrated at points of known elevation. Hill-shade relief and slope maps derived from a U.S. Geological Survey 10-meter digital elevation model, as well as orthophotographs, were used to help trace ledge-forming units between field traverses within the Upper Mississippian and Pennsylvanian part of the stratigraphic sequence. Strike and dip of beds were typically measured along stream drainages or at well-exposed ledges. Structure contours, constructed on the top of the Boone Formation and the base of a prominent sandstone unit within the Bloyd Formation, were drawn based on the elevations of field sites on these contacts as well as other limiting information, such as their minimum elevations above hilltops or their maximum elevations below valley bottoms.

  6. Compilation and analysis of national and international OPEX for Safe Enclosure prior to decommissioning

    International Nuclear Information System (INIS)

    Dinner, Paul J.C.; Heimlich, Karel

    2016-01-01

Around the world, a large number of aging nuclear plants are approaching final shutdown. While this is largely driven by plants reaching the end of their design life, economic factors such as low gas prices (in North America) and the smaller unit size of early commercial reactors are important contributors to this trend. In several instances, economic pressures have resulted in a need for a more rapid transition to Safe Enclosure than originally anticipated. Thus plans for this transition, taking into account experience with Safe Enclosure periods of varying lengths, are being actively prepared in many jurisdictions. The IAEA as well as other national and international authorities have long recognized the importance of the topic of Safe Enclosure and provided guidance, and the IAEA has recently undertaken a study of 'Lessons Learned from Deferred Decommissioning of Nuclear Facilities'. Beginning with preliminary experience from Canadian CANDU reactors in extended shutdown or safe enclosure, this paper aims to compare this experience with the larger pool of experience from the international community to: - classify the main issues or themes, - examine means to mitigate these, and - formulate general measures of 'good practice'. Compilation of this experience represents the first steps towards a comprehensive, searchable database potentially of use to many in the decommissioning community. Tabulation and analysis of the complete list (comprising approximately 70 cases) have provided the 'short list' of issues presented in Table 1. Examples of the most important listed issues are discussed. The authors' objective is to stimulate interest in extending this compilation. In this way it will continue to grow and benefit all those preparing for transition to decommissioning. (authors)

  7. Compilation and analysis of national and international OPEX for safe enclosure prior to decommissioning

    International Nuclear Information System (INIS)

    Dinner, Paul J.C.; Heimlich, Karel

    2016-01-01

Around the world, a large number of aging nuclear plants are approaching final shutdown. While this is largely driven by plants reaching the end of their design life, economic factors such as low gas prices (in North America) and the smaller unit size of early commercial reactors are important contributors to this trend. In several instances, economic pressures have resulted in a need for a more rapid transition to Safe Enclosure than originally anticipated. Thus plans for this transition, taking into account experience with Safe Enclosure periods of varying lengths, are being actively prepared in many jurisdictions. The IAEA as well as other national and international authorities have long recognized the importance of the topic of Safe Enclosure and provided guidance [1-7], and the IAEA has recently undertaken a study of 'Lessons Learned from Deferred Decommissioning of Nuclear Facilities' [8]. Beginning with preliminary experience from Canadian CANDU reactors in extended shutdown or safe enclosure, this paper aims to compare this experience with the larger pool of experience from the international community to: - classify the main issues or themes, - examine means to mitigate these, and - formulate general measures of 'good practice'. Compilation of this experience represents the first steps towards a comprehensive, searchable database potentially of use to many in the decommissioning community. Tabulation and analysis of the complete list (comprising approximately 70 cases) have provided the 'short list' of issues presented. Examples of the most important listed issues are discussed. The authors' objective is to stimulate interest in extending this compilation. In this way it will continue to grow and benefit all those preparing for transition to decommissioning. (authors)

  8. Use of multi-sensor active fire detections to map fires in the United States: the future of monitoring trends in burn severity

    Science.gov (United States)

    Picotte, Joshua J.; Coan, Michael; Howard, Stephen M.

    2014-01-01

The effort to utilize satellite-based MODIS, AVHRR, and GOES fire detections from the Hazard Monitoring System (HMS) to identify undocumented fires in Florida and improve the Monitoring Trends in Burn Severity (MTBS) mapping process has yielded promising results. This method was augmented using regression tree models to identify burned/not-burned pixels (BnB) in every Landsat scene (1984–2012) in Worldwide Referencing System 2 Path/Rows 16/40, 17/39, and 18/39. The burned area delineations were combined with the HMS detections to create burned area polygons attributed with their date of fire detection. Within our study area, we processed 88,000 HMS points (2003–2012) and 1,800 Landsat scenes to identify approximately 300,000 burned area polygons. Six percent of these burned area polygons were larger than the 500-acre MTBS minimum size threshold. From this study, we conclude that the process can significantly improve understanding of fire occurrence and improve the efficiency and timeliness of assessing its impacts upon the landscape.

  9. Characterization and compilation of polymorphic simple sequence repeat (SSR) markers of peanut from public database

    Directory of Open Access Journals (Sweden)

    Zhao Yongli

    2012-07-01

Background: There are several reports describing thousands of SSR markers in the peanut (Arachis hypogaea L.) genome. There is a need to integrate various research reports of peanut DNA polymorphism into a single platform. Further, because of a lack of uniformity in the labeling of these markers across the publications, there is some confusion on the identities of many markers. We describe below an effort to develop a central comprehensive database of polymorphic SSR markers in peanut. Findings: We compiled 1,343 SSR markers as detecting polymorphism (14.5%) within a total of 9,274 markers. Amongst all polymorphic SSRs examined, we found that the AG motif (36.5%) was the most abundant, followed by AAG (12.1%), AAT (10.9%), and AT (10.3%). The mean length of SSR repeats in dinucleotide SSRs was significantly longer than that in trinucleotide SSRs. Dinucleotide SSRs showed a higher polymorphism frequency for genomic SSRs when compared to trinucleotide SSRs, while for EST-SSRs, the frequency of polymorphic SSRs was higher in trinucleotide SSRs than in dinucleotide SSRs. The correlation of the length of SSR and the frequency of polymorphism revealed that the frequency of polymorphism decreased as motif repeat number increased. Conclusions: The assembled polymorphic SSRs would enhance the density of the existing genetic maps of peanut, and could also be a useful source of DNA markers suitable for high-throughput QTL mapping and marker-assisted selection in peanut improvement, and thus would be of value to breeders.
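The motif-abundance and polymorphism-frequency figures reported in this abstract are simple tallies over a marker table. The sketch below shows how such figures are derived; the records are a tiny invented sample, not the paper's 9,274-marker dataset.

```python
from collections import Counter

# Hypothetical marker table: (motif, is_polymorphic) per SSR marker.
records = [("AG", True), ("AG", True), ("AG", False),
           ("AAG", True), ("AAT", False), ("AT", True),
           ("AAG", False), ("AT", False)]

# Share of each motif among polymorphic SSRs (cf. "AG motif (36.5%)...").
polymorphic = [motif for motif, poly in records if poly]
share = {m: c / len(polymorphic) for m, c in Counter(polymorphic).items()}

# Per-motif polymorphism frequency: polymorphic markers / all markers of that motif.
total = Counter(motif for motif, _ in records)
poly_count = Counter(polymorphic)
frequency = {m: poly_count[m] / total[m] for m in total}
```

The same two tallies, run per repeat-number bin instead of per motif, give the length-versus-polymorphism trend the abstract describes.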

  10. Participatory Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    2016-01-01

    practice. In particular, mapping environmental damage, endangered species, and human-made disasters has become one focal point for environmental knowledge production. This type of digital map has been highlighted as a processual turn in critical cartography, whereas in related computational journalism...... of a geo-visualization within information mapping that enhances embodiment in the experience of the information. InfoAmazonia is defined as a digitally created map-space within which journalistic practice can be seen as dynamic, performative interactions between journalists, ecosystems, space, and species...

  11. Compilation of modal analyses of volcanic rocks from the Nevada Test Site area, Nye County, Nevada

    International Nuclear Information System (INIS)

    Page, W.R.

    1990-01-01

    Volcanic rock samples collected from the Nevada Test Site, Nye County, Nevada, between 1960 and 1985 were analyzed by thin section to obtain petrographic mode data, which were recorded on data cards. To provide rapid access to the entire database, all data from the cards were entered into a computerized database. This computer format will enable workers involved in stratigraphic studies in the Nevada Test Site area and other locations in southern Nevada to perform independent analyses of the data. The data were compiled from the mode cards into two separate computer files. The first file consists of data collected from core samples taken from drill holes in the Yucca Mountain area. The second file consists of samples collected from measured sections and surface mapping traverses in the Nevada Test Site area. Each data file is composed of computer printouts of tables with mode data from thin-section point counts, comments on additional data, and location data. Great care was taken in transferring the data from the cards to the computer, in order to preserve the original information and interpretations provided by the analyst. In addition to the data files above, a file is included that consists of Nevada Test Site petrographic data published in other US Geological Survey and Los Alamos National Laboratory reports. These data are presented to supply the user with an essentially complete modal database of samples from the volcanic stratigraphic section in the Nevada Test Site area. 18 refs., 4 figs
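The mode data in these files derive from thin-section point counts, where each phase's share of counted points estimates its volume percentage. A minimal sketch (mineral names and counts invented for illustration):

```python
# Hedged sketch: converting thin-section point counts into a petrographic
# mode (volume percentages), the kind of data tabulated in the files
# described above. Phases and counts are invented.

def modal_percentages(counts):
    """Return each phase's share of total counted points, in percent."""
    total = sum(counts.values())
    return {phase: 100.0 * n / total for phase, n in counts.items()}

point_counts = {"quartz": 120, "sanidine": 180, "plagioclase": 60, "biotite": 40}
mode = modal_percentages(point_counts)  # e.g. quartz = 120/400 = 30%
```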

  12. Silicon compilation: From the circuit to the system

    Science.gov (United States)

    Obrien, Keven

    The methodology used for the compilation of silicon from the behavioral level to the system level is presented. The aim was to link the heretofore unrelated areas of high-level synthesis and system-level design. This link will play an important role in the development of future design automation tools, as it will allow hardware/software co-designs to be synthesized. A design methodology is presented that allows, through the use of an intermediate representation, SOLAR, a system-level design language (SDL) to be combined with a hardware description language (VHDL). Two main steps are required to transform this specification into a synthesizable one. First, a system-level synthesis step, including partitioning and communication synthesis, splits the model into a set of interconnected subsystems, each of which is then processed by a high-level synthesis tool. For this latter step AMICAL is used; it applies powerful scheduling techniques that accept very abstract descriptions of control-flow-dominated circuits as input and generates interconnected RTL blocks that can feed existing logic-level synthesis tools.

  13. Development of automatic cross section compilation system for MCNP

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Sakurai, Kiyoshi

    1999-01-01

    A code system to automatically convert cross sections for MCNP is under development. The NJOY code is generally used to convert data compiled in the ENDF format (Evaluated Nuclear Data File, maintained at BNL) into the cross-section libraries required by various reactor physics codes. Although the cross-section library FSXLIB-J3R2 was already converted from the JENDL-3.2 version of the Japanese Evaluated Nuclear Data Library for the continuous-energy Monte Carlo code MCNP, that library contains cross sections at room temperature (300 K) only. To meet users' requirements for cross sections at higher temperatures, say 600 K or 900 K, a code system named 'autonj' is under development to provide MCNP cross-section libraries at arbitrary temperatures. The system can accept any of the data formats adopted in JENDL, including those that cannot be treated by the NJOY code. The input preparation that NJOY requires for every nuclide is greatly reduced by allowing as many nuclides as the user wants to be converted in one execution. A few MCNP runs were performed for verification purposes using the two libraries FSXLIB-J3R2 and the 'autonj' output. The almost identical MCNP results, within statistical errors, show that the 'autonj' output library is correct. In FY 1998 the system will be completed, and in FY 1999 the user's manual will be published. (K. Tsuchihashi)

  14. Mars Pathfinder and Mars Global Surveyor Outreach Compilation

    Science.gov (United States)

    1999-09-01

    This videotape is a compilation of the best NASA JPL (Jet Propulsion Laboratory) videos of the Mars Pathfinder and Mars Global Surveyor missions. The missions are described using animation and narration as well as some actual footage of the entire sequence of mission events. Included within these animations are the spacecraft orbit insertion; descent to the Mars surface; deployment of the airbags and instruments; and exploration by Sojourner, the Mars rover. JPL activities at spacecraft control during significant mission events are also included at the end. The spacecraft cameras pan the surrounding Mars terrain and film Sojourner traversing the surface and inspecting rocks. A single, brief, processed image of the Cydonia region (the "Mars face") at an oblique angle from the Mars Global Surveyor is presented. The Mars Pathfinder mission, instruments, landing and deployment process, Mars approach, spacecraft orbit insertion, and rover operations are all described using computer animation. Actual color footage of Sojourner, as well as a 360-degree pan of the Mars terrain surrounding the spacecraft, is provided. Lower-quality black-and-white photography depicting Sojourner traversing the Mars surface and inspecting Martian rocks is also included.

  15. The Significant Event Compilation Tree - SECT: Theory and application

    International Nuclear Information System (INIS)

    Ishack, G.A.

    1990-01-01

    The Significant Event Compilation Tree (SECT) is a computer programme that was developed by staff of the Canadian Atomic Energy Control Board during the period 1984-86. Its primary purpose is to link seemingly unrelated events, or parts of events, that could have occurred at different points in time at various nuclear power plants. Such a software tool aids in the identification of potential paths and/or scenarios that: a. may not have been foreseen in the accident analysis (including fault tree verification); b. could lead to a certain failure; or c. could have been caused by a certain initiating event (which may have ended or been terminated at an earlier stage). This paper describes: a. the basic idea of SECT; b. the criteria whereby events are selected and coded; c. the options available to the user; d. an example of the programme's application in Canada; and e. a demonstration of its possible use in conjunction with the NEA-IRS
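Event linking of the kind SECT performs can be pictured as chaining coded events whose end condition matches another event's initiating condition. The sketch below is only an illustration of that idea: the event records, condition names, and matching rule are invented, not SECT's actual coding criteria.

```python
# Hedged sketch: chaining events across plants by matching end states to
# start states, in the spirit of SECT's path identification. All records
# and condition labels are hypothetical.

events = [
    {"id": "A", "start": "loss_of_feedwater", "end": "steam_generator_dryout"},
    {"id": "B", "start": "steam_generator_dryout", "end": "reactor_trip"},
    {"id": "C", "start": "reactor_trip", "end": "safe_shutdown"},
]

def build_chain(events, start_state):
    """Follow end->start links from an initiating condition."""
    by_start = {e["start"]: e for e in events}
    chain, state, seen = [], start_state, set()
    while state in by_start and state not in seen:  # guard against cycles
        seen.add(state)
        e = by_start[state]
        chain.append(e["id"])
        state = e["end"]
    return chain, state

chain, final_state = build_chain(events, "loss_of_feedwater")
```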

  16. Assessment of the current status of basic nuclear data compilations

    International Nuclear Information System (INIS)

    1987-03-01

    The Panel on Basic Nuclear Data Compilations believes that it is of paramount importance to achieve as short a cycle time as is reasonably possible in the evaluation and publication of the A-chains. The panel, therefore, has concentrated its efforts on identifying those factors that have tended to increase the cycle time and on finding ways to remove the obstacles. An important step was made during the past year to address reduction of the size of the published evaluations - another factor that can reduce cycle time. The Nuclear Structure and Decay Data (NSDD) network adopted new format guidelines, which generated a 30% reduction by eliminating redundancy and/or duplication. A current problem appears to be the rate at which the A-chains are being evaluated, which, on the average, is only about one-half of what it could be. It is hoped that the situation will improve with an increase in the number of foreign centers and an increase in efficiency as more A-chains are recycled by the same evaluator who did the previous evaluation. Progress has been made in the area of on-line access to the nuclear data files in that a subcommittee report describing the requirements of an on-line system has been produced. 2 tabs

  17. Compilation of benchmark results for fusion related Nuclear Data

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Wada, Masayuki; Oyama, Yukio; Ichihara, Chihiro; Makita, Yo; Takahashi, Akito

    1998-11-01

    This report compiles results of benchmark tests for validation of evaluated nuclear data to be used in nuclear designs of fusion reactors. Part of the results were obtained under activities of the Fusion Neutronics Integral Test Working Group, organized by members of both the Japan Nuclear Data Committee and the Reactor Physics Committee. The following three benchmark experiments were used for the tests: (i) leakage neutron spectrum measurements from slab assemblies at the D-T neutron source at FNS/JAERI, (ii) in-situ neutron and gamma-ray measurement experiments (so-called clean benchmark experiments), also at FNS, and (iii) pulsed sphere experiments for leakage neutron and gamma-ray spectra at the D-T neutron source facility of Osaka University, OKTAVIAN. The evaluated nuclear data tested were JENDL-3.2, JENDL Fusion File, FENDL/E-1.0, and newly selected data for FENDL/E-2.0. Comparisons of benchmark calculations with the experiments for twenty-one elements, i.e., Li, Be, C, N, O, F, Al, Si, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zr, Nb, Mo, W and Pb, are summarized. (author). 65 refs

  18. A compilation of structure functions in deep inelastic scattering

    International Nuclear Information System (INIS)

    Gehrmann, T.; Roberts, R.G.; Whalley, M.R.

    1999-01-01

    A compilation of all the available data on the unpolarized structure functions F_2 and xF_3, R = σ_L/σ_T, the virtual photon asymmetries A_1 and A_2, and the polarized structure functions g_1 and g_2, from deep inelastic lepton scattering off protons, deuterium and nuclei is presented. The relevant experiments at CERN, DESY, Fermilab and SLAC from 1991, the date of our earlier review [1], to the present day are covered. A brief general theoretical introduction is given, followed by the data presented both in tabular and graphical form; for the F_2 and xF_3 data, the predictions based on the MRST98 and CTEQ4 parton distribution functions are also displayed. All the data in this review, together with data on a wide variety of other reactions, can be found in and retrieved from the Durham-RAL HEP Databases on the World-Wide-Web (http://durpdg.dur.ac.uk/HEPDATA). (author)

  19. OMPC: an Open-Source MATLAB-to-Python Compiler.

    Science.gov (United States)

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose the Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open-source platform that, in many respects, surpasses commonly used, expensive commercial closed-source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.
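The "syntax adaptation" mentioned above can be illustrated with a toy translator. This sketch handles only two MATLAB constructs (comment markers and literal 1-based round-bracket indexing) and is in no way OMPC's actual implementation; a real translator must, among much else, disambiguate function calls from array indexing.

```python
# Hedged sketch: a toy MATLAB-to-Python line translator, illustrating the
# kind of syntax adaptation OMPC performs. Handles only comments and
# literal 1-based indexing; everything else is out of scope.
import re

def translate_line(src):
    """Translate a tiny subset of MATLAB syntax to Python."""
    line = src.rstrip().rstrip(";")        # drop MATLAB's output-suppressing ';'
    if line.lstrip().startswith("%"):      # '%' comments become '#' comments
        return line.replace("%", "#", 1)
    # x(3) -> x[2]: round-bracket indexing with a literal index, shifted
    # to 0-based. (Real translators must tell calls from indexing.)
    return re.sub(r"\b([a-zA-Z_]\w*)\((\d+)\)",
                  lambda m: f"{m.group(1)}[{int(m.group(2)) - 1}]",
                  line)
```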

  20. Research at GANIL. A compilation 1996-1997

    Energy Technology Data Exchange (ETDEWEB)

    Balanzat, E.; Bex, M.; Galin, J.; Geswend, S. [eds.]

    1998-12-01

    The present compilation gives an overview of experimental results obtained with the GANIL facility during the period 1996-1997. It includes nuclear physics activities as well as interdisciplinary research. The scientific domain presented here extends well beyond traditional nuclear physics and includes atomic physics, condensed matter physics, nuclear astrophysics, radiation chemistry, radiobiology as well as applied physics. In the nuclear physics field, many new results have been obtained concerning nuclear structure as well as the dynamics of nuclear collisions and nuclear disassembly of complex systems. Results presented deal in particular with the problem of energy equilibration, timescales and the origin of multifragmentation. Nuclear structure studies using both stable and radioactive beams deal with halo systems, the study of shell closures far from stability, the existence of nuclear molecules, as well as measurements of fundamental data such as half-lives, nuclear masses, nuclear radii, and quadrupole and magnetic moments. In addition to traditional fields of atomic and solid state physics, new themes such as radiation chemistry and radiobiology are progressively being tackled. (K.A.)

  1. Digital Field Mapping with the British Geological Survey

    Science.gov (United States)

    Leslie, Graham; Smith, Nichola; Jordan, Colm

    2014-05-01

    The BGS•SIGMA project was initiated in 2001 in response to a major stakeholder review of onshore mapping within the British Geological Survey (BGS). That review proposed a significant change for BGS, recommending that digital methods be implemented for field mapping and data compilation. The BGS•SIGMA project (System for Integrated Geoscience MApping) is an integrated workflow for geoscientific surveying and visualisation using digital methods for geological data visualisation, recording and interpretation, in both 2D and 3D. The project has defined and documented an underpinning framework of best practice for survey and information management, best practice that has then informed the design brief and specification for a toolkit to support this new methodology. The project has now delivered BGS•SIGMA2012, an integrated toolkit which enables assembly and interrogation/visualisation of existing geological information; capture of, and integration with, new data and geological interpretations; and delivery of 3D digital products and services. From its early days as a system which used PocketGIS run on Husky Fex21 hardware, to the present-day system which runs on ruggedized tablet PCs with integrated GPS units, the system has evolved into a complete digital mapping and compilation system. BGS•SIGMA2012 uses a highly customised version of ESRI's ArcGIS 10 and 10.1 with a fully relational Access 2007/2010 geodatabase. BGS•SIGMA2012 is the third external release of our award-winning digital field mapping toolkit; the first free external release was in 2009, and the third version (BGS-SIGMAmobile2012 v1.01) was released on our website (http://www.bgs.ac.uk/research/sigma/home.html) in 2013. The BGS•SIGMAmobile toolkit formed the major part of the first two releases, but this new version integrates the BGS•SIGMAdesktop functionality that BGS routinely uses to transform our field

  2. Installation of a new Fortran compiler and effective programming method on the vector supercomputer

    International Nuclear Information System (INIS)

    Nemoto, Toshiyuki; Suzuki, Koichiro; Watanabe, Kenji; Machida, Masahiko; Osanai, Seiji; Isobe, Nobuo; Harada, Hiroo; Yokokawa, Mitsuo

    1992-07-01

    The Fortran compiler, version 10, has been replaced with the new version 12 (V12) on the Fujitsu computer system at JAERI as of May 1992. A benchmark test of the V12 compiler's performance was carried out with 16 representative nuclear codes in advance of the installation. On average, the V12 compiler improved performance by a factor of 1.13. The effect of the enhanced functions of the compiler and its compatibility with the nuclear codes were also examined. An assistant tool for vectorization, TOP10EX, was developed. In this report, the results of the evaluation of the V12 compiler and the usage of the tools for vectorization are presented. (author)
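An average performance factor like "1.13" is a summary over per-code speedups. The report does not say how the average was formed, so the sketch below (with invented timing ratios) shows both the arithmetic and the geometric mean; the latter is often preferred for ratios.

```python
# Hedged sketch: summarizing per-benchmark compiler speedups into a single
# average factor. The four ratios are invented for illustration.
import math

# old-compiler time / new-compiler time, one ratio per benchmark code
speedups = [1.05, 1.20, 1.10, 1.17]

arith_mean = sum(speedups) / len(speedups)
geo_mean = math.exp(sum(math.log(s) for s in speedups) / len(speedups))
```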

  3. Understanding map projections: Chapter 15

    Science.gov (United States)

    Usery, E. Lynn; Kent, Alexander J.; Vujakovic, Peter

    2018-01-01

    It has probably never been more important in the history of cartography than now that people understand how maps work. With increasing globalization, for example, world maps provide a key format for the transmission of information, but are often poorly used. Examples of poor understanding and use of projections and the resultant maps are many; for instance, the use of rectangular world maps in the United Kingdom press to show Chinese and Korean missile ranges as circles, something which can only be achieved on equidistant projections and then only from one launch point (Vujakovic, 2014).
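The projection point above can be made concrete: on an azimuthal equidistant projection centred on the launch point, every location at a fixed great-circle range plots at a fixed radius, so a missile range really is a circle, but only about that one centre. A minimal sketch on a spherical Earth (coordinates are illustrative):

```python
# Hedged sketch: why a fixed range is a circle only on an equidistant
# projection centred on the launch point. Spherical Earth; illustrative
# coordinates.
import math

R = 6371.0  # mean Earth radius, km

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in km."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def azimuthal_equidistant(lat, lon, lat0, lon0):
    """Project (lat, lon) about centre (lat0, lon0); radius = true distance."""
    d = great_circle_km(lat0, lon0, lat, lon)
    p0, p = math.radians(lat0), math.radians(lat)
    dl = math.radians(lon - lon0)
    theta = math.atan2(math.sin(dl) * math.cos(p),
                       math.cos(p0) * math.sin(p) - math.sin(p0) * math.cos(p) * math.cos(dl))
    return d * math.sin(theta), d * math.cos(theta)

# A point due north of the centre projects at radius = its true distance,
# so equal ranges plot on a circle about the centre.
x1, y1 = azimuthal_equidistant(50.0, 10.0, 40.0, 10.0)
r1 = math.hypot(x1, y1)
```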

  4. Geologic map of the east half of the Lime Hills 1:250,000-scale quadrangle, Alaska

    Science.gov (United States)

    Gamble, Bruce M.; Reed, Bruce L.; Richter, Donald H.; Lanphere, Marvin A.

    2013-01-01

    This map is compiled from geologic mapping conducted between 1985 and 1992 by the U.S. Geological Survey as part of the Alaska Mineral Resource Assessment Program. That mapping built upon previous USGS work (1963–1988) unraveling the magmatic history of the Alaska–Aleutian Range batholith. Quaternary unit contacts depicted on this map are derived largely from aerial-photograph interpretation. K-Ar ages determined prior to this study have been recalculated using the 1977 decay constants. The east half of the Lime Hills 1:250,000-scale quadrangle includes part of the Alaska–Aleutian Range batholith and several sequences of sedimentary rocks or mixed sedimentary and volcanic rocks. The Alaska–Aleutian Range batholith contains rocks that represent three major igneous episodes, (1) Early and Middle Jurassic, (2) Late Cretaceous and early Tertiary, and (3) middle Tertiary; only rocks from the latter two episodes are found in this map area. The map area is one of very steep and rugged terrain; elevations range from a little under 1,000 ft (305 m) to 9,828 ft (2,996 m). Foot traverses are generally restricted to the lowermost elevations. Areas suitable for helicopter landings can be scarce at higher elevations. Most of the area was mapped from the air, supplemented by direct examination of rocks where possible. This restricted access greatly complicates understanding of some of the more complex geologic units. For example, we know there are plutons whose compositions vary from gabbro to granodiorite, but we have little insight as to how these phases are distributed and what their relations might be to each other. It is also possible that some of what we have described as compositionally complex plutons might actually be several distinct intrusions.

  5. Concept Mapping

    Science.gov (United States)

    Technology & Learning, 2005

    2005-01-01

    Concept maps are graphical ways of working with ideas and presenting information. They reveal patterns and relationships and help students to clarify their thinking, and to process, organize and prioritize. Displaying information visually--in concept maps, word webs, or diagrams--stimulates creativity. Being able to think logically teaches…

  6. Regulatory and technical reports. Compilation for second quarter 1982, April to June

    International Nuclear Information System (INIS)

    1982-08-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. A detailed explanation of the entries precedes each index

  7. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    Science.gov (United States)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique for rapid recovery from transient processor failures and has been implemented in hardware by researchers and in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy, while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard-removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.

  8. Compilation Techniques Specific for a Hardware Cryptography-Embedded Multimedia Mobile Processor

    Directory of Open Access Journals (Sweden)

    Masa-aki FUKASE

    2007-12-01

    Full Text Available The development of single-chip VLSI processors is the key technology of ever-growing pervasive computing, answering overall demands for usability, mobility, speed, security, etc. We have so far developed a hardware cryptography-embedded multimedia mobile processor architecture, HCgorilla. Since HCgorilla integrates a wide range of techniques from architectures to applications and languages, a one-sided design approach is not always useful. HCgorilla needs a more complicated strategy, that is, hardware/software (H/S) codesign. Thus, we exploit the software support of HCgorilla, composed of a Java interface and parallelizing compilers. They are assumed to be installed in servers in order to reduce the load and increase the performance of HCgorilla-embedded clients. Since compilers are the essence of the software's responsibility, we focus in this article on our recent results on the design, specifications, and prototyping of parallelizing compilers for HCgorilla. The parallelizing compilers are composed of a multicore compiler and a LIW compiler. They are specified to abstract parallelism from executable serial codes or the Java interface output, and to output codes executable in parallel by HCgorilla. The prototype compilers are written in Java. An evaluation using an arithmetic test program shows that the prototype compilers are reasonable compared with hand compilation.

  9. Fracture mapping at the Spent Fuel Test-Climax

    International Nuclear Information System (INIS)

    Wilder, D.G.; Yow, J.L. Jr.

    1981-05-01

    Mapping of geologic discontinuities has been done in several phases at the Spent Fuel Test-Climax (SFT-C) in the granitic Climax stock at the Nevada Test Site. Mapping was carried out in the tail drift, access drift, canister drift, heater drifts, instrumentation alcove, and receiving room. The fractures mapped as intersecting a horizontal datum in the canister and heater drifts are shown on one figure. Fracture sketch maps have been compiled as additional figures. Geologic mapping efforts were scheduled around and significantly impacted by the excavation and construction schedules. Several people were involved in the mapping, and over 2500 geologic discontinuities were mapped, including joints, shears, and faults. Some variance between individuals' mapping efforts was noticed, and the effects of various magnetic influences upon a compass were examined. The examination of compass errors improved the credibility of the data. The compass analysis work is explained in Appendix A. Analysis of the fracture data will be presented in a future report

  10. Mapping the Information Trace in Local Field Potentials by a Computational Method of Two-Dimensional Time-Shifting Synchronization Likelihood Based on Graphic Processing Unit Acceleration.

    Science.gov (United States)

    Zhao, Zi-Fang; Li, Xue-Zhu; Wan, You

    2017-12-01

    The local field potential (LFP) is a signal reflecting the electrical activity of neurons surrounding the electrode tip. Synchronization between LFP signals provides important details about how neural networks are organized. Synchronization between two distant brain regions is hard to detect using linear synchronization algorithms like correlation and coherence. Synchronization likelihood (SL) is a non-linear synchronization-detecting algorithm widely used in studies of neural signals from two distant brain areas. One drawback of non-linear algorithms is the heavy computational burden. In the present study, we proposed a graphic processing unit (GPU)-accelerated implementation of an SL algorithm with optional 2-dimensional time-shifting. We tested the algorithm with both artificial data and raw LFP data. The results showed that this method revealed detailed information from original data with the synchronization values of two temporal axes, delay time and onset time, and thus can be used to reconstruct the temporal structure of a neural network. Our results suggest that this GPU-accelerated method can be extended to other algorithms for processing time-series signals (like EEG and fMRI) using similar recording techniques.
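The core idea of scanning a delay axis to recover the temporal structure between two signals can be sketched simply. For brevity the sketch uses plain (linear) correlation as the synchrony measure, not the non-linear synchronization likelihood itself, and runs on the CPU rather than a GPU; signals are synthetic.

```python
# Hedged sketch: scanning delays between two signals for the lag of maximum
# synchrony, a much-simplified stand-in for the 2D time-shifting SL method
# described above (linear correlation, CPU-only, synthetic data).
import math

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def best_delay(x, y, max_delay):
    """Return the shift of y (in samples) that maximizes correlation with x."""
    scores = {}
    for d in range(max_delay + 1):
        n = len(x) - d
        scores[d] = pearson(x[:n], y[d:d + n])
    return max(scores, key=scores.get)

# y is x delayed by 5 samples, so the scan should recover that delay
x = [math.sin(0.3 * t) for t in range(200)]
y = [0.0] * 5 + x[:-5]
delay = best_delay(x, y, max_delay=20)
```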

  11. Report of the Panel on Neutron Data Compilation. Brookhaven National Laboratory, USA, 10-14 February 1969

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1969-05-15

    disseminate evaluated data on request to users within their respective service areas; k) In order to increase the effectiveness of world evaluation work: The IAEA should organise discussions on evaluation methods and bring together groups engaged in evaluation; l) The compilation of integral data should be considered by IWGPR and EACRP; the centres may have a role to play in the dissemination of compiled integral data; m) The centres' activities should be publicised in such a way as to increase their usefulness per unit cost; n) The centres should keep abreast of general developments in computer technology and information science and seek closer links with other centres working in related fields; o) The need and right of individual centres and their associated committees to give different priorities to the implementation of these recommendations should be recognized, but several centres should work for ever-increasing cooperation. No attempt has been made to classify these recommendations as directed specifically to the IAEA, INDC, Centres etc. The Panel believes they should be studied by all appropriate corporate bodies and individuals.

  12. Report of the Panel on Neutron Data Compilation. Brookhaven National Laboratory, USA, 10-14 February 1969

    International Nuclear Information System (INIS)

    1969-05-01

    disseminate evaluated data on request to users within their respective service areas; k) In order to increase the effectiveness of world evaluation work: The IAEA should organise discussions on evaluation methods and bring together groups engaged in evaluation; l) The compilation of integral data should be considered by IWGPR and EACRP; the centres may have a role to play in the dissemination of compiled integral data; m) The centres' activities should be publicised in such a way as to increase their usefulness per unit cost; n) The centres should keep abreast of general developments in computer technology and information science and seek closer links with other centres working in related fields; o) The need and right of individual centres and their associated committees to give different priorities to the implementation of these recommendations should be recognized, but several centres should work for ever-increasing cooperation. No attempt has been made to classify these recommendations as directed specifically to the IAEA, INDC, Centres etc. The Panel believes they should be studied by all appropriate corporate bodies and individuals

  13. Compilation of Cognitive and Personality Norms for Military Aviators.

    Science.gov (United States)

    Carretta, Thomas R; King, Raymond E; Ree, Malcolm James; Teachout, Mark S; Barto, Erica

    2016-09-01

    The assessment of individuals on abilities or other characteristics is based on comparison to a representative sample. General population norms provide an appropriate reference group when the distribution of scores in the sample can be expected to be similar to those for the general population (e.g., comparing high school students at a particular school to national high school norms on a college entrance test). Specialized norms are needed, however, when subsets of the population differ from the population at large. Military pilot trainees represent a special population; they are highly screened on cognitive ability and other characteristics thought to be related to job performance. Other characteristics (e.g., personality) are thought to be "self-selected," resulting in distinctive profiles. Normative tables were developed for U.S. Air Force pilot trainees for two widely used tests, the Multidimensional Aptitude Battery-II (MAB-II) and NEO Personality Inventory-Revised (NEO PI-R). The MAB-II and NEO PI-R were administered to large samples of USAF cadets, ROTC students, and officers selected for pilot training. The mean MAB-II full-scale IQ was about 1.5 SD above the adult population norm and was much less variable, supporting the need for specialized norms. Tables showing the percentile equivalents are provided for use by clinicians. Use of these tables, in addition to, or in lieu of, commercially published norms, will prove helpful when clinical psychologists perform assessments on pilots; in particular when evaluating them for return-to-duty status following a disqualifying condition that may have affected cognitive functioning or emotional stability. Carretta TR, King RE, Ree MJ, Teachout MS, Barto E. Compilation of cognitive and personality norms for military aviators. Aerosp Med Hum Perform. 2016; 87(9):764-771.
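The need for specialized norms described above comes down to which reference distribution a score is compared against. A minimal sketch using normal approximations; the numbers are purely illustrative, not the published MAB-II norms (the abstract states only that the trainee mean sat about 1.5 SD above the adult norm, with less variability):

```python
# Hedged sketch: the same score yields very different percentiles under
# general-population norms versus a restricted, high-scoring norm group.
# Both distributions are hypothetical normal approximations.
from statistics import NormalDist

general = NormalDist(mu=100, sigma=15)    # conventional IQ metric
trainees = NormalDist(mu=122.5, sigma=9)  # hypothetical restricted group,
                                          # mean ~1.5 SD higher, less spread

score = 115
pct_general = 100 * general.cdf(score)    # well above average in general norms
pct_trainees = 100 * trainees.cdf(score)  # below average among trainees
```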

  14. Evaluation and compilation of fission product yields 1993

    International Nuclear Information System (INIS)

    England, T.R.; Rider, B.F.

    1995-01-01

This document is the latest in a series of compilations of fission yield data. Fission yield measurements reported in the open literature and calculated charge distributions have been used to produce a recommended set of yields for the fission products. The original data, with reference sources, and the recommended yields are presented in tabular form. These include many nuclides that fission by neutrons at several energies. These energies include thermal (T), fission spectrum (F), 14 MeV high energy (H or HE), and spontaneous fission (S), in six sets of ten each. Set A includes U235T, U235F, U235HE, U238F, U238HE, Pu239T, Pu239F, Pu241T, U233T, Th232F. Set B includes U233F, U233HE, U236F, Pu239H, Pu240F, Pu241F, Pu242F, Th232H, Np237F, Cf252S. Set C includes U234F, U237F, Pu240H, U234HE, U236HE, Pu238F, Am241F, Am243F, Np238F, Cm242F. Set D includes Th227T, Th229T, Pa231F, Am241T, Am241H, Am242MT, Cm245T, Cf249T, Cf251T, Es254T. Set E includes Cf250S, Cm244S, Cm248S, Es253S, Fm254S, Fm255T, Fm256S, Np237H, U232T, U238S. Set F includes Cm243T, Cm246S, Cm243F, Cm244F, Cm246F, Cm248F, Pu242H, Np237T, Pu240T, and Pu242T, completing fission product yield evaluations for 60 fissioning systems in all. This report also serves as the primary documentation for the second evaluation of yields in ENDF/B-VI, released in 1993.
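
The compact nuclide-plus-suffix codes used above (U235T, Pu239F, Cf252S, ...) can be parsed mechanically. A minimal sketch, assuming only the suffix conventions stated in the abstract (T thermal, F fission spectrum, H/HE 14 MeV, S spontaneous); the "MT" entry covering codes such as Am242MT (metastable isomer, thermal) is an added assumption:

```python
import re

# Energy-regime suffixes as defined in the abstract; "MT" is an assumption
# added to cover isomer codes such as Am242MT.
ENERGY = {
    "T": "thermal",
    "F": "fission spectrum",
    "H": "14 MeV",
    "HE": "14 MeV",
    "S": "spontaneous",
    "MT": "thermal (isomer)",
}

def parse_system(code):
    """Split a code like 'U235T' or 'U233HE' into (element, mass, energy regime)."""
    m = re.fullmatch(r"([A-Z][a-z]?)(\d+)(MT|HE|[TFHS])", code)
    if not m:
        raise ValueError(f"unrecognized code: {code}")
    element, mass, suffix = m.groups()
    return element, int(mass), ENERGY[suffix]

print(parse_system("U235T"))   # ('U', 235, 'thermal')
print(parse_system("Cf252S"))  # ('Cf', 252, 'spontaneous')
```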

  15. Evaluation and compilation of fission product yields 1993

    Energy Technology Data Exchange (ETDEWEB)

    England, T.R.; Rider, B.F.

    1995-12-31

This document is the latest in a series of compilations of fission yield data. Fission yield measurements reported in the open literature and calculated charge distributions have been used to produce a recommended set of yields for the fission products. The original data, with reference sources, and the recommended yields are presented in tabular form. These include many nuclides that fission by neutrons at several energies. These energies include thermal (T), fission spectrum (F), 14 MeV high energy (H or HE), and spontaneous fission (S), in six sets of ten each. Set A includes U235T, U235F, U235HE, U238F, U238HE, Pu239T, Pu239F, Pu241T, U233T, Th232F. Set B includes U233F, U233HE, U236F, Pu239H, Pu240F, Pu241F, Pu242F, Th232H, Np237F, Cf252S. Set C includes U234F, U237F, Pu240H, U234HE, U236HE, Pu238F, Am241F, Am243F, Np238F, Cm242F. Set D includes Th227T, Th229T, Pa231F, Am241T, Am241H, Am242MT, Cm245T, Cf249T, Cf251T, Es254T. Set E includes Cf250S, Cm244S, Cm248S, Es253S, Fm254S, Fm255T, Fm256S, Np237H, U232T, U238S. Set F includes Cm243T, Cm246S, Cm243F, Cm244F, Cm246F, Cm248F, Pu242H, Np237T, Pu240T, and Pu242T, completing fission product yield evaluations for 60 fissioning systems in all. This report also serves as the primary documentation for the second evaluation of yields in ENDF/B-VI, released in 1993.

  16. Nuclear physics at Ganil. A compilation 1989-1991

    International Nuclear Information System (INIS)

    1991-01-01

This compilation covers experimental and theoretical work performed at GANIL during 1989-1991 on nuclear structure and nuclear reactions. During this period, the accelerator's performance was substantially improved, both in the energies and intensities delivered and in the range of accelerated ions. In the experimental areas, a completely new data acquisition system was set up, and the addition of a Wien filter to the LISE spectrometer now provides a versatile and efficient isotope separator, called LISE III. The time structure and large intensity of the beam were decisive in identifying, for the first time, kaon production in heavy-ion collisions at GANIL subthreshold energies. Nucleons must undergo several collisions before inducing such a process, and strange-particle emission should be very sensitive to the physical conditions of the hot, compressed interacting zone. Lead and uranium beams, now available at the Fermi energy, have been used to study the nuclear disassembly of very large and heavy systems. New results have been obtained on collective flow in heavy-ion reactions, giving new insights into the equation-of-state problem. In the field of nuclear structure, the magnetic spectrometer SPEG, coupled with large particle or gamma detectors, has shed light on new aspects of giant resonance excitations. Exotic nuclei are extensively studied, with particular emphasis on the 11Li nucleus. A new method of mass measurement, using the CSS2 as a mass separator, has been successfully tested; it will greatly improve the accuracy achieved for intermediate and heavy nuclei. Last but not least, the theory group is actively working to include fluctuations in the description of nuclear dynamics and to characterise the onset of the multifragmentation process in heavy-ion collisions. An author index and publication list are included.

  17. Assessment of the current status of basic nuclear data compilations

    International Nuclear Information System (INIS)

    Riemer, R.L.

    1992-01-01

The Panel on Basic Nuclear Data Compilations believes that it is important to provide the user with an evaluated nuclear database of the highest quality, dependability, and currency. It is also important that the evaluated nuclear data be easily accessible to the user. In the past the panel concentrated its concern on the cycle time for the publication of A-chain evaluations. However, the panel now recognizes that publication cycle time is no longer the appropriate goal. Sometime in the future, publication of the evaluated A-chains will evolve from the present hard-copy Nuclear Data Sheets on library shelves to purely electronic publication, with the advent of universal access to terminals and the nuclear databases. Therefore, the literature cut-off date in the Evaluated Nuclear Structure Data File (ENSDF) is rapidly becoming the only important measure of the currency of an evaluated A-chain. It has also become exceedingly important to ensure that access to the databases is as user-friendly as possible and to enable electronic publication of the evaluated data files. Considerable progress has been made in these areas: use of the on-line systems has almost doubled in the past year, and there has been initial development of tools for electronic evaluation, publication, and dissemination. Currently, the nuclear data effort is in transition between the traditional and future methods of dissemination of the evaluated data. Many of the factors that adversely affect the publication cycle time also affect the currency of the evaluated nuclear database. Therefore, the panel continues to examine factors that can influence cycle time: the number of evaluators, the frequency with which an evaluation can be updated, the review of the evaluation, and the production of the evaluation, which currently exists as a hard-copy issue of Nuclear Data Sheets.

  18. A compiled checklist of seaweeds of Sudanese Red Sea coast

    Directory of Open Access Journals (Sweden)

    Nahid Abdel Rahim Osman

    2016-02-01

Objective: To present an updated and compiled checklist of Sudanese seaweeds as an example for the region, for conservation as well as development purposes. Methods: The checklist was developed from both field investigations, using the line-transect method at 4 sites along the Red Sea coast of Sudan, and a review of available studies on Sudanese seaweeds. Results: In total, 114 macroalgal names were recorded, distributed among 16 orders, 34 families, and 62 genera. The Rhodophyceae comprised 8 orders, 17 families, 32 genera, and 47 species. The Phaeophyceae comprised 4 orders, 5 families, 17 genera, and 28 species. The 39 species of Chlorophyceae belong to 2 classes, 4 orders, 12 families, and 14 genera. The present paper proposes the addition of 11 macroalgal taxa to the Sudan seaweed species list: 3 red, 1 brown, and 7 green seaweed species. Conclusions: This list is not yet exhaustive; it represents only the macroalgal species common to the intertidal areas of the Sudanese Red Sea coast, and further investigation may reveal the presence of more species. While significant levels of diversity and endemism have been documented for other groups of organisms in the Red Sea region, similar work remains to be done for seaweeds. Considering the impact of climate change on community structure and composition, and the growing risks from maritime transport through the Red Sea, particularly from oil tankers and from oil exploration, baseline data on seaweeds are much needed for management purposes.
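
The per-class breakdown in the abstract can be cross-checked against the stated totals with simple arithmetic; a small sketch using only the numbers given above:

```python
# (orders, families, genera, species) per macroalgal class, from the abstract
checklist = {
    "Rhodophyceae":  (8, 17, 32, 47),
    "Phaeophyceae":  (4,  5, 17, 28),
    "Chlorophyceae": (4, 12, 14, 39),
}

labels = ("orders", "families", "genera", "species")
totals = {label: sum(vals) for label, vals in zip(labels, zip(*checklist.values()))}
print(totals)  # {'orders': 16, 'families': 34, 'genera': 63, 'species': 114}
```

The orders, families, and species sum to the stated 16, 34, and 114, while the genus counts sum to 63 against the 62 reported; surfacing small discrepancies of this kind is exactly what such a tally is for.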

  19. Fifth Baltic Sea pollution load compilation (PLC-5)

    Energy Technology Data Exchange (ETDEWEB)

    Knuuttila, S.; Svendsen, L. M.; Staaf, H.; Kotilainen, P.; Boutrup, S.; Pyhala, M.; Durkin, M.

    2011-07-01

This report presents the main results of the Fifth Pollution Load Compilation, abbreviated PLC-5. It includes quantified annual waterborne total loads (from rivers, unmonitored and coastal areas, as well as direct point and diffuse sources discharging directly to the Baltic Sea) from 1994 to 2008, providing a basis for evaluating any decreasing (or increasing) trends in total waterborne inputs to the Baltic Sea. Chapter 1 states the objectives of the PLC and the framework for classifying inputs and sources. Chapter 2 gives a short description of the Baltic Sea catchment area, while the methods for quantification and analysis, together with quality assurance topics, are briefly introduced in Chapter 3. More detailed information on methodologies is presented in the PLC-5 guidelines (HELCOM 2006). Chapter 4 reports the total inputs of nutrients and selected heavy metals to the Baltic Sea. It also presents the quantification of discharges and losses of nitrogen and phosphorus from point and diffuse sources into inland surface waters within the Baltic Sea catchment area (the source-oriented approach, or gross loads), as well as the total load to the maritime area (the load-oriented approach, or net loads) in 2006. Results are typically presented by country and by main Baltic Sea sub-region. In Chapter 5, flow normalization is introduced and the results of trend analyses on the 1994-2008 time series of total waterborne loads of nitrogen and phosphorus are given, together with a first evaluation of progress toward the provisional reduction targets by country and by main Baltic Sea sub-region. Chapter 6 discusses some of the main conclusions and offers advice for future PLCs. The annexes contain the flow-normalized annual load data and figures and tables with results from the PLC-5.
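
Flow normalization, introduced in Chapter 5, removes year-to-year hydrological variability so that load trends reflect changes in sources rather than in runoff. A minimal sketch of one common variant (a linear load-flow regression evaluated at the long-term mean flow; this is an illustration, not necessarily the exact PLC-5 procedure):

```python
def flow_normalize(loads, flows):
    """Remove the flow-driven part of each annual load via a linear load-flow fit.

    Returns loads adjusted to the long-term mean flow:
    L_norm = L_obs - b * (Q_obs - Q_mean), with b the OLS slope of load on flow.
    """
    n = len(loads)
    q_mean = sum(flows) / n
    l_mean = sum(loads) / n
    # ordinary least squares slope of load on flow
    b = sum((q - q_mean) * (l - l_mean) for q, l in zip(flows, loads)) / \
        sum((q - q_mean) ** 2 for q in flows)
    return [l - b * (q - q_mean) for l, q in zip(loads, flows)]

# A load series that rises purely with flow flattens out after normalization:
print(flow_normalize([10.0, 20.0, 30.0, 40.0], [1.0, 2.0, 3.0, 4.0]))  # all 25.0
```

With loads that vary purely with flow, the normalized series is flat, so any trend remaining after normalization can be attributed to changes in the sources themselves.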

  20. Distribution of soil organic carbon in the conterminous United States

    Science.gov (United States)

    Bliss, Norman B.; Waltman, Sharon; West, Larry T.; Neale, Anne; Mehaffey, Megan; Hartemink, Alfred E.; McSweeney, Kevin M.

    2014-01-01

The U.S. Soil Survey Geographic (SSURGO) database provides detailed soil mapping for most of the conterminous United States (CONUS). These data have been used to formulate estimates of soil carbon stocks and have been useful for environmental models, including plant productivity, hydrologic, and ecological models for studies of greenhouse gas exchange. The data were compiled by the U.S. Department of Agriculture Natural Resources Conservation Service (NRCS) from 1:24,000-scale or 1:12,000-scale maps. The total soil organic carbon stock in CONUS is estimated at 57 Pg C to 1 m depth and 73 Pg C for the total profile, based on SSURGO with data gaps filled from the 1:250,000-scale Digital General Soil Map. We explore the non-linear distribution of soil carbon across the landscape and with depth in the soil, and the implications of the observed soil carbon variability for sampling strategies.
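
Stock estimates like the 57 Pg C figure are built up from per-layer calculations of the form thickness × bulk density × organic-carbon fraction, corrected for coarse fragments. A minimal sketch with an invented three-horizon profile (the layer values are illustrative, not SSURGO data):

```python
def soc_stock(layers):
    """Soil organic carbon stock (kg C per m^2) summed over soil horizons.

    Each layer: (thickness_m, bulk_density_kg_per_m3, oc_fraction, coarse_frag_fraction)
    """
    return sum(t * bd * oc * (1 - cf) for t, bd, oc, cf in layers)

# Hypothetical three-horizon profile to 1 m depth (illustrative values only)
profile = [
    (0.2, 1300, 0.020, 0.05),  # topsoil: 2% organic carbon
    (0.3, 1450, 0.008, 0.10),
    (0.5, 1550, 0.003, 0.10),
]
print(round(soc_stock(profile), 2), "kg C/m^2")  # 10.16 kg C/m^2
```

Summing such per-pixel stocks over mapped soil polygons, and over depth slices, is how database-wide totals of this kind are aggregated.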