WorldWideScience

Sample records for unit map compiled

  1. Compiler validates units and dimensions

    Science.gov (United States)

    Levine, F. E.

    1980-01-01

Software added to the compiler for the automated test system for the Space Shuttle decreases computer run errors by providing offline validation of engineering units used in system command programs. The validation procedures are general, though originally written for GOAL, a free-form language that accepts "English-like" statements, and may be adapted to other programming languages.

  2. Circum-Arctic Map Compilation

    Science.gov (United States)

    Saltus, Richard W.; Gaina, Carmen

    2007-05-01

    Second Workshop of the Circum-Arctic Geophysical Maps Project, Trondheim, Norway, 12-13 February 2007 The eyes of the world are increasingly focused on the polar regions. Exploration and assessment of energy and mineral resources for the growing world economy are moving to high-latitude frontier areas. The effects of climatic changes are particularly pronounced at these ends of the Earth and have already attracted worldwide attention and concern. Many recent articles related to the International Polar Year underscore the importance of even basic mapping of the Arctic and Antarctic.

  3. Geospatial compilation and digital map of center-pivot irrigated areas in the mid-Atlantic region, United States

    Science.gov (United States)

    Finkelstein, Jason S.; Nardi, Mark R.

    2015-01-01

    To evaluate water availability within the Northern Atlantic Coastal Plain, the U.S. Geological Survey, in cooperation with the University of Delaware Agricultural Extension, created a dataset that maps the number of acres under center-pivot irrigation in the Northern Atlantic Coastal Plain study area. For this study, the extent of the Northern Atlantic Coastal Plain falls within areas of the States of New York, New Jersey, Delaware, Maryland, Virginia, and North Carolina. The irrigation dataset maps about 271,900 acres operated primarily under center-pivot irrigation in 57 counties. Manual digitizing was performed against aerial imagery in a process where operators used observable center-pivot irrigation signatures—such as irrigation arms, concentric wheel paths through cropped areas, and differential colors—to identify and map irrigated areas. The aerial imagery used for digitizing came from a variety of sources and seasons. The imagery contained a variety of spatial resolutions and included online imagery from the U.S. Department of Agriculture National Agricultural Imagery Program, Microsoft Bing Maps, and the Google Maps mapping service. The dates of the source images ranged from 2010 to 2012 for the U.S. Department of Agriculture imagery, whereas maps from the other mapping services were from 2013.

  4. Hydrothermal alteration maps of the central and southern Basin and Range province of the United States compiled from Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data

    Science.gov (United States)

    Mars, John L.

    2013-01-01

    Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data and Interactive Data Language (IDL) logical operator algorithms were used to map hydrothermally altered rocks in the central and southern parts of the Basin and Range province of the United States. The hydrothermally altered rocks mapped in this study include (1) hydrothermal silica-rich rocks (hydrous quartz, chalcedony, opal, and amorphous silica), (2) propylitic rocks (calcite-dolomite and epidote-chlorite mapped as separate mineral groups), (3) argillic rocks (alunite-pyrophyllite-kaolinite), and (4) phyllic rocks (sericite-muscovite). A series of hydrothermal alteration maps, which identify the potential locations of hydrothermal silica-rich, propylitic, argillic, and phyllic rocks on Landsat Thematic Mapper (TM) band 7 orthorectified images, and geographic information systems shape files of hydrothermal alteration units are provided in this study.
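The "logical operator algorithms" mentioned above generally combine thresholded band ratios with boolean logic to flag altered pixels. A sketch in that spirit, where the band combination and threshold values are illustrative assumptions, not the ones used in the USGS study:

```python
import numpy as np

# Illustrative "logical operator" alteration classifier: band ratios are
# thresholded and combined with boolean logic to flag candidate pixels.
# Band indices and thresholds here are hypothetical, not the study's values.
def argillic_mask(bands, ratio_threshold=1.3, brightness_threshold=0.15):
    """bands: dict of reflectance arrays keyed by ASTER band number."""
    b4, b5, b6, b7 = (bands[i].astype(float) for i in (4, 5, 6, 7))
    # An Al-OH absorption feature near band 6 raises the (b5 + b7) / (2*b6) ratio
    aloh = (b5 + b7) / (2 * b6)
    bright_enough = b4 > brightness_threshold  # suppress dark/shadowed pixels
    return (aloh > ratio_threshold) & bright_enough
```

The resulting boolean raster could then be vectorized into the kind of GIS shapefiles the study distributes.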

  5. Digital compilation bedrock geologic map of the Warren quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-4A Walsh, GJ, Haydock, S, Prewitt, J, Kraus, J, Lapp, E, O'Loughlin, S, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the...

  6. Digital compilation bedrock geologic map of the Milton quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-8A Dorsey, R, Doolan, B, Agnew, PC, Carter, CM, Rosencrantz, EJ, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the Milton...

  7. Digital compilation bedrock geologic map of the Lincoln quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-5A Stanley, R, DelloRusso, V, Haydock, S, Lapp, E, O'Loughlin, S, Prewitt, J,and Tauvers, PR, 1995, Digital compilation bedrock geologic map...

  8. Compilation of VS30 Data for the United States

    Science.gov (United States)

    Yong, Alan; Thompson, Eric M.; Wald, David J.; Knudsen, Keith L.; Odum, Jack K.; Stephenson, William J.; Haefner, Scott

    2016-01-01

VS30, the time-averaged shear-wave velocity (VS) to a depth of 30 meters, is a key index adopted by the earthquake engineering community to account for seismic site conditions. VS30 is typically based on geophysical measurements of VS derived from invasive and noninvasive techniques at sites of interest. Owing to cost considerations, as well as logistical and environmental concerns, VS30 data are sparse or not readily available for most areas. Where data are available, VS30 values are often assembled in assorted formats that are accessible from disparate and (or) impermanent Web sites. To help remedy this situation, we compiled VS30 measurements obtained by studies funded by the U.S. Geological Survey (USGS) and other governmental agencies. Thus far, we have compiled VS30 values for 2,997 sites in the United States, along with metadata for each measurement from government-sponsored reports, Web sites, and scientific and engineering journals. Most of the data in our VS30 compilation originated from publications directly reporting the work of field investigators. A small subset (less than 20 percent) of VS30 values was previously compiled by the USGS and other research institutions. Whenever possible, VS30 values originating from these earlier compilations were cross-checked against published reports. Both downhole and surface-based VS30 estimates are represented in our VS30 compilation. Most of the VS30 data are for sites in the western contiguous United States (2,141 sites), whereas 786 VS30 values are for sites in the Central and Eastern United States; 70 values are for sites in other parts of the United States, including Alaska (15 sites), Hawaii (30 sites), and Puerto Rico (25 sites). An interactive map is hosted on the primary USGS Web site for accessing VS30 data (http://earthquake.usgs.gov/research/vs30/).
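VS30 as defined above is 30 m divided by the vertical shear-wave travel time through the top 30 m. A minimal sketch of the computation for a layered velocity model (the layer values are hypothetical, not from the USGS compilation):

```python
# VS30 = 30 m / (total shear-wave travel time through the top 30 m).
# Minimal sketch for a simple layered model; layer data are hypothetical.
def vs30(layers):
    """layers: list of (thickness_m, vs_m_per_s) tuples from the surface down."""
    depth, travel_time = 0.0, 0.0
    for thickness, vs in layers:
        # only the portion of each layer above 30 m depth contributes
        used = min(thickness, 30.0 - depth)
        if used <= 0:
            break
        travel_time += used / vs
        depth += used
    if depth < 30.0:
        raise ValueError("profile shallower than 30 m")
    return 30.0 / travel_time

# Example: three 10 m layers at 200, 400, and 800 m/s
print(round(vs30([(10, 200), (10, 400), (10, 800)]), 1))  # → 342.9
```

For this example profile the slow shallow layer dominates the travel time, which is why VS30 falls well below the simple average of the layer velocities.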

  9. Hydrothermal Alteration Maps of the Central and Southern Basin and Range Province of the United States Compiled From Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Data

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data and Interactive Data Language (IDL) logical operator algorithms were used to map...

  10. Global Seismicity: Three New Maps Compiled with Geographic Information Systems

    Science.gov (United States)

    Lowman, Paul D., Jr.; Montgomery, Brian C.

    1996-01-01

This paper presents three new maps of global seismicity compiled from NOAA digital data, covering the interval 1963-1988, with three different magnitude ranges (mb): greater than 3.5, less than 3.5, and all detectable magnitudes. A commercially available geographic information system (GIS) was used as the database manager. Epicenter locations were acquired from a CD-ROM supplied by the National Geophysical Data Center. A methodology is presented that can be followed by general users. The implications of the maps are discussed, including the limitations of conventional plate models and the different tectonic behavior of continental versus oceanic lithosphere. Several little-known areas of intraplate or passive-margin seismicity are also discussed, possibly expressing horizontal compression generated by ridge push.

  11. Spatial Compilation of Holocene Volcanic Vents in the Western Conterminous United States

    Science.gov (United States)

    Ramsey, D. W.; Siebert, L.

    2015-12-01

    A spatial compilation of all known Holocene volcanic vents in the western conterminous United States has been assembled. This compilation records volcanic vent location (latitude/longitude coordinates), vent type (cinder cone, dome, etc.), geologic map unit description, rock type, age, numeric age and reference (if dated), geographic feature name, mapping source, and, where available, spatial database source. Primary data sources include: USGS geologic maps, USGS Data Series, the Smithsonian Global Volcanism Program (GVP) catalog, and published journal articles. A total of 726 volcanic vents have been identified from 45 volcanoes or volcanic fields spanning ten states. These vents are found along the length of the Cascade arc in the Pacific Northwest, widely around the Basin and Range province, and at the southern margin of the Colorado Plateau into New Mexico. The U.S. Geological Survey (USGS) National Volcano Early Warning System (NVEWS) identifies 28 volcanoes and volcanic centers in the western conterminous U.S. that pose moderate, high, or very high threats to surrounding communities based on their recent eruptive histories and their proximity to vulnerable people, property, and infrastructure. This compilation enhances the understanding of volcano hazards that could threaten people and property by providing the context of where Holocene eruptions have occurred and where future eruptions may occur. Locations in this compilation can be spatially compared to located earthquakes, used as generation points for numerical hazard models or hazard zonation buffering, and analyzed for recent trends in regional volcanism and localized eruptive activity.

  12. Computer-assisted methods for the construction, compilation and display of geoscientific maps

    Science.gov (United States)

    Gabert, Gottfried

The paper reviews modern methods for map construction, compilation, and display on the basis of current applications at the Geological Surveys of the Federal Republic of Germany and Lower Saxony. The graphical representation of geoscientific data, for example mapping and exploration results, is generally done in the traditional way of analog maps. Several possibilities exist for producing digital maps: construction directly from geological field data; digitization of existing maps, especially manuscript maps; and conversion of remotely sensed data into raster or vector maps.

  13. Unit 02 - Maps and Map Analysis

    OpenAIRE

    Unit 55, CC in GIS; Rhind, David

    1990-01-01

    This unit explores the map analysis roots of GIS. It discusses cartography and its relationship to GIS, including topics such as map types and characteristics, the concept of scale, map projections, applications of maps, computer-assisted cartography and geographic data display and analysis.

  14. Optimization of Map Compilation for County-level Land Consolidation Planning

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Based on the practice of land consolidation planning in Changfeng County of Hefei City, and taking full account of the reality of land consolidation and its significance as a livelihood project, we analyzed the map compilation procedure. In combination with the actual effect of land consolidation, we carried out consolidation assessment of the same elements by the overall planning method and optimized the map compilation for county-level land consolidation planning. Results show that the planning map of land consolidation potential can be improved and that legends should be merged. After consolidation of the legends, the potential planning map is easier to apply, solving the problem of reading complicated maps.

  15. Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG96-03 Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont: VGS Open-File Report VG96-3A, 2 plates, scale...

  16. Completion of compilation of the 1:2,000,000-scale topographic map series of Mars

    Science.gov (United States)

    Wu, Sherman S. C.; Jordan, Raymond; Garcia, Patricia A.; Ablin, Karyn K.

    1991-06-01

    Using special photogrammetric techniques, Mars' topography is being systematically mapped at a scale of 1:2,000,000 from high-altitude Viking Orbiter images. Of the 140 maps in the series, 120 have previously been compiled on the AS-11AM analytical stereoplotters. In fiscal year 1991, the remaining 20 maps will be compiled (most of these are between +/- 30 degrees latitude and the poles). Elevations on the maps are related to the Mars topographic datum. The Mars planetwide control net is used for the control of compilation. The maps have a contour interval of 1 km and a vertical precision of +/- 1 km; thus, they are more detailed than previous maps.

  17. A Compilation of Vs30 Values in the United States

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Compiled Vs30 measurements obtained by studies funded by the U.S. Geological Survey (USGS) and other governmental agencies. Thus far, there are 2,997 sites in the...

  18. The paradigm compiler: Mapping a functional language for the connection machine

    Science.gov (United States)

    Dennis, Jack B.

    1989-01-01

    The Paradigm Compiler implements a new approach to compiling programs written in high level languages for execution on highly parallel computers. The general approach is to identify the principal data structures constructed by the program and to map these structures onto the processing elements of the target machine. The mapping is chosen to maximize performance as determined through compile time global analysis of the source program. The source language is Sisal, a functional language designed for scientific computations, and the target language is Paris, the published low level interface to the Connection Machine. The data structures considered are multidimensional arrays whose dimensions are known at compile time. Computations that build such arrays usually offer opportunities for highly parallel execution; they are data parallel. The Connection Machine is an attractive target for these computations, and the parallel for construct of the Sisal language is a convenient high level notation for data parallel algorithms. The principles and organization of the Paradigm Compiler are discussed.
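The mapping idea described above, assigning the elements of a program's principal arrays to processing elements, can be illustrated with a simple block layout. This is a generic sketch, not the Paradigm Compiler's actual allocation strategy:

```python
# Illustrative block-mapping of a 1-D array onto a fixed number of
# processing elements (PEs), in the spirit of the data-parallel mapping
# described above. The layout function is hypothetical, not Paradigm's own.
def block_map(n_elements, n_pes):
    """Return a list mapping each array index to a PE, in contiguous blocks."""
    base, extra = divmod(n_elements, n_pes)
    layout = []
    for pe in range(n_pes):
        # the first `extra` PEs each take one additional element
        count = base + (1 if pe < extra else 0)
        layout.extend([pe] * count)
    return layout

print(block_map(10, 4))  # → [0, 0, 0, 1, 1, 1, 2, 2, 3, 3]
```

Contiguous blocks keep neighboring array elements on the same PE, which tends to reduce communication for stencil-like data-parallel computations.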

  19. North Alaska petroleum analysis: the regional map compilation

    Science.gov (United States)

    Saltus, Richard W.; Bird, Kenneth J.

    2003-01-01

    The U.S. Geological Survey initiated an effort to model north Alaskan petroleum systems. The geographic and geologic basis for modeling systems is provided by a set of regional digital maps that allow evaluation of the widest possible extent of each system. Accordingly, we laid out a rectangular map grid 1300 km (800 miles) east-west and 600 km (375 miles) north-south. The resulting map area extends from the Yukon Territory of Canada on the east to the Russian-U.S. Chukchi Sea on the west and from the Brooks Range on the south to the Canada basin-Chukchi borderland on the north. Within this map region, we combined disparate types of publicly available data to produce structure contour maps. Data types range from seismic-based mapping as in the National Petroleum Reserve to well penetrations in areas of little or no seismic data where extrapolation was required. With these types of data, we produced structure contour maps on three horizons: top of pre-Mississippian (basement), top of Triassic (Ellesmerian sequence), and top of Neocomian (Beaufortian sequence). These horizons, when combined with present-day topography and bathymetry, provide the bounding structural/stratigraphic surfaces of the north Alaskan petroleum province that mark major defining moments of the region's geologic history and allow regional portrayal of preserved sediment accumulations.

  20. Bedrock Outcrop Points Compilation

    Data.gov (United States)

    Vermont Center for Geographic Information — A compilation of bedrock outcrops as points and/or polygons from 1:62,500 and 1:24,000 geologic mapping by the Vermont Geological Survey, the United States...

  1. Compilation of geogenic radon potential map of Pest County, Hungary

    Science.gov (United States)

    Szabó, K. Zs.; Pásztor, L.; Horváth, Á.; Bakacsi, Zs.; Szabó, J.; Szabó, Cs.

    2010-05-01

222Rn and its effect on human health have recently received major importance in environmental studies. This natural radioactive gas accounts for about 9% of lung cancer deaths and about 2% of all cancer deaths in Europe due to indoor radon concentrations. It moves into buildings from the natural decay chain of uranium in soils, rocks, and building materials. Radon mapping regionalizes the average hazard from radon in a selected area as a radon risk map. Two major methods (concerning the applied radon data) have been used for mapping. One uses indoor radon data, whereas the other is based on soil gas radon data. The outputs of the second approach are geogenic radon potential maps. The principal objective of our work is to take the first step in geogenic radon mapping in Hungary. Soil samples collected in Pest County (Central Region of Hungary) in the frame of a countrywide soil survey (Soil Information and Monitoring System) were studied to obtain empirical information on the potential radon risk. As the first two steps, the radium concentrations of soil samples, collected at 43 locations by sampling soil profiles by genetic horizon from the surface down to 60-150 cm, were determined using an HPGe gamma-spectroscopy technique, and radon exhalation measurements on the soil samples were carried out using a closed radon accumulation chamber coupled with a RAD7 radon monitor detector. From these data the exhalation coefficient was calculated, which shows what percentage of the produced radon can escape from the sample. This rate strongly depends on depth: at circa 100 cm a drastic decrease has been noticed, which is explained by the change in soil texture. The major source of indoor radon is the soil gas radon concentration (Barnet et al., 2005). We estimated this value from the measured radon exhalation and the calculated soil porosity and density. The soil gas radon concentration values were categorized after Kemski et al. (2001) and then the
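The estimation step mentioned above, deriving a soil gas radon concentration from measured radium content, exhalation fraction, porosity, and density, can be sketched with a common simplified steady-state relation. Both the formula as used here and the input values are illustrative assumptions, not the authors' exact procedure:

```python
# Simplified steady-state estimate of soil-gas radon concentration (Bq/m3):
# pore-air radon ≈ Ra activity (Bq/kg) × exhalation (emanation) fraction
#                  × dry bulk density (kg/m3) / porosity.
# Generic textbook approximation; the input values below are hypothetical.
def soil_gas_radon(ra_bq_per_kg, exhalation_fraction, bulk_density_kg_m3, porosity):
    if not 0 < porosity < 1:
        raise ValueError("porosity must be a fraction between 0 and 1")
    return ra_bq_per_kg * exhalation_fraction * bulk_density_kg_m3 / porosity

# e.g. 30 Bq/kg radium, 20% exhalation, 1600 kg/m3 soil, 40% porosity
print(round(soil_gas_radon(30.0, 0.2, 1600.0, 0.4)))  # → 24000
```

Values of this order (tens of kBq/m3) are typical of soil gas, which is why even small air-exchange pathways into buildings matter for indoor radon.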

  2. Digital compilation bedrock geologic map of the South Mountain quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-3A Stanley, R.S., DelloRusso, V., Tauvers, P.R., DiPietro, J.A., Taylor, S., and Prahl, C., 1995, Digital compilation bedrock geologic map of...

  3. Digital compilation bedrock geologic map of the Mt. Ellen quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-6A Stanley, RS, Walsh, G, Tauvers, PR, DiPietro, JA, and DelloRusso, V, 1995, Digital compilation bedrock geologic map of the Mt. Ellen...

  4. Database compilation for the geologic map of the San Francisco volcanic field, north-central Arizona

    Science.gov (United States)

    Bard, Joseph A.; Ramsey, David W.; Wolfe, Edward W.; Ulrich, George E.; Newhall, Christopher G.; Moore, Richard B.; Bailey, Norman G.; Holm, Richard F.

    2016-01-08

The main component of this publication is a geologic map database prepared using geographic information system (GIS) applications. The geodatabase of geologic points, lines, and polygons was produced as a compilation from five adjoining map sections originally published as printed maps in 1987 (see references in metadata). Four of the sections (U.S. Geological Survey Miscellaneous Field Studies Maps MF–1957, MF–1958, MF–1959, MF–1960) were created by scanning and geo-referencing stable base map material consisting of mylar positives. The final section (MF–1956) was compiled by hand tracing an enlargement of the available printed paper base map onto mylar using a #00 rapidograph pen; the mylar positive was then digitally scanned and geo-referenced. This method was chosen because the original basemap materials (mylar positives) for the MF–1956 section were unavailable at the time of this publication. Due to the condition of the available MF–1956 map section used as the base (which had previously been folded), the accuracy within the boundary of the MF–1956 section is presumed to be degraded in certain areas. The locations of the degraded areas and the degree of degradation within these areas are unclear. Final compilation of the database was completed using the ArcScan and Editor toolsets in ESRI ArcMap 10.1. Polygon topology was created from the lines, and labels were added to the resultant geological polygons, lines, and points. Joseph A. Bard and David W. Ramsey updated and corrected the geodatabase, created the metadata and web presence, and provided the GIS expertise to bring the geodatabase and metadata to completion. Included are links to files to view or print the original map sheets and the accompanying pamphlets.

  5. Geologic Map of Alaska: geologic units

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset consists of a polygon coverage and associated attribute data derived from the 1980 Geologic Map of Alaska compiled by H.M. Beikman and published by the...

  6. Geologic map of the Shaida deposit and Misgaran prospect, Herat Province, Afghanistan, modified from the 1973 original map compilation of V.I. Tarasenko and others

    Science.gov (United States)

    Tucker, Robert D.; Stettner, Will R.; Masonic, Linda M.; Moran, Thomas W.

    2014-01-01

    This map is a modified version of Geological map and map of useful minerals, Shaida area, scale 1:50,000, which was compiled by V.I. Tarasenko, N.I. Borozenets, and others in 1973. Scientists from the U.S. Geological Survey, in cooperation with the Afghan Geological Survey and the Task Force for Business and Stability Operations of the U.S. Department of Defense, studied the original document and related reports and also visited the field area in August 2010. This modified map illustrates the geological structure of the Shaida copper-lead-zinc deposit and Misgaran copper-lead-zinc prospect in western Afghanistan and includes cross sections of the same area. The map reproduces the topology (contacts, faults, and so forth) of the original Soviet map and cross sections and includes modifications based on our examination of these documents and on observations made during our field visit. Elevations on the cross sections are derived from the original Soviet topography and might not match the newer topography used on the current map. We have attempted to translate the original Russian terminology and rock classification into modern English geologic usage as literally as possible without changing any genetic or process-oriented implications in the original descriptions. We also use the age designations from the original map. The unit colors on the map and cross sections differ from the colors shown on the original version. The units are colored according to the color and pattern scheme of the Commission for the Geological Map of the World (CGMW) (http://www.ccgm.org).

  7. Bedrock Geologic Map of Vermont - Units

    Data.gov (United States)

    Vermont Center for Geographic Information — The bedrock geology was last mapped at a statewide scale 50 years ago at a scale of 1:250,000 (Doll and others, 1961). The 1961 map was compiled from 1:62,500-scale...

  8. Stress field modeling of the Carpathian Basin based on compiled tectonic maps

    Science.gov (United States)

    Albert, Gáspár; Ungvári, Zsuzsanna; Szentpéteri, Krisztián

    2014-05-01

The estimation of the stress field in the Carpathian Basin has been tackled by several authors. Their modeling methods are usually based on measurements (borehole, focal-mechanism, and geodesic data), and the result is a possible structural pattern of the region. Our method works indirectly: the analysis aims to project a possible 2D stress field over the already mapped/known/compiled lineament pattern. This includes a component-wise interpolation of the tensor field, based on an irregular point cloud generated in the buffer zone of the mapped lineaments. The interpolated values appear on contour and tensor maps and show the relative stress field of the area. In 2006, Horváth et al. compiled the 'Atlas of the present-day geodynamics of the Pannonian basin'. To test our method, we processed the lineaments of the 1:1 500 000 scale 'Map of neotectonic (active) structures' published in this atlas. The geodynamic parameters (i.e., normal, reverse, right- and left-lateral strike-slip faults, etc.) of the lines on this map were mostly explained in the legend. We classified the linear elements according to these parameters and created a geo-referenced mapping database. This database contains the polyline sections of the map lineaments as vectors (i.e., line sections) and the directions of the stress field as attributes of these vectors. The directions of the dip-parallel, strike-parallel, and vertical stress vectors are calculated from the geodynamic parameters of the line section. Since we created relative stress field properties, the eigenvalues of the vectors were maximized to one. Each point in the point cloud inherits the stress property of the line section from which it was derived. During the modeling we tried several point-cloud generation and interpolation methods. The analysis of the interpolated tensor fields revealed that the model was able to reproduce a geodynamic synthesis of the Carpathian Basin, which can be correlated with the synthesis of the
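Component-wise interpolation of a tensor field, as described above, means applying a scalar interpolator independently to each tensor component. A sketch using inverse-distance weighting, which is an assumed stand-in for whichever interpolation methods the authors actually tried (the point data are hypothetical):

```python
# Component-wise inverse-distance-weighted (IDW) interpolation of a 2-D
# stress tensor field: each component (sxx, syy, sxy) is treated as an
# independent scalar. Illustrative sketch only; data points are hypothetical.
def idw_tensor(points, query, power=2.0):
    """points: list of ((x, y), (sxx, syy, sxy)); query: (x, y)."""
    weights, accum = 0.0, [0.0, 0.0, 0.0]
    for (x, y), tensor in points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return list(tensor)  # query coincides with a data point
        w = 1.0 / d2 ** (power / 2.0)  # weight = 1 / distance**power
        weights += w
        for i, component in enumerate(tensor):
            accum[i] += w * component
    return [c / weights for c in accum]

pts = [((0, 0), (1.0, 0.0, 0.0)), ((2, 0), (0.0, 1.0, 0.0))]
print(idw_tensor(pts, (1, 0)))  # → [0.5, 0.5, 0.0] (midpoint, equal weights)
```

Note that interpolating components independently does not preserve tensor invariants such as principal-stress magnitudes, which is one reason the result is best read as a relative field, as the abstract does.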

  9. Aeromagnetic map compilation: procedures for merging and an example from Washington

    Directory of Open Access Journals (Sweden)

    C. Finn

    2000-06-01

Rocks in Antarctica and offshore have widely diverse magnetic properties. Consequently, aeromagnetic data collected there can improve knowledge of the geologic, tectonic, and geothermal characteristics of the region. Aeromagnetic data can map concealed structures such as faults, folds, and dikes; ascertain basin thickness; and locate buried volcanic, as well as some intrusive and metamorphic, rocks. Gridded, composite data sets allow a view of continental-scale trends that individual data sets do not provide and link widely separated areas of outcrop and disparate geologic studies. Individual magnetic surveys must be processed so that they match adjacent surveys prior to merging. A consistent representation of the Earth's magnetic field (the International Geomagnetic Reference Field, IGRF) must be removed from each data set. All data sets need to be analytically continued to the same flight elevation, with their datums shifted to match adjacent data. I advocate minimal processing to best represent the individual surveys in the merged compilation. An example of a compilation of aeromagnetic surveys from Washington illustrates the utility of aeromagnetic maps for providing synoptic views of regional tectonic features.
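The datum-matching step described above can be sketched as estimating a constant offset between two surveys over their overlap and removing it. This is a deliberately minimal illustration with hypothetical values; a real merge also requires IGRF removal and analytic continuation to a common flight elevation first:

```python
import statistics

# Minimal sketch of datum matching: estimate a constant offset between two
# surveys from their overlapping samples and remove it from the second
# survey. Values are hypothetical; real workflows first remove the IGRF
# and continue all surveys to a common flight elevation.
def datum_shift(survey_a_overlap, survey_b_overlap, survey_b_full):
    """Shift survey B so its overlap median matches survey A's."""
    offset = statistics.median(b - a for a, b in zip(survey_a_overlap, survey_b_overlap))
    return [v - offset for v in survey_b_full]

a_overlap = [100.0, 102.0, 98.0]
b_overlap = [110.0, 112.0, 108.0]  # survey B reads about 10 nT high
print(datum_shift(a_overlap, b_overlap, [110.0, 120.0]))  # → [100.0, 110.0]
```

Using the median rather than the mean keeps a few anomalous overlap samples from biasing the shift, which fits the "minimal processing" philosophy of the abstract.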

  10. Compiling Dictionaries

    African Journals Online (AJOL)

    Information Technology

    quiring efficient techniques. The text corpus … make the process of compiling a dictionary simpler and more efficient. If we are ever … need a mass production technique. … Mapping semantic relationships in the lexicon using lexical functions.

  11. Basement domain map of the conterminous United States and Alaska

    Science.gov (United States)

    Lund, Karen; Box, Stephen E.; Holm-Denoma, Christopher S.; San Juan, Carma A.; Blakely, Richard J.; Saltus, Richard W.; Anderson, Eric D.; DeWitt, Ed

    2015-01-01

    The basement-domain map is a compilation of basement domains in the conterminous United States and Alaska designed to be used at 1:5,000,000-scale, particularly as a base layer for national-scale mineral resource assessments. Seventy-seven basement domains are represented as eighty-three polygons on the map. The domains are based on interpretations of basement composition, origin, and architecture and developed from a variety of sources. Analysis of previously published basement, lithotectonic, and terrane maps as well as models of planetary development were used to formulate the concept of basement and the methodology of defining domains that spanned the ages of Archean to present but formed through different processes. The preliminary compilations for the study areas utilized these maps, national-scale gravity and aeromagnetic data, published and limited new age and isotopic data, limited new field investigations, and conventional geologic maps. Citation of the relevant source data for compilations and the source and types of original interpretation, as derived from different types of data, are provided in supporting descriptive text and tables.

  12. Compilation of the neonatal palliative care clinical guideline in neonatal intensive care unit.

    Science.gov (United States)

    Zargham-Boroujeni, Ali; Zoafa, Aniyehsadat; Marofi, Maryam; Badiee, Zohreh

    2015-01-01

Clinical guidelines are important instruments for increasing the quality of clinical practice in the treatment team. Compilation of clinical guidelines is important due to the special condition of neonates and the critical conditions nurses face in the neonatal intensive care unit (NICU). With 98% of neonatal deaths occurring in hospital NICUs, it is important to pay attention to this issue. This study aimed at compilation of neonatal palliative care clinical guidelines in the NICU. The study was conducted using a multistage comparative strategy, with localization, in Isfahan in 2013. In the first step, the components of the neonatal palliative care clinical guidelines were determined by searching different databases. In the second stage, the expert group's level of consensus on each component of neonatal palliative care was investigated in a nominal group and a focus group, and the clinical guideline was written on that basis. In the third stage, the quality and applicability were assessed based on the positive viewpoints of medical experts, nurses, and faculty members from five cities in Iran. Data were analyzed with descriptive statistics in SPSS. In the first stage, the draft of the neonatal palliative care guideline was designed based on the requirements of neonates, their parents, and the related staff. In the second stage, its rank and applicability were determined, and after analyzing the responses, with the agreement of the focus group, the clinical guideline was written. In the third stage, the mean indication scores obtained were 75%, 69%, 72%, 72%, and 68% by the Appraisal of Guidelines for Research and Evaluation (AGREE) instrument. Compilation of the guideline can play an effective role in the provision of neonatal care.

  13. Map of assessed shale gas in the United States, 2012

    Science.gov (United States)

    Biewick, Laura R. H.

    2013-01-01

    The U.S. Geological Survey has compiled a map of shale-gas assessments in the United States that were completed by 2012 as part of the National Assessment of Oil and Gas Project. Using a geology-based assessment methodology, the U.S. Geological Survey quantitatively estimated potential volumes of undiscovered gas within shale-gas assessment units. These shale-gas assessment units are mapped, and square-mile cells are shown to represent proprietary shale-gas wells. The square-mile cells include gas-producing wells from shale intervals. In some cases, shale-gas formations contain gas in deeper parts of a basin and oil at shallower depths (for example, the Woodford Shale and the Eagle Ford Shale). Because a discussion of shale oil is beyond the scope of this report, only shale-gas assessment units and cells are shown. The map can be printed as a hardcopy map or downloaded for interactive analysis in a Geographic Information System data package using the ArcGIS map document (file extension MXD) and published map file (file extension PMF). Also available is a publications access table with hyperlinks to current U.S. Geological Survey shale gas assessment publications and web pages. Assessment results and geologic reports are available as completed at the U.S. Geological Survey Energy Resources Program Web Site, http://energy.usgs.gov/OilGas/AssessmentsData/NationalOilGasAssessment.aspx. A historical perspective of shale gas activity in the United States is documented and presented in a video clip included as a PowerPoint slideshow.

  14. Geologic map of the western Haji-Gak iron deposit, Bamyan Province, Afghanistan, modified from the 1965 original map compilation of V.V. Reshetniak and I.K. Kusov

    Science.gov (United States)

    Renaud, Karine M.; Tucker, Robert D.; Peters, Stephen G.; Stettner, Will R.; Masonic, Linda M.; Moran, Thomas W.

    2011-01-01

    This map is a modified version of Geologic-prospecting plan of western area of Hajigak iron-ore deposit, scale 1:2,000, which was compiled by V.V. Reshetniak and I.K. Kusov in 1965. (Refer to the References Cited section in the Map PDF for complete citations of the original map and related reports.) USGS scientists, in cooperation with the Afghan Geological Survey and the Task Force for Business and Stability Operations of the U.S. Department of Defense, studied the original documents and also visited the field area in November 2009. This modified map illustrates the geological structure of the western Haji-Gak iron deposit and includes cross sections of the same area. The map reproduces the topology (contacts, faults, and so forth) of the original Soviet map and includes modifications based on our examination of that document. We constructed the cross sections from data derived from the original map. Elevations on the cross sections are derived from the original Soviet topography and may not match the newer topography used on the current map. We have attempted to translate the original Russian terminology and rock classification into modern English geologic usage as literally as possible without changing any genetic or process-oriented implications in the original descriptions. We also use the age designations from the original map. The unit colors on the map and cross sections differ from the colors shown on the original version. The units are colored according to the color and pattern scheme of the Commission for the Geological Map of the World (CGMW) (http://www.ccgm.org).

  15. Genetic map of triticale compiling DArT, SSR, and AFLP markers.

    Science.gov (United States)

    Tyrka, M; Bednarek, P T; Kilian, A; Wędzony, M; Hura, T; Bauer, E

    2011-05-01

    A set of 90 doubled haploid (DH) lines derived from F(1) plants that originated from a cross between ×Triticosecale Wittm. 'Saka3006' and ×Triticosecale Wittm. 'Modus', via wide crossing with maize, was used to create a genetic linkage map of triticale. The map has 21 linkage groups assigned to the A, B, and R genomes, comprising 155 simple sequence repeat (SSR), 1385 diversity array technology (DArT), and 28 amplified fragment length polymorphism (AFLP) markers covering 2397 cM, with a mean distance between two markers of 4.1 cM. Comparative analysis with wheat consensus maps revealed that triticale chromosomes of the A and B genomes were represented by 15 chromosomes, including combinations of 2AS.2AL#, 2AL#2BL, 6AS.6AL#, and 2BS.6AL# instead of 2A, 2B, and 6A. With respect to published maps of rye, substantial rearrangements were also found for chromosomes 1R, 2R, and 3R of the rye genome. Chromosomes 1R and 2R were truncated, and the latter was linked with 3R. A nonhomogeneous distribution of markers across the triticale genome was observed, with an evident bias (48%) towards the rye genome. This genetic map may serve as a reference linkage map of triticale for efficient studies of structural rearrangements, gene mapping, and marker-assisted selection.

  16. Compilation of functional soil maps for the support of spatial planning and land management in Hungary

    Science.gov (United States)

    Pásztor, László; Laborczi, Annamária; Takács, Katalin; Szatmári, Gábor; Fodor, Nándor; Illés, Gábor; Bakacsi, Zsófia; Szabó, József

    2015-04-01

    The main objective of the DOSoReMI.hu (Digital, Optimized, Soil Related Maps and Information in Hungary) project is to significantly extend how demands for spatial soil-related information can be satisfied in Hungary. Although a great amount of soil information is available from former mappings and surveys, discrepancies between the available and the expected data emerge more and more frequently. The gaps are planned to be filled with optimized DSM products based heavily on legacy soil data. Delineation of Areas with Excellent Productivity in the framework of the National Regional Development Plan, and delimitation of Areas with Natural Constraints in Hungary according to the common European biophysical criteria, are primary issues in national-level spatial planning. Impact assessment of the forecasted climate change, and analysis of the possibilities for adaptation in agriculture and forestry, can be supported by scenario-based land-management modelling, whose results can also be incorporated in spatial planning. All these challenges require adequate, preferably timely and spatially detailed, knowledge of the soil cover. To satisfy these demands, the soil conditions of Hungary have been digitally mapped based on the most detailed available recent and legacy soil data, applying proper DSM techniques. Various soil-related information was mapped in three distinct approaches: (i) basic soil properties determining agri-environmental conditions (e.g., soil type according to the Hungarian genetic classification, rootable depth, sand, silt and clay content by soil layers, pH, OM and carbonate content for the plough layer); (ii) biophysical criteria of natural handicaps (e.g., poor drainage, unfavourable texture and stoniness, shallow rooting depth, poor chemical properties and soil moisture balance) defined by the common European system; and (iii) agro-meteorologically modelled yield values for different crops, meteorological

  17. Mapping and prediction of schistosomiasis in Nigeria using compiled survey data and Bayesian geospatial modelling

    DEFF Research Database (Denmark)

    Ekpo, Uwem F.; Hürlimann, Eveline; Schur, Nadine

    2013-01-01

    Schistosomiasis prevalence data for Nigeria were extracted from peer-reviewed journals and reports, geo-referenced and collated in a nationwide geographical information system database for the generation of point prevalence maps. This exercise revealed that the disease is endemic in 35 of the cou… …% confidence interval (CI): 22.8-23.1%). The model suggests that the mean temperature, annual precipitation and soil acidity significantly influence the spatial distribution. Prevalence estimates, adjusted for school-aged children in 2010, showed that the prevalence is…

  18. Global Map: Ports of the United States - Direct Download

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing ferry ports in the United States and Puerto Rico. The data are a modified version of the National Atlas of the United...

  19. Map of assessed continuous (unconventional) oil resources in the United States, 2014

    Science.gov (United States)

    Biewick, Laura R. H.

    2015-01-01

    The U.S. Geological Survey (USGS) conducts quantitative assessments of potential oil and gas resources of the onshore United States and associated coastal State waters. Since 2000, the USGS has completed assessments of continuous (unconventional) resources in the United States based on geologic studies and analysis of well-production data, and has compiled digital maps of the assessment units classified into four categories: shale gas, tight gas, coalbed gas, and shale oil or tight oil (continuous oil). This is the fourth digital map product in a series of USGS unconventional oil and gas resource maps, with a focus on shale-oil or tight-oil (continuous-oil) assessments. The map plate included in this report can be printed in hardcopy form or downloaded in a Geographic Information System (GIS) data package, which includes an ArcGIS ArcMap document (.mxd), a geodatabase (.gdb), and a published map file (.pmf). Supporting geologic studies of total petroleum systems and assessment units, as well as studies of the methodology used in the assessment of continuous-oil resources in the United States, are listed with hyperlinks in table 1. Assessment results and geologic reports are available at the USGS website http://energy.usgs.gov/OilGas/AssessmentsData/NationalOilGasAssessment.aspx.

  20. Global Map: Railroad Stations of the United States - Direct Download

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing Amtrak intercity railroad terminals in the United States. The data are a modified version of the National Atlas of...

  1. Seismic Hazard Map for the United States - Direct Download

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer shows seismic hazard in the United States. The data represent a model showing the probability that ground motion will reach a certain level. This map...

  2. Global Map: Airports of the United States - Direct Download

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing airports in the United States, Puerto Rico and the U.S. Virgin Islands. The data are a modified version of the...

  3. Use of Bedrock and Geomorphic Mapping Compilations in Assessing Geologic Hazards at Recreation Sites on National Forests in NW California

    Science.gov (United States)

    de La Fuente, J. A.; Bell, A.; Elder, D.; Mowery, R.; Mikulovsky, R.; Klingel, H.; Stevens, M.

    2010-12-01

    coverage is a compilation of the best available mapping for all National Forests in California. The geomorphic coverage includes features such as active and dormant landslides, alluvial fans, headwall basins, glacial features, and valley inner gorge. Criteria will be developed which utilize elements of this data to evaluate geologic hazards in the vicinity of developed recreation sites. The second phase will be conducted later and involves site specific analyses focusing on areas identified as higher hazard in the first phase, along with verification and updating of phase 1 findings. The third phase will complete any site level geologic or hydrologic investigations, and wrap up the hazard assessment process. A summary report with hazard maps and recommendations will be prepared at the end of each phase. The overriding goal of this project is to provide sound geologic information to managers so they can use a science-based approach in recognizing and managing geologic hazards at recreation sites.

  4. Quaternary geologic map of the Boston 4 degrees x 6 degrees quadrangle, United States and Canada

    Science.gov (United States)

    State compilations by Hartshorn, Joseph H.; Thompson, W.B.; Chapman, W.F.; Black, R.F.; Richmond, Gerald Martin; Grant, D.R.; Fullerton, David S.; edited and integrated by Richmond, Gerald Martin

    1991-01-01

    The Quaternary Geologic Map of the Boston 4 deg x 6 deg Quadrangle was mapped as part of the Quaternary Geologic Atlas of the United States. The atlas was begun as an effort to depict the areal distribution of surficial geologic deposits and other materials that accumulated or formed during the past 2+ million years, the period that includes all activities of the human species. These materials are at the surface of the earth. They make up the 'ground' on which we walk, the 'dirt' in which we dig foundations, and the 'soil' in which we grow crops. Most of our human activity is related in one way or another to these surface materials that are referred to collectively by many geologists as regolith, the mantle of fragmental and generally unconsolidated material that overlies the bedrock foundation of the continent. The maps were compiled at 1:1,000,000 scale.

  5. Map service: United States Decadal Production History Cells

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map service displays present and past oil and gas production in the United States, as well as the location and intensity of exploratory drilling outside...

  6. Map service: United States Oil and Gas Production 2008

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map service displays present and past oil and gas production in the United States, as well as the location and intensity of exploratory drilling outside...

  7. Quaternary Geologic Map of the Lake of the Woods 4 Degrees x 6 Degrees Quadrangle, United States and Canada

    Science.gov (United States)

    Sado, Edward V.; Fullerton, David S.; Goebel, Joseph E.; Ringrose, Susan M.; Edited and Integrated by Fullerton, David S.

    1995-01-01

    The Quaternary Geologic Map of the Lake of the Woods 4 deg x 6 deg Quadrangle, United States and Canada, was mapped as part of the U.S. Geological Survey Quaternary Geologic Atlas of the United States map series (Miscellaneous Investigations Series I-1420, NM-15). The atlas was begun as an effort to depict the areal distribution of surficial geologic deposits and other materials that accumulated or formed during the past 2+ million years, the period that includes all activities of the human species. These materials are at the surface of the earth. They make up the 'ground' on which we walk, the 'dirt' in which we dig foundations, and the 'soil' in which we grow crops. Most of our human activity is related in one way or another to these surface materials that are referred to collectively by many geologists as regolith, the mantle of fragmental and generally unconsolidated material that overlies the bedrock foundation of the continent. The maps were compiled at 1:1,000,000 scale. This map is a product of collaboration of the Ontario Geological Survey, the Minnesota Geological Survey, the Manitoba Department of Energy and Mines, and the U.S. Geological Survey, and is designed for both scientific and practical purposes. It was prepared in two stages. First, separate maps and map explanations were prepared by the compilers. Second, the maps were combined, integrated, and supplemented by the editor. Map unit symbols were revised to a uniform system of classification and the map unit descriptions were prepared by the editor from information received from the compilers and from additional sources listed under Sources of Information. Diagrams accompanying the map were prepared by the editor. 
For scientific purposes, the map differentiates Quaternary surficial deposits on the basis of lithology or composition, texture or particle size, structure, genesis, stratigraphic relationships, engineering geologic properties, and relative age, as shown on the correlation diagram and

  8. Quaternary geologic map of the Winnipeg 4 degrees x 6 degrees quadrangle, United States and Canada

    Science.gov (United States)

    Fullerton, D. S.; Ringrose, S.M.; Clayton, Lee; Schreiner, B.T.; Goebel, J.E.

    2000-01-01

    The Quaternary Geologic Map of the Winnipeg 4° x 6° Quadrangle, United States and Canada, is a component of the U.S. Geological Survey Quaternary Geologic Atlas of the United States map series (Miscellaneous Investigations Series I-1420), an effort to produce 4° x 6° Quaternary geologic maps, at 1:1 million scale, of the entire conterminous United States and adjacent Canada. The map and the accompanying text and supplemental illustrations provide a regional overview of the areal distributions and characteristics of surficial deposits and materials of Quaternary age (~1.8 Ma to present) in parts of North Dakota, Minnesota, Manitoba, and Saskatchewan. The map is not a map of soils as soils are recognized in agriculture. Rather, it is a map of soils as recognized in engineering geology, or of substrata or parent materials in which agricultural soils are formed. The map units are distinguished chiefly on the basis of (1) genesis (processes of origin) or environments of deposition: for example, sediments deposited primarily by glacial ice (glacial deposits or till), sediments deposited in lakes (lacustrine deposits), or sediments deposited by wind (eolian deposits); (2) age: for example, how long ago the deposits accumulated; (3) texture (grain size) of the deposits or materials; (4) composition (particle lithology) of the deposits or materials; (5) thickness; and (6) other physical, chemical, and engineering properties. Supplemental illustrations show (1) temporal correlation of the map units, (2) the areal relationships of late Wisconsin glacial ice lobes and sublobes, (3) temporal and spatial correlation of late Wisconsin glacial phases, readvance limits, and ice margin stillstands, (4) temporal and stratigraphic correlation of surface and subsurface glacial deposits in the Winnipeg quadrangle and in adjacent 4° x 6° quadrangles, and (5) responsibility for state and province compilations. The database provides information related to geologic hazards (for example

  9. COMPILATION OF GEOMORPHOLOGICAL MAP FOR RECONSTRUCTING THE DEGLACIATION OF ICE-FREE AREAS IN THE MARTEL INLET, KING GEORGE ISLAND, ANTARCTICA

    Directory of Open Access Journals (Sweden)

    Kátia Kellem Rosa

    2014-03-01

    We compiled a geomorphological map and a reconstruction map of glacier extent and ice-free areas in the Martel Inlet, located in King George Island, South Shetlands, Antarctica. Glacier extents were digitized over an orthophotomosaic (2003) and SPOT (February 1988; March 1995 and 2000), QuickBird (October 2006), and COSMO-SkyMed (February 2011) images. The mapping was supported by fieldwork carried out in the summers of 2007, 2010, and 2011, and by topographic surveys and geomorphic mapping in the proglacial area. Several types of glacial deposits were identified in the study area, such as frontal and lateral moraines, flutes, and meltwater channels, together with erosional features such as roches moutonnées, striations, and U-shaped valleys. These features allowed reconstruction of the evolution of the deglaciation environment in the Martel Inlet ice-free areas, which have been affected by a regional climate warming trend. The mapped data indicated that the glaciers in the study area lost about 0.71 km² of their ice masses (13.2% of the 50.3 km² total area), without any advances, during 1979-2011. Over those years the glaciers receded by an average of 25.9 m a-1. The ice-free areas were susceptible to rapid post-depositional changes.

  10. Quaternary Geologic Map of the Lake Nipigon 4 Degrees x 6 Degrees Quadrangle, United States and Canada

    Science.gov (United States)

    Sado, Edward V.; Fullerton, David S.; Farrand, William R.; Edited and Integrated by Fullerton, David S.

    1994-01-01

    The Quaternary Geologic Map of the Lake Nipigon 4 degree x 6 degree Quadrangle was mapped as part of the Quaternary Geologic Atlas of the United States. The atlas was begun as an effort to depict the areal distribution of surficial geologic deposits and other materials that accumulated or formed during the past 2+ million years, the period that includes all activities of the human species. These materials are at the surface of the earth. They make up the 'ground' on which we walk, the 'dirt' in which we dig foundations, and the 'soil' in which we grow crops. Most of our human activity is related in one way or another to these surface materials that are referred to collectively by many geologists as regolith, the mantle of fragmental and generally unconsolidated material that overlies the bedrock foundation of the continent. The maps were compiled at 1:1,000,000 scale. This map is a product of collaboration of the Ontario Geological Survey, the University of Michigan, and the U.S. Geological Survey, and is designed for both scientific and practical purposes. It was prepared in two stages. First, separate maps and map explanations were prepared by the compilers. Second, the maps were combined, integrated, and supplemented by the editor. Map unit symbols were revised to a uniform system of classification and the map unit descriptions were prepared by the editor from information received from the compilers and from additional sources listed under Sources of Information. Diagrams accompanying the map were prepared by the editor. For scientific purposes, the map differentiates Quaternary surficial deposits on the basis of lithology or composition, texture or particle size, structure, genesis, stratigraphic relationships, engineering geologic properties, and relative age, as shown on the correlation diagram and indicated in the map unit descriptions. Deposits of some constructional landforms, such as kame moraine deposits, are distinguished as map units. Deposits of

  11. USGS Governmental Unit Boundaries Overlay Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Governmental Unit Boundaries service from The National Map (TNM) represents major civil areas for the Nation, including States or Territories, counties (or...

  12. Modern aerial gamma-ray spectrometry and regional potassium map of the conterminous United States

    Science.gov (United States)

    Duval, Joseph S.

    1990-01-01

    Aerial gamma-ray surveys of the natural environment measure the flux of gamma rays produced by the radioactive decay of 40K, 214Bi, and 208Tl in the upper 10–20 cm of surface materials. 40K is a radioactive potassium isotope which can be used to estimate the total amount of potassium in the soils and rocks. 214Bi is a decay product of the 238U radioactive decay series and is used to estimate the uranium concentrations, and 208Tl, a decay product of the 232Th radioactive decay series, is used to estimate thorium concentrations. Aerial gamma-ray data covering the 48 contiguous states of the United States have been compiled to produce maps showing the distributions of equivalent uranium, equivalent thorium, and potassium. This compilation involved processing the aerial survey data from about 470 1° × 2° quadrangle maps.
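
    The reduction described above, from window count rates in the 40K, 214Bi, and 208Tl channels to potassium, equivalent-uranium, and equivalent-thorium concentrations, can be sketched as a simple calibration step. This is an illustrative simplification, not the USGS processing chain: real reduction also involves background subtraction, spectral stripping, and height attenuation, and the sensitivity constants below are hypothetical placeholders.

```python
# Hedged sketch: convert corrected spectral-window count rates (counts
# per second) to apparent ground concentrations using per-window
# sensitivity constants. The constants are ILLUSTRATIVE, not calibrated.

SENSITIVITY = {
    "K":   50.0,  # cps per % K    (40K window)
    "eU":   6.0,  # cps per ppm eU (214Bi window)
    "eTh":  4.0,  # cps per ppm eTh (208Tl window)
}

def apparent_concentrations(window_counts):
    """Divide each window's count rate by its sensitivity constant."""
    return {elem: window_counts[elem] / s
            for elem, s in SENSITIVITY.items()}

conc = apparent_concentrations({"K": 100.0, "eU": 12.0, "eTh": 32.0})
print(conc)  # {'K': 2.0, 'eU': 2.0, 'eTh': 8.0}
```

    In a real survey the same conversion would be applied per flight-line sample before gridding the values into the quadrangle maps.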

  13. Geologic map of the Zarkashan-Anguri copper and gold deposits, Ghazni Province, Afghanistan, modified from the 1968 original map compilation of E.P. Meshcheryakov and V.P. Sayapin

    Science.gov (United States)

    Peters, Stephen G.; Stettner, Will R.; Masonic, Linda M.; Moran, Thomas W.

    2011-01-01

    This map is a modified version of Geological map of the area of Zarkashan-Anguri gold deposits, scale 1:50,000, which was compiled by E.P. Meshcheryakov and V.P. Sayapin in 1968. Scientists from the U.S. Geological Survey, in cooperation with the Afghan Geological Survey and the Task Force for Business and Stability Operations of the U.S. Department of Defense, studied the original document and related reports and also visited the field area in April 2010. This modified map, which includes a cross section, illustrates the geologic setting of the Zarkashan-Anguri copper and gold deposits. The map reproduces the topology (contacts, faults, and so forth) of the original Soviet map and cross section and includes modifications based on our examination of that and other documents, and based on observations made and sampling undertaken during our field visit. (Refer to the Introduction and the References in the Map PDF for an explanation of our methodology and for complete citations of the original map and related reports.) Elevations on the cross section are derived from the original Soviet topography and may not match the newer topography used on the current map.

  14. Quaternary geologic map of the White Lake 4° x 6° quadrangle, United States

    Science.gov (United States)

    State compilations by Pope, David E.; Gilliland, William A.; Wermund, E.G.; edited and integrated by Richmond, Gerald Martin; Weide, David L.; Moore, David W.; Digital edition by Bush, Charles A.

    1990-01-01

    This map is part of the Quaternary Geologic Atlas of the United States (I-1420). It was first published as a printed edition in 1990. The geologic data have now been captured digitally and are presented here along with images of the printed map sheet and component parts as PDF files. The Quaternary Geologic Map of the White Lake 4° x 6° Quadrangle was mapped as part of the Quaternary Geologic Atlas of the United States. The atlas was begun as an effort to depict the areal distribution of surficial geologic deposits and other materials that accumulated or formed during the past 2+ million years, the period that includes all activities of the human species. These materials are at the surface of the Earth. They make up the ground on which we walk, the dirt in which we dig foundations, and the soil in which we grow crops. Most of our human activity is related in one way or another to these surface materials that are referred to collectively by many geologists as regolith, the mantle of fragmental and generally unconsolidated material that overlies the bedrock foundation of the continent. The maps were compiled at 1:1,000,000 scale. In recent years, surficial deposits and materials have become the focus of much interest by scientists, environmentalists, governmental agencies, and the general public. They are the foundations of ecosystems, the materials that support plant growth and animal habitat, and the materials through which travels much of the water required for our agriculture, our industry, and our general well being. They also are materials that easily can become contaminated by pesticides, fertilizers, and toxic wastes. In this context, the value of the surficial geologic map is evident.

  15. Compilation of Flash Flood Risk Maps for the Area around the East Trunk Canal

    Institute of Scientific and Technical Information of China (English)

    郭梁

    2012-01-01

    Flash floods have long been the most serious natural disaster in mountain areas, and flash flood control is the focus of mountain disaster prevention. A flash flood risk map is a thematic map that shows, in chart form, the distribution, inundated extent, water depth, and discharge of a flood of a given frequency; compiling such maps provides a scientific basis for developing flood control plans in mountain areas. Taking the east trunk canal area of the Hedong Irrigation District in Ningxia as an example, this paper describes the specific steps for preparing flash flood risk maps in that area and analyzes the difficulties encountered in their compilation.
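
    The mapping step the record describes, turning modelled water depths for a flood of a given frequency into the classes a risk map displays, can be sketched as a small classifier. The depth breaks below are hypothetical illustrations, not values from the paper.

```python
# Hedged sketch: assign each grid cell's modelled inundation depth (m)
# to a depth class for risk-map rendering. Class breaks are ILLUSTRATIVE.

def depth_class(depth_m):
    """Map an inundation depth in metres to a rendering class."""
    if depth_m <= 0.0:
        return "dry"
    elif depth_m < 0.5:
        return "low"
    elif depth_m < 1.5:
        return "medium"
    else:
        return "high"

# Example: a row of modelled depths for, say, a 1%-frequency flood.
depths = [0.0, 0.3, 1.0, 2.4]
print([depth_class(d) for d in depths])  # ['dry', 'low', 'medium', 'high']
```

    A full workflow would repeat this over every cell of the hydraulic model grid, once per flood frequency, to produce one risk layer per return period.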

  16. Digital Compilation of "Preliminary Map of Landslide Deposits in Santa Cruz County, California, By Cooper-Clark and Associates, 1975": A Digital Map Database

    Science.gov (United States)

    Report by Roberts, Sebastian; Barron, Andrew D.; Preface by Brabb, Earl E.; Pike, Richard J.

    1998-01-01

    A 1:62,500-scale black-and-white map identifying some 2,000 landslides of various types in Santa Cruz County, California, has been converted to a digital-map database that can be acquired from the U.S. Geological Survey over the Internet or on magnetic tape.

  17. Improved method for drawing of a glycan map, and the first page of glycan atlas, which is a compilation of glycan maps for a whole organism.

    Science.gov (United States)

    Natsuka, Shunji; Masuda, Mayumi; Sumiyoshi, Wataru; Nakakita, Shin-ichi

    2014-01-01

    Glycan Atlas is a set of glycan maps over the whole body of an organism. The glycan map, which includes data on glycan structure and quantity, displays the micro-heterogeneity of the glycans in a tissue, an organ, or cells. Two-dimensional glycan mapping is widely used for structure analysis of N-linked oligosaccharides on glycoproteins. In this study we developed a comprehensive method for the mapping of both N- and O-glycans, with and without sialic acid. Mapping data for 150 standard pyridylaminated glycans were collected. The empirical additivity rule proposed in earlier reports could be adapted to this extended glycan map. The adapted rule is that the elution time of pyridylamino glycans on high-performance liquid chromatography (HPLC) is expected to be the simple sum of the partial elution times assigned to each monosaccharide residue. The comprehensive mapping method developed in this study is a powerful tool for describing the micro-heterogeneity of the glycans. Furthermore, we prepared 42 pyridylamino (PA-) glycans from human serum and were able to draw the map of human serum N- and O-glycans as an initial step of Glycan Atlas editing.
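
    The empirical additivity rule in this record, that a PA-glycan's HPLC elution time is the simple sum of partial elution times assigned to its monosaccharide residues, lends itself to a compact sketch. The residue contributions below are hypothetical placeholders, not the calibrated values from the study.

```python
# Hedged sketch of the additivity rule: predicted elution time is the
# sum over residues of (partial elution time x residue count).
# Partial times (minutes) are ILLUSTRATIVE, not from the paper.

PARTIAL_ELUTION = {
    "Man":    1.2,
    "GlcNAc": 1.5,
    "Gal":    1.1,
    "Fuc":    0.8,
    "NeuAc":  2.0,
}

def predicted_elution_time(composition):
    """Predict HPLC elution time from a residue-count composition."""
    return sum(PARTIAL_ELUTION[residue] * count
               for residue, count in composition.items())

# Example: a hypothetical glycan with 3 mannose and 2 GlcNAc residues.
core = {"Man": 3, "GlcNAc": 2}
print(round(predicted_elution_time(core), 2))  # 3*1.2 + 2*1.5 = 6.6
```

    Fitting the partial elution times against a panel of standards (the 150 PA-glycans here) is what turns a rule like this into a usable prediction tool.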

  18. Improved method for drawing of a glycan map, and the first page of glycan atlas, which is a compilation of glycan maps for a whole organism.

    Directory of Open Access Journals (Sweden)

    Shunji Natsuka

    Glycan Atlas is a set of glycan maps over the whole body of an organism. The glycan map, which includes data on glycan structure and quantity, displays the micro-heterogeneity of the glycans in a tissue, an organ, or cells. Two-dimensional glycan mapping is widely used for structure analysis of N-linked oligosaccharides on glycoproteins. In this study we developed a comprehensive method for the mapping of both N- and O-glycans, with and without sialic acid. Mapping data for 150 standard pyridylaminated glycans were collected. The empirical additivity rule proposed in earlier reports could be adapted to this extended glycan map. The adapted rule is that the elution time of pyridylamino glycans on high-performance liquid chromatography (HPLC) is expected to be the simple sum of the partial elution times assigned to each monosaccharide residue. The comprehensive mapping method developed in this study is a powerful tool for describing the micro-heterogeneity of the glycans. Furthermore, we prepared 42 pyridylamino (PA-) glycans from human serum and were able to draw the map of human serum N- and O-glycans as an initial step of Glycan Atlas editing.

  19. Geologic map of the Ahankashan-Rakhna basin, Badghis, Ghor, and Herat Provinces, Afghanistan, modified from the 1974 original map compilation of Y.I. Shcherbina and others

    Science.gov (United States)

    Tucker, Robert D.; Stettner, Will R.; Masonic, Linda M.; Bogdanow, Anya K.

    2014-01-01

    This geologic map of the Ahankashan-Rakhna basin, Afghanistan, is a redrafted and modified version of the Geological map of the area of Ahankashan-Rakhna basin, scale 1:50,000 and Geological map of the Ahankashan area with data on mineral resources, scale 1:12,000 from Shcherbina and others (1974) (Soviet report no. 0822). That unpublished Soviet report contains the original maps and cross sections, which were prepared in cooperation with the Ministry of Mines and Industries of the Republic of Afghanistan in Kabul during 1974 under contract no. 50728 (Technoexport, USSR). The redrafted maps and cross sections in this USGS publication illustrate the geology of the Ahankashan and Rakhna basins, located within Badghis, Ghor, and Herat Provinces.

  20. Geologic map of Kundelan ore deposits and prospects, Zabul Province, Afghanistan; modified from the 1971 original map compilations of K.I. Litvinenko and others

    Science.gov (United States)

    Tucker, Robert D.; Peters, Stephen G.; Stettner, Will R.; Masonic, Linda M.; Moran, Thomas W.

    2015-10-26

    This map and cross sections are redrafted and modified versions of the Geological map of the Kundelan ore deposit area, scale 1:10,000 (graphical supplement no. 18) and the Geological map of the Kundelan deposits, scale 1:2,000 (graphical supplement no. 3), both contained in an unpublished Soviet report by Litvinenko and others (1971) (report no. 0540). The unpublished Soviet report was prepared in cooperation with the Ministry of Mines and Industries of the Royal Government of Afghanistan in Kabul during 1971. This redrafted map and cross sections illustrate the geology of the main Kundelan copper-gold skarn deposit, located within the Kundelan copper and gold area of interest (AOI), Zabul Province, Afghanistan. Areas of interest (AOIs) of non-fuel mineral resources within Afghanistan were first described and defined by Peters and others (2007) and later by the work of Peters and others (2011a). The location of the main Kundelan copper-gold skarn deposit (area of this map) and the Kundelan copper and gold AOI is shown on the index map provided on this map sheet.

  1. THE COMPILATION OF A DTM AND A NEW SATELLITE IMAGE MAP FOR KING GEORGE ISLAND, ANTARCTICA

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    An improved topographic database for King George Island, one of the most frequently visited regions in Antarctica, is presented. A first step consisted in combining data from differential GPS surveys gained during the austral summers 1997-1998 and 1999-2000 with the current coastline from a SPOT satellite image mosaic, topographic information from existing maps, and the Antarctic Digital Database. From these data sets, a digital terrain model (DTM) was generated using Arc/Info GIS. In a second step, a satellite image map at the scale 1∶100 000 was assembled from contour lines derived from the DTM and the satellite mosaic. A lack of accurate topographic information in the eastern part of the island was identified. Additional topographic surveying or SAR interferometry should be used to improve the data quality in that area. The GIS-integrated database will be indispensable for glaciological and climatological studies and for administrative and scientific purposes. In the future, the application of GIS techniques will be mandatory for environmental impact studies and environmental monitoring, as well as for management plans on King George Island.

  2. Digital Compilation of 1∶1 000 000 Geological and Geophysical Map Series of China and Adjacent Regions on MapGIS Platform

    Institute of Scientific and Technical Information of China (English)

    温珍河; 张训华; 杨金玉; 尹延鸿; 邱燕; 王乃东; 张明

    2011-01-01

    In this paper we summarize the techniques used for digital map compilation for the marine territory of China. The new version of the 1∶1 000 000 geological and geophysical map series of China Seas and Adjacent Regions includes five maps, i.e., the Bouguer gravity anomaly map, the spatial gravity anomaly map, the magnetic anomaly map, the surface sediment distribution map, and the regional geotectonic map. In addition to the China seas, the mapping area also covers part of the land and adjacent areas. The geological and geophysical maps had not been renewed since the last map series was published 20 years ago, and the maps published then were drawn by hand; neither their precision nor their quality can meet the demands of users. This time we compiled the maps on the MapGIS platform, using computers, advanced digitizers, and other facilities. First, using digital map compilation technology, we standardized and unified all the data, including both the old and new data and the data collected with different methods and from different channels. Rational fitting techniques and reliable coordinate transforms were used to guarantee the quality of the maps. In the process of map compilation, a GIS database was constructed. All the maps, including the base map and the thematic maps, were drawn on the MapGIS platform using points, lines, and polygons as legend elements, and layers were designed and constructed according to the data available and the contents each map needs. The use of the MapGIS platform made it possible for us to ensure the precision of the maps and to enhance the efficiency of map compilation. It is believed that the compilation and publication of the new version of the 1∶1 000 000 Geological and Geophysical Map Series of China Seas and Adjacent Regions will help readers better understand the basic geological patterns of China.

  3. Seismic-hazard maps for the conterminous United States, 2014

    Science.gov (United States)

    Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter M.; Mueller, Charles S.; Haller, Kathleen M.; Frankel, Arthur D.; Zeng, Yuehua; Rezaeian, Sanaz; Harmsen, Stephen C.; Boyd, Oliver S.; Field, Edward H.; Chen, Rui; Luco, Nicolas; Wheeler, Russell L.; Williams, Robert A.; Olsen, Anna H.; Rukstales, Kenneth S.

    2015-01-01

    The maps presented here provide an update to the 2008 data contained in U.S. Geological Survey Scientific Investigations Map 3195 (http://pubs.usgs.gov/sim/3195/). Probabilistic seismic-hazard maps were prepared for the conterminous United States for 2014, portraying peak horizontal acceleration and horizontal spectral response acceleration for 0.2- and 1.0-second periods with probabilities of exceedance of 10 percent in 50 years and 2 percent in 50 years. All of the maps were prepared by combining the hazard derived from spatially smoothed historical seismicity with the hazard from fault-specific sources. The acceleration values contoured are for the random horizontal component. The reference site condition is firm rock, defined as having an average shear-wave velocity of 760 m/s in the top 30 meters, corresponding to the boundary between NEHRP (National Earthquake Hazards Reduction Program) site classes B and C.
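The quoted exceedance probabilities translate into annual rates under the usual Poisson assumption; a quick check of the implied mean return periods (standard hazard arithmetic, not part of the USGS product itself):

```python
import math

def annual_rate(p_exceed, years):
    """Annual exceedance rate for probability p_exceed over `years`,
    assuming a Poisson occurrence model: p = 1 - exp(-rate * years)."""
    return -math.log(1.0 - p_exceed) / years

# 2% in 50 years corresponds to roughly a 2,475-year mean return period;
# 10% in 50 years to roughly 475 years.
print(round(1.0 / annual_rate(0.02, 50)))  # 2475
print(round(1.0 / annual_rate(0.10, 50)))  # 475
```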

  4. A proposal of Potentially Meaningful Teaching Unit using concept maps

    Directory of Open Access Journals (Sweden)

    Thaís Rafaela Hilger

    2013-12-01

    Full Text Available This paper presents preliminary results from the implementation of a Potentially Meaningful Teaching Unit in four third-grade classes of a public secondary school in the city of Bagé (Rio Grande do Sul). The proposed content deals with concepts related to quantum physics (quantization, the uncertainty principle, state, and superposition of states), presented according to the eight-step sequence of a Potentially Meaningful Teaching Unit and aimed at meaningful learning of these concepts. Mental maps and concept maps produced in pairs are analyzed in this work, as well as the comparison between them. Some comments from students about their progress in understanding the concepts covered in the proposal are also presented. The proposal was well received and, although the study is still in progress and part of a broader research project, it already provides evidence of meaningful learning, which is the goal of a Potentially Meaningful Teaching Unit.

  5. Mapping Curie temperature depth in the western United States with a fractal model for crustal magnetization

    Science.gov (United States)

    Bouligand, C.; Glen, J.M.G.; Blakely, R.J.

    2009-01-01

    We have revisited the problem of mapping depth to the Curie temperature isotherm from magnetic anomalies in an attempt to provide a measure of crustal temperatures in the western United States. Such methods are based on the estimation of the depth to the bottom of magnetic sources, which is assumed to correspond to the temperature at which rocks lose their spontaneous magnetization. In this study, we test and apply a method based on the spectral analysis of magnetic anomalies. Early spectral analysis methods assumed that crustal magnetization is a completely uncorrelated function of position. Our method incorporates a more realistic representation where magnetization has a fractal distribution defined by three independent parameters: the depths to the top and bottom of magnetic sources and a fractal parameter related to the geology. The predictions of this model are compatible with radial power spectra obtained from aeromagnetic data in the western United States. Model parameters are mapped by estimating their value within a sliding window swept over the study area. The method works well on synthetic data sets when one of the three parameters is specified in advance. The application of this method to western United States magnetic compilations, assuming a constant fractal parameter, allowed us to detect robust long-wavelength variations in the depth to the bottom of magnetic sources. Depending on the geologic and geophysical context, these features may result from variations in depth to the Curie temperature isotherm, depth to the mantle, depth to the base of volcanic rocks, or geologic settings that affect the value of the fractal parameter. Depth to the bottom of magnetic sources shows several features correlated with prominent heat flow anomalies. It also shows some features absent in the map of heat flow. Independent geophysical and geologic data sets are examined to determine their origin, thereby providing new insights on the thermal and geologic crustal
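The spectral approach described above starts from a radially averaged power spectrum of gridded magnetic anomalies. As an illustration only (this is the generic first step, not the authors' fractal-parameter inversion), a minimal numpy sketch:

```python
import numpy as np

def radial_power_spectrum(grid, nbins=20):
    """Radially averaged power spectrum of a 2-D gridded anomaly.
    Spectral Curie-depth methods fit theoretical curves to this quantity."""
    f = np.fft.fftshift(np.fft.fft2(grid))
    power = (np.abs(f) ** 2).ravel()
    ny, nx = grid.shape
    ky, kx = np.indices((ny, nx))
    k = np.hypot(kx - nx // 2, ky - ny // 2).ravel()  # radial wavenumber
    edges = np.linspace(0.0, k.max() + 1e-9, nbins + 1)
    idx = np.digitize(k, edges) - 1                   # bin index 0..nbins-1
    sums = np.bincount(idx, weights=power, minlength=nbins)
    counts = np.bincount(idx, minlength=nbins)
    return sums / np.maximum(counts, 1)               # mean power per bin

# Synthetic demonstration on random noise (approximately white spectrum).
rng = np.random.default_rng(0)
spec = radial_power_spectrum(rng.standard_normal((64, 64)))
print(spec.shape)  # (20,)
```

In practice a sliding window is swept over the anomaly compilation and a model spectrum (parameterized by depth to top, depth to bottom, and the fractal parameter) is fitted to each windowed spectrum.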

  6. Landscape similarity, retrieval, and machine mapping of physiographic units

    Science.gov (United States)

    Jasiewicz, Jaroslaw; Netzel, Pawel; Stepinski, Tomasz F.

    2014-09-01

    We introduce landscape similarity - a numerical measure that assesses affinity between two landscapes on the basis of similarity between the patterns of their constituent landform elements. Such a similarity function provides core technology for a landscape search engine - an algorithm that parses the topography of a study area and finds all places with landscapes broadly similar to a landscape template. A landscape search can yield answers to a query in real time, enabling a highly effective means to explore large topographic datasets. In turn, a landscape search facilitates auto-mapping of physiographic units within a study area. The country of Poland serves as a test bed for these novel concepts. The topography of Poland is given by a 30 m resolution DEM. The geomorphons method is applied to this DEM to classify the topography into ten common types of landform elements. A local landscape is represented by a square tile cut out of a map of landform elements. A histogram of cell-pair features is used to succinctly encode the composition and texture of a pattern within a local landscape. The affinity between two local landscapes is assessed using the Wave-Hedges similarity function applied to the two corresponding histograms. For a landscape search the study area is organized into a lattice of local landscapes. During the search the algorithm calculates the similarity between each local landscape and a given query. Our landscape search for Poland is implemented as a GeoWeb application called TerraEx-Pl and is available at http://sil.uc.edu/. Given a sample, or a number of samples, from a target physiographic unit the landscape search delineates this unit using the principles of supervised machine learning. Repeating this procedure for all units yields a complete physiographic map. The application of this methodology to topographic data of Poland results in the delineation of nine physiographic units. The resultant map bears a close resemblance to a conventional
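The pattern-affinity step can be illustrated with a minimal sketch. The Wave-Hedges function below uses one common formulation (per-bin min/max ratio averaged over bins); the paper's exact normalization and its cell-pair histogram construction are not reproduced here:

```python
def wave_hedges_similarity(h1, h2):
    """Wave-Hedges similarity between two histograms of equal length.
    Per-bin affinity is min/max (taken as 1.0 when both bins are empty);
    the result is the mean over bins, in [0, 1]."""
    assert len(h1) == len(h2)
    score = 0.0
    for a, b in zip(h1, h2):
        score += 1.0 if max(a, b) == 0 else min(a, b) / max(a, b)
    return score / len(h1)

# Identical landform-element patterns score 1.0; disjoint ones score 0.0.
print(wave_hedges_similarity([4, 2, 0], [4, 2, 0]))  # 1.0
print(wave_hedges_similarity([1, 0], [0, 1]))        # 0.0
```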

  7. A PROGRESS REVIEW OF TECTONIC MAP COMPILATION IN CHINA IN PERSPECTIVE OF OFFSHORE MAPPING

    Institute of Scientific and Technical Information of China (English)

    郭兴伟; 张训华; 王忠蕾; 温珍河; 杨金玉; 尹航; 侯方辉

    2012-01-01

    A tectonic map reflects its authors' understanding of tectonic evolution processes and dynamic mechanisms, and is usually compiled under the guidance of a particular tectonic theory. After the era of competing schools of tectonic map compilation in the 1970s and 1980s, the Plate Tectonics Theory was introduced and has been accepted by most tectonic geologists of the world. However, arguments remain in tectonic map compilation due to the diverse understanding of the mapping elements selected by different authors. The compilation of a tectonic map lays its basis on data accumulation, and a great amount of high-quality geological and geophysical data has been acquired recently in China, in particular in the China seas. Problems remain in tectonic mapping of the sea areas, however: the survey and research basis offshore is still weak compared with that on land, and the marine survey and research results of the past ten years have not yet been integrated into a tectonic map covering a large region, in particular one covering all of the China seas or the whole of China's land and sea areas. Integrated study of multiple geophysical data sets can serve as a link for comparing and studying the tectonics of China's seas and land, and the combined application of geological, geophysical, and geochemical methods and data can serve as an important basis for tectonic map compilation. From the perspective of a marine geologist, and based on an understanding of the history, viewpoints, data, achievements, and future directions of tectonic map compilation in China, we hope that the ongoing compilation of the "Tectonic Framework Map of China's Seas and Land Areas" will provide a new approach to compiling the tectonic map of China.

  8. Digital Geologic Map of the American Camp Unit and vicinity, Washington (NPS, GRD, GRE, SAJH, SJIS digital map)

    Data.gov (United States)

    National Park Service, Department of the Interior — The Digital Geologic Map of the American Camp Unit and vicinity, Washington is composed of GIS data layers complete with ArcMap 9.2 layer (.LYR) files, two ancillary...

  9. USGS Small-scale Dataset - Global Map: Ports of the United States 201406 Shapefile

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing ferry ports in the United States and Puerto Rico. The data are a modified version of the National Atlas of the United...

  10. Lattice Simulations using OpenACC compilers

    CERN Document Server

    Majumdar, Pushan

    2013-01-01

    OpenACC compilers allow one to use Graphics Processing Units without having to write explicit CUDA code. Programs can be modified incrementally using OpenMP-like directives, which cause the compiler to generate CUDA kernels to be run on the GPUs. In this article we look at the performance gain in lattice simulations with dynamical fermions using OpenACC compilers.

  11. Geoelectric hazard maps for the continental United States

    Science.gov (United States)

    Love, Jeffrey J.; Pulkkinen, Antti; Bedrosian, Paul A.; Jonas, Seth; Kelbert, Anna; Rigler, E. Joshua; Finn, Carol A.; Balch, Christopher C.; Rutledge, Robert; Waggel, Richard M.; Sabata, Andrew T.; Kozyra, Janet U.; Black, Carrie E.

    2016-09-01

    In support of a multiagency project for assessing induction hazards, we present maps of extreme-value geoelectric amplitudes over about half of the continental United States. These maps are constructed using a parameterization of induction: estimates of Earth surface impedance, obtained at discrete geographic sites from magnetotelluric survey data, are convolved with latitude-dependent statistical maps of extreme-value geomagnetic activity, obtained from decades of magnetic observatory data. Geoelectric amplitudes are estimated for geomagnetic waveforms having 240 s sinusoidal period and amplitudes over 10 min that exceed a once-per-century threshold. As a result of the combination of geographic differences in geomagnetic activity and Earth surface impedance, once-per-century geoelectric amplitudes span more than 2 orders of magnitude and are an intricate function of location. For north-south induction, once-per-century geoelectric amplitudes across large parts of the United States have a median value of 0.26 V/km; for east-west geomagnetic variation the median value is 0.23 V/km. At some locations, once-per-century geoelectric amplitudes exceed 3 V/km.
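The mapped quantity can be illustrated with a back-of-the-envelope plane-wave estimate; this is a simplification of the site-specific impedance convolution described above, and the numbers below are illustrative, not survey values:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def geoelectric_amplitude(z_ohm, b_nt):
    """Plane-wave estimate |E| = |Z| * |B| / mu0, with the scalar
    magnetotelluric impedance |Z| in ohms and the geomagnetic
    disturbance amplitude |B| in nT; returns V/km."""
    b_tesla = b_nt * 1e-9
    e_v_per_m = z_ohm * b_tesla / MU0
    return e_v_per_m * 1000.0  # convert V/m to V/km

# Illustrative impedance and 240-s disturbance amplitude (not survey data):
print(round(geoelectric_amplitude(0.01, 300.0), 2))  # 2.39 V/km
```

The geographic intricacy noted in the abstract comes from |Z| varying by orders of magnitude between magnetotelluric sites and |B| varying with geomagnetic latitude.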

  12. Mapping the Mineral Resource Base for Mineral Carbon-Dioxide Sequestration in the Conterminous United States

    Science.gov (United States)

    Krevor, S.C.; Graves, C.R.; Van Gosen, B. S.; McCafferty, A.E.

    2009-01-01

    This database provides information on the occurrence of ultramafic rocks in the conterminous United States that are suitable for sequestering captured carbon dioxide in mineral form, also known as mineral carbon-dioxide sequestration. Mineral carbon-dioxide sequestration is a proposed greenhouse gas mitigation technology whereby carbon dioxide (CO2) is disposed of by reacting it with calcium or magnesium silicate minerals to form a solid magnesium or calcium carbonate product. The technology offers a large capacity to permanently store CO2 in an environmentally benign form via a process that takes little effort to verify or monitor after disposal. These characteristics are unique among its peers in greenhouse gas disposal technologies. The 2005 Intergovernmental Panel on Climate Change report on Carbon Dioxide Capture and Storage suggested that a major gap in mineral CO2 sequestration is locating the magnesium-silicate bedrock available to sequester the carbon dioxide. It is generally known that silicate minerals with high concentrations of magnesium are suitable for mineral carbonation. However, no assessment has been made in the United States that details their geographical distribution and extent, nor has anyone evaluated their potential for use in mineral carbonation. Researchers at Columbia University and the U.S. Geological Survey have developed a digital geologic database of ultramafic rocks in the conterminous United States. Data were compiled from varied-scale geologic maps of magnesium-silicate ultramafic rocks. The focus of our national-scale map is entirely on ultramafic rock types, which typically consist primarily of olivine- and serpentine-rich rocks. These rock types are potentially suitable as source material for mineral CO2 sequestration.
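The storage capacity implied by mineral carbonation follows from simple stoichiometry; for example, for forsterite olivine the idealized reaction is Mg2SiO4 + 2 CO2 -> 2 MgCO3 + SiO2. A sketch with approximate molar masses (illustrative, not an assessment figure):

```python
# Mass of CO2 mineralized per tonne of forsterite (Mg2SiO4), from the
# stoichiometry Mg2SiO4 + 2 CO2 -> 2 MgCO3 + SiO2.
# Molar masses in g/mol (approximate).
M_FORSTERITE = 2 * 24.305 + 28.086 + 4 * 15.999  # ~140.69 g/mol
M_CO2 = 12.011 + 2 * 15.999                      # ~44.01 g/mol

def co2_capacity_per_tonne(rock_molar_mass, co2_moles_per_mole):
    """Tonnes of CO2 consumed per tonne of silicate, assuming complete
    carbonation at the given stoichiometric ratio."""
    return co2_moles_per_mole * M_CO2 / rock_molar_mass

print(round(co2_capacity_per_tonne(M_FORSTERITE, 2), 2))  # ~0.63 t CO2/t rock
```

Real ultramafic rocks carbonate incompletely, so assessed capacities are lower than this stoichiometric ceiling.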

  13. The Compiler Forest

    OpenAIRE

    Budiu, Mihai; Galenson, Joel; Plotkin, Gordon D.

    2013-01-01

    We address the problem of writing compilers targeting complex execution environments, such as computer clusters composed of machines with multi-core CPUs. To that end we introduce partial compilers. These compilers can pass sub-programs to several child (partial) compilers, combining the code generated by their children to generate the final target code. We define a set of high-level polymorphic operations manipulating both compilers and partial compilers as first-class values. These mechanis...

  14. Appendix B: Description of Map Units for Northeast Asia Summary Geodynamics Map

    Science.gov (United States)

    Parfenov, Leonid M.; Badarch, Gombosuren; Berzin, Nikolai A.; Hwang, Duk-Hwan; Khanchuk, Alexander I.; Kuzmin, Mikhail I.; Nokleberg, Warren J.; Obolenskiy, Alexander A.; Ogasawara, Masatsugu; Prokopiev, Andrei V.; Rodionov, Sergey M.; Smelov, Alexander P.; Yan, Hongquan

    2009-01-01

    The major purposes of this chapter are to provide (1) an overview of the regional geology, tectonics, and metallogenesis of Northeast Asia for readers who are unfamiliar with the region, (2) a general scientific introduction to the succeeding chapters of this volume, and (3) an overview of the methodology of metallogenic and tectonic analysis used in this study. We also describe how a high-quality metallogenic and tectonic analysis, including construction of an associated metallogenic-tectonic model, will greatly benefit other mineral resource studies, including synthesis of mineral-deposit models; improve prediction of undiscovered mineral deposits as part of quantitative mineral-resource-assessment studies; assist land-use and mineral-exploration planning; improve interpretations of the origins of host rocks, mineral deposits, and metallogenic belts; and suggest new research. Research on the metallogenesis and tectonics of such major regions as Northeast Asia (eastern Russia, Mongolia, northern China, South Korea, and Japan) and the Circum-North Pacific (the Russian Far East, Alaska, and the Canadian Cordillera) requires a complex methodology including (1) definitions of key terms, (2) compilation of a regional geologic base map that can be interpreted according to modern tectonic concepts and definitions, (3) compilation of a mineral-deposit database that enables a determination of mineral-deposit models and clarification of the relations of deposits to host rocks and tectonic origins, (4) synthesis of a series of mineral-deposit models that characterize the known mineral deposits and inferred undiscovered deposits in the region, (5) compilation of a series of metallogenic belts constructed on the regional geologic base map, and (6) construction of a unified metallogenic and tectonic model. The summary of regional geology and metallogenesis presented here is based on publications of the major international collaborative studies of the metallogenesis and

  15. Improving the Magnetic Anomaly Map of the United States

    Science.gov (United States)

    McIndoo, M.; Shaw, A.; Batir, J.; Ravat, D.; Milligan, P.; Kucks, R. P.; Hill, P.; Hildenbrand, T. G.

    2007-05-01

    We have improved the magnetic anomaly map of the United States using National Uranium Resource Evaluation (NURE) aeromagnetic surveys collected during the 1970s. Previous versions of these data, processed using IGRF/DGRF, do not mesh well at the survey boundaries because of leveling artifacts. Similarly, the U.S. component of the North American magnetic anomaly map has long-wavelength errors caused by warping of hundreds of state and local aeromagnetic surveys during the merging process. The main difference in our processing that has allowed us to retain proper base levels is the use of the temporally continuous main-field Comprehensive Model (CM4) of Sabaka et al. (2004, GJI, 159, 521-547). The advantage of using the NURE surveys is that most of them include time information, and diurnal variation observed with base-station magnetometers has been removed from them. Furthermore, we have cleaned the NURE data by removing many spurious values through visual inspection. Some NURE surveys did not have total-field values or time information. For these surveys, we reintroduced the IGRF for their approximate date and removed the core field determined by CM4. We compare the results of our processing and improvements with U.S. aeromagnetic anomaly data prepared by different merging techniques. The improved map is more suitable for regional geologic and geodynamic interpretations.

  16. Parabolic starlike mappings of the unit ball $B^n$

    Directory of Open Access Journals (Sweden)

    Samira Rahrovi

    2016-02-01

    Full Text Available Let $f$ be a locally univalent function on the unit disk $U$. We consider the normalized extensions of $f$ to the Euclidean unit ball $B^n\subseteq\mathbb{C}^n$ given by $$\Phi_{n,\gamma}(f)(z)=\left(f(z_1),(f'(z_1))^{\gamma}\hat{z}\right),$$ where $\gamma\in[0,1/2]$ and $z=(z_1,\hat{z})\in B^n$, and $$\Psi_{n,\beta}(f)(z)=\left(f(z_1),\left(\frac{f(z_1)}{z_1}\right)^{\beta}\hat{z}\right),$$ in which $\beta\in[0,1]$, $f(z_1)\neq 0$ and $z=(z_1,\hat{z})\in B^n$. In the case $\gamma=1/2$, the function $\Phi_{n,\gamma}(f)$ reduces to the well-known Roper-Suffridge extension operator. By using different methods, we prove that if $f$ is a parabolic starlike mapping on $U$ then $\Phi_{n,\gamma}(f)$ and $\Psi_{n,\beta}(f)$ are parabolic starlike mappings on $B^n$.

  17. Using historical aerial photography and softcopy photogrammetry for waste unit mapping in L Lake.

    Energy Technology Data Exchange (ETDEWEB)

    Christel, L.M.

    1997-10-01

    L Lake was developed as a cooling-water reservoir for the L Reactor at the Savannah River Site. The construction of the lake, which began in the fall of 1984, altered the structure and function of Steel Creek. Completed in the fall of 1985, L Lake has a capacity of 31 million cubic meters and a normal pool of 58 meters. When L Reactor operations ceased in 1988, the water level in the lake still had to be maintained. Site managers are currently trying to determine the feasibility of draining or drawing down the lake in order to save tax dollars. In order to understand the full repercussions of such an undertaking, it was necessary to compile a comprehensive inventory of what the lake bottom looked like prior to filling. Aerial photographs, acquired nine days before the filling of the lake began, were scanned and used for softcopy photogrammetry processing. A one-meter digital elevation model was generated, and a digital orthophoto mosaic was created as the base map for the project. Seven categories of features, including the large waste units used to contain the contaminated soil removed from the dam site, were screen digitized and used to generate accurate maps. Other map features include vegetation waste piles, where contaminated vegetation from the flood plain was contained, and ash piles, which are sites where vegetation debris was burned and then covered with clean soil. For all seven categories, the area of disturbance totaled just over 63 hectares. When the screen digitizing was completed, the elevation at the centroid of each disturbance was determined. Once incorporated into the Savannah River Site Geographical Information System, this information can be used to visualize the various L Lake draw-down scenarios suggested by site managers and to support evaluations of the cost effectiveness of each proposed activity.

  18. The compilation of the lunar digital geological map and a discussion on the tectonic evolution of the moon

    Institute of Scientific and Technical Information of China (English)

    王梁; 丁孝忠; 韩坤英; 庞健峰; 许可娟; 郑洪伟; 吴昊

    2015-01-01

    The compilation of the Lunar Digital Geological Map was based on the scientific exploration data obtained by Chang’E-1 and Chang’E-2, together with other lunar geological data and research results. According to the material compositions, structural elements, and geochronological information of the Moon, the authors compiled the Lunar Geological Map at a scale of 1∶2 500 000 and established a spatial database using the ArcGIS platform. The authors developed a mapping scheme, workflow, and legends for the Lunar Digital Geological Map, and established a spatial database based on the Geodatabase model by compiling and investigating the geological map of a typical region. The database allows the Digital Geological Map to be effectively updated and managed, and thus lays the foundation for comprehensive geological study of the Moon, geological mapping of the whole Moon, and future geological mapping of other celestial bodies. On the basis of the map compilation and comprehensive research on a large amount of lunar geological data, this paper also presents a preliminary discussion of the formation and tectonic evolution of the Moon.

  19. Landslide overview map of the conterminous United States

    Science.gov (United States)

    Radbruch-Hall, Dorothy H.; Colton, Roger B.; Davies, William E.; Lucchitta, Ivo; Skipp, Betty A.; Varnes, David J.

    1982-01-01

    The accompanying landslide overview map of the conterminous United States is one of a series of National Environmental Overview Maps that summarize geologic, hydrogeologic, and topographic data essential to the assessment of national environmental problems. The map delineates areas where large numbers of landslides exist and areas which are susceptible to landsliding. It was prepared by evaluating the geologic map of the United States and classifying the geologic units according to high, medium, or low landslide incidence (number) and high, medium, or low susceptibility to landsliding. Rock types, structures, topography, precipitation, landslide type, and landslide incidence are mentioned for each physical subdivision of the United States. The differences in slope stability between the Colorado Plateau, the Appalachian Highlands, the Coast Ranges of California, and the Southern Rocky Mountains are compared in detail, to illustrate the influence of various natural factors on the types of landsliding that occur in regions having different physical conditions. These four mountainous regions are among the most landslide-prone areas in the United States. The Colorado Plateau is a deformed platform where interbedded sedimentary rocks of varied lithologic properties have been gently warped and deeply eroded. The rocks are extensively fractured. Regional fracture systems, joints associated with individual geologic structures, and joints parallel to topographic surfaces, such as cliff faces, greatly influence slope stability. Detached blocks at the edges of mesas, as well as columns, arched recesses, and many natural arches on the Colorado Plateau, were formed wholly or in part by mass movement. In the Appalachian Highlands, earth flows, debris flows, and debris avalanches predominate in weathered bedrock and colluvium. Damaging debris avalanches result when persistent steady rainfall is followed by a sudden heavy downpour. 
Landsliding in unweathered bedrock is controlled

  20. Crater-based dating of geological units on Mars: methods and application for the new global geological map

    Science.gov (United States)

    Platz, Thomas; Michael, Gregory; Tanaka, Kenneth L.; Skinner, James A.; Fortezzo, Corey M.

    2013-01-01

    The new, post-Viking generation of Mars orbital imaging and topographical data provides significantly higher-resolution details of surface morphologies, which prompted a new effort to photo-geologically map the surface of Mars at 1:20,000,000 scale. Although a relative stratigraphical framework can be compiled from unit superposition relations, it was the ambition of this mapping project to provide absolute unit age constraints through crater statistics. In this study, the crater counting method is described in detail, starting with the selection of image data and type locations (both from the mapper’s and the crater counter’s perspectives) and the identification of impact craters. We describe the criteria used to validate and analyse measured crater populations, and to derive and interpret crater model ages. We provide examples of how geological information about a unit’s resurfacing history can be retrieved from crater size–frequency distributions. Three cases illustrate short-, intermediate-, and long-term resurfacing histories. In addition, we introduce an interpretation-independent visualisation of the crater resurfacing history that uses the reduction of the crater population in a given size range relative to the expected population given the observed crater density at larger sizes. From a set of potential type locations, 48 areas from 22 globally mapped units were deemed suitable for crater counting. Because resurfacing ages were derived from crater statistics, these secondary ages were used to define the unit age rather than the base age. Using the methods described herein, we modelled ages that are consistent with the interpreted stratigraphy. Our derived model ages allow age assignments to be included in unit names. We discuss the limitations of using the crater dating technique for global-scale geological mapping. Finally, we present recommendations for the documentation and presentation of crater statistics in publications.
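The starting point of the crater statistics described above is a cumulative size-frequency distribution. A minimal sketch with synthetic diameters follows; a real crater model age additionally requires fitting a production/chronology function to this distribution, which is not reproduced here:

```python
def cumulative_sfd(diameters_km, area_km2, bins_km):
    """Cumulative crater size-frequency distribution: for each reference
    diameter D in bins_km, the density (per km^2) of counted craters
    with diameter >= D."""
    return [sum(1 for d in diameters_km if d >= b) / area_km2
            for b in bins_km]

# Synthetic example: six craters counted over a 1000 km^2 mapped unit.
craters = [0.6, 0.8, 1.1, 1.5, 2.3, 4.0]
density = cumulative_sfd(craters, 1000.0, [0.5, 1.0, 2.0, 4.0])
print(density)  # [0.006, 0.004, 0.002, 0.001]
```

Resurfacing shows up as a deficit of small craters in this distribution relative to the density extrapolated from larger sizes, which is the signal the visualisation described above exploits.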

  1. Mapping Snow Cover Loss Patterns in the Western United States

    Science.gov (United States)

    Moore, C.; Kampf, S. K.; Richer, E.; Stone, B.

    2011-12-01

Cara Moore, Stephanie Kampf, Eric Richer, and Brandon Stone, Natural Resource Ecology Laboratory, Colorado State University, Fort Collins, CO 80523-1499. The Western United States depends on snowmelt to provide water for industrial, municipal, and agricultural needs. Some areas in this region have observed an increase in the proportion of precipitation falling as rain rather than snow in response to climate warming, a trend that can alter the timing and magnitude of runoff. Transitional snow zones, which lie between lower-elevation intermittent snowpack and higher-elevation persistent snowpack, may be particularly sensitive to changing climate conditions. Snow-covered area is an easily obtainable measurement that can help identify the locations and elevations of these transitional snow zones. The purpose of this study is to improve the understanding of snowpack characteristics in the Western U.S. by mapping snow cover loss patterns using the Moderate Resolution Imaging Spectroradiometer (MODIS) snow-covered area (SCA) product. Snow cover loss patterns can be difficult to compare objectively between regions because spring snow storms lead to abrupt increases and decreases in SCA. Therefore, we develop a curve-fitting snow cover depletion model (SCoDMod) used to derive standardized snow cover loss curves. We fit the model to snow cover patterns within 100-m elevation zones from January 1 through July 19 for each USGS eight-digit hydrologic unit in the Western U.S. We use the model to identify 11-year (2000–2010) average snow cover loss patterns and compare those patterns to snow cover loss behavior in wet and dry years. Model results give maps of average SCA in the Western United States on the first of the month from January to July, as well as maps of the date of SCA loss to 75% (Q75), 50% (Q50), and 25% (Q25) SCA. Results show that the Cascade, Sierra Nevada, and Rocky mountains from Colorado northward retain >90% SCA until March, whereas most parts of lower elevation
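
The Q75/Q50/Q25 loss dates described above can be sketched numerically: given a (smoothed) SCA depletion curve, the date at which cover first drops to each threshold is found by interpolating between daily values. The curve and names below are invented for illustration; this is not SCoDMod itself.

```python
# Hedged sketch: pull threshold-crossing dates (Q75/Q50/Q25) from an
# idealized snow-covered-area depletion curve by linear interpolation.
# The curve shape and function names are illustrative, not SCoDMod.

def loss_date(days, sca, threshold):
    """Return the interpolated day on which SCA first reaches `threshold`."""
    for i in range(len(days) - 1):
        d0, d1, s0, s1 = days[i], days[i + 1], sca[i], sca[i + 1]
        if s0 >= threshold >= s1 and s0 != s1:   # crossing in this interval
            frac = (s0 - threshold) / (s0 - s1)
            return d0 + frac * (d1 - d0)
    return None                                   # never dropped to threshold

# Idealized curve: full cover in early January, melt-out by mid-June.
days = list(range(0, 200, 10))                    # day-of-year, 10-day steps
sca = [max(0.0, 1.0 - (d / 170) ** 3) for d in days]

q75, q50, q25 = (loss_date(days, sca, q) for q in (0.75, 0.50, 0.25))
```

The same routine applied per 100-m elevation zone would yield the kind of loss-date maps the study reports.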

  2. The Design and Compilation of High-Resolution Image Maps of the Districted Cities of Shaanxi Province

    Institute of Scientific and Technical Information of China (English)

    赵力彬; 周晓敏; 赵曦; 李庆东

    2013-01-01

An image map combines the strengths of a remote-sensing image with those of a line map: readers can grasp its content at a glance without specialized map-reading knowledge, which makes such maps widely accepted and popular. Drawing on the design and compilation of high-resolution image maps for the ten districted cities of Shaanxi Province, the authors introduce the overall design concept for image maps and the key technologies and methods of their compilation.

  3. Saline aquifer mapping project in the southeastern United States

    Science.gov (United States)

    Williams, Lester J.; Spechler, Rick M.

    2011-01-01

    In 2009, the U.S. Geological Survey initiated a study of saline aquifers in the southeastern United States to evaluate the potential use of brackish or saline water from the deeper portions of the Floridan aquifer system and the underlying Coastal Plain aquifer system (Fig. 1). The objective of this study is to improve the overall understanding of the available saline water resources for potential future development. Specific tasks are to (1) develop a digital georeferenced database of borehole geophysical data to enable analysis and characterization of saline aquifers (see locations in Fig. 1), (2) identify and map the regional extent of saline aquifer systems and describe the thickness and character of hydrologic units that compose these systems, and (3) delineate salinity variations at key well sites and along section lines to provide a regional depiction of the freshwater-saltwater interfaces. Electrical resistivity and induction logs, coupled with a variety of different porosity logs (sonic, density, and neutron), are the primary types of borehole geophysical logs being used to estimate the water quality in brackish and saline formations. The results from the geophysical log calculations are being compared to available water-quality data obtained from water wells and from drill-stem water samples collected in test wells. Overall, the saline aquifer mapping project is helping to improve the understanding of saline water resources in the area. These aquifers may be sources of large quantities of water that could be treated by using reverse osmosis or similar technologies, or they could be used for aquifer storage and recovery systems.
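
The abstract does not spell out how water quality is computed from the resistivity and porosity logs; the standard tool for that step is Archie's equation, sketched below. The constants a and m and the example interval are common illustrative defaults, not values from the study.

```python
# Hedged sketch of Archie's equation, the standard way to estimate
# formation-water resistivity (a proxy for salinity) from a deep
# resistivity log plus a porosity log in a fully water-saturated zone.
# Constants and example numbers are illustrative defaults only.

def archie_rw(rt_ohmm, porosity, a=1.0, m=2.0):
    """Water resistivity Rw from true resistivity Rt, assuming Sw = 1:
    Rt = a * Rw / phi**m  =>  Rw = Rt * phi**m / a."""
    return rt_ohmm * porosity ** m / a

# Example: a 20 ohm-m interval with 25 % porosity
rw = archie_rw(20.0, 0.25)     # 1.25 ohm-m: relatively fresh water
```

Lower Rw values indicate more saline water; the log-derived estimates would then be checked against the water-well and drill-stem samples the abstract mentions.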

  4. Principles of compilers

    CERN Document Server

    Su, Yunlin

    2011-01-01

    ""Principles of Compilers: A New Approach to Compilers Including the Algebraic Method"" introduces the ideas of the compilation from the natural intelligence of human beings by comparing similarities and differences between the compilations of natural languages and programming languages. The notation is created to list the source language, target languages, and compiler language, vividly illustrating the multilevel procedure of the compilation in the process. The book thoroughly explains the LL(1) and LR(1) parsing methods to help readers to understand the how and why. It not only covers estab

  5. Global Map: 1:1,000,000-Scale Political Areas of the United States - Direct Download

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing the counties and equivalent entities of the United States, Puerto Rico, and the U.S. Virgin Islands. States and the...

  6. USGS Small-scale Dataset - Global Map: Railroad Stations of the United States 201403 Shapefile

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing Amtrak intercity railroad terminals in the United States. The data are a modified version of the National Atlas of...

  7. USGS Small-scale Dataset - Global Map: Airports of the United States 201403 Shapefile

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing airports in the United States, Puerto Rico and the U.S. Virgin Islands. The data are a modified version of the...

  8. A mitotically inheritable unit containing a MAP kinase module.

    Science.gov (United States)

    Kicka, Sébastien; Bonnet, Crystel; Sobering, Andrew K; Ganesan, Latha P; Silar, Philippe

    2006-09-05

Prions are novel kinds of hereditary units, relying solely on proteins, that are infectious and inherited in a non-Mendelian fashion. To date, they are based either on the autocatalytic modification of a 3D conformation or on autocatalytic cleavage. Here, we provide further evidence that in the filamentous fungus Podospora anserina, a MAP kinase cascade is probably able to self-activate and generate C, a hereditary unit that bears many similarities to prions and triggers cell degeneration. We show that in addition to the MAPKKK gene, both the MAPKK and MAPK genes are necessary for the propagation of C, and that overexpression of MAPK, like that of MAPKKK, facilitates the appearance of C. We also show that a correlation exists between the presence of C and localization of the MAPK inside nuclei. These data emphasize the resemblance between prions and a self-positively regulated cascade in terms of their transmission, and thus further expand the concept of protein-based inheritance to regulatory networks that have the ability to self-activate.

  9. Construction experiences from underground works at Forsmark. Compilation Report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders [Vattenfall Power Consultant AB, Stockholm (Sweden); Christiansson, Rolf [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)

    2007-02-15

The main objective with this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily regarding rock support solutions. The authors of this report separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material, as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  10. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. An FPGA combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is, however, not trivial. This article describes the design, implementation, and first results of the compiler.
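
The core idea of such a flow, translating a Python description into VHDL, can be caricatured in a few lines using Python's own ast module. This toy is my sketch of the general approach, not the compiler described in the paper, and the emitted VHDL fragment is purely illustrative.

```python
# Toy illustration of a Python-to-VHDL mapping: walk the AST of a
# simple arithmetic function and emit a VHDL signal assignment.
# This sketch is mine, not the compiler described in the paper.
import ast

VHDL_OPS = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*"}

def emit(node):
    """Recursively translate a Python expression AST into VHDL syntax."""
    if isinstance(node, ast.BinOp):
        return f"({emit(node.left)} {VHDL_OPS[type(node.op)]} {emit(node.right)})"
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.Constant):
        return str(node.value)
    raise NotImplementedError(ast.dump(node))

def to_vhdl(src, out="result"):
    func = ast.parse(src).body[0]       # the FunctionDef node
    ret = func.body[0]                  # assume a single `return expr` body
    assert isinstance(ret, ast.Return)
    return f"{out} <= {emit(ret.value)};"

print(to_vhdl("def mac(a, b, c):\n    return a * b + c"))
# result <= ((a * b) + c);
```

A real HLS compiler must additionally schedule operations, allocate registers, and generate control logic; the point here is only the source-to-HDL mapping step.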

  11. Map of Arsenic concentrations in groundwater of the United States

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The map graphic image at http://water.usgs.gov/GIS/browse/arsenic_map.png illustrates arsenic values, in micrograms per liter, for groundwater samples from about...

  12. Documentation of methods and inventory of irrigation data collected for the 2000 and 2005 U.S. Geological Survey Estimated use of water in the United States, comparison of USGS-compiled irrigation data to other sources, and recommendations for future compilations

    Science.gov (United States)

    Dickens, Jade M.; Forbes, Brandon T.; Cobean, Dylan S.; Tadayon, Saeid

    2011-01-01

Every five years since 1950, the U.S. Geological Survey (USGS) National Water Use Information Program (NWUIP) has compiled water-use information in the United States and published a circular report titled "Estimated use of water in the United States," which includes estimates of water withdrawals by State, sources of water withdrawals (groundwater or surface water), and water-use category (irrigation, public supply, industrial, thermoelectric, and so forth). This report discusses important considerations in estimating irrigated acreage and irrigation withdrawals, including estimates of conveyance loss, irrigation-system efficiencies, pasture, horticulture, golf courses, and double cropping.

  13. Creating a national scale floodplain map for the United States using soil information

    Science.gov (United States)

    Merwade, V.; Du, L.; Sangwan, N.

    2015-12-01

Floods are the most damaging of all natural disasters, adversely affecting millions of lives and causing financial losses worth billions of dollars every year across the globe. Flood inundation maps play a key role in the assessment and mitigation of potential flood hazards. However, there are several communities in the United States where flood risk maps are not available due to the lack of the resources needed to create such maps through the conventional modeling approach. The objective of this study is to develop and examine an economical alternative approach to floodplain mapping using widely available gSSURGO soil data in the United States. The gSSURGO dataset is used to create floodplain maps for the entire United States by using attributes related to soil taxonomy, flood frequency and geomorphic description. For validation, the flood extents obtained from the soil data are compared with existing maps, including the Federal Emergency Management Agency (FEMA) issued Flood Insurance Rate Maps (FIRM), flood extents observed during past floods, and other flood maps derived using Digital Elevation Models (DEMs). Preliminary results show that the overlap between the SSURGO-based floodplain maps and FEMA maps ranges from 65 to 90 percent. While these results are promising, a more comprehensive validation of these maps must be conducted. The soil-based floodplain mapping approach offers an objective, economical, and faster alternative in areas where detailed flood modeling and mapping have not been conducted.
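
The 65–90 percent overlap figures above come from comparing two binary flood masks cell by cell; a minimal version of that comparison (with invented toy grids, not the study's data) looks like this:

```python
# Toy cell-by-cell overlap comparison between two binary floodplain
# masks, mimicking the FEMA-vs-soil-map validation described above.
# The grids are invented for illustration.

def percent_overlap(reference, candidate):
    """Share of reference flood cells (1s) also flagged by candidate, in %."""
    hits = total = 0
    for ref_row, cand_row in zip(reference, candidate):
        for r, c in zip(ref_row, cand_row):
            if r == 1:
                total += 1
                hits += (c == 1)
    return 100.0 * hits / total

fema = [[1, 1, 0], [1, 0, 0], [1, 1, 1]]
soil = [[1, 1, 0], [0, 0, 0], [1, 1, 0]]
print(percent_overlap(fema, soil))   # → about 66.7 (4 of 6 FEMA cells matched)
```

On real data the same count would be run over raster grids of the floodplain extents rather than toy lists.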

  14. Reflections on the Value of Mapping the Final Theory Examination in a Molecular Biochemistry Unit

    OpenAIRE

    Rajaraman Eri; Anthony Cook; Natalie Brown

    2014-01-01

    This article assesses the impact of examination mapping as a tool to enhancing assessment and teaching quality in a second-year biochemistry unit for undergraduates. Examination mapping is a process where all questions in a written examination paper are assessed for links to the unit’s intended learning outcomes. We describe how mapping a final written examination helped visualise the impact of the assessment task on intended learning outcomes and skills for that biochemistry unit. The method...

  15. Calculating correct compilers

    DEFF Research Database (Denmark)

    Bahr, Patrick; Hutton, Graham

    2015-01-01

In this article, we present a new approach to the problem of calculating compilers. In particular, we develop a simple but general technique that allows us to derive correct compilers from high-level semantics by systematic calculation, with all details of the implementation of the compilers falling naturally out of the calculation process. Our approach is based upon the use of standard equational reasoning techniques, and has been applied to calculate compilers for a wide range of language features and their combination, including arithmetic expressions, exceptions, state, various forms
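
Bahr and Hutton carry out their derivations in a functional (Haskell) setting; purely as an illustration of the correctness property being calculated, here is a toy expression language where executing compiled stack-machine code agrees with direct evaluation:

```python
# Toy illustration of compiler correctness: exec(compile(e)) == eval(e).
# Bahr & Hutton *derive* the compiler from this property by calculation;
# this Python sketch only states and checks the property for one case.

def eval_expr(e):
    """e is ('val', n) or ('add', e1, e2) -- direct (semantic) evaluation."""
    if e[0] == 'val':
        return e[1]
    return eval_expr(e[1]) + eval_expr(e[2])

def compile_expr(e, code=None):
    """Flatten an expression into stack-machine instructions."""
    code = [] if code is None else code
    if e[0] == 'val':
        code.append(('PUSH', e[1]))
    else:
        compile_expr(e[1], code)
        compile_expr(e[2], code)
        code.append(('ADD',))
    return code

def exec_code(code):
    """Run the stack machine and return the top of the stack."""
    stack = []
    for op in code:
        if op[0] == 'PUSH':
            stack.append(op[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack[-1]

e = ('add', ('val', 1), ('add', ('val', 2), ('val', 3)))
assert exec_code(compile_expr(e)) == eval_expr(e) == 6
```

In the article this equation is not tested after the fact but used as the specification from which the compiler and machine are derived together.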

  16. Engineering a compiler

    CERN Document Server

    Cooper, Keith D

    2012-01-01

As computing has changed, so has the role of both the compiler and the compiler writer. The proliferation of processors, environments, and constraints demands an equally large number of compilers. To adapt, compiler writers retarget code generators, add optimizations, and work on issues such as code space or power consumption. Engineering a Compiler re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in today's practice. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best-practice algorithms for the major problems inside a compiler.
    · Focuses on the back end of the compiler, reflecting the focus of research and development over the last decade.
    · Applies the well-developed theory behind scanning and parsing to introduce concepts that play a critical role in optimization and code generation.
    · Introduces the student to optimization through data-flow analysis, SSA form, and a selection of sc

  17. Compilation of watershed models for tributaries to the Great Lakes, United States, as of 2010, and identification of watersheds for future modeling for the Great Lakes Restoration Initiative

    Science.gov (United States)

    Coon, William F.; Murphy, Elizabeth A.; Soong, David T.; Sharpe, Jennifer B.

    2011-01-01

    As part of the Great Lakes Restoration Initiative (GLRI) during 2009–10, the U.S. Geological Survey (USGS) compiled a list of existing watershed models that had been created for tributaries within the United States that drain to the Great Lakes. Established Federal programs that are overseen by the National Oceanic and Atmospheric Administration (NOAA) and the U.S. Army Corps of Engineers (USACE) are responsible for most of the existing watershed models for specific tributaries. The NOAA Great Lakes Environmental Research Laboratory (GLERL) uses the Large Basin Runoff Model to provide data for the management of water levels in the Great Lakes by estimating United States and Canadian inflows to the Great Lakes from 121 large watersheds. GLERL also simulates streamflows in 34 U.S. watersheds by a grid-based model, the Distributed Large Basin Runoff Model. The NOAA National Weather Service uses the Sacramento Soil Moisture Accounting model to predict flows at river forecast sites. The USACE created or funded the creation of models for at least 30 tributaries to the Great Lakes to better understand sediment erosion, transport, and aggradation processes that affect Federal navigation channels and harbors. Many of the USACE hydrologic models have been coupled with hydrodynamic and sediment-transport models that simulate the processes in the stream and harbor near the mouth of the modeled tributary. Some models either have been applied or have the capability of being applied across the entire Great Lakes Basin; they are (1) the SPAtially Referenced Regressions On Watershed attributes (SPARROW) model, which was developed by the USGS; (2) the High Impact Targeting (HIT) and Digital Watershed models, which were developed by the Institute of Water Research at Michigan State University; (3) the Long-Term Hydrologic Impact Assessment (L–THIA) model, which was developed by researchers at Purdue University; and (4) the Water Erosion Prediction Project (WEPP) model, which was

  18. Singular SRB Measures for a Non 1-1 Map of the Unit Square

    Science.gov (United States)

    Góra, Paweł; Boyarsky, Abraham; Li, Zhenyang

    2016-09-01

We consider a map of the unit square which is not 1-1, such as the memory map studied in Góra (Statistical and deterministic dynamics of maps with memory, http://arxiv.org/abs/1604.06991). Memory maps are defined as follows: x_{n+1} = M_α(x_{n-1}, x_n) = τ(α·x_n + (1-α)·x_{n-1}), where τ is a one-dimensional map on I = [0,1] and 0 < α < 1; here τ is the tent map. To study the dynamics of M_α, we consider the two-dimensional map G_α : [x_{n-1}, x_n] ↦ [x_n, τ(α·x_n + (1-α)·x_{n-1})]. The map G_α for α in (0, 3/4] was studied in Góra (Statistical and deterministic dynamics of maps with memory, http://arxiv.org/abs/1604.06991). In this paper we prove that for α in (3/4, 1) the map G_α admits a singular Sinai-Ruelle-Bowen measure. We do this by applying Rychlik's results for the Lozi map. However, unlike the Lozi map, the maps G_α are not invertible, which creates complications that we are able to overcome.

  19. Kokkos GPU Compiler

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-15

The Kokkos Clang compiler is a version of the Clang C++ compiler that has been modified to perform targeted code generation for Kokkos constructs, with the goal of generating highly optimized code and providing semantic (domain) awareness of these constructs, such as parallel for and parallel reduce, throughout the compilation toolchain. This approach is taken to explore the possibilities of exposing the developer's intentions to the underlying compiler infrastructure (e.g. optimization and analysis passes within the middle stages of the compiler) instead of relying solely on the restricted capabilities of C++ template metaprogramming. To date our activities have focused on correct GPU code generation, and we have not yet focused on improving overall performance. The compiler is implemented by recognizing specific (syntactic) Kokkos constructs in order to bypass normal template expansion mechanisms and instead use the semantic knowledge of Kokkos to directly generate code in the compiler's intermediate representation (IR), which is then translated into an NVIDIA-centric GPU program and supporting runtime calls. In addition, capturing and maintaining the higher-level semantics of Kokkos directly within the lower levels of the compiler has the potential to significantly improve the compiler's ability to communicate with the developer in the terms of their original programming model/semantics.

  20. USGS Small-scale Dataset - Global Map: Ports of the United States 201406 FileGDB 10.1

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing ferry ports in the United States and Puerto Rico. The data are a modified version of the National Atlas of the United...

  1. Mapping Table of Military Classification and Thesaurus: Design and Compilation

    Institute of Scientific and Technical Information of China (English)

    赵建国; 周健

    2015-01-01

The Mapping Table of Military Classification and Thesaurus is a retrieval-language tool that integrates the classification and subject methods. It can be used both for classification indexing and retrieval and for subject indexing and retrieval of military information resources, allowing classification and subject indexing to be carried out in a single pass. This paper briefly introduces the overall design of the Mapping Table of Military Classification and Thesaurus, describes the frame structure of its compilation and management system, elaborates the mapping and review process, and closes with conclusions and prospects.

  2. Compilation and Validation of SAR and Optical Data Products for a Complete and Global Map of Inland/Ocean Water Tailored to the Climate Modeling Community

    Directory of Open Access Journals (Sweden)

    Céline Lamarche

    2017-01-01

Accurate maps of surface water extent are of paramount importance for water management, satellite data processing and climate modeling. Several maps of water bodies based on remote sensing data have been released during the last decade. Nonetheless, none has a truly global (90°N–90°S) coverage while being thoroughly validated. This paper describes a global, spatially complete (void-free) and accurate mask of inland/ocean water for the 2000–2012 period, built in the framework of the European Space Agency (ESA) Climate Change Initiative (CCI). This map results from the synergistic combination of multiple individual SAR and optical water body and auxiliary datasets. A key aspect of this work is the original and rigorous stratified random sampling designed for the quality assessment of binary classifications where one class is marginally distributed. Input and consolidated products were assessed qualitatively and quantitatively against a reference validation database of 2110 samples spread throughout the globe. Using all samples, overall accuracy was always very high among all products, between 98% and 100%. The CCI global map of open water bodies provided the best water class representation (F-score of 89%) compared to its constitutive inputs. When focusing on areas that are challenging for water body mapping, such as shorelines, lakes and river banks, all products yielded substantially lower accuracy figures, with overall accuracies ranging between 74% and 89%. The inland water area of the CCI global map of open water bodies was estimated to be 3.17 ± 0.24 million km². The dataset is freely available through the ESA CCI Land Cover viewer.
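
The two headline figures quoted above, overall accuracy and the water-class F-score, are simple functions of confusion-matrix counts. The counts below are invented to mimic a marginally distributed water class; they are not the study's numbers.

```python
# Overall accuracy and F-score from confusion counts, as used to grade
# the water masks above.  The counts are invented for illustration and
# chosen so the rare (water) class is marginally distributed.

def overall_accuracy(tp, fp, fn, tn):
    return (tp + tn) / (tp + fp + fn + tn)

def f_score(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# A 2110-sample validation set with water heavily under-represented:
tp, fp, fn, tn = 180, 25, 20, 1885
print(round(overall_accuracy(tp, fp, fn, tn), 3))  # 0.979
print(round(f_score(tp, fp, fn), 3))               # 0.889
```

Note how overall accuracy stays near 98% almost regardless of how the rare class is handled, which is exactly why the authors lean on the F-score and a stratified sampling design.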

  3. Map Design and Compilation for Leaders' Work

    Institute of Scientific and Technical Information of China (English)

    王爱琴

    2013-01-01

Maps, with their unique characteristics, serve professional departments and leaders at all levels. Taking municipal leaders' working maps as an example, this paper covers their design and compilation in detail, including sheet design and the technical program, and uses this case to discuss how surveying and mapping can serve government work.

  4. Map of assessed coalbed-gas resources in the United States, 2014

    Science.gov (United States)

    ,; Biewick, Laura R. H.

    2014-01-01

    This report presents a digital map of coalbed-gas resource assessments in the United States as part of the U.S. Geological Survey’s (USGS) National Assessment of Oil and Gas Project. Using a geology-based assessment methodology, the USGS quantitatively estimated potential volumes of undiscovered, technically recoverable natural gas resources within coalbed-gas assessment units (AUs). This is the third digital map product in a series of USGS unconventional oil and gas resource maps. The map plate included in this report can be printed in hardcopy form or downloaded in a Geographic Information System (GIS) data package, including an ArcGIS ArcMap document (.mxd), geodatabase (.gdb), and published map file (.pmf). In addition, the publication access table contains hyperlinks to current USGS coalbed-gas assessment publications and web pages.

  5. Singular SRB Measures for a Non 1-1 Map of the Unit Square

    Science.gov (United States)

    Góra, Paweł; Boyarsky, Abraham; Li, Zhenyang

    2016-10-01

We consider a map of the unit square which is not 1-1, such as the memory map studied in Góra (Statistical and deterministic dynamics of maps with memory, http://arxiv.org/abs/1604.06991). Memory maps are defined as follows: x_{n+1} = M_α(x_{n-1}, x_n) = τ(α·x_n + (1-α)·x_{n-1}), where τ is a one-dimensional map on I = [0,1] and 0 < α < 1; here τ is the tent map. To study the dynamics of M_α, we consider the two-dimensional map G_α : [x_{n-1}, x_n] ↦ [x_n, τ(α·x_n + (1-α)·x_{n-1})]. The map G_α for α in (0, 3/4] was studied in Góra (Statistical and deterministic dynamics of maps with memory, http://arxiv.org/abs/1604.06991). In this paper we prove that for α in (3/4, 1) the map G_α admits a singular Sinai-Ruelle-Bowen measure. We do this by applying Rychlik's results for the Lozi map. However, unlike the Lozi map, the maps G_α are not invertible, which creates complications that we are able to overcome.
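
The memory map is easy to explore numerically. The sketch below (my own illustration, not the authors' code) iterates x_{n+1} = τ(α·x_n + (1-α)·x_{n-1}) with τ the tent map and an α in (3/4, 1), confirming that orbits stay in the unit interval: τ maps [0,1] into itself and its argument is a convex combination of the two previous points.

```python
# Numerical sketch of the memory map with the tent map as tau and
# alpha in (3/4, 1).  Illustrative only; not the paper's analysis.

def tent(x):
    """Tent map on [0, 1]."""
    return 2 * x if x <= 0.5 else 2 * (1 - x)

def memory_orbit(x0, x1, alpha, n):
    """Iterate x_{n+1} = tent(alpha * x_n + (1 - alpha) * x_{n-1})."""
    orbit = [x0, x1]
    for _ in range(n):
        orbit.append(tent(alpha * orbit[-1] + (1 - alpha) * orbit[-2]))
    return orbit

orbit = memory_orbit(0.2, 0.7, 0.8, 1000)
assert all(0.0 <= x <= 1.0 for x in orbit)
```

Plotting consecutive pairs (x_{n-1}, x_n) of such an orbit is one way to visualize the support of the singular SRB measure the paper constructs.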

  6. Mapping of Phytoecological Units of the ’Cerrados’ of the Central Plateaus of Brazil,

    Science.gov (United States)

The mapping of phytoecological units in the Cerrado region of Brazil has the purpose of giving a global view of the results obtained by mapping a large extent of the country's vegetation cover, in which the cerrado predominates. The phytoecological units represent the close links...in which, besides several cerrado physiognomies, some forests were found. In the north of the area occurs the contact between the dominium of the

  7. Global Map: Cities and Towns of the United States - Direct Download

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing cities and towns in the United States, Puerto Rico, and the U.S. Virgin Islands. The data are a modified version of...

  8. USGS Small-scale Dataset - Global Map: Cities and Towns of the United States 201403 Shapefile

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing cities and towns in the United States, Puerto Rico, and the U.S. Virgin Islands. The data are a modified version of...

  9. Global Map: 1:1,000,000-Scale Major Roads of the United States - Direct Download

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing the major roads in the United States, Puerto Rico, and the U.S. Virgin Islands. The data are a modified version of...

  10. United States Crimes Database 1994-2000 - Direct Download

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer shows crime statistics for the United States for the years 1994-2000, drawn from the Uniform Crime Reporting Program data compiled by the Federal...

  11. United States Crimes Database 2001-2002 - Direct Download

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer shows crime statistics for the United States for the years 2001-2002, drawn from the Uniform Crime Reporting Program data compiled by the Federal...

  12. A Compilation of Boiling Water Reactor Operational Experience for the United Kingdom's Office for Nuclear Regulation's Advanced Boiling Water Reactor Generic Design Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, Timothy A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Liao, Huafei [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

United States nuclear power plant Licensee Event Reports (LERs), submitted to the United States Nuclear Regulatory Commission (NRC) as required by 10 CFR 50.72 and 50.73, were evaluated for relevance to the United Kingdom's Health and Safety Executive – Office for Nuclear Regulation's (ONR) Generic Design Assessment of the Advanced Boiling Water Reactor (ABWR) design. An NRC compendium of LERs, compiled by Idaho National Laboratory over the time period January 1, 2000 through March 31, 2014, was sorted by BWR safety system and divided into two categories: events leading to a SCRAM, and events which constituted a safety system failure. The LERs were then evaluated as to the relevance of the operational experience to the ABWR design.

  13. Documentation for the 2008 Update of the United States National Seismic Hazard Maps

    Science.gov (United States)

    Petersen, Mark D.; Frankel, Arthur D.; Harmsen, Stephen C.; Mueller, Charles S.; Haller, Kathleen M.; Wheeler, Russell L.; Wesson, Robert L.; Zeng, Yuehua; Boyd, Oliver S.; Perkins, David M.; Luco, Nicolas; Field, Edward H.; Wills, Chris J.; Rukstales, Kenneth S.

    2008-01-01

    The 2008 U.S. Geological Survey (USGS) National Seismic Hazard Maps display earthquake ground motions for various probability levels across the United States and are applied in seismic provisions of building codes, insurance rate structures, risk assessments, and other public policy. This update of the maps incorporates new findings on earthquake ground shaking, faults, seismicity, and geodesy. The resulting maps are derived from seismic hazard curves calculated on a grid of sites across the United States that describe the frequency of exceeding a set of ground motions. The USGS National Seismic Hazard Mapping Project developed these maps by incorporating information on potential earthquakes and associated ground shaking obtained from interaction in science and engineering workshops involving hundreds of participants, review by several science organizations and State surveys, and advice from two expert panels. The National Seismic Hazard Maps represent our assessment of the 'best available science' in earthquake hazards estimation for the United States (maps of Alaska and Hawaii as well as further information on hazard across the United States are available on our Web site at http://earthquake.usgs.gov/research/hazmaps/).
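
Hazard curves of the kind described give an annual frequency of exceeding each ground-motion level; building-code applications usually convert this to a probability of exceedance over a design life via a Poisson assumption. The 2%-in-50-years case below is the conventional benchmark calculation, not a value taken from the 2008 maps.

```python
# Converting an annual exceedance frequency from a hazard curve into a
# probability of exceedance over a design life, under the standard
# Poisson assumption: P = 1 - exp(-rate * years).  The numbers are the
# conventional 2%-in-50-years benchmark, not values from the 2008 maps.
import math

def prob_exceedance(annual_rate, years):
    return 1.0 - math.exp(-annual_rate * years)

rate = 1.0 / 2475            # annual exceedance frequency (per year)
p50 = prob_exceedance(rate, 50)
print(round(100 * p50, 1))   # ≈ 2.0 percent chance in 50 years
```

Reading the ground motion off a site's hazard curve at this rate is what produces the "2% probability of exceedance in 50 years" maps used in building codes.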

  14. The isometric extension of “into” mappings on unit spheres of AL-spaces

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In this paper, we show that if V0 is an isometric mapping from the unit sphere of an AL-space onto the unit sphere of a Banach space E, then V0 can be extended to a linear isometry defined on the whole space.

  15. Compilation of Shona Children's Dictionary

    African Journals Online (AJOL)

    Mev. R.B. Ruthven

    Peniah Mabaso, African Languages Research Institute (ALRI), University of. Zimbabwe, Harare ... thirteen years age group and their teachers. Student ... The Compilation of a Shona Children's Dictionary: Challenges and Solutions. 113 language .... The current orthography is linguistically constricting in a number of ways.

  16. Bloch constant of holomorphic mappings on the unit polydisk of C^n

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In this paper, we give a definition of Bloch mappings defined on the unit polydisk D^n, which generalizes the concept of Bloch functions defined on the unit disk D. It is known that the Bloch theorem fails unless some restrictive assumption is placed on holomorphic mappings in several complex variables. We establish the corresponding distortion theorems for the subfamilies β(K) and β_loc(K) of Bloch mappings defined on the polydisk D^n, which extend the distortion theorems of Liu and Minda to higher dimensions. As an application, we obtain lower and upper bounds of the Bloch constants for various subfamilies of Bloch mappings defined on D^n. In particular, our results reduce to the classical results of Ahlfors and Landau when n = 1.
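    For orientation, the classical one-variable notion that the abstract generalizes can be recalled (a standard textbook definition, not taken from the paper): a holomorphic function f on the unit disk D is a Bloch function when its Bloch semi-norm is finite,

```latex
\|f\|_{\mathcal{B}} \;=\; \sup_{z \in D} \left(1 - |z|^{2}\right)\,\bigl|f'(z)\bigr| \;<\; \infty .
```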

  17. Embedded Processor Oriented Compiler Infrastructure

    Directory of Open Access Journals (Sweden)

    DJUKIC, M.

    2014-08-01

    In recent years, research into special compiler techniques and algorithms for embedded processors has broadened the knowledge of how to achieve better compiler performance for irregular processor architectures. However, industrial-strength compilers, besides the ability to generate efficient code, must also be robust, understandable, maintainable, and extensible. This raises the need for a compiler infrastructure that provides means for convenient implementation of embedded-processor-oriented compiler techniques. The Cirrus Logic Coyote 32 DSP is an example that shows how traditional compiler infrastructure is not able to cope with the problem. That is why a new compiler infrastructure was developed for this processor, based on research in the field of embedded system software tools and experience in the development of industrial-strength compilers. The new infrastructure is described in this paper. Compiler-generated code quality is compared with code generated by the previous compiler for the same processor architecture.

  18. Aleutian Islands Coastal Resources Inventory and Environmental Sensitivity Maps: FAULTS (Fault Lines)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set is a compilation of faults mapped by the United States Geological Survey (USGS) and published in USGS bulletin series 1028 (1956-1971). These bulletins...

  19. USGS Small-scale Dataset - Global Map: 100-Meter Resolution Elevation of the Conterminous United States 201403 BIL

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing elevation for the conterminous United States, in meters relative to mean sea level. The data are a modified version...

  20. USGS Small-scale Dataset - Global Map: Railroad Stations of the United States 201403 FileGDB 10.1

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing Amtrak intercity railroad terminals in the United States. The data are a modified version of the National Atlas of...

  1. USGS Small-scale Dataset - Global Map: Airports of the United States 201403 FileGDB 10.1

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing airports in the United States, Puerto Rico and the U.S. Virgin Islands. The data are a modified version of the...

  2. USGS Small-scale Dataset - Global Map: 1:1,000,000-Scale Streams of the United States 201406 Shapefile

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing streams in the United States, Puerto Rico, and the U.S. Virgin Islands. The data are a modified version of the...

  3. Three-dimensional mapping of equiprobable hydrostratigraphic units at the Frenchman Flat Corrective Action Unit, Nevada Test Site

    Energy Technology Data Exchange (ETDEWEB)

    Shirley, C.; Pohlmann, K.; Andricevic, R.

    1996-09-01

    Geological and geophysical data are used with the sequential indicator simulation algorithm of Gomez-Hernandez and Srivastava to produce multiple, equiprobable, three-dimensional maps of informal hydrostratigraphic units at the Frenchman Flat Corrective Action Unit, Nevada Test Site. The upper 50 percent of the Tertiary volcanic lithostratigraphic column comprises the study volume. Semivariograms are modeled from indicator-transformed geophysical tool signals. Each equiprobable study volume is subdivided into discrete classes using the ISIM3D implementation of the sequential indicator simulation algorithm. Hydraulic conductivity is assigned within each class using the sequential Gaussian simulation method of Deutsch and Journel. The resulting maps show the contiguity of high and low hydraulic conductivity regions.
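    The indicator transform underlying the semivariogram-modeling step above can be sketched as follows; the log values and threshold are illustrative, not taken from the report:

```python
def indicator_transform(values, threshold):
    """Code each measurement as 1 if at or below the threshold, else 0.
    Indicator semivariograms are then modeled from these 0/1 series."""
    return [1 if v <= threshold else 0 for v in values]

# hypothetical geophysical tool readings along a borehole
logs = [2.1, 3.4, 1.8, 5.0, 2.9]
ind = indicator_transform(logs, threshold=3.0)  # [1, 0, 1, 0, 1]
```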

  4. Mapping Atmospheric Moisture Climatologies across the Conterminous United States.

    Directory of Open Access Journals (Sweden)

    Christopher Daly

    Spatial climate datasets of 1981-2010 long-term mean monthly average dew point and minimum and maximum vapor pressure deficit were developed for the conterminous United States at 30-arcsec (~800 m) resolution. Interpolation of long-term averages (twelve monthly values per variable) was performed using PRISM (Parameter-elevation Relationships on Independent Slopes Model). Surface stations available for analysis numbered only 4,000 for dew point and 3,500 for vapor pressure deficit, compared to 16,000 for previously developed grids of 1981-2010 long-term mean monthly minimum and maximum temperature. Therefore, a form of Climatologically-Aided Interpolation (CAI) was used, in which the 1981-2010 temperature grids served as predictor grids. For each grid cell, PRISM calculated a local regression function between the interpolated climate variable and the predictor grid. Nearby stations entering the regression were assigned weights based on the physiographic similarity of the station to the grid cell, accounting for the effects of distance, elevation, coastal proximity, vertical atmospheric layer, and topographic position. Interpolation uncertainties were estimated using cross-validation exercises. Given that CAI interpolation was used, a new method was developed to allow uncertainties in the predictor grids to be accounted for in estimating the total interpolation error. Local land use/land cover properties had noticeable effects on the spatial patterns of atmospheric moisture content and deficit; an example was the relatively high dew points and low vapor pressure deficits at stations located in or near irrigated fields. The new grids, in combination with existing temperature grids, enable the user to derive a full suite of atmospheric moisture variables, such as minimum and maximum relative humidity, vapor pressure, and dew point depression, with accompanying assumptions. All of these grids are available online at http://prism.oregonstate.edu, and
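    The per-cell regression step of CAI can be illustrated with a minimal weighted least-squares sketch; the station values and weights below are invented for illustration, and PRISM's actual physiographic-similarity weighting is far richer:

```python
def weighted_linear_fit(x, y, w):
    """Weighted least-squares fit y ~ a + b*x; the weights stand in for
    PRISM-style station similarity weights (illustrative only)."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y)) / \
        sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
    a = my - b * mx
    return a, b

# stations: predictor-grid temperature (x) vs. observed dew point (y)
a, b = weighted_linear_fit([10.0, 12.0, 15.0], [6.0, 7.0, 8.5], [1.0, 0.8, 0.5])
estimate = a + b * 13.0  # dew-point estimate for a cell with predictor value 13
```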

  5. Elements of compiler design

    CERN Document Server

    Meduna, Alexander

    2007-01-01

    PREFACE. INTRODUCTION: Mathematical Preliminaries; Compilation; Rewriting Systems. LEXICAL ANALYSIS: Models; Methods; Theory. SYNTAX ANALYSIS: Models; Methods; Theory. DETERMINISTIC TOP-DOWN PARSING: Predictive Sets and LL Grammars; Predictive Parsing. DETERMINISTIC BOTTOM-UP PARSING: Precedence Parsing; LR Parsing. SYNTAX-DIRECTED TRANSLATION AND INTERMEDIATE CODE GENERATION: Bottom-Up Syntax-Directed Translation and Intermediate Code Generation; Top-Down Syntax-Directed Translation; Symbol Table; Semantic Analysis; Softw

  6. Metallurgy: A compilation

    Science.gov (United States)

    1972-01-01

    A compilation on the technical uses of various metallurgical processes is presented. Descriptions are given of the mechanical properties of various alloys, ranging from TAZ-813 at 2200 F to investment cast alloy 718 at -320 F. Methods are also described for analyzing some of the constituents of various alloys from optical properties of carbide precipitates in Rene 41 to X-ray spectrographic analysis of the manganese content of high chromium steels.

  7. Using the Large Fire Simulator System to map wildland fire potential for the conterminous United States

    Science.gov (United States)

    LaWen Hollingsworth; James Menakis

    2010-01-01

    This project mapped wildland fire potential (WFP) for the conterminous United States by using the large fire simulation system developed for Fire Program Analysis (FPA) System. The large fire simulation system, referred to here as LFSim, consists of modules for weather generation, fire occurrence, fire suppression, and fire growth modeling. Weather was generated with...

  8. USGS National Assessment of Oil and Gas Project - Shale Gas Assessment Units

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The U.S. Geological Survey has compiled a map of shale gas assessments in the United States that were completed by 2012, such assessments having been included as...

  9. Mean annual runoff, precipitation, and evapotranspiration in the glaciated northeastern United States, 1951-80

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Two maps, compiled at 1:1,000,000 scale, depict mean annual runoff, precipitation, and evapotranspiration in the part of the United States east of Cleveland, Ohio...

  11. Fault-Tree Compiler

    Science.gov (United States)

    Butler, Ricky W.; Boerschlein, David P.

    1993-01-01

    The Fault-Tree Compiler (FTC) program is a software tool used to calculate the probability of the top event in a fault tree. Gates of five different types are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. The high-level input language is easy to understand and use. In addition, the program supports a hierarchical fault-tree definition feature, which simplifies the tree-description process and reduces execution time. A set of programs was created forming the basis for a reliability-analysis workstation: SURE, ASSIST, PAWS/STEM, and the FTC fault-tree tool (LAR-14586). Written in PASCAL, ANSI-compliant C language, and FORTRAN 77. Other versions available upon request.
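    The gate types FTC supports can be illustrated with a minimal evaluator for independent basic events; this is a sketch of the underlying probability algebra, not the FTC implementation:

```python
from itertools import combinations

def gate_and(probs):
    """AND gate: all independent basic events must occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def gate_or(probs):
    """OR gate: at least one independent basic event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def gate_m_of_n(probs, m):
    """M-of-N gate: at least m of the n independent events occur."""
    n = len(probs)
    total = 0.0
    for k in range(m, n + 1):
        for idx in combinations(range(n), k):
            term = 1.0
            for i in range(n):
                term *= probs[i] if i in idx else (1.0 - probs[i])
            total += term
    return total

# top event: (A AND B) OR C, with independent basic-event probabilities
top = gate_or([gate_and([0.1, 0.2]), 0.05])  # ~0.069
```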

  12. The Onomastic Phraseological Units in the Works of I.V. Goethe (Compiled By Authors «German Onomastic Phraseological Dictionary»

    Directory of Open Access Journals (Sweden)

    Manana Napireli

    2015-06-01

    In the given work we are interested in one branch of linguistics, onomastics, and specifically in onomastic phraseology. Onomastic research concerns proper nouns. This branch of linguistics is highly relevant today and attracts much attention, although the study of proper nouns goes back to ancient times. The aim of our study is to find those quotations from the works of the greatest representative of German classical literature, I.V. Goethe, in which proper nouns have become phraseological units and expressions. Goethe's work is very interesting from the point of view of onomastic phraseology. We have identified 12 onomastic expressions: 8 in "Faust" and 4 in other writings.

  13. Compilation, quality control, analysis, and summary of discrete suspended-sediment and ancillary data in the United States, 1901-2010

    Science.gov (United States)

    Lee, Casey J.; Glysson, G. Douglas

    2013-01-01

    Human-induced and natural changes to the transport of sediment and sediment-associated constituents can degrade aquatic ecosystems and limit human uses of streams and rivers. The lack of a dedicated, easily accessible, quality-controlled database of sediment and ancillary data has made it difficult to identify sediment-related water-quality impairments and has limited understanding of how human actions affect suspended-sediment concentrations and transport. The purpose of this report is to describe the creation of a quality-controlled U.S. Geological Survey suspended-sediment database, provide guidance for its use, and summarize characteristics of suspended-sediment data through 2010. The database is provided as an online application at http://cida.usgs.gov/sediment to allow users to view, filter, and retrieve available suspended-sediment and ancillary data. A data recovery, filtration, and quality-control process was performed to expand the availability, representativeness, and utility of existing suspended-sediment data collected by the U.S. Geological Survey in the United States before January 1, 2011. Information on streamflow condition, sediment grain size, and upstream landscape condition were matched to sediment data and sediment-sampling sites to place data in context with factors that may influence sediment transport. Suspended-sediment and selected ancillary data are presented from across the United States with respect to time, streamflow, and landscape condition. Examples of potential uses of this database for identifying sediment-related impairments, assessing trends, and designing new data collection activities are provided. This report and database can support local and national-level decision making, project planning, and data mining activities related to the transport of suspended-sediment and sediment-associated constituents.

  14. AN INEQUALITY OF HOMOGENEOUS EXPANSION FOR BIHOLOMORPHIC QUASI-CONVEX MAPPINGS ON THE UNIT POLYDISK AND ITS APPLICATION

    Institute of Scientific and Technical Information of China (English)

    Liu Xiaosong; Liu Taishun

    2009-01-01

    In this article, the authors obtain an inequality of homogeneous expansion for f, where f is a quasi-convex mapping (including quasi-convex mappings of type A and quasi-convex mappings of type B) defined on the open unit polydisk in C^n. The authors also investigate its application.

  15. A multicriteria framework for producing local, regional, and national insect and disease risk maps

    Science.gov (United States)

    Frank J. Jr. Krist; Frank J. Sapio; Borys M. Tkacz

    2010-01-01

    The construction of the 2006 National Insect and Disease Risk Map, compiled by the USDA Forest Service, State and Private Forestry Area, Forest Health Protection Unit, resulted in the development of a GIS-based, multicriteria approach for insect and disease risk mapping that can account for regional variations in forest health concerns and threats. This risk mapping...

  16. Mapping extent and change in surface mines within the United States for 2001 to 2006

    Science.gov (United States)

    Soulard, Christopher E.; Acevedo, William; Stehman, Stephen V.; Parker, Owen P.

    2016-01-01

    A complete, spatially explicit dataset illustrating the 21st century mining footprint for the conterminous United States does not exist. To address this need, we developed a semi-automated procedure to map the country's mining footprint (30-m pixel) and establish a baseline to monitor changes in mine extent over time. The process uses mine seed points derived from the U.S. Energy Information Administration (EIA), U.S. Geological Survey (USGS) Mineral Resources Data System (MRDS), and USGS National Land Cover Dataset (NLCD) and recodes patches of barren land that meet a “distance to seed” requirement and a patch area requirement before mapping a pixel as mining. Seed points derived from EIA coal points, an edited MRDS point file, and 1992 NLCD mine points were used in three separate efforts using different distance and patch area parameters for each. The three products were then merged to create a 2001 map of moderate-to-large mines in the United States, which was subsequently manually edited to reduce omission and commission errors. This process was replicated using NLCD 2006 barren pixels as a base layer to create a 2006 mine map and a 2001–2006 mine change map focusing on areas with surface mine expansion. In 2001, 8,324 km² of surface mines were mapped. The footprint increased to 9,181 km² in 2006, representing a 10.3% increase over 5 years. These methods exhibit merit as a timely approach to generate wall-to-wall, spatially explicit maps representing the recent extent of a wide range of surface mining activities across the country.
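    The patch-recoding rule can be sketched as a connected-component filter; the toy grid, seed set, and area threshold below are illustrative (the published workflow used 30-m NLCD rasters), and this simplified sketch keeps a patch only if it actually contains a seed pixel rather than testing a distance-to-seed radius:

```python
from collections import deque

def map_mines(barren, seeds, min_area):
    """Label 4-connected patches of barren pixels and recode a patch as
    'mine' if it contains a seed point and meets the minimum-area rule."""
    rows, cols = len(barren), len(barren[0])
    seen = [[False] * cols for _ in range(rows)]
    mine = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if barren[r][c] and not seen[r][c]:
                # flood-fill one barren patch
                patch, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    patch.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols and \
                           barren[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(patch) >= min_area and any(p in seeds for p in patch):
                    for y, x in patch:
                        mine[y][x] = 1
    return mine

# one seeded 3-pixel patch (kept) and one unseeded 2-pixel patch (dropped)
grid = map_mines([[1, 1, 0, 0], [1, 0, 0, 1], [0, 0, 0, 1]], {(0, 0)}, 3)
```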

  17. Compilation of 3D global conductivity model of the Earth for space weather applications

    Science.gov (United States)

    Alekseev, Dmitry; Kuvshinov, Alexey; Palshin, Nikolay

    2015-07-01

    We have compiled a global three-dimensional (3D) conductivity model of the Earth with the ultimate goal of realistic simulation of geomagnetically induced currents (GIC), which pose a potential threat to man-made electric systems. Bearing in mind the intrinsic frequency range of the most intense disturbances (magnetospheric substorms), with typical periods ranging from a few minutes to a few hours, the compiled 3D model represents the structure in the depth range of 0-100 km, including seawater, sediments, the Earth's crust, and partly the lithosphere/asthenosphere. More explicitly, the model consists of a series of spherical layers, whose vertical and lateral boundaries are established based on available data. To compile the model, global maps of bathymetry, sediment thickness, upper and lower crust thicknesses, and lithosphere thickness are utilized. All maps are re-interpolated onto a common grid of 0.25×0.25 degree lateral spacing. Once the geometry of the different structures is specified, each element of the structure is assigned either a certain conductivity value or a conductivity-versus-depth distribution, according to available laboratory data and conversion laws. A numerical formalism developed for compilation of the model allows for its further refinement by incorporation of regional 3D conductivity distributions inferred from real electromagnetic data. So far we have included in our model four regional conductivity models available from recent publications, namely, a surface conductance model of Russia and 3D conductivity models of Fennoscandia, Australia, and the northwestern United States.
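    Assigning a conductivity value to each element once the layer geometries are fixed can be sketched as a depth lookup per grid cell; the layer thicknesses and conductivities below are illustrative placeholders, not the model's values:

```python
def conductivity_column(thicknesses, conductivities, depths):
    """Return the conductivity (S/m) at each query depth (km) from a stack
    of layers (e.g. seawater, sediments, crust), top to bottom."""
    profile = []
    for z in depths:
        top = 0.0
        sigma = conductivities[-1]  # default: deepest layer
        for t, s in zip(thicknesses, conductivities):
            if top <= z < top + t:
                sigma = s
                break
            top += t
        profile.append(sigma)
    return profile

# a cell with 4 km ocean (3.2 S/m), 2 km sediments (0.5 S/m), then crust
col = conductivity_column([4.0, 2.0, 94.0], [3.2, 0.5, 0.001],
                          [1.0, 5.0, 50.0])  # [3.2, 0.5, 0.001]
```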

  18. Compilation and Introduction of Technical Code for Prefabricated United Pipe Risers GB50682-2011 (《预制组合立管技术规范》GB50682-2011编制与介绍)

    Institute of Scientific and Technical Information of China (English)

    张杰; 叶永杨; 尹奎; 蒋隆

    2011-01-01

    The traditional pipe riser technology cannot meet the pipeline arrangement requirements of tall buildings because of their great height and dense pipelines in shafts. According to file No. [2009] 88 from the Ministry of Housing and Urban-Rural Development, the national code Technical Code for Prefabricated United Pipe Risers GB50682-2011 was compiled by the First Construction Limited Company of China Construction Third Engineering Bureau, Tongji University, and other related organizations, and takes effect on January 1, 2012. The compilation approach and main contents of the code are introduced, including the applicable conditions of prefabricated united pipe riser technology, the design principles and methods for prefabricated risers, the load calculation for riser design, the fabrication and installation of the pipes and their support frames, and the test and acceptance methods for prefabricated risers. The code originates a construction system in which the dense risers are installed synchronously with the main structure, which can shorten construction time, lower the risks of work in pipe shafts, and improve project quality.

  19. Compiler Feedback using Continuous Dynamic Compilation during Development

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Probst, Christian W.

    2014-01-01

    Optimizing compilers are vital for performance. However, a compiler's ability to optimize aggressively is limited in some cases. To address this limitation, we have developed a compiler that guides the programmer in making small source code changes, potentially making the source code more amenable to optimization. This tool can help programmers understand what the optimizing compiler has done and suggest automatic source code changes in cases where the compiler refrains from optimizing. We have integrated our tool into an integrated development environment, interactively giving feedback as part of the programmer's development flow. We have evaluated our preliminary implementation and show it can guide to a 12% improvement in performance. Furthermore, the tool can be used as an interactive optimization adviser improving the performance of the code generated by a production compiler. Here it can lead to a 153...

  20. Concept Mapping as an Instrument for Evaluating an Instruction Unit on Holography (Concept Maps als Evaluierungsinstrumente einer Unterrichtseinheit zur Holographie)

    CERN Document Server

    Horn, M E; Horn, Martin Erik; Mikelskis, Helmut F.

    2004-01-01

    Due to its amazing three-dimensional effects, holography is a very motivating, yet very demanding subject for physics classes at the upper level in school. For this reason, an instruction unit on holography that supplements holographic experiments with computer-supported work sessions and a simulation program was developed. The effects of the lessons on holography were determined with a pre-post-test design. In addition to videotaping the lessons, knowledge and motivational tests, and student interviews, students were asked to prepare concept maps, which were used to track processes of model construction. The way this knowledge was applied largely depends on the students' understanding of models. In particular, it was shown that the participating students' demonstrated capacity for distinguishing between the different models of light is of great importance. Only students with a developed capacity for distinguishing between models are able to reason in a problem-oriented manner. They recognize the limits of ...

  1. Non-Markovianity Measure Based on Brukner–Zeilinger Invariant Information for Unital Quantum Dynamical Maps

    Science.gov (United States)

    He, Zhi; Zhu, Lie-Qiang; Li, Li

    2017-03-01

    A non-Markovianity measure based on the Brukner–Zeilinger invariant information is proposed to characterize non-Markovian effects in open systems undergoing unital dynamical maps. The method takes advantage of the non-increasing property of the Brukner–Zeilinger invariant information under completely positive and trace-preserving unital maps. The simplicity of computing the Brukner–Zeilinger invariant information, which depends mainly on the purity of the quantum state, is an advantage of the proposed measure. The measure effectively captures the characteristics of non-Markovianity of unital dynamical maps. As a concrete application, we consider two typical non-Markovian noise channels, i.e., the phase damping channel and the random unitary channel, to show the sensitivity of the proposed measure. We find that the conditions for detecting non-Markovianity in the phase damping channel are consistent with the results of existing measures of non-Markovianity, i.e., information flow, divisibility, and quantum mutual information. For the random unitary channel, however, the non-Markovian conditions are the same as those of the information flow, but differ from those of the divisibility and quantum mutual information. Supported by the National Natural Science Foundation of China under Grant No. 61505053, the Natural Science Foundation of Hunan Province under Grant No. 2015JJ3092, the Research Foundation of the Education Bureau of Hunan Province, China under Grant No. 16B177, and the School Foundation of the Hunan University of Arts and Science under Grant No. 14ZD01.
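    Because the Brukner–Zeilinger invariant information of a qubit reduces to a function of the state's purity, the measure can be sketched for a dephasing trajectory; the coherence samples below are invented for illustration, and the paper's construction is more general:

```python
def bz_invariant_info(c, d=2):
    """Brukner-Zeilinger invariant information I = Tr(rho^2) - 1/d for a
    qubit |+><+| dephased to off-diagonal coherence c (purity in closed form)."""
    purity = 0.5 + 0.5 * c * c
    return purity - 1.0 / d

def non_markovianity(coherences):
    """Sum the increases of the invariant information along a sampled
    trajectory; any positive total flags non-Markovian behavior."""
    info = [bz_invariant_info(c) for c in coherences]
    return sum(max(0.0, b - a) for a, b in zip(info, info[1:]))

# monotonic dephasing (Markovian) vs. a coherence revival (non-Markovian)
markovian = non_markovianity([1.0, 0.8, 0.6, 0.4])   # 0.0
non_markov = non_markovianity([1.0, 0.5, 0.2, 0.6])  # > 0
```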

  2. Square-Mile Cells that represent Proprietary Gas-producing Wells from Shale Intervals in the United States

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The U.S. Geological Survey has compiled a map of shale gas assessments in the United States that were completed by 2012, such assessments having been included as...

  3. An Automatic UAV Mapping System for Supporting UN (United Nations) Field Operations

    Science.gov (United States)

    Choi, K.; Cheon, J. W.; Kim, H. Y.; Lee, I.

    2016-06-01

    The United Nations (UN) has performed field operations worldwide, such as peacekeeping or rescue missions. When such an operation is needed, the UN dispatches an operation team, usually with a GIS (Geographic Information System) customized to the specific operation. The base maps for the GIS are generated mostly from satellite images, which may neither have high resolution nor reflect the current situation. To build an up-to-date high-resolution map, we propose a UAV (unmanned aerial vehicle) based automatic mapping system, which can operate in a fully automatic way from the acquisition of sensory data to the data processing for the generation of geospatial products such as a mosaicked orthoimage of a target area. In this study, we analyse the requirements for UN field operations, suggest a UAV mapping system with an operation scenario, and investigate the applicability of the system. With the proposed system, we can construct a tailored GIS with up-to-date, high-resolution base maps for a specific operation efficiently.

  4. Documentation for the 2014 update of the United States national seismic hazard maps

    Science.gov (United States)

    Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter M.; Mueller, Charles S.; Haller, Kathleen M.; Frankel, Arthur D.; Zeng, Yuehua; Rezaeian, Sanaz; Harmsen, Stephen C.; Boyd, Oliver S.; Field, Ned; Chen, Rui; Rukstales, Kenneth S.; Luco, Nico; Wheeler, Russell L.; Williams, Robert A.; Olsen, Anna H.

    2014-01-01

    The national seismic hazard maps for the conterminous United States have been updated to account for new methods, models, and data that have been obtained since the 2008 maps were released (Petersen and others, 2008). The input models are improved from those implemented in 2008 by using new ground motion models that have incorporated about twice as many earthquake strong ground shaking data and by incorporating many additional scientific studies that indicate broader ranges of earthquake source and ground motion models. These time-independent maps are shown for 2-percent and 10-percent probability of exceedance in 50 years for peak horizontal ground acceleration as well as 5-hertz and 1-hertz spectral accelerations with 5-percent damping on a uniform firm rock site condition (760 meters per second shear wave velocity in the upper 30 m, VS30). In this report, the 2014 updated maps are compared with the 2008 version of the maps and indicate changes of plus or minus 20 percent over wide areas, with larger changes locally, caused by the modifications to the seismic source and ground motion inputs.
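    The probability levels quoted above map to mean return periods under the usual Poisson occurrence assumption; a hedged sketch of that standard conversion, not part of the USGS documentation:

```python
import math

def return_period(p_exceed, years):
    """Mean return period implied by a probability of exceedance over an
    exposure time, assuming Poisson earthquake occurrence."""
    return -years / math.log(1.0 - p_exceed)

t_2pct = return_period(0.02, 50.0)   # ~2475 years
t_10pct = return_period(0.10, 50.0)  # ~475 years
```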

  5. Multiscale sampling of plant diversity: Effects of minimum mapping unit size

    Science.gov (United States)

    Stohlgren, T.J.; Chong, G.W.; Kalkhan, M.A.; Schell, L.D.

    1997-01-01

    Only a small portion of any landscape can be sampled for vascular plant diversity because of constraints of cost (salaries, travel time between sites, etc.). Often, the investigator decides to reduce the cost of creating a vegetation map by increasing the minimum mapping unit (MMU), and/or by reducing the number of vegetation classes to be considered. Questions arise about what information is sacrificed when map resolution is decreased. We compared plant diversity patterns from vegetation maps made with 100-ha, 50-ha, 2-ha, and 0.02-ha MMUs in a 754-ha study area in Rocky Mountain National Park, Colorado, United States, using four 0.025-ha and 21 0.1-ha multiscale vegetation plots. We developed and tested species-log(area) curves, correcting the curves for within-vegetation type heterogeneity with Jaccard's coefficients. Total species richness in the study area was estimated from vegetation maps at each resolution (MMU), based on the corrected species-area curves, total area of the vegetation type, and species overlap among vegetation types. With the 0.02-ha MMU, six vegetation types were recovered, resulting in an estimated 552 species (95% CI = 520-583 species) in the 754-ha study area (330 plant species were observed in the 25 plots). With the 2-ha MMU, five vegetation types were recognized, resulting in an estimated 473 species for the study area. With the 50-ha MMU, 439 plant species were estimated for the four vegetation types recognized in the study area. With the 100-ha MMU, only three vegetation types were recognized, resulting in an estimated 341 plant species for the study area. Locally rare species and keystone ecosystems (areas of high or unique plant diversity) were missed at the 2-ha, 50-ha, and 100-ha scales. 
    Evaluating the effects of minimum mapping unit size requires: (1) an initial stratification of homogeneous, heterogeneous, and rare habitat types; and (2) an evaluation of within-type and between-type heterogeneity generated by environmental
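    The species-log(area) extrapolation step described above can be sketched with an ordinary least-squares fit; the plot areas and richness counts below are invented for illustration, and the study's curves were additionally corrected with Jaccard's coefficients:

```python
import math

def fit_species_log_area(areas, richness):
    """Least-squares fit of S = a + b*log10(A), the species-log(area) form
    used to extrapolate richness from plots to map units (sketch only)."""
    x = [math.log10(v) for v in areas]
    n = len(x)
    mx = sum(x) / n
    my = sum(richness) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, richness)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# nested plot areas (ha) and observed species counts (illustrative numbers)
a, b = fit_species_log_area([0.001, 0.01, 0.1, 1.0], [8, 15, 22, 29])
estimate = a + b * math.log10(100.0)  # extrapolate to a 100-ha vegetation type
```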

  6. Voyager Outreach Compilation

    Science.gov (United States)

    1998-01-01

    This NASA JPL (Jet Propulsion Laboratory) video presents a collection of the best videos that have been published of the Voyager mission. Computer animation/simulations comprise the largest portion of the video and include outer planetary magnetic fields, outer planetary lunar surfaces, and the Voyager spacecraft trajectory. Voyager visited the four outer planets: Jupiter, Saturn, Uranus, and Neptune. The video contains some live shots of Jupiter (actual), the Earth's moon (from orbit), Saturn (actual), Neptune (actual) and Uranus (actual), but is mainly comprised of computer animations of these planets and their moons. Some of the individual short videos that are compiled are entitled: The Solar System; Voyage to the Outer Planets; A Tour of the Solar System; and the Neptune Encounter. Computerized simulations of Viewing Neptune from Triton, Diving over Neptune to Meet Triton, and Catching Triton in its Retrograde Orbit are included. Several animations of Neptune's atmosphere, rotation and weather features as well as significant discussion of the planet's natural satellites are also presented.

  7. Preliminary overview map of volcanic hazards in the 48 conterminous United States

    Science.gov (United States)

    Mullineaux, D.R.

    1976-01-01

    Volcanic eruptions and related phenomena can be expected to occur in the Western United States, and in some places are potentially hazardous enough to be considered in long-range land-use planning. But the immediate risk from volcanic hazards is low because eruptions are so infrequent in the conterminous United States that few, if any, occur during any one person's lifetime. Furthermore, severely destructive effects of eruptions, other than extremely rare ones of catastrophic scale, probably would be limited to areas within a few tens of kilometers downvalley or downwind from a volcano. Thus, the area seriously endangered by any one eruption would be only a very small part of the Western United States. The accompanying map identifies areas in which volcanic hazards pose some degree of risk, and shows that the problem is virtually limited to the far western States. The map also shows the possible areal distribution of several kinds of dangerous eruptive events and indicates the relative likelihood of their occurrence at various volcanoes. The kinds of events described here as hazards are those that can occur suddenly and with little or no warning; they do not include long-term geologic processes. Table 1 summarizes the origin and some characteristics of potentially hazardous volcanic phenomena. The map is diagrammatic. It does not show the specific location of the next expected eruption, because such an event cannot be reliably predicted. Instead, the map shows general areas or zones that, over a long period of time, are relatively likely to be affected in one or more places by various kinds of hazardous volcanic events. However, only a small part of one of these areas would be affected by any single eruption.

  8. Mapping quantal touch using 7 Tesla functional magnetic resonance imaging and single-unit intraneural microstimulation.

    Science.gov (United States)

    Sanchez Panchuelo, Rosa Maria; Ackerley, Rochelle; Glover, Paul M; Bowtell, Richard W; Wessberg, Johan; Francis, Susan T; McGlone, Francis

    2016-05-07

    Using ultra-high field 7 Tesla (7T) functional magnetic resonance imaging (fMRI), we map the cortical and perceptual responses elicited by intraneural microstimulation (INMS) of single mechanoreceptive afferent units in the median nerve, in humans. Activations are compared to those produced by applying vibrotactile stimulation to the unit's receptive field, and unit-type perceptual reports are analyzed. We show that INMS and vibrotactile stimulation engage overlapping areas within the topographically appropriate digit representation in the primary somatosensory cortex. Additional brain regions in bilateral secondary somatosensory cortex, premotor cortex, primary motor cortex, insula and posterior parietal cortex, as well as in contralateral prefrontal cortex are also shown to be activated in response to INMS. The combination of INMS and 7T fMRI opens up an unprecedented opportunity to bridge the gap between first-order mechanoreceptive afferent input codes and their spatial, dynamic and perceptual representations in human cortex.

  9. Public Seagrass Compilation for West Coast Essential Fish Habitat (EFH) Environmental Impact Statement

    Data.gov (United States)

    Pacific States Marine Fisheries Commission — These data are a compilation of currently available seagrass GIS data sets for the west coast of the United States. These data have been compiled from seventeen...

  10. Maps showing ground-water units and withdrawal, Basin and Range Province, Texas

    Science.gov (United States)

    Brady, B.T.; Bedinger, M.S.; Mikels, John

    1984-01-01

    This report on ground-water units and withdrawal in the Basin and Range province of Texas (see index map) was prepared as part of a program of the U.S. Geological Survey to identify prospective regions for further study relative to isolation of high-level nuclear waste (Bedinger, Sargent, and Reed, 1984), utilizing program guidelines defined in Sargent and Bedinger (1984). Also included in this report are selected references on pertinent geologic and hydrologic studies of the region. Other map reports in this series contain detailed data on ground-water quality, surface distribution of selected rock types, tectonic conditions, areal geophysics, Pleistocene lakes and marshes, and mineral and energy resources.

  11. Combining forest inventory, satellite remote sensing, and geospatial data for mapping forest attributes of the conterminous United States

    Science.gov (United States)

    Mark Nelson; Greg Liknes; Charles H. Perry

    2009-01-01

    Analysis and display of forest composition, structure, and pattern provides information for a variety of assessments and management decision support. The objective of this study was to produce geospatial datasets and maps of conterminous United States forest land ownership, forest site productivity, timberland, and reserved forest land. Satellite image-based maps of...

  12. Karst Map of Puerto Rico

    Science.gov (United States)

    Aleman-Gonzalez, Wilma B.

    2010-01-01

    This map is a digital compilation, combining the mapping of earlier geologists. Their work, cited on the map, contains more detailed descriptions of karst areas and landforms in Puerto Rico. This map is the basis for the Puerto Rico part of a new national karst map currently being compiled by the U.S. Geological Survey. In addition, this product is a standalone, citable source of digital karst data for Puerto Rico. Nearly 25 percent of the United States is underlain by karst terrain, and a large part of that area is undergoing urban and industrial development. Accurate delineations of karstic rocks are needed at scales suitable for national, State, and local maps. The data on this map contribute to a better understanding of subsidence hazards, groundwater contamination potential, and cave resources as well as serve as a guide to topical research on karst. Because the karst data were digitized from maps having a different scale and projection from those on the base map used for this publication, some karst features may not coincide perfectly with physiographic features portrayed on the base map.

  13. Spatial disaggregation of complex soil map units at regional scale based on soil-landscape relationships

    Science.gov (United States)

    Vincent, Sébastien; Lemercier, Blandine; Berthier, Lionel; Walter, Christian

    2015-04-01

    Accurate soil information over large extents is essential for managing agronomic and environmental issues. Where it exists, information on soil is often sparse or available at a coarser resolution than required. Typically, the spatial distribution of soil at regional scale is represented as a set of polygons defining soil map units (SMU), each one describing several soil types that are not spatially delineated, together with a semantic database describing these objects. Delineating soil types within SMU, i.e., spatially disaggregating SMU, improves the accuracy of soil information using legacy data. The aim of this study was to predict soil types by spatial disaggregation of SMU through a decision tree approach, considering the expert knowledge on soil-landscape relationships embedded in soil databases. The DSMART (Disaggregation and Harmonization of Soil Map Units Through resampled Classification Trees) algorithm developed by Odgers et al. (2014) was used. It requires soil information, environmental covariates, and calibration samples to build and then extrapolate decision trees. To assign a soil type to a particular spatial position, a weighted random allocation approach is applied: each soil type in the SMU is weighted according to its assumed proportion of occurrence in the SMU. Thus, soil-landscape relationships are not considered in the current version of DSMART. Expert rules on soil distribution considering the relief, parent material, and wetland locations were proposed to drive the allocation of soil types to sampled positions, in order to integrate the soil-landscape relationships. Semantic information about the spatial organization of soil types within SMU and exhaustive landscape descriptors were used. In the eastern part of Brittany (NW France), 171 soil types were described; their relative areas in the SMU were estimated, and the geomorphological and geological contexts were recorded. The model predicted 144 soil types. An external validation was performed by comparing predicted
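
    The weighted random allocation step can be sketched in a few lines; the soil-type names and proportions below are hypothetical, and this is a stand-in for the corresponding step of DSMART, not the published implementation.

```python
import random

def allocate_soil_type(smu_composition, rng):
    """Draw one soil type for a sampled position, weighted by its
    assumed proportion of occurrence within the soil map unit (SMU)."""
    soil_types = list(smu_composition)
    weights = [smu_composition[s] for s in soil_types]
    return rng.choices(soil_types, weights=weights, k=1)[0]

# Hypothetical SMU: three soil types with assumed proportions summing to 1.
smu = {"Luvisol": 0.5, "Cambisol": 0.3, "Gleysol": 0.2}
rng = random.Random(42)
draws = [allocate_soil_type(smu, rng) for _ in range(10000)]
print(draws.count("Luvisol") / len(draws))  # close to the assumed 0.5
```

Over many sampled positions, the drawn frequencies approach the assumed SMU proportions, which is what lets the resampled classification trees recover the within-unit pattern.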

  14. A special purpose silicon compiler for designing supercomputing VLSI systems

    Science.gov (United States)

    Venkateswaran, N.; Murugavel, P.; Kamakoti, V.; Shankarraman, M. J.; Rangarajan, S.; Mallikarjun, M.; Karthikeyan, B.; Prabhakar, T. S.; Satish, V.; Venkatasubramaniam, P. R.

    1991-01-01

    Design of general/special purpose supercomputing VLSI systems for numeric algorithm execution involves tackling two important aspects, namely their computational and communication complexities. Developing software tools for designing such systems itself becomes complex, so a novel design methodology has to be developed. Designing such complex systems requires a special purpose silicon compiler in which: the computational and communication structures of different numeric algorithms are taken into account to simplify the silicon compiler design; the approach is macrocell based; and the software tools at different levels (algorithm down to the VLSI circuit layout) are integrated. In this paper a special purpose silicon (SPS) compiler based on PACUBE macrocell VLSI arrays for designing supercomputing VLSI systems is presented. It is shown that turn-around time and silicon real estate are reduced relative to silicon compilers based on PLAs, SLAs, and gate arrays. The first two characteristics mentioned above enable the SPS compiler to perform systolic mapping (at the macrocell level) of algorithms whose computational structures are of GIPOP (generalized inner product outer product) form. Direct systolic mapping on PLAs, SLAs, and gate arrays is very difficult as they are micro-cell based. A novel GIPOP processor is under development using this special purpose silicon compiler.
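
    As a loose illustration of the "generalized inner product" idea behind the GIPOP class (the paper's own definition is more specific, and this is not the PACUBE mapping itself), the add/multiply pair of a matrix product can be made a parameter, so one systolic-style dataflow covers ordinary matrix multiplication and min-plus shortest-path products alike:

```python
def gip_matmul(A, B, add, mul, zero):
    """Matrix product with a pluggable (add, mul, zero) operator triple."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[zero] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            acc = zero
            for k in range(m):
                # the "generalized inner product": fold mul-results with add
                acc = add(acc, mul(A[i][k], B[k][j]))
            C[i][j] = acc
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(gip_matmul(A, B, lambda a, b: a + b, lambda a, b: a * b, 0))  # ordinary product
print(gip_matmul(A, B, min, lambda a, b: a + b, float('inf')))      # min-plus product
```

The same loop nest (and hence the same systolic array shape) serves both instances; only the cell operation changes.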

  15. On Ladder Diagrams Compilation and Synthesis to FPGA Implemented Reconfigurable Logic Controller

    Directory of Open Access Journals (Sweden)

    Adam Milik

    2014-01-01

    The paper presents the synthesis process of a hardware-implemented reconfigurable logic controller from a ladder diagram according to IEC 61131-3 requirements. It focuses on an originally developed high-performance LD processing method, which is able to process a set of diagrams restricted to logic operations in a single clock cycle, independently of the number of processed rungs. The paper considers the compilation of the ladder diagram into an intermediate form suitable for the logic synthesis process according to the developed processing method. The enhanced data flow graph (EDFG) has been developed for the intermediate representation of an LD program. The original construction of the EDFG with attributed edges is described; it allows for efficient representation and processing of logic and arithmetic formulas. A set of compilation algorithms that preserve the serial analysis order while yielding a massively parallel processing unit is presented. An overview of the hardware mapping concludes the presented considerations.
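
    The ladder-diagram semantics being compiled — series contacts as AND, parallel branches as OR, every rung evaluated from one snapshot of the inputs — can be sketched in software; the rungs below are hypothetical, and the tuple encoding is ours, not the paper's EDFG.

```python
def compile_rung(expr):
    """'Compile' a rung given as a nested tuple: ('AND', a, b),
    ('OR', a, b), ('NOT', a), or an input name (string)."""
    if isinstance(expr, str):
        return lambda inp: inp[expr]
    op, *args = expr
    subs = [compile_rung(a) for a in args]
    if op == 'AND':                      # series contacts
        return lambda inp: all(f(inp) for f in subs)
    if op == 'OR':                       # parallel branches
        return lambda inp: any(f(inp) for f in subs)
    if op == 'NOT':                      # normally-closed contact
        return lambda inp: not subs[0](inp)
    raise ValueError(op)

# Two hypothetical rungs: motor = start AND NOT stop; lamp = motor_aux OR test.
program = {
    'motor': compile_rung(('AND', 'start', ('NOT', 'stop'))),
    'lamp':  compile_rung(('OR', 'motor_aux', 'test')),
}
inputs = {'start': True, 'stop': False, 'motor_aux': False, 'test': True}
outputs = {coil: f(inputs) for coil, f in program.items()}
print(outputs)  # {'motor': True, 'lamp': True}
```

Evaluating every compiled rung from one input snapshot mimics the combinational, single-pass processing that the hardware method achieves in one clock cycle.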

  16. MAPPING GLAUCONITE UNITS USING REMOTE SENSING TECHNIQUES IN NORTHEAST IRAN

    Directory of Open Access Journals (Sweden)

    R. Ahmadirouhani

    2014-10-01

    Glauconite is a greenish ferric-iron silicate mineral with a micaceous structure, characteristically formed in shallow marine environments. Glauconite has been used as a pigment in oil paint, as a contaminant remover in environmental studies, and as a source of potassium in plant fertilizers, among other industrial uses. The Koppeh-dagh basin extends across Iran, Afghanistan, and Turkmenistan, and glauconite units occur within it. In this research, to enhance and map glauconitic units in the Koppeh-dagh structural zone in northeastern Iran, remote sensing techniques such as Spectral Angle Mapper (SAM) classification, band ratios, and band composition methods were applied to SPOT, ASTER, and Landsat data in three steps.
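
    The Spectral Angle Mapper at the heart of the SAM classification measures the angle between a pixel spectrum and a reference spectrum treated as vectors in band space; a small angle indicates a similar material regardless of overall brightness. A minimal sketch, with hypothetical reflectance values (not real glauconite spectra):

```python
import math

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper: angle (radians) between two spectra
    treated as vectors in band space."""
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm = (math.sqrt(sum(p * p for p in pixel))
            * math.sqrt(sum(r * r for r in reference)))
    # clamp against floating-point overshoot before acos
    return math.acos(max(-1.0, min(1.0, dot / norm)))

# Hypothetical 4-band reflectance spectra.
glauconite_ref = [0.12, 0.30, 0.25, 0.18]
pixel_a = [0.24, 0.60, 0.50, 0.36]   # same shape, brighter: angle ~ 0
pixel_b = [0.40, 0.10, 0.35, 0.05]   # different shape: larger angle
print(spectral_angle(pixel_a, glauconite_ref))
print(spectral_angle(pixel_b, glauconite_ref))
```

Because the angle ignores vector length, SAM is insensitive to illumination differences, which is why it suits mapping a single target mineral across varied terrain.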

  17. Large scale neural circuit mapping data analysis accelerated with the graphical processing unit (GPU)

    Science.gov (United States)

    Shi, Yulin; Veidenbaum, Alexander V.; Nicolau, Alex; Xu, Xiangmin

    2014-01-01

    Background: Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post-hoc processing and analysis. New Method: Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU-enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. Results: We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU, with up to a 22x speedup depending on the computational task. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. Comparison with Existing Method(s): To the best of our knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Conclusions: Together, GPU-enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. PMID:25277633
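
    Accuracy comparisons between GPU and CPU results often come down to single- versus double-precision arithmetic, since many GPU pipelines default to 32-bit floats. The drift of naive single-precision accumulation can be simulated in pure Python with the standard struct module; the data are synthetic, not the paper's benchmark.

```python
import struct

def to_f32(x):
    """Round a 64-bit Python float to the nearest IEEE 754 single."""
    return struct.unpack('f', struct.pack('f', x))[0]

def sum_f32(values):
    """Naively accumulate in simulated 32-bit precision."""
    acc = 0.0
    for v in values:
        acc = to_f32(acc + to_f32(v))
    return acc

values = [0.1] * 1_000_000
s32 = sum_f32(values)   # drifts by roughly 1% from the ideal 100000
s64 = sum(values)       # double precision: relative error ~1e-11
print(s32, s64)
```

This is the kind of discrepancy a GPU-vs-CPU precision check is designed to catch; compensated summation or double-precision accumulation removes most of it.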

  18. Algorithmic synthesis using Python compiler

    Science.gov (United States)

    Cieszewski, Radoslaw; Romaniuk, Ryszard; Pozniak, Krzysztof; Linczuk, Maciej

    2015-09-01

    This paper presents a Python-to-VHDL compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and translates it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the programmed circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. This can be achieved by using many computational resources at the same time. Creating parallel programs implemented in FPGAs in pure HDL is difficult and time consuming. By using a higher level of abstraction and a High-Level Synthesis compiler, implementation time can be reduced. The compiler has been implemented using the Python language. This article describes the design, implementation, and results of the created tools.
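
    The core translation step of such a compiler can be caricatured with Python's ast module: parse an expression, walk the tree, and emit the VHDL spelling of each operator. This toy handles a handful of operators only and illustrates the idea, not the compiler described above.

```python
import ast

# Map Python AST operator nodes to their VHDL spellings.
_OPS = {ast.Add: '+', ast.Sub: '-', ast.Mult: '*',
        ast.BitAnd: 'and', ast.BitOr: 'or', ast.BitXor: 'xor'}

def py_expr_to_vhdl(src):
    """Translate a simple Python expression to a VHDL-like expression
    string, fully parenthesized to make evaluation order explicit."""
    def emit(node):
        if isinstance(node, ast.Expression):
            return emit(node.body)
        if isinstance(node, ast.BinOp):
            return f"({emit(node.left)} {_OPS[type(node.op)]} {emit(node.right)})"
        if isinstance(node, ast.Name):
            return node.id
        if isinstance(node, ast.Constant):
            return str(node.value)
        raise NotImplementedError(type(node).__name__)
    return emit(ast.parse(src, mode='eval'))

print(py_expr_to_vhdl("a & b | c"))   # ((a and b) or c)
print(py_expr_to_vhdl("x * 2 + y"))   # ((x * 2) + y)
```

A real HLS flow additionally schedules operations onto clock cycles and allocates hardware resources; this sketch only shows the syntax-directed part of the translation.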

  19. Theory and practice of compilation

    CERN Document Server

    Langmaack, H

    1972-01-01

    Compilation is the translation of high level language programs into machine code. Correct translation can only be achieved if the syntax and semantics of programming languages are clearly defined and strictly obeyed by compiler constructors. The author presents a simple extendable scheme for defining syntax and semantics rigorously. This scheme fits many programming languages, especially ALGOL-like ones. The author considers statements and programs to be notations of state transformations; in special cases, storage state transformations. (5 refs).
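
    The "statements as state transformations" view can be modelled directly: a state maps variable names to values, each statement denotes a function from state to state, and sequencing a program is function composition. The three-statement program below is hypothetical, purely to make the semantics concrete.

```python
def assign(var, f):
    """Build the state transformation denoted by 'var := f(state)'."""
    def step(state):
        new = dict(state)       # a storage state transformation
        new[var] = f(state)
        return new
    return step

# A hypothetical program: x := 2; y := x + 3; x := x * y
program = [
    assign('x', lambda s: 2),
    assign('y', lambda s: s['x'] + 3),
    assign('x', lambda s: s['x'] * s['y']),
]

state = {}
for stmt in program:            # sequencing = composing the transformations
    state = stmt(state)
print(state)  # {'x': 10, 'y': 5}
```

A compiler is then correct when the machine code it emits denotes the same state transformation as the source statement it came from.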

  20. Bedrock geologic map of Vermont

    Science.gov (United States)

    Ratcliffe, Nicholas M.; Stanley, Rolfe S.; Gale, Marjorie H.; Thompson, Peter J.; Walsh, Gregory J.; With contributions by Hatch, Norman L.; Rankin, Douglas W.; Doolan, Barry L.; Kim, Jonathan; Mehrtens, Charlotte J.; Aleinikoff, John N.; McHone, J. Gregory; Cartography by Masonic, Linda M.

    2011-01-01

    The Bedrock Geologic Map of Vermont is the result of a cooperative agreement between the U.S. Geological Survey (USGS) and the State of Vermont. The State's complex geology spans 1.4 billion years of Earth's history. The new map comes 50 years after the most recent map of the State by Charles G. Doll and others in 1961 and a full 150 years since the publication of the first geologic map of Vermont by Edward Hitchcock and others in 1861. At a scale of 1:100,000, the map shows an uncommon level of detail for State geologic maps. Mapped rock units are primarily based on lithology, or rock type, to facilitate derivative studies in multiple disciplines. The 1961 map was compiled from 1:62,500-scale or smaller maps. The current map was created to integrate more detailed (1:12,000- to 1:24,000-scale) modern and older (1:62,500-scale) mapping with the theory of plate tectonics to provide a framework for geologic, tectonic, economic, hydrogeologic, and environmental characterization of the bedrock of Vermont. The printed map consists of three oversize sheets (52 x 76 inches). Sheets 1 and 2 show the southern and northern halves of Vermont, respectively, and can be trimmed and joined so that the entire State can be displayed as a single entity. These sheets also include 10 cross sections and a geologic structure map. Sheet 3 on the front consists of descriptions of 486 map units, a correlation of map units, and references cited. Sheet 3 on the back features a list of the 195 sources of geologic map data keyed to an index map of 7.5-minute quadrangles in Vermont, as well as a table identifying ages of rocks dated by uranium-lead zircon geochronology.

  1. The General Urban Plan of Casimcea territorial administrative unit, map of natural and anthropogenic risks

    Directory of Open Access Journals (Sweden)

    Sorin BĂNICĂ

    2013-08-01

    The General Urban Plan (GUP) represents the legal basis for any proposed development action. After endorsement and approval as required by law, the GUP is an act of authority of the local government for the area in which it applies. Its aim is to establish the priority regulations applied in land-use planning and the construction of structures. Geographically, the administrative territory of Casimcea, Tulcea County, lies in the central-northwest part of the Casimcea Plateau, the second unit of the Central Dobrogea Plateau. Its location in southeastern Romania, its climatic and relief conditions, and anthropogenic pressure expose the Casimcea territorial administrative unit to permanent susceptibility to natural and anthropogenic risks. In this context, we identified the following categories: (i) natural risk phenomena (earthquakes, strong winds, heavy rains, floods caused by overflowing rivers or precipitation, erosion of river banks and torrents, gravitational processes, raindrop erosion, and surface soil erosion); and (ii) anthropogenic risk phenomena (overgrazing, use of chemicals in agriculture, road transport and electricity infrastructure, wind turbines for electricity production, waste deposits, agro-zootechnical complexes, and cemeteries). Their extent was mapped by creating a map of natural and anthropogenic risks for the Casimcea territorial administrative unit, expressing the share of potentially affected areas as a territorial balance.

  2. Geochemical and mineralogical maps for soils of the conterminous United States

    Science.gov (United States)

    Smith, David B.; Cannon, William F.; Woodruff, Laurel G.; Solano, Federico; Ellefsen, Karl J.

    2014-01-01

    The U.S. Geological Survey began sampling in 2007 for a low-density (1 site per 1,600 square kilometers, 4,857 sites) geochemical and mineralogical survey of soils in the conterminous United States as part of the North American Soil Geochemical Landscapes Project. The sampling protocol for the national-scale survey included, at each site, a sample from a depth of 0 to 5 centimeters, a composite of the soil A horizon, and a deeper sample from the soil C horizon or, if the top of the C horizon was at a depth greater than 1 meter, a sample from a depth of approximately 80–100 centimeters. The samples were analyzed for their elemental content by methods that yield the total or near-total concentration of each element. The major mineralogical components in the samples from the soil A and C horizons were determined by a quantitative X-ray diffraction method using Rietveld refinement. Sampling in the conterminous United States was completed in 2010, with chemical and mineralogical analyses completed in May 2013. The resulting data set provides an estimate of the abundance and spatial distribution of chemical elements and minerals in soils of the conterminous United States and represents a baseline for soil geochemistry and mineralogy against which future changes may be recognized and quantified. This report releases geochemical and mineralogical maps along with a histogram, boxplot, and empirical cumulative distribution function plot for each element or mineral.
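
    The empirical cumulative distribution function plotted for each element is simple to compute: sort the n concentrations and assign the i-th smallest a cumulative probability of i/n. The copper values below are hypothetical, for illustration only.

```python
def ecdf(values):
    """Return sorted values and their empirical cumulative probabilities."""
    xs = sorted(values)
    n = len(xs)
    return xs, [(i + 1) / n for i in range(n)]

# Hypothetical copper concentrations (mg/kg) at eight sites.
cu = [12.0, 30.5, 8.2, 19.9, 25.1, 14.4, 41.0, 17.3]
xs, ps = ecdf(cu)
print(list(zip(xs, ps)))
```

Reading the curve at a given concentration gives the fraction of sites at or below that level, which is how such a baseline supports detecting future shifts.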

  3. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances, and it becomes clear that current and future computer architectures pose immense challenges to compiler designers, challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  4. The mapping methods and division of tectonic units of the regional tectonic map in the eastern China seas and adjacent regions

    Institute of Scientific and Technical Information of China (English)

    YIN Yanhong; ZHANG Xunhua; WEN Zhenhe; GUO Zhenxuan

    2009-01-01

    The geological-geophysical map series of the eastern China seas and adjacent regions (1:1,000,000) will be published in the second half of 2009. The regional tectonic map is one of the main professional maps. The mapping methods, the method of dividing geological tectonic units, and the main tectonic units are discussed here. The strata from the Pliocene to the Holocene are peeled off so as to display the pre-Pliocene structures. In basins, isopachs are drawn for the Cenozoic deposits. Plate tectonic theory and the present tectonic pattern are given priority in the tectonic division. The division of intraplate tectonic units is a revision, complement, and improvement of previous dividing systems, and the nomenclature for each tectonic unit follows the current system in China. The first-order tectonic unit is the plate (Pacific Plate, Eurasian Plate, and Philippine Sea Plate). The second-order tectonic unit is the tectonic domain (East Asian continental tectonic domain, East Asian continental margin tectonic domain, and West Pacific tectonic domain). The Philippine Sea Plate and the western part of the Pacific Plate are called the West Pacific tectonic domain. The part of the Eurasian Plate within the study area can be further divided into the East Asian continental tectonic domain and the East Asian continental margin tectonic domain. The East Asian continental margin domain is composed of the Ryukyu island arc, the Okinawa Trough back-arc basin, and the back-arc basin of the Sea of Japan. The East Asian continental tectonic domain in the study area is composed of the Sino-Korea Massif, the Changjiang River (Yangtze) Massif, and the South China Massif. In turn, these massifs consist of basins, folded belts, or uplift zones, which are further divided into uplifts and depressions made up of sags and swells.

  5. Understanding Variations of Soil Mapping Units and Associated Data for Forensic Science.

    Science.gov (United States)

    Suarez, Melissa D; Southard, Randal J; Parikh, Sanjai J

    2015-07-01

    Soil samples have potential to be useful in forensic investigations, but their utility may be limited due to the inherent variability of soil properties, the wide array of analytical methods, and complexity of data analysis. This study examined the differentiation of similar soils based on both gross (texture, color, mineralogy) and explicit soil properties (elemental composition, cation exchange, Fe-oxyhydroxides). Soils were collected from Fallbrook and adjacent map units from Riverside and San Diego Counties in California. Samples were characterized using multiple techniques, including chemical extracts, X-ray diffraction (XRD), and Fourier transform infrared spectroscopy. Results were analyzed using multiple analytical approaches to compare counties and land uses. Some analyses (XRD, extractions) were better at distinguishing among samples than others (color, texture). Ratios of rare earth elements were particularly useful for distinguishing samples between counties. This potential to "fingerprint" soils illustrates the usefulness of a comprehensive soil database for criminal investigators.
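
    The ratio-based "fingerprinting" idea can be sketched as a nearest-centroid comparison on a rare-earth-element ratio; the La/Yb ratio choice, county means, and sample values below are hypothetical, not the study's data.

```python
def ree_ratio(sample):
    """Light/heavy rare-earth-element ratio (La/Yb) for one sample."""
    return sample['La'] / sample['Yb']

# Hypothetical mean La/Yb ratios for the two counties' soils.
county_means = {'Riverside': 9.0, 'San Diego': 14.0}

def classify(sample):
    """Assign a sample to the county whose mean ratio it is closest to."""
    r = ree_ratio(sample)
    return min(county_means, key=lambda c: abs(county_means[c] - r))

unknown = {'La': 27.0, 'Yb': 2.1}   # ratio ~12.9, closer to the San Diego mean
print(classify(unknown))
```

Real forensic use would combine several ratios and account for within-county variability, but the principle is the same: ratios cancel dilution effects and isolate provenance signal.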

  6. Transmission Maps of the ACIS UV/Optical Blocking Filter Flight Units

    Science.gov (United States)

    Townsley, L. K.; Broos, P. S.; Mackay, J. F.

    1996-05-01

    The AXAF CCD Imaging Spectrometer (ACIS) employs filters made of Lexan coated on both sides with aluminum to block optical and UV light, so that the CCDs see only X-radiation from astronomical targets. These filters must be characterized by spatially mapping their transmission at various astrophysically and instrumentally important energies. The Penn State University ACIS team determined that a synchrotron, where a variety of well-determined X-ray energies is available, would provide the best calibration. We measured engineering grade UV/optical blocking filters at the University of Wisconsin Synchrotron Radiation Center (SRC) in June and October 1995, modified the hardware and software on a dry run in January 1996, and just completed the calibration of the flight filters in March 1996. The Multilayer Beamline at the SRC was used for these measurements because it can access several energies important to the calibration and its built-in, computer-controlled x-z stage allows us to map the filters automatically with user-specified spatial resolution. These transmission maps formed the basis for choosing the actual flight filter units from the set of filters manufactured with flight specifications. We obtained transmission measurements at five energies in the range 200-2000 eV. We present here best-fit models of the filter transmission based on these data points. Better than one percent accuracy in transmission as a function of energy was achieved over the entire filter area on scales corresponding to thirty arcseconds in the focal plane of AXAF (the amplitude of the planned aspect dither of the spacecraft). The pair of filters (one for the Imaging array and one for the Spectroscopy array) selected for flight will be installed on the ACIS focal plane in early summer.
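
    X-ray filter transmission follows the Beer-Lambert form T(E) = exp(-mu(E) t) for areal thickness t, so a best-fit thickness can be recovered from transmission measurements by least squares on y = -ln T, giving t = sum(mu_i y_i) / sum(mu_i^2). The attenuation coefficients and measurements below are synthetic, not real Lexan/aluminum data.

```python
import math

mu = [4.0, 2.5, 1.2, 0.6]     # hypothetical attenuation per unit thickness
t_true = 0.30                 # hypothetical filter thickness
T_meas = [math.exp(-m * t_true) for m in mu]   # noise-free synthetic data

# Linearize: -ln T = mu * t, then solve the one-parameter least squares.
y = [-math.log(T) for T in T_meas]
t_fit = sum(m * yi for m, yi in zip(mu, y)) / sum(m * m for m in mu)
print(t_fit)  # recovers 0.30
```

With real data the residuals of such a fit reveal pinholes or thickness non-uniformity, which is why transmission is mapped spatially rather than measured at a single spot.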

  7. Mapping the terrestrial reptile distributions in Oman and the United Arab Emirates

    Directory of Open Access Journals (Sweden)

    Andrew Gardner

    2009-12-01

    Full Text Available The terrestrial reptile fauna of Oman and the United Arab Emirates is rich, with at least 79 species of lizards and snakes and a single species of worm lizard. However, to date there have been no accurate maps published of their distribution ranges, and distribution data relies on scattered museum specimen localities and published accounts. Considerable numbers of locality data points do exist, collected by visting and resident herpetologists, and more recently, from ecologists working on surveys for environmental impact assessments and biodiversity action plans. These data are invaluable, as amongst other uses, they can assist conservation planning and management, and will eventually document changes in distributions over time. This is especially true where there has been extensive habitat loss and degradation due to urbanisation and development activities. Data have been collected from museum records, published accounts and unpublished data from a variety of sources, including many records made by the author over the last 20 years, with the aim of producing an atlas of species distributions. The number of records is now approaching 5.000, giving sufficient coverage to produce maps that are useful for a variety of applications. Examples are discussed, including endangered and endemic species, snakes of medical importance and species of potential interest in ecological and evolutionary studies.

  8. Certifying cost annotations in compilers

    CERN Document Server

    Amadio, Roberto M; Régis-Gianas, Yann; Saillard, Ronan

    2010-01-01

    We discuss the problem of building a compiler which can lift, in a provably correct way, pieces of information on the execution cost of the object code to cost annotations on the source code. To this end, we need a clear and flexible picture of: (i) the meaning of cost annotations, (ii) the method to prove them sound and precise, and (iii) the way such proofs can be composed. We propose a so-called labelling approach to these three questions. As a first step, we examine its application to a toy compiler. This formal study suggests that the labelling approach has good compositionality and scalability properties. In order to provide further evidence for this claim, we report our successful experience in implementing and testing the labelling approach on top of a prototype compiler written in OCaml for (a large fragment of) the C language.
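
    The labelling approach can be caricatured in a few lines: the compiler attaches a cost label to each source block, the object code increments the corresponding counters, and the annotation claimed for a label must equal the cost actually incurred along any execution trace. The labels and unit costs below are invented, purely to show the bookkeeping.

```python
# Hypothetical cost table: cycles charged per labelled source block.
COSTS = {'L1': 3, 'L2': 5, 'L3': 2}

def run(trace):
    """Sum the cost annotations along an execution trace of labels;
    soundness means this total matches the object code's real cost."""
    return sum(COSTS[label] for label in trace)

# A run that executes L1 once, then the L2/L3 loop body twice:
trace = ['L1', 'L2', 'L3', 'L2', 'L3']
print(run(trace))  # 3 + 2*(5 + 2) = 17
```

The compositionality claim is visible even here: the cost of a trace is the sum of the costs of its labelled pieces, so proofs about blocks compose into proofs about programs.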

  9. Mapping Natural Terroir Units using a multivariate approach and legacy data

    Science.gov (United States)

    Priori, Simone; Barbetti, Roberto; L'Abate, Giovanni; Bucelli, Piero; Storchi, Paolo; Costantini, Edoardo A. C.

    2014-05-01

    A Natural Terroir Unit (NTU) is a volume of the earth's biosphere characterized by a stable set of variables related to topography, climate, geology, and soil. Methods to study the soil-climate-vine association are numerous, but the main question is always: which variables are actually important for the quality and typicality of grapevines, and hence wine, at a particular scale? This work aimed to set up a multivariate methodology to define viticultural terroirs at the province scale (1:125,000), using viticultural and oenological legacy data. The study area was the province of Siena in the Tuscany region (Central Italy). The reference grapevine cultivar was "Sangiovese", the most important cultivar of the region. The methodology was based upon the creation of a GIS storing viticultural and oenological legacy data from 55 experimental vineyards (vintages between 1989 and 2009), long-term climate data, a digital elevation model, the soil-landscapes (land systems), and soil profiles with their analyses. The selected viticultural and oenological parameters were: must sugar content, sugar accumulation rate from veraison to harvest, must titratable acidity, grape yield per vine, number of bunches per vine, mean bunch weight, and mean berry weight. The environmental parameters related to viticulture, selected by an explorative PCA, were: elevation, mean annual temperature, mean soil temperature, annual precipitation, clay, sand and gravel content of soils, soil water availability, redoximorphic features, and rooting depth. The geostatistical model for interpolating each variable was chosen according to the best mean standardized error, obtained by cross-validation, among "Simple cokriging with varying local mean", "Multicollocated simple cokriging with varying local mean", and "Regression kriging". These variables were then used in a k-means clustering aimed at mapping the Natural Terroir Units (NTUs).
    The viticultural areas of Siena province
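
    The final clustering step can be sketched with a minimal k-means over standardized environmental variables; the site data (elevation in m, mean annual temperature in degC) are hypothetical, and the deterministic initialisation is ours, chosen for reproducibility.

```python
import statistics

# Hypothetical sites: (elevation m, mean annual temperature degC).
sites = [(250, 14.5), (260, 14.2), (240, 14.8), (255, 14.4),
         (520, 11.9), (540, 11.5), (510, 12.1), (530, 11.7)]

def standardize(col):
    """Zero-mean, unit-variance scaling so variables weigh equally."""
    m, s = statistics.mean(col), statistics.pstdev(col)
    return [(x - m) / s for x in col]

data = list(zip(*(standardize(c) for c in zip(*sites))))

def dist2(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(points, k, iters=10):
    # Deterministic init (first and last point); assumes no cluster empties.
    centers = [points[0], points[-1]][:k]
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: dist2(p, centers[j]))
                  for p in points]
        centers = [tuple(statistics.mean(c)
                         for c in zip(*(p for p, l in zip(points, labels) if l == j)))
                   for j in range(k)]
    return labels

labels = kmeans(data, k=2)
print(labels)  # [0, 0, 0, 0, 1, 1, 1, 1]: low/warm vs high/cool sites
```

Each resulting cluster is a candidate NTU: a region with a homogeneous combination of the standardized environmental variables.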

  10. USGS Small-scale Dataset - Global Map: 1:1,000,000-Scale Political Areas of the United States 201403 Shapefile

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing the counties and equivalent entities of the United States, Puerto Rico, and the U.S. Virgin Islands. States and the...

  11. USGS Small-scale Dataset - Global Map: 1:1,000,000-Scale Inland Water Areas of the United States 201406 Shapefile

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing waterbodies and wetlands of the United States, Puerto Rico, and the U.S. Virgin Islands. The data are a modified...

  12. USGS Small-scale Dataset - Global Map: 1:1,000,000-Scale Streams of the United States 201406 FileGDB 10.1

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing streams in the United States, Puerto Rico, and the U.S. Virgin Islands. The data are a modified version of the...

  13. USGS Small-scale Dataset - Global Map: 1:1,000,000-Scale Canals and Aqueducts of the United States 201406 FileGDB 10.1

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing the canals, aqueducts, and the Intracoastal Waterway in the United States, Puerto Rico, and the U.S. Virgin Islands....

  14. USGS Small-scale Dataset - Global Map: 1:1,000,000-Scale Political Boundary Lines of the United States 201403 FileGDB 10.1

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing the boundaries of counties and equivalent entities of the United States, Puerto Rico, and the U.S. Virgin Islands....

  15. Broad-band Rayleigh wave phase velocity maps (10-150 s) across the United States from ambient noise data

    Science.gov (United States)

    Zhao, Kaifeng; Luo, Yinhe; Xie, Jun

    2017-02-01

    In this study, we demonstrate the feasibility of imaging broad-band (10-150 s) Rayleigh wave phase velocity maps on a continental scale using ambient noise tomography (ANT). We obtain broad-band Rayleigh waves from cross-correlations of ambient noise data between all station pairs of USArray and measure the dispersion curves from these cross-correlations at a period band of 10-150 s. The large-scale dense USArray enables us to obtain over 500 000 surface wave paths which cover the contiguous United States densely. Using these paths, we generate Rayleigh wave phase velocity maps at 10-150 s periods. Our phase velocity maps are similar to other reported phase velocity maps based on ambient noise data at short periods. The broad-band phase velocity maps from ANT can be used to construct 3-D lithospheric and asthenospheric velocity structures.
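
The core mechanism of ANT, cross-correlating two stations' noise records so that the correlation peak recovers the inter-station travel-time lag, can be illustrated on synthetic data. This is a toy sketch with an invented lag, not the USArray processing chain:

```python
import random

rng = random.Random(1)
n, lag = 512, 37                    # hypothetical record length and inter-station lag (samples)
source = [rng.gauss(0, 1) for _ in range(n + lag)]   # shared ambient-noise wavefield
sta_a = source[:n]                  # record at station A
sta_b = source[lag:lag + n]         # station B sees the same field `lag` samples earlier

def xcorr_lag(a, b):
    """Return the lag (in samples) that maximizes the cross-correlation of a and b."""
    best_k, best_val = 0, float("-inf")
    for k in range(-(len(a) - 1), len(a)):
        val = sum(a[t + k] * b[t] for t in range(len(b)) if 0 <= t + k < len(a))
        if val > best_val:
            best_k, best_val = k, val
    return best_k

measured = xcorr_lag(sta_a, sta_b)  # recovers the imposed inter-station lag
```

Dividing the recovered lag by the inter-station distance is what yields the dispersion (phase velocity vs. period) measurements used to build the maps.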

  16. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables.Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees.With the proliferation of open source, understanding these issues is increasingly the res

  17. A compiler for variational forms

    CERN Document Server

    Kirby, Robert C; 10.1145/1163641.1163644

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in some cases the speedup is as large as a factor of 1000.

  18. Suppressing escape events in maps of the unit interval with demographic noise

    Science.gov (United States)

    Parra-Rojas, César; Challenger, Joseph D.; Fanelli, Duccio; McKane, Alan J.

    2016-11-01

    We explore the properties of discrete-time stochastic processes with a bounded state space, whose deterministic limit is given by a map of the unit interval. We find that, in the mesoscopic description of the system, the large jumps between successive iterates of the process can result in probability leaking out of the unit interval, despite the fact that the noise is multiplicative and vanishes at the boundaries. By including higher-order terms in the mesoscopic expansion, we are able to capture the non-Gaussian nature of the noise distribution near the boundaries, but this does not preclude the possibility of a trajectory leaving the interval. We propose a number of prescriptions for treating these escape events, and we compare the results with those obtained for the metastable behavior of the microscopic model, where escape events are not possible. We find that, rather than truncating the noise distribution, censoring this distribution to prevent escape events leads to results which are more consistent with the microscopic model. The addition of higher moments to the noise distribution does not increase the accuracy of the final results, and it can be replaced by the simpler Gaussian noise.
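
One way to picture the censoring prescription is to iterate a deterministic map of the unit interval with demographic noise that vanishes at the boundaries, clamping any escaping iterate back onto the boundary. The logistic map, parameter values, and effective population size below are illustrative choices, not the specific model of the paper:

```python
import random

def noisy_map(x0, r=3.7, N=1000, steps=200, seed=2):
    """Iterate x -> r*x*(1-x) with multiplicative noise of strength sqrt(x(1-x)/N),
    censoring escapes by clamping iterates back into [0, 1]."""
    rng = random.Random(seed)
    x, traj = x0, [x0]
    for _ in range(steps):
        mean = r * x * (1.0 - x)             # deterministic map of the unit interval
        sigma = (x * (1.0 - x) / N) ** 0.5   # demographic noise, vanishes at 0 and 1
        x = mean + rng.gauss(0.0, sigma)
        x = min(1.0, max(0.0, x))            # censoring: no probability leaks out
        traj.append(x)
    return traj

traj = noisy_map(0.3)
```

Truncation would instead redraw (or renormalize over) noise values that leave the interval; the clamp above moves that escaped probability mass onto the boundary.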

  19. Quantitative analysis of terrain units mapped in the northern quarter of Venus from Venera 15/16 data

    Science.gov (United States)

    Schaber, G. G.

    1991-01-01

    The contacts between 34 geological/geomorphic terrain units in the northern quarter of Venus mapped from Venera 15/16 data were digitized and converted to a Sinusoidal Equal-Area projection. The result was then registered with a merged Pioneer Venus/Venera 15/16 altimetric database, root mean square (rms) slope values, and radar reflectivity values derived from Pioneer Venus. The resulting information includes comparisons among individual terrain units and terrain groups to which they are assigned in regard to percentage of map area covered, elevation, rms slopes, distribution of suspected craters greater than 10 km in diameter.

  20. 1988 Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  1. Detailed mapping of surface units on Mars with HRSC color data

    Science.gov (United States)

    Combe, J.-Ph.; Wendt, L.; McCord, T. B.; Neukum, G.

    2008-09-01

    Introduction: Making use of HRSC color data Mapping outcrops of clays, sulfates and ferric oxides are basis information to derive the climatic, tectonic and volcanic evolution of Mars, especially the episodes related to the presence of liquid water. The challenge is to resolve spatially the outcrops and to distinguish these components from the globally-driven deposits like the iron oxide-rich bright red dust and the basaltic dark sands. The High Resolution Stereo Camera (HRSC) onboard Mars-Express has five color filters in the visible and near infrared that are designed for visual interpretation and mapping various surface units [1]. It provides also information on the topography at scale smaller than a pixel (roughness) thanks to the different geometry of observation for each color channel. The HRSC dataset is the only one that combines global coverage, 200 m/pixel spatial resolution or better and filtering colors of light. The present abstract is a work in progress (to be submitted to Planetary and Space Science) that shows the potential and limitations of HRSC color data as visual support and as multispectral images. Various methods are described from the most simple to more complex ones in order to demonstrate how to make use of the spectra, because of the specific steps of processing they require [2-4]. The objective is to broaden the popularity of HRSC color data, as they could be used more widely by the scientific community. Results prove that imaging spectrometry and HRSC color data complement each other for mapping outcrops types. Example regions of interest HRSC is theoretically sensitive to materials with absorption features in the visible and near-infrared up to 1 μm. Therefore, oxide-rich red dust and basalts (pyroxenes) can be mapped, as well as very bright components like water ice [5, 6]. Possible detection of other materials still has to be demonstrated. We first explore regions where unusual mineralogy appears clearly from spectral data. 
Hematite

  2. Generating a Pattern Matching Compiler by Partial Evaluation

    DEFF Research Database (Denmark)

    Jørgensen, Knud Jesper

    1991-01-01

    Computer science, partial evaluation, compiling, denotational semantics, pattern matching, semantics-directed compiler generation.

  3. Geodesy- and geology-based slip-rate models for the Western United States (excluding California) national seismic hazard maps

    Science.gov (United States)

    Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne

    2014-01-01

    The 2014 National Seismic Hazard Maps for the conterminous United States incorporate additional uncertainty in the fault slip-rate parameter that controls earthquake-activity rates, beyond what was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. The models considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert-opinion model and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight with equal weighting for the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. Resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.

  4. HAL/S-FC compiler system specifications

    Science.gov (United States)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  5. POLARIS: A 30-meter probabilistic soil series map of the contiguous United States

    Science.gov (United States)

    Chaney, Nathaniel W; Wood, Eric F; McBratney, Alexander B; Hempel, Jonathan W; Nauman, Travis; Brungard, Colby W.; Odgers, Nathan P

    2016-01-01

    A new complete map of soil series probabilities has been produced for the contiguous United States at a 30 m spatial resolution. This innovative database, named POLARIS, is constructed using available high-resolution geospatial environmental data and a state-of-the-art machine learning algorithm (DSMART-HPC) to remap the Soil Survey Geographic (SSURGO) database. This 9 billion grid cell database is possible using available high performance computing resources. POLARIS provides a spatially continuous, internally consistent, quantitative prediction of soil series. It offers potential solutions to the primary weaknesses in SSURGO: 1) unmapped areas are gap-filled using survey data from the surrounding regions, 2) the artificial discontinuities at political boundaries are removed, and 3) the use of high resolution environmental covariate data leads to a spatial disaggregation of the coarse polygons. The geospatial environmental covariates that have the largest role in assembling POLARIS over the contiguous United States (CONUS) are fine-scale (30 m) elevation data and coarse-scale (~ 2 km) estimates of the geographic distribution of uranium, thorium, and potassium. A preliminary validation of POLARIS using the NRCS National Soil Information System (NASIS) database shows variable performance over CONUS. In general, the best performance is obtained at grid cells where DSMART-HPC is most able to reduce the chance of misclassification. The important role of environmental covariates in limiting prediction uncertainty suggests including additional covariates is pivotal to improving POLARIS' accuracy. This database has the potential to improve the modeling of biogeochemical, water, and energy cycles in environmental models; enhance availability of data for precision agriculture; and assist hydrologic monitoring and forecasting to ensure food and water security.

  6. Improved predictive mapping of indoor radon concentrations using ensemble regression trees based on automatic clustering of geological units.

    Science.gov (United States)

    Kropat, Georg; Bochud, Francois; Jaboyedoff, Michel; Laedermann, Jean-Pascal; Murith, Christophe; Palacios Gruson, Martha; Baechler, Sébastien

    2015-09-01

    According to estimates, around 230 people die as a result of radon exposure in Switzerland. This public health concern makes reliable indoor radon prediction and mapping methods necessary in order to improve risk communication to the public. The aim of this study was to develop an automated method to classify lithological units according to their radon characteristics and to develop mapping and predictive tools in order to improve local radon prediction. About 240 000 indoor radon concentration (IRC) measurements in about 150 000 buildings were available for our analysis. The automated classification of lithological units was based on k-medoids clustering via pair-wise Kolmogorov distances between IRC distributions of lithological units. For IRC mapping and prediction we used random forests and Bayesian additive regression trees (BART). The automated classification groups lithological units well in terms of their IRC characteristics. Especially the IRC differences in metamorphic rocks like gneiss are well revealed by this method. The maps produced by random forests soundly represent the regional difference of IRCs in Switzerland and improve the spatial detail compared to existing approaches. We could explain 33% of the variations in IRC data with random forests. Additionally, the variable-importance evaluation by random forests shows that building characteristics are less important predictors for IRCs than spatial/geological influences. BART could explain 29% of IRC variability and produced maps that indicate the prediction uncertainty. Ensemble regression trees are a powerful tool to model and understand the multidimensional influences on IRCs. Automatic clustering of lithological units complements this method by facilitating the interpretation of radon properties of rock types. This study provides an important element for radon risk communication. Future approaches should consider taking into account further variables like soil gas radon measurements as
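
The classification step, k-medoids over pairwise Kolmogorov distances between the IRC distributions of lithological units, can be sketched as follows. The four tiny "units" are invented samples; the actual study clustered real IRC distributions drawn from roughly 240 000 measurements:

```python
import bisect

def ks_distance(a, b):
    """Two-sample Kolmogorov distance: maximum gap between empirical CDFs."""
    a, b = sorted(a), sorted(b)
    pooled = sorted(set(a + b))
    return max(abs(bisect.bisect_right(a, x) / len(a) - bisect.bisect_right(b, x) / len(b))
               for x in pooled)

def k_medoids(items, k, dist, iters=20):
    """Naive k-medoids: assign items to the nearest medoid, then let each cluster's
    medoid be the member minimizing total within-cluster distance."""
    medoids = list(range(k))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for i in range(len(items)):
            m = min(range(k), key=lambda j: dist(items[i], items[medoids[j]]))
            clusters[m].append(i)
        medoids = [min(c, key=lambda i: sum(dist(items[i], items[j]) for j in c)) if c else medoids[m]
                   for m, c in enumerate(clusters)]
    return medoids, clusters

# hypothetical IRC samples for four lithological units (two low-radon, two high-radon)
units = [[1, 2, 3, 2], [10, 12, 11, 13], [2, 3, 2, 1], [11, 10, 12, 12]]
medoids, clusters = k_medoids(units, 2, ks_distance)
```

Using the full Kolmogorov distance between distributions, rather than a summary statistic like the mean, is what lets units with similar medians but different tails be separated.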

  7. Soil Moisture Mapping in an Arid Area Using a Land Unit Area (LUA) Sampling Approach and Geostatistical Interpolation Techniques

    Directory of Open Access Journals (Sweden)

    Saeid Gharechelou

    2016-03-01

    Full Text Available Soil moisture (SM) plays a key role in many environmental processes and has a high spatial and temporal variability. Collecting sample SM data through field surveys (e.g., for validation of remote sensing-derived products) can be very expensive and time consuming if a study area is large, and producing accurate SM maps from the sample point data is a difficult task as well. In this study, geospatial processing techniques are used to combine several geo-environmental layers relevant to SM (soil, geology, rainfall, land cover, etc.) into a land unit area (LUA) map, which delineates regions with relatively homogeneous geological/geomorphological, land use/land cover, and climate characteristics. This LUA map is used to guide the collection of sample SM data in the field, and the field data is finally spatially interpolated to create a wall-to-wall map of SM in the study area (Garmsar, Iran). The main goal of this research is to create a SM map in an arid area, using a land unit area (LUA) approach to obtain the most appropriate sample locations for collecting SM field data. Several environmental GIS layers, which have an impact on SM, were combined to generate a LUA map, and then field surveying was done in each class of the LUA map. A SM map was produced based on LUA, remote sensing data indexes, and spatial interpolation of the field survey sample data. Several interpolation methods (inverse distance weighting, kriging, and co-kriging) were evaluated for generating SM maps from the sample data. The produced maps were compared to each other and validated using ground truth data. The results show that the LUA approach is a reasonable method to create the homogenous field to introduce a representative sample for field soil surveying. The geostatistical SM map achieved adequate accuracy; however, trend analysis and distribution of the soil sample point locations within the LUA types should be further investigated to achieve even better results. Co
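
Of the interpolation methods compared, inverse distance weighting is the simplest to sketch. The sample coordinates and soil-moisture values below are hypothetical, not the Garmsar field data:

```python
def idw(x, y, samples, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from (xi, yi, value) samples."""
    num = den = 0.0
    for xi, yi, v in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v                          # exact hit: return the observed value
        w = 1.0 / d2 ** (power / 2.0)         # weight falls off as 1/distance^power
        num += w * v
        den += w
    return num / den

# hypothetical LUA-guided field samples: (x, y, volumetric soil moisture %)
obs = [(0.0, 0.0, 12.0), (1.0, 0.0, 18.0), (0.0, 1.0, 14.0), (1.0, 1.0, 20.0)]
sm = idw(0.5, 0.5, obs)
```

Unlike kriging and co-kriging, IDW ignores the spatial covariance structure of the data, which is one reason the geostatistical maps performed better in the study.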

  8. Verified Separate Compilation for C

    Science.gov (United States)

    2015-06-01

    independent linking, a new operational model of multilanguage module interaction that supports the statement and proof of cross-language contextual... Compiling Open Programs: A presumption of the preceding is that we at least have a specification of multilanguage programs. By multilanguage, I mean... Ahmed [PA14] have also observed, multilanguage semantics is useful not only for program understanding, but also as a mechanism for stating cross

  9. Explanatory Notes to Standard Compilation

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    Ⅰ. Basis for Standard Compilation The economic globalization and China's rapid expansion of foreign exchanges have drastically boosted the demand for translation services. As a result, enterprises offering translation services mushroomed and formed a new industry unlike any other service industry. Though the output value of translation services is not high at the moment, their level and quality have a great impact on the clients because they cover foreign intercourse in various fields and the construction of major foreign-invested projects.

  10. Compilation of HPSG to TAG

    CERN Document Server

    Kasper, R; Netter, K; Vijay-Shanker, K; Kasper, Robert; Kiefer, Bernd; Netter, Klaus

    1995-01-01

    We present an implemented compilation algorithm that translates HPSG into lexicalized feature-based TAG, relating concepts of the two theories. While HPSG has a more elaborated principle-based theory of possible phrase structures, TAG provides the means to represent lexicalized structures more explicitly. Our objectives are met by giving clear definitions that determine the projection of structures from the lexicon, and identify maximal projections, auxiliary trees and foot nodes.

  11. VFC: The Vienna Fortran Compiler

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    1999-01-01

    Full Text Available High Performance Fortran (HPF) offers an attractive high-level language interface for programming scalable parallel architectures, providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC), a new source-to-source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influences a program's performance. This comprises data locality assertions, non-local access specifications and the possibility of reusing runtime-generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high-level data parallel language such as HPF+ a performance close to hand-written message-passing programs can be achieved even for highly irregular codes.

  12. Vegetation map for the Hakalau Forest Unit of the Big Island National Wildlife Refuge Complex on the island of Hawai‘i

    Science.gov (United States)

    Jacobi, James D.

    2017-01-01

    This vegetation map was produced to serve as an updated habitat base for management of natural resources of the Hakalau Forest Unit (HFU) of the Big Island National Wildlife Refuge Complex (Refuge) on the island of Hawai‘i. The map is based on a vegetation map originally produced as part of the U.S. Fish and Wildlife Service’s Hawai‘i Forest Bird Survey to depict the distribution, structure, and composition of plant communities on the island of Hawai‘i as they existed in 1977. The current map has been updated to represent current conditions of plant communities in the HFU, based on WorldView 2 imagery taken in 2012 and very-high-resolution imagery collected by Pictometry International from 2010 to 2014. Thirty-one detailed plant communities are identified on this map, and fourteen of these units are found within the boundaries of HFU. Additionally, the mapped units can be displayed as five tree canopy cover units, three moisture zones units, eight dominant tree species units, and four habitat status units by choosing the various fields to group the units from the map attribute table. This updated map will provide a foundation for the refinement and tracking of management actions on the Refuge for the near future, particularly as the habitats in this area are subject to projected climate change.

  13. Tight Integration of Digital Map and In-Vehicle Positioning Unit for Car Navigation in Urban Areas

    Institute of Scientific and Technical Information of China (English)

    Chen Wu; Yu Meng; Li Zhi-lin; Chen Yong-qi; J. Chao

    2003-01-01

    Now GPS has been widely used for land, sea and air navigation. However, due to signal blockage and severe multipath environments in urban areas, such as in Hong Kong, GPS alone cannot satisfy most land vehicle navigation requirements. Dead Reckoning (DR) systems have been widely used to bridge the gaps of GPS and to smooth GPS position errors. However, DR drift errors increase rapidly with time, and frequent calibration is required. Under normal circumstances, GPS is sufficient to provide the calibration to the DR unit. However, GPS may be unavailable in urban areas for more than 20 min, and the DR position errors can reach hundreds of meters during that period. As land vehicles have to be on roads, a digital map can be used to constrain the locations of vehicles, a technique known as map-matching. One of the main problems for map-matching techniques is mis-matching, which may be caused by positioning sensor errors and the complexity of the city road network. In this paper, a newly developed model to tightly integrate a digital map and an in-vehicle positioning unit for car navigation is introduced. This method improves position accuracy by constraining the vehicle location to the roads. Moreover, it provides closed-loop control of the DR drift errors by feeding back the coordinates of the feature points of the road network and road bearings to the DR unit, and therefore the navigation system can be used for longer periods when GPS is not available. Extensive tests have been carried out in Hong Kong. They demonstrate that this closed-loop approach is much more reliable for map-matching, as the positioning sensor errors are constantly calibrated by the digital map.
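
The geometric core of map-matching, projecting a drifted DR/GPS fix onto the nearest road segment, can be sketched as below. The road geometry and fix are invented toy coordinates; a real matcher adds the network-topology and heading checks, and the feedback of matched coordinates to the DR unit, described in the paper:

```python
def project_to_segment(p, a, b):
    """Closest point to p on the 2-D segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    length2 = dx * dx + dy * dy
    # clamp the projection parameter so the result stays on the segment
    t = 0.0 if length2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / length2))
    return (ax + t * dx, ay + t * dy)

def map_match(p, road_segments):
    """Snap a (possibly drifted) position fix to the nearest point on the road network."""
    best = None
    for a, b in road_segments:
        q = project_to_segment(p, a, b)
        d2 = (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
        if best is None or d2 < best[0]:
            best = (d2, q)
    return best[1]

roads = [((0, 0), (10, 0)), ((10, 0), (10, 10))]   # hypothetical two-segment road
fix = (4.0, 1.5)                                   # drifted DR/GPS position estimate
snapped = map_match(fix, roads)
```

Mis-matching arises exactly when the drifted fix is nearer to the wrong segment, which is why the paper's model also checks road bearing and network connectivity before accepting a match.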

  14. An Action Compiler Targeting Standard ML

    DEFF Research Database (Denmark)

    Iversen, Jørgen

    2005-01-01

    We present an action compiler that can be used in connection with an action semantics based compiler generator. Our action compiler produces code with faster execution times than code produced by other action compilers, and for some non-trivial test examples it is only a factor of two slower than the code produced by the Gnu C Compiler. Targeting Standard ML makes the description of the code generation simple and easy to implement. The action compiler has been tested on a description of the Core of Standard ML and a subset of C.

  15. Compiling a Corpus for Teaching Medical Translation

    Directory of Open Access Journals (Sweden)

    Elizabeth de la Teja Bosch

    2014-04-01

    Full Text Available Background: medical translation has countless documentary sources; the major difficulty lies in knowing how to assess them. The corpus is the ideal tool to perform this activity in a rapid and reliable way, and to define the learning objectives based on text typology and oriented towards professional practice. Objective: to compile an electronic corpus that meets the requirements of professional practice in specialized medical translation. Methods: a pedagogical research study was conducted in the province of Cienfuegos. The units of analysis involved records from translators of the Provincial Medical Sciences Information Center and specialized translators in this field, who completed a questionnaire to accurately determine their information needs, conditioning the corpus design criteria. The analysis of a set of texts extracted from highly reputable sources led to the text selection and final compilation. Subsequently, the validation of the corpus as a documentary tool for teaching specialized medical translation was performed. Results: there was a concentration of translation assignments in the topics: malignant tumors, hypertension, heart disease, diabetes mellitus and pneumonias. The predominant text typologies were: evaluative and dissemination of current research, with plenty of original articles and reviews. The text corpus design criteria were: unannotated, documented, specialized, monitor and comparable. Conclusions: the corpus is a useful tool to show the lexical, terminological, semantic, discursive and contextual particularities of biomedical communication. It allows defining learning objectives and translation problems. Key words: teaching; translating; medicine

  16. What Is the Unit of Visual Attention? Object for Selection, but Boolean Map for Access

    Science.gov (United States)

    Huang, Liqiang

    2010-01-01

    In the past 20 years, numerous theories and findings have suggested that the unit of visual attention is the object. In this study, I first clarify 2 different meanings of unit of visual attention, namely the unit of access in the sense of measurement and the unit of selection in the sense of division. In accordance with this distinction, I argue…

  17. Fault-Tree Compiler Program

    Science.gov (United States)

    Butler, Ricky W.; Martensen, Anna L.

    1992-01-01

    FTC, Fault-Tree Compiler program, is reliability-analysis software tool used to calculate probability of top event of fault tree. Five different types of gates allowed in fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. High-level input language of FTC easy to understand and use. Program supports hierarchical fault-tree-definition feature simplifying process of description of tree and reduces execution time. Solution technique implemented in FORTRAN, and user interface in Pascal. Written to run on DEC VAX computer operating under VMS operating system.
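
For independent basic events, the single-gate probability arithmetic behind the five gate types is standard. The sketch below illustrates those textbook formulas in Python; it is not FTC's actual solution technique, which evaluates whole trees and is implemented in FORTRAN:

```python
from itertools import combinations

def gate_prob(kind, probs, m=None):
    """Output probability of one gate over independent input probabilities."""
    if kind == "AND":                     # all inputs occur
        out = 1.0
        for p in probs:
            out *= p
        return out
    if kind == "OR":                      # at least one input occurs
        out = 1.0
        for p in probs:
            out *= 1.0 - p
        return 1.0 - out
    if kind == "INVERT":                  # complement of the single input
        return 1.0 - probs[0]
    if kind == "XOR":                     # exactly one of two inputs occurs
        p, q = probs
        return p * (1.0 - q) + q * (1.0 - p)
    if kind == "MOFN":                    # at least m of the n inputs occur
        n = len(probs)
        total = 0.0
        for count in range(m, n + 1):
            for idx in combinations(range(n), count):
                term = 1.0
                for i in range(n):
                    term *= probs[i] if i in idx else 1.0 - probs[i]
                total += term
        return total
    raise ValueError(kind)

# hypothetical two-level tree: top OR over an AND gate and a basic event
top = gate_prob("OR", [gate_prob("AND", [0.01, 0.02]), 0.005])
```

Nesting calls like this mirrors the hierarchical tree definition the abstract describes, as long as subtree inputs remain independent.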

  18. Recommendations for a Retargetable Compiler.

    Science.gov (United States)

    1980-03-01

    ...releasable to the general public. This report has been reviewed and is approved for publication. APPROVED: Samuel A. Di Nitto, Jr., Project Engineer. ... RADC Project Engineer: Samuel A. Di Nitto, Jr. (ISIS) ... a compiler for Ada can commence development in FY82. 1. INTRODUCTION: In this section, we discuss the current

  19. The Syntax Model of Mobile Maps Generation

    Directory of Open Access Journals (Sweden)

    TIAN Jiangpeng

    2016-11-01

    Full Text Available Using the method of formal language (FL), the syntax model of mobile map generation is studied. The syntax model is located at the level of the logical calculus of map generation, based on an analysis of its process. Combining the hierarchical and recursive characteristics of map representation, the simplest form of the syntax structure is abstracted as carto-lexicons and syntax rules. The classification system of carto-lexicons is established, as well as the model of the spatial-relation predicate system, and the map operation rules and the rules for different levels of syntactic units are discussed. The compilation process and key techniques of the syntax model are discussed, and the feasibility of the model is verified through a mobile map generation experiment. The essence of the model is a kind of formal-language grammar that uses finite rules and lexicons to generate maps automatically, and it serves as a kind of high-level cartographic interface for human-computer interaction.
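
The idea of generating a map from finite rules over carto-lexicons can be illustrated with a toy context-free grammar. All symbol names below are invented and far simpler than the carto-lexicon classification and spatial-relation predicates the paper develops:

```python
import random

# hypothetical toy "map grammar": syntactic units expand into terminal carto-lexicons
RULES = {
    "MAP":     [["LAYER", "LAYER"]],
    "LAYER":   [["FEATURE"], ["FEATURE", "LABEL"]],
    "FEATURE": [["road"], ["river"], ["building"]],
    "LABEL":   [["text"]],
}

def generate(symbol, rng):
    """Recursively expand a syntactic unit until only terminal lexicons remain."""
    if symbol not in RULES:
        return [symbol]                   # terminal carto-lexicon
    out = []
    for s in rng.choice(RULES[symbol]):   # pick one production for this unit
        out.extend(generate(s, rng))
    return out

tokens = generate("MAP", random.Random(0))
```

Because expansion is recursive, the hierarchical structure of the generated map mirrors the hierarchical and recursive characteristics of map representation that the model abstracts.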

  20. Geology, Bedrock, Hot Springs Quad unit polygons. Compiled Polygons., Published in 2006, 1:24000 (1in=2000ft) scale, NC DENR / Div. of Land Resources / Geological Survey Section.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Geology, Bedrock dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Hardcopy Maps information as of 2006. It is described as...

  1. Geology, Bedrock, Mars Hill Quad unit polygons. Compiled Polygons., Published in 2006, 1:24000 (1in=2000ft) scale, NC DENR / Div. of Land Resources / Geological Survey Section.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Geology, Bedrock dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Hardcopy Maps information as of 2006. It is described as...

  2. Geology, Bedrock, Lemon Gap Quad unit polygons. Compiled Polygons., Published in 2006, 1:24000 (1in=2000ft) scale, NC DENR / Div. of Land Resources / Geological Survey Section.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Geology, Bedrock dataset, published at 1:24000 (1in=2000ft) scale, was produced all or in part from Hardcopy Maps information as of 2006. It is described as...

  3. Mapping water use—Landsat and water resources in the United States

    Science.gov (United States)

    Johnson, Rebecca L.

    2016-06-27

    Using Landsat satellite data, scientists with the U.S. Geological Survey (USGS) have helped to refine a technique called evapotranspiration (ET) mapping to measure how much water crops are using across landscapes and through time. These water-use maps are created using a computer model that integrates Landsat and weather data.

  4. Fast traffic noise mapping of cities using the graphics processing unit of a personal computer

    NARCIS (Netherlands)

    Salomons, E.M.; Zhou, H.; Lohman, W.J.A.

    2014-01-01

    Traffic noise mapping of cities requires large computer calculation times. This originates from the large number of point-to-point sound propagation calculations that must be performed. In this article it is demonstrated that noise mapping calculation times can be reduced considerably by the use of

  5. Map and Data for Quaternary Faults and Fault Systems on the Island of Hawai`i

    Science.gov (United States)

    Cannon, Eric C.; Burgmann, Roland; Crone, Anthony J.; Machette, Michael N.; Dart, Richard L.

    2007-01-01

    Introduction: This report and its digitally prepared, GIS-based map are one of a series of similar products covering individual states or regions of the United States that show the locations, ages, and activity rates of major earthquake-related features such as faults and fault-related folds. It is part of a continuing effort to compile a comprehensive Quaternary fault and fold map and database for the United States, which is supported by the U.S. Geological Survey's (USGS) Earthquake Hazards Program. Guidelines for the compilation of the Quaternary fault and fold maps for the United States were published by Haller and others (1993) at the onset of this project. This compilation of Quaternary surface faulting and folding in Hawai`i is one of several similar state and regional compilations that were planned for the United States. Reports published to date include West Texas (Collins and others, 1996), New Mexico (Machette and others, 1998), Arizona (Pearthree, 1998), Colorado (Widmann and others, 1998), Montana (Stickney and others, 2000), Idaho (Haller and others, 2005), and Washington (Lidke and others, 2003). Reports for other states such as California and Alaska are still in preparation. The primary intention of this compilation is to aid in seismic-hazard evaluations. The report contains detailed information on the location and style of faulting and the time of most recent movement, and assigns each feature to a slip-rate category (as a proxy for fault activity). It also contains the name and affiliation of the compiler, date of compilation, geographic and other paleoseismologic parameters, as well as an extensive set of references for each feature. The map (plate 1) shows faults, volcanic rift zones, and lineaments that show evidence of Quaternary surface movement related to faulting, including data on the time of most recent movement, sense of movement, slip rate, and continuity of surface expression.
This compilation is presented as a digitally prepared map product

  6. Geomatics for Mapping of Groundwater Potential Zones in the Northern Part of the United Arab Emirates - Sharjah City

    Science.gov (United States)

    Al-Ruzouq, R.; Shanableh, A.; Merabtene, T.

    2015-04-01

    In the United Arab Emirates (UAE), domestic water consumption has increased rapidly over the last decade. The increased demand for high-quality water creates an urgent need to evaluate the groundwater production of aquifers. The development of a reasonable model for groundwater potential is therefore crucial for future systematic development, efficient management, and sustainable use of groundwater resources. The objective of this study is to map the groundwater potential zones in the northern part of the UAE and assess the contributing factors for exploration of potential groundwater resources. Remote sensing data and a geographic information system will be used to locate potential zones for groundwater. Various maps (i.e., base, soil, geological, hydrogeological, geomorphologic, structural, drainage, slope, land use/land cover, and average annual rainfall maps) will be prepared based on geospatial techniques. The groundwater availability of the basin will be qualitatively classified into different classes based on its hydro-geomorphological conditions. The land use/land cover map will also be prepared for the different seasons using a digital classification technique, with ground truth based on field investigation.

  7. Combined landslide inventory and susceptibility assessment based on different mapping units: an example from the Flemish Ardennes, Belgium

    Directory of Open Access Journals (Sweden)

    M. Van Den Eeckhaut

    2009-03-01

    Full Text Available For a 277 km2 study area in the Flemish Ardennes, Belgium, a landslide inventory and two landslide susceptibility zonations were combined to obtain an optimal landslide susceptibility assessment, in five classes. For the experiment, a regional landslide inventory, a 10 m × 10 m digital representation of topography, and lithological and soil hydrological information obtained from 1:50 000 scale maps were exploited. In the study area, the regional inventory shows 192 landslides of the slide type, including 158 slope failures that occurred before 1992 (model calibration set) and 34 failures that occurred after 1992 (model validation set). The study area was partitioned into 2.78×106 grid cells and into 1927 topographic units. The latter are hydro-morphological units obtained by subdividing slope units based on terrain gradient. Independent models were prepared for the two terrain subdivisions using discriminant analysis. For grid cells, a single pixel was identified as representative of the landslide depletion area, and geo-environmental information for the pixel was obtained from the thematic maps. The landslide and geo-environmental information was used to model the propensity of the terrain to host landslide source areas. For topographic units, morphologic and hydrologic information and the proportion of lithologic and soil hydrological types in each unit were used to evaluate landslide susceptibility, including the depletion and depositional areas. Uncertainty associated with the two susceptibility models was evaluated, and the model performance was tested using the independent landslide validation set. A heuristic procedure was adopted to combine the landslide inventory and the susceptibility zonations. The procedure makes optimal use of the available landslide and susceptibility information, minimizing the limitations inherent in the inventory and the susceptibility maps. For the established susceptibility classes, regulations to

  8. Distributed memory compiler design for sparse problems

    Science.gov (United States)

    Wu, Janet; Saltz, Joel; Berryman, Harry; Hiranandani, Seema

    1991-01-01

    A compiler and runtime support mechanism is described and demonstrated. The methods presented are capable of solving a wide range of sparse and unstructured problems in scientific computing. The compiler takes as input a FORTRAN 77 program enhanced with specifications for distributing data, and the compiler outputs a message passing program that runs on a distributed memory computer. The runtime support for this compiler is a library of primitives designed to efficiently support irregular patterns of distributed array accesses and irregular distributed array partitions. A variety of Intel iPSC/860 performance results obtained through the use of this compiler are presented.
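
    The irregular-access runtime support described above is commonly realized as an inspector/executor scheme: an inspector pass scans the index set to build a communication schedule, and an executor pass performs the gathers. A minimal single-process Python sketch (the dict-based ownership model and function names are illustrative, not the library's actual primitives):

    ```python
    def inspector(global_indices, owned):
        # Inspector: scan the irregular access pattern and record which
        # globally indexed elements are NOT locally owned, i.e. the
        # communication schedule for a gather.
        return sorted(set(i for i in global_indices if i not in owned))

    def executor(global_indices, owned, fetched):
        # Executor: satisfy each access either from local storage or from
        # the buffer filled by communication (simulated here by a dict).
        table = dict(owned)
        table.update(fetched)
        return [table[i] for i in global_indices]

    owned = {0: 'a', 2: 'c'}                    # data this "process" owns
    schedule = inspector([0, 3, 2, 3], owned)   # -> [3]: must be fetched
    ```

    In a real distributed run the schedule would drive message passing once and then be reused across iterations, which is where the efficiency of the approach comes from.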

  9. An OpenMP Compiler Benchmark

    Directory of Open Access Journals (Sweden)

    Matthias S. Müller

    2003-01-01

    Full Text Available The purpose of this benchmark is to propose several optimization techniques and to test their existence in current OpenMP compilers. Examples are the removal of redundant synchronization constructs, effective constructs for alternative code and orphaned directives. The effectiveness of the compiler generated code is measured by comparing different OpenMP constructs and compilers. If possible, we also compare with the hand coded "equivalent" solution. Six out of seven proposed optimization techniques are already implemented in different compilers. However, most compilers implement only one or two of them.

  10. The fault-tree compiler

    Science.gov (United States)

    Martensen, Anna L.; Butler, Ricky W.

    1987-01-01

    The Fault Tree Compiler Program is a new reliability tool used to predict the top-event probability for a fault tree. Five different gate types are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N gates. The high-level input language is easy to understand and use when describing the system tree. In addition, use of the hierarchical fault tree capability can simplify the tree description and decrease program execution time. The current solution technique provides an answer precise to five digits (within the limits of double-precision floating-point arithmetic). The user may vary one failure rate or failure probability over a range of values and plot the results for sensitivity analyses. The solution technique is implemented in FORTRAN; the remaining program code is implemented in Pascal. The program is written to run on a Digital Equipment Corporation VAX with the VMS operating system.
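
    For statistically independent basic events, the gate probabilities combine in closed form; the sketch below (Python, not the FORTRAN/Pascal implementation described above) shows one way to evaluate AND, OR, and M-of-N gates:

    ```python
    from itertools import combinations

    def and_gate(ps):
        # Independent inputs: P(all occur) is the product of the inputs.
        out = 1.0
        for p in ps:
            out *= p
        return out

    def or_gate(ps):
        # P(at least one occurs) = 1 - P(none occur).
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out

    def m_of_n_gate(m, ps):
        # Exact enumeration over which inputs occur; fine for small fan-in.
        n = len(ps)
        total = 0.0
        for k in range(m, n + 1):
            for occurred in combinations(range(n), k):
                term = 1.0
                for i in range(n):
                    term *= ps[i] if i in occurred else (1.0 - ps[i])
                total += term
        return total
    ```

    An M OF N gate generalizes both AND (M = N) and OR (M = 1); exact enumeration is acceptable for the small fan-ins typical of fault trees.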

  11. Ace Basin National Wildlife Refuge (Combahee Unit) [Land Status Map: Sheet 1 of 2

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This map was produced by the Division of Realty to depict landownership at Ernest F. Hollings Ace Basin National Wildlife Refuge. It was generated from rectified...

  12. Geochemical and mineralogical maps for soils of the conterminous United States

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Geochemical and mineralogical maps along with a histogram, boxplot, and empirical cumulative distribution function plot for each element or mineral whose data are...

  13. Interpretation, compilation and field verification procedures in the CARETS project

    Science.gov (United States)

    Alexander, Robert H.; De Forth, Peter W.; Fitzpatrick, Katherine A.; Lins, Harry F.; McGinty, Herbert K.

    1975-01-01

    The production of the CARETS map data base involved the development of a series of procedures for interpreting, compiling, and verifying data obtained from remote sensor sources. Level II land use mapping from high-altitude aircraft photography at a scale of 1:100,000 required production of a photomosaic mapping base for each of the 48, 50 x 50 km sheets, and the interpretation and coding of land use polygons on drafting film overlays. CARETS researchers also produced a series of 1970 to 1972 land use change overlays, using the 1970 land use maps and 1972 high-altitude aircraft photography. To enhance the value of the land use sheets, researchers compiled series of overlays showing cultural features, county boundaries and census tracts, surface geology, and drainage basins. In producing Level I land use maps from Landsat imagery, at a scale of 1:250,000, interpreters overlaid drafting film directly on Landsat color composite transparencies and interpreted on the film. They found that such interpretation involves pattern and spectral signature recognition. In studies using Landsat imagery, interpreters identified numerous areas of change but also identified extensive areas of "false change," where Landsat spectral signatures but not land use had changed.

  14. Mapping Table Based Register File Design and Compiler Optimization

    Institute of Scientific and Technical Information of China (English)

    邓晴莺; 张民选

    2008-01-01

    Register file design is critical in high-performance processor design, and the register stack and register stack engine are important means of improving its performance. Compiler optimizations are usually tied to a particular architecture and target machine. For the EDSMT microarchitecture (a simultaneous multithreading architecture based on IA-64), this paper proposes MTRM (Mapping Table-based Register Management), a novel mapping-table-based register mechanism that maps consecutive virtual register numbers to non-contiguous physical registers through a mapping table, and studies compiler-supported timely register deallocation. Experimental results show that the scheme effectively improves performance.
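
    A toy sketch of the mapping-table idea, with compiler-directed timely deallocation, might look like the following (a hypothetical simplification for illustration, not the actual MTRM/EDSMT design):

    ```python
    class MappingTableRegisterFile:
        """Consecutive virtual register numbers are mapped, via a table,
        to whatever physical registers are free, which become
        non-contiguous as registers are allocated and released."""

        def __init__(self, num_physical):
            self.free = list(range(num_physical))  # free physical registers
            self.table = {}                        # virtual -> physical

        def allocate(self, virt):
            phys = self.free.pop(0)
            self.table[virt] = phys
            return phys

        def deallocate(self, virt):
            # "Timely deallocation": the compiler marks the last use of a
            # virtual register so its physical register is freed at once.
            self.free.append(self.table.pop(virt))
    ```

    After a few allocate/deallocate cycles, consecutive virtual registers (v1, v2, v3, v4) end up mapped to scattered physical registers, which is exactly the situation the mapping table hides from the rest of the pipeline.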

  15. Maps and grids of hydrogeologic information created from standardized water-well drillers’ records of the glaciated United States

    Science.gov (United States)

    Bayless, E. Randall; Arihood, Leslie D.; Reeves, Howard W.; Sperl, Benjamin J.S.; Qi, Sharon L.; Stipe, Valerie E.; Bunch, Aubrey R.

    2017-01-18

    As part of the National Water Availability and Use Program established by the U.S. Geological Survey (USGS) in 2005, this study took advantage of about 14 million records from State-managed collections of water-well drillers’ records and created a database of hydrogeologic properties for the glaciated United States. The water-well drillers’ records were standardized to be relatively complete and error-free and to provide consistent variables and naming conventions that span all State boundaries. Maps and geospatial grids were developed for (1) total thickness of glacial deposits, (2) total thickness of coarse-grained deposits, (3) specific-capacity-based transmissivity and hydraulic conductivity, and (4) texture-based estimated equivalent horizontal and vertical hydraulic conductivity and transmissivity. The information included in these maps and grids is required for most assessments of groundwater availability, in addition to having applications to studies of groundwater flow and transport. The texture-based estimated equivalent horizontal and vertical hydraulic conductivity and transmissivity were based on an assumed range of hydraulic conductivity values for coarse- and fine-grained deposits and should only be used with complete awareness of the methods used to create them. However, the maps and grids of texture-based estimated equivalent hydraulic conductivity and transmissivity may be useful for application to areas where a range of measured values is available for re-scaling. Maps of hydrogeologic information for some States are presented as examples in this report, but maps and grids for all States are available electronically at the project Web site (USGS Glacial Aquifer System Groundwater Availability Study, http://mi.water.usgs.gov/projects/WaterSmart/Map-SIR2015-5105.html) and the Science Base Web site, https://www.sciencebase.gov/catalog/item/58756c7ee4b0a829a3276352.
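
    Texture-based equivalent values of the kind described above are conventionally computed as thickness-weighted means over the logged layers: an arithmetic mean for horizontal flow, a harmonic mean for vertical flow, and transmissivity as the sum of thickness times conductivity. A sketch under those standard assumptions (the report's exact method may differ):

    ```python
    def equivalent_conductivity(layers):
        # layers: list of (thickness_b, conductivity_K) pairs for one well log.
        total_b = sum(b for b, _ in layers)
        kh = sum(b * k for b, k in layers) / total_b      # arithmetic mean
        kv = total_b / sum(b / k for b, k in layers)      # harmonic mean
        transmissivity = sum(b * k for b, k in layers)    # T = sum(K_i * b_i)
        return kh, kv, transmissivity
    ```

    The harmonic mean is dominated by the least-conductive layer, which is why a thin fine-grained bed sharply lowers the equivalent vertical conductivity while barely affecting the horizontal one.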

  16. E/V Nautilus Mapping and ROV Dives Reveal Hundreds of Vents along the West Coast of the United States

    Science.gov (United States)

    Kane, R.; Raineault, N.; Embley, R. W.; Merle, S. G.; Girguis, P. R.; Irish, O.; Lubetkin, M.; German, C. R.; Levin, L. A.; Cormier, M. H.; Caldow, C.; Freedman, R.; Gee, L.

    2016-12-01

    The Exploration Vessel (E/V) Nautilus has mapped more than 30,000 km2 of seafloor off the west coast of the United States between July 2015 and September 2016. The 30 kHz EM302 multibeam mapping system collects water column data in addition to bathymetry and backscatter. Examination of the water column data revealed hundreds of distinct vertical features, presumably plumes of methane gas released from the seafloor. While seafloor reservoirs of methane are thought to contribute 5-10% of global methane discharge, inventories of seafloor methane seeps are poorly constrained due to the lack of data such as the distribution and abundance of seafloor gas plumes. The results of mapping efforts reveal an unexpected number of methane seeps. ROV dives were then used to provide geological context to the seeps and associated unique biological communities. Altogether these findings contribute significantly to our baseline inventory of seeps along the continental margins of the United States. The presence of unexpectedly large numbers of methane seeps on the US Pacific, Gulf and Atlantic margins may influence the management of human extraction activities on the margin seabed.

  17. ­­Estimating Forest Management Units from Road Network Maps in the Southeastern U.S.

    Science.gov (United States)

    Yang, D.; Hall, J.; Fu, C. S.; Binford, M. W.

    2015-12-01

    The most important factor affecting forest structure and function is the type of management undertaken in forest stands. Owners manage forests using appropriately sized areas to meet management objectives, which include economic return, sustainability, recreation, or esthetic enjoyment. Thus, the socio-environmental unit of study for forests should be the management unit. To study the ecological effects of different kinds of management activities, we must identify individual management units. Road networks, which provide access for human activities, are widely used in managing forests in the southeastern U.S. Coastal Plain and Piedmont (SEUS). Our research question in this study is: How can we identify individual forest management units in an entire region? To answer it, we hypothesize that the road network defines management units on the landscape. Road-caused canopy openings are not always captured by satellite sensors, so it is difficult to delineate ecologically relevant patches based only on remote sensing data. We used reliable, accurate, and freely available road network data, OpenStreetMap (OSM), and the National Land Cover Database (NLCD) to delineate management units in a section of the SEUS defined by the Landsat Worldwide Reference System (WRS) II footprint path 17 row 39. The size-frequency distributions indicate that management units ≥ 0.5 ha ranged from 0.5 to 160,770 ha (the Okefenokee National Wildlife Refuge). We compared the size-frequency distributions of management units with four independently derived management types: production, ecological, preservation, and passive management. Preservation and production management had the largest units, at 40.5 ± 2196.7 (s.d.) and 41.3 ± 273.5 ha, respectively. Ecological and passive management units averaged about half as large, at 19.2 ± 91.5 and 22.4 ± 96.0 ha, respectively. This result supports the hypothesis that the road network defines management units in SEUS. If this way
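
    The road-network hypothesis can be prototyped by treating road cells as barriers and labeling the connected non-road regions of a land cover raster; a stdlib-only Python sketch (illustrative, not the authors' actual OSM/NLCD workflow):

    ```python
    from collections import deque

    def delineate_units(grid):
        # grid: 2D list where 1 = road cell, 0 = forest cell.
        # Labels 4-connected non-road regions; each region is a candidate
        # management unit bounded by the road network.
        rows, cols = len(grid), len(grid[0])
        labels = [[0] * cols for _ in range(rows)]
        next_label = 0
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] == 0 and labels[r][c] == 0:
                    next_label += 1
                    labels[r][c] = next_label
                    q = deque([(r, c)])
                    while q:                      # BFS flood fill
                        y, x = q.popleft()
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < rows and 0 <= nx < cols
                                    and grid[ny][nx] == 0
                                    and labels[ny][nx] == 0):
                                labels[ny][nx] = next_label
                                q.append((ny, nx))
        return labels, next_label
    ```

    On a real raster the unit areas would then be tallied per label to produce the size-frequency distributions discussed above.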

  18. Basement domain map of the conterminous U.S.A. and Alaska

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The basement-domain map is a compilation of basement domains in the conterminous United States and Alaska designed to be used at 1:5,000,000-scale, particularly as...

  20. A Symmetric Approach to Compilation and Decompilation

    DEFF Research Database (Denmark)

    Ager, Mads Sig; Danvy, Olivier; Goldberg, Mayer

    2002-01-01

    Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice ... and the SECD-machine language. In each case, we prove that the target-to-source compiler is a left inverse of the source-to-target compiler, i.e., that it is a decompiler. In the context of partial evaluation, the binding-time shift of going from a source interpreter to a compiler is classically referred to as a Futamura projection. By symmetry, it seems logical to refer to the binding-time shift of going from a target interpreter to a compiler as a Futamura embedding.
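
    The left-inverse property can be illustrated on a toy pair of languages: a compiler from arithmetic expressions to stack code, and a decompiler that symbolically executes the stack code back into an expression (an analogue of the paper's setting, not its actual source and SECD-machine languages):

    ```python
    # Expressions: ('lit', n) or ('add', e1, e2), as nested tuples.

    def compile_expr(e):
        # Post-order traversal yields stack-machine code.
        if e[0] == 'lit':
            return [('push', e[1])]
        return compile_expr(e[1]) + compile_expr(e[2]) + [('add',)]

    def decompile(code):
        # Symbolic execution: run the stack machine over expressions
        # instead of numbers, rebuilding the source term.
        stack = []
        for instr in code:
            if instr[0] == 'push':
                stack.append(('lit', instr[1]))
            else:
                b = stack.pop()
                a = stack.pop()
                stack.append(('add', a, b))
        return stack.pop()
    ```

    Decompilation here is a left inverse of compilation: decompile(compile_expr(e)) returns e for every expression, though not every instruction sequence need arise as the image of some expression.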

  1. Assessment and mapping of slope stability based on slope units: A case study in Yan’an, China

    Indian Academy of Sciences (India)

    Jianqi Zhuang; Jianbing Peng; Yonglong Xu; Qiang Xu; Xinghua Zhu; Wei Li

    2016-10-01

    Precipitation frequently triggers shallow landslides in the Loess Plateau of Shaanxi, China, resulting in loss of life, damage to gas and oil routes, and destruction of transport infrastructure and farmland. To assess the possibility of shallow landslides at different precipitation levels, a method to draw slope units and steepest slope profiles based on ARCtools and a new method for calculating slope stability are proposed. The methods were implemented in a case study conducted in Yan’an, north-west China. High-resolution DEM (Digital Elevation Model) images, soil parameters from in-situ laboratory measurements, and maximum depths of precipitation infiltration were used as input parameters in the method. Next, DEM and reverse DEM were employed to map 2146 slope units in the study area, based on which the steepest profiles of the slope units were constructed. Combining analysis of the water content of loess, the strength of the sliding surface, its response to precipitation, and the infinite slope stability equation, a new equation to calculate infinite slope stability is proposed to assess shallow landslide stability. The slope unit stability was calculated using the equation at 10-, 20-, 50- and 100-year return periods of antecedent effective precipitation. The number of slope units experiencing failure increased in response to increasing effective antecedent rainfall. These results were validated based on the occurrence of landslides in recent decades. Finally, the applicability and limitations of the model are discussed.
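
    The classical infinite-slope factor of safety, which such analyses build on, is FS = [c′ + (γ z cos²β − u) tan φ′] / (γ z sin β cos β); the sketch below uses that textbook form (the pore-pressure term and parameter names are assumptions for illustration, not the paper's modified equation):

    ```python
    import math

    def infinite_slope_fs(c, phi_deg, gamma, z, beta_deg,
                          gamma_w=9.81, m=0.0):
        # c: effective cohesion (kPa); phi_deg: friction angle (deg)
        # gamma: soil unit weight (kN/m^3); z: failure depth (m)
        # beta_deg: slope angle (deg); m: saturated fraction of depth z
        beta = math.radians(beta_deg)
        phi = math.radians(phi_deg)
        u = m * gamma_w * z * math.cos(beta) ** 2     # pore pressure
        resisting = c + (gamma * z * math.cos(beta) ** 2 - u) * math.tan(phi)
        driving = gamma * z * math.sin(beta) * math.cos(beta)
        return resisting / driving
    ```

    For a cohesionless, dry slope this reduces to FS = tan φ′ / tan β, so a slope inclined exactly at the friction angle sits at the limit FS = 1.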

  2. USGS Small-scale Dataset - Global Map: 1:1,000,000-Scale Major Roads of the United States 201403 Shapefile

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing the major roads in the United States, Puerto Rico, and the U.S. Virgin Islands. The data are a modified version of...

  3. USGS Small-scale Dataset - Global Map: Cities and Towns of the United States 201403 FileGDB 10.1

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map layer includes Global Map data showing cities and towns in the United States, Puerto Rico, and the U.S. Virgin Islands. The data are a modified version of...

  4. Compiling scheme using abstract state machines

    OpenAIRE

    2003-01-01

    The project investigates the use of Abstract State Machine in the process of computer program compilation. Compilation is to produce machine-code from a source program written in a high-level language. A compiler is a program written for the purpose. Machine-code is the computer-readable representation of sequences of computer instructions. An Abstract State Machine (ASM) is a notional computing machine, developed by Yuri Gurevich, for accurately and easily representing the semantics of...

  5. Geologic quadrangle maps of the United States: geology of the Casa Diablo Mountain quadrangle, California

    Science.gov (United States)

    Rinehart, C. Dean; Ross, Donald Clarence

    1957-01-01

    The Casa Diablo Mountain quadrangle was mapped in the summers of 1952 and 1953 by the U.S. Geological Survey in cooperation with the California State Division of Mines as part of a study of potential tungsten-bearing areas.

  6. Translation of Bernstein Coefficients Under an Affine Mapping of the Unit Interval

    Science.gov (United States)

    Alford, John A., II

    2012-01-01

    We derive an expression connecting the coefficients of a polynomial expanded in the Bernstein basis to the coefficients of an equivalent expansion of the polynomial under an affine mapping of the domain. The expression may be useful in the calculation of bounds for multi-variate polynomials.
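
    Although the paper derives a closed-form expression, the remapped Bernstein coefficients can also be obtained numerically by two de Casteljau subdivisions, which restrict a Bernstein-form polynomial to a subinterval [a, b] of [0, 1) exactly (a standard technique, sketched here for illustration):

    ```python
    from math import comb

    def de_casteljau_split(coeffs, t):
        # Split a Bernstein-form polynomial at parameter t; returns the
        # coefficients of the two halves, each reparametrized to [0, 1].
        rows = [list(coeffs)]
        while len(rows[-1]) > 1:
            prev = rows[-1]
            rows.append([(1 - t) * p + t * q for p, q in zip(prev, prev[1:])])
        left = [row[0] for row in rows]
        right = [row[-1] for row in reversed(rows)]
        return left, right

    def remap(coeffs, a, b):
        # Coefficients of the same polynomial composed with x = a + (b - a)t,
        # in Bernstein form on [0, 1]; assumes a != 1.
        _, right = de_casteljau_split(coeffs, a)   # restrict to [a, 1]
        s = (b - a) / (1 - a)
        left, _ = de_casteljau_split(right, s)     # restrict to [a, b]
        return left

    def bern_eval(coeffs, t):
        # Direct evaluation of sum_i c_i * C(n, i) * t^i * (1 - t)^(n - i).
        n = len(coeffs) - 1
        return sum(c * comb(n, i) * t**i * (1 - t)**(n - i)
                   for i, c in enumerate(coeffs))
    ```

    For example, x² has Bernstein coefficients [0, 0, 1] on [0, 1]; remapping to [0.25, 0.75] yields the coefficients of (0.25 + 0.5t)².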

  7. Map and Aerial Photo Collections in the United States: Survey of the Seventy Largest Collections.

    Science.gov (United States)

    Stevens, Stanley D.

    1981-01-01

    Data gathered from 56 libraries, agencies, and other institutions holding large collections of maps and aerial photographs are reported, including such areas as personnel, equipment, acquisitions, floor space, promotion, and use of computers. The 70 largest collections are ranked and profiled, and a sample questionnaire is provided. (FM)

  8. Compiler-assisted static checkpoint insertion

    Science.gov (United States)

    Long, Junsheng; Fuchs, W. K.; Abraham, Jacob A.

    1992-01-01

    This paper describes a compiler-assisted approach for static checkpoint insertion. Instead of fixing the checkpoint location before program execution, a compiler-enhanced polling mechanism is utilized to maintain both the desired checkpoint intervals and reproducible checkpoint locations. The technique has been implemented in a GNU CC compiler for Sun 3 and Sun 4 (Sparc) processors. Experiments demonstrate that the approach provides for stable checkpoint intervals and reproducible checkpoint placements with performance overhead comparable to a previously presented compiler-assisted dynamic scheme (CATCH) utilizing the system clock.
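
    The polling idea can be sketched as a counter decremented at points a compiler would instrument (e.g., loop back edges), taking a snapshot whenever the counter runs out; the Python below is an illustrative analogue of the mechanism, not the GNU CC instrumentation:

    ```python
    import pickle

    class Checkpointer:
        def __init__(self, interval):
            self.interval = interval      # polls between checkpoints
            self.counter = interval
            self.snapshots = []

        def poll(self, state):
            # A compiler would insert this call at loop back edges; here we
            # call it by hand. Because the counter depends only on control
            # flow, checkpoint locations are reproducible across runs.
            self.counter -= 1
            if self.counter == 0:
                self.counter = self.interval
                self.snapshots.append(pickle.dumps(state))

    ck = Checkpointer(interval=3)
    total = 0
    for i in range(10):
        total += i
        ck.poll({'i': i, 'total': total})
    # 10 polls with interval 3 -> checkpoints at the 3rd, 6th, and 9th poll
    ```

    Counting polls rather than reading a clock is what makes the placement deterministic, at the cost of intervals measured in executed back edges rather than seconds.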

  9. A Class-Specific Optimizing Compiler

    Directory of Open Access Journals (Sweden)

    Michael D. Sharp

    1993-01-01

    Full Text Available Class-specific optimizations are compiler optimizations specified by the class implementor to the compiler. They allow the compiler to take advantage of the semantics of the particular class so as to produce better code. Optimizations of interest include the strength reduction of class::array address calculations, elimination of large temporaries, and the placement of asynchronous send/recv calls so as to achieve computation/communication overlap. We will outline our progress towards the implementation of a C++ compiler capable of incorporating class-specific optimizations.

  10. Mapping Investments and Published Outputs in Norovirus Research: A Systematic Analysis of Research Funded in the United States and United Kingdom During 1997-2013.

    Science.gov (United States)

    Head, Michael G; Fitchett, Joseph R; Lichtman, Amos B; Soyode, Damilola T; Harris, Jennifer N; Atun, Rifat

    2016-02-01

    Norovirus accounts for a considerable portion of the global disease burden. Mapping national or international investments relating to norovirus research is limited. We analyzed the focus and type of norovirus research funding awarded to institutions in the United States and United Kingdom during 1997-2013. Data were obtained from key public and philanthropic funders across both countries, and norovirus-related research was identified from study titles and abstracts. Included studies were further categorized by the type of scientific investigation, and awards related to vaccine, diagnostic, and therapeutic research were identified. Norovirus publication trends are also described using data from Scopus. In total, US and United Kingdom funding investment for norovirus research was £97.6 million across 349 awards; 326 awards (amount, £84.9 million) were received by US institutions, and 23 awards (£12.6 million) were received by United Kingdom institutions. Combined, £81.2 million of the funding (83.2%) was for preclinical research, and £16.4 million (16.8%) was for translational science. Investments increased from £1.7 million in 1997 to £11.8 million in 2013. Publication trends showed a consistent temporal increase from 48 in 1997 to 182 in 2013. Despite increases over time, trends in US and United Kingdom funding for norovirus research clearly demonstrate insufficient translational research and limited investment in diagnostics, therapeutics, or vaccine research. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  11. Geothermal Favorability Map Derived From Logistic Regression Models of the Western United States (favorabilitysurface.zip)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This is a surface showing relative favorability for the presence of geothermal systems in the western United States. It is an average of 12 models that correlates...

  12. The Process, Methods and Technics in Compiling Dialect Maps: The Case of Dutch Dialect Atlases / Ağız Atlaslarının Hazırlanışında Süreç, Yöntem ve Teknikler: Flaman Ağız Atlasları Örneği

    Directory of Open Access Journals (Sweden)

    Nihal Çalışkan

    2013-09-01

    Full Text Available Dialect maps show the geographical distribution of linguistic variables. The first atlases of European languages were published at the beginning of the 19th century. Since then, many atlases have been produced at both national and regional level, and today so-called new-generation atlases are being compiled in parallel with advances in digital technology. Despite this progress in dialect mapping, the Dialect Atlas of Turkish has not been compiled yet. This study aims to introduce one of the new-generation atlas projects, the Dutch dialect atlases, and particularly deals with the process and methods of the Syntactic Atlas of the Dutch Dialects, to guide the scholars who will undertake the future Dialect Atlas of Turkish. Dialect atlases are maps that show variation in a language according to the geographical distribution of various linguistic variables. The first atlases, based on Western languages, were produced at the beginning of the 19th century. From then to the present, many atlases that may be characterized as either wide-area or narrow-area atlases have been prepared; today, in parallel with developments in the digital domain, a new generation of atlases suited to electronic access and querying has emerged. Despite all these developments in the field of dialect atlases, the dialect atlas of Turkey Turkish has not yet been prepared. So that future work on a dialect atlas of Turkey Turkish may draw on them, this study focuses on the Flemish dialect atlas projects, which can be characterized as new-generation atlases, and examines the process, methods, and techniques used in preparing the atlas based on syntax.

  13. The 2014 Update of the United States National Seismic Hazard Maps

    Science.gov (United States)

    Petersen, M. D.; Mueller, C. S.; Moschetti, M. P.; Haller, K. M.; Zeng, Y.; Harmsen, S.; Frankel, A. D.; Rezaeian, S.; Powers, P.; Field, E. H.; Boyd, O. S.; Chen, R.; Rukstales, K. S.; Wheeler, R. L.; Luco, N.; Williams, R. A.; Olson, A.

    2013-12-01

    The USGS is in the process of updating the U.S. National Seismic Hazard Maps for the lower 48 States that will be considered for inclusion in future building codes, risk assessments, and other public policy applications. These seismic hazard maps are based on our assessment of the best available science at the time of the update, and incorporate a broad range of scientific models and parameters. The maps were discussed in regional workshops held across the U.S., reviewed by our Steering Committee, and available on-line during a 45-day period for public comment. The USGS hazard maps depict earthquake ground-shaking exceedance levels for various probabilities over a 50-year time period and are based on calculations at several hundred thousand sites across the U.S. Inputs to the hazard maps are based on scientific estimates of the locations, magnitudes, and rates of earthquakes as well as ground motion models describing each earthquake's ground shaking. We model rates of earthquakes either on known faults or as seismicity-based background earthquakes that account for unknown faults and an incomplete fault inventory. Probabilities of ground shaking are calculated from ground motion models that estimate the likely shaking caused by an earthquake. Several new datasets and models have been developed since the 2008 update of the maps. For the Central and Eastern U.S. we implemented a new moment magnitude catalog and completeness estimates, updated the maximum magnitude distribution, updated and tested the smoothing algorithms for adaptive and fixed-radius methods, extended the fault model -including the sizes and rates of New Madrid Seismic Zone earthquakes, considered induced earthquakes, and included updated and new ground motion models along with a new weighting scheme. 
In the Intermountain West we implemented new smoothing algorithms, fault geometry for normal faults, Wasatch fault model, and fault slip rates based on models obtained by inverting geodetic and geologic
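
    Exceedance probabilities on such maps relate to annualized rates under a Poisson assumption, P = 1 − exp(−λt); for example, the common 2%-in-50-years ground-shaking level corresponds to a return period of roughly 2,475 years:

    ```python
    import math

    def annual_rate(p_exceed, years):
        # Poisson occurrence: P = 1 - exp(-lambda * t)
        # => lambda = -ln(1 - P) / t
        return -math.log(1.0 - p_exceed) / years

    def return_period(p_exceed, years):
        # Mean recurrence interval of the exceedance, in years.
        return 1.0 / annual_rate(p_exceed, years)
    ```

    The same conversion works in reverse to express any annualized hazard rate as a probability over a building's design life.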

  14. Fixed point property for nonexpansive mappings and nonexpansive semigroups on the unit disk

    Directory of Open Access Journals (Sweden)

    Luis Benítez-Babilonia

    2015-06-01

    Full Text Available For closed convex subsets D of a Banach space, in 2009, Tomonari Suzuki [11] proved that the fixed point property (FPP) for nonexpansive mappings and the FPP for nonexpansive semigroups are equivalent. In this paper some relations between the aforementioned properties for mappings and semigroups defined on D, a closed convex subset of the hyperbolic metric space (D, ρ), are studied. This work arises as a generalization to the space (D, ρ) of the study made by Suzuki.

  15. Geologic map of Chickasaw National Recreation Area, Murray County, Oklahoma

    Science.gov (United States)

    Blome, Charles D.; Lidke, David J.; Wahl, Ronald R.; Golab, James A.

    2013-01-01

    This 1:24,000-scale geologic map is a compilation of previous geologic maps and new geologic mapping of areas in and around Chickasaw National Recreation Area. The geologic map includes revisions of numerous unit contacts and faults and a number of previously “undifferentiated” rock units were subdivided in some areas. Numerous circular-shaped hills in and around Chickasaw National Recreation Area are probably the result of karst-related collapse and may represent the erosional remnants of large, exhumed sinkholes. Geospatial registration of existing, smaller scale (1:72,000- and 1:100,000-scale) geologic maps of the area and construction of an accurate Geographic Information System (GIS) database preceded 2 years of fieldwork wherein previously mapped geology (unit contacts and faults) was verified and new geologic mapping was carried out. The geologic map of Chickasaw National Recreation Area and this pamphlet include information pertaining to how the geologic units and structural features in the map area relate to the formation of the northern Arbuckle Mountains and its Arbuckle-Simpson aquifer. The development of an accurate geospatial GIS database and the use of a handheld computer in the field greatly increased both the accuracy and efficiency in producing the 1:24,000-scale geologic map.

  16. Proving Correctness of Compilers Using Structured Graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    We present an approach to compiler implementation using Oliveira and Cook’s structured graphs that avoids the use of explicit jumps in the generated code. The advantage of our method is that it takes the implementation of a compiler using a tree type along with its correctness proof and turns it ...

  17. Criteria for Evaluating the Performance of Compilers

    Science.gov (United States)

    1974-10-01

    skilled programmer to take advantage of all of the environmental special features which could be exploited by a compiler. These programs are then... programs, except remove all statement labels. Subtract the base values obtained by compiling and running a program containing the

  18. The Molen compiler for reconfigurable architectures

    NARCIS (Netherlands)

    Moscu Panainte, E.

    2007-01-01

    In this dissertation, we present the Molen compiler framework that targets reconfigurable architectures under the Molen Programming Paradigm. More specifically, we introduce a set of compiler optimizations that address one of the main shortcomings of the reconfigurable architectures, namely the reco

  19. THE SHARP ESTIMATE OF THE THIRD HOMOGENEOUS EXPANSION FOR A CLASS OF STARLIKE MAPPINGS OF ORDER α ON THE UNIT POLYDISK IN Cn

    Institute of Scientific and Technical Information of China (English)

    Liu Xiaosong; Liu Taishun

    2012-01-01

    In this article, first, a sufficient condition for a starlike mapping of order α, f(x), defined on the unit ball in a complex Banach space is given. Second, the sharp estimate of the third homogeneous expansion for f is established as well, where f(z) = (f_1(z), f_2(z), …, f_n(z))' is a starlike mapping of order α or a normalized biholomorphic starlike mapping defined on the unit polydisk in C^n, and D^2 f_k(0)(z^2)/2 = z_k (Σ_{l=1}^{n} a_{kl} z_l), k = 1, 2, …, n, where a_{kl} = (1/2!) ∂^2 f_k(0)/∂z_k ∂z_l, k, l = 1, 2, …, n. Our result shows that the Bieberbach conjecture in several complex variables (the case of the third homogeneous expansion for starlike mappings of order α and biholomorphic starlike mappings) is partly proved.

  20. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    Helping programmers write parallel software is an urgent problem given the popularity of multi-core architectures. Engineering compilers which automatically parallelize and vectorize code has turned out to be very challenging. Consequently, compilers are very selective with respect to the coding...... patterns they can optimize. We present an interactive approach and a tool set which leverages advanced compiler analysis and optimizations while retaining programmer control over the source code and its transformation. This allows optimization even when programmers refrain from enabling optimizations...... to preserve accurate debug information or to avoid bugs in the compiler. It also allows the source code to carry optimizations from one compiler to another. Secondly, our tool-set provides feedback on why optimizations do not apply to a code fragment and suggests workarounds which can be applied automatically...

  1. Southwest Exotic Mapping Program (SWEMP) Database, 2007

    Science.gov (United States)

    Thomas, Kathryn A.; Guertin, Patricia

    2017-01-01

    The Southwest Exotic Plant Mapping Program (SWEMP) is a collaborative effort between the United States Geological Survey and federal, tribal, state, county and non-governmental organization (NGO) partners in the southwest. This project is an ongoing effort to compile and distribute regional data on the occurrence of non-native invasive plants in the southwestern United States. The database represents the known sites (represented by a point location, i.e. site) of non-native invasive plant infestations within Arizona and New Mexico, and adjacent portions of California, Colorado, Nevada and Utah. These data, collected from 1911 to 2006, represent the field observations of various state, federal, tribal and county agencies, along with some specimen data from Herbaria. The SWEMP database comprises a compilation of data submitted through 2006.

  2. Risk-Targeted versus Current Seismic Design Maps for the Conterminous United States

    Science.gov (United States)

    Luco, Nicolas; Ellingwood, Bruce R.; Hamburger, Ronald O.; Hooper, John D.; Kimball, Jeffrey K.; Kircher, Charles A.

    2007-01-01

    The probabilistic portions of the seismic design maps in the NEHRP Provisions (FEMA, 2003/2000/1997), and in the International Building Code (ICC, 2006/2003/2000) and ASCE Standard 7-05 (ASCE, 2005a), provide ground motion values from the USGS that have a 2% probability of being exceeded in 50 years. Under the assumption that the capacity against collapse of structures designed for these "uniform-hazard" ground motions is equal to, without uncertainty, the corresponding mapped value at the location of the structure, the probability of its collapse in 50 years is also uniform. This is not the case, however, when it is recognized that there is, in fact, uncertainty in the structural capacity. In that case, site-to-site variability in the shape of ground motion hazard curves results in a lack of uniformity. This paper explains the basis for proposed adjustments to the uniform-hazard portions of the seismic design maps currently in the NEHRP Provisions that result in uniform estimated collapse probability. For seismic design of nuclear facilities, analogous but specialized adjustments have recently been defined in ASCE Standard 43-05 (ASCE, 2005b). In support of the 2009 update of the NEHRP Provisions currently being conducted by the Building Seismic Safety Council (BSSC), herein we provide examples of the adjusted ground motions for a selected target collapse probability (or target risk). Relative to the probabilistic MCE ground motions currently in the NEHRP Provisions, the risk-targeted ground motions for design are smaller (by as much as about 30%) in the New Madrid Seismic Zone, near Charleston, South Carolina, and in the coastal region of Oregon, with relatively little (<15%) change almost everywhere else in the conterminous U.S.
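
The adjustment described above couples a site's ground motion hazard curve with a lognormal collapse fragility through a risk integral. The sketch below illustrates the computation; the power-law hazard curve, the fragility parameters, and all coefficients are illustrative assumptions, not USGS or NEHRP values.

```python
import math

# Hypothetical hazard curve: annual frequency of exceeding spectral
# acceleration a (in g). Power-law form and coefficients are illustrative.
def hazard(a, k0=1e-4, k=2.5):
    return k0 * a ** (-k)

# Lognormal collapse fragility: P(collapse | ground motion a), with median
# capacity c_median and logarithmic standard deviation beta (assumed value).
def fragility(a, c_median, beta=0.7):
    return 0.5 * (1.0 + math.erf(math.log(a / c_median) / (beta * math.sqrt(2.0))))

# Risk integral: annual collapse frequency = integral of P(C|a) * (-dlambda/da) da,
# approximated by summing over bins of the hazard curve.
def annual_collapse_freq(c_median, a_lo=0.01, a_hi=5.0, n=2000):
    total, da = 0.0, (a_hi - a_lo) / n
    for i in range(n):
        a = a_lo + (i + 0.5) * da
        dlam = hazard(a - 0.5 * da) - hazard(a + 0.5 * da)  # occurrence freq. in bin
        total += fragility(a, c_median) * dlam
    return total

# 50-year collapse probability for a structure whose median capacity equals
# an assumed uniform-hazard design motion of 0.5 g.
lam50 = annual_collapse_freq(c_median=0.5)
p50 = 1.0 - math.exp(-50.0 * lam50)
```

Because the shape of `hazard` varies from site to site, holding `p50` fixed (the risk target) and solving for the design motion yields different adjustments at different sites, which is the essence of the risk-targeted maps.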

  3. Forest resources of the United States, 2002: mapping the renewable resource planning act data

    Science.gov (United States)

    Cassandra M. Kurtz; Daniel J. Kaisershot; Dale D. Gormanson; Jeffery S. Wazenegger

    2009-01-01

    Forest Inventory and Analysis (FIA), a national program of the Forest Service, U.S. Department of Agriculture conducts and maintains comprehensive inventories of the forest resources in the United States. The Forest and Rangeland Renewable Resources Planning Act (RPA) of 1974 mandates a comprehensive assessment of past trends, current status, and the future potential...

  4. Geologic map and map database of northeastern San Francisco Bay region, California, [including] most of Solano County and parts of Napa, Marin, Contra Costa, San Joaquin, Sacramento, Yolo, and Sonoma Counties

    Science.gov (United States)

    Graymer, Russell Walter; Jones, David Lawrence; Brabb, Earl E.

    2002-01-01

    This digital map database, compiled from previously published and unpublished data, and new mapping by the authors, represents the general distribution of bedrock and surficial deposits in the mapped area. Together with the accompanying text file (nesfmf.ps, nesfmf.pdf, nesfmf.txt), it provides current information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:62,500 or smaller.

  5. Hierarchical Object-Based Mapping of Riverscape Units and in-Stream Mesohabitats Using LiDAR and VHR Imagery

    Directory of Open Access Journals (Sweden)

    Luca Demarchi

    2016-01-01

    Full Text Available In this paper, we present a new, semi-automated methodology for mapping hydromorphological indicators of rivers at a regional scale using multisource remote sensing (RS) data. This novel approach is based on the integration of spectral and topographic information within a multilevel, geographic, object-based image analysis (GEOBIA). Different segmentation levels were generated based on the two sources of Remote Sensing (RS) data, namely very-high spatial resolution, near-infrared imagery (VHR) and high-resolution LiDAR topography. At each level, different input object features were tested with Machine Learning classifiers for mapping riverscape units and in-stream mesohabitats. The GEOBIA approach proved to be a powerful tool for analyzing the river system at different levels of detail and for coupling spectral and topographic datasets, allowing for the delineation of the natural fluvial corridor with its primary riverscape units (e.g., water channel, unvegetated sediment bars, riparian densely-vegetated units, etc.) and in-stream mesohabitats with a high level of accuracy (K = 0.91 and K = 0.83, respectively). This method is flexible and can be adapted to different sources of data, with the potential to be implemented at regional scales in the future. The analyzed dataset, composed of VHR imagery and LiDAR data, is nowadays increasingly available at larger scales, notably through European Member States. At the same time, this methodology provides a tool for monitoring and characterizing the hydromorphological status of river systems continuously along the entire channel network and coherently through time, opening novel and significant perspectives to river science and management, notably for planning and targeting actions.
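
The per-object classification step can be sketched as follows. Each segmented object carries spectral (mean NIR reflectance) and topographic (mean height) features, and a trained classifier assigns a riverscape-unit label. The feature values, class names, and the nearest-centroid rule are illustrative stand-ins for the Machine Learning classifiers used in the paper.

```python
# class -> list of (mean_nir, mean_height_m) training feature vectors (assumed)
TRAINING = {
    "water_channel": [(0.05, 0.0), (0.08, 0.1)],
    "sediment_bar":  [(0.25, 0.4), (0.30, 0.6)],
    "riparian_veg":  [(0.70, 6.0), (0.65, 8.0)],
}

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

CENTROIDS = {label: centroid(vs) for label, vs in TRAINING.items()}

def classify(obj_features):
    """Assign the class whose training centroid is nearest (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(CENTROIDS, key=lambda label: dist2(obj_features, CENTROIDS[label]))

# A low-NIR, near-zero-height object should map to the water channel unit.
label = classify((0.07, 0.05))
```

In practice features would be standardized before computing distances; the multilevel aspect of the GEOBIA workflow corresponds to repeating this step on coarser and finer segmentations.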

  6. Geologic map of Oldonyo Lengai (Oldoinyo Lengai) Volcano and surroundings, Arusha Region, United Republic of Tanzania

    Science.gov (United States)

    Sherrod, David R.; Magigita, Masota M.; Kwelwa, Shimba

    2013-01-01

    The geology of Oldonyo Lengai volcano and the southernmost Lake Natron basin, Tanzania, is presented on this geologic map at scale 1:50,000. The map sheet can be downloaded in pdf format for online viewing or ready to print (48 inches by 36 inches). A 65-page explanatory pamphlet describes the geologic history of the area. Its goal is to place the new findings into the framework of previous investigations while highlighting gaps in knowledge. In this way questions are raised and challenges proposed to future workers. The southernmost Lake Natron basin is located along the East African rift zone in northern Tanzania. Exposed strata provide a history of volcanism, sedimentation, and faulting that spans 2 million years. It is here where Oldonyo Lengai, Tanzania’s most active volcano of the past several thousand years, built its edifice. Six new radiometric ages, by the 40Ar/39Ar method, and 48 new geochemical analyses from Oldonyo Lengai and surrounding volcanic features deepen our understanding of the area. Those who prefer the convenience and access offered by Geographic Information Systems (GIS) may download an electronic database, suitable for most GIS software applications. The GIS database is in a Transverse Mercator projection, zone 36, New (1960) Arc datum. The database includes layers for hypsography (topography), hydrography, and infrastructure such as roads and trails.

  7. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of airborne surveys of different sizes, resolutions, and acquisition years. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for, e.g., oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilations of magnetic data and the merger of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still very little recognized. The maximum wavelength that can be resolved in each individual survey is directly related to the survey size, and consequently a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine the airborne compilation with satellite magnetic data, which is sufficient only under particular preconditions. 
A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated
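
The long-wavelength substitution idea can be illustrated in one dimension: keep the short wavelengths of the airborne compilation, but take wavelengths longer than the survey size from a satellite-derived field. The moving-average low-pass filter and the profiles below are illustrative; real workflows operate on 2-D grids with spectral filters.

```python
def low_pass(profile, window):
    """Crude low-pass filter: centered moving average with edge clamping."""
    half, out = window // 2, []
    for i in range(len(profile)):
        lo, hi = max(0, i - half), min(len(profile), i + half + 1)
        out.append(sum(profile[lo:hi]) / (hi - lo))
    return out

def merge(airborne, satellite, window):
    """Replace the airborne long wavelengths with the satellite ones:
    merged = airborne - lowpass(airborne) + lowpass(satellite)."""
    air_long = low_pass(airborne, window)
    sat_long = low_pass(satellite, window)
    return [a - al + sl for a, al, sl in zip(airborne, air_long, sat_long)]

# A constant (erroneous) airborne baseline is replaced by the satellite level,
# while a short-wavelength anomaly survives the merge.
merged_flat = merge([1.0] * 50, [5.0] * 50, window=9)
spike = [0.0] * 21
spike[10] = 1.0
merged_spike = merge(spike, [0.0] * 21, window=9)
```

The choice of `window` corresponds to the crossover wavelength between the two datasets; the "gap in frequencies" mentioned above arises when no dataset constrains wavelengths between the survey size and the shortest wavelength the satellite model resolves.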

  8. Hydrogeologic unit map of the Piedmont and Blue Ridge provinces of North Carolina

    Science.gov (United States)

    Daniel, Charles C.; Payne, R.A.

    1990-01-01

    The numerous geologic formations and rock types in the Piedmont and Blue Ridge provinces of North Carolina have been grouped into 21 hydrogeologic units on the basis of their water-bearing potential as determined from rock origin, composition, and texture. All major classes of rocks--metamorphic, igneous, and sedimentary--are present, although metamorphic rocks are the most abundant. The origin of the hydrogeologic units is indicated by the rock class or subclass (metaigneous, metavolcanic, or metasedimentary). The composition of the igneous, metaigneous, and metavolcanic rocks is designated as felsic, intermediate, or mafic except for the addition in the metavolcanic group of epiclastic rocks and compositionally undifferentiated rocks. Composition is the controlling attribute in the classification of the metasedimentary units of gneiss (mafic or felsic), marble, and quartzite. The other metasediments are designated primarily on the basis of texture (grain size, degree of metamorphism, and development of foliation). Sedimentary rocks occur in the Piedmont in several downfaulted basins. A computerized data file containing records from more than 6,200 wells was analyzed to determine average well yields in each of the 21 units. The well yields were adjusted to an average well depth of 154 feet and an average diameter of 6 inches, the average of all wells in the data set, to remove the variation in well yield attributed to differences in depth and diameter. Average yields range from a high of 23.6 gallons per minute for schist to a low of 11.6 gallons per minute for sedimentary rocks of Triassic age.
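
The yield standardization described above can be sketched as a correction that brings every observed yield to the common well depth (154 ft) and diameter (6 in), so unit-to-unit comparisons are not confounded by well construction. The linear-in-log form and the correction coefficients below are illustrative assumptions, not the report's fitted values.

```python
import math

STD_DEPTH_FT = 154.0   # data-set average depth (from the report)
STD_DIAM_IN = 6.0      # data-set average diameter (from the report)
DEPTH_COEF = 8.0       # assumed gpm change per unit ln(depth)
DIAM_COEF = 5.0        # assumed gpm change per unit ln(diameter)

def adjusted_yield(yield_gpm, depth_ft, diam_in):
    """Adjust an observed yield to the standard depth and diameter by
    removing the modeled depth and diameter effects."""
    return (yield_gpm
            - DEPTH_COEF * (math.log(depth_ft) - math.log(STD_DEPTH_FT))
            - DIAM_COEF * (math.log(diam_in) - math.log(STD_DIAM_IN)))

# A well already at the standard depth and diameter is unchanged; a deeper
# well with the same raw yield adjusts downward.
baseline = adjusted_yield(20.0, 154.0, 6.0)
deep = adjusted_yield(20.0, 308.0, 6.0)
```

With such an adjustment applied to all 6,200+ wells, the per-unit averages (23.6 gpm for schist down to 11.6 gpm for Triassic sedimentary rocks) reflect the hydrogeologic units rather than drilling practice.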

  9. DESIGN AND COMPILATION OF AGRICULTURAL ELECTRONIC ATLAS AT COUNTY-LEVEL

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    With the rapid development and application of new techniques, cartography has entered the 21st century, changing from mono-medium, static maps into 3-D, multi-media, dynamic and network (including internet) forms, and gradually developing towards 4-D (time, space). There have appeared digital maps, electronic maps, soft maps, hard maps, interactive maps, mingle maps, etc. An agricultural map needs to include many more contents in 3-D and multi-media than other types of map, and only an electronic map can represent these contents completely. Compiling an agricultural electronic atlas at county level aims to reflect these contents scientifically. An agricultural electronic atlas at county level should take "sustainable development" as its theme and systematically reflect the natural resources and natural environment of a county, and the spatial and temporal distribution and changing law of agricultural resources (including climate, soil and water). In this paper the authors introduce the concrete contents of an agricultural electronic atlas, the compilation process, and the corresponding software and hardware, as well as an example. In designing an agricultural electronic atlas the most advanced multi-media techniques must be used. The procedure for an agricultural electronic atlas includes the study of the compilation aim, content selection analysis, overall framework and data organization, and determination of the compilation program. Agriculture includes many contents, and each county has its own emphasis. In designing we set up a county's theme according to its concrete situation, and the atlas contents are selected around the theme. For example, the main problem faced by the agriculture of Da'an City in Jilin Province is land desertification, so land desertification and its control are the theme of the agricultural electronic atlas of Da'an City. When we compile another county's agricultural electronic atlas, only by changing the theme contents can we get that county's agricultural electronic atlas.

  10. Mapping of earthquakes vulnerability area in Papua

    Science.gov (United States)

    Muhammad Fawzy Ismullah, M.; Massinai, Muh. Altin

    2016-05-01

    Geohazard is a geological occurrence which may lead to a huge loss for humans. Mitigation of these natural disasters is one important thing to be done properly in order to reduce the risks. One of the natural disasters that frequently occurs in the Papua Province is the earthquake. This study applies the principles of geospatial analysis for mapping the earthquake-prone areas in the Papua region. It uses earthquake data recorded over 36 years (1973-2009), a fault location map, and a ground acceleration map of the area. The earthquake and fault maps are rearranged into an earthquake density map, an earthquake depth density map, and a fault density map. These three maps are then overlaid onto the ground acceleration map to obtain an earthquake unit map. Some districts, such as Sarmi, Nabire, and Dogiyai, are identified by a high vulnerability index. On the other hand, the Waropen, Puncak, Merauke, Asmat, Mappi, and Bouven Digoel areas show a lower index. Finally, the vulnerability index in other places is detected as moderate.
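
The overlay step described above amounts to normalizing each input layer and combining them cell by cell into a single vulnerability index. The weights and the tiny 2x2 "maps" below are illustrative, not the study's values.

```python
def normalize(grid):
    """Rescale a 2-D grid to the 0..1 range (0 everywhere if constant)."""
    lo = min(min(row) for row in grid)
    hi = max(max(row) for row in grid)
    return [[(v - lo) / (hi - lo) if hi > lo else 0.0 for v in row] for row in grid]

def overlay(layers, weights):
    """Weighted per-cell sum of normalized layers -> vulnerability index grid."""
    norm = [normalize(g) for g in layers]
    rows, cols = len(layers[0]), len(layers[0][0])
    return [[sum(w * n[r][c] for w, n in zip(weights, norm))
             for c in range(cols)] for r in range(rows)]

# Illustrative input layers (earthquake density, depth density, fault
# density, peak ground acceleration) on a 2x2 grid.
quake_density = [[10, 2], [1, 0]]
depth_density = [[8, 3], [2, 1]]
fault_density = [[5, 1], [0, 0]]
ground_accel  = [[0.4, 0.2], [0.1, 0.1]]

index = overlay([quake_density, depth_density, fault_density, ground_accel],
                weights=[0.3, 0.2, 0.2, 0.3])  # weights sum to 1 (assumed)
```

Thresholding the resulting index grid into high, moderate, and low classes yields the district-level vulnerability categories reported above.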

  11. Mapping marginal croplands suitable for cellulosic feedstock crops in the Great Plains, United States

    Science.gov (United States)

    Gu, Yingxin; Wylie, Bruce K.

    2016-01-01

    Growing cellulosic feedstock crops (e.g., switchgrass) for biofuel is more environmentally sustainable than corn-based ethanol. Specifically, this practice can reduce soil erosion and water quality impairment from pesticides and fertilizer, improve ecosystem services and sustainability (e.g., serve as carbon sinks), and minimize impacts on global food supplies. The main goal of this study was to identify high-risk marginal croplands that are potentially suitable for growing cellulosic feedstock crops (e.g., switchgrass) in the US Great Plains (GP). Satellite-derived growing season Normalized Difference Vegetation Index, a switchgrass biomass productivity map obtained from a previous study, US Geological Survey (USGS) irrigation and crop masks, and US Department of Agriculture (USDA) crop indemnity maps for the GP were used in this study. Our hypothesis was that croplands with relatively low crop yield but high productivity potential for switchgrass may be suitable for converting to switchgrass. Areas with relatively low crop indemnity (crop indemnity failures. Results show that approximately 650 000 ha of marginal croplands in the GP are potentially suitable for switchgrass development. The total estimated switchgrass biomass productivity gain from these suitable areas is about 5.9 million metric tons. Switchgrass can be cultivated in either lowland or upland regions in the GP depending on the local soil and environmental conditions. This study improves our understanding of ecosystem services and the sustainability of cropland systems in the GP. Results from this study provide useful information to land managers for making informed decisions regarding switchgrass development in the GP.
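
The suitability screening described above can be sketched as a per-cell rule: flag cropland cells whose crop performance is relatively low but whose modeled switchgrass productivity is high, after excluding irrigated areas. The thresholds, field names, and cell records below are hypothetical illustrations of the masking logic, not the study's criteria.

```python
# (cell_id, growing-season NDVI, modeled switchgrass t/ha, irrigated?) -- assumed values
cells = [
    ("a1", 0.35, 9.0, False),   # low crop NDVI, high switchgrass potential
    ("a2", 0.70, 8.5, False),   # productive cropland -> keep in crops
    ("a3", 0.30, 3.0, False),   # marginal, but low switchgrass potential too
    ("a4", 0.32, 10.0, True),   # irrigated cell, excluded by the irrigation mask
]

NDVI_MAX = 0.40         # proxy threshold for "relatively low crop yield" (assumed)
SWITCHGRASS_MIN = 6.0   # threshold for "high productivity potential" (assumed)

suitable = [cid for cid, ndvi, sg, irrigated in cells
            if ndvi < NDVI_MAX and sg >= SWITCHGRASS_MIN and not irrigated]

# Estimated biomass gain from the suitable cells (illustrative aggregation).
gain = sum(sg for cid, ndvi, sg, irrigated in cells if cid in suitable)
```

Applied per 250-m or per-field raster cell across the Great Plains, this kind of rule is how the ~650,000 ha and ~5.9 million metric ton figures could be aggregated.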

  12. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2010-01-01

    patterns they can optimize. We present an interactive approach which leverages advanced compiler analysis and optimizations while retaining programmer control over the source code and its transformation. This allows optimization even when programmers refrain from enabling optimizations to preserve...... accurate debug information or to avoid bugs in the compiler. It also allows the source code to carry optimizations from one compiler to another. Secondly, our tool-set provides feedback on why optimizations do not apply to a code fragment and suggests workarounds which can be applied automatically. We...

  13. Visual Mapping of Sedimentary Facies Can Yield Accurate And Geomorphically Meaningful Results at Morphological Unit to River Segment Scales

    Science.gov (United States)

    Pasternack, G. B.; Wyrick, J. R.; Jackson, J. R.

    2014-12-01

    Long practiced in fisheries, visual substrate mapping of coarse-bedded rivers is eschewed by geomorphologists for inaccuracy and limited sizing data. Geomorphologists perform time-consuming measurements of surficial grains, with the few locations precluding spatially explicit mapping and analysis of sediment facies. Remote sensing works for bare land, but not vegetated or subaqueous sediments. As visual systems apply the log2 Wentworth scale made for sieving, they suffer from human inability to readily discern those classes. We hypothesized that size classes centered on the PDF of the anticipated sediment size distribution would enable field crews to accurately (i) identify presence/absence of each class in a facies patch and (ii) estimate the relative amount of each class to within 10%. We first tested 6 people using 14 measured samples with different mixtures. Next, we carried out facies mapping for ~ 37 km of the lower Yuba River in California. Finally, we tested the resulting data to see if it produced statistically significant hydraulic-sedimentary-geomorphic results. Presence/absence performance error was 0-4% for four people, 13% for one person, and 33% for one person. The last person was excluded from further effort. For the abundance estimation performance error was 1% for one person, 7-12% for three people, and 33% for one person. This last person was further trained and re-tested. We found that the samples easiest to visually quantify were unimodal and bimodal, while those most difficult had nearly equal amounts of each size. This confirms psychological studies showing that humans have a more difficult time quantifying abundances of subgroups when confronted with well-mixed groups. In the Yuba, mean grain size decreased downstream, as is typical for an alluvial river. When averaged by reach, mean grain size and bed slope were correlated with an r2 of 0.95. 
At the morphological unit (MU) scale, eight in-channel bed MU types had an r2 of 0.90 between mean

  14. Agricultural production in the United States by county: a compilation of information from the 1974 census of agriculture for use in terrestrial food-chain transport and assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Shor, R.W.; Baes, C.F. III; Sharp, R.D.

    1982-01-01

    Terrestrial food-chain models that simulate the transport of environmentally released radionuclides incorporate parameters describing agricultural production and practice. Often a single set of default parameters, such as that listed in USNRC Regulatory Guide 1.109, is used in lieu of site-specific information. However, the geographical diversity of agricultural practice in the United States suggests the limitations of a single set of default parameters for assessment models. This report documents default parameters with a county-wide resolution based on analysis of the 1974 US Census of Agriculture for use in terrestrial food chain models. Data reported by county, together with state-based information from the US Department of Agriculture, Economic and Statistics Service, provided the basis for estimates of model input parameters. This report also describes these data bases, their limitations, and lists default parameters by county. Vegetable production is described for four categories: leafy vegetables; vegetables and fruits exposed to airborne material; vegetables, fruits, and nuts protected from airborne materials; and grains. Livestock feeds were analyzed in categories of hay, silage, pasture, and grains. Pasture consumption was estimated from cattle and sheep inventories, their feed requirements, and reported quantities of harvested forage. The results were compared with assumed yields of the pasture areas reported. In addition, non-vegetable food production estimates including milk, beef, pork, lamb, poultry, eggs, goat milk, and honey are described. The agricultural parameters and land use information - in all 47 items - are tabulated in four appendices for each of the 3067 counties of the US reported to the Census of Agriculture, excluding those in Hawaii and Alaska.

  15. Mapping grasslands suitable for cellulosic biofuels in the Greater Platte River Basin, United States

    Science.gov (United States)

    Wylie, Bruce K.; Gu, Yingxin

    2012-01-01

    Biofuels are an important component in the development of alternative energy supplies, which is needed to achieve national energy independence and security in the United States. The most common biofuel product today in the United States is corn-based ethanol; however, its development is limited because of concerns about global food shortages, livestock and food price increases, and water demand increases for irrigation and ethanol production. Corn-based ethanol also potentially contributes to soil erosion, and pesticides and fertilizers affect water quality. Studies indicate that future potential production of cellulosic ethanol is likely to be much greater than grain- or starch-based ethanol. As a result, economics and policy incentives could, in the near future, encourage expansion of cellulosic biofuels production from grasses, forest woody biomass, and agricultural and municipal wastes. If production expands, cultivation of cellulosic feedstock crops, such as switchgrass (Panicum virgatum L.) and miscanthus (Miscanthus species), is expected to increase dramatically. The main objective of this study is to identify grasslands in the Great Plains that are potentially suitable for cellulosic feedstock (such as switchgrass) production. Producing ethanol from noncropland holdings (such as grassland) will minimize the effects of biofuel developments on global food supplies. Our pilot study area is the Greater Platte River Basin, which includes a broad range of plant productivity from semiarid grasslands in the west to the fertile corn belt in the east. The Greater Platte River Basin was the subject of related U.S. Geological Survey (USGS) integrated research projects.

  16. A Compilation of Internship Reports - 2012

    Energy Technology Data Exchange (ETDEWEB)

    Stegman M.; Morris, M.; Blackburn, N.

    2012-08-08

    This compilation documents all research projects undertaken by the 2012 summer Department of Energy - Workforce Development for Teachers and Scientists interns during their internship program at Brookhaven National Laboratory.

  17. Extension of Alvis compiler front-end

    Science.gov (United States)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr

    2015-12-01

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. An Alvis model's semantics finds expression in an LTS graph (labelled transition system). Execution of any language statement is expressed as a transition between formally defined states of such a model. An LTS graph is generated using a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as a part of the Alvis language to define parameters' types and operations on them. Thanks to the compiler's modular construction, many aspects of compilation of an Alvis model may be modified. Providing new plugins for the Alvis compiler that support languages like Java or C makes it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler's internal model and describes how the default specification language can be altered by new plugins.
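
The LTS view described above can be sketched generically: each model state bundles agent program counters and variable values, and every executed statement becomes a labelled transition; the graph is generated by exhaustive exploration from the initial state. The two-state toggle "model" below is an illustrative stand-in for what the Alvis compiler derives from real sources.

```python
from collections import deque

def step(state):
    """Enumerate (label, next_state) pairs for a tiny toggle agent.
    State is (program counter, flag value); the single statement negates flag."""
    pc, flag = state
    if pc == "start":
        return [("exec flag = not flag", ("start", not flag))]
    return []

def build_lts(initial):
    """Breadth-first generation of the labelled transition system."""
    states, edges, queue = {initial}, [], deque([initial])
    while queue:
        s = queue.popleft()
        for label, t in step(s):
            edges.append((s, label, t))
            if t not in states:
                states.add(t)
                queue.append(t)
    return states, edges

states, edges = build_lts(("start", False))
```

A language plugin in this picture only changes how `step` is derived from source statements; the LTS generation and any verification over the graph stay the same.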

  18. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  19. Mapping the potential distribution of the invasive Red Shiner, Cyprinella lutrensis (Teleostei: Cyprinidae) across waterways of the conterminous United States

    Science.gov (United States)

    Poulos, Helen M.; Chernoff, Barry; Fuller, Pam L.; Butman, David

    2012-01-01

    Predicting the future spread of non-native aquatic species continues to be a high priority for natural resource managers striving to maintain biodiversity and ecosystem function. Modeling the potential distributions of alien aquatic species through spatially explicit mapping is an increasingly important tool for risk assessment and prediction. Habitat modeling also facilitates the identification of key environmental variables influencing species distributions. We modeled the potential distribution of an aggressive invasive minnow, the red shiner (Cyprinella lutrensis), in waterways of the conterminous United States using maximum entropy (Maxent). We used inventory records from the USGS Nonindigenous Aquatic Species Database, native records for C. lutrensis from museum collections, and a geographic information system of 20 raster climatic and environmental variables to produce a map of potential red shiner habitat. Summer climatic variables were the most important environmental predictors of C. lutrensis distribution, which was consistent with the high temperature tolerance of this species. Results from this study provide insights into the locations and environmental conditions in the US that are susceptible to red shiner invasion.

  20. The RCCM 2009 Survey: Mapping Doctoral and Postdoctoral CAM Research in the United Kingdom

    Directory of Open Access Journals (Sweden)

    Nicola Robinson

    2011-01-01

    Full Text Available Complementary and Alternative Medicine (CAM) is widely available in the UK and used frequently by the public, but there is little high-quality research to sustain its continued use and potential integration into the NHS. There is, therefore, a need to develop rigorous research in this area. One essential way forward is to train and develop more CAM researchers so that we can enhance academic capacity and provide the evidence upon which to base strategic healthcare decisions. This UK survey identified 80 research-active postgraduates registered for MPhils/PhDs in 21 universities who were either current students or had completed their postgraduate degree during the recent UK Research Assessment Exercise (RAE), 2001–2008. The single largest postgraduate degree funder was the university where the students registered (26/80). Thirty-two projects involved randomized controlled trials and 33 used qualitative research methods. The UK RAE also indicates a significant growth of postdoctoral and tenured research activity over this period (in 2001 there were three full-time equivalents; in 2008 there were 15.5), with a considerable improvement in research quality. This mapping exercise suggests that considerable effort is currently being invested in developing UK CAM research capacity, which can thus inform decision making in this area. However, in comparative international terms UK funding is very limited. As in the USA and Australia, a centralized and strategic approach by the National Institute of Health Research to this currently uncoordinated and underfunded activity may benefit CAM research in the UK.

  1. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose {\\primula}-generated propositional instances have thousands of variables, and whose jointrees have clusters…

  2. Compilation of Pilot Cognitive Ability Norms

    Science.gov (United States)

    2011-12-01

    AFRL-SA-WP-TR-2012-0001, Compilation of Pilot Cognitive Ability Norms, Raymond E. King, U.S. Air Force School of Aerospace Medicine, December 2011. Among the normed cognitive subtests, Comprehension (Comp) measures "social acculturation" and "social intelligence."

  3. Verified Compilation of Floating-Point Computations

    OpenAIRE

    Boldo, Sylvie; Jourdan, Jacques-Henri; Leroy, Xavier; Melquiond, Guillaume

    2015-01-01

    International audience; Floating-point arithmetic is known to be tricky: roundings, formats, exceptional values. The IEEE-754 standard was a push towards straightening the field and made formal reasoning about floating-point computations easier and flourishing. Unfortunately, this is not sufficient to guarantee the final result of a program, as several other actors are involved: programming language, compiler, architecture. The CompCert formally-verified compiler provides a solution to this p...

  4. Compilation of Sandia Laboratories technical capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lundergan, C. D.; Mead, P. L. [eds.]

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078). (RWR)

  5. Electronic circuits for communications systems: A compilation

    Science.gov (United States)

    1972-01-01

    The compilation of electronic circuits for communications systems is divided into thirteen basic categories, each representing an area of circuit design and application. The compilation items are moderately complex and, as such, would appeal to the applications engineer. However, the rationale for the selection criteria was tailored so that the circuits would reflect fundamental design principles and applications, with an additional requirement for simplicity whenever possible.

  6. Geologic map of the Cook Inlet region, Alaska, including parts of the Talkeetna, Talkeetna Mountains, Tyonek, Anchorage, Lake Clark, Kenai, Seward, Iliamna, Seldovia, Mount Katmai, and Afognak 1:250,000-scale quadrangles

    Science.gov (United States)

    Wilson, Frederic H.; Hults, Chad P.; Schmoll, Henry R.; Haeussler, Peter J.; Schmidt, Jeanine M.; Yehle, Lynn A.; Labay, Keith A.

    2012-01-01

    In 1976, L.B. Magoon, W.L. Adkinson, and R.M. Egbert published a major geologic map of the Cook Inlet region, which has served well as a compilation of existing information and a guide for future research and mapping. The map in this report updates Magoon and others (1976) and incorporates new and additional mapping and interpretation. This map is also a revision of areas of overlap with the geologic map completed for central Alaska (Wilson and others, 1998). Text from that compilation remains appropriate and is summarized here; many compromises have been made in strongly held beliefs to allow construction of this compilation. Yet our willingness to make interpretations and compromises does not allow resolution of all mapping conflicts. Nonetheless, we hope that geologists who have mapped in this region will recognize that, in incorporating their work, our regional correlations may have required some generalization or lumping of map units. Many sources were used to produce this geologic map and, in most cases, data from available maps were combined, without generalization, and new data were added where available. A preliminary version of this map was published as U.S. Geological Survey Open-File Report 2009–1108. The main differences between the versions concern revised mapping of surficial deposits in the northern and eastern parts of the map area. Minor error corrections have also been made.

  7. Modeling and mapping the probability of occurrence of invasive wild pigs across the contiguous United States.

    Directory of Open Access Journals (Sweden)

    Meredith L McClure

    Full Text Available Wild pigs (Sus scrofa), also known as wild swine, feral pigs, or feral hogs, are one of the most widespread and successful invasive species around the world. Wild pigs have been linked to extensive and costly agricultural damage and present a serious threat to plant and animal communities due to their rooting behavior and omnivorous diet. We modeled the current distribution of wild pigs in the United States to better understand the physiological and ecological factors that may determine their invasive potential and to guide future study and eradication efforts. Using national-scale wild pig occurrence data reported between 1982 and 2012 by wildlife management professionals, we estimated the probability of wild pig occurrence across the United States using a logistic discrimination function and environmental covariates hypothesized to influence the distribution of the species. Our results suggest the distribution of wild pigs in the U.S. was most strongly limited by cold temperatures and availability of water, and that they were most likely to occur where potential home ranges had higher habitat heterogeneity, providing access to multiple key resources including water, forage, and cover. High probability of occurrence was also associated with frequent high temperatures, up to a high threshold. However, this pattern is driven by pigs' historic distribution in warm climates of the southern U.S. Further study of pigs' ability to persist in cold northern climates is needed to better understand whether low temperatures actually limit their distribution. Our model highlights areas at risk of invasion as those with habitat conditions similar to those found in pigs' current range that are also near current populations. This study provides a macro-scale approach to generalist species distribution modeling that is applicable to other generalist and invasive species.
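
The logistic form underlying an occurrence-probability model like the one above can be sketched schematically. The coefficients and covariates here are invented for illustration and are not the study's fitted values:

```python
import math

def occurrence_probability(covariates, coefficients, intercept):
    """Logistic model: p = 1 / (1 + exp(-(b0 + sum_i b_i * x_i)))."""
    score = intercept + sum(b * x for b, x in zip(coefficients, covariates))
    return 1.0 / (1.0 + math.exp(-score))

# Hypothetical covariates: (minimum winter temperature in C,
# water-availability index, habitat-heterogeneity index).
coefs = (0.15, 1.2, 0.8)  # illustrative signs: warmer, wetter, more varied -> higher p
warm_site = occurrence_probability((5.0, 0.8, 0.7), coefs, intercept=-2.0)
cold_site = occurrence_probability((-15.0, 0.3, 0.4), coefs, intercept=-2.0)
```

With these made-up coefficients, the cold, homogeneous site gets a much lower occurrence probability than the warm, heterogeneous one, echoing the abstract's finding that cold temperatures and water availability limit the distribution.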

  8. Mapping and Assessment of the United States Ocean Wave Energy Resource

    Energy Technology Data Exchange (ETDEWEB)

    Jacobson, Paul T; Hagerman, George; Scott, George

    2011-12-01

    This project estimates the naturally available and technically recoverable U.S. wave energy resources, using a 51-month Wavewatch III hindcast database developed especially for this study by National Oceanographic and Atmospheric Administration's (NOAA's) National Centers for Environmental Prediction. For total resource estimation, wave power density in terms of kilowatts per meter is aggregated across a unit diameter circle. This approach is fully consistent with accepted global practice and includes the resource made available by the lateral transfer of wave energy along wave crests, which enables wave diffraction to substantially reestablish wave power densities within a few kilometers of a linear array, even for fixed terminator devices. The total available wave energy resource along the U.S. continental shelf edge, based on accumulating unit circle wave power densities, is estimated to be 2,640 TWh/yr, broken down as follows: 590 TWh/yr for the West Coast, 240 TWh/yr for the East Coast, 80 TWh/yr for the Gulf of Mexico, 1570 TWh/yr for Alaska, 130 TWh/yr for Hawaii, and 30 TWh/yr for Puerto Rico. The total recoverable wave energy resource, as constrained by an array capacity packing density of 15 megawatts per kilometer of coastline, with a 100-fold operating range between threshold and maximum operating conditions in terms of input wave power density available to such arrays, yields a total recoverable resource along the U.S. continental shelf edge of 1,170 TWh/yr, broken down as follows: 250 TWh/yr for the West Coast, 160 TWh/yr for the East Coast, 60 TWh/yr for the Gulf of Mexico, 620 TWh/yr for Alaska, 80 TWh/yr for Hawaii, and 20 TWh/yr for Puerto Rico.
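
The regional figures for the total available resource can be cross-checked against the stated 2,640 TWh/yr total:

```python
# Available wave energy resource by region, TWh/yr (figures quoted in the abstract above).
available = {
    "West Coast": 590,
    "East Coast": 240,
    "Gulf of Mexico": 80,
    "Alaska": 1570,
    "Hawaii": 130,
    "Puerto Rico": 30,
}
total_available = sum(available.values())  # should match the reported 2,640 TWh/yr
```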

  9. SOL - SIZING AND OPTIMIZATION LANGUAGE COMPILER

    Science.gov (United States)

    Scotti, S. J.

    1994-01-01

    SOL is a computer language which is geared to solving design problems. SOL includes the mathematical modeling and logical capabilities of a computer language like FORTRAN but also includes the additional power of non-linear mathematical programming methods (i.e. numerical optimization) at the language level (as opposed to the subroutine level). The language-level use of optimization has several advantages over the traditional, subroutine-calling method of using an optimizer: first, the optimization problem is described in a concise and clear manner which closely parallels the mathematical description of optimization; second, a seamless interface is automatically established between the optimizer subroutines and the mathematical model of the system being optimized; third, the results of an optimization (objective, design variables, constraints, termination criteria, and some or all of the optimization history) are output in a form directly related to the optimization description; and finally, automatic error checking and recovery from an ill-defined system model or optimization description is facilitated by the language-level specification of the optimization problem. Thus, SOL enables rapid generation of models and solutions for optimum design problems with greater confidence that the problem is posed correctly. The SOL compiler takes SOL-language statements and generates the equivalent FORTRAN code and system calls. Because of this approach, the modeling capabilities of SOL are extended by the ability to incorporate existing FORTRAN code into a SOL program. In addition, SOL has a powerful MACRO capability. The MACRO capability of the SOL compiler effectively gives the user the ability to extend the SOL language and can be used to develop easy-to-use shorthand methods of generating complex models and solution strategies. The SOL compiler provides syntactic and semantic error-checking, error recovery, and detailed reports containing cross-references to show where

  10. Resonating, Rejecting, Reinterpreting: Mapping the Stabilization Discourse in the United Nations Security Council, 2000–14

    Directory of Open Access Journals (Sweden)

    David Curran

    2015-10-01

    Full Text Available This article charts the evolution of the conceptualisation of stabilization in the UN Security Council (UNSC) during the period 2001–2014. UNSC open meetings provide an important dataset for a critical review of stabilization discourse and an opportunity to chart the positions of permanent Members, rotating Members and the UN Secretariat towards this concept. This article is the first to conduct an analysis of this material to map the evolution of stabilization in this critical chamber of the UN. This dataset of official statements will be complemented by a review of open source reporting on UNSC meetings and national stabilization doctrines of the ‘P3’ – France, the UK and the US. These countries have developed national stabilization doctrines predominantly to deal with cross-governmental approaches to counterinsurgency operations conducted during the 2000s. The article therefore presents a genealogy of the concept of stabilization in the UNSC to help understand implications for its future development in this multilateral setting. This article begins by examining efforts by the P3 to ‘upload’ their conceptualisations of stabilization into UN intervention frameworks. Secondly, the article uses a content analysis of UNSC debates during 2000–2014 to explore the extent to which the conceptualisation of stabilization resonated with other Council members, were rejected in specific contexts or in general, or were re-interpreted by member states to suit alternative security agendas and interests. Therefore, the article not only examines the UNSC debates surrounding existing UN ‘stabilization operations’ (MONUSCO, MINUSTAH, MINUSCA, MINUSMA), which could be regarded as evidence that this ‘western’ concept has resonated with other UNSC members and relevant UN agencies, but also documents the appearance of stabilization in other contexts too. The article opens new avenues of research into concepts of stabilization within the UN, and

  11. Mapping Antimicrobial Stewardship in Undergraduate Medical, Dental, Pharmacy, Nursing and Veterinary Education in the United Kingdom.

    Directory of Open Access Journals (Sweden)

    Enrique Castro-Sánchez

    Full Text Available To investigate the teaching of antimicrobial stewardship (AS) in undergraduate healthcare educational degree programmes in the United Kingdom (UK). Cross-sectional survey of undergraduate programmes in human and veterinary medicine, dentistry, pharmacy and nursing in the UK. The main outcome measures included prevalence of AS teaching; stewardship principles taught; estimated hours apportioned; mode of content delivery and teaching strategies; evaluation methodologies; and frequency of multidisciplinary learning. 80% (112/140) of programmes responded adequately. The majority of programmes teach AS principles (88/109, 80.7%). 'Adopting necessary infection prevention and control precautions' was the most frequently taught principle (83/88, 94.3%), followed by 'timely collection of microbiological samples for microscopy, culture and sensitivity' (73/88, 82.9%) and 'minimisation of unnecessary antimicrobial prescribing' (72/88, 81.8%). The 'use of intravenous administration only to patients who are severely ill, or unable to tolerate oral treatment' was reported in ~50% of courses. Only 32/88 (36.3%) programmes included all recommended principles. Antimicrobial stewardship principles are included in most undergraduate healthcare and veterinary degree programmes in the UK. However, future professionals responsible for using antimicrobials receive disparate education. Education may be boosted by standardisation and strengthening of less frequently discussed principles.

  12. The application of compiler-assisted multiple instruction retry to VLIW architectures

    Science.gov (United States)

    Chen, Shyh-Kwei; Fuchs, W. K.; Hwu, Wen-Mei W.

    1994-01-01

    Very Long Instruction Word (VLIW) architectures enhance performance by exploiting fine-grained instruction level parallelism. We describe the development of two compiler-assisted multiple instruction word retry schemes for VLIW architectures. The first scheme utilizes the compiler techniques previously developed for processors with single functional units. A compiler generated hazard-free code with different degrees of rollback capability for uniprocessors is compacted by a modified VLIW trace scheduling algorithm. Nops are then inserted in the scheduled code words to resolve data hazards for VLIW architectures. Performance is compared under three parameters: the rollback distance for uniprocessors; the number of functional units; and the rollback distance for VLIW architectures. The second scheme employs a hardware read buffer to resolve frequently occurring data hazards, and utilizes the compiler to resolve the remaining hazards. Performance results are shown for six benchmark programs.
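
The nop-insertion step can be illustrated with a toy scheduler: whenever an instruction word reads a register written fewer than a fixed number of words earlier, pad the schedule with nops. This is a simplified sketch of the general idea, not the paper's actual algorithm:

```python
def insert_nops(words, latency=2):
    """Pad a VLIW schedule with nop words so that no word reads a register
    written fewer than `latency` words earlier (toy hazard model).

    Each word is a pair (writes, reads) of register-name sets; a nop is
    the empty pair (set(), set()).
    """
    NOP = (set(), set())
    out = []
    last_write = {}  # register name -> index in `out` of the word that wrote it
    for writes, reads in words:
        # Stall until every read is at least `latency` words after its write.
        while any(len(out) - last_write[r] < latency
                  for r in reads if r in last_write):
            out.append(NOP)
        for r in writes:
            last_write[r] = len(out)
        out.append((writes, reads))
    return out

# Word 0 writes r1; word 1 reads r1 immediately -> one nop must be inserted.
sched = insert_nops([({"r1"}, set()), (set(), {"r1"})], latency=2)
```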

  13. Mapping the solid-state properties of crystalline lysozyme during pharmaceutical unit-operations.

    Science.gov (United States)

    Mohammad, Mohammad Amin; Grimsey, Ian M; Forbes, Robert T

    2015-10-10

    Bulk crystallisation of protein therapeutic molecules towards their controlled drug delivery is of interest to the biopharmaceutical industry. The complexity of biotherapeutic molecules is likely to lead to complex material properties of crystals in the solid state and to complex transitions. This complexity is explored using batch crystallised lysozyme as a model. The effects of drying and milling on the solid-state transformations of lysozyme crystals were monitored using differential scanning calorimetry (DSC), X-ray powder diffraction (XRPD), FT-Raman, and enzymatic assay. XRPD was used to characterise crystallinity and these data supported those of crystalline lysozyme which gave a distinctive DSC thermogram. The apparent denaturation temperature (Tm) of the amorphous lysozyme was ∼201 °C, while the Tm of the crystalline form was ∼187 °C. Raman spectra supported a more α-helix rich structure of crystalline lysozyme. This structure is consistent with reduced cooperative unit sizes compared to the amorphous lysozyme and is consistent with a reduction in the Tm of the crystalline form. Evidence was obtained that milling also induced denaturation in the solid-state, with the denatured lysozyme showing no thermal transition. The denaturation of the crystalline lysozyme occurred mainly through its amorphous form. Interestingly, the mechanical denaturation of lysozyme did not affect its biological activity on dissolution. Lysozyme crystals on drying did not become amorphous, while milling-time played a crucial role in the crystalline-amorphous-denatured transformations of lysozyme crystals. DSC is shown to be a key tool to monitor quantitatively these transformations.

  14. On growth and covering theorems of quasi-convex mappings in the unit ball of a complex Banach space

    Institute of Scientific and Technical Information of China (English)

    张文俊; 刘太顺

    2002-01-01

    A class of biholomorphic mappings named "quasi-convex mapping" is introduced in the unit ball of a complex Banach space. It is proved that this class of mappings is a proper subset of the class of starlike mappings and contains the class of convex mappings properly, and it has the same growth and covering theorems as the convex mappings. Furthermore, when the Banach space is confined to Cn, the "quasi-convex mapping" is exactly the "quasi-convex mapping of type A" introduced by K. A. Roper and T. J. Suffridge.
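
For reference, the classical growth theorem for normalized convex mappings f of the unit ball B (the bounds the abstract says quasi-convex mappings share) can be stated as follows, assuming the usual normalization f(0) = 0, Df(0) = I:

```latex
\frac{\|z\|}{1+\|z\|} \;\le\; \|f(z)\| \;\le\; \frac{\|z\|}{1-\|z\|}, \qquad z \in B.
```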

  15. Flexible IDL Compilation for Complex Communication Patterns

    Directory of Open Access Journals (Sweden)

    Eric Eide

    1999-01-01

    Full Text Available Distributed applications are complex by nature, so it is essential that there be effective software development tools to aid in the construction of these programs. Commonplace "middleware" tools, however, often impose a tradeoff between programmer productivity and application performance. For instance, many CORBA IDL compilers generate code that is too slow for high-performance systems. More importantly, these compilers provide inadequate support for sophisticated patterns of communication. We believe that these problems can be overcome, thus making IDL compilers and similar middleware tools useful for a broader range of systems. To this end we have implemented Flick, a flexible and optimizing IDL compiler, and are using it to produce specialized high-performance code for complex distributed applications. Flick can produce specially "decomposed" stubs that encapsulate different aspects of communication in separate functions, thus providing application programmers with fine-grain control over all messages. The design of our decomposed stubs was inspired by the requirements of a particular distributed application called Khazana, and in this paper we describe our experience to date in refitting Khazana with Flick-generated stubs. We believe that the special IDL compilation techniques developed for Khazana will be useful in other applications with similar communication requirements.

  16. A pointer logic and certifying compiler

    Institute of Scientific and Technical Information of China (English)

    CHEN Yiyun; GE Lin; HUA Baojian; LI Zhaopeng; LIU Cheng; WANG Zhifang

    2007-01-01

    Proof-Carrying Code brings two big challenges to the research field of programming languages. One is to seek more expressive logics or type systems to specify or reason about the properties of low-level or high-level programs. The other is to study the technology of certifying compilation in which the compiler generates proofs for programs with annotations. This paper presents our progress in the above two aspects. A pointer logic was designed for PointerC (a C-like programming language) in our research. As an extension of Hoare logic, our pointer logic expresses the change of pointer information for each statement in its inference rules to support program verification. Meanwhile, based on the ideas from CAP (Certified Assembly Programming) and SCAP (Stack-based Certified Assembly Programming), a reasoning framework was built to verify the properties of object code in a Hoare style. And a certifying compiler prototype for PointerC was implemented based on this framework. The main contribution of this paper is the design of the pointer logic and the implementation of the certifying compiler prototype. In our certifying compiler, the source language contains rich pointer types and operations and also supports dynamic storage allocation and deallocation.

  17. Assessment and Mapping of the Riverine Hydrokinetic Resource in the Continental United States

    Energy Technology Data Exchange (ETDEWEB)

    Jacobson, Paul T. [Electric Power Research Institute; Ravens, Thomas M. [University of Alaska Anchorage; Cunningham, Keith W. [University of Alaska Fairbanks; Scott, George [National Renewable Energy Laboratory

    2012-12-14

    The U.S. Department of Energy (DOE) funded the Electric Power Research Institute and its collaborative partners, University of Alaska Anchorage, University of Alaska Fairbanks, and the National Renewable Energy Laboratory, to provide an assessment of the riverine hydrokinetic resource in the continental United States. The assessment benefited from input obtained during two workshops attended by individuals with relevant expertise and from a National Research Council panel commissioned by DOE to provide guidance to this and other concurrent, DOE-funded assessments of water-based renewable energy. These sources of expertise provided valuable advice regarding data sources and assessment methodology. The assessment of the hydrokinetic resource in the 48 contiguous states is derived from spatially explicit data contained in NHDPlus, a GIS-based database containing river segment-specific information on discharge characteristics and channel slope. 71,398 river segments with mean annual flow greater than 1,000 cubic feet per second (cfs) mean discharge were included in the assessment. Segments with discharge less than 1,000 cfs were dropped from the assessment, as were river segments with hydroelectric dams. The results for the theoretical and technical resource in the 48 contiguous states were found to be relatively insensitive to the cutoff chosen. Raising the cutoff to 1,500 cfs had no effect on the estimate of the technically recoverable resource, and the theoretical resource was reduced by 5.3%. The segment-specific theoretical resource was estimated from these data using the standard hydrological engineering equation that relates theoretical hydraulic power (Pth, watts) to discharge (Q, m3 s-1) and hydraulic head or change in elevation (Δh, m) over the length of the segment, where γ is the specific weight of water (9800 N m-3): Pth = γ Q Δh. For Alaska, which is not encompassed by NHDPlus, hydraulic head and discharge data were manually obtained from Idaho National
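
The segment power equation is straightforward to apply. A short example, using an illustrative discharge and head drop rather than figures from the assessment:

```python
def theoretical_hydraulic_power(discharge_m3s, head_drop_m, gamma=9800.0):
    """Theoretical hydraulic power of a river segment in watts:
    Pth = gamma * Q * dh, where gamma is the specific weight of water
    (9800 N/m^3), Q is discharge (m^3/s), and dh is the elevation
    change over the segment (m).
    """
    return gamma * discharge_m3s * head_drop_m

# Illustrative segment: 100 m^3/s (roughly 3,530 cfs) dropping 2 m over its length.
p_watts = theoretical_hydraulic_power(100.0, 2.0)  # 1.96 MW
```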

  18. Geologic Maps and Structure Sections of the southwestern Santa Clara Valley and southern Santa Cruz Mountains, Santa Clara and Santa Cruz Counties, California

    Science.gov (United States)

    McLaughlin, R.J.; Clark, J.C.; Brabb, E.E.; Helley, E.J.; Colon, C.J.

    2001-01-01

    This digital map database, compiled from previously published and unpublished data, and new mapping by the authors, represents the general distribution of bedrock and surficial deposits in the mapped area. Together with the accompanying text file (scvmf.ps, scvmf.pdf, scvmf.txt), it provides current information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:24,000 or smaller.

  19. Digital Data for the Reconnaissance Geologic Map for the Kuskokwim Bay Region of Southwest Alaska

    Science.gov (United States)

    Digital Files Preparation: Wilson, Frederic H.; Hults, Chad P.; Mohadjer, Solmaz; Shew, Nora; Labay, Keith A.; Geologic Map Compilers: Wilson, Frederic H.; Coonrad, Warren L.

    2008-01-01

    INTRODUCTION The growth in the use of Geographic Information Systems (GIS) has highlighted the need for digital geologic maps that have been attributed with information about age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This report is part of a series of integrated geologic map databases that cover the entire United States. Three national-scale geologic maps that portray most or all of the United States already exist; for the conterminous U.S., King and Beikman (1974a,b) compiled a map at a scale of 1:2,500,000, Beikman (1980) compiled a map for Alaska at 1:2,500,000 scale, and for the entire U.S., Reed and others (2005a,b) compiled a map at a scale of 1:5,000,000. A digital version of the King and Beikman map was published by Schruben and others (1994). Reed and Bush (2004) produced a digital version of the Reed and others (2005a) map for the conterminous U.S. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. The digital geologic maps presented here are in a standardized format as ARC/INFO export files and as ArcView shape files. Data tables that relate the map units to detailed lithologic and age information accompany these GIS files. The map is delivered as a set of 1:250,000-scale quadrangle files. To the best of our ability, these quadrangle files are edge-matched with respect to geology. When the maps are merged, the combined attribute tables can be used directly with the merged maps to make derivative maps.

  20. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler’s ability to generate loop-parallel code. We use this compilation system to modify two sequential benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should…

  1. A small evaluation suite for Ada compilers

    Science.gov (United States)

    Wilke, Randy; Roy, Daniel M.

    1986-01-01

    After completing a small Ada pilot project (OCC simulator) for the Multi Satellite Operations Control Center (MSOCC) at Goddard last year, the use of Ada to develop OCCs was recommended. To help MSOCC transition toward Ada, a suite of about 100 evaluation programs was developed which can be used to assess Ada compilers. These programs compare the overall quality of the compilation system, compare the relative efficiencies of the compilers and the environments in which they work, and compare the size and execution speed of generated machine code. Another goal of the benchmark software was to provide MSOCC system developers with rough timing estimates for the purpose of predicting performance of future systems written in Ada.

  2. Proof-Carrying Code with Correct Compilers

    Science.gov (United States)

    Appel, Andrew W.

    2009-01-01

    In the late 1990s, proof-carrying code was able to produce machine-checkable safety proofs for machine-language programs even though (1) it was impractical to prove correctness properties of source programs and (2) it was impractical to prove correctness of compilers. But now it is practical to prove some correctness properties of source programs, and it is practical to prove correctness of optimizing compilers. We can produce more expressive proof-carrying code, that can guarantee correctness properties for machine code and not just safety. We will construct program logics for source languages, prove them sound w.r.t. the operational semantics of the input language for a proved-correct compiler, and then use these logics as a basis for proving the soundness of static analyses.

  3. Extension of Alvis compiler front-end

    Energy Technology Data Exchange (ETDEWEB)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl [AGH University of Science and Technology, Department of Applied Computer Science, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2015-12-31

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. An Alvis model semantics finds expression in an LTS graph (labelled transition system). Execution of any language statement is expressed as a transition between formally defined states of such a model. An LTS graph is generated using a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as a part of the Alvis language to define parameters’ types and operations on them. Thanks to the compiler’s modular construction, many aspects of compilation of an Alvis model may be modified. Providing new plugins for the Alvis Compiler that support languages like Java or C makes it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler’s internal model and describes how the default specification language can be altered by new plugins.

  4. Code Generation in the Columbia Esterel Compiler

    Directory of Open Access Journals (Sweden)

    Jia Zeng

    2007-02-01

    Full Text Available The synchronous language Esterel provides deterministic concurrency by adopting a semantics in which threads march in step with a global clock and communicate in a very disciplined way. Its expressive power comes at a cost, however: it is a difficult language to compile into machine code for standard von Neumann processors. The open-source Columbia Esterel Compiler is a research vehicle for experimenting with new code generation techniques for the language. Providing a front-end and a fairly generic concurrent intermediate representation, a variety of back-ends have been developed. We present three of the most mature ones, which are based on program dependence graphs, dynamic lists, and a virtual machine. After describing the very different algorithms used in each of these techniques, we present experimental results comparing twenty-four benchmarks generated by eight different compilation techniques running on seven different processors.

  5. New Approach to Develop a Bilingual Compiler

    Directory of Open Access Journals (Sweden)

    Shampa Banik

    2014-02-01

    Full Text Available This research work presents the development of a Bangla programming language along with its compiler, with the aim of introducing programming to beginners through their mother tongue. The syntax and construction of the programming language have been kept similar to the BASIC language, considering that BASIC's syntax is very easy and therefore reasonably suitable as an introductory language for new programmers. A compiler has been developed for the proposed programming language that compiles the source code into an optimized intermediate code. We have developed our system in Java. Our software is an efficient translation engine which can translate English source code to Bangla source code. We have tested the system with a large number of test cases to identify which aspects of the system best explain its relative performance.

  6. A Pad Router for the Monterey Silicon Compiler

    Science.gov (United States)

    1988-03-01

    program and the final chip layout. In 1986, M. A. Malagon-Fajar [Ref. 6] completed a valuable study on the relationship between the compiler and its layout...the first SCMOS cells, and E. Malagon [Ref. 9] described the structure of the data-path and inserted the first SCMOS organelles. That same...organelles are stacked vertically to form a unit. A description of the MacPitts data-path design and routing organization is presented by E. Malagon [Ref

  7. Assessment and Mapping of the Riverine Hydrokinetic Resource in the Continental United States

    Energy Technology Data Exchange (ETDEWEB)

    Jacobson, Paul T. [Electric Power Research Institute; Ravens, Thomas M. [University of Alaska Anchorage; Cunningham, Keith W. [University of Alaska Fairbanks; Scott, George [National Renewable Energy Laboratory

    2012-12-14

    The U.S. Department of Energy (DOE) funded the Electric Power Research Institute and its collaborative partners, the University of Alaska Anchorage, the University of Alaska Fairbanks, and the National Renewable Energy Laboratory, to provide an assessment of the riverine hydrokinetic resource in the continental United States. The assessment benefited from input obtained during two workshops attended by individuals with relevant expertise and from a National Research Council panel commissioned by DOE to provide guidance to this and other concurrent, DOE-funded assessments of water-based renewable energy. These sources of expertise provided valuable advice regarding data sources and assessment methodology. The assessment of the hydrokinetic resource in the 48 contiguous states is derived from spatially explicit data contained in NHDPlus, a GIS-based database containing river-segment-specific information on discharge characteristics and channel slope. 71,398 river segments with mean annual discharge greater than 1,000 cubic feet per second (cfs) were included in the assessment. Segments with discharge less than 1,000 cfs were dropped from the assessment, as were river segments with hydroelectric dams. The results for the theoretical and technical resource in the 48 contiguous states were found to be relatively insensitive to the cutoff chosen: raising the cutoff to 1,500 cfs had no effect on the estimate of the technically recoverable resource, and the theoretical resource was reduced by 5.3%. The segment-specific theoretical resource was estimated from these data using the standard hydrological engineering equation that relates theoretical hydraulic power (Pth, watts) to discharge (Q, m3 s-1) and hydraulic head, or change in elevation (Δh, m), over the length of the segment, where γ is the specific weight of water (9800 N m-3): Pth = γ Q Δh. For Alaska, which is not encompassed by NHDPlus, hydraulic head and discharge data were manually obtained from Idaho National
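
    The segment-level calculation described above reduces to one line of arithmetic. The following is a minimal Python sketch of that equation with illustrative values (not data from the assessment):

    ```python
    # Theoretical hydraulic power of a river segment: P_th = gamma * Q * dh,
    # with gamma the specific weight of water (9800 N/m^3), Q the discharge
    # (m^3/s), and dh the elevation change over the segment (m).

    GAMMA = 9800.0          # specific weight of water, N m^-3
    CFS_TO_M3S = 0.0283168  # cubic feet per second -> cubic metres per second

    def theoretical_power_watts(discharge_m3s, head_m):
        """Theoretical hydraulic power P_th (watts) for one segment."""
        return GAMMA * discharge_m3s * head_m

    # Hypothetical segment at the assessment's 1,000 cfs cutoff, dropping 2 m:
    p = theoretical_power_watts(1000 * CFS_TO_M3S, 2.0)
    print(round(p / 1000, 1), "kW")  # 555.0 kW
    ```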

  8. Research on the Geo-spatial data compiling of the Circum-Pacific area

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeong-Chan; Chun, Hee-Young; Kim, You-Bong [Korea Institute of Geology Mining and Materials, Taejon (KR)] (and others)

    1999-12-01

    This project focuses on the compilation and digitization of geo-spatial data such as lithostratigraphic lexicon and biostratigraphic data of major sedimentary basins and absolute geologic age data of major rock units including sedimentary, metamorphic and igneous rocks. During 1997, the first year of this project, the Pohang Basin (Tertiary) was chosen as the target for compilation and digitization of lithostratigraphic lexicon and biostratigraphic data. A total of 32 lithostratigraphic lexicons and 36 biostratigraphic data were compiled. During 1998, the second year of this project, we compiled lithostratigraphic lexicon and biostratigraphic data of Mesozoic strata in two sedimentary basins: the Chungnam Coal field and Gyeongsang Basin. A total of 9 lithostratigraphic lexicons were compiled from the Chungnam Coal field, and 42 lithostratigraphic lexicons were compiled from the Gyeongsang Basin. Each lexicon includes name, rank, type locality or section, index, historical records, geographic distribution and references. Due to the lack of fossils in both Mesozoic non-marine basins, the biostratigraphic data could not be compiled according to biozone. Rather, biostratigraphic data were collected in reference to each lithostratigraphic unit (i.e., formation). In this year, the third year of this project, the Joseon Supergroup (Cambrian-Ordovician) was chosen as the target for compilation and digitization of lithostratigraphic lexicon and biostratigraphic data. A total of 51 lithostratigraphic lexicons and 33 biostratigraphic data were compiled. Data acquired from this project were exchanged with compatible data from CCOP member countries (East Asia), resulting in the production of the WGGC CD-ROM (GEOLOGICAL CORRELATION: Lexicon, Biostratigraphy, Geochronology Data Bases Version 5.2). (author). 18 refs.

  9. Bedrock Geologic Map of the Jay Peak, VT Quadrangle

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital data from VG99-1 Compilation bedrock geologic map of the Jay Peak quadrangle, Compiled by B. Doolan, 1999: VGS Open-File Report VG99-1, 1 plate, scale...

  10. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...... by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks, whose PRIMULA--generated propositional instances have thousands of variables, and whose jointrees have clusters...
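
    The compile-then-evaluate/differentiate scheme can be illustrated on a toy circuit. Below is a minimal Python sketch (not the PRIMULA/ACE code; the node structure and example values are invented for illustration) of one upward evaluation pass and one downward differentiation pass, each linear in the size of this tree-shaped circuit:

    ```python
    # A toy arithmetic circuit: a DAG of sum ('+') and product ('*') nodes
    # over numeric leaves. One bottom-up pass evaluates the circuit; one
    # top-down pass accumulates partial derivatives at every node.

    class Node:
        def __init__(self, op=None, children=(), value=0.0):
            self.op = op          # None for a leaf, '+' or '*' otherwise
            self.children = children
            self.value = value
            self.derivative = 0.0

    def evaluate(node):
        if node.op == '+':
            node.value = sum(evaluate(c) for c in node.children)
        elif node.op == '*':
            node.value = 1.0
            for c in node.children:
                node.value *= evaluate(c)
        return node.value

    def differentiate(node):
        # Call with node.derivative pre-set to 1.0 on the root.
        if node.op == '+':
            for c in node.children:
                c.derivative += node.derivative
                differentiate(c)
        elif node.op == '*':
            for c in node.children:
                partial = node.derivative
                for s in node.children:
                    if s is not c:
                        partial *= s.value
                c.derivative += partial
                differentiate(c)

    # Network polynomial of one binary variable: f = t1*l1 + t2*l2
    t1, l1 = Node(value=0.25), Node(value=1.0)  # parameter and evidence leaves
    t2, l2 = Node(value=0.75), Node(value=1.0)
    f = Node('+', (Node('*', (t1, l1)), Node('*', (t2, l2))))
    evaluate(f)
    f.derivative = 1.0
    differentiate(f)
    print(f.value, l1.derivative)  # 1.0 0.25
    ```

    For a circuit with shared subexpressions (a true DAG, as compilation produces), the same two passes would be run over a topological order with memoization rather than by plain recursion.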

  11. COMPILATION OF CURRENT HIGH ENERGY PHYSICS EXPERIMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.; Horne, C.P.; Hutchinson, M.S.; Rittenberg, A.; Trippe, T.G.; Yost, G.P.; Addis, L.; Ward, C.E.W.; Baggett, N.; Goldschmidt-Clermong, Y.; Joos, P.; Gelfand, N.; Oyanagi, Y.; Grudtsin, S.N.; Ryabov, Yu.G.

    1981-05-01

    This is the fourth edition of our compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about April 1981, and (2) had not completed taking of data by 1 January 1977. We emphasize that only approved experiments are included.

  12. Global Geological Map of Venus

    Science.gov (United States)

    Ivanov, M. A.

    2008-09-01

    Introduction: The Magellan SAR images provide sufficient data to compile a geological map of nearly the entire surface of Venus. Such a global and self-consistent map serves as the base to address the key questions of the geologic history of Venus. 1) What is the spectrum of units and structures that makes up the surface of Venus [1-3]? 2) What volcanic/tectonic processes do they characterize [4-7]? 3) Did these processes operate locally, regionally, or globally [8-11]? 4) What are the relationships of relative time among the units [8]? 5) At which length scale do these relationships appear to be consistent [8-10]? 6) What is the absolute timing of formation of the units [12-14]? 7) What are the histories of volcanism, tectonics and the long-wavelength topography on Venus? 8) What model(s) of heat loss and lithospheric evolution [15-21] do these histories correspond to? The ongoing USGS program of Venus mapping has already resulted in a series of published maps at the scale 1:5M [e.g., 22-30]. These maps have a patch-like distribution, however, and were compiled by authors with different mapping philosophies. This situation does not always result in perfect agreement between neighboring areas and, thus, does not permit testing geological hypotheses that could be addressed with a self-consistent map. Here the results of global geological mapping of Venus at the scale 1:10M are presented. The map represents a contiguous area extending from 82.5oN to 82.5oS and comprises ~99% of the planet. Mapping procedure: The map was compiled on C2-MIDR sheets, the resolution of which permits identifying the basic characteristics of previously defined units. Higher resolution images were used during the mapping to clarify geologic relationships. When the map was completed, its quality was checked using published USGS maps [e.g., 22-30] and the catalogue of impact craters [31]. The results suggest that mapping on the C2 base provided a high-quality map product. Units and

  13. Recent advances in geologic mapping by radar

    Science.gov (United States)

    Farr, T. G.

    1984-01-01

    Quantitative techniques are available which allow the analysis of SAR images for the derivation of geological surface and process data. In conjunction with calibrated radar sensors operating at several incidence angles, wavelengths, and polarizations, the compilation of multiparameter radar signatures of lithological and geomorphic units can accordingly proceed for geological mapping in unknown areas. While radar image tone can be used in arid zones to derive surface micromorphology, heavily vegetated tropical regions require the analysis of radar image texture by means of Fourier techniques which decompose the image into bandpasses that represent different scales of texture.

  14. How to compile a curriculum vitae.

    Science.gov (United States)

    Fish, J

    The previous article in this series tackled the best way to apply for a job. Increasingly, employers request a curriculum vitae as part of the application process. This article aims to assist you in compiling a c.v. by discussing its essential components and content.

  15. Heat Transfer and Thermodynamics: a Compilation

    Science.gov (United States)

    1974-01-01

    A compilation is presented for the dissemination of information on technological developments which have potential utility outside the aerospace and nuclear communities. Studies include theories and mechanical considerations in the transfer of heat and the thermodynamic properties of matter and the causes and effects of certain interactions.

  16. Medical History: Compiling Your Medical Family Tree

    Science.gov (United States)

    ... history. Or, you can compile your family's health history on your computer or in a paper file. If you encounter reluctance from your family, consider these strategies: Share your ... have a family history of certain diseases or health conditions. Offer to ...

  17. Communications techniques and equipment: A compilation

    Science.gov (United States)

    1975-01-01

    This Compilation is devoted to equipment and techniques in the field of communications. It contains three sections. One section is on telemetry, including articles on radar and antennas. The second section describes techniques and equipment for coding and handling data. The third and final section includes descriptions of amplifiers, receivers, and other communications subsystems.

  18. Geologic and Mineral Resource Map of Afghanistan

    Science.gov (United States)

    Doebrich, Jeff L.; Wahl, Ronald R.; With Contributions by Ludington, Stephen D.; Chirico, Peter G.; Wandrey, Craig J.; Bohannon, Robert G.; Orris, Greta J.; Bliss, James D.; Wasy, Abdul; Younusi, Mohammad O.

    2006-01-01

    Data Summary The geologic and mineral resource information shown on this map is derived from digitization of the original data from Abdullah and Chmyriov (1977) and Abdullah and others (1977). The U.S. Geological Survey (USGS) has made no attempt to modify original geologic map-unit boundaries and faults as presented in Abdullah and Chmyriov (1977); however, modifications to map-unit symbology, and minor modifications to map-unit descriptions, have been made to clarify lithostratigraphy and to modernize terminology. Labeling of map units has not been attempted where they are small or narrow, in order to maintain legibility and to preserve the map's utility in illustrating regional geologic and structural relations. Users are encouraged to refer to the series of USGS/AGS (Afghan Geological Survey) 1:250,000-scale geologic quadrangle maps of Afghanistan that are being released concurrently as open-file reports. The classification of mineral deposit types is based on the authors' interpretation of existing descriptive information (Abdullah and others, 1977; Bowersox and Chamberlin, 1995; Orris and Bliss, 2002) and on limited field investigations by the authors. Deposit-type nomenclature used for nonfuel minerals is modified from published USGS deposit-model classifications, as compiled in Stoeser and Heran (2000). New petroleum localities are based on research of archival data by the authors. The shaded-relief base is derived from Shuttle Radar Topography Mission (SRTM) digital elevation model (DEM) data having 85-meter resolution. Gaps in the original SRTM DEM dataset were filled with data digitized from contours on 1:200,000-scale Soviet General Staff Sheets (1978-1997). 
The marginal extent of geologic units corresponds to the position of the international boundary as defined by Abdullah and Chmyriov (1977), and the international boundary as shown on this map was acquired from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af) in

  19. Compilation of tRNA sequences.

    Science.gov (United States)

    Sprinzl, M; Grueter, F; Spelzhaus, A; Gauss, D H

    1980-01-11

    This compilation presents in a small space the tRNA sequences so far published. The numbering of tRNAPhe from yeast is used, following the rules proposed by the participants of the Cold Spring Harbor Meeting on tRNA 1978 (1,2; Fig. 1). This numbering allows comparisons with the three-dimensional structure of tRNAPhe. The secondary structure of tRNAs is indicated by specific underlining. In the primary structure a nucleoside followed by a nucleoside in brackets or a modification in brackets denotes that both types of nucleosides can occupy this position. Part of a sequence in brackets designates a piece of sequence not unambiguously analyzed. Rare nucleosides are named according to the IUPAC-IUB rules (for complicated rare nucleosides and their identification see Table 1); those with lengthy names are given with the prefix x and specified in the footnotes. Footnotes are numbered according to the coordinates of the corresponding nucleoside and are indicated in the sequence by an asterisk. The references are restricted to the citation of the latest publication in those cases where several papers deal with one sequence. For additional information the reader is referred either to the original literature or to other tRNA sequence compilations (3-7). Mutant tRNAs are dealt with in a compilation by J. Celis (8). The compilers would welcome any information from the readers regarding missing material or erroneous presentation. On the basis of this numbering system, computer-printed compilations of tRNA sequences in a linear form and in cloverleaf form are in preparation.

  20. HAL/S-FC compiler system functional specification

    Science.gov (United States)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package and the restrictions and dependencies of the HAL/S-FC system are also considered.

  1. A Note on Linearly Isometric Extension for 1-Lipschitz and Anti-1-Lipschitz Mappings between Unit Spheres of ALp(μ, H) Spaces

    Institute of Scientific and Technical Information of China (English)

    Zihou ZHANG; Chunyan LIU

    2013-01-01

    In this paper, we show that if V0 is a 1-Lipschitz mapping between the unit spheres of Lp(μ,H) and Lp(ν,H) (p > 2, H a Hilbert space), and −V0(S(Lp(μ,H))) ⊆ V0(S(Lp(μ,H))), then V0 can be extended to a linear isometry defined on the whole space. If 1 < p < 2 and V0 is an "anti-1-Lipschitz" mapping, then V0 can also be linearly and isometrically extended.

  2. Testing-Based Compiler Validation for Synchronous Languages

    Science.gov (United States)

    Garoche, Pierre-Loic; Howar, Falk; Kahsai, Temesghen; Thirioux, Xavier

    2014-01-01

    In this paper we present a novel lightweight approach to validate compilers for synchronous languages. Instead of verifying a compiler for all input programs or providing a fixed suite of regression tests, we extend the compiler to generate a test-suite with high behavioral coverage and geared towards discovery of faults for every compiled artifact. We have implemented and evaluated our approach using a compiler from Lustre to C.
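
    The per-artifact validation idea can be sketched in miniature. The following Python sketch is a toy stand-in, not the Lustre-to-C toolchain: it checks a "compiled" resettable counter against its reference semantics over every short Boolean input stream, giving full behavioral coverage for this one node. All names are invented for illustration.

    ```python
    # Testing-based validation: instead of proving the compiler correct for
    # all inputs, compare each compiled artifact against a reference
    # semantics on a generated, high-coverage test suite.

    import itertools

    def reference_counter(stream):
        """Reference semantics of a toy synchronous node: count clock
        ticks, resetting to 0 whenever the Boolean input is True."""
        out, n = [], 0
        for reset in stream:
            n = 0 if reset else n + 1
            out.append(n)
        return out

    def compiled_counter(stream):
        """Stand-in for the generated (compiled) implementation under test."""
        out, n = [], 0
        for reset in stream:
            if reset:
                n = 0
            else:
                n += 1
            out.append(n)
        return out

    def validate(depth=8):
        """Exhaustively compare the two implementations on every Boolean
        input stream of length 1..depth."""
        for length in range(1, depth + 1):
            for stream in itertools.product([False, True], repeat=length):
                if compiled_counter(stream) != reference_counter(stream):
                    return False
        return True

    print(validate())  # True
    ```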

  3. Geologic Map of the Thaumasia Region, Mars

    Science.gov (United States)

    Dohm, Janes M.; Tanaka, Kenneth L.; Hare, Trent M.

    2001-01-01

    The geology of the Thaumasia region (fig. 1, sheet 3) includes a wide array of rock materials, depositional and erosional landforms, and tectonic structures. The region is dominated by the Thaumasia plateau, which includes central high lava plains ringed by highly deformed highlands; the plateau may comprise the ancestral center of Tharsis tectonism (Frey, 1979; Plescia and Saunders, 1982). The extensive structural deformation of the map region, which is without parallel on Mars in both complexity and diversity, occurred largely throughout the Noachian and Hesperian periods (Tanaka and Davis, 1988; Scott and Dohm, 1990a). The deformation produced small and large extensional and contractional structures (fig. 2, sheet 3) that resulted from stresses related to the formation of Tharsis (Frey, 1979; Wise and others, 1979; Plescia and Saunders, 1982; Banerdt and others, 1982, 1992; Watters and Maxwell, 1986; Tanaka and Davis, 1988; Francis, 1988; Watters, 1993; Schultz and Tanaka, 1994), from magmatic-driven uplifts, such as at Syria Planum (Tanaka and Davis, 1988; Dohm and others, 1998; Dohm and Tanaka, 1999) and central Valles Marineris (Dohm and others, 1998, Dohm and Tanaka, 1999), and from the Argyre impact (Wilhelms, 1973; Scott and Tanaka, 1986). In addition, volcanic, eolian, and fluvial processes have highly modified older surfaces in the map region. Local volcanic and tectonic activity often accompanied episodes of valley formation. Our mapping depicts and describes the diverse terrains and complex geologic history of this unique ancient tectonic region of Mars. The geologic (sheet 1), paleotectonic (sheet 2), and paleoerosional (sheet 3) maps of the Thaumasia region were compiled on a Viking 1:5,000,000-scale digital photomosaic base. The base is a combination of four quadrangles: the southeast part of Phoenicis Lacus (MC–17), most of the southern half of Coprates (MC–18), a large part of Thaumasia (MC–25), and the northwest margin of Argyre (MC–26

  4. Automatic code generation in SPARK: Applications of computer algebra and compiler-compilers

    Energy Technology Data Exchange (ETDEWEB)

    Nataf, J.M.; Winkelmann, F.

    1992-09-01

    We show how computer algebra and compiler-compilers are used for automatic code generation in the Simulation Problem Analysis and Research Kernel (SPARK), an object oriented environment for modeling complex physical systems that can be described by differential-algebraic equations. After a brief overview of SPARK, we describe the use of computer algebra in SPARK's symbolic interface, which generates solution code for equations that are entered in symbolic form. We also describe how the Lex/Yacc compiler-compiler is used to achieve important extensions to the SPARK simulation language, including parametrized macro objects and steady-state resetting of a dynamic simulation. The application of these methods to solving the partial differential equations for two-dimensional heat flow is illustrated.

  6. Process compilation methods for thin film devices

    Science.gov (United States)

    Zaman, Mohammed Hasanuz

    This doctoral thesis presents the development of a systematic method of automatic generation of fabrication processes (or process flows) for thin film devices starting from schematics of the device structures. This new top-down design methodology combines formal mathematical flow construction methods with a set of library-specific available resources to generate flows compatible with a particular laboratory. Because this methodology combines laboratory resource libraries with a logical description of thin film device structure and generates a set of sequential fabrication processing instructions, this procedure is referred to as process compilation, in analogy to the procedure used for compilation of computer programs. Basically, the method developed uses a partially ordered set (poset) representation of the final device structure which describes the order between its various components expressed in the form of a directed graph. Each of these components is essentially fabricated "one at a time" in a sequential fashion. If the directed graph is acyclic, the sequence in which these components are fabricated is determined from the poset linear extensions, and the component sequence is finally expanded into the corresponding process flow. This graph-theoretic process flow construction method is powerful enough to formally prove the existence and multiplicity of flows, thus creating a design space D suitable for optimization. The cardinality |D| for a device with N components can be large, with a worst case |D| ≤ (N−1)!, yielding in general a combinatorial explosion of solutions. The number of solutions is hence controlled through a priori estimates of |D| and condensation (i.e., reduction) of the device component graph. The mathematical method has been implemented in a set of algorithms that are parts of the software tool MISTIC (Michigan Synthesis Tools for Integrated Circuits). 
MISTIC is a planar process compiler that generates
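
    The poset/linear-extension construction described above can be sketched directly: device components form a DAG of "must be fabricated first" constraints, and each linear extension of that poset is one admissible fabrication sequence. The component names and dependency graph below are invented for illustration, not MISTIC's actual data model.

    ```python
    # Enumerate all linear extensions (valid fabrication sequences) of a
    # small component DAG. For a chain-free graph after the first
    # component, the count approaches the (N-1)! worst case noted above.

    def linear_extensions(deps):
        """Yield every total order of the nodes consistent with `deps`,
        where deps[n] is the set of nodes that must precede n."""
        def extend(placed, remaining):
            if not remaining:
                yield list(placed)
                return
            for n in sorted(remaining):
                if deps[n] <= set(placed):  # all prerequisites already placed
                    yield from extend(placed + [n], remaining - {n})
        yield from extend([], set(deps))

    # Tiny four-component "device": substrate first, two independent
    # layers, then a capping layer that needs both.
    deps = {
        "substrate": set(),
        "metal": {"substrate"},
        "oxide": {"substrate"},
        "cap": {"metal", "oxide"},
    }
    flows = list(linear_extensions(deps))
    print(len(flows), "valid process flows")  # 2 valid process flows
    ```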

  7. Compiling CIL Rewriting Language for Multiprocessors

    Institute of Scientific and Technical Information of China (English)

    田新民; 王鼎兴; 等

    1994-01-01

    The high-level Compiler Intermediate Language CIL is a general-purpose description language for a parallel graph rewriting computational model, intended for parallel implementation of declarative languages on multiprocessor systems. In this paper, we first outline a new Hybrid Execution Model (HEM) and the corresponding parallel abstract machine PAM/TGR, based on the extended parallel Graph Rewriting Computational Model EGRCM, for implementing the CIL language on distributed memory multiprocessor systems. Then we focus on compiling the CIL language with various optimizing techniques such as pattern matching, rule indexing, node ordering and compile-time partial scheduling. The experimental results on a 16-node transputer array demonstrate the effectiveness of our model and strategies.

  8. Specialized Silicon Compilers for Language Recognition.

    Science.gov (United States)

    1984-07-01

    the circuits produced by a compiler can be verified by formal methods. Each primitive cell can be checked independently of the others. When all... primitive cell, each non-terminal corresponds to a more complex combination of cells, and each production corresponds to a construction rule. A...terminal symbol is reached during the parse, the corresponding primitive cell is added to the circuit. The following grammar for regular expressions is

  9. 1991 OCRWM bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-05-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management, to provide current information about the national program for managing spent fuel and high-level radioactive waste. The document is a compilation of issues from the 1991 calendar year. A table of contents and an index have been provided to reference information contained in this year's Bulletins.

  10. Application of geologic map information to water quality issues in the southern part of the Chesapeake Bay watershed, Maryland and Virginia, eastern United States

    Science.gov (United States)

    McCartan, L.; Peper, J.D.; Bachman, L.J.; Horton, J.W.

    1999-01-01

    Geologic map units contain much information about the mineralogy, chemistry, and physical attributes of the rocks mapped. This paper presents information from regional-scale geologic maps in Maryland and Virginia, which are in the southern part of the Chesapeake Bay watershed in the eastern United States. The geologic map information is discussed and analyzed in relation to water chemistry data from shallow wells and stream reaches in the area. Two environmental problems in the Chesapeake Bay watershed are used as test examples. The problems, high acidity and high nitrate concentrations in streams and rivers, tend to be mitigated by some rock and sediment types and not by others. Carbonate rocks (limestone, dolomite, and carbonate-cemented rocks) have the greatest capacity to neutralize acidic ground water and surface water in contact with them. Rocks and sediments having high carbon or sulfur contents (such as peat and black shale) potentially contribute the most toward denitrification of ground water and surface water in contact with them. Rocks and sediments that are composed mostly of quartz, feldspar, and light-colored clay (rocks such as granite and sandstone, sediments such as sand and gravel) tend not to alter the chemistry of waters that are in contact with them. The testing of relationships between regionally mapped geologic units and water chemistry is in a preliminary stage, and initial results are encouraging.

  11. Integrating Parallelizing Compilation Technologies for SMP Clusters

    Institute of Scientific and Technical Information of China (English)

    Xiao-Bing Feng; Li Chen; Yi-Ran Wang; Xiao-Mi An; Lin Ma; Chun-Lei Sang; Zhao-Qing Zhang

    2005-01-01

    In this paper, a source-to-source parallelizing compiler system, AutoPar, is presented. The system transforms FORTRAN programs to multi-level hybrid MPI/OpenMP parallel programs. Integrated parallel optimizing technologies are utilized extensively to derive an effective program decomposition in the whole program scope. Other features, such as synchronization optimization and communication optimization, improve the performance scalability of the generated parallel programs at both the intra-node and inter-node levels. The system makes a great effort to boost automation of parallelization. Profiling feedback is used in performance estimation, which is the basis of automatic program decomposition. Performance results for eight benchmarks in NPB1.0 from NAS on an SMP cluster are given, and the speedups are desirable. It is noticeable that in the experiment, at most one data distribution directive and a reduction directive are inserted by the user in BT/SP/LU. The compiler is based on ORC, the Open Research Compiler. ORC is a powerful compiler infrastructure, with such features as robustness, flexibility and efficiency. The strong analysis capability and well-defined infrastructure of ORC made the system implementation quite fast.

  12. Collection Mapping.

    Science.gov (United States)

    Harbour, Denise

    2002-01-01

    Explains collection mapping for library media collections. Discusses purposes for creating collection maps, including helping with selection and weeding decisions, showing how the collection supports the curriculum, and making budget decisions; and methods of data collection, including evaluating a collaboratively taught unit with the classroom…

  13. Mapping urban geology of the city of Girona, Catalonia

    Science.gov (United States)

    Vilà, Miquel; Torrades, Pau; Pi, Roser; Monleon, Ona

    2016-04-01

    A detailed and systematic geological characterization of the urban area of Girona has been conducted under the project '1:5000 scale Urban geological map of Catalonia' of the Catalan Geological Survey (Institut Cartogràfic i Geològic de Catalunya). The results of this characterization are organized into: i) a geological information system that includes all the information acquired; ii) a stratigraphic model focused on identification, characterization and correlation of the geological materials and structures present in the area; and iii) a detailed geological map that represents a synthesis of all the collected information. The mapping project integrates in a GIS environment pre-existing cartographic documentation (geological and topographical), core data from compiled boreholes, descriptions of geological outcrops within the urban network and neighbouring areas, physico-chemical characterisation of representative samples of geological materials, detailed geological mapping of Quaternary sediments, subsurface bedrock and artificial deposits, and 3D modelling of the main geological surfaces. The stratigraphic model is structured in a system of geological units that, from a chronostratigraphic point of view, are grouped into Palaeozoic, Paleogene, Neogene, Quaternary and Anthropocene. The description of the geological units is guided by a systematic procedure. It includes the main lithological and structural features of the units that constitute the geological substratum and represents the conceptual base of the 1:5000 urban geological map of the Girona metropolitan area, which is organized into 6 map sheets. These map sheets are composed of a principal map, geological cross sections, and several complementary maps, charts and tables. 
In addition to the geological map units, the principal map also represents the main artificial deposits, features related to geohistorical processes, contours of outcrop areas, information obtained in stations, borehole data, and contour

  14. Compilation of the lunar geotectonic outline map based on multi-source data: A case study of the LQ-4 area

    Institute of Scientific and Technical Information of China (English)

    陈建平; 王翔; 许延波; 颜丹平; 刘少峰; 郑永春; 闫柏琨; 吴昀昭

    2012-01-01

    Geologic structural elements on the lunar surface mainly include ring structures, linear structures, terrane structures, and large basin structures. The lunar geotectonic outline map comprehensively organizes, tabulates, and analyzes the tectonic state of the lunar surface in terms of material composition, structural elements, and tectonic units. CE-1 CCD 2C image data, LROC wide-angle camera image data, CE-1 IIM 2C interference imaging spectrometer data, Clementine ultraviolet-visible image data, and LOLA laser altimeter data were used to identify the various mineral components, linear structures, ring structures, volcanic structures, and dome structures of the lunar surface; to determine the ages of structural elements and tectonic units and the boundaries of ancient impact craters and large basins; and to interpret the morphology, size, distribution, and density of impact craters on the lunar surface as well as lunar fractures and ring imagery, so as to understand the basic conditions of the lunar surface and finely subdivide its tectonic-geomorphic units. Combining these analyses with international research progress, the basic principles of geotectonic zoning were determined, and the major tectonic events and evolutionary sequence of the lunar surface were established. Based on a comprehensive classification of rocks, regolith, tectonic landforms and structural traces, legend and symbol specifications for geotectonic zoning were drawn up; geotectonic units such as different types of ring-structure imagery, linear-structure imagery, highlands, basins and maria were delineated; the geotectonic zoning map was then compiled; and the structural traces of key regions were studied. The digital structural mapping study of the Sinus Iridum region (LQ-4) drew fully on existing international technical standards and specifications for planetary geological mapping and combined them with domestic standards for digital geological mapping, establishing China's own standards, specifications and workflow for lunar and planetary geological mapping, and providing basic geomorphic and structural information for the eventual completion of lunar geotectonic zoning.

  15. Digital Geologic Map of the Nevada Test Site and Vicinity, Nye, Lincoln, and Clark Counties, Nevada, and Inyo County, California

    Science.gov (United States)

    Slate, Janet L.; Berry, Margaret E.; Rowley, Peter D.; Fridrich, Christopher J.; Morgan, Karen S.; Workman, Jeremiah B.; Young, Owen D.; Dixon, Gary L.; Williams, Van S.; McKee, Edwin H.; Ponce, David A.; Hildenbrand, Thomas G.; Swadley, W.C.; Lundstrom, Scott C.; Ekren, E. Bartlett; Warren, Richard G.; Cole, James C.; Fleck, Robert J.; Lanphere, Marvin A.; Sawyer, David A.; Minor, Scott A.; Grunwald, Daniel J.; Laczniak, Randell J.; Menges, Christopher M.; Yount, James C.; Jayko, Angela S.

    1999-01-01

    This digital geologic map of the Nevada Test Site (NTS) and vicinity, as well as its accompanying digital geophysical maps, is compiled at 1:100,000 scale. The map compilation presents new polygon (geologic map unit contacts), line (fault, fold axis, metamorphic isograd, dike, and caldera wall) and point (structural attitude) vector data for the NTS and vicinity in Nye, Lincoln, and Clark Counties, Nevada, and Inyo County, California. The map area covers two 30 x 60-minute quadrangles (the Pahute Mesa quadrangle to the north and the Beatty quadrangle to the south) plus a strip of 7.5-minute quadrangles on the east side, 72 quadrangles in all. In addition to the NTS, the map area includes the rest of the southwest Nevada volcanic field, part of the Walker Lane, most of the Amargosa Desert, part of the Funeral and Grapevine Mountains, some of Death Valley, and the northern Spring Mountains. This geologic map improves on previous geologic mapping of the same area (Wahl and others, 1997) by providing new and updated Quaternary and bedrock geology, new geophysical interpretations of faults beneath the basins, and improved GIS coverages. Concurrent publications to this one include a new isostatic gravity map (Ponce and others, 1999) and a new aeromagnetic map (Ponce, 1999).

  16. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    Full Text Available We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.
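The derived metrics mentioned in the low-level view can be illustrated with a small sketch; the counter names (`CYCLES`, `INSTRUCTIONS`, etc.) and the sample values below are hypothetical, not output of OpenUH or of any particular performance tool.

```python
# Sketch: deriving efficiency metrics from raw hardware-counter readings.
# Counter names and values are invented for illustration.
def derived_metrics(counters):
    """Compute common derived metrics from a dict of raw counter values."""
    cycles = counters["CYCLES"]
    instr = counters["INSTRUCTIONS"]
    l1_miss = counters["L1D_MISSES"]
    loads = counters["LOADS"]
    return {
        "ipc": instr / cycles,            # instructions retired per cycle
        "l1_miss_rate": l1_miss / loads,  # fraction of loads missing L1
    }

sample = {"CYCLES": 2_000_000, "INSTRUCTIONS": 3_000_000,
          "L1D_MISSES": 40_000, "LOADS": 800_000}
print(derived_metrics(sample))
```

Metrics such as these let the methodology flag inefficient code regions without rerunning the application once per counter of interest.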

  17. Lake Wales Ridge National Wildlife Refuge (Horse Creek and Snell Creek Units) [Land Status Map: Sheet 1 of 19

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This map was produced by the Division of Realty to depict landownership at Lake Wales Ridge National Wildlife Refuge. It was generated from rectified aerial...

  18. Piping and tubing technology: A compilation

    Science.gov (United States)

    1971-01-01

    A compilation on the devices, techniques, and methods used in piping and tubing technology is presented. Data cover the following: (1) a number of fittings, couplings, and connectors that are useful in joining tubing and piping and various systems, (2) a family of devices used where flexibility and/or vibration damping are necessary, (3) a number of devices found useful in the regulation and control of fluid flow, and (4) shop hints to aid in maintenance and repair procedures such as cleaning, flaring, and swaging of tubes.

  19. HAL/S-360 compiler system specification

    Science.gov (United States)

    Johnson, A. E.; Newbold, P. N.; Schulenberg, C. W.; Avakian, A. E.; Varga, S.; Helmers, P. H.; Helmers, C. T., Jr.; Hotz, R. L.

    1974-01-01

    A three phase language compiler is described which produces IBM 360/370 compatible object modules and a set of simulation tables to aid in run time verification. A link edit step augments the standard OS linkage editor. A comprehensive run time system and library provide the HAL/S operating environment, error handling, a pseudo real time executive, and an extensive set of mathematical, conversion, I/O, and diagnostic routines. The specifications of the information flow and content for this system are also considered.

  20. Compiling the First Monolingual Lusoga Dictionary

    Directory of Open Access Journals (Sweden)

    Minah Nabirye

    2011-10-01

    Full Text Available

    Abstract: In this research article a study is made of the approach followed to compile the first-ever monolingual dictionary for Lusoga. Lusoga is a Bantu language spoken in Uganda by slightly over two million people. Being an under-resourced language, the Lusoga orthography had to be designed, a grammar written, and a corpus built, before embarking on the compilation of the dictionary. This compilation was aimed at attaining an academic degree, hence requiring a rigorous research methodology. Firstly, the prevailing methods for compiling dictionaries were mainly practical and insufficient in explaining the theoretical linguistic basis for dictionary compilation. Since dictionaries are based on meaning, the theory of meaning was used to account for all linguistic data considered in dictionaries. However, meaning is considered at a very abstract level, far removed from the process of compiling dictionaries. Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular theory explains how the different modules of a language contribute information to the different parts of the dictionary article or dictionary information in general. Secondly, the research also had to contend with the different approaches for analysing Bantu languages for Bantu and European audiences. A description of the Bantu- and European-centred approaches to Bantu studies was undertaken in respect of (a) the classification of Lusoga words, and (b) the specification of their citations. As a result, Lusoga lexicography deviates from the prevailing Bantu classification and citation of nouns, adjectives and verbs in particular. The dictionary was tested on two separate occasions and all the feedback was considered in the compilation process. This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary

  1. Compiling ER Specifications into Declarative Programs

    CERN Document Server

    Braßel, Bernd; Muller, Marion

    2007-01-01

    This paper proposes an environment to support high-level database programming in a declarative programming language. In order to ensure safe database updates, all access and update operations related to the database are generated from high-level descriptions in the entity-relationship (ER) model. We propose a representation of ER diagrams in the declarative language Curry so that they can be constructed by various tools and then translated into this representation. Furthermore, we have implemented a compiler from this representation into a Curry program that provides access and update operations based on a high-level API for database programming.
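The idea of generating access and update operations from a high-level model can be sketched as follows. This is an illustrative Python stand-in, not the Curry implementation described in the paper; the entity and attribute names are invented.

```python
# Sketch: generating access/update operations from an entity description.
# Entity name, attributes, and the dict-backed "database" are illustrative.
def gen_entity_api(entity, attrs):
    """Generate source code for simple create/read operations of an entity."""
    params = ", ".join(attrs)
    return (
        f"def new_{entity}({params}, _db):\n"
        f"    _db.setdefault('{entity}', []).append(({params}))\n"
        f"\n"
        f"def all_{entity}(_db):\n"
        f"    return _db.get('{entity}', [])\n"
    )

code = gen_entity_api("student", ["name", "email"])
ns = {}
exec(code, ns)        # compile the generated operations into a namespace
db = {}
ns["new_student"]("Ada", "ada@example.org", db)
print(ns["all_student"](db))  # [('Ada', 'ada@example.org')]
```

Because every operation is generated from the model rather than written by hand, updates cannot touch entities or attributes the model does not declare, which is the safety property the paper aims for.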

  2. A Conformance Test Suite for Arden Syntax Compilers and Interpreters.

    Science.gov (United States)

    Wolf, Klaus-Hendrik; Klimek, Mike

    2016-01-01

    The Arden Syntax for Medical Logic Modules is a standardized and well-established programming language for representing medical knowledge. No public test suite exists for testing the compliance level of existing compilers and interpreters. This paper presents the research undertaken to transform the specification into a set of unit tests, represented in JUnit. It further reports on the use of the test suite to test four different Arden Syntax processors. The presented and compared results reveal the conformance status of the tested processors. Two examples illustrate how test-driven development of Arden Syntax processors can help increase compliance with the standard. Finally, some considerations on how an open source test suite can improve the development and distribution of the Arden Syntax are presented.
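How a specification statement becomes a unit test can be sketched as follows; the `evaluate` function below is a toy stand-in for an Arden Syntax processor, and the test cases are invented for illustration, not drawn from the actual suite.

```python
# Sketch: turning specification statements into unit tests, analogous to the
# JUnit suite described above. `evaluate` is a toy substitute for a processor.
import unittest

def evaluate(expression):
    # A real conformance suite would invoke the compiler or interpreter
    # under test here instead of Python's own evaluator.
    return eval(expression, {"__builtins__": {}})

class ArithmeticConformance(unittest.TestCase):
    """Each test method encodes one requirement from the specification."""
    def test_addition(self):
        self.assertEqual(evaluate("1 + 2"), 3)
    def test_precedence(self):
        self.assertEqual(evaluate("1 + 2 * 3"), 7)

result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(ArithmeticConformance))
print(result.wasSuccessful())
```

Running the same suite against several processors then yields a directly comparable conformance report, which is the comparison the paper performs.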

  3. Construction experiences from underground works at Oskarshamn. Compilation report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders (Vattenfall Power Consultant AB, Stockholm (SE)); Christiansson, Rolf (Swedish Nuclear Fuel and Waste Management Co., Stockholm (SE))

    2007-12-15

    The main objective with this report is to compile experiences from the underground works carried out at Oskarshamn, primarily construction experiences from the tunnelling of the cooling water tunnels of the Oskarshamn nuclear power units 1, 2 and 3, from the underground excavations of Clab 1 and 2 (Central Interim Storage Facility for Spent Nuclear Fuel), and the Aespoe Hard Rock Laboratory. In addition, an account is given of the operational experience of Clab 1 and 2 and of the Aespoe HRL, primarily regarding scaling and rock support solutions. This report, being a compilation report, is in substance based on earlier published material as presented in the list of references. Approximately 8,000 m of tunnels, including three major rock caverns with a total volume of about 550,000 m3, have been excavated. The excavation works of the various tunnels and rock caverns were carried out during the period 1966-2000. In addition, minor excavation works were carried out at the Aespoe HRL in 2003. The depth of the underground structures varies from near surface down to 450 m. As an overall conclusion it may be said that the rock mass conditions in the area are well suited for underground construction. This conclusion is supported by the experiences from the rock excavation works in the Simpevarp and Aespoe area. These works have shown that no major problems occurred during the excavation works; nor have any stability or other rock engineering problems of significance been identified after the commissioning of the Oskarshamn nuclear power units O1, O2 and O3, BFA, Clab 1 and 2, and the Aespoe Hard Rock Laboratory. The underground structures of these facilities were built according to plan and have since been operated as planned. Thus, the quality of the rock mass within the construction area is such that it lends itself to excavation of large rock caverns with a minimum of rock support

  4. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    Full Text Available There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with equational programming language (EPL. Our approach is based on a program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by the source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by the compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  5. TUNE: Compiler-Directed Automatic Performance Tuning

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Mary [University of Utah

    2014-09-18

    This project has developed compiler-directed performance tuning technology targeting the Cray XT4 Jaguar system at Oak Ridge, which has multi-core Opteron nodes with SSE-3 SIMD extensions, and the Cray XE6 Hopper system at NERSC. To achieve this goal, we combined compiler technology for model-guided empirical optimization for memory hierarchies with SIMD code generation, which have been developed by the PIs over the past several years. We examined DOE Office of Science applications to identify performance bottlenecks and apply our system to computational kernels that operate on dense arrays. Our goal for this performance-tuning technology has been to yield hand-tuned levels of performance on DOE Office of Science computational kernels, while allowing application programmers to specify their computations at a high level without requiring manual optimization. Overall, we aim to make our technology for SIMD code generation and memory hierarchy optimization a crucial component of high-productivity Petaflops computing through a close collaboration with the scientists in national laboratories.
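One of the memory-hierarchy transformations that such model-guided frameworks apply is loop tiling, sketched below. This is an illustrative example of the technique only, not code generated by the project.

```python
# Sketch: loop tiling, a classic memory-hierarchy optimization. The matrix
# traversal is restructured so each tile's working set stays cache-resident
# (the cache benefit matters in compiled code; Python only shows the shape).
def transpose_tiled(a, n, tile=4):
    """Transpose an n x n matrix (list of lists), visiting it tile by tile."""
    out = [[0] * n for _ in range(n)]
    for ii in range(0, n, tile):          # iterate over tile origins
        for jj in range(0, n, tile):
            for i in range(ii, min(ii + tile, n)):
                for j in range(jj, min(jj + tile, n)):
                    out[j][i] = a[i][j]
    return out

a = [[i * 8 + j for j in range(8)] for i in range(8)]
t = transpose_tiled(a, 8)
print(t[0])  # first column of a: [0, 8, 16, 24, 32, 40, 48, 56]
```

A model-guided optimizer searches over parameters such as `tile` empirically, which is the "model-guided empirical optimization" the project combines with SIMD code generation.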

  6. Compilation of functional languages using flow graph analysis

    NARCIS (Netherlands)

    Hartel, Pieter H.; Glaser, Hugh; Wild, John M.

    1994-01-01

    A system based on the notion of a flow graph is used to specify formally and to implement a compiler for a lazy functional language. The compiler takes a simple functional language as input and generates C. The generated C program can then be compiled, and loaded with an extensive run-time system to

  8. Southwest Indian Ocean Bathymetric Compilation (swIOBC)

    Science.gov (United States)

    Jensen, L.; Dorschel, B.; Arndt, J. E.; Jokat, W.

    2014-12-01

    As a result of long-term scientific activities in the southwest Indian Ocean, an extensive amount of swath bathymetric data has accumulated in the AWI database. Using this data as a backbone, supplemented by additional bathymetric data sets and predicted bathymetry, we generate a comprehensive regional bathymetric data compilation for the southwest Indian Ocean. A high resolution bathymetric chart of this region will support geological and climate research: Identification of current-induced seabed structures will help modelling oceanic currents and, thus, provide proxy information about the paleo-climate. Analysis of the sediment distribution will contribute to reconstruct the erosional history of Eastern Africa. The aim of swIOBC is to produce a homogeneous and seamless bathymetric grid with an associated meta-database and a corresponding map for the area from 5° to 39° S and 20° to 44° E. Currently, multibeam data with a track length of approximately 86,000 km are held in-house. In combination with external echosounding data this allows for the generation of a regional grid, significantly improving the existing, mostly satellite altimetry derived, bathymetric models. The collected data sets are heterogeneous in terms of age, acquisition system, background data, resolution, accuracy, and documentation. As a consequence, the production of a bathymetric grid requires special techniques and algorithms, which were already developed for the IBCAO (Jakobsson et al., 2012) and further refined for the IBCSO (Arndt et al., 2013). The new regional southwest Indian Ocean chart will be created based on these methods. Arndt, J.E., et al., 2013. The International Bathymetric Chart of the Southern Ocean (IBCSO) Version 1.0—A new bathymetric compilation covering circum-Antarctic waters. GRL 40, 1-7, doi: 10.1002/grl.50413, 2013. Jakobsson, M., et al., 2012. The International Bathymetric Chart of the Arctic Ocean (IBCAO) Version 3.0. GRL 39, L12609, doi: 10.1029/2012GL052219.
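The core step of combining heterogeneous soundings into a regular grid can be sketched as simple cell averaging. The real IBCAO/IBCSO-style workflow is far more elaborate (source weighting, blending with predicted bathymetry, edit loops), and the coordinates and depths below are invented.

```python
# Minimal sketch: average depth soundings into regular grid cells.
# Not the IBCAO/IBCSO algorithm; sample soundings are invented.
import numpy as np

def grid_soundings(lon, lat, depth, lon_edges, lat_edges):
    """Average soundings into grid cells; NaN where a cell has no data."""
    nx, ny = len(lon_edges) - 1, len(lat_edges) - 1
    total = np.zeros((ny, nx))
    count = np.zeros((ny, nx))
    ix = np.clip(np.digitize(lon, lon_edges) - 1, 0, nx - 1)
    iy = np.clip(np.digitize(lat, lat_edges) - 1, 0, ny - 1)
    np.add.at(total, (iy, ix), depth)   # accumulate per-cell sums
    np.add.at(count, (iy, ix), 1)
    return np.where(count > 0, total / np.maximum(count, 1), np.nan)

lon = np.array([20.1, 20.2, 30.5])
lat = np.array([-30.1, -30.2, -10.5])
depth = np.array([-4000.0, -4200.0, -2500.0])
grid = grid_soundings(lon, lat, depth,
                      np.linspace(20, 44, 4), np.linspace(-39, -5, 4))
print(grid)
```

Empty cells (NaN here) are where compilations fall back to predicted bathymetry from satellite altimetry, as the abstract describes.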

  9. Geologic map of the Chewelah 30' x 60' Quadrangle, Washington and Idaho

    Science.gov (United States)

    Miller, F.K.

    2001-01-01

    This data set maps and describes the geology of the Chewelah 30' X 60' quadrangle, Washington and Idaho. Created using Environmental Systems Research Institute's ARC/INFO software, the data base consists of the following items: (1) a map coverage containing geologic contacts and units, (2) a point coverage containing site-specific geologic structural data, (3) two coverages derived from 1:100,000 Digital Line Graphs (DLG); one of which represents topographic data, and the other, cultural data, (4) two line coverages that contain cross-section lines and unit-label leaders, respectively, and (5) attribute tables for geologic units (polygons), contacts (arcs), and site-specific data (points). In addition, the data set includes the following graphic and text products: (1) A PostScript graphic plot-file containing the geologic map, topography, cultural data, and two cross sections, and on a separate sheet, a Correlation of Map Units (CMU) diagram, an abbreviated Description of Map Units (DMU), modal diagrams for granitic rocks, an index map, a regional geologic and structure map, and a key for point and line symbols; (2) PDF files of the Readme text-file and expanded Description of Map Units (DMU), and (3) this metadata file. The geologic map database contains original U.S. Geological Survey data generated by detailed field observation and by interpretation of aerial photographs. The map was compiled from geologic maps of eight 1:48,000 15' quadrangle blocks, each of which was made by mosaicing and reducing the four constituent 7.5' quadrangles. These 15' quadrangle blocks were mapped chiefly at 1:24,000 scale, but the detail of the mapping was governed by the intention that it was to be compiled at 1:48,000 scale. The compilation at 1:100,000 scale entailed necessary simplification in some areas and combining of some geologic units. 
Overall, however, despite a greater than two times reduction in scale, most geologic detail found on the 1:48,000 maps is retained on the

  10. Digital geologic map of the Thirsty Canyon NW quadrangle, Nye County, Nevada

    Science.gov (United States)

    Minor, S.A.; Orkild, P.P.; Sargent, K.A.; Warren, R.G.; Sawyer, D.A.; Workman, J.B.

    1998-01-01

    This digital geologic map compilation presents new polygon (i.e., geologic map unit contacts), line (i.e., fault, fold axis, dike, and caldera wall), and point (i.e., structural attitude) vector data for the Thirsty Canyon NW 7 1/2' quadrangle in southern Nevada. The map database, which is at 1:24,000-scale resolution, provides geologic coverage of an area of current hydrogeologic and tectonic interest. The Thirsty Canyon NW quadrangle is located in southern Nye County about 20 km west of the Nevada Test Site (NTS) and 30 km north of the town of Beatty. The map area is underlain by extensive layers of Neogene (about 14 to 4.5 million years old [Ma]) mafic and silicic volcanic rocks that are temporally and spatially associated with transtensional tectonic deformation. Mapped volcanic features include part of a late Miocene (about 9.2 Ma) collapse caldera, a Pliocene (about 4.5 Ma) shield volcano, and two Pleistocene (about 0.3 Ma) cinder cones. Also documented are numerous normal, oblique-slip, and strike-slip faults that reflect regional transtensional deformation along the southern part of the Walker Lane belt. The Thirsty Canyon NW map provides new geologic information for modeling groundwater flow paths that may enter the map area from underground nuclear testing areas located in the NTS about 25 km to the east. The geologic map database comprises six component ArcINFO map coverages that can be accessed after decompressing and unbundling the data archive file (tcnw.tar.gz). These six coverages (tcnwpoly, tcnwflt, tcnwfold, tcnwdike, tcnwcald, and tcnwatt) are formatted here in ArcINFO EXPORT format. Bundled with this database are two PDF files for readily viewing and printing the map, accessory graphics, and a description of map units and compilation methods.
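Decompressing and unbundling a gzipped tar archive such as the tcnw.tar.gz mentioned above can be done with the Python standard library alone. In this sketch the archive is created on the fly so the example is self-contained; its contents are placeholders, not the actual coverages.

```python
# Sketch: bundling and unbundling a .tar.gz archive with the stdlib.
# The member file is a placeholder standing in for an ArcINFO EXPORT coverage.
import pathlib
import tarfile
import tempfile

work = pathlib.Path(tempfile.mkdtemp())
member = work / "tcnwpoly.e00"
member.write_text("placeholder for an ArcINFO EXPORT coverage\n")

archive = work / "tcnw.tar.gz"
with tarfile.open(archive, "w:gz") as tar:      # bundle + compress
    tar.add(member, arcname="tcnwpoly.e00")

out = work / "extracted"
with tarfile.open(archive, "r:gz") as tar:      # decompress + unbundle
    tar.extractall(out)

print(sorted(p.name for p in out.iterdir()))
```

After extraction, the individual `.e00` coverages can be imported into a GIS as the abstract describes.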

  11. Advanced compilation techniques in the PARADIGM compiler for distributed-memory multicomputers

    Science.gov (United States)

    Su, Ernesto; Lain, Antonio; Ramaswamy, Shankar; Palermo, Daniel J.; Hodges, Eugene W., IV; Banerjee, Prithviraj

    1995-01-01

    The PARADIGM compiler project provides an automated means to parallelize programs, written in a serial programming model, for efficient execution on distributed-memory multicomputers. .A previous implementation of the compiler based on the PTD representation allowed symbolic array sizes, affine loop bounds and array subscripts, and variable number of processors, provided that arrays were single or multi-dimensionally block distributed. The techniques presented here extend the compiler to also accept multidimensional cyclic and block-cyclic distributions within a uniform symbolic framework. These extensions demand more sophisticated symbolic manipulation capabilities. A novel aspect of our approach is to meet this demand by interfacing PARADIGM with a powerful off-the-shelf symbolic package, Mathematica. This paper describes some of the Mathematica routines that performs various transformations, shows how they are invoked and used by the compiler to overcome the new challenges, and presents experimental results for code involving cyclic and block-cyclic arrays as evidence of the feasibility of the approach.
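The block-cyclic distributions the extended compiler accepts follow the standard global-to-local index mapping (as used, for example, in ScaLAPACK). The sketch below illustrates that mapping; it is not PARADIGM code.

```python
# Sketch: the standard block-cyclic global-to-local index mapping.
def block_cyclic(i, block, nprocs):
    """Map global index i to (owning processor, local index)."""
    proc = (i // block) % nprocs                     # which processor owns i
    local = (i // (block * nprocs)) * block + i % block  # position on that proc
    return proc, local

# Distribute 12 elements with block size 2 over 3 processors:
layout = [block_cyclic(i, block=2, nprocs=3) for i in range(12)]
print(layout)
```

Symbolic block sizes and processor counts in subscripts like these are exactly what the compiler hands off to Mathematica to simplify.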

  12. An Alternative Approach to Mapping Thermophysical Units from Martian Thermal Inertia and Albedo Data Using a Combination of Unsupervised Classification Techniques

    Directory of Open Access Journals (Sweden)

    Eriita Jones

    2014-06-01

    Full Text Available Thermal inertia and albedo provide information on the distribution of surface materials on Mars. These parameters have been mapped globally on Mars by the Thermal Emission Spectrometer (TES) onboard the Mars Global Surveyor. Two-dimensional clusters of thermal inertia and albedo reflect the thermophysical attributes of the dominant materials on the surface. In this paper three automated, non-deterministic, algorithmic classification methods are employed for defining thermophysical units: Expectation Maximisation of a Gaussian Mixture Model; Iterative Self-Organizing Data Analysis Technique (ISODATA); and Maximum Likelihood. We analyse the behaviour of the thermophysical classes resulting from the three classifiers, operating on the 2007 TES thermal inertia and albedo datasets. Producing a rigorous mapping of thermophysical classes at ~3 km/pixel resolution remains important for constraining the geologic processes that have shaped the Martian surface on a regional scale, and for choosing appropriate landing sites. The results from applying these algorithms are compared to geologic maps, surface data from lander missions, features derived from imaging, and previous classifications of thermophysical units which utilized manual (and potentially more time consuming) classification methods. These comparisons comprise data suitable for validation of our classifications. Our work shows that a combination of the algorithms—ISODATA and Maximum Likelihood—optimises the sensitivity to the underlying dataspace, and that new information on Martian surface materials can be obtained by using these methods. We demonstrate that the algorithms used here can be applied to define a finer partitioning of albedo and thermal inertia for a more detailed mapping of surface materials, grain sizes and thermal behaviour of the Martian surface and shallow subsurface, at the ~3 km scale.
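The flavor of such unsupervised classification can be illustrated with a minimal k-means clustering of synthetic (thermal inertia, albedo) points; k-means is the core assignment/update step that ISODATA extends with cluster splitting and merging. This sketch is not the TES processing pipeline, and all values are invented.

```python
# Sketch: k-means clustering of synthetic (thermal inertia, albedo) points
# into thermophysical classes. Illustrative only; values are invented.
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Cluster points of shape (n, 2) into k classes; return labels, centers."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center, then recompute centers
        labels = np.argmin(
            ((points[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([points[labels == j].mean(0) for j in range(k)])
    return labels, centers

# Two synthetic units: low-inertia/bright dust vs. high-inertia/dark rock.
dust = np.random.default_rng(1).normal([100.0, 0.27], [15, 0.02], (50, 2))
rock = np.random.default_rng(2).normal([450.0, 0.12], [40, 0.02], (50, 2))
points = np.vstack([dust, rock])
labels, centers = kmeans(points, k=2)
print(np.round(centers, 2))
```

The recovered cluster centers fall near the two synthetic unit means, which is the sense in which clusters in inertia/albedo space correspond to thermophysical units.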

  13. Parallelizing More Loops with Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    2012-01-01

    The performance of many parallel applications relies not on instruction-level parallelism but on loop-level parallelism. Unfortunately, automatic parallelization of loops is a fragile process; many different obstacles affect or prevent it in practice. To address this predicament we developed an interactive compilation feedback system that guides programmers in iteratively modifying their application source code. This helps leverage the compiler’s ability to generate loop-parallel code. We employ our system to modify two sequential benchmarks dealing with image processing and edge detection, resulting in scalable parallelized code that runs up to 8.3 times faster on an eight-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should be combined...
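The kind of loop-level parallelism involved can be sketched with a loop whose iterations are independent and can therefore be mapped in parallel. This is an illustrative example, not code from the benchmarks in the paper.

```python
# Sketch: a loop with no cross-iteration dependences is loop-parallel and can
# be rewritten as a parallel map. Illustrative only.
from concurrent.futures import ThreadPoolExecutor

def row_gradient(row):
    """Per-row work: absolute differences between neighbouring pixels."""
    return [abs(a - b) for a, b in zip(row, row[1:])]

def detect_edges_serial(rows):
    out = []
    for row in rows:              # each iteration touches only its own row
        out.append(row_gradient(row))
    return out

def detect_edges_parallel(rows, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(row_gradient, rows))  # order-preserving parallel map

image = [[0, 0, 9, 9], [9, 9, 0, 0]]
print(detect_edges_parallel(image))  # [[0, 9, 0], [0, 9, 0]]
```

When iterations are not independent, refactoring to remove the cross-iteration dependence is exactly the kind of source change the paper's feedback system guides the programmer toward.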

  14. A survey of compiler optimization techniques

    Science.gov (United States)

    Schneck, P. B.

    1972-01-01

    Major optimization techniques of compilers are described and grouped into three categories: machine dependent, architecture dependent, and architecture independent. Machine-dependent optimizations tend to be local and are performed upon short spans of generated code by using particular properties of an instruction set to reduce the time or space required by a program. Architecture-dependent optimizations are global and are performed while generating code. These optimizations consider the structure of a computer, but not its detailed instruction set. Architecture-independent optimizations are also global but are based on analysis of the program flow graph and the dependencies among statements of the source program. A conceptual review of a universal optimizer that performs architecture-independent optimizations at source-code level is also presented.
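A small architecture-independent, source-level optimization of the kind surveyed is constant folding; the sketch below implements it over Python's `ast` module purely for illustration.

```python
# Sketch: constant folding, an architecture-independent optimization applied
# at source level. Uses Python's ast module for illustration.
import ast
import operator

OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

class ConstantFolder(ast.NodeTransformer):
    """Replace binary operations on literal constants with their value."""
    def visit_BinOp(self, node):
        self.generic_visit(node)          # fold subtrees first (bottom-up)
        fold = OPS.get(type(node.op))
        if (fold and isinstance(node.left, ast.Constant)
                and isinstance(node.right, ast.Constant)):
            try:
                value = fold(node.left.value, node.right.value)
            except ZeroDivisionError:
                return node               # leave run-time errors to run time
            return ast.copy_location(ast.Constant(value), node)
        return node

tree = ast.parse("x = 2 * 3 + 4")
folded = ast.fix_missing_locations(ConstantFolder().visit(tree))
print(ast.unparse(folded))  # x = 10
```

Because the transformation operates on the program's syntax tree rather than on generated code, it needs no knowledge of any instruction set, which is what places it in the architecture-independent category.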

  15. Compilation of Existing Neutron Screen Technology

    Directory of Open Access Journals (Sweden)

    N. Chrysanthopoulou

    2014-01-01

    Full Text Available The presence of fast neutron spectra in new reactors is expected to induce a strong impact on the contained materials, including structural materials, nuclear fuels, neutron reflecting materials, and tritium breeding materials. Therefore, introduction of these reactors into operation will require extensive testing of their components, which must be performed under neutronic conditions representative of those expected to prevail inside the reactor cores when in operation. Due to limited availability of fast reactors, testing of future reactor materials will mostly take place in water cooled material test reactors (MTRs by tailoring the neutron spectrum via neutron screens. The latter rely on the utilization of materials capable of absorbing neutrons at specific energy. A large but fragmented experience is available on that topic. In this work a comprehensive compilation of the existing neutron screen technology is attempted, focusing on neutron screens developed in order to locally enhance the fast over thermal neutron flux ratio in a reactor core.

  16. Programming cells: towards an automated 'Genetic Compiler'.

    Science.gov (United States)

    Clancy, Kevin; Voigt, Christopher A

    2010-08-01

    One of the visions of synthetic biology is to be able to program cells using a language that is similar to that used to program computers or robotics. For large genetic programs, keeping track of the DNA on the level of nucleotides becomes tedious and error prone, requiring a new generation of computer-aided design (CAD) software. To push the size of projects, it is important to abstract the designer from the process of part selection and optimization. The vision is to specify genetic programs in a higher-level language, which a genetic compiler could automatically convert into a DNA sequence. Steps towards this goal include: defining the semantics of the higher-level language, algorithms to select and assemble parts, and biophysical methods to link DNA sequence to function. These will be coupled to graphic design interfaces and simulation packages to aid in the prediction of program dynamics, optimize genes, and scan projects for errors.

  17. abc: An extensible AspectJ compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    Research in the design of aspect-oriented programming languages requires a workbench that facilitates easy experimentation with new language features and implementation techniques. In particular, new features for AspectJ have been proposed that require extensions in many dimensions: syntax, type checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its frontend is built, using the Polyglot framework, as a modular extension of the Java language. The use of Polyglot gives flexibility of syntax and type checking. The backend is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general overview...

  18. Summary of significant floods in the United States, Puerto Rico, and the Virgin Islands, 1970 through 1989

    Science.gov (United States)

    Perry, Charles A.; Aldridge, Byron N.; Ross, Heather C.

    2001-01-01

    This volume is a compilation of significant floods that occurred throughout the United States, Puerto Rico, and the Virgin Islands during 1970 through 1989. A summary of the most devastating floods according to amount of damage and lives lost is provided for each year. State-by-state compilations include a description of the general hydroclimatology and conditions that produce significant floods, a description of climate and basin characteristics that significantly affect maximum flows, tables of data that compare each significant flood during 1970 through 1989 with the maximum flood for the entire period of record at selected streamflow-gaging stations, and maps showing the location of the streamflow-gaging stations.

  19. A Service-Learning Project for Geography: Designing a Painted Playground Map of the United States for Elementary Schools

    Science.gov (United States)

    Petzold, Donald; Heppen, John

    2005-01-01

    Many student geography organizations or clubs associated with colleges and universities undertake community service projects each year to meet local needs and to gain recognition within the community. A uniquely geographical project of playground map painting provides a great community service and goes one step further by incorporating elements of…

  20. Continental Margin Mapping Program (CONMAP) sediments grainsize distribution for the United States East Coast Continental Margin (CONMAPSG)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Sediments off the eastern United States vary markedly in texture - the size, shape, and arrangement of their grains. However, for descriptive purposes, it is...

  2. Geologic map of the Caetano caldera, Lander and Eureka counties, Nevada

    Science.gov (United States)

    Colgan, Joseph P.; Henry, Christopher D.; John, David A.

    2011-01-01

    The Eocene (34 Ma) Caetano caldera in north-central Nevada offers an exceptional opportunity to study the physical and petrogenetic evolution of a large (20 km by 10–18 km pre-extensional dimensions) silicic magma chamber, from precursor magmatism to caldera collapse and intrusion of resurgent plutons. Caldera-related rocks shown on this map include two units of crystal-rich intracaldera tuff totaling over 4 km thickness, caldera collapse breccias, tuff dikes that fed the eruption, hydrothermally altered post-eruption rocks, and two generations of resurgent granitic intrusions (John et al., 2008). The map also depicts middle Miocene (about 16–12 Ma) normal faults and synextensional basins that accommodated >100 percent extension and tilted the caldera into a series of ~40° east-dipping blocks, producing exceptional 3-D exposures of the caldera interior (Colgan et al., 2008). This 1:75,000-scale map is a compilation of published maps and extensive new mapping by the authors (fig. 1), and supersedes a preliminary 1:100,000-scale map published by Colgan et al. (2008) and John et al. (2008). New mapping focused on the margins of the Caetano caldera, the distribution and lithology of rocks within the caldera, and on the Miocene normal faults and sedimentary basins that record Neogene extensional faulting. The definition of geologic units and their distribution within the caldera is based entirely on new mapping, except in the northern Toiyabe Range, where mapping by Gilluly and Gates (1965) was modified with new field observations. The distribution of pre-Cenozoic rocks outside the caldera was largely compiled from existing sources with minor modifications, with the exception of the northeastern caldera margin (west of the Cortez Hills Mine), which was remapped in the course of this work and published as a stand-alone 1:6000-scale map (Moore and Henry, 2010).

  3. Hierarchical object-based mapping of riverscape units and in-stream mesohabitats using LiDAR and VHR imagery

    OpenAIRE

    Luca Demarchi; Simone Bizzi; Hervé Piégay

    2015-01-01

    In this paper, we present a new, semi-automated methodology for mapping hydromorphological indicators of rivers at a regional scale using multisource remote sensing (RS) data. This novel approach is based on the integration of spectral and topographic information within a multilevel, geographic, object-based image analysis (GEOBIA). Different segmentation levels were generated based on the two sources of Remote Sensing (RS) data, namely very-high spatial resolution, near-infrared imagery (VHR...

  4. Quaternary geologic map of the Wolf Point 1° × 2° quadrangle, Montana and North Dakota

    Science.gov (United States)

    Fullerton, David S.; Colton, Roger B.; Bush, Charles A.

    2016-09-08

    The Wolf Point quadrangle encompasses approximately 16,084 km2 (6,210 mi2). The northern boundary is the Montana/Saskatchewan (U.S.-Canada) boundary. The quadrangle is in the Northern Plains physiographic province and it includes the Peerless Plateau and Flaxville Plain. The primary river is the Missouri River. The map units are surficial deposits and materials, not landforms. Deposits that comprise some constructional landforms (for example, ground-moraine deposits, end-moraine deposits, and stagnation-moraine deposits, all composed of till) are distinguished for purposes of reconstruction of glacial history. Surficial deposits and materials are assigned to 23 map units on the basis of genesis, age, lithology or composition, texture or particle size, and other physical, chemical, and engineering characteristics. It is not a map of soils that are recognized in pedology or agronomy. Rather, it is a generalized map of soils recognized in engineering geology, or of substrata or parent materials in which pedologic or agronomic soils are formed. Glaciotectonic (ice-thrust) structures and deposits are mapped separately, represented by a symbol. The surficial deposits are glacial, ice-contact, glaciofluvial, alluvial, lacustrine, eolian, colluvial, and mass-movement deposits. Till of late Wisconsin age is represented by three map units. Till of Illinoian age also is mapped. Till deposited during pre-Illinoian glaciations is not mapped, but is widespread in the subsurface. Linear ice-molded landforms (primarily drumlins), shown by symbol, indicate directions of ice flow during late Wisconsin and Illinoian glaciations. The Quaternary geologic map of the Wolf Point quadrangle, northeastern Montana and North Dakota, was prepared to provide a database for compilation of a Quaternary geologic map of the Regina 4° × 6° quadrangle, United States and Canada, at scale 1:1,000,000, for the U.S. Geological Survey Quaternary Geologic Atlas of the United States map series.

  5. Parvovirus B19 promoter at map unit 6 confers autonomous replication competence and erythroid specificity to adeno-associated virus 2 in primary human hematopoietic progenitor cells.

    Science.gov (United States)

    Wang, X S; Yoder, M C; Zhou, S Z; Srivastava, A

    1995-01-01

    The pathogenic human parvovirus B19 is an autonomously replicating virus with a remarkable tropism for human erythroid progenitor cells. Although the target cell specificity for B19 infection has been suggested to be mediated by the erythrocyte P-antigen receptor (globoside), a number of nonerythroid cells that express this receptor are nonpermissive for B19 replication. To directly test the role of expression from the B19 promoter at map unit 6 (B19p6) in the erythroid cell specificity of B19, we constructed a recombinant adeno-associated virus 2 (AAV), in which the authentic AAV promoter at map unit 5 (AAVp5) was replaced by the B19p6 promoter. Although the wild-type (wt) AAV requires a helper virus for its optimal replication, we hypothesized that inserting the B19p6 promoter in a recombinant AAV would permit autonomous viral replication, but only in erythroid progenitor cells. In this report, we provide evidence that the B19p6 promoter is necessary and sufficient to impart autonomous replication competence and erythroid specificity to AAV in primary human hematopoietic progenitor cells. Thus, expression from the B19p6 promoter plays an important role in post-P-antigen receptor erythroid-cell specificity of parvovirus B19. The AAV-B19 hybrid vector system may also prove to be useful in potential gene therapy of human hemoglobinopathies. Images: Figs. 2, 3, and 4. PMID: 8618912.

  6. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    Science.gov (United States)

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

    Spatial information on physical soil properties is in high demand, in order to support environment-related and land use management decisions. One of the most widely used properties to characterize soils physically is particle size distribution (PSD), which determines soil water management and cultivability. According to their size, different particles can be categorized as clay, silt, or sand. The size intervals are defined by national or international textural classification systems. The relative percentage of sand, silt, and clay in the soil constitutes textural classes, which are also specified differently in various national and/or specialty systems. The most commonly used is the classification system of the United States Department of Agriculture (USDA). Soil texture information is essential input data in meteorological, hydrological and agricultural prediction modelling. Although Hungary has a wealth of legacy soil maps and other relevant soil information, it often occurs that maps do not exist on a certain characteristic with the required thematic and/or spatial representation. The recent developments in digital soil mapping (DSM), however, provide wide opportunities for the elaboration of object specific soil maps (OSSM) with predefined parameters (resolution, accuracy, reliability etc.). Due to the simultaneous richness of available Hungarian legacy soil data, spatial inference methods and auxiliary environmental information, there is a high versatility of possible approaches for the compilation of a given soil map. This suggests the opportunity of optimization. For the creation of an OSSM one might intend to identify the optimum set of soil data, method and auxiliary co-variables optimized for the resources (data costs, computation requirements etc.). We started comprehensive analysis of the effects of the various DSM components on the accuracy of the output maps on pilot areas. The aim of this study is to compare and evaluate different…
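
    As a concrete illustration of how textural classes derive from PSD, here is a simplified sketch of a USDA-style classifier. It covers only four of the twelve USDA classes; the thresholds follow the USDA texture triangle, but the full rule set is omitted and everything outside these four regions falls through to "other".

```python
def usda_texture(sand, silt, clay):
    """Classify a soil sample into a few of the twelve USDA textural
    classes from its sand/silt/clay percentages (a simplified sketch;
    the full USDA triangle defines twelve classes)."""
    assert abs(sand + silt + clay - 100.0) < 1e-6, "fractions must sum to 100"
    if silt + 1.5 * clay < 15:
        return "sand"
    if clay >= 40 and sand <= 45 and silt < 40:
        return "clay"
    if silt >= 80 and clay < 12:
        return "silt"
    if 7 <= clay < 27 and 28 <= silt < 50 and sand <= 52:
        return "loam"
    return "other"  # remaining classes omitted in this sketch

texture = usda_texture(40, 40, 20)   # a loam-textured sample
```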

  7. Continuation-Passing C, compiling threads to events through continuations

    CERN Document Server

    Kerneis, Gabriel

    2010-01-01

    In this paper, we introduce Continuation Passing C (CPC), a programming language for concurrent systems in which native and cooperative threads are unified and presented to the programmer as a single abstraction. The CPC compiler uses a compilation technique, based on the CPS transform, that yields efficient code and an extremely lightweight representation for contexts. We provide a complete proof of the correctness of our compilation scheme. We show in particular that lambda-lifting, a common compilation technique for functional languages, is also correct in an imperative language like C, under some conditions enforced by the CPC compiler. The current CPC compiler is mature enough to write substantial programs such as Hekate, a highly concurrent BitTorrent seeder. Our benchmark results show that CPC is as efficient as the most efficient thread libraries available, while being significantly cheaper.
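
    The CPS transform at the heart of this compilation technique can be illustrated on a toy function. This is a sketch of the transform's idea only, in Python rather than C, and not CPC's actual implementation:

```python
# Direct style:
def fact(n):
    return 1 if n == 0 else n * fact(n - 1)

# The same function after a CPS transform: every call takes an explicit
# continuation `k` describing "what to do with the result", so the control
# flow of the program is reified as chained closures -- the kind of
# representation that lets a compiler turn threads into event handlers.
def fact_cps(n, k):
    if n == 0:
        return k(1)
    return fact_cps(n - 1, lambda r: k(n * r))

result = fact_cps(5, lambda r: r)   # pass the identity as the final continuation
```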

  8. Design and Implementation of Java Just-in-Time Compiler

    Institute of Scientific and Technical Information of China (English)

    丁宇新; 梅嘉; 程虎

    2000-01-01

    Early Java implementations relied on interpretation, leading to poor performance compared to compiled programs. A Java just-in-time (JIT) compiler can compile Java programs at runtime, so it not only markedly improves Java's performance but also preserves Java's portability. In this paper the design and implementation techniques of a Java JIT compiler based on the Chinese open system are discussed in detail. To enhance portability, a translating method is adopted that combines static simulation with macro expansion. Optimization techniques for the JIT compiler are also discussed, and a way to evaluate the hotspots in Java programs is presented. Experiments have been conducted to verify that JIT compilation is an efficient way to accelerate Java.
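
    A common way to evaluate hotspots, sketched generically here (the paper's actual heuristic is not reproduced), is an invocation counter per method that promotes the method from interpretation to compilation once a threshold is crossed:

```python
# Generic invocation-count hotspot detection sketch. The threshold value
# and the Method class are illustrative, not taken from the paper.
HOT_THRESHOLD = 3

class Method:
    def __init__(self, name):
        self.name, self.calls, self.compiled = name, 0, False

    def invoke(self):
        self.calls += 1
        if not self.compiled and self.calls >= HOT_THRESHOLD:
            self.compiled = True   # a real JIT would generate native code here

m = Method("loop_body")
for _ in range(5):
    m.invoke()   # crosses the threshold on the third call
```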

  9. HOPE: Just-in-time Python compiler for astrophysical computations

    Science.gov (United States)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of a compiled implementation.
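
    The essence of such runtime specialization can be sketched with Python's own compile/exec machinery. This stand-in generates specialized Python source rather than C++, so it illustrates the general technique of generating and compiling code at runtime, not HOPE's actual pipeline or API:

```python
def specialize_polynomial(coeffs):
    """Generate and compile a Python function that evaluates the polynomial
    with the given coefficients via Horner's scheme -- runtime code
    generation in miniature."""
    body = "0.0"
    for c in coeffs:                 # Horner: ((0*x + c0)*x + c1)*x + ...
        body = f"({body}) * x + {c!r}"
    src = f"def poly(x):\n    return {body}\n"
    namespace = {}
    exec(compile(src, "<generated>", "exec"), namespace)
    return namespace["poly"]

poly = specialize_polynomial([2.0, -3.0, 1.0])   # 2x^2 - 3x + 1
```

    The generated function has the coefficients baked into its body, so there is no loop or coefficient lookup left at call time, which is the same payoff a JIT gets from specializing on runtime information.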

  10. HAL/S-360 compiler test activity report

    Science.gov (United States)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  11. Performance of Compiler-Assisted Memory Safety Checking

    Science.gov (United States)

    2014-08-01

    Performance of Compiler-Assisted Memory Safety Checking. David Keaton and Robert C. Seacord. August 2014. Technical note CMU/SEI-2014-TN...014. Buffer overflows affect a large installed base of C code. This technical note describes the criteria for deploying a compiler... describes a modification to the LLVM compiler to enable hoisting bounds checks from loops and functions. This proof-of-concept prototype has been used...

  12. Ground Operations Aerospace Language (GOAL). Volume 2: Compiler

    Science.gov (United States)

    1973-01-01

    The principal elements and functions of the Ground Operations Aerospace Language (GOAL) compiler are presented. The technique used to transcribe the syntax diagrams into machine processable format for use by the parsing routines is described. An explanation of the parsing technique used to process GOAL source statements is included. The compiler diagnostics and the output reports generated during a GOAL compilation are explained. A description of the GOAL program package is provided.
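
    The general parsing approach can be illustrated with a recursive-descent parser for a tiny, hypothetical GOAL-like command grammar (the grammar below is invented for illustration and is not taken from the GOAL specification):

```python
# Hypothetical mini-grammar in the spirit of GOAL's English-like commands:
#   stmt := VERIFY name IS number | SET name TO number
# A recursive-descent parser consumes tokens left to right with one
# procedure per grammar rule, mirroring syntax-diagram-driven parsing.
def parse(statement):
    tokens = statement.split()

    def expect(word):
        if not tokens or tokens[0].upper() != word:
            raise SyntaxError(f"expected {word!r}")
        tokens.pop(0)

    verb = tokens[0].upper() if tokens else ""
    if verb == "VERIFY":
        expect("VERIFY"); name = tokens.pop(0); expect("IS")
        return ("verify", name, float(tokens.pop(0)))
    if verb == "SET":
        expect("SET"); name = tokens.pop(0); expect("TO")
        return ("set", name, float(tokens.pop(0)))
    raise SyntaxError(f"unknown statement: {statement!r}")
```

    A failure at any `expect` call corresponds to the kind of diagnostic a compiler emits when a source statement departs from its syntax diagram.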

  13. Compiler Construction Using Java, JavaCC, and Yacc

    CERN Document Server

    Dos Reis, Anthony J

    2012-01-01

    Broad in scope, involving theory, the application of that theory, and programming technology, compiler construction is a moving target, with constant advances in compiler technology taking place. Today, a renewed focus on do-it-yourself programming makes a quality textbook on compilers, that both students and instructors will enjoy using, of even more vital importance. This book covers every topic essential to learning compilers from the ground up and is accompanied by a powerful and flexible software package for evaluating projects, as well as several tutorials, well-defined projects, and tes…

  14. Automatic Parallelization An Overview of Fundamental Compiler Techniques

    CERN Document Server

    Midkiff, Samuel P

    2012-01-01

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and…
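
    Dependence analysis of the kind described can be illustrated with the classic GCD test, a conservative necessary condition for two subscripted array accesses in a loop to ever touch the same element:

```python
from math import gcd

# GCD dependence test sketch: for two accesses A[a*i + b] and A[c*i + d]
# inside a loop, the equation a*i1 + b = c*i2 + d can have integer
# solutions only if gcd(a, c) divides d - b. If it does not, no dependence
# exists and the loop is a candidate for parallelization. (The test is
# conservative: passing it does not prove a dependence, since loop bounds
# are ignored.)
def may_depend(a, b, c, d):
    g = gcd(a, c)
    return g == 0 or (d - b) % g == 0

# A[2*i] vs A[2*i + 1]: even vs odd indices never collide -> parallelizable.
independent = not may_depend(2, 0, 2, 1)
# A[i] vs A[i - 1]: gcd divides the offset, so a loop-carried dependence
# may exist and the loop cannot be blindly parallelized.
dependent = may_depend(1, 0, 1, -1)
```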

  15. DisBlue+: A distributed annotation-based C# compiler

    Directory of Open Access Journals (Sweden)

    Samir E. AbdelRahman

    2010-06-01

    Many programming languages utilize annotations to add useful information to the program, but they still result in more tokens to be compiled and hence slower compilation. Any current distributed compiler breaks the program into scattered disjoint pieces to speed up compilation. However, these pieces cooperate synchronously and depend highly on each other. This causes massive overhead, since messages, symbols, or code must travel across the network. This paper presents two promising compilers named annotation-based C# (Blue+) and distributed annotation-based C# (DisBlue+). The proposed Blue+ annotation is based on axiomatic semantics to replace the if/loop constructs. As developers tend to use many (complex) conditions and repeat them in the program, such annotations reduce compilation scanning time and increase the overall readability of the code. Built on top of Blue+, DisBlue+ introduces its distributed concept, which is to divide each program class into its prototype and definition, as disjoint distributed pieces, such that each class definition is compiled with only its related compiled prototypes (interfaces). This concept reduces the amount of code transferred over the network, minimizes the dependencies among the disjoint pieces, and removes any possible synchronization between them. To test their efficiency, Blue+ and DisBlue+ were verified with large-size codes against existing compilers, namely Javac, DJavac, and CDjava.

  16. Hydrostratigraphic mapping of the Milford-Souhegan glacial drift aquifer, and effects of hydrostratigraphy on transport of PCE, Operable Unit 1, Savage Superfund Site, Milford, New Hampshire

    Science.gov (United States)

    Harte, Philip T.

    2010-01-01

    The Savage Municipal Well Superfund site in the Town of Milford, New Hampshire, was underlain by a 0.5-square mile plume (as mapped in 1994) of volatile organic compounds (VOCs), most of which consisted of tetrachloroethylene (PCE). The plume occurs mostly within highly transmissive stratified-drift deposits but also extends into underlying till and bedrock. The plume has been divided into two areas called Operable Unit 1 (OU1), which contains the primary source area, and Operable Unit 2 (OU2), which is defined as the extended plume area outside of OU1. The OU1 remedial system includes a low-permeability barrier wall that encircles the highest detected concentrations of PCE and a series of injection and extraction wells to contain and remove contaminants. The barrier wall likely penetrates the full thickness of the sand and gravel; in many places, it also penetrates the full thickness of the underlying basal till and sits atop bedrock. From 1998 to 2004, PCE concentrations decreased by an average of 80 percent at most wells outside the barrier wall. However, inside the barrier, PCE concentrations greater than 10,000 micrograms per liter (μg/L) still exist (2008). These areas of recalcitrant PCE present challenges to successful remediation. The U.S. Geological Survey (USGS), in cooperation with the New Hampshire Department of Environmental Services (NHDES) and the U.S. Environmental Protection Agency (USEPA), Region 1, is studying the solute transport of VOCs (primarily PCE) in contaminated groundwater in the unconsolidated sediments (overburden) of the Savage site and specifically assisting in the evaluation of the effectiveness of remedial operations in the OU1 area. As part of this effort, the USGS analyzed the subsurface stratigraphy to help understand hydrostratigraphic controls on remediation. A combination of lithologic, borehole natural gamma-ray and electromagnetic (EM) induction logging, and test drilling has identified 11 primary…

  17. [Effectiveness of the kit Conversation Map in the therapeutic education of diabetic people attending the Diabetes Unit in Carpi, Italy].

    Science.gov (United States)

    Ciardullo, Anna Vittoria; Daghio, Maria Monica; Fattori, Giuseppe; Giudici, Graziella; Rossii, Lorella; Vagnini, Claudio

    2010-12-01

    We implemented the "Diabetes conversations" programme of the International Diabetes Federation-Europe, characterised by the use of the Conversation Map, an educational interactive kit addressed to groups of diabetic patients on: Living with diabetes, What is diabetes, Healthy diet and physical activity, Initiating insulin therapy. After at least three months from the end of the 4-session course, clinical data of 63 participants from the first 10 groups (age (mean +/- std dev) 61.7 +/- 10.2 years, 56% women, 18.5% T1DM) improved: fasting glycemia decreased from 152.9 +/- 55.2 to 138.2 +/- 38.9 mg/dl (P …). Conversation Maps are useful because they: (a) contribute to improving glycometabolic control; (b) educate patients on the main topics related to diabetes; (c) give the nurse a key and active role in patients' education; (d) facilitate the connection between knowledge and behaviour; (e) involve the volunteers of the diabetic association as tutors; (f) improve the relationship and the communication between the doctor/nurse and the patient.

  18. Simulation modeling to evaluate the persistence of Mycobacterium avium subsp. paratuberculosis (MAP) on commercial dairy farms in the United States.

    Science.gov (United States)

    Mitchell, R M; Whitlock, R H; Stehman, S M; Benedictus, A; Chapagain, P P; Grohn, Y T; Schukken, Y H

    2008-03-17

    We developed a series of deterministic mathematical models of Mycobacterium avium subsp. paratuberculosis (MAP) transmission on commercial US dairies. Our models build upon and modify models and assumptions in previous work to better reflect the pathobiology of the disease. Parameter values were obtained from literature for animal turnover in US dairy herds and rates of transition between disease states. The models developed were used to test three hypotheses. (1) Infectious transmission following intervention is relatively insensitive to the presence of high-shedding animals. (2) Vertical and pseudo-vertical transmission increases prevalence of disease but is insufficient to explain persistence following intervention. (3) Transiently shedding young animals might aid persistence. Our simulations indicated that multiple levels of contagiousness among infected adult animals in combination with vertical transmission and MAP shedding in infected young animals explained the maintenance of low-prevalence infections in herds. High relative contagiousness of high-shedding adult animals resulted in these animals serving as the predominant contributor to transmission. This caused elimination of infection in herds using the test-and-cull intervention tested in these simulations. Addition of vertical transmission caused persistence of infection in a moderately complicated model. In the most complex model that allowed age-based contacts, calf-to-calf transmission was required for persistence.
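
    A deterministic compartmental model of this general kind can be sketched as follows. The parameters and structure here are illustrative and much simpler than the paper's models (no age structure, shedding levels, or vertical transmission): susceptible and infectious adults, frequency-dependent transmission, and test-and-cull removal with replacement.

```python
# Minimal deterministic compartmental sketch (not the paper's full model):
# susceptible (S) and infectious (I) adults with transmission rate beta,
# plus test-and-cull removal of infectious animals at rate tau. Culled
# animals are replaced with susceptibles to keep herd size fixed.
def simulate(S0, I0, beta, tau, steps, dt=0.1):
    S, I = float(S0), float(I0)
    N = S + I
    for _ in range(steps):
        new_inf = beta * S * I / N * dt   # frequency-dependent transmission
        culled = tau * I * dt
        S += -new_inf + culled            # replacements enter as susceptible
        I += new_inf - culled
    return S, I

# With culling outpacing transmission, infection is eliminated over time:
S_end, I_end = simulate(S0=99, I0=1, beta=0.1, tau=0.5, steps=1000)
```

    In this simple setting infection persists only when transmission outpaces removal; the paper's result is that extra routes (vertical transmission, shedding calves) can sustain low-prevalence infection even when adult-to-adult transmission alone would die out.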

  19. Compilation of data for radionuclide transport analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other scenarios selected. The data required by the selected near field, geosphere and biosphere models are given and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed and the uncertainty is qualitatively addressed for data to the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic in order not to underestimate the radionuclide release rates. It is judged that this approach combined with the selected calculation cases will illustrate the effects of uncertainties in processes and events that affects the evolution of the system as well as in quantitative data that describes this. The biosphere model allows for probabilistic calculations and the uncertainty in input data are quantified by giving minimum, maximum and mean values as well as the type of probability distribution function.

  20. Renewable Energy Atlas of the United States

    Energy Technology Data Exchange (ETDEWEB)

    Kuiper, J. [Environmental Science Division]; Hlava, K. [Environmental Science Division]; Greenwood, H. [Environmental Science Division]; Carr, A. [Environmental Science Division]

    2013-12-13

    The Renewable Energy Atlas (Atlas) of the United States is a compilation of geospatial data focused on renewable energy resources, federal land ownership, and base map reference information. This report explains how to add the Atlas to your computer and install the associated software. The report also includes a description of each of the components of the Atlas; lists of the Geographic Information System (GIS) database content and sources; and a brief introduction to the major renewable energy technologies. The Atlas includes the following: a GIS database organized as a set of Environmental Systems Research Institute (ESRI) ArcGIS Personal GeoDatabases, and ESRI ArcReader and ArcGIS project files providing an interactive map visualization and analysis interface.

  1. State Soil Geographic (STATSGO) Data Base for the Conterminous United States

    Science.gov (United States)

    Schwarz, Gregory E.; Alexander, R.B.

    1995-01-01

    USSOILS is an Arc 7.0 coverage containing hydrology-relevant information for 10,498 map units covering the entire conterminous United States. The coverage was compiled from individual State coverages contained in the October 1994 State Soil Geographic (STATSGO) Data Base produced on CD-ROM. The geo-dataset USSOILS.PAT relates (on the basis of a map unit identifier) the 10,498 map units to 78,518 polygons. The scale of the geo-dataset is 1:250,000. The INFO attribute table USSOILS.MUID_ATTS contains selected variables from the STATSGO data set for 10,501 map units (an extra 3 map units are contained in the attribute table that are not in the geo-dataset - see the 'Procedures' section below), including: the map unit identifier, a 2-character state abbreviation, available water capacity of the soil, percent clay in the soil, the actual k-factor used in the water erosion component of the universal soil loss equation, the organic material in soil, soil permeability, cumulative thickness of all soil layers, hydrologic characteristics of the soil, quality of drainage, surface slope, liquid limit of the soil, share of a map unit having hydric soils, and the annual frequency of flooding. To facilitate mapping the attribute data, the narrative section below contains instructions for transferring the information contained in the attribute table USSOILS.MUID_ATTS to the polygon attribute table USSOILS.PAT. Keywords: STATSGO; United States; soil; water capacity; clay; organic material; permeability; infiltration; drainage; hydric soils; flood frequency; slope.
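
    The attribute transfer described in the narrative is a one-to-many join on the map-unit identifier: each polygon inherits the attributes of its map unit. A minimal sketch of that join (with invented field names, not the actual STATSGO schema):

```python
# Map-unit attribute table, keyed by MUID (field names are illustrative).
muid_atts = {
    "MT001": {"awc": 0.12, "clay_pct": 18.0},
    "MT002": {"awc": 0.09, "clay_pct": 33.0},
}
# Polygon attribute table: many polygons can share one map unit.
polygons = [
    {"poly_id": 1, "muid": "MT001"},
    {"poly_id": 2, "muid": "MT002"},
    {"poly_id": 3, "muid": "MT001"},
]

def join_attributes(polygons, muid_atts):
    """Attach each polygon's map-unit attributes, one-to-many on MUID."""
    joined = []
    for poly in polygons:
        rec = dict(poly)
        rec.update(muid_atts.get(poly["muid"], {}))
        joined.append(rec)
    return joined

table = join_attributes(polygons, muid_atts)
```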

  2. Geologic Map of the Valles Caldera, Jemez Mountains, New Mexico

    Science.gov (United States)

    Goff, F.; Gardner, J. N.; Reneau, S. L.; Kelley, S. A.; Kempter, K. A.; Lawrence, J. R.

    2011-12-01

    Valles caldera is famous as the type locality of large resurgent calderas (Smith and Bailey, 1968), the location of a classic 260-300 °C liquid-dominated geothermal system (Goff and Gardner, 1994), and the site of a long-lived late Pleistocene lake (Fawcett et al., 2011). We have published a detailed color geologic map of the Valles caldera and surrounding areas at 1:50,000 scale obtainable from New Mexico Bureau of Geology and Mineral Resources (geoinfo.nmt.edu/publications/maps/geologic/gm/79/). The new Valles map has been compiled from all or parts of nine 1:24,000 geologic maps completed between 2004 and 2008 (Bland, Cerro del Grant, Jarosa, Jemez Springs, Polvadera Peak, Redondo Peak, Seven Springs, Valle San Antonio, and Valle Toledo). Our map provides more detailed geology on the resurgent dome, caldera collapse breccias, post-caldera lava and tuff eruptions, intracaldera sedimentary and lacustrine deposits, and precaldera volcanic and sedimentary rocks than previous maps and incorporates recent stratigraphic revisions to the geology of the Jemez Mountains volcanic field. Three cross sections supported by surface geology, geophysical data and deep borehole logs (≤4500 m) show an updated view of the caldera interior, depict a modern interpretation of caldera collapse and resurgence, and provide caldera-wide subsurface isotherms (≤500 °C). A 30 page booklet included with the map contains extensive rock descriptions for 162 stratigraphic units and figures showing physiographic features, structural relations between Valles (1.25 Ma) and the earlier, comparably sized Toledo caldera (1.62 Ma), correlation charts of map units, and the distribution of pre- and post-caldera hydrothermal alteration styles, including recently documented zeolite-type alteration. Finally, the booklet includes a generalized model showing our interpretation of intracaldera structure and subjacent magma chambers, and relations of Valles to earlier Quaternary-Precambrian units.

  3. Compiling nested loop programs to process networks

    NARCIS (Netherlands)

    Turjan, Alexandru

    2007-01-01

    New heterogeneous multiprocessor platforms are emerging that are typically composed of loosely coupled components that exchange data using programmable interconnections. The components can be CPUs or DSPs, specialized IP cores, reconfigurable units, or memories. To program such a platform, we use the…

  4. Site investigation SFR. Rock type coding, overview geological mapping and identification of rock units and possible deformation zones in drill cores from the construction of SFR

    Energy Technology Data Exchange (ETDEWEB)

    Petersson, Jesper (Vattenfall Power Consultant AB, Stockholm (Sweden)); Curtis, Philip; Bockgaard, Niclas (Golder Associates AB (Sweden)); Mattsson, Haakan (GeoVista AB, Luleaa (Sweden))

    2011-01-15

    This report presents the rock type coding, overview lithological mapping and identification of rock units and possible deformation zones in drill cores from 32 boreholes associated with the construction of SFR. This work can be seen as complementary to single-hole interpretations of other older SFR boreholes earlier reported in /Petersson and Andersson 2010/: KFR04, KFR08, KFR09, KFR13, KFR35, KFR36, KFR54, KFR55, KFR7A, KFR7B and KFR7C. Due to deficiencies in the available material, the necessary activities have deviated somewhat from the established methodologies used during the recent Forsmark site investigations for the final repository for spent nuclear fuel. The aim of the current work has been, wherever possible, to allow the incorporation of all relevant material from older boreholes in the ongoing SFR geological modelling work in spite of the deficiencies. The activities include: - Rock type coding of the original geological mapping according to the nomenclature used during the preceding Forsmark site investigation. As part of the Forsmark site investigation such rock type coding has already been performed on most of the old SFR boreholes if the original geological mapping results were available. This earlier work has been complemented by rock type coding on two further boreholes: KFR01 and KFR02. - Lithological overview mapping, including documentation of (1) rock types, (2) ductile and brittle-ductile deformation and (3) alteration for drill cores from eleven of the boreholes for which no original geological borehole mapping was available (KFR31, KFR32, KFR34, KFR37, KFR38, KFR51, KFR69, KFR70, KFR71, KFR72 and KFR89). - Identification of possible deformation zones and merging of similar rock types into rock units. This follows SKB's established criteria and methodology of the geological Single-hole interpretation (SHI) process wherever possible. Deviations from the standard SHI process are associated with the lack of data, for example BIPS images…

  5. Pick'n'Fix: Capturing Control Flow in Modular Compilers

    DEFF Research Database (Denmark)

    Day, Laurence E.; Bahr, Patrick

    2014-01-01

    We present a modular framework for implementing languages with effects and control structures such as loops and conditionals. This framework enables modular definitions of both syntax and semantics as well as modular implementations of compilers and virtual machines. In order to compile control s...

  6. abc: The AspectBench Compiler for AspectJ

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon;

    2005-01-01

    abc is an extensible, optimising compiler for AspectJ. It has been designed as a workbench for experimental research in aspect-oriented programming languages and compilers. We outline a programme of research in these areas, and we review how abc can help in achieving those research goals...

  7. NUAPC:A Parallelizing Compiler for C++

    Institute of Scientific and Technical Information of China (English)

    朱根江; 谢立; 等

    1997-01-01

    This paper presents a model for an automatically parallelizing compiler based on C++ which consists of compile-time and run-time parallelizing facilities. The paper also describes a method for finding both intra-object and inter-object parallelism. The parallelism detection is completely transparent to users.

  8. 38 CFR 45.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Semi-annual compilation. 45.600 Section 45.600 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) NEW RESTRICTIONS ON LOBBYING Agency Reports § 45.600 Semi-annual compilation. (a) The head...

  9. Production compilation : A simple mechanism to model complex skill acquisition

    NARCIS (Netherlands)

    Taatgen, N.A.; Lee, F.J.

    2003-01-01

    In this article we describe production compilation, a mechanism for modeling skill acquisition. Production compilation has been developed within the ACT-Rational (ACT-R; J. R. Anderson, D. Bothell, M. D. Byrne, & C. Lebiere, 2002) cognitive architecture and consists of combining and specializing tas

  10. Preliminary Geologic Map of the Cook Inlet Region, Alaska-Including Parts of the Talkeetna, Talkeetna Mountains, Tyonek, Anchorage, Lake Clark, Kenai, Seward, Iliamna, Seldovia, Mount Katmai, and Afognak 1:250,000-scale Quadrangles

    Science.gov (United States)

    Geologic map compiled by Wilson, Frederic H.; Hults, Chad P.; Schmoll, Henry R.; Haeussler, Peter J.; Schmidt, Jeanine M.; Yehle, Lynn A.; Digital files prepared by Wilson, Frederic H.; Labay, Keith A.; Shew, Nora

    2009-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for digital geologic maps that have been attributed with information about age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This report is part of a series of integrated geologic map databases that cover the entire United States. Three national-scale geologic maps that portray most or all of the United States already exist; for the conterminous U.S., King and Beikman (1974a,b) compiled a map at a scale of 1:2,500,000, Beikman (1980) compiled a map for Alaska at 1:2,500,000 scale, and for the entire U.S., Reed and others (2005a,b) compiled a map at a scale of 1:5,000,000. A digital version of the King and Beikman map was published by Schruben and others (1994). Reed and Bush (2004) produced a digital version of the Reed and others (2005a) map for the conterminous U.S. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. The digital geologic maps presented here are in a standardized format as ARC/INFO export files and as ArcView shape files. The files named __geol contain geologic polygons and line (contact) attributes; files named __fold contain fold axes; files named __lin contain lineaments; and files named __dike contain dikes as lines. Data tables that relate the map units to detailed lithologic and age information accompany these GIS files. The map is delivered as a set of 1:250,000-scale quadrangle files. To the best of our ability, these quadrangle files are edge-matched with respect to geology. When the maps are merged, the combined attribute tables can be used directly with the merged maps to make

  11. Global compilation of marine varve records

    Science.gov (United States)

    Schimmelmann, Arndt; Lange, Carina B.; Schieber, Juergen; Francus, Pierre; Ojala, Antti E. K.; Zolitschka, Bernd

    2017-04-01

    Marine varves contain highly resolved records of geochemical and other paleoceanographic and paleoenvironmental proxies with annual to seasonal resolution. We present a global compilation of marine varved sedimentary records throughout the Holocene and Quaternary covering more than 50 sites worldwide. Marine varve deposition and preservation typically depend on environmental and sedimentological conditions, such as a sufficiently high sedimentation rate, severe depletion of dissolved oxygen in bottom water to exclude bioturbation by macrobenthos, and a seasonally varying sedimentary input to yield a recognizable rhythmic varve pattern. Additional oceanographic factors may include the strength and depth range of the Oxygen Minimum Zone (OMZ) and regional anthropogenic eutrophication. Modern to Quaternary marine varves are not only found in those parts of the open ocean that comply with these conditions, but also in fjords, embayments and estuaries with thermohaline density stratification, and nearshore 'marine lakes' with strong hydrologic connections to ocean water. Marine varves have also been postulated in pre-Quaternary rocks. In the case of non-evaporitic laminations in fine-grained ancient marine rocks, such as banded iron formations and black shales, laminations may not be varves but instead may have multiple alternative origins such as event beds or formation via bottom currents that transported and sorted silt-sized particles, clay floccules, and organic-mineral aggregates in the form of migrating bedload ripples. Modern marine ecosystems on continental shelves and slopes, in coastal zones and in estuaries are susceptible to stress by anthropogenic pressures, for example in the form of eutrophication, enhanced OMZs, and expanding ranges of oxygen-depletion in bottom waters. 
Sensitive laminated sites may play the important role of a 'canary in the coal mine' where monitoring the character and geographical extent of laminations/varves serves as a diagnostic

  12. Experiences with Compiler Support for Processors with Exposed Pipelines

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Schleuniger, Pascal; Hindborg, Andreas Erik;

    2015-01-01

    Field programmable gate arrays, FPGAs, have become an attractive implementation technology for a broad range of computing systems. We recently proposed a processor architecture, Tinuso, which achieves high performance by moving complexity from hardware to the compiler tool chain. This means...... that the compiler tool chain must handle the increased complexity. However, it is not clear if current production compilers can successfully meet the strict constraints on instruction order and generate efficient object code. In this paper, we present our experiences developing a compiler backend using the GNU...... Compiler Collection, GCC. For a set of C benchmarks, we show that a Tinuso implementation with our GCC backend reaches a relative speedup of up to 1.73 over a similar Xilinx MicroBlaze configuration while using 30% fewer hardware resources. While our experiences are generally positive, we expose some...

  13. Code Commentary and Automatic Refactorings using Feedback from Multiple Compilers

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Probst, Christian W.; Karlsson, Sven

    2014-01-01

    Optimizing compilers are essential to the performance of parallel programs on multi-core systems. It is attractive to expose parallelism to the compiler letting it do the heavy lifting. Unfortunately, it is hard to write code that compilers are able to optimize aggressively and therefore tools...... exist that can guide programmers with refactorings allowing the compilers to optimize more aggressively. We target the problem with many false positives that these tools often generate, where the amount of feedback can be overwhelming for the programmer. Our approach is to use a filtering scheme based...... on feedback from multiple compilers and show how we are able to filter out 87.6% of the comments by only showing the most promising comments....

  14. Fully Countering Trusting Trust through Diverse Double-Compiling

    CERN Document Server

    Wheeler, David A

    2010-01-01

    An Air Force evaluation of Multics, and Ken Thompson's Turing award lecture ("Reflections on Trusting Trust"), showed that compilers can be subverted to insert malicious Trojan horses into critical software, including themselves. If this "trusting trust" attack goes undetected, even complete analysis of a system's source code will not find the malicious code that is running. Previously-known countermeasures have been grossly inadequate. If this attack cannot be countered, attackers can quietly subvert entire classes of computer systems, gaining complete control over financial, infrastructure, military, and/or business systems worldwide. This dissertation's thesis is that the trusting trust attack can be detected and effectively countered using the "Diverse Double-Compiling" (DDC) technique, as demonstrated by (1) a formal proof that DDC can determine if source code and generated executable code correspond, (2) a demonstration of DDC with four compilers (a small C compiler, a small Lisp compiler, a small malic...
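The DDC check described above can be illustrated with a toy model. This is a sketch, not Wheeler's actual procedure: a compiler "binary" is modeled as a small dict, and "running" a binary on compiler source yields a new binary whose code generation follows the source, while a trojaned binary silently re-inserts its backdoor (the self-perpetuating "trusting trust" attack). All names here are hypothetical.

```python
# Toy model of Diverse Double-Compiling (DDC).
# A "binary" is a dict; real DDC compares actual executables bit-for-bit.

def run_compiler(binary, source):
    """Execute a compiler binary on compiler source code. Output semantics
    follow the source; a trojaned binary re-inserts its backdoor."""
    return {"codegen": source["codegen"], "trojan": binary["trojan"]}

def ddc_check(suspect_binary, source, trusted_binary):
    """DDC: rebuild the suspect compiler from its source with a diverse,
    trusted compiler, then rebuild once more with that result. If the
    outcome differs from the suspect binary, source and binary do not
    correspond."""
    stage1 = run_compiler(trusted_binary, source)  # trusted rebuild
    stage2 = run_compiler(stage1, source)          # self-rebuild via stage1
    return stage2 == suspect_binary
```

A clean self-hosting compiler passes the check; a trojaned one fails, because the backdoor cannot survive the pass through the diverse trusted compiler.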

  15. Application of the Lean Office philosophy and mapping of the value stream in the process of designing the banking units of a financial company

    Directory of Open Access Journals (Sweden)

    Nelson Antônio Calsavara

    2016-09-01

    The purpose of this study is to conduct a critical analysis of the effects of Lean Office on the design process of the banking units of a financial company and how the implementation of this philosophy may contribute to productivity, thus reducing implementation time. A literature review of the Toyota Production System was conducted, as well as studies on its methods, extending to lean thinking and the application of Lean philosophies in services and office settings. A bibliographic and documentary survey of the Lean processes and procedures for opening bank branches was carried out. A Current State Map was developed, modeling the current operating procedures. After the identification and analysis of waste, proposals were presented for reducing deadlines and eliminating and grouping stages, with consequent development of the Future State Map, implementation and monitoring of stages, and the measurement of estimated time gains in operation, demonstrating an estimated 45% reduction, in days, from start to end of the process, concluding that the implementation of the Lean Office philosophy contributed to the process.

  16. Geologic and Fossil Locality Maps of the West-Central Part of the Howard Pass Quadrangle and Part of the Adjacent Misheguk Mountain Quadrangle, Western Brooks Range, Alaska

    Science.gov (United States)

    Dover, James H.; Tailleur, Irvin L.; Dumoulin, Julie A.

    2004-01-01

    The map depicts the field distribution and contact relations between stratigraphic units, the tectonic relations between major stratigraphic sequences, and the detailed internal structure of these sequences. The stratigraphic sequences formed in a variety of continental margin depositional environments, and subsequently underwent a complex deformational history of imbricate thrust faulting and folding. A compilation of micro and macro fossil identifications is included in this data set.

  17. Geologic map of the southern White Ledge Peak and Matilija quadrangles, Santa Barbara and Ventura Counties, California

    Science.gov (United States)

    Minor, Scott A.; Brandt, Theodore R.

    2015-01-01

    This report presents a digital geologic strip map of the southern parts of the contiguous White Ledge Peak and Matilija 7.5’ quadrangles in coastal southern California. With a compilation scale of 1:24,000 (one inch on the map to 2,000 feet on the ground), the map depicts the distribution of bedrock units, surficial deposits, and associated deformation adjacent to and south of the Arroyo Parida fault and in the southern Ojai Valley east of the Ventura River. This new compilation, combined with a recently published geologic map of the Santa Barbara coastal plain (U.S. Geological Survey Scientific Investigations Map 3001), completes a 69-km-long east-west mapping transect from Goleta to Ojai by the U.S. Geological Survey. These two contiguous geologic maps provide new insights and constraints on Neogene-through-Quaternary tectonic deformation and consequent landscape change, including geohazards in the urbanized southern flank of the Santa Ynez Mountains.

  18. Feasibility and utility of mapping disease risk at the neighbourhood level within a Canadian public health unit: an ecological study

    Directory of Open Access Journals (Sweden)

    Wanigaratne Susitha

    2010-05-01

    Abstract Background We conducted spatial analyses to determine the geographic variation of cancer at the neighbourhood level (dissemination areas, or DAs) within the area of a single Ontario public health unit, Wellington-Dufferin-Guelph, covering a population of 238,326 inhabitants. Cancer incidence data between 1999 and 2003 were obtained from the Ontario Cancer Registry and were geocoded down to the level of DA using the enhanced Postal Code Conversion File. The 2001 Census of Canada provided information on the size and age-sex structure of the population at the DA level, in addition to information about selected census covariates, such as average neighbourhood income. Results Age standardized incidence ratios for cancer and the prevalence of census covariates were calculated for each of 331 dissemination areas in Wellington-Dufferin-Guelph. The standardized incidence ratios (SIRs) for cancer varied dramatically across the dissemination areas. However, application of the Moran's I statistic, a popular index of spatial autocorrelation, suggested significant spatial patterns for only two cancers, lung and prostate, both in males. Conclusion This paper demonstrates the feasibility and utility of a systematic approach to identifying neighbourhoods, within the area served by a public health unit, that have significantly higher risks of cancer. This exploratory, ecologic study suggests several hypotheses for these spatial patterns that warrant further investigations. To the best of our knowledge, this is the first Canadian study published in the peer-reviewed literature estimating the risk of relatively rare public health outcomes at a very small areal level, namely dissemination areas.
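The standardized incidence ratio underlying the analysis can be sketched by indirect standardization; the age strata and rates below are invented for illustration:

```python
# Standardized incidence ratio (SIR) by indirect standardization:
# SIR = observed cases / expected cases, where expected cases apply
# reference age-specific rates to the area's population counts.

def expected_cases(population_by_age, reference_rates):
    # Sum over age strata of (population at risk) * (reference rate).
    return sum(n * reference_rates[age] for age, n in population_by_age.items())

def sir(observed, population_by_age, reference_rates):
    return observed / expected_cases(population_by_age, reference_rates)
```

An SIR of 2.0, for example, means a neighbourhood observed twice as many cases as its age structure would predict from the reference rates.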

  19. Mars synthetic topographic mapping

    Science.gov (United States)

    Wu, S.S.C.

    1978-01-01

    Topographic contour maps of Mars are compiled by the synthesis of data acquired from various scientific experiments of the Mariner 9 mission, including S-band radio-occultation, the ultraviolet spectrometer (UVS), the infrared radiometer (IRR), the infrared interferometer spectrometer (IRIS) and television imagery, as well as Earth-based radar information collected at Goldstone, Haystack, and Arecibo Observatories. The entire planet is mapped at scales of 1:25,000,000 and 1:25,000,000 using Mercator, Lambert, and polar stereographic map projections. For the computation of map projections, a biaxial spheroid figure is adopted. The semimajor and semiminor axes are 3393.4 and 3375.7 km, respectively, with a polar flattening of 0.0052. For the computation of elevations, a topographic datum is defined by a gravity field described in terms of spherical harmonics of fourth order and fourth degree combined with a 6.1-mbar occultation pressure surface. This areoid can be approximated by a triaxial ellipsoid with semimajor axes of A = 3394.6 km and B = 3393.3 km and a semiminor axis of C = 3376.3 km. The semimajor axis A intersects the Martian surface at longitude 105°W. The dynamic flattening of Mars is 0.00525. The contour interval of the maps is 1 km. For some prominent features where overlapping pictures from Mariner 9 are available, local contour maps at relatively larger scales were also compiled by photogrammetric methods on stereo plotters. © 1978.
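The quoted flattening of the adopted spheroid can be checked directly from the radii given, using f = (a − c)/a:

```python
# Check the flattening figures from the adopted Mars radii (km).
def flattening(a, c):
    return (a - c) / a

# Biaxial spheroid used for the map projections: a = 3393.4, c = 3375.7.
f_spheroid = flattening(3393.4, 3375.7)   # ≈ 0.0052

# Triaxial areoid approximation: mean equatorial radius (A + B)/2 vs. polar C.
f_areoid = flattening((3394.6 + 3393.3) / 2, 3376.3)
```

Both work out to roughly 0.0052, consistent with the polar flattening quoted for the projection spheroid (the dynamic flattening of 0.00525 is a separate, gravity-derived quantity).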

  20. Seismic scattering and absorption mapping of debris flows, feeding paths, and tectonic units at Mount St. Helens volcano

    Science.gov (United States)

    De Siena, L.; Calvet, M.; Watson, K. J.; Jonkers, A. R. T.; Thomas, C.

    2016-05-01

    Frequency-dependent peak-delay times and coda quality factors have been used jointly to separate seismic absorption from scattering quantitatively in Earth media at regional and continental scale; to this end, we measure and map these two quantities at Mount St. Helens volcano. The results show that we can locate and characterize volcanic and geological structures using their unique contribution to seismic attenuation. At 3 Hz a single high-scattering and high-absorption anomaly outlines the debris flows that followed the 1980 explosive eruption, as deduced by comparison with remote sensing imagery. The flows overlay a NNW-SSE interface, separating rocks of significant varying properties down to 2-4 km, and coinciding with the St. Helens Seismic Zone. High-scattering and high-absorption anomalies corresponding to known locations of magma emplacement follow this signature under the volcano, showing the important interconnections between its feeding systems and the regional tectonic boundaries. With frequency increasing from 6 to 18 Hz the NNW-SSE tectonic/feeding trends rotate around an axis centered on the volcano in the direction of the regional-scale magmatic arc (SW-NE). While the aseismic high-scattering region WSW of the volcano shows no evidence of high absorption, the regions of highest-scattering and absorption are consistently located at all frequencies under either the eastern or the south-eastern flank of the volcanic edifice. From the comparison with the available geological and geophysical information we infer that these anomalies mark both the location and the trend of the main feeding systems at depths greater than 4 km.

  1. Compilation of kinetic data for geochemical calculations

    Energy Technology Data Exchange (ETDEWEB)

    Arthur, R.C. [Monitor Scientific, LLC., Denver, Colorado (United States); Savage, D. [Quintessa, Ltd., Nottingham (United Kingdom); Sasamoto, Hiroshi; Shibata, Masahiro; Yui, Mikazu [Japan Nuclear Cycle Development Inst., Tokai, Ibaraki (Japan). Tokai Works

    2000-01-01

    Kinetic data, including rate constants, reaction orders and activation energies, are compiled for 34 hydrolysis reactions involving feldspars, sheet silicates, zeolites, oxides, pyroxenes and amphiboles, and for similar reactions involving calcite and pyrite. The data are compatible with a rate law consistent with surface reaction control and transition-state theory, which is incorporated in the geochemical software packages EQ3/6 and GWB. Kinetic data for the reactions noted above are strictly compatible with the transition-state rate law only under far-from-equilibrium conditions. It is possible that the data are conceptually consistent with this rate law under both far-from-equilibrium and near-to-equilibrium conditions, but this should be confirmed whenever possible through analysis of original experimental results. Due to limitations in the availability of kinetic data for mineral-water reactions, and in order to simplify evaluations of geochemical models of groundwater evolution, it is convenient to assume local equilibrium in such models whenever possible. To assess whether this assumption is reasonable, a modeling approach accounting for coupled fluid flow and water-rock interaction is described that can be used to estimate the spatial and temporal scales of local equilibrium. The approach is demonstrated for conditions involving groundwater flow in fractures at JNC's Kamaishi in-situ test site, and is also used to estimate the travel time necessary for oxidizing surface waters to migrate to the level of a HLW repository in crystalline rock. The question of whether local equilibrium is a reasonable assumption must be addressed using an appropriate modeling approach. To be appropriate for conditions at the Kamaishi site using the modeling approach noted above, the fracture fill must closely approximate a porous medium, groundwater flow must be purely advective and diffusion of solutes across the fracture-host rock boundary must not occur. Moreover, the
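The transition-state rate law referred to above is commonly written in a general form such as the following (a sketch of the standard TST-based law used in EQ3/6-style codes; notation is conventional, not copied from the report):

```latex
r = k\,S \prod_i a_i^{n_i} \left[\, 1 - \left(\frac{Q}{K}\right)^{1/\sigma} \right]
```

where r is the reaction rate, k the rate constant, S the reactive surface area, the a_i^{n_i} are activity terms with reaction orders n_i, Q the ion-activity product, K the equilibrium constant, and σ a stoichiometric factor. Far from equilibrium Q/K → 0 and the bracketed affinity term approaches 1, which is why the compiled data are strictly compatible with this law only under far-from-equilibrium conditions.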

  2. The current status of mapping karst areas and availability of public sinkhole-risk resources in karst terrains of the United States

    Science.gov (United States)

    Kuniansky, Eve L.; Weary, David J.; Kaufmann, James E.

    2016-01-01

    Subsidence from sinkhole collapse is a common occurrence in areas underlain by water-soluble rocks such as carbonate and evaporite rocks, typical of karst terrain. Almost all 50 States within the United States (excluding Delaware and Rhode Island) have karst areas, with sinkhole damage highest in Florida, Texas, Alabama, Missouri, Kentucky, Tennessee, and Pennsylvania. A conservative estimate of losses to all types of ground subsidence was $125 million per year in 1997. This estimate may now be low, as review of cost reports from the last 15 years indicates that the cost of karst collapses in the United States averages more than $300 million per year. Knowing when a catastrophic event will occur is not possible; however, understanding where such occurrences are likely is possible. The US Geological Survey has developed and maintains national-scale maps of karst areas and areas prone to sinkhole formation. Several States provide additional resources for their citizens; Alabama, Colorado, Florida, Indiana, Iowa, Kentucky, Minnesota, Missouri, Ohio, and Pennsylvania maintain databases of sinkholes or karst features, with Florida, Kentucky, Missouri, and Ohio providing sinkhole reporting mechanisms for the public.

  4. Shape indexes for semi-automated detection of windbreaks in thematic tree cover maps from the central United States

    Science.gov (United States)

    Liknes, Greg C.; Meneguzzo, Dacia M.; Kellerman, Todd A.

    2017-07-01

    Windbreaks are an important ecological resource across the large expanse of agricultural land in the central United States and are often planted in straight-line or L-shaped configurations to serve specific functions. To detect them in high-resolution thematic tree cover maps, we evaluated three shape indexes: a morphology-based index that we have named the Straight and Narrow Feature Index (SNFI), a windbreak sinuosity index, and an area index indicating the occupied fractional area of a bounding box. The indexes were tested in two study areas: (1) a riparian area dominated by sinuous bands of trees but mixed with row crop agriculture and (2) an agricultural area with a mix of straight-line and L-shaped windbreaks. In the riparian area, a Kruskal-Wallis rank sum test indicated class differences for all three indexes, and pairwise comparisons indicate windbreaks and riparian trees are separable using any of the three indexes. SNFI also produced significant differences between windbreaks oriented in different directions (east-west vs. north-south). In the agricultural area, the Kruskal-Wallis rank sum test indicated differences between classes for all three indexes, and pairwise comparisons show that all class pairs have significant differences for at least one index, with the exception of L-shaped windbreaks vs. non-windbreak tree patches. We also used classification trees to objectively assign representative samples of tree patches to classes using both single indexes and multiple indexes. Classes were correctly assigned for more than 90% of the samples in both the riparian and agricultural study areas. In the riparian area, combining indexes did not improve accuracy compared to using SNFI alone, whereas in the agricultural area, combining the three indexes produced the best result. Thematic datasets derived from high-resolution imagery are becoming more available, and extracting useful information can be a challenge, partly due to the large amount of data to assess. Calculating the three shape indexes presented can assist with
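Two of the three indexes are simple enough to sketch (SNFI itself is morphology-based and not reproduced here; the function names and coordinates below are illustrative assumptions, not taken from the paper):

```python
import math

def area_index(patch_area, bbox_width, bbox_height):
    """Occupied fraction of the bounding box: near 1 for compact patches,
    small for straight, narrow windbreaks."""
    return patch_area / (bbox_width * bbox_height)

def sinuosity(path):
    """Centerline length over straight end-to-end distance: ~1 for a
    straight windbreak, larger for sinuous riparian tree bands."""
    length = sum(math.dist(p, q) for p, q in zip(path, path[1:]))
    return length / math.dist(path[0], path[-1])
```

A long, straight windbreak thus scores a sinuosity near 1 and a small area index, whereas a meandering riparian band scores higher on both.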

  5. A Compiler for CPPNs: Transforming Phenotypic Descriptions Into Genotypic Representations

    DEFF Research Database (Denmark)

    Risi, Sebastian

    2013-01-01

    , the question of how to start evolution from a promising part of the search space becomes more and more important. To address this challenge, we introduce the concept of a CPPN-Compiler, which allows the user to directly compile a high-level description of the desired starting structure into the CPPN itself......-specific regularities like symmetry or repetition. Thus the results presented in this paper open up a new research direction in GDS, in which specialized CPPN-Compilers for different domains could help to overcome the black box of evolutionary optimization....

  6. Efficient Compilation of a Class of Variational Forms

    CERN Document Server

    Kirby, Robert C

    2012-01-01

    We investigate the compilation of general multilinear variational forms over affine simplices and prove a representation theorem expressing the element tensor (element stiffness matrix) as the contraction of a constant reference tensor and a geometry tensor that accounts for geometry and variable coefficients. Based on this representation theorem, we design an algorithm for efficient pretabulation of the reference tensor. The new algorithm has been implemented in the FEniCS Form Compiler (FFC) and improves on a previous loop-based implementation by several orders of magnitude, thus shortening compile-times and development cycles for users of FFC.
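Schematically, the representation theorem factors the element tensor for each cell K into a contraction (notation is a sketch following the FFC literature, not copied from the abstract):

```latex
A^K_i \;=\; \sum_{\alpha} A^0_{i\alpha}\, G_K^{\alpha}
```

where A^0 is the constant reference tensor, tabulated once at compile time, and G_K is a small geometry tensor depending on the affine map and the variable coefficients. Only G_K must be evaluated per element at run time, which is where the orders-of-magnitude savings arise.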

  7. Compiler Optimization: A Case for the Transformation Tool Contest

    Directory of Open Access Journals (Sweden)

    Sebastian Buchwald

    2011-11-01

    An optimizing compiler consists of a front end parsing a textual programming language into an intermediate representation (IR), a middle end performing optimizations on the IR, and a back end lowering the IR to a target representation (TR) built of operations supported by the target hardware. In modern compiler construction, graph-based IRs are employed. Optimization and lowering tasks can then be implemented with graph transformation rules. This case provides two compiler tasks to evaluate the participating tools regarding performance.

  8. Writing Compilers and Interpreters A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

    Long-awaited revision to a unique guide that covers both compilers and interpreters Revised, updated, and now focusing on Java instead of C++, this long-awaited, latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  9. Geologic Map of the Atlin Quadrangle, Southeastern Alaska

    Science.gov (United States)

    Brew, David A.; Himmelberg, Glen R.; Ford, Arthur B.

    2009-01-01

    This map presents the results of U.S. Geological Survey (USGS) geologic bedrock mapping studies in the mostly glacier covered Atlin 1:250,000-scale quadrangle, northern southeastern Alaska. These studies are part of a long-term systematic effort by the USGS to provide bedrock geologic and mineral-resource information for all of southeastern Alaska, covering all of the Tongass National Forest (including Wilderness Areas) and Glacier Bay National Park and Preserve. Some contributions to this effort are those concerned with southwesternmost part of the region, the Craig and Dixon Entrance quadrangles (Brew, 1994; 1996) and with the Wrangell-Petersburg area (Brew, 1997a-m; Brew and Grybeck, 1997; Brew and Koch, 1997). As shown on the index map (fig. 1), the study area is almost entirely in the northern Coast Mountains adjacent to British Columbia, Canada. No previous geologic map has been published for the area, although Brew and Ford (1985) included a small part of it in a preliminary compilation of the adjoining Juneau quadrangle; and Brew and others (1991a) showed the geology at 1:500,000 scale. Areas mapped nearby in British Columbia and the United States are also shown on figure 1. All of the map area is in the Coast Mountains Complex as defined by Brew and others (1995a). A comprehensive bibliography is available for this and adjacent areas (Brew, 1997n).

  10. Mapping the Relative Probability of Common Toad Occurrence in Terrestrial Lowland Farm Habitat in the United Kingdom.

    Directory of Open Access Journals (Sweden)

    Rosie D Salazar

    The common toad (Bufo bufo) is of increasing conservation concern in the United Kingdom (UK) due to dramatic population declines occurring in the past century. Many of these population declines coincided with reductions in both terrestrial and aquatic habitat availability and quality and have been primarily attributed to the effect of agricultural land conversion (of natural and semi-natural habitats to arable and pasture fields) and pond drainage. However, there is little evidence available to link habitat availability with common toad population declines, especially when examined at a broad landscape scale. Assessing such patterns of population declines at the landscape scale, for instance, requires an understanding of how this species uses terrestrial habitat. We intensively studied the terrestrial resource selection of a large population of common toads in Oxfordshire, England, UK. Adult common toads were fitted with passive integrated transponder (PIT) tags to allow detection in the terrestrial environment using a portable PIT antenna once toads left the pond and before going into hibernation (April/May-October 2012 and 2013). We developed a population-level resource selection function (RSF) to assess the relative probability of toad occurrence in the terrestrial environment by collecting location data for 90 recaptured toads. The predicted relative probability of toad occurrence for this population was greatest in wooded habitat near to water bodies; relative probability of occurrence declined dramatically > 50 m from these habitats. Toads also tended to select habitat near to their breeding pond and toad occurrence was negatively related to urban environments.

  11. Mapping information exposure on social media to explain differences in HPV vaccine coverage in the United States.

    Science.gov (United States)

    Dunn, Adam G; Surian, Didi; Leask, Julie; Dey, Aditi; Mandl, Kenneth D; Coiera, Enrico

    2017-05-25

    Together with access, acceptance of vaccines affects human papillomavirus (HPV) vaccine coverage, yet little is known about the media's role. Our aim was to determine whether measures of information exposure derived from Twitter could be used to explain differences in coverage in the United States. We conducted an analysis of exposure to information about HPV vaccines on Twitter, derived from 273.8 million exposures to 258,418 tweets posted between 1 October 2013 and 30 October 2015. Tweets were classified by topic using machine learning methods. Proportional exposure to each topic was used to construct multivariable models for predicting state-level HPV vaccine coverage, which were compared to multivariable models constructed from socioeconomic factors: poverty, education, and insurance. Outcome measures included correlations between coverage and the individual topics and socioeconomic factors, and differences in the predictive performance of the multivariable models. Topics corresponding to media controversies were most closely correlated with coverage (both positively and negatively); among socioeconomic factors, education and insurance correlated most strongly. Measures of information exposure explained 68% of the variance in one-dose 2015 HPV vaccine coverage in females (males: 63%). In comparison, models based on socioeconomic factors explained 42% of the variance in females (males: 40%). Measures of information exposure derived from Twitter thus explained differences in coverage that were not explained by socioeconomic factors. Vaccine coverage was lower in states where safety concerns, misinformation, and conspiracies made up higher proportions of exposures, suggesting that negative representations of vaccines in the media may reflect or influence vaccine acceptance. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
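    The reported comparison (68% vs. 42% of variance explained) amounts to comparing R² between two multivariable regressions. A toy re-creation with simulated state-level numbers (not the study's data; variable names are hypothetical):

```python
import numpy as np

# Simulated stand-ins: 50 states, three topic-exposure proportions and
# three socioeconomic factors; coverage is driven mainly by the topics.
rng = np.random.default_rng(1)
n_states = 50
topic_exposure = rng.random((n_states, 3))  # e.g. safety, conspiracy, promotion
socioecon = rng.random((n_states, 3))       # e.g. poverty, education, insurance
coverage = (40 + 30 * topic_exposure[:, 0] - 20 * topic_exposure[:, 1]
            + 5 * socioecon[:, 1] + rng.normal(0, 3, n_states))

def r_squared(X, y):
    """Variance explained by an ordinary-least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_topics = r_squared(topic_exposure, coverage)
r2_socio = r_squared(socioecon, coverage)
assert 0 <= r2_socio < r2_topics <= 1  # topics explain far more variance here
```

    In this simulation the topic model dominates by construction; in the study the analogous gap (68% vs. 42%) was an empirical finding.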

  12. Geologic Map of the Santa Barbara Coastal Plain Area, Santa Barbara County, California

    Science.gov (United States)

    Minor, Scott A.; Kellogg, Karl S.; Stanley, Richard G.; Gurrola, Larry D.; Keller, Edward A.; Brandt, Theodore R.

    2009-01-01

    This report presents a newly revised and expanded digital geologic map of the Santa Barbara coastal plain area at a compilation scale of 1:24,000 (one inch on the map to 2,000 feet on the ground) and with a horizontal positional accuracy of at least 20 m. The map depicts the distribution of bedrock units and surficial deposits and associated deformation underlying and adjacent to the coastal plain within the contiguous Dos Pueblos Canyon, Goleta, Santa Barbara, and Carpinteria 7.5' quadrangles. The new map supersedes an earlier preliminary geologic map of the central part of the coastal plain (Minor and others, 2002; revised 2006) that provided coastal coverage only within the Goleta and Santa Barbara quadrangles. In addition to new mapping to the west and east, geologic mapping in parts of the central map area has been significantly revised from the preliminary map compilation - especially north of downtown Santa Barbara in the Mission Ridge area - based on new structural interpretations supplemented by new biostratigraphic data. All surficial and bedrock map units, including several new units recognized in the areas of expanded mapping, are described in detail in the accompanying pamphlet. Abundant new biostratigraphic and biochronologic data based on microfossil identifications are presented in expanded unit descriptions of the marine Neogene Monterey and Sisquoc Formations. Site-specific fault kinematic observations embedded in the digital map database are more complete owing to the addition of slip-sense determinations. Finally, the pamphlet accompanying the present report includes an expanded and refined summary of stratigraphic and structural observations and interpretations that are based on the composite geologic data contained in the new map compilation.
The Santa Barbara coastal plain is located in the western Transverse Ranges physiographic province along an east-west-trending segment of the southern California coastline about 100 km (62 mi) northwest of Los Angeles.

  13. Alaska NWRS Legacy Seabird Monitoring Data Inventory and Compilation

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The objective of this project is to compile and standardize data from the Alaska Peninsula/Becharof, Kodiak, Togiak, and Yukon Delta National Wildlife Refuges. This...

  14. Compiler for Fast, Accurate Mathematical Computing on Integer Processors Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposers will develop a computer language compiler to enable inexpensive, low-power, integer-only processors to carry out mathematically-intensive computations...
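    As an illustration of the kind of transformation such a compiler must perform (a generic fixed-point sketch, not the proposers' actual design), fractional arithmetic can be emulated on an integer-only processor with Q16.16 fixed point:

```python
# Q16.16 fixed point: 16 integer bits, 16 fractional bits.
SHIFT = 16

def to_fix(x: float) -> int:
    """Encode a float as a Q16.16 integer."""
    return int(round(x * (1 << SHIFT)))

def fix_mul(a: int, b: int) -> int:
    """Multiply two Q16.16 values; the wide product is shifted back down."""
    return (a * b) >> SHIFT

def to_float(a: int) -> float:
    """Decode a Q16.16 integer back to a float (for checking only)."""
    return a / (1 << SHIFT)

product = fix_mul(to_fix(1.5), to_fix(2.25))
assert to_float(product) == 3.375  # exact: both inputs fit the format
```

    Only integer multiplies and shifts appear in `fix_mul`, which is the point: the fractional semantics live entirely in the compiler's bookkeeping.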

  15. Compilation and Synthesis for Fault-Tolerant Digital Microfluidic Biochips

    DEFF Research Database (Denmark)

    Alistar, Mirela

    of electrodes to perform operations such as dispensing, transport, mixing, split, dilution and detection. Researchers have proposed compilation approaches, which, starting from a biochemical application and a biochip architecture, determine the allocation, resource binding, scheduling, placement and routing...

  16. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  17. Solid state technology: A compilation. [on semiconductor devices

    Science.gov (United States)

    1973-01-01

    A compilation, covering selected solid state devices developed and integrated into systems by NASA to improve performance, is presented. Data are also given on device shielding in hostile radiation environments.

  18. Specification and compilation of real-time stream processing applications

    NARCIS (Netherlands)

    Geuns, Stephanus Joannes

    2015-01-01

    This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. An example of such applications are software defined radio applications. These applications typically have

  19. Geologic Map of the Goleta Quadrangle, Santa Barbara County, California

    Science.gov (United States)

    Minor, Scott A.; Kellogg, Karl S.; Stanley, Richard G.; Brandt, Theodore R.

    2007-01-01

    This map depicts the distribution of bedrock units and surficial deposits and associated deformation underlying those parts of the Santa Barbara coastal plain and adjacent southern flank of the Santa Ynez Mountains within the Goleta 7.5' quadrangle at a compilation scale of 1:24,000 (one inch on the map = 2,000 feet on the ground) and with a horizontal positional accuracy of at least 20 m. The Goleta map overlaps an earlier preliminary geologic map of the central part of the coastal plain (Minor and others, 2002) that provided coverage within the coastal, central parts of the Goleta and contiguous Santa Barbara quadrangles. In addition to new mapping in the northern part of the Goleta quadrangle, geologic mapping in other parts of the map area has been revised from the preliminary map compilation based on new structural interpretations supplemented by new biostratigraphic data. All surficial and bedrock map units are described in detail in the accompanying map pamphlet. Abundant biostratigraphic and biochronologic data based on microfossil identifications are presented in expanded unit descriptions of the marine Neogene Monterey and Sisquoc Formations. Site-specific fault-kinematic observations (including slip-sense determinations) are embedded in the digital map database. The Goleta quadrangle is located in the western Transverse Ranges physiographic province along an east-west-trending segment of the southern California coastline about 100 km (62 mi) northwest of Los Angeles. The Santa Barbara coastal plain surface, which spans the central part of the quadrangle, includes several mesas and hills that are geomorphic expressions of underlying, potentially active folds and partly buried oblique and reverse faults of the Santa Barbara fold and fault belt (SBFFB). Strong earthquakes have occurred offshore within 10 km of the Santa Barbara coastal plain in 1925 (6.3 magnitude), 1941 (5.5 magnitude) and 1978 (5.1 magnitude). These and numerous smaller seismic events

  20. Trident: An FPGA Compiler Framework for Floating-Point Algorithms.

    Energy Technology Data Exchange (ETDEWEB)

    Tripp J. L. (Justin L.); Peterson, K. D. (Kristopher D.); Poznanovic, J. D. (Jeffrey Daniel); Ahrens, C. M. (Christine Marie); Gokhale, M. (Maya)

    2005-01-01

    Trident is a compiler for floating point algorithms written in C, producing circuits in reconfigurable logic that exploit the parallelism available in the input description. Trident automatically extracts parallelism and pipelines loop bodies using conventional compiler optimizations and scheduling techniques. Trident also provides an open framework for experimentation, analysis, and optimization of floating point algorithms on FPGAs and the flexibility to easily integrate custom floating point libraries.

  1. Approximate Compilation of Constraints into Multivalued Decision Diagrams

    DEFF Research Database (Denmark)

    Hadzic, Tarik; Hooker, John N.; O’Sullivan, Barry;

    2008-01-01

    We present an incremental refinement algorithm for approximate compilation of constraint satisfaction models into multivalued decision diagrams (MDDs). The algorithm uses a vertex splitting operation that relies on the detection of equivalent paths in the MDD. Although the algorithm is quite...... by replacing the equivalence test with a constraint-specific measure of distance. We demonstrate the value of the approach for approximate and exact MDD compilation and evaluate its benefits in one of the main MDD application domains, interactive configuration....
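    The equivalence test at the heart of such refinement can be illustrated with a toy layered diagram. The sketch below (hypothetical node names, plain Python rather than any MDD library) merges nodes in a layer whose outgoing edge maps are identical, the inverse of the vertex-splitting step:

```python
# Hypothetical three-node layer of a layered decision diagram:
# each node maps a domain value to a child in the next layer.
layer = {
    "u1": {0: "t", 1: "f"},
    "u2": {0: "t", 1: "f"},  # identical outgoing map -> equivalent to u1
    "u3": {0: "f", 1: "t"},
}

def merge_equivalent(layer):
    """Map every node to a canonical representative of its equivalence class."""
    canon, merged = {}, {}
    for node, edges in layer.items():
        key = tuple(sorted(edges.items()))  # signature of outgoing edges
        canon.setdefault(key, node)         # first node seen becomes canonical
        merged[node] = canon[key]
    return merged

merged = merge_equivalent(layer)
assert merged == {"u1": "u1", "u2": "u1", "u3": "u3"}
```

    Approximate compilation relaxes the exact-signature test (e.g. to a distance measure, as the abstract notes), trading diagram size for precision.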

  2. Compiler writing system detail design specification. Volume 2: Component specification

    Science.gov (United States)

    Arthur, W. J.

    1974-01-01

    The logic modules and data structures composing the Meta-translator module are described. This module is responsible for the actual generation of the executable language compiler as a function of the input Meta-language. Machine definitions are also processed and are placed as encoded data on the compiler library data file. The transformation of intermediate language into target language object text is described.

  3. On search guide phrase compilation for recommending home medical products.

    Science.gov (United States)

    Luo, Gang

    2010-01-01

    To help people find desired home medical products (HMPs), we developed an intelligent personal health record (iPHR) system that can automatically recommend HMPs based on users' health issues. Using nursing knowledge, we pre-compile a set of "search guide" phrases that provides semantic translation from words describing health issues to their underlying medical meanings. Then iPHR automatically generates queries from those phrases and uses them and a search engine to retrieve HMPs. To avoid missing relevant HMPs during retrieval, the compiled search guide phrases need to be comprehensive. Such compilation is a challenging task because nursing knowledge updates frequently and contains numerous details scattered in many sources. This paper presents a semi-automatic tool facilitating such compilation. Our idea is to formulate the phrase compilation task as a multi-label classification problem. For each newly obtained search guide phrase, we first use nursing knowledge and information retrieval techniques to identify a small set of potentially relevant classes with corresponding hints. Then a nurse makes the final decision on assigning this phrase to proper classes based on those hints. We demonstrate the effectiveness of our techniques by compiling search guide phrases from an occupational therapy textbook.
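    The candidate-retrieval step described (finding a small set of potentially relevant classes for a new phrase, with hints for a nurse to review) can be sketched as TF-IDF ranking. Everything below, including the class names and the phrase, is hypothetical and not taken from the iPHR system:

```python
from collections import Counter
import math

# Hypothetical classes, each represented by a bag of words drawn
# from nursing source material.
classes = {
    "mobility aids": "walker cane wheelchair transfer gait mobility",
    "bathing safety": "shower bath grab bar bench slip bathing",
    "wound care": "dressing gauze wound ulcer bandage care",
}

def tfidf_vectors(docs):
    """Weight each class's words by term frequency * inverse document frequency."""
    df = Counter(w for text in docs.values() for w in set(text.split()))
    n = len(docs)
    return {name: {w: c * math.log(n / df[w])
                   for w, c in Counter(text.split()).items()}
            for name, text in docs.items()}

def rank(phrase, vecs):
    """Order classes by cosine-style similarity to the phrase's word counts."""
    q = Counter(phrase.split())
    def score(v):
        dot = sum(q[w] * wt for w, wt in v.items())
        norm = math.sqrt(sum(wt * wt for wt in v.values())) or 1.0
        return dot / norm
    return sorted(vecs, key=lambda name: score(vecs[name]), reverse=True)

vecs = tfidf_vectors(classes)
top = rank("trouble with bathing and slip risk", vecs)[0]
assert top == "bathing safety"
```

    The ranked list plays the role of the "hints": the human reviewer, not the ranker, makes the final class assignment.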

  4. MAPPING IN MICRONESIA.

    Science.gov (United States)

    Olsen, Randle W.; Swinnerton, J.R.

    1984-01-01

    The U. S. Geological Survey has recently completed a series of new topographic maps of Micronesia in cooperation with the Trust Territory of the Pacific Islands, the Federal agency administering the islands. Monocolor 1:10,000-scale manuscripts were compiled, from which 1:25,000-scale metric quadrangles were derived with symbology consistent with USGS quadrangle mapping. The publication of these new maps coincides with the impending political changes resulting from self-determination referendums held in Micronesia. Local sources have helped considerably with field logistics and resolution of geographic name controversies. Technical aspects of this project included development of tropical feature symbology, location of cadastral subdivisions and associated boundaries and mapping of many outlying coral reefs.

  5. Geology of the Conterminous United States at 1:2,500,000 Scale -- A Digital Representation of the 1974 P.B. King and H.M. Beikman Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This CD-ROM contains a digital version of the Geologic Map of the United States, originally published at a scale of 1:2,500,000 (King and Beikman, 1974b). It...

  6. Geologic map of Indonesia - Peta geologi Indonesia

    Science.gov (United States)

    Sigit, Soetarjo

    1965-01-01

    The geology, compiled by Th. H. F. Klompe in 1954 from published and unpublished maps of the Direktorat Geologi, has been brought up to date on the basis of investigations carried out to 1962 (Ref. Sigit, Soetarjo, "I. A brief outline of the geology of the Indonesian Archipelago, and II. Geological map of Indonesia;" Direktorat Geologi publication, 1962.)

  7. Bedrock Geologic Map of Vermont - Dikes

    Data.gov (United States)

    Vermont Center for Geographic Information — The bedrock geology was last mapped at a statewide scale 50 years ago at a scale of 1:250,000 (Doll and others, 1961). The 1961 map was compiled from 1:62,500-scale...

  8. Compilation of the GSHAP regional seismic hazard for Europe, Africa and the Middle East

    Directory of Open Access Journals (Sweden)

    D. Mayer-Rosa

    1999-06-01

    Full Text Available The seismic hazard map of the larger Europe-Africa-Middle East region has been generated as part of the global GSHAP hazard map. The hazard, expressing Peak Ground Acceleration (PGA) expected at 10% probability of exceedance in 50 years, is obtained by combining the results of 16 independent regional and national projects; among these are the hazard assessments for Libya and for the wide sub-Saharan Western African region, specifically produced for this regional compilation and discussed here at some length. Features of enhanced seismic hazard are observed along the African rift zone and in the Alpine-Himalayan belt, where there is a general eastward increase in hazard, with peak levels in Greece, Turkey, the Caucasus, and Iran.
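    The hazard level used here has a standard interpretation: under a Poisson occurrence model, "10% probability of exceedance in 50 years" corresponds to a return period of roughly 475 years.

```python
import math

# Return period T implied by exceedance probability P over t years,
# assuming Poisson (memoryless) earthquake occurrence:
#   P = 1 - exp(-t / T)  =>  T = -t / ln(1 - P)
t, P = 50.0, 0.10
T = -t / math.log(1 - P)
print(round(T))  # 475 (years)
```

    This 475-year return period is the conventional reference level for seismic design maps worldwide, which is why GSHAP adopted it.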

  9. Angola Seismicity MAP

    Science.gov (United States)

    Neto, F. A. P.; Franca, G.

    2014-12-01

    The purpose of this work was to study and document Angola's natural seismicity and to establish the first database of seismic data, to facilitate consultation and searching of information on seismic activity in the country. The study was based on reports produced by the National Institute of Meteorology and Geophysics (INAMET) from 1968 to 2014, with emphasis on the work of Moreira (1968), which defined six seismogenic zones from macroseismic data; the most notable is the Sá da Bandeira (Lubango)-Chibemba-Oncócua-Iona zone. This is the most important seismic zone of Angola, covering the Quihita and Iona epicentral regions, geologically characterized by a transcontinental structure of Mesozoic tectono-magmatic activation with the emplacement of a wide variety of intrusive rocks of ultrabasic-alkaline, basic and alkaline composition, kimberlites and carbonatites, strongly marked by intense tectonism and cut by several faults and fractures (locally called the corredor de Lucapa). The earthquake of May 9, 1948 reached intensity VI on the Mercalli-Sieberg scale (MCS) in the locality of Quihita, and in the seismically active Iona area the main shock of January 15, 1964 reached grade VI-VII. Although their seismicity rates are not significant, the other five zones cannot be neglected: Cassongue-Ganda-Massano de Amorim; Lola-Quilengues-Caluquembe; the Gago Coutinho zone; Cuima-Cachingues-Cambândua; and the Upper Zambezi zone. We also analyzed technical reports on the seismicity of the middle Kwanza region produced by Hidroproekt (GAMEK), as well as international seismic bulletins of the International Seismological Centre (ISC) and United States Geological Survey (USGS); these data served for instrumental location of the epicenters. All compiled information made possible the creation of the first database of seismic data for Angola and the preparation of the seismicity map, with reconfirmation of the main seismic zones defined by Moreira (1968) and the identification of a new seismic

  10. Ground-water-quality data in Pennsylvania: A compilation of computerized [electronic] databases, 1979-2004

    Science.gov (United States)

    Low, Dennis J.; Chichester, Douglas C.

    2006-01-01

    This study, by the U.S. Geological Survey (USGS) in cooperation with the Pennsylvania Department of Environmental Protection (PADEP), provides a compilation of ground-water-quality data for a 25-year period (January 1, 1979, through August 11, 2004) based on water samples from wells. The data are from eight source agencies: Borough of Carroll Valley, Chester County Health Department, Pennsylvania Department of Environmental Protection-Ambient and Fixed Station Network, Montgomery County Health Department, Pennsylvania Drinking Water Information System, Pennsylvania Department of Agriculture, Susquehanna River Basin Commission, and the U.S. Geological Survey. The ground-water-quality data from the different source agencies varied in type and number of analyses; however, the analyses are represented by 12 major analyte groups: biological (bacteria and viruses), fungicides, herbicides, insecticides, major ions, minor ions (including trace elements), nutrients (dominantly nitrate and nitrite as nitrogen), pesticides, radiochemicals (dominantly radon or radium), volatile organic compounds, wastewater compounds, and water characteristics (dominantly field pH, field specific conductance, and hardness). A summary map shows the areal distribution of wells with ground-water-quality data statewide and by major watersheds and source agency. Maps of 35 watersheds within Pennsylvania are used to display the areal distribution of water-quality information. Additional maps emphasize the areal distribution with respect to 13 major geolithologic units in Pennsylvania and concentration ranges of nitrate (as nitrogen). Summary data tables by source agency provide information on the number of wells and samples collected for each of the 35 watersheds and analyte groups. The number of wells sampled for ground-water-quality data varies considerably across Pennsylvania. Of the 8,012 wells sampled, the greatest concentration of wells are in the southeast (Berks, Bucks, Chester, Delaware

  11. Geologic Map of the State of Hawai`i

    Science.gov (United States)

    Sherrod, David R.; Sinton, John M.; Watkins, Sarah E.; Brunt, Kelly M.

    2007-01-01

    1983 and the Universal Transverse Mercator system projection to zone 4. 'This digital statewide map allows engineers, consultants, and scientists from many different fields to take advantage of the geologic database,' said John Sinton, a geology professor at the University of Hawai`i, whose new mapping of the Wai`anae Range (West O`ahu) appears on the map. Indeed, when a testing version was first made available, most requests came from biologists, archaeologists, and soil scientists interested in applying the map's GIS database to their ongoing investigations. Another area newly depicted on the map, in addition to the Wai`anae Range, is Haleakala volcano, East Maui. So too for the active lava flows of Kilauea volcano, Island of Hawai`i, where the landscape has continued to evolve in the ten years since publication of the Big Island's revised geologic map. For the other islands, much of the map is compiled from mapping published in the 1930-1960s. This reliance stems partly from shortage of funding to undertake entirely new mapping but is warranted by the exemplary mapping of those early experts. The boundaries of all map units are digitized to show correctly on modern topographic maps.

  12. Global Geomorphometric Map of Mars

    Science.gov (United States)

    Jasiewicz, J.; Stepinski, T. F.

    2012-03-01

    A global geomorphometric map of Mars is generated from DEM using a novel computer algorithm. This map provides a new valuable tool for terrain analysis and objective quantification of surface units. Auto-mapping of surface units is a future application.

  13. Compilation of Quality Improvement Standards of General Dentistry Program in Islamic Republic

    Directory of Open Access Journals (Sweden)

    Fakhrossadat Hosseini

    Full Text Available Introduction: The importance of quality assurance makes standard compilation in educational systems a high-priority subject in medical education. The purpose of this study was to compile quality improvement standards for the general dentistry program in the Islamic Republic of Iran. Materials & Methods: This descriptive study was performed during 2011 and 2012 in three phases. In the first phase, previous literature and similar standards were examined in a comparative study and screened based on national health policies in the Health Map. The results were evaluated by 16 dental school representatives using a modified Delphi methodology; open-closed questionnaires were filled in by their faculty members and reported back to the dental secretariat of the Ministry of Health in the second phase. In the final phase, the results were evaluated in the secretariat by a focus group and the final criteria were introduced based on the secretariat's policies. Results: Fifty-eight criteria were created in the first phase. Data were collected from 13 faculties in the second phase (response rate = 81%). Eighteen criteria had less than 90% agreement among the participants; however, all of the criteria were agreed by more than 70% of the participants. In the final phase, 48 quality improvement standards in seven areas were accepted and introduced in the dental secretariat of the Ministry of Health. Conclusion: The final standard documents could be used as a national covenant of quality improvement, given their high agreement rate and alignment with national policies in the Health Map.

  14. Map Service Showing Geology, Oil and Gas Fields, and Geologic Provinces of Europe including Turkey

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This digitally compiled map includes geology, oil and gas fields, and geologic provinces of Europe. The oil and gas map is part of a worldwide series released on...

  15. Map Service Showing Geology, Oil and Gas Fields, and Geologic Provinces of South America

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This digitally compiled map includes geology, oil and gas fields, and geologic provinces of South America. The oil and gas map is part of a worldwide series released...

  16. Inquiry on exhibition enterprise budget compilation

    Institute of Scientific and Technical Information of China (English)

    段娟

    2012-01-01

    Combining working practice, this paper analyzes budget compilation for the whole course of exhibition enterprise project implementation, discusses budget compilation from the perspective of the construction unit, and puts forward a method of compilation by phases, thereby resolving the problems of inaccurate and subjective budget compilation.

  17. Geologic map of the Oasis Valley basin and vicinity, Nye County, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Fridrich, C.J.; Minor, S.A.; Ryder, P.L.; Slate, J.L.

    2000-01-13

    This map and accompanying cross sections present an updated synthesis of the geologic framework of the Oasis Valley area, a major groundwater discharge site located about 15 km west of the Nevada Test Site. Most of the data presented in this compilation is new geologic map data, as discussed below. In addition, the cross sections incorporate new geophysical data that have become available in the last three years (Grauch and others, 1997; written comm., 1999; Hildenbrand and others, 1999; Mankinen and others, 1999). Geophysical data are used to estimate the thickness of the Tertiary volcanic and sedimentary rocks on the cross sections, and to identify major concealed structures. Large contiguous parts of the map area are covered either by alluvium or by volcanic units deposited after development of the major structures present at the depth of the water table and below. Hence, geophysical data provide critical constraints on our geologic interpretations. A companion paper by Fridrich and others (1999) and the above-cited reports by Hildenbrand and others (1999) and Mankinen and others (1999) provide explanations of the interpretations that are presented graphically on this map. This map covers nine 7.5-minute quadrangles in Nye County, Nevada, centered on the Thirsty Canyon SW quadrangle, and is a compilation of one published quadrangle map (O'Connor and others, 1966) and eight new quadrangle maps, two of which have been previously released (Minor and others, 1997; 1998). The cross sections that accompany this map were drawn to a depth of about 5 km below land surface at the request of hydrologists who are modeling the Death Valley groundwater system.

  18. Retargeting of existing FORTRAN program and development of parallel compilers

    Science.gov (United States)

    Agrawal, Dharma P.

    1988-01-01

    The software models used in implementing the parallelizing compiler for the B-HIVE multiprocessor system are described. The various models and strategies used in the compiler development are: a flexible granularity model, which allows a compromise between two extreme granularity models; a communication model, which is capable of precisely describing interprocessor communication timings and patterns; a loop-type detection strategy, which identifies different types of loops; a critical-path-with-coloring scheme, which is a versatile scheduling strategy for any multicomputer with associated communication costs; and a loop allocation strategy, which realizes optimum overlapped operation between computation and communication. Using these models, several sample routines of the AIR3D package were examined and tested. The automatically generated code is highly parallelized, providing a maximized degree of parallelism and obtaining speedups of up to 28 on a 32-processor system. A comparison of parallel code under both the existing and the proposed communication model is performed, and the corresponding expected speedup factors are obtained. The experimentation shows that the B-HIVE compiler produces more efficient code than existing techniques. Work is progressing well toward completing the final phase of the compiler. Numerous enhancements are needed to improve the capabilities of the parallelizing compiler.
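    A toy version of critical-path list scheduling with communication costs (the general technique named above, not the B-HIVE implementation; all task times and costs are made up):

```python
# A four-task DAG: a feeds b and c, which both feed d.
tasks = {"a": 2, "b": 3, "c": 2, "d": 4}          # compute times
deps = {"b": ["a"], "c": ["a"], "d": ["b", "c"]}  # predecessors
comm = 1                                          # cross-processor cost

def bottom_level(t):
    """Critical-path priority: task time + longest path to an exit task."""
    succs = [s for s, ps in deps.items() if t in ps]
    return tasks[t] + max((bottom_level(s) for s in succs), default=0)

order = sorted(tasks, key=bottom_level, reverse=True)  # schedule order

# Greedy earliest-finish assignment to 2 processors, paying `comm`
# whenever a predecessor ran on the other processor.
finish, placement, proc_free = {}, {}, [0.0, 0.0]
for t in order:
    best = None
    for p in (0, 1):
        ready = max((finish[d] + (comm if placement[d] != p else 0)
                     for d in deps.get(t, [])), default=0.0)
        start = max(proc_free[p], ready)
        if best is None or start + tasks[t] < best[0]:
            best = (start + tasks[t], p)
    finish[t], placement[t] = best[0], best[1]
    proc_free[best[1]] = best[0]

assert order == ["a", "b", "c", "d"]
assert finish["d"] == 10  # vs. 11 time units if all tasks run sequentially
```

    The communication cost is what makes the placement decision non-trivial: a second processor only helps when the time saved by overlap exceeds the cost of moving data across the machine boundary.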

  19. CAPS OpenACC Compilers: Performance and Portability

    CERN Document Server

    CERN. Geneva

    2013-01-01

    The announcement in late 2011 of the new OpenACC directive-based programming standard, supported by the CAPS, CRAY and PGI compilers, has opened the door to more scientific applications that can be ported to many-core systems. Following a porting methodology, this talk will first review the principles of programming with OpenACC and then the advanced features available in the CAPS compilers to further optimize OpenACC applications: library integration and tuning directives with auto-tune mechanisms to build applications adaptive to different GPUs. CAPS compilers use hardware vendors' backends such as NVIDIA CUDA and OpenCL, making them the only OpenACC compilers supporting various many-core architectures. About the speaker: Stéphane Bihan is co-founder and currently Director of Sales and Marketing at CAPS enterprise. He has held several R&D positions in companies such as ARC International plc in London, Canon Research Center France, ACE compiler experts in Amsterdam and the INRIA r...

  20. The Conterminous United States Mineral Assessment Program; background information to accompany folio of geologic, geophysical, geochemical, mineral-occurrence, mineral-resource potential, and mineral-production maps of the Charlotte 1 degree x 2 degrees Quadrangle, North Carolina and South Carolina

    Science.gov (United States)

    Gair, Jacob Eugene; Goldsmith, Richard; Daniels, D.L.; Griffitts, W.R.; DeYoung, J.H.; Lee, M.P.

    1986-01-01

    This Circular and the folio of separately published maps described herein are part of a series of reports compiled under the Conterminous United States Mineral Assessment Program (CUSMAP). The folio on the Charlotte 1 degree x 2 degrees quadrangle, North Carolina and South Carolina, includes (1) a geologic map; (2) four geophysical maps; (3) geochemical maps for metamorphic heavy minerals, copper, lead and artifacts, zinc, gold, tin, beryllium, niobium, tungsten, molybdenum, titanium, cobalt, lithium, barium, antimony-arsenic-bismuth-cadmium, thorium-cerium-monazite, and limonite; (4) mineral-occurrence maps for kyanite-sillimanite-lithium-mica-feldspar-copper-lead-zinc, gold-quartz-barite-fluorite, iron-thorium-tin-niobium, and construction materials-gemstones; (5) mineral-resource potential maps for copper-lead-zinc-combined base metals, gold, tin-tungsten, beryllium-molybdenum-niobium, lithium-kyanite-sillimanite-barite, thorium (monazite)-uranium, and construction materials; and (6) mineral-production maps. The Charlotte quadrangle is mainly within the Piedmont physiographic province and extends from near the Coastal Plain on the southeast into the Blue Ridge province on the northwest for a short distance. Parts of six lithotectonic belts are present--the Blue Ridge, the Inner Piedmont, the Kings Mountain belt, the Charlotte belt, the Carolina slate belt, and the Wadesboro basin. Igneous, metamorphic, and sedimentary rocks are present and range in age from Proterozoic to Mesozoic; alluvial sediments of Quaternary age occur along rivers and larger streams. Rocks of the Blue Ridge include Middle Proterozoic granitoid gneiss intruded by Late Proterozoic granite; Late Proterozoic paragneiss, schist, and other metasedimentary and metavolcaniclastic rocks (Ashe and Grandfather Mountain Formations); Late Proterozoic and Early Cambrian metasedimentary rocks (Chilhowee Group); and Early Cambrian sedimentary rocks (Shady Dolomite). Paleozoic granites intrude the

  1. Compilation of current high energy physics experiments - Sept. 1978

    Energy Technology Data Exchange (ETDEWEB)

    Addis, L.; Odian, A.; Row, G. M.; Ward, C. E. W.; Wanderer, P.; Armenteros, R.; Joos, P.; Groves, T. H.; Oyanagi, Y.; Arnison, G. T. J.; Antipov, Yu; Barinov, N.

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche. (RWR)

  2. Preliminary surficial geologic map of the Newberry Springs 30' x 60' quadrangle, California

    Science.gov (United States)

    Phelps, G.A.; Bedford, D.R.; Lidke, D.J.; Miller, D.M.; Schmidt, K.M.

    2012-01-01

    The Newberry Springs 30' x 60' quadrangle is located in the central Mojave Desert of southern California. It is split approximately into northern and southern halves by I-40, with the city of Barstow at its western edge and the town of Ludlow near its eastern edge. The map area spans lat 34°30′ to 35° N. and long 116° to 117° W. and covers over 1,000 km². We integrate the results of surficial geologic mapping conducted during 2002-2005 with compilations of previous surficial mapping and bedrock geologic mapping. Quaternary units are subdivided in detail on the map to distinguish variations in age, process of formation, pedogenesis, lithology, and spatial interdependency, whereas pre-Quaternary bedrock units are grouped into generalized assemblages that emphasize their attributes as hillslope-forming materials and sources of parent material for the Quaternary units. The spatial information in this publication is presented in two forms: a spatial database and a geologic map. The geologic map is a view (the display of an extracted subset of the database at a given time) of the spatial database; it highlights key aspects of the database and necessarily does not show all of the data contained therein. The database contains detailed information about Quaternary geologic unit composition, authorship, and notes regarding geologic units, faults, contacts, and local vegetation. The amount of information contained in the database is too large to show on a single map, so a restricted subset of the information was chosen to summarize the overall nature of the geology. Refer to the database for additional information. Accompanying the spatial data are the map documentation and spatial metadata. The map documentation (this document) describes the geologic setting and history of the Newberry Springs map sheet, summarizes the age and physical character of each map unit, and describes principal faults and folds. The Federal Geographic Data Committee (FGDC) compliant metadata

  3. The national assessment of shoreline change: a GIS compilation of vector cliff edges and associated cliff erosion data for the California coast

    Science.gov (United States)

    Hapke, Cheryl; Reid, David; Borrelli, Mark

    2007-01-01

    The U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector cliff edges and associated rates of cliff retreat along the open-ocean California coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Cliff erosion is a chronic problem along many coastlines of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of coastal cliff retreat. There is also a critical need for these data to be consistent from one region to another. One objective of this work is to develop a standard, repeatable methodology for mapping and analyzing cliff edge retreat so that periodic, systematic, and internally consistent updates of cliff edge position and associated rates of erosion can be made at a national scale. This data compilation of open-ocean cliff edges for the California coast is a separate yet related study to that of Hapke and others (2006), which documents shoreline change along sandy shorelines of the California coast and is itself one in a series that includes the Gulf of Mexico and the Southeast Atlantic coast (Morton and others, 2004; Morton and Miller, 2005). Future reports and data compilations will include coverage of the Northeast U.S., the Great Lakes, Hawaii and Alaska. Cliff edge change is determined by comparing the positions of one historical cliff edge digitized from maps with a modern cliff edge derived from topographic LIDAR (light detection and ranging) surveys. Historical cliff edges for the California coast represent the 1920s-1930s time-period; the most recent cliff edge was delineated using data collected between 1998 and 2002. End-point rate calculations were used to evaluate rates of erosion between the two cliff edges. Please refer to our full report on cliff edge erosion along the California
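The end-point rate mentioned above is simple arithmetic: the net movement of the cliff edge between the two survey dates divided by the elapsed time. A minimal sketch, with purely hypothetical positions and dates rather than actual project data:

```python
def end_point_rate(d_old, d_new, year_old, year_new):
    """Net change in cliff-edge position (m) per year; negative
    values indicate landward retreat along the transect."""
    return (d_new - d_old) / (year_new - year_old)

# Hypothetical transect: edge position measured from an offshore
# baseline moved from 50.0 m to 42.5 m between 1930 and 2000,
# i.e. -7.5 m of change over 70 years.
rate_m_per_yr = end_point_rate(50.0, 42.5, 1930, 2000)
```

End-point rates use only the two endpoint positions, which is why consistent digitizing of both the historical and lidar-derived edges matters so much for comparability between regions.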

  4. Geologic map of the Peach Orchard Flat quadrangle, Carbon County, Wyoming, and descriptions of new stratigraphic units in the Upper Cretaceous Lance Formation and Paleocene Fort Union Formation, eastern Greater Green River Basin, Wyoming-Colorado

    Science.gov (United States)

    Honey, J.D.; Hettinger, R.D.

    2004-01-01

    This report provides a geologic map of the Peach Orchard Flat 7.5-minute quadrangle, located along the eastern flank of the Washakie Basin, Wyo. Geologic formations and individual coal beds were mapped at a scale of 1:24,000; surface stratigraphic sections were measured and described; and well logs were examined to determine coal correlations and thicknesses in the subsurface. In addition, four lithostratigraphic units were named: the Red Rim Member of the Upper Cretaceous Lance Formation, and the China Butte, Blue Gap, and Overland Members of the Paleocene Fort Union Formation.

  5. A DRAM compiler algorithm for high performance VLSI embedded memories

    Science.gov (United States)

    Eldin, A. G.

    1992-01-01

    In many applications, the limited density of embedded SRAM does not allow integrating the memory on the same chip with other logic and functional blocks. In such cases, embedded DRAM provides the optimum combination of very high density, low power, and high performance. For ASICs to take full advantage of this design strategy, an efficient and highly reliable DRAM compiler must be used. The embedded DRAM architecture, cell, and peripheral circuit design considerations and the algorithm of a high performance memory compiler are presented.

  6. Compilation of current high-energy-physics experiments

    Energy Technology Data Exchange (ETDEWEB)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1980-04-01

    This is the third edition of a compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and ten participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about January 1980, and (2) had not completed taking of data by 1 January 1976.

  7. The necessity of needs analysis in textbook compilation

    Institute of Scientific and Technical Information of China (English)

    姚茂

    2014-01-01

    Needs analysis plays an important role in textbook compilation. Compiling an excellent textbook requires meeting many conditions, but the starting point of any textbook should be meeting the needs of its users. Needs analysis should therefore be carried out to understand users' needs, so that the textbook better reflects relevance and practicality. Only when textbook writers fully understand the actual teaching demands of users (students, teachers, and education department managers) can they produce applicable materials.

  8. Map Service Showing Geologic and Geophysical Data of Bangladesh

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This map service includes geology, major faults, geologic provinces, and political boundaries in Bangladesh. This compilation is part of an interim product of the...

  9. A Compilation of MATLAB Scripts and Functions for MACGMC Analyses

    Science.gov (United States)

    Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.

    2017-01-01

    The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The MathWorks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase user friendliness, as well as to make the code more robust in terms of input preparation and execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for CMCs and PMCs, visualization of RUCs from MACGMC input and output files, and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response. In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte Carlo simulations to enable probabilistic

  10. UAV Data Processing for Large Scale Topographical Mapping

    Science.gov (United States)

    Tampubolon, W.; Reinhardt, W.

    2014-06-01

    Large scale topographical mapping in third world countries is a prominent challenge in the geospatial industry nowadays. On one side the demand is increasing significantly, while on the other hand it is constrained by the limited budgets available for mapping projects. Since the advent of Act No. 4/2011 on Geospatial Information in Indonesia, large scale topographical mapping has been a high priority for supporting nationwide development, e.g. detailed spatial planning. Large scale topographical mapping usually relies on conventional aerial survey campaigns to provide high resolution 3D geospatial data sources. Having grown widely as a leisure hobby, aero models in the form of the so-called Unmanned Aerial Vehicle (UAV) offer an alternative, semi-photogrammetric means of aerial data acquisition suitable for a relatively small Area of Interest (AOI); in Indonesia this area size can be used as a mapping unit, since it usually corresponds to the sub-district (kecamatan) level. In this paper different camera and processing software systems are analyzed to identify the optimum components of a UAV data acquisition campaign in combination with the data processing scheme. The selected AOI covers the cultural heritage site of Borobudur Temple, one of the Seven Wonders of the World. A detailed accuracy assessment concentrates in the first place on the object features of the temple. Feature compilation involving planimetric objects (2D) and digital terrain models (3D) is integrated in order to provide Digital Elevation Models (DEM) as the main interest of the topographic mapping activity. Incorporating the optimum number of GCPs in the UAV photo data processing increases the accuracy along with its high resolution of 5 cm Ground Sampling Distance (GSD). Finally this result will be used as the benchmark for alternative geospatial data acquisition in the future in which it can support
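The 5 cm GSD quoted above follows directly from camera geometry: ground sampling distance equals sensor pixel pitch times flying height divided by focal length. A quick sketch with hypothetical camera parameters (not the ones used in the paper):

```python
def ground_sampling_distance(pixel_pitch_mm, focal_length_mm, height_m):
    """GSD (m/pixel) = pixel pitch x flying height / focal length,
    by similar triangles through the perspective center."""
    return pixel_pitch_mm * height_m / focal_length_mm

# Illustrative values: 5 um (0.005 mm) pixels, a 25 mm lens,
# flying 250 m above ground level.
gsd_m = ground_sampling_distance(0.005, 25.0, 250.0)  # 0.05 m = 5 cm
```

The same relation is what ties the choice of flying height and camera to the achievable map scale, so it is usually the first number checked when planning a UAV campaign.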

  11. Generating a Danish raster-based topsoil property map combining choropleth maps and point information

    DEFF Research Database (Denmark)

    Greve, Mogens H.; Greve, Mette B.; Bøcher, Peder K.

    2007-01-01

    classification flaws. The objective of this work is to compile a continuous national topsoil texture map to replace the old topsoil map. Approximately 45,000 point samples were interpolated using ordinary kriging in 250 m x 250 m cells. To reduce variability and to obtain more homogeneous strata, the samples...
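Ordinary kriging, as used above to grid the point samples into 250 m x 250 m cells, predicts each cell value as a weighted average of nearby samples, with weights obtained from a covariance (variogram) model plus an unbiasedness constraint. A minimal NumPy sketch; the exponential covariance model and its sill/range values here are illustrative stand-ins, not the variogram fitted to the Danish topsoil samples:

```python
import numpy as np

def ordinary_kriging(xy, z, x0, sill=1.0, rng=500.0):
    """Predict the value at x0 as a weighted mean of samples (xy, z),
    solving the bordered system that enforces sum(weights) == 1.
    Covariance model: C(h) = sill * exp(-h / rng)."""
    n = len(z)
    h = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = sill * np.exp(-h / rng)   # sample-sample covariances
    A[:n, n] = A[n, :n] = 1.0             # unbiasedness constraint
    A[n, n] = 0.0
    b = np.empty(n + 1)
    b[:n] = sill * np.exp(-np.linalg.norm(xy - x0, axis=1) / rng)
    b[n] = 1.0
    w = np.linalg.solve(A, b)             # kriging weights + multiplier
    return float(w[:n] @ z)

# Three samples on a 250 m grid; kriging is an exact interpolator,
# so predicting at a sample location returns that sample's value.
pts = np.array([[0.0, 0.0], [250.0, 0.0], [0.0, 250.0]])
vals = np.array([2.1, 3.4, 2.8])
pred = ordinary_kriging(pts, vals, np.array([0.0, 0.0]))
```

At the scale of roughly 45,000 samples, production work would use a local search neighborhood and a fitted variogram per stratum rather than this dense global solve.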

  12. Geologic Maps as the Foundation of Mineral-Hazards Maps in California

    Science.gov (United States)

    Higgins, C. T.; Churchill, R. K.; Downey, C. I.; Clinkenbeard, J. P.; Fonseca, M. C.

    2010-12-01

    The basic geologic map is essential to the development of products that help planners, engineers, government officials, and the general public make decisions concerning natural hazards. Such maps are the primary foundation that the California Geological Survey (CGS) uses to prepare maps that show mineral-hazard potential. Examples of clients that request these maps are the California Department of Transportation (Caltrans) and the California Department of Public Health (CDPH). Largely because of their non-catastrophic nature, mineral hazards have received much less public attention than earthquakes, landslides, volcanic eruptions, and floods. Nonetheless, mineral hazards can be a major concern locally when considering human health and safety and potential contamination of the environment by human activities such as disposal of earth materials. To address some of these concerns, the CGS has focused its mineral-hazards maps on naturally occurring asbestos (NOA), radon, and various potentially toxic metals, as well as certain artificial features such as mines and oil and gas wells. The maps range in scope from statewide to counties and Caltrans districts to segments of selected highways. To develop the hazard maps, the CGS begins with traditional paper and digital versions of basic geologic maps, which are obtained from many sources such as its own files, the USGS, the USDA Forest Service, the California Department of Water Resources, and counties. For each study area, these maps present many challenges of compilation related to vintage, scale, definition of units, and edge-matching across map boundaries. The result of each CGS compilation is a digital geologic layer that is subsequently reinterpreted and transformed into new digital layers (e.g., lithologic) that focus on the geochemical and mineralogical properties of the area's earth materials and structures. These intermediate layers are then integrated with other technical data to derive final digital layers

  13. [Mapping of road traffic accidents with pedestrians in the territory of a Local Health Unit of Rome through integration of administrative and health data].

    Science.gov (United States)

    D'Alessandro, D; Paone, M; Salvatori, R; Ciaramella, I

    2010-01-01

    The study analyzes the distribution of traffic accidents with injuries that involved pedestrians and occurred in the territory of the Local Health Unit (LHU) Rome B, in order to identify areas at higher risk and to implement preventive measures. The road traffic injury (RTI) reports issued by the Municipal police (2003) were examined. Data were linked with those collected by the Regional Emergency Information System (EIS) and processed using the programs ArcGIS, Access, and Excel in order to obtain descriptive maps. During the period under review, 423 pedestrians were involved in 392 accidents (11.2% of total accidents); 34% suffered serious injuries and 4% died. Of these, 73% were elderly (> or = 65 years). The hours between 18:00 and 20:00 are the most critical, with an accident rate 3.7 times higher than the average, and 6.4 times higher among the young. Accidents happen especially on straight roads (66%) or at intersections (22%); failure to comply with traffic lights causes 5% of the events. Municipality V concentrates the highest percentage of accidents (30%). In this municipality, Tiburtina road, in the part closest to the center of the city, is the road at highest risk, with a rate of about 7.6 accidents/km (average of municipalities: 0.23/km; value for the entire LHU: 0.19/km). The neighborhoods closer to the center of the city show a higher risk for pedestrians: 1 event per km takes place in the Tuscolano neighborhood, 0.7/km in Prenestino-Centocelle, and 0.5/km in Pietralata-Collatino, all neighborhoods close to roads with high flow and high risk. The survey highlighted some road black spots on which it will be appropriate to act with specific preventive measures.
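The per-kilometre figures above are linear densities: accident count divided by road-segment length, used to rank segments against the LHU-wide average. A trivial sketch; the counts below are made up for illustration, since the report gives only the resulting rates:

```python
def accidents_per_km(n_accidents, road_km):
    """Linear accident density used to rank road segments."""
    return n_accidents / road_km

# Hypothetical segment: 19 pedestrian accidents over 2.5 km of road,
# compared against an area-wide baseline density.
density = accidents_per_km(19, 2.5)          # 7.6 accidents/km
relative_risk = density / 0.19               # vs. the LHU-wide 0.19/km
```

Normalizing by length (rather than raw counts) is what lets short, dense urban segments such as the inner stretch of Tiburtina road stand out against much longer suburban roads.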

  14. GPU-S2S: a source-to-source compiler for GPU

    Institute of Scientific and Technical Information of China (English)

    李丹; 曹海军; 董小社; 张保

    2012-01-01

    To address the poor software portability and programmability of the graphics processing unit (GPU), and to facilitate the development of parallel programs on GPUs, this study proposed a novel directive-based, compiler-guided approach. GPU-S2S, a prototype tool for automatic source-to-source translation, was implemented by combining automatic mapping with static compilation configuration; it translates a sequential C program annotated with directives into a compute unified device architecture (CUDA) program. The experimental results show that the CUDA code generated by GPU-S2S achieves performance comparable to that of the CUDA benchmarks provided with the NVIDIA CUDA SDK, and shows significant performance improvement over the original sequential C code executed on the CPU.
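The abstract does not reproduce GPU-S2S's actual directive syntax, so the sketch below invents a `#pragma gpu parallel_for` directive purely to illustrate the general idea of directive-driven source-to-source translation: a marked sequential C loop is rewritten into a CUDA kernel plus a launch. A real translator would also hoist the array arguments and insert host/device memory transfers.

```python
import re

# Matches a hypothetical directive followed by a canonical
# "for (int i = 0; i < N; i++) stmt;" loop.
_LOOP = re.compile(
    r"#pragma gpu parallel_for\s*\n"
    r"\s*for \(int (\w+) = 0; \1 < (\w+); \1\+\+\)\s*(.+;)")

def translate(c_src):
    """Rewrite each annotated sequential loop into CUDA-style text."""
    def repl(m):
        i, n, body = m.groups()
        return (f"__global__ void kernel(int {n}) {{\n"
                f"  int {i} = blockIdx.x * blockDim.x + threadIdx.x;\n"
                f"  if ({i} < {n}) {body}\n"
                f"}}\n"
                f"kernel<<<({n} + 255) / 256, 256>>>({n});")
    return _LOOP.sub(repl, c_src)

demo = translate("#pragma gpu parallel_for\n"
                 "for (int i = 0; i < n; i++) a[i] = b[i] + c[i];")
```

The key move, common to directive-based systems in general, is that the loop induction variable becomes the thread index and the loop bound becomes a guard, so the sequential semantics carry over thread-per-iteration.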

  15. 22 CFR 519.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    ... Relations of the Senate and the Committee on Foreign Affairs of the House of Representatives or the... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Semi-annual compilation. 519.600 Section 519.600 Foreign Relations BROADCASTING BOARD OF GOVERNORS NEW RESTRICTIONS ON LOBBYING Agency Reports § 519.600...

  16. 15 CFR 28.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... the Committee on Foreign Relations of the Senate and the Committee on Foreign Affairs of the House of... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Semi-annual compilation. 28.600 Section 28.600 Commerce and Foreign Trade Office of the Secretary of Commerce NEW RESTRICTIONS ON LOBBYING...

  17. 22 CFR 138.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    ... Relations of the Senate and the Committee on Foreign Affairs of the House of Representatives or the... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Semi-annual compilation. 138.600 Section 138.600 Foreign Relations DEPARTMENT OF STATE MISCELLANEOUS NEW RESTRICTIONS ON LOBBYING Agency Reports...

  18. 22 CFR 712.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    ... the Committee on Foreign Relations of the Senate and the Committee on Foreign Affairs of the House of... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Semi-annual compilation. 712.600 Section 712.600 Foreign Relations OVERSEAS PRIVATE INVESTMENT CORPORATION ADMINISTRATIVE PROVISIONS NEW RESTRICTIONS ON...

  19. 22 CFR 311.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    ... Senate and the Committee on Foreign Affairs of the House of Representatives or the Committees on Armed... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Semi-annual compilation. 311.600 Section 311.600 Foreign Relations PEACE CORPS NEW RESTRICTIONS ON LOBBYING Agency Reports § 311.600 Semi-annual...

  20. 22 CFR 227.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    ... Relations of the Senate and the Committee on Foreign Affairs of the House of Representatives or the... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Semi-annual compilation. 227.600 Section 227.600 Foreign Relations AGENCY FOR INTERNATIONAL DEVELOPMENT NEW RESTRICTIONS ON LOBBYING Agency Reports...

  1. 13 CFR 146.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Semi-annual compilation. 146.600 Section 146.600 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW RESTRICTIONS ON LOBBYING.... (c) Information that involves intelligence matters shall be reported only to the Select Committee on...

  2. Effective Compiler Error Message Enhancement for Novice Programming Students

    Science.gov (United States)

    Becker, Brett A.; Glanville, Graham; Iwashima, Ricardo; McDonnell, Claire; Goslin, Kyle; Mooney, Catherine

    2016-01-01

    Programming is an essential skill that many computing students are expected to master. However, programming can be difficult to learn. Successfully interpreting compiler error messages (CEMs) is crucial for correcting errors and progressing toward success in programming. Yet these messages are often difficult to understand and pose a barrier to…

  3. A Journey from Interpreters to Compilers and Virtual Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We review a simple sequence of steps to stage a programming-language interpreter into a compiler and virtual machine. We illustrate the applicability of this derivation with a number of existing virtual machines, mostly for functional languages. We then outline its relevance for today's language...
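The staging the abstract describes can be illustrated on a tiny arithmetic language: the direct interpreter computes values, while its staged counterpart emits instructions for a stack machine that a small virtual machine then runs. This representation (tuples, a stack VM) is my own hedged illustration, not Danvy's actual derivation:

```python
def interpret(e):
    """Direct interpreter: e is ('lit', n) or ('add', e1, e2)."""
    if e[0] == "lit":
        return e[1]
    return interpret(e[1]) + interpret(e[2])

def compile_expr(e):
    """Staged interpreter: emit stack-machine code instead of
    computing, preserving the interpreter's recursion structure."""
    if e[0] == "lit":
        return [("push", e[1])]
    return compile_expr(e[1]) + compile_expr(e[2]) + [("add",)]

def run(code):
    """Virtual machine for the emitted stack code."""
    stack = []
    for instr in code:
        if instr[0] == "push":
            stack.append(instr[1])
        else:  # ("add",)
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack.pop()

# Compiler + VM agree with the interpreter on 1 + (2 + 3).
expr = ("add", ("lit", 1), ("add", ("lit", 2), ("lit", 3)))
assert interpret(expr) == run(compile_expr(expr)) == 6
```

The point of the derivation is that compiler and VM are not invented independently: both fall out of the interpreter by systematic transformation, which is why the correctness equation `interpret(e) == run(compile_expr(e))` holds by construction.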

  4. Compiler Optimization Pass Visualization: The Procedural Abstraction Case

    Science.gov (United States)

    Schaeckeler, Stefan; Shang, Weijia; Davis, Ruth

    2009-01-01

    There is an active research community concentrating on visualizations of algorithms taught in CS1 and CS2 courses. These visualizations can help students create concrete visual images of the algorithms and their underlying concepts. Not only can "fundamental algorithms" be visualized, but so can algorithms used in compilers. Visualizations that…

  5. Compilation of a global inventory of emissions of nitrous oxide.

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

    A global inventory with 1°x1° resolution was compiled of emissions of nitrous oxide (N 2 O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing, oceans, fossil fuel and bi

  6. Updated site compilation of the Latin American Pollen Database

    NARCIS (Netherlands)

    S.G.A. Flantua; H. Hooghiemstra; E.C. Grimm; H. Behling; M.B Bush; C. González-Arrango; W.D. Gosling; M.-P. Ledru; S. Lozano-Garciá; A. Maldonado; A.R. Prieto; V. Rull; J.H. van Boxel

    2015-01-01

    The updated inventory of the Latin American Pollen Database (LAPD) offers a wide range of new insights. This paper presents a systematic compilation of palynological research in Latin America. A comprehensive inventory of publications in peer-reviewed and grey literature shows a major expansion of s

  7. Compiler Optimization Techniques for OpenMP Programs

    Directory of Open Access Journals (Sweden)

    Shigehisa Satoh

    2001-01-01

    We have developed compiler optimization techniques for explicit parallel programs using the OpenMP API. To enable optimization across threads, we designed dataflow analysis techniques in which interactions between threads are effectively modeled. Structured description of parallelism and relaxed memory consistency in OpenMP make the analyses effective and efficient. We developed algorithms for reaching definitions analysis, memory synchronization analysis, and cross-loop data dependence analysis for parallel loops. Our primary target is compiler-directed software distributed shared memory systems in which aggressive compiler optimizations for software-implemented coherence schemes are crucial to obtaining good performance. We also developed optimizations applicable to general OpenMP implementations, namely redundant barrier removal and privatization of dynamically allocated objects. Experimental results for the coherency optimization show that aggressive compiler optimizations are quite effective for a shared-write intensive program because the coherence-induced communication volume in such a program is much larger than that in shared-read intensive programs.
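One of the optimizations named above, redundant barrier removal, can be illustrated on a toy instruction stream: a barrier is redundant when no shared-memory access has occurred since the previous barrier, so the threads have nothing new to synchronize on. This flat-list model is a simplification of my own; a real compiler performs the analysis on a control-flow graph with the memory-synchronization information the paper describes:

```python
def remove_redundant_barriers(ops):
    """Keep a barrier only if some shared-memory access ('load x' /
    'store x' ops in this toy IR) happened since the last kept
    barrier; otherwise drop it as redundant."""
    out, shared_access = [], True  # the first barrier seen is kept
    for op in ops:
        if op == "barrier":
            if shared_access:
                out.append(op)
                shared_access = False  # nothing new to synchronize yet
        else:
            out.append(op)
            if op.startswith(("load ", "store ")):
                shared_access = True
    return out

# Back-to-back barriers with no intervening shared access: the
# second one is removed.
ops = ["store x", "barrier", "barrier", "load x", "barrier"]
optimized = remove_redundant_barriers(ops)
```

In a software DSM setting this matters doubly, since each barrier also triggers coherence traffic; removing a redundant barrier removes its communication as well.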

  8. 5 CFR 9701.524 - Compilation and publication of data.

    Science.gov (United States)

    2010-01-01

    ... MANAGEMENT SYSTEM (DEPARTMENT OF HOMELAND SECURITY-OFFICE OF PERSONNEL MANAGEMENT) DEPARTMENT OF HOMELAND SECURITY HUMAN RESOURCES MANAGEMENT SYSTEM Labor-Management Relations § 9701.524 Compilation and... agreements and arbitration decisions and publish the texts of its impasse resolution decisions and...

  9. Calculating Certified Compilers for Non-deterministic Languages

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2015-01-01

    Reasoning about programming languages with non-deterministic semantics entails many difficulties. For instance, to prove correctness of a compiler for such a language, one typically has to split the correctness property into a soundness and a completeness part, and then prove these two parts...

  10. Compilation of a global inventory of emissions of nitrous oxide

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

    A global inventory with 1°x1° resolution was compiled of emissions of nitrous oxide (N 2 O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing,

  11. Compilation of historical information of 300 Area facilities and activities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information on 300 Area activities and facilities from their beginnings. The 300 Area is shown as it looked in 1945, and again as it appeared more recently, in 1985.

  12. Experience with PASCAL compilers on mini-computers

    CERN Document Server

    Bates, D

    1977-01-01

    This paper relates the history of an implementation of the language PASCAL on a minicomputer. The unnecessary difficulties encountered along the way led the authors to reflect on the distribution of 'portable' compilers in general and to suggest some guidelines for the future. (4 refs).

  13. Final report: Compiled MPI. Cost-Effective Exascale Application Development

    Energy Technology Data Exchange (ETDEWEB)

    Gropp, William Douglas [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-12-21

    This is the final report on Compiled MPI: Cost-Effective Exascale Application Development, and summarizes the results of this project. The project investigated runtime environments that improve the performance of MPI (Message-Passing Interface) programs; work at Illinois in the last period of this project looked at optimizing data accesses expressed with MPI datatypes.

  14. Indexed compilation of experimental high energy physics literature. [Synopsis

    Energy Technology Data Exchange (ETDEWEB)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is given, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  15. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  16. Digital geologic map of part of the Thompson Falls 1:100,000 quadrangle, Idaho

    Science.gov (United States)

    Lewis, Reed S.; Derkey, Pamela D.

    1999-01-01

    The geology of the Thompson Falls 1:100,000 quadrangle, Idaho was compiled by Reed S. Lewis in 1997 onto a 1:100,000-scale greenline mylar of the topographic base map for input into a geographic information system (GIS). The resulting digital geologic map GIS can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included: they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:100,000 (e.g., 1:62,500 or 1:24,000). The map area is located in north Idaho. This open-file report describes the geologic map units, the methods used to convert the geologic map data into a digital format, the Arc/Info GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet.

  17. A Language for Specifying Compiler Optimizations for Generic Software

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, Jeremiah J. [Indiana Univ., Bloomington, IN (United States)

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.
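    Pavilion's actual syntax is not reproduced here. As a minimal illustration of the underlying idea, that a library author can declare a semantics-preserving rewrite a general-purpose compiler could not prove on its own, the sketch below uses Python's ast module to elide a redundant double sort: only the author of sorted knows that sorting an already sorted list is the identity. The rule and example program are invented.

```python
import ast

class ElideDoubleSort(ast.NodeTransformer):
    # Library-level rewrite rule: sorted(sorted(x)) == sorted(x).
    # This fact rests on the library's semantics, which a generic
    # compiler cannot derive from the call sites alone.
    def visit_Call(self, node):
        self.generic_visit(node)  # rewrite inner calls first
        if (isinstance(node.func, ast.Name) and node.func.id == "sorted"
                and len(node.args) == 1 and not node.keywords):
            inner = node.args[0]
            if (isinstance(inner, ast.Call) and isinstance(inner.func, ast.Name)
                    and inner.func.id == "sorted"):
                return inner  # drop the redundant outer sort
        return node

tree = ast.parse("result = sorted(sorted(data))")
tree = ElideDoubleSort().visit(tree)
print(ast.unparse(tree))  # result = sorted(data)
```

    A system like the one described in this record would let such rules be stated declaratively rather than hand-coded as tree transformers.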

  18. Bitwise identical compiling setup: prospective for reproducibility and reliability of earth system modeling

    Directory of Open Access Journals (Sweden)

    R. Li

    2015-11-01

    Full Text Available Reproducibility and reliability are fundamental principles of scientific research. A compiling setup that includes a specific compiler version and compiler flags is an essential technical support for Earth system modeling. With the fast development of computer software and hardware, a compiling setup has to be updated frequently, which challenges the reproducibility and reliability of Earth system modeling. The existing results of a simulation using an original compiling setup may be irreproducible under a newer compiling setup because trivial round-off errors introduced by the change of compiling setup can potentially trigger significant changes in simulation results. Regarding reliability, a compiler with millions of lines of code may have bugs that are easily overlooked due to the uncertainties or unknowns in Earth system modeling. To address these challenges, this study shows that different compiling setups can achieve exactly the same (bitwise identical) results in Earth system modeling, and that a set of bitwise identical compiling setups of a model can be used across different compiler versions and different compiler flags. As a result, the original results can be more easily reproduced; for example, the original results obtained with an older compiler version can be reproduced exactly with a newer compiler version. Moreover, this study shows that new test cases can be generated based on the differences of bitwise identical compiling setups between different models, which can help detect software bugs or risks in the codes of models and compilers and finally improve the reliability of Earth system modeling.
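    The round-off sensitivity described above is easy to reproduce in miniature: two mathematically equivalent summation orders, of the kind a compiler may switch between when vectorizing or changing flags, can already differ in the last bits of the result. The values and the hash-based comparison below are an invented sketch of how one might certify bitwise identity of binary outputs.

```python
import hashlib
import struct

values = [0.1] * 10

# Reduction 1: strict left-to-right accumulation.
left_to_right = 0.0
for v in values:
    left_to_right += v

# Reduction 2: pairwise accumulation (two halves, then combine), as a
# vectorizing compiler might emit for the same source expression.
pairwise = sum(values[:5]) + sum(values[5:])

print(left_to_right == pairwise)  # False: round-off differs

# Bitwise comparison of the raw IEEE-754 representations, analogous to
# hashing binary model output files to check a compiling setup.
h1 = hashlib.md5(struct.pack("<d", left_to_right)).hexdigest()
h2 = hashlib.md5(struct.pack("<d", pairwise)).hexdigest()
print(h1 == h2)  # False
```

    In a full model such last-bit differences can grow through nonlinear feedbacks, which is why the study insists on bitwise identity rather than approximate agreement.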

  19. TOXMAP®: Environmental Health Maps

    Data.gov (United States)

    U.S. Department of Health & Human Services — TOXMAP® is a Geographic Information System (GIS) that uses maps of the United States and Canada to help users visually explore data primarily from the EPA's Toxics...

  20. Evaluation of the FIR Example using Xilinx Vivado High-Level Synthesis Compiler

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Zheming [Argonne National Lab. (ANL), Argonne, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Yoshii, Kazutomo [Argonne National Lab. (ANL), Argonne, IL (United States); Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-07-28

    Compared to central processing units (CPUs) and graphics processing units (GPUs), field-programmable gate arrays (FPGAs) have major advantages in reconfigurability and performance achieved per watt. The FPGA development flow has been augmented with a high-level synthesis (HLS) flow that can convert programs written in a high-level programming language to a hardware description language (HDL). Using high-level programming languages such as C, C++, and OpenCL for FPGA-based development allows software developers who have little FPGA knowledge to take advantage of FPGA-based application acceleration. This improves developer productivity and makes FPGA-based acceleration accessible to hardware and software developers alike. The Xilinx Vivado HLS compiler is a high-level synthesis tool that enables C, C++, and SystemC specifications to be targeted directly at Xilinx FPGAs without the need to create RTL manually. A white paper [1] published recently by Xilinx uses a finite impulse response (FIR) example to demonstrate the variable-precision features of the Vivado HLS compiler and the resource and power benefits of converting a design from floating point to fixed point. To get a better understanding of the variable-precision features in terms of resource usage and performance, this report presents the experimental results of evaluating the FIR example using Vivado HLS 2017.1 and a Kintex UltraScale FPGA. In addition, we evaluated the half-precision floating-point data type against the double-precision and single-precision data types and present the detailed results.
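    As a rough illustration of the floating-to-fixed-point conversion the white paper explores, the sketch below quantizes a small FIR filter to a hypothetical Q-format and bounds the quantization error. Coefficients, samples, and bit width are invented; a real HLS design would use Xilinx's arbitrary-precision fixed-point types rather than plain Python integers.

```python
# Hypothetical 5-tap FIR; coefficient and sample values are invented.
coeffs = [0.1, 0.2, 0.4, 0.2, 0.1]
samples = [1.0, 0.5, -0.25, 0.75, -1.0, 0.125]

def fir_float(x, h):
    """Reference floating-point FIR: y[i] = sum_k h[k] * x[i-k]."""
    n, taps = len(x), len(h)
    return [sum(h[k] * x[i - k] for k in range(taps) if 0 <= i - k < n)
            for i in range(n)]

FRAC_BITS = 8            # Q-format fractional bits; fewer bits -> cheaper logic
SCALE = 1 << FRAC_BITS

def to_fixed(v):
    return round(v * SCALE)   # quantize to an integer representation

def fir_fixed(x, h):
    """Same filter with coefficients and samples quantized to integers."""
    xi = [to_fixed(v) for v in x]
    hi = [to_fixed(v) for v in h]
    n, taps = len(xi), len(hi)
    out = []
    for i in range(n):
        acc = sum(hi[k] * xi[i - k] for k in range(taps) if 0 <= i - k < n)
        out.append(acc / (SCALE * SCALE))   # rescale the Q-format products
    return out

ref = fir_float(samples, coeffs)
fxp = fir_fixed(samples, coeffs)
max_err = max(abs(a - b) for a, b in zip(ref, fxp))
print(max_err < 2 / SCALE)   # error stays within a couple of LSBs
```

    The resource savings reported for such conversions come from replacing floating-point multipliers with narrow integer multiplies, at the cost of exactly this kind of bounded quantization error.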

  1. Ada Compiler Validation Summary Report. Certificate Number: 891019W1. 10178, Hewlett Packard Company HP 9000 Series 800 Ada Compiler, Version 4.35 HP 9000 Series 800 Model 850. Completion of On-Site Testing: 19 October 1989

    Science.gov (United States)

    1989-10-19

    1985, the Commerce Department issued Part 379, Technical Data, of the Export Administration Regulations, specifically listing Ada Programming Support Environments...maintaining a uniform process for validation of Ada compilers. The AVO provides administrative and technical support for Ada validations to ensure...unused trailing bytes will not be appended. The principle use for the RECORD UNIT parameter is in reading and writing external files that are in

  2. Renewable energy atlas of the United States.

    Energy Technology Data Exchange (ETDEWEB)

    Kuiper, J.A.; Hlava, K.; Greenwood, H.; Carr, A. (Environmental Science Division)

    2012-05-01

    The Renewable Energy Atlas (Atlas) of the United States is a compilation of geospatial data focused on renewable energy resources, federal land ownership, and base map reference information. It is designed for the U.S. Department of Agriculture Forest Service (USFS) and other federal land management agencies to evaluate existing and proposed renewable energy projects. Much of the content of the Atlas was compiled at Argonne National Laboratory (Argonne) to support recent and current energy-related Environmental Impact Statements and studies, including the following projects: (1) West-wide Energy Corridor Programmatic Environmental Impact Statement (PEIS) (BLM 2008); (2) Draft PEIS for Solar Energy Development in Six Southwestern States (DOE/BLM 2010); (3) Supplement to the Draft PEIS for Solar Energy Development in Six Southwestern States (DOE/BLM 2011); (4) Upper Great Plains Wind Energy PEIS (WAPA/USFWS 2012, in progress); and (5) Energy Transport Corridors: The Potential Role of Federal Lands in States Identified by the Energy Policy Act of 2005, Section 368(b) (in progress). This report explains how to add the Atlas to your computer and install the associated software; describes each of the components of the Atlas; lists the Geographic Information System (GIS) database content and sources; and provides a brief introduction to the major renewable energy technologies.

  3. Mineral operations outside the United States

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Mineral facilities and operations outside the United States compiled by the National Minerals Information Center of the USGS. This representation combines source...

  4. USGS Geologic Map of Pipe Spring National Monument and the Western Kaibab-Paiute Indian Reservation, Mohave County, Arizona

    Data.gov (United States)

    National Park Service, Department of the Interior — The digital map publication, compiled from previously published and unpublished data and new mapping by the author, represents the general distribution of surficial...

  6. Performance modelling of parallel BLAST using Intel and PGI compilers on an infiniband-based HPC cluster.

    Science.gov (United States)

    Al-Mulhem, Muhammed; Al-Shaikh, Raed

    2013-01-01

    The Basic Local Alignment Search Tool (BLAST) is one of the most widely used bioinformatics programs for searching all available sequence databases for similarities between a protein or DNA query and predefined sequences, using a sequence alignment technique. Recently, many attempts have been made to make the algorithm practical to run against the publicly available genome databases. This paper presents our experience in mapping and evaluating both the serial and parallel BLAST algorithms on a large InfiniBand-based high-performance computing (HPC) cluster. The evaluation is performed using two commonly used parallel compilers, Intel and the Portland Group's PGI. The paper also presents the evaluation methodology along with the experimental results to illustrate the scalability of the BLAST algorithm on our state-of-the-art HPC system. Our results show that BLAST runtime scalability can be achieved with up to 87% efficiency when considering the right combination of the MPI suite, the parallel compiler, the cluster interconnect, and the CPU technology.
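    The efficiency figure quoted above is the usual ratio of speedup to processor count. A minimal sketch with invented timings (not measurements from the paper):

```python
# Hypothetical wall-clock times (seconds) for a BLAST search;
# the numbers are illustrative only.
serial_time = 6400.0
workers = 32
parallel_time = 230.0

speedup = serial_time / parallel_time      # how much faster than one core
efficiency = speedup / workers             # fraction of ideal linear scaling
print(f"speedup {speedup:.1f}x, efficiency {efficiency:.0%}")
```

    An efficiency near 87% on 32 workers means the run was about 27.8x faster than serial, with the remaining gap attributable to communication, I/O, and load imbalance.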

  7. BAC-HAPPY mapping (BAP mapping): a new and efficient protocol for physical mapping.

    Science.gov (United States)

    Vu, Giang T H; Dear, Paul H; Caligari, Peter D S; Wilkinson, Mike J

    2010-02-08

    Physical and linkage mapping underpin efforts to sequence and characterize the genomes of eukaryotic organisms by providing a skeleton framework for whole genome assembly. Hitherto, linkage and physical "contig" maps were generated independently prior to merging. Here, we develop a new and easy method, BAC HAPPY MAPPING (BAP mapping), that utilizes BAC library pools as a HAPPY mapping panel together with an Mbp-sized DNA panel to integrate the linkage and physical mapping efforts into one pipeline. Using Arabidopsis thaliana as an exemplar, a set of 40 Sequence Tagged Site (STS) markers spanning approximately 10% of chromosome 4 were simultaneously assembled onto a BAP map compiled using both a series of BAC pools each comprising 0.7x genome coverage and dilute (0.7x genome) samples of sheared genomic DNA. The resultant BAP map overcomes the need for polymorphic loci to separate genetic loci by recombination and allows physical mapping in segments of suppressed recombination that are difficult to analyze using traditional mapping techniques. Even virtual "BAC-HAPPY-mapping" to convert BAC landing data into BAC linkage contigs is possible.
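    The linkage signal that HAPPY-style mapping exploits can be caricatured in a few lines: markers that lie close together on the chromosome tend to be retained or lost together in each dilute aliquot, so a low "break" frequency between two markers suggests tight linkage. The 0/1 retention panel below is invented for illustration, not data from the study.

```python
# Invented presence/absence calls for three STS markers across eight
# sub-genomic pools (1 = marker detected in that pool).
panels = {
    "stsA": [1, 0, 1, 1, 0, 1, 0, 1],
    "stsB": [1, 0, 1, 0, 0, 1, 0, 1],   # near stsA: differs in 1 of 8 pools
    "stsC": [0, 1, 1, 0, 1, 0, 1, 0],   # far from stsA
}

def break_frequency(a, b):
    """Fraction of pools in which the two markers' presence calls differ."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

near = break_frequency(panels["stsA"], panels["stsB"])
far = break_frequency(panels["stsA"], panels["stsC"])
print(near < far)   # True: low break frequency suggests tight linkage
```

    Because this signal comes from physical breakage of DNA rather than meiotic recombination, it works even in regions of suppressed recombination, which is the advantage the abstract highlights.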

  8. Compilation and synthesis for embedded reconfigurable systems an aspect-oriented approach

    CERN Document Server

    Diniz, Pedro; Coutinho, José; Petrov, Zlatko

    2013-01-01

    This book provides techniques to tackle the design challenges raised by the increasing diversity and complexity of emerging, heterogeneous architectures for embedded systems. It describes an approach based on techniques from software engineering called aspect-oriented programming, which allows designers to control today’s sophisticated design tool chains while maintaining a single application source code. Readers are introduced to the basic concepts of an aspect-oriented, domain-specific language that enables control of a wide range of compilation and synthesis tools in the partitioning and mapping of an application to a heterogeneous (and possibly multi-core) target architecture. Several examples are presented that illustrate the benefits of the approach developed for applications from avionics and digital signal processing. Using the aspect-oriented programming techniques presented in this book, developers can reuse extensive sections of their designs, while preserving the original application source-...

  9. On palaeogeographic map

    Directory of Open Access Journals (Sweden)

    Zeng-Zhao Feng

    2016-01-01

    Full Text Available The palaeogeographic map is a graphic representation of the physical geographical characteristics of geological and human history periods, and it is the most important product of palaeogeographic study. The author, as Editor-in-Chief of the Chinese and English editions of the Journal of Palaeogeography, wrote this paper to address problems observed in articles submitted to and published in the journal in recent years, and in the relevant papers and books of others, drawing on his own practice of palaeogeographic study and mapping. The content mainly includes the data used in palaeogeographic mapping; the problems of palaeogeographic mapping method; the “Single factor analysis and multifactor comprehensive mapping method —— Methodology of quantitative lithofacies palaeogeography”, i.e., the “4 steps mapping method”; the nomenclature and explanation of each palaeogeographic unit in a palaeogeographic map; the significance of palaeogeographic maps and articles; the standards by which palaeogeographic maps and articles are evaluated; and a self-evaluation. Criticisms and corrections are welcome.

  10. Map images portraying flight paths of low-altitude transects over the Arctic Network of national park units and Selawik National Wildlife Refuge, Alaska, July 2013

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Maps portraying the flight paths for low altitude transects conducted from small aircraft over the National Park Service’s Arctic Network (Bering Land Bridge...

  11. Jules Verne Voyager, Jr: An Interactive Map Tool for Teaching Plate Tectonics

    Science.gov (United States)

    Hamburger, M. W.; Meertens, C. M.

    2010-12-01

    We present an interactive, web-based map utility that can make new geological and geophysical results accessible to a large number and variety of users. The tool provides a user-friendly interface that allows users to access a variety of maps, satellite images, and geophysical data at a range of spatial scales. The map tool, dubbed 'Jules Verne Voyager, Jr.', allows users to interactively create maps of a variety of study areas around the world. The utility was developed in collaboration with the UNAVCO Consortium for the study of global-scale tectonic processes. Users can choose from a variety of base maps (including "Face of the Earth" and "Earth at Night" satellite imagery mosaics, global topography, geoid, sea-floor age, strain rate, and seismic hazard maps, among others), add a number of geographic and geophysical overlays (coastlines, political boundaries, rivers and lakes, earthquake and volcano locations, stress axes, etc.), and then superimpose both observed and model velocity vectors representing a compilation of 2933 GPS geodetic measurements from around the world. A remarkable characteristic of the geodetic compilation is that users can select among some 21 plate reference frames, allowing a visual representation of both 'absolute' plate motion (in a no-net-rotation reference frame) and relative motion along all of the world's plate boundaries. The tool allows users to zoom among at least three map scales. The map tool can be viewed at http://jules.unavco.org/VoyagerJr/Earth. A more detailed version of the map utility, developed in conjunction with the EarthScope initiative, focuses on North American geodynamics and provides more detailed geophysical and geographic information for the United States, Canada, and Mexico. The ‘EarthScope Voyager’ can be accessed at http://jules.unavco.org/VoyagerJr/EarthScope. Because the system uses pre-constructed gif images and overlays, it can rapidly create and display maps for a large number of users.

  12. Function Interface Models for Hardware Compilation: Types, Signatures, Protocols

    CERN Document Server

    Ghica, Dan R

    2009-01-01

    The problem of synthesis of gate-level descriptions of digital circuits from behavioural specifications written in higher-level programming languages (hardware compilation) has been studied for a long time, yet a definitive solution has not been forthcoming. The argument of this essay is mainly methodological, bringing a perspective that is informed by recent developments in programming-language theory. We argue that one of the major obstacles in the way of hardware compilation becoming a useful and mature technology is the lack of a well-defined function interface model, i.e. a canonical way in which functions communicate with arguments. We discuss the consequences of this problem and propose a solution based on new developments in programming language theory. We conclude by presenting a prototype implementation and some examples illustrating our principles.

  13. Compiler analysis for irregular problems in FORTRAN D

    Science.gov (United States)

    Vonhanxleden, Reinhard; Kennedy, Ken; Koelbel, Charles; Das, Raja; Saltz, Joel

    1992-01-01

    We developed a dataflow framework which provides a basis for rigorously defining strategies to make use of runtime preprocessing methods for distributed memory multiprocessors. In many programs, several loops access the same off-processor memory locations. Our runtime support gives us a mechanism for tracking and reusing copies of off-processor data. A key aspect of our compiler analysis strategy is to determine when it is safe to reuse copies of off-processor data. Another crucial function of the compiler analysis is to identify situations which allow runtime preprocessing overheads to be amortized. This dataflow analysis will make it possible to effectively use the results of interprocedural analysis in our efforts to reduce interprocessor communication and the need for runtime preprocessing.
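    The runtime preprocessing described here is commonly organized as an inspector/executor pair: an "inspector" precomputes which indices of an irregular access pattern live off-processor and builds a communication schedule, and an "executor" reuses that schedule across loops that touch the same locations, amortizing the preprocessing cost. A language-neutral sketch, with all names and data invented:

```python
local_range = range(0, 8)                 # indices owned by this processor
index_map = [2, 9, 5, 11, 9, 3]           # irregular accesses, e.g. x(ia(i))

def inspect(indices, owned):
    """Inspector: record which off-processor indices must be fetched."""
    off_proc = sorted({i for i in indices if i not in owned})
    return {"fetch": off_proc}

def execute(indices, owned, local_data, remote_data, schedule):
    """Executor: gather off-processor values per the schedule, then compute."""
    # In a real system the fetch would be interprocessor communication;
    # here remote_data stands in for the other processors' memory.
    ghost = {i: remote_data[i] for i in schedule["fetch"]}
    return [local_data[i] if i in owned else ghost[i] for i in indices]

local = {i: float(i) for i in local_range}
remote = {9: 90.0, 11: 110.0}

schedule = inspect(index_map, set(local_range))   # preprocess once...
loop1 = execute(index_map, set(local_range), local, remote, schedule)
loop2 = execute(index_map, set(local_range), local, remote, schedule)  # ...reuse
print(schedule["fetch"], loop1 == loop2)
```

    The compiler analysis in the abstract corresponds to proving that the index map is unchanged between the two loops, so the schedule (and the fetched copies) may safely be reused.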

  14. Efficient Integration of Pipelined IP Blocks into Automatically Compiled Datapaths

    Directory of Open Access Journals (Sweden)

    Andreas Koch

    2006-12-01

    Full Text Available Compilers for reconfigurable computers aim to generate problem-specific optimized datapaths for kernels extracted from an input language. In many cases, however, judicious use of preexisting manually optimized IP blocks within these datapaths could improve the compute performance even further. The integration of IP blocks into the compiled datapaths poses a different set of problems than stitching together IPs to form a system-on-chip, though: instead of the loose coupling over standard buses employed by SoCs, the coupling between datapath and IP block must be much tighter. To this end, we propose a concise language that can be efficiently synthesized using a template-based approach for automatically generating lightweight data and control interfaces at the datapath level.

  15. Efficient Integration of Pipelined IP Blocks into Automatically Compiled Datapaths

    Directory of Open Access Journals (Sweden)

    Koch Andreas

    2007-01-01

    Full Text Available Compilers for reconfigurable computers aim to generate problem-specific optimized datapaths for kernels extracted from an input language. In many cases, however, judicious use of preexisting manually optimized IP blocks within these datapaths could improve the compute performance even further. The integration of IP blocks into the compiled datapaths poses a different set of problems than stitching together IPs to form a system-on-chip, though: instead of the loose coupling over standard buses employed by SoCs, the coupling between datapath and IP block must be much tighter. To this end, we propose a concise language that can be efficiently synthesized using a template-based approach for automatically generating lightweight data and control interfaces at the datapath level.

  16. Compilation of gallium resource data for bauxite deposits

    Science.gov (United States)

    Schulte, Ruth F.; Foley, Nora K.

    2014-01-01

    Gallium (Ga) concentrations for bauxite deposits worldwide have been compiled from the literature to provide a basis for research regarding the occurrence and distribution of Ga worldwide, as well as between types of bauxite deposits. In addition, this report is an attempt to bring together reported Ga concentration data into one database to supplement ongoing U.S. Geological Survey studies of critical mineral resources. The compilation of Ga data consists of location, deposit size, bauxite type and host rock, development status, major oxide data, trace element (Ga) data and analytical method(s) used to derive the data, and tonnage values for deposits within bauxite provinces and districts worldwide. The range in Ga concentrations for bauxite deposits worldwide is

  17. Twelve tips on how to compile a medical educator's portfolio.

    Science.gov (United States)

    Dalton, Claudia Lucy; Wilson, Anthony; Agius, Steven

    2017-09-17

    Medical education is an expanding area of specialist interest for medical professionals. Whilst most doctors will be familiar with the compilation of clinical portfolios for scrutiny of their clinical practice and provision of public accountability, teaching portfolios used specifically to gather and demonstrate medical education activity remain uncommon in many non-academic settings. For aspiring and early career medical educators in particular, their value should not be underestimated. Such a medical educator's portfolio (MEP) is a unique compendium of evidence that is invaluable for appraisal, revalidation, and promotion. It can stimulate and provide direction for professional development, and is a rich source for personal reflection and learning. We recommend that all new and aspiring medical educators prepare an MEP, and suggest twelve tips on how to skillfully compile one.

  18. Encounters of aircraft with volcanic ash clouds; A compilation of known incidents, 1953-2009

    Science.gov (United States)

    Guffanti, Marianne; Casadevall, Thomas J.; Budding, Karin

    2010-01-01

    Information about reported encounters of aircraft with volcanic ash clouds from 1953 through 2009 has been compiled to document the nature and scope of risks to aviation from volcanic activity. The information, gleaned from a variety of published and other sources, is presented in database and spreadsheet formats; the compilation will be updated as additional encounters occur and as new data and corrections come to light. The effects observed by flight crews and extent of aircraft damage vary greatly among incidents, and each incident in the compilation is rated according to a severity index. Of the 129 reported incidents, 94 incidents are confirmed ash encounters, with 79 of those having various degrees of airframe or engine damage; 20 are low-severity events that involve suspected ash or gas clouds; and 15 have data that are insufficient to assess severity. Twenty-six of the damaging encounters involved significant to very severe damage to engines and (or) airframes, including nine encounters with engine shutdown during flight. The average annual rate of damaging encounters since 1976, when reporting picked up, has been approximately 2 per year. Most of the damaging encounters occurred within 24 hours of the onset of ash production or at distances less than 1,000 kilometers from the source volcanoes. The compilation covers only events of relatively short duration for which aircraft were checked for damage soon thereafter; documenting instances of long-term repeated exposure to ash (or sulfate aerosols) will require further investigation. Of 38 source volcanoes, 8 have caused 5 or more encounters, of which the majority were damaging: Augustine (United States), Chaiten (Chile), Mount St. Helens (United States), Pacaya (Guatemala), Pinatubo (Philippines), Redoubt (United States), Sakura-jima (Japan), and Soufriere Hills (Montserrat, Lesser Antilles, United Kingdom). Aircraft have been damaged by eruptions ranging from small, recurring episodes to very large

  19. Compilation of Non-Financial Balances in the Czech Republic

    Directory of Open Access Journals (Sweden)

    Vítězslav Ondruš

    2011-09-01

    Full Text Available The System of National Accounts in the Czech Republic consists of three main parts — institutional sector accounts, input-output tables, and balances of non-financial assets. All three parts are compiled interactively on a common time schedule. The article deals with the balances of non-financial assets and their relation to the core institutional sector accounts, explains why this third parallel part of the SNA in the Czech Republic was built, and describes its weaknesses and future development.

  20. Compiler-Driven Performance Optimization and Tuning for Multicore Architectures

    Science.gov (United States)

    2015-04-10

    Workshop on Libraries and Automatic Tuning for Extreme Scale Systems, Lake Tahoe, CA. August 2011. J. Ramanujam, “The Tensor Contraction Engine...Hartono, M. Baskaran, L.-N. Pouchet, J. Ramanujam, and P. Sadayappan, “Parametric Tiling of Affine Loop Nests,” in 15th Workshop on Compilers for Parallel...Parametric Tiling for Autotuning,” in Workshop on Parallel Matrix Algorithms and Applications (PMAA 2010), Basel, Switzerland, July 2010. J. Ramanujam

  1. Analysis on Establishing Urban Cemetery Planning and Compiling System

    Institute of Scientific and Technical Information of China (English)

    Kun YANG; Xiaogang CHEN

    2015-01-01

    Currently, there are many problems in the construction of urban cemeteries, such as improper siting, low land utilization, backward greening facilities, and imperfect cemetery management, which have greatly affected people's everyday life and work. This article discusses the establishment of a sustainable urban cemetery planning and compiling system at the three levels of "macro-view, medium-view and micro-view" in order to perfect the present cemetery system.

  2. An Adaptation of the ADA Language for Machine Generated Compilers.

    Science.gov (United States)

    1980-12-01

    Ada Augusta, Lady Lovelace, the daughter of the poet Lord Byron, and Charles Babbage's programmer...UNIX is a Trademark/Service Mark of the Bell...An Adaptation of the Ada Language for Machine Generated Compilers, M. A. Rogers, L. P. Myers...Naval Postgraduate School, Monterey, California...Thesis

  3. Recent Efforts in Data Compilations for Nuclear Astrophysics

    CERN Document Server

    Dillmann, I

    2008-01-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on "Nuclear Physics Data Compilation for Nucleosynthesis Modeling" held at the ECT* in Trento, Italy, from May 29 to June 3, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently, a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The "JINA Reaclib Database" at http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections...

  4. Fault-tolerant digital microfluidic biochips compilation and synthesis

    CERN Document Server

    Pop, Paul; Stuart, Elena; Madsen, Jan

    2016-01-01

    This book describes for researchers in the fields of compiler technology, design and test, and electronic design automation the new area of digital microfluidic biochips (DMBs), and thus offers a new application area for their methods. The authors present a routing-based model of operation execution, along with several associated compilation approaches, which progressively relax the assumption that operations execute inside fixed rectangular modules. Since operations can experience transient faults during the execution of a bioassay, the authors show how to use both offline (design-time) and online (runtime) recovery strategies. The book also presents methods for the synthesis of fault-tolerant application-specific DMB architectures. It presents the current models used for research on compilation and synthesis techniques for DMBs in a tutorial fashion, and includes a set of “benchmarks”, which are presented in great detail and include the source code of most of the t...

  5. Braille Maps Compiling and Printing

    Institute of Scientific and Technical Information of China (English)

    莫新彦; 何秀丽

    2006-01-01

    Compiling convenient and practical braille maps is of great significance. This paper introduces the characteristics of braille maps and their compilation process, as well as the printing of braille maps, mainly covering the production of base elements, the representation of annotations and graphic elements, the printing process, and special printing techniques.

  6. The National Assessment of Shoreline Change:A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the Sandy Shorelines of the California Coast

    Science.gov (United States)

    Hapke, Cheryl J.; Reid, David

    2006-01-01

    Introduction: The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector shorelines and shoreline change rates for the sandy shoreline along the California open coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along many open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that are consistent from one coastal region to another. One purpose of this work is to develop standard, repeatable methods for mapping and analyzing shoreline movement so that periodic, systematic, and internally consistent updates of shorelines and shoreline change rates can be made at a national scale. This data compilation for open-ocean, sandy shorelines of the California coast is one in a series that already includes the Gulf of Mexico and the Southeast Atlantic Coast (Morton et al., 2004; Morton et al., 2005) and will eventually cover Washington, Oregon, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are determined by comparing the positions of three historical shorelines digitized from maps with a modern shoreline derived from LIDAR (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1850s-1880s, 1920s-1930s, and late 1940s-1970s. The most recent shoreline is from data collected between 1997 and 2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are end-point rate calculations using the two most recent shorelines.
Please refer to our full report on shoreline change of the

  7. Geological assessing of urban environments with a systematic mapping survey: The 1:5000 urban geological map of Catalonia

    Science.gov (United States)

    Vilà, Miquel; Pi, Roser; Cirés, Jordi; de Paz, Ana; Berástegui, Xavier

    2010-05-01

    harmonised and stored in a database. Analysis of the database allows the 1:5000-scale urban geological map to be compiled and printed according to the 1:5000 topographic grid of Catalonia. The map is composed of a principal map, geologic cross sections, and several complementary maps, charts, and tables. In addition to the geological map units, the principal map includes the main artificial deposits (such as infilled river valleys and road embankments), very recent or current superficial deposits, contours of outcropping areas, structural data and other relevant information gathered at stations, sampling points, boreholes indicating the thickness of artificial deposits and the depth of the pre-Quaternary basement, contour lines of the top of the pre-Quaternary basement surface, and water-level data. The complementary maps and charts may change depending on the gathered data, the geological features of the area, and the urban typology. However, the most representative complementary maps included in the printed urban map are the Quaternary subsurface bedrock map and the isopach map of Quaternary and anthropogenic deposit thickness. The map also includes charts and tables of relevant physical and chemical parameters of the geological materials; harmonised downhole lithological columns from selected boreholes; and photographs and figures illustrating the geology of the mapped area and how urbanisation has changed the natural environment. The objective of this systematic urban mapping survey is to provide a robust database for targeted studies related to urban planning, geoengineering works, soil pollution, and other important environmental issues that society will have to deal with in the future.

  8. Cluster-Enabled OpenMP: An OpenMP Compiler for the SCASH Software Distributed Shared Memory System

    Directory of Open Access Journals (Sweden)

    Mitsuhisa Sato

    2001-01-01

    OpenMP is attracting widespread interest because of its easy-to-use parallel programming model for shared memory multiprocessors. We have implemented a "cluster-enabled" OpenMP compiler for a page-based software distributed shared memory system, SCASH, which works on a cluster of PCs. It allows OpenMP programs to run transparently in a distributed memory environment. The compiler transforms OpenMP programs into parallel programs using SCASH so that shared global variables are allocated at run time in the shared address space of SCASH. A set of directives is added to specify data mapping and a loop scheduling method that schedules iterations onto the threads associated with the data mapping. Our experimental results show that the data mapping can have a great impact on the performance of OpenMP programs in the software distributed shared memory system. The performance of some NAS Parallel Benchmark programs in OpenMP is improved by using our extended directives.

  9. Surficial Geologic Map of the Worcester North-Oxford- Wrentham-Attleboro Nine-Quadrangle Area in South- Central Massachusetts

    Science.gov (United States)

    Stone, Byron D.; Stone, Janet R.; DiGiacomo-Cohen, Mary L.

    2008-01-01

    The surficial geologic map layer shows the distribution of nonlithified earth materials at land surface in an area of nine 7.5-minute quadrangles (417 mi² total) in south-central Massachusetts (fig. 1). Across Massachusetts, these materials range from a few feet to more than 500 ft in thickness. They overlie bedrock, which crops out in upland hills and in resistant ledges in valley areas. The geologic map differentiates surficial materials of Quaternary age on the basis of their lithologic characteristics (such as grain size and sedimentary structures), constructional geomorphic features, stratigraphic relationships, and age. Surficial materials also are known in engineering classifications as unconsolidated soils, which include coarse-grained soils, fine-grained soils, and organic fine-grained soils. Surficial materials underlie and are the parent materials of modern pedogenic soils, which have developed in them at the land surface. Surficial earth materials significantly affect human use of the land, and an accurate description of their distribution is particularly important for water resources, construction aggregate resources, earth-surface hazard assessments, and land-use decisions. The mapped distribution of surficial materials that lie between the land surface and the bedrock surface is based on detailed geologic mapping of 7.5-minute topographic quadrangles, produced as part of an earlier (1938-1982) cooperative statewide mapping program between the U.S. Geological Survey and the Massachusetts Department of Public Works (now Massachusetts Highway Department) (Page, 1967; Stone, 1982). Each published geologic map presents a detailed description of local geologic map units, the genesis of the deposits, and age correlations among units. Previously unpublished field compilation maps exist on paper or mylar sheets; these have been digitally rendered for the present map compilation. Regional summaries based on the Massachusetts surficial geologic mapping

  10. Geological mapping of the Kuiper quadrangle (H06) of Mercury

    Science.gov (United States)

    Giacomini, Lorenza; Massironi, Matteo; Galluzzi, Valentina

    2017-04-01

    The Kuiper quadrangle (H06) is located in the equatorial zone of Mercury and encompasses the area between longitudes 288°E - 360°E and latitudes 22.5°N - 22.5°S. The quadrangle was previously mapped for the most part by De Hon et al. (1981), who used Mariner 10 data to produce a final 1:5M-scale map of the area. In this work we present the preliminary results of a more detailed geological map (1:3M scale) of the Kuiper quadrangle, compiled using higher-resolution MESSENGER data. The main basemap used for the mapping is the MDIS (Mercury Dual Imaging System) 166 m/pixel BDR (map-projected Basemap reduced Data Record) mosaic. Additional datasets were also taken into account, such as the DLR stereo-DEM of the region (Preusker et al., 2016), global mosaics with high-incidence illumination from the east and west (Chabot et al., 2016), and the MDIS global color mosaic (Denevi et al., 2016). The preliminary geological map shows that the western part of the quadrangle is characterized by a prevalence of crater materials (i.e., crater floor, crater ejecta), which were distinguished into three classes on the basis of their degradation degree (Galluzzi et al., 2016). Different plains units were also identified and classified as: (i) intercrater plains, represented by densely cratered terrains; (ii) intermediate plains, terrains with a moderate density of superposed craters; and (iii) smooth plains, poorly cratered volcanic deposits emplaced mainly on the larger crater floors. Finally, several structures were mapped all over the quadrangle. Most of these features are thrusts, some of which appear to form systematic alignments. In particular, two main thrust systems have been identified: (i) the "Thakur" system, a 1500 km-long system including several scarps with a NNE-SSW orientation, located at the edge between the Kuiper and Beethoven (H07) quadrangles; (ii) the "Santa Maria" system, located at the centre of the quadrangle. It is a 1700 km

  11. Geodynamics map of northeast Asia

    Science.gov (United States)

    Parfenov, Leonid M.; Khanchuk, Alexander I.; Badarch, Gombosuren; Miller, Robert J.; Naumova, Vera V.; Nokleberg, Warren J.; Ogasawara, Masatsugu; Prokopiev, Andrei V.; Yan, Hongquan

    2013-01-01

    This map portrays the geodynamics of Northeast Asia at a scale of 1:5,000,000 using the concepts of plate tectonics and analysis of terranes and overlap assemblages. The map is the result of a detailed compilation and synthesis at 1:5,000,000 scale and is part of a major international collaborative study of the mineral resources, metallogenesis, and tectonics of Northeast Asia conducted from 1997 through 2002 by geologists from earth science agencies and universities in Russia, Mongolia, northeastern China, South Korea, Japan, and the USA.

  12. Combining Compile-Time and Run-Time Parallelization

    Directory of Open Access Journals (Sweden)

    Sungdo Moon

    1999-01-01

    This paper demonstrates that significant improvements to automatic parallelization technology require that existing systems be extended in two ways: (1) they must combine high-quality compile-time analysis with low-cost run-time testing; and (2) they must take control flow into account during analysis. We support this claim with the results of an experiment that measures the safety of parallelization at run time for loops left unparallelized by the Stanford SUIF compiler's automatic parallelization system. We present results of measurements on programs from two benchmark suites - SPECFP95 and NAS sample benchmarks - which identify inherently parallel loops in these programs that are missed by the compiler. We characterize remaining parallelization opportunities, and find that most of the loops require run-time testing, analysis of control flow, or some combination of the two. We present a new compile-time analysis technique that can be used to parallelize most of these remaining loops. This technique is designed not only to improve the results of compile-time parallelization, but also to produce low-cost, directed run-time tests that allow the system to defer the binding of parallelization until run time when safety cannot be proven statically. We call this approach predicated array data-flow analysis. We augment array data-flow analysis, which the compiler uses to identify independent and privatizable arrays, by associating predicates with array data-flow values. Predicated array data-flow analysis allows the compiler to derive "optimistic" data-flow values guarded by predicates; these predicates can be used to derive a run-time test guaranteeing the safety of parallelization.

  13. International survey of environmental programmes - a compilation of information from twelve countries received in response to a questionnaire distributed in 1992

    Energy Technology Data Exchange (ETDEWEB)

    Gyllander, C.; Karlberg, O.; Luening, M.; Larsson, C.M.; Johansson, G.

    1995-11-01

    The report compiles information from Cuba, Finland, Germany, Japan, South Korea, Lithuania, Luxembourg, Malaysia, Romania, Sweden, Switzerland, and the United Kingdom, relevant to the organisation and execution of programmes for environmental surveillance of nuclear facilities (source and environmental monitoring). 28 refs, 19 tabs.

  14. A Preliminary Study on the Use of Mind Mapping as a Visual-Learning Strategy in General Education Science Classes for Arabic Speakers in the United Arab Emirates

    Science.gov (United States)

    Wilson, Kenesha; Copeland-Solas, Eddia; Guthrie-Dixon, Natalie

    2016-01-01

    Mind mapping was introduced as a culturally relevant pedagogy aimed at enhancing the teaching and learning experience in a general education, Environmental Science class for mostly Emirati English Language Learners (ELL). Anecdotal evidence suggests that the students are very artistic and visual and enjoy group-based activities. It was decided to…

  15. Digital Geologic Map of Mount Mazama and Crater Lake Caldera, Oregon

    Science.gov (United States)

    Bacon, C. R.; Ramsey, D. W.

    2002-12-01

    Crater Lake caldera formed ~7700 cal yr B.P. by the eruption of 50 km³ of mainly rhyodacitic magma and the resulting collapse of Mount Mazama. A new 1:24,000-scale digital geologic map compiled in ArcInfo depicts the geology of this volcanic center, peripheral volcanoes, the caldera walls and floor, and superjacent pyroclastic, talus, and glacial deposits. The geology of the caldera walls was mapped in the field on photographs taken from the lake (see accompanying abstract and poster, "Geologic panoramas of the walls of Crater Lake caldera, Oregon"); the geology of the flanks of Mount Mazama and the surrounding area was mapped on aerial photographs; and features of the caldera floor were mapped on a multibeam echo-sounding bathymetric map (Gardner et al., 2001; Bacon et al., 2002). Volcanic map units are defined on the basis of chemical composition and petrographic characteristics. Map unit colors were chosen to indicate the compositions of volcanic rocks, cooler colors for mafic units and warmer colors for silicic units. Map unit color intensity indicates age, with more intense coloring for younger units. Ages of many units have been determined by K-Ar and 40Ar/39Ar dating by M.A. Lanphere. Several undated units have been correlated using paleomagnetic secular variation measurements by D.E. Champion. Crystallization facies of some of the larger lava flows are mapped separately (e.g., vitrophyre, felsite, carapace), as are breccia and lava facies of submerged postcaldera volcanoes. Also shown on the caldera floor are landslide (debris avalanche) and sediment gravity-flow deposits. A major north-south normal fault system traverses the map area west of the caldera and displaces dated late Pleistocene lava flows, allowing determination of a long-term slip rate of ~0.3 mm/yr (Bacon et al., 1999). Faults bounding large downdropped blocks of the south caldera wall are also shown. Where practical, lava flow margins are represented as intra-unit contacts. 
A number of small

  16. Benchmarking Domain-Specific Compiler Optimizations for Variational Forms

    CERN Document Server

    Kirby, Robert C

    2012-01-01

    We examine the effect of using complexity-reducing relations to generate optimized code for the evaluation of finite element variational forms. The optimizations are implemented in a prototype code named FErari, which has been integrated as an optimizing backend to the FEniCS Form Compiler, FFC. In some cases, FErari provides very little speedup, while in other cases, we obtain reduced local operation counts of a factor of as much as 7.9 and speedups for the assembly of the global sparse matrix of as much as a factor of 2.8.

  17. Computer programs: Information retrieval and data analysis, a compilation

    Science.gov (United States)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  18. An advanced compiler designed for a VLIW DSP for sensors-based systems.

    Science.gov (United States)

    Yang, Xu; He, Hu

    2012-01-01

    The VLIW architecture can be exploited to greatly enhance instruction level parallelism, thus it can provide computation power and energy efficiency advantages, which satisfies the requirements of future sensor-based systems. However, as VLIW codes are mainly compiled statically, the performance of a VLIW processor is dominated by the behavior of its compiler. In this paper, we present an advanced compiler designed for a VLIW DSP named Magnolia, which will be used in sensor-based systems. This compiler is based on the Open64 compiler. We have implemented several advanced optimization techniques in the compiler, and fulfilled the O3 level optimization. Benchmarks from the DSPstone test suite are used to verify the compiler. Results show that the code generated by our compiler can make the performance of Magnolia match that of the current state-of-the-art DSP processors.

  19. An Advanced Compiler Designed for a VLIW DSP for Sensors-Based Systems

    Directory of Open Access Journals (Sweden)

    Hu He

    2012-04-01

    Full Text Available The VLIW architecture can be exploited to greatly enhance instruction level parallelism, thus it can provide computation power and energy efficiency advantages, which satisfies the requirements of future sensor-based systems. However, as VLIW codes are mainly compiled statically, the performance of a VLIW processor is dominated by the behavior of its compiler. In this paper, we present an advanced compiler designed for a VLIW DSP named Magnolia, which will be used in sensor-based systems. This compiler is based on the Open64 compiler. We have implemented several advanced optimization techniques in the compiler, and fulfilled the O3 level optimization. Benchmarks from the DSPstone test suite are used to verify the compiler. Results show that the code generated by our compiler can make the performance of Magnolia match that of the current state-of-the-art DSP processors.

  20. Multi-hazard Non-regulatory Risk Maps for Resilient Coastal Communities of Washington State in Pacific Northwest Region of the United States

    Science.gov (United States)

    Cakir, R.; Walsh, T. J.; Zou, Y.; Gufler, T.; Norman, D. K.

    2015-12-01

    The Washington Department of Natural Resources - Division of Geology and Earth Resources (WADNR-DGER) partnered with FEMA through the FEMA Cooperating Technical Partners (CTP) program to assess annualized losses from flood and other hazards and to prepare supporting risk-related data for FEMA's coastal RiskMAP projects. We used HAZUS-MH analysis to assess losses from earthquake, flood, and other potential hazards such as landslide and tsunami in the project areas, on shorelines of the Pacific Ocean and Puget Sound in Washington's Grays Harbor, Pacific, Skagit, Whatcom, Island, Mason, Clallam, Jefferson, and San Juan Counties. FEMA's HAZUS-MH tool was applied to estimate losses and damages for each building due to floods and earthquakes. User-defined facilities (UDF) inventory data were prepared and used for individual building damage estimation and for updating general building stocks. Flood depth grids were used to determine which properties are most affected by flooding. For example, the HAZUS-MH flood-model run based on the 1% annual chance event (100-year flood) for Grays Harbor County resulted in a total of $161 million in losses to buildings, including residential and commercial properties and other building and occupancy types. A likely M9 megathrust Cascadia earthquake scenario USGS ShakeMap was used for the HAZUS-MH earthquake model. For example, the HAZUS-MH earthquake-model run based on the Cascadia M9 earthquake for Grays Harbor County resulted in a total of $1.15 billion in losses to the building inventory. We produced GIS-based overlay maps of properties exposed to tsunami, landslide, and liquefaction hazards within the communities. This multi-hazard approach is an essential component of the non-regulatory maps for FEMA's RiskMAP project, and it helps further improve local and regional mitigation efforts, emergency response plans, and the overall resilience of coastal communities in western Washington.