WorldWideScience

Sample records for newly compiled bouguer

  1. New Bouguer Gravity Maps of Venezuela: Representation and Analysis of Free-Air and Bouguer Anomalies with Emphasis on Spectral Analyses and Elastic Thickness

    OpenAIRE

    Sanchez-Rojas, Javier

    2012-01-01

    A new gravity data compilation for Venezuela was processed and homogenized. Gravity was measured in reference to the International Gravity Standardization Net 1971, and the complete Bouguer anomaly was calculated by using the Geodetic Reference System 1980 and 2.67 Mg/m3. A regional gravity map was computed by removing wavelengths longer than 200 km from the Bouguer anomaly. After the anomaly separation, regional and residual Bouguer gravity fields were then critically discussed in terms of th...

  2. New Bouguer Gravity Maps of Venezuela: Representation and Analysis of Free-Air and Bouguer Anomalies with Emphasis on Spectral Analyses and Elastic Thickness

    Directory of Open Access Journals (Sweden)

    Javier Sanchez-Rojas

    2012-01-01

    Full Text Available A new gravity data compilation for Venezuela was processed and homogenized. Gravity was measured in reference to the International Gravity Standardization Net 1971, and the complete Bouguer anomaly was calculated by using the Geodetic Reference System 1980 and 2.67 Mg/m3. A regional gravity map was computed by removing wavelengths longer than 200 km from the Bouguer anomaly. After the anomaly separation, regional and residual Bouguer gravity fields were then critically discussed in terms of the regional tectonic features. Results were compared with the previous geological and tectonic information obtained from former studies. Gravity and topography data in the spectral domain were used to examine the elastic thickness and the depths of the structures causing the measured anomalies. According to the power spectrum analysis of the gravity data, the average Moho depths for the massif, plains, and mountainous areas in Venezuela are 42, 35, and 40 km, respectively. The average admittance function computed from the topography and free-air anomaly profiles across the Mérida Andes showed a good fit for a regional compensation model with an effective elastic thickness of 15 km.
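The complete Bouguer anomaly described above follows the standard reduction chain: observed gravity minus GRS80 normal gravity, plus the free-air term and minus the Bouguer slab term. A minimal sketch in Python, using the GRS80 Somigliana formula and the conventional 2.67 g/cm3 reduction density (function names are illustrative, not from the paper):

```python
import math

def grs80_normal_gravity_mgal(lat_deg):
    """Somigliana closed-form normal gravity on the GRS80 ellipsoid, in mGal."""
    s2 = math.sin(math.radians(lat_deg)) ** 2
    gamma = 9.7803267715 * (1 + 0.001931851353 * s2) / math.sqrt(1 - 0.0066943800229 * s2)
    return gamma * 1e5  # m/s^2 -> mGal

def simple_bouguer_anomaly_mgal(g_obs_mgal, lat_deg, h_m, rho_gcc=2.67):
    """Observed gravity - normal gravity + free-air correction - Bouguer slab."""
    free_air = 0.3086 * h_m                # mGal, standard free-air gradient
    slab = 0.04193 * rho_gcc * h_m         # mGal, 2*pi*G*rho infinite slab
    return g_obs_mgal - grs80_normal_gravity_mgal(lat_deg) + free_air - slab
```

For a station at sea level whose observed gravity equals normal gravity, the anomaly is zero by construction; the terrain correction needed for the *complete* Bouguer anomaly is omitted here.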

  3. The Bouguer Correction Algorithm for Gravity with Limited Range

    OpenAIRE

    MA Jian; WEI Ziqing; WU Lili; YANG Zhenghui

    2017-01-01

    The Bouguer correction is an important item in gravity reduction, but the traditional Bouguer correction, whether the plane or the spherical Bouguer correction, suffers from an approximation error caused by far-zone virtual terrain. The error grows as the calculation point gets higher. Therefore, gravity reduction using the Bouguer correction with limited range, which is in accordance with the scope of the topographic correction, was researched in this paper. After that, a simpli...

  4. The Bouguer Correction Algorithm for Gravity with Limited Range

    Directory of Open Access Journals (Sweden)

    MA Jian

    2017-01-01

    Full Text Available The Bouguer correction is an important item in gravity reduction, but the traditional Bouguer correction, whether the plane or the spherical Bouguer correction, suffers from an approximation error caused by far-zone virtual terrain. The error grows as the calculation point gets higher. Therefore, gravity reduction using the Bouguer correction with limited range, which is in accordance with the scope of the topographic correction, was researched in this paper. After that, a simplified formula to calculate the Bouguer correction with limited range was proposed. The algorithm, which is innovative and has some value for mathematical theory, is consistent with the equation derived from the strict integral algorithm for topographic correction. The interpolation experiment shows that gravity reduction based on the Bouguer correction with limited range outperforms the unlimited-range correction when the calculation point is higher than 1000 m.
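The contrast between a limited-range correction and the classical slab can be illustrated with the closed-form attraction of a flat cylinder of finite radius, which reduces to the infinite Bouguer slab 2*pi*G*rho*h as the radius grows. This is a sketch under simplifying assumptions (axial observation point, flat terrain), not the authors' algorithm:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
MGAL = 1e5     # m/s^2 -> mGal

def slab_mgal(h, rho=2670.0):
    """Classical infinite Bouguer slab of thickness h (m): 2*pi*G*rho*h."""
    return 2 * math.pi * G * rho * h * MGAL

def capped_slab_mgal(h, radius, rho=2670.0):
    """Flat cylinder of thickness h and finite radius, observer on its axis
    at the top surface: 2*pi*G*rho*(h + R - sqrt(R^2 + h^2))."""
    return 2 * math.pi * G * rho * (h + radius - math.sqrt(radius**2 + h**2)) * MGAL
```

For a 1000 m high station and a 166.7 km radius (the traditional outer limit of the topographic correction), the finite correction is a few tenths of a mGal smaller than the infinite slab, and the gap grows with station height, which is the effect the paper exploits.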

  5. Maine Bouguer Gravity Grid

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A 2 kilometer Bouguer anomaly grid for the state of Maine. Number of columns is 197 and number of rows is 292. The order of the data is from the lower left to the...

  6. Minnesota Bouguer Anomaly Grid

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A 1.5 kilometer Bouguer anomaly grid for the state of Minnesota. Number of columns is 404 and number of rows is 463. The order of the data is from the lower left to...

  7. Bolivian Bouguer Anomaly Grid

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A 1 kilometer Bouguer anomaly grid for the country of Bolivia. Number of columns is 550 and number of rows is 900. The order of the data is from the lower left to the...

  8. New Mars free-air and Bouguer gravity: Correlation with topography, geology and large impact basins

    Science.gov (United States)

    Frey, Herbert; Bills, Bruce G.; Kiefer, Walter S.; Nerem, R. Steven; Roark, James H.; Zuber, Maria T.

    1993-01-01

    Free-air and Bouguer gravity anomalies from a 50x50 field (MGM635), derived at the Goddard Space Flight Center, were compared with global topography, geology, and the distribution of large impact basins. The free-air gravity anomalies were derived from re-analysis of Viking Orbiter and Mariner 9 tracking data and have a spatial resolution of 250-300 km. Bouguer anomalies were calculated using a 50x50 expansion of the current Mars topography and the GSFC degree 50 geoid as the equipotential reference surface. Rotational flattening was removed using a moment of inertia of 0.365 and the corrections from Table B2 of Sleep and Phillips. Crustal density and mean density were assumed to be 2.9 and 3.93 g/cm3. The spherical harmonic topography used has zero mean elevation, and differs from the USGS maps by about 2 km. Comparisons with global geology use a simplified map with about 1/3 the number of units on the current maps. For correlation with impact basins, the recent compilation by Schultz and Frey was used.

  9. Complete Bouguer gravity anomaly map of the state of Colorado

    Science.gov (United States)

    Abrams, Gerda A.

    1993-01-01

    The Bouguer gravity anomaly map is part of a folio of maps of Colorado cosponsored by the National Mineral Resources Assessment Program (NAMRAP) and the National Geologic Mapping Program (COGEOMAP) and was produced to assist in studies of the mineral resource potential and tectonic setting of the State. Previous compilations of about 12,000 gravity stations by Behrendt and Bajwa (1974a,b) are updated by this map. The data were reduced using a density of 2.67 g/cm3 and the grid was contoured at 3 mGal intervals. This map will aid in the mineral resource assessment by indicating buried intrusive complexes, volcanic fields, major faults and shear zones, and sedimentary basins; helping to identify concealed geologic units; and identifying localities that might be hydrothermally altered or mineralized.

  10. Utah Bouguer Gravity Grid

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A 2.5 kilometer Bouguer anomaly grid for the state of Utah. Number of columns is 196 and number of rows is 245. The order of the data is from the lower left to the...

  11. Interior Alaska Bouguer Gravity Anomaly

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A 1 kilometer Complete Bouguer Anomaly gravity grid of interior Alaska. Only those grid cells within 10 kilometers of a gravity data point have gravity values....

  12. Interior Alaska Bouguer Gravity Anomaly

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A 1 kilometer Complete Bouguer Anomaly gravity grid of interior Alaska. All grid cells within the rectangular data area (from 61 to 66 degrees North latitude and...

  13. Slow-light enhancement of Beer-Lambert-Bouguer absorption

    DEFF Research Database (Denmark)

    Mortensen, Asger; Xiao, Sanshui

    2007-01-01

    We theoretically show how slow light in an optofluidic environment facilitates enhanced light-matter interactions, by orders of magnitude. The proposed concept provides strong opportunities for improving existing miniaturized chemical absorbance cells for Beer-Lambert-Bouguer absorption measurements widely employed in analytical chemistry.

  14. Estimation of Bouguer Density Precision: Development of Method for Analysis of La Soufriere Volcano Gravity Data

    Directory of Open Access Journals (Sweden)

    Hendra Gunawan

    2014-06-01

    Full Text Available http://dx.doi.org/10.17014/ijog.vol3no3.20084 The precision of topographic density (Bouguer density) estimation by the Nettleton approach is based on a minimum correlation of Bouguer gravity anomaly and topography. The other method, the Parasnis approach, is based on a minimum correlation of Bouguer gravity anomaly and Bouguer correction. The precision of Bouguer density estimates was investigated by both methods on simple 2D synthetic models and under the assumption of a free-air anomaly consisting of a topographic effect, an intracrustal effect, and an isostatic compensation. Based on the simulation results, Bouguer density estimates were then investigated for a 2005 gravity survey of the La Soufriere Volcano area, Guadeloupe (Antilles Islands). The Bouguer density based on the Parasnis approach is 2.71 g/cm3 for the whole area, except for the edifice area, where the average topographic density estimate is 2.21 g/cm3; Bouguer density estimates from a previous gravity survey of 1975 are 2.67 g/cm3. The Bouguer density in La Soufriere Volcano was estimated with an uncertainty of 0.1 g/cm3. For the studied area, the density deduced from refraction seismic data is coherent with the recent Bouguer density estimates. A new Bouguer anomaly map based on these Bouguer density values allows a better geological interpretation.
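Nettleton's criterion, as summarized above, selects the trial density whose Bouguer anomaly correlates least with topography. A toy demonstration on a synthetic profile (all numbers and names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical profile: topography h (m) and a synthetic free-air anomaly whose
# terrain-correlated part corresponds to a "true" density of 2.40 g/cm^3.
h = 100.0 + 50.0 * np.sin(np.linspace(0, 4 * np.pi, 200))
rho_true = 2.40
faa = 0.04193 * rho_true * h + rng.normal(0.0, 0.1, h.size)  # mGal, small noise

def nettleton_density(faa, h, densities):
    """Return the trial density whose simple Bouguer anomaly is least
    correlated with topography (Nettleton's minimum-correlation criterion)."""
    return min(densities,
               key=lambda rho: abs(np.corrcoef(faa - 0.04193 * rho * h, h)[0, 1]))

rho_est = nettleton_density(faa, h, np.arange(2.0, 3.01, 0.01))
```

Over- or under-estimating the density leaves a residual proportional to (rho_true - rho) times topography, which is exactly the correlation the scan minimizes.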

  15. Estimation of Bouguer Density Precision: Development of Method for Analysis of La Soufriere Volcano Gravity Data

    OpenAIRE

    Gunawan, Hendra; Micheldiament, Micheldiament; Mikhailov, Valentin

    2008-01-01

    http://dx.doi.org/10.17014/ijog.vol3no3.20084 The precision of topographic density (Bouguer density) estimation by the Nettleton approach is based on a minimum correlation of Bouguer gravity anomaly and topography. The other method, the Parasnis approach, is based on a minimum correlation of Bouguer gravity anomaly and Bouguer correction. The precision of Bouguer density estimates was investigated by both methods on simple 2D synthetic models and under the assumption of a free-air anomaly consisting ...

  16. Lunar Bouguer gravity anomalies - Imbrian age craters

    Science.gov (United States)

    Dvorak, J.; Phillips, R. J.

    1978-01-01

    The Bouguer gravity anomalies associated with four Imbrian-age craters, analyzed in the present paper, are found to differ considerably from the mass anomalies associated with some young lunar craters. Of the Imbrian-age craters, only Piccolomini exhibits a negative gravity anomaly (i.e., a low-density region), which is characteristic of the young craters studied. The Bouguer gravity anomalies are zero for each of the remaining Imbrian-age craters. Since Piccolomini is younger, or at least less modified, than the other Imbrian-age craters, it is suggested that the processes responsible for the post-impact modification of the Imbrian-age craters may also be responsible for removing the negative mass anomalies initially associated with these features.

  17. Depth Estimation by 1-D Spectral Analysis of Complete Bouguer Anomaly Data

    OpenAIRE

    Maria, Maria

    2003-01-01

    Interpretation of complete Bouguer gravity anomaly data from measurements in the area of the Merapi and Merbabu volcanoes has been carried out to determine the depths of local and regional structures. The complete Bouguer anomaly data were projected onto a flat plane at a height of 4 km above the reference spheroid. The depths of the local and regional structures were obtained by 1-D spectral analysis.

  18. Principal facts for gravity data collected in the southern Albuquerque Basin area and a regional compilation, central New Mexico

    Science.gov (United States)

    Gillespie, Cindy L.; Grauch, V.J.S.; Oshetski, Kim; Keller, Gordon R.

    2000-01-01

    Principal facts for 156 new gravity stations in the southern Albuquerque basin are presented. These data fill a gap in existing data coverage. The compilation of the new data and two existing data sets into a regional data set of 5562 stations that cover the Albuquerque basin and vicinity is also described. Bouguer anomaly and isostatic residual gravity data for this regional compilation are available in digital form from ftp://greenwood.cr.usgs.gov/pub/openfile-reports/ofr-00-490.

  19. Idaho Batholith Study Area Bouguer Gravity Grid

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A 2 kilometer Bouguer gravity anomaly grid for the Idaho batholith study area. Number of columns is 331 and number of rows is 285. The order of the data is from the...

  20. Integrated 3D density modelling and segmentation of the Dead Sea

    OpenAIRE

    H.-J. Götze; R. El-Kelani; Sebastian Schmidt; M. Rybakov; M. Hassouneh; Hans-Jürgen Förster; J. Ebbing; DESERT Group

    2007-01-01

    A 3D interpretation of the newly compiled Bouguer anomaly in the area of the "Dead Sea Rift" is presented. A high-resolution 3D model constrained with the seismic results reveals the crustal thickness and density distribution beneath the Arava/Araba Valley (AV), the region between the Dead Sea and the Gulf of Aqaba/Elat. The Bouguer anomalies along the axial portion of the AV, as deduced from the modelling results, are mainly caused by deep-seated sedimentary basins (D > 10 km). An inferred...

  1. The origin of lunar mascons - Analysis of the Bouguer gravity associated with Grimaldi

    Science.gov (United States)

    Phillips, R. J.; Dvorak, J.

    1981-01-01

    Grimaldi is a relatively small multi-ringed basin located on the western limb of the moon. Spacecraft free-air gravity data reveal a mascon associated with the inner ring of this structure, and the topographic correction to the local lunar gravity field indicates a maximum Bouguer anomaly of +90 milligals at an altitude of 70 kilometers. Approximately 20% of this positive Bouguer anomaly can be attributed to the mare material lying within the inner ring of this basin. From a consideration of the Bouguer gravity and structure of large lunar craters comparable in size to the central basin of Grimaldi, it is suggested that the remaining positive Bouguer anomaly is due to a centrally uplifted plug of lunar mantle material. The uplift was caused by inward crustal collapse which also resulted in the formation of the concentric outer scarp of Grimaldi. In addition, an annulus of low density material, probably a combination of ejecta and in situ breccia, is required to fully reproduce the Bouguer gravity signature across this basin. It is proposed that Grimaldi supplies a critical test of the theory of mascon formation: crustal collapse by ring faulting and central uplift to depths of the crust-mantle boundary are requisites.

  2. Bouguer Images of the North American Craton

    Science.gov (United States)

    Arvidson, R. E.; Bindschadler, D.; Bowring, S.; Eddy, M.; Guinness, E.; Leff, C.

    1985-01-01

    Processing of existing gravity and aeromagnetic data with modern methods is providing new insights into crustal and mantle structures for large parts of the United States and Canada. More than three-quarters of a million ground station readings of gravity are now available for this region. These data offer a wealth of information on crustal and mantle structures when reduced and displayed as Bouguer anomalies, where lateral variations are controlled by the size, shape and densities of underlying materials. Digital image processing techniques were used to generate Bouguer images that display more of the granularity inherent in the data as compared with existing contour maps. A dominant NW-SE linear trend of highs and lows can be seen extending from South Dakota, through Nebraska, and into Missouri. This trend is probably related to features created during an early and perhaps initial episode of crustal assembly by collisional processes. The younger granitic materials are probably a thin cover over an older crust.

  3. Slow-light enhancement of Beer-Lambert-Bouguer absorption

    OpenAIRE

    Mortensen, Niels Asger; Xiao, Sanshui

    2007-01-01

    We theoretically show how slow light in an optofluidic environment facilitates enhanced light-matter interactions, by orders of magnitude. The proposed concept provides strong opportunities for improving existing miniaturized chemical absorbance cells for Beer-Lambert-Bouguer absorption measurements widely employed in analytical chemistry.
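The Beer-Lambert-Bouguer law underlying these absorbance cells is I/I0 = exp(-alpha*L); the proposed slow-light enhancement effectively scales alpha, shrinking the cell length needed for a given absorbance. A hedged sketch (the `enhancement` factor here is a generic placeholder for the paper's group-velocity scaling, not its exact expression):

```python
import math

def transmission(alpha, length):
    """Beer-Lambert-Bouguer law: I/I0 = exp(-alpha * L),
    with alpha the attenuation coefficient (1/m) and L the path length (m)."""
    return math.exp(-alpha * length)

def cell_length_for_absorbance(alpha, a10, enhancement=1.0):
    """Cell length giving decadic absorbance A10, i.e. alpha_eff*L = A10*ln(10),
    where alpha_eff = enhancement * alpha (illustrative slow-light factor)."""
    return a10 * math.log(10) / (alpha * enhancement)
```

With an order-of-magnitude enhancement the required cell length drops by the same factor, which is the miniaturization argument made in the abstract.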

  4. Data reduction and tying in regional gravity surveys—results from a new gravity base station network and the Bouguer gravity anomaly map for northeastern Mexico

    Science.gov (United States)

    Hurtado-Cardador, Manuel; Urrutia-Fucugauchi, Jaime

    2006-12-01

    Since 1947 Petroleos Mexicanos (Pemex) has conducted oil exploration projects using potential field methods. Geophysical exploration companies under contracts with Pemex carried out gravity anomaly surveys that were referred to different floating datums. Each survey comprises observations of gravity stations along highways, roads and trails at intervals of about 500 m. At present, 265 separate gravimeter surveys that cover 60% of the Mexican territory (mainly in the oil producing regions of Mexico) are available. This gravity database represents the largest, highest-spatial-resolution information available, and consequently has been used in the geophysical data compilations for the Mexico and North America gravity anomaly maps. Regional integration of gravimeter surveys generates gradients and spurious anomalies in the Bouguer anomaly maps at the boundaries of the connected surveys due to the different gravity base stations utilized. The main objective of this study is to refer all gravimeter surveys from Pemex to a single new first-order gravity base station network, in order to eliminate problems of gradients and spurious anomalies. A second objective is to establish a network of permanent gravity base stations (BGP), referred to a single base from the World Gravity System. Four regional loops of BGP covering eight States of Mexico were established to support the tie of local gravity base stations from each of the gravimeter surveys located in the vicinity of these loops. The third objective is to add the gravity constants, measured and calculated, for each of the 265 gravimeter surveys to their corresponding files in the Pemex and Instituto Mexicano del Petroleo database. The gravity base used as the common datum is the station SILAG 9135-49 (Latin American System of Gravity) located in the National Observatory of Tacubaya in Mexico City.
We present the results of the installation of a new gravity base network in northeastern Mexico, reference of the 43 gravimeter surveys

  5. Mars - Crustal structure inferred from Bouguer gravity anomalies.

    Science.gov (United States)

    Phillips, R. J.; Saunders, R. S.; Conel, J. E.

    1973-01-01

    Bouguer gravity has been computed for the equatorial region of Mars by differencing free air gravity and the gravity predicted from topographic variations. The free air gravity was generated from an eighth-order set of spherical harmonic coefficients. The gravity from topographic variations was generated by integrating a two-dimensional Green's function over each contour level. The Bouguer gravity indicates crustal inhomogeneities on Mars that are postulated to be variations in crustal thickness. The Tharsis ridge is a region of thick continental type crust. The gravity data, structural patterns, topography, and surface geology of this region lead to the interpretation of the Tharsis topographic high as a broad crustal upwarp possibly associated with local formation of lower-density crustal material and subsequent rise of a thicker crust. The Amazonis region is one of several basins of relatively thin crust, analogous to terrestrial ocean basins. The Libya and Hellas basins, which are probable impact features, are also underlain by thin crust and are possible regions of mantle upwelling.

  6. Bouguer correction density determination from fractal analysis using ...

    African Journals Online (AJOL)

    In this work, Bouguer density is determined using the fractal approach. This technique was applied to the gravity data of the Kwello area of the Basement Complex, north-western Nigeria. The density obtained using the fractal approach is 2500 kg/m3, which is lower than the conventional value of 2670 kg/m3 used for average ...

  7. Annual accumulation over the Greenland ice sheet interpolated from historical and newly compiled observation data

    Science.gov (United States)

    Shen, Dayong; Liu, Yuling; Huang, Shengli

    2012-01-01

    The estimation of ice/snow accumulation is of great significance in quantifying the mass balance of ice sheets and variation in water resources. Improving the accuracy and reducing uncertainty has been a challenge for the estimation of annual accumulation over the Greenland ice sheet. In this study, we kriged and analyzed the spatial pattern of accumulation based on an observation data series including 315 points used in a recent study, plus 101 ice cores and snow pits and newly compiled data from 23 coastal weather stations. The estimated annual accumulation over the Greenland ice sheet is 31.2 g cm^-2 yr^-1, with a standard error of 0.9 g cm^-2 yr^-1. The main differences between the improved map developed in this study and the recently published accumulation maps are in the coastal areas, especially the southeast and southwest regions. The analysis of accumulation versus elevation reveals the distribution patterns of accumulation over the Greenland ice sheet.

  8. Worldwide complete spherical Bouguer and isostatic anomaly maps

    Science.gov (United States)

    Bonvalot, S.; Balmino, G.; Briais, A.; Peyrefitte, A.; Vales, N.; Biancale, R.; Gabalda, G.; Reinquin, F.

    2011-12-01

    We present here a set of digital maps of the Earth's gravity anomalies (surface "free air", Bouguer and isostatic), computed at Bureau Gravimetric International (BGI) as a contribution to the Global Geodetic Observing Systems (GGOS) and to the global geophysical maps published by the Commission for the Geological Map of the World (CGMW). The free air and Bouguer anomaly concept is extensively used in geophysical interpretation to investigate the density distributions in the Earth's interior. Complete Bouguer anomalies (including terrain effects) are usually computed at regional scales by integrating the gravity attraction of topography elements over and beyond a given area (under planar or spherical approximations). Here, we developed and applied a worldwide spherical approach aimed at providing a set of homogeneous and high resolution gravity anomaly maps and grids computed at the Earth's surface, taking into account a realistic Earth model and reconciling geophysical and geodetic definitions of gravity anomalies. This first version (1.0) has been computed by spherical harmonic analysis/synthesis of the Earth's topography-bathymetry up to degree 10800. The detailed theory of the spherical harmonics approach is given in Balmino et al. (Journal of Geodesy, submitted). The Bouguer and terrain corrections have thus been computed in spherical geometry at 1'x1' resolution using the ETOPO1 topography/bathymetry, ice surface and bedrock models from the NOAA (National Oceanic and Atmospheric Administration) and taking into account precise characteristics (boundaries and densities) of major lakes, inner seas, polar caps and of land areas below sea level. Isostatic corrections have been computed according to the Airy-Heiskanen model in spherical geometry for a constant depth of compensation of 30 km.
The gravity information given here is provided by the Earth Geopotential Model (EGM2008), developed at degree 2160 by the National Geospatial Intelligence Agency (NGA) (Pavlis

  9. Bouguer gravity regional and residual separation application to geology and environment

    CERN Document Server

    Mallick, K; Sharma, KK

    2012-01-01

    Resolving the regional and residual components arising from deeper and shallower sources in observed Bouguer gravity anomalies is an old problem. The technique covered here attempts to overcome the difficulties and performs better than existing methods.

  10. On the link between particle size and deviations from the Beer–Lambert–Bouguer law for direct transmission

    International Nuclear Information System (INIS)

    Larsen, Michael L.; Clark, Aaron S.

    2014-01-01

    Ballistic photon models of radiative transfer in discrete absorbing random media have demonstrated deviations from the Beer–Lambert–Bouguer law of exponential attenuation. A number of theoretical constructs to quantify the deviation from the Beer–Lambert–Bouguer law have appeared in the literature, several of which rely principally on a statistical measure related to the statistics of the absorber spatial positions alone. Here, we utilize a simple computational model to explore the interplay between the geometric size of the absorbing obstacles and the statistics governing the placement of the absorbers in the volume. We find that a description of the volume that depends on particle size and the spatial statistics of absorbers is not sufficient to fully characterize deviations from the Beer–Lambert–Bouguer law. Implications for further theoretical and computational explorations of the problem are explored. -- Highlights: • We observe deviations from classical Beer–Lambert–Bouguer behavior in correlated random media. • We have demonstrated that absorber size is a spatial scale relevant to BLB law deviations. • We have argued that BLB deviations that neglect consideration of particle size are bound to fail.
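The deviation discussed above, where correlated absorber positions change transmission away from the exponential Beer-Lambert-Bouguer prediction, can be reproduced with a small Monte Carlo toy model: pairing absorbers at identical positions halves their independent blocking power, so a clustered medium transmits more than a Poisson-placed one with the same absorber count. The 2-D geometry and all parameters below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def transmission(n_absorbers, half_width, n_media=200, n_rays=100, clustered=False):
    """Fraction of horizontal rays crossing a unit-height slab without
    hitting any opaque absorber of vertical half-width `half_width`."""
    hits, total = 0, 0
    for _ in range(n_media):
        if clustered:
            # Perfectly paired absorbers: each pair blocks like one absorber.
            y = np.repeat(rng.uniform(0, 1, n_absorbers // 2), 2)
        else:
            y = rng.uniform(0, 1, n_absorbers)  # Poisson-like placement
        rays = rng.uniform(0, 1, n_rays)
        blocked = (np.abs(rays[:, None] - y[None, :]) < half_width).any(axis=1)
        hits += blocked.sum()
        total += n_rays
    return 1 - hits / total

T_random = transmission(300, 0.0025)                      # near exp(-1.5)
T_clustered = transmission(300, 0.0025, clustered=True)   # markedly higher
```

The Poisson case reproduces the exponential law; the clustered case exceeds it, which is the qualitative deviation the paper quantifies, before it further shows that absorber *size* also matters.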

  11. Free-air and Bouguer gravity anomalies and the Martian crustal dichotomy

    Science.gov (United States)

    Frey, Herbert; Bills, Bruce G.; Kiefer, Walter S.; Nerem, R. Steven; Roark, James H.; Zuber, Maria T.

    1993-01-01

    Free-air and Bouguer gravity anomalies from a 50x50 field, derived from re-analysis of Viking Orbiter and Mariner 9 tracking data and using a 50x50 expansion of the current Mars topography and the GSFC degree 50 geoid as the equipotential reference surface, are compared with the Martian crustal dichotomy. The spherical harmonic topography used has zero mean elevation, and differs from the USGS maps by about 2 km. In this field the dichotomy boundary in eastern Mars lies mostly at -1 to -2 km elevation. Bouguer gravity anomalies are shown on a map of Noachian, Hesperian, and Amazonian age terrains, simplified from current geologic maps. The map is centered at 300 deg W to show the continuity of the dichotomy boundary. Contour interval is 100 mGal. Gravity and topography were compared along approximately 40 profiles oriented parallel to the dichotomy boundary topographic gradient, to determine how the geophysical character of the boundary changes along its length and what this implies for its origin and development.

  12. Bouguer density analysis using nettleton method at Banten NPP site

    International Nuclear Information System (INIS)

    Yuliastuti; Hadi Suntoko; Yarianto SBS

    2017-01-01

    Sub-surface information becomes crucial in determining a feasible NPP site that is safe from external hazards. A gravity survey, which yields density information, is essential to understand the sub-surface structure. Nevertheless, overcorrection or undercorrection will lead to a false interpretation. Therefore, a density correction in terms of the near-surface average density, or Bouguer density, needs to be calculated. The objective of this paper is to estimate and analyze the Bouguer density using the Nettleton method at the Banten NPP site. The methodology used in this paper is the Nettleton method applied to three different slices (A-B, A-C and A-D) with assumed densities ranging between 1700 and 3300 kg/m3. The Nettleton method is based on a minimum correlation between gravity anomaly and topography to determine the density correction. The results show that for slice A-B, which covers a rough topographic difference, the Nettleton method failed, while for the other two slices the Nettleton method yielded different density values: 2700 kg/m3 for A-C and 2300 kg/m3 for A-D. A-C provides the lowest correlation value, which represents the Upper Banten tuff and Gede Mt. volcanic rocks, in accordance with the Quaternary rocks existing in the studied area. (author)

  13. Features of light attenuation in crystals under violation of the Bouguer law

    International Nuclear Information System (INIS)

    Kolesnikov, A. I.; Kaplunov, I. A.; Talyzin, I. V.; Tret'yakov, S. A.; Gritsunova, O. V.; Vorontsova, E. Yu.

    2008-01-01

    A computer simulation and measurements of the light transmittance of germanium and paratellurite crystals of different thicknesses were used to show that, at photon scattering probabilities comparable to their absorption probabilities, the standard methods for calculating light extinction coefficients on the basis of the Bouguer law lead to gross errors in estimating the optical quality of a material.

  14. Compiling quantum circuits to realistic hardware architectures using temporal planners

    Science.gov (United States)

    Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy

    2018-04-01

    To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits whose high number of commuting gates allow great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations but also means there is a greater potential win from more optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem and generate a test suite of compilation problems for QAOA circuits of various sizes on a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.
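The compilation problem sketched above (ordering commuting gates and inserting SWAPs to satisfy nearest-neighbor constraints) can be illustrated with a greedy toy baseline, far simpler than a temporal planner but operating on the same search space. Qubits sit on a line; all names and the gate set are hypothetical:

```python
def compile_commuting_gates(n_qubits, gates):
    """Greedy toy compiler for a linear nearest-neighbor architecture.
    A two-qubit gate can fire only on adjacent physical sites, so SWAPs are
    inserted to move partners together. Because the gates commute (as in QAOA
    phase separators), we may apply them in any order, so we always pick the
    currently cheapest one. Returns the list of ("SWAP"/"GATE", a, b) ops."""
    pos = list(range(n_qubits))          # pos[logical qubit] = physical site
    ops, pending = [], list(gates)
    while pending:
        g = min(pending, key=lambda ab: abs(pos[ab[0]] - pos[ab[1]]))
        pending.remove(g)
        a, b = g
        while abs(pos[a] - pos[b]) > 1:
            step = 1 if pos[b] > pos[a] else -1
            neighbor = pos.index(pos[a] + step)   # logical qubit next to a
            ops.append(("SWAP", a, neighbor))
            pos[a], pos[neighbor] = pos[neighbor], pos[a]
        ops.append(("GATE", a, b))
    return ops

# Deferring the distant gate (0,3) until after the cheap ones costs 2 SWAPs.
ops = compile_commuting_gates(4, [(0, 3), (0, 1), (2, 3)])
swaps = sum(1 for o in ops if o[0] == "SWAP")
```

A temporal planner additionally reasons about durations and parallelism to minimize total circuit time rather than just SWAP count; this sketch only shows why gate ordering freedom matters.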

  15. Spatial dispersion effects in spectral line broadening by pressure. I. The Bouguer Law and absorption coefficient

    International Nuclear Information System (INIS)

    Cherkasov, M.R.

    1995-01-01

    Based on the general principles of semiclassical electrodynamics, the Bouguer law is derived, and the expression for the absorption coefficient is obtained, formally including all effects related to the phenomenon of spatial dispersion

  16. A Geological and Geophysical Information System for the Middle East and North Africa,

    Science.gov (United States)

    1995-08-14

    Saad, D., Sawaf, T., and Gebran, A., 1990, Bouguer gravity trends and crustal structure of the Palmyride Mountain belt and surrounding northern Arabian ... that occurred between 1977 and 1992 (Figure 2). We have finished compiling crustal-scale Bouguer gravity data for Syria, Israel, and Lebanon (Figure 3). This Bouguer gravity database is part of our attempt to form a uniformly gridded Bouguer gravity data set for the entire Middle East, which then

  17. of the Bouguer gravity anomaly map using the sunshading method (area of Tangier-Tetuan, Morocco)

    Directory of Open Access Journals (Sweden)

    Saad Bakkali

    2007-01-01

    Full Text Available Shading is a powerful tool for highlighting the edges of an object present in an image. Knowing the direction and elevation of the illumination source, the reflectance of the different surfaces represented by the data can be calculated, making the data easier to interpret. Shading has become a universal tool for interpreting potential-field geophysical data. Airborne and ground gravity data were acquired in the Tangier-Tetuan region. From the observed and measured data, the Bouguer gravity anomaly map was produced. This article presents the results, and their interpretation, of applying the sunshading method to the Bouguer gravity anomaly maps of the Tangier-Tetuan area using image-processing techniques.
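    Shaded relief of a gridded anomaly, as described above, can be sketched with the standard Lambertian reflectance formula (a minimal NumPy version; the azimuth/elevation defaults and aspect convention are illustrative assumptions):

```python
import numpy as np

def hillshade(grid, azimuth_deg=315.0, elevation_deg=45.0, cellsize=1.0):
    """Lambertian shaded relief of a gridded field (e.g. a Bouguer anomaly).

    Azimuth is measured clockwise from north; elevation is the angle of
    the light source above the horizon.  Returns values clipped to [0, 1].
    """
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    dy, dx = np.gradient(grid, cellsize)        # surface slopes
    slope = np.arctan(np.hypot(dx, dy))
    aspect = np.arctan2(-dx, dy)                # downslope direction
    shaded = (np.sin(el) * np.cos(slope)
              + np.cos(el) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

# A uniformly tilted plane shades to a single constant value
demo = hillshade(np.add.outer(np.arange(8.0), np.arange(8.0)))
```

Edges between surfaces of different orientation produce sharp shading contrasts, which is what makes the technique useful on anomaly maps.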

  18. INTERPRETATION OF BOUGUER ANOMALY TO DETERMINE FAULT AND SUBSURFACE STRUCTURE AT BLAWAN-IJEN GEOTHERMAL AREA

    Directory of Open Access Journals (Sweden)

    Anjar Pranggawan Azhari

    2016-10-01

    Full Text Available A gravity survey was acquired with a LaCoste & Romberg G-1035 gravimeter in the Blawan-Ijen geothermal area, as a focused follow-up to previous research. The residual Bouguer anomaly data were obtained after applying gravity data reduction, reduction to a horizontal plane, and upward continuation. Interpretation of the Bouguer anomaly shows the occurrence of new faults and their relative movements. The Blawan faults (F1, F2, F3, and F6) are normal faults; the Blawan fault is the main fault controlling the hot springs in the Blawan-Ijen geothermal area. F4 and F5 are oblique faults forming a graben at the Banyupahit River. F7 is a reverse fault. The subsurface model shows that the Blawan-Ijen geothermal area is dominated by the Ijen caldera-forming ignimbrite (ρ1 = 2.670 g/cm3), embedded shale and sand (ρ2 = 2.644 g/cm3) as Blawan lake sediments, magma intrusions (ρ3 = 2.814 g/cm3 and ρ7 = 2.821 g/cm3), andesite rock (ρ4 = 2.448 g/cm3) as the geothermal reservoir, pyroclastic air-fall deposits (ρ5 = 2.613 g/cm3) from Mt. Blau, and lava flow (ρ6 = 2.890 g/cm3).

  19. FILTERING OF THE WESTERN ANATOLIA BOUGUER GRAVITY ANOMALIES AND INVESTIGATION OF THE CRUSTAL THICKNESS DISTRIBUTION

    OpenAIRE

    YÜKSEL, Fethi Ahmet

    2005-01-01

    Two-dimensional low-pass filters were used to prepare the Western Anatolia Bouguer gravity map for interpretation. The crustal thickness geometry of Western Anatolia was modeled by applying the Talwani method to suitably oriented profiles taken from the resulting regional Bouguer gravity anomaly map. The crustal thickness of Western Anatolia was calculated to start from an average of 30 km at the Black Sea coast in the north, reach 35-40 km in the Lakes Region, and thin again to 32 km at the Mediterranean coast in the south...

  20. Bouguer images of the North American craton and its structural evolution

    Science.gov (United States)

    Arvidson, R. E.; Bowring, S.; Eddy, M.; Guinness, E.; Leff, C.; Bindschadler, D.

    1984-01-01

    Digital image processing techniques have been used to generate Bouguer images of the North American craton that display more of the granularity inherent in the data than existing contour maps. A dominant NW-SE linear trend of highs and lows can be seen extending from South Dakota, through Nebraska, and into Missouri. The structural trend cuts across the major Precambrian boundary in Missouri, separating younger granites and rhyolites from older sheared granites and gneisses. This trend is probably related to features created during an early, and perhaps initial, episode of crustal assembly by collisional processes. The younger granitic materials are probably a thin cover over an older crust.

  1. Transportation legislative data base: State radioactive materials transportation statute compilation, 1989--1993

    International Nuclear Information System (INIS)

    1994-04-01

    The Transportation Legislative Data Base (TLDB) is a computer-based information service containing summaries of federal, state and certain local government statutes and regulations relating to the transportation of radioactive materials in the United States. The TLDB has been operated by the National Conference of State Legislatures (NCSL) under cooperative agreement with the US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management since 1992. The data base system serves the legislative and regulatory information needs of federal, state, tribal and local governments, the affected private sector and interested members of the general public. Users must be approved by DOE and NCSL. This report is a state statute compilation that updates the 1989 compilation produced by Battelle Memorial Institute, the previous manager of the data base. This compilation includes statutes not included in the prior compilation, as well as newly enacted laws. Statutes not included in the prior compilation show an enactment date prior to 1989. Statutes that deal with low-level radioactive waste transportation are included in the data base as are statutes from the states of Alaska and Hawaii. Over 155 new entries to the data base are summarized in this compilation

  2. Recognition of possible strong earthquake epicenters. VII. Use of gravitational Bouguer anomaly for California and adjacent regions

    Energy Technology Data Exchange (ETDEWEB)

    Artem' ev, M E; Rotvain, I M; Sadovskii, A M

    1977-01-01

    The possibility of using gravimetric data (Bouguer anomalies) as initial material for determining possible strong earthquake epicenters is determined with the aid of recognition algorithms. This was done for the purpose of correlating geological-geomorphological results and analyzing gravimetric indicators obtained in the study. 9 references, 4 figures, 6 tables.

  3. Strike-slip tectonics and Quaternary basin formation along the Vienna Basin fault system inferred from Bouguer gravity derivatives

    NARCIS (Netherlands)

    Salcher, B. C.; Meurers, B.; Smit, J.; Decker, K.; HöLzel, M.; Wagreich, M.

    2012-01-01

    The Vienna Basin at the transition between the Alpine and Carpathian belt hosts a number of large Pleistocene sub-basins forming along an active continental scale strike-slip fault (Vienna Basin strike-slip fault). We utilize first-order derivatives from industrial Bouguer gravity data to unravel

  4. Effective photons in weakly absorptive dielectric media and the Beer–Lambert–Bouguer law

    International Nuclear Information System (INIS)

    Judge, A C; Brownless, J S; Martijn de Sterke, C; Bhat, N A R; Sipe, J E; Steel, M J

    2014-01-01

    We derive effective photon modes that facilitate an intuitive and convenient picture of photon dynamics in a structured Kramers–Kronig dielectric in the limit of weak absorption. Each mode is associated with a mode field distribution that includes the effects of both material and structural dispersion, and an effective line-width that determines the temporal decay rate of the photon. These results are then applied to obtain an expression for the Beer–Lambert–Bouguer law absorption coefficient for unidirectional propagation in structured media consisting of dispersive, weakly absorptive dielectric materials

  5. Correction to the Beer-Lambert-Bouguer law for optical absorption.

    Science.gov (United States)

    Abitan, Haim; Bohr, Henrik; Buchhave, Preben

    2008-10-10

    The Beer-Lambert-Bouguer absorption law, known as Beer's law for absorption in an optical medium, is precise only at power densities below a few kW. At higher power densities the law fails because it neglects the processes of stimulated emission and spontaneous emission. In previous models that considered those processes, an analytical expression for the absorption law could not be obtained. We show here that by utilizing the Lambert W-function, the two-level energy rate equation model is solved analytically, leading to a general absorption law that is exact because it accounts for absorption as well as stimulated and spontaneous emission. The general absorption law reduces to Beer's law at low power densities. A criterion for its application is given along with experimental examples. (c) 2008 Optical Society of America
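    The role of the Lambert W function can be illustrated with the textbook saturable-absorption equation dI/dz = -αI/(1 + I/Is), whose implicit solution ln(I/I0) + (I - I0)/Is = -αz inverts via W. This is a generic two-level sketch under our own assumptions, not necessarily the exact law derived in the paper:

```python
import math

def lambert_w(x, tol=1e-12):
    """Principal branch of the Lambert W function for x >= 0 (Newton)."""
    w = math.log1p(x)  # reasonable starting guess on [0, inf)
    for _ in range(100):
        ew = math.exp(w)
        step = (w * ew - x) / (ew * (w + 1.0))
        w -= step
        if abs(step) < tol:
            break
    return w

def transmitted_intensity(i0, alpha, z, i_sat):
    """Solve dI/dz = -alpha*I/(1 + I/Is) for I(z).

    Reduces to Beer's law I = I0*exp(-alpha*z) when I0 << Is.
    """
    u0 = i0 / i_sat
    return i_sat * lambert_w(u0 * math.exp(u0 - alpha * z))
```

At low input intensity the saturation term is negligible and the classical exponential decay is recovered.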

  6. Engineering a compiler

    CERN Document Server

    Cooper, Keith D

    2012-01-01

    As computing has changed, so has the role of both the compiler and the compiler writer. The proliferation of processors, environments, and constraints demands an equally large number of compilers. To adapt, compiler writers retarget code generators, add optimizations, and work on issues such as code space or power consumption. Engineering a Compiler re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in today's practice. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best-practice algorithms for the major problems inside a compiler.
    · Focuses on the back end of the compiler, reflecting the focus of research and development over the last decade
    · Applies the well-developed theory behind scanning and parsing to introduce concepts that play a critical role in optimization and code generation
    · Introduces the student to optimization through data-flow analysis, SSA form, and a selection of sc...

  7. World Gravity Map: a set of global complete spherical Bouguer and isostatic anomaly maps and grids

    Science.gov (United States)

    Bonvalot, S.; Balmino, G.; Briais, A.; Kuhn, M.; Peyrefitte, A.; Vales, N.; Biancale, R.; Gabalda, G.; Reinquin, F.

    2012-04-01

    We present here a set of digital maps of the Earth's gravity anomalies (surface free-air, Bouguer, and isostatic), computed at the Bureau Gravimetrique International (BGI) as a contribution to the Global Geodetic Observing System (GGOS) and to the global geophysical maps published by the Commission for the Geological Map of the World (CGMW) with the support of UNESCO and other institutions. The Bouguer anomaly concept is extensively used in geophysical interpretation to investigate density distributions in the Earth's interior. Complete Bouguer anomalies (including terrain effects) are usually computed at regional scales by integrating the gravity attraction of topography elements over and beyond a given area (under planar or spherical approximations). Here, we developed and applied a worldwide spherical approach aimed at providing a set of homogeneous, high-resolution gravity anomaly maps and grids computed at the Earth's surface, taking into account a realistic Earth model and reconciling geophysical and geodetic definitions of gravity anomalies. This first version (1.0) has been computed by spherical harmonic analysis/synthesis of the Earth's topography-bathymetry up to degree 10800. The detailed theory of the spherical harmonic approach is given in Balmino et al. (Journal of Geodesy, 2011). The Bouguer and terrain corrections have thus been computed in spherical geometry at 1'x1' resolution using the ETOPO1 topography/bathymetry, ice surface, and bedrock models from NOAA (National Oceanic and Atmospheric Administration), taking into account the precise characteristics (boundaries and densities) of major lakes, inner seas, polar caps, and land areas below sea level. Isostatic corrections have been computed according to the Airy-Heiskanen model in spherical geometry for a constant compensation depth of 30 km. The gravity information given here is provided by the Earth Geopotential Model (EGM2008), developed at degree 2160 by the National Geospatial

  8. 3-D lithospheric structure and regional/residual Bouguer anomalies in the Arabia-Eurasia collision (Iran)

    Science.gov (United States)

    Jiménez-Munt, I.; Fernãndez, M.; Saura, E.; Vergés, J.; Garcia-Castellanos, D.

    2012-09-01

    The aim of this work is to propose a first-order estimate of the crustal and lithospheric mantle geometry of the Arabia-Eurasia collision zone and to separate the measured Bouguer anomaly into its regional and local components. The crustal and lithospheric mantle structure is calculated from the geoid height and elevation data combined with thermal analysis. Our results show that Moho depth varies from ˜42 km at the Mesopotamian-Persian Gulf foreland basin to ˜60 km below the High Zagros. The lithosphere is thicker beneath the foreland basin (˜200 km) and thinner underneath the High Zagros and Central Iran (˜140 km). Most of this lithospheric mantle thinning is accommodated under the Zagros mountain belt coinciding with the suture between two different mantle domains on the Sanandaj-Sirjan Zone. The regional gravity field is obtained by calculating the gravimetric response of the 3-D crustal and lithospheric mantle structure obtained by combining elevation and geoid data. The calculated regional Bouguer anomaly differs noticeably from those obtained by filtering or just isostatic methods. The residual gravity anomaly, obtained by subtraction of the regional components to the measured field, is analyzed in terms of the dominating upper crustal structures. Deep basins and areas with salt deposits are characterized by negative values (˜-20 mGal), whereas the positive values are related to igneous and ophiolite complexes and shallow basement depths (˜20 mGal).
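    The filtering approach that the authors compare their model-based regional field against can be sketched as a wavenumber-domain low-pass (a minimal NumPy version; the Gaussian filter shape, grid spacing, and cutoff are illustrative assumptions):

```python
import numpy as np

def regional_residual(grid, spacing_km, cutoff_km):
    """Split a gridded Bouguer anomaly into regional and residual parts.

    A Gaussian low-pass keeps wavelengths longer than roughly cutoff_km
    as the 'regional' field; the remainder is the 'residual'.
    """
    ky = np.fft.fftfreq(grid.shape[0], d=spacing_km)  # cycles/km
    kx = np.fft.fftfreq(grid.shape[1], d=spacing_km)
    k = np.hypot(*np.meshgrid(ky, kx, indexing="ij"))
    lowpass = np.exp(-0.5 * (k * cutoff_km) ** 2)
    regional = np.fft.ifft2(np.fft.fft2(grid) * lowpass).real
    return regional, grid - regional
```

By construction regional + residual reproduces the input exactly, which is one reason filtered separations are popular despite the geological ambiguity the abstract points out.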

  9. Generation of Bouguer anomaly maps from terrestrial gravity data and the EGM2008

    Directory of Open Access Journals (Sweden)

    Gilberto Gagg

    2017-05-01

    Full Text Available One purpose of gravity determination is to obtain gravity anomalies. The gravity field is a potential field and is influenced by the geological constitution, since rock density generates small variations in gravity values. Mathematical reductions seek to eliminate the influence of factors that interfere with the gravity field. The Bouguer anomaly removes the gravitational effect of the rocks between the observation point and the reference level. Data from the GRACE (Gravity Recovery and Climate Experiment) gravity mission have helped densify the available information, since many geopotential models are deficient because of the irregular distribution of gravimetric data. A study was therefore carried out for the state of Rio Grande do Sul (RS), Brazil, to generate Bouguer anomaly maps through a gravimetric analysis under two approaches: exclusive use of EGM2008 model data (which incorporates GRACE data), and combined use of EGM2008 data (for the oceanic portion and the continental portion outside RS) with terrestrial field data for Rio Grande do Sul. It was concluded that the use of easily accessible satellite-mission data made it possible to generate Bouguer anomaly maps that serve general and even regional purposes, with the advantage of being current and uniformly spaced.

  10. FILTERING OF THE WESTERN ANATOLIA BOUGUER GRAVITY ANOMALIES AND INVESTIGATION OF THE CRUSTAL THICKNESS DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Fethi Ahmet YÜKSEL

    2005-01-01

    Full Text Available Two-dimensional low-pass filters were used to prepare the Western Anatolia Bouguer gravity map for interpretation. The crustal thickness geometry of Western Anatolia was modeled by applying the Talwani method to suitably oriented profiles taken from the resulting regional Bouguer gravity anomaly map. The crustal thickness of Western Anatolia was calculated to start from an average of 30 km at the Black Sea coast in the north, reach 35-40 km in the Lakes Region, and thin again to 32 km at the Mediterranean coast in the south. On the Aegean coast, the crust starts at an average of 32 km and reaches a thickness of 40 km in the W-E direction toward inner Anatolia.

  11. Compiler Feedback using Continuous Dynamic Compilation during Development

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Probst, Christian W.

    2014-01-01

    to optimization. This tool can help programmers understand what the optimizing compiler has done and suggest automatic source code changes in cases where the compiler refrains from optimizing. We have integrated our tool into an integrated development environment, interactively giving feedback as part...

  12. A simple Bouguer gravity anomaly map of southwestern Saudi Arabia and an initial interpretation

    Science.gov (United States)

    Gettings, M.E.

    1983-01-01

    Approximately 2,200 gravity stations on a 10-km² grid were used to construct a simple Bouguer gravity anomaly map at 1:2,000,000 scale along a 150-km-wide by 850-km-long strip of the Arabian Peninsula from Sanam, southwest of Ar Riyad, through the Farasan Islands, including offshore islands, the coastal plain, and the Hijaz-Asir escarpment from Jiddah to the Yemen border. On the Precambrian Arabian Shield, local positive gravity anomalies are associated with greenstone belts, gneiss domes, and the Najd fault zones. Local negative gravity anomalies correlate with granitic plutonic rocks. A steep gravity gradient of as much as 4 mGal/km marks the continental margin on the coastal plain near the southwestern end of the strip. Bouguer gravity anomaly values range from -10 to +40 mGal southwest of this gradient and from -170 to -100 mGal in a 300-km-wide gravity minimum northeast of the gradient. Farther northeast, the minimum is terminated by a regional gradient of about 0.1 mGal/km that increases toward the Arabian Gulf. The regional gravity anomaly pattern has been modeled by using seismic refraction and Rayleigh-wave studies, heat-flow measurements, and isostatic considerations as constraints. The model is consistent with the hypothesis of upwelling of hot mantle material beneath the Red Sea and lateral mantle flow beneath the Arabian plate. The model yields best-fitting average crustal densities of 2.80 g/cm3 (0-20 km depth) and 3.00 g/cm3 (20-40 km depth) southwest of the Nabitah suture zone, and 2.74 g/cm3 (0-20 km depth) and 2.94 g/cm3 (20-40 km depth) northeast of the suture zone. The gravity model requires that the crust be about 20 km thick at the continental margin and that the lower crust between the margin and Bishah (lat 20° N., long 42.5° E.) be somewhat denser than the lower crust to the northeast.
    Detailed correlations between 1:250,000- and 1:500,000-scale geologic maps and the gravity anomaly map suggest that the greenstone belts associated
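    The "simple Bouguer anomaly" mapped in such studies combines the observed and normal gravity with the free-air and Bouguer-slab corrections. A minimal sketch with the conventional constants (0.3086 mGal/m free-air gradient; 2πGρ ≈ 0.0419 mGal per g/cm³ per metre for the slab; the station values below are illustrative):

```python
FREE_AIR = 0.3086   # mGal/m, normal free-air gradient
TWO_PI_G = 0.0419   # mGal per (g/cm^3 * m), Bouguer slab constant

def simple_bouguer_anomaly(g_obs, g_normal, height_m, density=2.67):
    """Simple Bouguer anomaly in mGal (no terrain correction).

    g_obs and g_normal in mGal; height_m is the station elevation;
    density in g/cm^3 (2.67 is the conventional crustal value).
    """
    free_air_corr = FREE_AIR * height_m          # restores sea-level datum
    slab_corr = TWO_PI_G * density * height_m    # removes slab attraction
    return g_obs - g_normal + free_air_corr - slab_corr
```

Adding a terrain correction to the slab term would turn this into the "complete" Bouguer anomaly used elsewhere in this compilation.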

  13. Eastern US crustal thickness estimates from spectral analysis and inversion of onshore Bouguer gravity anomalies

    Science.gov (United States)

    Dybus, W.; Benoit, M. H.; Ebinger, C. J.

    2011-12-01

    The crustal thickness beneath much of the eastern half of the US is largely unconstrained. Though there have been several controlled-source seismic surveys of the region, many of these studies suffer from rays that turn in the crust above the Moho, resulting in somewhat ambiguous crustal thickness values. Furthermore, broadband seismic station coverage east of the Mississippi has been limited, and most of the region remains largely understudied. In this study, we estimated the depth to the Moho using both spectral analysis and inversion of Bouguer gravity anomalies. We systematically estimated depths to lithospheric density contrasts from radial power spectra of Bouguer gravity within 100 km × 100 km windows, eastward from the Mississippi River to the Atlantic Coast and northward from North Carolina to Maine. The slopes and slope breaks in the radial power spectra were computed using an automated algorithm. The slope values for each window were visually inspected and then used to estimate the depth to the Moho and other lithospheric density contrasts beneath each windowed region. Additionally, we performed a standard Oldenburg-Parker inversion for lithospheric density contrasts using various reference depths and density contrasts that are realistic for the different physiographic provinces of the Eastern US. Our preliminary results suggest that the gravity-derived Moho depths are similar to those found using seismic data, and that the crust beneath the Piedmont region is thinner (~28-33 km) than expected (~35-40 km). Given the relative paucity of seismic data in the eastern US, analysis of onshore gravity data is a valuable tool for interpolating between seismic stations.
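    Spectral depth estimation of this kind commonly rests on the relation ln P(k) ≈ const − 2z|k| for angular wavenumber k, so the depth to a density contrast follows from the slope of the log power spectrum. A minimal sketch with synthetic data (the windowing and automated slope-break detection of the study are not reproduced):

```python
import numpy as np

def depth_from_spectrum(k, power):
    """Depth to a density contrast from a radial power spectrum.

    Assumes the standard relation ln P(k) ~ const - 2*z*|k| for angular
    wavenumber k (rad/km), so z = -slope/2.  The caller chooses the
    wavenumber band over which a single straight line is fitted.
    """
    slope, _ = np.polyfit(k, np.log(power), 1)
    return -slope / 2.0

# Synthetic spectrum for a contrast at 35 km depth: ln P = ln 4 - 70 k
k = np.linspace(0.01, 0.2, 50)           # rad/km
power = 4.0 * np.exp(-2.0 * 35.0 * k)
```

In real spectra several linear segments appear, and each slope break marks a different interface (Moho, intracrustal contrasts), which is why the study inspects the segments window by window.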

  14. Seismic b-values and its correlation with seismic moment and Bouguer gravity anomaly over Indo-Burma ranges of northeast India: Tectonic implications

    Science.gov (United States)

    Bora, Dipok K.; Borah, Kajaljyoti; Mahanta, Rinku; Borgohain, Jayanta Madhab

    2018-03-01

    The b-value is one of the most significant seismic parameters for describing the seismicity of a given region within a definite time window. In this study, high-resolution maps of the Gutenberg-Richter b-value, seismic moment release, Bouguer gravity anomaly, and fault-plane solutions containing faulting styles are analyzed in the Indo-Burma ranges of northeast India using the unified and homogeneous part of the seismicity record of the region (January 1964-December 2016). The study region is subdivided into square grids of geographical window size 1° × 1°, and b-values are calculated in each square grid. Our goal is to explore the spatial correlations and anomalous patterns between the b-value and parameters such as seismic moment release, Bouguer gravity anomaly, and faulting style, which can help us better understand the seismotectonics and the state of present-day crustal stress within the Indo-Burma region. Most of the areas show an inverse correlation between b-value and seismic moment release as well as convergence rates. When estimating the b-value as a function of depth, a sudden increase was found at a depth of 50-60 km, and receiver-function modeling confirms that this depth corresponds to the crust-mantle transition beneath the study region. The region is also associated with negative Bouguer gravity anomalies, and an inverse relation is found between the gravity anomaly and the b-value. Comparing b-values with different faulting styles reveals that areas with low b-values show thrust mechanisms, while areas with intermediate b-values show strike-slip mechanisms. Areas where the events show a thrust mechanism with a strike-slip component have the highest b-values.
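    Per-cell b-values of this kind are commonly estimated with Aki's maximum-likelihood formula. A minimal sketch (the completeness magnitude m_c and bin width are assumptions; the study's exact estimator is not stated in the abstract):

```python
import math

def b_value_mle(magnitudes, m_c, dm=0.1):
    """Aki (1965) maximum-likelihood b-value for events with M >= m_c.

    dm is the magnitude binning width; using m_c - dm/2 in the
    denominator is the usual correction for binned catalogs.
    """
    events = [m for m in magnitudes if m >= m_c]
    mean_m = sum(events) / len(events)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))
```

Mapping this estimator over 1° × 1° cells of a declustered catalog yields the kind of b-value grid the abstract describes.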

  15. Bouguer gravity and crustal structure of the Dead Sea transform fault and adjacent mountain belts in Lebanon

    Science.gov (United States)

    Kamal; Khawlie, Mohamad; Haddad, Fuad; Barazangi, Muawia; Seber, Dogan; Chaimov, Thomas

    1993-08-01

    The northern extension of the Dead Sea transform fault in southern Lebanon bifurcates into several faults that cross Lebanon from south to north. The main strand, the Yammouneh fault, marks the boundary between the Levantine (eastern Mediterranean) and Arabian plates and separates the western mountain range (Mount Lebanon) from the eastern mountain range (Anti-Lebanon). Bouguer gravity contours in Lebanon approximately follow topographic contours; i.e., positive Bouguer anomalies are associated with the Mount Lebanon and Anti-Lebanon ranges. This suggests that the region is not in simple isostatic compensation. Gravity observations based on 2.5-dimensional modeling and other available geological and geophysical information have produced the following interpretations. (1) The crust of Lebanon thins from ˜35 km beneath the Anti-Lebanon range, near the Syrian border, to ˜27 km beneath the Lebanese coast. No crustal roots exist beneath the Lebanese ranges. (2) The depth to basement is ˜3.5-6 km below sea level under the ranges and is ˜8-10 km beneath the Bekaa depression. (3) The Yammouneh fault bifurcates northward into two branches; one passes beneath the Yammouneh Lake through the eastern part of Mount Lebanon and another bisects the northern part of the Bekaa Valley (i.e., Mid-Bekaa fault). The Lebanese mountain ranges and the Bekaa depression were formed as a result of transtension and later transpression associated with the relative motion of a few crustal blocks in response to the northward movement of the Arabian plate relative to the Levantine plate.

  16. Elevation Difference and Bouguer Anomaly Analysis Tool (EDBAAT) User's Guide

    Science.gov (United States)

    Smittle, Aaron M.; Shoberg, Thomas G.

    2017-06-16

    This report describes a software tool that imports gravity anomaly point data from the Gravity Database of the United States (GDUS) of the National Geospatial-Intelligence Agency and University of Texas at El Paso along with elevation data from The National Map (TNM) of the U.S. Geological Survey that lie within a user-specified geographic area of interest. Further, the tool integrates these two sets of data spatially and analyzes the consistency of the elevation of each gravity station from the GDUS with TNM elevation data; it also evaluates the consistency of gravity anomaly data within the GDUS data repository. The tool bins the GDUS data based on user-defined criteria of elevation misfit between the GDUS and TNM elevation data. It also provides users with a list of points from the GDUS data, which have Bouguer anomaly values that are considered outliers (two standard deviations or greater) with respect to other nearby GDUS anomaly data. “Nearby” can be defined by the user at time of execution. These outputs should allow users to quickly and efficiently choose which points from the GDUS would be most useful in reconnaissance studies or in augmenting and extending the range of individual gravity studies.
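    The two-standard-deviation outlier screening described above can be sketched as follows (a simplification of the tool's behavior: planar distances instead of geographic ones, and illustrative thresholds):

```python
import math

def flag_outliers(stations, radius_km=50.0, n_sigma=2.0):
    """Indices of stations whose anomaly deviates by >= n_sigma standard
    deviations from the other stations within radius_km.

    stations: list of (x_km, y_km, anomaly_mgal) tuples.
    """
    flagged = []
    for i, (xi, yi, ai) in enumerate(stations):
        near = [a for j, (x, y, a) in enumerate(stations)
                if j != i and math.hypot(x - xi, y - yi) <= radius_km]
        if len(near) < 2:
            continue  # not enough neighbors for a meaningful statistic
        mean = sum(near) / len(near)
        std = math.sqrt(sum((a - mean) ** 2 for a in near) / len(near))
        if std > 0 and abs(ai - mean) >= n_sigma * std:
            flagged.append(i)
    return flagged
```

Flagged stations are candidates for exclusion or re-checking before the data are used in reconnaissance or gravity-modeling studies.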

  17. Advanced compiler design and implementation

    CERN Document Server

    Muchnick, Steven S

    1997-01-01

    From the Foreword by Susan L. Graham: This book takes on the challenges of contemporary languages and architectures, and prepares the reader for the new compiling problems that will inevitably arise in the future. The definitive book on advanced compiler design. This comprehensive, up-to-date work examines advanced issues in the design and implementation of compilers for modern processors. Written for professionals and graduate students, the book guides readers in designing and implementing efficient structures for highly optimizing compilers for real-world languages. Covering advanced issues in fundamental areas of compiler design, this book discusses a wide array of possible code optimizations, determining the relative importance of optimizations, and selecting the most effective methods of implementation.
    * Lays the foundation for understanding the major issues of advanced compiler design
    * Treats optimization in-depth
    * Uses four case studies of commercial compiling suites to illustrate different approache...

  18. Nose Structure Delineation of Bouguer Anomaly as the Interpretation Basis of Probable Hydrocarbon Traps: A Case Study on the Mainland Area of Northwest Java Basin

    Directory of Open Access Journals (Sweden)

    Kamtono Kamtono

    2014-06-01

    Full Text Available DOI: 10.17014/ijog.v7i3.144 Two important aspects of oil and gas exploration are technology and exploration concepts, but technology is not always suitable for areas whose geology is covered by young volcanic sediments or limestone. The onshore part of the Northwest Java Basin is mostly covered by young volcanic products, so exploration using seismic methods produces poor image resolution. To identify and interpret the subsurface structure and possible hydrocarbon traps, gravity measurements were carried out. Delineation of nose structures on a Bouguer anomaly map was used to interpret probable hydrocarbon traps. The results show that the gravity anomalies can be categorized into three groups: low (< 34 mgal), middle (34 - 50 mgal), and high (> 50 mgal). Analysis of the Bouguer anomaly indicates that the low anomaly is concentrated in the Cibarusa area, the southern part of the Ciputat Subbasin, and in the Cikampek area. Delineation of the Bouguer anomaly map shows nose structures on the Cibinong-Cileungsi and Pangkalan-Bekasi Highs, while delineation of the residual anomaly map shows nose structures on the Cilamaya-Karawang High. Locally, the gas fields of the Jatirangon and Cicauh areas lie on the flank of the nose structure of the Pangkalan-Bekasi High, while the oil/gas field of Northern Cilamaya is situated on the flank of the nose structure of the Cilamaya-Karawang High. The concept of fluid/gas migration concentrating on nose structures delineated from gravity data can be applied in the studied area; it needs to be tested in other oil and gas field areas.

  19. HAL/S-FC compiler system specifications

    Science.gov (United States)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  20. A Class-Specific Optimizing Compiler

    Directory of Open Access Journals (Sweden)

    Michael D. Sharp

    1993-01-01

    Full Text Available Class-specific optimizations are compiler optimizations specified by the class implementor to the compiler. They allow the compiler to take advantage of the semantics of the particular class so as to produce better code. Optimizations of interest include the strength reduction of class:: array address calculations, elimination of large temporaries, and the placement of asynchronous send/recv calls so as to achieve computation/communication overlap. We will outline our progress towards the implementation of a C++ compiler capable of incorporating class-specific optimizations.

  1. A Note on Compiling Fortran

    Energy Technology Data Exchange (ETDEWEB)

    Busby, L. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-01

    Fortran modules tend to serialize compilation of large Fortran projects by introducing dependencies among the source files. If file A depends on file B (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran, and IBM xlf, at least) offer an option to "verify syntax", with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: the first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8× speedup in the case study below.
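    The two-pass scheme can be sketched in a Makefile (a sketch, not the note's actual build files; gfortran's `-fsyntax-only` is shown, and flag names and module-file side effects vary by compiler and version, so sources in pass 1 must still be listed in dependency order):

```make
# Pass 1: generate .mod files quickly via syntax-only compilation.
# Pass 2: build object files in parallel, since every module file
# a source needs already exists.
SRCS := b.f90 a.f90          # dependency order: b defines a module a uses
OBJS := $(SRCS:.f90=.o)

all:
	for f in $(SRCS); do gfortran -fsyntax-only $$f; done   # pass 1, serial
	$(MAKE) -j $(OBJS)                                      # pass 2, parallel

%.o: %.f90
	gfortran -c $< -o $@
```

The serial pass is cheap because no code generation or optimization happens; the expensive work is then free of inter-file ordering constraints.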

  2. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendents; sending to the selected node only the compiled software to be executed by the selected node or selected node's descendent.
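The routing rule in the abstract can be sketched in a few lines. This is a minimal illustration with hypothetical names, not code from the patent: the compiling node "compiles" every unit, keeps what it must run itself, and forwards to each next-tier node only the artifacts destined for that node or its descendants.

```python
# Sketch of hierarchical compile-and-distribute (names are hypothetical).

def descendants(tree, node):
    """All nodes in the subtree rooted at `node` (excluding `node` itself)."""
    out = []
    for child in tree.get(node, []):
        out.append(child)
        out.extend(descendants(tree, child))
    return out

def distribute(tree, root, software):
    """software maps unit name -> target node. Returns node -> [compiled units]."""
    compiled = {unit: f"obj({unit})" for unit in software}      # "compile" every unit
    # the compiling node keeps only what it executes itself
    placement = {root: [compiled[u] for u, t in software.items() if t == root]}
    for child in tree.get(root, []):
        subtree = {child, *descendants(tree, child)}
        # send only the artifacts for the child or the child's descendants
        placement[child] = [compiled[u] for u, t in software.items() if t in subtree]
    return placement

# Example hierarchy: root -> {n1, n2}, n1 -> {n3}
tree = {"root": ["n1", "n2"], "n1": ["n3"]}
sw = {"a": "root", "b": "n1", "c": "n3", "d": "n2"}
print(distribute(tree, "root", sw))
# n1 receives obj(b) and obj(c) (c runs on its descendant n3); n2 only obj(d)
```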

  3. Evaluation of HAL/S language compilability using SAMSO's Compiler Writing System (CWS)

    Science.gov (United States)

    Feliciano, M.; Anderson, H. D.; Bond, J. W., III

    1976-01-01

    NASA/Langley is engaged in a program to develop an adaptable guidance and control software concept for spacecraft such as shuttle-launched payloads. It is envisioned that this flight software be written in a higher-order language, such as HAL/S, to facilitate changes or additions. To make this adaptable software transferable to various onboard computers, a compiler writing system capability is necessary. A joint program with the Air Force Space and Missile Systems Organization was initiated to determine if the Compiler Writing System (CWS) owned by the Air Force could be utilized for this purpose. The present study explores the feasibility of including the HAL/S language constructs in CWS and the effort required to implement these constructs. This will determine the compilability of HAL/S using CWS and permit NASA/Langley to identify the HAL/S constructs desired for their applications. The study consisted of comparing the implementation of the Space Programming Language using CWS with the requirements for the implementation of HAL/S. It is the conclusion of the study that CWS already contains many of the language features of HAL/S and that it can be expanded for compiling part or all of HAL/S. It is assumed that persons reading and evaluating this report have a basic familiarity with (1) the principles of compiler construction and operation, and (2) the logical structure and applications characteristics of HAL/S and SPL.

  4. Principal facts for about 16,000 gravity stations in the Nevada Test Site and vicinity

    International Nuclear Information System (INIS)

    Harris, R.N.; Ponce, D.A.; Oliver, H.W.; Healey, D.L.

    1989-01-01

    The Nevada Test Site (NTS) and vicinity includes portions of the Goldfield, Caliente, Death Valley, and Las Vegas quadrangles. This report documents and consolidates previously published and recently compiled gravity data to establish a gravity data base of about 16,000 stations for the NTS and vicinity. While compiling data sets, redundant stations and stations having doubtful locations or gravity values were excluded. Details of compiling the gravity data sets are discussed in later sections. Where feasible, an accuracy code has been assigned to each station so that the accuracy or reliability of each station can be evaluated. This data base was used in preparing complete Bouguer and isostatic gravity maps of the NTS and vicinity. Since publication of the complete Bouguer gravity map, additional data were incorporated into the isostatic gravity map. Gravity data were compiled from five sources: 14,183 stations from the US Geological Survey (USGS), 326 stations from Exploration Data Consultants (EDCON) of Denver, Colorado, 906 stations from the Los Alamos National Laboratory (LANL), 212 stations from the University of Texas at Dallas (UTD), and 48 stations from the Defense Mapping Agency (DMA). This investigation is an effort to study several areas for potential storage of high-level radioactive waste. Gravity stations established under the Yucca Mountain Project (YMP) are shown. The objective of this gravity survey was to explore for the presence of plutons. This volume contains only compiled data

  5. Compiler issues associated with safety-related software

    International Nuclear Information System (INIS)

    Feinauer, L.R.

    1991-01-01

    A critical issue in the quality assurance of safety-related software is the ability of the software to produce identical results, independent of the host machine, operating system, or compiler version under which the software is installed. A study is performed using the VIPRE-01, FREY-01, and RETRAN-02 safety-related codes. Results from an IBM 3083 computer are compared with results from a CYBER 860 computer. All three of the computer programs examined are written in FORTRAN; the VIPRE code uses the FORTRAN 66 compiler, whereas the FREY and RETRAN codes use the FORTRAN 77 compiler. Various compiler options are studied to determine their effect on the output between machines. Since the Control Data Corporation and IBM machines inherently represent numerical data differently, methods of producing equivalent accuracy of data representation were an important focus of the study. This paper identifies particular problems in the automatic double-precision option (AUTODBL) of the IBM FORTRAN 1.4.x series of compilers. The IBM FORTRAN version 2 compilers provide much more stable, reliable compilation for engineering software. Careful selection of compilers and compiler options can help guarantee identical results between different machines. To ensure reproducibility of results, the same compiler and compiler options should be used to install the program as were used in the development and testing of the program.
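The data-representation issue the study wrestles with is easy to reproduce outside FORTRAN. The following Python sketch (an illustration, not the paper's codes) emulates single-precision accumulation by rounding every step through IEEE single precision, and shows how far it drifts from the same arithmetic done in double precision.

```python
# Emulate float32 accumulation in Python (whose floats are IEEE doubles)
# by round-tripping each intermediate through single precision.
import struct

def to_single(x):
    """Round a Python float (double precision) to IEEE single precision."""
    return struct.unpack("f", struct.pack("f", x))[0]

acc64, acc32 = 0.0, 0.0
for _ in range(10_000):
    acc64 += 0.1
    acc32 = to_single(acc32 + to_single(0.1))

# The two "machines" disagree by a clearly nonzero amount (about 0.1 here)
print(acc64, acc32, abs(acc64 - acc32))
```

This is exactly why the study compares automatic double-precision options: identical source code with different numeric representations does not yield identical results.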

  6. C to VHDL compiler

    Science.gov (United States)

    Berdychowski, Piotr P.; Zabolotny, Wojciech M.

    2010-09-01

    The main goal of the C to VHDL compiler project is to make the FPGA platform more accessible for scientists and software developers. The FPGA platform offers the unique ability to configure the hardware to implement virtually any dedicated architecture, and modern devices provide a sufficient number of hardware resources to implement parallel execution platforms with complex processing units. All this makes the FPGA platform very attractive for those looking for an efficient, heterogeneous computing environment. The current industry standard in the development of digital systems on the FPGA platform is based on HDLs. Although very effective and expressive in the hands of hardware development specialists, these languages require specific knowledge and experience, unreachable for most scientists and software programmers. The C to VHDL compiler project attempts to remedy that by creating an application that derives an initial VHDL description of a digital system (for further compilation and synthesis) from a purely algorithmic description in the C programming language. This idea itself is not new, and the C to VHDL compiler combines the best approaches from existing solutions developed over many previous years with the introduction of some new unique improvements.

  7. HAL/S-FC compiler system functional specification

    Science.gov (United States)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package, as well as the restrictions and dependencies of the HAL/S-FC system, are also considered.

  8. SPARQL compiler for Bobox

    OpenAIRE

    Čermák, Miroslav

    2013-01-01

    The goal of this work is to design and implement a SPARQL compiler for the Bobox system. In addition to the lexical and syntactic analysis corresponding to the W3C standard for the SPARQL language, it performs semantic analysis and query optimization. The compiler constructs an appropriate model for execution in Bobox, which depends on the physical database schema.

  9. Convergence of the Bouguer-Beer law for radiation extinction in particulate media

    Science.gov (United States)

    Frankel, A.; Iaccarino, G.; Mani, A.

    2016-10-01

    Radiation transport in particulate media is a common physical phenomenon in natural and industrial processes. Developing predictive models of these processes requires a detailed model of the interaction between the radiation and the particles. Resolving the interaction between the radiation and the individual particles in a very large system is impractical, whereas continuum-based representations of the particle field lend themselves to efficient numerical techniques based on the solution of the radiative transfer equation. We investigate radiation transport through discrete and continuum-based representations of a particle field. Exact solutions for radiation extinction are developed using a Monte Carlo model in different particle distributions. The particle distributions are then projected onto a concentration field with varying grid sizes, and the Bouguer-Beer law is applied by marching across the grid. We show that the continuum-based solution approaches the Monte Carlo solution under grid refinement, but quickly diverges as the grid size approaches the particle diameter. This divergence is attributed to the homogenization error of an individual particle across a whole grid cell. We remark that the concentration energy spectrum of a point-particle field does not approach zero, and thus the concentration variance must also diverge under infinite grid refinement, meaning that no grid-converged solution of the radiation transport is possible.
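The continuum side of the comparison is a straightforward calculation. The sketch below (illustrative, not the paper's code) marches the Bouguer-Beer law cell by cell across a discretized concentration field; multiplying the per-cell attenuation factors is algebraically identical to exponentiating the integrated optical depth, which is why the grid solution converges while the field is well resolved. The divergence the paper reports appears only once the cell size approaches the particle diameter, where smearing a single particle over a cell misrepresents its blocking.

```python
import math

def beer_march(conc, kappa, dx):
    """March transmittance across grid cells: T = prod_i exp(-kappa * c_i * dx)."""
    T = 1.0
    for c in conc:
        T *= math.exp(-kappa * c * dx)
    return T

kappa, dx = 2.0, 0.01
conc = [1.0 + 0.5 * math.sin(10 * i * dx) for i in range(100)]  # smooth field
tau = sum(kappa * c * dx for c in conc)                         # optical depth

# The marched product equals exp(-tau) up to floating-point rounding.
print(beer_march(conc, kappa, dx), math.exp(-tau))
```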

  10. OpenMP-accelerated SWAT simulation using Intel C and FORTRAN compilers: Development and benchmark

    Science.gov (United States)

    Ki, Seo Jin; Sugimura, Tak; Kim, Albert S.

    2015-02-01

    We developed a practical method to accelerate execution of Soil and Water Assessment Tool (SWAT) using open (free) computational resources. The SWAT source code (rev 622) was recompiled using a non-commercial Intel FORTRAN compiler in Ubuntu 12.04 LTS Linux platform, and newly named iOMP-SWAT in this study. GNU utilities of make, gprof, and diff were used to develop the iOMP-SWAT package, profile memory usage, and check identicalness of parallel and serial simulations. Among 302 SWAT subroutines, the slowest routines were identified using GNU gprof, and later modified using Open Multiple Processing (OpenMP) library in an 8-core shared memory system. In addition, a C wrapping function was used to rapidly set large arrays to zero by cross compiling with the original SWAT FORTRAN package. A universal speedup ratio of 2.3 was achieved using input data sets of a large number of hydrological response units. As we specifically focus on acceleration of a single SWAT run, the use of iOMP-SWAT for parameter calibrations will significantly improve the performance of SWAT optimization.

  11. Parallelizing Compiler Framework and API for Power Reduction and Software Productivity of Real-Time Heterogeneous Multicores

    Science.gov (United States)

    Hayashi, Akihiro; Wada, Yasutaka; Watanabe, Takeshi; Sekiguchi, Takeshi; Mase, Masayoshi; Shirako, Jun; Kimura, Keiji; Kasahara, Hironori

    Heterogeneous multicores have been attracting much attention as a way to attain high performance while keeping power consumption low in a wide range of areas. However, heterogeneous multicores are very difficult to program, and the long application development period lowers product competitiveness. To overcome this situation, this paper proposes a compilation framework which bridges the gap between programmers and heterogeneous multicores. In particular, it describes a compilation framework based on the OSCAR compiler. It realizes coarse-grain task parallel processing, data transfer using a DMA controller, and power reduction control from user programs with DVFS and clock gating on various heterogeneous multicores from different vendors. This paper also evaluates the processing performance and the power reduction attained by the proposed framework on a newly developed 15-core heterogeneous multicore chip named RP-X, integrating 8 general-purpose processor cores and 3 types of accelerator cores, which was developed by Renesas Electronics, Hitachi, Tokyo Institute of Technology and Waseda University. The framework attains speedups of up to 32x for an optical flow program with eight general-purpose processor cores and four DRP (Dynamically Reconfigurable Processor) accelerator cores against sequential execution by a single processor core, and 80% power reduction for real-time AAC encoding.

  12. Algorithmic synthesis using Python compiler

    Science.gov (United States)

    Cieszewski, Radoslaw; Romaniuk, Ryszard; Pozniak, Krzysztof; Linczuk, Maciej

    2015-09-01

    This paper presents a Python-to-VHDL compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and translates it to VHDL. FPGA combines many benefits of both software and ASIC implementations. Like software, the programmed circuit is flexible, and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. This can be achieved by using many computational resources at the same time. Creating parallel programs implemented in FPGAs in pure HDL is difficult and time consuming. Using a higher level of abstraction and a High-Level Synthesis compiler, implementation time can be reduced. The compiler has been implemented using the Python language. This article describes the design, implementation and results of the created tools.
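The core idea of such a translator can be shown in miniature. The following toy (not the authors' compiler; the names and the emitted syntax are simplified assumptions) walks a Python AST and emits a VHDL-flavoured concurrent signal assignment for a boolean expression.

```python
import ast

# Map Python bitwise operators onto VHDL logical operators (an assumption
# of this toy: '&' -> and, '|' -> or, '^' -> xor).
VHDL_OPS = {ast.BitAnd: "and", ast.BitOr: "or", ast.BitXor: "xor"}

def emit(node):
    """Recursively render an expression subtree as VHDL-style text."""
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.BinOp):
        op = VHDL_OPS[type(node.op)]
        return f"({emit(node.left)} {op} {emit(node.right)})"
    raise NotImplementedError(type(node).__name__)

def py_to_vhdl(stmt):
    """Translate e.g. 'y = a & b' into a VHDL concurrent assignment."""
    assign = ast.parse(stmt).body[0]          # an ast.Assign node
    target = assign.targets[0].id
    return f"{target} <= {emit(assign.value)};"

print(py_to_vhdl("y = (a & b) | c"))   # y <= ((a and b) or c);
```

A real HLS flow would of course also handle loops, scheduling, and resource binding; the point here is only the front-end translation step.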

  13. Compiling a 50-year journey

    DEFF Research Database (Denmark)

    Hutton, Graham; Bahr, Patrick

    2017-01-01

    Fifty years ago, John McCarthy and James Painter published the first paper on compiler verification, in which they showed how to formally prove the correctness of a compiler that translates arithmetic expressions into code for a register-based machine. In this article, we revisit this example...
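The flavour of McCarthy and Painter's result can be sketched concretely. In the toy below (all names hypothetical, and a stack machine standing in for their register machine), a compiler translates arithmetic expressions to machine code, and the correctness statement "running the compiled code equals evaluating the expression" is checked on an example.

```python
def evaluate(e, env):
    """Direct interpreter: ints, variable names, and ("+"/"*", left, right)."""
    if isinstance(e, int):
        return e
    if isinstance(e, str):
        return env[e]
    op, l, r = e
    f = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}[op]
    return f(evaluate(l, env), evaluate(r, env))

def compile_expr(e):
    """Compile an expression to stack-machine code (postorder traversal)."""
    if isinstance(e, int):
        return [("push", e)]
    if isinstance(e, str):
        return [("load", e)]
    op, l, r = e
    return compile_expr(l) + compile_expr(r) + [("op", op)]

def run(code, env):
    """Execute stack-machine code and return the value left on the stack."""
    stack = []
    for instr, arg in code:
        if instr == "push":
            stack.append(arg)
        elif instr == "load":
            stack.append(env[arg])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if arg == "+" else a * b)
    return stack[-1]

e = ("+", ("*", "x", 3), 4)   # x*3 + 4
env = {"x": 5}
assert run(compile_expr(e), env) == evaluate(e, env) == 19
print("compiler agrees with the interpreter")
```

A verified compiler proves this agreement for all expressions, not just tested ones; the sketch only shows the shape of the statement being proved.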

  14. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    Business files compilation for an enterprise is a distillation and recreation of its spiritual wealth, from which the applicable information can be available to those who want to use it in a fast, extensive and precise way. Proceeding from the effects of business files compilation on scientific researches, productive constructions and developments, this paper in five points discusses the way how to define topics, analyze historical materials, search or select data and process it to an enterprise archives collection. Firstly, to expound the importance and necessity of business files compilation in production, operation and development of an company; secondly, to present processing methods from topic definition, material searching and data selection to final examination and correction; thirdly, to define principle and classification in order to make different categories and levels of processing methods available to business files compilation; fourthly, to discuss the specific method how to implement a file compilation through a documentation collection upon principle of topic definition gearing with demand; fifthly, to address application of information technology to business files compilation in view point of widely needs for business files so as to level up enterprise archives management. The present discussion focuses on the examination and correction principle of enterprise historical material compilation and the basic classifications as well as the major forms of business files compilation achievements. (author)

  15. Principal facts for about 16,000 gravity stations in the Nevada Test Site and vicinity

    International Nuclear Information System (INIS)

    Harris, R.N.; Ponce, D.A.; Oliver, H.W.; Healey, D.L.

    1989-01-01

    The Nevada Test Site (NTS) and vicinity includes portions of the Goldfield, Caliente, Death Valley, and Las Vegas quadrangles. This report documents and consolidates previously published and recently compiled gravity data to establish a gravity data base of about 16,000 stations for the NTS and vicinity. While compiling data sets, redundant stations and stations having doubtful locations or gravity values were excluded. Details of compiling the gravity data sets are discussed in later sections. Where feasible, an accuracy code has been assigned to each station so that the accuracy or reliability of each station can be evaluated. This data base was used in preparing complete Bouguer and isostatic gravity maps of the NTS and vicinity. Since publication of the complete Bouguer gravity map, additional data were incorporated into the isostatic gravity map. Gravity data were compiled from five sources: 14,183 stations from the US Geological Survey (USGS), 326 stations from Exploration Data Consultants (EDCON) of Denver, Colorado, 906 stations from the Los Alamos National Laboratory (LANL), 212 stations from the University of Texas at Dallas (UTD), and 48 stations from the Defense Mapping Agency (DMA). This investigation is an effort to study several areas for potential storage of high-level radioactive waste. Gravity stations established under the Yucca Mountain Project (YMP) are shown. The objective of this gravity survey was to explore for the presence of plutons. 33 refs., 24 figs., 9 tabs

  16. Vertical and Horizontal Analysis of Crustal Structure of Southeastern Mediterranean and the Egyptian Coastal Zone, from Bouguer and Satellite Mission Data

    Science.gov (United States)

    Saleh, Salah

    2016-07-01

    The present tectonic system of the Southeastern Mediterranean is driven by the collision of the African and Eurasian plates, the Arabian-Eurasian convergence, and the displacement of the Anatolian-Aegean microplate, which together shape the lithospheric structure of the region. In the scope of this study, Bouguer and satellite gravity (satellite altimetry) anomalies of the Southeastern Mediterranean and the northeastern part of Egypt were used to investigate the lithospheric structures. Second-order trend analysis was first applied to the Bouguer and satellite altimetry data to examine the character of the anomalies. Vertical and horizontal derivative operators were then applied to the same data; the purpose of applying derivative methods is to determine the vertical and horizontal borders of the structures. According to the results of the derivative maps, the study area can be divided into four main tectonic subzones based on the basement and Moho depth maps. These subzones are distributed from south to north as: the Nile delta-northern Sinai zone, the north Egyptian coastal zone, the Levantine basin zone, and the northern thrusting (Cyprus and its surroundings) zone. These zones are separated from each other by horizontal tectonic boundaries and/or near-vertical faults that display the block-faulting tectonic style of this belt. Finally, the gravity results were evaluated together with the seismic activity of the region, and the geodynamic structure was examined in light of previous studies of the region. The current study thus indicates that satellite gravity mission data is a valuable modern source for understanding the tectonic boundary behavior of the studied region and for geodynamic studies.

  17. Geologic implications of topographic, gravity, and aeromagnetic data in the northern Yukon-Koyukuk province and its borderlands, Alaska

    Science.gov (United States)

    Cady, J.W.

    1989-01-01

    The northern Yukon-Koyukuk province is characterized by low elevation and high Bouguer gravity and aeromagnetic anomalies in contrast to the adjacent Brooks Range and Ruby geanticline. Using newly compiled digital topographic, gravity, and aeromagnetic maps, the province is divided into three geophysical domains. The Koyukuk domain, which is nearly equivalent to the Koyukuk lithotectonic terrane, is a horseshoe-shaped area, open to the south, of low topography, high gravity, and high-amplitude magnetic anomalies caused by an intraoceanic magmatic arc. The Angayucham and Kanuti domains are geophysical subdivisions of the Angayucham lithotectonic terrane that occur along the northern and southeastern margins of the Yukon-Koyukuk province, where oceanic rocks have been thrust over continental rocks of the Brooks Range and Ruby geanticline. The modeling supports, but does not prove, the hypothesis that the crust of the Kobuk-Koyukuk basin is 32-35 km thick, consisting of a tectonically thickened section of Cretaceous volcanic and sedimentary rocks and older oceanic crust. -from Author

  18. Proving correctness of compilers using structured graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    it into a compiler implementation using a graph type along with a correctness proof. The implementation and correctness proof of a compiler using a tree type without explicit jumps is simple, but yields code duplication. Our method provides a convenient way of improving such a compiler without giving up the benefits...

  19. 12 CFR 411.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Semi-annual compilation. 411.600 Section 411.600 Banks and Banking EXPORT-IMPORT BANK OF THE UNITED STATES NEW RESTRICTIONS ON LOBBYING Agency Reports § 411.600 Semi-annual compilation. (a) The head of each agency shall collect and compile the...

  20. ALGOL compiler. Syntax and semantic analysis

    International Nuclear Information System (INIS)

    Tarbouriech, Robert

    1971-01-01

    In this research thesis, the author reports the development of an ALGOL compiler which performs the following main tasks: systematic scan of the source programme to recognise its different components (identifiers, reserved words, constants, separators); analysis of the source programme structure to build up its statements and arithmetic expressions; processing of symbolic names (identifiers) to associate them with the values they represent; and memory allocation for data and programme. Several issues are thus addressed: the characteristics of the machine for which the compiler is developed; the exact definition of the language (grammar, identifier and constant formation); the syntax processing programme that provides the compiler with the necessary elements (language vocabulary, precedence matrix); and a description of the first two phases of compilation, lexicographic analysis and syntax analysis. The last phase (machine-code generation) is not addressed

  1. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-10-01

    This is the third issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section approximately every six months. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations

  2. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-03-01

    This is the second issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations. It excludes references to ''mass-chain'' evaluations normally published in the ''Nuclear Data Sheets'' and ''Nuclear Physics''. The material contained in this compilation is sorted according to eight subject categories: general compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes; half-lives, energies and spectra; nuclear decay processes: gamma-rays; nuclear decay processes: fission products; nuclear decay processes: (others); atomic processes

  3. DETERMINATION OF THE COVERED FAULTS BY USE OF BOUGUER ANOMALIES AND MODELLING OF UNDERGROUND STRUCTURE OF İSTANBUL-SİLİVRİ REGION

    Directory of Open Access Journals (Sweden)

    Fethi Ahmet YÜKSEL

    2001-03-01

    In this study, a new method is presented for the determination of covered vertical discontinuities whose effects cannot be observed in the Bouguer gravity anomaly map of the Istanbul-Silivri region. The method is based on second vertical derivative values and on the cross-correlation between those values and the second derivative values of a theoretical vertical-discontinuity model. The maxima or minima of the cross-correlation function occur at the origin of the vertical discontinuity. After being tested on one- and two-dimensional theoretical models, the proposed method is applied to modelling the covered lineament structure of the Silivri region.
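A schematic one-dimensional version of the method (illustrative only, not the paper's implementation; the profile shape, grid, and depths are invented) cross-correlates the second derivative of an "observed" profile with the second derivative of a theoretical vertical-contact model, and reads the contact position off the extremum of the correlation.

```python
import math

def second_derivative(g, dx):
    """Central-difference second derivative along a profile."""
    return [(g[i - 1] - 2 * g[i] + g[i + 1]) / dx**2 for i in range(1, len(g) - 1)]

def contact_profile(xs, x0, depth):
    """Idealized gravity profile over a vertical contact at x0 (arctan shape)."""
    return [math.atan((x - x0) / depth) for x in xs]

dx = 0.5
xs = [i * dx for i in range(201)]
observed = contact_profile(xs, x0=60.0, depth=5.0)   # contact we pretend not to know
template = second_derivative(contact_profile(xs, x0=50.0, depth=5.0), dx)
deriv = second_derivative(observed, dx)

def xcorr(a, b, lag):
    """Cross-correlation of a and b at an integer lag (sliding dot product)."""
    return sum(a[i] * b[i + lag] for i in range(max(0, -lag), min(len(a), len(b) - lag)))

best = max(range(-40, 41), key=lambda k: abs(xcorr(template, deriv, k)))
print("estimated contact at x =", 50.0 + best * dx)   # recovers the true x0 = 60
```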

  4. CAPS OpenACC Compilers: Performance and Portability

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    The announcement in late 2011 of the new OpenACC directive-based programming standard, supported by the CAPS, CRAY and PGI compilers, has opened the door to more scientific applications that can be ported to many-core systems. Following a porting methodology, this talk will first review the principles of programming with OpenACC, and then the advanced features available in the CAPS compilers to further optimize OpenACC applications: library integration, and tuning directives with auto-tune mechanisms to build applications adaptive to different GPUs. CAPS compilers use hardware vendors' backends such as NVIDIA CUDA and OpenCL, making them the only OpenACC compilers supporting various many-core architectures. About the speaker: Stéphane Bihan is co-founder and currently Director of Sales and Marketing at CAPS enterprise. He has held several R&D positions in companies such as ARC International plc in London, Canon Research Center France, ACE compiler experts in Amsterdam and the INRIA r...

  5. HAL/S-360 compiler test activity report

    Science.gov (United States)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  6. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

    This paper presents a Python-based High-Level Synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. FPGA combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible, and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation and first results of the created Python-based compiler.

  7. Automatic Parallelization An Overview of Fundamental Compiler Techniques

    CERN Document Server

    Midkiff, Samuel P

    2012-01-01

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and
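One of the dependence analyses the book covers can be shown in miniature: the classic GCD test (sketched here in Python as an illustration, not as the book's code). A dependence between a write to A[a*i + b] and a read of A[c*j + d] can exist only if gcd(a, c) divides d - b, so the test can prove independence and safely license parallelization.

```python
from math import gcd

def gcd_test(a, b, c, d):
    """Necessary condition for integers i, j with a*i + b == c*j + d.

    Returns False when the subscripts can never collide (loop is safe to
    parallelize on this pair); True means a dependence cannot be ruled out.
    """
    return (d - b) % gcd(a, c) == 0

# for i: A[2*i] = ...; ... = A[2*i + 1]   -> even vs odd indices, independent
print(gcd_test(2, 0, 2, 1))   # False

# for i: A[4*i] = ...; ... = A[2*i + 2]   -> gcd(4, 2) = 2 divides 2
print(gcd_test(4, 0, 2, 2))   # True: dependence cannot be ruled out
```

Note the asymmetry: a False answer is a proof of independence, while a True answer only says the cheap test failed and a sharper analysis (e.g. Banerjee or exact integer tests) is needed.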

  8. A software methodology for compiling quantum programs

    Science.gov (United States)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.

  9. A compiler for variational forms

    OpenAIRE

    Kirby, Robert C.; Logg, Anders

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in s...

  10. Compiler Construction Using Java, JavaCC, and Yacc

    CERN Document Server

    Dos Reis, Anthony J

    2012-01-01

    Broad in scope, involving theory, the application of that theory, and programming technology, compiler construction is a moving target, with constant advances in compiler technology taking place. Today, a renewed focus on do-it-yourself programming makes a quality textbook on compilers, that both students and instructors will enjoy using, of even more vital importance. This book covers every topic essential to learning compilers from the ground up and is accompanied by a powerful and flexible software package for evaluating projects, as well as several tutorials, well-defined projects, and tes

  11. PIG 3 - A simple compiler for mercury

    Energy Technology Data Exchange (ETDEWEB)

    Bindon, D C [Computer Branch, Technical Assessments and Services Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1961-06-15

    A short machine-language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  12. PIG 3 - A simple compiler for mercury

    International Nuclear Information System (INIS)

    Bindon, D.C.

    1961-06-01

    A short machine-language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  13. Compilation of data on elementary particles

    International Nuclear Information System (INIS)

    Trippe, T.G.

    1984-09-01

    The most widely used data compilation in the field of elementary particle physics is the Review of Particle Properties. The origin, development and current state of this compilation are described with emphasis on the features which have contributed to its success: active involvement of particle physicists; critical evaluation and review of the data; completeness of coverage; regular distribution of reliable summaries including a pocket edition; heavy involvement of expert consultants; and international collaboration. The current state of the Review and new developments such as providing interactive access to the Review's database are described. Problems and solutions related to maintaining a strong and supportive relationship between compilation groups and the researchers who produce and use the data are discussed

  14. Compilation of functional languages using flow graph analysis

    NARCIS (Netherlands)

    Hartel, Pieter H.; Glaser, Hugh; Wild, John M.

    A system based on the notion of a flow graph is used to specify formally and to implement a compiler for a lazy functional language. The compiler takes a simple functional language as input and generates C. The generated C program can then be compiled, and loaded with an extensive run-time system to

  15. A genomic audit of newly-adopted autosomal STRs for forensic identification.

    Science.gov (United States)

    Phillips, C

    2017-07-01

    In preparation for the growing use of massively parallel sequencing (MPS) technology to genotype forensic STRs, a comprehensive genomic audit of 73 STRs was made in 2016 [Parson et al., Forensic Sci. Int. Genet. 22, 54-63]. The loci examined included miniSTRs that were not in widespread use, but had been incorporated into MPS kits or were under consideration for this purpose. The current study expands the genomic analysis of autosomal STRs that are not commonly used, to include the full set of developed miniSTRs and an additional 24 STRs, most of which have been recently included in several supplementary forensic multiplex kits for capillary electrophoresis. The genomic audit of these 47 newly-adopted STRs examined the linkage status of new loci on the same chromosome as established forensic STRs; analyzed world-wide population variation of the newly-adopted STRs using published data; assessed their forensic informativeness; and compiled the sequence characteristics, repeat structures and flanking regions of each STR. A further 44 autosomal STRs developed for forensic analyses but not incorporated into commercial kits, are also briefly described. Copyright © 2017 Elsevier B.V. All rights reserved.
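
    Forensic informativeness of an STR locus is commonly summarized by expected heterozygosity, H = 1 − Σp². A minimal sketch; the allele frequencies are invented, and this is a generic population-genetics measure, not necessarily the exact statistic used in the audit:

```python
def expected_heterozygosity(freqs):
    """H_exp = 1 - sum(p_i^2) over the allele frequencies p_i at one locus."""
    assert abs(sum(freqs) - 1.0) < 1e-9, "allele frequencies must sum to 1"
    return 1.0 - sum(p * p for p in freqs)

# hypothetical allele frequencies at a single STR locus
print(expected_heterozygosity([0.4, 0.3, 0.2, 0.1]))  # ≈ 0.70
```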

  16. Research and Practice of the News Map Compilation Service

    Science.gov (United States)

    Zhao, T.; Liu, W.; Ma, W.

    2018-04-01

    Based on the needs of the news media on the map, this paper researches the news map compilation service: it conducts demand research on the service of compiling news maps, designs and compiles a public-authority base map suitable for media publication, and constructs a news base-map material library. It studies the compilation of domestic and international news maps with timeliness, strong pertinence, and cross-regional characteristics, constructs a hot-news thematic gallery and news map customization services, conducts research on types of news maps, establishes closer liaison and cooperation methods with news media, and guides news media to use correct maps. Through the practice of the news map compilation service, this paper lists two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the research situation of the news map compilation service, and puts forward outstanding problems and development suggestions for the service.

  17. RESEARCH AND PRACTICE OF THE NEWS MAP COMPILATION SERVICE

    Directory of Open Access Journals (Sweden)

    T. Zhao

    2018-04-01

    Full Text Available Based on the needs of the news media on the map, this paper researches the news map compilation service: it conducts demand research on the service of compiling news maps, designs and compiles a public-authority base map suitable for media publication, and constructs a news base-map material library. It studies the compilation of domestic and international news maps with timeliness, strong pertinence, and cross-regional characteristics, constructs a hot-news thematic gallery and news map customization services, conducts research on types of news maps, establishes closer liaison and cooperation methods with news media, and guides news media to use correct maps. Through the practice of the news map compilation service, this paper lists two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the research situation of the news map compilation service, and puts forward outstanding problems and development suggestions for the service.

  18. abc the aspectBench compiler for aspectJ a workbench for aspect-oriented programming language and compilers research

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    Aspect-oriented programming (AOP) is gaining popularity as a new way of modularising cross-cutting concerns. The aspectbench compiler (abc) is a new workbench for AOP research which provides an extensible research framework for both new language features and new compiler optimisations. This poste...

  19. A compilation of energy costs of physical activities.

    Science.gov (United States)

    Vaz, Mario; Karaolis, Nadine; Draper, Alizon; Shetty, Prakash

    2005-10-01

    There were two objectives: first, to review the existing data on energy costs of specified activities in the light of the recommendations made by the Joint Food and Agriculture Organization/World Health Organization/United Nations University (FAO/WHO/UNU) Expert Consultation of 1985. Second, to compile existing data on the energy costs of physical activities for an updated annexure of the current Expert Consultation on Energy and Protein Requirements. Electronic and manual search of the literature (predominantly English) to obtain published data on the energy costs of physical activities. The majority of the data prior to 1955 were obtained using an earlier compilation of Passmore and Durnin. Energy costs were expressed as physical activity ratio (PAR); the energy cost of the activity divided by either the measured or predicted basal metabolic rate (BMR). The compilation provides PARs for an expanded range of activities that include general personal activities, transport, domestic chores, occupational activities, sports and other recreational activities for men and women, separately, where available. The present compilation is largely in agreement with the 1985 compilation, for activities that are common to both compilations. The present compilation has been based on the need to provide data on adults for a wide spectrum of human activity. There are, however, lacunae in the available data for many activities, between genders, across age groups and in various physiological states.
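
    The PAR convention described above translates directly into arithmetic. A minimal sketch with invented numbers:

```python
def activity_energy_cost(par, bmr_kj_per_min, minutes):
    """Energy cost of an activity from its physical activity ratio (PAR):
    PAR = activity energy cost / basal metabolic rate (BMR),
    so cost = PAR * BMR * duration."""
    return par * bmr_kj_per_min * minutes

# hypothetical example: PAR 3.5 activity, BMR 4.9 kJ/min, 30 minutes
print(activity_energy_cost(3.5, 4.9, 30))  # ≈ 514.5 kJ
```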

  20. Fiscal 2000 report on advanced parallelized compiler technology. Outlines; 2000 nendo advanced heiretsuka compiler gijutsu hokokusho (Gaiyo hen)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    Research and development was carried out on automatic parallelizing compiler technology to improve the practical performance, cost/performance ratio, and ease of operation of the multiprocessor systems now used for constructing supercomputers and expected to provide a fundamental architecture for microprocessors in the 21st century. Efforts were made to develop an automatic multigrain parallelization technology for extracting multigrain parallelism from a program and making full use of it, and a parallelizing tuning technology for accelerating parallelization by feeding back to the compiler the dynamic information and user knowledge acquired during execution. Moreover, a benchmark program was selected and studies were made to set execution rules and evaluation indexes for the establishment of technologies for objectively evaluating the performance of parallelizing compilers for existing commercial parallel processing computers, which was achieved through the implementation and evaluation of the 'Advanced parallelizing compiler technology research and development project.' (NEDO)

  1. Regulatory and technical reports compilation for 1980

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzi, L.

    1981-04-01

    This compilation lists formal regulatory and technical reports and conference proceedings issued in 1980 by the US Nuclear Regulatory Commission. The compilation is divided into four major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The second major section of this compilation consists of a key-word index to report titles. The third major section contains an alphabetically arranged listing of contractor report numbers cross-referenced to their corresponding NRC report numbers. Finally, the fourth section is an errata supplement

  2. Aeromagnetic maps of the Colorado River region including the Kingman, Needles, Salton Sea, and El Centro 1 degree by 2 degrees quadrangles, California, Arizona, and Nevada

    Science.gov (United States)

    Mariano, John; Grauch, V.J.

    1988-01-01

    Aeromagnetic data for the Colorado River region have been compiled as part of the Pacific to Arizona Crustal Experiment (PACE) Project. The data are presented here in a series of six compilations for the Kingman, Needles, Salton Sea, and El Centro 1 degree by 2 degrees quadrangles, California, Arizona, and Nevada, at scales of 1:250,000 and 1:750,000. The scales and map areas are identical to those used by Mariano and others (1986) to display the Bouguer and isostatic residual gravity for this region. Data were compiled separately for the Kingman quadrangle, the Needles quadrangle, and an area covering the Salton Sea quadrangle and part of the El Centro quadrangle.

  3. Compilation of benchmark results for fusion related Nuclear Data

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Wada, Masayuki; Oyama, Yukio; Ichihara, Chihiro; Makita, Yo; Takahashi, Akito

    1998-11-01

    This report compiles results of benchmark tests for validation of evaluated nuclear data to be used in nuclear designs of fusion reactors. Parts of the results were obtained under activities of the Fusion Neutronics Integral Test Working Group organized by the members of both the Japan Nuclear Data Committee and the Reactor Physics Committee. The following three benchmark experiments were employed for the tests: (i) the leakage neutron spectrum measurement experiments from slab assemblies at the D-T neutron source at FNS/JAERI, (ii) in-situ neutron and gamma-ray measurement experiments (so-called clean benchmark experiments) also at FNS, and (iii) the pulsed sphere experiments for leakage neutron and gamma-ray spectra at the D-T neutron source facility of Osaka University, OKTAVIAN. The evaluated nuclear data tested were JENDL-3.2, JENDL Fusion File, FENDL/E-1.0 and newly selected data for FENDL/E-2.0. Comparisons of benchmark calculations with the experiments for twenty-one elements, i.e., Li, Be, C, N, O, F, Al, Si, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zr, Nb, Mo, W and Pb, are summarized. (author). 65 refs

  4. Regulatory and technical reports: compilation for 1975-1978

    International Nuclear Information System (INIS)

    1982-04-01

    This brief compilation lists formal reports issued by the US Nuclear Regulatory Commission in 1975 through 1978 that were not listed in the Regulatory and Technical Reports Compilation for 1975 to 1978, NUREG-0304, Vol. 3. This compilation is divided into two sections. The first consists of a sequential listing of all reports in report-number order. The second section consists of an index developed from keywords in report titles and abstracts

  5. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-10-01

    This is the fourth issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section every year. The material contained in this compilation is sorted according to eight subject categories: General compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes, half-lives, energies and spectra; nuclear decay processes, gamma-rays; nuclear decay processes, fission products; nuclear decay processes (others); atomic processes

  6. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables.Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees.With the proliferation of open source, understanding these issues is increasingly the res

  7. Compilation of Sandia Laboratories technical capabilities

    International Nuclear Information System (INIS)

    Lundergan, C.D.; Mead, P.L.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078)

  8. Compilation of Sandia Laboratories technical capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lundergan, C. D.; Mead, P. L. [eds.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078). (RWR)

  9. VFC: The Vienna Fortran Compiler

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    1999-01-01

    Full Text Available High Performance Fortran (HPF) offers an attractive high‐level language interface for programming scalable parallel architectures, providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC), a new source‐to‐source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influences a program’s performance. This comprises data locality assertions, non‐local access specifications and the possibility of reusing runtime‐generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high‐level data parallel language such as HPF+ a performance close to that of hand‐written message‐passing programs can be achieved even for highly irregular codes.

  10. Extension of Alvis compiler front-end

    Energy Technology Data Exchange (ETDEWEB)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl [AGH University of Science and Technology, Department of Applied Computer Science, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2015-12-31

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. The semantics of an Alvis model is expressed as an LTS graph (labelled transition system): execution of any language statement is a transition between formally defined states of the model. An LTS graph is generated from a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as a part of the Alvis language to define parameters’ types and operations on them. Thanks to the compiler’s modular construction, many aspects of the compilation of an Alvis model may be modified. Providing new plugins for the Alvis compiler that support languages like Java or C makes it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler’s internal model and describes how the default specification language can be altered by new plugins.
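
    The LTS view of a model can be sketched as states plus labelled transitions with a reachability check. The states and labels below are invented, and this is not the Alvis compiler's API:

```python
# A minimal labelled transition system (LTS): formally defined states,
# labelled transitions, and breadth-first reachability over them.
from collections import deque

transitions = {
    ("s0", "start"): "s1",
    ("s1", "send"): "s2",
    ("s2", "ack"): "s1",
    ("s1", "stop"): "s3",
}

def reachable(initial):
    """All states reachable from `initial` via the labelled transitions."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        for (src, _label), dst in transitions.items():
            if src == state and dst not in seen:
                seen.add(dst)
                queue.append(dst)
    return seen

print(sorted(reachable("s0")))  # ['s0', 's1', 's2', 's3']
```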

  11. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    , block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs......-directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter....
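
    The first Futamura projection mentioned above, compiling by specializing an interpreter to a fixed program, can be sketched for a toy straight-line language (illustrative only, far smaller than the language treated in the paper):

```python
# Compiling by specializing an interpreter (first Futamura projection),
# in miniature: the interpreter's dispatch loop is residualized away.

def interpret(program, x):
    """A tiny interpreter for straight-line arithmetic programs."""
    for op, arg in program:
        if op == "add":
            x = x + arg
        elif op == "mul":
            x = x * arg
    return x

def specialize(program):
    """Specialize the interpreter to a *fixed* program: the loop and
    dispatch disappear, leaving straight-line residual code."""
    body = ["def compiled(x):"]
    for op, arg in program:
        body.append(f"    x = x {'+' if op == 'add' else '*'} {arg}")
    body.append("    return x")
    env = {}
    exec("\n".join(body), env)
    return env["compiled"]

prog = [("add", 2), ("mul", 3)]
compiled = specialize(prog)
print(interpret(prog, 5), compiled(5))  # both print 21
```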

  12. Semantics-based compiling: A case study in type-directed partial evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    , block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs......-directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter....

  13. A Symmetric Approach to Compilation and Decompilation

    DEFF Research Database (Denmark)

    Ager, Mads Sig; Danvy, Olivier; Goldberg, Mayer

    2002-01-01

    Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice of...

  14. Compilation of current high energy physics experiments

    International Nuclear Information System (INIS)

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche

  15. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  16. Gravity data from the San Pedro River Basin, Cochise County, Arizona

    Science.gov (United States)

    Kennedy, Jeffrey R.; Winester, Daniel

    2011-01-01

    The U.S. Geological Survey, Arizona Water Science Center in cooperation with the National Oceanic and Atmospheric Administration, National Geodetic Survey has collected relative and absolute gravity data at 321 stations in the San Pedro River Basin of southeastern Arizona since 2000. Data are of three types: observed gravity values and associated free-air, simple Bouguer, and complete Bouguer anomaly values, useful for subsurface-density modeling; high-precision relative-gravity surveys repeated over time, useful for aquifer-storage-change monitoring; and absolute-gravity values, useful as base stations for relative-gravity surveys and for monitoring gravity change over time. The data are compiled, without interpretation, in three spreadsheet files. Gravity values, GPS locations, and driving directions for absolute-gravity base stations are presented as National Geodetic Survey site descriptions.
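
    The free-air and simple Bouguer values mentioned above follow standard reduction formulas. A sketch with the conventional constants and invented station values (not data from this survey):

```python
# Standard gravity reductions (constants in mGal; density in Mg/m^3).
FREE_AIR_GRAD = 0.3086   # mGal per metre of elevation
BOUGUER_COEF = 0.04193   # 2*pi*G in mGal per metre, per Mg/m^3 of density

def free_air_anomaly(g_obs, g_theoretical, elev_m):
    """Observed minus theoretical gravity, corrected for station elevation."""
    return g_obs - g_theoretical + FREE_AIR_GRAD * elev_m

def simple_bouguer_anomaly(g_obs, g_theoretical, elev_m, density=2.67):
    """Free-air anomaly minus the attraction of an infinite rock slab
    between the station and the datum."""
    faa = free_air_anomaly(g_obs, g_theoretical, elev_m)
    return faa - BOUGUER_COEF * density * elev_m

# hypothetical station at 1200 m elevation
print(free_air_anomaly(978700.0, 978800.0, 1200.0))
print(simple_bouguer_anomaly(978700.0, 978800.0, 1200.0))
```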

  17. Writing Compilers and Interpreters A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

    Long-awaited revision to a unique guide that covers both compilers and interpreters Revised, updated, and now focusing on Java instead of C++, this long-awaited, latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You?ll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  18. Clarifying the interplate main tectonic elements of Western Anatolia, Turkey by using GNSS velocities and Bouguer gravity anomalies

    Science.gov (United States)

    Çırmık, Ayça; Pamukçu, Oya

    2017-10-01

    In this study, GNSS and gravity data were processed and compared together to examine the continental structures of the Western Anatolia region, which has a very complicated tectonism. The GNSS data of three national projects were processed, and GNSS velocities were found to be approximately 25 mm per year towards the southwest with respect to the Eurasia-fixed frame. In order to investigate the interplate motions of the region, the Anatolian and Aegean block solutions were calculated, and differences in the directions and amplitudes of the velocities were observed particularly in the Anatolian block solution. Based on the Anatolian block solutions, the study area was grouped into three regions and compared with the tectonic structures, for the first time for Western Anatolia, in this study. Additionally, W-E and N-S relative GNSS solutions were obtained for observing the possible tectonic borders of the study area. Besides, 2nd-order horizontal derivative and low-pass filter methods were applied to the Bouguer gravity anomalies, and the results of the gravity applications and the changes at the crust-mantle interface were compared with the GNSS horizontal velocities.
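
    A 2nd-order horizontal-derivative enhancement of a gridded Bouguer anomaly can be sketched as a finite-difference Laplacian, which suppresses smooth regional trends and sharpens the edges of density contrasts. This is a pure-Python illustration, not the authors' processing chain:

```python
def horizontal_second_derivative(grid, spacing):
    """5-point Laplacian d2g/dx2 + d2g/dy2 of a 2-D anomaly grid
    (interior nodes only; edges are left as zero)."""
    ny, nx = len(grid), len(grid[0])
    out = [[0.0] * nx for _ in range(ny)]
    for i in range(1, ny - 1):
        for j in range(1, nx - 1):
            out[i][j] = (grid[i - 1][j] + grid[i + 1][j]
                         + grid[i][j - 1] + grid[i][j + 1]
                         - 4 * grid[i][j]) / spacing ** 2
    return out

# a smooth linear regional trend has a zero Laplacian everywhere inside
regional = [[10 + x for x in range(5)] for _ in range(5)]
print(horizontal_second_derivative(regional, 1.0)[2][2])  # 0.0
```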

  19. Bouguer gravity trends and crustal structure of the Palmyride Mountain belt and surrounding northern Arabian platform in Syria

    Science.gov (United States)

    Best, John A.; Barazangi, Muawia; Al-Saad, Damen; Sawaf, Tarif; Gebran, Ali

    1990-12-01

    This study examines the crustal structure of the Palmyrides and the northern Arabian platform in Syria by two- and three-dimensional modeling of the Bouguer gravity anomalies. Results of the gravity modeling indicate that (1) western Syria is composed of at least two different crustal blocks, (2) the southern crustal block is penetrated by a series of crustal-scale, high-density intrusive complexes, and (3) short-wavelength gravity anomalies in the southwest part of the mountain belt are clearly related to basement structure. The crustal thickness in Syria, as modeled on the gravity profiles, is approximately 40 ±4 km, which is similar to crustal thicknesses interpreted from refraction data in Jordan and Saudi Arabia. The different crustal blocks and large-scale mafic intrusions are best explained, though not uniquely, by Proterozoic convergence and suturing and early Paleozoic rifting, as interpreted in the exposed rocks of the Arabian shield. These two processes, combined with documented Mesozoic rifting and Cenozoic transpression, compose the crustal evolution of the northern Arabian platform beneath Syria.
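
    A quick consistency check on such crustal-scale modelling uses the infinite-slab approximation, Δg = 2πGΔρΔh, relating a Bouguer anomaly change to Moho relief. The density contrast below is an assumed illustrative value, not one taken from the study:

```python
import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
DRHO = 400.0     # assumed crust-mantle density contrast, kg/m^3

def slab_anomaly_mgal(delta_h_m):
    """Gravity effect (mGal) of thickening the crust by delta_h metres:
    denser mantle is replaced by crust, so the anomaly decreases."""
    return -2 * math.pi * G * DRHO * delta_h_m * 1e5   # 1 m/s^2 = 1e5 mGal

print(round(slab_anomaly_mgal(4000.0), 1))  # ≈ -67.1 for 4 km of relief
```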

  20. Bouguer gravity trends and crustal structure of the Palmyride Mountain belt and surrounding northern Arabian platform in Syria

    Energy Technology Data Exchange (ETDEWEB)

    Best, J.A.; Barazangi, M. (Cornell Univ., Ithaca, NY (USA)); Al-Saad, D.; Sawaf, T.; Gebran, A. (Syrian Petroleum Company, Damascus (Syria))

    1990-12-01

    This study examines the crustal structure of the Palmyrides and the northern Arabian platform in Syria by two- and three-dimensional modeling of the Bouguer gravity anomalies. Results of the gravity modeling indicate that (1) western Syria is composed of at least two different crustal blocks, (2) the southern crustal block is penetrated by a series of crustal-scale, high-density intrusive complexes, and (3) short-wavelength gravity anomalies in the southwest part of the mountain belt are clearly related to basement structure. The crustal thickness in Syria, as modeled on the gravity profiles, is approximately 40±4 km, which is similar to crustal thicknesses interpreted from refraction data in Jordan and Saudi Arabia. The different crustal blocks and large-scale mafic intrusions are best explained, though not uniquely, by Proterozoic convergence and suturing and early Paleozoic rifting, as interpreted in the exposed rocks of the Arabian shield. These two processes, combined with documented Mesozoic rifting and Cenozoic transpression, compose the crustal evolution of the northern Arabian platform beneath Syria.

  1. Promising Compilation to ARMv8 POP

    OpenAIRE

    Podkopaev, Anton; Lahav, Ori; Vafeiadis, Viktor

    2017-01-01

    We prove the correctness of compilation of relaxed memory accesses and release-acquire fences from the "promising" semantics of [Kang et al. POPL'17] to the ARMv8 POP machine of [Flur et al. POPL'16]. The proof is highly non-trivial because both the ARMv8 POP and the promising semantics provide some extremely weak consistency guarantees for normal memory accesses; however, they do so in rather different ways. Our proof of compilation correctness to ARMv8 POP strengthens the results of the Kan...

  2. Design of methodology for incremental compiler construction

    Directory of Open Access Journals (Sweden)

    Pavel Haluza

    2011-01-01

    Full Text Available The paper deals with possibilities of incremental compiler construction. It presents compiler construction possibilities both for languages with a fixed set of lexical units and for languages with a variable set of lexical units. The methodology design for incremental compiler construction is based on the known algorithms for standard compiler construction and is derived for both groups of languages. To the group of languages with a fixed set of lexical units belong languages where each lexical unit has a constant meaning, e.g., common programming languages. For this group of languages the paper tries to solve the problem of incremental semantic analysis, which is based on incremental parsing. In the group of languages with a variable set of lexical units (e.g., the professional typographic system TEX), it is possible to change arbitrarily the meaning of each character of the input file at any time during processing. The change takes effect immediately and its validity can be somehow limited or is given by the end of the input. For this group of languages the paper tries to solve the case when macros temporarily change the category of arbitrary characters.
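
    The variable-lexical-unit problem described above can be sketched with a TeX-like lexer whose character-category table mutates mid-stream, so the same character lexes differently later in the input. The `\catcode` syntax here is a simplified invention, not TeX's actual one:

```python
def lex(text, catcodes):
    """Return (category, char) pairs; a control sequence mutates catcodes."""
    tokens, i = [], 0
    while i < len(text):
        ch = text[i]
        if ch == "\\" and text[i + 1:i + 8] == "catcode":
            # e.g. "\catcode$=letter " reassigns '$' for the rest of the input
            target = text[i + 8]
            cat = text[i + 10:].split(" ", 1)[0]
            catcodes[target] = cat
            i += 10 + len(cat) + 1   # skip "\catcode", target, '=', cat, space
        else:
            tokens.append((catcodes.get(ch, "other"), ch))
            i += 1
    return tokens

cats = {"$": "math-shift"}
toks = lex("$a\\catcode$=letter $a", cats)
print(toks[0], toks[-2])  # the same '$' character, two different categories
```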

  3. Compiling the First Monolingual Lusoga Dictionary

    Directory of Open Access Journals (Sweden)

    Minah Nabirye

    2011-10-01

    Full Text Available

    Abstract: In this research article a study is made of the approach followed to compile the first-ever monolingual dictionary for Lusoga. Lusoga is a Bantu language spoken in Uganda by slightly over two million people. Being an under-resourced language, the Lusoga orthography had to be designed, a grammar written, and a corpus built, before embarking on the compilation of the dictionary. This compilation was aimed at attaining an academic degree, hence requiring a rigorous research methodology. Firstly, the prevailing methods for compiling dictionaries were mainly practical and insufficient in explaining the theoretical linguistic basis for dictionary compilation. Since dictionaries are based on meaning, the theory of meaning was used to account for all linguistic data considered in dictionaries. However, meaning is considered at a very abstract level, far removed from the process of compiling dictionaries. Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular theory explains how the different modules of a language contribute information to the different parts of the dictionary article or dictionary information in general. Secondly, the research also had to contend with the different approaches for analysing Bantu languages for Bantu and European audiences. A description of the Bantu- and European-centred approaches to Bantu studies was undertaken in respect of (a) the classification of Lusoga words, and (b) the specification of their citations. As a result, Lusoga lexicography deviates from the prevailing Bantu classification and citation of nouns, adjectives and verbs in particular. The dictionary was tested on two separate occasions and all the feedback was considered in the compilation process. This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary

  4. AICPA allows low-cost options for compiled financial statements.

    Science.gov (United States)

    Reinstein, Alan; Luecke, Randall W

    2002-02-01

    The AICPA Accounting and Review Services Committee's (ARSC) SSARS No. 8, Amendment to Statement on Standards for Accounting and Review Services No. 1, Compilation and Review of Financial Statements, issued in October 2000, allows financial managers to provide plain-paper, compiled financial statements for the exclusive use of management. Such financial statements were disallowed in 1979 when the AICPA issued SSARS No. 1, Compilation and Review of Financial Statements. With the issuance of SSARS No. 8, financial managers can prepare plain-paper, compiled financial statements when third parties are not expected to rely on the financial statements, management acknowledges such restrictions in writing, and management acknowledges its primary responsibility for the adequacy of the financial statements.

  5. Electronic circuits for communications systems: A compilation

    Science.gov (United States)

    1972-01-01

    The compilation of electronic circuits for communications systems is divided into thirteen basic categories, each representing an area of circuit design and application. The compilation items are moderately complex and, as such, would appeal to the applications engineer. However, the rationale for the selection criteria was tailored so that the circuits would reflect fundamental design principles and applications, with an additional requirement for simplicity whenever possible.

  6. Compilation Techniques Specific for a Hardware Cryptography-Embedded Multimedia Mobile Processor

    Directory of Open Access Journals (Sweden)

    Masa-aki FUKASE

    2007-12-01

    Full Text Available The development of single-chip VLSI processors is the key technology of ever-growing pervasive computing to answer overall demands for usability, mobility, speed, security, etc. We have so far developed a hardware cryptography-embedded multimedia mobile processor architecture, HCgorilla. Since HCgorilla integrates a wide range of techniques, from architectures to applications and languages, a one-sided design approach is not always useful. HCgorilla needs a more elaborate strategy, that is, hardware/software (H/S) codesign. Thus, we exploit the software support of HCgorilla, composed of a Java interface and parallelizing compilers. They are assumed to be installed in servers in order to reduce the load and increase the performance of HCgorilla-embedded clients. Since compilers are the essence of the software's responsibility, we focus in this article on our recent results on the design, specifications, and prototyping of parallelizing compilers for HCgorilla. The parallelizing compilers are composed of a multicore compiler and an LIW compiler. They are specified to extract parallelism from executable serial code or the Java interface output and to emit code executable in parallel by HCgorilla. The prototype compilers are written in Java. Evaluation using an arithmetic test program shows the reasonableness of the prototype compilers compared with hand compilation.

  7. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler's ability to generate loop-parallel code. We use this compilation system to modify two sequential benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should...
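The kind of refactoring such a feedback system proposes can be illustrated with a loop-carried accumulation rewritten as a reduction. The sketch below is a minimal Python analogue (threads are used only for brevity; the benchmarks in the record are native code, and the function names here are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

def sequential_sum_of_squares(xs):
    # Loop-carried dependence: every iteration reads the accumulator
    # written by the previous one, which blocks naive parallelization.
    total = 0.0
    for x in xs:
        total += x * x
    return total

def parallel_sum_of_squares(xs, workers=4):
    # Refactored as a reduction: each chunk is summed independently,
    # then the partial sums are combined, so chunks can run in parallel.
    chunks = [xs[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(sequential_sum_of_squares, chunks)
    return sum(partials)
```

The reduction form is what makes the loop iterations independent; a parallelizing compiler for native code would apply the same transformation before emitting vector or multi-threaded code.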

  8. New gravity anomaly map of Taiwan and its surrounding regions with some tectonic interpretations

    Science.gov (United States)

    Doo, Wen-Bin; Lo, Chung-Liang; Hsu, Shu-Kun; Tsai, Ching-Hui; Huang, Yin-Sheng; Wang, Hsueh-Fen; Chiu, Shye-Donq; Ma, Yu-Fang; Liang, Chin-Wei

    2018-04-01

    In this study, we compiled recently collected (from 2005 to 2015) and previously reported (published and open access) gravity data, including land, shipborne and satellite-derived data, for Taiwan and its surrounding regions. Based on the cross-over error analysis, all data were adjusted, and new Free-air gravity anomalies were obtained, shedding light on the tectonics of the region. To obtain the Bouguer gravity anomalies, the densities of land terrain and marine sediments were assumed to be 2.53 and 1.80 g/cm3, respectively. The updated gravity dataset was gridded with a spacing of one arc-minute. Several previously unnoticed gravity features are revealed by the new maps and can be used in a broad range of applications: (1) An isolated gravity high is located between the Shoushan and the Kaoping Canyon off southwest Taiwan. (2) Along the Luzon Arc, both Free-air and Bouguer gravity anomaly maps reveal a significant gravity discontinuity at the latitude of 21°20′N. (3) In the southwestern Okinawa Trough, the NE-SW trending cross-back-arc volcanic trail (CBVT) marks the boundary between low and high gravity anomalies (both Free-air and Bouguer).
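The Bouguer reduction applied in records like this one subtracts the attraction of the rock slab between station and datum from the free-air anomaly, BA = FAA - 2*pi*G*rho*h. A minimal sketch of the simple (infinite-slab) correction, with the land density quoted in the record; the function name is illustrative, and the terrain corrections a full study applies are omitted:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def simple_bouguer_anomaly(free_air_mgal, height_m, density_kg_m3):
    """Subtract the infinite-slab (Bouguer plate) attraction
    2*pi*G*rho*h from a free-air anomaly. 1 m/s^2 = 1e5 mGal."""
    plate_mgal = 2.0 * math.pi * G * density_kg_m3 * height_m * 1e5
    return free_air_mgal - plate_mgal

# Land terrain density from the record: 2.53 g/cm^3 = 2530 kg/m^3.
```

For the standard crustal density 2.67 g/cm^3 the slab term is about 0.112 mGal per metre of elevation, the familiar Bouguer gradient.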

  9. 49 CFR 801.57 - Records compiled for law enforcement purposes.

    Science.gov (United States)

    2010-10-01

    49 CFR 801.57 (revised as of 2010-10-01): Pursuant to 5 U.S.C. 552(b)(7), records compiled for law enforcement purposes may be withheld where disclosure would reveal investigative procedures and practices, or would endanger the life or security of law...

  10. An Efficient Compiler for Weighted Rewrite Rules

    OpenAIRE

    Mohri, Mehryar; Sproat, Richard

    1996-01-01

    Context-dependent rewrite rules are used in many areas of natural language and speech processing. Work in computational phonology has demonstrated that, given certain conditions, such rewrite rules can be represented as finite-state transducers (FSTs). We describe a new algorithm for compiling rewrite rules into FSTs. We show the algorithm to be simpler and more efficient than existing algorithms. Further, many of our applications demand the ability to compile weighted rules into weighted FST...
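A context-dependent rewrite rule phi -> psi / lambda _ rho rewrites phi as psi only between a left context lambda and a right context rho. As an illustrative stand-in for the paper's FST compilation (this is not the authors' algorithm, and it handles a single unweighted rule), one such rule can be applied directly with regular-expression lookaround:

```python
import re

def apply_rewrite_rule(s, phi, psi, left, right):
    """Apply the context-dependent rule phi -> psi / left _ right:
    rewrite phi as psi only when preceded by left and followed by
    right. Lookaround keeps both contexts in the output string."""
    pattern = ""
    if left:
        pattern += "(?<=" + re.escape(left) + ")"
    pattern += re.escape(phi)
    if right:
        pattern += "(?=" + re.escape(right) + ")"
    return re.sub(pattern, psi, s)

# e.g. nasal assimilation n -> m / _ b (empty left context):
# apply_rewrite_rule("canbe", "n", "m", "", "b") returns "cambe"
```

For weighted rules, iterated application, or composition of many rules, the transducer representation the paper compiles to is the appropriate machinery; this direct form only illustrates the semantics of one rule.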

  11. HOPE: Just-in-time Python compiler for astrophysical computations

    Science.gov (United States)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of a compiled implementation.
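The decorator-based usage described above can be sketched in plain Python. The stand-in below imitates only the dispatch mechanics of such a JIT (the "compilation" step is a labeled placeholder, not a real Python-to-C++ translation, and the decorator name is illustrative):

```python
import functools

def jit(fn):
    """Illustrative stand-in for a JIT decorator in the style the
    record describes: the first call triggers 'compilation' of the
    function body (here we only record that it happened), and all
    calls dispatch through the compiled version."""
    state = {"compiled": None}

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        if state["compiled"] is None:
            # A real JIT would translate the AST/bytecode to C++ here
            # and load the native implementation in its place.
            state["compiled"] = fn
        return state["compiled"](*args, **kwargs)

    wrapper.is_jitted = True
    return wrapper

@jit
def polynomial(x):
    return 3.0 * x * x + 2.0 * x + 1.0
```

The point of the pattern is that the numerical code itself stays ordinary Python; only the one-line decorator marks it for compilation.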

  12. Compiling the First Monolingual Lusoga Dictionary | Nabirye | Lexikos

    African Journals Online (AJOL)

    Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular ... This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary. Keywords: lexicography ...

  13. Data compilation for particle-impact desorption, 2

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeutchi, Fujio.

    1985-07-01

    The particle impact desorption is one of the elementary processes of hydrogen recycling in controlled thermonuclear fusion reactors. We have surveyed the literature concerning the ion impact desorption and photon stimulated desorption published through the end of 1984 and compiled the data on the desorption cross sections and yields with the aid of a computer. This report presents the results of the compilation in graphs and tables as functions of incident energy, surface temperature and surface coverage. (author)

  14. Materials and process engineering projects for the Sandia National Laboratories/Newly Independent States Industrial Partnering Program. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Zanner, F.J.; Moffatt, W.C.

    1995-07-01

    In July 1994, a team of materials specialists from Sandia and U.S. industry traveled to Russia and the Ukraine to select and fund projects in materials and process technology in support of the Newly Independent States/Industrial Partnering Program (NIS/IPP). All of the projects are collaborations with scientists and engineers at NIS institutes. Each project is scheduled to last one year, and the deliverables are formatted to supply U.S. industry with information which will enable rational decisions to be made regarding the commercial value of these technologies. This work is an unedited interim compilation of the deliverables received to date.

  15. Materials and process engineering projects for the Sandia National Laboratories/Newly Independent States Industrial Partnering Program. Volume 2

    International Nuclear Information System (INIS)

    Zanner, F.J.; Moffatt, W.C.

    1995-07-01

    In July 1994, a team of materials specialists from Sandia and U.S. industry traveled to Russia and the Ukraine to select and fund projects in materials and process technology in support of the Newly Independent States/Industrial Partnering Program (NIS/IPP). All of the projects are collaborations with scientists and engineers at NIS institutes. Each project is scheduled to last one year, and the deliverables are formatted to supply U.S. industry with information which will enable rational decisions to be made regarding the commercial value of these technologies. This work is an unedited interim compilation of the deliverables received to date.

  16. Materials and process engineering projects for the Sandia National Laboratories/Newly Independent States Industrial Partnering Program. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Zanner, F.J.; Moffatt, W.C.

    1995-07-01

    In July 1994, a team of materials specialists from Sandia and U.S. industry traveled to Russia and the Ukraine to select and fund projects in materials and process technology in support of the Newly Independent States/Industrial Partnering Program (NIS/IPP). All of the projects are collaborations with scientists and engineers at NIS institutes. Each project is scheduled to last one year, and the deliverables are formatted to supply U.S. industry with information which will enable rational decisions to be made regarding the commercial value of these technologies. This work is an unedited interim compilation of the deliverables received to date.

  17. Gravity Maps of Antarctic Lithospheric Structure from Remote-Sensing and Seismic Data

    Science.gov (United States)

    Tenzer, Robert; Chen, Wenjin; Baranov, Alexey; Bagherbandi, Mohammad

    2018-02-01

    Remote-sensing data from altimetry and gravity satellite missions combined with seismic information have been used to investigate the Earth's interior, particularly focusing on the lithospheric structure. In this study, we use the subglacial bedrock relief BEDMAP2, the global gravitational model GOCO05S, and the ETOPO1 topographic/bathymetric data, together with a newly developed (continental-scale) seismic crustal model for Antarctica to compile the free-air, Bouguer, and mantle gravity maps over this continent and surrounding oceanic areas. We then use these gravity maps to interpret the Antarctic crustal and uppermost mantle structure. We demonstrate that most of the gravity features seen in gravity maps could be explained by known lithospheric structures. The Bouguer gravity map reveals a contrast between the oceanic and continental crust which marks the extension of the Antarctic continental margins. The isostatic signature in this gravity map confirms deep and compact orogenic roots under the Gamburtsev Subglacial Mountains and more complex orogenic structures under Dronning Maud Land in East Antarctica. Whereas the Bouguer gravity map exhibits features which are closely spatially correlated with the crustal thickness, the mantle gravity map reveals mainly the gravitational signature of the uppermost mantle, which is superposed over a weaker (long-wavelength) signature of density heterogeneities distributed deeper in the mantle. In contrast to a relatively complex and segmented uppermost mantle structure of West Antarctica, the mantle gravity map confirmed a more uniform structure of the East Antarctic Craton. The most pronounced features in this gravity map are divergent tectonic margins along mid-oceanic ridges and continental rifts. Gravity lows at these locations indicate that a broad region of the West Antarctic Rift System continuously extends between the Atlantic-Indian and Pacific-Antarctic mid-oceanic ridges and it is possibly formed by two major

  18. An Initial Evaluation of the NAG f90 Compiler

    Directory of Open Access Journals (Sweden)

    Michael Metcalf

    1992-01-01

    Full Text Available A few weeks before the formal publication of the ISO Fortran 90 Standard, NAG announced the world's first f90 compiler. We have evaluated the compiler by using it to assess the impact of Fortran 90 on the CERN Program Library.

  19. Compiled MPI: Cost-Effective Exascale Applications Development

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Quinlan, D; Lumsdaine, A; Hoefler, T

    2012-04-10

    The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over

  20. Solidify, An LLVM pass to compile LLVM IR into Solidity

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-12

    The software currently compiles LLVM IR into Solidity (Ethereum’s dominant programming language) using LLVM’s pass library. Specifically, this compiler allows us to convert an arbitrary DSL into Solidity. We focus specifically on converting Domain Specific Languages into Solidity due to their ease of use, and provable properties. By creating a toolchain to compile lightweight domain-specific languages into Ethereum's dominant language, Solidity, we allow non-specialists to effectively develop safe and useful smart contracts. For example, lawyers from a certain firm can have a proprietary DSL that codifies basic laws safely converted to Solidity to be securely executed on the blockchain. In another example, a simple provenance tracking language can be compiled and securely executed on the blockchain.
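The DSL-to-Solidity idea can be illustrated with a toy translator that emits Solidity source text for a restricted arithmetic DSL. This is a hypothetical sketch only (the record's tool lowers LLVM IR via a compiler pass, not DSL strings, and the function and DSL here are invented for illustration):

```python
def compile_to_solidity(name, expr):
    """Toy illustration of lowering a tiny DSL into Solidity source
    text. `expr` is an arithmetic DSL expression over the parameters
    a and b using + - * and parentheses only; restricting the input
    language is what makes properties of the output easy to argue."""
    allowed = set("ab+-* ()0123456789")
    if not set(expr) <= allowed:
        raise ValueError("unsupported token in DSL expression")
    return (
        "function " + name + "(uint256 a, uint256 b) public pure returns (uint256) {\n"
        "    return " + expr + ";\n"
        "}"
    )
```

Even this toy shows the appeal described in the abstract: a deliberately small source language can be checked exhaustively before any Solidity is emitted.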

  1. Compilation of current high-energy-physics experiments

    International Nuclear Information System (INIS)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1980-04-01

    This is the third edition of a compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and ten participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about January 1980, and (2) had not completed taking of data by 1 January 1976

  2. High resolution Slovak Bouguer gravity anomaly map and its enhanced derivative transformations: new possibilities for interpretation of anomalous gravity fields

    Science.gov (United States)

    Pašteka, Roman; Zahorec, Pavol; Kušnirák, David; Bošanský, Marián; Papčo, Juraj; Szalaiová, Viktória; Krajňák, Martin; Marušiak, Ivan; Mikuška, Ján; Bielik, Miroslav

    2017-06-01

    The paper deals with the revision and enrichment of the present gravimetric database of the Slovak Republic. The output of this process is a new version of the complete Bouguer anomaly (CBA) field for the territory. Thanks to more accurate terrain corrections, this field has significantly higher quality and higher resolution capabilities. The excellent features of this map will allow us to re-evaluate and improve the qualitative interpretation of the gravity field in research on the structural and tectonic geology of the Western Carpathian lithosphere. In the contribution we also analyse the new CBA field by means of various transformed fields - in particular the horizontal gradient, whose local maxima delineate important density boundaries in the lateral direction. All original and new transformed maps make a significant contribution to improving the geological interpretation of the CBA field. Besides the horizontal gradient field, we also apply a new special transformation, TDXAS, which excellently separates the various detected gravity anomalies and improves their lateral delimitation.
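The horizontal-gradient transformation used in such studies can be sketched numerically. Assuming a regularly gridded anomaly field, a minimal numpy version (an illustrative stand-in, not the authors' processing chain) is:

```python
import numpy as np

def horizontal_gradient(grid, dx=1.0, dy=1.0):
    """Horizontal-gradient magnitude of a gridded anomaly field:
    HG = sqrt((dG/dx)^2 + (dG/dy)^2). Local maxima of HG trace
    lateral density boundaries."""
    dgdy, dgdx = np.gradient(grid, dy, dx)  # axis 0 = y, axis 1 = x
    return np.hypot(dgdx, dgdy)
```

A sharp step in the anomaly field produces a ridge of high HG values along the step, which is why gradient maxima are read as density-contact locations.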

  3. SVM Support in the Vienna Fortran Compilation System

    OpenAIRE

    Brezany, Peter; Gerndt, Michael; Sipkova, Viera

    1994-01-01

    Vienna Fortran, a machine-independent language extension to Fortran which allows the user to write programs for distributed-memory systems using global addresses, provides the forall-loop construct for specifying irregular computations that do not cause inter-iteration dependences. Compilers for distributed-memory systems generate code that is based on runtime analysis techniques and is only efficient if, in addition, aggressive compile-time optimizations are applied. Since these optimization...

  4. A Compilation of Internship Reports - 2012

    Energy Technology Data Exchange (ETDEWEB)

    Stegman M.; Morris, M.; Blackburn, N.

    2012-08-08

    This compilation documents all research projects undertaken by the 2012 summer Department of Energy - Workforce Development for Teachers and Scientists interns during their internship program at Brookhaven National Laboratory.

  5. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    in the style of denotational semantics; – the output of the generated compiler is effectively three-address code, in the fashion and efficiency of the Dragon Book; – the generated compiler processes several hundred lines of source code per second. The source language considered in this case study is imperative, block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs by specializing a definitional interpreter with respect to the program. Specialization is carried out using type-directed partial evaluation, which is a mild version of partial evaluation akin to lambda-calculus normalization. Our definitional interpreter follows the format of denotational semantics, with a clear...
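The first Futamura projection named in the abstract can be illustrated with a toy interpreter. In the hedged sketch below, a closure stands in for the residual "compiled" program; the paper's actual machinery is type-directed partial evaluation over a denotational-style interpreter, which would unfold the interpretive loop away entirely:

```python
def interpret(program, x):
    """A tiny interpreter for a straight-line language: each
    instruction is ('add', n) or ('mul', n), applied to x in order."""
    for op, n in program:
        if op == "add":
            x = x + n
        elif op == "mul":
            x = x * n
        else:
            raise ValueError("unknown op: " + op)
    return x

def specialize(program):
    """First Futamura projection, sketched: fix the interpreter's
    static input (the program), leaving a residual function of the
    remaining dynamic input only."""
    def compiled(x):
        # The loop over `program` now ranges over static data; a real
        # partial evaluator would unfold it into straight-line code.
        return interpret(program, x)
    return compiled

double_plus_one = specialize([("mul", 2), ("add", 1)])
```

Specializing a compiler generator to the interpreter instead (the second projection) yields a compiler, which is the step the paper's generated compilers correspond to.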

  6. Compilation of results 1987

    International Nuclear Information System (INIS)

    1987-01-01

    A compilation is carried out which presents, in concentrated form, reports on research and development within the nuclear energy field covering a two-and-a-half-year period. The previous report was edited in December 1984. The projects are presented with title, project number, responsible unit, contact person, and short result reports. The result reports consist of short summaries of each project. (L.F.)

  7. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    Science.gov (United States)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique to provide rapid recovery from transient processor failures and was implemented in hardware by researchers and in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations were conducted which indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.

  8. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  9. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    The material contained in this compilation is sorted according to eight subject categories: 1. General Compilations; 2. Basic Isotopic Properties; 3. Nuclear Structure Properties; 4. Nuclear Decay Processes: Half-lives, Energies and Spectra; 5. Nuclear Decay Processes: Gamma-rays; 6. Nuclear Decay Processes: Fission Products; 7. Nuclear Decay Processes: (Others); 8. Atomic Processes

  10. A Study on the Compatibility of 3-D Seismic Velocity Structures with Gravity Data of Taiwan

    Directory of Open Access Journals (Sweden)

    Horng-Yuan Yen Hsien-Hsiang Hsieh

    2010-01-01

    Full Text Available The Bouguer anomaly of Taiwan has been revised in this study based on more accurate terrain data provided by the Taiwanese Digital Terrain Model compiled by the Taiwan Forestry Bureau. Three seismic velocity models, those determined by Rau and Wu (1995), Kim et al. (2005), and Wu et al. (2007) respectively, were selected for our study. We converted their velocity models to density models using the relationship between P-wave velocity and rock density proposed by Ludwig et al. (1970) and Barton (1986), and then calculated their corresponding gravity anomalies. According to the correlation coefficient between the Bouguer anomalies calculated from the velocity models and the revised Bouguer anomalies, the Kim et al. model was more compatible with gravity data than the other two velocity models. The differences between the revised gravity anomaly and the calculated gravity anomalies trend toward positive values at elevations higher than 2000 m. This indicates that the velocities at the shallower depths beneath the mountainous area of the three models are overestimated, i.e., higher than the real velocities. This reasoning implies that the crustal thickness beneath the Central Range is less than 55 km which was obtained from the velocity models.
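The workflow in this record (convert P-wave velocity to density, forward-model the gravity, then score each model by its correlation with the observed anomaly) can be sketched as follows. Gardner's empirical rule is used here as an illustrative stand-in for the Ludwig et al. and Barton relationships cited in the abstract, and the function names are invented for illustration:

```python
import numpy as np

def density_from_vp(vp_km_s):
    """Illustrative velocity-to-density conversion using Gardner's
    empirical rule, rho ~ 1741 * Vp**0.25 (Vp in km/s, rho in
    kg/m^3); the study itself uses Ludwig et al. / Barton curves."""
    return 1741.0 * np.asarray(vp_km_s) ** 0.25

def model_fit(calculated_anomaly, observed_anomaly):
    """Pearson correlation between a calculated Bouguer anomaly and
    the observed one, the score used to rank velocity models."""
    return float(np.corrcoef(calculated_anomaly, observed_anomaly)[0, 1])
```

A velocity model whose forward-modelled anomaly correlates best with the revised Bouguer field is, by this criterion, the most compatible with the gravity data.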

  11. Crustal structure along the DESERT 2000 Transect inferred from 3-D gravity modelling

    Science.gov (United States)

    El-Kelani, R.; Goetze, H.; Rybakov, M.; Hassouneh, M.; Schmidt, S.

    2003-12-01

    A three-dimensional interpretation of the newly compiled Bouguer anomaly map is part of the DESERT 2000 Transect, a multi-disciplinary and multinational project studying, for the first time, the Dead Sea Transform (DST) fault system from the Mediterranean Sea to Saudi Arabia across the international border in the NW-SE direction. The negative Bouguer anomalies (with magnitudes reaching −130 mGal), located in the transform valley, are caused by internal sedimentary basins filled with low-density young sediments (~10 km). A high-resolution 3-D model constrained with the seismic results reveals a possible crustal thickness and density distribution beneath the DST valley. The inferred zone of intrusion coincides with the maximum gravity anomaly over the eastern flank of the DST. The intrusion is displaced at different sectors along the NW-SE direction. The zone of maximum crustal thinning (~30 km) is attained in the western sector at the Mediterranean. The southeastern plateau, on the other hand, shows by far the largest crustal thickness in the region (38-42 km). Linked to the left-lateral movement of ~105 km at the boundary between the African and Arabian plates, and constrained with the DESERT 2000 seismic data, a small asymmetric topography of the Moho beneath the DST was modelled. The thickness and density of the crust suggest that a continental crust underlies the DST. The deep basins, the relatively large nature of the intrusion and the asymmetric topography of the Moho lead to the conclusion that a small-scale asthenospheric upwelling(?) might be responsible for the thinning of the crust and subsequent rifting of the Dead Sea graben during the left-lateral movement.

  12. 1988 Bulletin compilation and index

    International Nuclear Information System (INIS)

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information

  13. 1988 Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  14. Automating Visualization Service Generation with the WATT Compiler

    Science.gov (United States)

    Bollig, E. F.; Lyness, M. D.; Erlebacher, G.; Yuen, D. A.

    2007-12-01

    As tasks and workflows become increasingly complex, software developers are devoting increasing attention to automation tools. Among many examples, the Automator tool from Apple collects components of a workflow into a single script, with very little effort on the part of the user. Tasks are most often described as a series of instructions. The granularity of the tasks dictates the tools to use. Compilers translate fine-grained instructions to assembler code, while scripting languages (ruby, perl) are used to describe a series of tasks at a higher level. Compilers can also be viewed as transformational tools: a cross-compiler can translate executable code written on one computer to assembler code understood on another, while transformational tools can translate from one high-level language to another. We are interested in creating visualization web services automatically, starting from stand-alone VTK (Visualization Toolkit) code written in Tcl. To this end, using the OCaml programming language, we have developed a compiler that translates Tcl into C++, including all the stubs, classes and methods to interface with gSOAP, a C++ implementation of the Soap 1.1/1.2 protocols. This compiler, referred to as the Web Automation and Translation Toolkit (WATT), is the first step towards automated creation of specialized visualization web services without input from the user. The WATT compiler seeks to automate all aspects of web service generation, including the transport layer, the division of labor and the details related to interface generation. The WATT compiler is part of ongoing efforts within the NSF funded VLab consortium [1] to facilitate and automate time-consuming tasks for the science related to understanding planetary materials. Through examples of services produced by WATT for the VLab portal, we will illustrate features, limitations and the improvements necessary to achieve the ultimate goal of complete and transparent automation in the generation of web

  15. Compilation of current high energy physics experiments - Sept. 1978

    Energy Technology Data Exchange (ETDEWEB)

    Addis, L.; Odian, A.; Row, G. M.; Ward, C. E. W.; Wanderer, P.; Armenteros, R.; Joos, P.; Groves, T. H.; Oyanagi, Y.; Arnison, G. T. J.; Antipov, Yu; Barinov, N.

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed data-taking by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche. (RWR)

  16. DrawCompileEvolve: Sparking interactive evolutionary art with human creations

    DEFF Research Database (Denmark)

    Zhang, Jinhong; Taarnby, Rasmus; Liapis, Antonios

    2015-01-01

    This paper presents DrawCompileEvolve, a web-based drawing tool which allows users to draw simple primitive shapes, group them together or define patterns in their groupings (e.g. symmetry, repetition). The user’s vector drawing is then compiled into an indirectly encoded genetic representation, which can be evolved interactively, allowing the user to change the image’s colors and patterns and ultimately transform it. The human artist has direct control while drawing the initial seed of an evolutionary run and indirect control while interactively evolving it, thus making DrawCompileEvolve a mixed...

  17. Vectorization vs. compilation in query execution

    NARCIS (Netherlands)

    J. Sompolski (Juliusz); M. Zukowski (Marcin); P.A. Boncz (Peter)

    2011-01-01

    Compiling database queries into executable (sub-)programs provides substantial benefits compared to traditional interpreted execution. Many of these benefits, such as reduced interpretation overhead, better instruction-code locality, and opportunities to use SIMD
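    The interpretation overhead at issue can be sketched with a toy example (illustrative only, not the paper's system): summing a column tuple-at-a-time pays a dispatch cost per value, while a vectorized kernel pays it once per block.

```python
def interpreted_sum(column):
    # Tuple-at-a-time: one "dispatch" per value (the function call here
    # stands in for the cost of walking an expression tree per tuple).
    add = lambda x, y: x + y
    total = 0
    for v in column:
        total = add(total, v)
    return total

def vectorized_sum(column, block=1024):
    # Block-at-a-time: one dispatch per block; the tight inner loop is
    # the kind of primitive a compiler can unroll or map to SIMD.
    total = 0
    for i in range(0, len(column), block):
        total += sum(column[i:i + block])
    return total
```

    Both return the same result; the difference is how often the engine crosses the interpreter boundary per value processed.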

  18. Production compilation : A simple mechanism to model complex skill acquisition

    NARCIS (Netherlands)

    Taatgen, N.A.; Lee, F.J.

    2003-01-01

    In this article we describe production compilation, a mechanism for modeling skill acquisition. Production compilation has been developed within the ACT-Rational (ACT-R; J. R. Anderson, D. Bothell, M. D. Byrne, & C. Lebiere, 2002) cognitive architecture and consists of combining and specializing

  19. 32 CFR 806b.19 - Information compiled in anticipation of civil action.

    Science.gov (United States)

    2010-07-01

    32 CFR 806b.19, National Defense, Department of Defense (Continued), Department of the Air Force (2010-07-01): Information compiled in anticipation of civil action. Withhold records compiled in connection with a civil action or...

  20. Charged particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, C.

    1999-01-01

    We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal reason for setting up the NACRE network has been the necessity of building a well-documented and detailed compilation of rates for charged-particle-induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates, issued by Fowler and collaborators. The main goal of the NACRE network was transparency in the procedure of calculating the rates. More specifically, this compilation aims at: 1. updating the experimental and theoretical data; 2. distinctly identifying the sources of the data used in the rate calculations; 3. evaluating the uncertainties and errors; 4. providing numerically integrated reaction rates; 5. providing reverse reaction rates and analytical approximations of the adopted rates. The cross-section data and/or resonance parameters for a total of 86 charged-particle-induced reactions are given, and the corresponding reaction rates are calculated and presented in tabular form. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. The compilation is concerned with the reaction rates that are large enough for target lifetimes shorter than the age of the Universe, taken equal to 15 x 10^9 y. The reaction rates are provided for temperatures below T = 10^10 K. In parallel with the rate compilation, a cross-section database has been created and located at the site http://pntpm.ulb.ac.be/nacre..htm. (authors)
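    Numerically integrating a reaction rate (item 4 above) amounts to evaluating a thermal average of the cross section. The sketch below uses schematic units and a hypothetical function name, not NACRE's actual code: it evaluates ⟨σv⟩ = sqrt(8/(πμ)) (kT)^(-3/2) ∫ σ(E) E exp(-E/kT) dE with the trapezoidal rule.

```python
import math

def thermal_rate(sigma, kT, mu=1.0, emax=50.0, steps=20000):
    """Trapezoidal evaluation of the thermally averaged rate factor
    <sigma*v> = sqrt(8/(pi*mu)) * kT**-1.5 * integral of
    sigma(E) * E * exp(-E/kT) over E in [0, emax].
    Units are schematic: E and kT share one energy unit, mu = 1."""
    h = emax / steps
    acc = 0.0
    for i in range(steps + 1):
        E = i * h
        weight = 0.5 if i in (0, steps) else 1.0  # trapezoid endpoints
        acc += weight * sigma(E) * E * math.exp(-E / kT)
    return math.sqrt(8.0 / (math.pi * mu)) * kT ** -1.5 * (acc * h)
```

    A useful sanity check: for a constant cross section the integral reduces to (kT)^2, so the rate collapses to σ times the mean relative speed sqrt(8kT/(πμ)).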

  1. Asian collaboration on nuclear reaction data compilation

    International Nuclear Information System (INIS)

    Aikawa, Masayuki; Furutachi, Naoya; Kato, Kiyoshi; Makinaga, Ayano; Devi, Vidya; Ichinkhorloo, Dagvadorj; Odsuren, Myagmarjav; Tsubakihara, Kohsuke; Katayama, Toshiyuki; Otuka, Naohiko

    2013-01-01

    Nuclear reaction data are essential for research and development in nuclear engineering, radiation therapy, nuclear physics and astrophysics. Experimental data must be compiled in a database and be accessible to nuclear data users. One such nuclear reaction database is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC) under the auspices of the International Atomic Energy Agency. Recently, collaboration among the Asian NRDC members has been further developed with the support of the Asia-Africa Science Platform Program of the Japan Society for the Promotion of Science. We report on three years of activity to develop the Asian collaboration on nuclear reaction data compilation. (author)

  2. Fault-tolerant digital microfluidic biochips compilation and synthesis

    CERN Document Server

    Pop, Paul; Stuart, Elena; Madsen, Jan

    2016-01-01

    This book describes for researchers in the fields of compiler technology, design and test, and electronic design automation the new area of digital microfluidic biochips (DMBs), and thus offers a new application area for their methods. The authors present a routing-based model of operation execution, along with several associated compilation approaches, which progressively relax the assumption that operations execute inside fixed rectangular modules. Since operations can experience transient faults during the execution of a bioassay, the authors show how to use both offline (design time) and online (runtime) recovery strategies. The book also presents methods for the synthesis of fault-tolerant application-specific DMB architectures. Presents the current models used for research on compilation and synthesis techniques for DMBs in a tutorial fashion; includes a set of "benchmarks", which are presented in great detail and includes the source code of most of the t...

  3. PGHPF – An Optimizing High Performance Fortran Compiler for Distributed Memory Machines

    Directory of Open Access Journals (Sweden)

    Zeki Bozkus

    1997-01-01

    High Performance Fortran (HPF) is the first widely supported, efficient, and portable parallel programming language for shared and distributed memory systems. HPF is realized through a set of directive-based extensions to Fortran 90. It enables application developers and Fortran end-users to write compact, portable, and efficient software that will compile and execute on workstations, shared memory servers, clusters, traditional supercomputers, or massively parallel processors. This article describes a production-quality HPF compiler for a set of parallel machines. Compilation techniques such as data and computation distribution, communication generation, run-time support, and optimization issues are elaborated as the basis for an HPF compiler implementation on distributed memory machines. The performance of this compiler on benchmark programs demonstrates that high efficiency can be achieved executing HPF code on parallel architectures.
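    The data-distribution step mentioned above can be illustrated with a minimal sketch of HPF's BLOCK mapping (a hypothetical helper for illustration, not PGHPF code): each global index of an array of length n is assigned to one of p processors in contiguous chunks.

```python
def block_owner(i, n, p):
    # HPF-style DISTRIBUTE(BLOCK): an array of length n is split into
    # contiguous chunks of ceil(n / p) elements across p processors.
    chunk = -(-n // p)              # ceiling division
    return (i // chunk, i % chunk)  # (owning processor, local index)
```

    A compiler uses exactly this kind of mapping to decide which processor owns each element and therefore where communication must be generated.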

  4. A survey of compiler development aids. [concerning lexical, syntax, and semantic analysis

    Science.gov (United States)

    Buckles, B. P.; Hodges, B. C.; Hsia, P.

    1977-01-01

    A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five selected phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph-theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention: SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed, some of their shortcomings were noted, and some of the areas of research currently in progress were examined.

  5. Trench Parallel Bouguer Anomaly (TPBA): A robust measure for statically detecting asperities along the forearc of subduction zones

    Science.gov (United States)

    Raeesi, M.

    2009-05-01

    During the 1970s some researchers noticed that large earthquakes occur repeatedly at the same locations. These observations led to the asperity hypothesis. At the same time, some researchers noticed a relationship between the location of great interplate earthquakes and submarine structures, basins in particular, over the rupture area in the forearc regions. Despite these observations there was no comprehensive and reliable hypothesis explaining the relationship, and there were numerous pros and cons to the various hypotheses offered in this regard. In their pioneering study, Song and Simons (2003) approached the problem using gravity data. This was a turning point in seismology. Although their approach was sound, an appropriate gravity anomaly had to be used in order to reveal the location and extent of the asperities. Following the method of Song and Simons (2003), but using the Bouguer gravity anomaly that we call the "Trench Parallel Bouguer Anomaly" (TPBA), we found a strong, logical, and convincing relation between the TPBA-derived asperities and the slip distribution, as well as the earthquake distribution, foreshocks and aftershocks in particular. Various parameters with different levels of importance are known to affect the contact between the subducting and the overriding plates, and the TPBA can show which of these factors are important. Because the TPBA-derived asperities are based on static physical properties (gravity and elevation), they do not suffer from instabilities due to trade-offs, as happens for asperities derived in dynamic studies such as waveform inversion. Comparison of the TPBA-derived asperities with the rupture processes of well-studied great earthquakes reveals the high level of accuracy of the TPBA. This new measure opens a forensic viewpoint on the rupture process along subduction zones. The TPBA reveals the reason behind magnitude 9+ earthquakes and explains where and why they occur. The TPBA reveals the areas that can
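    The complete Bouguer anomaly behind a TPBA-style map is, in its simplest form, the free-air anomaly minus the infinite-slab attraction 2πGρh. The helper below is an illustrative sketch (a standard crustal density of 2670 kg/m³ is assumed and terrain corrections are omitted), not the authors' processing code.

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
MGAL = 1e-5     # 1 mGal = 1e-5 m/s^2

def bouguer_anomaly(free_air_mgal, elevation_m, density=2670.0):
    """Simple Bouguer anomaly: free-air anomaly minus the infinite-slab
    correction 2*pi*G*rho*h (density in kg/m^3, elevation in m, mGal out)."""
    slab_mgal = 2.0 * math.pi * G * density * elevation_m / MGAL
    return free_air_mgal - slab_mgal
```

    With ρ = 2670 kg/m³ the slab term works out to roughly 0.1119 mGal per metre of elevation, which is why the correction dominates in high-relief forearc terrain.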

  6. abc: The AspectBench Compiler for AspectJ

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    abc is an extensible, optimising compiler for AspectJ. It has been designed as a workbench for experimental research in aspect-oriented programming languages and compilers. We outline a programme of research in these areas, and we review how abc can help in achieving those research goals...

  7. The Katydid system for compiling KEE applications to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    Components of a system known as Katydid are developed in an effort to compile knowledge-based systems developed in a multimechanism integrated environment (KEE) to Ada. The Katydid core is an Ada library supporting KEE object functionality, and the other elements include a rule compiler, a LISP-to-Ada translator, and a knowledge-base dumper. Katydid employs translation mechanisms that convert LISP knowledge structures and rules to Ada and utilizes basic prototypes of a run-time KEE object-structure library module for Ada. Preliminary results include the semiautomatic compilation of portions of a simple expert system to run in an Ada environment with the described algorithms. It is suggested that Ada can be employed for AI programming and implementation, and the Katydid system is being developed to include concurrency and synchronization mechanisms.

  8. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.; Tick, E. (comp.)

    1991-11-01

    This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31--November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purposes of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working on compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led different communities to emphasize different species of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.

  9. DLVM: A modern compiler infrastructure for deep learning systems

    OpenAIRE

    Wei, Richard; Schwartz, Lane; Adve, Vikram

    2017-01-01

    Deep learning software demands reliability and performance. However, many of the existing deep learning frameworks are software libraries that act as an unsafe DSL in Python and a computation graph interpreter. We present DLVM, a design and implementation of a compiler infrastructure with a linear algebra intermediate representation, algorithmic differentiation by adjoint code generation, domain-specific optimizations and a code generator targeting GPU via LLVM. Designed as a modern compiler ...

  10. Cross-compilation of ATLAS online software to the PowerPC-VxWorks system

    International Nuclear Information System (INIS)

    Tian Yuren; Li Jin; Ren Zhengyu; Zhu Kejun

    2005-01-01

    BES III selected the ATLAS online software as the framework of its run-control system. Because BES III uses the PowerPC-VxWorks system on its front-end readout system, it was necessary to cross-compile this software to PowerPC-VxWorks. The article discusses several aspects of this project, such as the structure and organization of the ATLAS online software, the application of the CMT tool while cross-compiling, the selection and configuration of the cross-compiler, and methods of solving various problems arising from the differences in compiler and operating system. After cross-compiling, the software runs normally and, together with the software running on the Linux system, makes up a complete run-control system. (authors)

  11. Compilation of new and previously published geochemical and modal data for Mesoproterozoic igneous rocks of the St. Francois Mountains, southeast Missouri

    Science.gov (United States)

    du Bray, Edward A.; Day, Warren C.; Meighan, Corey J.

    2018-04-16

    The purpose of this report is to present recently acquired as well as previously published geochemical and modal petrographic data for igneous rocks in the St. Francois Mountains, southeast Missouri, as part of an ongoing effort to understand the regional geology and ore deposits of the Mesoproterozoic basement rocks of southeast Missouri, USA. The report includes geochemical data that are (1) newly acquired by the U.S. Geological Survey and (2) compiled from numerous sources published during the last fifty-five years. These data are required for ongoing petrogenetic investigations of these rocks. Voluminous Mesoproterozoic igneous rocks in the St. Francois Mountains of southeast Missouri constitute the basement buried beneath Paleozoic sedimentary rock that is over 600 meters thick in places. The Mesoproterozoic rocks of southeast Missouri represent a significant component of the approximately 1.4-billion-year-old (Ga) igneous rocks that crop out extensively in North America along the southeast margin of Laurentia; previous researchers have suggested that iron oxide-copper deposits in the St. Francois Mountains are genetically associated with ca. 1.4 Ga magmatism in this region. The geochemical and modal data sets described herein were compiled to support investigations concerning the tectonic setting and petrologic processes responsible for the associated magmatism.

  12. A Language for Specifying Compiler Optimizations for Generic Software

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, Jeremiah J. [Indiana Univ., Bloomington, IN (United States)

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization: they are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.

  13. Compiling the parallel programming language NestStep to the CELL processor

    OpenAIRE

    Holm, Magnus

    2010-01-01

    The goal of this project is to create a source-to-source compiler which will translate NestStep code to C code. The compiler's job is to replace NestStep constructs with a series of function calls to the NestStep runtime system. NestStep is a parallel programming language extension based on the BSP model. It adds constructs for parallel programming on top of an imperative programming language. For this project, only constructs extending the C language are relevant. The output code will compil...

  14. Compilation of solar abundance data

    International Nuclear Information System (INIS)

    Hauge, Oe.; Engvold, O.

    1977-01-01

    Interest in the previous compilations of solar abundance data by the same authors (ITA--31 and ITA--39) has led to this third, revised edition. Solar abundance data of 67 elements are tabulated and in addition upper limits for the abundances of 5 elements are listed. References are made to 167 papers. A recommended abundance value is given for each element. (JIW)

  15. Compilation of piping benchmark problems - Cooperative international effort

    Energy Technology Data Exchange (ETDEWEB)

    McAfee, W J [comp.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations.

  16. Compiling knowledge-based systems from KEE to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications: most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBS developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights into Ada as an artificial intelligence programming language, potential solutions to some of the engineering difficulties encountered in early work, and inspiration for future system development.

  17. Compilation of piping benchmark problems - Cooperative international effort

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations

  18. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part IV: Normal and Inverted Letter 'h' and 'H' Architecture.

  19. Compilation and analysis of Escherichia coli promoter DNA sequences.

    OpenAIRE

    Hawley, D K; McClure, W R

    1983-01-01

    The DNA sequences of 168 promoter regions (-50 to +10) for Escherichia coli RNA polymerase were compiled. The complete listing was divided into two groups depending upon whether the promoter had been defined by genetic (promoter mutations) or biochemical (5' end determination) criteria. A consensus promoter sequence based on homologies among 112 well-defined promoters was determined that was in substantial agreement with previous compilations. In addition, we have tabulated 98 promoter ...
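    The consensus-building step described above can be sketched as a per-position majority vote over aligned sequences; this is a simplification of the actual analysis, which weighed homologies across 112 promoters.

```python
from collections import Counter

def consensus(aligned_seqs):
    # Most frequent base at each aligned position; ties are resolved
    # arbitrarily here, where a real compilation would flag ambiguity.
    return "".join(
        Counter(column).most_common(1)[0][0]
        for column in zip(*aligned_seqs)
    )
```

    With three toy -10 hexamers the familiar Pribnow-box consensus TATAAT emerges from the column-wise vote.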

  20. Installation of a new Fortran compiler and effective programming method on the vector supercomputer

    International Nuclear Information System (INIS)

    Nemoto, Toshiyuki; Suzuki, Koichiro; Watanabe, Kenji; Machida, Masahiko; Osanai, Seiji; Isobe, Nobuo; Harada, Hiroo; Yokokawa, Mitsuo

    1992-07-01

    The Fortran compiler, version 10, was replaced with the new version 12 (V12) on the Fujitsu computer system at JAERI in May 1992. A benchmark test of the performance of the V12 compiler was carried out with 16 representative nuclear codes in advance of the installation. The performance of the compiler improved by a factor of 1.13 on average. The effect of the enhanced functions of the compiler and its compatibility with the nuclear codes were also examined. An assistant tool for vectorization, TOP10EX, was developed. In this report, the results of the evaluation of the V12 compiler and the usage of the tools for vectorization are presented. (author)

  1. Compiler-Agnostic Function Detection in Binaries

    NARCIS (Netherlands)

    Andriesse, D.A.; Slowinska, J.M.; Bos, H.J.

    2017-01-01

    We propose Nucleus, a novel function detection algorithm for binaries. In contrast to prior work, Nucleus is compiler-agnostic, and does not require any learning phase or signature information. Instead of scanning for signatures, Nucleus detects functions at the Control Flow Graph-level, making it

  2. Regulatory and technical reports (abstract index journal): Annual compilation for 1987

    International Nuclear Information System (INIS)

    1988-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  3. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part III: B-Shaped Architecture with Vertical Well in the Upper Layer.

  4. Regulatory and technical reports. Compilation for second quarter 1982, April to June

    International Nuclear Information System (INIS)

    1982-08-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. A detailed explanation of the entries precedes each index

  5. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  6. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    ...We demonstrate the ability of our tool to transform code, and suggest code refactorings that increase its amenability to optimization. The preliminary results show that, with our tool-set, automatic loop parallelization with the GNU C compiler, gcc, yields 8.6x best-case speedup over...

  7. Regulatory and technical reports: compilation for third quarter 1982 July-September

    International Nuclear Information System (INIS)

    1982-11-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. These precede the following indexes: Contractor Report Number Index; Personal Author Index; Subject Index; NRC Originating Organization Index (Staff Reports); NRC Contract Sponsor Index (Contractor Reports); Contractor Index; and Licensed Facility Index

  8. Perspex machine: V. Compilation of C programs

    Science.gov (United States)

    Spanner, Matthew P.; Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of the Turing machine with projective geometry. The original, constructive proof used four special, perspective transformations to implement the Turing machine in projective geometry. These four transformations are now generalised and applied in a compiler, implemented in Pop11, that converts a subset of the C programming language into perspexes. This is interesting both from a geometrical and a computational point of view. Geometrically, it is interesting that program source can be converted automatically to a sequence of perspective transformations and conditional jumps, though we find that the product of homogeneous transformations with normalisation can be non-associative. Computationally, it is interesting that program source can be compiled for a Reduced Instruction Set Computer (RISC), the perspex machine, that is a Single Instruction, Zero Exception (SIZE) computer.

  9. Compiling gate networks on an Ising quantum computer

    International Nuclear Information System (INIS)

    Bowdrey, M.D.; Jones, J.A.; Knill, E.; Laflamme, R.

    2005-01-01

    Here we describe a simple mechanical procedure for compiling a quantum gate network into the natural gates (pulses and delays) for an Ising quantum computer. The aim is not necessarily to generate the most efficient pulse sequence, but rather to develop an efficient compilation algorithm that can be easily implemented in large spin systems. The key observation is that it is not always necessary to refocus all the undesired couplings in a spin system. Instead, the coupling evolution can simply be tracked and then corrected at some later time. Although described within the language of NMR, the algorithm is applicable to any design of quantum computer based on Ising couplings
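
    The "track now, correct later" idea in this record can be illustrated with a toy sketch (a hypothetical two-spin example with an assumed coupling strength; not the authors' pulse-sequence compiler): the phase accrued by an undesired ZZ coupling is recorded during each delay, and a single correcting delay is applied at the end instead of refocusing every coupling.

```python
import math

class CouplingTracker:
    """Track the phase accrued by an undesired Ising (ZZ) coupling
    instead of refocusing it, and correct it later with one delay."""

    def __init__(self, coupling_hz):
        self.J = coupling_hz   # assumed coupling strength in Hz
        self.phase = 0.0       # accumulated coupling phase, radians

    def delay(self, seconds):
        # Free evolution under the ZZ coupling accrues phase pi*J*t.
        self.phase = (self.phase + math.pi * self.J * seconds) % (2 * math.pi)

    def correction_delay(self):
        # Length of a later delay that returns the total phase to a
        # multiple of 2*pi, i.e. to the identity.
        return ((2 * math.pi - self.phase) % (2 * math.pi)) / (math.pi * self.J)

tracker = CouplingTracker(coupling_hz=100.0)  # hypothetical 100 Hz coupling
tracker.delay(0.003)                          # 3 ms of tracked, uncorrected evolution
t_fix = tracker.correction_delay()
tracker.delay(t_fix)                          # one correction instead of many refocusings
print(f"correction delay: {t_fix * 1e3:.3f} ms")
```

    The point of the sketch is the bookkeeping, not the physics: only a running phase per coupling needs to be stored, which is what makes the approach cheap in large spin systems.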

  10. Practicing on Newly Dead

    Directory of Open Access Journals (Sweden)

    Jewel Abraham

    2015-07-01

    Full Text Available A newly dead cadaver simulation is practiced on the physical remains of the dead before the onset of rigor mortis. This technique has potential benefits for providing real-life in-situ experience for novice providers in health care practices. Evolving ethical views in health care bring into question some of the ethical aspects associated with newly dead cadaver simulation in terms of justification for practice, autonomy, consent, and the need for disclosure. A clear statement of policies and procedures on newly dead cadaver simulation has yet to be implemented. Although there are benefits and disadvantages to an in-situ cadaver simulation, such practices should not be carried out in secrecy, as there is no compelling evidence that suggests such training is imperative. Secrecy in these practices is a violation of the honor code of nursing ethics. As health care providers, practitioners are obliged to be ethically honest and trustworthy to their patients. The author explores the ethical aspects of using newly dead cadaver simulation in training novice nursing providers to gain competency in various lifesaving skills, which otherwise cannot be practiced on a living individual. The author explores multiple views on cadaver simulation in relation to ethical theories and practices such as consent and disclosure to family.

  11. Using MaxCompiler for High Level Synthesis of Trigger Algorithms

    CERN Document Server

    Summers, Sioni Paris; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.
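
    As context for the Kalman Filter mentioned in this record: each iteration blends a prediction with a measurement, weighted by the Kalman gain. A generic scalar sketch (purely illustrative and unrelated to the actual CMS firmware or MaxCompiler output):

```python
def kalman_update(x, p, z, r):
    """One scalar Kalman update: state estimate x with variance p,
    measurement z with variance r."""
    k = p / (p + r)          # Kalman gain: how much to trust the measurement
    x_new = x + k * (z - x)  # blend prediction and measurement
    p_new = (1 - k) * p      # posterior variance shrinks after each update
    return x_new, p_new

# Hypothetical noisy measurements of a quantity whose true value is ~1.0.
x, p = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.0]:
    x, p = kalman_update(x, p, z, r=0.5)
print(round(x, 2), round(p, 3))  # estimate converges toward ~1.0 as variance falls
```

    In a hardware implementation such as the one benchmarked here, each such iteration is a fixed pipeline stage, which is why the latency is quoted per iteration.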

  12. Compilation status and research topics in Hokkaido University Nuclear Reaction Data Centre

    International Nuclear Information System (INIS)

    Aikawa, M.; Furutachi, N.; Katō, K.; Ebata, S.; Ichinkhorloo, D.; Imai, S.; Sarsembayeva, A.; Zhou, B.; Otuka, N.

    2015-01-01

    Nuclear reaction data are necessary and applicable for many application fields. The nuclear reaction data must be compiled into a database for convenient availability. One such database is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC). As a member of the NRDC, the Hokkaido University Nuclear Reaction Data Centre (JCPRG) compiles charged-particle induced reaction data and contributes about 10 percent of the EXFOR database. In this paper, we show the recent compilation status and related research topics of JCPRG. (author)

  13. 36 CFR 902.57 - Investigatory files compiled for law enforcement purposes.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Investigatory files compiled for law enforcement purposes. 902.57 Section 902.57 Parks, Forests, and Public Property PENNSYLVANIA AVENUE DEVELOPMENT CORPORATION FREEDOM OF INFORMATION ACT Exemptions From Public Access to Corporation Records § 902.57 Investigatory files compiled...

  14. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  15. Regular expressions compiler and some applications

    International Nuclear Information System (INIS)

    Saldana A, H.

    1978-01-01

    We deal with the high-level programming language of a Regular Expressions Compiler (REC). The first chapter is an introduction in which the history of REC's development and the problems related to its numerous applications are described. The syntactic and semantic rules as well as the language features are discussed just after the introduction. As examples of applications, one adaptation is given for solving numerical problems and another for data manipulation. The last chapter is an exposition of ideas and techniques about the compiler construction. Examples of the adaptation to numerical problems show the applications to education, vector analysis, quantum mechanics, physics, mathematics and other sciences. The rudiments of an operating system for a minicomputer are the examples of the adaptation to symbolic data manipulation. REC is a programming language that could be applied to solve problems in almost any human activity. Handling of computer graphics, control equipment, research on languages, microprocessors and general research are some of the fields in which this programming language can be applied and developed. (author)

  16. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  17. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...

  18. Safety and maintenance engineering: A compilation

    Science.gov (United States)

    1974-01-01

    A compilation is presented for the dissemination of information on technological developments which have potential utility outside the aerospace and nuclear communities. Safety of personnel engaged in the handling of hazardous materials and equipment, protection of equipment from fire, high wind, or careless handling by personnel, and techniques for the maintenance of operating equipment are reported.

  19. Compilation of cross-sections. Pt. 1

    International Nuclear Information System (INIS)

    Flaminio, V.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1983-01-01

    A compilation of integral cross-sections for hadronic reactions is presented. This is an updated version of CERN/HERA 79-1, 79-2, 79-3. It contains all data published up to the beginning of 1982, but some more recent data have also been included. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  20. Compilation of information on melter modeling

    International Nuclear Information System (INIS)

    Eyler, L.L.

    1996-03-01

    The objective of the task described in this report is to compile information on modeling capabilities for the High-Temperature Melter and the Cold Crucible Melter and issue a modeling capabilities letter report summarizing existing modeling capabilities. The report is to include strategy recommendations for future modeling efforts to support the High Level Waste (HLW) melter development

  1. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    Final technical report on the verified compilation of concurrent managed languages, Purdue University, November 2017. Published by the Information Directorate in the interest of scientific and technical information exchange; approved for public release.

  2. Compilation of cross-sections. Pt. 4

    International Nuclear Information System (INIS)

    Alekhin, S.I.; Ezhela, V.V.; Lugovsky, S.B.; Tolstenkov, A.N.; Yushchenko, O.P.; Baldini, A.; Cobal, M.; Flaminio, V.; Capiluppi, P.; Giacomelli, G.; Mandrioli, G.; Rossi, A.M.; Serra, P.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1987-01-01

    This is the fourth volume in our series of data compilations on integrated cross-sections for weak, electromagnetic, and strong interaction processes. This volume covers data on reactions induced by photons, neutrinos, hyperons, and K_L^0 mesons. It contains all data published up to June 1986. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  3. Compilation of nuclear safety criteria potential application to DOE nonreactor facilities

    International Nuclear Information System (INIS)

    1992-03-01

    This bibliographic document compiles nuclear safety criteria applied to the various areas of nuclear safety addressed in a Safety Analysis Report for a nonreactor nuclear facility (NNF). The criteria listed are derived from federal regulations, Nuclear Regulatory Commission (NRC) guides and publications, DOE and DOE contractor publications, and industry codes and standards. The titles of the chapters and sections of Regulatory Guide 3.26, ''Standard Format and Content of Safety Analysis Reports for Fuel Reprocessing Plants'' were used to format the chapters and sections of this compilation. In each section the criteria are compiled in four groups, namely: (1) Code of Federal Regulations, (2) USNRC Regulatory Guides, (3) Codes and Standards, and (4) Supplementary Information

  4. Newly graduated nurses' use of knowledge sources

    DEFF Research Database (Denmark)

    Voldbjerg, Siri Lygum; Grønkjaer, Mette; Sørensen, Erik Elgaard

    2016-01-01

    AIM: To advance evidence on newly graduated nurses' use of knowledge sources. BACKGROUND: Clinical decisions need to be evidence-based and understanding the knowledge sources that newly graduated nurses use will inform both education and practice. Qualitative studies on newly graduated nurses' use ... underscoring progression in knowledge use and perception of competence and confidence among newly graduated nurses. CONCLUSION: The transition phase, feeling of confidence and ability to use critical thinking and reflection, has a great impact on the knowledge sources incorporated in clinical decisions. ... The synthesis accentuates that, for newly graduated nurses to use their qualifications and skills in evidence-based practice, clinical practice needs to provide a supportive environment which nurtures critical thinking and questions and articulates the use of multiple knowledge sources.

  5. Un modelo geodésico para Colombia

    Directory of Open Access Journals (Sweden)

    Sánchez Rodríguez Laura Marlene

    1995-10-01

    Full Text Available

    The processed gravimetric information is compiled in the Simple Bouguer Gravity Anomaly Map of Colombia, Eastern Panama and Adjacent Marine Areas, published in 1989 by the Geological Society of America Inc. Based on the (approximate) isostatic anomaly values, obtained as the difference between the observed simple Bouguer anomaly and the same anomaly processed by means of the two-dimensional Fourier transform, the physical-mathematical Stokes model was applied. With this model it is possible to determine the geoid undulations, relating the normal gravity, the anomalous potential, and the gravity anomalies; these undulations are referenced to the WGS-84 ellipsoid.

    [Spanish abstract, translated:] The processed gravimetric information is compiled in the Simple Bouguer Gravity Anomaly Map of Colombia, Eastern Panama and Adjacent Marine Areas, published by the Geological Society of America Inc. in 1989. Starting from the (approximate) isostatic anomaly values, resulting from the difference between the observed simple Bouguer anomaly and the same anomaly processed by means of the two-dimensional Fourier transform, the physical-mathematical Stokes model was applied, with which the geoid undulations are determined by relating the normal gravity, the anomalous potential and the gravity anomalies; these undulations are referenced to the WGS-84 ellipsoid.
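
    The "simple Bouguer anomaly" named in this record is a standard reduction of observed gravity. A minimal sketch using the conventional free-air (0.3086 mGal/m) and Bouguer-slab (0.04193·ρ mGal/m) gradients and the 2.67 Mg/m³ reduction density; the station values below are hypothetical:

```python
def simple_bouguer_anomaly(g_obs_mgal, g_normal_mgal, height_m, density=2.67):
    """Simple Bouguer anomaly in mGal.

    g_obs_mgal: observed gravity at the station;
    g_normal_mgal: normal (theoretical) gravity on the ellipsoid
    at the station latitude; height_m: station elevation in metres;
    density: reduction density in Mg/m^3.
    """
    free_air = 0.3086 * height_m                  # free-air correction, mGal
    bouguer_slab = 0.04193 * density * height_m   # infinite-slab correction, mGal
    return g_obs_mgal - g_normal_mgal + free_air - bouguer_slab

# Hypothetical station at 1000 m elevation:
ba = simple_bouguer_anomaly(978150.0, 978300.0, 1000.0)
print(round(ba, 1))
```

    The free-air term restores the elevation effect, while the slab term removes the attraction of the rock between the station and the reference surface; the isostatic anomaly discussed in the record is a further refinement of this quantity.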

  6. Parallelizing More Loops with Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    2012-01-01

    an interactive compilation feedback system that guides programmers in iteratively modifying their application source code. This helps leverage the compiler’s ability to generate loop-parallel code. We employ our system to modify two sequential benchmarks dealing with image processing and edge detection...

  7. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...
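
    To make "compiling into arithmetic circuits" concrete, here is a minimal hand-built sketch (not produced by PRIMULA): the network polynomial of a two-variable network A → B is a sum of products of evidence indicators and network parameters, and inference amounts to evaluating that polynomial.

```python
# Hypothetical network A -> B with P(a)=0.3, P(b|a)=0.9, P(b|~a)=0.2.
theta_a = {True: 0.3, False: 0.7}
theta_b = {(True, True): 0.9, (True, False): 0.1,
           (False, True): 0.2, (False, False): 0.8}

def evaluate(evidence):
    """Evaluate the network polynomial: sum over all (A, B) states of
    lambda_A * lambda_B * theta_A * theta_{B|A}. An arithmetic circuit
    computes exactly this, but bottom-up over a shared-subexpression DAG."""
    # Evidence indicator: 1 if the state is consistent with the evidence.
    lam = lambda var, val: 1.0 if evidence.get(var, val) == val else 0.0
    total = 0.0
    for a in (True, False):
        for b in (True, False):
            total += lam('A', a) * lam('B', b) * theta_a[a] * theta_b[(a, b)]
    return total

print(evaluate({}))            # no evidence: the polynomial sums to 1
print(evaluate({'B': True}))   # probability of the evidence B = true
```

    Compilation pays off because the circuit is built once offline; online inference is then just this numeric evaluation, repeated for different evidence settings.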

  8. Michigan Magnetic and Gravity Maps and Data: A Website for the Distribution of Data

    Science.gov (United States)

    Daniels, David L.; Kucks, Robert P.; Hill, Patricia L.; Snyder, Stephen L.

    2009-01-01

    This web site provides the best available, public-domain, aeromagnetic and gravity data in the State of Michigan and merges these data into composite grids that are available for downloading. The magnetic grid is compiled from 25 separate magnetic surveys that have been knit together to form a single composite digital grid and map. The magnetic survey grids have been continued to 305 meters (1,000 feet) above ground and merged together to form the State compilation. A separate map shows the location of the aeromagnetic surveys, color-coded to the survey flight-line spacing. In addition, a complete Bouguer gravity anomaly grid and map were generated from more than 20,000 gravity station measurements from 33 surveys. A table provides the facts about each gravity survey where known.
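
    Upward continuation to a common height (305 m here) before merging surveys is conventionally done in the Fourier domain, where the field's spectrum is multiplied by exp(−kΔz). A minimal numpy sketch on synthetic data (illustrative only, not the USGS processing chain):

```python
import numpy as np

def upward_continue(grid, dx, dz):
    """Upward-continue a gridded potential field by dz metres.
    grid: 2-D field values; dx: grid spacing in metres."""
    ny, nx = grid.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)     # wavenumbers, rad/m
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    k = np.sqrt(kx[None, :]**2 + ky[:, None]**2)  # radial wavenumber grid
    # exp(-k*dz) leaves the mean (k = 0) untouched and damps short wavelengths.
    spectrum = np.fft.fft2(grid) * np.exp(-k * dz)
    return np.real(np.fft.ifft2(spectrum))

# Synthetic anomaly: a smooth bump on a 128 x 128 grid spanning 10 km.
x = np.linspace(-5000.0, 5000.0, 128)
g = 50.0 * np.exp(-(x[None, :]**2 + x[:, None]**2) / 2e6)
up = upward_continue(g, dx=x[1] - x[0], dz=305.0)
print(up.max() < g.max())  # amplitudes decrease with continuation height
```

    Continuing every survey to the same height makes their spectral content comparable, which is what allows grids flown at different altitudes to be knit into one composite.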

  9. Deep knowledge and knowledge compilation for dynamic systems

    International Nuclear Information System (INIS)

    Mizoguchi, Riichiro

    1994-01-01

    Expert systems are viewed as knowledge-based systems which efficiently solve real-world problems based on the expertise contained in their knowledge bases elicited from domain experts. Although such expert systems, which depend on the heuristics of domain experts, have contributed to the current success, they are known to be brittle and hard to build. This paper is concerned with research on model-based diagnosis and knowledge compilation for dynamic systems conducted by the author's group to overcome these difficulties. Firstly, we summarize the advantages and shortcomings of expert systems. Secondly, deep knowledge and knowledge compilation are discussed. Then, the latest results of our research on model-based diagnosis are overviewed. The future direction of knowledge base technology research is also discussed. (author)

  10. 1991 OCRWM bulletin compilation and index

    International Nuclear Information System (INIS)

    1992-05-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management, to provide current information about the national program for managing spent fuel and high-level radioactive waste. The document is a compilation of issues from the 1991 calendar year. A table of contents and an index have been provided to reference information contained in this year's Bulletins

  11. National energetic balance. Statistical compilation 1985-1991

    International Nuclear Information System (INIS)

    1992-01-01

    Compiles the statistical information supplied by the governmental and private institutions that make up the national energy sector in Paraguay. The first part refers to the overall supply of energy; the second, to energy transformation centres; and the last part presents the energy flows, consolidated balances and other energy-economic indicators

  12. 13 CFR 146.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Semi-annual compilation. 146.600 Section 146.600 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW RESTRICTIONS ON LOBBYING.... (c) Information that involves intelligence matters shall be reported only to the Select Committee on...

  13. Detection of Buried Faults from Bouguer Anomalies and Modelling of the Subsurface Structure of the Istanbul-Silivri Region

    Directory of Open Access Journals (Sweden)

    Fethi Ahmet YÜKSEL

    2001-03-01

    Full Text Available In this study, a new method is presented for detecting buried vertical discontinuities whose effects are not apparent in the Bouguer anomaly map of the Istanbul-Silivri region. The method is based on the cross-correlation between the second vertical derivative values of the observations and the second vertical derivative values of a theoretical vertical-discontinuity model. The maximum or minimum values of the cross-correlation function occur over the origin points of the vertical discontinuities. After being tested on one- and two-dimensional theoretical models, the method was used to model the unexposed lineament structure of the Silivri region.
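
    The detection scheme in this record cross-correlates second vertical derivatives of the observed anomaly with those of a theoretical vertical-discontinuity model, and reads the fault position off the extremum of the correlation. A synthetic 1-D sketch (not the authors' code; an arctangent step stands in for the fault anomaly):

```python
import numpy as np

def second_derivative(profile, dx):
    # Numerical second derivative along the profile.
    return np.gradient(np.gradient(profile, dx), dx)

x = np.arange(-50.0, 50.0, 1.0)            # profile coordinate (arbitrary units)
observed = np.arctan((x - 7.0) / 5.0)      # step anomaly of a fault placed at x = 7
model = np.arctan(x / 5.0)                 # theoretical discontinuity at x = 0

obs_d2 = second_derivative(observed, 1.0)
mod_d2 = second_derivative(model, 1.0)

# Full cross-correlation; the extremum marks the shift between the two.
xcorr = np.correlate(obs_d2, mod_d2, mode='full')
lags = np.arange(-(len(x) - 1), len(x))
shift = lags[np.argmax(np.abs(xcorr))]
print(shift)  # recovers the fault position at x = 7
```

    Working with second derivatives rather than the anomaly itself suppresses the regional trend, which is what lets the correlation pick out discontinuities that are invisible in the raw Bouguer map.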

  14. ZettaBricks: A Language Compiler and Runtime System for Anyscale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Amarasinghe, Saman [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2015-03-27

    This grant supported the ZettaBricks and OpenTuner projects. ZettaBricks is a new implicitly parallel language and compiler where defining multiple implementations of multiple algorithms to solve a problem is the natural way of programming. ZettaBricks makes algorithmic choice a first class construct of the language. Choices are provided in a way that also allows our compiler to tune at a finer granularity. The ZettaBricks compiler autotunes programs by making both fine-grained as well as algorithmic choices. Choices also include different automatic parallelization techniques, data distributions, algorithmic parameters, transformations, and blocking. Additionally, ZettaBricks introduces novel techniques to autotune algorithms for different convergence criteria. When choosing between various direct and iterative methods, the ZettaBricks compiler is able to tune a program in such a way that delivers near-optimal efficiency for any desired level of accuracy. The compiler has the flexibility of utilizing different convergence criteria for the various components within a single algorithm, providing the user with accuracy choice alongside algorithmic choice. OpenTuner is a generalization of the experience gained in building an autotuner for ZettaBricks. OpenTuner is a new open source framework for building domain-specific multi-objective program autotuners. OpenTuner supports fully-customizable configuration representations, an extensible technique representation to allow for domain-specific techniques, and an easy to use interface for communicating with the program to be autotuned. A key capability inside OpenTuner is the use of ensembles of disparate search techniques simultaneously; techniques that perform well will dynamically be allocated a larger proportion of tests.

  15. Research on the Maritime Communication Cryptographic Chip’s Compiler Optimization

    Directory of Open Access Journals (Sweden)

    Sheng Li

    2017-08-01

    Full Text Available In the process of ocean development, the technology for maritime communication systems is a hot research field, of which information security is vital for the normal operation of the whole system; it is also one of the difficulties in research on maritime communication systems. In this paper, a kind of maritime communication cryptographic SOC (system on chip) is introduced, and its compiler framework is put forward through analysis of the working mode and the problems faced by the compiler front end. Then, a loop-unrolling-factor calculating algorithm based on queueing theory, named UFBOQ (unrolling factor based on queue), is proposed to perform parallel optimization in the compiler front end while respecting the instruction memory capacity limit. Finally, the scalar replacement method is used to optimize the unrolled code, mitigating the effect of memory access latency on parallel computing efficiency, given the contiguous data-storage characteristics of cryptographic algorithms. The UFBOQ algorithm and scalar replacement prove effective and appropriate, and their combined effect achieves linear speedup.

  16. Fifth Baltic Sea pollution load compilation (PLC-5). An executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Svendsen, L.M.; Staaf, H.; Pyhala, M.; Kotilainen, P.; Bartnicki, J.; Knuuttila, S.; Durkin, M.

    2012-07-01

    This report summarizes and combines the main results of the Fifth Baltic Sea Pollution Load Compilation (HELCOM 2011), which covers waterborne loads to the sea, together with data on atmospheric loads that are submitted by countries to the Co-operative Programme for Monitoring and Evaluation of the Long-range Transmission of Air Pollutants in Europe (EMEP), which subsequently compiles and reports this information to HELCOM.

  17. Using MaxCompiler for the high level synthesis of trigger algorithms

    International Nuclear Information System (INIS)

    Summers, S.; Rose, A.; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  18. Using MaxCompiler for the high level synthesis of trigger algorithms

    Science.gov (United States)

    Summers, S.; Rose, A.; Sanders, P.

    2017-02-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  19. Languages, compilers and run-time environments for distributed memory machines

    CERN Document Server

    Saltz, J

    1992-01-01

    Papers presented within this volume cover a wide range of topics related to programming distributed memory machines. Distributed memory architectures, although having the potential to supply the very high levels of performance required to support future computing needs, present awkward programming problems. The major issue is to design methods which enable compilers to generate efficient distributed memory programs from relatively machine independent program specifications. This book is the compilation of papers describing a wide range of research efforts aimed at easing the task of programming

  20. Combining Compile-Time and Run-Time Parallelization

    Directory of Open Access Journals (Sweden)

    Sungdo Moon

    1999-01-01

    Full Text Available This paper demonstrates that significant improvements to automatic parallelization technology require that existing systems be extended in two ways: (1) they must combine high-quality compile-time analysis with low-cost run-time testing; and (2) they must take control flow into account during analysis. We support this claim with the results of an experiment that measures the safety of parallelization at run time for loops left unparallelized by the Stanford SUIF compiler's automatic parallelization system. We present results of measurements on programs from two benchmark suites – SPECFP95 and NAS sample benchmarks – which identify inherently parallel loops in these programs that are missed by the compiler. We characterize remaining parallelization opportunities, and find that most of the loops require run-time testing, analysis of control flow, or some combination of the two. We present a new compile-time analysis technique that can be used to parallelize most of these remaining loops. This technique is designed to not only improve the results of compile-time parallelization, but also to produce low-cost, directed run-time tests that allow the system to defer binding of parallelization until run-time when safety cannot be proven statically. We call this approach predicated array data-flow analysis. We augment array data-flow analysis, which the compiler uses to identify independent and privatizable arrays, by associating predicates with array data-flow values. Predicated array data-flow analysis allows the compiler to derive "optimistic" data-flow values guarded by predicates; these predicates can be used to derive a run-time test guaranteeing the safety of parallelization.
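
    The idea of deferring the parallelization decision to a cheap run-time test can be illustrated with a toy sketch (a hypothetical indexed-write loop, not the SUIF implementation): when the compiler cannot prove the writes independent, it emits a predicate that is checked just before the loop runs.

```python
import numpy as np

def can_parallelize(index):
    """Run-time safety test: the writes a[index[i]] are independent
    iff no two iterations touch the same element."""
    return len(np.unique(index)) == len(index)

def update(a, index, values):
    if can_parallelize(index):
        # Predicate holds: every iteration writes a distinct element,
        # so the loop may run in parallel (vectorised here as a stand-in).
        a[index] = values
    else:
        # Predicate fails: fall back to the original sequential order,
        # preserving the last-write-wins semantics.
        for i, v in zip(index, values):
            a[i] = v
    return a

print(can_parallelize(np.array([0, 2, 4, 6])))  # True: distinct targets
print(can_parallelize(np.array([0, 2, 2, 6])))  # False: iterations collide
```

    The test costs one pass over the index array, which is exactly the "low-cost, directed run-time test" trade-off the abstract describes: pay a small check at run time to unlock parallelism that static analysis alone cannot justify.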

  1. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  2. On the performance of the HAL/S-FC compiler. [for space shuttles

    Science.gov (United States)

    Martin, F. H.

    1975-01-01

    The HAL/S compilers which will be used in the space shuttles are described. Acceptance test objectives and procedures are described, the raw results are presented and analyzed, and conclusions and observations are drawn. An appendix is included containing an illustrative set of compiler listings and results for one of the test cases.

  3. Expectation Levels in Dictionary Consultation and Compilation ...

    African Journals Online (AJOL)

    Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of the structure ...

  4. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1981-03-01

    A request list for nuclear data which was produced from a computerized data file by the National Nuclear Data Center is presented. The request list is given by target nucleus (isotope) and then reaction type. The purpose of the compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. Requesters are identified by laboratory, last name, and sponsoring US government agency

  5. Expectation Levels in Dictionary Consultation and Compilation*

    African Journals Online (AJOL)

    Abstract: Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of ...

  6. Methods for the Compilation of a Core List of Journals in Toxicology.

    Science.gov (United States)

    Kuch, T. D. C.

    Previously reported methods for the compilation of core lists of journals in multidisciplinary areas are first examined, with toxicology used as an example of such an area. Three approaches to the compilation of a core list of journals in toxicology were undertaken and the results analyzed with the aid of models. Analysis of the results of the…

  7. 21 CFR 20.64 - Records or information compiled for law enforcement purposes.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Records or information compiled for law enforcement purposes. 20.64 Section 20.64 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL PUBLIC INFORMATION Exemptions § 20.64 Records or information compiled for law enforcement purposes. (a) Records or...

  8. Mode automata and their compilation into fault trees

    International Nuclear Information System (INIS)

    Rauzy, Antoine

    2002-01-01

In this article, we advocate the use of mode automata as a high-level representation language for reliability studies. Mode automata are states/transitions-based representations with the additional notion of flow. They can be seen as a generalization of both finite capacity Petri nets and block diagrams. They can be assembled into hierarchies by means of composition operations. The contribution of this article is twofold. First, we introduce mode automata and discuss their relationship with other formalisms. Second, we propose an algorithm to compile mode automata into Boolean equations (fault trees). Such a compilation is of interest for two reasons. First, assessment tools for Boolean models are much more efficient than those for states/transitions models. Second, the automated generation of fault trees from higher-level representations makes their maintenance easier through the life cycle of the systems under study.
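Compilation into fault trees, as in the mode-automata record above, bottoms out in a set of Boolean equations over basic events. A minimal sketch of that target representation follows; the system, gate names, and failure logic are invented for illustration and are not taken from Rauzy's paper:

```python
# Minimal sketch of a fault tree as Boolean equations (illustrative only):
# the top event fires if the pump fails OR both redundant valves fail.
from itertools import product

equations = {
    "top":    lambda v: v["pump"] or v["valves"],        # OR gate
    "valves": lambda v: v["valve_a"] and v["valve_b"],   # AND gate
}

basic_events = ["pump", "valve_a", "valve_b"]

def evaluate(assignment):
    """Resolve the intermediate gate bottom-up, then the top event."""
    v = dict(assignment)
    v["valves"] = equations["valves"](v)
    return equations["top"](v)

# Enumerate all basic-event states and keep those that fail the system.
failing = [
    dict(zip(basic_events, bits))
    for bits in product([False, True], repeat=3)
    if evaluate(dict(zip(basic_events, bits)))
]
print(len(failing))  # → 5 (4 states with the pump failed, 1 with both valves failed)
```

Assessment tools for such Boolean models can then enumerate or approximate failure states directly, which is the efficiency argument the abstract makes.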

  9. Fusing a Transformation Language with an Open Compiler

    NARCIS (Netherlands)

    Kalleberg, K.T.; Visser, E.

    2007-01-01

Program transformation systems provide powerful analysis and transformation frameworks as well as concise languages for language processing, but instantiating them for every subject language is an arduous task, most often resulting in half-completed frontends. Compilers provide mature frontends with

  10. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    Full Text Available There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with equational programming language (EPL. Our approach is based on a program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by the source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by the compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  11. Notes on Compiling a Corpus- Based Dictionary

    Directory of Open Access Journals (Sweden)

    František Čermák

    2011-10-01

    Full Text Available

ABSTRACT: On the basis of sample analysis of a Czech adjective, a definition based on the data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing at the drawbacks of definitions found in traditional dictionaries. Steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics, briefly discussed in the second part to which lexicographers of monolingual dictionaries should pay attention. These are supplemented by additional remarks and caveats useful in the compilation of a dictionary. Thus, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified form.

OPSOMMING: Aantekeninge oor die samestelling van 'n korpusgebaseerde woordeboek. Op grond van 'n steekproefontleding van 'n Tsjeggiese adjektief, word 'n definisie gebaseer op data ontleen aan die Tsjeggiese Nasionale Korpus (cf. Čermák en Schmiedtová 2003) geleidelik saamgestel en uiteindelik aangebied wat wys op die gebreke van definisies aangetref in tradisionele woordeboeke. Stappe wat hier onderneem word, word dan veralgemeen en gebruik in 'n geordende reeks (soortgelyk aan 'n werkvloeiordening), as onderwerpe, kortliks bespreek in die tweede deel, waaraan leksikograwe van eentalige woordeboeke aandag behoort te gee. Hulle word aangevul deur bykomende opmerkings en waarskuwings wat nuttig is vir die samestelling van 'n woordeboek. Op dié manier word 'n kort oorsig van sommige van die hoofstappe van woordeboeksamestelling hier aangebied, aangevul deur die oorspronklike Tsjeggiese data, ontleed in hul onbewerkte, alhoewel semioties geklassifiseerde vorm.

    Sleutelwoorde: EENTALIGE WOORDEBOEKE, KORPUSLEKSIKOGRAFIE, SINTAGMATIEK EN PARADIGMATIEK IN WOORDEBOEKE, WOORDEBOEKINSKRYWING, SOORTE LEMMAS, PRAGMATIEK, BEHANDELING VAN

  12. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    Full Text Available We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.

  13. Cross-Linking Mast Cell Specific Gangliosides Stimulates the Release of Newly Formed Lipid Mediators and Newly Synthesized Cytokines

    Directory of Open Access Journals (Sweden)

    Edismauro Garcia Freitas Filho

    2016-01-01

    Full Text Available Mast cells are immunoregulatory cells that participate in inflammatory processes. Cross-linking mast cell specific GD1b derived gangliosides by mAbAA4 results in partial activation of mast cells without the release of preformed mediators. The present study examines the release of newly formed and newly synthesized mediators following ganglioside cross-linking. Cross-linking the gangliosides with mAbAA4 released the newly formed lipid mediators, prostaglandins D2 and E2, without release of leukotrienes B4 and C4. The effect of cross-linking these gangliosides on the activation of enzymes in the arachidonate cascade was then investigated. Ganglioside cross-linking resulted in phosphorylation of cytosolic phospholipase A2 and increased expression of cyclooxygenase-2. Translocation of 5-lipoxygenase from the cytosol to the nucleus was not induced by ganglioside cross-linking. Cross-linking of GD1b derived gangliosides also resulted in the release of the newly synthesized mediators, interleukin-4, interleukin-6, and TNF-α. The effect of cross-linking the gangliosides on the MAP kinase pathway was then investigated. Cross-linking the gangliosides induced the phosphorylation of ERK1/2, JNK1/2, and p38 as well as activating both NFκB and NFAT in a Syk-dependent manner. Therefore, cross-linking the mast cell specific GD1b derived gangliosides results in the activation of signaling pathways that culminate with the release of newly formed and newly synthesized mediators.

  14. Compilation of data from hadronic atoms

    International Nuclear Information System (INIS)

    Poth, H.

    1979-01-01

This compilation is a survey of the existing data on hadronic atoms (pionic atoms, kaonic atoms, antiprotonic atoms, sigmonic atoms). It collects measurements of the energies, intensities and line widths of X-rays from hadronic atoms. Averaged values for each hadronic atom are given and the data are summarized. The listing contains data on 58 pionic atoms, 54 kaonic atoms, 23 antiprotonic atoms and 20 sigmonic atoms. (orig./HB) [de]

  15. Newly Homeless Youth Typically Return Home

    OpenAIRE

    Milburn, Norweeta G.; Rosenthal, Doreen; Rotheram-Borus, Mary Jane; Mallett, Shelley; Batterham, Philip; Rice, Eric; Solorio, Rosa

    2007-01-01

    165 newly homeless adolescents from Melbourne, Australia and 261 from Los Angeles, United States were surveyed and followed for two years. Most newly homeless adolescents returned home (70% U.S., 47% Australia) for significant amounts of time (39% U.S., 17% Australia more than 12 months) within two years of becoming homeless.

  16. HAL/S-FC and HAL/S-360 compiler system program description

    Science.gov (United States)

    1976-01-01

The compiler is a large, multi-phase design that can be broken into four phases: Phase 1 inputs the source language and does a syntactic and semantic analysis, generating the source listing, a file of instructions in an internal format (HALMAT) and a collection of tables to be used in subsequent phases. Phase 1.5 massages the code produced by Phase 1, performing machine-independent optimization. Phase 2 inputs the HALMAT produced by Phase 1 and outputs machine language object modules in a form suitable for the OS-360 or FCOS linkage editor. Phase 3 produces the SDF tables. The four phases described are written in XPL, a language specifically designed for compiler implementation. In addition to the compiler, there is a large library containing all the routines that can be explicitly called by the source language programmer plus a large collection of routines for implementing various facilities of the language.
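A multi-phase organization like the one described (parse to an internal format, machine-independent optimization, then code generation) can be caricatured in a few lines. The mini-language, the tuple IR, and the stack-machine target below are all invented for illustration; they only mirror the phase structure, not HAL/S itself:

```python
# Toy three-phase compiler sketch (invented mini-language, IR, and target).

def phase1_parse(src):
    """Phase 1: turn 'a + 0 + b' style input into an internal format."""
    terms = src.replace(" ", "").split("+")
    return [("PUSH", t) for t in terms] + [("ADD", None)] * (len(terms) - 1)

def phase1_5_optimize(ir):
    """Phase 1.5: machine-independent optimization -- drop additions of literal 0."""
    kept = [(op, arg) for op, arg in ir
            if not (op == "PUSH" and arg == "0")]
    pushes = sum(1 for op, _ in kept if op == "PUSH")
    # Rebuild with one fewer ADD per elided operand (x + 0 == x).
    return [x for x in kept if x[0] == "PUSH"] + [("ADD", None)] * (pushes - 1)

def phase2_codegen(ir):
    """Phase 2: emit 'object code' for an invented stack machine."""
    return "\n".join(op if arg is None else f"{op} {arg}" for op, arg in ir)

print(phase2_codegen(phase1_5_optimize(phase1_parse("a + 0 + b"))))
# → PUSH a
#   PUSH b
#   ADD
```

Each phase consumes only the previous phase's output, which is the property that lets real compilers such as HAL/S-FC swap backends per target machine.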

  17. Internal combustion engines for alcohol motor fuels: a compilation of background technical information

    Energy Technology Data Exchange (ETDEWEB)

    Blaser, Richard

    1980-11-01

    This compilation, a draft training manual containing technical background information on internal combustion engines and alcohol motor fuel technologies, is presented in 3 parts. The first is a compilation of facts from the state of the art on internal combustion engine fuels and their characteristics and requisites and provides an overview of fuel sources, fuels technology and future projections for availability and alternatives. Part two compiles facts about alcohol chemistry, alcohol identification, production, and use, examines ethanol as spirit and as fuel, and provides an overview of modern evaluation of alcohols as motor fuels and of the characteristics of alcohol fuels. The final section compiles cross references on the handling and combustion of fuels for I.C. engines, presents basic evaluations of events leading to the use of alcohols as motor fuels, reviews current applications of alcohols as motor fuels, describes the formulation of alcohol fuels for engines and engine and fuel handling hardware modifications for using alcohol fuels, and introduces the multifuel engines concept. (LCL)

  18. Borrowing and Dictionary Compilation: The Case of the Indigenous ...

    African Journals Online (AJOL)

    rbr

Keywords: BORROWING, DICTIONARY COMPILATION, INDIGENOUS LANGUAGES, LEXICON, MORPHEME, VOCABULARY, DEVELOPING LANGUAGES, LOAN WORDS, TERMINOLOGY, ETYMOLOGY, LEXICOGRAPHY. Opsomming: Ontlening en woordeboeksamestelling: Die geval van inheemse Suid-Afrikaanse ...

  19. Scientific Programming with High Performance Fortran: A Case Study Using the xHPF Compiler

    Directory of Open Access Journals (Sweden)

    Eric De Sturler

    1997-01-01

Full Text Available Recently, the first commercial High Performance Fortran (HPF) subset compilers have appeared. This article reports on our experiences with the xHPF compiler of Applied Parallel Research, version 1.2, for the Intel Paragon. At this stage, we do not expect very High Performance from our HPF programs, even though performance will eventually be of paramount importance for the acceptance of HPF. Instead, our primary objective is to study how to convert large Fortran 77 (F77) programs to HPF such that the compiler generates reasonably efficient parallel code. We report on a case study that identifies several problems when parallelizing code with HPF; most of these problems affect current HPF compiler technology in general, although some are specific to the xHPF compiler. We discuss our solutions from the perspective of the scientific programmer, and present timing results on the Intel Paragon. The case study comprises three programs of different complexity with respect to parallelization. We use the dense matrix-matrix product to show that the distribution of arrays and the order of nested loops significantly influence the performance of the parallel program. We use Gaussian elimination with partial pivoting to study the parallelization strategy of the compiler. There are various ways to structure this algorithm for a particular data distribution. This example shows how much effort may be demanded from the programmer to support the compiler in generating an efficient parallel implementation. Finally, we use a small application to show that the more complicated structure of a larger program may introduce problems for the parallelization, even though all subroutines of the application are easy to parallelize by themselves. The application consists of a finite volume discretization on a structured grid and a nested iterative solver. Our case study shows that it is possible to obtain reasonably efficient parallel programs with xHPF, although the compiler

  20. A compilation of consumers' stories: the development of a video to enhance medication adherence in newly transplanted kidney recipients.

    Science.gov (United States)

    Low, Jac Kee; Crawford, Kimberley; Manias, Elizabeth; Williams, Allison

    2016-04-01

To describe the design, development and evaluation of a consumer-centred video, which was underpinned by the Theory of Planned Behaviour and created to educate newly transplanted kidney recipients about the importance of medication adherence. Kidney transplantation is a treatment whereby medication adherence is critical to ensure long-term kidney graft success. To date, many interventions aimed at improving medication adherence in kidney transplantation have been conducted, but consumers remain largely uninvolved in the interventional design. Qualitative sequential design. Twenty-two participants who had maintained their kidney transplant for at least 8 months and three participants who had experienced a kidney graft loss due to non-adherence were interviewed from March-May 2014 in Victoria, Australia. These interviews were independently reviewed by two researchers and were used to guide the design of the story plot and to identify storytellers for the video. The first draft of the video was evaluated by a panel of seven experts in the field, one independent educational expert and two consumers using Lynn's content validity questionnaire. The content of the video was regarded as highly relevant and comprehensive, achieving a score of >3·7 out of a possible 4. The final 18-minute video comprised 15 sections. Topics included medication management, the factors affecting medication adherence and the absolute necessity of adherence to immunosuppressive medications for graft survival. This paper has demonstrated the feasibility of creating a consumer-driven video that supports medication adherence in an engaging way. © 2015 John Wiley & Sons Ltd.

  1. Approximate Compilation of Constraints into Multivalued Decision Diagrams

    DEFF Research Database (Denmark)

    Hadzic, Tarik; Hooker, John N.; O’Sullivan, Barry

    2008-01-01

    We present an incremental refinement algorithm for approximate compilation of constraint satisfaction models into multivalued decision diagrams (MDDs). The algorithm uses a vertex splitting operation that relies on the detection of equivalent paths in the MDD. Although the algorithm is quite gene...

  2. Compiler-Enforced Cache Coherence Using a Functional Language

    Directory of Open Access Journals (Sweden)

    Rich Wolski

    1996-01-01

Full Text Available The cost of hardware cache coherence, both in terms of execution delay and operational cost, is substantial for scalable systems. Fortunately, compiler-generated cache management can reduce program serialization due to cache contention; increase execution performance; and reduce the cost of parallel systems by eliminating the need for more expensive hardware support. In this article, we use the Sisal functional language system as a vehicle to implement and investigate automatic, compiler-based cache management. We describe our implementation of Sisal for the IBM Power/4. The Power/4, briefly available as a product, represents an early attempt to build a shared memory machine that relies strictly on the language system for cache coherence. We discuss the issues associated with deterministic execution and program correctness on a system without hardware coherence, and demonstrate how Sisal (as a functional language) is able to address those issues.

  3. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This report presents a compilation of both fracture properties and hydrogeological parameters relevant to the flow of groundwater in fractured rock systems. Methods of data acquisition as well as the scale of and conditions during the measurement are recorded. Measurements and analytical techniques for each of the parameters under consideration have been reviewed with respect to their methodology, assumptions and accuracy. Both the rock type and geologic setting associated with these measurements have also been recorded. 373 refs

  4. QMODULE: CAMAC modules recognized by the QAL compiler

    International Nuclear Information System (INIS)

    Kellogg, M.; Minor, M.M.; Shlaer, S.; Spencer, N.; Thomas, R.F. Jr.; van der Beken, H.

    1977-10-01

The compiler for the Q Analyzer Language, QAL, recognizes a certain set of CAMAC modules as having known characteristics. The conventions and procedures used to describe these modules are discussed, as well as the tools available to the user for extending this set as required.

  5. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

Today, large-scale magnetic maps are usually a patchwork of different airborne surveys of different sizes, resolutions, and vintages. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for, e.g., oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilations of magnetic data and the merger of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations which, although discussed before, are still very little recognized. The maximum wavelength that can be resolved in each individual survey is directly related to the survey size, and consequently a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine the airborne compilation with satellite magnetic data, which is sufficient only under particular preconditions.
A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated

  6. Nuclear power plant operational data compilation system

    International Nuclear Information System (INIS)

    Silberberg, S.

    1980-01-01

Electricite de France R and D Division has set up a nuclear power plant operational data compilation system. This data bank, created from American documents, allows results about plant operation and operational material behaviour to be obtained. At present, French units in commercial operation are taken into account. Results obtained after five years of data bank operation are given. (author)

  7. Immunoparesis in newly diagnosed Multiple Myeloma patients

    DEFF Research Database (Denmark)

    Sorrig, Rasmus; Klausen, Tobias W.; Salomo, Morten

    2017-01-01

Immunoparesis (hypogammaglobulinemia) is associated with an unfavorable prognosis in newly diagnosed Multiple Myeloma (MM) patients. However, this finding has not been validated in an unselected population-based cohort. We analyzed 2558 newly diagnosed MM patients in the Danish Multiple Myeloma

  8. Observational Constraints on the Identification of Shallow Lunar Magmatism: Insights from Floor-Fractured Craters

    Science.gov (United States)

    Jozwiak, L. M.; Head, J. W., III; Neumann, G. A.; Wilson, L.

    2016-01-01

Floor-fractured craters are a class of lunar crater hypothesized to form in response to the emplacement of a shallow magmatic intrusion beneath the crater floor. The emplacement of a shallow magmatic body should result in a positive Bouguer anomaly relative to unaltered complex craters, a signal which is observed for the average Bouguer anomaly interior to the crater walls. We observe the Bouguer anomaly of floor-fractured craters on an individual basis using the unfiltered Bouguer gravity solution from GRAIL and also a degree 100-600 band-filtered Bouguer gravity solution. The low magnitude of anomalies arising from shallow magmatic intrusions makes identification using unfiltered Bouguer gravity solutions inconclusive. The observed anomalies in the degree 100-600 Bouguer gravity solution are spatially heterogeneous, although there is spatial correlation between volcanic surface morphologies and positive Bouguer anomalies. We interpret these observations to mean that the spatial heterogeneity observed in the Bouguer signal is the result of variable degrees of magmatic degassing within the intrusions.
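Several records here report Bouguer anomalies. As a point of reference, the simple (infinite-slab) Bouguer reduction subtracts 2πGρh from the free-air anomaly. The sketch below uses the standard 2.67 Mg/m³ reduction density; the station values are invented for illustration and come from none of the listed studies:

```python
# Simple (infinite-slab) Bouguer reduction: BA = FA - 2*pi*G*rho*h.
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
RHO = 2670.0    # standard reduction density, kg/m^3 (2.67 Mg/m^3)

def bouguer_anomaly_mgal(free_air_mgal, elevation_m):
    """Apply the slab correction; 1 mGal = 1e-5 m/s^2."""
    slab = 2.0 * math.pi * G * RHO * elevation_m   # correction in m/s^2
    return free_air_mgal - slab / 1e-5             # back to mGal

# A hypothetical station at 1000 m elevation with a +150 mGal free-air anomaly:
print(round(bouguer_anomaly_mgal(150.0, 1000.0), 1))  # → 38.0
```

Complete Bouguer anomalies, as in the Venezuela compilation above, additionally apply terrain corrections to the slab value; the slab term alone is the first-order part.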

  9. Digital Bedrock Compilation: A Geodatabase Covering Forest Service Lands in California

    Science.gov (United States)

    Elder, D.; de La Fuente, J. A.; Reichert, M.

    2010-12-01

This digital database contains bedrock geologic mapping for Forest Service lands within California. This compilation began in 2004 and the first version was completed in 2005. Second publication of this geodatabase was completed in 2010 and filled major gaps in the southern Sierra Nevada and Modoc/Medicine Lake/Warner Mountains areas. This digital map database was compiled from previously published and unpublished geologic mapping, with source mapping and review from California Geological Survey, the U.S. Geological Survey and others. Much of the source data was itself compilation mapping. This geodatabase is huge, containing ~107,000 polygons and ~280,000 arcs. Mapping was compiled from more than one thousand individual sources and covers over 41,000,000 acres (~166,000 km2). It was compiled from source maps at various scales - from ~1:4,000 to 1:250,000 - and represents the best available geologic mapping at the largest scale possible. An estimated 70-80% of the source information was digitized from geologic mapping at 1:62,500 scale or better. Forest Service ACT2 Enterprise Team compiled the bedrock mapping and developed a geodatabase to store this information. This geodatabase supports feature classes for polygons (e.g., map units), lines (e.g., contacts, boundaries, faults and structural lines) and points (e.g., orientation data, structural symbology). Lookup tables provide detailed information for feature class items. Lookup/type tables contain legal values and hierarchical groupings for geologic ages and lithologies. Type tables link coded values with descriptions for line and point attributes, such as line type, line location and point type. This digital mapping is at the core of many quantitative analyses and derivative map products.
These include the following: (1) ultramafic rocks - where hazards from naturally occurring asbestos are high, (2) granitic rocks - increased

  10. HAL/S-360 compiler system specification

    Science.gov (United States)

    Johnson, A. E.; Newbold, P. N.; Schulenberg, C. W.; Avakian, A. E.; Varga, S.; Helmers, P. H.; Helmers, C. T., Jr.; Hotz, R. L.

    1974-01-01

    A three phase language compiler is described which produces IBM 360/370 compatible object modules and a set of simulation tables to aid in run time verification. A link edit step augments the standard OS linkage editor. A comprehensive run time system and library provide the HAL/S operating environment, error handling, a pseudo real time executive, and an extensive set of mathematical, conversion, I/O, and diagnostic routines. The specifications of the information flow and content for this system are also considered.

  11. T.J. Kriel (original compiler), D.J. Prinsloo and B.P. Sathekge (compilers revised edition). Popular Northern Sotho Dictionary

    OpenAIRE

    Kwena J. Mashamaite

    2011-01-01

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the case with this dictionary.

  12. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature

    International Nuclear Information System (INIS)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched, and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  13. Compiler-Directed Transformation for Higher-Order Stencils

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-20

As the cost of data movement increasingly dominates performance, developers of finite-volume and finite-difference solutions for partial differential equations (PDEs) are exploring novel higher-order stencils that increase numerical accuracy and computational intensity. This paper describes a new compiler reordering transformation applied to stencil operators that performs partial sums in buffers, and reuses the partial sums in computing multiple results. This optimization has multiple effects on improving stencil performance that are particularly important to higher-order stencils: it exploits data reuse, reduces floating-point operations, and exposes efficient SIMD parallelism to backend compilers. We study the benefit of this optimization in the context of Geometric Multigrid (GMG), a widely used method to solve PDEs, using four different Jacobi smoothers built from 7-, 13-, 27-, and 125-point stencils. We quantify performance, speedup, and numerical accuracy, and use the Roofline model to qualify our results. Ultimately, we obtain over 4× speedup on the smoothers themselves and up to a 3× speedup on the multigrid solver. Finally, we demonstrate that high-order multigrid solvers have the potential of reducing total data movement and energy by several orders of magnitude.
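The partial-sums idea described in this record can be illustrated on an ordinary 2D 3×3 box-sum stencil: buffer a vertical 3-point sum per column, then reuse each buffered sum in the three outputs it overlaps horizontally, cutting additions from 8 to roughly 4 per output point. This is only a hand-written sketch of the idea, not the paper's compiler transformation:

```python
# Naive 3x3 box-sum stencil: 8 additions per output point.
def box3x3_naive(a):
    n, m = len(a), len(a[0])
    return [[sum(a[i + di][j + dj] for di in (-1, 0, 1) for dj in (-1, 0, 1))
             for j in range(1, m - 1)] for i in range(1, n - 1)]

# Partial-sums version: buffer per-column vertical sums, reuse each in
# the 3 horizontally overlapping outputs (~4 additions per output point).
def box3x3_partial_sums(a):
    n, m = len(a), len(a[0])
    col = [[a[i - 1][j] + a[i][j] + a[i + 1][j] for j in range(m)]
           for i in range(1, n - 1)]
    return [[col[i][j - 1] + col[i][j] + col[i][j + 1]
             for j in range(1, m - 1)] for i in range(len(col))]

grid = [[i * 4 + j for j in range(4)] for i in range(4)]
print(box3x3_naive(grid) == box3x3_partial_sums(grid))  # → True
```

The buffered column sums are exactly the "partial sums in buffers" of the abstract; in the compiler setting, the reordering also leaves the inner loop in a form SIMD backends vectorize well.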

  14. Expert Programmer versus Parallelizing Compiler: A Comparative Study of Two Approaches for Distributed Shared Memory

    Directory of Open Access Journals (Sweden)

    M. F. P. O'Boyle

    1996-01-01

    Full Text Available This article critically examines current parallel programming practice and optimizing compiler development. The general strategies employed by compiler and programmer to optimize a Fortran program are described, and then illustrated for a specific case by applying them to a well-known scientific program, TRED2, using the KSR-1 as the target architecture. Extensive measurement is applied to the resulting versions of the program, which are compared with a version produced by a commercial optimizing compiler, KAP. The compiler strategy significantly outperforms KAP and does not fall far short of the performance achieved by the programmer. Following the experimental section each approach is critiqued by the other. Perceived flaws, advantages, and common ground are outlined, with an eye to improving both schemes.

  15. Source list of nuclear data bibliographies, compilations, and evaluations

    International Nuclear Information System (INIS)

    Burrows, T.W.; Holden, N.E.

    1978-10-01

    To aid the user of nuclear data, many specialized bibliographies, compilations, and evaluations have been published. This document is an attempt to bring together a list of such publications with an indication of their availability and cost

  16. Generational differences among newly licensed registered nurses.

    Science.gov (United States)

    Keepnews, David M; Brewer, Carol S; Kovner, Christine T; Shin, Juh Hyun

    2010-01-01

    Responses of 2369 newly licensed registered nurses from 3 generational cohorts-Baby Boomers, Generation X, and Generation Y-were studied to identify differences in their characteristics, work-related experiences, and attitudes. These responses revealed significant differences among generations in: job satisfaction, organizational commitment, work motivation, work-to-family conflict, family-to-work conflict, distributive justice, promotional opportunities, supervisory support, mentor support, procedural justice, and perceptions of local job opportunities. Health organizations and their leaders need to anticipate intergenerational differences among newly licensed nurses and should provide for supportive working environments that recognize those differences. Orientation and residency programs for newly licensed nurses should be tailored to the varying needs of different generations. Future research should focus on evaluating the effectiveness of orientation and residency programs with regard to different generations so that these programs can be tailored to meet the varying needs of newly licensed nurses at the start of their careers. Copyright 2010 Mosby, Inc. All rights reserved.

  17. T.J. Kriel (original compiler), D.J. Prinsloo and B.P. Sathekge (compilers, revised edition). Popular Northern Sotho Dictionary

    Directory of Open Access Journals (Sweden)

    Kwena J. Mashamaite

    2011-10-01

    Full Text Available The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the case with this dictionary.

  18. Design Choices in a Compiler Course or How to Make Undergraduates Love Formal Notation

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    2008-01-01

    The undergraduate compiler course offers a unique opportunity to combine many aspects of the Computer Science curriculum. We discuss the many design choices that are available for the instructor and present the current compiler course at the University of Aarhus, the design of which displays at l...

  19. A Coarse-Grained Reconfigurable Architecture with Compilation for High Performance

    Directory of Open Access Journals (Sweden)

    Lu Wan

    2012-01-01

    Full Text Available We propose a fast data relay (FDR) mechanism to enhance existing CGRAs (coarse-grained reconfigurable architectures). FDR can not only provide multicycle data transmission concurrently with computations but also convert resource-demanding inter-processing-element global data accesses into local data accesses to avoid communication congestion. We also propose the supporting compiler techniques that can efficiently utilize the FDR feature to achieve higher performance for a variety of applications. Our results on FDR-based CGRA are compared with two other works in this field: ADRES and RCP. Experimental results for various multimedia applications show that FDR combined with the new compiler delivers up to 29% and 21% higher performance than ADRES and RCP, respectively.

  20. Construction experiences from underground works at Forsmark. Compilation Report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders [Vattenfall Power Consultant AB, Stockholm (Sweden); Christiansson, Rolf [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)

    2007-02-15

    The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR on primarily rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements and advice on tunnel support and grouting). The report is to a great extent based on earlier published material as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  1. Construction experiences from underground works at Forsmark. Compilation Report

    International Nuclear Information System (INIS)

    Carlsson, Anders; Christiansson, Rolf

    2007-02-01

    The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR on primarily rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements and advice on tunnel support and grouting). The report is to a great extent based on earlier published material as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible

  2. Specification and Compilation of Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Geuns, S.J.

    2015-01-01

    This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. An example of such applications are software defined radio applications. These applications typically

  3. Chromium-Containing Traditional Chinese Medicine, Tianmai Xiaoke Tablet, for Newly Diagnosed Type 2 Diabetes Mellitus: A Meta-Analysis and Systematic Review of Randomized Clinical Trials.

    Science.gov (United States)

    Gu, Yuming; Xu, Xuemin; Wang, Zhe; Xu, Yunsheng; Liu, Xiuzhi; Cao, Lejun; Wang, Xueyang; Li, Zhengxin; Feng, Bo

    2018-01-01

    Chromium-containing traditional Chinese medicine Tianmai Xiaoke tablet (TMXKT) is approved for treating newly diagnosed type 2 diabetes mellitus (T2DM) in China. This review aimed to compile the evidence from randomized clinical trials (RCTs) and quantify the effects of TMXKT on newly diagnosed T2DM. Seven online databases were searched up to March 20, 2017. The meta-analysis included RCTs investigating the treatment of newly diagnosed T2DM, in which TMXKT combined with conventional therapy was compared with placebo or conventional therapy. The risk of bias was evaluated using the Cochrane Collaboration tool. The mean difference (MD) and the standardized mean difference were estimated with 95% confidence intervals (CI), taking interstudy heterogeneity into account. The outcomes were measured using fasting blood glucose (FBG), 2-h postprandial blood glucose (2hPG), glycosylated hemoglobin A1c (HbA1c), and body mass index (BMI) levels. TMXKT combined with conventional therapy lowered FBG (MD = -0.68, 95% CI -0.90 to -0.45, P < 0.00001), 2hPG (MD = -1.33, 95% CI -1.86 to -0.79, P < 0.00001), HbA1c (MD = -0.46, 95% CI -0.57 to -0.36, P < 0.00001), and BMI (MD = -0.77, 95% CI -1.12 to -0.41, P < 0.00001). TMXKT combined with conventional therapy is beneficial for patients with newly diagnosed T2DM. However, the effectiveness and safety of TMXKT remain uncertain because of the limited number of trials and their low methodological quality. Therefore, practitioners should be cautious when applying TMXKT in daily practice, and well-designed clinical trials are needed in the future.
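The pooled mean differences quoted above come from standard meta-analytic machinery. A minimal sketch of inverse-variance pooling of per-trial mean differences (a fixed-effect model for simplicity; the trial values below are hypothetical, not taken from the review):

```python
import math

def pooled_md(trials):
    """Fixed-effect inverse-variance pooling.
    trials: list of (mean_difference, standard_error) per RCT.
    Returns (pooled MD, 95% CI)."""
    weights = [1.0 / se ** 2 for _, se in trials]          # w_i = 1 / SE_i^2
    md = sum(w * d for (d, _), w in zip(trials, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))                     # SE of pooled MD
    return md, (md - 1.96 * se, md + 1.96 * se)

# Hypothetical per-trial FBG mean differences (mmol/L) and standard errors:
trials = [(-0.7, 0.2), (-0.5, 0.25), (-0.9, 0.3)]
md, ci = pooled_md(trials)
```

A random-effects model (as typically used when interstudy heterogeneity is present) would additionally inflate the per-trial variances by a heterogeneity estimate before weighting.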

  4. Compilation of a preliminary checklist for the differential diagnosis of neurogenic stuttering

    Directory of Open Access Journals (Sweden)

    Mariska Lundie

    2014-06-01

    Objectives: The aim of this study was to describe and highlight the characteristics of NS in order to compile a preliminary checklist for accurate diagnosis and intervention. Method: An explorative, applied mixed method, multiple case study research design was followed. Purposive sampling was used to select four participants. A comprehensive assessment battery was compiled for data collection. Results: The results revealed a distinct pattern of core stuttering behaviours in NS, although discrepancies existed regarding stuttering severity and frequency. It was also found that DS and NS can co-occur. The case history and the core stuttering pattern are important considerations during differential diagnosis, as these are the only consistent characteristics in people with NS. Conclusion: It is unlikely that all the symptoms of NS are present in an individual. The researchers scrutinised the findings of this study and the findings of previous literature to compile a potentially workable checklist.

  5. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia; Angulo, C.; Arnould, M.

    2000-01-01

    The rapidly growing wealth of nuclear data becomes less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (authors)

  6. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia

    1999-01-01

    The rapidly growing wealth of nuclear data becomes less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (author)

  7. Not mere lexicographic cosmetics: the compilation and structural ...

    African Journals Online (AJOL)

    This article offers a brief overview of the compilation of the Ndebele music terms dictionary, Isichazamazwi SezoMculo (henceforth the ISM), paying particular attention to its struc-tural features. It emphasises that the reference needs of the users as well as their reference skills should be given a determining role in all ...

  8. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1997, July--September

    International Nuclear Information System (INIS)

    Stevenson, L.L.

    1998-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. This report contains the third quarter 1997 abstracts

  9. Data compilation for particle impact desorption

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeuchi, Fujio.

    1984-05-01

    The desorption of gases from solid surfaces by incident electrons, ions and photons is one of the important processes of hydrogen recycling in the controlled thermonuclear reactors. We have surveyed the literature concerning the particle impact desorption published through 1983 and compiled the data on the desorption cross sections and desorption yields with the aid of a computer. This report presents the results obtained for electron stimulated desorption, the desorption cross sections and yields being given in graphs and tables as functions of incident electron energy, surface temperature and gas exposure. (author)

  10. Compilation of actinide neutron nuclear data

    International Nuclear Information System (INIS)

    1979-01-01

    The Swedish nuclear data committee has compiled a selected set of neutron cross section data for the 16 most important actinide isotopes. The aim of the report is to present available data in a comprehensible way to allow a comparison between different evaluated libraries and to judge the reliability of these libraries from the experimental data. The data are given in graphical form below about 1 eV and above about 10 keV, while the 2200 m/s cross sections and resonance integrals are given in numerical form. (G.B.)

  11. Recent Efforts in Data Compilations for Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Dillmann, Iris

    2008-01-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on 'Nuclear Physics Data Compilation for Nucleosynthesis Modeling' held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The 'JINA Reaclib Database' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A 'high priority list' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. 'Workflow tools' aim to make the evaluation process transparent and allow users to follow the progress.

  12. Recent Efforts in Data Compilations for Nuclear Astrophysics

    Science.gov (United States)

    Dillmann, Iris

    2008-05-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on "Nuclear Physics Data Compilation for Nucleosynthesis Modeling" held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The "JINA Reaclib Database" on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A "high priority list" for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. "Workflow tools" aim to make the evaluation process transparent and allow users to follow the progress.

  13. Indexed compilation of experimental high energy physics literature

    International Nuclear Information System (INIS)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is given, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given

  14. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Full Text Available Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population’s state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the “Wissel plot”, where –ln(1 – P0(t is plotted against time t. This plot is based on the equation P0t=1–c1e–ω1t, which relates the probability of extinction by time t, P0(t, to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population’s probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus. A newly founded population reaches the established phase if the intercept of the (extrapolated linear parts of the “Wissel plot” with the y-axis, which is –ln(c1, is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population’s viability by distinguishing establishment from persistence.
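The linearisation behind the "Wissel plot" can be sketched numerically. Assuming synthetic values for c1 and ω1 (illustrative only, not the wild-dog estimates), a least-squares line through the points (t, –ln(1 – P0(t))) recovers ω1 as the slope and –ln(c1) as the y-intercept, whose sign is the establishment criterion described in the abstract:

```python
import math

def wissel_fit(ts, p0s):
    """Least-squares line through (t, -ln(1 - P0(t))); returns (c1, omega1)."""
    ys = [-math.log(1.0 - p) for p in p0s]
    n = len(ts)
    tbar = sum(ts) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
             / sum((t - tbar) ** 2 for t in ts))
    intercept = ybar - slope * tbar      # equals -ln(c1); negative => established
    return math.exp(-intercept), slope   # (c1, omega1)

# Synthetic extinction probabilities from P0(t) = 1 - c1*exp(-omega1*t),
# with hypothetical c1 = 0.8 and omega1 = 0.05:
ts = [10, 20, 30, 40, 50]
p0s = [1 - 0.8 * math.exp(-0.05 * t) for t in ts]
c1, omega1 = wissel_fit(ts, p0s)
```

In practice P0(t) would come from replicate runs of the stochastic population model, and only the linear (established-phase) part of the plot would be fitted.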

  15. A Forth interpreter and compiler's study for computer aided design

    International Nuclear Information System (INIS)

    Djebbar, F. Zohra Widad

    1986-01-01

    The wide field of utilization of FORTH leads us to develop an interpreter. It has been implemented on a MC 68000 microprocessor based computer, with ASTERIX, a UNIX-like operating system (a real-time system written by C.E.A.). This work has been done in two different versions: - The first one, fully written in C language, ensures good portability on a wide variety of microprocessors. But the performance estimations reveal excessive execution times, and lead to a new optimized version. - This new version is characterized by the compilation of the most frequently used words of the FORTH basis. This allows us to get an interpreter with good performances and an execution speed close to that of the C compiler. (author) [fr

  16. Regulatory and technical reports (abstract index journal). Annual compilation for 1984. Volume 9, No. 4

    International Nuclear Information System (INIS)

    1985-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  17. Compilation of the nuclear codes available in CTA

    International Nuclear Information System (INIS)

    D'Oliveira, A.B.; Moura Neto, C. de; Amorim, E.S. do; Ferreira, W.J.

    1979-07-01

    The present work is a compilation of some nuclear codes available in the Divisao de Estudos Avancados of the Instituto de Atividades Espaciais (EAV/IAE/CTA). The codes are organized according to the classification given by the Argonne National Laboratory. For each code the following are given: author, institution of origin, abstract, programming language and existing bibliography. (Author) [pt

  18. Regulatory and technical reports: (Abstract index journal). Compilation for first quarter 1997, January--March

    International Nuclear Information System (INIS)

    Sheehan, M.A.

    1997-06-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation is published quarterly and cumulated annually. Reports consist of staff-originated reports, NRC-sponsored conference reports, NRC contractor-prepared reports, and international agreement reports

  19. A Journey from Interpreters to Compilers and Virtual Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We review a simple sequence of steps to stage a programming-language interpreter into a compiler and virtual machine. We illustrate the applicability of this derivation with a number of existing virtual machines, mostly for functional languages. We then outline its relevance for today's language...

  20. Updated site compilation of the Latin American Pollen Database

    NARCIS (Netherlands)

    Flantua, S.G.A.; Hooghiemstra, H.; Grimm, E.C.; Behling, H.; Bush, M.B; González-Arrango, C.; Gosling, W.D.; Ledru, M.-P.; Lozano-Garciá, S.; Maldonado, A.; Prieto, A.R.; Rull, V.; van Boxel, J.H.

    2015-01-01

    The updated inventory of the Latin American Pollen Database (LAPD) offers a wide range of new insights. This paper presents a systematic compilation of palynological research in Latin America. A comprehensive inventory of publications in peer-reviewed and grey literature shows a major expansion of

  1. Compiling an OPEC Word List: A Corpus-Informed Lexical Analysis

    Directory of Open Access Journals (Sweden)

    Ebtisam Saleh Aluthman

    2017-01-01

    Full Text Available The present study is conducted within the borders of lexicographic research, where corpora have increasingly become all-pervasive. The overall goal of this study is to compile an open-source OPEC[1] Word List (OWL) that is available for lexicographic research and vocabulary learning related to English language learning for the purpose of oil marketing and oil industries. To achieve this goal, an OPEC Monthly Reports Corpus (OMRC) comprising 1,004,542 words was compiled. The OMRC consists of 40 OPEC monthly reports released between 2003 and 2015. Consideration was given to both range and frequency criteria when compiling the OWL, which consists of 255 word types. Along with this basic goal, this study aims to investigate the coverage of the most well-recognised word lists, the General Service List of English Words (GSL) (West, 1953) and the Academic Word List (AWL) (Coxhead, 2000), in the OMRC corpus. The 255 word types included in the OWL do not overlap with either the AWL or the GSL. Results suggest the necessity of making this discipline-specific word list for ESL students of oil marketing industries. The availability of the OWL has significant pedagogical contributions to curriculum design, learning activities and the overall process of vocabulary learning in the context of teaching English for specific purposes (ESP). OPEC stands for Organisation of Petroleum Exporting Countries.
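The range-and-frequency selection named above can be sketched as follows (the thresholds, tokenizer, and toy documents are illustrative, not the study's actual criteria): frequency is a word's total count across the corpus, range is the number of documents it appears in, and a word type enters the list only if it clears both thresholds.

```python
from collections import Counter

def compile_word_list(documents, min_freq=3, min_range=2):
    """Select word types by corpus frequency AND document range."""
    freq = Counter()       # total occurrences across all documents
    doc_range = Counter()  # number of documents each word appears in
    for doc in documents:
        tokens = doc.lower().split()
        freq.update(tokens)
        doc_range.update(set(tokens))  # count each word once per document
    return sorted(w for w in freq
                  if freq[w] >= min_freq and doc_range[w] >= min_range)

# Toy "monthly reports":
docs = ["crude oil output quota oil",
        "oil demand outlook quota",
        "supply demand balance oil quota"]
words = compile_word_list(docs, min_freq=3, min_range=2)
```

A real pipeline would also lemmatize tokens and subtract words already covered by general-purpose lists such as the GSL and AWL.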

  2. Compilation of accident statistics in PSE

    International Nuclear Information System (INIS)

    Jobst, C.

    1983-04-01

    The objective of the investigations on transportation carried out within the framework of the 'Project - Studies on Safety in Waste Management (PSE II)' is the determination of the risk of accidents in the transportation of radioactive materials by rail. The fault tree analysis is used for the determination of risks in the transportation system. This method offers a possibility for the determination of frequency and consequences of accidents which could lead to an unintended release of radionuclides. The study presented compiles all data obtained from the accident statistics of the Federal German Railways. (orig./RB) [de

  3. abc: An extensible AspectJ compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its frontend is built, using the Polyglot framework, as a modular extension of the Java...... language. The use of Polyglot gives flexibility of syntax and type checking. The backend is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general overview...

  4. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    account for the multilingual concept literacy glossaries being compiled under the auspices of .... a theory, i.e. the set of premises, arguments and conclusions required for explaining ... fully address cognitive and communicative needs, especially of laypersons. ..... tion at UCT, and in indigenous languages as auxiliary media.

  5. Individual risk. A compilation of recent British data

    International Nuclear Information System (INIS)

    Grist, D.R.

    1978-08-01

    A compilation of data is presented on individual risk obtained from recent British population and mortality statistics. Risk data presented include: risk of death, as a function of age, due to several important natural causes and due to accidents and violence; risk of death as a function of location of accident; and risk of death from various accidental causes. (author)

  6. Shear-wave velocity compilation for Northridge strong-motion recording sites

    Science.gov (United States)

    Borcherdt, Roger D.; Fumal, Thomas E.

    2002-01-01

    Borehole and other geotechnical information collected at the strong-motion recording sites of the Northridge earthquake of January 17, 1994 provide an important new basis for the characterization of local site conditions. These geotechnical data, when combined with analysis of strong-motion recordings, provide an empirical basis to evaluate site coefficients used in current versions of US building codes. Shear-wave-velocity estimates to a depth of 30 meters are derived for 176 strong-motion recording sites. The estimates are based on borehole shear-velocity logs, physical property logs, correlations with physical properties and digital geologic maps. Surface-wave velocity measurements and standard penetration data are compiled as additional constraints. These data, compiled from a variety of databases, are presented via GIS maps and corresponding tables to facilitate use by other investigators.
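A sketch of the standard time-averaged 30 m shear-wave velocity (commonly written Vs30) that underlies such site characterizations: Vs30 = 30 / Σ(h_i / v_i) for a layered profile, i.e. total depth divided by vertical travel time, not the arithmetic mean of layer velocities. The layered profile below is hypothetical, not from the report.

```python
def vs30(layers):
    """Time-averaged shear-wave velocity over the top 30 m.
    layers: list of (thickness_m, shear_velocity_m_per_s) summing to 30 m."""
    total_depth = sum(h for h, _ in layers)
    assert abs(total_depth - 30.0) < 1e-9, "profile must span exactly 30 m"
    travel_time = sum(h / v for h, v in layers)  # seconds for a vertical ray
    return 30.0 / travel_time

# Hypothetical two-layer profile: 10 m of soil over 20 m of weathered rock.
site = [(10.0, 200.0), (20.0, 600.0)]
```

Note the harmonic-style average is pulled toward the slow surface layer, which is why shallow soft soils dominate site classification.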

  7. Methodology and procedures for compilation of historical earthquake data

    International Nuclear Information System (INIS)

    1987-10-01

This report was prepared subsequent to the recommendations of the project initiation meeting in Vienna, November 25-29, 1985, under the IAEA Interregional project INT/9/066 Seismic Data for Nuclear Power Plant Siting. The aim of the project is to co-ordinate the national efforts of Member States in the Mediterranean region in the compilation and processing of historical earthquake data for use in the siting of nuclear facilities. The main objective of the document is to assist the participating Member States, especially those initiating an NPP siting programme, in their efforts to compile and process historical earthquake data, and to provide a uniform interregional framework for this task. Although the document is directed mainly to the Mediterranean countries, using illustrative examples from this region, the basic procedures and methods described herein may be applicable to other parts of the world, such as Southeast Asia, the Himalayan belt and Latin America. 101 refs, 7 figs

  8. Digital compilation bedrock geologic map of the Milton quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-8A Dorsey, R, Doolan, B, Agnew, PC, Carter, CM, Rosencrantz, EJ, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the Milton...

  9. The RHNumtS compilation: Features and bioinformatics approaches to locate and quantify Human NumtS

    Directory of Open Access Journals (Sweden)

    Saccone Cecilia

    2008-06-01

Background: To a greater or lesser extent, eukaryotic nuclear genomes contain fragments of their mitochondrial genome counterpart, deriving from the random insertion of damaged mtDNA fragments. NumtS (Nuclear mt Sequences) are not equally abundant in all species, and are redundant and polymorphic in terms of copy number. In population and clinical genetics, it is important to have a complete overview of NumtS quantity and location. Searching PubMed for NumtS or mitochondrial pseudo-genes yields hundreds of papers reporting Human NumtS compilations produced by in silico or wet-lab approaches. A comparison of published compilations clearly shows significant discrepancies among the data, due both to unsound application of bioinformatics methods and to the still incompletely assembled nuclear genome. To optimize quantification and location of NumtS, we produced a consensus compilation of Human NumtS by applying various bioinformatics approaches. Results: Location and quantification of NumtS may be achieved by applying database similarity search methods: we applied various methods such as Blastn, MegaBlast and BLAT, varying both parameters and databases; the results were compared, further analysed and checked against the already published compilations, thus producing the Reference Human NumtS (RHNumtS) compilation. The resulting NumtS total 190. Conclusion: The RHNumtS compilation represents a highly reliable reference basis, which may allow designing a lab protocol to test the actual existence of each NumtS. Here we report preliminary results based on PCR amplification and sequencing of 41 NumtS selected from the RHNumtS among those with lower scores. In parallel, we are currently designing the RHNumtS database structure for implementation in the HmtDB resource.
In the future, the same database will host NumtS compilations from other organisms, but these will be generated only when the nuclear genome of a specific organism has reached a high

  10. Digital compilation bedrock geologic map of the Lincoln quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-5A Stanley, R, DelloRusso, V, Haydock, S, Lapp, E, O'Loughlin, S, Prewitt, J,and Tauvers, PR, 1995, Digital compilation bedrock geologic map...

  11. Data compilation of angular distributions of sputtered atoms

    International Nuclear Information System (INIS)

    Yamamura, Yasunori; Takiguchi, Takashi; Tawara, Hiro.

    1990-01-01

Sputtering on a surface is generally caused by the collision cascade developed near the surface. The process is in principle the same as that causing radiation damage in the bulk of solids. Sputtering has long been regarded as an undesirable dirty effect which destroys the cathodes and grids in gas discharge tubes or ion sources and contaminates the plasma and the surrounding walls. However, sputtering is used today for many applications, such as sputter ion sources, mass spectrometers and the deposition of thin films. Plasma contamination and the surface erosion of first walls due to sputtering are still major problems in fusion research. The angular distribution of the particles sputtered from solid surfaces can provide detailed information on the collision cascade in the interior of targets. This report presents a compilation of the angular distributions of sputtered atoms at normal and oblique incidence for various combinations of incident ions and target atoms. The angular distributions of sputtered atoms from monatomic solids at normal and oblique incidence, and the compilation of the data on these distributions, are reported. (K.I.)

  12. Ada Integrated Environment III Computer Program Development Specification. Volume III. Ada Optimizing Compiler.

    Science.gov (United States)

    1981-12-01

library-file.library-unit(.subunit).SYMAP Statement Map: library-file.library-unit(.subunit).SMAP Type Map: library-file.library-unit(.subunit).TMAP The library...generator SYMAP Symbol Map code generator SMAP Updated Statement Map code generator TMAP Type Map code generator A.3.5 The PUNIT Command The PUNIT...Core.Stmtmap) NAME Tmap (Core.Typemap) END Example A-3 Compiler Command Stream for the Code Generator Texas Instruments A-5 Ada Optimizing Compiler

  13. A new compiler for the GANIL Data Acquisition description

    International Nuclear Information System (INIS)

    Saillant, F.; Raine, B.

    1997-01-01

An important feature of the GANIL Data Acquisition System is the description of experiments by means of a language developed at GANIL. The philosophy is to attribute to each element (parameters, spectra, etc.) an operational name which can be used at any level of the system. This language references a library of modules to free the user from the technical details of the hardware. The compiler has recently been entirely re-developed using technologies such as the object-oriented language C++ and an object-oriented software development method and tool. This enables us to provide new functionality or to support a new electronic module within a very short time and without any deep modification of the application. A new Dynamic Library of Modules has also been developed. Its complete description is available on the GANIL WEB site http://ganinfo.in2p3.fr/acquisition/homepage.html. This new compiler brings many new functionalities, the most important of which is the notion of a 'register', whatever the module standard. All the registers described in the module provider's documentation can now be accessed by their names. Another important new feature is the notion of a 'function' that can be executed on a module. A set of new instructions has also been implemented to execute commands on CAMAC crates. The new compiler also makes it possible to describe specific interfaces with the GANIL Data Acquisition System. This has been used to describe the coupling of the CHIMERA Data Acquisition System with the INDRA one through a shared memory in the VME crate. (authors)

  14. A quantum CISC compiler and scalable assembler for quantum computing on large systems

    Energy Technology Data Exchange (ETDEWEB)

    Schulte-Herbrueggen, Thomas; Spoerl, Andreas; Glaser, Steffen [Dept. Chemistry, Technical University of Munich (TUM), 85747 Garching (Germany)

    2008-07-01

Using the cutting-edge high-speed parallel cluster HLRB-II (with a total LINPACK performance of 63.3 TFlops), we present a quantum CISC compiler into time-optimised or decoherence-protected complex instruction sets. These comprise effective multi-qubit interactions with up to 10 qubits. We show how to assemble these medium-sized CISC modules in a scalable way for quantum computation on large systems. Extending the toolbox of universal gates by optimised complex multi-qubit instruction sets paves the way to fight decoherence in realistic Markovian and non-Markovian settings. The advantage of quantum CISC compilation over standard RISC compilation into one- and two-qubit universal gates is demonstrated, inter alia, for the quantum Fourier transform (QFT) and for multiply-controlled NOT gates. The speed-up is up to a factor of six, thus giving significantly better performance under decoherence. Implications for upper limits on time complexities are also derived.

  15. Regulatory and technical reports (abstract index journal): Annual compilation for 1994. Volume 19, Number 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order. These precede the following indexes: secondary report number index, personal author index, subject index, NRC originating organization index (staff reports), NRC originating organization index (international agreements), NRC contract sponsor index (contractor reports), contractor index, international organization index, and licensed facility index. A detailed explanation of the entries precedes each index.

  16. Digital compilation bedrock geologic map of the Warren quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-4A Walsh, GJ, Haydock, S, Prewitt, J, Kraus, J, Lapp, E, O'Loughlin, S, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the...

  17. Fiscal 1998 research report on super compiler technology; 1998 nendo super konpaira technology no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

For next-generation supercomputing systems, research was conducted on parallel and distributed compiler technology for enhancing effective performance, and on the related software and architectures that improve performance in coordination with compilers. Regarding parallel compiler technology, scalable automatic parallelizing compiler technology, parallel tuning tools, and an operating system that uses multi-processor resources effectively are identified as important concrete technical development issues. In addition, applying these research results to the architecture of single-chip multi-processors opens the possibility of developing and expanding the PC, WS and HPC (high-performance computer) markets and of creating new industries. Although wide-area distributed computing is attracting attention as a next-generation computing industry, the concrete industrial fields that will use such computing are not yet clear, and research remains at an exploratory stage. (NEDO)

  18. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    In order to support concept literacy, especially for students for whom English is not the native language, a number of universities in South Africa are compiling multilingual glossaries through which the use of languages other than English may be employed as auxiliary media. Terminologies in languages other than English ...

  19. Thoughts and views on the compilation of monolingual dictionaries ...

    African Journals Online (AJOL)

    The end-products should be of a high lexicographic standard, well-balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries could be easily ...

  20. 12 CFR 503.2 - Exemptions of records containing investigatory material compiled for law enforcement purposes.

    Science.gov (United States)

    2010-01-01

Title 12 (Banks and Banking), Volume 5, revised 2010-01-01: Exemptions of records containing investigatory material compiled for law enforcement purposes. Section 503.2, Banks and Banking, OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY, PRIVACY ACT § 503.2 Exemptions of records containing investigatory material compiled for law enforcement...

  1. Compilation of historical information of 300 Area facilities and activities

    International Nuclear Information System (INIS)

    Gerber, M.S.

    1992-12-01

This document is a compilation of historical information on 300 Area activities and facilities since their beginnings. The 300 Area is shown as it looked in 1945, and a more recent (1985) view of the 300 Area is also provided.

  2. Compilation of historical information of 300 Area facilities and activities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, M.S.

    1992-12-01

This document is a compilation of historical information on 300 Area activities and facilities since their beginnings. The 300 Area is shown as it looked in 1945, and a more recent (1985) view of the 300 Area is also provided.

  3. Data compilation for radiation effects on ceramic insulators

    International Nuclear Information System (INIS)

    Fukuya, Koji; Terasawa, Mititaka; Nakahigashi, Shigeo; Ozawa, Kunio.

    1986-08-01

Data on radiation effects on ceramic insulators were compiled from the literature and summarized from the viewpoint of fast-neutron irradiation effects. The data were classified according to property and ceramic. The properties are dimensional stability, mechanical properties, thermal properties, and electrical and dielectric properties. Data sheets were prepared for each table or graph in the literature. The characteristic features of the data base are briefly described. (author)

  4. Just-In-Time compilation of OCaml byte-code

    OpenAIRE

    Meurer, Benedikt

    2010-01-01

    This paper presents various improvements that were applied to OCamlJIT2, a Just-In-Time compiler for the OCaml byte-code virtual machine. OCamlJIT2 currently runs on various Unix-like systems with x86 or x86-64 processors. The improvements, including the new x86 port, are described in detail, and performance measures are given, including a direct comparison of OCamlJIT2 to OCamlJIT.

  5. GRESS, FORTRAN Pre-compiler with Differentiation Enhancement

    International Nuclear Information System (INIS)

    1999-01-01

1 - Description of program or function: The GRESS FORTRAN pre-compiler (SYMG) and run-time library are used to enhance conventional FORTRAN-77 programs with analytic differentiation of arithmetic statements for automatic differentiation in either forward or reverse mode. GRESS 3.0 is functionally equivalent to GRESS 2.1. GRESS 2.1 is an improved and updated version of the previously released GRESS 1.1. Improvements in the implementation of the CHAIN option have resulted in a 70 to 85% reduction in execution time and up to a 50% reduction in the memory required for forward-chaining applications. 2 - Method of solution: GRESS uses a pre-compiler to analyze FORTRAN statements and determine the mathematical operations embodied in them. As each arithmetic assignment statement in a program is analyzed, SYMG generates the partial derivatives of the term on the left with respect to each floating-point variable on the right. The result of the pre-compilation step is a new FORTRAN program that can produce derivatives for any REAL (i.e., single or double precision) variable calculated by the model. Consequently, GRESS enhances FORTRAN programs or subprograms by adding the calculation of derivatives along with the original output. Derivatives from a GRESS-enhanced model can be used internally (e.g., iteration acceleration) or externally (e.g., sensitivity studies). By calling GRESS run-time routines, derivatives can be propagated through the code via the chain rule (referred to as the CHAIN option) or accumulated to create an adjoint matrix (referred to as the ADGEN option). A third option, GENSUB, makes it possible to process a subset of a program (i.e., a do loop, subroutine, function, a sequence of subroutines, or a whole program) for calculating derivatives of dependent variables with respect to independent variables. A code enhanced with the GENSUB option can use forward mode, reverse mode, or a hybrid of the two modes.
3 - Restrictions on the complexity of the problem: GRESS
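The chain-rule propagation that GRESS performs at the FORTRAN source level can be illustrated in miniature with dual numbers, a standard forward-mode automatic differentiation device: each value carries its derivative, and every arithmetic operation updates both. This sketch is illustrative only; the class and function names are ours, not part of GRESS:

```python
class Dual:
    """A value paired with its derivative; arithmetic applies the chain rule."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # analytically, f'(x) = 6x + 2

x = Dual(2.0, 1.0)   # seed derivative dx/dx = 1
y = f(x)
print(y.val, y.der)  # 17.0 14.0
```

A source-to-source tool like GRESS effectively rewrites each assignment to carry these derivative terms alongside the original computation, rather than overloading operators at run time.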

  6. Indexed compilation of experimental high energy physics literature. [Synopsis

    Energy Technology Data Exchange (ETDEWEB)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is given, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  7. Final report: Compiled MPI. Cost-Effective Exascale Application Development

    Energy Technology Data Exchange (ETDEWEB)

    Gropp, William Douglas [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-12-21

This is the final report on Compiled MPI: Cost-Effective Exascale Application Development, and summarizes the results of this project. The project investigated runtime environments that improve the performance of MPI (Message-Passing Interface) programs; work at Illinois in the last period of the project looked at optimizing data accesses expressed with MPI datatypes.

  8. Compilation of MCNP data library based on JENDL-3T and test through analysis of benchmark experiment

    International Nuclear Information System (INIS)

    Sakurai, K.; Sasamoto, N.; Kosako, K.; Ishikawa, T.; Sato, O.; Oyama, Y.; Narita, H.; Maekawa, H.; Ueki, K.

    1989-01-01

Based on the evaluated nuclear data library JENDL-3T, a temporary version of JENDL-3, a pointwise neutron cross-section library for the MCNP code is compiled, covering 39 nuclides from H-1 to Am-241 that are important for shielding calculations. Compilation is performed with a code system consisting of the nuclear data processing code NJOY-83 and the library compilation code MACROS. The validity of the code system and the reliability of the library are confirmed by analysing benchmark experiments. (author)

  9. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1991-01-01

Reliability data are essential in probabilistic safety assessment, with component reliability parameters being particularly important. Plant-specific component failure data would be most appropriate, but such data are rather limited. However, similar components are used in different designs, so generic data, that is, all data that are not specific to the plant being analyzed but relate to components more generally, are important. The International Atomic Energy Agency has compiled the Generic Component Reliability Data Base from data available in the open literature. It is part of the IAEA computer code package for fault/event tree analysis. The Data Base contains 1010 different records covering most of the components used in probabilistic safety analyses of nuclear power plants. The data base input was quality controlled and data sources were noted. The data compilation procedure and the problems associated with using generic data are explained. (UK)

  10. Integrated 3D density modelling and segmentation of the Dead Sea Transform

    Science.gov (United States)

    Götze, H.-J.; El-Kelani, R.; Schmidt, S.; Rybakov, M.; Hassouneh, M.; Förster, H.-J.; Ebbing, J.

    2007-04-01

A 3D interpretation of the newly compiled Bouguer anomaly in the area of the “Dead Sea Rift” is presented. A high-resolution 3D model constrained with seismic results reveals the crustal thickness and density distribution beneath the Arava/Araba Valley (AV), the region between the Dead Sea and the Gulf of Aqaba/Elat. The Bouguer anomalies along the axial portion of the AV, as deduced from the modelling results, are mainly caused by deep-seated sedimentary basins (D > 10 km). An inferred zone of intrusion coincides with the maximum gravity anomaly on the eastern flank of the AV. The intrusion is displaced at different sectors along the NNW-SSE direction. The zone of maximum crustal thinning (depth 30 km) is attained in the western sector at the Mediterranean. The southeastern plateau, on the other hand, shows by far the largest crustal thickness of the region (38-42 km). Linked to the left-lateral movement of approx. 105 km at the boundary between the African and Arabian plates, and constrained with recent seismic data, a small asymmetric topography of the Moho beneath the Dead Sea Transform (DST) was modelled. The thickness and density of the crust suggest that the AV is underlain by continental crust. The deep basins, the relatively large intrusion and the asymmetric topography of the Moho lead to the conclusion that a small-scale asthenospheric upwelling could be responsible for the thinning of the crust and the subsequent creation of the Dead Sea basin during the left-lateral movement. A clear segmentation along the strike of the DST was obtained by curvature analysis: the northern part, in the neighbourhood of the Dead Sea, is characterised by high curvature of the residual gravity field. Flexural rigidity calculations yield very low values of effective elastic lithospheric thickness (te < 5 km). This points to decoupling of the crust in the Dead Sea area. In the central AV, the curvature is less pronounced and te increases to approximately 10 km.
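The Bouguer anomalies compiled in records like this one and the Venezuelan compilation combine observed gravity, normal gravity on the reference ellipsoid, a free-air correction, and a Bouguer slab correction for a standard reduction density such as 2.67 Mg/m3. A minimal sketch of the simple (slab-only, no terrain correction) reduction, using the conventional series form of the 1980 international gravity formula; function names are ours:

```python
import math

def normal_gravity_mgal(lat_deg):
    # GRS80 / 1980 international gravity formula (series form), in mGal
    s = math.sin(math.radians(lat_deg))
    s2 = math.sin(math.radians(2.0 * lat_deg))
    return 978032.7 * (1 + 0.0053024 * s**2 - 0.0000058 * s2**2)

def simple_bouguer_anomaly(g_obs_mgal, lat_deg, h_m, rho=2.67):
    """Simple Bouguer anomaly in mGal for a station at elevation h_m metres.

    rho is the reduction density in Mg/m3 (2.67 is the conventional value).
    """
    free_air = 0.3086 * h_m        # free-air gradient correction, mGal
    slab = 0.04193 * rho * h_m     # infinite Bouguer slab (2*pi*G*rho*h), mGal
    return g_obs_mgal - normal_gravity_mgal(lat_deg) + free_air - slab
```

The complete Bouguer anomaly would additionally apply a terrain correction for topography departing from the infinite slab; regional/residual separation (e.g. removing wavelengths above 200 km, as in the Venezuelan compilation) is then done on the gridded anomaly.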

  11. A program-compiling method of nuclear data on-line fast analysis

    International Nuclear Information System (INIS)

    Li Shangbai

    1990-01-01

This paper discusses how to perform assembly-level floating-point operations by using subroutines of the Applesoft system, and introduces a program-compiling method for fast on-line analysis of nuclear data on an Apple microcomputer.

  12. The compiled catalogue of galaxies in machine-readable form and its statistical investigation

    International Nuclear Information System (INIS)

    Kogoshvili, N.G.

    1982-01-01

The compilation of a machine-readable catalogue of relatively bright galaxies was undertaken at the Abastumani Astrophysical Observatory in order to facilitate the statistical analysis of the large body of observational material on galaxies from the Palomar Sky Survey. In compiling the catalogue of galaxies, the following problems were considered: the collection of existing information for each galaxy; a critical approach to the data, aimed at selecting the most important features of the galaxies; the recording of data in computer-readable form; and the permanent updating of the catalogue. (Auth.)

  13. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    Weston, L.W.; Larson, D.C.

    1993-02-01

    This compilation represents the current needs for nuclear data measurements and evaluations as expressed by interested fission and fusion reactor designers, medical users of nuclear data, nuclear data evaluators, CSEWG members and other interested parties. The requests and justifications are reviewed by the Data Request and Status Subcommittee of CSEWG as well as most of the general CSEWG membership. The basic format and computer programs for the Request List were produced by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory. The NNDC produced the Request List for many years. The Request List is compiled from a computerized data file. Each request has a unique isotope, reaction type, requestor and identifying number. The first two digits of the identifying number are the year in which the request was initiated. Every effort has been made to restrict the notations to those used in common nuclear physics textbooks. Most requests are for individual isotopes as are most ENDF evaluations, however, there are some requests for elemental measurements. Each request gives a priority rating which will be discussed in Section 2, the neutron energy range for which the request is made, the accuracy requested in terms of one standard deviation, and the requested energy resolution in terms of one standard deviation. Also given is the requestor with the comments which were furnished with the request. The addresses and telephone numbers of the requestors are given in Appendix 1. ENDF evaluators who may be contacted concerning evaluations are given in Appendix 2. Experimentalists contemplating making one of the requested measurements are encouraged to contact both the requestor and evaluator who may provide valuable information. This is a working document in that it will change with time. New requests or comments may be submitted to the editors or a regular CSEWG member at any time

  14. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1983-01-01

The purpose of this compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. It is the result of a biennial review in which the Department of Energy (DOE) and contractors, Department of Defense laboratories and contractors, and other interested groups have been asked to review and revise their requests for nuclear data. It was felt that the evaluators of cross section data and the users of these evaluations should be involved in the review of the data requests to make this compilation more useful. This request list is ordered by target nucleus (Isotope) and then reaction type (Quantity). Each request is assigned a unique identifying number. The first two digits of this number give the year the request was initiated. All requests for a given Isotope and Quantity are grouped (or blocked) together. The requests in a block are followed by any status comments. Each request has a unique Isotope, Quantity and Requester. The requester is identified by laboratory, last name, and sponsoring US government agency, e.g., BET, DEI, DNR. All requesters, together with their addresses and phone numbers, are given in appendix B. A list of the evaluators responsible for ENDF/B-V evaluations, with their affiliations, appears in appendix C. All requests must give the energy (or range of energy) for the incident particle when appropriate. The accuracy needed in percent is also given. The error quoted is assumed to be 1-sigma at each measured point in the energy range requested unless a comment specifies otherwise. Sometimes a range of accuracy indicated by two values is given, or some statement is given in the free-text comments. An incident particle energy resolution in percent is sometimes given

  15. Compilation of a global inventory of emissions of nitrous oxide

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

A global inventory with 1°x1° resolution was compiled of emissions of nitrous oxide (N2O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing,

  16. Architectural and compiler techniques for energy reduction in high-performance microprocessors

    Science.gov (United States)

    Bellas, Nikolaos

    1999-11-01

    The microprocessor industry has started viewing power, along with area and performance, as a decisive design factor in today's microprocessors. The increasing cost of packaging and cooling systems poses stringent requirements on the maximum allowable power dissipation. Most of the research in recent years has focused on the circuit, gate, and register-transfer (RT) levels of the design. In this research, we focus on the software running on a microprocessor and we view the program as a power consumer. Our work concentrates on the role of the compiler in the construction of "power-efficient" code, and especially its interaction with the hardware so that unnecessary processor activity is saved. We propose techniques that use extra hardware features and compiler-driven code transformations that specifically target activity reduction in certain parts of the CPU which are known to be large power and energy consumers. Design for low power/energy at this level of abstraction entails larger energy gains than in the lower stages of the design hierarchy in which the design team has already made the most important design commitments. The role of the compiler in generating code which exploits the processor organization is also fundamental in energy minimization. Hence, we propose a hardware/software co-design paradigm, and we show what code transformations are necessary by the compiler so that "wasted" power in a modern microprocessor can be trimmed. More specifically, we propose a technique that uses an additional mini cache located between the instruction cache (I-Cache) and the CPU core; the mini cache buffers instructions that are nested within loops and are continuously fetched from the I-Cache. This mechanism can create very substantial energy savings, since the I-Cache unit is one of the main power consumers in most of today's high-performance microprocessors. Results are reported for the SPEC95 benchmarks in the R-4400 processor which implements the MIPS2 instruction
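The mini-cache idea described above, a small buffer between the I-Cache and the CPU core that captures the instructions of tight loops so that repeated fetches never reach the larger, power-hungry I-Cache, can be sketched as a toy LRU simulation. This is illustrative only; the paper's actual mechanism, replacement policy and sizing may differ:

```python
from collections import OrderedDict

def mini_cache_hits(trace, capacity=16):
    """Count instruction fetches served by a tiny fully-associative LRU buffer.

    trace: sequence of instruction addresses (program counters) as fetched.
    Each hit is a fetch that never has to activate the main I-Cache.
    """
    buf = OrderedDict()
    hits = 0
    for pc in trace:
        if pc in buf:
            hits += 1
            buf.move_to_end(pc)        # refresh LRU position
        else:
            if len(buf) >= capacity:
                buf.popitem(last=False)  # evict least recently used entry
            buf[pc] = True
    return hits

# a tight 8-instruction loop body executed 100 times
trace = [100 + i for i in range(8)] * 100
print(mini_cache_hits(trace), len(trace))  # 792 800
```

Only the first pass through the loop misses; the remaining 99 iterations (792 of 800 fetches) are served from the buffer, which is the source of the energy savings the abstract describes.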

  17. Confidence in leadership among the newly qualified.

    Science.gov (United States)

    Bayliss-Pratt, Lisa; Morley, Mary; Bagley, Liz; Alderson, Steven

    2013-10-23

    The Francis report highlighted the importance of strong leadership from health professionals but it is unclear how prepared those who are newly qualified feel to take on a leadership role. We aimed to assess the confidence of newly qualified health professionals working in the West Midlands in the different competencies of the NHS Leadership Framework. Most respondents felt confident in their abilities to demonstrate personal qualities and work with others, but less so at managing or improving services or setting direction.

  18. An Optimizing Compiler for Petascale I/O on Leadership Class Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Choudhary, Alok [Northwestern Univ., Evanston, IL (United States); Kandemir, Mahmut [Pennsylvania State Univ., State College, PA (United States)

    2015-03-18

In high-performance computing systems, parallel I/O architectures usually have very complex hierarchies, with multiple layers that collectively constitute an I/O stack: high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our project explored automated instrumentation and compiler support for I/O-intensive applications. It made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and towards designing and implementing state-of-the-art compiler/runtime technology targeting I/O-intensive HPC applications on leadership-class machines. This final report summarizes the major achievements of the project and points out promising future directions.

  19. Statistical Compilation of the ICT Sector and Policy Analysis | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  20. Statistical Compilation of the ICT Sector and Policy Analysis | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  1. User's manual for the computer-aided plant transient data compilation

    International Nuclear Information System (INIS)

    Langenbuch, S.; Gill, R.; Lerchl, G.; Schwaiger, R.; Voggenberger, T.

    1984-01-01

    The objective of this project is the compilation of data for nuclear power plants needed for transient analyses. The concept has already been described. This user's manual gives a detailed description of all functions of the dialogue system that supports data acquisition and retrieval. (orig.) [de

  2. The relationship between uranium distribution and some major crustal features in Canada

    International Nuclear Information System (INIS)

    Darnley, A.G.

    1982-01-01

    The availability of reconnaissance scale geochemical maps for large areas of Canada enables spatial associations between major crustal structures and surface uranium content to be identified. Maps of the distribution of uranium for an area greater than 2 million km², compiled from airborne gamma-ray spectrometry data, are supplemented by maps for uranium based on stream and lake sediment and some bore hole sampling. These are examined in relation to gravity, aeromagnetic and geological maps. The radioelement distribution can be related in detail to exposed bedrock and surface geology, but in addition there is evidence of the control of uranium distribution by major structural features which are marked by granitoids containing elevated levels of radioelements; several of these granitoids are associated with large negative Bouguer gravity anomalies. The distribution of such granitoids appears to be related to 'megashears', as in the case of the South Mountain batholith in Nova Scotia, or zones of tension. A belt of uranium enrichment, the Athabasca axis, which is characterized by uraniferous granitoids with negative Bouguer gravity anomalies and associated tension faulting, extends 2500 km northeastward from Edmonton, Alberta to the Melville Peninsula. This structure passes under the Athabasca basin which contains many large uranium deposits. (author)

  3. Just-in-Time Compilation-Inspired Methodology for Parallelization of Compute Intensive Java Code

    Directory of Open Access Journals (Sweden)

    GHULAM MUSTAFA

    2017-01-01

    Compute intensive programs generally consume a significant fraction of their execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated in the front-end of a JIT compiler to parallelize sequential code just before native translation. However, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system.
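
    The DOALL criterion above (no cross-iteration dependences) is what makes the transformation safe. A minimal sketch of the idea, in Python rather than the authors' Java/JIT setting (the names and worker count are illustrative):

```python
# Minimal DOALL sketch: iterations are independent, so the iteration
# space can be split across workers without synchronization.
from concurrent.futures import ThreadPoolExecutor

def body(i):
    # loop body with no cross-iteration dependences (a DOALL loop)
    return i * i

def doall(n, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(body, range(n)))   # map preserves iteration order

print(doall(8))  # → [0, 1, 4, 9, 16, 25, 36, 49]
```

    A JIT-based system would perform this transformation automatically on profiled hotspots; here the loop is parallelized by hand only to show the shape of the rewrite.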

  4. Just-in-time compilation-inspired methodology for parallelization of compute intensive java code

    International Nuclear Information System (INIS)

    Mustafa, G.; Ghani, M.U.

    2017-01-01

    Compute intensive programs generally consume a significant fraction of their execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated in the front-end of a JIT compiler to parallelize sequential code just before native translation. However, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system. (author)

  5. Recent advances in PC-Linux systems for electronic structure computations by optimized compilers and numerical libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Yu, Chin-Hui

    2002-01-01

    One of the most frequently used packages for electronic structure research, GAUSSIAN 98, is compiled on Linux systems with various hardware configurations, including AMD Athlon (with the "Thunderbird" core), AthlonMP, and AthlonXP (with the "Palomino" core) systems as well as Intel Pentium 4 (with the "Willamette" core) machines. The default PGI FORTRAN compiler (pgf77) and the Intel FORTRAN compiler (ifc) are employed with different architectural optimization options to compile GAUSSIAN 98 and test the performance improvement. In addition to the BLAS library included in revision A.11 of this package, the Automatically Tuned Linear Algebra Software (ATLAS) library is linked against the binary executables to improve performance. Various Hartree-Fock, density-functional theory, and MP2 calculations are performed for benchmarking purposes. It is found that the combination of ifc with the ATLAS library gives the best performance for GAUSSIAN 98 on all of these PC-Linux computers, including AMD and Intel CPUs. Even on AMD systems, the Intel FORTRAN compiler invariably produces binaries with better performance than pgf77. The enhancement provided by the ATLAS library is more significant for post-Hartree-Fock calculations. The performance on a single CPU is potentially as good as that on an Alpha 21264A workstation or an SGI supercomputer. The SpecFP2000 floating-point scores show trends similar to the GAUSSIAN 98 results.

  6. Compiling a corpus-based dictionary grammar: an example for ...

    African Journals Online (AJOL)

    In this article it is shown how a corpus-based dictionary grammar may be compiled — that is, a mini-grammar fully based on corpus data and specifically written for use in and integrated with a dictionary. Such an effort is, to the best of our knowledge, a world first. We exemplify our approach for a Northern Sotho ...

  7. Compilation of data on γγ → hadrons

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1986-06-01

    Data on γγ → hadrons extracted from e+e- reactions are compiled. The review includes inclusive cross-sections, structure functions, exclusive cross-sections and resonance widths. Data up to 1st July 1986 are included. All the data in this review can be found and retrieved in the Durham-RAL HEP database, together with a wide range of other reaction data. Users throughout Europe can interactively access the database through CMS on the RAL computer. (author)

  8. Engineering Amorphous Systems, Using Global-to-Local Compilation

    Science.gov (United States)

    Nagpal, Radhika

    Emerging technologies are making it possible to assemble systems that incorporate myriads of information-processing units at almost no cost: smart materials, self-assembling structures, vast sensor networks, pervasive computing. How does one engineer robust and prespecified global behavior from the local interactions of immense numbers of unreliable parts? We discuss organizing principles and programming methodologies that have emerged from Amorphous Computing research and that allow us to compile a specification of global behavior into a robust program for local behavior.

  9. Digital compilation bedrock geologic map of the Mt. Ellen quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-6A Stanley, RS, Walsh, G, Tauvers, PR, DiPietro, JA, and DelloRusso, V, 1995,�Digital compilation bedrock geologic map of the Mt. Ellen...

  10. HOPE: A Python just-in-time compiler for astrophysical computations

    Science.gov (United States)

    Akeret, J.; Gamper, L.; Amara, A.; Refregier, A.

    2015-04-01

    The Python programming language is becoming increasingly popular for scientific applications due to its simplicity, versatility, and the broad range of its libraries. A drawback of this dynamic language, however, is its low runtime performance which limits its applicability for large simulations and for the analysis of large data sets, as is common in astrophysics and cosmology. While various frameworks have been developed to address this limitation, most focus on covering the complete language set, and either force the user to alter the code or are not able to reach the full speed of an optimised native compiled language. In order to combine the ease of Python and the speed of C++, we developed HOPE, a specialised Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimisation on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. We assess the performance of HOPE by performing a series of benchmarks and compare its execution speed with that of plain Python, C++ and the other existing frameworks. We find that HOPE improves the performance compared to plain Python by a factor of 2 to 120, achieves speeds comparable to that of C++, and often exceeds the speed of the existing solutions. We discuss the differences between HOPE and the other frameworks, as well as future extensions of its capabilities. The fully documented HOPE package is available at http://hope.phys.ethz.ch and is published under the GPLv3 license on PyPI and GitHub.
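
    The decorator-driven workflow described above can be sketched with a toy stand-in (the `jit` name and the type-keyed cache here illustrate the pattern only; they are not HOPE's actual implementation, which translates the function to C++):

```python
# Toy sketch of a decorator-based JIT: the first call with a given argument
# type signature "compiles" a specialization; later calls reuse it.
import functools

def jit(func):                       # illustrative; not HOPE's internals
    compiled = {}                    # cache keyed by argument types
    @functools.wraps(func)
    def wrapper(*args):
        key = tuple(type(a) for a in args)
        if key not in compiled:
            # a real JIT would translate 'func' to native code here
            compiled[key] = func
        return compiled[key](*args)
    return wrapper

@jit
def add(a, b):
    return a + b

print(add(1, 2))      # → 3
print(add(1.5, 2.5))  # → 4.0
```

    From the user's point of view, this is the whole interface: annotate the function definition and call it as usual.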

  11. Crustal structure under the central High Atlas Mountains (Morocco) from geological and gravity data

    Science.gov (United States)

    Ayarza, P.; Alvarez-Lobato, F.; Teixell, A.; Arboleya, M. L.; Tesón, E.; Julivert, M.; Charroud, M.

    2005-05-01

    Seismic wide angle and receiver function results together with geological data have been used as constraints to build a gravity-based crustal model of the central High Atlas of Morocco. Integration of a newly acquired set of gravity values with public data allowed us to undertake 2-2.5D gravity modelling along two profiles that cross the entire mountain chain. Modelling suggests moderate crustal thickening and a general state of Airy isostatic undercompensation. Localized thickening appears restricted to the vicinity of a north-dipping crustal-scale thrust fault that offsets the Moho discontinuity and defines a small crustal root which accounts for the minimum Bouguer gravity anomaly values. Gravity modelling indicates that this root has a northeasterly strike, slightly oblique to the ENE general orientation of the High Atlas belt. A consequence of the obliquity between the High Atlas borders and its internal and deep structure is the lack of correlation between Bouguer gravity anomaly values and topography. Active buckling affecting the crust, a highly elevated asthenosphere, or a combination of both are proposed as complementary mechanisms that help maintain the high elevations of the Atlas mountains.
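
    Power-spectrum depth estimates of the kind used in such studies rest on the relation ln P(k) ≈ c − 2zk for a source ensemble at mean depth z. A self-contained sketch with a synthetic, noise-free spectrum (the depth and constants are illustrative, not the paper's data):

```python
# Spector-Grant style depth estimate: the log power spectrum of gravity
# data decays linearly with radial wavenumber k, ln P(k) ~ c - 2*z*k,
# so a straight-line fit recovers the mean source depth z = -slope / 2.
# Synthetic noise-free spectrum; depth and constants are illustrative.

z_true = 40.0                                  # assumed mean depth (km)
ks = [0.01 * (i + 1) for i in range(50)]       # radial wavenumbers (rad/km)
ln_p = [5.0 - 2.0 * z_true * k for k in ks]    # synthetic ln P(k)

# ordinary least-squares slope of ln P(k) versus k
n = len(ks)
kbar = sum(ks) / n
pbar = sum(ln_p) / n
slope = (sum((k - kbar) * (p - pbar) for k, p in zip(ks, ln_p))
         / sum((k - kbar) ** 2 for k in ks))
z_est = -slope / 2.0
print(round(z_est, 1))  # → 40.0
```

    With real spectra the fit is made over a chosen wavenumber band, and different linear segments of the spectrum correspond to different source depths (e.g. Moho versus shallower structure).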

  12. Problems faced by newly diagnosed diabetes mellitus patients at ...

    African Journals Online (AJOL)

    Diabetes mellitus can be a frightening experience for newly diagnosed patients. The aim of this study was to determine and describe the problems faced by newly diagnosed diabetes mellitus patients at primary healthcare facilities at Mopani district, Limpopo Province. A qualitative, descriptive and contextual research ...

  13. ERES: A PC program for nuclear data compilation in EXFOR format

    International Nuclear Information System (INIS)

    Li Shubing; Liang Qichang; Liu Tingin

    1994-01-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)

  14. Compilation of excitation cross sections for He atoms by electron impact

    International Nuclear Information System (INIS)

    Kato, T.; Itikawa, Y.; Sakimoto, K.

    1992-03-01

    Experimental and theoretical data are compiled on the cross section for the excitation of He atoms by electron impact. The available data are compared graphically. The survey of the literature has been made through the end of 1991. (author)

  15. Digital compilation bedrock geologic map of the South Mountain quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-3A Stanley, R.S., DelloRusso, V., Tauvers, P.R., DiPietro, J.A., Taylor, S., and Prahl, C., 1995, Digital compilation bedrock geologic map of...

  16. ERES: A PC program for nuclear data compilation in EXFOR format

    Energy Technology Data Exchange (ETDEWEB)

    Shubing, Li [NanKai University, Tianjin (China); Qichang, Liang; Tingin, Liu [Chinese Nuclear Data Center, Institute of Atomic Energy, Beijing (China)

    1994-02-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)

  17. Newly graduated nurses' use of knowledge sources in clinical decision-making

    DEFF Research Database (Denmark)

    Voldbjerg, Siri Lygum; Grønkjaer, Mette; Wiechula, Rick

    2017-01-01

    AIMS AND OBJECTIVES: To explore which knowledge sources newly graduated nurses' use in clinical decision-making and why and how they are used. BACKGROUND: In spite of an increased educational focus on skills and competencies within evidence based practice newly graduated nurses' ability to use...... approaches to strengthen the knowledgebase used in clinical decision-making. DESIGN AND METHODS: Ethnographic study using participant-observation and individual semi-structured interviews of nine Danish newly graduated nurses in medical and surgical hospital settings. RESULTS: Newly graduates use...... in clinical decision-making. If newly graduates are to be supported in an articulate and reflective use of a variety of sources, they have to be allocated to experienced nurses who model a reflective, articulate and balanced use of knowledge sources. This article is protected by copyright. All rights reserved....

  18. Value of a newly sequenced bacterial genome

    DEFF Research Database (Denmark)

    Barbosa, Eudes; Aburjaile, Flavia F; Ramos, Rommel Tj

    2014-01-01

    and annotation will not be undertaken. It is important to know what is lost when we settle for a draft genome and to determine the "scientific value" of a newly sequenced genome. This review addresses the expected impact of newly sequenced genomes on antibacterial discovery and vaccinology. Also, it discusses...... heightened expectations that NGS would boost antibacterial discovery and vaccine development. Although many possible drug and vaccine targets have been discovered, the success rate of genome-based analysis has remained below expectations. Furthermore, NGS has had consequences for genome quality, resulting...

  19. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley...

  20. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley and...

  1. Regulatory and technical reports, compilation for 1979. Volume 4. Bibliographical report Jan-Dec 79

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzie, L.; Aragon, R.

    1980-07-01

    The compilation lists formal regulatory and technical reports issued in 1979 by the U.S. Nuclear Regulatory Commission (NRC) staff and by NRC contractors. The compilation is divided into three major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The first portion of this sequential section lists staff reports, the second portion lists NRC-sponsored conference proceedings, and the third lists contractor reports. Each report citation in the sequential section contains full bibliographic information

  2. ANDEX. A PC software assisting the nuclear data compilation in EXFOR

    International Nuclear Information System (INIS)

    Osorio, V.

    1991-01-01

    This document describes the use of personal computer software ANDEX which assists the compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request, on a set of two diskettes, free of charge. (author)

  3. Observational constraints on the identification of shallow lunar magmatism : insights from floor-fractured craters

    OpenAIRE

    Jozwiak, Lauren; Head, James; Neumann, G. A.; Wilson, Lionel

    2017-01-01

    Floor-fractured craters are a class of lunar crater hypothesized to form in response to the emplacement of a shallow magmatic intrusion beneath the crater floor. The emplacement of a shallow magmatic body should result in a positive Bouguer anomaly relative to unaltered complex craters, a signal which is observed for the average Bouguer anomaly interior to the crater walls. We observe the Bouguer anomaly of floor-fractured craters on an individual basis using the unfiltered Bouguer gravity so...

  4. The Compilation of a Shona Children's Dictionary: Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Peniah Mabaso

    2011-10-01

    Abstract: This article outlines the challenges encountered by the African Languages Research Institute (ALRI) team members in the compilation of the monolingual Shona Children's Dictionary. The focus is mainly on the problems met in headword selection. Solutions by the team members when dealing with these problems are also presented.

  5. Regulatory and technical reports (Abstract Index Journal). Compilation for third quarter 1985, July-September. Volume 10, No. 3

    International Nuclear Information System (INIS)

    1985-10-01

    This compilation consists of bibliographic data and abstracts for the formal Regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation covers the period from July through September, 1985

  6. Self-diffusion in electrolyte solutions a critical examination of data compiled from the literature

    CERN Document Server

    Mills, R

    1989-01-01

    This compilation - the first of its kind - fills a real gap in the field of electrolyte data. Virtually all self-diffusion data in electrolyte solutions as reported in the literature have been examined and the book contains over 400 tables covering diffusion in binary and ternary aqueous solutions, in mixed solvents, and of non-electrolytes in various solvents.An important feature of the compilation is that all data have been critically examined and their accuracy assessed. Other features are an introductory chapter in which the methods of measurement are reviewed; appendices containing tables

  7. Passage through X-ray protection having the structure of homogeneous fractals

    OpenAIRE

    Churikov Viktor Anatolyevich

    2014-01-01

    In this paper we generalize the Bouguer-Lambert law to the case of a homogeneous fractal. A detailed analysis in terms of the d-output operator yields a generalized Bouguer-Lambert-Beer law, which includes the classical Bouguer-Lambert-Beer law of optics as a particular case.
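
    For reference, the classical (non-fractal) law that is being generalized gives exponential attenuation with path length; a minimal numeric check (the attenuation coefficient and thickness are illustrative values):

```python
# Classical Bouguer-Lambert-Beer attenuation: I(x) = I0 * exp(-mu * x).
# The fractal generalization modifies the derivative operator; the
# ordinary law below is the special (integer-order) case.
import math

def intensity(i0, mu, x):
    """Transmitted intensity after path length x; mu is the attenuation coefficient."""
    return i0 * math.exp(-mu * x)

# thickness that halves the intensity for mu = 0.5 (illustrative values)
half_thickness = math.log(2) / 0.5
print(round(intensity(100.0, 0.5, half_thickness)))  # → 50
```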

  8. Statistical Compilation of the ICT Sector and Policy Analysis | Page 5 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The project is designed to expand the scope of conventional investigation beyond the telecommunications industry to include other vertically integrated components of the ICT sector such as manufacturing and services. ... Statistical Compilation of the ICT Sector and Policy Analysis project: country experiences; Malaysia.

  9. Compilation of the abstracts of nuclear computer codes available at CPD/IPEN

    International Nuclear Information System (INIS)

    Granzotto, A.; Gouveia, A.S. de; Lourencao, E.M.

    1981-06-01

    A compilation of all computer codes available at IPEN in São Paulo is presented. The codes are classified according to the Argonne National Laboratory and Nuclear Energy Agency scheme. (E.G.) [pt

  10. An Optimizing Compiler for Petascale I/O on Leadership-Class Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Kandemir, Mahmut Taylan [PSU; Choudary, Alok [Northwestern; Thakur, Rajeev [ANL

    2014-03-01

    In high-performance computing (HPC), parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our DOE project explored automated instrumentation and compiler support for I/O intensive applications. The project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and towards designing and implementing state-of-the-art compiler/runtime system technology for I/O intensive HPC applications that target leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions. Two new sections in this report, compared to the previous report, are IOGenie and SSD/NVM-specific optimizations.

  11. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Ukai, K.; Nakamura, T.

    1984-09-01

    An updated data compilation of single pion photoproduction experiments below 2 GeV is presented. This data bank includes not only the data of single pion photoproduction processes but also those of proton Compton scattering (γp → γp) and of the inverse process of γn → π-p (π-p → γn). The total numbers of data points are 6240 for γp → π+n, 5715 for γp → π0p, 2835 for γn → π-p, 177 for γn → π0n, 669 for γp → γp, and 112 for π-p → γn. The compiled data are stored in the central computer (FACOM M-380R) of the Institute of Nuclear Study, University of Tokyo, for direct use of this data bank, and on magnetic tapes with the standard label for other laboratories. The FACOM computer is compatible with IBM 370 series or IBM 303X or 308X series machines. The data on the magnetic tapes are available on request. (Kato, T.)

  12. Programming time-multiplexed reconfigurable hardware using a scalable neuromorphic compiler.

    Science.gov (United States)

    Minkovich, Kirill; Srinivasa, Narayan; Cruz-Albrecht, Jose M; Cho, Youngkwan; Nogin, Aleksey

    2012-06-01

    Scalability and connectivity are two key challenges in designing neuromorphic hardware that can match biological levels. In this paper, we describe a neuromorphic system architecture design that addresses an approach to meet these challenges using traditional complementary metal-oxide-semiconductor (CMOS) hardware. A key requirement in realizing such neural architectures in hardware is the ability to automatically configure the hardware to emulate any neural architecture or model. The focus for this paper is to describe the details of such a programmable front-end. This programmable front-end is composed of a neuromorphic compiler and a digital memory, and is designed based on the concept of synaptic time-multiplexing (STM). The neuromorphic compiler automatically translates any given neural architecture to hardware switch states and these states are stored in digital memory to enable desired neural architectures. STM enables our proposed architecture to address scalability and connectivity using traditional CMOS hardware. We describe the details of the proposed design and the programmable front-end, and provide examples to illustrate its capabilities. We also provide perspectives for future extensions and potential applications.
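
    The time-multiplexing idea at the heart of STM can be shown with a toy scheduler (the sizes and the round-robin mapping are illustrative, not the paper's hardware):

```python
# Toy synaptic time-multiplexing: N logical synapses share P physical
# connections by reusing each connection across successive time slots.

def schedule(num_synapses, num_physical):
    """Map each logical synapse to a (time_slot, physical_wire) pair."""
    return {s: (s // num_physical, s % num_physical)
            for s in range(num_synapses)}

plan = schedule(10, 4)       # 10 logical synapses on 4 physical wires
print(plan[9])               # → (2, 1): slot 2, wire 1
assert len(set(plan.values())) == 10   # no two synapses share a slot+wire
```

    This is the sense in which connectivity can scale beyond the physical wiring: the compiler's job is to compute such switch-state schedules for an arbitrary target neural architecture.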

  13. Numerical performance and throughput benchmark for electronic structure calculations in PC-Linux systems with new architectures, updated compilers, and libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Hwang, Jenn-Kang; Tang, Chuan Yi; Yu, Chin-Hui

    2004-01-01

    A number of recently released numerical libraries, including the Automatically Tuned Linear Algebra Software (ATLAS) library, the Intel Math Kernel Library (MKL), the GOTO numerical library, and the AMD Core Math Library (ACML) for AMD Opteron processors, are linked against the executables of the Gaussian 98 electronic structure calculation package, which is compiled by updated versions of Fortran compilers such as the Intel Fortran compiler (ifc/efc) 7.1 and the PGI Fortran compiler (pgf77/pgf90) 5.0. ifc 7.1 delivers about a 3% improvement on 32-bit machines compared to the former version 6.0. The improvement from pgf77 3.3 to 5.0 is also around 3% when utilizing the original, unmodified optimization options enclosed in the software. Nevertheless, if extensive compiler tuning options are used, the speed can be further accelerated by about 25%. The performances of these fully optimized numerical libraries are similar. The double-precision floating-point (FP) instruction set (SSE2) is also functional on AMD Opteron processors operated in 32-bit compilation, and the Intel Fortran compiler performs better optimization. Hardware-level tuning is able to improve memory bandwidth by adjusting the DRAM timing, and the efficiency in the CL2 mode is further improved by 2.6% compared to that of the CL2.5 mode. FP throughput is measured by simultaneous execution of two identical copies of each of the test jobs. The resultant performance impact suggests that the IA64 and AMD64 architectures are able to deliver significantly higher throughput than IA32, which is consistent with the SpecFPrate2000 benchmarks.
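
    The throughput methodology described above (two identical copies run simultaneously) can be sketched as follows; the kernel and sizes are stand-in assumptions, not the Gaussian 98 workloads:

```python
# Throughput-style measurement: time one copy of an FP kernel, then two
# concurrent copies; a pair time close to the single time indicates the
# machine sustains ~2x throughput. Kernel and sizes are illustrative.
import time
from concurrent.futures import ThreadPoolExecutor

def kernel(n=150):
    # naive dense matrix-vector product as a stand-in FP workload
    m = [[((i + j) % 7) * 0.5 for j in range(n)] for i in range(n)]
    v = [1.0] * n
    return sum(row[j] * v[j] for row in m for j in range(n))

t0 = time.perf_counter()
single = kernel()
t_single = time.perf_counter() - t0

t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:
    pair = list(pool.map(lambda _: kernel(), range(2)))
t_pair = time.perf_counter() - t0

print(pair[0] == pair[1] == single)  # → True (deterministic kernel)
```

    Note that in CPython the GIL serializes pure-Python FP work, so a real harness would launch separate processes (as the benchmark runs separate jobs); threads are used here only to keep the sketch self-contained.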

  14. Newly Generated Liquid Waste Processing Alternatives Study, Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Landman, William Henry; Bates, Steven Odum; Bonnema, Bruce Edward; Palmer, Stanley Leland; Podgorney, Anna Kristine; Walsh, Stephanie

    2002-09-01

    This report identifies and evaluates three options for treating newly generated liquid waste at the Idaho Nuclear Technology and Engineering Center of the Idaho National Engineering and Environmental Laboratory. The three options are: (a) treat the waste using processing facilities designed for treating sodium-bearing waste, (b) treat the waste using subcontractor-supplied mobile systems, or (c) treat the waste using a special facility designed and constructed for that purpose. In studying these options, engineers concluded that the best approach is to store the newly generated liquid waste until a sodium-bearing waste treatment facility is available and then to co-process the stored inventory of the newly generated waste with the sodium-bearing waste. After the sodium-bearing waste facility completes its mission, two paths are available. The newly generated liquid waste could be treated using the subcontractor-supplied system or the sodium-bearing waste facility or a portion of it. The final decision depends on the design of the sodium-bearing waste treatment facility, which will be completed in coming years.

  15. JLAPACK – Compiling LAPACK FORTRAN to Java

    Directory of Open Access Journals (Sweden)

    David M. Doolin

    1999-01-01

    The JLAPACK project provides the LAPACK numerical subroutines translated from their subset Fortran 77 source into class files, executable by the Java Virtual Machine (JVM) and suitable for use by Java programmers. This makes it possible for Java applications or applets, distributed on the World Wide Web (WWW), to use established legacy numerical code that was originally written in Fortran. The translation is accomplished using a special purpose Fortran-to-Java (source-to-source) compiler. The LAPACK API will be considerably simplified to take advantage of Java's object-oriented design. This report describes the research issues involved in the JLAPACK project, and its current implementation and status.

  16. Statistical Compilation of the ICT Sector and Policy Analysis | Page 2 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... to widen and deepen, so too does its impact on economic development. ... The outcomes of such efforts will subsequently inform policy discourse and ... Studies. Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia ... Asian outlook: New growth dependent on new productivity.

  17. Regulatory and technical reports (abstract index journal). Volume 20, No. 2: Compilation for second quarter April--June 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually.

  18. Regulatory and technical reports (abstract index journal). Volume 20, No. 2: Compilation for second quarter April--June 1995

    International Nuclear Information System (INIS)

    1995-09-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  19. Herbal hepatotoxicity: a tabular compilation of reported cases.

    Science.gov (United States)

    Teschke, Rolf; Wolff, Albrecht; Frenzel, Christian; Schulze, Johannes; Eickhoff, Axel

    2012-11-01

    Herbal hepatotoxicity is a field that has grown rapidly over the last few years along with the increased use of herbal products worldwide. To summarize the various facets of this disease, we undertook a literature search for herbs, herbal drugs and herbal supplements with reported cases of herbal hepatotoxicity. A selective literature search was performed to identify published case reports, spontaneous case reports, case series and review articles regarding herbal hepatotoxicity. A total of 185 publications were identified and the results compiled. They show 60 different herbs, herbal drugs and herbal supplements with reported potential hepatotoxicity; additional information, including synonyms of individual herbs, botanical names and cross references, is provided. If known, details are presented for specific ingredients and chemicals in herbal products, and for references whose authors can be matched to each herbal product and its effect on the liver. Based on stringent causality assessment methods and/or positive re-exposure tests, causality was highly probable or probable for Ayurvedic herbs, Chaparral, Chinese herbal mixture, Germander, Greater Celandine, green tea, a few Herbalife products, Jin Bu Huan, Kava, Ma Huang, Mistletoe, Senna, Syo Saiko To and Venencapsan(®). In many other publications, however, causality was not properly evaluated by a liver-specific causality assessment method validated for hepatotoxicity, such as the scale of CIOMS (Council for International Organizations of Medical Sciences). This compilation presents details of herbal hepatotoxicity, thereby assisting involved physicians in future clinical assessment. © 2012 John Wiley & Sons A/S.

  20. Guide to Good Practice in using Open Source Compilers with the AGCC Lexical Analyzer

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Full Text Available Quality software always demands a compromise between users' needs and hardware resources. Being faster requires expensive devices such as powerful processors and virtually unlimited amounts of RAM, or reengineering the code to adapt the software to the client's hardware architecture. This is the purpose of optimizing code: to get the utmost software performance from a program under given conditions. There are tools for designing and writing code, but the ultimate tool for optimization remains the modest compiler, an often neglected software jewel that is the result of hundreds of working hours by the best specialists in the world. Even so, only two compilers fulfill the needs of professional developers: a proprietary solution from a giant of the IT industry, and the open-source GNU compiler, for which we developed the AGCC lexical analyzer that helps produce even more efficient software applications. It relies on the most popular hacks and tricks used by professionals and discovered by the author, who is proud to present them below.

  1. Computer and compiler effects on code results: status report

    International Nuclear Information System (INIS)

    1996-01-01

    Within the framework of the international effort on the assessment of computer codes, which are designed to describe the overall reactor coolant system (RCS) thermalhydraulic response, core damage progression, and fission product release and transport during severe accidents, there has been a continuous debate as to whether the code results are influenced by different code users or by different computers or compilers. The first aspect, the 'Code User Effect', has been investigated already. In this paper the other aspects are discussed, and proposals are given on how to make large system codes insensitive to different computers and compilers. Hardware errors and memory problems are not considered in this report. The codes investigated herein are integrated code systems (e. g. ESTER, MELCOR) and thermalhydraulic system codes with extensions for severe accident simulation (e. g. SCDAP/RELAP, ICARE/CATHARE, ATHLET-CD), and codes to simulate fission product transport (e. g. TRAPMELT, SOPHAEROS). Since all of these codes are programmed in Fortran 77, the discussion herein is based on this programming language, although some remarks are made about Fortran 90. Some observations of differing code results obtained on different computers are reported, and possible reasons for this unexpected behaviour are listed. Methods to avoid portability problems are then discussed

  2. Nuclear data evaluation and group constant generation for reactor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Do; Lee, Jong Tae; Min, Byung Joo; Gil, Choong Sup [Korea Atomic Energy Research Inst., Daeduk (Korea, Republic of)

    1991-01-01

    In nuclear or shielding design analysis for reactors and other facilities, nuclear data are of primary importance. A research project on nuclear data evaluation and its effective application has been performed continuously. The objectives of this project are (1) to compile the latest evaluated nuclear data files, (2) to establish their processing code systems, and (3) to evaluate the multi-group constant library using the newly compiled data files and the code systems. As the results of this project, the ENDF/B-VI Supplementary File including important nuclides, JENDL-3.1 and JEF-1 were compiled, and the ENDF-6 international computer file format for evaluated nuclear data and its processing system NJOY89.31 were tested with ENDF/B-VI data. In order to test the applicability of the newly released data to thermal reactor problems, a number of benchmark calculations were performed and the results analyzed. Since preliminary benchmark testing on thermal reactor problems has been completed, the newly compiled data are expected to be put to good use in developing advanced reactors. (Author).

  3. Nuclear data evaluation and group constant generation for reactor analysis

    International Nuclear Information System (INIS)

    Kim, Jung Do; Lee, Jong Tae; Min, Byung Joo; Gil, Choong Sup

    1991-01-01

    In nuclear or shielding design analysis for reactors and other facilities, nuclear data are of primary importance. A research project on nuclear data evaluation and its effective application has been performed continuously. The objectives of this project are (1) to compile the latest evaluated nuclear data files, (2) to establish their processing code systems, and (3) to evaluate the multi-group constant library using the newly compiled data files and the code systems. As the results of this project, the ENDF/B-VI Supplementary File including important nuclides, JENDL-3.1 and JEF-1 were compiled, and the ENDF-6 international computer file format for evaluated nuclear data and its processing system NJOY89.31 were tested with ENDF/B-VI data. In order to test the applicability of the newly released data to thermal reactor problems, a number of benchmark calculations were performed and the results analyzed. Since preliminary benchmark testing on thermal reactor problems has been completed, the newly compiled data are expected to be put to good use in developing advanced reactors. (Author)

  4. Observation of the bone mineral density of newly formed bone using rabbits. Compared with newly formed bone around implants and cortical bone

    International Nuclear Information System (INIS)

    Nakada, Hiroshi; Numata, Yasuko; Sakae, Toshiro; Tamaki, Hiroyuki; Kato, Takao

    2009-01-01

    There have been many studies reporting that newly formed bone around implants is spongy bone. However, although the morphology is reported as being like spongy bone, it is difficult to discriminate whether the bone quality of newly formed bone appears similar to osteoid or cortical bone; therefore, evaluation of bone quality is required. The aims of this study were to measure the bone mineral density (BMD) values of newly formed bone around implants after 4, 8, 16, 24 and 48 weeks, to represent these values on three-dimensional color mapping (3Dmap), and to evaluate the change in bone quality associated with newly formed bone around implants. The animal experimental protocol of this study was approved by the Ethics Committee for Animal Experiments of our University. This experiment used 20 grit-blasted, surface-treated implants (Ti-6Al-4V alloy: 3.1 mm in diameter and 30.0 mm in length). They were embedded into surgically created flaws in the femurs of 20 New Zealand white rabbits (16 weeks old, male). At each period, the rabbits were sacrificed under general anesthesia with an intravenous overdose of pentobarbital sodium via the ear vein, and the femurs were resected. We measured the BMD of newly formed bone around implants and of cortical bone using Micro-CT, and produced BMD distribution maps with 3Dmap (TRI/3D Bon BMD, Ratoc System Engineering). The BMD of cortical bone was 1,026.3±44.3 mg/cm³ at 4 weeks, 1,023.8±40.9 mg/cm³ at 8 weeks, 1,048.2±45.6 mg/cm³ at 16 weeks, 1,067.2±60.2 mg/cm³ at 24 weeks, and 1,069.3±50.7 mg/cm³ at 48 weeks after implantation, showing a non-significant increase over each period. The BMD of newly formed bone around implants was 296.8±25.6 mg/cm³ at 4 weeks, 525.0±72.4 mg/cm³ at 8 weeks, 691.2±26.0 mg/cm³ at 16 weeks, 776.9±27.7 mg/cm³ at 24 weeks, and 845.2±23.1 mg/cm³ at 48 weeks after implantation, showing a significant increase after each period. It was revealed that the color scale of newly formed bone was Low level at 4 weeks, and then it

  5. A Compilation of Global Bio-Optical in Situ Data for Ocean-Colour Satellite Applications

    Science.gov (United States)

    Valente, Andre; Sathyendranath, Shubha; Brotus, Vanda; Groom, Steve; Grant, Michael; Taberner, Malcolm; Antoine, David; Arnone, Robert; Balch, William M.; Barker, Kathryn

    2016-01-01

    A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GePCO), span between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data came from multi-project archives acquired via open internet services, and from individual projects acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making available the metadata, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).
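The merging step of "averaging of observations that were close in time and space" can be sketched as a simple binning pass. The bin widths below are illustrative placeholders, not the thresholds actually used in the OC-CCI compilation:

```python
from collections import defaultdict
from statistics import mean

def merge_close_observations(obs, space_deg=0.1, time_days=1.0):
    """Average measurements that fall into the same latitude/longitude/time
    bin. obs is a list of (lat, lon, time, value) tuples; the bin widths
    are hypothetical, chosen only to illustrate the merging step."""
    bins = defaultdict(list)
    for lat, lon, t, value in obs:
        key = (round(lat / space_deg), round(lon / space_deg), round(t / time_days))
        bins[key].append(value)
    # one averaged value per occupied space-time bin
    return [mean(values) for values in bins.values()]
```

Two chlorophyll readings taken within the same ~0.1° cell on the same day collapse to their mean, while a distant reading is kept separate.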

  6. Compiling Planning into Quantum Optimization Problems: A Comparative Study

    Science.gov (United States)

    2015-06-07

    to SAT, and then reduces higher-order terms to quadratic terms through a series of gadgets. Our mappings allow both positive and negative preconditions...to its being specific to this type of problem) and likely benefits from a homogeneous parameter setting (Venturelli et al. 2014), as it generates a...Guzik, A. 2013. Resource efficient gadgets for compiling adiabatic quantum optimization problems. Annalen der Physik 525(10-11):877–888. Blum, A

  7. Towards droplet size-aware biochemical application compilation for AM-EWOD biochips

    DEFF Research Database (Denmark)

    Pop, Paul; Alistar, Mirela

    2015-01-01

    a droplet size-aware compilation by proposing a routing algorithm that considers the droplet size. Our routing algorithm is developed for a novel digital microfluidic biochip architecture based on Active Matrix Electrowetting on Dielectric, which uses a thin film transistor array for the electrodes. We also...

  8. 27 CFR 478.24 - Compilation of State laws and published ordinances.

    Science.gov (United States)

    2010-04-01

    ... published ordinances. (a) The Director shall annually revise and furnish Federal firearms licensees with a... Director annually revises the compilation and publishes it as “State Laws and Published Ordinances—Firearms... and published ordinances. 478.24 Section 478.24 Alcohol, Tobacco Products, and Firearms BUREAU OF...

  9. Statistical Compilation of the ICT Sector and Policy Analysis | Page 4 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  10. Statistical Compilation of the ICT Sector and Policy Analysis | Page 3 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  11. The practical skills of newly qualified nurses.

    Science.gov (United States)

    Danbjørg, Dorthe Boe; Birkelund, Regner

    2011-02-01

    This paper reports the findings from a study of newly qualified nurses: which subjects the nurses regarded as most important for meeting the requirements of clinical practice, and how they experienced their potential for developing practical and moral skills after the decrease in practical training. A qualitative approach guided the research process and the analysis of the data. The data were collected by participant observation and qualitative interviews with four nurses as informants. The conclusions made in this study are based on the statements and observations of the newly qualified nurses. Our findings are discussed in relation to the Aristotelian concept and other relevant literature. The main message is that the newly qualified nurses did not feel equipped when they finished their training, which could be interpreted as a direct consequence of the decrease in practical training. Our study also underlines that the way nursing theory is perceived and taught is problematic. The interviews revealed that the nurses think nursing theories should be applied directly in practice; this misunderstanding probably also extends to the teachers of the theories. Copyright © 2010 Elsevier Ltd. All rights reserved.

  12. Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG96-03: Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont: VGS Open-File Report VG96-3A, 2 plates, scale...

  13. Newly graduated nurses' use of knowledge sources: a meta-ethnography.

    Science.gov (United States)

    Voldbjerg, Siri Lygum; Grønkjaer, Mette; Sørensen, Erik Elgaard; Hall, Elisabeth O C

    2016-08-01

    To advance evidence on newly graduated nurses' use of knowledge sources. Clinical decisions need to be evidence-based, and understanding the knowledge sources that newly graduated nurses use will inform both education and practice. Qualitative studies on newly graduated nurses' use of knowledge sources are increasing, though generated from scattered healthcare contexts. Therefore, a metasynthesis of qualitative research on what knowledge sources new graduates use in decision-making was conducted. Meta-ethnography. Nineteen reports, representing 17 studies, published from 2000-2014 were identified from iterative searches in relevant databases from May 2013-May 2014. Included reports were appraised for quality, and Noblit and Hare's meta-ethnography guided the interpretation and synthesis of data. Newly graduated nurses' use of knowledge sources during their first two years post-graduation was interpreted under the main theme 'self and others as knowledge sources,' with two subthemes, 'doing and following' and 'knowing and doing,' each with several elucidating categories. The metasynthesis revealed a line of argument among the report findings underscoring progression in knowledge use and in the perception of competence and confidence among newly graduated nurses. The transition phase, feeling of confidence, and ability to use critical thinking and reflection have a great impact on the knowledge sources incorporated in clinical decisions. The synthesis accentuates that, for newly graduated nurses' qualifications and skills in evidence-based practice to be used, clinical practice needs to provide a supportive environment which nurtures critical thinking and both questions and articulates the use of multiple knowledge sources. © 2016 John Wiley & Sons Ltd.

  14. DJ Prinsloo and BP Sathekge (compilers — revised edition).

    African Journals Online (AJOL)

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the ...

  15. A compilation of Sr and Nd isotope data on Mexico

    International Nuclear Information System (INIS)

    Verma, S.P.; Verma, M.P.

    1986-01-01

    A compilation is given of the available Sr and Nd isotope data on Mexican volcanic-plutonic terranes which cover about one-third of Mexico's territory. The available data are arranged according to a subdivision of the Mexican territory in terms of geological provinces. Furthermore, site and province averages and standard deviations are calculated and their petrogenetic implications are pointed out. (author)

  16. Technique to increase performance of C-program for control systems. Compiler technique for low-cost CPU; Seigyoyo C gengo program no kosokuka gijutsu. Tei cost CPU no tame no gengo compiler gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Y [Mazda Motor Corp., Hiroshima (Japan)

    1997-10-01

    The software of automotive control systems has become increasingly large and complex. High-level languages (primarily C) and their compilers have become more important for reducing coding time. Most compilers represent real numbers in the floating-point format specified by IEEE standard 754. Due to cost requirements, most microprocessors in the automotive industry have no hardware support for IEEE-standard operations, resulting in slow execution speed and large code size. Alternative formats that increase execution speed and reduce code size are proposed. Experimental results for the alternative formats show the improvement in execution speed and code size. 4 refs., 3 figs., 2 tabs.
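The abstract does not spell out which alternative formats were evaluated; a common substitute for IEEE 754 on FPU-less microcontrollers is fixed-point arithmetic, where a real number is stored as a scaled integer so that multiplication reduces to an integer multiply and a shift. A minimal Q16.16 sketch (an assumed format, used here only to illustrate the trade-off):

```python
Q = 16  # fractional bits: Q16.16 fixed-point

def to_fixed(x: float) -> int:
    """Encode a real number as a scaled integer."""
    return int(round(x * (1 << Q)))

def to_float(a: int) -> float:
    """Decode a fixed-point value back to a float (for inspection)."""
    return a / (1 << Q)

def fixed_mul(a: int, b: int) -> int:
    """Fixed-point multiply: the raw product carries 2*Q fractional
    bits, so shift right by Q to restore the Q16.16 scaling."""
    return (a * b) >> Q
```

On a low-cost CPU, `fixed_mul` is one integer multiply and one shift, replacing a much slower software floating-point routine, at the cost of reduced range and precision.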

  17. The impact of organisational culture on the adaptation of newly ...

    African Journals Online (AJOL)

    Usually newly employed nurses find adjusting to a work setting a challenging experience. Their successful adaptation to their work situation is greatly influenced by the socialisation process inherent in the organisational culture. The newly employed nurse often finds that the norms are unclear, confusing and restrictive.

  18. Northern hemisphere mid-latitude geomagnetic anomaly revealed from Levantine Archaeomagnetic Compilation (LAC).

    Science.gov (United States)

    Shaar, R.; Tauxe, L.; Agnon, A.; Ben-Yosef, E.; Hassul, E.

    2015-12-01

    The rich archaeological heritage of Israel and nearby Levantine countries provides a unique opportunity for archaeomagnetic investigation in high resolution. Here we present a summary of our ongoing effort to reconstruct geomagnetic variations of the past several millennia in the Levant at decadal to millennial resolution. This effort in the Southern Levant, namely the "Levantine Archaeomagnetic Compilation" (LAC), presently consists of data from over 650 well-dated archaeological objects, including pottery, slag, ovens, and furnaces. In this talk we review the methodological challenges in achieving a robust master secular variation curve with realistic error estimations from a large number of different datasets. We present the current status of the compilation, including the southern and western Levant LAC data (Israel, Cyprus, and Jordan) and other published north-eastern Levant data (Syria and southern Turkey), and outline the main findings emerging from these data. The main feature apparent from the new compilation is an extraordinary intensity high that developed over the Levant region during the first two millennia BCE. The climax of this event is a double-peaked intensity maximum starting at ca. 1000 BCE and ending at ca. 735 BCE, accompanied by at least two geomagnetic spike events. Paleomagnetic directions from this period show anomalies of up to 20 degrees from the averaged GAD field. This leads us to postulate that the intensity maximum is a manifestation of an intense mid-latitude local positive geomagnetic anomaly that persisted for over two centuries.

  19. Supplementary material on passive solar heating concepts. A compilation of published articles

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-05-01

    A compilation of published articles and reports dealing with passive solar energy concepts for heating and cooling buildings is presented. The following are included: fundamentals of passive systems, applications and technical analysis, graphic tools, and information sources. (MHR)

  20. NEA contributions to the worldwide collection, compilation and dissemination of nuclear reaction data

    International Nuclear Information System (INIS)

    Dupont, E.

    2012-01-01

    The NEA Data Bank is an international centre of reference for basic nuclear tools used in the analysis and prediction of phenomena in different nuclear applications. The Data Bank collects and compiles computer codes and scientific data and contributes to their improvement for the benefit of scientists in its member countries. In line with this mission, the Data Bank is a core centre of the International Network of Nuclear Reaction Data Centres (NRDC), which co-ordinates the worldwide collection, compilation and dissemination of nuclear reaction data. The NRDC network was established in 1976 from the earlier Four-Centres' Network created in 1966 by the United States, the NEA, the International Atomic Energy Agency (IAEA) and the former Soviet Union. Today, the NRDC is a worldwide co-operation network under the auspices of the IAEA, with 14 nuclear data centres from 8 countries and 2 international organisations belonging to the network. The main objective of the NRDC is to preserve, update and disseminate experimental nuclear reaction data that have been compiled for more than 40 years in a shared database (EXFOR). The EXFOR database contains basic nuclear data on low- to medium-energy experiments for incident neutron, photon and various charged-particle-induced reactions on a wide range of isotopes, natural elements and compounds. Today, with more than 140 000 data sets from approximately 20 000 experiments, EXFOR is by far the most important and complete experimental nuclear reaction database in the world and is widely used in the field of nuclear science and technology. The Data Bank is responsible for the collection and compilation of nuclear reaction data measured in its geographical area. Since 1966, the Data Bank has contributed around 5 000 experiments to the EXFOR database, and it continues to compile new data while maintaining the highest level of quality throughout the database. NRDC co-ordination meetings are held on a biennial basis. Recent meetings

  1. Assessment for markers of nephropathy in newly diagnosed type 2 ...

    African Journals Online (AJOL)

    Objective: To assess for markers of nephropathy in newly diagnosed type 2 diabetics, using blood pressure levels, endogenous creatinine clearance and urinary protein excretion as markers of renal disease. Study design: Ninety newly diagnosed type 2 diabetics were studied within 6 weeks of diagnosis. They were in ...

  2. Interpretation, compilation and field verification procedures in the CARETS project

    Science.gov (United States)

    Alexander, Robert H.; De Forth, Peter W.; Fitzpatrick, Katherine A.; Lins, Harry F.; McGinty, Herbert K.

    1975-01-01

    The production of the CARETS map data base involved the development of a series of procedures for interpreting, compiling, and verifying data obtained from remote sensor sources. Level II land use mapping from high-altitude aircraft photography at a scale of 1:100,000 required production of a photomosaic mapping base for each of the 48, 50 x 50 km sheets, and the interpretation and coding of land use polygons on drafting film overlays. CARETS researchers also produced a series of 1970 to 1972 land use change overlays, using the 1970 land use maps and 1972 high-altitude aircraft photography. To enhance the value of the land use sheets, researchers compiled a series of overlays showing cultural features, county boundaries and census tracts, surface geology, and drainage basins. In producing Level I land use maps from Landsat imagery, at a scale of 1:250,000, interpreters overlaid drafting film directly on Landsat color composite transparencies and interpreted on the film. They found that such interpretation involves pattern and spectral signature recognition. In studies using Landsat imagery, interpreters identified numerous areas of change but also identified extensive areas of "false change," where Landsat spectral signatures but not land use had changed.

  3. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Inagaki, Y.; Nakamura, T.; Ukai, K.

    1976-01-01

    A compilation of data from single-pion photoproduction experiments below 2 GeV is presented, with keywords that specify each experiment. The data are written on a magnetic tape; the data format and the indices for the keywords are given, and various programs for using the tape are also presented. The compilation contains two types of records: the reference card, which gives the information about the experiment, and the data card. Both reference and data cards are written entirely in A-type format on the original tape. Copy tapes, written in various formats, are available on request. There are two kinds of copy tape: one is identical to the original tape, and the other differs in the data card, which is written in F-type format according to the data type. Each experiment on this tape is represented by three kinds of cards: one reference card in A-type format, many data cards in F-type format, and one identifying card. Various programs written in FORTRAN are available for the original and copy tapes. (Kato, T.)
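Reading the F-type data cards amounts to fixed-width field parsing, as a Fortran formatted READ would do. The 10-column field width below is a hypothetical example, not the compilation's documented layout (which is defined by its own format tables):

```python
def parse_f_fields(record: str, width: int = 10):
    """Split one card image into floats, emulating a Fortran
    (nF10.4)-style formatted read. The 10-column width is an assumed
    example, not the compilation's actual card layout."""
    trimmed = record.rstrip()
    # slice the card into fixed-width columns and convert each to a float
    return [float(trimmed[i:i + width]) for i in range(0, len(trimmed), width)]
```

A card image such as `"    1.2340    0.5000"` parses into the two values 1.234 and 0.5.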

  4. Basic circuit compilation techniques for an ion-trap quantum machine

    International Nuclear Information System (INIS)

    Maslov, Dmitri

    2017-01-01

    We study the problem of compilation of quantum algorithms into optimized physical-level circuits executable in a quantum information processing (QIP) experiment based on trapped atomic ions. We report a complete strategy: starting with an algorithm in the form of a quantum computer program, we compile it into a high-level logical circuit that goes through multiple stages of decomposition into progressively lower-level circuits until we reach the physical execution-level specification. We skip the fault-tolerance layer, as it is not within the scope of this work. The different stages are structured so as to best assist with the overall optimization while taking into account numerous optimization criteria, including minimizing the number of expensive two-qubit gates, minimizing the number of less expensive single-qubit gates, optimizing the runtime, minimizing the overall circuit error, and optimizing classical control sequences. Our approach allows a trade-off between circuit runtime and quantum error, as well as to accommodate future changes in the optimization criteria that may likely arise as a result of the anticipated improvements in the physical-level control of the experiment. (paper)
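One of the optimization criteria named above, minimizing the number of expensive two-qubit gates, can be illustrated with the simplest kind of peephole pass: cancelling adjacent self-inverse gate pairs. This sketch is not the authors' compiler; the tuple gate representation and gate set are assumptions made only for illustration:

```python
SELF_INVERSE = {"H", "X", "CNOT"}  # gates equal to their own inverse

def cancel_adjacent_inverses(circuit):
    """Remove adjacent identical self-inverse gates: two equal CNOTs
    (or H, X) in a row multiply to the identity and can be dropped.
    Cancellations cascade, since popping may expose a new equal pair."""
    out = []
    for gate in circuit:
        if out and out[-1] == gate and gate[0] in SELF_INVERSE:
            out.pop()  # the adjacent pair cancels
        else:
            out.append(gate)
    return out

def two_qubit_gate_count(circuit):
    """Cost proxy: number of gates acting on two qubits."""
    return sum(1 for gate in circuit if len(gate) == 3)
```

Running the pass on `[("H", 0), ("CNOT", 0, 1), ("CNOT", 0, 1), ("X", 2)]` drops the redundant CNOT pair, reducing the two-qubit gate count from 2 to 0.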

  5. Compilation of monographs on α-, β-, γ- and X-ray spectrometry

    International Nuclear Information System (INIS)

    Debertin, K.

    1977-11-01

    The working group 'α-, β-, γ-Ray Spectrometry' of the International Committee for Radionuclide Metrology (ICRM) compiled about 35 monographs on α-, β-, γ- and X-ray spectrometry which were published in the years 1970 to 1976. Support was obtained by the Zentralstelle fuer Atomkernenergie-Dokumentation (ZAED) in Karlsruhe. (orig.) [de

  6. Report of the Panel on Neutron Data Compilation. Brookhaven National Laboratory, USA, 10-14 February 1969

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1969-05-15

    After surveying current world needs for bibliographic and compilation activities in the field of neutron data, the report of this Panel of 31 individual technical experts considers the immediate and future role of the world's neutron data centres in this task. In Chapter V the Panel's findings are summarized in the form of recommendations directed to the centres and their associated national and international advisory committees together with all users of the centres. The Panel's recommendations can be summarised as follows: a) The need for bibliographic indexing and numerical compilation of neutron data on an international basis has been clearly demonstrated and should continue for the foreseeable future; b) The operation of CINDA has been extremely satisfactory; c) Neutron data should be compiled at all energies by all centres subject to any mutually agreed exceptions and priorities; d) A fine-meshed classification scheme for neutron reactions should be formulated and put into use before the end of 1969 in accordance with the timetable; e) A scheme for associating a detailed statement of the main characteristics of each experiment with compilations of the resulting data should be formulated and put into preliminary operation before the end of 1969; f) The immediate primary tasks of the principal data centres are to complete the compilation of existing numerical data, whilst keeping abreast of new data, and to agree and implement an improved compilation, storage and retrieval system; g) Input of experimental data can be facilitated by specific measures; h) Centres should publish review publications which they believe will serve the user community; i) The centres should provide data to users in a variety of media: printed listings, graphs, paper tape, punched cards and magnetic tape - but should encourage standardization within each medium so as to free effort to meet special requirements of users having limited computer facilities; j) Centres should hold and

  7. Report of the Panel on Neutron Data Compilation. Brookhaven National Laboratory, USA, 10-14 February 1969

    International Nuclear Information System (INIS)

    1969-05-01

    After surveying current world needs for bibliographic and compilation activities in the field of neutron data, the report of this Panel of 31 individual technical experts, considers the immediate and future role of the world's neutron data centres in this task. In Chapter V the Panel's findings are summarized in the form of recommendations directed to the centres and their associated national and international advisory committees together with all users of the centres. The Panel's recommendations can be summarised as follows: a) The need for bibliographic indexing and numerical compilation of neutron data on an international basis has been clearly demonstrated and should continue for the foreseeable future; b) The operation of CINDA has been extremely satisfactory; c) Neutron data should be compiled at all energies by all centres subject to any mutually agreed exceptions and priorities; d) A fine-meshed classification scheme for neutron reactions should be formulated and put into use before the end of 1969 in accordance with the timetable; e) A scheme for associating a detailed statement of the main characteristics of each experiment with compilations of the resulting data should be formulated and put into preliminary operation before the end of 1969; f) The immediate primary tasks of the principal data centres are to complete the compilation of existing numerical data, whilst keeping abreast of new data, and to agree and implement an improved compilation, storage and retrieval system; g) Input of experimental data can be facilitated by specific measures; h) Centres should publish review publications which they believe will serve the user community; i) The centres should provide data to users in a variety of media: printed listings, graphs, paper tape, punched cards and magnetic tape - but should encourage standardization within each medium so as to free effort to meet special requirements of users having limited computer facilities; j) Centres should hold and

  8. 14 CFR 26.39 - Newly produced airplanes: Fuel tank flammability.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Newly produced airplanes: Fuel tank... Tank Flammability § 26.39 Newly produced airplanes: Fuel tank flammability. (a) Applicability: This... Series 767 Series (b) Any fuel tank meeting all of the criteria stated in paragraphs (b)(1), (b)(2) and...

  9. abc: An Extensible AspectJ Compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie J.

    2006-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its front end is built using the Polyglot framework, as a modular extension of the Java...... language. The use of Polyglot gives flexibility of syntax and type checking. The back end is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general...

  10. Thoughts and Views on the Compilation of Monolingual Dictionaries in South Africa

    Directory of Open Access Journals (Sweden)

    N.C.P Golele

    2011-10-01

    Full Text Available Abstract: Developing and documenting the eleven official languages of South Africa on all levels of communication, in order to fulfil all the roles and uses characteristic of truly official languages, is a great challenge. To meet this need, various bodies such as the National Lexicography Units have been established by the Pan South African Language Board (PanSALB). As far as dictionary compilation is concerned, acquaintance with state-of-the-art developments in the theory and practice of lexicography is necessary. For the African languages, the focus should be directed to the compilation of monolingual dictionaries. It is important that these monolingual dictionaries be usable right from the start and on a continuous basis. Continued attention should be given to enlarging the corpora and to actual consultation of these corpora on the macro- and microstructural levels. The end products should be of a high lexicographic standard: well balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space, etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries can be accomplished easily and naturally. Advanced and continued training in the compilation of monolingual dictionaries should be provided. Keywords: MONOLINGUAL DICTIONARIES, OFFICIAL LANGUAGES, DICTIONARY COMPILATION, CORPORA, NATIONAL LEXICOGRAPHY UNITS, TARGET USERS, DICTIONARY USE, DICTIONARY CULTURE, CORE TERMS

  11. Chrysosplenium japonicum (Saxifragaceae, Newly Recorded from Taiwan

    Directory of Open Access Journals (Sweden)

    Tian-Chuan Hsu

    2011-11-01

    Full Text Available Chrysosplenium japonicum (Maxim. Makino (Saxifragaceae is newly recorded from northeastern Taiwan. Description, color photos and a key to the Chrysosplenium species in Taiwan are provided.

  12. Derivation of River Bathymetry Using Imagery from Unmanned Aerial Vehicles (UAV)

    Science.gov (United States)

    2011-09-01

    The first method utilizes radiative transfer theory and expounds upon Equation 2.4. The Beer-Lambert-Bouguer Law, or Beer's Law, describes the exponential absorption of light in water. Depths derived this way agreed well with ground-truth data for the study areas.
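    The exponential absorption that Beer's Law describes is what makes depth retrieval from imagery possible: if an effective attenuation coefficient for the water column is known, an observed radiance ratio can be inverted for depth. The following is a minimal sketch of that idea only; the function names and the single effective coefficient k are illustrative assumptions, not the report's actual retrieval procedure.

```python
import math

def transmitted_intensity(i0, k, z):
    """Beer-Lambert-Bouguer law: light decays exponentially along the path.

    i0 : incident intensity just below the water surface
    k  : effective attenuation coefficient of the water column (1/m)
    z  : path length through the water (m)
    """
    return i0 * math.exp(-k * z)

def depth_from_intensity(i0, i, k):
    """Invert Beer's law: recover path length from an observed intensity ratio."""
    return math.log(i0 / i) / k
```

    In practice k varies with wavelength and water clarity, which is why band-ratio methods and ground-truth calibration are needed.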

  13. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1996 July--September. Volume 21, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-02-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: secondary report number index; personal author index; subject index; NRC originating organization index (staff reports); NRC originating organization index (international agreements); NRC contract sponsor index (contractor reports); contractor index; international organization index; and licensed facility index. A detailed explanation of the entries precedes each index.

  14. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1994, July--September. Volume 19, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    None

    1994-12-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: Secondary Report Number Index, Personal Author Index, Subject Index, NRC Originating Organization Index (Staff Reports), NRC Originating Organization Index (International Agreements), NRC Contract Sponsor Index (Contractor Reports), Contractor Index, International Organization Index, Licensed Facility Index. A detailed explanation of the entries precedes each index.

  15. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1996 July--September. Volume 21, Number 3

    International Nuclear Information System (INIS)

    1997-02-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: secondary report number index; personal author index; subject index; NRC originating organization index (staff reports); NRC originating organization index (international agreements); NRC contract sponsor index (contractor reports); contractor index; international organization index; and licensed facility index. A detailed explanation of the entries precedes each index

  16. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1994, July--September. Volume 19, Number 3

    International Nuclear Information System (INIS)

    1994-12-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: Secondary Report Number Index, Personal Author Index, Subject Index, NRC Originating Organization Index (Staff Reports), NRC Originating Organization Index (International Agreements), NRC Contract Sponsor Index (Contractor Reports), Contractor Index, International Organization Index, Licensed Facility Index. A detailed explanation of the entries precedes each index

  17. Being a team leader: newly registered nurses relate their experiences.

    Science.gov (United States)

    Ekström, Louise; Idvall, Ewa

    2015-01-01

    This paper presents a study that explores how newly qualified registered nurses experience their leadership role in the ward-based nursing care team. A nurse's clinical leadership affects the quality of care provided. Newly qualified nurses experience difficulties during the transition period from student to qualified professional and find it challenging to lead nursing care. Twelve nurses were interviewed and the transcribed texts analysed using qualitative content analysis to assess both manifest and latent content. Five themes were identified: feeling stranded; forming well-functioning teams; learning to lead; having the courage, strength, and desire to lead; and ensuring appropriate care. The findings indicate that many factors limit nurses' leadership but some circumstances are supportive. The leadership prerequisites for newly registered nurses need to improve, emphasizing different ways to create a supportive atmosphere that promotes professional development and job satisfaction. To increase nurse retention and promote quality of care, nurse managers need to clarify expectations and guide and support newly qualified nurses in a planned way. © 2013 John Wiley & Sons Ltd.

  18. Compiler generation and autotuning of communication-avoiding operators for geometric multigrid

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Venkat, Anand [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Van Straalen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-04-17

    This paper describes a compiler approach to introducing communication-avoiding optimizations in geometric multigrid (GMG), one of the most popular methods for solving partial differential equations. Communication-avoiding optimizations reduce vertical communication through the memory hierarchy and horizontal communication across processes or threads, usually at the expense of introducing redundant computation. We focus on applying these optimizations to the smooth operator, which successively reduces the error and accounts for the largest fraction of the GMG execution time. Our compiler technology applies both novel and known transformations to derive an implementation comparable to manually-tuned code. To make the approach portable, an underlying autotuning system explores the tradeoff between reduced communication and increased computation, as well as tradeoffs in threading schemes, to automatically identify the best implementation for a particular architecture and at each computation phase. Results show that we are able to quadruple the performance of the smooth operation on the finest grids while attaining performance within 94% of manually-tuned code. Overall, we improve multigrid solve time by 2.5× without sacrificing programmer productivity.
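    The "smooth operator" discussed above is typically a pointwise stencil relaxation applied a few times per multigrid cycle; because each sweep reads every neighbouring point, it is exactly the step where communication-avoiding variants trade redundant computation for less data movement. As a purely illustrative sketch (not the paper's smoother or its transformations), a weighted-Jacobi sweep for the 1-D Poisson problem -u'' = f looks like:

```python
import numpy as np

def weighted_jacobi_smooth(u, f, h, omega=2.0/3.0, sweeps=3):
    """A few weighted-Jacobi sweeps on -u'' = f with mesh spacing h.

    Discrete stencil: (-u[i-1] + 2*u[i] - u[i+1]) / h**2 = f[i], so the
    Jacobi update is u[i] <- (u[i-1] + u[i+1] + h**2 * f[i]) / 2, blended
    with the old value by the damping factor omega. Boundaries are fixed.
    """
    u = u.copy()
    for _ in range(sweeps):
        unew = u.copy()
        unew[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (
            u[:-2] + u[2:] + h * h * f[1:-1])
        u = unew
    return u
```

    Damped Jacobi quickly attenuates high-frequency error components, which is all a multigrid smoother needs to do; the coarse grids handle the rest.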

  19. Aeromagnetic map compilation: procedures for merging and an example from Washington

    Directory of Open Access Journals (Sweden)

    C. Finn

    2000-06-01

    Full Text Available Rocks in Antarctica and offshore have widely diverse magnetic properties. Consequently, aeromagnetic data collected there can improve knowledge of the geologic, tectonic and geothermal characteristics of the region. Aeromagnetic data can map concealed structures such as faults, folds and dikes, ascertain basin thickness and locate buried volcanic rocks, as well as some intrusive and metamorphic rocks. Gridded, composite data sets allow a view of continental-scale trends that individual data sets do not provide and link widely-separated areas of outcrop and disparate geologic studies. Individual magnetic surveys must be processed so that they match adjacent surveys prior to merging. A consistent representation of the Earth's magnetic field (International Geomagnetic Reference Field, IGRF) must be removed from each data set. All data sets need to be analytically continued to the same flight elevation with their datums shifted to match adjacent data. I advocate minimal processing to best represent the individual surveys in the merged compilation. An example of a compilation of aeromagnetic surveys from Washington illustrates the utility of aeromagnetic maps for providing synoptic views of regional tectonic features.
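    The merging recipe above (remove the IGRF, continue all surveys to a common flight elevation, then shift datums to match neighbours) can be illustrated for the datum-shift step. The sketch below is a hypothetical, minimal implementation: it assumes two already-corrected anomaly grids on a common mesh, and applies only a constant (DC) shift, in the spirit of the minimal processing the abstract advocates; the function name and arguments are illustrative.

```python
import numpy as np

def datum_shift(survey_a, survey_b, overlap_mask):
    """Shift survey_b so its datum matches survey_a where they overlap.

    survey_a, survey_b : 2-D anomaly grids (nT), already IGRF-corrected and
                         analytically continued to a common flight elevation
    overlap_mask       : boolean grid marking cells covered by both surveys

    A single constant offset is the least intrusive levelling adjustment;
    more aggressive schemes would fit trends or splines to the mismatch.
    """
    shift = np.nanmedian(survey_a[overlap_mask] - survey_b[overlap_mask])
    return survey_b + shift
```

    Using the median of the overlap differences keeps the shift robust to a few outlier cells along survey edges.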

  20. People newly in love are more responsive to positive feedback.

    Science.gov (United States)

    Brown, Cassandra L; Beninger, Richard J

    2012-06-01

    Passionate love is associated with increased activity in dopamine-rich regions of the brain. Increased dopamine in these regions is associated with a greater tendency to learn from reward in trial-and-error learning tasks. This study examined the prediction that individuals who were newly in love would be better at responding to reward (positive feedback). In test trials, people who were newly in love selected positive outcomes significantly more often than their single (not in love) counterparts but were no better at the task overall. This suggests that people who are newly in love show a bias toward responding to positive feedback, which may reflect a general bias towards reward-seeking.

  1. Proposal for a new self-compiled questionnaire in patients affected by temporo-mandibular joint disorders (TMD).

    Science.gov (United States)

    Agrillo, A; Ramieri, V; Bianca, C; Nastro Siniscalchi, E; Fatone, F M G; Arangio, P

    2010-07-01

    In this work, we propose a self-compiled questionnaire for patients showing dysfunctions of the temporomandibular joint. The questionnaire, composed of 33 closed multiple-choice questions, represents one of the steps in the diagnostic procedure, together with the clinical notes compiled by the medical specialist and the other necessary diagnostic investigations. It also aims to simplify the anamnesis and clinical procedure and the gathering of all information useful for a correct clinical diagnosis, and thus for an appropriate therapy.

  2. CRECTJ: a computer program for compilation of evaluated nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-09-01

    In order to compile evaluated nuclear data in the ENDF format, the computer program CRECTJ has been developed. CRECTJ has two versions: CRECTJ5 treats data in the ENDF/B-IV and ENDF/B-V formats, and CRECTJ6 data in the ENDF-6 format. These programs have been frequently used to produce the Japanese Evaluated Nuclear Data Library (JENDL). This report describes input data and examples of CRECTJ. (author)

  3. The Study of Contemporary Stone-archives Compilation%当代石刻档案编纂的研究

    Institute of Scientific and Technical Information of China (English)

    赵彦昌; 朱效荣

    2016-01-01

    Stone archives are original written records of human activity that people have deliberately preserved, with stone as their carrier. Because of their great documentary value, they have inspired people from many fields to study stone-archive compilation. Although the achievements of stone-archive compilation since the founding of the PRC are very rich, no one has yet systematically summarized and analyzed them. This paper analyzes the present situation of contemporary stone-archive compilation from the viewpoint of its evolution and methods, and reveals the value of stone-archive compilation achievements.

  4. A compilation of structure functions in deep-inelastic scattering

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1991-01-01

    A compilation of data on the structure functions F₂, xF₃, and R = σ_L/σ_T from lepton deep-inelastic scattering off protons and nuclei is presented. The relevant experiments at CERN, Fermilab and SLAC from 1985 are covered. All the data in this review can be found in and retrieved from the Durham-RAL HEP Databases (HEPDATA on the RAL and CERN VM systems and on DURPDG VAX/VMS) together with data on a wide variety of other reactions. (author)

  5. ERES--a PC software for nuclear data compilation in EXFOR format

    International Nuclear Information System (INIS)

    Li Shubing; Liang Qichang; Liu Tingjin

    1993-01-01

    The major functions and implementation of the software ERES (EXFOR Edit System) are introduced. The ERES is developed for nuclear data compilation in EXFOR (EXchange FORmat) format, running on IBM-PC/XT or IBM-PC/AT. EXFOR is the format for the exchange of experimental neutron data accepted by four neutron data centers in the world

  6. Customised Column Generation for Rostering Problems: Using Compile-time Customisation to create a Flexible C++ Engine for Staff Rostering

    DEFF Research Database (Denmark)

    Mason, Andrew J.; Ryan, David; Hansen, Anders Dohn

    2009-01-01

    , but is difficult to maintain, and incurs the time penalties of run-time customisation. Our new approach is to customise the software at compile time, allowing compiler optimisations to be fully exploited to give faster code. The code has also proven to be easier to read and debug....

  7. Workplace Violence and Job Outcomes of Newly Licensed Nurses

    OpenAIRE

    Chang, Hyoung Eun; Cho, Sung-Hyun

    2016-01-01

    Purpose: The purpose of this study was to examine the prevalence of workplace violence toward newly licensed nurses and the relationship between workplace violence and job outcomes. Methods: An online survey was conducted of newly licensed registered nurses who had obtained their license in 2012 or 2013 in South Korea and had been working for 5–12 months after first being employed. The sample consisted of 312 nurses working in hospitals or clinics. The Copenhagen Psychosocial Questionnaire...

  8. Geophysical Investigation of the Raton Basin.

    Science.gov (United States)

    1982-05-01

    The crust thinned from west to east, with a thickness of about 45 km. A basement relief map was constructed from the Bouguer gravity data using computer techniques. The report's sections and figures cover interpretation of the gravity, magnetic, and seismic data; free-air gravity data; Bouguer gravity data; the Capulin free-air gravity map; and a plot of the averaged NOAA elevation data vs. the averaged NOAA Bouguer gravity data.

  9. National Energy Strategy: A compilation of public comments; Interim Report

    Energy Technology Data Exchange (ETDEWEB)

    1990-04-01

    This Report presents a compilation of what the American people themselves had to say about problems, prospects, and preferences in energy. The Report draws on the National Energy Strategy public hearing record and accompanying documents. In all, 379 witnesses appeared at the hearings to exchange views with the Secretary, Deputy Secretary, and Deputy Under Secretary of Energy, and Cabinet officers of other Federal agencies. Written submissions came from more than 1,000 individuals and organizations. Transcripts of the oral testimony and question-and-answer (Q-and-A) sessions, as well as prepared statements submitted for the record and all other written submissions, form the basis for this compilation. Citations of these sources in this document use a system of identifying symbols explained below and in the accompanying box. The Report is organized into four general subject areas concerning: (1) efficiency in energy use, (2) the various forms of energy supply, (3) energy and the environment, and (4) the underlying foundations of science, education, and technology transfer. Each of these, in turn, is subdivided into sections addressing specific topics --- such as (in the case of energy efficiency) energy use in the transportation, residential, commercial, and industrial sectors, respectively. 416 refs., 44 figs., 5 tabs.

  10. Data compilation of respiration, feeding, and growth rates of marine pelagic organisms

    DEFF Research Database (Denmark)

    2013-01-01

    's adaptation to the environment, with consequently less universal mass scaling properties. Data on body mass, maximum ingestion and clearance rates, respiration rates and maximum growth rates of animals living in the ocean epipelagic were compiled from the literature, mainly from original papers but also from...

  11. Compilation of properties data for Li₂TiO₃

    Energy Technology Data Exchange (ETDEWEB)

    Roux, N [CEA Centre d'Etudes de Saclay, 91 - Gif-sur-Yvette (France)]

    1998-03-01

    Properties data obtained at CEA for Li₂TiO₃ are reported. The compilation includes: stability of the Li₂TiO₃ β phase, specific heat, thermal diffusivity, thermal conductivity, linear thermal expansion, thermal creep, and interaction with water and acid. (author)

  12. ACE - an algebraic compiler and encoder for the Chalk River datatron computer

    International Nuclear Information System (INIS)

    Kennedy, J.M.; Okazaki, E.A.; Millican, M.

    1960-03-01

    ACE is a program written for the Chalk River Datatron (Burroughs 205) Computer to enable the machine to compile a program for solving a problem from instructions supplied by the user in a notation related much more closely to algebra than to the machine's own code. (author)

  13. ccPDB: compilation and creation of data sets from Protein Data Bank.

    Science.gov (United States)

    Singh, Harinder; Chauhan, Jagat Singh; Gromiha, M Michael; Raghava, Gajendra P S

    2012-01-01

    ccPDB (http://crdd.osdd.net/raghava/ccpdb/) is a database of data sets compiled from the literature and Protein Data Bank (PDB). First, we collected and compiled data sets from the literature used for developing bioinformatics methods to annotate the structure and function of proteins. Second, data sets were derived from the latest release of PDB using standard protocols. Third, we developed a powerful module for creating a wide range of customized data sets from the current release of PDB. This is a flexible module that allows users to create data sets using a simple six step procedure. In addition, a number of web services have been integrated in ccPDB, which include submission of jobs on PDB-based servers, annotation of protein structures and generation of patterns. This database maintains >30 types of data sets such as secondary structure, tight-turns, nucleotide interacting residues, metals interacting residues, DNA/RNA binding residues and so on.

  14. Sharing analysis in the Pawns compiler

    Directory of Open Access Journals (Sweden)

    Lee Naish

    2015-09-01

    Full Text Available Pawns is a programming language under development that supports algebraic data types, polymorphism, higher order functions and “pure” declarative programming. It also supports impure imperative features including destructive update of shared data structures via pointers, allowing significantly increased efficiency for some operations. A novelty of Pawns is that all impure “effects” must be made obvious in the source code and they can be safely encapsulated in pure functions in a way that is checked by the compiler. Execution of a pure function can perform destructive updates on data structures that are local to or eventually returned from the function without risking modification of the data structures passed to the function. This paper describes the sharing analysis which allows impurity to be encapsulated. Aspects of the analysis are similar to other published work, but in addition it handles explicit pointers and destructive update, higher order functions including closures and pre- and post-conditions concerning sharing for functions.

  15. Molecular dynamics and diffusion a compilation

    CERN Document Server

    Fisher, David

    2013-01-01

    The molecular dynamics technique was developed in the 1960s as the outgrowth of attempts to model complicated systems by using either a) direct physical simulation or (following the great success of Monte Carlo methods) by b) using computer techniques. Computer simulation soon won out over clumsy physical simulation, and the ever-increasing speed and sophistication of computers has naturally made molecular dynamics simulation into a more and more successful technique. One of its most popular applications is the study of diffusion, and some experts now even claim that molecular dynamics simulation is, in the case of situations involving well-characterised elements and structures, more accurate than experimental measurement. The present double volume includes a compilation (over 600 items) of predicted solid-state diffusion data, for all of the major materials groups, dating back nearly four decades. The double volume also includes some original papers: "Determination of the Activation Energy for Formation and ...

  16. Compilation of Existing Neutron Screen Technology

    Directory of Open Access Journals (Sweden)

    N. Chrysanthopoulou

    2014-01-01

    Full Text Available The presence of fast neutron spectra in new reactors is expected to induce a strong impact on the contained materials, including structural materials, nuclear fuels, neutron reflecting materials, and tritium breeding materials. Therefore, introduction of these reactors into operation will require extensive testing of their components, which must be performed under neutronic conditions representative of those expected to prevail inside the reactor cores when in operation. Due to limited availability of fast reactors, testing of future reactor materials will mostly take place in water cooled material test reactors (MTRs by tailoring the neutron spectrum via neutron screens. The latter rely on the utilization of materials capable of absorbing neutrons at specific energy. A large but fragmented experience is available on that topic. In this work a comprehensive compilation of the existing neutron screen technology is attempted, focusing on neutron screens developed in order to locally enhance the fast over thermal neutron flux ratio in a reactor core.

  17. The FORTRAN NALAP code adapted to a microcomputer compiler

    International Nuclear Information System (INIS)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso

    2010-01-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA project (TEcnologia de Reatores Rapidos Avancados), Technology for Advanced Fast Reactors project, aimed at a space reactor application. In this work, to support the TERRA project, the NALAP code adapted to a microcomputer compiler called Compaq Visual Fortran (Version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly to remove unnecessary routines, to eliminate old statements, to introduce new ones and also to include extended precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: the normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a transient of loss of flow; and transients protected from overpower. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 80's. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the usage of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  18. The FORTRAN NALAP code adapted to a microcomputer compiler

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso, E-mail: plobo.a@uol.com.b, E-mail: eduardo@ieav.cta.b, E-mail: fbraz@ieav.cta.b, E-mail: guimarae@ieav.cta.b [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)]

    2010-07-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA project (TEcnologia de Reatores Rapidos Avancados), Technology for Advanced Fast Reactors project, aimed at a space reactor application. In this work, to support the TERRA project, the NALAP code adapted to a microcomputer compiler called Compaq Visual Fortran (Version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly to remove unnecessary routines, to eliminate old statements, to introduce new ones and also to include extended precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: the normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a transient of loss of flow; and transients protected from overpower. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 80's. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the usage of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  19. Exercise recommendations in patients with newly diagnosed fibromyalgia.

    Science.gov (United States)

    Wilson, Brad; Spencer, Horace; Kortebein, Patrick

    2012-04-01

    To evaluate exercise recommendations in patients newly diagnosed with fibromyalgia. A retrospective chart review. A public university rheumatology clinic. Patients newly diagnosed with fibromyalgia (N = 122). Frequency and type of exercise recommendations. The mean (standard deviation) age of these patients with fibromyalgia was 45 ± 12 years; 91% were women. Exercise was recommended as part of the documented treatment plan in 47% of these patients (57/122); only 3 patients had a documented contraindication for exercise. Aquatic exercise was most frequently recommended (56% [32/57]), followed by combined aquatic-aerobic exercise (26% [15/57]) and, infrequently, aerobic exercise only (5% [3/57]); only 7% of these patients (4/57) were referred for physical therapy. The primary method of communication was verbal discussion (94% [54/57]). Although there is well-documented evidence that exercise is beneficial for patients with fibromyalgia, we found that less than half of the patients with newly diagnosed fibromyalgia in our study were provided recommendations to initiate an exercise program as part of their treatment plan. Further investigation of these findings is warranted, including evaluation of other university and community rheumatology practices as well as of other physicians caring for patients with fibromyalgia. However, our findings indicate that there appears to be an opportunity to provide more specific and practical education regarding the implementation of an exercise regimen for patients with newly diagnosed fibromyalgia. Physiatrists may be particularly well suited to manage the exercise component of fibromyalgia care because of their specialized training in exercise prescription. Copyright © 2012 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.

  20. Compilation of the FY 1999 Department of the Navy Working Capital Fund Financial Statements

    National Research Council Canada - National Science Library

    2000-01-01

    ...) Cleveland Center consistently and accurately compiled and consolidated financial data received from Navy field organizations and other sources to prepare the FY 1999 Navy Working Capital Fund financial statements...

  1. Compilations of measured and calculated physicochemical property values for PCBs, PBDEs, PCDDs and PAHs

    Data.gov (United States)

    U.S. Environmental Protection Agency — The dataset consists of compilations of measured and calculated physicochemical property values for PCBs, PBDEs, PCDDs and PAHs. The properties included in this...

  2. Regulatory and technical reports (Abstract Index Journal): Annual compilation for 1988

    International Nuclear Information System (INIS)

    1989-05-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors; proceedings of conferences and workshops; as well as international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility

  3. Regulatory and technical reports (abstract index journal): Annual compilation for 1986

    International Nuclear Information System (INIS)

    1987-03-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors; proceedings of conferences and workshops; as well as international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility

  4. Extending R packages to support 64-bit compiled code: An illustration with spam64 and GIMMS NDVI3g data

    Science.gov (United States)

    Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard

    2017-07-01

    Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user-interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is, besides performance, the limited access to low level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. On the R side, users do not need to change existing code and may not even notice the extension; on the other hand, interfacing 64-bit compiled code efficiently is challenging. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.
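    The 32-bit restriction described above can be made concrete with a small, self-contained sketch (illustrative only, not spam's actual interface): a vector length beyond 2**31 - 1, as arises for long vectors in R, cannot be packed into the 32-bit signed integer type that classic foreign function interfaces use for lengths, but fits in a 64-bit one. The format codes 'i' and 'q' are Python's struct notation for those two C integer types.

    ```python
    import struct

    # Hypothetical vector length exceeding the 32-bit signed range (2**31 - 1).
    N_LONG = 3_000_000_000

    def pack_length(n, fmt):
        """Pack a vector length as it would cross a foreign function
        interface ('i' = 32-bit signed C int, 'q' = 64-bit signed C int)."""
        return struct.pack(fmt, n)

    def fits(n, fmt):
        """True if n is representable in the integer type described by fmt."""
        try:
            pack_length(n, fmt)
            return True
        except struct.error:
            return False
    ```

    With these helpers, `fits(N_LONG, "i")` is False while `fits(N_LONG, "q")` is True, which is precisely the gap a 64-bit extension has to bridge.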

  5. LISP software generative compilation within the frame of a SLIP system; La compilation generative de programmes LISP dans le cadre d'un systeme SLIP

    Energy Technology Data Exchange (ETDEWEB)

    Sitbon, Andre

    1968-04-24

    After having outlined the limitations associated with the use of some programming languages (Fortran, Algol, assembler, and so on), and the benefits of using the LISP structure and its associated language, the author notes that some problems remain regarding the memorisation of the computation process obtained by interpretation. He therefore introduces a generative compiler which produces an executable program, written in a language very close to the machine language used, i.e. the FAP assembler language.

  6. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature. [Once-through Cycle and Plutonium Recycle

    Energy Technology Data Exchange (ETDEWEB)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  7. Nilotinib versus imatinib for newly diagnosed chronic myeloid leukemia

    DEFF Research Database (Denmark)

    Saglio, Giuseppe; Kim, Dong-Wook; Issaragrisil, Surapol

    2010-01-01

    Nilotinib has been shown to be a more potent inhibitor of BCR-ABL than imatinib. We evaluated the efficacy and safety of nilotinib, as compared with imatinib, in patients with newly diagnosed Philadelphia chromosome-positive chronic myeloid leukemia (CML) in the chronic phase.

  8. Transformation of organic N newly added to red soil treated with different cultural practices

    Institute of Scientific and Technical Information of China (English)

    Zhang Qin-Zheng; Ye Qing-Fu; et al.

    1998-01-01

    By using the 15N tracer method, the transformation of organic N newly added to red soil treated with different cultural practices was studied under laboratory incubation conditions. The experimental results showed that the transformation of N from newly added organic matter and from the soil native pool during incubation was influenced by the cultural practice treatment applied before incubation. Compared with planting wheat, fallow was favorable to the mineralization of newly added organic N and soil N. Planting wheat greatly increased the loss of soil N. Application of fertilizers stimulated the mineralization of newly added organic N, while application of organic matter reduced the mineralization but stimulated microbial transformation of newly added organic N.

  9. Selecting informative food items for compiling food-frequency questionnaires: Comparison of procedures

    NARCIS (Netherlands)

    Molag, M.L.; Vries, J.H.M. de; Duif, N.; Ocké, M.C.; Dagnelie, P.C.; Goldbohm, R.A.; Veer, P. van 't

    2010-01-01

    The authors automated the selection of foods in a computer system that compiles and processes tailored FFQ. For the selection of food items, several methods are available. The aim of the present study was to compare food lists made by MOM2, which identifies food items with highest between-person

  10. Compilation of a soil map for Nigeria: a nation-wide soil resource ...

    African Journals Online (AJOL)

    This paper presents the results of a nation-wide soil and land form inventory of Nigeria. The data compilation was conducted in the framework of two projects with the objective to calculate agricultural production potential under different input levels and assess the water erosion hazard. The information on spatial distribution ...

  11. Run-Time and Compiler Support for Programming in Adaptive Parallel Environments

    Directory of Open Access Journals (Sweden)

    Guy Edjlali

    1997-01-01

    Full Text Available For better utilization of computing resources, it is important to consider parallel programming environments in which the number of available processors varies at run-time. In this article, we discuss run-time support for data-parallel programming in such an adaptive environment. Executing programs in an adaptive environment requires redistributing data when the number of processors changes, and also requires determining new loop bounds and communication patterns for the new set of processors. We have developed a run-time library to provide this support. We discuss how the run-time library can be used by compilers of high-performance Fortran (HPF-like languages to generate code for an adaptive environment. We present performance results for a Navier-Stokes solver and a multigrid template run on a network of workstations and an IBM SP-2. Our experiments show that if the number of processors is not varied frequently, the cost of data redistribution is not significant compared to the time required for the actual computation. Overall, our work establishes the feasibility of compiling HPF for a network of nondedicated workstations, which are likely to be an important resource for parallel programming in the future.
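    The core run-time tasks described above, recomputing loop bounds when the processor count changes and redistributing data accordingly, can be sketched as follows (a minimal illustration, not the article's actual library): a block distribution assigns each process a contiguous index range, and the redistribution cost can be estimated by counting elements whose owner changes.

    ```python
    def block_bounds(n, p, rank):
        """Half-open loop bounds [lo, hi) owned by process `rank` of `p`
        over n iterations under a block distribution (the first n % p
        ranks each get one extra iteration)."""
        base, rem = divmod(n, p)
        lo = rank * base + min(rank, rem)
        hi = lo + base + (1 if rank < rem else 0)
        return lo, hi

    def moved_elements(n, p_old, p_new):
        """Count elements whose owning process changes when the processor
        count changes from p_old to p_new (a proxy for redistribution cost)."""
        def owner(i, p):
            for r in range(p):
                lo, hi = block_bounds(n, p, r)
                if lo <= i < hi:
                    return r
            raise ValueError(i)
        return sum(owner(i, p_old) != owner(i, p_new) for i in range(n))
    ```

    For example, shrinking from 4 to 3 processes over 10 iterations moves only the elements near the old block boundaries, which is why the article finds redistribution cheap when the processor count changes rarely.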

  12. Effective elastic thickness along the conjugate passive margins of India, Madagascar and Antarctica: A re-evaluation using the Hermite multitaper Bouguer coherence application

    Science.gov (United States)

    Ratheesh-Kumar, R. T.; Xiao, Wenjiao

    2018-05-01

    Gondwana correlation studies had rationally positioned the western continental margin of India (WCMI) against the eastern continental margin of Madagascar (ECMM), and the eastern continental margin of India (ECMI) against the eastern Antarctica continental margin (EACM). This contribution computes the effective elastic thickness (Te) of the lithospheres of these once-conjugated continental margins using the multitaper Bouguer coherence method. The results reveal significantly low strength values (Te ∼ 2 km) in the central segment of the WCMI that correlate with consistently low Te values (2-3 km) obtained throughout the entire marginal length of the ECMM. This result is consistent with previous Te estimates of these margins, and confirms the idea that the low-Te segments in the central part of the WCMI and along the ECMM represent paleo-rift inception points of the lithospheric margins that were thermally and mechanically weakened by the combined action of the Marion hotspot and lithospheric extension during the rifting. The uniformly low Te value (∼2 km) along the EACM indicates a mechanically weak lithospheric margin, probably due to considerable stretching of the lithosphere, considering the fact that this margin remained almost stationary throughout its rift history. In contrast, the ECMI has comparatively high Te variations (5-11 km) that lack any correlation with the regional tectonic setting. Using gravity forward and inversion applications, we find a leading-order influence of the sediment load on the flexural properties of this marginal lithosphere. The study concludes that the thick pile of Bengal Fan sediments on the ECMI masks and has erased the signal of the original load-induced topography, and its gravity effect has biased the long-wavelength part of the observed gravity signal. The resulting lack of correlation between the flat topography and the deep lithospheric flexure introduces a bias in the flexure modeling, which likely accounts for the relatively high Te.

  13. Reporting session of UWTF operation. Compilation of documents

    International Nuclear Information System (INIS)

    Shimizu, Kaoru; Togashi, Akio; Irinouchi, Shigenori

    1999-07-01

    This is the compilation of the papers and OHP transparencies presented, as well as the discussions and comments, on the occasion of the UWTF reporting session. UWTF stands for the Second Uranium Waste Treatment Facility, which was constructed for compression of metallic wastes and used filters, which are parts of the uranium-bearing solid wastes generated from the Tokai Works of the Japan Nuclear Cycle Development Institute. UWTF has been processing wastes since June 4, 1998. In the session, based on one year of experience of UWTF operation, the difficulties encountered and suggestions to the waste generators are mainly discussed. A brief summary of the UWTF construction, a description of the waste treatment process, and the operation report for fiscal year 1998 are attached. (A. Yamamoto)

  14. The mathematical model of the task of compiling the time-table

    Directory of Open Access Journals (Sweden)

    О.Є. Литвиненко

    2004-01-01

    Full Text Available A mathematical model of the timetabling problem for a high school has been developed. It is shown that, after equivalent transformations, the problem can be reduced to the canonical form of extremal combinatorial problems with a nonlinear structure. An algorithm for solving the problem, realizing a scheme of directed enumeration of variants, is outlined.
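    The directed enumeration of variants mentioned in the abstract can be illustrated with a toy backtracking sketch (the lesson names and constraint model are hypothetical, not the article's): lessons are assigned to slots one at a time, and an assignment is abandoned as soon as two conflicting lessons would share a slot.

    ```python
    def timetable(lessons, slots, conflicts):
        """Assign each lesson a slot so that no two conflicting lessons
        (e.g. sharing a teacher or class group) get the same slot.
        `conflicts` maps a lesson to the lessons it clashes with and must
        list both directions of each clash. Returns a dict or None."""
        assignment = {}

        def ok(lesson, slot):
            return all(assignment.get(other) != slot
                       for other in conflicts.get(lesson, ()))

        def solve(i):
            if i == len(lessons):
                return True
            lesson = lessons[i]
            for slot in slots:           # directed enumeration over variants
                if ok(lesson, slot):
                    assignment[lesson] = slot
                    if solve(i + 1):
                        return True
                    del assignment[lesson]  # backtrack
            return False

        return assignment if solve(0) else None
    ```

    Two slots suffice for a chain of conflicts (math-physics, physics-chemistry), while three pairwise-conflicting lessons need a third slot; the search discovers this automatically.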

  15. Irradiation of strawberries. A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1994-12-01

    The document contains a compilation of all available scientific and technical data on the irradiation of strawberries. It is intended to assist governments in considering the authorization of this particular application of radiation processing of food and in ensuring its control in the facility and the control of irradiated food products moving in trade. The compilation was prepared in response to the requirement of the Codex General Standard for Irradiated Foods and associated Code that radiation treatment of food be justified on the basis of a technological need or of a need to improve the hygienic quality of food. It was prepared also in response to the recommendations of the FAO/IAEA/WHO/ITC-UNCTAD/GATT International conference on the Acceptance, Control of and Trade in Irradiated Food (Geneva, 1989) concerning the need for regulatory control of radiation processing of food. Refs, 1 tab

  16. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1. A detailed description in intermediate coupling of all the levels belonging to the 20 configurations 3p5 ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2. Calculation of the electron collision excitation cross sections in the Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p5 ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3. Comparison and discussion of the compiled data, namely the experimental and theoretical values available from the literature and those from this work. 4. Analysis of the regularities and systematic behaviours in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose. In some cases it has been possible to obtain in this way approximate analytical expressions interpolating the experimental data. 5. Graphical presentation and comparison of all the experimental and theoretical values studied. 6. The last part of the work includes a listing of several general-purpose programs for Atomic Physics calculations developed for this work. (Author) 35 refs

  17. Compilation of electron collision excitation cross sections for neutro argon

    International Nuclear Information System (INIS)

    Blanco Ramos, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1. A detailed description in intermediate coupling of all the levels belonging to the 20 configurations 3p5 ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2. Calculation of the electron collision excitation cross sections in the Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p5 ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3. Comparison and discussion of the compiled data, namely the experimental and theoretical values available from the literature and those from this work. 4. Analysis of the regularities and systematic behaviours in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose. In some cases it has been possible to obtain in this way approximate analytical expressions interpolating the experimental data. 5. Graphical presentation and comparison of all the experimental and theoretical values studied. 6. The last part of the work includes a listing of several general-purpose programs for Atomic Physics calculations developed for this work. (Author)

  18. Possible origin of Saturn's newly discovered outer ring

    International Nuclear Information System (INIS)

    Moehlmann, D.

    1986-01-01

    Within a planetogonic model, the self-gravitationally driven formation of pre-planetary and pre-satellite rings from an earlier thin disk is reported. The theoretically derived orbital radii of these rings are compared with the orbital levels in the planetary system and the satellite systems of Jupiter, Saturn and Uranus. From this comparison it is concluded that at the radial position of Saturn's newly discovered outer ring an early pre-satellite ring of more or less evolved satellites could have existed. These satellites should have been disturbed in their evolution by the gravitation of the neighbouring massive satellite Titan. The comparison may also indicate similarities between the asteroidal belt and the newly discovered outer ring of Saturn

  19. Oral Cancer Knowledge Assessment: Newly Graduated versus Senior Dental Clinicians

    Science.gov (United States)

    Salgado de Souza, Ricardo; Gallego Arias Pecorari, Vanessa; Lauria Dib, Luciano

    2018-01-01

    The present study assessed the level of dentists' knowledge regarding oral cancer in the city of São Paulo, Brazil. A questionnaire was used to compare the level of knowledge among newly graduated and senior clinicians. A total of 20,154 e-mails were correctly delivered to the dentists registered in the database of the Regional Dentistry Council of São Paulo, and 477 (2.36%) responses were received. This sample consisted of 84 newly graduated clinicians and 105 senior clinicians. For the statistical analysis, the chi-square test and the logistic regression analysis were performed with α = 0.05, and the results were described herein. According to their knowledge level, the results were statistically different between the groups, since 19% of the newly graduated clinicians were evaluated with knowledge grade A (excellent) in comparison to 6.7% of the senior clinicians. Although the results indicated that newly graduated clinicians' knowledge regarding oral cancer was 2.1 times higher, 34.5% of the professionals in this group had regular or poor knowledge of the subject, and several questions relating to clinical characteristics and risk factors indicated that knowledge gaps still exist, demonstrating the need for further studies and information activities addressing oral cancer. PMID:29666649

  20. Advanced Multivariate Inversion Techniques for High Resolution 3D Geophysical Modeling

    Science.gov (United States)

    2011-09-01

    We implemented a method to increase the usefulness of gravity data by filtering the Bouguer anomaly map. To remove the long-wavelength components from the Bouguer gravity map we follow Tessema and Antoine (2004), who use an upward continuation method. [Figure: joint inversion of group velocities and gravity. (a) Top: group velocities from a representative cell in the model. Bottom: filtered Bouguer anomalies.]
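    The upward-continuation filtering mentioned above can be sketched in one dimension (an illustrative numpy implementation under assumed synthetic data, not the authors' code): continuing a gravity profile to height h multiplies its Fourier spectrum by exp(-|k|h), which damps short wavelengths, so the continued field approximates the regional component and the difference approximates the residual.

    ```python
    import numpy as np

    def upward_continue(g, dx, h):
        """Upward-continue a 1-D gravity profile g (sample spacing dx, km)
        to height h (km) by multiplying its spectrum by exp(-|k| h)."""
        k = 2 * np.pi * np.fft.fftfreq(g.size, d=dx)   # angular wavenumber
        return np.fft.ifft(np.fft.fft(g) * np.exp(-np.abs(k) * h)).real

    # Hypothetical synthetic profile: long-wavelength regional plus a
    # short-wavelength residual (amplitudes in mGal, distances in km).
    n = 1024
    dx = 400.0 / n
    x = np.arange(n) * dx
    regional = 10.0 * np.sin(2 * np.pi * x / 400.0)   # 400 km wavelength
    residual = 2.0 * np.sin(2 * np.pi * x / 20.0)     # 20 km wavelength
    g = regional + residual

    g_up = upward_continue(g, dx, h=40.0)   # smooth "regional" estimate
    res_est = g - g_up                      # short-wavelength part
    ```

    At h = 40 km the 20-km wavelength is attenuated by exp(-2*pi*40/20), i.e. essentially removed, while the 400-km regional survives, which is the basis of the regional-residual separation.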

  1. MX Siting Investigation. Gravity Survey - Sevier Desert Valley, Utah.

    Science.gov (United States)

    1981-01-24

    Cheyenne, Wyoming. DMAHTC reduces the data to Simple Bouguer Anomaly (see Section A1.4, Appendix A1.0). The Defense Mapping Agency Aerospace Center ... The gravity stations were distributed throughout the valley at an approximate interval of 1.4 miles (2.3 km). Drawing 1 is a Complete Bouguer Anomaly ...

  2. MX Siting Investigation. Gravity Survey - Big Smokey Valley, Nevada.

    Science.gov (United States)

    1980-11-28

    headquartered in Cheyenne, Wyoming. DMAHTC reduces the data to Simple Bouguer Anomaly (see Section A1.4, Appendix A1.0). The Defense Mapping Agency Aerospace ... DMAHTC reduced them to Simple Bouguer Anomalies (SBA) as described in Appendix A1.0. Up to three levels of terrain corrections were applied to the new ...

  3. Career Motivation in Newly Licensed Registered Nurses: What Makes Them Remain

    Science.gov (United States)

    Banks, Zarata Mann; Bailey, Jessica H.

    2010-01-01

    Despite vast research on newly licensed registered nurses (RNs), we don't know why some newly licensed registered nurses remain in their current jobs and others leave the nursing profession early in their career. Job satisfaction, the most significant factor emerging from the literature, plays a significant role in nurses' decisions to remain in…

  4. The Concept of "Simultaneous Feedback": Towards a New Methodology for Compiling Dictionaries

    Directory of Open Access Journals (Sweden)

    Gilles-Maurice de Schryver

    2011-10-01

    Full Text Available

    Abstract: Good lexicographers are constantly striving to enhance the quality of their dictionaries. Since dictionaries are ultimately judged by their target users, there is an urgency to provide for the target users' needs. In order to determine such needs more accurately, it has become common practice to submit users of a dictionary to a series of tests to monitor their success in information retrieval. In most cases such feedback unfortunately comes too late, so that it can at best be considered for implementation in the next or revised edition of the dictionary. In this article it is argued that feedback from the target users should be obtained while the compilation of the dictionary is still in progress, a process referred to as "simultaneous feedback". This concept, which offers a new methodology for compiling dictionaries, overcomes the major problem of creating and publishing entire dictionaries before feedback from target users can be obtained. By this new methodology, the release of several small-scale parallel dictionaries triggers feedback that is immediately channelled to the compilation process of a main dictionary. As such, the target users constantly guide the compilers during the entire compilation process. After a theoretical presentation of the new concept, the feasibility of simultaneous feedback is illustrated with reference to the creation of a bilingual Ciluba-Dutch learner's dictionary. It is shown how this main project has been successfully complemented by three parallel projects.

    Keywords: SIMULTANEOUS FEEDBACK, NEW METHOOOLOGY, MAIN DICTIONARY, PARALLEL DICTIONARIES, TARGET USERS' DESIRES, QUESTIONNAIRES, ELECTRONIC CORPORA, WORD-FREQUENCY STUDIES, CONCORDANCES, AFRICAN LANGUAGES, CILUBÀ

    Summary: The concept of "simultaneous feedback": Towards a new methodology for the compilation of dictionaries. Good lexicographers constantly strive to improve the quality of their dictionaries

  5. Automatic compilation from high-level biologically-oriented programming language to genetic regulatory networks.

    Science.gov (United States)

    Beal, Jacob; Lu, Ting; Weiss, Ron

    2011-01-01

    The field of synthetic biology promises to revolutionize our ability to engineer biological systems, providing important benefits for a variety of applications. Recent advances in DNA synthesis and automated DNA assembly technologies suggest that it is now possible to construct synthetic systems of significant complexity. However, while a variety of novel genetic devices and small engineered gene networks have been successfully demonstrated, the regulatory complexity of synthetic systems that have been reported recently has somewhat plateaued due to a variety of factors, including the complexity of biology itself and the lag in our ability to design and optimize sophisticated biological circuitry. To address the gap between DNA synthesis and circuit design capabilities, we present a platform that enables synthetic biologists to express desired behavior using a convenient high-level biologically-oriented programming language, Proto. The high level specification is compiled, using a regulatory motif based mechanism, to a gene network, optimized, and then converted to a computational simulation for numerical verification. Through several example programs we illustrate the automated process of biological system design with our platform, and show that our compiler optimizations can yield significant reductions in the number of genes (~ 50%) and latency of the optimized engineered gene networks. Our platform provides a convenient and accessible tool for the automated design of sophisticated synthetic biological systems, bridging an important gap between DNA synthesis and circuit design capabilities. Our platform is user-friendly and features biologically relevant compiler optimizations, providing an important foundation for the development of sophisticated biological systems.
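    The kind of compiler optimization reported above, cutting the number of genes by roughly half, can be illustrated generically (this is common-subexpression elimination on a small compute graph, not Proto's actual algorithm): structurally identical nodes are merged, and downstream references are rewritten to the surviving node.

    ```python
    def eliminate_common_subexpressions(nodes):
        """Merge structurally identical nodes in a compute graph.
        `nodes` maps node id -> (operator, tuple of input ids) and is
        assumed to list ids in dependency order (inputs before users).
        Returns the deduplicated graph and an old-id -> canonical-id map."""
        canon = {}   # (op, canonical inputs) -> canonical node id
        remap = {}   # old node id -> canonical node id
        out = {}
        for nid, (op, inputs) in nodes.items():
            key = (op, tuple(remap.get(i, i) for i in inputs))
            if key in canon:
                remap[nid] = canon[key]       # duplicate: fold into survivor
            else:
                canon[key] = nid
                remap[nid] = nid
                out[nid] = key                # keep with rewritten inputs
        return out, remap
    ```

    On a graph that computes (a AND b) twice and negates each copy, the pass folds the duplicate AND, which in turn makes the two NOT nodes identical, halving the node count, analogous to the ~50% gene reduction reported.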

  6. Meeting and activating the newly unemployed

    DEFF Research Database (Denmark)

    Rotger, Gabriel Pons

    ... As intensive activation is usually accompanied by intensive search monitoring, it is important to disentangle the contribution of the costly activation programs from that of caseworker meetings. Using Danish data for the period 2010-13, the paper shows that requiring intensive activation of the newly unemployed, contrary to job search meetings, reduces employment and increases sickness benefit claims.

  7. Mentoring in Early Childhood Education: A Compilation of Thinking, Pedagogy and Practice

    Science.gov (United States)

    Murphy, Caterina, Ed.; Thornton, Kate, Ed.

    2015-01-01

    Mentoring is a fundamental and increasingly important part of professional learning and development for teachers in Aotearoa New Zealand. This book is a much-needed resource for mentors, leaders and teachers in early childhood education. It is the first of its kind: a wide-ranging compilation that explores the thinking, pedagogy and practice of…

  8. Localization of the maximum sedimentary power of Bermejo, San Juan Basin, Argentina; Ubicacion de la maxima potencia sedimentaria de la Cuenca del bermejo, San Juan, Argentina

    Energy Technology Data Exchange (ETDEWEB)

    Gimenez, Mario E. [Universidad Nacional de San Juan (Argentina). Inst. Sismologico Ing. F. Volponi]|[Consejo Nacional de Investigaciones Cientificas y Tecnicas, Buenos Aires (Argentina); Introcaso, Antonio [Rosario Univ. Nacional (Argentina). Inst. de Fisica]|[Consejo Nacional de Investigaciones Cientificas y Tecnicas, Buenos Aires (Argentina); Martinez, M. Patricia [Universidad Nacional de San Juan (Argentina). Inst. Sismologico Ing. F. Volponi

    1995-12-31

    We started from a chart of relative Bouguer values (source: YPF, Yacimientos Petroliferos Fiscales), which we adjusted using three gravi-altimetric profiles connected to the national gravimetric network (Miguelete station, Buenos Aires). The working area was extended for a better evaluation of the regional Bouguer anomalies obtained by means of trend surfaces. We compared this work with Introcaso (1990): we obtained regional Bouguer anomalies similar to those they found, consistent with the strike of the geological structure, here using a 3-D mathematical filter, a larger working area and a greater number of data. We found that the gravimetric minimum was displaced towards the north of the Bermejo basin, with residual Bouguer anomaly values of -80 mGal. (author). 5 refs., 6 figs
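    The trend-surface separation of regional and residual Bouguer anomalies used above can be sketched with a least-squares fit (an illustrative numpy implementation; the synthetic coordinates, coefficients and anomaly are hypothetical): a low-degree polynomial in x and y approximates the regional field, and the residual is what remains.

    ```python
    import numpy as np

    def trend_surface_residual(x, y, g, degree=1):
        """Fit a polynomial trend surface of the given degree to gravity
        values g at coordinates (x, y) by least squares; return the
        regional (fitted surface) and residual (observed minus regional)."""
        terms = [x**i * y**j
                 for i in range(degree + 1)
                 for j in range(degree + 1 - i)]
        A = np.column_stack(terms)
        coeffs, *_ = np.linalg.lstsq(A, g, rcond=None)
        regional = A @ coeffs
        return regional, g - regional

    # Synthetic Bouguer-like data: a planar regional plus one local low.
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 100, 500)                     # km
    y = rng.uniform(0, 100, 500)                     # km
    regional_true = -60.0 + 0.3 * x - 0.2 * y        # mGal
    local = -20.0 * np.exp(-((x - 50)**2 + (y - 50)**2) / 200.0)
    g = regional_true + local

    regional_est, residual_est = trend_surface_residual(x, y, g, degree=1)
    ```

    The fitted plane tracks the true regional while the residual isolates the local gravity low, the same logic behind mapping a residual minimum such as the -80 mGal low in the abstract.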

  9. JANUS: A Compilation System for Balancing Parallelism and Performance in OpenVX

    Science.gov (United States)

    Omidian, Hossein; Lemieux, Guy G. F.

    2018-04-01

    Embedded systems typically do not have enough on-chip memory for an entire image buffer. Programming systems like OpenCV operate on entire image frames at each step, making them use excessive memory bandwidth and power. In contrast, the paradigm used by OpenVX is much more efficient: it uses image tiling, and the compilation system is allowed to analyze and optimize the operation sequence, specified as a compute graph, before doing any pixel processing. In this work, we are building a compilation system for OpenVX that can analyze and optimize the compute graph to take advantage of parallel resources in many-core systems or FPGAs. Using a database of prewritten OpenVX kernels, it automatically adjusts the image tile size and uses kernel duplication and coalescing to meet a defined area (resource) target, or to meet a specified throughput target. This allows a single compute graph to target implementations with a wide range of performance needs or capabilities, e.g. from handheld to datacenter, using minimal resources and power to reach the performance target.
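    The tiling paradigm described above can be illustrated with a minimal numpy sketch (not an actual OpenVX implementation): a 3x3 mean filter computed tile by tile with a one-pixel halo reproduces whole-frame processing exactly, which is what frees a compiler to choose tile sizes that fit on-chip memory.

    ```python
    import numpy as np

    def mean3x3(img):
        """3x3 mean filter with edge replication (whole-frame reference)."""
        p = np.pad(img, 1, mode="edge")
        h, w = img.shape
        return sum(p[i:i + h, j:j + w]
                   for i in range(3) for j in range(3)) / 9.0

    def mean3x3_tiled(img, tile=8):
        """Same filter computed tile by tile with a one-pixel halo, as a
        tiling run-time would schedule it to bound on-chip memory use."""
        h, w = img.shape
        out = np.empty_like(img, dtype=float)
        p = np.pad(img, 1, mode="edge")      # global pad once, for simplicity
        for r in range(0, h, tile):
            for c in range(0, w, tile):
                rh, cw = min(tile, h - r), min(tile, w - c)
                halo = p[r:r + rh + 2, c:c + cw + 2]   # tile plus 1-px halo
                out[r:r + rh, c:c + cw] = sum(
                    halo[i:i + rh, j:j + cw]
                    for i in range(3) for j in range(3)) / 9.0
        return out
    ```

    Because the two functions agree for any tile size, peak working memory drops from the whole frame to one haloed tile, the trade-off a tiling compiler exploits when meeting an area or throughput target.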

  10. Draft report on compilation of generic safety issues for light water reactor nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-01

    A generally accepted approach to characterizing the safety concerns in nuclear power plants is to express them as safety issues which need to be resolved. When such safety issues are applicable to a generation of plants of a particular design or to a family of plants of similar design, they are termed generic safety issues. Examples of generic safety issues are those related to reactor vessel embrittlement, control rod insertion reliability or strainer clogging. The safety issues compiled in this document are based on broad international experience. This compilation is one element in the framework of IAEA activities to assist Member States in reassessing the safety of operating nuclear power plants. Refs.

  11. A Cognitive Approach to the Compilation of Test Materials for the Evaluation of Translator's Skills

    Directory of Open Access Journals (Sweden)

    Elena Berg

    2016-12-01

    Full Text Available A Cognitive Approach to the Compilation of Test Materials for the Evaluation of Translator's Skills. This paper discusses the importance of a cognitive approach to the evaluation of a translator's skills. The authors set forth their recommendations for the compilation of test materials for the evaluation of translators' cognitive abilities. (Polish abstract, translated: The article addresses the importance of a cognitive approach to the evaluation of a translator's skills. The authors present their recommendations for compiling test materials for the evaluation of a translator's cognitive abilities.)

  12. Draft report on compilation of generic safety issues for light water reactor nuclear power plants

    International Nuclear Information System (INIS)

    1997-07-01

    A generally accepted approach to characterizing the safety concerns in nuclear power plants is to express them as safety issues which need to be resolved. When such safety issues are applicable to a generation of plants of a particular design or to a family of plants of similar design, they are termed generic safety issues. Examples of generic safety issues are those related to reactor vessel embrittlement, control rod insertion reliability or strainer clogging. The safety issues compiled in this document are based on broad international experience. This compilation is one element in the framework of IAEA activities to assist Member States in reassessing the safety of operating nuclear power plants. Refs

  13. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1984, July-September. Volume 9, No. 3

    International Nuclear Information System (INIS)

    1984-11-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. These precede the following indexes: Contractor Report Number, Personal Author, Subject, NRC Originating Organization (Staff Reports), NRC Contract Sponsor (Contractor Reports), Contractor, and Licensed Facility

  14. A systematic monograph of the Recent Pentastomida, with a compilation of their hosts

    NARCIS (Netherlands)

    Christoffersen, M.L.; De Assis, J.E.

    2013-01-01

    We compile all published information on the Recent Pentastomida published to date, including complete synonyms, and species distributions. All host species are cited and their names updated. A taxonomical history of the group, a synthesis of phylogenetic information for the taxon, and a summary of

  15. Competence of newly qualified registered nurses from a nursing college

    Directory of Open Access Journals (Sweden)

    BG Morolong

    2005-09-01

    Full Text Available The South African education and training system, through its policy of outcomes-based education and training, has made competency a national priority. In compliance with this national requirement of producing competent learners, the South African Nursing Council (1999 B) requires that beginner professional nurse practitioners and midwives have the necessary knowledge, skills, attitudes and values to enable them to render an efficient professional service. The health care system also demands competent nurse practitioners to ensure quality in health care. In the light of competency being a national priority and a statutory demand, the research question that emerges is: how competent are the newly qualified registered nurses from a specific nursing college in clinical nursing education? A quantitative, non-experimental, contextual design was used to evaluate the competence of newly qualified registered nurses from a specific nursing college. The study was conducted in two phases. The first phase dealt with the development of an instrument, together with its manual, through the conceptualisation process. The second phase focused on the evaluation of the competency of newly qualified nurses using the instrument based on the steps of the nursing process. A pilot study was conducted to test the feasibility of the items of the instrument. During the evaluation phase, a sample of twenty-six newly qualified nurses was selected by simple random sampling from a target population of thirty-six newly qualified registered nurses. However, six participants withdrew from the study. Data were collected in two general hospitals where the newly qualified registered nurses were working. Observation and questioning were used as data collection techniques in accordance with the developed instrument. Measures were taken to ensure internal validity and reliability of the results. To protect the rights of the participants, the researcher adhered to DENOSA’S (1998

  16. Clinical heterogeneity in newly diagnosed Parkinson's disease

    NARCIS (Netherlands)

    Post, Bart; Speelman, Johannes D.; de Haan, Rob J.

    2008-01-01

    OBJECTIVE: To determine clinical heterogeneity in newly diagnosed Parkinson's disease using cluster analysis and to describe the subgroups in terms of impairment, disability, perceived quality of life, and use of dopaminergic therapy. METHODS: We conducted a k-means cluster analysis in a prospective

  17. De la problématique des articles synopsis dans la compilation des ...

    African Journals Online (AJOL)

    In other words, in Gabon attention must be paid to the use of synopsis articles in the compilation of dictionaries, so that these works reflect the linguistic and cultural diversity of the country. This is all the more true since the use of synopsis articles in no way depends on the typology of the ...

  18. VizieR Online Data Catalog: Compilation of stellar rotation data (Kovacs, 2018)

    Science.gov (United States)

    Kovacs, G.

    2018-03-01

    The three datasets included in table1-1.dat, table1-2.dat and table1-6.dat respectively, correspond to the type of stars listed in Table 1 in lines 1 [Praesepe], 2 [HJ_host] and 6 [Field(C)]. These data result from the compilation of rotational and other stellar data from the literature. (4 data files).

  19. [Principles of nutrition in patients with newly appointed stoma].

    Science.gov (United States)

    Pachocka, Lucyna Małgorzata; Urbanik, Anna

    2016-01-01

    The treatment of an intestinal stoma is often a difficult experience for patients and results in numerous problems in the physical, psychological and social domains. Therefore, post-operative care of the patient with a newly appointed stoma should be undertaken by a therapeutic team consisting of doctors, nurses, physiotherapists, dieticians, psychologists and social workers. Appropriate nutritional education of patients aims to improve their quality of life and to prevent the unpleasant ailments that arise after the operation. The specific type of stoma may determine certain dietary recommendations. The presented work provides practical dietary recommendations for patients with a newly appointed stoma.

  20. SERKON program for compiling a multigroup library to be used in BETTY calculation

    International Nuclear Information System (INIS)

    Nguyen Phuoc Lan.

    1982-11-01

    A SERKON-type program was written to compile data sets generated by FEDGROUP-3 into a multigroup library for BETTY calculation. A multigroup library was generated from the ENDF/B-IV data file and tested against the TRX-1 and TRX-2 lattices with good results. (author)

  1. Compilations and evaluations of data on the interaction of electromagnetic radiation with matter

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-05-01

    The material contained in this report deals with data on the interaction of electromagnetic radiation with matter, listing major compilations of X-ray, photon and gamma-ray cross sections and attenuation coefficients, as well as selected reports featuring data on Compton scattering, photoelectric absorption and pair production

  2. Interpretation of recent gravity profiles over the ophiolite belt, Northern Oman Mountains, United Arab Emirates

    Science.gov (United States)

    Khattab, M. M.

    1993-04-01

    The compiled Bouguer gravity anomaly map over parts of the ophiolite rocks of the Northern Oman Mountains suggests the existence of three partially serpentinized nappes: two along the Gulf of Oman coast, with axes near Dadnah and near Fujira, and a third 17 km SSE of Masafi. Modeling of the subsurface geology beneath two gravity profiles (Diba-Kalba and Masafi-Fujira) is based on the occurrence (field evidence) of multiphase low-angle thrusting of members of the Tethyan lithosphere in the Northern Oman Mountains. An assumed crustal model at the Arabian continental margin, beneath the Masafi-Fujira profile, is used to explain an intense gravity gradient. The gravity interpretation is not inconsistent with a gliding mechanism for obduction of the ophiolite on this part of the Arabian continental margin.

  3. Crustal and Upper Mantle Structure from Joint Inversion of Body Wave and Gravity Data

    Science.gov (United States)

    2012-09-01

    We use both free-air and Bouguer gravity anomalies derived from the global gravity model of the GRACE satellite mission. The gravity data provide ... relocation analysis. ... topographic relief this effect needs to be removed; thus, we converted free-air anomalies into Bouguer anomalies assuming a standard density for crustal rocks
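    The free-air to Bouguer conversion mentioned above is commonly done with the infinite-slab (simple Bouguer) correction; the following is a minimal sketch, where the density matches the standard 2.67 Mg/m³ crustal value used elsewhere in these records and the function name is illustrative:

```python
import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
RHO = 2670.0     # standard crustal density, kg/m^3 (2.67 Mg/m^3)

def bouguer_anomaly_mgal(free_air_mgal, elevation_m):
    """Convert a free-air anomaly to a simple Bouguer anomaly by
    subtracting the attraction of an infinite slab whose thickness
    equals the station elevation (terrain correction omitted)."""
    slab_si = 2.0 * math.pi * G * RHO * elevation_m  # m/s^2
    return free_air_mgal - slab_si * 1e5             # 1 mGal = 1e-5 m/s^2
```

    The slab term works out to roughly 0.112 mGal per metre of elevation, which is why the conversion matters wherever there is topographic relief.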

  4. MX Siting Investigation Gravity Survey - Wah Wah Valley, Utah.

    Science.gov (United States)

    1981-05-15

    Bouguer Anomaly (see Section A1.4, Appendix A1.0). The Defense Mapping Agency Aerospace Center (DMAAC), St. Louis, Missouri, calculates outer zone ... DMAHTC/GSS obtained the basic observations for the new stations and reduced them to Simple Bouguer Anomalies (SBA) as described in Appendix A1.0. Up to

  5. Microfluidic very large scale integration (VLSI) modeling, simulation, testing, compilation and physical synthesis

    CERN Document Server

    Pop, Paul; Madsen, Jan

    2016-01-01

    This book presents the state-of-the-art techniques for the modeling, simulation, testing, compilation and physical synthesis of mVLSI biochips. The authors describe a top-down modeling and synthesis methodology for the mVLSI biochips, inspired by microelectronics VLSI methodologies. They introduce a modeling framework for the components and the biochip architecture, and a high-level microfluidic protocol language. Coverage includes a topology graph-based model for the biochip architecture, and a sequencing graph to model for biochemical application, showing how the application model can be obtained from the protocol language. The techniques described facilitate programmability and automation, enabling developers in the emerging, large biochip market. · Presents the current models used for the research on compilation and synthesis techniques of mVLSI biochips in a tutorial fashion; · Includes a set of "benchmarks", that are presented in great detail and includes the source code of several of the techniques p...

  6. Two Newly Discovered Plants in Taiwan

    OpenAIRE

    Tian-Chuan Hsu; Jia-Jung Lin; Shih-Wen Chung

    2009-01-01

    Two herbs are newly discovered in Taiwan. Limnophila fragrans (G. Forst.) Seem. (Scrophulariaceae), native in SE Asia, is recognized from southern lowlands. Anagallis minima (L.) E. H. L. Krause (Primulaceae), native in N America and Europe, was found from northern mountainous region at low altitudes. In this study, descriptions, line drawings, color photos and a distribution map of the two species are provided.

  7. Methodological challenges involved in compiling the Nahua pharmacopeia.

    Science.gov (United States)

    De Vos, Paula

    2017-06-01

    Recent work in the history of science has questioned the Eurocentric nature of the field and sought to include a more global approach that would serve to displace center-periphery models in favor of approaches that take seriously local knowledge production. Historians of Iberian colonial science have taken up this approach, which involves reliance on indigenous knowledge traditions of the Americas. These traditions present a number of challenges to modern researchers, including availability and reliability of source material, issues of translation and identification, and lack of systematization. This essay explores the challenges that emerged in the author's attempt to compile a pre-contact Nahua pharmacopeia, the reasons for these challenges, and the ways they may - or may not - be overcome.

  8. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinina, Elena Arkadievna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Samsa, Michael [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-11-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also an extensive review of the available literature for similar and past efforts as well. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and the descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to "ensure it has heard from as many points of view as possible." The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs

  9. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations

    International Nuclear Information System (INIS)

    Kalinina, Elena Arkadievna; Samsa, Michael

    2015-01-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also an extensive review of the available literature for similar and past efforts as well. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and the descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to 'ensure it has heard from as many points of view as possible.' The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs conducted by the

  10. Pin count-aware biochemical application compilation for mVLSI biochips

    DEFF Research Database (Denmark)

    Lander Raagaard, Michael; Pop, Paul

    2015-01-01

    Microfluidic biochips are replacing conventional biochemical analyzers and are able to integrate the necessary functions for biochemical analysis on-chip. In this paper we are interested in flow-based biochips, in which the fluidic flow is manipulated using integrated microvalves, which are cont... ... a biochemical application. We focus on the compilation task, where the strategy is to delay operations, without missing their deadlines, such that the sharing of control signals is maximized. The evaluation shows a significant reduction in the number of control pins required.

  11. Newly-formed emotional memories guide selective attention processes: Evidence from event-related potentials.

    Science.gov (United States)

    Schupp, Harald T; Kirmse, Ursula; Schmälzle, Ralf; Flaisch, Tobias; Renner, Britta

    2016-06-20

    Emotional cues can guide selective attention processes. However, emotional stimuli can both activate long-term memory representations reflecting general world knowledge and engage newly formed memory representations representing specific knowledge from the immediate past. Here, the self-completion feature of associative memory was utilized to assess the regulation of attention processes by newly-formed emotional memory. First, new memory representations were formed by presenting pictures depicting a person either in an erotic pose or as a portrait. Afterwards, to activate newly-built memory traces, edited pictures were presented showing only the head region of the person. ERP recordings revealed the emotional regulation of attention by newly-formed memories. Specifically, edited pictures from the erotic compared to the portrait category elicited an early posterior negativity and late positive potential, similar to the findings observed for the original pictures. A control condition showed that the effect was dependent on newly-formed memory traces. Given the large number of new memories formed each day, they presumably make an important contribution to the regulation of attention in everyday life.

  12. Newly-formed emotional memories guide selective attention processes: Evidence from event-related potentials

    Science.gov (United States)

    Schupp, Harald T.; Kirmse, Ursula; Schmälzle, Ralf; Flaisch, Tobias; Renner, Britta

    2016-01-01

    Emotional cues can guide selective attention processes. However, emotional stimuli can both activate long-term memory representations reflecting general world knowledge and engage newly formed memory representations representing specific knowledge from the immediate past. Here, the self-completion feature of associative memory was utilized to assess the regulation of attention processes by newly-formed emotional memory. First, new memory representations were formed by presenting pictures depicting a person either in an erotic pose or as a portrait. Afterwards, to activate newly-built memory traces, edited pictures were presented showing only the head region of the person. ERP recordings revealed the emotional regulation of attention by newly-formed memories. Specifically, edited pictures from the erotic compared to the portrait category elicited an early posterior negativity and late positive potential, similar to the findings observed for the original pictures. A control condition showed that the effect was dependent on newly-formed memory traces. Given the large number of new memories formed each day, they presumably make an important contribution to the regulation of attention in everyday life. PMID:27321471

  13. Code of ethics for the national pharmaceutical system: Codifying and compilation.

    Science.gov (United States)

    Salari, Pooneh; Namazi, Hamidreza; Abdollahi, Mohammad; Khansari, Fatemeh; Nikfar, Shekoufeh; Larijani, Bagher; Araminia, Behin

    2013-05-01

    Pharmacists, as health-care providers, face ethical issues in terms of pharmaceutical care, relationships with patients and cooperation with the health-care team. Beyond pharmacy itself, pharmaceutical companies in the various fields of manufacturing, importing or distributing have their own ethical issues. Pharmacy practice is therefore vulnerable to ethical challenges and needs special codes of conduct. In response to this need, and based on a shared project between ethics experts from the relevant research centers, the needs were fully identified and a specific code of conduct was written for each area. The code of conduct was subject to comments from all experts involved in the pharmaceutical sector and was critically discussed in several meetings. The prepared code of conduct comprises a professional code of ethics for pharmacists and ethics guidelines for pharmaceutical manufacturers, importers, distributors, and policy makers. The document was compiled based on the principles of bioethics and professionalism. Compiling the code of ethics for the national pharmaceutical system is the first step in implementing ethics in pharmacy practice, and further efforts to teach professionalism and the ethical code, as necessary and complementary measures, are highly recommended.

  14. Pollen parameters estimates of genetic variability among newly ...

    African Journals Online (AJOL)

    Pollen parameter estimates of genetic variability among newly selected Nigerian roselle (Hibiscus sabdariffa L.) genotypes. ... Estimates of some pollen parameters were used to assess the genetic diversity among ...

  15. Tricks of the trade: time management tips for newly qualified doctors.

    Science.gov (United States)

    Offiah, Gozie; Doherty, Eva

    2018-03-01

    The transition from medical student to doctor is an important milestone. The discovery that their time is no longer their own and that the demands of their job are greater than the time they have available is extremely challenging. At a recent surgical boot camp training programme, 60 first-year surgical trainees who had just completed their internship were invited to reflect on the lessons learnt regarding effective time management and to recommend tips for their newly qualified colleagues. They were asked to identify clinical duties considered urgent and important using the time management matrix, as well as the common time traps encountered by newly qualified doctors. The surgical trainees identified several practical tips, ranging from writing a priority list to working on relationships within the team. These tips are generic and so applicable to all newly qualified medical doctors. We hope that awareness of these tips from the outset, rather than learning them through experience, will greatly assist newly qualified doctors. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. Chest Radiographic Findings in Newly Diagnosed Pulmonary ...

    African Journals Online (AJOL)

    Five hundred newly diagnosed cases of Pulmonary Tuberculosis were treated with directly observed short-course treatment and 100 of them had chest radiographic examination done. The various chest radiographic patterns in the 100 subjects were studied and included: Fluffy exudative changes 80(80%), fibrosis 70(70%) ...

  17. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1988-01-01

    Reliability data are an essential part of probabilistic safety assessment. The quality of the data can determine the quality of the study as a whole. It is obvious that component failure data originating from the plant being analyzed would be most appropriate. However, complete reliance on plant experience is possible in only a few cases, mainly because of the rather limited operating experience. Nuclear plants, although of different designs, often use fairly similar components, so some of the experience can be combined and transferred from one plant to another. In addition, information about component failures is also available from experts with knowledge of component design, manufacturing and operation. That brings us to the importance of assessing generic data. (Generic here means everything that is not plant specific with regard to the plant being analyzed.) The generic data available in the open literature can be divided into three broad categories. The first includes data bases used in previous analyses; these can be plant specific, or generic data updated with plant-specific information (the latter case deserves special attention). The second is based on compilations of plants' operating experience, usually drawn from some kind of event reporting system. The third category includes data sources based on expert opinions (single or aggregate) or on a combination of expert opinions and other nuclear and non-nuclear experience. This paper reflects insights gained in compiling data from generic data sources and highlights the advantages and pitfalls of using generic component reliability data in PSAs
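    The "generic data updated with plant-specific information" route mentioned in this record is conventionally handled with a conjugate Bayesian update; the following is a minimal sketch under a gamma-Poisson model (the function name and the prior values in the example are illustrative, not from the IAEA source):

```python
def update_failure_rate(alpha0, beta0, failures, exposure_hours):
    """Gamma-Poisson conjugate update of a constant failure rate.

    Prior: rate ~ Gamma(alpha0, beta0), with beta0 in hours, e.g.
    encoded from a generic data source. Evidence: `failures` events
    observed over `exposure_hours` of plant-specific operation.
    Returns the posterior mean rate in failures per hour."""
    alpha_post = alpha0 + failures
    beta_post = beta0 + exposure_hours
    return alpha_post / beta_post
```

    For example, a generic prior equivalent to one failure per 1e5 hours combined with two plant-specific failures in 1e5 hours gives a posterior mean of 1.5e-5 per hour, pulled between the generic and the plant-specific estimates.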

  18. Regulatory and technical reports (abstract index journal): Annual compilation for 1996, Volume 21, No. 4

    Energy Technology Data Exchange (ETDEWEB)

    Sheehan, M.A.

    1997-04-01

    This compilation is the annual cumulation of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors.

  19. Regulatory and technical reports (abstract index journal): Annual compilation for 1996, Volume 21, No. 4

    International Nuclear Information System (INIS)

    Sheehan, M.A.

    1997-04-01

    This compilation is the annual cumulation of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors

  20. Newly graduated nurses' empowerment regarding professional competence and other work-related factors.

    Science.gov (United States)

    Kuokkanen, Liisa; Leino-Kilpi, Helena; Numminen, Olivia; Isoaho, Hannu; Flinkman, Mervi; Meretoja, Riitta

    2016-01-01

    Although both nurse empowerment and competence are fundamental concepts in describing newly graduated nurses' professional development and job satisfaction, only a few studies exist on the relationship between these concepts. Therefore, the purpose of this study was to determine how newly graduated nurses assess their empowerment and to clarify professional competence compared to other work-related factors. A descriptive, cross-sectional and correlational design was applied. The sample comprised newly graduated nurses (n = 318) in Finland. Empowerment was measured using the 19-item Qualities of an Empowered Nurse scale, and the Nurse Competence Scale measured nurses' self-assessed generic competence. In addition to demographic data, the background data included employment sector (public/private), job satisfaction, intent to change/leave job, work schedule (shifts/business hours) and assessments of the quality of care in the workplace. The data were analysed statistically using Spearman's correlation coefficient as well as one-way and multivariate analysis of variance. Cronbach's alpha coefficient was used to estimate internal consistency. Newly graduated nurses perceived their levels of empowerment and competence as fairly high. The association between nurse empowerment and professional competence was statistically significant. Other variables correlating positively with empowerment included employment sector, age, job satisfaction, intent to change job, work schedule, and satisfaction with the quality of care in the work unit. The study indicates that competence had the strongest effect on newly graduated nurses' empowerment. New graduates need support and career opportunities. In the future, nurses' further education and nurse managers' resources for supporting and empowering nurses should respond to newly graduated nurses' requisites for attractive and meaningful work.

  1. Two Newly Discovered Plants in Taiwan

    Directory of Open Access Journals (Sweden)

    Tian-Chuan Hsu

    2009-11-01

    Full Text Available Two herbs are newly discovered in Taiwan. Limnophila fragrans (G. Forst.) Seem. (Scrophulariaceae), native in SE Asia, is recognized from southern lowlands. Anagallis minima (L.) E. H. L. Krause (Primulaceae), native in N America and Europe, was found from northern mountainous region at low altitudes. In this study, descriptions, line drawings, color photos and a distribution map of the two species are provided.

  2. Compiling for Novel Scratch Pad Memory based Multicore Architectures for Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shrivastava, Aviral

    2016-02-05

    The objective of this proposal is to develop tools and techniques (in the compiler) to manage data of a task and communication among tasks on the scratch pad memory (SPM) of the core, so that any application (a set of tasks) can be executed efficiently on an SPM based manycore architecture.

  3. Diabetes education and self-management for ongoing and newly diagnosed (DESMOND)

    DEFF Research Database (Denmark)

    Skinner, T. Chas; Carey, Marian E.; Cradock, Sue

    2006-01-01

Objective: To determine the effects of a structured education program on illness beliefs, quality of life and physical activity in people newly diagnosed with Type 2 diabetes. Methods: Individuals attending a diabetes education and self-management for ongoing and newly diagnosed (DESMOND) program in 12 Primary Care Trusts completed questionnaire booklets assessing illness beliefs and quality of life at baseline and 3-month follow-up, metabolic control being assessed through assay of HbA1c. Results: Two hundred and thirty-six individuals attended the structured self-management education sessions ... diagnosed with Type 2 diabetes changes key illness beliefs and that these changes predict quality of life and metabolic control at 3-month follow-up. Practice implications: Newly diagnosed individuals are open to attending self-management programs and, if the program is theoretically driven, can ...

  4. The Research and Compilation of City Maps in the National Geomatics Atlas of the PEOPLE'S Republic of China

    Science.gov (United States)

    Wang, G.; Wang, D.; Zhou, W.; Chen, M.; Zhao, T.

    2018-04-01

The research and compilation of the new-century edition of the National Huge Atlas of the People's Republic of China is a special basic-work project of the Ministry of Science and Technology of the People's Republic of China; the research and compilation of the National Geomatics Atlas of the People's Republic of China is its main content. The National Geomatics Atlas of China consists of four groups of maps and a place-name index. The four map groups are the nationwide thematic map group, the provincial fundamental geographical map group, the landcover map group and the city map group. The city map group is an important component of the National Geomatics Atlas of China and mainly shows the process of urbanization in China. This paper, aimed at the design and compilation of the 39 city-wide maps, briefly introduces mapping-area research and scale design, the mapping technical route, content selection and cartographic generalization, symbol design and map visualization, etc.

  5. Flagellation of Pseudomonas aeruginosa in newly divided cells

    Science.gov (United States)

    Zhao, Kun; Lee, Calvin; Anda, Jaime; Wong, Gerard

    2015-03-01

In monotrichous bacteria such as Pseudomonas aeruginosa, after cell division one daughter cell inherits the old flagellum from its mother cell, and the other grows a new flagellum during or after cell division. It has been shown that the new flagellum grows at the distal pole of the dividing cell when the two daughter cells have not yet completely separated. However, for daughter cells that grow new flagella after division, it remains unknown at which pole the new flagellum will grow. Here, by combining our newly developed bacterial family-tree tracking techniques with genetic manipulation methods, we show that for the daughter cell that did not inherit the old flagellum, the new flagellum grows at the newly formed pole in about 90% of cases. We propose a model for flagellation of P. aeruginosa.

  6. Rubus: A compiler for seamless and extensible parallelism

    Science.gov (United States)

    Adnan, Muhammad; Aslam, Faisal; Sarwar, Syed Mansoor

    2017-01-01

Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special-purpose processing unit called the Graphics Processing Unit (GPU), originally designed for 2D/3D games, is now available for general-purpose use in computers and mobile devices. However, traditional programming languages, which were designed to work with machines having single-core CPUs, cannot efficiently utilize the parallelism available on multi-core processors. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages like CUDA and OpenCL have been introduced, which can be used to write code with parallelism. The main shortcoming of these languages is that the programmer needs to specify all the complex details manually in order to parallelize the code across multiple cores. As a result, code written in these languages is difficult to understand, debug and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of it in CUDA or OpenCL, which can consume significant time and resources. Thus, the amount of parallelism achieved is proportional to the skills of the programmer and the time spent on code optimization. This paper proposes a new open-source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details: it analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedup and better utilization of the underlying hardware without a programmer's expertise in parallel programming. For five different benchmarks, an average speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores, whereas for a matrix multiplication benchmark an average execution speedup of 84 times has been achieved.
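
The gap Rubus targets — a sequential loop body versus explicit parallel plumbing — can be illustrated with a deliberately small example. The sketch below uses Python's standard-library executor rather than Rubus, CUDA or a GPU; the point is only the shape of "seamless" parallelism, where the loop body stays ordinary sequential code and a runtime distributes the iterations.

```python
from concurrent.futures import ThreadPoolExecutor

def body(x):
    # the loop body remains plain sequential code
    return x * x

data = list(range(8))
sequential = [body(x) for x in data]          # the original sequential loop

# the executor spreads iterations across workers; body() itself is untouched,
# which is the property an auto-parallelizing compiler aims to preserve
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(body, data))

print(parallel)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

An auto-parallelizing compiler additionally has to prove the iterations are independent before applying such a transformation, which is the hard part the paper's analysis addresses.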

  8. Perceptions of the clinical competence of newly registered nurses in the North West province

    Directory of Open Access Journals (Sweden)

    M.R. Moeti

    2004-09-01

Full Text Available The clinical competence of newly registered nurses relating to the care of individual clients depends on their ability to correlate the theoretical knowledge learned in the classroom with practice, and on the development of clinical skills. Its foundation lies in the ability to identify and solve problems that emanate from critical thinking, analytical reasoning and reflective practice. It is clear that the quality of clinical exposure plays a leading role in the development of nursing professionals. Nursing skills alone cannot ensure quality care of clients without the application of theory; facilitation of this theory into practice therefore remains an essential component of nursing education. This study was aimed at identifying areas of incompetence of newly registered nurses (1998-2001) in the clinical area by determining the newly registered nurses' and professional nurses' perceptions of the competence of the newly registered nurses. A quantitative, non-experimental, descriptive survey was used to collect the data regarding the clinical competence of newly registered nurses (1998-2001).

  9. The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States

    Science.gov (United States)

    Horton, John D.; San Juan, Carma A.; Stoeser, Douglas B.

    2017-06-30

The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States (https://doi.org/10.5066/F7WH2N65) represents a seamless, spatial database of 48 State geologic maps that range from 1:50,000 to 1:1,000,000 scale. A national digital geologic map database is essential in interpreting other datasets that support numerous types of national-scale studies and assessments, such as those that provide geochemistry, remote sensing, or geophysical data. The SGMC is a compilation of the individual U.S. Geological Survey releases of the Preliminary Integrated Geologic Map Databases for the United States. The SGMC geodatabase also contains updated data for seven States and seven entirely new State geologic maps that have been added since the preliminary databases were published. Numerous errors have been corrected and enhancements added to the preliminary datasets using thorough quality assurance/quality control procedures. The SGMC is not a truly integrated geologic map database because geologic units have not been reconciled across State boundaries. However, the geologic data contained in each State geologic map have been standardized to allow spatial analyses of lithology, age, and stratigraphy at a national scale.
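
The standardization described above — per-State map units carrying shared lithology and age attributes so one query can span State boundaries — can be sketched with a toy lookup. The records and field names below are invented for illustration and do not reflect the SGMC's actual schema.

```python
# Hypothetical map units from different States, normalized to common
# lithology/age fields so a single national-scale query works across them.
units = [
    {"state": "CO", "unit": "Pierre Shale",   "lithology": "shale",     "age": "Cretaceous"},
    {"state": "KS", "unit": "Niobrara Chalk", "lithology": "limestone", "age": "Cretaceous"},
    {"state": "CO", "unit": "Idaho Springs",  "lithology": "gneiss",    "age": "Proterozoic"},
]

def units_of_age(records, age):
    """Query over the standardized age attribute, regardless of source State."""
    return sorted(r["unit"] for r in records if r["age"] == age)

print(units_of_age(units, "Cretaceous"))  # ['Niobrara Chalk', 'Pierre Shale']
```

The query never mentions a State: that is precisely what attribute standardization buys, even though the unit polygons themselves remain unreconciled at State lines.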

  10. Compilation of contract research for the Materials Engineering Branch, Division of Engineering: Annual report for FY 1987

    International Nuclear Information System (INIS)

    1988-06-01

This compilation of annual reports by contractors to the Materials Engineering Branch of the NRC Office of Research concentrates on achievements in safety research for the primary system of commercial light water power reactors, particularly with regard to reactor vessels, primary system piping, steam generators, nondestructive examination of primary components, and in safety research for decommissioning and decontamination, on-site storage, and engineered safety features. This report, covering research conducted during Fiscal Year 1987, is the sixth volume of the series NUREG-0975, ''Compilation of Contractor Research for the Materials Engineering Branch, Division of Engineering.''

  11. WHO GLOBAL TUBERCULOSIS REPORTS: COMPILATION AND INTERPRETATION

    Directory of Open Access Journals (Sweden)

I. A. Vasilyeva

    2017-01-01

Full Text Available The purpose of this article is to inform national specialists involved in tuberculosis control about the methods used to compile WHO global tuberculosis statistics, which are drawn upon when developing strategies and programmes for tuberculosis control and evaluating their efficiency. The article explains in detail the main WHO epidemiological rates used in international publications on tuberculosis, along with data on their registered values, and the new approaches to compiling the lists of countries with the highest burdens of tuberculosis, drug-resistant tuberculosis and tuberculosis with concurrent HIV infection. The article compares the rates in the Russian Federation with global data, as well as with data from countries of the WHO European Region and the highest-burden countries. It presents materials on the achievement of the global goals in tuberculosis control and the main provisions of the WHO End TB Strategy for 2015-2035, adopted as part of the UN Sustainable Development Goals.

  12. 'Practising under your own Pin'- a description of the transition experiences of newly qualified midwives.

    Science.gov (United States)

    Avis, Mark; Mallik, Maggie; Fraser, Diane M

    2013-11-01

Transition experiences of newly qualified midwives were examined in depth during the third phase of a UK evaluation study of midwifery education. The fitness to practise and the retention of newly qualified nursing and midwifery graduates are pressing concerns for health care managers. The advantages of preceptorship are reported in the literature, but the content and timing of schemes remain unclear. A semi-structured diary was kept for up to 6 months by 35 newly qualified midwives in 18 work sites covering all countries in the UK. The preceptor and supervisor of midwives for each newly qualified midwife completed short questionnaires about their preceptee's performance, and a further sub-sample of newly qualified midwives and preceptors participated in semi-structured interviews. Data were analysed to elicit aspects of the newly qualified midwives' transition experiences. Findings confirm that structured preceptorship schemes are not widely available; newly qualified midwives primarily obtained transition support from members of the midwifery team. Although perceived as competent, there is no clear demarcation point at which a newly qualified midwife becomes confident to practise as a registered practitioner. Implications for managers include the importance of a supportive culture within clinical teams for successful transition and the introduction of structured preceptorship schemes facilitated by appropriate rotation patterns. © 2012 John Wiley & Sons Ltd.

  13. Combustion, performance and emissions characteristics of a newly ...

    Indian Academy of Sciences (India)

of a newly developed CRDI single cylinder diesel engine ... In case of unit injector and unit pump systems, fuel injection pressure depends on ... nozzle hole diameters were effective in reducing smoke and PM emissions. However ...

  14. Yield and Adaptability Evaluation of Newly Introduced Tomato ...

    African Journals Online (AJOL)

High yield is a major ambition of tomato plant breeders and farmers. The purpose of the ... Tabora Region on the growth and yield of newly introduced tomato varieties. The tested ... (1985). Evaluation of some American tomato cultivars grown.

  15. Mentorship for newly appointed physicians: a strategy for enhancing patient safety?

    Science.gov (United States)

    Harrison, Reema; McClean, Serwaa; Lawton, Rebecca; Wright, John; Kay, Clive

    2014-09-01

Mentorship is an increasingly popular innovation from business and industry that is being applied in health-care contexts. This paper explores the concept of mentorship for newly appointed physicians in their first substantive senior post, and specifically its utilization to enhance patient safety. Semi-structured face-to-face and telephone interviews were conducted with Medical Directors (n = 5), Deputy Medical Directors (n = 4), and Clinical Directors (n = 6) from 9 acute NHS Trusts in the Yorkshire and Humber region in the north of England. A focused thematic analysis was used. A number of beneficial outcomes were associated with mentorship for newly appointed physicians, including greater personal and professional support, organizational commitment, and general well-being. Providing newly appointed senior physicians with support through mentorship was considered to enhance the safety of patient care. Mentorship may prevent or reduce active failures, be used to identify threats in the local working environment and, in the longer term, address latent threats to safety within the organization by encouraging a healthier safety culture. Offering mentorship to all newly appointed physicians in their first substantive post in health care may be a useful strategy to support the development of their clinical, professional, and personal skills in this transitional period and may also enhance the safety of patient care.

  16. Newly elected IAEA Board of Governors

    International Nuclear Information System (INIS)

    2000-01-01

The document gives information about the election of 11 Member States to the IAEA Board of Governors, the Agency's 35-member policy-making body, during the 44th regular session of the IAEA General Conference (18-22 September 2000, Austria Center, Vienna). The newly elected Member States are Argentina, Egypt, Ghana, Ireland, Libyan Arab Jamahiriya, Mexico, Pakistan, Peru, Switzerland, Thailand and Ukraine. The other 24 Member States of the Board are also listed.

  17. MX Siting Investigation, Gravity Survey - Delamar Valley, Nevada.

    Science.gov (United States)

    1981-07-20

reduces the data to Simple Bouguer Anomaly (see Section A1.4, Appendix A1.0). The Defense Mapping Agency Aerospace Center (DMAAC), St. Louis, Missouri ... DRAWINGS: Drawing 1, Complete Bouguer Anomaly Contours; Drawing 2, Depth to Rock Interpreted from Gravity Data (in pocket at end of report) ... 2.0 GRAVITY DATA REDUCTION: DMAHTC/GSS obtained the basic observations for the new stations and reduced them to Simple Bouguer
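
The reduction to Simple Bouguer Anomaly that this report describes follows the standard form: observed gravity minus theoretical (latitude-dependent) gravity, plus a free-air correction for station elevation, minus an infinite-slab Bouguer correction. A minimal sketch using the conventional coefficients (0.3086 mGal/m free-air gradient; 2*pi*G*rho for the slab) and the usual 2.67 g/cm^3 reduction density; the example station values are invented:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
FREE_AIR = 0.3086  # free-air gradient, mGal per metre of elevation

def simple_bouguer_anomaly(g_obs, g_theor, elev_m, density=2670.0):
    """Simple Bouguer anomaly in mGal.

    g_obs, g_theor : observed and theoretical gravity at the station, mGal
    elev_m         : station elevation above the reference datum, metres
    density        : reduction density, kg/m^3 (2670 = the common 2.67 g/cm^3)
    """
    free_air_corr = FREE_AIR * elev_m                   # restores height-related loss
    slab_corr = 2.0 * math.pi * G * density * elev_m * 1e5  # slab effect, m/s^2 -> mGal
    return g_obs - g_theor + free_air_corr - slab_corr

# Hypothetical station 1000 m above the datum where observed equals theoretical:
print(round(simple_bouguer_anomaly(0.0, 0.0, 1000.0), 1))  # 196.6
```

With the default density this gives the familiar combined elevation coefficient of roughly 0.197 mGal/m; the complete Bouguer anomaly of the head-note compilation adds terrain corrections on top of this simple form.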

  18. A newly developed snack effective for enhancing bone volume

    Directory of Open Access Journals (Sweden)

    Hayashi Hidetaka

    2009-07-01

Full Text Available Abstract Background The incidence of primary osteoporosis is higher in Japan than in the USA and European countries. Recently, the importance of preventive medicine has gradually been recognized in the field of orthopaedic surgery, with the concept that peak bone mass should be increased in childhood as much as possible for the prevention of osteoporosis. Against this background, we have developed a new bean snack with the aim of improving bone volume loss. In this study, we examined the effects of the newly developed snack on bone volume and density in osteoporosis model mice. Methods Orchiectomy (ORX) and ovariectomy (OVX) were performed on twelve-week-old C57BL/6J mice (Jackson Laboratory, Bar Harbor, ME, USA). We prepared and provided three types of powder diet: a normal calcium diet (NCD, Ca: 0.9%, Clea Japan Co., Tokyo, Japan), a low calcium diet (LCD, Ca: 0.63%, Clea Japan Co.) and a special diet (SCD, Ca: 0.9%). Eighteen weeks after surgery, all the animals were sacrificed and prepared for histomorphometric analysis to quantify bone density and bone mineral content. Results Histomorphometric examination revealed that the SCD enhanced bone volume irrespective of age and sex. Bone density was increased significantly in osteoporosis model mice fed the newly developed snack as compared with control mice, and bone mineral content was also enhanced significantly; these phenomena were observed in both sexes. Conclusion The newly developed bean snack is highly effective for the improvement of bone volume loss irrespective of sex. Our results suggest that the newly developed snack may be a useful preventive measure for Japanese people whose bone mineral density values are less than ideal.

  19. Some measurements of Java-to-bytecode compiler performance in the Java Virtual Machine

    OpenAIRE

    Daly, Charles; Horgan, Jane; Power, James; Waldron, John

    2001-01-01

    In this paper we present a platform independent analysis of the dynamic profiles of Java programs when executing on the Java Virtual Machine. The Java programs selected are taken from the Java Grande Forum benchmark suite, and five different Java-to-bytecode compilers are analysed. The results presented describe the dynamic instruction usage frequencies.
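
The frequency analysis this record describes can be approximated in miniature by tallying mnemonics in a disassembly listing. The sketch below parses a few lines in the style of `javap -c` output; the listing fragment is invented for illustration, and a static count is of course only a rough proxy for the dynamic execution profile the paper actually measures on the JVM.

```python
import re
from collections import Counter

# A fragment in the style of `javap -c` output (invented for illustration):
# a simple counting loop compiled to bytecode.
listing = """
   0: iconst_0
   1: istore_1
   2: iload_1
   3: bipush        10
   5: if_icmpge     14
   8: iinc          1, 1
  11: goto          2
"""

def opcode_counts(disassembly):
    """Tally bytecode mnemonics from a javap-style listing."""
    counts = Counter()
    for line in disassembly.splitlines():
        m = re.match(r"\s*\d+:\s+([a-z_]\w*)", line)
        if m:
            counts[m.group(1)] += 1
    return counts

print(sum(opcode_counts(listing).values()))  # 7
```

Turning this static tally into the dynamic usage frequencies of the paper would require weighting each instruction by how often it executes, e.g. by instrumenting the interpreter loop.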

  20. Clinical evaluation of a newly designed orthodontic tooth brush - A clinical study

    Directory of Open Access Journals (Sweden)

    C S Saimbi

    2009-01-01

In this study, the newly designed orthodontic tooth brush is compared with an ordinary tooth brush. The results show that the newly designed orthodontic tooth brush is superior in cleaning efficiency to the ordinary tooth brush, with a plaque-removing capacity of nearly 95-99%.