WorldWideScience

Sample records for lines updated compilation

  1. Updated site compilation of the Latin American Pollen Database

    NARCIS (Netherlands)

Flantua, S.G.A.; Hooghiemstra, H.; Grimm, E.C.; Behling, H.; Bush, M.B.; González-Arango, C.; Gosling, W.D.; Ledru, M.-P.; Lozano-García, S.; Maldonado, A.; Prieto, A.R.; Rull, V.; van Boxel, J.H.

    2015-01-01

    The updated inventory of the Latin American Pollen Database (LAPD) offers a wide range of new insights. This paper presents a systematic compilation of palynological research in Latin America. A comprehensive inventory of publications in peer-reviewed and grey literature shows a major expansion of

  2. Updating parameters of the chicken processing line model

    DEFF Research Database (Denmark)

    Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna

    2010-01-01

A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. Recent availability of data allows updating parameters of the model to better describe processes observed in slaughterhouses. We propose Bayesian updating as a suitable technique to update expert judgment with microbiological data. Berrang and Dickens’s data are used to demonstrate performance of this method in updating parameters of the chicken processing line model.
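
The Bayesian updating step named in this abstract can be made concrete with a toy conjugate model. A minimal sketch, assuming a Beta prior standing in for the expert judgment and binomial count data standing in for the microbiological observations (the model choice and the numbers are illustrative, not the paper's):

```python
# Hedged sketch of Bayesian updating of an expert-elicited parameter
# with count data. The Beta-Binomial model and the numbers are
# illustrative; the paper's actual Campylobacter transmission model
# is not reproduced here.
def beta_update(alpha: float, beta: float, k: int, n: int):
    """Update a Beta(alpha, beta) prior with k successes in n trials."""
    return alpha + k, beta + (n - k)

# Expert judgment encoded as Beta(2, 8): transfer probability ~0.2.
alpha, beta = 2.0, 8.0
# Hypothetical microbiological data: transfer observed in 30 of 100 cases.
alpha, beta = beta_update(alpha, beta, k=30, n=100)
posterior_mean = alpha / (alpha + beta)   # ~0.29 after updating
print(posterior_mean)
```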

  3. The strongest spectral lines of stable elements with other interfering elements in compiled and plotted version

    International Nuclear Information System (INIS)

    Bauer, M.; Weitkamp, C.

    1977-01-01

The strongest spectral lines of the 85 stable chemical elements have been compiled and plotted along with lines from other elements that may interfere in applications like spectroscopic multielement analysis. For each line a wavelength range of ±0.25 Å around the line of interest has been considered. The tables contain the wavelength, intensity and assignment to an ionization state of the emitting atom; the plots visualize the lines with a Doppler broadening corresponding to 8,000 K. (orig.)

  4. Numerical performance and throughput benchmark for electronic structure calculations in PC-Linux systems with new architectures, updated compilers, and libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Hwang, Jenn-Kang; Tang, Chuan Yi; Yu, Chin-Hui

    2004-01-01

A number of recently released numerical libraries, including the Automatically Tuned Linear Algebra Subroutines (ATLAS) library, the Intel Math Kernel Library (MKL), the GOTO numerical library, and the AMD Core Math Library (ACML) for AMD Opteron processors, are linked against the executables of the Gaussian 98 electronic structure calculation package, which is compiled with updated versions of Fortran compilers such as the Intel Fortran compiler (ifc/efc) 7.1 and the PGI Fortran compiler (pgf77/pgf90) 5.0. ifc 7.1 delivers about a 3% improvement on 32-bit machines compared with the former version 6.0. The performance gain from pgf77 3.3 to 5.0 is also around 3% when the original, unmodified optimization options enclosed in the software are used; with extensive compiler tuning options, however, the speed can be improved by about 25%. The performances of these fully optimized numerical libraries are similar. The double-precision floating-point (FP) instruction set (SSE2) is also functional on AMD Opteron processors operated in 32-bit compilation, and the Intel Fortran compiler performs the better optimization. Hardware-level tuning can improve memory bandwidth by adjusting the DRAM timing; efficiency in the CL2 mode is a further 2.6% higher than in the CL2.5 mode. The FP throughput is measured by simultaneous execution of two identical copies of each of the test jobs. The resulting performance impact suggests that the IA64 and AMD64 architectures can deliver significantly higher throughput than IA32, which is consistent with the SpecFPrate2000 benchmarks.

  5. NACRE II: an update of the NACRE compilation of charged-particle-induced thermonuclear reaction rates for nuclei with mass number A<16

    International Nuclear Information System (INIS)

    Xu, Y.; Takahashi, K.; Goriely, S.; Arnould, M.; Ohta, M.; Utsunomiya, H.

    2013-01-01

An update of the NACRE compilation [3] is presented. This new compilation, referred to as NACRE II, reports thermonuclear reaction rates for 34 charged-particle induced, two-body exoergic reactions on nuclides with mass number A < 16, for temperatures in the 10^6 ≲ T ⩽ 10^10 K range. Along with the ‘adopted’ rates, their low and high limits are provided. The new rates are available in electronic form as part of the Brussels Library (BRUSLIB) of nuclear data. The NACRE II rates also supersede the previous NACRE rates in the Nuclear Network Generator (NETGEN) for astrophysics. (http://www.astro.ulb.ac.be/databases.html)

  6. Writing Compilers and Interpreters: A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

Long-awaited revision to a unique guide that covers both compilers and interpreters. Revised, updated, and now focusing on Java instead of C++, this long-awaited, latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  7. Compilation of cross-sections. Pt. 1

    International Nuclear Information System (INIS)

    Flaminio, V.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1983-01-01

    A compilation of integral cross-sections for hadronic reactions is presented. This is an updated version of CERN/HERA 79-1, 79-2, 79-3. It contains all data published up to the beginning of 1982, but some more recent data have also been included. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  8. Aircraft engine sensor fault diagnostics using an on-line OBEM update method.

    Directory of Open Access Journals (Sweden)

    Xiaofeng Liu

This paper proposes a method to update the on-line health reference baseline of the On-Board Engine Model (OBEM) to maintain the effectiveness of an in-flight aircraft sensor Fault Detection and Isolation (FDI) system, in which a Hybrid Kalman Filter (HKF) is incorporated. A large health condition mismatch between the engine and the OBEM, generated by rapid in-flight engine degradation, can corrupt the performance of the FDI. It is therefore necessary to update the OBEM on line when a rapid degradation occurs, but the FDI system loses estimation accuracy if the estimation and the update run simultaneously. To solve this problem, the health reference baseline for a nonlinear OBEM was updated using the proposed channel controller method. Simulations based on a turbojet engine Linear-Parameter-Varying (LPV) model demonstrated the effectiveness of the proposed FDI system in the presence of substantial degradation, and the channel controller can ensure that the update process finishes without interference from a single sensor fault.
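
The HKF named above is built around the standard Kalman measurement update. A hedged, generic sketch of that building block (the state layout, matrices and numbers are assumptions; the paper's OBEM/HKF formulation is not reproduced here):

```python
# Generic linear Kalman filter measurement update, illustrative only.
import numpy as np

def kalman_measurement_update(x, P, z, H, R):
    """One measurement update: fold sensor reading z into state estimate x."""
    y = z - H @ x                          # innovation (sensor minus prediction)
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ y                      # updated state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P   # updated covariance
    return x_new, P_new

# Toy 2-state example (hypothetical health-parameter state; one sensor).
x, P = np.array([1.0, 0.0]), np.eye(2)
H, R = np.array([[1.0, 0.0]]), np.array([[0.1]])
x, P = kalman_measurement_update(x, P, np.array([1.2]), H, R)
print(x)
```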

  9. A compilation of energy costs of physical activities.

    Science.gov (United States)

    Vaz, Mario; Karaolis, Nadine; Draper, Alizon; Shetty, Prakash

    2005-10-01

There were two objectives: first, to review the existing data on energy costs of specified activities in the light of the recommendations made by the Joint Food and Agriculture Organization/World Health Organization/United Nations University (FAO/WHO/UNU) Expert Consultation of 1985; second, to compile existing data on the energy costs of physical activities for an updated annexure of the current Expert Consultation on Energy and Protein Requirements. Electronic and manual searches of the literature (predominantly English) were carried out to obtain published data on the energy costs of physical activities. The majority of the data prior to 1955 were obtained from an earlier compilation by Passmore and Durnin. Energy costs were expressed as the physical activity ratio (PAR): the energy cost of the activity divided by either the measured or predicted basal metabolic rate (BMR). The compilation provides PARs for an expanded range of activities that include general personal activities, transport, domestic chores, occupational activities, sports and other recreational activities for men and women, separately, where available. The present compilation is largely in agreement with the 1985 compilation for activities that are common to both compilations. The present compilation has been based on the need to provide data on adults for a wide spectrum of human activity. There are, however, lacunae in the available data for many activities, between genders, across age groups and in various physiological states.
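
The PAR definition given above is directly computable. A minimal sketch, with hypothetical function names and example numbers (not values from the compilation):

```python
# Minimal sketch of the PAR definition: energy cost of an activity
# expressed as a multiple of basal metabolic rate (BMR). Names and the
# example numbers are illustrative, not from the paper.
def physical_activity_ratio(activity_kj_per_min: float,
                            bmr_kj_per_min: float) -> float:
    """PAR = energy cost of activity / (measured or predicted) BMR."""
    return activity_kj_per_min / bmr_kj_per_min

# Example: an activity costing 15 kJ/min for a person with a BMR of
# 4.5 kJ/min has PAR ~ 3.3, i.e., moderate-intensity work.
print(round(physical_activity_ratio(15.0, 4.5), 1))
```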

  10. Sharing analysis in the Pawns compiler

    Directory of Open Access Journals (Sweden)

    Lee Naish

    2015-09-01

Pawns is a programming language under development that supports algebraic data types, polymorphism, higher order functions and “pure” declarative programming. It also supports impure imperative features including destructive update of shared data structures via pointers, allowing significantly increased efficiency for some operations. A novelty of Pawns is that all impure “effects” must be made obvious in the source code and they can be safely encapsulated in pure functions in a way that is checked by the compiler. Execution of a pure function can perform destructive updates on data structures that are local to or eventually returned from the function without risking modification of the data structures passed to the function. This paper describes the sharing analysis which allows impurity to be encapsulated. Aspects of the analysis are similar to other published work, but in addition it handles explicit pointers and destructive update, higher order functions including closures and pre- and post-conditions concerning sharing for functions.

  11. Concepts of incremental updating and versioning

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2004-07-01

This paper reports on the work undertaken recently by the Working Group (WG). The WG was voted to become a Commission by the General Assembly held at the 21st ICC in Durban, South Africa. The basic problem being addressed by the Commission is that a user compiles their data base... or election). Historically, updates have been provided in bulk, with the new data set replacing the old one. Users could: ignore the update (if it is not significant enough), manually (and selectively) update their data base, or accept the whole update...
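
The bulk-versus-incremental contrast described above can be sketched in a few lines. A hedged illustration, with hypothetical record and change-set structures:

```python
# Hedged sketch of bulk replacement versus incremental updating of a
# user's data base. The feature and operation representations are
# hypothetical, not the Commission's actual data model.
from typing import Any, Dict, List, Tuple

Dataset = Dict[str, Dict[str, Any]]        # feature id -> attributes
Change = Tuple[str, str, Dict[str, Any]]   # (op, feature id, attributes)

def bulk_update(old: Dataset, new: Dataset) -> Dataset:
    """Historical practice: the new data set simply replaces the old."""
    return new

def incremental_update(base: Dataset, changes: List[Change]) -> Dataset:
    """Apply only the changes, preserving the user's unchanged features."""
    result = dict(base)
    for op, fid, attrs in changes:
        if op in ("insert", "update"):
            result[fid] = attrs
        elif op == "delete":
            result.pop(fid, None)
    return result

base = {"f1": {"name": "road A"}, "f2": {"name": "road B"}}
print(incremental_update(base, [("delete", "f2", {}),
                                ("insert", "f3", {"name": "road C"})]))
```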

  12. Ada Integrated Environment III Computer Program Development Specification. Volume III. Ada Optimizing Compiler.

    Science.gov (United States)

    1981-12-01

Symbol Map: library-file.library-unit(.subunit).SYMAP; Statement Map: library-file.library-unit(.subunit).SMAP; Type Map: library-file.library-unit(.subunit).TMAP. The library ... code generator SYMAP (Symbol Map), code generator SMAP (updated Statement Map), code generator TMAP (Type Map). A.3.5 The PUNIT Command. The PUNIT ... Core.Stmtmap) NAME Tmap (Core.Typemap) END. Example A-3: Compiler Command Stream for the Code Generator. (Texas Instruments, A-5, Ada Optimizing Compiler)

  13. New uses of sulfur - update

    Energy Technology Data Exchange (ETDEWEB)

    Almond, K.P.

    1995-07-01

An update to an extensive bibliography on alternate uses of sulfur was presented. Alberta Sulphur Research Ltd. previously compiled a bibliography in volume 24 of this quarterly bulletin. This update provides an additional 44 new publications. The current research focuses on the use of sulfur in oil and gas applications, mining and metallurgy, concretes and other structural materials, waste management, rubber and textile products, and asphalts and other paving and highway applications.

  14. Transportation legislative data base: State radioactive materials transportation statute compilation, 1989--1993

    International Nuclear Information System (INIS)

    1994-04-01

The Transportation Legislative Data Base (TLDB) is a computer-based information service containing summaries of federal, state and certain local government statutes and regulations relating to the transportation of radioactive materials in the United States. The TLDB has been operated by the National Conference of State Legislatures (NCSL) under cooperative agreement with the US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management since 1992. The data base system serves the legislative and regulatory information needs of federal, state, tribal and local governments, the affected private sector and interested members of the general public. Users must be approved by DOE and NCSL. This report is a state statute compilation that updates the 1989 compilation produced by Battelle Memorial Institute, the previous manager of the data base. This compilation includes statutes not included in the prior compilation, as well as newly enacted laws. Statutes not included in the prior compilation show an enactment date prior to 1989. Statutes that deal with low-level radioactive waste transportation are included in the data base, as are statutes from the states of Alaska and Hawaii. Over 155 new entries to the data base are summarized in this compilation.

  15. A program-compiling method of nuclear data on-line fast analysis

    International Nuclear Information System (INIS)

    Li Shangbai

    1990-01-01

This paper discusses how to perform assembly floating-point operations by using subroutines of the Applesoft system, and introduces a program-compiling method for fast on-line analysis of nuclear data on an Apple microcomputer.

  16. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler’s ability to generate loop-parallel code. We use this compilation system to modify two sequential benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should...

  17. NEA contributions to the worldwide collection, compilation and dissemination of nuclear reaction data

    International Nuclear Information System (INIS)

    Dupont, E.

    2012-01-01

    The NEA Data Bank is an international centre of reference for basic nuclear tools used in the analysis and prediction of phenomena in different nuclear applications. The Data Bank collects and compiles computer codes and scientific data and contributes to their improvement for the benefit of scientists in its member countries. In line with this mission, the Data Bank is a core centre of the International Network of Nuclear Reaction Data Centres (NRDC), which co-ordinates the worldwide collection, compilation and dissemination of nuclear reaction data. The NRDC network was established in 1976 from the earlier Four-Centres' Network created in 1966 by the United States, the NEA, the International Atomic Energy Agency (IAEA) and the former Soviet Union. Today, the NRDC is a worldwide co-operation network under the auspices of the IAEA, with 14 nuclear data centres from 8 countries and 2 international organisations belonging to the network. The main objective of the NRDC is to preserve, update and disseminate experimental nuclear reaction data that have been compiled for more than 40 years in a shared database (EXFOR). The EXFOR database contains basic nuclear data on low- to medium-energy experiments for incident neutron, photon and various charged-particle-induced reactions on a wide range of isotopes, natural elements and compounds. Today, with more than 140 000 data sets from approximately 20 000 experiments, EXFOR is by far the most important and complete experimental nuclear reaction database in the world and is widely used in the field of nuclear science and technology. The Data Bank is responsible for the collection and compilation of nuclear reaction data measured in its geographical area. Since 1966, the Data Bank has contributed around 5 000 experiments to the EXFOR database, and it continues to compile new data while maintaining the highest level of quality throughout the database. NRDC co-ordination meetings are held on a biennial basis. Recent meetings

  18. A systematic monograph of the Recent Pentastomida, with a compilation of their hosts

    NARCIS (Netherlands)

    Christoffersen, M.L.; De Assis, J.E.

    2013-01-01

We compile all information on the Recent Pentastomida published to date, including complete synonyms and species distributions. All host species are cited and their names updated. A taxonomical history of the group, a synthesis of phylogenetic information for the taxon, and a summary of

  19. The compiled catalogue of galaxies in machine-readable form and its statistical investigation

    International Nuclear Information System (INIS)

    Kogoshvili, N.G.

    1982-01-01

The compilation of a machine-readable catalogue of relatively bright galaxies was undertaken at the Abastumani Astrophysical Observatory in order to facilitate the statistical analysis of a large body of observational material on galaxies from the Palomar Sky Survey. In compiling the catalogue of galaxies, the following problems were considered: the collection of existing information for each galaxy; a critical approach to the data aimed at selecting the most important features of the galaxies; the recording of data in computer-readable form; and the permanent updating of the catalogue. (Auth.)

  20. CO2 line-mixing database and software update and its tests in the 2.1 μm and 4.3 μm regions

    International Nuclear Information System (INIS)

    Lamouroux, J.; Régalia, L.; Thomas, X.; Vander Auwera, J.; Gamache, R.R.; Hartmann, J.-M.

    2015-01-01

An update of the former version of the database and software for the calculation of CO2–air absorption coefficients taking line-mixing into account [Lamouroux et al. J Quant Spectrosc Radiat Transf 2010;111:2321] is described. In this new edition, the data sets were constructed using parameters from the 2012 version of the HITRAN database and recent measurements of line-shape parameters. Among other improvements, speed-dependent profiles can now be used if line-mixing is treated within the first-order approximation. This new package is tested using laboratory spectra measured in the 2.1 μm and 4.3 μm spectral regions for various pressures, temperatures and CO2 concentration conditions. Despite improvements at 4.3 μm at room temperature, the conclusions on the quality of this update are more ambiguous at low temperature and in the 2.1 μm region. Further tests using laboratory and atmospheric spectra are thus required for the evaluation of the performances of this updated package. - Highlights: • High-resolution infrared spectroscopy. • CO2 in air. • Updated tools. • Line-mixing database and software

  1. Digital Bedrock Compilation: A Geodatabase Covering Forest Service Lands in California

    Science.gov (United States)

    Elder, D.; de La Fuente, J. A.; Reichert, M.

    2010-12-01

This digital database contains bedrock geologic mapping for Forest Service lands within California. This compilation began in 2004 and the first version was completed in 2005. The second publication of this geodatabase was completed in 2010 and filled major gaps in the southern Sierra Nevada and Modoc/Medicine Lake/Warner Mountains areas. This digital map database was compiled from previously published and unpublished geologic mapping, with source mapping and review from the California Geological Survey, the U.S. Geological Survey and others. Much of the source data was itself compilation mapping. This geodatabase is huge, containing ~107,000 polygons and ~280,000 arcs. Mapping was compiled from more than one thousand individual sources and covers over 41,000,000 acres (~166,000 km2). It was compiled from source maps at various scales, from ~1:4,000 to 1:250,000, and represents the best available geologic mapping at the largest scale possible. An estimated 70-80% of the source information was digitized from geologic mapping at 1:62,500 scale or better. The Forest Service ACT2 Enterprise Team compiled the bedrock mapping and developed a geodatabase to store this information. This geodatabase supports feature classes for polygons (e.g., map units), lines (e.g., contacts, boundaries, faults and structural lines) and points (e.g., orientation data, structural symbology). Lookup tables provide detailed information for feature class items. Lookup/type tables contain legal values and hierarchical groupings for geologic ages and lithologies. Type tables link coded values with descriptions for line and point attributes, such as line type, line location and point type. This digital mapping is at the core of many quantitative analyses and derivative map products. Queries of the database are used to produce maps and to quantify rock types of interest. These include the following: (1) ultramafic rocks - where hazards from naturally occurring asbestos are high, (2) granitic rocks - increased
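
Queries like those described above (quantifying rock types of interest) reduce to attribute filters over the polygon feature class. A hedged sketch with hypothetical field names and an in-memory stand-in for the geodatabase, which in practice would be queried through GIS tooling:

```python
# Hedged sketch of an attribute query over bedrock polygons, e.g.,
# selecting ultramafic units and summing their area. Field names and
# the feature list are hypothetical.
polygons = [
    {"unit": "um1", "lithology": "ultramafic", "acres": 1200.0},
    {"unit": "gr4", "lithology": "granitic",   "acres": 5300.0},
    {"unit": "um7", "lithology": "ultramafic", "acres": 450.5},
]

ultramafic = [p for p in polygons if p["lithology"] == "ultramafic"]
total_acres = sum(p["acres"] for p in ultramafic)
print(f"{len(ultramafic)} ultramafic polygons, {total_acres:.1f} acres")
```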

  2. Charged particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, C.

    1999-01-01

We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal reason for setting up the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates issued by Fowler and collaborators. The main goal of the NACRE network was transparency in the procedure of calculating the rates. More specifically this compilation aims at: 1. updating the experimental and theoretical data; 2. distinctly identifying the sources of the data used in rate calculation; 3. evaluating the uncertainties and errors; 4. providing numerically integrated reaction rates; 5. providing reverse reaction rates and analytical approximations of the adopted rates. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. The compilation is concerned with the reaction rates that are large enough for the target lifetimes shorter than the age of the Universe, taken equal to 15 × 10^9 y. The reaction rates are provided for temperatures lower than T = 10^10 K. In parallel with the rate compilation a cross section data base has been created and located at the site http://pntpm.ulb.ac.be/nacre..htm. (authors)
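
Item 4 above, numerically integrated reaction rates, amounts to folding a cross section σ(E) with the Maxwell-Boltzmann energy distribution. A hedged numerical sketch (the toy cross section, the grid and the reduced-mass value are mine, not NACRE's evaluated data):

```python
# Hedged sketch of a numerically integrated thermonuclear reaction rate:
# <sigma*v> = sqrt(8/(pi*mu)) * (kT)^(-3/2) * Int sigma(E) E exp(-E/kT) dE.
import numpy as np

K_B = 8.617333262e-11  # Boltzmann constant in MeV/K

def rate_sigma_v(sigma, mu_c2_mev, T_kelvin, e_grid_mev):
    """<sigma v> in cm^3/s per particle pair; mu given via its rest energy."""
    kT = K_B * T_kelvin
    integrand = sigma(e_grid_mev) * e_grid_mev * np.exp(-e_grid_mev / kT)
    integral = np.trapz(integrand, e_grid_mev)        # cm^2 * MeV^2
    c = 2.99792458e10                                 # speed of light, cm/s
    prefactor = np.sqrt(8.0 / (np.pi * mu_c2_mev)) * c / kT**1.5
    return prefactor * integral

def sigma_toy(e_mev):
    return np.full_like(e_mev, 1.0e-24)  # constant 1 barn, illustrative only

e = np.linspace(1e-3, 10.0, 20000)                    # energy grid in MeV
print(rate_sigma_v(sigma_toy, 469.0, 1e9, e))         # mu*c^2 ~ 469 MeV (p+p-like)
```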

  3. Update-in-Place Analysis for True Multidimensional Arrays

    Directory of Open Access Journals (Sweden)

    Steven M. Fitzgerald

    1996-01-01

Applicative languages have been proposed for defining algorithms for parallel architectures because they are implicitly parallel and lack side effects. However, straightforward implementations of applicative-language compilers may induce large amounts of copying to preserve program semantics. The unnecessary copying of data can increase both the execution time and the memory requirements of an application. To eliminate the unnecessary copying of data, the Sisal compiler uses both build-in-place and update-in-place analyses. These optimizations remove unnecessary array copy operations through compile-time analysis. Both build-in-place and update-in-place are based on hierarchical ragged arrays, i.e., the vector-of-vectors array model. Although this array model is convenient for certain applications, many optimizations are precluded, e.g., vectorization. To compensate for this deficiency, new languages, such as Sisal 2.0, have extended array models that allow for both high-level array operations to be performed and efficient implementations to be devised. In this article, we introduce a new method to perform update-in-place analysis that is applicable to arrays stored either in hierarchical or in contiguous storage. Consequently, the array model that is appropriate for an application can be selected without the loss of performance. Moreover, our analysis is more amenable for distributed memory and large software systems.
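
The effect of update-in-place analysis can be mimicked at run time. A hedged sketch in which the outcome of the (static) analysis is passed in explicitly as a flag, since the Sisal compiler's actual analysis is not reproduced here:

```python
# Hedged sketch of the idea behind update-in-place analysis: an update
# may destructively mutate its argument only when the compiler has
# proved the argument unshared; otherwise it must copy to preserve
# applicative semantics. Here `unique` stands in for the analysis result.
def updated(arr, index, value, unique):
    if not unique:
        arr = list(arr)   # shared: copy so no other reference sees the change
    arr[index] = value    # unshared: safe destructive update, no copy
    return arr

a = [1, 2, 3]
b = updated(a, 0, 99, unique=False)        # 'a' still live elsewhere: copy
assert a == [1, 2, 3] and b == [99, 2, 3]
t = updated([4, 5, 6], 1, 0, unique=True)  # temporary: updated in place
assert t == [4, 0, 6]
```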

  4. Engineering a compiler

    CERN Document Server

    Cooper, Keith D

    2012-01-01

As computing has changed, so has the role of both the compiler and the compiler writer. The proliferation of processors, environments, and constraints demands an equally large number of compilers. To adapt, compiler writers retarget code generators, add optimizations, and work on issues such as code space or power consumption. Engineering a Compiler re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in today's practice. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best-practice algorithms for the major problems inside a compiler. • Focuses on the back end of the compiler, reflecting the focus of research and development over the last decade. • Applies the well-developed theory behind scanning and parsing to introduce concepts that play a critical role in optimization and code generation. • Introduces the student to optimization through data-flow analysis, SSA form, and a selection of sc...

  5. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

in the style of denotational semantics; – the output of the generated compiler is effectively three-address code, in the fashion and efficiency of the Dragon Book; – the generated compiler processes several hundred lines of source code per second. The source language considered in this case study is imperative, block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs by specializing a definitional interpreter with respect to the program. Specialization is carried out using type-directed partial evaluation, which is a mild version of partial evaluation akin to lambda-calculus normalization. Our definitional interpreter follows the format of denotational semantics, with a clear
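
The first Futamura projection mentioned above can be demonstrated with a toy definitional interpreter. A hedged sketch (the mini-language and the specializer are illustrative; this is not the paper's type-directed partial evaluator):

```python
# Hedged sketch of the first Futamura projection: specializing a tiny
# definitional interpreter with respect to a fixed program yields a
# residual (compiled) program.
def interp(expr, env):
    """Definitional interpreter for a toy expression language."""
    op = expr[0]
    if op == "lit": return expr[1]
    if op == "var": return env[expr[1]]
    if op == "add": return interp(expr[1], env) + interp(expr[2], env)
    if op == "mul": return interp(expr[1], env) * interp(expr[2], env)

def specialize(expr):
    """'mix': emit residual Python code for a fixed expr (compilation)."""
    op = expr[0]
    if op == "lit": return str(expr[1])
    if op == "var": return f"env[{expr[1]!r}]"
    if op == "add": return f"({specialize(expr[1])} + {specialize(expr[2])})"
    if op == "mul": return f"({specialize(expr[1])} * {specialize(expr[2])})"

prog = ("add", ("var", "x"), ("mul", ("lit", 2), ("var", "y")))
compiled = eval(f"lambda env: {specialize(prog)}")   # residual program
assert compiled({"x": 1, "y": 3}) == interp(prog, {"x": 1, "y": 3}) == 7
```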

  6. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature

    International Nuclear Information System (INIS)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  7. Performance of the Line-By-Line Radiative Transfer Model (LBLRTM) for temperature, water vapor, and trace gas retrievals: recent updates evaluated with IASI case studies

    Directory of Open Access Journals (Sweden)

    M. J. Alvarado

    2013-07-01

Modern data assimilation algorithms depend on accurate infrared spectroscopy in order to make use of the information related to temperature, water vapor (H2O), and other trace gases provided by satellite observations. Reducing the uncertainties in our knowledge of spectroscopic line parameters and continuum absorption is thus important to improve the application of satellite data to weather forecasting. Here we present the results of a rigorous validation of spectroscopic updates to an advanced radiative transfer model, the Line-By-Line Radiative Transfer Model (LBLRTM), against a global dataset of 120 near-nadir, over-ocean, nighttime spectra from the Infrared Atmospheric Sounding Interferometer (IASI). We compare calculations from the latest version of LBLRTM (v12.1) to those from a previous version (v9.4+) to determine the impact of spectroscopic updates to the model on spectral residuals as well as retrieved temperature and H2O profiles. We show that the spectroscopy in the CO2 ν2 and ν3 bands is significantly improved in LBLRTM v12.1 relative to v9.4+, and that these spectroscopic updates lead to mean changes of ~0.5 K in the retrieved vertical temperature profiles between the surface and 10 hPa, with the sign of the change and the variability among cases depending on altitude. We also find that temperature retrievals using each of these two CO2 bands are remarkably consistent in LBLRTM v12.1, potentially allowing these bands to be used to retrieve atmospheric temperature simultaneously. The updated H2O spectroscopy in LBLRTM v12.1 substantially improves the a posteriori residuals in the P-branch of the H2O ν2 band, while the improvements in the R-branch are more modest. The H2O amounts retrieved with LBLRTM v12.1 are on average 14% lower between 100 and 200 hPa, 42% higher near 562 hPa, and 31% higher near the surface compared to the amounts retrieved with v9.4+ due to a combination of the different retrieved temperature profiles and the

  8. Compilation of data from hadronic atoms

    International Nuclear Information System (INIS)

    Poth, H.

    1979-01-01

This compilation is a survey of the existing data on hadronic atoms (pionic atoms, kaonic atoms, antiprotonic atoms, sigmonic atoms). It collects measurements of the energies, intensities and line widths of X-rays from hadronic atoms. Averaged values for each hadronic atom are given and the data are summarized. The listing contains data on 58 pionic atoms, 54 kaonic atoms, 23 antiprotonic atoms and 20 sigmonic atoms. (orig./HB)

  9. Compiler Feedback using Continuous Dynamic Compilation during Development

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Probst, Christian W.

    2014-01-01

    to optimization. This tool can help programmers understand what the optimizing compiler has done and suggest automatic source code changes in cases where the compiler refrains from optimizing. We have integrated our tool into an integrated development environment, interactively giving feedback as part...

  10. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  11. Recovery and updating of power transmission lines. The experience made by ELETROSUL; Recapacitacao e repotenciacao de linhas de transmissao. A experiencia da ELETROSUL

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, Jose Carlos de Saboia [ELETROSUL, Curitiba, PR (Brazil)]. E-mail: jcsaboia@eletrosul.gov.br

    2001-07-01

This document presents a synthesis of procedures for the analysis and verification of the adequacy of power transmission lines, aiming at their recovery or power updating. It also describes the procedures executed during the preliminary design, final design and construction stages of the transmission lines modernized by ELETROSUL at the 138, 230 and 500 kV voltage levels. The concepts on which the evaluation criteria for the structural components of the transmission lines (metallic supports, foundations in metallic grills and reinforced concrete) are based are presented, as well as the main aspects of the methodologies used. A comparative analysis is also made of the processes used for the structural analysis of the metallic lattice supports (computational processes using the finite element and graphical methods), as well as of the foundations of these supports (structural verifications and others). The research and work involved in the power updating of three 138 and 230 kV transmission lines of ELETROSUL are also reported.

  12. Updating of visual orientation in a gravity-based reference frame.

    Science.gov (United States)

    Niehof, Nynke; Tramper, Julian J; Doeller, Christian F; Medendorp, W Pieter

    2017-10-01

    The brain can use multiple reference frames to code line orientation, including head-, object-, and gravity-centered references. If these frames change orientation, their representations must be updated to keep register with actual line orientation. We tested this internal updating during head rotation in roll, exploiting the rod-and-frame effect: The illusory tilt of a vertical line surrounded by a tilted visual frame. If line orientation is stored relative to gravity, these distortions should also affect the updating process. Alternatively, if coding is head- or frame-centered, updating errors should be related to the changes in their orientation. Ten subjects were instructed to memorize the orientation of a briefly flashed line, surrounded by a tilted visual frame, then rotate their head, and subsequently judge the orientation of a second line relative to the memorized first while the frame was upright. Results showed that updating errors were mostly related to the amount of subjective distortion of gravity at both the initial and final head orientation, rather than to the amount of intervening head rotation. In some subjects, a smaller part of the updating error was also related to the change of visual frame orientation. We conclude that the brain relies primarily on a gravity-based reference to remember line orientation during head roll.

  13. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 7, December 2005 (last updated 2005.12.15)

    International Nuclear Information System (INIS)

    2005-12-01

This Profiles report is based on data collected using the NEWMDB from May to October 2005. The report was first published on line within the NEWMDB in December 2005. Starting with NEWMDB version II (introduced January 2004), individual reports can be updated after a Profiles report is published. Please refer to the Profiles bookmark; the page accessed via this bookmark lists revisions to individual Profiles (if there are any).

  14. Bibliography on electron transfer processes in ion-ion/atom/molecule collisions, updated 1990

    International Nuclear Information System (INIS)

    Tawara, H.

    1990-08-01

Following a previous compilation, new bibliographic information on experimental and theoretical studies of electron transfer processes in ion-ion/atom/molecule collisions is updated. References published through 1989 are surveyed. To make it easy to find references for a particular combination of collision partners, a simple list is also provided. Furthermore, for convenience, a copy of the previous compilation (IPPJ-AM-45 (1986)) is included. (author) 1363 refs.

  15. Hydrogeological conditions in the Finnsjoen area. Compilation of data and conceptual model

    International Nuclear Information System (INIS)

    Andersson, J.E.; Nordqvist, R.; Nyberg, G.; Smellie, J.; Tiren, S.

    1991-02-01

In the present report, all available data gathered from the Finnsjoen area that are of potential use for numerical modelling are compiled and discussed. The data were collected during different phases of the period 1977-1989. This inevitably means that the quality of the measured and interpreted data varies in accordance with the continuous development of improved equipment and interpretation techniques. The present report is an updated version of SKB progress report 89-24 with the same title and authors; see the introduction. (au)

  16. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Ukai, K.; Nakamura, T.

    1984-09-01

An updated data compilation on single pion photoproduction experiments below 2 GeV is presented. This data bank includes not only the data of single pion photoproduction processes but also those of proton Compton scattering (γp → γp) and the inverse process of γn → π⁻p (π⁻p → γn). The total numbers of data points are 6240 for γp → π⁺n, 5715 for γp → π⁰p, 2835 for γn → π⁻p, 177 for γn → π⁰n, 669 for γp → γp, and 112 for π⁻p → γn processes. The compiled data are stored in the central computer (FACOM M-380R) of the Institute of Nuclear Study, University of Tokyo, for direct use of this data bank, and on magnetic tapes with the standard label for other laboratories. The FACOM computer is compatible with IBM 370 series or IBM 303X or 308X series machines. The data on the magnetic tapes are available on request. (Kato, T.)

  17. First- and Second-Line Targeted Systemic Therapy in Hepatocellular Carcinoma—An Update on Patient Selection and Response Evaluation

    Directory of Open Access Journals (Sweden)

    Johann von Felden

    2016-11-01

Advanced hepatocellular carcinoma (HCC) with vascular invasion and/or extrahepatic spread and preserved liver function, corresponding to stage C of the Barcelona Clinic Liver Cancer (BCLC) classification, has a dismal prognosis. The multi-targeted tyrosine-kinase receptor inhibitor (TKI) sorafenib is the only proven active substance in systemic HCC therapy for first-line treatment. In this review, we summarize current aspects of patient selection and management of side effects, and provide an update on response evaluation during first-line sorafenib therapy. Since second-line treatment options have improved with the successful completion of the RESORCE trial, which demonstrated a survival benefit for second-line treatment with the TKI regorafenib, response monitoring during first-line therapy will be critical to delivering optimal systemic therapy in HCC. In this regard, specific side effects, in particular worsening of arterial hypertension and diarrhea, might suggest treatment response during first-line sorafenib therapy; however, clear predictive clinical markers, as well as laboratory tests or serum markers, are not established. Assessment of radiologic response according to the modified Response Evaluation Criteria in Solid Tumors (mRECIST) is helpful to identify patients who do not benefit from sorafenib treatment.

  18. Advanced compiler design and implementation

    CERN Document Server

    Muchnick, Steven S

    1997-01-01

From the Foreword by Susan L. Graham: This book takes on the challenges of contemporary languages and architectures, and prepares the reader for the new compiling problems that will inevitably arise in the future. The definitive book on advanced compiler design. This comprehensive, up-to-date work examines advanced issues in the design and implementation of compilers for modern processors. Written for professionals and graduate students, the book guides readers in designing and implementing efficient structures for highly optimizing compilers for real-world languages. Covering advanced issues in fundamental areas of compiler design, this book discusses a wide array of possible code optimizations, determining the relative importance of optimizations, and selecting the most effective methods of implementation. * Lays the foundation for understanding the major issues of advanced compiler design * Treats optimization in-depth * Uses four case studies of commercial compiling suites to illustrate different approache...

  19. Recent Efforts in Data Compilations for Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Dillmann, Iris

    2008-01-01

Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on 'Nuclear Physics Data Compilation for Nucleosynthesis Modeling' held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The 'JINA Reaclib Database' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A 'high priority list' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. 'Workflow tools' aim to make the evaluation process transparent and allow users to follow the progress.

  20. Recent Efforts in Data Compilations for Nuclear Astrophysics

    Science.gov (United States)

    Dillmann, Iris

    2008-05-01

Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on 'Nuclear Physics Data Compilation for Nucleosynthesis Modeling' held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The 'JINA Reaclib Database' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A 'high priority list' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. 'Workflow tools' aim to make the evaluation process transparent and allow users to follow the progress.

  1. Institutional Ethics Committee Regulations and Current Updates in India.

    Science.gov (United States)

    Mahuli, Amit V; Mahuli, Simpy A; Patil, Shankargouda; Bhandi, Shilpa

    2017-08-01

The aim of this review is to provide current updates on regulations for ethics committees and researchers in India. Ethical dilemmas in research have been a major concern for researchers worldwide since time immemorial. The question "what makes clinical research ethical" is significant and difficult to answer, as multiple factors are involved. Research involving human participants in clinical trials should follow the required rules, regulations, and guidelines of one's own country. It is a dynamic process, and updates have to be learned by researchers and committee members. The review highlights the ethical regulations from the Drug Controller General of India, the Clinical Trial Registry of India, and the Indian Council of Medical Research guidelines. In this article, updates on the Indian scenario of ethics committees and guidelines are compiled. The review comes in handy for clinical researchers and ethics committee members in academic institutions to check on current updates and keep abreast of the regulations of ethics in India.

  2. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature. [Once-through Cycle and Plutonium Recycle

    Energy Technology Data Exchange (ETDEWEB)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  3. HAL/S-FC compiler system specifications

    Science.gov (United States)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  4. A Class-Specific Optimizing Compiler

    Directory of Open Access Journals (Sweden)

    Michael D. Sharp

    1993-01-01

Class-specific optimizations are compiler optimizations specified by the class implementor to the compiler. They allow the compiler to take advantage of the semantics of the particular class so as to produce better code. Optimizations of interest include the strength reduction of class::array address calculations, elimination of large temporaries, and the placement of asynchronous send/recv calls so as to achieve computation/communication overlap. We will outline our progress towards the implementation of a C++ compiler capable of incorporating class-specific optimizations.

  5. A Note on Compiling Fortran

    Energy Technology Data Exchange (ETDEWEB)

    Busby, L. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-01

Fortran modules tend to serialize compilation of large Fortran projects, by introducing dependencies among the source files. If file A depends on file B (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran and IBM xlf, at least) offer an option to "verify syntax", with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: the first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8× speedup in the case study below.
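
The two-pass scheme described above can be driven by a small script. A hedged sketch using gfortran's -fsyntax-only option (file names are hypothetical, and a real build would order pass 1 by module dependencies and reuse the generated .mod files in pass 2):

```python
# Hedged sketch of two-pass Fortran compilation: a fast serial syntax
# pass that emits module files, then a parallel code-generation pass.
import subprocess
from concurrent.futures import ThreadPoolExecutor

sources = ["constants.f90", "solver.f90", "main.f90"]  # dependency order

# Pass 1 (serial, fast): syntax check only, producing .mod files.
for src in sources:
    subprocess.run(["gfortran", "-fsyntax-only", src], check=True)

# Pass 2 (parallel): full code generation; no inter-file ordering needed
# because all module files already exist.
def compile_object(src: str) -> None:
    subprocess.run(["gfortran", "-c", "-O2", src], check=True)

with ThreadPoolExecutor() as pool:
    list(pool.map(compile_object, sources))
```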

  6. Astronomical High Tide Line, Geographic NAD83, NWRC (1995) [hightide_line_NWRC_1995]

    Data.gov (United States)

    Louisiana Geographic Information Center — The astronomical high tide line was compiled from National Wetlands Inventory (NWI) 1:24,000-scale habitat maps that were photo-interpreted from color-infrared...

  7. An updated distribution of Solidago × niederederi (Asteraceae) in Poland

    Directory of Open Access Journals (Sweden)

    Pliszko Artur

    2017-12-01

In this paper, an updated map of the distribution of Solidago ×niederederi, a natural hybrid between S. canadensis and S. virgaurea, in Poland is presented using the ATPOL cartogram method. A compiled list of 55 localities of the hybrid within 40 cartogram units (10-km squares) is provided and its negative impact on S. virgaurea is highlighted.

  8. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendents; sending to the selected node only the compiled software to be executed by the selected node or selected node's descendent.
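
The distribution step this record describes, where each compiling node keeps its own artifacts and forwards to a child only what that child's subtree will execute, can be sketched as a tree walk. A hedged illustration; the node and artifact representations below are hypothetical:

```python
# Hedged sketch of hierarchical distribution of compiled software: keep
# what runs locally, forward to each child only the artifacts destined
# for that child or its descendants.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Node:
    name: str
    children: List["Node"] = field(default_factory=list)

    def subtree_names(self) -> set:
        names = {self.name}
        for child in self.children:
            names |= child.subtree_names()
        return names

def distribute(node: Node, artifacts: Dict[str, bytes]) -> None:
    if node.name in artifacts:
        print(f"{node.name}: keeping its own compiled software")
    for child in node.children:
        wanted = {n: a for n, a in artifacts.items()
                  if n in child.subtree_names()}
        if wanted:                      # forward only what the subtree needs
            distribute(child, wanted)

root = Node("root", [Node("io0", [Node("c0"), Node("c1")]), Node("io1")])
distribute(root, {"root": b"...", "c0": b"...", "io1": b"..."})
```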

  9. VizieR Online Data Catalog: Compilation of stellar rotation data (Kovacs, 2018)

    Science.gov (United States)

    Kovacs, G.

    2018-03-01

The three datasets, included in table1-1.dat, table1-2.dat and table1-6.dat respectively, correspond to the types of stars listed in Table 1, lines 1 [Praesepe], 2 [HJ_host] and 6 [Field(C)]. These data result from the compilation of rotational and other stellar data from the literature. (4 data files)

  10. Evaluation of HAL/S language compilability using SAMSO's Compiler Writing System (CWS)

    Science.gov (United States)

    Feliciano, M.; Anderson, H. D.; Bond, J. W., III

    1976-01-01

    NASA/Langley is engaged in a program to develop an adaptable guidance and control software concept for spacecraft such as shuttle-launched payloads. It is envisioned that this flight software be written in a higher-order language, such as HAL/S, to facilitate changes or additions. To make this adaptable software transferable to various onboard computers, a compiler writing system capability is necessary. A joint program with the Air Force Space and Missile Systems Organization was initiated to determine if the Compiler Writing System (CWS) owned by the Air Force could be utilized for this purpose. The present study explores the feasibility of including the HAL/S language constructs in CWS and the effort required to implement these constructs. This will determine the compilability of HAL/S using CWS and permit NASA/Langley to identify the HAL/S constructs desired for their applications. The study consisted of comparing the implementation of the Space Programming Language using CWS with the requirements for the implementation of HAL/S. It is the conclusion of the study that CWS already contains many of the language features of HAL/S and that it can be expanded for compiling part or all of HAL/S. It is assumed that persons reading and evaluating this report have a basic familiarity with (1) the principles of compiler construction and operation, and (2) the logical structure and applications characteristics of HAL/S and SPL.

  11. Compiler issues associated with safety-related software

    International Nuclear Information System (INIS)

    Feinauer, L.R.

    1991-01-01

A critical issue in the quality assurance of safety-related software is the ability of the software to produce identical results, independent of the host machine, operating system, or compiler version under which the software is installed. A study is performed using the VIPRE-01, FREY-01, and RETRAN-02 safety-related codes. Results from an IBM 3083 computer are compared with results from a CYBER 860 computer. All three of the computer programs examined are written in FORTRAN; the VIPRE code uses the FORTRAN 66 compiler, whereas the FREY and RETRAN codes use the FORTRAN 77 compiler. Various compiler options are studied to determine their effect on the output between machines. Since the Control Data Corporation and IBM machines inherently represent numerical data differently, methods of producing equivalent accuracy of data representation were an important focus of the study. This paper identifies particular problems in the automatic double-precision option (AUTODBL) of the IBM FORTRAN 1.4.x series of compilers. The IBM FORTRAN version 2 compilers provide much more stable, reliable compilation for engineering software. Careful selection of compilers and compiler options can help guarantee identical results between different machines. To ensure reproducibility of results, the same compiler and compiler options should be used to install the program as were used in the development and testing of the program.

  12. Compilation of nuclear decay data used for dose calculation. Revised data for radionuclides listed in ICRP Publication 38

    International Nuclear Information System (INIS)

    Endo, Akira; Yamaguchi, Yasuhiro

    2001-03-01

    New nuclear decay data used for dose calculation have been compiled for 817 radionuclides that are listed in ICRP Publication 38 (Publ. 38) and for 6 additional isomers. The decay data were prepared using decay data sets from the Evaluated Nuclear Structure Data File (ENSDF), the latest version in August 1997. Basic nuclear properties in the decay data sets that are particularly important for calculating energies and intensities of emissions were examined and updated by referring to NUBASE, the database for nuclear and decay properties of nuclides. The reviewed and updated data were half-life, decay mode and its branching ratio, spin and parity of the ground and isomeric states, excitation energy of isomers, and Q value. In addition, possible revisions of partial and incomplete decay data sets were done for their format and syntax errors, level schemes, normalization records, and so on. After that, the decay data sets were processed by EDISTR in order to compute the energies and intensities of α particles, β particles, γ rays, internal conversion electrons, X rays, and Auger electrons emitted in nuclear transformation. For spontaneously fissioning nuclides, the average energies and intensities of neutrons, fission fragments, prompt γ rays, delayed γ rays, and β particles were also calculated. The compiled data were prepared in two different types of format: Publ. 38 and NUCDECAY formats. Comparison of the compiled decay data with those in Publ. 38 was also presented. The decay data will be widely used for internal and external dose calculations in radiation protection and will be beneficial to a future revision of ICRP Publ. 38. (author)
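
    A toy sketch of the kind of bookkeeping such a compilation supports (the numbers below are invented, not taken from the data files): dose calculations need the energy emitted per nuclear transformation, i.e. the intensity-weighted sum over all emissions.

```python
# Hypothetical emission list for one nuclide: (type, yield per decay, MeV).
emissions = [
    ("beta- (mean energy)", 1.00, 0.0946),
    ("gamma",               0.85, 0.3645),
    ("K X ray",             0.05, 0.0322),
]

# Energy per transformation: sum of yield_i * E_i over all emissions.
per_decay_mev = sum(y * e for _, y, e in emissions)
print(f"energy per transformation: {per_decay_mev:.4f} MeV")
for kind, y, e in emissions:
    print(f"  {kind:20s} contributes {y * e:.4f} MeV")
```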

  13. Compilation of nuclear decay data used for dose calculation. Revised data for radionuclides listed in ICRP Publication 38

    Energy Technology Data Exchange (ETDEWEB)

    Endo, Akira; Yamaguchi, Yasuhiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-03-01

    New nuclear decay data used for dose calculation have been compiled for 817 radionuclides that are listed in ICRP Publication 38 (Publ. 38) and for 6 additional isomers. The decay data were prepared using decay data sets from the Evaluated Nuclear Structure Data File (ENSDF), the latest version in August 1997. Basic nuclear properties in the decay data sets that are particularly important for calculating energies and intensities of emissions were examined and updated by referring to NUBASE, the database for nuclear and decay properties of nuclides. The reviewed and updated data were half-life, decay mode and its branching ratio, spin and parity of the ground and isomeric states, excitation energy of isomers, and Q value. In addition, possible revisions of partial and incomplete decay data sets were done for their format and syntax errors, level schemes, normalization records, and so on. After that, the decay data sets were processed by EDISTR in order to compute the energies and intensities of α particles, β particles, γ rays, internal conversion electrons, X rays, and Auger electrons emitted in nuclear transformation. For spontaneously fissioning nuclides, the average energies and intensities of neutrons, fission fragments, prompt γ rays, delayed γ rays, and β particles were also calculated. The compiled data were prepared in two different types of format: Publ. 38 and NUCDECAY formats. Comparison of the compiled decay data with those in Publ. 38 was also presented. The decay data will be widely used for internal and external dose calculations in radiation protection and will be beneficial to a future revision of ICRP Publ. 38. (author)

  14. C to VHDL compiler

    Science.gov (United States)

    Berdychowski, Piotr P.; Zabolotny, Wojciech M.

    2010-09-01

    The main goal of the C to VHDL compiler project is to make the FPGA platform more accessible for scientists and software developers. The FPGA platform offers the unique ability to configure the hardware to implement virtually any dedicated architecture, and modern devices provide a sufficient number of hardware resources to implement parallel execution platforms with complex processing units. All this makes the FPGA platform very attractive for those looking for an efficient, heterogeneous computing environment. The current industry standard in the development of digital systems on the FPGA platform is based on HDLs. Although very effective and expressive in the hands of hardware development specialists, these languages require specific knowledge and experience, out of reach for most scientists and software programmers. The C to VHDL compiler project attempts to remedy that by creating an application that derives an initial VHDL description of a digital system (for further compilation and synthesis) from a purely algorithmic description in the C programming language. This idea itself is not new, and the C to VHDL compiler combines the best approaches from existing solutions developed over many previous years, with the introduction of some new unique improvements.

  15. COMPILATION OF LABORATORY SCALE ALUMINUM WASH AND LEACH REPORT RESULTS

    International Nuclear Information System (INIS)

    Harrington, S.J.

    2011-01-01

    This report compiles and analyzes all known wash and caustic leach laboratory studies. As further data are produced, this report will be updated. Included are aluminum mineralogical analysis results as well as a summation of the wash and leach procedures and results. Of the 177 underground storage tanks at Hanford, information was only available for five individual double-shell tanks, forty-one individual single-shell tanks (i.e., thirty-nine 100-series and two 200-series tanks), and twelve grouped tank wastes. Seven of the individual single-shell tank studies provided data for the percent of aluminum removal as a function of time for various caustic concentrations and leaching temperatures. It was determined that in most cases increased leaching temperature, caustic concentration, and leaching time lead to increased dissolution of leachable aluminum solids.
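
    The report itself tabulates measured data; purely as an illustration of the trend stated above, a first-order dissolution model with an Arrhenius rate constant (all parameters below are invented placeholders) shows how temperature, caustic concentration, and time jointly increase the dissolved fraction.

```python
import math

def fraction_dissolved(t_hr, temp_k, naoh_molar,
                       a=5.0e7, ea_j_mol=60e3, order=1.0, f_max=0.9):
    """Hypothetical model: f(t) = f_max * (1 - exp(-k t)) with
    k = a * [NaOH]^order * exp(-Ea / (R T)); not fitted to the report."""
    r_gas = 8.314
    k = a * naoh_molar**order * math.exp(-ea_j_mol / (r_gas * temp_k))
    return f_max * (1.0 - math.exp(-k * t_hr))

for temp_c, naoh in [(25, 1.0), (60, 1.0), (60, 3.0), (85, 3.0)]:
    f = fraction_dissolved(24.0, temp_c + 273.15, naoh)
    print(f"24 h at {temp_c:2d} C, {naoh:.0f} M NaOH -> {100 * f:5.1f}% dissolved")
```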

  16. COMPILATION OF LABORATORY SCALE ALUMINUM WASH AND LEACH REPORT RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    HARRINGTON SJ

    2011-01-06

    This report compiles and analyzes all known wash and caustic leach laboratory studies. As further data are produced, this report will be updated. Included are aluminum mineralogical analysis results as well as a summation of the wash and leach procedures and results. Of the 177 underground storage tanks at Hanford, information was only available for five individual double-shell tanks, forty-one individual single-shell tanks (i.e., thirty-nine 100-series and two 200-series tanks), and twelve grouped tank wastes. Seven of the individual single-shell tank studies provided data for the percent of aluminum removal as a function of time for various caustic concentrations and leaching temperatures. It was determined that in most cases increased leaching temperature, caustic concentration, and leaching time lead to increased dissolution of leachable aluminum solids.

  17. HAL/S-FC compiler system functional specification

    Science.gov (United States)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package, as well as the restrictions and dependencies of the HAL/S-FC system, are also considered.

  18. The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States

    Science.gov (United States)

    Horton, John D.; San Juan, Carma A.; Stoeser, Douglas B.

    2017-06-30

    The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States (https://doi.org/10.5066/F7WH2N65) represents a seamless, spatial database of 48 State geologic maps that range from 1:50,000 to 1:1,000,000 scale. A national digital geologic map database is essential in interpreting other datasets that support numerous types of national-scale studies and assessments, such as those that provide geochemistry, remote sensing, or geophysical data. The SGMC is a compilation of the individual U.S. Geological Survey releases of the Preliminary Integrated Geologic Map Databases for the United States. The SGMC geodatabase also contains updated data for seven States and seven entirely new State geologic maps that have been added since the preliminary databases were published. Numerous errors have been corrected and enhancements added to the preliminary datasets using thorough quality assurance/quality control procedures. The SGMC is not a truly integrated geologic map database because geologic units have not been reconciled across State boundaries. However, the geologic data contained in each State geologic map have been standardized to allow spatial analyses of lithology, age, and stratigraphy at a national scale.
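
    A minimal sketch of the kind of national-scale attribute query the standardized lithology fields enable; it assumes geopandas with a driver that can read file geodatabases, and the layer and field names used here ("SGMC_Geology", "ROCKTYPE1", "UNIT_NAME", "UNIT_AGE") are assumptions to be checked against the released schema.

```python
import geopandas as gpd

# Read one feature class from the geodatabase (names are assumptions).
gdf = gpd.read_file("USGS_SGMC_Geodatabase.gdb", layer="SGMC_Geology")

# Because lithology is standardized, a single query spans all 48 States.
basalt = gdf[gdf["ROCKTYPE1"].str.contains("basalt", case=False, na=False)]
print(len(basalt), "polygons with basalt as primary lithology")
print(basalt[["UNIT_NAME", "UNIT_AGE"]].head())
```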

  19. SPARQL compiler for Bobox

    OpenAIRE

    Čermák, Miroslav

    2013-01-01

    The goal of this work is to design and implement a SPARQL compiler for the Bobox system. In addition to the lexical and syntactic analysis corresponding to the W3C standard for the SPARQL language, it performs semantic analysis and optimization of queries. The compiler constructs an appropriate model for execution in Bobox that depends on the physical database schema.
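
    The Bobox compiler itself is not shown in this record, but the front-end stages it describes — lexical and syntactic analysis of W3C-standard SPARQL followed by translation to an algebra suitable for an execution model — can be illustrated with rdflib:

```python
from rdflib.plugins.sparql import prepareQuery
from rdflib.plugins.sparql.algebra import pprintAlgebra

# Parsing plus semantic analysis: the query text becomes SPARQL algebra,
# the natural input for back-end model construction and optimization.
q = prepareQuery("""
    SELECT ?name WHERE {
        ?person <http://xmlns.com/foaf/0.1/name> ?name .
    } LIMIT 10
""")
pprintAlgebra(q)   # prints the operator tree (Project / Slice / BGP ...)
```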

  20. A compiled checklist of seaweeds of Sudanese Red Sea coast

    Directory of Open Access Journals (Sweden)

    Nahid Abdel Rahim Osman

    2016-02-01

    Objective: To present an updated and compiled checklist of Sudanese seaweeds, as an example for the region, for conservational as well as developmental purposes. Methods: The checklist was developed based on both field investigations, using the line transect method at 4 sites along the Red Sea coast of Sudan, and a review of the available studies on Sudanese seaweeds. Results: In total, 114 macroalgal names were recorded, distributed in 16 orders, 34 families, and 62 genera. The Rhodophyceae macroalgae comprised 8 orders, 17 families, 32 genera and 47 species. The Phaeophyceae macroalgae comprised 4 orders, 5 families, 17 genera, and 28 species. The 39 species of the Chlorophyceae macroalgae belong to 2 classes, 4 orders, 12 families, and 14 genera. The present paper proposes the addition of 11 macroalgal taxa to the Sudan seaweeds species list: 3 red seaweed species, 1 brown seaweed species and 7 green seaweed species. Conclusions: This list is not yet exhaustive and only represents the macroalgal species common to the intertidal areas of the Sudan Red Sea coast. Further investigation may reveal the presence of more species. While significant levels of diversity and endemism have been revealed for other groups of organisms in the Red Sea region, similar work still has to be performed for seaweeds. Considering the impact of climate change on community structure and composition, and the growing risk from maritime transportation through the Red Sea, particularly from oil tankers, as well as from oil exploration, baseline data on seaweeds are highly required for management purposes.

  1. CLO : The cell line ontology

    NARCIS (Netherlands)

    Sarntivijai, Sirarat; Lin, Yu; Xiang, Zuoshuang; Meehan, Terrence F.; Diehl, Alexander D.; Vempati, Uma D.; Schuerer, Stephan C.; Pang, Chao; Malone, James; Parkinson, Helen; Liu, Yue; Takatsuki, Terue; Saijo, Kaoru; Masuya, Hiroshi; Nakamura, Yukio; Brush, Matthew H.; Haendel, Melissa A.; Zheng, Jie; Stoeckert, Christian J.; Peters, Bjoern; Mungall, Christopher J.; Carey, Thomas E.; States, David J.; Athey, Brian D.; He, Yongqun

    2014-01-01

    Background: Cell lines have been widely used in biomedical research. The community-based Cell Line Ontology (CLO) is a member of the OBO Foundry library that covers the domain of cell lines. Since its publication two years ago, significant updates have been made, including new groups joining the CLO

  2. Algorithmic synthesis using Python compiler

    Science.gov (United States)

    Cieszewski, Radoslaw; Romaniuk, Ryszard; Pozniak, Krzysztof; Linczuk, Maciej

    2015-09-01

    This paper presents a Python-to-VHDL compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and translates it to VHDL. FPGA combines many benefits of both software and ASIC implementations. Like software, the programmed circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. This can be achieved by using many computational resources at the same time. Creating parallel programs implemented in FPGAs in pure HDL is difficult and time consuming. By using a higher level of abstraction and a high-level synthesis compiler, implementation time can be reduced. The compiler has been implemented using the Python language. This article describes the design, implementation and results of the created tools.
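
    The paper's own compiler is not reproduced in this record; MyHDL, an existing open-source package, illustrates the same flow of deriving VHDL from an algorithmic Python description:

```python
from myhdl import Signal, block, always_comb, intbv

@block
def adder(a, b, s):
    @always_comb
    def logic():
        s.next = a + b          # algorithmic description of the behavior
    return logic

a = Signal(intbv(0)[8:])        # 8-bit operands
b = Signal(intbv(0)[8:])
s = Signal(intbv(0)[9:])        # 9 bits so the carry fits

inst = adder(a, b, s)
inst.convert(hdl="VHDL")        # emits adder.vhd for synthesis tools
```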

  3. Update on the development of cotton gin PM10 emission factors for EPA's AP-42

    Science.gov (United States)

    A cotton ginning industry-supported project was initiated in 2008 to update the U.S. Environmental Protection Agency’s (EPA) Compilation of Air Pollution Emission Factors (AP-42) to include PM10 emission factors. This study develops emission factors from the PM10 emission factor data collected from ...

  4. Bibliography on electron transfer processes in ion-ion/atom/molecule collisions (updated 1993)

    International Nuclear Information System (INIS)

    Tawara, H.

    1993-04-01

    Following our previous compilations [IPPJ-AM-45 (1986), NIFS-DATA-7 (1990)], bibliographic information on experimental and theoretical studies on electron transfer processes in ion-ion/atom/molecule collisions is updated. The references published during 1980-1992 are included. For easy retrieval of references for a particular combination of collision partners, a simple list is also provided. (author) 1542 refs

  5. Compiling a 50-year journey

    DEFF Research Database (Denmark)

    Hutton, Graham; Bahr, Patrick

    2017-01-01

    Fifty years ago, John McCarthy and James Painter published the first paper on compiler verification, in which they showed how to formally prove the correctness of a compiler that translates arithmetic expressions into code for a register-based machine. In this article, we revisit this example...
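
    The flavor of that first correctness result can be captured in a few lines: compile arithmetic expressions to code for a machine with an accumulator and numbered registers, and check the compiler against the interpreter (an executable check, of course, not the formal proof).

```python
def interpret(e, env):
    op = e[0]
    if op == "const": return e[1]
    if op == "var":   return env[e[1]]
    if op == "add":   return interpret(e[1], env) + interpret(e[2], env)

def compile_expr(e, t):
    """Emit code leaving the value in the accumulator; register t is the
    first free register, used to spill the left operand of an addition."""
    op = e[0]
    if op == "const": return [("LI", e[1])]       # load immediate
    if op == "var":   return [("LOAD", e[1])]     # load variable
    if op == "add":
        return (compile_expr(e[1], t) + [("STO", t)]
                + compile_expr(e[2], t + 1) + [("ADD", t)])

def run(code, env):
    acc, regs = 0, {}
    for instr, arg in code:
        if instr == "LI":   acc = arg
        if instr == "LOAD": acc = env[arg]
        if instr == "STO":  regs[arg] = acc
        if instr == "ADD":  acc += regs[arg]
    return acc

e = ("add", ("var", "x"), ("add", ("const", 3), ("var", "y")))
env = {"x": 10, "y": 4}
assert interpret(e, env) == run(compile_expr(e, 0), env) == 17
```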

  6. GRESS, FORTRAN Pre-compiler with Differentiation Enhancement

    International Nuclear Information System (INIS)

    1999-01-01

    1 - Description of program or function: The GRESS FORTRAN pre-compiler (SYMG) and run-time library are used to enhance conventional FORTRAN-77 programs with analytic differentiation of arithmetic statements for automatic differentiation in either forward or reverse mode. GRESS 3.0 is functionally equivalent to GRESS 2.1. GRESS 2.1 is an improved and updated version of the previously released GRESS 1.1. Improvements in the implementation of the CHAIN option have resulted in a 70 to 85% reduction in execution time and up to a 50% reduction in memory required for forward chaining applications. 2 - Method of solution: GRESS uses a pre-compiler to analyze FORTRAN statements and determine the mathematical operations embodied in them. As each arithmetic assignment statement in a program is analyzed, SYMG generates the partial derivatives of the term on the left with respect to each floating-point variable on the right. The result of the pre-compilation step is a new FORTRAN program that can produce derivatives for any REAL (i.e., single or double precision) variable calculated by the model. Consequently, GRESS enhances FORTRAN programs or subprograms by adding the calculation of derivatives along with the original output. Derivatives from a GRESS-enhanced model can be used internally (e.g., iteration acceleration) or externally (e.g., sensitivity studies). By calling GRESS run-time routines, derivatives can be propagated through the code via the chain rule (referred to as the CHAIN option) or accumulated to create an adjoint matrix (referred to as the ADGEN option). A third option, GENSUB, makes it possible to process a subset of a program (i.e., a do loop, subroutine, function, a sequence of subroutines, or a whole program) for calculating derivatives of dependent variables with respect to independent variables. A code enhanced with the GENSUB option can use forward mode, reverse mode, or a hybrid of the two modes. 3 - Restrictions on the complexity of the problem: GRESS
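
    A compact way to see what the pre-compiler automates: forward-mode differentiation propagates a derivative through every arithmetic operation by the chain rule. The dual-number sketch below (plain Python, not GRESS) performs the same statement-level propagation that SYMG adds to FORTRAN code.

```python
import math

class Dual:
    """Value together with its derivative; arithmetic applies the chain rule."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = self._lift(o)
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)   # product rule
    __rmul__ = __mul__

def sin(x):
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

x = Dual(1.3, 1.0)            # seed dx/dx = 1 (forward mode)
y = x * sin(x) + 2 * x        # the "model": y = x sin x + 2x
print(y.val, y.der)
assert abs(y.der - (math.sin(1.3) + 1.3 * math.cos(1.3) + 2.0)) < 1e-12
```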

  7. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    Business files compilation for an enterprise is a distillation and re-creation of its intellectual wealth, through which the applicable information is made available, quickly, extensively and precisely, to those who want to use it. Proceeding from the effects of business files compilation on scientific research, productive construction and development, this paper discusses in five points how to define topics, analyze historical materials, search for and select data, and process them into an enterprise archives collection. Firstly, it expounds the importance and necessity of business files compilation in the production, operation and development of a company; secondly, it presents processing methods from topic definition, material searching and data selection to final examination and correction; thirdly, it defines principles and classifications so that different categories and levels of processing methods are available for business files compilation; fourthly, it discusses the specific method of implementing a file compilation through a documentation collection, on the principle that topic definition should gear with demand; fifthly, it addresses the application of information technology to business files compilation, in view of the wide need for business files, so as to level up enterprise archives management. The present discussion focuses on the examination and correction principles of enterprise historical material compilation, the basic classifications, and the major forms of business files compilation achievements. (author)

  8. Encounters of aircraft with volcanic ash clouds; A compilation of known incidents, 1953-2009

    Science.gov (United States)

    Guffanti, Marianne; Casadevall, Thomas J.; Budding, Karin

    2010-01-01

    Information about reported encounters of aircraft with volcanic ash clouds from 1953 through 2009 has been compiled to document the nature and scope of risks to aviation from volcanic activity. The information, gleaned from a variety of published and other sources, is presented in database and spreadsheet formats; the compilation will be updated as additional encounters occur and as new data and corrections come to light. The effects observed by flight crews and extent of aircraft damage vary greatly among incidents, and each incident in the compilation is rated according to a severity index. Of the 129 reported incidents, 94 incidents are confirmed ash encounters, with 79 of those having various degrees of airframe or engine damage; 20 are low-severity events that involve suspected ash or gas clouds; and 15 have data that are insufficient to assess severity. Twenty-six of the damaging encounters involved significant to very severe damage to engines and (or) airframes, including nine encounters with engine shutdown during flight. The average annual rate of damaging encounters since 1976, when reporting picked up, has been approximately 2 per year. Most of the damaging encounters occurred within 24 hours of the onset of ash production or at distances less than 1,000 kilometers from the source volcanoes. The compilation covers only events of relatively short duration for which aircraft were checked for damage soon thereafter; documenting instances of long-term repeated exposure to ash (or sulfate aerosols) will require further investigation. Of 38 source volcanoes, 8 have caused 5 or more encounters, of which the majority were damaging: Augustine (United States), Chaiten (Chile), Mount St. Helens (United States), Pacaya (Guatemala), Pinatubo (Philippines), Redoubt (United States), Sakura-jima (Japan), and Soufriere Hills (Montserrat, Lesser Antilles, United Kingdom). Aircraft have been damaged by eruptions ranging from small, recurring episodes to very large

  9. Updated database plus software for line-mixing in CO2 infrared spectra and their test using laboratory spectra in the 1.5-2.3 μm region

    International Nuclear Information System (INIS)

    Lamouroux, J.; Tran, H.; Laraia, A.L.; Gamache, R.R.; Rothman, L.S.; Gordon, I.E.; Hartmann, J.-M.

    2010-01-01

    In a previous series of papers, a model for the calculation of CO2-air absorption coefficients taking line-mixing into account and the corresponding database/software package were described and widely tested. In this study, we present an update of this package, based on the 2008 version of HITRAN, the latest currently available. The spectroscopic data for the seven most-abundant isotopologues are taken from HITRAN. When the HITRAN data are not complete up to J''=70, the data files are augmented with spectroscopic parameters from the CDSD-296 database and the high-temperature CDSD-1000 if necessary. Previously missing spectroscopic parameters, the air-induced pressure shifts and CO2 line-broadening coefficients with H2O, have been added. The quality of this new database is demonstrated by comparisons of calculated absorptions and measurements using CO2 high-pressure laboratory spectra in the 1.5-2.3 μm region. The influence of the imperfections and inaccuracies of the spectroscopic parameters from the 2000 version of HITRAN is clearly shown as a big improvement of the residuals is observed by using the new database. The very good agreements between calculated and measured absorption coefficients confirm the necessity of the update presented here and further demonstrate the importance of line-mixing effects, especially for the high pressures investigated here. The application of the updated database/software package to atmospheric spectra should result in an increased accuracy in the retrieval of CO2 atmospheric amounts. This opens improved perspectives for the space-borne detection of carbon dioxide sources and sinks.
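
    A schematic of how first-order line mixing enters such calculations (toy line parameters, not values from the database): each line keeps its Lorentzian core but gains an asymmetric term proportional to a mixing coefficient Y, following the Rosenkranz first-order approximation.

```python
import numpy as np

def absorption(nu, lines, p_atm=1.0):
    """nu: wavenumber grid (cm-1); lines: (center, intensity,
    halfwidth per atm, first-order mixing coefficient Y per atm)."""
    alpha = np.zeros_like(nu)
    for nu0, s, gamma0, y in lines:
        g = gamma0 * p_atm
        d = nu - nu0
        # Lorentzian plus the first-order (Rosenkranz) line-mixing term
        alpha += s / np.pi * (g + y * p_atm * d) / (d**2 + g**2)
    return alpha

nu = np.linspace(4980.0, 5020.0, 2000)
toy = [(4995.0, 1.0, 0.07, 0.02), (5005.0, 0.8, 0.07, -0.02)]
# At high pressure the mixing term visibly redistributes intensity.
print(absorption(nu, toy, p_atm=10.0).max())
```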

  10. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1988-01-01

    Reliability data are an essential part of probabilistic safety assessment. The quality of the data can determine the quality of the study as a whole. It is obvious that component failure data originating from the plant being analyzed would be most appropriate. However, in few cases is complete reliance on plant experience possible, mainly because of the rather limited operating experience. Nuclear plants, although of different design, often use fairly similar components, so some of the experience can be combined and transferred from one plant to another. In addition, information about component failures is also available from experts with knowledge of component design, manufacturing and operation. That brings us to the importance of assessing generic data. (Here, 'generic' means everything that is not plant specific with regard to the plant being analyzed.) The generic data available in the open literature can be divided into three broad categories. The first includes databases used in previous analyses. These can be plant specific or updated from generic data with plant-specific information (the latter case deserves special attention). The second is based on compilations of plants' operating experience, usually drawing on some kind of event reporting system. The third category includes data sources based on expert opinions (single or aggregate) or a combination of expert opinions and other nuclear and non-nuclear experience. This paper reflects insights gained in compiling data from generic data sources and highlights advantages and pitfalls of using generic component reliability data in PSAs.
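
    A minimal sketch of the standard route from generic to plant-specific data alluded to above: a gamma prior on the component failure rate, taken from a generic source, is updated with plant-specific failures and operating hours (conjugate gamma-Poisson update; all numbers are illustrative).

```python
# Generic prior: gamma(alpha0, beta0) with mean alpha0/beta0 (per hour).
alpha0, beta0 = 2.0, 4.0e5      # assumed generic estimate: 5.0e-6 /h
n_fail, hours = 3, 2.0e5        # plant-specific evidence (illustrative)

alpha_post = alpha0 + n_fail    # conjugate update with Poisson evidence
beta_post = beta0 + hours

print(f"prior mean     {alpha0 / beta0:.2e} per hour")
print(f"posterior mean {alpha_post / beta_post:.2e} per hour")
# With little plant experience the posterior stays near the generic prior;
# as operating hours accumulate, plant-specific data dominate.
```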

  11. Light-quark, heavy-quark systems: An update

    International Nuclear Information System (INIS)

    Grinstein, B.

    1993-01-01

    The author reviews many of the recently developed applications of Heavy Quark Effective Theory techniques. After a brief update on Luke's theorem, he describes striking relations between heavy baryon form factors, and how to use them to estimate the accuracy of the extraction of |V_cb|. He discusses factorization and compares with experiment. An elementary presentation, with sample applications, of reparametrization invariance comes next. The final and most extensive chapter in this review deals with phenomenological lagrangians that incorporate heavy-quark spin-flavor as well as light-quark chiral symmetries. He compiles many interesting results and discusses the validity of the calculations.

  12. Proving correctness of compilers using structured graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    it into a compiler implementation using a graph type along with a correctness proof. The implementation and correctness proof of a compiler using a tree type without explicit jumps is simple, but yields code duplication. Our method provides a convenient way of improving such a compiler without giving up the benefits...

  13. On-line Bayesian model updating for structural health monitoring

    Science.gov (United States)

    Rocchetta, Roberto; Broggi, Matteo; Huchet, Quentin; Patelli, Edoardo

    2018-03-01

    Fatigue-induced cracking is a dangerous failure mechanism which affects mechanical components subject to alternating load cycles. System health monitoring should be adopted to identify cracks which can jeopardise the structure. Real-time damage detection may fail to identify cracks due to different sources of uncertainty which have been poorly assessed or even fully neglected. In this paper, a novel, efficient and robust procedure is used for the detection of crack locations and lengths in mechanical components. A Bayesian model updating framework is employed, which allows accounting for relevant sources of uncertainty. The idea underpinning the approach is to identify the most probable crack consistent with the experimental measurements. To tackle the computational cost of the Bayesian approach, an emulator is adopted to replace the computationally costly finite element model. To improve the overall robustness of the procedure, different numerical likelihoods, measurement noises and imprecisions in the values of model parameters are analysed and their effects quantified. The accuracy of the stochastic updating and the efficiency of the numerical procedure are discussed. An experimental aluminium frame and a numerical model of a typical car suspension arm are used to demonstrate the applicability of the approach.
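
    A minimal sketch of the updating loop (every model and number below is invented for illustration): an emulator stands in for the finite element model, mapping crack length to a predicted natural frequency, and random-walk Metropolis sampling yields the posterior over crack length given one noisy measurement.

```python
import numpy as np

rng = np.random.default_rng(0)

def emulator(crack_mm):
    # placeholder for the trained surrogate of the FE model (Hz)
    return 120.0 - 0.8 * crack_mm - 0.01 * crack_mm**2

measured, sigma = 110.5, 0.5          # measurement and its noise std dev

def log_post(c):
    if not 0.0 <= c <= 50.0:          # uniform prior on crack length (mm)
        return -np.inf
    return -0.5 * ((measured - emulator(c)) / sigma) ** 2

c, lp, samples = 10.0, log_post(10.0), []
for _ in range(20_000):
    c_new = c + rng.normal(0.0, 1.0)  # random-walk proposal
    lp_new = log_post(c_new)
    if np.log(rng.random()) < lp_new - lp:
        c, lp = c_new, lp_new
    samples.append(c)

post = np.array(samples[5_000:])      # discard burn-in
print(f"crack length ~ {post.mean():.1f} +/- {post.std():.1f} mm")
```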

  14. 12 CFR 411.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    12 CFR 411.600 (2010 edition): Banks and Banking - Export-Import Bank of the United States - New Restrictions on Lobbying - Agency Reports. § 411.600 Semi-annual compilation. (a) The head of each agency shall collect and compile the...

  15. ALGOL compiler. Syntax and semantic analysis

    International Nuclear Information System (INIS)

    Tarbouriech, Robert

    1971-01-01

    In this research thesis, the author reports the development of an ALGOL compiler which performs the following main tasks: systematic scanning of the source programme to recognise its different components (identifiers, reserved words, constants, separators), analysis of the source programme structure to build up its statements and arithmetic expressions, processing of symbolic names (identifiers) to associate them with the values they represent, and memory allocation for data and programme. Several issues are thus addressed: the characteristics of the machine for which the compiler is developed, the exact definition of the language (grammar, identifier and constant formation), the syntax processing programme that provides the compiler with the necessary elements (language vocabulary, precedence matrix), and a description of the first two phases of compilation: lexicographic analysis and syntax analysis. The last phase (machine-code generation) is not addressed.
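
    The first phase described above, lexicographic analysis, fits in a few lines; the sketch below (Python rather than the thesis's implementation language) classifies source text into reserved words, identifiers, constants and separators.

```python
import re

RESERVED = {"begin", "end", "if", "then", "else", "integer", "real"}

TOKEN_RE = re.compile(r"""
      (?P<number> \d+(?:\.\d+)? )       # numeric constants
    | (?P<name>   [A-Za-z]\w*     )     # reserved words and identifiers
    | (?P<sep>    :=|[;+\-*/()<>,=:] )  # separators and operators
    | (?P<skip>   \s+             )
""", re.VERBOSE)

def tokenize(src):
    for m in TOKEN_RE.finditer(src):
        if m.lastgroup == "skip":
            continue
        if m.lastgroup == "name":
            kind = "reserved" if m.group() in RESERVED else "identifier"
        else:
            kind = {"number": "constant", "sep": "separator"}[m.lastgroup]
        yield kind, m.group()

for token in tokenize("begin x := x + 3.14; end"):
    print(token)
```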

  16. Assessment of the current status of basic nuclear data compilations

    International Nuclear Information System (INIS)

    Riemer, R.L.

    1992-01-01

    The Panel on Basic Nuclear Data Compilations believes that it is important to provide the user with an evaluated nuclear database of the highest quality, dependability, and currency. It is also important that the evaluated nuclear data are easily accessible to the user. In the past the panel concentrated its concern on the cycle time for the publication of A-chain evaluations. However, the panel now recognizes that publication cycle time is no longer the appropriate goal. Sometime in the future, publication of the evaluated A-chains will evolve from the present hard-copy Nuclear Data Sheets on library shelves to purely electronic publication, with the advent of universal access to terminals and the nuclear databases. Therefore, the literature cut-off date in the Evaluated Nuclear Structure Data File (ENSDF) is rapidly becoming the only important measure of the currency of an evaluated A-chain. Also, it has become exceedingly important to ensure that access to the databases is as user-friendly as possible and to enable electronic publication of the evaluated data files. Considerable progress has been made in these areas: use of the on-line systems has almost doubled in the past year, and there has been initial development of tools for electronic evaluation, publication, and dissemination. Currently, the nuclear data effort is in transition between the traditional and future methods of dissemination of the evaluated data. Also, many of the factors that adversely affect the publication cycle time simultaneously affect the currency of the evaluated nuclear database. Therefore, the panel continues to examine factors that can influence cycle time: the number of evaluators, the frequency with which an evaluation can be updated, the review of the evaluation, and the production of the evaluation, which currently exists as a hard-copy issue of Nuclear Data Sheets

  17. Map updates in a dynamic Voronoi data structure

    DEFF Research Database (Denmark)

    Mioc, Darka; Antón Castro, Francesc/François; Gold, C. M.

    2006-01-01

    In this paper we use local and sequential map updates in the Voronoi data structure, which allows us to automatically record each event and the performed map updates within the system. These map updates are executed through map construction commands that are composed of atomic actions (geometric algorithms for the addition, deletion, and motion of spatial objects) on the dynamic Voronoi data structure. The formalization of map commands led to the development of a spatial language comprising a set of atomic operations or constructs on spatial primitives (points and lines), powerful enough to define
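
    SciPy does not expose a dynamic Voronoi diagram, but its incremental Delaunay triangulation (the Voronoi dual) gives a runnable feel for the local, sequential insertions from which such map-construction commands are composed:

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)
tri = Delaunay(rng.random((10, 2)), incremental=True)
print("simplices before update:", len(tri.simplices))

tri.add_points(rng.random((5, 2)))   # sequential update: insert new points
print("simplices after update: ", len(tri.simplices))
tri.close()                          # finalize the incremental structure
```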

  18. Comprehensive T-matrix Reference Database: A 2009-2011 Update

    Science.gov (United States)

    Zakharova, Nadezhda T.; Videen, G.; Khlebtsov, Nikolai G.

    2012-01-01

    The T-matrix method is one of the most versatile and efficient theoretical techniques widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper presents an update to the comprehensive database of peer-reviewed T-matrix publications compiled by us previously and includes the publications that appeared since 2009. It also lists several earlier publications not included in the original database.

  19. Comprehensive T-Matrix Reference Database: A 2007-2009 Update

    Science.gov (United States)

    Mishchenko, Michael I.; Zakharova, Nadia T.; Videen, Gorden; Khlebtsov, Nikolai G.; Wriedt, Thomas

    2010-01-01

    The T-matrix method is among the most versatile, efficient, and widely used theoretical techniques for the numerically exact computation of electromagnetic scattering by homogeneous and composite particles, clusters of particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper presents an update to the comprehensive database of T-matrix publications compiled by us previously and includes the publications that appeared since 2007. It also lists several earlier publications not included in the original database.

  20. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-10-01

    This is the third issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section approximately every six months. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations

  1. Supporting Documentation for the 2008 Update to the Insulation Fact Sheet

    Energy Technology Data Exchange (ETDEWEB)

    Stovall, Therese K [ORNL

    2008-02-01

    The Insulation Fact Sheet provides consumers with general guidance on recommended insulation levels for their homes. This fact sheet has been on-line since 1995, and this update addresses new insulation materials as well as updated costs for energy and materials.

  2. Infrared spectral line parameters of HBr and DBr at elevated temperatures

    International Nuclear Information System (INIS)

    Stocker, R.N.; Goldman, A.

    1976-01-01

    The electric dipole matrix elements for pure rotation and vibration-rotation transitions, with |m| ≤ 40 and v ≤ v' ≤ 6, have been derived for HBr and DBr by using the Rydberg-Klein-Rees (RKR) potentials and numerical solutions of the Schroedinger equation. An improved dipole-moment expansion was determined by fitting these matrix elements to the available experimental data on line intensities. A least-squares analysis of the available line-position constants gave an improved set of Dunham coefficients good for spectral lines with both large and small quantum numbers v and J. The results were then used to generate a compilation of individual line parameters for the Δv = 1 bands of HBr and DBr at temperatures up to 3000 K. These parameters, together with previously compiled line parameters for HCl, HF, CO and NO, are being used for line-by-line calculations of radiance from a hot source as seen through an atmospheric path. (author)

  3. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of different airborne surveys of different sizes, resolutions and years. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of the investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for e.g. oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand a regional compilation of magnetic data and the merging of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still little recognized. The maximum wavelength that can be resolved in each individual survey is directly related to the survey size, and consequently a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and to homogenise the longer wavelengths, a first-order approach is to combine satellite magnetic data with the airborne compilation, which is sufficient only under particular preconditions. A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated

  4. UPEML, Computer Independent Emulator of CDC Update Utility

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: UPEML is a machine-portable CDC UPDATE emulation program. It is capable of emulating a significant subset of the standard CDC UPDATE functions, including program library creation and subsequent modification. 2 - Method of solution: UPEML was originally written to facilitate the use of CDC-based scientific packages on alternate computers. In addition to supporting computers such as the VAX/VMS, IBM, and CRAY/COS, Version 3.0 now supports UNIX workstations and the CRAY/UNICOS operating system. Several program bugs have been corrected in Version 3.0. Version 3.0 has several new features including 1) improved error checking, 2) the ability to use *ADDFILE and READ from nested files, 3) creation of compile file on creation, 4) allows identifiers to begin with numbers, and 5) ability to control warning messages and program termination on error conditions. 3 - Restrictions on the complexity of the problem: None noted

  5. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-03-01

    This is the second issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations. It excludes references to 'mass-chain' evaluations normally published in 'Nuclear Data Sheets' and 'Nuclear Physics'. The material contained in this compilation is sorted according to eight subject categories: general compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes: half-lives, energies and spectra; nuclear decay processes: gamma-rays; nuclear decay processes: fission products; nuclear decay processes: (others); atomic processes

  6. CAPS OpenACC Compilers: Performance and Portability

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    The announcement in late 2011 of the new OpenACC directive-based programming standard, supported by the CAPS, CRAY and PGI compilers, has opened the door to more scientific applications that can be ported to many-core systems. Following a porting methodology, this talk will first review the principles of programming with OpenACC and then the advanced features available in the CAPS compilers to further optimize OpenACC applications: library integration and tuning directives with auto-tune mechanisms to build applications adaptive to different GPUs. CAPS compilers use hardware vendors' backends such as NVIDIA CUDA and OpenCL, making them the only OpenACC compilers supporting various many-core architectures. About the speaker: Stéphane Bihan is co-founder and currently Director of Sales and Marketing at CAPS enterprise. He has held several R&D positions in companies such as ARC International plc in London, Canon Research Center France, ACE compiler experts in Amsterdam and the INRIA r...

  7. Light-quark, heavy-quark systems: An update

    Science.gov (United States)

    Grinstein, B.

    1993-06-01

    We review many of the recently developed applications of Heavy Quark Effective Theory techniques. After a brief update on Luke's theorem, we describe striking relations between heavy baryon form factors, and how to use them to estimate the accuracy of the extraction of |V_cb|. We discuss factorization and compare with experiment. An elementary presentation, with sample applications, of reparametrization invariance comes next. The final and most extensive chapter in this review deals with phenomenological lagrangians that incorporate heavy-quark spin-flavor as well as light quark chiral symmetries. We compile many interesting results and discuss the validity of the calculations.

  8. Compilation of reactor physics data of the year 1984, AVR reactor

    International Nuclear Information System (INIS)

    Werner, H.; Bergerfurth, A.; Thomas, F.; Geskes, B.

    1985-12-01

    The 'AVR reactor physics data' is a documentation published once a year, the data presented being obtained by a simulation of reactor operation using the AVR-80 numerical model. This model is constantly updated and improved in response to new results and developments in the field of reactor theory and thermohydraulics, and in response to theoretical or practical modifications of reactor operation or in the computer system. The large variety of measured data available in the AVR reactor simulation system also makes it an ideal testing system for verification of the computing programs presented in the compilation. A survey of the history of operations in 1984 and a short explanation of the computerized simulation methods are followed by tables and graphs that serve as a source of topical data for readers interested in the physics of high-temperature pebble-bed reactors. (orig./HP) [de

  9. Compilation of reactor physics data of the year 1983, AVR reactor

    International Nuclear Information System (INIS)

    Werner, H.; Bergerfurth, A.; Thomas, F.; Geskes, B.

    1985-06-01

    The 'AVR reactor physics data' is a documentation published once a year, the data presented being obtained by a simulation of reactor operation using the AVR-80 numerical model. This model is constantly updated and improved in response to new results and developments in the field of reactor theory and thermohydraulics, and in response to theoretical or practical modifications of reactor operation or in the computer system. The large variety of measured data available in the AVR reactor simulation system also makes it an ideal testing system for verification of the computing programs presented in the compilation. A survey of the history of operations in 1983 and a short explanation of the computerized simulation methods are followed by tables and graphs that serve as a source of topical data for readers interested in the physics of high-temperature pebble-bed reactors. (orig./HP) [de

  10. HAL/S-360 compiler test activity report

    Science.gov (United States)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  11. Bibliography on electron transfer processes in ion-ion/atom/molecule collisions. Updated 1997

    International Nuclear Information System (INIS)

    Tawara, H.

    1997-04-01

    Following our previous compilations (IPPJ-AM-45 (1986), NIFS-DATA-7 (1990), NIFS-DATA-20 (1993)), bibliographic information on experimental and theoretical studies on electron transfer processes in ion-ion/atom/molecule collisions is updated. The references published during 1954-1996 are listed in order of publication year. For easy retrieval of references for a combination of collision partners, a simple list is provided. (author)

  12. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

    This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. FPGA combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation and first results of the created Python-based compiler.

  13. Update of the Linac4-PSB Transfer Line

    CERN Document Server

    HEIN, Lutz

    2010-01-01

    The installation of Linac4 represents the first step of the upgrade plans for the CERN accelerator complex, intended to raise the available proton flux and attain, among other goals, the LHC ultimate luminosity. This linac is capable of accelerating H- ions from 45 keV to 160 MeV, which will be injected into the Proton Synchrotron Booster (PSB). The increase of energy from 50 MeV (Linac2) to 160 MeV (Linac4) allows the space-charge limitations at PSB injection to be overcome; these are the main bottleneck towards higher beam brightness in the downstream accelerator chain. In order to preserve beam quality from the outlet of Linac4 to PSB injection, the design of the transfer line becomes crucial. As the location of Linac4 was chosen in view of upgrade scenarios, the construction of a new transfer line is foreseen, see ref.[1] and ref.[2]. Here part of the Linac2-PSB transfer line will be re-used. In the new part of the transfer line the beam is horizontally and vertically adjusted towards the bending magnet B...

  14. Automatic Parallelization An Overview of Fundamental Compiler Techniques

    CERN Document Server

    Midkiff, Samuel P

    2012-01-01

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and
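
    The core question those analyses answer can be shown in two loops (an illustration, not an excerpt from the book): the first carries a dependence from one iteration to the next and cannot be naively parallelized; the second has independent iterations and is safe to run in parallel or vectorize.

```python
import numpy as np

n = 8
b = np.arange(n, dtype=float)

a = np.zeros(n)
for i in range(1, n):
    a[i] = a[i - 1] + b[i]    # loop-carried dependence: reads a[i-1]
                              # written by the previous iteration

c_loop = np.zeros(n)
for i in range(n):
    c_loop[i] = 2.0 * b[i]    # iterations touch disjoint data

c_vec = 2.0 * b               # the safe parallel/vector form
assert np.allclose(c_loop, c_vec)
print(a, c_vec)
```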

  15. A software methodology for compiling quantum programs

    Science.gov (United States)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.
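
    A hedged sketch of one layer of such a stack (not the authors' actual toolchain): a rule-based pass rewrites gates that are not in the hardware-native set, here expanding a Toffoli into the standard 15-gate H/T/T-dagger/CNOT decomposition.

```python
NATIVE = {"H", "T", "Tdg", "CNOT"}

# Rewrite rule: standard Toffoli decomposition, written for qubits (0, 1, 2).
RULES = {
    "TOFFOLI": [("H", 2), ("CNOT", 1, 2), ("Tdg", 2), ("CNOT", 0, 2),
                ("T", 2), ("CNOT", 1, 2), ("Tdg", 2), ("CNOT", 0, 2),
                ("T", 1), ("T", 2), ("H", 2), ("CNOT", 0, 1),
                ("T", 0), ("Tdg", 1), ("CNOT", 0, 1)],
}

def lower(circuit):
    """One abstraction layer: expand non-native gates until only the
    native set remains. Operand remapping is omitted for brevity, so
    rules apply to the literal qubit indices they are written for."""
    out = []
    for gate in circuit:
        name = gate[0]
        if name in NATIVE:
            out.append(gate)
        elif name in RULES:
            out.extend(lower(RULES[name]))
        else:
            raise ValueError(f"no decomposition rule for {name}")
    return out

for g in lower([("H", 0), ("TOFFOLI", 0, 1, 2)]):
    print(g)   # only hardware-native gates remain
```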

  16. Human Retroviruses and AIDS. A compilation and analysis of nucleic acid and amino acid sequences: I--II; III--V

    Energy Technology Data Exchange (ETDEWEB)

    Myers, G.; Korber, B. [eds.] [Los Alamos National Lab., NM (United States); Wain-Hobson, S. [ed.] [Laboratory of Molecular Retrovirology, Pasteur Inst.; Smith, R.F. [ed.] [Baylor Coll. of Medicine, Houston, TX (United States). Dept. of Pharmacology; Pavlakis, G.N. [ed.] [National Cancer Inst., Frederick, MD (United States). Cancer Research Facility

    1993-12-31

    This compendium and the accompanying floppy diskettes are the result of an effort to compile and rapidly publish all relevant molecular data concerning the human immunodeficiency viruses (HIV) and related retroviruses. The scope of the compendium and database is best summarized by the five parts that it comprises: (I) HIV and SIV Nucleotide Sequences; (II) Amino Acid Sequences; (III) Analyses; (IV) Related Sequences; and (V) Database Communications. Information within all the parts is updated at least twice in each year, which accounts for the modes of binding and pagination in the compendium.

  17. Updated folate data in the Dutch Food Composition Database and implications for intake estimates

    Directory of Open Access Journals (Sweden)

    Susanne Westenbrink

    2012-04-01

    Background and objective: Nutrient values are influenced by the analytical method used. Food folate measured by high-performance liquid chromatography (HPLC) or by microbiological assay (MA) yields different results, with, in general, higher results from MA than from HPLC. This leads to the question of how to deal with different analytical methods in compiling standardised and internationally comparable food composition databases. A recent inventory of folate in European food composition databases indicated that currently MA is more widely used than HPLC. Since older Dutch values were produced by HPLC and newer values by MA, the analytical methods and procedures for compiling folate data in the Dutch Food Composition Database (NEVO) were reconsidered and the folate values were updated. This article describes the impact of this revision of folate values in the NEVO database as well as the expected impact on the folate intake assessment in the Dutch National Food Consumption Survey (DNFCS). Design: The folate values were revised by replacing HPLC with MA values from recent Dutch analyses. Previously, MA folate values taken from foreign food composition tables had been recalculated to the HPLC level, assuming a 27% lower value from HPLC analyses. These recalculated values were replaced by the original MA values. Dutch HPLC and MA values were compared to each other. Folate intake was assessed for a subgroup within the DNFCS to estimate the impact of the update. Results: In the updated NEVO database nearly all folate values were produced by MA or derived from MA values, which resulted in an average increase of 24%. The median habitual folate intake in young children increased by 11-15% using the updated folate values. Conclusion: The current approach for folate in NEVO resulted in more transparency in data production and documentation and higher comparability among European databases. Results of food consumption surveys are expected to show higher folate intakes

  18. A compiler for variational forms

    OpenAIRE

    Kirby, Robert C.; Logg, Anders

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in s...
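
    What the "input" to such a form compiler looks like can be sketched with the legacy FEniCS interface (FFC works underneath it): the weak form of a Poisson problem is written symbolically, and the form compiler turns it into low-level assembly code.

```python
from fenics import (Constant, FunctionSpace, TestFunction, TrialFunction,
                    UnitSquareMesh, dot, dx, grad)

mesh = UnitSquareMesh(8, 8)
V = FunctionSpace(mesh, "P", 1)      # piecewise-linear Lagrange elements

u = TrialFunction(V)
v = TestFunction(V)
f = Constant(1.0)

a = dot(grad(u), grad(v)) * dx       # bilinear form, compiled by FFC
L = f * v * dx                       # linear form (right-hand side)
```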

  19. Compiler Construction Using Java, JavaCC, and Yacc

    CERN Document Server

    Dos Reis, Anthony J

    2012-01-01

    Broad in scope, involving theory, the application of that theory, and programming technology, compiler construction is a moving target, with constant advances in compiler technology taking place. Today, a renewed focus on do-it-yourself programming makes a quality textbook on compilers, that both students and instructors will enjoy using, of even more vital importance. This book covers every topic essential to learning compilers from the ground up and is accompanied by a powerful and flexible software package for evaluating projects, as well as several tutorials, well-defined projects, and tes

  20. PIG 3 - A simple compiler for mercury

    Energy Technology Data Exchange (ETDEWEB)

    Bindon, D C [Computer Branch, Technical Assessments and Services Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1961-06-15

    A short machine-language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  1. PIG 3 - A simple compiler for mercury

    International Nuclear Information System (INIS)

    Bindon, D.C.

    1961-06-01

    A short machine-language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  2. Compilation of data on elementary particles

    International Nuclear Information System (INIS)

    Trippe, T.G.

    1984-09-01

    The most widely used data compilation in the field of elementary particle physics is the Review of Particle Properties. The origin, development and current state of this compilation are described with emphasis on the features which have contributed to its success: active involvement of particle physicists; critical evaluation and review of the data; completeness of coverage; regular distribution of reliable summaries including a pocket edition; heavy involvement of expert consultants; and international collaboration. The current state of the Review and new developments such as providing interactive access to the Review's database are described. Problems and solutions related to maintaining a strong and supportive relationship between compilation groups and the researchers who produce and use the data are discussed

  3. Compilation of functional languages using flow graph analysis

    NARCIS (Netherlands)

    Hartel, Pieter H.; Glaser, Hugh; Wild, John M.

    A system based on the notion of a flow graph is used to specify formally and to implement a compiler for a lazy functional language. The compiler takes a simple functional language as input and generates C. The generated C program can then be compiled, and loaded with an extensive run-time system to

  4. Research and Practice of the News Map Compilation Service

    Science.gov (United States)

    Zhao, T.; Liu, W.; Ma, W.

    2018-04-01

    Based on the news media's needs for maps, this paper investigates the news map compilation service: it conducts demand research on the service of compiling news maps, designs and compiles a public-authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps that are timely, strongly pertinent and cross-regional in character, constructs a hot-news thematic gallery and news map customization services, conducts research on the types of news maps, establishes closer liaison and cooperation methods with the news media, and guides the news media in using correct maps. Through the practice of the news map compilation service, this paper lists two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and development suggestions for the service.

  5. RESEARCH AND PRACTICE OF THE NEWS MAP COMPILATION SERVICE

    Directory of Open Access Journals (Sweden)

    T. Zhao

    2018-04-01

    Full Text Available Based on the needs of the news media for maps, this paper studies the news map compilation service: it conducts demand research on the service of compiling news maps, designs and compiles a public authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps that are timely, highly targeted, and cross-regional; constructs a hot-news thematic gallery and news map customization services; conducts research on types of news maps; establishes closer liaison and cooperation methods with news media; and guides news media in using correct maps. Drawing on the practice of the news map compilation service, the paper presents two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and development suggestions for the service.

  6. abc the aspectBench compiler for aspectJ a workbench for aspect-oriented programming language and compilers research

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    Aspect-oriented programming (AOP) is gaining popularity as a new way of modularising cross-cutting concerns. The aspectbench compiler (abc) is a new workbench for AOP research which provides an extensible research framework for both new language features and new compiler optimisations. This poste...

  7. Fiscal 2000 report on advanced parallelized compiler technology. Outlines; 2000 nendo advanced heiretsuka compiler gijutsu hokokusho (Gaiyo hen)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    Research and development was carried out on automatic parallelizing compiler technology, which improves the practical performance, cost/performance ratio, and ease of use of the multiprocessor systems now used for constructing supercomputers and expected to provide a fundamental architecture for microprocessors in the 21st century. Efforts were made to develop an automatic multigrain parallelization technology for extracting multigrain parallelism from a program and making full use of it, and a parallelization tuning technology for accelerating parallelization by feeding back to the compiler the dynamic information and user knowledge acquired during execution. Moreover, a benchmark program was selected and studies were made to set execution rules and evaluation indexes for the establishment of technologies for objectively evaluating the performance of parallelizing compilers for the existing commercial parallel processing computers, which was achieved through the implementation and evaluation of the 'Advanced parallelizing compiler technology research and development project.' (NEDO)

  8. Regulatory and technical reports compilation for 1980

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzi, L.

    1981-04-01

    This compilation lists formal regulatory and technical reports and conference proceedings issued in 1980 by the US Nuclear Regulatory Commission. The compilation is divided into four major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The second major section of this compilation consists of a key-word index to report titles. The third major section contains an alphabetically arranged listing of contractor report numbers cross-referenced to their corresponding NRC report numbers. Finally, the fourth section is an errata supplement.

  9. Compiling quantum circuits to realistic hardware architectures using temporal planners

    Science.gov (United States)

    Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy

    2018-04-01

    To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits whose high number of commuting gates allows great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations but also means there is a greater potential win from more optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem, and generate a test suite of compilation problems for QAOA circuits of various sizes to a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.
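
    A minimal sketch of the scheduling problem at the heart of this compilation task (not the authors' temporal planner): commuting two-qubit gates are placed on a linear nearest-neighbour architecture, and a SWAP is inserted whenever a gate's qubits are not yet adjacent. Gate and SWAP durations are hypothetical placeholders.

```python
# Illustrative sketch only: greedy compilation of commuting two-qubit
# gates onto a linear nearest-neighbour device. Durations are invented,
# and qubit conflicts within a layer are ignored for brevity.

GATE_TIME, SWAP_TIME = 1, 3

def compile_commuting_gates(gates, n_qubits):
    """gates: set of frozensets {a, b} of logical qubits; returns (makespan, schedule)."""
    layout = list(range(n_qubits))        # logical qubit -> physical position
    schedule, t, pending = [], 0, set(gates)
    while pending:
        # schedule every pending gate whose qubits are currently adjacent
        ready = [g for g in pending
                 if abs(layout.index(min(g)) - layout.index(max(g))) == 1]
        for g in ready:
            schedule.append((t, "GATE", tuple(sorted(g))))
            pending.discard(g)
        if ready:
            t += GATE_TIME
            continue
        # otherwise swap one neighbouring pair to bring some gate together
        g = next(iter(pending))
        a, b = sorted(layout.index(q) for q in g)
        layout[a], layout[a + 1] = layout[a + 1], layout[a]
        schedule.append((t, "SWAP", (a, a + 1)))
        t += SWAP_TIME
    return t, schedule

total, plan = compile_commuting_gates({frozenset({0, 3}), frozenset({1, 2})}, 4)
print(total, plan)
```

    A temporal planner searches this space of gate orderings and SWAP insertions globally rather than greedily, which is where the optimization win described above comes from.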

  10. Atomic Data Revisions for Transitions Relevant to Observations of Interstellar, Circumgalactic, and Intergalactic Matter

    Energy Technology Data Exchange (ETDEWEB)

    Cashman, Frances H.; Kulkarni, Varsha P. [Department of Physics and Astronomy, University of South Carolina, Columbia, SC 29208 (United States); Kisielius, Romas; Bogdanovich, Pavel [Institute of Theoretical Physics and Astronomy, Vilnius University, Saulėtekio al. 3, LT-10222 Vilnius (Lithuania); Ferland, Gary J. [Department of Physics and Astronomy, University of Kentucky, Lexington, KY 40506 (United States)

    2017-05-01

    Measurements of element abundances in galaxies from astrophysical spectroscopy depend sensitively on the atomic data used. With the goal of making the latest atomic data accessible to the community, we present a compilation of selected atomic data for resonant absorption lines at wavelengths longward of 911.753 Å (the H i Lyman limit), for key heavy elements (heavier than atomic number 5) of astrophysical interest. In particular, we focus on the transitions of those ions that have been observed in the Milky Way interstellar medium (ISM), the circumgalactic medium (CGM) of the Milky Way and/or other galaxies, and the intergalactic medium (IGM). We provide wavelengths, oscillator strengths, associated accuracy grades, and references to the oscillator strength determinations. We also attempt to compare and assess the recent oscillator strength determinations. For about 22% of the lines that have updated oscillator strength values, the differences between the former values and the updated ones are ≳0.1 dex. Our compilation will be a useful resource for absorption line studies of the ISM, as well as studies of the CGM and IGM traced by sight lines to quasars and gamma-ray bursts. Studies (including those enabled by future generations of extremely large telescopes) of absorption by galaxies against the light of background galaxies will also benefit from our compilation.
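
    For readers unused to logarithmic units, a "dex" is a factor of ten on a log10 scale, so the ≳0.1 dex discrepancies quoted above correspond to roughly 26% shifts in derived column densities or abundances:

```python
# 0.1 dex expressed as a linear factor
factor = 10 ** 0.1
print(f"{factor:.3f}")   # -> 1.259, i.e. roughly a 26% change
```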

  11. Impact of time displaced precipitation estimates for on-line updated models

    DEFF Research Database (Denmark)

    Borup, Morten; Grum, Morten; Mikkelsen, Peter Steen

    2012-01-01

    When an online runoff model is updated from system measurements the requirements for the precipitation estimates change. Using rain gauge data as precipitation input there will be a displacement between the time where the rain intensity hits the gauge and the time where the rain hits the actual...

  12. Updated earthquake catalogue for seismic hazard analysis in Pakistan

    Science.gov (United States)

    Khan, Sarfraz; Waseem, Muhammad; Khan, Muhammad Asif; Ahmed, Waqas

    2018-03-01

    A reliable and homogenized earthquake catalogue is essential for seismic hazard assessment in any area. This article describes the compilation and processing of an updated earthquake catalogue for Pakistan. The earthquake catalogue compiled in this study for the region (quadrangle bounded by the geographical limits 40-83° E and 20-40° N) includes 36,563 earthquake events, which are reported as 4.0-8.3 moment magnitude (Mw) and span from 25 AD to 2016. Relationships are developed between the moment magnitude and the body and surface wave magnitude scales to unify the catalogue in terms of magnitude Mw. The catalogue includes earthquakes from Pakistan and neighbouring countries to minimize the effects of geopolitical boundaries in seismic hazard assessment studies. Earthquakes reported by local and international agencies as well as individual catalogues are included. The proposed catalogue is further used to obtain the magnitude of completeness after removal of dependent events by using four different algorithms. Finally, seismicity parameters of the seismic sources are reported, and recommendations are made for seismic hazard assessment studies in Pakistan.
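
    The homogenization step described above applies empirical relations to convert body-wave (mb) and surface-wave (Ms) magnitudes to Mw. A sketch of that step follows; the slope and intercept coefficients are invented placeholders, not the regressions fitted in this study.

```python
# Illustrative magnitude homogenisation; coefficients are hypothetical
# placeholders, NOT the relations developed by the study itself.

def to_moment_magnitude(mag, scale):
    """Convert a reported magnitude to Mw via a linear proxy relation."""
    coeffs = {            # hypothetical (slope, intercept) per scale
        "mb": (1.10, -0.40),
        "Ms": (0.67, 2.10),
        "Mw": (1.00, 0.00),
    }
    slope, intercept = coeffs[scale]
    return slope * mag + intercept

print(to_moment_magnitude(5.8, "mb"))   # -> 5.98 under the placeholder fit
```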

  13. Update from the Analysis of High Resolution Propane Spectra and the Interpretation of Titan's Infrared Spectra

    Science.gov (United States)

    Klavans, V.; Nixon, C.; Hewagama, T.; Jennings, D. E.

    2012-01-01

    Titan has an extremely thick atmosphere dominated by nitrogen, but it includes a range of trace species such as hydrocarbons and nitriles. One such hydrocarbon is propane (C3H8). Propane has 21 active IR bands covering broad regions of the mid-infrared. Therefore, its ubiquitous signature may potentially mask weaker signatures of other undetected species with important roles in Titan's chemistry. Cassini's Composite Infrared Spectrometer (CIRS) observations of Titan's atmosphere hint at the presence of such molecules. Unfortunately, C3H8 line atlases for the vibration bands ν8, ν21, ν20, and ν7 (869, 922, 1054, and 1157 cm-1, respectively) are not currently available for subtracting the C3H8 signal to reveal, or constrain, the signatures of underlying chemical species. Using spectra previously obtained by Jennings, D. E., et al. at the McMath-Pierce FTIR at Kitt Peak, AZ, as the source, and automated analysis utilities developed for this application, we are compiling an atlas of spectroscopic parameters for propane that characterize the ro-vibrational transitions in the above bands. In this paper, we discuss our efforts to inspect and fit the aforementioned bands, present updated results for spectroscopic parameters including absolute line intensities and transition frequencies in HITRAN and GEISA formats, and show how these optical constants will be used in searching for other trace chemical species in Titan's atmosphere. Our line atlas for the ν21 band contains a total of 2971 lines. The band-integrated strength calculated for the ν21 band is 1.003 cm-1/(cm·atm).

  14. Kinetic Line Voronoi Operations and Their Reversibility

    DEFF Research Database (Denmark)

    Mioc, Darka; Anton, François; Gold, Christopher

    2010-01-01

    In Geographic Information Systems the reversibility of map update operations has not been explored yet. In this paper we are using the Voronoi based Quad-edge data structure to define reversible map update operations. The reversibility of the map operations has been formalised at the lowest level...... mechanisms and dynamic map visualisations. In order to use the reversibility within the kinetic Voronoi diagram of points and open oriented line segments, we need to assure that reversing the map commands will produce exactly the changes in the map equivalent to the previous map states. To prove...... that reversing the map update operations produces the exact reverse changes, we show an isomorphism between the set of complex operations on the kinetic Voronoi diagram of points and open oriented line segments and the sets of numbers of new / deleted Voronoi regions induced by these operations, and its...

  15. Regulatory and technical reports: compilation for 1975-1978

    International Nuclear Information System (INIS)

    1982-04-01

    This brief compilation lists formal reports issued by the US Nuclear Regulatory Commission in 1975 through 1978 that were not listed in the Regulatory and Technical Reports Compilation for 1975 to 1978, NUREG-0304, Vol. 3. This compilation is divided into two sections. The first consists of a sequential listing of all reports in report-number order. The second section consists of an index developed from keywords in report titles and abstracts.

  16. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-10-01

    This is the fourth issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section every year. The material contained in this compilation is sorted according to eight subject categories: General compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes, half-lives, energies and spectra; nuclear decay processes, gamma-rays; nuclear decay processes, fission products; nuclear decay processes (others); atomic processes.

  17. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables. Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees. With the proliferation of open source, understanding these issues is increasingly the res...

  18. Compilation of Sandia Laboratories technical capabilities

    International Nuclear Information System (INIS)

    Lundergan, C.D.; Mead, P.L.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078)

  19. Compilation of Sandia Laboratories technical capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lundergan, C. D.; Mead, P. L. [eds.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078). (RWR)

  20. VFC: The Vienna Fortran Compiler

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    1999-01-01

    Full Text Available High Performance Fortran (HPF offers an attractive high‐level language interface for programming scalable parallel architectures providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC, a new source‐to‐source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influence a program’s performance. This comprises data locality assertions, non‐local access specifications and the possibility of reusing runtime‐generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high‐level data parallel language such as HPF+ a performance close to hand‐written message‐passing programs can be achieved even for highly irregular codes.

  1. Extension of Alvis compiler front-end

    Energy Technology Data Exchange (ETDEWEB)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl [AGH University of Science and Technology, Department of Applied Computer Science, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2015-12-31

    Alvis is a formal modelling language that makes it possible to verify distributed concurrent systems. The semantics of an Alvis model is expressed as an LTS graph (labelled transition system). Execution of any language statement is expressed as a transition between formally defined states of such a model. An LTS graph is generated using a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as a part of the Alvis language to define parameter types and operations on them. Thanks to the compiler's modular construction, many aspects of the compilation of an Alvis model may be modified. Providing new plugins for the Alvis Compiler that support languages like Java or C makes it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler's internal model and describes how the default specification language can be altered by new plugins.
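
    As a rough illustration of the LTS representation mentioned above (not the Alvis compiler's actual data structures), a labelled transition system can be stored as a map from states to labelled outgoing edges; the states and labels below are invented.

```python
# Minimal labelled transition system (LTS) sketch; states/labels invented.
from collections import defaultdict

class LTS:
    def __init__(self, initial):
        self.initial = initial
        self.transitions = defaultdict(list)    # state -> [(label, state)]

    def add(self, src, label, dst):
        self.transitions[src].append((label, dst))

    def successors(self, state):
        return self.transitions[state]

lts = LTS(initial="s0")
lts.add("s0", "exec: x = 0", "s1")   # each statement is one transition
lts.add("s1", "test: x < 3", "s2")
print(lts.successors("s0"))          # -> [('exec: x = 0', 's1')]
```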

  2. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    , block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs......-directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter....

  3. Semantics-based compiling: A case study in type-directed partial evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    , block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs......-directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter....

  4. Compilation of anatomical, physiological and dietary characteristics for a Filipino Reference Man

    International Nuclear Information System (INIS)

    Natera, E.S.; Cuevas, C.D.; Azanon, E.M.; Palattao, M.B.; Espiritu, R.T.; Cobar, M.C.; Palad, L.H.; Torres, B.; Shiraishi, K.

    1998-01-01

    The Asian Reference Man is the study of the biological characteristics of the different ethnic populations in the Asian Region. Its aim is to update the existing international reference values, called the ICRP Reference Man, which are used for the calculation of radiation exposure. The Philippines is a participant in the study of the formulation of the Asian Reference Man and is represented by the Philippine Nuclear Research Institute. The biological parameters included in the study are the physical, anatomical, physiological and the dietary characteristics representing the Filipino race and customs. The normal Filipino values were obtained from past nationwide and regional surveys, from medical records of private and government institutions and from random sampling of the population. Results of the study are presented in tabulations according to gender and age group. Statistical analysis of the data is presented as the mean, standard deviation and the median using Microsoft Excel Software and Clipper Compiled Program. (author)

  5. Compilation of anatomical, physiological and dietary characteristics for a Filipino reference man

    International Nuclear Information System (INIS)

    Natera, E.S.; Cuevas, G.D.; Azanon, E.M.; Palattao, M.B.; Espiritu, R.T.; Cobar, M.C.; Palad, L.H.; Torres, B.; Kawamura, H.; Shiraishi, K.

    1995-01-01

    The Asian reference man is a study of the biological characteristics of the different ethnic populations in the Asian region. Its aim is to update the existing international values, called the ICRP Reference Man, which are used for the calculation of radiation exposure. The Philippines is a participant in the study of the formulation of the Asian reference man and is represented by the Philippine Nuclear Research Institute. The biological parameters included in this study are the physical, anatomical, physiological and the dietary characteristics representing the Filipino race and customs. The normal Filipino values were obtained from past nationwide and regional surveys, from medical records of private and government institutions and from random sampling of the population. Results of the study are presented in tabulations according to gender and age group. Statistical analysis of the data is presented as the mean, standard deviation and the median using Microsoft Excel Software and Clipper Compiled Program. (author). 18 refs., 12 tabs., 1 fig

  6. A Symmetric Approach to Compilation and Decompilation

    DEFF Research Database (Denmark)

    Ager, Mads Sig; Danvy, Olivier; Goldberg, Mayer

    2002-01-01

    Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice of...

  7. Compilation of current high energy physics experiments

    International Nuclear Information System (INIS)

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche.

  8. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the...

  9. PPC - an interactive preprocessor/compiler for the DSNP simulation language

    International Nuclear Information System (INIS)

    Mahannah, J.A.; Schor, A.L.

    1986-01-01

    The PPC preprocessor/compiler was developed for the DSNP (dynamic simulator for nuclear power plants) simulation language. The goal of PPC is to provide an easy-to-use, interactive programming environment that will aid both the beginner and the well-seasoned DSNP programmer. PPC simplifies the steps of the simulation development process for any user. All will benefit from the on-line help facilities, easy manipulation of modules, the elimination of syntax errors, and the general systematic approach. PPC is a very structured and modular program that allows for easy expansion and modification. Written entirely in C, it is fast, compact, and portable. Used as a front end, it greatly enhances DSNP's desirability as a simulation tool for education and research.

  10. Compilation of streamflow statistics calculated from daily mean streamflow data collected during water years 1901–2015 for selected U.S. Geological Survey streamgages

    Science.gov (United States)

    Granato, Gregory E.; Ries, Kernell G.; Steeves, Peter A.

    2017-10-16

    Streamflow statistics are needed by decision makers for many planning, management, and design activities. The U.S. Geological Survey (USGS) StreamStats Web application provides convenient access to streamflow statistics for many streamgages by accessing the underlying StreamStatsDB database. In 2016, non-interpretive streamflow statistics were compiled for streamgages located throughout the Nation and stored in StreamStatsDB for use with StreamStats and other applications. Two previously published USGS computer programs that were designed to help calculate streamflow statistics were updated to better support StreamStats as part of this effort. These programs are named “GNWISQ” (Get National Water Information System Streamflow (Q) files), updated to version 1.1.1, and “QSTATS” (Streamflow (Q) Statistics), updated to version 1.1.2.Statistics for 20,438 streamgages that had 1 or more complete years of record during water years 1901 through 2015 were calculated from daily mean streamflow data; 19,415 of these streamgages were within the conterminous United States. About 89 percent of the 20,438 streamgages had 3 or more years of record, and about 65 percent had 10 or more years of record. Drainage areas of the 20,438 streamgages ranged from 0.01 to 1,144,500 square miles. The magnitude of annual average streamflow yields (streamflow per square mile) for these streamgages varied by almost six orders of magnitude, from 0.000029 to 34 cubic feet per second per square mile. About 64 percent of these streamgages did not have any zero-flow days during their available period of record. The 18,122 streamgages with 3 or more years of record were included in the StreamStatsDB compilation so they would be available via the StreamStats interface for user-selected streamgages. All the statistics are available in a USGS ScienceBase data release.
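
    The statistics named above (annual averages, zero-flow days, yields per square mile, percentiles) are straightforward to derive from a daily-mean series. A pandas sketch follows; the file name, column name, and drainage area are invented for illustration, and the GNWISQ/QSTATS programs themselves are not reproduced here.

```python
# Sketch of non-interpretive streamflow statistics with pandas.
# "daily_mean_q.csv", the column "q_cfs", and the drainage area are
# hypothetical stand-ins for real streamgage data.

import pandas as pd

df = pd.read_csv("daily_mean_q.csv", parse_dates=["date"]).set_index("date")
drainage_area_sq_mi = 120.0

annual_mean = df["q_cfs"].resample("YS").mean()        # annual average flow
zero_flow_days = int((df["q_cfs"] == 0).sum())         # count of zero-flow days
yield_cfsm = df["q_cfs"].mean() / drainage_area_sq_mi  # cfs per square mile
median_daily = df["q_cfs"].quantile(0.5)               # median daily flow

print(annual_mean.head(), zero_flow_days, yield_cfsm, median_daily)
```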

  11. Promising Compilation to ARMv8 POP

    OpenAIRE

    Podkopaev, Anton; Lahav, Ori; Vafeiadis, Viktor

    2017-01-01

    We prove the correctness of compilation of relaxed memory accesses and release-acquire fences from the "promising" semantics of [Kang et al. POPL'17] to the ARMv8 POP machine of [Flur et al. POPL'16]. The proof is highly non-trivial because both the ARMv8 POP and the promising semantics provide some extremely weak consistency guarantees for normal memory accesses; however, they do so in rather different ways. Our proof of compilation correctness to ARMv8 POP strengthens the results of the Kan...

  12. service line analytics in the new era.

    Science.gov (United States)

    Spence, Jay; Seargeant, Dan

    2015-08-01

    To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: updated service line definitions; the ability to analyze and trend service line net patient revenues by payment source; access to accurate service line cost information across multiple dimensions with drill-through capabilities; the ability to redesign key reports based on changing requirements; and clear assignment of accountability.

  13. Design of methodology for incremental compiler construction

    Directory of Open Access Journals (Sweden)

    Pavel Haluza

    2011-01-01

    Full Text Available The paper deals with possibilities of incremental compiler construction. It presents compiler construction possibilities both for languages with a fixed set of lexical units and for languages with a variable set of lexical units. The methodology design for incremental compiler construction is based on the known algorithms for standard compiler construction and is derived for both groups of languages. The group of languages with a fixed set of lexical units comprises languages in which each lexical unit has a constant meaning, e.g., common programming languages. For this group of languages the paper tries to solve the problem of incremental semantic analysis, which is based on incremental parsing. In the group of languages with a variable set of lexical units (e.g., the professional typographic system TeX), it is possible to change arbitrarily the meaning of each character in the input file at any time during processing. The change takes effect immediately and its validity can be somehow limited or is given by the end of the input. For this group of languages the paper tries to solve the case in which macros temporarily change the category of arbitrary characters.
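
    A toy sketch of the variable-lexical-unit case discussed above: the lexer consults a mutable category table on every character, so a TeX-style recategorisation takes effect immediately. The "@" directive syntax is invented here, not TeX's.

```python
# Toy lexer with a mutable character-category table (TeX-like catcodes).
# The "@x=c" directive below is an invented stand-in for \catcode.

CATEGORIES = {"%": "comment"}            # mutable category table

def lex(text):
    tokens, i = [], 0
    while i < len(text):
        ch = text[i]
        if CATEGORIES.get(ch) == "comment":   # skip to end of line
            j = text.find("\n", i)
            i = len(text) if j < 0 else j + 1
        elif ch == "@":                       # directive: make a char a comment
            CATEGORIES[text[i + 1]] = "comment"
            i += 4                            # consume "@x=c"
        else:
            tokens.append(ch)
            i += 1
    return tokens

print(lex("ab%skip\ncd@!=cef!ignored\ngh"))   # -> a..h, comment runs dropped
```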

  14. ExoMol line lists - VII. The rotation-vibration spectrum of phosphine up to 1500 K

    Science.gov (United States)

    Sousa-Silva, Clara; Al-Refaie, Ahmed F.; Tennyson, Jonathan; Yurchenko, Sergei N.

    2015-01-01

    A comprehensive hot line list is calculated for 31PH3 in its ground electronic state. This line list, called SAlTY, contains almost 16.8 billion transitions between 7.5 million energy levels and it is suitable for simulating spectra up to temperatures of 1500 K. It covers wavelengths longer than 1 μm and includes all transitions to upper states with energies below hc × 18 000 cm-1 and rotational excitation up to J = 46. The line list is computed by variational solution of the Schrödinger equation for the rotation-vibration motion employing the nuclear-motion program TROVE. A previously reported ab initio dipole moment surface is used as well as an updated `spectroscopic' potential energy surface, obtained by refining an existing ab initio surface through least-squares fitting to the experimentally derived energies. Detailed comparisons with other available sources of phosphine transitions confirms SAlTY's accuracy and illustrates the incompleteness of previous experimental and theoretical compilations for temperatures above 300 K. Atmospheric models are expected to severely underestimate the abundance of phosphine in disequilibrium environments, and it is predicted that phosphine will be detectable in the upper troposphere of many substellar objects. This list is suitable for modelling atmospheres of many astrophysical environments, namely carbon stars, Y dwarfs, T dwarfs, hot Jupiters and Solar system gas giant planets. It is available in full from the Strasbourg data centre, CDS, and at www.exomol.com.

  15. Compiling the First Monolingual Lusoga Dictionary

    Directory of Open Access Journals (Sweden)

    Minah Nabirye

    2011-10-01

    Full Text Available In this research article a study is made of the approach followed to compile the first-ever monolingual dictionary for Lusoga. Lusoga is a Bantu language spoken in Uganda by slightly over two million people. Being an under-resourced language, the Lusoga orthography had to be designed, a grammar written, and a corpus built, before embarking on the compilation of the dictionary. This compilation was aimed at attaining an academic degree, hence requiring a rigorous research methodology. Firstly, the prevailing methods for compiling dictionaries were mainly practical and insufficient in explaining the theoretical linguistic basis for dictionary compilation. Since dictionaries are based on meaning, the theory of meaning was used to account for all linguistic data considered in dictionaries. However, meaning is considered at a very abstract level, far removed from the process of compiling dictionaries. Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular theory explains how the different modules of a language contribute information to the different parts of the dictionary article or dictionary information in general. Secondly, the research also had to contend with the different approaches for analysing Bantu languages for Bantu and European audiences. A description of the Bantu- and European-centred approaches to Bantu studies was undertaken in respect of (a) the classification of Lusoga words, and (b) the specification of their citations. As a result, Lusoga lexicography deviates from the prevailing Bantu classification and citation of nouns, adjectives and verbs in particular. The dictionary was tested on two separate occasions and all the feedback was considered in the compilation process. This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary.

  16. AICPA allows low-cost options for compiled financial statements.

    Science.gov (United States)

    Reinstein, Alan; Luecke, Randall W

    2002-02-01

    The AICPA Accounting and Review Services Committee's (ARSC) SSARS No. 8, Amendment to Statement on Standards for Accounting and Review Services No. 1, Compilation and Review of Financial Statements, issued in October 2000, allows financial managers to provide plain-paper, compiled financial statements for the exclusive use of management. Such financial statements were disallowed in 1979 when the AICPA issued SSARS No. 1, Compilation and Review of Financial Statements. With the issuance of SSARS No. 8, financial managers can prepare plain-paper, compiled financial statements when third parties are not expected to rely on the financial statements, management acknowledges such restrictions in writing, and management acknowledges its primary responsibility for the adequacy of the financial statements.

  17. Electronic circuits for communications systems: A compilation

    Science.gov (United States)

    1972-01-01

    The compilation of electronic circuits for communications systems is divided into thirteen basic categories, each representing an area of circuit design and application. The compilation items are moderately complex and, as such, would appeal to the applications engineer. However, the rationale for the selection criteria was tailored so that the circuits would reflect fundamental design principles and applications, with an additional requirement for simplicity whenever possible.

  18. Compilation Techniques Specific for a Hardware Cryptography-Embedded Multimedia Mobile Processor

    Directory of Open Access Journals (Sweden)

    Masa-aki FUKASE

    2007-12-01

    Full Text Available The development of single chip VLSI processors is the key technology of ever growing pervasive computing to answer overall demands for usability, mobility, speed, security, etc. We have so far developed a hardware cryptography-embedded multimedia mobile processor architecture, HCgorilla. Since HCgorilla integrates a wide range of techniques from architectures to applications and languages, a one-sided design approach is not always useful. HCgorilla needs a more complicated strategy, that is, hardware/software (H/S) codesign. Thus, we exploit the software support of HCgorilla, composed of a Java interface and parallelizing compilers. They are assumed to be installed in servers in order to reduce the load and increase the performance of HCgorilla-embedded clients. Since compilers are the essence of the software's responsibility, we focus in this article on our recent results on the design, specifications, and prototyping of parallelizing compilers for HCgorilla. The parallelizing compilers are composed of a multicore compiler and a LIW compiler. They are specified to abstract parallelism from executable serial codes or the Java interface output and to output code executable in parallel by HCgorilla. The prototype compilers are written in Java. Evaluation using an arithmetic test program shows the reasonability of the prototype compilers compared with hand compilation.

  19. Atomic emission lines in the near ultraviolet; hydrogen through krypton, section 2

    Science.gov (United States)

    Kelly, R. L.

    1979-01-01

    A compilation of spectra from the first 36 elements was prepared from published literature available through October 1977. In most cases, only those lines which were actually observed in emission or absorption are listed. The wavelengths included range from 2000 Angstroms to 3200 Angstroms with some additional lines up to 3500 Angstroms. Only lines of stripped atoms are reported; no molecular bands are included.

  20. Atomic emission lines in the near ultraviolet; hydrogen through krypton, section 1

    Science.gov (United States)

    Kelly, R. L.

    1979-01-01

    A compilation of spectra from the first 36 elements was prepared from published literature available through October 1977. In most cases, only those lines which were actually observed in emission or absorption are listed. The wavelengths included range from 2000 Angstroms to 3200 Angstroms with some additional lines up to 3500 Angstroms. Only lines of stripped atoms are reported; no molecular bands are included.

  1. Radioactive waste management profiles. Compilation of data from the waste management data base. No. 2, April 1994

    International Nuclear Information System (INIS)

    1994-01-01

    In 1989, the International Atomic Energy Agency began development of the Waste Management Data Base (WMDB) to, primarily, establish a mechanism for the collection, integration, storage, and retrieval of information relevant to radioactive waste management in Member States. This current report is a summary and compilation of the data received during the 1991 biennial update which covers the period of January 1991 through March 1993. This Profile report is divided into two main parts. One part describes the Waste Management Data Base system and the type of information it contains. The second part contains data provided by Member States in response to the IAEA's 1991 WMDB Questionnaire. This report also contains data of Member States that did not report to the Questionnaire.

  2. 49 CFR 801.57 - Records compiled for law enforcement purposes.

    Science.gov (United States)

    2010-10-01

    Pursuant to 5 U.S.C. 552(b)(7), any records compiled for law enforcement purposes... would disclose investigative procedures and practices, or would endanger the life or security of law...

  3. An Efficient Compiler for Weighted Rewrite Rules

    OpenAIRE

    Mohri, Mehryar; Sproat, Richard

    1996-01-01

    Context-dependent rewrite rules are used in many areas of natural language and speech processing. Work in computational phonology has demonstrated that, given certain conditions, such rewrite rules can be represented as finite-state transducers (FSTs). We describe a new algorithm for compiling rewrite rules into FSTs. We show the algorithm to be simpler and more efficient than existing algorithms. Further, many of our applications demand the ability to compile weighted rules into weighted FST...
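
    Rule compilation in this tradition is available today in open-source toolkits. A minimal sketch (assuming the pynini package, whose cdrewrite construction follows this line of work) compiles the rule a → b / c _ d into an FST and applies it:

```python
# Sketch assuming the pynini package: compile one context-dependent
# rewrite rule into an FST and rewrite a string with it.

import pynini

sigma_star = pynini.union("a", "b", "c", "d").closure()  # closed alphabet
rule = pynini.cdrewrite(
    pynini.cross("a", "b"),   # tau: rewrite "a" as "b"
    "c",                      # left context
    "d",                      # right context
    sigma_star,
)
print(("cad" @ rule).string())   # -> "cbd"
```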

  4. HOPE: Just-in-time Python compiler for astrophysical computations

    Science.gov (United States)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of compiled implementation.
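
    A sketch of the decorator interface described above, assuming the hope package is installed; the function body is an invented example.

```python
# Assuming `pip install hope`: the decorator triggers translation to C++
# and compilation on the first call; later calls reuse the compiled code.

import hope

@hope.jit
def polynomial(x):
    return 0.25 * x * x * x + 0.75 * x

print(polynomial(2.0))   # -> 3.5
```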

  5. Compiling the First Monolingual Lusoga Dictionary | Nabirye | Lexikos

    African Journals Online (AJOL)

    Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular ... This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary. Keywords: lexicography ...

  6. A decade of Web Server updates at the Bioinformatics Links Directory: 2003-2012.

    Science.gov (United States)

    Brazas, Michelle D; Yim, David; Yeung, Winston; Ouellette, B F Francis

    2012-07-01

    The 2012 Bioinformatics Links Directory update marks the 10th special Web Server issue from Nucleic Acids Research. Beginning with content from their 2003 publication, the Bioinformatics Links Directory in collaboration with Nucleic Acids Research has compiled and published a comprehensive list of freely accessible, online tools, databases and resource materials for the bioinformatics and life science research communities. The past decade has exhibited significant growth and change in the types of tools, databases and resources being put forth, reflecting both technology changes and the nature of research over that time. With the addition of 90 web server tools and 12 updates from the July 2012 Web Server issue of Nucleic Acids Research, the Bioinformatics Links Directory at http://bioinformatics.ca/links_directory/ now contains an impressive 134 resources, 455 databases and 1205 web server tools, mirroring the continued activity and efforts of our field.

  7. SHINE Virtual Machine Model for In-flight Updates of Critical Mission Software

    Science.gov (United States)

    Plesea, Lucian

    2008-01-01

    This software is a new target for the Spacecraft Health Inference Engine (SHINE) knowledge base that compiles a knowledge base to a language called Tiny C - an interpreted version of C that can be embedded on flight processors. This new target allows portions of a running SHINE knowledge base to be updated on a "live" system without needing to halt and restart the containing SHINE application. This enhancement will directly provide this capability without the risk of software validation problems and can also enable complete integration of BEAM and SHINE into a single application. This innovation enables SHINE deployment in domains where autonomy is used during flight-critical applications that require updates. This capability eliminates the need to halt the application and perform potentially serious total-system uploads, which risk the loss of system integrity, before resuming the application. This software enables additional applications at JPL (microsensors, embedded mission hardware) and increases the marketability of these applications outside of JPL.
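
    A generic illustration of the live-update idea (not the SHINE/Tiny C implementation): when a running system reaches its logic through one level of indirection, rebinding the entry takes effect immediately, with no halt and restart.

```python
# Generic hot-update sketch, unrelated to SHINE's actual mechanism:
# the dispatcher looks the handler up on every call, so rebinding the
# name updates behaviour on a "live" system.

HANDLERS = {"fault_check": lambda reading: reading < 100.0}

def monitor(reading):
    return HANDLERS["fault_check"](reading)     # indirection enables updates

print(monitor(150.0))                           # False under the old rule
HANDLERS["fault_check"] = lambda r: r < 200.0   # in-flight update
print(monitor(150.0))                           # True under the new rule
```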

  8. Data compilation for particle-impact desorption, 2

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeutchi, Fujio.

    1985-07-01

    The particle impact desorption is one of the elementary processes of hydrogen recycling in controlled thermonuclear fusion reactors. We have surveyed the literature concerning the ion impact desorption and photon stimulated desorption published through the end of 1984 and compiled the data on the desorption cross sections and yields with the aid of a computer. This report presents the results of the compilation in graphs and tables as functions of incident energy, surface temperature and surface coverage. (author)

  9. Catalogue of oscillator strengths for Ti II lines

    International Nuclear Information System (INIS)

    Savanov, I.S.; Huovelin, J.; Tuominen, I.

    1990-01-01

    We have revised the published values of oscillator strengths for ionized titanium. The zero point of gf-values has been established using the lifetime measurements of excited states of atoms. The data on the adopted oscillator strengths for 419 Ti II lines are compiled. Using the adopted gf-values and the analysis by Biemont for the titanium in the solar atmosphere determined from the Ti II lines and the HOLMU model, we obtained the abundance log A(Ti) = 4.96 ± 0.05.

  10. Data for Regional Heat flow Studies in and around Japan and its relationship to seismogenic layer

    Science.gov (United States)

    Tanaka, A.

    2017-12-01

    Heat flow is a fundamental parameter to constrain the thermal structure of the lithosphere. It also provides a constraint on lithospheric rheology, which is sensitive to temperature. General features of the heat flow distribution in and around Japan had been revealed by the early 1970's, and heat flow data have been continuously updated by further data compilation from mainly published data and investigations. These include additional data, which were not published individually, but were included in site-specific reports. Also, thermal conductivity measurements were conducted on cores from boreholes using a line-source device with a half-space type box probe and an optical scanning device, and previously unpublished thermal conductivities were compiled. It has been more than 10 years since the last published compilation and analysis of heat flow data of Tanaka et al. (2004), which published all of the heat flow data in the northwestern Pacific area (from 0 to 60°N and from 120 to 160°E) and geothermal gradient data in and around Japan. Because these added data and information are drawn from various sources, the updated database is compiled as separate datasets: heat flow, geothermal gradient, and thermal conductivity. The updated and improved database represents considerable improvement to past updates and presents an opportunity to revisit the thermal state of the lithosphere along with other geophysical/geochemical constraints on heat flow extrapolation. The spatial distribution of the cut-off depth of shallow seismicity of Japan using relocated hypocentres during the last decade (Omuralieva et al., 2012) and this updated database are used to quantify the concept of temperature as a fundamental parameter for determining the seismogenic thickness.

  11. Radioactive waste management profiles. Compilation of data from the waste management data base. No. 2, April 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    In 1989, the International Atomic Energy Agency began development of the Waste Management Data Base (WMDB) to, primarily, establish a mechanism for the collection, integration, storage, and retrieval of information relevant to radioactive waste management in Member States. This current report is a summary and compilation of the data received during the 1991 biennial update which covers the period of January 1991 through March 1993. This Profile report is divided into two main parts. One part describes the Waste Management Data Base system and the type of information it contains. The second part contains data provided by Member States in response to the IAEA's 1991 WMDB Questionnaire. This report also contains data of Member States that did not report to the Questionnaire. 3 figs, 5 tabs

  12. Application of geostatistical simulation to compile seismotectonic provinces based on earthquake databases (case study: Iran)

    Science.gov (United States)

    Jalali, Mohammad; Ramazi, Hamidreza

    2018-04-01

    This article is devoted to application of a simulation algorithm based on geostatistical methods to compile and update seismotectonic provinces, in which Iran has been chosen as a case study. Traditionally, tectonic maps together with seismological data and information (e.g., earthquake catalogues, earthquake mechanism, and microseismic data) have been used to update seismotectonic provinces. In many cases, incomplete earthquake catalogues are one of the important challenges in this procedure. To overcome this problem, a geostatistical simulation algorithm, turning band simulation, TBSIM, was applied to make synthetic data to improve incomplete earthquake catalogues. Then, the synthetic data were added to the traditional information to study the seismicity homogeneity and classify the areas according to tectonic and seismic properties to update seismotectonic provinces. In this paper, (i) different magnitude types in the studied catalogues have been homogenized to moment magnitude (Mw), and earthquake declustering was then carried out to remove aftershocks and foreshocks; (ii) a time normalization method was introduced to decrease the uncertainty in the temporal domain prior to starting the simulation procedure; (iii) variography has been carried out in each subregion to study spatial regressions (e.g., the west-southwestern area showed a spatial regression from 0.4 to 1.4 decimal degrees; the maximum range was identified at an azimuth of 135° ± 10°); (iv) the TBSIM algorithm was then applied to generate simulated events, producing 68,800 synthetic events according to the spatial regressions found in several directions; (v) simulated events (i.e., magnitudes) were classified based on their intensity in ArcGIS packages and homogeneous seismic zones have been determined. Finally, according to the synthetic data, tectonic features, and actual earthquake catalogues, 17 seismotectonic provinces were introduced in four major classes: very high, high, moderate, and low.
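
    The variography step in (iii) rests on the experimental semivariogram, gamma(h), half the mean squared difference between values separated by lag h. A self-contained numpy sketch with synthetic stand-in data:

```python
# Experimental semivariogram sketch; coordinates and magnitudes are
# synthetic stand-ins for a homogenised earthquake catalogue.

import numpy as np

rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(200, 2))   # epicentres in decimal degrees
z = rng.normal(5.0, 0.5, size=200)       # e.g. homogenised magnitudes

d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
sq = (z[:, None] - z[None, :]) ** 2
edges = np.linspace(0.1, 5.0, 10)        # lag-distance bins
gamma = [0.5 * sq[(d > lo) & (d <= hi)].mean()
         for lo, hi in zip(edges[:-1], edges[1:])]
print(np.round(gamma, 3))
```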

  13. Supplement to nuclear EQ sourcebook: A compilation of documents for nuclear equipment qualification

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    In the nuclear power industry, environmental and seismic qualification of safety-related electrical and instrumentation equipment is collectively known as Equipment Qualification (EQ). Related technology, as well as regulatory requirements, have evolved rapidly during the last 15 years. For environmental qualification, what began in 1971 with one trial-use guide (IEEE Std 323-1971), now stands as a full complement of Nuclear Regulatory Commission (NRC) rules, guides, and industry standards. As the original 1992 version of the Nuclear EQ Sourcebook was compiled to serve the user-community need for a complete and exhaustive single source of nuclear EQ documentation, this Supplement is published to provide the user community with nuclear EQ documentation revisions and updates that have been published since the publication of the original volume in May of 1992. The first volume of this publication included documents issued before December, 1991. That first volume was well received and also prompted positive response from industry on its usefulness and on recommendations to improve the completeness and enhance the ease of use. This Supplement, therefore, includes new documents, revisions, and updates issued and published since December 1991 and up to and including March 1993. It incorporates various enhancements in order to better serve the user community. One such enhancement included in this Supplement is the new Equipment Classification Index that is described in the User Guide section of this publication. 37 papers, including Bulletins, Federal rules, Generic Letters, Notices, Regulatory Guides, IEEE Standards, IEEE Recommended Practices, and IEEE Guides, have been processed separately for inclusion on the data base

  14. The national assessment of shoreline change: a GIS compilation of vector cliff edges and associated cliff erosion data for the California coast

    Science.gov (United States)

    Hapke, Cheryl; Reid, David; Borrelli, Mark

    2007-01-01

    The U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector cliff edges and associated rates of cliff retreat along the open-ocean California coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Cliff erosion is a chronic problem along many coastlines of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of coastal cliff retreat. There is also a critical need for these data to be consistent from one region to another. One objective of this work is to develop a standard, repeatable methodology for mapping and analyzing cliff edge retreat so that periodic, systematic, and internally consistent updates of cliff edge position and associated rates of erosion can be made at a national scale. This data compilation of open-ocean cliff edges for the California coast is a separate yet related study to that of Hapke and others (2006), which documents shoreline change along sandy shorelines of the California coast and is itself one in a series that includes the Gulf of Mexico and the Southeast Atlantic coast (Morton and others, 2004; Morton and Miller, 2005). Future reports and data compilations will include coverage of the Northeast U.S., the Great Lakes, Hawaii and Alaska. Cliff edge change is determined by comparing the positions of one historical cliff edge digitized from maps with a modern cliff edge derived from topographic LIDAR (light detection and ranging) surveys. Historical cliff edges for the California coast represent the 1920s-1930s time-period; the most recent cliff edge was delineated using data collected between 1998 and 2002. End-point rate calculations were used to evaluate rates of erosion between the two cliff edges. Please refer to our full report on cliff edge erosion along the California...

  15. An Initial Evaluation of the NAG f90 Compiler

    Directory of Open Access Journals (Sweden)

    Michael Metcalf

    1992-01-01

    Full Text Available A few weeks before the formal publication of the ISO Fortran 90 Standard, NAG announced the world's first f90 compiler. We have evaluated the compiler by using it to assess the impact of Fortran 90 on the CERN Program Library.

  16. MIT wavelength tables. Volume 2. Wavelengths by element

    International Nuclear Information System (INIS)

    Phelps, F.M. III.

    1982-01-01

    This volume is the first stage of a project to expand and update the MIT wavelength tables first compiled in the 1930s. For 109,325 atomic emission lines, arranged by element, it presents wavelength in air, wavelength in vacuum, wave number and intensity. All data are stored on computer-readable magnetic tape.
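
    The tabulated quantities are related through standard-air dispersion. A sketch using an Edlén-type formula (coefficients as in Edlén 1966) converts a vacuum wavelength to its air counterpart and wave number:

```python
# Edlén (1966) dispersion of standard air; sigma is in inverse microns.

def refractive_index(vac_wavelength_angstrom):
    sigma2 = (1e4 / vac_wavelength_angstrom) ** 2
    return 1 + 1e-8 * (8342.13
                       + 2406030.0 / (130.0 - sigma2)
                       + 15997.0 / (38.9 - sigma2))

vac = 5000.0                          # vacuum wavelength in Angstroms
air = vac / refractive_index(vac)     # corresponding air wavelength
wavenumber = 1e8 / vac                # vacuum wave number in cm^-1
print(round(air, 3), round(wavenumber, 2))   # -> 4998.606 20000.0
```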

  17. Compiled MPI: Cost-Effective Exascale Applications Development

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Quinlan, D; Lumsdaine, A; Hoefler, T

    2012-04-10

    The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over...
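
    For readers unfamiliar with the explicit style criticised above, a minimal mpi4py example (an illustration, not part of CoMPI) shows how the send/receive pair is welded into the application logic, which is what makes compiler-level analysis hard:

```python
# Minimal explicit message passing with mpi4py; run with
#   mpiexec -n 2 python example.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    data = {"step": 1, "halo": [0.0, 1.0, 2.0]}
    comm.send(data, dest=1, tag=11)      # explicit send to a fixed peer
elif rank == 1:
    data = comm.recv(source=0, tag=11)   # matching explicit receive
    print("rank 1 received", data)
```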

  18. Solidify, An LLVM pass to compile LLVM IR into Solidity

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-12

    The software currently compiles LLVM IR into Solidity (Ethereum’s dominant programming language) using LLVM’s pass library. Specifically, this compiler allows us to convert an arbitrary DSL into Solidity. We focus specifically on converting Domain Specific Languages into Solidity due to their ease of use and provable properties. By creating a toolchain to compile lightweight domain-specific languages into Ethereum's dominant language, Solidity, we allow non-specialists to effectively develop safe and useful smart contracts. For example, lawyers from a certain firm can have a proprietary DSL that codifies basic laws safely converted to Solidity to be securely executed on the blockchain. In another example, a simple provenance tracking language can be compiled and securely executed on the blockchain.
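
    As an illustration of the general idea of compiling a small DSL down to Solidity source, here is a hypothetical sketch; the DSL grammar, contract layout, and function names are invented for this example, and the actual tool operates on LLVM IR rather than on text:

```python
# Hypothetical sketch: translating a tiny provenance-tracking DSL into
# Solidity source text. The DSL grammar and the emitted contract are
# invented for illustration; the described tool works at the LLVM IR level.

TEMPLATE = """pragma solidity ^0.8.0;
contract Provenance {{
    event Transfer(string item, string holder);
{body}
}}"""

def compile_dsl(lines):
    """Each DSL line 'record <item> -> <holder>' becomes a function
    that emits a Transfer event recording the hand-off."""
    funcs = []
    for i, line in enumerate(lines):
        item, holder = [s.strip() for s in line.removeprefix("record").split("->")]
        funcs.append(
            f"    function step{i}() public {{\n"
            f'        emit Transfer("{item}", "{holder}");\n'
            f"    }}"
        )
    return TEMPLATE.format(body="\n".join(funcs))

print(compile_dsl(["record crate42 -> warehouse", "record crate42 -> retailer"]))
```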

  19. Statistical and perceptual updating: correlated impairments in right brain injury.

    Science.gov (United States)

    Stöttinger, Elisabeth; Filipowicz, Alex; Marandi, Elahe; Quehl, Nadine; Danckert, James; Anderson, Britt

    2014-06-01

    It has been hypothesized that many of the cognitive impairments commonly seen after right brain damage (RBD) can be characterized as a failure to build or update mental models. We (Danckert et al. in Neglect as a disorder of representational updating. NOVA Open Access, New York, 2012a; Cereb Cortex 22:2745-2760, 2012b) were the first to directly assess the association between RBD and updating and found that RBD patients were unable to exploit a strongly biased play strategy in their opponent in the children's game rock, paper, scissors. Given that this game required many other cognitive capacities (i.e., working memory, sustained attention, reward processing), RBD patients could have failed this task for various reasons other than a failure to update. To assess the generality of updating deficits after RBD, we had RBD, left brain-damaged (LBD) patients and healthy controls (HCs) describe line drawings that evolved gradually from one figure (e.g., rabbit) to another (e.g., duck) in addition to the RPS updating task. RBD patients took significantly longer to alter their perceptual report from the initial object to the final object than did LBD patients and HCs. Although both patient groups performed poorly on the RPS task, only the RBD patients showed a significant correlation between the two, very different, updating tasks. We suggest these data indicate a general deficiency in the ability to update mental representations following RBD.

  20. Compilation of current high-energy-physics experiments

    International Nuclear Information System (INIS)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1980-04-01

    This is the third edition of a compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and ten participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about January 1980, and (2) had not completed taking of data by 1 January 1976

  1. Assessment of the current status of basic nuclear data compilations

    International Nuclear Information System (INIS)

    1987-03-01

    The Panel on Basic Nuclear Data Compilations believes that it is of paramount importance to achieve as short a cycle time as is reasonably possible in the evaluation and publication of the A-chains. The panel, therefore, has concentrated its efforts on identifying those factors that have tended to increase the cycle time and on finding ways to remove the obstacles. An important step was made during the past year to address reduction of the size of the published evaluations - another factor that can reduce cycle time. The Nuclear Structure and Decay Data (NSDD) network adopted new format guidelines, which generated a 30% reduction by eliminating redundancy and/or duplication. A current problem appears to be the rate at which the A-chains are being evaluated, which, on the average, is only about one-half of what it could be. It is hoped that the situation will improve with an increase in the number of foreign centers and an increase in efficiency as more A-chains are recycled by the same evaluator who did the previous evaluation. Progress has been made in the area of on-line access to the nuclear data files in that a subcommittee report describing the requirements of an on-line system has been produced. 2 tabs

  2. Errata and update to ;Experimental cross sections for L-shell X-ray production and ionization by protons;

    Science.gov (United States)

    Miranda, J.; Lapicki, G.

    2018-01-01

    A compilation of experimental L-shell X-ray production and ionization cross sections induced by proton impact was published recently (Miranda and Lapicki, 2014), collecting 15 439 experimental cross sections. The database covers an energy range from 10 keV to 1 GeV, and targets from 10Ne to 95Am. Corrections to several tabulated values that were in error, as well as an update including new data published after 2012 and older references not found previously, are given in the present work. The updated database increased the total number of experimental cross sections by 3.1% to 15 921. A new analysis of the total number of experimental points per year shows that the possible saturation in the cumulative total number of data is increased to 15 950 ± 110 points.

  3. Switzerland; Financial Sector Assessment Program: Factual Update: Insurance Sector Market and Regulatory Developments

    OpenAIRE

    International Monetary Fund

    2007-01-01

    This paper presents a factual update of the Insurance Core Principles including insurance sector market and regulatory developments for Switzerland. Regulatory reforms since 2003 have updated Switzerland’s regulatory and supervisory regime for the insurance industry to bring it in line with international best practices. The Insurance Supervision Law (ISL) has reoriented the regulatory focus and expanded the regulatory scope to include group/conglomerate supervision, corporate governance, risk...

  4. SVM Support in the Vienna Fortran Compilation System

    OpenAIRE

    Brezany, Peter; Gerndt, Michael; Sipkova, Viera

    1994-01-01

    Vienna Fortran, a machine-independent language extension to Fortran which allows the user to write programs for distributed-memory systems using global addresses, provides the forall-loop construct for specifying irregular computations that do not cause inter-iteration dependences. Compilers for distributed-memory systems generate code that is based on runtime analysis techniques and is only efficient if, in addition, aggressive compile-time optimizations are applied. Since these optimization...

  5. A Compilation of Internship Reports - 2012

    Energy Technology Data Exchange (ETDEWEB)

    Stegman M.; Morris, M.; Blackburn, N.

    2012-08-08

    This compilation documents all research projects undertaken by the 2012 summer Department of Energy - Workforce Development for Teachers and Scientists interns during their internship program at Brookhaven National Laboratory.

  6. Compilation of results 1987

    International Nuclear Information System (INIS)

    1987-01-01

    This compilation presents, in concentrated form, reports on research and development within the nuclear energy field covering a two-and-a-half-year period. The preceding report was issued in December 1984. The projects are presented with title, project number, responsible unit, contact person, and short result reports. The result reports consist of short summaries of each project. (L.F.)

  7. DOE interpretations Guide to OSH standards. Update to the Guide

    Energy Technology Data Exchange (ETDEWEB)

    1994-03-31

    Reflecting Secretary O'Leary's focus on occupational safety and health, the Office of Occupational Safety is pleased to provide you with the latest update to the DOE Interpretations Guide to OSH Standards. This Guide was developed in cooperation with the Occupational Safety and Health Administration, which continued its support during this last revision by facilitating access to the interpretations found on the OSHA Computerized Information System (OCIS). This March 31, 1994 update contains 123 formal interpretation letters written by OSHA. As a result of the unique requests received by the 1-800 Response Line, this update also contains 38 interpretations developed by DOE. This new occupational safety and health information adds still more important guidance to the four volume reference set that you presently have in your possession.

  10. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    Science.gov (United States)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique to provide rapid recovery from transient processor failures and was implemented in hardware by researchers and also in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations were conducted which indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.

  11. The HITRAN2016 molecular spectroscopic database

    Energy Technology Data Exchange (ETDEWEB)

    Gordon, I. E.; Rothman, L. S.; Hill, C.; Kochanov, R. V.; Tan, Y.; Bernath, P. F.; Birk, M.; Boudon, V.; Campargue, A.; Chance, K. V.; Drouin, B. J.; Flaud, J. -M.; Gamache, R. R.; Hodges, J. T.; Jacquemart, D.; Perevalov, V. I.; Perrin, A.; Shine, K. P.; Smith, M. -A. H.; Tennyson, J.; Toon, G. C.; Tran, H.; Tyuterev, V. G.; Barbe, A.; Császár, A. G.; Devi, V. M.; Furtenbacher, T.; Harrison, J. J.; Hartmann, J. -M.; Jolly, A.; Johnson, T. J.; Karman, T.; Kleiner, I.; Kyuberis, A. A.; Loos, J.; Lyulin, O. M.; Massie, S. T.; Mikhailenko, S. N.; Moazzen-Ahmadi, N.; Müller, H. S. P.; Naumenko, O. V.; Nikitin, A. V.; Polyansky, O. L.; Rey, M.; Rotger, M.; Sharpe, S. W.; Sung, K.; Starikova, E.; Tashkun, S. A.; Auwera, J. Vander; Wagner, G.; Wilzewski, J.; Wcisło, P.; Yu, S.; Zak, E. J.

    2017-12-01

    This paper describes the contents of the 2016 edition of the HITRAN molecular spectroscopic compilation. The new edition replaces the previous HITRAN edition of 2012 and its updates during the intervening years. The HITRAN molecular absorption compilation is comprised of five major components: the traditional line-by-line spectroscopic parameters required for high-resolution radiative-transfer codes, infrared absorption cross-sections for molecules not yet amenable to representation in a line-by-line form, collision-induced absorption data, aerosol indices of refraction, and general tables such as partition sums that apply globally to the data. The new HITRAN is greatly extended in terms of accuracy, spectral coverage, additional absorption phenomena, added line-shape formalisms, and validity. Moreover, molecules, isotopologues, and perturbing gases have been added that address the issues of atmospheres beyond the Earth. Of considerable note, experimental IR cross-sections for almost 200 additional significant molecules have been added to the database.

  12. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    The material contained in this compilation is sorted according to eight subject categories: 1. General Compilations; 2. Basic Isotopic Properties; 3. Nuclear Structure Properties; 4. Nuclear Decay Processes: Half-lives, Energies and Spectra; 5. Nuclear Decay Processes: Gamma-rays; 6. Nuclear Decay Processes: Fission Products; 7. Nuclear Decay Processes: (Others); 8. Atomic Processes

  13. The Cardassian expansion revisited: constraints from updated Hubble parameter measurements and type Ia supernova data

    Science.gov (United States)

    Magaña, Juan; Amante, Mario H.; Garcia-Aspeitia, Miguel A.; Motta, V.

    2018-05-01

    Motivated by an updated compilation of observational Hubble data (OHD) that consists of 51 points in the redshift range 0.07 < z < 2.36, we constrain the original Cardassian (OC) and modified polytropic Cardassian (MPC) models with type Ia supernova (SN Ia) data, using the compressed and full joint-light-analysis (JLA) samples (Betoule et al.). We also perform a joint analysis using the combination OHD plus compressed JLA. Our results show that the OC and MPC models are in agreement with the standard cosmology and naturally introduce a cosmological-constant-like extra term in the canonical Friedmann equation with the capability of accelerating the Universe without dark energy.

  14. 1988 Bulletin compilation and index

    International Nuclear Information System (INIS)

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information

  15. 1988 Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  16. Automating Visualization Service Generation with the WATT Compiler

    Science.gov (United States)

    Bollig, E. F.; Lyness, M. D.; Erlebacher, G.; Yuen, D. A.

    2007-12-01

    As tasks and workflows become increasingly complex, software developers are devoting increasing attention to automation tools. Among many examples, the Automator tool from Apple collects components of a workflow into a single script, with very little effort on the part of the user. Tasks are most often described as a series of instructions. The granularity of the tasks dictates the tools to use. Compilers translate fine-grained instructions to assembler code, while scripting languages (ruby, perl) are used to describe a series of tasks at a higher level. Compilers can also be viewed as transformational tools: a cross-compiler can translate executable code written on one computer to assembler code understood on another, while transformational tools can translate from one high-level language to another. We are interested in creating visualization web services automatically, starting from stand-alone VTK (Visualization Toolkit) code written in Tcl. To this end, using the OCaml programming language, we have developed a compiler that translates Tcl into C++, including all the stubs, classes and methods to interface with gSOAP, a C++ implementation of the Soap 1.1/1.2 protocols. This compiler, referred to as the Web Automation and Translation Toolkit (WATT), is the first step towards automated creation of specialized visualization web services without input from the user. The WATT compiler seeks to automate all aspects of web service generation, including the transport layer, the division of labor and the details related to interface generation. The WATT compiler is part of ongoing efforts within the NSF funded VLab consortium [1] to facilitate and automate time-consuming tasks for the science related to understanding planetary materials. Through examples of services produced by WATT for the VLab portal, we will illustrate features, limitations and the improvements necessary to achieve the ultimate goal of complete and transparent automation in the generation of web

  17. Compilation of current high energy physics experiments - Sept. 1978

    Energy Technology Data Exchange (ETDEWEB)

    Addis, L.; Odian, A.; Row, G. M.; Ward, C. E. W.; Wanderer, P.; Armenteros, R.; Joos, P.; Groves, T. H.; Oyanagi, Y.; Arnison, G. T. J.; Antipov, Yu; Barinov, N.

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche. (RWR)

  18. Compilation of data for radionuclide transport analysis

    International Nuclear Information System (INIS)

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other scenarios selected. The data required by the selected near field, geosphere and biosphere models are given and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed and the uncertainty is qualitatively addressed for data used in the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic in order not to underestimate the radionuclide release rates. It is judged that this approach, combined with the selected calculation cases, will illustrate the effects of uncertainties in processes and events that affect the evolution of the system, as well as in the quantitative data that describe it. The biosphere model allows for probabilistic calculations, and the uncertainty in input data is quantified by giving minimum, maximum and mean values as well as the type of probability distribution function

  19. Compilation of data for radionuclide transport analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other scenarios selected. The data required by the selected near field, geosphere and biosphere models are given and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed and the uncertainty is qualitatively addressed for data used in the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic in order not to underestimate the radionuclide release rates. It is judged that this approach, combined with the selected calculation cases, will illustrate the effects of uncertainties in processes and events that affect the evolution of the system, as well as in the quantitative data that describe it. The biosphere model allows for probabilistic calculations, and the uncertainty in input data is quantified by giving minimum, maximum and mean values as well as the type of probability distribution function.

  20. Updated Life-Cycle Assessment of Aluminum Production and Semi-fabrication for the GREET Model

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Qiang [Argonne National Lab. (ANL), Argonne, IL (United States); Kelly, Jarod C. [Argonne National Lab. (ANL), Argonne, IL (United States); Burnham, Andrew [Argonne National Lab. (ANL), Argonne, IL (United States); Elgowainy, Amgad [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-09-01

    This report serves as an update for the life-cycle analysis (LCA) of aluminum production based on the most recent data representing the state-of-the-art of the industry in North America. The 2013 Aluminum Association (AA) LCA report on the environmental footprint of semifinished aluminum products in North America provides the basis for the update (The Aluminum Association, 2013). The scope of this study covers primary aluminum production, secondary aluminum production, as well as aluminum semi-fabrication processes including hot rolling, cold rolling, extrusion and shape casting. This report focuses on energy consumptions, material inputs and criteria air pollutant emissions for each process from the cradle-to-gate of aluminum, which starts from bauxite extraction, and ends with manufacturing of semi-fabricated aluminum products. The life-cycle inventory (LCI) tables compiled are to be incorporated into the vehicle cycle model of Argonne National Laboratory’s Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET) Model for the release of its 2015 version.

  1. DrawCompileEvolve: Sparking interactive evolutionary art with human creations

    DEFF Research Database (Denmark)

    Zhang, Jinhong; Taarnby, Rasmus; Liapis, Antonios

    2015-01-01

    This paper presents DrawCompileEvolve, a web-based drawing tool which allows users to draw simple primitive shapes, group them together or define patterns in their groupings (e.g. symmetry, repetition). The user’s vector drawing is then compiled into an indirectly encoded genetic representation, which can be evolved interactively, allowing the user to change the image’s colors, patterns and ultimately transform it. The human artist has direct control while drawing the initial seed of an evolutionary run and indirect control while interactively evolving it, thus making DrawCompileEvolve a mixed

  2. Vectorization vs. compilation in query execution

    NARCIS (Netherlands)

    J. Sompolski (Juliusz); M. Zukowski (Marcin); P.A. Boncz (Peter)

    2011-01-01

    Compiling database queries into executable (sub-)programs provides substantial benefits compared to traditional interpreted execution. Many of these benefits, such as reduced interpretation overhead, better instruction code locality, and providing opportunities to use SIMD

  3. Production compilation : A simple mechanism to model complex skill acquisition

    NARCIS (Netherlands)

    Taatgen, N.A.; Lee, F.J.

    2003-01-01

    In this article we describe production compilation, a mechanism for modeling skill acquisition. Production compilation has been developed within the ACT-Rational (ACT-R; J. R. Anderson, D. Bothell, M. D. Byrne, & C. Lebiere, 2002) cognitive architecture and consists of combining and specializing

  4. Method for updating pipelined, single port Z-buffer by segments on a scan line

    International Nuclear Information System (INIS)

    Hannah, M.R.

    1990-01-01

    This patent describes, in a raster-scan, computer-controlled video display system for presenting an image to an observer, having a Z-buffer for storing Z values and a frame buffer for storing pixel values, a method for updating the Z-buffer with new Z values to replace old Z values. It comprises: calculating a new pixel value and a new Z value for each pixel location in a plurality of pixel locations, performing a Z comparison for each new Z value by comparing the old Z value with the new Z value for each pixel location, the Z comparison being performed sequentially in one direction through the plurality of pixel locations, and updating the Z-buffer only after the Z comparison produces a combination of a fail condition for a current pixel location subsequent to producing a pass condition for a pixel location immediately preceding the current pixel location
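
    The claim amounts to a segment-wise write policy: scan in one direction, open a segment on the first depth-test pass, and flush it to the buffers once a fail follows a run of passes (or the line ends). A minimal sketch of that bookkeeping, assuming simple list-backed buffers and ignoring the pipelining and single-port timing that the patent actually concerns:

```python
# Sketch of segment-wise Z-buffer updating on one scan line.
# Hypothetical simplification: real hardware pipelines these steps
# through a single-port memory; here we only model the comparisons.

def update_scanline(zbuf, fbuf, new_z, new_pix):
    """Write new pixels only where they pass the depth test,
    flushing each passing run (segment) as a block."""
    seg_start = None  # start index of the current passing segment
    for x in range(len(zbuf)):
        passed = new_z[x] < zbuf[x]      # nearer-wins depth test
        if passed and seg_start is None:
            seg_start = x                # segment opens on first pass
        elif not passed and seg_start is not None:
            # fail after a run of passes: flush the whole segment
            zbuf[seg_start:x] = new_z[seg_start:x]
            fbuf[seg_start:x] = new_pix[seg_start:x]
            seg_start = None
    if seg_start is not None:            # segment ran to end of line
        zbuf[seg_start:] = new_z[seg_start:]
        fbuf[seg_start:] = new_pix[seg_start:]

zb = [5, 5, 5, 5]; fb = [0, 0, 0, 0]
update_scanline(zb, fb, new_z=[3, 3, 9, 2], new_pix=[7, 7, 7, 7])
print(zb, fb)  # [3, 3, 5, 2] [7, 7, 0, 7]
```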

  5. 32 CFR 806b.19 - Information compiled in anticipation of civil action.

    Science.gov (United States)

    2010-07-01

    32 CFR 806b.19 (National Defense; Department of Defense, Department of the Air Force): Information compiled in anticipation of civil action. Withhold records compiled in connection with a civil action or...

  6. Asian collaboration on nuclear reaction data compilation

    International Nuclear Information System (INIS)

    Aikawa, Masayuki; Furutachi, Naoya; Kato, Kiyoshi; Makinaga, Ayano; Devi, Vidya; Ichinkhorloo, Dagvadorj; Odsuren, Myagmarjav; Tsubakihara, Kohsuke; Katayama, Toshiyuki; Otuka, Naohiko

    2013-01-01

    Nuclear reaction data are essential for research and development in nuclear engineering, radiation therapy, nuclear physics and astrophysics. Experimental data must be compiled in a database and be accessible to nuclear data users. One of the nuclear reaction databases is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC) under the auspices of the International Atomic Energy Agency. Recently, collaboration among the Asian NRDC members is being further developed under the support of the Asia-Africa Science Platform Program of the Japan Society for the Promotion of Science. We report the activity for three years to develop the Asian collaboration on nuclear reaction data compilation. (author)

  7. Fault-tolerant digital microfluidic biochips compilation and synthesis

    CERN Document Server

    Pop, Paul; Stuart, Elena; Madsen, Jan

    2016-01-01

    This book describes for researchers in the fields of compiler technology, design and test, and electronic design automation the new area of digital microfluidic biochips (DMBs), and thus offers a new application area for their methods. The authors present a routing-based model of operation execution, along with several associated compilation approaches, which progressively relax the assumption that operations execute inside fixed rectangular modules. Since operations can experience transient faults during the execution of a bioassay, the authors show how to use both offline (design time) and online (runtime) recovery strategies. The book also presents methods for the synthesis of fault-tolerant application-specific DMB architectures. It presents the current models used for the research on compilation and synthesis techniques of DMBs in a tutorial fashion, and includes a set of “benchmarks”, which are presented in great detail and includes the source code of most of the t...

  8. PGHPF – An Optimizing High Performance Fortran Compiler for Distributed Memory Machines

    Directory of Open Access Journals (Sweden)

    Zeki Bozkus

    1997-01-01

    Full Text Available High Performance Fortran (HPF) is the first widely supported, efficient, and portable parallel programming language for shared and distributed memory systems. HPF is realized through a set of directive-based extensions to Fortran 90. It enables application developers and Fortran end-users to write compact, portable, and efficient software that will compile and execute on workstations, shared memory servers, clusters, traditional supercomputers, or massively parallel processors. This article describes a production-quality HPF compiler for a set of parallel machines. Compilation techniques such as data and computation distribution, communication generation, run-time support, and optimization issues are elaborated as the basis for an HPF compiler implementation on distributed memory machines. The performance of this compiler on benchmark programs demonstrates that high efficiency can be achieved executing HPF code on parallel architectures.

  9. A survey of compiler development aids. [concerning lexical, syntax, and semantic analysis

    Science.gov (United States)

    Buckles, B. P.; Hodges, B. C.; Hsia, P.

    1977-01-01

    A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five selected phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph-theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools that were developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention - SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed, some of their shortcomings were identified, and some of the areas of research currently in progress were inspected.
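
    The five-phase view of compilation lends itself to a toy demonstration. The sketch below, which is illustrative and not drawn from any of the surveyed tools, shows lexical analysis, a trivial syntax analysis to postfix, and stack-machine code generation for flat +/- expressions; semantic analysis and optimization are omitted for brevity:

```python
import re

# Toy pipeline for expressions like "1 + 2 - 3": lexer -> parser -> codegen.
# Semantic analysis and optimization, the other two phases named above,
# are omitted to keep the sketch short.

TOKEN = re.compile(r"\s*(?:(\d+)|([+-]))")

def lex(src):
    """Lexical analysis: character stream -> token stream."""
    return [("NUM", int(n)) if n else ("OP", op)
            for n, op in TOKEN.findall(src)]

def parse(tokens):
    """Syntax analysis: left-associative +/- chain -> postfix order."""
    out = [tokens[0]]
    for i in range(1, len(tokens), 2):
        out.append(tokens[i + 1])  # operand
        out.append(tokens[i])      # operator, emitted after its operands
    return out

def codegen(postfix):
    """Code generation: postfix -> instructions for a stack machine."""
    return [f"PUSH {v}" if k == "NUM" else f"APPLY {v}" for k, v in postfix]

print(codegen(parse(lex("1 + 2 - 3"))))
# ['PUSH 1', 'PUSH 2', 'APPLY +', 'PUSH 3', 'APPLY -']
```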

  10. abc: The AspectBench Compiler for AspectJ

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    abc is an extensible, optimising compiler for AspectJ. It has been designed as a workbench for experimental research in aspect-oriented programming languages and compilers. We outline a programme of research in these areas, and we review how abc can help in achieving those research goals...

  11. The Katydid system for compiling KEE applications to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    Components of a system known as Katydid are developed in an effort to compile knowledge-based systems developed in a multimechanism integrated environment (KEE) to Ada. The Katydid core is an Ada library supporting KEE object functionality, and the other elements include a rule compiler, a LISP-to-Ada translator, and a knowledge-base dumper. Katydid employs translation mechanisms that convert LISP knowledge structures and rules to Ada and utilizes basic prototypes of a run-time KEE object-structure library module for Ada. Preliminary results include the semiautomatic compilation of portions of a simple expert system to run in an Ada environment with the described algorithms. It is suggested that Ada can be employed for AI programming and implementation, and the Katydid system is being developed to include concurrency and synchronization mechanisms.

  12. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.; Tick, E. (comp.)

    1991-11-01

    This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31--November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purpose of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working in compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led to different communities emphasizing different species of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.

  13. DLVM: A modern compiler infrastructure for deep learning systems

    OpenAIRE

    Wei, Richard; Schwartz, Lane; Adve, Vikram

    2017-01-01

    Deep learning software demands reliability and performance. However, many of the existing deep learning frameworks are software libraries that act as an unsafe DSL in Python and a computation graph interpreter. We present DLVM, a design and implementation of a compiler infrastructure with a linear algebra intermediate representation, algorithmic differentiation by adjoint code generation, domain-specific optimizations and a code generator targeting GPU via LLVM. Designed as a modern compiler ...

  14. A ‘Social Form Of Knowledge’ in Practice: Unofficial Compiling of 1960s Pop Music on CD-R

    Directory of Open Access Journals (Sweden)

    Paul Martin

    2012-01-01

    Full Text Available In this article I explore the ‘unofficial’ (and technically illegal) compiling of marginally known 1960s pop records on Compact Disc Recordable (CD-R). I do so by situating it within the proposition by the late Raphael Samuel that history is ‘social knowledge’ and a practice rather than a profession. I propose that this compiling activity exemplifies this proposition. The core of the paper is centred on a 2007 survey which I conducted via three on-line 1960s music enthusiast discussion forums. I draw on the sixteen responses to demonstrate how the motivations, values and intentions of those respondents engaging in the practice of CD-R compiling are historically and socially centred. In doing so, I seek to problematise the music industry’s undifferentiated condemnation of all copying as theft. I do so by showing how, far from stealing, these CD-R compilers are adding to the musical social knowledge of 1960s pop and rock music. I further situate them within a longer lineage of ‘unofficial listening’ dating back to at least the 1930s. In using the term ‘unofficial’ in both a legal and public historical sense (e.g. to take issue with a received narrative), I point to wider definitions of what historically has or has not been musically ‘official’ to listen to. I seek also to point to the practice of CD-R compiling as a historical ‘moment’ in technological change, which might otherwise go unremarked upon as the CD-R itself heads towards utilitarian obsolescence. Although the issues and concepts raised in the paper can be little more than pointed to, it is hoped it might act as one platform for the historical engagement with a subject more commonly discussed in sociological terms. As public historians we should be reflexive and inter-disciplinary and it is with this mindset that this article is written.

  15. Cross-compilation of ATLAS online software to the PowerPC-VxWorks system

    International Nuclear Information System (INIS)

    Tian Yuren; Li Jin; Ren Zhengyu; Zhu Kejun

    2005-01-01

    BES III selected the ATLAS online software as the framework of its run-control system. BES III applies the PowerPC-VxWorks system on its front-end readout system, so it is necessary to cross-compile this software to the PowerPC-VxWorks system. The article demonstrates several aspects related to this project, such as the structure and organization of the ATLAS online software, the application of the CMT tool while cross-compiling, the selection and configuration of the cross-compiler, and methods to solve various problems due to the differences in compiler and operating system. The software, after cross-compiling, runs normally and makes up a complete run-control system with the software running on the Linux system. (authors)

  16. A Language for Specifying Compiler Optimizations for Generic Software

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, Jeremiah J. [Indiana Univ., Bloomington, IN (United States)

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.

  17. Compiling the parallel programming language NestStep to the CELL processor

    OpenAIRE

    Holm, Magnus

    2010-01-01

    The goal of this project is to create a source-to-source compiler which will translate NestStep code to C code. The compiler's job is to replace NestStep constructs with a series of function calls to the NestStep runtime system. NestStep is a parallel programming language extension based on the BSP model. It adds constructs for parallel programming on top of an imperative programming language. For this project, only constructs extending the C language are relevant. The output code will compil...

  18. Compilation of solar abundance data

    International Nuclear Information System (INIS)

    Hauge, Oe.; Engvold, O.

    1977-01-01

    Interest in the previous compilations of solar abundance data by the same authors (ITA--31 and ITA--39) has led to this third, revised edition. Solar abundance data of 67 elements are tabulated and in addition upper limits for the abundances of 5 elements are listed. References are made to 167 papers. A recommended abundance value is given for each element. (JIW)

  19. Review and updates of the risk assessment for advanced test reactor operations for operating events and experience

    International Nuclear Information System (INIS)

    Atkinson, S.A.

    1996-01-01

    Annual or biannual reviews of the operating history of the Advanced Test Reactor (ATR) at the Idaho National Engineering Laboratory (INEL) have been conducted for the purpose of reviewing and updating the ATR probabilistic safety assessment (PSA) for operating events and operating experience since the first compilation of plant-specific experience data for the ATR PSA which included data for operation from initial power operation in 1969 through 1988. This technical paper briefly discusses the means and some results of these periodic reviews of operating experience and their influence on the ATR PSA

  20. Updated US and Canadian normalization factors for TRACI 2.1

    DEFF Research Database (Denmark)

    Ryberg, Morten; Vieira, Marisa D. M.; Zgola, Melissa

    2014-01-01

    When LCA practitioners perform LCAs, the interpretation of the results can be difficult without a reference point to benchmark the results. Hence, normalization factors are important for relating results to a common reference. The main purpose of this paper was to update the normalization factors...... for the US and US-Canadian regions. The normalization factors were used for highlighting the most contributing substances, thereby enabling practitioners to put more focus on important substances, when compiling the inventory, as well as providing them with normalization factors reflecting the actual...... situation. Normalization factors were calculated using characterization factors from the TRACI 2.1 LCIA model. The inventory was based on US databases on emissions of substances. The Canadian inventory was based on a previous inventory with 2005 as reference, in this inventory the most significant...
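
    Normalization divides each characterized impact score by a reference value for the region, putting all impact categories on a common scale. A minimal sketch of that step, with made-up numbers rather than the actual TRACI 2.1 factors:

```python
# Normalization step in LCIA: characterized score / reference value.
# All numbers below are hypothetical, not actual TRACI 2.1 factors.

characterized = {                     # impact per functional unit
    "global warming (kg CO2-eq)": 120.0,
    "acidification (kg SO2-eq)": 0.8,
}
normalization = {                     # annual per-capita reference
    "global warming (kg CO2-eq)": 24_000.0,
    "acidification (kg SO2-eq)": 91.0,
}

normalized = {k: characterized[k] / normalization[k] for k in characterized}
for k, v in normalized.items():
    print(f"{k}: {v:.2e} person-years")
```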

  1. Compilation of piping benchmark problems - Cooperative international effort

    Energy Technology Data Exchange (ETDEWEB)

    McAfee, W J [comp.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations.

  2. Compiling knowledge-based systems from KEE to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications - most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBS developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights on Ada as an artificial intelligence programming language, potential solutions of some of the engineering difficulties encountered in early work, and inspiration on future system development.

  3. Compilation of piping benchmark problems - Cooperative international effort

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations

  4. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part IV: Normal and Inverted Letter 'h' and 'H' Architecture.

  5. Compilation and analysis of Escherichia coli promoter DNA sequences.

    OpenAIRE

    Hawley, D K; McClure, W R

    1983-01-01

    The DNA sequences of 168 promoter regions (-50 to +10) for Escherichia coli RNA polymerase were compiled. The complete listing was divided into two groups depending upon whether or not the promoter had been defined by genetic (promoter mutations) or biochemical (5' end determination) criteria. A consensus promoter sequence based on homologies among 112 well-defined promoters was determined that was in substantial agreement with previous compilations. In addition, we have tabulated 98 promoter ...

  6. Installation of a new Fortran compiler and effective programming method on the vector supercomputer

    International Nuclear Information System (INIS)

    Nemoto, Toshiyuki; Suzuki, Koichiro; Watanabe, Kenji; Machida, Masahiko; Osanai, Seiji; Isobe, Nobuo; Harada, Hiroo; Yokokawa, Mitsuo

    1992-07-01

    The Fortran compiler, version 10, has been replaced with the new version 12 (V12) on the Fujitsu computer system at JAERI since May 1992. A benchmark test of the performance of the V12 compiler was carried out with 16 representative nuclear codes in advance of the installation of the compiler. The compiler improved performance by a factor of 1.13 on average. The effect of the enhanced functions of the compiler and its compatibility with the nuclear codes were also examined. An assistant tool for vectorization, TOP10EX, was developed. In this report, the results of the evaluation of the V12 compiler and the usage of the tools for vectorization are presented. (author)

  7. Impact of line parameter database, continuum absorption, full grind configuration, and L1B update on GOSAT TIR methane retrieval

    Science.gov (United States)

    Yamada, A.; Saitoh, N.; Nonogaki, R.; Imasu, R.; Shiomi, K.; Kuze, A.

    2016-12-01

    The thermal infrared (TIR) band of the Thermal and Near-infrared Sensor for Carbon Observation Fourier Transform Spectrometer (TANSO-FTS) onboard the Greenhouse Gases Observing Satellite (GOSAT) observes the CH4 profile in the wavenumber range from 1210 cm-1 to 1360 cm-1, including the CH4 ν4 band. The current retrieval algorithm (V1.0) uses LBLRTM V12.1 with the AER V3.1 line database to calculate optical depth. LBLRTM V12.1 includes the MT_CKD 2.5.2 model to calculate continuum absorption. The continuum absorption has large uncertainty, especially in the temperature-dependent coefficient, between the BPS and MT_CKD models in the wavenumber region of 1210-1250 cm-1 (Paynter and Ramaswamy, 2014). The purpose of this study is to assess the impact on CH4 retrieval of the line parameter databases and the uncertainty of continuum absorption. We used the AER V1.0 database, the HITRAN2004 database, the HITRAN2008 database, the AER V3.2 database, and the HITRAN2012 database (Rothman et al. 2005, 2009, and 2013; Clough et al., 2005). The AER V1.0 database is based on HITRAN2000. The CH4 line parameters of the AER V3.1 and V3.2 databases are developed from HITRAN2008, including updates until May 2009, with line mixing parameters. We compared the retrieved CH4 with the HIPPO CH4 observations (Wofsy et al., 2012). The difference for AER V3.2 was the smallest, 24.1 ± 45.9 ppbv. The differences for AER V1.0, HITRAN2004, HITRAN2008, and HITRAN2012 were 35.6 ± 46.5 ppbv, 37.6 ± 46.3 ppbv, 32.1 ± 46.1 ppbv, and 35.2 ± 46.0 ppbv, respectively. Comparing the AER V3.2 case to the HITRAN2008 case, the line coupling effect reduced the difference by 8.0 ppbv. Median values of the residual difference from HITRAN2008 to AER V1.0, HITRAN2004, AER V3.2, and HITRAN2012 were 0.6 K, 0.1 K, -0.08 K, and 0.08 K, respectively, while median values of the transmittance difference were less than 0.0003, and the transmittance differences have small wavenumber dependence. We also discuss the retrieval error from the uncertainty of the continuum absorption, the test of full grid

  8. Compiler-Agnostic Function Detection in Binaries

    NARCIS (Netherlands)

    Andriesse, D.A.; Slowinska, J.M.; Bos, H.J.

    2017-01-01

    We propose Nucleus, a novel function detection algorithm for binaries. In contrast to prior work, Nucleus is compiler-agnostic, and does not require any learning phase or signature information. Instead of scanning for signatures, Nucleus detects functions at the Control Flow Graph-level, making it

  9. An on-line modified least-mean-square algorithm for training neurofuzzy controllers.

    Science.gov (United States)

    Tan, Woei Wan

    2007-04-01

    The problem hindering the use of data-driven modelling methods for training controllers on-line is the lack of control over the amount by which the plant is excited. As the operating schedule determines the information available on-line, the knowledge of the process may degrade if the setpoint remains constant for an extended period. This paper proposes an identification algorithm that alleviates "learning interference" by incorporating fuzzy theory into the normalized least-mean-square update rule. The ability of the proposed methodology to achieve faster learning is examined by employing the algorithm to train a neurofuzzy feedforward controller for controlling a liquid level process. Since the proposed identification strategy has similarities with the normalized least-mean-square update rule and the recursive least-square estimator, the on-line learning rates of these algorithms are also compared.
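
    For reference, the normalized least-mean-square update that the proposed algorithm builds on has a compact standard form. The sketch below implements plain NLMS only; the fuzzy modification is not specified in the abstract and is not attempted here, and all names and values are illustrative:

```python
import numpy as np

def nlms_step(w, x, d, mu=0.5, eps=1e-6):
    """One normalized LMS update.
    w: weight vector, x: input vector, d: desired output.
    The step size is normalized by the input power so the effective
    learning rate is insensitive to the signal's scale."""
    e = d - w @ x                        # prediction error
    w = w + mu * e * x / (eps + x @ x)   # normalized gradient step
    return w, e

# Toy identification of a 3-tap system (noiseless, for illustration).
rng = np.random.default_rng(0)
true_w = np.array([0.5, -0.2, 0.1])
w = np.zeros(3)
for _ in range(200):
    x = rng.standard_normal(3)
    d = true_w @ x
    w, e = nlms_step(w, x, d)
print(w)  # approaches true_w
```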

  10. High-resolution Laboratory Measurements of Coronal Lines near the Fe IX Line at 171 Å

    Science.gov (United States)

    Beiersdorfer, Peter; Träbert, Elmar

    2018-02-01

    We present high-resolution laboratory measurements in the spectral region between 165 and 175 Å that focus on the emission from various ions of C, O, F, Ne, S, Ar, Fe, and Ni. This wavelength region is centered on the λ171 Fe IX channel of the Atmospheric Imaging Assembly on the Solar Dynamics Observatory, and we place special emphasis on the weaker emission lines of Fe IX predicted in this region. In general, our measurements show a multitude of weak lines missing in the current databases, where the emission lines of Ni are probably most in need of further identification and reclassification. We also find that the wavelengths of some of the known lines need updating. Using the multi-reference Møller–Plesset method for wavelength predictions and collisional-radiative modeling of the line intensities, we have made tentative assignments of more than a dozen lines to the spectrum of Fe IX, some of which have formerly been identified as Fe VII, Fe XIV, or Fe XVI lines. Several Fe features remain unassigned, although they appear to be either Fe VII or Fe X lines. Further work will be needed to complete and correct the spectral line lists in this wavelength region.

  11. Regulatory and technical reports (abstract index journal): Annual compilation for 1987

    International Nuclear Information System (INIS)

    1988-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  12. Compilation and evaluation of high energy γ-ray standards from nuclear reactions. Work performed under the coordinated research project 'Update of X- and γ-ray decay data standards for detector calibration'

    International Nuclear Information System (INIS)

    Marcinkowski, A.; Marianski, B.

    1999-02-01

    The report presents the following aspects needed for the compilation and evaluation of high energy γ-ray standards from nuclear reactions: evaluation of emission probabilities of γ-rays with energies 4.44 MeV and 15.11 MeV from ¹²C*, preparation of the list of reactions suitable for production of the above mentioned excited radionuclide, and compilation and evaluation of cross sections for these reactions, including inelastic proton scattering on ¹²C and radiative capture on ¹¹B

  13. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part III: B-Shaped Architecture with Vertical Well in the Upper Layer.

  14. Regulatory and technical reports. Compilation for second quarter 1982, April to June

    International Nuclear Information System (INIS)

    1982-08-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. A detailed explanation of the entries precedes each index

  15. Dynamic stabbing queries with sub-logarithmic local updates for overlapping intervals : Proc. 12th International Computer Science Symposium in Russia

    NARCIS (Netherlands)

    Khramtcova, Elena; Löffler, Maarten

    2017-01-01

    We present a data structure to maintain a set of intervals on the real line subject to fast insertions and deletions of the intervals, stabbing queries, and local updates. Intuitively, a local update replaces an interval by another one of roughly the same size and location. We investigate whether

  16. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  17. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    We demonstrate the ability of our tool to transform code, and suggest code refactorings that increase its amenability to optimization. The preliminary results show that, with our tool-set, automatic loop parallelization with the GNU C compiler, gcc, yields 8.6x best-case speedup over...

  18. Regulatory and technical reports: compilation for third quarter 1982 July-September

    International Nuclear Information System (INIS)

    1982-11-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. These precede the following indexes: Contractor Report Number Index; Personal Author Index; Subject Index; NRC Originating Organization Index (Staff Reports); NRC Contract Sponsor Index (Contractor Reports); Contractor Index; and Licensed Facility Index

  19. Automating Phase Change Lines and Their Labels Using Microsoft Excel(R).

    Science.gov (United States)

    Deochand, Neil

    2017-09-01

    Many researchers have rallied against drawn-in graphical elements and offered ways to avoid them, especially regarding the insertion of phase change lines (Deochand, Costello, & Fuqua, 2015; Dubuque, 2015; Vanselow & Bourret, 2012). However, few have offered a solution for automating the phase labels, which are often utilized in behavior-analytic graphical displays (Deochand et al., 2015). Despite the fact that Microsoft Excel® is extensively utilized by behavior analysts, solutions to issues in our graphing practices are not always apparent or user-friendly. Considering that the insertion of phase change lines and their labels constitutes a repetitious and laborious endeavor, any reduction in the steps needed to produce these graphical elements could offer substantial time savings to the field. The purpose of this report is to provide an updated way (and templates in the supplemental materials) to add phase change lines with their respective labels, which stay embedded in the graph when they are moved or updated.
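
    Outside of Excel, the same idea, phase change lines whose labels stay tied to the data coordinates, can be sketched programmatically. The following minimal matplotlib example is only an analogue of the report's Excel templates; the data and phase boundaries are invented for illustration.

      import matplotlib.pyplot as plt

      sessions  = list(range(1, 16))
      responses = [2, 3, 2, 4, 3, 6, 8, 9, 8, 10, 9, 5, 4, 3, 3]
      boundaries = [5.5, 11.5]                  # phase change x-positions
      labels     = {1.0: "Baseline", 6.0: "Intervention", 12.0: "Return to Baseline"}

      fig, ax = plt.subplots()
      ax.plot(sessions, responses, "ko-")
      for x in boundaries:
          ax.axvline(x, color="black", linestyle="--")   # phase change line
      for x, text in labels.items():
          # labels anchored in data coordinates stay attached to their phase
          # when the chart is rescaled or the data are updated
          ax.text(x, max(responses) + 1, text, fontsize=8)
      ax.set_ylim(0, max(responses) + 3)
      ax.set_xlabel("Sessions")
      ax.set_ylabel("Responses")
      plt.show()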

  20. Linux command line and shell scripting bible

    CERN Document Server

    Blum, Richard

    2014-01-01

    Talk directly to your system for a faster workflow with automation capability. Linux Command Line and Shell Scripting Bible is your essential Linux guide. With detailed instruction and abundant examples, this book teaches you how to bypass the graphical interface and communicate directly with your computer, saving time and expanding capability. This third edition incorporates thirty pages of new functional examples that are fully updated to align with the latest Linux features. Beginning with command line fundamentals, the book moves into shell scripting and shows you the practical application

  1. Perspex machine: V. Compilation of C programs

    Science.gov (United States)

    Spanner, Matthew P.; Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of the Turing machine with projective geometry. The original, constructive proof used four special, perspective transformations to implement the Turing machine in projective geometry. These four transformations are now generalised and applied in a compiler, implemented in Pop11, that converts a subset of the C programming language into perspexes. This is interesting both from a geometrical and a computational point of view. Geometrically, it is interesting that program source can be converted automatically to a sequence of perspective transformations and conditional jumps, though we find that the product of homogeneous transformations with normalisation can be non-associative. Computationally, it is interesting that program source can be compiled for a Reduced Instruction Set Computer (RISC), the perspex machine, that is a Single Instruction, Zero Exception (SIZE) computer.

  2. Intraprocedural dataflow analysis for software product lines

    DEFF Research Database (Denmark)

    Brabrand, Claus; Ribeiro, Márcio; Tolêdo, Társis

    2013-01-01

    Software product lines (SPLs) developed using annotative approaches such as conditional compilation come with an inherent risk of constructing erroneous products. For this reason, it is essential to be able to analyze such SPLs. However, as dataflow analysis techniques are not able to deal with SP...... and memory characteristics on five qualitatively different SPLs. On our benchmarks, the combined analysis strategy is up to almost eight times faster than the brute-force approach....

  3. Compiling gate networks on an Ising quantum computer

    International Nuclear Information System (INIS)

    Bowdrey, M.D.; Jones, J.A.; Knill, E.; Laflamme, R.

    2005-01-01

    Here we describe a simple mechanical procedure for compiling a quantum gate network into the natural gates (pulses and delays) for an Ising quantum computer. The aim is not necessarily to generate the most efficient pulse sequence, but rather to develop an efficient compilation algorithm that can be easily implemented in large spin systems. The key observation is that it is not always necessary to refocus all the undesired couplings in a spin system. Instead, the coupling evolution can simply be tracked and then corrected at some later time. Although described within the language of NMR, the algorithm is applicable to any design of quantum computer based on Ising couplings
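
    The key observation, that coupling evolution can be tracked and corrected later rather than refocused, lends itself to a simple bookkeeping sketch. The following toy is ours, not the authors' compilation algorithm; the spin pairs, coupling constants, and delays are hypothetical.

      import numpy as np

      J = {(0, 1): 50.0, (1, 2): 45.0, (0, 2): 5.0}   # Ising couplings, Hz
      phase = {pair: 0.0 for pair in J}                # accumulated ZZ angle, rad

      def delay(t):
          # Free evolution: rather than inserting refocusing pulses, simply
          # record the ZZ phase each pair acquires during the delay.
          for pair, j in J.items():
              phase[pair] = (phase[pair] + np.pi * j * t) % (2 * np.pi)

      delay(2.0e-3)    # delays taken from some desired gate network
      delay(1.3e-3)

      # At a convenient later point, emit one correction per pair instead of
      # refocusing every coupling inside the sequence.
      for pair, theta in phase.items():
          residual = (2 * np.pi - theta) % (2 * np.pi)
          print(f"pair {pair}: apply ZZ correction of {residual:.3f} rad")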

  4. AmeriFlux Network Data Activities: updates, progress and plans

    Science.gov (United States)

    Yang, B.; Boden, T.; Krassovski, M.; Song, X.

    2013-12-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at the Oak Ridge National Laboratory serves as the long-term data repository for the AmeriFlux network. Datasets currently available include hourly or half-hourly meteorological and flux observations, biological measurement records, and synthesis data products. In this presentation, we provide an update on this network database, including a comprehensive review and evaluation of the biological data from about 70 sites, development of a new product for flux uncertainty estimates, and re-formatting of Level-2 standard files. In 2013, we also provided data support to two synthesis studies: the 2012 drought synthesis and the FACE synthesis. Issues related to data quality, and solutions in compiling datasets for these synthesis studies, will be discussed. We will also present our work plans for developing and producing other high-level products, such as derivation of phenology from the available measurements at flux sites.

  5. Using MaxCompiler for High Level Synthesis of Trigger Algorithms

    CERN Document Server

    Summers, Sioni Paris; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  6. Compilation status and research topics in Hokkaido University Nuclear Reaction Data Centre

    International Nuclear Information System (INIS)

    Aikawa, M.; Furutachi, N.; Katō, K.; Ebata, S.; Ichinkhorloo, D.; Imai, S.; Sarsembayeva, A.; Zhou, B.; Otuka, N.

    2015-01-01

    Nuclear reaction data are necessary and applicable for many application fields. The nuclear reaction data must be compiled into a database for convenient availability. One such database is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC). As a member of the NRDC, the Hokkaido University Nuclear Reaction Data Centre (JCPRG) compiles charged-particle induced reaction data and contributes about 10 percent of the EXFOR database. In this paper, we show the recent compilation status and related research topics of JCPRG. (author)

  7. 36 CFR 902.57 - Investigatory files compiled for law enforcement purposes.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Investigatory files compiled for law enforcement purposes. 902.57 Section 902.57 Parks, Forests, and Public Property PENNSYLVANIA AVENUE DEVELOPMENT CORPORATION FREEDOM OF INFORMATION ACT Exemptions From Public Access to Corporation Records § 902.57 Investigatory files compiled...

  8. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  9. Regular expressions compiler and some applications

    International Nuclear Information System (INIS)

    Saldana A, H.

    1978-01-01

    We deal with the high-level programming language of a Regular Expressions Compiler (REC). The first chapter is an introduction in which the history of the REC development and the problems related to its numerous applications are described. The syntactic and semantic rules as well as the language features are discussed just after the introduction. Concerning the applications, as examples, an adaptation is given in order to solve numerical problems and another for data manipulation. The last chapter is an exposition of ideas and techniques about the compiler construction. Examples of the adaptation to numerical problems show the applications to education, vector analysis, quantum mechanics, physics, mathematics and other sciences. The rudiments of an operating system for a minicomputer are the examples of the adaptation to symbolic data manipulation. REC is a programming language that could be applied to solve problems in almost any human activity. Handling of computer graphics, control equipment, research on languages, microprocessors and general research are some of the fields in which this programming language can be applied and developed. (author)

  10. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  11. Updated embrittlement trend curve for reactor pressure vessel steels

    International Nuclear Information System (INIS)

    Kirk, M.; Santos, C.; Eason, E.; Wright, J.; Odette, G.R.

    2003-01-01

    The reactor pressure vessels of commercial nuclear power plants are subject to embrittlement due to exposure to high energy neutrons from the core. Irradiation embrittlement of RPV belt-line materials is currently evaluated using US Regulatory Guide 1.99 Revision 2 (RG 1.99 Rev 2), which presents methods for estimating the Charpy transition temperature shift (ΔT30) at 30 ft-lb (41 J) and the drop in Charpy upper shelf energy (ΔUSE). A more recent embrittlement model, based on a broader database and more recent research results, is presented in NUREG/CR-6551. The objective of this paper is to describe the most recent update to the embrittlement model in NUREG/CR-6551, based upon additional data and increased understanding of embrittlement mechanisms. The updated ΔT30 and USE models include fluence, copper, nickel, and phosphorus content, and product form; the ΔT30 model also includes coolant temperature, irradiation time (or flux), and a long-time term. The models were developed using multi-variable surface fitting techniques, understanding of the ΔT30 mechanisms, and engineering judgment. The updated ΔT30 model reduces scatter significantly relative to RG 1.99 Rev 2 on the currently available database for plates, forgings, and welds. This updated embrittlement trend curve will form the basis of revision 3 to Regulatory Guide 1.99. (author)
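
    As an illustration of multi-variable surface fitting of this kind, the sketch below fits a made-up trend form by ordinary least squares; the functional form and coefficients are purely hypothetical and are not those of NUREG/CR-6551 or RG 1.99.

      import numpy as np

      rng = np.random.default_rng(1)
      n   = 200
      cu  = rng.uniform(0.0, 0.4, n)        # copper content, wt%
      ni  = rng.uniform(0.2, 1.2, n)        # nickel content, wt%
      phi = rng.uniform(1e18, 5e19, n)      # neutron fluence, n/cm^2

      # Hypothetical trend form: dT30 = a + b*Cu + c*Cu*Ni + d*log10(fluence)
      X    = np.column_stack([np.ones(n), cu, cu * ni, np.log10(phi)])
      true = np.array([-300.0, 80.0, 120.0, 18.0])
      dT30 = X @ true + rng.normal(0.0, 5.0, n)     # synthetic Charpy shifts

      coef, *_ = np.linalg.lstsq(X, dT30, rcond=None)
      print(coef)    # recovers `true` to within the added scatter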

  12. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...

  13. Safety and maintenance engineering: A compilation

    Science.gov (United States)

    1974-01-01

    A compilation is presented for the dissemination of information on technological developments which have potential utility outside the aerospace and nuclear communities. Safety of personnel engaged in the handling of hazardous materials and equipment, protection of equipment from fire, high wind, or careless handling by personnel, and techniques for the maintenance of operating equipment are reported.

  14. Compilation of information on melter modeling

    International Nuclear Information System (INIS)

    Eyler, L.L.

    1996-03-01

    The objective of the task described in this report is to compile information on modeling capabilities for the High-Temperature Melter and the Cold Crucible Melter and issue a letter report summarizing existing modeling capabilities. The report is to include strategy recommendations for future modeling efforts to support High-Level Waste (HLW) melter development

  15. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    Final technical report on the verified compilation of concurrent managed languages. Purdue University, November 2017. Approved for public release.

  16. Compilation of cross-sections. Pt. 4

    International Nuclear Information System (INIS)

    Alekhin, S.I.; Ezhela, V.V.; Lugovsky, S.B.; Tolstenkov, A.N.; Yushchenko, O.P.; Baldini, A.; Cobal, M.; Flaminio, V.; Capiluppi, P.; Giacomelli, G.; Mandrioli, G.; Rossi, A.M.; Serra, P.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1987-01-01

    This is the fourth volume in our series of data compilations on integrated cross-sections for weak, electromagnetic, and strong interaction processes. This volume covers data on reactions induced by photons, neutrinos, hyperons, and K0L. It contains all data published up to June 1986. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  17. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 6, November 2004 (last updated 2004.12.16)

    International Nuclear Information System (INIS)

    2005-03-01

    This Radioactive Waste Management Profiles report is a compilation of data collected by the Net Enabled Waste Management Database (NEWMDB) from March to July 2004. The report contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories. It provides or references details of the scope of NEWMDB data collections and it explains the formats of individual NEWMDB report pages

  18. Compilation of nuclear safety criteria potential application to DOE nonreactor facilities

    International Nuclear Information System (INIS)

    1992-03-01

    This bibliographic document compiles nuclear safety criteria applied to the various areas of nuclear safety addressed in a Safety Analysis Report for a nonreactor nuclear facility (NNF). The criteria listed are derived from federal regulations, Nuclear Regulatory Commission (NRC) guides and publications, DOE and DOE contractor publications, and industry codes and standards. The titles of the chapters and sections of Regulatory Guide 3.26, "Standard Format and Content of Safety Analysis Reports for Fuel Reprocessing Plants", were used to format the chapters and sections of this compilation. In each section the criteria are compiled in four groups, namely: (1) Code of Federal Regulations, (2) USNRC Regulatory Guides, (3) Codes and Standards, and (4) Supplementary Information

  19. Human retroviruses and AIDS 1996. A compilation and analysis of nucleic acid and amino acid sequences

    Energy Technology Data Exchange (ETDEWEB)

    Myers, G.; Foley, B.; Korber, B. [eds.] [Los Alamos National Lab., NM (United States). Theoretical Div.; Mellors, J.W. [ed.] [Univ. of Pittsburgh, PA (United States); Jeang, K.T. [ed.] [National Institutes of Health, Bethesda, MD (United States). Molecular Virology Section; Wain-Hobson, S. [Pasteur Inst., Paris (France)] [ed.

    1997-04-01

    This compendium and the accompanying floppy diskettes are the result of an effort to compile and rapidly publish all relevant molecular data concerning the human immunodeficiency viruses (HIV) and related retroviruses. The scope of the compendium and database is best summarized by the five parts that it comprises: (1) Nucleic Acid Alignments and Sequences; (2) Amino Acid Alignments; (3) Analysis; (4) Related Sequences; and (5) Database Communications. Information within all the parts is updated throughout the year on the Web site, http://hiv-web.lanl.gov. While this publication could take the form of a review or sequence monograph, it is not so conceived. Instead, the literature from which the database is derived has simply been summarized and some elementary computational analyses have been performed upon the data. Interpretation and commentary have been avoided insofar as possible so that the reader can form his or her own judgments concerning the complex information. In addition to the general descriptions of the parts of the compendium, the user should read the individual introductions for each part.

  20. Parallelizing More Loops with Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    2012-01-01

    an interactive compilation feedback system that guides programmers in iteratively modifying their application source code. This helps leverage the compiler’s ability to generate loop-parallel code. We employ our system to modify two sequential benchmarks dealing with image processing and edge detection...

  1. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  2. Linux Command Line and Shell Scripting Bible

    CERN Document Server

    Blum, Richard

    2011-01-01

    The authoritative guide to Linux command line and shell scripting, completely updated and revised. The Linux command line allows you to type specific Linux commands directly to the system so that you can easily manipulate files and query system resources, thereby permitting you to automate commonly used functions and even schedule those programs to run automatically. This new edition is packed with new and revised content, reflecting the many changes to new Linux versions, including coverage of alternative shells to the default bash shel

  3. Real quartic surfaces containing 16 skew lines

    Directory of Open Access Journals (Sweden)

    Isidro Nieto

    2004-01-01

    Full Text Available It is well known that there is an open three-dimensional subvariety Ms of the Grassmannian of lines in ℙ3 which parametrizes smooth irreducible complex surfaces of degree 4 that are Heisenberg invariant. Each such quartic contains 32 lines, but only 16 of them are mutually skew, and this configuration of 16 skew lines, which determines the surface, is called a double 16. We consider here the problem of visualizing in a computer the real Heisenberg-invariant quartic surface and the real double 16. We construct a family of points l∈Ms parametrized by a two-dimensional semialgebraic variety such that, under a change of coordinates, l transforms into the real Plücker coordinates of a line L in ℙ3, which is then used to construct a program in Maple 7. The program allows us to draw the quartic surface and the set of lines transversal to L. Additionally, we include a table with a group of examples. For each test example we specify a parameter, the viewing angle of the image, the compilation time, and other visual properties of the real surface and its real double 16. We include at the end of the paper an example showing the surface containing the double 16.

  4. Updated Results from the Michigan Titan Thermospheric General Circulation Model (TTGCM)

    Science.gov (United States)

    Bell, J. M.; Bougher, S. W.; de Lahaye, V.; Waite, J. H.; Ridley, A.

    2006-05-01

    This paper presents updated results from the Michigan Titan Thermospheric General Circulation Model (TTGCM) that was recently unveiled in operational form (Bell et al., 2005 Spring AGU). Since then, we have incorporated a suite of chemical reactions for the major neutral constituents in Titan's upper atmosphere (N2, CH4). Additionally, some selected minor neutral constituents and major ionic species are also supported in the framework. At this time, HCN, which remains one of the critical thermally active species in the upper atmosphere, is specified at all altitudes, utilizing profiles derived from recent Cassini-Huygens measurements. In addition to these improvements, a parallel effort is underway to develop a non-hydrostatic Titan Thermospheric General Circulation Model for further comparisons. In this work, we emphasize the impacts of self-consistent chemistry on the results of the updated TTGCM relative to its frozen-chemistry predecessor. Meanwhile, the thermosphere's thermodynamics remains determined by the interplay of solar EUV forcing and HCN rotational cooling, which is calculated by a full line-by-line radiative transfer routine along the lines of Yelle (1991) and Mueller-Wodarg (2000, 2002). In addition to these primary drivers, a treatment of magnetospheric heating is further tested. The model's results will be compared with both the Cassini INMS data and the model of Mueller-Wodarg (2000, 2002).

  5. Canada's oil sands : opportunities and challenges to 2015 : an update

    International Nuclear Information System (INIS)

    2006-06-01

    This report updated an energy market assessment compiled and published by the National Energy Board (NEB) in 2004. Major changes resulting from recent developments in the oil sands industry were presented. The report was compiled from a series of informal meetings and discussions with a cross-section of oil sands stakeholders. Influences on recent oil sands development and production growth included market development and pipelines; rising capital and labour costs; operating costs; environmental impact management; high crude oil prices; rising global energy demand; technology innovations; and a more stable investment climate. A comparison of key assumptions between the current analysis and the 2004 report was presented, along with estimates of operating and supply costs for various types of oil sands recovery methods. Potential markets for oil sands production were reviewed. Environmental and socio-economic impacts on the industry included the larger than anticipated water withdrawals from the Athabasca River for mining operations; and uncertainties over land reclamation methods. The industry has also been impacted by a limited supply of skilled workers in Alberta. It was observed that the potential for building cogeneration capacity has decreased since the 2004 report. It was concluded that the oil sands industry will continue to grow rapidly, but the rate of development will depend on the balance that is reached between the opposing forces that affect the oil sands. Natural gas costs, high oil prices, air emissions management issues and water usage will continue to be of concern. 6 tabs., 7 figs

  6. Deep knowledge and knowledge compilation for dynamic systems

    International Nuclear Information System (INIS)

    Mizoguchi, Riichiro

    1994-01-01

    Expert systems are viewed as knowledge-based systems which efficiently solve real-world problems based on the expertise contained in their knowledge bases elicited from domain experts. Although such expert systems that depend on the heuristics of domain experts have contributed to the current success, they are known to be brittle and hard to build. This paper is concerned with research on model-based diagnosis and knowledge compilation for dynamic systems conducted by the author's group to overcome these difficulties. Firstly, we summarize the advantages and shortcomings of expert systems. Secondly, deep knowledge and knowledge compilation are discussed. Then, the latest results of our research on model-based diagnosis are overviewed. The future direction of knowledge base technology research is also discussed. (author)

  7. 1991 OCRWM bulletin compilation and index

    International Nuclear Information System (INIS)

    1992-05-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management, to provide current information about the national program for managing spent fuel and high-level radioactive waste. The document is a compilation of issues from the 1991 calendar year. A table of contents and an index have been provided to reference information contained in this year's Bulletins

  8. NCCN Guidelines® Insights Bladder Cancer, Version 2.2016 Featured Updates to the NCCN Guidelines

    Science.gov (United States)

    Clark, Peter E.; Spiess, Philippe E.; Agarwal, Neeraj; Bangs, Rick; Boorjian, Stephen A.; Buyyounouski, Mark K.; Efstathiou, Jason A.; Flaig, Thomas W.; Friedlander, Terence; Greenberg, Richard E.; Guru, Khurshid A.; Hahn, Noah; Herr, Harry W.; Hoimes, Christopher; Inman, Brant A.; Kader, A. Karim; Kibel, Adam S.; Kuzel, Timothy M.; Lele, Subodh M.; Meeks, Joshua J.; Michalski, Jeff; Montgomery, Jeffrey S.; Pagliaro, Lance C.; Pal, Sumanta K.; Patterson, Anthony; Petrylak, Daniel; Plimack, Elizabeth R.; Pohar, Kamal S.; Porter, Michael P.; Sexton, Wade J.; Siefker-Radtke, Arlene O.; Sonpavde, Guru; Tward, Jonathan; Wile, Geoffrey; Dwyer, Mary A.; Smith, Courtney

    2017-01-01

    These NCCN Guidelines Insights discuss the major recent updates to the NCCN Guidelines for Bladder Cancer based on the review of the evidence in conjunction with the expert opinion of the panel. Recent updates include (1) refining the recommendation of intravesical bacillus Calmette-Guérin, (2) strengthening the recommendations for perioperative systemic chemotherapy, and (3) incorporating immunotherapy into second-line therapy for locally advanced or metastatic disease. These NCCN Guidelines Insights further discuss factors that affect integration of these recommendations into clinical practice. PMID:27697976

  9. Sprucing up the site - update

    CERN Multimedia

    2009-01-01

    As mentioned in a previous article the Bulletin will be publishing regular short updates following the consolidation work going on around the CERN sites: All internal lighting is being replaced in the office buildings on the Prevessin site. Work has started in building 866 and will move to 864 and 865 later. New energy-efficient lights are being installed, which will reduce electricity consumption by 30 -50%, and in the common areas like corridors the lighting will be switched on by motion sensors. Also in the Prevessin site, the lines in the car parks are being repainted. This will continue in the Meyrin site later. Work has started in Building 30 to completely refurbish the AT Auditorium.

  10. National energetic balance. Statistical compilation 1985-1991

    International Nuclear Information System (INIS)

    1992-01-01

    Compiles the statistical information supplied by the governmental and private institutions that make up the national energy sector in Paraguay. The first part refers to the total energy supply; the second, to energy transformation centres; and the last part presents the energy flows, consolidated balances and other energy-economic indicators

  11. 13 CFR 146.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Semi-annual compilation. 146.600 Section 146.600 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW RESTRICTIONS ON LOBBYING.... (c) Information that involves intelligence matters shall be reported only to the Select Committee on...

  12. ZettaBricks: A Language Compiler and Runtime System for Anyscale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Amarasinghe, Saman [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2015-03-27

    This grant supported the ZettaBricks and OpenTuner projects. ZettaBricks is a new implicitly parallel language and compiler where defining multiple implementations of multiple algorithms to solve a problem is the natural way of programming. ZettaBricks makes algorithmic choice a first class construct of the language. Choices are provided in a way that also allows our compiler to tune at a finer granularity. The ZettaBricks compiler autotunes programs by making both fine-grained as well as algorithmic choices. Choices also include different automatic parallelization techniques, data distributions, algorithmic parameters, transformations, and blocking. Additionally, ZettaBricks introduces novel techniques to autotune algorithms for different convergence criteria. When choosing between various direct and iterative methods, the ZettaBricks compiler is able to tune a program in such a way that delivers near-optimal efficiency for any desired level of accuracy. The compiler has the flexibility of utilizing different convergence criteria for the various components within a single algorithm, providing the user with accuracy choice alongside algorithmic choice. OpenTuner is a generalization of the experience gained in building an autotuner for ZettaBricks. OpenTuner is a new open source framework for building domain-specific multi-objective program autotuners. OpenTuner supports fully-customizable configuration representations, an extensible technique representation to allow for domain-specific techniques, and an easy to use interface for communicating with the program to be autotuned. A key capability inside OpenTuner is the use of ensembles of disparate search techniques simultaneously; techniques that perform well will dynamically be allocated a larger proportion of tests.
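
    To give a flavor of tuning over algorithmic choice, the toy below measures a hybrid sort under different cutoff parameters and keeps the fastest; it uses neither the ZettaBricks language nor the OpenTuner API, and every name in it is illustrative.

      import random, time

      def hybrid_sort(a, cutoff):
          # Mergesort that switches to insertion sort below `cutoff`:
          # the cutoff is the tunable algorithmic choice.
          if len(a) <= cutoff:
              for i in range(1, len(a)):
                  x, j = a[i], i - 1
                  while j >= 0 and a[j] > x:
                      a[j + 1] = a[j]
                      j -= 1
                  a[j + 1] = x
              return a
          mid = len(a) // 2
          left, right = hybrid_sort(a[:mid], cutoff), hybrid_sort(a[mid:], cutoff)
          out, i, j = [], 0, 0
          while i < len(left) and j < len(right):
              if left[i] <= right[j]:
                  out.append(left[i]); i += 1
              else:
                  out.append(right[j]); j += 1
          return out + left[i:] + right[j:]

      def measure(cutoff, n=20000):
          data = [random.random() for _ in range(n)]
          t0 = time.perf_counter()
          hybrid_sort(data, cutoff)
          return time.perf_counter() - t0

      # Exhaustive search over a small candidate set stands in for the
      # ensemble of search techniques a real autotuner would use.
      best = min((measure(c), c) for c in (4, 8, 16, 32, 64, 128))
      print("best cutoff:", best[1])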

  13. Research on the Maritime Communication Cryptographic Chip’s Compiler Optimization

    Directory of Open Access Journals (Sweden)

    Sheng Li

    2017-08-01

    Full Text Available In the process of ocean development, the technology for maritime communication systems is a hot research field, in which information security is vital for the normal operation of the whole system and is also one of the difficulties in the research of maritime communication systems. In this paper, a maritime communication cryptographic SOC (system on chip) is introduced, and its compiler framework is put forward through analysis of its working mode and of the problems faced by the compiler front end. Then, a loop unrolling factor calculating algorithm based on queue theory, named UFBOQ (unrolling factor based on queue), is proposed to perform parallel optimization in the compiler front end while respecting the instruction memory capacity limit. Finally, the scalar replacement method is used to optimize the unrolled code, reducing the memory access latency that limits parallel computing efficiency and exploiting the continuous data storage characteristics of the cryptographic algorithm. The UFBOQ algorithm and scalar replacement prove effective and appropriate, and the optimization achieves linear speedup.

  14. Fifth Baltic Sea pollution load compilation (PLC-5). An executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Svendsen, L.M.; Staaf, H.; Pyhala, M.; Kotilainen, P.; Bartnicki, J.; Knuuttila, S.; Durkin, M.

    2012-07-01

    This report summarizes and combines the main results of the Fifth Baltic Sea Pollution Load Compilation (HELCOM 2011), which covers waterborne loads to the sea, together with data on atmospheric loads that countries submit to the Co-operative Programme for Monitoring and Evaluation of the Long-Range Transmission of Air Pollutants in Europe (EMEP), which subsequently compiles and reports this information to HELCOM.

  15. Using MaxCompiler for the high level synthesis of trigger algorithms

    International Nuclear Information System (INIS)

    Summers, S.; Rose, A.; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  16. Using MaxCompiler for the high level synthesis of trigger algorithms

    Science.gov (United States)

    Summers, S.; Rose, A.; Sanders, P.

    2017-02-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  17. Languages, compilers and run-time environments for distributed memory machines

    CERN Document Server

    Saltz, J

    1992-01-01

    Papers presented within this volume cover a wide range of topics related to programming distributed memory machines. Distributed memory architectures, although having the potential to supply the very high levels of performance required to support future computing needs, present awkward programming problems. The major issue is to design methods which enable compilers to generate efficient distributed memory programs from relatively machine independent program specifications. This book is the compilation of papers describing a wide range of research efforts aimed at easing the task of programmin

  18. Combining Compile-Time and Run-Time Parallelization

    Directory of Open Access Journals (Sweden)

    Sungdo Moon

    1999-01-01

    Full Text Available This paper demonstrates that significant improvements to automatic parallelization technology require that existing systems be extended in two ways: (1) they must combine high-quality compile-time analysis with low-cost run-time testing; and (2) they must take control flow into account during analysis. We support this claim with the results of an experiment that measures the safety of parallelization at run time for loops left unparallelized by the Stanford SUIF compiler's automatic parallelization system. We present results of measurements on programs from two benchmark suites, SPECFP95 and NAS sample benchmarks, which identify inherently parallel loops in these programs that are missed by the compiler. We characterize remaining parallelization opportunities, and find that most of the loops require run-time testing, analysis of control flow, or some combination of the two. We present a new compile-time analysis technique that can be used to parallelize most of these remaining loops. This technique is designed to not only improve the results of compile-time parallelization, but also to produce low-cost, directed run-time tests that allow the system to defer binding of parallelization until run-time when safety cannot be proven statically. We call this approach predicated array data-flow analysis. We augment array data-flow analysis, which the compiler uses to identify independent and privatizable arrays, by associating predicates with array data-flow values. Predicated array data-flow analysis allows the compiler to derive "optimistic" data-flow values guarded by predicates; these predicates can be used to derive a run-time test guaranteeing the safety of parallelization.
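
    The shape of such a low-cost, directed run-time test can be sketched as follows. The example is ours rather than the SUIF system's: an irregular update loop whose independence depends on run-time index values, so the parallel schedule is guarded by a cheap predicate.

      import numpy as np
      from concurrent.futures import ThreadPoolExecutor

      def apply_updates(a, idx, vals, workers=4):
          # Loop: a[idx[i]] += vals[i]. Parallelization is safe only if idx
          # contains no duplicates, a fact known only at run time.
          if len(np.unique(idx)) == len(idx):            # cheap run-time predicate
              step = max(1, len(idx) // workers)
              def chunk(lo):
                  hi = lo + step
                  a[idx[lo:hi]] += vals[lo:hi]           # writes are independent
              with ThreadPoolExecutor(max_workers=workers) as ex:
                  list(ex.map(chunk, range(0, len(idx), step)))
          else:                                          # defer to sequential code
              for i in range(len(idx)):
                  a[idx[i]] += vals[i]

      a = np.zeros(10)
      apply_updates(a, np.array([1, 3, 5, 7]), np.ones(4))
      print(a)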

  19. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  20. On the performance of the HAL/S-FC compiler. [for space shuttles

    Science.gov (United States)

    Martin, F. H.

    1975-01-01

    The HAL/S compilers which will be used in the space shuttles are described. Acceptance test objectives and procedures are described, the raw results are presented and analyzed, and conclusions and observations are drawn. An appendix is included containing an illustrative set of compiler listings and results for one of the test cases.

  1. Expectation Levels in Dictionary Consultation and Compilation ...

    African Journals Online (AJOL)

    Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of the structure ...

  2. AN UPDATED ULTRAVIOLET CATALOG OF GALEX NEARBY GALAXIES

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Yu; Zou, Hu; Liu, JiFeng; Wang, Song, E-mail: ybai@nao.cas.cn, E-mail: zouhu@nao.cas.cn, E-mail: jfliu@nao.cas.cn, E-mail: songw@nao.cas.cn [Key Laboratory of Optical Astronomy, National Astronomical Observatories, Chinese Academy of Sciences, 20A Datun Road, Chaoyang District, 100012 Beijing (China)

    2015-09-15

    The ultraviolet (UV) catalog of nearby galaxies compiled by Gil de Paz et al. presents the integrated photometry and surface brightness profiles for 1034 nearby galaxies observed by GALEX. We provide an updated catalog of 4138 nearby galaxies based on the latest General Release (GR6/GR7) of GALEX. These galaxies are selected from HyperLeda with apparent diameters larger than 1′. From the surface brightness profiles accurately measured using the deep NUV and FUV images, we have calculated the asymptotic magnitudes, aperture (D25) magnitudes, colors, structural parameters (effective radii and concentration indices), luminosities, and effective surface brightness for these galaxies. Archival optical and infrared photometry from HyperLeda, 2MASS, and IRAS are also integrated into the catalog. Our parameter measurements and some analyses are consistent with those of Gil de Paz et al. The (FUV − K) color provides a good criterion to distinguish between early- and late-type galaxies, which can be improved further using the concentration indices. The IRX–β relation is reformulated with our UV-selected nearby galaxies.

  3. GOSAT-2014 methane spectral line list

    International Nuclear Information System (INIS)

    Nikitin, A.V.; Lyulin, O.M.; Mikhailenko, S.N.; Perevalov, V.I.; Filippov, N.N.; Grigoriev, I.M.; Morino, I.; Yoshida, Y.; Matsunaga, T.

    2015-01-01

    The updated methane spectral line list GOSAT-2014 for the 5550–6240 cm⁻¹ region with an intensity cutoff of 5×10⁻²⁵ cm/molecule at 296 K is presented. The line list is based on extensive measurements of methane spectral line parameters performed at different temperatures and pressures of methane, without and with the buffer gases N₂, O₂ and air. It contains the following spectral line parameters for about 12,150 transitions: line position, line intensity, energy of the lower state, air-induced and self-pressure-induced broadening and shift coefficients, and the temperature exponent of the air-broadening coefficient. The accuracy of the line positions and intensities is considerably improved in comparison with the previous version, GOSAT-2009. The improvement of the line list is achieved mainly by including in the line position and intensity retrieval six new spectra recorded with a short path length (8.75 cm). The air-broadening and air-shift coefficients for the J-manifolds of the 2ν₃(F₂) band are refitted using the new, more precise values of the line positions and intensities. The line assignment is considerably extended. The lower-state J-value was assigned to 6397 lines, representing 94.4% of the integrated intensity of the considered wavenumber region. The complete assignment was done for 2750 lines. - Highlights: • The upgrade of the GOSAT methane line list in the 5550–6240 cm⁻¹ region is done. • 12,146 experimental methane line positions and intensities are retrieved. • 6376 lower energy levels for methane lines are determined

  4. He II lines in the spectrum of zeta Puppis

    International Nuclear Information System (INIS)

    Snijders, M.A.J.; Underhill, A.B.

    1975-01-01

    Equivalent widths of He II lines in the series n=2, 3, 4 and 5 are compiled and compared with predictions from plane-parallel, static model atmospheres using a non-LTE theory of line formation. The agreement between observation and prediction for a (50,000, 4.0) model atmosphere is good for the upper members of the n=3 and the n=5 series, but the two lines of the n=2 series which are observed and the upper members of the n=4 series (4→15, 4→17, etc.) are stronger than predicted. Well-determined profiles of lines from the n=3 series indicate v sin i = 200 km s⁻¹. Profiles of the higher members of the n=4 series, however, do not match the predictions, the observed line cores being deeper than predicted. The n=4 level appears to be more overpopulated at moderate depths in the atmosphere than the non-LTE calculations with plane-parallel layers indicate. This may be due to an overlap of the H and He II lines in the even-even series caused by macroturbulent velocities of the hydrogen atoms and helium atoms

  5. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1981-03-01

    A request list for nuclear data which was produced from a computerized data file by the National Nuclear Data Center is presented. The request list is given by target nucleus (isotope) and then reaction type. The purpose of the compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. Requesters are identified by laboratory, last name, and sponsoring US government agency

  6. Risk assessment data bank design at the Savannah River Site

    International Nuclear Information System (INIS)

    Townsend, C.S.; Johnson, K.B.

    1992-01-01

    The Savannah River Site has designed and implemented a database system containing a series of compilations of incidents used primarily for risk assessment. Four databases have been designed and implemented using advanced database management system computer software. These databases exist for reprocessing, fuel fabrication, waste management, and the Savannah River Technology Center. They are combined into one system called the Risk Assessment Methodology (RAM) Fault Tree Data Banks. This paper will discuss the logical design of the data, the menus, and the operating platform. Built-in updating features, such as batch and on-line data entry; data validation methods; automatic update features; and expert system programs, will also be discussed. User functions, such as on-line search/view/report and statistical functions, will be presented. Security features and backup and recovery methods will also be covered

  7. A Performance Study on Synchronous and Asynchronous Update Rules for A Plug-In Direct Particle Swarm Repetitive Controller

    Directory of Open Access Journals (Sweden)

    Ufnalski Bartlomiej

    2014-12-01

    Full Text Available In this paper two different update schemes for the recently developed plug-in direct particle swarm repetitive controller (PDPSRC) are investigated and compared. The proposed approach employs the particle swarm optimizer (PSO) to solve in on-line mode a dynamic optimization problem (DOP) related to the control task in the constant-amplitude constant-frequency voltage-source inverter (CACF VSI) with an LC output filter. The effectiveness of synchronous and asynchronous update rules, both commonly used in static optimization problems (SOPs), is assessed and compared in the case of the PDPSRC. The performance of the controller, when synthesized using each of the update schemes, is studied numerically.
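
    The difference between the two schemes comes down to when the global best is published to the swarm. The sketch below contrasts synchronous and asynchronous updates on a static test function; the on-line DOP setting of the PDPSRC is richer, and all parameter values here are illustrative.

      import numpy as np

      def pso(f, dim=2, n=20, iters=100, asynchronous=False, seed=0):
          rng = np.random.default_rng(seed)
          x = rng.uniform(-5.0, 5.0, (n, dim))
          v = np.zeros((n, dim))
          pbest, pcost = x.copy(), np.array([f(p) for p in x])
          k = int(pcost.argmin())
          g, gcost = pbest[k].copy(), pcost[k]
          w, c1, c2 = 0.7, 1.5, 1.5
          for _ in range(iters):
              for i in range(n):
                  r1, r2 = rng.random(dim), rng.random(dim)
                  v[i] = w * v[i] + c1 * r1 * (pbest[i] - x[i]) + c2 * r2 * (g - x[i])
                  x[i] += v[i]
                  c = f(x[i])
                  if c < pcost[i]:
                      pbest[i], pcost[i] = x[i].copy(), c
                      if asynchronous and c < gcost:   # publish gbest immediately
                          g, gcost = x[i].copy(), c
              if not asynchronous:                     # publish once per sweep
                  k = int(pcost.argmin())
                  g, gcost = pbest[k].copy(), pcost[k]
          return gcost

      sphere = lambda p: float(np.sum(p ** 2))
      print("synchronous :", pso(sphere, asynchronous=False))
      print("asynchronous:", pso(sphere, asynchronous=True))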

  8. Update on Intra-Arterial Chemotherapy for Retinoblastoma

    Directory of Open Access Journals (Sweden)

    Mario Zanaty

    2014-01-01

    Full Text Available The tools for managing retinoblastoma have been increasing in the past decade. While globe-salvage still relies heavily on intravenous chemotherapy, tumors in advanced stage that failed chemotherapy are now referred for intra-arterial chemotherapy (IAC) to avoid enucleation. However, IAC still has many obstacles to overcome. We present an update on the indications, complications, limitations, success, and technical aspects of IAC. Given its safety and high efficacy, it is expected that IAC will replace conventional strategies and will become a first-line option even for tumors that are amenable for other strategies.

  9. Expectation Levels in Dictionary Consultation and Compilation*

    African Journals Online (AJOL)

    Abstract: Dictionary consultation and compilation is a two-way engagement between two par- ties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their con- sultation skills, their knowledge of ...

  10. Update of GRASP/Ada reverse engineering tools for Ada

    Science.gov (United States)

    Cross, James H., II

    1993-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional pretty printed Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype CSD generator (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Windows System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, two update phases were completed. Update'92 focused on the initial analysis of evaluation data collected from software engineering students at Auburn University and the addition of significant enhancements to the user interface. Update'93 (the current update) focused on the statistical analysis of the data collected in the previous update and preparation of Version 3.4 of the prototype for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical

  11. Methods for the Compilation of a Core List of Journals in Toxicology.

    Science.gov (United States)

    Kuch, T. D. C.

    Previously reported methods for the compilation of core lists of journals in multidisciplinary areas are first examined, with toxicology used as an example of such an area. Three approaches to the compilation of a core list of journals in toxicology were undertaken and the results analyzed with the aid of models. Analysis of the results of the…

  12. 21 CFR 20.64 - Records or information compiled for law enforcement purposes.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Records or information compiled for law enforcement purposes. 20.64 Section 20.64 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL PUBLIC INFORMATION Exemptions § 20.64 Records or information compiled for law enforcement purposes. (a) Records or...

  13. Mode automata and their compilation into fault trees

    International Nuclear Information System (INIS)

    Rauzy, Antoine

    2002-01-01

    In this article, we advocate the use of mode automata as a high level representation language for reliability studies. Mode automata are states/transitions based representations with the additional notion of flow. They can be seen as a generalization of both finite capacity Petri nets and block diagrams. They can be assembled into hierarchies by means of composition operations. The contribution of this article is twofold. First, we introduce mode automata and we discuss their relationship with other formalisms. Second, we propose an algorithm to compile mode automata into Boolean equations (fault trees). Such a compilation is of interest for two reasons. First, assessment tools for Boolean models are much more efficient than those for states/transitions models. Second, the automated generation of fault trees from higher level representations makes easier their maintenance through the life cycle of systems under study
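
    The compilation idea can be illustrated on a deliberately tiny automaton: enumerate the event sets along paths into a failure mode and keep the minimal ones, which are exactly the cut sets a fault tree encodes. This sketch is an illustration of the concept, not the article's algorithm; the automaton below is invented.

      # Toy mode automaton: transitions are labeled by basic failure events.
      transitions = {
          ("ok", "e1"): "degraded",
          ("ok", "e2"): "failed",
          ("degraded", "e2"): "failed",
      }

      def compile_cut_sets(start, bad, max_events=3):
          # Collect the sets of events along paths reaching the failure mode.
          cuts = []
          def walk(mode, used):
              if mode == bad:
                  cuts.append(used)
                  return
              if len(used) >= max_events:
                  return
              for (m, ev), nxt in transitions.items():
                  if m == mode:
                      walk(nxt, used | {ev})
          walk(start, frozenset())
          # Keep only minimal cut sets: these are the terms of the fault tree.
          return [c for c in cuts if not any(d < c for d in cuts)]

      print(compile_cut_sets("ok", "failed"))   # -> [frozenset({'e2'})]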

  14. The National Assessment of Shoreline Change: A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the U.S. Gulf of Mexico

    Science.gov (United States)

    Miller, Tara L.; Morton, Robert A.; Sallenger, Asbury H.; Moore, Laura J.

    2004-01-01

    Introduction The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive database of digital vector shorelines and shoreline change rates for the U.S. Gulf of Mexico. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along most open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard repeatable methods for mapping and analyzing shoreline movement so that periodic updates regarding coastal erosion and land loss can be made nationally that are systematic and internally consistent. This data compilation for open-ocean, sandy shorelines of the Gulf of Mexico is the first in a series that will eventually include the Atlantic Coast, Pacific Coast, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are based on merging three historical shorelines with a modern shoreline derived from lidar (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1800s, 1920s-1930s, and 1970s. The most recent shoreline is derived from data collected over the period of 1998-2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are simple end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change in the Gulf of Mexico, National Assessment of Shoreline Change: Part 1, Historical Shoreline Changes and Associated Coastal Land Loss Along the U.S. Gulf of Mexico (USGS Open File
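    The two rate definitions in the abstract reduce to a few lines of arithmetic. A short sketch with invented shoreline positions (the method follows the text; the numbers do not come from the dataset):

```python
import numpy as np

# Four shoreline positions (metres, negative = landward retreat) at the
# approximate survey epochs described above; values are made up.
years = np.array([1857.0, 1934.0, 1973.0, 2001.0])
position = np.array([0.0, -18.5, -29.0, -41.2])

# Long-term rate: least-squares slope through all four shorelines (m/yr).
long_term = np.polyfit(years, position, 1)[0]

# Short-term rate: end-point rate using only the two most recent shorelines.
short_term = (position[-1] - position[-2]) / (years[-1] - years[-2])

print(f"long-term: {long_term:.2f} m/yr, short-term: {short_term:.2f} m/yr")
```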

  15. GAME: GAlaxy Machine learning for Emission lines

    Science.gov (United States)

    Ucci, G.; Ferrara, A.; Pallottini, A.; Gallerani, S.

    2018-06-01

    We present an updated, optimized version of GAME (GAlaxy Machine learning for Emission lines), a code designed to infer key interstellar medium physical properties from emission line intensities of ultraviolet/optical/far-infrared galaxy spectra. The improvements concern (a) an enlarged spectral library including Pop III stars, (b) the inclusion of spectral noise in the training procedure, and (c) an accurate evaluation of uncertainties. We extensively validate the optimized code and compare its performance against empirical methods and other available emission line codes (PYQZ and HII-CHI-MISTRY) on a sample of 62 SDSS stacked galaxy spectra and 75 observed HII regions. Very good agreement is found for metallicity. However, ionization parameters derived by GAME tend to be higher. We show that this is due to the use of too limited libraries in the other codes. The main advantages of GAME are the simultaneous use of all the measured spectral lines and the extremely short computational times. We finally discuss the code potential and limitations.
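    A toy analogue of the supervised scheme described above, assuming scikit-learn's AdaBoost on decision trees (the learner family reported for GAME). The training "library" is fabricated on the fly so the snippet is self-contained; the real code trains on a grid of photoionization models.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Fake library: 2000 'spectra' of five line intensities tied to a latent
# log-metallicity Z by arbitrary weights, plus noise injected at training
# time (mirroring improvement (b) in the abstract).
Z = rng.uniform(-1.0, 0.5, 2000)
weights = (1.0, 0.6, -0.4, 0.2, -0.8)
X = np.column_stack([Z * w + rng.normal(0, 0.05, Z.size) for w in weights])
X += rng.normal(0, 0.02, X.shape)  # spectral noise

model = AdaBoostRegressor(DecisionTreeRegressor(max_depth=6), n_estimators=50)
model.fit(X, Z)  # all measured lines are used simultaneously
print(model.predict(X[:3]).round(2), Z[:3].round(2))
```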

  16. GAME: GAlaxy Machine learning for Emission lines

    Science.gov (United States)

    Ucci, G.; Ferrara, A.; Pallottini, A.; Gallerani, S.

    2018-03-01

    We present an updated, optimized version of GAME (GAlaxy Machine learning for Emission lines), a code designed to infer key interstellar medium physical properties from emission line intensities of UV/optical/far infrared galaxy spectra. The improvements concern: (a) an enlarged spectral library including Pop III stars; (b) the inclusion of spectral noise in the training procedure, and (c) an accurate evaluation of uncertainties. We extensively validate the optimized code and compare its performance against empirical methods and other available emission line codes (pyqz and HII-CHI-mistry) on a sample of 62 SDSS stacked galaxy spectra and 75 observed HII regions. Very good agreement is found for metallicity. However, ionization parameters derived by GAME tend to be higher. We show that this is due to the use of too limited libraries in the other codes. The main advantages of GAME are the simultaneous use of all the measured spectral lines, and the extremely short computational times. We finally discuss the code potential and limitations.

  17. Fusing a Transformation Language with an Open Compiler

    NARCIS (Netherlands)

    Kalleberg, K.T.; Visser, E.

    2007-01-01

    Program transformation systems provide powerful analysis and transformation frameworks as well as concise languages for language processing, but instantiating them for every subject language is an arduous task, most often resulting in halfcompleted frontends. Compilers provide mature frontends with

  18. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    Full Text Available There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with the equational programming language (EPL). Our approach is based on program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by the source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by the compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  19. Ontario regulatory update

    International Nuclear Information System (INIS)

    Thompson, P.

    1998-01-01

    This paper provides a summary of recent events which when combined add up to a gradual but unmistakable movement of the energy sector in Ontario towards a fully competitive market. Some of the events precipitating this movement towards competition include the passing of the Energy Competition Act of 1998 (Bill 35), electricity deregulation, regulatory reform of the natural gas sector, and changes to the consumer protection legislation. The role of the Ontario Energy Board was also updated to bring it in line with the demands of the competitive marketplace. Among the new roles that the Board will assume are to facilitate competition, to maintain fair and reasonable rates, and to facilitate rational expansion. Another objective is to provide opportunities for including energy efficiency in government policies. Implications of the changes in the OEB's mandate for market participants were also discussed, including (1) regulated gas sales and delivery mechanisms, (2) transactional services, (3) contract restructuring, (4) consumer protection, (5) supervision of competitive market participants, and (6) market surveillance

  20. Collinearity analysis of Brassica A and C genomes based on an updated inferred unigene order

    Directory of Open Access Journals (Sweden)

    Ian Bancroft

    2015-06-01

    Full Text Available This data article includes SNP scoring across lines of the Brassica napus TNDH population based on Illumina sequencing of mRNA, expanded to 75 lines. The 21,323 mapped markers defined 887 recombination bins, representing an updated genetic linkage map for the species. Based on this new map, 5 genome sequence scaffolds were split and the order and orientation of scaffolds updated to establish a new pseudomolecule specification. The order of unigenes and SNP array probes within these pseudomolecules was determined. Unigenes were assessed for sequence similarity to the A and C genomes. The 57,246 that mapped to both enabled the collinearity of the A and C genomes to be illustrated graphically. Although the great majority was in collinear positions, some were not. Analyses of 60 such instances are presented, suggesting that the breakdown in collinearity was largely due to either the absence of the homoeologue on one genome (resulting in sequence match to a paralogue) or multiple similar sequences being present. The mRNAseq datasets for the TNDH lines are available from the SRA repository (ERA283648); the remaining datasets are supplied with this article.

  1. Notes on Compiling a Corpus- Based Dictionary

    Directory of Open Access Journals (Sweden)

    František Čermák

    2011-10-01

    Full Text Available

    ABSTRACT: On the basis of sample analysis of a Czech adjective, a definition based on the data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing at the drawbacks of definitions found in traditional dictionaries. Steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics, briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. These are supplemented by additional remarks and caveats useful in the compilation of a dictionary. Thus, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified form.

    SUMMARY: Notes on compiling a corpus-based dictionary. On the basis of a sample analysis of a Czech adjective, a definition based on data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing out the shortcomings of definitions found in traditional dictionaries. The steps undertaken here are then generalised and used, in an ordered sequence (similar to a work-flow ordering), as topics, briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. They are supplemented by additional remarks and caveats useful in the compilation of a dictionary. In this way, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analysed in their raw, though semiotically classified, form.

    Keywords: MONOLINGUAL DICTIONARIES, CORPUS LEXICOGRAPHY, SYNTAGMATICS AND PARADIGMATICS IN DICTIONARIES, DICTIONARY ENTRY, TYPES OF LEMMAS, PRAGMATICS, TREATMENT OF

  2. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    Full Text Available We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.

  3. SpectroWeb: oscillator strength measurements of atomic absorption lines in the Sun and Procyon

    International Nuclear Information System (INIS)

    Lobel, A

    2008-01-01

    We update the online SpectroWeb database of spectral standard reference stars with 1178 oscillator strength values of atomic absorption lines observed in the optical spectrum of the Sun and Procyon (α CMi A). The updated line oscillator strengths are measured with best fits to the disk-integrated KPNO-FTS spectrum of the Sun observed between 4000 A and 6800 A using state-of-the-art detailed spectral synthesis calculations. A subset of 660 line oscillator strengths is validated with synthetic spectrum calculations of Procyon observed with ESO-UVES between 4700 A and 6800 A. The new log(gf)-values in SpectroWeb are improvements upon the values offered in the online Vienna Atomic Line Database (VALD). We find for neutral iron-group elements, such as Fe I, Ni I, Cr I, and Ti I, a statistically significant over-estimation of the VALD log(gf)-values for weak absorption lines with normalized central line depths below 15%. For abundant lighter elements (e.g. Mg I and Ca I) this trend is statistically not significantly detectable, with the exception of Si I, for which the log(gf)-values of 60 weak and medium-strong lines are substantially decreased to best fit the observed spectra. The newly measured log(gf)-values are available in the SpectroWeb database at http://spectra.freeshell.org, which interactively displays the observed and computed stellar spectra, together with corresponding atomic line data.
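    The "measured with best fits" step can be caricatured in a few lines: scale a synthetic line profile by a trial log(gf) until it matches the observed spectrum. The linear depth response below stands in for real spectral synthesis and holds only in spirit for weak lines; all numbers are invented.

```python
import numpy as np
from scipy.optimize import minimize_scalar

wavelength = np.linspace(-1.0, 1.0, 201)  # Angstrom offsets from line centre

def synthetic(log_gf):
    # toy model: central depth scales linearly with gf (weak-line regime)
    depth = 0.15 * 10.0 ** log_gf
    return 1.0 - depth * np.exp(-(wavelength / 0.12) ** 2)

# 'Observed' spectrum generated with log(gf) = 0.30 plus noise.
observed = synthetic(0.30) + np.random.default_rng(1).normal(0, 0.003, 201)

fit = minimize_scalar(lambda g: np.sum((synthetic(g) - observed) ** 2),
                      bounds=(-2.0, 2.0), method="bounded")
print(f"best-fit log(gf) = {fit.x:.2f}")  # ~0.30
```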

  4. Harmonisation and updatability based on valid fundamental data of the German electricity mix. 3. rev. ed.

    International Nuclear Information System (INIS)

    Viebahn, Peter; Patyk, Andreas; Fritsche, Uwe R.

    2008-01-01

    Almost every product requires electricity for its manufacture, and the electricity mix used for this is a point of interest in life cycle assessments. Energy-related processes play an important role in life cycle assessments, which in turn are of major significance for product valuations. The Life Cycle Data Network has now carried out a study dedicated to generating a fundamental data record on "Germany's electricity mix" which describes the electricity mix supplied by German public power plants. This is the first time that a standardised data record has been made available which was compiled by common accord of all players concerned, whose data stem from quality-assured sources, and which can be updated year by year. (orig./GL)

  5. HAL/S-FC and HAL/S-360 compiler system program description

    Science.gov (United States)

    1976-01-01

    The compiler is a large multi-phase design and can be broken into four phases: Phase 1 inputs the source language and does a syntactic and semantic analysis generating the source listing, a file of instructions in an internal format (HALMAT) and a collection of tables to be used in subsequent phases. Phase 1.5 massages the code produced by Phase 1, performing machine independent optimization. Phase 2 inputs the HALMAT produced by Phase 1 and outputs machine language object modules in a form suitable for the OS-360 or FCOS linkage editor. Phase 3 produces the SDF tables. The four phases described are written in XPL, a language specifically designed for compiler implementation. In addition to the compiler, there is a large library containing all the routines that can be explicitly called by the source language programmer plus a large collection of routines for implementing various facilities of the language.
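    The phase structure described here is the classic staged pipeline; a toy analogue (illustrative names only, not HAL/S internals) makes the data flow between phases concrete:

```python
# Toy staged compiler: each phase consumes the previous phase's output,
# echoing the HALMAT-style hand-off between Phase 1, Phase 1.5, and Phase 2.

def phase1_analyze(source):
    # syntactic/semantic analysis -> internal-format instructions
    return [("LOAD", token) for token in source.split()]

def phase1_5_optimize(ir):
    # machine-independent optimization: drop redundant consecutive ops
    optimized = []
    for op in ir:
        if not optimized or optimized[-1] != op:
            optimized.append(op)
    return optimized

def phase2_codegen(ir):
    # emit 'object code' text from the optimized internal form
    return [f"{opcode} {operand}" for opcode, operand in ir]

print(phase2_codegen(phase1_5_optimize(phase1_analyze("a b b c"))))
# ['LOAD a', 'LOAD b', 'LOAD c']
```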

  6. A Bottom-Up Geospatial Data Update Mechanism for Spatial Data Infrastructure Updating

    Science.gov (United States)

    Tian, W.; Zhu, X.; Liu, Y.

    2012-08-01

    Currently, the top-down spatial data update mechanism has made great progress and is widely applied in many SDIs (spatial data infrastructures). However, this mechanism still has some issues. For example, the update schedule is tied to the professional department's projects and is usually too long for the end-user; moving data from collection to publication costs the professional department too much time and energy; and the geospatial information does not provide sufficiently detailed attributes. Thus, finding a way to deal with these problems has become an urgent task. Emerging Internet technology, 3S techniques, and the geographic information knowledge now widespread among the public have promoted the rapid development of volunteered geospatial information. Volunteered geospatial information is a current "hotspot" that attracts many researchers to study its data quality and credibility, accuracy, sustainability, social benefit, applications, and so on. In addition, a few scholars pay attention to the value of VGI in supporting SDI updating. On that basis, this paper presents a bottom-up update mechanism from VGI to SDI, which includes matching homonymous elements between VGI and SDI vector data, change detection, SDI spatial database updating, and publication of new data products to end-users. The proposed updating cycle is then discussed in depth regarding its feasibility: it can detect changed elements in time and shorten the update period, provide more accurate geometry and attribute data for the spatial data infrastructure, and support update propagation.
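    The first stage of the proposed cycle, matching homonymous elements, can be sketched as nearest-neighbour pairing within a distance tolerance; the function, names, and threshold below are illustrative assumptions, not the paper's algorithm.

```python
import math

# Hypothetical matching step: pair each VGI point with the nearest unused
# SDI point within a tolerance; leftovers on either side become candidate
# changes for detection and database update.
def match_elements(vgi, sdi, tol=15.0):
    matches, changes = [], []
    unused = set(sdi)
    for name, (x, y) in vgi.items():
        best, best_d = None, tol
        for cand in unused:
            cx, cy = sdi[cand]
            d = math.hypot(x - cx, y - cy)
            if d <= best_d:
                best, best_d = cand, d
        if best is None:
            changes.append(("new_in_vgi", name))
        else:
            matches.append((name, best))
            unused.discard(best)
    changes += [("missing_from_vgi", s) for s in sorted(unused)]
    return matches, changes

vgi = {"cafe": (10.0, 20.0), "school": (50.0, 52.0)}
sdi = {"cafe_old": (12.0, 21.0), "bridge": (200.0, 5.0)}
print(match_elements(vgi, sdi))
```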

  7. Internal combustion engines for alcohol motor fuels: a compilation of background technical information

    Energy Technology Data Exchange (ETDEWEB)

    Blaser, Richard

    1980-11-01

    This compilation, a draft training manual containing technical background information on internal combustion engines and alcohol motor fuel technologies, is presented in 3 parts. The first is a compilation of facts from the state of the art on internal combustion engine fuels and their characteristics and requisites and provides an overview of fuel sources, fuels technology and future projections for availability and alternatives. Part two compiles facts about alcohol chemistry, alcohol identification, production, and use, examines ethanol as spirit and as fuel, and provides an overview of modern evaluation of alcohols as motor fuels and of the characteristics of alcohol fuels. The final section compiles cross references on the handling and combustion of fuels for I.C. engines, presents basic evaluations of events leading to the use of alcohols as motor fuels, reviews current applications of alcohols as motor fuels, describes the formulation of alcohol fuels for engines and engine and fuel handling hardware modifications for using alcohol fuels, and introduces the multifuel engines concept. (LCL)

  8. Borrowing and Dictionary Compilation: The Case of the Indigenous ...

    African Journals Online (AJOL)

    rbr

    Keywords: BORROWING, DICTIONARY COMPILATION, INDIGENOUS LANGUAGES,. LEXICON, MORPHEME, VOCABULARY, DEVELOPING LANGUAGES, LOAN WORDS, TER-. MINOLOGY, ETYMOLOGY, LEXICOGRAPHY. Opsomming: Ontlening en woordeboeksamestelling: Die geval van in- heemse Suid-Afrikaanse ...

  9. Scientific Programming with High Performance Fortran: A Case Study Using the xHPF Compiler

    Directory of Open Access Journals (Sweden)

    Eric De Sturler

    1997-01-01

    Full Text Available Recently, the first commercial High Performance Fortran (HPF) subset compilers have appeared. This article reports on our experiences with the xHPF compiler of Applied Parallel Research, version 1.2, for the Intel Paragon. At this stage, we do not expect very high performance from our HPF programs, even though performance will eventually be of paramount importance for the acceptance of HPF. Instead, our primary objective is to study how to convert large Fortran 77 (F77) programs to HPF such that the compiler generates reasonably efficient parallel code. We report on a case study that identifies several problems when parallelizing code with HPF; most of these problems affect current HPF compiler technology in general, although some are specific for the xHPF compiler. We discuss our solutions from the perspective of the scientific programmer, and present timing results on the Intel Paragon. The case study comprises three programs of different complexity with respect to parallelization. We use the dense matrix-matrix product to show that the distribution of arrays and the order of nested loops significantly influence the performance of the parallel program. We use Gaussian elimination with partial pivoting to study the parallelization strategy of the compiler. There are various ways to structure this algorithm for a particular data distribution. This example shows how much effort may be demanded from the programmer to support the compiler in generating an efficient parallel implementation. Finally, we use a small application to show that the more complicated structure of a larger program may introduce problems for the parallelization, even though all subroutines of the application are easy to parallelize by themselves. The application consists of a finite volume discretization on a structured grid and a nested iterative solver. Our case study shows that it is possible to obtain reasonably efficient parallel programs with xHPF, although the compiler
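    The loop-order finding has a sequential echo that is easy to reproduce: traversing a row-major array along its contiguous axis is markedly faster than across it. This is only a loose single-node analogue of the distribution effects measured on the Paragon.

```python
import time
import numpy as np

# Same arithmetic, different memory traversal over a C-ordered array.
A = np.random.rand(3000, 3000)

t0 = time.perf_counter()
row_total = sum(A[i, :].sum() for i in range(A.shape[0]))  # contiguous rows
t_rows = time.perf_counter() - t0

t0 = time.perf_counter()
col_total = sum(A[:, j].sum() for j in range(A.shape[1]))  # strided columns
t_cols = time.perf_counter() - t0

print(f"rows {t_rows:.3f}s vs cols {t_cols:.3f}s; "
      f"equal sums: {np.isclose(row_total, col_total)}")
```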

  10. Compilation of field-scale caisson data on solute transport in the unsaturated zone

    International Nuclear Information System (INIS)

    Polzer, W.L.; Essington, E.H.; Fuentes, H.R.; Nyhan, J.W.

    1986-11-01

    Los Alamos National Laboratory has conducted technical support studies to assess siting requirements mandated by Nuclear Regulatory Commission in 10 CFR Part 61. Field-scale transport studies were conducted under unsaturated moisture conditions and under steady and unsteady flow conditions in large caissons located and operated in a natural (field) environment. Moisture content, temperature, flow rate, base-line chemical, tracer influent, and tracer breakthrough data collected during tracer migration studies in the caisson are compiled in tables and graphs. Data suggest that the imposition of a period of drainage (influent solution flow was stopped) may cause an increase in tracer concentration in the soil solution at various sampling points in the caisson. Evaporation during drainage and diffusion of the tracers from immobile to mobile water are two phenomena that could explain the increase. Data also suggest that heterogeneity of sorption sites may increase the variability in transport of sorbing tracers compared with nonsorbing tracers

  11. Working Memory Updating Latency Reflects the Cost of Switching between Maintenance and Updating Modes of Operation

    Science.gov (United States)

    Kessler, Yoav; Oberauer, Klaus

    2014-01-01

    Updating and maintenance of information are 2 conflicting demands on working memory (WM). We examined the time required to update WM (updating latency) as a function of the sequence of updated and not-updated items within a list. Participants held a list of items in WM and updated a variable subset of them in each trial. Four experiments that vary…

  12. Approximate Compilation of Constraints into Multivalued Decision Diagrams

    DEFF Research Database (Denmark)

    Hadzic, Tarik; Hooker, John N.; O’Sullivan, Barry

    2008-01-01

    We present an incremental refinement algorithm for approximate compilation of constraint satisfaction models into multivalued decision diagrams (MDDs). The algorithm uses a vertex splitting operation that relies on the detection of equivalent paths in the MDD. Although the algorithm is quite gene...

  13. Compilation of TFTR materials data

    International Nuclear Information System (INIS)

    Havener, W.J.

    1975-12-01

    In order to document the key thermophysical property data used in the conceptual design of Tokamak Fusion Test Reactor (TFTR) systems and components, a series of data packages has been prepared. It is expected that data for additional materials will be added and the information already provided will be updated to provide a project-wide data base

  14. FRMAC Updates

    International Nuclear Information System (INIS)

    Mueller, P.

    1995-01-01

    This talk describes updates to the following FRMAC publications concerning radiation emergencies: Monitoring and Analysis Manual; Evaluation and Assessment Manual; Handshake Series (biannual), including exercises participated in; Environmental Data and Instrument Transmission System (EDITS); Plume in a Box, with all radiological data stored on a hand-held computer; and courses given

  15. Compiler-Enforced Cache Coherence Using a Functional Language

    Directory of Open Access Journals (Sweden)

    Rich Wolski

    1996-01-01

    Full Text Available The cost of hardware cache coherence, both in terms of execution delay and operational cost, is substantial for scalable systems. Fortunately, compiler-generated cache management can reduce program serialization due to cache contention; increase execution performance; and reduce the cost of parallel systems by eliminating the need for more expensive hardware support. In this article, we use the Sisal functional language system as a vehicle to implement and investigate automatic, compiler-based cache management. We describe our implementation of Sisal for the IBM Power/4. The Power/4, briefly available as a product, represents an early attempt to build a shared memory machine that relies strictly on the language system for cache coherence. We discuss the issues associated with deterministic execution and program correctness on a system without hardware coherence, and demonstrate how Sisal (as a functional language) is able to address those issues.

  16. Update on diagnostic strategies of pulmonary embolism

    International Nuclear Information System (INIS)

    Kauczor, H.U.; Heussel, C.P.; Thelen, M.

    1999-01-01

    Acute pulmonary embolism is a frequent disease with non-specific findings, high mortality, and multiple therapeutic options. A definitive diagnosis must be established by accurate, non-invasive, easily performed, cost-effective, and widely available imaging modalities. Conventional diagnostic strategies have relied on ventilation-perfusion scintigraphy complemented by venous imaging. If the results are inconclusive, pulmonary angiography, which is regarded as the gold standard, is to be performed. Recently, marked improvements in CT and MRI and shortcomings of scintigraphy led to an update of the diagnostic strategy. Spiral CT is successfully employed as a second-line procedure to clarify indeterminate scintigraphic results, avoiding pulmonary angiography. It can also be used as a first-line screening tool if service and expertise are provided. Venous imaging is indicated if CT is inconclusive. The MRI technique can be applied as an alternative second-line test if spiral CT is not available or is contraindicated. It has the greatest potential for further developments and refinements. Echocardiography should be used as a first-line bedside examination in critical patients. If it is inconclusive, stabilized patients undergo spiral CT, while unstable patients should be referred for pulmonary angiography. Chronic thromboembolic pulmonary hypertension is a rare sequela of acute pulmonary embolism which can be cured surgically. Morphology, complications, and differential diagnoses are better illustrated by spiral CT and MRA, whereas invasive acquisition of hemodynamic data is the sole advantage of angiography. (orig.)

  17. Sweet Syndrome: A Review and Update.

    Science.gov (United States)

    Villarreal-Villarreal, C D; Ocampo-Candiani, J; Villarreal-Martínez, A

    2016-06-01

    Sweet syndrome is the most representative entity of the febrile neutrophilic dermatoses. It typically presents in patients with pyrexia, neutrophilia, and painful, tender erythematous papules, nodules, and plaques, often distributed asymmetrically. Frequent sites include the face, neck, and upper extremities. Affected sites show a characteristic neutrophilic infiltrate in the upper dermis. Its etiology remains to be elucidated, but it appears to be mediated by a hypersensitivity reaction in which cytokines, followed by infiltration of neutrophils, may be involved. Systemic corticosteroids are the first line of treatment in most cases. We present a concise review of the pathogenesis, classification, diagnosis, and treatment update of this entity. Copyright © 2015 AEDV. Published by Elsevier España, S.L.U. All rights reserved.

  18. ATOMIC DATA FOR ABSORPTION-LINES FROM THE GROUND-LEVEL AT WAVELENGTHS GREATER-THAN-228-ANGSTROM

    NARCIS (Netherlands)

    VERNER, DA; BARTHEL, PD; TYTLER, D

    1994-01-01

    We list wavelengths, statistical weights and oscillator strengths for 2249 spectral lines arising from the ground states of atoms and ions. The compilation covers all wavelengths longward of the HeII Lyman limit at 227.838 Angstrom and all the ion states of all elements from hydrogen to bismuth (Z =

  19. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This report presents a compilation of both fracture properties and hydrogeological parameters relevant to the flow of groundwater in fractured rock systems. Methods of data acquisition as well as the scale of and conditions during the measurement are recorded. Measurements and analytical techniques for each of the parameters under consideration have been reviewed with respect to their methodology, assumptions and accuracy. Both the rock type and geologic setting associated with these measurements have also been recorded. 373 refs

  20. The National Assessment of Shoreline Change:A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the Sandy Shorelines of the California Coast

    Science.gov (United States)

    Hapke, Cheryl J.; Reid, David

    2006-01-01

    Introduction The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector shorelines and shoreline change rates for the sandy shoreline along the California open coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along many open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard, repeatable methods for mapping and analyzing shoreline movement so that periodic, systematic, and internally consistent updates of shorelines and shoreline change rates can be made at a National Scale. This data compilation for open-ocean, sandy shorelines of the California coast is one in a series that already includes the Gulf of Mexico and the Southeast Atlantic Coast (Morton et al., 2004; Morton et al., 2005) and will eventually cover Washington, Oregon, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are determined by comparing the positions of three historical shorelines digitized from maps, with a modern shoreline derived from LIDAR (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time-periods: 1850s-1880s, 1920s-1930s, and late 1940s-1970s. The most recent shoreline is from data collected between 1997 and 2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change of the

  1. QMODULE: CAMAC modules recognized by the QAL compiler

    International Nuclear Information System (INIS)

    Kellogg, M.; Minor, M.M.; Shlaer, S.; Spencer, N.; Thomas, R.F. Jr.; van der Beken, H.

    1977-10-01

    The compiler for the Q Analyzer Language, QAL, recognizes a certain set of CAMAC modules as having known characteristics. The conventions and procedures used to describe these modules are discussed as well as the tools available to the user for extending this set as required

  2. Nuclear power plant operational data compilation system

    International Nuclear Information System (INIS)

    Silberberg, S.

    1980-01-01

    Electricite de France R and D Division has set up a nuclear power plant operational data compilation system. This data bank, built up initially from American documents, provides results on plant operation and the in-service behaviour of plant materials. At present, French units in commercial operation are taken into account. Results obtained after five years of data bank operation are given. (author)

  3. Allocentrically implied target locations are updated in an eye-centred reference frame.

    Science.gov (United States)

    Thompson, Aidan A; Glover, Christopher V; Henriques, Denise Y P

    2012-04-18

    When reaching to remembered target locations following an intervening eye movement a systematic pattern of error is found indicating eye-centred updating of visuospatial memory. Here we investigated if implicit targets, defined only by allocentric visual cues, are also updated in an eye-centred reference frame as explicit targets are. Participants viewed vertical bars separated by varying distances, and horizontal lines of equivalently varying lengths, implying a "target" location at the midpoint of the stimulus. After determining the implied "target" location from only the allocentric stimuli provided, participants saccaded to an eccentric location, and reached to the remembered "target" location. Irrespective of the type of stimulus reaching errors to these implicit targets are gaze-dependent, and do not differ from those found when reaching to remembered explicit targets. Implicit target locations are coded and updated as a function of relative gaze direction with respect to those implied locations just as explicit targets are, even though no target is specifically represented. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  4. [An update on myasthenia gravis].

    Science.gov (United States)

    Martínez Torre, S; Gómez Molinero, I; Martínez Girón, R

    2018-03-16

    Myasthenia gravis is one of the most common disorders affecting neuromuscular transmission. It is currently one of the best understood and characterised autoimmune disorders. Its typical symptoms are fluctuating weakness and fatigue affecting a combination of ocular muscles, bulbar functions, and limb and respiratory muscles, due to an immune attack against the postsynaptic membrane of the neuromuscular junction. The diagnosis of myasthenia gravis is based on clinical and serological tests. It is a disease that can be effectively controlled with the current therapeutic lines, even achieving complete remission. An update of this interesting disorder is presented here. Copyright © 2018 Sociedad Española de Médicos de Atención Primaria (SEMERGEN). Published by Elsevier España, S.L.U. All rights reserved.

  5. Critically Evaluated Energy Levels, Spectral Lines, Transition Probabilities, and Intensities of Neutral Vanadium (V I)

    Energy Technology Data Exchange (ETDEWEB)

    Saloman, Edward B. [Dakota Consulting, Inc., 1110 Bonifant Street, Suite 310, Silver Spring, MD 20910 (United States); Kramida, Alexander [National Institute of Standards and Technology, Gaithersburg, MD 20899 (United States)

    2017-08-01

    The energy levels, observed spectral lines, and transition probabilities of the neutral vanadium atom, V I, have been compiled. Also included are values for some forbidden lines that may be of interest to the astrophysical community. Experimental Landé g-factors and leading percentage compositions for the levels are included where available, as well as wavelengths calculated from the energy levels (Ritz wavelengths). Wavelengths are reported for 3985 transitions, and 549 energy levels are determined. The observed relative intensities, normalized to a common scale, are provided.

  6. HAL/S-360 compiler system specification

    Science.gov (United States)

    Johnson, A. E.; Newbold, P. N.; Schulenberg, C. W.; Avakian, A. E.; Varga, S.; Helmers, P. H.; Helmers, C. T., Jr.; Hotz, R. L.

    1974-01-01

    A three phase language compiler is described which produces IBM 360/370 compatible object modules and a set of simulation tables to aid in run time verification. A link edit step augments the standard OS linkage editor. A comprehensive run time system and library provide the HAL/S operating environment, error handling, a pseudo real time executive, and an extensive set of mathematical, conversion, I/O, and diagnostic routines. The specifications of the information flow and content for this system are also considered.

  7. T.J. Kriel (original compiler), D.J. Prinsloo and B.P. Sathekge (compilers revised edition). Popular Northern Sotho Dictionary

    OpenAIRE

    Kwena J. Mashamaite

    2011-01-01

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the case with this dictionary.

  8. Compiler-Directed Transformation for Higher-Order Stencils

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-20

    As the cost of data movement increasingly dominates performance, developers of finite-volume and finite-difference solutions for partial differential equations (PDEs) are exploring novel higher-order stencils that increase numerical accuracy and computational intensity. This paper describes a new compiler reordering transformation applied to stencil operators that performs partial sums in buffers, and reuses the partial sums in computing multiple results. This optimization has multiple effects on improving stencil performance that are particularly important to higher-order stencils: it exploits data reuse, reduces floating-point operations, and exposes efficient SIMD parallelism to backend compilers. We study the benefit of this optimization in the context of Geometric Multigrid (GMG), a widely used method to solve PDEs, using four different Jacobi smoothers built from 7-, 13-, 27-, and 125-point stencils. We quantify performance, speedup, and numerical accuracy, and use the Roofline model to qualify our results. Ultimately, we obtain over 4× speedup on the smoothers themselves and up to a 3× speedup on the multigrid solver. Finally, we demonstrate that high-order multigrid solvers have the potential of reducing total data movement and energy by several orders of magnitude.
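    The partial-sums transformation is easiest to see in one dimension. A minimal sketch for a 5-point box stencil, assuming a prefix-sum buffer stands in for the paper's compiler-managed buffers:

```python
import numpy as np

def stencil_naive(u):
    # five adds per output point
    return np.array([u[i-2] + u[i-1] + u[i] + u[i+1] + u[i+2]
                     for i in range(2, len(u) - 2)]) / 5.0

def stencil_partial_sums(u):
    # prefix sums computed once; every output reuses them (two ops each)
    prefix = np.cumsum(np.concatenate(([0.0], u)))
    return (prefix[5:] - prefix[:-5]) / 5.0

u = np.random.rand(10)
print(np.allclose(stencil_naive(u), stencil_partial_sums(u)))  # True
```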

  9. The pH dependent surface charging and points of zero charge. VII. Update.

    Science.gov (United States)

    Kosmulski, Marek

    2018-01-01

    The pristine points of zero charge (PZC) and isoelectric points (IEP) of metal oxides and IEP of other materials from the recent literature, and a few older results (overlooked in previous searches), are summarized. This study is an update of the previous compilations by the same author [Surface Charging and Points of Zero Charge, CRC, Boca Raton, 2009; J. Colloid Interface Sci. 337 (2009) 439; 353 (2011) 1; 426 (2014) 209]. The field has been very active, but most PZC and IEP are reported for materials which are very well-documented already (silica, alumina, titania, iron oxides). IEP of (nominally) Gd2O3, NaTaO3, and SrTiO3 have been reported in the recent literature. Their IEP were not reported in older studies. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Memory updating and mental arithmetic

    Directory of Open Access Journals (Sweden)

    Cheng-Ching eHan

    2016-02-01

    Full Text Available Is domain-general memory updating ability predictive of calculation skills or are such skills better predicted by the capacity for updating specifically numerical information? Here, we used multidigit mental multiplication (MMM) as a measure of calculation skill, as this operation requires the accurate maintenance and updating of information in addition to skills needed for arithmetic more generally. In Experiment 1, we found that only individual differences with regard to a task updating numerical information following addition (MUcalc) could predict the performance of MMM, perhaps owing to common elements between the task and MMM. In Experiment 2, new updating tasks were designed to clarify this: a spatial updating task with no numbers, a numerical task with no calculation, and a word task. The results showed that both MUcalc and the spatial task were able to predict the performance of MMM, but only with the more difficult problems, while the other updating tasks did not predict performance. It is concluded that relevant processes involved in updating the contents of working memory support mental arithmetic in adults.

  11. Expert Programmer versus Parallelizing Compiler: A Comparative Study of Two Approaches for Distributed Shared Memory

    Directory of Open Access Journals (Sweden)

    M. F. P. O'Boyle

    1996-01-01

    Full Text Available This article critically examines current parallel programming practice and optimizing compiler development. The general strategies employed by compiler and programmer to optimize a Fortran program are described, and then illustrated for a specific case by applying them to a well-known scientific program, TRED2, using the KSR-1 as the target architecture. Extensive measurement is applied to the resulting versions of the program, which are compared with a version produced by a commercial optimizing compiler, KAP. The compiler strategy significantly outperforms KAP and does not fall far short of the performance achieved by the programmer. Following the experimental section each approach is critiqued by the other. Perceived flaws, advantages, and common ground are outlined, with an eye to improving both schemes.

  12. An update on neurotoxin products and administration methods.

    Science.gov (United States)

    Lanoue, Julien; Dong, Joanna; Do, Timothy; Goldenberg, Gary

    2016-09-01

    Since onabotulinumtoxinA for nonsurgical aesthetic enhancement of glabellar lines was initially reported, the popularity of botulinum neurotoxin (BoNT) products among both clinicians and consumers has rapidly grown, and we have seen several additional BoNT formulations enter the market. As the demand for minimally invasive cosmetic procedures continues to increase, we will see the introduction of additional formulations of BoNT products as well as new delivery devices and administration techniques. In this article, we provide a brief update on current and upcoming BoNT products and also review the literature on novel administration methods based on recently published studies.

  13. Compilation of selected marine radioecological data for the US Subseabed Program: Summaries of available radioecological concentration factors and biological half-lives

    International Nuclear Information System (INIS)

    Gomez, L.S.; Marietta, M.G.; Jackson, D.W.

    1987-04-01

    The US Subseabed Disposal Program has compiled an extensive concentration factor and biological half-life data base from the international marine radioecological literature. A microcomputer-based data management system has been implemented to provide statistical and graphic summaries of these data. The data base is constructed in a manner which allows subsets to be sorted using a number of interstudy variables such as organism category, tissue/organ category, geographic location (for in situ studies), and several laboratory-related conditions (e.g., exposure time and exposure concentration). This report updates earlier reviews and provides summaries of the tabulated data. In addition to the concentration factor/biological half-life data base, we provide an outline of other published marine radioecological works. Our goal is to present these data in a form that enables those concerned with predictive assessment of radiation dose in the marine environment to make a more judicious selection of data for a given application. 555 refs., 19 figs., 7 tabs

  14. Compilation of selected marine radioecological data for the US Subseabed Program: Summaries of available radioecological concentration factors and biological half-lives

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, L.S.; Marietta, M.G.; Jackson, D.W.

    1987-04-01

    The US Subseabed Disposal Program has compiled an extensive concentration factor and biological half-life data base from the international marine radioecological literature. A microcomputer-based data management system has been implemented to provide statistical and graphic summaries of these data. The data base is constructed in a manner which allows subsets to be sorted using a number of interstudy variables such as organism category, tissue/organ category, geographic location (for in situ studies), and several laboratory-related conditions (e.g., exposure time and exposure concentration). This report updates earlier reviews and provides summaries of the tabulated data. In addition to the concentration factor/biological half-life data base, we provide an outline of other published marine radioecological works. Our goal is to present these data in a form that enables those concerned with predictive assessment of radiation dose in the marine environment to make a more judicious selection of data for a given application. 555 refs., 19 figs., 7 tabs.

  15. Comparison of the updated solutions of the 6th dynamic AER Benchmark - main steam line break in a NPP with WWER-440

    International Nuclear Information System (INIS)

    Kliem, S.

    2003-01-01

    The 6th dynamic AER Benchmark is used for the systematic validation of coupled 3D neutron kinetic/thermal hydraulic system codes. It was defined at the 10th AER Symposium. In this benchmark, a hypothetical double-ended break of one main steam line at full power in a WWER-440 plant is investigated. The main thermal hydraulic features are the consideration of incomplete coolant mixing in the lower and upper plenum of the reactor pressure vessel and an asymmetric operation of the feed water system. For the tuning of the different nuclear cross section data used by the participants, an isothermal re-criticality temperature was defined. The paper gives an overview of the behaviour of the main thermal hydraulic and neutron kinetic parameters in the provided solutions. The differences in the updated solutions in comparison to the previous ones are described. Improvements in the modelling of the transient led to better agreement for part of the results, while for another part the deviations increased. The sensitivity of the core power behaviour to the secondary side modelling is discussed in detail (Authors)

  16. CHOgenome.org 2.0: Genome resources and website updates.

    Science.gov (United States)

    Kremkow, Benjamin G; Baik, Jong Youn; MacDonald, Madolyn L; Lee, Kelvin H

    2015-07-01

    Chinese hamster ovary (CHO) cells are a major host cell line for the production of therapeutic proteins, and CHO cell and Chinese hamster (CH) genomes have recently been sequenced using next-generation sequencing methods. CHOgenome.org was launched in 2011 (version 1.0) to serve as a database repository and to provide bioinformatics tools for the CHO community. CHOgenome.org (version 1.0) maintained GenBank CHO-K1 genome data, identified CHO-omics literature, and provided a CHO-specific BLAST service. Recent major updates to CHOgenome.org (version 2.0) include new sequence and annotation databases for both CHO and CH genomes, a more user-friendly website, and new research tools, including a proteome browser and a genome viewer. CHO cell-line specific sequences and annotations facilitate cell line development opportunities, several of which are discussed. Moving forward, CHOgenome.org will host the increasing amount of CHO-omics data and continue to make useful bioinformatics tools available to the CHO community. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Source list of nuclear data bibliographies, compilations, and evaluations

    International Nuclear Information System (INIS)

    Burrows, T.W.; Holden, N.E.

    1978-10-01

    To aid the user of nuclear data, many specialized bibliographies, compilations, and evaluations have been published. This document is an attempt to bring together a list of such publications with an indication of their availability and cost

  18. T.J. Kriel (original compiler), D.J. Prinsloo and B.P. Sathekge (compilers revised edition). Popular Northern Sotho Dictionary

    Directory of Open Access Journals (Sweden)

    Kwena J. Mashamaite

    2011-10-01

    Full Text Available The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the case with this dictionary.

  19. Design Choices in a Compiler Course or How to Make Undergraduates Love Formal Notation

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    2008-01-01

    The undergraduate compiler course offers a unique opportunity to combine many aspects of the Computer Science curriculum. We discuss the many design choices that are available for the instructor and present the current compiler course at the University of Aarhus, the design of which displays at l...

  20. A Coarse-Grained Reconfigurable Architecture with Compilation for High Performance

    Directory of Open Access Journals (Sweden)

    Lu Wan

    2012-01-01

    Full Text Available We propose a fast data relay (FDR) mechanism to enhance existing CGRAs (coarse-grained reconfigurable architectures). FDR can not only provide multicycle data transmission concurrently with computations but also convert resource-demanding inter-processing-element global data accesses into local data accesses to avoid communication congestion. We also propose the supporting compiler techniques that can efficiently utilize the FDR feature to achieve higher performance for a variety of applications. Our results on FDR-based CGRA are compared with two other works in this field: ADRES and RCP. Experimental results for various multimedia applications show that FDR combined with the new compiler delivers up to 29% and 21% higher performance than ADRES and RCP, respectively.

  1. Construction experiences from underground works at Forsmark. Compilation Report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders [Vattenfall Power Consultant AB, Stockholm (Sweden); Christiansson, Rolf [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)

    2007-02-15

    The main objective with this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily on rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material, as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  2. Construction experiences from underground works at Forsmark. Compilation Report

    International Nuclear Information System (INIS)

    Carlsson, Anders; Christiansson, Rolf

    2007-02-01

    The main objective with this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily on rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material, as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  3. Specification and Compilation of Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Geuns, S.J.

    2015-01-01

This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. An example of such applications is software-defined radio. These applications typically

  4. Molecular line parameters for the atmospheric trace molecule spectroscopy experiment

    Science.gov (United States)

    Brown, L. R.; Farmer, C. B.; Toth, R. A.; Rinsland, Curtis P.

    1987-01-01

During its first mission in 1985 onboard Spacelab 3, the ATMOS (atmospheric trace molecule spectroscopy) instrument, a high speed Fourier transform spectrometer, produced a large number of high resolution infrared solar absorption spectra recorded in the occultation mode. The analysis and interpretation of these data in terms of composition, chemistry, and dynamics of the earth's upper atmosphere required good knowledge of the molecular line parameters for those species giving rise to the absorptions in the atmospheric spectra. This paper describes the spectroscopic line parameter database compiled for the ATMOS experiment and referenced in other papers describing ATMOS results. With over 400,000 entries, the linelist catalogs parameters of 46 minor and trace species in the 1-10,000 cm-1 region.

  5. New compilers speed up applications for Intel-based systems; Intel Compilers pave the way for Intel's Hyper-threading technology

    CERN Multimedia

    2002-01-01

    "Intel Corporation today introduced updated tools to help software developers optimize applications for Intel's expanding family of architectures with key innovations such as Intel's Hyper Threading Technology (1 page).

  6. Compilation of a preliminary checklist for the differential diagnosis of neurogenic stuttering

    Directory of Open Access Journals (Sweden)

    Mariska Lundie

    2014-06-01

Objectives: The aim of this study was to describe and highlight the characteristics of neurogenic stuttering (NS) in order to compile a preliminary checklist for accurate diagnosis and intervention. Method: An explorative, applied mixed-method, multiple case study research design was followed. Purposive sampling was used to select four participants. A comprehensive assessment battery was compiled for data collection. Results: The results revealed a distinct pattern of core stuttering behaviours in NS, although discrepancies existed regarding stuttering severity and frequency. It was also found that developmental stuttering (DS) and NS can co-occur. The case history and the core stuttering pattern are important considerations during differential diagnosis, as these are the only consistent characteristics in people with NS. Conclusion: It is unlikely that all the symptoms of NS are present in an individual. The researchers scrutinised the findings of this study and the findings of previous literature to compile a potentially workable checklist.

  7. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia; Angulo, C.; Arnould, M.

    2000-01-01

The rapidly growing wealth of nuclear data becomes less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (authors)

  8. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia

    1999-01-01

The rapidly growing wealth of nuclear data becomes less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (author)

  9. Not mere lexicographic cosmetics: the compilation and structural ...

    African Journals Online (AJOL)

This article offers a brief overview of the compilation of the Ndebele music terms dictionary, Isichazamazwi SezoMculo (henceforth the ISM), paying particular attention to its structural features. It emphasises that the reference needs of the users as well as their reference skills should be given a determining role in all ...

  10. Portuguese recommendations for the use of methotrexate in rheumatic diseases - 2016 update.

    Science.gov (United States)

    Duarte, Ana Catarina; Santos-Faria, Daniela; Gonçalves, Maria João; Sepriano, Alexandre; Mourão, Ana Filipa; Duarte, Cátia; Neves, Joana Sousa; Águeda, Ana Filipa; Ribeiro, Pedro Avila; Daniel, Alexandra; Neto, Adriano; Cordeiro, Ana; Rodrigues, Ana; Barcelos, Anabela; Silva, Cândida; Ponte, Cristina; Vieira-Sousa, Elsa; Teixeira, Filipa; Oliveira-Ramos, Filipa; Araújo, Filipe; Barcelos, Filipe; Canhão, Helena; Santos, Helena; Ramos, João; Polido-Pereira, Joaquim; Tavares-Costa, José; Melo Gomes, José António; Cunha-Miranda, Luís; Costa, Lúcia; Cerqueira, Marcos; Cruz, Margarida; Santos, Maria José; Bernardes, Miguel; Oliveira, Paula; Abreu, Pedro; Figueira, Ricardo; Barros, Rita; Falcão, Sandra; Pinto, Patrícia; Pimenta, Sofia; Capela, Susana; Teixeira, Vitor; Fonseca, João Eurico

    2017-01-01

Methotrexate (MTX) is the first-line drug in the treatment of rheumatoid arthritis (RA) and the most commonly prescribed disease-modifying anti-rheumatic drug. Moreover, it is also used as an adjuvant drug in patients under biologic therapies, enhancing the efficacy of biologic agents. The aim of this work was to review the literature and update the Portuguese recommendations for the use of MTX in rheumatic diseases, first published in 2009. The first Portuguese guidelines for the use of MTX in rheumatic diseases were published in 2009 and were integrated in the multinational 3E Initiative (Evidence Expertise Exchange) project. The Portuguese rheumatologists, based on literature evidence and consensus opinion, formulated 13 recommendations. At a national meeting, the recommendations included in this document were further discussed and updated. The document resulting from this meeting circulated to all Portuguese rheumatologists, who anonymously voted online on the level of agreement with the updated recommendations. Results presented in this article are mainly in accordance with the previous guidelines, with some new information regarding hepatitis B infection during MTX treatment, pulmonary toxicity monitoring, hepatotoxicity management, association with hematologic neoplasms, combination therapy and tuberculosis screening during treatment. The present recommendations combine scientific evidence with expert opinion and attained desirable agreement among Portuguese rheumatologists. The regular update of these recommendations is essential in order to keep them a valid and useful tool in daily practice.

  11. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1997, July-September

    International Nuclear Information System (INIS)

    Stevenson, L.L.

    1998-01-01

This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. This report contains the third quarter 1997 abstracts.

  12. How update schemes influence crowd simulations

    International Nuclear Information System (INIS)

    Seitz, Michael J; Köster, Gerta

    2014-01-01

    Time discretization is a key modeling aspect of dynamic computer simulations. In current pedestrian motion models based on discrete events, e.g. cellular automata and the Optimal Steps Model, fixed-order sequential updates and shuffle updates are prevalent. We propose to use event-driven updates that process events in the order they occur, and thus better match natural movement. In addition, we present a parallel update with collision detection and resolution for situations where computational speed is crucial. Two simulation studies serve to demonstrate the practical impact of the choice of update scheme. Not only do density-speed relations differ, but there is a statistically significant effect on evacuation times. Fixed-order sequential and random shuffle updates with a short update period come close to event-driven updates. The parallel update scheme overestimates evacuation times. All schemes can be employed for arbitrary simulation models with discrete events, such as car traffic or animal behavior. (paper)
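
Three of the update schemes compared above can be sketched compactly (the parallel scheme with collision detection and resolution is omitted). This is an illustrative sketch, not code from the cited models; the step callback and the next_event_time attribute are assumptions of the example.

```python
import heapq
import random

def sequential_update(agents, step):
    # Fixed-order sequential update: agents move one after another in a
    # constant order, each seeing the already-updated state of its
    # predecessors within the same time step.
    for agent in agents:
        step(agent)

def shuffle_update(agents, step):
    # Random shuffle update: the same, but the order is re-drawn every
    # time step, removing the bias introduced by a fixed order.
    order = list(agents)
    random.shuffle(order)
    for agent in order:
        step(agent)

def event_driven_update(agents, step, horizon):
    # Event-driven update: each agent carries the time of its next move;
    # events are processed in the order they occur, which better matches
    # natural movement than stepping all agents on a common clock.
    queue = [(agent.next_event_time, i) for i, agent in enumerate(agents)]
    heapq.heapify(queue)
    while queue and queue[0][0] <= horizon:
        _, i = heapq.heappop(queue)
        step(agents[i])  # expected to advance agents[i].next_event_time
        heapq.heappush(queue, (agents[i].next_event_time, i))
```

The reported observation that sequential and shuffle updates with a short update period come close to event-driven updates can be tested directly by swapping these functions in a single simulation loop.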

  13. An Update to the Budget and Economic Outlook: 2014 to 2024

    Science.gov (United States)

    2014-08-01

... reflect the data available and projections made before July 30. ... U.S. dollar and the currencies of major U.S. trading partners (using shares of U.S. trade as weights) in 2015 and later years will reduce the foreign-currency price of U.S. exports and increase the domestic price of U.S. imports. ... An Update to the Budget and Economic Outlook: 2014 to 2024, August 2014, CBO. The economic projections were prepared by the Macroeconomic ...

  14. Data compilation for particle impact desorption

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeuchi, Fujio.

    1984-05-01

    The desorption of gases from solid surfaces by incident electrons, ions and photons is one of the important processes of hydrogen recycling in the controlled thermonuclear reactors. We have surveyed the literature concerning the particle impact desorption published through 1983 and compiled the data on the desorption cross sections and desorption yields with the aid of a computer. This report presents the results obtained for electron stimulated desorption, the desorption cross sections and yields being given in graphs and tables as functions of incident electron energy, surface temperature and gas exposure. (author)

  15. Compilation of actinide neutron nuclear data

    International Nuclear Information System (INIS)

    1979-01-01

The Swedish nuclear data committee has compiled a selected set of neutron cross section data for the 16 most important actinide isotopes. The aim of the report is to present available data in a comprehensible way to allow a comparison between different evaluated libraries and to judge the reliability of these libraries from the experimental data. The data are given in graphical form below about 1 eV and above about 10 keV, while the 2200 m/s cross sections and resonance integrals are given in numerical form. (G.B.)

  16. Bibliography of references to avian botulism: Update

    Science.gov (United States)

    Wilson, Sonoma S.; Locke, Louis N.

    1982-01-01

This bibliography, first compiled in 1970 (Allen and Wilson 1977) and published in 1977 in response to many requests for information on avian botulism, has been updated to include the literature published through 1980. In general, only articles dealing primarily with the avian disease are included, as opposed to those concerned with the various aspects of the biology of Clostridium botulinum, either type C or E. A few exceptions, such as Bengtson's report of the first isolation and description of the type C organism, are included for their historical interest. Progress reports and other administrative documents not available for distribution on request are excluded, as are most textbook accounts, which are generally summaries of work published elsewhere. This bibliography was a cooperative effort by the National Wildlife Health Laboratory, U.S. Fish and Wildlife Service, and the U.S. National Park Service. The National Park Service provided partial funding for the work through Contract No. 89100-0491. Although the authors attempted to list every important reference, they make no claim to complete coverage of the published literature. The authors will be grateful to users of the bibliography who call attention to errors or omissions. Wayne I. Jensen (Retired); Milton Friend, Director, National Wildlife Health Laboratory

  17. Indexed compilation of experimental high energy physics literature

    International Nuclear Information System (INIS)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is presented, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  18. Updated Status and Performance at the Fourth HST COS FUV Lifetime Position

    Science.gov (United States)

    Taylor, Joanna M.; De Rosa, Gisella; Fix, Mees B.; Fox, Andrew; Indriolo, Nick; James, Bethan; Jedrzejewski, Robert I.; Oliveira, Cristina M.; Penton, Steven V.; Plesha, Rachel; Proffitt, Charles R.; Rafelski, Marc; Roman-Duval, Julia; Sahnow, David J.; Snyder, Elaine M.; Sonnentrucker, Paule; White, James

    2017-06-01

To mitigate the adverse effects of gain sag on the spectral quality and accuracy of the Hubble Space Telescope’s Cosmic Origins Spectrograph FUV observations, COS FUV spectra will be moved from Lifetime Position 3 (LP3) to a new pristine location on the detectors at LP4 in July 2017. To achieve maximal spectral resolution while preserving detector area, the spectra will be shifted in the cross-dispersion (XD) direction by -2.5" (about -31 pixels) from LP3, or -5" (about -62 pixels) from the original LP1. At LP4, the wavelength calibration lamp spectrum can overlap with the previously gain-sagged LP2 PSA spectrum location. If lamp lines fall in the gain-sag holes from LP2, line ratios can change and the wavelength calibration can fail. As a result, we have updated the Wavecal Parameters Reference Table and CalCOS to address this issue. Additionally, it was necessary to extend the current geometric correction in order to encompass the entire LP4 location. Here we present 2-D template profiles and 1-D spectral trace centroids derived at LP4, as well as LP4-related updates to the wavelength calibration and geometric correction.

  19. A Forth interpreter and compiler's study for computer aided design

    International Nuclear Information System (INIS)

    Djebbar, F. Zohra Widad

    1986-01-01

The wide field of application of FORTH led us to develop an interpreter. It has been implemented on a MC 68000 microprocessor based computer, with ASTERIX, a UNIX-like operating system (a real-time system written by C.E.A.). This work has been done in two different versions: - The first one, fully written in C language, ensures good portability across a wide variety of microprocessors. However, performance estimates showed excessive execution times and led to a new, optimized version. - This new version is characterized by the compilation of the most frequently used words of the FORTH basis. This yields an interpreter with good performance and an execution speed close to that obtained with the C compiler. (author) [fr
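
The interpret/compile split described in the abstract — interpreting most words, but compiling the most frequently used ones so they no longer need re-parsing — can be illustrated in a few lines. The sketch below is a toy in Python, not the authors' C/MC 68000 implementation; all names are invented for the example.

```python
def make_forth():
    # A toy FORTH-style system: a dictionary maps word names to callables
    # that operate on a shared data stack.
    stack = []
    words = {
        "+":   lambda: stack.append(stack.pop() + stack.pop()),
        "*":   lambda: stack.append(stack.pop() * stack.pop()),
        "dup": lambda: stack.append(stack[-1]),
        ".":   lambda: print(stack.pop()),
    }

    def compile_word(name, source):
        # "Compiling" here means resolving the constituent words once, at
        # definition time, instead of re-interpreting the source per call.
        body = [words[token] for token in source.split()]
        words[name] = lambda: [w() for w in body]

    def interpret(line):
        for token in line.split():
            if token in words:
                words[token]()            # execute a known word
            else:
                stack.append(int(token))  # literals go on the stack

    return interpret, compile_word

interpret, compile_word = make_forth()
compile_word("square", "dup *")
interpret("7 square .")  # prints 49
```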

  20. Regulatory and technical reports (abstract index journal). Annual compilation for 1984. Volume 9, No. 4

    International Nuclear Information System (INIS)

    1985-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  1. Compilation of the nuclear codes available in CTA

    International Nuclear Information System (INIS)

    D'Oliveira, A.B.; Moura Neto, C. de; Amorim, E.S. do; Ferreira, W.J.

    1979-07-01

The present work is a compilation of some nuclear codes available in the Divisao de Estudos Avancados of the Instituto de Atividades Espaciais (EAV/IAE/CTA). The codes are organized according to the classification given by the Argonne National Laboratory. For each code, the author, institution of origin, abstract, programming language and existing bibliography are given. (Author) [pt

  2. Regulatory and technical reports: (Abstract index journal). Compilation for first quarter 1997, January--March

    International Nuclear Information System (INIS)

    Sheehan, M.A.

    1997-06-01

This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation is published quarterly and cumulated annually. Reports consist of staff-originated reports, NRC-sponsored conference reports, NRC contractor-prepared reports, and international agreement reports

  3. The HITRAN 2008 molecular spectroscopic database

    International Nuclear Information System (INIS)

    Rothman, L.S.; Gordon, I.E.; Barbe, A.; Benner, D.Chris; Bernath, P.F.; Birk, M.; Boudon, V.; Brown, L.R.; Campargue, A.; Champion, J.-P.; Chance, K.; Coudert, L.H.; Dana, V.; Devi, V.M.; Fally, S.; Flaud, J.-M.

    2009-01-01

This paper describes the status of the 2008 edition of the HITRAN molecular spectroscopic database. The new edition is the first official public release since the 2004 edition, although a number of crucial updates had been made available online since 2004. The HITRAN compilation consists of several components that serve as input for radiative-transfer calculation codes: individual line parameters for the microwave through visible spectra of molecules in the gas phase; absorption cross-sections for molecules having dense spectral features, i.e. spectra in which the individual lines are not resolved; individual line parameters and absorption cross-sections for bands in the ultraviolet; refractive indices of aerosols; tables and files of general properties associated with the database; and database management software. The line-by-line portion of the database contains spectroscopic parameters for 42 molecules including many of their isotopologues.
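
Because the line-by-line portion is distributed as fixed-width text records, a small reader makes the structure concrete. The sketch below assumes the commonly documented 160-character record layout of the HITRAN 2004/2008 editions for the first eight fields; the column positions and the file name are assumptions to be checked against the official format description.

```python
def parse_hitran_line(line):
    # One transition per 160-character record; slice positions follow the
    # commonly documented HITRAN 2004/2008 layout (an assumption here).
    return {
        "molecule_id":  int(line[0:2]),      # HITRAN molecule number
        "isotopologue": int(line[2:3]),      # isotopologue index
        "wavenumber":   float(line[3:15]),   # line position, cm-1
        "intensity":    float(line[15:25]),  # line intensity at 296 K
        "einstein_a":   float(line[25:35]),  # Einstein A coefficient, 1/s
        "gamma_air":    float(line[35:40]),  # air-broadened half-width
        "gamma_self":   float(line[40:45]),  # self-broadened half-width
        "elower":       float(line[45:55]),  # lower-state energy, cm-1
    }

# hypothetical file name for the line-by-line part of the compilation
with open("hitran2008.par") as f:
    transitions = [parse_hitran_line(row) for row in f if len(row) >= 160]
```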

  4. A Journey from Interpreters to Compilers and Virtual Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

We review a simple sequence of steps to stage a programming-language interpreter into a compiler and virtual machine. We illustrate the applicability of this derivation with a number of existing virtual machines, mostly for functional languages. We then outline its relevance for today's language...

  5. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). CERN divisions are invited to avoid any change requests (set-ups, moves or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch.
Date - Change type - Affected areas
April 11 - Update of switch in LHC 4 - LHC 4 Point
April 14 - Update of switch in LHC 5 - LHC 5 Point
April 15 - Update of switches in LHC 3 and LHC 2 Points - LHC 3 and LHC 2
April 22 - Update of switch N4 - Meyrin Ouest
April 23 - Update of switch N6 - Prévessin Site
Ap...

  6. Compiling an OPEC Word List: A Corpus-Informed Lexical Analysis

    Directory of Open Access Journals (Sweden)

    Ebtisam Saleh Aluthman

    2017-01-01

The present study is conducted within the borders of lexicographic research, where corpora have increasingly become all-pervasive. The overall goal of this study is to compile an open-source OPEC[1] Word List (OWL) that is available for lexicographic research and vocabulary learning related to English language learning for the purpose of oil marketing and oil industries. To achieve this goal, an OPEC Monthly Reports Corpus (OMRC) comprising 1,004,542 words was compiled. The OMRC consists of 40 OPEC monthly reports released between 2003 and 2015. Consideration was given to both range and frequency criteria when compiling the OWL, which consists of 255 word types. Along with this basic goal, this study aims to investigate the coverage of the most well-recognised word lists, the General Service List of English Words (GSL) (West, 1953) and the Academic Word List (AWL) (Coxhead, 2000), in the OMRC corpus. The 255 word types included in the OWL do not overlap with either the AWL or the GSL. Results suggest the necessity of making this discipline-specific word list for ESL students of oil marketing industries. The availability of the OWL has significant pedagogical contributions to curriculum design, learning activities and the overall process of vocabulary learning in the context of teaching English for specific purposes (ESP). [1] OPEC stands for Organisation of Petroleum Exporting Countries.
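
The range-and-frequency selection mentioned above is straightforward to make concrete. Below is a minimal sketch assuming plain-text report files; the tokenizer and thresholds are illustrative, not the study's actual criteria.

```python
import re
from collections import Counter

def build_word_list(documents, min_range=20, min_freq=100):
    # Range = number of documents a word occurs in; frequency = total
    # occurrences across the corpus. Both thresholds are invented here.
    freq = Counter()
    doc_range = Counter()
    for text in documents:
        tokens = re.findall(r"[a-z]+", text.lower())
        freq.update(tokens)
        doc_range.update(set(tokens))  # count each document once
    return sorted(w for w in freq
                  if doc_range[w] >= min_range and freq[w] >= min_freq)

# e.g. documents = [open(path).read() for path in monthly_report_paths]
```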

  7. A compilation of radionuclide transfer factors for the plant, meat, milk, and aquatic food pathways and the suggested default values for the RESRAD code

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y.Y.; Biwer, B.M.; Yu, C.

    1993-08-01

The ongoing development and revision of the RESRAD computer code at Argonne National Laboratory requires updating of the radionuclide transfer factors for the plant, meat, milk, and aquatic food pathways. Default values for these transfer factors used in published radiological assessment reports are compiled and compared with the values used in RESRAD. The differences among the reported default values used in different radiological assessment codes and reports are also discussed. In the data comparisons, values used in more recent reports are given more weight because more recent experimental work tends to be conducted under better-defined laboratory or field conditions. A new default value is suggested for RESRAD if one of the following conditions is met: (1) values used in recent reports are an order of magnitude higher or lower than the default value currently used in RESRAD, or (2) the same default value is used in several recent radiological assessment reports.
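
The two conditions quoted above amount to a simple decision rule. The following is an illustrative restatement with invented names; it is not code from the RESRAD project.

```python
def needs_new_default(current_default, recent_values, min_agree=3):
    # Condition (1): a recent value is an order of magnitude higher or
    # lower than the default currently used in RESRAD.
    off_by_10x = any(v >= 10 * current_default or 10 * v <= current_default
                     for v in recent_values)
    # Condition (2): the same default value is used in several recent
    # radiological assessment reports.
    most_common = max(set(recent_values), key=recent_values.count)
    consensus = recent_values.count(most_common) >= min_agree
    return off_by_10x or consensus

# e.g. needs_new_default(2.0e-3, [2.5e-2, 3.0e-2, 3.0e-2])  # True
```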

  8. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.

  9. Ames-2016 line lists for 13 isotopologues of CO2: Updates, consistency, and remaining issues

    Science.gov (United States)

    Huang (黄新川), Xinchuan; Schwenke, David W.; Freedman, Richard S.; Lee, Timothy J.

    2017-12-01

A new 626-based Ames-2 PES refinement and Ames-2016 line lists for 13 CO2 isotopologues are reported. A consistent σRMS = ±0.02 cm-1 is established for hundreds of isotopologue band origins using the Ames-2 PES. Ames-2016 line lists are computed at 296 K, 1000 K and 4000 K using the Ames-2 PES and the same DMS-N2 dipole surface used previously, with J up to 150, E′ up to 24,000 cm-1 or 18,000 cm-1 and appropriate intensity cutoffs. The lists are compared to the CDSD-296, CDSD-4000 databases, UCL line lists, and a few recent highly accurate CO2 intensity measurements. Both agreements and discrepancies are discussed. Compared to the old Ames CO2 lists, the Ames-2016 line lists have line position deviations reduced by 50% or more, which consequently leads to more reliable intensities. The line shape parameters in the Ames-2016 line lists are predicted using the newly assigned conventional vibrational polyad quantum numbers for rovibrational levels below 12,000 cm-1, so the quality of the line shape parameters is similar to that of CDSD or HITRAN. This study further proves that a semi-empirically refined PES (Ames-1 and Ames-2) coupled with a high quality ab initio DMS (DMS-N2 and UCL) may generate IR predictions with consistent accuracy and is thus helpful in the analysis of laboratory spectra and simulations of various isotopologues. The Ames-2016 lists based on DMS-N2 have reached the ∼1% intensity prediction accuracy level for the recent 626 30013-00001 and 20013-00001 bands, but further quantification and improvements require sub-percent or sub-half-percent accurate experimental intensities. The inter-isotopologue consistency of the intensity prediction accuracies should have reached better than 1-3% for regular bands not affected by resonances. Since the Effective Dipole Models (EDM) in CDSD and HITRAN have 1-20% or even larger uncertainties, we show that the Ames lists can provide better alternative IR data for many hard-to-determine isotopologue bands

  10. Update on ocular toxicity of ethambutol

    Directory of Open Access Journals (Sweden)

    Priscilla Makunyane

    2016-08-01

The purpose of this review is to update clinicians on the available literature on the ocular toxicity of ethambutol and on the type of eye care to be provided to patients treated with this medication. Ethambutol is a commonly used first-line anti-tuberculosis drug. Since its first use in the 1960s, its ocular toxicity has been described as related to dose and duration, and as reversible on discontinuation of therapy. However, the reversibility of the toxic optic neuropathy remains controversial. The mechanism of ocular toxicity owing to ethambutol is still under investigation. Other than discontinuing the drug, no specific treatment is available for the optic neuropathy caused by ethambutol. Doctors prescribing ethambutol should be aware of the ocular toxicity, and the drug should be used with proper patient education and ophthalmic monitoring.

  11. Photospheric Ca and Mg line-strength variations in G29-38

    International Nuclear Information System (INIS)

    Hippel, Ted von; Thompson, Susan E; Reach, W T; Mullally, F; Kilic, Mukremin; Nitta, Atsuko

    2009-01-01

    Temporal variations in metal-line strengths in H-atmosphere white dwarfs hold the potential to test the timescales of gravitational settling theory. These short timescales, in turn, require that DAZs are currently accreting. Such temporal variations would also indicate that accretion from a circumstellar dust disk can be episodic. We are compiling increasing evidence for time-variable Ca and Mg line-strength variations in the best studied DAZ, G29-38. Our evidence to date supports the gravitational settling timescales of Koester and Wilken (2006) and episodic accretion from G29-38's debris disk. Furthermore, we have detected evidence for time-variable accretion with a timescale = 24 hours, and typical variability of ∼4% during the 100 days of our autumn 2007 monitoring campaign.

  12. Photospheric Ca and Mg line-strength variations in G29-38

    Energy Technology Data Exchange (ETDEWEB)

    Hippel, Ted von [Physics Department, Siena College, Loudonville, NY (United States); Thompson, Susan E [Department of Physics and Astronomy, University of Delaware, Newark, DE (United States); Reach, W T [IPAC, California Institute of Technology, Pasadena, CA (United States); Mullally, F [Department of Astronomy, Princeton University, Princeton, NJ (United States); Kilic, Mukremin [Center for Astrophysics, Harvard University, Cambridge, MA (United States); Nitta, Atsuko, E-mail: tvonhippel@siena.ed, E-mail: sthomp@physics.udel.ed, E-mail: reach@ipac.caltech.ed, E-mail: fergal@astro.princeton.ed, E-mail: kilic@astronomy.ohio-state.ed, E-mail: anitta@gemini.ed [Gemini Observatory, Hilo, HI (United States)

    2009-06-01

Temporal variations in metal-line strengths in H-atmosphere white dwarfs hold the potential to test the timescales of gravitational settling theory. These short timescales, in turn, require that DAZs are currently accreting. Such temporal variations would also indicate that accretion from a circumstellar dust disk can be episodic. We are compiling increasing evidence for time-variable Ca and Mg line-strength variations in the best studied DAZ, G29-38. Our evidence to date supports the gravitational settling timescales of Koester and Wilken (2006) and episodic accretion from G29-38's debris disk. Furthermore, we have detected evidence for time-variable accretion with a timescale = 24 hours, and typical variability of ~4% during the 100 days of our autumn 2007 monitoring campaign.

  13. Compilation of accident statistics in PSE

    International Nuclear Information System (INIS)

    Jobst, C.

    1983-04-01

    The objective of the investigations on transportation carried out within the framework of the 'Project - Studies on Safety in Waste Management (PSE II)' is the determination of the risk of accidents in the transportation of radioactive materials by rail. The fault tree analysis is used for the determination of risks in the transportation system. This method offers a possibility for the determination of frequency and consequences of accidents which could lead to an unintended release of radionuclides. The study presented compiles all data obtained from the accident statistics of the Federal German Railways. (orig./RB) [de

  14. abc: An extensible AspectJ compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its frontend is built, using the Polyglot framework, as a modular extension of the Java...... language. The use of Polyglot gives flexibility of syntax and type checking. The backend is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general overview...

  15. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    account for the multilingual concept literacy glossaries being compiled under the auspices of .... a theory, i.e. the set of premises, arguments and conclusions required for explaining ... fully address cognitive and communicative needs, especially of laypersons. ..... tion at UCT, and in indigenous languages as auxiliary media.

  16. Individual risk. A compilation of recent British data

    International Nuclear Information System (INIS)

    Grist, D.R.

    1978-08-01

    A compilation of data is presented on individual risk obtained from recent British population and mortality statistics. Risk data presented include: risk of death, as a function of age, due to several important natural causes and due to accidents and violence; risk of death as a function of location of accident; and risk of death from various accidental causes. (author)

  17. Shear-wave velocity compilation for Northridge strong-motion recording sites

    Science.gov (United States)

    Borcherdt, Roger D.; Fumal, Thomas E.

    2002-01-01

    Borehole and other geotechnical information collected at the strong-motion recording sites of the Northridge earthquake of January 17, 1994 provide an important new basis for the characterization of local site conditions. These geotechnical data, when combined with analysis of strong-motion recordings, provide an empirical basis to evaluate site coefficients used in current versions of US building codes. Shear-wave-velocity estimates to a depth of 30 meters are derived for 176 strong-motion recording sites. The estimates are based on borehole shear-velocity logs, physical property logs, correlations with physical properties and digital geologic maps. Surface-wave velocity measurements and standard penetration data are compiled as additional constraints. These data as compiled from a variety of databases are presented via GIS maps and corresponding tables to facilitate use by other investigators.
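
The 30-meter depth in these estimates corresponds to the standard time-averaged quantity Vs30 = 30 / Σ(d_i / v_i), which underlies building-code site classes. A minimal sketch of that computation follows; the layer handling is the usual convention, not necessarily the exact procedure applied to these 176 sites.

```python
def vs30(thicknesses_m, velocities_mps):
    # Time-averaged shear-wave velocity over the top 30 m: total depth
    # divided by the vertical travel time, truncating the profile at 30 m.
    remaining = 30.0
    travel_time = 0.0
    for d, v in zip(thicknesses_m, velocities_mps):
        d = min(d, remaining)
        travel_time += d / v
        remaining -= d
        if remaining <= 0.0:
            break
    if remaining > 0.0:
        raise ValueError("velocity profile shallower than 30 m")
    return 30.0 / travel_time

print(vs30([5, 10, 20], [200, 350, 600]))  # hypothetical profile, ~382 m/s
```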

  18. Methodology and procedures for compilation of historical earthquake data

    International Nuclear Information System (INIS)

    1987-10-01

This report was prepared subsequent to the recommendations of the project initiation meeting in Vienna, November 25-29, 1985, under the IAEA Interregional project INT/9/066 Seismic Data for Nuclear Power Plant Siting. The aim of the project is to co-ordinate national efforts of Member States in the Mediterranean region in the compilation and processing of historical earthquake data for use in the siting of nuclear facilities. The main objective of the document is to assist the participating Member States, especially those who are initiating an NPP siting programme, in their effort to compile and process historical earthquake data and to provide a uniform interregional framework for this task. Although the document is directed mainly to the Mediterranean countries using illustrative examples from this region, the basic procedures and methods herein described may be applicable to other parts of the world such as Southeast Asia, the Himalayan belt, Latin America, etc. 101 refs, 7 figs

  19. Digital compilation bedrock geologic map of the Milton quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-8A Dorsey, R, Doolan, B, Agnew, PC, Carter, CM, Rosencrantz, EJ, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the Milton...

  20. The RHNumtS compilation: Features and bioinformatics approaches to locate and quantify Human NumtS

    Directory of Open Access Journals (Sweden)

    Saccone Cecilia

    2008-06-01

Background: To a greater or lesser extent, eukaryotic nuclear genomes contain fragments of their mitochondrial genome counterpart, deriving from the random insertion of damaged mtDNA fragments. NumtS (Nuclear mt Sequences) are not equally abundant in all species, and are redundant and polymorphic in terms of copy number. In population and clinical genetics, it is important to have a complete overview of NumtS quantity and location. Searching PubMed for NumtS or mitochondrial pseudo-genes yields hundreds of papers reporting Human NumtS compilations produced by in silico or wet-lab approaches. A comparison of published compilations clearly shows significant discrepancies among data, due both to unwise application of bioinformatics methods and to a nuclear genome that is not yet correctly assembled. To optimize quantification and location of NumtS, we produced a consensus compilation of Human NumtS by applying various bioinformatics approaches. Results: Location and quantification of NumtS may be achieved by applying database similarity searching methods: we have applied various methods such as Blastn, MegaBlast and BLAT, changing both parameters and database; the results were compared, further analysed and checked against the already published compilations, thus producing the Reference Human Numt Sequences (RHNumtS) compilation. The resulting NumtS total 190. Conclusion: The RHNumtS compilation represents a highly reliable reference basis, which may allow designing a lab protocol to test the actual existence of each NumtS. Here we report preliminary results based on PCR amplification and sequencing on 41 NumtS selected from RHNumtS among those with lower score. In parallel, we are currently designing the RHNumtS database structure for implementation in the HmtDB resource. In the future, the same database will host NumtS compilations from other organisms, but these will be generated only when the nuclear genome of a specific organism has reached a high

  1. Development of JAEA sorption database (JAEA-SDB). Update of sorption/QA data in FY2015

    International Nuclear Information System (INIS)

    Tachi, Yukio; Suyama, Tadahiro

    2016-03-01

Sorption and diffusion of radionuclides in buffer materials (bentonite) and rocks are the key processes in the safe geological disposal of radioactive waste, because migration of radionuclides in these barrier materials is expected to be diffusion-controlled and retarded by sorption processes. It is therefore necessary to understand the sorption and diffusion processes and to develop databases compiling reliable data and mechanistic/predictive models, so that reliable parameters can be set under the variety of geochemical conditions relevant to performance assessment (PA). For this purpose, the Japan Atomic Energy Agency (JAEA) has developed databases of sorption and diffusion parameters in bentonites and rocks. These sorption and diffusion databases (SDB/DDB) were first developed as an important basis for the H12 PA of high-level radioactive waste disposal, and have been provided through the Web. JAEA has continued to improve and update the SDB/DDB in view of potential future data needs, focusing on assuring the desired quality level and testing the usefulness of the databases for possible applications to PA-related parameter setting. The present report focuses on improving and updating the sorption database (JAEA-SDB) as a basis of an integrated approach for PA-related Kd setting and mechanistic sorption model development. This includes an overview of the database structure, contents and functions, including an additional data evaluation function focusing on statistical data evaluation and grouping of data related to potential perturbations. Kd data and their QA results are updated, reflecting our recent activities on Kd setting and mechanistic model development. As a result, 11,206 Kd data from 83 references were added, and the total number of Kd values in the JAEA-SDB reached about 58,000. The QA/classified Kd data now account for about 60% of all Kd data in the JAEA-SDB. The updated JAEA-SDB is expected to make it possible to obtain a quick overview of the available data, and to
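
The statistical evaluation and grouping such a function implies can be sketched as follows. Because Kd values typically span orders of magnitude, geometric statistics are used in this sketch; the grouping key and the choice of geometric means are assumptions for illustration, not the documented JAEA-SDB procedure.

```python
import math
from collections import defaultdict

def kd_statistics(records):
    # records: iterable of (group_key, kd) pairs, e.g. grouped by
    # nuclide, solid phase, and porewater condition (invented key).
    groups = defaultdict(list)
    for key, kd in records:
        groups[key].append(math.log10(kd))
    stats = {}
    for key, logs in groups.items():
        n = len(logs)
        mean = sum(logs) / n
        var = sum((x - mean) ** 2 for x in logs) / n
        stats[key] = {"n": n,
                      "geometric_mean": 10.0 ** mean,
                      "geometric_std": 10.0 ** math.sqrt(var)}
    return stats

# e.g. kd_statistics([("Cs/bentonite/fresh", 0.12),
#                     ("Cs/bentonite/fresh", 0.35)])
```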

  2. Updating Recursive XML Views of Relations

    DEFF Research Database (Denmark)

    Choi, Byron; Cong, Gao; Fan, Wenfei

    2009-01-01

This paper investigates the view update problem for XML views published from relational data. We consider XML views defined in terms of mappings directed by possibly recursive DTDs compressed into DAGs and stored in relations. We provide new techniques to efficiently support XML view updates...... specified in terms of XPath expressions with recursion and complex filters. The interaction between XPath recursion and DAG compression of XML views makes the analysis of the XML view update problem rather intriguing. Furthermore, many issues are still open even for relational view updates, and need...... to be explored. In response to these, on the XML side, we revise the notion of side effects and update semantics based on the semantics of XML views, and present efficient algorithms to translate XML updates to relational view updates. On the relational side, we propose a mild condition on SPJ views, and show...

  3. Dissociating Working Memory Updating and Automatic Updating: The Reference-Back Paradigm

    Science.gov (United States)

    Rac-Lubashevsky, Rachel; Kessler, Yoav

    2016-01-01

    Working memory (WM) updating is a controlled process through which relevant information in the environment is selected to enter the gate to WM and substitute its contents. We suggest that there is also an automatic form of updating, which influences performance in many tasks and is primarily manifested in reaction time sequential effects. The goal…

  4. Digital compilation bedrock geologic map of the Lincoln quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-5A Stanley, R, DelloRusso, V, Haydock, S, Lapp, E, O'Loughlin, S, Prewitt, J,and Tauvers, PR, 1995, Digital compilation bedrock geologic map...

  5. Data compilation of angular distributions of sputtered atoms

    International Nuclear Information System (INIS)

    Yamamura, Yasunori; Takiguchi, Takashi; Tawara, Hiro.

    1990-01-01

Sputtering of a surface is generally caused by the collision cascade developed near the surface. The process is in principle the same as that causing radiation damage in the bulk of solids. Sputtering has long been regarded as an undesirable dirty effect which destroys the cathodes and grids in gas discharge tubes or ion sources and contaminates the plasma and the surrounding walls. However, sputtering is used today for many applications, such as sputter ion sources, mass spectrometers and the deposition of thin films. Plasma contamination and the surface erosion of first walls due to sputtering are still major problems in fusion research. The angular distribution of the particles sputtered from solid surfaces can provide detailed information on the collision cascade in the interior of targets. This report presents a compilation of data on the angular distributions of atoms sputtered from monatomic solids at normal and oblique incidence, for various combinations of incident ions and target atoms. (K.I.)

  6. Hydrogeological structure model of the Olkiluoto Site. Update in 2010

    International Nuclear Information System (INIS)

    Vaittinen, T.; Ahokas, H.; Nummela, J.; Paulamaeki, S.

    2011-09-01

As part of the programme for the final disposal of spent nuclear fuel, a hydrogeological structure model containing the hydraulically significant zones on Olkiluoto Island has been compiled. The structure model describes the deterministic site scale zones that dominate the groundwater flow. The main objective of the study is to provide the geometry and the hydrogeological properties related to the groundwater flow for the zones and the sparsely fractured bedrock, to be used in the numerical modelling of groundwater flow and geochemical transport and thereby in the safety assessment. Also, these zones should be taken into account in the repository layout and in the construction of the disposal facility, and they have a long-term impact on the evolution of the site and the safety of the disposal repository. The previous hydrogeological model was compiled in 2008, and this updated version is based on data available at the end of May 2010. The updating was based on new hydrogeological observations and a systematic approach covering all drillholes to assess measured fracture transmissivities typical of the site-scale hydrogeological zones. New data consisted of head observations and interpreted pressure and flow responses caused by field activities. Essential background data for the modelling included the ductile deformation model and the site scale brittle deformation zones modelled in the geological model version 2.0. The GSM combines both geological and geophysical investigation data on the site. As a result of the modelling campaign, hydrogeological zones HZ001, HZ008, HZ19A, HZ19B, HZ19C, HZ20A, HZ20B, HZ21, HZ21B, HZ039, HZ099, OL-BFZ100, and HZ146 were included in the structure model. Compared with the previous model, zone HZ004 was replaced with zone HZ146 and zone HZ039 was introduced for the first time. Alternative zone HZ21B was included in the basic model. For the modelled zones, both the zone intersections, describing the fractures with dominating groundwater

  7. A new compiler for the GANIL Data Acquisition description

    International Nuclear Information System (INIS)

    Saillant, F.; Raine, B.

    1997-01-01

An important feature of the GANIL Data Acquisition System is the description of the experiments by means of a language developed at GANIL. The philosophy is to attribute to each element (parameters, spectra, etc.) an operational name which will be used at any level of the system. This language references a library of modules to free the user from the technical details of the hardware. This compiler has recently been entirely re-developed using technologies such as an object-oriented language (C++) and an object-oriented software development method and tool. This enables us to provide new functionality or to support a new electronic module within a very short time and without any deep modification of the application. A new Dynamic Library of Modules has also been developed. Its complete description is available on the GANIL WEB site http://ganinfo.in2p3.fr/acquisition/homepage.html. This new compiler brings a lot of new functionalities, among which the most important is the notion of 'register', whatever the module standard is. All the registers described in the module provider's documentation can now be accessed by their names. Another important new feature is the notion of 'function' that can be executed on a module. Also, a set of new instructions has been implemented to execute commands on CAMAC crates. Another possibility of this new compiler is to enable the description of specific interfaces with the GANIL Data Acquisition System. This has been used to describe the coupling of the CHIMERA Data Acquisition System with the INDRA one through a shared memory in the VME crate. (authors)

  8. A quantum CISC compiler and scalable assembler for quantum computing on large systems

    Energy Technology Data Exchange (ETDEWEB)

    Schulte-Herbrueggen, Thomas; Spoerl, Andreas; Glaser, Steffen [Dept. Chemistry, Technical University of Munich (TUM), 85747 Garching (Germany)

    2008-07-01

Using the cutting-edge high-speed parallel cluster HLRB-II (with a total LINPACK performance of 63.3 TFlops/s) we present a quantum CISC compiler into time-optimised or decoherence-protected complex instruction sets. They comprise effective multi-qubit interactions with up to 10 qubits. We show how to assemble these medium-sized CISC modules in a scalable way for quantum computation on large systems. Extending the toolbox of universal gates by optimised complex multi-qubit instruction sets paves the way to fight decoherence in realistic Markovian and non-Markovian settings. The advantage of quantum CISC compilation over standard RISC compilation into one- and two-qubit universal gates is demonstrated inter alia for the quantum Fourier transform (QFT) and for multiply-controlled NOT gates. The speed-up is up to a factor of six, thus giving significantly better performance under decoherence. Implications for upper limits on time complexities are also derived.
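
To see why grouping into modules of up to 10 qubits shortens instruction sequences, consider the textbook gate count for the QFT. The sketch below is a counting toy under the standard decomposition (n Hadamards plus n(n-1)/2 controlled-phase rotations, final swaps ignored); it is not the authors' compilation scheme.

```python
def qft_risc_gate_count(n):
    # Standard decomposition of an n-qubit QFT into universal one- and
    # two-qubit gates (qubit-reversal swaps ignored).
    return n + n * (n - 1) // 2

# A 10-qubit QFT block costing 10 + 45 = 55 elementary gates can be
# issued as a single optimised CISC instruction; fewer, shorter
# instructions mean less exposure to decoherence.
for n in (2, 5, 10):
    print(f"QFT on {n} qubits: {qft_risc_gate_count(n)} RISC gates vs 1 CISC module")
```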

  9. Regulatory and technical reports (abstract index journal): Annual compilation for 1994. Volume 19, Number 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order. These precede the following indexes: secondary report number index, personal author index, subject index, NRC originating organization index (staff reports), NRC originating organization index (international agreements), NRC contract sponsor index (contractor reports), contractor index, international organization index, and licensed facility index. A detailed explanation of the entries precedes each index.

  10. Digital compilation bedrock geologic map of the Warren quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-4A Walsh, GJ, Haydock, S, Prewitt, J, Kraus, J, Lapp, E, O'Loughlin, S, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the...

  11. Fiscal 1998 research report on super compiler technology; 1998 nendo super konpaira technology no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

For next-generation supercomputing systems, research was conducted on parallel and distributed compiler technology for enhancing effective performance, and on related software and architectures for enhancing performance in coordination with compilers. As for parallel compiler technology, research on scalable automated parallel compiler technology, parallel tuning tools, and an operating system that uses multi-processor resources effectively is pointed out to be important among the concrete technical development issues. In addition, by extending these research results to the architecture technology of single-chip multi-processors, the possibility of development and expansion of the PC, WS and HPC (high-performance computer) markets, and of the creation of new industries, is pointed out. Although wide-area distributed computing is attracting attention as a next-generation computing industry, the concrete industrial fields for such computing are not yet clear, and research remains at an exploratory stage. (NEDO)

  12. A novel method to alleviate flash-line defects in coining process

    KAUST Repository

    Xu, Jiangping

    2013-04-01

    We employ a finite element framework based on a dynamic explicit algorithm to predict the flash-line defects in the coining process. The distribution of the flash-line is obtained by building a radial friction work model at the element level. The elasto-plastic behavior of porous materials undergoing large deformations is considered where the constitutive level updates are the result of a local variational minimization problem. We study the material flow at different strokes of the die across the entire coining process and observe that the change in the flow direction of the material in the rim region may contribute to the flash lines. Our proposed framework shows that a part of the rim region in which the flash-line defects appear is consistent with the reported experimental results. We also propose a novel method of redesigning the rim geometry of the workpiece to alleviate the flash-line defects which also shows good agreement with experiments. © 2012 Elsevier Inc. All rights reserved.

  13. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    In order to support concept literacy, especially for students for whom English is not the native language, a number of universities in South Africa are compiling multilingual glossaries through which the use of languages other than English may be employed as auxiliary media. Terminologies in languages other than English ...

  14. Massachusetts shoreline change project: a GIS compilation of vector shorelines and associated shoreline change data for the 2013 update

    Science.gov (United States)

    Smith, Theresa L.; Himmelstoss, Emily A.; Thieler, E. Robert

    2013-01-01

Identifying the rates and trends associated with the position of the shoreline through time presents vital information on the potential impacts these changes may have on coastal populations and infrastructure, and supports informed coastal management decisions. This report publishes the historical shoreline data used to assess the scale and timing of erosion and accretion along the Massachusetts coast from New Hampshire to Rhode Island, including all of Cape Cod, Martha's Vineyard, Nantucket and the Elizabeth Islands. These data are an update to the Massachusetts Office of Coastal Zone Management Shoreline Change Project. Shoreline positions from the past 164 years (1845 to 2009) were used to compute the shoreline change rates. These data include a combined length of 1,804 kilometers of new shoreline data derived from color orthophoto imagery collected in 2008 and 2009, and topographic lidar collected in 2007. These new shorelines have been added to previously published historic shoreline data from the Massachusetts Office of Coastal Zone Management and the U.S. Geological Survey. A detailed report containing a discussion of the shoreline change data presented here and a summary of the resulting rates is available and cited at the end of the Introduction section of this report.
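
Rates of this kind are typically reported per shore-perpendicular transect, most commonly as an end-point rate and a linear-regression rate (the statistics used by tools such as the USGS Digital Shoreline Analysis System). The sketch below shows both computations on one hypothetical transect; the numbers are invented.

```python
def end_point_rate(years, positions_m):
    # EPR: net shoreline movement between the oldest and most recent
    # shorelines, divided by the elapsed time (m/yr).
    return (positions_m[-1] - positions_m[0]) / (years[-1] - years[0])

def linear_regression_rate(years, positions_m):
    # LRR: slope of an ordinary least-squares fit of shoreline position
    # against time, using every surveyed shoreline (m/yr).
    n = len(years)
    my = sum(years) / n
    mp = sum(positions_m) / n
    cov = sum((y - my) * (p - mp) for y, p in zip(years, positions_m))
    var = sum((y - my) ** 2 for y in years)
    return cov / var

# Positions along one transect, seaward positive (hypothetical values):
years = [1845, 1934, 1978, 2009]
positions = [0.0, -22.0, -35.0, -48.0]
print(end_point_rate(years, positions))         # about -0.29 m/yr
print(linear_regression_rate(years, positions))
```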

  15. Human rights principles in developing and updating policies and laws on mental health

    OpenAIRE

    Schulze, M.

    2016-01-01

The World Health Organization's Mental Health Action Plan 2013-2020 stipulates human rights as a cross-cutting principle (WHO, 2013) and foresees global targets to update policies as well as mental health laws in line with international and regional human rights instruments. The international human rights agreements repeatedly refer to health, including mental health. The most pertinent provisions related to mental health are enshrined in the 2006 Convention on the Rights of Persons with Disa...

  16. Evaluation of angular distributions and production cross-sections for discrete gamma lines in iron

    International Nuclear Information System (INIS)

    Savin, M.V.; Livke, A.V.; Zvenigorodskij, A.G.

    2001-01-01

The experimental data were compiled, and the angular distributions and production cross-sections for the Eγ = 846.8, 1238.3 and 1810.8 keV discrete gamma lines were evaluated. The Legendre polynomial coefficients describing the angular distributions in the energy range up to En = 14.0 MeV, and cross-section values in the En = 0.85-19.0 MeV range, were evaluated. (author)
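
Given evaluated coefficients, the angular distribution is reconstructed as a Legendre series in cos θ, dσ/dΩ(θ) = Σ_l a_l P_l(cos θ). The sketch below evaluates such a series; the coefficient values are hypothetical placeholders, not the evaluated coefficients from the report.

```python
import numpy as np
from numpy.polynomial import legendre

def angular_distribution(a_coeffs, theta_deg):
    # dσ/dΩ(θ) = Σ_l a_l P_l(cos θ); legval evaluates the Legendre
    # series with coefficients ordered from l = 0 upward.
    x = np.cos(np.radians(theta_deg))
    return legendre.legval(x, a_coeffs)

theta = np.linspace(0.0, 180.0, 7)
a = [1.0, 0.0, 0.35, 0.0, -0.08]  # a_0..a_4, hypothetical values
print(angular_distribution(a, theta))
```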

  17. Updating systematic reviews: an international survey.

    Directory of Open Access Journals (Sweden)

    Chantelle Garritty

    Full Text Available BACKGROUND: Systematic reviews (SRs) should be up to date to maintain their importance in informing healthcare policy and practice. However, little guidance is available about when and how to update SRs. Moreover, the updating policies and practices of organizations that commission or produce SRs are unclear. METHODOLOGY/PRINCIPAL FINDINGS: The objective was to describe the updating practices and policies of agencies that sponsor or conduct SRs. An Internet-based survey was administered to a purposive non-random sample of 195 healthcare organizations within the international SR community. Survey results were analyzed using descriptive statistics. The completed response rate was 58% (n = 114) from across 26 countries, with 70% (75/107) of participants identified as producers of SRs. Among responders, 79% (84/107) characterized the importance of updating as high or very high and 57% (60/106) of organizations reported having a formal policy for updating. However, only 29% (35/106) of organizations made reference to a written policy document. Several groups (62/105; 59%) reported updating practices as irregular, and over half (53/103) of organizational respondents estimated that more than 50% of their respective SRs were likely out of date. Authors of the original SR (42/106; 40%) were most often deemed responsible for ensuring SRs were current. Barriers to updating included resource constraints, reviewer motivation, lack of academic credit, and limited publishing formats. Most respondents (70/100; 70%) indicated that they supported centralization of updating efforts across institutions or agencies. Furthermore, 84% (83/99) of respondents indicated they favoured the development of a central registry of SRs, analogous to efforts within the clinical trials community. CONCLUSIONS/SIGNIFICANCE: Most organizations that sponsor and/or carry out SRs consider updating important. Despite this recognition, updating practices are not regular, and many organizations lack

  18. Thoughts and views on the compilation of monolingual dictionaries ...

    African Journals Online (AJOL)

    The end-products should be of a high lexicographic standard, well-balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries could be easily ...

  19. Infectious Diseases in Older Adults of Long-Term Care Facilities: Update on Approach to Diagnosis and Management.

    Science.gov (United States)

    Jump, Robin L P; Crnich, Christopher J; Mody, Lona; Bradley, Suzanne F; Nicolle, Lindsay E; Yoshikawa, Thomas T

    2018-04-01

    The diagnosis, treatment, and prevention of infectious diseases in older adults in long-term care facilities (LTCFs), particularly nursing facilities, remains a challenge for all health providers who care for this population. This review provides updated information on the currently most important challenges of infectious diseases in LTCFs. With the increasing prescribing of antibiotics in older adults, particularly in LTCFs, the topic of antibiotic stewardship is presented in this review. Following this discussion, salient points on clinical relevance, clinical presentation, diagnostic approach, therapy, and prevention are discussed for skin and soft tissue infections, infectious diarrhea (Clostridium difficile and norovirus infections), bacterial pneumonia, and urinary tract infection, as well as some of the newer approaches to preventive interventions in the LTCF setting. © 2018, Copyright the Authors Journal compilation © 2018, The American Geriatrics Society.

  20. SCORPION II persistent surveillance system update

    Science.gov (United States)

    Coster, Michael; Chambers, Jon

    2010-04-01

    This paper updates the improvements and benefits demonstrated in the next generation Northrop Grumman SCORPION II family of persistent surveillance and target recognition systems produced by the Xetron Campus in Cincinnati, Ohio. SCORPION II reduces the size, weight, and cost of all SCORPION components in a flexible, field programmable system that is easier to conceal and enables integration of over fifty different Unattended Ground Sensor (UGS) and camera types from a variety of manufacturers, with a modular approach to supporting multiple Line of Sight (LOS) and Beyond Line of Sight (BLOS) communications interfaces. Since 1998 Northrop Grumman has been integrating best in class sensors with its proven universal modular Gateway to provide encrypted data exfiltration to Common Operational Picture (COP) systems and remote sensor command and control. In addition to feeding COP systems, SCORPION and SCORPION II data can be directly processed using a common sensor status graphical user interface (GUI) that allows for viewing and analysis of images and sensor data from up to seven hundred SCORPION system gateways on single or multiple displays. This GUI enables a large amount of sensor data and imagery to be used for actionable intelligence as well as remote sensor command and control by a minimum number of analysts.

  1. 12 CFR 503.2 - Exemptions of records containing investigatory material compiled for law enforcement purposes.

    Science.gov (United States)

    2010-01-01

    12 CFR 503.2 (Banks and Banking, 2010): Exemptions of records containing investigatory material compiled for law enforcement purposes. Office of Thrift Supervision, Department of the Treasury, Privacy Act, § 503.2.

  2. Compilation of historical information of 300 Area facilities and activities

    International Nuclear Information System (INIS)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information on 300 Area facilities and activities since their beginning. The 300 Area is shown as it looked in 1945, and a more recent (1985) view of the 300 Area is also provided.

  3. Compilation of historical information of 300 Area facilities and activities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information on 300 Area facilities and activities since their beginning. The 300 Area is shown as it looked in 1945, and a more recent (1985) view of the 300 Area is also provided.

  4. Quantifying Update Effects in Citizen-Oriented Software

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2009-02-01

    Full Text Available Defining citizen-oriented software. Detailing technical issues regarding the update process in this kind of software. Presenting different effects triggered by types of update. Building a model for update cost estimation, including producer-side and consumer-side effects; a toy sketch of such an aggregation follows below. Analyzing model applicability on INVMAT – large-scale matrix inversion software. Proposing a model for update effects estimation. Specifying ways for softening the effects of inaccurate updates.
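
    The sketch below is a toy illustration of the kind of cost aggregation such a model performs; it is not the authors' model, and every parameter name and value is hypothetical.

    ```python
    # Toy sketch (not the paper's model): total cost of a software update as
    # producer-side cost plus consumer-side cost summed over the user base.
    def total_update_cost(dev_hours, hourly_rate, n_users,
                          download_cost, retraining_cost,
                          failure_rate, failure_cost):
        producer = dev_hours * hourly_rate
        consumer = n_users * (download_cost + retraining_cost
                              + failure_rate * failure_cost)
        return producer + consumer

    # All values hypothetical, for illustration only.
    print(total_update_cost(dev_hours=120, hourly_rate=40.0, n_users=5000,
                            download_cost=0.1, retraining_cost=2.0,
                            failure_rate=0.02, failure_cost=50.0))
    ```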

  5. Are Forecast Updates Progressive?

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); Ph.H.B.F. Franses (Philip Hans); M.J. McAleer (Michael)

    2010-01-01

    textabstractMacro-economic forecasts typically involve both a model component, which is replicable, as well as intuition, which is non-replicable. Intuition is expert knowledge possessed by a forecaster. If forecast updates are progressive, forecast updates should become more accurate, on average,

  6. Data compilation for radiation effects on ceramic insulators

    International Nuclear Information System (INIS)

    Fukuya, Koji; Terasawa, Mititaka; Nakahigashi, Shigeo; Ozawa, Kunio.

    1986-08-01

    Data on radiation effects on ceramic insulators were compiled from the literature and summarized from the viewpoint of fast neutron irradiation effects. The data were classified according to property and type of ceramic. The properties are dimensional stability, mechanical properties, thermal properties, and electrical and dielectric properties. Data sheets were made for each table or graph in the literature. The characteristic features of the data base are briefly described. (author)

  7. Just-In-Time compilation of OCaml byte-code

    OpenAIRE

    Meurer, Benedikt

    2010-01-01

    This paper presents various improvements that were applied to OCamlJIT2, a Just-In-Time compiler for the OCaml byte-code virtual machine. OCamlJIT2 currently runs on various Unix-like systems with x86 or x86-64 processors. The improvements, including the new x86 port, are described in detail, and performance measures are given, including a direct comparison of OCamlJIT2 to OCamlJIT.

  8. Indexed compilation of experimental high energy physics literature. [Synopsis

    Energy Technology Data Exchange (ETDEWEB)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is included, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  9. Final report: Compiled MPI. Cost-Effective Exascale Application Development

    Energy Technology Data Exchange (ETDEWEB)

    Gropp, William Douglas [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-12-21

    This is the final report on Compiled MPI: Cost-Effective Exascale Application Development, and summarizes the results under this project. The project investigated runtime environments that improve the performance of MPI (Message-Passing Interface) programs; work at Illinois in the last period of this project looked at optimizing data accesses expressed with MPI datatypes.

  10. Development of JAEA sorption database (JAEA-SDB). Update of data evaluation functions and sorption/QA data

    International Nuclear Information System (INIS)

    Tachi, Yukio; Suyama, Tadahiro; Ochs, Michael; Ganter, Charlotte

    2011-03-01

    Sorption and diffusion of radionuclides in buffer materials (bentonite) and rocks are the key processes in the safe geological disposal of radioactive waste, because migration of radionuclides in this barrier is expected to be diffusion-controlled and retarded by sorption processes. It is therefore necessary to understand the sorption and diffusion processes and to develop databases compiling reliable data and mechanistic/predictive models, so that reliable parameters can be set under the variety of geochemical conditions relevant to performance assessment (PA). For this purpose, the Japan Atomic Energy Agency (JAEA) has developed databases of sorption and diffusion parameters in buffer materials (bentonite) and rocks. These sorption and diffusion databases (SDB/DDB) were first developed as an important basis for the H12 PA of high-level radioactive waste disposal, and have been made available through the Web. JAEA has continued to improve and update the SDB/DDB in view of potential future data needs, focusing on assuring the desired quality level and testing the usefulness of the databases for possible applications to PA-related parameter setting. The present report focuses on the development and updating of the sorption database (JAEA-SDB) as a basis of an integrated approach for PA-related Kd setting. This includes an overview of the database structure, contents and functions, including an additional data evaluation function focusing on multi-parameter dependence, the operating method, and PA-related applications of the web-based JAEA-SDB. Kd data and their QA results are updated by focusing our recent activities on Kd setting and mechanistic model development. As a result, 4,250 Kd data from 32 references are added, bringing the total number of Kd values in the JAEA-SDB to about 28,540. The QA/classified Kd data are about 39% of all Kd data in the JAEA-SDB. The updated JAEA-SDB is expected to make it possible to obtain a quick overview of the available data, and to have suitable access to the respective data
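
    To illustrate how a Kd value from such a database feeds into PA-related parameter setting, the sketch below computes the standard retardation factor R = 1 + (ρb/θ)·Kd for sorbing transport. The formula is textbook-standard; the parameter values are hypothetical placeholders, not JAEA-SDB entries.

    ```python
    # Illustrative use of a sorption distribution coefficient Kd: the standard
    # retardation factor R = 1 + (rho_b / theta) * Kd, where rho_b is the dry
    # bulk density and theta the porosity of the barrier material.
    def retardation_factor(kd_m3_per_kg, bulk_density_kg_m3, porosity):
        return 1.0 + (bulk_density_kg_m3 / porosity) * kd_m3_per_kg

    # Hypothetical compacted-bentonite-like values (placeholders only):
    print(retardation_factor(kd_m3_per_kg=0.01,
                             bulk_density_kg_m3=1600.0,
                             porosity=0.4))   # -> R = 41
    ```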

  11. Compilation of MCNP data library based on JENDL-3T and test through analysis of benchmark experiment

    International Nuclear Information System (INIS)

    Sakurai, K.; Sasamoto, N.; Kosako, K.; Ishikawa, T.; Sato, O.; Oyama, Y.; Narita, H.; Maekawa, H.; Ueki, K.

    1989-01-01

    Based on the evaluated nuclear data library JENDL-3T, a temporary version of JENDL-3, a pointwise neutron cross section library for the MCNP code was compiled, involving 39 nuclides from H-1 to Am-241 that are important for shielding calculations. Compilation was performed with a code system consisting of the nuclear data processing code NJOY-83 and the library compilation code MACROS. The validity of the code system and the reliability of the library are certified by analysing benchmark experiments. (author)

  12. The Cancer Cell Line Encyclopedia enables predictive modelling of anticancer drug sensitivity.

    Science.gov (United States)

    Barretina, Jordi; Caponigro, Giordano; Stransky, Nicolas; Venkatesan, Kavitha; Margolin, Adam A; Kim, Sungjoon; Wilson, Christopher J; Lehár, Joseph; Kryukov, Gregory V; Sonkin, Dmitriy; Reddy, Anupama; Liu, Manway; Murray, Lauren; Berger, Michael F; Monahan, John E; Morais, Paula; Meltzer, Jodi; Korejwa, Adam; Jané-Valbuena, Judit; Mapa, Felipa A; Thibault, Joseph; Bric-Furlong, Eva; Raman, Pichai; Shipway, Aaron; Engels, Ingo H; Cheng, Jill; Yu, Guoying K; Yu, Jianjun; Aspesi, Peter; de Silva, Melanie; Jagtap, Kalpana; Jones, Michael D; Wang, Li; Hatton, Charles; Palescandolo, Emanuele; Gupta, Supriya; Mahan, Scott; Sougnez, Carrie; Onofrio, Robert C; Liefeld, Ted; MacConaill, Laura; Winckler, Wendy; Reich, Michael; Li, Nanxin; Mesirov, Jill P; Gabriel, Stacey B; Getz, Gad; Ardlie, Kristin; Chan, Vivien; Myer, Vic E; Weber, Barbara L; Porter, Jeff; Warmuth, Markus; Finan, Peter; Harris, Jennifer L; Meyerson, Matthew; Golub, Todd R; Morrissey, Michael P; Sellers, William R; Schlegel, Robert; Garraway, Levi A

    2012-03-28

    The systematic translation of cancer genomic data into knowledge of tumour biology and therapeutic possibilities remains challenging. Such efforts should be greatly aided by robust preclinical model systems that reflect the genomic diversity of human cancers and for which detailed genetic and pharmacological annotation is available. Here we describe the Cancer Cell Line Encyclopedia (CCLE): a compilation of gene expression, chromosomal copy number and massively parallel sequencing data from 947 human cancer cell lines. When coupled with pharmacological profiles for 24 anticancer drugs across 479 of the cell lines, this collection allowed identification of genetic, lineage, and gene-expression-based predictors of drug sensitivity. In addition to known predictors, we found that plasma cell lineage correlated with sensitivity to IGF1 receptor inhibitors; AHR expression was associated with MEK inhibitor efficacy in NRAS-mutant lines; and SLFN11 expression predicted sensitivity to topoisomerase inhibitors. Together, our results indicate that large, annotated cell-line collections may help to enable preclinical stratification schemata for anticancer agents. The generation of genetic predictions of drug response in the preclinical setting and their incorporation into cancer clinical trial design could speed the emergence of 'personalized' therapeutic regimens.
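
    As an illustration of "gene-expression-based predictors of drug sensitivity", the sketch below fits a regularised linear model on synthetic data. It is a generic example of this class of approach, not the CCLE analysis pipeline, and all data in it are simulated.

    ```python
    # Generic sketch: regularised regression of a drug-response value on
    # gene-expression features, using synthetic data for illustration.
    import numpy as np
    from sklearn.linear_model import ElasticNetCV

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 500))          # 200 cell lines x 500 expression features
    beta = np.zeros(500)
    beta[:5] = 1.0                           # a few truly predictive genes
    y = X @ beta + rng.normal(scale=0.5, size=200)  # synthetic drug-response values

    model = ElasticNetCV(l1_ratio=0.5, cv=5).fit(X, y)
    top = np.argsort(np.abs(model.coef_))[::-1][:5]
    print("top predictive features:", top)   # should recover indices 0..4
    ```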

  13. The Cancer Cell Line Encyclopedia enables predictive modeling of anticancer drug sensitivity

    Science.gov (United States)

    Barretina, Jordi; Caponigro, Giordano; Stransky, Nicolas; Venkatesan, Kavitha; Margolin, Adam A.; Kim, Sungjoon; Wilson, Christopher J.; Lehár, Joseph; Kryukov, Gregory V.; Sonkin, Dmitriy; Reddy, Anupama; Liu, Manway; Murray, Lauren; Berger, Michael F.; Monahan, John E.; Morais, Paula; Meltzer, Jodi; Korejwa, Adam; Jané-Valbuena, Judit; Mapa, Felipa A.; Thibault, Joseph; Bric-Furlong, Eva; Raman, Pichai; Shipway, Aaron; Engels, Ingo H.; Cheng, Jill; Yu, Guoying K.; Yu, Jianjun; Aspesi, Peter; de Silva, Melanie; Jagtap, Kalpana; Jones, Michael D.; Wang, Li; Hatton, Charles; Palescandolo, Emanuele; Gupta, Supriya; Mahan, Scott; Sougnez, Carrie; Onofrio, Robert C.; Liefeld, Ted; MacConaill, Laura; Winckler, Wendy; Reich, Michael; Li, Nanxin; Mesirov, Jill P.; Gabriel, Stacey B.; Getz, Gad; Ardlie, Kristin; Chan, Vivien; Myer, Vic E.; Weber, Barbara L.; Porter, Jeff; Warmuth, Markus; Finan, Peter; Harris, Jennifer L.; Meyerson, Matthew; Golub, Todd R.; Morrissey, Michael P.; Sellers, William R.; Schlegel, Robert; Garraway, Levi A.

    2012-01-01

    The systematic translation of cancer genomic data into knowledge of tumor biology and therapeutic avenues remains challenging. Such efforts should be greatly aided by robust preclinical model systems that reflect the genomic diversity of human cancers and for which detailed genetic and pharmacologic annotation is available1. Here we describe the Cancer Cell Line Encyclopedia (CCLE): a compilation of gene expression, chromosomal copy number, and massively parallel sequencing data from 947 human cancer cell lines. When coupled with pharmacologic profiles for 24 anticancer drugs across 479 of the lines, this collection allowed identification of genetic, lineage, and gene expression-based predictors of drug sensitivity. In addition to known predictors, we found that plasma cell lineage correlated with sensitivity to IGF1 receptor inhibitors; AHR expression was associated with MEK inhibitor efficacy in NRAS-mutant lines; and SLFN11 expression predicted sensitivity to topoisomerase inhibitors. Altogether, our results suggest that large, annotated cell line collections may help to enable preclinical stratification schemata for anticancer agents. The generation of genetic predictions of drug response in the preclinical setting and their incorporation into cancer clinical trial design could speed the emergence of “personalized” therapeutic regimens2. PMID:22460905

  14. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1991-01-01

    Reliability data are essential in probabilistic safety assessment, with component reliability parameters being particularly important. Plant-specific component failure data would be most appropriate, but such data are rather limited. However, similar components are used in different designs. Generic data, that is, all data that are not specific to the plant being analyzed but which relate to components more generally, are therefore important. The International Atomic Energy Agency has compiled the Generic Component Reliability Data Base from data available in the open literature. It is part of the IAEA computer code package for fault/event tree analysis. The Data Base contains 1010 different records including most of the components used in probabilistic safety analyses of nuclear power plants. The data base input was quality controlled and data sources noted. The data compilation procedure and problems associated with using generic data are explained. (UK)

  15. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    Weston, L.W.; Larson, D.C.

    1993-02-01

    This compilation represents the current needs for nuclear data measurements and evaluations as expressed by interested fission and fusion reactor designers, medical users of nuclear data, nuclear data evaluators, CSEWG members and other interested parties. The requests and justifications are reviewed by the Data Request and Status Subcommittee of CSEWG as well as most of the general CSEWG membership. The basic format and computer programs for the Request List were produced by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory. The NNDC produced the Request List for many years. The Request List is compiled from a computerized data file. Each request has a unique isotope, reaction type, requestor and identifying number. The first two digits of the identifying number are the year in which the request was initiated. Every effort has been made to restrict the notations to those used in common nuclear physics textbooks. Most requests are for individual isotopes as are most ENDF evaluations, however, there are some requests for elemental measurements. Each request gives a priority rating which will be discussed in Section 2, the neutron energy range for which the request is made, the accuracy requested in terms of one standard deviation, and the requested energy resolution in terms of one standard deviation. Also given is the requestor with the comments which were furnished with the request. The addresses and telephone numbers of the requestors are given in Appendix 1. ENDF evaluators who may be contacted concerning evaluations are given in Appendix 2. Experimentalists contemplating making one of the requested measurements are encouraged to contact both the requestor and evaluator who may provide valuable information. This is a working document in that it will change with time. New requests or comments may be submitted to the editors or a regular CSEWG member at any time
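
    The record structure described above (isotope, reaction type, requestor, an identifying number whose first two digits encode the year, priority, energy range, accuracy and resolution) can be sketched as a simple data structure. The field names and example values below are ours, introduced only for illustration; they are not the compilation's actual format.

    ```python
    # Hypothetical sketch of one entry in such a request list; field names
    # are ours, but they mirror the attributes described in the abstract.
    from dataclasses import dataclass

    @dataclass
    class NuclearDataRequest:
        request_id: str        # first two digits encode the year initiated
        isotope: str           # target nucleus
        quantity: str          # reaction type
        requestor: str
        priority: int          # priority rating
        energy_min_mev: float  # incident-particle energy range
        energy_max_mev: float
        accuracy_pct: float    # requested accuracy, one standard deviation
        resolution_pct: float  # requested resolution, one standard deviation

        @property
        def year_initiated(self) -> int:
            return 1900 + int(self.request_id[:2])  # assumption: pre-2000 ids

    req = NuclearDataRequest("93001", "Fe-56", "(n,gamma)", "ORNL", 1,
                             0.85, 19.0, 5.0, 10.0)
    print(req.year_initiated, req.isotope, req.quantity)
    ```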

  16. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1983-01-01

    The purpose of this compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. It is the result of a biennial review in which the Department of Energy (DOE) and contractors, Department of Defense Laboratories and contractors, and other interested groups have been asked to review and revise their requests for nuclear data. It was felt that the evaluators of cross section data and the users of these evaluations should be involved in the review of the data requests to make this compilation more useful. This request list is ordered by target nucleus (Isotope) and then reaction type (Quantity). Each request is assigned a unique identifying number. The first two digits of this number give the year the request was initiated. All requests for a given Isotope and Quantity are grouped (or blocked) together. The requests in a block are followed by any status comments. Each request has a unique Isotope, Quantity and Requester. The requester is identified by laboratory, last name, and sponsoring US government agency, e.g., BET, DEI, DNR. All requesters, together with their addresses and phone numbers, are given in appendix B. A list of the evaluators responsible for ENDF/B-V evaluations with their affiliation appears in appendix C. All requests must give the energy (or range of energy) for the incident particle when appropriate. The accuracy needed in percent is also given. The error quoted is assumed to be 1-sigma at each measured point in the energy range requested unless a comment specifies otherwise. Sometimes a range of accuracy indicated by two values is given or some statement is given in the free text comments. An incident particle energy resolution in percent is sometimes given

  17. Updating of working memory: lingering bindings.

    Science.gov (United States)

    Oberauer, Klaus; Vockenberg, Kerstin

    2009-05-01

    Three experiments investigated proactive interference and proactive facilitation in a memory-updating paradigm. Participants remembered several letters or spatial patterns, distinguished by their spatial positions, and updated them by new stimuli up to 20 times per trial. Self-paced updating times were shorter when an item previously remembered and then replaced reappeared in the same location than when it reappeared in a different location. This effect demonstrates residual memory for no-longer-relevant bindings of items to locations. The effect increased with the number of items to be remembered. With one exception, updating times did not increase, and recall of final values did not decrease, over successive updating steps, thus providing little evidence for proactive interference building up cumulatively.

  18. Measuring Child Poverty in South Africa: Sensitivity to the Choice of Equivalence Scale and an Updated Profile

    Science.gov (United States)

    Streak, Judith Christine; Yu, Derek; Van der Berg, Servaas

    2009-01-01

    This paper offers evidence on the sensitivity of child poverty in South Africa to changes in the adult equivalence scale (AES) and updates the child poverty profile based on the Income and Expenditure Survey 2005/06. Setting the poverty line at the 40th percentile of households, calculated with different AESs, the scope and composition of child…

  19. Compilation of a global inventory of emissions of nitrous oxide

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

    A global inventory with 1°x1° resolution was compiled of emissions of nitrous oxide (N2O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing,

  20. How do we update faces? Effects of gaze direction and facial expressions on working memory updating

    Directory of Open Access Journals (Sweden)

    Caterina Artuso

    2012-09-01

    Full Text Available The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g., fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g., joy-direct gaze) were compared to low binding conditions (e.g., joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.

  1. How do we update faces? Effects of gaze direction and facial expressions on working memory updating.

    Science.gov (United States)

    Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola

    2012-01-01

    The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g., fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g., joy-direct gaze) were compared to low binding conditions (e.g., joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.

  2. 49 CFR 1002.3 - Updating user fees.

    Science.gov (United States)

    2010-10-01

    Updating fees. Each fee shall be updated by updating the cost components comprising the fee. Cost... direct labor costs are determined by the cost study set forth in Revision of Fees For... by total office costs for the Offices directly associated with user fee activity. Actual updating of...

  3. Architectural and compiler techniques for energy reduction in high-performance microprocessors

    Science.gov (United States)

    Bellas, Nikolaos

    1999-11-01

    The microprocessor industry has started viewing power, along with area and performance, as a decisive design factor in today's microprocessors. The increasing cost of packaging and cooling systems poses stringent requirements on the maximum allowable power dissipation. Most of the research in recent years has focused on the circuit, gate, and register-transfer (RT) levels of the design. In this research, we focus on the software running on a microprocessor and we view the program as a power consumer. Our work concentrates on the role of the compiler in the construction of "power-efficient" code, and especially its interaction with the hardware so that unnecessary processor activity is saved. We propose techniques that use extra hardware features and compiler-driven code transformations that specifically target activity reduction in certain parts of the CPU which are known to be large power and energy consumers. Design for low power/energy at this level of abstraction entails larger energy gains than in the lower stages of the design hierarchy in which the design team has already made the most important design commitments. The role of the compiler in generating code which exploits the processor organization is also fundamental in energy minimization. Hence, we propose a hardware/software co-design paradigm, and we show what code transformations are necessary by the compiler so that "wasted" power in a modern microprocessor can be trimmed. More specifically, we propose a technique that uses an additional mini cache located between the instruction cache (I-Cache) and the CPU core; the mini cache buffers instructions that are nested within loops and are continuously fetched from the I-Cache. This mechanism can create very substantial energy savings, since the I-Cache unit is one of the main power consumers in most of today's high-performance microprocessors. Results are reported for the SPEC95 benchmarks in the R-4400 processor which implements the MIPS2 instruction
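
    The energy argument for the mini cache rests on how much of the instruction-fetch stream a small loop buffer can absorb. The toy simulation below estimates a hit rate for a hypothetical address trace under a deliberately crude replacement policy; it is an illustration of the idea, not the paper's model or the actual hardware policy.

    ```python
    # Illustrative only: estimate how often instruction fetches could be
    # served by a small loop buffer between the I-cache and the core.
    def loop_buffer_hit_rate(trace, buffer_size):
        buffer, hits = set(), 0
        for addr in trace:
            if addr in buffer:
                hits += 1            # fetch served by the mini cache
            elif len(buffer) < buffer_size:
                buffer.add(addr)     # fill on miss while capacity remains
            else:
                buffer = {addr}      # crude replacement: restart on overflow
        return hits / len(trace)

    # Hypothetical trace: a tight 8-instruction loop executed 100 times
    # dominates a short sequential prologue.
    trace = list(range(100)) + list(range(200, 208)) * 100
    print(f"hit rate: {loop_buffer_hit_rate(trace, buffer_size=16):.2%}")
    ```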

  4. An Optimizing Compiler for Petascale I/O on Leadership Class Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Choudhary, Alok [Northwestern Univ., Evanston, IL (United States); Kandemir, Mahmut [Pennsylvania State Univ., State College, PA (United States)

    2015-03-18

    In high-performance computing systems, parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our project explored automated instrumentation and compiler support for I/O intensive applications. Our project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and designing and implementing state-of-the-art compiler/runtime system technology that targets I/O intensive HPC applications that target leadership class machine. This final report summarizes the major achievements of the project and also points out promising future directions.

  5. Remote mechanical C line

    International Nuclear Information System (INIS)

    Nuttall, K.R.; Gardner, P.R.

    1991-01-01

    Westinghouse Hanford Company is developing a desk-top simulation based training program on the operation of the Remote Mechanical C (RMC) Line process in the Plutonium Finishing Plant on the Hanford site, Richland, Washington. Simulations display and continually update current values of system parameters on computer graphics of RMC line equipment. Students are able to operate a variety of controllers to maintain proper system status. Programmed faults, selectable by the course instructor, can be used to test student responses to off-normal events. Prior to operation of the simulation, students are given computer-based tutorials on the function, processes, operation, and error conditions associated with individual components. By including the capability of operating each individual component - valves, heaters, agitators, etc. - the computer-based training (CBT) lessons become an interactive training manual. From one perspective RMC represents one step in the diffusion of the well-known and well-documented simulator training activities for nuclear reactor operators to other training programs, equally critical, perhaps, but less well scrutinized in the past. Because of the slowly responding nature of the actual process, RMC can retain many of the capabilities of practice and testing in a simulated work environment while avoiding the cost of a full scale simulator and the exposure and waste developed by practice runs of the RMC line. From another perspective RMC suggests training advances even beyond the most faithful simulators. For example, by integrating CBT lessons with the simulation, RMC permits students to focus in on specific processes occurring inside chosen components. In effect, the interactive training manual is available on-line with the simulation itself. Costs are also discussed

  6. Yucca Mountain Project bibliography, July--December 1988: An update: Civilian Radioactive Waste Management Program

    International Nuclear Information System (INIS)

    Tamura, A.T.; Lorenz, J.J.

    1989-04-01

    This update contains information on the Yucca Mountain Project that was added to the Energy Data Base during the last six months of 1988. The update also includes a new section which provides information about publications on the Energy Data Base that were not sponsored by the project but have some relevance to it. This section covers the period 1977 to 1988. Prior to August 5, 1988, this project was called the Nevada Nuclear Waste Storage Investigations. The update is categorized by principal project participating organizations, and items are arranged in chronological order. Participant-sponsored subcontractor reports, meeting papers, and journal articles are included with sponsoring organization. Previous information on this project can be found in the Nevada Nuclear Waste Storage Investigations bibliographies: DOE/TIC-3406, which covers the years 1977 to 1985; DOE/OSTI-3406(Suppl.1), which covers 1986 and 1987; and the Yucca Mountain Project Bibliography, DOE/OSTI-3406(Suppl.1)(Add. 1), which covers the first six months of 1988. All entries in these publications are searchable on-line on the NNW data base file which can be accessed through the Integrated Technical Information System (ITIS) of the US Department of Energy

  7. Statistical Compilation of the ICT Sector and Policy Analysis | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  8. Statistical Compilation of the ICT Sector and Policy Analysis | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  9. User's manual for the computer-aided plant transient data compilation

    International Nuclear Information System (INIS)

    Langenbuch, S.; Gill, R.; Lerchl, G.; Schwaiger, R.; Voggenberger, T.

    1984-01-01

    The objective of this project is the compilation of data for nuclear power plants needed for transient analyses. The concept has already been described. This user's manual gives a detailed description of all functions of the dialogue system that supports data acquisition and retrieval. (orig.) [de

  10. Updated cannulation technique for tissue plasminogen activator injection into peripapillary retinal vein for central retinal vein occlusion.

    Science.gov (United States)

    van Overdam, Koen A; Missotten, Tom; Spielberg, Leigh H

    2015-12-01

    To update the surgical technique in which a vitrectomy is performed and a retinal branch vein is cannulated and infused with recombinant tissue plasminogen activator (RTPA) to treat central retinal vein occlusion (CRVO) in patients who present with very low visual acuity (VA). Twelve consecutive patients (12 eyes) with CRVO and low VA (logMAR >1.00) at presentation were treated using this method. Cannulation of a peripapillary retinal vein and stable injection of RTPA was successfully performed without surgery-related complications in all 12 eyes. At 12 months after surgery, 8 of the 12 patients (67%) experienced at least one line of improvement in best corrected visual acuity; 6 of the 12 (50%) improved ≥5 lines and 2 (17%) improved ≥8 lines. After additional grid laser and/or subconjunctival or intravitreal corticosteroids, the mean decrease in central foveal thickness was 260 μm, and the mean total macular volume decreased from 12.10 mm³ to 9.24 mm³. Four patients received panretinal photocoagulation to treat either iris neovascularization (n = 2) or neovascularization of the retina and/or disc (n = 2). Administration of RTPA via a peripapillary vein using this updated technique provides an alternative or additional treatment option for patients with very low VA after CRVO. © 2015 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  11. Expression image data of Drosophila GAL4 enhancer trap lines - GETDB | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Expression image data of Drosophila GAL4 enhancer trap lines (GETDB, LSDB Archive). DOI: 10.18908/lsdba.nbdc00236-004. Data contents: 3,075 expression image data, classified by developmental stage of Drosophila.

  12. Just-in-Time Compilation-Inspired Methodology for Parallelization of Compute Intensive Java Code

    Directory of Open Access Journals (Sweden)

    GHULAM MUSTAFA

    2017-01-01

    Full Text Available Compute intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code, for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated in the front-end of a JIT compiler to parallelize sequential code, just before native translation. However, compilation to native code is out of scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system
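
    A DOALL loop is one whose iterations carry no cross-iteration dependences, so they can run concurrently without synchronization. The sketch below shows the effect of the transformation in plain Python; the paper targets Java bytecode in a JIT front-end, so this is only an illustration of the concept, not the authors' implementation.

    ```python
    # Illustrative DOALL parallelisation: the loop body depends only on its
    # own index i, so iterations can be distributed across cores.
    from concurrent.futures import ProcessPoolExecutor
    import math

    def body(i):
        # independent per-iteration work: no cross-iteration dependences
        return math.sqrt(i) * math.sin(i)

    def doall(n, workers=8):
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(body, range(n), chunksize=1024))

    if __name__ == "__main__":
        out = doall(100_000)
        print(len(out), out[42])
    ```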

  13. Just-in-time compilation-inspired methodology for parallelization of compute intensive java code

    International Nuclear Information System (INIS)

    Mustafa, G.; Ghani, M.U.

    2017-01-01

    Compute intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code, for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated in the front-end of a JIT compiler to parallelize sequential code, just before native translation. However, compilation to native code is out of scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system. (author)

  14. Recent advances in PC-Linux systems for electronic structure computations by optimized compilers and numerical libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Yu, Chin-Hui

    2002-01-01

    One of the most frequently used packages for electronic structure research, GAUSSIAN 98, is compiled on Linux systems with various hardware configurations, including AMD Athlon (with the "Thunderbird" core), AthlonMP, and AthlonXP (with the "Palomino" core) systems as well as the Intel Pentium 4 (with the "Willamette" core) machines. The default PGI FORTRAN compiler (pgf77) and the Intel FORTRAN compiler (ifc) are respectively employed with different architectural optimization options to compile GAUSSIAN 98 and test the performance improvement. In addition to the BLAS library included in revision A.11 of this package, the Automatically Tuned Linear Algebra Software (ATLAS) library is linked against the binary executables to improve the performance. Various Hartree-Fock, density-functional theories, and the MP2 calculations are done for benchmarking purposes. It is found that the combination of ifc with ATLAS library gives the best performance for GAUSSIAN 98 on all of these PC-Linux computers, including AMD and Intel CPUs. Even on AMD systems, the Intel FORTRAN compiler invariably produces binaries with better performance than pgf77. The enhancement provided by the ATLAS library is more significant for post-Hartree-Fock calculations. The performance on one single CPU is potentially as good as that on an Alpha 21264A workstation or an SGI supercomputer. The floating-point marks by SpecFP2000 have similar trends to the results of GAUSSIAN 98 package.

  15. THE USE OF MULTIPLE DATA SOURCES IN THE PROCESS OF TOPOGRAPHIC MAPS UPDATING

    Directory of Open Access Journals (Sweden)

    A. Cantemir

    2016-06-01

    Full Text Available The methods used in the process of updating maps have evolved and become more complex, especially with the development of digital technology. At the same time, the development of technology has led to an abundance of available data that can be used in the updating process. The data sources come in a great variety of forms and formats from different acquisition sensors. Satellite images provided by certain satellite missions are now available on space agencies' portals. Images stored in the archives of satellite missions such as Sentinel, Landsat and others can be downloaded free of charge. The main advantages are the large coverage area and rather good spatial resolution, which enable the use of these images for map updating at an appropriate scale. In our study we focused our research of these images on the 1:50,000 scale map. DEMs that are globally available can represent an appropriate input for watershed delineation and stream network generation, which can be used as support for updating the hydrography thematic layer; a sketch of this kind of DEM processing follows below. If, in addition to remote sensing, aerial photogrammetry and LiDAR data are used, the accuracy of the data sources is enhanced. Orthophotoimages and Digital Terrain Models are the main products that can be used for feature extraction and update. On the other side, the use of georeferenced analogue basemaps represents a significant addition to the process. Concerning thematic maps, the classic representation of the terrain by contour lines derived from the DTM remains the best method of depicting the earth's surface on a map; nevertheless, correlation with other layers such as hydrography is mandatory. In the context of the current national coverage of the Digital Terrain Model, one of the main concerns of the National Center of Cartography, through the Cartography and Photogrammetry Department, is the exploitation of the available data in order to update the layers of the Topographic Reference Map 1:5000, known as
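
    As an example of the DEM processing mentioned above, the sketch below computes D8 flow directions, the usual first step of watershed delineation and stream-network generation. It is a simplified illustration on a hypothetical grid, not the authors' workflow.

    ```python
    # Illustrative D8 flow-direction sketch: each interior cell drains toward
    # its steepest downslope neighbour (diagonal distances are sqrt(2)).
    import numpy as np

    def d8_flow_direction(dem):
        """Return, per interior cell, the (drow, dcol) of the steepest
        downslope neighbour, or (0, 0) for pits."""
        rows, cols = dem.shape
        direction = np.zeros((rows, cols, 2), dtype=int)
        neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                      (0, 1), (1, -1), (1, 0), (1, 1)]
        for r in range(1, rows - 1):
            for c in range(1, cols - 1):
                drops = [(dem[r, c] - dem[r + dr, c + dc]) / np.hypot(dr, dc)
                         for dr, dc in neighbours]
                best = int(np.argmax(drops))
                if drops[best] > 0:  # only route flow downhill
                    direction[r, c] = neighbours[best]
        return direction

    dem = np.array([[5, 5, 5, 5],
                    [5, 4, 3, 5],
                    [5, 3, 2, 5],
                    [5, 5, 1, 5]], dtype=float)
    print(d8_flow_direction(dem)[1, 1])  # drains toward the lowest neighbour
    ```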

  16. Compiling a corpus-based dictionary grammar: an example for ...

    African Journals Online (AJOL)

    In this article it is shown how a corpus-based dictionary grammar may be compiled — that is, a mini-grammar fully based on corpus data and specifically written for use in and inte-grated with a dictionary. Such an effort is, to the best of our knowledge, a world's first. We exem-plify our approach for a Northern Sotho ...

  17. Compilation of data on γ - γ → hadrons

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1986-06-01

    Data on γγ → hadrons extracted from e+e- reactions are compiled. The review includes inclusive cross-sections, structure functions, exclusive cross-sections and resonance widths. Data up to 1st July 1986 are included. All the data in this review can be found and retrieved in the Durham-RAL HEP database, together with a wide range of other reaction data. Users throughout Europe can interactively access the database through CMS on the RAL computer. (author)

  18. Engineering Amorphous Systems, Using Global-to-Local Compilation

    Science.gov (United States)

    Nagpal, Radhika

    Emerging technologies are making it possible to assemble systems that incorporate myriads of information-processing units at almost no cost: smart materials, self-assembling structures, vast sensor networks, pervasive computing. How does one engineer robust and prespecified global behavior from the local interactions of immense numbers of unreliable parts? We discuss organizing principles and programming methodologies that have emerged from Amorphous Computing research, which allow us to compile a specification of global behavior into a robust program for local behavior.

  19. Bibliography of work on the heterogeneous photocatalytic removal of hazardous compounds from water and air: Update Number 1 to June, 1995

    Energy Technology Data Exchange (ETDEWEB)

    Blake, D.M.

    1995-11-01

    This report is an update of a bibliography, published in May, 1994, of research performed on the photocatalytic oxidation of organic or inorganic compounds in air or water and on the photocatalytic reduction of metal-containing ions in water. The general focus of the research is on removing hazardous contaminants from air or water to meet environmental or health regulations. The processes covered are based on the application of heterogeneous photocatalysts. The current state-of-the-art in catalysts are forms of titanium dioxide or modifications of titanium dioxide, but work on other heterogeneous catalysts is also included in this compilation. This update contains 574 references, most published between January, 1993 and June, 1995, but some references are from earlier work that were not included in the previous report. A new section has been added which gives information about companies that are active in providing products based on photocatalytic processes or that can provide pilot, demonstration, or commercial-scale water- or air-treatment systems. Key words, assigned by the author of this report, have been included with the citations in the listing of the bibliography.

  20. Digital compilation bedrock geologic map of the Mt. Ellen quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-6A Stanley, RS, Walsh, G, Tauvers, PR, DiPietro, JA, and DelloRusso, V, 1995, Digital compilation bedrock geologic map of the Mt. Ellen...

  1. EPOCA/EUR-OCEANS data compilation on the biological and biogeochemical responses to ocean acidification

    Directory of Open Access Journals (Sweden)

    A.-M. Nisumaa

    2010-07-01

    Full Text Available The uptake of anthropogenic CO2 by the oceans has led to a rise in the oceanic partial pressure of CO2, and to a decrease in pH and carbonate ion concentration. This modification of the marine carbonate system is referred to as ocean acidification. Numerous papers report the effects of ocean acidification on marine organisms and communities but few have provided details concerning full carbonate chemistry and complementary observations. Additionally, carbonate system variables are often reported in different units, calculated using different sets of dissociation constants and on different pH scales. Hence the direct comparison of experimental results has been problematic and often misleading. The need was identified to (1) gather data on carbonate chemistry, biological and biogeochemical properties, and other ancillary data from published experimental data, (2) transform the information into a common framework, and (3) make the data freely available. The present paper is the outcome of an effort to integrate ocean carbonate chemistry data from the literature which has been supported by the European Network of Excellence for Ocean Ecosystems Analysis (EUR-OCEANS) and the European Project on Ocean Acidification (EPOCA). A total of 185 papers were identified, 100 contained enough information to readily compute carbonate chemistry variables, and 81 data sets were archived at PANGAEA – The Publishing Network for Geoscientific & Environmental Data. This data compilation is regularly updated as an ongoing mission of EPOCA.

    Data access: http://doi.pangaea.de/10.1594/PANGAEA.735138
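
    To make the comparability problem concrete, the sketch below computes carbonate speciation from DIC and pH using the standard equilibrium relations. The result depends entirely on which dissociation constants (and pH scale) are supplied, which is exactly why a common framework is needed; the constants below are placeholders of realistic magnitude, not recommended values.

    ```python
    # Illustrative carbonate-system speciation from DIC and pH. K1 and K2 are
    # user-supplied: their values depend on temperature, salinity, pressure
    # and the chosen pH scale, so the numbers below are placeholders only.
    def carbonate_speciation(dic, ph, k1, k2):
        h = 10.0 ** (-ph)
        co2 = dic / (1.0 + k1 / h + k1 * k2 / h**2)      # [CO2*]
        hco3 = dic / (h / k1 + 1.0 + k2 / h)             # [HCO3-]
        co3 = dic / (h**2 / (k1 * k2) + h / k2 + 1.0)    # [CO3 2-]
        return co2, hco3, co3

    # Placeholder constants of realistic magnitude for surface seawater:
    co2, hco3, co3 = carbonate_speciation(dic=2.05e-3, ph=8.1,
                                          k1=1.4e-6, k2=1.1e-9)
    print(f"CO2* {co2:.2e}  HCO3- {hco3:.2e}  CO3 2- {co3:.2e} mol/kg")
    ```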

  2. HOPE: A Python just-in-time compiler for astrophysical computations

    Science.gov (United States)

    Akeret, J.; Gamper, L.; Amara, A.; Refregier, A.

    2015-04-01

    The Python programming language is becoming increasingly popular for scientific applications due to its simplicity, versatility, and the broad range of its libraries. A drawback of this dynamic language, however, is its low runtime performance which limits its applicability for large simulations and for the analysis of large data sets, as is common in astrophysics and cosmology. While various frameworks have been developed to address this limitation, most focus on covering the complete language set, and either force the user to alter the code or are not able to reach the full speed of an optimised native compiled language. In order to combine the ease of Python and the speed of C++, we developed HOPE, a specialised Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimisation on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. We assess the performance of HOPE by performing a series of benchmarks and compare its execution speed with that of plain Python, C++ and the other existing frameworks. We find that HOPE improves the performance compared to plain Python by a factor of 2 to 120, achieves speeds comparable to that of C++, and often exceeds the speed of the existing solutions. We discuss the differences between HOPE and the other frameworks, as well as future extensions of its capabilities. The fully documented HOPE package is available at http://hope.phys.ethz.ch and is published under the GPLv3 license on PyPI and GitHub.
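
    A minimal usage sketch of the decorator pattern the abstract describes follows. We assume the decorator is exposed as hope.jit, per the package documentation; the kernel itself is a hypothetical example of the numerical subset such a compiler targets.

    ```python
    # Hedged sketch: assumes the decorator is exposed as hope.jit.
    import numpy as np
    import hope

    @hope.jit
    def add_scaled(y, a, b, c):
        # element-wise kernel: y receives a + 2*b*c, evaluated in generated C++
        y[:] = a + 2.0 * b * c

    a, b, c = (np.random.standard_normal(1_000_000) for _ in range(3))
    y = np.empty_like(a)
    add_scaled(y, a, b, c)  # first call triggers C++ generation and compilation
    print(y[:3])
    ```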

  3. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels over the world. Nowadays, it is widely recognized that how to update these established geospatial databases and keep them up to date is most critical for their value. So, more and more effort has been devoted to the continuous updating of these geospatial databases. Currently, there exist two main types of methods for geospatial database updating: directly updating with remote sensing images or field surveying materials, and indirectly updating with other updated data results, such as newly updated data at a larger scale. The former method is fundamental, because the update data sources in both methods ultimately derive from field surveying and remote sensing. The latter method is often more economical and faster than the former. Therefore, after the larger scale database is updated, the smaller scale database should be updated correspondingly in order to keep the consistency of the multi-scale geospatial database. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating; a sketch of one classic generalization step follows below. The latter is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment in terms of map scale, i.e., where different scale databases are produced and maintained separately by different-level organizations, such as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from larger scales in the collaborative updating environment for SDI. The requirements for the application of map generalization to spatial database updating are analyzed first. A brief review of geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and review, we analyze the key factors for implementing updating geospatial data from large scale including technical
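
    One classic building block of the map generalization step discussed above is line simplification. The sketch below implements the Douglas-Peucker algorithm as an illustration of deriving a smaller-scale line from a larger-scale one; it is a generic example of the technique, not the paper's method.

    ```python
    # Illustrative map-generalisation step: Douglas-Peucker line
    # simplification with a distance tolerance in map units.
    import math

    def douglas_peucker(points, tolerance):
        if len(points) < 3:
            return points
        (x1, y1), (x2, y2) = points[0], points[-1]
        length = math.hypot(x2 - x1, y2 - y1) or 1e-12  # guard degenerate segment
        # perpendicular distance of each interior point to the anchor segment
        dists = [abs((y2 - y1) * x - (x2 - x1) * y + x2 * y1 - y2 * x1) / length
                 for x, y in points[1:-1]]
        idx = max(range(len(dists)), key=dists.__getitem__) + 1
        if dists[idx - 1] <= tolerance:
            return [points[0], points[-1]]       # all deviations within tolerance
        left = douglas_peucker(points[:idx + 1], tolerance)
        right = douglas_peucker(points[idx:], tolerance)
        return left[:-1] + right                 # drop the duplicated split point

    line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
    print(douglas_peucker(line, tolerance=0.5))
    ```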

  4. ERES: A PC program for nuclear data compilation in EXFOR format

    International Nuclear Information System (INIS)

    Li Shubing; Liang Qichang; Liu Tingin

    1994-01-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)

  5. Compilation of excitation cross sections for He atoms by electron impact

    International Nuclear Information System (INIS)

    Kato, T.; Itikawa, Y.; Sakimoto, K.

    1992-03-01

    Experimental and theoretical data are compiled on the cross section for the excitation of He atoms by electron impact. The available data are compared graphically. The survey of the literature covers publications through the end of 1991. (author)

  6. Digital compilation bedrock geologic map of the South Mountain quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-3A Stanley, R.S., DelloRusso, V., Tauvers, P.R., DiPietro, J.A., Taylor, S., and Prahl, C., 1995, Digital compilation bedrock geologic map of...

  7. ERES: A PC program for nuclear data compilation in EXFOR format

    Energy Technology Data Exchange (ETDEWEB)

    Shubing, Li [NanKai University, Tianjin (China); Qichang, Liang; Tingin, Liu [Chinese Nuclear Data Center, Institute of Atomic Energy, Beijing (China)

    1994-02-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)

  8. GSD Update: Year in Review: Spotlight on 2017 Research by the Grassland, Shrubland and Desert Ecosystems Science Program

    Science.gov (United States)

    Deborah M. Finch

    2018-01-01

    In this issue of the GSD Update, we feature selected studies of the RMRS Grassland, Shrubland and Desert Ecosystems Science Program (GSD) that focus on the theme of fire. Significant results of recent research and science delivery by GSD scientists are highlighted. We feature program research that lines up with the strategic priorities and goals of the USDA Forest...

  9. GSD Update: Year in Review: Spotlight on 2013 research by the Grassland, Shrubland and Desert Ecosystems Science Program

    Science.gov (United States)

    Deborah M. Finch

    2014-01-01

    In this issue of the GSD Update, we take a look back at selected studies of the Grassland, Shrubland and Desert Ecosystems Science Program (GSD) that depict its strengths and focus areas. Significant results of recent research and science delivery by GSD scientists are highlighted. We feature program research that lines up with the strategic research priorities of the...

  10. GSD Update: Year in Review: Spotlight on 2015 Research by the Grassland, Shrubland and Desert Ecosystems Science Program

    Science.gov (United States)

    Deborah M. Finch

    2016-01-01

    In this issue of the GSD Update, we take a look back at selected studies of the Grassland, Shrubland and Desert Ecosystems Science Program (GSD) that depict its strengths and focus areas. Significant results of recent research and science delivery by GSD scientists are highlighted. We feature program research that lines up with the strategic research...

  11. Support for Astronaut's View of Mexican/ Central American Fires and on-Line Earth Observations Training Manual

    Science.gov (United States)

    Kaminski, Charles F., Jr.

    1999-01-01

    A small project to compile remote sensing and in situ data to review the processes leading to the May 1998 Mexican/Central American fires was undertaken. A web page based on this project was assembled. The second project involved an interactive, on-line program that will replace the paper version of the Earth Observations Preflight Training Manual. Technical support was provided to Prof. Marvin Glasser as needed.

  12. Photovoltaic Shading Testbed for Module-Level Power Electronics: 2016 Performance Data Update

    Energy Technology Data Exchange (ETDEWEB)

    Deline, Chris [National Renewable Energy Lab. (NREL), Golden, CO (United States); Meydbray, Jenya [PV Evolution Labs (PVEL), Davis, CA (United States); Donovan, Matt [PV Evolution Labs (PVEL), Davis, CA (United States)

    2016-09-01

    The 2012 NREL report 'Photovoltaic Shading Testbed for Module-Level Power Electronics' provides a standard methodology for estimating the performance benefit of distributed power electronics under partial shading conditions. Since the release of the report, experiments have been conducted for a number of products and for different system configurations. Drawing from these experiences, updates to the test and analysis methods are recommended. Proposed changes in data processing have the benefit of reducing the sensitivity to measurement errors and weather variability, as well as bringing the updated performance score in line with measured and simulated values of the shade recovery benefit of distributed PV power electronics. Also, due to the emergence of new technologies including sub-module embedded power electronics, the shading method has been extended to include power electronics that operate at a finer granularity than the module level. An update to the method is proposed to account for these emerging technologies that respond to shading differently than module-level devices. The partial shading test remains a repeatable test procedure that attempts to simulate shading situations as would be experienced by typical residential or commercial rooftop photovoltaic (PV) systems. Performance data for multiple products tested using this method are discussed, based on equipment from Enphase, Solar Edge, Maxim Integrated and SMA. In general, the annual recovery of shading losses from the module-level electronics evaluated is 25-35%, with the major difference between different trials being related to the number of parallel strings in the test installation rather than differences between the equipment tested. Appendix D data has been added in this update.

  13. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley...

  14. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley and...

  15. Physician Bayesian updating from personal beliefs about the base rate and likelihood ratio.

    Science.gov (United States)

    Rottman, Benjamin Margolin

    2017-02-01

    Whether humans can accurately make decisions in line with Bayes' rule has been one of the most important yet contentious topics in cognitive psychology. Though a number of paradigms have been used for studying Bayesian updating, rarely have subjects been allowed to use their own preexisting beliefs about the prior and the likelihood. A study is reported in which physicians judged the posttest probability of a diagnosis for a patient vignette after receiving a test result, and the physicians' posttest judgments were compared to the normative posttest calculated from their own beliefs in the sensitivity and false positive rate of the test (likelihood ratio) and prior probability of the diagnosis. On the one hand, the posttest judgments were strongly related to the physicians' beliefs about both the prior probability as well as the likelihood ratio, and the priors were used considerably more strongly than in previous research. On the other hand, both the prior and the likelihoods were still not used quite as much as they should have been, and there was evidence of other nonnormative aspects to the updating, such as updating independent of the likelihood beliefs. By focusing on how physicians use their own prior beliefs for Bayesian updating, this study provides insight into how well experts perform probabilistic inference in settings in which they rely upon their own prior beliefs rather than experimenter-provided cues. It suggests that there is reason to be optimistic about experts' abilities, but that there is still considerable need for improvement.
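
    The normative posttest probability referred to above follows from Bayes' rule in odds form: posttest odds = pretest odds x likelihood ratio. A minimal sketch of the arithmetic, with purely illustrative numbers for the prior, sensitivity, and false-positive rate:

        def posttest_probability(prior, sensitivity, false_positive_rate):
            """Bayes' rule in odds form for a positive test result."""
            lr_positive = sensitivity / false_positive_rate   # likelihood ratio
            pretest_odds = prior / (1.0 - prior)
            posttest_odds = pretest_odds * lr_positive
            return posttest_odds / (1.0 + posttest_odds)

        # e.g. prior 10%, sensitivity 90%, false-positive rate 5% -> ~0.667
        print(round(posttest_probability(0.10, 0.90, 0.05), 3))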

  16. Regulatory and technical reports, compilation for 1979. Volume 4. Bibliographical report Jan-Dec 79

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzie, L.; Aragon, R.

    1980-07-01

    The compilation lists formal regulatory and technical reports issued in 1979 by the U.S. Nuclear Regulatory Commission (NRC) staff and by NRC contractors. The compilation is divided into three major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The first portion of this sequential section lists staff reports, the second portion lists NRC-sponsored conference proceedings, and the third lists contractor reports. Each report citation in the sequential section contains full bibliographic information

  17. Finite element modelling of ionized field quantities around a monopolar HVDC transmission line

    International Nuclear Information System (INIS)

    Jaiswal, Vinay; Thomas, M Joy

    2003-01-01

    In this paper, the Poisson's equation describing the ionized field around an HVDC line is solved using an improved finite element based technique. First order isoparametric quadrilateral elements, together with a modified updating criterion for the space charge distribution, are implemented in the iterative procedure. A novel technique is presented for mesh generation in the presence of space charges. Electric field lines and equipotential lines have been computed using the proposed technique. Total corona current at different applied voltages above corona onset voltage, electric field at the ground plane with and without the presence of space charges and current density at the ground plane have also been computed. The results are in agreement with the experimental values available in the published literature
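
    The iterative scheme summarized above (solve Poisson's equation for the potential, update the space-charge distribution, and repeat) can be illustrated schematically in one dimension with finite differences; the geometry, boundary values, mobility, and relaxation rule below are simplified assumptions for illustration only, not the paper's finite-element formulation.

        import numpy as np

        n, L = 101, 1.0                  # grid points, schematic 1 m gap
        x = np.linspace(0, L, n); h = x[1] - x[0]
        eps0 = 8.854e-12
        rho = np.full(n, 1e-8)           # initial space-charge guess (C/m^3)

        for _ in range(50):              # outer iteration: field <-> charge
            # Solve d2phi/dx2 = -rho/eps0, phi(0) = 2e5 V (line), phi(L) = 0
            A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
                 + np.diag(np.ones(n - 1), -1))
            b = -rho * h**2 / eps0
            A[0, :] = 0.0; A[0, 0] = 1.0; b[0] = 2e5
            A[-1, :] = 0.0; A[-1, -1] = 1.0; b[-1] = 0.0
            phi = np.linalg.solve(A, b)
            E = -np.gradient(phi, h)
            # Schematic charge update: relax towards a charge density
            # consistent with a constant current density J = rho * mu * E
            J, mu = 2e-8, 1.5e-4
            rho = 0.5 * rho + 0.5 * J / (mu * np.abs(E) + 1e-12)

        print(f"E at ground plane ~ {E[-1]:.3e} V/m")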

  18. Low-temperature geothermal water in Utah: A compilation of data for thermal wells and springs through 1993

    Energy Technology Data Exchange (ETDEWEB)

    Blackett, R.E.

    1994-07-01

    The Geothermal Division of DOE initiated the Low-Temperature Geothermal Resources and Technology Transfer Program, following a special appropriation by Congress in 1991, to encourage wider use of lower-temperature geothermal resources through direct-use, geothermal heat-pump, and binary-cycle power conversion technologies. The Oregon Institute of Technology (OIT), the University of Utah Research Institute (UURI), and the Idaho Water Resources Research Institute organized the federally funded program and enlisted the help of ten western states to carry out phase one. This first phase involves updating the inventory of thermal wells and springs with the help of the participating state agencies. The state resource teams inventory thermal wells and springs and compile relevant information on each source. OIT and UURI cooperatively administer the program: OIT provides overall contract management, while UURI provides technical direction to the state teams. Phase one of the program focuses on replacing part of GEOTHERM by building a new database of low- and moderate-temperature geothermal systems for use on personal computers. For Utah, this involved (1) identifying sources of geothermal data; (2) designing a database structure; (3) entering the new data; (4) checking for errors, inconsistencies, and duplicate records; (5) organizing the data into reporting formats; and (6) generating a map (1:750,000 scale) of Utah showing the locations and record identification numbers of thermal wells and springs.
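
    Step (4) of the workflow above, checking for duplicate records, is straightforward to sketch; the record layout and the coordinate tolerance below are illustrative assumptions, not the actual procedure used for the Utah database.

        def find_duplicate_wells(records, tol_deg=1e-4):
            """Flag pairs of thermal-well/spring records whose coordinates
            agree within a small tolerance -- candidate duplicate entries."""
            dups = []
            for i, a in enumerate(records):
                for b in records[i + 1:]:
                    if (abs(a["lat"] - b["lat"]) < tol_deg and
                            abs(a["lon"] - b["lon"]) < tol_deg):
                        dups.append((a["id"], b["id"]))
            return dups

        records = [
            {"id": "UT-001", "lat": 38.5723, "lon": -112.8551},
            {"id": "UT-002", "lat": 38.57231, "lon": -112.85512},  # near-duplicate
            {"id": "UT-003", "lat": 40.1120, "lon": -111.9011},
        ]
        print(find_duplicate_wells(records))   # -> [('UT-001', 'UT-002')]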

  19. HB-Line Material Control and Accountability Measurements at SRS

    International Nuclear Information System (INIS)

    Casella, V.R.

    2003-01-01

    Presently, HB-Line work at the Savannah River Site consists primarily of the stabilization and packaging of nuclear materials for storage and the characterization of materials for disposition in H-Area. In order to ensure compliance with Material Control and Accountability (MC and A) regulations, accountability measurements are performed throughout the HB-Line processes. Accountability measurements are used to keep track of the nuclear material inventory by constantly updating the amount of material in the MBAs (Material Balance Areas) and sub-MBAs: the amount of accountable material added to a process is subtracted, and the amount of accountable material put back in storage is added. A physical inventory is taken and compared to the ''Book Value'' listed in the Nuclear Material Accounting System. The difference between the book inventory and the physical inventory (BPID) of a sub-account for bulk material must lie within the measurement errors combined in quadrature to provide assurance that nuclear material is accounted for. This work provides an overview of HB-Line processes and accountability measurements. The Scrap Recovery Line and Neptunium-237/Plutonium-239 Oxide Line are described, and sampling and analyses for Phase II are provided. Recommendations are given to improve efficiency and cost effectiveness
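
    As a hedged illustration of the inventory check described above, the sketch below computes a book-minus-physical difference and compares it with the measurement errors combined in quadrature; the variable names and the coverage factor are illustrative assumptions, not HB-Line procedure.

        import math

        def bpid_check(book_kg, physical_kg, measurement_sigmas_kg, k=2.0):
            """Compare the book-minus-physical inventory difference (BPID)
            with the combined measurement uncertainty (errors added in
            quadrature). k is an assumed coverage factor (e.g. 2 sigma)."""
            bpid = book_kg - physical_kg
            combined_sigma = math.sqrt(sum(s**2 for s in measurement_sigmas_kg))
            return bpid, combined_sigma, abs(bpid) <= k * combined_sigma

        # Illustrative numbers only:
        bpid, sigma, ok = bpid_check(12.50, 12.47, [0.01, 0.02, 0.015])
        print(f"BPID = {bpid:.3f} kg, combined sigma = {sigma:.3f} kg, ok: {ok}")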

  20. ANDEX. A PC software assisting the nuclear data compilation in EXFOR

    International Nuclear Information System (INIS)

    Osorio, V.

    1991-01-01

    This document describes the use of the personal computer software ANDEX, which assists the compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request, on a set of two diskettes, free of charge. (author)

  1. Nintedanib plus docetaxel as second-line therapy in patients with non-small-cell lung cancer of adenocarcinoma histology

    DEFF Research Database (Denmark)

    Popat, Sanjay; Mellemgaard, Anders; Reck, Martin

    2017-01-01

    PATIENTS & METHODS: We provide an update to a network meta-analysis evaluating the relative efficacy of nintedanib + docetaxel versus other second-line agents in adenocarcinoma histology non-small-cell lung cancer. RESULTS: Overall similarity of nintedanib + docetaxel versus ramucirumab + docetaxel...

  2. The Compilation of a Shona Children's Dictionary: Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Peniah Mabaso

    2011-10-01

    Abstract: This article outlines the challenges encountered by the African Languages Research Institute (ALRI) team members in the compilation of the monolingual Shona Children's Dictionary. The focus is mainly on the problems met in headword selection. Solutions by the team members when dealing with these problems are also presented.

  3. Regulatory and technical reports (Abstract Index Journal). Compilation for third quarter 1985, July-September. Volume 10, No. 3

    International Nuclear Information System (INIS)

    1985-10-01

    This compilation consists of bibliographic data and abstracts for the formal Regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation covers the period from July through September, 1985

  4. DOE/DMS workshop on future synchrotron VUV and x-ray beam lines

    International Nuclear Information System (INIS)

    Green, P.H.

    1992-03-01

    This document contains an overview of the participating DOE Laboratory beam line interests and the projected science to be addressed on these beam lines, both at new and existing synchrotron facilities. The scientific programs associated with present and planned synchrotron research by DOE Laboratories are discussed in chapters titled ''VUV and Soft X-Ray Research'' and ''Hard X-Ray Research.'' This research encompasses a broad range of the nation's scientific and technical research needs from fundamental to applied, in areas including environmental, biological, and physical sciences; new materials; and energy-related technologies. The projected cost of this proposed construction has been provided in tabular form using a uniform format so that anticipated DOE and outside funding agency contributions for construction and for research and development can be determined. The cost figures are, of course, subject to uncertainties of detailed design requirements and the availability of facility-designed generic components and outside vendors. The report also contains a compendium (as submitted by the beam line proposers) of the design capabilities, the anticipated costs, and the scientific programs of projected beam line construction at the four synchrotron facilities. A summary of the projected cost of these beam lines to be requested of DOE is compiled

  5. Grounding Lines Detecting Using LANDSAT8 Oli and CRYOSAT-2 Data Fusion

    Science.gov (United States)

    Li, F.; Guo, Y.; Zhang, Y.; Zhang, S.

    2018-04-01

    The grounding zone is the region where ice transitions from grounded ice sheet to freely floating ice shelf; grounding lines are actually more of a zone, typically extending over several kilometers. The mass loss from Antarctica is strongly linked to changes in the ice shelves and their grounding lines, since variation in the grounding line can result in very rapid changes in glacier and ice-shelf behavior. Based on remote sensing observations, five global Antarctic grounding line products have been released internationally: MOA, ASAID, ICESat, MEaSUREs, and Synthesized grounding lines. However, these five products cannot provide annual grounding line coverage of the whole Antarctic, and some of them are no longer updated, which limits time-series analysis of the Antarctic mass balance to a certain extent. Besides, the accuracy of grounding line products based on a single remote sensing data source is far from satisfactory. Therefore, we use algorithms to extract grounding lines from SAR and CryoSat-2 data respectively, and combine the results of the two kinds of grounding lines to obtain a new product; the resulting mature grounding line extraction workflow will allow the grounding line of the whole Antarctic to be extracted each year in the future. The comparison between the fusion results and the MOA product indicates a maximum deviation of 188.67 meters between the MOA product and the fusion result.

  6. Recent Developments in the NIST Atomic Databases

    Science.gov (United States)

    Kramida, Alexander

    2011-05-01

    New versions of the NIST Atomic Spectra Database (ASD, v. 4.0) and three bibliographic databases (Atomic Energy Levels and Spectra, v. 2.0, Atomic Transition Probabilities, v. 9.0, and Atomic Line Broadening and Shapes, v. 3.0) have recently been released. In this contribution I will describe the main changes in the way users get the data through the Web. The contents of ASD have been significantly extended. In particular, the data on highly ionized tungsten (W III-LXXIV) have been added from a recently published NIST compilation. The tables for Fe I and Fe II have been replaced with newer, much more extensive lists (10000 lines for Fe I). The other updated or new spectra include H, D, T, He I-II, Li I-III, Be I-IV, B I-V, C I-II, N I-II, O I-II, Na I-X, K I-XIX, and Hg I. The new version of ASD now incorporates data on isotopes of several elements. I will describe some of the issues the NIST ASD Team faces when updating the data.

  7. Recent Developments in the NIST Atomic Databases

    International Nuclear Information System (INIS)

    Kramida, Alexander

    2011-01-01

    New versions of the NIST Atomic Spectra Database (ASD, v. 4.0) and three bibliographic databases (Atomic Energy Levels and Spectra, v. 2.0, Atomic Transition Probabilities, v. 9.0, and Atomic Line Broadening and Shapes, v. 3.0) have recently been released. In this contribution I will describe the main changes in the way users get the data through the Web. The contents of ASD have been significantly extended. In particular, the data on highly ionized tungsten (W III-LXXIV) have been added from a recently published NIST compilation. The tables for Fe I and Fe II have been replaced with newer, much more extensive lists (10000 lines for Fe I). The other updated or new spectra include H, D, T, He I-II, Li I-III, Be I-IV, B I-V, C I-II, N I-II, O I-II, Na I-X, K I-XIX, and Hg I. The new version of ASD now incorporates data on isotopes of several elements. I will describe some of the issues the NIST ASD Team faces when updating the data.

  8. Line officers' views on stated USDA Forest Service values and the agency reward system.

    Science.gov (United States)

    James J. Kennedy; Richard W. Haynes; Xiaoping Zhou

    2005-01-01

    To update and expand a study done in 1989 (Kennedy et al. 1992), we surveyed line officers attending the third National Forest Supervisors’ Conference (Chief, Associate Chief, deputy chiefs, regional foresters, directors of International Institute of Tropical Forestry and State and Private Forestry Northeastern Area, and forest supervisors; January 2004) and a 40-...

  9. Self-diffusion in electrolyte solutions a critical examination of data compiled from the literature

    CERN Document Server

    Mills, R

    1989-01-01

    This compilation - the first of its kind - fills a real gap in the field of electrolyte data. Virtually all self-diffusion data in electrolyte solutions as reported in the literature have been examined and the book contains over 400 tables covering diffusion in binary and ternary aqueous solutions, in mixed solvents, and of non-electrolytes in various solvents. An important feature of the compilation is that all data have been critically examined and their accuracy assessed. Other features are an introductory chapter in which the methods of measurement are reviewed; appendices containing tables

  10. Statistical Compilation of the ICT Sector and Policy Analysis | Page 5 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The project is designed to expand the scope of conventional investigation beyond the telecommunications industry to include other vertically integrated components of the ICT sector such as manufacturing and services. ... Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia.

  11. Compilation of the abstracts of nuclear computer codes available at CPD/IPEN

    International Nuclear Information System (INIS)

    Granzotto, A.; Gouveia, A.S. de; Lourencao, E.M.

    1981-06-01

    A compilation of all computer codes available at IPEN in S. Paulo is presented. These computer codes are classified according to the Argonne National Laboratory and Nuclear Energy Agency scheme. (E.G.) [pt

  12. An Optimizing Compiler for Petascale I/O on Leadership-Class Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Kandemir, Mahmut Taylan [PSU; Choudary, Alok [Northwestern; Thakur, Rajeev [ANL

    2014-03-01

    In high-performance computing (HPC), parallel I/O architectures usually have very complex hierarchies, with multiple layers that collectively constitute an I/O stack: high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our DOE project explored automated instrumentation and compiler support for I/O-intensive applications. The project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and designing and implementing state-of-the-art compiler/runtime technology for I/O-intensive HPC applications targeting leadership-class machines. This final report summarizes the major achievements of the project and points out promising future directions. Two new sections in this report, compared to the previous report, are IOGenie and SSD/NVM-specific optimizations.

  13. Programming time-multiplexed reconfigurable hardware using a scalable neuromorphic compiler.

    Science.gov (United States)

    Minkovich, Kirill; Srinivasa, Narayan; Cruz-Albrecht, Jose M; Cho, Youngkwan; Nogin, Aleksey

    2012-06-01

    Scalability and connectivity are two key challenges in designing neuromorphic hardware that can match biological levels. In this paper, we describe a neuromorphic system architecture design that addresses an approach to meet these challenges using traditional complementary metal-oxide-semiconductor (CMOS) hardware. A key requirement in realizing such neural architectures in hardware is the ability to automatically configure the hardware to emulate any neural architecture or model. The focus for this paper is to describe the details of such a programmable front-end. This programmable front-end is composed of a neuromorphic compiler and a digital memory, and is designed based on the concept of synaptic time-multiplexing (STM). The neuromorphic compiler automatically translates any given neural architecture to hardware switch states and these states are stored in digital memory to enable desired neural architectures. STM enables our proposed architecture to address scalability and connectivity using traditional CMOS hardware. We describe the details of the proposed design and the programmable front-end, and provide examples to illustrate its capabilities. We also provide perspectives for future extensions and potential applications.
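
    The compiler step described above, translating a neural architecture into time-multiplexed hardware states, can be caricatured with a toy scheduler; the greedy slot-assignment rule and the resource constraint below are illustrative assumptions, not the authors' algorithm.

        from collections import defaultdict

        def stm_schedule(connections):
            """Greedily assign synaptic connections (pre, post) to time slots
            so that within one slot each neuron is used at most once -- a toy
            stand-in for hardware switch-state constraints."""
            slots = []                      # slots[i]: neurons busy in slot i
            schedule = defaultdict(list)
            for pre, post in connections:
                for i, busy in enumerate(slots):
                    if pre not in busy and post not in busy:
                        busy.update((pre, post))
                        schedule[i].append((pre, post))
                        break
                else:                       # no compatible slot: open a new one
                    slots.append({pre, post})
                    schedule[len(slots) - 1].append((pre, post))
            return dict(schedule)

        print(stm_schedule([(0, 1), (0, 2), (1, 2), (2, 3)]))
        # -> {0: [(0, 1), (2, 3)], 1: [(0, 2)], 2: [(1, 2)]}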

  14. Updated clinical guidelines experience major reporting limitations

    Directory of Open Access Journals (Sweden)

    Robin W.M. Vernooij

    2017-10-01

    Abstract: Background: The Checklist for the Reporting of Updated Guidelines (CheckUp) was recently developed. However, so far, no systematic assessment of the reporting of updated clinical guidelines (CGs) exists. We aimed to examine (1) the completeness of reporting the updating process in CGs and (2) the inter-observer reliability of CheckUp. Methods: We conducted a systematic assessment of the reporting of the updating process in a sample of updated CGs using CheckUp. We performed a systematic search to identify updated CGs published in 2015, developed by a professional society, reporting a systematic review of the evidence, and containing at least one recommendation. Three reviewers independently assessed the CGs with CheckUp (16 items). We calculated the median score per item, per domain, and overall, converting scores to a 10-point scale. Multiple linear regression analyses were used to identify differences according to country, type of organisation, scope, and health topic of updated CGs. We calculated the intraclass coefficient (ICC) and 95% confidence interval (95% CI) for domains and overall score. Results: We included in total 60 updated CGs. The median domain score on a 10-point scale for presentation was 5.8 (range 1.7 to 10), for editorial independence 8.3 (range 3.3 to 10), and for methodology 5.7 (range 0 to 10). The median overall score on a 10-point scale was 6.3 (range 3.1 to 10). Presentation and justification items at recommendation level (reported by 27 and 38% of the CGs, respectively) and the methods used for the external review and implementing changes in practice were particularly poorly reported (both reported by 38% of the CGs). CGs developed by a European or international institution obtained a statistically significantly higher overall score compared to North American or Asian institutions (p = 0.014). Finally, the agreement among the reviewers on the overall score was excellent (ICC 0.88, 95% CI 0.75 to 0.95). Conclusions: The

  15. JLAPACK – Compiling LAPACK FORTRAN to Java

    Directory of Open Access Journals (Sweden)

    David M. Doolin

    1999-01-01

    The JLAPACK project provides the LAPACK numerical subroutines translated from their subset Fortran 77 source into class files, executable by the Java Virtual Machine (JVM) and suitable for use by Java programmers. This makes it possible for Java applications or applets, distributed on the World Wide Web (WWW), to use established legacy numerical code that was originally written in Fortran. The translation is accomplished using a special-purpose Fortran-to-Java (source-to-source) compiler. The LAPACK API will be considerably simplified to take advantage of Java's object-oriented design. This report describes the research issues involved in the JLAPACK project, and its current implementation and status.

  16. Highlights of the HITRAN2016 database

    Science.gov (United States)

    Gordon, I.; Rothman, L. S.; Hill, C.; Kochanov, R. V.; Tan, Y.

    2016-12-01

    The HITRAN2016 database will be released just before the AGU meeting. It is a titanic effort of worldwide collaboration between experimentalists, theoreticians and atmospheric scientists, who measure, calculate and validate the HITRAN data. The line-by-line lists for almost all of the HITRAN molecules were updated in comparison with the previous compilation, HITRAN2012 [1], which has been in use, along with some intermediate updates, since 2012. The extent of the updates ranges from updating a few lines of certain molecules to complete replacements of the lists and the introduction of additional isotopologues. Many more vibrational bands were added to the database, extending the spectral coverage and completeness of the datasets. For several molecules, including H2O, CO2 and CH4, the extent of the updates is so complex that separate task groups were assembled to make strategic decisions about the choices of sources for various parameters in different spectral regions. The number of parameters has also been significantly increased, now incorporating, for instance, non-Voigt line profiles [2]; broadening by gases other than air and "self" [3]; and other phenomena, including line mixing. In addition, the number of cross-sectional sets in the database has increased dramatically and includes many recent experiments as well as adaptations of existing databases that were not in HITRAN previously (for instance the PNNL database [4]). The HITRAN2016 edition takes full advantage of the new structure and interface available at www.hitran.org [5] and the HITRAN Application Programming Interface [6]. This poster will provide a summary of the updates, emphasizing details of some of the most important or dramatic improvements. The users of the database will have an opportunity to discuss the updates relevant to their research and request a demonstration on how to work with the database. This work is supported by the NASA PATM (NNX13AI59G), PDART (NNX16AG51G) and AURA (NNX14AI55G
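
    As a pointer to practice, the sketch below retrieves line-by-line data of the kind described above through the HITRAN Application Programming Interface mentioned in [6]; the molecule, isotopologue, and wavenumber range are arbitrary illustrative choices, and call signatures may differ between HAPI versions.

        # pip install hitran-api
        from hapi import db_begin, fetch, getColumn

        db_begin('hitran_data')          # local folder caching downloaded tables
        # Water vapour (molecule 1), principal isotopologue (1), 3400-4100 cm-1:
        fetch('H2O', 1, 1, 3400, 4100)
        nu = getColumn('H2O', 'nu')      # line-centre wavenumbers
        sw = getColumn('H2O', 'sw')      # line intensities
        print(len(nu), 'lines; first line at', nu[0], 'cm-1')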

  17. Medicinal Chemistry Insights into Novel HDAC Inhibitors: An Updated Patent Review (2012-2016).

    Science.gov (United States)

    Zhan, Peng; Wang, Xueshun; Liu, Xinyong; Suzuki, Takayoshi

    2017-01-01

    Many laboratories have made intensive efforts to develop potent, selective, and orally bioavailable HDAC inhibitors (HDACIs). Novel HDACIs are being developed with the objective of improving potency and selectivity against specific types of cancers or non-cancer diseases. This updated patent review is an attempt to compile the work of various researchers on HDACIs from 2012 to mid-2016, and to enlighten and surprise both newcomers in this field and devoted medicinal chemists; it draws on the published literature and the writers' own research experience in the discovery of HDAC inhibitors. Inhibitors possessing new chemical scaffolds have attracted immense interest because they have the ability to improve HDAC isoform specificity and pharmaceutical properties. Focus is given to emerging medicinal chemistry principles and insights into the discovery and development of HDAC inhibitors. The development of effective HDACIs is shifting from trial-and-error approaches to sophisticated strategies. Effective profiling technologies will continue to have important utility. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  18. Optimal updating magnitude in adaptive flat-distribution sampling.

    Science.gov (United States)

    Zhang, Cheng; Drake, Justin A; Ma, Jianpeng; Pettitt, B Montgomery

    2017-11-07

    We present a study on the optimization of the updating magnitude for a class of free energy methods based on flat-distribution sampling, including the Wang-Landau (WL) algorithm and metadynamics. These methods rely on adaptive construction of a bias potential that offsets the potential of mean force by histogram-based updates. The convergence of the bias potential can be improved by decreasing the updating magnitude with an optimal schedule. We show that while the asymptotically optimal schedule for the single-bin updating scheme (commonly used in the WL algorithm) is given by the known inverse-time formula, that for the Gaussian updating scheme (commonly used in metadynamics) is often more complex. We further show that the single-bin updating scheme is optimal for very long simulations, and it can be generalized to a class of bandpass updating schemes that are similarly optimal. These bandpass updating schemes target only a few long-range distribution modes and their optimal schedule is also given by the inverse-time formula. Constructed from orthogonal polynomials, the bandpass updating schemes generalize the WL and Langfeld-Lucini-Rago algorithms as an automatic parameter tuning scheme for umbrella sampling.
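
    To make the flat-distribution idea concrete, here is a minimal Wang-Landau-style sketch using the inverse-time (1/t) schedule discussed in the abstract; the ring-shaped toy state space and the schedule crossover rule are illustrative assumptions, not the paper's formulation.

        import math, random

        def wang_landau_1t(n_states=20, sweeps=100000):
            """Toy flat-distribution sampler over a discrete state space with
            a 1/t (inverse-time) schedule for the updating magnitude f."""
            lng = [0.0] * n_states          # running bias: log of visit weight
            f, s = 1.0, 0
            for t in range(1, sweeps + 1):
                s_new = (s + random.choice((-1, 1))) % n_states
                # Accept with probability min(1, w(s)/w(s_new)), which
                # drives the sampled distribution towards flatness
                if math.log(random.random()) < lng[s] - lng[s_new]:
                    s = s_new
                lng[s] += f                 # histogram-based bias update
                f = min(f, n_states / t)    # inverse-time schedule f(t) ~ N/t
            return lng

        bias = wang_landau_1t()
        # Small spread relative to the mean indicates near-flat sampling:
        print(f"spread {max(bias) - min(bias):.2f}, mean {sum(bias)/len(bias):.1f}")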

  19. Circular Updates

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Circular Updates are periodic sequentially numbered instructions to debriefing staff and observers informing them of changes or additions to scientific and specimen...

  20. Important update of CERN Mail Services

    CERN Multimedia

    IT Department

    2009-01-01

    The CERN Mail Services are evolving. In the course of June and July 2009, all CERN mailboxes will be updated with a new infrastructure for hosting mailboxes, running Exchange 2007. This update is taking place in order to provide the capacity upgrade for the constantly growing volume of CERN mailboxes. It is also the opportunity to provide a number of improvements to CERN mailboxes: new and improved Outlook Web Access (the web interface used to access your mailbox from a web browser, also known as "webmail"), new features in the Out-of-Office auto-reply assistant, easier spam management... The update will preserve the mailbox configuration and no specific action is required by users. During the next weeks, each mailbox will be individually notified of the upcoming update the day before it takes place. We invite all users to carefully read this notification as it will contain the latest information for this update. The mailbox will be unavailable for a short time during the ni...

  1. The Updating of Geospatial Base Data

    Science.gov (United States)

    Alrajhi, Muhamad N.; Konecny, Gottfried

    2018-04-01

    Topographic mapping issues concern area coverage at different scales and map age. The age of a map is determined by the system of updating. The United Nations (UNGGIM) has attempted to track global map coverage at various scale ranges, which has greatly improved in recent decades. However, the poor state of updating of base maps is still a global problem. In Saudi Arabia, large-scale mapping is carried out for all urban, suburban and rural areas by aerial surveys. Updating is carried out by remapping every 5 to 10 years. Due to the rapid urban development this is not satisfactory, but faster update methods are foreseen through the use of high-resolution satellite imagery and improved object-oriented geodatabase structures, which will permit various survey technologies to be used to update the photogrammetrically established geodatabases. The long-term goal is to create a geodata infrastructure such as exists in Great Britain or Germany.

  2. Statistical Compilation of the ICT Sector and Policy Analysis | Page 2 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... to widen and deepen, so too does its impact on economic development. ... The outcomes of such efforts will subsequently inform policy discourse and ... Studies. Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia ... Asian outlook: New growth dependent on new productivity.

  3. Regulatory and technical reports (abstract index journal). Volume 20, No. 2: Compilation for second quarter April--June 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually.

  4. Regulatory and technical reports (abstract index journal). Volume 20, No. 2: Compilation for second quarter April--June 1995

    International Nuclear Information System (INIS)

    1995-09-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  5. Updating optical pseudoinverse associative memories.

    Science.gov (United States)

    Telfer, B; Casasent, D

    1989-07-01

    Selected algorithms for adding to and deleting from optical pseudoinverse associative memories are presented and compared. New realizations of pseudoinverse updating methods using vector inner product matrix bordering and reduced-dimensionality Karhunen-Loeve approximations (which have been used for updating optical filters) are described in the context of associative memories. Greville's theorem is reviewed and compared with the Widrow-Hoff algorithm. Kohonen's gradient projection method is expressed in a different form suitable for optical implementation. The data matrix memory is also discussed for comparison purposes. Memory size, speed and ease of updating, and key vector requirements are the comparison criteria used.
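
    As a hedged illustration of the update problem these algorithms address, the NumPy sketch below builds a pseudoinverse associative memory and then stores an additional key-recollection pair with Widrow-Hoff (LMS) iterations; the learning rate and iteration count are illustrative assumptions, and Greville's theorem would instead give the exact rank-one update.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.standard_normal((8, 5))        # 5 key vectors (dimension 8)
        Y = rng.standard_normal((8, 5))        # 5 associated recollections

        M = Y @ np.linalg.pinv(X)              # pseudoinverse memory: M X ~= Y

        # Add a new pair (x_new, y_new) by Widrow-Hoff (LMS) iterations:
        x_new = rng.standard_normal(8)
        y_new = rng.standard_normal(8)
        eta = 0.1                              # assumed learning rate
        for _ in range(200):
            M += eta * np.outer(y_new - M @ x_new, x_new) / (x_new @ x_new)
        print(np.allclose(M @ x_new, y_new, atol=1e-3))   # new pair stored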

  6. Herbal hepatotoxicity: a tabular compilation of reported cases.

    Science.gov (United States)

    Teschke, Rolf; Wolff, Albrecht; Frenzel, Christian; Schulze, Johannes; Eickhoff, Axel

    2012-11-01

    Herbal hepatotoxicity is a field that has rapidly grown over the last few years along with increased use of herbal products worldwide. To summarize the various facets of this disease, we undertook a literature search for herbs, herbal drugs and herbal supplements with reported cases of herbal hepatotoxicity. A selective literature search was performed to identify published case reports, spontaneous case reports, case series and review articles regarding herbal hepatotoxicity. A total of 185 publications were identified and the results compiled. They show 60 different herbs, herbal drugs and herbal supplements with reported potential hepatotoxicity; additional information, including synonyms of individual herbs, botanical names and cross references, is provided. If known, details are presented for specific ingredients and chemicals in herbal products, and for references with authors that can be matched to each herbal product and to its effect on the liver. Based on stringent causality assessment methods and/or positive re-exposure tests, causality was highly probable or probable for Ayurvedic herbs, Chaparral, Chinese herbal mixture, Germander, Greater Celandine, green tea, a few Herbalife products, Jin Bu Huan, Kava, Ma Huang, Mistletoe, Senna, Syo Saiko To and Venencapsan(®). In many other publications, however, causality was not properly evaluated by a liver-specific and for hepatotoxicity-validated causality assessment method such as the scale of CIOMS (Council for International Organizations of Medical Sciences). This compilation presents details of herbal hepatotoxicity, assisting thereby clinical assessment of involved physicians in the future. © 2012 John Wiley & Sons A/S.

  7. Guide to Good Practice in using Open Source Compilers with the AGCC Lexical Analyzer

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Quality software always demands a compromise between users' needs and hardware resources. Being faster requires expensive devices such as powerful processors and virtually unlimited amounts of RAM, or reengineering of the code to adapt the software to the client's hardware architecture. This is the purpose of optimizing code: to get the utmost software performance from a program under given conditions. There are tools for designing and writing code, but the ultimate tool for optimization remains the modest compiler, an often-neglected software jewel that is the result of hundreds of working hours by the best specialists in the world. Even so, only two compilers fulfill the needs of professional developers: a proprietary solution from a giant of the IT industry, and the open-source GNU compiler, for which we developed the AGCC lexical analyzer that helps produce even more efficient software applications. It relies on the most popular hacks and tricks used by professionals and discovered by the author, who is proud to present them below.

  8. Computer and compiler effects on code results: status report

    International Nuclear Information System (INIS)

    1996-01-01

    Within the framework of the international effort on the assessment of computer codes, which are designed to describe the overall reactor coolant system (RCS) thermalhydraulic response, core damage progression, and fission product release and transport during severe accidents, there has been a continuous debate as to whether the code results are influenced by different code users or by different computers or compilers. The first aspect, the 'Code User Effect', has been investigated already. In this paper the other aspects will be discussed, and proposals are given on how to make large system codes insensitive to different computers and compilers. Hardware errors and memory problems are not considered in this report. The codes investigated herein are integrated code systems (e.g. ESTER, MELCOR) and thermalhydraulic system codes with extensions for severe accident simulation (e.g. SCDAP/RELAP, ICARE/CATHARE, ATHLET-CD), and codes to simulate fission product transport (e.g. TRAPMELT, SOPHAEROS). Since all of these codes are programmed in Fortran 77, the discussion herein is based on this programming language, although some remarks are made about Fortran 90. Some observations about different code results obtained by using different computers are reported, and possible reasons for this unexpected behaviour are listed. Then methods are discussed on how to avoid portability problems

  9. Composition and Nutrient Information of Non-Alcoholic Beverages in the Spanish Market: An Update

    Directory of Open Access Journals (Sweden)

    María Serrano Iglesias

    2016-10-01

    The aim of this study was to draw an updated map of the nutrition facts in the different categories of non-alcoholic beverages in the Spanish market based on the information declared on the labels of these products; we expect this first step to justify the need for the coordination and harmonization of food composition tables in Spain so that there will be an updated database available to produce realistic scientific nutrient intake estimates in accordance with the actual market scenario. Materials and Methods: The nutrition facts declared on the labels of non-alcoholic beverages by manufacturers in Spain were compiled and studied. Results: The database included 211 beverages classified in 7 groups with energy/carbohydrate content per 100 mL ranging from 0–55 kcal/0–13 g for soft drinks; 2–60 kcal/0–14.5 g for energy drinks; 24–31 kcal/5.8–7.5 g for sports drinks; 1–32 kcal/0–7.3 g for drinks containing mineral salts in their composition; 14–69 kcal/2.6–17 g for fruit juice, nectar, and grape musts; 43–78 kcal/6.1–14.4 g for vegetable drinks; and 33–88 kcal/3.6–14 g for dairy drinks. Conclusion: The current non-alcoholic beverage market is a dynamic, growing, and highly innovative one, allowing consumers to choose according to their preferences, needs, or level of physical activity at any moment of the day.

  10. Updated layout of the LINAC4 transfer line to the PS Booster (Green Field Option)

    CERN Document Server

    Bellodi, G; Lallement, J B; Lombardi, A M; CERN. Geneva. AB Department

    2008-01-01

    At the time of defining the site of Linac4 and its integration in the complex of existing infrastructure at CERN (together with the plans for a future Superconducting Proton Linac), a series of radiation protection issues emerged that have since prompted a revision of the Linac4 to PSB transfer line layout, as described in the document AB-Note-2007-037. For radiological safety reasons the distance between the planned SPL tunnel and the basement of building 513 had to be increased, and this led to the decision to lower the Linac4 machine by 2.5 m. A vertical ramp was consequently introduced in the transfer line to raise the beam to the level of LINAC2/PSB for connection to the existing transfer line. A series of error study runs has been carried out on the modified layout to estimate the losses induced by quadrupole alignment and field errors. The two worst cases of each error family have been used as case studies to test the efficiency of possible steering strategies in...

  11. A Compilation of Global Bio-Optical in Situ Data for Ocean-Colour Satellite Applications

    Science.gov (United States)

    Valente, Andre; Sathyendranath, Shubha; Brotus, Vanda; Groom, Steve; Grant, Michael; Taberner, Malcolm; Antoine, David; Arnone, Robert; Balch, William M.; Barker, Kathryn; hide

    2016-01-01

    A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GePCO), span between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data were from multi-project archives acquired via the open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making available the metadata, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).
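
    The homogenisation step described above (averaging observations that are close in time and space) can be sketched as a simple binning operation; the bin sizes and record layout are illustrative assumptions, not the OC-CCI matchup protocol.

        import numpy as np

        def merge_close_observations(times_h, lats, lons, values,
                                     dt_h=1.0, dx_deg=0.01):
            """Average observations falling in the same space-time bin --
            a crude stand-in for 'close in time and space'."""
            keys = np.stack([np.floor(times_h / dt_h),
                             np.floor(lats / dx_deg),
                             np.floor(lons / dx_deg)], axis=1)
            merged = {}
            for key, v in zip(map(tuple, keys), values):
                merged.setdefault(key, []).append(v)
            return [float(np.mean(v)) for v in merged.values()]

        print(merge_close_observations(
            np.array([0.2, 0.3, 5.0]), np.array([10.001, 10.002, 10.5]),
            np.array([-30.0, -29.999, -30.0]), np.array([0.5, 0.7, 1.0])))
        # -> [0.6, 1.0]: the first two observations are merged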

  12. Compiling Planning into Quantum Optimization Problems: A Comparative Study

    Science.gov (United States)

    2015-06-07

    ...to SAT, and then reduces higher-order terms to quadratic terms through a series of gadgets. Our mappings allow both positive and negative preconditions... to its being specific to this type of problem) and likely benefits from a homogeneous parameter setting (Venturelli et al. 2014), as it generates a...

  13. Natalizumab treatment for multiple sclerosis: updated recommendations for patient selection and monitoring

    DEFF Research Database (Denmark)

    Kappos, Ludwig; Bates, David; Edan, Gilles

    2011-01-01

    Natalizumab, a highly specific α4-integrin antagonist, is approved for treatment of patients with active relapsing-remitting multiple sclerosis (RRMS). It is generally recommended for individuals who have not responded to a currently available first-line disease-modifying therapy or who have very active disease. The expected benefits of natalizumab treatment have to be weighed against risks, especially the rare but serious adverse event of progressive multifocal leukoencephalopathy. In this Review, we revisit and update previous recommendations on natalizumab for treatment of patients with RRMS, based on additional long-term follow-up of clinical studies and post-marketing observations, including appropriate patient selection and management recommendations.

  14. Towards droplet size-aware biochemical application compilation for AM-EWOD biochips

    DEFF Research Database (Denmark)

    Pop, Paul; Alistar, Mirela

    2015-01-01

    a droplet size-aware compilation by proposing a routing algorithm that considers the droplet size. Our routing algorithm is developed for a novel digital microfluidic biochip architecture based on Active Matrix Electrowetting on Dielectric, which uses a thin film transistor array for the electrodes. We also...

  15. 27 CFR 478.24 - Compilation of State laws and published ordinances.

    Science.gov (United States)

    2010-04-01

    § 478.24 Compilation of State laws and published ordinances (27 CFR, Bureau of Alcohol, Tobacco Products, and Firearms). (a) The Director shall annually revise and furnish Federal firearms licensees with a... The Director annually revises the compilation and publishes it as "State Laws and Published Ordinances—Firearms"...

  16. Statistical Compilation of the ICT Sector and Policy Analysis | Page 4 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  17. Statistical Compilation of the ICT Sector and Policy Analysis | Page 3 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  18. Mount St. Helens Project. Cowlitz River Levee Systems, 2009 Level of Flood Protection Update Summary

    Science.gov (United States)

    2010-02-04

    ...of Flood Protection Update Summary, Draft December 2009: ...soil in the unsaturated region, so the equipotential lines above the phreatic surface are... For the Lexington levee, a 50 percent probability of failure is assumed when the water surface is at the top of the levee and a 100 percent chance of failure is assumed when the water surface is above the top of the levee. Additionally, for cases where the SWL is determined to be the same elevation as

  19. Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG96-03: Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont: VGS Open-File Report VG96-3A, 2 plates, scale...

  20. 7. Mentor update and support: what do mentors need from an update?

    Science.gov (United States)

    Phillips, Mari; Marshall, Joyce

    2015-04-01

    Mentorship is the 14th series of 'Midwifery basics' targeted at practising midwives. The aim of these articles is to provide information to raise awareness of the impact of the work of midwives on women's experience, and encourage midwives to seek further information through a series of activities relating to the topic. In this seventh article Mari Phillips and Joyce Marshall consider some of the key issues related to mentor updates and support, and what mentors need from their annual update.

  1. Email Updates

    Science.gov (United States)

    MedlinePlus Email Updates (https://medlineplus.gov/listserv.html): a subscription service for receiving MedlinePlus updates by email, through which users can also view their email history or unsubscribe, and prevent MedlinePlus emails from being marked as "spam" or "junk".

  2. Valence-Dependent Belief Updating: Computational Validation

    Directory of Open Access Journals (Sweden)

    Bojana Kuzmanovic

    2017-06-01

    People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates) with trials with bad news (worse-than-expected base rates). After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic) Bayesian model of belief updating better accounted for the data than the unbiased (i.e., rational) Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. Finally, in this specific experimental context, the approach based on
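
    A minimal sketch of the reinforcement-learning-style asymmetry described above, assuming a simple model in which good and bad news receive different learning rates; the rates and numbers are illustrative, not the study's fitted parameters.

        def update_belief(self_risk, base_rate, lr_good=0.6, lr_bad=0.3):
            """One-trial belief update toward the disclosed base rate.
            Good news (a base rate below the current self-risk estimate)
            is weighted more heavily -- the valence-dependent asymmetry."""
            error = base_rate - self_risk            # estimation error
            lr = lr_good if error < 0 else lr_bad    # lower risk = good news
            return self_risk + lr * error

        print(update_belief(40.0, 20.0))  # good news: 40 -> 28.0
        print(update_belief(40.0, 60.0))  # bad news:  40 -> 46.0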

  3. Stellar Abundances for Galactic Archaeology Database. IV. Compilation of stars in dwarf galaxies

    Science.gov (United States)

    Suda, Takuma; Hidaka, Jun; Aoki, Wako; Katsuta, Yutaka; Yamada, Shimako; Fujimoto, Masayuki Y.; Ohtani, Yukari; Masuyama, Miyu; Noda, Kazuhiro; Wada, Kentaro

    2017-10-01

    We have constructed a database of stars in Local Group galaxies using the extended version of the SAGA (Stellar Abundances for Galactic Archaeology) database that contains stars in 24 dwarf spheroidal galaxies and ultra-faint dwarfs. The new version of the database includes more than 4500 stars in the Milky Way, by removing the previous metallicity criterion of [Fe/H] ≤ -2.5, and more than 6000 stars in the Local Group galaxies. We examined the validity of using a combined data set for elemental abundances. We also checked the consistency between the derived distances to individual stars and those to galaxies as given in the literature. Using the updated database, the characteristics of stars in dwarf galaxies are discussed. Our statistical analyses of α-element abundances show that the change of the slope of the [α/Fe] relative to [Fe/H] (so-called "knee") occurs at [Fe/H] = -1.0 ± 0.1 for the Milky Way. The knee positions for selected galaxies are derived by applying the same method. The star formation history of individual galaxies is explored using the slope of the cumulative metallicity distribution function. Radial gradients along the four directions are inspected in six galaxies where we find no direction-dependence of metallicity gradients along the major and minor axes. The compilation of all the available data shows a lack of CEMP-s population in dwarf galaxies, while there may be some CEMP-no stars at [Fe/H] ≲ -3 even in the very small sample. The inspection of the relationship between Eu and Ba abundances confirms an anomalously Ba-rich population in Fornax, which indicates a pre-enrichment of interstellar gas with r-process elements. We do not find any evidence of anti-correlations in O-Na and Mg-Al abundances, which characterizes the abundance trends in the Galactic globular clusters.
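
    The "knee" estimate quoted above can be illustrated with a piecewise-linear (broken-line) fit; the mock data, the SciPy-based fit, and all parameter values below are illustrative assumptions, not the SAGA analysis.

        import numpy as np
        from scipy.optimize import curve_fit

        def broken_line(feh, knee, a, b1, b2):
            """Two slopes of [alpha/Fe] vs [Fe/H] joined at the 'knee'."""
            return np.where(feh < knee,
                            a + b1 * (feh - knee),
                            a + b2 * (feh - knee))

        # Illustrative mock data: near-flat plateau, then a decline, plus scatter
        rng = np.random.default_rng(1)
        feh = rng.uniform(-3, 0, 300)
        alpha = broken_line(feh, -1.0, 0.15, 0.02, -0.25) + rng.normal(0, 0.05, 300)

        popt, _ = curve_fit(broken_line, feh, alpha, p0=(-1.5, 0.2, 0.0, -0.3))
        print(f"fitted knee at [Fe/H] = {popt[0]:.2f}")   # ~ -1.0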

  4. DJ Prinsloo and BP Sathekge (compilers — revised edition).

    African Journals Online (AJOL)

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the ...

  5. A compilation of Sr and Nd isotope data on Mexico

    International Nuclear Information System (INIS)

    Verma, S.P.; Verma, M.P.

    1986-01-01

    A compilation is given of the available Sr and Nd isotope data on Mexican volcanic-plutonic terranes which cover about one-third of Mexico's territory. The available data are arranged according to a subdivision of the Mexican territory in terms of geological provinces. Furthermore, site and province averages and standard deviations are calculated and their petrogenetic implications are pointed out. (author)

  6. Technique to increase performance of C-program for control systems. Compiler technique for low-cost CPU; Seigyoyo C gengo program no kosokuka gijutsu. Tei cost CPU no tame no gengo compiler gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Y [Mazda Motor Corp., Hiroshima (Japan)

    1997-10-01

    The software of automotive control systems has become increasingly large and complex. High-level languages (primarily C) and their compilers have become more important as a means of reducing coding time. Most compilers represent real numbers in the floating-point format specified by IEEE Standard 754. Because of cost requirements, most microprocessors in the automotive industry have no hardware support for IEEE-standard arithmetic, resulting in slow execution speed and large code size. Alternative formats are proposed to increase execution speed and reduce code size. Experimental results for the alternative formats show the improvement in execution speed and code size. 4 refs., 3 figs., 2 tabs.
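
    The abstract does not spell out the proposed alternative formats, but the standard replacement for software-emulated IEEE 754 floats on FPU-less microcontrollers is fixed-point (Q-format) arithmetic, in which reals are stored in scaled integers and multiplication needs only an integer multiply and a shift. A minimal Q16.16 sketch follows, in Python for brevity (on a real controller this would be integer C); it is a generic example, not the paper's actual format.

        # Minimal Q16.16 fixed-point arithmetic: a generic alternative to
        # software-emulated IEEE 754 floats on CPUs without an FPU.
        Q = 16                      # fractional bits
        ONE = 1 << Q                # fixed-point representation of 1.0

        def to_fix(x):   return int(round(x * ONE))
        def to_float(f): return f / ONE

        def fix_mul(a, b):
            # Product carries 2*Q fractional bits; shift back down to Q.
            return (a * b) >> Q

        a, b = to_fix(3.25), to_fix(-1.5)
        print(to_float(a + b))          # 1.75: addition is plain integer add
        print(to_float(fix_mul(a, b)))  # -4.875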

  7. 49 CFR 360.5 - Updating user fees.

    Science.gov (United States)

    2010-10-01

    ... updating the cost components comprising the fee. Cost components shall be updated as follows: (1) Direct... determined by the cost study in Regulations Governing Fees For Service, 1 I.C.C. 2d 60 (1984), or subsequent... by total office costs for the office directly associated with user fee activity. Actual updating of...

  8. The MANAGE database: nutrient load and site characteristic updates and runoff concentration data.

    Science.gov (United States)

    Harmel, Daren; Qian, Song; Reckhow, Ken; Casebolt, Pamela

    2008-01-01

    The "Measured Annual Nutrient loads from AGricultural Environments" (MANAGE) database was developed to be a readily accessible, easily queried database of site characteristic and field-scale nutrient export data. The original version of MANAGE, which drew heavily from an early 1980s compilation of nutrient export data, created an electronic database with nutrient load data and corresponding site characteristics from 40 studies on agricultural (cultivated and pasture/range) land uses. In the current update, N and P load data from 15 additional studies of agricultural runoff were included along with N and P concentration data for all 55 studies. The database now contains 1677 watershed years of data for various agricultural land uses (703 for pasture/rangeland; 333 for corn; 291 for various crop rotations; 177 for wheat/oats; and 4-33 yr for barley, citrus, vegetables, sorghum, soybeans, cotton, fallow, and peanuts). Across all land uses, annual runoff loads averaged 14.2 kg ha(-1) for total N and 2.2 kg ha(-1) for total P. On average, these losses represented 10 to 25% of applied fertilizer N and 4 to 9% of applied fertilizer P. Although such statistics produce interesting generalities across a wide range of land use, management, and climatic conditions, regional crop-specific analyses should be conducted to guide regulatory and programmatic decisions. With this update, MANAGE contains data from a vast majority of published peer-reviewed N and P export studies on homogeneous agricultural land uses in the USA under natural rainfall-runoff conditions and thus provides necessary data for modeling and decision-making related to agricultural runoff. The current version can be downloaded at http://www.ars.usda.gov/spa/manage-nutrient.

  9. Second-generation speed limit map updating applications

    DEFF Research Database (Denmark)

    Tradisauskas, Nerius; Agerholm, Niels; Juhl, Jens

    2011-01-01

    Intelligent Speed Adaptation is an Intelligent Transport System developed to significantly improve road safety by helping car drivers maintain appropriate driving behaviour. The system works in connection with the speed limits on the road network. It is thus essential to keep the speed limit map used in the Intelligent Speed Adaptation scheme updated. The traditional method of updating speed limit maps on the basis of long time interval observations needed to be replaced by a more efficient speed limit updating tool. In a Danish Intelligent Speed Adaptation trial a web-based tool was therefore... for map updating should preferably be made on the basis of a commercial map provider, such as Google Maps, and that the real challenge is to oblige road authorities to carry out updates.

  10. Northern hemisphere mid-latitude geomagnetic anomaly revealed from Levantine Archaeomagnetic Compilation (LAC).

    Science.gov (United States)

    Shaar, R.; Tauxe, L.; Agnon, A.; Ben-Yosef, E.; Hassul, E.

    2015-12-01

    The rich archaeological heritage of Israel and nearby Levantine countries provides a unique opportunity for archaeomagnetic investigation at high resolution. Here we present a summary of our ongoing effort to reconstruct geomagnetic variations of the past several millennia in the Levant at decadal to millennial resolution. This effort in the Southern Levant, namely the "Levantine Archaeomagnetic Compilation" (LAC), presently consists of data from over 650 well-dated archaeological objects including pottery, slag, ovens, and furnaces. In this talk we review the methodological challenges in achieving a robust master secular variation curve with realistic error estimations from a large number of different datasets. We present the current status of the compilation, including the southern and western Levant LAC data (Israel, Cyprus, and Jordan) and other published north-eastern Levant data (Syria and southern Turkey), and outline the main findings emerging from these data. The main feature apparent from the new compilation is an extraordinary intensity high that developed over the Levant region during the first two millennia BCE. The climax of this event is a double-peak intensity maximum starting at ca. 1000 BCE and ending at ca. 735 BCE, accompanied by at least two geomagnetic spike events. Paleomagnetic directions from this period show anomalies of up to 20 degrees away from the average GAD field. This leads us to postulate that the intensity maximum is a manifestation of an intense mid-latitude local positive geomagnetic anomaly that persisted for over two centuries.

  11. Supplementary material on passive solar heating concepts. A compilation of published articles

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-05-01

    A compilation of published articles and reports dealing with passive solar energy concepts for heating and cooling buildings is presented. The following are included: fundamentals of passive systems, applications and technical analysis, graphic tools, and information sources. (MHR)

  12. Key articles and guidelines in the management of pulmonary arterial hypertension: 2011 update.

    Science.gov (United States)

    Johnson, Samuel G; Kayser, Steven R; Attridge, Rebecca L; Duvall, Laura; Kiser, Tyree H; Moote, Rebecca; Reed, Brent N; Rodgers, Jo E; Erstad, Brian

    2012-06-01

    Pharmacotherapeutic approaches for the management of pulmonary arterial hypertension (PAH) have expanded greatly in the last 10 years. Pulmonary arterial hypertension is a relatively rare disease and is associated with myriad disease processes. The older term for PAH, primary PAH, has been changed to represent these differences and to distinguish it from postcapillary PAH associated with left-sided heart failure. Limitations in evaluating treatment approaches for PAH include its rarity, the small number of patients included in clinical trials, and issues regarding the use of placebo-controlled trials in a disease with such a high mortality rate if left untreated. Management options include the use of prostacyclin and prostacyclin analogues, endothelin receptor antagonists, and phosphodiesterase inhibitors, as well as traditional background therapy with diuretics, digoxin, calcium channel blockers, and warfarin. Numerous drugs are under investigation to evaluate their possible roles in management. Combination therapy is increasingly becoming a standard approach to therapy, with mounting literature to document effectiveness. Current or emerging roles for the pharmacist in the management of PAH largely involves ensuring access to drug therapy, facilitating specialty pharmacy dispensing, and providing patient counseling. Newer roles may include future drug development, optimized use of investigational drugs, and specialized disease management programs. This compilation includes a series of articles identifying important literature in cardiovascular pharmacotherapy. This bibliography focuses on pharmacotherapeutic management of pulmonary arterial hypertension (PAH). Most of the cited works present the results of significant human clinical studies that have shaped the management of patients with PAH. Limited primary literature is available for some topics, so in addition, consensus documents prepared by expert panels are reviewed. This compilation may serve as a

  13. Low-frequency Carbon Radio Recombination Lines. I. Calculations of Departure Coefficients

    Energy Technology Data Exchange (ETDEWEB)

    Salgado, F.; Morabito, L. K.; Oonk, J. B. R.; Salas, P.; Toribio, M. C.; Röttgering, H. J. A.; Tielens, A. G. G. M. [Leiden Observatory, University of Leiden, P.O. Box 9513, 2300 RA Leiden (Netherlands)

    2017-03-10

    In the first paper of this series, we study the level population problem of recombining carbon ions. We focus our study on high quantum numbers, anticipating observations of carbon radio recombination lines to be carried out by the Low Frequency Array. We solve the level population equation including angular momentum levels with updated collision rates up to high principal quantum numbers. We derive departure coefficients by solving the level population equation in the hydrogenic approximation and including low-temperature dielectronic capture effects. Our results in the hydrogenic approximation agree well with those of previous works. When comparing our results including dielectronic capture, we find differences that we ascribe to updates in the atomic physics (e.g., collision rates) and to the approximate solution method of the statistical equilibrium equations adopted in previous studies. A comparison with observations is discussed in an accompanying article, as radiative transfer effects need to be considered.
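
    For reference, the departure coefficients being solved for are conventionally defined relative to the Saha-Boltzmann (LTE) level populations; in the hydrogenic approximation used above this reads (the standard textbook definition, not a formula quoted from the paper):

        b_n \equiv \frac{N_n}{N_n^{\mathrm{LTE}}},
        \qquad
        N_n^{\mathrm{LTE}} = N_e\, N_{\mathrm{C^+}}\, n^2
        \left( \frac{h^2}{2 \pi m_e k T_e} \right)^{3/2}
        \exp\!\left( \frac{\chi_n}{k T_e} \right),
        \qquad
        \chi_n \simeq \frac{13.6\ \mathrm{eV}}{n^2}

    Collisions drive b_n toward 1 at sufficiently high n, so the departure coefficients encode how far the observed levels sit from thermal equilibrium.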

  14. Interpretation, compilation and field verification procedures in the CARETS project

    Science.gov (United States)

    Alexander, Robert H.; De Forth, Peter W.; Fitzpatrick, Katherine A.; Lins, Harry F.; McGinty, Herbert K.

    1975-01-01

    The production of the CARETS map data base involved the development of a series of procedures for interpreting, compiling, and verifying data obtained from remote sensor sources. Level II land use mapping from high-altitude aircraft photography at a scale of 1:100,000 required production of a photomosaic mapping base for each of the forty-eight 50 x 50 km sheets, and the interpretation and coding of land use polygons on drafting film overlays. CARETS researchers also produced a series of 1970-to-1972 land use change overlays, using the 1970 land use maps and 1972 high-altitude aircraft photography. To enhance the value of the land use sheets, researchers compiled a series of overlays showing cultural features, county boundaries and census tracts, surface geology, and drainage basins. In producing Level I land use maps from Landsat imagery, at a scale of 1:250,000, interpreters overlaid drafting film directly on Landsat color composite transparencies and interpreted on the film. They found that such interpretation involves pattern and spectral signature recognition. In studies using Landsat imagery, interpreters identified numerous areas of change but also identified extensive areas of "false change," where Landsat spectral signatures had changed but land use had not.

  15. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Inagaki, Y.; Nakamura, T.; Ukai, K.

    1976-01-01

    A compilation of data from single-pion photoproduction experiments below 2 GeV is presented, together with keywords that specify each experiment. The data are written on a magnetic tape; the data format and the keyword indices are given, and various programs for using this tape are also presented. The compiled results are divided into two types: a reference card, which carries the information about the experiment, and data cards. The reference and data cards are written entirely in A-type format on the original tape. Copy tapes, written in various formats, are available on request. There are two kinds of copy tape: one identical to the original tape, and one differing in the data cards, which are written in F-type format according to the data type. Each experiment on such a tape is represented by three kinds of cards: one reference card in A-type format, many data cards in F-type format, and one identifying card. Various programs written in FORTRAN are available for the original and copy tapes. (Kato, T.)

  16. Basic circuit compilation techniques for an ion-trap quantum machine

    International Nuclear Information System (INIS)

    Maslov, Dmitri

    2017-01-01

    We study the problem of compilation of quantum algorithms into optimized physical-level circuits executable in a quantum information processing (QIP) experiment based on trapped atomic ions. We report a complete strategy: starting with an algorithm in the form of a quantum computer program, we compile it into a high-level logical circuit that goes through multiple stages of decomposition into progressively lower-level circuits until we reach the physical execution-level specification. We skip the fault-tolerance layer, as it is not within the scope of this work. The different stages are structured so as to best assist with the overall optimization while taking into account numerous optimization criteria, including minimizing the number of expensive two-qubit gates, minimizing the number of less expensive single-qubit gates, optimizing the runtime, minimizing the overall circuit error, and optimizing classical control sequences. Our approach allows a trade-off between circuit runtime and quantum error, as well as to accommodate future changes in the optimization criteria that may likely arise as a result of the anticipated improvements in the physical-level control of the experiment. (paper)
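
    As a toy illustration of one such lowering stage, the snippet below rewrites each CNOT into the ion-trap-native XX (Mølmer-Sørensen) interaction plus single-qubit rotations, using the standard identity (one XX gate and four rotations, up to global phase), and tallies the expensive two-qubit gates. The real pipeline described in the paper has many more stages and optimization passes; this sketch only shows the flavor of the rewrite.

        # Toy lowering pass: rewrite CNOTs into the native XX gate plus
        # single-qubit rotations, then count expensive two-qubit gates.
        from math import pi

        def lower_cnot(control, target):
            # Standard CNOT decomposition over {XX, Rx, Ry}, up to phase.
            return [
                ("Ry", control, pi / 2),
                ("XX", (control, target), pi / 4),   # the one expensive gate
                ("Rx", control, -pi / 2),
                ("Rx", target, -pi / 2),
                ("Ry", control, -pi / 2),
            ]

        def lower_circuit(circuit):
            native = []
            for gate in circuit:
                if gate[0] == "CNOT":
                    native.extend(lower_cnot(gate[1], gate[2]))
                else:
                    native.append(gate)
            return native

        logical = [("H", 0), ("CNOT", 0, 1), ("CNOT", 1, 2)]  # GHZ-style
        native = lower_circuit(logical)
        print(sum(g[0] == "XX" for g in native), "two-qubit XX gates")  # 2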

  17. Compilation of monographs on α-, β-, γ- and X-ray spectrometry

    International Nuclear Information System (INIS)

    Debertin, K.

    1977-11-01

    The working group 'α-, β-, γ-Ray Spectrometry' of the International Committee for Radionuclide Metrology (ICRM) compiled about 35 monographs on α-, β-, γ- and X-ray spectrometry published in the years 1970 to 1976. Support was provided by the Zentralstelle fuer Atomkernenergie-Dokumentation (ZAED) in Karlsruhe. (orig.) [de]

  18. Report of the Panel on Neutron Data Compilation. Brookhaven National Laboratory, USA, 10-14 February 1969

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1969-05-15

    After surveying current world needs for bibliographic and compilation activities in the field of neutron data, the report of this Panel of 31 individual technical experts, considers the immediate and future role of the world's neutron data centres in this task. In Chapter V the Panel's findings are summarized in the form of recommendations directed to the centres and their associated national and international advisory committees together with all users of the centres. The Panel's recommendations can be summarised as follows: a) The need for bibliographic indexing and numerical compilation of neutron data on an international basis has been clearly demonstrated and should continue for the foreseeable future; b) The operation of CINDA has been extremely satisfactory; c) Neutron data should be compiled at all energies by all centres subject to any mutually agreed exceptions and priorities; d) A fine-meshed classification scheme for neutron reactions should be formulated and put into use before the end of 1969 in accordance with the timetable; e) A scheme for associating a detailed statement of the main characteristics of each experiment with compilations of the resulting data should be formulated and put into preliminary operation before the end of 1969; f) The immediate primary tasks of the principal data centres are to complete the compilation of existing numerical data, whilst keeping abreast of new data, and to agree and implement an improved compilation, storage and retrieval system; g) Input of experimental data can be facilitated by specific measures; h) Centres should publish review publications which they believe will serve the user community; i) The centres should provide data to users in a variety of media: printed listings, graphs, paper tape, punched cards and magnetic tape - but should encourage standardization within each medium so as to free effort to meet special requirements of users having limited computer facilities; j) Centres should hold and

  20. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

    An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00, but will not exceed 4 consecutive hours (see tentative planning below). In addition, the voice messaging system will be shut down on 26th March from 18:00 to 00:00. Calls supposed to be routed to the voice messaging system will not be possible during the shutdown. CERN divisions are invited to avoid any change requests (set-ups, moves or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch.

        Date        Change type                            Affected areas
        March 26    Update of the voice messaging system   All CERN sites
        April 4     Updat...