WorldWideScience

Sample records for technology compilation volume

  1. Integrating Parallelizing Compilation Technologies for SMP Clusters

    Institute of Scientific and Technical Information of China (English)

    Xiao-Bing Feng; Li Chen; Yi-Ran Wang; Xiao-Mi An; Lin Ma; Chun-Lei Sang; Zhao-Qing Zhang

    2005-01-01

    In this paper, a source-to-source parallelizing compiler system, AutoPar, is presented. The system transforms FORTRAN programs into multi-level hybrid MPI/OpenMP parallel programs. Integrated parallel optimizing technologies are used extensively to derive an effective program decomposition over the whole program scope. Other features, such as synchronization optimization and communication optimization, improve the performance scalability of the generated parallel programs at both the intra-node and inter-node levels. The system goes to great lengths to automate parallelization. Profiling feedback is used in performance estimation, which is the basis of automatic program decomposition. Performance results for eight benchmarks of NPB 1.0 from NAS on an SMP cluster are given, and the speedup is desirable. Notably, in the experiments at most one data distribution directive and one reduction directive are inserted by the user in BT/SP/LU. The compiler is based on ORC, the Open Research Compiler, a powerful compiler infrastructure with features such as robustness, flexibility, and efficiency. The strong analysis capability and well-defined infrastructure of ORC made the system implementation quite fast.
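
    For orientation, the sketch below is a hand-written illustration (not AutoPar output) of the kind of multi-level hybrid MPI/OpenMP code such a system generates: MPI distributes blocks of the iteration space across cluster nodes, OpenMP threads work on the block within each node, and a reduction combines the partial results. All names and values are illustrative assumptions.

    // Hypothetical sketch in C++ using the MPI C API and OpenMP (compile with e.g. mpicxx -fopenmp).
    #include <mpi.h>
    #include <omp.h>
    #include <algorithm>
    #include <cstdio>

    int main(int argc, char** argv) {
      MPI_Init(&argc, &argv);
      int rank = 0, nprocs = 1;
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

      const int n = 1000000;                        // global iteration count
      const int chunk = (n + nprocs - 1) / nprocs;  // block decomposition across ranks
      const int lo = rank * chunk;
      const int hi = std::min(n, lo + chunk);

      double local = 0.0;
      // Intra-node parallelism: OpenMP threads share this rank's block.
      #pragma omp parallel for reduction(+:local)
      for (int i = lo; i < hi; ++i)
        local += 1.0 / (1.0 + i);

      // Inter-node parallelism: combine partial sums across MPI ranks.
      double global = 0.0;
      MPI_Reduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
      if (rank == 0) std::printf("sum = %.6f\n", global);
      MPI_Finalize();
      return 0;
    }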

  2. Piping and tubing technology: A compilation

    Science.gov (United States)

    1971-01-01

    A compilation on the devices, techniques, and methods used in piping and tubing technology is presented. Data cover the following: (1) a number of fittings, couplings, and connectors that are useful in joining tubing and piping and various systems, (2) a family of devices used where flexibility and/or vibration damping are necessary, (3) a number of devices found useful in the regulation and control of fluid flow, and (4) shop hints to aid in maintenance and repair procedures such as cleaning, flaring, and swaging of tubes.

  3. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    Full Text Available There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable, architecture-independent software for scientific computation based on our experience with the equational programming language (EPL). Our approach is based on program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  4. Compilation of Existing Neutron Screen Technology

    Directory of Open Access Journals (Sweden)

    N. Chrysanthopoulou

    2014-01-01

    Full Text Available The presence of fast neutron spectra in new reactors is expected to induce a strong impact on the contained materials, including structural materials, nuclear fuels, neutron reflecting materials, and tritium breeding materials. Therefore, introduction of these reactors into operation will require extensive testing of their components, which must be performed under neutronic conditions representative of those expected to prevail inside the reactor cores when in operation. Due to limited availability of fast reactors, testing of future reactor materials will mostly take place in water cooled material test reactors (MTRs by tailoring the neutron spectrum via neutron screens. The latter rely on the utilization of materials capable of absorbing neutrons at specific energy. A large but fragmented experience is available on that topic. In this work a comprehensive compilation of the existing neutron screen technology is attempted, focusing on neutron screens developed in order to locally enhance the fast over thermal neutron flux ratio in a reactor core.

  5. Ground Operations Aerospace Language (GOAL). Volume 2: Compiler

    Science.gov (United States)

    1973-01-01

    The principal elements and functions of the Ground Operations Aerospace Language (GOAL) compiler are presented. The technique used to transcribe the syntax diagrams into machine processable format for use by the parsing routines is described. An explanation of the parsing technique used to process GOAL source statements is included. The compiler diagnostics and the output reports generated during a GOAL compilation are explained. A description of the GOAL program package is provided.

  6. Compiler writing system detail design specification. Volume 2: Component specification

    Science.gov (United States)

    Arthur, W. J.

    1974-01-01

    The logic modules and data structures composing the Meta-translator module are described. This module is responsible for the actual generation of the executable language compiler as a function of the input Meta-language. Machine definitions are also processed and are placed as encoded data on the compiler library data file. The transformation of intermediate language into target-language object text is described.

  7. Solid state technology: A compilation. [on semiconductor devices

    Science.gov (United States)

    1973-01-01

    A compilation, covering selected solid state devices developed and integrated into systems by NASA to improve performance, is presented. Data are also given on device shielding in hostile radiation environments.

  8. Java Compiler Technology and Java Performance

    Institute of Scientific and Technical Information of China (English)

    冀振燕; 程虎

    2000-01-01

    This paper summarizes Java compiler technology and sorts Java compilers into five categories: compilers with interpreter technology, compilers with just-in-time (JIT) compiler technology, compilers with adaptive optimization technology, native compilers, and translators. Their architectures and working principles are described and analyzed in detail. The authors also analyze the effect that compiler technology has on Java performance.
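
    As a rough illustration of the first category only (an assumed, simplified example, not taken from the paper), the sketch below shows the per-instruction dispatch loop that characterizes an interpreter; a JIT or native compiler removes this loop by translating bytecode into machine code before it runs.

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    enum Op : uint8_t { PUSH, ADD, MUL, PRINT, HALT };

    // Per-instruction dispatch loop of a tiny stack-machine interpreter.
    void interpret(const std::vector<uint8_t>& code) {
      std::vector<int64_t> stack;
      size_t pc = 0;
      for (;;) {
        switch (code[pc++]) {
          case PUSH: stack.push_back(code[pc++]); break;
          case ADD:  { int64_t b = stack.back(); stack.pop_back(); stack.back() += b; break; }
          case MUL:  { int64_t b = stack.back(); stack.pop_back(); stack.back() *= b; break; }
          case PRINT: std::printf("%lld\n", (long long)stack.back()); break;
          case HALT: return;
        }
      }
    }

    int main() {
      interpret({PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, HALT});  // computes (2 + 3) * 4
    }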

  9. JiTTree: A Just-in-Time Compiled Sparse GPU Volume Data Structure

    KAUST Repository

    Labschutz, Matthias

    2015-08-12

    Sparse volume data structures enable the efficient representation of large but sparse volumes in GPU memory for computation and visualization. However, the choice of a specific data structure for a given data set depends on several factors, such as the memory budget, the sparsity of the data, and the data access patterns. In general, there is no single optimal sparse data structure, but a set of candidates with individual strengths and drawbacks. One solution to this problem is hybrid data structures, which locally adapt themselves to the sparsity. However, they typically suffer from increased traversal overhead, which limits their utility in many applications. This paper presents JiTTree, a novel sparse hybrid volume data structure that uses just-in-time compilation to overcome these problems. By combining multiple sparse data structures and reducing traversal overhead, we leverage their individual advantages. We demonstrate that hybrid data structures adapt well to a large range of data sets and are especially superior to other sparse data structures for data sets whose sparsity varies locally. Possible optimization criteria are memory, performance, and a combination thereof. Through just-in-time (JIT) compilation, JiTTree reduces the traversal overhead of the resulting optimal data structure. As a result, our hybrid volume data structure enables efficient computations on the GPU while being superior in terms of memory usage when compared to non-hybrid data structures.
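
    For orientation, the sketch below shows one simple candidate sparse layout of the kind such hybrids choose among: a coarse block grid in which empty blocks store nothing and occupied blocks point to dense voxel bricks. It is an illustrative assumption, not the JiTTree structure itself, whose traversal code is instead generated and JIT-compiled per data set.

    #include <cstdint>
    #include <vector>

    struct SparseVolume {
      static constexpr int B = 8;          // brick edge length, in voxels
      int nbx = 0, nby = 0, nbz = 0;       // volume size, in bricks
      std::vector<int32_t> brickIndex;     // one entry per block; -1 marks an empty block
      std::vector<float>   bricks;         // dense B*B*B voxel bricks, packed back to back

      float sample(int x, int y, int z) const {
        const int bx = x / B, by = y / B, bz = z / B;
        const int32_t bi = brickIndex[(bz * nby + by) * nbx + bx];
        if (bi < 0) return 0.0f;           // empty region: implicit background value
        const int lx = x % B, ly = y % B, lz = z % B;
        return bricks[(size_t)bi * B * B * B + (lz * B + ly) * B + lx];
      }
    };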

  10. Physical sciences: Thermodynamics, cryogenics, and vacuum technology: A compilation

    Science.gov (United States)

    1974-01-01

    Technological developments which have potential application outside the aerospace community are reported. A variety of thermodynamic devices, including heat pipes and cooling systems, are described along with methods of handling cryogenic fluids. Vacuum devices are also described. Data and information are included.

  11. Modern Chemical Technology, Volume 5.

    Science.gov (United States)

    Pecsok, Robert L., Ed.; Chapman, Kenneth, Ed.

    This volume contains chapters 26-31 for the American Chemical Society (ACS) "Modern Chemical Technology" (ChemTeC) instructional material intended to prepare chemical technologists. Chapter 26 reviews oxidation and reduction, including applications in titrations with potassium permanganate and iodometry. Coordination compounds are…

  12. Regulatory and technical reports (abstract index journal): Annual compilation for 1994. Volume 19, Number 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order. These precede the following indexes: secondary report number index, personal author index, subject index, NRC originating organization index (staff reports), NRC originating organization index (international agreements), NRC contract sponsor index (contractor reports), contractor index, international organization index, and licensed facility index. A detailed explanation of the entries precedes each index.

  13. Regulatory and technical reports (abstract index journal): Annual compilation for 1996, Volume 21, No. 4

    Energy Technology Data Exchange (ETDEWEB)

    Sheehan, M.A.

    1997-04-01

    This compilation is the annual cumulation of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors.

  14. Gulf Coast geopressured-geothermal program summary report compilation. Volume 4: Bibliography (annotated only for all major reports)

    Energy Technology Data Exchange (ETDEWEB)

    John, C.J.; Maciasz, G.; Harder, B.J.

    1998-06-01

    This bibliography contains US Department of Energy sponsored Geopressured-Geothermal reports published after 1984. Reports published prior to 1984 are documented in the Geopressured Geothermal bibliography Volumes 1, 2, and 3 that the Center for Energy Studies at the University of Texas at Austin compiled in May 1985. It represents reports, papers and articles covering topics from the scientific and technical aspects of geopressured geothermal reservoirs to the social, environmental, and legal considerations of exploiting those reservoirs for their energy resources.

  15. Regulatory and technical reports (abstract index journal): Annual compilation for 1997. Volume 22, Number 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-04-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors; proceedings of conferences and workshops; as well as international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility.

  16. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1993 annual. Volume 15

    Energy Technology Data Exchange (ETDEWEB)

    1994-04-01

    This compilation contains 47 ACRS reports submitted to the Commission, Executive Director for Operations, or to the Office of Nuclear Regulatory Research, during calendar year 1993. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are categorized by the most appropriate generic subject area and by chronological order within subject area.

  17. A compilation of reports of the Advisory Committee on Reactor Safeguards. 1994 annual. Volume 16

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    This compilation contains 30 ACRS reports submitted to the Commission, or to the Executive Director for Operations, during calendar year 1994. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the U.S. Library of Congress. The reports are categorized by the most appropriate generic subject area and by chronological order within subject area.

  18. A compilation of reports of the Advisory Committee on Reactor Safeguards: 1995 annual. Volume 17

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

    This compilation contains 44 ACRS reports submitted to the Commission, or to the Executive Director for Operations, during calendar year 1995. It also includes a report to the Congress on the NRC Safety Research Program. All reports have been made available to the public through the NRC Public Document Room and the US Library of Congress. The reports are divided into two groups: Part 1: ACRS Reports on Project Reviews, and Part 2: ACRS Reports on Generic Subjects. Part 1 contains ACRS reports by project name and by chronological order within project name. Part 2 categorizes the reports by the most appropriate generic subject area and by chronological order within subject area.

  19. Aerospace Technology Innovation. Volume 9

    Science.gov (United States)

    Turner, Janelle (Editor); Cousins, Liz (Editor)

    2001-01-01

    Commercializing technology is a daunting task. Of every 11 new product ideas, only one will successfully make it to the marketplace. Fully 46% of new product investment ends up as sunk cost. Yet a few good companies consistently attain an 80% technology commercialization success rate and have led the way in establishing best practices. The NASA Incubator program consists of nine incubators, each residing near a NASA research center. The purpose of the incubators is to use the best practices of technology commercialization to help early-stage businesses successfully launch new products that incorporate NASA technology.

  20. Modern Chemical Technology, Volume 9.

    Science.gov (United States)

    Pecsok, Robert L.; Chapman, Kenneth

    This volume is one of the series for the Chemical Technician Curriculum Project (ChemTeC) of the American Chemical Society funded by the National Science Foundation. It consists of discussions, exercises, and experiments on the following topics: ion exchange, electrophoresis, dialysis, electrochemistry, corrosion, electrolytic cells, coulometry,…

  1. Aerospace Technology Innovation. Volume 10

    Science.gov (United States)

    Turner, Janelle (Editor); Cousins, Liz (Editor); Bennett, Evonne (Editor); Vendette, Joel (Editor); West, Kenyon (Editor)

    2002-01-01

    Whether finding new applications for existing NASA technologies or developing unique marketing strategies to demonstrate them, NASA's offices are committed to identifying unique partnering opportunities. Through their efforts, NASA leverages resources through joint research and development and gains new insight into the core areas relevant to all NASA field centers. One of the most satisfying aspects of my job comes when I learn of a mission-driven technology that can be spun off to touch the lives of everyday people. NASA's New Partnerships in Medical Diagnostic Imaging is one such initiative. Not only does it promise to provide greater dividends for the country's investment in aerospace research, but also to enhance the American quality of life. This issue of Innovation highlights the new NASA-sponsored initiative in medical imaging. Early in 2001, NASA announced the launch of the New Partnerships in Medical Diagnostic Imaging initiative to promote the partnership and commercialization of NASA technologies in the medical imaging industry. NASA and the medical imaging industry share a number of crosscutting technologies in areas such as high-performance detectors and image-processing tools. Many of the opportunities for joint development and technology transfer to the medical imaging market also hold the promise of future spin back to NASA.

  2. Software Systems 2--Compiler and Operating Systems Lab--Advanced, Data Processing Technology: 8025.33.

    Science.gov (United States)

    Dade County Public Schools, Miami, FL.

    The course outline has been prepared as a guide to help the student develop the skills and knowledge necessary to succeed in the field of data processing. By learning the purpose and principles of compiler programs and operating systems, the student will become familiar with advanced data processing procedures that are representative of computer…

  3. Coal slurry combustion and technology. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    1983-01-01

    Volume II contains papers presented at the following sessions of the Coal Slurry Combustion and Technology Symposium: (1) bench-scale testing; (2) pilot testing; (3) combustion; and (4) rheology and characterization. Thirty-three papers have been processed for inclusion in the Energy Data Base. (ATT)

  4. Regulatory and technical reports (abstract index journal). Volume 20, No. 2: Compilation for second quarter April--June 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually.

  5. Computer vision technology in log volume inspection

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Log volume inspection is very important in forestry research and paper-making engineering. This paper proposed a novel approach based on computer vision technology to cope with log volume inspection. The needed hardware system was analyzed and the details of the inspection algorithms were given. A fuzzy-entropy-based image enhancement algorithm was presented for enhancing the image of the log cross-section. In many practical applications the cross-section is partially invisible, and this is the major obstacle to correct inspection. To solve this problem, a robust Hausdorff distance method was proposed to recover the whole cross-section. Experimental results showed that this method was efficient.
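
    For context, the classical directed and symmetric Hausdorff distances between two point sets are sketched below; robust variants of the kind referred to in the abstract typically replace the outer maximum with a rank-order statistic so that the occluded part of the cross-section does not dominate the match. This is an illustrative sketch under those assumptions, not the paper's implementation.

    #include <algorithm>
    #include <cmath>
    #include <limits>
    #include <vector>

    struct Pt { double x, y; };

    // Directed Hausdorff distance h(A, B): how far the worst-matched point of A
    // is from the set B.
    double directedHausdorff(const std::vector<Pt>& A, const std::vector<Pt>& B) {
      double h = 0.0;
      for (const Pt& a : A) {
        double dmin = std::numeric_limits<double>::max();
        for (const Pt& b : B)
          dmin = std::min(dmin, std::hypot(a.x - b.x, a.y - b.y));
        h = std::max(h, dmin);
      }
      return h;
    }

    // Symmetric Hausdorff distance: the larger of the two directed distances.
    double hausdorff(const std::vector<Pt>& A, const std::vector<Pt>& B) {
      return std::max(directedHausdorff(A, B), directedHausdorff(B, A));
    }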

  6. The Concert system - Compiler and runtime technology for efficient concurrent object-oriented programming

    Science.gov (United States)

    Chien, Andrew A.; Karamcheti, Vijay; Plevyak, John; Sahrawat, Deepak

    1993-01-01

    Concurrent object-oriented languages, particularly fine-grained approaches, reduce the difficulty of large scale concurrent programming by providing modularity through encapsulation while exposing large degrees of concurrency. Despite these programmability advantages, such languages have historically suffered from poor efficiency. This paper describes the Concert project whose goal is to develop portable, efficient implementations of fine-grained concurrent object-oriented languages. Our approach incorporates aggressive program analysis and program transformation with careful information management at every stage from the compiler to the runtime system. The paper discusses the basic elements of the Concert approach along with a description of the potential payoffs. Initial performance results and specific plans for system development are also detailed.

  8. The science, technology and research network (STARNET) a searchable thematic compilation of web resources

    Science.gov (United States)

    Blados, W.R.; Cotter, G.A.; Hermann, T.

    2007-01-01

    International alliances in space efforts have resulted in a more rapid diffusion of space technology. This, in turn, increases pressure on organizations to push forward with technological developments and to take steps to maximize their inclusion into the research and development (R&D) process and the overall advancement and enhancement of space technology. To cope with the vast and rapidly growing amount of data and information that is vital to the success of innovation, the Information Management Committee (IMC) of the Research and Technology Agency (RTA) developed the science, technology and research network (STARNET). The purpose of this network is to facilitate access to worldwide information elements in science, technology and overall research. It provides a virtual library with special emphasis on international security; a "one stop" information resource for policy makers, program managers, scientists, engineers, researchers and others. © 2007 IEEE.

  9. Technology Transfer: A Compilation of Varied Approaches to the Management of Innovation.

    Science.gov (United States)

    1982-12-01

    Contents include "Intergovernmental Cooperation in Science and Technology" (J. E. Clark) and "Department of Defense Technology Transfer Consortium: An Overview" (George F. Linsteadt). From the latter abstract: The federal R&D laboratories represent a large ... agencies who have compatible requirements. The Department of Defense Technology Transfer Consortium, as a subset of the Federal Laboratory Consortium for ...

  10. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1994, July--September. Volume 19, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    None

    1994-12-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: Secondary Report Number Index, Personal Author Index, Subject Index, NRC Originating Organization Index (Staff Reports), NRC Originating Organization Index (International Agreements), NRC Contract Sponsor Index (Contractor Reports), Contractor Index, International Organization Index, and Licensed Facility Index. A detailed explanation of the entries precedes each index.

  11. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1996 July--September. Volume 21, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-02-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: secondary report number index; personal author index; subject index; NRC originating organization index (staff reports); NRC originating organization index (international agreements); NRC contract sponsor index (contractor reports); contractor index; international organization index; and licensed facility index. A detailed explanation of the entries precedes each index.

  12. FY02 Engineering Technology Reports Volume 1: Technology Base

    Energy Technology Data Exchange (ETDEWEB)

    Minichino, C; Meeker, D

    2003-01-28

    Engineering has touched on every challenge, every accomplishment, and every endeavor of Lawrence Livermore National Laboratory during its fifty-year history. In this time of transition to new leadership, Engineering continues to be central to the mission of the Laboratory, returning to the tradition and core values of E. O. Lawrence: science-based engineering--turning scientific concepts into reality. This volume of Engineering Technical Reports summarizes progress on the projects funded for technology-base efforts. Technology-base projects effect the natural transition to reduction-to-practice of scientific or engineering methods that are well understood and established. They represent discipline-oriented, core competency activities that are multi-programmatic in application, nature, and scope. Objectives of technology-base funding include: (1) the development and enhancement of tools and processes to provide Engineering support capability, such as code maintenance and improved fabrication methods; (2) the support of Engineering science and technology infrastructure, such as the installation or integration of a new capability; (3) support for technical and administrative leadership through our technology Centers; and (4) the initial scoping and exploration of selected technology areas with high strategic potential, such as assessment of university, laboratory, and industrial partnerships. Five Centers focus and guide longer-term investments within Engineering. The Centers attract and retain top staff, develop and maintain critical core technologies, and enable programs. Through their technology-base projects, they oversee the application of known engineering approaches and techniques to scientific and technical problems.

  13. Regulatory and technical reports (abstract index journal): Compilation for second quarter 1997 April--June. Volume 22, Number 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors; proceedings of conferences and workshops; as well as international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility.

  14. Regulatory and technical reports: Abstract index journal. Volume 20, No. 3, Compilation for third quarter 1995, July--September

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-01-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors; proceedings of conferences and workshops; as well as international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility.

  15. Regulatory and technical reports (abstract index journal): Compilation for first quarter 1996, January--March. Volume 21, Number 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-06-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors, proceedings of conferences and workshops, grants, and international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility.

  16. Regulatory and technical reports (abstract index journal), Compilation for third quarter 1993, July--September. Volume 18, No. 3

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors, proceedings of conferences and workshops, grants, and international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility.

  17. Technology-Transformed Dictionary Compilation: Drudgery into Desired Desktop Lexicographer Enchantment.

    Science.gov (United States)

    Reissman, Rose

    1998-01-01

    Describes how grade 3-8 inner-city students created multimedia, multicultural dictionaries. Highlights student reflections on the project using Kid Pix software, and their ideas for future uses for the dictionaries. Argues that technology-driven lexicography can serve as a catalyst for engaging students. (PEN)

  18. Compiling Dictionaries

    African Journals Online (AJOL)

    Information Technology

    quiring efficient techniques. The text corpus .... make the process of compiling a dictionary simpler and more efficient. If we are ever ... need a mass production technique. ..... Mapping semantic relationships in the lexicon using lexical functions.

  19. Electrical hand tools and techniques: A compilation. [utilization of space technology for tools and adapters

    Science.gov (United States)

    1974-01-01

    Space technology utilization for developing tools, adapters, and fixtures and procedures for assembling, installing, and servicing electrical components and equipment are discussed. Some of the items considered are: (1) pivotal screwdriver, (2) termination locator tool for shielded cables, (3) solder application tools, (4) insulation and shield removing tool, and (5) torque wrench adapter for cable connector engaging ring. Diagrams of the various tools and devices are provided.

  20. Engineering Technology Reports, Volume 2: Technology Base FY01

    Energy Technology Data Exchange (ETDEWEB)

    Minichino, C; Meeker, D

    2002-07-01

    Engineering has touched on every challenge, every accomplishment, and every endeavor of Lawrence Livermore National Laboratory during its fifty-year history. In this time of transition to new leadership, Engineering continues to be central to the mission of the Laboratory, returning to the tradition and core values of E.O. Lawrence: science-based engineering--turning scientific concepts into reality. This volume of Engineering Technical Reports summarizes progress on the projects funded for technology-base efforts. Technology-base projects effect the natural transition to reduction-to-practice of scientific or engineering methods that are well understood and established. They represent discipline-oriented, core competency activities that are multi-programmatic in application, nature, and scope. Objectives of technology-base funding include: (1) the development and enhancement of tools and processes to provide Engineering support capability, such as code maintenance and improved fabrication methods; (2) the support of Engineering science and technology infrastructure, such as the installation or integration of a new capability; (3) support for technical and administrative leadership through our technology Centers; (4) the initial scoping and exploration of selected technology areas with high strategic potential, such as assessment of university, laboratory, and industrial partnerships.

  1. New compilers speed up applications for Intel-based systems; Intel Compilers pave the way for Intel's Hyper-threading technology

    CERN Multimedia

    2002-01-01

    "Intel Corporation today introduced updated tools to help software developers optimize applications for Intel's expanding family of architectures with key innovations such as Intel's Hyper Threading Technology (1 page).

  2. Compilation and evaluation of 14-MeV neutron-activation cross sections for nuclear technology applications. Set I

    Energy Technology Data Exchange (ETDEWEB)

    Evain, B.P.; Smith, D.L.; Lucchese, P.

    1985-04-01

    Available 14-MeV experimental neutron activation cross sections are compiled and evaluated for the following reactions of interest for nuclear-energy technology applications: ²⁷Al(n,p)²⁷Mg, Si(n,X)²⁸Al, Ti(n,X)⁴⁶Sc, Ti(n,X)⁴⁷Sc, Ti(n,X)⁴⁸Sc, ⁵¹V(n,p)⁵¹Ti, ⁵¹V(n,α)⁴⁸Sc, Cr(n,X)⁵²V, ⁵⁵Mn(n,α)⁵²V, ⁵⁵Mn(n,2n)⁵⁴Mn, Fe(n,X)⁵⁴Mn, ⁵⁴Fe(n,α)⁵¹Cr, ⁵⁹Co(n,p)⁵⁹Fe, ⁵⁹Co(n,α)⁵⁶Mn, ⁵⁹Co(n,2n)⁵⁸Co, ⁶⁵Cu(n,p)⁶⁵Ni, Zn(n,X)⁶⁴Cu, ⁶⁴Zn(n,2n)⁶³Zn, ¹¹³In(n,n')¹¹³ᵐIn, ¹¹⁵In(n,n')¹¹⁵ᵐIn. The compiled values are listed and plotted for reference without adjustments. From these collected results, those values for which adequate supplementary information on nuclear constants, standards and experimental errors is provided are selected for use in reaction-by-reaction evaluations. These data are adjusted as needed to account for recent revisions in the nuclear constants and cross section standards. The adjusted results are subsequently transformed to equivalent cross sections at 14.7 MeV for the evaluation process. The evaluations are performed utilizing a least-squares method which considers correlations between the experimental data. 440 refs., 41 figs., 46 tabs.
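
    For reference, a least-squares adjustment that accounts for correlations between measurements can be written in the standard generalized least-squares form below. This is a generic statement of the method, not necessarily the exact formulation used in the report.

    % Generalized least-squares estimate of the parameter vector \hat{x} from
    % measurements y with covariance matrix V (off-diagonal terms carry the
    % correlations) and sensitivity/design matrix G:
    \hat{x} = \left( G^{\mathsf{T}} V^{-1} G \right)^{-1} G^{\mathsf{T}} V^{-1} y,
    \qquad
    \operatorname{cov}(\hat{x}) = \left( G^{\mathsf{T}} V^{-1} G \right)^{-1}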

  3. Compilation and evaluation of gas phase diffusion coefficients of reactive trace gases in the atmosphere: volume 1. Inorganic compounds

    Science.gov (United States)

    Tang, M. J.; Cox, R. A.; Kalberer, M.

    2014-09-01

    Diffusion of gas molecules to the surface is the first step for all gas-surface reactions. Gas phase diffusion can influence and sometimes even limit the overall rates of these reactions; however, there is no database of the gas phase diffusion coefficients of atmospheric reactive trace gases. Here we compile and evaluate, for the first time, the diffusivities (pressure-independent diffusion coefficients) of atmospheric inorganic reactive trace gases reported in the literature. The measured diffusivities are then compared with estimated values using a semi-empirical method developed by Fuller et al. (1966). The diffusivities estimated using Fuller's method are typically found to be in good agreement with the measured values within ±30%, and therefore Fuller's method can be used to estimate the diffusivities of trace gases for which experimental data are not available. The two experimental methods used in the atmospheric chemistry community to measure the gas phase diffusion coefficients are also discussed. A different version of this compilation/evaluation, which will be updated when new data become available, is uploaded online (https://sites.google.com/site/mingjintang/home/diffusion).
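
    For reference, the Fuller semi-empirical correlation referred to above is commonly quoted in the form below, as given in standard property-estimation texts; the paper should be consulted for the exact version and diffusion-volume tables it uses.

    % Fuller-Schettler-Giddings correlation for the binary gas-phase diffusion
    % coefficient D_AB (cm^2 s^-1), with T in K, p in bar, molar masses M in
    % g mol^-1, and \Sigma v the tabulated atomic diffusion volumes:
    D_{AB} = \frac{0.00143\, T^{1.75}}
                  {p\, M_{AB}^{1/2} \left[ (\Sigma v)_{A}^{1/3} + (\Sigma v)_{B}^{1/3} \right]^{2}},
    \qquad
    M_{AB} = \frac{2}{1/M_{A} + 1/M_{B}}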

  4. ASEAN--USAID Buildings Energy Conservation Project final report. Volume 2, Technology

    Energy Technology Data Exchange (ETDEWEB)

    Levine, M.D.; Busch, J.F. [eds.]

    1992-06-01

    This volume reports on research in the area of energy conservation technology applied to commercial buildings in the Association of Southeast Asian Nations (ASEAN) region. Unlike Volume I of this series, this volume is a compilation of original technical papers prepared by different authors in the project; in this regard, it is much like a technical journal. The papers that follow report on research conducted by both US and ASEAN researchers. The authors, representing Indonesia, Malaysia, the Philippines, and Thailand, come from a range of positions in the energy arena, including government energy agencies, electric utilities, and universities. As such, they account for a wide range of perspectives on energy problems and the role that technology can play in solving them. This volume is about using energy more intelligently. In some cases, the effort is towards the use of more advanced technologies, such as low-emittance coatings on window glass, thermal energy storage, or cogeneration. In others, the emphasis is towards reclaiming traditional techniques for rendering energy services, but in new contexts, such as lighting office buildings with natural light or cooling buildings of all types with natural ventilation. Used in its broadest sense, the term "technology" encompasses all of the topics addressed in this volume. Along with the more customary associations of technology, such as advanced materials and equipment and the analysis of their performance, this volume treats design concepts and techniques, analysis of "secondary" impacts from applying technologies (i.e., unintended impacts, or impacts on parties not directly involved in the purchase and use of the technology), and the collection of primary data used for conducting technical analyses.

  5. A compilation of reports of the Advisory Committee on Reactor Safeguards, 1997 annual, U.S. Nuclear Regulatory Commission. Volume 19

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-04-01

    This compilation contains 67 ACRS reports submitted to the Commission, or to the Executive Director for Operations, during calendar year 1997. It also includes a report to the Congress on the NRC Safety Research Program. Specific topics include: (1) advanced reactor designs, (2) emergency core cooling systems, (3) fire protection, (4) generic letters and issues, (5) human factors, (6) instrumentation, control and protection systems, (7) materials engineering, (8) probabilistic risk assessment, (9) regulatory guides and procedures, rules, regulations, and (10) safety research, philosophy, technology and criteria.

  6. Source document compilation: Los Alamos investigations related to the environment, engineering, geology, and hydrology, 1961--1990. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Purtymun, W.D. [comp.]

    1994-03-01

    This document is a compilation of informal reports, letters, and memorandums regarding geologic and hydrologic studies and investigations such as foundation investigations for structures, drilling or coring for environmental studies, development of water supply, or construction of test or observation wells for monitoring. Also included are replies requested for specific environmental, engineering, geologic, and hydrologic problems. The purpose of this document is to preserve and make the original data available to the environmental studies that are now in progress at Los Alamos and provide a reference for and supplement the LAMS report "Records of Observation Wells, Test Holes, Test Wells, Supply Wells, Springs, and Surface water stations at Los Alamos: with Reference to the Geology and Hydrology," which is in preparation. The informal reports and memorandums are listed chronologically from December 1961 to January 1990. Item 208 is a descriptive history of the US Geological Survey's activities at Los Alamos from 1946 through 1972. The history includes a list of published and unpublished reports that cover geology, hydrology, water supply, waste disposal, and environmental monitoring in the Los Alamos area.

  7. Source document compilation: Los Alamos investigations related to the environment, engineering, geology, and hydrology, 1961--1990. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Purtymun, W.D. [comp.]

    1994-03-01

    This document is a compilation of informal reports, letters, and memorandums regarding geologic and hydrologic studies and investigations such as foundation investigations for structures, drilling or coring for environmental studies, development of water supply, or construction of test or observation wells for monitoring. Also included are replies requested for specific environmental, engineering, geologic, and hydrologic problems. The purpose of this document is to preserve and make the original data available to the environmental studies that are now in progress at Los Alamos and provide a reference for and supplement the LAMS report "Records of Observation Wells, Test Holes, Test Wells, Supply Wells, Springs, and Surface water stations at Los Alamos: with Reference to the Geology and Hydrology," which is in preparation. The informal reports and memorandums are listed chronologically from December 1961 to January 1990. Item 208 is a descriptive history of the US Geological Survey's activities at Los Alamos from 1946 through 1972. The history includes a list of published and unpublished reports that cover geology, hydrology, water supply, waste disposal, and environmental monitoring in the Los Alamos area.

  8. Compilation of reports from research supported by the Materials Engineering Branch, Division of Engineering: 1991--1993. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Hiser, A.L. [comp.]

    1994-06-01

    Since 1965, the Materials Engineering Branch, Division of Engineering, of the Nuclear Regulatory Commission's Office of Nuclear Regulatory Research, and its predecessors dating back to the Atomic Energy Commission (AEC), has sponsored research programs concerning the integrity of the primary system pressure boundary of light water reactors. The components of concern in these research programs have included the reactor pressure vessel (RPV), steam generators, and the piping. These research programs have covered a broad range of topics, including fracture mechanics analysis and experimental work for RPV and piping applications, inspection method development and qualification, and evaluation of irradiation effects on RPV steels. This report provides as complete a listing as practical of formal technical reports submitted to the NRC by the investigators working on these research programs. This listing includes topical, final and progress reports, and is segmented by topic area. In many cases a report will cover several topics (such as in the case of progress reports of multi-faceted programs), but is listed under only one topic. Therefore, in searching for reports on a specific topic, other related topic areas should be checked also. The separate volumes of this report cover the following periods: Volume 1: 1965--1990 and Volume 2: 1991--1993.

  9. Mineralogy, geochemistry, porosity and redox properties of rocks from Forsmark. Compilation of data from the regional model volume for SR-Site

    Energy Technology Data Exchange (ETDEWEB)

    Sandstroem, Bjoern (WSP Sverige AB, Stockholm (Sweden)); Stephens, Michael B. (Geological Survey of Sweden, Uppsala (Sweden))

    2009-11-15

    This report is a compilation of the data acquired during the Forsmark site investigation programme on the mineralogy, geochemistry, redox properties and porosity of different rock types at Forsmark. The aim is to provide a final summary of the available data for use during the SR-Site modelling work. Data presented in this report represent the regional model volume and have previously been published in various SKB reports. The data have been extracted from the SKB database Sicada and are presented as calculated median values, data range and lower/upper quartiles. The representativity of all samples used for the calculations has been evaluated, and data from samples where there is insufficient control on the rock type have been omitted. Rock samples affected by alteration have been omitted from the unaltered samples and are presented separately based on the type of alteration (e.g. oxidised or albitized rock).

  10. Evaluation and compilation of DOE waste package test data; Volume 8: Biannual report, August 1989--January 1990

    Energy Technology Data Exchange (ETDEWEB)

    Interrante, C.G. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of High-Level Waste Management; Fraker, A.C.; Escalante, E. [National Inst. of Standards and Technology (MSEL), Gaithersburg, MD (United States). Metallurgy Div.

    1993-06-01

    This report summarizes evaluations by the National Institute of Standards and Technology (NIST) of some of the Department of Energy (DOE) activities on waste packages designed for containment of radioactive high-level nuclear waste (HLW) for the six-month period, August 1989--January 1990. This includes reviews of related materials research and plans, information on the Yucca Mountain, Nevada disposal site activities, and other information regarding supporting research and special assistance. Short discussions are given relating to the publications reviewed and complete reviews and evaluations are included. Reports of other work are included in the Appendices.

  11. Educational Media and Technology Yearbook, 1992. Volume 18.

    Science.gov (United States)

    Ely, Donald P., Ed.; Minor, Barbara B., Ed.

    The Educational Media and Technology Yearbook (EMTY) is designed to provide media and instructional technology professionals with an up-to-date, single-source overview and assessment of the field of educational technology. Each volume addresses current issues, notes trends, and provides current listings of and background information about the…

  12. Gulf Coast geopressured-geothermal program summary report compilation. Volume 3: Applied and direct uses, resource feasibility, economics

    Energy Technology Data Exchange (ETDEWEB)

    John, C.J.; Maciasz, G.; Harder, B.J.

    1998-06-01

    The US Department of Energy established a geopressured-geothermal energy program in the mid-1970s as one response to America's need to develop alternate energy resources in view of the increasing dependence on imported fossil fuel energy. This program continued for 17 years and approximately two hundred million dollars were expended for various types of research and well testing to thoroughly investigate this alternative energy source. This volume describes the following studies: Geopressured-geothermal hybrid cycle power plant: design, testing, and operation summary; Feasibility of hydraulic energy recovery from geopressured-geothermal resources: economic analysis of the Pelton turbine; Brine production as an exploration tool for water drive gas reservoirs; Study of supercritical Rankine cycles; Application of the geopressured-geothermal resource to pyrolytic conversion or decomposition/detoxification processes; Conclusions on wet air oxidation, pyrolytic conversion, decomposition/detoxification process; Co-location of medium to heavy oil reservoirs with geopressured-geothermal resources and the feasibility of oil recovery using geopressured-geothermal fluids; Economic analysis; Application of geopressured-geothermal resources to direct uses; Industrial consortium for the utilization of the geopressured-geothermal resource; Power generation; Industrial desalination, gas use and sales, pollutant removal, thermal EOR, sulfur frasching, oil and natural gas pipelining, coal desulfurization and preparation, lumber and concrete products kilning; Agriculture and aquaculture applications; Paper and cane sugar industries; Chemical processing; Environmental considerations for geopressured-geothermal development. 27 figs., 25 tabs.

  13. Recent Progress in Data Engineering and Internet Technology Volume 1

    CERN Document Server

    Gaol, Ford Lumban

    2013-01-01

    The latest inventions in internet technology influence most of business and daily activities. Internet security, internet data management, web search, data grids, cloud computing, and web-based applications play vital roles, especially in business and industry, as more transactions go online and mobile. Issues related to ubiquitous computing are becoming critical.   Internet technology and data engineering should reinforce efficiency and effectiveness of business processes. These technologies should help people make better and more accurate decisions by presenting necessary information and possible consequences for the decisions. Intelligent information systems should help us better understand and manage information with ubiquitous data repository and cloud computing.   This book is a compilation of some recent research findings in Internet Technology and Data Engineering. This book provides state-of-the-art accounts in computational algorithms/tools, database management and database technologies,  intelli...

  14. Recent Progress in Data Engineering and Internet Technology Volume 2

    CERN Document Server

    Gaol, Ford Lumban

    2012-01-01

    The latest inventions in internet technology influence most of business and daily activities. Internet security, internet data management, web search, data grids, cloud computing, and web-based applications play vital roles, especially in business and industry, as more transactions go online and mobile. Issues related to ubiquitous computing are becoming critical.   Internet technology and data engineering should reinforce efficiency and effectiveness of business processes. These technologies should help people make better and more accurate decisions by presenting necessary information and possible consequences for the decisions. Intelligent information systems should help us better understand and manage information with ubiquitous data repository and cloud computing.   This book is a compilation of some recent research findings in Internet Technology and Data Engineering. This book provides state-of-the-art accounts in computational algorithms/tools, database management and database technologies,  intelli...

  15. Evaluation and compilation of DOE waste package test data; Biannual report, February 1989--July 1989: Volume 7

    Energy Technology Data Exchange (ETDEWEB)

    Interrante, C.G. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of High-Level Waste Management; Fraker, A.C.; Escalante, E. [National Inst. of Standards and Technology (IMSE), Gaithersburg, MD (United States). Metallurgy Div.

    1991-12-01

    This report summarizes evaluations by the National Institute of Standards and Technology (NIST) of Department of Energy (DOE) activities on waste packages designed for containment of radioactive high-level nuclear waste (HLW) for the six-month period, February through July 1989. This includes reviews of related materials research and plans, information on the Yucca Mountain, Nevada disposal site activities, and other information regarding supporting research and special assistance. Outlines for planned interpretative reports on the topics of aqueous corrosion of copper, mechanisms of stress corrosion cracking and internal failure modes of Zircaloy cladding are included. For the publications reviewed during this reporting period, short discussions are given to supplement the completed reviews and evaluations. Included in this report is an overall review of a 1984 report on glass leaching mechanisms, as well as reviews for each of the seven chapters of this report.

  16. Computer technology -- 1996: Applications and methodology. PVP-Volume 326

    Energy Technology Data Exchange (ETDEWEB)

    Hulbert, G.M. [ed.] [Univ. of Michigan, Ann Arbor, MI (United States); Hsu, K.H. [ed.] [Babcock and Wilcox, Barberton, OH (United States); Lee, T.W. [ed.] [FMC Corp., Santa Clara, CA (United States); Nicholas, T. [ed.] [USAF Wright Laboratory, Wright-Patterson AFB, OH (United States)

    1996-12-01

    The primary objective of the Computer Technology Committee of the ASME Pressure Vessels and Piping Division is to promote interest and technical exchange in the field of computer technology, related to the design and analysis of pressure vessels and piping. The topics included in this volume are: analysis of bolted joints; nonlinear analysis, applications and methodology; finite element analysis and applications; and behavior of materials. Separate abstracts were prepared for 23 of the papers in this volume.

  17. Technology transfer from NASA to targeted industries, volume 2

    Science.gov (United States)

    Mccain, Wayne; Schroer, Bernard J.; Souder, William E.; Spann, Mary S.; Watters, Harry; Ziemke, M. Carl

    1993-01-01

    This volume contains the following materials to support Volume 1: (1) Survey of Metal Fabrication Industry in Alabama; (2) Survey of Electronics Manufacturing/Assembly Industry in Alabama; (3) Apparel Modular Manufacturing Simulators; (4) Synopsis of a Stereolithography Project; (5) Transferring Modular Manufacturing Technology to an Apparel Firm; (6) Letters of Support; (7) Fact Sheets; (8) Publications; and (9) One Stop Access to NASA Technology Brochure.

  18. Human choice and climate change. Volume 2: Resources and technology

    Energy Technology Data Exchange (ETDEWEB)

    Raynor, S.; Malone, E. [eds.] [Pacific Northwest National Lab., Richland, WA (United States)

    1998-12-31

    This book is Volume 2 of a four-volume set which assesses social science research that is relevant to global climate change from a wide-ranging interdisciplinary perspective. Attention is focused on resources and technology as they relate to climate change. This series is indispensable reading for scientists and engineers wishing to make an effective contribution to the climate change policy debate.

  19. Computer science research and technology volume 3

    CERN Document Server

    Bauer, Janice P

    2011-01-01

    This book presents leading-edge research from across the globe in the field of computer science research, technology and applications. Each contribution has been carefully selected for inclusion based on the significance of the research to this fast-moving and diverse field. Some topics included are: network topology; agile programming; virtualization; and reconfigurable computing.

  20. Oak Ridge K-25 Site Technology Logic Diagram. Volume 2, Technology Logic Diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Fellows, R.L. [ed.]

    1993-02-26

    The Oak Ridge K-25 Technology Logic Diagram (TLD), a decision support tool for the K-25 Site, was developed to provide a planning document that relates environmental restoration and waste management problems at the Oak Ridge K-25 Site to potential technologies that can remediate these problems. The TLD technique identifies the research necessary to develop these technologies to a state that allows for technology transfer and application to waste management, remedial action, and decontamination and decommissioning activities. The TLD consists of four separate volumes: Vol. 1, Vol. 2, Vol. 3A, and Vol. 3B. Volume 1 provides introductory and overview information about the TLD. This volume, Volume 2, contains logic diagrams with an index. Volume 3 has been divided into two separate volumes to facilitate handling and use.

  1. USAF Advanced Terrestrial Energy Study. Volume 2. Technology Handbook.

    Science.gov (United States)

    1983-04-01

    This volume is a technology handbook containing descriptions of terrestrial energy technologies: diesels, gas turbines, Stirling engines, organic Rankine cycles, fuel cells, photovoltaic energy conversion systems, and wind systems, together with batteries and thermal energy storage. The volume of Stirling systems is driven by the regenerator, which determines the dimensions of the system envelope.

  2. Technology transfer package on seismic base isolation - Volume III

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-02-14

    This Technology Transfer Package provides some detailed information for the U.S. Department of Energy (DOE) and its contractors about seismic base isolation. Intended users of this three-volume package are DOE Design and Safety Engineers as well as DOE Facility Managers who are responsible for reducing the effects of natural phenomena hazards (NPH), specifically earthquakes, on their facilities. The package was developed as part of DOE's efforts to study and implement techniques for protecting lives and property from the effects of natural phenomena and to support the International Decade for Natural Disaster Reduction. Volume III contains supporting materials not included in Volumes I and II.

  3. The Compiler Forest

    OpenAIRE

    Budiu, Mihai; Galenson, Joel; Plotkin, Gordon D.

    2013-01-01

    We address the problem of writing compilers targeting complex execution environments, such as computer clusters composed of machines with multi-core CPUs. To that end we introduce partial compilers. These compilers can pass sub-programs to several child (partial) compilers, combining the code generated by their children to generate the final target code. We define a set of high-level polymorphic operations manipulating both compilers and partial compilers as first-class values. These mechanisms...
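
    The following Python sketch illustrates the partial-compiler idea described above; the names and the toy split/combine strategy are invented for illustration and are not taken from the paper, which works in a far richer setting.

      # A partial compiler delegates sub-programs to child compilers and
      # combines their output; composing it with children yields an
      # ordinary compiler (illustrative sketch, not the paper's formalism).
      from dataclasses import dataclass
      from typing import Callable, List

      Program = str        # toy program representation
      TargetCode = str     # toy generated-code representation

      @dataclass
      class Compiler:
          compile: Callable[[Program], TargetCode]

      @dataclass
      class PartialCompiler:
          split: Callable[[Program], List[Program]]          # carve out sub-programs
          combine: Callable[[List[TargetCode]], TargetCode]  # stitch children's code together

      def compose(parent: PartialCompiler, children: List[Compiler]) -> Compiler:
          """Plug child compilers into a partial compiler, yielding an ordinary compiler."""
          def run(program: Program) -> TargetCode:
              parts = parent.split(program)
              codes = [child.compile(p) for child, p in zip(children, parts)]
              return parent.combine(codes)
          return Compiler(compile=run)

      # Example: a cluster-level partial compiler splits the program in two,
      # hands each half to a per-machine compiler, and wraps the results.
      machine = Compiler(compile=lambda p: f"machine_code({p})")
      cluster = PartialCompiler(
          split=lambda p: [p[: len(p) // 2], p[len(p) // 2:]],
          combine=lambda cs: "parallel{ " + " || ".join(cs) + " }",
      )
      print(compose(cluster, [machine, machine]).compile("ABCDEFGH"))
      # -> parallel{ machine_code(ABCD) || machine_code(EFGH) }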

  4. Engineering Technology Reports, Volume 2: Technology Base FY00

    Energy Technology Data Exchange (ETDEWEB)

    Baron, A L; Langland, R T; Minichino, C

    2001-10-03

    In FY-2000, Engineering at Lawrence Livermore National Laboratory faced significant pressures to meet critical project milestones, and immediate demands to facilitate the reassignment of employees as the National Ignition Facility (the 600-TW laser facility being designed and built at Livermore, and one of the largest R&D construction projects in the world) was in the process of re-baselining its plan while executing its technology development efforts at full speed. This drive for change occurred as an unprecedented level of management and program changes were occurring within LLNL. I am pleased to report that we met many key milestones and achieved numerous technological breakthroughs. This report summarizes our efforts to perform feasibility and reduce-to-practice studies, demonstrations, and/or technique development--as structured through our technology centers. Whether using computational engineering to predict how giant structures like suspension bridges will respond to massive earthquakes or devising a suitcase-sized microtool to detect chemical and biological agents used by terrorists, we have made solid technical progress. Five Centers focus and guide longer-term investments within Engineering, as well as impact all of LLNL. Each Center is responsible for the vitality and growth of the core technologies it represents. My goal is that each Center will be recognized on an international scale for solving compelling national problems requiring breakthrough innovation. The Centers and their leaders are as follows: Center for Complex Distributed Systems--David B. McCallen; Center for Computational Engineering--Kyran D. Mish; Center for Microtechnology--Raymond P. Mariella, Jr.; Center for Nondestructive Characterization--Harry E. Martz, Jr.; and Center for Precision Engineering--Keith Carlisle.

  5. Assessment of control technology for stationary sources. Volume II: control technology data tables. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Minicucci, D.; Herther, M.; Babb, L.; Kuby, W.

    1980-02-01

    This report, the Control Technology Data Tables, is the second volume of the three-volume final report for the contract. It presents in tabular format, qualitative descriptions of control options for the various sources and quantitative information on control technology cost, efficiency, reliability, energy consumption, other environmental impacts and application status. Also included is a code list which classifies the stationary sources examined by industry, process, and emission source.

  6. Oak Ridge K-25 Site Technology Logic Diagram. Volume 1, Technology evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Fellows, R.L. [ed.

    1993-02-26

    The Oak Ridge K-25 Technology Logic Diagram (TLD), a decision support tool for the K-25 Site, was developed to provide a planning document that relates environmental restoration and waste management problems at the Oak Ridge K-25 Site to potential technologies that can remediate these problems. The TLD technique identifies the research necessary to develop these technologies to a state that allows for technology transfer and application to waste management, remedial action, and decontamination and decommissioning activities. The TLD consists of four separate volumes-Vol. 1, Vol. 2, Vol. 3A, and Vol. 3B. This volume, Volume 1, provides introductory and overview information about the TLD. Volume 2 contains logic diagrams. Volume 3 has been divided into two separate volumes to facilitate handling and use. This volume is divided into ten chapters. The first chapter is a brief introduction, and the second chapter details the technical approach of the TLD. The subsequent chapters address the major problem categories, which are the work activities necessary for successful decontamination and decommissioning, waste management, and remedial action of the K-25 Site. The categories are characterization, decontamination, dismantlement, robotics and automation, remedial action, and waste management. Materials disposition is addressed in Chap. 9. The final chapter contains regulatory compliance information concerning waste management, remedial action, and decontamination and decommissioning.

  7. Principles of compilers

    CERN Document Server

    Su, Yunlin

    2011-01-01

    ""Principles of Compilers: A New Approach to Compilers Including the Algebraic Method"" introduces the ideas of the compilation from the natural intelligence of human beings by comparing similarities and differences between the compilations of natural languages and programming languages. The notation is created to list the source language, target languages, and compiler language, vividly illustrating the multilevel procedure of the compilation in the process. The book thoroughly explains the LL(1) and LR(1) parsing methods to help readers to understand the how and why. It not only covers estab

  8. Oak Ridge National Laboratory Technology Logic Diagram. Volume 2, Technology Logic Diagram: Part B, Remedial Action

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The Oak Ridge National Laboratory Technology Logic Diagram (TLD) was developed to provide a decision support tool that relates environmental restoration (ER) and waste management (WM) problems at Oak Ridge National Laboratory (ORNL) to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to decontamination and decommissioning (D&D), remedial action (RA), and WM activities. The TLD consists of three fundamentally separate volumes: Vol. 1 (Technology Evaluation), Vol. 2 (Technology Logic Diagram), and Vol. 3 (Technology Evaluation Data Sheets). Part A of Vols. 1 and 2 focuses on D&D. Part B of Vols. 1 and 2 focuses on the RA of contaminated facilities. Part C of Vols. 1 and 2 focuses on WM. Each part of Vol. 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 (Pts. A, B, and C) contains the logic linkages among EM goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 (Pts. A, B, and C) contains the TLD data sheets. Remedial action is the focus of Vol. 2, Pt. B, which has been divided into the three necessary subelements of the RA: characterization, RA, and robotics and automation. Each of these sections addresses general ORNL problems, which are then broken down by problem area/constituents and linked to potential remedial technologies. The diagrams also contain summary information about a technology's status, its science and technology needs, and its implementation needs.

  9. Technology transfer package on seismic base isolation - Volume II

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-02-14

    This Technology Transfer Package provides some detailed information for the U.S. Department of Energy (DOE) and its contractors about seismic base isolation. Intended users of this three-volume package are DOE Design and Safety Engineers as well as DOE Facility Managers who are responsible for reducing the effects of natural phenomena hazards (NPH), specifically earthquakes, on their facilities. The package was developed as part of DOE's efforts to study and implement techniques for protecting lives and property from the effects of natural phenomena and to support the International Decade for Natural Disaster Reduction. Volume II contains the proceedings for the Short Course on Seismic Base Isolation held in Berkeley, California, August 10-14, 1992.

  10. Technology transfer package on seismic base isolation - Volume I

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-02-14

    This Technology Transfer Package provides some detailed information for the U.S. Department of Energy (DOE) and its contractors about seismic base isolation. Intended users of this three-volume package are DOE Design and Safety Engineers as well as DOE Facility Managers who are responsible for reducing the effects of natural phenomena hazards (NPH), specifically earthquakes, on their facilities. The package was developed as part of DOE's efforts to study and implement techniques for protecting lives and property from the effects of natural phenomena and to support the International Decade for Natural Disaster Reduction. Volume I contains the proceedings of the Workshop on Seismic Base Isolation for Department of Energy Facilities held in Marina Del Rey, California, May 13-15, 1992.

  11. Survey of biomass gasification. Volume III. Current technology and research

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-04-01

    This survey of biomass gasification was written to aid the Department of Energy and the Solar Energy Research Institute Biological and Chemical Conversion Branch in determining the areas of gasification that are ready for commercialization now and those areas in which further research and development will be most productive. Chapter 8 is a survey of gasifier types. Chapter 9 consists of a directory of current manufacturers of gasifiers and gasifier development programs. Chapter 10 is a sampling of current gasification R and D programs and their unique features. Chapter 11 compares air gasification for the conversion of existing gas/oil boiler systems to biomass feedstocks with the price of installing new biomass combustion equipment. Chapter 12 treats gas conditioning as a necessary adjunct to all but close-coupled gasifiers, in which the product is promptly burned. Chapter 13 evaluates, technically and economically, synthesis-gas processes for conversion to methanol, ammonia, gasoline, or methane. Chapter 14 compiles a number of comments that have been assembled from various members of the gasifier community as to possible roles of the government in accelerating the development of gasifier technology and commercialization. Chapter 15 includes recommendations for future gasification research and development.

  12. Technology Directions for the 21st Century, volume 1

    Science.gov (United States)

    Crimi, Giles F.; Verheggen, Henry; McIntosh, William; Botta, Robert

    1996-01-01

    For several decades, semiconductor device density and performance have been doubling about every 18 months (Moore's Law). With present photolithography techniques, this rate can continue for only about another 10 years. Continued improvement will need to rely on newer technologies. Transition from the current micron range for transistor size to the nanometer range will permit Moore's Law to operate well beyond 10 years. The technologies that will enable this extension include: single-electron transistors; quantum well devices; spin transistors; and nanotechnology and molecular engineering. Continuation of Moore's Law will rely on huge capital investments for manufacture as well as on new technologies. Much will depend on the fortunes of Intel, the premier chip manufacturer, which, in turn, depend on the development of mass-market applications and volume sales for chips of higher and higher density. The technology drivers are seen by different forecasters to include video/multimedia applications, digital signal processing, and business automation. Moore's Law will affect NASA in the areas of communications and space technology by reducing size and power requirements for data processing and data fusion functions to be performed onboard spacecraft. In addition, NASA will have the opportunity to be a pioneering contributor to nanotechnology research without incurring huge expenses.
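
    The doubling rate quoted above translates into roughly an order of magnitude in device density every five years; the short calculation below (illustrative only, not from the article) makes the point.

      # If density doubles every 18 months, growth over a span of years is
      # a factor of 2 ** (years * 12 / 18).
      def moore_factor(years: float, doubling_months: float = 18.0) -> float:
          """Growth factor in device density after `years`, for a given doubling period."""
          return 2.0 ** (years * 12.0 / doubling_months)

      for years in (5, 10, 15):
          print(f"{years:2d} years -> density x{moore_factor(years):,.0f}")
      # roughly x10 after 5 years, x100 after 10 years, x1000 after 15 years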

  13. A framework for evaluation of technology transfer programs. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    1993-07-01

    The objective of this volume is to describe a framework with which DOE can develop a program-specific methodology to evaluate its technology transfer efforts. This approach could also be applied to an integrated private sector technology transfer organization. Several benefits will be realized from the application of this work. While the immediate effect will be to assist program managers in evaluating and improving program performance, the ultimate benefits will accrue to the producing industry, the states, and the nation in the form of sustained or increased domestic oil production. This benefit depends also, of course, on the effectiveness of the technology being transferred. The managers of the Technology Transfer program, and the larger federal oil and gas R&D programs, will be provided with a means to design and assess the effectiveness of program efforts as they are developed, tested and performed. The framework allows deficiencies in critical aspects of the program to be quickly identified, allowing for timely corrections and improvements. The actual process of developing the evaluation also gives the staff of the Oil R&D Program or Technology Transfer subprogram the opportunity to become oriented to the overall program goals. The structure and focus imposed by the evaluation paradigm will guide program staff in selecting activities which are consistent with achieving the goals of the overall R&D program.

  14. Advanced Thermionic Technology Program: summary report. Volume 4. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1984-10-01

    This report summarizes the progress made by the Advanced Thermionic Technology Program during the past several years. This Program, sponsored by the US Department of Energy, has had as its goal adapting thermionic devices to generate electricity in a terrestrial (i.e., combustion) environment. Volume 4 (Part E) is a highly technical discussion of the attempts made by the Program to push the state-of-the-art beyond the current generation of converters and is directed toward potential researchers engaged in this same task. These technical discussions are complemented with Appendices where appropriate.

  15. Emerging Communication Technologies (ECT) Phase 2 Report. Volume 3; Ultra Wideband (UWB) Technology

    Science.gov (United States)

    Bastin, Gary L.; Harris, William G.; Chiodini, Robert; Nelson, Richard A.; Huang, PoTien; Kruhm, David A.

    2003-01-01

    The Emerging Communication Technology (ECT) project investigated three First Mile communication technologies in support of NASA's Second Generation Reusable Launch Vehicle (2nd Gen RLV), Orbital Space Plane, Advanced Range Technology Working Group (ARTWG) and the Advanced Spaceport Technology Working Group (ASTWG). These First Mile technologies have the purpose of interconnecting mobile users with existing Range Communication infrastructures. ECT was a continuation of the Range Information System Management (RISM) task started in 2002. RISM identified the three advanced communication technologies investigated under ECT. These were Wireless Ethernet (Wi-Fi), Free Space Optics (FSO), and Ultra Wideband (UWB). Due to the report's size, it has been broken into three volumes: 1) Main Report, 2) Appendices, and 3) UWB.

  16. Compilation of 1985 annual reports of the Navy elf (extremely low frequency) communications system ecological monitoring program. Volume 1. Tabs A-C. Annual progress report, January-December 1985

    Energy Technology Data Exchange (ETDEWEB)

    Becker, C.; Bruhn, J.; Cattelino, P.; Fuller, L.; Jurgensen, M.

    1986-07-01

    This is the fourth compilation of annual reports for the Navy's ELF Communications System Ecological Monitoring Program. The reports document the progress of ten studies performed during 1985 at the Wisconsin and Michigan Transmitting Facilities. The purpose of the monitoring is to determine whether electromagnetic fields produced by the ELF Communications System will affect resident biota or their ecological relationships. This volume consists of three reports: Herbaceous Plant Cover and Tree Studies; Litter Decomposition and Microflora; and The Effects of Exposing the Slime Mold Physarum polycephalum to Electromagnetic Fields.

  17. DEFTEST. Defence Technological and Scientific Thesaurus. Volume 2. M - Z

    Science.gov (United States)

    1988-05-01

    DRS-SP-1 (Vol. 2), AR-005-466. DEFTEST: Defence Technological and Scientific Thesaurus, Volume 2, M-Z, 1988 Edition.

  18. Y-12 Plant remedial action Technology Logic Diagram: Volume 3, Technology evaluation data sheets: Part A, Remedial action

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    The Y-12 Plant Remedial Action Technology Logic Diagram (TLD) was developed to provide a decision-support tool that relates environmental restoration (ER) problems at the Y-12 Plant to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to remedial action (RA) activities. The TLD consists of three volumes. Volume 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 contains the logic linkages among environmental management goals, environmental problems and the various technologies that have the potential to solve these problems. Volume 3 contains the TLD data sheets. This report is Part A of Volume 3 and contains the Remedial Action section.

  19. Y-12 Plant remedial action Technology Logic Diagram: Volume 3, Technology evaluation data sheets: Part B, Characterization; robotics/automation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    The Y-12 Plant Remedial Action Technology Logic Diagram (TLD) was developed to provide a decision-support tool that relates environmental restoration (ER) problems at the Y-12 Plant to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to remedial action (RA) activities. The TLD consists of three volumes. Volume 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 contains the logic linkages among environmental management goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 contains the TLD data sheets. This report is Part B of Volume 3 and contains the Characterization and Robotics/Automation sections.

  20. Calculating correct compilers

    DEFF Research Database (Denmark)

    Bahr, Patrick; Hutton, Graham

    2015-01-01

    In this article, we present a new approach to the problem of calculating compilers. In particular, we develop a simple but general technique that allows us to derive correct compilers from high-level semantics by systematic calculation, with all details of the implementation of the compilers falling naturally out of the calculation process. Our approach is based upon the use of standard equational reasoning techniques, and has been applied to calculate compilers for a wide range of language features and their combination, including arithmetic expressions, exceptions, state, various forms...
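
    The article works in Haskell and derives its compilers by equational reasoning; the Python sketch below (all names and details are illustrative, not taken from the article) only shows the kind of end product such a calculation yields for arithmetic expressions: a stack-machine compiler whose correctness property exec(comp(e), s) = eval(e) : s can be checked directly.

      from typing import List, Tuple, Union

      Expr = Union[int, Tuple[str, "Expr", "Expr"]]   # an int, or ("add", e1, e2)

      def eval_expr(e: Expr) -> int:
          """Reference (source-level) semantics."""
          if isinstance(e, int):
              return e
          _, left, right = e
          return eval_expr(left) + eval_expr(right)

      def comp(e: Expr) -> List[Tuple]:
          """Compile an expression to stack-machine code."""
          if isinstance(e, int):
              return [("PUSH", e)]
          _, left, right = e
          return comp(left) + comp(right) + [("ADD",)]

      def exec_code(code: List[Tuple], stack: List[int]) -> List[int]:
          """Target (stack-machine) semantics."""
          for instr in code:
              if instr[0] == "PUSH":
                  stack = [instr[1]] + stack
              elif instr[0] == "ADD":
                  a, b, *rest = stack
                  stack = [b + a] + rest
          return stack

      e = ("add", 1, ("add", 2, 3))
      assert exec_code(comp(e), []) == [eval_expr(e)]   # correctness, checked on one example
      print(comp(e), "->", exec_code(comp(e), []))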

  1. Engineering a compiler

    CERN Document Server

    Cooper, Keith D

    2012-01-01

    As computing has changed, so has the role of both the compiler and the compiler writer. The proliferation of processors, environments, and constraints demands an equally large number of compilers. To adapt, compiler writers retarget code generators, add optimizations, and work on issues such as code space or power consumption. Engineering a Compiler re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in today's practice. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best-practice algorithms for the major problems inside a compiler. The book focuses on the back end of the compiler, reflecting the focus of research and development over the last decade; applies the well-developed theory behind scanning and parsing to introduce concepts that play a critical role in optimization and code generation; and introduces the student to optimization through data-flow analysis, SSA form, and a selection of scalar optimizations.
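
    As a small taste of the data-flow analysis material mentioned above, the following sketch performs iterative live-variable analysis, a classic backward data-flow problem; the control-flow graph and the use/def sets are made up for illustration and are not taken from the book.

      # Each block: (uses, defs, successors)
      CFG = {
          "entry": ({"a"},      {"b"},  ["loop"]),
          "loop":  ({"b", "c"}, {"c"},  ["loop", "exit"]),
          "exit":  ({"c"},      set(),  []),
      }

      def liveness(cfg):
          """Compute LIVE-IN / LIVE-OUT sets by fixed-point iteration."""
          live_in = {b: set() for b in cfg}
          live_out = {b: set() for b in cfg}
          changed = True
          while changed:
              changed = False
              for block, (use, defs, succs) in cfg.items():
                  new_out = set().union(*(live_in[s] for s in succs))
                  new_in = use | (new_out - defs)
                  if new_out != live_out[block] or new_in != live_in[block]:
                      live_out[block], live_in[block] = new_out, new_in
                      changed = True
          return live_in, live_out

      live_in, live_out = liveness(CFG)
      print(live_out["entry"])   # {'b', 'c'}: both are live after the entry block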

  2. Kokkos GPU Compiler

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-15

    The Kokkos Clang compiler is a version of the Clang C++ compiler that has been modified to perform targeted code generation for Kokkos constructs, with the goal of generating highly optimized code and providing semantic (domain) awareness throughout the compilation toolchain for these constructs, such as parallel for and parallel reduce. This approach is taken to explore the possibilities of exposing the developer's intentions to the underlying compiler infrastructure (e.g. optimization and analysis passes within the middle stages of the compiler) instead of relying solely on the restricted capabilities of C++ template metaprogramming. To date our current activities have focused on correct GPU code generation, and thus we have not yet focused on improving overall performance. The compiler is implemented by recognizing specific (syntactic) Kokkos constructs in order to bypass normal template expansion mechanisms and instead use the semantic knowledge of Kokkos to directly generate code in the compiler's intermediate representation (IR), which is then translated into an NVIDIA-centric GPU program and supporting runtime calls. In addition, capturing and maintaining the higher-level semantics of Kokkos directly within the lower levels of the compiler has the potential to significantly improve the ability of the compiler to communicate with the developer in terms of the original programming model and its semantics.

  3. Oak Ridge K-25 Site Technology Logic Diagram. Volume 3, Technology evaluation data sheets; Part A, Characterization, decontamination, dismantlement

    Energy Technology Data Exchange (ETDEWEB)

    Fellows, R.L. [ed.

    1993-02-26

    The Oak Ridge K-25 Technology Logic Diagram (TLD), a decision support tool for the K-25 Site, was developed to provide a planning document that relates environmental restoration and waste management problems at the Oak Ridge K-25 Site to potential technologies that can remediate these problems. The TLD technique identifies the research necessary to develop these technologies to a state that allows for technology transfer and application to waste management, remedial action, and decontamination and decommissioning activities. The TLD consists of four separate volumes-Vol. 1, Vol. 2, Vol. 3A, and Vol. 3B. Volume 1 provides introductory and overview information about the TLD. Volume 2 contains logic diagrams. Volume 3 has been divided into two separate volumes to facilitate handling and use. This report is part A of Volume 3 concerning characterization, decontamination, and dismantlement.

  4. Fifth DOE symposium on enhanced oil and gas recovery and improved drilling technology. Volume 2. Oil

    Energy Technology Data Exchange (ETDEWEB)

    Linville, B. [ed.

    1979-01-01

    Volume 2 contains papers from the following sessions: residual oil determination; thermal methods; heavy oil-tar sands; technology transfer; and carbon dioxide flooding. Individual papers were processed.

  5. Bedrock Outcrop Points Compilation

    Data.gov (United States)

    Vermont Center for Geographic Information — A compilation of bedrock outcrops as points and/or polygons from 1:62,500 and 1:24,000 geologic mapping by the Vermont Geological Survey, the United States...

  6. Gulf Coast geopressured-geothermal program summary report compilation. Volume 2-A: Resource description, program history, wells tested, university and company based research, site restoration

    Energy Technology Data Exchange (ETDEWEB)

    John, C.J.; Maciasz, G.; Harder, B.J.

    1998-06-01

    The US Department of Energy established a geopressured-geothermal energy program in the mid-1970s as one response to America's need to develop alternate energy resources in view of the increasing dependence on imported fossil fuel energy. This program continued for 17 years and approximately two hundred million dollars were expended for various types of research and well testing to thoroughly investigate this alternative energy source. This volume describes the following studies: Geopressured-geothermal resource description; Resource origin and sediment type; Gulf Coast resource extent; Resource estimates; Project history; Authorizing legislation; Program objectives; Perceived constraints; Program activities and structure; Well testing; Program management; Program cost summary; Funding history; Resource characterization; Wells of opportunity; Edna Delcambre No. 1 well; Edna Delcambre well recompletion; Fairfax Foster Sutter No. 2 well; Beulah Simon No. 2 well; P.E. Girouard No. 1 well; Prairie Canal No. 1 well; Crown Zellerbach No. 2 well; Alice C. Plantation No. 2 well; Tenneco Fee N No. 1 well; Pauline Kraft No. 1 well; Saldana well No. 2; G.M. Koelemay well No. 1; Willis Hulin No. 1 well; Investigations of other wells of opportunity; Clovis A. Kennedy No. 1 well; Watkins-Miller No. 1 well; Lucien J. Richard et al No. 1 well; and the C and K-Frank A. Godchaux, III, well No. 1.

  7. Gulf Coast geopressured-geothermal program summary report compilation. Volume 2-B: Resource description, program history, wells tested, university and company based research, site restoration

    Energy Technology Data Exchange (ETDEWEB)

    John, C.J.; Maciasz, G.; Harder, B.J.

    1998-06-01

    The US Department of Energy established a geopressured-geothermal energy program in the mid-1970s as one response to America's need to develop alternate energy resources in view of the increasing dependence on imported fossil fuel energy. This program continued for 17 years and approximately two hundred million dollars were expended for various types of research and well testing to thoroughly investigate this alternative energy source. This volume describes the following studies: Design well program; LaFourche Crossing; MG-T/DOE Amoco Fee No. 1 (Sweet Lake); Environmental monitoring at Sweet Lake; Air quality; Water quality; Microseismic monitoring; Subsidence; Dow/DOE L.R. Sweezy No. 1 well; Reservoir testing; Environmental monitoring at Parcperdue; Air monitoring; Water runoff; Groundwater; Microseismic events; Subsidence; Environmental consideration at site; Gladys McCall No. 1 well; Test results of Gladys McCall; Hydrocarbons in production gas and brine; Environmental monitoring at the Gladys McCall site; Pleasant Bayou No. 2 well; Pleasant Bayou hybrid power system; Environmental monitoring at Pleasant Bayou; and Plug abandonment and well site restoration of three geopressured-geothermal test sites. 197 figs., 64 tabs.

  8. Compilation of Shona Children's Dictionary

    African Journals Online (AJOL)

    Mev. R.B. Ruthven

    Peniah Mabaso, African Languages Research Institute (ALRI), University of Zimbabwe, Harare. The Compilation of a Shona Children's Dictionary: Challenges and Solutions. The dictionary is aimed at the ... thirteen years age group and their teachers. The current orthography is linguistically constricting in a number of ways.

  9. Flat-panel volume CT: fundamental principles, technology, and applications.

    Science.gov (United States)

    Gupta, Rajiv; Cheung, Arnold C; Bartling, Soenke H; Lisauskas, Jennifer; Grasruck, Michael; Leidecker, Christianne; Schmidt, Bernhard; Flohr, Thomas; Brady, Thomas J

    2008-01-01

    Flat-panel volume computed tomography (CT) systems have an innovative design that allows coverage of a large volume per rotation, fluoroscopic and dynamic imaging, and high spatial resolution that permits visualization of complex human anatomy such as fine temporal bone structures and trabecular bone architecture. In simple terms, flat-panel volume CT scanners can be thought of as conventional multidetector CT scanners in which the detector rows have been replaced by an area detector. The flat-panel detector has wide z-axis coverage that enables imaging of entire organs in one axial acquisition. Its fluoroscopic and angiographic capabilities are useful for intraoperative and vascular applications. Furthermore, the high-volume coverage and continuous rotation of the detector may enable depiction of dynamic processes such as coronary blood flow and whole-brain perfusion. Other applications in which flat-panel volume CT may play a role include small-animal imaging, nondestructive testing in animal survival surgeries, and tissue-engineering experiments. Such versatility has led some to predict that flat-panel volume CT will gain importance in interventional and intraoperative applications, especially in specialties such as cardiac imaging, interventional neuroradiology, orthopedics, and otolaryngology. However, the contrast resolution of flat-panel volume CT is slightly inferior to that of multidetector CT, a higher radiation dose is needed to achieve a comparable signal-to-noise ratio, and a slower scintillator results in a longer scanning time.

  10. Oak Ridge National Laboratory Technology Logic Diagram. Volume 3, Technology evaluation data sheets: Part B, Dismantlement, Remedial action

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The Oak Ridge National Laboratory Technology Logic Diagram (TLD) was developed to provide a decision support tool that relates environmental restoration (ER) and waste management (WM) problems at Oak Ridge National Laboratory (ORNL) to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to decontamination and decommissioning (D&D), remedial action (RA), and WM activities. The TLD consists of three fundamentally separate volumes: Vol. 1, Technology Evaluation; Vol. 2, Technology Logic Diagram; and Vol. 3, Technology Evaluation Data Sheets. Part A of Vols. 1 and 2 focuses on RA. Part B of Vols. 1 and 2 focuses on the D&D of contaminated facilities. Part C of Vols. 1 and 2 focuses on WM. Each part of Vol. 1 contains an overview of the TLD, an explanation of the problems facing the volume-specific program, a review of identified technologies, and rankings of technologies applicable to the site. Volume 2 (Pts. A, B, and C) contains the logic linkages among EM goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 (Pts. A, B, and C) contains the TLD data sheets. This volume provides the technology evaluation data sheets (TEDS) for ER/WM activities (D&D, RA, and WM) that are referenced by a TEDS code number in Vol. 2 of the TLD. Each of these sheets represents a single logic trace across the TLD. These sheets contain more detail than is given for the technologies in Vol. 2.

  11. Oak Ridge National Laboratory Technology Logic Diagram. Volume 3, Technology evaluation data sheets: Part C, Robotics/automation, Waste management

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The Oak Ridge National Laboratory Technology Logic Diagram (TLD) was developed to provide a decision support tool that relates environmental restoration (ER) and waste management (WM) problems at Oak Ridge National Laboratory (ORNL) to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to decontamination and decommissioning (D&D), remedial action (RA), and WM activities. The TLD consists of three fundamentally separate volumes: Vol. 1, Technology Evaluation; Vol. 2, Technology Logic Diagram; and Vol. 3, Technology Evaluation Data Sheets. Part A of Vols. 1 and 2 focuses on RA. Part B of Vols. 1 and 2 focuses on the D&D of contaminated facilities. Part C of Vols. 1 and 2 focuses on WM. Each part of Vol. 1 contains an overview of the TLD, an explanation of the problems facing the volume-specific program, a review of identified technologies, and rankings of technologies applicable to the site. Volume 2 (Pts. A, B, and C) contains the logic linkages among EM goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 (Pts. A, B, and C) contains the TLD data sheets. This volume provides the technology evaluation data sheets (TEDS) for ER/WM activities (D&D, RA, and WM) that are referenced by a TEDS code number in Vol. 2 of the TLD. Each of these sheets represents a single logic trace across the TLD. These sheets contain more detail than is given for the technologies in Vol. 2.

  12. Y-12 Plant decontamination and decommissioning technology logic diagram for Building 9201-4. Volume 2: Technology logic diagram

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    The Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4 (TLD) was developed to provide a decision-support tool that relates decontamination and decommissioning (D and D) problems at Bldg. 9201-4 to potential technologies that can remediate these problems. This TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to D and D and waste management (WM) activities. It is essential that follow-on engineering studies be conducted to build on the output of this project. These studies will begin by selecting the most promising technologies identified in the TLD and by finding an optimum mix of technologies that will provide a socially acceptable balance between cost and risk. The TLD consists of three fundamentally separate volumes: Vol. 1 (Technology Evaluation), Vol. 2 (Technology Logic Diagram), and Vol. 3 (Technology Evaluation Data Sheets). Volume 2 contains the logic linkages among environmental management goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 2 has been divided into five sections: Characterization, Decontamination, Dismantlement, Robotics/Automation, and Waste Management. Each section contains logical breakdowns of the Y-12 D and D problems by subject area and identifies technologies that can be reasonably applied to each D and D challenge.

  13. Embedded Processor Oriented Compiler Infrastructure

    Directory of Open Access Journals (Sweden)

    DJUKIC, M.

    2014-08-01

    In recent years, research on special compiler techniques and algorithms for embedded processors has broadened the knowledge of how to achieve better compiler performance on irregular processor architectures. However, industrial strength compilers, besides the ability to generate efficient code, must also be robust, understandable, maintainable, and extensible. This raises the need for a compiler infrastructure that provides means for convenient implementation of embedded processor oriented compiler techniques. The Cirrus Logic Coyote 32 DSP is an example that shows how traditional compiler infrastructure is not able to cope with the problem. That is why a new compiler infrastructure was developed for this processor, based on research in the field of embedded system software tools and experience in the development of industrial strength compilers. The new infrastructure is described in this paper. Compiler-generated code quality is compared with code generated by the previous compiler for the same processor architecture.

  14. Elements of compiler design

    CERN Document Server

    Meduna, Alexander

    2007-01-01

    Contents: Preface. Introduction: Mathematical Preliminaries; Compilation; Rewriting Systems. Lexical Analysis: Models; Methods; Theory. Syntax Analysis: Models; Methods; Theory. Deterministic Top-Down Parsing: Predictive Sets and LL Grammars; Predictive Parsing. Deterministic Bottom-Up Parsing: Precedence Parsing; LR Parsing. Syntax-Directed Translation and Intermediate Code Generation: Bottom-Up Syntax-Directed Translation and Intermediate Code Generation; Top-Down Syntax-Directed Translation; Symbol Table; Semantic Analysis; Softw...

  15. Metallurgy: A compilation

    Science.gov (United States)

    1972-01-01

    A compilation on the technical uses of various metallurgical processes is presented. Descriptions are given of the mechanical properties of various alloys, ranging from TAZ-813 at 2200 F to investment cast alloy 718 at -320 F. Methods are also described for analyzing some of the constituents of various alloys from optical properties of carbide precipitates in Rene 41 to X-ray spectrographic analysis of the manganese content of high chromium steels.

  16. Industrial Sector Technology Use Model (ISTUM): industrial energy use in the United States, 1974-2000. Volume 4. Technology appendix

    Energy Technology Data Exchange (ETDEWEB)

    1978-06-19

    This volume of the 4-volume ISTUM documentation gives information on the individual technology specifications. The first chapter presents a general overview of the ISTUM technology data bases. It includes an explanation of the data-base printouts and how the separate-cost building blocks are combined to derive an aggregate-technology cost. The remaining chapters document the specific-technology-cost specifications. Boiler technologies (conventional coal steam, conventional natural gas and oil in the steam-service sector, black liquor and wood boilers, and space-heat service sector) and non-boiler conventional technologies (natural gas non-boiler, oil-fired non-boiler, coal-fired non-boiler technologies and non-boiler primary system costs) are covered in Chapter II. Chapter III, Fossil Energy Technologies, covers atmospheric fluidized-bed combustion, low-Btu gasification of coal, and medium-Btu gasification. Chapter IV, Cogeneration and Self-Generation Technologies, covers the steam service sector, machine-drive service sector, and electrolytic service sector. Solar and geothermal technologies (solar steam, solar space heat, and geothermal steam technologies) are covered in Chapter V, while Chapter VI covers conservation technologies. (MCW)

  17. Clean Coal Technology Programs: Program Update 2003 (Volume 1)

    Energy Technology Data Exchange (ETDEWEB)

    Assistant Secretary for Fossil Energy

    2003-12-01

    Annual report on the Clean Coal Technology Demonstration Program (CCTDP), Power Plant Improvement Initiative (PPII), and Clean Coal Power Initiative (CCPI). The report addresses the roles of the programs, implementation, funding and costs, project descriptions, legislative history, program history, environmental aspects, and project contacts. The project descriptions describe the technology and provide a brief summary of the demonstration results.

  18. Clean Coal Technology Programs: Completed Projects (Volume 2)

    Energy Technology Data Exchange (ETDEWEB)

    Assistant Secretary for Fossil Energy

    2003-12-01

    Annual report on the Clean Coal Technology Demonstration Program (CCTDP), Power Plant Improvement Initiative (PPII), and Clean Coal Power Initiative (CCPI). The report addresses the roles of the programs, implementation, funding and costs, project descriptions, legislative history, program history, environmental aspects, and project contacts. The project descriptions describe the technology and provide a brief summary of the demonstration results.

  19. The 5-Year Outlook on Science and Technology 1981. Source Materials Volume 2.

    Science.gov (United States)

    National Science Foundation, Washington, DC.

    This is the second of two volumes of source documents commissioned by the National Science Foundation in preparing the second 5-Year Outlook on Science and Technology for transmission to the Congress. This volume consists of the views of individuals selected by the Committee on Science, Engineering and Public Policy of the American Association for the Advancement of Science.

  20. Scientific Programming with High Performance Fortran: A Case Study Using the xHPF Compiler

    Directory of Open Access Journals (Sweden)

    Eric De Sturler

    1997-01-01

    Recently, the first commercial High Performance Fortran (HPF) subset compilers have appeared. This article reports on our experiences with the xHPF compiler of Applied Parallel Research, version 1.2, for the Intel Paragon. At this stage, we do not expect very high performance from our HPF programs, even though performance will eventually be of paramount importance for the acceptance of HPF. Instead, our primary objective is to study how to convert large Fortran 77 (F77) programs to HPF such that the compiler generates reasonably efficient parallel code. We report on a case study that identifies several problems when parallelizing code with HPF; most of these problems affect current HPF compiler technology in general, although some are specific to the xHPF compiler. We discuss our solutions from the perspective of the scientific programmer, and present timing results on the Intel Paragon. The case study comprises three programs of different complexity with respect to parallelization. We use the dense matrix-matrix product to show that the distribution of arrays and the order of nested loops significantly influence the performance of the parallel program. We use Gaussian elimination with partial pivoting to study the parallelization strategy of the compiler. There are various ways to structure this algorithm for a particular data distribution. This example shows how much effort may be demanded from the programmer to support the compiler in generating an efficient parallel implementation. Finally, we use a small application to show that the more complicated structure of a larger program may introduce problems for the parallelization, even though all subroutines of the application are easy to parallelize by themselves. The application consists of a finite volume discretization on a structured grid and a nested iterative solver. Our case study shows that it is possible to obtain reasonably efficient parallel programs with xHPF, although the compiler...
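
    The sensitivity to array distribution noted above can be illustrated with a back-of-the-envelope communication count; this calculation is illustrative only and is not taken from the article. For a distributed dense matrix-matrix product C = A*B, a 1-D row-block layout forces each process to fetch far more remote data than a 2-D block layout.

      import math

      def remote_elements_row_block(n: int, p: int) -> float:
          # Each process owns n/p rows of A, B and C; to compute its rows of C
          # it must fetch every row of B that it does not own.
          return (p - 1) / p * n * n

      def remote_elements_2d_block(n: int, p: int) -> float:
          # Processes form a sqrt(p) x sqrt(p) grid, each owning one
          # (n/q) x (n/q) block of A, B and C; it must fetch the rest of its
          # block-row of A and block-column of B.
          q = math.isqrt(p)
          assert q * q == p, "this sketch assumes p is a perfect square"
          block = n // q
          return 2 * (q - 1) * block * block

      n = 4096
      for p in (16, 64):
          ratio = remote_elements_row_block(n, p) / remote_elements_2d_block(n, p)
          print(f"p={p:3d}: 1-D layout fetches {ratio:.1f}x more remote data than 2-D")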

  1. Human choice and climate change. Volume 2: Resources and technology

    Energy Technology Data Exchange (ETDEWEB)

    Rayner, S.; Malone, E.L.

    1997-12-31

    Foreword; Preface; Introduction; The natural science of global climate change; Land and water use; Coastal zones and oceans; Energy and industry; Energy and social systems; Technological change; and Sponsoring organizations, International Advisory Board, and project participants.

  2. TECHNOLOGY EVALUATION REPORT: RETECH'S PLASMA CENTRIFUGAL FURNACE - VOLUME I

    Science.gov (United States)

    A demonstration of the Retech, Inc. Plasma Centrifugal Furnace (PCF) was conducted under the Superfund Innovative Technology Evaluation (SITE) Program at the Department of Energy's (DOE's) Component Development and Integration Facility in Butte, Montana. The furnace uses heat generated...

  3. QuEST: Qualifying Environmentally Sustainable Technologies. Volume 6

    Science.gov (United States)

    Lewis, Pattie

    2011-01-01

    QuEST is a publication of the NASA Technology Evaluation for Environmental Risk Mitigation Principal Center (TEERM). This issue contains brief articles on: Risk Identification and Mitigation; Material Management and Substitution Efforts--Hexavalent Chrome-Free Coatings and Low Volatile Organic Compound (VOC) Coatings, Lead-Free Electronics, and Corn-Based Depainting Media; and Alternative Energy Efforts--Hydrogen Sensors and Solar Air Conditioning. Other TEERM efforts include Energy and Water Management and Remediation Technology Collaboration.

  4. Fault-Tree Compiler

    Science.gov (United States)

    Butler, Ricky W.; Boerschlein, David P.

    1993-01-01

    The Fault-Tree Compiler (FTC) program is a software tool used to calculate the probability of the top event in a fault tree. Gates of five different types are allowed in a fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. The high-level input language is easy to understand and use. In addition, the program supports a hierarchical fault-tree definition feature, which simplifies the tree-description process and reduces execution time. A set of programs was created forming the basis for a reliability-analysis workstation: SURE, ASSIST, PAWS/STEM, and the FTC fault-tree tool (LAR-14586). Written in PASCAL, ANSI-compliant C language, and FORTRAN 77. Other versions are available upon request.
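
    The core calculation performed by such a tool can be sketched as follows; the tree, the probabilities, and the function names are illustrative and are not taken from the FTC program, and the basic events are assumed to be independent.

      from itertools import combinations
      from math import prod

      def p_and(children):            # all children fail
          return prod(children)

      def p_or(children):             # at least one child fails
          return 1.0 - prod(1.0 - p for p in children)

      def p_m_of_n(m, children):      # at least m of the n children fail
          total = 0.0
          n = len(children)
          for k in range(m, n + 1):
              for idx in combinations(range(n), k):
                  # probability that exactly the events in idx fail
                  total += prod(children[i] if i in idx else 1.0 - children[i]
                                for i in range(n))
          return total

      # Top event: (b1 AND b2) OR (2-of-3 over b3, b4, b5)
      basic = {"b1": 1e-3, "b2": 2e-3, "b3": 5e-2, "b4": 5e-2, "b5": 5e-2}
      top = p_or([
          p_and([basic["b1"], basic["b2"]]),
          p_m_of_n(2, [basic["b3"], basic["b4"], basic["b5"]]),
      ])
      print(f"P(top event) = {top:.3e}")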

  5. Compiler Construction Using Java, JavaCC, and Yacc

    CERN Document Server

    Dos Reis, Anthony J

    2012-01-01

    Broad in scope, involving theory, the application of that theory, and programming technology, compiler construction is a moving target, with constant advances in compiler technology taking place. Today, a renewed focus on do-it-yourself programming makes a quality textbook on compilers, that both students and instructors will enjoy using, of even more vital importance. This book covers every topic essential to learning compilers from the ground up and is accompanied by a powerful and flexible software package for evaluating projects, as well as several tutorials, well-defined projects, and test...

  6. Hybrid Propulsion Technology Program, phase 1. Volume 1: Executive summary

    Science.gov (United States)

    1989-01-01

    The study program was contracted to evaluate concepts of hybrid propulsion, select the most optimum, and prepare a conceptual design package. Further, this study required preparation of a technology definition package to identify hybrid propulsion enabling technologies and planning to acquire that technology in Phase 2 and demonstrate that technology in Phase 3. Researchers evaluated two design philosophies for Hybrid Rocket Booster (HRB) selection. The first is an ASRM modified hybrid wherein as many components/designs as possible were used from the present Advanced Solid Rocket Motor (ASRM) design. The second was an entirely new hybrid optimized booster using ASRM criteria as a point of departure, i.e., diameter, thrust time curve, launch facilities, and external tank attach points. Researchers selected the new design based on the logic of optimizing a hybrid booster to provide NASA with a next generation vehicle in lieu of an interim advancement over the ASRM. The enabling technologies for hybrid propulsion are applicable to either, and the vehicle design may be selected at a downstream point (Phase 3) at NASA's discretion. The completion of these studies resulted in ranking the various concepts of boosters from the RSRM to a turbopump fed (TF) hybrid. The scoring resulting from the Figure of Merit (FOM) scoring system clearly shows a natural growth path where the turbopump fed solid liquid staged combustion hybrid provides maximized payload and the highest safety, reliability, and low life cycle costing.

  7. Heat Transfer and Thermodynamics: a Compilation

    Science.gov (United States)

    1974-01-01

    A compilation is presented for the dissemination of information on technological developments which have potential utility outside the aerospace and nuclear communities. Studies include theories and mechanical considerations in the transfer of heat and the thermodynamic properties of matter and the causes and effects of certain interactions.

  8. Second annual clean coal technology conference: Proceedings. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-09

    The Second Annual Clean Coal Technology Conference was held at Atlanta, Georgia, September 7--9, 1993. The Conference, cosponsored by the US Department of Energy (USDOE) and the Southern States Energy Board (SSEB), seeks to examine the status and role of the Clean Coal Technology Demonstration Program (CCTDP) and its projects. The Program is reviewed within the larger context of environmental needs, sustained economic growth, world markets, user performance requirements and supplier commercialization activities. This will be accomplished through in-depth review and discussion of factors affecting domestic and international markets for clean coal technology, the environmental considerations in commercial deployment, the current status of projects, and the timing and effectiveness of transfer of data from these projects to potential users, suppliers, financing entities, regulators, the interested environmental community and the public. Individual papers have been entered separately.

  9. ICSOFT 2006 : First International Conference on Software and Data Technologies, Volume 1

    NARCIS (Netherlands)

    Filipe, Joaquim; Shishkov, Boris; Helfert, Markus

    2006-01-01

    This volume contains the proceedings of the first International Conference on Software and Data Technologies (ICSOFT 2006), organized by the Institute for Systems and Technologies of Information, Communication and Control (INSTICC) in cooperation with the Object Management Group (OMG), sponsored by

  10. Industrial Sector Technology Use Model (ISTUM): industrial energy use in the United States, 1974-2000. Volume 4. Technology appendix. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    1979-10-01

    Volume IV of the ISTUM documentation gives information on the individual technology specifications, but relates closely with Chapter II of Volume I. The emphasis in that chapter is on providing an overview of where each technology fits into the general-model logic. Volume IV presents the actual cost structure and specification of every technology modeled in ISTUM. The first chapter presents a general overview of the ISTUM technology data base. It includes an explanation of the data base printouts and how the separate-cost building blocks are combined to derive an aggregate-technology cost. The remaining chapters are devoted to documenting the specific-technology cost specifications. Technologies included are: conventional technologies (boiler and non-boiler conventional technologies); fossil-energy technologies (atmospheric fluidized bed combustion, low Btu coal and medium Btu coal gasification); cogeneration (steam, machine drive, and electrolytic service sectors); and solar and geothermal technologies (solar steam, solar space heat, and geothermal steam technologies), and conservation technologies.

  11. Hybrid propulsion technology program. Volume 1: Conceptional design package

    Science.gov (United States)

    Jensen, Gordon E.; Holzman, Allen L.; Leisch, Steven O.; Keilbach, Joseph; Parsley, Randy; Humphrey, John

    1989-01-01

    A concept design study was performed to configure two sizes of hybrid boosters; one which duplicates the advanced shuttle rocket motor vacuum thrust time curve and a smaller, quarter thrust level booster. Two sizes of hybrid boosters were configured for either pump-fed or pressure-fed oxygen feed systems. Performance analyses show improved payload capability relative to a solid propellant booster. Size optimization and fuel safety considerations resulted in a 4.57 m (180 inch) diameter large booster with an inert hydrocarbon fuel. The preferred diameter for the quarter thrust level booster is 2.53 m (96 inches). As part of the design study critical technology issues were identified and a technology acquisition and demonstration plan was formulated.

  12. Conventional engine technology. Volume 3: Comparisons and future potential

    Science.gov (United States)

    Dowdy, M. W.

    1981-01-01

    The status of five conventional automobile engine technologies was assessed and the future potential for increasing fuel economy and reducing exhaust emissions was discussed, using the 1980 EPA California emissions standards as a comparative basis. By 1986, the fuel economy of a uniform charge Otto engine with a three-way catalyst is expected to increase 10%, while vehicles with lean burn (fast burn) engines should show a 20% fuel economy increase. Although vehicles with stratified-charge engines and rotary engines are expected to improve, their fuel economy will remain inferior to the other engine types. When adequate NO emissions control methods are implemented to meet the EPA requirements, vehicles with prechamber diesel engines are expected to yield a fuel economy advantage of about 15%. While successful introduction of direct injection diesel engine technology will provide a fuel savings of 30 to 35%, the planned regulation of exhaust particulates could seriously hinder this technology, because it is expected that only the smallest diesel engine vehicles could meet the proposed particulate requirements.

  13. Dual-Use Space Technology Transfer Conference and Exhibition. Volume 2

    Science.gov (United States)

    Krishen, Kumar (Compiler)

    1994-01-01

    This is the second volume of papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies.

  14. Compiler Feedback using Continuous Dynamic Compilation during Development

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Probst, Christian W.

    2014-01-01

    Optimizing compilers are vital for performance. However, the compiler's ability to optimize aggressively is limited in some cases. To address this limitation, we have developed a compiler that guides the programmer in making small source code changes, potentially making the source code more amenable to optimization. This tool can help programmers understand what the optimizing compiler has done and suggest automatic source code changes in cases where the compiler refrains from optimizing. We have integrated our tool into an integrated development environment, interactively giving feedback as part of the programmer's development flow. We have evaluated our preliminary implementation and show it can guide to a 12% improvement in performance. Furthermore, the tool can be used as an interactive optimization adviser improving the performance of the code generated by a production compiler. Here it can lead to a 153...

  15. QuEST: Qualifying Environmentally Sustainable Technologies, Volume 5

    Science.gov (United States)

    Lewis, Pattie

    2010-01-01

    This edition of the QuEST newsletter contains brief articles that discuss the NASA Technology Evaluation for Environmental Risk Mitigation (TEERM) program and the importance of collaboration; efforts in materials management and substitution, including coatings for launch structures, low volatile organic compound (VOC) coatings field testing, non-chrome coating systems, life cycle corrosion testing, lead-free electronics testing, and corn-based depainting; efforts in pollution control, in the area of hypergolic propellant destruction evaluation; efforts in the development of alternative energy, in particular hydrogen sensors and energy and water management; and efforts in remediation, specifically the removal of polychlorinated biphenyl (PCB) contamination.

  16. QuEST: Qualifying Environmentally Sustainable Technologies. Volume 4

    Science.gov (United States)

    Lewis, Pattie L.

    2009-01-01

    In 2004, in one of their first collaborative efforts, the Centro Para Prevencao da Poluicao (Portuguese Center for Pollution Prevention, or C3P) teamed with the Technology Evaluation for Environmental Risk Mitigation Principal Center (TEERM) and two Portuguese entities, TAP Portugal (the Portuguese national airline) and OGMA Indústria Aeronáutica de Portugal (the Portuguese aeronautics industry), to target the reduction of hexavalent chromium, cadmium, and volatile organic compounds (VOCs) in aircraft maintenance operations. This project focused on two coating systems that utilize non-chrome pretreatments and low-VOC primers and topcoats.

  17. Cogeneration technology alternatives study. Volume 2: Industrial process characteristics

    Science.gov (United States)

    1980-01-01

    Information and data for 26 industrial processes are presented. The following information is given for each process: (1) a description of the process including the annual energy consumption and product production and plant capacity; (2) the energy requirements of the process for each unit of production and the detailed data concerning electrical energy requirements and also hot water, steam, and direct fired thermal requirements; (3) anticipated trends affecting energy requirements with new process or production technologies; and (4) representative plant data including capacity and projected requirements through the year 2000.

  18. Soft Computing in Information Communication Technology Volume 2

    CERN Document Server

    2012-01-01

    This book is a collection of the accepted papers concerning soft computing in information communication technology. The resulting dissemination of the latest research results, and the exchange of views concerning future research directions in this field, make the work of immense value to all those having an interest in the topics covered. The present book represents a cooperative effort to seek out the best strategies for effecting improvements in the quality and reliability of Fuzzy Logic, Machine Learning, Cryptography, Pattern Recognition, Bioinformatics, Biomedical Engineering, and related advancements in ICT.

  19. A pointer logic and certifying compiler

    Institute of Scientific and Technical Information of China (English)

    CHEN Yiyun; GE Lin; HUA Baojian; LI Zhaopeng; LIU Cheng; WANG Zhifang

    2007-01-01

    Proof-Carrying Code brings two big challenges to the research field of programming languages. One is to seek more expressive logics or type systems to specify or reason about the properties of low-level or high-level programs. The other is to study the technology of certifying compilation, in which the compiler generates proofs for programs with annotations. This paper presents our progress in the above two aspects. A pointer logic was designed for PointerC (a C-like programming language) in our research. As an extension of Hoare logic, our pointer logic expresses the change of pointer information for each statement in its inference rules to support program verification. Meanwhile, based on the ideas from CAP (Certified Assembly Programming) and SCAP (Stack-based Certified Assembly Programming), a reasoning framework was built to verify the properties of object code in a Hoare style, and a certifying compiler prototype for PointerC was implemented based on this framework. The main contribution of this paper is the design of the pointer logic and the implementation of the certifying compiler prototype. In our certifying compiler, the source language contains rich pointer types and operations and also supports dynamic storage allocation and deallocation.
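
    For orientation only (the PointerC rules themselves are given in the paper, not here), a pointer-aware Hoare-style rule can be sketched in the manner of separation logic, where the assertion p ↦ v records that pointer p points to a cell holding v. The mutation rule below is a generic illustration of how an assignment through a pointer updates the pointer information in the postcondition; it is not the PointerC rule.

        % Illustrative mutation rule (separation-logic style), not the PointerC rule:
        \{\, p \mapsto \_ \,\} \;\; *p := e \;\; \{\, p \mapsto e \,\}

    Read: if p points to some cell before the store, then after executing *p := e it points to a cell containing the value of e.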

  20. Technology Directions for the 21st Century. Volume 3

    Science.gov (United States)

    Crimi, Giles F.; Botta, Robert; Ditanna, Thomas; Verheggen, Henry; Stancati, Michael; Feingold, Harvey; Jacobs, Mark

    1996-01-01

    New technologies will unleash the huge capacity of fiber-optic cable to meet growing demands for bandwidth. Companies will continue to replace private networks with public network bandwidth-on-demand. Although asynchronous transfer mode (ATM) is the transmission technology favored by many, its penetration will be slower than anticipated. Hybrid networks - e.g., a mix of ATM, frame relay, and fast Ethernet - may predominate, both as interim and long-term solutions, based on factors such as availability, interoperability, and cost. Telecommunications equipment and services prices will decrease further due to increased supply and more competition. Explosive Internet growth will continue, requiring additional backbone transmission capacity and enhanced protocols, but it is not clear who will fund the upgrade. Within ten years, space-based constellations of satellites in Low Earth orbit (LEO) will serve mobile users employing small, low-power terminals. 'Little LEO's' will provide packet transmission services and geo-position determination. 'Big LEO's' will function as global cellular telephone networks, with some planning to offer video and interactive multimedia services. Geosynchronous satellites also are proposed for mobile voice grade links and high-bandwidth services. NASA may benefit from resulting cost reductions in components, space hardware, launch services, and telecommunications services.

  1. Voyager Outreach Compilation

    Science.gov (United States)

    1998-01-01

    This NASA JPL (Jet Propulsion Laboratory) video presents a collection of the best videos that have been published of the Voyager mission. Computer animation/simulations comprise the largest portion of the video and include outer planetary magnetic fields, outer planetary lunar surfaces, and the Voyager spacecraft trajectory. Voyager visited the four outer planets: Jupiter, Saturn, Uranus, and Neptune. The video contains some live shots of Jupiter (actual), the Earth's moon (from orbit), Saturn (actual), Neptune (actual) and Uranus (actual), but is mainly comprised of computer animations of these planets and their moons. Some of the individual short videos that are compiled are entitled: The Solar System; Voyage to the Outer Planets; A Tour of the Solar System; and the Neptune Encounter. Computerized simulations of Viewing Neptune from Triton, Diving over Neptune to Meet Triton, and Catching Triton in its Retrograde Orbit are included. Several animations of Neptune's atmosphere, rotation and weather features as well as significant discussion of the planet's natural satellites are also presented.

  2. Cogeneration Technology Alternatives Study (CTAS). Volume 3: Industrial processes

    Science.gov (United States)

    Palmer, W. B.; Gerlaugh, H. E.; Priestley, R. R.

    1980-01-01

    Cogenerating electric power and process heat in single energy conversion systems, rather than separately in utility plants and in process boilers, is examined in terms of cost savings. The use of various advanced energy conversion systems is examined; the systems are compared with each other and with current technology systems for their savings in fuel energy, costs, and emissions, both in individual plants and on a national level. About fifty industrial processes from the target energy-consuming sectors were used as a basis for matching a similar number of candidate energy conversion systems that could be made available in the 1985 to 2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, Stirling, closed cycle and steam injected gas turbines, and fuel cells. Fuels considered were coal, both coal- and petroleum-based residual and distillate liquid fuels, and low-Btu gas obtained through the on-site gasification of coal. An attempt was made to use consistent assumptions and a consistent set of ground rules specified by NASA for determining performance and cost. Data and narrative descriptions of the industrial processes are given.

  3. Assessment of control technology for stationary sources. Volume I: technical discussion. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Minicucci, D.; Herther, M.; Babb, L.; Kuby, W.

    1980-02-01

    The purpose of this project was to develop a reference document for use by the Air Resources Board, local air pollution control districts, and the U.S. Environmental Protection Agency that describes technological options available for the control of emissions from stationary sources located in California. Control technologies were examined for 10 industry groups and six air pollutants. Volume I, Technical Discussion, includes an overall introduction to the project, descriptions of its major elements, background information for each industry group addressed, and the project bibliography. In Volume II, Control Technology Data Tables, qualitative descriptions of control options for the various sources and quantitative information on control technology cost, efficiency, reliability, energy consumption, other environmental impacts, and application status are presented in tabular format. Also included is a code list that classifies the stationary sources examined by industry, process and emission source.

  4. Environmental control implications of generating electric power from coal. Technology status report. Volume II

    Energy Technology Data Exchange (ETDEWEB)

    None

    1976-12-01

    This is the first in a series of reports evaluating environmental control technologies applicable to the coal-to-electricity process. The technologies are described and evaluated from an engineering and cost perspective based upon the best available information obtained from utility experience and development work in progress. Environmental control regulations and the health effects of pollutants are also reviewed. Emphasis is placed primarily upon technologies that are now in use. For SO/sub 2/ control, these include the use of low sulfur coal, cleaned coal, or flue-gas desulfurization systems. Electrostatic precipitators and fabric filters used for the control of particulate matter are analyzed, and combustion modifications for NO/sub x/ control are described. In each area, advanced technologies still in the development stage are described briefly and evaluated on the basis of current knowledge. Fluidized-bed combustion (FBC) is a near-term technology that is discussed extensively in the report. The potential for control of SO/sub 2/ and NO/sub x/ emissions by use of FBC is analyzed, as are the resulting solid waste disposal problems, cost estimates, and its potential applicability to electric utility systems. Volume II presents the detailed technology analyses complete with reference citations. This same material is given in condensed form in Volume I without references. A brief executive summary is also given in Volume I.

  5. Emerging Communication Technologies (ECT) Phase 2 Report. Volume 1; Main Report

    Science.gov (United States)

    Bastin, Gary L.; Harris, William G.; Chiodini, Robert; Nelson, Richard A.; Huang, PoTien; Kruhm, David A.

    2003-01-01

    The Emerging Communication Technology (ECT) project investigated three First Mile communication technologies in support of NASA's Second Generation Reusable Launch Vehicle (2nd Gen RLV), Orbital Space Plane, the Advanced Range Technology Working Group (ARTWG), and the Advanced Spaceport Technology Working Group (ASTWG). These First Mile technologies have the purpose of interconnecting mobile users with existing Range Communication infrastructures. ECT was a continuation of the Range Information System Management (RISM) task started in 2002. RISM identified the three advanced communication technologies investigated under ECT: Wireless Ethernet (Wi-Fi), Free Space Optics (FSO), and Ultra Wideband (UWB). Due to the report's size, it has been broken into three volumes: 1) Main Report, 2) Appendices, and 3) UWB.

  6. Research on surveying technology applied for DTM modelling and volume computation in open pit mines

    Directory of Open Access Journals (Sweden)

    Jaroslaw Wajs

    2016-01-01

    The spatial information systems of a mining company can be used for monitoring of mining activity, excavation planning, calculation of ore volume, and decision making. Nowadays, the data base has to be updated from sources such as surveying positioning technologies and remotely sensed photogrammetric data. The presented paper contains a review of the methodology for digital terrain model (DTM) modelling and for obtaining data from surveying technologies in an open pit mine or quarry. The paper reviews the application of GPS, total station measurements, and ground photogrammetry for the volume accuracy assessment of a selected object. The testing field was situated in the Belchatow lignite open pit mine, where a suitable object was selected: a test layer of the coal seam located in the excavation area of the 8th pit sidewall. The data were acquired twice within a one-month period, in connection with the monthly DTM update of the excavation. The paper presents the technological process and the results of research on using digital photogrammetry for opencast mining purposes, in the scope of numerical volume computation and monitoring of the mine by comparison of different sources. The results show that the presented workflow allows the DTM to be built both manually and from remotely sensed data, and the accuracy assessment was carried out by way of the volume computation. Major advantages of the techniques are presented, illustrating how terrestrial photogrammetry provides rapid spatial measurements of 3D breakline data used for volume calculation.
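
    A common way to obtain such volumes from two DTM epochs is grid differencing: subtract co-registered elevation rasters and sum the cell-wise height differences times the cell area. The snippet below is only a minimal sketch of that idea, assuming both surveys have been interpolated onto the same regular grid; it is not the workflow software used in the study.

        import numpy as np

        def excavation_volumes(dtm_before, dtm_after, cell_size):
            """Estimate cut and fill volumes between two co-registered DTM rasters
            by summing cell-wise elevation differences (units follow cell_size)."""
            diff = dtm_before - dtm_after      # positive where material was removed
            diff = np.nan_to_num(diff)         # ignore cells missing in either epoch
            cell_area = cell_size ** 2
            cut = np.clip(diff, 0.0, None).sum() * cell_area     # excavated volume
            fill = np.clip(-diff, 0.0, None).sum() * cell_area   # deposited volume
            return cut, fill

        # Hypothetical usage with two monthly surveys resampled to a 1 m grid:
        # cut, fill = excavation_volumes(dtm_july, dtm_august, cell_size=1.0)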

  7. Animal Science Technology. An Experimental Developmental Program. Volume II, Curriculum Course Outlines.

    Science.gov (United States)

    Brant, Herman G.

    This volume, the second of a two part evaluation report, is devoted exclusively to the presentation of detailed course outlines representing an Animal Science Technology curriculum. Arranged in 6 terms of study (2 academic years), outlines are included on such topics as: (1) Introductory Animal Science, (2) General Microbiology, (3) Zoonoses, (4)…

  8. Animal Science Technology. An Experimental Developmental Program. Volume I, Report of the Developmental Program.

    Science.gov (United States)

    Brant, Herman G.; And Others

    In 1961, administrative personnel at Delhi College in New York observed that formal training programs for animal science technicians were virtually nonexistent. Response to this apparent need resulted in the initiation of perhaps the first 2-year Animal Science Technology Program in the nation. This two-volume report is the result of an extensive…

  9. Compilation of 1986 annual reports of the Navy ELF (extremely low frequency) communications system ecological-monitoring program. Volume 1. Tabs A-C. Annual progress report, January-December 1986

    Energy Technology Data Exchange (ETDEWEB)

    1987-07-01

    This is the fifth compilation of annual reports for the Navy's ELF Communications System Ecological Monitoring Program. This report documents the progress of the following studies: Herbaceous Plant Cover and Tree Studies; Litter Decomposition and Microflora; and The Effects of Exposing the Slime Mold Physarum Polycephalum to Electromagnetic Fields.

  10. Oak Ridge K-25 Site Technology Logic Diagram. Volume 3, Technology evaluation data sheets; Part B, Remedial action, robotics/automation, waste management

    Energy Technology Data Exchange (ETDEWEB)

    Fellows, R.L. [ed.

    1993-02-26

    The Oak Ridge K-25 Technology Logic Diagram (TLD), a decision support tool for the K-25 Site, was developed to provide a planning document that relates environmental restoration (ER) and waste management (WM) problems at the Oak Ridge K-25 Site. The TLD technique identifies the research necessary to develop these technologies to a state that allows for technology transfer and application to waste management, remediation, decontamination, and decommissioning activities. The TLD consists of four separate volumes: Vol. 1, Vol. 2, Vol. 3A, and Vol. 3B. Volume 1 provides introductory and overview information about the TLD. Volume 2 contains logic diagrams. Volume 3 has been divided into two separate volumes to facilitate handling and use. This Volume 3B provides the Technology Evaluation Data Sheets (TEDS) for ER/WM activities (remedial action, robotics and automation, and waste management) that are referenced by a TEDS code number in Vol. 2 of the TLD. Each of these sheets represents a single logic trace across the TLD. These sheets contain more detail than is provided for each technology in Vol. 2. The TEDS are arranged alphanumerically by the TEDS code number in the upper right corner of each data sheet. Volume 3 can be used in two ways: (1) technologies that are identified from Vol. 2 can be referenced directly in Vol. 3 by using the TEDS codes, and (2) technologies and general technology areas (alternatives) can be located in the index in the front of this volume.

  11. Distributed technologies in California's energy future: A preliminary report. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, M.; Craig, P.; McGuire, C.B.; Simmons, M. (eds.)

    1977-09-01

    The chapters in Volume 2 of Distributed Energy Systems in California's Future are: Environmental Impacts of Alternative Energy Technologies for California; Land Use Configurations and the Utilization of Distributive Energy Technology; Land Use Implications of a Dispersed Energy Path; Belief, Behavior, and Technologies as Driving Forces in Transitional Stages--The People Problem in Dispersed Energy Futures; Development of an Energy Attitude Survey; Interventions to Influence Firms Toward the Adoption of "Soft" Energy Technology; The Entry of Small Firms into Distributed Technology Energy Industries; Short-Term Matching of Supply and Demand in Electrical Systems with Renewable Sources; Vulnerability of Renewable Energy Systems; and District Heating for California.

  12. Oil and gas technology transfer activities and potential in eight major producing states. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    1993-07-01

    In 1990, the Interstate Oil and Gas Compact Commission (the Compact) performed a study that identified the structure and deficiencies of the system by which oil and gas producers receive information about the potential of new technologies and communicate their problems and technology needs back to the research community. The conclusions of that work were that major integrated companies have significantly more and better sources of technology information than independent producers. The majors also have significantly better mechanisms for communicating problems to the research and development (R&D) community. As a consequence, the Compact recommended analyzing potential mechanisms to improve technology transfer channels for independents and to accelerate independents' acceptance and use of existing and emerging technologies. Building on this work, the Compact, with a grant from the US Department of Energy, has reviewed specific technology transfer organizations in each of eight major oil-producing states to identify specific R&D and technology transfer organizations, characterize their existing activities, and identify potential future activities that could be performed to enhance technology transfer to oil and gas producers. The profiles were developed based on information received from the organizations, follow-up interviews, site visits and conversations, and participation in their sponsored technology transfer activities. The results of this effort are reported in this volume. In addition, the Compact has also developed a framework for the development of evaluation methodologies to determine the effectiveness of technology transfer programs in performing their intended functions and in achieving desired impacts in the producing community. The results of that work are provided in a separate volume.

  13. Using Task Data in Diagnostic Radiology. Research Report No. 8. Volume 2. Curriculum Objectives for Radiologic Technology.

    Science.gov (United States)

    Gilpatrick, Eleanor; Gullion, Christina

    This volume is the result of the application of the Health Services Mobility Study (HSMS) curriculum design method in radiologic technology and is presented in conjunction with volume 1, which reports the task analysis results. Volume 2 contains job-related behavioral curriculum objectives for the aide, technician, and technologist levels in…

  14. Slope excavation quality assessment and excavated volume calculation in hydraulic projects based on laser scanning technology

    Directory of Open Access Journals (Sweden)

    Chao Hu

    2015-04-01

    Slope excavation is one of the most crucial steps in the construction of a hydraulic project. Excavation project quality assessment and excavated volume calculation are critical in construction management. The positioning of excavation projects using traditional instruments is inefficient and may cause error. To improve the efficiency and precision of calculation and assessment, three-dimensional laser scanning technology was used for slope excavation quality assessment. An efficient data acquisition, processing, and management workflow was presented in this study. Based on the quality control indices, including the average gradient, slope toe elevation, and overbreak and underbreak, cross-sectional and holistic quality assessment methods were proposed to assess the slope excavation quality with laser-scanned data. An algorithm was also presented to calculate the excavated volume from laser-scanned data. A field application and a laboratory experiment were carried out to verify the feasibility of these methods for excavation quality assessment and excavated volume calculation. The results show that the quality assessment indices can be obtained rapidly and accurately from design parameters and scanned data, and that the results of holistic quality assessment are consistent with those of cross-sectional quality assessment. In addition, the time consumed in excavation project quality assessment with laser scanning technology can be reduced by 70%-90%, as compared with the traditional method. The excavated volume calculated from the scanned data differs only slightly from measured data, demonstrating the applicability of the excavated volume calculation method presented in this study.
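
    For context, a widely used baseline for computing excavated volume from successive cross-sections (against which a scan-based algorithm is typically checked) is the average end-area method; the study's own algorithm is not reproduced here.

        % Average end-area method: A_i is the cut area of cross-section i,
        % d_i the spacing between cross-sections i and i+1
        V \approx \sum_{i=1}^{n-1} \frac{A_i + A_{i+1}}{2}\, d_i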

  15. 5. annual clean coal technology conference: powering the next millennium. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Fifth Annual Clean Coal Technology Conference focuses on presenting strategies and approaches that will enable clean coal technologies to resolve the competing, interrelated demands for power, economic viability, and environmental constraints associated with the use of coal in the post-2000 era. The program addresses the dynamic changes that will result from utility competition and industry restructuring, and the evolution of markets abroad. Current projections for electricity highlight the preferential role that electric power will have in accomplishing the long-range goals of most nations. Increased demand can be met by utilizing coal in technologies that achieve environmental goals while keeping the cost per unit of energy competitive. Results from projects in the DOE Clean Coal Technology Demonstration Program confirm that technology is the pathway to achieving these goals. The industry/government partnership, cemented over the past 10 years, is focused on moving the clean coal technologies into the domestic and international marketplaces. The Fifth Annual Clean Coal Technology Conference provides a forum to discuss these benchmark issues and the essential role and need for these technologies in the post-2000 era. This volume contains technical papers on: advanced coal process systems; advanced industrial systems; advanced cleanup systems; and advanced power generation systems. In addition, there are poster session abstracts. Selected papers from this proceedings have been processed for inclusion in the Energy Science and Technology database.

  16. A Novel Technology for Measurements of Dielectric Properties of Extremely Small Volumes of Liquids

    Directory of Open Access Journals (Sweden)

    Wei-Na Liu

    2016-01-01

    A high sensitivity sensor for measurement of the radio frequency (RF) dielectric permittivity of liquids is described. Interference is used and parasitic effects are cancelled, which enables the sensor to catch weak signals caused by liquids of extremely small volumes. In addition, we present the relationship between the transmission coefficient and the permittivity of liquids under test (LUT). Using this sensor, quantitative measurements of the dielectric properties of LUTs at 5.8 GHz are demonstrated. Experiments show that the proposed method requires a liquid volume of only 160 nanoliters. Therefore, the technology can be used for RF spectroscopic analysis of biological samples and extremely precious liquids.

  17. Oak Ridge National Laboratory Technology Logic Diagram. Volume 1, Technology Evaluation: Part A, Decontamination and Decommissioning

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The Strategic Roadmap for the Oak Ridge Reservation is a generalized planning document that identifies broad categories of issues that keep ORNL outside full compliance with the law and other legally binding agreements. Possible generic paths to compliance, the issues, and the schedule for resolution of the issues are identified. The role of the Oak Ridge National Laboratory Technology Logic Diagram (TLD) is then to identify specific site issues (problems), identify specific technologies that can be brought to bear on the issues, and assess the current status and readiness of these remediation technologies within the constraints of the schedule commitment. Regulatory requirements and commitments contained in the Strategic Roadmap for the Oak Ridge Reservation are also included in the TLD as constraints to the application of immature technological solutions. Some otherwise attractive technological solutions may not be employed because they may not be deployable on the schedule enumerated in the regulatory agreements. The roadmap for ORNL includes a list of 46 comprehensive logic diagrams for WM of low-level, radioactive mixed, hazardous, sanitary and industrial, and TRU waste. The roadmapping process compares the installation as it exists with the installation as it should exist under full compliance. The identification of the issues is the goal of roadmapping; this allows accurate and timely formulation of activities.

  18. Y-12 Plant Remedial Action technology logic diagram. Volume I: Technology evaluation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    The Y-12 Plant Remedial Action Program addresses remediation of the contaminated groundwater, surface water, and soil in the following areas located on the Oak Ridge Reservation: Chestnut Ridge; Bear Creek Valley; the Upper and Lower East Fork Poplar Creek watersheds; CAPCA 1, which includes several areas in which remediation has been completed; and CAPCA 2, which includes dense nonaqueous phase liquid wells and a storage facility. There are many facilities within these areas that are contaminated by uranium, mercury, organics, and other materials. This Technology Logic Diagram identifies possible remediation technologies that can be applied to the soil, water, and contaminants for characterization, treatment, and waste management. These technology options are supplemented by the identification of possible robotics or automation technologies, which would facilitate the cleanup effort by improving the safety of remediation, improving the final remediation product, or decreasing the remediation cost. The Technology Logic Diagram was prepared by a diverse group of more than 35 scientists and engineers from across the Oak Ridge Reservation, most of whom are specialists in the areas of their contributions. 22 refs., 25 tabs.

  19. Algorithmic synthesis using Python compiler

    Science.gov (United States)

    Cieszewski, Radoslaw; Romaniuk, Ryszard; Pozniak, Krzysztof; Linczuk, Maciej

    2015-09-01

    This paper presents a Python-to-VHDL compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and translates it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the programmed circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. This can be achieved by using many computational resources at the same time. Creating parallel programs implemented in FPGAs in pure HDL is difficult and time consuming. Using a higher level of abstraction and a High-Level Synthesis compiler, implementation time can be reduced. The compiler has been implemented using the Python language. This article describes the design, implementation, and results of the created tool.
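
    The paper does not list the exact Python subset supported, but the flavour of input that a Python-to-VHDL flow accepts can be suggested with a small, purely algorithmic function; the FIR filter below is a hypothetical example whose independent multiply-accumulate operations are the kind of parallelism an HLS compiler could unroll and pipeline in FPGA logic.

        def fir_filter(samples, coeffs):
            """Each output is the dot product of the coefficient vector with a
            sliding window of the input; the inner products are independent."""
            n = len(coeffs)
            out = []
            for i in range(len(samples) - n + 1):
                acc = 0
                for j in range(n):
                    acc += coeffs[j] * samples[i + j]
                out.append(acc)
            return out

        # fir_filter([1, 2, 3, 4, 5], [1, 0, -1]) -> [-2, -2, -2]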

  20. Theory and practice of compilation

    CERN Document Server

    Langmaack, H

    1972-01-01

    Compilation is the translation of high level language programs into machine code. Correct translation can only be achieved if syntax and semantics of programming languages are clearly defined and strictly obeyed by compiler constructors. The author presents a simple extendable scheme for syntax and semantics to be defined rigorously. This scheme fits many programming languages, especially ALGOL-like ones. The author considers statements and programs to be notations of state transformations; in special cases storage state transformations. (5 refs).

  1. Experiences with Compiler Support for Processors with Exposed Pipelines

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Schleuniger, Pascal; Hindborg, Andreas Erik;

    2015-01-01

    Field programmable gate arrays, FPGAs, have become an attractive implementation technology for a broad range of computing systems. We recently proposed a processor architecture, Tinuso, which achieves high performance by moving complexity from hardware to the compiler tool chain. This means that the compiler tool chain must handle the increased complexity. However, it is not clear if current production compilers can successfully meet the strict constraints on instruction order and generate efficient object code. In this paper, we present our experiences developing a compiler backend using the GNU Compiler Collection, GCC. For a set of C benchmarks, we show that a Tinuso implementation with our GCC backend reaches a relative speedup of up to 1.73 over a similar Xilinx MicroBlaze configuration while using 30% fewer hardware resources. While our experiences are generally positive, we expose some...

  2. Proceedings of the environmental technology through industry partnership conference. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kothari, V.P.

    1995-10-01

    The overall objective of this conference was to review the latest environmental and waste management technologies being developed under the sponsorship of METC. The focus of this conference was also to address the accomplishments and barriers affecting the private sector and to lay the groundwork for future technology development initiatives and opportunities. Twenty-six presentations were given in: Mixed waste characterization, treatment, and disposal; Contaminant plume containment and remediation; and Decontamination and decommissioning. In addition there were 10 Focus Area presentations, 31 poster papers covering all Focus Areas, and two panel discussions on: Mixed waste characterization, treatment, and disposal issues; and the Application, evaluation, and acceptance of in-situ and ex-situ plume remediation technologies. Volume 2 contains 16 papers in a poster session and 8 papers in the contaminant plume containment and remediation and landfill stabilization Focus Areas. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  3. Western oil-shale development: a technology assessment. Volume 2: technology characterization and production scenarios

    Energy Technology Data Exchange (ETDEWEB)

    1982-01-01

    A technology characterization of processes that may be used in the oil shale industry is presented. The six processes investigated are TOSCO II, Paraho Direct, Union B, Superior, Occidental MIS, and Lurgi-Ruhrgas. A scenario of shale oil production to the 300,000 BPD level by 1990 is developed. (ACR)

  4. Fluid Vessel Quantity Using Non-invasive PZT Technology Flight Volume Measurements Under Zero G Analysis

    Science.gov (United States)

    Garofalo, Anthony A

    2013-01-01

    The purpose of the project is to perform analysis of the Systems Engineering Educational Discovery (SEED) program data from the 2011 and 2012 Fluid Vessel Quantity using Non-Invasive PZT Technology flight volume measurements under Zero G conditions (parabolic plane flight data). The project also included experimental planning and lab work for future sub-orbital experiments using the NASA PZT technology for fluid volume measurement. Along with conducting data analysis of the flight data, I also did a variety of other tasks: I provided the lab with detailed technical drawings, experimented with 3D printers, made changes to the liquid nitrogen skid schematics, and learned how to weld. I also programmed microcontrollers to interact with various sensors and helped with other things going on around the lab.

  5. Fluid Vessel Quantity using Non-Invasive PZT Technology Flight Volume Measurements Under Zero G Analysis

    Science.gov (United States)

    Garofalo, Anthony A.

    2013-01-01

    The purpose of the project is to perform analysis of the Systems Engineering Educational Discovery (SEED) program data from the 2011 and 2012 Fluid Vessel Quantity using Non-Invasive PZT Technology flight volume measurements under Zero G conditions (parabolic plane flight data). The project also included experimental planning and lab work for future sub-orbital experiments using the NASA PZT technology for fluid volume measurement. Along with conducting data analysis of the flight data, I also did a variety of other tasks: I provided the lab with detailed technical drawings, experimented with 3D printers, made changes to the liquid nitrogen skid schematics, and learned how to weld. I also programmed microcontrollers to interact with various sensors and helped with other things going on around the lab.

  6. US long distance fiber optic networks: Technology, evolution and advanced concepts. Volume 1: Executive summary

    Science.gov (United States)

    1986-01-01

    Over the past two decades, fiber optics has emerged as a highly practical and cost-efficient communications technology. Its competitiveness vis-a-vis other transmission media, especially satellite, has become a critical question. This report studies the likely evolution and application of fiber optic networks in the United States to the end of the century. The outlook for the technology of fiber systems is assessed and forecast, scenarios of the evolution of fiber optic network development are constructed, and costs to provide service are determined and examined parametrically as a function of network size and traffic carried. Volume 1 consists of the Executive Summary. Volume 2 focuses on fiber optic technology and long distance fiber optic networks. Volume 3 develops a traffic and financial model of a nationwide long distance transmission network. Among the study's most important conclusions are: revenue requirements per circuit for LATA-to-LATA fiber optic links are less than one cent per call minute; multiplex equipment, which is likely to be required in any competing system, is the largest contributor to circuit costs; the potential capacity of fiber optic cable is very large and as yet undefined; and fiber optic transmission combined with other network optimization schemes can lead to even lower costs than those identified in this study.

  7. Annual Science and Engineering Technology Conference/DOD Technology Exposition (7th). Volume 2. Wednesday - Thursday

    Science.gov (United States)

    2006-04-20


  8. TUNE: Compiler-Directed Automatic Performance Tuning

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Mary [University of Utah

    2014-09-18

    This project has developed compiler-directed performance tuning technology targeting the Cray XT4 Jaguar system at Oak Ridge, which has multi-core Opteron nodes with SSE-3 SIMD extensions, and the Cray XE6 Hopper system at NERSC. To achieve this goal, we combined compiler technology for model-guided empirical optimization for memory hierarchies with SIMD code generation, which have been developed by the PIs over the past several years. We examined DOE Office of Science applications to identify performance bottlenecks and apply our system to computational kernels that operate on dense arrays. Our goal for this performance-tuning technology has been to yield hand-tuned levels of performance on DOE Office of Science computational kernels, while allowing application programmers to specify their computations at a high level without requiring manual optimization. Overall, we aim to make our technology for SIMD code generation and memory hierarchy optimization a crucial component of high-productivity Petaflops computing through a close collaboration with the scientists in national laboratories.
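
    The TUNE toolchain itself is not reproduced here, but the core loop of model-guided empirical optimization can be sketched generically: a performance model prunes the space of transformation parameters, and the remaining variants are timed on the target machine to pick the best. The toy autotuner below is a minimal sketch of that empirical search, assuming the variants differ only in a blocking (tile-size) parameter.

        import time
        import numpy as np

        def blocked_matmul(A, B, tile):
            """Tiled matrix multiply; the tile size changes cache behaviour."""
            n = A.shape[0]
            C = np.zeros((n, n))
            for ii in range(0, n, tile):
                for jj in range(0, n, tile):
                    for kk in range(0, n, tile):
                        C[ii:ii+tile, jj:jj+tile] += (
                            A[ii:ii+tile, kk:kk+tile] @ B[kk:kk+tile, jj:jj+tile])
            return C

        def autotune(candidate_tiles, n=256):
            """Empirically pick the fastest tile size on this machine."""
            A, B = np.random.rand(n, n), np.random.rand(n, n)
            best_tile, best_time = None, float("inf")
            for tile in candidate_tiles:   # in a real system a model prunes this list
                start = time.perf_counter()
                blocked_matmul(A, B, tile)
                elapsed = time.perf_counter() - start
                if elapsed < best_time:
                    best_tile, best_time = tile, elapsed
            return best_tile

        # autotune([16, 32, 64, 128])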

  9. Memory management and compiler support for rapid recovery from failures in computer systems

    Science.gov (United States)

    Fuchs, W. K.

    1991-01-01

    This paper describes recent developments in the use of memory management and compiler technology to support rapid recovery from failures in computer systems. The techniques described include cache coherence protocols for user transparent checkpointing in multiprocessor systems, compiler-based checkpoint placement, compiler-based code modification for multiple instruction retry, and forward recovery in distributed systems utilizing optimistic execution.

  10. Idaho National Engineering Laboratory Waste Area Groups 1-7 and 10 Technology Logic Diagram. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    O`Brien, M.C.; Meservey, R.H.; Little, M.; Ferguson, J.S.; Gilmore, M.C.

    1993-09-01

    The Idaho National Engineering Laboratory (INEL) Technology Logic Diagram (TLD) was developed to provide a decision support tool that relates Environmental Restoration (ER) and Waste Management (WM) problems at the INEL to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to an environmental restoration need. It is essential that follow-on engineering and system studies be conducted to build on the output of this project. These studies will begin by selecting the most promising technologies identified in this TLD and finding an optimum mix of technologies that will provide a socially acceptable balance between cost and risk to meet the site windows of opportunity. The TLD consists of three separate volumes: Volume I includes the purpose and scope of the TLD, a brief history of the INEL Waste Area Groups, and environmental problems they represent. A description of the TLD, definitions of terms, a description of the technology evaluation process, and a summary of each subelement, is presented. Volume II (this volume) describes the overall layout and development of the TLD in logic diagram format. This section addresses the environmental restoration of contaminated INEL sites. Specific INEL problem areas/contaminants are identified along with technology solutions, the status of the technologies, precise science and technology needs, and implementation requirements. Volume III provides the Technology Evaluation Data Sheets (TEDS) for Environmental Restoration and Waste Management (EM) activities that are referenced by a TEDS codenumber in Volume II. Each of these sheets represents a single logic trace across the TLD. These sheets contain more detail than provided for technologies in Volume II.

  11. Enhancing economic competitiveness of dish Stirling technology through production volume and localization: Case study for Morocco

    Science.gov (United States)

    Larchet, Kevin; Guédez, Rafael; Topel, Monika; Gustavsson, Lars; Machirant, Andrew; Hedlund, Maria-Lina; Laumert, Björn

    2017-06-01

    The present study quantifies the reduction in the levelized cost of electricity (LCoE) and capital expenditure (CAPEX) of a dish Stirling power plant (DSPP) achievable through increased localization and unit production volume. Furthermore, the localization value of the plant is examined to determine how much of the investment is brought into the local economy. Ouarzazate, Morocco, was chosen as the location of the study due to the country's favorable regulatory framework with regard to solar power technologies and its established industry in the concentrating solar power (CSP) field. A detailed techno-economic model of a DSPP was developed using KTH's in-house modelling tool DYESOPT, which allows power plant evaluation by means of technical and economic performance indicators. Results on the basis of LCoE and CAPEX were compared between two different cases of production volume, examining both a minimum and a maximum level of localization. Thereafter, the DSPP LCoE and localization value were compared against competing solar technologies to evaluate its competitiveness. In addition, a sensitivity analysis was conducted around key design parameters. The study confirms that the LCoE of a DSPP can be reduced to values similar to solar photovoltaic (PV) and lower than other CSP technologies. Furthermore, the investment in the local economy is far greater when compared to PV and of the same magnitude as other CSP technologies. The competitiveness of a DSPP has the potential to increase further when coupled with thermal energy storage (TES), which is currently under development.
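
    For reference, the LCoE figure of merit used in such comparisons is conventionally the ratio of discounted lifetime costs to discounted lifetime electricity production (the detailed cost breakdown applied inside DYESOPT is not reproduced here):

        % C_t: total cost in year t (investment plus operation and maintenance)
        % E_t: electricity generated in year t; r: discount rate; N: plant lifetime
        \mathrm{LCoE} = \frac{\sum_{t=0}^{N} C_t/(1+r)^t}{\sum_{t=1}^{N} E_t/(1+r)^t}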

  12. Certifying cost annotations in compilers

    CERN Document Server

    Amadio, Roberto M; Régis-Gianas, Yann; Saillard, Ronan

    2010-01-01

    We discuss the problem of building a compiler which can lift in a provably correct way pieces of information on the execution cost of the object code to cost annotations on the source code. To this end, we need a clear and flexible picture of: (i) the meaning of cost annotations, (ii) the method to prove them sound and precise, and (iii) the way such proofs can be composed. We propose a so-called labelling approach to these three questions. As a first step, we examine its application to a toy compiler. This formal study suggests that the labelling approach has good compositionality and scalability properties. In order to provide further evidence for this claim, we report our successful experience in implementing and testing the labelling approach on top of a prototype compiler written in OCAML for (a large fragment of) the C language.

  13. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables.Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees.With the proliferation of open source, understanding these issues is increasingly the res

  14. A compiler for variational forms

    CERN Document Server

    Kirby, Robert C; 10.1145/1163641.1163644

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in some cases the speedup is as large as a factor 1000.
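
    To give a flavour of the input such a form compiler consumes (the exact syntax differs between FFC versions, so the snippet below should be read as an illustrative UFL-style sketch rather than a definitive interface), the bilinear and linear forms of a Poisson problem can be written essentially in their mathematical shape:

        # Illustrative UFL-style input; the names assume a legacy UFL/FFC installation.
        from ufl import (FiniteElement, TrialFunction, TestFunction, Coefficient,
                         inner, grad, dx, triangle)

        element = FiniteElement("Lagrange", triangle, 1)
        u = TrialFunction(element)
        v = TestFunction(element)
        f = Coefficient(element)

        a = inner(grad(u), grad(v)) * dx   # bilinear (stiffness) form
        L = f * v * dx                     # linear (load) form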

  15. A reliable and consistent production technology for high volume compacted graphite iron castings

    Institute of Scientific and Technical Information of China (English)

    Liu Jincheng

    2014-01-01

    The demands for improved engine performance, fuel economy, durability, and lower emissions provide a continual challenge for engine designers. The use of Compacted Graphite Iron (CGI) has been established for successful high volume series production in the passenger vehicle, commercial vehicle, and industrial power sectors over the last decade. The increased demand for CGI engine components provides new opportunities for the cast iron foundry industry to establish efficient and robust CGI volume production processes, in China and globally. The production window for stable CGI is narrow and constantly moving; therefore, a one-step single addition of magnesium alloy and inoculant cannot ensure a reliable and consistent production process for complicated CGI engine castings. The present paper introduces the SinterCast thermal analysis process control system, which provides for the consistent production of CGI with low nodularity and reduced porosity, without risking the formation of flake graphite. The technology is currently being used in high volume Chinese foundry production. The Chinese foundry industry can develop complicated, high-demand CGI engine castings with the proper process control technology.

  16. Electric and Magnetic Fields (EMF) RAPID Program Engineering Project 8: FINAL REPORT, Evaluation of Field Reduction Technologies, Volume 1 (Report) and Volume 2 (Appendices)

    Energy Technology Data Exchange (ETDEWEB)

    Commonwealth Associates, Inc.; IIT Research Institute

    1997-08-01

    This draft report consists of two volumes. Volume 1, the main body, contains an introductory section, an overview of magnetic fields section, and a field reduction technology evaluation section. Magnetic field reduction methods are evaluated for transmission lines, distribution lines, substations, building wiring, appliances and machinery, and transportation systems. The evaluation considers effectiveness, cost, and other factors. Volume 2 contains five appendices. Appendix A presents magnetic field shielding information. Appendices B and C present design assumptions and magnetic field plots for transmission and distribution lines, respectively. Appendices D and E present cost estimate details for transmission and distribution lines, respectively.

  17. Data summary of municipal solid waste management alternatives. Volume 3, Appendix A: Mass burn technologies

    Energy Technology Data Exchange (ETDEWEB)

    None

    1992-10-01

    This appendix on Mass Burn Technologies is the first in a series designed to identify, describe and assess the suitability of several currently or potentially available generic technologies for the management of municipal solid waste (MSW). These appendices, which cover eight core thermoconversion, bioconversion and recycling technologies, reflect public domain information gathered from many sources. Representative sources include: professional journal articles, conference proceedings, selected municipality solid waste management plans and subscription technology data bases. The information presented is intended to serve as background information that will facilitate the preparation of the technoeconomic and life cycle mass, energy and environmental analyses that are being developed for each of the technologies. Mass burn has been and continues to be the predominant technology in Europe for the management of MSW. In the United States, the majority of the existing waste-to-energy projects utilize this technology and nearly 90 percent of all currently planned facilities have selected mass burn systems. Mass burning generally refers to the direct feeding and combustion of municipal solid waste in a furnace without any significant waste preprocessing. The only materials typically removed from the waste stream prior to combustion are large bulky objects and potentially hazardous or undesirable wastes. The technology has evolved over the last 100 or so years from simple incineration to the most highly developed and commercially proven process available for both reducing the volume of MSW and for recovering energy in the forms of steam and electricity. In general, mass burn plants are considered to operate reliably with high availability.

  18. Fifth DOE symposium on enhanced oil and gas recovery and improved drilling technology. Volume 3. Gas and drilling

    Energy Technology Data Exchange (ETDEWEB)

    Linville, B. [ed.

    1979-01-01

    Volume 3 contains papers from the sessions on natural gas supporting research, the western gas sands project, drilling technology, and environmental effects. Individual papers were processed for inclusion in the Energy Data Base.

  19. Compiler validates units and dimensions

    Science.gov (United States)

    Levine, F. E.

    1980-01-01

    Software added to the compiler for the automated test system for the Space Shuttle decreases computer run errors by providing offline validation of the engineering units used in system command programs. The validation procedures are general, though originally written for GOAL, a free-form language that accepts "English-like" statements, and may be adapted to other programming languages.
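
    The GOAL validation pass is not described in detail in this abstract, but the general idea of offline units checking can be sketched by representing each quantity's unit as an exponent vector over base dimensions and rejecting statements whose sides disagree. The snippet below is a hypothetical, minimal illustration of that idea, not the Space Shuttle software.

        # Units as exponent vectors over the base dimensions (metre, kilogram, second).
        METRE, KILOGRAM, SECOND = (1, 0, 0), (0, 1, 0), (0, 0, 1)

        def mul(u, v):
            return tuple(a + b for a, b in zip(u, v))

        def div(u, v):
            return tuple(a - b for a, b in zip(u, v))

        def check_assignment(target_unit, expr_unit):
            """Reject a command whose right-hand side unit does not match its target."""
            if target_unit != expr_unit:
                raise TypeError(f"unit mismatch: expected {target_unit}, got {expr_unit}")

        velocity = div(METRE, SECOND)        # m / s
        accel = div(velocity, SECOND)        # m / s^2
        force = mul(KILOGRAM, accel)         # kg * m / s^2

        check_assignment(force, mul(KILOGRAM, accel))   # passes
        # check_assignment(force, velocity)             # would raise TypeError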

  20. 1988 Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  1. Idaho National Engineering Laboratory Waste Area Groups 1-7 and 10 Technology Logic Diagram. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    O`Brien, M.C.; Meservey, R.H.; Little, M.; Ferguson, J.S.; Gilmore, M.C.

    1993-09-01

    The Idaho National Engineering Laboratory (INEL) Technology Logic Diagram (TLD) was developed to provide a decision support tool that relates Environmental Restoration (ER) and Waste Management (WM) problems at the INEL to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to an environmental restoration need. It is essential that follow-on engineering and system studies be conducted to build on the output of this project. These studies will begin by selecting the most promising technologies identified in this TLD and finding an optimum mix of technologies that will provide a socially acceptable balance between cost and risk to meet the site windows of opportunity. The TLD consists of three separate volumes: Volume I includes the purpose and scope of the TLD, a brief history of the INEL Waste Area Groups, and the environmental problems they represent. A description of the TLD, definitions of terms, a description of the technology evaluation process, and a summary of each subelement are presented. Volume II describes the overall layout and development of the TLD in logic diagram format. This section addresses the environmental restoration of contaminated INEL sites. Volume III (this volume) provides the Technology Evaluation Data Sheets (TEDS) for Environmental Restoration and Waste Management (EM) activities that are referenced by a TEDS code number in Volume II. Each of these sheets represents a single logic trace across the TLD. These sheets contain more detail than provided for technologies in Volume II. Data sheets are arranged alphanumerically by the TEDS code number in the upper right corner of each sheet.

  2. Impact of geothermal technology improvements on royalty collections on federal lands: Volume II: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    1988-10-01

    This volume contains the appendices for the "Impact of Geothermal Technology Improvements on Royalty Collections on Federal Lands, Final Report, Volume I." The material in this volume supports the conclusions presented in Volume I and details each Known Geothermal Resource Area's (KGRA's) royalty estimation. Appendix A details the physical characteristics of each KGRA considered in Volume I. Appendix B supplies summary narratives on each state which has a KGRA. The information presented in Appendix C shows the geothermal power plant area proxies chosen for each KGRA considered within the report. It also provides data ranges which fit into the IMGEO model for electric energy cost estimates. Appendix D provides detailed cost information from the IMGEO model if no Geothermal Program R&D goals were completed beyond 1987 and if all the R&D goals were completed by the year 2000. This appendix gives an overall electric cost and major system costs, which add up to the overall electric cost. Appendix E supplies information for avoided cost projections for each state involved in the study that were used in the IMGEO model run to determine at what cost/kWh a 50 MWe plant could come on line. Appendix F supplies the code used in the determination of royalty income, as well as tabled results of the royalty runs (detailed in Appendix G). The tabled results show royalty incomes, assuming a 10% discount rate, with and without R&D and with and without a $0.01/kWh transmission cost. Individual data sheets for each KGRA royalty income run are presented in Appendix G.

  3. Generating a Pattern Matching Compiler by Partial Evaluation

    DEFF Research Database (Denmark)

    Jørgensen, Knud Jesper

    1991-01-01

    Computer science, partial evaluation, compiling, denotational semantics, pattern matching, semantics-directed compiler generation.

  4. HAL/S-FC compiler system specifications

    Science.gov (United States)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  5. Air Pollution Translations: A Bibliography with Abstracts - Volume 4.

    Science.gov (United States)

    Environmental Protection Agency, Research Triangle Park, NC. Air Pollution Technical Information Center.

    This volume is the fourth in a series of compilations presenting abstracts and indexes of translations of technical air pollution literature. The entries are grouped into 12 subject categories: Emission Sources, Control Methods, Measurement Methods, Air Quality Measurements, Atmospheric Interaction, Basic Science and Technology, Effects--Human…

  6. Compilation of Output Structure of Diameter Grade Wood Assortments and Volume Ratio Table for Natural Broad-Leaved Forest Tree Species in North Fujian Province

    Institute of Scientific and Technical Information of China (English)

    李晓景; 江希钿; 庄崇洋; 李小铃

    2012-01-01

    The volume ratio model of wood assortments was built by choosing a suitable equation, and two-dimension merchantable wood assortment volume yielding rate tables were compiled on the basis of structure analyses of diameter grade wood assortments, using on-the-spot sample timber data collected from natural broad-leaved forests in northern Fujian Province. The tables were tested to have high precision and to be practicable in forestry production. To facilitate application in production, a tree height curve model was also established, and one-dimension merchantable wood assortment volume yielding rate tables were derived from the two-dimension tables.
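
    The abstract does not give the fitted equations, so the sketch below only illustrates the derivation step it describes: once a height-diameter curve is available, a one-dimension (diameter-only) yielding rate table can be read off the two-dimension (diameter and height) model by substituting the predicted height. The functional forms and coefficients here are made-up assumptions, not the ones fitted in the paper.

    ```python
    import math

    # Hypothetical height curve and two-dimension yielding rate model; functional
    # forms and coefficients are illustrative assumptions only.

    def tree_height(d_cm, b=22.0, k=0.07):
        """Assumed height-diameter curve: H = 1.3 + b * (1 - exp(-k * D))."""
        return 1.3 + b * (1.0 - math.exp(-k * d_cm))

    def yield_rate_2d(d_cm, h_m, c0=0.95, c1=8.0, c2=4.0):
        """Assumed two-dimension merchantable volume yielding rate in D and H."""
        return max(0.0, c0 - c1 / d_cm - c2 / h_m)

    def yield_rate_1d(d_cm):
        """One-dimension table entry: the 2-D model evaluated at the curve height."""
        return yield_rate_2d(d_cm, tree_height(d_cm))

    # Derive a one-dimension table over 2-cm diameter grades.
    for d in range(8, 41, 2):
        print(f"D = {d:2d} cm   H = {tree_height(d):4.1f} m   yield rate = {yield_rate_1d(d):.3f}")
    ```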

  7. New municipal solid waste processing technology reduces volume and provides beneficial reuse applications for soil improvement and dust control

    Science.gov (United States)

    A garbage-processing technology has been developed that shreds, sterilizes, and separates inorganic and organic components of municipal solid waste. The technology not only greatly reduces waste volume, but the non-composted byproduct of this process, Fluff®, has the potential to be utilized as a s...

  8. Verified Separate Compilation for C

    Science.gov (United States)

    2015-06-01

    independent linking, a new operational model of multilanguage module interaction that supports the statement and proof of cross-language contextual... Compiling Open Programs: a presumption of the preceding is that we at least have a specification of multilanguage programs. By multilanguage, I mean... Ahmed [PA14] have also observed, multilanguage semantics is useful not only for program understanding, but also as a mechanism for stating cross...

  9. Explanatory Notes to Standard Compilation

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    Ⅰ. Basis for Standard Compilation. Economic globalization and China's rapid expansion of foreign exchanges have drastically boosted the demand for translation services. As a result, enterprises offering translation services have mushroomed and formed a new industry unlike any other service industry. Though the output value of translation services is not high at the moment, their level and quality have a great impact on clients because they cover foreign exchanges in various fields and the construction of major foreign-invested projects.

  10. Compilation of HPSG to TAG

    CERN Document Server

    Kasper, R; Netter, K; Vijay-Shanker, K; Kasper, Robert; Kiefer, Bernd; Netter, Klaus

    1995-01-01

    We present an implemented compilation algorithm that translates HPSG into lexicalized feature-based TAG, relating concepts of the two theories. While HPSG has a more elaborated principle-based theory of possible phrase structures, TAG provides the means to represent lexicalized structures more explicitly. Our objectives are met by giving clear definitions that determine the projection of structures from the lexicon, and identify maximal projections, auxiliary trees and foot nodes.

  11. VFC: The Vienna Fortran Compiler

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    1999-01-01

    Full Text Available High Performance Fortran (HPF offers an attractive high‐level language interface for programming scalable parallel architectures providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC, a new source‐to‐source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influence a program’s performance. This comprises data locality assertions, non‐local access specifications and the possibility of reusing runtime‐generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high‐level data parallel language such as HPF+ a performance close to hand‐written message‐passing programs can be achieved even for highly irregular codes.

  12. The Defense Science Board 1999 Summer Study Task Force on 21st Century Defense Technology Strategies. Volume 1

    Science.gov (United States)

    2016-06-07

    The Defense Science Board (DSB) 1999 Summer Study Task Force on 21st Century Defense Technology Strategies continues a series of studies that have examined key challenges. Parts of the report include Defense Technology Strategy and Management (Part 3), Strategic Agility (Part 4), and Analysis and Quantitative Results (Part 5).

  13. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    Full Text Available We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.

  14. An Action Compiler Targeting Standard ML

    DEFF Research Database (Denmark)

    Iversen, Jørgen

    2005-01-01

    We present an action compiler that can be used in connection with an action semantics based compiler generator. Our action compiler produces code with faster execution times than code produced by other action compilers, and for some non-trivial test examples it is only a factor of two slower than the code produced by the Gnu C Compiler. Targeting Standard ML makes the description of the code generation simple and easy to implement. The action compiler has been tested on a description of the Core of Standard ML and a subset of C.

  15. Fault-Tree Compiler Program

    Science.gov (United States)

    Butler, Ricky W.; Martensen, Anna L.

    1992-01-01

    FTC, the Fault-Tree Compiler program, is a reliability-analysis software tool used to calculate the probability of the top event of a fault tree. Five different types of gates are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. The high-level input language of FTC is easy to understand and use. The program supports a hierarchical fault-tree-definition feature that simplifies the description of the tree and reduces execution time. The solution technique is implemented in FORTRAN, and the user interface in Pascal. The program is written to run on a DEC VAX computer operating under the VMS operating system.
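
    FTC's input language and solution technique are not reproduced here; the sketch below only shows the textbook probability rules for the five gate types under the assumption of statistically independent inputs, which is enough to check a small tree by hand. The example tree and event probabilities are assumptions for illustration.

    ```python
    from itertools import combinations
    from math import prod

    # Gate probability rules for the five FTC gate types, assuming statistically
    # independent inputs (a hand sketch, not the FTC solution technique itself).

    def p_and(ps):
        return prod(ps)

    def p_or(ps):
        return 1.0 - prod(1.0 - p for p in ps)

    def p_xor(p1, p2):
        return p1 * (1.0 - p2) + (1.0 - p1) * p2   # exactly one of two inputs

    def p_invert(p):
        return 1.0 - p

    def p_m_of_n(m, ps):
        """Probability that at least m of the n independent inputs occur."""
        n = len(ps)
        total = 0.0
        for k in range(m, n + 1):
            for idx in combinations(range(n), k):
                total += prod(ps[i] if i in idx else 1.0 - ps[i] for i in range(n))
        return total

    # Example tree: TOP = OR(AND(A, B), 2-of-3(C, D, E)), with assumed probabilities.
    pa, pb, pc, pd, pe = 1e-3, 2e-3, 5e-3, 5e-3, 5e-3
    top = p_or([p_and([pa, pb]), p_m_of_n(2, [pc, pd, pe])])
    print(f"Top event probability = {top:.3e}")
    ```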

  16. Recommendations for a Retargetable Compiler.

    Science.gov (United States)

    1980-03-01

    Releasable to the general public. This report has been reviewed and is approved for publication. RADC Project Engineer: Samuel A. Di Nitto, Jr. (ISIS). ... a compiler for Ada can commence development in FY82. 1. INTRODUCTION. In this section, we discuss the current

  17. Proceedings of The Twentieth International Symposium on Space Technology and Science. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-10-31

    The 20th international symposium on space technology and science was held in Nagaragawa city, Gifu prefecture on May 19-25, 1996, and 401 papers were made public. Out of those, 112 papers were summed up as Volume 2 following the previous Volume 1. As to space transportation, the paper included reports titled as follows: Conceptual study of H-IIA rocket (upgraded H-II rocket); Test flight of the launch vehicle; International cooperation in space transportation; etc. Concerning microgravity science, Recent advances in microgravity research; Use of microgravity environment to investigate the effect of magnetic field on flame shape; etc. Relating to satellite communications and broadcasting, 'Project GENESYS': CRL's R&D project for realizing high data rate satellite communications networks; The Astrolink (TM/SM) system; etc. Besides, the paper contained reports on the following fields: lunar and planetary missions and utilization, space science and balloons, earth observations, life science and human presence, international cooperation and space environment, etc.

  18. Compilation of fatigue, fatigue-crack propagation, and fracture data for 2024 and 7075 aluminum, Ti-6Al-4V titanium, and 300M steel. Volume 1: Description of data and data storage on magnetic tape. Volume 2: Data tape (7-track magnetic tape)

    Science.gov (United States)

    Rice, R. C.; Reynolds, J. L.

    1976-01-01

    Fatigue, fatigue-crack-propagation, and fracture data compiled and stored on magnetic tape are documented. Data for 2024 and 7075 aluminum alloys, Ti-6Al-4V titanium alloy, and 300M steel are included in the compilation. Approximately 4,500 fatigue, 6,500 fatigue-crack-propagation, and 1,500 fracture data points are stored on magnetic tape. Descriptions of the data, an index to the data on the magnetic tape, information on data storage format on the tape, a listing of all data source references, and abstracts of other pertinent test information from each data source reference are included.

  19. Lattice Simulations using OpenACC compilers

    CERN Document Server

    Majumdar, Pushan

    2013-01-01

    OpenACC compilers allow one to use Graphics Processing Units without having to write explicit CUDA codes. Programs can be modified incrementally using OpenMP like directives which causes the compiler to generate CUDA kernels to be run on the GPUs. In this article we look at the performance gain in lattice simulations with dynamical fermions using OpenACC compilers.

  20. Multi-parallel open technology to enable collaborative volume visualization: how to create global immersive virtual anatomy classrooms.

    Science.gov (United States)

    Silverstein, Jonathan C; Walsh, Colin; Dech, Fred; Olson, Eric; E, Michael; Parsad, Nigel; Stevens, Rick

    2008-01-01

    Many prototype projects aspire to develop a sustainable model of immersive radiological volume visualization for virtual anatomic education. Some have focused on distributed or parallel architectures. However, very few, if any others, have combined multi-location, multi-directional, multi-stream sharing of video, audio, desktop applications, and parallel stereo volume rendering, to converge on an open, globally scalable, and inexpensive collaborative architecture and implementation method for anatomic teaching using radiological volumes. We have focused our efforts on bringing this all together for several years. We outline here the technology we're making available to the open source community and a system implementation suggestion for how to create global immersive virtual anatomy classrooms. With the releases of Access Grid 3.1 and our parallel stereo volume rendering code, inexpensive globally scalable technology is available to enable collaborative volume visualization upon an award-winning framework. Based upon these technologies, immersive virtual anatomy classrooms that share educational or clinical principles can be constructed with the setup described with moderate technological expertise and global scalability.

  1. Distributed memory compiler design for sparse problems

    Science.gov (United States)

    Wu, Janet; Saltz, Joel; Berryman, Harry; Hiranandani, Seema

    1991-01-01

    A compiler and runtime support mechanism is described and demonstrated. The methods presented are capable of solving a wide range of sparse and unstructured problems in scientific computing. The compiler takes as input a FORTRAN 77 program enhanced with specifications for distributing data, and the compiler outputs a message passing program that runs on a distributed memory computer. The runtime support for this compiler is a library of primitives designed to efficiently support irregular patterns of distributed array accesses and irregular distributed array partitions. A variety of Intel iPSC/860 performance results obtained through the use of this compiler are presented.

  2. An OpenMP Compiler Benchmark

    Directory of Open Access Journals (Sweden)

    Matthias S. Müller

    2003-01-01

    Full Text Available The purpose of this benchmark is to propose several optimization techniques and to test their existence in current OpenMP compilers. Examples are the removal of redundant synchronization constructs, effective constructs for alternative code and orphaned directives. The effectiveness of the compiler generated code is measured by comparing different OpenMP constructs and compilers. If possible, we also compare with the hand coded "equivalent" solution. Six out of seven proposed optimization techniques are already implemented in different compilers. However, most compilers implement only one or two of them.

  3. Solar/hydrogen systems technologies. Volume II (Part 1 of 2). Solar/hydrogen systems assessment. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Escher, W. J.D.; Foster, R. W.; Tison, R. R.; Hanson, J. A.

    1980-06-02

    Volume II of the Solar/Hydrogen Systems Assessment contract report (2 volumes) is basically a technological source book. Relying heavily on expert contributions, it comprehensively reviews constituent technologies from which can be assembled a wide range of specific solar/hydrogen systems. Covered here are both direct and indirect solar energy conversion technologies; respectively, those that utilize solar radiant energy input directly and immediately, and those that absorb energy from a physical intermediary, previously energized by the sun. Solar-operated hydrogen energy production technologies are also covered in the report. The single most prominent of these is water electrolysis. Utilization of solar-produced hydrogen is outside the scope of the volume. However, the important hydrogen delivery step is treated under the delivery sub-steps of hydrogen transmission, distribution and storage. An exemplary use of the presented information is in the synthesis and analysis of those solar/hydrogen system candidates documented in the report's Volume I. Moreover, it is intended that broad use be made of this technology information in the implementation of future solar/hydrogen systems. Such systems, configured on either a distributed or a central-plant basis, or both, may well be of major significance in effecting an ultimate transition to renewable energy systems.

  4. Solar/hydrogen systems technologies. Volume II (Part 2 of 2). Solar/hydrogen systems assessment. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Escher, W. J.D.; Foster, R. W.; Tison, R. R.; Hanson, J. A.

    1980-06-02

    Volume II of the Solar/Hydrogen Systems Assessment contract report (2 volumes) is basically a technological source book. Relying heavily on expert contributions, it comprehensively reviews constituent technologies from which can be assembled a wide range of specific solar/hydrogen systems. Covered here are both direct and indirect solar energy conversion technologies; respectively, those that utilize solar radiant energy input directly and immediately, and those that absorb energy from a physical intermediary, previously energized by the sun. Solar-operated hydrogen energy production technologies are also covered in the report. The single most prominent of these is water electrolysis. Utilization of solar-produced hydrogen is outside the scope of the volume. However, the important hydrogen delivery step is treated under the delivery sub-steps of hydrogen transmission, distribution and storage. An exemplary use of the presented information is in the synthesis and analysis of those solar/hydrogen system candidates documented in the report's Volume I. Moreover, it is intended that broad use be made of this technology information in the implementation of future solar/hydrogen systems. Such systems, configured on either a distributed or a central-plant basis, or both, may well be of major significance in effecting an ultimate transition to renewable energy systems.

  5. CONSUMPTION VOLUMES TECHNOLOGY OF ELECTRICITY AND HEAT BY DEPARTMENTS OF THE UNIVERSITY

    Directory of Open Access Journals (Sweden)

    O. M. Pshinko

    2015-01-01

    Full Text Available Purpose. Efficient use of natural energy resources is one of the priorities of state policy for the universities and institutions of the Ministry of Education and Science of Ukraine. Besides the search for and development of new efficient and clean energy systems, it is necessary to manage the development and operation of existing facilities optimally and to reduce their energy costs. The purpose of this work is to develop a technology for determining the volumes of electricity and heat consumed by the scientific departments of Dnipropetrovsk National University of Railway Transport named after Academician V. Lazaryan (DNURT) in order to find further ways to reduce energy consumption. The problem is due to the specifics of the University's energy supply scheme: it is difficult to install energy meters and to acquire data on energy use in individual branches and structural units, and it is therefore impossible to assess the energy position of the scientific departments qualitatively. Methodology. The method to determine the electricity and heat consumption for space heating of scientific departments at the university is based on «The intersectoral rules of electricity and heat energy for institutions and public sector organizations in Ukraine» and «Codes and regulations on rationing of fuel and heat energy for heating the residential buildings as well as for economic needs in Ukraine». Findings. The technology developed for determining the expenditure of electricity and heat for heating by the scientific departments of DNURT named after Academician V. Lazaryan allows data on energy consumption in individual units to be obtained without direct measurement and the effectiveness of energy saving technologies to be analyzed. Originality. Energy costs are represented in the form of two components, and these components are defined on the basis of an energy audit. This enables the energy inputs to implement energy efficiency measures in the research departments of the

  6. Low-level radioactive waste from commercial nuclear reactors. Volume 2. Treatment, storage, disposal, and transportation technologies and constraints

    Energy Technology Data Exchange (ETDEWEB)

    Jolley, R.L.; Dole, L.R.; Godbee, H.W.; Kibbey, A.H.; Oyen, L.C.; Robinson, S.M.; Rodgers, B.R.; Tucker, R.F. Jr.

    1986-05-01

    The overall task of this program was to provide an assessment of currently available technology for treating commercial low-level radioactive waste (LLRW), to initiate development of a methodology for choosing one technology for a given application, and to identify research needed to improve current treatment techniques and decision methodology. The resulting report is issued in four volumes. Volume 2 discusses the definition, forms, and sources of LLRW; regulatory constraints affecting treatment, storage, transportation, and disposal; current technologies used for treatment, packaging, storage, transportation, and disposal; and the development of a matrix relating treatment technology to the LLRW stream as an aid for choosing methods for treating the waste. Detailed discussions are presented for most LLRW treatment methods, such as aqueous processes (e.g., filtration, ion exchange); dewatering (e.g., evaporation, centrifugation); sorting/segregation; mechanical treatment (e.g., shredding, baling, compaction); thermal processes (e.g., incineration, vitrification); solidification (e.g., cement, asphalt); and biological treatment.

  7. The fault-tree compiler

    Science.gov (United States)

    Martensen, Anna L.; Butler, Ricky W.

    1987-01-01

    The Fault Tree Compiler Program is a new reliability tool used to predict the top event probability for a fault tree. Five different gate types are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N gates. The high level input language is easy to understand and use when describing the system tree. In addition, the use of the hierarchical fault tree capability can simplify the tree description and decrease program execution time. The current solution technique provides an answer precise to five digits (within the limits of double precision floating point arithmetic). The user may vary one failure rate or failure probability over a range of values and plot the results for sensitivity analyses. The solution technique is implemented in FORTRAN; the remaining program code is implemented in Pascal. The program is written to run on a Digital Equipment Corporation VAX with the VMS operating system.

  8. Technology transfer for the US Department of Energy's Energy Storage Program: Volume 2, Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Bruneau, C.L.; Fassbender, L.L.

    1988-10-01

    This document contains the appendices to Technology Transfer Recommendations for the US Department of Energy's Storage Program (PNL-6484, Vol. 1). These appendices are a list of projects, publications, and presentations connected with the Energy Storage (STOR) program. In Volume 1, the technology transfer activities of the STOR program are examined and mechanisms for increasing the effectiveness of those activities are recommended.

  9. High-Performance Home Technologies: Solar Thermal & Photovoltaic Systems; Volume 6 Building America Best Practices Series

    Energy Technology Data Exchange (ETDEWEB)

    None

    2007-06-01

    The sixth volume of the Building America Best Practices Series presents information that is useful throughout the U.S. for enhancing the energy efficiency practices in the specific climate zones that are presented in each of the volumes.

  10. A Symmetric Approach to Compilation and Decompilation

    DEFF Research Database (Denmark)

    Ager, Mads Sig; Danvy, Olivier; Goldberg, Mayer

    2002-01-01

    Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice... and the SECD-machine language. In each case, we prove that the target-to-source compiler is a left inverse of the source-to-target compiler, i.e., that it is a decompiler. In the context of partial evaluation, the binding-time shift of going from a source interpreter to a compiler is classically referred to as a Futamura projection. By symmetry, it seems logical to refer to the binding-time shift of going from a target interpreter to a compiler as a Futamura embedding.

  11. Space industrialization. Volume 1: An overview. [Market research, technology assessment, and economic impact]

    Science.gov (United States)

    1978-01-01

    Benefits accruing to the United States from the investment of public and private resources in space industrialization are projected. The future was examined to characterize resource pressures, requirements and supply (population, energy, materials, food). The backdrop of probable events, attitudes, and trends against which space industrialization will evolve was postulated. The opportunities for space industry that would benefit earth were compiled and screened against terrestrial alternatives. A cursory market survey was conducted for the selected services and products provided by these initiatives.

  12. Compiling scheme using abstract state machines

    OpenAIRE

    2003-01-01

    The project investigates the use of Abstract State Machines in the process of computer program compilation. Compilation is the production of machine code from a source program written in a high-level language; a compiler is a program written for that purpose. Machine code is the computer-readable representation of sequences of computer instructions. An Abstract State Machine (ASM) is a notional computing machine, developed by Yuri Gurevich, for accurately and easily representing the semantics of...

  13. Compiler-assisted static checkpoint insertion

    Science.gov (United States)

    Long, Junsheng; Fuchs, W. K.; Abraham, Jacob A.

    1992-01-01

    This paper describes a compiler-assisted approach for static checkpoint insertion. Instead of fixing the checkpoint location before program execution, a compiler-enhanced polling mechanism is utilized to maintain both the desired checkpoint intervals and reproducible checkpoint locations. The technique has been implemented in a GNU CC compiler for Sun 3 and Sun 4 (Sparc) processors. Experiments demonstrate that the approach provides for stable checkpoint intervals and reproducible checkpoint placements with performance overhead comparable to a previously presented compiler-assisted dynamic scheme (CATCH) utilizing the system clock.
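
    The modified GNU CC itself is not shown in this record; the Python fragment below is only a simplified illustration of the polling idea it describes: a counter advanced at compiler-chosen poll points (for example, loop back-edges) triggers a checkpoint once the configured interval of work has elapsed, so placement depends on the execution count rather than on timer interrupts and is reproducible across runs. The poll routine, interval, and checkpoint format are assumptions.

    ```python
    import pickle

    # Assumed illustration of compiler-inserted polling (not the modified GNU CC).

    CHECKPOINT_INTERVAL = 1_000_000     # work units between checkpoints (assumed)
    _poll_counter = 0

    def poll(state, work_units=1, path="checkpoint.pkl"):
        """Hypothetical poll routine called at loop back-edges and call sites."""
        global _poll_counter
        _poll_counter += work_units
        if _poll_counter >= CHECKPOINT_INTERVAL:
            with open(path, "wb") as f:
                pickle.dump(state, f)   # save a restartable snapshot of live state
            _poll_counter = 0           # placement depends only on execution count

    # Example: a long-running loop with a poll point at the back-edge.
    acc = 0
    for i in range(3_000_000):
        acc += i * i
        poll({"i": i, "acc": acc})
    print(acc)
    ```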

  14. A Class-Specific Optimizing Compiler

    Directory of Open Access Journals (Sweden)

    Michael D. Sharp

    1993-01-01

    Full Text Available Class-specific optimizations are compiler optimizations specified by the class implementor to the compiler. They allow the compiler to take advantage of the semantics of the particular class so as to produce better code. Optimizations of interest include the strength reduction of class::array address calculations, elimination of large temporaries, and the placement of asynchronous send/recv calls so as to achieve computation/communication overlap. We will outline our progress towards the implementation of a C++ compiler capable of incorporating class-specific optimizations.

  15. Compiler Optimization Techniques for OpenMP Programs

    Directory of Open Access Journals (Sweden)

    Shigehisa Satoh

    2001-01-01

    Full Text Available We have developed compiler optimization techniques for explicit parallel programs using the OpenMP API. To enable optimization across threads, we designed dataflow analysis techniques in which interactions between threads are effectively modeled. Structured description of parallelism and relaxed memory consistency in OpenMP make the analyses effective and efficient. We developed algorithms for reaching definitions analysis, memory synchronization analysis, and cross-loop data dependence analysis for parallel loops. Our primary target is compiler-directed software distributed shared memory systems in which aggressive compiler optimizations for software-implemented coherence schemes are crucial to obtaining good performance. We also developed optimizations applicable to general OpenMP implementations, namely redundant barrier removal and privatization of dynamically allocated objects. Experimental results for the coherency optimization show that aggressive compiler optimizations are quite effective for a shared-write intensive program because the coherence-induced communication volume in such a program is much larger than that in shared-read intensive programs.

  16. Asymptotic fitting optimization technology for source-to-source compile system on CPU-GPU architecture

    Institute of Scientific and Technical Information of China (English)

    魏洪昌; 朱正东; 董小社; 宁洁

    2016-01-01

    To address the problem of insufficient performance optimization after applications are developed and ported to CPU-GPU heterogeneous parallel systems, a new approach for CPU-GPU systems is proposed that combines asymptotic fitting optimization with source-to-source compiling. The approach translates C code annotated with directives into CUDA code and profiles the generated code several times; based on the characteristics of the source program and on hardware information, source-to-source compilation and optimization of the generated code are then completed automatically. A prototype system based on the approach has been realized. Functionality and performance evaluations of the prototype in different environments show that the generated CUDA code is functionally equivalent to the original C code while its performance is greatly improved. Comparison with CUDA benchmark programs shows that the generated code also clearly outperforms code produced by other source-to-source compilation approaches.

  17. Proving Correctness of Compilers Using Structured Graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    We present an approach to compiler implementation using Oliveira and Cook’s structured graphs that avoids the use of explicit jumps in the generated code. The advantage of our method is that it takes the implementation of a compiler using a tree type along with its correctness proof and turns it ...

  18. Criteria for Evaluating the Performance of Compilers

    Science.gov (United States)

    1974-10-01

    skilled programmer to take advantage of all of the environmental special features which could be exploited by a compiler. These programs are then... programs, except remove all statement labels. Subtract the... values obtained by compiling and running a program containing the

  19. The Molen compiler for reconfigurable architectures

    NARCIS (Netherlands)

    Moscu Panainte, E.

    2007-01-01

    In this dissertation, we present the Molen compiler framework that targets reconfigurable architectures under the Molen Programming Paradigm. More specifically, we introduce a set of compiler optimizations that address one of the main shortcomings of the reconfigurable architectures, namely the reco

  20. Seismic design technology for Breeder Reactor structures. Volume 3: special topics in reactor structures

    Energy Technology Data Exchange (ETDEWEB)

    Reddy, D.P. (ed)

    1983-04-01

    This volume is divided into six chapters: analysis techniques, equivalent damping values, probabilistic design factors, design verifications, equivalent response cycles for fatigue analysis, and seismic isolation. (JDB)

  1. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    Helping programmers write parallel software is an urgent problem given the popularity of multi-core architectures. Engineering compilers which automatically parallelize and vectorize code has turned out to be very challenging. Consequently, compilers are very selective with respect to the coding patterns they can optimize. We present an interactive approach and a tool set which leverages advanced compiler analysis and optimizations while retaining programmer control over the source code and its transformation. This allows optimization even when programmers refrain from enabling optimizations to preserve accurate debug information or to avoid bugs in the compiler. It also allows the source code to carry optimizations from one compiler to another. Secondly, our tool-set provides feedback on why optimizations do not apply to a code fragment and suggests workarounds which can be applied automatically...

  2. Compiler writing system detail design specification. Volume 1: Language specification

    Science.gov (United States)

    Arthur, W. J.

    1974-01-01

    Construction within the Meta language of both language and target machine specifications is reported. The elements of the function language, its meaning and syntax, are presented, and the structure of the target language, which represents the target-dependent object text representation of application programs, is described.

  3. The space shuttle payload planning working groups. Volume 10: Space technology

    Science.gov (United States)

    1973-01-01

    The findings and recommendations of the Space Technology group of the space shuttle payload planning activity are presented. The elements of the space technology program are: (1) long duration exposure facility, (2) advanced technology laboratory, (3) physics and chemistry laboratory, (4) contamination experiments, and (5) laser information/data transmission technology. The space technology mission model is presented in tabular form. The proposed experiments to be conducted by each test facility are described. Recommended approaches for user community interfacing are included.

  4. Annual Proceedings of Selected Papers on the Practice of Educational Communications and Technology Presented at the Annual Convention of the Association for Educational Communications and Technology (33rd, Anaheim, California, 2010). Volume 2

    Science.gov (United States)

    Simonson, Michael, Ed.

    2010-01-01

    For the thirty-third year, the Research and Theory Division of the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. This is Volume #2 of the 33rd "Annual Proceedings of Selected Papers on the Practice of Educational Communications and Technology." This volume includes…

  5. Large floating structures technological advances

    CERN Document Server

    Wang, BT

    2015-01-01

    This book surveys key projects that have seen the construction of large floating structures or have attained detailed conceptual designs. This compilation of key floating structures in a single volume captures the innovative features that mark the technological advances made in this field of engineering, and will provide a useful reference for ideas, analysis, design, and construction of these unique and emerging urban projects to offshore and marine engineers, urban planners, architects and students.

  6. Technical and economic assessment of fluidized bed augmented compressed air energy-storage system. Volume II. Introduction and technology assessment

    Energy Technology Data Exchange (ETDEWEB)

    Giramonti, A.J.; Lessard, R.D.; Merrick, D.; Hobson, M.J.

    1981-09-01

    This volume describes the results of a study subcontracted by PNL to the United Technologies Research Center on the engineering feasibility and economics of a CAES concept which uses a coal-fired, fluidized bed combustor (FBC) to heat the air being returned from storage during the power production cycle. By burning coal instead of fuel oil, the CAES/FBC concept can completely eliminate the dependence of compressed air energy storage on petroleum fuels. The results of this assessment effort are presented in three volumes. Volume II presents a discussion of program background and an in-depth coverage of both fluid bed combustion and turbomachinery technology pertinent to their application in a CAES power plant system. The CAES/FBC concept appears technically feasible and economically competitive with conventional CAES. However, significant advancement is required in FBC technology before serious commercial commitment to CAES/FBC can be realized. At present, other elements of DOE, industrial groups, and other countries are performing the required R&D for advancement of FBC technology. The CAES/FBC will be reevaluated at a later date when FBC technology has matured and many of the concerns now plaguing FBC are resolved. (LCL)

  7. Idaho National Engineering Laboratory waste area groups 1--7 and 10 Technology Logic Diagram. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, M.C.; Meservey, R.H.; Little, M.; Ferguson, J.S.; Gilmore, M.C.

    1993-09-01

    The Technology Logic Diagram was developed to provide technical alternatives for environmental restoration projects at the Idaho National Engineering Laboratory. The diagram (three volumes) documents suggested solutions to the characterization, retrieval, and treatment phases of cleanup activities at contaminated sites within 8 of the laboratory's 10 waste area groups. Contaminated sites at the laboratory's Naval Reactor Facility and Argonne National Laboratory-West are not included in this diagram.

  8. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  9. Data summary of municipal solid waste management alternatives. Volume 2, Exhibits

    Energy Technology Data Exchange (ETDEWEB)

    None

    1992-10-01

    The overall objective of the study in this report was to gather data on waste management technologies to allow comparison of various alternatives for managing municipal solid waste (MSW). The specific objectives of the study were to: 1. Compile detailed data for existing waste management technologies on costs, environmental releases, energy requirements and production, and coproducts such as recycled materials and compost. 2. Identify missing information necessary to make energy, economic, and environmental comparisons of various MSW management technologies, and define needed research that could enhance the usefulness of the technology. 3. Develop a data base that can be used to identify the technology that best meets specific criteria defined by a user of the data base. Volume I contains the report text. Volume II contains supporting exhibits. Volumes III through X are appendices, each addressing a specific MSW management technology. Volumes XI and XII contain project bibliographies.

  10. Compilation Techniques Specific for a Hardware Cryptography-Embedded Multimedia Mobile Processor

    Directory of Open Access Journals (Sweden)

    Masa-aki FUKASE

    2007-12-01

    Full Text Available The development of single chip VLSI processors is the key technology of ever growing pervasive computing to answer overall demands for usability, mobility, speed, security, etc. We have so far developed a hardware cryptography-embedded multimedia mobile processor architecture, HCgorilla. Since HCgorilla integrates a wide range of techniques from architectures to applications and languages, a one-sided design approach is not always useful. HCgorilla needs a more complicated strategy, that is, hardware/software (H/S) codesign. Thus, we exploit the software support of HCgorilla composed of a Java interface and parallelizing compilers. They are assumed to be installed in servers in order to reduce the load and increase the performance of HCgorilla-embedded clients. Since compilers are the essence of software's responsibility, we focus in this article on our recent results about the design, specifications, and prototyping of parallelizing compilers for HCgorilla. The parallelizing compilers are composed of a multicore compiler and a LIW compiler. They are specified to abstract parallelism from executable serial codes or the Java interface output and output the codes executable in parallel by HCgorilla. The prototyping compilers are written in Java. The evaluation by using an arithmetic test program shows the reasonability of the prototyping compilers compared with hand compilers.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES, VOLUME II

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  13. Dual-Use Space Technology Transfer Conference and Exhibition. Volume 1

    Science.gov (United States)

    Krishen, Kumar (Compiler)

    1994-01-01

    This document contains papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; new ways of doing business; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies. More than 77 papers, 20 presentations, and 20 exhibits covering various disciplines were presented by experts from NASA, universities, and industry.

  14. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2010-01-01

    patterns they can optimize. We present an interactive approach which leverages advanced compiler analysis and optimizations while retaining programmer control over the source code and its transformation. This allows optimization even when programmers refrain from enabling optimizations to preserve accurate debug information or to avoid bugs in the compiler. It also allows the source code to carry optimizations from one compiler to another. Secondly, our tool-set provides feedback on why optimizations do not apply to a code fragment and suggests workarounds which can be applied automatically. We...

  15. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

    This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation, and first results of the created Python-based compiler.
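
    The compiler described in the paper is not detailed in this abstract; the toy fragment below only illustrates the general front-end idea of reading a Python function with the standard ast module and emitting a VHDL entity skeleton. The one-port-per-argument mapping and the 32-bit unsigned signal width are assumptions, not the authors' design.

    ```python
    import ast
    import textwrap

    # Toy front-end sketch (not the authors' compiler): read a Python function
    # and emit a VHDL entity skeleton, assuming one 32-bit unsigned input port
    # per argument and a single 32-bit result port.

    SOURCE = textwrap.dedent("""
        def mac(a, b, c):
            return a * b + c
    """)

    def to_vhdl_entity(source: str) -> str:
        tree = ast.parse(source)
        func = next(n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef))
        ports = [f"    {arg.arg} : in  unsigned(31 downto 0);" for arg in func.args.args]
        ports.append("    result : out unsigned(31 downto 0)")
        return (f"entity {func.name} is\n  port (\n" + "\n".join(ports) + "\n  );\n"
                f"end entity {func.name};")

    print(to_vhdl_entity(SOURCE))
    ```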

  16. Limb volume measurement: from the past methods to optoelectronic technologies, bioimpedance analysis and laser based devices.

    Science.gov (United States)

    Cavezzi, A; Schingale, F; Elio, C

    2010-10-01

    Accurate measurement of limb volume is considered crucial to lymphedema management. Various non-invasive methods may be used and have been validated in recent years, though suboptimal standardisation has been highlighted in different publications.
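
    The abstract does not reproduce the arithmetic behind the older circumference-based methods, but these typically model the limb as a stack of truncated cones: with circumferences C_i taken a fixed distance h apart (commonly 4 cm), each segment volume and the total limb volume are

    \[ V_i = \frac{h\,\bigl(C_i^2 + C_i C_{i+1} + C_{i+1}^2\bigr)}{12\pi}, \qquad V = \sum_i V_i . \]

    Optoelectronic (perometer-type) devices estimate the same quantity from optical cross-sections rather than tape girths, while bioimpedance analysis tracks limb fluid rather than geometric volume.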

  17. Low-rank coal study: national needs for resource development. Volume 3. Technology evaluation

    Energy Technology Data Exchange (ETDEWEB)

    1980-11-01

    Technologies applicable to the development and use of low-rank coals are analyzed in order to identify specific needs for research, development, and demonstration (RD and D). Major sections of the report address the following technologies: extraction; transportation; preparation, handling and storage; conventional combustion and environmental control technology; gasification; liquefaction; and pyrolysis. Each of these sections contains an introduction and summary of the key issues with regard to subbituminous coal and lignite; description of all relevant technology, both existing and under development; a description of related environmental control technology; an evaluation of the effects of low-rank coal properties on the technology; and summaries of current commercial status of the technology and/or current RD and D projects relevant to low-rank coals.

  18. A Language for Specifying Compiler Optimizations for Generic Software

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, Jeremiah J. [Indiana Univ., Bloomington, IN (United States)

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.

  19. A Compilation of Internship Reports - 2012

    Energy Technology Data Exchange (ETDEWEB)

    Stegman M.; Morris, M.; Blackburn, N.

    2012-08-08

    This compilation documents all research projects undertaken by the 2012 summer Department of Energy - Workforce Development for Teachers and Scientists interns during their internship program at Brookhaven National Laboratory.

  20. Extension of Alvis compiler front-end

    Science.gov (United States)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr

    2015-12-01

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. The semantics of an Alvis model is expressed as an LTS graph (labelled transition system). Execution of any language statement is expressed as a transition between formally defined states of such a model. An LTS graph is generated using a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as a part of the Alvis language to define parameter types and operations on them. Thanks to the compiler's modular construction, many aspects of the compilation of an Alvis model may be modified. Providing new plugins for the Alvis Compiler that support languages like Java or C makes it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler's internal model and describes how the default specification language can be altered by new plugins.

  1. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  2. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available Primula tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose Primula-generated propositional instances have thousands of variables, and whose jointrees have clusters...
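
    Primula's compilation procedure is not described further in this record, but the core mechanism of evaluating and differentiating a compiled arithmetic circuit can be sketched on a tiny example. The two-node network, its probabilities, and the hand-built circuit below are illustrative assumptions, not Primula output; evaluating under evidence yields the evidence probability, and the derivative with respect to an indicator yields the corresponding joint marginal.

    ```python
    import math

    # Sketch (illustrative, not Primula output): the network polynomial of a tiny
    # network A -> B as an arithmetic circuit, evaluated and differentiated.

    class Node:
        def __init__(self, op, children=(), value=0.0):
            self.op, self.children = op, list(children)
            self.value, self.deriv = value, 0.0

    def leaf(v):  return Node("leaf", value=v)
    def mul(*ch): return Node("*", ch)
    def add(*ch): return Node("+", ch)

    def evaluate(n):
        if n.op != "leaf":
            vals = [evaluate(c) for c in n.children]
            n.value = sum(vals) if n.op == "+" else math.prod(vals)
        return n.value

    def backprop(n, d=1.0):
        n.deriv += d
        if n.op == "+":
            for c in n.children:
                backprop(c, d)
        elif n.op == "*":
            for i, c in enumerate(n.children):
                backprop(c, d * math.prod(s.value for j, s in enumerate(n.children) if j != i))

    # Assumed CPTs: P(a) = 0.3, P(b|a) = 0.9, P(b|~a) = 0.2; evidence: B = true.
    la, lna, lb, lnb = leaf(1.0), leaf(1.0), leaf(1.0), leaf(0.0)
    circuit = add(mul(la,  lb,  leaf(0.3), leaf(0.9)),
                  mul(la,  lnb, leaf(0.3), leaf(0.1)),
                  mul(lna, lb,  leaf(0.7), leaf(0.2)),
                  mul(lna, lnb, leaf(0.7), leaf(0.8)))

    print(evaluate(circuit))   # P(B = true) = 0.3*0.9 + 0.7*0.2 = 0.41
    backprop(circuit)
    print(la.deriv)            # P(A = true, B = true) = 0.27
    ```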

  3. Compilation of Pilot Cognitive Ability Norms

    Science.gov (United States)

    2011-12-01

    has amassed a body of knowledge about many topics... Comprehension (Comp) measures "social acculturation," "social intelligence," and the... Compilation of Pilot Cognitive Ability Norms, Raymond E. King, U.S. Air Force School of Aerospace Medicine (AFRL-SA-WP-TR-2012-0001).

  4. Verified Compilation of Floating-Point Computations

    OpenAIRE

    Boldo, Sylvie; Jourdan, Jacques-Henri; Leroy, Xavier; Melquiond, Guillaume

    2015-01-01

    Floating-point arithmetic is known to be tricky: roundings, formats, exceptional values. The IEEE-754 standard was a push towards straightening the field and made formal reasoning about floating-point computations easier and flourishing. Unfortunately, this is not sufficient to guarantee the final result of a program, as several other actors are involved: programming language, compiler, architecture. The CompCert formally-verified compiler provides a solution to this p...

  5. Compilation of Sandia Laboratories technical capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lundergan, C. D.; Mead, P. L. [eds.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078). (RWR)

  6. Electronic circuits for communications systems: A compilation

    Science.gov (United States)

    1972-01-01

    The compilation of electronic circuits for communications systems is divided into thirteen basic categories, each representing an area of circuit design and application. The compilation items are moderately complex and, as such, would appeal to the applications engineer. However, the rationale for the selection criteria was tailored so that the circuits would reflect fundamental design principles and applications, with an additional requirement for simplicity whenever possible.

  7. Inventory of Federal energy-related environment and safety research for FY 1978. Volume 1. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-12-01

    The FY 1978 Federal Inventory is a compilation of 3225 federally funded energy-related environmental and safety research projects. It consists of three volumes: an executive summary providing an overview of the data (Volume I), a catalog listing each Inventory project followed by a series of indexes (Volume II), and an interactive terminal guide giving instructions for on-line data retrieval (Volume III). Volume I reviews the inventory data as a whole and also within each of three major categories: biomedical and environmental research, environmental control technology research, and operational safety research.

  8. Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4. Volume 1: Technology evaluation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    During World War II, the Oak Ridge Y-12 Plant was built as part of the Manhattan Project to supply enriched uranium for weapons production. In 1945, Building 9201-4 (Alpha-4) was originally used to house a uranium isotope separation process based on electromagnetic separation technology. With the startup of the Oak Ridge K-25 Site gaseous diffusion plant in 1947, Alpha-4 was placed on standby. In 1953, the uranium enrichment process was removed, and installation of equipment for the Colex process began. The Colex process, which used a mercury solvent and lithium hydroxide as the lithium feed material, was shut down in 1962 and drained of process materials. Residual quantities of mercury and lithium hydroxide have remained in the process equipment. Alpha-4 contains more than one-half million ft{sup 2} of floor area; 15,000 tons of process and electrical equipment; and 23,000 tons of insulation, mortar, brick, flooring, handrails, ducts, utilities, burnables, and sludge. Because much of this equipment and construction material is contaminated with elemental mercury, cleanup is necessary. The goal of the Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4 is to provide a planning document that relates decontamination and decommissioning and waste management problems at the Alpha-4 building to the technologies that can be used to remediate these problems. The document builds on the methodology transferred by the U.S. Air Force to the Environmental Management organization within DOE and draws from previous technology logic diagram efforts: the logic diagrams for Hanford, the K-25 Site, and ORNL.

  9. Function Interface Models for Hardware Compilation: Types, Signatures, Protocols

    CERN Document Server

    Ghica, Dan R

    2009-01-01

    The problem of synthesis of gate-level descriptions of digital circuits from behavioural specifications written in higher-level programming languages (hardware compilation) has been studied for a long time, yet a definitive solution has not been forthcoming. The argument of this essay is mainly methodological, bringing a perspective that is informed by recent developments in programming-language theory. We argue that one of the major obstacles in the way of hardware compilation becoming a useful and mature technology is the lack of a well-defined function interface model, i.e. a canonical way in which functions communicate with arguments. We discuss the consequences of this problem and propose a solution based on new developments in programming language theory. We conclude by presenting a prototype implementation and some examples illustrating our principles.

  10. Evaluation of spacecraft technology programs (effects on communication satellite business ventures), volume 2

    Science.gov (United States)

    Greenburg, J. S.; Kaplan, M.; Fishman, J.; Hopkins, C.

    1985-01-01

    The computational procedures used in the evaluation of spacecraft technology programs that impact upon commercial communication satellite operations are discussed. Computer programs and data bases are described.

  11. SOL - SIZING AND OPTIMIZATION LANGUAGE COMPILER

    Science.gov (United States)

    Scotti, S. J.

    1994-01-01

    SOL is a computer language which is geared to solving design problems. SOL includes the mathematical modeling and logical capabilities of a computer language like FORTRAN but also includes the additional power of non-linear mathematical programming methods (i.e. numerical optimization) at the language level (as opposed to the subroutine level). The language-level use of optimization has several advantages over the traditional, subroutine-calling method of using an optimizer: first, the optimization problem is described in a concise and clear manner which closely parallels the mathematical description of optimization; second, a seamless interface is automatically established between the optimizer subroutines and the mathematical model of the system being optimized; third, the results of an optimization (objective, design variables, constraints, termination criteria, and some or all of the optimization history) are output in a form directly related to the optimization description; and finally, automatic error checking and recovery from an ill-defined system model or optimization description is facilitated by the language-level specification of the optimization problem. Thus, SOL enables rapid generation of models and solutions for optimum design problems with greater confidence that the problem is posed correctly. The SOL compiler takes SOL-language statements and generates the equivalent FORTRAN code and system calls. Because of this approach, the modeling capabilities of SOL are extended by the ability to incorporate existing FORTRAN code into a SOL program. In addition, SOL has a powerful MACRO capability. The MACRO capability of the SOL compiler effectively gives the user the ability to extend the SOL language and can be used to develop easy-to-use shorthand methods of generating complex models and solution strategies. The SOL compiler provides syntactic and semantic error-checking, error recovery, and detailed reports containing cross-references to show where
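
    For contrast with the language-level approach described above, the sketch below shows the traditional "optimizer as a subroutine" style that the abstract argues against; the objective, bounds, and constraint are invented placeholders, and the sketch uses SciPy rather than SOL or FORTRAN:

```python
import numpy as np
from scipy.optimize import minimize

# Subroutine-calling style: the user manually wires the model, design variables,
# bounds, and constraints to a general-purpose optimizer.
def objective(x):
    # stand-in system model, e.g. something like structural weight
    return x[0] ** 2 + x[1] ** 2

constraints = [{"type": "ineq", "fun": lambda x: x[0] + x[1] - 1.0}]   # x0 + x1 >= 1
result = minimize(objective,
                  x0=np.array([2.0, 2.0]),
                  bounds=[(0.0, 10.0), (0.0, 10.0)],
                  constraints=constraints)
print(result.x, result.fun)   # approximately [0.5 0.5] and 0.5
```

    In a language-level formulation of the kind SOL provides, the wiring shown above would instead be generated by the compiler from a declarative description of the same optimization problem.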

  12. Variation in the measurement of cranial volume and surface area using 3D laser scanning technology.

    Science.gov (United States)

    Sholts, Sabrina B; Wärmländer, Sebastian K T S; Flores, Louise M; Miller, Kevin W P; Walker, Phillip L

    2010-07-01

    Three-dimensional (3D) laser scanner models of human crania can be used for forensic facial reconstruction, and for obtaining craniometric data useful for estimating age, sex, and population affinity of unidentified human remains. However, the use of computer-generated measurements in a casework setting requires the measurement precision to be known. Here, we assess the repeatability and precision of cranial volume and surface area measurements using 3D laser scanner models created by different operators using different protocols for collecting and processing data. We report intraobserver measurement errors of 0.2% and interobserver errors of 2% of the total area and volume values, suggesting that observer-related errors do not pose major obstacles for sharing, combining, or comparing such measurements. Nevertheless, as no standardized procedure exists for area or volume measurements from 3D models, it is imperative to report the scanning and postscanning protocols employed when such measurements are conducted in a forensic setting.
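
    As a purely arithmetic illustration (with invented measurements, not the study's data), observer-related error of the kind reported above can be expressed as a percentage of the mean measured value:

```python
# Hypothetical cranial volume measurements (cm^3) from repeated 3D models.
same_operator = [1452.0, 1449.5]          # two models built by one operator
two_operators = [1452.0, 1421.0]          # models built by different operators

def percent_error(measurements):
    spread = max(measurements) - min(measurements)
    return 100.0 * spread / (sum(measurements) / len(measurements))

print(f"intraobserver error: {percent_error(same_operator):.2f}%")   # ~0.17%
print(f"interobserver error: {percent_error(two_operators):.2f}%")   # ~2.16%
```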

  13. Technology of high-level nuclear waste disposal. Advances in the science and engineering of the management of high-level nuclear wastes. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hofmann, P.L.; Breslin, J.J. (eds.)

    1981-01-01

    The papers in this volume cover the following subjects: waste isolation and the natural geohydrologic system; repository perturbations of the natural system; radionuclide migration through the natural system; and repository design technology. Individual papers are abstracted.

  14. Cryogenic Fluid Management Technology Workshop. Volume 1: Presentation material and discussion

    Science.gov (United States)

    Aydelott, John C. (Editor); Devol, William (Editor)

    1987-01-01

    The major objective of the workshop was to identify future NASA needs for technology that will allow the management of subcritical cryogenic fluids in the low gravity space environment. Workshop participants were asked to identify those technologies which will require in-space experimentation and are thus candidates for inclusion in the flight experiment being defined at the Lewis Research Center.

  15. Focus on Technology's Impact on Postsecondary Education. Network News. Volume 23, Number 1

    Science.gov (United States)

    L'Orange, Hans P., Ed.

    2004-01-01

    "Network News" provides an overview of technology's impact on postsecondary education. Particular attention is paid to recent studies looking at distance education and access. This issue contains the following articles: (1) New NCES Report: Distance Education at Degree-Granting Postsecondary Institutions 2000-2001; (2) How Does Technology Affect…

  16. The Objective Force Soldier/Soldier Team. Volume II - The Science and Technology Challenges

    Science.gov (United States)

    2001-11-01

    This report is the product of the Army Science... MOSAIC. Commercial technology air interfaces and protocols, such as CDMA, Bluetooth, and personal digital assistants (PDAs), will provide technology...

  17. Mixed and low-level waste treatment facility project. Volume 3, Waste treatment technologies (Draft)

    Energy Technology Data Exchange (ETDEWEB)

    1992-04-01

    The technology information provided in this report is only the first step toward the identification and selection of process systems that may be recommended for a proposed mixed and low-level waste treatment facility. More specific information on each technology will be required to conduct the system and equipment tradeoff studies that will follow these preengineering studies. For example, capacity, maintainability, reliability, cost, applicability to specific waste streams, and technology availability must be further defined. This report does not currently contain all needed information; however, all major technologies considered to be potentially applicable to the treatment of mixed and low-level waste are identified and described herein. Future reports will seek to improve the depth of information on technologies.

  18. Treatment of emphysema using bronchoscopic lung volume reduction coil technology : an update on efficacy and safety

    NARCIS (Netherlands)

    Hartman, Jorine E.; Klooster, Karin; ten Hacken, Nick H. T.; Slebos, Dirk-Jan

    2015-01-01

    In the last decade several promising bronchoscopic lung volume reduction (BLVR) treatments were developed and investigated. One of these treatments is BLVR treatment with coils. The advantage of this specific treatment is that it works independently of collateral flow, and also shows promise for pat

  19. 28. annual offshore technology conference: Proceedings. Volume 3: Construction and installation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    The 32 papers in this volume cover the following topics: Gulf of Mexico developments -- Popeye and SeaStar; Materials, utilization and fabrication; Mooring and station keeping; Offshore pipelines; and Platform installation. Most papers have been processed separately for inclusion on the data base.

  20. Low-rank coal research: Volume 2, Advanced research and technology development: Final report

    Energy Technology Data Exchange (ETDEWEB)

    Mann, M.D.; Swanson, M.L.; Benson, S.A.; Radonovich, L.; Steadman, E.N.; Sweeny, P.G.; McCollor, D.P.; Kleesattel, D.; Grow, D.; Falcone, S.K.

    1987-04-01

    Volume II contains articles on advanced combustion phenomena; combustion inorganic transformation; coal/char reactivity; liquefaction reactivity of low-rank coals; gasification ash and slag characterization; and fine particulate emissions. These articles have been entered individually into EDB and ERA. (LTN)

  1. Seismic design technology for breeder reactor structures. Volume 4. Special topics in piping and equipment

    Energy Technology Data Exchange (ETDEWEB)

    Reddy, D.P.

    1983-04-01

    This volume is divided into five chapters: experimental verification of piping systems, analytical verification of piping restraint systems, seismic analysis techniques for piping systems with multisupport input, development of floor spectra from input response spectra, and seismic analysis procedures for in-core components. (DLC)

  3. Advances in software science and technology

    CERN Document Server

    Hikita, Teruo; Kakuda, Hiroyasu

    1993-01-01

    Advances in Software Science and Technology, Volume 4 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems. Organized into two parts encompassing 10 chapters, this volume begins with an overview of the historical survey of programming languages for vector/parallel computers in Japan and describes compiling methods for supercomputers in Japan. This text then explains the model of a Japanese software factory, which is presented by the logical configuration that has been satisfied by

  4. Characterization of alternative electric generation technologies for the SPS comparative assessment: volume 2, central-station technologies

    Energy Technology Data Exchange (ETDEWEB)

    1980-08-01

    The SPS Concept Development and Evaluation Program includes a comparative assessment. An early first step in the assessment process is the selection and characterization of alternative technologies. This document describes the cost and performance (i.e., technical and environmental) characteristics of six central station energy alternatives: (1) conventional coal-fired powerplant; (2) conventional light water reactor (LWR); (3) combined cycle powerplant with low-Btu gasifiers; (4) liquid metal fast breeder reactor (LMFBR); (5) photovoltaic system without storage; and (6) fusion reactor.

  5. Analytical challenges of determining composition and structure in small volumes with applications to semiconductor technology, nanostructures and solid state science

    Science.gov (United States)

    Ma, Zhiyong; Kuhn, Markus; Johnson, David C.

    2017-03-01

    Determining the structure and composition of small volumes is vital to the ability to understand and control nanoscale properties and critical for advancing both fundamental science and applications, such as semiconductor device manufacturing. While metrology of nanoscale materials (nanoparticles, nanocomposites) and nanoscale semiconductor structures is challenging, both basic research and cutting edge technology benefit from new and enhanced analytical techniques. This focus issue contains articles describing approaches to overcome the challenges in obtaining statistically significant atomic-scale quantification of structure and composition in a variety of materials and devices using electron microscopy and atom probe tomography.

  6. Flexible IDL Compilation for Complex Communication Patterns

    Directory of Open Access Journals (Sweden)

    Eric Eide

    1999-01-01

    Distributed applications are complex by nature, so it is essential that there be effective software development tools to aid in the construction of these programs. Commonplace "middleware" tools, however, often impose a tradeoff between programmer productivity and application performance. For instance, many CORBA IDL compilers generate code that is too slow for high-performance systems. More importantly, these compilers provide inadequate support for sophisticated patterns of communication. We believe that these problems can be overcome, thus making IDL compilers and similar middleware tools useful for a broader range of systems. To this end we have implemented Flick, a flexible and optimizing IDL compiler, and are using it to produce specialized high-performance code for complex distributed applications. Flick can produce specially "decomposed" stubs that encapsulate different aspects of communication in separate functions, thus providing application programmers with fine-grain control over all messages. The design of our decomposed stubs was inspired by the requirements of a particular distributed application called Khazana, and in this paper we describe our experience to date in refitting Khazana with Flick-generated stubs. We believe that the special IDL compilation techniques developed for Khazana will be useful in other applications with similar communication requirements.
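
    To make the idea of "decomposed" stubs concrete, here is a hypothetical sketch (not Flick-generated code, and not CORBA IDL) in which the usual monolithic stub is split so that marshaling, transmission, and unmarshaling are separate functions an application can drive individually:

```python
import json
import struct

# Monolithic stub: a single call hides marshaling, transport and unmarshaling.
def add_rpc(sock, a, b):
    send_request(sock, marshal_add(a, b))
    return unmarshal_result(recv_reply(sock))

# Decomposed stubs: each aspect of communication is exposed separately, so a
# caller can batch requests, overlap communication with work, or reuse buffers.
def marshal_add(a, b):
    payload = json.dumps({"op": "add", "args": [a, b]}).encode()
    return struct.pack("!I", len(payload)) + payload       # length-prefixed message

def send_request(sock, message):
    sock.sendall(message)

def recv_reply(sock):
    (length,) = struct.unpack("!I", sock.recv(4))
    return sock.recv(length)

def unmarshal_result(payload):
    return json.loads(payload)["result"]
```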

  7. Emerging Technologies Program Integration Report. Volume 2. Background, Delphi and Workshop Data. Appendices

    Science.gov (United States)

    1987-05-04

    Models for specific application domains. Need experience with specific applications that will push the hardware/software/conceptual resources... Military impact of this technology: How might the technology in question change US military capabilities? Diminish days lost due to common viral... capability. To achieve necessary goals will require pushing of wave tube, solid-state device, and antenna technologies.

  8. Technology assessment of future intercity passenger transportation systems. Volume 6: Impact assessment

    Science.gov (United States)

    1976-01-01

    Consequences that might occur if certain technological developments take place in intercity transportation are described. These consequences are broad ranging, and include economic, environmental, social, institutional, energy-related, and transportation service implications. The possible consequences are traced through direct (primary) impacts to indirect (secondary, tertiary, etc.) impacts. Chains of consequences are traced, reaching as far beyond the original transportation cause as is necessary to identify all impacts felt to be influenced significantly by the technological development considered.

  9. High-volume extraction of nucleic acids by magnetic bead technology for ultrasensitive detection of bacteria in blood components.

    Science.gov (United States)

    Störmer, Melanie; Kleesiek, Knut; Dreier, Jens

    2007-01-01

    Nucleic acid isolation, the most technically demanding and laborious procedure performed in molecular diagnostics, harbors the potential for improvements in automation. A recent development is the use of magnetic beads covered with nucleic acid-binding matrices. We adapted this technology with a broad-range 23S rRNA real-time reverse transcription (RT)-PCR assay for fast and sensitive detection of bacterial contamination of blood products. We investigated different protocols for an automated high-volume extraction method based on magnetic-separation technology for the extraction of bacterial nucleic acids from platelet concentrates (PCs). We added 2 model bacteria, Staphylococcus epidermidis and Escherichia coli, to a single pool of apheresis-derived, single-donor platelets and assayed the PCs by real-time RT-PCR analysis with an improved primer-probe system and locked nucleic acid technology. Co-amplification of human beta(2)-microglobulin mRNA served as an internal control (IC). We used probit analysis to calculate the minimum concentration of bacteria that would be detected with 95% confidence. For automated magnetic bead-based extraction technology with the real-time RT-PCR, the 95% detection limit was 29 x 10(3) colony-forming units (CFU)/L for S. epidermidis and 22 x 10(3) CFU/L for E. coli. No false-positive results occurred, either due to nucleic acid contamination of reagents or externally during testing of 1030 PCs. High-volume nucleic acid extraction improved the detection limit of the assay. The improvement of the primer-probe system and the integration of an IC make the RT-PCR assay appropriate for bacteria screening of platelets.
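
    The 95% detection limits quoted above come from probit analysis of spiked-sample detection rates; the sketch below shows one common way such a limit can be computed, using entirely made-up concentrations and hit rates rather than the study's data:

```python
import numpy as np
from scipy.stats import norm

# Invented spiking experiment: concentrations (CFU/L) and fraction of replicates detected.
conc = np.array([5e3, 1e4, 2e4, 4e4, 8e4])
hit_rate = np.array([0.20, 0.55, 0.85, 0.97, 0.99])

# Classic probit regression: probit(hit rate) modelled as linear in log10(concentration).
slope, intercept = np.polyfit(np.log10(conc), norm.ppf(hit_rate), 1)

# 95% detection limit: concentration at which the fitted detection probability is 0.95.
lod95 = 10 ** ((norm.ppf(0.95) - intercept) / slope)
print(f"95% detection limit = {lod95:,.0f} CFU/L")
```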

  10. Survey of Technology with Possible Applications to United States Coast Guard Buoy Tenders. Volume 1. Technology Assessment.

    Science.gov (United States)

    1987-09-01

    ...performance. Zig-zag tests show SWATH ships have half the overshoot of conventional ships. At slow speeds, the widely separated propulsion units aid... "Evaluation of a Compound Cycle Engine for Shipboard Gensets," Report No. DTNSRDC-PASD-CR-1886. Mills, R.G., "Innovative Technology on Steam...

  11. Building Technology Forecast and Evaluation (BTFE). Volume 2. Evaluation of Two Structural Systems

    Science.gov (United States)

    1990-11-01

    ...are all one piece of formwork, erected and stripped as a single unit. Use of half or full tunnels depends on the room width, form weight, and crane... expected to be good. Tunnel forms are metal sheets and are designed adequately; temporary supports and scaffolding are used as required... Two structural systems with potential advantage to the U.S. Army Corps of Engineers (USACE) are evaluated: tunnel forming systems and composite panelized systems (Volume II).

  12. Digital Systems Validation Handbook. Volume 2. Chapter 18. Avionic Data Bus Integration Technology

    Science.gov (United States)

    1993-11-01

    U.S. Department of Transportation, Federal Aviation Administration, DOT/FAA/CT-88/10, Handbook Volume II, Digital Systems Validation, Chapter 18... improve identification, control, and auditing of software. SCM and SQA methods in RTCA/DO-178A are drawn directly from proven methods of hardware... procedures and practices; reviews and audits; configuration management; medium control; testing; supplier control; and appropriate records. A brief...

  13. Advances on Propulsion Technology for High-Speed Aircraft. Volume 2

    Science.gov (United States)

    2007-03-01

    ...drag, or cruise. As the prime contractor for NASA, ATK GASL had the unique perspective of seeing the complete program from start to finish. Although... prime contractor to NASA for the execution of the program and had overall responsibility for the detailed design and manufacture and support to flight... Digest, Volume 26, Number 4 (2005); Tyll, J., Erdos, J., Bakos, R., "Low-Cost Free-Flight Testing of Hypersonic Airbreathing Engines," ISOABE 2001.

  14. Fault-tolerant digital microfluidic biochips compilation and synthesis

    CERN Document Server

    Pop, Paul; Stuart, Elena; Madsen, Jan

    2016-01-01

    This book describes for researchers in the fields of compiler technology, design and test, and electronic design automation the new area of digital microfluidic biochips (DMBs), and thus offers a new application area for their methods. The authors present a routing-based model of operation execution, along with several associated compilation approaches, which progressively relax the assumption that operations execute inside fixed rectangular modules. Since operations can experience transient faults during the execution of a bioassay, the authors show how to use both offline (design time) and online (runtime) recovery strategies. The book also presents methods for the synthesis of fault-tolerant application-specific DMB architectures. It presents the current models used for research on compilation and synthesis techniques for DMBs in a tutorial fashion, and includes a set of "benchmarks", which are presented in great detail and include the source code of most of the t...

  15. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler's ability to generate loop-parallel code. We use this compilation system to modify two sequential benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should...

  16. A small evaluation suite for Ada compilers

    Science.gov (United States)

    Wilke, Randy; Roy, Daniel M.

    1986-01-01

    After completing a small Ada pilot project (OCC simulator) for the Multi Satellite Operations Control Center (MSOCC) at Goddard last year, the use of Ada to develop OCCs was recommended. To help MSOCC transition toward Ada, a suite of about 100 evaluation programs was developed which can be used to assess Ada compilers. These programs compare the overall quality of the compilation system, compare the relative efficiencies of the compilers and the environments in which they work, and compare the size and execution speed of generated machine code. Another goal of the benchmark software was to provide MSOCC system developers with rough timing estimates for the purpose of predicting performance of future systems written in Ada.
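
    A rough sketch of the kind of harness such a suite implies, timing both compilation and the generated executable; the compiler command and benchmark names below are placeholders, not the actual MSOCC evaluation programs:

```python
import subprocess
import time

COMPILE_CMD = ["gnatmake", "-O2"]                   # placeholder Ada compiler invocation
BENCHMARKS = ["whetstone.adb", "task_switch.adb"]   # placeholder benchmark sources

def timed(cmd):
    """Run a command and return its wall-clock duration in seconds."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

for source in BENCHMARKS:
    compile_time = timed(COMPILE_CMD + [source])
    run_time = timed(["./" + source.removesuffix(".adb")])
    print(f"{source}: compile {compile_time:.2f}s, run {run_time:.2f}s")
```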

  17. Proof-Carrying Code with Correct Compilers

    Science.gov (United States)

    Appel, Andrew W.

    2009-01-01

    In the late 1990s, proof-carrying code was able to produce machine-checkable safety proofs for machine-language programs even though (1) it was impractical to prove correctness properties of source programs and (2) it was impractical to prove correctness of compilers. But now it is practical to prove some correctness properties of source programs, and it is practical to prove correctness of optimizing compilers. We can produce more expressive proof-carrying code, that can guarantee correctness properties for machine code and not just safety. We will construct program logics for source languages, prove them sound w.r.t. the operational semantics of the input language for a proved-correct compiler, and then use these logics as a basis for proving the soundness of static analyses.

  18. Extension of Alvis compiler front-end

    Energy Technology Data Exchange (ETDEWEB)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl [AGH University of Science and Technology, Department of Applied Computer Science, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2015-12-31

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. The semantics of an Alvis model is expressed as an LTS graph (labelled transition system): execution of any language statement is a transition between formally defined states of the model. The LTS graph is generated from a middle-stage Haskell representation of the Alvis model. Haskell is also used as a part of the Alvis language itself, to define parameter types and operations on them. Thanks to the compiler's modular construction, many aspects of the compilation of an Alvis model can be modified. Providing new plugins for the Alvis Compiler that support languages like Java or C would make it possible to use those languages as part of Alvis instead of Haskell. The paper presents the compiler's internal model and describes how the default specification language can be altered by new plugins.

  19. Code Generation in the Columbia Esterel Compiler

    Directory of Open Access Journals (Sweden)

    Jia Zeng

    2007-02-01

    The synchronous language Esterel provides deterministic concurrency by adopting a semantics in which threads march in step with a global clock and communicate in a very disciplined way. Its expressive power comes at a cost, however: it is a difficult language to compile into machine code for standard von Neumann processors. The open-source Columbia Esterel Compiler is a research vehicle for experimenting with new code generation techniques for the language. Around a front-end and a fairly generic concurrent intermediate representation, a variety of back-ends have been developed. We present three of the most mature ones, which are based on program dependence graphs, dynamic lists, and a virtual machine. After describing the very different algorithms used in each of these techniques, we present experimental results that compare twenty-four benchmarks generated by eight different compilation techniques running on seven different processors.

  20. New Approach to Develop a Bilingual Compiler

    Directory of Open Access Journals (Sweden)

    Shampa Banik

    2014-02-01

    This research work presents the development of a Bangla programming language and its compiler, with the aim of introducing programming to beginners through their mother tongue. The syntax and constructs of the programming language have been kept similar to BASIC, since BASIC's simple syntax makes it a reasonable introductory language for new programmers. A compiler has been developed for the proposed programming language that compiles the source code into an optimized intermediate code. We have developed our system in Java. Our software is an efficient translation engine which can translate English source code to Bangla source code. We have exercised the system with a large set of test cases to identify which aspects of the system best explain its relative performance.
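
    A toy sketch of the keyword-substitution idea behind such a translation engine; the romanized "Bangla" keywords below are invented placeholders, not the actual tokens or implementation (which the paper builds in Java):

```python
import re

# Invented, romanized placeholder keywords mapped onto BASIC-style keywords.
KEYWORD_MAP = {"dhori": "LET", "jodi": "IF", "tahole": "THEN", "dekhao": "PRINT"}

def translate_line(line, mapping):
    """Swap whole-word keywords, leaving identifiers, numbers and operators alone."""
    return re.sub(r"[^\W\d]\w*", lambda m: mapping.get(m.group(0), m.group(0)), line)

program = ["dhori x = 10", "jodi x > 5 tahole dekhao x"]
for line in program:
    print(translate_line(line, KEYWORD_MAP))
# LET x = 10
# IF x > 5 THEN PRINT x
```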

  1. Evaluation of spacecraft technology programs (effects on communication satellite business ventures), volume 1

    Science.gov (United States)

    Greenburg, J. S.; Gaelick, C.; Kaplan, M.; Fishman, J.; Hopkins, C.

    1985-01-01

    Commercial organizations as well as government agencies invest in spacecraft (S/C) technology programs that are aimed at increasing the performance of communications satellites. The value of these programs must be measured in terms of their impacts on the financial performance of the business ventures that may ultimately utilize the communications satellites. An economic evaluation and planning capability was developed and used to assess the impact of NASA on-orbit propulsion and space power programs on typical fixed satellite service (FSS) and direct broadcast service (DBS) communications satellite business ventures. Typical FSS and DBS spin-stabilized and three-axis-stabilized spacecraft were configured in the absence of NASA technology programs. These spacecraft were reconfigured taking into account the anticipated results of the NASA-specified on-orbit propulsion and space power programs. In general, the NASA technology programs resulted in spacecraft with increased capability. The developed methodology for assessing the value of spacecraft technology programs in terms of their impact on the financial performance of communication satellite business ventures is described. Results of the assessment of NASA-specified on-orbit propulsion and space power technology programs are presented for typical FSS and DBS business ventures.

  2. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks, whose PRIMULA-generated propositional instances have thousands of variables, and whose jointrees have clusters...

  3. COMPILATION OF CURRENT HIGH ENERGY PHYSICS EXPERIMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.; Horne, C.P.; Hutchinson, M.S.; Rittenberg, A.; Trippe, T.G.; Yost, G.P.; Addis, L.; Ward, C.E.W.; Baggett, N.; Goldschmidt-Clermong, Y.; Joos, P.; Gelfand, N.; Oyanagi, Y.; Grudtsin, S.N.; Ryabov, Yu.G.

    1981-05-01

    This is the fourth edition of our compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about April 1981, and (2) had not completed taking of data by 1 January 1977. We emphasize that only approved experiments are included.

  4. Study of power management technology for orbital multi-100KWe applications. Volume 3: Requirements

    Science.gov (United States)

    Mildice, J. W.

    1980-01-01

    Mid- to late-1980s power management technology needs to support development of a general-purpose space platform, capable of supplying 100 to 250 kWe to a variety of users in low Earth orbit, are examined. A typical, shuttle-assembled and shuttle-supplied space platform is illustrated, along with a group of payloads which might reasonably be expected to use such a facility. Examination of platform and user power needs yields a set of power requirements used to evaluate power management options for life-cycle cost effectiveness. The most cost-effective ac/dc and dc systems are evaluated, specifically to develop system details which lead to technology goals, including: array and transmission voltages, the best frequency for ac power transmission, and the advantages and disadvantages of ac and dc systems for this application. System and component requirements are compared with the state of the art to identify areas where technological development is required.

  5. Advanced Processing Technology semiannual report, March--December 1991. Volume 1, Number 1

    Energy Technology Data Exchange (ETDEWEB)

    Adamson, M.; Kline-Simon, K. [eds.

    1991-12-31

    This first issue of the APT Semiannual Report focuses on APT's defense-related technologies. These technologies are a continuation of the research, development, and engineering work performed by LLNL's Special Isotope Separation (SIS) Program. SIS was the first large-scale DOE venture that had environmentally conscious manufacturing processes and facilities as its deliverables. The objectives were to create a facility where the only outputs were either usable products or disposable wastes, and to comply with existing and anticipated federal, state, and local regulations related to safeguards, security, health and safety. To meet these objectives, revolutionary changes were needed in plutonium processing operations, chemistry, and equipment. New processes had to be developed that enhanced worker safety, minimized operator radiation dose, minimized waste at the point of generation, and provided for built-in recycling of residues. The SIS Program developed and demonstrated the technology (both chemistry and physics) necessary to provide plutonium with individual isotopic tailoring. This process made it possible to transform fuel-grade plutonium into weapon-grade material. However, due to the changing world political climate, the country's need for plutonium to make new weapons has decreased dramatically. As a result, the planned SIS plutonium-separation plant will not be built. After the SIS Program was canceled in 1991, Congress directed that the plutonium processing technologies under development for the SIS Program be redirected to the weapons program. APT took over the development of the innovative SIS technologies and is applying them to the development of a new, reconfigured Nuclear Weapons Complex -- Complex 21. "Close Out of the SIS Program" describes the completion of the SIS research and development work, and the transfer of key technologies to support this reconfiguration effort.

  7. Technology assessment of solar energy systems. Scenario development and methodology. Volume II

    Energy Technology Data Exchange (ETDEWEB)

    Schiffman, Y.M.

    1981-07-01

    Included are a general overview of the Technology Assessment of Solar Energy systems (TASE) project and a description of the study approach, the development of the TASE scenarios, energy and environmental assumptions, and assumptions and forecasts of the FOSSIL2 National Energy Model upon which the TASE scenarios were based. The Strategic Environmental Assessment System (SEAS) model was used to generate the analytical data base for TASE. Improvements made to SEAS to allow it to model solar and biomass energy technologies are also described.

  8. Shuttle Ground Operations Efficiencies/Technologies Study (SGOE/T). Volume 5: Technical Information Sheets (TIS)

    Science.gov (United States)

    Scholz, A. L.; Hart, M. T.; Lowry, D. J.

    1987-01-01

    The Technology Information Sheet was assembled in database format during Phase I. This document was designed to provide a repository for information pertaining to 144 Operations and Maintenance Instructions (OMI) controlled operations in the Orbiter Processing Facility (OPF), Vehicle Assembly Building (VAB), and PAD. It provides a way to accumulate information about required crew sizes, operations task time duration (serial and/or parallel), special Ground Support Equipment (GSE) required, and identification of a potential application of existing technology or the need for the development of a new technology item.

  9. Fiscal years 1994--1998 Information Technology Strategic Plan. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    A team of senior managers from across the US Nuclear Regulatory Commission (NRC), working with the Office of Information Resources Management (IRM), has completed an NRC Strategic Information Technology (IT) Plan. The Plan addresses three major areas: (1) IT Program Management, (2) IT Infrastructure, and (3) Information and Applications Management. Key recommendations call for accelerating the replacement of Agency workstations, implementing a new document management system, applying business process reengineering to selected Agency work processes, and establishing an Information Technology Council to advise the Director of IRM.

  10. Feasibility Study for the Establishment of a Pharmacy Technology Program. Volume 11, Number 8.

    Science.gov (United States)

    Bourke, Patricia G.; And Others

    In December 1980, a study was conducted by William Rainey Harper College (WRHC) to determine the feasibility of establishing a pharmacy technology program. Professional Life Science and Human Services staff members telephoned 13 hospital pharmacies and four retail pharmacies in WRHC's service area. It was felt that if the 15 responding pharmacies…

  11. Handbook of Research on Technology Tools for Real-World Skill Development (2 Volumes)

    Science.gov (United States)

    Rosen, Yigel, Ed.; Ferrara, Steve, Ed.; Mosharraf, Maryam, Ed.

    2016-01-01

    Education is expanding to include a stronger focus on the practical application of classroom lessons in an effort to prepare the next generation of scholars for a changing world economy centered on collaborative and problem-solving skills for the digital age. "The Handbook of Research on Technology Tools for Real-World Skill Development"…

  12. Sandia technology. Volume 13, number 2 Special issue : verification of arms control treaties.

    Energy Technology Data Exchange (ETDEWEB)

    1989-03-01

    Nuclear deterrence, a cornerstone of US national security policy, has helped prevent global conflict for over 40 years. The DOE and DoD share responsibility for this vital part of national security. The US will continue to rely on nuclear deterrence for the foreseeable future. In the late 1950s, Sandia developed satellite-borne nuclear burst detection systems to support the treaty banning atmospheric nuclear tests. This activity has continued to expand and diversify. When the Non-Proliferation Treaty was ratified in 1970, we began to develop technologies to protect nuclear materials from falling into unauthorized hands. This program grew and now includes systems for monitoring the movement and storage of nuclear materials, detecting tampering, and transmitting sensitive data securely. In the late 1970s, negotiations to further limit underground nuclear testing were being actively pursued. In less than 18 months, we fielded the National Seismic Station, an unattended observatory for in-country monitoring of nuclear tests. In the mid-1980s, arms-control interest shifted to facility monitoring and on-site inspection. Our Technical On-site Inspection Facility is the national test bed for perimeter and portal monitoring technology and the prototype for the inspection portal that was recently installed in the USSR under the Intermediate-Range Nuclear Forces accord. The articles in this special issue of Sandia Technology describe some of our current contributions to verification technology. This work supports the US policy to seek realistic arms control agreements while maintaining our national security.

  13. Global Journal of Computer Science and Technology. Volume 1.2

    Science.gov (United States)

    Dixit, R. K.

    2009-01-01

    Articles in this issue of "Global Journal of Computer Science and Technology" include: (1) Input Data Processing Techniques in Intrusion Detection Systems--Short Review (Suhair H. Amer and John A. Hamilton, Jr.); (2) Semantic Annotation of Stock Photography for CBIR Using MPEG-7 standards (R. Balasubramani and V. Kannan); (3) An Experimental Study…

  15. How to compile a curriculum vitae.

    Science.gov (United States)

    Fish, J

    The previous article in this series tackled the best way to apply for a job. Increasingly, employers request a curriculum vitae as part of the application process. This article aims to assist you in compiling a c.v. by discussing its essential components and content.

  16. Medical History: Compiling Your Medical Family Tree

    Science.gov (United States)

    ... history. Or, you can compile your family's health history on your computer or in a paper file. If you encounter reluctance from your family, consider these strategies: Share your ... have a family history of certain diseases or health conditions. Offer to ...

  17. Communications techniques and equipment: A compilation

    Science.gov (United States)

    1975-01-01

    This Compilation is devoted to equipment and techniques in the field of communications. It contains three sections. One section is on telemetry, including articles on radar and antennas. The second section describes techniques and equipment for coding and handling data. The third and final section includes descriptions of amplifiers, receivers, and other communications subsystems.

  18. Western oil shale development: a technology assessment. Volume 1. Main report

    Energy Technology Data Exchange (ETDEWEB)

    1981-11-01

    The general goal of this study is to present the prospects of shale oil within the context of (1) environmental constraints, (2) available natural and economic resources, and (3) the characteristics of existing and emerging technology. The objectives are: to review shale oil technologies objectively as a means of supplying domestically produced fuels within environmental, social, economic, and legal/institutional constraints; using available data, analyses, and experienced judgment, to examine the major points of uncertainty regarding potential impacts of oil shale development; to resolve issues where data and analyses are compelling or where conclusions can be reached on judgmental grounds; to specify issues which cannot be resolved on the bases of the data, analyses, and experienced judgment currently available; and when appropriate and feasible, to suggest ways for the removal of existing uncertainties that stand in the way of resolving outstanding issues.

  19. Compilation of tRNA sequences.

    Science.gov (United States)

    Sprinzl, M; Grueter, F; Spelzhaus, A; Gauss, D H

    1980-01-11

    This compilation presents in a small space the tRNA sequences so far published. The numbering of tRNAPhe from yeast is used, following the rules proposed by the participants of the Cold Spring Harbor Meeting on tRNA 1978 (1,2; Fig. 1). This numbering allows comparisons with the three-dimensional structure of tRNAPhe. The secondary structure of tRNAs is indicated by specific underlining. In the primary structure, a nucleoside followed by a nucleoside in brackets or a modification in brackets denotes that both types of nucleosides can occupy this position. Part of a sequence in brackets designates a piece of sequence not unambiguously analyzed. Rare nucleosides are named according to the IUPAC-IUB rules (for complicated rare nucleosides and their identification see Table 1); those with lengthy names are given with the prefix x and specified in the footnotes. Footnotes are numbered according to the coordinates of the corresponding nucleoside and are indicated in the sequence by an asterisk. The references are restricted to the citation of the latest publication in those cases where several papers deal with one sequence. For additional information the reader is referred either to the original literature or to other tRNA sequence compilations (3-7). Mutant tRNAs are dealt with in a compilation by J. Celis (8). The compilers would welcome any information from readers regarding missing material or erroneous presentation. On the basis of this numbering system, computer-printed compilations of tRNA sequences in linear form and in cloverleaf form are in preparation.

  20. Municipal Solid Waste (MSW) to Liquid Fuels Synthesis, Volume 1: Availability of Feedstock and Technology

    Energy Technology Data Exchange (ETDEWEB)

    Valkenburt, Corinne; Walton, Christie W.; Thompson, Becky L.; Gerber, Mark A.; Jones, Susanne B.; Stevens, Don J.

    2008-12-01

    This report investigated the potential of using municipal solid waste (MSW) to make synthesis gas (syngas) suitable for production of liquid fuels. Issues examined include: MSW physical and chemical properties affecting its suitability as a gasifier feedstock and for liquid fuels synthesis; the expected process scale required for favorable economics; the availability of MSW in quantities sufficient to meet process scale requirements; and the state-of-the-art of MSW gasification technology.

  1. Economic Feasibility and Market Readiness of Solar Technologies. Draft Final Report. Volume I.

    Energy Technology Data Exchange (ETDEWEB)

    Flaim, Silvio J.; Buchanan, Deborah L.; Christmas, Susan; Fellhauer, Cheryl; Glenn, Barbara; Ketels, Peter A.; Levary, Arnon; Mourning, Pete; Steggerda, Paul; Trivedi, Harit; Witholder, Robert E.

    1978-09-01

    Systems descriptions, costs, and technical and market readiness assessments are reported for ten solar technologies: solar heating and cooling of buildings (SHACOB), passive, agricultural and industrial process heat (A/IPH), biomass, ocean thermal (OTEC), wind (WECS), solar thermal electric, photovoltaics, satellite power station (SPS), and solar total energy systems (STES). Study objectives, scope, and methods are presented. The cost and market analyses portion of Joint Task 5213/6103 will be used to make commercialization assessments in the conclusions of the final report.

  2. ICAM (Integrated Computer-Aided Manufacturing) Manufacturing Cost/Design Guide. Volume 7. Technology Transfer Summary.

    Science.gov (United States)

    1984-09-01

    ...advanced composite structures for production. Conduct material and manufacturing trade-off studies on ATF advanced design, NASA composite wing, and... coatings on the internal surfaces of hollow... "Top-of-the-Line" Manufacturing Technology Success... observing the effect of the proposed change. Potential...

  3. Engineering Technology Reports, Volume 1: Laboratory Directed Research and Development FY00

    Energy Technology Data Exchange (ETDEWEB)

    Baron, A L; Langland, R T; Minichino, C

    2001-10-03

    In FY-2000, Engineering at Lawrence Livermore National Laboratory faced significant pressures to meet critical project milestones, and immediate demands to facilitate the reassignment of employees as the National Ignition Facility (the 600-TW laser facility being designed and built at Livermore, and one of the largest R&D construction projects in the world) was in the process of re-baselining its plan while executing full-speed its technology development efforts. This drive for change occurred as an unprecedented level of management and program changes were occurring within LLNL. I am pleased to report that we met many key milestones and achieved numerous technological breakthroughs. This report summarizes our efforts to perform feasibility and reduce-to-practice studies, demonstrations, and/or techniques--as structured through our technology centers. Whether using computational engineering to predict how giant structures like suspension bridges will respond to massive earthquakes or devising a suitcase-sized microtool to detect chemical and biological agents used by terrorists, we have made solid technical progress. Five Centers focus and guide longer-term investments within Engineering, as well as impact all of LLNL. Each Center is responsible for the vitality and growth of the core technologies it represents. My goal is that each Center will be recognized on an international scale for solving compelling national problems requiring breakthrough innovation. The Centers and their leaders are as follows: Center for Complex Distributed Systems--David B. McCallen; Center for Computational Engineering--Kyran D. Mish; Center for Microtechnology--Raymond P. Mariella, Jr.; Center for Nondestructive Characterization--Harry E. Martz, Jr.; and Center for Precision Engineering--Keith Carlisle.

  4. Technology Scenario for the Year 2005. Volume II. Detailed Scenes for Scenarios.

    Science.gov (United States)

    1981-10-01

    ...from a very considerable distance, acoustic devices that analyse radiated sound signatures, and IR receivers that analyse heat radiation patterns... to counter with masking tactics of one kind or another, such as radiation of false electromagnetic signatures, electronic jamming, etc. Only a... Science, Technology, and Innovation. Prepared for the National Science Foundation. Columbus: Battelle, 1973.

  5. 2005 5th Annual CMMI Technology Conference and User Group. Volume 3 - Wednesday

    Science.gov (United States)

    2005-11-17

    Mr. Ross, during 17 years with Honeywell Air Transport Systems (formerly Sperry Flight Systems) and 2 years with Tracor Aerospace, developed or managed the development of embedded software for avionics systems... estimating, tracking, forecasting, and benchmarking. "Implementing CMMI in a Services Environment," Thomas E. Zience and Roger W. Lee, BAE Systems Information Technology.

  6. WSTIAC: Weapon Systems Technology Information Analysis Center. Volume 6, Number 1

    Science.gov (United States)

    2005-01-01

    Computational Neuroscience Program 2004. Associate Editor, Journal of Counter-Ordnance Technology. Joel L. Davis, Program Manager, Cognitive and Neural...tests were against unitary (nonseparating) targets representative of SCUD-type ballistic...Journal of the Optical Society of America A...Agent (CSEA) and prime contractor for the Aegis Weapon System and Vertical Launch System...agencies. A single query searches across 30 databases and...

  7. Compiled Proceedings: Helping Indochinese Families in Transition Conference (May 11-12, 1981).

    Science.gov (United States)

    Meredith, William H., Ed.; Tweten, Bette J., Ed.

    This compilation presents the proceedings of a conference on Indochinese refugee families. The papers included in the volume are the following: (1) "Counseling Vietnamese Women in Transition," by Tran Nhu Choung; (2) "Preliminary Nutritional and Demographic Assessment of Hmong Refugees in the Area of Puget Sound, Washington,"…

  8. HAL/S-FC compiler system functional specification

    Science.gov (United States)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package, as well as the restrictions and dependencies of the HAL/S-FC system, are also considered.

  9. Annual Proceedings of Selected Papers on the Practice of Educational Communications and Technology Presented at the Annual Convention of the Association for Educational Communications and Technology (32nd, Louisville, KY, 2009). Volume 2

    Science.gov (United States)

    Simonson, Michael, Ed.

    2009-01-01

    For the thirty-second year, the Research and Theory Division of the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. This volume includes papers presented at the national convention of the Association for Educational Communications and Technology held in Louisville, KY. This…

  10. Turbo system technology for downsized high volume engines with PZEV capability

    Energy Technology Data Exchange (ETDEWEB)

    Bjoernsson, Haakan; Johansson, Lena [Volvo Car Corp., Gothenburg (Sweden)

    2008-07-01

    Turbocharging is no longer used only for exotic high-performance vehicles. Instead, boosting technology will be used to provide high specific power output, reduced fuel consumption and consequently lower CO2 emissions for all types of vehicles in the future. The main reason for this change is the obvious need for a shift towards more fuel-efficient, down-sized engines whose output enables lower fuel consumption. This new way of using turbocharging introduces a new set of demands that need to be fulfilled before the technology can be applied efficiently in mass production. Since one of the most basic down-sizing features is a high specific low-end torque with an associated excellent transient behavior at low speeds, the overall charging efficiency must be high over a broad speed and load range. The down-sizing effect also implies a more frequent use of the high specific power, which means that smaller engines need to cope with high exhaust gas temperatures more often. High specific power output also has to be combined with strict US and EC emission legislation, i.e. fulfillment of PZEV and Euro 6 emissions. This will add new challenges to turbo system development. Simultaneously, material prices have increased significantly over the last five years. Unfortunately, these new requirements have a tendency to push the piece price in the wrong direction, which means that down-sizing tends to add a high on-cost to the most price-sensitive products. Therefore, the aim of this paper is to discuss possible solutions for future turbo system technology. (orig.)

  11. Testing-Based Compiler Validation for Synchronous Languages

    Science.gov (United States)

    Garoche, Pierre-Loic; Howar, Falk; Kahsai, Temesghen; Thirioux, Xavier

    2014-01-01

    In this paper we present a novel lightweight approach to validate compilers for synchronous languages. Instead of verifying a compiler for all input programs or providing a fixed suite of regression tests, we extend the compiler to generate a test-suite with high behavioral coverage and geared towards discovery of faults for every compiled artifact. We have implemented and evaluated our approach using a compiler from Lustre to C.
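
    A minimal sketch of the differential-testing idea behind the record above: run a reference semantics and the compiler's output side by side over many input traces and flag any divergence. This sketch uses random traces rather than the coverage-directed generation described in the paper, and the toy stateful node and all names are hypothetical, not the Lustre-to-C harness itself.

      import random

      def reference_step(x, acc):
          """Reference semantics of a toy stateful node: running sum with reset on negative input."""
          return 0 if x < 0 else acc + x

      def compiled_step(x, acc):
          """Stand-in for the artifact produced by the compiler under test."""
          return 0 if x < 0 else acc + x   # a miscompilation would make this diverge

      def differential_test(num_traces=100, trace_len=50, seed=0):
          rng = random.Random(seed)
          for t in range(num_traces):
              acc_ref = acc_cmp = 0
              for step in range(trace_len):
                  x = rng.randint(-5, 100)
                  acc_ref = reference_step(x, acc_ref)
                  acc_cmp = compiled_step(x, acc_cmp)
                  if acc_ref != acc_cmp:
                      return f"divergence in trace {t} at step {step} on input {x}"
          return "no divergence observed"

      print(differential_test())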

  12. Bibliography on Cold Regions Science and Technology Volume 53, Part 2

    Science.gov (United States)

    1999-12-01

    Index entries (excerpt): the velocity distribution in ping-pong-ball avalanches [1998, eng] 53-2020; Martian north polar cap 1996-1997 [1999, eng] 53-3536; warming [1988, eng] 53-1118; Siberian Platform [1997, eng] 53-1740; Antarctic elevation change from 1992 to 1996 [1998, eng]; freezing technology for environmental remediation, Ishii, S., et al. [1998, eng] 53-2958; Atlantic Ocean--Reykjanes Ridge, Dash, J.G. [1999, eng]...

  13. Light sources for high-volume manufacturing EUV lithography: technology, performance, and power scaling

    Science.gov (United States)

    Fomenkov, Igor; Brandt, David; Ershov, Alex; Schafgans, Alexander; Tao, Yezheng; Vaschenko, Georgiy; Rokitski, Slava; Kats, Michael; Vargas, Michael; Purvis, Michael; Rafac, Rob; La Fontaine, Bruno; De Dea, Silvia; LaForge, Andrew; Stewart, Jayson; Chang, Steven; Graham, Matthew; Riggs, Daniel; Taylor, Ted; Abraham, Mathew; Brown, Daniel

    2017-06-01

    Extreme ultraviolet (EUV) lithography is expected to succeed 193-nm immersion multi-patterning technology for sub-10-nm critical layer patterning. In order to be successful, EUV lithography has to demonstrate that it can satisfy the industry requirements in the following critical areas: power, dose stability, etendue, spectral content, and lifetime. Currently, development of second-generation laser-produced plasma (LPP) light sources for ASML's NXE:3300B EUV scanner is complete, and the first units are installed and operational at chipmaker customers. We describe different aspects and performance characteristics of the sources, dose stability results, power scaling, and availability data for EUV sources, and also report new development results.

  14. Army Science and Technology Master Plan, Fiscal Year 1997 - Volume 2.

    Science.gov (United States)

    1996-12-01

    be demonstrated in FY97 and 98, respectively, to exploit both commercial CDMA and BCDMA technology for MSE access. In order to extend ATM services to...followed by a directory of program points of contact with commercial and DSN telephone numbers.

  15. Conceptual design and systems analysis of photovoltaic power systems. Volume III(1). Technology

    Energy Technology Data Exchange (ETDEWEB)

    Pittman, P.F.

    1977-05-01

    Conceptual designs were made and analyses were performed on three types of solar photovoltaic power systems. Included were Residential (1 to 10 kW), Intermediate (0.1 to 10 MW), and Central (50 to 1000 MW) Power Systems to be installed in the 1985 to 2000 time period. Subsystem technology presented here includes: insolation, concentration, silicon solar cell modules, CdS solar cell module, array structure, battery energy storage, power conditioning, residential power system architectural designs, intermediate power system structural design, and central power system facilities and site survey.

  16. Planetary/DOD entry technology flight experiments. Volume 2: Planetary entry flight experiments

    Science.gov (United States)

    Christensen, H. E.; Krieger, R. J.; Mcneilly, W. R.; Vetter, H. C.

    1976-01-01

    The technical feasibility of launching a high speed, earth entry vehicle from the space shuttle to advance technology for the exploration of the outer planets' atmospheres was established. Disciplines of thermodynamics, orbital mechanics, aerodynamics, propulsion, structures, design, electronics and system integration focused on the goal of producing outer planet environments on a probe shaped vehicle during an earth entry. Major aspects of analysis and vehicle design studied include: planetary environments, earth entry environment capability, mission maneuvers, capabilities of shuttle upper stages, a comparison of earth entry planetary environments, experiment design and vehicle design.

  17. Industrial applications study. Volume III. Technology data base evaluation of waste recovery systems. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Harry L.; Hamel, Bernard B.; Karamchetty, Som; Steigelmann, William H.; Gajanana, Birur C.; Agarwal, Anil P.; Klock, Lawrence W.; Henderson, James M.; Calobrisi, Gary; Hedman, Bruce A.; Koluch, Michael; Biancardi, Frank; Bass, Robert; Landerman, Abraham; Peters, George; Limaye, Dilip; Price, Jeffrey; Farr, Janet

    1977-01-01

    An analytical study was undertaken to estimate the present and potential technical and economic characteristics of a wide range of components and complete systems capable of converting industrial and commercial waste heat into mechanical or electrical power and/or building and process heating and cooling. The component and system technologies evaluated include: Rankine-, Stirling-, and Brayton-cycle power systems; reciprocating-, rotary-, and turbo-expanders; heat exchangers and heat pumps; thermally driven cooling and dehumidification systems; and integrated systems capable of providing multiple outputs. Extensive analyses were conducted of Rankine-cycle systems using steam, halogenated hydrocarbons, and other organic compounds as working fluids. Performance characteristics, recoverable output power, and installed costs were estimated and are presented herein for Rankine-cycle systems utilizing selected working fluids over a range of waste heat source temperatures between approximately 200 and 1000°F. Data describing the performance capabilities, technology and installed costs of heat exchangers, expanders and thermally driven absorption, vapor compression, steam-jet cooling and refrigeration systems are presented herein together with limited performance and cost estimates for Stirling-cycle power recovery systems. The component and system data were used to provide a preliminary assessment of the recoverable energy and associated system costs when integrated with generalized waste heat sources identified by Drexel University from their two-digit SIC industrial energy survey.
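
    For orientation on the 200 to 1000°F source range quoted above, the sketch below computes the ideal (Carnot) efficiency bound against an assumed 70°F heat sink; the sink temperature is an assumption for illustration, and real Rankine-cycle recovery systems achieve only a fraction of this bound.

      def f_to_kelvin(t_f):
          return (t_f - 32.0) * 5.0 / 9.0 + 273.15

      T_SINK = f_to_kelvin(70.0)   # assumed heat rejection temperature

      for t_source_f in (200, 400, 600, 800, 1000):
          eta_carnot = 1.0 - T_SINK / f_to_kelvin(t_source_f)
          print(f"{t_source_f:5d} F source -> Carnot limit {eta_carnot:.1%}")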

  18. Renewable Electricity Futures Study. Volume 2: Renewable Electricity Generation and Storage Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Augustine, C.; Bain, R.; Chapman, J.; Denholm, P.; Drury, E.; Hall, D.G.; Lantz, E.; Margolis, R.; Thresher, R.; Sandor, D.; Bishop, N.A.; Brown, S.R.; Cada, G.F.; Felker, F.

    2012-06-01

    The Renewable Electricity Futures (RE Futures) Study investigated the challenges and impacts of achieving very high renewable electricity generation levels in the contiguous United States by 2050. The analysis focused on the sufficiency of the geographically diverse U.S. renewable resources to meet electricity demand over future decades, the hourly operational characteristics of the U.S. grid with high levels of variable wind and solar generation, and the potential implications of deploying high levels of renewables in the future. RE Futures focused on technical aspects of high penetration of renewable electricity; it did not focus on how to achieve such a future through policy or other measures. Given the inherent uncertainties involved with analyzing alternative long-term energy futures as well as the multiple pathways that might be taken to achieve higher levels of renewable electricity supply, RE Futures explored a range of scenarios to investigate and compare the impacts of renewable electricity penetration levels (30%-90%), future technology performance improvements, potential constraints to renewable electricity development, and future electricity demand growth assumptions. RE Futures was led by the National Renewable Energy Laboratory (NREL) and the Massachusetts Institute of Technology (MIT).

  19. Renewable Electricity Futures Study. Volume 2. Renewable Electricity Generation and Storage Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Augustine, Chad [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bain, Richard [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chapman, Jamie [Texas Tech Univ., Lubbock, TX (United States); Denholm, Paul [National Renewable Energy Lab. (NREL), Golden, CO (United States); Drury, Easan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hall, Douglas G. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Lantz, Eric [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States); Thresher, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bishop, Norman A. [Knight Piesold, Denver, CO (United States); Brown, Stephen R. [HDR/DTA, Portland, ME (United States); Cada, Glenn F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Felker, Fort [National Renewable Energy Lab. (NREL), Golden, CO (United States); Fernandez, Steven J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Goodrich, Alan C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hagerman, George [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Heath, Garvin [National Renewable Energy Lab. (NREL), Golden, CO (United States); O' Neil, Sean [Ocean Renewable Energy Coalition, Portland, OR (United States); Paquette, Joshua [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tegen, Suzanne [National Renewable Energy Lab. (NREL), Golden, CO (United States); Young, Katherine [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2012-06-15

    The Renewable Electricity Futures (RE Futures) Study investigated the challenges and impacts of achieving very high renewable electricity generation levels in the contiguous United States by 2050. The analysis focused on the sufficiency of the geographically diverse U.S. renewable resources to meet electricity demand over future decades, the hourly operational characteristics of the U.S. grid with high levels of variable wind and solar generation, and the potential implications of deploying high levels of renewables in the future. RE Futures focused on technical aspects of high penetration of renewable electricity; it did not focus on how to achieve such a future through policy or other measures. Given the inherent uncertainties involved with analyzing alternative long-term energy futures as well as the multiple pathways that might be taken to achieve higher levels of renewable electricity supply, RE Futures explored a range of scenarios to investigate and compare the impacts of renewable electricity penetration levels (30%–90%), future technology performance improvements, potential constraints to renewable electricity development, and future electricity demand growth assumptions. RE Futures was led by the National Renewable Energy Laboratory (NREL) and the Massachusetts Institute of Technology (MIT). Learn more at the RE Futures website. http://www.nrel.gov/analysis/re_futures/

  20. Nanoprobe NAPPA Arrays for the Nanoconductimetric Analysis of Ultra-Low-Volume Protein Samples Using Piezoelectric Liquid Dispensing Technology

    Directory of Open Access Journals (Sweden)

    Eugenia Pechkova

    2015-03-01

    Full Text Available In recent years, the evolution and advances of nanobiotechnologies applied to the systematic study of proteins, namely proteomics, both structural and functional, and specifically the development of more sophisticated and large-scale protein arrays, have enabled scientists to investigate protein interactions and functions with unforeseen precision and wealth of detail. Here, we present a further advancement of our previously introduced and described Nucleic Acid Programmable Protein Array (NAPPA)-based nanoconductometric sensor. We coupled Quartz Crystal Microbalance with Dissipation factor Monitoring (QCM_D) with piezoelectric inkjet printing technology (namely, the newly developed ActivePipette), which makes it possible to significantly reduce the volume of probe required for gene/protein arrays. We performed a negative control (with master mix, or MM) and a positive control (MM_p53 plus MDM2). We performed this experiment both in static and in flow conditions, computing the apparent dissociation constant of the p53-MDM2 complex (130 nM), in excellent agreement with the published literature. We compared the results obtained with the ActivePipette printing and dispensing technology vs. pin spotting. Without the ActivePipette, after MDM2 addition the shift in frequency (Δf) was 7575 Hz and the corresponding adsorbed mass was 32.9 μg. With the ActivePipette technology, after MDM2 addition Δf was 7740 Hz and the corresponding adsorbed mass was 33.6 μg. With this experiment, we confirmed the sensing potential of our device, which is able to discriminate each gene and protein as well as their interactions, showing for each of them a unique conductance curve. Moreover, we obtained a better yield with the ActivePipette technology.

  1. Complete blood count using VCS (volume, conductivity, light scatter) technology is affected by hyperlipidemia in a child with acute leukemia.

    Science.gov (United States)

    Gokcebay, D G; Azik, F M; Isik, P; Bozkaya, I O; Kara, A; Tavil, E B; Yarali, N; Tunc, B

    2011-12-01

    Asparaginase, an effective drug in the treatment of childhood acute lymphoblastic leukemia (ALL), has become an important component of most childhood ALL regimens during the remission induction or intensification phases of treatment. The incidence range of asparaginase-associated lipid abnormalities seen in children is 67-72%. Lipemia causes erroneous results in analyzers that use photometric methods to analyze blood samples. We describe a case of l-asparaginase-associated severe hyperlipidemia with complete blood count abnormalities. Complete blood count analysis was performed with the Beckman COULTER(®) GEN·S™ system, which uses the Coulter Volume, Conductivity, Scatter technology to probe hydrodynamically focused cells. Although significant inaccuracy in hemoglobin determination is expected starting from a lipid value of 3450 mg/dl, we observed it at a triglyceride level of 1466 mg/dl. Complete blood count analysis revealed exceptionally high hemoglobin, mean corpuscular hemoglobin, and mean corpuscular hemoglobin concentration levels that were discordant with the red blood cell count, mean corpuscular volume, and hematocrit levels. The total leukocyte count varied spontaneously over a wide range and was checked with a blood smear. The platelet count was in the expected range (Table 1). Thus, we recognized it as a laboratory error, and the patient's follow-up, especially for red cell parameters, was based on red blood cell count and hematocrit values.

  2. Automatic code generation in SPARK: Applications of computer algebra and compiler-compilers

    Energy Technology Data Exchange (ETDEWEB)

    Nataf, J.M.; Winkelmann, F.

    1992-09-01

    We show how computer algebra and compiler-compilers are used for automatic code generation in the Simulation Problem Analysis and Research Kernel (SPARK), an object oriented environment for modeling complex physical systems that can be described by differential-algebraic equations. After a brief overview of SPARK, we describe the use of computer algebra in SPARK's symbolic interface, which generates solution code for equations that are entered in symbolic form. We also describe how the Lex/Yacc compiler-compiler is used to achieve important extensions to the SPARK simulation language, including parametrized macro objects and steady-state resetting of a dynamic simulation. The application of these methods to solving the partial differential equations for two-dimensional heat flow is illustrated.
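
    The sketch below illustrates the flavor of generating solution code from an equation entered in symbolic form, using SymPy as a stand-in for SPARK's computer-algebra interface; the heat-transfer relation and all names are hypothetical, and this is not the SPARK implementation.

      import sympy as sp

      q, U, A, t_hot, t_cold = sp.symbols('q U A t_hot t_cold')
      equation = sp.Eq(q, U * A * (t_hot - t_cold))   # a simple symbolic model equation

      # Solve the equation for each variable in turn and emit a C-style assignment.
      for unknown in (q, U, A, t_hot, t_cold):
          solution = sp.solve(equation, unknown)[0]
          print(f"{unknown} = {sp.ccode(solution)};")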

  3. Automatic code generation in SPARK: Applications of computer algebra and compiler-compilers

    Energy Technology Data Exchange (ETDEWEB)

    Nataf, J.M.; Winkelmann, F.

    1992-09-01

    We show how computer algebra and compiler-compilers are used for automatic code generation in the Simulation Problem Analysis and Research Kernel (SPARK), an object oriented environment for modeling complex physical systems that can be described by differential-algebraic equations. After a brief overview of SPARK, we describe the use of computer algebra in SPARK's symbolic interface, which generates solution code for equations that are entered in symbolic form. We also describe how the Lex/Yacc compiler-compiler is used to achieve important extensions to the SPARK simulation language, including parametrized macro objects and steady-state resetting of a dynamic simulation. The application of these methods to solving the partial differential equations for two-dimensional heat flow is illustrated.

  4. Process compilation methods for thin film devices

    Science.gov (United States)

    Zaman, Mohammed Hasanuz

    This doctoral thesis presents the development of a systematic method of automatic generation of fabrication processes (or process flows) for thin film devices starting from schematics of the device structures. This new top-down design methodology combines formal mathematical flow construction methods with a set of library-specific available resources to generate flows compatible with a particular laboratory. Because this methodology combines laboratory resource libraries with a logical description of thin film device structure and generates a set of sequential fabrication processing instructions, this procedure is referred to as process compilation, in analogy to the procedure used for compilation of computer programs. Basically, the method developed uses a partially ordered set (poset) representation of the final device structure, which describes the order between its various components expressed in the form of a directed graph. Each of these components is essentially fabricated "one at a time" in a sequential fashion. If the directed graph is acyclic, the sequence in which these components are fabricated is determined from the poset linear extensions, and the component sequence is finally expanded into the corresponding process flow. This graph-theoretic process flow construction method is powerful enough to formally prove the existence and multiplicity of flows, thus creating a design space D suitable for optimization. The cardinality |D| for a device with N components can be large, with a worst case |D| ≤ (N-1)!, yielding in general a combinatorial explosion of solutions. The number of solutions is hence controlled through a priori estimates of |D| and condensation (i.e., reduction) of the device component graph. The mathematical method has been implemented in a set of algorithms that are parts of the software tool MISTIC (Michigan Synthesis Tools for Integrated Circuits). MISTIC is a planar process compiler that generates
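
    A compact illustration of the poset idea described above (not MISTIC itself): the device is a directed acyclic graph of components, every linear extension of the partial order is a feasible fabrication sequence, and enumerating the extensions exposes the design space D. The four-component device below is hypothetical.

      from itertools import permutations

      # "a precedes b" means a must be fabricated before b (hypothetical device)
      precedes = {("substrate", "electrode_a"),
                  ("substrate", "electrode_b"),
                  ("electrode_a", "passivation"),
                  ("electrode_b", "passivation")}
      components = {"substrate", "electrode_a", "electrode_b", "passivation"}

      def is_linear_extension(order):
          position = {c: i for i, c in enumerate(order)}
          return all(position[a] < position[b] for a, b in precedes)

      flows = [order for order in permutations(components) if is_linear_extension(order)]
      print(f"|D| = {len(flows)} feasible process flow(s)")
      for flow in flows:
          print(" -> ".join(flow))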

  5. Compiling CIL Rewriting Language for Multiprocessors

    Institute of Scientific and Technical Information of China (English)

    田新民; 王鼎兴; et al.

    1994-01-01

    The high-level Compiler Intermediate Language CIL is a general-purpose description language for a parallel graph rewriting computational model, intended for parallel implementation of declarative languages on multiprocessor systems. In this paper, we first outline a new Hybrid Execution Model (HEM) and the corresponding parallel abstract machine PAM/TGR, based on the extended parallel Graph Rewriting Computational Model EGRCM, for implementing the CIL language on distributed memory multiprocessor systems. Then we focus on compiling the CIL language with various optimizing techniques such as pattern matching, rule indexing, node ordering and compile-time partial scheduling. The experimental results on a 16-node transputer array demonstrate the effectiveness of our model and strategies.
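
    To make the rewriting vocabulary concrete, here is a toy term-rewriting sketch with naive pattern matching; it is an illustration only, not the CIL/PAM implementation or its optimizations (rule indexing, node ordering, compile-time partial scheduling).

      def match(pattern, term):
          """Return variable bindings if pattern matches term, else None.
          Variables are strings starting with '?'."""
          if isinstance(pattern, str) and pattern.startswith("?"):
              return {pattern: term}
          if (isinstance(pattern, tuple) and isinstance(term, tuple)
                  and len(pattern) == len(term)):
              bindings = {}
              for p, t in zip(pattern, term):
                  sub = match(p, t)
                  if sub is None:
                      return None
                  bindings.update(sub)
              return bindings
          return {} if pattern == term else None

      def rewrite(term, rules):
          """Rewrite bottom-up, applying the first matching rule until a normal form is reached."""
          if isinstance(term, tuple):
              term = tuple(rewrite(t, rules) for t in term)
          for pattern, build in rules:
              bindings = match(pattern, term)
              if bindings is not None:
                  return rewrite(build(bindings), rules)
          return term

      # Peephole-style rules: x + 0 -> x and x * 1 -> x
      rules = [(("add", "?x", 0), lambda b: b["?x"]),
               (("mul", "?x", 1), lambda b: b["?x"])]

      print(rewrite(("mul", ("add", "y", 0), 1), rules))   # prints: y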

  6. Distributed technologies in California's energy future. Volume I

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, M.; Craig, P.; McGuire, C.B.; Simmons, M. (eds.)

    1977-09-01

    This interim report contains eight of the eighteen chapters included in the complete report. In Chapter I, pertinent data, facts, and observations are made following an initial summary. Chapter II is an introduction, citing especially the writings of Amory Lovins. The criteria used in defining distributed systems, suggested by Lovins, are that the technologies be renewable, environmentally benign, local, subject to graceful failure, foolproof, flexible, comprehensible, and matched in energy quality. The following chapters are: The Energy Predicament; The California Setting; Energy Resources for California's Future; Alternative Energy Futures for California; Issues and Problems; and Directions for Future Work. Six appendices deal with residential heating loads and air conditioning, allocations, co-generation, population projections, and the California wind energy resource. (MCW)

  7. Solar thermal technology development: Estimated market size and energy cost savings. Volume 1: Executive summary

    Science.gov (United States)

    Gates, W. R.

    1983-01-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. The fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. STT R&D is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest.

  8. Lewis Structures Technology, 1988. Volume 3: Structural Integrity Fatigue and Fracture Wind Turbines HOST

    Science.gov (United States)

    1988-01-01

    The charter of the Structures Division is to perform and disseminate results of research conducted in support of aerospace engine structures. These results have a wide range of applicability to practitioners of structural engineering mechanics beyond the aerospace arena. The specific purpose of the symposium was to familiarize the engineering structures community with the depth and range of research performed by the division and its academic and industrial partners. Sessions covered vibration control, fracture mechanics, ceramic component reliability, parallel computing, nondestructive evaluation, constitutive models and experimental capabilities, dynamic systems, fatigue and damage, wind turbines, hot section technology (HOST), aeroelasticity, structural mechanics codes, computational methods for dynamics, structural optimization, applications of structural dynamics, and structural mechanics computer codes.

  9. Building America Best Practices Series, Volume 6: High-Performance Home Technologies: Solar Thermal & Photovoltaic Systems

    Energy Technology Data Exchange (ETDEWEB)

    Baechler, Michael C.; Gilbride, Theresa L.; Ruiz, Kathleen A.; Steward, Heidi E.; Love, Pat M.

    2007-06-04

    This guide was written by PNNL for the US Department of Energy's Building America program to provide information for residential production builders interested in building near zero energy homes. The guide provides in-depth descriptions of various roof-top photovoltaic power generating systems for homes, as well as extensive information on various designs of solar thermal water heating systems. It also provides construction company owners and managers with an understanding of how solar technologies can be added to their homes in a way that is cost effective, practical, and marketable. Twelve case studies provide examples of production builders across the United States who are building energy-efficient homes with photovoltaic or solar water heating systems.

  10. Solar thermal technology development: Estimated market size and energy cost savings. Volume 1: Executive summary

    Science.gov (United States)

    Gates, W. R.

    1983-02-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. The fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. STT R&D is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest.

  11. Proceedings of the Twentieth International Symposium on Space Technology and Science. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-10-31

    The 20th International Symposium on Space Technology and Science was held in Japan on May 19-25, 1996, and many papers were presented. This proceedings volume contains 252 of the papers read at the symposium, including the following: Computational fluid dynamics in the design of M-V rocket motors, in the propulsion field; Joint structures of carbon-carbon composites, in the field of materials and structures; On-orbit attitude control experiment of ETS-VI, in the field of astrodynamics, navigation, guidance and control; Magnetic transport of bubbles in liquid in microgravity; The outline and development status of JEM-EF, in the field of on-orbit and ground support systems. The proceedings also include the papers titled Conceptual study of H-IIA rocket, in the space transportation field; Microgravity research, in the microgravity science field; and 'Project Genesys', in the field of satellite communications and broadcasting.

  12. Specialized Silicon Compilers for Language Recognition.

    Science.gov (United States)

    1984-07-01

    the circuits produced by a compiler can be verified by formal methods. Each primitive cell can be checked independently of the others. When all...primitive cell, each non-terminal corresponds to a more complex combination of cells, and each production corresponds to a construction rule. A...terminal symbol is reached during the parse, the corresponding primitive cell is added to the circuit. The following grammar for regular expressions is
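
    The excerpt above maps grammar symbols to hardware cells: each terminal contributes a primitive cell and each production applies a construction rule. A minimal recursive-descent sketch of that mapping for a regular-expression grammar follows; the cell names (PRIMITIVE_CELL, SEQ_CELL, ALT_CELL, STAR_CELL) are hypothetical placeholders, not the cell library of the report.

      def parse(regex):
          pos = 0

          def peek():
              return regex[pos] if pos < len(regex) else None

          def eat(expected):
              nonlocal pos
              assert regex[pos] == expected
              pos += 1

          def expression():              # expr ::= term ('|' term)*
              node = term()
              while peek() == "|":
                  eat("|")
                  node = ("ALT_CELL", node, term())
              return node

          def term():                    # term ::= factor factor*
              node = factor()
              while peek() not in (None, "|", ")"):
                  node = ("SEQ_CELL", node, factor())
              return node

          def factor():                  # factor ::= atom '*'?
              node = atom()
              while peek() == "*":
                  eat("*")
                  node = ("STAR_CELL", node)
              return node

          def atom():                    # atom ::= '(' expr ')' | literal
              if peek() == "(":
                  eat("(")
                  node = expression()
                  eat(")")
                  return node
              symbol = peek()
              eat(symbol)
              return ("PRIMITIVE_CELL", symbol)   # one primitive cell per terminal

          return expression()

      print(parse("(a|b)*abb"))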

  13. 1991 OCRWM bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-05-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management, to provide current information about the national program for managing spent fuel and high-level radioactive waste. The document is a compilation of issues from the 1991 calendar year. A table of contents and an index have been provided to reference information contained in this year's Bulletins.

  14. Languages, compilers and run-time environments for distributed memory machines

    CERN Document Server

    Saltz, J

    1992-01-01

    Papers presented within this volume cover a wide range of topics related to programming distributed memory machines. Distributed memory architectures, although having the potential to supply the very high levels of performance required to support future computing needs, present awkward programming problems. The major issue is to design methods which enable compilers to generate efficient distributed memory programs from relatively machine independent program specifications. This book is the compilation of papers describing a wide range of research efforts aimed at easing the task of programmin

  15. Transforming Science and Technology: Our Future Depends on It. Volume 1 [and] Volume 2: Proceedings and Contributions to the International Gender and Science and Technology Conference (7th, Waterloo, Ontario, Canada, July 31-August 5, 1993) = Transformer les sciences et la technologie: notre avenir en depend. Volume 1 [and] Volume 2. Les soumissions a la septieme conference internationale sur l'equite des sexes en science et en technologie (du 31 juillet au 5 aout 1993).

    Science.gov (United States)

    Haggerty, Sharon, Ed.; Holmes, Ann, Ed.

    This two-volume set of papers was produced for the seventh International Gender and Science and Technology (GASAT) Conference. Abstracts of all papers and other presentations have been translated and are published in both English and French. Papers are published in the language in which they were submitted (English or French). GASAT provides a…

  16. Study of power management technology for orbital multi-100KWe applications. Volume 2: Study results

    Science.gov (United States)

    Mildice, J. W.

    1980-07-01

    The preliminary requirements and technology advances required for cost-effective space power management systems at multi-100 kilowatt levels were identified. System requirements were defined by establishing a baseline space platform in the 250 kWe range and examining typical user loads and interfaces. The most critical design parameters identified for detailed analysis include: increased distribution voltages and space plasma losses, the choice between ac and dc distribution systems, shuttle servicing effects on reliability, life cycle costs, and frequency impacts to the power management system and payload systems for ac transmission. The first choice for a power management system for this kind of application and size range is a hybrid ac/dc combination with the following major features: modular design and construction sized for minimum weight/life cycle cost; high voltage transmission (100 Vac RMS); medium voltage array (>= 440 Vdc); resonant inversion; transformer rotary joint; high frequency power transmission line (>= 20 kHz); energy storage on the array side of the rotary joint; fully redundant; and 10 year life with minimal replacement and repair.

  17. Combining Compile-Time and Run-Time Parallelization

    Directory of Open Access Journals (Sweden)

    Sungdo Moon

    1999-01-01

    Full Text Available This paper demonstrates that significant improvements to automatic parallelization technology require that existing systems be extended in two ways: (1) they must combine high-quality compile-time analysis with low-cost run-time testing; and (2) they must take control flow into account during analysis. We support this claim with the results of an experiment that measures the safety of parallelization at run time for loops left unparallelized by the Stanford SUIF compiler's automatic parallelization system. We present results of measurements on programs from two benchmark suites - SPECFP95 and NAS sample benchmarks - which identify inherently parallel loops in these programs that are missed by the compiler. We characterize remaining parallelization opportunities, and find that most of the loops require run-time testing, analysis of control flow, or some combination of the two. We present a new compile-time analysis technique that can be used to parallelize most of these remaining loops. This technique is designed to not only improve the results of compile-time parallelization, but also to produce low-cost, directed run-time tests that allow the system to defer binding of parallelization until run time when safety cannot be proven statically. We call this approach predicated array data-flow analysis. We augment array data-flow analysis, which the compiler uses to identify independent and privatizable arrays, by associating predicates with array data-flow values. Predicated array data-flow analysis allows the compiler to derive "optimistic" data-flow values guarded by predicates; these predicates can be used to derive a run-time test guaranteeing the safety of parallelization.
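
    The sketch below shows the shape of a run-time test guarding a parallelization decision: a cheap predicate is evaluated before choosing between the sequential and the parallel schedule. It is a schematic Python illustration (the thread pool only illustrates dispatch), not SUIF's predicated array data-flow analysis, and the aliasing check is a deliberately simple stand-in for a compiler-derived predicate.

      from concurrent.futures import ThreadPoolExecutor

      def loop_body(a, b, idx, i):
          a[i] = b[idx[i]] + 1

      def run_loop(a, b, idx):
          # Run-time predicate: the loop writes a[0..n-1]; is any of those
          # locations also read through idx (i.e., do reads alias the writes)?
          reads_alias_writes = (a is b) and any(0 <= j < len(a) for j in idx)
          if reads_alias_writes:
              for i in range(len(idx)):            # safety not established: stay sequential
                  loop_body(a, b, idx, i)
          else:
              with ThreadPoolExecutor() as pool:   # iterations independent: run in parallel
                  list(pool.map(lambda i: loop_body(a, b, idx, i), range(len(idx))))

      a = [0] * 8
      b = list(range(100, 108))
      run_loop(a, b, idx=[7, 6, 5, 4, 3, 2, 1, 0])
      print(a)   # [108, 107, 106, 105, 104, 103, 102, 101]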

  18. Educacion en Ciencia, Tecnologia y Sociedad: Teoria y Practica (Education in Science, Technology, and Society: Theory and Practice).

    Science.gov (United States)

    Pena Borrero, Margarita, Ed.

    This volume compiles Spanish translations of seven articles on different aspects of Science, Technology and Society Education. The papers, originally written in English, were used during the first in-service training seminar for high school science teachers, which took place in Mayaguez (Puerto Rico) under the joint sponsorship of the National…

  19. An Object-Oriented Approach to C++ Compiler Technology

    NARCIS (Netherlands)

    Sminchisescu, Cristian; Telea, Alexandru

    1999-01-01

    This paper focuses on the use of object-oriented approaches to syntactical and semantical analysis for complex object-oriented languages like C++. We are interested in these issues both from a design and implementation point of view. We implement a semantic analyzer in an object-oriented manner, usi

  20. Developments and innovation in carbon dioxide (CO{sub 2}) capture and storage technology. Volume 1: carbon dioxide (CO{sub 2}) capture, transport and industrial applications

    Energy Technology Data Exchange (ETDEWEB)

    Mercedes Maroto-Valer, M. (ed.)

    2010-07-01

    This volume initially reviews the economics, regulation and planning of CCS for power plants and industry, and goes on to explore developments and innovation in post- and pre-combustion and advanced combustion processes and technologies for CO{sub 2} capture in power plants. This coverage is extended with sections on CO{sub 2} compression, transport and injection and industrial applications of CCS technology, including in the cement and concrete and iron and steel industries.

  1. National Energy Strategy: A compilation of public comments; Interim Report

    Energy Technology Data Exchange (ETDEWEB)

    1990-04-01

    This Report presents a compilation of what the American people themselves had to say about problems, prospects, and preferences in energy. The Report draws on the National Energy Strategy public hearing record and accompanying documents. In all, 379 witnesses appeared at the hearings to exchange views with the Secretary, Deputy Secretary, and Deputy Under Secretary of Energy, and Cabinet officers of other Federal agencies. Written submissions came from more than 1,000 individuals and organizations. Transcripts of the oral testimony and question-and-answer (Q-and-A) sessions, as well as prepared statements submitted for the record and all other written submissions, form the basis for this compilation. Citations of these sources in this document use a system of identifying symbols explained below and in the accompanying box. The Report is organized into four general subject areas concerning: (1) efficiency in energy use, (2) the various forms of energy supply, (3) energy and the environment, and (4) the underlying foundations of science, education, and technology transfer. Each of these, in turn, is subdivided into sections addressing specific topics --- such as (in the case of energy efficiency) energy use in the transportation, residential, commercial, and industrial sectors, respectively. 416 refs., 44 figs., 5 tabs.

  2. The FORTRAN NALAP code adapted to a microcomputer compiler

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso, E-mail: plobo.a@uol.com.b, E-mail: eduardo@ieav.cta.b, E-mail: fbraz@ieav.cta.b, E-mail: guimarae@ieav.cta.b [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)

    2010-07-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA (TEcnologia de Reatores Rapidos Avancados, Technology for Advanced Fast Reactors) project, aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to a microcomputer compiler called Compaq Visual Fortran (Version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly to remove unnecessary routines, eliminate old statements, introduce new ones and also include extended precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: the normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a transient of loss of flow; and transients protected from overpower. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 80's. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the usage of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  3. HAL/S-360 compiler system specification

    Science.gov (United States)

    Johnson, A. E.; Newbold, P. N.; Schulenberg, C. W.; Avakian, A. E.; Varga, S.; Helmers, P. H.; Helmers, C. T., Jr.; Hotz, R. L.

    1974-01-01

    A three phase language compiler is described which produces IBM 360/370 compatible object modules and a set of simulation tables to aid in run time verification. A link edit step augments the standard OS linkage editor. A comprehensive run time system and library provide the HAL/S operating environment, error handling, a pseudo real time executive, and an extensive set of mathematical, conversion, I/O, and diagnostic routines. The specifications of the information flow and content for this system are also considered.

  4. Compiling the First Monolingual Lusoga Dictionary

    Directory of Open Access Journals (Sweden)

    Minah Nabirye

    2011-10-01

    Full Text Available

    Abstract: In this research article a study is made of the approach followed to compile the first-ever monolingual dictionary for Lusoga. Lusoga is a Bantu language spoken in Uganda by slightly over two million people. Being an under-resourced language, the Lusoga orthography had to be designed, a grammar written, and a corpus built, before embarking on the compilation of the dictionary. This compilation was aimed at attaining an academic degree, hence requiring a rigorous research methodology. Firstly, the prevailing methods for compiling dictionaries were mainly practical and insufficient in explaining the theoretical linguistic basis for dictionary compilation. Since dictionaries are based on meaning, the theory of meaning was used to account for all linguistic data considered in dictionaries. However, meaning is considered at a very abstract level, far removed from the process of compiling dictionaries. Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular theory explains how the different modules of a language contribute information to the different parts of the dictionary article or dictionary information in general. Secondly, the research also had to contend with the different approaches for analysing Bantu languages for Bantu and European audiences. A description of the Bantu- and European-centred approaches to Bantu studies was undertaken in respect of (a) the classification of Lusoga words, and (b) the specification of their citations. As a result, Lusoga lexicography deviates from the prevailing Bantu classification and citation of nouns, adjectives and verbs in particular. The dictionary was tested on two separate occasions and all the feedback was considered in the compilation process. This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary

  5. Compiling ER Specifications into Declarative Programs

    CERN Document Server

    Braßel, Bernd; Muller, Marion

    2007-01-01

    This paper proposes an environment to support high-level database programming in a declarative programming language. In order to ensure safe database updates, all access and update operations related to the database are generated from high-level descriptions in the entity-relationship (ER) model. We propose a representation of ER diagrams in the declarative language Curry so that they can be constructed by various tools and then translated into this representation. Furthermore, we have implemented a compiler from this representation into a Curry program that provides access and update operations based on a high-level API for database programming.
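
    A stand-in sketch of deriving all access and update operations from a declarative entity description, written here in Python over SQLite rather than Curry; the Student entity and helper names are hypothetical and do not reproduce the API of the paper.

      import sqlite3

      def make_entity_api(conn, entity, attributes):
          """Generate insert/get operations for one entity of an ER specification."""
          columns = ", ".join(attributes)
          conn.execute(f"CREATE TABLE IF NOT EXISTS {entity} (id INTEGER PRIMARY KEY, {columns})")

          def insert(**values):
              names = ", ".join(values)
              marks = ", ".join("?" for _ in values)
              cur = conn.execute(f"INSERT INTO {entity} ({names}) VALUES ({marks})",
                                 tuple(values.values()))
              return cur.lastrowid

          def get(entity_id):
              cur = conn.execute(f"SELECT * FROM {entity} WHERE id = ?", (entity_id,))
              return cur.fetchone()

          return insert, get

      conn = sqlite3.connect(":memory:")
      new_student, get_student = make_entity_api(conn, "Student", ["name", "email"])
      sid = new_student(name="Ada", email="ada@example.org")
      print(get_student(sid))   # (1, 'Ada', 'ada@example.org')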

  6. Final report from VFL Technologies for the pilot-scale thermal treatment of Lower East Fork Poplar Creek floodplain soils. LEFPC appendices. Volume 5. Appendix V-D

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    This final report from VFL Technologies for the pilot-scale thermal treatment of lower East Fork Poplar Creek floodplain soils dated September 1994 contains LEFPC Appendices, Volume 5, Appendix V - D. This appendix includes the final verification run data package (PAH, TCLP herbicides, TCLP pesticides).

  7. Annual Proceedings of Selected Research and Development Papers Presented at the Annual Convention of the Association for Educational Communications and Technology (36th, Anaheim, California, 2013). Volume 1

    Science.gov (United States)

    Simonson, Michael, Ed.

    2013-01-01

    For the thirty-sixth year, the Research and Theory Division of the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. Papers published in this volume were presented at the annual AECT Convention in Anaheim, California. The Proceedings of AECT's Convention are published in two…

  8. Annual Proceedings of Selected Research and Development Papers Presented at the Annual Convention of the Association for Educational Communications and Technology (38th, Indianapolis, Indiana, 2015). Volume 1

    Science.gov (United States)

    Simonson, Michael, Ed.

    2015-01-01

    For the thirty-eighth time, the Research and Theory Division of the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. Papers published in this volume were presented at the annual AECT Convention in Indianapolis, Indiana. The Proceedings of AECT's Convention are published in two…

  9. Annual Proceedings of Selected Research and Development Papers Presented at the Annual Convention of the Association for Educational Communications and Technology (32nd, Louisville, KY, 2009). Volume 1

    Science.gov (United States)

    Simonson, Michael, Ed.

    2009-01-01

    For the thirty-second year, the Research and Theory Division of the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. Papers published in this volume were presented at the national AECT Convention in Louisville, Kentucky. The Proceedings of AECT's Convention are published in two…

  10. Annual Proceedings of Selected Research and Development Papers Presented at the Annual Convention of the Association for Educational Communications and Technology (37th, Jacksonville, Florida, 2014). Volume 1

    Science.gov (United States)

    Simonson, Michael, Ed.

    2014-01-01

    For the thirty-seventh year, the Research and Theory Division and the Division of Instructional Design of the Association for Educational Communications and Technology (AECT) sponsored the publication of these Proceedings. Papers published in this volume were presented at the annual AECT Convention in Jacksonville, Florida. This year's Proceedings…

  11. Annual Proceedings of Selected Research and Development Papers Presented at the Annual Convention of the Association for Educational Communications and Technology (36th, Anaheim, California, 2013). Volume 2

    Science.gov (United States)

    Simonson, Michael, Ed.

    2013-01-01

    For the thirty-sixth year, the Research and Theory Division of the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. Papers published in this volume were presented at the annual AECT Convention in Anaheim, California. The Proceedings of AECT's Convention are published in two…

  12. Annual Proceedings of Selected Research and Development Papers Presented at the Annual Convention of the Association for Educational Communications and Technology (33rd, Anaheim, California, 2010). Volume 1

    Science.gov (United States)

    Simonson, Michael, Ed.

    2010-01-01

    For the thirty-third year, the Research and Theory Division of the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. Papers published in this volume were presented at the national AECT Convention in Anaheim, California. The Proceedings of AECT's Convention are published in two…

  13. Annual Proceedings of Selected Research and Development Papers Presented at the Annual Convention of the Association for Educational Communications and Technology (35th, Louisville, Kentucky, 2012). Volume 2

    Science.gov (United States)

    Simonson, Michael, Ed.

    2012-01-01

    For the thirty-fifth year, the Research and Theory Division of the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. Papers published in this volume were presented at the national AECT Convention in Louisville, Kentucky. The Proceedings of AECT's Convention are published in two…

  14. Annual Proceedings of Selected Research and Development Papers Presented at the Annual Convention of the Association for Educational Communications and Technology (35th, Louisville, Kentucky, 2012). Volume 1

    Science.gov (United States)

    Simonson, Michael, Ed.

    2012-01-01

    For the thirty-fifth year, the Research and Theory Division of the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. Papers published in this volume were presented at the national AECT Convention in Louisville, Kentucky. The Proceedings of AECT's Convention are published in two…

  15. Annual Proceedings of Selected Research and Development Papers Presented at the National Convention of the Association for Educational Communications and Technology (28th, Orlando, Florida, 2005). Volume 1

    Science.gov (United States)

    Simonson, Michael, Ed.; Crawford, Margaret, Ed.

    2005-01-01

    For the twenty-eighth year, the Research and Theory Division of the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. Papers published in this volume were presented at the National AECT Convention in Orlando, Florida. The Proceedings of AECT's Convention are published in two…

  16. Annual Proceedings of Selected Research and Development Papers Presented at the National Convention of the Association for Educational Communications and Technology (28th, Orlando, Florida, 2005). Volume 2

    Science.gov (United States)

    Simonson, Michael, Ed.; Crawford, Margaret, Ed.

    2005-01-01

    For the twenty-eighth year, the Research and Theory Division of the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. Papers published in this volume were presented at the National AECT Convention in Orlando, Florida. The Proceedings of AECT's Convention are published in two…

  17. Technology of high-level nuclear waste disposal. Advances in the science and engineering of the management of high-level nuclear wastes. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Hofmann, P.L. (ed.)

    1982-01-01

    The twenty papers in this volume are divided into three parts: site exploration and characterization; repository development and design; and waste package development and design. These papers represent the status of technology that existed in 1981 and 1982. Individual papers were processed for inclusion in the Energy Data Base.

  18. Industrial Sector Technology Use Model (ISTUM): industrial energy use in the United States, 1974-2000. Volume 1. Primary model documentation. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Bohn, Roger E.; Herod, J. Steven; Andrews, Gwen L.; Budzik, Philip M.; Eissenstat, Richard S.; Grossmann, John R.; Reiner, Gary M.; Roschke, Thomas E.; Shulman, Michael J.; Toppen, Timothy R.; Veno, William R.; Violette, Daniel M.; Smolinski, Michael D.; Habel, Deborah; Cook, Alvin E.

    1979-10-01

    ISTUM is designed to predict the commercial market penetration of various energy technologies in the industrial sector out to the year 2000. It is a refinement and further development of the work of the Market Oriented Program Planning Study task force of 1977. ISTUM assesses the comparative economic competitiveness of each technology, competing over 100 energy technologies - conventional, fossil energy, conservation, cogeneration, solar, and geothermal. A broad overview of the model, the solution of the model, and an in-depth discussion of the strengths and limitations of the model are provided in Volume I. (MCW)

  19. Compiling a Corpus for Teaching Medical Translation

    Directory of Open Access Journals (Sweden)

    Elizabeth de la Teja Bosch

    2014-04-01

    Full Text Available Background: medical translation has countless documentary sources; the major difficulty lies in knowing how to assess them. The corpus is the ideal tool to perform this activity in a rapid and reliable way, and to define the learning objectives based on text typology and oriented towards professional practice. Objective: to compile an electronic corpus that meets the requirements of professional practice in specialized medical translation. Methods: a pedagogical study was conducted in the province of Cienfuegos. The units of analysis involved records from translators of the Provincial Medical Sciences Information Center and specialized translators in this field, who completed a questionnaire to accurately determine their information needs, conditioning the corpus design criteria. The analysis of a set of texts extracted from highly reputable sources led to the text selection and final compilation. Subsequently, the validation of the corpus as a documentary tool for teaching specialized medical translation was performed. Results: there was a concentration of translation assignments in the topics: malignant tumors, hypertension, heart disease, diabetes mellitus and pneumonias. The predominant text typologies were evaluative and dissemination of current research, with many original articles and reviews. The text corpus design criteria were: unannotated, documented, specialized, monitor and comparable. Conclusions: the corpus is a useful tool to show the lexical, terminological, semantic, discursive and contextual particularities of biomedical communication. It allows defining learning objectives and translation problems. Key words: teaching; translating; medicine
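
    As a toy illustration of how a small comparable corpus can surface the terminology of each text type, the sketch below counts term frequencies per topic; the two-topic mini-corpus is invented for the example and is not the corpus compiled in the study.

      import re
      from collections import Counter

      corpus = {
          "hypertension": ["Blood pressure was measured after five minutes of rest.",
                           "Hypertension increases the risk of stroke and heart disease."],
          "diabetes":     ["Fasting glucose and HbA1c were monitored for twelve weeks.",
                           "Insulin therapy was adjusted according to glucose levels."],
      }

      def term_frequencies(texts):
          tokens = re.findall(r"[a-z0-9]+", " ".join(texts).lower())
          return Counter(tokens)

      for topic, texts in corpus.items():
          print(topic, term_frequencies(texts).most_common(5))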

  20. Compilation of functional languages using flow graph analysis

    NARCIS (Netherlands)

    Hartel, Pieter H.; Glaser, Hugh; Wild, John M.

    1994-01-01

    A system based on the notion of a flow graph is used to specify formally and to implement a compiler for a lazy functional language. The compiler takes a simple functional language as input and generates C. The generated C program can then be compiled, and loaded with an extensive run-time system to

  2. Proceedings of waste stream minimization and utilization innovative concepts: An experimental technology exchange. Volume 1, Industrial solid waste processing municipal waste reduction/recycling

    Energy Technology Data Exchange (ETDEWEB)

    Lee, V.E. [ed.; Watts, R.L.

    1993-04-01

    This two-volume proceedings summarizes the results of fifteen innovations that were funded through the US Department of Energy's Innovative Concept Program. The fifteen innovations were presented at the sixth Innovative Concepts Fair, held in Austin, Texas, on April 22-23, 1993. The concepts in this year's fair address innovations that can substantially reduce or use waste streams. Each paper describes the need for the proposed concept, the concept being proposed, and the concept's economics and market potential, key experimental results, and future development needs. The papers are divided into two volumes: Volume 1 addresses innovations for industrial solid waste processing and municipal waste reduction/recycling, and Volume 2 addresses industrial liquid waste processing and industrial gaseous waste processing. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  3. Compilation of Sandia coal char combustion data and kinetic analyses

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, R.E.; Hurt, R.H.; Baxter, L.L.; Hardesty, D.R.

    1992-06-01

    An experimental project was undertaken to characterize the physical and chemical processes that govern the combustion of pulverized coal chars. The experimental endeavor establishes a database on the reactivities of coal chars as a function of coal type, particle size, particle temperature, gas temperature, and gas composition. The project also provides a better understanding of the mechanism of char oxidation, and yields quantitative information on the release rates of nitrogen- and sulfur-containing species during char combustion. An accurate predictive engineering model of the overall char combustion process under technologically relevant conditions is a primary product of this experimental effort. This document summarizes the experimental effort, the approach used to analyze the data, and individual compilations of data and kinetic analyses for each of the parent coals investigated.

  4. 1988 DOE model conference proceedings: Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    1988-01-01

    These Proceedings of the October 3-7, 1988, DOE Model Conference are a compilation of the papers that were presented in the technical or poster sessions at the conference. Papers and posters not submitted for publication are not included in the Proceedings. The Table of Contents lists the titles of papers as well as the names of the presenters. These individuals are not, in all cases, the primary authors of the papers published. The actual title pages, appearing later with the papers, show the primary author(s) and all co-authors. The papers in all three volumes of the Proceedings appear as they were originally submitted for publication and have not been edited or changed in any way. Topics discussed in Volume 4 include site characterization and remediation projects; environmental monitoring and modeling; disposal site selection and facility design; risk assessment; safety and health issues; and site remediation technology.

  5. 1988 DOE model conference proceedings: Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    1988-01-01

    These Proceedings of the October 3-7, 1988, DOE Model Conference are a compilation of the papers that were presented in the technical or poster sessions at the conference. Papers and posters not submitted for publication are not included in the Proceedings. The Table of Contents lists the titles of papers as well as the names of the presenters. These individuals are not, in all cases, the primary authors of the papers published. The actual title pages, appearing later with the papers, show the primary author(s) and all co-authors. The papers in all three volumes of the Proceedings appear as they were originally submitted for publication and have not been edited or changed in any way. Topics included in Volume 1 are environmental data management, site characterization technology, wastewater treatment, waste management in foreign countries, transuranic waste management, and groundwater characterization and treatment.

  6. 1988 DOE model conference proceedings: Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    1988-01-01

    These Proceedings of the October 3-7, 1988, DOE Model Conference are a compilation of the papers that were presented in the technical or poster sessions at the conference. Papers and posters not submitted for publication are not included in the Proceedings. The Table of Contents lists the titles of papers as well as the names of the presenters. These individuals are not, in all cases, the primary authors of the papers published. The actual title pages, appearing later with the papers, show the primary author(s) and all co-authors. The papers in all three volumes of the Proceedings appear as they were originally submitted for publication and have not been edited or changed in any way. Topics discussed in Volume 5 include environmental assessments and program strategies, waste treatment technologies, and regulations and compliance studies.

  7. Low-level radioactive waste from commercial nuclear reactors. Volume 1. Recommendations for technology developments with potential to significantly improve low-level radioactive waste management

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, B.R.; Jolley, R.L.

    1986-02-01

    The overall task of this program was to provide an assessment of currently available technology for treating commercial low-level radioactive waste (LLRW), to initiate development of a methodology for choosing one technology for a given application, and to identify research needed to improve current treatment techniques and decision methodology. The resulting report is issued in four volumes. Volume 1 provides an executive summary and a general introduction to the four-volume set, in addition to recommendations for research and development (R and D) for LLRW treatment. Generic, long-range, and/or high-risk programs identified and prioritized as needed R and D in the LLRW field include: (1) systems analysis to develop decision methodology; (2) alternative processes for dismantling, decontaminating, and decommissioning; (3) ion exchange; (4) incinerator technology; (5) disposal technology; (6) demonstration of advanced technologies; (7) technical assistance; (8) below-regulatory-concern materials; (9) mechanical treatment techniques; (10) monitoring and analysis procedures; (11) radical process improvements; (12) physical, chemical, thermal, and biological processes; (13) fundamental chemistry; (14) interim storage; (15) modeling; and (16) information transfer. These areas are discussed in detail.

  8. Performance Evaluation of Advanced Retrofit Roof Technologies Using Field-Test Data Phase Three Final Report, Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Biswas, Kaushik [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Childs, Phillip W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Atchley, Jerald Allen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    This article presents some miscellaneous data from two low-slope and two steep-slope experimental roofs. The low-slope roofs were designed to compare the performance of various roof coatings exposed to natural weatherization. The steep-slope roofs contained different combinations of phase change material, rigid insulation, low emittance surface and above-sheathing ventilation, with standing-seam metal panels on top. The steep-slope roofs were constructed on a series of adjacent attics separated at the gables using thick foam insulation. This article describes phase three (3) of a study that began in 2009 to evaluate the energy benefits of a sustainable re-roofing technology utilizing standing-seam metal roofing panels combined with energy efficient features like above-sheathing-ventilation (ASV), phase change material (PCM) and rigid insulation board. The data from phases 1 and 2 have been previously published and reported [Kosny et al., 2011; Biswas et al., 2011; Biswas and Childs, 2012; Kosny et al., 2012]. Based on previous data analyses and discussions within the research group, additional test roofs were installed in May 2012, to test new configurations and further investigate different components of the dynamic insulation systems. Some experimental data from phase 3 testing from May 2012 to December 2013 and some EnergyPlus modeling results have been reported in volumes 1 and 3, respectively, of the final report [Biswas et al., 2014; Biswas and Bhandari, 2014].

  9. Construction experiences from underground works at Oskarshamn. Compilation report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders (Vattenfall Power Consultant AB, Stockholm (SE)); Christiansson, Rolf (Swedish Nuclear Fuel and Waste Management Co., Stockholm (SE))

    2007-12-15

    The main objective of this report is to compile experiences from the underground works carried out at Oskarshamn, primarily construction experiences from the tunnelling of the cooling water tunnels of the Oskarshamn nuclear power units 1, 2 and 3, from the underground excavations of Clab 1 and 2 (Central Interim Storage Facility for Spent Nuclear Fuel), and from the Aespoe Hard Rock Laboratory. In addition, an account is given of the operational experience of Clab 1 and 2 and of the Aespoe HRL, primarily regarding scaling and rock support solutions. This report, being a compilation report, is in substance based on earlier published material as presented in the list of references. Approximately 8,000 m of tunnels, including three major rock caverns with a total volume of about 550,000 m3, have been excavated. The excavation works of the various tunnels and rock caverns were carried out during the period 1966-2000. In addition, minor excavation works were carried out at the Aespoe HRL in 2003. The depth location of the underground structures varies from near surface down to 450 m. As an overall conclusion it may be said that the rock mass conditions in the area are well suited for underground construction. This conclusion is supported by the experiences from the rock excavation works in the Simpevarp and Aespoe area. These works have shown that no major problems occurred during the excavation works; nor have any stability or other rock engineering problems of significance been identified after the commissioning of the Oskarshamn nuclear power units O1, O2 and O3, BFA, Clab 1 and 2, and the Aespoe Hard Rock Laboratory. The underground structures of these facilities were built according to plan and have since been operated as planned. Thus, the quality of the rock mass within the construction area is such that it lends itself to excavation of large rock caverns with a minimum of rock support.

  10. Advanced compilation techniques in the PARADIGM compiler for distributed-memory multicomputers

    Science.gov (United States)

    Su, Ernesto; Lain, Antonio; Ramaswamy, Shankar; Palermo, Daniel J.; Hodges, Eugene W., IV; Banerjee, Prithviraj

    1995-01-01

    The PARADIGM compiler project provides an automated means to parallelize programs, written in a serial programming model, for efficient execution on distributed-memory multicomputers. A previous implementation of the compiler based on the PTD representation allowed symbolic array sizes, affine loop bounds and array subscripts, and a variable number of processors, provided that arrays were single- or multi-dimensionally block distributed. The techniques presented here extend the compiler to also accept multidimensional cyclic and block-cyclic distributions within a uniform symbolic framework. These extensions demand more sophisticated symbolic manipulation capabilities. A novel aspect of our approach is to meet this demand by interfacing PARADIGM with a powerful off-the-shelf symbolic package, Mathematica. This paper describes some of the Mathematica routines that perform various transformations, shows how they are invoked and used by the compiler to overcome the new challenges, and presents experimental results for code involving cyclic and block-cyclic arrays as evidence of the feasibility of the approach.
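
    As an illustration of the distributions discussed above (a generic sketch, not code from PARADIGM itself; the function names and parameters are assumptions for the example), the following computes the owning processor and local index of a global array element under a block-cyclic distribution.

        def block_cyclic_owner(i, block, nprocs):
            # Global index i is grouped into blocks of size `block`;
            # blocks are dealt to processors 0..nprocs-1 in round-robin order.
            return (i // block) % nprocs

        def block_cyclic_local_index(i, block, nprocs):
            # Position of element i in the owning processor's local storage:
            # which local block it falls in, plus the offset inside that block.
            local_block = i // (block * nprocs)
            return local_block * block + (i % block)

        if __name__ == "__main__":
            # Distribute a 16-element array over 3 processors with block size 2.
            for i in range(16):
                print(i, block_cyclic_owner(i, 2, 3), block_cyclic_local_index(i, 2, 3))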

  11. Annual Proceedings of Selected Papers on the Practice of Education Communications and Technology Presented at the Annual Convention of the Association for Educational Communications and Technology (38th, Indianapolis, Indiana, 2015). Volume 2

    Science.gov (United States)

    Simonson, Michael, Ed.

    2015-01-01

    For the thirty-eighth time, the Research and Theory Division of the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. Papers published in this volume were presented at the annual AECT Convention in Indianapolis, Indiana. The Proceedings of AECT's Convention are published in two…

  12. Tribological Technology. Volume II.

    Science.gov (United States)

    1982-09-01

    [The scanned abstract for this volume is largely illegible OCR output. Recoverable fragments include a reference to Table 7, pressure-viscosity coefficients (Ref. 11), and a remark that films forming on the surfaces were used by March and Rabinowicz (1976) for incipient fatigue investigations with a rolling four-ball test.]

  13. Parallelizing More Loops with Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    2012-01-01

    The performance of many parallel applications relies not on instruction-level parallelism but on loop-level parallelism. Unfortunately, automatic parallelization of loops is a fragile process; many different obstacles affect or prevent it in practice. To address this predicament we developed...... an interactive compilation feedback system that guides programmers in iteratively modifying their application source code. This helps leverage the compiler’s ability to generate loop-parallel code. We employ our system to modify two sequential benchmarks dealing with image processing and edge detection......, resulting in scalable parallelized code that runs up to 8.3 times faster on an eight-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should be combined

  14. A survey of compiler optimization techniques

    Science.gov (United States)

    Schneck, P. B.

    1972-01-01

    Major optimization techniques of compilers are described and grouped into three categories: machine dependent, architecture dependent, and architecture independent. Machine-dependent optimizations tend to be local and are performed upon short spans of generated code by using particular properties of an instruction set to reduce the time or space required by a program. Architecture-dependent optimizations are global and are performed while generating code. These optimizations consider the structure of a computer, but not its detailed instruction set. Architecture-independent optimizations are also global but are based on analysis of the program flow graph and the dependencies among statements of the source program. A conceptual review of a universal optimizer that performs architecture-independent optimizations at source-code level is also presented.
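
    As a toy example of the source-level, architecture-independent optimizations surveyed above (not code from the paper; the tuple-based AST representation is an assumption made for the illustration), the sketch below constant-folds and then eliminates a common subexpression.

        # Expressions are tuples: ('const', n), ('var', name), or ('add'/'mul', lhs, rhs).
        def fold(e):
            # Constant folding: evaluate operators whose operands are both constants.
            if e[0] in ('add', 'mul'):
                l, r = fold(e[1]), fold(e[2])
                if l[0] == 'const' and r[0] == 'const':
                    return ('const', l[1] + r[1] if e[0] == 'add' else l[1] * r[1])
                return (e[0], l, r)
            return e

        def cse(stmts):
            # Common-subexpression elimination: reuse a temporary for repeated expressions.
            seen, out = {}, []
            for name, expr in stmts:
                expr = fold(expr)
                key = repr(expr)
                if key in seen:
                    out.append((name, ('var', seen[key])))
                else:
                    seen[key] = name
                    out.append((name, expr))
            return out

        if __name__ == "__main__":
            prog = [('t1', ('mul', ('var', 'a'), ('add', ('const', 2), ('const', 3)))),
                    ('t2', ('mul', ('var', 'a'), ('const', 5)))]
            print(cse(prog))   # t2 becomes a reuse of t1 after folding 2+3 to 5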

  15. Programming cells: towards an automated 'Genetic Compiler'.

    Science.gov (United States)

    Clancy, Kevin; Voigt, Christopher A

    2010-08-01

    One of the visions of synthetic biology is to be able to program cells using a language that is similar to that used to program computers or robotics. For large genetic programs, keeping track of the DNA on the level of nucleotides becomes tedious and error prone, requiring a new generation of computer-aided design (CAD) software. To push the size of projects, it is important to abstract the designer from the process of part selection and optimization. The vision is to specify genetic programs in a higher-level language, which a genetic compiler could automatically convert into a DNA sequence. Steps towards this goal include: defining the semantics of the higher-level language, algorithms to select and assemble parts, and biophysical methods to link DNA sequence to function. These will be coupled to graphic design interfaces and simulation packages to aid in the prediction of program dynamics, optimize genes, and scan projects for errors.
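
    To make the compilation idea above concrete (a toy illustration only, not an actual genetic-design CAD tool; the part names and sequences are invented placeholders), the sketch below "compiles" a high-level list of part roles into a DNA string by looking sequences up in a hypothetical part library.

        # Hypothetical part library: role/name -> DNA sequence (placeholder sequences).
        PART_LIBRARY = {
            "promoter:pA":   "TTGACA" + "N" * 10 + "TATAAT",
            "rbs:rbs1":      "AGGAGG",
            "cds:reporter":  "ATG" + "GCT" * 5 + "TAA",
            "terminator:t1": "GCGGCCGC",
        }

        def compile_program(program):
            # 'Compile' an ordered list of part identifiers into one DNA sequence.
            missing = [p for p in program if p not in PART_LIBRARY]
            if missing:
                raise ValueError(f"unknown parts: {missing}")
            return "".join(PART_LIBRARY[p] for p in program)

        if __name__ == "__main__":
            # A one-gene 'program': promoter, ribosome binding site, coding sequence, terminator.
            print(compile_program(["promoter:pA", "rbs:rbs1", "cds:reporter", "terminator:t1"]))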

  16. abc: An extensible AspectJ compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    Research in the design of aspect-oriented programming languages requires a workbench that facilitates easy experimentation with new language features and implementation techniques. In particular, new features for AspectJ have been proposed that require extensions in many dimensions: syntax, type...... checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its frontend is built, using the Polyglot framework, as a modular extension of the Java...... language. The use of Polyglot gives flexibility of syntax and type checking. The backend is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general overview...

  17. Industrial Sector Technology Use Model (ISTUM): industrial energy use in the United States, 1974-2000. Volume 3. Appendix on service and fuel demands. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1979-10-01

    This book is the third volume of the ISTUM report. The first volume of the report describes the primary model logic and the model's data inputs. The second volume lists and evaluates the results of one model run. This and the fourth volume give supplementary information in two sets of model data - the energy consumption base and technology descriptions. Chapter III of Vol. I, Book 1 describes the ISTUM demand base and explains how that demand base was developed. This volume serves as a set of appendices to that chapter. The chapter on demands in Vol. I describes the assumptions and methodology used in constructing the ISTUM demand base; this volume simply lists tables of data from that demand base. This book divides the demand tables into two appendices. Appendix III-1 contains detailed tables on ISTUM fuel-consumption estimates, service-demand forecasts, and size and load-factor distributions. Appendix III-2 contains tables detailing ISTUM allocations of each industry's fuel consumption to service sectors. The tables show how the ECDB was used to develop the ISTUM demand base.

  18. Environmental Restoration/Waste Management - applied technology. Semiannual report, July 1992--June 1993, Volume 1, Number 2, and Volume 2, Number 1

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, P.W.; Bruner, J.M.; Price, M.E.; Talaber, C.J. [eds.

    1993-12-31

    The Environmental Restoration/Waste Management-Applied Technology (ER/WM-AT) Program is developing restoration and waste treatment technologies needed for the ongoing environmental cleanup of the Department of Energy (DOE) complex and treatment technologies for wastes generated in the nuclear weapons production complex. These technologies can find application to similar problems nationally and even worldwide. They can be demonstrated at the Livermore site, which mirrors (on a small scale) many of the environmental and waste management problems of the rest of the DOE complex. Their commercialization should speed cleanup, and the scope of the task should make it attractive to US industry. The articles in this semi-annual report cover the following areas: ceramic final forms for residues of mixed waste treatment; treatment of wastes containing sodium nitrate; actinide volatility in thermal oxidation processes; in situ microbial filters for remediating contaminated soils; collaboration with scientists in the former Soviet Union on new ER/WM technologies; and fiber-optic sensors for chlorinated organic solvents.

  19. Beyond Volume: Hospital-Based Healthcare Technology for Better Outcomes in Cerebrovascular Surgical Patients Diagnosed With Ischemic Stroke: A Population-Based Nationwide Cohort Study From 2002 to 2013.

    Science.gov (United States)

    Kim, Jae-Hyun; Park, Eun-Cheol; Lee, Sang Gyu; Lee, Tae-Hyun; Jang, Sung-In

    2016-03-01

    We examined whether the level of hospital-based healthcare technology was related to the 30-day postoperative mortality rates, after adjusting for hospital volume, of ischemic stroke patients who underwent a cerebrovascular surgical procedure. Using the National Health Insurance Service-Cohort Sample Database, we reviewed records from 2002 to 2013 for data on patients with ischemic stroke who underwent cerebrovascular surgical procedures. Statistical analysis was performed using Cox proportional hazard models to test our hypothesis. A total of 798 subjects were included in our study. After adjusting for hospital volume of cerebrovascular surgical procedures as well as all for other potential confounders, the hazard ratio (HR) of 30-day mortality in low healthcare technology hospitals as compared to high healthcare technology hospitals was 2.583 (P technology hospitals with high volume as compared to high healthcare technology hospitals with high volume was the highest (10.014, P technology hospitals had the highest 30-day mortality rate, irrespective of hospital volume. Although results of our study provide scientific evidence for a hospital volume/30-day mortality rate relationship in ischemic stroke patients who underwent cerebrovascular surgical procedures, our results also suggest that the level of hospital-based healthcare technology is associated with mortality rates independent of hospital volume. Given these results, further research into what components of hospital-based healthcare technology significantly impact mortality is warranted.
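
    For readers unfamiliar with the statistical method named above, the following sketch shows how a Cox proportional hazards model of 30-day mortality might be fitted with the Python lifelines package; the data frame and column names are invented for the illustration and do not reproduce the study's analysis.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical cohort: follow-up time (days), death indicator, and covariates
        # loosely mirroring the study design (hospital technology level, hospital volume).
        df = pd.DataFrame({
            "days_to_event":   [30, 12, 30, 5, 30, 22, 30, 9],
            "died_within_30d": [0, 1, 0, 1, 0, 1, 0, 1],
            "low_technology":  [0, 1, 0, 1, 1, 1, 0, 0],
            "high_volume":     [1, 0, 1, 0, 1, 0, 0, 1],
            "age":             [62, 75, 58, 80, 66, 71, 69, 77],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="days_to_event", event_col="died_within_30d")
        cph.print_summary()  # hazard ratios = exp(coef) for each covariate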

  20. Continuation-Passing C, compiling threads to events through continuations

    CERN Document Server

    Kerneis, Gabriel

    2010-01-01

    In this paper, we introduce Continuation Passing C (CPC), a programming language for concurrent systems in which native and cooperative threads are unified and presented to the programmer as a single abstraction. The CPC compiler uses a compilation technique, based on the CPS transform, that yields efficient code and an extremely lightweight representation for contexts. We provide a complete proof of the correctness of our compilation scheme. We show in particular that lambda-lifting, a common compilation technique for functional languages, is also correct in an imperative language like C, under some conditions enforced by the CPC compiler. The current CPC compiler is mature enough to write substantial programs such as Hekate, a highly concurrent BitTorrent seeder. Our benchmark results show that CPC is as efficient, while significantly cheaper, as the most efficient thread libraries available.
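
    As a language-neutral illustration of the CPS transform mentioned above (written in Python rather than C, and not CPC's actual output), the same computation is shown in direct style and in continuation-passing style, where control is made explicit by passing the "rest of the computation" as an argument.

        # Direct style: control returns implicitly to the caller.
        def add_then_double(x, y):
            s = x + y
            return 2 * s

        # Continuation-passing style: every step receives an explicit continuation `k`
        # describing what to do with its result; nothing ever returns past a call.
        def add_cps(x, y, k):
            return k(x + y)

        def double_cps(s, k):
            return k(2 * s)

        def add_then_double_cps(x, y, k):
            return add_cps(x, y, lambda s: double_cps(s, k))

        if __name__ == "__main__":
            print(add_then_double(3, 4))                   # 14
            print(add_then_double_cps(3, 4, lambda r: r))  # 14, via explicit continuations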

  1. Design and Implementation of Java Just-in-Time Compiler

    Institute of Scientific and Technical Information of China (English)

    丁宇新; 梅嘉; 程虎

    2000-01-01

    Early Java implementations relied on interpretation, leading to poor performance compared to compiled programs. Java just-in-time (JIT) compiler can compile Java programs at runtime, so it not only improves Java's performance prominently, but also preserves Java's portability. In this paper the design and implementing techniques of Java JIT compiler based on Chinese open system are discussed in detail. To enhance the portability, a translating method which combines the static simulating method and macro expansion method is adopted. The optimization technique for JIT compiler is also discussed and a way to evaluate the hotspots in Java programs is presented. Experiments have been conducted to verify JIT compilation technique as an efficient way to accelerate Java.
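
    A minimal, language-agnostic sketch of the hotspot idea described above (not the authors' JIT): a call counter per function triggers a one-time "compilation" step, simulated here by caching a fast path once the threshold is reached.

        import functools

        HOT_THRESHOLD = 3          # invocations before a function is treated as a hotspot
        _counters, _compiled = {}, {}

        def jit(func):
            # Count calls; after HOT_THRESHOLD calls, switch to a cached 'compiled' version.
            @functools.wraps(func)
            def wrapper(*args):
                name = func.__name__
                if name in _compiled:
                    return _compiled[name](*args)
                _counters[name] = _counters.get(name, 0) + 1
                if _counters[name] >= HOT_THRESHOLD:
                    # Stand-in for real code generation: here we simply cache the original.
                    _compiled[name] = func
                    print(f"[jit] {name} became hot, 'compiled'")
                return func(*args)
            return wrapper

        @jit
        def fib(n):
            return n if n < 2 else fib(n - 1) + fib(n - 2)

        if __name__ == "__main__":
            print(fib(10))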

  2. Internal combustion engines for alcohol motor fuels: a compilation of background technical information

    Energy Technology Data Exchange (ETDEWEB)

    Blaser, Richard

    1980-11-01

    This compilation, a draft training manual containing technical background information on internal combustion engines and alcohol motor fuel technologies, is presented in 3 parts. The first is a compilation of facts from the state of the art on internal combustion engine fuels and their characteristics and requisites and provides an overview of fuel sources, fuels technology and future projections for availability and alternatives. Part two compiles facts about alcohol chemistry, alcohol identification, production, and use, examines ethanol as spirit and as fuel, and provides an overview of modern evaluation of alcohols as motor fuels and of the characteristics of alcohol fuels. The final section compiles cross references on the handling and combustion of fuels for I.C. engines, presents basic evaluations of events leading to the use of alcohols as motor fuels, reviews current applications of alcohols as motor fuels, describes the formulation of alcohol fuels for engines and engine and fuel handling hardware modifications for using alcohol fuels, and introduces the multifuel engines concept. (LCL)

  3. HOPE: Just-in-time Python compiler for astrophysical computations

    Science.gov (United States)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of compiled implementation.
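
    The decorator-based workflow described above can be sketched generically as follows (an illustration of the usage pattern, not HOPE's implementation; the decorator name and caching scheme are assumptions): a wrapper specializes and caches a version of the function per argument-type signature, which is where a real JIT would generate and load compiled C++ code.

        import functools

        def jit(func):
            # Toy stand-in for a just-in-time compiling decorator.
            cache = {}
            @functools.wraps(func)
            def wrapper(*args):
                sig = tuple(type(a) for a in args)
                if sig not in cache:
                    # A real JIT would translate `func` to C++ here, compile it
                    # for this type signature, and load the resulting extension.
                    cache[sig] = func
                return cache[sig](*args)
            return wrapper

        @jit
        def polynomial(x, a, b, c):
            # Ordinary numerical Python; the decorator is the only change the user makes.
            return a * x * x + b * x + c

        if __name__ == "__main__":
            print(polynomial(2.0, 1.0, -3.0, 0.5))   # 4 - 6 + 0.5 = -1.5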

  4. HAL/S-360 compiler test activity report

    Science.gov (United States)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  5. Performance of Compiler-Assisted Memory Safety Checking

    Science.gov (United States)

    2014-08-01

    Performance of Compiler-Assisted Memory Safety Checking. David Keaton, Robert C. Seacord. August 2014. Technical Note CMU/SEI-2014-TN-014. Abstract: Buffer overflows affect a large installed base of C code. This technical note describes the criteria for deploying a compiler ... describes a modification to the LLVM compiler to enable hoisting bounds checks from loops and functions. This proof-of-concept prototype has been used
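
    The hoisting optimization mentioned above can be illustrated in a language-neutral way (a sketch written for this compilation, not taken from the technical note): a per-iteration bounds check inside a loop is replaced by a single check of the largest index before the loop, which is what lets the checked loop run at nearly unchecked speed.

        def sum_checked(data, n):
            # Naive instrumentation: one bounds check per iteration.
            total = 0
            for i in range(n):
                if i < 0 or i >= len(data):
                    raise IndexError(f"index {i} out of bounds")
                total += data[i]
            return total

        def sum_hoisted(data, n):
            # Hoisted check: indices 0..n-1 are monotone, so checking the largest
            # index once before the loop is equivalent to checking every iteration.
            if n > len(data):
                raise IndexError(f"index {n - 1} out of bounds")
            total = 0
            for i in range(n):
                total += data[i]
            return total

        if __name__ == "__main__":
            xs = list(range(100))
            assert sum_checked(xs, 100) == sum_hoisted(xs, 100) == 4950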

  6. Automatic Parallelization An Overview of Fundamental Compiler Techniques

    CERN Document Server

    Midkiff, Samuel P

    2012-01-01

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and

  7. DisBlue+: A distributed annotation-based C# compiler

    Directory of Open Access Journals (Sweden)

    Samir E. AbdelRahman

    2010-06-01

    Many programming languages utilize annotations to add useful information to the program, but they still result in more tokens to be compiled and hence slower compilation time. Any current distributed compiler breaks the program into scattered disjoint pieces to speed up the compilation. However, these pieces cooperate synchronously and depend highly on each other. This causes massive overhead since messages, symbols, or codes must be roamed throughout the network. This paper presents two promising compilers named annotation-based C# (Blue+) and distributed annotation-based C# (DisBlue+). The proposed Blue+ annotation is based on axiomatic semantics to replace the if/loop constructs. As developers tend to use many (complex) conditions and repeat them in the program, such annotations reduce the compilation scanning time and increase overall code readability. Built on top of Blue+, DisBlue+ presents its proposed distributed concept, which is to divide each program class into its prototype and definition, as disjoint distributed pieces, such that each class definition is compiled with only its related compiled prototypes (interfaces). Such a concept reduces the amount of code transferred over the network, minimizes the dependencies among the disjoint pieces, and removes any possible synchronization between them. To test their efficiencies, Blue+ and DisBlue+ were verified with large-size codes against some existing compilers, namely Javac, DJavac, and CDjava.

  8. Properties of lanthanum hexaboride a compilation

    CERN Document Server

    Fisher, D J

    2013-01-01

    Lanthanum hexaboride is useful because it possesses a high melting point (2210 °C), a low work function, one of the highest known electron emissivities, and is stable in vacuum. This volume summarises the extant data on the properties of this material, including the: bulk modulus, conductivity, crystal structure, Debye temperature, defect structure, elastic constants, electronic structure, emissivity, Fermi surface, hardness, heat capacity, magnetoresistance, reflectivity, resistivity, specific heat, surface structure, thermal conductivity, thermoelectric power, toughness and work function. The

  9. Annual Proceedings of Selected Research and Development Papers Presented at the Annual Convention of the Association for Educational Communications and Technology - Volume 1 and Selected Papers on the Practice of Educational Communications and Technology - Volume 2 (34th, Jacksonville, Florida, 2011)

    Science.gov (United States)

    Simonson, Michael, Ed.

    2011-01-01

    For the thirty-fourth year, the Association for Educational Communications and Technology (AECT) is sponsoring the publication of these Proceedings. Papers published in this volume were presented at the annual AECT Convention in Jacksonville, FL. A limited quantity of these Proceedings were printed and sold in both hardcopy and electronic…

  10. Compilation of data for radionuclide transport analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-11-01

    This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other scenarios selected. The data required by the selected near field, geosphere and biosphere models are given and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed and the uncertainty is qualitatively addressed for data input to the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic in order not to underestimate the radionuclide release rates. It is judged that this approach, combined with the selected calculation cases, will illustrate the effects of uncertainties in the processes and events that affect the evolution of the system as well as in the quantitative data that describe it. The biosphere model allows for probabilistic calculations, and the uncertainty in input data is quantified by giving minimum, maximum and mean values as well as the type of probability distribution function.

  11. Discretized Volumes in Numerical Methods

    CERN Document Server

    Antal, Miklós

    2007-01-01

    We present two novel techniques in numerical methods. The first technique compiles the domain of the numerical method as a discretized volume. Congruent elements are glued together to compile the domain over which the solution of a boundary value problem is sought. We associate a group and a graph to that volume. When the group is a symmetry of the boundary value problem under investigation, one can specify the structure of the solution and find out whether there are equispectral volumes of a given type. The second technique uses a complex mapping to transplant the solution from volume to volume, together with a correction function. An equation for the correction function is given. A simple example demonstrates the feasibility of the suggested method.

  12. Technology, safety and costs of decommissioning a reference small mixed oxide fuel fabrication plant. Volume 2. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins, C. E.; Murphy, E. S.; Schneider, K. J.

    1979-01-01

    Volume 2 contains appendixes on small MOX fuel fabrication facility description, site description, residual radionuclide inventory estimates, decommissioning, financing, radiation dose methodology, general considerations, packaging and shipping of radioactive materials, cost assessment, and safety (JRD)

  13. Office of Technology Development's Research, Development, Demonstration, Testing and Evaluation Mid-Year Program Review. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    This document, Volume 2, presents brief summaries of programs being investigated at USDOE sites for waste processing, remedial action, underground storage tank remediation, and robotic applications in waste management.

  14. Y-12 Plant decontamination and decommissioning technology logic diagram for Building 9201-4. Volume 3: Technology evaluation data sheets; Part A: Characterization, dismantlement

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    The Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4 (TLD) was developed to provide a decision-support tool that relates decontamination and decommissioning (D and D) problems at Bldg. 9201-4 to potential technologies that can remediate these problems. The TLD uses information from the Strategic Roadmap for the Oak Ridge Reservation, the Oak Ridge K-25 Site Technology Logic Diagram, the Oak Ridge National Laboratory Technology Logic Diagram, and a previous Hanford logic diagram. This TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to D and D and waste management (WM) activities. It is essential that follow-on engineering studies be conducted to build on the output of this project. These studies will begin by selecting the most promising technologies identified in the TLD and by finding an optimum mix of technologies that will provide a socially acceptable balance between cost and risk. This report consists of the characterization and dismantlement data sheets.

  15. Pick'n'Fix: Capturing Control Flow in Modular Compilers

    DEFF Research Database (Denmark)

    Day, Laurence E.; Bahr, Patrick

    2014-01-01

    We present a modular framework for implementing languages with effects and control structures such as loops and conditionals. This framework enables modular definitions of both syntax and semantics as well as modular implementations of compilers and virtual machines. In order to compile control s...

  16. abc: The AspectBench Compiler for AspectJ

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon;

    2005-01-01

    abc is an extensible, optimising compiler for AspectJ. It has been designed as a workbench for experimental research in aspect-oriented programming languages and compilers. We outline a programme of research in these areas, and we review how abc can help in achieving those research goals...

  17. NUAPC:A Parallelizing Compiler for C++

    Institute of Scientific and Technical Information of China (English)

    朱根江; 谢立; et al.

    1997-01-01

    This paper presents a model for an automatically parallelizing compiler based on C++ which consists of compile-time and run-time parallelizing facilities. The paper also describes a method for finding both intra-object and inter-object parallelism. The parallelism detection is completely transparent to users.

  18. 38 CFR 45.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-07-01

    Title 38 (Pensions, Bonuses, and Veterans' Relief), Department of Veterans Affairs (Continued), New Restrictions on Lobbying, Agency Reports - § 45.600 Semi-annual compilation. (a) The head...

  19. Production compilation : A simple mechanism to model complex skill acquisition

    NARCIS (Netherlands)

    Taatgen, N.A.; Lee, F.J.

    2003-01-01

    In this article we describe production compilation, a mechanism for modeling skill acquisition. Production compilation has been developed within the ACT-Rational (ACT-R; J. R. Anderson, D. Bothell, M. D. Byrne, & C. Lebiere, 2002) cognitive architecture and consists of combining and specializing tas
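
    A highly simplified sketch of the mechanism named above (an illustration inspired by the general idea, not the ACT-R implementation; the rule and memory representations are assumptions): two production rules that fire in sequence, with a memory retrieval in between, are collapsed into one specialized rule in which the retrieved fact has been baked in.

        # Declarative memory: simple addition facts.
        MEMORY = {("3", "4"): "7"}

        def rule_request_fact(goal):
            # Production 1: ask memory for the sum of the two addends.
            return {"retrieved": MEMORY[(goal["a"], goal["b"])]}

        def rule_use_fact(goal, retrieval):
            # Production 2: place the retrieved answer in the goal.
            goal = dict(goal)
            goal["answer"] = retrieval["retrieved"]
            return goal

        def compile_productions(a, b):
            # Production compilation: specialize the pair of rules to the specific
            # retrieval they performed, yielding one rule with no memory access.
            answer = MEMORY[(a, b)]
            def compiled(goal):
                if goal["a"] == a and goal["b"] == b:
                    goal = dict(goal)
                    goal["answer"] = answer
                    return goal
                return None          # compiled rule only matches the specialized case
            return compiled

        if __name__ == "__main__":
            goal = {"a": "3", "b": "4"}
            slow = rule_use_fact(goal, rule_request_fact(goal))
            fast = compile_productions("3", "4")(goal)
            print(slow, fast)        # both produce answer '7'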

  20. Risk-based systems analysis of emerging high-level waste tank remediation technologies. Volume 2: Final report

    Energy Technology Data Exchange (ETDEWEB)

    Peters, B.B.; Cameron, R.J.; McCormack, W.D. [Enserch Environmental Corp., Richland, WA (United States)

    1994-08-01

    The objective of DOE's Radioactive Waste Tank Remediation Technology Focus Area is to identify and develop new technologies that will reduce the risk and/or cost of remediating DOE underground waste storage tanks and tank contents. There are, however, many more technology investment opportunities than the current budget can support. Current technology development selection methods evaluate new technologies in isolation from other components of an overall tank waste remediation system. This report describes a System Analysis Model developed under the US Department of Energy (DOE) Office of Technology Development (OTD) Underground Storage Tank-Integrated Demonstration (UST-ID) program. The report identifies the project objectives and provides a description of the model. Development of the first "demonstration" version of this model and a trial application have been completed and the results are presented. This model will continue to evolve as it undergoes additional user review and testing.

  1. Global compilation of marine varve records

    Science.gov (United States)

    Schimmelmann, Arndt; Lange, Carina B.; Schieber, Juergen; Francus, Pierre; Ojala, Antti E. K.; Zolitschka, Bernd

    2017-04-01

    Marine varves contain highly resolved records of geochemical and other paleoceanographic and paleoenvironmental proxies with annual to seasonal resolution. We present a global compilation of marine varved sedimentary records throughout the Holocene and Quaternary covering more than 50 sites worldwide. Marine varve deposition and preservation typically depend on environmental and sedimentological conditions, such as a sufficiently high sedimentation rate, severe depletion of dissolved oxygen in bottom water to exclude bioturbation by macrobenthos, and a seasonally varying sedimentary input to yield a recognizable rhythmic varve pattern. Additional oceanographic factors may include the strength and depth range of the Oxygen Minimum Zone (OMZ) and regional anthropogenic eutrophication. Modern to Quaternary marine varves are not only found in those parts of the open ocean that comply with these conditions, but also in fjords, embayments and estuaries with thermohaline density stratification, and nearshore 'marine lakes' with strong hydrologic connections to ocean water. Marine varves have also been postulated in pre-Quaternary rocks. In the case of non-evaporitic laminations in fine-grained ancient marine rocks, such as banded iron formations and black shales, laminations may not be varves but instead may have multiple alternative origins such as event beds or formation via bottom currents that transported and sorted silt-sized particles, clay floccules, and organic-mineral aggregates in the form of migrating bedload ripples. Modern marine ecosystems on continental shelves and slopes, in coastal zones and in estuaries are susceptible to stress by anthropogenic pressures, for example in the form of eutrophication, enhanced OMZs, and expanding ranges of oxygen-depletion in bottom waters. Sensitive laminated sites may play the important role of a 'canary in the coal mine' where monitoring the character and geographical extent of laminations/varves serves as a diagnostic

  2. Research investigations in oil shale, tar sand, coal research, advanced exploratory process technology, and advanced fuels research: Volume 1 -- Base program. Final report, October 1986--September 1993

    Energy Technology Data Exchange (ETDEWEB)

    Smith, V.E.

    1994-05-01

    Numerous studies have been conducted in five principal areas: oil shale, tar sand, underground coal gasification, advanced process technology, and advanced fuels research. In subsequent years, underground coal gasification was broadened to be coal research, under which several research activities were conducted that related to coal processing. The most significant change occurred in 1989 when the agreement was redefined as a Base Program and a Jointly Sponsored Research Program (JSRP). Investigations were conducted under the Base Program to determine the physical and chemical properties of materials suitable for conversion to liquid and gaseous fuels, to test and evaluate processes and innovative concepts for such conversions, to monitor and determine environmental impacts related to development of commercial-sized operations, and to evaluate methods for mitigation of potential environmental impacts. This report is divided into two volumes: Volume 1 consists of 28 summaries that describe the principal research efforts conducted under the Base Program in five topic areas. Volume 2 describes tasks performed within the JSRP. Research conducted under this agreement has resulted in technology transfer of a variety of energy-related research information. A listing of related publications and presentations is given at the end of each research topic summary. More specific and detailed information is provided in the topical reports referenced in the related publications listings.

  3. Code Commentary and Automatic Refactorings using Feedback from Multiple Compilers

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Probst, Christian W.; Karlsson, Sven

    2014-01-01

    Optimizing compilers are essential to the performance of parallel programs on multi-core systems. It is attractive to expose parallelism to the compiler letting it do the heavy lifting. Unfortunately, it is hard to write code that compilers are able to optimize aggressively and therefore tools...... exist that can guide programmers with refactorings allowing the compilers to optimize more aggressively. We target the problem with many false positives that these tools often generate, where the amount of feedback can be overwhelming for the programmer. Our approach is to use a filtering scheme based...... on feedback from multiple compilers and show how we are able to filter out 87.6% of the comments by only showing the most promising comments....
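
    The filtering idea described above can be sketched as follows (an illustration of the general scheme, not the authors' tool; the report format is an assumption): optimization remarks emitted by several compilers for the same source location are intersected, so that only hints supported by every compiler are shown to the programmer.

        def filter_comments(reports):
            # reports: mapping compiler name -> set of (file, line, remark) tuples.
            # Keep only remarks that every compiler agrees on.
            report_sets = list(reports.values())
            agreed = set.intersection(*report_sets) if report_sets else set()
            return sorted(agreed)

        if __name__ == "__main__":
            reports = {
                "compilerA": {("kernel.c", 42, "loop not vectorized: assumed aliasing"),
                              ("kernel.c", 57, "loop not parallelized: unknown trip count")},
                "compilerB": {("kernel.c", 42, "loop not vectorized: assumed aliasing")},
            }
            for comment in filter_comments(reports):
                print(comment)       # only the line-42 remark survives the filter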

  4. Fully Countering Trusting Trust through Diverse Double-Compiling

    CERN Document Server

    Wheeler, David A

    2010-01-01

    An Air Force evaluation of Multics, and Ken Thompson's Turing award lecture ("Reflections on Trusting Trust"), showed that compilers can be subverted to insert malicious Trojan horses into critical software, including themselves. If this "trusting trust" attack goes undetected, even complete analysis of a system's source code will not find the malicious code that is running. Previously-known countermeasures have been grossly inadequate. If this attack cannot be countered, attackers can quietly subvert entire classes of computer systems, gaining complete control over financial, infrastructure, military, and/or business systems worldwide. This dissertation's thesis is that the trusting trust attack can be detected and effectively countered using the "Diverse Double-Compiling" (DDC) technique, as demonstrated by (1) a formal proof that DDC can determine if source code and generated executable code correspond, (2) a demonstration of DDC with four compilers (a small C compiler, a small Lisp compiler, a small malic...
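
    A schematic sketch of the DDC check described above (the command names and file paths are placeholders, and a real application also requires careful control of flags, libraries and build determinism): the compiler-under-test's source is compiled with an independent trusted compiler, the result is used to recompile the same source, and the output is compared bit-for-bit with the self-compiled binary.

        import hashlib
        import subprocess

        def sha256(path):
            with open(path, "rb") as f:
                return hashlib.sha256(f.read()).hexdigest()

        def build(compiler, source, output):
            # Placeholder build step; a real DDC run must pin the whole build environment.
            subprocess.run([compiler, source, "-o", output], check=True)

        def diverse_double_compile(trusted_compiler, compiler_source, self_built_binary):
            build(trusted_compiler, compiler_source, "stage1")   # trusted compiler builds cA
            build("./stage1", compiler_source, "stage2")         # stage1 rebuilds cA from source
            # If source and self-built binary correspond, stage2 should equal it exactly.
            return sha256("stage2") == sha256(self_built_binary)

        if __name__ == "__main__":
            ok = diverse_double_compile("tcc", "compilerA.c", "compilerA-selfbuilt")
            print("source and executable correspond" if ok else "mismatch: investigate")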

  5. The Research of CORBA and IDL Compiler

    Institute of Scientific and Technical Information of China (English)

    李志蜀; 尹皓

    2000-01-01

    The CORBA specification, as one of the mainstream technologies in distributed object computing (DOC), has developed rapidly. The authors discuss the architecture of CORBA systems and analyse the working principles of the IDL compiler.

  6. Archive Compiles New Resource for Global Tropical Cyclone Research

    Science.gov (United States)

    Knapp, Kenneth R.; Kruk, Michael C.; Levinson, David H.; Gibney, Ethan J.

    2009-02-01

    The International Best Track Archive for Climate Stewardship (IBTrACS) compiles tropical cyclone best track data from 11 tropical cyclone forecast centers around the globe, producing a unified global best track data set (M. C. Kruk et al., A technique for merging global tropical cyclone best track data, submitted to Journal of Atmospheric and Oceanic Technology, 2008). Best track data (so called because the data generally refer to the best estimate of a storm's characteristics) include the position, maximum sustained winds, and minimum central pressure of a tropical cyclone at 6-hour intervals. Despite the significant impact of tropical cyclones on society and natural systems, there had been no central repository maintained for global best track data prior to the development of IBTrACS in 2008. The data set, which builds upon the efforts of the international tropical forecasting community, has become the most comprehensive global best track data set publicly available. IBTrACS was created by the U.S. National Oceanic and Atmospheric Administration's National Climatic Data Center (NOAA NCDC) under the auspices of the World Data Center for Meteorology.

  7. Ethical Hacking: Research and Course Compilation

    OpenAIRE

    Matero, Ida

    2016-01-01

    The constant leaps forward in all technological areas have begun to cause increasing amounts of concern to both business owners and private individuals. Security is one of the areas where constant education and improvement is required in order to keep a system inaccessible for unauthorized personnel. Ethical hacking is a form of penetration testing where the tester takes the role of a legitimate attacker and attempts to access the system through unauthorized means. This attack shows the v...

  8. Choice for All: ICAART 88. Proceedings of the International Conference of the Association for the Advancement of Rehabilitation Technology (Montreal, Canada, June 25-30, 1988). Volume III = Choix pour tous: ICAART 88. Compte rendu conference internationale pour le developpement de la technologie en readaptation (Montreal, Canada, Juin 25-30, 1988).

    Science.gov (United States)

    RESNA: Association for the Advancement of Rehabilitation Technology, Washington, DC.

    In these proceedings are compiled 290 papers from 15 countries, demonstrating applications of rehabilitation engineering and technology. The 16 Special Interest Groups of the Association for the Advancement of Rehabilitation Technology (RESNA) present papers in the following interest areas: Service Delivery Practice, Personal Transportation,…

  9. NASA Historical Data Book. Volume 6; NASA Space Applications, Aeronautics and Space Research and Technology, Tracking and Data Acquisition/Support Operations, Commercial Programs and

    Science.gov (United States)

    Rumerman, Judy A.

    2000-01-01

    This sixth volume of the NASA Historical Data Book is a continuation of those earlier efforts. This fundamental reference tool presents information, much of it statistical, documenting the development of several critical areas of NASA responsibility for the period between 1979 and 1988. This volume includes detailed information on the space applications effort, the development and operation of aeronautics and space research and technology programs, tracking and data acquisition/space operations, commercial programs, facilities and installations, personnel, and finances and procurement during this era. Special thanks are owed to the student research assistants who gathered and input much of the tabular material-a particularly tedious undertaking. There are numerous people at NASA associated with historical study, technical information, and the mechanics of publishing who helped in myriad ways in the preparation of this historical data book.

  10. Verification of aerial photo stand volume tables for southeast Alaska.

    Science.gov (United States)

    Theodore S. Setzer; Bert R. Mead

    1988-01-01

    Aerial photo volume tables are used in the multilevel sampling system of Alaska Forest Inventory and Analysis. These volume tables are presented with a description of the data base and methods used to construct the tables. Volume estimates compiled from the aerial photo stand volume tables and associated ground-measured values are compared and evaluated.

  11. Compilation of kinetic data for geochemical calculations

    Energy Technology Data Exchange (ETDEWEB)

    Arthur, R.C. [Monitor Scientific, LLC., Denver, Colorado (United States); Savage, D. [Quintessa, Ltd., Nottingham (United Kingdom); Sasamoto, Hiroshi; Shibata, Masahiro; Yui, Mikazu [Japan Nuclear Cycle Development Inst., Tokai, Ibaraki (Japan). Tokai Works

    2000-01-01

    Kinetic data, including rate constants, reaction orders and activation energies, are compiled for 34 hydrolysis reactions involving feldspars, sheet silicates, zeolites, oxides, pyroxenes and amphiboles, and for similar reactions involving calcite and pyrite. The data are compatible with a rate law consistent with surface reaction control and transition-state theory, which is incorporated in the geochemical software packages EQ3/6 and GWB. Kinetic data for the reactions noted above are strictly compatible with the transition-state rate law only under far-from-equilibrium conditions. It is possible that the data are conceptually consistent with this rate law under both far-from-equilibrium and near-to-equilibrium conditions, but this should be confirmed whenever possible through analysis of original experimental results. Due to limitations in the availability of kinetic data for mineral-water reactions, and in order to simplify evaluations of geochemical models of groundwater evolution, it is convenient to assume local equilibrium in such models whenever possible. To assess whether this assumption is reasonable, a modeling approach accounting for coupled fluid flow and water-rock interaction is described that can be used to estimate the spatial and temporal scales of local equilibrium. The approach is demonstrated for conditions involving groundwater flow in fractures at JNC's Kamaishi in-situ test site, and is also used to estimate the travel time necessary for oxidizing surface waters to migrate to the level of an HLW repository in crystalline rock. The question of whether local equilibrium is a reasonable assumption must be addressed using an appropriate modeling approach. To be appropriate for conditions at the Kamaishi site using the modeling approach noted above, the fracture fill must closely approximate a porous medium, groundwater flow must be purely advective and diffusion of solutes across the fracture-host rock boundary must not occur. Moreover, the
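
    For orientation, a commonly used general form of a surface-controlled, transition-state-theory-consistent rate law of the kind referred to above can be written as follows; the specific form shown (a single activity term and a linear affinity dependence) is an assumption for illustration, and the exact formulation and parameter set used in the compilation should be taken from the report itself.

        r = k(T) \, S \, a_{\mathrm{H^+}}^{\,n} \left( 1 - \frac{Q}{K} \right),
        \qquad
        k(T) = k_{25} \exp\!\left[ \frac{-E_a}{R} \left( \frac{1}{T} - \frac{1}{298.15} \right) \right]

    Here r is the dissolution rate, k(T) the rate constant at absolute temperature T (with k_25 its value at 25 °C), S the reactive surface area, a_{H+}^n the hydrogen-ion activity dependence with reaction order n, Q/K the ion activity product divided by the equilibrium constant (the saturation state), E_a the activation energy and R the gas constant.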

  12. A special purpose silicon compiler for designing supercomputing VLSI systems

    Science.gov (United States)

    Venkateswaran, N.; Murugavel, P.; Kamakoti, V.; Shankarraman, M. J.; Rangarajan, S.; Mallikarjun, M.; Karthikeyan, B.; Prabhakar, T. S.; Satish, V.; Venkatasubramaniam, P. R.

    1991-01-01

    Design of general/special purpose supercomputing VLSI systems for numeric algorithm execution involves tackling two important aspects, namely their computational and communication complexities. Development of software tools for designing such systems itself becomes complex. Hence a novel design methodology has to be developed. For designing such complex systems a special purpose silicon compiler is needed in which: the computational and communicational structures of different numeric algorithms should be taken into account to simplify the silicon compiler design, the approach is macrocell based, and the software tools at different levels (algorithm down to the VLSI circuit layout) should get integrated. In this paper a special purpose silicon (SPS) compiler based on PACUBE macrocell VLSI arrays for designing supercomputing VLSI systems is presented. It is shown that turn-around time and silicon real estate get reduced over the silicon compilers based on PLA's, SLA's, and gate arrays. The first two silicon compiler characteristics mentioned above enable the SPS compiler to perform systolic mapping (at the macrocell level) of algorithms whose computational structures are of GIPOP (generalized inner product outer product) form. Direct systolic mapping on PLA's, SLA's, and gate arrays is very difficult as they are micro-cell based. A novel GIPOP processor is under development using this special purpose silicon compiler.
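
    For readers unfamiliar with the term, the two kernels that "GIPOP-form" computations generalize can be written down directly (a plain illustration unrelated to the PACUBE macrocells themselves): an inner product reduces two vectors to a scalar, while an outer product expands them into a rank-1 matrix; systolic mappings stream these index spaces through a fixed processor array.

        def inner_product(x, y):
            # Reduction: sum of elementwise products, the core of matrix-vector kernels.
            return sum(a * b for a, b in zip(x, y))

        def outer_product(x, y):
            # Expansion: rank-1 matrix whose (i, j) entry is x[i] * y[j].
            return [[a * b for b in y] for a in x]

        if __name__ == "__main__":
            x, y = [1, 2, 3], [4, 5, 6]
            print(inner_product(x, y))   # 32
            print(outer_product(x, y))   # [[4, 5, 6], [8, 10, 12], [12, 15, 18]]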

  13. Compilation of PRF Canyon Floor Pan Sample Analysis Results

    Energy Technology Data Exchange (ETDEWEB)

    Pool, Karl N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Minette, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wahl, Jon H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Greenwood, Lawrence R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coffey, Deborah S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McNamara, Bruce K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bryan, Samuel A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Scheele, Randall D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Delegard, Calvin H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sinkov, Sergey I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Soderquist, Chuck Z. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fiskum, Sandra K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brown, Garrett N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clark, Richard A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-06-30

    On September 28, 2015, debris collected from the PRF (236-Z) canyon floor, Pan J, was observed to be undergoing a chemical reaction. The material had been transferred from the floor pan to a collection tray inside the canyon the previous Friday. Work in the canyon was stopped to allow Industrial Hygiene to monitor the reacting material. Canyon floor debris that had been sealed out was sequestered at the facility, a recovery plan was developed, and drum inspections were initiated to verify that no additional reactions had occurred. On October 13, in-process drums containing other Pan J material were inspected and showed some indication of a chemical reaction, limited to discoloration and degradation of the inner plastic bags. All Pan J material was sealed back into the canyon and returned to collection trays. Because of the high airborne levels in the canyon during physical debris removal, ETGS (Encapsulation Technology Glycerin Solution) was used as a fogging/lock-down agent. On October 15, subject matter experts confirmed that a reaction had occurred between nitrates (both Plutonium Nitrate and Aluminum Nitrate Nonahydrate (ANN) are present) in the Pan J material and the ETGS fixative used to lower airborne radioactivity levels during debris removal. Management stopped the use of fogging/lock-down agents containing glycerin on bulk materials, declared a Management Concern, and initiated the Potential Inadequacy in the Safety Analysis determination process. Additional drum inspections and laboratory analysis of both reacted and unreacted material are planned. This report compiles the results of many different sample analyses conducted by the Pacific Northwest National Laboratory on samples collected from the Plutonium Reclamation Facility (PRF) floor pans by CH2MHill's Plateau Remediation Company (CHPRC). Revision 1 added Appendix G, which reports the gas generation rate results and methodology. The scope of analyses requested by CHPRC includes the determination of

  14. V meeting of research and technological development of radioactive waste management. Volume II; V Jornadas de investigacion y desarrollo tecnologico en gestion de residuos radiactivos. volumen II

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    Since ENRESA's establishment, the Company has pursued solutions through its R and D Programme as a key source of the knowledge and the scientific and technological development needed in areas where conventional industrial capabilities are not available. The main lines of work, which support ENRESA's successive R and D Plans, concern HLW, LILW, radiological protection, dismantling and closure, supporting facilities, environmental restoration of old uranium mill tailings, and improvements related to safety. ENRESA has now completed its R and D Plan 1999-2003, and the results obtained under that framework are the basis for the next Plan 2004-2008, ENRESA's fifth R and D Plan, prepared in accordance with the General Radioactive Waste Plan and in parallel with the co-operation opportunities offered by the national and international R and D frameworks (the Spanish National Programme of R and D, the EU Sixth Framework Programme, NEA/OECD, IAEA, and bilateral agreements between national agencies). The main scientific and technological results reached so far in this field were presented to Spanish society in open sessions held in Tarragona on 1 to 4 December 2003. The present document is a compilation of the oral presentations given at that meeting. The meeting was also a key action to convey to society, in a clear and transparent way, the effort that ENRESA is making, in accordance with environmental sustainability criteria, to build the future. (Author)

  15. V meeting of research and technological development of radioactive waste management. Volume 1; V Jornadas de investigacion y desarrollo en gestion de residuos radiactivos. volumen 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    Since ENRESA's establishment, the Company has pursued solutions through its R and D Programme as a key source of the knowledge and the scientific and technological development needed in areas where conventional industrial capabilities are not available. The main lines of work, which support ENRESA's successive R and D Plans, concern HLW, LILW, radiological protection, dismantling and closure, supporting facilities, environmental restoration of old uranium mill tailings, and improvements related to safety. ENRESA has now completed its R and D Plan 1999-2003, and the results obtained under that framework are the basis for the next Plan 2004-2008, ENRESA's fifth R and D Plan, prepared in accordance with the General Radioactive Waste Plan and in parallel with the co-operation opportunities offered by the national and international R and D frameworks (the Spanish National Programme of R and D, the EU Sixth Framework Programme, NEA/OECD, IAEA, and bilateral agreements between national agencies). The main scientific and technological results reached so far in this field were presented to Spanish society in open sessions held in Tarragona on 1 to 4 December 2003. The present document is a compilation of the oral presentations given at that meeting. The meeting was also a key action to convey to society, in a clear and transparent way, the effort that ENRESA is making, in accordance with environmental sustainability criteria, to build the future. (Author)

  16. Technology, safety and costs of decommissioning a reference boiling water reactor power station. Volume 2. Appendices. Technical report, September 1977-October 1979

    Energy Technology Data Exchange (ETDEWEB)

    Oak, H.D.; Holter, G.M.; Kennedy, W.E. Jr.; Konzek, G.J.

    1980-06-01

    Technology, safety and cost information is given for the conceptual decommissioning of a large (1100 MWe) boiling water reactor (BWR) power station. Three approaches to decommissioning (immediate dismantlement, safe storage with deferred dismantlement, and entombment) were studied to obtain comparisons of costs, occupational radiation doses, potential doses to the public and other safety impacts. The study also shows the sensitivity of decommissioning safety and costs to the power rating of a BWR in the range of 200 to 1100 MWe. This volume contains the appendices.

  17. A Compiler for CPPNs: Transforming Phenotypic Descriptions Into Genotypic Representations

    DEFF Research Database (Denmark)

    Risi, Sebastian

    2013-01-01

    , the question of how to start evolution from a promising part of the search space becomes more and more important. To address this challenge, we introduce the concept of a CPPN-Compiler, which allows the user to directly compile a high-level description of the desired starting structure into the CPPN itself......-specific regularities like symmetry or repetition. Thus the results presented in this paper open up a new research direction in GDS, in which specialized CPPN-Compilers for different domains could help to overcome the black box of evolutionary optimization....

  18. Efficient Compilation of a Class of Variational Forms

    CERN Document Server

    Kirby, Robert C

    2012-01-01

    We investigate the compilation of general multilinear variational forms over affine simplices and prove a representation theorem for the element tensor (element stiffness matrix) as the contraction of a constant reference tensor and a geometry tensor that accounts for geometry and variable coefficients. Based on this representation theorem, we design an algorithm for efficient pretabulation of the reference tensor. The new algorithm has been implemented in the FEniCS Form Compiler (FFC) and improves on a previous loop-based implementation by several orders of magnitude, thus shortening compile times and development cycles for users of FFC.
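
    The representation theorem above can be read as A_K = A0 : G_K for each cell K. The NumPy sketch below shows that contraction in isolation; the array shapes and random values are chosen purely for illustration and are not FFC output.

      # Element tensor as a contraction of a constant reference tensor with a
      # per-cell geometry tensor: A_K[i, j] = sum_a A0[i, j, a] * G_K[a].
      # Shapes and values are illustrative only.

      import numpy as np

      n_dofs, n_geo = 3, 4                        # e.g. P1 triangle, 4 geometry terms
      A0 = np.random.rand(n_dofs, n_dofs, n_geo)  # reference tensor (cell-independent)

      def element_tensor(A0, G_K):
          """Contract the reference tensor with one cell's geometry tensor."""
          return np.einsum("ija,a->ij", A0, G_K)

      G_K = np.random.rand(n_geo)                 # geometry tensor for one cell
      A_K = element_tensor(A0, G_K)               # 3x3 element (stiffness) matrix
      print(A_K.shape)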

  19. Compiler Optimization: A Case for the Transformation Tool Contest

    Directory of Open Access Journals (Sweden)

    Sebastian Buchwald

    2011-11-01

    An optimizing compiler consists of a front end parsing a textual programming language into an intermediate representation (IR), a middle end performing optimizations on the IR, and a back end lowering the IR to a target representation (TR) built of operations supported by the target hardware. In modern compiler construction, graph-based IRs are employed. Optimization and lowering tasks can then be implemented with graph transformation rules. This case provides two compiler tasks to evaluate the participating tools with regard to performance.
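
    For readers unfamiliar with rule-based optimization on an IR, the sketch below applies a constant-folding rewrite to a toy expression graph. The node classes and the rewrite rule are invented for illustration and are unrelated to the contest's actual metamodel.

      # Toy expression IR and a constant-folding rewrite: Add(Const, Const)
      # nodes are replaced by a single Const node, bottom-up.

      class Const:
          def __init__(self, value):
              self.value = value

      class Add:
          def __init__(self, left, right):
              self.left, self.right = left, right

      def fold(node):
          """Recursively replace Add(Const, Const) with a single Const node."""
          if isinstance(node, Add):
              left, right = fold(node.left), fold(node.right)
              if isinstance(left, Const) and isinstance(right, Const):
                  return Const(left.value + right.value)
              return Add(left, right)
          return node

      expr = Add(Const(2), Add(Const(3), Const(4)))   # (2 + (3 + 4))
      print(fold(expr).value)                         # prints 9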

  20. Writing Compilers and Interpreters A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

    Long-awaited revision to a unique guide that covers both compilers and interpreters. Revised, updated, and now focusing on Java instead of C++, this long-awaited, latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  1. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  2. Space Applications of Automation, Robotics and Machine Intelligence Systems (ARAMIS), phase 2. Volume 1: Telepresence technology base development

    Science.gov (United States)

    Akin, D. L.; Minsky, M. L.; Thiel, E. D.; Kurtzman, C. R.

    1983-01-01

    The field of telepresence is defined, and overviews of those capabilities that are now available, and those that will be required to support a NASA telepresence effort are provided. Investigation of NASA's plans and goals with regard to telepresence, extensive literature search for materials relating to relevant technologies, a description of these technologies and their state of the art, and projections for advances in these technologies over the next decade are included. Several space projects are examined in detail to determine what capabilities are required of a telepresence system in order to accomplish various tasks, such as servicing and assembly. The key operational and technological areas are identified, conclusions and recommendations are made for further research, and an example developmental program is presented, leading to an operational telepresence servicer.

  3. Advanced Technology Section semiannual progress report, April 1-September 30, 1977. Volume 1. Biotechnology and environmental programs. [Lead Abstract

    Energy Technology Data Exchange (ETDEWEB)

    Pitt, W.W. Jr.; Mrochek, J.E. (comps.)

    1980-06-01

    Research efforts in six areas are reported. They include: centrifugal analyzer development; advanced analytical systems; environmental research; bioengineering research; bioprocess development and demonstration; and environmental control technology. Individual abstracts were prepared for each section for ERA/EDB. (JCB)

  4. Alaska NWRS Legacy Seabird Monitoring Data Inventory and Compilation

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The objective of this project is to compile and standardize data from the Alaska Peninsula/Becharof, Kodiak, Togiak, and Yukon Delta National Wildlife Refuges. This...

  5. Compiler for Fast, Accurate Mathematical Computing on Integer Processors Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposers will develop a computer language compiler to enable inexpensive, low-power, integer-only processors to carry out mathematically intensive computations...

  6. Compilation and Synthesis for Fault-Tolerant Digital Microfluidic Biochips

    DEFF Research Database (Denmark)

    Alistar, Mirela

    of electrodes to perform operations such as dispensing, transport, mixing, split, dilution and detection. Researchers have proposed compilation approaches, which, starting from a biochemical application and a biochip architecture, determine the allocation, resource binding, scheduling, placement and routing...

  7. Digital compilation bedrock geologic map of the Warren quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-4A Walsh, GJ, Haydock, S, Prewitt, J, Kraus, J, Lapp, E, O'Loughlin, S, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the...

  8. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  9. Digital compilation bedrock geologic map of the Milton quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-8A Dorsey, R, Doolan, B, Agnew, PC, Carter, CM, Rosencrantz, EJ, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the Milton...

  10. Digital compilation bedrock geologic map of the Lincoln quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-5A Stanley, R, DelloRusso, V, Haydock, S, Lapp, E, O'Loughlin, S, Prewitt, J,and Tauvers, PR, 1995, Digital compilation bedrock geologic map...

  11. A Compilation of Vs30 Values in the United States

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Compiled Vs30 measurements obtained by studies funded by the U.S. Geological Survey (USGS) and other governmental agencies. Thus far, there are 2,997 sites in the...

  12. Specification and compilation of real-time stream processing applications

    NARCIS (Netherlands)

    Geuns, Stephanus Joannes

    2015-01-01

    This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. An example of such applications are software defined radio applications. These applications typically hav

  13. Budget estimates, fiscal year 1995. Volume 1: Agency summary, human space flight, and science, aeronautics and technology

    Science.gov (United States)

    1994-01-01

    The NASA budget request has been restructured in FY 1995 into four appropriations: human space flight; science, aeronautics, and technology; mission support; and inspector general. The human space flight appropriation provides funding for NASA's human space flight activities. This includes the on-orbit infrastructure (space station and Spacelab), transportation capability (space shuttle program, including operations, program support, and performance and safety upgrades), and the Russian cooperation program, which includes the flight activities associated with the cooperative research flights to the Russian Mir space station. These activities are funded in the following budget line items: space station, Russian cooperation, space shuttle, and payload utilization and operations. The science, aeronautics, and technology appropriation provides funding for the research and development activities of NASA. This includes funds to extend our knowledge of the earth, its space environment, and the universe and to invest in new technologies, particularly in aeronautics, to ensure the future competitiveness of the nation. These objectives are achieved through the following elements: space science, life and microgravity sciences and applications, mission to planet earth, aeronautical research and technology, advanced concepts and technology, launch services, mission communication services, and academic programs.

  14. Trident: An FPGA Compiler Framework for Floating-Point Algorithms.

    Energy Technology Data Exchange (ETDEWEB)

    Tripp J. L. (Justin L.); Peterson, K. D. (Kristopher D.); Poznanovic, J. D. (Jeffrey Daniel); Ahrens, C. M. (Christine Marie); Gokhale, M. (Maya)

    2005-01-01

    Trident is a compiler for floating point algorithms written in C, producing circuits in reconfigurable logic that exploit the parallelism available in the input description. Trident automatically extracts parallelism and pipelines loop bodies using conventional compiler optimizations and scheduling techniques. Trident also provides an open framework for experimentation, analysis, and optimization of floating point algorithms on FPGAs and the flexibility to easily integrate custom floating point libraries.

  15. Approximate Compilation of Constraints into Multivalued Decision Diagrams

    DEFF Research Database (Denmark)

    Hadzic, Tarik; Hooker, John N.; O’Sullivan, Barry;

    2008-01-01

    We present an incremental refinement algorithm for approximate compilation of constraint satisfaction models into multivalued decision diagrams (MDDs). The algorithm uses a vertex splitting operation that relies on the detection of equivalent paths in the MDD. Although the algorithm is quite...... by replacing the equivalence test with a constraint-specific measure of distance. We demonstrate the value of the approach for approximate and exact MDD compilation and evaluate its benefits in one of the main MDD application domains, interactive configuration....

  16. Thermal Power Systems, Point-Focusing Distributed Receiver Technology Project. Annual technical report, Fiscal Year 1978. Volume II. Detailed report

    Energy Technology Data Exchange (ETDEWEB)

    1979-03-15

    The goal of this Project is to produce thermal or electrical power from the sun's radiated energy through Point-Focusing Distributed Receiver technology. The energy thus produced must be economically competitive with other sources. This Project supports the industrial development of technology and hardware for extracting energy from solar power to achieve the stated goal. Present studies concentrate the solar energy, through mirrors or lenses, onto a working fluid or gas and convert it, through a power converter, into an energy source useful to man. Rankine-cycle and Brayton-cycle engines are currently being developed as the most promising energy converters for near-future needs. Accomplishments in point-focusing technology in FY 1978 are detailed.

  17. 2016 Annual Technology Baseline (ATB) - Webinar Presentation

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley; Kurup, Parthiv; Hand, Maureen; Feldman, David; Sigrin, Benjamin; Lantz, Eric; Stehly, Tyler; Augustine, Chad; Turchi, Craig; Porro, Gian; O' Connor, Patrick; Waldoch, Connor

    2016-09-13

    This deck was presented for the 2016 Annual Technology Baseline Webinar. The presentation describes the Annual Technology Baseline, which is a compilation of current and future cost and performance data for electricity generation technologies.

  18. On search guide phrase compilation for recommending home medical products.

    Science.gov (United States)

    Luo, Gang

    2010-01-01

    To help people find desired home medical products (HMPs), we developed an intelligent personal health record (iPHR) system that can automatically recommend HMPs based on users' health issues. Using nursing knowledge, we pre-compile a set of "search guide" phrases that provides semantic translation from words describing health issues to their underlying medical meanings. Then iPHR automatically generates queries from those phrases and uses them and a search engine to retrieve HMPs. To avoid missing relevant HMPs during retrieval, the compiled search guide phrases need to be comprehensive. Such compilation is a challenging task because nursing knowledge updates frequently and contains numerous details scattered in many sources. This paper presents a semi-automatic tool facilitating such compilation. Our idea is to formulate the phrase compilation task as a multi-label classification problem. For each newly obtained search guide phrase, we first use nursing knowledge and information retrieval techniques to identify a small set of potentially relevant classes with corresponding hints. Then a nurse makes the final decision on assigning this phrase to proper classes based on those hints. We demonstrate the effectiveness of our techniques by compiling search guide phrases from an occupational therapy textbook.
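
    A minimal sketch of the candidate-generation step described above follows: it ranks a few predefined classes by TF-IDF cosine similarity to a new phrase and returns the top matches as hints for human review. The class labels, descriptions and cutoff are invented for illustration and do not come from the paper; it assumes scikit-learn is available.

      # Rank candidate classes for a new search-guide phrase by TF-IDF cosine
      # similarity; the top matches become hints for a nurse to confirm.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      classes = {
          "mobility aids": "walker cane wheelchair difficulty walking gait",
          "bathing safety": "shower chair grab bar tub transfer bench slipping",
          "respiratory support": "oxygen concentrator nebulizer shortness of breath",
      }

      vectorizer = TfidfVectorizer()
      class_matrix = vectorizer.fit_transform(classes.values())

      def candidate_classes(phrase, top_k=2):
          scores = cosine_similarity(vectorizer.transform([phrase]), class_matrix)[0]
          ranked = sorted(zip(classes, scores), key=lambda pair: -pair[1])
          return ranked[:top_k]          # hints shown for human confirmation

      print(candidate_classes("trouble getting in and out of the bathtub"))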

  19. High-temperature liquid-metal technology review. A Bimonthly Technical Progress Review, Volume 7, Number 2, April 1969

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1969-04-01

    The purpose of the High-Temperature Liquid-Metal Technology Review is to provide up-to-date information on the various research and development programs in the United States in the field of high-temperature liquid-metal technology. The method is to publish reviews prepared by members of the Department of Applied Science of the Brookhaven National Laboratory on current topical and progress reports submitted by contracting organizations. When results and conclusions are reported, it is intended that the individual reviews become both summaries and critiques. Thirteen reviews are presented in this issue.

  20. Technology Commercialization Program 1991

    Energy Technology Data Exchange (ETDEWEB)

    1991-11-01

    This reference compilation describes the Technology Commercialization Program of the Department of Energy, Defense Programs. The compilation consists of two sections. Section 1, Plans and Procedures, describes the plans and procedures of the Defense Programs Technology Commercialization Program. The second section, Legislation and Policy, identifies legislation and policy related to the Program. The procedures for implementing statutory and regulatory requirements are evolving with time. This document will be periodically updated to reflect changes and new material.

  1. An Optimizing Compiler for Petascale I/O on Leadership Class Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Choudhary, Alok [Northwestern Univ., Evanston, IL (United States); Kandemir, Mahmut [Pennsylvania State Univ., State College, PA (United States)

    2015-03-18

    In high-performance computing systems, parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our project explored automated instrumentation and compiler support for I/O-intensive applications. The project made significant progress toward understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and toward designing and implementing state-of-the-art compiler/runtime system technology for I/O-intensive HPC applications targeting leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions.

  2. MELT - a Translated Domain Specific Language Embedded in the GCC Compiler

    Directory of Open Access Journals (Sweden)

    Basile Starynkevitch

    2011-09-01

    The free GCC compiler is a very large piece of software, compiling source in several languages for many targets on various systems. It can be extended by plugins, which may take advantage of its power to provide extra, specific functionality (warnings, optimizations, source refactoring or navigation) by processing various GCC internal representations (Gimple, Tree, ...). Writing plugins in C is a complex and time-consuming task, but customizing GCC by using an existing scripting language inside it is impractical. We describe MELT, a specific Lisp-like DSL which fits well into existing GCC technology and offers high-level features (functional, object or reflexive programming, pattern matching). MELT is translated to C fitted for GCC internals and provides various features to facilitate this. This work shows that even huge legacy software can be extended a posteriori by specifically tailored and translated high-level DSLs.

  3. Technology, safety and costs of decommissioning a reference small mixed oxide fuel fabrication plant. Volume 1. Main report

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins, C. E.; Murphy, E. S.; Schneider, K J

    1979-01-01

    Detailed technology, safety and cost information is presented for the conceptual decommissioning of a reference small mixed oxide fuel fabrication plant. Alternative methods of decommissioning are described, including immediate dismantlement, safe storage for a period of time followed by dismantlement, and entombment. Safety analyses, both occupational and public, and cost evaluations were conducted for each mode.

  4. FINESSE: study of the issues, experiments and facilities for fusion nuclear technology research and development. Interim report. Volume II

    Energy Technology Data Exchange (ETDEWEB)

    Abdou, M.

    1984-10-01

    The Nuclear Fusion Issues chapter contains a comprehensive list of engineering issues for fusion reactor nuclear components. The list explicitly defines the uncertainties associated with the engineering option of a fusion reactor and addresses the potential consequences resulting from each issue. The next chapter identifies the fusion nuclear technology testing needs up to the engineering demonstration stage. (MOW)

  5. The development of coal-based technologies for Department of Defense facilities: Phase 1 final report. Volume 1: Technical report

    Energy Technology Data Exchange (ETDEWEB)

    Miller, B.G.; Morrison, J.L.; Pisupati, S.V. [Pennsylvania State Univ., University Park, PA (United States). Energy and Fuels Research Center] [and others]

    1997-01-31

    The first phase of a three-phase project investigating the development of coal-based technologies for Department of Defense facilities has been completed. The objectives of the project are to: decrease DOD's dependence on foreign oil and increase its use of coal; promote public and private sector deployment of technologies for utilizing coal-based fuels in oil-designed combustion equipment; and provide a continuing environment for research and development of coal-based fuel technologies for small-scale applications at a time when market conditions in the US are not favorable for the introduction of coal-fired equipment in the commercial and industrial capacity ranges. The Phase 1 activities were focused on developing clean, coal-based combustion technologies for the utilization of both micronized coal-water mixtures (MCWMs) and dry, micronized coal (DMC) in fuel oil-designed industrial boilers. The specific objective in Phase 1 was to deliver fully engineered retrofit options for a fuel oil-designed watertube boiler located on a DOD installation to fire either MCWM or DMC. This was achieved through a project consisting of fundamental, pilot-scale, and demonstration-scale activities investigating coal beneficiation and preparation, and MCWM and DMC combustion performance. In addition, detailed engineering designs and an economic analysis were conducted for a boiler located at the Naval Surface Warfare Center, near Crane, Indiana. Results are reported on MCWM and DMC combustion performance evaluation; engineering design; and cost/economic analysis.

  6. Demonstration project as a procedure for accelerating the application of new technology (Charpie Task Force report). Volume II

    Energy Technology Data Exchange (ETDEWEB)

    None

    1978-02-01

    This report examines the issues associated with government programs proposed for the "commercialization" of new energy technologies; these programs are intended to hasten the pace at which target technologies are adopted by the private sector. The "commercial demonstration" is the principal tool used in these programs. Most previous government interventions in support of technological change have focused on R and D and left to the private sector the decision as to adoption for commercial utilization; thus there is relatively little in the way of analysis or experience which bears direct application. The analysis is divided into four sections. First, the role of R, D, and D within the structure of the national energy goals and policies is examined. The issue of "prices versus gaps" is described as a crucial difference of viewpoint concerning the role of the government in the future of the energy system. Second, the process of technological change as it occurs with respect to energy technologies is then examined for possible sources of misalignment of social and private incentives. The process is described as a series of investments. Third, correction of these sources of misalignment then becomes the goal of commercial demonstration programs, and this goal and the means for attaining it are explored. Government-supported commercialization may be viewed as a subsidy to the introduction stage of the process; the circumstances under which such subsidies are likely to affect the success of the subsequent diffusion stage are addressed. The discussion then turns to the political, legal, and institutional problems. Finally, methods for evaluation and planning of commercial demonstration programs are analyzed. The critical areas of ignorance are highlighted and comprise a research agenda for improved analytical techniques to support decisions in this area.

  8. Globalization & technology

    DEFF Research Database (Denmark)

    Narula, Rajneesh

    Technology and globalization are interdependent processes. Globalization has a fundamental influence on the creation and diffusion of technology, which, in turn, affects the interdependence of firms and locations. This volume examines the international aspect of this interdependence at two levels...

  9. Proceedings of the Fifth CANMET/ACI International Conference on Recent Advances in Concrete Technology : volume 1 and supplementary papers

    Energy Technology Data Exchange (ETDEWEB)

    Malhotra, V.M. (ed.) [Natural Resources Canada, Ottawa, ON (Canada). CANMET Energy Technology Centre]; Venturino, M. (comp.)

    2001-07-01

    This conference brought together researchers from industry, academia, and government agencies from around the world to discuss recent advances in concrete technology and to identify areas that need more research. The presentations covered all aspects of concrete technology and sustainability, with most of them dealing with supplementing cementing materials with admixtures such as fly ash. In addition to the referenced proceedings, a book of supplementary papers was also published. Cement blends were found to prolong the longevity of concrete; an added benefit is that they avoid the huge cost of repair and replacement cycles. Fly ash is considered a viable waste-product material for use in cement mixtures. A total of 64 papers were presented at this conference, of which 13 have been processed separately for inclusion in the database. refs., tabs., figs.

  10. Summary Report on Phase I Results from the 3D Printing in Zero G Technology Demonstration Mission, Volume I

    Science.gov (United States)

    Prater, T. J.; Bean, Q. A.; Beshears, R. D.; Rolin, T. D.; Werkheiser, N. J.; Ordonez, E. A.; Ryan, R. M.; Ledbetter, F. E., III

    2016-01-01

    Human space exploration to date has been confined to low-Earth orbit and the Moon. The International Space Station (ISS) provides a unique opportunity for researchers to prove out the technologies that will enable humans to safely live and work in space for longer periods of time and venture beyond the Earth/Moon system. The ability to manufacture parts in-space rather than launch them from Earth represents a fundamental shift in the current risk and logistics paradigm for human spaceflight. In September 2014, NASA, in partnership with Made In Space, Inc., launched the 3D Printing in Zero-G technology demonstration mission to explore the potential of additive manufacturing for in-space applications and demonstrate the capability to manufacture parts and tools on orbit using fused deposition modeling. This Technical Publication summarizes the results of testing to date of the ground control and flight prints from the first phase of this ISS payload.

  11. FINESSE: study of the issues, experiments and facilities for fusion nuclear technology research and development. Interim report. Volume III

    Energy Technology Data Exchange (ETDEWEB)

    Abdou, M.

    1984-10-01

    This chapter deals with the analysis and engineering scaling of solid breeder blankets. The limits under which full component behavior can be achieved under changed test conditions are explored. The characterization of these test requirements for integrated testing contributes to the overall test matrix and test plan for the understanding and development of fusion nuclear technology. The second chapter covers the analysis and engineering scaling of liquid metal blankets. The testing goals for a complete blanket program are described. (MOW)

  13. Technology, safety and costs of decommissioning a Reference Boiling Water Reactor Power Station. Main report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oak, H.D.; Holter, G.M.; Kennedy, W.E. Jr.; Konzek, G.J.

    1980-06-01

    Technology, safety and cost information is given for the conceptual decommissioning of a large (1100 MWe) boiling water reactor (BWR) power station. Three approaches to decommissioning (immediate dismantlement, safe storage with deferred dismantlement, and entombment) were studied to obtain comparisons of costs, occupational radiation doses, potential doses to the public and other safety impacts. The study also shows the sensitivity of decommissioning safety and costs to the power rating of a BWR in the range of 200 to 1100 MWe.

  14. Heavy-Section Steel Technology Program. Semiannual progress report, October 1991--March 1992: Volume 9, No. 1

    Energy Technology Data Exchange (ETDEWEB)

    Pennell, W E [Oak Ridge National Lab., TN (United States)

    1992-11-01

    The Heavy-Section Steel Technology (HSST) Program is conducted for the Nuclear Regulatory Commission (NRC) by Oak Ridge National Laboratory (ORNL). The program focus is on the development and validation of technology for the assessment of fracture-prevention margins in commercial nuclear reactor pressure vessels. The HSST Program is organized in 11 tasks: program management, fracture methodology and analysis, material characterization and properties, special technical assistance, fracture analysis computer programs, cleavage-crack initiation, cladding evaluations, pressurized-thermal-shock technology, analysis methods validation, fracture evaluation tests, and warm prestressing. The program tasks have been structured to place emphasis on the resolution of fracture issues with near-term licensing significance. Resources to execute the research tasks are drawn from ORNL with subcontract support from universities and other research laboratories. Close contact is maintained with the sister Heavy-Section Steel Irradiation (HSSI) Program at ORNL and with related research programs both in the United States and abroad. This report provides an overview of principal developments in each of the 11 program tasks from October 1, 1991 to March 31, 1992.

  15. Technology in Education

    Science.gov (United States)

    Roden, Kasi

    2011-01-01

    This paper was written to support a position on using technology in education. The purpose of this study was to support the use of technology in education by synthesizing previous research. A variety of sources including books and journal articles were studied in order to compile an overview of the benefits of using technology in elementary,…

  16. Retargeting of existing FORTRAN program and development of parallel compilers

    Science.gov (United States)

    Agrawal, Dharma P.

    1988-01-01

    The software models used in implementing the parallelizing compiler for the B-HIVE multiprocessor system are described. The various models and strategies used in the compiler development are: a flexible granularity model, which allows a compromise between two extreme granularity models; a communication model, which is capable of precisely describing interprocessor communication timings and patterns; a loop type detection strategy, which identifies different types of loops; a critical path with coloring scheme, which is a versatile scheduling strategy for any multicomputer with associated communication costs; and a loop allocation strategy, which realizes optimum overlapped operations between computation and communication of the system. Using these models, several sample routines of the AIR3D package are examined and tested. The automatically generated code is highly parallelized to provide the maximum degree of parallelism, obtaining speedup for systems of up to 28 to 32 processors. A comparison of parallel codes for both the existing and proposed communication models is performed, and the corresponding expected speedup factors are obtained. The experimentation shows that the B-HIVE compiler produces more efficient codes than existing techniques. Work is progressing well in completing the final phase of the compiler. Numerous enhancements are needed to improve the capabilities of the parallelizing compiler.
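
    To illustrate the flavor of scheduling with communication costs mentioned above, the sketch below performs a greedy list schedule of a tiny task graph, adding a fixed delay whenever producer and consumer land on different processors. The task graph, costs and policy are invented for illustration and are not the B-HIVE compiler's algorithm.

      # Greedy list scheduling on a task DAG with a fixed communication delay
      # between tasks placed on different processors. All values are invented.

      tasks = {"a": 2.0, "b": 3.0, "c": 2.0, "d": 4.0}   # task -> compute time
      deps = {"c": ["a", "b"], "d": ["c"]}               # task -> predecessors
      COMM = 1.0                                         # delay if producer/consumer differ

      def schedule(tasks, deps, n_procs=2):
          finish, proc, free = {}, {}, [0.0] * n_procs
          for t in ("a", "b", "c", "d"):                 # tasks in topological order
              best = None
              for p in range(n_procs):
                  ready = max([finish[d] + (COMM if proc[d] != p else 0.0)
                               for d in deps.get(t, [])] + [0.0])
                  start = max(ready, free[p])
                  if best is None or start < best[0]:
                      best = (start, p)
              start, p = best
              finish[t], proc[t] = start + tasks[t], p
              free[p] = finish[t]
          return finish, proc

      print(schedule(tasks, deps))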

  17. CAPS OpenACC Compilers: Performance and Portability

    CERN Document Server

    CERN. Geneva

    2013-01-01

    The announcement in late 2011 of the new OpenACC directive-based programming standard, supported by the CAPS, CRAY and PGI compilers, has opened the door to more scientific applications that can be ported to many-core systems. Following a porting methodology, this talk will first review the principles of programming with OpenACC and then the advanced features available in the CAPS compilers to further optimize OpenACC applications: library integration and tuning directives with auto-tune mechanisms to build applications adaptive to different GPUs. CAPS compilers use hardware vendors' backends such as NVIDIA CUDA and OpenCL, making them the only OpenACC compilers supporting various many-core architectures. About the speaker: Stéphane Bihan is co-founder and currently Director of Sales and Marketing at CAPS enterprise. He has held several R&D positions in companies such as ARC international plc in London, Canon Research Center France, ACE compiler experts in Amsterdam and the INRIA r...

  18. Technology.

    Science.gov (United States)

    Online-Offline, 1998

    1998-01-01

    Focuses on technology, on advances in such areas as aeronautics, electronics, physics, the space sciences, as well as computers and the attendant progress in medicine, robotics, and artificial intelligence. Describes educational resources for elementary and middle school students, including Web sites, CD-ROMs and software, videotapes, books,…

  19. IMPACT XP: An Integrated Performance/Power Analysis Framework for Compiler and Architecture Research

    Institute of Scientific and Technical Information of China (English)

    WANG Yongwen; ZHANG Minxuan

    2004-01-01

    Modern high-performance microprocessor design requires making power and performance tradeoffs at an early phase, while traditional power analysis tools cannot satisfy this requirement. Wattch provides an architectural power evaluation methodology within the SimpleScalar toolset; however, current high-level power estimation tools still have limitations in functionality and range of research. In this paper, IMPACT XP is presented as a novel micro-architecture-level power and performance analysis framework for both superscalar and EPIC architectures. In this framework, processors are divided into blocks at the architecture abstraction layer. Wattch power models are implemented to estimate the peak power of each block, and a cycle-accurate simulator is set up to calculate the dynamic energy dissipation and performance statistics. User-friendly interfaces are also provided. Experiments show that the framework keeps the error of the estimated results within 10% of the reported values for sample commercial microprocessors. Based on this framework, quantitative studies have been made of the additional power effects of modern compilers' performance optimization technologies and the inherent power characteristics of superscalar and EPIC architectures. IMPACT XP provides a power and performance analysis methodology by implementing analytical power models within the portable and familiar IMPACT compiler. It will facilitate architecture and compiler research in the power-aware field.
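
    As a rough illustration of the Wattch-style modeling described above, the sketch below scales a per-block peak power by an activity factor derived from simulated access counts. The block names, peak powers and counts are invented and do not come from the paper.

      # Architecture-level dynamic-power estimate in the Wattch style:
      # power ~ sum over blocks of peak_power * (accesses / cycles).
      # All numbers below are invented for illustration.

      peak_power = {"icache": 1.2, "dcache": 1.5, "alu": 0.8, "regfile": 0.6}  # watts

      def dynamic_power(peak_power, accesses, cycles):
          """Sum of per-block peak power scaled by per-cycle activity."""
          return sum(p * (accesses.get(block, 0) / cycles)
                     for block, p in peak_power.items())

      accesses = {"icache": 9.0e8, "dcache": 4.0e8, "alu": 7.0e8, "regfile": 1.4e9}
      cycles = 1.0e9
      print(f"{dynamic_power(peak_power, accesses, cycles):.2f} W")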

  20. Solar thermal technology development: Estimated market size and energy cost savings. Volume 2: Assumptions, methodology and results

    Science.gov (United States)

    Gates, W. R.

    1983-01-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Three fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. Solar thermal technology research and development (R&D) is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest. Analysis is also provided regarding two federal incentives currently in use: The Federal Business Energy Tax Credit and direct R&D funding.

  1. Documentation of the analysis of the benefits and costs of aeronautical research and technology models, volume 1

    Science.gov (United States)

    Bobick, J. C.; Braun, R. L.; Denny, R. E.

    1979-01-01

    The analysis of the benefits and costs of aeronautical research and technology (ABC-ART) models are documented. These models were developed by NASA for use in analyzing the economic feasibility of applying advanced aeronautical technology to future civil aircraft. The methodology is composed of three major modules: the fleet accounting module, the airframe manufacturing module, and the air carrier module. The fleet accounting module is used to estimate the number of new aircraft required as a function of time to meet demand. This estimation is based primarily upon the expected retirement age of existing aircraft and the expected change in revenue passenger miles demanded. Fuel consumption estimates are also generated by this module. The airframe manufacturing module is used to analyze the feasibility of manufacturing the new aircraft demanded. The module includes logic for production scheduling and estimating manufacturing costs. For a series of aircraft selling prices, a cash flow analysis is performed and a rate of return on investment is calculated. The air carrier module provides a tool for analyzing the financial feasibility of an airline purchasing and operating the new aircraft. This module includes a methodology for computing the air carrier direct and indirect operating costs, performing a cash flow analysis, and estimating the internal rate of return on investment for a set of aircraft purchase prices.
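
    The cash flow and rate-of-return steps mentioned above amount to a standard net-present-value/IRR calculation; a minimal sketch follows, with invented cash flows (it is not the ABC-ART code itself).

      # Net present value of a cash-flow stream and internal rate of return
      # found by bisection. Cash flows are invented for illustration.

      def npv(rate, cash_flows):
          """Discount cash_flows[t] (t = 0, 1, 2, ...) at the given rate."""
          return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

      def irr(cash_flows, lo=-0.99, hi=1.0, tol=1e-6):
          """Bisection on npv(rate) = 0; assumes a sign change on [lo, hi]."""
          while hi - lo > tol:
              mid = 0.5 * (lo + hi)
              if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
                  hi = mid
              else:
                  lo = mid
          return 0.5 * (lo + hi)

      flows = [-120.0, 30.0, 40.0, 45.0, 50.0]   # purchase price, then net revenues
      print(f"IRR = {irr(flows):.3f}")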

  2. Solar thermal technology development: Estimated market size and energy cost savings. Volume 2: Assumptions, methodology and results

    Science.gov (United States)

    Gates, W. R.

    1983-02-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Three fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. Solar thermal technology research and development (R&D) is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest. Analysis is also provided regarding two federal incentives currently in use: The Federal Business Energy Tax Credit and direct R&D funding.

  3. Heavy-section steel technology program: Semiannual progress report for April--September 1996. Volume 13, Number 2

    Energy Technology Data Exchange (ETDEWEB)

    Pennell, W.E. [Oak Ridge National Lab., TN (United States)

    1998-03-01

    The Heavy-Section Steel Technology (HSST) Program is conducted for the US Nuclear Regulatory Commission (NRC) by the Oak Ridge National Laboratory (ORNL). The program focus is on the development and validation of technology for the assessment of fracture-prevention margins in commercial nuclear reactor pressure vessels. The HSST Program is organized in seven tasks: (1) program management, (2) constraint effects analytical development and validation, (3) evaluation of cladding effects, (4) ductile to cleavage fracture mode conversion, (5) fracture analysis methods development and applications, (6) material property data and test methods, and (7) integration of results into a state-of-the-art methodology. The program tasks have been structured to place emphasis on the resolution of fracture issues with near-term licensing significance. Resources to execute the research tasks are drawn from ORNL with subcontract support from universities and other research laboratories. Close contact is maintained with the sister Heavy-Section Steel Irradiation Program at ORNL and with related research programs both in the US and abroad. This report provides an overview of principal developments in each of the seven program tasks from April to September 1996.

  4. Data summary of municipal solid waste management alternatives. Volume 7, Appendix E -- Material recovery/material recycling technologies

    Energy Technology Data Exchange (ETDEWEB)

    None

    1992-10-01

    The enthusiasm for and commitment to recycling of municipal solid wastes is based on several intuitive benefits: conservation of landfill capacity; conservation of non-renewable natural resources and energy sources; minimization of the perceived potential environmental impacts of MSW combustion and landfilling; and minimization of disposal costs, both directly and through material resale credits. In this discussion, "recycling" refers to materials recovered from the waste stream. It excludes scrap materials that are recovered and reused during industrial manufacturing processes and prompt industrial scrap. Materials recycling is an integral part of several solid waste management options. For example, in the preparation of refuse-derived fuel (RDF), ferrous metals are typically removed from the waste stream both before and after shredding. Similarly, composting facilities often include processes for recovering inert recyclable materials such as ferrous and nonferrous metals, glass, plastics, and paper. While these two technologies have as their primary objectives the production of RDF and compost, respectively, the demonstrated recovery of recyclables emphasizes the inherent compatibility of recycling with these MSW management strategies. This appendix discusses several technology options with regard to separating recyclables at the source of generation, the methods available for collecting and transporting these materials to an MRF, the market requirements for post-consumer recycled materials, and the process unit operations. Mixed waste MRFs associated with mass burn plants are also presented.

  5. Compilation of current high energy physics experiments - Sept. 1978

    Energy Technology Data Exchange (ETDEWEB)

    Addis, L.; Odian, A.; Row, G. M.; Ward, C. E. W.; Wanderer, P.; Armenteros, R.; Joos, P.; Groves, T. H.; Oyanagi, Y.; Arnison, G. T. J.; Antipov, Yu; Barinov, N.

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche. (RWR)

  6. A DRAM compiler algorithm for high performance VLSI embedded memories

    Science.gov (United States)

    Eldin, A. G.

    1992-01-01

    In many applications, the limited density of embedded SRAM does not allow integrating the memory on the same chip with other logic and functional blocks. In such cases, embedded DRAM provides the optimum combination of very high density, low power, and high performance. For ASICs to take full advantage of this design strategy, an efficient and highly reliable DRAM compiler must be used. The embedded DRAM architecture, cell, and peripheral circuit design considerations, and the algorithm of a high-performance memory compiler, are presented.
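
    To illustrate the kind of decision a memory compiler automates, the sketch below picks a near-square rows-by-columns organization for a requested capacity so that bit-line and word-line lengths stay balanced. It is only an assumption-laden illustration, not the DRAM compiler algorithm described in the abstract.

      # Choose a near-square rows x columns organization for a requested
      # capacity and word width; rows are kept a power of two. Illustrative only.

      import math

      def organize(words, bits_per_word):
          """Return (rows, columns) with rows a power of two and rows*columns >= total bits."""
          total_bits = words * bits_per_word
          rows = 2 ** round(math.log2(math.sqrt(total_bits)))   # near-square array
          cols = math.ceil(total_bits / rows)
          return rows, cols

      print(organize(words=16384, bits_per_word=8))   # (256, 512) for a 128-Kbit array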

  7. Compilation of current high-energy-physics experiments

    Energy Technology Data Exchange (ETDEWEB)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1980-04-01

    This is the third edition of a compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and ten participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about January 1980, and (2) had not completed taking of data by 1 January 1976.

  8. The necessity of do needs analysis in textbook compilation

    Institute of Scientific and Technical Information of China (English)

    姚茂

    2014-01-01

    Needs analysis plays an important role in textbook compilation. Compiling an excellent textbook requires meeting many conditions, but the starting point of any textbook should be meeting the needs of its users. Needs analysis is therefore carried out to understand users' needs and to make the textbook better reflect relevance and practicality. Only when textbook writers fully understand the actual demands of users (students, teachers, and education department managers) will they be able to write applicable materials.

  9. Assessment of the use of space technology in the monitoring of oil spills and ocean pollution: Technical volume. Executive summary

    Science.gov (United States)

    Alvarado, U. R. (Editor); Chafaris, G.; Chestek, J.; Contrad, J.; Frippel, G.; Gulatsi, R.; Heath, A.; Hodara, H.; Kritikos, H.; Tamiyasu, K.

    1980-01-01

    The potential of space systems and technology for detecting and monitoring ocean oil spills and waste pollution was assessed as well as the impact of this application on communication and data handling systems. Agencies charged with responsibilities in this area were identified and their measurement requirements were ascertained in order to determine the spatial resolution needed to characterize operational and accidental discharges. Microwave and optical sensors and sensing techniques were evaluated as candidate system elements. Capabilities are described for the following: synthetic aperture radar, microwave scatterometer, passive microwave radiometer, microwave altimeter, electro-optical sensors currently used in airborne detection, existing space-based optical sensors, the thematic mapper, and the pointable optical linear array.

  10. 5{sup th} international geothermal conference. Conference volume. Risk management, financing, power plant technology, EGS/HFR

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Jochen; Hoffmann, Nadine; Brian, Marcus (eds.)

    2009-07-01

    At the 5th International Geothermal Conference, held on 27 and 28 April 2009 in Freiburg (Federal Republic of Germany), the following lectures were given: (a) Worldwide development of geothermal energy (Ladislaus Rypach); (b) Geothermal developments and applications in Turkey (Orhan Mertoglu); (c) Guermat Elektrik: Turkish experiences in geothermal financings (John F. Wolfe); (d) Geothermal exploration success: Using data and best practices from the oil and gas industry (Jan-Diederik van Wees); (e) Implementing geothermal power projects - risk management and financing from the investor's point of view (Christian Jokiel); (f) Risks and risk mitigation in the Upper Rhine Graben geothermal province (Christian Hecht); (g) The Soultz geothermal plant: from the concept to the first geothermal kWh (Albert Genter); (h) Binary power plant technologies for geothermal power generation (Kathrin Rohloff); (i) Kalina power plants - 10 years of operational experience (Gestur R. Bardarson); (j) 1,200 MW experience with innovative geothermal power plants (Hilel Legmann); (k) Challenges of managing geothermal power plant projects (Norbert Hartlieb); (l) Requirements for geothermal power plants (Athanasios Tsoubaklis); (m) Credit programme on productivity risk in deep geothermal projects (Karin Freier, Peter Hasenbein, Stephan Jacob); (n) Geothermal projects in the light of the financial crisis (Thomas G. Engelmann); (o) Insurability of geothermal projects (Matthias Kliesch); (p) Requirements for equity investors to finance a geothermal project (Thomas G. Engelmann); (q) Aspects of project development from an investor's perspective (Bernhard Gubo); (r) Project requirements and challenges in geothermal projects (Olaf Heil); (s) The 'quest' for appropriate locations for HFR projects in Southern Germany (Wolfgang Bauer); (t) Status of the Soultz geothermal power plant and the deep reservoir after some months of circulation (Albert Genter); (u) Hot

  11. Advanced Industrial Materials (AIM) program. Compilation of project summaries and significant accomplishments FY 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    In many ways, the Advanced Industrial Materials (AIM) Program underwent a major transformation in Fiscal Year 1995, and these changes have continued to the present. When the Program was established in 1990 as the Advanced Industrial Concepts (AIC) Materials Program, the mission was to conduct applied research and development to bring materials and processing technologies from the knowledge derived from basic research to the maturity required by the end-use sectors for commercialization. In 1995, the Office of Industrial Technologies (OIT) made radical changes in structure and procedures. All technology development was directed toward the seven "Vision Industries" that use about 80% of industrial energy and generate about 90% of industrial wastes. These are: (1) Aluminum; (2) Chemical; (3) Forest Products; (4) Glass; (5) Metal Casting; (6) Refineries; and (7) Steel. This report is a compilation of project summaries and significant accomplishments on materials.

  12. Technical support for the Ohio Clean Coal Technology Program. Volume 2, Baseline of knowledge concerning process modification opportunities, research needs, by-product market potential, and regulatory requirements: Final report

    Energy Technology Data Exchange (ETDEWEB)

    Olfenbuttel, R.; Clark, S.; Helper, E.; Hinchee, R.; Kuntz, C.; Means, J.; Oxley, J.; Paisley, M.; Rogers, C.; Sheppard, W.; Smolak, L. [Battelle, Columbus, OH (United States)

    1989-08-28

    This report was prepared for the Ohio Coal Development Office (OCDO) under Grant Agreement No. CDO/R-88-LR1 and comprises two volumes. Volume 1 presents data on the chemical, physical, and leaching characteristics of by-products from a wide variety of clean coal combustion processes. Volume 2 consists of a discussion of (a) process modification waste minimization opportunities and stabilization considerations; (b) research and development needs and issues relating to clean coal combustion technologies and by-products; (c) the market potential for reusing or recycling by-product materials; and (d) regulatory considerations relating to by-product disposal or reuse.

  13. Technology

    Directory of Open Access Journals (Sweden)

    Xu Jing

    2016-01-01

    Full Text Available The traditional method of reading answer cards uses OMR (Optical Mark Reader) equipment, which typically requires special-purpose cards, is not very versatile, and is costly. To address these problems, an answer-card identification method based on pattern recognition is proposed. A Line Segment Detector is used to detect image tilt, tilted images are corrected by rotation, and the answer regions of the answer sheet are then located and detected. Automatic reading based on pattern recognition achieves high accuracy and faster detection.

  14. 22 CFR 519.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    22 CFR 519.600, Semi-annual compilation. Title 22, Foreign Relations, Volume 2 (2010-04-01 edition); Broadcasting Board of Governors; New Restrictions on Lobbying; Agency Reports. Excerpt: "... Relations of the Senate and the Committee on Foreign Affairs of the House of Representatives or the ..."

  15. 15 CFR 28.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    15 CFR 28.600, Semi-annual compilation. Title 15, Commerce and Foreign Trade, Volume 1 (2010-01-01 edition); Office of the Secretary of Commerce; New Restrictions on Lobbying. Excerpt: "... the Committee on Foreign Relations of the Senate and the Committee on Foreign Affairs of the House of ..."

  16. 22 CFR 138.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    22 CFR 138.600, Semi-annual compilation. Title 22, Foreign Relations, Volume 1 (2010-04-01 edition); Department of State; Miscellaneous; New Restrictions on Lobbying; Agency Reports. Excerpt: "... Relations of the Senate and the Committee on Foreign Affairs of the House of Representatives or the ..."

  17. 22 CFR 712.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    22 CFR 712.600, Semi-annual compilation. Title 22, Foreign Relations, Volume 2 (2010-04-01 edition); Overseas Private Investment Corporation; Administrative Provisions; New Restrictions on Lobbying. Excerpt: "... the Committee on Foreign Relations of the Senate and the Committee on Foreign Affairs of the House of ..."

  18. 22 CFR 311.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    22 CFR 311.600, Semi-annual compilation. Title 22, Foreign Relations, Volume 2 (2010-04-01 edition); Peace Corps; New Restrictions on Lobbying; Agency Reports; § 311.600. Excerpt: "... Senate and the Committee on Foreign Affairs of the House of Representatives or the Committees on Armed ..."

  19. 22 CFR 227.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    22 CFR 227.600, Semi-annual compilation. Title 22, Foreign Relations, Volume 1 (2010-04-01 edition); Agency for International Development; New Restrictions on Lobbying; Agency Reports. Excerpt: "... Relations of the Senate and the Committee on Foreign Affairs of the House of Representatives or the ..."

  20. 13 CFR 146.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    13 CFR 146.600, Semi-annual compilation. Title 13, Business Credit and Assistance, Volume 1 (2010-01-01 edition); Small Business Administration; New Restrictions on Lobbying. Excerpt: "... (c) Information that involves intelligence matters shall be reported only to the Select Committee on ..."

  1. Effective Compiler Error Message Enhancement for Novice Programming Students

    Science.gov (United States)

    Becker, Brett A.; Glanville, Graham; Iwashima, Ricardo; McDonnell, Claire; Goslin, Kyle; Mooney, Catherine

    2016-01-01

    Programming is an essential skill that many computing students are expected to master. However, programming can be difficult to learn. Successfully interpreting compiler error messages (CEMs) is crucial for correcting errors and progressing toward success in programming. Yet these messages are often difficult to understand and pose a barrier to…
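
    A minimal sketch of the general idea behind compiler error message (CEM) enhancement, assuming a simple pattern-to-rewrite table; the patterns, the rewritten messages, and the enhance helper are hypothetical and are not the rule set used in the study above.

        import re

        # Hypothetical pattern -> plain-language rewrite table (illustrative only).
        ENHANCEMENTS = [
            (re.compile(r"';' expected"),
             "It looks like a statement is missing a semicolon at the end of this line."),
            (re.compile(r"cannot find symbol\s*symbol:\s*variable (\w+)"),
             "The name '{0}' is used here but was never declared (check spelling and scope)."),
        ]

        def enhance(raw_message: str) -> str:
            """Return a novice-friendly version of a raw compiler error message."""
            for pattern, template in ENHANCEMENTS:
                m = pattern.search(raw_message)
                if m:
                    return template.format(*m.groups())
            return raw_message  # fall back to the original text

        if __name__ == "__main__":
            print(enhance("Main.java:7: error: cannot find symbol  symbol: variable totl"))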

  2. A Journey from Interpreters to Compilers and Virtual Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We review a simple sequence of steps to stage a programming-language interpreter into a compiler and virtual machine. We illustrate the applicability of this derivation with a number of existing virtual machines, mostly for functional languages. We then outline its relevance for today's language...
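
    As a toy illustration of the interpreter/compiler/virtual-machine correspondence such a derivation produces (not Danvy's derivation itself), the sketch below evaluates a small expression language directly, compiles it to stack-machine instructions, and runs those instructions on a tiny virtual machine; both routes agree on the result.

        # Arithmetic expressions as nested tuples: ("lit", n) or ("add", e1, e2).

        def interpret(expr):
            """Direct interpreter: evaluate the expression tree."""
            tag = expr[0]
            if tag == "lit":
                return expr[1]
            if tag == "add":
                return interpret(expr[1]) + interpret(expr[2])
            raise ValueError(tag)

        def compile_expr(expr, code=None):
            """Compiler: flatten the tree into instructions for a stack machine."""
            if code is None:
                code = []
            tag = expr[0]
            if tag == "lit":
                code.append(("PUSH", expr[1]))
            elif tag == "add":
                compile_expr(expr[1], code)
                compile_expr(expr[2], code)
                code.append(("ADD",))
            return code

        def run(code):
            """Virtual machine: execute the compiled instruction list."""
            stack = []
            for instr in code:
                if instr[0] == "PUSH":
                    stack.append(instr[1])
                elif instr[0] == "ADD":
                    b, a = stack.pop(), stack.pop()
                    stack.append(a + b)
            return stack.pop()

        if __name__ == "__main__":
            e = ("add", ("lit", 1), ("add", ("lit", 2), ("lit", 3)))
            assert interpret(e) == run(compile_expr(e)) == 6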

  3. Compiler Optimization Pass Visualization: The Procedural Abstraction Case

    Science.gov (United States)

    Schaeckeler, Stefan; Shang, Weijia; Davis, Ruth

    2009-01-01

    There is an active research community concentrating on visualizations of algorithms taught in CS1 and CS2 courses. These visualizations can help students to create concrete visual images of the algorithms and their underlying concepts. Not only "fundamental algorithms" can be visualized, but also algorithms used in compilers. Visualizations that…
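
    Procedural abstraction, the compiler pass targeted by the visualization, replaces repeated instruction sequences with calls to a single outlined procedure to reduce code size. A minimal sketch of that idea follows, with an invented instruction list and a fixed window size; it is illustrative only and is not the authors' visualization tool.

        from collections import Counter

        def abstract_procedures(code, window=3):
            """Toy procedural abstraction: outline the most frequent repeated
            instruction window into a procedure and replace each occurrence
            with a CALL (window size fixed for simplicity)."""
            if len(code) < 2 * window:
                return code, None
            windows = [tuple(code[i:i + window]) for i in range(len(code) - window + 1)]
            seq, count = Counter(windows).most_common(1)[0]
            if count < 2:
                return code, None
            out, i = [], 0
            while i < len(code):
                if tuple(code[i:i + window]) == seq:
                    out.append("CALL P0")
                    i += window
                else:
                    out.append(code[i])
                    i += 1
            return out, ("PROC P0", list(seq), "RET")

        if __name__ == "__main__":
            body = ["load a", "add b", "store c", "load x",
                    "load a", "add b", "store c", "halt"]
            print(abstract_procedures(body))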

  4. Compilation of a global inventory of emissions of nitrous oxide.

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

    A global inventory with 1°x1° resolution was compiled of emissions of nitrous oxide (N 2 O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing, oceans, fossil fuel and bi

  5. Updated site compilation of the Latin American Pollen Database

    NARCIS (Netherlands)

    S.G.A. Flantua; H. Hooghiemstra; E.C. Grimm; H. Behling; M.B Bush; C. González-Arrango; W.D. Gosling; M.-P. Ledru; S. Lozano-Garciá; A. Maldonado; A.R. Prieto; V. Rull; J.H. van Boxel

    2015-01-01

    The updated inventory of the Latin American Pollen Database (LAPD) offers a wide range of new insights. This paper presents a systematic compilation of palynological research in Latin America. A comprehensive inventory of publications in peer-reviewed and grey literature shows a major expansion of s

  6. 5 CFR 9701.524 - Compilation and publication of data.

    Science.gov (United States)

    2010-01-01

    ... MANAGEMENT SYSTEM (DEPARTMENT OF HOMELAND SECURITY-OFFICE OF PERSONNEL MANAGEMENT) DEPARTMENT OF HOMELAND SECURITY HUMAN RESOURCES MANAGEMENT SYSTEM Labor-Management Relations § 9701.524 Compilation and... agreements and arbitration decisions and publish the texts of its impasse resolution decisions and...

  7. Calculating Certified Compilers for Non-deterministic Languages

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2015-01-01

    Reasoning about programming languages with non-deterministic semantics entails many difficulties. For instance, to prove correctness of a compiler for such a language, one typically has to split the correctness property into a soundness and a completeness part, and then prove these two parts...

  8. Compilation of a global inventory of emissions of nitrous oxide

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

    A global inventory with 1°x1° resolution was compiled of emissions of nitrous oxide (N 2 O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing,

  9. Compilation of historical information of 300 Area facilities and activities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information on 300 Area activities and facilities since their beginning. The 300 Area is shown as it looked in 1945, and a more recent (1985) view of the 300 Area is also provided.

  10. Experience with PASCAL compilers on mini-computers

    CERN Document Server

    Bates, D

    1977-01-01

    This paper relates the history of an implementation of the language PASCAL on a minicomputer. The unnecessary difficulties encountered on the way led the authors to reflect on the distribution of 'portable' compilers in general and to suggest some guidelines for the future. (4 refs).

  11. Final report: Compiled MPI. Cost-Effective Exascale Application Development

    Energy Technology Data Exchange (ETDEWEB)

    Gropp, William Douglas [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-12-21

    This is the final report on Compiled MPI: Cost-Effective Exascale Application Development, and it summarizes the results of the project. The project investigated runtime environments that improve the performance of MPI (Message-Passing Interface) programs; work at Illinois in the last period of the project focused on optimizing data accesses expressed with MPI datatypes.
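
    An MPI derived datatype such as a vector type describes a strided access pattern (count blocks of blocklength elements, stride elements apart) that the runtime can pack and communicate. The pure-Python sketch below only illustrates the access pattern such a datatype encodes; it is not the project's runtime and deliberately uses no MPI library.

        def pack_vector(buffer, count, blocklength, stride, offset=0):
            """Gather the elements described by an MPI-vector-like layout:
            count blocks of blocklength elements, spaced stride elements apart."""
            packed = []
            for b in range(count):
                start = offset + b * stride
                packed.extend(buffer[start:start + blocklength])
            return packed

        if __name__ == "__main__":
            # A 4x5 matrix stored row-major; pick out column 2 (stride = row length).
            flat = list(range(20))
            print(pack_vector(flat, count=4, blocklength=1, stride=5, offset=2))
            # -> [2, 7, 12, 17]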

  12. Indexed compilation of experimental high energy physics literature. [Synopsis

    Energy Technology Data Exchange (ETDEWEB)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is provided, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  13. Application of spatial technologies in wildlife biology.

    Science.gov (United States)

    Thomas A. O' Neil; Pete Bettinger; Bruce G. Marcot; B. Wayne Luscombe; Gregory T. Koeln; Howard J. Bruner; Charley Barrett; Jennifer A. Pollock; Susan. Bernatas

    2005-01-01

    The Information Age is here, and technology has a large and important role in gathering, compiling, and synthesizing data. The old adage of analyzing wildlife data over "time and space" today entails using technologies to help gather, compile, and synthesize remotely sensed information, and to integrate results into research, monitoring and evaluation. Thus,...

  14. Bitwise identical compiling setup: prospective for reproducibility and reliability of earth system modeling

    Directory of Open Access Journals (Sweden)

    R. Li

    2015-11-01

    Full Text Available Reproducibility and reliability are fundamental principles of scientific research. A compiling setup that includes a specific compiler version and compiler flags provides essential technical support for Earth system modeling. With the fast development of computer software and hardware, a compiling setup has to be updated frequently, which challenges the reproducibility and reliability of Earth system modeling. The existing results of a simulation using an original compiling setup may be irreproducible by a newer compiling setup because trivial round-off errors introduced by the change of compiling setup can potentially trigger significant changes in simulation results. Regarding the reliability, a compiler with millions of lines of code may have bugs that are easily overlooked due to the uncertainties or unknowns in Earth system modeling. To address these challenges, this study shows that different compiling setups can achieve exactly the same (bitwise identical) results in Earth system modeling, and a set of bitwise identical compiling setups of a model can be used across different compiler versions and different compiler flags. As a result, the original results can be more easily reproduced; for example, the original results with an older compiler version can be reproduced exactly with a newer compiler version. Moreover, this study shows that new test cases can be generated based on the differences of bitwise identical compiling setups between different models, which can help detect software bugs or risks in the codes of models and compilers and finally improve the reliability of Earth system modeling.
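
    A minimal sketch of how bitwise-identical results across compiling setups might be checked, assuming each setup writes its output files into its own directory: matching files are hashed and compared byte for byte. The directory names, the *.nc file pattern, and the helper functions are assumptions for illustration, not part of the study above.

        import hashlib
        from pathlib import Path

        def digest(path: Path) -> str:
            """SHA-256 digest of one output file, read in binary mode."""
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            return h.hexdigest()

        def bitwise_identical(run_a: Path, run_b: Path, pattern: str = "*.nc") -> bool:
            """Compare two model runs, built with different compiling setups, file by file."""
            files_a = sorted(p.name for p in run_a.glob(pattern))
            files_b = sorted(p.name for p in run_b.glob(pattern))
            if not files_a or files_a != files_b:
                return False
            return all(digest(run_a / name) == digest(run_b / name) for name in files_a)

        if __name__ == "__main__":
            # Hypothetical output directories from two different compiling setups.
            print(bitwise_identical(Path("run_setup_a"), Path("run_setup_b")))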

  15. Volume 9: A Review of Socioeconomic Impacts of Oil Shale Development WESTERN OIL SHALE DEVELOPMENT: A TECHNOLOGY ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Rotariu, G. J.

    1982-02-01

    recognize that the rate of development, the magnitude of development, and the technology mix that will actually take place remain uncertain. Although we emphasize that other energy and mineral resources besides oil shale may be developed, the conclusions reached in this study reflect only those impacts that would be felt from the oil shale scenario. Socioeconomic impacts in the region reflect the uneven growth rate implied by the scenario and will be affected by the timing of industry developments, the length and magnitude of the construction phase of development, and the shift in employment profiles predicted in the scenario. The facilities in the southern portion of the oil shale region, those along the Colorado River and Parachute Creek, show a peak in the construction work force in the mid-1980s, whereas those facilities in the Piceance Creek Basin to the north show a construction peak in the late 1980s. Together, the facilities will require a large construction work force throughout the decade, with a total of 4800 construction workers required in 1985. Construction at the northern sites and second phase construction in the south will require 6000 workers in 1988. By 1990, the operation work force will increase to 7950. Two important characteristics of oil shale development emerge from the work force estimates: (1) peak-year construction work forces will be 90-120% of the size of the permanent operating work force; and (2) the yearly changes in total work force requirements will be large, as much as 900 in one year at one facility. To estimate population impacts on individual communities, we devised a population distribution method that is described in Sec. IV. Variables associated with the projection of population impacts are discussed and methodologies of previous assessments are compared. Scenario-induced population impacts estimated by the Los Alamos method are compared to projections of a model employed by the Colorado West Area Council of Governments. Oil shale

  16. Measures of improving engineering budget compiling accuracy%提高工程预算编制准确性的措施

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    The thesis analyzes the necessity of improving the accuracy of building engineering budget compilation, studies the factors that influence budget compilation accuracy, and puts forward measures for improving it, such as raising the level of compilation, accurately calculating the bill of quantities (BOQ), being familiar with market conditions, and accurately determining the construction technologies to be used.

  17. 一种简单高级语言编译器的设计%Design of a Simple and High-level Language Compiler

    Institute of Scientific and Technical Information of China (English)

    石晓敬

    2014-01-01

    On the basis of compiler theory and virtual machine technology, a simple compiler is designed in a high-level language. The source program is compiled into the target program through lexical analysis, syntax analysis, intermediate code generation, and a virtual machine, realizing a simple design for a complex compiler.
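
    As a small illustration of the first stage of such a pipeline, the sketch below performs lexical analysis for a toy language, turning source text into tokens; the token set and the tokenize helper are invented for illustration and are not the design described in the paper.

        import re

        # Token specification for a toy high-level language (illustrative only).
        TOKEN_SPEC = [
            ("NUMBER", r"\d+"),
            ("IDENT",  r"[A-Za-z_]\w*"),
            ("OP",     r"[+\-*/=()]"),
            ("SKIP",   r"\s+"),
        ]
        TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

        def tokenize(source: str):
            """Lexical analysis: turn source text into (kind, text) tokens."""
            for m in TOKEN_RE.finditer(source):
                if m.lastgroup != "SKIP":
                    yield m.lastgroup, m.group()

        if __name__ == "__main__":
            print(list(tokenize("x = 3 + 41 * y")))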

  18. Compilation of VS30 Data for the United States

    Science.gov (United States)

    Yong, Alan; Thompson, Eric M.; Wald, David J.; Knudsen, Keith L.; Odum, Jack K.; Stephenson, William J.; Haefner, Scott

    2016-01-01

    VS30, the time-averaged shear-wave velocity (VS) to a depth of 30 meters, is a key index adopted by the earthquake engineering community to account for seismic site conditions. VS30 is typically based on geophysical measurements of VS derived from invasive and noninvasive techniques at sites of interest. Owing to cost considerations, as well as logistical and environmental concerns, VS30 data are sparse or not readily available for most areas. Where data are available, VS30 values are often assembled in assorted formats that are accessible from disparate and (or) impermanent Web sites. To help remedy this situation, we compiled VS30 measurements obtained by studies funded by the U.S. Geological Survey (USGS) and other governmental agencies. Thus far, we have compiled VS30 values for 2,997 sites in the United States, along with metadata for each measurement from government-sponsored reports, Web sites, and scientific and engineering journals. Most of the data in our VS30 compilation originated from publications directly reporting the work of field investigators. A small subset (less than 20 percent) of VS30 values was previously compiled by the USGS and other research institutions. Whenever possible, VS30 originating from these earlier compilations were crosschecked against published reports. Both downhole and surface-based VS30 estimates are represented in our VS30 compilation. Most of the VS30 data are for sites in the western contiguous United States (2,141 sites), whereas 786 VS30 values are for sites in the Central and Eastern United States; 70 values are for sites in other parts of the United States, including Alaska (15 sites), Hawaii (30 sites), and Puerto Rico (25 sites). An interactive map is hosted on the primary USGS Web site for accessing VS30 data (http://earthquake.usgs.gov/research/vs30/).
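
    Given a layered shear-wave velocity profile, VS30 is the time-averaged velocity over the top 30 meters, that is, 30 m divided by the vertical travel time through those 30 meters. A short sketch of that calculation, with a hypothetical three-layer profile, is shown below.

        def vs30(layers):
            """Time-averaged shear-wave velocity to 30 m depth.

            `layers` is a list of (thickness_m, vs_m_per_s) pairs from the surface
            down; only the top 30 m contributes: VS30 = 30 / sum(d_i / v_i).
            """
            remaining, travel_time = 30.0, 0.0
            for thickness, vs in layers:
                d = min(thickness, remaining)
                travel_time += d / vs
                remaining -= d
                if remaining <= 0:
                    break
            if remaining > 0:
                raise ValueError("velocity profile shallower than 30 m")
            return 30.0 / travel_time

        if __name__ == "__main__":
            # Hypothetical profile: 5 m at 180 m/s, 10 m at 300 m/s, then 760 m/s.
            print(round(vs30([(5, 180), (10, 300), (40, 760)]), 1))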

  19. Regulatory and Technical Reports (Abstract Index Journal). Annual compilation for 1995, Volume 20, No. 4

    Energy Technology Data Exchange (ETDEWEB)

    Sheehan, M.

    1995-04-01

    The Nuclear Regulatory Commission's annual summary of licensed nuclear power reactor data is based primarily on the report of operating data submitted by licensees for each unit for the month of December, because that report contains data for the month of December, the year to date (in this case calendar year 1994), and cumulative data, usually from the date of commercial operation. The data are not independently verified, but various computer checks are made. The report is divided into two sections. The first contains summary highlights and the second contains data on each individual unit in commercial operation. Section 1 capacity and availability factors are simple arithmetic averages. Section 2 items in the cumulative column are generally as reported by the licensee, and notes on the use of weighted averages and on starting dates other than commercial operation are provided.

  20. Bearing tester data compilation, analysis and reporting and bearing math modeling, volume 1

    Science.gov (United States)

    Marshall, D. D.; Montgomery, E. E.; New, L. S.; Stone, M. A.; Tiller, B. K.

    1984-01-01

    Thermal and mechanical models of high speed angular contact ball bearings operating in LOX and LN2 were developed and verified with limited test data in an effort to further understand the parameters that determine or affect the SSME turbopump bearing operational characteristics and service life. The SHABERTH bearing analysis program, which was adapted to evaluate shaft bearing systems in cryogenics, is not capable of accommodating varying thermal properties and two-phase flow. A bearing model with this capability was developed using the SINDA thermal analyzer. Iteration between the SHABERTH and the SINDA models enabled the establishment of preliminary bounds for stable operation in LN2. These limits were established in terms of fluid flow, fluid inlet temperature, and axial load for a shaft speed of 30,000 RPM.

  1. Ocean Engineering Studies Compiled 1991. Volume 7. Acrylic Windows- Diverse Design Features and Types of Service

    Science.gov (United States)

    1991-01-01

    ... titanium hatches (Fig. 4) were inserted into the polar penetrations and sealed against leakage with room-temperature-vulcanizing silicone rubber squirted into ... higher than in the thick shell (0.31 versus 0.25 percent) (Table 5). Discussion, Spherical Shell: the test results generated by the pressure cycling ...

  2. Ocean Engineering Studies Compiled 1991. Volume 11. Pressure-Resistant Glass Light Enclosures

    Science.gov (United States)

    1991-01-01

    Distribution statement: approved for public release; distribution unlimited. References cited include: "... for NUC Undersea Elevator," NUC Technical Report TP 315, Naval Undersea Center, San Diego, California, September 1972; J.R. Maison and J.D. Stachiw ...

  3. Ocean Engineering Studies Compiled 1991. Volume 6. Acrylic Windows - Typical Applications in Pressure Housings

    Science.gov (United States)

    1991-01-01

    Distribution of this document is unlimited. References cited include: ... Nostrand Inc., Princeton, New Jersey, March 1956; Southwest Research Institute, Final Report SWRI 034090-001, "Finite Element Analysis of an ..."; Naval Undersea ... No. 4, 1971; Maison, J. R., and Stachiw, J. D., "Acrylic Pressure Hull for Johnson-Sea-Link Submersible," ASME Paper No. 71-WA/Unt-6; Wilson, E. ...

  4. Low level waste management: a compilation of models and monitoring techniques. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Mosier, J.E.; Fowler, J.R.; Barton, C.J. (comps.)

    1980-04-01

    In support of the National Low-Level Waste (LLW) Management Research and Development Program being carried out at Oak Ridge National Laboratory, Science Applications, Inc., conducted a survey of models and monitoring techniques associated with the transport of radionuclides and other chemical species from LLW burial sites. As a result of this survey, approximately 350 models were identified. For each model the purpose and a brief description are presented. To the extent possible, a point of contact and reference material are identified. The models are organized into six technical categories: atmospheric transport, dosimetry, food chain, groundwater transport, soil transport, and surface water transport. About 4% of the models identified covered other aspects of LLW management and are placed in a miscellaneous category. A preliminary assessment of all these models was performed to determine their ability to analyze the transport of other chemical species. The models that appeared to be applicable are identified. A brief survey of the state-of-the-art techniques employed to monitor LLW burial sites is also presented, along with a very brief discussion of up-to-date burial techniques.

  5. Compilation of Energy Efficient Concepts in Advanced Aircraft Design and Operations. Volume 2. Abstract Data Base

    Science.gov (United States)

    1980-11-05

    B.2 Propulsion Technology; B.2.1 Gas Turbines; B.2.1.1 Navy funded (NADC-79239-60; SSIE: CQN 975316, 3/79 to continuing, Colorado State ...) ... which are also better for fuel and operating cost economy) push the desired bypass ratio up further. Effects on fuel consumption of design field ... Design features include high-aspect-ratio wings, thickness ratio, and range. It is concluded that wing aspect ratios of future aircraft are ...

  6. A Compilation of Moored Current Meter and Wind Recorder Observations. Volume 26, (1972 Measurements)

    Science.gov (United States)

    1981-05-01

    Model 850. Both instruments use a Savonius rotor to measure water speed and a vane and internal compass to measure direction. In the VACM, East and North components are calculated from the ...

  7. Volume 9: A Review of Socioeconomic Impacts of Oil Shale Development WESTERN OIL SHALE DEVELOPMENT: A TECHNOLOGY ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Rotariu, G. J.

    1982-02-01

    recognize that the rate of development, the magnitude of development, and the technology mix that will actually take place remain uncertain. Although we emphasize that other energy and mineral resources besides oil shale may be developed, the conclusions reached in this study reflect only those impacts that would be felt from the oil shale scenario. Socioeconomic impacts in the region reflect the uneven growth rate implied by the scenario and will be affected by the timing of industry developments, the length and magnitude of the construction phase of development, and the shift in employment profiles predicted in the scenario. The facilities in the southern portion of the oil shale region, those along the Colorado River and Parachute Creek, show a peak in the construction work force in the mid-1980s, whereas those facilities in the Piceance Creek Basin to the north show a construction peak in the late 1980s. Together, the facilities will require a large construction work force throughout the decade, with a total of 4800 construction workers required in 1985. Construction at the northern sites and second phase construction in the south will require 6000 workers in 1988. By 1990, the operation work force will increase to 7950. Two important characteristics of oil shale development emerge from the work force estimates: (1) peak-year construction work forces will be 90-120% of the size of the permanent operating work force; and (2) the yearly changes in total work force requirements will be large, as much as 900 in one year at one facility. To estimate population impacts on individual communities, we devised a population distribution method that is described in Sec. IV. Variables associated with the projection of population impacts are discussed and methodologies of previous assessments are compared. Scenario-induced population impacts estimated by the Los Alamos method are compared to projections of a model employed by the Colorado West Area Council of Governments. Oil shale

  8. Compiler analysis for irregular problems in FORTRAN D

    Science.gov (United States)

    Vonhanxleden, Reinhard; Kennedy, Ken; Koelbel, Charles; Das, Raja; Saltz, Joel

    1992-01-01

    We developed a dataflow framework which provides a basis for rigorously defining strategies to make use of runtime preprocessing methods for distributed memory multiprocessors. In many programs, several loops access the same off-processor memory locations. Our runtime support gives us a mechanism for tracking and reusing copies of off-processor data. A key aspect of our compiler analysis strategy is to determine when it is safe to reuse copies of off-processor data. Another crucial function of the compiler analysis is to identify situations which allow runtime preprocessing overheads to be amortized. This dataflow analysis will make it possible to effectively use the results of interprocedural analysis in our efforts to reduce interprocessor communication and the need for runtime preprocessing.
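
    Runtime preprocessing of this kind is often organized as an inspector/executor scheme: an inspector examines the irregular index list once, builds a communication schedule for the off-processor entries, and later loops reuse the gathered copies. The sketch below is a conceptual, single-process illustration of that idea, not the Fortran D compiler's implementation; the fetch callback stands in for actual interprocessor communication.

        def inspect(global_indices, owned_range):
            """Inspector: find which indices are off-processor and plan one gather."""
            lo, hi = owned_range
            return sorted({i for i in global_indices if not lo <= i < hi})

        def gather(off_proc, fetch):
            """Execute the schedule once; later loops reuse the returned copies."""
            return {i: fetch(i) for i in off_proc}

        def executor(global_indices, owned_range, local, ghost):
            """Executor: run the loop body using local data plus reused ghost copies."""
            lo, hi = owned_range
            return sum(local[i - lo] if lo <= i < hi else ghost[i]
                       for i in global_indices)

        if __name__ == "__main__":
            owned = (0, 4)                      # this process owns indices 0..3
            local = [1.0, 2.0, 3.0, 4.0]
            indices = [0, 5, 2, 7, 5]           # irregular accesses, some off-processor
            schedule = inspect(indices, owned)  # -> [5, 7]
            ghost = gather(schedule, fetch=lambda i: float(i) * 10)  # stand-in for comm
            # Several loops can now reuse `ghost` without re-communicating.
            print(executor(indices, owned, local, ghost))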

  9. Efficient Integration of Pipelined IP Blocks into Automatically Compiled Datapaths

    Directory of Open Access Journals (Sweden)

    Andreas Koch

    2006-12-01

    Full Text Available Compilers for reconfigurable computers aim to generate problem-specific optimized datapaths for kernels extracted from an input language. In many cases, however, judicious use of preexisting manually optimized IP blocks within these datapaths could improve the compute performance even further. The integration of IP blocks into the compiled datapaths poses a different set of problems than stitching together IPs to form a system-on-chip, though: instead of the loose coupling over standard busses employed by SoCs, the coupling between datapath and IP block must be much tighter. To this end, we propose a concise language that can be efficiently synthesized using a template-based approach for automatically generating lightweight data and control interfaces at the datapath level.
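
    As a toy analogue of template-based interface generation, the sketch below expands a concise port description into Verilog-flavoured interface text; the description format, the template, and the signal names are invented for illustration and are not the language proposed in the article.

        from string import Template

        # Hypothetical mini-description of an IP block's streaming port (illustrative).
        PORT_TEMPLATE = Template(
            "input  wire [${width_minus1}:0] ${name}_data,\n"
            "input  wire                    ${name}_valid,\n"
            "output wire                    ${name}_ready"
        )

        def emit_port(name: str, width: int) -> str:
            """Expand a concise port description into a lightweight datapath-side
            interface declaration (generated as plain text)."""
            return PORT_TEMPLATE.substitute(name=name, width_minus1=width - 1)

        if __name__ == "__main__":
            print(emit_port("pixel_in", 24))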

  10. Efficient Integration of Pipelined IP Blocks into Automatically Compiled Datapaths

    Directory of Open Access Journals (Sweden)

    Koch Andreas

    2007-01-01

    Full Text Available Compilers for reconfigurable computers aim to generate problem-specific optimized datapaths for kernels extracted from an input language. In many cases, however, judicious use of preexisting manually optimized IP blocks within these datapaths could improve the compute performance even further. The integration of IP blocks into the compiled datapaths poses a different set of problems than stitching together IPs to form a system-on-chip, though: instead of the loose coupling over standard busses employed by SoCs, the coupling between datapath and IP block must be much tighter. To this end, we propose a concise language that can be efficiently synthesized using a template-based approach for automatically generating lightweight data and control interfaces at the datapath level.

  11. Compilation of gallium resource data for bauxite deposits

    Science.gov (United States)

    Schulte, Ruth F.; Foley, Nora K.

    2014-01-01

    Gallium (Ga) concentrations for bauxite deposits worldwide have been compiled from the literature to provide a basis for research regarding the occurrence and distribution of Ga worldwide, as well as between types of bauxite deposits. In addition, this report is an attempt to bring together reported Ga concentration data into one database to supplement ongoing U.S. Geological Survey studies of critical mineral resources. The compilation of Ga data consists of location, deposit size, bauxite type and host rock, development status, major oxide data, trace element (Ga) data and analytical method(s) used to derive the data, and tonnage values for deposits within bauxite provinces and districts worldwide. The range in Ga concentrations for bauxite deposits worldwide is

  12. Twelve tips on how to compile a medical educator's portfolio.

    Science.gov (United States)

    Dalton, Claudia Lucy; Wilson, Anthony; Agius, Steven

    2017-09-17

    Medical education is an expanding area of specialist interest for medical professionals. Whilst most doctors will be familiar with the compilation of clinical portfolios for scrutiny of their clinical practice and provision of public accountability, teaching portfolios used specifically to gather and demonstrate medical education activity remain uncommon in many non-academic settings. For aspiring and early career medical educators in particular, their value should not be underestimated. Such a medical educator's portfolio (MEP) is a unique compendium of evidence that is invaluable for appraisal, revalidation, and promotion. It can stimulate and provide direction for professional development, and is a rich source for personal reflection and learning. We recommend that all new and aspiring medical educators prepare an MEP, and suggest twelve tips on how to skillfully compile one.

  13. Compilation of Non-Financial Balances in the Czech Republic

    Directory of Open Access Journals (Sweden)

    Vítězslav Ondruš

    2011-09-01

    Full Text Available The System of National Accounts in the Czech Republic consists of three main parts: institutional sector accounts, input-output tables, and balances of non-financial assets. All three parts are compiled interactively on a common time schedule. The article deals with balances of non-financial assets and their relation to the core institutional sector accounts, explains why this third parallel part of the SNA in the Czech Republic was built, and describes its weaknesses and future development.

  14. Compiler-Driven Performance Optimization and Tuning for Multicore Architectures

    Science.gov (United States)

    2015-04-10

    Workshop on Libraries and Automatic Tuning for Extreme Scale Systems, Lake Tahoe, CA, August 2011. J. Ramanujam, "The Tensor Contraction Engine ..."; ... Hartono, M. Baskaran, L.-N. Pouchet, J. Ramanujam, and P. Sadayappan, "Parametric Tiling of Affine Loop Nests," in 15th Workshop on Compilers for Parallel ...; "... Parametric Tiling for Autotuning," in Workshop on Parallel Matrix Algorithms and Applications (PMAA 2010), Basel, Switzerland, July 2010. J. Ramanujam ...

  15. Analysis on Establishing Urban Cemetery Planning and Compiling System

    Institute of Scientific and Technical Information of China (English)

    Kun; YANG; Xiaogang; CHEN

    2015-01-01

    Currently there are many problems in the construction of urban cemeteries, such as improper siting, low land utilization, backward greening facilities, and imperfect cemetery management, which have greatly affected people's normal production and life. This article discusses the establishment of a sustainable urban cemetery planning and compilation system at three levels (macro, medium, and micro) in order to improve the present cemetery system.

  16. An Adaptation of the ADA Language for Machine Generated Compilers.

    Science.gov (United States)

    1980-12-01

    Ada Augusta, Lady Lovelace, the daughter of the poet Lord Byron, and Charles Babbage's programmer. UNIX is a Trademark/Service Mark of the Bell ... An Adaptation of the Ada Language for Machine Generated Compilers, M. A. Rogers and L. P. Myers, thesis, Naval Postgraduate School, Monterey, California, December 1980.

  17. Recent Efforts in Data Compilations for Nuclear Astrophysics

    CERN Document Server

    Dillmann, I

    2008-01-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on "Nuclear Physics Data Compilation for Nucleosynthesis Modeling" held at the ECT* in Trento, Italy, from May 29 to June 3, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently, a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The "JINA Reaclib Database" on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections...
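
    Reaction-rate libraries of this kind commonly store each rate as a seven-parameter fit evaluated at temperature T9 (in GK). Assuming that standard seven-parameter REACLIB form, a small evaluation sketch follows; the coefficients shown are placeholders, not values from the database.

        import math

        def reaclib_rate(a, t9):
            """Evaluate one REACLIB rate set: a holds the seven fit coefficients and
            t9 is the temperature in GK (standard seven-parameter form; real
            coefficients would come from the database)."""
            return math.exp(a[0] + a[1] / t9 + a[2] * t9 ** (-1.0 / 3.0)
                            + a[3] * t9 ** (1.0 / 3.0) + a[4] * t9
                            + a[5] * t9 ** (5.0 / 3.0) + a[6] * math.log(t9))

        if __name__ == "__main__":
            placeholder = [1.0, -0.5, 0.0, 0.0, 0.0, 0.0, 1.5]  # not a real rate fit
            print(reaclib_rate(placeholder, t9=0.3))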

  18. Construction experiences from underground works at Forsmark. Compilation Report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders [Vattenfall Power Consultant AB, Stockholm (Sweden); Christiansson, Rolf [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)

    2007-02-15

    The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily concerning rock support solutions. The authors of this report separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and they acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  19. Cumulative Reactant Species Index for Volumes I-V of the Compilation of Data Relevant to Gas Lasers. Volume VI.

    Science.gov (United States)

    1979-09-01

    Index excerpt (reactant species entries with document numbers): cross section (relative), 2027; electron affinity, 214, 1161; extinction coefficient for continuous absorption, 702, 2032; electron affinity, 214, 1161; ... 864; on Ne, high-energy production of free electrons, 1622; on W, electron yields from surface, 865; on Xe, high-energy ionization, 1624; on He, high-energy ...

  20. Nuclear regulatory legislation, 104th Congress. Volume 2, No. 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-01

    This document is the second of two volumes compiling statutes and material pertaining to nuclear regulatory legislation through the 104th Congress, 2nd Session. It is intended for use as a U.S. Nuclear Regulatory Commission (NRC) internal resource document. Legislative information reproduced in this document includes portions of the Paperwork Reduction Act, various acts pertaining to low-level radioactive waste, the Clean Air Act, the Federal Water Pollution Control Act, the National Environmental Policy Act, the Hazardous Materials Transportation Act, the West Valley Demonstration Project Act, Nuclear Non-Proliferation and Export Licensing Statutes, and selected treaties, agreements, and executive orders. Other information provided pertains to Commissioner tenure, NRC appropriations, the Chief Financial Officers Act, information technology management reform, and Federal civil penalties.