WorldWideScience

Sample records for compilation brookhaven national

  1. Brookhaven National Laboratory

    Science.gov (United States)

    ... Medical Data Analysis for U.S. Veterans Making Glass Invisible: A Nanoscience-Based Disappearing Act Our Mission We ... our diverse portfolio of licensable technologies. | More Brookhaven National Lab PO Box 5000 Upton, ...

  2. Brookhaven highlights - Brookhaven National Laboratory 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-01

    This report highlights research conducted at Brookhaven National Laboratory in the following areas: alternating gradient synchrotron; physics; biology; national synchrotron light source; department of applied science; medical; chemistry; department of advanced technology; reactor; safety and environmental protection; instrumentation; and computing and communications.

  3. BROOKHAVEN NATIONAL LABORATORY WILDLIFE MANAGEMENT PLAN.

    Energy Technology Data Exchange (ETDEWEB)

    NAIDU,J.R.

    2002-10-22

    The purpose of the Wildlife Management Plan (WMP) is to promote stewardship of the natural resources found at the Brookhaven National Laboratory (BNL), and to integrate their protection with pursuit of the Laboratory's mission.

  4. Dr. Praveen Chaudhari named director of Brookhaven National Laboratory

    CERN Multimedia

    2003-01-01

    "Brookhaven Science Associates announced today the selection of Dr. Praveen Chaudhari as Director of the U.S. Department of Energy's Brookhaven National Laboratory. Dr. Chaudhari, who will begin his new duties on April 1, joins Brookhaven Lab after 36 years of distinguished service at IBM as a scientist and senior manager of research" (1 page).

  5. Brookhaven National Laboratory site environmental report for calendar year 1991

    Energy Technology Data Exchange (ETDEWEB)

    Naidu, J.R.; Royce, B.A.; Miltenberger, R.P.

    1992-09-01

    This publication presents the results of BNL's environmental monitoring and compliance effort and provides an assessment of the impact of Brookhaven National Laboratory (BNL) operations on the environment. This document is the responsibility of the Environmental Protection Section of the Safety and Environmental Protection Division. Within this Section, the Environmental Monitoring Group (EMG) sampled the environment, interpreted the results, performed the impact analysis of the emissions from BNL, and compiled the information presented here. In this effort, the Section's other groups (Compliance, Analytical, Ground Water, and Quality) played a key role in addressing the regulatory aspects and in the analysis and documentation of the data.

  7. Brookhaven National Laboratory site environmental report for calendar year 1994

    Energy Technology Data Exchange (ETDEWEB)

    Naidu, J.R.; Royce, B.A. [eds.]

    1995-05-01

    This report documents the results of the Environmental Monitoring Program at Brookhaven National Laboratory and presents summary information about environmental compliance for 1994. To evaluate the effect of Brookhaven National Laboratory's operations on the local environment, measurements of direct radiation, and a variety of radionuclides and chemical compounds in ambient air, soil, sewage effluent, surface water, groundwater, fauna and vegetation were made at the Brookhaven National Laboratory site and at sites adjacent to the Laboratory.

  8. Brookhaven National Laboratory site environmental report for calendar year 1996

    Energy Technology Data Exchange (ETDEWEB)

    Schroeder, G.L.; Paquette, D.E.; Naidu, J.R.; Lee, R.J.; Briggs, S.L.K.

    1998-01-01

    This report documents the results of the Environmental Monitoring Program at Brookhaven National Laboratory and summarizes information about environmental compliance for 1996. To evaluate the effect of Brookhaven National Laboratory's operations on the local environment, measurements of direct radiation, and of a variety of radionuclides and chemical compounds in the ambient air, soil, sewage effluent, surface water, groundwater, fauna, and vegetation were made at the Brookhaven National Laboratory site and at adjacent sites. The report also evaluates the Laboratory's compliance with all applicable guides, standards, and limits for radiological and non-radiological emissions and effluents to the environment.

  9. Geothermal materials development at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Kukacka, L.E. [Brookhaven National Lab., Upton, NY (United States)]

    1997-12-31

    As part of the DOE/OGT response to recommendations and priorities established by industrial review of their overall R&D program, the Geothermal Materials Program at Brookhaven National Laboratory (BNL) is focusing on topics that can reduce O&M costs and increase competitiveness in foreign and domestic markets. Corrosion and scale control, well completion materials, and lost circulation control have high priorities. The first two topics are included in FY 1997 BNL activities, but work on lost circulation materials is constrained by budgetary limitations. The R&D, most of which is performed as cost-shared efforts with U.S. geothermal firms, is rapidly moving into field testing phases. FY 1996 and 1997 accomplishments in the development of lightweight CO2-resistant cements for well completions; corrosion resistant, thermally conductive polymer matrix composites for heat exchange applications; and metallic, polymer and ceramic-based corrosion protective coatings are given in this paper. In addition, plans for work that commenced in March 1997 on thermally conductive cementitious grouting materials for use with geothermal heat pumps (GHP) are discussed.

  10. COMPUTATIONAL SCIENCE AT BROOKHAVEN NATIONAL LABORATORY: THREE SELECTED TOPICS.

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.W.; DENG, Y.; GLIMM, J.; SAMULYAK, R.

    2003-09-15

    We present an overview of computational science at Brookhaven National Laboratory (BNL), with selections from three areas: fluids, nanoscience, and biology. The work at BNL in each of these areas is itself very broad, and we select a few topics for presentation within each of them.

  11. Brookhaven National Laboratory site environmental report for calendar year 1995

    Energy Technology Data Exchange (ETDEWEB)

    Naidu, J.R.; Paquette, D.E.; Schroeder, G.L. [eds.] [and others]

    1996-12-01

    This report documents the results of the Environmental Monitoring Program at Brookhaven National Laboratory and summarizes information about environmental compliance for 1995. To evaluate the effect of Brookhaven National Laboratory's operations on the local environment, measurements of direct radiation, and of a variety of radionuclides and chemical compounds in the ambient air, soil, sewage effluent, surface water, groundwater, fauna, and vegetation were made at the Brookhaven National Laboratory site and at adjacent sites. The report also evaluates the Laboratory's compliance with all applicable guides, standards, and limits for radiological and nonradiological emissions and effluents to the environment. Areas of known contamination are subject to Remedial Investigation/Feasibility Studies under the Inter Agency Agreement established by the Department of Energy, the Environmental Protection Agency, and the New York Department of Environmental Conservation. Except for identified areas of soil and groundwater contamination, the environmental monitoring data have continued to demonstrate that compliance was achieved with the applicable environmental laws and regulations governing emission and discharge of materials to the environment. Also, the data show that the environmental impacts at Brookhaven National Laboratory are minimal and pose no threat to the public or to the environment. This report meets the requirements of Department of Energy Orders 5484.1, Environmental Protection, Safety, and Health Protection Information reporting requirements, and 5400.1, General Environmental Protection Programs.

  12. Brookhaven National Laboratory moves to the fast lane

    CERN Multimedia

    2006-01-01

    "The U.S. Department of Energy's energy sciences network (ESnet) continues to roll out its next-generation architecture on schedule with the March 14 completion of the Long Island Metropolitan Area Network, connecting Brookhaven National Laboratory (BNL) to the ESnet point of presente (PO) 60 miles away in New York City." (1 page)

  13. BROOKHAVEN NATIONAL LABORATORY INSTITUTIONAL PLAN FY2003-2007.

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-06-10

    This document presents the vision for Brookhaven National Laboratory (BNL) for the next five years, and a roadmap for implementing that vision. Brookhaven is a multidisciplinary science-based laboratory operated for the U.S. Department of Energy (DOE), supported primarily by programs sponsored by the DOE's Office of Science. The DOE is the third-largest funding agency for science in the U.S., and one of its goals is ''to advance basic research and the instruments of science that are the foundations for DOE's applied missions, a base for U.S. technology innovation, and a source of remarkable insights into our physical and biological world, and the nature of matter and energy'' (DOE Office of Science Strategic Plan, 2000, http://www.osti.gov/portfolio/science.htm). BNL shapes its vision according to this plan.

  14. BROOKHAVEN NATIONAL LABORATORY SITE ENVIRONMENTAL REPORT FOR CALENDAR YEAR 1994.

    Energy Technology Data Exchange (ETDEWEB)

    NAIDU,J.R.; ROYCE,B.A.

    1995-05-01

    This report documents the results of the Environmental Monitoring Program at Brookhaven National Laboratory and presents summary information about environmental compliance for 1994. To evaluate the effect of Brookhaven National Laboratory's operations on the local environment, measurements of direct radiation, and a variety of radionuclides and chemical compounds in ambient air, soil, sewage effluent, surface water, groundwater, fauna and vegetation were made at the Brookhaven National Laboratory site and at sites adjacent to the Laboratory. Brookhaven National Laboratory's compliance with all applicable guides, standards, and limits for radiological and nonradiological emissions and effluents to the environment was evaluated. Among the permitted facilities, two instances of pH exceedances were observed at recharge basins, possibly related to rain-water run-off to these recharge basins. Also, the discharge from the Sewage Treatment Plant to the Peconic River exceeded permit limits on ten occasions: one each for fecal coliform and 5-day Biochemical Oxygen Demand (avg.) and eight for ammonia nitrogen. The ammonia and Biochemical Oxygen Demand exceedances were attributed to the cold winter and the routine cultivation of the sand filter beds, which resulted in the hydraulic overloading of the filter beds and the possible destruction of nitrifying bacteria. The onset of warm weather and increased aeration of the filter beds via cultivation helped to alleviate this condition. The discharge of fecal coliform may also be linked to this occurrence, in that the increase in fecal coliform coincided with the increased cultivation of the sand filter beds. The environmental monitoring data have identified site-specific contamination of groundwater and soil. These areas are subject to Remedial Investigation/Feasibility Studies under the Inter Agency Agreement. Except for the above, the environmental monitoring data have continued to demonstrate that compliance was achieved with

  15. Brookhaven National Laboratory Institutional Plan FY2001--FY2005

    Energy Technology Data Exchange (ETDEWEB)

    Davis, S.

    2000-10-01

    Brookhaven National Laboratory is a multidisciplinary laboratory in the Department of Energy National Laboratory system and plays a lead role in the DOE Science and Technology mission. The Laboratory also contributes to the DOE missions in Energy Resources, Environmental Quality, and National Security. Brookhaven strives for excellence in its science research and in facility operations and manages its activities with particular sensitivity to environmental and community issues. The Laboratory's programs are aligned continuously with the goals and objectives of the DOE through an Integrated Planning Process. This Institutional Plan summarizes the portfolio of research and capabilities that will assure success in the Laboratory's mission in the future. It also sets forth BNL strategies for our programs and for management of the Laboratory. The Department of Energy national laboratory system provides extensive capabilities in both world class research expertise and unique facilities that cannot exist without federal support. Through these national resources, which are available to researchers from industry, universities, other government agencies and other nations, the Department advances the energy, environmental, economic and national security well being of the US, provides for the international advancement of science, and educates future scientists and engineers.

  16. 1995 Annual epidemiologic surveillance report for Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-31

    The US Department of Energy's (DOE) conduct of epidemiologic surveillance provides an early warning system for health problems among workers. This program monitors illnesses and health conditions that result in an absence of five or more consecutive workdays, occupational injuries and illnesses, and disabilities and deaths among current workers. This report summarizes epidemiologic surveillance data collected from Brookhaven National Laboratory (BNL) from January 1, 1995 through December 31, 1995. The data were collected by a coordinator at BNL and submitted to the Epidemiologic Surveillance Data Center, located at Oak Ridge Institute for Science and Education, where quality control procedures and data analyses were carried out.

  17. Brookhaven National Laboratory site report for calendar year 1988

    Energy Technology Data Exchange (ETDEWEB)

    Miltenberger, R.P.; Royce, B.A.; Naidu, J.R.

    1989-06-01

    Brookhaven National Laboratory (BNL) is managed by Associated Universities Inc. (AUI). AUI was formed in 1946 by a group of nine universities whose purpose was to create and manage a laboratory in the Northeast in order to advance scientific research in areas of interest to universities, industry, and government. On January 31, 1947, the contract for BNL was approved by the Manhattan District of the Army Corps of Engineers and BNL was established on the former Camp Upton army camp. 54 refs., 21 figs., 78 tabs.

  18. Brookhaven National Laboratory site environmental report for calendar year 1990

    Energy Technology Data Exchange (ETDEWEB)

    Miltenberger, R.P.; Royce, B.A.; Naidu, J.R.

    1992-01-01

    Brookhaven National Laboratory (BNL) carries out basic and applied research in the following fields: high-energy nuclear and solid state physics; fundamental material and structure properties and the interactions of matter; nuclear medicine, biomedical and environmental sciences; and selected energy technologies. In conducting these research activities, it is Laboratory policy to protect the health and safety of employees and the public, and to minimize the impact of BNL operations on the environment. This document is the BNL environmental report for the calendar year 1990 for the Safety and Environmental Protection Division and covers topics on effluents, surveillance, regulations, assessments, and compliance.

  19. The High Flux Beam Reactor at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Shapiro, S.M.

    1994-12-31

    Brookhaven National Laboratory's High Flux Beam Reactor (HFBR) was built because scientists always want 'more'. In the mid-50s, the Brookhaven Graphite Reactor was churning away, producing a number of new results, when the then-current generation of scientists, led by Donald Hughes, realized the need for a high flux reactor and started down the political, scientific, and engineering path that led to the HFBR. The effort was joined by a number of engineers and scientists, among them Chernick, Hastings, Kouts, and Hendrie, who came up with the novel design of the HFBR. The two innovative features that have been incorporated in nearly all other research reactors built since are: (i) an undermoderated core arrangement, which enables the thermal flux to peak outside the core region where beam tubes can be placed, and (ii) beam tubes that are tangential to the core, which decrease the fast neutron background without affecting the thermal beam intensity. Construction began in the fall of 1961 and four years later, at a cost of $12 million, criticality was achieved on Halloween night, 1965. Thus began 30 years of scientific accomplishments.

  20. High field magnet program at Brookhaven National Laboratory

    CERN Document Server

    Ghosh, A; Muratore, J; Parker, B; Sampson, W; Wanderer, P J; Willen, E

    2000-01-01

    The magnet program at Brookhaven National Laboratory (BNL) is focused on superconducting magnets for particle accelerators. The effort includes magnet production at the laboratory and in industry, magnet R&D, and test facilities for magnets and superconductors. Nearly 2000 magnets (dipoles, quadrupoles, sextupoles, and correctors for the arc and insertion regions) were produced for the Relativistic Heavy Ion Collider (RHIC), which is being commissioned. Currently, production of helical dipoles for the polarized proton program at RHIC, insertion region dipoles for the Large Hadron Collider (LHC) at CERN, and an insertion magnet system for the Hadron-Elektron-Ring-Anlage (HERA) collider at Deutsches Elektronen-Synchrotron (DESY) is underway. The R&D effort is exploring dipoles with fields above 10 T for use in post-LHC colliders. Brittle superconductors (Nb3Sn or HTS) are being used for these magnets. The superconductor test facility measures short-sample currents and other characteristics of sample...

  1. Deriving cleanup guidelines for radionuclides at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Meinhold, A.F.; Morris, S.C.; Dionne, B.; Moskowitz, P.D.

    1997-01-01

    Past activities at Brookhaven National Laboratory (BNL) resulted in soil and groundwater contamination. As a result, BNL was designated a Superfund site under the Comprehensive Environmental Response Compensation and Liability Act (CERCLA). BNL's Office of Environmental Restoration (OER) is overseeing environmental restoration activities at the Laboratory. With the exception of radium, there are no regulations or guidelines establishing cleanup levels for radionuclides in soils at BNL. BNL must derive radionuclide soil cleanup guidelines for a number of Operable Units (OUs) and Areas of Concern (AOCs). These guidelines are required by DOE under a proposed regulation for radiation protection of public health and the environment, as well as to satisfy the requirements of CERCLA. The objective of this report is to propose a standard approach to deriving risk-based cleanup guidelines for radionuclides in soil at BNL. Implementation of the approach is briefly discussed.

  2. Summary of failure analysis activities at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Cowgill, M.G.; Czajkowski, C.J.; Franz, E.M.

    1996-10-01

    Brookhaven National Laboratory has for many years conducted examinations related to the failures of nuclear materials and components. These examinations included the confirmation of root cause analyses, the determination of the causes of failure, identification of the species that accelerate corrosion, and comparison of the results of nondestructive examinations with those obtained by destructive examination. The results of those examinations, which had previously appeared in various formats (formal and informal reports, journal articles, etc.), have been collected together and summarized in the present report. The report is divided into sections according to the general subject matter (for example, corrosion, fatigue, etc.). Each section presents summaries of the information contained in specific reports and publications, all of which are fully identified as to title, authors, report number or journal reference, date of publication, and FIN number under which the work was performed.

  3. WILDLAND FIRE MANAGEMENT PLAN FOR BROOKHAVEN NATIONAL LABORATORY.

    Energy Technology Data Exchange (ETDEWEB)

    ENVIRONMENTAL AND WASTE MANAGEMENT SERVICES DIVISION

    2003-09-01

    This Wildland Fire Management Plan (FMP) for Brookhaven National Lab (BNL) and the Upton Ecological and Research Reserve (Upton Reserve) is based on the U.S. Fish & Wildlife Service (FWS) fire management planning procedures and was developed in cooperation with the Department of Energy (DOE) by Brookhaven Science Associates. As the Upton Reserve is contained within the BNL 5,265-acre site, it is logical that the plan applies to both the Upton Reserve and BNL. The Department of the Interior policy for managing wildland fires requires that all areas managed by FWS that can sustain fire must have an FMP that details fire management guidelines for operational procedures and specifies values to be protected or enhanced. Fire management plans provide guidance on fire preparedness, fire prevention, wildfire suppression, and the use of controlled, ''prescribed'' fires and mechanical means to control the amount of available combustible material. Values reflected in the BNL/Upton Reserve Wildland FMP include protecting life and public safety; Lab properties, structures and improvements; cultural and historical sites; neighboring private and public properties; and endangered and threatened species and species of concern. Other values supported by the plan include the enhancement of fire-dependent ecosystems at BNL and the Upton Reserve. This FMP will be reviewed periodically to ensure the fire program advances and evolves with the missions of FWS, BNL, and the Upton Reserve. This Fire Management Plan is a modified version of the Long Island National Wildlife Refuge Complex Fire plan (updated in 2000), which contains all FWS fire plan requirements and is presented in the format specified by the national template for fire management plans adopted under the National Fire Plan. The DOE is one of the signatory agencies on the National Fire Plan. FWS shall be, through an Interagency Agreement dated November 2000 (Appendix C), responsible for coordinating and

  4. Environmental Survey preliminary report, Brookhaven National Laboratory, Upton, New York

    Energy Technology Data Exchange (ETDEWEB)

    1988-06-01

    This report presents the preliminary findings from the first phase of the Environmental Survey of the United States Department of Energy (DOE) Brookhaven National Laboratory (BNL) conducted April 6 through 17, 1987. The Survey is being conducted by an interdisciplinary team of environmental specialists, led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. Individual team components are being supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with BNL. The Survey covers all environmental media and all areas of environmental regulation. It is being performed in accordance with the DOE Environmental Survey Manual. This phase of the Survey involves the review of existing site environmental data, observations of the operations carried on at BNL, and interviews with site personnel. The Survey team developed a Sampling and Analysis Plan to assist in further assessing specific environmental problems identified during its on-site activities. The Sampling and Analysis Plan will be executed by Oak Ridge National Laboratory. When completed, the results will be incorporated into the BNL Environmental Survey Interim Report. The Interim Report will reflect the final determinations of the BNL Survey. 80 refs., 24 figs., 48 tabs.

  5. Brookhaven National Laboratory 2008 Site Environment Report Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Brookhaven National Laboratory

    2009-10-01

    Brookhaven National Laboratory (BNL) prepares an annual Site Environmental Report (SER) in accordance with DOE Order 231.1A, Environment, Safety and Health Reporting of the U.S. Department of Energy. The report is written to inform the public, regulators, employees, and other stakeholders of the Laboratory's environmental performance during the calendar year in review. Volume I of the SER summarizes environmental data; environmental management performance; compliance with applicable DOE, federal, state, and local regulations; and performance in restoration and surveillance monitoring programs. BNL has prepared annual SERs since 1971 and has documented nearly all of its environmental history since the Laboratory's inception in 1947. Volume II of the SER, the Groundwater Status Report, also is prepared annually to report on the status of and evaluate the performance of groundwater treatment systems at the Laboratory. Volume II includes detailed technical summaries of groundwater data and its interpretation, and is intended for internal BNL users, regulators, and other technically oriented stakeholders. A brief summary of the information contained in Volume II is included in this volume in Chapter 7, Groundwater Protection. Both reports are available in print and as downloadable files on the BNL web page at http://www.bnl.gov/ewms/ser/. An electronic version on compact disc is distributed with each printed report. In addition, a summary of Volume I is prepared each year to provide a general overview of the report, and is distributed with a compact disc containing the full report.

  6. CULTURAL RESOURCE MANAGEMENT PLAN FOR BROOKHAVEN NATIONAL LABORATORY.

    Energy Technology Data Exchange (ETDEWEB)

    DAVIS, M.

    2005-04-01

    The Cultural Resource Management Plan (CRMP) for Brookhaven National Laboratory (BNL) provides an organized guide that describes or references all facets and interrelationships of cultural resources at BNL. This document specifically follows, where applicable, the format of the U.S. Department of Energy (DOE) Environmental Guidelines for Development of Cultural Resource Management Plans, DOE G 450.1-3 (9-22-04[m1]). Management strategies included within this CRMP are designed to adequately identify the cultural resources that BNL and DOE consider significant and to acknowledge associated management actions. A principal objective of the CRMP is to reduce the need for additional regulatory documents and to serve as the basis for a formal agreement between the DOE and the New York State Historic Preservation Officer (NYSHPO). The BNL CRMP is designed to be a ''living document.'' Each section includes identified gaps in the management plan, with proposed goals and actions for addressing each gap. The plan will be periodically revised to incorporate new documentation.

  7. In vivo neutron activation facility at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Ma, R.; Yasumura, Seiichi; Dilmanian, F.A.

    1997-11-01

    Seven important body elements, C, N, Ca, P, K, Na, and Cl, can be measured with great precision and accuracy in the in vivo neutron activation facilities at Brookhaven National Laboratory. The facilities include the delayed-gamma neutron activation, the prompt-gamma neutron activation, and the inelastic neutron scattering systems. In conjunction with measurements of total body water by the tritiated-water dilution method, several body compartments can be defined from the contents of these elements, also with high precision. In particular, body fat mass is derived from total body carbon together with total body calcium and nitrogen; body protein mass is derived from total body nitrogen; extracellular fluid volume is derived from total body sodium and chlorine; lean body mass and body cell mass are derived from total body potassium; and skeletal mass is derived from total body calcium. Thus, we suggest that neutron activation analysis may be valuable for calibrating some of the instruments routinely used in clinical studies of body composition. The instruments that would benefit from absolute calibration against neutron activation analysis are bioelectric impedance analysis, infrared interactance, transmission ultrasound, and dual-energy x-ray/photon absorptiometry.
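
    The element-to-compartment relationships described in this abstract can be sketched in a few lines. The conversion factors below are classical literature approximations (e.g., protein as 6.25 times total body nitrogen); they are illustrative placeholders, not the coefficients actually used at BNL, and the fat formula's correction terms are likewise assumed for the sketch:

    ```python
    # Sketch: mapping whole-body elemental totals from in vivo neutron
    # activation onto body compartments. All coefficients are illustrative
    # literature-style approximations, not BNL's actual calibration.

    def body_composition(tbn_kg, tbk_mmol, tbca_kg, tbc_kg=None):
        """Derive compartments from total body N, K, Ca (and optionally C)."""
        protein_kg = 6.25 * tbn_kg               # standard N-to-protein factor
        body_cell_mass_kg = 0.00833 * tbk_mmol   # K-based estimate (kg per mmol K)
        skeletal_ash_kg = tbca_kg / 0.34         # assumes Ca is ~34% of bone ash
        result = {
            "protein_kg": protein_kg,
            "body_cell_mass_kg": body_cell_mass_kg,
            "skeletal_ash_kg": skeletal_ash_kg,
        }
        if tbc_kg is not None:
            # Fat from total body carbon, correcting for carbon bound in
            # protein and bone mineral (correction coefficients assumed).
            result["fat_kg"] = 1.30 * (tbc_kg - 0.53 * protein_kg - 0.05 * tbca_kg)
        return result

    # Example with plausible adult values: ~1.8 kg N, 3500 mmol K, 1.0 kg Ca, 16 kg C
    print(body_composition(tbn_kg=1.8, tbk_mmol=3500, tbca_kg=1.0, tbc_kg=16.0))
    ```

    The point of the abstract survives the placeholder numbers: each compartment is a simple function of one or two elemental totals, so the precision of the activation measurements carries through directly to the derived compartments.
    
    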

  8. Tiger Team assessment of the Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    1990-06-01

    This report documents the results of the Department of Energy's (DOE's) Tiger Team Assessment conducted at Brookhaven National Laboratory (BNL) in Upton, New York, between March 26 and April 27, 1990. The BNL is a multiprogram laboratory operated by Associated Universities, Inc. (AUI) for DOE. The purpose of the assessment was to provide the status of environment, safety, and health (ES&H) programs at the Laboratory. The scope of the assessment included a review of management systems and operating procedures and records; observations of facility operations; and interviews at the facilities. Subteams in four areas performed the review: Environment, Safety and Health, Occupational Safety and Health, and Management and Organization. The assessment was comprehensive, covering all areas of ES&H activities and waste management operations. Compliance with applicable Federal, State, and local regulations; applicable DOE Orders; and internal BNL requirements was assessed. In addition, the assessment included an evaluation of the adequacy and effectiveness of the DOE and the site contractor, Associated Universities, Inc. (AUI), management, organization, and administration of the ES&H programs at BNL.

  9. Tiger Team assessment of the Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    1990-06-01

    This report documents the results of the Department of Energy's (DOE's) Tiger Team Assessment conducted at Brookhaven National Laboratory (BNL) in Upton, New York, between March 26 and April 27, 1990. The BNL is a multiprogram laboratory operated by Associated Universities, Inc. (AUI) for DOE. The purpose of the assessment was to provide the status of environment, safety, and health (ES&H) programs at the laboratory. The scope of the assessment included a review of management systems and operating procedures and records; observations of facility operations; and interviews at the facilities. Subteams in four areas performed the review: Environment, Safety and Health, Occupational Safety and Health, and Management and Organization. The assessment was comprehensive, covering all areas of ES&H activities and waste management operations. Compliance with applicable Federal, State, and local regulations; applicable DOE Orders; and internal BNL requirements was assessed. In addition, the assessment included an evaluation of the adequacy and effectiveness of the DOE and the site contractor, Associated Universities, Inc. (AUI), management, organization, and administration of the ES&H programs at BNL. This volume contains appendices.

  10. Proto-2, an ALICE detector prototype, part of the STAR experiment at the Brookhaven National Laboratory

    CERN Multimedia

    2002-01-01

    Proto-2, an ALICE detector prototype, overcame its prototype status to become a real part of the STAR experiment at the US Brookhaven National Laboratory. After more than two years across the ocean, it has just arrived back at CERN.

  11. The Brookhaven National Laboratory electron beam ion source for RHIC.

    Science.gov (United States)

    Alessi, J G; Barton, D; Beebe, E; Bellavia, S; Gould, O; Kponou, A; Lambiase, R; Lockey, R; McNerney, A; Mapes, M; Marneris, Y; Okamura, M; Phillips, D; Pikin, A I; Raparia, D; Ritter, J; Snydstrup, L; Theisen, C; Wilinski, M

    2010-02-01

    As part of a new heavy ion preinjector that will supply beams for the Relativistic Heavy Ion Collider and the National Aeronautics and Space Administration Space Radiation Laboratory, construction of a new electron beam ion source (EBIS) is now being completed. This source, based on the successful prototype Brookhaven National Laboratory Test EBIS, is designed to produce milliampere-level currents of all ion species, with q/m=(1/6)-(1/2). Among the major components of this source are a 5 T, 2-m-long, 204-mm-diameter warm-bore superconducting solenoid, an electron gun designed to operate at a nominal current of 10 A, and an electron collector designed to dissipate approximately 300 kW of peak power. Careful attention has been paid to the design of the vacuum system, since a pressure of 10⁻¹⁰ Torr is required in the trap region. The source includes several differential pumping stages, the trap can be baked to 400 °C, and there are non-evaporable getter strips in the trap region. Power supplies include a 15 A, 15 kV electron collector power supply, and fast switchable power supplies for most of the 16 electrodes used for varying the trap potential distribution for ion injection, confinement, and extraction. The EBIS source and all EBIS power supplies sit on an isolated platform, which is pulsed up to a maximum of 100 kV during ion extraction. The EBIS is now fully assembled, and operation will begin following final vacuum and power supply tests. Details of the EBIS components are presented.

  12. The Founding of the Brookhaven National Laboratory - Associated Universities, Inc.

    Energy Technology Data Exchange (ETDEWEB)

    BROOKHAVEN NATIONAL LABORATORY

    1948-01-15

    At the end of the war it became apparent that the teamwork of government and scientific institutions, which had been so effective in wartime work, must somehow be perpetuated in order to insure the continued progress of nuclear science in peace time. The enormous expense of the tools needed to pursue the next steps in this research -- nuclear reactors and high energy accelerators -- and the shortage of scientifically trained personnel pointed towards the establishment of a cooperative laboratory. Such a laboratory, using government funds, could carry out a comprehensive research program that would benefit the many interested research groups throughout the country. As a result of the wartime programs under the Manhattan District, centers of research in nuclear science were already active at the Radiation Laboratory in Berkeley, California, at Los Alamos in New Mexico, at the Clinton Laboratories in Oak Ridge, Tennessee and at the Argonne Laboratory in Chicago. No analogous nuclear research laboratories, however, had developed in the Northeast, and since so much of the nation's scientific talent and industrial activities are concentrated in the northeastern states, it was proposed that a new laboratory be established near New York City. As a result of this plan, the Brookhaven National Laboratory is now in operation at Upton, Long Island. The work of this Laboratory is performed under a contract between the Atomic Energy Commission (AEC) and a corporation, Associated Universities, Inc. (AUI), formed by representatives of nine of the larger private universities in the northeast: Columbia, Cornell, Harvard, Johns Hopkins, the Massachusetts Institute of Technology, the University of Pennsylvania, Princeton, the University of Rochester, and Yale. The purpose of this laboratory is the advancement of knowledge in the fundamentals of nuclear science, the extension of its application to other fields, and the training of young scientists in these new subjects. This

  13. Wildland Fire Management Plan for Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Green,T.

    2009-10-23

    This Wildland Fire Management Plan (FMP) for Brookhaven National Lab (BNL) updates the 2003 plan, incorporating changes necessary to comply with DOE Order 450.1 and DOE P 450.4, the Federal Wildland Fire Management Policy and Program Review, and the Wildland and Prescribed Fire Management Policy and Implementation Procedures Reference Guide. This current plan incorporates changes since the original draft of the FMP that result from new policies at the national level. This update also removes references to and dependence on the U.S. Fish & Wildlife Service and Department of the Interior, fully transitioning wildland fire management responsibilities to BNL. Department of Energy policy for managing wildland fires requires that all areas managed by the DOE and/or its various contractors that can sustain fire must have an FMP that details fire management guidelines for operational procedures associated with wildland fires, operational fires, and prescribed fires. Fire management plans provide guidance on fire preparedness, fire prevention, wildfire suppression, and the use of controlled, 'prescribed' fires and mechanical means to control the amount of available combustible material. Values reflected in the BNL Wildland FMP include protecting life and public safety; Lab properties, structures and improvements; cultural and historical sites; neighboring private and public properties; and endangered and threatened species and species of concern. Other values supported by the plan include the enhancement of fire-dependent ecosystems at BNL. This FMP will be reviewed periodically to ensure the fire program advances and evolves with the missions of the DOE and BNL. This Fire Management Plan is presented in a format that covers all aspects specified by DOE guidance documents, which are based on the national template for fire management plans adopted under the National Fire Plan. The DOE is one of the signatory agencies on the National Fire Plan. This FMP is to be used and implemented for the

  14. NATURAL RESOURCE MANAGEMENT PLAN FOR BROOKHAVEN NATIONAL LABORATORY.

    Energy Technology Data Exchange (ETDEWEB)

    GREEN, T. ET AL.

    2003-12-31

    Brookhaven National Laboratory (BNL) is located near the geographic center of Long Island, New York. The Laboratory is situated on 5,265 acres of land composed of Pine Barrens habitat with a central area developed for Laboratory work. In the mid-1990s BNL began developing a wildlife management program. This program was guided by the Wildlife Management Plan (WMP), which was reviewed and approved by various state and federal agencies in September 1999. The WMP primarily addressed concerns with the protection of New York State threatened and endangered species and species of concern, as well as deer populations, invasive species management, and the revegetation of the area surrounding the Relativistic Heavy Ion Collider (RHIC). The WMP provided a strong and sound basis for wildlife management and established a basis for forward motion and the development of this document, the Natural Resource Management Plan (NRMP), which will guide the natural resource management program for BNL. The body of this plan establishes the management goals and actions necessary for managing the natural resources at BNL. The appendices provide specific management requirements for threatened and endangered amphibians and fish (Appendices A and B respectively), lists of actions in tabular format (Appendix C), and regulatory drivers for the Natural Resource Program (Appendix D). The purpose of the Natural Resource Management Plan is to provide management guidance, promote stewardship of the natural resources found at BNL, and to integrate their protection with pursuit of the Laboratory's mission. The philosophy or guiding principles of the NRMP are stewardship, adaptive ecosystem management, compliance, integration with other plans and requirements, and incorporation of community involvement, where applicable.

  15. Brookhaven National Laboratory site environmental report for calendar year 1993

    Energy Technology Data Exchange (ETDEWEB)

    Naidu, J.R.; Royce, B.A. [eds.]

    1994-05-01

    This report documents the results of the Environmental Monitoring Program at BNL and presents summary information about environmental compliance for 1993. To evaluate the effect of BNL operations on the local environment, measurements of direct radiation and of a variety of radionuclides and chemical compounds in ambient air, soil, sewage effluent, surface water, ground water, and vegetation were made at the BNL site and at sites adjacent to the Laboratory. Brookhaven National Laboratory's compliance with all applicable guides, standards, and limits for radiological and nonradiological emissions to the environment was evaluated. Among the permitted facilities, two instances of pH exceedances were observed at recharge basins, possibly related to rain-water run-off to these recharge basins. Also, the discharge from the Sewage Treatment Plant (STP) to the Peconic River exceeded permit limits on five occasions: three for residual chlorine and one each for iron and ammonia nitrogen. The chlorine exceedances were related to a malfunctioning hypochlorite dosing pump and ceased when the pump was repaired, while the iron and ammonia-nitrogen exceedances could be the result of disturbances to the sand filter beds during maintenance. The environmental monitoring data have identified site-specific contamination of ground water and soil. These areas are subject to Remedial Investigation/Feasibility Studies (RI/FS) under the Inter-Agency Agreement (IAG). Except for the above, the environmental monitoring data have continued to demonstrate that compliance was achieved with applicable environmental laws and regulations governing emission and discharge of materials to the environment, and that the environmental impacts at BNL are minimal and pose no threat to the public or to the environment. This report meets the requirements of DOE Orders 5484.1, Environmental Protection, Safety, and Health Protection Information Reporting Requirements, and 5400.1, General Environmental Protection Programs.

  16. Natural Resource Management Plan for Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Green, T.

    2011-08-15

    This comprehensive Natural Resource Management Plan (NRMP) for Brookhaven National Laboratory (BNL) was built on the successful foundation of the Wildlife Management Plan for BNL, which it replaces. This update to the 2003 plan continues to build on successes and efforts to better understand the ecosystems and natural resources found on the BNL site. The plan establishes the basis for managing the varied natural resources located on the 5,265 acre BNL site, setting goals and actions to achieve those goals. The planning of this document is based on the knowledge and expertise gained over the past 10 years by the Natural Resources management staff at BNL in concert with local natural resource agencies including the New York State Department of Environmental Conservation, Long Island Pine Barrens Joint Planning and Policy Commission, The Nature Conservancy, and others. The development of this plan is an attempt at sound ecological management that not only benefits BNL's ecosystems but also benefits the greater Pine Barrens habitats in which BNL is situated. This plan applies equally to the Upton Ecological and Research Reserve (Upton Reserve). Any differences in management between the larger BNL area and the Upton Reserve are noted in the text. The purpose of the Natural Resource Management Plan (NRMP) is to provide management guidance, promote stewardship of the natural resources found at BNL, and to sustainably integrate their protection with pursuit of the Laboratory's mission. The philosophy or guiding principles of the NRMP are stewardship, sustainability, adaptive ecosystem management, compliance, integration with other plans and requirements, and the incorporation of community involvement, where applicable. The NRMP is periodically reviewed and updated, typically every five years. This review and update was delayed to develop documents associated with a new third party facility, the Long Island Solar Farm. This two hundred acre facility will result in

  17. A woman like you: Women scientists and engineers at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Benkovitz, Carmen; Bernholc, Nicole; Cohen, Anita; Eng, Susan; Enriquez-Leder, Rosario; Franz, Barbara; Gorden, Patricia; Hanson, Louise; Lamble, Geraldine; Martin, Harriet; Mastrangelo, Iris; McLane, Victoria; Villela, Maria-Alicia; Vivirito, Katherine; Woodhead, Avril

    1991-01-01

    This publication by Women in Science and Engineering introduces career possibilities in science and engineering and describes what work and home life are like for women who have already entered these fields. Women at Brookhaven National Laboratory work in a variety of challenging research roles -- from biologist and environmental scientist to safety engineer, from patent lawyer to technician. Brookhaven National Laboratory is a multi-program laboratory which carries out basic and applied research in the physical, biomedical and environmental sciences and in selected energy technologies. The Laboratory is managed by Associated Universities, Inc., under contract with the US Department of Energy. Brookhaven and the other national laboratories, because of their enormous research resources, can play a critical role in the education and training of the workforce.

  18. A woman like you: Women scientists and engineers at Brookhaven National Laboratory. Careers in action

    Energy Technology Data Exchange (ETDEWEB)

    1991-12-31

    This publication by Women in Science and Engineering introduces career possibilities in science and engineering and describes what work and home life are like for women who have already entered these fields. Women at Brookhaven National Laboratory work in a variety of challenging research roles -- from biologist and environmental scientist to safety engineer, from patent lawyer to technician. Brookhaven National Laboratory is a multi-program laboratory which carries out basic and applied research in the physical, biomedical and environmental sciences and in selected energy technologies. The Laboratory is managed by Associated Universities, Inc., under contract with the US Department of Energy. Brookhaven and the other national laboratories, because of their enormous research resources, can play a critical role in the education and training of the workforce.

  19. Aberration-Corrected Electron Microscopy at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Zhu,Y.; Wall, J.

    2008-04-01

    The last decade witnessed the rapid development and implementation of aberration correction in electron optics, realizing a more-than-70-year-old dream of aberration-free electron microscopy with a spatial resolution below one angstrom [1-9]. With sophisticated aberration correctors, modern electron microscopes now can reveal local structural information unavailable with neutrons and x-rays, such as the local arrangement of atoms, order/disorder, electronic inhomogeneity, bonding states, spin configuration, quantum confinement, and symmetry breaking [10-17]. Aberration correction through multipole-based correctors, as well as the associated improved stability in accelerating voltage, lens supplies, and goniometers in electron microscopes now enables medium-voltage (200-300 kV) microscopes to achieve image resolution at or below 0.1 nm. Aberration correction not only improves the instrument's spatial resolution but, equally importantly, allows larger objective lens pole-piece gaps to be employed, thus realizing the potential of the instrument as a nanoscale property-measurement tool. That is, while retaining high spatial resolution, we can use various sample stages to observe the materials response under various temperature, electric- and magnetic- fields, and atmospheric environments. Such capabilities afford tremendous opportunities to tackle challenging science and technology issues in physics, chemistry, materials science, and biology. The research goal of the electron microscopy group at the Dept. of Condensed Matter Physics and Materials Science and the Center for Functional Nanomaterials, as well as the Institute for Advanced Electron Microscopy, Brookhaven National Laboratory (BNL), is to elucidate the microscopic origin of the physical- and chemical-behavior of materials, and the role of individual, or groups of atoms, especially in their native functional environments. We plan to accomplish this by developing and implementing various quantitative

  20. A Compilation of Internship Reports - 2012

    Energy Technology Data Exchange (ETDEWEB)

    Stegman M.; Morris, M.; Blackburn, N.

    2012-08-08

    This compilation documents all research projects undertaken by the 2012 summer Department of Energy Workforce Development for Teachers and Scientists interns during their internship program at Brookhaven National Laboratory.

  1. First magnet constructed for the LHC by Brookhaven National Laboratory, USA

    CERN Document Server

    Maximilien Brice

    2003-01-01

    CERN has taken delivery of the first US-built contribution to the LHC. The 25-tonne interaction-region dipole magnet, which will guide the LHC's two counter-rotating beams of protons into collision, was built at the US Brookhaven National Laboratory. It is the first of 20 that the laboratory will ultimately provide and took nine months for more than 100 scientists, engineers and technicians to construct. Brookhaven's Superconducting Magnet Division is now building the remaining 19 magnets, which will be shipped to CERN later this year. They are provided for the LHC under the terms of a 1998 agreement between CERN and the US Department of Energy and National Science Foundation.

  2. Qualitative risk evaluation of environmental restoration programs at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Morris, S.C.

    1996-05-01

    This report documents the evaluation of risks associated with environmental restoration activities at Brookhaven National Laboratory using two tools supplied by DOE to provide a consistent set of risk estimates across the DOE complex: Risk Data Sheets (RDS) and Relative Risk Ranking. The tools are described, the process is characterized, and the results are presented and discussed. The two approaches are compared, and recommendations are provided for continuing improvement of the process.

  3. 2003 Brookhaven National Laboratory Annual Illness and Injury Surveillance Report, Revised September 2007

    Energy Technology Data Exchange (ETDEWEB)

    U.S. Department of Energy, Office of Health, Safety and Security, Office of Illness and Injury Prevention Programs

    2007-10-02

    Annual Illness and Injury Surveillance Program report for 2003 for Brookhaven National Lab. The U.S. Department of Energy’s (DOE) commitment to assuring the health and safety of its workers includes the conduct of epidemiologic surveillance activities that provide an early warning system for health problems among workers. The IISP monitors illnesses and health conditions that result in an absence of workdays, occupational injuries and illnesses, and disabilities and deaths among current workers.

  4. Radiological environmental monitoring report for Brookhaven National Laboratory 1967--1970

    Energy Technology Data Exchange (ETDEWEB)

    Meinhold, C.B.; Hull, A.P.

    1998-10-01

    Brookhaven National Laboratory (BNL) was established in 1947 on the former Army Camp Upton site located in central Long Island, New York. From the very beginning, BNL has monitored the environment on and around the Laboratory site to assess the effects of its operations on the environment. This document summarizes the environmental data collected for the years 1967, 1968, 1969, and 1970. Thus, it fills a gap in the series of BNL annual environmental reports beginning in 1962. The data in this document reflect measurements for those four years of concentrations and/or amounts of airborne radioactivity, radioactivity in streams and ground water, and external radiation levels in the vicinity of BNL. Also included are estimates, made at that time, of BNL's contribution to radioactivity in the environment. Among the major scientific facilities operated at BNL are the High Flux Beam Reactor, Medical Research Reactor, Brookhaven Graphite Research Reactor, Alternating Gradient Synchrotron, and the 60-inch Cyclotron.

  5. Assessment of Energy Efficiency Project Financing Alternatives for Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, W. D.; Hail, John C.; Sullivan, Gregory P.

    2000-02-14

    This document provides findings and recommendations that resulted from an assessment of Brookhaven National Laboratory, conducted by a team from Pacific Northwest National Laboratory, of the site's potential for various alternative financing options as a means to implement energy-efficiency improvements. The assessment looked for life-cycle cost-effective energy-efficiency improvement opportunities and, through a series of staff interviews, evaluated the various methods by which these opportunities may be financed, considering availability of funds, staff, and available financing options. This report summarizes the findings of the visit and the resulting recommendations.

  6. Synchrotron radiation applications in medical research at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Thomlinson, W.

    1997-08-01

    In the relatively short time that synchrotrons have been available to the scientific community, their characteristic beams of UV and X-ray radiation have been applied to virtually all areas of medical science which use ionizing radiation. The ability to tune intense monochromatic beams over wide energy ranges clearly differentiates these sources from standard clinical and research tools. The tunable spectrum, high intrinsic collimation of the beams, polarization and intensity of the beams make possible in-vitro and in-vivo research and therapeutic programs not otherwise possible. From the beginning of research operation at the National Synchrotron Light Source (NSLS), many programs have been carrying out basic biomedical research. At first, the research was limited to in-vitro programs such as the x-ray microscope, circular dichroism, XAFS, protein crystallography, micro-tomography and fluorescence analysis. Later, as the coronary angiography program made plans to move its experimental phase from SSRL to the NSLS, it became clear that other in-vivo projects could also be carried out at the synchrotron. The development of SMERF (Synchrotron Medical Research Facility) on beamline X17 became the home not only for angiography but also for the MECT (Multiple Energy Computed Tomography) project for cerebral and vascular imaging. The high energy spectrum on X17 is necessary for the MRT (Microplanar Radiation Therapy) experiments. Experience with these programs and the existence of the Medical Programs Group at the NSLS led to the development of a program in synchrotron based mammography. A recent adaptation of the angiography hardware has made it possible to image human lungs (bronchography). Fig. 1 schematically depicts the broad range of active programs at the NSLS.

  7. Risk-based priority scoring for Brookhaven National Laboratory environmental restoration programs

    Energy Technology Data Exchange (ETDEWEB)

    Morris, S.C.; Meinhold, A.F.

    1995-05-01

    This report describes the process of estimating the risk associated with environmental restoration programs under the Brookhaven National Laboratory Office of Environmental Restoration. The process was part of an effort across all Department of Energy facilities to provide a consistent framework to communicate risk information about the facilities to senior managers in the DOE Office of Environmental Management and to foster understanding of risk activities across programs. The risk evaluation was a qualitative exercise. Categories considered included: public health and safety; site personnel safety and health; compliance; mission impact; cost-effective risk management; environmental protection; inherent worker risk; environmental effects of clean-up; and social, cultural, political, and economic impacts.

  8. Applications of nuclear techniques for in vivo body composition studies at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Cohn, S.H.; Ellis, K.J.; Vartsky, D.; Vaswani, A.N.; Wielopolski, L.

    1981-01-01

    A series of technical developments and their clinical applications in various nuclear technologies at Brookhaven National Laboratory is described. These include the development of a portable neutron activation facility for measuring cadmium in vivo in kidney and liver, a technique for the measurement of body iron utilizing nuclear resonant scattering of gamma rays, a non-invasive measure of the skeletal levels of lead by an x-ray fluorescence technique, and the development of a pulsed Van de Graaff generator as a source of pulsed neutrons for the measurement of lung silicon. (ACR)

  9. Summary of proposed approach for deriving cleanup guidelines for radionuclides in soil at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Meinhold, A.F.; Morris, S.C.; Dionne, B.; Moskowitz, P.D.

    1996-11-01

    Past activities at Brookhaven National Laboratory (BNL) resulted in soil and groundwater contamination. As a result, BNL was designated a Superfund site under the Comprehensive Environmental Response Compensation and Liability Act (CERCLA). BNL's Office of Environmental Restoration (OER) is overseeing environmental restoration activities at the Laboratory, carried out under an Interagency Agreement (IAG) with the United States Department of Energy (DOE), the United States Environmental Protection Agency (EPA) and the New York State Department of Environmental Conservation (NYSDEC). The objective of this paper is to propose a standard approach to deriving risk-based cleanup guidelines for radionuclides in soil at BNL.

  10. Proceedings of Brookhaven National Laboratory's fusion/synfuel workshop

    Energy Technology Data Exchange (ETDEWEB)

    Fillo, J.A.; Powell, J.R. (eds.)

    1979-01-01

    The fusion synfuels workshop held at Brookhaven National Laboratory (BNL) on August 27-29, 1979 examined the current status of candidate synfuel processes and the R and D required to develop the capability for fusion synfuel production. Participants divided into five working groups, covering the following areas: (1) economics and applications; (2) high-temperature electrolysis; (3) thermochemical processes (including hybrid thermo-electrochemical); (4) blanket and materials; and (5) high-efficiency power cycles. Each working group presented a summary of their conclusions and recommendations to all participants during the third day of the Workshop. These summaries are given.

  11. Brookhaven highlights

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, M.S.; Cohen, A.; Greenberg, D.; Seubert, L. (eds.)

    1992-01-01

    This publication provides a broad overview of the research programs and facilities being operated, built, designed, and planned at Brookhaven National Laboratory. This work covers a broad range of scientific disciplines. Major facilities include the Alternating Gradient Synchrotron (AGS), with its newly completed booster, the National Synchrotron Light Source (NSLS), the High Flux Beam Reactor (HFBR), and the RHIC, which is under construction. Departments within the laboratory include the AGS department, accelerator development, physics, chemistry, biology, NSLS, medical, nuclear energy, and interdepartmental research efforts. Research ranges from the pure sciences, in nuclear physics and high energy physics as one example, to environmental work in applied science to study climatic effects, from efforts in biology which are a component of the human genome project to the study, production, and characterization of new materials. The paper provides an overview of the laboratory operations during 1992, including staffing, research, honors, funding, and general laboratory plans for the future.

  12. Structural biology facilities at Brookhaven National Laboratory's high flux beam reactor

    Energy Technology Data Exchange (ETDEWEB)

    Korszun, Z.R.; Saxena, A.M.; Schneider, D.K. [Brookhaven National Laboratory, Upton, NY (United States)]

    1994-12-31

    The techniques for determining the structure of biological molecules and larger biological assemblies depend on the extent of order in the particular system. At the High Flux Beam Reactor at the Brookhaven National Laboratory, the Biology Department operates three beam lines dedicated to biological structure studies. These beam lines span the resolution range from approximately 700 Å to approximately 1.5 Å and are designed to perform structural studies on a wide range of biological systems. Beam line H3A is dedicated to single crystal diffraction studies of macromolecules, while beam line H3B is designed to study diffraction from partially ordered systems such as biological membranes. Beam line H9B is located on the cold source and is designed for small angle scattering experiments on oligomeric biological systems.

  13. CSEWG SYMPOSIUM, A CSEWG RETROSPECTIVE. 35TH ANNIVERSARY CROSS SECTION EVALUATION WORKING GROUP, NOV. 5, 2001, BROOKHAVEN NATIONAL LABORATORY.

    Energy Technology Data Exchange (ETDEWEB)

    DUNFORD, C.; HOLDEN, N.; PEARLSTEIN, S.

    2001-11-05

    This publication has been prepared to record some of the history of the Cross Section Evaluation Working Group (CSEWG). CSEWG is responsible for creating the evaluated nuclear data file (ENDF/B), which is widely used by scientists and engineers who are involved in the development and maintenance of applied nuclear technologies. This organization has become the model for the development of nuclear data libraries throughout the world. The data format (ENDF) has been adopted as the international standard. On November 5, 2001, a symposium was held at Brookhaven National Laboratory to celebrate the 50th meeting of the CSEWG organization and the 35th anniversary of its first meeting in November 1966. The papers presented in this volume were prepared by present and former CSEWG members for presentation at the November 2001 symposium. All but two of the presentations are included. I have included an appendix listing all of the CSEWG members and their affiliations, compiled from the minutes of each of the CSEWG meetings. Minutes exist for all meetings except the 4th meeting held in January 1968. The list includes 348 individuals from 71 organizations. The dates for each of the 50 CSEWG meetings are listed. The committee structure and chairmen of all committees and subcommittees are also included in the appendix. This volume is dedicated to three individuals whose foresight and talents made CSEWG possible and successful. They are Henry Honeck, who led the effort to develop the ENDF format and the CSEWG system; Ira Zartman, the Atomic Energy Commission program manager who provided the programmatic direction and support; and Sol Pearlstein, who led the development of the CSEWG organization and the ENDF/B evaluated nuclear data library.

  14. In-situ containment of buried waste at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Dwyer, B.P. [Sandia National Labs., Albuquerque, NM (United States); Heiser, J. [Brookhaven National Lab., Upton, NY (United States); Stewart, W.; Phillips, S. [Applied Geotechnical Engineering and Construction, Inc., Richland, WA (United States)

    1997-12-31

    The primary objective of this project was to further develop close-coupled barrier technology for the containment of subsurface waste or contaminant migration. A close-coupled barrier is produced by first installing a conventional cement grout curtain, followed by a thin inner lining of a polymer grout. The resultant barrier is a cement-polymer composite with economic benefits derived from the cement and performance benefits from the durable and chemically resistant polymer layer. The technology has matured from a regulatory investigation of issues concerning barriers and barrier materials, through pilot-scale injections of multiple individual columns at Sandia National Laboratories (SNL), to full-scale demonstration. The feasibility of the barrier concept was successfully proven in a full-scale "cold test" demonstration at Hanford, WA. Consequently, a full-scale deployment of the technology was conducted at an actual environmental restoration site at Brookhaven National Laboratory (BNL), Long Island, NY. This paper discusses the installation and performance of the technology deployed at OU-1, an environmental restoration site located at BNL.

  15. Interactive radiopharmaceutical facility between Yale Medical Center and Brookhaven National Laboratory. Progress report, October 1976-June 1979

    Energy Technology Data Exchange (ETDEWEB)

    Gottschalk, A.

    1979-01-01

    DOE Contract No. EY-76-S-02-4078 was started in October 1976 to set up an investigative radiochemical facility at the Yale Medical Center that would bridge the gap between current investigation with radionuclides at the Yale School of Medicine and the facilities of the Chemistry Department at Brookhaven National Laboratory. To facilitate these goals, Dr. Mathew L. Thakur was recruited; he joined the Yale University faculty in March 1977. This report briefly summarizes our research accomplishments through the end of June 1979, which can be broadly classified into three categories: (1) research using indium-111 labelled cellular blood components; (2) development of new radiopharmaceuticals; and (3) interaction with Dr. Alfred Wolf and colleagues in the Chemistry Department of Brookhaven National Laboratory.

  16. RHIC and quark matter: proposal for a relativistic heavy ion collider at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    1984-08-01

    This document describes the Brookhaven National Laboratory proposal for the construction of a Relativistic Heavy Ion Collider (RHIC). The construction of this facility represents the natural continuation of the Laboratory's role as a center for nuclear and high-energy physics research, and extends and uses the existing AGS, Tandem Van de Graaff, and CBA facilities at BNL in a very cost-effective manner. The Administration and Congress have approved a project which will provide a link between the Tandem Van de Graaff and the AGS. Completion of this project in 1986 will provide fixed-target capabilities at the AGS for heavy ions of about 14 GeV/amu with masses up to approximately 30 (sulfur). The addition of an AGS booster would extend the mass range to the heaviest ions (A approximately 200, e.g., gold); its construction could start in 1986 and be completed in three years. These two new AGS experimental facilities can be combined with the proposed Relativistic Heavy Ion Collider to extend the energy range to 100 x 100 GeV/amu for the heaviest ions. BNL proposes to start construction of RHIC in FY 86, with completion in FY 90, at a total cost of $134 million.

  17. Department of Energy’s ARM Climate Research Facility External Data Center Operations Plan Located At Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Cialella, A. [Brookhaven National Lab. (BNL), Upton, NY (United States); Gregory, L. [Brookhaven National Lab. (BNL), Upton, NY (United States); Lazar, K. [Brookhaven National Lab. (BNL), Upton, NY (United States); Liang, M. [Brookhaven National Lab. (BNL), Upton, NY (United States); Ma, L. [Brookhaven National Lab. (BNL), Upton, NY (United States); Tilp, A. [Brookhaven National Lab. (BNL), Upton, NY (United States); Wagener, R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2015-05-01

    The External Data Center (XDC) Operations Plan describes the activities performed to manage the XDC, located at Brookhaven National Laboratory (BNL), for the Department of Energy’s Atmospheric Radiation Measurement (ARM) Climate Research Facility. It includes all ARM infrastructure activities performed by the Data Management and Software Engineering Group (DMSE) at BNL. This plan establishes a baseline of expectation within the ARM Operations Management for the group managing the XDC.

  18. Radar Wind Profiler for Cloud Forecasting at Brookhaven National Laboratory (BNL) Field Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, M. P. [Brookhaven National Laboratory (BNL), Upton, NY (United States); Giangrande, S. E. [Brookhaven National Laboratory (BNL), Upton, NY (United States); Bartholomew, M. J. [Brookhaven National Laboratory (BNL), Upton, NY (United States)

    2016-04-01

    The Radar Wind Profiler for Cloud Forecasting at Brookhaven National Laboratory (BNL) [http://www.arm.gov/campaigns/osc2013rwpcf] campaign was scheduled to take place from 15 July 2013 through 15 July 2015 (or until shipped for the next U.S. Department of Energy Atmospheric Radiation Measurement [ARM] Climate Research Facility first Mobile Facility [AMF1] deployment). The campaign involved the deployment of the AMF1 Scintec 915-MHz Radar Wind Profiler (RWP) at BNL, in conjunction with several other ARM, BNL, and National Weather Service (NWS) instruments. The two main scientific foci of the campaign were: 1) to provide profiles of the horizontal wind to be used to test and validate short-term cloud advection forecasts for solar-energy applications, and 2) to provide vertical profiling capabilities for the study of dynamics (i.e., vertical velocity) and hydrometeors in winter storms. The campaign was a serendipitous opportunity that arose following the deployment of the RWP at the Two-Column Aerosol Project (TCAP) campaign in Cape Cod, Massachusetts, and its exclusion from the Green Ocean Amazon 2014/15 (GoAmazon 2014/15) campaign owing to radio-frequency allocation restrictions for international deployments. The RWP arrived at BNL in the fall of 2013, but deployment was delayed until the fall of 2014 while work/safety planning and site preparation were completed. The RWP then encountered multiple electrical failures, which eventually required several shipments of instrument power supplies and the final amplifier to the vendor for repairs. Data collection began in late January 2015. The operational modes of the RWP were changed so that, in addition to collecting traditional profiles of the horizontal wind, a vertically pointing mode was included for the purpose of precipitation sensing and estimation of vertical velocities. The RWP operated well until the end of the campaign in July 2015 and collected observations for more than 20 precipitation

  19. 915-MHz Wind Profiler for Cloud Forecasting at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, M. [Brookhaven National Lab. (BNL), Upton, NY (United States); Bartholomew, M. J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Giangrande, S. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    When considering the amount of shortwave radiation incident on a photovoltaic solar array, and therefore the amount and stability of the energy output from the system, clouds represent the greatest source of short-term (i.e., minutes-to-hours) variability through scattering and reflection of incoming solar radiation. Providing estimates of this short-term variability is important for determining and regulating the output from large solar arrays as they connect with the larger power infrastructure. In support of the installation of a 37-MW solar array on the grounds of Brookhaven National Laboratory (BNL), a study of the impacts of clouds on the output of the solar array has been undertaken. The study emphasis is on predicting the change in surface solar radiation resulting from the observed/forecast cloud field on a 5-minute time scale. At these time scales, advection of cloud elements over the solar array is of particular importance. As part of the BNL Aerosol Life Cycle Intensive Operational Period (IOP), a 915-MHz Radar Wind Profiler (RWP) was deployed to determine the profile of low-level horizontal winds and the depth of the planetary boundary layer. The initial deployment mission of the 915-MHz RWP, cloud forecasting, was expanded to provide horizontal wind measurements for estimating and constraining cloud advection speeds. A secondary focus is the observation of the dynamics and microphysics of precipitation during cold-season/winter storms on Long Island. In total, the profiler was deployed at BNL for one year, from May 2011 through May 2012.
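The forecasting step described above, turning profiler-measured winds into short-term cloud arrival estimates over the array, can be sketched as a simple frozen-advection calculation. This is an illustration only: the function names and the nearest-level wind lookup are assumptions for the sketch, not the campaign's actual algorithm.

```python
def wind_at_height(profile, height_m):
    """Nearest-level speed lookup in a wind profile given as
    (height_m, speed_m_per_s) pairs, e.g. from a 915-MHz RWP."""
    return min(profile, key=lambda lev: abs(lev[0] - height_m))[1]

def advection_time_min(distance_km, profile, cloud_base_m):
    """Minutes until a cloud element observed distance_km upwind reaches
    the array, assuming it is carried unchanged ("frozen") at the wind
    speed measured nearest to cloud-base height."""
    speed = wind_at_height(profile, cloud_base_m)
    if speed <= 0:
        raise ValueError("no advection toward the array")
    return distance_km * 1000.0 / speed / 60.0
```

With a 10 m/s cloud-base wind, a cloud field only 3 km upwind reaches the array in about 5 minutes, which is why constraining advection speed with the RWP matters on the study's 5-minute forecast scale.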

  1. TWENTY-YEAR PLANNING STUDY FOR THE RELATIVISTIC HEAVY ION COLLIDER FACILITY AT BROOKHAVEN NATIONAL LABORATORY.

    Energy Technology Data Exchange (ETDEWEB)

    LUDLAM, T., ET AL.

    2003-12-31

    At the request of DOE's Office of Nuclear Physics (ONP), Brookhaven National Laboratory (BNL) has created this planning document to assemble and summarize a planning exercise that addresses the core scientific thrust of the Relativistic Heavy Ion Collider (RHIC) for the next twenty years and the facilities operation plan that will support this program. The planning work was carried out by BNL in close collaboration with the RHIC user community and within budgetary guidelines for the next five years supplied by the ONP. The resulting plans were reviewed by the BNL High Energy and Nuclear Physics Program Advisory Committee (PAC) at a special RHIC planning meeting held in December 2003. Planning input from each of the four RHIC experimental collaborations was absolutely central to the preparation of this overall Laboratory plan. Each collaboration supplied two key documents, a five-year "Beam Use Proposal" and a ten-year "Decadal Plan". These plans are posted on the BNL website http://www.bnl.gov/henp/, along with other planning documents germane to this paper, such as the complete written reports from the August and December 2003 PAC meetings that considered the five-year and decadal planning documents of the four RHIC collaborations and offered advice and commentary on these plans. Only in these collaboration documents can the full physics impact of the RHIC program be seen and the full scope of the efforts put into this planning process be appreciated. For this reason, the maximum value of the present planning paper can only be realized by making frequent reference to the collaboration documents.

  2. Strong interactions of hyperons. [Summaries of research activities of Brookhaven National Laboratory]

    Energy Technology Data Exchange (ETDEWEB)

    Nemethy, P.; Hungerbuehler, V.; Majka, R.

    1975-01-01

    A summary of the strong-interaction results obtained with the Yale-FNAL-BNL hyperon beam at the Brookhaven AGS is presented. Differential cross sections are reported for hyperon-proton elastic scattering, with samples of 6200 Σ⁻p events and 67 Ξ⁻p events. A search for hyperon resonances in inelastic scattering is also reported. Finally, the prospects for new results on hyperon interactions are reviewed.

  3. National Energy Strategy: A compilation of public comments; Interim Report

    Energy Technology Data Exchange (ETDEWEB)

    1990-04-01

    This Report presents a compilation of what the American people themselves had to say about problems, prospects, and preferences in energy. The Report draws on the National Energy Strategy public hearing record and accompanying documents. In all, 379 witnesses appeared at the hearings to exchange views with the Secretary, Deputy Secretary, and Deputy Under Secretary of Energy, and Cabinet officers of other Federal agencies. Written submissions came from more than 1,000 individuals and organizations. Transcripts of the oral testimony and question-and-answer (Q-and-A) sessions, as well as prepared statements submitted for the record and all other written submissions, form the basis for this compilation. Citations of these sources in this document use a system of identifying symbols explained below and in the accompanying box. The Report is organized into four general subject areas concerning: (1) efficiency in energy use, (2) the various forms of energy supply, (3) energy and the environment, and (4) the underlying foundations of science, education, and technology transfer. Each of these, in turn, is subdivided into sections addressing specific topics --- such as (in the case of energy efficiency) energy use in the transportation, residential, commercial, and industrial sectors, respectively. 416 refs., 44 figs., 5 tabs.

  4. Brookhaven Highlights, January 1982-March 1983

    Energy Technology Data Exchange (ETDEWEB)

    Kuper, J.B.H.; Rustad, M.C. (eds.)

    1983-01-01

    Research at Brookhaven National Laboratory is summarized. Major headings are high energy physics, physics and chemistry, life sciences, applied energy science, support activities and administration. (GHT)

  5. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory's hazardous waste management facility

    Energy Technology Data Exchange (ETDEWEB)

    Dionne, B.J.; Morris, S.C. III; Baum, J.W. [and others]

    1998-01-01

    The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in its guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low As Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique. This document contains the appendices for the report.

  6. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory's hazardous waste management facility

    Energy Technology Data Exchange (ETDEWEB)

    Dionne, B.J.; Morris, S. III; Baum, J.W. [and others]

    1998-03-01

    The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in its guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low As Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique.

  7. Higher-order-mode absorbers for energy recovery linac cryomodules at Brookhaven National Laboratory

    Directory of Open Access Journals (Sweden)

    H. Hahn

    2010-12-01

    Several future accelerator projects at Brookhaven for the Relativistic Heavy Ion Collider (RHIC) are based on energy recovery linacs (ERLs) with high-charge, high-current electron beams. Their stable operation mandates effective higher-order-mode (HOM) damping. The development of HOM dampers for these projects is pursued actively at this laboratory. Strong HOM damping was experimentally demonstrated both at room and at superconducting (SC) temperatures in a prototype research and development (R&D) five-cell niobium superconducting rf (SRF) cavity with ferrite dampers. Two room-temperature mock-up five-cell copper cavities were used to study various damper configurations, with emphasis on capacitive antenna dampers. An innovative type of ferrite damper over a ceramic break was also developed for an R&D SRF electron gun. For future SRF linacs, longer cryomodules comprising multiple superconducting cavities with reasonably short intercavity transitions are planned. In such a configuration the dampers, located closer to the cavities, will be at cryogenic temperatures; this will impose additional constraints and complications. This paper presents the results of simulations and measurements of several damper configurations.

  8. BROOKHAVEN NATIONAL LABORATORY SOURCE WATER ASSESSMENT FOR DRINKING WATER SUPPLY WELLS

    Energy Technology Data Exchange (ETDEWEB)

    BENNETT,D.B.; PAQUETTE,D.E.; KLAUS,K.; DORSCH,W.R.

    2000-12-18

    The BNL water supply system meets all water quality standards and has sufficient pumping and storage capacity to meet current and anticipated future operational demands. Because BNL's water supply is drawn from the shallow Upper Glacial aquifer, BNL's source water is susceptible to contamination. The quality of the water supply is being protected through (1) a comprehensive program of engineered and operational controls of existing aquifer contamination and potential sources of new contamination, (2) groundwater monitoring, and (3) potable water treatment. The BNL Source Water Assessment found that the source water for BNL's Western Well Field (comprising Supply Wells 4, 6, and 7) faces relatively few contamination threats, and the potential sources identified are already being carefully managed. The source water for BNL's Eastern Well Field (comprising Supply Wells 10, 11, and 12) faces a moderate number of threats to water quality, primarily from several existing volatile organic compound (VOC) and tritium plumes. The g-2 Tritium Plume and portions of the Operable Unit III VOC plume fall within the delineated source water area for the Eastern Well Field. In addition, portions of the much more slowly migrating strontium-90 plumes associated with the Brookhaven Graphite Research Reactor, the Waste Concentration Facility, and Building 650 lie within the Eastern source water area. However, the rate of travel in the aquifer for strontium-90 is about one-twentieth of that for tritium and volatile organic compounds. The Laboratory has been carefully monitoring plume migration and has made adjustments to water supply operations. Although a number of BNL's water supply wells were impacted by VOC contamination in the late 1980s, recent routine analysis of water samples from BNL's supply wells indicates that no drinking water standards have been reached or exceeded. The high quality of the water supply strongly indicates that the operational and engineered

  9. Evaluation of alternative designs for an injectable subsurface barrier at the Brookhaven National Laboratory Site, Long Island, New York

    Science.gov (United States)

    Moridis, George J.; Finsterle, Stefan; Heiser, John

    1999-10-01

    Two alternative designs for the demonstration emplacement of a viscous liquid barrier (VLB) at the Brookhaven National Laboratory (BNL), Long Island, New York, are investigated by means of numerical simulation. The application of the VLB technology at the BNL site involved a surface-modified colloidal silica (CS), which gels upon addition of an appropriate electrolyte. Lance injection was used for the CS barrier emplacement. The lance injections occur in three stages: primary, secondary, and tertiary. The geometry of the barrier is based on the wedge model. The first design is based on optimization principles and determines the parameters that maximize uniformity and minimize permeability by minimizing an appropriate objective function while meeting the design criteria. These include a maximum hydraulic conductivity of 10⁻⁷ cm/s and a minimum thickness of 1 m. The second design aims to meet the same criteria and reflects standard chemical grouting practices. The combined effects of the key design parameters (i.e., lance spacing, injection location and spacing, gel time, injection rate, and volume) on the barrier permeability are studied. The optimization-based design is shown to have a significantly better performance than the standard engineering design. The interpenetration of adjacent CS bulbs appears to be of critical importance in meeting the barrier specifications. The three-dimensional simulations show that the barrier performance depends heavily on the path by which the final state is achieved. The in situ field measurements of the barrier permeability are consistent with, and appear to validate, the model predictions.
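The optimization-based design described in this abstract minimizes an objective function subject to the two stated criteria: hydraulic conductivity at most 10⁻⁷ cm/s and thickness at least 1 m. The general idea can be sketched with a penalty-based parameter screen; the barrier model, function names, and all numbers below are illustrative stand-ins, not the paper's simulator or its actual design parameters.

```python
def barrier_model(spacing_m, volume_l):
    """Toy stand-in for the numerical simulator: closer lances and larger
    injections give a thicker, less permeable barrier (illustrative only)."""
    thickness = 0.9 * volume_l / 100.0 / spacing_m        # m
    conductivity = 1e-6 * spacing_m / (volume_l / 100.0)  # cm/s
    return thickness, conductivity

def objective(spacing_m, volume_l, k_max=1e-7, t_min=1.0):
    """Grout cost proxy plus large penalties for violating the two
    design criteria (conductivity <= k_max, thickness >= t_min)."""
    t, k = barrier_model(spacing_m, volume_l)
    cost = volume_l / spacing_m  # proxy for grout volume per metre of wall
    penalty = 0.0
    if k > k_max:
        penalty += 1e6 * (k / k_max - 1.0)
    if t < t_min:
        penalty += 1e6 * (t_min / t - 1.0)
    return cost + penalty

# Screen a small grid of candidate (lance spacing, injection volume) pairs
candidates = [(s, v) for s in (0.5, 0.75, 1.0) for v in (100, 200, 400, 800)]
best = min(candidates, key=lambda p: objective(*p))
```

The penalty terms steer the search toward feasible designs first, and the cost term then picks the cheapest among them, which is the essence of minimizing an objective function while meeting design criteria.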

  10. RADIATION MEASUREMENTS BY BROOKHAVEN NATIONAL LABORATORY DURING THE WOODS HOLE OCEANOGRAPHIC INSTITUTION INTERCOMPARISON STUDY, MAY-JUNE 2000.

    Energy Technology Data Exchange (ETDEWEB)

    REYNOLDS, R.M.; BARTHOLOMEW, M.J.; MILLER, M.A.; SMITH, S.; EDWARDS, R.

    2000-12-01

    The WHOI buoy radiometer intercomparison took place during May and June 2000 at the WHOI facility. The WHOI IMET, JAMSTEC Triton, and NOAA TAO buoy systems were operated from a beach site, and Brookhaven National Laboratory set up two identical Portable Radiation Package (PRP) systems, named "P01" and "P02", alongside the WHOI instrumentation on the roof of the Clark Building, about 300 m away. Buoy instruments were all leveled to within ±1° of horizontal. The purpose of the project was to compare the buoy systems with precision measurements so that any differences in data collection or processing could be evaluated. BNL was pleased to participate so that the PRP system could be evaluated as a calibration tool. The Portable Radiation Package is an integral component of the BNL Shipboard Oceanographic and Atmospheric Radiation (SOAR) system. It is designed to make accurate downwelling radiation measurements, including the three solar irradiance components (direct normal, diffuse, and global) at six narrowband channels, aerosol optical depth measurements, and broadband longwave and shortwave irradiance measurements.

  11. Brookhaven National Laboratory's capabilities for advanced analyses of cyber threats

    Energy Technology Data Exchange (ETDEWEB)

    DePhillips, M. P. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2014-01-01

    BNL has several ongoing, mature, and successful programs and areas of core scientific expertise that readily could be modified to address problems facing national security and efforts by the IC related to securing our nation’s computer networks. In supporting these programs, BNL houses an expansive, scalable infrastructure built exclusively for transporting, storing, and analyzing large disparate data-sets. Our ongoing research projects on various infrastructural issues in computer science undoubtedly would be relevant to national security. Furthermore, BNL frequently partners with researchers in academia and industry worldwide to foster unique and innovative ideas for expanding research opportunities and extending our insights. Because the basic science conducted at BNL is unique, such projects have led to advanced techniques, unlike any others, to support our mission of discovery. Many of them are modular techniques, thus making them ideal for abstraction and retrofitting to other uses including those facing national security, specifically the safety of the nation’s cyber space.

  12. Assessment of energy efficiency project financing alternatives for Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    WDM Hunt; JC Hail; GP Sullivan

    2000-03-13

    Energy reduction goals for Federal agencies were first established in the National Energy Conservation Policy Act of 1988, which directed a 10% reduction in facility energy use relative to a 1985 baseline. Since that time, Federal sites have been actively seeking and implementing a wide variety of energy-efficiency measures in facilities across the Federal sector. In the intervening years this energy reduction goal has been progressively increased to 20% through legislation (Public Law 102-486, the Energy Policy Act of 1992) and a number of Executive Orders. Executive Order 13123, Greening the Government Through Efficient Energy Management (signed June 3, 1999), further increased the facility energy-efficiency improvement goal from 30% by 2005 to 35% by 2010, relative to the 1985 baseline.

  13. Insects of the Idaho National Laboratory: A compilation and review

    Science.gov (United States)

    Nancy Hampton

    2005-01-01

    Large tracts of important sagebrush (Artemisia L.) habitat in southeastern Idaho, including thousands of acres at the Idaho National Laboratory (INL), continue to be lost and degraded through wildland fire and other disturbances. The roles of most insects in sagebrush ecosystems are not well understood, and the effects of habitat loss and alteration...

  14. Compilation of Earthquakes from 1850-2007 within 200 miles of the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    N. Seth Carpenter

    2010-07-01

    An updated earthquake compilation was created for the years 1850 through 2007 within 200 miles of the Idaho National Laboratory. To generate this compilation, earthquake catalogs were collected from several contributing sources and searched for redundant events using the search criteria established for this effort. For all sets of duplicate events, a preferred event was selected, largely based on epicenter-network proximity. All unique magnitude information for each event was added to the preferred event records and these records were used to create the compilation referred to as “INL1850-2007”.
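The catalog-merging procedure described above (search contributing catalogs for redundant events under fixed criteria, select a preferred record largely by epicenter-network proximity, and carry along every magnitude estimate) can be sketched roughly as follows. The time/distance tolerances, record fields, and the one-site-per-network simplification are assumptions for illustration, not the actual INL1850-2007 criteria.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in km."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def merge_catalogs(catalogs, network_sites, time_tol_s=30.0, dist_tol_km=50.0):
    """Merge event lists; events within the time and distance tolerances are
    treated as duplicates.  The preferred record is the one whose reporting
    network's site lies closest to the epicenter; magnitude estimates from
    all duplicates are kept on the preferred record."""
    events = [e for cat in catalogs for e in cat]
    events.sort(key=lambda e: e["time"])
    merged = []
    for ev in events:
        dup = next((m for m in merged
                    if abs(m["time"] - ev["time"]) <= time_tol_s
                    and haversine_km(m["lat"], m["lon"], ev["lat"], ev["lon"]) <= dist_tol_km),
                   None)
        if dup is None:
            merged.append(dict(ev, mags=[(ev["net"], ev["mag"])]))
        else:
            dup["mags"].append((ev["net"], ev["mag"]))
            # Keep whichever solution came from the nearer network
            d_old = haversine_km(dup["lat"], dup["lon"], *network_sites[dup["net"]])
            d_new = haversine_km(ev["lat"], ev["lon"], *network_sites[ev["net"]])
            if d_new < d_old:
                dup.update(time=ev["time"], lat=ev["lat"], lon=ev["lon"],
                           mag=ev["mag"], net=ev["net"])
    return merged
```

A duplicate pair thus collapses to a single preferred event that still records every network's magnitude, which is what allows a compilation to preserve "all unique magnitude information" per event.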

  15. Application of the SmartSampling Methodology to the Evaluation of Contaminated Landscape Soils at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    RAUTMAN,CHRISTOPHER A.

    2000-08-01

    Portions of the SmartSampling™ analysis methodology have been applied to the evaluation of radioactively contaminated landscape soils at Brookhaven National Laboratory. Specifically, the spatial, volumetric distribution of cesium-137 (¹³⁷Cs) contamination within Area of Concern 16E-1 has been modeled probabilistically using a geostatistical methodology, with the purpose of identifying the likelihood of successfully reducing, with respect to a pre-existing baseline remediation plan, the volume of soil that must be disposed of offsite during clean-up. The principal objective of the analysis was to evaluate the likelihood of successful deployment of the Segmented Gate System (SGS), a novel remediation approach that emphasizes real-time separation of clean from contaminated materials during remediation operations. One primary requirement for successful application of the segmented gate technology is that a variety of contaminant levels exist at the deployment site, which would enable the SGS to discriminate material above and below a specified remediation threshold value. The results of this analysis indicate that there is potential for significant volume reduction with respect to the baseline remediation plan at a threshold excavation level of 23 pCi/g ¹³⁷Cs. A reduction of approximately 50%, from a baseline volume of approximately 1,064.7 yd³ to less than 550 yd³, is possible with acceptance of only a very small level of engineering risk. The vast majority of this volume reduction is obtained by not excavating almost all of levels 3 and 4 (from 12 to 24 inches in depth), which appear to be virtually uncontaminated based on the available data. Additional volume reductions related to soil materials on levels 1 (depths of 0-6 inches) and 2 (6-12 inches) may be possible, specifically through use of the SGS technology. Level-by-level evaluation of simulation results suggests that as much as 26 percent of level 1 and as
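The probabilistic volume estimate behind this kind of geostatistical analysis can be illustrated with a small sketch: given equiprobable simulated realizations of the ¹³⁷Cs field, a cell is marked for excavation when its simulated probability of exceeding the cleanup threshold is greater than the accepted engineering risk. The function and its parameters are illustrative assumptions, not the SmartSampling implementation.

```python
def excavation_volume(realizations, threshold=23.0, risk=0.05, cell_yd3=1.0):
    """Volume to excavate, given `realizations` as a list of equiprobable
    simulated maps (each a list of cell concentrations in pCi/g): a cell
    is dug out when its exceedance probability is above the accepted
    engineering risk."""
    n_sims = len(realizations)
    volume = 0.0
    for i in range(len(realizations[0])):
        p_exceed = sum(r[i] > threshold for r in realizations) / n_sims
        if p_exceed > risk:
            volume += cell_yd3
    return volume
```

Comparing this volume against the baseline (excavating every cell) gives the volume reduction, and raising the accepted risk trades a smaller excavated volume against a higher chance of leaving contaminated soil in place.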

  16. Energy-related perturbations of the northeast coastal zone: five years (1974-1979) of oceanographic research at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, J.J.

    1980-03-01

    Since the inception of oceanographic research at Brookhaven National Laboratory in 1974, over 75 cruises and 150 papers and reports have been completed. Through comparison of shelf ecosystems at high, mid, and low latitudes, an understanding of the natural variability of US coastal waters has been derived. Annual carbon and nitrogen budgets suggest that within the Mid-Atlantic Bight the energy flow is diverted to a pelagic food web in summer-fall and a demersal food web in winter-spring. The impact of energy-related perturbations can now be assessed within the context of the natural oscillation of the coastal food web.

  17. Brookhaven highlights, October 1, 1989--September 30, 1990

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, M.S.; Cohen, A.; Greenberg, D.; Seubert, L.; Kuper, J.B.H. (eds.)

    1990-01-01

    This report discusses research being conducted at Brookhaven National Laboratory. Highlights from all the departments are illustrated. The main topics are accelerator development and applications. (LSP)

  18. Compilation of a soil map for Nigeria: a nation-wide soil resource ...

    African Journals Online (AJOL)

    This paper presents the results of a nation-wide soil and land form inventory of Nigeria. The data compilation was conducted in the framework of two projects with the objective to calculate agricultural production potential under different input levels and assess the water erosion hazard. The information on spatial distribution ...

  19. Brookhaven Linac Isotope Producer

    Data.gov (United States)

    Federal Laboratory Consortium — The Brookhaven Linac Isotope Producer (BLIP)—positioned at the forefront of research into radioisotopes used in cancer treatment and diagnosis—produces commercially...

  20. Brookhaven highlights, October 1979-September 1980

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    Highlights are given for the research areas of the Brookhaven National Laboratory. These areas include high energy physics, physics and chemistry, life sciences, applied energy science (energy and environment, and nuclear energy), and support activities (including mathematics, instrumentation, reactors, and safety). (GHT)

  1. Report on summary results of the inspection of issues regarding the scope of the accident investigation of the TRISTAN Fire at the Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

    The subject final report is provided to inform you of our findings and recommendations concerning our review of issues regarding the scope of the accident investigation of a March 31, 1994, fire at the Terrific Reactor Isotope Separator To Analyze Nuclides (TRISTAN) experiment at the Department of Energy (DOE) Brookhaven National Laboratory (BNL), Upton, New York. The Chicago Operations Office (CH) Manager appointed a Type B Accident Investigation Board (Board) to investigate the fire. In a June 16, 1994, letter to the Inspector General, DOE, the CH Manager requested the Inspector General to look into an allegation by a former Board member that senior Chicago management consciously violated the requirements of DOE Order 5484.1, "Environmental Protection, Safety, And Health Protection Information Reporting Requirements," in attempting to control the investigation. The former Board member alleged that there was not a clear verbal agreement among the Board members regarding the focus of the scope of the investigation. He said that the Board Chairman wanted to focus on the physical causes of the fire, while he (the former Board member) believed that the Board should focus on the apparent management deficiencies that allowed TRISTAN to operate without a proper safety analysis and in violation of DOE orders for so many years.

  2. University-level Non-proliferation and Safeguards Education and Human Capital Development Activities at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Bachner K. M.; Pepper, S.; Gomera, J.; Einwechter, M.; Toler, L. T.

    2016-07-24

    BNL has offered "Nuclear Nonproliferation, Safeguards and Security in the 21st Century," referred to as NNSS, every year since 2009 for graduate students in technical and policy fields related to nuclear safeguards and nonproliferation. The course focuses on relevant policy issues, in addition to technical components, and is part of a larger NGSI short course initiative that includes separate courses delivered at three other national laboratories and NNSA headquarters. [SCHOLZ and ROSENTHAL] The course includes lectures from esteemed nonproliferation experts, tours of various BNL facilities and laboratories, and in-field and table-top exercises on both technical and policy subjects. Topics include the history of the Treaty on the Non-proliferation of Nuclear Weapons (NPT) and other relevant treaties, the history of and advances in international nuclear safeguards, current relevant political situations in countries such as Iran, Iraq, and the Democratic People's Republic of Korea (DPRK), nuclear science and technology, instrumentation and techniques used for verification activities, and associated research and development. The students conduct a mock Design Information Verification (DIV) at BNL's decommissioned Medical Research Reactor. The capstone of the course is a series of student presentations in which students act as policy advisors and provide recommendations in response to scenarios, prepared by the course organizers, involving a current nonproliferation-related event. The course is open to domestic and foreign students and caters to students in, entering, or recently having completed graduate school. Interested students must complete an application and provide a resume and a statement describing their interest in the course. Eighteen to 22 students attend annually; 165 students have completed the course to date. A stipend helps to defray students' travel and subsistence expenses.
In 2015, the course was shortened from three weeks to

  3. Brookhaven highlights for fiscal year 1991, October 1, 1990--September 30, 1991

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, M.S.; Cohen, A.; Greenberg, D.; Seubert, L.; Kuper, J.B.H.

    1991-12-31

    This report highlights Brookhaven National Laboratory's activities for fiscal year 1991. Topics from the four research divisions: Computing and Communications, Instrumentation, Reactors, and Safety and Environmental Protection are presented. The research programs at Brookhaven are diverse, as is reflected by the nine different scientific departments: Accelerator Development, Alternating Gradient Synchrotron, Applied Science, Biology, Chemistry, Medical, National Synchrotron Light Source, Nuclear Energy, and Physics. Administrative and managerial information about Brookhaven is also presented. (GHH)

  5. Teaming with Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Yang, G. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2017-06-06

    BNL has started growing CdZnTeSe crystals for room-temperature radiation detector applications. The addition of Se to CdTe reduces the concentration of secondary phases and sub-grain boundary networks. The addition of Zn increases the energy band gap. Material characterization to understand the limiting factors of radiation detection materials and to improve their properties continues to be a core element of an extensive R&D program at LBNL's ALS and BNL's NSLS-II synchrotron facilities. ALS's Beamline 3.3.2 is available for 60% of its beamtime and allows us to perform both White Beam X-ray Diffraction Topography and Micron Scale X-ray Detector Mapping. The latter technique is extremely useful when measuring scintillators because it allows us to subtract contributions from the variability in the counting statistics, and also the fluctuations due to delta electrons, and non-proportionality. BNL has recently developed a new type of thermal neutron detector using pad technology in combination with 3He gas operated in ionization mode. The new detector is used for coded aperture thermal neutron imaging.

  6. ATLAS Overview Week at Brookhaven

    CERN Multimedia

    Pilcher, J

    Over 200 ATLAS participants gathered at Brookhaven National Laboratory during the first week of June for our annual overview week. Some system communities arrived early and held meetings on Saturday and Sunday, and the detector interface group (DIG) and Technical Coordination also took advantage of the time to discuss issues of interest for all detector systems. Sunday was also marked by a workshop on the possibilities for heavy ion physics with ATLAS. Beginning on Monday, and for the rest of the week, sessions were held in common in the well equipped Berkner Hall auditorium complex. Laptop computers became the norm for presentations and a wireless network kept laptop owners well connected. Most lunches and dinners were held on the lawn outside Berkner Hall. The weather was very cooperative and it was an extremely pleasant setting. This picture shows most of the participants from a view on the roof of Berkner Hall. Technical Coordination and Integration issues started the reports on Monday and became a...

  7. Brookhaven Lab and Argonne Lab scientists invent a plasma valve

    CERN Multimedia

    2003-01-01

    Scientists from Brookhaven National Laboratory and Argonne National Laboratory have received U.S. patent number 6,528,948 for a device that shuts off airflow into a vacuum about one million times faster than mechanical valves or shutters that are currently in use (1 page).

  8. Is Overeating Behavior Similar to Drug Addiction? (427th Brookhaven Lecture)

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Gene-Jack

    2007-09-27

    The increasing number of obese individuals in the U.S. and other countries world-wide adds urgency to the need to understand the mechanisms underlying pathological overeating. Research by the speaker and others at Brookhaven National Laboratory and elsewhere is compiling evidence that the brain circuits disrupted in obesity are similar to those involved in drug addiction. Using positron emission tomography (PET), the speaker and his colleagues have implicated brain dopamine in the normal and the pathological intake of food by humans. During the 427th Brookhaven Lecture, the speaker will review the findings and implications of PET studies of obese subjects and then compare them to PET research involving drug-addicted individuals. For example, in pathologically obese subjects, it was found that reductions in striatal dopamine D2 receptors are similar to those observed in drug-addicted subjects. The speaker and his colleagues have postulated that decreased levels of dopamine receptors predispose subjects to search for strongly rewarding reinforcers, be it drugs for the drug-addicted or food for the obese, as a means to compensate for decreased sensitivity of their dopamine-regulated reward circuits. As the speaker will summarize, multiple but similar brain circuits involved in reward, motivation, learning and inhibitory control are disrupted both in drug addiction and obesity, resulting in the need for a multimodal approach to the treatment of obesity.

  9. Brookhaven leak reactor to close

    CERN Multimedia

    MacIlwain, C

    1999-01-01

    The DOE has announced that the High Flux Beam Reactor at Brookhaven is to close for good. Though the news was not unexpected researchers were angry the decision had been taken before the review to assess the impact of reopening the reactor had been concluded (1 page).

  10. Proceedings of the 1977 national synchrotron light source summer theory workshop, Brookhaven National Laboratory, Upton, New York, June 20--24, 1977

    Energy Technology Data Exchange (ETDEWEB)

    Krinsky, S. (ed.)

    1977-01-01

    A summer study was held in 1977 to provide definitive recommendations on the design of the National Synchrotron Light Source. The group attending the workshop included electron storage ring experts. Topics covered in the proceedings include: introduction to the lattice; partial list of corrective or diagnostic equipment; closed orbit errors in the 2.5 GeV x-ray source; linear horizontal-vertical coupling, vertical dispersion, and control of the vertical closed orbit at the wiggler; bunch lengthening and longitudinal stability; longitudinal stability with a Landau cavity; transverse coupled-bunch instabilities; coupling impedance and power dissipation for a step change in the vacuum chamber; bunch lengthening and widening; aperture limitations by sextupoles; second-order effects of correction sextupoles; horizontal aperture in x-ray ring; introduction of sinusoidal wiggler into the x-ray lattice; the Touschek effect; possible mode of injection into the x-ray ring; and an inquiry into the flexibility of the x-ray lattice. (GHT)

  11. PROCEEDINGS OF THE 2001 NATIONAL OILHEAT RESEARCH ALLIANCE TECHNOLOGY CONFERENCE HELD AT BROOKHAVEN NATIONAL LABORATORY, UPTON, N.Y., APRIL 30 - MAY 1, 2001.

    Energy Technology Data Exchange (ETDEWEB)

    MCDONALD, R.J.

    2001-04-30

    BNL is proud to acknowledge all of our 2001 sponsors; with their help and support, this has truly become an oilheat industry conference. It is quite gratifying to see an industry come together to support an activity like the technology conference, for the benefit of the industry as a whole and to celebrate the beginning of the National Oilheat Research Alliance. This meeting is the fourteenth oilheat industry technology conference to be held since 1984, the first under a new name, NORA, the National Oilheat Research Alliance, and the very first in the new century. The conference is a very important part of the technology-transfer effort supported by the Oilheat Research Program. The Oilheat Research Program at BNL is under newly assigned program management at the Office of Power Technology within the US DOE. The foremost reason for the conference is to provide a platform for the exchange of information and perspectives among international researchers, engineers, manufacturers, service technicians, and marketers of oil-fired space-conditioning equipment. The conference provides a conduit by which information and ideas can be exchanged to examine present technologies, as well as to help chart the future course of oil-heating advancement. These conferences also serve as a stage for unifying government representatives, researchers, fuel oil marketers, and other members of the oilheat industry in addressing technology advancements in this important energy-use sector. The specific objectives of the conference are to: (1) identify and evaluate the current state of the art and recommend new initiatives for higher efficiency, a cleaner environment, and satisfying consumer needs cost-effectively, reliably, and safely; (2) foster cooperative interactions among federal and industrial representatives toward the common goal of sustained economic growth and energy security via energy conservation. Seventeen technical presentations will be made

  12. Development of an Index for Evaluating National Quality Competitiveness based on WEF and IMD Compiled Indices

    Directory of Open Access Journals (Sweden)

    Insu Cho

    2014-07-01

    This study was conducted to develop a national quality competitiveness index (NQCI), necessary to establish national policy standards for quality improvements. For the NQCI, the World Economic Forum (WEF) and International Institute for Management Development (IMD) indices, internationally recognized measures of national competitiveness, were used. Based on the WEF and IMD indices, an NQCI composed of four perspectives (capabilities) and 34 indicators was devised. The results of applying the NQCI to 20 selected countries allow comparison and identification of each country's current standing in national quality competitiveness. From these findings, countries' weaknesses in quality competitiveness can be identified and improved through policy suggestions.

  13. National Status and Trends: Bioeffects Assessment Program Sites (1986 to present) Compiled from NOAA's National Centers for Coastal Ocean Science

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains sample collection location information for the National Status and Trends, Bioeffects Assessment Project. The Bioeffects Assessment Sites data...

  14. Compiling a national resistivity atlas of Denmark based on airborne and ground-based transient electromagnetic data

    DEFF Research Database (Denmark)

    Barfod, Adrian; Møller, Ingelise; Christiansen, Anders Vest

    2016-01-01

    We present a large-scale study of the petrophysical relationship of resistivities obtained from densely sampled ground-based and airborne transient electromagnetic surveys and lithological information from boreholes. The overriding aim of this study is to develop a framework for examining ... the resistivity-lithology relationship in a statistical manner and apply this framework to gain a better description of the large-scale resistivity structures of the subsurface. In Denmark very large and extensive datasets are available through the national geophysical and borehole databases, GERDA and JUPITER ..., respectively. In a 10 by 10 km grid, these data are compiled into histograms of resistivity versus lithology. To do this, the geophysical data are interpolated to the position of the boreholes, which allows for a lithological categorization of the interpolated resistivity values, yielding different histograms
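
The workflow described in the abstract (interpolate resistivity to borehole positions, then bin by lithology) can be sketched as follows. This is an illustrative stand-in only: the data are synthetic, nearest-neighbour assignment stands in for whatever interpolation the study actually uses, and nothing here is drawn from the GERDA or JUPITER databases.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in data: TEM sounding positions with resistivities,
# and borehole positions with lithology codes, inside one 10 km x 10 km cell.
tem_xy = rng.uniform(0, 10_000, size=(500, 2))          # metres
tem_rho = rng.lognormal(mean=4.0, sigma=1.0, size=500)  # ohm-m
bore_xy = rng.uniform(0, 10_000, size=(40, 2))
bore_lith = rng.choice(["sand", "clay", "chalk"], size=40)

# Interpolate resistivity to each borehole position: here, simply assign
# the resistivity of the nearest TEM sounding.
d2 = ((bore_xy[:, None, :] - tem_xy[None, :, :]) ** 2).sum(axis=-1)
rho_at_bore = tem_rho[d2.argmin(axis=1)]

# Histogram of log10-resistivity for each lithology class, i.e. one
# "resistivity versus lithology" summary per grid cell.
bins = np.linspace(0, 4, 21)  # log10(ohm-m)
hists = {lith: np.histogram(np.log10(rho_at_bore[bore_lith == lith]), bins)[0]
         for lith in np.unique(bore_lith)}
```

Repeating this per grid cell yields the cell-by-cell resistivity-lithology histograms the atlas compiles.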

  15. Compilation of requests for nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Weston, L.W.; Larson, D.C. [eds.]

    1993-02-01

    This compilation represents the current needs for nuclear data measurements and evaluations as expressed by interested fission and fusion reactor designers, medical users of nuclear data, nuclear data evaluators, CSEWG members and other interested parties. The requests and justifications are reviewed by the Data Request and Status Subcommittee of CSEWG as well as most of the general CSEWG membership. The basic format and computer programs for the Request List were produced by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory. The NNDC produced the Request List for many years. The Request List is compiled from a computerized data file. Each request has a unique isotope, reaction type, requestor and identifying number. The first two digits of the identifying number are the year in which the request was initiated. Every effort has been made to restrict the notations to those used in common nuclear physics textbooks. Most requests are for individual isotopes as are most ENDF evaluations, however, there are some requests for elemental measurements. Each request gives a priority rating which will be discussed in Section 2, the neutron energy range for which the request is made, the accuracy requested in terms of one standard deviation, and the requested energy resolution in terms of one standard deviation. Also given is the requestor with the comments which were furnished with the request. The addresses and telephone numbers of the requestors are given in Appendix 1. ENDF evaluators who may be contacted concerning evaluations are given in Appendix 2. Experimentalists contemplating making one of the requested measurements are encouraged to contact both the requestor and evaluator who may provide valuable information. This is a working document in that it will change with time. New requests or comments may be submitted to the editors or a regular CSEWG member at any time.
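
The record structure described in the abstract (unique isotope, reaction, requestor, an identifying number whose first two digits encode the year, plus priority, energy range, accuracy, and resolution) can be modelled as a small data type. The field names and example values below are illustrative assumptions, not the NNDC file format.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NuclearDataRequest:
    """One entry in the request list, per the fields named in the abstract.
    Field names and units here are hypothetical, not the actual format."""
    request_id: str        # first two digits encode the year of the request
    isotope: str
    reaction: str
    requestor: str
    priority: int
    e_min_mev: float       # neutron energy range for the request
    e_max_mev: float
    accuracy_pct: float    # requested accuracy, one standard deviation
    resolution_pct: float  # requested energy resolution, one standard deviation

    @property
    def year(self) -> int:
        """Year encoded by the two-digit prefix of the identifying number."""
        return 1900 + int(self.request_id[:2])  # the compilation predates 2000

# Hypothetical example entry.
req = NuclearDataRequest("93012", "235U", "(n,f)", "Weston", 1,
                         0.001, 20.0, 2.0, 5.0)
print(req.year)
```

Making the record immutable (frozen) matches the compilation's rule that each request keeps a unique, stable identifying number even as the working document evolves.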

  16. Minutes of the Fourth Annual Meeting of the Panel on Reference Nuclear Data, Brookhaven National Laboratory, November 1-2, 1979. [BNL, Nov. 1-2, 1979

    Energy Technology Data Exchange (ETDEWEB)

    Burrows, T.W.; Stewart, L.; Coyne, J.J. (eds.)

    1980-06-01

    After the welcome and approval of the agenda and of the minutes of the Third Annual Meeting, the participants turned to reactor physics data needs, CTR data needs, status of international and national cooperation, status and availability of data files, election of officers, status of publications, biomedical data needs, and miscellaneous action items from the Third Meeting. A summary of recommendations and action items is given. Eighteen appendixes are included. (RWR)

  17. Linear accelerator development at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Batchelor, K

    1976-01-01

    A description is given of operating experience on the 200 MeV Proton Injector Linac of the AGS, emphasizing developments in field phase and amplitude control and beam diagnostics. Developments in auxiliary use of the machine are also described.

  18. Brookhaven highlights, fiscal year 1985, October 1, 1984-September 30, 1985

    Energy Technology Data Exchange (ETDEWEB)

    1985-01-01

    Activities at Brookhaven National Laboratory are briefly discussed. These include work at the National Synchrotron Light Source, the High Flux Beam Reactor, and the Alternating Gradient Synchrotron. Areas of research include heavy ion reactions, neutrino oscillations, low-level waste, nuclear data, medicine, biology, chemistry, parallel computing, and optics. Also provided are general and administrative news and a financial report. (LEW)

  19. Brookhaven Lab physicist William Willis wins the 2003 W.K.H. Panofsky prize

    CERN Multimedia

    2003-01-01

    William Willis, a senior physicist at Brookhaven National Laboratory, has won the American Physical Society's 2003 W.K.H. Panofsky Prize in Experimental Particle Physics. He received the prize, which consists of $5,000 and a certificate citing his contributions to physics, at the APS meeting in Philadelphia on April 6 (1 page).

  20. Searching for the H dibaryon at Brookhaven

    Energy Technology Data Exchange (ETDEWEB)

    Schwartz, A.J. [Princeton Univ., NJ (United States)

    1994-12-01

    This paper reviews the status of current experiments at Brookhaven searching for the six-quark H dibaryon postulated by R. Jaffe in 1977. Two experiments, E813 and E888, have recently completed running and two new experiments, E836 and E885, are approved to run. The data recorded so far are under analysis and should have good sensitivity to both short-lived and long-lived Hs.

  1. The Brookhaven electron analogue, 1953--1957

    Energy Technology Data Exchange (ETDEWEB)

    Plotkin, M.

    1991-12-18

    The following topics concerning the Brookhaven electron analogue are discussed: L.J. Haworth and E.L. VanHorn letters; original G.K. Green outline for report; general description; parameter list; mechanical assembly; alignment; degaussing; vacuum system; injection system; the pulsed inflector; RF system; ferrite cavity; pick-up electrodes and preamplifiers; radio frequency power amplifier; lens supply; controls and power; and RF acceleration summary.

  2. Compilation of current high-energy physics experiments

    Energy Technology Data Exchange (ETDEWEB)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1981-05-01

    This is the fourth edition of the compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about April 1981, and (2) had not completed taking of data by 1 January 1977. Only approved experiments are included.

  3. Compilation of current high-energy-physics experiments

    Energy Technology Data Exchange (ETDEWEB)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1980-04-01

    This is the third edition of a compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and ten participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about January 1980, and (2) had not completed taking of data by 1 January 1976.

  4. 76 FR 50287 - Request for Public Comments To Compile the National Trade Estimate Report on Foreign Trade...

    Science.gov (United States)

    2011-08-12

    ... Foreign Trade Barriers and Reports on Sanitary and Phytosanitary and Standards-Related Foreign Trade... Representative (USTR) is required to publish annually the National Trade Estimate Report on Foreign Trade... statutory mandate and the Obama Administration's commitment to focus on the most significant foreign trade...

  5. Brookhaven highlights. Report on research, October 1, 1992--September 30, 1993

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, M.S.; Belford, M.; Cohen, A.; Greenberg, D.; Seubert, L. [eds.]

    1993-12-31

    This report highlights the research activities of Brookhaven National Laboratory during the period dating from October 1, 1992 through September 30, 1993. There are contributions to the report from different programs and departments within the laboratory. These include technology transfer, RHIC, Alternating Gradient Synchrotron, physics, biology, national synchrotron light source, applied science, medical science, advanced technology, chemistry, reactor physics, safety and environmental protection, instrumentation, and computing and communications.

  6. The national assessment of shoreline change: a GIS compilation of vector cliff edges and associated cliff erosion data for the California coast

    Science.gov (United States)

    Hapke, Cheryl; Reid, David; Borrelli, Mark

    2007-01-01

    The U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector cliff edges and associated rates of cliff retreat along the open-ocean California coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Cliff erosion is a chronic problem along many coastlines of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of coastal cliff retreat. There is also a critical need for these data to be consistent from one region to another. One objective of this work is to develop a standard, repeatable methodology for mapping and analyzing cliff edge retreat so that periodic, systematic, and internally consistent updates of cliff edge position and associated rates of erosion can be made at a national scale. This data compilation for open-ocean cliff edges for the California coast is a separate yet related study to that of Hapke and others (2006), which documents shoreline change along sandy shorelines of the California coast and is itself one in a series that includes the Gulf of Mexico and the Southeast Atlantic coast (Morton and others, 2004; Morton and Miller, 2005). Future reports and data compilations will include coverage of the Northeast U.S., the Great Lakes, Hawaii and Alaska. Cliff edge change is determined by comparing the positions of one historical cliff edge digitized from maps with a modern cliff edge derived from topographic LIDAR (light detection and ranging) surveys. Historical cliff edges for the California coast represent the 1920s-1930s time-period; the most recent cliff edge was delineated using data collected between 1998 and 2002. End-point rate calculations were used to evaluate rates of erosion between the two cliff edges. Please refer to our full report on cliff edge erosion along the California

  7. Calculating correct compilers

    OpenAIRE

    Bahr, Patrick; Hutton, Graham

    2015-01-01

    In this article we present a new approach to the problem of calculating compilers. In particular, we develop a simple but general technique that allows us to derive correct compilers from high-level semantics by systematic calculation, with all details of the implementation of the compilers falling naturally out of the calculation process. Our approach is based upon the use of standard equational reasoning techniques, and has been applied to calculate compilers for a wide range of language f...
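
The setting of the abstract (a high-level semantics, a target machine, and a correctness property linking them) can be illustrated with the classic toy instance: arithmetic expressions compiled to a stack machine. This sketch only shows the correctness property being checked on an example; the paper's contribution is deriving such compilers by equational calculation rather than testing them.

```python
# Source language: expressions of the form ('val', n) or ('add', e1, e2).

def eval_expr(e):
    """High-level semantics: the value an expression denotes."""
    if e[0] == "val":
        return e[1]
    return eval_expr(e[1]) + eval_expr(e[2])

def compile_expr(e):
    """Compile an expression to stack-machine code (PUSH n / ADD)."""
    if e[0] == "val":
        return [("PUSH", e[1])]
    return compile_expr(e[1]) + compile_expr(e[2]) + [("ADD",)]

def exec_code(code, stack=()):
    """Execute stack-machine code, returning the final stack."""
    for instr in code:
        if instr[0] == "PUSH":
            stack = (instr[1],) + stack
        else:  # ADD: replace the top two elements by their sum
            a, b, *rest = stack
            stack = (a + b, *rest)
    return stack

# Correctness property on an example: running the compiled code leaves
# exactly the semantic value of the expression on the stack.
e = ("add", ("val", 1), ("add", ("val", 2), ("val", 3)))
assert exec_code(compile_expr(e)) == (eval_expr(e),)
```

In the calculational approach, the definitions of compile_expr and exec_code are not written first and verified afterwards: they fall out of solving the correctness equation itself.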

  8. National assessment of shoreline change: A GIS compilation of vector shorelines and associated shoreline change data for the sandy shorelines of Kauai, Oahu, and Maui, Hawaii

    Science.gov (United States)

    Romine, Bradley M.; Fletcher, Charles H.; Genz, Ayesha S.; Barbee, Matthew M.; Dyer, Matthew; Anderson, Tiffany R.; Lim, S. Chyn; Vitousek, Sean; Bochicchio, Christopher; Richmond, Bruce M.

    2012-01-01

    Sandy ocean beaches are a popular recreational destination, and often are surrounded by communities that consist of valuable real estate. Development is increasing despite the fact that coastal infrastructure may be repeatedly subjected to flooding and erosion. As a result, the demand for accurate information regarding past and present shoreline changes is increasing. Working with researchers from the University of Hawaii, investigators with the U.S. Geological Survey's National Assessment of Shoreline Change Project have compiled a comprehensive database of digital vector shorelines and shoreline-change rates for the islands of Kauai, Oahu, and Maui, Hawaii. No widely accepted standard for analyzing shoreline change currently exists. Current measurement and rate-calculation methods vary from study to study, precluding the combination of study results into statewide or regional assessments. The impetus behind the National Assessment was to develop a standardized method for measuring changes in shoreline position that is consistent from coast to coast. The goal was to facilitate the process of periodically and systematically updating the measurements in an internally consistent manner. A detailed report on shoreline change for Kauai, Maui, and Oahu that contains a discussion of the data presented here is available and cited in the Geospatial Data section of this report.

  9. 1988 Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. It is a compilation of issues from the 1988 calendar year. A table of contents and an index are provided to assist in finding information.

  10. The National Assessment of Shoreline Change: A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the U.S. Gulf of Mexico

    Science.gov (United States)

    Miller, Tara L.; Morton, Robert A.; Sallenger, Asbury H.; Moore, Laura J.

    2004-01-01

    The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive database of digital vector shorelines and shoreline change rates for the U.S. Gulf of Mexico. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along most open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard repeatable methods for mapping and analyzing shoreline movement so that periodic updates regarding coastal erosion and land loss can be made nationally that are systematic and internally consistent. This data compilation for open-ocean, sandy shorelines of the Gulf of Mexico is the first in a series that will eventually include the Atlantic Coast, Pacific Coast, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are based on merging three historical shorelines with a modern shoreline derived from lidar (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1800s, 1920s-1930s, and 1970s. The most recent shoreline is derived from data collected over the period of 1998-2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are simple end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change in the Gulf of Mexico, National Assessment of Shoreline Change: Part 1, Historical Shoreline Changes and Associated Coastal Land Loss Along the U.S. Gulf of Mexico (USGS Open File
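
The two rate calculations named in the abstract (linear regression over all four shorelines for the long-term rate, end-point rate over the two most recent shorelines for the short-term rate) are simple to state in code. The survey years below match the epochs described in the abstract, but the shoreline positions are invented for illustration; a negative rate indicates erosion.

```python
import numpy as np

# Hypothetical cross-shore shoreline positions (metres) at one transect
# for the four survey epochs described in the abstract.
years     = np.array([1857.0, 1934.0, 1974.0, 2001.0])
positions = np.array([152.0, 138.0, 121.0, 102.0])

# Long-term rate: slope of a linear regression through all four shorelines.
long_term_rate = np.polyfit(years, positions, 1)[0]  # m/yr

# Short-term rate: end-point rate between the two most recent shorelines.
short_term_rate = (positions[-1] - positions[-2]) / (years[-1] - years[-2])

print(f"long-term {long_term_rate:.2f} m/yr, short-term {short_term_rate:.2f} m/yr")
```

The regression smooths over survey-to-survey noise across 150 years, while the end-point rate responds only to the most recent interval, which is why the two rates can differ substantially at the same transect.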

  11. The National Assessment of Shoreline Change: a GIS compilation of vector shorelines and associated shoreline change data for the U.S. southeast Atlantic coast

    Science.gov (United States)

    Miller, Tara L.; Morton, Robert A.; Sallenger, Asbury H.

    2006-01-01

    The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive database of digital vector shorelines and shoreline change rates for the U.S. Southeast Atlantic Coast (Florida, Georgia, South Carolina, North Carolina). These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along most open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard repeatable methods for mapping and analyzing shoreline movement so that periodic updates of shorelines and shoreline change rates can be made nationally that are systematic and internally consistent. This data compilation for open-ocean, sandy shorelines of the U.S. Southeast Atlantic Coast is the second in a series that already includes the Gulf of Mexico, and will eventually include the Pacific Coast, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are based on merging three historical shorelines with a modern shoreline derived from lidar (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1800s, 1920s-1930s, and 1970s. The most recent shoreline is derived from data collected over the period of 1997-2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are simple end-point rate calculations using the two most recent shorelines. Please refer to our full report on shoreline change for the U.S. 
Southeast Atlantic Coast at http://pubs.usgs.gov/of/2005/1401/ to get additional

  13. The National Assessment of Shoreline Change: A GIS Compilation of Vector Shorelines and Associated Shoreline Change Data for the Sandy Shorelines of the California Coast

    Science.gov (United States)

    Hapke, Cheryl J.; Reid, David

    2006-01-01

    Introduction The Coastal and Marine Geology Program of the U.S. Geological Survey has generated a comprehensive data clearinghouse of digital vector shorelines and shoreline change rates for the sandy shoreline along the California open coast. These data, which are presented herein, were compiled as part of the U.S. Geological Survey's National Assessment of Shoreline Change Project. Beach erosion is a chronic problem along many open-ocean shores of the United States. As coastal populations continue to grow and community infrastructures are threatened by erosion, there is increased demand for accurate information including rates and trends of shoreline migration. There is also a critical need for shoreline change data that is consistent from one coastal region to another. One purpose of this work is to develop standard, repeatable methods for mapping and analyzing shoreline movement so that periodic, systematic, and internally consistent updates of shorelines and shoreline change rates can be made at a national scale. This data compilation for open-ocean, sandy shorelines of the California coast is one in a series that already includes the Gulf of Mexico and the Southeast Atlantic Coast (Morton et al., 2004; Morton et al., 2005) and will eventually cover Washington, Oregon, and parts of Hawaii and Alaska. Short- and long-term shoreline change evaluations are determined by comparing the positions of three historical shorelines digitized from maps with a modern shoreline derived from lidar (light detection and ranging) topographic surveys. Historical shorelines generally represent the following time periods: 1850s-1880s, 1920s-1930s, and late 1940s-1970s. The most recent shoreline is from data collected between 1997 and 2002. Long-term rates of change are calculated by linear regression using all four shorelines. Short-term rates of change are end-point rate calculations using the two most recent shorelines. 
Please refer to our full report on shoreline change of the
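The two rate definitions that recur in the shoreline-change entries above (a short-term end-point rate between the two most recent shorelines, and a long-term linear-regression rate over all four) can be sketched as follows. This is an illustrative toy under assumed inputs, not the USGS project's actual implementation (which uses transect-based measurements in the Digital Shoreline Analysis System); the transect coordinates are hypothetical.

```python
# Toy sketch of the two shoreline-change-rate calculations described in the
# entries above. Positions are cross-shore distances (metres) from an
# arbitrary baseline; negative rates indicate erosion (landward retreat).

def end_point_rate(shorelines):
    """Short-term rate: displacement between the two most recent
    shorelines divided by the elapsed time (m/yr)."""
    (t1, x1), (t2, x2) = sorted(shorelines)[-2:]
    return (x2 - x1) / (t2 - t1)

def linear_regression_rate(shorelines):
    """Long-term rate: slope of the least-squares fit of shoreline
    position versus time, using all available shorelines."""
    n = len(shorelines)
    mean_t = sum(t for t, _ in shorelines) / n
    mean_x = sum(x for _, x in shorelines) / n
    num = sum((t - mean_t) * (x - mean_x) for t, x in shorelines)
    den = sum((t - mean_t) ** 2 for t, _ in shorelines)
    return num / den

# Four shorelines on one hypothetical transect: (year, position in metres).
transect = [(1855, 120.0), (1930, 95.0), (1975, 80.0), (2000, 70.0)]
print(end_point_rate(transect))          # short-term rate, m/yr
print(linear_regression_rate(transect))  # long-term rate, m/yr
```

The end-point rate uses only two shorelines and so is sensitive to short-lived fluctuations; the regression rate damps those by weighting all observations, which is why the assessments report both.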

  13. Compilation of current high energy physics experiments - Sept. 1978

    Energy Technology Data Exchange (ETDEWEB)

    Addis, L.; Odian, A.; Row, G. M.; Ward, C. E. W.; Wanderer, P.; Armenteros, R.; Joos, P.; Groves, T. H.; Oyanagi, Y.; Arnison, G. T. J.; Antipov, Yu; Barinov, N.

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche. (RWR)

  14. Engineering a compiler

    CERN Document Server

    Cooper, Keith D

    2012-01-01

    As computing has changed, so has the role of both the compiler and the compiler writer. The proliferation of processors, environments, and constraints demands an equally large number of compilers. To adapt, compiler writers retarget code generators, add optimizations, and work on issues such as code space or power consumption. Engineering a Compiler re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in today's practice. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best-practice algorithms for the major problems inside a compiler. Highlights: focuses on the back end of the compiler, reflecting the focus of research and development over the last decade; applies the well-developed theory behind scanning and parsing to introduce concepts that play a critical role in optimization and code generation; introduces the student to optimization through data-flow analysis, SSA form, and a selection of sc...
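The optimization idea the blurb mentions (data-flow analysis driving transformations such as constant propagation) can be illustrated with a toy forward pass over straight-line three-address code. The IR tuple shape and the `fold_constants` helper below are hypothetical, invented for this sketch; they are not the book's notation.

```python
# Toy sketch of a forward data-flow optimization: constant propagation and
# folding over straight-line three-address code. Each instruction is
# (dest, op, a, b) where op is "=" (copy/constant), "add", or "mul", and
# operands are ints or variable names.

def fold_constants(instrs):
    """Return a new instruction list with operands known to be constant
    substituted, and fully-constant operations folded at 'compile time'."""
    env, out = {}, []   # env maps variable name -> known constant value
    for dest, op, a, b in instrs:
        a = env.get(a, a)                    # substitute known constants
        b = env.get(b, b)
        if op == "=":
            if isinstance(a, int):
                env[dest] = a
            out.append((dest, "=", a, None))
        elif isinstance(a, int) and isinstance(b, int):
            val = a + b if op == "add" else a * b
            env[dest] = val
            out.append((dest, "=", val, None))  # folded to a constant
        else:
            out.append((dest, op, a, b))     # at least one unknown operand
    return out

prog = [
    ("a", "=", 4, None),
    ("b", "=", 5, None),
    ("c", "add", "a", "b"),   # both operands known: folds to 9
    ("d", "mul", "c", 2),     # folds to 18
    ("e", "add", "d", "n"),   # "n" unknown: only "d" is substituted
]
```

A production pass would run this as a lattice-based data-flow analysis over a control-flow graph (meeting values at joins) rather than a single left-to-right sweep, but the substitute-then-fold step is the same.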

  15. Brookhaven Lab physicists Edward Beebe and Alexander Pikin win 'Brightness Award' for achievement in ion source physics and technology

    CERN Multimedia

    2003-01-01

    "Edward Beebe and Alexander Pikin, physicists at the U.S. Department of Energy's Brookhaven National Laboratory, have been awarded the Ion Source Prize, known as the "Brightness Award," which recognizes and encourages innovative and significant recent achievements in the fields of ion source physics and technology" (1 page).

  16. Staff roster for 1979: National Center for Analysis of Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    This publication is a compilation of resumes from the current staff of the National Center for Analysis of Energy Systems. The Center, founded in January 1976, is one of four areas within the Department of Energy and Environment at Brookhaven National Laboratory. The emphasis of programs at the Center is on energy policy and planning studies at the regional, national, and international levels, involving quantitative, interdisciplinary studies of the technological, economic, social, and environmental aspects of energy systems. To perform these studies the Center has assembled a staff of experts in the areas of science, technology, economics, planning, health and safety, information systems, and quantitative analysis.

  17. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  18. ASTEROID SPIN VECTOR COMPILATION V5.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This is a comprehensive tabulation of asteroid spin vector determinations, compiled by Agnieszka Kryszczynska and based on the earlier compilation by Per Magnusson....

  19. Increased intensity performance of the Brookhaven AGS

    Energy Technology Data Exchange (ETDEWEB)

    Raka, E.; Ahrens, L.; Frey, W.; Gill, E.; Glenn, J.W.; Sanders, R.; Weng, W.

    1985-05-01

    With the advent of H⁻ injection into the Brookhaven AGS, circulating beams of up to 3 × 10^13 protons at 200 MeV have been obtained. Rf capture of 2.2 × 10^13 and acceleration of 1.73 × 10^13 up to the transition energy (≈8 GeV) and 1.64 × 10^13 to full energy (≈29 GeV) have been achieved. This represents a 50% increase over the best performance obtained with H⁺ injection. The increase in circulating beam current is obtained without filling the horizontal aperture. This allows the rf capture process to utilize a larger longitudinal phase space area (≈1 eV·s/bunch vs. ≤0.6 eV·s with H⁺ operation). The resulting reduction in relative longitudinal density partially offsets the increase in space-charge effects at higher currents. In order to make the capture process independent of injected beam current, a dynamic beam loading compensation loop was installed on the AGS rf system. This is the only addition to the synchrotron itself that was required to reach the new intensity records. A discussion of injection, the rf capture process, and space-charge effects is presented. 9 refs., 5 figs.

  20. Bedrock Outcrop Points Compilation

    Data.gov (United States)

    Vermont Center for Geographic Information — A compilation of bedrock outcrops as points and/or polygons from 1:62,500 and 1:24,000 geologic mapping by the Vermont Geological Survey, the United States...

  1. Bedrock Outcrop Polygons Compilation

    Data.gov (United States)

    Vermont Center for Geographic Information — A compilation of bedrock outcrops as points and/or polygons from 1:62,500 and 1:24,000 geologic mapping by the Vermont Geological Survey, the United States...

  2. The Evolution of Compilers

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 12, Issue 8, August 2007, pp. 8-26. General Article. Author affiliation: Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India.

  3. Compilation of Shona Children's

    African Journals Online (AJOL)

    Mev. R.B. Ruthven

    Institute (ALRI) team members in the compilation of the monolingual Shona Children's Dictionary. The focus is mainly on the problems met in headword selection. Solutions by the team members when dealing with these problems are also presented. Keywords: SHONA CHILDREN'S DICTIONARY, LOANWORDS, TABOO, ...

  4. Determination of floor response spectra for the Brookhaven HFBR reactor building structure

    Energy Technology Data Exchange (ETDEWEB)

    Subudhi, M.; Goradia, H.

    1978-11-01

    In order to perform the dynamic analysis of various structural components of the HFBR reactor building at Brookhaven National Laboratory (BNL) subjected to seismic disturbances, it is necessary to obtain the floor response spectra of the primary structure. The mathematical model includes the four floor levels of the internal structure, the dome, and soil spring effects. The standard time history analysis is adopted to obtain the response spectrum for each floor of the internal structure. This report summarizes the results both in tabular and graphical form for various damping values.

  5. Embedded Processor Oriented Compiler Infrastructure

    Directory of Open Access Journals (Sweden)

    DJUKIC, M.

    2014-08-01

    In recent years, research on special compiler techniques and algorithms for embedded processors has broadened the knowledge of how to achieve better compiler performance on irregular processor architectures. However, industrial-strength compilers, besides the ability to generate efficient code, must also be robust, understandable, maintainable, and extensible. This raises the need for a compiler infrastructure that provides means for convenient implementation of embedded-processor-oriented compiler techniques. The Cirrus Logic Coyote 32 DSP is an example that shows how traditional compiler infrastructure is not able to cope with the problem. That is why a new compiler infrastructure was developed for this processor, based on research in the field of embedded system software tools and experience in the development of industrial-strength compilers. The new infrastructure is described in this paper. Compiler-generated code quality is compared with code generated by the previous compiler for the same processor architecture.

  6. Elements of compiler design

    CERN Document Server

    Meduna, Alexander

    2007-01-01

    Contents: Preface; Introduction (Mathematical Preliminaries, Compilation, Rewriting Systems); Lexical Analysis (Models, Methods, Theory); Syntax Analysis (Models, Methods, Theory); Deterministic Top-Down Parsing (Predictive Sets and LL Grammars, Predictive Parsing); Deterministic Bottom-Up Parsing (Precedence Parsing, LR Parsing); Syntax-Directed Translation and Intermediate Code Generation (Bottom-Up Syntax-Directed Translation and Intermediate Code Generation, Top-Down Syntax-Directed Translation); Symbol Table; Semantic Analysis; Softw...

  7. Review of Brookhaven nuclear transparency measurements in (p ...

    Indian Academy of Sciences (India)

    Pramana – Journal of Physics, Volume 61, Issue 5. Review of Brookhaven nuclear transparency ... First, we describe the measurements with the newer experiment, E850, which has a more complete kinematic definition of quasi-elastic events. E850 covers a larger range of incident momenta, and thus ...

  8. Brookhaven highlights, October 1978-September 1979

    Energy Technology Data Exchange (ETDEWEB)

    1979-01-01

    These highlights present an overview of the major research and development achievements at Brookhaven National Laboratory from October 1978 to September 1979. Specific areas covered include: accelerator and high energy physics programs; high energy physics research; the AGS and improvements to the AGS; neutral beam development; heavy ion fusion; superconducting power cables; ISABELLE storage rings; the BNL Tandem accelerator; heavy ion experiments at the Tandem; the High Flux Beam Reactor; medium energy physics; nuclear theory; atomic and applied physics; solid state physics; neutron scattering studies; x-ray scattering studies; solid state theory; defects and disorder in solids; surface physics; the National Synchrotron Light Source ; Chemistry Department; Biology Department; Medical Department; energy sciences; environmental sciences; energy technology programs; National Center for Analysis of Energy Systems; advanced reactor systems; nuclear safety; National Nuclear Data Center; nuclear materials safeguards; Applied Mathematics Department; and support activities. (GHT)

  9. Amended annual report for Brookhaven National Laboratory: Epidemiologic surveillance - 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    Epidemiologic surveillance at DOE facilities consists of regular and systematic collection, analysis, and interpretation of data on absences due to illness and injury in the work force. Its purpose is to provide an early warning system for health problems occurring among employees at participating sites. Data are collected by coordinators at each site and submitted to the Epidemiologic Surveillance Data Center, located at the Oak Ridge Institute for Science and Education, where quality control procedures and analyses are carried out. Rates of absences and rates of diagnoses associated with absences are analyzed by occupation and other relevant variables. They may be compared with the disease experience of different groups within the DOE work force and with populations that do not work for DOE to identify disease patterns or clusters that may be associated with work activities. This report provides a final summary for BNL.

  10. Building Magnets at Brookhaven National Laboratory: A Condensed Account

    Science.gov (United States)

    Willen, Erich

    2017-09-01

    The development of superconducting wire and cable in the late twentieth century enabled high-field magnets and thus much higher beam-collision energies in accelerators. These higher collision energies have allowed experimentalists to probe further into the structure of matter at the most fundamental, subatomic level. The behavior of the early universe, where these high energies prevailed, and its evolution over time are the realm their experiments seek to investigate. The subject has aroused the curiosity of the public as well as scientists and has facilitated the support needed to build and operate such expensive machines and experiments. The path forward has not been easy, however. Success in most projects has been mixed with failure, progress with ineptitude. The building of high energy accelerators is mostly a story of capable people doing their best to develop new and unusual technology toward some defined goal, facing both success and failure along the way. It is also a story of administrative imperatives that had unpredictable effects on a project's success, depending mostly on the people in the administrative roles and the decisions that they made.

  11. Wildland Fire Management Plan for Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Schwager, K.; Green, T. M.

    2014-10-01

    The DOE policy for managing wildland fires requires that all areas managed by DOE and/or its various contractors which can sustain fire must have an FMP that details fire management guidelines for operational procedures associated with wildland fires, operational fires, and prescribed fires. FMPs provide guidance on fire preparedness, fire prevention, wildfire suppression, and the use of controlled ''prescribed'' fires and mechanical means to control the amount of available combustible material. Values reflected in the BNL Wildland FMP include protecting life and public safety; Lab properties, structures, and improvements; cultural and historical sites; neighboring private and public properties; and endangered, threatened, and species of concern. Other values supported by the plan include the enhancement of fire-dependent ecosystems at BNL. The plan will be reviewed periodically to ensure fire program advances and will evolve with the missions of DOE and BNL.

  12. Building Magnets at Brookhaven National Laboratory - An Account

    Energy Technology Data Exchange (ETDEWEB)

    Willen, E. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2017-02-01

    The development of superconducting wire and cable in the late 20th century enabled high field magnets and thus much higher beam collision energies in accelerators. These higher collision energies have allowed experiments to probe further into the structure of matter at the most fundamental, subatomic level. The behavior of the early universe, where these high energies prevailed, and its evolution over time are what these experiments seek to investigate. The subject has aroused the curiosity of not only scientists but of the public as well and has facilitated the support needed to build and operate such expensive machines and experiments. The path forward has not been easy, however. Success in most projects has been mixed with failure, progress with ineptitude. The building of high energy accelerators is mostly a story of capable people doing their best to develop new and unusual technology toward some defined goal, with success and failure in uneven measure along the way. It is also a story of administrative imperatives that have had unpredictable effects on a project’s success, depending mostly on the people in the administrative roles and the decisions that they have made.

  13. BWR plant analyzer development at BNL (Brookhaven National Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.

    1986-01-01

    An engineering plant analyzer has been developed at BNL for realistically and accurately simulating transients and severe abnormal events in BWR power plants. Simulations are being carried out routinely with high fidelity, high simulation speed, at low cost and with unsurpassed user convenience. The BNL Plant Analyzer is the only operating facility which (a) simulates more than two orders-of-magnitude faster than the CDC-7600 mainframe computer, (b) is accessible and fully operational in on-line interactive mode, remotely from anywhere in the US, from Europe or the Far East (Korea), via widely available IBM-PC compatible personal computers, standard modems and telephone lines, (c) simulates both slow and rapid transients seven times faster than real-time in direct access, and four times faster in remote access modes, (d) achieves high simulation speed without compromising fidelity, and (e) is available to remote access users at the low cost of $160 per hour.

  14. Annual report for Brookhaven National Laboratory 1994 epidemiologic surveillance

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-01-01

    Epidemiologic surveillance at DOE facilities consists of regular and systematic collection, analysis, and interpretation of data on absences due to illness and injury in the work force. Its purpose is to provide an early warning system for health problems occurring among employees at participating sites. Data are collected by coordinators at each site and submitted to the Epidemiologic Surveillance Data Center, located at the Oak Ridge Institute for Science and Education, where quality control procedures and analyses are carried out. Rates of absences and rates of diagnoses associated with absences are analyzed by occupation and other relevant variables. They may be compared with the disease experience of different groups within the DOE work force and with populations that do not work for DOE to identify disease patterns or clusters that may be associated with work activities. In this annual report, the 1994 morbidity data for BNL are summarized. These analyses focus on absences of 5 or more consecutive workdays occurring among workers aged 16-80 years. They are arranged in five sets of tables that present: (1) the distribution of the labor force by occupational category and salary status; (2) the absences per person, diagnoses per absence, and diagnosis rates for the whole work force; (3) diagnosis rates by type of disease or injury; (4) diagnosis rates by occupational category; and (5) relative risks for specific types of disease or injury by occupational category.
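The rate and relative-risk tabulations this entry describes can be sketched in miniature. The numbers, category names, and helper functions below are hypothetical, invented for illustration; the actual program aggregates absences of five or more consecutive workdays from site records.

```python
# Illustrative sketch of diagnosis-rate and relative-risk tabulations of
# the kind described in the surveillance report (hypothetical data).

def diagnosis_rate(diagnoses, person_years):
    """Diagnoses per 1,000 person-years at risk."""
    return 1000.0 * diagnoses / person_years

def relative_risk(group_rate, reference_rate):
    """Ratio of a group's diagnosis rate to a reference group's rate."""
    return group_rate / reference_rate

# Hypothetical occupational categories: (diagnosis count, person-years).
groups = {"line operators": (42, 3500.0), "administrative": (18, 2400.0)}

rates = {g: diagnosis_rate(d, py) for g, (d, py) in groups.items()}
rr = relative_risk(rates["line operators"], rates["administrative"])
```

Comparing rates rather than raw counts is what lets work forces of different sizes, or groups followed for different lengths of time, be compared at all; the relative risk then flags occupational categories whose rate stands out against a reference group.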

  15. Challenge Team Report: Brookhaven National Laboratory Environmental Restoration Program

    Energy Technology Data Exchange (ETDEWEB)

    T. L. Page; M. S. Montgomery

    1999-07-19

    The overall conclusion is that the BNL ER program has accomplished much and is well positioned to move aggressively towards closure. Seven removal actions have been completed. A record of decision (ROD) has been reached on Operable Unit IV, and interim soil cleanup has been completed. The remaining three RODs are under negotiation now.

  16. Brookhaven National Laboratory environmental monitoring plan for Calendar Year 1996

    Energy Technology Data Exchange (ETDEWEB)

    Naidu, J.R.; Paquette, D.; Lee, R. [and others]

    1996-10-01

    As required by DOE Order 5400.1, each U.S. Department of Energy (DOE) site, facility, or activity that uses, generates, releases, or manages significant quantities of hazardous materials shall provide a written Environmental Monitoring Plan (EMP) covering effluent monitoring and environmental surveillance. DOE/EH-0173T, Environmental Regulatory Guide for Radiological Effluent Monitoring and Environmental Surveillance, provides specific guidance regarding environmental monitoring activities.

  17. 2006 Brookhaven National Laboratory Annual Illness and Injury Surveillance Report

    Energy Technology Data Exchange (ETDEWEB)

    U.S. Department of Energy, Office of Health, Safety and Security, Office of Health and Safety, Office of Illness and Injury Prevention Programs

    2008-03-06

    The U.S. Department of Energy’s (DOE) commitment to assuring the health and safety of its workers includes the conduct of illness and injury surveillance activities that provide an early warning system to detect health problems among workers. The Illness and Injury Surveillance Program monitors illnesses and health conditions that result in an absence, occupational injuries and illnesses, and disabilities and deaths among current workers.

  18. 2008 Brookhaven National Laboratory Annual Illness and Injury Surveillance Report

    Energy Technology Data Exchange (ETDEWEB)

    U.S. Department of Energy, Office of Health, Safety and Security, Office of Health and Safety, Office of Illness and Injury Prevention Programs

    2009-12-10

    The U.S. Department of Energy’s (DOE) commitment to assuring the health and safety of its workers includes the conduct of epidemiologic surveillance activities that provide an early warning system for health problems among workers. The Illness and Injury Surveillance Program monitors illnesses and health conditions that result in an absence of workdays, occupational injuries and illnesses, and disabilities and deaths among current workers.

  19. 2010 Brookhaven National Laboratory Annual Illness and Injury Surveillance Report

    Energy Technology Data Exchange (ETDEWEB)

    U.S. Department of Energy, Office of Health, Safety and Security, Office of Health and Safety, Office of Illness and Injury Prevention Programs

    2011-08-16

    The U.S. Department of Energy's (DOE) commitment to assuring the health and safety of its workers includes the conduct of illness and injury surveillance activities that provide an early warning system to detect health problems among workers. The Illness and Injury Surveillance Program monitors illnesses and health conditions that result in an absence, occupational injuries and illnesses, and disabilities and deaths among current workers.

  20. 2009 Brookhaven National Laboratory Annual Illness and Injury Surveillance Report

    Energy Technology Data Exchange (ETDEWEB)

    U.S. Department of Energy, Office of Health, Safety and Security, Office of Health and Safety, Office of Illness and Injury Prevention Programs

    2010-11-24

    The U.S. Department of Energy’s (DOE) commitment to assuring the health and safety of its workers includes the conduct of epidemiologic surveillance activities that provide an early warning system for health problems among workers. The Illness and Injury Surveillance Program monitors illnesses and health conditions that result in an absence of workdays, occupational injuries and illnesses, and disabilities and deaths among current workers.

  1. Brookhaven National Laboratory technology transfer report, fiscal year 1986

    Energy Technology Data Exchange (ETDEWEB)

    1986-01-01

    An increase in the activities of the Office of Research and Technology Applications (ORTA) is reported. Most of the additional effort has been directed to the regional electric utility initiative, but intensive efforts have been applied to the commercialization of a compact synchrotron storage ring for x-ray lithography applications. At least six laboratory technologies are reported as having been transferred or being in the process of transfer. Laboratory accelerator technology is being applied to study radiation effects, and reactor technology is being applied for designing space reactors. Technologies being transferred and emerging technologies are described. The role of the ORTA and the technology transfer process are briefly described, and application assessment records are given for a number of technologies. A mini-incubator facility is also described. (LEW)

  2. 2007 Brookhaven National Laboratory Annual Illness and Injury Surveillance Report

    Energy Technology Data Exchange (ETDEWEB)

    U.S. Department of Energy, Office of Health, Safety and Security, Office of Health and Safety, Office of Illness and Injury Prevention Programs

    2008-07-31

    The U.S. Department of Energy’s (DOE) commitment to assuring the health and safety of its workers includes the conduct of illness and injury surveillance activities that provide an early warning system to detect health problems among workers. The Illness and Injury Surveillance Program monitors illnesses and health conditions that result in an absence, occupational injuries and illnesses, and disabilities and deaths among current workers.

  3. Tritium toxicity program in the Medical Department, Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Carsten, A.L.

    1983-01-01

    It is possible to detect somatic, cytogenetic, and genetic effects resulting from exposures at 33 to 100 times the maximum permissible concentration (MPC) for tritiated water (HTO). The reduction in bone marrow cells in animals maintaining normal total cellularity demonstrates both the presence of an effect at the primitive cell level and the animal's ability to compensate for this effect by recruiting stem cells from the G0 resting state. This evidence of damage, together with the observed cytogenetic changes, leads one to contemplate the possible importance of radiation exposures at these levels for the induction of leukemia or other blood dyscrasias. As predicted on the basis of established principles of radiobiology, exposure to tritium beta rays from HTO ingestion results in measurable effects on several animal systems.

  4. Brookhaven National Laboratory meteorological services instrument calibration plan and procedures

    Energy Technology Data Exchange (ETDEWEB)

    Heiser, John [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2013-02-16

    This document describes the Meteorological Services (Met Services) calibration and maintenance schedule and procedures. The purpose is to establish the frequency and mechanism for the calibration and maintenance of the network of meteorological instrumentation operated by Met Services. The goal is to maintain the network in a manner that will result in accurate, precise, and reliable readings from the instrumentation.

  5. Compositional compiler construction: Oberon0

    NARCIS (Netherlands)

    Viera, Marcos; Swierstra, S. Doaitse

    2015-01-01

    We describe an implementation of an Oberon0 compiler using the techniques proposed in the CoCoCo project. The compiler is constructed out of a collection of pre-compiled, statically type-checked language-definition fragments written in Haskell.

  6. Advanced compiler design and implementation

    CERN Document Server

    Muchnick, Steven S

    1997-01-01

    From the Foreword by Susan L. Graham: This book takes on the challenges of contemporary languages and architectures, and prepares the reader for the new compiling problems that will inevitably arise in the future. The definitive book on advanced compiler design This comprehensive, up-to-date work examines advanced issues in the design and implementation of compilers for modern processors. Written for professionals and graduate students, the book guides readers in designing and implementing efficient structures for highly optimizing compilers for real-world languages. Covering advanced issues in fundamental areas of compiler design, this book discusses a wide array of possible code optimizations, determining the relative importance of optimizations, and selecting the most effective methods of implementation. * Lays the foundation for understanding the major issues of advanced compiler design * Treats optimization in-depth * Uses four case studies of commercial compiling suites to illustrate different approache...

  7. National assessment of shoreline change: A GIS compilation of updated vector shorelines and associated shoreline change data for the Southeast Atlantic coast

    Science.gov (United States)

    Kratzmann, Meredith; Himmelstoss, Emily; Thieler, E. Robert

    2017-01-01

    Sandy ocean beaches in the United States are popular tourist and recreational destinations and constitute some of the most valuable real estate in the country. The boundary between land and water along the coastline is often the location of concentrated residential and commercial development and is frequently exposed to a range of natural hazards, including flooding, storm effects, and coastal erosion. In response, the U.S. Geological Survey (USGS) is conducting a national assessment of coastal change hazards. One component of this research effort, the National Assessment of Shoreline Change Project (http://coastal.er.usgs.gov/shoreline-change/), documents changes in shoreline position as a proxy for coastal change. Shoreline position is an easily understood feature representing the historical location of a beach through time. All data can be viewed on the National Assessment of Coastal Change Hazards Portal at https://marine.usgs.gov/coastalchangehazardsportal/

  8. Brookhaven highlights. [Fiscal year 1992, October 1, 1991--September 30, 1992]

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, M.S.; Cohen, A.; Greenberg, D.; Seubert, L. [eds.]

    1992-12-31

    This publication provides a broad overview of the research programs and efforts being conducted, built, designed, and planned at Brookhaven National Laboratory. This work covers a broad range of scientific disciplines. Major facilities include the Alternating Gradient Synchrotron (AGS), with its newly completed booster, the National Synchrotron Light Source (NSLS), the High Flux Beam Reactor (HFBR), and the RHIC, which is under construction. Departments within the laboratory include the AGS department, accelerator development, physics, chemistry, biology, NSLS, medical, nuclear energy, and interdepartmental research efforts. Research ranges from the pure sciences, in nuclear physics and high energy physics as one example, to environmental work in applied science to study climatic effects, from efforts in biology which are a component of the human genome project to the study, production, and characterization of new materials. The paper provides an overview of the laboratory operations during 1992, including staffing, research, honors, funding, and general laboratory plans for the future.

  9. Compiler Feedback using Continuous Dynamic Compilation during Development

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Probst, Christian W.

    2014-01-01

    Optimizing compilers are vital for performance. However, the compiler's ability to optimize aggressively is limited in some cases. To address this limitation, we have developed a compiler guiding the programmer in making small source code changes, potentially making the source code more amenable to optimization. This tool can help programmers understand what the optimizing compiler has done and suggest automatic source code changes in cases where the compiler refrains from optimizing. We have integrated our tool into an integrated development environment, interactively giving feedback as part of the programmer's development flow. We have evaluated our preliminary implementation and show it can guide to a 12% improvement in performance. Furthermore, the tool can be used as an interactive optimization adviser improving the performance of the code generated by a production compiler. Here it can lead to a 153...

  10. National award for energy-efficient town lighting. Compilation of energy-efficient town lighting techniques; Bundeswettbewerb Energieeffiziente Stadtbeleuchtung. Sammlung energieeffizienter Techniken fuer die Stadtbeleuchtung

    Energy Technology Data Exchange (ETDEWEB)

    Piller, Sabine; Huebner, Vanessa; Barbre, Felix; Schaefer, Moritz [Berliner Energieagentur GmbH, Berlin (Germany)

    2009-02-11

    The national award for innovative urban lighting was initiated by the Federal Environmental Office. The resulting publication presents innovative techniques for urban lighting. While it is not a complete market survey, it provides an outline of modern, energy-efficient and environment-friendly technologies that are commercially available. Most systems are also available at comparatively low cost. For more information, interested users should refer to http://www.bmu.de/klimaschutzinitiative/aktuell/41708.php. (orig./AKB)

  11. Brookhaven highlights, July 1976-September 1978

    Energy Technology Data Exchange (ETDEWEB)

    1979-11-01

    Some of the most significant research accomplishments during this 27-month period are presented. Although some data are given, this report is primarily descriptive in outlook; detailed information on completed work should be sought from the references cited herein or from the usual sources of physics research information. The report is organized as follows: High-energy Physics (general introduction, physics research, accelerators, ISABELLE); Nuclear and Solid State Physics, and Chemistry; Life Sciences (biology, medicine); Applied Energy Science (energy and the environment, reactor systems and safety, National Nuclear Data Center, nuclear materials safeguards); Support Activities (applied mathematics, instrumentation, reactors, safety and environmental protection); and General and Administrative. 117 figures, 16 tables, 315 references. (RWR)

  12. Compiling a 50-year journey

    DEFF Research Database (Denmark)

    Hutton, Graham; Bahr, Patrick

    2017-01-01

    Fifty years ago, John McCarthy and James Painter published the first paper on compiler verification, in which they showed how to formally prove the correctness of a compiler that translates arithmetic expressions into code for a register-based machine. In this article, we revisit this example...... in a modern context, and show how such a compiler can now be calculated directly from a specification of its correctness using simple equational reasoning techniques....

  13. LXG Compiler - Design and Implementation

    OpenAIRE

    Vassev, Emil

    2010-01-01

    LXG is a simple Pascal-like language. It is a functional programming language developed for studying compiler design and implementation. The language supports procedure and variable declarations, but no classes. This paper reports the design and implementation of an LXG compiler. Test results are presented as well.

  14. A Note on Compiling Fortran

    Energy Technology Data Exchange (ETDEWEB)

    Busby, L. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-01

    Fortran modules tend to serialize compilation of large Fortran projects by introducing dependencies among the source files. If file A depends on file B (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran, and IBM xlf, at least) offer an option to "verify syntax", with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: the first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8× speedup in the case study below.
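    The dependency serialization the note describes can be sketched with a small example (hypothetical file and module names) that topologically sorts source files by their USE dependencies; in the two-pass scheme, only the fast module-generation pass must follow this order, after which object-code generation can run fully in parallel.

    ```python
    # Hypothetical sketch: Fortran USE dependencies impose a compile order.
    # graphlib (Python stdlib, 3.9+) produces a valid serial order; pass 1
    # (syntax-only module generation) follows it quickly, and pass 2 (object
    # code) can then be run in parallel, one file per job.
    from graphlib import TopologicalSorter

    # file -> files whose modules it USEs (made-up example project)
    deps = {
        "a.f90": {"b.f90"},   # A uses a module defined in B
        "b.f90": {"c.f90"},
        "d.f90": {"c.f90"},
        "c.f90": set(),
    }

    order = list(TopologicalSorter(deps).static_order())
    # c.f90 must be compiled before b.f90 and d.f90; b.f90 before a.f90
    assert order.index("c.f90") < order.index("b.f90") < order.index("a.f90")
    assert order.index("c.f90") < order.index("d.f90")
    ```

    With real build tools the same ordering falls out of the dependency graph the build system extracts from USE statements; the sketch just makes the serialization constraint explicit.
    
    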

  15. 1991 OCRWM bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-05-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management, to provide current information about the national program for managing spent fuel and high-level radioactive waste. The document is a compilation of issues from the 1991 calendar year. A table of contents and an index have been provided to reference information contained in this year's Bulletins.

  16. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  17. Alaska NWRS Legacy Seabird Monitoring Data Inventory and Compilation

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The objective of this project is to compile and standardize data from the Alaska Peninsula/Becharof, Kodiak, Togiak, and Yukon Delta National Wildlife Refuges. This...

  18. Compiler for Fast, Accurate Mathematical Computing on Integer Processors Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposers will develop a computer language compiler to enable inexpensive, low-power, integer-only processors to carry our mathematically-intensive comptutations...

  19. 75 FR 43611 - U S Rail Corporation-Construction and Operation Exemption-Brookhaven Rail Terminal

    Science.gov (United States)

    2010-07-26

    ... TRANSPORTATION Surface Transportation Board U S Rail Corporation--Construction and Operation Exemption-- Brookhaven Rail Terminal On August 7, 2008, U S Rail Corporation (U S Rail), an existing class III short line...) of new rail line at a 28-acre site (the Brookhaven Rail Terminal or BRT) located in the Town of...

  20. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables.Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees.With the proliferation of open source, understanding these issues is increasingly the res

  1. Dictionaries compiled with French and

    African Journals Online (AJOL)

    Information Technology

    1. Introduction. In present-day lexicography, there is a tendency to give an account of the underlying culture and civilization of the languages being described. However, reservations can be made with regard to an encyclopaedic bias. Too often lexicographers compiling dictionaries in the languages in question have been.

  2. Scientists at Brookhaven contribute to the development of a better electron accelerator

    CERN Multimedia

    2004-01-01

    Scientists working at Brookhaven have developed a compact linear accelerator called STELLA (Staged Electron Laser Acceleration). Highly efficient, it may help electron accelerators become practical tools for applications in industry and medicine, such as radiation therapy (1 page)

  3. Compilation of HPSG to TAG

    CERN Document Server

    Kasper, R; Netter, K; Vijay-Shanker, K; Kasper, Robert; Kiefer, Bernd; Netter, Klaus

    1995-01-01

    We present an implemented compilation algorithm that translates HPSG into lexicalized feature-based TAG, relating concepts of the two theories. While HPSG has a more elaborated principle-based theory of possible phrase structures, TAG provides the means to represent lexicalized structures more explicitly. Our objectives are met by giving clear definitions that determine the projection of structures from the lexicon, and identify maximal projections, auxiliary trees and foot nodes.

  4. Anatomy of a Silicon Compiler

    Science.gov (United States)

    1994-10-04

    The CATHEDRAL-II compiler has been adapted and reworked by the Philips Corp. into the PIRAMID environment [Delaruelle88]. From this commercialization and ... of a declarative DSP programming environment. REFERENCES [Delaruelle88] A. Delaruelle et al., "Design of a syndrome generator chip using the PIRAMID Design System", Proc. ESSCIRC Conf., pp. 256-259, Manchester, September 1988. [DeMan86] H. De Man, J. Rabaey, P. Six

  5. HOM identification by bead pulling in the Brookhaven ERL cavity

    CERN Document Server

    Hahn, H; Jain, Puneet; Johnson, Elliott C; Xu, Wencan

    2014-01-01

    Exploratory measurements of the Brookhaven Energy Recovery Linac (ERL) cavity at superconducting temperature produced a long list of higher-order modes (HOMs). The niobium 5-cell cavity is terminated at each end with HOM ferrite dampers that successfully reduce the Q-factors to levels required to avoid beam break-up (BBU) instabilities. However, a number of un-damped resonances with Q ≥ 10^6 were found at 4 K, and their mode identification forms the focus of this paper. The approach taken here consists of bead pulling on a copper (Cu) replica of the ERL cavity with dampers, involving various network analyzer measurements. Several different S21 transmission measurements are used, including those taken from the fundamental input coupler to the pick-up probe across the cavity, others between beam-position monitor probes in the beam tubes, and also between probes placed into the cells. The bead-pull technique suitable for HOM identification with a metallic needle or dielectric bead is detailed. This paper presents the...

  6. VFC: The Vienna Fortran Compiler

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    1999-01-01

    Full Text Available High Performance Fortran (HPF) offers an attractive high‐level language interface for programming scalable parallel architectures, providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC), a new source‐to‐source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influences a program’s performance. This comprises data locality assertions, non‐local access specifications and the possibility of reusing runtime‐generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high‐level data parallel language such as HPF+ a performance close to hand‐written message‐passing programs can be achieved even for highly irregular codes.

  7. An Action Compiler Targeting Standard ML

    DEFF Research Database (Denmark)

    Iversen, Jørgen

    2005-01-01

    We present an action compiler that can be used in connection with an action semantics based compiler generator. Our action compiler produces code with faster execution times than code produced by other action compilers, and for some non-trivial test examples it is only a factor of two slower than the code produced by the GNU C Compiler. Targeting Standard ML makes the description of the code generation simple and easy to implement. The action compiler has been tested on a description of the Core of Standard ML and a subset of C.

  8. Research programs at the Department of Energy National Laboratories. Volume 2: Laboratory matrix

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-01

    For nearly fifty years, the US national laboratories, under the direction of the Department of Energy, have maintained a tradition of outstanding scientific research and innovative technological development. With the end of the Cold War, their roles have undergone profound changes. Although many of their original priorities remain--stewardship of the nation's nuclear stockpile, for example--pressing budget constraints and new federal mandates have altered their focus. Promotion of energy efficiency, environmental restoration, human health, and technology partnerships with the goal of enhancing US economic and technological competitiveness are key new priorities. The multiprogram national laboratories offer unparalleled expertise in meeting the challenge of changing priorities. This volume aims to demonstrate each laboratory's uniqueness in applying this expertise. It describes the laboratories' activities in eleven broad areas of research that most or all share in common. Each section of this volume is devoted to a single laboratory. Those included are: Argonne National Laboratory; Brookhaven National Laboratory; Idaho National Engineering Laboratory; Lawrence Berkeley Laboratory; Lawrence Livermore National Laboratory; Los Alamos National Laboratory; National Renewable Energy Laboratory; Oak Ridge National Laboratory; Pacific Northwest Laboratory; and Sandia National Laboratories. The information in this volume was provided by the multiprogram national laboratories and compiled at Lawrence Berkeley Laboratory.

  9. TUNE: Compiler-Directed Automatic Performance Tuning

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Mary [University of Utah

    2014-09-18

    This project has developed compiler-directed performance tuning technology targeting the Cray XT4 Jaguar system at Oak Ridge, which has multi-core Opteron nodes with SSE-3 SIMD extensions, and the Cray XE6 Hopper system at NERSC. To achieve this goal, we combined compiler technology for model-guided empirical optimization for memory hierarchies with SIMD code generation, which have been developed by the PIs over the past several years. We examined DOE Office of Science applications to identify performance bottlenecks and apply our system to computational kernels that operate on dense arrays. Our goal for this performance-tuning technology has been to yield hand-tuned levels of performance on DOE Office of Science computational kernels, while allowing application programmers to specify their computations at a high level without requiring manual optimization. Overall, we aim to make our technology for SIMD code generation and memory hierarchy optimization a crucial component of high-productivity Petaflops computing through a close collaboration with the scientists in national laboratories.

  10. Initial experiments with the Nevis Cyclotron, the Brookhaven Cosmotron, the Brookhaven AGS and their effects on high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Lindenbaum, S.J.

    1988-01-01

    The first experiment at the Nevis Cyclotron by Bernardini, Booth and Lindenbaum demonstrated that nuclear stars are produced by a nucleon-nucleon cascade within the nucleus. This solved a long-standing problem in cosmic rays and made it clear that, where they overlap, cosmic-ray investigation would not be competitive with accelerator investigations. The initial experiments at the Brookhaven Cosmotron by Lindenbaum and Yuan demonstrated that low-energy pion-nucleon scattering and pion production were unexpectedly mostly due to excitation of the isotopic spin = angular momentum = 3/2 isobaric state of the nucleon. This contradicted the Fermi statistical theory and led to the isobar model proposed by the author and a collaborator. The initial experiments at the AGS by the author and collaborators demonstrated that the Pomeranchuk theorem would not come true till at least several hundred GeV. These scattering experiments led to the development of the "on-line computer technique" by the author and collaborators, which is now the almost universal technique in high energy physics. The first accomplishment which flowed from this technique led to contradiction of the Regge pole theory as a dynamical asymptotic theory, by the author and collaborators. The first critical experimental proof of the forward dispersion relation in strong interactions was accomplished by the author and collaborators. The relations were then used as a crystal ball to predict that "Asymptopia", the theoretically promised land where all asymptotic theorems come true, would not be reached till at least 25,000 BeV and probably not before 1,000,000 BeV. 26 refs., 11 figs., 2 tabs.

  11. Attempts for fostering researches on information compilation

    Science.gov (United States)

    Matsushita, Mitsunori; Kato, Tsuneaki

    Information compilation is a novel research topic which aims to compile various information intelligently and to make it easy to comprehend and access. Information compilation is characterized by two features: the first is cross-modality, achieved by cooperation between non-linguistic and linguistic information, and the second is continuity in supporting all aspects of information utilization. To foster research related to information compilation, we conducted three activities: installation of a reference model for information compilation, distribution of an annotated corpus and a visualization platform as boundary objects, and deployment of an evaluation workshop. The paper describes the current state of these efforts.

  12. National Educators' Workshop: Update 1998. Standard Experiments in Engineering, Materials Science, and Technology

    Science.gov (United States)

    Arrington, Ginger L. F. (Compiler); Gardner, James E. (Compiler); Jacobs, James A. (Compiler); Swyler, Karl J. (Compiler); Fine, Leonard W. (Compiler)

    1999-01-01

    This document contains a collection of experiments presented and demonstrated at the National Educators' Workshop: Update 98, held at Brookhaven National Laboratory, Upton, New York, on November 1-4, 1998.

  13. A Symmetric Approach to Compilation and Decompilation

    DEFF Research Database (Denmark)

    Ager, Mads Sig; Danvy, Olivier; Goldberg, Mayer

    2002-01-01

    Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice... and the SECD-machine language. In each case, we prove that the target-to-source compiler is a left inverse of the source-to-target compiler, i.e., that it is a decompiler. In the context of partial evaluation, the binding-time shift of going from a source interpreter to a compiler is classically referred to as a Futamura projection. By symmetry, it seems logical to refer to the binding-time shift of going from a target interpreter to a compiler as a Futamura embedding.
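    The left-inverse property can be made concrete with a hypothetical toy in Python (far simpler than the interpreter-based setting of the paper): arithmetic expressions are compiled to stack code, and the decompiler rebuilds the expression by executing that code symbolically, so that decompilation undoes compilation exactly.

    ```python
    # Hypothetical toy: a decompiler as a left inverse of a compiler.

    def compile_expr(e):
        """Expression tree (int literal or ("add", lhs, rhs)) -> stack-machine code."""
        if isinstance(e, int):
            return [("PUSH", e)]
        _, lhs, rhs = e
        return compile_expr(lhs) + compile_expr(rhs) + [("ADD",)]

    def decompile(code):
        """Stack-machine code -> expression tree, by symbolic execution:
        instead of adding numbers, ADD rebuilds the syntax node."""
        stack = []
        for instr in code:
            if instr[0] == "PUSH":
                stack.append(instr[1])
            else:  # ADD
                rhs, lhs = stack.pop(), stack.pop()
                stack.append(("add", lhs, rhs))
        return stack.pop()

    # Left-inverse property: decompile . compile = identity
    e = ("add", 1, ("add", 2, 3))
    assert decompile(compile_expr(e)) == e
    ```

    Note the asymmetry the paper exploits: the decompiler is the same stack machine as the evaluator, but run over syntax trees instead of numbers.
    
    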

  14. Horizontal Compilations of Nuclear Data III

    Science.gov (United States)

    Soroko, Z. N.; Sukhoruchkin, S. I.; Sukhoruchkin, D. S.

    2001-11-01

    Compilations of resonance parameters for all nuclei (horizontal-type compilations) derived from the study of reactions with neutrons and charged particles are described. Systematic trends in experimental data due to differences in the energy scales of spectrometers were found after comparison of data for many nuclei. The combination of data from horizontal-type compilations with data from the well-known ENSDF file expands the usefulness of these data files.

  15. Parallel machine architecture and compiler design facilities

    Science.gov (United States)

    Kuck, David J.; Yew, Pen-Chung; Padua, David; Sameh, Ahmed; Veidenbaum, Alex

    1990-01-01

    The objective is to provide an integrated simulation environment for studying and evaluating various issues in designing parallel systems, including machine architectures, parallelizing compiler techniques, and parallel algorithms. The status of the Delta project (whose objective is to provide a facility that allows rapid prototyping of parallelizing compilers that can target different machine architectures) is summarized. Included are surveys of the program manipulation tools developed, the environmental software supporting Delta, and the compiler research projects in which Delta has played a role.

  16. 75 FR 55631 - U. S. Rail Corporation-Construction and Operation Exemption-Brookhaven Rail Terminal

    Science.gov (United States)

    2010-09-13

    ... Surface Transportation Board U. S. Rail Corporation--Construction and Operation Exemption-- Brookhaven Rail Terminal AGENCY: Surface Transportation Board. ACTION: Notice of Board Action. SUMMARY: Subject to... approval requirements of 49 U.S.C. 10901 for U. S. Rail Corporation (U. S. Rail) to construct and operate a...

  17. T.D. LEE: RELATIVISTIC HEAVY ION COLLISIONS AND THE RIKEN BROOKHAVEN CENTER.

    Energy Technology Data Exchange (ETDEWEB)

    MCLERRAN,L.; SAMIOS, N.

    2006-11-24

    This paper presents the history of Professor T. D. Lee's seminal work on the theory of relativistic heavy ion collisions, and the founding and development of the Riken Brookhaven Center. A number of anecdotes are given about Prof. Lee, and his strong positive effect on his colleagues, particularly young physicists.

  18. Compiling Specialised Dictionaries in African Languages ...

    African Journals Online (AJOL)

    This article presents the challenges encountered in the compilation of Isichazamazwi SezoMculo, a Ndebele dictionary of musical terms (henceforth the ISM), and the strategies that were used to deal with the respective challenges. The article notes that the experience of the compilers of the dictionary in question is similar to ...

  19. Proving correctness of compilers using structured graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    We present an approach to compiler implementation using Oliveira and Cook’s structured graphs that avoids the use of explicit jumps in the generated code. The advantage of our method is that it takes the implementation of a compiler using a tree type along with its correctness proof and turns...

  20. Notes on Compiling a Corpus- Based Dictionary*

    African Journals Online (AJOL)

    rbr

    useful in the compilation of a dictionary. Thus, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified form. Keywords: MONOLINGUAL DICTIONARIES, CORPUS LEXICOGRAPHY, SYNTAG-.

  1. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    Helping programmers write parallel software is an urgent problem given the popularity of multi-core architectures. Engineering compilers which automatically parallelize and vectorize code has turned out to be very challenging. Consequently, compilers are very selective with respect to the coding patterns they can optimize. We present an interactive approach and a tool set which leverages advanced compiler analysis and optimizations while retaining programmer control over the source code and its transformation. This allows optimization even when programmers refrain from enabling optimizations to preserve accurate debug information or to avoid bugs in the compiler. It also allows the source code to carry optimizations from one compiler to another. Secondly, our tool-set provides feedback on why optimizations do not apply to a code fragment and suggests workarounds which can be applied automatically...

  2. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2010-01-01

    Helping programmers write parallel software is an urgent problem given the popularity of multi-core architectures. Engineering compilers which automatically parallelize and vectorize code has turned out to be very challenging, and consequently compilers are very selective with respect to the coding patterns they can optimize. We present an interactive approach which leverages advanced compiler analysis and optimizations while retaining programmer control over the source code and its transformation. This allows optimization even when programmers refrain from enabling optimizations to preserve accurate debug information or to avoid bugs in the compiler. It also allows the source code to carry optimizations from one compiler to another. Secondly, our tool-set provides feedback on why optimizations do not apply to a code fragment and suggests workarounds which can be applied automatically. We...

  3. Dynamic Compilation of C++ Template Code

    Directory of Open Access Journals (Sweden)

    Martin J. Cole

    2003-01-01

    Full Text Available Generic programming using the C++ template facility has been a successful method for creating high-performance, yet general algorithms for scientific computing and visualization. However, adding template code tends to require more template code in surrounding structures and algorithms to maintain generality. Compiling all possible expansions of these templates can lead to massive template bloat. Furthermore, compile-time binding of templates requires that all possible permutations be known at compile time, limiting the runtime extensibility of the generic code. We present a method for deferring the compilation of these templates until an exact type is needed. This dynamic compilation mechanism will produce the minimum amount of compiled code needed for a particular application, while maintaining the generality and performance that templates innately provide. Through a small amount of supporting code within each templated class, the proper templated code can be generated at runtime without modifying the compiler. We describe the implementation of this goal within the SCIRun dataflow system. SCIRun is freely available online for research purposes.
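    The core idea, generating only the specializations a run actually needs rather than pre-compiling every permutation, has a loose Python analogue (hypothetical names, not SCIRun's API): generate and cache a specialized function per concrete type on first use.

    ```python
    # Loose, hypothetical Python analogue of deferred template instantiation:
    # a specialization is generated and compiled only when a concrete element
    # type is first needed, then cached for reuse.
    _specializations = {}

    def get_summer(elem_type):
        """Return a sum function specialized for elem_type, compiling it on first use."""
        if elem_type not in _specializations:
            src = (
                f"def _sum(xs):\n"
                f"    total = {elem_type.__name__}()\n"  # type-specific zero value
                f"    for x in xs:\n"
                f"        total = total + x\n"
                f"    return total\n"
            )
            ns = {}
            exec(compile(src, f"<sum_{elem_type.__name__}>", "exec"), ns)
            _specializations[elem_type] = ns["_sum"]
        return _specializations[elem_type]

    assert get_summer(int)([1, 2, 3]) == 6
    assert get_summer(str)(["a", "b"]) == "ab"
    assert get_summer(int) is get_summer(int)  # compiled once, then cached
    ```

    In C++ the cost of each deferred instantiation is a real compiler invocation at runtime, which is why the paper emphasizes producing only the minimum amount of compiled code; the cache plays the same role here.
    
    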

  4. LBA-ECO CD-32 Flux Tower Network Data Compilation, Brazilian Amazon: 1999-2006

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set is a compilation of carbon and energy eddy covariance flux, meteorology, radiation, canopy temperature, humidity, and CO2 profiles and soil moisture...

  5. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley and...

  6. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley...

  7. LBA-ECO CD-32 Flux Tower Network Data Compilation, Brazilian Amazon: 1999-2006

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This data set is a compilation of carbon and energy eddy covariance flux, meteorology, radiation, canopy temperature, humidity, and CO2 profiles and soil...

  8. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is a growing need for automated tools that can read, represent, analyze, and transform an application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  9. Proceedings of RIKEN BNL Research Center Workshop: Brookhaven Summer Program on Nucleon Spin Physics

    Energy Technology Data Exchange (ETDEWEB)

    Aschenauer, A.; Qiu, Jianwei; Vogelsang, W.; Yuan, F.

    2011-08-02

Understanding the structure of the nucleon is of fundamental importance in sub-atomic physics. Already the experimental studies on the electro-magnetic form factors in the 1950s showed that the nucleon has a nontrivial internal structure, and the deep inelastic scattering experiments in the 1970s revealed the partonic substructure of the nucleon. Modern research focuses in particular on the spin and the gluonic structure of the nucleon. Experiments using deep inelastic scattering or polarized p-p collisions are carried out in the US at the CEBAF and RHIC facilities, respectively, and there are other experimental facilities around the world. More than twenty years ago, the European Muon Collaboration published their first experimental results on the proton spin structure as revealed in polarized deep inelastic lepton-nucleon scattering, and concluded that quarks contribute very little to the proton's spin. With additional experimental and theoretical investigations and progress in the following years, it is now established that, contrary to naive quark model expectations, quarks and anti-quarks carry only about 30% of the total spin of the proton. Twenty years later, the discovery from the polarized hadron collider at RHIC was equally surprising. For the phase space probed by existing RHIC experiments, gluons do not seem to contribute appreciably to the proton's spin. To find out what carries the remaining part of the proton's spin is a key focus in current hadronic physics and also a major driving force for the new generation of spin experiments at RHIC and Jefferson Lab and at a future Electron Ion Collider. It is therefore very important and timely to organize a series of annual spin physics meetings to summarize the status of proton spin physics, to focus the effort, and to lay out future perspectives. This summer program on 'Nucleon Spin Physics' held at Brookhaven National Laboratory (BNL) on July 14-27, 2010 [http://www.bnl.gov/spnsp/] is the

  10. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part IV: Normal and Inverted Letter 'h' and 'H' Architecture.

  11. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part III: B-Shaped Architecture with Vertical Well in the Upper Layer.

  12. Brookhaven experience with handling and shipping of, and cask design for, reactor spent-fuel elements

    Energy Technology Data Exchange (ETDEWEB)

    Dugan, Francis A.

    1965-12-01

The general problems in the area are presented. Solutions to the specific problems at Brookhaven are discussed in relation to the general problem. Presentation covers (a) fuel removal tools and equipment, and canal storage facilities; (b) methods of shipment; (c) brief review of the AEC and ICC regulatory requirements; and (d) optimized design of the shipping container. Specific solutions used by BNL over a six-year period are described. The need for complete and early analysis of the specific problem is indicated.

  13. The first picosecond terawatt CO{sub 2} laser at the Brookhaven Accelerator Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Pogorelsky, I.V.; Ben-Zvi, I.; Babzien, M. [and others]

    1998-02-01

The first terawatt picosecond CO{sub 2} laser will be brought to operation at the Brookhaven Accelerator Test Facility in 1998. The system consists of a single-mode TEA oscillator, a picosecond semiconductor optical switch, and multi-atmosphere amplifiers. The authors report on design, simulation, and performance tests of the 10 atm final amplifier that allows for direct multi-joule energy extraction in a picosecond laser pulse.

  14. Compilation of Sandia Laboratories technical capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lundergan, C. D.; Mead, P. L. [eds.]

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078). (RWR)

  15. Swift: Compiled Inference for Probabilistic Programming Languages

    OpenAIRE

    Wu, Yi; Li, Lei; Russell, Stuart; Bodik, Rastislav

    2016-01-01

    A probabilistic program defines a probability measure over its semantic structures. One common goal of probabilistic programming languages (PPLs) is to compute posterior probabilities for arbitrary models and queries, given observed evidence, using a generic inference engine. Most PPL inference engines---even the compiled ones---incur significant runtime interpretation overhead, especially for contingent and open-universe models. This paper describes Swift, a compiler for the BLOG PPL. Swift-...
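The runtime interpretation overhead the paper targets can be illustrated with a toy "PPL": a model is a list of expression strings, a generic engine re-evaluates those strings per sample, and "compilation" instead emits a specialized sampler once. This is a hedged sketch of the idea only; the model encoding and function names are illustrative and are not BLOG or Swift syntax:

```python
import random

# Toy model: each variable is an expression over earlier variables.
model = [
    ("x", "random.gauss(0.0, 1.0)"),
    ("y", "x + random.gauss(0.0, 0.5)"),
]

def interpret_sample(model):
    # Generic engine: re-evaluates each expression string on every sample.
    env = {"random": random}
    for name, expr in model:
        env[name] = eval(expr, env)
    return {n: env[n] for n, _ in model}

def compile_sampler(model):
    # "Compilation": emit specialized Python source once, then exec it.
    lines = ["def _sampler():"]
    for name, expr in model:
        lines.append(f"    {name} = {expr}")
    lines.append("    return {" + ", ".join(f"'{n}': {n}" for n, _ in model) + "}")
    ns = {"random": random}
    exec("\n".join(lines), ns)
    return ns["_sampler"]

sampler = compile_sampler(model)
print(sorted(sampler()))  # ['x', 'y']
```

With the same random seed, the interpreted and compiled samplers draw identical values, since both perform the same sequence of primitive draws.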

  16. Ada (Trademark) Compiler Validation Summary Report.

    Science.gov (United States)

    1987-04-30

identifies and rejects illegal language constructs. The testing also identifies behaviour that is implementation dependent but permitted by the Ada Standard... behaviour is allowed by the Ada Standard. Testing of this compiler was conducted by NCC under the direction of the AVF according to policies and...subject compiler has no nonconformities to the Ada Standard other than those presented. Copies of this report are available to the public from: Ada

  17. SOL - SIZING AND OPTIMIZATION LANGUAGE COMPILER

    Science.gov (United States)

    Scotti, S. J.

    1994-01-01

    SOL is a computer language which is geared to solving design problems. SOL includes the mathematical modeling and logical capabilities of a computer language like FORTRAN but also includes the additional power of non-linear mathematical programming methods (i.e. numerical optimization) at the language level (as opposed to the subroutine level). The language-level use of optimization has several advantages over the traditional, subroutine-calling method of using an optimizer: first, the optimization problem is described in a concise and clear manner which closely parallels the mathematical description of optimization; second, a seamless interface is automatically established between the optimizer subroutines and the mathematical model of the system being optimized; third, the results of an optimization (objective, design variables, constraints, termination criteria, and some or all of the optimization history) are output in a form directly related to the optimization description; and finally, automatic error checking and recovery from an ill-defined system model or optimization description is facilitated by the language-level specification of the optimization problem. Thus, SOL enables rapid generation of models and solutions for optimum design problems with greater confidence that the problem is posed correctly. The SOL compiler takes SOL-language statements and generates the equivalent FORTRAN code and system calls. Because of this approach, the modeling capabilities of SOL are extended by the ability to incorporate existing FORTRAN code into a SOL program. In addition, SOL has a powerful MACRO capability. The MACRO capability of the SOL compiler effectively gives the user the ability to extend the SOL language and can be used to develop easy-to-use shorthand methods of generating complex models and solution strategies. The SOL compiler provides syntactic and semantic error-checking, error recovery, and detailed reports containing cross-references to show where
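The language-level versus subroutine-level contrast can be sketched in Python terms (an illustrative stand-in, not SOL syntax): the optimization problem is stated declaratively, close to its mathematical description, and a generic driver wires it to the solver. The solver here is a simple interval-refinement search, assumed adequate for a unimodal objective:

```python
# Declarative problem statement: objective and bounds, nothing else.
problem = {
    "objective": lambda x: (x - 2.0) ** 2 + 1.0,
    "bounds": (-10.0, 10.0),
}

def minimize(objective, lo, hi, iters=60):
    """Minimize a unimodal objective on [lo, hi] by interval refinement."""
    for _ in range(iters):
        third = (hi - lo) / 3
        a, b = lo + third, hi - third
        if objective(a) < objective(b):
            hi = b   # minimum lies in the left two-thirds
        else:
            lo = a   # minimum lies in the right two-thirds
    return (lo + hi) / 2

x_opt = minimize(problem["objective"], *problem["bounds"])
print(round(x_opt, 3))  # 2.0
```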

  18. Flexible IDL Compilation for Complex Communication Patterns

    Directory of Open Access Journals (Sweden)

    Eric Eide

    1999-01-01

Full Text Available Distributed applications are complex by nature, so it is essential that there be effective software development tools to aid in the construction of these programs. Commonplace “middleware” tools, however, often impose a tradeoff between programmer productivity and application performance. For instance, many CORBA IDL compilers generate code that is too slow for high‐performance systems. More importantly, these compilers provide inadequate support for sophisticated patterns of communication. We believe that these problems can be overcome, thus making IDL compilers and similar middleware tools useful for a broader range of systems. To this end we have implemented Flick, a flexible and optimizing IDL compiler, and are using it to produce specialized high‐performance code for complex distributed applications. Flick can produce specially “decomposed” stubs that encapsulate different aspects of communication in separate functions, thus providing application programmers with fine‐grain control over all messages. The design of our decomposed stubs was inspired by the requirements of a particular distributed application called Khazana, and in this paper we describe our experience to date in refitting Khazana with Flick‐generated stubs. We believe that the special IDL compilation techniques developed for Khazana will be useful in other applications with similar communication requirements.
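A decomposed stub in the sense described might look like the following sketch (a hypothetical add(a, b) RPC; the function names are illustrative, not Flick output): marshaling, dispatch, and unmarshaling live in separate functions, so the application can interpose on any stage, e.g. for batching or custom transports:

```python
import struct

def marshal_add_request(a, b):
    # Pack two signed 32-bit ints in network byte order.
    return struct.pack("!ii", a, b)

def unmarshal_add_request(buf):
    return struct.unpack("!ii", buf)

def marshal_add_reply(result):
    return struct.pack("!i", result)

def unmarshal_add_reply(buf):
    return struct.unpack("!i", buf)[0]

def server_dispatch(request):
    a, b = unmarshal_add_request(request)
    return marshal_add_reply(a + b)

# A conventional monolithic stub would fuse these steps behind one call;
# here the caller controls each message explicitly.
reply = server_dispatch(marshal_add_request(2, 3))
print(unmarshal_add_reply(reply))  # 5
```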

  19. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendents; sending to the selected node only the compiled software to be executed by the selected node or selected node's descendent.
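The selection-and-forwarding step claimed above can be sketched as follows, using a hypothetical node tree and artifact map: the compiling node keeps the software it must execute itself and sends each child only the compiled software destined for that child's subtree:

```python
# Hypothetical processing-system hierarchy and compiled artifacts.
tree = {"root": ["a", "b"], "a": ["a1", "a2"], "b": [], "a1": [], "a2": []}
artifacts = {"root": "bin-root", "a1": "bin-a1", "a2": "bin-a2", "b": "bin-b"}

def subtree(node):
    """All nodes in the subtree rooted at `node`, including itself."""
    nodes = {node}
    for child in tree.get(node, []):
        nodes |= subtree(child)
    return nodes

def distribute(node):
    """Return, per child, only the artifacts its subtree needs.

    The compiling node's own artifact (here "bin-root") is retained,
    not forwarded."""
    plan = {}
    for child in tree.get(node, []):
        needed = {n: artifacts[n] for n in subtree(child) if n in artifacts}
        if needed:
            plan[child] = needed
    return plan

print(distribute("root"))
# child "a" receives only bin-a1 and bin-a2; child "b" receives only bin-b
```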

  20. Not mere lexicographic cosmetics: the compilation and structural ...

    African Journals Online (AJOL)

mode of communication between the dictionary compilers and users who are ... during the compilation of the ISM and the strategies the compilers employed to ...

  1. Code Generation in the Columbia Esterel Compiler

    Directory of Open Access Journals (Sweden)

    Jia Zeng

    2007-02-01

Full Text Available The synchronous language Esterel provides deterministic concurrency by adopting a semantics in which threads march in step with a global clock and communicate in a very disciplined way. Its expressive power comes at a cost, however: it is a difficult language to compile into machine code for standard von Neumann processors. The open-source Columbia Esterel Compiler is a research vehicle for experimenting with new code generation techniques for the language. Providing a front-end and a fairly generic concurrent intermediate representation, a variety of back-ends have been developed. We present three of the most mature ones, which are based on program dependence graphs, dynamic lists, and a virtual machine. After describing the very different algorithms used in each of these techniques, we present experimental results that compare twenty-four benchmarks generated by eight different compilation techniques running on seven different processors.
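The lock-step semantics described above can be mimicked in miniature with Python generators: each "thread" advances exactly one step per global clock tick and communicates through signals that are reset every instant. This is a sketch of the semantics only; a real Esterel compiler schedules threads by causality analysis, whereas here the order is simply fixed by the list:

```python
def emitter(signals):
    while True:
        signals["S"] = True   # emit signal S this instant
        yield

def reactor(signals, log):
    while True:
        log.append("saw S" if signals.get("S") else "no S")
        yield

def run(ticks):
    log = []
    signals = {}
    threads = [emitter(signals), reactor(signals, log)]
    for _ in range(ticks):
        signals.clear()       # signals are per-instant
        for t in threads:     # every thread takes one step per tick
            next(t)
    return log

print(run(3))  # ['saw S', 'saw S', 'saw S']
```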

  2. Code Generation in the Columbia Esterel Compiler

    Directory of Open Access Journals (Sweden)

    Edwards, Stephen A.

    2007-01-01

Full Text Available The synchronous language Esterel provides deterministic concurrency by adopting a semantics in which threads march in step with a global clock and communicate in a very disciplined way. Its expressive power comes at a cost, however: it is a difficult language to compile into machine code for standard von Neumann processors. The open-source Columbia Esterel Compiler is a research vehicle for experimenting with new code generation techniques for the language. Providing a front-end and a fairly generic concurrent intermediate representation, a variety of back-ends have been developed. We present three of the most mature ones, which are based on program dependence graphs, dynamic lists, and a virtual machine. After describing the very different algorithms used in each of these techniques, we present experimental results that compare twenty-four benchmarks generated by eight different compilation techniques running on seven different processors.

  3. Extension of Alvis compiler front-end

    Energy Technology Data Exchange (ETDEWEB)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl [AGH University of Science and Technology, Department of Applied Computer Science, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2015-12-31

Alvis is a formal modelling language that enables verification of distributed concurrent systems. The semantics of an Alvis model is expressed as an LTS graph (labelled transition system): execution of any language statement is a transition between formally defined states of the model. The LTS graph is generated from an intermediate Haskell representation of the Alvis model. Haskell is also used as a part of the Alvis language itself, to define parameter types and operations on them. Thanks to the compiler's modular construction, many aspects of the compilation of an Alvis model can be modified. Providing new plugins for the Alvis compiler that support languages like Java or C would make it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler's internal model and describes how the default specification language can be replaced by new plugins.
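LTS generation of the kind described reduces to a graph search over model states. The following sketch (a toy counter model, not Alvis syntax) enumerates states and labelled transitions from an initial state:

```python
from collections import deque

def transitions(state):
    """Toy model: a counter that can inc/dec within [0, 2]."""
    if state < 2:
        yield ("inc", state + 1)
    if state > 0:
        yield ("dec", state - 1)

def build_lts(initial):
    """Breadth-first exploration yielding all states and labelled edges."""
    states, edges = {initial}, []
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        for label, t in transitions(s):
            edges.append((s, label, t))
            if t not in states:
                states.add(t)
                queue.append(t)
    return states, edges

states, edges = build_lts(0)
print(sorted(states))  # [0, 1, 2]
```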

  4. Compilation of data on elementary particles

    Energy Technology Data Exchange (ETDEWEB)

    Trippe, T.G.

    1984-09-01

    The most widely used data compilation in the field of elementary particle physics is the Review of Particle Properties. The origin, development and current state of this compilation are described with emphasis on the features which have contributed to its success: active involvement of particle physicists; critical evaluation and review of the data; completeness of coverage; regular distribution of reliable summaries including a pocket edition; heavy involvement of expert consultants; and international collaboration. The current state of the Review and new developments such as providing interactive access to the Review's database are described. Problems and solutions related to maintaining a strong and supportive relationship between compilation groups and the researchers who produce and use the data are discussed.

  5. Compiler Directed Architecture-Dependent Communication Optimizations.

    Science.gov (United States)

    1995-06-01

conceptually simpler than general connection creation, because all nodes open and close connections at a common point in the program. The compiler can...occurrence A(lb:ub:s) is map_A(i) = s·i + lb. For array dimensions with only scalar references, the initial map is simpler. The scalar is the initial...offset and the multiplier is assumed to be one, e.g. A(x) is map_A(i) = i + x. If the scalar is a loop index, the compiler can derive more information about
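Written out concretely, the affine index maps in the excerpt are map_A(i) = s·i + lb for an array section A(lb:ub:s) and map_A(i) = i + x for a scalar reference A(x). The helpers below are illustrative, not the compiler's own code:

```python
def section_map(lb, s):
    """Index map for a Fortran-style section A(lb:ub:s)."""
    return lambda i: s * i + lb

def scalar_map(x):
    """Index map for a scalar reference A(x): offset x, multiplier 1."""
    return lambda i: i + x

m = section_map(lb=3, s=2)          # A(3:ub:2)
elems = [m(i) for i in range(4)]
print(elems)                        # [3, 5, 7, 9]
print(scalar_map(5)(0))             # 5
```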

  6. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks, whose PRIMULA-generated propositional instances have thousands of variables, and whose jointrees have clusters...
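The evaluate-and-differentiate step can be sketched directly: one forward pass over the circuit nodes computes values, one reverse pass accumulates all partial derivatives, both linear in circuit size. This is a minimal sketch; the node encoding is illustrative, not PRIMULA's:

```python
# Circuit nodes, in topological order, referenced by index:
# ('in', name), ('mul', a, b), ('add', a, b).
circuit = [
    ("in", "theta"),    # 0
    ("in", "lam"),      # 1
    ("mul", 0, 1),      # 2: theta * lam
    ("add", 2, 0),      # 3: theta*lam + theta  (root)
]

def evaluate(circuit, values):
    """Forward pass: value of every node, one visit each."""
    v = []
    for node in circuit:
        if node[0] == "in":
            v.append(values[node[1]])
        elif node[0] == "mul":
            v.append(v[node[1]] * v[node[2]])
        else:
            v.append(v[node[1]] + v[node[2]])
    return v

def differentiate(circuit, v):
    """Reverse pass: d[i] = d(root)/d(node i)."""
    d = [0.0] * len(circuit)
    d[-1] = 1.0
    for i in range(len(circuit) - 1, -1, -1):
        kind = circuit[i][0]
        if kind == "mul":
            a, b = circuit[i][1], circuit[i][2]
            d[a] += d[i] * v[b]
            d[b] += d[i] * v[a]
        elif kind == "add":
            d[circuit[i][1]] += d[i]
            d[circuit[i][2]] += d[i]
    return d

v = evaluate(circuit, {"theta": 2.0, "lam": 3.0})
d = differentiate(circuit, v)
print(v[-1], d[0])  # 8.0 4.0  (d/dtheta = lam + 1)
```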

  7. abc: An Extensible AspectJ Compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie J.

    2006-01-01

checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its front end is built using the Polyglot framework, as a modular extension of the Java... overview of how to use abc to implement an extension. We illustrate the extension mechanisms of abc through a number of small, but nontrivial, examples. We then proceed to contrast the design goals of abc with those of the original AspectJ compiler, and how these different goals have led to different...

  8. Design of methodology for incremental compiler construction

    Directory of Open Access Journals (Sweden)

    Pavel Haluza

    2011-01-01

Full Text Available The paper deals with possibilities of incremental compiler construction, covering languages with a fixed set of lexical units as well as languages with a variable set of lexical units. The proposed methodology for incremental compiler construction is based on known algorithms for standard compiler construction, derived for both groups of languages. The group of languages with a fixed set of lexical units comprises languages in which each lexical unit has a constant meaning, e.g., common programming languages. For this group the paper addresses the problem of incremental semantic analysis, which is based on incremental parsing. In the group of languages with a variable set of lexical units (e.g., the professional typographic system TEX), the meaning of each character in the input can be changed arbitrarily at any time during processing. The change takes effect immediately, and its validity is either limited in some way or extends to the end of the input. For this group the paper addresses the case where macros temporarily change the category of arbitrary characters.
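The "variable set of lexical units" case can be sketched with a TeX-like category-code table that the input itself may modify. In this hypothetical lexer (not TEX syntax), a switch character '%' immediately reassigns the category of the character that follows it, and the change holds for the rest of the input:

```python
def tokenize(chars, catcode):
    """Tokenize `chars` under a mutable category table.

    A character of category 'switch' consumes the next character and
    reassigns it to category 'letter'; the change takes effect
    immediately for all later occurrences."""
    out = []
    i = 0
    while i < len(chars):
        ch = chars[i]
        if catcode.get(ch) == "switch":
            catcode[chars[i + 1]] = "letter"   # immediate reassignment
            i += 2
            continue
        out.append((catcode.get(ch, "letter"), ch))
        i += 1
    return out

cats = {"%": "switch", "#": "comment"}
toks = tokenize("a#b%#c#d", cats)
print(toks)
# '#' lexes as a comment before the switch and as a letter after it
```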

  9. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and ...

  10. Compilation of information on melter modeling

    Energy Technology Data Exchange (ETDEWEB)

    Eyler, L.L.

    1996-03-01

    The objective of the task described in this report is to compile information on modeling capabilities for the High-Temperature Melter and the Cold Crucible Melter and issue a modeling capabilities letter report summarizing existing modeling capabilities. The report is to include strategy recommendations for future modeling efforts to support the High Level Waste (HLW) melter development.

  11. Compiling the First Monolingual Lusoga Dictionary

    African Journals Online (AJOL)

    rbr

    theory explains how the different modules of a language contribute information to the different .... lexicography is not a branch of linguistics but a discipline of its own. ... languages. Lusoga was categorized as an undocumented language but the contexts in which the dictionaries in the two contexts were compiled is not the.

  12. abc: An extensible AspectJ compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its frontend is built, using the Polyglot framework, as a modular extension of the Java...

  13. The Compilation of Bilingual Dictionaries between African ...

    African Journals Online (AJOL)

    rbr

The Compilation of Bilingual Dictionaries between African Languages in South Africa: The Case of Northern Sotho and Tshivenda*. Kwena J. Mashamaite, Department of Northern Sotho, University of the North, Pietersburg, Republic of South Africa. (mashamaitek@unin.unorth.ac.za). Abstract: Bilingual dictionaries ...

  14. Expectation Levels in Dictionary Consultation and Compilation ...

    African Journals Online (AJOL)

    Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of the structure ...

  15. Criteria for Evaluating the Performance of Compilers

    Science.gov (United States)

    1974-10-01

See Section 2.) Whereas in a multi-pass compiler, the code generation production would usually result in some activity like the tree-builder to develop...Profile data for the list could be automated. (Several ideas on how this might be done are discussed in Section 3 of Chapter 8.) * Chapter 9 presents

  16. X-ray holographic microscopy experiments at the Brookhaven synchrotron light source

    Energy Technology Data Exchange (ETDEWEB)

    Howells, M.R.; Iarocci, M.; Kenney, J.; Kirz, J.; Rarback, H.

    1983-01-01

Soft x-ray holographic microscopy is discussed from an experimental point of view. Three series of measurements have been carried out using the Brookhaven 750 MeV storage ring as an x-ray source. Young's slit fringes, Gabor (in-line) holograms and various data pertaining to the soft x-ray performance of photographic plates are reported. The measurements are discussed in terms of the technique for recording them and the experimental limitations in effect. Some discussion is also given of the issues involved in reconstruction using visible light.

  17. Epidemiologic surveillance. [1994] amended annual report for Brookhaven National Laboratory. Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    Epidemiologic surveillance at DOE facilities consists of regular and systematic collection, analysis, and interpretation of data on absences due to illness and injury in the work force. Its purpose is to provide an early warning system for health problems occurring among employees at participating sites. Data are collected by coordinators at each site and submitted to the Epidemiologic Surveillance Data Center, located at the Oak Ridge Institute for Science and Education, where quality control procedures and analyses are carried out. Rates of absences and rates of diagnoses associated with absences are analyzed by occupation and other relevant variables. They may be compared with the disease experience of different groups within the DOE work force and with populations that do not work for DOE to identify disease patterns or clusters that may be associated with work activities. This amended annual report corrects errors in the initial release of the BNL report for 1994. In this annual report, the 1994 morbidity data for BNL are summarized.

  18. The electron lens test bench for the relativistic heavy ion collider at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Gu, X., E-mail: xgu@bnl.gov; Altinbas, F.Z.; Beebe, E.; Fischer, W.; Frak, B.M.; Gassner, D.M.; Hamdi, K.; Hock, J.; Hoff, L.; Kankiya, P.; Lambiase, R.; Luo, Y.; Mapes, M.; Mi, J.; Miller, T.; Montag, C.; Nemesure, S.; Okamura, M.; Olsen, R.H.; Pikin, A.I.; and others

    2014-04-11

To compensate for the beam–beam effects from the proton–proton interactions at the two interaction points IP6 and IP8 in the Relativistic Heavy Ion Collider (RHIC), we are constructing two electron lenses (e-lenses) that we plan to install in the interaction region IR10. Before installing them, the electron gun, collector, and instrumentation were tested and the electron beam properties were qualified on an electron lens test bench. We will present the test results and discuss our measurement of the electron beam current and of the electron gun perveance. We achieved a maximum current of 1 A with 5 kV energy for both the pulsed- and the DC-beam (which is a long turn-by-turn pulse beam). We measured beam transverse profiles with an yttrium aluminum garnet (YAG) screen and a pinhole detector, and compared those to simulated beam profiles. Measurements of the pulsed electron beam stability were obtained by measuring the modulator voltage.
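The gun perveance mentioned above follows from the standard definition P = I / V^(3/2); the abstract does not quote a value, but for the reported 1 A at 5 kV operating point it works out to roughly 2.8 microperveance:

```python
# Electron-gun perveance at the reported operating point,
# using the standard definition P = I / V**1.5.
I = 1.0       # beam current, amperes
V = 5000.0    # gun voltage, volts
P = I / V**1.5
print(f"{P * 1e6:.2f} microperveance")  # 2.83 microperveance
```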

  19. BROOKHAVEN NATIONAL LABORATORY INSTRUMENTATION DIVISION, R AND D PROGRAMS, FACILITIES, STAFF.

    Energy Technology Data Exchange (ETDEWEB)

    INSTRUMENTATION DIVISION STAFF

    1999-06-01

To develop state-of-the-art instrumentation required for experimental research programs at BNL, and to maintain the expertise and facilities in specialized high technology areas essential for this work. Development of facilities is motivated by present BNL research programs and anticipated future directions of BNL research. The Division's research efforts also have a significant impact on programs throughout the world that rely on state-of-the-art radiation detectors and readout electronics. Our staff scientists are encouraged to: Become involved in challenging problems in collaborations with other scientists; Offer unique expertise in solving problems; and Develop new devices and instruments when not commercially available. Scientists from other BNL Departments are encouraged to bring problems and ideas directly to the Division staff members with the appropriate expertise. Division staff is encouraged to become involved with research problems in other Departments to advance the application of new ideas in instrumentation. The Division Head integrates these efforts when they evolve into larger projects, within available staff and budget resources, and defines the priorities and direction with concurrence of appropriate Laboratory program leaders. The Division Head also ensures that these efforts are accompanied by strict adherence to all ES&H regulatory mandates and policies of the Laboratory. The responsibility for safety and environmental protection is integrated with supervision of particular facilities and conduct of operations.

  20. Waste management technology development and demonstration programs at Brookhaven National Laboratory

    Science.gov (United States)

    Kalb, Paul D.; Colombo, Peter

    1991-01-01

    Two thermoplastic processes for improved treatment of radioactive, hazardous, and mixed wastes were developed from bench scale through technology demonstration: polyethylene encapsulation and modified sulfur cement encapsulation. The steps required to bring technologies from the research and development stage through full scale implementation are described. Both systems result in durable waste forms that meet current Nuclear Regulatory Commission and Environmental Protection Agency regulatory criteria and provide significant improvements over conventional solidification systems such as hydraulic cement. For example, the polyethylene process can encapsulate up to 70 wt pct. nitrate salt, compared with a maximum of about 20 wt pct. for the best hydraulic cement formulation. Modified sulfur cement waste forms containing as much as 43 wt pct. incinerator fly ash were formulated, whereas the maximum quantity of this waste in hydraulic cement is 16 wt pct.

  1. National Nuclear Physics Summer School

    CERN Document Server

    2016-01-01

    The 2016 National Nuclear Physics Summer School (NNPSS) will be held from Monday July 18 through Friday July 29, 2016, at the Massachusetts Institute of Technology (MIT). The summer school is open to graduate students and postdocs within a few years of their PhD (on either side) with a strong interest in experimental and theoretical nuclear physics. The program will include the following speakers: Accelerators and Detectors - Elke-Caroline Aschenauer, Brookhaven National Laboratory Data Analysis - Michael Williams, MIT Double Beta Decay - Lindley Winslow, MIT Electron-Ion Collider - Abhay Deshpande, Stony Brook University Fundamental Symmetries - Vincenzo Cirigliano, Los Alamos National Laboratory Hadronic Spectroscopy - Matthew Shepherd, Indiana University Hadronic Structure - Jianwei Qiu, Brookhaven National Laboratory Hot Dense Nuclear Matter 1 - Jamie Nagle, Colorado University Hot Dense Nuclear Matter 2 - Wilke van der Schee, MIT Lattice QCD - Sinead Ryan, Trinity College Dublin Neutrino Theory - Cecil...

  2. Doing More with Less: Cost-effective, Compact Particle Accelerators (489th Brookhaven Lecture)

    Energy Technology Data Exchange (ETDEWEB)

    Trbojevic, Dejan [BNL Collider-Accelerator Department

    2013-10-22

Replace a 135-ton magnet used for cancer-fighting particle therapies with a magnet that weighs only two tons? Such a swap is becoming possible thanks to new particle accelerator advances being developed by researchers at Brookhaven Lab. With an approach that combines techniques used by synchrotron accelerators with the ability to accept more energy, these new technologies could be used for more than fighting cancer. They could also reduce the lifetime of byproducts from nuclear power plants and reduce costs for eRHIC—a proposed electron-ion collider for Brookhaven Lab that researchers from around the world would use to explore the glue that holds together the universe’s most basic building blocks and investigate the proton-spin puzzle. During this lecture, Dr. Trbojevic provides an overview of accelerator technologies and techniques—particularly a non-scaling, fixed-field alternating gradient—to focus particle beams using fewer, smaller magnets. He discusses how these technologies will benefit eRHIC and other applications, including particle therapies being developed to combat cancer.

  3. Magic Lenses for RHIC: Compensating Beam-beam Interaction (488th Brookhaven Lecture)

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Yun [BNL Collider-Accelerator Department

    2013-07-17

Scientists at Brookhaven Lab’s Relativistic Heavy Ion Collider (RHIC) smash atomic particles together to understand more about why the physical world works the way it does. Increasing rates of particle collisions, or luminosity, at RHIC is no small challenge, but the results—more data for better clues—are crucial for scientists trying to answer big questions about the origins of matter and mass. When scientists at RHIC collide protons, they don’t hope for a head-on crash by focusing only two particles roaring toward each other from opposite directions. For all intents and purposes, that would be impossible. The scientists can smash protons because they significantly increase the likelihood of collisions by steering hundreds of billions clumped into bunches, which at RHIC are about 3.5 meters long and less than 1 millimeter tall. The particles of these bunches are all positively charged, so when they interact, they repel outwardly—think how magnets repel when their same poles are pushed together. Although this decreases the density of each bunch, reducing luminosity, scientists in Brookhaven Lab’s Collider-Accelerator Department (C-AD) have a solution. After more than seven years of development, the scientists have designed an electron-lens system that uses electrons’ negative charges to attract positively charged proton bunches and minimize their repelling tendencies. Combined with other upgrades to the RHIC accelerator complex, these lenses are important components in efforts towards the major task of doubling the luminosity for proton-proton collisions.

  4. Manipulating Light to Understand and Improve Solar Cells (494th Brookhaven Lecture)

    Energy Technology Data Exchange (ETDEWEB)

    Eisaman, Matthew [BNL, Sustainable Energy Technologies Department

    2014-04-16

    Energy consumption around the world is projected to approximately triple by the end of the century, according to the 2005 Report from the U.S. Department of Energy's Basic Energy Sciences Workshop on Solar Energy Utilization. Much will change in those next 86 years, but for all the power the world needs—for everything from manufacturing and transportation to air conditioning and charging cell phone batteries—improved solar cells will be crucial to meet this future energy demand with renewable energy sources. At Brookhaven Lab, scientists are probing solar cells and exploring variations within the cells—variations so small they are measured in billionths of a meter—in order to make increasingly efficient solar cells and ultimately help reduce the overall costs of deploying solar power plants. Dr. Eisaman will discuss DOE's SunShot Initiative, which aims to reduce the cost of solar cell-generated electricity by 2020. He will also discuss how he and collaborators at Brookhaven Lab are probing different material compositions within solar cells, measuring how efficiently they collect electrical charge, helping to develop a new class of solar cells, and improving solar-cell manufacturing processes.

  5. Testing-Based Compiler Validation for Synchronous Languages

    Science.gov (United States)

    Garoche, Pierre-Loic; Howar, Falk; Kahsai, Temesghen; Thirioux, Xavier

    2014-01-01

    In this paper we present a novel lightweight approach to validate compilers for synchronous languages. Instead of verifying a compiler for all input programs or providing a fixed suite of regression tests, we extend the compiler to generate a test-suite with high behavioral coverage and geared towards discovery of faults for every compiled artifact. We have implemented and evaluated our approach using a compiler from Lustre to C.
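    The core idea of testing-based compiler validation can be sketched as a differential harness: the same input stream is fed both to a reference semantics and to the compiled artifact, step by step, and the outputs are compared. The toy synchronous node below (a running sum that resets on negative input) is a hypothetical example for illustration, not one from the paper:

```python
import random

def reference_step(state, inp):
    """Reference semantics of a hypothetical synchronous node:
    a running sum that resets to 0 on a negative input."""
    total = 0 if inp < 0 else state + inp
    return total, total  # (next state, output)

def compiled_step(state, inp):
    """Stand-in for the C step function emitted by the compiler under test."""
    return (0, 0) if inp < 0 else (state + inp, state + inp)

def validate(steps=1000, seed=0):
    """Run both implementations in lockstep on random inputs
    and report whether their outputs ever diverge."""
    rng = random.Random(seed)
    s_ref = s_cmp = 0
    for _ in range(steps):
        inp = rng.randint(-10, 10)
        s_ref, out_ref = reference_step(s_ref, inp)
        s_cmp, out_cmp = compiled_step(s_cmp, inp)
        if out_ref != out_cmp:
            return False  # behavioral divergence: a compiler fault
    return True
```

A real test suite, as described in the abstract, would additionally steer input generation toward high behavioral coverage rather than relying on uniform random inputs.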

  6. Process compilation methods for thin film devices

    Science.gov (United States)

    Zaman, Mohammed Hasanuz

    This doctoral thesis presents the development of a systematic method for automatic generation of fabrication processes (or process flows) for thin film devices, starting from schematics of the device structures. This new top-down design methodology combines formal mathematical flow construction methods with a set of library-specific available resources to generate flows compatible with a particular laboratory. Because this methodology combines laboratory resource libraries with a logical description of thin film device structure and generates a set of sequential fabrication processing instructions, the procedure is referred to as process compilation, in analogy to the compilation of computer programs. The method uses a partially ordered set (poset) representation of the final device structure, which describes the order between its various components in the form of a directed graph. Each of these components is essentially fabricated "one at a time" in a sequential fashion. If the directed graph is acyclic, the sequences in which these components can be fabricated are determined from the linear extensions of the poset, and each component sequence is finally expanded into the corresponding process flow. This graph-theoretic process flow construction method is powerful enough to formally prove the existence and multiplicity of flows, thus creating a design space D suitable for optimization. The cardinality |D| for a device with N components can be large, with a worst case |D| ≤ (N-1)!, yielding in general a combinatorial explosion of solutions. The number of solutions is hence controlled through a priori estimates of |D| and condensation (i.e., reduction) of the device component graph. The mathematical method has been implemented in a set of algorithms that are part of the software tool MISTIC (Michigan Synthesis Tools for Integrated Circuits). MISTIC is a planar process compiler that generates
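    The construction described above can be illustrated with a small sketch: valid fabrication sequences are exactly the linear extensions (topological orderings) of the component poset. The four-component device and its precedence edges below are hypothetical, not taken from the thesis:

```python
from itertools import permutations

def linear_extensions(nodes, edges):
    """Enumerate all linear extensions of a DAG by brute force.

    nodes: component names; edges: (a, b) pairs meaning
    "a must be fabricated before b". In the worst case the
    number of extensions approaches (N-1)!.
    """
    exts = []
    for perm in permutations(nodes):
        pos = {n: i for i, n in enumerate(perm)}
        if all(pos[a] < pos[b] for a, b in edges):
            exts.append(perm)
    return exts

# A hypothetical 4-component device: oxide and metal layers both
# sit on the substrate, and a contact is formed over both.
flows = linear_extensions(
    ["substrate", "oxide", "metal", "contact"],
    {("substrate", "oxide"), ("substrate", "metal"),
     ("oxide", "contact"), ("metal", "contact")},
)
# Two valid process flows: oxide-then-metal or metal-then-oxide.
```

A practical compiler would not enumerate permutations; the brute-force check is only to make the poset-to-flow correspondence concrete.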

  7. Compilation of Physicochemical and Toxicological Information ...

    Science.gov (United States)

    The purpose of this product is to make accessible information about the 1,173 hydraulic fracturing-related chemicals that were listed in the external review draft of the Hydraulic Fracturing Drinking Water Assessment that was released recently. The product consists of a series of spreadsheets with physicochemical and toxicological information pulled from several sources, including EPI Suite, LeadScope, QikProp, Reaxys, IRIS, PPRTV, and ATSDR, among others. The spreadsheets also contain background information about how the list of chemicals was compiled, what the different sources of chemical information are, and definitions and descriptions of the values presented.

  8. COMPILATION OF A SKI RESORT MAP

    Directory of Open Access Journals (Sweden)

    N. V. Zhuravlev

    2016-01-01

    Full Text Available A technique for compiling tourist maps is suggested, based on the formation of a basic cartographic material supplemented by the elaboration of derived thematic maps in a GIS environment. It will be possible to apply the local GIS for new investigations of a tourist-recreational cluster. Avalanche-prone areas, which are zones of elevated risk, received special attention.

  9. Nuclear Data Compilation for Beta Decay Isotope

    Science.gov (United States)

    Olmsted, Susan; Kelley, John; Sheu, Grace

    2015-10-01

    The Triangle Universities Nuclear Laboratory nuclear data group works with the Nuclear Structure and Decay Data network to compile and evaluate data for use in nuclear physics research and applied technologies. Teams of data evaluators search through the literature and examine the experimental values for various nuclear structure parameters. The present activity focused on reviewing all available literature to determine the most accurate half-life values for beta-unstable isotopes in the A = 3-20 range. This analysis will eventually be folded into the ENSDF (Evaluated Nuclear Structure Data File). By surveying an accumulated compilation of reference articles, we gathered all of the experimental half-life values for the beta-decay nuclides. We then used the Visual Averaging Library, a data evaluation software package, to find half-life values using several different averaging techniques. Ultimately, we found recommended half-life values for most of the mentioned beta-decay isotopes, and updated pages on the TUNL website to reflect these evaluations. To summarize, we compiled and evaluated literature reports on experimentally determined half-lives. Our findings have been used to update information given on the TUNL Nuclear Data Evaluation group website. This was an REU project with Triangle Universities Nuclear Laboratory.
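    The averaging step can be sketched as an inverse-variance weighted mean, one of the standard techniques such an evaluation library offers (the measurement values below are invented for illustration, not actual half-life data):

```python
def weighted_average(values, sigmas):
    """Inverse-variance weighted mean of independent measurements,
    returning the mean and the 1-sigma uncertainty of the result."""
    weights = [1.0 / s ** 2 for s in sigmas]
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    return mean, (1.0 / total) ** 0.5

# Hypothetical half-life measurements (ms) with their 1-sigma errors:
mean, err = weighted_average([838.0, 836.5, 839.2], [1.5, 2.0, 1.0])
```

More precise measurements (smaller sigma) dominate the average, which is why evaluators record the quoted 1-sigma error alongside each literature value.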

  10. National Energy Software Center: compilation of program abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.M.; Butler, M.K.; De Bruler, M.M.

    1979-05-01

    This is the third complete revision of program abstracts undertaken by the Center. Programs of the IBM 7040, 7090, and CDC 3600 vintage have been removed. Historical data and information on abstract format, program package contents, and subject classification are given. The following subject areas are included in the library: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and power plant economics; space-independent kinetics; space--time kinetics, coupled neutronics--hydrodynamics--thermodynamics and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis and engineering design studies; gamma heating and shield design; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; electronics, engineering equipment, and energy systems studies; chemistry; particle accelerators and high-voltage machines; physics; magnetic fusion research; data. (RWR)

  11. Compilation of Input-Output Data from the National Accounts

    NARCIS (Netherlands)

    Konijn, P.J.A.; Konijn, P.J.A.; Steenge, A.E.

    1995-01-01

    In this paper, a new method is presented to derive an input-output table from a system of make and use tables. The method, which we call 'activity technology', is mathematically equivalent to the well-known commodity technology, but chooses another unit, i.e. the activity. We will argue that, in the

  12. Evaluation of HAL/S language compilability using SAMSO's Compiler Writing System (CWS)

    Science.gov (United States)

    Feliciano, M.; Anderson, H. D.; Bond, J. W., III

    1976-01-01

    NASA/Langley is engaged in a program to develop an adaptable guidance and control software concept for spacecraft such as shuttle-launched payloads. It is envisioned that this flight software be written in a higher-order language, such as HAL/S, to facilitate changes or additions. To make this adaptable software transferable to various onboard computers, a compiler writing system capability is necessary. A joint program with the Air Force Space and Missile Systems Organization was initiated to determine if the Compiler Writing System (CWS) owned by the Air Force could be utilized for this purpose. The present study explores the feasibility of including the HAL/S language constructs in CWS and the effort required to implement these constructs. This will determine the compilability of HAL/S using CWS and permit NASA/Langley to identify the HAL/S constructs desired for their applications. The study consisted of comparing the implementation of the Space Programming Language using CWS with the requirements for the implementation of HAL/S. It is the conclusion of the study that CWS already contains many of the language features of HAL/S and that it can be expanded for compiling part or all of HAL/S. It is assumed that persons reading and evaluating this report have a basic familiarity with (1) the principles of compiler construction and operation, and (2) the logical structure and applications characteristics of HAL/S and SPL.

  13. Fundamentals of Physical Design and Query Compilation

    CERN Document Server

    Toman, David

    2011-01-01

    Query compilation is the problem of translating user requests formulated over purely conceptual and domain specific ways of understanding data, commonly called logical designs, to efficient executable programs called query plans. Such plans access various concrete data sources through their low-level often iterator-based interfaces. An appreciation of the concrete data sources, their interfaces and how such capabilities relate to logical design is commonly called a physical design. This book is an introduction to the fundamental methods underlying database technology that solves the problem of

  14. Transportation legislative data base: State radioactive materials transportation statute compilation, 1989--1993

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-04-01

    The Transportation Legislative Data Base (TLDB) is a computer-based information service containing summaries of federal, state and certain local government statutes and regulations relating to the transportation of radioactive materials in the United States. The TLDB has been operated by the National Conference of State Legislatures (NCSL) under cooperative agreement with the US Department of Energy`s (DOE) Office of Civilian Radioactive Waste Management since 1992. The data base system serves the legislative and regulatory information needs of federal, state, tribal and local governments, the affected private sector and interested members of the general public. Users must be approved by DOE and NCSL. This report is a state statute compilation that updates the 1989 compilation produced by Battelle Memorial Institute, the previous manager of the data base. This compilation includes statutes not included in the prior compilation, as well as newly enacted laws. Statutes not included in the prior compilation show an enactment date prior to 1989. Statutes that deal with low-level radioactive waste transportation are included in the data base as are statutes from the states of Alaska and Hawaii. Over 155 new entries to the data base are summarized in this compilation.

  15. Compilation of functional languages using flow graph analysis

    NARCIS (Netherlands)

    Hartel, Pieter H.; Glaser, Hugh; Wild, John M.

    A system based on the notion of a flow graph is used to specify formally and to implement a compiler for a lazy functional language. The compiler takes a simple functional language as input and generates C. The generated C program can then be compiled, and loaded with an extensive run-time system to

  16. Code Commentary and Automatic Refactorings using Feedback from Multiple Compilers

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Probst, Christian W.; Karlsson, Sven

    2014-01-01

    Optimizing compilers are essential to the performance of parallel programs on multi-core systems. It is attractive to expose parallelism to the compiler letting it do the heavy lifting. Unfortunately, it is hard to write code that compilers are able to optimize aggressively and therefore tools...

  17. Proceedings of RIKEN BNL Research Center Workshop: Brookhaven Summer Program on Quarkonium Production in Elementary and Heavy Ion Collisions

    Energy Technology Data Exchange (ETDEWEB)

    Dumitru, A.; Lourenco, C.; Petreczky, P.; Qiu, J., Ruan, L.

    2011-08-03

    Understanding the structure of the hadron is of fundamental importance in subatomic physics. Production of heavy quarkonia is arguably one of the most fascinating subjects in strong interaction physics. It offers unique perspectives into the formation of QCD bound states. Heavy quarkonia are among the most studied particles both theoretically and experimentally. They have been, and continue to be, the focus of measurements in all high energy colliders around the world. Because of their distinct multiple mass scales, heavy quarkonia were suggested as a probe of the hot quark-gluon matter produced in heavy-ion collisions; and their production has been one of the main subjects of the experimental heavy-ion programs at the SPS and RHIC. However, since the discovery of J/psi at Brookhaven National Laboratory and SLAC National Accelerator Laboratory over 36 years ago, theorists still have not been able to fully understand the production mechanism of heavy quarkonia, although major progress has been made in recent years. With this in mind, a two-week program on quarkonium production was organized at BNL on June 6-17, 2011. Many new experimental data from the LHC and from RHIC were presented during the program, including results from the LHC heavy ion run. To analyze and correctly interpret these measurements, and in order to quantify properties of the hot matter produced in heavy-ion collisions, it is necessary to improve our theoretical understanding of quarkonium production. Therefore, a wide range of theoretical aspects of the production mechanism in the vacuum as well as in cold nuclear and hot quark-gluon medium were discussed during the program, from controlled calculations in QCD and its effective theories such as NRQCD to various models, and to first-principles lattice calculations. The scientific program was divided into three major scientific parts: basic production mechanism for heavy quarkonium in vacuum or in high energy elementary collisions; the

  18. Global Seismicity: Three New Maps Compiled with Geographic Information Systems

    Science.gov (United States)

    Lowman, Paul D., Jr.; Montgomery, Brian C.

    1996-01-01

    This paper presents three new maps of global seismicity compiled from NOAA digital data, covering the interval 1963-1998, with three different magnitude ranges (mb): greater than 3.5, less than 3.5, and all detectable magnitudes. A commercially available geographic information system (GIS) was used as the database manager. Epicenter locations were acquired from a CD-ROM supplied by the National Geophysical Data Center. A methodology is presented that can be followed by general users. The implications of the maps are discussed, including the limitations of conventional plate models, and the different tectonic behavior of continental vs. oceanic lithosphere. Several little-known areas of intraplate or passive margin seismicity are also discussed, possibly expressing horizontal compression generated by ridge push.

  19. Application of Halstead’s Timing Model to Predict the Compilation Time of Ada Compilers

    Science.gov (United States)

    1986-12-01

    conducted by Kerlinger (10:10), Campbell and Standley (10:10), and Elshoff (11:14-16) have tested Halstead's formulas with impressive results, thereby...Review. JJL (2): 80-114 (Summer 1982). 20. Relph, Richard, Steve Hahn, and Fred Vilea. "Benchmarking C Compilers," Dr. Dobb's Journal of Software Tools

  20. 36 CFR 902.57 - Investigatory files compiled for law enforcement purposes.

    Science.gov (United States)

    2010-07-01

    ... enforcement proceedings; (2) Deprive a person of a right to a fair trial or an impartial adjudication; (3... and in the case of a record compiled by a criminal law enforcement authority in the course of a criminal investigation, or by an agency conducting a lawful national security intelligence investigation...

  1. Parallelizing More Loops with Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    2012-01-01

    We present an interactive compilation feedback system that guides programmers in iteratively modifying their application source code. This helps leverage the compiler’s ability to generate loop-parallel code. We employ our system to modify two sequential benchmarks dealing with image processing and edge detection, resulting in scalable parallelized code that runs up to 8.3 times faster on an eight-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems, which suggests that semi-automatic parallelization should be combined with target-specific optimizations. Furthermore, comparing the first benchmark to manually-parallelized, hand-optimized pthreads and OpenMP versions, we find that code generated using our approach typically outperforms the pthreads code (within 93-339%). It also performs competitively against the OpenMP code.

  2. Compilation of Existing Neutron Screen Technology

    Directory of Open Access Journals (Sweden)

    N. Chrysanthopoulou

    2014-01-01

    Full Text Available The presence of fast neutron spectra in new reactors is expected to induce a strong impact on the contained materials, including structural materials, nuclear fuels, neutron reflecting materials, and tritium breeding materials. Therefore, introduction of these reactors into operation will require extensive testing of their components, which must be performed under neutronic conditions representative of those expected to prevail inside the reactor cores when in operation. Due to limited availability of fast reactors, testing of future reactor materials will mostly take place in water cooled material test reactors (MTRs by tailoring the neutron spectrum via neutron screens. The latter rely on the utilization of materials capable of absorbing neutrons at specific energy. A large but fragmented experience is available on that topic. In this work a comprehensive compilation of the existing neutron screen technology is attempted, focusing on neutron screens developed in order to locally enhance the fast over thermal neutron flux ratio in a reactor core.

  3. Compilation of requests for nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    1979-04-01

    The purpose of this compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. At the present time needs fall primarily in the area of neutron cross sections, although no such limitation is intended. This request list is ordered by target nucleus (isotope) and then reaction type (quantity). An attempt is made to describe the quantity in standard notation. A glossary of these symbols with a short explanatory text is provided. All requests must give the energy (or range of energy) for the incident particle when appropriate. The accuracy needed in percent is also given. The error quoted is assumed to be 1 sigma at each measured point in the energy range requested unless a comment specifies otherwise. (RWR)

  4. A Fast, Versatile Nanoprobe for Complex Materials: The Sub-micron Resolution X-ray Spectroscopy Beamline at NSLS-II (491st Brookhaven Lecture)

    Energy Technology Data Exchange (ETDEWEB)

    Thieme, Juergen [BNL Photon Sciences Directorate

    2014-02-06

    Time is money and for scientists who need to collect data at research facilities like Brookhaven Lab’s National Synchrotron Light Source (NSLS), “beamtime” can be a precious commodity. While scanning a complex material with a specific technique and standard equipment today would take days to complete, researchers preparing to use brighter x-rays and the new sub-micron-resolution x-ray spectroscopy (SRX) beamline at the National Synchrotron Light Source II (NSLS-II) could scan the same sample in greater detail with just a few hours of beamtime. Talk about savings and new opportunities for researchers! Users will rely on these tools for locating trace elements in contaminated soils, developing processes for nanoparticles to deliver medical treatments, and much more. Dr. Thieme explains benefits for next-generation research with spectroscopy and more intense x-rays at NSLS-II. He discusses the instrumentation, features, and uses for the new SRX beamline, highlighting its speed, adjustability, and versatility for probing samples ranging in size from millimeters down to the nanoscale. He will talk about complementary beamlines being developed for additional capabilities at NSLS-II as well.

  5. Self-Assembly by Instruction: Designing Nanoscale Systems Using DNA-Based Approaches (474th Brookhaven Lecture)

    Energy Technology Data Exchange (ETDEWEB)

    Gang, Oleg [Center for Functional Nanomaterials

    2012-01-18

    In the field of nanoscience, if you can control how nanoparticles self-assemble in particular structures — joining each other, for example, as molecules can form, atom-by-atom — you can design new materials that have unique properties that industry needs. Nature already uses the DNA genetic code to instruct the building of specific proteins and whole organisms in both plants and people. Taking a cue from nature, scientists at BNL devised a way of using strands of synthetic DNA attached to the surface of nanoparticles to instruct them to self-assemble into specific nanoscale structures, clusters, and three-dimensional organizations. Novel materials designed and fabricated this way promise use in photovoltaics, energy storage, catalysis, cell-targeted systems for more effective medical treatments, and biomolecular sensing for environmental monitoring and medical applications. To find out more about the rapid evolution of this nanoassembly method and its applications, join Physicist Oleg Gang of the Center for Functional Nanomaterials (CFN) as he gives the 474th Brookhaven Lecture, titled “Self-Assembly by Instruction: Designing Nanoscale Systems Using DNA-Based Approaches." Gang, who has led this work at the CFN, will explain the rapid evolution of this nanoassembly method, and discuss its present and future applications in highly specific biosensors, optically active nano-materials, and new ways to fabricate complex architectures in a rational manner via self-assembly. Gang and his colleagues used the CFN and the National Synchrotron Light Source (NSLS) facilities to perform their groundbreaking research. At the CFN, the scientists used electron microscopes and optical methods to visualize the clusters that they fabricated. At the NSLS, they applied x-rays to study a particle-assembly process in solution, DNA’s natural environment. Gang earned a Ph.D. in soft matter physics from Bar-Ilan University in 2000, and he was a Rothschild Fellow at Harvard

  6. Database compilation: hydrology of Lake Tiberias (Jordan Valley)

    Science.gov (United States)

    Shentsis, Izabela; Rosenthal, Eliyahu; Magri, Fabien

    2014-05-01

    A long-term series of water balance data over the last 50 years is compiled to gain insights into the hydrology of Lake Tiberias (LT) and its surrounding aquifers. This database is used within the framework of a German-Israeli-Jordanian project (DFG Ma4450-2) in which numerical modeling is applied to study the mechanisms of deep fluid transport processes affecting the Tiberias basin. The LT is the largest natural freshwater lake in Israel, located in the northern part of the Dead Sea Rift. The behavior of the lake level reflects the regional water balance, driven mainly by the interaction of two factors: (i) fluctuations of water inflow to the lake, and (ii) water exploitation in the adjacent aquifers and consumption from the lake (pumping, diversion, etc.). The replenishment of the lake occurs through drainage from the surrounding mountains (Galilee, Golan Heights), entering the lake through the Jordan River and secondary streams (85%), direct precipitation (11%), fresh-saline springs discharging along the shoreline, diversion from the Yarmouk River, and internal springs and seeps. The major losses occur through the National Water Carrier (ca. 44%), evaporation (38%), and local consumption and compensation to Jordan (in sum 12%). In spite of the increasing role of water exploitation, the natural inflow remains the dominant factor in the hydrological regime of the lake. Additionally, series of natural yield to the LT were reconstructed with precipitation data measured in the Tiberias basin (1922-2012). The earlier period (1877-1921) is evaluated using long rainfall records at the Beirut and Nazareth stations (Middle East Region). These data enable the LT yield to be used as a complex indicator of regional climate change. Though the data apply to the LT, this example shows the importance of large databases. Their compilation defines the correct set-up of joint methodologies such as numerical modeling and hydrochemical analyses aimed to understand large

  7. Compiling a Monolingual Dictionary for Native Speakers

    Directory of Open Access Journals (Sweden)

    Patrick Hanks

    2011-10-01

    Full Text Available

    ABSTRACT: This article gives a survey of the main issues confronting the compilers of monolingual dictionaries in the age of the Internet. Among others, it discusses the relationship between a lexical database and a monolingual dictionary, the role of corpus evidence, historical principles in lexicography vs. synchronic principles, the instability of word meaning, the need for full vocabulary coverage, principles of definition writing, the role of dictionaries in society, and the need for dictionaries to give guidance on matters of disputed word usage. It concludes with some questions about the future of dictionary publishing.

    SUMMARY: Compiling a monolingual dictionary for native speakers. This article gives a survey of the main issues confronting the compilers of monolingual dictionaries in the age of the Internet. Among others, it discusses the relationship between a lexical database and a monolingual dictionary, the role of corpus evidence, historical vs. synchronic principles in lexicography, the instability of word meaning, the need for full vocabulary coverage, principles of definition writing, the role of dictionaries in society, and the need for dictionaries to give guidance on matters of disputed word usage. It concludes with some questions about the future of dictionary publishing.

    Keywords: MONOLINGUAL DICTIONARIES, LEXICAL DATABASE, DICTIONARY STRUCTURE, WORD MEANING, MEANING CHANGE, USAGE, USAGE NOTES, HISTORICAL PRINCIPLES OF LEXICOGRAPHY, SYNCHRONIC PRINCIPLES OF LEXICOGRAPHY, REGISTER, SLANG, STANDARD ENGLISH, VOCABULARY COVERAGE, CONSISTENCY OF COLLECTIONS, PHRASEOLOGY, SYNTAGMATIC PATTERNS, PROBLEMS OF COMPOSITIONALITY, LINGUISTIC PRESCRIPTIVISM, LEXICAL EVIDENCE

  8. Sharing analysis in the Pawns compiler

    Directory of Open Access Journals (Sweden)

    Lee Naish

    2015-09-01

    Full Text Available Pawns is a programming language under development that supports algebraic data types, polymorphism, higher order functions and “pure” declarative programming. It also supports impure imperative features including destructive update of shared data structures via pointers, allowing significantly increased efficiency for some operations. A novelty of Pawns is that all impure “effects” must be made obvious in the source code and they can be safely encapsulated in pure functions in a way that is checked by the compiler. Execution of a pure function can perform destructive updates on data structures that are local to or eventually returned from the function without risking modification of the data structures passed to the function. This paper describes the sharing analysis which allows impurity to be encapsulated. Aspects of the analysis are similar to other published work, but in addition it handles explicit pointers and destructive update, higher order functions including closures and pre- and post-conditions concerning sharing for functions.
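    As a rough illustration of the kind of information a sharing analysis computes, the sketch below tracks which variables may share structure after a sequence of assignments. It is a toy over plain variable aliasing, far simpler than the Pawns analysis, which also handles explicit pointers, destructive update, closures, and pre/postconditions; all names here are hypothetical:

```python
def share_groups(assignments):
    """Toy may-share analysis: after each `lhs = rhs`, the two
    variables may share structure; transitive sharing is tracked
    with a union-find over variable names."""
    parent = {}

    def find(v):
        parent.setdefault(v, v)
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    for lhs, rhs in assignments:
        parent[find(lhs)] = find(rhs)

    groups = {}
    for v in list(parent):
        groups.setdefault(find(v), set()).add(v)
    return sorted(groups.values(), key=len)

# x = y; z = w; y = z  =>  all four variables may share structure
groups = share_groups([("x", "y"), ("z", "w"), ("y", "z")])
```

A compiler can use such groups conservatively: a destructive update through one variable is only safe to encapsulate if nothing outside the function belongs to the same sharing group.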

  9. The Shape and Flow of Heavy Ion Collisions (490th Brookhaven Lecture)

    Energy Technology Data Exchange (ETDEWEB)

    Schenke, Bjoern [BNL Physics Department

    2014-12-18

    The sun can’t do it, but colossal machines like the Relativistic Heavy Ion Collider (RHIC) at Brookhaven Lab and Large Hadron Collider (LHC) in Europe sure can. Quarks and gluons make up protons and neutrons found in the nucleus of every atom in the universe. At heavy ion colliders like RHIC and the LHC, scientists can create matter more than 100,000 times hotter than the center of the sun—so hot that protons and neutrons melt into a plasma of quarks and gluons. The particle collisions and emerging quark-gluon plasma hold keys to understanding how these fundamental particles interact with each other, which helps explain how everything is held together—from atomic nuclei to human beings to the biggest stars—how all matter has mass, and what the universe looked like microseconds after the Big Bang. Dr. Schenke discusses theory that details the shape and structure of heavy ion collisions. He will also explain how this theory and data from experiments at RHIC and the LHC are being used to determine properties of the quark-gluon plasma.

  10. What Goes Up Must Come Down: The Lifecycle of Convective Clouds (492nd Brookhaven Lecture)

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, Michael [BNL Environmental Sciences

    2014-02-19

    Some clouds look like cotton balls and others like anvils. Some bring rain, some snow and sleet, and others, just shade. But, whether big and billowy or dark and stormy, clouds affect far more than the weather each day. Armed with measurements of clouds’ updrafts and downdrafts—which resemble airflow in a convection oven—and many other atmospheric interactions, scientists from Brookhaven Lab and other institutions around the world are developing models that are crucial for understanding Earth’s climate and forecasting future climate change. During his lecture, Dr. Jensen provides an overview of the importance of clouds in the Earth’s climate system before explaining how convective clouds form, grow, and dissipate. His discussion includes findings from the Midlatitude Continental Convective Clouds Experiment (MC3E), a major collaborative experiment between U.S. Department of Energy (DOE) and NASA scientists to document precipitation, clouds, winds, and moisture in 3-D for a holistic view of convective clouds and their environment.

  11. A Really Good Hammer: Quantification of Mass Transfer Using Perfluorocarbon Tracers (475th Brookhaven Lecture)

    Energy Technology Data Exchange (ETDEWEB)

    Watson, Tom [BNL Environmental Sciences, Tracer Technology Group

    2012-02-15

Brookhaven Lab’s perfluorocarbon tracer (PFT) technology can be viewed as a hammer looking for nails. But, according to Tom Watson, leader of the Lab’s Tracer Technology Group in the Environmental Research and Technology Division (ERTD), “It’s a really good hammer!” The colorless, odorless and safe gases have a number of research uses, from modeling how airborne contaminants might move through urban canyons (helping first responders plan their response to potential terrorist attacks and accidents) to locating leaks in underground gas pipes. Their extremely low background level — detectable at one part per quadrillion — allows their transport to be easily tracked. Lab researchers used PFTs during the 2005 Urban Dispersion Program field studies in New York City, gathering data to help improve models of how a gas or chemical release might move around Manhattan’s tall buildings and canyons. Closer to home, scientists also used PFTs to make ventilation measurements in Bldg. 400 on the Lab site, providing data to test air flow models used in determining the effects of passive and active air exchange on the levels of indoor and outdoor air pollution, and to determine the effects of an accidental or intentional release of hazardous substances in or around buildings.

  12. Reliable operation of the Brookhaven EBIS for highly charged ion production for RHIC and NSRL

    Energy Technology Data Exchange (ETDEWEB)

    Beebe, E., E-mail: beebe@bnl.gov; Alessi, J., E-mail: beebe@bnl.gov; Binello, S., E-mail: beebe@bnl.gov; Kanesue, T., E-mail: beebe@bnl.gov; McCafferty, D., E-mail: beebe@bnl.gov; Morris, J., E-mail: beebe@bnl.gov; Okamura, M., E-mail: beebe@bnl.gov; Pikin, A., E-mail: beebe@bnl.gov; Ritter, J., E-mail: beebe@bnl.gov; Schoepfer, R., E-mail: beebe@bnl.gov [Brookhaven National Laboratory, Upton, NY 11973 (United States)

    2015-01-09

An Electron Beam Ion Source for the Relativistic Heavy Ion Collider (RHIC EBIS) was commissioned at Brookhaven in September 2010, and since then it has routinely supplied ions for RHIC and the NASA Space Radiation Laboratory (NSRL) as the main source of highly charged ions from helium to uranium. Using three external primary ion sources for 1+ injection into the EBIS and an electrostatic injection beam line, ion species at the EBIS exit can be switched in 0.2 s. A total of 16 different ion species have been produced to date. The length and the capacity of the ion trap have been increased by 20% by extending the trap by two more drift tubes, compared with the original design. The fraction of Au{sup 32+} in the EBIS Au spectrum is approximately 12% for 70-80% electron beam neutralization and 8-pulse operation in a 5 Hz train and 4-5 s super cycle. For single-pulse-per-super-cycle operation and 25% electron beam neutralization, the EBIS achieves the theoretical Au{sup 32+} fractional output of 18%. Long-term stability has been very good, with beam availability from the RHIC EBIS during the 2012 and 2014 RHIC runs of approximately 99.8%.

  13. BESTIA (Brookhaven Experimental Supra-Terawatt Infrared at ATF) laser: A status report

    Science.gov (United States)

    Polyanskiy, Mikhail N.; Babzien, Marcus; Pogorelsky, Igor V.

    2017-03-01

Development of a next-generation CO2 laser aiming at 100 TW peak power at a wavelength of 10 µm is underway at the Brookhaven Accelerator Test Facility (ATF). A new laser facility is being deployed as part of the ATF-II upgrade. New high-pressure power amplifiers are being fabricated and assembled, while R&D continues with ATF's present 2 TW CO2 laser system. Our plan for increasing the peak laser power envisions several discrete steps in the upgrade. First will be demonstration of a 10 TW capability utilizing chirped pulse amplification, with an extended power amplifier chain filled with high-pressure isotopic gas. Further development aimed at a demonstration of 25 TW operation will require the addition of a nonlinear compressor system to shrink the pulse width below the nominal gain-bandwidth limit. These upgrades will then enable a longer-term R&D effort to achieve the 100 TW goal. Over the last two years, significant R&D effort has been focused on the development of chirped-pulse amplification, the study of the behavior of optical materials under the action of high-peak-power mid-IR pulses, and the optimization of the beam quality, which is required for nonlinear pulse compression. The results of this R&D have been implemented into the ongoing operation of the ATF's CO2 laser and have already benefited our users in their experimental programs.

  14. Archive Compiles New Resource for Global Tropical Cyclone Research

    Science.gov (United States)

    Knapp, Kenneth R.; Kruk, Michael C.; Levinson, David H.; Gibney, Ethan J.

    2009-02-01

    The International Best Track Archive for Climate Stewardship (IBTrACS) compiles tropical cyclone best track data from 11 tropical cyclone forecast centers around the globe, producing a unified global best track data set (M. C. Kruk et al., A technique for merging global tropical cyclone best track data, submitted to Journal of Atmospheric and Oceanic Technology, 2008). Best track data (so called because the data generally refer to the best estimate of a storm's characteristics) include the position, maximum sustained winds, and minimum central pressure of a tropical cyclone at 6-hour intervals. Despite the significant impact of tropical cyclones on society and natural systems, there had been no central repository maintained for global best track data prior to the development of IBTrACS in 2008. The data set, which builds upon the efforts of the international tropical forecasting community, has become the most comprehensive global best track data set publicly available. IBTrACS was created by the U.S. National Oceanic and Atmospheric Administration's National Climatic Data Center (NOAA NCDC) under the auspices of the World Data Center for Meteorology.

  15. Temporal Planning for Compilation of Quantum Approximate Optimization Algorithm Circuits

    Science.gov (United States)

    Venturelli, Davide; Do, Minh Binh; Rieffel, Eleanor Gilbert; Frank, Jeremy David

    2017-01-01

    We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus our initial experiments on Quantum Approximate Optimization Algorithm (QAOA) circuits that have few ordering constraints and allow highly parallel plans. We report on experiments using several temporal planners to compile circuits of various sizes to a realistic hardware. This early empirical evaluation suggests that temporal planning is a viable approach to quantum circuit compilation.
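
    The core combinatorial problem behind such circuit compilation can be illustrated with a toy sketch (hypothetical code, not the planners or hardware model used in the paper): gates acting on disjoint qubits may be packed into the same parallel time slot, and few ordering constraints mean many valid packings.

```python
def schedule_gates(gates):
    """Greedily pack two-qubit gates into parallel time slots.

    Each gate is a pair of qubit indices; gates sharing a slot must
    act on disjoint qubits. A toy stand-in for the scheduling that
    temporal planners perform when compiling highly parallel QAOA
    circuits to hardware.
    """
    slots = []
    for q1, q2 in gates:
        for slot in slots:
            # The gate fits if no gate in this slot touches q1 or q2.
            if all(q1 not in g and q2 not in g for g in slot):
                slot.append((q1, q2))
                break
        else:
            slots.append([(q1, q2)])
    return slots

# Four gates on six qubits compress into two parallel time slots.
slots = schedule_gates([(0, 1), (2, 3), (1, 2), (4, 5)])
```

    A temporal planner additionally accounts for gate durations and qubit connectivity, but the conflict structure it reasons over is the same.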

  16. Automatic Parallelization An Overview of Fundamental Compiler Techniques

    CERN Document Server

    Midkiff, Samuel P

    2012-01-01

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and
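
    The dependence analysis mentioned above decides whether loop iterations may run in parallel. A minimal sketch (hypothetical helper, not from the book) checks whether any iteration reads an array location written by an earlier iteration, i.e. a loop-carried flow dependence:

```python
def loop_carried_dependence(write_idx, read_idx, n=8):
    """Detect a loop-carried flow dependence for a loop of the form
    'for i: a[write_idx(i)] = ... a[read_idx(i)] ...' by simulating
    which iteration last wrote each location. A toy stand-in for
    the symbolic dependence tests a parallelizing compiler uses.
    """
    written = {}  # array index -> iteration that wrote it
    for i in range(n):
        src = written.get(read_idx(i))
        if src is not None and src < i:
            return True  # iteration i reads an earlier iteration's write
        written[write_idx(i)] = i
    return False

# a[i] = a[i-1] + 1 carries a dependence; a[i] = b[i+100] does not.
dep1 = loop_carried_dependence(lambda i: i, lambda i: i - 1)
dep2 = loop_carried_dependence(lambda i: i, lambda i: i + 100)
```

    Real compilers answer this question symbolically (e.g. with GCD or Banerjee tests) rather than by simulation, since loop bounds are often unknown at compile time.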

  17. DisBlue+: A distributed annotation-based C# compiler

    Directory of Open Access Journals (Sweden)

    Samir E. AbdelRahman

    2010-06-01

Full Text Available Many programming languages utilize annotations to add useful information to the program, but they still result in more tokens to be compiled and hence slower compilation time. Any current distributed compiler breaks the program into scattered disjoint pieces to speed up the compilation. However, these pieces cooperate synchronously and depend highly on each other. This causes massive overhead since messages, symbols, or codes must be passed throughout the network. This paper presents two promising compilers named annotation-based C# (Blue+) and distributed annotation-based C# (DisBlue+). The proposed Blue+ annotation is based on axiomatic semantics to replace the if/loop constructs. As the developer tends to use many (complex) conditions and repeat them in the program, such annotations reduce the compilation scanning time and increase the whole code readability. Built on top of Blue+, DisBlue+ presents its proposed distributed concept, which is to divide each program class into its prototype and definition, as disjoint distributed pieces, such that each class definition is compiled with only its related compiled prototypes (interfaces). Such a concept reduces the amount of code transferred over the network, minimizes the dependencies among the disjoint pieces, and removes any possible synchronization between them. To test their efficiencies, Blue+ and DisBlue+ were verified with large-size codes against some existing compilers, namely Javac, DJavac, and CDjava.

  18. Compiler Construction Using Java, JavaCC, and Yacc

    CERN Document Server

    Dos Reis, Anthony J

    2012-01-01

    Broad in scope, involving theory, the application of that theory, and programming technology, compiler construction is a moving target, with constant advances in compiler technology taking place. Today, a renewed focus on do-it-yourself programming makes a quality textbook on compilers, that both students and instructors will enjoy using, of even more vital importance. This book covers every topic essential to learning compilers from the ground up and is accompanied by a powerful and flexible software package for evaluating projects, as well as several tutorials, well-defined projects, and tes

  19. Compilation of data for radionuclide transport analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-11-01

This report is one of the supporting documents to the updated safety assessment (project SAFE) of the Swedish repository for low and intermediate level waste, SFR 1. A number of calculation cases for quantitative analysis of radionuclide release and dose to man are defined based on the expected evolution of the repository, geosphere and biosphere in the Base Scenario and other scenarios selected. The data required by the selected near field, geosphere and biosphere models are given, and the values selected for the calculations are compiled in tables. The main sources for the selected values of the migration parameters in the repository and geosphere models are the safety assessment of a deep repository for spent fuel, SR 97, and the preliminary safety assessment of a repository for long-lived, low- and intermediate level waste, SFL 3-5. For the biosphere models, both site-specific data and generic values of the parameters are selected. The applicability of the selected parameter values is discussed, and the uncertainty is qualitatively addressed for the data used in the repository and geosphere migration models. Parameter values selected for these models are in general pessimistic in order not to underestimate the radionuclide release rates. It is judged that this approach, combined with the selected calculation cases, will illustrate the effects of uncertainties in processes and events that affect the evolution of the system, as well as in the quantitative data that describe it. The biosphere model allows for probabilistic calculations, and the uncertainty in input data is quantified by giving minimum, maximum and mean values as well as the type of probability distribution function.

  20. BNL National Synchrotron Light Source activity report 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-05-01

    During FY 1997 Brookhaven National Laboratory celebrated its 50th Anniversary and 50 years of outstanding achievement under the management of Associated Universities, Inc. This progress report is divided into the following sections: (1) introduction; (2) science highlights; (3) meetings and workshops; (4) operations; (5) projects; (6) organization; and (7) abstracts and publications.

  1. Global compilation of marine varve records

    Science.gov (United States)

    Schimmelmann, Arndt; Lange, Carina B.; Schieber, Juergen; Francus, Pierre; Ojala, Antti E. K.; Zolitschka, Bernd

    2017-04-01

    Marine varves contain highly resolved records of geochemical and other paleoceanographic and paleoenvironmental proxies with annual to seasonal resolution. We present a global compilation of marine varved sedimentary records throughout the Holocene and Quaternary covering more than 50 sites worldwide. Marine varve deposition and preservation typically depend on environmental and sedimentological conditions, such as a sufficiently high sedimentation rate, severe depletion of dissolved oxygen in bottom water to exclude bioturbation by macrobenthos, and a seasonally varying sedimentary input to yield a recognizable rhythmic varve pattern. Additional oceanographic factors may include the strength and depth range of the Oxygen Minimum Zone (OMZ) and regional anthropogenic eutrophication. Modern to Quaternary marine varves are not only found in those parts of the open ocean that comply with these conditions, but also in fjords, embayments and estuaries with thermohaline density stratification, and nearshore 'marine lakes' with strong hydrologic connections to ocean water. Marine varves have also been postulated in pre-Quaternary rocks. In the case of non-evaporitic laminations in fine-grained ancient marine rocks, such as banded iron formations and black shales, laminations may not be varves but instead may have multiple alternative origins such as event beds or formation via bottom currents that transported and sorted silt-sized particles, clay floccules, and organic-mineral aggregates in the form of migrating bedload ripples. Modern marine ecosystems on continental shelves and slopes, in coastal zones and in estuaries are susceptible to stress by anthropogenic pressures, for example in the form of eutrophication, enhanced OMZs, and expanding ranges of oxygen-depletion in bottom waters. 
Sensitive laminated sites may play the important role of a 'canary in the coal mine' where monitoring the character and geographical extent of laminations/varves serves as a diagnostic

  2. Not Mere Lexicographic Cosmetics: The Compilation and Structural ...

    African Journals Online (AJOL)

    Riette Ruthven

    structure of the ISM is not a case of mere cosmetics but a lexicographic mode of communication between the dictionary compilers and .... after the compilation of the ISM pertains to the attitude of the users towards other languages. The attitude of Ndebele speakers towards other languages in. Zimbabwe are discussed by ...

  3. 12 CFR 203.4 - Compilation of loan data.

    Science.gov (United States)

    2010-01-01

    ... may but need not be collected for loans purchased by the financial institution. (c) Optional data. A... 12 Banks and Banking 2 2010-01-01 2010-01-01 false Compilation of loan data. 203.4 Section 203.4... DISCLOSURE (REGULATION C) § 203.4 Compilation of loan data. (a) Data format and itemization. A financial...

  4. Production compilation : A simple mechanism to model complex skill acquisition

    NARCIS (Netherlands)

    Taatgen, N.A.; Lee, F.J.

    2003-01-01

    In this article we describe production compilation, a mechanism for modeling skill acquisition. Production compilation has been developed within the ACT-Rational (ACT-R; J. R. Anderson, D. Bothell, M. D. Byrne, & C. Lebiere, 2002) cognitive architecture and consists of combining and specializing
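
    The combining-and-specializing step can be sketched in miniature (a hypothetical simplification; real ACT-R production compilation also handles buffers and declarative retrievals): two rules that fire in sequence are merged into one rule whose conditions omit whatever the first rule's actions already establish.

```python
def compile_productions(rule1, rule2):
    """Merge two sequentially firing production rules into one,
    in the spirit of ACT-R's production compilation (toy sketch).
    A rule is a (conditions, actions) pair of sets of symbolic facts.
    """
    cond1, act1 = rule1
    cond2, act2 = rule2
    # Conditions of rule2 already produced by rule1's actions are
    # specialized away; the merged rule skips the intermediate state.
    merged_cond = cond1 | (cond2 - act1)
    merged_act = act1 | act2
    return merged_cond, merged_act

# "goal seen -> set subgoal" followed by "subgoal set -> press key"
# compiles into a single rule firing straight from goal to keypress.
r = compile_productions(({"goal"}, {"subgoal"}), ({"subgoal"}, {"key"}))
```

    Repeated application of this merge is what turns a slow, multi-step novice strategy into a fast single-step skilled one.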

  5. Pick'n'Fix: Capturing Control Flow in Modular Compilers

    DEFF Research Database (Denmark)

    Day, Laurence E.; Bahr, Patrick

    2014-01-01

    We present a modular framework for implementing languages with effects and control structures such as loops and conditionals. This framework enables modular definitions of both syntax and semantics as well as modular implementations of compilers and virtual machines. In order to compile control s...

  6. Compilation of floristic and herbarium specimen data in Iran: proposal to data structure

    Directory of Open Access Journals (Sweden)

    Majid Sharifi-Tehrani

    2013-09-01

Full Text Available Floristic databases constitute the second level of plant information systems, after taxonomic-nomenclatural databases. This paper provides the details of the data structure and the available data resources needed to develop a floristic database, along with some explanations of taxonomic and floristic databases. It also proposes a shortcut to constructing a national floristic database through the unification and compilation of the dispersed floristic data held in various botanical centers of Iran. Iran could thereby become the second country in the SW Asia region to have a national floristic database, and the resulting services could be presented to the national scientific community.

  7. Compilation of kinetic data for geochemical calculations

    Energy Technology Data Exchange (ETDEWEB)

    Arthur, R.C. [Monitor Scientific, LLC., Denver, Colorado (United States); Savage, D. [Quintessa, Ltd., Nottingham (United Kingdom); Sasamoto, Hiroshi; Shibata, Masahiro; Yui, Mikazu [Japan Nuclear Cycle Development Inst., Tokai, Ibaraki (Japan). Tokai Works

    2000-01-01

Kinetic data, including rate constants, reaction orders and activation energies, are compiled for 34 hydrolysis reactions involving feldspars, sheet silicates, zeolites, oxides, pyroxenes and amphiboles, and for similar reactions involving calcite and pyrite. The data are compatible with a rate law consistent with surface reaction control and transition-state theory, which is incorporated in the geochemical software packages EQ3/6 and GWB. Kinetic data for the reactions noted above are strictly compatible with the transition-state rate law only under far-from-equilibrium conditions. It is possible that the data are conceptually consistent with this rate law under both far-from-equilibrium and near-to-equilibrium conditions, but this should be confirmed whenever possible through analysis of original experimental results. Due to limitations in the availability of kinetic data for mineral-water reactions, and in order to simplify evaluations of geochemical models of groundwater evolution, it is convenient to assume local equilibrium in such models whenever possible. To assess whether this assumption is reasonable, a modeling approach accounting for coupled fluid flow and water-rock interaction is described that can be used to estimate the spatial and temporal scales of local equilibrium. The approach is demonstrated for conditions involving groundwater flow in fractures at JNC's Kamaishi in-situ test site, and is also used to estimate the travel time necessary for oxidizing surface waters to migrate to the level of a HLW repository in crystalline rock. The question of whether local equilibrium is a reasonable assumption must be addressed using an appropriate modeling approach. To be appropriate for conditions at the Kamaishi site using the modeling approach noted above, the fracture fill must closely approximate a porous medium, groundwater flow must be purely advective, and diffusion of solutes across the fracture-host rock boundary must not occur. Moreover, the
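
    The transition-state rate law referred to above can be written, in its simplest form, as r = k(1 - Q/K), where Q is the ion activity product and K the equilibrium constant. A minimal numerical sketch (a generic simplification; EQ3/6 and GWB use more elaborate forms with surface area and pH dependence):

```python
def tst_rate(k, Q, K):
    """Simplest transition-state-theory rate law, r = k * (1 - Q/K).

    k: far-from-equilibrium rate constant (e.g. mol m^-2 s^-1)
    Q: ion activity product of the hydrolysis reaction
    K: equilibrium constant
    Generic sketch only; not the full rate expression in EQ3/6 or GWB.
    """
    return k * (1.0 - Q / K)

# Far from equilibrium (Q << K) the rate approaches k;
# at equilibrium (Q == K) the net rate vanishes.
far = tst_rate(1e-9, 1e-6, 1.0)
eq = tst_rate(1e-9, 1.0, 1.0)
```

    This makes the report's caveat concrete: compiled rate constants constrain the law only in the Q << K regime, so near-equilibrium behavior must be checked against the original experiments.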

  8. 1989 OCRWM [Office of Civilian Radioactive Waste Management] Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-02-01

The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1989 calendar year. A table of contents and one index have been provided to assist in finding information contained in this year's Bulletins. The pages have been numbered consecutively at the bottom for easy reference. 7 figs.

  9. Data compilations for primary production, herbivory, decomposition, and export for different types of marine communities, 1962-2002 (NCEI Accession 0054500)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset is a compilation of published data on primary production, herbivory, and nutrient content of primary producers in pristine communities of...

  10. Compilation of PRF Canyon Floor Pan Sample Analysis Results

    Energy Technology Data Exchange (ETDEWEB)

    Pool, Karl N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Minette, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wahl, Jon H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Greenwood, Lawrence R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coffey, Deborah S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McNamara, Bruce K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bryan, Samuel A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Scheele, Randall D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Delegard, Calvin H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sinkov, Sergey I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Soderquist, Chuck Z. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fiskum, Sandra K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brown, Garrett N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clark, Richard A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-06-30

    On September 28, 2015, debris collected from the PRF (236-Z) canyon floor, Pan J, was observed to exhibit chemical reaction. The material had been transferred from the floor pan to a collection tray inside the canyon the previous Friday. Work in the canyon was stopped to allow Industrial Hygiene to perform monitoring of the material reaction. Canyon floor debris that had been sealed out was sequestered at the facility, a recovery plan was developed, and drum inspections were initiated to verify no additional reactions had occurred. On October 13, in-process drums containing other Pan J material were inspected and showed some indication of chemical reaction, limited to discoloration and degradation of inner plastic bags. All Pan J material was sealed back into the canyon and returned to collection trays. Based on the high airborne levels in the canyon during physical debris removal, ETGS (Encapsulation Technology Glycerin Solution) was used as a fogging/lock-down agent. On October 15, subject matter experts confirmed a reaction had occurred between nitrates (both Plutonium Nitrate and Aluminum Nitrate Nonahydrate (ANN) are present) in the Pan J material and the ETGS fixative used to lower airborne radioactivity levels during debris removal. Management stopped the use of fogging/lock-down agents containing glycerin on bulk materials, declared a Management Concern, and initiated the Potential Inadequacy in the Safety Analysis determination process. Additional drum inspections and laboratory analysis of both reacted and unreacted material are planned. This report compiles the results of many different sample analyses conducted by the Pacific Northwest National Laboratory on samples collected from the Plutonium Reclamation Facility (PRF) floor pans by the CH2MHill’s Plateau Remediation Company (CHPRC). Revision 1 added Appendix G that reports the results of the Gas Generation Rate and methodology. The scope of analyses requested by CHPRC includes the determination of

  11. Compiler Optimization: A Case for the Transformation Tool Contest

    Directory of Open Access Journals (Sweden)

    Sebastian Buchwald

    2011-11-01

Full Text Available An optimizing compiler consists of a front end parsing a textual programming language into an intermediate representation (IR), a middle end performing optimizations on the IR, and a back end lowering the IR to a target representation (TR) built of operations supported by the target hardware. In modern compiler construction graph-based IRs are employed. Optimization and lowering tasks can then be implemented with graph transformation rules. This case provides two compiler tasks to evaluate the participating tools regarding performance.
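
    A middle-end optimization expressed as a rewrite over the IR can be shown in miniature (a hypothetical toy IR, not the graph formalism of the contest): constant folding replaces an operation node whose operands are all literals with a single literal node.

```python
def fold(node):
    """Constant-fold a tiny expression IR.

    A node is either an int literal, a variable name (str), or a
    tuple ('+', left, right). Folding is the classic middle-end
    rewrite: an addition of two literals becomes one literal.
    """
    if isinstance(node, (int, str)):
        return node
    op, l, r = node
    l, r = fold(l), fold(r)
    if isinstance(l, int) and isinstance(r, int):
        return l + r  # rewrite the '+' node to a literal
    return (op, l, r)

# (1 + 2) + x folds the constant subtree but keeps x symbolic.
folded = fold(('+', ('+', 1, 2), 'x'))
```

    Graph-transformation tools express the same rewrite declaratively as a pattern (an add node with two constant children) plus a replacement (one constant node), which is what the contest benchmarks.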

  12. HAL/S-FC compiler system functional specification

    Science.gov (United States)

    1974-01-01

    The functional requirements to be met by the HAL/S-FC compiler, and the hardware and software compatibilities between the compiler system and the environment in which it operates are defined. Associated runtime facilities and the interface with the Software Development Laboratory are specified. The construction of the HAL/S-FC system as functionally separate units and the interfaces between those units is described. An overview of the system's capabilities is presented and the hardware/operating system requirements are specified. The computer-dependent aspects of the HAL/S-FC are also specified. Compiler directives are included.

  13. Writing Compilers and Interpreters A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

Long-awaited revision to a unique guide that covers both compilers and interpreters. Revised, updated, and now focusing on Java instead of C++, this long-awaited, latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  14. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  15. Calculating certified compilers for non-deterministic languages

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2015-01-01

    Reasoning about programming languages with nondeterministic semantics entails many difficulties. For instance, to prove correctness of a compiler for such a language, one typically has to split the correctness property into a soundness and a completeness part, and then prove these two parts...... be used to formally derive – from the semantics of the source language – a compiler that is correct by construction. For such a derivation to succeed it is crucial that the underlying correctness argument proceeds as a single calculation, as opposed to separate calculations of the two directions...... of the correctness property. We demonstrate our technique by deriving a compiler for a simple language with interrupts....

  16. Statistical compilation of NAPAP chemical erosion observations

    Science.gov (United States)

    Mossotti, Victor G.; Eldeeb, A. Raouf; Reddy, Michael M.; Fries, Terry L.; Coombs, Mary Jane; Schmiermund, Ron L.; Sherwood, Susan I.

    2001-01-01

In the mid 1980s, the National Acid Precipitation Assessment Program (NAPAP), in cooperation with the National Park Service (NPS) and the U.S. Geological Survey (USGS), initiated a Materials Research Program (MRP) that included a series of field and laboratory studies with the broad objective of providing scientific information on acid rain effects on calcareous building stone. Among the several effects investigated, the chemical dissolution of limestone and marble by rainfall was given particular attention because of the pervasive appearance of erosion effects on cultural materials situated outdoors. In order to track the chemical erosion of stone objects in the field and in the laboratory, the Ca2+ ion concentration was monitored in the runoff solution from a variety of test objects located both outdoors and under more controlled conditions in the laboratory. This report provides a graphical and statistical overview of the Ca2+ chemistry in the runoff solutions from (1) five urban and rural sites (DC, NY, NJ, NC, and OH) established by the MRP for materials studies over the period 1984 to 1989, (2) a subevent study at the New York MRP site, (3) an in situ study of limestone and marble monuments at Gettysburg, (4) laboratory experiments on calcite dissolution conducted by Baedecker, (5) laboratory simulations by Schmiermund, and (6) a laboratory investigation of the surface reactivity of calcareous stone conducted by Fries and Mossotti. The graphical representations provided a means for identifying erroneous data that can randomly appear in a database when field operations are semi-automated; a purged database suitable for the evaluation of quantitative models of stone erosion is appended to this report. An analysis of the sources of statistical variability in the data revealed that the rate of stone erosion is weakly dependent on the type of calcareous stone, the ambient temperature, and the H+ concentration delivered in the incident rain. The analysis also showed

  17. Solidify, An LLVM pass to compile LLVM IR into Solidity

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-12

The software currently compiles LLVM IR into Solidity (Ethereum’s dominant programming language) using LLVM’s pass library. Specifically, this compiler allows us to convert an arbitrary DSL into Solidity. We focus specifically on converting Domain Specific Languages into Solidity due to their ease of use and provable properties. By creating a toolchain to compile lightweight domain-specific languages into Ethereum's dominant language, Solidity, we allow non-specialists to effectively develop safe and useful smart contracts. For example, lawyers from a certain firm can have a proprietary DSL that codifies basic laws safely converted to Solidity to be securely executed on the blockchain. In another example, a simple provenance tracking language can be compiled and securely executed on the blockchain.
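
    The lowering step can be illustrated with a toy translator (entirely hypothetical; the real tool is an LLVM pass over LLVM IR, not this ad-hoc AST): a tiny arithmetic expression is emitted as the body of a Solidity function.

```python
def to_solidity(name, expr):
    """Emit a Solidity function from a tiny arithmetic AST.

    An AST node is either a variable name (str) or a tuple
    (op, left, right). A toy stand-in for the IR-to-Solidity
    lowering that the Solidify pass performs.
    """
    def gen(e):
        if isinstance(e, str):
            return e
        op, l, r = e
        return f"({gen(l)} {op} {gen(r)})"
    return (f"function {name}(uint a, uint b) public pure returns (uint) "
            f"{{ return {gen(expr)}; }}")

# a + b*b lowered to a pure Solidity function.
src = to_solidity("addmul", ('+', 'a', ('*', 'b', 'b')))
```

    The point of the real toolchain is the same shape of translation, but starting from verified LLVM IR so that the DSL's provable properties survive into the emitted contract.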

  18. Specification and Compilation of Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Geuns, S.J.

    2015-01-01

    This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. An example of such applications are software defined radio applications. These applications typically

  19. Digital compilation bedrock geologic map of the Warren quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-4A Walsh, GJ, Haydock, S, Prewitt, J, Kraus, J, Lapp, E, O'Loughlin, S, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the...

  20. Digital compilation bedrock geologic map of the Milton quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-8A Dorsey, R, Doolan, B, Agnew, PC, Carter, CM, Rosencrantz, EJ, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the Milton...

  1. Fusing a Transformation Language with an Open Compiler

    NARCIS (Netherlands)

    Kalleberg, K.T.; Visser, E.

    2007-01-01

    Program transformation systems provide powerful analysis and transformation frameworks as well as concise languages for language processing, but instantiating them for every subject language is an arduous task, most often resulting in half-completed frontends. Compilers provide mature frontends with

  2. Thoughts and views on the compilation of monolingual dictionaries ...

    African Journals Online (AJOL)

    art developments in the theory and practice of lexicography is necessary. The focus of the African languages should be directed onto the compilation of monolingual dictionaries. It is important that these monolingual dictionaries should be ...

  3. Compilation and Synthesis for Fault-Tolerant Digital Microfluidic Biochips

    DEFF Research Database (Denmark)

    Alistar, Mirela

    of electrodes to perform operations such as dispensing, transport, mixing, split, dilution and detection. Researchers have proposed compilation approaches, which, starting from a biochemical application and a biochip architecture, determine the allocation, resource binding, scheduling, placement and routing...

  4. The Health Impact Assessment (HIA) Resource and Tool Compilation

    Science.gov (United States)

    The compilation includes tools and resources related to the HIA process and can be used to collect and analyze data, establish a baseline profile, assess potential health impacts, and establish benchmarks and indicators for monitoring and evaluation.

  5. Availability of Ada and C++ Compilers, Tools, Education and Training

    Science.gov (United States)

    1991-07-01

    Compilers, Ltd. Membership: Observer Name: Eileen Baxter Address: Edinburgh Portable Compilers, Ltd. 17 Alva Street Edinburgh EH2 4PH United Kingdom Tel: 031-225 6262 Fax: 031-225 6644 Email: Affiliation: Edison Design Group Membership: Observer Name: J. Stephen Adamczyk Address: Edison Design Group 4... Edison Design Group 8 Orchard View Drive Wilton, NH 03086 Tel: (603) 654-5047 Fax: (603) 654-5581 Email: rma@edg.com Affiliation: EDP

  6. Approximate Compilation of Constraints into Multivalued Decision Diagrams

    DEFF Research Database (Denmark)

    Hadzic, Tarik; Hooker, John N.; O’Sullivan, Barry

    2008-01-01

    We present an incremental refinement algorithm for approximate compilation of constraint satisfaction models into multivalued decision diagrams (MDDs). The algorithm uses a vertex splitting operation that relies on the detection of equivalent paths in the MDD. Although the algorithm is quite...... by replacing the equivalence test with a constraint-specific measure of distance. We demonstrate the value of the approach for approximate and exact MDD compilation and evaluate its benefits in one of the main MDD application domains, interactive configuration....
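
The state merging that keeps a compiled MDD compact can be sketched on a single hypothetical constraint (x1 + x2 + x3 ≤ 2 over binary domains). The exact compilation below merges partial assignments with equal running sums into one node per layer, the counterpart of the splitting/merging refinement discussed in the abstract; the example constraint and `compile_mdd` helper are invented for illustration.

```python
# Sketch: compiling the constraint x1 + x2 + x3 <= 2 (binary domains)
# into a layered decision diagram. Each node is a reachable "running sum"
# state; equal sums merge, so layer widths stay small.

def compile_mdd(n_vars, bound):
    layers = [{0: None}]                 # root: sum 0 before any variable
    for _ in range(n_vars):
        nxt = {}
        for state in layers[-1]:
            for val in (0, 1):           # binary domain for each variable
                s = state + val
                if s <= bound:           # prune infeasible partial paths
                    nxt[s] = None        # equivalent states merge here
        layers.append(nxt)
    return layers

layers = compile_mdd(3, 2)
widths = [len(layer) for layer in layers]
print(widths)  # layer widths grow slowly because equivalent states merge
```

Approximate compilation in the spirit of the abstract would additionally cap the width of each layer, merging states that are merely *close* rather than exactly equivalent.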

  7. National transportation statistics 2011

    Science.gov (United States)

    2011-04-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...

  8. Interactive radiopharmaceutical facility between Yale Medical Center and Brookhaven National Laboratory. Progress report, November 1, 1980-October 31, 1981

    Energy Technology Data Exchange (ETDEWEB)

    Gottschalk, A

    1981-10-31

    Research conducted in three principal areas is discussed and summarized. (1) Investigation of the influence of antiarrhythmic agents, such as lidocaine and procainamide, on the chemotaxis and nylon fiber adherence of indium-111-labelled human polymorphonuclear leukocytes (PMNs) in vitro revealed that at normal therapeutic levels of lidocaine and procainamide, the adherence and chemotactic function of In-111-PMNs remain unaltered. Results with higher therapeutic blood levels are also discussed. (2) An improved method for labeling human platelets with In-111-oxine is outlined, and the influence of centrifugal force, oxine, ethanol, and radiation on platelet function is reported. Results indicate that normal labeling procedures induce no gross changes in platelet function. (3) The chemical preparation of radioiodinated arachidonic acid (AA) and nonradioactive acid ester of AA, and the analysis of metabolites of these compounds following myocardial ischemia were investigated in dogs. The tissue uptake of ¹³¹I-AA was compared to that of thallium-201.

  9. Simulation prediction and experiment setup of vacuum laser acceleration at Brookhaven National Lab-Accelerator Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Shao, L., E-mail: leishao@ucla.edu [UCLA, Los Angeles, CA 90095 (United States); Cline, D.; Ding, X. [UCLA, Los Angeles, CA 90095 (United States); Ho, Y.K.; Kong, Q.; Xu, J.J. [Fudan University, Shanghai 200433 (China); Pogorelsky, I.; Yakimenko, V.; Kusche, K. [BNL-ATF, Upton, NY 11973 (United States)

    2013-02-11

    This paper presents the pre-experiment plan and predictions for the first stage of vacuum laser acceleration (VLA), a collaboration of UCLA, Fudan University and BNL-ATF. This first-stage experiment is a proof of principle in support of our previously proposed VLA theory. Simulations show that, under ATF's current experimental conditions, an electron beam with an initial energy of 15 MeV can gain net energy from an intense CO₂ laser beam. The difference in electron beam energy spread is observable by the ATF beam line diagnostics system. Further, this energy-spread expansion effect increases with laser intensity. The proposal has been approved by the ATF committee and the experiment will be our next project.

  10. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP, HADRON STRUCTURE FROM LATTICE QCD, MARCH 18 - 22, 2002, BROOKHAVEN NATIONAL LABORATORY.

    Energy Technology Data Exchange (ETDEWEB)

    BLUM, T.; BOER, D.; CREUTZ, M.; OHTA, S.; ORGINOS, K.

    2002-03-18

    The RIKEN BNL Research Center workshop on ''Hadron Structure from Lattice QCD'' was held at BNL during March 11-15, 2002. Hadron structure has been the subject of many theoretical and experimental investigations, with significant success in understanding the building blocks of matter. The nonperturbative nature of QCD, however, has always been an obstacle to deepening our understanding of hadronic physics. Lattice QCD provides the tool to overcome these difficulties and hence a link can be established between the fundamental theory of QCD and hadron phenomenology. Due to the steady progress in improving lattice calculations over the years, comparison with experimentally measured hadronic quantities has become important. In this respect the workshop was especially timely. By providing an opportunity for experts from the lattice and hadron structure communities to present their latest results, the workshop enhanced the exchange of knowledge and ideas. With a total of 32 registered participants and 26 talks, the interest of a growing community is clearly exemplified. At the workshop Schierholz and Negele presented the current status of lattice computations of hadron structure. Substantial progress has been made during recent years now that the quenched results are well under control and the first dynamical results have appeared. In both the dynamical and the quenched simulations the lattice results, extrapolated to lighter quark masses, seem to disagree with experiment. Melnitchouk presented a possible explanation (chiral logs) for this disagreement. It became clear from these discussions that lattice computations at significantly lighter quark masses need to be performed.

  11. Interactive radiopharmaceutical facility between Yale Medical Center and Brookhaven National Laboratory. Progress report, June 1981-July 1982

    Energy Technology Data Exchange (ETDEWEB)

    Gottschalk, A

    1982-01-01

    Progress is reported in the following research areas: (1) evaluation of the ¹⁴C-labelled carboxyethyl ester 2-carboxymethyl ester of arachidonic acid; (2) the effects of drug intervention on cardiac inflammatory response following experimental myocardial infarction using indium-111-labeled autologous leukocytes; (3) the evaluation of ⁹⁷Ru-oxine to label human platelets in autologous plasma; and (4) the specific in vitro radiolabeling of human neutrophils. (ACR)

  12. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP ON BARYON DYNAMICS AT RHIC, MARCH 28-30, 2002, BROOKHAVEN NATIONAL LABORATORY.

    Energy Technology Data Exchange (ETDEWEB)

    GYULASSY,M.; KHARZEEV,D.; XU,N.

    2002-03-28

    One of the striking observations at RHIC is the large valence baryon rapidity density observed at mid-rapidity in central Au+Au at 130 A GeV. There are about twice as many valence protons at mid-rapidity as predicted based on extrapolation from p+p collisions. Even more strikingly, PHENIX observed that the high pt spectrum is dominated by baryons and anti-baryons. The event anisotropy parameter v2 measured by STAR for lambdas is as high as that of charged particles at pt ≈ 2.5 GeV/c. These observations are completely unexpected based on conventional pQCD parton fragmentation phenomenology. One exciting possibility is that these observables reveal the topological gluon field origin of baryon number transport referred to as baryon junctions. Another is that hydrodynamics may apply up to high pt in A+A. There is no consensus on the correct mechanisms for producing baryons and hyperons at high pt and large rapidity shifts, and the new RHIC data provide a strong motivation to hold a meeting focusing on this class of observables. The possible role of junctions in forming CP-violating domain walls and novel nuclear bucky-ball configurations was also discussed. In this workshop, we focused on all measured baryon distributions at RHIC energies and related theoretical considerations. To facilitate the discussions, results of heavy ion collisions at lower beam energies and results from p+A/p+p/e+e collisions were included. Some suggestions for future measurements were made at the workshop.

  13. PHENIX Conceptual Design Report. An experiment to be performed at the Brookhaven National Laboratory Relativistic Heavy Ion Collider

    Energy Technology Data Exchange (ETDEWEB)

    Nagamiya, Shoji; Aronson, Samuel H.; Young, Glenn R.; Paffrath, Leo

    1993-01-29

    The PHENIX Conceptual Design Report (CDR) describes the detector design of the PHENIX experiment for Day-1 operation at the Relativistic Heavy Ion Collider (RHIC). The CDR presents the physics capabilities, technical details, cost estimate, construction schedule, funding profile, management structure, and possible upgrade paths of the PHENIX experiment. The primary goals of the PHENIX experiment are to detect the quark-gluon plasma (QGP) and to measure its properties. Many of the potential signatures for the QGP are measured as a function of a well-defined common variable to see if any or all of these signatures show a simultaneous anomaly due to the formation of the QGP. In addition, basic quantum chromodynamics phenomena, collision dynamics, and thermodynamic features of the initial states of the collision are studied. To achieve these goals, the PHENIX experiment measures lepton pairs (dielectrons and dimuons) to study various properties of vector mesons, such as the mass, the width, and the degree of yield suppression due to the formation of the QGP. The effect of thermal radiation on the continuum is studied in different regions of rapidity and mass. The eμ coincidence is measured to study charm production, and aids in understanding the shape of the continuum dilepton spectrum. Photons are measured to study direct emission of single photons and to study π⁰ and η production. Charged hadrons are identified to study the spectrum shape, production of antinuclei, the φ meson (via K⁺K⁻ decay), jets, and two-boson correlations. The measurements are made down to small cross sections to allow the study of high pT spectra, and J/ψ and Υ production. The PHENIX collaboration consists of over 300 scientists, engineers, and graduate students from 43 institutions in 10 countries. This large international collaboration is supported by US resources and significant foreign resources.

  14. Status of Switched-Power Linac studies at BNL (Brookhaven National Laboratory) and CERN (European Organization for Nuclear Research)

    Energy Technology Data Exchange (ETDEWEB)

    Aronson, S.

    1986-10-31

    The switched-power linac (SPL) concepts are reviewed briefly, and recent work on computer-modelling of the photoemission process at the photocathode and the experimental study of the process are discussed. Work on rf-modelling of the properties of the radial transmission line is outlined. (LEW)

  15. CAPS OpenACC Compilers: Performance and Portability

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    The announcement in late 2011 of the new OpenACC directive-based programming standard, supported by the CAPS, CRAY and PGI compilers, has opened the door to more scientific applications that can be ported to many-core systems. Following a porting methodology, this talk will first review the principles of programming with OpenACC and then the advanced features available in the CAPS compilers to further optimize OpenACC applications: library integration, and tuning directives with auto-tune mechanisms to build applications adaptive to different GPUs. CAPS compilers use hardware vendors' backends such as NVIDIA CUDA and OpenCL, making them the only OpenACC compilers supporting various many-core architectures. About the speaker: Stéphane Bihan is co-founder and currently Director of Sales and Marketing at CAPS enterprise. He has held several R&D positions in companies such as ARC International plc in London, Canon Research Center France, ACE compiler experts in Amsterdam and the INRIA r...

  16. Pre-synthesis Optimization for Asynchronous Circuits Using Compiler Techniques

    Science.gov (United States)

    Zamanzadeh, Sharareh; Najibi, Mehrdad; Pedram, Hossein

    The effectiveness of traditional compiler techniques, as employed in high-level synthesis of synchronous circuits to produce generic code, is studied for asynchronous synthesis by considering the special features of these circuits. The compiler methods can be used innovatively to improve the synthesis results in both power consumption and area. Methods like speculation, loop-invariant code motion and condition expansion are applicable in reducing the mass of handshaking circuits and intermediate modules. Moreover, they eliminate conditional access to variables and ports and reduce the amount of completion-detection circuitry. The approach is superimposed onto the Persia synthesis toolset as a pre-synthesis source-to-source transformation phase, and results show on average a 22% improvement in area and 24% in power consumption for asynchronous benchmarks.
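
One of the transformations named in the abstract, loop-invariant code motion, can be shown with a small hypothetical before/after pair. This plain-Python illustration is unrelated to the Persia toolset; it only demonstrates what the transformation does.

```python
# Loop-invariant code motion (LICM): an expression whose value does not
# change across iterations is computed once, before the loop, instead of
# on every iteration.

def before(xs, a, b):
    out = []
    for x in xs:
        k = a * b + 1        # invariant: recomputed every iteration
        out.append(x * k)
    return out

def after(xs, a, b):
    k = a * b + 1            # hoisted: computed once before the loop
    return [x * k for x in xs]

assert before([1, 2, 3], 2, 3) == after([1, 2, 3], 2, 3)
```

In the asynchronous-circuit setting, removing the recomputation from the loop body is what shrinks the handshaking and completion-detection logic that would otherwise guard it on every iteration.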

  17. The Katydid system for compiling KEE applications to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    Components of a system known as Katydid are developed in an effort to compile knowledge-based systems developed in a multimechanism integrated environment (KEE) to Ada. The Katydid core is an Ada library supporting KEE object functionality, and the other elements include a rule compiler, a LISP-to-Ada translator, and a knowledge-base dumper. Katydid employs translation mechanisms that convert LISP knowledge structures and rules to Ada and utilizes basic prototypes of a run-time KEE object-structure library module for Ada. Preliminary results include the semiautomatic compilation of portions of a simple expert system to run in an Ada environment with the described algorithms. It is suggested that Ada can be employed for AI programming and implementation, and the Katydid system is being developed to include concurrency and synchronization mechanisms.

  18. Automated Vulnerability Detection for Compiled Smart Grid Software

    Energy Technology Data Exchange (ETDEWEB)

    Prowell, Stacy J [ORNL; Pleszkoch, Mark G [ORNL; Sayre, Kirk D [ORNL; Linger, Richard C [ORNL

    2012-01-01

    While testing performed with proper experimental controls can provide scientifically quantifiable evidence that software does not contain unintentional vulnerabilities (bugs), it is insufficient to show that intentional vulnerabilities exist, and impractical to certify devices for the expected long lifetimes of use. For both of these needs, rigorous analysis of the software itself is essential. Automated software behavior computation applies rigorous static software analysis methods based on function extraction (FX) to compiled software to detect vulnerabilities, intentional or unintentional, and to verify critical functionality. This analysis is based on the compiled firmware, takes into account machine precision, and does not rely on heuristics or approximations early in the analysis.

  19. Compiling a Comprehensive EVA Training Dataset for NASA Astronauts

    Science.gov (United States)

    Laughlin, M. S.; Murray, J. D.; Lee, L. R.; Wear, M. L.; Van Baalen, M.

    2016-01-01

    Training for a spacewalk or extravehicular activity (EVA) is considered a hazardous duty for NASA astronauts. This places astronauts at risk for decompression sickness as well as various musculoskeletal disorders from working in the spacesuit. As a result, the operational and research communities over the years have requested access to EVA training data to supplement their studies. The purpose of this paper is to document the comprehensive EVA training data set that was compiled from multiple sources by the Lifetime Surveillance of Astronaut Health (LSAH) epidemiologists to investigate musculoskeletal injuries. The EVA training dataset does not contain any medical data; rather, it documents when EVA training was performed, by whom, and other details about the session. The first activities practicing EVA maneuvers in water were performed at the Neutral Buoyancy Simulator (NBS) at the Marshall Spaceflight Center in Huntsville, Alabama. This facility opened in 1967 and was used for EVA training until the early Space Shuttle program days. Although several photographs show astronauts performing EVA training in the NBS, records detailing who performed the training and the frequency of training are unavailable. Paper training records were stored within the NBS after it was designated as a National Historic Landmark in 1985 and closed in 1997, but significant resources would be needed to identify and secure these records, and at this time LSAH has not pursued acquisition of these early training records. Training in the NBS decreased when the Johnson Space Center in Houston, Texas, opened the Weightless Environment Training Facility (WETF) in 1980. Early training records from the WETF consist of 11 hand-written dive logbooks compiled by individual workers that were digitized at the request of LSAH. The WETF was integral in the training for Space Shuttle EVAs until its closure in 1998. The Neutral Buoyancy Laboratory (NBL) at the Sonny Carter Training Facility near JSC

  20. The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States

    Science.gov (United States)

    Horton, John D.; San Juan, Carma A.; Stoeser, Douglas B.

    2017-06-30

    The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States (https://doi.org/10.5066/F7WH2N65) represents a seamless, spatial database of 48 State geologic maps that range from 1:50,000 to 1:1,000,000 scale. A national digital geologic map database is essential in interpreting other datasets that support numerous types of national-scale studies and assessments, such as those that provide geochemistry, remote sensing, or geophysical data. The SGMC is a compilation of the individual U.S. Geological Survey releases of the Preliminary Integrated Geologic Map Databases for the United States. The SGMC geodatabase also contains updated data for seven States and seven entirely new State geologic maps that have been added since the preliminary databases were published. Numerous errors have been corrected and enhancements added to the preliminary datasets using thorough quality assurance/quality control procedures. The SGMC is not a truly integrated geologic map database because geologic units have not been reconciled across State boundaries. However, the geologic data contained in each State geologic map have been standardized to allow spatial analyses of lithology, age, and stratigraphy at a national scale.

  1. Compilation of Quality Improvement Standards of General Dentistry Program in Islamic Republic of Iran

    Directory of Open Access Journals (Sweden)

    Fakhrossadat Hosseini

    Introduction: The importance of quality assurance makes standard compilation in educational systems a high-priority subject in medical education. The purpose of this study was to study the compilation of quality improvement standards for the general dentistry program in the Islamic Republic of Iran. Materials & Methods: This descriptive study was performed during the years 2011 and 2012 in three phases. In the first phase, previous literature and similar standards were included in a comparative study and screened based on national health policies in the Health Map. Results were evaluated by 16 dental school representatives using a modified Delphi methodology; open-closed questionnaires were filled in by their faculty members and were reported back to the dental secretariat of the ministry of health in the second phase. In the final phase, results were evaluated in the secretariat by a focus group and the final criteria were introduced based on the secretariat's policies. Results: Fifty-eight criteria were created in the first phase. Data were collected from 13 faculties in the second phase (response rate = 81%). Eighteen criteria had less than 90% agreement of the participants; however, all of the criteria were agreed to by more than 70% of the participants. In the final phase, 48 quality improvement standards in seven areas were accepted and introduced in the dental secretariat of the ministry of health. Conclusion: The final standards document could be used as a national covenant of quality improvement based on its high agreement rate and its grounding in national policies in the Health Map.

  2. The compilation of bilingual dictionaries between African languages ...

    African Journals Online (AJOL)

    The absence of bilingual dictionaries between African languages again provides evidence of who the lexicographers were and which population groups they represented. To fill this void, a model called the hub-and-spoke is proposed in this paper for the compilation of such kind of dictionaries. The model has been chosen ...

  3. Compiling European Immigration History: the Case of Land of Promise

    NARCIS (Netherlands)

    Meuzelaar, Andrea

    2015-01-01

    Today television's reliance on archival footage seems to be intensifying due to the increased accessibility of European broadcast archives and the increased amount of available digitized broadcast material. In this article, the author reflects on television's convention to compile stories

  4. Compilation of a global inventory of emissions of nitrous oxide

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

    A global inventory with 1°×1° resolution was compiled of emissions of nitrous oxide (N₂O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing,

  5. Statistical Compilation of the ICT Sector and Policy Analysis | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  6. Target Users' Expectations Versus the Actual Compilation Of A ...

    African Journals Online (AJOL)

    Mev. R.B. Ruthven

    Abstract: The article discusses the challenges that confronted the team of compilers working on the monolingual Shona Children's Dictionary (henceforth SCD). It looks at the active involvement of the target users in shaping the project and discusses the considerations for the implementation of their recommendations.

  7. MCP-based detectors: A compact compilation of recent developments

    CERN Document Server

    Valiev, F F; CERN. Geneva

    1992-01-01

    We present a compilation compressed to a table form of the parameters of some recently developed position-sensitive detectors based on the application of microchannel plates (MCPs) for the registration of various sorts of radiations in a wide region of energies.

  8. Experiences with Compiler Support for Processors with Exposed Pipelines

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Schleuniger, Pascal; Hindborg, Andreas Erik

    2015-01-01

    Field programmable gate arrays, FPGAs, have become an attractive implementation technology for a broad range of computing systems. We recently proposed a processor architecture, Tinuso, which achieves high performance by moving complexity from hardware to the compiler tool chain. This means...

  9. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    This article proposes a lexicographical approach to the compilation of multilingual concept literacy glossaries which may play a very important role in supporting students at institutions of higher education. In order to support concept literacy, especially for students for whom English is not the native language, a number of ...

  10. Compiling Planning into Quantum Optimization Problems: A Comparative Study

    Science.gov (United States)

    2015-06-07

    to SAT, and then reduces higher-order terms to quadratic terms through a series of gadgets. Our mappings allow both positive and negative preconditions...

  11. Compilation and memory management for ASF+SDF

    NARCIS (Netherlands)

    M.G.J. van den Brand (Mark); P. Klint (Paul); P.A. Olivier (Pieter)

    1999-01-01

    textabstractCan formal specification techniques be scaled-up to industrial problems such as the development of domain-specific languages and the renovation of large COBOL systems? We have developed a compiler for the specification formalism Asf+Sdf that has been used successfully to meet such

  12. Joyce M. Hawkins (Compiler). The South African Oxford School Dictionary

    African Journals Online (AJOL)

    Preparing a dictionary for school children confronts the compiler with at least two problems. The first is how to simplify complex definitions, using a limited vocabulary; the second is what words to put in, and what to leave out. Indeed, how does one begin to assess what words South African school children currently live ...

  13. Compiling a bidirectional dictionary bridging English and the Sotho ...

    African Journals Online (AJOL)

    The aim of this article is to investigate the viability of the compilation of a single bidirectional dictionary with a single lemma list for the Sesotho sa Leboa, Setswana and Sesotho → English side and a simultaneous treatment of the three Sotho languages in the articles of the English lemmas in the English → Sesotho sa ...

  14. Statistical Compilation of the ICT Sector and Policy Analysis | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  15. Indexed compilation of experimental high energy physics literature. [Synopsis

    Energy Technology Data Exchange (ETDEWEB)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is presented, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  16. Updated site compilation of the Latin American Pollen Database

    NARCIS (Netherlands)

    Flantua, S.G.A.; Hooghiemstra, H.; Grimm, E.C.; Behling, H.; Bush, M.B; González-Arrango, C.; Gosling, W.D.; Ledru, M.-P.; Lozano-Garciá, S.; Maldonado, A.; Prieto, A.R.; Rull, V.; van Boxel, J.H.

    2015-01-01

    The updated inventory of the Latin American Pollen Database (LAPD) offers a wide range of new insights. This paper presents a systematic compilation of palynological research in Latin America. A comprehensive inventory of publications in peer-reviewed and grey literature shows a major expansion of

  17. Not mere lexicographic cosmetics: the compilation and structural ...

    African Journals Online (AJOL)

    This article offers a brief overview of the compilation of the Ndebele music terms dictionary, Isichazamazwi SezoMculo (henceforth the ISM), paying particular attention to its structural features. It emphasises that the reference needs of the users as well as their reference skills should be given a determining role in all ...

  18. Target Users' Expectations versus the Actual Compilation of a ...

    African Journals Online (AJOL)

    The article discusses the challenges that confronted the team of compilers working on the monolingual Shona Children's Dictionary (henceforth SCD). It looks at the active involvement of the target users in shaping the project and discusses the considerations for the implementation of their recommendations. Matters of ...

  19. The Compilation of a Shona Children's Dictionary: Challenges and ...

    African Journals Online (AJOL)

    This article outlines the challenges encountered by the African Languages Research Institute (ALRI) team members in the compilation of the monolingual Shona Children's Dictionary. The focus is mainly on the problems met in headword selection. Solutions by the team members when dealing with these problems are also ...

  20. Compilation of Radiogenic Isotope Data in Mexico and their ...

    Indian Academy of Sciences (India)

    Seven hundred and twenty-five Sr, two hundred and forty-three Nd and one hundred and fifty-one Pb isotopic ratios from seven different Mexican magmatic provinces were compiled in an extensive geochemical database. Data were arranged according to the Mexican geological provinces, indicating for each province total ...

  1. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    Abstract: This article proposes a lexicographical approach to the compilation of multilingual concept literacy glossaries which may play a very important role in supporting students at institutions of higher education. In order to support concept literacy, especially for students for whom English is not the native language, ...

  2. A Journey from Interpreters to Compilers and Virtual Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We review a simple sequence of steps to stage a programming-language interpreter into a compiler and virtual machine. We illustrate the applicability of this derivation with a number of existing virtual machines, mostly for functional languages. We then outline its relevance for today's language...
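The core staging idea this abstract describes can be made concrete in a few lines. Below is a toy illustration of my own (not Danvy's actual derivation): a direct interpreter for arithmetic expressions, and a staged version that walks the AST once and returns a closure, which is a first step from interpreter toward compiler.

```python
def interpret(expr, env):
    """Direct interpreter: re-examines the AST on every evaluation."""
    op = expr[0]
    if op == "lit":
        return expr[1]
    if op == "var":
        return env[expr[1]]
    if op == "add":
        return interpret(expr[1], env) + interpret(expr[2], env)
    raise ValueError(op)

def compile_expr(expr):
    """Staged version: AST dispatch happens once, at 'compile time'.

    The returned closure can then be run many times without re-traversing
    the AST, mimicking what a compiled program does."""
    op = expr[0]
    if op == "lit":
        v = expr[1]
        return lambda env: v
    if op == "var":
        name = expr[1]
        return lambda env: env[name]
    if op == "add":
        f = compile_expr(expr[1])
        g = compile_expr(expr[2])
        return lambda env: f(env) + g(env)
    raise ValueError(op)

ast = ("add", ("var", "x"), ("lit", 1))
code = compile_expr(ast)
print(interpret(ast, {"x": 41}))  # 42
print(code({"x": 41}))            # 42
```

Both paths must agree on every input; the staged version simply moves the dispatch cost from run time to "compile" time.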

  3. Thoughts and Views on the Compilation of Monolingual Dictionaries ...

    African Journals Online (AJOL)

    the Compilation of Monolingual Dictionaries in South Africa*. N.C.P. Golele, Xitsonga Language Research and Development Centre, Letaba, Republic of South Africa (xitsolrdc@telkomsa.net). Abstract: Developing and documenting the eleven official languages of South Africa on all levels of communication in order to fulfil ...

  4. The compilation of the Shona–English biomedical dictionary ...

    African Journals Online (AJOL)

    The bilingual Shona–English dictionary of biomedical terms, Duramazwi reUrapi neUtano, was compiled with the aim of improving the efficiency of communication between doctor and patient. The dictionary is composed of terms from both modern and traditional medicinal practices. The article seeks to look at the methods ...

  5. Fernando: An educational ahead-of-time bytecode compiler

    DEFF Research Database (Denmark)

    Puffitsch, Wolfgang

    2015-01-01

    While modern Java virtual machines are efficient and portable, they are also very complex software artifacts. Adapting or extending such a complex system within the scope of a university course is hardly an option. Fernando is a minimalist ahead-of-time bytecode compiler with the aim of providing...

  6. Compilation of historical information of 300 Area facilities and activities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information on 300 Area facilities and activities from the beginning of operations. The 300 Area is shown as it looked in 1945, and a more recent (1985) look at the 300 Area is also provided.

  7. 5 CFR 9701.524 - Compilation and publication of data.

    Science.gov (United States)

    2010-01-01

    ....524 Section 9701.524 Administrative Personnel DEPARTMENT OF HOMELAND SECURITY HUMAN RESOURCES MANAGEMENT SYSTEM (DEPARTMENT OF HOMELAND SECURITY-OFFICE OF PERSONNEL MANAGEMENT) DEPARTMENT OF HOMELAND SECURITY HUMAN RESOURCES MANAGEMENT SYSTEM Labor-Management Relations § 9701.524 Compilation and...

  8. 15 CFR 28.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... the Committee on Foreign Relations of the Senate and the Committee on Foreign Affairs of the House of... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Semi-annual compilation. 28.600 Section 28.600 Commerce and Foreign Trade Office of the Secretary of Commerce NEW RESTRICTIONS ON LOBBYING...

  9. 22 CFR 138.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    ... Relations of the Senate and the Committee on Foreign Affairs of the House of Representatives or the... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Semi-annual compilation. 138.600 Section 138.600 Foreign Relations DEPARTMENT OF STATE MISCELLANEOUS NEW RESTRICTIONS ON LOBBYING Agency Reports...

  10. 22 CFR 712.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    ... the Committee on Foreign Relations of the Senate and the Committee on Foreign Affairs of the House of... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Semi-annual compilation. 712.600 Section 712.600 Foreign Relations OVERSEAS PRIVATE INVESTMENT CORPORATION ADMINISTRATIVE PROVISIONS NEW RESTRICTIONS ON...

  11. 22 CFR 311.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    ... Senate and the Committee on Foreign Affairs of the House of Representatives or the Committees on Armed... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Semi-annual compilation. 311.600 Section 311.600 Foreign Relations PEACE CORPS NEW RESTRICTIONS ON LOBBYING Agency Reports § 311.600 Semi-annual...

  12. 22 CFR 227.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    ... Relations of the Senate and the Committee on Foreign Affairs of the House of Representatives or the... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Semi-annual compilation. 227.600 Section 227.600 Foreign Relations AGENCY FOR INTERNATIONAL DEVELOPMENT NEW RESTRICTIONS ON LOBBYING Agency Reports...

  13. 22 CFR 519.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-04-01

    ... Relations of the Senate and the Committee on Foreign Affairs of the House of Representatives or the... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Semi-annual compilation. 519.600 Section 519.600 Foreign Relations BROADCASTING BOARD OF GOVERNORS NEW RESTRICTIONS ON LOBBYING Agency Reports § 519.600...

  14. 13 CFR 146.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Semi-annual compilation. 146.600 Section 146.600 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW RESTRICTIONS ON LOBBYING.... (c) Information that involves intelligence matters shall be reported only to the Select Committee on...

  15. Final report: Compiled MPI. Cost-Effective Exascale Application Development

    Energy Technology Data Exchange (ETDEWEB)

    Gropp, William Douglas [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-12-21

    This is the final report on Compiled MPI: Cost-Effective Exascale Application Development, and summarizes the results of this project. The project investigated runtime environments that improve the performance of MPI (Message-Passing Interface) programs; work at Illinois in the last period of the project focused on data access optimizations expressed with MPI datatypes.

  16. A Language for Specifying Compiler Optimizations for Generic Software

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, Jeremiah J. [Indiana Univ., Bloomington, IN (United States)

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization: the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.
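To give a flavour of what user-specified, library-based optimizations can mean (this is a generic illustration, not Pavilion's actual syntax or semantics): algebraic rewrite rules applied bottom-up to an expression tree, of the kind a library author might declare as identities over library types.

```python
# Each rule is a (predicate, rewrite) pair over AST nodes of the form
# ("lit", n) | ("var", name) | (op, left, right). Hypothetical example rules:
RULES = [
    (lambda e: e[0] == "*" and e[2] == ("lit", 1), lambda e: e[1]),  # x*1 -> x
    (lambda e: e[0] == "+" and e[2] == ("lit", 0), lambda e: e[1]),  # x+0 -> x
]

def optimize(expr):
    """Bottom-up rewriting: optimize children first, then try each rule."""
    if expr[0] in ("lit", "var"):
        return expr
    expr = (expr[0],) + tuple(optimize(child) for child in expr[1:])
    for matches, rewrite in RULES:
        if matches(expr):
            return optimize(rewrite(expr))  # re-optimize the rewritten node
    return expr

e = ("+", ("*", ("var", "x"), ("lit", 1)), ("lit", 0))
print(optimize(e))  # ('var', 'x')
```

A real system like the one the abstract describes would let users state such rules declaratively and would verify their applicability against library semantics; the sketch only shows the rewriting mechanics.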

  17. Ada Compiler Validation Summary Report: Proprietary Software Systems, Inc., PSS Ada Compiler VAX/VMS, VAX/VMS 8350 (Host and Target) 89071011.10120

    Science.gov (United States)

    1989-07-10

    compiler has no nonconformities to the Ada Standard other than those presented. Copies of this report are available to the public from: Ada Information...result that demonstrates nonconformity to the Ada Standard. Host The computer on which the compiler resides. Inapplicable An ACVC test that uses features...units are in separate compilation files or not. A1408 and AI506 allow this behaviour . 1) Generic specifications and bodies can be compiled in separate

  18. A Study on Features and Limitations of On-line C Compilers

    OpenAIRE

    Lakshminarayanan, Ramkumar; Dhanasekaran, Balaji; Ephre, Ben George

    2016-01-01

    Compilers translate programs from textual source form into executable form. With the advent of the internet, studies related to the development of cloud-based compilers are being carried out. There is a considerable increase in on-line compilers, which enable on-line compilation of user programs without any mandate to install a compiler locally. This study is specific to on-line C compilers and investigates their correctness, issues and limitations.

  19. Compilation of gallium resource data for bauxite deposits

    Science.gov (United States)

    Schulte, Ruth F.; Foley, Nora K.

    2014-01-01

    Gallium (Ga) concentrations for bauxite deposits worldwide have been compiled from the literature to provide a basis for research regarding the occurrence and distribution of Ga worldwide, as well as between types of bauxite deposits. In addition, this report is an attempt to bring together reported Ga concentration data into one database to supplement ongoing U.S. Geological Survey studies of critical mineral resources. The compilation of Ga data consists of location, deposit size, bauxite type and host rock, development status, major oxide data, trace element (Ga) data and analytical method(s) used to derive the data, and tonnage values for deposits within bauxite provinces and districts worldwide. The range in Ga concentrations for bauxite deposits worldwide is

  20. Compiler-Enforced Cache Coherence Using a Functional Language

    Directory of Open Access Journals (Sweden)

    Rich Wolski

    1996-01-01

    The cost of hardware cache coherence, both in terms of execution delay and operational cost, is substantial for scalable systems. Fortunately, compiler-generated cache management can reduce program serialization due to cache contention; increase execution performance; and reduce the cost of parallel systems by eliminating the need for more expensive hardware support. In this article, we use the Sisal functional language system as a vehicle to implement and investigate automatic, compiler-based cache management. We describe our implementation of Sisal for the IBM Power/4. The Power/4, briefly available as a product, represents an early attempt to build a shared memory machine that relies strictly on the language system for cache coherence. We discuss the issues associated with deterministic execution and program correctness on a system without hardware coherence, and demonstrate how Sisal (as a functional language) is able to address those issues.

  1. Compiling a corpus-based dictionary grammar: an example for ...

    African Journals Online (AJOL)

    In this article it is shown how a corpus-based dictionary grammar may be compiled — that is, a mini-grammar fully based on corpus data and specifically written for use in and integrated with a dictionary. Such an effort is, to the best of our knowledge, a world's first. We exemplify our approach for a Northern Sotho ...

  2. CRECTJ: a computer program for compilation of evaluated nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-09-01

    In order to compile evaluated nuclear data in the ENDF format, the computer program CRECTJ has been developed. CRECTJ has two versions: CRECTJ5 treats data in the ENDF/B-IV and ENDF/B-V formats, and CRECTJ6 data in the ENDF-6 format. These programs have been frequently used to produce the Japanese Evaluated Nuclear Data Library (JENDL). This report describes the input data and gives examples of CRECTJ. (author)

  3. Effect-driven QuickChecking of compilers

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Justesen, Mathias Nygaard; Kasting, Patrick Frederik Soelmark

    2017-01-01

    How does one test a language implementation with QuickCheck (aka property-based testing)? One approach is to generate programs following the grammar of the language. But in a statically typed language such as OCaml, too many of these candidate programs will be rejected as ill-typed by the type checker. ... We test OCaml's two compiler backends against each other and report on a number of bugs we have found doing so.
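The differential-testing idea behind the paper can be sketched generically: generate random well-formed programs and assert that two independent implementations agree on every one. The two "backends" below are stand-ins of my own (a tree walker versus Python's own `eval` of the printed form), not OCaml's compilers.

```python
import random

def gen_expr(depth=3):
    """Random arithmetic AST: an int leaf or (op, left, right)."""
    if depth == 0 or random.random() < 0.3:
        return random.randint(-5, 5)
    return (random.choice(["+", "*"]), gen_expr(depth - 1), gen_expr(depth - 1))

def eval_tree(e):
    """Backend 1: direct recursive evaluation."""
    if isinstance(e, int):
        return e
    op, l, r = e
    return eval_tree(l) + eval_tree(r) if op == "+" else eval_tree(l) * eval_tree(r)

def to_text(e):
    """Fully parenthesized pretty-printer used by backend 2."""
    if isinstance(e, int):
        return str(e)
    op, l, r = e
    return f"({to_text(l)} {op} {to_text(r)})"

def differential_test(trials=500, seed=0):
    """Backend 2 is Python's eval of the printed form; both must agree."""
    random.seed(seed)
    for _ in range(trials):
        e = gen_expr()
        assert eval_tree(e) == eval(to_text(e)), f"backends disagree on {e}"
    return trials

print(differential_test(), "random programs checked")
```

Because the generator only produces well-formed trees, every candidate is a valid program; the paper's harder problem is doing the same for a typed language, where naive generation mostly yields ill-typed terms.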

  4. Fault-tolerant digital microfluidic biochips compilation and synthesis

    CERN Document Server

    Pop, Paul; Stuart, Elena; Madsen, Jan

    2016-01-01

    This book describes for researchers in the fields of compiler technology, design and test, and electronic design automation the new area of digital microfluidic biochips (DMBs), and thus offers a new application area for their methods.  The authors present a routing-based model of operation execution, along with several associated compilation approaches, which progressively relax the assumption that operations execute inside fixed rectangular modules.  Since operations can experience transient faults during the execution of a bioassay, the authors show how to use both offline (design time) and online (runtime) recovery strategies. The book also presents methods for the synthesis of fault-tolerant application-specific DMB architectures. ·         Presents the current models used for the research on compilation and synthesis techniques of DMBs in a tutorial fashion; ·         Includes a set of “benchmarks”, which are presented in great detail and includes the source code of most of the t...

  5. Compiling knowledge-based systems from KEE to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications - most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Here we describe the first efforts to develop a system for compiling KBS developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights on Ada as an artificial intelligence programming language, potential solutions of some of the engineering difficulties encountered in early work, and inspiration on future system development.

  6. Construction experiences from underground works at Forsmark. Compilation Report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders [Vattenfall Power Consultant AB, Stockholm (Sweden); Christiansson, Rolf [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)

    2007-02-15

    The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily concerning rock support solutions. The authors of this report separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations, and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material as presented in the list of references. But it stands to reason that, during the course of the work with this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  7. National Capital Planning Commission Library contents

    Data.gov (United States)

    National Capital Planning Commission — The National Capital Planning Commission library catalog is a compilation of titles, authors, years of publication and topics of books, reports and NCPC publications.

  8. National Flood Hazard Layer (NFHL)

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — The National Flood Hazard Layer (NFHL) is a compilation of GIS data that comprises a nationwide digital Flood Insurance Rate Map. The GIS data and services are...

  9. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of different airborne surveys of different sizes, resolutions and acquisition years. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of the investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal magnetic field grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for e.g. oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilation of magnetic data and the merger of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still very little recognized. The maximum wavelength that can be resolved in each individual survey is directly related to the survey size, and consequently a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine the airborne compilation with satellite magnetic data, which is sufficient only under particular preconditions.
A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated

  10. The RaDIATE High-Energy Proton Materials Irradiation Experiment at the Brookhaven Linac Isotope Producer Facility

    Energy Technology Data Exchange (ETDEWEB)

    Ammigan, Kavin; et al.

    2017-05-01

    The RaDIATE collaboration (Radiation Damage In Accelerator Target Environments) was founded in 2012 to bring together the high-energy accelerator target and nuclear materials communities to address the challenging issue of radiation damage effects in beam-intercepting materials. Success of current and future high intensity accelerator target facilities requires a fundamental understanding of these effects including measurement of materials property data. Toward this goal, the RaDIATE collaboration organized and carried out a materials irradiation run at the Brookhaven Linac Isotope Producer facility (BLIP). The experiment utilized a 181 MeV proton beam to irradiate several capsules, each containing many candidate material samples for various accelerator components. Materials included various grades/alloys of beryllium, graphite, silicon, iridium, titanium, TZM, CuCrZr, and aluminum. Attainable peak damage from an 8-week irradiation run ranges from 0.03 DPA (Be) to 7 DPA (Ir). Helium production is expected to range from 5 appm/DPA (Ir) to 3,000 appm/DPA (Be). The motivation, experimental parameters, as well as the post-irradiation examination plans of this experiment are described.

  11. Magnet design for the splitter/combiner regions of CBETA, the Cornell-Brookhaven Energy-Recovery-Linac Test Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Crittendon, J. A. [Cornell Lab. for Accelerator-Based Sciences and Education, Ithaca, NY (United States); Burke, D. C. [Cornell Lab. for Accelerator-Based Sciences and Education, Ithaca, NY (United States); Fuentes, Y. L.P. [Cornell Lab. for Accelerator-Based Sciences and Education, Ithaca, NY (United States); Mayes, C. E. [Cornell Lab. for Accelerator-Based Sciences and Education, Ithaca, NY (United States); Smolenski, K. W. [Cornell Lab. for Accelerator-Based Sciences and Education, Ithaca, NY (United States)

    2017-01-06

    The Cornell-Brookhaven Energy-Recovery-Linac Test Accelerator (CBETA) will provide a 150-MeV electron beam using four acceleration and four deceleration passes through the Cornell Main Linac Cryomodule housing six 1.3-GHz superconducting RF cavities. The return path of this 76-m-circumference accelerator will be provided by 106 fixed-field alternating-gradient (FFAG) cells which carry the four beams of 42, 78, 114 and 150 MeV. Here we describe magnet designs for the splitter and combiner regions, which serve to match the on-axis linac beam to the off-axis beams in the FFAG cells, providing the path-length adjustment necessary for energy recovery for each of the four beams. The path lengths of the four beamlines in each of the splitter and combiner regions are designed to be adaptable to 1-, 2-, 3-, and 4-pass staged operations. Design specifications and modeling for the 24 dipole and 32 quadrupole electromagnets in each region are presented. The CBETA project will serve as the first demonstration of multi-pass energy recovery using superconducting RF cavities with FFAG cell optics for the return loop.

  12. ISO 14001 IMPLEMENTATION AT A NATIONAL LABORATORY.

    Energy Technology Data Exchange (ETDEWEB)

    BRIGGS,S.L.K.

    2001-06-01

    After a tumultuous year discovering serious lapses in environment, safety and health management at Brookhaven National Laboratory, the Department of Energy established a new management contract. It called for implementation of an ISO 14001 Environmental Management System and registration of key facilities. Brookhaven Science Associates, the managing contractor for the Laboratory, designed and developed a three-year project to change culture and achieve the goals of the contract. The focus of its efforts was to use ISO 14001 to integrate environmental stewardship into all facets of the Laboratory's mission, and to manage its programs in a manner that protected the ecosystem and public health. A large multidisciplinary National Laboratory with over 3,000 employees and 4,000 visiting scientists annually posed significant challenges for ISO 14001 implementation. Activities with environmental impacts varied from regulated industrial waste generation, to soil activation from particle accelerator operations, to radioactive groundwater contamination from research reactors. A project management approach was taken to ensure project completion on schedule and within budget. The major work units for the Environmental Management System Project were as follows: Institutional EMS Program Requirements, Communications, Training, Laboratory-wide Implementation, and Program Assessments. To minimize costs and incorporate lessons learned before full-scale deployment throughout the Laboratory, a pilot process was employed at three facilities. Brookhaven National Laboratory completed the second year of the project in the summer of 2000, successfully registering nine facilities and self-declaring conformance in all remaining facilities. Project controls, including tracking and reporting progress against a model, have been critical to the successful implementation. Cost summaries are lower than initial estimates, but as expected legal requirements, training, and assessments are key

  13. Coastal Assessment Framework - National Assessment of Estuary and Coastal Habitats

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Under the National Fish Habitat Partnership, scientists at the NEFSC, NWFSC, and Silver Spring Headquarters are compiling information on the nation's estuarine and...

  14. Compilation of Henry's law constants, version 3.99

    Science.gov (United States)

    Sander, R.

    2014-11-01

    Many atmospheric chemicals occur in the gas phase as well as in liquid cloud droplets and aerosol particles. Therefore, it is necessary to understand the distribution between the phases. According to Henry's law, the equilibrium ratio between the abundances in the gas phase and in the aqueous phase is constant for a dilute solution. Henry's law constants of trace gases of potential importance in environmental chemistry have been collected and converted into a uniform format. The compilation contains 14775 values of Henry's law constants for 3214 species, collected from 639 references. It is also available on the internet at http://www.henrys-law.org.
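The Henry's law relation the compilation tabulates can be applied directly: for a dilute solution, the equilibrium aqueous-phase concentration is c_aq = Hcp · p, where Hcp is the Henry's law solubility constant (in mol m⁻³ Pa⁻¹ in this convention) and p the partial pressure in Pa. The CO2 value below is illustrative, of the magnitude typically listed; consult the compilation itself for vetted numbers.

```python
def aqueous_concentration(Hcp, partial_pressure_pa):
    """Equilibrium aqueous-phase concentration (mol m^-3) via Henry's law:
    c_aq = Hcp * p, valid for dilute solutions."""
    return Hcp * partial_pressure_pa

Hcp_co2 = 3.3e-4            # mol m^-3 Pa^-1, illustrative CO2 value
p_co2 = 40.0                # Pa, roughly 400 ppm of 1 atm
c = aqueous_concentration(Hcp_co2, p_co2)
print(f"{c:.4f} mol m^-3")  # 0.0132 mol m^-3
```

Note that published compilations use several unit conventions (e.g. M atm⁻¹ versus mol m⁻³ Pa⁻¹), so converting to a single convention, as this compilation does, is essential before comparing values.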

  15. A methodology for compiling the current licensing basis

    Energy Technology Data Exchange (ETDEWEB)

    Ward, P.A.; Levin, J.P.

    1990-01-01

    Many licensees are considering comprehensive configuration management programs to maintain documentation and material consistency in all aspects of nuclear power plant operation. An effective licensing basis management program is a key component of the configuration management program. The purpose of this paper is to describe a methodology for compiling the current licensing basis. The current licensing basis for an operating nuclear power plant is the body of licensee obligations, which consist of imposed US Nuclear Regulatory Commission (NRC) requirements and licensee commitments in effect at any given time during the license term, upon which the NRC relies to allow continued operation.

  16. An Advanced Compiler Designed for a VLIW DSP for Sensors-Based Systems

    Directory of Open Access Journals (Sweden)

    Hu He

    2012-04-01

    The VLIW architecture can be exploited to greatly enhance instruction level parallelism, thus it can provide computation power and energy efficiency advantages, which satisfies the requirements of future sensor-based systems. However, as VLIW codes are mainly compiled statically, the performance of a VLIW processor is dominated by the behavior of its compiler. In this paper, we present an advanced compiler designed for a VLIW DSP named Magnolia, which will be used in sensor-based systems. This compiler is based on the Open64 compiler. We have implemented several advanced optimization techniques in the compiler, and fulfilled the O3 level optimization. Benchmarks from the DSPstone test suite are used to verify the compiler. Results show that the code generated by our compiler can make the performance of Magnolia match that of the current state-of-the-art DSP processors.

  17. An advanced compiler designed for a VLIW DSP for sensors-based systems.

    Science.gov (United States)

    Yang, Xu; He, Hu

    2012-01-01

    The VLIW architecture can be exploited to greatly enhance instruction level parallelism, thus it can provide computation power and energy efficiency advantages, which satisfies the requirements of future sensor-based systems. However, as VLIW codes are mainly compiled statically, the performance of a VLIW processor is dominated by the behavior of its compiler. In this paper, we present an advanced compiler designed for a VLIW DSP named Magnolia, which will be used in sensor-based systems. This compiler is based on the Open64 compiler. We have implemented several advanced optimization techniques in the compiler, and fulfilled the O3 level optimization. Benchmarks from the DSPstone test suite are used to verify the compiler. Results show that the code generated by our compiler can make the performance of Magnolia match that of the current state-of-the-art DSP processors.
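A central task of any VLIW compiler, as the abstract notes, is static scheduling: packing independent instructions into fixed-width issue bundles while respecting data dependences. The greedy list scheduler below is a minimal sketch of that idea only; real compilers such as the Open64-based one described here do far more (register allocation, software pipelining, and so on).

```python
def schedule(instrs, width=2):
    """Greedy VLIW bundling sketch.

    instrs: list of (name, reads, writes) with reads/writes as sets of
    register names. Returns a list of bundles (lists of names). An
    instruction may issue only when all its inputs were produced in
    earlier bundles and it does not write a register already written
    in the current bundle. Assumes dependences are satisfiable.
    """
    done, bundles, remaining = set(), [], list(instrs)
    while remaining:
        bundle, written = [], set()
        for ins in list(remaining):
            name, reads, writes = ins
            if reads <= done and not (writes & written) and len(bundle) < width:
                bundle.append(name)
                written |= writes
                remaining.remove(ins)
        done |= written          # results become visible after the bundle
        bundles.append(bundle)
    return bundles

prog = [("i1", set(), {"a"}), ("i2", set(), {"b"}),
        ("i3", {"a", "b"}, {"c"}), ("i4", set(), {"d"})]
print(schedule(prog))  # [['i1', 'i2'], ['i3', 'i4']]
```

Here i3 must wait for i1 and i2, so it lands in the second bundle, where the independent i4 fills the remaining slot; this filling of otherwise idle slots is exactly where a VLIW compiler earns its performance.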

  18. USGS Water Use Data for the Nation - National Water Information System (NWIS)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The U.S. Geological Survey's National Water-Use Information Program is responsible for compiling and disseminating the nation's water-use data. The USGS works in...

  19. Symbolic LTL Compilation for Model Checking: Extended Abstract

    Science.gov (United States)

    Rozier, Kristin Y.; Vardi, Moshe Y.

    2007-01-01

    In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm.We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering and examine their effects on performance metrics including processing time and scalability. Safety critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
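To make concrete what "checking LTL formulas against a model" means in the abstract, here is a tiny finite-trace checker of my own for two operators, G ("globally") and F ("eventually"). Symbolic model checkers like those the paper compares reason over all infinite behaviours of a transition system; this sketch only inspects one finite trace, so it illustrates the semantics, not the algorithms.

```python
def holds(formula, trace, i=0):
    """Evaluate an LTL-style formula over a finite trace from position i.

    formula: ("ap", name) | ("not", f) | ("G", f) | ("F", f)
    trace: list of sets of atomic propositions true at each step."""
    op = formula[0]
    if op == "ap":                       # atomic proposition
        return formula[1] in trace[i]
    if op == "not":
        return not holds(formula[1], trace, i)
    if op == "G":                        # subformula at every position from i on
        return all(holds(formula[1], trace, j) for j in range(i, len(trace)))
    if op == "F":                        # subformula at some position from i on
        return any(holds(formula[1], trace, j) for j in range(i, len(trace)))
    raise ValueError(op)

trace = [{"req"}, {"req", "grant"}, {"grant"}, set()]
print(holds(("F", ("ap", "grant")), trace))  # True: grant eventually occurs
print(holds(("G", ("ap", "req")), trace))    # False: req fails at step 2
```

Satisfiability checking, the paper's focus, asks the dual question: is there *any* trace on which the formula holds, which is what makes reduction to model checking over a universal model possible.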

  20. The FORTRAN NALAP code adapted to a microcomputer compiler

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso, E-mail: plobo.a@uol.com.b, E-mail: eduardo@ieav.cta.b, E-mail: fbraz@ieav.cta.b, E-mail: guimarae@ieav.cta.b [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)

    2010-07-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA project (TEcnologia de Reatores Rapidos Avancados, Technology for Advanced Fast Reactors), aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code is presented as adapted to a microcomputer compiler, Compaq Visual Fortran (Version 6.6). This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium-cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly to remove unnecessary routines, to eliminate old statements, to introduce new ones and to include an extended-precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a loss-of-flow transient; and protected overpower transients. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 1980s. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the use of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  1. Herbal hepatotoxicity: a tabular compilation of reported cases.

    Science.gov (United States)

    Teschke, Rolf; Wolff, Albrecht; Frenzel, Christian; Schulze, Johannes; Eickhoff, Axel

    2012-11-01

    Herbal hepatotoxicity is a field that has rapidly grown over the last few years along with the increased use of herbal products worldwide. To summarize the various facets of this disease, we undertook a literature search for herbs, herbal drugs and herbal supplements with reported cases of herbal hepatotoxicity. A selective literature search was performed to identify published case reports, spontaneous case reports, case series and review articles regarding herbal hepatotoxicity. A total of 185 publications were identified and the results compiled. They show 60 different herbs, herbal drugs and herbal supplements with reported potential hepatotoxicity; additional information, including synonyms of individual herbs, botanical names and cross references, is provided. If known, details are presented for specific ingredients and chemicals in herbal products, and for references with authors that can be matched to each herbal product and to its effect on the liver. Based on stringent causality assessment methods and/or positive re-exposure tests, causality was highly probable or probable for Ayurvedic herbs, Chaparral, Chinese herbal mixture, Germander, Greater Celandine, green tea, a few Herbalife products, Jin Bu Huan, Kava, Ma Huang, Mistletoe, Senna, Syo Saiko To and Venencapsan(®). In many other publications, however, causality was not properly evaluated by a liver-specific causality assessment method validated for hepatotoxicity, such as the scale of CIOMS (Council for International Organizations of Medical Sciences). This compilation presents details of herbal hepatotoxicity, thereby assisting the clinical assessments of physicians in the future. © 2012 John Wiley & Sons A/S.

  2. Compiler-Directed Transformation for Higher-Order Stencils

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-20

    As the cost of data movement increasingly dominates performance, developers of finite-volume and finite-difference solutions for partial differential equations (PDEs) are exploring novel higher-order stencils that increase numerical accuracy and computational intensity. This paper describes a new compiler reordering transformation applied to stencil operators that performs partial sums in buffers and reuses the partial sums in computing multiple results. This optimization has multiple effects on stencil performance that are particularly important to higher-order stencils: it exploits data reuse, reduces floating-point operations, and exposes efficient SIMD parallelism to backend compilers. We study the benefit of this optimization in the context of Geometric Multigrid (GMG), a widely used method to solve PDEs, using four different Jacobi smoothers built from 7-, 13-, 27- and 125-point stencils. We quantify performance, speedup, and numerical accuracy, and use the Roofline model to qualify our results. Ultimately, we obtain over 4× speedup on the smoothers themselves and up to a 3× speedup on the multigrid solver. Finally, we demonstrate that high-order multigrid solvers have the potential of reducing total data movement and energy by several orders of magnitude.
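
    The core idea of the partial-sums transformation can be sketched in one dimension (an illustrative reduction of the paper's higher-order, multidimensional case; the 5-point box stencil below is an assumption for demonstration):

```python
# Partial-sums idea on a 1D 5-point box stencil: neighbouring outputs share
# 4 of their 5 inputs, so a running sum lets each new output be formed with
# one add and one subtract instead of four adds.

def stencil_naive(u):
    n = len(u)
    return [sum(u[i - 2:i + 3]) for i in range(2, n - 2)]

def stencil_partial_sums(u):
    n = len(u)
    out = []
    s = sum(u[0:5])               # partial sum for the first window
    out.append(s)
    for i in range(3, n - 2):
        s += u[i + 2] - u[i - 3]  # slide the window: reuse the partial sum
        out.append(s)
    return out

u = [float(i * i % 7) for i in range(12)]
assert stencil_naive(u) == stencil_partial_sums(u)
```

    The paper applies the same reuse idea to far larger multidimensional stencils, where the savings in flops and data movement are correspondingly larger.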

  3. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinina, Elena Arkadievna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Samsa, Michael [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-11-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners, to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also on an extensive review of the available literature on similar past efforts. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and a description of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task, since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to "ensure it has heard from as many points of view as possible." The Canadian NWMO study took four years and ample resources, involving national and regional stakeholder dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, a website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs.

  4. Proceedings of the Oak Ridge National Laboratory/Brookhaven National Laboratory workshop on neutron scattering instrumentation at high-flux reactors

    Energy Technology Data Exchange (ETDEWEB)

    McBee, M.R. (ed.); Axe, J.D.; Hayter, J.B.

    1990-07-01

    For the first three decades following World War II, the US, which pioneered the field of neutron scattering research, enjoyed uncontested leadership in the field. By the mid-1970s, other countries, most notably through the West European consortium at the Institut Laue-Langevin (ILL) in Grenoble, France, had begun funding neutron scattering on a scale unmatched in this country. By the early 1980s, observers charged with defining US scientific priorities began to stress the need for upgrading and expansion of US research reactor facilities. The conceptual design of the ANS facility is now well under way, and line-item funding for more advanced design is being sought for FY 1992. This should lead to a construction request in FY 1994 and start-up in FY 1999, assuming an optimal funding profile. While it may be too early to finalize designs for instruments whose construction is nearly a decade removed, it is imperative that we begin to develop the necessary concepts to ensure state-of-the-art instrumentation for the ANS. It is in this context that this Instrumentation Workshop was planned. The workshop touched upon many ideas that must be considered for the ANS, and as anticipated, several of the discussions and findings were relevant to the planning of the HFBR Upgrade. In addition, this report recognizes numerous opportunities for further breakthroughs in neutron instrumentation in areas such as improved detection schemes (including better tailored scintillation materials and image plates, and increased speed in both detection and data handling), in-beam monitors, transmission white beam polarizers, multilayers and supermirrors, and more. Each individual report has been cataloged separately.

  5. Striving Toward Energy Sustainability: How Plants Will Play a Role in Our Future (453rd Brookhaven Lecture)

    Energy Technology Data Exchange (ETDEWEB)

    Ferrieri, Richard A. (Ph.D., Medical Department)

    2009-10-28

    Edible biomass includes sugars from sugar cane or sugar beets, starches from corn kernels or other grains, and vegetable oils. The fibrous, woody and generally inedible portions of plants contain cellulose, hemicellulose and lignin, three key cell-wall components that make up roughly 70 percent of total plant biomass. At present, starch can readily be degraded from corn grain into glucose sugar, which is then fermented into ethanol, and an acre of corn can yield roughly 400 gallons of ethanol. In tapping into the food supply to solve the energy crisis, however, corn and other crops have become more expensive as food. One solution lies in breaking down other structural tissues of plants, including the stalks and leaves of corn, grasses and trees. However, the complex carbohydrates in cellulose-containing biomass are more difficult to break down and convert to ethanol. So researchers are trying to engineer plants having optimal sugars for maximizing fuel yield. This is a challenge because only a handful of enzymes associated with the more than 1,000 genes responsible for cell-wall synthesis have had their roles in controlling plant metabolism defined. As Richard Ferrieri, Ph.D., a leader of a biofuel research initiative within the Medical Department, will discuss during the 453rd Brookhaven Lecture, he and his colleagues use short-lived radioisotopes, positron emission tomography and biomarkers that they have developed to perform non-invasive, real time imaging of whole plants. He will explain how the resulting metabolic flux analysis gives insight into engineering plant metabolism further.

  6. Memory management and compiler support for rapid recovery from failures in computer systems

    Science.gov (United States)

    Fuchs, W. K.

    1991-01-01

    This paper describes recent developments in the use of memory management and compiler technology to support rapid recovery from failures in computer systems. The techniques described include cache coherence protocols for user transparent checkpointing in multiprocessor systems, compiler-based checkpoint placement, compiler-based code modification for multiple instruction retry, and forward recovery in distributed systems utilizing optimistic execution.
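
    The flavour of checkpoint placement and rollback recovery can be sketched at the application level (a simplified analogy for illustration; the paper's techniques operate in hardware, in the memory system, and in the compiler):

```python
# Minimal application-level sketch of checkpoint-and-rollback recovery:
# state is saved every `interval` iterations, and after a failure the
# computation resumes from the last checkpoint instead of restarting
# from scratch.

def run(n, interval, fail_at=None, checkpoint=None):
    """Sum 1..n, checkpointing (i, acc); optionally crash when i == fail_at."""
    i, acc = checkpoint if checkpoint else (0, 0)
    while i < n:
        if i % interval == 0:
            checkpoint = (i, acc)          # checkpoint placement point
        if fail_at is not None and i == fail_at:
            return ("failed", checkpoint)  # crash: live state lost, checkpoint survives
        i += 1
        acc += i
    return ("done", acc)

status, ckpt = run(100, interval=10, fail_at=57)
assert (status, ckpt) == ("failed", (50, 1275))    # 1275 = 1 + 2 + ... + 50
assert run(100, interval=10, checkpoint=ckpt) == ("done", 5050)  # recovered
```

    Compiler-based checkpoint placement automates the choice of where the `checkpoint = ...` line goes, trading checkpoint overhead against the amount of work lost on rollback.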

  7. Investigating the effects various compilers have on the electromagnetic signature of a cryptographic executable

    CSIR Research Space (South Africa)

    Frieslaar, Ibraheem

    2017-09-01

    Full Text Available This research investigates changes in the electromagnetic (EM) signatures of a cryptographic binary executable based on compile-time parameters to the GNU and clang compilers. The source code is compiled and executed on the Raspberry Pi 2 which...

  8. Brookhaven integrated energy/economy modeling system and its use in conservation policy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Groncki, P.J.; Marcuse, W.

    1979-07-01

    The approach used at BNL to model the impact of the introduction of advanced energy technologies in response to increased energy prices has been to link econometric, process, and input-output models. The econometric model generates growth, employment, productivity, inflation, final demand, and price-determined input-output coefficients for a ten-sector interindustry model. The outputs from the six energy sectors are used to drive a national energy process model which supplies energy prices, fuel mix, and energy capital requirements to the econometric model. The four nonenergy final demands from the econometric model are disaggregated and used with the energy demands from the process model to drive a 110-sector input-output model. The nonenergy coefficients in the input-output model are fixed, but the energy coefficients are variable - reflecting the technologies chosen by the solution of the process model. Coefficients representing advanced-energy-technology production functions have been incorporated in the input-output structure. This approach is briefly described, and three applications of this set of linked models are presented: the first reports the findings of a study of the effects of various levels of conservation on the rate of growth in GNP and other economic indicators; the second describes an application of the linked models to an accelerated solar-technology scenario, focusing on the long-run macroeconomic impacts of increased solar utilization; and the third, currently in progress, examines the robustness of two policies (a supply and a demand policy) and their effect on the penetration of renewable technologies across a range of reference cases designed to capture several of the uncertainties faced by decision makers. 63 references.
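
    The input-output layer of such a linked system rests on the standard Leontief relation x = (I - A)^(-1) d. A toy two-sector sketch (the coefficients and demands below are illustrative assumptions, not BNL's 110-sector data):

```python
# Toy Leontief input-output model: gross output x satisfies x = A x + d,
# i.e. x = (I - A)^(-1) d, where A holds interindustry input coefficients
# and d is final demand. With two sectors the 2x2 system can be solved by
# Cramer's rule.

def leontief_2x2(A, d):
    # Solve (I - A) x = d for a 2-sector economy.
    m = [[1 - A[0][0], -A[0][1]],
         [-A[1][0], 1 - A[1][1]]]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    x0 = (d[0] * m[1][1] - m[0][1] * d[1]) / det
    x1 = (m[0][0] * d[1] - d[0] * m[1][0]) / det
    return [x0, x1]

A = [[0.10, 0.30],   # sector 1's inputs per unit of each sector's output
     [0.20, 0.25]]   # sector 2's inputs per unit of each sector's output
d = [50.0, 200.0]    # final demand

x = leontief_2x2(A, d)
# Each sector's gross output exceeds its final demand, since it must also
# cover interindustry use.
assert x[0] > d[0] and x[1] > d[1]
```

    In the linked-model approach the energy coefficients of A are not fixed but are supplied by the process model's technology choices.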

  9. Compiling Multibeam Sonar data for the U.S. Pacific West Coast Extended Continental Shelf Project

    Science.gov (United States)

    Lim, E.; Gardner, J. V.; Henderson, J. F.

    2011-12-01

    The United States Extended Continental Shelf (ECS) Project is a multi-agency collaboration whose goals are to determine and define a potential extension of the U.S. continental shelf beyond 200 nautical miles (nmi). Under international law as reflected in the 1982 United Nations Convention on the Law of the Sea (UNCLOS), every coastal state is entitled to a continental shelf out to 200 nmi (the Exclusive Economic Zone) from its coastal baseline, or out to a maritime boundary with another coastal country. The extended continental shelf (ECS) is the area that lies beyond this 200-nmi limit, where a country could gain sovereign rights to the resources of the seafloor and sub-seafloor. In 2007, the U.S. ECS Task Force designated NOAA's National Geophysical Data Center (NGDC) as the Data Management lead for the U.S. ECS Project and as the data steward and archival location for all data related to this project. The process to determine the outer limits of the ECS requires the collection and analysis of data that describe the depth, shape, and geophysical characteristics of the seafloor and sub-seafloor, as well as the thickness of the underlying sediments. The specific types of data that need to be collected include bathymetric data, seismic profiles, magnetic and gravity data, and other geophysical data. NGDC maintains several global geophysical databases, including bathymetric, seismic and geological data, all critical for supporting ECS analysis. Multibeam bathymetry is a primary dataset used for ECS analysis. Since 2003, the U.S. has collected more than 1.65 million square kilometers of multibeam bathymetric data from 18 cruises. One area where new data have been collected, and where the U.S. may have an extended continental shelf, is off the U.S. Pacific West Coast. New and old multibeam bathymetry archived at and delivered by NGDC were individually gridded by survey for an area between 30 and 48 degrees north latitude and between -140 and -115 degrees longitude at a resolution of 210

  10. COMPILATION OF LABORATORY SCALE ALUMINUM WASH AND LEACH REPORT RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    HARRINGTON SJ

    2011-01-06

    This report compiles and analyzes all known wash and caustic leach laboratory studies. As further data are produced, this report will be updated. Included are aluminum mineralogical analysis results as well as a summation of the wash and leach procedures and results. Of the 177 underground storage tanks at Hanford, information was available for only five individual double-shell tanks, forty-one individual single-shell tanks (i.e., thirty-nine 100-series and two 200-series tanks), and twelve grouped tank wastes. Seven of the individual single-shell tank studies provided data for the percent of aluminum removal as a function of time for various caustic concentrations and leaching temperatures. It was determined that in most cases increased leaching temperature, caustic concentration, and leaching time lead to increased dissolution of leachable aluminum solids.

  11. Methodological challenges involved in compiling the Nahua pharmacopeia.

    Science.gov (United States)

    De Vos, Paula

    2017-06-01

    Recent work in the history of science has questioned the Eurocentric nature of the field and sought to include a more global approach that would serve to displace center-periphery models in favor of approaches that take seriously local knowledge production. Historians of Iberian colonial science have taken up this approach, which involves reliance on indigenous knowledge traditions of the Americas. These traditions present a number of challenges to modern researchers, including availability and reliability of source material, issues of translation and identification, and lack of systematization. This essay explores the challenges that emerged in the author's attempt to compile a pre-contact Nahua pharmacopeia, the reasons for these challenges, and the ways they may - or may not - be overcome.

  12. Rubus: A compiler for seamless and extensible parallelism.

    Directory of Open Access Journals (Sweden)

    Muhammad Adnan

    Full Text Available Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special purpose processing unit called the Graphic Processing Unit (GPU), originally designed for 2D/3D games, is now available for general purpose use in computers and mobile devices. However, the traditional programming languages, which were designed to work with machines having single-core CPUs, cannot utilize the parallelism available on multi-core processors efficiently. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages like CUDA and OpenCL have been introduced, which can be used to write code with parallelism. The main shortcoming of these languages is that the programmer needs to specify all the complex details manually in order to parallelize the code across multiple cores. Therefore, the code written in these languages is difficult to understand, debug and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of code in CUDA or OpenCL, which can consume significant time and resources. Thus, the amount of parallelism achieved is proportional to the skills of the programmer and the time spent in code optimizations. This paper proposes a new open source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details. It analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedup and better utilization of the underlying hardware without a programmer's expertise in parallel programming. For five different benchmarks, on average a speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores.
Whereas, for a matrix multiplication benchmark the average execution speedup of 84
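
    For contrast with such automatic parallelization, the manual rewrite a programmer would otherwise perform can be sketched in Python (an analogy only, not Rubus; a thread pool shows the structure, though CPU-bound Python would need processes or a GPU for real speedup, which is exactly the gap such compilers target):

```python
# Manual parallelism sketch: a sequential loop over matrix rows rewritten
# as a parallel map over a worker pool. Each output row is independent
# work, so it is safe to compute rows concurrently.

from concurrent.futures import ThreadPoolExecutor

def row_times_matrix(row, B):
    # one output row of the product A @ B
    return [sum(a * b for a, b in zip(row, col)) for col in zip(*B)]

def matmul_parallel(A, B, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda row: row_times_matrix(row, B), A))

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul_parallel(A, B))  # [[19, 22], [43, 50]]
```

    Rubus's contribution, per the abstract, is to derive this kind of decomposition automatically from the sequential program.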

  13. Compilation and evaluation of a Paso del Norte emission inventory

    Energy Technology Data Exchange (ETDEWEB)

    Funk, T.H.; Chinkin, L.R.; Roberts, P.T. [Sonoma Technology, Inc., 1360 Redwood Way, Suite C, 94954-1169 Petaluma, CA (United States); Saeger, M.; Mulligan, S. [Pacific Environmental Services, 5001 S. Miami Blvd., Suite 300, 27709 Research Triangle Park, NC (United States); Paramo Figueroa, V.H. [Instituto Nacional de Ecologia, Avenue Revolucion 1425, Nivel 10, Col. Tlacopac San Angel, Delegacion Alvaro Obregon, C.P., 01040, D.F. Mexico (Mexico); Yarbrough, J. [US Environmental Protection Agency - Region 6, 1445 Ross Avenue, Suite 1200, 75202-2733 Dallas, TX (United States)

    2001-08-10

    Emission inventories of ozone precursors are routinely used as input to comprehensive photochemical air quality models. Photochemical model performance and the development of effective control strategies rely on the accuracy and representativeness of an underlying emission inventory. This paper describes the tasks undertaken to compile and evaluate an ozone precursor emission inventory for the El Paso/Ciudad Juarez/Southern Dona Ana region. Point, area and mobile source emission data were obtained from local government agencies and were spatially and temporally allocated to a gridded domain using region-specific demographic and land-cover information. The inventory was then processed using the US Environmental Protection Agency (EPA) recommended Emissions Preprocessor System 2.0 (UAM-EPS 2.0), which generates emissions files compatible with the Urban Airshed Model (UAM). A top-down evaluation of the emission inventory was performed to examine how well the inventory represented ambient pollutant compositions. The top-down evaluation methodology employed in this study compares emission-inventory non-methane hydrocarbon (NMHC)/nitrogen oxide (NO{sub x}) and carbon monoxide (CO)/NO{sub x} ratios to corresponding ambient ratios. Detailed NMHC species comparisons were made in order to investigate the relative composition of individual hydrocarbon species in the emission inventory and in the ambient data. The emission inventory compiled during this effort has since been used to model ozone in the Paso del Norte airshed (Emery et al., CAMx modeling of ozone and carbon monoxide in the Paso del Norte airshed. In: Proc of Ninety-Third Annual Meeting of Air and Waste Management Association, 18-22 June 2000, Air and Waste Management Association, Pittsburgh, PA, 2000)
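
    The arithmetic behind such a top-down check is simple and worth making explicit (the ratio values below are hypothetical, not the Paso del Norte inventory):

```python
# Top-down ratio check: compare emission-inventory NMHC/NOx and CO/NOx
# ratios against the corresponding ambient ratios; a large relative
# mismatch flags a likely bias in some inventoried source category.

def ratio_check(inventory, ambient, tolerance=0.5):
    """Relative inventory-vs-ambient difference per ratio, with a flag."""
    report = {}
    for name in ("NMHC/NOx", "CO/NOx"):
        rel = (inventory[name] - ambient[name]) / ambient[name]
        report[name] = (round(rel, 2), abs(rel) > tolerance)
    return report

inventory = {"NMHC/NOx": 1.5, "CO/NOx": 6.0}  # hypothetical inventory ratios
ambient = {"NMHC/NOx": 3.6, "CO/NOx": 7.5}    # hypothetical ambient ratios
print(ratio_check(inventory, ambient))
```

    An inventory NMHC/NOx ratio well below the ambient one, as in this made-up case, would suggest under-reported hydrocarbon emissions.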

  14. Rubus: A compiler for seamless and extensible parallelism.

    Science.gov (United States)

    Adnan, Muhammad; Aslam, Faisal; Nawaz, Zubair; Sarwar, Syed Mansoor

    2017-01-01

    Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special purpose processing unit called the Graphic Processing Unit (GPU), originally designed for 2D/3D games, is now available for general purpose use in computers and mobile devices. However, the traditional programming languages, which were designed to work with machines having single-core CPUs, cannot utilize the parallelism available on multi-core processors efficiently. Therefore, to exploit the extraordinary processing power of multi-core processors, researchers are working on new tools and techniques to facilitate parallel programming. To this end, languages like CUDA and OpenCL have been introduced, which can be used to write code with parallelism. The main shortcoming of these languages is that the programmer needs to specify all the complex details manually in order to parallelize the code across multiple cores. Therefore, the code written in these languages is difficult to understand, debug and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of code in CUDA or OpenCL, which can consume significant time and resources. Thus, the amount of parallelism achieved is proportional to the skills of the programmer and the time spent in code optimizations. This paper proposes a new open source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details. It analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedup and better utilization of the underlying hardware without a programmer's expertise in parallel programming. For five different benchmarks, on average a speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores. Whereas, for a matrix multiplication benchmark the average execution speedup of 84 times has been

  15. Construction experiences from underground works at Oskarshamn. Compilation report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders (Vattenfall Power Consultant AB, Stockholm (SE)); Christiansson, Rolf (Swedish Nuclear Fuel and Waste Management Co., Stockholm (SE))

    2007-12-15

    The main objective of this report is to compile experiences from the underground works carried out at Oskarshamn, primarily construction experiences from the tunnelling of the cooling water tunnels of Oskarshamn nuclear power units 1, 2 and 3, from the underground excavations of Clab 1 and 2 (Central Interim Storage Facility for Spent Nuclear Fuel), and from the Aespoe Hard Rock Laboratory. In addition, an account is given of the operational experience of Clab 1 and 2 and of the Aespoe HRL, primarily regarding scaling and rock support solutions. Being a compilation report, it is in substance based on earlier published material, as presented in the list of references. Approximately 8,000 m of tunnels, including three major rock caverns with a total volume of about 550,000 m3, have been excavated. The excavation works of the various tunnels and rock caverns were carried out during the period 1966-2000. In addition, minor excavation works were carried out at the Aespoe HRL in 2003. The depth of the underground structures varies from near surface down to 450 m. As an overall conclusion it may be said that the rock mass conditions in the area are well suited for underground construction. This conclusion is supported by the experiences from the rock excavation works in the Simpevarp and Aespoe area. These works have shown that no major problems occurred during the excavation works; nor have any stability or other rock engineering problems of significance been identified after the commissioning of the Oskarshamn nuclear power units O1, O2 and O3, BFA, Clab 1 and 2, and the Aespoe Hard Rock Laboratory. The underground structures of these facilities were built according to plan and have since been operated as planned. Thus, the quality of the rock mass within the construction area is such that it lends itself to excavation of large rock caverns with a minimum of rock support.

  16. abc the aspectBench compiler for aspectJ a workbench for aspect-oriented programming language and compilers research

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    Aspect-oriented programming (AOP) is gaining popularity as a new way of modularising cross-cutting concerns. The aspectbench compiler (abc) is a new workbench for AOP research which provides an extensible research framework for both new language features and new compiler optimisations. This poste...
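
    The cross-cutting-concern idea behind AOP can be illustrated outside AspectJ; in Python, a decorator can play the role of before/after advice (an analogy for illustration only, not the abc compiler's weaving):

```python
# A decorator as a stand-in for an aspect: a logging concern is woven into
# functions without touching their bodies, analogous to before/after advice
# applied at join points.

import functools

calls = []  # the "advice" records every intercepted call

def logged(fn):                     # the aspect: before/after advice
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        calls.append(f"before {fn.__name__}")
        result = fn(*args, **kwargs)
        calls.append(f"after {fn.__name__}")
        return result
    return wrapper

@logged                             # "weaving" happens at definition time
def transfer(amount):
    return amount * 2

assert transfer(21) == 42
assert calls == ["before transfer", "after transfer"]
```

    A real AOP compiler such as abc goes much further: pointcut languages select join points declaratively, and weaving is performed (and optimised) at compile time across the whole program.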

  17. Implementing a New Register Allocator for the Server Compiler in the Java HotSpot Virtual Machine

    OpenAIRE

    Adlertz, Niclas

    2015-01-01

    The Java HotSpot Virtual Machine currently uses two Just In Time compilers to increase the performance of Java code in execution. The client and server compilers, as they are named, serve slightly different purposes. The client compiler produces code fast, while the server compiler produces code of greater quality. Both are important, because in a runtime environment there is a tradeoff between compiling and executing code. However, maintaining two separate compilers leads to increased mainte...
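
    One classic allocation strategy for fast JIT compilers, linear scan, can be sketched briefly (a simplified illustration, not the HotSpot implementation; as a simplification it spills the current interval rather than choosing the furthest-ending one):

```python
# Minimal linear-scan register allocation sketch: live intervals are
# visited in order of start point, expired intervals release their
# registers, and an interval that finds no free register is spilled.

def linear_scan(intervals, num_regs):
    """intervals: {name: (start, end)} -> {name: 'rN' or 'spill'}."""
    free = [f"r{i}" for i in range(num_regs)]
    active = []                                   # (end, name), kept sorted
    assignment = {}
    for name, (start, end) in sorted(intervals.items(), key=lambda kv: kv[1][0]):
        while active and active[0][0] < start:    # expire finished intervals
            _, old = active.pop(0)
            free.append(assignment[old])
        if free:
            assignment[name] = free.pop(0)
            active.append((end, name))
            active.sort()
        else:
            assignment[name] = "spill"            # simplified: spill current
    return assignment

intervals = {"a": (0, 4), "b": (1, 3), "c": (2, 6), "d": (5, 7)}
print(linear_scan(intervals, num_regs=2))
```

    The tradeoff the thesis abstract describes maps onto choices like this: a scan-based allocator compiles fast, while a graph-coloring allocator typically spends more compile time for better code.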

  18. Religious Disbelief and Intelligence: The Failure of a Contemporary Attempt to Correlate National Mean IQs and Rates of Atheism

    National Research Council Canada - National Science Library

    Hale, Frederick

    2011-01-01

    ... the psychometric intelligence of their populations. They relied heavily on international statistics of atheism compiled by Phil Zuckerman and correlated these with data which Lynn had compiled about national mean IQ levels...

  19. COMPILATION AND ANALYSES OF EMISSIONS INVENTORIES FOR THE NOAA ATMOSPHERIC CHEMISTRY PROJECT. PROGRESS REPORT, AUGUST 1997.

    Energy Technology Data Exchange (ETDEWEB)

    BENKOVITZ,C.M.

    1997-09-01

    Global inventories of anthropogenic emissions of oxides of nitrogen (NO{sub x}) for circa 1985 and 1990 and Non-Methane Volatile Organic Compounds (NMVOCs) for circa 1990 have been compiled by this project. Work on the inventories has been carried out under the umbrella of the Global Emissions Inventory Activity (GEIA) of the International Global Atmospheric Chemistry (IGAC) Program. The 1985 NO{sub x} inventory was compiled using default data sets of global emissions that were refined via the use of more detailed regional data sets; this inventory is being distributed to the scientific community at large as the GEIA Version 1A inventory. Global emissions of NO{sub x} for 1985 are estimated to be 21 Tg N y{sup -1}, with approximately 84% originating in the Northern Hemisphere. The 1990 inventories of NO{sub x} and NMVOCs were compiled using unified methodologies and data sets in collaboration with the Netherlands National Institute of Public Health and Environmental Protection (Rijksinstituut Voor Volksgezondheid en Milieuhygiene, RIVM) and the Division of Technology for Society of the Netherlands Organization for Applied Scientific Research, (IMW-TNO); these emissions will be used as the default estimates to be updated with more accurate regional data. The NMVOC inventory was gridded and speciated into 23 chemical categories. The resulting global emissions for 1990 are 31 Tg N yr{sup -1} for NO{sub x} and 173 Gg NMVOC yr{sup -1}. Emissions of NO{sub x} are highest in the populated and industrialized areas of eastern North America and across Europe, and in biomass burning areas of South America, Africa, and Asia. Emissions of NMVOCs are highest in biomass burning areas of South America, Africa, and Asia. The 1990 NO{sub x} emissions were gridded to 1{sup o} resolution using surrogate data, and were given seasonal, two-vertical-level resolution and speciated into NO and NO{sub 2} based on proportions derived from the 1985 GEIA Version 1B inventory. Global NMVOC
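
    The surrogate-based gridding step mentioned above can be sketched as a proportional allocation (the cells and values below are made up for illustration):

```python
# Spatial allocation with surrogate data, a standard inventory technique:
# a national emission total is distributed over grid cells in proportion
# to a surrogate quantity such as population.

def allocate(total, surrogate):
    """Split `total` across cells proportionally to surrogate weights."""
    weight_sum = sum(surrogate.values())
    return {cell: total * w / weight_sum for cell, w in surrogate.items()}

population = {(40, -74): 8.0, (41, -74): 1.5, (40, -75): 0.5}  # millions
gridded = allocate(100.0, population)              # e.g. 100 Gg of NOx
assert abs(sum(gridded.values()) - 100.0) < 1e-9   # mass is conserved
assert gridded[(40, -74)] == 80.0
```

    The same pattern extends to temporal and vertical allocation: the total stays fixed while surrogate weights decide where (or when) it lands.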

  20. Compilation and analyses of emissions inventories for NOAA`s atmospheric chemistry project. Progress report, August 1997

    Energy Technology Data Exchange (ETDEWEB)

    Benkovitz, C.M.; Mubaraki, M.A.

    1997-09-01

    Global inventories of anthropogenic emissions of oxides of nitrogen (NO{sub x}) for circa 1985 and 1990 and Non-Methane Volatile Organic Compounds (NMVOCs) for circa 1990 have been compiled by this project. Work on the inventories has been carried out under the umbrella of the Global Emissions Inventory Activity (GEIA) of the International Global Atmospheric Chemistry (IGAC) Program. The 1985 NO{sub x} inventory was compiled using default data sets of global emissions that were refined via the use of more detailed regional data sets; this inventory is being distributed to the scientific community at large as the GEIA Version 1A inventory. Global emissions of NO{sub x} for 1985 are estimated to be 21 Tg N y{sup -1}, with approximately 84% originating in the Northern Hemisphere. The 1990 inventories of NO{sub x} and NMVOCs were compiled using unified methodologies and data sets in collaboration with the Netherlands National Institute of Public Health and Environmental Protection (Rijksinstituut Voor Volksgezondheid en Milieuhygiene, RIVM) and the Division of Technology for Society of the Netherlands Organization for Applied Scientific Research, (IMW-TNO); these emissions will be used as the default estimates to be updated with more accurate regional data. The NMVOC inventory was gridded and speciated into 23 chemical categories.

  1. Brookhaven highlights 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    Established in 1947 on Long Island, New York, on the site of the former army Camp Upton, BNL is a multidisciplinary laboratory that carries out basic and applied research in the physical, biomedical and environmental sciences and in selected energy technologies. The Laboratory is managed by Associated Universities, Inc., under contract to the US Department of Energy. BNL`s annual budget is about $400 million, and the Laboratory`s facilities are valued at a replacement cost in excess of $2.8 billion. Employees number around 3,300, and over 4,000 guests, collaborators and students come each year to use the Laboratory`s facilities and work with the staff. Scientific and technical achievements at BNL have made their way into daily life in areas as varied as health care, construction materials and video games. The backbone of these developments is fundamental research, which is and always will be an investment in the future.

  2. Current schemes for National Synchrotron Light Source UV beamlines

    Energy Technology Data Exchange (ETDEWEB)

    Williams, G.P.; Howells, M.R.; McKinney, W.R.

    1979-01-01

    We describe in some detail four beamlines proposed for the National Synchrotron Light Source UV ring at Brookhaven National Laboratory. Three grazing-incidence instruments, one of the plane-grating Miyake type and two with toroidal gratings at grazing angles of 2.5° and 15°, are described. Two normal-incidence instruments, one of which uses the source as entrance slit and accepts 75 milliradians horizontally, are also discussed. In each case we have estimated the output fluxes expected from such beamlines.

  3. OMPC: an Open-Source MATLAB®-to-Python Compiler

    Science.gov (United States)

    Jurica, Peter; van Leeuwen, Cees

    2008-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
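
    The syntax-adaptation idea above can be illustrated with a deliberately tiny sketch: a few regular-expression rewrites that map simple MATLAB expressions onto numpy equivalents. This is purely illustrative and not OMPC's actual mechanism; OMPC performs full syntax adaptation and emulation rather than line-by-line substitution, and a real translator needs a proper parser.

```python
import re

# Toy MATLAB-to-Python rewrites (illustrative only; regexes cannot handle
# strings, nesting, or context the way a real translator must).
RULES = [
    (re.compile(r"zeros\((\d+),\s*(\d+)\)"), r"np.zeros((\1, \2))"),
    (re.compile(r"(\w+)\s*\.\*\s*(\w+)"), r"\1 * \2"),  # elementwise multiply
    (re.compile(r"(\w+)'"), r"\1.T"),                   # transpose
    (re.compile(r"%"), r"#"),                           # comment marker
]

def translate(matlab_line: str) -> str:
    """Rewrite one simple MATLAB statement into numpy-flavoured Python."""
    out = matlab_line
    for pattern, repl in RULES:
        out = pattern.sub(repl, out)
    return out

if __name__ == "__main__":
    for src in ["A = zeros(3, 4)", "C = A .* B", "D = A'", "x = 1  % init"]:
        print(f"{src:20s} ->  {translate(src)}")
```

    The translated code depends only on numpy, mirroring the paper's point that imported modules run independently of MATLAB once mapped onto Python's numerical libraries.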

  4. Resource efficient gadgets for compiling adiabatic quantum optimization problems

    Science.gov (United States)

    Babbush, Ryan; O'Gorman, Bryan; Aspuru-Guzik, Alán

    2013-11-01

    We develop a resource efficient method by which the ground state of an arbitrary k-local optimization Hamiltonian can be encoded as the ground state of a (k-1)-local optimization Hamiltonian. This result is important because adiabatic quantum algorithms are often most easily formulated using many-body interactions, but experimentally available interactions are generally 2-body. In this context, the efficiency of a reduction gadget is measured by the number of ancilla qubits required as well as the amount of control precision needed to implement the resulting Hamiltonian. First, we optimize methods of applying these gadgets to obtain 2-local Hamiltonians using the least possible number of ancilla qubits. Next, we show a novel reduction gadget which minimizes control precision and a heuristic which uses this gadget to compile 3-local problems with a significant reduction in control precision. Finally, we present numerics which indicate a substantial decrease in the resources required to implement randomly generated, 3-body optimization Hamiltonians when compared to other methods in the literature.
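
    The flavour of such a reduction can be shown with the classic Rosenberg substitution, a minimal textbook gadget rather than the paper's optimized construction: a 3-local term x1*x2*x3 over binary variables becomes 2-local by introducing an ancilla a and a penalty that is zero exactly when a = x1*x2.

```python
from itertools import product

def penalty(x1, x2, a):
    # Rosenberg penalty: 0 iff a == x1 * x2, otherwise >= 1 (binary variables).
    return x1 * x2 - 2 * a * (x1 + x2) + 3 * a

def reduced_energy(x1, x2, x3, a, weight=2):
    # 2-local stand-in for the 3-local term x1 * x2 * x3: the product x1 * x2
    # is replaced by the ancilla a, and the penalty enforces consistency.
    return a * x3 + weight * penalty(x1, x2, a)

def matches_original():
    # Brute-force check: minimizing over the ancilla recovers x1 * x2 * x3
    # for every assignment of the original variables.
    for x1, x2, x3 in product((0, 1), repeat=3):
        best = min(reduced_energy(x1, x2, x3, a) for a in (0, 1))
        if best != x1 * x2 * x3:
            return False
    return True
```

    The penalty weight (2 here) is exactly the kind of "control precision" cost the abstract refers to: gadgets that need smaller weights are easier to realize in hardware.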

  5. OMPC: an Open-Source MATLAB-to-Python Compiler.

    Science.gov (United States)

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.

  6. The Genome Reverse Compiler: an explorative annotation tool

    Directory of Open Access Journals (Sweden)

    Warren Andrew S

    2009-01-01

    Background: As sequencing costs have decreased, whole genome sequencing has become a viable and integral part of biological laboratory research. However, the tools with which genes can be found and functionally characterized have not been readily adapted to be part of the everyday biological sciences toolkit. Most annotation pipelines remain as a service provided by large institutions or come as an unwieldy conglomerate of independent components, each requiring its own setup and maintenance. Results: To address this issue we have created the Genome Reverse Compiler (GRC), an easy-to-use, open-source, automated annotation tool. The GRC is independent of third-party software installs and only requires a Linux operating system. This stands in contrast to most annotation packages, which typically require installation of relational databases, sequence similarity software, and a number of other programming language modules. We provide details on the methodology used by GRC and evaluate its performance on several groups of prokaryotes using GRC's built-in comparison module. Conclusion: Traditionally, to perform whole genome annotation a user would either set up a pipeline or take advantage of an online service. With GRC the user need only provide the genome he or she wants to annotate and the function resource files to use. The result is high usability and a very minimal learning curve for the intended audience of life science researchers and bioinformaticians. We believe that the GRC fills a valuable niche in allowing users to perform explorative, whole-genome annotation.

  7. ROSE: Compiler Support for Object-Oriented Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D.

    1999-11-17

    ROSE is a preprocessor generation tool for the support of compile-time performance optimizations in Overture. The Overture framework is an object-oriented environment for solving partial differential equations in two and three space dimensions. It is a collection of C++ libraries that enables the use of finite difference and finite volume methods at a level that hides the details of the associated data structures. Overture can be used to solve problems in complicated, moving geometries using the method of overlapping grids. It has support for grid generation, difference operators, boundary conditions, database access and graphics. In this paper we briefly present Overture and discuss our approach toward performance within Overture and the A++/P++ array class abstractions upon which Overture depends; this work represents some of the newest work in Overture. The results we present show that the abstractions represented within Overture and the A++/P++ array class library can be used to obtain application codes with performance equivalent to that of optimized C and Fortran 77. ROSE, the preprocessor generation tool, is general in its application to any object-oriented framework or application and is not specific to Overture.

  8. Compiler Optimization to Improve Data Locality for Processor Multithreading

    Directory of Open Access Journals (Sweden)

    Balaram Sinharoy

    1999-01-01

    Over the last decade processor speed has increased dramatically, whereas the speed of the memory subsystem has improved at a modest rate. Due to the increase in cache miss latency (in terms of processor cycles), processors stall on cache misses for a significant portion of their execution time. Multithreaded processors have been proposed in the literature to reduce the processor stall time due to cache misses. Although multithreading improves processor utilization, it may also increase cache miss rates, because in a multithreaded processor multiple threads share the same cache, which effectively reduces the cache size available to each individual thread. Increased processor utilization and the increase in the cache miss rate demand higher memory bandwidth. A novel compiler optimization method is presented in this paper that improves data locality for each of the threads and enhances data sharing among the threads. The method is based on loop transformation theory and optimizes both spatial and temporal data locality. The created threads exhibit a high level of intra-thread and inter-thread data locality which effectively reduces both the data cache miss rates and the total execution time of numerically intensive computations running on a multithreaded processor.
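
    A standard example of the kind of loop transformation involved (a generic illustration of tiling, not the paper's specific algorithm) is a blocked matrix transpose: by iterating over small tiles, reads and writes stay within a cache-sized working set.

```python
def transpose_naive(a):
    # Row-major traversal: writes to the output walk down columns, which on
    # real hardware touches a new cache line on almost every store.
    n = len(a)
    return [[a[i][j] for i in range(n)] for j in range(n)]

def transpose_tiled(a, block=4):
    # Loop tiling: process the matrix in block x block tiles so that both the
    # reads and the writes of a tile fit in cache before moving on.
    n = len(a)
    out = [[0] * n for _ in range(n)]
    for ii in range(0, n, block):
        for jj in range(0, n, block):
            for i in range(ii, min(ii + block, n)):
                for j in range(jj, min(jj + block, n)):
                    out[j][i] = a[i][j]
    return out
```

    Both functions compute the same result; the payoff of the tiled version is locality, which in the multithreaded setting of the paper also reduces the cache footprint each thread imposes on its siblings.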

  9. Research at GANIL. A compilation 1996-1997

    Energy Technology Data Exchange (ETDEWEB)

    Balanzat, E.; Bex, M.; Galin, J.; Geswend, S. [eds.

    1998-12-01

    The present compilation gives an overview of experimental results obtained with the GANIL facility during the period 1996-1997. It includes nuclear physics activities as well as interdisciplinary research. The scientific domain presented here extends well beyond traditional nuclear physics and includes atomic physics, condensed matter physics, nuclear astrophysics, radiation chemistry, radiobiology as well as applied physics. In the nuclear physics field, many new results have been obtained concerning nuclear structure as well as the dynamics of nuclear collisions and the nuclear disassembly of complex systems. Results presented deal in particular with the problem of energy equilibration, timescales and the origin of multifragmentation. Nuclear structure studies using both stable and radioactive beams deal with halo systems, the study of shell closures far from stability, the existence of nuclear molecules, as well as measurements of fundamental data such as half-lives, nuclear masses, nuclear radii, and quadrupole and magnetic moments. In addition to traditional fields of atomic and solid state physics, new themes such as radiation chemistry and radiobiology are progressively being tackled. (K.A.)

  10. A robust method for compiling trawl survey data used in the assessment of central Baltic cod ( Gadus morhua L.)

    DEFF Research Database (Denmark)

    Sparholt, H.; Tomkiewicz, Jonna

    2000-01-01

    Annual stratified bottom trawl surveys have been used since 1982 to estimate cod abundance in the central Baltic Sea. Catch at age from national research vessels is included in a common database, but data compilation is hampered by large differences in fishing power between vessels, mainly due...... of adult cod of all ages aggregated to minimise the effects of age determination discrepancies. The established index of spawning stock biomass did not correlate significantly better with VPA estimates than simpler indices ignoring depth stratification and fishing power. However, the method is preferable...... because it considers factors known to influence survey trawl catches....

  11. Objective Caml on .NET: The OCamIL Compiler and Toplevel

    OpenAIRE

    Montelatici, Raphaël; Chailloux, Emmanuel; Pagano, Bruno

    2005-01-01

    We present the OCamIL compiler for Objective Caml that targets .NET. Our experiment consists in adding a new back-end to the INRIA Objective Caml compiler that generates MSIL bytecode. Among all the advantages of code reuse, ensuring compatibility while keeping all the expressiveness of the original language is particularly interesting. This allowed us to bootstrap the OCamIL compiler as a .NET component and build an interactive loop (toplevel) which may be embedded w...

  12. 50 CFR 86.123 - Comprehensive National Assessment schedule.

    Science.gov (United States)

    2010-10-01

    ... THE INTERIOR (CONTINUED) FINANCIAL ASSISTANCE-WILDLIFE SPORT FISH RESTORATION PROGRAM BOATING... National Assessment schedule. Using the results from the State surveys, the Service will compile the results and produce the Comprehensive National Assessment by September 30, 2003. ...

  13. Draft Water Resource Inventory (WRI) Kanuti National Wildlife Refuge

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This Water Resource Inventory (WRI) Report for the Kanuti National Wildlife Refuge (NWR) compiles the results of data mining from national and regional sources for...

  14. Compilation Techniques Specific for a Hardware Cryptography-Embedded Multimedia Mobile Processor

    Directory of Open Access Journals (Sweden)

    Masa-aki FUKASE

    2007-12-01

    The development of single chip VLSI processors is the key technology of ever growing pervasive computing to answer overall demands for usability, mobility, speed, security, etc. We have so far developed a hardware cryptography-embedded multimedia mobile processor architecture, HCgorilla. Since HCgorilla integrates a wide range of techniques from architectures to applications and languages, a one-sided design approach is not always useful. HCgorilla needs a more complicated strategy, that is, hardware/software (H/S) codesign. Thus, we exploit the software support of HCgorilla, composed of a Java interface and parallelizing compilers. They are assumed to be installed in servers in order to reduce the load and increase the performance of HCgorilla-embedded clients. Since compilers are the essence of software's responsibility, we focus in this article on our recent results concerning the design, specifications, and prototyping of parallelizing compilers for HCgorilla. The parallelizing compilers are composed of a multicore compiler and a LIW compiler. They are specified to abstract parallelism from executable serial codes or the Java interface output, and to output codes executable in parallel by HCgorilla. The prototype compilers are written in Java. The evaluation using an arithmetic test program shows the reasonableness of the prototype compilers compared with hand compilation.

  15. Comment on "Atomic mass compilation 2012" by B. Pfeiffer, K. Venkataramaniah, U. Czok, C. Scheidenberger

    Science.gov (United States)

    Audi, G.; Blaum, K.; Block, M.; Bollen, G.; Goriely, S.; Hardy, J. C.; Herfurth, F.; Kondev, F. G.; Kluge, H.-J.; Lunney, D.; Pearson, J. M.; Savard, G.; Sharma, K. S.; Wang, M.; Zhang, Y. H.

    2015-05-01

    In order to avoid errors and confusion that may arise from the recent publication of a paper entitled "Atomic Mass Compilation 2012", we explain the important difference between a compilation and an evaluation; the former is a necessary but insufficient condition for the latter. The simple list of averaged mass values offered by the "Atomic Mass Compilation" uses none of the numerous links and correlations present in the large body of input data that are carefully maintained within the "Atomic Mass Evaluation". As such, the mere compilation can only produce results of inferior accuracy. Illustrative examples are given.
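
    The distinction can be made concrete with a toy weighted least-squares adjustment over invented data: two absolute mass measurements plus one very precise mass-difference measurement linking them. An evaluation solves the linked system, whereas a mere compilation averages values nuclide by nuclide and discards the link.

```python
import numpy as np

# Invented toy data: m1 = 100 +/- 1, m2 - m1 = 5 +/- 0.01, m2 = 104 +/- 1.
A = np.array([[1.0, 0.0],    # measures m1
              [-1.0, 1.0],   # measures m2 - m1 (the precise link)
              [0.0, 1.0]])   # measures m2
y = np.array([100.0, 5.0, 104.0])
sigma = np.array([1.0, 0.01, 1.0])

# Evaluation-style adjustment: weighted least squares over the whole system.
W = np.diag(1.0 / sigma**2)
m = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)

print(m)            # ~ [99.5, 104.5]
print(m[1] - m[0])  # ~ 5.0, consistent with the precise link
# Per-nuclide averaging ("compilation") would give m1 = 100 and m2 = 104,
# a difference of 4 that contradicts the 5.00(1) link.
```

    The adjusted masses shift to honour the tight mass difference, which is precisely the information a simple list of averaged values throws away.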

  16. Global compilation of coastline change at river mouths

    Science.gov (United States)

    Aadland, Tore; Helland-Hansen, William

    2016-04-01

    We are using Google Earth Engine to analyze Landsat images to create a global compilation of coastline change at river mouths in order to develop scaling relationships between catchment properties and shoreline behaviour. Our main motivation for doing this is to better understand the rates at which shallowing-upward deltaic successions are formed. We are also interested in gaining insight into the impact of climate change and human activity on modern shorelines. Google Earth Engine is a platform that offers simple selection of relevant data from an extensive catalog of geospatial data and the tools to analyse it efficiently. We have used Google Earth Engine to select and analyze temporally and geographically bounded sets of Landsat images covering modern deltas included in the Milliman and Farnsworth 2010 database. The part of the shoreline sampled for each delta has been manually defined. The areas depicted in these image sets have been classified as land or water by thresholding a calibrated Modified Normalized Difference Water Index. By representing land and water as 1.0 and 0.0 respectively and averaging image sets of sufficient size, we have generated rasters quantifying the probability of an area being classified as land. The calculated probabilities reflect variation in the shoreline position; in particular, this minimizes the impact of short-term variations produced by tides. The net change in the land area of deltas can be estimated by comparing how the probability changes between image sets spanning different time periods. We have estimated the land area change that occurred from 2000 to 2014 at more than 130 deltas with catchment areas ranging from 470 to 6,300,000 km². Log-log plots of the land area change of these deltas against their respective catchment properties in the Milliman and Farnsworth 2010 database indicate that the rate of land area change correlates with catchment size and discharge. Useful interpretation of the data requires that we
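
    The classify-then-average step can be sketched in simplified form with numpy standing in for the Earth Engine workflow (the threshold, the index sign convention, and the 30 m pixel size are assumptions for illustration):

```python
import numpy as np

def land_probability(index_stack, threshold=0.0):
    """Classify each image in a (time, y, x) stack of water-index values as
    land (1.0, index below threshold) or water (0.0), then average over time:
    the mean is the probability a pixel is classified as land."""
    land_masks = (index_stack < threshold).astype(float)
    return land_masks.mean(axis=0)

def net_land_change(prob_early, prob_late, pixel_area_km2=0.0009):
    # 30 m Landsat pixels -> 0.0009 km^2 each; change in expected land area.
    return (prob_late - prob_early).sum() * pixel_area_km2
```

    Averaging many binary masks is what suppresses tide-driven scatter: a pixel flooded only at high tide ends up with an intermediate probability rather than flipping the land/water classification of any single scene.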

  17. Assessment of the current status of basic nuclear data compilations

    Energy Technology Data Exchange (ETDEWEB)

    Riemer, R.L.

    1992-12-31

    The Panel on Basic Nuclear Data Compilations believes that it is important to provide the user with an evaluated nuclear database of the highest quality, dependability, and currency. It is also important that the evaluated nuclear data are easily accessible to the user. In the past the panel concentrated its concern on the cycle time for the publication of A-chain evaluations. However, the panel now recognizes that publication cycle time is no longer the appropriate goal. Sometime in the future, publication of the evaluated A-chains will evolve from the present hard-copy Nuclear Data Sheets on library shelves to purely electronic publication, with the advent of universal access to terminals and the nuclear databases. Therefore, the literature cut-off date in the Evaluated Nuclear Structure Data File (ENSDF) is rapidly becoming the only important measure of the currency of an evaluated A-chain. Also, it has become exceedingly important to ensure that access to the databases is as user-friendly as possible and to enable electronic publication of the evaluated data files. Considerable progress has been made in these areas: use of the on-line systems has almost doubled in the past year, and there has been initial development of tools for electronic evaluation, publication, and dissemination. Currently, the nuclear data effort is in transition between the traditional and future methods of dissemination of the evaluated data. Also, many of the factors that adversely affect the publication cycle time simultaneously affect the currency of the evaluated nuclear database. Therefore, the panel continues to examine factors that can influence cycle time: the number of evaluators, the frequency with which an evaluation can be updated, the review of the evaluation, and the production of the evaluation, which currently exists as a hard-copy issue of Nuclear Data Sheets.

  18. A compiled checklist of seaweeds of Sudanese Red Sea coast

    Directory of Open Access Journals (Sweden)

    Nahid Abdel Rahim Osman

    2016-02-01

    Full Text Available Objective: To present an updated and compiled checklist of Sudanese seaweeds as an example for the region for conservational as well as developmental purposes. Methods: The checklist was developed based on both field investigations using line transect method at 4 sites along the Red Sea coast of Sudan and review of available studies done on Sudanese seaweeds. Results: In total 114 macroalgal names were recorded and were found to be distributed in 16 orders, 34 families, and 62 genera. The Rhodophyceae macroalgae contained 8 orders, 17 families, 32 genera and 47 species. The Phaeophyceae macroalgae composed of 4 orders, 5 families, 17 genera, and 28 species. The 39 species of the Chlorophyceae macroalgae belong to 2 classes, 4 orders, 12 families, and 14 genera. The present paper proposed the addition of 11 macroalgal taxa to be included in Sudan seaweeds species list. These include 3 red seaweed species, 1 brown seaweed species and 7 green seaweed species. Conclusions: This list is not yet inclusive and it only represents the macroalgal species common to the intertidal areas of Sudan Red Sea coast. Further investigation may reveal the presence of more species. While significant levels of diversity and endemism were revealed for other groups of organisms in the Red Sea region, similar work still has to be performed for seaweeds. Considering the impact of climate change on communities’ structure and composition and the growing risk of maritime transportation through the Red Sea particularly that may originate from oil tankers as well as that may emanate from oil exploration, baseline data on seaweeds are highly required for management purposes.

  19. Fifth Baltic Sea pollution load compilation (PLC-5)

    Energy Technology Data Exchange (ETDEWEB)

    Knuuttila, S.; Svendsen, L.M.; Staaf, H.; Kotilainen, P.; Boutrup, S.; Pyhala, M.; Durkin, M.

    2011-07-01

    This report includes the main results from the Fifth Pollution Load Compilation, abbreviated PLC-5. It includes quantified annual waterborne total loads (from rivers, unmonitored and coastal areas, as well as direct point and diffuse sources discharging directly to the Baltic Sea) from 1994 to 2008 to provide a basis for evaluating any decreasing (or increasing) trends in the total waterborne inputs to the Baltic Sea. Chapter 1 contains the objectives of the PLC and the framework for classification of inputs and sources. Chapter 2 includes a short description of the Baltic Sea catchment area, while the methods for quantification and analysis, together with quality assurance topics, are briefly introduced in Chapter 3. More detailed information on methodologies is presented in the PLC-5 guidelines (HELCOM 2006). Chapter 4 reports the total inputs to the Baltic Sea of nutrients and selected heavy metals. Furthermore, the results of the quantification of discharges and losses of nitrogen and phosphorus from point and diffuse sources into inland surface waters within the Baltic Sea catchment area (source-oriented approach, or gross loads), as well as the total load to the maritime area (load-oriented approach, or net loads), in 2006 are shown. Typically, results are presented by country and by main Baltic Sea sub-region. In Chapter 5, flow normalization is introduced and the results of trend analyses on 1994-2008 time series of total waterborne loads of nitrogen and phosphorus are given, together with a first evaluation of progress towards the provisional reduction targets by country and by main Baltic Sea sub-region. Chapter 6 includes discussion of some of the main conclusions and advice for future PLCs. The annexes contain the flow-normalized annual load data and figures and tables with results from the PLC-5.
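
    Flow normalization, introduced in Chapter 5, can be sketched with one common linear approach (a generic method shown for illustration; the PLC-5 guidelines define HELCOM's actual procedure): regress annual load on annual runoff and re-centre each year on the long-term mean flow.

```python
import numpy as np

def flow_normalize(loads, flows):
    """Linear flow normalization: remove the part of the interannual load
    variability that is explained by runoff, so that trends reflect changes
    in sources rather than wet and dry years. (A generic textbook variant,
    not necessarily the exact HELCOM PLC method.)"""
    b, a = np.polyfit(flows, loads, 1)        # load ~ a + b * flow
    return loads - b * (flows - flows.mean())
```

    For a load record that tracks runoff exactly, the normalized series is flat, which is the intended behaviour: the variability that remains is what trend analysis should act on.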

  20. Evaluation and compilation of fission product yields 1993

    Energy Technology Data Exchange (ETDEWEB)

    England, T.R.; Rider, B.F.

    1995-12-31

    This document is the latest in a series of compilations of fission yield data. Fission yield measurements reported in the open literature and calculated charge distributions have been used to produce a recommended set of yields for the fission products. The original data, with reference sources, and the recommended yields are presented in tabular form. These include many nuclides which fission by neutrons at several energies. These energies include thermal energies (T), fission spectrum energies (F), 14 MeV high energies (H or HE), and spontaneous fission (S), in six sets of ten each. Set A includes U235T, U235F, U235HE, U238F, U238HE, Pu239T, Pu239F, Pu241T, U233T, Th232F. Set B includes U233F, U233HE, U236F, Pu239H, Pu240F, Pu241F, Pu242F, Th232H, Np237F, Cf252S. Set C includes U234F, U237F, Pu240H, U234HE, U236HE, Pu238F, Am241F, Am243F, Np238F, Cm242F. Set D includes Th227T, Th229T, Pa231F, Am241T, Am241H, Am242MT, Cm245T, Cf249T, Cf251T, Es254T. Set E includes Cf250S, Cm244S, Cm248S, Es253S, Fm254S, Fm255T, Fm256S, Np237H, U232T, U238S. Set F includes Cm243T, Cm246S, Cm243F, Cm244F, Cm246F, Cm248F, Pu242H, Np237T, Pu240T, and Pu242T to complete fission product yield evaluations for 60 fissioning systems in all. This report also serves as the primary documentation for the second evaluation of yields in ENDF/B-VI released in 1993.

  1. Ada Compiler Validation Summary Report: Certificate Number 890621I1. 10150 Intel Corporation/TARTAN Laboratories VMS Ada960MC Compiler R1.0 VAXstation 3200 to Intel 80060MC

    Science.gov (United States)

    1989-06-21

    ...set forth in this report are accurate and complete, or that the subject compiler has no nonconformities to the Ada Standard other than those presented... translators, and interpreters. Failed test: an ACVC test for which the compiler generates a result that demonstrates nonconformity to the Ada Standard. Host... compilation files or not. A1408 and A1506 allow this behaviour. 1) Generic specifications and bodies can be compiled in separate compilations. (See tests

  2. Los Alamos National Laboratory: Request for Information (RFI) – Call for Commercialization Partners on behalf of the Department of Energy’s Fuel Cell Technologies Office (FCTO) L’Innovator Pilot Program

    Energy Technology Data Exchange (ETDEWEB)

    Barber, Laura Jeaneen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-19

    The purpose of the L’Innovator is to assemble unique, state-of-the-art IP bundles developed at the national labs that aggregate synergistic technologies in furtherance of the emerging hydrogen and fuel cell market. The first L’Innovator IP bundle consists of Oxygen Reduction Reaction (ORR) Catalyst technology developed at Brookhaven National Laboratory (BNL), combined with Membrane Electrode Assembly (MEA) technology developed at Los Alamos National Laboratory (LANL).

  3. Compilation and network analyses of cambrian food webs.

    Directory of Open Access Journals (Sweden)

    Jennifer A Dunne

    2008-04-01

    A rich body of empirically grounded theory has developed about food webs--the networks of feeding relationships among species within habitats. However, detailed food-web data and analyses are lacking for ancient ecosystems, largely because of the low resolution of taxa coupled with uncertain and incomplete information about feeding interactions. These impediments appear insurmountable for most fossil assemblages; however, a few assemblages with excellent soft-body preservation across trophic levels are candidates for food-web data compilation and topological analysis. Here we present plausible, detailed food webs for the Chengjiang and Burgess Shale assemblages from the Cambrian Period. Analyses of degree distributions and other structural network properties, including sensitivity analyses of the effects of uncertainty associated with Cambrian diet designations, suggest that these early Paleozoic communities share remarkably similar topology with modern food webs. Observed regularities reflect a systematic dependence of structure on the numbers of taxa and links in a web. Most aspects of Cambrian food-web structure are well-characterized by a simple "niche model," which was developed for modern food webs and takes into account this scale dependence. However, a few aspects of topology differ between the ancient and recent webs: longer path lengths between species and more species in feeding loops in the earlier Chengjiang web, and higher variability in the number of links per species for both Cambrian webs. Our results are relatively insensitive to the exclusion of low-certainty or random links. The many similarities between Cambrian and recent food webs point toward surprisingly strong and enduring constraints on the organization of complex feeding interactions among metazoan species. The few differences could reflect a transition to more strongly integrated and constrained trophic organization within ecosystems following the rapid
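
    The "niche model" invoked above can be sketched as follows; the sampling choices follow the standard formulation (Williams and Martinez), simplified for illustration. Each species receives a niche value n on [0, 1], a feeding range r whose expected width is set by the target connectance C, and a range centre c; it then eats every species whose niche value falls inside [c - r/2, c + r/2].

```python
import random

def niche_model(S, C, seed=0):
    """Generate a niche-model food web with S species and target connectance C.
    Returns a dict mapping each consumer index to its set of resource indices."""
    rng = random.Random(seed)
    n = sorted(rng.random() for _ in range(S))       # niche values
    beta = 1.0 / (2.0 * C) - 1.0                     # gives E[r] = 2 * C * n
    web = {}
    for i in range(S):
        # Range width: n[i] times a Beta(1, beta) draw (inverse-CDF sampling).
        r = n[i] * (1.0 - rng.random() ** (1.0 / beta))
        c = rng.uniform(r / 2.0, n[i])               # range centre
        web[i] = {j for j in range(S) if c - r / 2.0 <= n[j] <= c + r / 2.0}
    return web
```

    Because range widths scale with the target connectance, webs generated this way exhibit the scale dependence the abstract describes: structural properties vary systematically with the numbers of taxa and links.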

  4. A Compilation of MATLAB Scripts and Functions for MACGMC Analyses

    Science.gov (United States)

    Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.

    2017-01-01

The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The MathWorks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for CMCs and PMCs, visualization of RUCs from MACGMC input and output files, and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response. In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte-Carlo simulations to enable probabilistic
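As an illustration of the kind of pre-processing task being automated here, the sketch below generates a repeating-unit-cell (RUC) occupancy grid for a centred square fiber. The grid layout and material codes are illustrative assumptions only, not the actual MAC/GMC input format:

```python
import math

def square_fiber_ruc(n, fiber_vf):
    """Build an n-by-n repeating-unit-cell occupancy grid with a centred
    square fiber whose area fraction approximates fiber_vf.
    Material codes (1 = fiber, 2 = matrix) are illustrative only."""
    side = max(1, round(n * math.sqrt(fiber_vf)))    # fiber block edge length
    lo = (n - side) // 2                             # centre the fiber block
    return [[1 if lo <= r < lo + side and lo <= c < lo + side else 2
             for c in range(n)] for r in range(n)]

grid = square_fiber_ruc(8, 0.25)
achieved = sum(cell == 1 for row in grid for cell in row) / 64
print(achieved)   # -> 0.25
```

A script library like the one described would emit many such architectures (square, hexagonal, random fiber packings) and write each into the RUC section of an input file.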

  5. Simulating TRSs by minimal TRSs : a simple, efficient, and correct compilation technique

    NARCIS (Netherlands)

    J.F.T. Kamperman; H.R. Walters (Pum)

    1996-01-01

    textabstractA simple, efficient, and correct compilation technique for left-linear Term Rewriting Systems (TRSs) is presented. TRSs are compiled into Minimal Term Rewriting Systems (MTRSs), a subclass of TRSs, presented in [KW95d]. In MTRSs, the rules have such a simple form that they can be seen as
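To make the object of compilation concrete, here is a generic sketch of left-linear term rewriting (this illustrates TRS evaluation in general, not the MTRS instruction set of [KW95d]): terms are tuples, patterns are matched, and rules are applied innermost-first until a normal form is reached.

```python
def match(pat, term, subst):
    # left-linear matching: each variable occurs at most once in a pattern,
    # so bindings never need a consistency check
    if isinstance(pat, str):                         # pattern variable
        subst[pat] = term
        return True
    if isinstance(term, str) or pat[0] != term[0] or len(pat) != len(term):
        return False
    return all(match(p, t, subst) for p, t in zip(pat[1:], term[1:]))

def substitute(term, subst):
    if isinstance(term, str):
        return subst[term]
    return (term[0],) + tuple(substitute(t, subst) for t in term[1:])

def normalize(term, rules):
    # innermost rewriting: normalize arguments first, then try each rule
    if not isinstance(term, str):
        term = (term[0],) + tuple(normalize(t, rules) for t in term[1:])
    for lhs, rhs in rules:
        subst = {}
        if match(lhs, term, subst):
            return normalize(substitute(rhs, subst), rules)
    return term

# Peano addition: add(0, y) -> y ;  add(s(x), y) -> s(add(x, y))
rules = [(("add", ("0",), "y"), "y"),
         (("add", ("s", "x"), "y"), ("s", ("add", "x", "y")))]
one = ("s", ("0",))
print(normalize(("add", one, one), rules))   # -> ('s', ('s', ('0',)))
```

A compiler in the spirit of the paper would translate such rules into a much more restricted rule format rather than interpret them as above.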

  6. Methods for the Compilation of a Core List of Journals in Toxicology.

    Science.gov (United States)

    Kuch, T. D. C.

    Previously reported methods for the compilation of core lists of journals in multidisciplinary areas are first examined, with toxicology used as an example of such an area. Three approaches to the compilation of a core list of journals in toxicology were undertaken and the results analyzed with the aid of models. Analysis of the results of the…

  7. Regulatory and technical reports: (Abstract index journal). Compilation for first quarter 1997, January--March

    Energy Technology Data Exchange (ETDEWEB)

    Sheehan, M.A.

    1997-06-01

This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation is published quarterly and cumulated annually. Reports consist of staff-originated reports, NRC-sponsored conference reports, NRC contractor-prepared reports, and international agreement reports.

  8. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1997, July--September

    Energy Technology Data Exchange (ETDEWEB)

    Stevenson, L.L.

    1998-01-01

This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. This report contains the third quarter 1997 abstracts.

  9. Compiling a land audit in large rural areas: Results from the ...

    African Journals Online (AJOL)

To compile a comprehensive land audit in large, mainly rural-based municipalities such as the Matzikama Municipality in the Western Cape warrants an alternative methodology to that conventionally applied through exhaustive property visits. This study attempts to showcase such an alternative methodology to compile the ...

  10. Design Choices in a Compiler Course or How to Make Undergraduates Love Formal Notation

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    2008-01-01

    The undergraduate compiler course offers a unique opportunity to combine many aspects of the Computer Science curriculum. We discuss the many design choices that are available for the instructor and present the current compiler course at the University of Aarhus, the design of which displays...

  11. Data compilation and assessment for water resources in Pennsylvania state forest and park lands

    Science.gov (United States)

    Galeone, Daniel G.

    2011-01-01

As a result of a cooperative study between the U.S. Geological Survey and the Pennsylvania Department of Conservation and Natural Resources (PaDCNR), available electronic data were compiled for Pennsylvania state lands (state forests and parks) to allow PaDCNR to initially determine if data exist to make an objective evaluation of water resources for specific basins. The data compiled included water-quantity and water-quality data and sample locations for benthic macroinvertebrates within state-owned lands (including a 100-meter buffer around each land parcel) in Pennsylvania. In addition, internet links or contacts for geographic information system coverages pertinent to water-resources studies also were compiled. Water-quantity and water-quality data primarily available through January 2007 were compiled and summarized for site types that included streams, lakes, ground-water wells, springs, and precipitation. Data were categorized relative to 35 watershed boundaries defined by the Pennsylvania Department of Environmental Protection for resource-management purposes. The primary sources of continuous water-quantity data for Pennsylvania state lands were the U.S. Geological Survey (USGS) and the National Weather Service (NWS). The USGS has streamflow data for 93 surface-water sites located in state lands; 38 of these sites have continuous-recording data available. As of January 2007, 22 of these 38 streamflow-gaging stations were active; the majority of active gaging stations have over 40 years of continuous record. The USGS database also contains continuous ground-water elevation data for 32 wells in Pennsylvania state lands, 18 of which were active as of January 2007. Sixty-eight active precipitation stations (primarily from the NWS network) are located in state lands. The four sources of available water-quality data for Pennsylvania state lands were the USGS, U.S. Environmental Protection Agency, Pennsylvania Department of Environmental Protection (PaDEP), and

  12. Obtaining correct compile results by absorbing mismatches between data types representations

    Energy Technology Data Exchange (ETDEWEB)

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni; Takeuchi, Mikio

    2017-11-21

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.
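The conversion-table-plus-error-node scheme described in this record can be sketched as follows. Node kinds, the type table, and the example languages are illustrative assumptions, not details from the patent: unmapped types become a special error node that stores the original token, and unparsing emits that token back as first-language source.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    kind: str
    token: str = ""
    children: List["Node"] = field(default_factory=list)

# hypothetical conversion table from first-language data-representation
# types to second-language types
TYPE_TABLE = {"Int32": "int", "Float64": "double", "Text": "std::string"}

def convert(node):
    # convert a first-language AST into the second language's AST;
    # an unmapped type becomes a special ErrorNode holding the error token
    if node.kind == "Type":
        if node.token in TYPE_TABLE:
            return Node("Type", TYPE_TABLE[node.token])
        return Node("ErrorNode", node.token)      # store the error token
    return Node(node.kind, node.token, [convert(c) for c in node.children])

def unparse(node):
    # when unparsing, an ErrorNode outputs its token as first-language source
    if node.kind == "ErrorNode":
        return node.token
    if node.children:
        return " ".join(unparse(c) for c in node.children)
    return node.token

decl = Node("Decl", children=[Node("Type", "Int32"), Node("Name", "x")])
bad  = Node("Decl", children=[Node("Type", "Quaternion"), Node("Name", "q")])
print(unparse(convert(decl)))   # -> "int x"
print(unparse(convert(bad)))    # -> "Quaternion q"
```

The point of the error node is that a partial conversion failure surfaces as recognizable first-language source rather than aborting the whole compile.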

  13. Obtaining correct compile results by absorbing mismatches between data types representations

    Energy Technology Data Exchange (ETDEWEB)

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni; Takeuchi, Mikio

    2017-03-21

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.

  14. Obtaining correct compile results by absorbing mismatches between data types representations

    Energy Technology Data Exchange (ETDEWEB)

    Horie, Michihiro; Horii, Hiroshi H.; Kawachiya, Kiyokuni; Takeuchi, Mikio

    2016-10-04

    Methods and a system are provided. A method includes implementing a function, which a compiler for a first language does not have, using a compiler for a second language. The implementing step includes generating, by the compiler for the first language, a first abstract syntax tree. The implementing step further includes converting, by a converter, the first abstract syntax tree to a second abstract syntax tree of the compiler for the second language using a conversion table from data representation types in the first language to data representation types in the second language. When a compilation error occurs, the implementing step also includes generating a special node for error processing in the second abstract syntax tree and storing an error token in the special node. When unparsing, the implementing step additionally includes outputting the error token, in the form of source code written in the first language.

  15. 2002 global update of available data on visual impairment: a compilation of population-based prevalence studies.

    Science.gov (United States)

    Pascolini, D; Mariotti, S P; Pokharel, G P; Pararajasegaram, R; Etya'ale, D; Négrel, A D; Resnikoff, S

    2004-04-01

For the past 25 years, the WHO Programme for the Prevention of Blindness and Deafness has maintained a Global Data Bank on visual impairment with the purpose of storing the available epidemiological data on blindness and low vision. The Data Bank has now been updated to include studies conducted since the last update in 1994. An extensive literature search was conducted in international and national scientific and medical journals to identify epidemiological studies that fulfilled basic criteria for inclusion in the Data Bank, namely a clearly stated definition of blindness and low vision, and prevalence rates derived from population-based surveys. Sources such as National Prevention of Blindness Programmes, academic institutions or WHO country or regional reports were also investigated. Two hundred and eight population-based studies on visual impairment for 68 countries are reported in detail, providing an up-to-date, comprehensive compilation of the available information on visual impairment and its causes globally.

  16. Los Alamos geostationary orbit synoptic data set: a compilation of energetic particle data

    Energy Technology Data Exchange (ETDEWEB)

    Baker, D.N.; Higbie, P.R.; Belian, R.D.; Aiello, W.P.; Hones, E.W. Jr.; Tech, E.R.; Halbig, M.F.; Payne, J.B.; Robinson, R.; Kedge, S.

    1981-08-01

Energetic electron (30 to 2000 keV) and proton (145 keV to 150 MeV) measurements made by Los Alamos National Laboratory sensors at geostationary orbit (6.6 R_E) are summarized. The data are plotted in terms of daily average spectra, 3-h local time averages, and a variety of statistical formats. The data summarize conditions from mid-1976 through 1978 (S/C 1976-059) and from early 1977 through 1978 (S/C 1977-007). The compilations correspond to measurements at 35°W, 70°W, and 135°W geographic longitude and thus are indicative of conditions at 9°, 11°, and 4.8° geomagnetic latitude, respectively. Most of this report consists of data plots that are organized according to Carrington solar rotations so that the data can be easily compared with solar rotation-dependent interplanetary data. As shown in prior studies, variations in solar wind conditions modulate particle intensity within the terrestrial magnetosphere. The effects of these variations are demonstrated and discussed. Potential uses of the Synoptic Data Set by the scientific and applications-oriented communities are also discussed.

  17. Palaeoecological studies as a source of peat depth data: A discussion and data compilation for Scotland

    Directory of Open Access Journals (Sweden)

    J. Ratcliffe

    2016-06-01

Full Text Available The regional/national carbon (C) stock of peatlands is often poorly characterised, even for comparatively well-studied areas. A key obstacle to better estimates of landscape C stock is the scarcity of data on peat depth, leading to simplistic assumptions. New measurements of peat depth become unrealistically resource-intensive when considering large areas. Therefore, it is imperative to maximise the use of pre-existing datasets. Here we propose that one potentially valuable and currently unexploited source of peat depth data is palaeoecological studies. We discuss the value of these data and present an initial compilation for Scotland (United Kingdom) which consists of records from 437 sites and yields an average depth of 282 cm per site. This figure is likely to be an over-estimate of true average peat depth and is greater than figures used in current estimates of peatland C stock. Depth data from palaeoecological studies have the advantages of wide distribution, high quality, and often the inclusion of valuable supporting information; but also the disadvantage of spatial bias due to the differing motivations of the original researchers. When combined with other data sources, each with its own advantages and limitations, we believe that palaeoecological datasets can make an important contribution to better-constrained estimates of peat depth which, in turn, will lead to better estimates of peatland landscape carbon stock.

  18. Identification, Verification, and Compilation of Produced Water Management Practices for Conventional Oil and Gas Production Operations

    Energy Technology Data Exchange (ETDEWEB)

    Rachel Henderson

    2007-09-30

The project is titled 'Identification, Verification, and Compilation of Produced Water Management Practices for Conventional Oil and Gas Production Operations'. The Interstate Oil and Gas Compact Commission (IOGCC), headquartered in Oklahoma City, Oklahoma, is the principal investigator, and the IOGCC has partnered with ALL Consulting, Inc., headquartered in Tulsa, Oklahoma, in this project. State agencies that also have partnered in the project are the Wyoming Oil and Gas Conservation Commission, the Montana Board of Oil and Gas Conservation, the Kansas Oil and Gas Conservation Division, the Oklahoma Oil and Gas Conservation Division, and the Alaska Oil and Gas Conservation Commission. The objective is to characterize produced water quality and management practices for the handling, treating, and disposing of produced water from conventional oil and gas operations throughout the industry nationwide. Water produced from these operations varies greatly in quality and quantity and is often the single largest barrier to the economic viability of wells. The lack of data, coupled with renewed emphasis on domestic oil and gas development, has prompted many experts to speculate that the number of wells drilled over the next 20 years will approach 3 million, or near the number of current wells. This level of exploration and development undoubtedly will draw the attention of environmental communities, focusing their concerns on produced water management based on perceived potential impacts to fresh water resources. Therefore, it is imperative that produced water management practices be performed in a manner that best minimizes environmental impacts. This is being accomplished by compiling current best management practices for produced water from conventional oil and gas operations and by developing an analysis tool based on a geographic information system (GIS) to assist in the understanding of watershed-issued permits. That would allow management costs to be kept in

  19. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.; Tick, E. [comp.

    1991-11-01

This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31--November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purpose of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working in compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led to different communities emphasizing different species of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.

  20. A ROSE-based OpenMP 3.0 Research Compiler Supporting Multiple Runtime Libraries

    Energy Technology Data Exchange (ETDEWEB)

    Liao, C; Quinlan, D; Panas, T

    2010-01-25

OpenMP is a popular and evolving programming model for shared-memory platforms. It relies on compilers for optimal performance and to target modern hardware architectures. A variety of extensible and robust research compilers are key to OpenMP's sustainable success in the future. In this paper, we present our efforts to build an OpenMP 3.0 research compiler for C, C++, and Fortran using the ROSE source-to-source compiler framework. Our goal is to support OpenMP research for ourselves and others. We have extended ROSE's internal representation to handle all of the OpenMP 3.0 constructs and facilitate their manipulation. Since OpenMP research is often complicated by the tight coupling of the compiler translations and the runtime system, we present a set of rules to define a common OpenMP runtime library (XOMP) on top of multiple runtime libraries. These rules additionally define how to build a set of translations targeting XOMP. Our work demonstrates how to reuse OpenMP translations across different runtime libraries. This work simplifies OpenMP research by decoupling the problematic dependence between the compiler translations and the runtime libraries. We present an evaluation of our work by demonstrating an analysis tool for OpenMP correctness. We also show how XOMP can be defined using both GOMP and Omni and present comparative performance results against other OpenMP compilers.
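The decoupling idea described here (compiler-generated code targets one common API, with thin adapters onto each underlying runtime) can be sketched abstractly. The names, the serial backends, and the class shape below are illustrative assumptions, not ROSE's or XOMP's actual interface:

```python
def _serial_parallel_start(fn, data, nthreads):
    # stand-in backend: runs the outlined region once per "thread";
    # a real adapter would forward to GOMP's or Omni's entry points
    for _ in range(nthreads):
        fn(data)

BACKENDS = {"gomp": _serial_parallel_start, "omni": _serial_parallel_start}

class XOMP:
    """Common runtime layer: compiler translations call only these entry
    points, so one translation runs on any registered backend."""
    def __init__(self, backend="gomp"):
        self._parallel_start = BACKENDS[backend]

    def parallel_start(self, outlined_fn, shared_data, nthreads):
        self._parallel_start(outlined_fn, shared_data, nthreads)

# what a translated "omp parallel" region might look like
hits = []
XOMP("omni").parallel_start(lambda d: d.append(1), hits, 4)
print(len(hits))   # -> 4
```

Swapping the backend changes nothing in the "generated" calling code, which is exactly the dependence the paper's rules are designed to break.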

  1. Compilation of ocean circulation and other data from ADCP current meters, CTD casts, tidal gauges, and other instruments from a World-Wide distribution by Oregon State University and other institutions as part of World Ocean Circulation Experiment (WOCE) and other projects from 24 November 1985 to 30 December 2000 (NODC Accession 0000649)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Compilation of ocean circulation and other data were collected from a World-Wide distribution by Oregon State University (OSU) and other institutions as part of...

  2. Dolphin-FEW: an architecture for compilers development, monitoring and use on the web

    OpenAIRE

    Matos, Paulo; Henriques, Pedro

    2003-01-01

DOLPHIN is a framework developed to help the construction of high-performance, multi-language, retargetable compilers. It consists of a set of components, used to build and test new compilers or compiler routines, and a set of tools, used to access, manage, and develop the components. To improve and enlarge the functionality of DOLPHIN, several small projects were implemented around the framework; one of them is DOLPHIN – Front-End for the Web, whose goal is to build ...

  3. Regulatory and technical reports (abstract index journal): Annual compilation for 1994. Volume 19, Number 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order. These precede the following indexes: secondary report number index, personal author index, subject index, NRC originating organization index (staff reports), NRC originating organization index (international agreements), NRC contract sponsor index (contractor reports), contractor index, international organization index, and licensed facility index. A detailed explanation of the entries precedes each index.

  4. A Critical Compilation of Compressible Turbulent Boundary Layer Data

    Science.gov (United States)

    1977-06-01


  5. AGS-2000: Experiments for the 21st Century. Proceedings of the workshop held at Brookhaven National Laboratory, May 13--17, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Littenberg, L. [ed.] [Brookhaven National Lab., Upton, NY (United States); Sandweiss, J. [ed.] [Yale Univ., New Haven, CT (United States)

    1996-10-01

The AGS has a vital and interesting potential for new research. The reasons for this are a fortunate concomitance of the energy chosen for the AGS and the steady stream of technological advances which have increased both the intensity and flexibility of the AGS beams and the capability of detectors to use these new beam parameters. The physics potential of the future AGS program can be roughly divided into three broad areas: (1) fundamental elementary particle studies (based on rare kaon decays, rare muon processes, and searches for new particles); (2) non-perturbative QCD; and (3) heavy ion physics. The overriding considerations for the operation of the AGS in the next decade must, of course, be the interest and potential of the scientific program. However, once that has been established, there are other aspects of the AGS program which deserve mention. Although experiments at the AGS are of increasing sophistication, they are smaller, less expensive, and more quickly executed than experiments at newer, larger facilities. Finally, the authors note that since the AGS must be maintained as a viable accelerator to serve as an injector to RHIC, the cost of an AGS fixed-target experiment need be only the incremental cost of the experiment itself along with some modest additional operating costs. This means that AGS fixed-target experiments are substantially cheaper than they would have been before the RHIC era. The remainder of this document contains brief summaries of the experiments considered by the working groups in the AGS-2000 Workshop. These summaries expand on points discussed here.

  6. HOM damping properties of fundamental power couplers in the superconducting electron gun of the energy recovery LINAC at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Hammons, L.; Hahn, H.

    2011-03-28

Among the accelerator projects under construction at the Relativistic Heavy Ion Collider (RHIC) is an R&D energy recovery LINAC (ERL) test facility. The ERL includes both a five-cell superconducting cavity and a superconducting photoinjector electron gun. Because of the high-charge and high-current demands, effective higher-order mode (HOM) damping is essential, and several strategies are being pursued. Among these is the use of the fundamental power couplers as a means for damping some HOMs. Simulation studies have shown that the power couplers can play a substantial role in damping certain HOMs, and this presentation discusses these studies along with measurements.

  7. Proceedings of the international workshop on the effects of acid precipitation on vegetation, soils, and terrestrial ecosystems, Brookhaven National Laboratory, June 12 to 14, 1979

    Energy Technology Data Exchange (ETDEWEB)

    Evans, L.S.; Hendrey, G.R. (eds.)

    1979-01-01

The objectives of the workshop were to determine the levels of current knowledge of the effects of acid precipitation on vegetation, soils, and terrestrial ecosystems; to identify the research needed in these areas to understand the environmental impacts of acid rain; and to help coordinate research groups to avoid excessive duplication of research. The workshop was designed so that researchers in the areas of effects of acid precipitation on vegetation, soils, and whole ecosystem approaches could communicate effectively. There was a general consensus that acid rain at extreme ambient levels, or in artificial systems that simulate extreme ambient levels, causes injury to plant tissues. A major area of concern of acid rain injury was thought to be plant reproduction. The overall levels of significance of plant injury among various plant species remain unknown. The most important priorities in the area of effects of acid rain on crops were an evaluation of effects on crop yields and the interaction of acid rain in combination with other pollutants on various plants. Few participants thought that ambient acid rain loadings have altered soils to such a degree that plants are affected at present, but many thought that acid rain could cause some alterations in soils. The most important research priorities were in the areas of the effects of acid rain on increased leaching of exchangeable plant nutrients and alterations in phosphorus availability. All participants agreed that there are alterations in terrestrial ecosystems from acid precipitation. However, no demonstrated harmful effects were presented from natural ecosystems. Further research on the effects of acid rain on terrestrial ecosystems should be directed mostly toward the interaction of acid rain with toxic elements such as Al, Fe, and Mn and toward the effects on nutrient cycling, especially that of nitrogen.

  8. Final Report Independent Verification Survey of the High Flux Beam Reactor, Building 802 Fan House Brookhaven National Laboratory Upton, New York

    Energy Technology Data Exchange (ETDEWEB)

    Evan Harpeneau

    2011-06-24

    The Separations Process Research Unit (SPRU) complex located on the Knolls Atomic Power Laboratory (KAPL) site in Niskayuna, New York, was constructed in the late 1940s to research the chemical separation of plutonium and uranium (Figure A-1). SPRU operated as a laboratory scale research facility between February 1950 and October 1953. The research activities ceased following the successful development of the reduction oxidation and plutonium/uranium extraction processes. The oxidation and extraction processes were subsequently developed for large scale use by the Hanford and Savannah River sites (aRc 2008a). Decommissioning of the SPRU facilities began in October 1953 and continued through the 1990s.

  9. Final Report Independent Verification Survey of the High Flux Beam Reactor, Building 802 Fan House Brookhaven National Laboratory Upton, New York

    Energy Technology Data Exchange (ETDEWEB)

    Harpeneau, Evan M. [Oak Ridge Institute for Science and Education, Oak Ridge, TN (United States). Independent Environmental Assessment and Verification Program

    2011-06-24

    On May 9, 2011, ORISE conducted verification survey activities including scans, sampling, and the collection of smears of the remaining soils and off-gas pipe associated with the 802 Fan House within the HFBR (High Flux Beam Reactor) Complex at BNL. ORISE is of the opinion, based on independent scan and sample results obtained during verification activities at the HFBR 802 Fan House, that the FSS (final status survey) unit meets the applicable site cleanup objectives established for as left radiological conditions.

  10. An application of Brookhaven National Laboratory's hot particle methodology for determining the most effective beta particle energy in causing skin ulcers

    Energy Technology Data Exchange (ETDEWEB)

    Schaefer, C.

    1994-11-01

The purpose of this project was to compare the effectiveness of hot particles with different beta energies in producing ulcers on skin. The sources were man-made hot particles similar in size and activity to those found in the commercial nuclear power industry. Four different particle types were used: thulium (Tm-170) with a 0.97 MeV maximum energy beta; ytterbium (Yb-175) with a maximum beta energy of 0.47 MeV; scandium (Sc-46) with a 0.36 MeV beta, which was used as a surrogate for cobalt-60 (0.31 MeV beta); and uranium (in the carbide form) with an average maximum beta energy of about 2.5 MeV. Since higher energy beta particles penetrate further in skin, they affect a larger number and different populations of target cells. The experiments were designed as threshold studies such that the doses needed to produce ulcers ten percent of the time (ED 10%) could be compared across particle types.

  11. Ada Compiler Validation Summary Report: Proprietary Software Systems, Inc., PSS VAX.1750A Ada Compiler, VAX 8350 (Host) to MIL-STD-1750A/PSS AdaRAID (Target)

    Science.gov (United States)

    1989-12-18

    set forth in this report are accurate and complete, or that the subject compiler has no nonconformities to the Ada Standard other than those presented...translators, and interpreters. Failed test: An ACVC test for which the compiler generates a result that demonstrates nonconformity to the Ada Standard. Host: The...units are in separate compilation files or not. A1408 and A1506 allow this behaviour. 1) Generic specifications and bodies can be compiled in separate

  12. Expert Programmer versus Parallelizing Compiler: A Comparative Study of Two Approaches for Distributed Shared Memory

    Directory of Open Access Journals (Sweden)

    M. F. P. O'Boyle

    1996-01-01

    Full Text Available This article critically examines current parallel programming practice and optimizing compiler development. The general strategies employed by compiler and programmer to optimize a Fortran program are described, and then illustrated for a specific case by applying them to a well-known scientific program, TRED2, using the KSR-1 as the target architecture. Extensive measurement is applied to the resulting versions of the program, which are compared with a version produced by a commercial optimizing compiler, KAP. The compiler strategy significantly outperforms KAP and does not fall far short of the performance achieved by the programmer. Following the experimental section, each approach is critiqued by the other. Perceived flaws, advantages, and common ground are outlined, with an eye to improving both schemes.

  13. Compilation of the FY 1997 Financial Statements for Other Defense Organizations

    National Research Council Canada - National Science Library

    1998-01-01

    ...) Indianapolis Center of the FY 1997 Financial Statements for Other Defense Organizations. This report also summarizes conditions affecting the compilation process as reported in previous Inspector General, DoD, audit reports...

  14. Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG96-03: Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont: VGS Open-File Report VG96-3A, 2 plates, scale...

  15. Supplementary material on passive solar heating concepts. A compilation of published articles

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-05-01

    A compilation of published articles and reports dealing with passive solar energy concepts for heating and cooling buildings is presented. The following are included: fundamentals of passive systems, applications and technical analysis, graphic tools, and information sources. (MHR)

  16. Regulatory and technical reports (abstract index journal): Annual compilation for 1996, Volume 21, No. 4

    Energy Technology Data Exchange (ETDEWEB)

    Sheehan, M.A.

    1997-04-01

    This compilation is the annual cumulation of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors.

  17. On the problem of synopsis articles in the compilation of ...

    African Journals Online (AJOL)

    rbr

    linguistic and sociolinguistic environment, which must be taken into account in the compilation of dictionaries. Compared with simple articles, synopsis articles offer a better approach to the transfer of linguistic data and ...

  18. Compilation of high energy physics reaction data: inventory of the particle data group holdings 1980

    Energy Technology Data Exchange (ETDEWEB)

    Fox, G.C.; Stevens, P.R.; Rittenberg, A.

    1980-12-01

    A compilation is presented of reaction data taken from experimental high energy physics journal articles, reports, preprints, theses, and other sources. Listings of all the data are given, and the data points are indexed by reaction and momentum, as well as by their source document. Much of the original compilation was done by others working in the field. The data presented also exist in the form of a computer-readable and searchable database; primitive access facilities for this database are available.

  19. Scientific Programming with High Performance Fortran: A Case Study Using the xHPF Compiler

    Directory of Open Access Journals (Sweden)

    Eric De Sturler

    1997-01-01

    Full Text Available Recently, the first commercial High Performance Fortran (HPF) subset compilers have appeared. This article reports on our experiences with the xHPF compiler of Applied Parallel Research, version 1.2, for the Intel Paragon. At this stage, we do not expect very High Performance from our HPF programs, even though performance will eventually be of paramount importance for the acceptance of HPF. Instead, our primary objective is to study how to convert large Fortran 77 (F77) programs to HPF such that the compiler generates reasonably efficient parallel code. We report on a case study that identifies several problems when parallelizing code with HPF; most of these problems affect current HPF compiler technology in general, although some are specific to the xHPF compiler. We discuss our solutions from the perspective of the scientific programmer, and present timing results on the Intel Paragon. The case study comprises three programs of different complexity with respect to parallelization. We use the dense matrix-matrix product to show that the distribution of arrays and the order of nested loops significantly influence the performance of the parallel program. We use Gaussian elimination with partial pivoting to study the parallelization strategy of the compiler. There are various ways to structure this algorithm for a particular data distribution. This example shows how much effort may be demanded from the programmer to support the compiler in generating an efficient parallel implementation. Finally, we use a small application to show that the more complicated structure of a larger program may introduce problems for the parallelization, even though all subroutines of the application are easy to parallelize by themselves. The application consists of a finite volume discretization on a structured grid and a nested iterative solver. Our case study shows that it is possible to obtain reasonably efficient parallel programs with xHPF, although the compiler
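    The loop-order observation from the dense matrix-matrix product can be sketched as follows. The example is in Python purely to show the transformation; the performance difference materializes in compiled code, where the inner loop's stride pattern determines cache and vector behavior (the i-k-j order is the friendly one for row-major storage; Fortran, being column-major, favors the mirror-image ordering):

```python
def matmul_ijk(A, B):
    """Triple loop in i-j-k order: the inner loop strides down a column of B,
    which is the cache-unfriendly access pattern for row-major arrays."""
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C

def matmul_ikj(A, B):
    """i-k-j order: the inner loop walks rows of B and C contiguously,
    the ordering a good compiler (or programmer) chooses for row-major data."""
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for k in range(n):
            aik = A[i][k]          # hoisted invariant, as a compiler would
            for j in range(n):
                C[i][j] += aik * B[k][j]
    return C
```

    Both orderings compute the same product; only the memory traffic differs.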

  20. Pin count-aware biochemical application compilation for mVLSI biochips

    DEFF Research Database (Denmark)

    Lander Raagaard, Michael; Pop, Paul

    2015-01-01

    manually by individually controlling each valve. Recent research has proposed top-down physical synthesis Computer- Aided Design tools, and programming languages and compilation techniques to automatically derive the control signals for the valve actuations. However, researchers have so far assumed...... a biochemical application. We focus on the compilation task, where the strategy is to delay operations, without missing their deadlines, such that the sharing of control signals is maximized. The evaluation shows a significant reduction in the number of control pins required....
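    The delay-to-share strategy can be sketched with a toy model. Everything below is invented for illustration (the operations, valve sets, and time windows are hypothetical, and real mVLSI pin sharing is a much richer optimization problem): each operation actuates a set of valves for one time step within its [release, deadline] window, and valves whose actuation time-series coincide can share one control pin.

```python
from collections import defaultdict

# Hypothetical operations: each actuates a valve set for one time step,
# anywhere in [release, deadline] (inclusive).
ops = [
    {"valves": frozenset({1, 2}), "release": 0, "deadline": 3},
    {"valves": frozenset({1, 2}), "release": 2, "deadline": 5},
    {"valves": frozenset({3}),    "release": 1, "deadline": 3},
]

def schedule_delayed(ops):
    """Delay each operation toward its deadline, reusing a time step already
    chosen for the same valve set whenever the window permits."""
    order = sorted(range(len(ops)), key=lambda i: ops[i]["deadline"])
    chosen, by_set = {}, defaultdict(list)
    for i in order:
        op = ops[i]
        t = op["deadline"]
        for j in by_set[op["valves"]]:
            if op["release"] <= chosen[j] <= op["deadline"]:
                t = chosen[j]        # co-schedule with a compatible earlier op
                break
        chosen[i] = t
        by_set[op["valves"]].append(i)
    return chosen

def pins_needed(ops, chosen):
    """Count distinct valve actuation time-series; identical series share a pin."""
    horizon = max(chosen.values()) + 1
    series = defaultdict(lambda: [0] * horizon)
    for i, op in enumerate(ops):
        for v in op["valves"]:
            series[v][chosen[i]] = 1
    return len({tuple(s) for s in series.values()})
```

    With this data, scheduling every operation at its release time needs two pins, while delaying all three to time step 3 aligns the actuation patterns and needs only one.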

  1. Fifth Baltic Sea pollution load compilation (PLC-5). An executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Svendsen, L.M.; Staaf, H.; Pyhala, M.; Kotilainen, P.; Bartnicki, J.; Knuuttila, S.; Durkin, M.

    2012-07-01

    This report summarizes and combines the main results of the Fifth Baltic Sea Pollution Load Compilation (HELCOM 2011), which covers waterborne loads to the sea, together with data on atmospheric loads submitted by countries to the Co-operative Programme for Monitoring and Evaluation of the Long-range Transmission of Air Pollutants in Europe (EMEP), which subsequently compiles and reports this information to HELCOM.

  2. The Integration of Compiled and Explicit Causal Knowledge for Diagnostic Reasoning

    OpenAIRE

    Senyk, Oksana

    1988-01-01

    Causal reasoning gained prominence in the “second generation” of artificial intelligence in medicine (AIM) programs in the late 1970s. Today we face the challenge of developing methods of compiled causal reasoning that approximate the clinical reasoning employed by expert diagnosticians for the efficient solution of most cases. To this end, we have developed the compiled causal link technique, by means of which a diagnostic system can solve routine cases quickly and can deal efficiently wit...

  3. A compiler and validator for flight operations on NASA space missions

    Science.gov (United States)

    Fonte, Sergio; Politi, Romolo; Capria, Maria Teresa; Giardino, Marco; De Sanctis, Maria Cristina

    2016-07-01

    In NASA missions, the management and programming of flight systems is performed with a specific scripting language, the SASF (Spacecraft Activity Sequence File). To check syntax and grammar, a compiler is needed that flags any errors found in the sequence file produced for an instrument on board the flight system. From our experience on the Dawn mission, we developed VIRV (VIR Validator), a tool that checks the syntax and grammar of SASF, runs simulations of VIR acquisitions, and detects any violations of flight rules in the sequences produced. The SASF compiler project (SSC - Spacecraft Sequence Compiler) is ready for a new implementation: generalization to different NASA missions. In fact, VIRV is a compiler for a dialect of SASF; it includes VIR commands as part of the SASF language. Our goal is to produce a general compiler for SASF, in which every instrument contributes a library to be introduced into the compiler. The SSC can analyze an SASF, produce a log of events, simulate the instrument acquisition, and check the flight rules for the selected instrument. The output of the program can be produced in GRASS GIS format and may help the operator analyze the geometry of the acquisition.
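    The validator idea can be sketched in miniature. The sequence format and the flight rule below are stand-ins (real SASF is far richer than "time, instrument, command" lines, and actual flight rules are mission-specific); the sketch only shows the shape of a rule check, here a hypothetical minimum spacing between consecutive commands to one instrument:

```python
# Hypothetical flight rule: an instrument must not be commanded twice
# within MIN_SPACING_S seconds.
MIN_SPACING_S = 10

def validate(sequence_text):
    """Return a list of rule-violation messages (empty list = sequence OK).
    Each input line is 'seconds, instrument, command' (a toy SASF stand-in)."""
    last_seen = {}
    violations = []
    for lineno, line in enumerate(sequence_text.strip().splitlines(), 1):
        time_s, instrument, command = (f.strip() for f in line.split(","))
        t = float(time_s)
        if instrument in last_seen and t - last_seen[instrument] < MIN_SPACING_S:
            violations.append(
                f"line {lineno}: {instrument} commanded "
                f"{t - last_seen[instrument]:g}s after previous command "
                f"(minimum {MIN_SPACING_S}s)")
        last_seen[instrument] = t
    return violations

seq = """\
0,   VIR, POWER_ON
5,   FC,  POWER_ON
12,  VIR, ACQUIRE
15,  VIR, ACQUIRE
"""
```

    Running `validate(seq)` flags only the fourth line, where VIR is commanded 3 s after its previous command.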

  4. The history of national accounting

    OpenAIRE

    Bos, Frits

    1992-01-01

    At present, the national accounts in most countries are compiled on the basis of concepts and classifications in international guidelines. In this paper, we trace the roots of these guidelines, compare the successive guidelines, and also discuss alternative accounting systems such as extended accounts and SAMs.

  5. A compilation of research working groups on drug utilisation across Europe.

    Science.gov (United States)

    Sabaté, Mònica; Pacheco, Juan Fernando; Ballarín, Elena; Ferrer, Pili; Petri, Hans; Hasford, Joerg; Schoonen, Marieke Wilma; Rottenkolber, Marietta; Fortuny, Joan; Laporte, Joan-Ramon; Ibáñez, Luisa

    2014-03-13

    The assessment of the benefit-risk of medicines needs careful consideration concerning their patterns of utilization. Systems for the monitoring of medicines consumption have been established in many European countries, and several international groups have identified and described them. No other compilation of European working groups has been published. As part of the PROTECT project, as a first step in searching for European data sources on the consumption of five selected groups of medicines, we aimed to identify and describe the main characteristics of the existing collaborative European working groups. Google and bibliographic searches (PubMed) of articles containing information on databases and other sources of drug consumption data were conducted. For each working group the main characteristics were recorded. Nineteen selected groups were identified, focusing on: a) general drug utilisation (DU) research (EuroDURG, CNC, ISPE'S SIG-DUR, EURO-MED-STAT, PIPERSKA Group, NorPEN, ENCePP, DURQUIM), b) specific DU research: b.1) antimicrobial drugs (ARPAC, ESAC, ARPEC, ESGAP, HAPPY AUDIT), b.2) cardiovascular disease (ARITMO, EUROASPIRE), b.3) paediatrics (TEDDY), and b.4) mental health/central nervous system effects (ESEMeD, DRUID, TUPP/EUPoMMe). Information on their aims, methods and activities is presented. We assembled and updated information on European working groups in DU research and in the utilisation of five selected groups of drugs for the PROTECT project. This information should be useful for academic researchers, regulatory and health authorities, and pharmaceutical companies conducting and interpreting post-authorisation and safety studies. European health authorities should encourage national research and collaborations in this important field for public health.

  6. National Center for Analysis of Energy Systems: program summaries for 1979

    Energy Technology Data Exchange (ETDEWEB)

    1979-12-01

    This Center, founded in January 1976, is one of four areas comprising the Department of Energy and Environment at Brookhaven National Laboratory. The major ongoing activities of the Center concern integrated, quantitative analyses of technological, economic, and environmental aspects of energy at the regional, national, and international levels. The objectives, activities, and sources of support of each of the programs are described and the major accomplishments during the year are outlined. Some of the planned future activities of the Center are indicated, and recent publications are listed.

  7. A Compilation of Federal Education Laws. Volume III--Higher Education. As Amended through December 31, 1982. Prepared for the Use of the Committee on Education and Labor. House of Representatives, Ninety-Eighth Congress, First Session.

    Science.gov (United States)

    Congress of the U.S., Washington, DC. House Committee on Education and Labor.

    A compilation of federal legislation that pertains to higher education is presented. The sections of this volume cover the general higher education programs, Native American higher education, museums, arts and humanities, and the National Science Foundation. The text of the Higher Education Act of 1965 is presented, covering Titles I-XIII. The…

  8. National Hydrogen Roadmap Workshop Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-04-01

    This document summarizes the presentations and suggestions put forth by officials, industry experts and policymakers in their efforts to come together to develop a roadmap for America's clean energy future and outline the key barriers and needs to achieve the hydrogen vision. The National Hydrogen Roadmap Workshop was held April 2-3, 2002. These proceedings were compiled into a formal report, The National Hydrogen Energy Roadmap, which is also available online.

  9. H11174: NOS Hydrographic Survey, Gray's Reef National Marine Sanctuary, Georgia, 2002-05-24

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Oceanic and Atmospheric Administration (NOAA) has the statutory mandate to collect hydrographic data in support of nautical chart compilation for safe...

  10. H11055: NOS Hydrographic Survey, Gray's Reef National Marine Sanctuary, Georgia, 2002-07-04

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Oceanic and Atmospheric Administration (NOAA) has the statutory mandate to collect hydrographic data in support of nautical chart compilation for safe...

  11. H10107: NOS Hydrographic Survey, Grays Reef National Marine Sanctuary Brunswick, Georgia, 1983-08-08

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Oceanic and Atmospheric Administration (NOAA) has the statutory mandate to collect hydrographic data in support of nautical chart compilation for safe...

  12. H12382: NOS Hydrographic Survey, Florida Keys National Marine Sanctuary, Florida, 2011-11-15

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Oceanic and Atmospheric Administration (NOAA) has the statutory mandate to collect hydrographic data in support of nautical chart compilation for safe...

  13. H11086: NOS Hydrographic Survey, Olympic Coast National Marine Sanctuary, Washington, 2001-10-24

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Oceanic and Atmospheric Administration (NOAA) has the statutory mandate to collect hydrographic data in support of nautical chart compilation for safe...

  14. H12377: NOS Hydrographic Survey, Florida Keys National Marine Sanctuary, Florida, 2011-11-15

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Oceanic and Atmospheric Administration (NOAA) has the statutory mandate to collect hydrographic data in support of nautical chart compilation for safe...

  15. Kanuti National Wildlife Refuge Water Resource Inventory and Assessment (WRIA), Appendices, and geoPDF

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The Water Resource Inventory and Assessment (WRIA) for the Kanuti National Wildlife Refuge (NWR) compiles the results of data mining from national and regional...

  16. H12383: NOS Hydrographic Survey, Florida Keys National Marine Sanctuary, Florida, 2011-11-15

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Oceanic and Atmospheric Administration (NOAA) has the statutory mandate to collect hydrographic data in support of nautical chart compilation for safe...

  17. Ada Compiler Validation Summary Report. Certificate Number: 911003N1. 11222, International Computers Limited VME Ada Compiler VA3,00 ICL Series 39 Level 80

    Science.gov (United States)

    1991-01-01

    statements set forth in this report are accurate and complete, or that the subject implementation has no nonconformities to the Ada Standard other than those...Some of the class B tests contain legal Ada code which must not be flagged illegal by the compiler. This behaviour is also verified. Class L tests check...implementation’s behaviour should be graded as passed because the implementation passed the integer and fixed-point checks; the following REPORT.FAILED messages

  18. National Land Cover Database 2001 Version 2: 1985-2006

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The National Land Cover Database 2001 Version 2 (NLCD 2001 Version 2) is being compiled across all 50 states and Puerto Rico as a cooperative mapping effort of the...

  19. Alaska Maritime National Wildlife Refuge - Bering Sea unit contaminant assessment

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The purpose of the Contaminant Assessment Process (CAP) is to compile and summarize known past, present, and potential contaminant issues on National Wildlife...

  20. An ECMA-55 Minimal BASIC Compiler for x86-64 Linux®

    Directory of Open Access Journals (Sweden)

    John Gatewood Ham

    2014-10-01

    Full Text Available This paper describes a new non-optimizing compiler for the ECMA-55 Minimal BASIC language that generates x86-64 assembler code for use on the x86-64 Linux® [1] 3.x platform. The compiler was implemented in C99 and the generated assembly language is in the AT&T style and is for the GNU assembler. The generated code is stand-alone and does not require any shared libraries to run, since it makes system calls to the Linux® kernel directly. The floating point math uses the Single Instruction Multiple Data (SIMD) instructions and the compiler fully implements all of the floating point exception handling required by the ECMA-55 standard. This compiler is designed to be small, simple, and easy to understand for people who want to study a compiler that actually implements full error checking on floating point on x86-64 CPUs, even if those people have little programming experience. The generated assembly code is also designed to be simple to read.
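    The flavor of non-optimizing code generation can be sketched with a toy: translating an arithmetic expression (here already in reverse Polish order) into AT&T-syntax x86-64 stack code, the way a simple compiler might. This is an illustration only; the paper's compiler is a full Minimal BASIC implementation with SIMD floating point and exception handling, far beyond this integer-only sketch:

```python
def gen_expr(tokens):
    """Emit AT&T-syntax x86-64 lines for an RPN token list, e.g.
    ['2', '3', '+']; the result is left in %rax. Integer-only toy."""
    asm = []
    ops = {"+": "addq", "-": "subq", "*": "imulq"}
    for tok in tokens:
        if tok in ops:
            asm += ["    popq %rbx",                  # right operand
                    "    popq %rax",                  # left operand
                    f"    {ops[tok]} %rbx, %rax",     # rax = rax OP rbx
                    "    pushq %rax"]
        else:
            asm.append(f"    pushq ${int(tok)}")      # integer literal
    asm.append("    popq %rax")                       # final result
    return asm
```

    A non-optimizing compiler accepts the redundant push/pop traffic in exchange for a code generator simple enough to read, which is exactly the trade-off the paper makes.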

  1. Using MaxCompiler for High Level Synthesis of Trigger Algorithms

    CERN Document Server

    Summers, Sioni Paris; Sanders, P.

    2016-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.
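    The mathematics iterated by such a track-fit design can be illustrated with a textbook scalar Kalman filter. This is a stand-in only: the CMS design is a fixed-point FPGA dataflow over helix parameters, while the sketch below estimates a single nearly-constant state from noisy measurements with invented noise values:

```python
def kalman_1d(measurements, meas_var, process_var=0.0, x0=0.0, p0=1e6):
    """Scalar Kalman filter: estimate a (nearly) constant state.
    x is the state estimate, p its variance; each measurement z updates both."""
    x, p = x0, p0
    for z in measurements:
        p += process_var              # predict: state uncertainty grows
        k = p / (p + meas_var)        # Kalman gain
        x += k * (z - x)              # update with the innovation (z - x)
        p *= (1.0 - k)                # uncertainty shrinks after the update
    return x, p

# Hypothetical noisy measurements of a true value of 1.0.
x, p = kalman_1d([1.1, 0.9, 1.05, 0.95], meas_var=0.04)
```

    Each loop iteration corresponds to one filter step; in the FPGA design each such step costs the quoted 187.5 ns of latency.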

  2. Towards droplet size-aware biochemical application compilation for AM-EWOD biochips

    DEFF Research Database (Denmark)

    Pop, Paul; Alistar, Mirela

    2015-01-01

    , but as discrete droplets on an array of electrodes. Microfluidic operations, such as transport, mixing, split, are performed on this array by routing the corresponding droplets on a series of electrodes. Several approaches have been proposed for the compilation of digital microfluidic biochips, which, starting...... from a biochemical application and a given biochip architecture, determine the allocation, resource binding, scheduling, placement and routing of the operations in the application. To simplify the compilation problem, researchers have assumed an abstract droplet size of one electrode. However......, the droplet size abstraction is not realistic and it impacts negatively the execution of the biochemical application, leading in most cases to its failure. Hence the existing compilation approaches have to be revisited to consider the size of the droplets. In this paper we take the first step towards...
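    Why droplet size matters to routing can be shown with a toy grid search. The grid, wall, and sizes below are invented: a droplet whose footprint covers size x size electrodes needs a wider corridor than a unit droplet, so a route that exists at the abstract one-electrode size can vanish at the real size, which is the failure mode the paper describes:

```python
from collections import deque

def route(grid_w, grid_h, size, start, goal, blocked):
    """BFS shortest route (number of moves) for a square droplet whose
    footprint covers size x size electrodes; positions are top-left corners.
    Returns None when no feasible route exists at this droplet size."""
    def ok(x, y):
        if not (0 <= x <= grid_w - size and 0 <= y <= grid_h - size):
            return False
        return all((x + dx, y + dy) not in blocked
                   for dx in range(size) for dy in range(size))
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        (x, y), dist = frontier.popleft()
        if (x, y) == goal:
            return dist
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if ok(nx, ny) and (nx, ny) not in seen:
                seen.add((nx, ny))
                frontier.append(((nx, ny), dist + 1))
    return None

# A 6x6 array with a wall at column 2, leaving a one-electrode gap at y = 5.
blocked = {(2, y) for y in range(5)}
```

    A unit droplet can squeeze through the gap, but a 2x2 droplet cannot, so a size-oblivious compilation would schedule an operation whose droplet can never arrive.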

  3. Compilation of a preliminary checklist for the differential diagnosis of neurogenic stuttering

    Directory of Open Access Journals (Sweden)

    Mariska Lundie

    2014-06-01

    Objectives: The aim of this study was to describe and highlight the characteristics of neurogenic stuttering (NS) in order to compile a preliminary checklist for accurate diagnosis and intervention. Method: An explorative, applied, mixed-method, multiple-case-study research design was followed. Purposive sampling was used to select four participants. A comprehensive assessment battery was compiled for data collection. Results: The results revealed a distinct pattern of core stuttering behaviours in NS, although discrepancies existed regarding stuttering severity and frequency. It was also found that developmental stuttering (DS) and NS can co-occur. The case history and the core stuttering pattern are important considerations during differential diagnosis, as these are the only consistent characteristics in people with NS. Conclusion: It is unlikely that all the symptoms of NS are present in an individual. The researchers scrutinised the findings of this study and the findings of previous literature to compile a potentially workable checklist.

  4. Wide Band-Gap Semiconductor Radiation Detectors: Science Fiction, Horror Story, or Headlines (460th Brookhaven Lecture)

    Energy Technology Data Exchange (ETDEWEB)

    James, Ralph (BNL Nonproliferation and National Security Department)

    2010-08-18

    With radiation constantly occurring from natural sources all around us -- from food, building materials, and rays from the sun, to name a few -- detecting radiotracers for medical procedures and other radiation to keep people safe is not easy. In order to make better use of radiation to diagnose or treat certain health conditions, or to track radiological materials being transported, stored, and used, the quest is on to develop improved radiation detectors. James gives a brief introduction to radiation detection and explains how it is used in applications ranging from medicine to homeland security. He then discusses how new materials, and better ways to analyze them here at the National Synchrotron Light Source (NSLS) and the future NSLS-II, will lead to a new class of radiation detectors that will provide unprecedented advances in medical and industrial imaging, basic science, and the nonproliferation of nuclear materials.

  5. Languages, compilers and run-time environments for distributed memory machines

    CERN Document Server

    Saltz, J

    1992-01-01

    Papers presented within this volume cover a wide range of topics related to programming distributed memory machines. Distributed memory architectures, although having the potential to supply the very high levels of performance required to support future computing needs, present awkward programming problems. The major issue is to design methods which enable compilers to generate efficient distributed memory programs from relatively machine independent program specifications. This book is a compilation of papers describing a wide range of research efforts aimed at easing the task of programming

  6. DrawCompileEvolve: Sparking interactive evolutionary art with human creations

    DEFF Research Database (Denmark)

    Zhang, Jinhong; Taarnby, Rasmus; Liapis, Antonios

    2015-01-01

    Early results in this paper show the potential of DrawCompileEvolve to jumpstart evolutionary art with meaningful drawings as well as the power of the underlying genetic representation to transform the user's initial drawing into a different, yet potentially meaningful, artistic rendering, which can be evolved interactively, allowing the user to change the image's colors, patterns and ultimately transform it. The human artist has direct control while drawing the initial seed of an evolutionary run and indirect control while interactively evolving it, thus making DrawCompileEvolve a mixed-initiative art tool.

  7. Self-diffusion in electrolyte solutions a critical examination of data compiled from the literature

    CERN Document Server

    Mills, R

    1989-01-01

    This compilation - the first of its kind - fills a real gap in the field of electrolyte data. Virtually all self-diffusion data in electrolyte solutions as reported in the literature have been examined and the book contains over 400 tables covering diffusion in binary and ternary aqueous solutions, in mixed solvents, and of non-electrolytes in various solvents.An important feature of the compilation is that all data have been critically examined and their accuracy assessed. Other features are an introductory chapter in which the methods of measurement are reviewed; appendices containing tables

  8. Data compilation of respiration, feeding, and growth rates of marine pelagic organisms

    DEFF Research Database (Denmark)

    2013-01-01

    's adaptation to the environment, with consequently less universal mass scaling properties. Data on body mass, maximum ingestion and clearance rates, respiration rates and maximum growth rates of animals living in the ocean epipelagic were compiled from the literature, mainly from original papers but also from...... previous compilations by other authors. Data were read from tables or digitized from graphs. Only measurements made on individuals of known size, or groups of individuals of similar and known size, were included. We show that clearance and respiration rates have life-form-dependent allometries that have

  9. Compiling Dictionaries

    African Journals Online (AJOL)

    Information Technology

    Abstract: The task of providing dictionaries for all the world's languages is ..... What words refer to looking at something in order to learn? watch, scrutinize .... translated into French, Spanish, Chinese, and other major languages of the world.

  10. Embedded Compilers

    NARCIS (Netherlands)

    Baars, A.I.

    2009-01-01

    For automation it is important to express the knowledge of the experts in a form that is understood by a computer. Each area of knowledge has its own terminology and ways of formulating things; be it by drawing diagrams, using formulae, or using formalized languages. In the last case we say we have

  11. Compiling Dictionaries

    African Journals Online (AJOL)

    Information Technology

    universals of human experience and universals of linguistic competence, there are striking simi- larities in various lists of semantic domains developed for languages around the world. Using a standardized list of domains to classify multiple dictionaries opens up possibilities for cross-lin- guistic research into semantic and ...

  12. Internal combustion engines for alcohol motor fuels: a compilation of background technical information

    Energy Technology Data Exchange (ETDEWEB)

    Blaser, Richard

    1980-11-01

    This compilation, a draft training manual containing technical background information on internal combustion engines and alcohol motor fuel technologies, is presented in three parts. The first compiles state-of-the-art facts on internal combustion engine fuels, their characteristics and requirements, and provides an overview of fuel sources, fuels technology, and future projections for availability and alternatives. Part two compiles facts about alcohol chemistry, identification, production, and use; examines ethanol as spirit and as fuel; and provides an overview of modern evaluations of alcohols as motor fuels and of the characteristics of alcohol fuels. The final section compiles cross references on the handling and combustion of fuels for I.C. engines, presents basic evaluations of the events leading to the use of alcohols as motor fuels, reviews current applications of alcohols as motor fuels, describes the formulation of alcohol fuels for engines and the engine and fuel-handling hardware modifications needed to use them, and introduces the multifuel engine concept. (LCL)

  13. Face to Face : Interaction with an Intelligent Virtual Agent: The Effect on Learning Tactical Picture Compilation

    NARCIS (Netherlands)

    Doesburg, W.A. van; Looije, R.; Melder, W.A.; Neerincx, M.A.

    2008-01-01

    Learning a process control task, such as tactical picture compilation in the Navy, is difficult, because the students have to spend their limited cognitive resources both on the complex task itself and the act of learning. In addition to the resource limits, motivation can be reduced when learning

  14. Regulatory and technical reports (abstract index journal): Annual compilation for 1997. Volume 22, Number 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-04-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors; proceedings of conferences and workshops; as well as international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility.

  15. Within ARM's reach: compilation of left-linear rewrite systems via minimal rewrite systems

    NARCIS (Netherlands)

    W.J. Fokkink (Wan); J.F.T. Kamperman; H.R. Walters (Pum)

    1997-01-01

    A new compilation technique for left-linear term rewriting systems is presented, where rewrite rules are transformed into so-called minimal rewrite rules. These minimal rules have such a simple form that they can be viewed as instructions for an abstract rewriting machine (ARM).
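    The left-linear rewriting being compiled can be illustrated with a tiny interpreter. The rules below (Peano addition) and the term encoding are standard textbook material, not the paper's minimal-rule form; the point of left-linearity is visible in the matcher, which never needs an equality test because no variable occurs twice in a left-hand side:

```python
# Terms are tuples: ('add', ('s', ('0',)), ('0',)). Pattern variables are
# strings starting with '?'. Left-linear rules for Peano addition:
RULES = [
    (("add", ("0",), "?y"), "?y"),
    (("add", ("s", "?x"), "?y"), ("s", ("add", "?x", "?y"))),
]

def match(pat, term, env):
    """Bind pattern variables in env; left-linearity means each variable
    is bound at most once, so no consistency check is needed."""
    if isinstance(pat, str) and pat.startswith("?"):
        env[pat] = term
        return True
    if isinstance(pat, tuple) and isinstance(term, tuple) and len(pat) == len(term):
        return pat[0] == term[0] and all(
            match(p, t, env) for p, t in zip(pat[1:], term[1:]))
    return pat == term

def subst(tmpl, env):
    """Instantiate a right-hand side template under the bindings in env."""
    if isinstance(tmpl, str) and tmpl.startswith("?"):
        return env[tmpl]
    if isinstance(tmpl, tuple):
        return (tmpl[0],) + tuple(subst(c, env) for c in tmpl[1:])
    return tmpl

def normalize(t):
    """Innermost rewriting to normal form: children first, then the root."""
    if isinstance(t, tuple):
        t = (t[0],) + tuple(normalize(c) for c in t[1:])
    for lhs, rhs in RULES:
        env = {}
        if match(lhs, t, env):
            return normalize(subst(rhs, env))
    return t
```

    The ARM idea is to compile away the general matcher above into straight-line instructions per minimal rule; the sketch shows only the semantics being preserved.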

  16. 29 CFR 1.3 - Obtaining and compiling wage rate information.

    Science.gov (United States)

    2010-07-01

    ... Obtaining and compiling wage rate information. For the purpose of making wage determinations, the... prevailing wage requirements. (b) The following types of information may be considered in making wage rate... system is to be performed shall be consulted. Before making a determination of wage rates for such a...

  17. Compilation of properties data for Li{sub 2}TiO{sub 3}

    Energy Technology Data Exchange (ETDEWEB)

    Roux, N. [CEA Centre d'Etudes de Saclay, 91 - Gif-sur-Yvette (France)]

    1998-03-01

    Properties data obtained at CEA for Li{sub 2}TiO{sub 3} are reported. The compilation includes : stability of Li{sub 2}TiO{sub 3} {beta} phase, specific heat, thermal diffusivity, thermal conductivity, linear thermal expansion, thermal creep, interaction with water and acid. (author)

  18. Digital compilation bedrock geologic map of the Mt. Ellen quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-6A Stanley, RS, Walsh, G, Tauvers, PR, DiPietro, JA, and DelloRusso, V, 1995, Digital compilation bedrock geologic map of the Mt. Ellen...

  19. A pragmatic approach to the analysis and compilation of lazy functional languages

    NARCIS (Netherlands)

    Glaser, H.; Hartel, Pieter H.; Wild, J.M.; Boyanov, K.

    The aim of the FAST Project is to provide an implementation of a functional language, Haskell, on a transputer array. An important component of the system is a highly optimising compiler for Haskell to a single transputer. This paper presents a methodology for describing the optimisations and code

  20. Compiling for Novel Scratch Pad Memory based Multicore Architectures for Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shrivastava, Aviral

    2016-02-05

    The objective of this proposal is to develop tools and techniques (in the compiler) to manage data of a task and communication among tasks on the scratch pad memory (SPM) of the core, so that any application (a set of tasks) can be executed efficiently on an SPM based manycore architecture.
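    What such compiler-managed data movement amounts to can be sketched as a toy simulation in Python (all names here are hypothetical; real SPM management is emitted by the compiler as C and DMA code): before a task runs, its data blocks are staged into a small scratchpad, evicting the least-recently-used blocks when space runs out, and the number of transfers is the cost to minimize.

```python
# Toy model of a software-managed scratch pad memory (SPM).
from collections import OrderedDict

class Scratchpad:
    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.blocks = OrderedDict()   # block id -> data, in LRU order
        self.transfers = 0            # staged copies (the cost to minimize)

    def stage(self, block_id, main_memory):
        if block_id in self.blocks:               # already resident: reuse
            self.blocks.move_to_end(block_id)
            return self.blocks[block_id]
        if len(self.blocks) >= self.capacity:     # evict the LRU block
            self.blocks.popitem(last=False)
        self.blocks[block_id] = main_memory[block_id]
        self.transfers += 1
        return self.blocks[block_id]

def run_tasks(schedule, main_memory, spm):
    """Run tasks in order; each task reads its blocks through the SPM."""
    results = []
    for task, needed in schedule:
        data = [spm.stage(b, main_memory) for b in needed]
        results.append((task, sum(data)))
    return results

mem = {"a": 1, "b": 2, "c": 3}
spm = Scratchpad(capacity_blocks=2)
out = run_tasks([("t1", ["a", "b"]), ("t2", ["b", "c"])], mem, spm)
print(out, spm.transfers)  # [('t1', 3), ('t2', 5)] 3
```

Reordering the schedule or choosing which blocks to keep resident changes `transfers`; that placement decision is exactly what an SPM-aware compiler optimizes.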

  1. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seyong [ORNL; Vetter, Jeffrey S [ORNL

    2014-01-01

    Directive-based accelerator programming models such as OpenACC have arisen as an alternative solution for programming emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity of SHC systems poses several challenges in terms of portability and productivity. This paper presents an open-source OpenACC compiler, called OpenARC, which serves as an extensible research framework for addressing those issues in directive-based accelerator programming. The paper explains the important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, it demonstrates the efficacy of OpenARC as a research framework for directive-based programming studies by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of unified and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler while serving as a high-level research framework.
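    The first job of any such source-to-source compiler is to recognize the directives themselves. A toy sketch in Python (illustrative only, far simpler than OpenARC's real front end) parses an OpenACC-style pragma line into its directive and clause structure:

```python
# Minimal parser for pragmas of the form:
#   #pragma acc <directive words> clause1(args) clause2(args) ...
import re

def parse_acc_pragma(line):
    if "#pragma acc" not in line:
        raise ValueError("not an OpenACC pragma")
    body = line.split("#pragma acc", 1)[1].strip()
    # clause(args) pairs; arguments are comma-separated identifiers
    raw = re.findall(r"(\w+)\(([^)]*)\)", body)
    # the directive is everything before the first clause
    directive = re.split(r"\w+\(", body, maxsplit=1)[0].strip()
    clauses = {name: [a.strip() for a in args.split(",")] for name, args in raw}
    return directive, clauses

d, c = parse_acc_pragma("#pragma acc parallel loop copyin(a,b) copyout(c)")
print(d)  # parallel loop
print(c)  # {'copyin': ['a', 'b'], 'copyout': ['c']}
```

A real compiler would then attach this clause structure to the following loop nest and drive code generation from it; the sketch only covers simple comma-separated clause arguments, not array sections like `a[0:n]`.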

  2. "BLAST": A compilation of codes for the numerical simulation of the gas dynamics of explosions

    NARCIS (Netherlands)

    Berg, A.C. van den

    2009-01-01

    The availability of powerful computers these days increasingly enables the use of CFD for the numerical simulation of explosion phenomena. The BLAST software consists of a compilation of codes for the numerical simulation of the gas dynamics of explosions. Each individual code has been tailored to a

  3. Compilation of 3D global conductivity model of the Earth for space weather applications

    Science.gov (United States)

    Alekseev, Dmitry; Kuvshinov, Alexey; Palshin, Nikolay

    2015-07-01

    We have compiled a global three-dimensional (3D) conductivity model of the Earth, with the ultimate goal of enabling realistic simulation of geomagnetically induced currents (GIC), which pose a potential threat to man-made electric systems. Bearing in mind the intrinsic frequency range of the most intense disturbances (magnetospheric substorms), with typical periods ranging from a few minutes to a few hours, the compiled 3D model represents the structure in the depth range of 0-100 km, including seawater, sediments, the Earth's crust, and partly the lithosphere/asthenosphere. More explicitly, the model consists of a series of spherical layers whose vertical and lateral boundaries are established based on available data. To compile the model, global maps of bathymetry, sediment thickness, upper and lower crust thicknesses, and lithosphere thickness are utilized. All maps are re-interpolated on a common grid of 0.25×0.25 degree lateral spacing. Once the geometry of the different structures is specified, each element of the structure is assigned either a certain conductivity value or a conductivity-versus-depth distribution, according to available laboratory data and conversion laws. A numerical formalism developed for compilation of the model allows for its further refinement by incorporating regional 3D conductivity distributions inferred from real electromagnetic data. So far we have included in our model four regional conductivity models available from recent publications, namely a surface conductance model of Russia and 3D conductivity models of Fennoscandia, Australia, and the northwest of the United States.
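    The layer-assembly step described above can be illustrated with a minimal Python sketch (the conductivity values below are assumed for illustration and are not those of the published model): each surface cell gets a conductivity-versus-depth column built from the local thicknesses of the layers stacked at that location.

```python
# Schematic layered-conductivity column for one grid cell.
LAYER_SIGMA = {          # assumed conductivities in S/m, for illustration
    "seawater": 3.2,
    "sediments": 0.1,
    "crust": 0.001,
    "lithosphere": 0.0001,
}

def conductivity_column(thicknesses_km):
    """Return (top_km, bottom_km, sigma) tuples for one grid cell."""
    column, top = [], 0.0
    for layer in ("seawater", "sediments", "crust", "lithosphere"):
        h = thicknesses_km.get(layer, 0.0)
        if h > 0:
            column.append((top, top + h, LAYER_SIGMA[layer]))
            top += h
    return column

# One oceanic cell: 4 km of water, 1 km of sediments, 7 km of crust.
cell = conductivity_column({"seawater": 4, "sediments": 1, "crust": 7})
print(cell)  # [(0.0, 4.0, 3.2), (4.0, 5.0, 0.1), (5.0, 12.0, 0.001)]
```

Running this over every node of a 0.25×0.25 degree grid, with thicknesses read from the bathymetry, sediment, and crust maps, yields the kind of layered global model the abstract describes.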

  4. Digital compilation bedrock geologic map of the South Mountain quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-3A Stanley, R.S., DelloRusso, V., Tauvers, P.R., DiPietro, J.A., Taylor, S., and Prahl, C., 1995, Digital compilation bedrock geologic map of...

  5. Mentoring in Early Childhood Education: A Compilation of Thinking, Pedagogy and Practice

    Science.gov (United States)

    Murphy, Caterina, Ed.; Thornton, Kate, Ed.

    2015-01-01

    Mentoring is a fundamental and increasingly important part of professional learning and development for teachers in Aotearoa New Zealand. This book is a much-needed resource for mentors, leaders and teachers in early childhood education. It is the first of its kind: a wide ranging compilation that explores the thinking, pedagogy and practice of…

  6. Face to face interaction with an intelligent virtual agent: The effect on learning tactical picture compilation

    NARCIS (Netherlands)

    Looije, R.; Melder, W.A.; Neerincx, M.A.

    2008-01-01

    Learning a process control task, such as tactical picture compilation in the Navy, is difficult, because the students have to spend their limited cognitive resources both on the complex task itself and the act of learning. In addition to the resource limits, motivation can be reduced when learning

  7. On the problem of synopsis articles in the compilation of ...

    African Journals Online (AJOL)

    The multilingual and multicultural character of Gabon results from its linguistic and sociolinguistic environment, which must be taken into account in the compilation of dictionaries. Compared with simple articles, synopsis articles offer a better approach to the transfer of linguistic and cultural data in the ...

  8. Towards behavioral synthesis of asynchronous circuits - an implementation template targeting syntax directed compilation

    DEFF Research Database (Denmark)

    Nielsen, Sune Fallgaard; Sparsø, Jens; Madsen, Jan

    2004-01-01

    domain by introducing a computation model, which resembles the synchronous datapath and control architecture, but which is completely asynchronous. The datapath and control architecture is then expressed in the Balsa-language, and using syntax directed compilation a corresponding handshake circuit...

  9. THE COMPILATION OF CHINA’S INTERREGIONAL INPUT–OUTPUT MODEL 2002

    NARCIS (Netherlands)

    Zhang, Zhuoying; Shi, Minjun; Zhao, Zhao

    2015-01-01

    The increasing economic interaction among various regions in China makes the construction of an interregional input–output table relevant for economic studies. This paper elaborates the model compilation procedure of the China Interregional Input–output model 2002. The key features of the model
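    The computational core of any input-output model is the Leontief system (I - A)x = f: given the matrix A of technical coefficients and final demand f, gross output x must cover both intermediate use Ax and final demand. A toy two-region example in Python (the numbers are invented, not from the 2002 China table):

```python
# Toy two-region Leontief model: solve (I - A) x = f.

def solve_2x2(M, f):
    """Solve M @ x = f for a 2x2 system by Cramer's rule."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [(f[0] * d - b * f[1]) / det, (a * f[1] - f[0] * c) / det]

A = [[0.2, 0.1],   # inputs from region 1 per unit of each region's output
     [0.3, 0.4]]   # inputs from region 2
I_minus_A = [[1 - 0.2, -0.1], [-0.3, 1 - 0.4]]
f = [100.0, 200.0]             # final demand by region
x = solve_2x2(I_minus_A, f)
print([round(v, 1) for v in x])  # [177.8, 422.2]
```

Checking the balance confirms the solution: for region 1, 0.2·177.8 + 0.1·422.2 + 100 ≈ 177.8. Compiling an interregional table is precisely the work of estimating the off-diagonal blocks of A, i.e. how much each region's production draws on the others.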

  10. Compose*: a Language- and Platform-Independent Aspect Compiler for Composition Filters

    NARCIS (Netherlands)

    de Roo, Arjan; Hendriks, M.F.H.; Havinga, W.K.; Durr, P.E.A.; Bergmans, Lodewijk

    2008-01-01

    This paper presents Compose*, a compilation and execution framework for the Composition Filters model. The Composition Filters model is designed to improve the composability of object-based programs. It is claimed that this approach is largely language-independent, and has previously been applied to

  11. Java-through-C Compilation: An Enabling Technology for Java in Embedded Systems

    National Research Council Canada - National Science Library

    Varma, Ankush; Bhattacharyya, Shuvra S

    2004-01-01

    .... In addition, they require that a JVM be ported to each such platform. We demonstrate the first Java-to-C compilation strategy that is suitable for a wide range of embedded systems, thereby enabling broad use of Java on embedded platforms...

  12. Compilation of Spectroscopic Data of Radium (Ra I and Ra II)

    NARCIS (Netherlands)

    Dammalapati, U.; Jungmann, K.; Willmann, L.

    2016-01-01

    Energy levels, wavelengths, lifetimes, and hyperfine structure constants for the isotopes of the first and second spectra of radium, Ra I and Ra II, have been compiled. Wavelengths and wavenumbers are tabulated for 226Ra and for other Ra isotopes. Isotope shifts and hyperfine structure constants of

  13. A data structure for describing sampling designs to aid in compilation of stand attributes

    Science.gov (United States)

    John C. Byrne; Albert R. Stage

    1988-01-01

    Maintaining permanent plot data with different sampling designs over long periods within an organization, and sharing such information between organizations, requires that common standards be used. A data structure for the description of the sampling design within a stand is proposed. It is composed of just those variables and their relationships needed to compile...
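    The idea of describing a design with just the variables needed for compilation can be sketched in Python (the field names are hypothetical): each design type supplies a per-tree expansion factor, so the code that compiles stand attributes never needs to know which organization's design produced the data.

```python
# Minimal sketch of sampling-design descriptions that compile alike.
import math
from dataclasses import dataclass

@dataclass
class FixedRadiusDesign:
    plot_radius_m: float
    def trees_per_ha(self, tree_dbh_cm):
        # every tree on a fixed-area plot represents the same density
        plot_area_ha = math.pi * self.plot_radius_m ** 2 / 10_000
        return 1.0 / plot_area_ha

@dataclass
class VariableRadiusDesign:
    baf: float  # basal area factor, m^2/ha per tallied tree
    def trees_per_ha(self, tree_dbh_cm):
        # a prism tally weights each tree by its basal area
        basal_area_m2 = math.pi * (tree_dbh_cm / 200) ** 2
        return self.baf / basal_area_m2

# Compiling a stand attribute (here, density) is design-independent:
def stand_density(design, dbh_list_cm):
    return sum(design.trees_per_ha(d) for d in dbh_list_cm)

fixed = FixedRadiusDesign(plot_radius_m=10.0)
print(round(stand_density(fixed, [25, 30, 40]), 1))  # 95.5
```

Sharing data between organizations then reduces to exchanging these small design records alongside the tree measurements.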

  14. Some measurements of Java-to-bytecode compiler performance in the Java Virtual Machine

    OpenAIRE

    Daly, Charles; Horgan, Jane; Power, James; Waldron, John

    2001-01-01

    In this paper we present a platform independent analysis of the dynamic profiles of Java programs when executing on the Java Virtual Machine. The Java programs selected are taken from the Java Grande Forum benchmark suite, and five different Java-to-bytecode compilers are analysed. The results presented describe the dynamic instruction usage frequencies.
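    A static analogue of such an opcode census can be run on Python's own bytecode with the standard dis module (this counts static occurrences within one function, not the dynamic execution frequencies the paper measures on the JVM):

```python
# Count bytecode opcodes in a function using the standard library.
import dis
from collections import Counter

def opcode_frequencies(func):
    """Count how often each bytecode opcode appears in a function."""
    return Counter(instr.opname for instr in dis.get_instructions(func))

def dot(xs, ys):
    total = 0.0
    for x, y in zip(xs, ys):
        total += x * y
    return total

freq = opcode_frequencies(dot)
print(freq.most_common(3))
```

A dynamic profile of the kind described in the paper would instead count each opcode every time it executes, which requires an instrumented virtual machine rather than a static disassembly.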

  15. Ada (Trade Name) Compiler Validation Summary Report: Alsys Inc., AlsyCOMP 003, V3.1, Wang PC 280.

    Science.gov (United States)

    1988-06-04

    constructs and that it identifies and rejects illegal language constructs. The testing also identifies behaviour that is implementation dependent but... test for which the compiler generates a result that demonstrates nonconformity to the Ada Standard... One of the purposes of validating compilers is to determine the behaviour of a compiler in those areas of the...

  16. ZettaBricks: A Language Compiler and Runtime System for Anyscale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Amarasinghe, Saman [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2015-03-27

    This grant supported the ZettaBricks and OpenTuner projects. ZettaBricks is a new implicitly parallel language and compiler where defining multiple implementations of multiple algorithms to solve a problem is the natural way of programming. ZettaBricks makes algorithmic choice a first-class construct of the language. Choices are provided in a way that also allows our compiler to tune at a finer granularity. The ZettaBricks compiler autotunes programs by making both fine-grained and algorithmic choices. Choices also include different automatic parallelization techniques, data distributions, algorithmic parameters, transformations, and blocking. Additionally, ZettaBricks introduces novel techniques to autotune algorithms for different convergence criteria. When choosing between various direct and iterative methods, the ZettaBricks compiler is able to tune a program so that it delivers near-optimal efficiency for any desired level of accuracy. The compiler has the flexibility of utilizing different convergence criteria for the various components within a single algorithm, providing the user with accuracy choice alongside algorithmic choice. OpenTuner is a generalization of the experience gained in building an autotuner for ZettaBricks. OpenTuner is a new open-source framework for building domain-specific multi-objective program autotuners. OpenTuner supports fully customizable configuration representations, an extensible technique representation to allow for domain-specific techniques, and an easy-to-use interface for communicating with the program to be autotuned. A key capability inside OpenTuner is the use of ensembles of disparate search techniques simultaneously; techniques that perform well will dynamically be allocated a larger proportion of tests.
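    The algorithmic-choice tuning described above can be caricatured in a few lines of Python (invented candidate algorithms and a naive timing loop, not ZettaBricks or OpenTuner themselves): time each algorithmic choice on a training input and keep the fastest.

```python
# Naive autotuner: pick the fastest of several algorithmic choices.
import time

def insertion_sort(xs):
    xs = list(xs)
    for i in range(1, len(xs)):
        j, key = i, xs[i]
        while j > 0 and xs[j - 1] > key:
            xs[j] = xs[j - 1]
            j -= 1
        xs[j] = key
    return xs

CANDIDATES = {"insertion": insertion_sort, "builtin": sorted}

def time_one(fn, data):
    start = time.perf_counter()
    fn(data)
    return time.perf_counter() - start

def autotune(candidates, training_input, repeats=3):
    timings = {}
    for name, fn in candidates.items():
        timings[name] = min(time_one(fn, training_input) for _ in range(repeats))
    return min(timings, key=timings.get), timings

data = list(range(2000, 0, -1))          # reverse-sorted training input
choice, timings = autotune(CANDIDATES, data)
print(choice)  # expected: "builtin" on an input this large
```

A real autotuner searches a vastly larger configuration space (parallelization strategies, blocking, accuracy criteria) and, as in OpenTuner, runs an ensemble of search techniques rather than exhaustive timing.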

  17. Compilation of Japanese Basic Verb Usage Handbook for JFL Learners: A Project Report

    Directory of Open Access Journals (Sweden)

    Prashant PARDESHI

    2012-10-01

    In this article we introduce a collaborative research project entitled 'Nihongogakushuushayou kihondoushi youhouhandbook no sakusei' (Compilation of a Japanese Basic Verb Usage Handbook for Japanese as a Foreign Language (JFL) Learners), carried out at the National Institute for Japanese Language and Linguistics (NINJAL), and report on the progress of its research product, namely a prototype of a basic verb usage handbook (referred to as the 'handbook' below). The handbook differs in many ways from the conventional printed or electronic dictionaries available at present. First, the handbook is compiled online and will be made available on the internet for free access. Second, the handbook is corpus-based: the contents of each entry are written taking into consideration the actual use of the headword in the BCCWJ corpus, and entries contain illustrative examples of a particular meaning culled from the BCCWJ corpus as well as examples coined by the entry writers. Third, the framework used in the description of semantic issues (the polysemy network, the cognitive mechanisms underlying semantic extensions, the semantic relationships among various meanings, etc.) is cognitive grammar, which adopts a prototype approach. Fourth, it includes audio-visual content (audio files, animations, video clips, etc.) for effective understanding, acquisition and retention of the various meanings of a polysemous verb. Fifth, the handbook is bilingual (Japanese-Chinese, Japanese-Korean and Japanese-Marathi) and incorporates insights from contrastive studies and second language acquisition. The handbook is an attempt to share cutting-edge research insights from various branches of linguistics with Japanese language pedagogy. It is hoped that the handbook will prove useful for JFL learners as well as Japanese language teachers across the globe.

  18. Compilation of Japanese Basic Verb Usage Handbook for JFL Learners: A Project Report

    Directory of Open Access Journals (Sweden)

    PARDESHI, Prashant

    2012-10-01

    In this article we introduce a collaborative research project entitled 'Nihongogakushuushayou kihondoushi youhouhandbook no sakusei' (Compilation of a Japanese Basic Verb Usage Handbook for Japanese as a Foreign Language (JFL) Learners), carried out at the National Institute for Japanese Language and Linguistics (NINJAL), and report on the progress of its research product, namely a prototype of a basic verb usage handbook (referred to as the 'handbook' below). The handbook differs in many ways from the conventional printed or electronic dictionaries available at present. First, the handbook is compiled online and will be made available on the internet for free access. Second, the handbook is corpus-based: the contents of each entry are written taking into consideration the actual use of the headword in the BCCWJ corpus, and entries contain illustrative examples of a particular meaning culled from the BCCWJ corpus as well as examples coined by the entry writers. Third, the framework used in the description of semantic issues (the polysemy network, the cognitive mechanisms underlying semantic extensions, the semantic relationships among various meanings, etc.) is cognitive grammar, which adopts a prototype approach. Fourth, it includes audio-visual content (audio files, animations, video clips, etc.) for effective understanding, acquisition and retention of the various meanings of a polysemous verb. Fifth, the handbook is bilingual (Japanese-Chinese, Japanese-Korean and Japanese-Marathi) and incorporates insights from contrastive studies and second language acquisition. The handbook is an attempt to share cutting-edge research insights from various branches of linguistics with Japanese language pedagogy. It is hoped that the handbook will prove useful for JFL learners as well as Japanese language teachers across the globe.

  19. H11342: NOS Hydrographic Survey , Flower Garden Banks National Marine Sanctuary, Louisiana and Texas, 2004-05-19

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Oceanic and Atmospheric Administration (NOAA) has the statutory mandate to collect hydrographic data in support of nautical chart compilation for safe...

  20. H11343: NOS Hydrographic Survey , Flower Garden Banks National Marine Sanctuary, Louisiana and Texas, 2004-05-20

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Oceanic and Atmospheric Administration (NOAA) has the statutory mandate to collect hydrographic data in support of nautical chart compilation for safe...