WorldWideScience

Sample records for central computing facility

  1. Fermilab Central Computing Facility: Energy conservation report and mechanical systems design optimization and cost analysis study

    Energy Technology Data Exchange (ETDEWEB)

    Krstulovich, S.F.

    1986-11-12

    This report is developed as part of the Fermilab Central Computing Facility Project Title II Design Documentation Update under the provisions of DOE Document 6430.1, Chapter XIII-21, Section 14, paragraph a. As such, it concentrates primarily on HVAC mechanical systems design optimization and cost analysis and should be considered a supplement to the Title I Design Report dated March 1986, wherein energy-related issues pertaining to building envelope and orientation, as well as electrical systems design, are discussed.

  2. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  4. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  5. Physics Division computer facilities

    Energy Technology Data Exchange (ETDEWEB)

    Cyborski, D.R.; Teh, K.M.

    1995-08-01

    The Physics Division maintains several computer systems for data analysis, general-purpose computing, and word processing. While the VMS VAX clusters are still used, this past year saw a greater shift to the Unix cluster with the addition of more RISC-based Unix workstations. The main Divisional VAX cluster consists of two VAX 3300s configured as a dual-host system that serves as boot node and disk server to seven satellite nodes: two VAXstation 3200s, three VAXstation 3100s, a VAX-11/750, and a MicroVAX II. There are three 6250/1600 bpi 9-track tape drives, six 8-mm tape drives, and about 9.1 GB of disk storage served to the cluster by the various satellites. Two of the satellites (the MicroVAX and the VAX-11/750) have DAPHNE front-end interfaces for data acquisition. Since the tape drives are accessible cluster-wide via a software package, they are used for tape-to-tape copies in addition to replay; there is also a satellite node outfitted with two 8-mm drives available for this purpose. Although not part of the main cluster, a DEC 3000 Alpha machine obtained for data acquisition is also available for data replay. In one case, users reported a performance increase by a factor of 10 when using this machine.

  6. AMRITA -- A computational facility

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, J.E. [California Inst. of Tech., CA (US); Quirk, J.J.

    1998-02-23

    Amrita is a software system for automating numerical investigations. The system is driven using its own powerful scripting language, Amrita, which facilitates both the composition and archiving of complete numerical investigations, as distinct from isolated computations. Once archived, an Amrita investigation can later be reproduced by any interested party, not just the original investigator, at no cost other than the raw CPU time needed to parse the archived script. In fact, this entire lecture can be reconstructed in such a fashion. To do this, the script: constructs a number of shock-capturing schemes; runs a series of test problems; generates the plots shown; outputs the LaTeX to typeset the notes; and performs a myriad of behind-the-scenes tasks to glue everything together. Thus Amrita has all the characteristics of an operating system and should not be mistaken for a common-or-garden code.

  7. 2015 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  8. 2014 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  9. Central Facilities Area Sewage Lagoon Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Giesbrecht, Alan [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

    The Central Facilities Area (CFA) located in Butte County, Idaho at Idaho National Laboratory (INL) has an existing wastewater system to collect and treat sanitary wastewater and non contact cooling water from the facility. The existing treatment facility consists of three cells: Cell 1 has a surface area of 1.7 acres, Cell 2 has a surface area of 10.3 acres, and Cell 3 has a surface area of 0.5 acres. If flows exceed the evaporative capacity of the cells, wastewater is discharged to a 73.5 acre land application site that utilizes a center pivot irrigation sprinkler system. The purpose of this current study is to update the analysis and conclusions of the December 2013 study. In this current study, the new seepage rate and influent flow rate data have been used to update the calculations, model, and analysis.

  10. Central Facilities Area Sewage Lagoon Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Mark R. Cole

    2013-12-01

    The Central Facilities Area (CFA), located in Butte County, Idaho, at the Idaho National Laboratory has an existing wastewater system to collect and treat sanitary wastewater and non-contact cooling water from the facility. The existing treatment facility consists of three cells: Cell #1 has a surface area of 1.7 acres, Cell #2 has a surface area of 10.3 acres, and Cell #3 has a surface area of 0.5 acres. If flows exceed the evaporative capacity of the cells, wastewater is discharged to a 73.5-acre land application site that uses a center-pivot irrigation sprinkler system. As flows at CFA have decreased in recent years, the amount of wastewater discharged to the land application site has decreased from 13.64 million gallons in 2004 to no discharge in 2012 and 2013. In addition to the decreasing need for land application, approximately 7.7 MG of supplemental water was added to the system in 2013 to maintain a water level and prevent the clay soil liners in the cells from drying out and “cracking.” The Idaho National Laboratory is concerned that the sewage lagoons and land application site may be oversized for current and future flows. A further concern is the sustainability of the large volumes of supplemental water that are added to the system according to current operational practices. Therefore, this study was initiated to evaluate the system capacity, operational practices, and potential improvement alternatives, as warranted.

  11. Oak Ridge Leadership Computing Facility (OLCF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Oak Ridge Leadership Computing Facility (OLCF) was established at Oak Ridge National Laboratory in 2004 with the mission of standing up a supercomputer 100 times...

  12. Rendezvous Facilities in a Distributed Computer System

    Institute of Scientific and Technical Information of China (English)

    Liao Xianzhi; Jin Lan

    1995-01-01

    The distributed computer system described in this paper is a set of computer nodes interconnected in an interconnection network via packet-switching interfaces.The nodes communicate with each other by means of message-passing protocols.This paper presents the implementation of rendezvous facilities as high-level primitives provided by a parallel programming language to support interprocess communication and synchronization.
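    The rendezvous primitives described above can be sketched in ordinary threading terms. The following is an illustrative Python analogue only: the paper's primitives are language-level constructs layered over message-passing interfaces, so the `Rendezvous`, `call`, and `accept` names here are hypothetical, not the paper's API.

    ```python
    import queue

    class Rendezvous:
        """Minimal sketch of an Ada-style extended rendezvous: the caller
        blocks until the accepting task has received the request, executed
        the entry body, and returned a reply."""

        def __init__(self):
            self._entries = queue.Queue()

        def call(self, args):
            # Caller side: send the request with a private reply channel,
            # then block until the acceptor answers.
            reply = queue.Queue(maxsize=1)
            self._entries.put((args, reply))
            return reply.get()

        def accept(self, body):
            # Acceptor side: wait for a caller, run the entry body on the
            # caller's arguments, and send the result back.
            args, reply = self._entries.get()
            reply.put(body(args))
    ```

    A server thread would loop on `accept(...)` while client threads invoke `call(...)`; the blocking `reply.get()` on the caller side gives the synchronization property the abstract describes, with both parties held at the rendezvous point until the exchange completes.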

  13. Central nervous system and computation.

    Science.gov (United States)

    Guidolin, Diego; Albertin, Giovanna; Guescini, Michele; Fuxe, Kjell; Agnati, Luigi F

    2011-12-01

    Computational systems are useful in neuroscience in many ways. For instance, they may be used to construct maps of brain structure and activation, or to describe brain processes mathematically. Furthermore, they inspired a powerful theory of brain function, in which the brain is viewed as a system characterized by intrinsic computational activities or as a "computational information processor." Although many neuroscientists believe that neural systems really perform computations, some are more cautious about computationalism or reject it. Thus, does the brain really compute? Answering this question requires a definition of computation that can draw a line between physical systems that compute and systems that do not, so that we can discern on which side of the line the brain (or parts of it) falls. In order to shed some light on the role of computational processes in brain function, available neurobiological data are summarized from the standpoint of a recently proposed taxonomy of notions of computation, with the aim of identifying which brain processes can be considered computational. The emerging picture shows the brain as a very peculiar system, in which genuine computational features act in concert with noncomputational dynamical processes, leading to continuous self-organization and remodeling under the action of external stimuli from the environment and from the rest of the organism.

  14. Computing betweenness centrality in external memory

    DEFF Research Database (Denmark)

    Arge, Lars; Goodrich, Michael T.; Walderveen, Freek van

    2013-01-01

    Betweenness centrality is one of the most well-known measures of the importance of nodes in a social-network graph. In this paper we describe the first known external-memory and cache-oblivious algorithms for computing betweenness centrality. We present four different external-memory algorithms...
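    The in-memory baseline that external-memory algorithms build on is Brandes' algorithm. A minimal Python sketch of that baseline follows, assuming an unweighted, undirected graph given as an adjacency dict; it illustrates the standard definition only, not the external-memory or cache-oblivious variants contributed by the paper.

    ```python
    from collections import deque

    def betweenness(adj):
        """Brandes' algorithm: unnormalized betweenness centrality for an
        undirected, unweighted graph given as {node: [neighbors]}."""
        bc = {v: 0.0 for v in adj}
        for s in adj:
            # BFS from s, counting shortest paths (sigma) and predecessors.
            stack, preds = [], {v: [] for v in adj}
            sigma = {v: 0 for v in adj}; sigma[s] = 1
            dist = {v: -1 for v in adj}; dist[s] = 0
            q = deque([s])
            while q:
                v = q.popleft()
                stack.append(v)
                for w in adj[v]:
                    if dist[w] < 0:               # first visit
                        dist[w] = dist[v] + 1
                        q.append(w)
                    if dist[w] == dist[v] + 1:    # w lies on a shortest path via v
                        sigma[w] += sigma[v]
                        preds[w].append(v)
            # Accumulate pair dependencies in reverse BFS order.
            delta = {v: 0.0 for v in adj}
            while stack:
                w = stack.pop()
                for v in preds[w]:
                    delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
                if w != s:
                    bc[w] += delta[w]
        # Each unordered pair was counted from both endpoints.
        return {v: c / 2 for v, c in bc.items()}
    ```

    On the path graph 0–1–2, for example, only the middle node lies on a shortest path between the other two, so it is the only node with nonzero betweenness.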

  15. The Central Laser Facility at the Pierre Auger Observatory

    CERN Document Server

    Arqueros, F; Covault, C; D'Urso, D; Giulio, C D; Facal, P; Fick, B; Guarino, F; Malek, M; Matthews, J A J; Matthews, J; Meyhandan, R; Monasor, M; Mostafa, M; Petrinca, P; Roberts, M; Sommers, P; Travnicek, P; Valore, L; Verzi, V; Wiencke, L

    2005-01-01

    The Central Laser Facility is located near the middle of the Pierre Auger Observatory in Argentina. It features a UV laser and optics that direct a beam of calibrated pulsed light into the sky. Light scattered from this beam produces tracks in the Auger optical detectors which normally record nitrogen fluorescence tracks from cosmic ray air showers. The Central Laser Facility provides a "test beam" to investigate properties of the atmosphere and the fluorescence detectors. The laser can send light via optical fiber simultaneously to the nearest surface detector tank for hybrid timing analyses. We describe the facility and show some examples of its many uses.

  16. The Central laser facility at the Pierre Auger Observatory

    Energy Technology Data Exchange (ETDEWEB)

    Arqueros, F.; Bellido, J.; Covault, C.; D'Urso, D.; Di Giulio, C.; Facal, P.; Fick, B.; Guarino, F.; Malek, M.; Matthews, J.A.J.; Matthews, J.; Meyhandan, R.; Monasor, M.; Mostafa, M.; Petrinca, P.; Roberts, M.; Sommers, P.; Travnicek, P.; Valore, L.; Verzi, V.; Wiencke, Lawrence; /Utah U.

    2005-07-01

    The Central Laser Facility is located near the middle of the Pierre Auger Observatory in Argentina. It features a UV laser and optics that direct a beam of calibrated pulsed light into the sky. Light scattered from this beam produces tracks in the Auger optical detectors which normally record nitrogen fluorescence tracks from cosmic ray air showers. The Central Laser Facility provides a "test beam" to investigate properties of the atmosphere and the fluorescence detectors. The laser can send light via optical fiber simultaneously to the nearest surface detector tank for hybrid timing analyses. We describe the facility and show some examples of its many uses.

  17. 12 CFR 741.210 - Central liquidity facility.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 Central liquidity facility. 741.210 Section 741.210 Banks and Banking NATIONAL CREDIT UNION ADMINISTRATION REGULATIONS AFFECTING CREDIT UNIONS... Unions That Also Apply to Federally Insured State-Chartered Credit Unions § 741.210 Central...

  18. Computational evaluation of a neutron field facility

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Jose Julio de O.; Pazianotto, Mauricio T., E-mail: jjfilos@hotmail.com, E-mail: mpazianotto@gmail.com [Instituto Tecnologico de Aeronautica (ITA/DCTA), Sao Jose dos Campos, SP (Brazil); Federico, Claudio A.; Passaro, Angelo, E-mail: claudiofederico@ieav.cta.br, E-mail: angelo@ieav.cta.br [Instituto de Estudos Avancados (IEAv/DCTA), Sao Jose dos Campos, SP (Brazil)

    2015-07-01

    This paper describes the results of a study based on computer simulation of a realistic 3D model of the Ionizing Radiation Laboratory of the Institute for Advanced Studies (IEAv) using the MCNP5 (Monte Carlo N-Particle) code, in order to guide the installation of a neutron generator based on the {sup 3}H(d,n){sup 4}He reaction. The equipment produces 14.1 MeV neutrons at a production rate of 2 x 10{sup 8} n/s in 4π geometry and can also be used for neutron dosimetry studies. This work evaluated the neutron spectra and fluence at previously selected positions inside the facility, chosen for assessing the ambient dose equivalent, so that the necessary adjustments can be made to the installation to comply with the radiation protection and radiation safety guidelines determined by the standards of the National Nuclear Energy Commission (CNEN). (author)

  19. The Central Raman Laser Facility at the Pierre Auger Observatory

    Science.gov (United States)

    medina, C.; Mayotte, E.; Wiencke, L. R.; Rizi, V.; Grillo, A.

    2013-12-01

    We describe the newly upgraded Central Raman Laser Facility (CRLF) located close to the center of the Pierre Auger Observatory (PAO) in Argentina. The CRLF features a Raman lidar receiver, a 335 nm wavelength solid-state laser, a robotic beam energy calibration system, and a weather station, all powered by solar energy and operated autonomously using a single-board computer. The system optics are arranged to direct the laser beam into the atmosphere in steered and vertical modes with adjustable polarization settings, and the beam is measured in a bi-static configuration by the four fluorescence stations of the Pierre Auger Observatory. Additionally, the system optics can easily be switched to provide a fixed vertical beam that is measured by the Raman lidar receiver in a mono-static configuration, allowing an independent measurement of the aerosol optical depth τ(z,t) and other properties of the atmosphere. A description of the CRLF's installation, hardware and software integration, and initial operations, along with examples of collected data, is also presented.

  20. Central Computer IMS Processing System (CIMS).

    Science.gov (United States)

    Wolfe, Howard

    As part of the IMS Version 3 tryout in 1971-72, software was developed to enable data submitted by IMS users to be transmitted to the central computer, which acted on the data to create IMS reports and to update the Pupil Data Base with criterion exercise and class roster information. The program logic is described, and the subroutines and…

  1. FUEL HANDLING FACILITY BACKUP CENTRAL COMMUNICATIONS ROOM SPACE REQUIREMENTS CALCULATION

    Energy Technology Data Exchange (ETDEWEB)

    B. SZALEWSKI

    2005-03-22

    The purpose of the Fuel Handling Facility Backup Central Communications Room Space Requirements Calculation is to determine a preliminary estimate of the space required to house the backup central communications room in the Fuel Handling Facility (FHF). This room provides backup communications capability to the primary communication systems located in the Central Control Center Facility. This calculation will help guide FHF designers in allocating adequate space for communications system equipment in the FHF. It provides preliminary estimates based on the assumptions listed in Section 4; as such, there are currently no limitations on its use. The calculations contained in this document were developed by Design and Engineering and are intended solely for the use of Design and Engineering in its work regarding the FHF Backup Central Communications Room Space Requirements. Yucca Mountain Project personnel from Design and Engineering should be consulted before the calculations are used for purposes other than those stated herein or by individuals other than authorized personnel in Design and Engineering.

  2. Technical evaluation of proposed Ukrainian Central Radioactive Waste Processing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gates, R.; Glukhov, A.; Markowski, F.

    1996-06-01

    This technical report is a comprehensive evaluation of the proposal by the Ukrainian State Committee on Nuclear Power Utilization to create a central facility for radioactive waste (not spent fuel) processing. The central facility is intended to process liquid and solid radioactive wastes generated from all of the Ukrainian nuclear power plants and the waste generated as a result of Chernobyl 1, 2, and 3 decommissioning efforts. In addition, this report provides general information on the quantity and total activity of radioactive waste in the 30-km Zone and the Sarcophagus from the Chernobyl accident. Processing options are described that may ultimately be used in the long-term disposal of selected 30-km Zone and Sarcophagus wastes. A detailed report on the issues concerning the construction of a Ukrainian Central Radioactive Waste Processing Facility (CRWPF) was obtained from the Ukrainian Scientific Research and Design Institute for Industrial Technology and incorporated into this report. This report outlines various processing options, their associated costs, and construction schedules, which can be applied to solving the operating and decommissioning radioactive waste management problems in Ukraine. The costs and schedules are best estimates based upon the most current US industry practice and vendor information. This report focuses primarily on the handling and processing of what is defined in the US as low-level radioactive wastes.

  3. Computer Profile of School Facilities Energy Consumption.

    Science.gov (United States)

    Oswalt, Felix E.

    This document outlines a computerized management tool designed to enable building managers to identify energy consumption as related to types and uses of school facilities for the purpose of evaluating and managing the operation, maintenance, modification, and planning of new facilities. Specifically, it is expected that the statistics generated…

  4. High Performance Computing Facility Operational Assessment, CY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Bland, Arthur S Buddy [ORNL; Boudwin, Kathlyn J. [ORNL; Hack, James J [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL; Hudson, Douglas L [ORNL

    2012-02-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources; of these, we report in this review the 300 that are consistent with the guidance provided. Scientific achievements by OLCF users cut across all scales, from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of Carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source and simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source and simulations on Jaguar to reveal the molecular structure of ligno-cellulosic material used in bioethanol production. This year, Jaguar was used for billion-cell CFD calculations to develop shock-wave-compression turbomachinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbomachinery to learn what efficiencies the traditional steady-flow assumption is hiding from designers. Even a 1% improvement in turbine design can save the nation...

  5. High Performance Computing Facility Operational Assessment 2015: Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Ashley D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bernholdt, David E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bland, Arthur S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Gary, Jeff D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Hack, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; McNally, Stephen T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Rogers, James H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Smith, Brian E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Straatsma, T. P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Sukumar, Sreenivas Rangan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Thach, Kevin G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Tichenor, Suzy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Vazhkudai, Sudharshan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Wells, Jack C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility

    2016-03-01

    Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others. The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example, Titan enabled engineers in Los Angeles and the surrounding region to design and begin building improved critical infrastructure by enabling the highest-resolution CyberShake map for Southern...

  6. High Performance Computing Facility Operational Assessment, FY 2010 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; White, Julia C [ORNL

    2010-08-01

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools

  7. The Cost of Supplying Segmented Consumers From a Central Facility

    DEFF Research Database (Denmark)

    Turkensteen, Marcel; Klose, Andreas

    Organizations regularly face the strategic marketing decision of which groups of consumers they should target. A potential problem, highlighted in Steenkamp et al. (2002), is that the target consumers may be so widely dispersed that an organization cannot serve its customers cost-effectively. We consider three measures of dispersion of demand points: the average distance between demand points, the maximum distance, and the surface size. In our distribution model, all demand points are restocked from a central facility. The observed logistics costs are determined using tour length estimations... measure if there are many stops on a route and with our average distance measure if there are relatively few.
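    Tour length estimations of the kind this abstract relies on are commonly based on the Beardwood–Halton–Hammersley asymptotic formula. Whether the authors use exactly this formula is an assumption on our part; the sketch below uses an empirical constant for points scattered uniformly over a region.

    ```python
    import math

    def estimated_tour_length(n_stops, area, k=0.7124):
        """Approximate the length of an optimal tour visiting n_stops
        demand points spread uniformly over a region of the given surface
        size, via the Beardwood-Halton-Hammersley formula L ~ k*sqrt(n*A).
        The constant k=0.7124 is an empirical estimate, not exact."""
        return k * math.sqrt(n_stops * area)
    ```

    The formula captures the qualitative point of the abstract: for a fixed number of stops, the estimated routing cost grows with the square root of the surface over which the target consumers are dispersed.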

  8. Volatile Organic Compound Emissions from Dairy Facilities in Central California

    Science.gov (United States)

    Hasson, A. S.; Ogunjemiyo, S. O.; Trabue, S.; Middala, S. R.; Ashkan, S.; Scoggin, K.; Vu, K. K.; Addala, L.; Olea, C.; Nana, L.; Scruggs, A. K.; Steele, J.; Shelton, T. C.; Osborne, B.; McHenry, J. R.

    2011-12-01

    Emissions of volatile organic compounds (VOCs) from dairy facilities are thought to be an important contributor to high ozone levels in Central California, but emissions inventories from these sources contain significant uncertainties. In this work, VOC emissions were measured at two Central California dairies during 2010 and 2011. Isolation flux chambers were used to measure direct emissions from specific dairy sources, and upwind/downwind ambient profiles were measured from ground level up to heights of 60 m. Samples were collected using a combination of canisters and sorbent tubes, and were analyzed by GC-MS. Additional in-situ measurements were made using infra-red photoaccoustic detectors and Diode Laser Absorption Spectroscopy. Temperature and ozone profiles up to 250 m above ground level were also measured using a tethersonde. Substantial fluxes of a number of VOCs including alcohols, volatile fatty acids and esters were observed at both sites. Implications of these measurements for regional air quality will be discussed.

  9. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL

    2011-08-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and

  10. Radiation analysis for a generic centralized interim storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, S.G.; Lopez, P. [TRW Environmental Safety Systems, Inc., Vienna, VA (United States); Eble, R.G. [Duke Engineering and Services, Inc., Charlotte, NC (United States)

    1997-12-31

    This paper documents the radiation analysis performed for the storage area of a generic Centralized Interim Storage Facility (CISF) for commercial spent nuclear fuel (SNF). The purpose of the analysis is to establish the CISF Protected Area and Restricted Area boundaries by modeling a representative SNF storage array, calculating the radiation dose at selected locations outside the storage area, and comparing the results with regulatory radiation dose limits. The particular challenge for this analysis is to adequately model a large (6000-cask) storage array with a reasonable amount of analysis time and effort. Previous analyses of SNF storage systems for Independent Spent Fuel Storage Installations at nuclear plant sites (for example, in References 5.1 and 5.2) had considered only small arrays of storage casks. For such analyses, the dose contribution from each storage cask can be modeled individually. Since the large number of casks in the CISF storage array makes such an approach unrealistic, a simplified model is required.
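
    The cask-by-cask superposition approach that becomes impractical at 6000 casks can be illustrated with a toy calculation. The point-source kernel, grid pitch, and boundary points below are hypothetical stand-ins, not values from the paper or from any real shielding code:

```python
def boundary_dose(casks, point, source_strength=1.0):
    """Sum an illustrative unshielded inverse-square contribution from
    every cask at a boundary point. No attenuation, buildup, or real
    source term is modeled -- this only shows why cask-by-cask
    superposition scales linearly with the number of casks."""
    px, py = point
    total = 0.0
    for cx, cy in casks:
        r_sq = (cx - px) ** 2 + (cy - py) ** 2
        total += source_strength / r_sq
    return total

# Hypothetical 6000-cask array: a 100 x 60 grid on a 10 m pitch.
casks = [(10.0 * i, 10.0 * j) for i in range(100) for j in range(60)]

# Hypothetical receptor points 100 m and 200 m south of the array edge.
dose_near = boundary_dose(casks, (500.0, -100.0))
dose_far = boundary_dose(casks, (500.0, -200.0))
```

    Even this toy version evaluates 6,000 terms per receptor point; with realistic per-cask shielding kernels and many boundary locations, that cost is what motivates the simplified model the paper adopts.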

  11. ARN Integrated Retail Module (IRM) System at Ft. Bliss - Central Issue Facility (CIF) Local Tariff

    Science.gov (United States)

    2008-03-14

    automated support to a Central Issue Facility for issue to deploying soldiers, a Central Initial Issue Point (CIIP) for issue to recruits, and a 3D Whole Body Scanner for obtaining body... Apparel Research Network (ARN); Integrated Retail Module (IRM); Clothing Initial Issue Point (CIIP); Central Issue Facility (CIF); Supply Chain...

  12. Computational Modeling in Support of National Ignition Facility Operations

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, M J; Sacks, R A; Haynam, C A; Williams, W H

    2001-10-23

    Numerical simulation of the National Ignition Facility (NIF) laser performance and automated control of the laser setup process are crucial to the project's success. These functions will be performed by two closely coupled computer codes: the virtual beamline (VBL) and the laser operations performance model (LPOM).

  13. Hanazonobashi facilities control system. Centralized SCADA for metropolitan express-way; Shutokosokudoro koden Kanagawa kanribudono osame. Hanazonokyo shisetsu kanri system

    Energy Technology Data Exchange (ETDEWEB)

    Higashino, K.; Imai, K.; Kitaura, M.; Kayama, C.; Tsujita, H. [Nisshin Electric Co. Ltd., Kyoto (Japan)

    1996-03-29

    This paper introduces the centralized SCADA system for the express-way lines of the whole Kanagawa region, which controls various facilities such as power receiving, distribution, and switching equipment; pump houses; transformer towers; and bridge lighting. The system was designed to monitor the entire Wangan line, including its planned future extension. It consists of a central facility control station (CS) and local control stations (LS). The CS controls all facilities, while the LS provide start/stop control of individual facilities during maintenance and inspection of the CS. The SCADA system excels in handling a large volume of information, in exchanging information with other control and disaster-prevention systems, and in extensibility and maintainability. It is a distributed computer control system with a highly functional multi-window man-machine interface. Multiple projection-type large-screen displays were adopted so that operators can share information. To support facility maintenance work effectively, a database was built for the collective management of drawings, facility registers, and manuals. 9 figs., 1 tab.
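
    The CS/LS division of labor described above (central supervision, with local start/stop surviving CS maintenance) can be sketched in a few lines. The class and facility names are illustrative only and say nothing about the actual system:

```python
class LocalStation:
    """Local station (LS): minimal start/stop interlock for one facility."""
    def __init__(self, name):
        self.name = name
        self.running = False

    def start(self):
        self.running = True

    def stop(self):
        self.running = False


class CentralStation:
    """Central station (CS): supervises all LS. The key property the
    abstract describes is that LS commands keep working even when the
    CS itself is offline for maintenance or inspection."""
    def __init__(self, stations):
        self.stations = stations
        self.online = True

    def start_all(self):
        if not self.online:
            raise RuntimeError("CS under maintenance; use local control")
        for station in self.stations:
            station.start()


pump = LocalStation("pump-house-1")   # hypothetical facility
cs = CentralStation([pump])
cs.start_all()        # normal centralized operation
cs.online = False     # CS taken down for inspection
pump.stop()           # local start/stop remains available at the site
```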

  14. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    Science.gov (United States)

    du Plessis, Anton; le Roux, Stephan Gerhard; Guelpa, Anina

    2016-10-01

    The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high-performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, the facility offers open access to the general user community, including local researchers, companies, and remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments: a micro-CT system and a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers with data analysis software packages, which are at the disposal of facility users, along with expert supervision if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT. This paper summarises the laboratory's first four years by way of selected examples, both from published and unpublished projects. In the process, a detailed description of the capabilities and facilities available to users is presented.

  15. Central Facilities Area Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Brion Bennett

    2011-11-01

    Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Central Facilities Area facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables are also presented for the facilities. The tables serve as a tool for developing the radioactive waste management basis.

  16. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  17. Shielding Calculations for Positron Emission Tomography - Computed Tomography Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Baasandorj, Khashbayar [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Yang, Jeongseon [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-10-15

    Integrated PET-CT has been shown to be more accurate for lesion localization and characterization than PET or CT alone, or than results obtained from PET and CT separately and interpreted side by side or after software-based fusion of the PET and CT datasets. At the same time, PET-CT scans can result in high patient and staff doses; therefore, careful site planning and shielding of this imaging modality have become challenging issues in the field. In Mongolia, the introduction of PET-CT facilities is currently being considered by many hospitals. Thus, additional regulatory legislation for nuclear and radiation applications is necessary, for example, in regulating licensing processes and ensuring radiation safety during operations. This paper aims to determine appropriate PET-CT shielding designs using numerical formulas and computer code. Since there are presently no PET-CT facilities in Mongolia, contact was made with radiological staff at the Nuclear Medicine Center of the National Cancer Center of Mongolia (NCCM) to obtain information about the facilities where the introduction of PET-CT is being considered. Well-designed facilities do not require additional shielding, which should help cut down the overall costs of PET-CT installation. According to the results of this study, the barrier thicknesses of the NCCM building are not sufficient to keep radiation doses within the limits.
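
    A rough sketch of the kind of shielding estimate the paper describes: an inverse-square unshielded weekly dose is compared with a dose limit to get a required barrier transmission, which is then converted to a lead thickness via a tenth-value layer. The formula structure and every constant here are illustrative placeholders, not the paper's values or regulatory limits:

```python
import math

def required_transmission(weekly_limit_mSv, dose_rate_uSv_m2_per_patient,
                          patients_per_week, occupancy, distance_m):
    """Transmission factor B a barrier must provide so that the weekly
    dose at a point behind it stays under the limit, assuming simple
    inverse-square geometry. All inputs are placeholder assumptions."""
    unshielded_uSv = (dose_rate_uSv_m2_per_patient * patients_per_week
                      * occupancy / distance_m ** 2)
    return min(1.0, weekly_limit_mSv * 1000.0 / unshielded_uSv)

def lead_thickness_mm(transmission, tvl_mm=16.6):
    """Barrier thickness from a transmission factor using a tenth-value
    layer (TVL). 16.6 mm is a commonly quoted TVL of lead at 511 keV,
    but verify against local design data before use."""
    if transmission >= 1.0:
        return 0.0   # no additional shielding needed
    return tvl_mm * math.log10(1.0 / transmission)

# Hypothetical example: 0.1 mSv/week limit, 3 m to the occupied area.
B = required_transmission(0.1, 92.0, 30, 1.0, 3.0)
thickness = lead_thickness_mm(B)
```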

  18. Functional Analysis and Preliminary Specifications for a Single Integrated Central Computer System for Secondary Schools and Junior Colleges. A Feasibility and Preliminary Design Study. Interim Report.

    Science.gov (United States)

    Computation Planning, Inc., Bethesda, MD.

    A feasibility analysis of a single integrated central computer system for secondary schools and junior colleges finds that a central computing facility capable of serving 50 schools with a total enrollment of 100,000 students is feasible at a cost of $18 per student per year. The recommended system is a multiprogrammed-batch operation. Preliminary…

  19. Distributed computer control system in the Nova Laser Fusion Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    1985-09-01

    The EE Technical Review has two purposes - to inform readers of various activities within the Electronics Engineering Department and to promote the exchange of ideas. The articles, by design, are brief summaries of EE work. The articles included in this report are as follows: Overview - Nova Control System; Centralized Computer-Based Controls for the Nova Laser Facility; Nova Pulse-Power Control System; Nova Laser Alignment Control System; Nova Beam Diagnostic System; Nova Target-Diagnostics Control System; and Nova Shot Scheduler. The 7 papers are individually abstracted.

  20. Computational investigations of low-emission burner facilities for char gas burning in a power boiler

    Science.gov (United States)

    Roslyakov, P. V.; Morozov, I. V.; Zaychenko, M. N.; Sidorkin, V. T.

    2016-04-01

    Various design variants of low-emission burner facilities intended for char gas burning in an operating TP-101 boiler of the Estonia power plant are considered. The planned increase in shale reprocessing volumes and, correspondingly, in char gas volumes makes co-combustion of the char gas necessary. A burner facility of the required capacity therefore had to be developed that burns char gas effectively while meeting reliability and environmental requirements. To this end, the burner design was based on staged fuel combustion with gas recirculation. A preliminary analysis of possible design variants led to the selection of three types of previously proven burner facilities: a vortex burner with recirculation gases supplied into the secondary air, a vortex burner with recirculation gases supplied between the primary and secondary air flows, and a burner facility with a vortex pilot burner. Optimum structural characteristics and operating parameters were determined through numerical experiments. These experiments, carried out with the ANSYS CFX computational fluid dynamics software, simulated the mixing, ignition, and burning of char gas. For each type of burner facility, the numerical experiments determined the structural and operating parameters that give effective char gas burning and meet the required environmental standard on nitrogen oxide emissions. Based on the computation results, a burner facility for char gas burning with a pilot diffusion burner in its central part was developed and manufactured. Preliminary full-scale verification tests on the TP-101 boiler showed that the actual content of nitrogen oxides in char gas burner flames did not exceed the declared concentration of 150 ppm (200 mg/m3).

  1. The Argonne Leadership Computing Facility 2010 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Drugan, C. (LCF)

    2011-05-09

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers

  2. 78 FR 18353 - Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility...

    Science.gov (United States)

    2013-03-26

    ... HUMAN SERVICES Food and Drug Administration Guidance for Industry: Blood Establishment Computer System... ``Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility'' dated April... establishment computer system validation program, consistent with recognized principles of software...

  3. Empirical Foundation of Central Concepts for Computer Science Education

    Science.gov (United States)

    Zendler, Andreas; Spannagel, Christian

    2008-01-01

    The design of computer science curricula should rely on central concepts of the discipline rather than on technical short-term developments. Several authors have proposed lists of basic concepts or fundamental ideas in the past. However, these catalogs were based on subjective decisions without any empirical support. This article describes the…

  4. Distributed Computing with Centralized Support Works at Brigham Young.

    Science.gov (United States)

    McDonald, Kelly; Stone, Brad

    1992-01-01

    Brigham Young University (Utah) has addressed the need for maintenance and support of distributed computing systems on campus by implementing a program patterned after a national business franchise, providing the support and training of a centralized administration but allowing each unit to operate much as an independent small business.…

  5. Improvement of the Computing - Related Procurement Process at a Government Research Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gittins, C.

    2000-04-03

    The purpose of the project was to develop, implement, and market value-added services through the Computing Resource Center (CRC) in an effort to streamline computing-related procurement processes across Lawrence Livermore National Laboratory (LLNL). The power of the project was in focusing attention on the value of centralizing the delivery of computer-related products and services to the institution. The project required a plan and marketing strategy that would draw attention to the facility's value-added offerings and services. A significant outcome of the project has been the change in the CRC's internal organization. The realignment of internal policies and practices, together with additions to its product and service offerings, has brought an increased focus to the facility. This movement from a small, fractious organization into one that is still small yet well organized and focused on its mission and goals has been a significant transition. Indicative of this turnaround was the sharing of information. One-on-one and small-group meetings, together with statistics showing work activity, were invaluable in gaining support for more equitable workload distribution and for the removal of blame and finger-pointing. Sharing monthly reports on sales and operating costs also had a positive impact.

  6. Status of the National Ignition Facility Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L; Bryant, R; Carey, R; Casavant, D; Edwards, O; Ferguson, W; Krammen, J; Larson, D; Lee, A; Ludwigsen, P; Miller, M; Moses, E; Nyholm, R; Reed, R; Shelton, R; Van Arsdall, P J; Wuest, C

    2003-10-13

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately 3/4 complete, with over 750 thousand source lines of code having undergone off-line verification tests and been deployed to the facility. NIF has entered the first phases of its laser commissioning program. NIF has now demonstrated the highest energy 1ω, 2ω, and 3ω beamlines in the world
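
    The layered supervisor/front-end-processor pattern described above can be caricatured in a few lines of Python. The class names, control points, and fan-out logic are purely illustrative and say nothing about the real ICCS (which is implemented in Ada and Java over CORBA):

```python
class FrontEndProcessor:
    """One front-end processor owning a handful of control points
    (mirrors, amplifiers, sensor packages). ICCS layers ~300 of these
    under supervisor subsystems; the names here are made up."""
    def __init__(self, point_names):
        self.points = {name: 0.0 for name in point_names}

    def set_point(self, name, value):
        self.points[name] = value

    def status(self):
        return dict(self.points)


class Supervisor:
    """Supervisory layer: fans a shot-setup command out to its FEPs
    and archives the aggregate status, mimicking automated shot
    control and data archiving at a toy scale."""
    def __init__(self, feps):
        self.feps = feps
        self.archive = []

    def configure_shot(self, setpoints):
        for fep in self.feps:
            for name, value in setpoints.items():
                if name in fep.points:
                    fep.set_point(name, value)
        self.archive.append([fep.status() for fep in self.feps])


fep1 = FrontEndProcessor(["mirror_tilt", "amp_gain"])
fep2 = FrontEndProcessor(["sensor_gain"])
sup = Supervisor([fep1, fep2])
sup.configure_shot({"amp_gain": 1.8, "sensor_gain": 0.5})
```

    The point of the layering is that the supervisor deals only in shot-level intent, while each FEP translates that intent for the control points it owns.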

  7. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architectures. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, thanks to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  8. Supporting Air and Space Expeditionary Forces: Analysis of CONUS Centralized Intermediate Repair Facilities

    Science.gov (United States)

    2008-01-01

    22 Accounting for these 30 aircraft plus the 24 F100-220A/E PAA realigned from Elmendorf AFB, a shortfall of 131 F100-220A/E PAA remains. Note...Management Division (ILMM), Centralized Intermediate Repair Facility CIRF Test Report, Washington, D.C.: Pentagon, June 2002. Hillestad, Richard J

  9. Operation and Maintenance Manual for the Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Norm Stanley

    2011-02-01

    This Operation and Maintenance Manual lists operator and management responsibilities, permit standards, general operating procedures, maintenance requirements, and monitoring methods for the Sewage Treatment Plant at the Central Facilities Area at the Idaho National Laboratory. The manual is required by the Municipal Wastewater Reuse Permit (LA-000141-03) for the sewage treatment plant.

  10. Solar Cogeneration Facility: Cimarron River Station, Central Telephone and Utilities-Western Power

    Science.gov (United States)

    1981-08-01

    A site-specific conceptual design and evaluation of a solar central receiver system integrated with an existing cogeneration facility are described. The system generates electricity and delivers a portion of that electricity and process steam to a natural gas processing plant. Early in the project, tradeoff studies were performed to establish key system characteristics. As a result of these studies the use of energy storage was eliminated, the size of the solar facility was established at 37.13 MW(t), and other site-specific features were selected. The conceptual design addressed critical components and system interfaces. The result is a hybrid solar/fossil central receiver facility which utilizes a collector system of Department of Energy second generation heliostats.

  11. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in propulsion test facility design and development by assessing risks, identifying failure modes and predicting...

  12. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in rocket engine test facility design and development by assessing risks, identifying failure modes and predicting...

  13. Tama river facilities control system. Centralized SCADA for metropolitan express-way; Shuto kosoku doro kodan wangan kensetsukyoku dono osame. Tamagawa shisetsu kansei system

    Energy Technology Data Exchange (ETDEWEB)

    Araki, J.; Sasaki, S.; Doi, E.; Takahashi, M. [Nisshin Electric Co. Ltd., Kyoto (Japan)

    1996-03-29

    This paper describes the centralized SCADA system for the smooth and effective operation and control of the various facilities dispersed along the Wangan express-way, such as power receiving and distribution equipment, tunnel lighting and ventilation, road drainage, and tunnel disaster-prevention facilities. A central control station (CS) was constructed as the center for collective control of the whole system. Under it, four sub-control stations (SS) distributed among the tunnels and four to ten local stations (LS) were constructed in a hierarchy. The SS support the CS as disaster-prevention stations during disasters such as fires, accidents, and earthquakes, and can also control and monitor facilities within their jurisdiction as maintenance stations. The LS have start/stop control functions with simple interlocking circuits for individual facilities, providing minimum on-site operation during maintenance and inspection of the CS and SS. The SCADA system is a distributed computer control system with a highly functional man-machine interface, featuring high-performance transmission processing devices for tunnel ventilation control and information exchange. It also includes a facility maintenance work support system. 9 figs., 1 tab.

  14. Control computers and automation subsystem equipment in Del'fin facility

    Energy Technology Data Exchange (ETDEWEB)

    Allin, A.P.; Belen'kiy, Yu.M.; Borzyak, Yu.V.; Bykovskiy, N.E.; Grigor'ev, V.E.; Gusyatnikov, B.S.; Doroshkevich, I.L.; Ivanov, V.V.; Kuchinskiy, A.G.; Savchenko, V.M.

    1983-01-01

    The power equipment of the Del'fin laser facility contains a 10^7 J capacitor bank divided into four identical sections feeding a power preamplifier and three output stages each, with 328 IFP-20,000 flash tubes designed to produce 2.5 kJ laser radiation. The system for controlling and automating laser experiments, modeled after the SHIVA system (Lawrence Livermore Laboratory), includes a computer complex in the central console and the sequential ring bus, with CAMAC peripheral stations inside the optical chamber including three microcomputers (Polon, Nuclear Enterprise, HENESA). The control computer with a 28 K memory is linked to the CAMAC stations and to a DECWRITER-II terminal. The system crate contains a 9030/32 interface, a 3992 driver for the sequential bus, and a 064 LAM GRADER interrogation processing module. The computer complex also includes an RSX-11M multiprogram multipurpose real-time disk operating system which uses standard DECNET-11 software and includes a translator from MACRO-11 assembler language to FORTRAN-4, BASIC-11, COBOL and a few other languages. The laser power is automatically controlled through the CAMAC stations according to a main program as well as dialog maintenance programs (BCE, KASKN, KASN, DIAL, STRB, MODB, BKYM, STRB, BKYB, BKY) and measurement programs (CONTRO, CNTRO, KOD) designed to ensure simple and reliable high-speed control of laser experiments. All alignment and regulation of the laser facility is automated through optical channels (aligning LTI-501 laser, collimators, lenses, auxiliary optics) and servomechanisms (coordinate photoreceiver-homing signal module-step motors) designed for positioning and orientating mirrors 80 mm and 30 mm in diameter. 25 references, 31 figures, 2 tables.

  15. Recycled Water Reuse Permit Renewal Application for the Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Mike [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    This renewal application for a Recycled Water Reuse Permit is being submitted in accordance with the Idaho Administrative Procedures Act 58.01.17 “Recycled Water Rules” and the Municipal Wastewater Reuse Permit LA-000141-03 for continuing the operation of the Central Facilities Area Sewage Treatment Plant located at the Idaho National Laboratory. The permit expires March 16, 2015. The permit requires a renewal application to be submitted six months prior to the expiration date of the existing permit. For the Central Facilities Area Sewage Treatment Plant, the renewal application must be submitted by September 16, 2014. The information in this application is consistent with the Idaho Department of Environmental Quality’s Guidance for Reclamation and Reuse of Municipal and Industrial Wastewater and discussions with Idaho Department of Environmental Quality personnel.

  16. Developing a Central “Point-of-Truth” Database for Core Facility Information

    Science.gov (United States)

    Cheng, S.; Webber, G.; Bourges-Waldegg, D.; Pearse, R.V.

    2014-01-01

    Core facilities need customers, yet the average researcher is aware of only a small fraction of the facilities available nearby. Diligent facilities combat this knowledge gap by broadcasting information about their facility either locally or by publishing it on one or more public-facing websites or third-party repositories of information (e.g., the VGN cores database or Science Exchange). Each additional site of information about a facility increases visibility but also impairs the facility's ability to keep all sites up to date. Additionally, most third-party repositories house their data in traditional relational databases that are not indexable by common search engines. To start addressing these problems, the eagle-i project (a free, open-source, open-access publication platform) has begun integrating its core facility database with external websites and web applications, allowing them to synchronize their information in real time. We present here two experimental integrations. The Harvard Catalyst Cores webpage originally required independent updates that were not within the direct control of the core directors themselves. The linked open data architecture now allows the Catalyst Cores page to pull information from the Harvard eagle-i server and update all data on its page accordingly. Additionally, Harvard's “Profiles” web application references eagle-i data and links resource information from eagle-i to personnel information in the Profiles database. Because of these direct links, updating information in Harvard's eagle-i server (which facility directors can access directly through an account on the eagle-i SWEET) automatically updates information on the Catalyst Cores webpage and updates resource counts linked to a researcher's public profile. This functionality of the eagle-i platform as a central “point-of-truth” for information has the potential to drastically reduce the effort required to
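
    The “point-of-truth” idea, consumers rendering live from one central record rather than keeping copies that go stale, can be sketched as follows. The classes and API below are hypothetical, not the eagle-i interface:

```python
class PointOfTruth:
    """Central record store: every consumer reads through it, so a
    single update becomes visible everywhere at once (illustrative
    stand-in, not the eagle-i API)."""
    def __init__(self):
        self._records = {}

    def update(self, facility_id, info):
        self._records[facility_id] = info

    def get(self, facility_id):
        return self._records.get(facility_id, {})


class CoresWebpage:
    """Consumer page that renders live from the central store instead
    of holding its own copy that would need independent updates."""
    def __init__(self, store):
        self.store = store

    def render(self, facility_id):
        info = self.store.get(facility_id)
        return f"{info.get('name', '?')}: {info.get('services', '?')}"


store = PointOfTruth()
page = CoresWebpage(store)
store.update("coreA", {"name": "Imaging Core", "services": "microscopy"})
print(page.render("coreA"))
store.update("coreA", {"name": "Imaging Core", "services": "microscopy, CT"})
print(page.render("coreA"))  # the page reflects the update with no page edit
```

    The design choice mirrored here is the one the abstract argues for: one write location, many read locations, so the per-site maintenance burden does not grow with the number of sites.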

  17. A methodology for assessing computer software applicability to inventory and facility management

    OpenAIRE

    Paul, Debashis

    1989-01-01

    Computer applications have become popular and widespread in architecture and other related fields. While the architect uses a computer for the design and construction of a building, the user takes advantage of the computer for maintenance of the building. Inventory and facility management are two such fields where computer applications have become predominant. The project has investigated the use and application of different commercially available computer software in the above men...

  18. National facility for advanced computational science: A sustainable path to scientific discovery

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  19. Computing collinear 4-Body Problem central configurations with given masses

    CERN Document Server

    Piña, E

    2011-01-01

    An interesting description of a collinear configuration of four particles is found in terms of two spherical coordinates. An algorithm to compute the four coordinates of the particles of a collinear Four-Body central configuration is presented by using an orthocentric tetrahedron whose edge lengths are functions of the given masses. Each mass is placed at the corresponding vertex of the tetrahedron. The center of mass (and orthocenter) of the tetrahedron is at the origin of coordinates. Initially the tetrahedron is placed with two pairs of vertices, each pair in a coordinate plane, the lines joining each pair parallel to a coordinate axis, and the center of mass of each pair, as well as the center of mass of all four, on one coordinate axis. From this original position the tetrahedron is rotated by two angles around the center of mass until the direction of the configuration coincides with one coordinate axis. The four coordinates of the vertices of the tetrahedron along this direction determine the central configurati...

  20. [Use of personal computers in forensic medicine facilities].

    Science.gov (United States)

    Vorel, F

    1995-08-01

    The author presents a brief account of the possibilities for using personal computers (PCs) in departments of forensic medicine and discusses the basic hardware and software requirements. In the author's opinion the basic reason for using computers is to create an extensive database of post-mortem findings which would make it possible to process them on a large scale and use them for research and prevention. Introduction of computers depends on the management of the department, and it is necessary to persuade the staff who will be the future users of the computers of the advantages associated with their use.

  1. Language Facilities for Programming User-Computer Dialogues.

    Science.gov (United States)

    Lafuente, J. M.; Gries, D.

    1978-01-01

    Proposes extensions to PASCAL that provide for programming man-computer dialogues. An interactive dialogue application program is viewed as a sequence of frames and separate computational steps. PASCAL extensions allow the description of the items of information in each frame and the inclusion of behavior rules specifying the interactive dialogue.…

  2. Techniques for Measuring Aerosol Attenuation using the Central Laser Facility at the Pierre Auger Observatory

    CERN Document Server

    ,

    2013-01-01

    The Pierre Auger Observatory in Malargüe, Argentina, is designed to study the properties of ultra-high energy cosmic rays with energies above 10^18 eV. It is a hybrid facility that employs a Fluorescence Detector (FD) to perform nearly calorimetric measurements of Extensive Air Shower energies. To obtain reliable calorimetric information from the FD, the atmospheric conditions at the observatory need to be continuously monitored during data acquisition. In particular, light attenuation due to aerosols is an important atmospheric correction. The aerosol concentration is highly variable, so the aerosol attenuation needs to be evaluated hourly. We use light from the Central Laser Facility (CLF), located near the center of the observatory site, having an optical signature comparable to that of the highest energy showers detected by the FD. This paper presents two procedures developed to retrieve the aerosol attenuation of fluorescence light from CLF laser shots. Cross checks between the two methods demonstrate that re...
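    The aerosol correction described above follows the Beer-Lambert law: light traversing an aerosol optical depth tau is attenuated by a factor exp(-tau). A minimal sketch of this relationship (the tau values are illustrative assumptions, not Auger measurements, and this is not the observatory's analysis code):

```python
import math

def aerosol_transmission(tau: float) -> float:
    """Fraction of light surviving an aerosol optical depth tau (Beer-Lambert law)."""
    return math.exp(-tau)

def corrected_intensity(observed: float, tau: float) -> float:
    """Recover the unattenuated light intensity from an observed, attenuated signal."""
    return observed / aerosol_transmission(tau)

# Illustrative values: a clear night might have tau ~ 0.04, a hazy one tau ~ 0.1.
print(aerosol_transmission(0.04))  # ~0.96: about 4% of the light is lost
print(aerosol_transmission(0.10))  # ~0.90
```

Because tau(z,t) varies on hourly time-scales, the correction factor must be re-derived for each observing period, which is why the CLF fires laser shots throughout FD data acquisition.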

  3. 2010 Annual Wastewater Reuse Report for the Idaho National Laboratory Site's Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Mike lewis

    2011-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2009, through October 31, 2010. The report contains the following information: • Site description • Facility and system description • Permit required monitoring data and loading rates • Status of special compliance conditions • Discussion of the facility’s environmental impacts. During the 2010 permit year, approximately 2.2 million gallons of treated wastewater was land-applied to the irrigation area at the Central Facilities Area Sewage Treatment Plant.

  4. 2012 Annual Wastewater Reuse Report for the Idaho National Laboratory Site's Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Mike Lewis

    2013-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2011, through October 31, 2012. The report contains the following information: • Site description • Facility and system description • Permit required monitoring data and loading rates • Status of compliance conditions and activities • Discussion of the facility’s environmental impacts. During the 2012 permit year, no wastewater was land-applied to the irrigation area of the Central Facilities Area Sewage Treatment Plant.

  5. On the possibility of using vertically pointing Central Laser Facilities to calibrate the Cherenkov Telescope Array

    CERN Document Server

    Gaug, Markus

    2014-01-01

    A Central Laser Facility is a system composed of a laser placed at a certain distance from a light-detector array, emitting fast light pulses, typically in the vertical direction, with the aim of calibrating that array. During calibration runs, all detectors are pointed towards the same portion of the laser beam at a given altitude. Central Laser Facilities are used by various currently operating ultra-high-energy cosmic ray and imaging atmospheric Cherenkov telescope arrays. In view of the future Cherenkov Telescope Array, a similar device could provide a fast calibration of the whole installation at different wavelengths. The relative precision (i.e., of each individual telescope with respect to the rest of the array) is expected to be better than 5%, while an absolute calibration should reach a precision of 4-11%, if certain design requirements are met. Additionally, more precise monitoring of the sensitivity of each telescope can be made on time-scales of days to years.

  6. Upgrade of the Central Laser Facility at the Pierre Auger Observatory and first results

    Science.gov (United States)

    Medina-Hernandez, Carlos; Wiencke, Lawrence; Mayotte, Eric; Pierre Auger Collaboration

    2015-04-01

    The Pierre Auger Observatory (PAO) explores the nature and origin of cosmic rays with energies above 10^18 eV. It uses a hybrid technique that combines a Fluorescence Detector (FD) and a 3000 km² Surface Detector (SD) array. Two laser test beam facilities are located near the center of the observatory. The Central Laser Facility (CLF) and the eXtreme Laser Facility (XLF) track the atmospheric conditions during FD operations and perform additional functions. The CLF was upgraded substantially in 2013 with a solid state laser, new generation GPS, a robotic beam calibration system, and better thermal and dust isolation. The upgrade also includes a backscatter Raman LIDAR receiver, providing an independent measurement of the aerosol optical depth tau(z,t). We describe the new features, capabilities, and applications of the upgraded instrument, including tau(z,t) calculations for atmospheric monitoring using a data normalized method, laser energy calibration, and steered laser firing for arrival direction studies. We also present the first tau(z,t) results after the upgrade, using two independent techniques. One method uses the FD's measurements of the CLF's laser shots in bi-static configuration. The other uses the Raman LIDAR in backscatter configuration.

  7. High Resolution Muon Computed Tomography at Neutrino Beam Facilities

    CERN Document Server

    Suerfu, Burkhant

    2015-01-01

    X-ray computed tomography (CT) has an indispensable role in constructing 3D images of objects made from light materials. However, limited by absorption coefficients, X-rays cannot deeply penetrate materials such as copper and lead. Here we show via simulation that muon beams can provide high resolution tomographic images of dense objects and of structures within the interior of dense objects. The effects of resolution broadening from multiple scattering diminish with increasing muon momentum. As the momentum of the muon increases, the contrast of the image goes down and therefore requires higher resolution in the muon spectrometer to resolve the image. The variance of the measured muon momentum reaches a minimum and then increases with increasing muon momentum. The impact of the increase in variance is to require a higher integrated muon flux to reduce fluctuations. The flux requirements and level of contrast needed for high resolution muon computed tomography are well matched to the muons produced in the pio...
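    The momentum dependence of the multiple-scattering broadening noted above is commonly estimated with the Highland approximation from the Particle Data Group, theta_0 ≈ (13.6 MeV / (beta·c·p)) · z · sqrt(x/X0) · [1 + 0.038·ln(x/X0)], where x/X0 is the material thickness in radiation lengths. A sketch of that standard formula (not code from the paper; the example thickness is an arbitrary assumption):

```python
import math

def highland_theta0(p_mev: float, beta: float, x_over_x0: float, z: float = 1.0) -> float:
    """RMS plane scattering angle in radians, via the PDG Highland approximation.

    p_mev: momentum in MeV/c; beta: v/c; x_over_x0: thickness in radiation lengths;
    z: charge number of the incident particle (1 for a muon).
    """
    return (13.6 / (beta * p_mev)) * z * math.sqrt(x_over_x0) * (
        1.0 + 0.038 * math.log(x_over_x0)
    )

# Scattering angle falls roughly as 1/p: for the same target (here 10 radiation
# lengths, an arbitrary illustrative choice), a 10 GeV/c muon scatters 10x less
# than a 1 GeV/c muon, which is why resolution improves at higher momentum.
theta_1gev = highland_theta0(1000.0, 1.0, 10.0)
theta_10gev = highland_theta0(10000.0, 1.0, 10.0)
```

This 1/p scaling is the quantitative basis for the abstract's statement that resolution broadening from multiple scattering diminishes with increasing muon momentum.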

  8. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, J; Sartirana, A

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on thei...

  9. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, Jose

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on the...

  10. Evaluating Computer Technology Integration in a Centralized School System

    Science.gov (United States)

    Eteokleous, N.

    2008-01-01

    The study evaluated the current situation in Cyprus elementary classrooms regarding computer technology integration in an attempt to identify ways of expanding teachers' and students' experiences with computer technology. It examined how Cypriot elementary teachers use computers, and the factors that influence computer integration in their…

  11. Peta-scale QMC simulations on DOE leadership computing facilities

    Science.gov (United States)

    Kim, Jeongnim; Ab Initio Network Collaboration

    2014-03-01

    Continuum quantum Monte Carlo (QMC) has proved to be an invaluable tool for predicting the properties of matter from fundamental principles. Even with numerous innovations in methods, algorithms and codes, QMC simulations of realistic problems of 1000s and more electrons are demanding, requiring millions of core hours to achieve the target chemical accuracy. The multiple forms of parallelism afforded by QMC algorithms and high compute-to-communication ratio make them ideal candidates for acceleration in the multi/many-core paradigm. We have ported and tuned QMCPACK to recently deployed DOE deca-petaflop systems, Titan (Cray XK7 CPU/GPGPU) and Mira (IBM Blue Gene/Q). The efficiency gains through improved algorithms and architecture-specific tuning and, most importantly, the vast increase in computing power have opened up opportunities to apply QMC at unprecedented scales, accuracy and time-to-solution. We present large-scale QMC simulations to study energetics of layered materials where vdW interactions play critical roles. Collaboration supported through the Predictive Theory and Modeling for Materials and Chemical Science program by the Basic Energy Science, Department of Energy.

  12. Fluid Centrality: A Social Network Analysis of Social-Technical Relations in Computer-Mediated Communication

    Science.gov (United States)

    Enriquez, Judith Guevarra

    2010-01-01

    In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measure is quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC, and computer-supported collaborative learning research. It argues that measuring…
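    Degree centrality, the simplest of the centrality measures used in SNA, counts each participant's ties (for instance, who exchanged messages with whom in a CMC forum), normalized by the number of possible ties. A minimal sketch with hypothetical participants (the names and reply network are invented for illustration, not drawn from the study):

```python
from collections import defaultdict

def degree_centrality(edges):
    """Normalized degree centrality, ties / (n - 1), for each node of an undirected graph."""
    degree = defaultdict(int)
    nodes = set()
    for a, b in edges:
        nodes.update((a, b))
        degree[a] += 1
        degree[b] += 1
    n = len(nodes)
    return {v: degree[v] / (n - 1) for v in nodes}

# Hypothetical reply network among four forum participants.
replies = [("ana", "ben"), ("ana", "chris"), ("ana", "dee"), ("ben", "chris")]
print(degree_centrality(replies)["ana"])  # 1.0: ana is tied to every other participant
```

A participant with centrality 1.0 has interacted with everyone else, the kind of structurally dominant position that centrality-based CMC studies use as a proxy for influence.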

  13. Computer/information security design approaches for Complex 21/Reconfiguration facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hunteman, W.J.; Zack, N.R. [Los Alamos National Lab., NM (United States). Safeguards Systems Group; Jaeger, C.D. [Sandia National Labs., Albuquerque, NM (United States). Surety/Dismantlement Dept.

    1993-12-31

    Los Alamos National Laboratory and Sandia National Laboratories have been designated the technical lead laboratories to develop the design of the computer/information security, safeguards, and physical security systems for all of the DOE Complex 21/Reconfiguration facilities. All of the automated information processing systems and networks in these facilities will be required to implement the new DOE orders on computer and information security. The planned approach for a highly integrated information processing capability in each of the facilities will require careful consideration of the requirements in DOE Orders 5639.6 and 1360.2A. The various information protection requirements and user clearances within the facilities will also have a significant effect on the design of the systems and networks. Fulfilling the requirements for proper protection of the information and compliance with DOE orders will be possible because the computer and information security concerns are being incorporated in the early design activities. This paper will discuss the computer and information security issues being addressed in the integrated design effort for the tritium, uranium/lithium, plutonium, plutonium storage, and high explosive/assembly facilities.

  14. Computer/information security design approaches for Complex 21/Reconfiguration facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hunteman, W.J.; Zack, N.R. [Los Alamos National Lab., NM (United States); Jaeger, C.D. [Sandia National Labs., Albuquerque, NM (United States)

    1993-08-01

    Los Alamos National Laboratory and Sandia National Laboratories have been designated the technical lead laboratories to develop the design of the computer/information security, safeguards, and physical security systems for all of the DOE Complex 21/Reconfiguration facilities. All of the automated information processing systems and networks in these facilities will be required to implement the new DOE orders on computer and information security. The planned approach for a highly integrated information processing capability in each of the facilities will require careful consideration of the requirements in DOE Orders 5639.6 and 1360.2A. The various information protection requirements and user clearances within the facilities will also have a significant effect on the design of the systems and networks. Fulfilling the requirements for proper protection of the information and compliance with DOE orders will be possible because the computer and information security concerns are being incorporated in the early design activities. This paper will discuss the computer and information security issues being addressed in the integrated design effort for the tritium, uranium/lithium, plutonium, plutonium storage, and high explosive/assembly facilities.

  15. Operational Circular nr 5 - October 2000 USE OF CERN COMPUTING FACILITIES

    CERN Multimedia

    Division HR

    2000-01-01

    New rules covering the use of CERN Computing facilities have been drawn up. All users of CERN’s computing facilities are subject to these rules, as well as to the subsidiary rules of use. The Computing Rules explicitly address your responsibility for taking reasonable precautions to protect computing equipment and accounts. In particular, passwords must not be easily guessed or obtained by others. Given the difficulty of completely separating work and personal use of computing facilities, the rules define under which conditions limited personal use is tolerated. For example, limited personal use of e-mail, news groups or web browsing is tolerated in your private time, provided CERN resources and your official duties are not adversely affected. The full conditions governing use of CERN’s computing facilities are contained in Operational Circular N° 5, which you are requested to read. Full details are available at : http://www.cern.ch/ComputingRules Copies of the circular are also available in the Divis...

  16. Feasibility of a medium-size central cogenerated energy facility, energy management memorandum

    Science.gov (United States)

    Porter, R. W.

    1982-09-01

    The thermal-economic feasibility of a medium-size central cogenerated energy facility designed to serve five varied industries was studied. Generation options included one dual-fuel diesel and one gas turbine, both with waste heat boilers, and five fired boilers. Fuels included natural gas and, for the fired-boiler cases, also low-sulphur coal and municipal refuse. The fired-boiler cogeneration systems employed back-pressure steam turbines. For coal and refuse, the option of steam only, without cogeneration, was also assessed. The refuse-fired cases utilized modular incinerators. The options provided for a wide range of steam and electrical capacities. Deficient steam was assumed to be generated independently in existing equipment. Excess electrical power over that which could be displaced was assumed to be sold to Commonwealth Edison Company under PURPA (the Public Utility Regulatory Policies Act). The facility was assumed to be operated by a mutually owned corporation formed by the cogenerated power users. The economic analysis was predicated on currently applicable energy-investment tax credits and accelerated depreciation for a January 1985 startup date. Based on 100% equity financing, the results indicated that the best alternative was the modular-incinerator cogeneration system.

  17. 2011 Annual Wastewater Reuse Report for the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Michael G. Lewis

    2012-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (LA-000141-03), for the wastewater land application site at Idaho National Laboratory Site's Central Facilities Area Sewage Treatment Plant from November 1, 2010, through October 31, 2011. The report contains the following information: (1) Site description; (2) Facility and system description; (3) Permit required monitoring data and loading rates; (4) Status of special compliance conditions and activities; and (5) Discussion of the facility's environmental impacts. During the 2011 permit year, approximately 1.22 million gallons of treated wastewater was land-applied to the irrigation area at the Central Facilities Area Sewage Treatment Plant.

  18. A digital computer propulsion control facility: Description of capabilities and summary of experimental program results

    Science.gov (United States)

    Zeller, J. R.; Arpasi, D. J.; Lehtinen, B.

    1976-01-01

    Flight weight digital computers are being used today to carry out many of the propulsion system control functions previously delegated exclusively to hydromechanical controllers. An operational digital computer facility for propulsion control mode studies has been used successfully in several experimental programs. This paper describes the system and some of the results concerned with engine control, inlet control, and inlet engine integrated control. Analytical designs for the digital propulsion control modes include both classical and modern/optimal techniques.

  19. Thermal analysis of the unloading cell of the Spanish centralized interim storage facility (CISF); Analisis termico de la celda de descarga del almacen temporal centralizado (ATC)

    Energy Technology Data Exchange (ETDEWEB)

    Perez Dominguez, J. R.; Perez Vara, R.; Huelamo Martinez, E.

    2016-08-01

    This article deals with the thermal analysis performed for the Unloading Cell of Spain's Centralized Interim Storage Facility, CISF (ATC, in Spanish). The analyses are done using computational fluid dynamics (CFD) simulation, with the aim of obtaining the air flow required to remove the residual heat of the elements stored in the cell. Compliance with the admissible heat limits is checked against the results obtained in the various operation and accident modes. The calculation model is flexible enough to allow carrying out a number of sensitivity analyses with the different parameters involved in the process. (Author)

  20. Process as Content in Computer Science Education: Empirical Determination of Central Processes

    Science.gov (United States)

    Zendler, A.; Spannagel, C.; Klaudt, D.

    2008-01-01

    Computer science education should not be based on short-term developments but on content that is observable in multiple domains of computer science, may be taught at every intellectual level, will be relevant in the longer term, and is related to everyday language and/or thinking. Recently, a catalogue of "central concepts" for computer science…

  1. Central and Eastern United States (CEUS) Seismic Source Characterization (SSC) for Nuclear Facilities Project

    Energy Technology Data Exchange (ETDEWEB)

    Kevin J. Coppersmith; Lawrence A. Salomone; Chris W. Fuller; Laura L. Glaser; Kathryn L. Hanson; Ross D. Hartleb; William R. Lettis; Scott C. Lindvall; Stephen M. McDuffie; Robin K. McGuire; Gerry L. Stirewalt; Gabriel R. Toro; Robert R. Youngs; David L. Slayter; Serkan B. Bozkurt; Randolph J. Cumbest; Valentina Montaldo Falero; Roseanne C. Perman; Allison M. Shumway; Frank H. Syms; Martitia (Tish) P. Tuttle

    2012-01-31

    Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts. The model will be used to assess the present-day composite distribution for seismic sources along with their characterization in the CEUS and uncertainty. In addition, this model is in a form suitable for use in PSHA evaluations for regulatory activities, such as Early Site Permit (ESPs) and Combined Operating License Applications (COLAs). Applications, Values, and Use Development of a regional CEUS seismic source model will provide value to those who (1) have submitted an ESP or COLA for Nuclear Regulatory Commission (NRC) review before 2011; (2) will submit an ESP or COLA for NRC review after 2011; (3) must respond to safety issues resulting from NRC Generic Issue 199 (GI-199) for existing plants and (4) will prepare PSHAs to meet design and periodic review requirements for current and future nuclear facilities. This work replaces a previous study performed approximately 25 years ago. Since that study was completed, substantial work has been done to improve the understanding of seismic sources and their characterization in the CEUS. Thus, a new regional SSC model provides a consistent, stable basis for computing PSHA for a future time span. Use of a new SSC model reduces the risk of delays in new plant licensing due to more conservative interpretations in the existing and future literature. Perspective The purpose of this study, jointly sponsored by EPRI, the U.S. Department of Energy (DOE), and the NRC was to develop a new CEUS SSC model. The team assembled to accomplish this purpose was composed of distinguished subject matter experts from industry, government, and academia. The resulting model is unique, and because this project has solicited input from the present-day larger technical community, it is not likely that there will be a need for significant revision for a number of years. See also Sponsors Perspective for more details. 
The goal of this project was to implement the CEUS SSC work plan

  2. A Fruitful Collaboration between ESO and the Max Planck Computing and Data Facility

    Science.gov (United States)

    Fourniol, N.; Zampieri, S.; Panea, M.

    2016-06-01

    The ESO Science Archive Facility (SAF) contains all La Silla Paranal Observatory raw data as well as, more recently introduced, processed data created at ESO with state-of-the-art pipelines or returned by the astronomical community. The SAF has been established for over 20 years and its current holding exceeds 700 terabytes. An overview of the content of the SAF and the preservation of its content is provided. The latest development to ensure the preservation of the SAF data, the provision of an independent backup copy of the whole SAF at the Max Planck Computing and Data Facility in Garching, is described.

  3. The development and operation of the international solar-terrestrial physics central data handling facility

    Science.gov (United States)

    Lehtonen, Kenneth

    1994-01-01

    The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) International Solar-Terrestrial Physics (ISTP) Program is committed to the development of a comprehensive, multi-mission ground data system which will support a variety of national and international scientific missions in an effort to study the flow of energy from the sun through the Earth-space environment, known as the geospace. A major component of the ISTP ground data system is an ISTP-dedicated Central Data Handling Facility (CDHF). Acquisition, development, and operation of the ISTP CDHF were delegated by the ISTP Project Office within the Flight Projects Directorate to the Information Processing Division (IPD) within the Mission Operations and Data Systems Directorate (MO&DSD). The ISTP CDHF supports the receipt, storage, and electronic access of the full complement of ISTP Level-zero science data; serves as the linchpin for the centralized processing and long-term storage of all key parameters generated either by the ISTP CDHF itself or received from external, ISTP Program approved sources; and provides the required networking and 'science-friendly' interfaces for the ISTP investigators. Once connected to the ISTP CDHF, the online catalog of key parameters can be browsed from their remote processing facilities for the immediate electronic receipt of selected key parameters using the NASA Science Internet (NSI), managed by NASA's Ames Research Center. The purpose of this paper is twofold: (1) to describe how the ISTP CDHF was successfully implemented and operated to support initially the Japanese Geomagnetic Tail (GEOTAIL) mission and correlative science investigations, and (2) to describe how the ISTP CDHF has been enhanced to support ongoing as well as future ISTP missions. Emphasis will be placed on how various project management approaches were undertaken that proved to be highly effective in delivering an operational ISTP CDHF to the Project on schedule and

  4. 2015 Annual Wastewater Reuse Report for the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Michael George [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2014, through October 31, 2015.

  5. 2013 Annual Wastewater Reuse Report for the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Mike Lewis

    2014-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2012, through October 31, 2013. The report contains, as applicable, the following information: • Site description • Facility and system description • Permit required monitoring data and loading rates • Status of compliance conditions and activities • Discussion of the facility’s environmental impacts. During the 2013 permit year, no wastewater was land-applied to the irrigation area of the Central Facilities Area Sewage Treatment Plant and therefore, no effluent flow volumes or samples were collected from wastewater sampling point WW-014102. However, soil samples were collected in October from soil monitoring unit SU-014101.

  6. High energy diode-pumped solid-state laser development at the Central Laser Facility

    Science.gov (United States)

    Mason, Paul D.; Banerjee, Saumyabrata; Ertel, Klaus; Phillips, P. Jonathan; Butcher, Thomas; Smith, Jodie; De Vido, Mariastefania; Chekhlov, Oleg; Hernandez-Gomez, Cristina; Edwards, Chris; Collier, John

    2016-04-01

In this paper we review the development of high energy, nanosecond pulsed diode-pumped solid-state lasers within the Central Laser Facility (CLF) based on cryogenic gas cooled multi-slab ceramic Yb:YAG amplifier technology. To date, two 10 J-scale systems have been developed, the DiPOLE prototype amplifier and an improved DiPOLE10 system, and most recently a larger-scale system, DiPOLE100, has been designed to produce 100 J pulses at up to 10 Hz. These systems have demonstrated amplification of 10 ns duration pulses at 1030 nm to energies in excess of 10 J at 10 Hz pulse repetition rate, and over 100 J at 1 Hz, with optical-to-optical conversion efficiencies of up to 27%. We present an overview of the cryo-amplifier concept and compare the design features of these three systems, including details of the amplifier designs, gain media, diode pump lasers and the cryogenic gas cooling systems. The most recent performance results from the three systems are presented along with future plans for high energy DPSSL development within the CLF.

  7. COMPUTING

    CERN Document Server

    I. Fisk

    2010-01-01

Introduction It has been a very active quarter in Computing, with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs submitted by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  8. Computer Support for Document Management in the Danish Central Government

    DEFF Research Database (Denmark)

    Hertzum, Morten

    1995-01-01

Document management systems are generally assumed to hold a potential for delegating the recording and retrieval of documents to professionals such as civil servants and for supporting the coordination and control of work, so-called workflow management. This study investigates the use and organizational impact of document management systems in the Danish central government. The currently used systems unfold around the recording of incoming and outgoing paper mail and have typically not been accompanied by organizational changes. Rather, document management tends to remain an appendix...

  9. Access to the energy system network simulator (ESNS), via remote computer terminals. [BNL CDC 7600/6600 computer facility

    Energy Technology Data Exchange (ETDEWEB)

    Reisman, A W

    1976-08-15

    The Energy System Network Simulator (ESNS) flow model is installed on the Brookhaven National Laboratory (BNL) CDC 7600/6600 computer facility for access by off-site users. The method of access available to outside users is through a system called CDC-INTERCOM, which allows communication between the BNL machines and remote teletype terminals. This write-up gives a brief description of INTERCOM for users unfamiliar with this system and a step-by-step guide to using INTERCOM in order to access ESNS.

  10. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    Science.gov (United States)

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  11. SynapSense Wireless Environmental Monitoring System of the RHIC & ATLAS Computing Facility at BNL

    Science.gov (United States)

    Casella, K.; Garcia, E.; Hogue, R.; Hollowell, C.; Strecker-Kellogg, W.; Wong, A.; Zaytsev, A.

    2014-06-01

RHIC & ATLAS Computing Facility (RACF) at BNL is a 15,000 sq. ft. facility hosting the IT equipment of the BNL ATLAS WLCG Tier-1 site, offline farms for the STAR and PHENIX experiments operating at the Relativistic Heavy Ion Collider (RHIC), the BNL Cloud installation, various Open Science Grid (OSG) resources, and many other small physics research oriented IT installations. The facility originated in 1990 and grew steadily to its present configuration of 4 physically isolated IT areas with a maximum capacity of about 1000 racks and a total peak power consumption of 1.5 MW. In June 2012 a project was initiated with the primary goal of replacing several environmental monitoring systems deployed earlier within RACF with a single commercial hardware and software solution by SynapSense Corporation, based on wireless sensor groups and proprietary SynapSense™ MapSense™ software, that offers a unified solution for monitoring the temperature and humidity within the rack/CRAC units as well as pressure distribution underneath the raised floor across the entire facility. The deployment was completed successfully in 2013. The new system also supports a set of additional features, such as capacity planning based on measurements of total heat load, power consumption monitoring and control, CRAC unit power consumption optimization based on feedback from the temperature measurements, and overall power usage efficiency estimations, that are not currently implemented within RACF but may be deployed in the future.
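The power usage efficiency estimate mentioned above is a simple ratio of total facility power to IT equipment power. A minimal sketch, with a hypothetical load split (the facility's actual IT load is not given here; only the 1.5 MW peak is stated):

```python
def power_usage_effectiveness(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power (dimensionless; 1.0 is ideal)."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical split of a 1.5 MW facility peak: 1.0 MW of IT load
print(power_usage_effectiveness(1500.0, 1000.0))  # → 1.5
```

A PUE of 1.5 would mean that for every watt reaching the IT equipment, another half watt goes to cooling and power distribution overhead.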

  12. Characterization and reclamation assessment for the Central Shops Diesel Storage Facility, Savannah River Site, Aiken, South Carolina

    Energy Technology Data Exchange (ETDEWEB)

    Fliermans, C.B.; Hazen, T.C.; Bledsoe, H.

    1993-10-01

    The contamination of subsurface terrestrial environments by organic contaminants is a global phenomenon. The remediation of such environments requires innovative assessment techniques and strategies for successful clean-ups. Central Shops Diesel Storage Facility at Savannah River Site was characterized to determine the extent of subsurface diesel fuel contamination using innovative approaches and effective bioremediation techniques for clean-up of the contaminant plume have been established.

  13. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Allcock, William; Beggio, Chris; Campbell, Stuart; Cherry, Andrew; Cholia, Shreyas; Dart, Eli; England, Clay; Fahey, Tim; Foertter, Fernanda; Goldstone, Robin; Hick, Jason; Karelitz, David; Kelly, Kaki; Monroe, Laura; Prabhat,; Skinner, David; White, Julia

    2014-10-17

U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Computing Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  14. Conceptual design of an ALICE Tier-2 centre. Integrated into a multi-purpose computing facility

    Energy Technology Data Exchange (ETDEWEB)

    Zynovyev, Mykhaylo

    2012-06-29

This thesis discusses the issues and challenges associated with the design and operation of a data analysis facility for a high-energy physics experiment at a multi-purpose computing centre. At the spotlight is a Tier-2 centre of the distributed computing model of the ALICE experiment at the Large Hadron Collider at CERN in Geneva, Switzerland. The design steps examined in the thesis include analysis and optimization of the I/O access patterns of the user workload, integration of the storage resources, and development of techniques for effective system administration and operation of the facility in a shared computing environment. A number of I/O access performance issues on multiple levels of the I/O subsystem, introduced by the use of hard disks for data storage, have been addressed by means of exhaustive benchmarking and thorough analysis of the I/O of the user applications in the ALICE software framework. Defining the set of requirements for the storage system, describing the potential performance bottlenecks and single points of failure, and examining possible ways to avoid them allows one to develop guidelines for selecting how to integrate the storage resources. A solution for preserving the experiment's specific software stack in a shared environment is presented along with its effects on the user workload performance. The proposal for a flexible model to deploy and operate the ALICE Tier-2 infrastructure and applications in a virtual environment through adoption of cloud computing technology and the 'Infrastructure as Code' concept completes the thesis. Scientific software applications can run efficiently in a virtual environment, and there is an urgent need to adapt the infrastructure for effective usage of cloud resources.

  15. The HEPCloud Facility: elastic computing for High Energy Physics – The NOvA Use Case

    Energy Technology Data Exchange (ETDEWEB)

    Fuess, S. [Fermilab; Garzoglio, G. [Fermilab; Holzman, B. [Fermilab; Kennedy, R. [Fermilab; Norman, A. [Fermilab; Timm, S. [Fermilab; Tiradani, A. [Fermilab

    2017-03-15

The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdowns, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial Clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project has demonstrated the use of the “elastic” provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores, a 25 percent increase, in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the locally allocated resources and utilizing local AWS S3 storage to optimize data handling operations and costs. NOvA was using the same familiar services used for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper
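The cost-minimization role of a component like the Decision Engine can be illustrated with a greedy sketch: fill the demand from the cheapest resource pools first. Pool names, capacities, and prices below are hypothetical, and the real Decision Engine weighs many more factors (spot-price volatility, job interruption risk, data locality):

```python
def cheapest_allocation(demand_cores, pools):
    """Greedily cover a core demand from the cheapest pools first.

    pools: list of (name, available_cores, dollars_per_core_hour) tuples.
    Returns a dict mapping pool name -> cores allocated from that pool.
    """
    allocation = {}
    remaining = demand_cores
    for name, available, _price in sorted(pools, key=lambda p: p[2]):
        if remaining <= 0:
            break
        take = min(available, remaining)
        allocation[name] = take
        remaining -= take
    if remaining > 0:
        raise RuntimeError("demand exceeds total available capacity")
    return allocation

# Hypothetical pools: already-paid-for local capacity, cheap spot, pricier on-demand
pools = [("local_grid", 5000, 0.0),
         ("aws_spot", 60000, 0.012),
         ("aws_ondemand", 100000, 0.05)]
print(cheapest_allocation(12000, pools))
# → {'local_grid': 5000, 'aws_spot': 7000}
```

The design choice mirrors the elastic model described above: local, sunk-cost resources are exhausted first, and the cloud is used only for the burst beyond them.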

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  17. 2014 Annual Wastewater Reuse Report for the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Mike [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-02-01

This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2013, through October 31, 2014. The report contains, as applicable, the following information: site description; facility and system description; permit-required monitoring data and loading rates; status of compliance conditions and activities; and a discussion of the facility’s environmental impacts. The current permit expires on March 16, 2015. A permit renewal application was submitted to the Idaho Department of Environmental Quality on September 15, 2014. During the 2014 permit year, no wastewater was land-applied to the irrigation area of the Central Facilities Area Sewage Treatment Plant and, therefore, no effluent flow volumes or samples were collected from wastewater sampling point WW-014102. Seepage testing of the three lagoons was performed between August 26, 2014 and September 22, 2014. Seepage rates from Lagoons 1 and 2 were below the 0.25 inches/day requirement; however, the rate from Lagoon 3 was above 0.25 inches/day. Lagoon 3 has been isolated and is being evaluated for future use or permanent removal from service.

  18. Enhanced Computational Infrastructure for Data Analysis at the DIII-D National Fusion Facility

    Energy Technology Data Exchange (ETDEWEB)

    Schissel, D.P.; Peng, Q.; Schachter, J.; Terpstra, T.B.; Casper, T.A.; Freeman, J.; Jong, R.; Keith, K.M.; Meyer, W.H.; Parker, C.T.

    1999-08-01

Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available between pulses, giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator, since the DIII-D National Team consists of scientists from 9 national laboratories, 19 foreign laboratories, 16 universities, and 5 industrial partnerships. As a result of this work, DIII-D data is available on a 24 x 7 basis from a set of viewing and analysis tools that can be run on either the collaborators' or DIII-D's computer systems. Additionally, a Web-based data and code documentation system has been created to aid the novice and expert user alike.

  19. Investigating Preterm Care at the Facility Level: Stakeholder Qualitative Study in Central and Southern Malawi.

    Science.gov (United States)

    Gondwe, Austrida; Munthali, Alister; Ashorn, Per; Ashorn, Ulla

    2016-07-01

Objectives Malawi is estimated to have one of the highest preterm birth rates in the world. However, care of preterm infants at the facility level in Malawi has not been explored. We aimed to explore the views of health stakeholders about the care of preterm infants in health facilities and the existence of any policy protocol documents guiding the delivery of care to these infants. Methods We conducted 16 in-depth interviews with health stakeholders (11 service providers and 5 policy makers) using an interview guide and asked for any existing policy protocol documents guiding care for preterm infants in the health facilities in Malawi. The collected documents were reviewed and all the interviews were digitally recorded, transcribed and translated. All data were analysed using a content analysis approach. Results We identified four policy protocol documents and, of these, one had detailed information explaining the care of preterm infants. Policy makers reported that policy protocol documents to guide care for preterm infants were available in the health facilities, but the majority (63.6%) of the service providers lacked knowledge about the existence of these documents. Health stakeholders reported several challenges in caring for preterm infants, including a lack of staff trained in preterm infant care, antibiotics, space, and supervision, and a poor referral system. Conclusions Our study highlights that improving health care service providers' knowledge of preterm infant care is an integral part of preterm child birth care. Our findings suggest that policy makers and health decision makers should retain those trained in preterm newborn care in the health facility's preterm unit.

  20. The Overview of the National Ignition Facility Distributed Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L J; Bettenhausen, R C; Carey, R A; Estes, C M; Fisher, J M; Krammen, J E; Reed, R K; VanArsdall, P J; Woodruff, J P

    2001-10-15

The Integrated Computer Control System (ICCS) for the National Ignition Facility (NIF) is a layered architecture of 300 front-end processors (FEP) coordinated by supervisor subsystems including automatic beam alignment and wavefront control, laser and target diagnostics, pulse power, and shot control timed to 30 ps. FEP computers incorporate either VxWorks on PowerPC or Solaris on UltraSPARC processors that interface to over 45,000 control points attached to VME-bus or PCI-bus crates, respectively. Typical devices are stepping motors, transient digitizers, calorimeters, and photodiodes. The front-end layer also includes a segment of an additional 14,000 control points for industrial controls including vacuum, argon, synthetic air, and safety interlocks, implemented with Allen-Bradley programmable logic controllers (PLCs). The computer network is augmented by asynchronous transfer mode (ATM) links that deliver video streams from 500 sensor cameras monitoring the 192 laser beams to operator workstations. Software is based on an object-oriented framework using CORBA distribution that incorporates services for archiving, machine configuration, graphical user interface, monitoring, event logging, scripting, alert management, and access control. Software coding, using a mixed language environment of Ada95 and Java, is one-third complete at over 300 thousand source lines. Control system installation is currently under way for the first 8 beams, with project completion scheduled for 2008.

  1. Addressing capability computing challenges of high-resolution global climate modelling at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin

    2014-05-01

During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25 degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in size from 20 MB to over 100 GB. Effective utilization of leadership class resources requires careful planning and preparation. The application software, such as CESM, needs to be ported, optimized and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours", are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan is a Cray XK7 system, capable of a theoretical peak performance of over 27 PFlop/s, and consists of 18,688 compute nodes, with a NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU in every node, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560

  2. Lustre Distributed Name Space (DNE) Evaluation at the Oak Ridge Leadership Computing Facility (OLCF)

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, James S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Leverman, Dustin B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Hanley, Jesse A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Oral, Sarp [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences

    2016-08-22

This document describes the Lustre Distributed Name Space (DNE) evaluation carried out at the Oak Ridge Leadership Computing Facility (OLCF) between 2014 and 2015. DNE is a development project funded by OpenSFS to improve Lustre metadata performance and scalability. The development effort has been split into two parts, the first part (DNE P1) providing support for remote directories over remote Lustre Metadata Server (MDS) nodes and Metadata Target (MDT) devices, while the second phase (DNE P2) addressed split directories over multiple remote MDS nodes and MDT devices. The OLCF has been actively evaluating the performance, reliability, and functionality of both DNE phases. For these tests, internal OLCF testbeds were used. Results are promising, and OLCF is planning a full DNE deployment on production systems in the mid-2016 timeframe.
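For illustration, the two DNE phases correspond to two directory layouts a client can request with the standard `lfs` tool; this is an ops sketch assuming a DNE-enabled Lustre filesystem (the mount point, directory names, and MDT indices are hypothetical, not OLCF's), and it requires a live Lustre mount to run:

```shell
# DNE phase 1: place a new directory on a specific remote MDT
lfs mkdir -i 1 /lustre/project/remote_dir        # directory served by MDT index 1

# DNE phase 2: split a directory's metadata across multiple MDTs
lfs setdirstripe -c 2 -i 0 /lustre/project/split_dir

# Inspect the resulting metadata layout
lfs getdirstripe /lustre/project/split_dir
```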

  3. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  4. Status Of The National Ignition Campaign And National Ignition Facility Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L; Brunton, G; Carey, R; Demaret, R; Fisher, J; Fishler, B; Ludwigsen, P; Marshall, C; Reed, R; Shelton, R; Townsend, S

    2011-03-18

The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility that contains a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn. NIF is operated by the Integrated Computer Control System (ICCS), an object-oriented, CORBA-based system distributed among over 1800 front-end processors, embedded controllers and supervisory servers. In the fall of 2010, a set of experiments began with deuterium and tritium filled targets as part of the National Ignition Campaign (NIC). At present, all 192 laser beams routinely fire to target chamber center to conduct fusion and high energy density experiments. During the past year, the control system was expanded to include automation of the cryogenic target system, and over 20 diagnostic systems were deployed and utilized to support fusion experiments. This talk discusses the current status of the NIC and the plan for controls and information systems to support these experiments on the path to ignition.

  5. Interventions on central computing services during the weekend of 21 and 22 August

    CERN Multimedia

    2004-01-01

As part of the planned upgrade of the computer centre infrastructure to meet the LHC computing needs, approximately 150 servers, hosting in particular the NICE home directories, Mail services and Web services, will need to be physically relocated to another part of the computing hall during the weekend of 21 and 22 August. On Saturday 21 August, starting from 8:30 a.m., interruptions of typically 60 minutes will take place on the following central computing services: NICE and the whole Windows infrastructure, Mail services, file services (including home directories and DFS workspaces), Web services, VPN access, and Windows Terminal Services. During any interruption, incoming mail from outside CERN will be queued and delivered as soon as the service is operational again. All services should be available again on Saturday 21 August at 17:30, but a few additional interruptions will be possible after that time and on Sunday 22 August. IT Department

  6. Central Nervous System Based Computing Models for Shelf Life Prediction of Soft Mouth Melting Milk Cakes

    Directory of Open Access Journals (Sweden)

    Gyanendra Kumar Goyal

    2012-04-01

This paper presents the latency and potential of a central nervous system based intelligent computing system for detecting the shelf life of soft mouth melting milk cakes stored at 10 °C. Soft mouth melting milk cakes are an exquisite sweetmeat cuisine made out of heat- and acid-thickened solidified sweetened milk. In today's highly competitive market, consumers look for good quality food products, and shelf life is a good and accurate indicator of food quality and safety. To achieve good quality of food products, detection of shelf life is important. The central nervous system based intelligent computing model developed here predicted a shelf life of 19.82 days, as against the experimental shelf life of 21 days.
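The "central nervous system based" model described above is an artificial neural network. A minimal sketch of such a network's forward pass, producing a shelf-life estimate in days; the weights, the 4-3-1 architecture, and the input features (e.g., normalized moisture, acidity, tyrosine, and texture scores) are all hypothetical stand-ins, since the paper's trained model is not reproduced here:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

# Illustrative, untrained weights for a tiny 4-3-1 multilayer perceptron
W1 = np.array([[0.2, -0.1, 0.4],
               [0.5, 0.3, -0.2],
               [-0.3, 0.2, 0.1],
               [0.1, -0.4, 0.3]])
b1 = np.array([0.1, -0.2, 0.05])
W2 = np.array([[0.6], [-0.4], [0.3]])
b2 = np.array([20.0])  # output bias near the expected shelf-life scale (days)

def predict_shelf_life(features):
    """Forward pass: hidden ReLU layer, then a linear output in days."""
    hidden = relu(features @ W1 + b1)
    return float(hidden @ W2 + b2)

# Hypothetical normalized feature vector for one stored sample
print(predict_shelf_life(np.array([0.5, 0.2, 0.1, 0.3])))
```

In the actual study, the weights would be fitted to sensory and chemical measurements of cakes sampled over storage time, which is how a prediction like 19.82 days against an experimental 21 days would be obtained.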

  7. A centralized hazardous waste treatment plant: the facilities of the ZVSMM at Schwabach as an example

    Energy Technology Data Exchange (ETDEWEB)

    Amsoneit, Norbert [Zweckverband Sondermuell-Entsorgung Mittelfranken, Rednitzhembach (Germany)

    1993-12-31

In this work a centralized hazardous waste treatment plant is described and its infrastructure is presented. Special emphasis is given to the handling of the residues produced and the different treatment processes through to final disposal. 2 refs., 4 figs.

  8. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009); and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current generation Petascale capable simulation codes towards the performance levels required for running on future Exascale systems.
One of the techniques pursued by ECMWF is to use Fortran2008 coarrays to overlap computations and communications and

  9. The effects of buffer strips and bioretention facilities on agricultural productivity and environmental quality in Central Africa

    Science.gov (United States)

    Gilroy, Kristin L.; McCuen, Richard H.

    2010-05-01

    Land degradation is a growing concern in Central Africa as poor management practices continue to cause erosion and increase water runoff in agricultural fields. The implementation of best management practices (BMPs) is needed; however, productivity is often indirectly related to the environmental benefits of such practices, and resource constraints often exist. The purpose of this study was to determine the effects of bioretention facilities and buffer strips on environmental quality with productivity and resources as constraints. A water quantity and quality model for an agricultural field in Central Africa was developed. Analyses were conducted to assess the marginal benefits of each BMP, the effect of different BMP combinations on environmental quality and productivity, and the effect of data uncertainty and location uncertainty on model predictions. The results showed that bioretention pits were more effective than buffer strips in increasing environmental quality. Productivity was shown to be directly related to bioretention pits, thus environmental quality can be attained without sacrificing productivity. Data uncertainties resulted in changes in the environmental quality values, but trends remained the same. Guidelines were provided to assist design engineers in developing BMP scenarios that provide the greatest productivity and environmental quality for the constraints involved. The results of this study will bring awareness to the possibility of attaining environmental quality without sacrificing productivity, as well as to the need for accurate data in Central Africa.

  10. Neuromotor recovery from stroke: Computational models at central, functional, and muscle synergy level

    Directory of Open Access Journals (Sweden)

    Maura eCasadio

    2013-08-01

    Computational models of neuromotor recovery after a stroke might help to unveil the underlying physiological mechanisms and might suggest how to make recovery faster and more effective. At least in principle, these models could serve: (i) to provide testable hypotheses on the nature of recovery; (ii) to predict the recovery of individual patients; (iii) to design patient-specific 'optimal' therapy, by setting the treatment variables for maximizing the amount of recovery or for achieving a better generalization of the learned abilities across different tasks. Here we review the state of the art of computational models for neuromotor recovery through exercise, and their implications for treatment. We show that to properly account for the computational mechanisms of neuromotor recovery, multiple levels of description need to be taken into account. The review specifically covers models of recovery at the central, functional, and muscle synergy levels.

  11. Techno-economic assessment of central sorting at material recovery facilities

    DEFF Research Database (Denmark)

    Cimpan, Ciprian; Maul, Anja; Wenzel, Henrik

    2016-01-01

    Simulation of technical and economic performance for materials recovery facilities (MRFs) is a basic requirement for planning new, or evaluating existing, separate waste collection and recycling systems. This study mitigates the current pervasive scarcity of data on process efficiency and costs by documenting typical steps taken in a techno-economic assessment of MRFs, using the specific example of lightweight packaging waste (LWP) sorting in Germany. Thus, the study followed the steps of dimensioning of buildings and equipment, calculation of processing costs, and projections of revenues from material sales and sorting residues disposal costs. Material flows through the plants were simulated considering both optimal process conditions and real or typical conditions characterised by downtime and frequent operation at overcapacity. By modelling four plants of progressively higher capacity (size...

  12. The National Ignition Facility: Status of the Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Van Arsdall, P J; Bryant, R; Carey, R; Casavant, D; Demaret, R; Edwards, O; Ferguson, W; Krammen, J; Lagin, L; Larson, D; Lee, A; Ludwigsen, P; Miller, M; Moses, E; Nyholm, R; Reed, R; Shelton, R; Wuest, C

    2003-10-13

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately three quarters complete, with over 750 thousand source lines of code having undergone off-line verification tests and been deployed to the facility. NIF has entered the first phases of its laser commissioning program. NIF's highest 3ω single laser beam performance is 10.4 kJ, equivalent to 2 MJ

  13. A service-based SLA (Service Level Agreement) for the RACF (RHIC and ATLAS computing facility) at brookhaven national lab

    Science.gov (United States)

    Karasawa, Mizuka; Chan, Tony; Smith, Jason

    2010-04-01

    The RACF provides computing support to a broad spectrum of scientific programs at Brookhaven. The continuing growth of the facility, the diverse needs of the scientific programs, and the increasingly prominent role of distributed computing require the RACF to change from a system-based to a service-based SLA with its user communities. A service-based SLA allows the RACF to coordinate more efficiently the operation, maintenance and development of the facility by mapping out a matrix of system and service dependencies and by creating a new, configurable alarm management layer that automates service alerts and notification of operations staff. This paper describes the adjustments made by the RACF to transition to a service-based SLA, including the integration of its monitoring software, alarm notification mechanism, and service ticket system to make the new SLA a reality.
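
    A matrix of system and service dependencies like the one described can be represented as a small directed graph and queried to decide which services an alarm must escalate to. A minimal sketch, assuming a hypothetical service inventory (the names and the `DEPENDS_ON` map below are illustrative, not the RACF's actual configuration):

    ```python
    # Hypothetical service-to-component dependency map, in the spirit of the
    # RACF matrix of system and service dependencies.
    DEPENDS_ON = {
        "batch":   {"scheduler", "storage"},
        "storage": {"network"},
        "grid":    {"batch", "network"},
    }

    def affected_services(failed_component, deps=DEPENDS_ON):
        """Transitively collect every service impacted by a failed component,
        so alarm notifications can be fanned out to the right operators."""
        impacted = set()
        changed = True
        while changed:
            changed = False
            for svc, reqs in deps.items():
                # a service is impacted if it needs the failed component
                # directly, or needs another already-impacted service
                if svc not in impacted and (failed_component in reqs or reqs & impacted):
                    impacted.add(svc)
                    changed = True
        return impacted
    ```

    A network outage cascades through storage into batch and grid, while a scheduler outage touches only the batch-dependent services.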

  14. Status of the National Ignition Facility Integrated Computer Control System (ICCS) on the path to ignition

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L.J. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94550 (United States)], E-mail: lagin1@llnl.gov; Bettenhausen, R.C.; Bowers, G.A.; Carey, R.W.; Edwards, O.D.; Estes, C.M.; Demaret, R.D.; Ferguson, S.W.; Fisher, J.M.; Ho, J.C.; Ludwigsen, A.P.; Mathisen, D.G.; Marshall, C.D.; Matone, J.T.; McGuigan, D.L.; Sanchez, R.J.; Stout, E.A.; Tekle, E.A.; Townsend, S.L.; Van Arsdall, P.J. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94550 (United States)] (and others)

    2008-04-15

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility under construction that will contain a 192-beam, 1.8-MJ, 500-TW, ultraviolet laser system together with a 10-m diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. NIF is comprised of 24 independent bundles of eight beams each using laser hardware that is modularized into more than 6000 line replaceable units such as optical assemblies, laser amplifiers, and multi-function sensor packages containing 60,000 control and diagnostic points. NIF is operated by the large-scale Integrated Computer Control System (ICCS) in an architecture partitioned by bundle and distributed among over 800 front-end processors and 50 supervisory servers. NIF's automated control subsystems are built from a common object-oriented software framework based on CORBA distribution that deploys the software across the computer network and achieves interoperation between different languages and target architectures. A shot automation framework has been deployed during the past year to orchestrate and automate shots performed at the NIF using the ICCS. In December 2006, a full cluster of 48 beams of NIF was fired simultaneously, demonstrating that the independent bundle control system will scale to full scale of 192 beams. At present, 72 beams have been commissioned and have demonstrated 1.4-MJ capability of infrared light. During the next 2 years, the control system will be expanded in preparation for project completion in 2009 to include automation of target area systems including

  15. Nuclear-nuclear collision centrality determination by the spectators calorimeter for the MPD setup at the NICA facility

    Energy Technology Data Exchange (ETDEWEB)

    Golubeva, M. B.; Guber, F. F.; Ivashkin, A. P. [Russian Academy of Sciences, Institute for Nuclear Research (Russian Federation); Isupov, A. Yu. [Joint Institute for Nuclear Research (Russian Federation); Kurepin, A. B. [Russian Academy of Sciences, Institute for Nuclear Research (Russian Federation); Litvinenko, A. G., E-mail: litvin@moonhe.jinr.ru; Litvinenko, E. I.; Migulina, I. I.; Peresedov, V. F. [Joint Institute for Nuclear Research (Russian Federation)

    2013-01-15

    The work conditions of the hadron calorimeter for spectator registration (Zero Degree Calorimeter, ZDC) were studied for heavy-nuclei collisions at invariant energies of several GeV. The ZDC simulations were performed for the MPD (Multi-Purpose Detector) at the NICA (Nuclotron-based Ion Collider fAcility) collider, which are under development at the Joint Institute for Nuclear Research (JINR, Dubna). Taking into account the spectator nuclear fragments leads to a nonmonotonic dependence of the ZDC response on the impact parameter. The reason for this dependence, studied with several event generators, is the primary-beam hole in the ZDC center. It is shown that the ZDC signal should be combined with data from other MPD-NICA detector subsystems to determine centrality.

  16. EXPERIMENTAL AND COMPUTATIONAL ACTIVITIES AT THE OREGON STATE UNIVERSITY NEES TSUNAMI RESEARCH FACILITY

    Directory of Open Access Journals (Sweden)

    S.C. Yim

    2009-01-01

    A diverse series of research projects have taken place or are underway at the NEES Tsunami Research Facility at Oregon State University. Projects range from the simulation of the processes and effects of tsunamis generated by sub-aerial and submarine landslides (NEESR, Georgia Tech), model comparisons of tsunami wave effects on bottom profiles and scouring (NEESR, Princeton University), model comparisons of wave-induced motions on rigid and free bodies (Shared-Use, Cornell), numerical model simulations and testing of breaking waves and inundation over topography (NEESR, TAMU), structural testing and development of standards for tsunami engineering and design (NEESR, University of Hawaii), and wave loads on coastal bridge structures (non-NEES), to upgrading the two-dimensional wave generator of the Large Wave Flume. A NEESR payload project (Colorado State University) was undertaken that seeks to improve the understanding of the stresses from wave loading and run-up on residential structures. Advanced computational tools for coupling fluid-structure interaction, including turbulence, contact, and impact, are being developed to assist with the design of experiments and to complement parametric studies. These projects will contribute towards understanding the physical processes that occur during earthquake-generated tsunamis, including structural stress, debris flow and scour, inundation and overland flow, and landslide-generated tsunamis. Analytical and numerical model development and comparisons with the experimental results give engineers additional predictive tools to assist in the development of robust structures as well as the identification of hazard zones and formulation of hazard plans.

  17. Assess and improve the sustainability of water treatment facility using Computational Fluid Dynamics

    Science.gov (United States)

    Zhang, Jie; Tejada-Martinez, Andres; Lei, Hongxia; Zhang, Qiong

    2016-11-01

    Fluid-flow problems in the water treatment industry are often simplified or omitted, since the focus is usually on the chemical processes only. However, hydraulics also plays an important role in determining effluent water quality. Recent studies have demonstrated that computational fluid dynamics (CFD) has the ability to simulate the physical and chemical processes in reactive flows in water treatment facilities, such as in chlorine and ozone disinfection tanks. This study presents the results from CFD simulations of reactive flow in an existing full-scale ozone disinfection tank and in potential designs. Through analysis of the simulation results, we found that the baffling factor and CT10 are not optimal indicators of disinfection performance. We also found that the relationship between the effluent CT (the product of disinfectant concentration and contact time) obtained from CT transport simulation and the baffling factor depends on the location of ozone release. In addition, we analyzed the environmental and economic impacts of ozone disinfection tank designs and developed a composite indicator to quantify the sustainability of an ozone disinfection tank in the technological, environmental, and economic dimensions.
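
    The baffling factor the study critiques is conventionally obtained from a tracer test: it is the ratio T10/TDT, where T10 is the time for 10% of an injected tracer to exit the tank and TDT is the theoretical detention time. A minimal sketch of that arithmetic (the function names and the synthetic tracer data are illustrative, not taken from the study):

    ```python
    def t10(times, cdf):
        """Time at which 10% of the tracer mass has exited (linear interpolation
        on the cumulative tracer curve)."""
        for i in range(1, len(times)):
            if cdf[i] >= 0.10:
                t0, t1 = times[i - 1], times[i]
                c0, c1 = cdf[i - 1], cdf[i]
                return t0 + (0.10 - c0) * (t1 - t0) / (c1 - c0)
        raise ValueError("tracer CDF never reaches 10%")

    def baffling_factor(times, cdf, theoretical_detention_time):
        """BF = T10 / TDT; roughly 0.1 for an unbaffled tank, 1.0 for ideal plug flow."""
        return t10(times, cdf) / theoretical_detention_time

    # Synthetic tracer-test data: minutes vs. cumulative fraction of tracer exited.
    times = [0, 5, 10, 15, 20]
    cdf   = [0.0, 0.02, 0.08, 0.20, 0.50]
    bf = baffling_factor(times, cdf, theoretical_detention_time=20.0)
    ```

    The study's point is that this single scalar, and the CT10 credit derived from it, can misrank designs whose full CT distributions differ.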

  18. Comparison of Elementary Educational Facility Allocation Patterns and Planning Strategies in Rural Areas: Case Studies of Central and Eastern China

    Institute of Scientific and Technical Information of China (English)

    Zhao Min; Shao Lin; Li Wei; Qian Fang

    2016-01-01

    Elementary educational facilities are among the essential public service facilities in rural China. Based on case studies of two county-level cities in central and eastern China, this paper explores issues regarding the rational allocation of elementary educational facilities in rural areas, including different allocation patterns and their effects, rural residents' views on elementary educational facilities, and their preferences when choosing schools. On the basis of these empirical studies, the paper concludes that relevant policies and planning strategies should be adjusted in accordance with the general trends of the rural population continuously decreasing and the student pool concentrating in towns, so as to achieve an efficient and fair allocation of elementary educational facilities.

  19. Guide to computing at ANL

    Energy Technology Data Exchange (ETDEWEB)

    Peavler, J. (ed.)

    1979-06-01

    This publication gives details about the hardware, software, procedures, and services of the Central Computing Facility, as well as information about how to become an authorized user. The available languages, compilers, libraries, and applications packages are described. 17 tables. (RWR)

  20. Gestational age at booking for antenatal care in a tertiary health facility in north-central, Nigeria

    Directory of Open Access Journals (Sweden)

    Dennis Isaac Ifenne

    2012-01-01

    Background: Early initiation of antenatal care is widely believed to improve maternal and fetal outcomes. This study was designed to ascertain the gestational age at booking using World Health Organization recommendations for developing countries. Materials and Methods: This cross-sectional study was carried out using an interviewer-administered questionnaire given to 345 willing participants at a booking clinic in a tertiary health facility in North-Central Nigeria. Results: A total of 345 women were interviewed. The average age of the clients was 27.1±5.1 years. Almost half (45.8%) had at least a secondary level of education. One-third of the women were not working. The average gestational age at booking was 19.1±7.8 weeks. Late booking (≥17 weeks) was significantly influenced by the client's level of education (P=0.017). Reasons for booking late were given as follows: not being sick (26.1%), lack of knowledge of booking time (22.8%), having booked elsewhere (14.1%), financial constraints (9.2%), fear of too many follow-up visits (4.9%), spouse's uncooperative attitude (3.9%), lack of transport to the health care facility (2.2%), and other minor reasons (16.8%). Conclusion: Most women booked late for antenatal care (ANC). Efforts toward maternal education, public health enlightenment campaigns, poverty reduction, and use of the focused antenatal care model should be sustained as measures to encourage early initiation of ANC.

  1. A computational model of the integration of landmarks and motion in the insect central complex

    Science.gov (United States)

    Sabo, Chelsea; Vasilaki, Eleni; Barron, Andrew B.; Marshall, James A. R.

    2017-01-01

    The insect central complex (CX) is an enigmatic structure whose computational function has evaded inquiry, but has been implicated in a wide range of behaviours. Recent experimental evidence from the fruit fly (Drosophila melanogaster) and the cockroach (Blaberus discoidalis) has demonstrated the existence of neural activity corresponding to the animal’s orientation within a virtual arena (a neural ‘compass’), and this provides an insight into one component of the CX structure. There are two key features of the compass activity: an offset between the angle represented by the compass and the true angular position of visual features in the arena, and the remapping of the 270° visual arena onto an entire circle of neurons in the compass. Here we present a computational model which can reproduce this experimental evidence in detail, and predicts the computational mechanisms that underlie the data. We predict that both the offset and remapping of the fly’s orientation onto the neural compass can be explained by plasticity in the synaptic weights between segments of the visual field and the neurons representing orientation. Furthermore, we predict that this learning is reliant on the existence of neural pathways that detect rotational motion across the whole visual field and uses this rotation signal to drive the rotation of activity in a neural ring attractor. Our model also reproduces the ‘transitioning’ between visual landmarks seen when rotationally symmetric landmarks are presented. This model can provide the basis for further investigation into the role of the central complex, which promises to be a key structure for understanding insect behaviour, as well as suggesting approaches towards creating fully autonomous robotic agents. PMID:28241061
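
    The ring-attractor dynamic at the heart of the model can be caricatured in a few lines: a ring of rate neurons holds a localized bump of activity, and a whole-field rotation signal shifts the bump around the ring, which is then read out as a heading. This is an illustrative toy under simplified dynamics, not the authors' published model:

    ```python
    import math

    N = 16  # neurons on the ring

    def bump(center, width=2.0):
        """Gaussian bump of activity centred on a ring position in [0, N)."""
        return [math.exp(-min(abs(i - center), N - abs(i - center)) ** 2 / (2 * width ** 2))
                for i in range(N)]

    def step(activity, rotation):
        """One update step: a whole-field rotation signal shifts the bump.
        Fractional shifts are handled by linear interpolation between neighbours."""
        lo = math.floor(rotation)
        frac = rotation - lo
        return [(1 - frac) * activity[(i - lo) % N] + frac * activity[(i - lo - 1) % N]
                for i in range(N)]

    def decoded_angle(activity):
        """Population-vector decoding of the bump position, in degrees."""
        x = sum(a * math.cos(2 * math.pi * i / N) for i, a in enumerate(activity))
        y = sum(a * math.sin(2 * math.pi * i / N) for i, a in enumerate(activity))
        return math.degrees(math.atan2(y, x)) % 360
    ```

    Driving `step` with an integrated rotation signal, rather than with the landmark positions directly, is what lets the compass maintain an offset from the true visual angle, as observed experimentally.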

  2. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 1; Steady Predictions

    Science.gov (United States)

    Allgood, Daniel C.; Graham, Jason S.; Ahuja, Vineet; Hosangadi, Ashvin

    2010-01-01

    levels in CFD-based flowpath modeling of the facility. The analysis tools used here expand on the multi-element unstructured CFD which has been tailored and validated for impingement dynamics of dry plumes, complex valve/feed systems, and high pressure propellant delivery systems used in engine and component test stands at NASA SSC. The analyses performed in the evaluation of the sub-scale diffuser facility explored several important factors that influence modeling and understanding of facility operation, such as (a) the importance of modeling the facility with a real gas approximation, (b) approximating the cluster of steam ejector nozzles as a single annular nozzle, (c) the existence of mixed subsonic/supersonic flow downstream of the turning duct, and (d) the inadequacy of two-equation turbulence models in predicting the correct pressurization in the turning duct and expansion of the second stage steam ejectors. The procedure used for modeling the facility was as follows: (i) the engine, test cell, and first stage ejectors were simulated with an axisymmetric approximation; (ii) the turning duct, second stage ejectors, and the piping downstream of the second stage ejectors were analyzed with a three-dimensional simulation utilizing a half-plane symmetry approximation. The solution, i.e., primitive variables such as pressure, velocity components, temperature, and turbulence quantities, was passed from the first computational domain and specified as a supersonic boundary condition for the second simulation. (iii) The third domain comprised the exit diffuser and the region in the vicinity of the facility (primarily included to get the correct shock structure at the exit of the facility and the entrainment characteristics). The first set of simulations, comprising the engine, test cell, and first stage ejectors, was carried out both as a turbulent real gas calculation and as a turbulent perfect gas calculation.
A comparison for the two cases (Real Turbulent and Perfect gas turbulent) of the Ma

  3. Analysing the scalability of thermal hydraulic test facility data to reactor scale with a computer code; Vertailuanalyysin kaeyttoe termohydraulisten koelaitteistojen tulosten laitosmittakaavaan skaalautumisen tutkimisessa

    Energy Technology Data Exchange (ETDEWEB)

    Suikkanen, P.

    2009-01-15

    The objective of this Master's thesis was to study guidelines and procedures for the scaling of thermal hydraulic test facilities and to compare results from two test facility models and from an EPR model. The aim was to get an impression of how well the studied test facilities describe the behaviour at power plant scale during accident scenarios simulated with computer codes. The models were used to determine the influence of primary circuit mass inventory on the behaviour of the circuit. The data from the test facility models represent the same phenomena as the data from the EPR model. The results calculated with the PKL model were also compared against PKL test facility data and showed good agreement. Test facility data is used to validate the computer codes used in nuclear safety analysis. The scale of a facility has an effect on the behaviour of the phenomena, and therefore special care must be taken in using the data. (orig.)

  4. Design of a Computer-Based Control System Using LabVIEW for the NEMESYS Electromagnetic Launcher Facility

    Science.gov (United States)

    2007-06-01

    ...quickly was necessary. A railgun shot typically occurs in less than 10 ms, and firing capacitor banks to shape the current pulse are in the 100s of ... has assembled a facility to develop and test materials for the study of barrel lifetime in electromagnetic launchers (EML) for surface-fire support

  5. Safety Assessment Of The Centralized Storage Facility For Disused Sealed Radioactive Sources In United Republic Of Tanzania

    Energy Technology Data Exchange (ETDEWEB)

    Abel, Vitus [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Lee, JaeSeong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-10-15

    When SRS are no longer in use, they are declared as disused and transferred to the central radioactive management facility (CRMF) belonging to the Tanzania Atomic Energy Commission (the regulatory body) and managed as radioactive waste. In order to reduce the risk associated with disused sealed radioactive sources (DSRS), the first priority would be to bring them under appropriate controls of the regulatory body. When DSRS are safely managed, the regulatory body needs to assess the likelihood and potential impact of incidents, accidents, and hazards for a proper management plan. This paper applies the Analytical Hierarchy Process (AHP) for assessing and allocating weights and priorities, addressing the multi-criteria nature of the management planning problem. Using pairwise comparisons, the relative importance of one criterion over another can be expressed. The method allows decision makers to provide judgments about the relative importance of each criterion and to estimate radiological risk by using expert judgment or probability of occurrence. AHP proceeds in a step-by-step manner in which the resulting priorities are shown and possible inconsistencies are determined. The information provided by experts helps to rank hazards according to probability of occurrence, potential impact, and mitigation cost. The strength of the AHP method lies in its ability to incorporate both qualitative and quantitative data in decision making. AHP presents a powerful tool for weighting and prioritizing hazards in terms of occurrence probability. However, AHP also has some weak points: it requires data based on experience, knowledge, and judgment, which are subjective for each decision-maker.
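
    The AHP machinery described above — a pairwise comparison matrix, derived priority weights, and a consistency check — fits in a few lines. A sketch using the geometric-mean approximation of the principal eigenvector; the 3×3 judgment matrix is a hypothetical example, not data from the Tanzanian facility:

    ```python
    import math

    def ahp_priorities(M):
        """Approximate AHP priority vector via the geometric-mean (row) method."""
        n = len(M)
        gm = [math.prod(row) ** (1.0 / n) for row in M]
        total = sum(gm)
        return [g / total for g in gm]

    def consistency_ratio(M, w):
        """CR = CI / RI, with CI = (lambda_max - n) / (n - 1); CR < 0.1 is
        conventionally acceptable. RI table truncated here for brevity."""
        n = len(M)
        Mw = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        lam = sum(Mw[i] / w[i] for i in range(n)) / n  # estimate of lambda_max
        ci = (lam - n) / (n - 1)
        RI = {3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random consistency indices
        return ci / RI[n]

    # Hypothetical pairwise judgments over three hazards (Saaty 1-9 scale).
    M = [[1,     3,   5],
         [1/3,   1,   2],
         [1/5, 1/2,   1]]
    w = ahp_priorities(M)
    ```

    The priority vector ranks the hazards, and the consistency ratio flags judgment matrices too self-contradictory to use.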

  6. Computational simulation of multi-strut central lobed injection of hydrogen in a scramjet combustor

    Directory of Open Access Journals (Sweden)

    Gautam Choubey

    2016-09-01

    Multi-strut injection is an approach to increase the overall performance of a scramjet while reducing the risk of thermal choking in a supersonic combustor. Hence, a computational simulation of a scramjet combustor at Mach 2.5 with multiple central lobed struts (three struts) is presented and discussed in this research article. The geometry and model used here are a slight modification of the DLR (German Aerospace Center) scramjet model. The present results show that the three-strut injector improves the performance of the scramjet combustor as compared to a single-strut injector. The combustion efficiency is also found to be highest in the case of the three-strut fuel injection system. In order to validate the results, the numerical data for single-strut injection are compared with experimental results taken from the literature.

  7. Testing SLURM open source batch system for a Tier1/Tier2 HEP computing facility

    Science.gov (United States)

    Donvito, Giacinto; Salomoni, Davide; Italiano, Alessandro

    2014-06-01

    In this work we show the testing activities carried out to verify whether the SLURM batch system could be used as the production batch system of a typical Tier1/Tier2 HEP computing center. SLURM (Simple Linux Utility for Resource Management) is an open source batch system developed mainly by the Lawrence Livermore National Laboratory, SchedMD, Linux NetworX, Hewlett-Packard, and Groupe Bull. Testing was focused both on verifying the functionalities of the batch system and on the performance that SLURM is able to offer. We first describe our initial set of requirements. Functionally, we started configuring SLURM so that it replicates all the scheduling policies already used in production in the computing centers involved in the test, i.e. INFN-Bari and the INFN-Tier1 at CNAF, Bologna. Currently, the INFN-Tier1 is using IBM LSF (Load Sharing Facility), while INFN-Bari, an LHC Tier2 for both CMS and Alice, is using Torque as resource manager and MAUI as scheduler. We show how we configured SLURM in order to enable several scheduling functionalities such as Hierarchical FairShare, Quality of Service, user-based and group-based priority, limits on the number of jobs per user/group/queue, job age scheduling, job size scheduling, and scheduling of consumable resources. We then show how different job typologies, like serial, MPI, multi-thread, whole-node and interactive jobs, can be managed. Tests on the use of ACLs on queues, and in general on other resources, are then described. A peculiar SLURM feature we also verified is triggers on events, useful to configure specific actions on each possible event in the batch system. We also tested highly available configurations for the master node. This feature is of paramount importance since a mandatory requirement in our scenarios is to have a working farm cluster even in case of hardware failure of the server(s) hosting the batch system. Among our requirements there is also the possibility to deal with pre-execution and post
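
    The fairshare and priority features listed above combine through SLURM's multifactor priority plugin, which ranks pending jobs by a weighted sum of normalized per-job factors (each in [0, 1]). A rough sketch of that arithmetic, with made-up weights rather than the INFN configuration:

    ```python
    # Weights analogous to the PriorityWeight* settings in slurm.conf
    # (values are illustrative, not the ones used at INFN-Bari or CNAF).
    WEIGHTS = {"age": 1000, "fairshare": 10000, "qos": 5000, "jobsize": 500}

    def job_priority(factors, weights=WEIGHTS):
        """Multifactor priority: weighted sum of per-job factors in [0, 1]."""
        for name, f in factors.items():
            if not 0.0 <= f <= 1.0:
                raise ValueError(f"factor {name} out of range: {f}")
        return round(sum(weights[name] * f for name, f in factors.items()))

    # A heavy user (low fairshare factor) with an old job, vs. a light user
    # with a younger job: the fairshare weight dominates.
    heavy = job_priority({"age": 0.9, "fairshare": 0.1, "qos": 0.5, "jobsize": 0.2})
    light = job_priority({"age": 0.3, "fairshare": 0.9, "qos": 0.5, "jobsize": 0.2})
    ```

    Raising the fairshare weight relative to the age weight is how a site makes historical usage, rather than queue wait time, the dominant scheduling signal.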

  8. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  9. In-Core Computation of Geometric Centralities with HyperBall: A Hundred Billion Nodes and Beyond

    CERN Document Server

    Boldi, Paolo

    2013-01-01

    Given a social network, which of its nodes are more central? This question has been asked many times in sociology, psychology and computer science, and a whole plethora of centrality measures (a.k.a. centrality indices, or rankings) has been proposed to account for the importance of the nodes of a network. In this paper, we approach the problem of computing geometric centralities, such as closeness and harmonic centrality, on very large graphs; traditionally this task requires an all-pairs shortest-path computation in the exact case, or a number of breadth-first traversals for approximated computations, but these techniques yield very weak statistical guarantees on highly disconnected graphs. We rather assume that the graph is accessed in a semi-streaming fashion, that is, that adjacency lists are scanned almost sequentially, and that a very small amount of memory (in the order of a dozen bytes) per node is available in core memory. We leverage the newly discovered algorithms based on HyperLogLog counters, making...
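
    HyperBall's target quantity can be stated exactly on a toy graph: the harmonic centrality of a node x is the sum over all other nodes y of 1/d(y, x). The sketch below computes it by explicit breadth-first search on the reversed graph; HyperBall instead estimates the same sums with HyperLogLog counters in roughly a dozen bytes per node (illustrative code, not the authors' implementation):

    ```python
    from collections import deque

    def harmonic_centrality(adj):
        """Exact harmonic centrality H(x) = sum over y != x of 1/d(y, x),
        with unreachable nodes contributing zero (1/infinity)."""
        # Reverse the graph: d(y, x) in adj equals d(x, y) in the reverse graph.
        rev = {u: [] for u in adj}
        for u, vs in adj.items():
            for v in vs:
                rev[v].append(u)
        h = {}
        for x in adj:
            dist = {x: 0}
            q = deque([x])
            while q:  # BFS from x over reversed edges
                u = q.popleft()
                for v in rev[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        q.append(v)
            h[x] = sum(1.0 / d for node, d in dist.items() if node != x)
        return h

    # Toy directed graph: a 3-cycle plus a node that only points into it.
    adj = {0: [1], 1: [2], 2: [0], 3: [0]}
    h = harmonic_centrality(adj)
    ```

    Because unreachable pairs contribute zero rather than an infinite distance, harmonic centrality stays well defined on the highly disconnected graphs where closeness-based traversal estimates break down, which is exactly the property the paper exploits.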

  10. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    Energy Technology Data Exchange (ETDEWEB)

    Kostin, Mikhail [FRIB, MSU; Mokhov, Nikolai [FNAL; Niita, Koji [RIST, Japan]

    2013-09-25

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90, or C. The module is largely independent of the radiation transport codes it is used with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
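
    The checkpoint/restart pattern described above is language-agnostic. The following Python fragment is an illustrative sketch only (not the C++/MPI module itself, and the file name is hypothetical): periodically persist the progress counter and accumulated tallies, and resume from the last checkpoint if one exists.

```python
import json, os

CKPT = "mars_ckpt.json"  # hypothetical checkpoint file name

def run_histories(n_total, ckpt_every=100, ckpt_path=CKPT):
    """Toy stand-in for a transport run: process n_total particle
    histories, checkpointing so an interrupted run can resume."""
    start, tally = 0, 0.0
    if os.path.exists(ckpt_path):            # resume from checkpoint
        state = json.load(open(ckpt_path))
        start, tally = state["done"], state["tally"]
    for i in range(start, n_total):
        tally += 1.0 / (i + 1)               # placeholder "physics"
        if (i + 1) % ckpt_every == 0:        # persist progress
            json.dump({"done": i + 1, "tally": tally}, open(ckpt_path, "w"))
    if os.path.exists(ckpt_path):
        os.remove(ckpt_path)                 # clean up on completion
    return tally

print(run_histories(1000))
```

In the real framework the checkpoint works in both single-process and MPI runs; the key property shown here is that a resumed run reproduces the result of an uninterrupted one.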

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  12. National Ignition Facility sub-system design requirements computer system SSDR 1.5.1

    Energy Technology Data Exchange (ETDEWEB)

    Spann, J.; VanArsdall, P.; Bliss, E.

    1996-09-05

    This System Design Requirement document establishes the performance, design, development, and test requirements for the Computer System (WBS 1.5.1), which is part of the NIF Integrated Computer Control System (ICCS). This document responds directly to the requirements detailed in the ICCS document (WBS 1.5), which sits directly above it in the requirements hierarchy.

  13. Computational Studies of X-ray Framing Cameras for the National Ignition Facility

    Science.gov (United States)

    2013-06-01

    Lawrence Livermore National Laboratory, Livermore, CA 94550 USA. The NIF is the world's most powerful laser facility and is...a phosphor screen where the output is recorded. The x-ray framing cameras have provided excellent information. As the yields at NIF have increased...experiments on the NIF. The basic operation of these cameras is shown in Fig. 1. Incident photons generate photoelectrons both in the pores of the MCP and

  14. Computing multiple aggregation levels and contextual features for road facilities recognition using mobile laser scanning data

    Science.gov (United States)

    Yang, Bisheng; Dong, Zhen; Liu, Yuan; Liang, Fuxun; Wang, Yongjun

    2017-04-01

    In recent years, updating the inventory of road infrastructures based on field work has been labor intensive, time consuming, and costly. Fortunately, vehicle-based mobile laser scanning (MLS) systems provide an efficient solution to rapidly capture three-dimensional (3D) point clouds of road environments with high flexibility and precision. However, robust recognition of road facilities from huge volumes of 3D point clouds is still a challenging issue because of complicated and incomplete structures, occlusions and varied point densities. Most existing methods utilize point- or object-based features to recognize object candidates, and can only extract limited types of objects with a relatively low recognition rate, especially for incomplete and small objects. To overcome these drawbacks, this paper proposes a semantic labeling framework combining multiple aggregation levels (point-segment-object) of features and contextual features to recognize road facilities, such as road surfaces, road boundaries, buildings, guardrails, street lamps, traffic signs, roadside trees, power lines, and cars, for highway infrastructure inventory. The proposed method first identifies ground and non-ground points, and extracts road surface facilities from ground points. Non-ground points are segmented into individual candidate objects based on the proposed multi-rule region growing method. Then, the multiple aggregation levels of features and the contextual features (relative positions, relative directions, and spatial patterns) associated with each candidate object are calculated and fed into an SVM classifier to label the corresponding candidate object. The recognition performance of combining multiple aggregation levels and contextual features was compared with single-level (point, segment, or object) based features using large-scale highway scene point clouds. Comparative studies demonstrated that the proposed semantic labeling framework significantly improves road facilities recognition.
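
    The point-to-segment aggregation step at the heart of such a pipeline can be sketched compactly. The features, sample values, and the nearest-centroid classifier below are illustrative assumptions standing in for the authors' richer feature set and SVM, not their implementation.

```python
import math

def segment_features(points):
    """Aggregate per-point attributes (x, y, z) into segment-level
    features: mean height and vertical extent."""
    zs = [p[2] for p in points]
    return (sum(zs) / len(zs), max(zs) - min(zs))

def nearest_centroid(train, x):
    """Toy stand-in for the SVM: assign the label of the closest
    class centroid in feature space."""
    best, best_d = None, math.inf
    for label, feats in train.items():
        cx = [sum(f[i] for f in feats) / len(feats) for i in range(len(x))]
        d = math.dist(cx, x)
        if d < best_d:
            best, best_d = label, d
    return best

# Illustrative training segments: low flat guardrail-like objects
# versus tall lamp-like objects, as (mean height, extent) pairs.
train = {
    "guardrail": [(0.8, 0.4), (0.9, 0.5)],
    "street lamp": [(4.0, 7.5), (4.5, 8.0)],
}
query = segment_features([(0, 0, 4.1), (0, 1, 0.2), (1, 0, 8.0)])
print(nearest_centroid(train, query))  # street lamp
```

The paper's contribution is that combining several aggregation levels plus contextual features (relative position, direction, spatial pattern) in the classifier input beats any single level alone.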

  15. Laser performance operations model (LPOM): The computational system that automates the setup and performance analysis of the National Ignition Facility

    Science.gov (United States)

    Shaw, Michael; House, Ronald

    2015-02-01

    The National Ignition Facility (NIF) is a stadium-sized facility containing a 192-beam, 1.8 MJ, 500-TW, 351-nm laser system together with a 10-m diameter target chamber with room for many target diagnostics. NIF is the world's largest laser experimental system, providing a national center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. A computational system, the Laser Performance Operations Model (LPOM), has been developed that automates the laser setup process and accurately predicts laser energetics. LPOM uses diagnostic feedback from previous NIF shots to maintain accurate energetics models (gains and losses), as well as links to operational databases to provide `as currently installed' optical layouts for each of the 192 NIF beamlines. LPOM deploys a fully integrated laser physics model, the Virtual Beamline (VBL), in its predictive calculations in order to meet the accuracy requirements of NIF experiments, and to provide the ability to determine the damage risk to optical elements throughout the laser chain. LPOM determines the settings of the injection laser system required to achieve the desired laser output, provides equipment protection, and determines the diagnostic setup. Additionally, LPOM provides real-time post-shot data analysis and reporting for each NIF shot. The LPOM computation system is designed as a multi-host computational cluster (with 200 compute nodes, providing the capability to run full NIF simulations in parallel) to meet the demands of both the controls systems within a shot cycle, and the NIF user community outside of a shot cycle.
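
    The feedback loop described above (use shot diagnostics to keep the energetics model calibrated, then invert the model to compute setup) can be sketched as a simple recursive update. The linear gain model, numbers, and class names below are illustrative assumptions, not the actual VBL physics.

```python
class BeamlineModel:
    """Toy energetics model: predicted output = gain * injected energy.
    After each shot, the gain estimate is nudged toward the measured
    value (exponential moving average), mimicking diagnostic feedback."""
    def __init__(self, gain=1000.0, alpha=0.3):
        self.gain = gain      # current gain estimate (illustrative units)
        self.alpha = alpha    # feedback weight

    def predict(self, e_inject):
        return self.gain * e_inject

    def update(self, e_inject, e_measured):
        measured_gain = e_measured / e_inject
        self.gain += self.alpha * (measured_gain - self.gain)

    def required_injection(self, e_target):
        """Setup step: injection energy needed for a requested output."""
        return e_target / self.gain

model = BeamlineModel(gain=900.0)       # model starts off-calibration
for _ in range(20):                     # repeated shots at true gain 1000
    e_in = model.required_injection(8000.0)
    model.update(e_in, e_in * 1000.0)
print(round(model.gain))                # estimate converges toward 1000
```

The real LPOM does this with a full physics model and per-beamline optical layouts, but the principle of continual recalibration against measured shots is the same.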

  16. Injection and extraction computer control system HIRFL-SSC The HIRFL-SSC is stated for Heavy Ion Research Facility of Lanzhou-Separated Sector Cyclotron

    CERN Document Server

    Zhang Wei; Chen Yun; Zhang Xia; Hu Jian Jun; Xu Xing Ming

    2002-01-01

    The injection and extraction computer control system of HIRFL-SSC (Heavy Ion Research Facility of Lanzhou - Separated Sector Cyclotron) is introduced. The software is described briefly, and the hardware structure is presented in detail. The control system allows the injection and extraction adjustments to be performed from a PC through a Windows-style operator interface, making the adjustment convenient and accurate.

  17. Assessing the potential of rural and urban private facilities in implementing child health interventions in Mukono district, central Uganda

    DEFF Research Database (Denmark)

    Rutebemberwa, Elizeus; Buregyeya, Esther; Lal, Sham;

    2016-01-01

    clinicians, less likely to have people with tertiary education (OR 0.34; 95 % CI 0.17-0.66) and less likely to have zinc tablets (OR 0.38; 95 % CI 0.19-0.78). In both urban and rural areas, there was low usage of stock cards and patient registers. About half of the facilities in both rural and urban areas...... keeping, essential drugs for the treatment of malaria, pneumonia and diarrhoea; the sex, level of education, professional and in-service training of the persons found attending to patients in these facilities. A comparison was made between urban and rural facilities. Univariate and bivariate analysis...... attended to at least one sick child in the week prior to the interview. CONCLUSION: There were big gaps between rural and urban private facilities with rural ones having less trained personnel and less zinc tablets' availability. In both rural and urban areas, record keeping was low. Child health...

  18. A computer-controlled experimental facility for krypton and xenon adsorption coefficient measurements on activated carbons

    Energy Technology Data Exchange (ETDEWEB)

    Del Serra, Daniele; Aquaro, Donato; Mazed, Dahmane; Pazzagli, Fabio; Ciolini, Riccardo, E-mail: r.ciolini@ing.unipi.it

    2015-07-15

    Highlights: • An experimental test facility for qualification of the krypton and xenon adsorption properties of activated carbons. • The measurement of the adsorption coefficient by using the elution curve method. • The simultaneous on-line control of the main physical parameters influencing the adsorption properties of activated carbon. - Abstract: An automated experimental test facility, intended specifically for qualification of the krypton and xenon adsorption properties of activated carbon samples, was designed and constructed. The experimental apparatus was designed to allow on-line control of the main physical parameters that greatly influence the adsorption properties of activated carbon. The measurement of the adsorption coefficient, based upon the elution curve method, can be performed with a precision better than 5% at gas pressure values ranging from atmospheric pressure up to 9 bar and bed temperatures from 0 up to 80 °C. The carrier gas flow rate can be varied from 40 up to 4000 N cm{sup 3} min{sup −1}, allowing measurement of the dynamic adsorption coefficient with face velocities from 0.3 up to 923 cm min{sup −1} depending on the gas pressure and the test cell being used. The moisture content of the activated carbon can be precisely controlled during measurement, through the relative humidity of the carrier gas.
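
    The elution curve method determines the dynamic adsorption coefficient from the mean residence time of the noble gas in the carbon bed. A minimal sketch, assuming the usual first-moment definition k = F·t̄/m (carrier flow rate F, mean retention time t̄, adsorbent mass m) and synthetic Gaussian-shaped elution data; the numbers are illustrative, not measurements from this facility:

```python
def adsorption_coefficient(times, conc, flow_rate, mass):
    """First moment of the elution curve gives the mean retention
    time; k = F * t_mean / m (e.g. cm^3/g with F in cm^3/min)."""
    t_mean = sum(t * c for t, c in zip(times, conc)) / sum(conc)
    return flow_rate * t_mean / mass

# Synthetic elution curve peaking at t = 50 min (illustrative numbers).
times = list(range(0, 101))
conc = [pow(2.718281828, -((t - 50.0) ** 2) / 50.0) for t in times]
k = adsorption_coefficient(times, conc, flow_rate=100.0, mass=40.0)
print(round(k, 1))  # 125.0, i.e. k = 100 * 50 / 40
```

In practice the curve comes from the detector downstream of the test cell, and dead time and baseline corrections would be applied before taking the moment.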

  19. The Use of Public Computing Facilities by Library Patrons: Demography, Motivations, and Barriers

    Science.gov (United States)

    DeMaagd, Kurt; Chew, Han Ei; Huang, Guanxiong; Khan, M. Laeeq; Sreenivasan, Akshaya; LaRose, Robert

    2013-01-01

    Public libraries play an important part in the development of a community. Today, they are seen as more than storehouses of books; they are also responsible for the dissemination of online and offline information. Public access computers are becoming increasingly popular as more and more people understand the need for internet access. Using a…

  20. Cambridge-Cranfield High Performance Computing Facility (HPCF) purchases ten Sun Fire(TM) 15K servers to dramatically increase power of eScience research

    CERN Multimedia

    2002-01-01

    "The Cambridge-Cranfield High Performance Computing Facility (HPCF), a collaborative environment for data and numerical intensive computing privately run by the University of Cambridge and Cranfield University, has purchased 10 Sun Fire(TM) 15K servers from Sun Microsystems, Inc.. The total investment, which includes more than $40 million in Sun technology, will dramatically increase the computing power, reliability, availability and scalability of the HPCF" (1 page).

  1. Computer simulated building energy consumption for verification of energy conservation measures in network facilities

    Science.gov (United States)

    Plankey, B.

    1981-01-01

    A computer program called ECPVER (Energy Consumption Program - Verification) was developed to simulate all energy loads for any number of buildings. The program computes simulated daily, monthly, and yearly energy consumption which can be compared with actual meter readings for the same time period. Such comparison can lead to validation of the model under a variety of conditions, which allows it to be used to predict future energy savings due to energy conservation measures. Predicted energy savings can then be compared with actual savings to verify the effectiveness of those energy conservation changes. This verification procedure is planned to be an important advancement in the Deep Space Network Energy Project, which seeks to reduce energy cost and consumption at all DSN Deep Space Stations.
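
    The verification loop ECPVER supports (simulate, aggregate, compare with meter readings) can be sketched as follows. The load model, numbers, and tolerance are illustrative assumptions, not the actual program:

```python
def simulate_daily_kwh(base_load, hvac_load, days):
    """Toy building model: constant base load plus an HVAC load that
    is 10% higher one day per week (illustrative schedule)."""
    return [base_load + hvac_load * (1 + 0.1 * (d % 7 == 0)) for d in range(days)]

def monthly_total(daily):
    return sum(daily)

def verify(simulated_total, metered_total, tolerance=0.10):
    """Model counts as validated if simulation is within tolerance
    of the meter reading for the same period."""
    deviation = abs(simulated_total - metered_total) / metered_total
    return deviation <= tolerance, deviation

daily = simulate_daily_kwh(base_load=400.0, hvac_load=150.0, days=30)
sim = monthly_total(daily)
ok, dev = verify(sim, metered_total=17000.0)
print(ok, round(dev, 3))  # True 0.025
```

Once validated this way, the same model can be re-run with and without a proposed conservation measure to predict the savings, which is the procedure the abstract describes.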

  2. Computer Security for Commercial Nuclear Power Plants - Literature Review for Korea Hydro Nuclear Power Central Research Institute

    Energy Technology Data Exchange (ETDEWEB)

    Duran, Felicia Angelica [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Security Systems Analysis Dept.; Waymire, Russell L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Security Systems Analysis Dept.

    2013-10-01

    Sandia National Laboratories (SNL) is providing training and consultation activities on security planning and design for the Korea Hydro and Nuclear Power Central Research Institute (KHNP-CRI). As part of this effort, SNL performed a literature review on computer security requirements, guidance and best practices that are applicable to an advanced nuclear power plant. This report documents the review of reports generated by SNL and other organizations [U.S. Nuclear Regulatory Commission, Nuclear Energy Institute, and International Atomic Energy Agency] related to protection of information technology resources, primarily digital controls and computer resources and their data networks. Copies of the key documents have also been provided to KHNP-CRI.

  3. ONR Europe Reports. Computer Science/Computer Engineering in Central Europe: A Report on Czechoslovakia, Hungary, and Poland

    Science.gov (United States)

    1992-08-01

    Routing on Printed Circuit Boards. Computer Aided Design, Vol.12, No.5, 1980, pp.231-234. 15 Servit, M.: Heuristic Algorithms for Rectilinear Steiner ...practical utilization ver. 5.0, KOPP publ. comp., 1992 Herout P, Rudolf V., Smrha P.: ABC of Programmer in the C Language (ANSI C, Borland C and C++), KOPP

  4. A Model of an Expert Computer Vision and Recognition Facility with Applications of a Proportion Technique.

    Science.gov (United States)

    2014-09-26

    of research is being 14 function called WHATISFACE. [Rhodes][Tucker][Hogg][Sowa] The model offering the most specific information about structure and...1983. Hogg, D., "Model-based vision: a program to see a walking person", from "Image and Vision Computing", Vol. 1, No. 1, February 1983, pp. 5-20...Systems", Addison-Wesley Publishing Company, Inc., Massachusetts, 1983.

  5. System Requirements Analysis for a Computer-based Procedure in a Research Reactor Facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jaek Wan; Jang, Gwi Sook; Seo, Sang Moon; Shin, Sung Ki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    Operation support tools are under consideration for application to research reactors. In particular, with the full digitalization of the main control room, a computer-based procedure (CBP) system has been required as part of the man-machine interface system, because it affects the operating staff and the human errors of a research reactor. A well-made CBP can address many of the routine problems related to human error in the use of conventional, hard-copy operating procedures, address the staffing issues of a research reactor, and reduce human errors by minimizing the operator's routine tasks. However, a CBP for a research reactor has not been proposed yet. CBPs developed for nuclear power plants have powerful and varied technical functions to cover complicated plant operation situations, but many of these functions may not be required for a research reactor. Thus, it is not reasonable to apply such a CBP to a research reactor directly, and customizing it is not cost-effective. Therefore, a compact CBP should be developed for a research reactor. To establish computer-based system requirements for a research reactor, this paper examines international standards and previous practice at nuclear plants, and introduces the high-level requirements derived from the system requirements analysis activity as the first stage of system implementation.

  6. Facile identification of dual FLT3-Aurora A inhibitors: a computer-guided drug design approach.

    Science.gov (United States)

    Chang Hsu, Yung; Ke, Yi-Yu; Shiao, Hui-Yi; Lee, Chieh-Chien; Lin, Wen-Hsing; Chen, Chun-Hwa; Yen, Kuei-Jung; Hsu, John T-A; Chang, Chungming; Hsieh, Hsing-Pang

    2014-05-01

    Computer-guided drug design is a powerful tool for drug discovery. Herein we disclose the use of this approach for the discovery of dual FMS-like receptor tyrosine kinase-3 (FLT3)-Aurora A inhibitors against cancer. An Aurora hit compound was selected as a starting point, from which 288 virtual molecules were screened. Subsequently, some of these were synthesized and evaluated for their capacity to inhibit FLT3 and Aurora kinase A. To further enhance FLT3 inhibition, structure-activity relationship studies of the lead compound were conducted through a simplification strategy and bioisosteric replacement, followed by the use of computer-guided drug design to prioritize molecules bearing a variety of different terminal groups in terms of favorable binding energy. Selected compounds were then synthesized, and their bioactivity was evaluated. Of these, one novel inhibitor was found to exhibit excellent inhibition of FLT3 and Aurora kinase A and exert a dramatic antiproliferative effect on MOLM-13 and MV4-11 cells, with an IC50 value of 7 nM. Accordingly, it is considered a highly promising candidate for further development.

  7. Hanford facility dangerous waste Part A, Form 3 and Part B permit application documentation, Central Waste Complex (WA7890008967)(TSD: TS-2-4)

    Energy Technology Data Exchange (ETDEWEB)

    Saueressig, D.G.

    1998-05-20

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, operating, treatment, storage, and/or disposal units, such as the Central Waste Complex (this document, DOE/RL-91-17). Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needed by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology alpha-numeric section identifiers from the permit application guidance documentation (Ecology 1996) follow, in brackets, the chapter headings and subheadings. A checklist indicating where information is contained in the Central Waste Complex permit application documentation, in relation to the Washington State Department of Ecology guidance, is located in the Contents section. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in the General Information Portion). Wherever appropriate, the Central Waste Complex permit application documentation makes cross-reference to the General Information Portion, rather than duplicating text. Information provided in this Central Waste Complex permit application documentation is current as of May 1998.

  8. Computational design of high efficiency release targets for use at ISOL facilities

    CERN Document Server

    Liu, Y

    1999-01-01

    This report describes efforts made at the Oak Ridge National Laboratory to design high-efficiency-release targets that simultaneously incorporate the short diffusion lengths, high permeabilities, controllable temperatures, and heat-removal properties required for the generation of useful radioactive ion beam (RIB) intensities for nuclear physics and astrophysics research using the isotope separation on-line (ISOL) technique. Short diffusion lengths are achieved either by using thin fibrous target materials or by coating thin layers of selected target material onto low-density carbon fibers such as reticulated-vitreous-carbon fiber (RVCF) or carbon-bonded-carbon fiber (CBCF) to form highly permeable composite target matrices. Computational studies that simulate the generation and removal of primary beam deposited heat from target materials have been conducted to optimize the design of target/heat-sink systems for generating RIBs. The results derived from diffusion release-rate simulation studies for selected t...

  9. VisMatchmaker: Cooperation of the User and the Computer in Centralized Matching Adjustment.

    Science.gov (United States)

    Law, Po-Ming; Wu, Wenchao; Zheng, Yixian; Qu, Huamin

    2017-01-01

    Centralized matching is a ubiquitous resource allocation problem. In a centralized matching problem, each agent has a preference list ranking the other agents and a central planner is responsible for matching the agents manually or with an algorithm. While algorithms can find a matching which optimizes some performance metrics, they are used as a black box and preclude the central planner from applying his domain knowledge to find a matching which aligns better with the user tasks. Furthermore, the existing matching visualization techniques (i.e. bipartite graph and adjacency matrix) fail in helping the central planner understand the differences between matchings. In this paper, we present VisMatchmaker, a visualization system which allows the central planner to explore alternatives to an algorithm-generated matching. We identified three common tasks in the process of matching adjustment: problem detection, matching recommendation and matching evaluation. We classified matching comparison into three levels and designed visualization techniques for them, including the number line view and the stacked graph view. Two types of algorithmic support, namely direct assignment and range search, and their interactive operations are also provided to enable the user to apply his domain knowledge in matching adjustment.
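
    As a concrete example of the algorithmic side of centralized matching, here is a minimal deferred-acceptance (Gale-Shapley) sketch. This is a standard stable-matching algorithm used for illustration; it is not necessarily the algorithm behind VisMatchmaker.

```python
def gale_shapley(proposer_prefs, reviewer_prefs):
    """Stable matching via deferred acceptance. Each prefs dict maps an
    agent to a list of the other side's agents, best first."""
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    free = list(proposer_prefs)          # proposers not yet matched
    next_choice = {p: 0 for p in proposer_prefs}
    match = {}                           # reviewer -> proposer
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in match:
            match[r] = p                 # reviewer provisionally accepts
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])        # reviewer trades up
            match[r] = p
        else:
            free.append(p)               # rejected; will try next choice
    return match

prefs_a = {"a1": ["b1", "b2"], "a2": ["b1", "b2"]}
prefs_b = {"b1": ["a2", "a1"], "b2": ["a1", "a2"]}
print(gale_shapley(prefs_a, prefs_b))  # {'b1': 'a2', 'b2': 'a1'}
```

An algorithm like this optimizes a fixed criterion; the point of VisMatchmaker is to let the central planner inspect such an output and adjust it with domain knowledge.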

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  12. 41 CFR 105-56.027 - Centralized salary offset computer match.

    Science.gov (United States)

    2010-07-01

    ... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the...

  13. 41 CFR 105-56.017 - Centralized salary offset computer match.

    Science.gov (United States)

    2010-07-01

    ... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the...

  14. Central Issues in the Use of Computer-Based Materials for High Volume Entrepreneurship Education

    Science.gov (United States)

    Cooper, Billy

    2007-01-01

    This article discusses issues relating to the use of computer-based learning (CBL) materials for entrepreneurship education at university level. It considers CBL as a means of addressing the increased volume and range of provision required in the current context. The issues raised in this article have importance for all forms of computer-based…

  15. Computational Design of High Efficiency Release Targets for Use at ISOL Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Alton, G.D.; Liu, Y.; Middleton, J.W.

    1998-11-04

    This report describes efforts made at the Oak Ridge National Laboratory to design high-efficiency-release targets that simultaneously incorporate the short diffusion lengths, high permeabilities, controllable temperatures, and heat removal properties required for the generation of useful radioactive ion beam (RIB) intensities for nuclear physics and astrophysics research using the isotope separation on-line (ISOL) technique. Short diffusion lengths are achieved either by using thin fibrous target materials or by coating thin layers of selected target material onto low-density carbon fibers such as reticulated vitreous carbon fiber (RVCF) or carbon-bonded-carbon-fiber (CBCF) to form highly permeable composite target matrices. Computational studies that simulate the generation and removal of primary beam deposited heat from target materials have been conducted to optimize the design of target/heat-sink systems for generating RIBs. The results derived from diffusion release-rate simulation studies for selected targets and thermal analyses of temperature distributions within a prototype target/heat-sink system subjected to primary ion beam irradiation will be presented in this report.
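
    Diffusion release-rate simulations of this kind are commonly based on Fickian release from a sphere, with cumulative fractional release f(t) = 1 − (6/π²) Σ_{n≥1} (1/n²) exp(−n²π²Dt/r²). This is a textbook series solution used here purely for illustration of why short diffusion lengths matter; the parameter values are assumptions, not data from the report.

```python
import math

def release_fraction(D, r, t, n_max=200):
    """Cumulative fractional release f(t) for Fickian diffusion out of
    a sphere of radius r (m) with diffusion coefficient D (m^2/s);
    the infinite series is truncated at n_max terms."""
    s = sum(math.exp(-n * n * math.pi ** 2 * D * t / r ** 2) / (n * n)
            for n in range(1, n_max + 1))
    return 1.0 - (6.0 / math.pi ** 2) * s

# A shorter diffusion length gives a much faster release after one hour.
pellet = release_fraction(D=1e-9, r=1e-2, t=3600.0)  # 1 cm grain
fiber = release_fraction(D=1e-9, r=1e-4, t=3600.0)   # 100 um coated fiber
print(fiber, pellet)
```

The fiber-scale geometry releases essentially everything within the hour while the bulk grain does not, which is the motivation for the thin fibrous and fiber-coated target matrices described above.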

  16. Simulation concept of NICA-MPD-SPD Tier0-Tier1 computing facilities

    Science.gov (United States)

    Korenkov, V. V.; Nechaevskiy, A. V.; Ososkov, G. A.; Pryahina, D. I.; Trofomov, V. V.; Uzhinskiy, A. V.

    2016-09-01

    The simulation concept for grid-cloud services of contemporary HENP experiments of the Big Data scale was formulated from experience with the simulation system developed at LIT JINR, Dubna. This system is intended to improve the efficiency of the design and development of a wide class of grid-cloud structures by using the work quality indicators of some real system to design and predict its evolution. For these purposes the simulation program is combined with a real monitoring system of the grid-cloud service through a special database (DB). The DB accomplishes acquisition and analysis of monitoring data to carry out dynamical corrections of the simulation. Such an approach allows us to construct a general model pattern which does not depend on a specific simulated object, while the parameters describing this object can be used as input to run the pattern. The simulation of some processes of the NICA-MPD-SPD Tier0-Tier1 distributed computing is considered as an example of our approach applications.
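
    The coupling of a simulator to monitoring data can be sketched as a parameter re-estimation loop: the model's parameters are repeatedly pulled toward what the monitoring database reports. The throughput model, numbers, and function names below are illustrative assumptions, not the LIT JINR system itself.

```python
def simulate_throughput(n_slots, service_rate):
    """Toy grid model: jobs/hour a tier processes at the assumed rate."""
    return n_slots * service_rate

def recalibrate(rate, monitored_rates, weight=0.5):
    """Dynamical correction: pull the model's service rate toward the
    average rate observed in the monitoring DB samples."""
    observed = sum(monitored_rates) / len(monitored_rates)
    return rate + weight * (observed - rate)

rate = 10.0                           # initial model guess (jobs/hour/slot)
monitoring_db = [8.0, 9.0, 8.5, 8.5]  # illustrative monitoring samples
for _ in range(5):                    # corrections applied over time
    rate = recalibrate(rate, monitoring_db)
print(round(rate, 2), simulate_throughput(100, rate))
```

The generic pattern (a model whose parameters come from a monitoring feed rather than being hard-coded) is what lets one simulation program serve many different grid-cloud structures.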

  17. An interactive computer program for randomization analysis of response curves with facilities for multiple comparisons.

    Science.gov (United States)

    Tan, E S; Roos, J M; Volovics, A; Van Baak, M A; Does, R J

    1992-04-01

    An interactive Fortran program, MUCRA, is presented. The program can perform randomization analysis of a completely randomized or randomized-blocks design extended to growth and response curves. A single-step Scheffé-type procedure as well as Peritz's closed step-down procedure have been implemented, which control the family-wise type I error rate. In general, MUCRA is suitable as a computer tool for a distribution-free analysis of variance with repeated measures. The use of MUCRA is demonstrated by analyzing the effects oxprenolol and atenolol have on exercise heart rate. Oxprenolol is a non-selective beta-blocker with moderate intrinsic sympathomimetic activity (ISA), given by the Oros delivery system. Atenolol is a beta 1-selective blocker without ISA. A randomized placebo-controlled crossover design was used to compare the effects of the beta-blockers on heart rate during a progressive maximal exercise test on a bicycle ergometer. Application of the Scheffé-type procedure showed that the two drugs significantly (alpha = .05) reduce the heart rate during the exercise test at the three prechosen times (2, 5, and 24 hr) after intake. The reduction from atenolol is more pronounced than from oxprenolol Oros at 2 and 5 hr.
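
    The randomization principle MUCRA is built on can be illustrated with a minimal exact two-sample randomization test. This generic sketch (with made-up heart-rate numbers) is not MUCRA's multiple-comparison machinery, just the core idea: under the null hypothesis the group labels are exchangeable, so the null distribution is obtained by enumerating relabelings.

```python
import itertools

def randomization_p_value(x, y):
    """Exact two-sample randomization test on the difference of means:
    enumerate every reassignment of the pooled values to two groups of
    the original sizes, and count differences at least as extreme."""
    pooled = x + y
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    count = total = 0
    for idx in itertools.combinations(range(len(pooled)), len(x)):
        chosen = set(idx)
        g1 = [pooled[i] for i in chosen]
        g2 = [pooled[i] for i in range(len(pooled)) if i not in chosen]
        diff = abs(sum(g1) / len(g1) - sum(g2) / len(g2))
        total += 1
        if diff >= observed - 1e-12:
            count += 1
    return count / total

# Exercise heart rates, drug vs placebo (illustrative numbers only).
drug = [120.0, 118.0, 122.0, 119.0]
placebo = [139.0, 141.0, 138.0, 142.0]
print(randomization_p_value(drug, placebo))  # 2/70, about 0.029
```

With the groups fully separated, only the observed split and its mirror are as extreme, giving p = 2/70; no distributional assumption is needed, which is why the approach is called distribution-free.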

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  19. Cardiac tamponade in an infant during contrast infusion through central venous catheter for chest computed tomography; Tamponamento cardiaco durante infusao de contraste em acesso venoso central para realizacao de tomografia computadorizada do torax em lactente

    Energy Technology Data Exchange (ETDEWEB)

    Daud, Danilo Felix; Campos, Marcos Menezes Freitas de; Fleury Neto, Augusto de Padua [Hospital Geral de Palmas, TO (Brazil)

    2013-11-15

    Complications from central venous catheterization include infectious conditions, pneumothorax, hemothorax and venous thrombosis. Pericardial effusion with cardiac tamponade rarely occurs and, in infants, is generally caused by umbilical catheterization. The authors describe a case of cardiac tamponade that occurred in an infant during chest computed tomography with contrast infusion through a central venous catheter inserted into the right internal jugular vein. (author)

  20. Assessing the potential of rural and urban private facilities in implementing child health interventions in Mukono district, central Uganda

    DEFF Research Database (Denmark)

    Rutebemberwa, Elizeus; Buregyeya, Esther; Lal, Sham;

    2016-01-01

    in diagnostic capabilities, operations and human resource in the management of malaria, pneumonia and diarrhoea. METHODS: A survey was conducted in pharmacies, private clinics and drug shops in Mukono district in October 2014. An assessment was done on availability of diagnostic equipment for malaria, record...... attended to at least one sick child in the week prior to the interview. CONCLUSION: There were big gaps between rural and urban private facilities with rural ones having less trained personnel and less zinc tablets' availability. In both rural and urban areas, record keeping was low. Child health...

  1. Distributed randomized algorithms for opinion formation, centrality computation and power systems estimation: A tutorial overview

    NARCIS (Netherlands)

    Frasca, Paolo; Ishii, Hideaki; Ravazzi, Chiara; Tempo, Roberto

    2015-01-01

    In this tutorial paper, we study three specific applications: opinion formation in social networks, centrality measures in complex networks and estimation problems in large-scale power systems. These applications fall under a general framework which aims at the construction of algorithms for distrib

  2. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  3. Assessment of the integrity of structural shielding of four computed tomography facilities in the greater Accra region of Ghana.

    Science.gov (United States)

    Nkansah, A; Schandorf, C; Boadu, M; Fletcher, J J

    2013-08-01

    The structural shielding thicknesses of the walls of four computed tomography (CT) facilities in Ghana were re-evaluated to verify the shielding integrity using the new shielding design methods recommended by the National Council on Radiation Protection and Measurements (NCRP). The shielding thicknesses obtained ranged from 120 to 155 mm using default DLP values proposed by the European Commission, and from 110 to 168 mm using DLP values derived from the four CT manufacturers. These values are within the accepted standard concrete wall thickness of 102 to 152 mm prescribed by the NCRP. Ultrasonic pulse testing of all walls indicated that they are of good quality and free of voids, since the estimated pulse velocities were within the range of 3.496±0.005 km s(-1). The average dose equivalent rate estimated for supervised areas is 3.4±0.27 µSv week(-1), and that for the controlled area is 18.0±0.15 µSv week(-1); both are within acceptable values.

  4. Computer code analysis of steam generator in thermal-hydraulic test facility simulating nuclear power plant; Ydinvoimalaitosta kuvaavan koelaitteiston hoeyrystimien analysointi tietokoneohjelmilla

    Energy Technology Data Exchange (ETDEWEB)

    Virtanen, E.

    1995-12-31

    In this study, three loss-of-feedwater experiments performed with the PACTEL facility were calculated with two computer codes. The purpose of the experiments was to gain information about the behaviour of a horizontal steam generator in a situation where the water level on the secondary side of the steam generator is decreasing. At the same time, data that can be used in the assessment of thermal-hydraulic computer codes were assembled. The purpose of the work was to study the capabilities of two computer codes, APROS version 2.11 and RELAP5/MOD3.1, to calculate the phenomena in a horizontal steam generator. In order to make the comparison of the calculation results easier, the same kind of model of the steam generator was made for both codes. Only the steam generator was modelled; the rest of the facility was given to the codes as a boundary condition. (23 refs.).

  5. Computed tomography as a problem solving tool in non-radiopaque central venous port systems – A report of three cases

    OpenAIRE

    Gossner, Johannes

    2014-01-01

    Central venous port systems are now routinely used in oncology. The non-functioning port system is a common issue in radiology departments. Fluoroscopy is a first-line imaging modality. The potential usefulness of computed tomography as a problem-solving tool in three complex cases with non-radiopaque central venous port systems is presented.

  6. Computed Tomography Scanning Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION:Advances research in the areas of marine geosciences, geotechnical, civil, and chemical engineering, physics, and ocean acoustics by using high-resolution,...

  7. Determinants of anemia among pregnant mothers attending antenatal care in Dessie town health facilities, northern central Ethiopia: unmatched case-control study

    Science.gov (United States)

    Seid, Omer; G/Mariam, Yemane; Fekadu, Abel; Wasihun, Yitbarek; Endris, Kedir; Bitew, Abebayehu

    2017-01-01

    Introduction Anemia affects around 38.2% and 22% of pregnant women at the global and national level, respectively. In developing countries, women start pregnancy with already depleted body stores of iron and other vitamins, with significant variation of anemia within and between regions. Objective To identify the determinants of anemia among pregnant mothers attending antenatal care in Dessie town health facilities, northern central Ethiopia. Methods A health facility based unmatched case-control study was conducted among 112 cases and 336 controls from January to March 2016 G.C. The sample size was determined using Epi Info version 7.1.5.2. Study subjects were selected using a consecutive sampling technique. Data were collected using a structured questionnaire, entered using Epi Data version 3.1 and analyzed using SPSS version 20. Bivariable and multivariable logistic regression models were used to identify the determinants of anemia. Adjusted odds ratios (AOR) with 95% confidence intervals (CI) and p-values were used to declare the determinants of anemia. Conclusions Inadequate intake of dark green leafy vegetables, inadequate consumption of chicken, trimester of the current pregnancy, HIV infection and medication were the determinants of anemia among pregnant women. Therefore, anemia prevention strategies should include promotion of adequate intake of dark green leafy vegetables and chicken, improved meal patterns during the entire pregnancy, and strengthened prevention of mother-to-child HIV transmission/antenatal care programs. PMID:28288159
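As a reminder of the arithmetic behind such case-control results, the following Python sketch computes a crude odds ratio with a Wald-type confidence interval from a hypothetical 2x2 exposure table. The study's reported AORs come from multivariable logistic regression, which this sketch does not reproduce; the counts below are invented for illustration only.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald 95% CI from a 2x2 case-control table.

    a = exposed cases,    b = unexposed cases,
    c = exposed controls, d = unexposed controls.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Hypothetical exposure counts among 112 cases and 336 controls
or_, (lo, hi) = odds_ratio_ci(60, 52, 110, 226)
```

If the resulting confidence interval excludes 1, the exposure would be flagged as a candidate determinant and carried into the multivariable model for adjustment.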

  8. Development of a high spectral resolution surface albedo product for the ARM Southern Great Plains central facility

    Directory of Open Access Journals (Sweden)

    J. Delamere

    2011-09-01

    Full Text Available We present a method for identifying dominant surface type and estimating high spectral resolution surface albedo at the Atmospheric Radiation Measurement (ARM facility at the Southern Great Plains (SGP site in Oklahoma for use in radiative transfer calculations. Given a set of 6-channel narrowband visible and near-infrared irradiance measurements from upward and downward looking multi-filter radiometers (MFRs, four different surface types (snow-covered, green vegetation, partial vegetation, non-vegetated can be identified. A normalized difference vegetation index (NDVI is used to distinguish between vegetated and non-vegetated surfaces, and a scaled NDVI index is used to estimate the percentage of green vegetation in partially vegetated surfaces. Based on libraries of spectral albedo measurements, a piecewise continuous function is developed to estimate the high spectral resolution surface albedo for each surface type given the MFR albedo values as input. For partially vegetated surfaces, the albedo is estimated as a linear combination of the green vegetation and non-vegetated surface albedo values. The estimated albedo values are evaluated through comparison to high spectral resolution albedo measurements taken during several Intensive Observational Periods (IOPs and through comparison of the integrated spectral albedo values to observed broadband albedo measurements. The estimated spectral albedo values agree well with observations for the visible wavelengths constrained by the MFR measurements, but have larger biases and variability at longer wavelengths. Additional MFR channels at 1100 nm and/or 1600 nm would help constrain the high resolution spectral albedo in the near infrared region.
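The NDVI-based surface typing and linear albedo mixing described above can be sketched in a few lines. This is a hedged Python illustration: the thresholds, channel values, and library spectra are hypothetical placeholders, not the values used in the SGP product.

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from two MFR channels."""
    return (nir - red) / (nir + red)

def classify_surface(nir, red, veg_thresh=0.6, bare_thresh=0.2):
    """Crude surface-type classifier in the spirit of the MFR method.

    Thresholds are illustrative, not the paper's values. Returns the
    surface type and, for partially vegetated ground, the green
    fraction used to mix the two library albedo spectra linearly.
    """
    v = ndvi(nir, red)
    if v >= veg_thresh:
        return "green_vegetation", 1.0
    if v <= bare_thresh:
        return "non_vegetated", 0.0
    # scaled NDVI -> fraction of green vegetation
    frac = (v - bare_thresh) / (veg_thresh - bare_thresh)
    return "partial_vegetation", frac

def mixed_albedo(frac, veg_albedo, bare_albedo):
    """Linear combination of library spectra for partial vegetation."""
    return [frac * v + (1 - frac) * b for v, b in zip(veg_albedo, bare_albedo)]

# Example: hypothetical NIR/red irradiance ratios from the MFR
surface, green_frac = classify_surface(0.45, 0.20)
```

A piecewise continuous function fitted per surface type, as in the paper, would then extend the six MFR anchor points to a full high-resolution spectrum; the linear mixing above only handles the partially vegetated case.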

  9. Quality of Sulfadoxine-Pyrimethamine Given as Antimalarial Prophylaxis in Pregnant Women in Selected Health Facilities in Central Region of Ghana

    Directory of Open Access Journals (Sweden)

    Danny F. Yeboah

    2016-01-01

    Full Text Available The use of sulfadoxine-pyrimethamine (SP as an intermittent preventive treatment (IPT against malaria during pregnancy has become a policy in most sub-Sahara African countries and crucially depends on the efficacy of SP. This study sets out to evaluate the effectiveness of the SP given to the pregnant women in some selected health facilities in the Central Region of Ghana to prevent maternal malaria in pregnant women. A total of 543 pregnant women recruited from 7 selected health centres in Central Region of Ghana participated in the study. Parasite density of Plasmodium falciparum was determined from peripheral blood of the pregnant women using microscopy. High performance liquid chromatography (HPLC and dissolution tester were used to determine the quality of the SP. Malaria infection was recorded in 11.2% of pregnant women who had a history of SP consumption. SP failed the dissolution test. Pregnant women who did not receive IPT-SP were 44%. Low haemoglobin level was recorded in 73.5% of the pregnant women. The results indicated that SP was substandard. IPT-SP is ineffective in preventing malaria infection.

  10. Computational Insights into the Central Role of Nonbonding Interactions in Modern Covalent Organocatalysis.

    Science.gov (United States)

    Walden, Daniel M; Ogba, O Maduka; Johnston, Ryne C; Cheong, Paul Ha-Yeon

    2016-06-21

    The flexibility, complexity, and size of contemporary organocatalytic transformations pose interesting and powerful opportunities to computational and experimental chemists alike. In this Account, we disclose our recent computational investigations of three branches of organocatalysis in which nonbonding interactions, such as C-H···O/N interactions, play a crucial role in the organization of transition states, catalysis, and selectivity. We begin with two examples of N-heterocyclic carbene (NHC) catalysis, both collaborations with the Scheidt laboratory at Northwestern. In the first example, we discuss the discovery of an unusual diverging mechanism in a catalytic kinetic resolution of a dynamic racemate that depends on the stereochemistry of the product being formed. Specifically, the major product is formed through a concerted asynchronous [2 + 2] aldol-lactonization, while the minor products come from a stepwise spiro-lactonization pathway. Stereoselectivity and catalysis are the results of electrophilic activation from C-H···O interactions between the catalyst and the substrate and conjugative stabilization of the electrophile. In the second example, we show how knowledge and understanding of the computed transition states led to the development of a more enantioselective NHC catalyst for the butyrolactonization of acyl phosphonates. The identification of mutually exclusive C-H···O interactions in the computed major and minor TSs directly resulted in structural hypotheses that would lead to targeted destabilization of the minor TS, leading to enhanced stereoinduction. Synthesis and evaluation of the newly designed NHC catalyst validated our hypotheses. Next, we discuss two works related to Lewis base catalysis involving 4-dimethylaminopyridine (DMAP) and its derivatives. In the first, we discuss our collaboration with the Smith laboratory at St Andrews, in which we discovered the origins of the regioselectivity in carboxyl transfer reactions. We

  11. Facile formation of dendrimer-stabilized gold nanoparticles modified with diatrizoic acid for enhanced computed tomography imaging applications.

    Science.gov (United States)

    Peng, Chen; Li, Kangan; Cao, Xueyan; Xiao, Tingting; Hou, Wenxiu; Zheng, Linfeng; Guo, Rui; Shen, Mingwu; Zhang, Guixiang; Shi, Xiangyang

    2012-11-07

    We report a facile approach to forming dendrimer-stabilized gold nanoparticles (Au DSNPs) through the use of amine-terminated fifth-generation poly(amidoamine) (PAMAM) dendrimers modified by diatrizoic acid (G5.NH(2)-DTA) as stabilizers for enhanced computed tomography (CT) imaging applications. In this study, by simply mixing G5.NH(2)-DTA dendrimers with gold salt in aqueous solution at room temperature, dendrimer-entrapped gold nanoparticles (Au DENPs) with a mean core size of 2.5 nm formed spontaneously. After an acetylation reaction to neutralize the dendrimers' remaining terminal amines, Au DSNPs with a mean size of 6 nm were formed. The formed DTA-containing [(Au(0))(50)-G5.NHAc-DTA] DSNPs were characterized via different techniques. We show that the Au DSNPs are colloidally stable in aqueous solution under different pH and temperature conditions. In vitro hemolytic assay, cytotoxicity assay, flow cytometry analysis, and cell morphology observation reveal that the formed Au DSNPs have good hemocompatibility and are non-cytotoxic at concentrations up to 3.0 μM. X-ray absorption coefficient measurements show that the DTA-containing Au DSNPs have enhanced attenuation intensity, much higher than that of [(Au(0))(50)-G5.NHAc] DENPs without DTA or Omnipaque at the same molar concentration of the active element (Au or iodine). The formed DTA-containing Au DSNPs can be used for CT imaging of cancer cells in vitro as well as for blood pool CT imaging of mice in vivo with significantly improved signal enhancement. With the two radiodense elements Au and iodine incorporated within one particle, the formed DTA-containing Au DSNPs may be applicable for CT imaging of various biological systems with enhanced X-ray attenuation and detection sensitivity.

  12. Central nervous system involvement in systemic lupus erythematosus. The application of cranial computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nagaoka, S.; Ishigatsubo, Y.; Katou, K.; Sakamoto, H.; Chiba, J. (Yokohama City Univ. (Japan). Faculty of Medicine)

    1982-06-01

    Cranial computed tomography scans were performed on 47 patients with systemic lupus erythematosus (SLE). Abnormal findings in the computed tomograms (CT) were observed in 17 patients (36.2%). Cerebral atrophy was the most common feature (eight cases), followed by abnormal high density areas (five cases), abnormal low density areas (three cases), sulcal enlargement (two cases), intracranial hemorrhage (one case) and others (two cases). The abnormal cranial CT group of SLE was associated with a significantly higher incidence of urinary casts and of thrombocytopenia. In particular, the frequency of urinary casts was greater in the group with cerebral atrophy than in the group with normal CT findings, and there was a higher incidence of alopecia, leukopenia and thrombocytopenia in the group with intracranial calcifications. Neuropsychiatric involvements were noted in 70.6% of patients with CT abnormalities, but neuropsychiatric features (20.7%) and electroencephalographic abnormalities (44.8%) were also observed in patients with normal CT findings. The age at onset of SLE, the mean duration of the disease and the survival rate were not significantly different between the groups with and without CT abnormalities, but the mortality rate was significantly greater in the group with CT abnormalities, especially among those with brain atrophy. Concerning the relationship between the findings of cranial CT and corticosteroid treatment, there was no significant difference in either the total dose or the mean duration of prednisolone therapy. Although SLE patients with cerebral atrophy were taking a larger maintenance dose of corticosteroids, the differences were not statistically significant.

  13. Distribution and health risk assessment of some organic and inorganic substances in a petroleum facility in central Mexico

    Science.gov (United States)

    Flores-Serrano, R. M.; Torres, L. G.; Flores, C.; Castro, A.; Iturbe, R.

    An oil distribution and storage station was subjected to an environmental audit, and the results showed soil contamination over part of the surface. An assessment of the site was required in order to complete a full characterization of the contaminants present in soil and groundwater, as well as to establish the probable sources of contamination. In addition, a health risk assessment was performed to set remediation goals. The aim of this work is to show how the entire characterization and risk assessment process was performed at this storage station in central Mexico with regard to the subsoil and groundwater. Thirty sample points were examined. Total petroleum hydrocarbon concentrations in soil were in a very low range (20-268 mg/kg). Ethylbenzene, methyl tert-butyl ether, tert-amyl methyl ether, and lead were identified in one sampling point. Iron and zinc were found in all soil samples. There was no correlation between total petroleum hydrocarbons and either of the metals, or between the two metals. Only two out of four monitoring wells showed total petroleum hydrocarbon levels (1.4 and 66 mg/L, respectively). Regarding lead, all four monitored wells showed lead concentrations (0.043-0.15 mg/L). Results suggested that the metal concentrations were not associated with petroleum contamination but with iron scrap deposits placed on the soil; nevertheless, more data are needed to draw a clear conclusion. The health risk assessment showed that none of the evaluated contaminants represented a risk either for the on-site or the off-site receptors, since the estimated hazard quotients did not exceed the acceptable values.

  14. Analysis of First Level Health Care Facility (FKTP Readiness as ‘Gatekeeper’ on The JKN Implementation in East Kalimantan and Central Java Year 2014

    Directory of Open Access Journals (Sweden)

    Wasis Budiarto

    2016-06-01

    Full Text Available Background: FKTP as a gatekeeper has four functions: the first contact, sustainability, comprehensive health care and coordination. Readiness in the input aspects of health care involves health care facilities, finance, human resources, medicines and medical devices. This research identifies the FKTP readiness as a gatekeeper in the implementation of JKN in East Kalimantan and Central Java. Methods: Data were collected through interviews, observation and document review. The data analysis techniques were descriptive statistics for quantitative data and the Miles, Huberman and Spradley concept for qualitative data. There were 6 health centers, 2 primary clinics, 6 doctors and 3 private dentistry clinics for each province. Results: The numbers of available FKTP were 23.7 health centers and 3.2 primary clinics per district/city, an average of 51.7 FKTP per district/city. Not all primary clinics provided inpatient care. Of the primary clinics, 44% had their own laboratory and 56% collaborated with a private laboratory. The highest ratio of members to population was in Surakarta, i.e., 1 member per 1.98 population, while the lowest was in East Kutai with 1 member per 4.65 population. All health centres had been implementing gatekeeper functions. Most primary clinics performed the functions, although fewer had implemented excellent service. Physicians had implemented their functions as the first contact and coordination very well; moreover, dentists did the same as the first contact. Conclusion: The availability of health centers as FKTP was adequate, and health centers were ready to function as gatekeepers for JKN implementation. Primary clinics were ready for the first contact and coordination, and dentists as the first contact were all ready. Recommendation: The number of health facilities needs to be built up and improved according to each district/city's capability. Moreover, health workers' distribution should

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  16. Functional requirements document for the Earth Observing System Data and Information System (EOSDIS) Scientific Computing Facilities (SCF) of the NASA/MSFC Earth Science and Applications Division, 1992

    Science.gov (United States)

    Botts, Michael E.; Phillips, Ron J.; Parker, John V.; Wright, Patrick D.

    1992-01-01

    Five scientists at MSFC/ESAD have EOS SCF investigator status. Each SCF has unique tasks which require the establishment of a computing facility dedicated to accomplishing those tasks. A SCF Working Group was established at ESAD with the charter of defining the computing requirements of the individual SCFs and recommending options for meeting these requirements. The primary goal of the working group was to determine which computing needs can be satisfied using either shared resources or separate but compatible resources, and which needs require unique individual resources. The requirements investigated included CPU-intensive vector and scalar processing, visualization, data storage, connectivity, and I/O peripherals. A review of computer industry directions and a market survey of computing hardware provided information regarding important industry standards and candidate computing platforms. It was determined that the total SCF computing requirements might be most effectively met using a hierarchy consisting of shared and individual resources. This hierarchy is composed of five major system types: (1) a supercomputer class vector processor; (2) a high-end scalar multiprocessor workstation; (3) a file server; (4) a few medium- to high-end visualization workstations; and (5) several low- to medium-range personal graphics workstations. Specific recommendations for meeting the needs of each of these types are presented.

  17. Prevalence of Enterobacteriaceae in Tupinambis merianae (Squamata: Teiidae) from a captive facility in Central Brazil, with a profile of antimicrobial drug resistance in Salmonella enterica

    Directory of Open Access Journals (Sweden)

    Andréa de Moraes Carvalho

    2013-06-01

    Full Text Available The present study reports the presence of Enterobacteriaceae in tegu lizards (Tupinambis merianae) from a captive facility in central Brazil. From a total of 30 animals, 10 juveniles and 20 adults (10 males, 10 females), 60 samples were collected in two periods separated by 15 days. The samples were cultivated on xylose-lysine-deoxycholate agar (XLT4) and MacConkey agar. The Salmonella enterica isolates were tested for antimicrobial susceptibility. A total of 78 bacteria were isolated, of which 27 were from juveniles of T. merianae, 30 from adult males and 21 from adult females. Salmonella enterica was the most frequent bacterium, followed by Citrobacter freundii, Escherichia coli, Enterobacter sakazakii, Kluyvera sp., Citrobacter amalonaticus, Serratia marcescens, Citrobacter diversus, Yersinia frederiksenii, Serratia odorifera, and Serratia liquefaciens. Salmonella enterica subsp. diarizonae and houtenae showed resistance to cotrimoxazole, and serovar Salmonella enterica Worthington showed resistance to tetracycline and gentamicin. Salmonella enterica Panama and S. enterica subsp. diarizonae showed intermediate sensitivity to cotrimoxazole. In addition to Enterobacteriaceae, pathogenic serotypes of S. enterica also occur in the tegu lizard, and their antimicrobial resistance was confirmed.

  18. Modeling Potential Carbon Monoxide Exposure Due to Operation of a Major Rocket Engine Altitude Test Facility Using Computational Fluid Dynamics

    Science.gov (United States)

    Blotzer, Michael J.; Woods, Jody L.

    2009-01-01

    This viewgraph presentation reviews computational fluid dynamics as a tool for modelling the dispersion of carbon monoxide at the Stennis Space Center's A3 Test Stand. The contents include: 1) Constellation Program; 2) Constellation Launch Vehicles; 3) J2X Engine; 4) A-3 Test Stand; 5) Chemical Steam Generators; 6) Emission Estimates; 7) Located in Existing Test Complex; 8) Computational Fluid Dynamics; 9) Computational Tools; 10) CO Modeling; 11) CO Model results; and 12) Next steps.

  19. Automated Computer-Based Facility for Measurement of Near-Field Structure of Microwave Radiators and Scatterers

    DEFF Research Database (Denmark)

    Mishra, Shantnu R.; Pavlasek, Tomas J. F.; Muresan, Letitia V.

    1980-01-01

    An automatic facility for measuring the three-dimensional structure of the near fields of microwave radiators and scatterers is described. The amplitude and phase for different polarization components can be recorded in analog and digital form using a microprocessor-based system. The stored data...

  20. Computer-Enriched Instruction (CEI) Is Better for Preview Material Instead of Review Material: An Example of a Biostatistics Chapter, the Central Limit Theorem

    Science.gov (United States)

    See, Lai-Chu; Huang, Yu-Hsun; Chang, Yi-Hu; Chiu, Yeo-Ju; Chen, Yi-Fen; Napper, Vicki S.

    2010-01-01

    This study examines the timing using computer-enriched instruction (CEI), before or after a traditional lecture to determine cross-over effect, period effect, and learning effect arising from sequencing of instruction. A 2 x 2 cross-over design was used with CEI to teach central limit theorem (CLT). Two sequences of graduate students in nursing…

  1. Validation of Advanced Computer Codes for VVER Technology: LB-LOCA Transient in PSB-VVER Facility

    Directory of Open Access Journals (Sweden)

    A. Del Nevo

    2012-01-01

    Full Text Available The OECD/NEA PSB-VVER project provided unique and useful experimental data for code validation from PSB-VVER test facility. This facility represents the scaled-down layout of the Russian-designed pressurized water reactor, namely, VVER-1000. Five experiments were executed, dealing with loss of coolant scenarios (small, intermediate, and large break loss of coolant accidents, a primary-to-secondary leak, and a parametric study (natural circulation test aimed at characterizing the VVER system at reduced mass inventory conditions. The comparative analysis, presented in the paper, regards the large break loss of coolant accident experiment. Four participants from three different institutions were involved in the benchmark and applied their own models and set up for four different thermal-hydraulic system codes. The benchmark demonstrated the performances of such codes in predicting phenomena relevant for safety on the basis of fixed criteria.

  2. Investigation of seismicity and related effects at NASA Ames-Dryden Flight Research Facility, Computer Center, Edwards, California

    Science.gov (United States)

    Cousineau, R. D.; Crook, R., Jr.; Leeds, D. J.

    1985-01-01

    This report discusses a geological and seismological investigation of the NASA Ames-Dryden Flight Research Facility site at Edwards, California. Results are presented as seismic design criteria, with design values of the pertinent ground motion parameters, probability of recurrence, and recommended analogous time-history accelerograms with their corresponding spectra. The recommendations apply specifically to the Dryden site and should not be extrapolated to other sites with varying foundation and geologic conditions or different seismic environments.

  3. A Computer-Aided Instruction Program for Teaching the TOPS20-MM Facility on the DDN (Defense Data Network)

    Science.gov (United States)

    1988-06-01

    provide learning-by-doing, which means a student acquires knowledge by solving real-world problems [Ref. 2]. To tutor well, a system has to specify...apply artificial intelligence (AI) techniques. AI research such as natural-language understanding, knowledge representation, and inferencing has been...programming language by providing a programming environment and an online help facility. It has a tutoring module called Soft Tutor that examines the

  4. Computational and Experimental Characterization of the Mach 6 Facility Nozzle Flow for the Enhanced Injection and Mixing Project at NASA Langley Research Center

    Science.gov (United States)

    Drozda, Tomasz G.; Cabell, Karen F.; Passe, Bradley J.; Baurle, Robert A.

    2017-01-01

    Computational fluid dynamics analyses and experimental data are presented for the Mach 6 facility nozzle used in the Arc-Heated Scramjet Test Facility for the Enhanced Injection and Mixing Project (EIMP). This project, conducted at the NASA Langley Research Center, aims to investigate supersonic combustion ramjet (scramjet) fuel injection and mixing physics relevant to flight Mach numbers greater than 8. The EIMP experiments use a two-dimensional Mach 6 facility nozzle to provide the high-speed air simulating the combustor entrance flow of a scramjet engine. Of interest are the physical extent and the thermodynamic properties of the core flow at the nozzle exit plane. The detailed characterization of this flow is obtained from three-dimensional, viscous, Reynolds-averaged simulations. Thermodynamic nonequilibrium effects are also investigated. The simulations are compared with the available experimental data, which includes wall static pressures as well as in-stream static pressure, pitot pressure and total temperature obtained via in-stream probes positioned just downstream of the nozzle exit plane.

  5. Liquid Effluent Retention Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Liquid Effluent Retention Facility (LERF) is located in the central part of the Hanford Site. LERF is permitted by the State of Washington and has three liquid...

  6. Analysis on the Present Situation of Universities' Sport Facility Resources Construction in Central China

    Institute of Scientific and Technical Information of China (English)

    张大超; 陈海青

    2012-01-01

    Taking the sports facility resources of 16 "211 Project" universities and 12 first-tier universities in six provinces of Central China as the research object, this study analyzes the present situation of their sports facility construction using literature review, interview surveys, and mathematical statistics. The results show that: 1) the overall quantity of outdoor sports fields at "211 Project" and first-tier universities in Central China is insufficient, the provision rate is low, and the per-student area is small, failing to meet the needs of students' extracurricular sports training and ordinary students' after-class exercise; 2) the provision rate of sports facilities at first-tier universities is lower than that at "211 Project" universities; 3) swimming pools and covered playgrounds are relatively few in number, and the utilization rate of swimming pools is also rather low; 4) venues that develop students' intelligence and physical strength and that foster an adventurous spirit, such as outdoor activity bases and climbing facilities, are severely lacking. The study recommends expanding construction, planning rationally, improving utilization rates, and drawing on off-campus resources to promote the healthy development of university sports facility resources in Central China.

  7. Estimating Groundwater Concentrations from Mass Releases to the Aquifer at Integrated Disposal Facility and Tank Farm Locations Within the Central Plateau of the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Bergeron, Marcel P.; Freeman, Eugene J.

    2005-06-09

    This report summarizes groundwater-related numerical calculations that will support groundwater flow and transport analyses associated with the scheduled 2005 performance assessment of the Integrated Disposal Facility (IDF) at the Hanford Site. The report also provides potential supporting information to other ongoing Hanford Site risk analyses associated with the closure of single-shell tank farms and related actions. The IDF 2005 performance assessment analysis is using well intercept factors (WIFs), as outlined in the 2001 performance assessment of the IDF. The flow and transport analyses applied to these calculations use both a site-wide regional-scale model and a local-scale model of the area near the IDF. The regional-scale model is used to evaluate flow conditions, groundwater transport, and impacts from the IDF in the central part of the Hanford Site, at the core zone boundary around the 200 East and 200 West Areas, and along the Columbia River. The local-scale model is used to evaluate impacts from transport of contaminants to a hypothetical well 100 m downgradient from the IDF boundaries. Analyses similar to the regional-scale analysis of IDF releases are also provided at individual tank farm areas as additional information. To gain insight on how the WIF approach compares with other approaches for estimating groundwater concentrations from mass releases to the unconfined aquifer, groundwater concentrations were estimated with the WIF approach for two hypothetical release scenarios and compared with similar results using a calculational approach (the convolution approach). One release scenario evaluated with both approaches (WIF and convolution) involved a long-term source release from immobilized low-activity waste glass containing 25,550 Ci of technetium-99 near the IDF; another involved a hypothetical shorter-term release of approximately 0.7 Ci of technetium over 600 years from the S-SX tank farm area.
In addition, direct simulation results for both release
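The record does not give the convolution formulas; a minimal sketch of the idea, assuming a unit-release response function obtained from a separate unit-mass transport simulation (both time series below are invented for illustration), superposes the aquifer response over the release-rate history:

```python
import numpy as np

def well_concentration(release_rate, unit_response, dt):
    """Convolution approach: the concentration at the downgradient well is the
    discrete superposition c(t) = sum_tau release(tau) * unit_response(t - tau) * dt."""
    full = np.convolve(release_rate, unit_response) * dt
    return full[: len(release_rate)]

# Hypothetical pulse release of 2 mass units at t = 0 against a decaying
# unit-response curve.
c = well_concentration(np.array([2.0, 0.0, 0.0, 0.0]),
                       np.array([0.5, 0.25, 0.125, 0.0625]), 1.0)
```

For a long-term glass-leaching source the release series would simply be nonzero over many time steps, and the same superposition applies.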

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  9. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    Energy Technology Data Exchange (ETDEWEB)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables.

  10. Reliable Facility Location Problem with Facility Protection.

    Science.gov (United States)

    Tang, Luohao; Zhu, Cheng; Lin, Zaili; Shi, Jianmai; Zhang, Weiming

    2016-01-01

    This paper studies a reliable facility location problem with facility protection that aims to hedge against random facility disruptions by both strategically protecting some facilities and using backup facilities for the demands. An Integer Programming model is proposed for this problem, in which the failure probabilities of facilities are site-specific. A solution approach combining Lagrangian Relaxation and local search is proposed and is demonstrated to be both effective and efficient based on computational experiments on random numerical examples with 49, 88, 150 and 263 nodes in the network. A real case study for a 100-city network in Hunan province, China, is presented, based on which the properties of the model are discussed and some managerial insights are analyzed.
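The record describes the model only at a high level, so as a minimal sketch of its central ingredient (all names and numbers below are hypothetical, not from the paper), the following evaluates the expected service cost of one demand point under site-specific failure probabilities, with and without protecting the primary facility:

```python
def expected_cost(dist, fail_prob, protected, primary, backup, penalty):
    """Expected service distance of one demand point served by a primary and
    a backup facility; protected facilities are assumed never to fail."""
    q1 = 0.0 if primary in protected else fail_prob[primary]
    q2 = 0.0 if backup in protected else fail_prob[backup]
    return (1 - q1) * dist[primary] + q1 * ((1 - q2) * dist[backup] + q2 * penalty)

# Hypothetical demand point: facility 0 is close but unreliable, facility 1
# is farther but robust; `penalty` is the cost of losing the demand outright.
dist = {0: 10.0, 1: 20.0}
fail = {0: 0.5, 1: 0.1}
cost_unprotected = expected_cost(dist, fail, set(), primary=0, backup=1, penalty=100.0)
cost_protected = expected_cost(dist, fail, {0}, primary=0, backup=1, penalty=100.0)
```

Protecting facility 0 removes its failure term, dropping the expected cost from 19 to 10 in this toy instance; the paper's integer program chooses the protections and assignments that minimize the sum of such terms over all demands.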

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not least by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  12. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer: the ubiquitous portal of our work and personal lives. At this point, the computer is so common we hardly notice it in our view. It's difficult to envision that, not so long ago, it was a gigantic, room-sized structure accessible to only a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride. Edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  14. A rare case of dilated invaginated odontome with talon cusp in a permanent maxillary central incisor diagnosed by cone beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Jaya, Ranganathan; Kumar, Rangarajan Sundaresan Mohan; Srinivasan, Ramasamy [Dept. of Conservative Dentistry and Endodontics, Priyadarshini Dental College and Hospital, Chennai (India)

    2013-09-15

    It has been a challenge to establish the accurate diagnosis of developmental tooth anomalies based on periapical radiographs. Recently, three-dimensional imaging by cone beam computed tomography has provided useful information to investigate the complex anatomy of and establish the proper management for tooth anomalies. The most severe variant of dens invaginatus, known as dilated odontome, is a rare occurrence, and the cone beam computed tomographic findings of this anomaly have never been reported for an erupted permanent maxillary central incisor. The occurrence of talon cusp occurring along with dens invaginatus is also unusual. The aim of this report was to show the importance of cone beam computed tomography in contributing to the accurate diagnosis and evaluation of the complex anatomy of this rare anomaly.

  15. A Study of Computer Facility Sharing for Teaching Purposes

    Institute of Scientific and Technical Information of China (English)

    陶燕丽

    2014-01-01

    This paper analyzes the utilization rate of computer equipment used primarily for teaching and studies how to maximize the utilization of the computer equipment of independent laboratories. Using the Shapley value method from game theory, it derives a cost-sharing approach, designs a rational cost-apportionment mechanism among the independent laboratories, and proposes reasonable cost-allocation strategies for managing shared laboratory resources.
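The abstract names Shapley-value cost sharing without giving the cost function, so as a minimal sketch under an invented, symmetric three-lab cost game, the exact Shapley shares can be computed by averaging each lab's marginal cost over all join orders:

```python
from itertools import permutations
from math import factorial

def shapley_values(players, cost):
    """Exact Shapley value: average each player's marginal cost over every
    ordering in which the players could join the coalition.  `cost` maps a
    frozenset of players to that coalition's total cost."""
    shap = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = set()
        for p in order:
            before = cost(frozenset(coalition))
            coalition.add(p)
            shap[p] += cost(frozenset(coalition)) - before
    n_orders = factorial(len(players))
    return {p: v / n_orders for p, v in shap.items()}

# Hypothetical lab-sharing game: each lab alone would pay 10, any two labs
# sharing equipment pay 16 in total, and all three together pay 21.
labs = ("A", "B", "C")
table = {0: 0.0, 1: 10.0, 2: 16.0, 3: 21.0}
shares = shapley_values(labs, lambda S: table[len(S)])
```

With this symmetric game the three labs split the shared cost of 21 equally, 7 each; in practice the cost function would come from measured equipment and maintenance costs per coalition of laboratories.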

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  18. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  1. Design of a Vehicle-Mounted Central Control Computer

    Institute of Scientific and Technical Information of China (English)

    尹申燕; 黄莹莹

    2014-01-01

    This paper presents the design and implementation of a vehicle-mounted central control computer. Based on the functional requirements of its central control and management role, and the reliability requirements of vehicle-mounted systems operating in severe environments, the electrical and structural architecture of the whole system is discussed. The key design points of the main circuit boards, such as the embedded CPCI computing board, KVM switching board, gigabit network switching board, and multi-serial-port expansion board, are elaborated, and measures for structural reinforcement, vibration damping, and heat dissipation are proposed. The resulting central control computer offers high processing capability, convenient operation, abundant expansion interfaces, and high reliability, and the design method can serve as a reference for the design of similar vehicle-mounted central control computers.

  2. Using Centrality of Concept Maps as a Measure of Problem Space States in Computer-Supported Collaborative Problem Solving

    Science.gov (United States)

    Clariana, Roy B.; Engelmann, Tanja; Yu, Wu

    2013-01-01

    Problem solving likely involves at least two broad stages, problem space representation and then problem solution (Newell and Simon, Human problem solving, 1972). The metric centrality that Freeman ("Social Networks" 1:215-239, 1978) implemented in social network analysis is offered here as a potential measure of both. This development research…
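Freeman's degree centrality, the simplest of the centrality family referenced here, can be sketched for a toy concept map (the edge list below is hypothetical, not from the study):

```python
def degree_centrality(edges):
    """Freeman degree centrality for an undirected concept map: for each
    concept, the fraction of the other concepts it is directly linked to."""
    nodes = {n for e in edges for n in e}
    deg = {n: 0 for n in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(nodes)
    return {v: deg[v] / (n - 1) for v in nodes}

# Hypothetical concept map linking physics concepts.
cmap = [("force", "mass"), ("force", "acceleration"), ("mass", "weight")]
central = degree_centrality(cmap)
```

Here "force" links to two of the three other concepts, giving it centrality 2/3, which is the kind of score the authors propose as a proxy for how central a concept is in the group's problem-space representation.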

  3. User's manual for DELSOL2: a computer code for calculating the optical performance and optimal system design for solar-thermal central-receiver plants

    Energy Technology Data Exchange (ETDEWEB)

    Dellin, T.A.; Fish, M.J.; Yang, C.L.

    1981-08-01

    DELSOL2 is a revised and substantially extended version of the DELSOL computer program for calculating collector field performance and layout, and optimal system design for solar thermal central receiver plants. The code consists of a detailed model of the optical performance, a simpler model of the non-optical performance, an algorithm for field layout, and a searching algorithm to find the best system design. The latter two features are coupled to a cost model of central receiver components and an economic model for calculating energy costs. The code can handle flat, focused and/or canted heliostats, and external cylindrical, multi-aperture cavity, and flat plate receivers. The program optimizes the tower height, receiver size, field layout, heliostat spacings, and tower position at user specified power levels subject to flux limits on the receiver and land constraints for field layout. The advantages of speed and accuracy characteristic of Version I are maintained in DELSOL2.
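DELSOL2's actual search couples detailed optical, cost, and economic models; purely as a toy illustration of the search structure (the performance and cost formulas below are invented, not DELSOL2's), a constrained grid search over tower height and receiver area might look like:

```python
def optimize_design(heights, receiver_areas, power_target=100.0, flux_limit=10.0):
    """Toy grid search: pick the (tower height, receiver area) pair with the
    lowest cost-per-power among designs that meet the power target without
    exceeding the receiver flux limit."""
    best = None
    for h in heights:
        for a in receiver_areas:
            power = 0.8 * h * a ** 0.5          # invented performance model
            flux = power / a                    # mean receiver flux (invented units)
            if power < power_target or flux > flux_limit:
                continue                        # infeasible design, skip it
            cost = (100 + 2 * h + 50 * a) / power   # invented cost proxy
            if best is None or cost < best[0]:
                best = (cost, h, a)
    return best

best = optimize_design([50, 100, 150], [50, 100, 200])
```

The flux limit cuts off the tall-tower/small-receiver corner of the grid, mirroring how DELSOL2's receiver flux constraints shape its feasible design space.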

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  5. COMPUTING

    CERN Document Server

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  6. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  8. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  9. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  10. Treatment of malaria from monotherapy to artemisinin-based combination therapy by health professionals in urban health facilities in Yaoundé, central province, Cameroon

    OpenAIRE

    Bley Daniel; Malvy Denis; Vernazza-Licht Nicole; Gausseres Mathieu; Sayang Collins; Millet Pascal

    2009-01-01

    Abstract Background After adoption of artesunate-amodiaquine (AS/AQ) as first-line therapy for the treatment of uncomplicated malaria by the malaria control programme, this study was designed to assess the availability of anti-malarial drugs, treatment practices and acceptability of the new protocol by health professionals, in the urban health facilities and drugstores of Yaoundé city, Cameroon. Methods Between April and August 2005, retrospective and current information was collected by cons...

  11. The Remote Centralized Control Management Strategy for Campus Network-Based Multimedia Facilities

    Institute of Scientific and Technical Information of China (English)

    李伽

    2012-01-01

    A remote centralized control management system for multimedia facilities based on the campus network was designed. Its structure and functions are described, with emphasis on the operating modes of central-controller management and of remote management via Intel AMT technology. Drawing on years of experience managing multimedia facilities, a series of measures to improve the multimedia facility management system are proposed, such as building a classroom-assisted teaching platform and a teaching management platform, and integrating multiple systems.

  12. Facilities & Leadership

    Data.gov (United States)

    Department of Veterans Affairs — The facilities web service provides VA facility information. The VA facilities locator is a feature that is available across the enterprise, on any webpage, for the...

  13. FRS (Facility Registration System) Sites, Geographic NAD83, EPA (2007) [facility_registration_system_sites_LA_EPA_2007

    Data.gov (United States)

    Louisiana Geographic Information Center — This dataset contains locations of Facility Registry System (FRS) sites which were pulled from a centrally managed database that identifies facilities, sites or...

  14. Biochemistry Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Biochemistry Facility provides expert services and consultation in biochemical enzyme assays and protein purification. The facility currently features 1) Liquid...

  15. Development of a computer code for shielding calculation in X-ray facilities; Desenvolvimento de um codigo computacional para calculos de blindagem em salas radiograficas

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Diogo da S.; Lava, Deise D.; Affonso, Renato R.W.; Moreira, Maria de L.; Guimaraes, Antonio C.F., E-mail: diogosb@outlook.com, E-mail: deise_dy@hotmail.com, E-mail: raoniwa@yahoo.com.br, E-mail: malu@ien.gov.br, E-mail: tony@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    The construction of an effective barrier against the ionizing radiation present in X-ray rooms requires consideration of many variables. The methodology used to specify the thickness of the primary and secondary shielding of a traditional X-ray room considers the following factors: use factor, occupancy factor, distance between the source and the wall, workload, air kerma, and distance between the patient and the receptor. With these data it was possible to develop a computer program that identifies and uses these variables in functions obtained through regressions of the graphs provided by NCRP Report-147 (Structural Shielding Design for Medical X-Ray Imaging Facilities) to calculate the shielding of the room walls as well as of the darkroom wall and adjacent areas. With the methodology built, the program is validated by comparing its results with a base case provided by that report. The thickness values obtained cover various materials such as steel, wood and concrete. After validation, the program is applied to a real radiographic room, whose visual model is constructed with the help of software used for indoor and outdoor modeling. The resulting barrier-calculation program is a user-friendly tool for planning radiographic rooms in compliance with the limits established by CNEN-NN-3:01, published in September 2011.
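The report itself works from regressions of NCRP-147 graphs; one common formulation of the same calculation, assuming the standard NCRP-147 quantities and the Archer broad-beam transmission model (the fit parameters alpha, beta, gamma in the test are illustrative placeholders, not values from the report), is:

```python
import math

def transmission_goal(P, d, W, U, T):
    """Target barrier transmission B = P*d**2 / (W*U*T), with design dose
    limit P (mGy/wk), source-to-barrier distance d (m), workload W
    (mGy*m^2/wk at 1 m), use factor U, and occupancy factor T."""
    return P * d ** 2 / (W * U * T)

def required_thickness(B, alpha, beta, gamma):
    """Invert the Archer broad-beam model
    B(x) = ((1 + beta/alpha)*exp(alpha*gamma*x) - beta/alpha)**(-1/gamma)
    for the barrier thickness x that achieves transmission B."""
    r = beta / alpha
    return math.log((B ** -gamma + r) / (1 + r)) / (alpha * gamma)
```

In practice alpha, beta, and gamma are tabulated per shielding material and beam quality (kVp) in NCRP-147, so the same two steps, target transmission then thickness, apply to steel, wood, or concrete.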

  16. Development of a computational code for calculations of shielding in dental facilities; Desenvolvimento de um codigo computacional para calculos de blindagem em instalacoes odontologicas

    Energy Technology Data Exchange (ETDEWEB)

    Lava, Deise D.; Borges, Diogo da S.; Affonso, Renato R.W.; Guimaraes, Antonio C.F.; Moreira, Maria de L., E-mail: deise_dy@hotmail.com, E-mail: diogosb@outlook.com, E-mail: raoniwa@yahoo.com.br, E-mail: tony@ien.gov.br, E-mail: malu@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    This paper addresses shielding calculations intended to minimize the exposure of patients and/or personnel to ionizing radiation. The work draws on the radiation protection report NCRP-145 (Radiation Protection in Dentistry), which establishes the calculations and standards to be adopted to ensure the safety of those who may be exposed to ionizing radiation in dental facilities, according to the dose limits established by the CNEN-NN-3.1 standard published in September 2011. The methodology comprises the use of a computer language to process the data provided by that report, together with a commercial application used for creating residential and decoration projects. FORTRAN was adopted as the language for application to a real case. The result is a program capable of returning the thickness of materials, such as steel, lead, wood, glass, plaster, acrylic, and leaded glass, which can be used for effective shielding against single-pulse or continuous beams. Several variables are used to calculate the thickness of the shield: number of films used per week, film load, use factor, occupancy factor, distance between the wall and the source, transmission factor, workload, area definition, beam intensity, and intraoral versus panoramic examination. Before the methodology was applied, the results were validated against examples provided by NCRP-145; the recalculated examples agree with the report.

  17. Modelling of the human central nervous system using a hybrid soft computing methodology; Modelado del sistema nervioso central humano usando una metodología híbrida de soft computing

    OpenAIRE

    Acosta, Jesús; Nebot Castells, M. Àngela; Fuertes Armengol, José Mª

    2006-01-01

    From a medical point of view, the study and analysis of the human cardiovascular system by means of modelling and simulation methodologies is highly relevant, because it allows physicians to acquire a better knowledge of cardiovascular physiology, and thus to offer a more precise diagnosis and select the most appropriate therapy. The human cardiovascular system is made up of the hemodynamic system and the central nervous system (CNS). The CNS generates the signals that are transmit...

  18. Cost-effectiveness analysis of introducing RDTs for malaria diagnosis as compared to microscopy and presumptive diagnosis in central and peripheral public health facilities in Ghana

    DEFF Research Database (Denmark)

    Ansah, Evelyn K; Epokor, Michael; Whitty, Christopher J M

    2013-01-01

    Cost-effectiveness information on where malaria rapid diagnostic tests (RDTs) should be introduced is limited. We developed incremental cost-effectiveness analyses with data from rural health facilities in Ghana with and without microscopy. In the latter, where diagnosis had been presumptive......, the introduction of RDTs increased the proportion of patients who were correctly treated in relation to treatment with antimalarials, from 42% to 65% at an incremental societal cost of Ghana cedis (GHS)12.2 (US$8.3) per additional correctly treated patients. In the "microscopy setting" there was no advantage...

  19. Case of acute meningitis with clear cerebrospinal fluid: value of computed tomography for the diagnosis of central nervous system tuberculosis

    Energy Technology Data Exchange (ETDEWEB)

    Cesari, V.

    1986-11-06

    The author reports a case of acute meningitis with clear cerebrospinal fluid in which extensive bacteriologic investigations were negative making the etiologic diagnosis exceedingly difficult. Initiation of empiric antituberculous therapy was rapidly followed by clinical and biological improvement, without complications, and by resolution of abnormal findings on computed tomography of the brain. On these grounds, meningitis secondary to a tuberculoma in the temporal lobe was diagnosed. The author points out that tuberculous meningitis is still a severe, potentially fatal condition; this, together with the fact that tubercle bacilli are often very scarce or absent, requires that tuberculous meningitis be routinely considered in every patient with clear cerebrospinal fluid meningitis whose condition deteriorates. Computed tomography of the brain is essential to ensure rapid diagnosis and prompt initiation of antituberculous therapy. Lastly, the author points out that nowadays herpes simplex virus encephalopathy should also be considered.

  20. Central nervous system abnormalities on midline facial defects with hypertelorism detected by magnetic resonance image and computed tomography; Anomalias de sistema nervoso central em defeitos de linha media facial com hipertelorismo detectados por ressonancia magnetica e tomografia computadorizada

    Energy Technology Data Exchange (ETDEWEB)

    Lopes, Vera Lucia Gil da Silva; Giffoni, Silvio David Araujo [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Ciencias Medicas. Dep. de Genetica Medica]. E-mail: vlopes@fcm.unicamp.br

    2006-10-15

    The aims of this study were to describe and compare structural central nervous system (CNS) anomalies detected by magnetic resonance imaging (MRI) and computed tomography (CT) in individuals affected by midline facial defects with hypertelorism (MFDH), either isolated or associated with multiple congenital anomalies (MCA). The investigation protocol included dysmorphological examination, skull and facial X-rays, and brain CT and/or MRI. We studied 24 individuals, 12 of whom had an isolated form (Group I) while the others had MCA of unknown etiology (Group II). There was no significant difference between Groups I and II, and the results are presented together. In addition to the several CNS anomalies previously described, MRI (n=18) was useful for the detection of neuronal migration errors. These data suggest that structural CNS anomalies and MFDH have an intrinsic embryological relationship, which should be taken into account during clinical follow-up. (author)

  1. Treatment of malaria from monotherapy to artemisinin-based combination therapy by health professionals in urban health facilities in Yaoundé, central province, Cameroon

    Directory of Open Access Journals (Sweden)

    Bley Daniel

    2009-07-01

    Full Text Available Abstract Background: After the adoption of artesunate-amodiaquine (AS/AQ) as first-line therapy for the treatment of uncomplicated malaria by the malaria control programme, this study was designed to assess the availability of anti-malarial drugs, treatment practices, and the acceptability of the new protocol by health professionals in the urban health facilities and drugstores of Yaoundé city, Cameroon. Methods: Between April and August 2005, retrospective and current information was collected by consulting registers and interviewing health practitioners in urban health facilities using a structured questionnaire. Results: In 2005, twenty-seven trade-named drugs were identified in drugstores; quinine tablets (300 mg) were the most affordable anti-malarial drugs. Chloroquine was restricted to food market places and no generic artemisinin derivative was available in public health centres. In public health facilities, 13.6% of health professionals were informed about the new guidelines; 73.5% supported the use of AS/AQ as first-line therapy. However, 38.6% were apprehensive about its use due to adverse events attributed to amodiaquine. Malaria treatment was mainly based on the diagnosis of fever. Quinine (300 mg tablets) was the most commonly prescribed first-line anti-malarial drug in adults (44.5%) and pregnant women (52.5%). Artequin® was the most cited artemisinin-based combination therapy (ACT) (9.9%). Medical sales representatives were the main sources of information on anti-malarials. Conclusion: The use of AS/AQ was not implemented in 2005 in Yaoundé, despite the wide range of anti-malarials and trade-named artemisinin derivatives available. Nevertheless, medical practitioners will support the use of this combination when it is available in a paediatric formulation at an affordable price. Training, information, and the participation of health professionals in decision-making are key elements in improving adherence to new protocol guidelines. This baseline

  2. Azimuth-tracking photovoltaic power plants. Optical losses: energy performance of a facility; Centrales fotovoltaicas con seguimiento azimutal. Perdidas opticas: Comportamiento energetico de una instalacion

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, M.; Marroyo, L.; Lorenzo, E.; Perez, M.

    2009-07-01

    The quantity of radiation that reaches the cells inside photovoltaic modules is usually less than the radiation received on the module surface. This is mainly caused by dirt accumulated on that surface (dust, pollution, etc.) as well as by losses produced by reflection and absorption in the cell cover materials. These reflection and absorption losses depend on the angle of incidence of the radiation, which is why they are normally known as angular losses. Angular losses tend to increase as the surface becomes dirtier, and must therefore be considered together with the losses caused by the dirt itself. The term optical losses thus covers both types of losses, and knowledge of their value is very important when analysing the energy performance of a photovoltaic facility, as well as when predicting that performance. (Author) 18 refs.
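One common way to quantify these incidence-angle-dependent losses is the parametric model of Martín and Ruiz, in which a single angular-loss coefficient grows as the surface soils. The sketch below is illustrative only; the coefficient values are typical figures from the literature, not ones reported by this study.

```python
import math

def angular_loss(theta_deg, a_r=0.17):
    """Martin-Ruiz style relative angular loss for a flat PV module.

    theta_deg : angle of incidence in degrees (0 = normal incidence)
    a_r       : angular-loss coefficient; ~0.16-0.17 for clean glass,
                larger as dirt accumulates (illustrative figures).
    Returns the fraction of incident beam irradiance lost to
    reflection/absorption relative to normal incidence.
    """
    if theta_deg >= 90:
        return 1.0  # grazing incidence: nothing reaches the cells
    theta = math.radians(theta_deg)
    num = math.exp(-math.cos(theta) / a_r) - math.exp(-1.0 / a_r)
    den = 1.0 - math.exp(-1.0 / a_r)
    return num / den

# Losses grow steeply at grazing incidence:
losses = {ang: angular_loss(ang) for ang in (0, 30, 60, 80)}
```

A soiled module would be modelled simply by raising `a_r`, which increases the loss at every angle, consistent with the coupling between dirt and angular losses described above.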

  3. Computed tomography and magnetic resonance imaging of a central neurocytoma; Computertomographische und magnetresonanztomographische Befunde beim zentralen Neurozytom

    Energy Technology Data Exchange (ETDEWEB)

    Freund, M.; Jansen, O.; Haehnel, S.; Sartor, K. [Heidelberg Univ. (Germany). Abt. fuer Neuroradiologie; Geletneky, K. [Heidelberg Univ. (Germany). Neurochirurgische Klinik

    1998-05-01

    Purpose: Analysis of our CT and MRI findings in patients with central neurocytomas in comparison with the relevant literature. Results: All tumors originated in the roof of a lateral ventricle with participation of the septum pellucidum and extended intraventricularly (5/5), showed cystic components (5/5), and took up contrast medium (5/5); contrast medium uptake was visible in both CT and MRI. In contrast, only three of the 5 tumors revealed calcifications (3/5), which were better visible in CT than in MRI; no pathological vessels were detected (0/5). Because of its multiplanar representation and better soft-tissue contrast, MRI was superior to CT for the exact determination of the origin and position of the tumors. The small-cystic, inhomogeneous appearance in T2-weighted images was considered an especially typical feature. Conclusions: The typical appearance of central neurocytomas in CT and MRI provides information for the differential diagnosis from other intraventricular tumors; the definitive diagnosis is provided by neuropathological evaluation. (orig./AJ)

  4. 10-MWe solar-thermal central-receiver pilot plant, solar facilities design integration: collector-field optimization report (RADL item 2-25)

    Energy Technology Data Exchange (ETDEWEB)

    1981-01-01

    Appropriate cost and performance models and computer codes have been developed to carry out the collector field optimization, as well as additional computer codes to define the actual heliostat locations in the optimized field and to compute in detail the performance to be expected of the defined field. The range of capabilities of the available optimization and performance codes is described. The role of the optimization code in the definition of the pilot plant is specified, and a complete description of the optimization process itself is given. The detailed cost model used by the optimizer for the commercial system optimization is presented in the form of equations relating each cost element to the factors that determine it. The design basis for the commercial system is presented together with the rationale for its selection. The development of the individual-heliostat performance code is presented, and its use in a completed study of receiver panel power under sunrise startup conditions is described. The procedure whereby performance and heliostat spacing data from the representative commercial-scale system are converted into coefficients for use in the layout processor is described, together with the actual procedure used in the layout processor. Numerous special studies in support of the pilot plant design are described. (LEW)

  5. A mathematical model using computed tomography for the diagnosis of metastatic central compartment lymph nodes in papillary thyroid carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Tianrun [The Sixth Affiliated Hospital of Sun Yat-sen University, Guangzhou (China); Su, Xuan; Chen, Weichao; Zheng, Lie; Li, Li; Yang, AnKui [Sun Yat-sen University Cancer Center, Guangzhou (China)

    2014-11-15

    The purpose of this study was to establish a potential mathematical model for the diagnosis of central compartment lymph node (LN) metastases of papillary thyroid carcinoma (PTC) using CT imaging. 303 patients with PTC were enrolled. We determined the optimal cut-off points of LN size and nodal grouping by calculating the diagnostic value of each cut-off point. Then, calcification, cystic or necrotic change, abnormal enhancement, size, and nodal grouping were analysed by univariate and multivariate statistical methods. The mathematical model was obtained using binary logistic regression analysis, and a scoring system was developed for convenient use in clinical practice. An LN area (size) of 30 mm² and a nodal grouping criterion of two LNs had the best diagnostic value. The mathematical model was: p = e^y / (1 + e^y), with y = -0.670 - 0.087 × size + 1.010 × cystic or necrotic change + 1.371 × abnormal enhancement + 0.828 × nodal grouping + 0.909 × area. We assigned values of 25, 33, 20, and 22 to cystic or necrotic change, abnormal enhancement, size, and nodal grouping, respectively, yielding a scoring system. This mathematical model has a high diagnostic value and is a convenient clinical tool. (orig.)
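Evaluating the fitted model is a one-liner once the coefficients are transcribed. In the sketch below, the 0/1 coding of the predictors and the keyword names are assumptions made for illustration, since the abstract does not fully specify how each variable is coded.

```python
import math

def metastasis_probability(size, cystic, enhancement, grouping, area):
    """Logistic model from the abstract: p = e^y / (1 + e^y).
    Coefficients are transcribed from the abstract; the 0/1 coding of the
    binary predictors is an assumption, not spelled out in the source."""
    y = (-0.670 - 0.087 * size + 1.010 * cystic
         + 1.371 * enhancement + 0.828 * grouping + 0.909 * area)
    return math.exp(y) / (1.0 + math.exp(y))

# The paper's convenience scoring system: 25 + 33 + 20 + 22 = 100 points.
WEIGHTS = {"cystic": 25, "enhancement": 33, "size": 20, "grouping": 22}

def risk_score(cystic=0, enhancement=0, size=0, grouping=0):
    """Sum the assigned points for each positive CT finding."""
    flags = {"cystic": cystic, "enhancement": enhancement,
             "size": size, "grouping": grouping}
    return sum(WEIGHTS[k] for k, v in flags.items() if v)
```

Note that the four scoring weights sum to 100, which is what makes the score convenient at the bedside compared with evaluating the exponential directly.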

  6. Endodontic and esthetic management of a dilacerated maxillary central incisor having two root canals using cone beam computed tomography as a diagnostic aid.

    Science.gov (United States)

    Sharma, Sarang; Grover, Shibani; Sharma, Vivek; Srivastava, Dhirendra; Mittal, Meenu

    2014-01-01

    Traumatic injuries to the primary dentition are quite common. When primary teeth are subjected to trauma, force transmission and/or invasion of the underlying tooth germs lying in close proximity can result in a variety of disturbances in the permanent successors. These disturbances include hypoplasia, dilaceration, and alteration of the eruption sequence and pattern. Dilaceration is defined as an angulation or sharp bend or curve in the linear relationship of the crown of a tooth to its root. A rare case of maxillary left central incisor having crown dilaceration and Vertucci's type II canal configuration with symptomatic periapical periodontitis is reported. Cone beam computed tomography was used for better understanding of the anomaly and complicated root canal morphology. The tooth was successfully managed by nonsurgical root canal therapy and restoration with resin composite to restore esthetics.

  7. Endodontic and Esthetic Management of a Dilacerated Maxillary Central Incisor Having Two Root Canals Using Cone Beam Computed Tomography as a Diagnostic Aid

    Directory of Open Access Journals (Sweden)

    Sarang Sharma

    2014-01-01

    Full Text Available Traumatic injuries to the primary dentition are quite common. When primary teeth are subjected to trauma, force transmission and/or invasion of the underlying tooth germs lying in close proximity can result in a variety of disturbances in the permanent successors. These disturbances include hypoplasia, dilaceration, and alteration of the eruption sequence and pattern. Dilaceration is defined as an angulation or sharp bend or curve in the linear relationship of the crown of a tooth to its root. A rare case of maxillary left central incisor having crown dilaceration and Vertucci’s type II canal configuration with symptomatic periapical periodontitis is reported. Cone beam computed tomography was used for better understanding of the anomaly and complicated root canal morphology. The tooth was successfully managed by nonsurgical root canal therapy and restoration with resin composite to restore esthetics.

  8. Central odontogenic fibroma (simple type) in a four year old boy: Atypical cone-beam computed tomographic appearance with periosteal reaction

    Energy Technology Data Exchange (ETDEWEB)

    Anbiaee, Najme; Ebrahimnejad, Hamed; Sanaei, Alireza [Dept. of Oral and Maxillofacial Radiology, Maxillofacial Diseases Research Center, School of Dentistry, Mashhad University of Medical Sciences, Mashhad (Iran, Islamic Republic of)]

    2015-06-15

    Central odontogenic fibroma (COF) is a rare benign tumor that accounts for 0.1% of all odontogenic tumors. A case of COF (simple type) of the mandible in a four-year-old boy is described in this report. The patient showed asymptomatic swelling in the right inferior border of the lower jaw for one week. A panoramic radiograph showed a poorly-defined destructive unilocular radiolucent area. Cone-beam computed tomography showed expansion and perforation of the adjacent cortical bone plates. A periosteal reaction with the Codman triangle pattern was clearly visible in the buccal cortex. Since the tumor had destroyed a considerable amount of bone, surgical resection was performed. No recurrence was noted.

  9. Geohydrology of the High Energy Laser System Test Facility site, White Sands Missile Range, Tularosa Basin, south-central New Mexico

    Science.gov (United States)

    Basabilvazo, G.T.; Nickerson, E.L.; Myers, R.G.

    1994-01-01

    The Yesum-HoHoman and Gypsum land (hummocky) soils at the High Energy Laser System Test Facility (HELSTF) represent wind deposits from recently desiccated lacustrine deposits and deposits from the ancestral Lake Otero. The upper 15-20 feet of the subsurface consists of varved gypsiferous clay and silt. Below these surficial deposits the lithology consists of interbedded clay units, silty-clay units, and fine- to medium-grained quartz arenite units in continuous and discontinuous horizons. Clay horizons can cause perched water above the water table. Analyses of selected clay samples indicate that clay units are composed chiefly of kaolinite and mixed-layer illite/smectite. The main aquifer is representative of a leaky-confined aquifer. Estimated aquifer properties are: transmissivity (T) = 780 feet squared per day, storage coefficient (S) = 3.1 x 10^-3, and hydraulic conductivity (K) = 6.0 feet per day. Ground water flows south and southwest; the estimated hydraulic gradient is 5.3 feet per mile. Analyses of water samples indicate that ground water at the HELSTF site is brackish to slightly saline at the top of the main aquifer. Dissolved-solids concentration near the top of the main aquifer ranges from 5,940 to 11,800 milligrams per liter. Predominant ions are sodium and sulfate. At 815 feet below land surface, the largest dissolved-solids concentration measured is 111,000 milligrams per liter, which indicates increasing salinity with depth. Predominant ions are sodium and chloride.
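The reported aquifer numbers can be cross-checked with the basic Darcy relations T = K·b and q = K·i. The snippet below just reproduces that arithmetic; treating the aquifer as homogeneous so that the two relations hold exactly is, of course, a simplification.

```python
# Values reported in the record above:
T = 780.0         # transmissivity, ft^2/day
K = 6.0           # hydraulic conductivity, ft/day
i = 5.3 / 5280.0  # hydraulic gradient, ft per ft (5.3 ft/mile; 1 mile = 5280 ft)

b = T / K         # implied saturated thickness, ft (homogeneous-aquifer assumption)
q = K * i         # Darcy flux (specific discharge), ft/day
```

With these figures the implied saturated thickness is 130 ft and the specific discharge is about 0.006 ft/day, i.e. ground water moves on the order of a couple of feet per year before accounting for porosity.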

  10. A Milky Way with a massive, centrally concentrated thick disc: new Galactic mass models for orbit computations

    CERN Document Server

    Pouliasis, Ektoras; Haywood, Misha

    2016-01-01

    In this work, two new axisymmetric models for the Galactic mass distribution are presented. Motivated by recent results, these two models include the contribution of a stellar thin disc and of a thick disc, as massive as the thin counterpart but with a shorter scale-length. Both models satisfy a number of observational constraints: stellar densities at the solar vicinity, thin and thick disc scale lengths and heights, rotation curve(s), and the absolute value of the perpendicular force Kz as a function of distance to the Galactic centre. We numerically integrate into these new models the motion of all Galactic globular clusters for which distances, proper motions, and radial velocities are available, and the orbits of about one thousand stars in the solar vicinity. The retrieved orbital characteristics are compared to those obtained by integrating the clusters and stellar orbits in pure thin disc models. We find that, due to the possible presence of a thick disc, the computed orbital parameters of disc stars ...
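Orbit integration of the kind described can be sketched with a symplectic leapfrog in a single Miyamoto-Nagai disc component, a standard building block for axisymmetric Galactic mass models. The mass and scale parameters below are placeholders for illustration, not the values fitted in the paper.

```python
import math

G = 4.300917270e-6  # gravitational constant in kpc * (km/s)^2 / Msun

def mn_accel(x, y, z, M, a, b):
    """Acceleration from a Miyamoto-Nagai disc potential
    Phi = -G M / sqrt(R^2 + (a + sqrt(z^2 + b^2))^2)."""
    zb = math.sqrt(z * z + b * b)
    s = math.sqrt(x * x + y * y + (a + zb) ** 2)
    f = -G * M / s ** 3
    return f * x, f * y, f * z * (a + zb) / zb

def leapfrog(pos, vel, M, a, b, dt, steps):
    """Kick-drift-kick leapfrog integration; positions in kpc,
    velocities in km/s, dt in kpc/(km/s) (~0.98 Gyr per unit)."""
    x, y, z = pos
    vx, vy, vz = vel
    ax, ay, az = mn_accel(x, y, z, M, a, b)
    out = []
    for _ in range(steps):
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay; vz += 0.5 * dt * az
        x += dt * vx; y += dt * vy; z += dt * vz
        ax, ay, az = mn_accel(x, y, z, M, a, b)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay; vz += 0.5 * dt * az
        out.append((x, y, z))
    return out
```

From the stored positions one can then extract the orbital parameters the paper compares (peri- and apocentric radii, maximum height above the plane); the thin+thick disc models would simply sum the accelerations of two such components with different scale lengths.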

  11. Wind Energy Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Laurie, Carol

    2017-02-01

    This book takes readers inside the places where daily discoveries shape the next generation of wind power systems. Energy Department laboratory facilities span the United States and offer wind research capabilities to meet industry needs. The facilities described in this book make it possible for industry players to increase reliability, improve efficiency, and reduce the cost of wind energy -- one discovery at a time. Whether you require blade testing or resource characterization, grid integration or high-performance computing, Department of Energy laboratory facilities offer a variety of capabilities to meet your wind research needs.

  12. Avoiding Computer Viruses.

    Science.gov (United States)

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  13. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Science.gov (United States)

    Ni, Jianhua; Qian, Tianlu; Xi, Changbai; Rui, Yikang; Wang, Jiechen

    2016-01-01

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonability of existing urban healthcare facility distribution and optimize the location of new healthcare facilities. PMID:27548197

  14. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    Full Text Available The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonability of existing urban healthcare facility distribution and optimize the location of new healthcare facilities.

  15. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis.

    Science.gov (United States)

    Ni, Jianhua; Qian, Tianlu; Xi, Changbai; Rui, Yikang; Wang, Jiechen

    2016-08-18

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonability of existing urban healthcare facility distribution and optimize the location of new healthcare facilities.
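The correlation step described in these records reduces to computing a Pearson coefficient between two fields, street centrality and facility density, sampled at a common set of locations. A minimal sketch, with entirely hypothetical numbers standing in for the kernel-density estimates:

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, the statistic behind the
    street-centrality vs. facility-distribution comparison above."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data (hypothetical, not the Nanjing measurements): network kernel
# density of street centrality and of private-hospital density, sampled
# over the same grid cells.
centrality = [0.2, 0.5, 0.7, 0.9, 1.1]
hospital_density = [1.0, 2.1, 2.9, 4.2, 4.8]
r = pearson(centrality, hospital_density)
```

In the study itself the densities come from the weighted network kernel density estimation, so the sampling happens along the road network rather than on a planar grid, but the correlation arithmetic is the same.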

  16. Temporal evaluation of computed tomographic scans at a Level 1 trauma department in a central South African hospital

    Directory of Open Access Journals (Sweden)

    Tony Tiemesmann

    2016-03-01

    Full Text Available Background: Time is a precious commodity, especially in the trauma setting, which requires continuous evaluation to ensure streamlined service delivery, quality patient care and employee efficiency. Objectives: The present study analyses the authors' institution's multi-detector computed tomography (MDCT) scan process as part of the imaging turnaround time of trauma patients. It is intended to serve as a baseline for the institution, to offer a comparison with institutions worldwide, and to improve service delivery. Method: Relevant categorical data were collected from the trauma patient register and radiological information system (RIS) from 01 February 2013 to 31 January 2014. A population of 1107 trauma patients who received an MDCT scan was included in the study. Temporal data were analysed as a continuum with reference to triage priority, time of day, type of CT scan, and admission status. Results: The median trauma arrival to MDCT scan time (TTS) and reporting turnaround time (RTAT) were 69 (39–126) and 86 (53–146) minutes respectively. TTS was subdivided into the time from the patient's arrival at trauma to the radiology referral (TTRef) and from submission of the radiology request to arrival at the MDCT (RefTS). TTRef was statistically significantly longer than RefTS (p < 0.0001). RTAT was subdivided into the time from arrival at the MDCT to the start of the radiology report (STR) and the time taken to complete the report (RT). STR was statistically significantly longer than RT (p < 0.0001). Conclusion: The time to scan (TTS) was comparable to the findings of some first-world institutions, but the report turnaround time (RTAT) unfortunately lagged behind.
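The "median (Q1–Q3)" summaries quoted in such audits are straightforward to reproduce from raw interval data; the values below are toy numbers, not the study's measurements.

```python
import statistics

def median_iqr(minutes):
    """Summarize interval data as median and interquartile range,
    the 'median (Q1-Q3)' format used in turnaround-time audits."""
    q1, med, q3 = statistics.quantiles(minutes, n=4)  # default 'exclusive' method
    return med, (q1, q3)

times = [39, 55, 69, 90, 126]  # toy turnaround times in minutes
med, (q1, q3) = median_iqr(times)
```

Medians with IQRs are preferred over means here because turnaround times are heavily right-skewed: a handful of complex cases would drag a mean far above the typical patient's experience.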

  17. Fabrication Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Fabrication Facilities are a direct result of years of testing support. Through years of experience, the three fabrication facilities (Fort Hood, Fort Lewis, and...

  18. Computational and biochemical docking of the irreversible cocaine analog RTI 82 directly demonstrates ligand positioning in the dopamine transporter central substrate-binding site.

    Science.gov (United States)

    Dahal, Rejwi Acharya; Pramod, Akula Bala; Sharma, Babita; Krout, Danielle; Foster, James D; Cha, Joo Hwan; Cao, Jianjing; Newman, Amy Hauck; Lever, John R; Vaughan, Roxanne A; Henry, L Keith

    2014-10-24

    The dopamine transporter (DAT) functions as a key regulator of dopaminergic neurotransmission via re-uptake of synaptic dopamine (DA). Cocaine binding to DAT blocks this activity and elevates extracellular DA, leading to psychomotor stimulation and addiction, but the mechanisms by which cocaine interacts with DAT and inhibits transport remain incompletely understood. Here, we addressed these questions using computational and biochemical methodologies to localize the binding and adduction sites of the photoactivatable irreversible cocaine analog 3β-(p-chlorophenyl)tropane-2β-carboxylic acid, 4'-azido-3'-iodophenylethyl ester ([(125)I]RTI 82). Comparative modeling and small molecule docking indicated that the tropane pharmacophore of RTI 82 was positioned in the central DA active site with an orientation that juxtaposed the aryliodoazide group for cross-linking to rat DAT Phe-319. This prediction was verified by focused methionine substitution of residues flanking this site followed by cyanogen bromide mapping of the [(125)I]RTI 82-labeled mutants and by the substituted cysteine accessibility method protection analyses. These findings provide positive functional evidence linking tropane pharmacophore interaction with the core substrate-binding site and support a competitive mechanism for transport inhibition. This synergistic application of computational and biochemical methodologies overcomes many uncertainties inherent in other approaches and furnishes a schematic framework for elucidating the ligand-protein interactions of other classes of DA transport inhibitors.

  19. Facility Microgrids

    Energy Technology Data Exchange (ETDEWEB)

    Ye, Z.; Walling, R.; Miller, N.; Du, P.; Nelson, K.

    2005-05-01

    Microgrids are receiving considerable interest from the power industry, partly because their business and technical structure shows promise as a means of taking full advantage of distributed generation. This report investigates three issues associated with facility microgrids: (1) unintentional islanding protection for facility microgrids with multiple distributed generation units, (2) the response of facility microgrids to bulk-grid disturbances, and (3) intentional islanding of facility microgrids.

  20. RCRA Facility Investigation/Remedial Investigation Report with Baseline Risk Assessment for the Central Shops Burning/Rubble Pit (631-6G), Volume 1 Final

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

The Burning/Rubble Pits at the Savannah River Site were usually shallow excavations approximately 3 to 4 meters in depth. Operations at the pits consisted of collecting waste on a continuous basis and burning it on a monthly basis. The Central Shops Burning/Rubble Pit 631-6G (BRP6G) was constructed in 1951 as an unlined earthen pit in surficial sediments for disposal of paper, lumber, cans, and empty galvanized steel drums. The unit may have received other materials such as plastics, rubber, rags, cardboard, oil, degreasers, or drummed solvents. The BRP6G was operated from 1951 until 1955. After disposal activities ceased, the area was covered with soil. Hazardous substances, if present, may have migrated into the surrounding soil and/or groundwater. Because of this possibility, the United States Environmental Protection Agency (EPA) has designated the BRP6G as a Solid Waste Management Unit (SWMU) subject to the Resource Conservation and Recovery Act/Comprehensive Environmental Response, Compensation and Liability Act (RCRA/CERCLA) process.

  1. 47 CFR 69.110 - Entrance facilities.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Entrance facilities. 69.110 Section 69.110... Computation of Charges § 69.110 Entrance facilities. (a) A flat-rated entrance facilities charge expressed in... that use telephone company facilities between the interexchange carrier or other person's point...

  2. Effect of Spinal Degenerative Changes on Volumetric Bone Mineral Density of the Central Skeleton as Measured by Quantitative Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Guglielmi, G. [Hospital ' Casa Sollievo della Sofferenza' , San Giovanni Rotondo (Italy). Dept. of Radiology; Floriani, I.; Torri, V.; Li, J.; Kuijk, C. van; Genant, H.K.; Lang, T.F. [Scientific Inst. ' Mario Negri' , Milan (Italy). Biometry and Data Management Unit

    2005-05-01

Purpose: To evaluate the impact of degenerative changes due to osteoarthritis (OA) at the spine on volumetric bone mineral density (BMD) as measured by volumetric quantitative computed tomography (vQCT). Material and Methods: Eighty-four elderly women (mean age 73 ± 6 years), comprising 33 with vertebral fractures assessed by radiographs and 51 without vertebral fractures, were studied. Trabecular, cortical, and integral BMD were examined at the spine and hip using a helical CT scanner and were compared to dual X-ray absorptiometry (DXA) measurements at the same sites. OA changes visible on the radiographs were categorized into two grades according to severity. Differences in BMD measures obtained in the two groups of patients defined by OA grade using the described radiologic methods were compared using analysis of variance. Standardized differences (effect sizes) were also compared between radiologic methods. Results: Spinal trabecular BMD did not differ significantly between OA grade 0 and the higher OA grade. Spinal cortical and integral BMD measures showed statistically significant differences, as did the lumbar spine DXA BMD measurement (13%, P ≈ 0.02). The QCT measurements at the hip were also higher in OA subjects. Femoral trabecular BMD was 3-15% higher in higher-OA-grade subjects than in OA grade 0 subjects. The cortical BMD measures in the CT total femur and trochanter ROIs were also higher in the OA subjects. The integral QCT BMD measures in the hip showed differences between the OA grades. The DXA measurements in the neck and trochanter ROIs showed smaller differences (9 and 1%, respectively). There were no statistically significant differences in bone size. Conclusion: There is no evidence supporting that trabecular BMD measurements by QCT are influenced by OA. Instead, degenerative changes have an effect on both cortical and integral QCT, and on DXA at the lumbar spine and the hip. For subjects with established OA, assessment of BMD by

  3. LaRC local area networks to support distributed computing

    Science.gov (United States)

    Riddle, E. P.

    1984-01-01

The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, workstations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although LaRC has steadily migrated from a centralized to a distributed computing environment in recent years, the workload on the central resources has increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, workstations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.

  4. Computed tomography morphometric analysis of the central clival depression and petroclival angle for application of the presigmoid approach in the pediatric population

    Directory of Open Access Journals (Sweden)

    Sohum K Desai

    2016-01-01

Aims: Lateral transtemporal approaches are useful for addressing lesions located ventral to the brainstem, especially when the pathologic diagnosis of the tumor dictates that a gross or near-total resection improves outcomes. One such approach, the presigmoid approach, has received little attention in the pediatric population thus far. We sought to characterize morphometric changes, particularly the clival depth and the petroclival Cobb angle, that occur in the temporal bones of children, and to draw implications for performing a presigmoid approach in children. Settings and Design: This was a retrospective study performed at John Sealy Hospital, a level-one trauma center that also cares for pediatric injuries. Subjects and Methods: We performed a morphometric analysis of noncontrast computed tomography head studies in 96 boys and 67 girls. Central clival depth and petroclival angle were obtained in the axial plane at the level of the internal auditory meatus using the method described by Abdel Aziz et al. Statistical Analysis Used: Descriptive statistics and Student's t-test to compare groups were calculated using Microsoft Excel. Results: We found no gender difference in mean central clival depth or petroclival angle (P = 0.98 and P = 0.61, respectively). However, when we divided our cohort by age into those younger than 9 years of age and those 10 years or older, we found that the petroclival angle decreased by 6.2°, which was statistically significant (P < 0.000000006). Conclusions: These findings suggest that a presigmoid retrolabyrinthine approach is useful for children 9 years of age and younger, as the petroclival angle appears to decrease, resulting in a shallower clival depression in these patients.
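The group comparison above rests on a two-sample Student's t statistic. As an illustration only, with made-up angle values rather than the study's data, a pooled-variance t statistic can be computed in a few lines:

```python
import math

def students_t(a, b):
    """Two-sample Student's t statistic with pooled variance
    (the classic equal-variance form of the test)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical petroclival angles (degrees) for two age groups --
# illustrative values only, not the measurements from this study.
younger = [118.0, 121.5, 119.2, 122.8, 120.1]
older = [113.4, 115.0, 112.2, 116.1, 114.3]
t = students_t(younger, older)
```

The resulting t value would then be compared against the t distribution with `na + nb - 2` degrees of freedom to obtain the P value the authors report.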

  5. Variable gravity research facility

    Science.gov (United States)

    Allan, Sean; Ancheta, Stan; Beine, Donna; Cink, Brian; Eagon, Mark; Eckstein, Brett; Luhman, Dan; Mccowan, Daniel; Nations, James; Nordtvedt, Todd

    1988-01-01

    Spin and despin requirements; sequence of activities required to assemble the Variable Gravity Research Facility (VGRF); power systems technology; life support; thermal control systems; emergencies; communication systems; space station applications; experimental activities; computer modeling and simulation of tether vibration; cost analysis; configuration of the crew compartments; and tether lengths and rotation speeds are discussed.

6. Diagnosis of an infected central venous catheter with ultrasound and computed tomography [German title: Diagnosis of an infected thrombus of the inferior vena cava with ultrasound and computed tomography]

    Energy Technology Data Exchange (ETDEWEB)

    Tacke, J. [RWTH Aachen (Germany). Klinik fuer Radiologische Diagnostik; Adam, G. [RWTH Aachen (Germany). Klinik fuer Radiologische Diagnostik; Sliwka, U. [RWTH Aachen (Germany). Neurologische Klinik; Klosterhalfen, B. [RWTH Aachen (Germany). Inst. fuer Pathologie; Schoendube, F. [RWTH Aachen (Germany). Klinik fuer Thorax- Herz- und Gefaesschirurgie

    1995-08-01

The authors report the case of a 16-year-old male patient who suffered from meningitis and Waterhouse-Friderichsen syndrome. After initial improvement in the intensive care unit, he developed septic temperatures caused by an infected thrombus of a central venous catheter in the inferior vena cava. Color-coded ultrasound showed hyperechogenic signals and absent flow at the catheter tip. Computed tomography showed air bubbles in the thrombosed catheter tip and confirmed the diagnosis. Vascular surgery was performed and an infected, 17-cm-long thrombus was removed. (orig./VHE) [German abstract, translated: The authors report the case of a 16-year-old patient in whom a femoral central venous catheter was placed because of meningitis and signs of Waterhouse-Friderichsen syndrome. After initial defervescence, sepsis developed, caused by an infected thrombus of the central venous catheter. The diagnosis was made sonographically and subsequently confirmed by computed tomography; in both modalities, air inclusions in the catheter thrombus pointed to the infection. The finding was confirmed and treated by surgical thrombectomy.]

7. 6th July 2010 - United Kingdom Science and Technology Facilities Council W. Whitehorn signing the guest book with Head of International Relations F. Pauss, visiting the Computing Centre with Information Technology Department Deputy Head D. Foster, the LHC superconducting magnet test hall with P. Strubin of the Technology Department, the CERN Control Centre with Operation Group Leader M. Lamont, and the CLIC/CTF3 facility with Project Leader J.-P. Delahaye.

    CERN Multimedia

    Teams : M. Brice, JC Gadmer

    2010-01-01


  8. Survey of solar thermal test facilities

    Energy Technology Data Exchange (ETDEWEB)

    Masterson, K.

    1979-08-01

    The facilities that are presently available for testing solar thermal energy collection and conversion systems are briefly described. Facilities that are known to meet ASHRAE standard 93-77 for testing flat-plate collectors are listed. The DOE programs and test needs for distributed concentrating collectors are identified. Existing and planned facilities that meet these needs are described and continued support for most of them is recommended. The needs and facilities that are suitable for testing components of central receiver systems, several of which are located overseas, are identified. The central contact point for obtaining additional details and test procedures for these facilities is the Solar Thermal Test Facilities Users' Association in Albuquerque, N.M. The appendices contain data sheets and tables which give additional details on the technical capabilities of each facility. Also included is the 1975 Aerospace Corporation report on test facilities that is frequently referenced in the present work.

  9. Comparison of Knowledge and Attitudes Using Computer-Based and Face-to-Face Personal Hygiene Training Methods in Food Processing Facilities

    Science.gov (United States)

    Fenton, Ginger D.; LaBorde, Luke F.; Radhakrishna, Rama B.; Brown, J. Lynne; Cutter, Catherine N.

    2006-01-01

    Computer-based training is increasingly favored by food companies for training workers due to convenience, self-pacing ability, and ease of use. The objectives of this study were to determine if personal hygiene training, offered through a computer-based method, is as effective as a face-to-face method in knowledge acquisition and improved…

  10. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 2; Unsteady Analyses and Risk Assessment

    Science.gov (United States)

    Ahuja, Vineet; Hosangadi, Ashvin; Allgood, Daniel

    2008-01-01

Simulation technology can play an important role in rocket engine test facility design and development by assessing risks, providing analysis of dynamic pressure and thermal loads, identifying failure modes, and predicting anomalous behavior of critical systems. This is especially true for facilities such as the proposed A-3 facility at NASA SSC because of a challenging operating envelope linked to variable throttle conditions at relatively low chamber pressures. Assessing the feasibility of operating conditions and procedures is critical in such cases because startup/shutdown transients, moving shock structures, unsteady shock-boundary layer interactions, and engine and diffuser unstart modes can result in catastrophic failure. Analysis of such systems is difficult due to the resolution requirements needed to accurately capture moving shock structures, shock-boundary layer interactions, two-phase flow regimes, and engine unstart modes. In a companion paper, we demonstrate an advanced steady-state CFD capability for evaluating supersonic diffuser and steam ejector performance in the sub-scale A-3 facility. In this paper we address transient issues with the operation of the facility, especially at startup and shutdown, and assess risks related to afterburning due to the interaction of a fuel-rich plume with oxygen that is a by-product of the steam ejectors. The primary areas addressed in this paper are: (1) analyses of unstart modes due to flow transients, especially during startup/ignition, (2) engine safety during the shutdown process, and (3) interaction of the steam ejectors with the primary plume, i.e., flow transients as well as the probability of afterburning. In this abstract we discuss unsteady analyses of the engine shutdown process; the final paper will also include analyses of a staged startup, drawdown of the engine test cell pressure, and a risk assessment of potential afterburning in the facility.

11. Fault Data Processing Technology Applied in the Central Maintenance Computer System

    Institute of Scientific and Technical Information of China (English)

    李文娟; 贺尔铭; 马存宝

    2014-01-01

Based on how failures are generated and processed on board the aircraft, the functions, modeling strategy, and processing flow of the fault data processing module of the central maintenance computer system (CMCS) are analyzed. Fault message generation, cascaded-fault screening, consolidation of repeated faults, and the correlation of flight deck effects (FDEs) with maintenance messages are presented in turn. Taking an advanced foreign aircraft model as an example, logic equation-based fault isolation technology is analyzed in depth, linking the system design approach to the onboard fault diagnosis model. As CMCS technology continues to develop and mature, it will inevitably change the traditional maintenance and operating modes of commercial aircraft.
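The logic equation-based fault isolation described above can be illustrated with a small sketch. The message names, candidate units, and equations below are hypothetical, invented for illustration, not taken from any real CMCS:

```python
# Illustrative sketch (not an actual CMCS implementation): fault
# isolation via boolean logic equations that map observed fault
# messages to candidate faulty units, with cascaded-fault screening.

# Hypothetical fault messages reported by member systems.
messages = {"ADC_FAIL": True, "IAS_DISAGREE": True, "AP_DISCONNECT": True}

# Each candidate fault is a boolean function of the messages. The
# autopilot equation screens out a cascaded effect by requiring the
# upstream air-data message to be absent.
equations = {
    "AIR_DATA_COMPUTER": lambda m: m["ADC_FAIL"] and m["IAS_DISAGREE"],
    "AUTOPILOT_SERVO":   lambda m: m["AP_DISCONNECT"] and not m["ADC_FAIL"],
}

# Evaluate every equation; only un-screened candidates remain.
isolated = [unit for unit, eq in equations.items() if eq(messages)]
```

Here the autopilot disconnect is recognized as a cascade of the air-data fault, so only the upstream unit is isolated, which mirrors the cascaded-effect deletion the abstract describes.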

  12. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  13. Directory of computer users in nuclear medicine

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, J.J.; Gurney, J.; McClain, W.J. (eds.)

    1979-09-01

    The Directory of Computer Users in Nuclear Medicine consists primarily of detailed descriptions and indexes to these descriptions. A typical Installation Description contains the name, address, type, and size of the institution and the names of persons within the institution who can be contacted for further information. If the department has access to a central computer facility for data analysis or timesharing, the type of equipment available and the method of access to that central computer is included. The dedicated data processing equipment used by the department in its nuclear medicine studies is described, including the peripherals, languages used, modes of data collection, and other pertinent information. Following the hardware descriptions are listed the types of studies for which the data processing equipment is used, including the language(s) used, the method of output, and an estimate of the frequency of the particular study. An Installation Index and an Organ Studies Index are also included. (PCS)

  14. Canyon Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — B Plant, T Plant, U Plant, PUREX, and REDOX (see their links) are the five facilities at Hanford where the original objective was plutonium removal from the uranium...

  15. Mammography Facilities

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Mammography Facility Database is updated periodically based on information received from the four FDA-approved accreditation bodies: the American College of...

  16. Health Facilities

    Science.gov (United States)

    Health facilities are places that provide health care. They include hospitals, clinics, outpatient care centers, and specialized care centers, such as birthing centers and psychiatric care centers. When you ...

  17. Refurbishment and Automation of Thermal Vacuum Facilities at NASA/GSFC

    Science.gov (United States)

    Dunn, Jamie; Gomez, Carlos; Donohue, John; Johnson, Chris; Palmer, John; Sushon, Janet

    1999-01-01

    The thermal vacuum facilities located at the Goddard Space Flight Center (GSFC) have supported both manned and unmanned space flight since the 1960s. Of the eleven facilities, currently ten of the systems are scheduled for refurbishment or replacement as part of a five-year implementation. Expected return on investment includes the reduction in test schedules, improvements in safety of facility operations, and reduction in the personnel support required for a test. Additionally, GSFC will become a global resource renowned for expertise in thermal engineering, mechanical engineering, and for the automation of thermal vacuum facilities and tests. Automation of the thermal vacuum facilities includes the utilization of Programmable Logic Controllers (PLCs), the use of Supervisory Control and Data Acquisition (SCADA) systems, and the development of a centralized Test Data Management System. These components allow the computer control and automation of mechanical components such as valves and pumps. The project of refurbishment and automation began in 1996 and has resulted in complete computer control of one facility (Facility 281), and the integration of electronically controlled devices and PLCs in multiple others.

  18. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers, and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  19. High-speed packet switching network to link computers

    CERN Document Server

    Gerard, F M

    1980-01-01

    Virtually all of the experiments conducted at CERN use minicomputers today; some simply acquire data and store results on magnetic tape while others actually control experiments and help to process the resulting data. Currently there are more than two hundred minicomputers being used in the laboratory. In order to provide the minicomputer users with access to facilities available on mainframes and also to provide intercommunication between various experimental minicomputers, CERN opted for a packet switching network back in 1975. It was decided to use Modcomp II computers as switching nodes. The only software to be taken was a communications-oriented operating system called Maxcom. Today eight Modcomp II 16-bit computers plus six newer Classic minicomputers from Modular Computer Services have been purchased for the CERNET data communications networks. The current configuration comprises 11 nodes connecting more than 40 user machines to one another and to the laboratory's central computing facility. (0 refs).

20. Report on the control of the safety and security of nuclear facilities. Part 2: the reconversion of military plutonium stocks. The use of the aid granted to central and eastern European countries and to the new independent states

    Energy Technology Data Exchange (ETDEWEB)

    Birraux, C

    2002-07-01

    This report deals with two different aspects of the safety and security of nuclear facilities. The first aspect concerns the reconversion of weapon grade plutonium stocks: the plutonium in excess, plutonium hazards and nuclear fuel potentialities, the US program, the Russian program, the actions of European countries (France, Germany), the intervention of other countries, the unanswered questions (political aspects, uncertainties), the solutions of the future (improvement of reactors, the helium-cooled high temperature reactor technology (gas-turbine modular helium reactor: GT-MHR), the Carlo Rubbia's project). The second aspect concerns the actions carried out by the European Union in favor of the civil nuclear facilities of central and eastern Europe: the European Union competencies through the Euratom treaty, the conclusions of the European audit office about the PHARE and TACIS nuclear programs, the status of committed actions, the coming planned actions, and the critical analysis of the policy adopted so far. (J.S.)

  1. Distributed Assessment of Network Centrality

    CERN Document Server

    Wehmuth, Klaus

    2011-01-01

    We propose a method for the Distributed Assessment of Network CEntrality (DANCE) in complex networks. DANCE attributes to each node a volume-based centrality computed using only localized information, thus not requiring knowledge of the full network topology. We show DANCE is simple, yet efficient, in assessing node centrality in a distributed way. Our proposal also provides a way for locating the most central nodes, again using only the localized information at each node. We also show that the node rankings based on DANCE's centrality and the traditional closeness centrality correlate very well. This is quite useful given the vast potential applicability of closeness centrality, which is however limited by its high computational costs. We experimentally evaluate DANCE against a state-of-the-art proposal to distributively assess network centrality. Results attest that DANCE achieves similar effectiveness in assessing network centrality, but with a significant reduction in the associated costs for practical ap...
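For reference, the closeness centrality that DANCE's rankings are correlated against is the inverse of a node's average shortest-path distance to every other node, which is why it requires global topology and is expensive on large networks. A minimal BFS-based computation of it (this is standard closeness centrality, not DANCE itself, which uses only localized information) might look like:

```python
from collections import deque

def closeness(adj, v):
    """Closeness centrality of v: (n - 1) divided by the sum of
    shortest-path distances from v to all other nodes (unweighted
    BFS; assumes the graph is connected)."""
    dist = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return (len(adj) - 1) / sum(d for d in dist.values() if d)

# Small example graph (star plus a short tail): node 0 is the hub
# and should rank as the most central node.
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0, 4], 4: [3]}
ranking = sorted(adj, key=lambda v: closeness(adj, v), reverse=True)
```

Each call costs one full BFS, so ranking all n nodes is O(n(n + m)); the appeal of a distributed, localized estimate like DANCE's is avoiding exactly this cost while preserving the ranking.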

  2. Guide to user facilities at the Lawrence Berkeley Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    1984-04-01

Lawrence Berkeley Laboratory's user facilities are described. Specific facilities include: the National Center for Electron Microscopy; the Bevalac; the SuperHILAC; the Neutral Beam Engineering Test Facility; the National Tritium Labeling Facility; the 88-Inch Cyclotron; the Heavy Charged-Particle Treatment Facility; the 2.5 MeV Van de Graaff; the Sky Simulator; the Center for Computational Seismology; and the Low Background Counting Facility. (GHT)

3. Research and design of an asset management system for coal mine facilities based on cloud computing

    Institute of Scientific and Technical Information of China (English)

    李勇

    2014-01-01

The current state of asset management for coal mine facilities is introduced, the typical existing problems are analyzed, and a new asset management plan based on cloud computing is proposed. The overall plan and the system framework are then introduced in detail, and the key techniques involved in the system are discussed.

  4. Facility level thermal systems for the Advanced Technology Solar Telescope

    Science.gov (United States)

    Phelps, LeEllen; Murga, Gaizka; Fraser, Mark; Climent, Tània

    2012-09-01

    The management and control of the local aero-thermal environment is critical for success of the Advanced Technology Solar Telescope (ATST). In addition to minimizing disturbances to local seeing, the facility thermal systems must meet stringent energy efficiency requirements to minimize impact on the surrounding environment and meet federal requirements along with operational budgetary constraints. This paper describes the major facility thermal equipment and systems to be implemented along with associated energy management features. The systems presented include the central plant, the climate control systems for the computer room and coudé laboratory, the carousel cooling system which actively controls the surface temperature of the rotating telescope enclosure, and the systems used for active and passive ventilation of the telescope chamber.

  5. Thermal energy storage testing facility

    Science.gov (United States)

    Schoenhals, R. J.; Lin, C. P.; Kuehlert, H. F.; Anderson, S. H.

    1981-03-01

    Development of a prototype testing facility for performance evaluation of electrically heated thermal energy storage units is described. Laboratory apparatus and test procedures were evaluated by means of measurements and analysis. A 30kW central unit and several smaller individual room-size units were tested.

  6. Thermal energy storage testing facilities

    Science.gov (United States)

    Schoenhals, R. J.; Anderson, S. H.; Stevens, L. W.; Laster, W. R.; Elter, M. R.

    Development of a prototype testing facility for performance evaluation of electrically heated thermal energy storage units is discussed. Laboratory apparatus and test procedures are being evaluated by means of measurements and analysis. Testing procedures were improved, and test results were acquired for commercially available units. A 30 kW central unit and several smaller individual room-size units were tested.

  7. A Bioinformatics Facility for NASA

    Science.gov (United States)

    Schweighofer, Karl; Pohorille, Andrew

    2006-01-01

Building on an existing prototype, we have fielded a facility with bioinformatics technologies that will help NASA meet its unique requirements for biological research. This facility consists of a cluster of computers capable of performing computationally intensive tasks, software tools, databases, and knowledge management systems. Novel computational technologies for analyzing and integrating new biological data and existing knowledge have been developed. With continued development and support, the facility will fulfill NASA's strategic bioinformatics needs in astrobiology and space exploration. As a demonstration of these capabilities, we present a detailed analysis of how spaceflight factors impact gene expression in the liver and kidney of mice flown aboard shuttle flight STS-108. We have found that many genes involved in signal transduction, cell cycle, and development respond to changes in microgravity, but that most metabolic pathways appear unchanged.

  8. Centralized Fabric Management Using Puppet, Git, and GLPI

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Managing the infrastructure of a large and complex data center can be extremely difficult without taking advantage of automated services. Puppet is a seasoned, open-source tool designed for enterprise-class centralized configuration management. At the RHIC/ATLAS Computing Facility at Brookhaven National Laboratory, we have adopted Puppet as part of a suite of tools, including Git, GLPI, and some custom scripts, that comprise our centralized configuration management system. In this paper, we discuss the use of these tools for centralized configuration management of our servers and services; change management, which requires authorized approval of production changes; a complete, version-controlled history of all changes made; separation of production, testing, and development systems using Puppet environments; semi-automated server inventory using GLPI; and configuration change monitoring and reporting via the Puppet dashboard. We will also discuss scalability and performance results from using these tools on a...

  9. Trends in Facility Management Technology: The Emergence of the Internet, GIS, and Facility Assessment Decision Support.

    Science.gov (United States)

    Teicholz, Eric

    1997-01-01

    Reports research on trends in computer-aided facilities management using the Internet and geographic information system (GIS) technology for space utilization research. Proposes that facility assessment software holds promise for supporting facility management decision making, and outlines four areas for its use: inventory; evaluation; reporting;…

  10. Integral lightning protection system in petroleum facilities

    Energy Technology Data Exchange (ETDEWEB)

    Torres, Horacio; Gallego, Luis; Montana, Johny; Younes, Camilo; Rondon, Daniel; Gonzalez, Diego; Herrera, Javier; Perez, Ernesto; Vargas, Mauricio; Quintana, Carlos; Salgado, Milton [Universidad Nacional de Colombia, Bogota (Colombia)]. E-mail: paas@paas.unal.edu.co

    2001-07-01

This paper presents an integral lightning protection system, focused mainly on petroleum facilities and applied to a real case in Colombia, South America. As an introduction, a summary of the incidents of recent years, a diagnosis, and a proposed solution are presented. Finally, as part of the analysis, a lightning risk assessment for the Central Process Facility is shown. (author)

  11. The modernization of the process computer of the Trillo Nuclear Power Plant; Modernizacion del ordenador de proceso de la Central Nuclear de Trillo

    Energy Technology Data Exchange (ETDEWEB)

    Martin Aparicio, J.; Atanasio, J.

    2011-07-01

    The paper describes the modernization of the process computer of the Trillo Nuclear Power Plant. The process computer functions have been incorporated into the non-safety I&C platform selected at Trillo NPP: the Siemens SPPA-T2000 OM690 (formerly known as Teleperm XP). The upgrade of the human-machine interface of the control room was included in the project. The modernization project followed the same development process used in the upgrade of the process computers of German PWR nuclear power plants. (Author)

  12. Computer surety: computer system inspection guidance. [Contains glossary

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  13. Facile synthesis of silver nanoparticles and its antibacterial activity against Escherichia coli and unknown bacteria on mobile phone touch surfaces/computer keyboards

    Science.gov (United States)

    Reddy, T. Ranjeth Kumar; Kim, Hyun-Joong

    2016-07-01

    In recent years, there has been significant interest in the development of novel metallic nanoparticles using various top-down and bottom-up synthesis techniques. Kenaf is an abundant biomass product and a potential component for industrial applications. In this work, we investigated the green synthesis of silver nanoparticles (AgNPs) using kenaf (Hibiscus cannabinus) cellulose extract and sucrose, which act as stabilizing and reducing agents in solution. With this method, by changing the pH of the solution as a function of time, we studied the optical, morphological and antibacterial properties of the synthesized AgNPs. In addition, these nanoparticles were characterized by ultraviolet-visible spectroscopy, transmission electron microscopy (TEM), field-emission scanning electron microscopy, Fourier transform infrared (FTIR) spectroscopy and energy-dispersive X-ray spectroscopy (EDX). As the pH of the solution varies, the surface plasmon resonance peak also varies; a faster reaction rate was identified at pH 10 than at pH 5. TEM micrographs confirm that the particles are spherical and polygonal in shape. Furthermore, the average size of the nanoparticles synthesized at pH 5, pH 8 and pH 10 is 40.26, 28.57 and 24.57 nm, respectively. The structure of the synthesized AgNPs was identified as face-centered cubic (fcc) by X-ray diffraction (XRD). The compositional analysis was determined by EDX. FTIR confirms that the kenaf cellulose extract and sucrose act as stabilizing and reducing agents for the silver nanoparticles. Meanwhile, these AgNPs exhibited size-dependent antibacterial activity against Escherichia coli (E. coli) and two other unknown bacteria from mobile phone screens and computer keyboard surfaces.

  14. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  15. Air Quality Facilities

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Facilities with operating permits for Title V of the Federal Clean Air Act, as well as facilities required to submit an air emissions inventory, and other facilities...

  16. Theme: Laboratory Facilities Improvement.

    Science.gov (United States)

    Miller, Glen M.; And Others

    1993-01-01

    Includes "Laboratory Facilities Improvement" (Miller); "Remodeling Laboratories for Agriscience Instruction" (Newman, Johnson); "Planning for Change" (Mulcahy); "Laboratory Facilities Improvement for Technology Transfer" (Harper); "Facilities for Agriscience Instruction" (Agnew et al.); "Laboratory Facility Improvement" (Boren, Dwyer); and…

  17. A Review on Modern Distributed Computing Paradigms: Cloud Computing, Jungle Computing and Fog Computing

    OpenAIRE

    Hajibaba, Majid; Gorgin, Saeid

    2014-01-01

    Distributed computing attempts to improve performance in large-scale computing problems by resource sharing. Moreover, rising low-cost computing power, coupled with advances in communications/networking and the advent of big data, now enables new distributed computing paradigms such as Cloud, Jungle and Fog computing. Cloud computing brings a number of advantages to consumers in terms of accessibility and elasticity. It is based on centralization of resources that possess huge processing po...

  18. Deployable centralizers

    Energy Technology Data Exchange (ETDEWEB)

    Grubelich, Mark C.; Su, Jiann-Cherng; Knudsen, Steven D.

    2017-02-28

    A centralizer assembly is disclosed that can be deployed in situ. The assembly includes flexible members that can be extended into the well bore in situ by the initiation of a gas-generating device. The centralizer assembly can support a large load-carrying capability compared to a traditional bow spring, with little or no installation drag. Additionally, larger displacements can be produced to centralize an extremely deviated casing.

  19. Analysis of Various Computer System Monitoring and LCD Projector through the Network TCP/IP

    Directory of Open Access Journals (Sweden)

    Santoso Budijono

    2015-12-01

    Full Text Available Many electronic devices now include a network connection facility, and modern projectors offer network features to improve everyday usability. By using devices that can be controlled remotely, the availability and reliability of a presentation system (computer and projector) can be maintained so that it is always ready for use. Nevertheless, some projectors still lack network facilities, requiring additional equipment at considerable expense. In addition, controlling equipment in large quantities poses problems of timing and of the number of technicians needed. This study began with a literature review, searching for projectors with LAN support and the software to control them, and for computer control software, with a focus on ease of use and affordability. The result of this research is a system containing recommendations for the procurement of computer hardware, and projector hardware and software, each of which can be controlled centrally from a distance.
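The central polling that such a system performs can be sketched as follows. The one-line STATUS/READY protocol here is invented for illustration; real projectors speak vendor protocols (PJLink, for example) instead.

```python
import socket
import threading

def dummy_device(server_sock):
    """Stand-in for a network-attached projector: answers one status query."""
    conn, _ = server_sock.accept()
    with conn:
        if conn.recv(64).strip() == b"STATUS":
            conn.sendall(b"READY\n")

def poll_status(host, port, timeout=2.0):
    """Ask one device for its status; return the reply or 'UNREACHABLE'."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.sendall(b"STATUS\n")
            return s.recv(64).decode().strip()
    except OSError:
        return "UNREACHABLE"

# Demo against an in-process dummy device on an ephemeral local port.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=dummy_device, args=(srv,), daemon=True).start()
print(poll_status("127.0.0.1", port))  # READY
```

A central console would run `poll_status` on a schedule for every registered device and flag anything that reports `UNREACHABLE`.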

  20. LLNL superconducting magnets test facility

    Energy Technology Data Exchange (ETDEWEB)

    Manahan, R; Martovetsky, N; Moller, J; Zbasnik, J

    1999-09-16

    The FENIX facility at Lawrence Livermore National Laboratory was upgraded and refurbished in 1996-1998 for testing CICC superconducting magnets. The facility had been used for high-current, short-sample superconducting tests for fusion programs in the late 1980s and early 1990s. The new facility includes a 4-m diameter vacuum vessel, two refrigerators, a 40 kA, 42 V computer-controlled power supply, a new switchyard with a dump resistor, a new helium distribution valve box, several sets of power leads, a data acquisition system and other auxiliary systems, which together provide considerable flexibility in testing a wide variety of superconducting magnets over a wide range of parameters. The detailed parameters and capabilities of this test facility and its systems are described in the paper.

  1. Biotechnology Facility (BTF) for ISS

    Science.gov (United States)

    1998-01-01

    Engineering mockup shows the general arrangement of the planned Biotechnology Facility inside an EXPRESS rack aboard the International Space Station. This layout includes a gas supply module (bottom left), a control computer and laptop interface (bottom right), two rotating-wall vessels (top right), and support systems.

  2. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    The second half of the 20th century has been characterized by an explosive development in information technology (Maney, Hamm, & O'Brien, 2011). Processing power, storage capacity and network bandwidth have increased exponentially, resulting in new possibilities and shifting IT paradigms. In step with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production...

  3. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive survey of the principles and aspects of computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, the development of processors and storage systems, and the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. It also covers basic central processor functions, data storage and the organization of data by classification of computer files,

  4. Medical Image Analysis Facility

    Science.gov (United States)

    1978-01-01

    To improve the quality of photos sent to Earth by unmanned spacecraft, NASA's Jet Propulsion Laboratory (JPL) developed a computerized image enhancement process that brings out detail not visible in the basic photo. JPL is now applying this technology to biomedical research in its Medical Image Analysis Facility, which employs computer enhancement techniques to analyze x-ray films of internal organs, such as the heart and lung. A major objective is the study of the effects of stress on persons with heart disease. In animal tests, computerized image processing is being used to study coronary artery lesions and the degree to which they reduce arterial blood flow when stress is applied. The photos illustrate the enhancement process. The upper picture is an x-ray photo in which the artery (dotted line) is barely discernible; in the post-enhancement photo at right, the whole artery and the lesions along its wall are clearly visible. The Medical Image Analysis Facility offers a faster means of studying the effects of complex coronary lesions in humans, and the research now being conducted on animals is expected to have important application to the diagnosis and treatment of human coronary disease. Other uses of the facility's image-processing capability include analysis of muscle biopsy and pap smear specimens, and study of the microscopic structure of fibroprotein in the human lung. Working with JPL on the experiments are NASA's Ames Research Center, the University of Southern California School of Medicine, and Rancho Los Amigos Hospital, Downey, California.

  5. Computer Aided Mathematics

    DEFF Research Database (Denmark)

    Sinclair, Robert

    1998-01-01

    Course notes of a PhD course held in 1998. The central idea is to introduce students to computational mathematics using object oriented programming in C++.

  6. Indoor Lighting Facilities

    Science.gov (United States)

    Matsushima, Koji; Saito, Yoshinori; Ichikawa, Shigenori; Kawauchi, Takao; Tanaka, Tsuneo; Hirano, Rika; Tazuke, Fuyuki

    According to statistics from the Ministry of Land, Infrastructure and Transport, the total floor space of all building construction started was 188.87 million m2 (a 1.5% increase year on year), marking the fourth straight year of increase. Many large-scale buildings under construction in central Tokyo become fully occupied by tenants before completion. As for office buildings, comfortable and functional office spaces are required as working styles become more and more diversified, and lighting is one element of that functionality. The total floor space of construction started for exhibition pavilions, multipurpose halls, conference halls and religious architecture decreased 11.1% against the previous year. This marked a decline for 10 consecutive years, and the downward trend continues. In exhibition pavilions, light radiation is measured and adjusted throughout the year so that the lighting does not damage artworks. Hospitals, while providing higher-quality medical services and enhancing the living environment of patients, are expected to meet various restrictions and requirements, including respect for privacy. Meanwhile, lighting designs for school classrooms tend to be homogeneous, yet new ideas are being promoted to strike a balance between the economical and functional aspects. The severe economic environment continues to hamper the growth of theaters and halls in both the private and public sectors. Contrary to the downsizing trend of such facilities, additional installations of lighting equipment were conspicuous, and the adoption of high-efficacy lighting appliances and intelligent function control circuits is becoming popular. In the category of stores/commercial facilities, the construction of complex facilities is a continuing trend. Indirect lighting, high-luminance discharge lamps with excellent color rendition and LEDs are being used effectively in these facilities, together with the introduction of lighting designs

  7. Facility management in German hospitals.

    Science.gov (United States)

    Gudat, H

    2000-04-01

    Facility management and optimum building management offer hospitals a chance to reduce costs and to improve quality, process flows, employee motivation and customer satisfaction. Some years ago, simple services such as cleaning, catering or laundry were outsourced. Now, German hospitals are progressing to more complex fields such as building and medical technology, clinical support processes such as pharmacy, central laboratory and sterilization, and goods and logistics services.

  8. Central Libraries in Uncertain Times.

    Science.gov (United States)

    Kenney, Brian J.

    2001-01-01

    Discusses security and safety issues for public libraries, especially high-profile central facilities, in light of the September 11 terrorist attacks. Highlights include inspecting bags as patrons enter as well as exit; the need for security guidelines for any type of disaster or emergency; building design; and the importance of communication.…

  9. Central projection of auditory receptors in the prothoracic ganglion of the bushcricket Psorodonotus illyricus (Tettigoniidae): computer-aided analysis of the end branch pattern.

    Science.gov (United States)

    Ebendt, R; Friedel, J; Kalmring, K

    1994-01-01

    The projection patterns of morphologically and functionally identified auditory and auditory-vibratory receptor cells of the receptor organs (the crista acustica and the intermediate organ) in the foreleg of the tettigoniid Psorodonotus illyricus were investigated with combined recording and staining techniques, followed by histological examination and morphometric measurements. Using a computer program (AutoCAD), three-dimensional reconstructions of the axon end branches of receptor cells within the neuropile of the anterior Ring Tract (aRT) were made in order to determine the entire shape of each branch, the pattern and density of the end branches, and the positions of the target areas within the auditory neuropile. Clear differences between the functional types of receptors were found.

  10. North Slope, Alaska ESI: FACILITY (Facility Points)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains data for oil field facilities for the North Slope of Alaska. Vector points in this data set represent oil field facility locations. This data...

  11. Basic Research Firing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Basic Research Firing Facility is an indoor ballistic test facility that has recently transitioned from a customer-based facility to a dedicated basic research...

  12. Jupiter Laser Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Jupiter Laser Facility is an institutional user facility in the Physical and Life Sciences Directorate at LLNL. The facility is designed to provide a high degree...

  13. About the Facilities for the Aged in Central Urban Areas of Ningbo

    Institute of Scientific and Technical Information of China (English)

    2013-01-01

    As the population aging process speeds up in China, many cities are confronted with the major problem of how to ensure a comfortable later life for the aged. Since the supply and demand of facilities for the aged form the material basis for their well-being, this article analyzes the supply of and demand for such facilities, explores the problems with the existing provision, and proposes measures at both the hardware and software levels to resolve the current difficulties.

  14. Aperture area measurement facility

    Data.gov (United States)

    Federal Laboratory Consortium — NIST has established an absolute aperture area measurement facility for circular and near-circular apertures use in radiometric instruments. The facility consists of...

  15. Licensed Healthcare Facilities

    Data.gov (United States)

    California Department of Resources — The Licensed Healthcare Facilities point layer represents the locations of all healthcare facilities licensed by the State of California, Department of Health...

  16. Facility Registry Service (FRS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Facility Registry Service (FRS) provides an integrated source of comprehensive (air, water, and waste) environmental information about facilities across EPA,...

  17. Environmental Toxicology Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Fully-equipped facilities for environmental toxicology research The Environmental Toxicology Research Facility (ETRF) located in Vicksburg, MS provides over 8,200 ft...

  18. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne?s high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  19. Research on a multimedia classroom remote central supervision system based on a computer network

    Institute of Scientific and Technical Information of China (English)

    张桂宁

    2015-01-01

    With the rapid development of science and technology and the continuous improvement of living standards, multimedia teaching has gradually come to be taken seriously. In view of the current state of multimedia teaching, this paper analyzes and describes in detail the construction of a computer-network-based remote central monitoring system and central management system for multimedia classrooms, in the hope of contributing to the development of multimedia education in China.

  20. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  1. Guide to research facilities

    Energy Technology Data Exchange (ETDEWEB)

    1993-06-01

    This Guide provides information on facilities at US Department of Energy (DOE) and other government laboratories that focus on research and development of energy efficiency and renewable energy technologies. These laboratories have opened these facilities to outside users within the scientific community to encourage cooperation between the laboratories and the private sector. The Guide features two types of facilities: designated user facilities and other research facilities. Designated user facilities are one-of-a-kind DOE facilities that are staffed by personnel with unparalleled expertise and that contain sophisticated equipment. Other research facilities are facilities at DOE and other government laboratories that provide sophisticated equipment, testing areas, or processes that may not be available at private facilities. Each facility listing includes the name and phone number of someone you can call for more information.

  2. The CDF Central Analysis Farm

    Energy Technology Data Exchange (ETDEWEB)

    Kim, T.H.; /MIT; Neubauer, M.; /UC, San Diego; Sfiligoi, I.; /Frascati; Weems, L.; /Fermilab; Wurthwein, F.; /UC, San Diego

    2004-01-01

    With Run II of the Fermilab Tevatron well underway, many computing challenges inherent to analyzing large volumes of data produced in particle physics research need to be met. We present the computing model within CDF designed to address the physics needs of the collaboration. Particular emphasis is placed on current development of a large O(1000) processor PC cluster at Fermilab serving as the Central Analysis Farm for CDF. Future plans leading toward distributed computing and GRID within CDF are also discussed.

  3. Bucharest heavy ion accelerator facility

    Energy Technology Data Exchange (ETDEWEB)

    Ceausescu, V.; Dobrescu, S.; Duma, M.; Indreas, G.; Ivascu, M.; Papureanu, S.; Pascovici, G.; Semenescu, G.

    1986-02-15

    The heavy ion accelerator facility of the Heavy Ion Physics Department at the Institute of Physics and Nuclear Engineering in Bucharest is described. The Tandem accelerator development and the operation of the first stage of the heavy ion postaccelerating system are discussed. Details are given concerning the resonance cavities, the pulsing system matching the dc beam to the RF cavities and the computer control system.

  4. Application of cone beam computed tomography in the diagnosis of unerupted maxillary central incisor

    Institute of Scientific and Technical Information of China (English)

    顾月光; 王珊; 谷妍; 赵春洋; 许远; 刘可; 王林

    2012-01-01

    Objective: To study the value of cone beam computed tomography (CBCT) in the diagnosis of unerupted maxillary central incisors. Methods: 33 orthodontic patients with impacted maxillary central incisors were included. For each unerupted maxillary central incisor, both a panoramic radiograph (PR) and CBCT were used in diagnosis. Ten orthodontists measured and analyzed the PR and CBCT images of the 33 patients and completed a standardized research form. The data were analyzed with the McNemar test, the Kappa test and the paired t-test using SPSS 17.0 software. Results: Root dilaceration, the distance between the mesial incisor angle and the arch midline, and the presence of supernumerary teeth showed no statistically significant difference between PR and CBCT, while all other analysis items did. Conclusion: For the diagnosis of unerupted maxillary central incisors, CBCT is more direct, its three-dimensional measurements are more accurate, and it provides better guidance for clinical treatment.
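For readers unfamiliar with the statistics named in the abstract, the two agreement measures can be computed in a few lines of pure Python. The figures below are made up for illustration; they are not the study's data.

```python
# Illustrative implementations of Cohen's kappa (inter-rater agreement
# corrected for chance) and the McNemar chi-square statistic (paired
# nominal data), two of the tests used in the abstract.

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (p_observed - p_expected) / (1 - p_expected)

def mcnemar_chi2(b, c):
    """McNemar statistic from the two discordant cell counts b and c."""
    return (b - c) ** 2 / (b + c)

print(cohens_kappa([1, 1, 0, 0], [1, 0, 0, 1]))  # 0.0 (chance-level agreement)
print(mcnemar_chi2(5, 3))                        # 0.5
```

In practice a statistics package (the study used SPSS 17.0) also supplies the p-values; the hand-rolled versions above only show what the statistics measure.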

  5. CRYSNET manual. Informal report. [Hardware and software of crystallographic computing network

    Energy Technology Data Exchange (ETDEWEB)

    None,

    1976-07-01

    This manual describes the hardware and software which together make up the crystallographic computing network (CRYSNET). The manual is intended as a users' guide and also provides general information for persons without any experience with the system. CRYSNET is a network of intelligent remote graphics terminals that are used to communicate with the CDC Cyber 70/76 computing system at the Brookhaven National Laboratory (BNL) Central Scientific Computing Facility. Terminals are in active use by four research groups in the field of crystallography. A protein data bank has been established at BNL to store in machine-readable form atomic coordinates and other crystallographic data for macromolecules. The bank currently includes data for more than 20 proteins. This structural information can be accessed at BNL directly by the CRYSNET graphics terminals. More than two years of experience has been accumulated with CRYSNET. During this period, it has been demonstrated that the terminals, which provide access to a large, fast third-generation computer, plus stand-alone interactive graphics capability, are useful for computations in crystallography, and in a variety of other applications as well. The terminal hardware, the actual operations of the terminals, and the operations of the BNL Central Facility are described in some detail, and documentation of the terminal and central-site software is given. (RWR)

  6. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research including theoretic developments, new computational alg

  7. Women and Computers: An Introduction.

    Science.gov (United States)

    Perry, Ruth; Greber, Lisa

    1990-01-01

    Discusses women's central role in the development of the computer and their present day peripheral position, a progression paralleled in the fields of botany, medical care, and obstetrics. Affirms the importance of computer education to women. (DM)

  8. Computers, Children and Epistemology.

    Science.gov (United States)

    Blitman, Elaine; And Others

    1984-01-01

    The implementation of a computer education program in a Hawaiian school is described. The use of the LOGO programing language is central to the program, and teachers' comments on student behaviors and reactions are included. (MNS)

  9. Designing for the home: a comparative study of support aids for central heating systems.

    Science.gov (United States)

    Sauer, J; Wastell, D G; Schmeink, C

    2009-03-01

    The study examined the influence of different types of enhanced system support on user performance during the management of a central heating system. A computer-based simulation of a central heating system, called CHESS V2.0, was used to model different interface options, providing different support facilities to the user (e.g., historical, predictive, and instructional displays). Seventy-five participants took part in the study and completed a series of operational scenarios under different support conditions. The simulation environment allowed the collection of performance measures (e.g., energy consumption), information sampling, and system control behaviour. Subjective user evaluations of various aspects of the system were also measured. The results showed performance gains for predictive displays whereas no such benefits were observed for the other display types. The data also revealed that status and predictive displays were valued most highly by users. The implications of the findings for designers of central heating systems are discussed.
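A predictive display of the kind that performed best here essentially rolls a simple thermal model forward under the current settings. The sketch below is a minimal illustration with assumed coefficients; it is not the CHESS V2.0 model.

```python
# Minimal sketch of what a predictive display could forecast: a
# first-order room-temperature model stepped forward under the current
# heater setting. All coefficients are illustrative assumptions.

def predict_temperatures(current_temp, setpoint, heater_power_kw,
                         outside_temp=5.0, steps=6, dt_hours=0.5):
    """Forecast room temperature (deg C) over the next few time steps."""
    loss_coeff = 0.15   # heat loss per degree above outside temp, per hour (assumed)
    gain_per_kw = 1.2   # degrees gained per kW of heating, per hour (assumed)
    temps = [current_temp]
    t = current_temp
    for _ in range(steps):
        heating = gain_per_kw * heater_power_kw if t < setpoint else 0.0
        t += dt_hours * (heating - loss_coeff * (t - outside_temp))
        temps.append(round(t, 2))
    return temps

# The display would plot this trajectory so the user sees, before acting,
# whether the current setting will reach the setpoint.
trajectory = predict_temperatures(current_temp=18.0, setpoint=21.0,
                                  heater_power_kw=2.0)
print(trajectory)
```

The point of such a display is that users can compare predicted trajectories for different settings instead of discovering the outcome hours later, which is consistent with the performance gains the study reports.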

  10. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'Entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.
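When no spatial organizing principle exists, one common fallback decomposition (sketched below as an illustration; this is not the Parallel Particle Data Model itself) is to hash entity identifiers onto processor ranks, which keeps load roughly balanced regardless of how entities interact.

```python
# Hash-based entity decomposition: map each entity to a processor rank
# by hashing its identifier, so ownership is deterministic and load is
# statistically balanced without any spatial structure.

import hashlib

def owner_rank(entity_id, n_ranks):
    """Deterministically map an entity to a processor rank."""
    digest = hashlib.sha256(str(entity_id).encode()).hexdigest()
    return int(digest, 16) % n_ranks

def partition(entity_ids, n_ranks):
    """Group entities by owning rank."""
    buckets = {r: [] for r in range(n_ranks)}
    for e in entity_ids:
        buckets[owner_rank(e, n_ranks)].append(e)
    return buckets

parts = partition(range(10_000), n_ranks=8)
sizes = [len(v) for v in parts.values()]
print(sizes)  # eight bucket sizes, each close to 10000/8
```

The trade-off, compared with a spatial decomposition, is that interacting entities frequently live on different ranks, so communication rather than partitioning becomes the dominant cost.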

  11. Composite analysis E-area vaults and saltstone disposal facilities

    Energy Technology Data Exchange (ETDEWEB)

    Cook, J.R.

    1997-09-01

    This report documents the Composite Analysis (CA) performed on the two active Savannah River Site (SRS) low-level radioactive waste (LLW) disposal facilities. The facilities are the Z-Area Saltstone Disposal Facility and the E-Area Vaults (EAV) Disposal Facility. The analysis calculated potential releases to the environment from all sources of residual radioactive material expected to remain in the General Separations Area (GSA). The GSA is the central part of SRS and contains all of the waste disposal facilities, chemical separations facilities and associated high-level waste storage facilities as well as numerous other sources of radioactive material. The analysis considered 114 potential sources of radioactive material containing 115 radionuclides. The results of the CA clearly indicate that continued disposal of low-level waste in the saltstone and EAV facilities, consistent with their respective radiological performance assessments, will have no adverse impact on future members of the public.

  12. Central line infections - hospitals

    Science.gov (United States)

    ... infection; Central venous catheter - infection; CVC - infection; Central venous device - infection; Infection control - central line infection; Nosocomial infection - central line infection; Hospital acquired ...

  13. Wireless remote monitoring of critical facilities

    Science.gov (United States)

    Tsai, Hanchung; Anderson, John T.; Liu, Yung Y.

    2016-12-13

    A method, apparatus, and system are provided for monitoring environment parameters of critical facilities. A Remote Area Modular Monitoring (RAMM) apparatus is provided for monitoring environment parameters of critical facilities. The RAMM apparatus includes a battery power supply and a central processor. The RAMM apparatus includes a plurality of sensors monitoring the associated environment parameters and at least one communication module for transmitting one or more monitored environment parameters. The RAMM apparatus is powered by the battery power supply and controlled by the central processor, which operates a wireless sensor network (WSN) platform when the facility's condition is disrupted. The RAMM apparatus includes a housing prepositioned at a strategic location, for example, where a dangerous build-up of contamination and radiation may preclude subsequent manned entrance and surveillance.

  14. CENTRAL ASSESSMENT OF COMPUTED TOMOGRAPHY BRAIN SCANS

    Directory of Open Access Journals (Sweden)

    Lesley Ann Cala

    2016-08-01

    Development of multislice CT (MSCT) scanners since 1998 has made it possible to acquire submillimetre-thick slices without increasing the radiation dose to the patient. Although the incident x-ray beam is widened in the slice-thickness direction (Z-direction), the emergent x-rays fall upon multiple rows of small detectors. This means data can be collected simultaneously for more than one slice per rotation of the x-ray tube. For example, the dose received by the patient will be the same for four thin slices of 2.5 mm as for one slice of 10 mm thickness. A 64-slice MSCT can create 0.625 mm thick slices. This leads to high diagnostic value in the detection of small abnormalities in stroke patients and in the reconstruction of data from CT angiography (CTA) of the brain.
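    The slice geometry in the abstract can be verified with simple arithmetic (an illustrative check using only the numbers stated in the text above):

    ```python
    # Four 2.5 mm detector rows span the same z-coverage per tube rotation
    # as one 10 mm slice, so the widened incident beam delivers a
    # comparable dose while yielding four thinner slices.
    thin_slice_mm = 2.5
    n_rows = 4
    single_slice_mm = 10.0

    coverage_mm = thin_slice_mm * n_rows
    assert coverage_mm == single_slice_mm  # same anatomy imaged per rotation

    # A 64-row scanner with 0.625 mm rows covers 40 mm per rotation:
    coverage_64_mm = 64 * 0.625
    print(coverage_mm, coverage_64_mm)  # -> 10.0 40.0
    ```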

  15. Materiel Evaluation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — CRREL's Materiel Evaluation Facility (MEF) is a large cold-room facility that can be set up at temperatures ranging from −20°F to 120°F with a temperature change...

  16. Facilities for US Radioastronomy.

    Science.gov (United States)

    Thaddeus, Patrick

    1982-01-01

    Discusses major developments in radioastronomy since 1945. Topics include proposed facilities, very-long-baseline interferometric array, millimeter-wave telescope, submillimeter-wave telescope, and funding for radioastronomy facilities and projects. (JN)

  17. Integrated Disposal Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Located near the center of the 586-square-mile Hanford Site is the Integrated Disposal Facility, also known as the IDF.This facility is a landfill similar in concept...

  18. Energetics Conditioning Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Energetics Conditioning Facility is used for long term and short term aging studies of energetic materials. The facility has 10 conditioning chambers of which 2...

  19. Financing Professional Sports Facilities

    OpenAIRE

    Baade, Robert A.; Victor A. Matheson

    2011-01-01

    This paper examines public financing of professional sports facilities with a focus on both early and recent developments in taxpayer subsidization of spectator sports. The paper explores both the magnitude and the sources of public funding for professional sports facilities.

  20. Wastewater Treatment Facilities

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Individual permits for municipal, industrial, and semi-public wastewater treatment facilities in Iowa for the National Pollutant Discharge Elimination System (NPDES)...

  1. Facility Response Plan (FRP)

    Data.gov (United States)

    U.S. Environmental Protection Agency — A Facility Response Plan (FRP) demonstrates a facility's preparedness to respond to a worst case oil discharge. Under the Clean Water Act, as amended by the Oil...

  2. Projectile Demilitarization Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Projectile Wash Out Facility is US Army Ammunition Peculiar Equipment (APE 1300). It is a pilot scale wash out facility that uses high pressure water and steam...

  3. Environmental Toxicology Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Fully-equipped facilities for environmental toxicology researchThe Environmental Toxicology Research Facility (ETRF) located in Vicksburg, MS provides over 8,200 ft...

  4. Armament Technology Facility (ATF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Armament Technology Facility is a 52,000 square foot, secure and environmentally-safe, integrated small arms and cannon caliber design and evaluation facility....

  5. Dialysis Facility Compare

    Data.gov (United States)

    U.S. Department of Health & Human Services — Dialysis Facility Compare helps you find detailed information about Medicare-certified dialysis facilities. You can compare the services and the quality of care that...

  6. Ouellette Thermal Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Thermal Test Facility is a joint Army/Navy state-of-the-art facility (8,100 ft2) that was designed to:Evaluate and characterize the effect of flame and thermal...

  7. Cold Vacuum Drying Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Located near the K-Basins (see K-Basins link) in Hanford's 100 Area is a facility called the Cold Vacuum Drying Facility (CVDF).Between 2000 and 2004, workers at the...

  8. Explosive Components Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The 98,000 square foot Explosive Components Facility (ECF) is a state-of-the-art facility that provides a full-range of chemical, material, and performance analysis...

  9. Ouellette Thermal Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Thermal Test Facility is a joint Army/Navy state-of-the-art facility (8,100 ft2) that was designed to: Evaluate and characterize the effect of flame and thermal...

  10. Central Solenoid

    CERN Multimedia

    2002-01-01

    The Central Solenoid (CS) is a single layer coil wound internally in a supporting cylinder housed in the cryostat of the Liquid Argon Calorimeter. It was successfully tested at Toshiba in December 2000 and was delivered to CERN in September 2001 ready for integration in the LAr Calorimeter in 2003. An intermediate test of the chimney and proximity cryogenics was successfully performed in June 2002.

  11. Europa central

    Directory of Open Access Journals (Sweden)

    Karel BARTOSEK

    2010-02-01

    French research continues to take an interest in Central Europe. Of course, there are limits to this interest in the general climate of my new homeland: in the ignorance produced by France's long indifference toward this region after the Second World War, and in the behaviour and thinking of the political class and the media (an anecdote to illustrate this climate: during the preparation of our colloquium "Refugees and Immigrants from Central Europe in the Antifascist Movement and the Resistance in France, 1933-1945", held in Paris in October 1986, the problem of definition was posed concretely and "practically". There was even an eminent historian for whom Germany was not part of Central Europe!).

  12. Nuclear fuel cycle facility accident analysis handbook

    Energy Technology Data Exchange (ETDEWEB)

    Ayer, J E; Clark, A T; Loysen, P; Ballinger, M Y; Mishima, J; Owczarski, P C; Gregory, W S; Nichols, B D

    1988-05-01

    The Accident Analysis Handbook (AAH) covers four generic facilities: fuel manufacturing, fuel reprocessing, waste storage/solidification, and spent fuel storage; and six accident types: fire, explosion, tornado, criticality, spill, and equipment failure. These are the accident types considered to make major contributions to the radiological risk from accidents in nuclear fuel cycle facility operations. The AAH will enable the user to calculate source term releases from accident scenarios manually or by computer. A major feature of the AAH is development of accident sample problems to provide input to source term analysis methods and transport computer codes. Sample problems and illustrative examples for different accident types are included in the AAH.

  13. National Biomedical Tracer Facility. Project definition study

    Energy Technology Data Exchange (ETDEWEB)

    Schafer, R.

    1995-02-14

    We request a $25 million government-guaranteed, interest-free loan to be repaid over a 30-year period for construction and initial operations of a cyclotron-based National Biomedical Tracer Facility (NBTF) in North Central Texas. The NBTF will be co-located with a linear accelerator-based commercial radioisotope production facility, funded by the private sector at approximately $28 million. In addition, research radioisotope production by the NBTF will be coordinated through an association with an existing U.S. nuclear reactor center that will produce research and commercial radioisotopes through neutron reactions. The combined facilities will provide the full range of technology for radioisotope production and research: fast neutrons, thermal neutrons, and particle beams (H{sup -}, H{sup +}, and D{sup +}). The proposed NBTF facility includes an 80 MeV, 1 mA H{sup -} cyclotron that will produce proton-induced (neutron deficient) research isotopes.

  14. Bioinformatics and Computational Core Technology Center

    Data.gov (United States)

    Federal Laboratory Consortium — SERVICES PROVIDED BY THE COMPUTER CORE FACILITY Evaluation, purchase, set up, and maintenance of the computer hardware and network for the 170 users in the research...

  15. Steady State Vacuum Ultraviolet Exposure Facility With Automated Calibration Capability

    Science.gov (United States)

    Stueber, Thomas J.; Sechkar, Edward A.; Dever, Joyce A.; Banks, Bruce A.

    2000-01-01

    NASA Glenn Research Center at Lewis Field designed and developed a steady state vacuum ultraviolet automated (SSVUVa) facility with in situ VUV intensity calibration capability. The automated feature enables constant accelerated VUV radiation exposure over long periods of testing without breaking vacuum. This test facility is designed to simultaneously accommodate four isolated radiation exposure tests within the SSVUVa vacuum chamber. Computer control of the facility for long-term continuous operation also provides control and recording of thermocouple temperatures, periodic recording of VUV lamp intensity, and monitoring of vacuum facility status. This paper discusses the design and capabilities of the SSVUVa facility.

  16. Central pain.

    Science.gov (United States)

    Singh, Supreet

    2014-12-01

    Questions from patients about pain conditions and analgesic pharmacotherapy and responses from authors are presented to help educate patients and make them more effective self-advocates. The topic addressed in this issue is central pain, a neuropathic pain syndrome caused by a lesion in the brain or spinal cord that sensitizes one's perception of pain. It is a debilitating condition caused by various diseases such as multiple sclerosis, strokes, spinal cord injuries, or brain tumors. Varied symptoms and the use of pharmacological medicines and nonpharmacological therapies will be addressed.

  17. central t

    Directory of Open Access Journals (Sweden)

    Manuel R. Piña Monarrez

    2007-01-01

    Since Ridge Regression (RR) is a biased estimator that starts from the Least Squares (LS) regression solution, it is vital to establish the conditions under which the central Student's t distribution used for hypothesis testing in LS is also applicable to RR. The proof of this important result is presented in this article.

  18. Study on Storage Facilities of Agricultural Products in Courtyard in China

    Institute of Scientific and Technical Information of China (English)

    CHEN Li; LI Xi-Hong; XIA Qiu-Yu; HU Yun-feng; GUAN Wen-qiang

    2002-01-01

    Mini storage facilities applicable in rural areas in China have been developed after nine years of research. Optimal design of structure and refrigeration system, facilities optimization, computer control and management technology are studied and developed.

  19. Adiabatic quantum computing

    OpenAIRE

    Lobe, Elisabeth; Stollenwerk, Tobias; Tröltzsch, Anke

    2015-01-01

    In the recent years, the field of adiabatic quantum computing has gained importance due to the advances in the realisation of such machines, especially by the company D-Wave Systems. These machines are suited to solve discrete optimisation problems which are typically very hard to solve on a classical computer. Due to the quantum nature of the device it is assumed that there is a substantial speedup compared to classical HPC facilities. We explain the basic principles of adiabatic ...
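    The discrete optimisation problems the abstract refers to are commonly posed as QUBO (quadratic unconstrained binary optimisation) instances, the native problem class of D-Wave-style annealers. As an illustrative sketch only (not part of the cited work), a brute-force QUBO minimiser for a tiny instance makes the problem class concrete, and its exponential cost explains why large instances motivate quantum or HPC approaches:

    ```python
    from itertools import product

    def solve_qubo(Q):
        """Brute-force minimiser of x^T Q x over binary vectors x.
        Enumerates all 2^n assignments, so it is feasible only for tiny n;
        adiabatic annealers target exactly this problem class at scale."""
        n = len(Q)
        best_x, best_e = None, float("inf")
        for bits in product((0, 1), repeat=n):
            e = sum(Q[i][j] * bits[i] * bits[j]
                    for i in range(n) for j in range(n))
            if e < best_e:
                best_x, best_e = bits, e
        return best_x, best_e

    # Tiny example: each variable alone lowers the energy, but choosing
    # both incurs a +2 coupling penalty, so the optimum picks exactly one.
    Q = [[-1, 2],
         [0, -1]]
    x, e = solve_qubo(Q)
    print(x, e)  # -> (0, 1) -1
    ```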

  20. Ocean Thermal Energy Conversion (OTEC) test facilities study program. Final report. Volume I

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-01-17

    A comprehensive test program has been envisioned by ERDA to accomplish the OTEC program objectives of developing an industrial and technological base that will lead to the commercial capability to successfully construct and economically operate OTEC plants. This study was performed to develop alternative non-site specific OTEC test facilities/platform requirements for an integrated OTEC test program including both land and floating test facilities. A progression of tests was established in which OTEC power cycle component designs proceed through advanced research and technology, component, and systems test phases. This progression leads to the first OTEC pilot plant and provides support for following developments which potentially reduce the cost of OTEC energy. It also includes provisions for feedback of results from all test phases to enhance modifications to existing designs or development of new concepts. The tests described should be considered as representative of generic types since specifics can be expected to change as the OTEC plant design evolves. Emphasis is placed on defining the test facility which is capable of supporting the spectrum of tests envisioned. All test support facilities and equipment have been identified and included in terms of space, utilities, cost, schedule, and constraints or risks. A highly integrated data acquisition and control system has been included to improve test operations and facility effectiveness through a centralized computer system capable of automatic test control, real-time data analysis, engineering analyses, and selected facility control including safety alarms. Electrical power, hydrogen, and ammonia are shown to be technically feasible as means for transmitting OTEC power to a land-based distribution point. (WHK)

  1. Ubiquitous Computing: Potentials and Challenges

    OpenAIRE

    Sen, Jaydip

    2010-01-01

    The world is witnessing the birth of a revolutionary computing paradigm that promises to have a profound effect on the way we interact with computers, devices, physical spaces, and other people. This new technology, called ubiquitous computing, envisions a world where embedded processors, computers, sensors, and digital communications are inexpensive commodities that are available everywhere. This paper presents a comprehensive discussion on the central trends in ubiquitous computing consider...

  2. Instrumentation of the ESRF medical imaging facility

    CERN Document Server

    Elleaume, H; Berkvens, P; Berruyer, G; Brochard, T; Dabin, Y; Domínguez, M C; Draperi, A; Fiedler, S; Goujon, G; Le Duc, G; Mattenet, M; Nemoz, C; Pérez, M; Renier, M; Schulze, C; Spanne, P; Suortti, P; Thomlinson, W; Estève, F; Bertrand, B; Le Bas, J F

    1999-01-01

    At the European Synchrotron Radiation Facility (ESRF) a beamport has been instrumented for medical research programs. Two facilities have been constructed for alternative operation. The first one is devoted to medical imaging and is focused on intravenous coronary angiography and computed tomography (CT). The second facility is dedicated to pre-clinical microbeam radiotherapy (MRT). This paper describes the instrumentation for the imaging facility. Two monochromators have been designed, both are based on bent silicon crystals in the Laue geometry. A versatile scanning device has been built for pre-alignment and scanning of the patient through the X-ray beam in radiography or CT modes. An intrinsic germanium detector is used together with large dynamic range electronics (16 bits) to acquire the data. The beamline is now at the end of its commissioning phase; intravenous coronary angiography is intended to start in 1999 with patients and the CT pre-clinical program is underway on small animals. The first in viv...

  3. Facility Effluent Monitoring Plan determinations for the 600 Area facilities

    Energy Technology Data Exchange (ETDEWEB)

    Nickels, J.M.

    1991-08-01

    This document determines the need for Facility Effluent Monitoring Plans for Westinghouse Hanford Company's 600 Area facilities on the Hanford Site. The Facility Effluent Monitoring Plan determinations were prepared in accordance with A Guide For Preparing Hanford Site Facility Effluent Monitoring Plans (WHC 1991). Five major Westinghouse Hanford Company facilities in the 600 Area were evaluated: the Purge Water Storage Facility, 212-N, -P, and -R Facilities, the 616 Facility, and the 213-J K Storage Vaults. Of the five major facilities evaluated in the 600 Area, none will require preparation of a Facility Effluent Monitoring Plan.

  4. The Education Value of Cloud Computing

    Science.gov (United States)

    Katzan, Harry, Jr.

    2010-01-01

    Cloud computing is a technique for supplying computer facilities and providing access to software via the Internet. Cloud computing represents a contextual shift in how computers are provisioned and accessed. One of the defining characteristics of cloud software service is the transfer of control from the client domain to the service provider.…

  5. A comparison of data-access platforms for BaBar and ALICE analysis computing model at the Italian Tier1

    Energy Technology Data Exchange (ETDEWEB)

    Fella, A; Li Gioi, L; Noferini, F; Cavalli, A; Chierici, A; Dell' Agnello, L; Gregori, D; Italiano, A; Martelli, B; Prosperini, A; Ricci, P; Ronchieri, E; Salomoni, D; Sapunenko, V; Vitlacil, D [INFN-CNAF Bologna (Italy); Furano, F [Conseil Europeen Recherche Nucl. (CERN) (Switzerland); Steinke, M [Ruhr Universitaet, Bochum (Germany); Andreotti, D; Luppi, E, E-mail: armando.fella@cnaf.infn.i, E-mail: luigi.ligioi@cnaf.infn.i, E-mail: francesco.noferini@bo.infn.i [INFN Ferrara, Ferrara (Italy)

    2010-04-01

    Performance, reliability and scalability in data access are key issues in the context of Grid computing and High Energy Physics (HEP) data analysis. We present the technical details and the results of a large-scale validation and performance measurement achieved at the CNAF Tier1, the central computing facility of the Italian National Institute for Nuclear Research (INFN). The aim of this work is the evaluation of data access activity during analysis tasks within the BaBar and ALICE computing models against two of the most widely used data handling systems in the HEP arena: GPFS and Scalla/Xrootd.

  6. Central nervous system abnormalities on midline facial defects with hypertelorism detected by magnetic resonance image and computed tomography

    Directory of Open Access Journals (Sweden)

    Vera Lúcia Gil-da-Silva-Lopes

    2006-12-01

    The aim of this study was to describe and compare structural central nervous system (CNS) anomalies detected by magnetic resonance imaging (MRI) and computed tomography (CT) in individuals affected by midline facial defects with hypertelorism (MFDH), isolated or associated with multiple congenital anomalies (MCA). The investigation protocol included dysmorphological examination, skull and facial X-rays, and brain CT and/or MRI. We studied 24 individuals; 12 of them had an isolated form (Group I) and the others had MCA of unknown etiology (Group II). There was no significant difference between Groups I and II, and the results are presented together. In addition to the several CNS anomalies previously described, MRI (n=18) was useful for the detection of neuronal migration errors. These data suggest that structural CNS anomalies and MFDH have an intrinsic embryological relationship, which should be taken into account during clinical follow-up.

  7. Thermal distortion test facility

    Science.gov (United States)

    Stapp, James L.

    1995-02-01

    The thermal distortion test facility (TDTF) at Phillips Laboratory provides precise measurements of the distortion of mirrors that occurs when their surfaces are heated. The TDTF has been used for several years to evaluate mirrors being developed for high-power lasers. The facility has recently undergone some significant upgrades to improve the accuracy with which mirrors can be heated and the resulting distortion measured. The facility and its associated instrumentation are discussed.

  8. Synchrotron radiation facilities

    CERN Multimedia

    1972-01-01

    Particularly in the past few years, interest in using the synchrotron radiation emanating from high energy, circular electron machines has grown considerably. In our February issue we included an article on the synchrotron radiation facility at Frascati. This month we are spreading the net wider — saying something about the properties of the radiation, listing the centres where synchrotron radiation facilities exist, adding a brief description of three of them and mentioning areas of physics in which the facilities are used.

  9. ENLIGHT and LEIR biomedical facility.

    Science.gov (United States)

    Dosanjh, M; Cirilli, M; Navin, S

    2014-07-01

    Particle therapy (including protons and carbon ions) allows a highly conformal treatment of deep-seated tumours with good accuracy and minimal dose to surrounding tissues, compared to conventional radiotherapy using X-rays. Following impressive results from early phase trials, over the last decades particle therapy in Europe has made considerable progress in terms of new institutes dedicated to charged particle therapy in several countries. Particle therapy is a multidisciplinary subject that involves physicists, biologists, radio-oncologists, engineers and computer scientists. The European Network for Light Ion Hadron Therapy (ENLIGHT) was created in response to the growing needs of the European community to coordinate such efforts. A number of treatment centres are already operational and treating patients across Europe, including two dual ion (protons and carbon ions) centres in Heidelberg (the pioneer in Europe) and Pavia. However, much more research needs to be carried out and beamtime is limited. Hence there is a strong interest from the biomedical research community to have a facility with greater access to relevant beamtime. Such a facility would facilitate research in radiobiology and the development of more accurate techniques of dosimetry and imaging. The Low Energy Ion Ring (LEIR) accelerator at CERN presents such an opportunity, and relies partly on CERN's existing infrastructure. The ENLIGHT network, European Commission projects under the ENLIGHT umbrella and the future biomedical facility are discussed.

  10. Neutron Therapy Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Neutron Therapy Facility provides a moderate intensity, broad energy spectrum neutron beam that can be used for short term irradiations for radiobiology (cells)...

  11. Flexible Electronics Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Flexible Electronics Research Facility designs, synthesizes, tests, and fabricates materials and devices compatible with flexible substrates for Army information...

  12. High Combustion Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — At NETL's High-Pressure Combustion Research Facility in Morgantown, WV, researchers can investigate new high-pressure, high-temperature hydrogen turbine combustion...

  13. Magnetics Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Magnetics Research Facility houses three Helmholtz coils that generate magnetic fields in three perpendicular directions to balance the earth's magnetic field....

  14. Facility Environmental Management System

    Data.gov (United States)

    Federal Laboratory Consortium — This is the Web site of the Federal Highway Administration's (FHWA's) Turner-Fairbank Highway Research Center (TFHRC) facility Environmental Management System (EMS)....

  15. Geophysical Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Geophysical Research Facility (GRF) is a 60 ft long by 22 ft wide by 7 ft deep concrete...

  16. Manufacturing Demonstration Facility (MDF)

    Data.gov (United States)

    Federal Laboratory Consortium — The U.S. Department of Energy Manufacturing Demonstration Facility (MDF) at Oak Ridge National Laboratory (ORNL) provides a collaborative, shared infrastructure to...

  17. GPS Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Global Positioning System (GPS) Test Facility Instrumentation Suite (GPSIS) provides great flexibility in testing receivers by providing operational control of...

  18. Imagery Data Base Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Imagery Data Base Facility supports AFRL and other government organizations by providing imagery interpretation and analysis to users for data selection, imagery...

  19. Nonlinear Materials Characterization Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Nonlinear Materials Characterization Facility conducts photophysical research and development of nonlinear materials operating in the visible spectrum to protect...

  20. Transonic Experimental Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Transonic Experimental Research Facility evaluates aerodynamics and fluid dynamics of projectiles, smart munitions systems, and sub-munitions dispensing systems;...

  1. Target Assembly Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Target Assembly Facility integrates new armor concepts into actual armored vehicles. Featuring the capability of machining and cutting radioactive materials, it...

  2. Region 9 NPDES Facilities

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates...

  3. Pavement Testing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Comprehensive Environmental and Structural AnalysesThe ERDC Pavement Testing Facility, located on the ERDC Vicksburg campus, was originally constructed to provide an...

  4. Textiles Performance Testing Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Textiles Performance Testing Facilities has the capabilities to perform all physical wet and dry performance testing, and visual and instrumental color analysis...

  5. Catalytic Fuel Conversion Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This facility enables unique catalysis research related to power and energy applications using military jet fuels and alternative fuels. It is equipped with research...

  6. Materials Characterization Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Materials Characterization Facility enables detailed measurements of the properties of ceramics, polymers, glasses, and composites. It features instrumentation...

  7. Engine Test Facility (ETF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Air Force Arnold Engineering Development Center's Engine Test Facility (ETF) test cells are used for development and evaluation testing of propulsion systems for...

  8. Heated Tube Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Heated Tube Facility at NASA GRC investigates cooling issues by simulating conditions characteristic of rocket engine thrust chambers and high speed airbreathing...

  9. Geodynamics Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This GSL facility has evolved over the last three decades to support survivability and protective structures research. Experimental devices include three gas-driven...

  10. DUPIC facility engineering

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. J.; Lee, H. H.; Kim, K. H. and others

    2000-03-01

    The objectives of this study are (1) the refurbishment of PIEF (Post Irradiation Examination Facility) and the M6 hot-cell in IMEF (Irradiated Material Examination Facility), (2) the establishment of a facility compatible with DUPIC fuel fabrication experiments, licensed by the government organization, and (3) the establishment of the transportation system and transportation cask for nuclear material between facilities. The report for this project describes the objectives, necessity, scope, contents, results of the current step, and the future R&D plan.

  11. Mobile Solar Tracker Facility

    Data.gov (United States)

    Federal Laboratory Consortium — NIST's mobile solar tracking facility is used to characterize the electrical performance of photovoltaic panels. It incorporates meteorological instruments, a solar...

  12. Pavement Testing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Comprehensive Environmental and Structural Analyses The ERDC Pavement Testing Facility, located on the ERDC Vicksburg campus, was originally constructed to provide...

  13. Composite Structures Manufacturing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Composite Structures Manufacturing Facility specializes in the design, analysis, fabrication and testing of advanced composite structures and materials for both...

  14. Universal Drive Train Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This vehicle drive train research facility is capable of evaluating helicopter and ground vehicle power transmission technologies in a system level environment. The...

  15. Geospatial Data Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...

  16. Proximal Probes Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Proximal Probes Facility consists of laboratories for microscopy, spectroscopy, and probing of nanostructured materials and their functional properties. At the...

  17. Computation Directorate 2007 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Henson, V E; Guse, J A

    2008-03-06

    software; from expanding BlueGene/L, the world's most powerful computer, by 60% and using it to capture the most prestigious prize in the field of computing, to helping create an automated control system for the National Ignition Facility (NIF) that monitors and adjusts more than 60,000 control and diagnostic points; from creating a microarray probe that rapidly detects virulent high-threat organisms, natural or bioterrorist in origin, to replacing large numbers of physical computer servers with small numbers of virtual servers, reducing operating expense by 60%, the people in Computation have been at the center of weighty projects whose impacts are felt across the Laboratory and the DOE community. The accomplishments I just mentioned, and another two dozen or so, make up the stories contained in this report. While they form an exceptionally diverse set of projects and topics, it is what they have in common that excites me. They share the characteristic of being central, often crucial, to the mission-driven business of the Laboratory. Computational science has become fundamental to nearly every aspect of the Laboratory's approach to science and even to the conduct of administration. It is difficult to consider how we would proceed without computing, which occurs at all scales, from handheld and desktop computing to the systems controlling the instruments and mechanisms in the laboratories to the massively parallel supercomputers. The reasons for the dramatic increase in the importance of computing are manifest. Practical, fiscal, or political realities make the traditional approach to science, the cycle of theoretical analysis leading to experimental testing, leading to adjustment of theory, and so on, impossible, impractical, or forbidden. How, for example, can we understand the intricate relationship between human activity and weather and climate? We cannot test our hypotheses by experiment, which would require controlled use of the entire earth over centuries

  18. Residents Living in Residential Care Facilities: United States, 2010

    Science.gov (United States)

    ... of Residential Care Facilities. National Center for Health Statistics. Vital Health Stat 1(54). 2011. SUDAAN, release 10.0 [computer software]. Research Triangle Park, NC: RTI International. 2008. Suggested ...

  19. Experiments in computing: a survey.

    Science.gov (United States)

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  20. Computers and neurosurgery.

    Science.gov (United States)

    Shaikhouni, Ammar; Elder, J Bradley

    2012-11-01

    At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century.

  1. Shattering and Compressing Networks for Centrality Analysis

    CERN Document Server

    Sarıyüce, Ahmet Erdem; Kaya, Kamer; Çatalyürek, Ümit V

    2012-01-01

    Who is more important in a network? Who controls the flow between the nodes, or whose contribution is significant for connections? Centrality metrics play an important role in answering these questions. The betweenness metric is useful for network analysis and is implemented in various tools. Since it is one of the most computationally expensive kernels in graph mining, several techniques have been proposed for fast computation of betweenness centrality. In this work, we propose and investigate techniques which compress a network and shatter it into pieces so that the rest of the computation can be handled independently for each piece. Although we designed and tuned the shattering process for betweenness, it can be adapted for other centrality metrics in a straightforward manner. Experimental results show that the proposed techniques can substantially reduce the centrality computation time for various types of networks.
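The betweenness computation that these techniques accelerate can be sketched with Brandes' algorithm, the standard exact method. This is a minimal pure-Python version for small undirected graphs, not the paper's shattering/compression technique:

```python
from collections import deque

def betweenness(adj):
    """Brandes' algorithm for unweighted betweenness centrality.
    adj: dict mapping node -> list of neighbours (undirected graph)."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        # Single-source shortest paths by BFS.
        stack = []
        pred = {v: [] for v in adj}      # predecessors on shortest paths
        sigma = {v: 0 for v in adj}      # number of shortest paths from s
        dist = {v: -1 for v in adj}
        sigma[s], dist[s] = 1, 0
        q = deque([s])
        while q:
            v = q.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:          # first visit
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        # Back-propagation of path dependencies.
        delta = {v: 0.0 for v in adj}
        while stack:
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    # Each undirected pair is counted from both endpoints.
    return {v: c / 2 for v, c in bc.items()}
```

On a path a-b-c the middle node scores 1.0 (it sits on the single shortest path between the endpoints), while the hub of a 4-node star scores 3.0, one for each leaf pair.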

  2. Miniaturized Airborne Imaging Central Server System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is a miniaturized airborne imaging central server system (MAICSS). MAICSS is designed as a high-performance-computer-based electronic backend that...

  3. Miniaturized Airborne Imaging Central Server System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is a miniaturized airborne imaging central server system (MAICSS). MAICSS is designed as a high-performance computer-based electronic backend that...

  4. Assessment of Recreational Facilities in Federal Capital City, Abuja, Nigeria

    Directory of Open Access Journals (Sweden)

    Cyril Kanayo Ezeamaka

    2016-06-01

    Full Text Available The Abuja Master Plan provided for the development of adequate green areas and other recreational facilities within the Federal Capital City (FCC) as part of its sustainability principles, allocating recreational facilities within each neighborhood (FCDA, 1979). However, there have been several recent outcries about the poor development of recreational facilities and the abuse of the Master Plan in the FCC. The motivation for this study arose from the observation that recreational facilities in Phase 1 of the Federal Capital City, Abuja, have not been developed as intended by the policy makers, hence the need to identify the recreational facilities in Phase 1 of the FCC and assess their level of development and usage. The field survey revealed that the Central Business District and Gazupe have the higher numbers of recreational facilities, with 45 and 56 respectively, while the Wuse II (A08) and Wuse II (A07) Districts have fewer, with 10 and 17. The field survey further revealed that all districts in Phase 1 have over 35% of cases of land-use change from recreational facilities to other uses. The survey shows that over 65% of these recreational facilities are fully developed, but that only about 11% of the recreational sporting facilities were developed in line with the Abuja Master Plan in Phase 1. The study concludes that recreational facilities in Phase 1 of the FCC, Abuja, have not been developed in compliance with the Abuja Master Plan.

  5. Computation and Analysis of the Global Distribution of the Radioxenon Isotope 133Xe based on Emissions from Nuclear Power Plants and Radioisotope Production Facilities and its Relevance for the Verification of the Nuclear-Test-Ban Treaty

    Science.gov (United States)

    Wotawa, Gerhard; Becker, Andreas; Kalinowski, Martin; Saey, Paul; Tuma, Matthias; Zähringer, Matthias

    2010-05-01

    Monitoring of radioactive noble gases, in particular xenon isotopes, is a crucial element of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The capability of the noble gas network, which is currently under construction, to detect signals from a nuclear explosion critically depends on the background created by other sources. Therefore, the global distribution of these isotopes based on emissions and transport patterns needs to be understood. A significant xenon background exists in the reactor regions of North America, Europe and Asia. An emission inventory of the four relevant xenon isotopes has recently been created, which specifies source terms for each power plant. As the major emitters of xenon isotopes worldwide, a few medical radioisotope production facilities have been recently identified, in particular the facilities in Chalk River (Canada), Fleurus (Belgium), Pelindaba (South Africa) and Petten (Netherlands). Emissions from these sites are expected to exceed those of the other sources by orders of magnitude. In this study, emphasis is put on 133Xe, which is the most prevalent xenon isotope. First, based on the emissions known, the resulting 133Xe concentration levels at all noble gas stations of the final CTBT verification network were calculated and found to be consistent with observations. Second, it turned out that emissions from the radioisotope facilities can explain a number of observed peaks, meaning that atmospheric transport modelling is an important tool for the categorization of measurements. Third, it became evident that Nuclear Power Plant emissions are more difficult to treat in the models, since their temporal variation is high and not generally reported. Fourth, there are indications that the assumed annual emissions may be underestimated by factors of two to ten, while the general emission patterns seem to be well understood. Finally, it became evident that 133Xe sources mainly influence the sensitivity of the

  6. Central Issue Facility at Fort Benning and Related Army Policies

    Science.gov (United States)

    2010-06-21

    such as boots, T-shirts, underwear, and socks, are not required to be returned. G-4 (Logistics) personnel informed us that the PPG is currently under...of this process slipped to June 1, 2010. This CIF is intended to provide issue... [table excerpt: required issue quantities and turn-in status for items including elbow pads, summer and intermediate glove systems, moisture-wicking T-shirts, and cotton socks]

  7. Samarbejdsformer og Facilities Management

    DEFF Research Database (Denmark)

    Storgaard, Kresten

    Results from a survey of the advantages and disadvantages of different forms of collaboration within Facilities Management are presented.

  8. Japan Hadron Facility

    CERN Document Server

    Hayano, R S

    1999-01-01

    Japan Hadron Facility (JHF) is a high-intensity proton accelerator complex consisting of a 200 MeV linac, a 3 GeV booster and a 50 GeV main ring. Its status and future possibilities of realizing a versatile antiproton facility at JHF are presented.

  9. Continuous real-time water-quality monitoring and regression analysis to compute constituent concentrations and loads in the North Fork Ninnescah River upstream from Cheney Reservoir, south-central Kansas, 1999–2012

    Science.gov (United States)

    Stone, Mandy L.; Graham, Jennifer L.; Gatotho, Jackline W.

    2013-01-01

    Cheney Reservoir, located in south-central Kansas, is the primary water supply for the city of Wichita. The U.S. Geological Survey has operated a continuous real-time water-quality monitoring station since 1998 on the North Fork Ninnescah River, the main source of inflow to Cheney Reservoir. Continuously measured water-quality physical properties include streamflow, specific conductance, pH, water temperature, dissolved oxygen, and turbidity. Discrete water-quality samples were collected during 1999 through 2009 and analyzed for sediment, nutrients, bacteria, and other water-quality constituents. Regression models were developed to establish relations between discretely sampled constituent concentrations and continuously measured physical properties to compute concentrations of those constituents of interest that are not easily measured in real time because of limitations in sensor technology and fiscal constraints. Regression models were published in 2006 that were based on data collected during 1997 through 2003. This report updates those models using discrete and continuous data collected during January 1999 through December 2009. Models also were developed for four new constituents, including additional nutrient species and indicator bacteria. In addition, a conversion factor of 0.68 was established to convert the Yellow Springs Instruments (YSI) model 6026 turbidity sensor measurements to the newer YSI model 6136 sensor at the North Ninnescah River upstream from Cheney Reservoir site. Newly developed models and 14 years of hourly continuously measured data were used to calculate selected constituent concentrations and loads during January 1999 through December 2012. 
The water-quality information in this report is important to the city of Wichita because it allows the concentrations of many potential pollutants of interest to Cheney Reservoir, including nutrients and sediment, to be estimated in real time and characterized over conditions and time scales that
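The surrogate-regression workflow described above can be sketched as follows. The 0.68 conversion factor comes from the report itself; the log-log model form is typical of USGS turbidity-to-concentration surrogate models, but the function names and the fitted data below are illustrative assumptions, not the published Ninnescah models:

```python
import math

# Sensor conversion from the report: multiply YSI model 6026 turbidity
# readings by 0.68 to put them on the YSI model 6136 sensor scale.
def convert_6026_to_6136(turbidity):
    return 0.68 * turbidity

def fit_loglog(x, y):
    """Ordinary least squares in log10 space: log10(y) = b0 + b1*log10(x),
    the usual form of turbidity-to-concentration surrogate models.
    x: surrogate readings (e.g. turbidity), y: sampled concentrations."""
    lx = [math.log10(v) for v in x]
    ly = [math.log10(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b1 = (sum((a - mx) * (b - my) for a, b in zip(lx, ly))
          / sum((a - mx) ** 2 for a in lx))
    b0 = my - b1 * mx
    return b0, b1

def predict(b0, b1, turbidity):
    """Real-time concentration estimate from a turbidity reading."""
    return 10 ** (b0 + b1 * math.log10(turbidity))
```

In practice a retransformation bias-correction factor is applied when converting predictions back from log space; it is omitted here for brevity.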

  10. Reminder: Mandatory Computer Security Course

    CERN Multimedia

    IT Department

    2011-01-01

    Just like any other organization, CERN is permanently under attack – even right now. Consequently it's important to be vigilant about security risks, protecting CERN's reputation - and your work. The availability, integrity and confidentiality of CERN's computing services and the unhindered operation of its accelerators and experiments come down to the combined efforts of the CERN Security Team and you. In order to keep pace with attack trends, the Security Team regularly reminds CERN users about computer security risks, and about the rules for using CERN’s computing facilities. Therefore, a new dedicated basic computer security course has been designed to inform you about the “Do’s” and “Don’ts” when using CERN's computing facilities. This course is mandatory for everyone owning a CERN computer account and must be followed once every three years. Users who have never done the course, or whose course needs to be renewe...

  11. New Mandatory Computer Security Course

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    Just like any other organization, CERN is permanently under attack - even right now. Consequently it's important to be vigilant about security risks, protecting CERN's reputation - and your work. The availability, integrity and confidentiality of CERN's computing services and the unhindered operation of its accelerators and experiments come down to the combined efforts of the CERN Security Team and you. In order to keep pace with attack trends, the Security Team regularly reminds CERN users about computer security risks, and about the rules for using CERN’s computing facilities. Since 2007, newcomers have had to follow a dedicated basic computer security course informing them about the “Do’s” and “Don’ts” when using CERN's computing facilities. This course has recently been redesigned. It is now mandatory for all CERN members (users and staff) owning a CERN computer account and must be followed once every three years. Members who...

  12. Computational Social Creativity.

    Science.gov (United States)

    Saunders, Rob; Bown, Oliver

    2015-01-01

    This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.

  13. DUPIC facility engineering

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. S.; Choi, J. W.; Go, W. I.; Kim, H. D.; Song, K. C.; Jeong, I. H.; Park, H. S.; Im, C. S.; Lee, H. M.; Moon, K. H.; Hong, K. P.; Lee, K. S.; Suh, K. S.; Kim, E. K.; Min, D. K.; Lee, J. C.; Chun, Y. B.; Paik, S. Y.; Lee, E. P.; Yoo, G. S.; Kim, Y. S.; Park, J. C.

    1997-09-01

    In the early stage of the project, a comprehensive survey was conducted to assess the feasibility of using available facilities and of the interfaces between them. It was found that the shielded cell M6 of IMEF could be used for the main process experiments of DUPIC fuel fabrication with regard to space adequacy, material flow, equipment layout, etc. Based on this examination, a suitable adapter system for material transfer around the M6 cell was engineered. Regarding the PIEF facility, where spent PWR fuel assemblies are stored in an annex pool, disassembly devices in the pool were retrofitted, and a spent fuel rod cutting and shipping system to the IMEF was designed and built. For acquisition of casks for radioactive material transport between the facilities, adaptive refurbishment was applied to the available cask (Padirac) based on an extensive analysis of safety requirements. A mockup test facility was newly acquired for remote testing of DUPIC fuel fabrication process equipment prior to installation in the M6 cell of the IMEF facility. (author). 157 refs., 57 tabs., 65 figs.

  14. Unweighted Betweenness Centrality for Critical Fault Detection for Cascading Outage Assessment

    DEFF Research Database (Denmark)

    Petersen, Pauli Fríðheim; Jóhannsson, Hjörtur; Nielsen, Arne Hejde

    2016-01-01

    This paper analyses the possible use of unweighted betweenness centrality instead of weighted betweenness centrality for critical fault detection in the assessment of cascading failures. As unweighted betweenness centrality is significantly faster to compute, the possible use of this will significa...
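A minimal sketch of the comparison, assuming a toy graph rather than the paper's power-system data: in Brandes' algorithm only the shortest-path stage differs between the two variants (Dijkstra for weighted, BFS for unweighted), which is why dropping weights speeds up the computation.

```python
import heapq
from collections import deque

def brandes(adj, weighted=True):
    """Betweenness centrality via Brandes' algorithm.
    adj: node -> {neighbour: edge weight}; with weighted=False every
    edge counts as length 1 (BFS replaces Dijkstra, which is cheaper)."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        pred = {v: [] for v in adj}
        order = []                       # nodes in non-decreasing distance
        if weighted:
            dist = {v: float('inf') for v in adj}; dist[s] = 0.0
            pq, done = [(0.0, s)], set()
            while pq:
                d, v = heapq.heappop(pq)
                if v in done:
                    continue
                done.add(v)
                order.append(v)
                for w, wt in adj[v].items():
                    nd = d + wt
                    if nd < dist[w]:                 # shorter path found
                        dist[w], sigma[w], pred[w] = nd, sigma[v], [v]
                        heapq.heappush(pq, (nd, w))
                    elif nd == dist[w] and w not in done:
                        sigma[w] += sigma[v]         # equally short path
                        pred[w].append(v)
        else:
            dist = {v: -1 for v in adj}; dist[s] = 0
            q = deque([s])
            while q:
                v = q.popleft()
                order.append(v)
                for w in adj[v]:
                    if dist[w] < 0:
                        dist[w] = dist[v] + 1
                        q.append(w)
                    if dist[w] == dist[v] + 1:
                        sigma[w] += sigma[v]
                        pred[w].append(v)
        # Dependency accumulation (identical for both variants).
        delta = {v: 0.0 for v in adj}
        while order:
            w = order.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return {v: c / 2 for v, c in bc.items()}  # undirected: halve
```

On the triangle with weights a-b: 1, b-c: 1, a-c: 3, the weighted variant routes a-c through b (cost 2 < 3) and gives b a score of 1.0, while the unweighted variant sees a direct a-c edge and gives b a score of 0.0; the two variants can rank nodes differently.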

  15. All biology is computational biology

    Science.gov (United States)

    2017-01-01

    Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science. PMID:28278152

  16. A measure of centrality based on modularity matrix

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In this paper, a measure of structural centrality for networks, called modularity centrality, is introduced. This centrality index is based on the eigenvector belonging to the largest-magnitude eigenvalue of the modularity matrix. The measure is illustrated and compared with the standard centrality measures using a classic dataset. The statistical distribution of modularity centrality is investigated by considering large computer-generated graphs and two networks from the real world.
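The construction the abstract describes can be sketched in a few lines of Python, assuming the graph is given as an adjacency list; a real implementation would use a sparse eigensolver rather than dense power iteration:

```python
import math

def modularity_centrality(adj, iters=200):
    """Modularity centrality sketch: power iteration on the modularity
    matrix B = A - k k^T / (2m), returning |leading eigenvector|.
    adj: dict mapping node -> list of neighbours (undirected graph)."""
    nodes = sorted(adj)
    n = len(nodes)
    deg = {v: len(adj[v]) for v in nodes}
    two_m = sum(deg.values())            # 2m = sum of degrees
    B = [[(1.0 if nodes[j] in adj[nodes[i]] else 0.0)
          - deg[nodes[i]] * deg[nodes[j]] / two_m
          for j in range(n)] for i in range(n)]
    # Every row of B sums to zero, so the all-ones vector lies in its
    # null space; start the iteration from a non-uniform vector.
    x = [float(i + 1) for i in range(n)]
    for _ in range(iters):
        y = [sum(B[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(v * v for v in y))
        if norm == 0.0:
            break
        x = [v / norm for v in y]
    # Power iteration converges to the eigenvector of the eigenvalue
    # with the largest magnitude; take |components| as the scores.
    return {nodes[i]: abs(x[i]) for i in range(n)}
```

Taking absolute values keeps the scores meaningful even when the dominant eigenvalue of B is negative, in which case the iterate flips sign at each step while its magnitudes converge.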

  17. National Solar Thermal Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The National Solar Thermal Test Facility (NSTTF) is the only test facility in the United States of its type. This unique facility provides experimental engineering...

  18. Hydrography - Water Pollution Control Facilities

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — A Water Pollution Control Facility is a DEP primary facility type related to the Water Pollution Control Program. The sub-facility types related to Water Pollution...

  19. Skilled nursing or rehabilitation facilities

    Science.gov (United States)

    ... page: //medlineplus.gov/ency/patientinstructions/000435.htm Skilled nursing or rehabilitation facilities To use the sharing features ... facility. Who Needs to go to a Skilled Nursing or Rehabilitation Facility? Your health care provider may ...

  20. Aviation Flight Support Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This facility consists of a 75' x 200' hangar with two adjacent helicopter pads located at Felker Army Airfield on Fort Eustis. A staff of Government and contractor...

  1. Pit Fragment Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This facility contains two large (20 foot high by 20 foot diameter) double walled steel tubs in which experimental munitions are exploded while covered with sawdust....

  2. GPS Satellite Simulation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The GPS satellite simulation facility consists of a GPS satellite simulator controlled by either a Silicon Graphics Origin 2000 or PC depending upon unit under test...

  3. Advanced Microscopy Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Provides a facility for high-resolution studies of complex biomolecular systems. The goal is an understanding of how to engineer biomolecules for various...

  4. Coastal Harbors Modeling Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Coastal Harbors Modeling Facility is used to aid in the planning of harbor development and in the design and layout of breakwaters, absorbers, etc.. The goal is...

  5. Corrosion Testing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Corrosion Testing Facility is part of the Army Corrosion Office (ACO). It is a fully functional atmospheric exposure site, called the Corrosion Instrumented Test...

  6. Frost Effects Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Full-scale study in controlled conditions. The Frost Effects Research Facility (FERF) is the largest refrigerated warehouse in the United States that can be used for a...

  7. Waste Water Facilities

    Data.gov (United States)

    Vermont Center for Geographic Information — This dataset contains the locations of municipal and industrial direct discharge wastewater treatment facilities throughout the state of Vermont. Spatial data is not...

  8. FDA Certified Mammography Facilities

    Science.gov (United States)

    ... Mammography Quality Standards Act and Program Consumer Information (MQSA) ... This list of FDA Certified Mammography Facilities is updated weekly. If you click on ...

  9. Treated Effluent Disposal Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Treated non-hazardous and non-radioactive liquid wastes are collected and then disposed of through the systems at the Treated Effluent Disposal Facility (TEDF). More...

  10. Powder Metallurgy Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The facility is uniquely equipped as the only laboratory within DA to conduct PM processing of refractory metals and alloys as well as the processing of a wide range...

  11. Mark 1 Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Mark I Test Facility is a state-of-the-art space environment simulation test chamber for full-scale space systems testing. A $1.5M upgrade in fiscal year...

  12. The Birmingham Irradiation Facility

    CERN Document Server

    Dervan, P; Hodgson, P; Marin-Reyes, H; Wilson, J

    2013-01-01

    At the end of 2012 the proton irradiation facility at the CERN PS [1] will shut down for two years. With this in mind, we have been developing a new ATLAS scanning facility at the University of Birmingham Medical Physics cyclotron. With proton beams of energy approximately 30 MeV, fluences corresponding to those of the upgraded Large Hadron Collider (HL-LHC) can be reached conveniently. The facility can be used to irradiate silicon sensors, optical components and mechanical structures (e.g. carbon fibre sandwiches) for the LHC upgrade programme. Irradiations of silicon sensors can be carried out in a temperature controlled cold box that can be scanned through the beam. The facility is described in detail along with the first tests carried out with mini (1 x 1 cm^2) silicon sensors.

  13. Structural Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Provides a wide variety of testing equipment, fixtures and facilities to perform both unique aviation component testing as well as common types of materials testing...

  14. Climatic Environmental Test Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — RTTC has an extensive suite of facilities for supporting MIL-STD-810 testing, to include: Temperature/Altitude, Rapid Decompression, Low/High Temperature, Temperature...

  15. Geophysical Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Geophysical Research Facility (GRF) is a 60 ft long × 22 ft wide × 7 ft deep concrete basin at CRREL for fresh or saltwater investigations and can be temperature...

  16. Mass Properties Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This facility is used to acquire accurate weight, 3 axis center of gravity and 3 axis moment of inertia measurements for air launched munitions and armament equipment.

  17. Coastal Inlet Model Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Coastal Inlet Model Facility, as part of the Coastal Inlets Research Program (CIRP), is an idealized inlet dedicated to the study of coastal inlets and equipped...

  18. Skilled Nursing Facility PPS

    Data.gov (United States)

    U.S. Department of Health & Human Services — Section 4432(a) of the Balanced Budget Act (BBA) of 1997 modified how payment is made for Medicare skilled nursing facility (SNF) services. Effective with cost...

  19. Wind Tunnel Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This ARDEC facility consists of subsonic, transonic, and supersonic wind tunnels to acquire aerodynamic data. Full-scale and sub-scale models of munitions are fitted...

  20. Robotics Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This 60 feet x 100 feet structure on the grounds of the Fort Indiantown Gap Pennsylvania National Guard (PNG) Base is a mixed-use facility comprising office space,...

  1. Environmental Test Facility (ETF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Environmental Test Facility (ETF) provides non-isolated shock testing for stand-alone equipment and full size cabinets under MIL-S-901D specifications. The ETF...

  2. Wind Tunnel Testing Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — NASA Ames Research Center is pleased to offer the services of our premier wind tunnel facilities that have a broad range of proven testing capabilities to customers...

  3. Airborne Evaluation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — AFRL's Airborne Evaluation Facility (AEF) utilizes Air Force Aero Club resources to conduct test and evaluation of a variety of equipment and concepts. Twin engine...

  4. IHS Facility Locator

    Data.gov (United States)

    U.S. Department of Health & Human Services — This map can be used to find an Indian Health Service, Tribal or Urban Indian Health Program facility. This map can be used to: Zoom in to a general location to...

  5. The Birmingham Irradiation Facility

    Science.gov (United States)

    Dervan, P.; French, R.; Hodgson, P.; Marin-Reyes, H.; Wilson, J.

    2013-12-01

    At the end of 2012 the proton irradiation facility at the CERN PS [1] will shut down for two years. With this in mind, we have been developing a new ATLAS scanning facility at the University of Birmingham Medical Physics cyclotron. With proton beams of energy approximately 30 MeV, fluences corresponding to those of the upgraded Large Hadron Collider (HL-LHC) can be reached conveniently. The facility can be used to irradiate silicon sensors, optical components and mechanical structures (e.g. carbon fibre sandwiches) for the LHC upgrade programme. Irradiations of silicon sensors can be carried out in a temperature controlled cold box that can be scanned through the beam. The facility is described in detail along with the first tests carried out with mini (1×1 cm2) silicon sensors.
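For a rough sense of scale, the fluence delivered by such a beam follows directly from the beam current and spot size; a back-of-the-envelope sketch with hypothetical parameters (illustrative values, not the Birmingham facility's actual beam settings):

```python
E_CHARGE = 1.602e-19  # proton charge, C

def fluence(beam_current_A, spot_area_cm2, seconds):
    """Fluence (protons/cm^2) for a beam spread uniformly over a spot.

    Illustrative only: assumes the full beam current passes through
    the spot area; real irradiations involve scanning and beam profiles.
    """
    protons_per_s = beam_current_A / E_CHARGE
    return protons_per_s * seconds / spot_area_cm2

# e.g. 1 uA over a 1 cm^2 spot delivers ~6.2e12 protons/cm^2 per second,
# so fluences of order 1e15 /cm^2 accumulate in minutes rather than days
rate = fluence(1e-6, 1.0, 1.0)
```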

  6. Air Data Calibration Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This facility is for low altitude subsonic altimeter system calibrations of air vehicles. Mission is a direct support of the AFFTC mission. Postflight data merge is...

  7. Pittsburgh City Facilities

    Data.gov (United States)

Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Pittsburgh City Facilities. Includes: City Administrative Buildings, Police Stations, Fire Stations, EMS Stations, DPW Sites, Senior Centers, Recreation Centers,...

  8. Pittsburgh City Facilities

    Data.gov (United States)

Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Pittsburgh City Facilities. Includes: City Administrative Buildings, Police Stations, Fire Stations, EMS Stations, DPW Sites, Senior Centers, Recreation Centers, Pool...

  9. Water Tunnel Facility

    Data.gov (United States)

    Federal Laboratory Consortium — NETL’s High-Pressure Water Tunnel Facility in Pittsburgh, PA, re-creates the conditions found 3,000 meters beneath the ocean’s surface, allowing scientists to study...

  10. Combustion Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — For more than 30 years The Combustion Research Facility (CRF) has served as a national and international leader in combustion science and technology. The need for a...

  11. Dialysis Facility Compare Data

    Data.gov (United States)

    U.S. Department of Health & Human Services — These are the official datasets used on the Medicare.gov Dialysis Facility Compare Website provided by the Centers for Medicare and Medicaid Services. These data...

  12. Urban Test Facilities

    Data.gov (United States)

Federal Laboratory Consortium — RTTC has access to various facilities for use in urban testing applications, including an agreement with the Hazardous Devices School (HDS): a restricted-access Urban...

  13. Concrete Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This is a 20,000-sq ft laboratory that supports research on all aspects of concrete and materials technology. The staff of this facility offer wide-ranging expertise...

  14. A cryogenic test facility

    Science.gov (United States)

    Veenendaal, Ian

    The next generation, space-borne instruments for far infrared spectroscopy will utilize large diameter, cryogenically cooled telescopes in order to achieve unprecedented sensitivities. Low background, ground-based cryogenic facilities are required for the cryogenic testing of materials, components and subsystems. The Test Facility Cryostat (TFC) at the University of Lethbridge is a large volume, closed cycle, 4K cryogenic facility, developed for this purpose. This thesis discusses the design and performance of the facility and associated external instrumentation. An apparatus for measuring the thermal properties of materials is presented, and measurements of the thermal expansion and conductivity of carbon fibre reinforced polymers (CFRPs) at cryogenic temperatures are reported. Finally, I discuss the progress towards the design and fabrication of a demonstrator cryogenic, far infrared Fourier transform spectrometer.

  15. Airborne & Field Sensors Facilities

    Data.gov (United States)

Federal Laboratory Consortium — RTTC facilities include an 800' x 60' paved UAV operational area, clear approach/departure zone, concrete pads furnished with 208VAC, 3 phase, 200 amp power, 20,000 sq...

  16. Electra Laser Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: The Electra Laser Facility is used to develop the science and technology needed to develop a reliable, efficient, high-energy, repetitively pulsed krypton...

  17. Field Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Field Research Facility (FRF) located in Duck, N.C. was established in 1977 to support the U.S. Army Corps of Engineers' coastal engineering mission. The FRF is...

  18. Frost Effects Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Full-scale study in controlled conditions The Frost Effects Research Facility (FERF) is the largest refrigerated warehouse in the United States that can be used for...

  19. Ballistic Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Ballistic Test Facility is comprised of two outdoor and one indoor test ranges, which are all instrumented for data acquisition and analysis. Full-size aircraft...

  20. Advanced Microanalysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Advanced Microanalysis Facility fully integrates capabilities for chemical and structural analysis of electronic materials and devices for the U.S. Army and DoD....

  1. Hypersonic Tunnel Facility (HTF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Hypersonic Tunnel Facility (HTF) is a blow-down, non-vitiated (clean air) free-jet wind tunnel capable of testing large-scale, propulsion systems at Mach 5, 6,...

  2. Computational physics: a perspective.

    Science.gov (United States)

    Stoneham, A M

    2002-06-15

    Computing comprises three distinct strands: hardware, software and the ways they are used in real or imagined worlds. Its use in research is more than writing or running code. Having something significant to compute and deploying judgement in what is attempted and achieved are especially challenging. In science or engineering, one must define a central problem in computable form, run such software as is appropriate and, last but by no means least, convince others that the results are both valid and useful. These several strands are highly interdependent. A major scientific development can transform disparate aspects of information and computer technologies. Computers affect the way we do science, as well as changing our personal worlds. Access to information is being transformed, with consequences beyond research or even science. Creativity in research is usually considered uniquely human, with inspiration a central factor. Scientific and technological needs are major forces in innovation, and these include hardware and software opportunities. One can try to define the scientific needs for established technologies (atomic energy, the early semiconductor industry), for rapidly developing technologies (advanced materials, microelectronics) and for emerging technologies (nanotechnology, novel information technologies). Did these needs define new computing, or was science diverted into applications of then-available codes? Regarding credibility, why is it that engineers accept computer realizations when designing engineered structures, whereas predictive modelling of materials has yet to achieve industrial confidence outside very special cases? The tensions between computing and traditional science are complex, unpredictable and potentially powerful.

  3. Fast network centrality analysis using GPUs

    Directory of Open Access Journals (Sweden)

    Shi Zhiao

    2011-05-01

Full Text Available Abstract Background With the exploding volume of data generated by continuously evolving high-throughput technologies, biological network analysis problems are growing larger in scale and craving more computational power. General Purpose computation on Graphics Processing Units (GPGPU) provides a cost-effective technology for the study of large-scale biological networks. Designing algorithms that maximize data parallelism is the key to leveraging the power of GPUs. Results We proposed an efficient data-parallel formulation of the All-Pairs Shortest Path problem, which is the key component of shortest-path-based centrality computation. A betweenness centrality algorithm built upon this formulation was developed and benchmarked against the most recent GPU-based algorithm. Speedups of 11% to 19% were observed on various simulated scale-free networks. We further designed three algorithms based on this core component to compute closeness centrality, eccentricity centrality and stress centrality. To make all these algorithms available to the research community, we developed a software package, gpu-fan (GPU-based Fast Analysis of Networks), for CUDA-enabled GPUs. Speedups of 10-50× compared with CPU implementations were observed for simulated scale-free networks and real-world biological networks. Conclusions gpu-fan provides a significant performance improvement for centrality computation in large-scale networks. Source code is available under the GNU Public License (GPL) at http://bioinfo.vanderbilt.edu/gpu-fan/.
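The shortest-path core described here can be stated compactly. Below is a minimal serial sketch (not the gpu-fan code itself) of Brandes' betweenness centrality algorithm, the standard CPU baseline that GPU implementations are measured against:

```python
from collections import deque

def betweenness_centrality(adj):
    """Brandes' algorithm for unweighted betweenness centrality.

    adj: dict mapping each node to an iterable of neighbours.
    One BFS plus one dependency-accumulation pass per source vertex;
    GPU packages parallelise exactly these per-source passes.
    """
    bc = {v: 0.0 for v in adj}
    for s in adj:
        dist = {v: -1 for v in adj}
        sigma = {v: 0 for v in adj}        # shortest-path counts
        preds = {v: [] for v in adj}       # predecessors on shortest paths
        dist[s], sigma[s] = 0, 1
        order, q = [], deque([s])
        while q:                           # BFS from s
            v = q.popleft()
            order.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = {v: 0.0 for v in adj}
        for w in reversed(order):          # accumulate dependencies
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc                              # unnormalised, ordered pairs

# in the path 1-2-3, only node 2 lies on a shortest path between others
bc = betweenness_centrality({1: [2], 2: [1, 3], 3: [2]})
```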

  4. Calibration Facilities for NIF

    Energy Technology Data Exchange (ETDEWEB)

    Perry, T.S.

    2000-06-15

    The calibration facilities will be dynamic and will change to meet the needs of experiments. Small sources, such as the Manson Source should be available to everyone at any time. Carrying out experiments at Omega is providing ample opportunity for practice in pre-shot preparation. Hopefully, the needs that are demonstrated in these experiments will assure the development of (or keep in service) facilities at each of the laboratories that will be essential for in-house preparation for experiments at NIF.

  5. Controlling centrality in complex networks

    CERN Document Server

    Nicosia, Vincenzo; Romance, Miguel; Russo, Giovanni; Latora, Vito

    2011-01-01

Spectral centrality measures make it possible to identify influential individuals in social groups, to rank Web pages by popularity, and even to determine the impact of scientific research. The centrality score of a node within a network depends crucially on the entire pattern of connections, so the usual approach is to compute the node centralities once the network structure is assigned. Here we study the inverse problem: how to modify the centrality scores of the nodes by acting on the structure of a given network. We prove that there exist particular subsets of nodes, called controlling sets, which can assign any prescribed set of centrality values to all the nodes of a graph by cooperatively tuning the weights of their out-going links. We show that many large networks from the real world have surprisingly small controlling sets, containing even less than 5-10% of the nodes. These results suggest that rankings obtained from spectral centrality measures have to be considered with ex...
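The spectral scores this abstract refers to can be illustrated in a few lines. A hedged sketch (the function and the toy star graph are mine, not the authors' code) of eigenvector centrality computed by power iteration on a weighted digraph:

```python
def eigenvector_centrality(adj, iters=100):
    """Power iteration for eigenvector centrality.

    adj: node -> {out-neighbour: edge weight}. Score flows along
    out-going links, so retuning a node's out-going weights (as a
    controlling set does) changes the downstream scores.
    """
    x = {v: 1.0 for v in adj}
    for _ in range(iters):
        # iterate (A + I): the +x[v] shift keeps power iteration
        # converging even on bipartite graphs, same eigenvector
        nxt = dict(x)
        for u, nbrs in adj.items():
            for v, w in nbrs.items():
                nxt[v] += w * x[u]
        norm = max(nxt.values())
        x = {v: c / norm for v, c in nxt.items()}
    return x

# undirected star: hub "c" dominates, leaves share ~1/sqrt(3)
star = {"c": {"a": 1, "b": 1, "d": 1},
        "a": {"c": 1}, "b": {"c": 1}, "d": {"c": 1}}
scores = eigenvector_centrality(star)
```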

  6. Distributed Energy Resources Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — NREL's Distributed Energy Resources Test Facility (DERTF) is a working laboratory for interconnection and systems integration testing. This state-of-the-art facility...

  7. Mound facility physical characterization

    Energy Technology Data Exchange (ETDEWEB)

    Tonne, W.R.; Alexander, B.M.; Cage, M.R.; Hase, E.H.; Schmidt, M.J.; Schneider, J.E.; Slusher, W.; Todd, J.E.

    1993-12-01

The purpose of this report is to provide a baseline physical characterization of Mound's facilities as of September 1993. The baseline characterizations are to be used in the development of a long-term future use strategy for the Mound site. This document describes the current missions and alternative future use scenarios for each building. Current mission descriptions cover facility capabilities, physical resources required to support operations, current safety envelope and current status of facilities. Future use scenarios identify potential alternative future uses, facility modifications required for likely use, facility modifications of other uses, changes to safety envelope for the likely use, cleanup criteria for each future use scenario, and disposition of surplus equipment. This Introductory Chapter includes an Executive Summary that contains narrative on the Functional Unit Material Condition, Current Facility Status, Listing of Buildings, Space Plans, Summary of Maintenance Program and Repair Backlog, Environmental Restoration, and Decontamination and Decommissioning Programs. Under Section B, Site Description, is a brief listing of the Site PS Development, as well as Current Utility Sources. Section C contains Site Assumptions. A Maintenance Program Overview, as well as Current Deficiencies, is contained within the Maintenance Program Chapter.

  8. Central differences, Euler numbers and symbolic methods

    CERN Document Server

    Dowker, J S

    2013-01-01

    I relate some coefficients encountered when computing the functional determinants on spheres to the central differentials of nothing. In doing this I use some historic works, in particular transcribing the elegant symbolic formalism of Jeffery (1861) into central difference form which has computational advantages for Euler numbers, as discovered by Shovelton (1915). I derive sum rules for these, and for the central differentials, the proof of which involves an interesting expression for powers of sech x as multiple derivatives. I present a more general, symbolic treatment of central difference calculus which allows known, and unknown, things to be obtained in an elegant and compact fashion gaining, at no cost, the expansion of the powers of the inverse sinh, a basic central function. Systematic use is made of the operator 2 asinh(D/2). Umbral calculus is employed to compress the operator formalism. For example the orthogonality/completeness of the factorial numbers, of the first and second kinds, translates, ...
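The symbolic calculus invoked here rests on a standard operator identity; in common notation (mine, not necessarily the paper's), with D = d/dx and the central difference δf(x) = f(x + 1/2) − f(x − 1/2):

```latex
\delta = e^{D/2} - e^{-D/2} = 2\sinh\!\left(\tfrac{D}{2}\right),
\qquad\text{so formally}\qquad
D = 2\,\operatorname{asinh}\!\left(\tfrac{\delta}{2}\right)
  = \delta - \frac{\delta^{3}}{24} + \frac{3\,\delta^{5}}{640} - \cdots
```

Inverting this relation is what allows derivative expansions to be traded for central-difference expansions and back again.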

  9. XML Based Scientific Data Management Facility

    Science.gov (United States)

    Mehrotra, P.; Zubair, M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

The World Wide Web Consortium has developed the Extensible Markup Language (XML) to support the building of better information management infrastructures. The scientific computing community, realizing the benefits of XML, has designed markup languages for scientific data. In this paper, we propose an XML-based scientific data management facility, XDMF. The project is motivated by the fact that even though a lot of scientific data is being generated, it is not being shared because of a lack of standards and infrastructure support for discovering and transforming the data. The proposed data management facility can be used to discover the scientific data itself and the transformation functions, and also to apply the required transformations. We have built a prototype system of the proposed data management facility that can work on different platforms. We have implemented the system using Java and the Apache XSLT engine Xalan. To support remote data and transformation functions, we had to extend the XSLT specification and the Xalan package.
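The discovery side of such a facility reduces to querying XML metadata records. A minimal stdlib sketch with a hypothetical catalogue schema (the element names are illustrative, not the actual XDMF format):

```python
import xml.etree.ElementTree as ET

# Hypothetical metadata catalogue; dataset ids and fields are made up.
CATALOG = """\
<catalog>
  <dataset id="cfd-001">
    <title>Airfoil pressure field</title>
    <format>HDF5</format>
    <transform href="grid-to-csv.xsl"/>
  </dataset>
  <dataset id="cfd-002">
    <title>Wake turbulence spectra</title>
    <format>netCDF</format>
  </dataset>
</catalog>
"""

def discover(xml_text, fmt):
    """Return the ids of datasets stored in the given format."""
    root = ET.fromstring(xml_text)
    return [d.get("id") for d in root.findall("dataset")
            if d.findtext("format") == fmt]

print(discover(CATALOG, "HDF5"))  # -> ['cfd-001']
```

In the real system an XSLT engine (Xalan, in the paper) would then apply the transform referenced by each record rather than merely listing matches.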

  10. Uncapacitated facility location problems: contributions

    Directory of Open Access Journals (Sweden)

    Galvão Roberto Diéguez

    2004-01-01

Full Text Available The objective of the present paper is to review my personal contributions in the field of uncapacitated facility location problems. These contributions took place throughout my academic career, from the time I was a Ph.D. student at Imperial College to the present day. They cover approximately 30 years, from 1973 to 2003; they address: algorithms developed for the p-median problem and for a general formulation of uncapacitated location problems; the study of dynamic location models; covering and hierarchical location problems; queuing-based probabilistic location models. The contributions encompass theoretical developments, computational algorithms and practical applications. All work took place in an academic environment, with the invaluable collaboration of colleagues (both in Brazil and abroad) and research students at COPPE. Each section in the paper is dedicated to a topic that involves a personal contribution. Every one of them is placed within the context of the existing literature.
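As a concrete illustration of the problem class (a textbook heuristic, not one of the specific algorithms surveyed in the paper), a greedy "add" heuristic for the uncapacitated facility location problem can be sketched in a few lines:

```python
def greedy_ufl(open_cost, assign_cost):
    """Greedy add heuristic for uncapacitated facility location.

    open_cost[i]: fixed cost of opening facility i
    assign_cost[i][j]: cost of serving client j from facility i
    Opens the single best facility, then keeps opening whichever
    closed facility yields the largest net saving, until none helps.
    """
    m, n = len(open_cost), len(assign_cost[0])
    first = min(range(m), key=lambda i: open_cost[i] + sum(assign_cost[i]))
    opened = {first}
    best = list(assign_cost[first])   # cheapest assignment per client
    while True:
        candidates = []
        for i in range(m):
            if i in opened:
                continue
            saving = sum(max(best[j] - assign_cost[i][j], 0.0)
                         for j in range(n))
            candidates.append((open_cost[i] - saving, i))
        if not candidates:
            break
        delta, i = min(candidates)
        if delta >= 0:                # no facility reduces total cost
            break
        opened.add(i)
        best = [min(best[j], assign_cost[i][j]) for j in range(n)]
    total = sum(open_cost[i] for i in opened) + sum(best)
    return opened, total

# two clients, each cheap to serve only from its own facility
opened, total = greedy_ufl([10, 10], [[1, 100], [100, 1]])
```

Exact methods (branch-and-bound, Lagrangian relaxation) of the kind the paper reviews start from the same cost structure but certify optimality.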

  11. Feasibility study: PASS computer environment

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-03-10

    The Policy Analysis Screening System (PASS) is a computerized information-retrieval system designed to provide analysts in the Department of Energy, Assistant Secretary for Environment, Office of Technology Impacts (DOE-ASEV-OTI) with automated access to articles, computer simulation outputs, energy-environmental statistics, and graphics. Although it is essential that PASS respond quickly to user queries, problems at the computer facility where it was originally installed seriously slowed PASS's operations. Users attempting to access the computer by telephone repeatedly encountered busy signals and, once logged on, experienced unsatisfactory delays in response to commands. Many of the problems stemmed from the system's facility manager having brought another large user onto the system shortly after PASS was implemented, thereby significantly oversubscribing the facility. Although in March 1980 Energy Information Administration (EIA) transferred operations to its own computer facility, OTI has expressed concern that any improvement in computer access time and response time may not be sufficient or permanent. Consequently, a study was undertaken to assess the current status of the system, to identify alternative computer environments, and to evaluate the feasibility of each alternative in terms of its cost and its ability to alleviate current problems.

  12. Conceptualization and design of a variable-gravity research facility

    Science.gov (United States)

    1987-01-01

    The goal is to provide facilities for the study of the effects of variable-gravity levels in reducing the physiological stresses upon the humans of long-term stay time in zero-g. The designs studied include: twin-tethered two module system with a central despun module with docking port and winch gear; and rigid arm tube facility using shuttle external tanks. Topics examined included: despun central capsule configuration, docking clearances, EVA requirements, crew selection, crew scheduling, food supply and preparation, waste handling, leisure use, biomedical issues, and psycho-social issues.

  13. Automating an EXAFS facility: hardware and software considerations

    Energy Technology Data Exchange (ETDEWEB)

    Georgopoulos, P; Sayers, D E; Bunker, B; Elam, T; Grote, W A

    1981-01-01

    The basic design considerations for computer hardware and software, applicable not only to laboratory EXAFS facilities, but also to synchrotron installations, are reviewed. Uniformity and standardization of both hardware configurations and program packages for data collection and analysis are heavily emphasized. Specific recommendations are made with respect to choice of computers, peripherals, and interfaces, and guidelines for the development of software packages are set forth. A description of two working computer-interfaced EXAFS facilities is presented which can serve as prototypes for future developments. 3 figures.

  14. HVAC optimization as facility requirements change with corporate restructuring

    Energy Technology Data Exchange (ETDEWEB)

    Rodak, R.R.; Sankey, M.S.

    1997-06-01

The hyper-competitive, dynamic 1990s forced many corporations to "Right-Size," relocating resources and equipment -- even consolidating. These changes led to utility reduction if HVAC optimization was thoroughly addressed and energy conservation opportunities were identified and properly designed. This is particularly true when the facility's heating and cooling systems are matched to the load changes attributed to the reduction of staff and computers. Computers have been downsized and processing power per unit of energy input has increased; thus, the need for large mainframe computer centers, and their associated high-intensity energy usage, has been decreased or eliminated. Cooling, therefore, also has been reduced.

  15. The grand challenge of managing the petascale facility.

    Energy Technology Data Exchange (ETDEWEB)

    Aiken, R. J.; Mathematics and Computer Science

    2007-02-28

    This report is the result of a study of networks and how they may need to evolve to support petascale leadership computing and science. As Dr. Ray Orbach, director of the Department of Energy's Office of Science, says in the spring 2006 issue of SciDAC Review, 'One remarkable example of growth in unexpected directions has been in high-end computation'. In the same article Dr. Michael Strayer states, 'Moore's law suggests that before the end of the next cycle of SciDAC, we shall see petaflop computers'. Given the Office of Science's strong leadership and support for petascale computing and facilities, we should expect to see petaflop computers in operation in support of science before the end of the decade, and DOE/SC Advanced Scientific Computing Research programs are focused on making this a reality. This study took its lead from this strong focus on petascale computing and the networks required to support such facilities, but it grew to include almost all aspects of the DOE/SC petascale computational and experimental science facilities, all of which will face daunting challenges in managing and analyzing the voluminous amounts of data expected. In addition, trends indicate the increased coupling of unique experimental facilities with computational facilities, along with the integration of multidisciplinary datasets and high-end computing with data-intensive computing; and we can expect these trends to continue at the petascale level and beyond. Coupled with recent technology trends, they clearly indicate the need for including capability petascale storage, networks, and experiments, as well as collaboration tools and programming environments, as integral components of the Office of Science's petascale capability metafacility. The objective of this report is to recommend a new cross-cutting program to support the management of petascale science and infrastructure. The appendices of the report document current and projected

  16. Facility Environmental Vulnerability Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Van Hoesen, S.D.

    2001-07-09

    From mid-April through the end of June 2001, a Facility Environmental Vulnerability Assessment (FEVA) was performed at Oak Ridge National Laboratory (ORNL). The primary goal of this FEVA was to establish an environmental vulnerability baseline at ORNL that could be used to support the Laboratory planning process and place environmental vulnerabilities in perspective. The information developed during the FEVA was intended to provide the basis for management to initiate immediate, near-term, and long-term actions to respond to the identified vulnerabilities. It was expected that further evaluation of the vulnerabilities identified during the FEVA could be carried out to support a more quantitative characterization of the sources, evaluation of contaminant pathways, and definition of risks. The FEVA was modeled after the Battelle-supported response to the problems identified at the High Flux Beam Reactor at Brookhaven National Laboratory. This FEVA report satisfies Corrective Action 3A1 contained in the Corrective Action Plan in Response to Independent Review of the High Flux Isotope Reactor Tritium Leak at the Oak Ridge National Laboratory, submitted to the Department of Energy (DOE) ORNL Site Office Manager on April 16, 2001. This assessment successfully achieved its primary goal as defined by Laboratory management. The assessment team was able to develop information about sources and pathway analyses although the following factors impacted the team's ability to provide additional quantitative information: the complexity and scope of the facilities, infrastructure, and programs; the significantly degraded physical condition of the facilities and infrastructure; the large number of known environmental vulnerabilities; the scope of legacy contamination issues [not currently addressed in the Environmental Management (EM) Program]; the lack of facility process and environmental pathway analysis performed by the accountable line management or facility owner; and

  17. BLT-MS (Breach, Leach, and Transport -- Multiple Species) data input guide. A computer model for simulating release of contaminants from a subsurface low-level waste disposal facility

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.M.; Kinsey, R.R.; Aronson, A.; Divadeenam, M. [Brookhaven National Lab., Upton, NY (United States); MacKinnon, R.J. [Brookhaven National Lab., Upton, NY (United States)]|[Ecodynamics Research Associates, Inc., Albuquerque, NM (United States)

    1996-11-01

The BLT-MS computer code has been developed, implemented, and tested. BLT-MS is a two-dimensional finite element computer code capable of simulating the time evolution of concentration resulting from the time-dependent release and transport of aqueous phase species in a subsurface soil system. BLT-MS contains models to simulate the processes (water flow, container degradation, waste form performance, transport, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is simulated through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, or solubility. Radioactive production and decay in the waste form are simulated. Transport considers the processes of advection, dispersion, diffusion, radioactive production and decay, reversible linear sorption, and sources (waste form releases). To improve the usefulness of BLT-MS, a preprocessor, BLTMSIN, which assists in the creation of input files, and a post-processor, BLTPLOT, which provides a visual display of the data, have been developed. This document reviews the models implemented in BLT-MS and serves as a guide to creating input files for BLT-MS.

  18. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  19. Effective strategies for development of thermal heavy oil field facilities

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Ken; Lehnert-Thiel, Gunter [IMV Projects (Canada)

    2011-07-01

    In thermal heavy oil, a significant part of the capital has to be invested in field facilities and therefore strategies have to be implemented to optimize these costs. Field facilities consist of pipelines, earthworks and production pads whose purpose is to connect an oilsands reservoir to a central processing facility. This paper, presented by IMV Projects, a leading company in the thermal heavy oil field, highlights strategies to manage field facility lifecycle cost. Upfront planning should be done and the development of field facilities should be thought of as a long term infrastructure program rather than a stand-alone project. In addition, templates should be developed to save money and repeatability should be implemented to obtain a better prediction of the program's costs. The strategies presented herein allow major savings over the program's life by implementing an improved schedule and allowing refinements all along the program's course.

  20. Comprehensive facilities plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-09-01

The Ernest Orlando Lawrence Berkeley National Laboratory's Comprehensive Facilities Plan (CFP) document provides analysis and policy guidance for the effective use and orderly future development of land and capital assets at the Berkeley Lab site. The CFP directly supports Berkeley Lab's role as a multiprogram national laboratory operated by the University of California (UC) for the Department of Energy (DOE). The CFP is revised annually on Berkeley Lab's Facilities Planning Website. Major revisions are consistent with DOE policy and review guidance. Facilities planning is motivated by the need to develop facilities for DOE programmatic needs; to maintain, replace and rehabilitate existing obsolete facilities; to identify sites for anticipated programmatic growth; and to establish a planning framework in recognition of site amenities and the surrounding community. The CFP presents a concise expression of the policy for the future physical development of the Laboratory, based upon anticipated operational needs of research programs and the environmental setting. It is a product of the ongoing planning processes and is a dynamic information source.

  1. Fostering Computational Thinking

    CERN Document Server

    Caballero, Marcos D; Schatz, Michael F

    2011-01-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term 1357 students in this course solved a suite of fourteen computational modeling homework questions delivered using an online commercial course management system. Their proficiency with computational modeling was evaluated in a proctored environment using a novel central force problem. The majority of students (60.4%) successfully completed the evaluation. Analysis of erroneous student-submitted programs indicated that a small set of student errors explained why most programs failed. We discuss the design and implementation of the computational modeling homework and evaluation, the results from the evaluation and the implications for instruction in computational modeling in introductory STEM courses.
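    The kind of computational model described above can be sketched in a few lines. The following is a hypothetical example in plain Python (the course used VPython, and this is not its actual evaluation problem): Euler-Cromer integration of motion under an attractive inverse-square central force.

```python
# Hypothetical sketch of a central-force computational model (plain Python,
# not the course's VPython assignment). Units are chosen so GM = 1.
def simulate_orbit(pos, vel, dt=1e-4, steps=100_000):
    x, y = pos
    vx, vy = vel
    for _ in range(steps):
        r = (x * x + y * y) ** 0.5
        ax, ay = -x / r**3, -y / r**3   # a = -GM * r_vec / r^3, with GM = 1
        vx += ax * dt                   # Euler-Cromer: update velocity first...
        vy += ay * dt
        x += vx * dt                    # ...then position, using the new velocity
        y += vy * dt
    return (x, y), (vx, vy)

# Circular test orbit: r = 1 with speed v = 1, so total energy E = v^2/2 - 1/r = -0.5.
p, v = simulate_orbit((1.0, 0.0), (0.0, 1.0))
r = (p[0] ** 2 + p[1] ** 2) ** 0.5
energy = 0.5 * (v[0] ** 2 + v[1] ** 2) - 1.0 / r
```

    Because Euler-Cromer is symplectic, the energy of the circular test orbit stays close to its exact value of -0.5 over many steps, which makes it a useful correctness check for student programs.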

  2. Computational invariant theory

    CERN Document Server

    Derksen, Harm

    2015-01-01

    This book is about the computational aspects of invariant theory. Of central interest is the question of how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...
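    The averaging idea behind computing invariants of a finite group can be illustrated with a toy sketch (not an example from the book): the Reynolds operator projects a polynomial onto the invariant ring by averaging it over the group. Here a group of order two acts on polynomials in x and y by a simultaneous sign flip, with polynomials stored as dicts from exponent tuples to rational coefficients.

```python
# Toy sketch (not from the book): the Reynolds operator for a finite group
# averages a polynomial over the group action, projecting onto the invariant
# ring. Here C2 = {id, s} acts on Q[x, y] by s: (x, y) -> (-x, -y).
from fractions import Fraction

def act_sign_flip(poly):
    """Apply s: multiply each monomial x^a * y^b by (-1)^(a+b)."""
    return {exp: coeff * (-1) ** sum(exp) for exp, coeff in poly.items()}

def reynolds(poly, group):
    """Average poly over the group elements (each a poly -> poly map)."""
    n = Fraction(len(group))
    out = {}
    for g in group:
        for exp, coeff in g(poly).items():
            out[exp] = out.get(exp, Fraction(0)) + coeff / n
    return {e: c for e, c in out.items() if c != 0}

C2 = [lambda p: p, act_sign_flip]       # identity and the sign flip

x = {(1, 0): Fraction(1)}               # the polynomial x
xy = {(1, 1): Fraction(1)}              # the polynomial x*y
```

    Applied to x (odd total degree) the average vanishes, while x*y is fixed by the action and survives unchanged; for reductive groups the book replaces this finite average with more sophisticated algorithms.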

  3. Expertise concerning the request by the ZWILAG Intermediate Storage Wuerenlingen AG for delivery of the operation licence for the conditioning installation as well as for the burning and melting installation of the central intermediate storage facility for radioactive wastes in Wuerenlingen; Gutachten zum Gesuch der ZWILAG Zwischenlager Wuerenlingen AG um Erteilung der Betriebsbewilligung fuer die Konditionierungsanlage sowie fuer die Verbrennungs- und Schmelzanlage des Zentralen Zwischenlagers fuer radioaktive Abfaelle in Wuerenlingen

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-08-15

    On December 15, 1997, the Central Intermediate Storage Facility for radioactive materials (ZWILAG) delivered a request to the Swiss Federal Council for an operational licence for the conditioning installation and the incineration and melting installation at the Central Intermediate Storage Facility (ZZL). For the whole project, the general licence was granted on June 23, 1993. The construction of the whole installation as well as the operation in the storage halls with reception, hot cell and transhipment station were licensed on August 12, 1999. The Federal Agency for the Safety of Nuclear Installations (HSK) examined the request documentation of ZWILAG from the point of view of nuclear safety and radiation protection. The results of the examination are contained in this report. In general, the examination leads to a positive decision. During the realisation process many project adaptations have brought improvements to the installation as far as radiation protection and the treatment of wastes are concerned. HSK requests from a former licensing process were taken into account by ZWILAG. The present examination contains many remarks and requests that the HSK considers necessary. The most important remarks are issued as special references as far as they do not appear as proposals in the licence obligations. In general, the reference form was chosen when the fulfilment of the request can be guaranteed by HSK within the framework of a release process. Several references inform ZWILAG of problems for which HSK wishes to receive the results of extended inquiries or where HSK will assist in testing or demonstrations. Other references concern requests for plant documentation. Further, calculations of radioactive material release during normal operation are considered necessary.
Based on its examination, HSK concludes that the conditions are fulfilled for the safe operation of the conditioning installation and of the incineration and melting installation as long as

  4. BLT-EC (Breach, Leach and Transport-Equilibrium Chemistry) data input guide. A computer model for simulating release and coupled geochemical transport of contaminants from a subsurface disposal facility

    Energy Technology Data Exchange (ETDEWEB)

    MacKinnon, R.J. [Brookhaven National Lab., Upton, NY (United States)]|[Ecodynamic Research Associates, Inc., Albuquerque, NM (United States); Sullivan, T.M.; Kinsey, R.R. [Brookhaven National Lab., Upton, NY (United States)

    1997-05-01

    The BLT-EC computer code has been developed, implemented, and tested. BLT-EC is a two-dimensional finite element computer code capable of simulating the time-dependent release and reactive transport of aqueous phase species in a subsurface soil system. BLT-EC contains models to simulate the processes (container degradation, waste-form performance, transport, chemical reactions, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is provided through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste-form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, and solubility. Transport considers the processes of advection, dispersion, diffusion, chemical reaction, radioactive production and decay, and sources (waste form releases). Chemical reactions accounted for include complexation, sorption, dissolution-precipitation, oxidation-reduction, and ion exchange. Radioactive production and decay in the waste form is simulated. To improve the usefulness of BLT-EC, a pre-processor, ECIN, which assists in the creation of chemistry input files, and a post-processor, BLTPLOT, which provides a visual display of the data have been developed. BLT-EC also includes an extensive database of thermodynamic data that is also accessible to ECIN. This document reviews the models implemented in BLT-EC and serves as a guide to creating input files and applying BLT-EC.
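    BLT-EC itself is a two-dimensional finite element code; as a greatly simplified illustration of the transport processes it models, the sketch below steps the one-dimensional advection-dispersion-decay equation with explicit finite differences. All names and parameter values here are illustrative, not BLT-EC inputs.

```python
# Greatly simplified sketch (not BLT-EC itself): explicit finite differences
# for 1D advection-dispersion-decay transport,
#     dc/dt = D * d2c/dx2 - v * dc/dx - lam * c,
# with an upwind advection term. All parameter values are illustrative.
def transport_step(c, dx, dt, D, v, lam):
    new = c[:]                               # fixed c = 0 boundaries
    for i in range(1, len(c) - 1):
        disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx**2
        adv = -v * (c[i] - c[i - 1]) / dx    # upwind difference, valid for v > 0
        new[i] = c[i] + dt * (disp + adv - lam * c[i])
    return new

dx, dt, D, v, lam = 0.1, 0.01, 0.01, 0.1, 0.05
assert D * dt / dx**2 <= 0.5                 # explicit-scheme stability limit

c = [0.0] * 50
c[10] = 1.0                                  # initial contaminant pulse
for _ in range(200):
    c = transport_step(c, dx, dt, D, v, lam)
```

    After 200 steps the pulse has advected downstream, spread by dispersion, and lost mass only to radioactive decay (roughly the factor exp(-lam*t)); the explicit scheme requires D*dt/dx^2 <= 1/2, which the assertion checks.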

  5. FACILITIES MANAGEMENT AT CERN

    CERN Multimedia

    2002-01-01

    Recently we have been confronted with difficulties concerning services which are part of a new contract for facilities management. Please see below for some information about this contract. Following competitive tendering and the Finance Committee decision, the contract was awarded to the Swiss firm 'Facilities Management Network (FMN)'. The owners of FMN are two companies, 'M+W Zander' and 'Avireal', both very experienced in this field of facilities management. The contract entered into force on 1st July 2002. CERN has grouped together around 20 different activities into this one contract; these were previously covered by separate contracts. The new contract includes the management and execution of many activities, in particular: Guards and access control; cleaning; operation and maintenance of heating plants, cooling and ventilation equipment for buildings not related to the tunnel or the LHC; plumbing; sanitation; lifts; green areas and roads; waste disposal; and includes a centralised helpdesk for these act...

  6. ESO adaptive optics facility

    Science.gov (United States)

    Arsenault, R.; Madec, P.-Y.; Hubin, N.; Paufique, J.; Stroebele, S.; Soenke, C.; Donaldson, R.; Fedrigo, E.; Oberti, S.; Tordo, S.; Downing, M.; Kiekebusch, M.; Conzelmann, R.; Duchateau, M.; Jost, A.; Hackenberg, W.; Bonaccini Calia, D.; Delabre, B.; Stuik, R.; Biasi, R.; Gallieni, D.; Lazzarini, P.; Lelouarn, M.; Glindeman, A.

    2008-07-01

    ESO has initiated in June 2004 a concept of Adaptive Optics Facility. One unit 8m telescope of the VLT is upgraded with a 1.1 m convex Deformable Secondary Mirror and an optimized instrument park. The AO modules GALACSI and GRAAL will provide GLAO and LTAO corrections for Hawk-I and MUSE. A natural guide star mode is provided for commissioning and maintenance at the telescope. The facility is completed by a Laser Guide Star Facility launching 4 LGS from the telescope centerpiece used for the GLAO and LTAO wavefront sensing. A sophisticated test bench called ASSIST is being designed to allow an extensive testing and characterization phase of the DSM and its AO modules in Europe. Most sub-projects have entered the final design phase and the DSM has entered Manufacturing phase. First light is planned in the course of 2012 and the commissioning phases should be completed by 2013.

  7. Modernizing sports facilities

    Energy Technology Data Exchange (ETDEWEB)

    Dustin, R. [McKenney's, Inc., Atlanta, GA (United States)

    1996-09-01

    Modernization and renovation of sports facilities challenge the design team to balance a number of requirements: spectator and owner expectations, existing building and site conditions, architectural layouts, code and legislation issues, time constraints and budget issues. System alternatives are evaluated and selected based on the relative priorities of these requirements. These priorities are unique to each project. At Alexander Memorial Coliseum, project schedules, construction funds and facility usage became the priorities. The ACC basketball schedule and arrival of the Centennial Olympics dictated the construction schedule. Initiation and success of the project depended on the commitment of the design team to meet coliseum funding levels established three years ago. Analysis of facility usage and system alternative capabilities drove the design team to select a system that met the project requirements and will maximize the benefits to the owner and spectators for many years to come.

  8. Experimental facility for containment sump reliability studies (Generic Task A-43). [PWR; BWR

    Energy Technology Data Exchange (ETDEWEB)

    Durgin, W. W.; Padmanabhan, M.; Janik, C. R.

    1980-12-01

    On July 3, 1979, Sandia National Laboratories (Sandia) contracted the Alden Research Laboratory (ARL) to conduct tests on unresolved safety issues associated with containment sump performance during the recirculation mode (Generic Task A-43). This report describes the test facility constructed and completed under Phase I, Task III of the contract. Sump performance is determined through the observation of vortex formation in the main tank and the measurement of swirl, pressure gradient, and entrained air in the suction pipes. The use of electrically operated valves and a sophisticated data acquisition system, with computer interface, allows the test flow parameters to be set and test data to be taken (with the exception of vortex observations) from a single central office.

  9. Coverage centralities for temporal networks

    CERN Document Server

    Takaguchi, Taro; Yoshida, Yuichi

    2015-01-01

    Structure of real networked systems, such as social relationship, can be modeled as temporal networks in which each edge appears only at the prescribed time. Understanding the structure of temporal networks requires quantifying the importance of a temporal vertex, which is a pair of vertex index and time. In this paper, we define two centrality measures of a temporal vertex by the proportion of (normal) vertex pairs, the quickest routes between which can (or should) use the temporal vertex. The definition is free from parameters and robust against the change in time scale on which we focus. In addition, we can efficiently compute these centrality values for all temporal vertices. Using the two centrality measures, we reveal that distributions of these centrality values of real-world temporal networks are heterogeneous. For various datasets, we also demonstrate that a majority of the highly central temporal vertices are located within a narrow time window around a particular time. In other words, there is a bo...

  10. Coverage centralities for temporal networks*

    Science.gov (United States)

    Takaguchi, Taro; Yano, Yosuke; Yoshida, Yuichi

    2016-02-01

    Structure of real networked systems, such as social relationship, can be modeled as temporal networks in which each edge appears only at the prescribed time. Understanding the structure of temporal networks requires quantifying the importance of a temporal vertex, which is a pair of vertex index and time. In this paper, we define two centrality measures of a temporal vertex based on the fastest temporal paths which use the temporal vertex. The definition is free from parameters and robust against the change in time scale on which we focus. In addition, we can efficiently compute these centrality values for all temporal vertices. Using the two centrality measures, we reveal that distributions of these centrality values of real-world temporal networks are heterogeneous. For various datasets, we also demonstrate that a majority of the highly central temporal vertices are located within a narrow time window around a particular time. In other words, there is a bottleneck time at which most information sent in the temporal network passes through a small number of temporal vertices, which suggests an important role of these temporal vertices in spreading phenomena. Contribution to the Topical Issue "Temporal Network Theory and Applications", edited by Petter Holme.Supplementary material in the form of one pdf file available from the Journal web page at http://dx.doi.org/10.1140/epjb/e2016-60498-7
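    A simplified sketch of the underlying computation (assuming directed, instantaneous contacts; the paper's exact definitions and fast algorithms differ): earliest-arrival "foremost" paths are found in one sweep over time-ordered contacts, and each temporal vertex is then credited for the ordered vertex pairs whose reconstructed foremost path passes through it.

```python
# Simplified coverage-centrality sketch (assumptions: directed contacts,
# instantaneous edge traversal, one foremost path per pair -- the paper's
# definitions and efficient algorithms differ from this toy version).
from collections import defaultdict

def foremost_paths(edges, nodes, source):
    """Earliest-arrival times plus one predecessor temporal vertex per node,
    computed in a single sweep over contacts sorted by time."""
    INF = float("inf")
    arrival = {v: INF for v in nodes}
    pred = {v: None for v in nodes}     # predecessor temporal vertex (u, t)
    arrival[source] = 0
    for u, v, t in sorted(edges, key=lambda e: e[2]):
        if arrival[u] <= t and t < arrival[v]:
            arrival[v] = t
            pred[v] = (u, t)
    return arrival, pred

def coverage(edges, nodes):
    """For each ordered pair (s, d), walk back one foremost path and credit
    every intermediate temporal vertex on it."""
    score = defaultdict(int)
    for s in nodes:
        arrival, pred = foremost_paths(edges, nodes, s)
        for d in nodes:
            if d == s or arrival[d] == float("inf"):
                continue
            node = d
            while pred[node] is not None:
                u, t = pred[node]
                if u != s:              # credit intermediate hops only
                    score[(u, t)] += 1
                node = u
    return dict(score)

# Toy contact list (u, v, time): vertex b relays a -> c faster than the
# direct contact at time 5.
edges = [("a", "b", 1), ("b", "c", 2), ("a", "c", 5)]
nodes = {"a", "b", "c"}
```

    On this toy network only the temporal vertex ("b", 2) accrues coverage, from the pair (a, c): the relay through b at time 2 beats the direct contact at time 5, which is the kind of bottleneck the paper quantifies.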

  11. Classical computing, quantum computing, and Shor's factoring algorithm

    CERN Document Server

    Manin, Yu I

    1999-01-01

    This is an expository talk written for the Bourbaki Seminar. After a brief introduction, Section 1 discusses in the categorical language the structure of the classical deterministic computations. Basic notions of complexity including the P/NP problem are reviewed. Section 2 introduces the notion of quantum parallelism and explains the main issues of quantum computing. Section 3 is devoted to four quantum subroutines: initialization, quantum computing of classical Boolean functions, quantum Fourier transform, and Grover's search algorithm. The central Section 4 explains Shor's factoring algorithm. Section 5 relates Kolmogorov's complexity to the spectral properties of computable functions. Appendix contributes to the prehistory of quantum computing.
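    The number-theoretic reduction at the heart of Shor's algorithm can be run classically for tiny N: given the multiplicative order r of a modulo N, a nontrivial factor comes from gcd(a^(r/2) - 1, N). The sketch below brute-forces r, which is exactly the step the quantum Fourier transform accelerates to polynomial time.

```python
# Classical sketch of the number-theoretic core of Shor's algorithm: the
# quantum part finds the order r of a (mod N); here we brute-force it.
from math import gcd

def order(a, N):
    """Smallest r > 0 with a^r = 1 (mod N) -- exponential classically,
    polynomial on a quantum computer via the quantum Fourier transform."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(a, N):
    """Try to split N using the order of a; returns None for unlucky a."""
    if gcd(a, N) != 1:
        return gcd(a, N)                # lucky: a already shares a factor
    r = order(a, N)
    if r % 2 == 1:
        return None                     # odd order: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                     # trivial square root of 1: retry
    f = gcd(y - 1, N)
    return f if 1 < f < N else None
```

    For example, shor_classical(7, 15) returns 3 and shor_classical(2, 21) returns 7; for unlucky bases the function returns None and one retries with another a, which succeeds with high probability.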

  12. Facility effluent monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Gleckler, B.P.

    1995-06-01

    This section of the 1994 Hanford Site Environmental Report summarizes the facility effluent monitoring programs and provides an evaluation of effluent monitoring data. These evaluations are useful in assessing the effectiveness of effluent treatment and control systems, as well as management practices.

  13. Facilities: The Tech Edge.

    Science.gov (United States)

    Farmer, Lesley S. J.

    2002-01-01

    Examines the impact of technology on school library facilities and suggests some low-impact ways to optimize its use. Highlights include considering the role technology can play; educational goals; interior environmental factors; circulation desk needs; security; storage for hardware and software; handicapped accessibility; and future planning.

  14. Toroid magnet test facility

    CERN Multimedia

    2002-01-01

    Because of its exceptional size, it was not feasible to assemble and test the Barrel Toroid - made of eight coils - as an integrated toroid on the surface, prior to its final installation underground in LHC interaction point 1. It was therefore decided to test these eight coils individually in a dedicated test facility.

  15. TNO HVAC facilities

    NARCIS (Netherlands)

    Hammink, H.A.J.

    2015-01-01

    TNO has extensive knowledge of heating, ventilation and air conditioning (HVAC), and can offer its services through theoretical studies, laboratory experiments and field measurements. This complete scope, made possible through our test facilities, enables the effective development of new products, i

  16. Sustainable Facilities Management

    DEFF Research Database (Denmark)

    Nielsen, Susanne Balslev; Elle, Morten; Hoffmann, Birgitte;

    2004-01-01

    is on the housing departments and strategies for the management of the use of resources. The research methods used are case studies based on interviews in addition to literature studies. The paper explores lessons to be learned about sustainable facilities management in general, and points to a need for new...

  17. Facility Modernization Report

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, D; Ackley, R

    2007-05-10

    Modern and technologically up-to-date facilities and systems infrastructure are necessary to accommodate today's research environment. In response, Lawrence Livermore National Laboratory (LLNL) has a continuing commitment to develop and apply effective management models and processes to maintain, modernize, and upgrade its facilities to meet the science and technology mission. The Facility Modernization Pilot Study identifies major subsystems of facilities that are either technically or functionally obsolete, lack adequate capacity and/or capability, or need to be modernized or upgraded to sustain current operations and program mission. This study highlights areas that need improvement, system interdependencies, and how these systems/subsystems operate and function as a total productive unit. Although buildings are 'grandfathered' in and are not required to meet current codes unless there are major upgrades, this study also evaluates compliance with 'current' building, electrical, and other codes. This study also provides an evaluation of the condition and overall general appearance of the structure.

  18. Facilities of Environmental Distinction

    Science.gov (United States)

    Pascopella, Angela

    2011-01-01

    Three of nine school buildings that have won the latest Educational Facility Design Awards from the American Institute of Architects (AIA) Committee on Architecture for Education stand out from the crowd of other school buildings because they are sustainable and are connected to the nature that surrounds them. They are: (1) Thurston Elementary…

  19. Nike Facility Diagnostics and Data Acquisition System

    Science.gov (United States)

    Chan, Yung; Aglitskiy, Yefim; Karasik, Max; Kehne, David; Obenschain, Steve; Oh, Jaechul; Serlin, Victor; Weaver, Jim

    2013-10-01

    The Nike laser-target facility is a 56-beam krypton fluoride system that can deliver 2 to 3 kJ of laser energy at 248 nm onto targets inside a two meter diameter vacuum chamber. Nike is used to study physics and technology issues related to laser direct-drive ICF fusion, including hydrodynamic and laser-plasma instabilities, material behavior at extreme pressures, and optical and x-ray diagnostics for laser-heated targets. A suite of laser and target diagnostics are fielded on the Nike facility, including high-speed, high-resolution x-ray and visible imaging cameras, spectrometers and photo-detectors. A centrally-controlled, distributed computerized data acquisition system provides robust data management and near real-time analysis feedback capability during target shots. Work supported by DOE/NNSA.

  20. Test facilities for VINCI®

    Science.gov (United States)

    Greuel, Dirk; Schäfer, Klaus; Schlechtriem, Stefan

    2013-09-01

    With the replacement of the current upper-stage ESC-A of the Ariane 5 launcher by an enhanced cryogenic upper-stage, ESA's Ariane 5 Midterm Evolution (A5-ME) program aims to raise the launcher's payload capacity in geostationary transfer orbit from 10 to 12 tons, an increase of 20 %. Increasing the in-orbit delivery capability of the A5-ME launcher requires a versatile, high-performance, evolved cryogenic upper-stage engine suitable for delivering multiple payloads to all kinds of orbits, ranging from low earth orbit to geostationary transfer orbit with increased perigee. In order to meet these requirements the re-ignitable liquid oxygen/liquid hydrogen expander cycle engine VINCI® currently under development is designated to power the future upper stage, featuring a design performance of 180 kN of thrust and 464 s of specific impulse. Since 2010 development tests for the VINCI® engine have been conducted at the test benches P3.2 and P4.1 at DLR test site in Lampoldshausen under the ESA A5-ME program. For the VINCI® combustion chamber development the P3.2 test facility is used, which is the only European thrust chamber test facility. Originally erected for the development of the thrust chamber of the Vulcain engine, in 2003 the test facility was modified so that today it is able to simulate vacuum conditions for the ignition and startup of the VINCI® combustion chamber. To maintain the test operations under vacuum conditions over an entire mission life of the VINCI® engine, including re-ignition following long and short coasting phases, between 2000 and 2005 the test facility P4.1 was completely rebuilt into a new high-altitude simulation facility. During the past two P4.1 test campaigns in 2010 and 2011 a series of important milestones were reached in the development of the VINCI® engine. In preparation for future activities within the frame of ESA's A5-ME program DLR has already started the engineering of a stage test facility for the prospective upper stage

  1. Central bank Financial Independence

    OpenAIRE

    J.Ramon Martinez-Resano

    2004-01-01

    Central bank independence is a multifaceted institutional design. The financial component has been seldom analysed. This paper intends to set a comprehensive conceptual background for central bank financial independence. Quite often central banks are modelled as robot-like maximizers of some goal. This perspective neglects the fact that central bank functions are inevitably deployed on its balance sheet and have effects on its income statement. A financially independent central bank exhibits ...

  2. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  3. Survey of the Knowledge and use of the library's Information Section among users of the Central Library of Birjand University of Medical Sciences

    Directory of Open Access Journals (Sweden)

    Mehri Robiati

    2012-05-01

    Background and Aim: The assessment of the Central Library members' use and knowledge is an important step in comprehensive planning to improve their awareness of the available facilities and services. This study was performed to determine the knowledge and use of the library services and facilities of the information section by the members of the Central Library. Materials and Methods: In this descriptive-analytical study, among 558 participants, 344 students were selected through systematic sampling and the rest (254) by means of census. A self-designed questionnaire considering the aim of the study was used to evaluate some of the facilities offered by the information section. The gathered data were analyzed by means of Mann-Whitney and Kruskal-Wallis tests using SPSS software (version 15), and P≤0.05 was taken as the significance level. Results: Among the 558 participants, 57%, 18%, 20% and 5% were students, staff, faculty members, and free members, respectively. 93.4% were aware of the library working hours and 16.6% were not satisfied with it. The most and the least familiarity with databases involved Irandoc and Ovid (66.8% and 54%, respectively). Among the computer software, Microsoft Word was the most applicable one. The mean score (out of a total of 116) of using library services and facilities was 21.7±19.1. There was a significant relationship between the level of knowledge and library usage of the studied library members (P<0.001). Conclusion: The results showed that there is a relationship between members' knowledge and their use of the services and facilities of the central library.

  4. Berkeley Lab Computing Sciences: Accelerating Scientific Discovery

    OpenAIRE

    Hules, John A.

    2009-01-01

    Scientists today rely on advances in computer science, mathematics, and computational science, as well as large-scale computing and networking facilities, to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab's Computing Sciences organization researches, develops, and deploys new tools and technologies to meet these needs and to advance research in such areas as global climate change, combustion, fusion energy, nanotechnology, biology, and astrophysics.

  5. Room To Grow? Facilities Programming for Colleges and Universities.

    Science.gov (United States)

    Thompson, Roger; Adams, Tom

    2001-01-01

    Asserts that campus space needs could be remedied by moving centrally located service delivery organizations, such as fleet vehicle maintenance facilities. Describes the process of operational and space needs assessment; this process provides information that enables architects to plan for appropriate adjacencies, correct space allocation, and…

  6. ITER Central Solenoid Module Fabrication

    Energy Technology Data Exchange (ETDEWEB)

    Smith, John [General Atomics, San Diego, CA (United States)

    2016-09-23

    The fabrication of the modules for the ITER Central Solenoid (CS) has started in a dedicated production facility located in Poway, California, USA. The necessary tools have been designed, built, installed, and tested in the facility to enable the start of production. The current schedule has first module fabrication completed in 2017, followed by testing and subsequent shipment to ITER. The Central Solenoid is a key component of the ITER tokamak providing the inductive voltage to initiate and sustain the plasma current and to position and shape the plasma. The design of the CS has been a collaborative effort between the US ITER Project Office (US ITER), the international ITER Organization (IO) and General Atomics (GA). GA's responsibility includes: completing the fabrication design, developing and qualifying the fabrication processes and tools, and then completing the fabrication of the seven 110 tonne CS modules. The modules will be shipped separately to the ITER site, and then stacked and aligned in the Assembly Hall prior to insertion in the core of the ITER tokamak. A dedicated facility in Poway, California, USA has been established by GA to complete the fabrication of the seven modules. Infrastructure improvements included thick reinforced concrete floors, a diesel generator for backup power, along with cranes for moving the tooling within the facility. The fabrication process for a single module requires approximately 22 months followed by five months of testing, which includes preliminary electrical testing followed by high current (48.5 kA) tests at 4.7 K. The production of the seven modules is completed in a parallel fashion through ten process stations. The process stations have been designed and built with most stations having completed testing and qualification for carrying out the required fabrication processes. The final qualification step for each process station is achieved by the successful production of a prototype coil.
Fabrication of the first

  7. Congestion Service Facilities Location Problem with Promise of Response Time

    Directory of Open Access Journals (Sweden)

    Dandan Hu

    2013-01-01

    In many services, a promise of a specific response time is advertised as a commitment by the service provider for customer satisfaction. Congestion at service facilities can delay the delivery of services and hurt overall satisfaction. In this paper, the congestion service facilities location problem with a promise of response time is studied, and a budget-constrained mixed integer nonlinear programming model is presented. The facilities are modeled as M/M/c queues. The decision variables of the model are the locations of the service facilities and the number of servers at each facility. The objective function is to maximize the demand served within the specific response time promised by the service provider. To solve this problem, we propose an algorithm that combines greedy and genetic algorithms. To verify the proposed algorithm, extensive computational experiments were conducted, and the results demonstrate that response time has a significant impact on location decisions.
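    The queueing building block of such models can be sketched with the Erlang-C formula for an M/M/c facility (function and parameter names here are illustrative; the paper embeds this kind of calculation inside a budget-constrained location model):

```python
# Sketch of the M/M/c building block: with arrival rate lam, per-server
# service rate mu, and c servers, the Erlang-C formula gives the probability
# an arriving customer must wait, from which the mean queueing delay follows.
from math import factorial

def erlang_c(lam, mu, c):
    """P(wait > 0) for a stable M/M/c queue (requires lam < c * mu)."""
    a = lam / mu                                # offered load in Erlangs
    rho = a / c                                 # server utilization, must be < 1
    wait_term = a**c / (factorial(c) * (1 - rho))
    p0_inv = sum(a**k / factorial(k) for k in range(c)) + wait_term
    return wait_term / p0_inv

def expected_wait(lam, mu, c):
    """Mean time in queue: W_q = C(c, a) / (c * mu - lam)."""
    return erlang_c(lam, mu, c) / (c * mu - lam)
```

    A convenient sanity check: for c = 1 the waiting probability reduces to the utilization rho = lam/mu, so erlang_c(0.5, 1.0, 1) is 0.5 and the mean wait is rho / (mu - lam) = 1.0.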

  8. EPA Facility Registry Service (FRS): AIRS_AFS Sub Facilities

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Air Facility System (AFS) contains compliance and permit data for stationary sources regulated by EPA, state and local air pollution agencies. The sub facility...

  9. EPA Facility Registry Service (FRS): Facility Interests Dataset Download

    Data.gov (United States)

    U.S. Environmental Protection Agency — This downloadable data package consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are...

  10. EPA Facility Registry Service (FRS): Facility Interests Dataset - Intranet Download

    Data.gov (United States)

    U.S. Environmental Protection Agency — This downloadable data package consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are...

  11. EPA Facility Registry Service (FRS): Facility Interests Dataset - Intranet

    Data.gov (United States)

    U.S. Environmental Protection Agency — This web feature service consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in...

  12. Assisted Living Facilities, care facilities, Published in 2006, Washoe County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Assisted Living Facilities dataset, was produced all or in part from Published Reports/Deeds information as of 2006. It is described as 'care facilities'. Data...

  13. EPA Facility Registry Service (FRS): Facility Interests Dataset

    Data.gov (United States)

    U.S. Environmental Protection Agency — This web feature service consists of location and facility identification information from EPA's Facility Registry Service (FRS) for all sites that are available in...

  14. Replication of Space-Shuttle Computers in FPGAs and ASICs

    Science.gov (United States)

    Ferguson, Roscoe C.

    2008-01-01

    A document discusses the replication of the functionality of the onboard space-shuttle general-purpose computers (GPCs) in field-programmable gate arrays (FPGAs) and application-specific integrated circuits (ASICs). The purpose of the replication effort is to enable utilization of proven space-shuttle flight software and software-development facilities to the extent possible during development of software for flight computers for a new generation of launch vehicles derived from the space shuttles. The replication involves specifying the instruction set of the central processing unit and the input/output processor (IOP) of the space-shuttle GPC in a hardware description language (HDL). The HDL is synthesized to form a "core" processor in an FPGA or, less preferably, in an ASIC. The core processor can be used to create a flight-control card to be inserted into a new avionics computer. The IOP of the GPC as implemented in the core processor could be designed to support data-bus protocols other than that of a multiplexer interface adapter (MIA) used in the space shuttle. Hence, a computer containing the core processor could be tailored to communicate via the space-shuttle GPC bus and/or one or more other buses.
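The core idea of the replication effort, re-implementing a legacy instruction set precisely enough that proven flight software keeps running unmodified, can be illustrated outside an HDL. Below is a minimal sketch in Python of a fetch-decode-execute loop for a tiny hypothetical instruction set; the opcodes, register file, and program are invented for illustration and are not the shuttle GPC's actual ISA:

```python
# Minimal fetch-decode-execute loop illustrating instruction-set
# replication. Opcodes and registers are hypothetical, not the GPC's.

def run(program, registers=None):
    """Execute a list of (opcode, *operands) tuples; return registers."""
    regs = registers or [0] * 4
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        pc += 1
        if op == "LOAD":     # LOAD r, imm  -> regs[r] = imm
            regs[args[0]] = args[1]
        elif op == "ADD":    # ADD r, s     -> regs[r] += regs[s]
            regs[args[0]] += regs[args[1]]
        elif op == "JNZ":    # JNZ r, addr  -> branch if regs[r] != 0
            if regs[args[0]] != 0:
                pc = args[1]
        elif op == "HALT":
            break
        else:
            raise ValueError(f"unknown opcode {op!r}")
    return regs

# Sum 3 + 2 + 1 into r1 by counting r0 down to zero.
prog = [
    ("LOAD", 0, 3),   # r0 = 3 (loop counter)
    ("LOAD", 1, 0),   # r1 = 0 (accumulator)
    ("ADD", 1, 0),    # r1 += r0
    ("LOAD", 2, -1),  # r2 = -1 (decrement constant)
    ("ADD", 0, 2),    # r0 -= 1
    ("JNZ", 0, 2),    # loop back while r0 != 0
    ("HALT",),
]
print(run(prog))  # r1 holds 6 when the program halts
```

A synthesizable HDL core follows the same decode-and-dispatch structure, but with the added burden that timing, I/O protocols (such as the shuttle's MIA bus), and corner-case semantics must match the original hardware exactly.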

  15. The science of computing shaping a discipline

    CERN Document Server

    Tedre, Matti

    2014-01-01

    The identity of computing has been fiercely debated throughout its short history. Why is it still so hard to define computing as an academic discipline? Is computing a scientific, mathematical, or engineering discipline? By describing the mathematical, engineering, and scientific traditions of computing, The Science of Computing: Shaping a Discipline presents a rich picture of computing from the viewpoints of the field's champions. The book helps readers understand the debates about computing as a discipline. It explains the context of computing's central debates and portrays a broad perspecti

  16. Virtual Facility Layout Design Using Virtual Reality Techniques

    Institute of Scientific and Technical Information of China (English)

    Li Zhi-hua; Zhong Yi-fang

    2003-01-01

    Manufacturing facility layout design has long been recognized as one of the most critical and difficult design tasks in manufacturing industries. Traditional layout methods can not fully solve this complex spatial layout problem. Virtual reality (VR) is a new form of human-computer interaction, which has the potential to support a range of engineering applications. This paper discusses the reasons why manufacturing facility layout design is considered to be an appropriate new area of VR utilization. A VR-based layout design framework is proposed. The virtual environment construction issues are discussed. An example of the immersive VR application of facility layout is examined.

  17. Virtual Facility Layout Design Using Virtual Reality Techniques

    Institute of Scientific and Technical Information of China (English)

    Li Zhi-hua; Zhong Yi-fang

    2003-01-01

    Manufacturing facility layout design has long been recognized as one of the most critical and difficult design tasks in manufacturing industries. Traditional layout methods can not fully solve this complex spatial layout problem. Virtual reality (VR) is a new form of human-computer interaction, which has the potential to support a range of engineering applications. This paper discusses the reasons why manufacturing facility layout design is considered to be an appropriate new area of VR utilization. A VR-based layout design framework is proposed. The virtual environment construction issues are discussed. An example of the immersive VR application of facility layout is examined.

  18. Ubiquitous Computing: Potentials and Challenges

    CERN Document Server

    Sen, Jaydip

    2010-01-01

    The world is witnessing the birth of a revolutionary computing paradigm that promises to have a profound effect on the way we interact with computers, devices, physical spaces, and other people. This new technology, called ubiquitous computing, envisions a world where embedded processors, computers, sensors, and digital communications are inexpensive commodities that are available everywhere. Ubiquitous computing will surround users with a comfortable and convenient information environment that merges physical and computational infrastructures into an integrated habitat. This habitat will feature a proliferation of hundreds or thousands of computing devices and sensors that will provide new functionality, offer specialized services, and boost productivity and interaction. This paper presents a comprehensive discussion on the central trends in ubiquitous computing, considering them from technical, social and economic perspectives. It clearly identifies different application areas and sectors that will benefit f...

  19. Physical computation and cognitive science

    CERN Document Server

    Fresco, Nir

    2014-01-01

    This book presents a study of digital computation in contemporary cognitive science. Digital computation is a highly ambiguous concept, as there is no common core definition for it in cognitive science. Since this concept plays a central role in cognitive theory, an adequate cognitive explanation requires an explicit account of digital computation. More specifically, it requires an account of how digital computation is implemented in physical systems. The main challenge is to deliver an account encompassing the multiple types of existing models of computation without ending up in pancomputationalism, that is, the view that every physical system is a digital computing system. This book shows that only two accounts, among the ones examined by the author, are adequate for explaining physical computation. One of them is the instructional information processing account, which is developed here for the first time.   “This book provides a thorough and timely analysis of differing accounts of computation while adv...

  20. Lightweight scheduling of elastic analysis containers in a competitive cloud environment: a Docked Analysis Facility for ALICE

    Science.gov (United States)

    Berzano, D.; Blomer, J.; Buncic, P.; Charalampidis, I.; Ganis, G.; Meusel, R.

    2015-12-01

    During the last years, several Grid computing centres chose virtualization as a better way to manage diverse use cases with self-consistent environments on the same bare infrastructure. The maturity of control interfaces (such as OpenNebula and OpenStack) opened the possibility to easily change the amount of resources assigned to each use case by simply turning on and off virtual machines. Some of those private clouds use, in production, copies of the Virtual Analysis Facility, a fully virtualized and self-contained batch analysis cluster capable of expanding and shrinking automatically upon need: however, resource starvation occurs frequently as expansion has to compete with other virtual machines running long-living batch jobs. Such batch nodes cannot relinquish their resources in a timely fashion: the more jobs they run, the longer it takes to drain them and shut off, and making one-job virtual machines introduces a non-negligible virtualization overhead. By improving several components of the Virtual Analysis Facility we have realized an experimental “Docked” Analysis Facility for ALICE, which leverages containers instead of virtual machines for providing performance and security isolation. We will present the techniques we have used to address practical problems, such as software provisioning through CVMFS, as well as our considerations on the maturity of containers for High Performance Computing. As the abstraction layer is thinner, our Docked Analysis Facilities may feature a more fine-grained sizing, down to single-job node containers: we will show how this approach will positively impact automatic cluster resizing by deploying lightweight pilot containers instead of replacing central queue polls.
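The single-job pilot container described above can be sketched as a shell snippet. The image name, CVMFS repository path, and job wrapper below are assumptions for illustration, not the actual ALICE deployment; the sketch prints the `docker run` invocation as a dry run so it can be inspected without a Docker daemon:

```shell
#!/bin/sh
# Hypothetical single-job "pilot" analysis container: experiment
# software arrives via a read-only CVMFS bind mount, and the container
# is sized for exactly one job so the scheduler can reclaim it quickly.
CVMFS_REPO=/cvmfs/alice.cern.ch        # assumed software repository
IMAGE=alice-analysis:latest            # assumed container image
JOB=/opt/pilot/run-one-job.sh          # assumed pilot job wrapper

# Dry run: print the command instead of executing it.
echo docker run --rm \
    --cpus=1 --memory=2g \
    -v "$CVMFS_REPO:$CVMFS_REPO:ro" \
    "$IMAGE" "$JOB"
```

Because each container holds exactly one job, draining it costs only that job's remaining runtime, which is what makes the fine-grained automatic resizing described in the abstract practical.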