WorldWideScience

Sample records for ground computer facilities

  1. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  2. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  3. Physics Division computer facilities

    Energy Technology Data Exchange (ETDEWEB)

    Cyborski, D.R.; Teh, K.M.

    1995-08-01

    The Physics Division maintains several computer systems for data analysis, general-purpose computing, and word processing. While the VMS VAX clusters are still used, this past year saw a greater shift to the Unix Cluster with the addition of more RISC-based Unix workstations. The main Divisional VAX cluster consists of two VAX 3300s configured as a dual-host system that serves as boot node and disk server to seven satellite nodes: two VAXstation 3200s, three VAXstation 3100 machines, a VAX-11/750, and a MicroVAX II. There are three 6250/1600 bpi 9-track tape drives, six 8-mm tapes, and about 9.1 GB of disk storage served to the cluster by the various satellites. Also, two of the satellites (the MicroVAX and VAX-11/750) have DAPHNE front-end interfaces for data acquisition. Since the tape drives are accessible cluster-wide via a software package, they are used for tape-to-tape copies in addition to replay. There is, however, a satellite node outfitted with two 8-mm drives available for this purpose. Although not part of the main cluster, a DEC 3000 Alpha machine obtained for data acquisition is also available for data replay. In one case, users reported a performance increase by a factor of 10 when using this machine.

  4. AMRITA -- A computational facility

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, J.E. [California Inst. of Tech., CA (US)]; Quirk, J.J.

    1998-02-23

    Amrita is a software system for automating numerical investigations. The system is driven using its own powerful scripting language, Amrita, which facilitates both the composition and archiving of complete numerical investigations, as distinct from isolated computations. Once archived, an Amrita investigation can later be reproduced by any interested party, and not just the original investigator, for no cost other than the raw CPU time needed to parse the archived script. In fact, this entire lecture can be reconstructed in such a fashion. To do this, the script constructs a number of shock-capturing schemes; runs a series of test problems; generates the plots shown; outputs the LaTeX to typeset the notes; and performs a myriad of behind-the-scenes tasks to glue everything together. Thus Amrita has all the characteristics of an operating system and should not be mistaken for a common-or-garden code.
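
    The record describes investigations driven entirely by scripts rather than one-off runs. As a rough illustration of that pattern only (Amrita itself uses its own scripting language, not Python), the sketch below builds two classical schemes, runs a square-wave advection test, saves a plot, and writes a LaTeX fragment; all scheme names and file paths are invented for the example.

```python
# Minimal sketch of an "investigation as a script": build two classical
# schemes, run a test problem, save a plot, and emit LaTeX for the write-up.
import numpy as np
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

def upwind(u, c):          # first-order upwind step for u_t + u_x = 0
    return u - c * (u - np.roll(u, 1))

def lax_friedrichs(u, c):  # Lax-Friedrichs step
    return 0.5 * (np.roll(u, 1) + np.roll(u, -1)) - 0.5 * c * (np.roll(u, -1) - np.roll(u, 1))

def run_case(scheme, nx=200, cfl=0.8, t_end=0.5):
    x = np.linspace(0.0, 1.0, nx, endpoint=False)
    u = np.where((x > 0.25) & (x < 0.5), 1.0, 0.0)   # square-wave test problem
    for _ in range(int(t_end / (cfl / nx))):          # dt = cfl * dx, advection speed 1
        u = scheme(u, cfl)
    return x, u

results = {name: run_case(f) for name, f in
           [("upwind", upwind), ("lax_friedrichs", lax_friedrichs)]}

for name, (x, u) in results.items():
    plt.plot(x, u, label=name)
plt.legend()
plt.savefig("advection_test.png", dpi=150)

with open("advection_test.tex", "w") as f:            # LaTeX fragment for the notes
    f.write("\\begin{figure}\\includegraphics{advection_test.png}"
            "\\caption{Square-wave advection test.}\\end{figure}\n")
```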

  5. 2014 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)]

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  6. 2015 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)]

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  7. Guidance on the Stand Down, Mothball, and Reactivation of Ground Test Facilities

    Science.gov (United States)

    Volkman, Gregrey T.; Dunn, Steven C.

    2013-01-01

    The development of aerospace and aeronautics products typically requires three distinct types of testing resources across research, development, test, and evaluation: experimental ground testing, computational "testing" and development, and flight testing. Over the last twenty-plus years, computational methods have replaced some physical experiments, and this trend is continuing. The result is decreased utilization of ground test capabilities, which, along with market forces, industry consolidation, and other factors, has led to the stand down and oftentimes closure of many ground test facilities. Ground test capabilities are (and very likely will continue to be for many years) required to verify computational results and to provide information for regimes where computational methods remain immature. Ground test capabilities are very costly to build and to maintain, so once constructed and operational it may be desirable to retain access to those capabilities even if they are not currently needed. One means of doing this while reducing ongoing sustainment costs is to stand down the facility into a "mothball" status - keeping it alive to bring it back when needed. Both NASA and the US Department of Defense have policies to accomplish the mothball of a facility, but with little detail. This paper offers a generic process to follow that can be tailored based on the needs of the owner and the applicable facility.

  8. Oak Ridge Leadership Computing Facility (OLCF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Oak Ridge Leadership Computing Facility (OLCF) was established at Oak Ridge National Laboratory in 2004 with the mission of standing up a supercomputer 100 times...

  9. Rendezvous Facilities in a Distributed Computer System

    Institute of Scientific and Technical Information of China (English)

    Liao Xian-Zhi; Jin Lan

    1995-01-01

    The distributed computer system described in this paper is a set of computer nodes connected by an interconnection network via packet-switching interfaces. The nodes communicate with each other by means of message-passing protocols. This paper presents the implementation of rendezvous facilities as high-level primitives provided by a parallel programming language to support interprocess communication and synchronization.
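
    For readers unfamiliar with the term, a rendezvous is a message exchange in which the sender blocks until the receiver accepts the message, so communication doubles as synchronization. The sketch below illustrates the idea with two Python threads; it is a toy single-process analogue, not the paper's network-level, language-provided primitives.

```python
# Minimal sketch of rendezvous-style message passing: send() returns only
# after the matching receive() has taken the message.
import threading

class RendezvousChannel:
    def __init__(self):
        self._lock = threading.Lock()               # one sender at a time
        self._msg_ready = threading.Semaphore(0)    # posted by the sender
        self._msg_taken = threading.Semaphore(0)    # posted by the receiver
        self._slot = None

    def send(self, msg):
        with self._lock:
            self._slot = msg
            self._msg_ready.release()   # offer the message
            self._msg_taken.acquire()   # block until the receiver has taken it

    def receive(self):
        self._msg_ready.acquire()       # block until a message is offered
        msg = self._slot
        self._msg_taken.release()       # release the waiting sender
        return msg

if __name__ == "__main__":
    ch = RendezvousChannel()
    t = threading.Thread(target=lambda: print("received:", ch.receive()))
    t.start()
    ch.send({"op": "sync", "payload": 42})  # returns only after the receive
    t.join()
```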

  10. 2016 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Jim [Argonne National Lab. (ANL), Argonne, IL (United States)]; Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States)]; Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)]

    2016-01-01

    The Argonne Leadership Computing Facility (ALCF) helps researchers solve some of the world’s largest and most complex problems, while also advancing the nation’s efforts to develop future exascale computing systems. This report presents some of the ALCF’s notable achievements in key strategic areas over the past year.

  11. Developing Computational Thinking through Grounded Embodied Cognition

    Science.gov (United States)

    Fadjo, Cameron Lawrence

    2012-01-01

    Two studies were conducted to examine the use of grounded embodied pedagogy, construction of Imaginary Worlds (Study 1), and context of instructional materials (Study 2) for developing learners' Computational Thinking (CT) Skills and Concept knowledge during the construction of digital artifacts using Scratch, a block-based programming…

  12. Computational Grounded Cognition: A New Alliance between Grounded Cognition and Computational Modeling

    Directory of Open Access Journals (Sweden)

    Giovanni Pezzulo

    2013-01-01

    Grounded theories assume that there is no central module for cognition. According to this view, all cognitive phenomena, including those considered the province of amodal cognition such as reasoning, numeric and language processing, are ultimately grounded in (and emerge from) a variety of bodily, affective, perceptual and motor processes. The development and expression of cognition is constrained by the embodiment of cognitive agents and various contextual factors (physical and social) in which they are immersed. The grounded framework has received numerous empirical confirmations. Still, there are very few explicit computational models that implement grounding in sensory, motor and affective processes as intrinsic to cognition, and demonstrate that grounded theories can mechanistically implement higher cognitive abilities. We propose a new alliance between grounded cognition and computational modeling towards a novel multidisciplinary enterprise: Computational Grounded Cognition. We clarify the defining features of this novel approach and emphasize the importance of using the methodology of Cognitive Robotics, which permits simultaneous consideration of multiple aspects of grounding, embodiment, and situatedness, showing how they constrain the development and expression of cognition.

  13. SILEX ground segment control facilities and flight operations

    Science.gov (United States)

    Demelenne, Benoit; Tolker-Nielsen, Toni; Guillen, Jean-Claude

    1999-04-01

    The European Space Agency is going to conduct an inter-orbit link experiment which will connect a low Earth orbiting satellite and a geostationary satellite via optical terminals. This experiment has been called SILEX (Semiconductor Inter-satellite Link Experiment). Two payloads have been built. One, called PASTEL (PASsager de TELecommunication), was embarked on the French Earth observation satellite SPOT4, which was launched successfully in March 1998. The future European experimental data relay satellite ARTEMIS (Advanced Relay and TEchnology MISsion), which will route the data to ground, will carry the OPALE terminal (Optical Payload Experiment). The European Space Agency is responsible for the operation of both terminals. Due to the complexity and experimental character of this new optical technology, the development, preparation and validation of the ground segment control facilities required a long series of technical and operational qualification tests. This paper presents the operations concept and the early results of the PASTEL in-orbit operations.

  14. Oak Ridge Leadership Computing Facility Position Paper

    Energy Technology Data Exchange (ETDEWEB)

    Oral, H Sarp [ORNL]; Hill, Jason J [ORNL]; Thach, Kevin G [ORNL]; Podhorszki, Norbert [ORNL]; Klasky, Scott A [ORNL]; Rogers, James H [ORNL]; Shipman, Galen M [ORNL]

    2011-01-01

    This paper discusses the business, administration, reliability, and usability aspects of storage systems at the Oak Ridge Leadership Computing Facility (OLCF). The OLCF has developed key competencies in architecting and administering large-scale Lustre deployments as well as HPSS archival systems. Additionally, as these systems are architected, deployed, and expanded over time, reliability and availability factors are a primary driver. This paper focuses on the implementation of the Spider parallel Lustre file system as well as the implementation of the HPSS archive at the OLCF.

  15. Future aerospace ground test facility requirements for the Arnold Engineering Development Center

    Science.gov (United States)

    Kirchner, Mark E.; Baron, Judson R.; Bogdonoff, Seymour M.; Carter, Donald I.; Couch, Lana M.; Fanning, Arthur E.; Heiser, William H.; Koff, Bernard L.; Melnik, Robert E.; Mercer, Stephen C.

    1992-01-01

    Arnold Engineering Development Center (AEDC) was conceived at the close of World War II, when major new developments in flight technology were presaged by new aerodynamic and propulsion concepts. During the past 40 years, AEDC has played a significant part in the development of many aerospace systems. The original plans were extended through the years by some additional facilities, particularly in the area of propulsion testing. AEDC now has undertaken development of a master plan in an attempt to project requirements and to plan for ground test and computational facilities over the coming 20 to 30 years. This report was prepared in response to an AEDC request that the National Research Council (NRC) assemble a committee to prepare guidance for planning and modernizing AEDC facilities for the development and testing of future classes of aerospace systems as envisaged by the U.S. Air Force.

  16. Embracing Safe Ground Test Facility Operations and Maintenance

    Science.gov (United States)

    Dunn, Steven C.; Green, Donald R.

    2010-01-01

    Conducting integrated operations and maintenance in wind tunnel ground test facilities requires a balance of meeting due dates, efficient operation, responsiveness to the test customer, data quality, effective maintenance (relating to readiness and reliability), and personnel and facility safety. Safety is non-negotiable, so the balance must be an "and" with other requirements and needs. Pressure to deliver services faster at increasing levels of quality in under-maintained facilities is typical. A challenge for management is to balance the "need for speed" with safety and quality. It's especially important to communicate this balance across the organization - workers, with a desire to perform, can be tempted to cut corners on defined processes to increase speed. Having a lean staff can extend the time required for pre-test preparations, and both the safety of facility personnel and good stewardship of expensive national capabilities can be put at risk by one well-intending person using at-risk behavior. This paper documents a specific, though typical, operational environment and cites management and worker safety initiatives and tools used to provide a safe work environment. Results are presented and clearly show that the work environment is a relatively safe one, though still not good enough to prevent all injuries. So, the journey to a zero-injury work environment - both in measured reality and in the minds of each employee - continues. The intent of this paper is to provide a benchmark for others with operational environments and to stimulate additional sharing and discussion on having and keeping a safe work environment.

  17. Computational evaluation of a neutron field facility

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Jose Julio de O.; Pazianotto, Mauricio T., E-mail: jjfilos@hotmail.com, E-mail: mpazianotto@gmail.com [Instituto Tecnologico de Aeronautica (ITA/DCTA), Sao Jose dos Campos, SP (Brazil)]; Federico, Claudio A.; Passaro, Angelo, E-mail: claudiofederico@ieav.cta.br, E-mail: angelo@ieav.cta.br [Instituto de Estudos Avancados (IEAv/DCTA), Sao Jose dos Campos, SP (Brazil)]

    2015-07-01

    This paper describes the results of a study based on computer simulation of a realistic 3D model of the Ionizing Radiation Laboratory of the Institute for Advanced Studies (IEAv) using the MCNP5 (Monte Carlo N-Particle) code, in order to guide the installation of a neutron generator based on the ³H(d,n)⁴He reaction. The equipment produces neutrons with an energy of 14.1 MeV and a 2 × 10⁸ n/s production rate in 4π geometry, and it can also be used for neutron dosimetry studies. This work evaluated the spectra and neutron fluence at previously selected positions inside the facility, chosen because of the interest in assessing the ambient dose equivalent, so that the necessary adjustments can be made to the installation to be consistent with the radiation protection and radiation safety guidelines determined by the standards of the National Nuclear Energy Commission (CNEN). (author)
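
    A quick sanity check that can accompany such an MCNP5 study is the direct, unscattered fluence rate from an isotropic point source, φ = S / (4πr²). The Python sketch below evaluates it for the quoted 2 × 10⁸ n/s source at a few assumed distances; room scatter, which the Monte Carlo model does capture, is ignored here.

```python
# Back-of-the-envelope estimate complementing the MCNP5 model: direct
# (unscattered) fluence rate from an isotropic point source. Distances are
# illustrative assumptions; scattered contributions are neglected.
import math

S = 2.0e8  # source strength, n/s (14.1 MeV D-T generator, 4*pi emission)

for r_cm in (50.0, 100.0, 200.0, 400.0):
    phi = S / (4.0 * math.pi * r_cm**2)   # n / (cm^2 s), direct component only
    print(f"r = {r_cm:6.1f} cm  ->  direct fluence rate ~ {phi:9.1f} n/cm^2/s")
```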

  1. Stochastic model of the NASA/MSFC ground facility for large space structures with uncertain parameters: The maximum entropy approach

    Science.gov (United States)

    Hsia, Wei-Shen

    1987-01-01

    A stochastic control model of the NASA/MSFC Ground Facility for Large Space Structures (LSS) control verification was presented, based on the Maximum Entropy (ME) principle adopted in Hyland's method. A computer program was implemented for this purpose using ORACLS. Four models were then tested and the results presented.
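
    The record gives no equations, so the following is only a generic illustration of the kind of linear-quadratic design computation a package such as ORACLS automates; the Maximum Entropy robustification itself is not reproduced, and the model matrices and weights are placeholders.

```python
# Generic LQR design for a toy lightly damped flexible mode: solve the
# continuous-time algebraic Riccati equation and form the feedback gain.
import numpy as np
from scipy.linalg import solve_continuous_are

omega0, zeta = 1.0, 0.02                    # assumed modal frequency and damping
A = np.array([[0.0, 1.0],
              [-omega0**2, -2.0 * zeta * omega0]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])                    # state weighting (assumed)
R = np.array([[0.1]])                       # control weighting (assumed)

P = solve_continuous_are(A, B, Q, R)        # Riccati solution
K = np.linalg.solve(R, B.T @ P)             # optimal state feedback u = -K x
print("LQR gain K =", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```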

  2. An Assessment of Testing Requirement Impacts on Nuclear Thermal Propulsion Ground Test Facility Design

    Science.gov (United States)

    Shipers, Larry R.; Ottinger, Cathy A.; Sanchez, Lawrence C.

    1994-07-01

    Programs to develop solid core nuclear thermal propulsion (NTP) systems have been under way at the Department of Defense (DoD), the National Aeronautics and Space Administration (NASA), and the Department of Energy (DOE). These programs have recognized the need for a new ground test facility to support development of NTP systems. However, the different military and civilian applications have led to different ground test facility requirements. The Department of Energy (DOE) in its role as landlord and operator of the proposed research reactor test facilities has initiated an effort to explore opportunities for a common ground test facility to meet both DoD and NASA needs. The baseline design and operating limits of the proposed DoD NTP ground test facility are described. The NASA ground test facility requirements are reviewed and their potential impact on the DoD facility baseline is discussed.

  3. An assessment of testing requirement impacts on nuclear thermal propulsion ground test facility design

    Energy Technology Data Exchange (ETDEWEB)

    Shipers, L.R.; Ottinger, C.A.; Sanchez, L.C.

    1993-10-25

    Programs to develop solid core nuclear thermal propulsion (NTP) systems have been under way at the Department of Defense (DoD), the National Aeronautics and Space Administration (NASA), and the Department of Energy (DOE). These programs have recognized the need for a new ground test facility to support development of NTP systems. However, the different military and civilian applications have led to different ground test facility requirements. The Department of Energy (DOE) in its role as landlord and operator of the proposed research reactor test facilities has initiated an effort to explore opportunities for a common ground test facility to meet both DoD and NASA needs. The baseline design and operating limits of the proposed DoD NTP ground test facility are described. The NASA ground test facility requirements are reviewed and their potential impact on the DoD facility baseline is discussed.

  4. Hypergravity Facilities in the ESA Ground-Based Facility Program - Current Research Activities and Future Tasks

    Science.gov (United States)

    Frett, Timo; Petrat, Guido; W. A. van Loon, Jack J.; Hemmersbach, Ruth; Anken, Ralf

    2016-06-01

    Research on Artificial Gravity (AG) created by linear acceleration or centrifugation has a long history and could contribute significantly to realizing long-term human spaceflight in the future. Employing centrifuges plays a prominent role in human physiology and gravitational biology. This article gives a short review of the background of Artificial Gravity with respect to hypergravity (including partial gravity) and provides information about current ESA ground-based facilities for research on a variety of biosystems such as cells, plants, animals or, particularly, humans.

  5. The Spartan attitude control system - Ground support computer

    Science.gov (United States)

    Schnurr, R. G., Jr.

    1986-01-01

    The Spartan Attitude Control System (ACS) contains a command and control computer. This computer is optimized for the activities of the flight and contains very little human interface hardware and software. The computer system provides the technicians testing the Spartan ACS with a convenient command-oriented interface to the flight ACS computer. The system also decodes and time-tags data automatically sent out by the flight computer as key events occur. The duration and magnitude of all system maneuvers are also derived and displayed by this system. The Ground Support Computer is also the primary Ground Support Equipment for the flight sequencer, which controls all payload maneuvers and long-term program timing.
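
    The time-tagging and maneuver-summary functions described above can be pictured with a small sketch. The event format and field names below are hypothetical; the actual Spartan telemetry formats are not given in the record.

```python
# Sketch only: time-tag decoded flight-computer events on receipt and derive
# maneuver duration and magnitude from paired start/end events.
import time

def time_tag(event):
    """Attach a ground-receipt timestamp to a decoded event record."""
    return {**event, "ground_time": time.time()}

def maneuver_summary(events):
    """Pair MANEUVER_START / MANEUVER_END events (hypothetical types)."""
    summaries, start = [], None
    for ev in events:
        if ev["type"] == "MANEUVER_START":
            start = ev
        elif ev["type"] == "MANEUVER_END" and start is not None:
            summaries.append({
                "duration_s": ev["ground_time"] - start["ground_time"],
                "magnitude_deg": abs(ev["angle_deg"] - start["angle_deg"]),
            })
            start = None
    return summaries

# Usage with two synthetic events:
log = [time_tag({"type": "MANEUVER_START", "angle_deg": 0.0})]
time.sleep(0.1)
log.append(time_tag({"type": "MANEUVER_END", "angle_deg": 12.5}))
print(maneuver_summary(log))
```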

  6. Joint ACE ground penetrating radar antenna test facility at the Technical University of Denmark

    DEFF Research Database (Denmark)

    Lenler-Eriksen, Hans-Rudolph; Meincke, Peter; Sarri, A.;

    2005-01-01

    A ground penetrating radar (GPR) antenna test facility, established within the ACE network at the Technical University of Denmark (DTU), is described. Examples of results from the facility obtained from measurements of eight different GPR antennas are presented.

  7. Tissue Engineering of Cartilage on Ground-Based Facilities

    Science.gov (United States)

    Aleshcheva, Ganna; Bauer, Johann; Hemmersbach, Ruth; Egli, Marcel; Wehland, Markus; Grimm, Daniela

    2016-06-01

    Investigations under simulated microgravity offer the opportunity for a better understanding of the influence of altered gravity on cells and the scaffold-free three-dimensional (3D) tissue formation. To investigate the short-term influence, human chondrocytes were cultivated for 2 h, 4 h, 16 h, and 24 h on a 2D Fast-Rotating Clinostat (FRC) in DMEM/F-12 medium supplemented with 10 % FCS. We detected holes in the vimentin network, perinuclear accumulations of vimentin after 2 h, and changes in the chondrocytes shape visualised by F-actin staining after 4 h of FRC-exposure. Scaffold-free cultivation of chondrocytes for 7 d on the Random Positioning Machine (RPM), the FRC and the Rotating Wall Vessel (RWV) resulted in spheroid formation, a phenomenon already known from spaceflight experiments with chondrocytes (MIR Space Station) and thyroid cancer cells (SimBox/Shenzhou-8 space mission). The experiments enabled by the ESA-CORA-GBF programme gave us an optimal opportunity to study gravity-related cellular processes, validate ground-based facilities for our chosen cell system, and prepare long-term experiments under real microgravity conditions in space

  8. Computer Profile of School Facilities Energy Consumption.

    Science.gov (United States)

    Oswalt, Felix E.

    This document outlines a computerized management tool designed to enable building managers to identify energy consumption as related to types and uses of school facilities for the purpose of evaluating and managing the operation, maintenance, modification, and planning of new facilities. Specifically, it is expected that the statistics generated…

  9. High Performance Computing Facility Operational Assessment, CY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL]; Barker, Ashley D [ORNL]; Bland, Arthur S Buddy [ORNL]; Boudwin, Kathlyn J. [ORNL]; Hack, James J [ORNL]; Kendall, Ricky A [ORNL]; Messer, Bronson [ORNL]; Rogers, James H [ORNL]; Shipman, Galen M [ORNL]; Wells, Jack C [ORNL]; White, Julia C [ORNL]; Hudson, Douglas L [ORNL]

    2012-02-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources. Of these we report the 300 in this review that are consistent with guidance provided. Scientific achievements by OLCF users cut across all range scales from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of Carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source and simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source and simulations on Jaguar to reveal the molecular structure of ligno-cellulosic material used in bioethanol production. This year, Jaguar has been used to do billion-cell CFD calculations to develop shock wave compression turbo machinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbo machinery to learn what efficiencies the traditional steady flow assumption is hiding from designers. Even a 1% improvement in turbine design can save the nation

  10. High Performance Computing Facility Operational Assessment 2015: Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Ashley D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; Bernholdt, David E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; Bland, Arthur S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; Gary, Jeff D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; Hack, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; McNally, Stephen T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; Rogers, James H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; Smith, Brian E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; Straatsma, T. P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; Sukumar, Sreenivas Rangan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; Thach, Kevin G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; Tichenor, Suzy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; Vazhkudai, Sudharshan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]; Wells, Jack C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility]

    2016-03-01

    Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others. The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example Titan enabled engineers in Los Angeles and the surrounding region to design and begin building improved critical infrastructure by enabling the highest-resolution Cybershake map for Southern

  11. High Performance Computing Facility Operational Assessment, FY 2010 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Bland, Arthur S Buddy [ORNL]; Hack, James J [ORNL]; Baker, Ann E [ORNL]; Barker, Ashley D [ORNL]; Boudwin, Kathlyn J. [ORNL]; Kendall, Ricky A [ORNL]; Messer, Bronson [ORNL]; Rogers, James H [ORNL]; Shipman, Galen M [ORNL]; White, Julia C [ORNL]

    2010-08-01

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools

  12. Ground Handling of Batteries at Test and Launch-site Facilities

    Science.gov (United States)

    Jeevarajan, Judith A.; Hohl, Alan R.

    2008-01-01

    Ground handling of flight as well as engineering batteries at test facilities and launch-site facilities is a safety critical process. Test equipment interfacing with the batteries should have the required controls to prevent a hazardous failure of the batteries. Test equipment failures should not induce catastrophic failures on the batteries. Transportation requirements for batteries should also be taken into consideration for safe transportation. This viewgraph presentation includes information on the safe handling of batteries for ground processing at test facilities as well as launch-site facilities.

  13. Stormwater Pollution Prevention Plan TA-60 Roads and Grounds Facility and Associated Sigma Mesa Staging Area

    Energy Technology Data Exchange (ETDEWEB)

    Sandoval, Leonard Frank [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2017-02-01

    This Stormwater Pollution Prevention Plan (SWPPP) is applicable to operations at the Technical Area -60 (TA-60) Roads and Grounds Facility and Associated Sigma Mesa Staging Area off Eniwetok Drive, in Los Alamos County, New Mexico.

  14. Ground test facilities for evaluating nuclear thermal propulsion engines and fuel elements

    Science.gov (United States)

    Allen, G. C.; Beck, D. F.; Harmon, C. D.; Shipers, L. R.

    Interagency panels evaluating nuclear thermal propulsion development options have consistently recognized the need for constructing a major new ground test facility to support fuel element and engine testing. This paper summarizes the requirements, configuration, and design issues of a proposed ground test complex for evaluating nuclear thermal propulsion engines and fuel elements being developed for the Space Nuclear Thermal Propulsion (SNTP) program.

  15. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL]; Bland, Arthur S Buddy [ORNL]; Hack, James J [ORNL]; Barker, Ashley D [ORNL]; Boudwin, Kathlyn J. [ORNL]; Kendall, Ricky A [ORNL]; Messer, Bronson [ORNL]; Rogers, James H [ORNL]; Shipman, Galen M [ORNL]; Wells, Jack C [ORNL]; White, Julia C [ORNL]

    2011-08-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and

  16. Technology benefits and ground test facilities for high-speed civil transport development

    Science.gov (United States)

    Winston, Matthew M.; Shields, Elwood M.; Morris, Shelby J., Jr.

    1992-01-01

    The advanced technology base necessary for successful twenty-first century High-Speed Civil Transport (HSCT) aircraft will require extensive ground testing in aerodynamics, propulsion, acoustics, structures, materials, and other disciplines. This paper analyzes the benefits of advanced technology application to HSCT concepts, addresses the adequacy of existing ground-based test facilities, and explores the need for new facilities required to support HSCT development. A substantial amount of HSCT-related ground testing can be accomplished in existing facilities. The HSCT development effort could also benefit significantly from some new facilities initially conceived for testing in other aeronautical research areas. A new structures testing facility is identified as critically needed to ensure timely technology maturation.

  17. Computational Modeling in Support of National Ignition Facility Operations

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, M J; Sacks, R A; Haynam, C A; Williams, W H

    2001-10-23

    Numerical simulation of the National Ignition Facility (NIF) laser performance and automated control of the laser setup process are crucial to the project's success. These functions will be performed by two closely coupled computer codes: the virtual beamline (VBL) and the laser operations performance model (LPOM).

  18. The ACE-DTU Planar Near-Field Ground Penetrating Radar Antenna Test Facility

    DEFF Research Database (Denmark)

    Lenler-Eriksen, Hans-Rudolph; Meincke, Peter

    2004-01-01

    The ACE-DTU planar near-field ground penetrating radar (GPR) antenna test facility is used to measure the plane-wave transmitting spectrum of a GPR loop antenna close to the air-soil interface by means of a probe buried in soil. Probe correction is implemented using knowledge about the complex...

  19. Concept of ground facilities and the analyses of the factors for cost estimation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Choi, H. J.; Choi, J. W.; Kim, S. K.; Cho, D. K

    2007-09-15

    The geologic disposal of spent fuel generated by nuclear power plants is the only way to protect human beings and the surrounding environment, now and in the future. The direct disposal of the spent fuel from nuclear power plants is considered, and a Korean Reference HLW disposal System (KRS) suitable for our representative geological conditions has been developed. In this study, the concept of the spent fuel encapsulation process, a key element of the above-ground facilities for deep geological disposal, was established. To do this, the design requirements, such as the functions and the spent fuel accumulations, were reviewed. Also, the design principles and bases were established. Based on the requirements and bases, the encapsulation process of the spent fuel, from receiving spent fuel from the nuclear power plants to transferring canisters into the underground repository, was established. A graphical simulation of the above-ground facility, based on the KRS design concept and spent nuclear fuel disposal scenarios, showed that the process is appropriate to the facility design concept, while further improvement of the facility design through actual demonstration testing is required. In addition, based on the concept of the above-ground facilities for the Korean Reference HLW disposal System, an analysis of the factors for cost estimation was carried out.

  1. Which future for electromagnetic Astronomy: Ground Based vs Space Borne Large Astrophysical Facilities

    Science.gov (United States)

    Ubertini, Pietro

    2015-08-01

    The combined use of large ground-based facilities and large space observatories is playing a key role in the advance of astrophysics by providing access to the entire electromagnetic spectrum, allowing high-sensitivity observations from the longest radio wavelengths to the highest-energy gamma rays. It is now clear that a further step forward in understanding the evolution of the Universe and the formation of large-scale structure is essential, and only possible with the combined use of multiwavelength imaging and high-resolution spectral instruments. The increasing size, complexity and cost of large ground and space observatories place a growing emphasis on international collaboration. While the present set of astronomical facilities is impressive and complete, with nicely complementary space and ground-based telescopes, the scenario becomes worrisome and critical in the next two decades. In fact, only a few 'Large' main space missions are planned and there is a need to ensure proper ground facility coverage: the ground-space synergy is inescapable in the 2020-2030 timeframe. The scope of this talk is to review the current astronomical instrumentation panorama, also in view of recent programmatic decisions by major national agencies and international bodies. This Division B meeting gives us a unique opportunity to review the current situation and discuss future perspectives, taking advantage of the large audience ensured by the IAU GA.

  2. Characterization of Vacuum Facility Background Gas Through Simulation and Considerations for Electric Propulsion Ground Testing

    Science.gov (United States)

    Yim, John T.; Burt, Jonathan M.

    2015-01-01

    The background gas in a vacuum facility for electric propulsion ground testing is examined in detail through a series of cold flow simulations using a direct simulation Monte Carlo (DSMC) code. The focus here is on the background gas itself, its structure and characteristics, rather than assessing its interaction and impact on thruster operation. The background gas, which is often incorrectly characterized as uniform, is found to have a notable velocity within a test facility. The gas velocity has an impact on the proper measurement of pressure and the calculation of ingestion flux to a thruster. There are also considerations for best practices for tests that involve the introduction of supplemental gas flows to artificially increase the background pressure. All of these effects need to be accounted for to properly characterize the operation of electric propulsion thrusters across different ground test vacuum facilities.
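
    One consequence noted above is that the ingestion flux depends on the background-gas drift. Under free-molecular assumptions, the one-sided number flux from a drifting Maxwellian scales with the speed ratio s = u/v_mp, as in the sketch below; the xenon properties, density, temperature, and drift speeds are assumed values, not results from the paper.

```python
# Illustrative calculation: free-molecular number flux onto a surface facing
# a drifting Maxwellian background gas, versus a stationary one.
import math

k_B = 1.380649e-23             # J/K
m_xe = 131.293 * 1.66054e-27   # kg, xenon atom (assumed propellant/background species)

def ingestion_flux(n, T, u):
    """One-sided number flux (1/m^2/s) onto a surface normal to drift u."""
    v_mp = math.sqrt(2.0 * k_B * T / m_xe)   # most probable speed
    s = u / v_mp                              # speed ratio
    return n * v_mp / (2.0 * math.sqrt(math.pi)) * (
        math.exp(-s * s) + math.sqrt(math.pi) * s * (1.0 + math.erf(s)))

n, T = 1.0e18, 300.0            # assumed background density (1/m^3) and temperature (K)
for u in (0.0, 100.0, 300.0):   # assumed drift speeds, m/s
    print(f"u = {u:5.0f} m/s  ->  flux = {ingestion_flux(n, T, u):.3e} m^-2 s^-1")
```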

  3. Effluent Containment System for space thermal nuclear propulsion ground test facilities

    Science.gov (United States)

    1995-08-01

    This report presents the research and development study work performed for the Space Reactor Power System Division of the U.S. Department of Energy on an innovative effluent containment system (ECS) that would be used during ground testing of a space nuclear thermal rocket engine. A significant portion of the ground test facilities for a space nuclear thermal propulsion engine are the effluent treatment and containment systems. The proposed ECS configuration developed recycles all engine coolant media and does not impact the environment by venting radioactive material. All coolant media, hydrogen and water, are collected, treated for removal of radioactive particulates, and recycled for use in subsequent tests until the end of the facility life. Radioactive materials removed by the treatment systems are recovered, stored for decay of short-lived isotopes, or packaged for disposal as waste. At the end of the useful life, the facility will be decontaminated and dismantled for disposal.

  4. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  5. Life-Cycle Assessments of Selected NASA Ground-Based Test Facilities

    Science.gov (United States)

    Sydnor, George Honeycutt

    2012-01-01

    In the past two years, two separate facility-specific life cycle assessments (LCAs) have been performed as summer student projects. The first project focused on 13 facilities managed by NASA's Aeronautics Test Program (ATP), an organization responsible for large, high-energy ground test facilities that accomplish the nation's most advanced aerospace research. A facility inventory was created for each facility, and the operational-phase carbon footprint and environmental impact were calculated. The largest impacts stemmed from electricity and natural gas used directly at the facility and to generate support processes such as compressed air and steam. However, in specialized facilities that use unique inputs like R-134a, R-14, jet fuels, or nitrogen gas, these sometimes had a considerable effect on the facility's overall environmental impact. The second LCA project was conducted on the NASA Ames Arc Jet Complex and also involved creating a facility inventory and calculating the carbon footprint and environmental impact. In addition, operational alternatives were analyzed for their effectiveness at reducing impact. Overall, the Arc Jet Complex impact is dominated by the natural-gas fired boiler producing steam on-site, but alternatives were provided that could reduce the impact of the boiler operation, some of which are already being implemented. The data and results provided by these LCA projects are beneficial to both the individual facilities and NASA as a whole; the results have already been used in a proposal to reduce carbon footprint at Ames Research Center. To help future life cycle projects, several lessons learned have been recommended as simple and effective infrastructure improvements to NASA, including better utility metering and data recording and standardization of modeling choices and methods. These studies also increased sensitivity to and appreciation for quantifying the impact of NASA's activities.

  6. Shielding Calculations for Positron Emission Tomography - Computed Tomography Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Baasandorj, Khashbayar [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]; Yang, Jeongseon [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)]

    2015-10-15

    Integrated PET-CT has been shown to be more accurate for lesion localization and characterization than PET or CT alone, or than results obtained from PET and CT separately and interpreted side by side or following software-based fusion of the PET and CT datasets. At the same time, PET-CT scans can result in high patient and staff doses; therefore, careful site planning and shielding of this imaging modality have become challenging issues in the field. In Mongolia, the introduction of PET-CT facilities is currently being considered in many hospitals. Thus, additional regulatory legislation for nuclear and radiation applications is necessary, for example, in regulating licensee processes and ensuring radiation safety during operations. This paper aims to determine appropriate PET-CT shielding designs using numerical formulas and computer code. Since there are presently no PET-CT facilities in Mongolia, contact was made with radiological staff at the Nuclear Medicine Center of the National Cancer Center of Mongolia (NCCM) to get information about facilities where the introduction of PET-CT is being considered. Well-designed facilities do not require additional shielding, which should help cut down the overall costs related to PET-CT installation. According to the results of this study, the barrier thicknesses of the NCCM building are not sufficient to keep radiation doses within the limits.
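
    The kind of barrier calculation the paper refers to can be outlined in a few lines: estimate the weekly unshielded dose behind a wall, derive the required broad-beam transmission, and convert it to a thickness via a tenth-value layer. Every numerical input in the sketch below is an illustrative assumption, not a value from the paper or from any particular shielding report.

```python
# Simplified 511 keV barrier sizing sketch; all inputs are assumed placeholders.
import math

gamma = 0.092e-3     # mSv*m^2/(MBq*h), effective dose-rate constant per patient (assumed)
A0 = 370.0           # MBq administered activity per patient (assumed)
t_u = 1.0            # h the patient spends near this wall (assumed)
patients_per_week = 40
d = 3.0              # m, distance from patient to occupied point (assumed)
P = 0.02             # mSv/week, design limit for the adjacent area (assumed)
occupancy = 1.0
TVL_lead_mm = 17.0   # mm, assumed broad-beam tenth-value layer for 511 keV in lead

dose_unshielded = gamma * A0 * t_u * patients_per_week * occupancy / d**2  # mSv/week
B = min(1.0, P / dose_unshielded)                      # required transmission factor
thickness_mm = 0.0 if B >= 1.0 else TVL_lead_mm * math.log10(1.0 / B)

print(f"unshielded weekly dose: {dose_unshielded:.3f} mSv")
print(f"required transmission B: {B:.3f}")
print(f"indicative lead thickness: {thickness_mm:.1f} mm")
```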

  7. Hanford environment as related to radioactive waste burial grounds and transuranium waste storage facilities

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D.J.; Isaacson, R.E.

    1977-06-01

    A detailed characterization of the existing environment at Hanford was provided by the U.S. Energy Research and Development Administration (ERDA) in the Final Environmental Statement, Waste Management Operations, Hanford Reservation, Richland, Washington, December 1975. Abbreviated discussions from that document are presented together with current data, as they pertain to radioactive waste burial grounds and interim transuranic (TRU) waste storage facilities. The discussions and data are presented in sections on geology, hydrology, ecology, and natural phenomena. (JRD)

  8. Hanford facility dangerous waste permit application, low-level burial grounds

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, R.H.

    1997-08-12

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, 'operating' treatment, storage, and/or disposal units, such as the Low-Level Burial Grounds (this document, DOE/RL-88-20).

  9. A Grounded Theory Analysis of Introductory Computer Science Pedagogy

    Directory of Open Access Journals (Sweden)

    Jonathan Wellons

    2011-12-01

    Planning is a critical, early step on the path to successful program writing and a skill that is often lacking in novice programmers. As practitioners we are continually searching for or creating interventions to help our students, particularly those who struggle in the early stages of their computer science education. In this paper we report on our ongoing research of novice programming skills that utilizes the qualitative research method of grounded theory to develop theories and inform the construction of these interventions. We describe how grounded theory, a popular research method in the social sciences since the 1960s, can lend formality and structure to the common practice of simply asking students what they did and why they did it. Further, we aim to inform the reader not only about our emerging theories on interventions for planning but also how they might collect and analyze their own data in this and other areas that trouble novice programmers. In this way those who lecture and design CS1 interventions can do so from a more informed perspective.

  10. Operational Phase Life Cycle Assessment of Select NASA Ground Test Facilities

    Science.gov (United States)

    Sydnor, George H.; Marshall, Timothy J.; McGinnis, Sean

    2011-01-01

    NASA's Aeronautics Test Program (ATP) is responsible for many large, high-energy ground test facilities that accomplish the nation's most advanced aerospace research. In order to accomplish these national objectives, significant energy and resources are consumed. A select group of facilities was analyzed using life-cycle assessment (LCA) to determine carbon footprint and environmental impacts. Most of these impacts stem from electricity and natural gas consumption, used directly at the facility and to generate support processes such as compressed air and steam. Other activities were analyzed but determined to be smaller in scale and frequency, with relatively negligible environmental impacts. More specialized facilities use R-134a, R-14, jet fuels, or nitrogen gas, and these unique inputs can have a considerable effect on a facility's overall environmental impact. The results of this LCA will be useful to ATP and NASA as the nation looks to identify its top energy consumers and NASA looks to maximize research output and minimize environmental impact. Keywords: NASA, Aeronautics, Wind tunnel.
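
    The operational-phase footprint arithmetic described above reduces to multiplying annual energy use by emission factors. The sketch below shows that arithmetic with placeholder consumption figures and factors; none of the numbers are from the NASA study.

```python
# Toy illustration of operational-phase carbon-footprint arithmetic.
electricity_kwh = 12_000_000   # facility electricity use per year (assumed)
natural_gas_mj  = 40_000_000   # facility natural-gas use per year, MJ (assumed)

EF_ELEC = 0.45e-3    # t CO2e per kWh of grid electricity (grid-dependent; assumed)
EF_GAS  = 56.1e-6    # t CO2e per MJ of natural gas combusted (assumed)

footprint_t = electricity_kwh * EF_ELEC + natural_gas_mj * EF_GAS
print(f"operational carbon footprint ~ {footprint_t:,.0f} t CO2e/yr")
```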

  11. The Argonne Leadership Computing Facility 2010 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Drugan, C. (LCF)

    2011-05-09

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers

  12. 78 FR 18353 - Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility...

    Science.gov (United States)

    2013-03-26

    ... HUMAN SERVICES Food and Drug Administration Guidance for Industry: Blood Establishment Computer System... ``Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility'' dated April... establishment computer system validation program, consistent with recognized principles of software...

  13. An Experimental Facility to Validate Ground Source Heat Pump Optimisation Models for the Australian Climate

    Directory of Open Access Journals (Sweden)

    Yuanshen Lu

    2017-01-01

    Full Text Available Ground source heat pumps (GSHPs are one of the most widespread forms of geothermal energy technology. They utilise the near-constant temperature of the ground below the frost line to achieve energy-efficiencies two or three times that of conventional air-conditioners, consequently allowing a significant offset in electricity demand for space heating and cooling. Relatively mature GSHP markets are established in Europe and North America. GSHP implementation in Australia, however, is limited, due to high capital price, uncertainties regarding optimum designs for the Australian climate, and limited consumer confidence in the technology. Existing GSHP design standards developed in the Northern Hemisphere are likely to lead to suboptimal performance in Australia where demand might be much more cooling-dominated. There is an urgent need to develop Australia’s own GSHP system optimisation principles on top of the industry standards to provide confidence to bring the GSHP market out of its infancy. To assist in this, the Queensland Geothermal Energy Centre of Excellence (QGECE has commissioned a fully instrumented GSHP experimental facility in Gatton, Australia, as a publically-accessible demonstration of the technology and a platform for systematic studies of GSHPs, including optimisation of design and operations. This paper presents a brief review on current GSHP use in Australia, the technical details of the Gatton GSHP facility, and an analysis on the observed cooling performance of this facility to date.

  14. Torsion pendulum facility for ground testing of gravitational sensors for LISA

    CERN Document Server

    Hüller, M; Dolesi, R; Vitale, S; Weber, W J

    2002-01-01

    We report here on a torsion pendulum facility for ground-based testing of the Laser Interferometer Space Antenna (LISA) gravitational sensors. We aim to measure weak forces exerted by a capacitive position sensor on a lightweight version of the LISA test mass, suspended from a thin torsion fibre. This facility will permit measurement of the residual, springlike coupling between the test mass and the sensor and characterization of other stray forces relevant to LISA drag-free control. The expected force sensitivity of the proposed torsion pendulum is limited by the intrinsic thermal noise at approx 3x10 sup - sup 1 sup 3 N Hz sup - sup 1 sup / sup 2 at 1 mHz. We briefly describe the design and implementation of the apparatus, its expected performance and preliminary experimental data.

  15. Radioactive contamination in liquid wastes discharged to ground at the separations facilities through December, 1966

    Energy Technology Data Exchange (ETDEWEB)

    McMurray, B.J.

    1967-02-15

    This document summarizes the amounts of radioactive contamination discharged to ground from chemical separations and laboratory facilities through December, 1966. Detailed data for individual disposal sites are presented on a month-to-month basis for the period of January through December, 1966. Previous publications of this series are listed in the bibliography and may be referred to for specific information on measurements and radioactivity totals prior to January, 1966. Several changes in crib nomenclature were made during 1965. These changes are noted on the individual tables so reference may be made to them in previous reports.

  16. Status of the National Ignition Facility Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L; Bryant, R; Carey, R; Casavant, D; Edwards, O; Ferguson, W; Krammen, J; Larson, D; Lee, A; Ludwigsen, P; Miller, M; Moses, E; Nyholm, R; Reed, R; Shelton, R; Van Arsdall, P J; Wuest, C

    2003-10-13

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately 3/4 complete with over 750 thousand source lines of code having undergone off-line verification tests and deployed to the facility. NIF has entered the first phases of its laser commissioning program. NIF has now demonstrated the highest energy 1{omega}, 2{omega}, and 3{omega} beamlines in the world

  17. Correlation of horizontal and vertical components of strong ground motion for response-history analysis of safety-related nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Yin-Nan, E-mail: ynhuang@ntu.edu.tw [Dept. of Civil Engineering, National Taiwan University, No. 1, Sec. 4, Roosevelt Rd., Taipei 10617, Taiwan (China); Yen, Wen-Yi, E-mail: b01501059@ntu.edu.tw [Dept. of Civil Engineering, National Taiwan University, No. 1, Sec. 4, Roosevelt Rd., Taipei 10617, Taiwan (China); Whittaker, Andrew S., E-mail: awhittak@buffalo.edu [Dept. of Civil, Structural and Environmental Engineering, MCEER, State University of New York at Buffalo, Buffalo, NY 14260 (United States)

    2016-12-15

    Highlights: • The correlation of components of ground motion is studied using 1689 sets of records. • The data support an upper bound of 0.3 on the correlation coefficient. • The data support the related requirement in the upcoming edition of ASCE Standard 4. - Abstract: Design standards for safety-related nuclear facilities such as ASCE Standard 4-98 and ASCE Standard 43-05 require the correlation coefficient for two orthogonal components of ground motions for response-history analysis to be less than 0.3. The technical basis of this requirement was developed by Hadjian three decades ago using 50 pairs of recorded ground motions that were available at that time. In this study, correlation coefficients for (1) two horizontal components, and (2) the vertical component and one horizontal component, of a set of ground motions are computed using records from a ground-motion database compiled recently for large-magnitude shallow crustal earthquakes. The impact of the orientation of the orthogonal horizontal components on the correlation coefficient of ground motions is discussed. The rules in the forthcoming edition of ASCE Standard 4 for the correlation of components in a set of ground motions are shown to be reasonable.

  18. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computation science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  19. Computer simulation of ground coupled storage in a series solar assisted heat pump system

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, John W.; Metz, Philip D.

    1979-01-01

    A quantitative study of the effect of thermal coupling between the ground and the heat storage element of a series solar assisted heat pump system is presented. The transient simulation computer program TRNSYS is used to simulate the solar portion of this system. A program to simulate the thermal interaction of the storage element with the ground is incorporated into TRNSYS as a sub-routine. This program calculates heat flow through the ground in discrete steps over space and time. Boundary conditions are established. The ground coupled storage is driven by thermal inputs from the solar portion of the system and from the changing ambient and ground temperatures.

  20. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in propulsion test facility design and development by assessing risks, identifying failure modes and predicting...

  1. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in rocket engine test facility design and development by assessing risks, identifying failure modes and predicting...

  2. NASA HRP Plans for Collaboration at the IBMP Ground-Based Experimental Facility (NEK)

    Science.gov (United States)

    Cromwell, Ronita L.

    2016-01-01

    NASA and IBMP are planning research collaborations using the IBMP Ground-based Experimental Facility (NEK). The NEK offers unique capabilities to study the effects of isolation on behavioral health and performance as it relates to spaceflight. The NEK is comprised of multiple interconnected modules that range in size from 50-250m(sup3). Modules can be included or excluded in a given mission allowing for flexibility of platform design. The NEK complex includes a Mission Control Center for communications and monitoring of crew members. In an effort to begin these collaborations, a 2-week mission is planned for 2017. In this mission, scientific studies will be conducted to assess facility capabilities in preparation for longer duration missions. A second follow-on 2-week mission may be planned for early in 2018. In future years, long duration missions of 4, 8 and 12 months are being considered. Missions will include scenarios that simulate for example, transit to and from asteroids, the moon, or other interplanetary travel. Mission operations will be structured to include stressors such as, high workloads, communication delays, and sleep deprivation. Studies completed at the NEK will support International Space Station expeditions, and future exploration missions. Topics studied will include communication, crew autonomy, cultural diversity, human factors, and medical capabilities.

  3. The Advance of Computing from the Ground to the Cloud

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  4. Modular Extended-Stay HyperGravity Facility Design Concept: An Artificial-Gravity Space-Settlement Ground Analogue

    Science.gov (United States)

    Dorais, Gregory A.

    2015-01-01

    This document defines the design concept for a ground-based, extended-stay hypergravity facility as a precursor for space-based artificial-gravity facilities that extend the permanent presence of both human and non-human life beyond Earth in artificial-gravity settlements. Since the Earth's current human population is stressing the environment and the resources off-Earth are relatively unlimited, by as soon as 2040 more than one thousand people could be living in Earthorbiting artificial-gravity habitats. Eventually, the majority of humanity may live in artificialgravity habitats throughout this solar system as well as others, but little is known about the longterm (multi-generational) effects of artificial-gravity habitats on people, animals, and plants. In order to extend life permanently beyond Earth, it would be useful to create an orbiting space facility that generates 1g as well as other gravity levels to rigorously address the numerous challenges of such an endeavor. Before doing so, developing a ground-based artificial-gravity facility is a reasonable next step. Just as the International Space Station is a microgravity research facility, at a small fraction of the cost and risk a ground-based artificial-gravity facility can begin to address a wide-variety of the artificial-gravity life-science questions and engineering challenges requiring long-term research to enable people, animals, and plants to live off-Earth indefinitely.

  5. Grounded Theory in media research and the use of the computer

    NARCIS (Netherlands)

    Hijmans, E.J.S.; Peters, V.A.M.

    2000-01-01

    This article offers, from a perspective of Grounded Theory, a comprehensive summary of general procedures for qualitative analysis and the advantages of the use of the computer. The Grounded Theory Approach is one of the most elaborated methods in the field of interpretive analysis within which anal

  6. JPL Table Mountain Facility Support of the Ground/Orbiter Lasercomm Demonstration

    Science.gov (United States)

    Gillam, S. D.; Young, J. W.; Sidwell, D. R.

    1996-01-01

    On 23 nights between October 30, 1995, and January 13, 1996, the JPL Table Mountain Facility (TMF) was the site of the ground stations of the Ground/Orbiter Lasercomm Demonstration (GOLD). These 0.6-m and 1.2-m telescopes acted as terminals in a bent-pipe optical communications link. This link went from the ground to an optical communications transceiver terminal on the Japanese Engineering Test Satellite (ETS-VI) and back to the ground. This article describes how the TMF supported this novel optical communications experiment. This experiment was a collaborative effort between JPL, NASA's Deep Space Network (DSN), the Japanese National Aeronautics and Space Development Agency (NASDA), and the Japanese Communications Research Laboratory (CRL), which operates the ETS-VI. The 0.6-m telescope, in the coude configuration, was used to uplink a 514-nm modulated laser to the transceiver on the ETS-VI communications satellite. The 1.2-m telescope, in the Cassegrain configuration, was used to detect an 830-nm diode laser signal downlinked from the ETS-VI terminal. The downlink was sent only if the uplink beam was detected. The uplink beam had to be kept within a box 5 arcsec on a side and centered on the position of the ETS-VI. This required that the 0.6-m telescope track the ETS-VI to a precision of ~2 arcsec. The 1.2-m telescope was required to track to a precision of 4{5 arcsec because the downlink detector had an aperture with a 13-arcsec-diameter field of view. This article describes how the above tracking performance was met by both telescopes. Equipment designed for the experiment at the transmitter and receiver stations, acquisition methods, and software developed to support this project are discussed, as are experiments performed to establish the suitability of the TMF telescopes for this demonstration. This article discusses upgrades to the TMF electrical power system needed to support GOLD; mechanical, optical, and servo-control aspects of the transmitter and

  7. A methodology for assessing computer software applicability to inventory and facility management

    OpenAIRE

    Paul, Debashis

    1989-01-01

    Computer applications have become popular and widespread in architecture and other related fields. While the architect uses a computer for design and construction of a building, the user takes the advantage of computer for maintenance of the building. Inventory and facility management are two such fields where computer applications have become predominant. The project has investigated the use and application of different commercially available computer software in the above men...

  8. National facility for advanced computational science: A sustainable path to scientific discovery

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  9. A Comparison of Space and Ground Based Facility Environmental Effects for FEP Teflon. Revised

    Science.gov (United States)

    Rutledge, Sharon K.; Banks, Bruce A.; Kitral, Michael

    1998-01-01

    Fluorinated Ethylene Propylene (FEP) Teflon is widely used as a thermal control material for spacecraft, however, it is susceptible to erosion, cracking, and subsequent mechanical failure in low Earth orbit. One of the difficulties in determining whether FEP Teflon will survive during a mission is the wide disparity of erosion rates observed for this material in space and in ground based facilities. Each environment contains different levels of atomic oxygen, ions, and vacuum ultraviolet (VUV) radiation in addition to parameters such as the energy of the arriving species and temperature. These variations make it difficult to determine what is causing the observed differences in erosion rates. This paper attempts to narrow down which factors affect the erosion rate of FEP Teflon through attempting to change only one environmental constituent at a time. This was attempted through the use of a single simulation facility (plasma asher) environment with a variety of Faraday cages and VUV transparent windows. Isolating one factor inside of a radio frequency (RF) plasma proved to be very difficult. Two observations could be made. First, it appears that the erosion yield of FEP Teflon with respect to that of polyimide Kapton is not greatly affected by the presence or lack of VUV radiation present in the RF plasma and the relative erosion yield for the FEP Teflon may decrease with increasing fluence. Second, shielding from charged particles appears to lower the relative erosion yield of the FEP to approximately that observed in space, however it is difficult to determine for sure whether ions, electrons, or some other components are causing the enhanced erosion.

  10. Active suspension design for a Large Space Structure ground test facility

    Science.gov (United States)

    Lange, Thomas J. H.; Schlegel, Clemens

    1993-01-01

    The expected future high performance requirements for Large Space Structures (LSS) enforce technology innovations such as active vibration damping techniques e.g., by means of structure sensors and actuators. The implementation of new technologies like that requires an interactive and integrated structural and control design with an increased effort in hardware validation by ground testing. During the technology development phase generic system tests will be most important covering verification and validation aspects up to the preparation and definition of relevant space experiments. For many applications using advanced designs it is deemed necessary to improve existing testing technology by further reducing disturbances and gravity coupling effects while maintaining high performance reliability. A key issue in this context is the improvement of suspension techniques. The ideal ground test facility satisfying these requirements completely will never be found. The highest degree of reliability will always be obtained by passive suspension methods taking into account severe performance limitations such as non-zero rigid body modes, restriction of degrees of freedom of motion and frequency response limitations. Passive compensation mechanisms, e.g., zero-spring-rate mechanisms, either require large moving masses or they are limited with respect to low-frequency performance by friction, stiction or other non-linear effects. With active suspensions these limitations can be removed to a large extent thereby increasing the range of applications. Despite an additional complexity which is associated with a potential risk in reliability their development is considered promising due to the amazing improvement of real-time control technology which is still continuing.

  11. Computing spatial correlation of ground motion intensities for ShakeMap

    Science.gov (United States)

    Verros, Sarah A.; Wald, David J.; Worden, C. Bruce; Hearne, Mike; Ganesh, Mahadevan

    2017-02-01

    Modeling the spatial correlation of ground motion residuals, caused by coherent contributions from source, path, and site, can provide valuable loss and hazard information, as well as a more realistic depiction of ground motion intensities. The U.S. Geological Survey (USGS) software package, ShakeMap, utilizes a deterministic empirical approach to estimate median ground shaking in conjunction with observed seismic data. ShakeMap-based shaking estimates are used in concert with loss estimation algorithms to estimate fatalities and economic losses after significant seismic events around the globe. Incorporating the spatial correlation of ground motion residuals has been shown to improve seismic loss estimates. In particular, Park, Bazzuro, and Baker (Applications of Statistics and Probability in Civil Engineering, 2007) investigated computing spatially correlated random fields of residuals. However, for large scale ShakeMap grids, computational requirements of the method are prohibitive. In this work, a memory efficient algorithm is developed to compute the random fields and implemented using the ShakeMap framework. This new, iterative parallel algorithm is based on decay properties of an associated ground motion correlation function and is shown to significantly reduce computational requirements associated with adding spatial variability to the ShakeMap ground motion estimates. Further, we demonstrate and quantify the impact of adding peak ground motion spatial variability on resulting earthquake loss estimates.

  12. [Use of personal computers in forensic medicine facilities].

    Science.gov (United States)

    Vorel, F

    1995-08-01

    The authors present a brief account of possibilities to use computers, type PC, in departments of forensic medicine and discuss basic technical and programme equipment. In the author's opinion the basic reason for using computers is to create an extensive database of post-mortem findings which would make it possible to process them on a large scale and use them for research and prevention. Introduction of computers depends on the management of the department and it is necessary to persuade workers-future users of computers-of the advantages associated with their use.

  13. A Computational Theory of Grounding in Natural Language Conversation.

    Science.gov (United States)

    1994-12-01

    I I I I I I I ix I I I List of Tables I 2.1 Clark & Marshall’s Methods of Achieving Copresence for Mutual Knowl- edge...utterances were understood: "uh-huh", "yeah", "right", "okay." There is also a third type of acknowledgement common to both media . In computer...infer mutual belief if all of those check out. Copresence heuristics involve the I agents recognizing that they and the object of mutual knowledge are

  14. Language Facilities for Programming User-Computer Dialogues.

    Science.gov (United States)

    Lafuente, J. M.; Gries, D.

    1978-01-01

    Proposes extensions to PASCAL that provide for programing man-computer dialogues. An interactive dialogue application program is viewed as a sequence of frames and separate computational steps. PASCAL extensions allow the description of the items of information in each frame and the inclusion of behavior rules specifying the interactive dialogue.…

  15. On-Orbit and Ground Performance of the PGBA Plant Growth Facility

    Science.gov (United States)

    Hoehn, A.; Chamberlain, D. J.; Forsyth, S. W.; Hanna, D. S.; Scovazzo, P.; Stodieck, L. S.; Heyenga, G.; Kliss, Mark

    1997-01-01

    PGBA, a plant growth facility developed for commercial space biotechnology research, successfully grew a total of 30 plants (6 species) for 10 days on board the Space Shuttle Endeavour (STS-77) and is scheduled for reflight on board MSL-1 (STS-83) for a 16 day flight. The PGBA life support systems provide atmospheric, thermal, and humidity control as well as lighting and nutrient supply in a 23.6 liter chamber. Atmosphere treatment includes ethylene and other hydrocarbon removal, CO2 replenishment, and O2 control. The normally closed system uses controlled CO2 replenishment from the crew cabin as required by the plants. Temperature is controlled (1 C) at user-specified setpoints between 20-32 C, using water-filled coolant loops, solid state Peltier thermoelectric devices, and liquid heat exchangers. The thermoelectric cooling systems were optimized for low power consumption and high cooling efficiencies. Relative humidity is maintained between 60-100% using a cooled porous metal plate to remove water vapor from the air stream without cooling the bulk air below the dew point. The lighting system utilizes three compact fluorescent bi-axial lights with variable lighting control and light intensity (PAR) between 220 and 330 micromol/sq m/s at a distance of 20 cm in spaceflight configuration (on orbit power limited to 230 Watt for entire payload). A ground, up to 550 micromol/sq m/s light intensity can be achieved with 330 Watt payload power consumption. Plant water and nutrient support is sustained via the 'Nutrient Pack' system including the passive or active 'Water Replenishable Nutrient Pack.' The root matrix material (soil or Agar) and nutrient formulation of each pack is prepared according to plant species and experimental requirements. These systems were designed by NASA Ames personnel. Data acquisition and control systems provide 32 channels of environmental data as well as digitized or analog video signals for downlink.

  16. Computing spatial correlation of ground motion intensities for ShakeMap

    Science.gov (United States)

    Verros, Sarah; Wald, David J.; Worden, Charles; Hearne, Mike; Ganesh, Mahadevan

    2017-01-01

    Modeling the spatial correlation of ground motion residuals, caused by coherent contributions from source, path, and site, can provide valuable loss and hazard information, as well as a more realistic depiction of ground motion intensities. The U.S. Geological Survey (USGS) software package, ShakeMap, utilizes a deterministic empirical approach to estimate median ground shaking in conjunction with observed seismic data. ShakeMap-based shaking estimates are used in concert with loss estimation algorithms to estimate fatalities and economic losses after significant seismic events around the globe. Incorporating the spatial correlation of ground motion residuals has been shown to improve seismic loss estimates. In particular, Park, Bazzuro, and Baker (Applications of Statistics and Probability in Civil Engineering, 2007) investigated computing spatially correlated random fields of residuals. However, for large scale ShakeMap grids, computational requirements of the method are prohibitive. In this work, a memory efficient algorithm is developed to compute the random fields and implemented using the ShakeMap framework. This new, iterative parallel algorithm is based on decay properties of an associated ground motion correlation function and is shown to significantly reduce computational requirements associated with adding spatial variability to the ShakeMap g

  17. Preparing ground States of quantum many-body systems on a quantum computer.

    Science.gov (United States)

    Poulin, David; Wocjan, Pawel

    2009-04-03

    Preparing the ground state of a system of interacting classical particles is an NP-hard problem. Thus, there is in general no better algorithm to solve this problem than exhaustively going through all N configurations of the system to determine the one with lowest energy, requiring a running time proportional to N. A quantum computer, if it could be built, could solve this problem in time sqrt[N]. Here, we present a powerful extension of this result to the case of interacting quantum particles, demonstrating that a quantum computer can prepare the ground state of a quantum system as efficiently as it does for classical systems.

  18. Identification of ground motion features for high-tech facility under far field seismic waves using wavelet packet transform

    Science.gov (United States)

    Huang, Shieh-Kung; Loh, Chin-Hsiung; Chen, Chin-Tsun

    2016-04-01

    Seismic records collected from earthquake with large magnitude and far distance may contain long period seismic waves which have small amplitude but with dominant period up to 10 sec. For a general situation, the long period seismic waves will not endanger the safety of the structural system or cause any uncomfortable for human activity. On the contrary, for those far distant earthquakes, this type of seismic waves may cause a glitch or, furthermore, breakdown to some important equipments/facilities (such as the high-precision facilities in high-tech Fab) and eventually damage the interests of company if the amplitude becomes significant. The previous study showed that the ground motion features such as time-variant dominant frequencies extracted using moving window singular spectrum analysis (MWSSA) and amplitude characteristics of long-period waves identified from slope change of ground motion Arias Intensity can efficiently indicate the damage severity to the high-precision facilities. However, embedding a large hankel matrix to extract long period seismic waves make the MWSSA become a time-consumed process. In this study, the seismic ground motion data collected from broadband seismometer network located in Taiwan were used (with epicenter distance over 1000 km). To monitor the significant long-period waves, the low frequency components of these seismic ground motion data are extracted using wavelet packet transform (WPT) to obtain wavelet coefficients and the wavelet entropy of coefficients are used to identify the amplitude characteristics of long-period waves. The proposed method is a timesaving process compared to MWSSA and can be easily implemented for real-time detection. Comparison and discussion on this method among these different seismic events and the damage severity to the high-precision facilities in high-tech Fab is made.

  19. Computer Aided Design of Transformer Station Grounding System Using CDEGS Software

    Directory of Open Access Journals (Sweden)

    S. Nikolovski

    2004-01-01

    Full Text Available This paper presents a computer-aided design of a transformer station grounding system. Fault conditions in a transformer station can produce huge damage to transformer station equipment if the grounding system is not designed properly. A well designed grounding system is a very important part of the project for transformer station design as a whole. This paper analyses a procedure for transformer grounding system design and spatial distribution of touch and step voltage on the ground surface level, using the CDEGS (Current Distribution Electromagnetic Interference Grounding and Soil Structure Analysis software. Spatial distribution is needed for checking and finding dangerous step and touch voltages above and around the transformer station. Apparent earth resistivity data is measured and analyzed using the RESAP module of the CDEGS software. Because of the very high current flow into the grounding system during a single line to ground fault or a three phase fault in the transformer station, very high and dangerous potentials can be induced on the metallic structures including the fence, which can cause dangerous situations for people and animals near the station and for the personnel inside the station. The PLOT module of CDEGS is used to view the results of the scalar potential, step and touch voltage on the surface. Graphic displays include equipotent contour lines and potential profiles (gradients in 3D and 2D perspective and apparent soil resistivity (Wm versus inter electrode spacing (m. The results of alternative grid designs may be displayed simultaneously for the purpose of comparison.

  20. Using High Performance Computing to Realize a System-Level RDDO for Military Ground Vehicles

    Science.gov (United States)

    2008-07-14

    Using High Performance Computing to Realize a System-Level RBDO for Military Ground Vehicles • David A. Lamb, Ph.D. • Computational Reliability and...fictitious load cases is number of design variables X number of static load cases (6 X 24 = 144 for Stryker A-arm). RBDO Flowchart Pre-processor Morpher...Based Geometry Morpher Mesh Finite Element Analysis Durability Sensitivity RBDO /PBDO FE Analysis FE re-analysis for DSA Sensitivity of SIC and Fatigue

  1. High Resolution Muon Computed Tomography at Neutrino Beam Facilities

    CERN Document Server

    Suerfu, Burkhant

    2015-01-01

    X-ray computed tomography (CT) has an indispensable role in constructing 3D images of objects made from light materials. However, limited by absorption coefficients, X-rays cannot deeply penetrate materials such as copper and lead. Here we show via simulation that muon beams can provide high resolution tomographic images of dense objects and of structures within the interior of dense objects. The effects of resolution broadening from multiple scattering diminish with increasing muon momentum. As the momentum of the muon increases, the contrast of the image goes down and therefore requires higher resolution in the muon spectrometer to resolve the image. The variance of the measured muon momentum reaches a minimum and then increases with increasing muon momentum. The impact of the increase in variance is to require a higher integrated muon flux to reduce fluctuations. The flux requirements and level of contrast needed for high resolution muon computed tomography are well matched to the muons produced in the pio...

  2. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, J; Sartirana, A

    2001-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on thei...

  3. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, Jose

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on the...

  4. Characterization of 618-11 solid waste burial ground, disposed waste, and description of the waste generating facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hladek, K.L.

    1997-10-07

    The 618-11 (Wye or 318-11) burial ground received transuranic (TRTJ) and mixed fission solid waste from March 9, 1962, through October 2, 1962. It was then closed for 11 months so additional burial facilities could be added. The burial ground was reopened on September 16, 1963, and continued operating until it was closed permanently on December 31, 1967. The burial ground received wastes from all of the 300 Area radioactive material handling facilities. The purpose of this document is to characterize the 618-11 solid waste burial ground by describing the site, burial practices, the disposed wastes, and the waste generating facilities. This document provides information showing that kilogram quantities of plutonium were disposed to the drum storage units and caissons, making them transuranic (TRU). Also, kilogram quantities of plutonium and other TRU wastes were disposed to the three trenches, which were previously thought to contain non-TRU wastes. The site burial facilities (trenches, caissons, and drum storage units) should be classified as TRU and the site plutonium inventory maintained at five kilograms. Other fissile wastes were also disposed to the site. Additionally, thousands of curies of mixed fission products were also disposed to the trenches, caissons, and drum storage units. Most of the fission products have decayed over several half-lives, and are at more tolerable levels. Of greater concern, because of their release potential, are TRU radionuclides, Pu-238, Pu-240, and Np-237. TRU radionuclides also included slightly enriched 0.95 and 1.25% U-231 from N-Reactor fuel, which add to the fissile content. The 618-11 burial ground is located approximately 100 meters due west of Washington Nuclear Plant No. 2. The burial ground consists of three trenches, approximately 900 feet long, 25 feet deep, and 50 feet wide, running east-west. The trenches constitute 75% of the site area. There are 50 drum storage units (five 55-gallon steel drums welded together

  5. Phenomenography and Grounded Theory as Research Methods in Computing Education Research Field

    Science.gov (United States)

    Kinnunen, Paivi; Simon, Beth

    2012-01-01

    This paper discusses two qualitative research methods, phenomenography and grounded theory. We introduce both methods' data collection and analysis processes and the type or results you may get at the end by using examples from computing education research. We highlight some of the similarities and differences between the aim, data collection and…

  6. Coping with the Stigma of Mental Illness: Empirically-Grounded Hypotheses from Computer Simulations

    Science.gov (United States)

    Kroska, Amy; Har, Sarah K.

    2011-01-01

    This research demonstrates how affect control theory and its computer program, "Interact", can be used to develop empirically-grounded hypotheses regarding the connection between cultural labels and behaviors. Our demonstration focuses on propositions in the modified labeling theory of mental illness. According to the MLT, negative societal…

  7. Phenomenography and Grounded Theory as Research Methods in Computing Education Research Field

    Science.gov (United States)

    Kinnunen, Paivi; Simon, Beth

    2012-01-01

    This paper discusses two qualitative research methods, phenomenography and grounded theory. We introduce both methods' data collection and analysis processes and the type or results you may get at the end by using examples from computing education research. We highlight some of the similarities and differences between the aim, data collection and…

  8. Coping with the Stigma of Mental Illness: Empirically-Grounded Hypotheses from Computer Simulations

    Science.gov (United States)

    Kroska, Amy; Har, Sarah K.

    2011-01-01

    This research demonstrates how affect control theory and its computer program, "Interact", can be used to develop empirically-grounded hypotheses regarding the connection between cultural labels and behaviors. Our demonstration focuses on propositions in the modified labeling theory of mental illness. According to the MLT, negative societal…

  9. The computation of the terrain correction close to ground stations in GTE software

    Science.gov (United States)

    Capponi, Martina; Sampietro, Daniele

    2017-04-01

    In many geophysical and geodetic applications related to the gravitational field, the detailed modeling of the vertical component of the gravitational attraction due to topographic masses, represents a major issue. In fact, the increasing resolution of recently developed DTM, the increasing number of observation points and the increasing accuracy of gravity data demand the computation of a very accurate terrain correction (TC) of a fine DTM on large areas. As well known, classical methods such as prism or point masses approximations are indeed too slow while Fourier based techniques are usually too approximate if compared to the required accuracy. In 2016 GReD and Politecnico di Milano developed a new software, called GTE, based on an hybrid FFT-prism algorithm to compute TC for airborne observations. In this work we present the improvements of the GTE software to compute TC also at ground level. This requires to modify the FFT algorithm previously implemented and to properly handle the DTM slope close to the observation ground station. In order to resolve the latter problem, different algorithms, namely triangulated polyhedrons, ultra high resolution squared prisms and segmented concentric cylindrical rings centred on the station, have been tested to define an optimal method. Some tests to analyse the computational time and the accuracy obtained with each method are here presented and the performances of the improved GTE software to compute terrain corrections on ground stations are presented too. In details, the performed tests show that the algorithm is able to compute the TC from a DTM of 1001 × 1001 cells on the same grid in less than 5 minutes with accuracies of the order of 0.002 mGal, degradating to 0.2 mGal when computed on the ground stations.

  10. Studies of Plasma Instability Processes Excited by Ground Based High Power HF ("Heating") Facilities

    Science.gov (United States)

    2001-04-01

    and Megill, 1974; Carlson, 1974; Bernhardt et al., 1989). During the last years new interesting results have been obtained at HAARP facility (Peterson...Haslett and Megill, 1974; Carlson, 1974; Bernhardt et al., 1989). During the last years new interesting results have been obtained at HAARP facility

  11. Peta-scale QMC simulations on DOE leadership computing facilities

    Science.gov (United States)

    Kim, Jeongnim; Ab Initio Network Collaboration

    2014-03-01

    Continuum quantum Monte Carlo (QMC) has proved to be an invaluable tool for predicting the properties of matter from fundamental principles. Even with numerous innovations in methods, algorithms and codes, QMC simulations of realistic problems of 1000s and more electrons are demanding, requiring millions of core hours to achieve the target chemical accuracy. The multiple forms of parallelism afforded by QMC algorithms and high compute-to-communication ratio make them ideal candidates for acceleration in the multi/many-core paradigm. We have ported and tuned QMCPACK to recently deployed DOE doca-petaflop systems, Titan (Cray XK7 CPU/GPGPU) and Mira (IBM Blue Gene/Q). The efficiency gains through improved algorithms and architecture-specific tuning and, most importantly, the vast increase in computing powers have opened up opportunities to apply QMC at unprecedent scales, accuracy and time-to-solution. We present large-scale QMC simulations to study energetics of layered materials where vdW interactions play critical roles. Collaboration supported through the Predictive Theory and Modeling for Materials and Chemical Science program by the Basic Energy Science, Department of Energy.

  12. Computer usage among nurses in rural health-care facilities in South Africa: obstacles and challenges.

    Science.gov (United States)

    Asah, Flora

    2013-04-01

    This study discusses factors inhibiting computer usage for work-related tasks among computer-literate professional nurses within rural healthcare facilities in South Africa. In the past two decades computer literacy courses have not been part of the nursing curricula. Computer courses are offered by the State Information Technology Agency. Despite this, there seems to be limited use of computers by professional nurses in the rural context. Focus group interviews held with 40 professional nurses from three government hospitals in northern KwaZulu-Natal. Contributing factors were found to be lack of information technology infrastructure, restricted access to computers and deficits in regard to the technical and nursing management support. The physical location of computers within the health-care facilities and lack of relevant software emerged as specific obstacles to usage. Provision of continuous and active support from nursing management could positively influence computer usage among professional nurses. A closer integration of information technology and computer literacy skills into existing nursing curricula would foster a positive attitude towards computer usage through early exposure. Responses indicated that change of mindset may be needed on the part of nursing management so that they begin to actively promote ready access to computers as a means of creating greater professionalism and collegiality. © 2011 Blackwell Publishing Ltd.

  13. Finite Volume Based Computer Program for Ground Source Heat Pump System

    Energy Technology Data Exchange (ETDEWEB)

    Menart, James A. [Wright State University

    2013-02-22

    This report is a compilation of the work that has been done on the grant DE-EE0002805 entitled ?Finite Volume Based Computer Program for Ground Source Heat Pump Systems.? The goal of this project was to develop a detailed computer simulation tool for GSHP (ground source heat pump) heating and cooling systems. Two such tools were developed as part of this DOE (Department of Energy) grant; the first is a two-dimensional computer program called GEO2D and the second is a three-dimensional computer program called GEO3D. Both of these simulation tools provide an extensive array of results to the user. A unique aspect of both these simulation tools is the complete temperature profile information calculated and presented. Complete temperature profiles throughout the ground, casing, tube wall, and fluid are provided as a function of time. The fluid temperatures from and to the heat pump, as a function of time, are also provided. In addition to temperature information, detailed heat rate information at several locations as a function of time is determined. Heat rates between the heat pump and the building indoor environment, between the working fluid and the heat pump, and between the working fluid and the ground are computed. The heat rates between the ground and the working fluid are calculated as a function time and position along the ground loop. The heating and cooling loads of the building being fitted with a GSHP are determined with the computer program developed by DOE called ENERGYPLUS. Lastly COP (coefficient of performance) results as a function of time are provided. Both the two-dimensional and three-dimensional computer programs developed as part of this work are based upon a detailed finite volume solution of the energy equation for the ground and ground loop. Real heat pump characteristics are entered into the program and used to model the heat pump performance. Thus these computer tools simulate the coupled performance of the ground loop and the heat pump

  14. Recovery Act: Finite Volume Based Computer Program for Ground Source Heat Pump Systems

    Energy Technology Data Exchange (ETDEWEB)

    James A Menart, Professor

    2013-02-22

    This report is a compilation of the work that has been done on the grant DE-EE0002805 entitled Finite Volume Based Computer Program for Ground Source Heat Pump Systems. The goal of this project was to develop a detailed computer simulation tool for GSHP (ground source heat pump) heating and cooling systems. Two such tools were developed as part of this DOE (Department of Energy) grant; the first is a two-dimensional computer program called GEO2D and the second is a three-dimensional computer program called GEO3D. Both of these simulation tools provide an extensive array of results to the user. A unique aspect of both these simulation tools is the complete temperature profile information calculated and presented. Complete temperature profiles throughout the ground, casing, tube wall, and fluid are provided as a function of time. The fluid temperatures from and to the heat pump, as a function of time, are also provided. In addition to temperature information, detailed heat rate information at several locations as a function of time is determined. Heat rates between the heat pump and the building indoor environment, between the working fluid and the heat pump, and between the working fluid and the ground are computed. The heat rates between the ground and the working fluid are calculated as a function time and position along the ground loop. The heating and cooling loads of the building being fitted with a GSHP are determined with the computer program developed by DOE called ENERGYPLUS. Lastly COP (coefficient of performance) results as a function of time are provided. Both the two-dimensional and three-dimensional computer programs developed as part of this work are based upon a detailed finite volume solution of the energy equation for the ground and ground loop. Real heat pump characteristics are entered into the program and used to model the heat pump performance. Thus these computer tools simulate the coupled performance of the ground loop and the heat pump. The

  15. Direction and Integration of Experimental Ground Test Capabilities and Computational Methods

    Science.gov (United States)

    Dunn, Steven C.

    2016-01-01

    This paper groups and summarizes the salient points and findings from two AIAA conference panels targeted at defining the direction, with associated key issues and recommendations, for the integration of experimental ground testing and computational methods. Each panel session utilized rapporteurs to capture comments from both the panel members and the audience. Additionally, a virtual panel of several experts were consulted between the two sessions and their comments were also captured. The information is organized into three time-based groupings, as well as by subject area. These panel sessions were designed to provide guidance to both researchers/developers and experimental/computational service providers in defining the future of ground testing, which will be inextricably integrated with the advancement of computational tools.

  16. Computer/information security design approaches for Complex 21/Reconfiguration facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hunteman, W.J.; Zack, N.R. [Los Alamos National Lab., NM (United States). Safeguards Systems Group; Jaeger, C.D. [Sandia National Labs., Albuquerque, NM (United States). Surety/Dismantlement Dept.

    1993-12-31

    Los Alamos National Laboratory and Sandia National Laboratories have been designated the technical lead laboratories to develop the design of the computer/information security, safeguards, and physical security systems for all of the DOE Complex 21/Reconfiguration facilities. All of the automated information processing systems and networks in these facilities will be required to implement the new DOE orders on computer and information security. The planned approach for a highly integrated information processing capability in each of the facilities will require careful consideration of the requirements in DOE Orders 5639.6 and 1360.2A. The various information protection requirements and user clearances within the facilities will also have a significant effect on the design of the systems and networks. Fulfilling the requirements for proper protection of the information and compliance with DOE orders will be possible because the computer and information security concerns are being incorporated in the early design activities. This paper will discuss the computer and information security issues being addressed in the integrated design effort for the tritium, uranium/lithium, plutonium, plutonium storage, and high explosive/assembly facilities.

  17. Computer/information security design approaches for Complex 21/Reconfiguration facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hunteman, W.J.; Zack, N.R. [Los Alamos National Lab., NM (United States); Jaeger, C.D. [Sandia National Labs., Albuquerque, NM (United States)

    1993-08-01

    Los Alamos National Laboratory and Sandia National Laboratories have been designated the technical lead laboratories to develop the design of the computer/information security, safeguards, and physical security systems for all of the DOE Complex 21/Reconfiguration facilities. All of the automated information processing systems and networks in these facilities will be required to implement the new DOE orders on computer and information security. The planned approach for a highly integrated information processing capability in each of the facilities will require careful consideration of the requirements in DOE Orders 5639.6 and 1360.2A. The various information protection requirements and user clearances within the facilities will also have a significant effect on the design of the systems and networks. Fulfilling the requirements for proper protection of the information and compliance with DOE orders will be possible because the computer and information security concerns are being incorporated in the early design activities. This paper will discuss the computer and information security issues being addressed in the integrated design effort for the tritium, uranium/lithium, plutonium, plutonium storage, and high explosive/assembly facilities.

  18. Federal Technology Alert: Ground-Source Heat Pumps Applied to Federal Facilities--Second Edition

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, Donald L.

    2001-03-01

    This Federal Technology Alert, which was sponsored by the U.S. Department of Energy's Office of Federal Energy Management Programs, provides the detailed information and procedures that a Federal energy manager needs to evaluate most ground-source heat pump applications. This report updates an earlier report on ground-source heat pumps that was published in September 1995. In the current report, general benefits of this technology to the Federal sector are described, as are ground-source heat pump operation, system types, design variations, energy savings, and other benefits. In addition, information on current manufacturers, technology users, and references for further reading are provided.

  19. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and offer a considerably less rich operating environment than is in common use in HEP, but they also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, are discussed.

  20. Computational Simulation of Dynamic Response of Vehicle Tatra T815 and the Ground

    Science.gov (United States)

    Vlček, Jozef; Valašková, Veronika

    2016-10-01

    The effect of a moving load represents an actual problem analysed in engineering practice. The response of the vehicle and its dynamic effect on the pavement can be analysed experimentally or computationally. The aim of this paper was to perform computer simulations of vehicle-ground interaction. For this purpose, a half model of the heavy lorry Tatra 815 and of the ground was built in the FEM-based computational programmes ADINA and PLAXIS, complemented by analytical approaches. Two procedures were then selected for further calculations. The first is based on simplifying the stiffer pavement layers to a beam element supported by springs that simulate the subgrade layers, using the Winkler-Pasternak theory of the elastic half-space. The modulus of subgrade reaction was determined in the standard programme through the simulation of a plate load test. The second approach considers a multi-layered ground system with layers of different thicknesses and material properties. For comparison of the outputs of both approaches, the same input values were used for every calculation procedure. A crucial parameter for the simulations was the velocity of the passing vehicle with regard to the ground response to the impulse of the pass. Lower velocities result in an almost static response of the pavement, but higher velocities induce a response that is better described by dynamic theory. For small deformations, an elastic material model seems sufficient to define the ground response to the moving load, but for larger deformations advanced material models for the ground would be more reliable.
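    The first procedure described above (pavement reduced to a beam on springs) admits a compact closed-form check. The sketch below is not the ADINA/PLAXIS model; it evaluates the classical static solution for a point load on an infinite beam resting on a plain Winkler foundation (the Pasternak shear layer is omitted), and every numerical value is an illustrative placeholder.

```python
# Static point load on an infinite beam on a Winkler foundation: closed-form
# deflection and bending moment. All inputs are hypothetical, not the paper's.
import numpy as np

E  = 30e9            # Young's modulus of the pavement beam [Pa]
b, h = 1.0, 0.25     # beam width and thickness [m]
I  = b * h**3 / 12   # second moment of area [m^4]
ks = 80e6            # modulus of subgrade reaction [N/m^3] (e.g. from a simulated plate load test)
k  = ks * b          # foundation modulus per unit length of beam [N/m^2]
P  = 50e3            # static wheel load [N]

beta = (k / (4 * E * I)) ** 0.25          # characteristic wavenumber [1/m]
x = np.linspace(0.0, 6.0, 601)            # distance from the load [m]

w = (P * beta / (2 * k)) * np.exp(-beta * x) * (np.cos(beta * x) + np.sin(beta * x))
M = (P / (4 * beta)) * np.exp(-beta * x) * (np.cos(beta * x) - np.sin(beta * x))

print(f"deflection under the load : {w[0]*1e3:.2f} mm")
print(f"bending moment under load : {M[0]/1e3:.1f} kN*m")
```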

  1. Operational Circular nr 5 - October 2000 USE OF CERN COMPUTING FACILITIES

    CERN Multimedia

    Division HR

    2000-01-01

    New rules covering the use of CERN Computing facilities have been drawn up. All users of CERN’s computing facilities are subject to these rules, as well as to the subsidiary rules of use. The Computing Rules explicitly address your responsibility for taking reasonable precautions to protect computing equipment and accounts. In particular, passwords must not be easily guessed or obtained by others. Given the difficulty of completely separating work and personal use of computing facilities, the rules define under which conditions limited personal use is tolerated. For example, limited personal use of e-mail, news groups or web browsing is tolerated in your private time, provided CERN resources and your official duties are not adversely affected. The full conditions governing use of CERN’s computing facilities are contained in Operational Circular N° 5, which you are requested to read. Full details are available at : http://www.cern.ch/ComputingRules Copies of the circular are also available in the Divis...

  2. Ground Water Monitoring Requirements for Hazardous Waste Treatment, Storage and Disposal Facilities

    Science.gov (United States)

    The groundwater monitoring requirements for hazardous waste treatment, storage and disposal facilities (TSDFs) are just one aspect of the Resource Conservation and Recovery Act (RCRA) hazardous waste management strategy for protecting human health and the

  3. Computing multiple aggregation levels and contextual features for road facilities recognition using mobile laser scanning data

    Science.gov (United States)

    Yang, Bisheng; Dong, Zhen; Liu, Yuan; Liang, Fuxun; Wang, Yongjun

    2017-04-01

    Updating the inventory of road infrastructure based on field work is labor intensive, time consuming, and costly. Fortunately, vehicle-based mobile laser scanning (MLS) systems provide an efficient solution to rapidly capture three-dimensional (3D) point clouds of road environments with high flexibility and precision. However, robust recognition of road facilities from huge volumes of 3D point clouds is still a challenging issue because of complicated and incomplete structures, occlusions and varied point densities. Most existing methods utilize point- or object-based features to recognize object candidates, and can only extract limited types of objects with a relatively low recognition rate, especially for incomplete and small objects. To overcome these drawbacks, this paper proposes a semantic labeling framework that combines multiple aggregation levels (point-segment-object) of features and contextual features to recognize road facilities, such as road surfaces, road boundaries, buildings, guardrails, street lamps, traffic signs, roadside trees, power lines, and cars, for highway infrastructure inventory. The proposed method first identifies ground and non-ground points, and extracts road surface facilities from the ground points. Non-ground points are segmented into individual candidate objects based on the proposed multi-rule region growing method. Then, the multiple aggregation levels of features and the contextual features (relative positions, relative directions, and spatial patterns) associated with each candidate object are calculated and fed into an SVM classifier to label the corresponding candidate object. The recognition performance of combining multiple aggregation levels and contextual features was compared with single-level (point, segment, or object) features using large-scale highway scene point clouds. Comparative studies demonstrated that the proposed semantic labeling framework significantly improves road facilities recognition.
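    The classification stage of a pipeline like the one described above can be sketched generically. The fragment below is not the authors' implementation: the per-object features (bounding-box extents, covariance-based shape descriptors, a contextual distance cue) are illustrative stand-ins for the point-, segment-, object-level and contextual features of the paper, and the training data are random placeholders fed to a standard scikit-learn SVM.

```python
# Generic sketch of per-object feature extraction plus SVM labeling.
# Feature choices and data are illustrative, not the authors' pipeline.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def object_features(points, neighbour_centroids):
    """points: (N, 3) array of one candidate object; neighbour_centroids: (M, 3) array."""
    centroid = points.mean(axis=0)
    extent = points.max(axis=0) - points.min(axis=0)              # object-level bounding box
    evals = np.sort(np.linalg.eigvalsh(np.cov(points.T)))[::-1]   # point-level shape statistics
    linearity = (evals[0] - evals[1]) / evals[0]
    planarity = (evals[1] - evals[2]) / evals[0]
    # contextual cue: mean horizontal distance to neighbouring objects
    d = np.linalg.norm((neighbour_centroids - centroid)[:, :2], axis=1)
    return np.array([extent[0], extent[1], extent[2], linearity, planarity, d.mean()])

# Build a placeholder training set: random "objects" from three fake classes.
rng = np.random.default_rng(0)
X, y = [], []
for label, scale in enumerate([(0.3, 0.3, 4.0), (2.0, 2.0, 3.0), (4.0, 1.8, 1.5)]):
    for _ in range(60):                                           # lamp-, tree-, car-like shapes
        pts = rng.normal(size=(200, 3)) * scale
        X.append(object_features(pts, rng.normal(size=(5, 3)) * 10.0))
        y.append(label)
X, y = np.array(X), np.array(y)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X, y)
print("training accuracy on the placeholder objects:", clf.score(X, y))
```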

  4. Experimental verification of a computational technique for determining ground reactions in human bipedal stance.

    Science.gov (United States)

    Audu, Musa L; Kirsch, Robert F; Triolo, Ronald J

    2007-01-01

    We have developed a three-dimensional (3D) biomechanical model of human standing that enables us to study the mechanisms of posture and balance simultaneously in various directions in space. Since the two feet are on the ground, the system forms a kinematically closed chain, which has redundancy problems that cannot be resolved using the laws of mechanics alone. We have developed a computational (optimization) technique that avoids the problems with the closed-chain formulation, thus giving users of such models the ability to predict joint moments, and potentially muscle activations, using more sophisticated musculoskeletal models. This paper describes the experimental verification of the computational technique that is used to estimate the ground reaction vector acting on an unconstrained foot while the other foot is attached to the ground, thus allowing human bipedal standing to be analyzed as an open-chain system. The computational approach was verified in terms of its ability to predict lower extremity joint moments derived from inverse dynamic simulations performed on data acquired from four able-bodied volunteers standing in various postures on force platforms. Sensitivity analyses performed with model simulations indicated which ground reaction force (GRF) and center of pressure (COP) components were most critical for providing better estimates of the joint moments. Overall, the joint moments predicted by the optimization approach are strongly correlated with the joint moments computed using the experimentally measured GRF and COP (0.78, with unity slope between experimental and computational results) for the postures of the four subjects examined. These results indicate that this model-based technique can be relied upon to predict reasonable and consistent estimates of the joint moments using the predicted GRF and COP for most standing postures.
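    A toy analogue of the optimization idea can make the redundancy argument concrete. The sketch below is a two-dimensional, purely static simplification (not the authors' three-dimensional musculoskeletal formulation): whole-body force and moment balance are imposed as equality constraints, the centre-of-pressure locations are bounded to each foot, and an arbitrary illustrative cost function resolves the left/right indeterminacy of double support.

```python
# Toy 2D double-support problem: equilibrium as constraints, a cost to pick one
# of the infinitely many admissible load distributions. All values are hypothetical.
import numpy as np
from scipy.optimize import minimize

m, g = 75.0, 9.81            # body mass [kg], gravity [m/s^2]
x_com = 0.03                 # fore-aft position of the whole-body COM [m]
foot_L = (-0.15, 0.10)       # admissible COP range under the left foot [m]
foot_R = (-0.05, 0.20)       # admissible COP range under the right foot [m]

def cost(u):
    FzL, FzR, xL, xR = u
    # prefer even load sharing and COPs near each foot's centre (illustrative choice)
    return (FzL - FzR)**2 + 1e3 * ((xL - np.mean(foot_L))**2 + (xR - np.mean(foot_R))**2)

constraints = [
    {"type": "eq", "fun": lambda u: u[0] + u[1] - m*g},                  # vertical force balance
    {"type": "eq", "fun": lambda u: u[0]*u[2] + u[1]*u[3] - m*g*x_com},  # moment balance about a ground-level origin
]
bounds = [(0, None), (0, None), foot_L, foot_R]
u0 = np.array([m*g/2, m*g/2, np.mean(foot_L), np.mean(foot_R)])

res = minimize(cost, u0, method="SLSQP", bounds=bounds, constraints=constraints)
FzL, FzR, xL, xR = res.x
print(f"left  foot: Fz = {FzL:6.1f} N, COP = {xL:+.3f} m")
print(f"right foot: Fz = {FzR:6.1f} N, COP = {xR:+.3f} m")
```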

  5. A digital computer propulsion control facility: Description of capabilities and summary of experimental program results

    Science.gov (United States)

    Zeller, J. R.; Arpasi, D. J.; Lehtinen, B.

    1976-01-01

    Flight weight digital computers are being used today to carry out many of the propulsion system control functions previously delegated exclusively to hydromechanical controllers. An operational digital computer facility for propulsion control mode studies has been used successfully in several experimental programs. This paper describes the system and some of the results concerned with engine control, inlet control, and inlet engine integrated control. Analytical designs for the digital propulsion control modes include both classical and modern/optimal techniques.

  6. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    Science.gov (United States)

    du Plessis, Anton; le Roux, Stephan Gerhard; Guelpa, Anina

    2016-10-01

    The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high-performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers, companies and also remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments, i.e. a micro-CT system as well as a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers with data analysis software packages, which are at the disposal of the facility users, along with expert supervision if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility has accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT as means. This paper summarises the laboratory's first four years by way of selected examples, both from published and unpublished projects. In the process a detailed description of the capabilities and facilities available to users is presented.

  7. MIP models for connected facility location: A theoretical and computational study☆

    Science.gov (United States)

    Gollowitzer, Stefan; Ljubić, Ivana

    2011-01-01

    This article comprises the first theoretical and computational study on mixed integer programming (MIP) models for the connected facility location problem (ConFL). ConFL combines facility location and Steiner trees: given a set of customers, a set of potential facility locations and some inter-connection nodes, ConFL searches for the minimum-cost way of assigning each customer to exactly one open facility, and connecting the open facilities via a Steiner tree. The costs needed for building the Steiner tree, facility opening costs and the assignment costs need to be minimized. We model ConFL using seven compact and three mixed integer programming formulations of exponential size. We also show how to transform ConFL into the Steiner arborescence problem. A full hierarchy between the models is provided. For two exponential size models we develop a branch-and-cut algorithm. An extensive computational study is based on two benchmark sets of randomly generated instances with up to 1300 nodes and 115,000 edges. We empirically compare the presented models with respect to the quality of obtained bounds and the corresponding running time. We report optimal values for all but 16 instances for which the obtained gaps are below 0.6%. PMID:25009366
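    A flavour of the compact models can be conveyed with a small mixed integer program. The sketch below uses a single-commodity-flow style formulation on a tiny made-up instance; it is in the spirit of the compact models surveyed in the article but is not claimed to be any one of the paper's formulations, and it assumes the PuLP modelling library with its bundled CBC solver is available.

```python
# Small illustrative ConFL MIP: open facilities, assign every customer to an open
# facility, and buy a connected core network (modelled by a single-commodity flow
# from a root) linking the open facilities. Instance data are made up.
import pulp

facilities = ["f1", "f2", "f3"]
customers  = ["c1", "c2", "c3", "c4"]
core_nodes = ["r", "s1", "s2"] + facilities          # root r, Steiner nodes, facilities
core_edges = {("r", "s1"): 3, ("r", "s2"): 4, ("s1", "f1"): 2, ("s1", "f2"): 3,
              ("s2", "f2"): 2, ("s2", "f3"): 2, ("f1", "f2"): 4}
open_cost  = {"f1": 5, "f2": 6, "f3": 4}
assign     = {(c, i): 2 + (ci + 2 * fi) % 4
              for ci, c in enumerate(customers) for fi, i in enumerate(facilities)}
arcs = [(i, j) for (i, j) in core_edges] + [(j, i) for (i, j) in core_edges]

prob = pulp.LpProblem("ConFL", pulp.LpMinimize)
y = pulp.LpVariable.dicts("open", facilities, cat="Binary")
x = pulp.LpVariable.dicts("assign", list(assign.keys()), cat="Binary")
z = pulp.LpVariable.dicts("edge", list(core_edges.keys()), cat="Binary")
f = pulp.LpVariable.dicts("flow", arcs, lowBound=0)

prob += (pulp.lpSum(open_cost[i] * y[i] for i in facilities)
         + pulp.lpSum(assign[c, i] * x[c, i] for c in customers for i in facilities)
         + pulp.lpSum(w * z[e] for e, w in core_edges.items()))

for c in customers:                                   # each customer assigned exactly once
    prob += pulp.lpSum(x[c, i] for i in facilities) == 1
    for i in facilities:                              # and only to an open facility
        prob += x[c, i] <= y[i]

def out_arcs(n): return [(i, j) for (i, j) in arcs if i == n]
def in_arcs(n):  return [(i, j) for (i, j) in arcs if j == n]

for n in core_nodes:                                  # single-commodity flow conservation
    net = pulp.lpSum(f[a] for a in out_arcs(n)) - pulp.lpSum(f[a] for a in in_arcs(n))
    if n == "r":
        prob += net == pulp.lpSum(y[i] for i in facilities)   # root sends one unit per open facility
    elif n in facilities:
        prob += net == -y[n]                                   # each open facility absorbs one unit
    else:
        prob += net == 0                                       # Steiner nodes only relay flow

M = len(facilities)
for (i, j) in core_edges:                             # flow allowed only on purchased core edges
    prob += f[i, j] + f[j, i] <= M * z[i, j]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("status:", pulp.LpStatus[prob.status], " total cost:", pulp.value(prob.objective))
print("open facilities:", [i for i in facilities if y[i].value() > 0.5])
print("core edges used:", [e for e in core_edges if z[e].value() > 0.5])
```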

  8. MIP models for connected facility location: A theoretical and computational study.

    Science.gov (United States)

    Gollowitzer, Stefan; Ljubić, Ivana

    2011-02-01

    This article comprises the first theoretical and computational study on mixed integer programming (MIP) models for the connected facility location problem (ConFL). ConFL combines facility location and Steiner trees: given a set of customers, a set of potential facility locations and some inter-connection nodes, ConFL searches for the minimum-cost way of assigning each customer to exactly one open facility, and connecting the open facilities via a Steiner tree. The costs needed for building the Steiner tree, facility opening costs and the assignment costs need to be minimized. We model ConFL using seven compact and three mixed integer programming formulations of exponential size. We also show how to transform ConFL into the Steiner arborescence problem. A full hierarchy between the models is provided. For two exponential size models we develop a branch-and-cut algorithm. An extensive computational study is based on two benchmark sets of randomly generated instances with up to 1300 nodes and 115,000 edges. We empirically compare the presented models with respect to the quality of obtained bounds and the corresponding running time. We report optimal values for all but 16 instances for which the obtained gaps are below 0.6%.

  9. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Plessis, Anton du, E-mail: anton2@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa); Physics Department, Stellenbosch University, Stellenbosch (South Africa); Roux, Stephan Gerhard le, E-mail: lerouxsg@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa); Guelpa, Anina, E-mail: aninag@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa)

    2016-10-01

    The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high-performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers, companies and also remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments, i.e. a micro-CT system as well as a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers with data analysis software packages, which are at the disposal of the facility users, along with expert supervision if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility has accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT as means. This paper summarises the laboratory’s first four years by way of selected examples, both from published and unpublished projects. In the process a detailed description of the capabilities and facilities available to users is presented.

  10. A refined computer program for the transient simulation of ground coupled heat pump systems

    Science.gov (United States)

    Andrews, J. W.; Metz, P. D.; Saunders, J. H.

    1983-04-01

    The use of the earth as a heat source/sink or storage medium for various heat pump based space conditioning systems was investigated. A computer program, Ground Coupled System (GROCS), was developed to model the behavior of ground coupling devices. GROCS was integrated with TRNSYS, the solar system simulation program, to permit the simulation of complete ground coupled heat pump systems. Experimental results were compared to GROCS simulation results for model validation, and it was found that the model has considerable validity. A refined version of the GROCS-TRNSYS program, developed to model vertical or horizontal earth coil systems while accounting for system cycling, is described. The design of the program and its interaction with TRNSYS are discussed.

  11. Phenomenography and grounded theory as research methods in computing education research field

    Science.gov (United States)

    Kinnunen, Päivi; Simon, Beth

    2012-06-01

    This paper discusses two qualitative research methods, phenomenography and grounded theory. We introduce both methods' data collection and analysis processes and the types of results they may yield, using examples from computing education research. We highlight some of the similarities and differences between the aims, the data collection and analysis phases, and the types of resulting outcomes of these methods. We also discuss the challenges and threats that both methods may pose to the researcher. We conclude that, while aimed at tackling different types of research questions, both of these methods provide computing education researchers with a useful tool in their research method toolbox.

  12. Computer predictions of ground storage effects on performance of Galileo and ISPM generators

    Science.gov (United States)

    Chmielewski, A.

    1983-01-01

    Radioisotope Thermoelectric Generators (RTG) that will supply electrical power to the Galileo and International Solar Polar Mission (ISPM) spacecraft are exposed to several degradation mechanisms during the prolonged ground storage before launch. To assess the effect of storage on the RTG flight performance, a computer code has been developed which simulates all known degradation mechanisms that occur in an RTG during storage and flight. The modeling of these mechanisms and their impact on the RTG performance are discussed.

  13. Stochastic model of the NASA/MSFC ground facility for large space structures with uncertain parameters: The maximum entropy approach, part 2

    Science.gov (United States)

    Hsia, Wei Shen

    1989-01-01

    A validated technology data base is being developed in the areas of control/structures interaction, deployment dynamics, and system performance for Large Space Structures (LSS). A Ground Facility (GF), in which the dynamics and control systems being considered for LSS applications can be verified, was designed and built. One of the important aspects of the GF is to verify the analytical model for the control system design. The procedure is to describe the control system mathematically as well as possible, then to perform tests on the control system, and finally to factor those results into the mathematical model. The reduction of the order of a higher order control plant was addressed. The computer program for the maximum entropy principle adopted in Hyland's MEOP method was improved and tested against the test problem, resulting in a very close match. Two methods of model reduction were examined: Wilson's model reduction method and Hyland's optimal projection (OP) method. Design of a computer program for Hyland's OP method was attempted. Due to the difficulty encountered at the stage where a special matrix factorization technique is needed to obtain the required projection matrix, the program was successful up to finding the Linear Quadratic Gaussian solution but not beyond. Numerical results along with computer programs which employed ORACLS are presented.
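    For readers unfamiliar with model-order reduction, a generic point of reference is standard balanced truncation, shown below. This sketch is neither Wilson's method nor Hyland's optimal projection; it reduces a small, hypothetical lightly damped structural model by computing controllability and observability Gramians with SciPy and truncating on the Hankel singular values.

```python
# Square-root balanced truncation of a toy structural model (generic textbook
# method, shown only for orientation; not the report's MEOP/OP procedures).
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Return the order-r reduced model (Ar, Br, Cr) and the Hankel singular values."""
    Wc = solve_continuous_lyapunov(A, -B @ B.T)      # controllability Gramian
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)    # observability Gramian
    S = cholesky(Wc, lower=True)
    R = cholesky(Wo, lower=True)
    U, hsv, Vt = svd(R.T @ S)                        # Hankel singular values
    T = S @ Vt.T[:, :r] / np.sqrt(hsv[:r])           # balancing/truncating transformations
    Tinv = (U[:, :r] / np.sqrt(hsv[:r])).T @ R.T
    return Tinv @ A @ T, Tinv @ B, C @ T, hsv

# hypothetical lightly damped four-mass spring chain (8 states), force in, tip position out
n = 4
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # stiffness matrix (unit masses)
D = 0.05 * np.eye(n) + 0.05 * K                          # light Rayleigh damping
A = np.block([[np.zeros((n, n)), np.eye(n)], [-K, -D]])
B = np.zeros((2 * n, 1)); B[n, 0] = 1.0                  # force applied to the first mass
C = np.zeros((1, 2 * n)); C[0, n - 1] = 1.0              # displacement of the last mass

Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=4)
print("Hankel singular values:", np.round(hsv, 4))
print("reduced A matrix shape:", Ar.shape)
```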

  14. Mixed Waste Management Facility (MWMF) Old Burial Ground (OBG) source control technology and inventory study

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.P.; Rehder, T.E.; Kanzleiter, J.P.

    1996-10-02

    This report has been developed to support information needs for wastes buried in the Burial Ground Complex. Information discussed is presented in a total of four individual attachments. The general focus of this report is to collect information on estimated source inventories, leaching studies, source control technologies, and to provide information on modeling parameters and associated data deficiencies.

  15. Ground-water monitoring compliance projects for Hanford Site facilities: Volume 1, The report and Appendix A, Progress report for the period October 1 to December 31, 1986

    Energy Technology Data Exchange (ETDEWEB)

    1987-02-01

    This report documents recent progress on ground-water monitoring projects for four Hanford Site facilities: the 300 Area Process Trenches, the 183-H Solar Evaporation Basins, the 200 Area Low-Level Burial Grounds, and the Nonradioactive Dangerous Waste (NRDW) Landfill. The existing ground-water monitoring projects for the first two facilities named in the paragraph above are currently being expanded by adding new wells to the networks. During the reporting period, sampling of the existing wells continued on a monthly basis, and the analytical results for samples collected from September through November 1986 are included and discussed in this document. 8 refs., 41 figs., 7 tabs.

  16. A Fruitful Collaboration between ESO and the Max Planck Computing and Data Facility

    Science.gov (United States)

    Fourniol, N.; Zampieri, S.; Panea, M.

    2016-06-01

    The ESO Science Archive Facility (SAF) contains all La Silla Paranal Observatory raw data, as well as, more recently introduced, processed data created at ESO with state-of-the-art pipelines or returned by the astronomical community. The SAF has been established for over 20 years and its current holding exceeds 700 terabytes. An overview of the content of the SAF and the preservation of its content is provided. The latest development to ensure the preservation of the SAF data, the provision of an independent backup copy of the whole SAF at the Max Planck Computing and Data Facility in Garching, is described.

  17. Qualification of Coatings for Launch Facilities and Ground Support Equipment Through the NASA Corrosion Technology Laboratory

    Science.gov (United States)

    Kolody, Mark R.; Curran, Jerome P.; Calle, Luz Marina

    2014-01-01

    Corrosion protection at NASA's Kennedy Space Center is a high priority item. The launch facilities at the Kennedy Space Center are located approximately 1000 feet from the Atlantic Ocean where they are exposed to salt deposits, high humidity, high UV degradation, and acidic exhaust from solid rocket boosters. These assets are constructed from carbon steel, which requires a suitable coating to provide long-term protection to reduce corrosion and its associated costs.

  18. Holonomic quantum computing in ground states of spin chains with symmetry-protected topological order

    CERN Document Server

    Renes, Joseph M; Brennen, Gavin K; Bartlett, Stephen D

    2011-01-01

    While solid-state devices offer naturally reliable hardware for modern classical computers, thus far quantum information processors resemble vacuum tube computers in being neither reliable nor scalable. Strongly correlated many body states stabilized in topologically ordered matter offer the possibility of naturally fault tolerant computing, but are both challenging to engineer and coherently control and cannot be easily adapted to different physical platforms. We propose an architecture which achieves some of the robustness properties of topological models but with a drastically simpler construction. Quantum information is stored in the degenerate ground states of spin-1 chains exhibiting symmetry-protected topological order (SPTO), while quantum gates are performed by adiabatic non-Abelian holonomies using only single-site fields and nearest-neighbor couplings. Gate operations respect the SPTO symmetry, inheriting some protection from noise and disorder from the SPTO robustness to local perturbation. A pote...
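    The central claim that an open spin-1 chain with symmetry-protected topological order carries a degenerate ground space can be checked numerically on a toy example. The sketch below diagonalizes the AKLT Hamiltonian (a standard SPTO example, used here for illustration rather than as the authors' specific model) on a short open chain and shows the four-fold ground degeneracy associated with the spin-1/2 edge modes.

```python
# Exact diagonalization of a short open AKLT spin-1 chain: the four lowest
# eigenvalues coincide (degenerate ground space) and the fifth lies above a gap.
import numpy as np
from scipy.sparse import identity, kron, csr_matrix
from scipy.sparse.linalg import eigsh

# spin-1 operators
Sz = np.diag([1.0, 0.0, -1.0])
Sp = np.sqrt(2.0) * np.diag([1.0, 1.0], k=1)
Sx = 0.5 * (Sp + Sp.T)
Sy = -0.5j * (Sp - Sp.T)

def two_site(op_a, op_b, i, L):
    """Embed op_a acting on site i and op_b on site i+1 into an L-site chain."""
    out = identity(1, format="csr")
    for s in range(L):
        if s == i:       term = csr_matrix(op_a)
        elif s == i + 1: term = csr_matrix(op_b)
        else:            term = identity(3, format="csr")
        out = kron(out, term, format="csr")
    return out

L = 6
H = csr_matrix((3**L, 3**L), dtype=complex)
for i in range(L - 1):
    SS = sum(two_site(S, S, i, L) for S in (Sx, Sy, Sz))
    H = H + SS + (SS @ SS) / 3.0          # AKLT bond term: S.S + (S.S)^2 / 3

evals = eigsh(H, k=6, which="SA", return_eigenvectors=False)
print(np.round(np.sort(evals.real), 6))   # expect a four-fold degenerate ground level
```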

  19. NEGOTIATING COMMON GROUND IN COMPUTER-MEDIATED VERSUS FACE-TO-FACE DISCUSSIONS

    Directory of Open Access Journals (Sweden)

    Ilona Vandergriff

    2006-01-01

    To explore the impact of the communication medium on building common ground, this article presents research comparing learner use of reception strategies in traditional face-to-face (FTF) and in synchronous computer-mediated communication (CMC). Reception strategies, such as reprises, hypothesis testing and forward inferencing, provide evidence of comprehension and thus serve to establish common ground among participants. A number of factors, including communicative purpose or medium, are hypothesized to affect the use of such strategies (Clark & Brennan, 1991). In the data analysis, I (1) identify specific types of reception strategies, (2) compare their relative frequencies by communication medium, by task, and by learner, and (3) describe how these reception strategies function in the discussions. The findings of the quantitative analysis show that the medium alone seems to have little impact on grounding as indicated by use of reception strategies. The qualitative analysis provides evidence that participants adapted the strategies to the goals of the communicative interaction, as they used them primarily to negotiate and update common ground on their collaborative activity rather than to compensate for L2 deficiencies.

  20. Computational investigations of low-emission burner facilities for char gas burning in a power boiler

    Science.gov (United States)

    Roslyakov, P. V.; Morozov, I. V.; Zaychenko, M. N.; Sidorkin, V. T.

    2016-04-01

    Various variants of the structure of low-emission burner facilities intended for char gas burning in an operating TP-101 boiler of the Estonia power plant are considered. The planned increase in the volume of shale reprocessing and, correspondingly, the rise in char gas volumes make co-combustion of the gas necessary. Hence, there was a need to develop a burner facility of a given capacity that yields effective char gas burning while meeting reliability and environmental requirements. For this purpose, the burner structure was based on staged burning of the fuel with gas recirculation. As a result of a preliminary analysis of possible structure variants, three types of burner facilities with proven operating experience were chosen: a vortex burner with the supply of recirculation gases into the secondary air, a vortex burner with a baffle supply of recirculation gases between the flows of primary and secondary air, and a burner facility with a vortex pilot burner. Optimum structural characteristics and operation parameters were determined using numerical experiments. These experiments, carried out with the ANSYS CFX computational fluid dynamics software, simulated the mixing, ignition, and burning of char gas. The numerical experiments determined, for every type of burner facility, the structural and operation parameters that give effective char gas burning and comply with the required environmental standard on nitrogen oxide emission. The burner facility for char gas burning with a pilot diffusion burner in the central part was developed and built according to the computation results. Preliminary full-scale verification tests on the TP-101 boiler showed that the actual content of nitrogen oxides in the burner flames of char gas did not exceed the declared concentration of 150 ppm (200 mg/m3).

  1. Fermilab Central Computing Facility: Energy conservation report and mechanical systems design optimization and cost analysis study

    Energy Technology Data Exchange (ETDEWEB)

    Krstulovich, S.F.

    1986-11-12

    This report is developed as part of the Fermilab Central Computing Facility Project Title II Design Documentation Update under the provisions of DOE Document 6430.1, Chapter XIII-21, Section 14, paragraph a. As such, it concentrates primarily on HVAC mechanical systems design optimization and cost analysis and should be considered a supplement to the Title I Design Report dated March 1986, wherein energy related issues are discussed pertaining to building envelope and orientation as well as electrical systems design.

  2. [The Computer Competency of Nurses in Long-Term Care Facilities and Related Factors].

    Science.gov (United States)

    Chang, Ya-Ping; Kuo, Huai-Ting; Li, I-Chuan

    2016-12-01

    It is important for nurses who work in long-term care facilities (LTCFs) to have an adequate level of computer competency due to the multidisciplinary and comprehensive nature of long-term care services. Thus, it is important to understand the current computer competency of nursing staff in LTCFs and the factors that relate to this competency. To explore the computer competency of LTCF nurses and to identify the demographic and computer-usage characteristics that relate significantly to computer competency in the LTCF environment, a cross-sectional research design and a self-report questionnaire were used to collect data from 185 nurses working at LTCFs in Taipei. The results found that the variables of frequency of computer use (β = .33), age (β = -.30), type(s) of software used at work (β = .28), hours of on-the-job training (β = -.14), prior work experience at other LTCFs (β = -.14), and Internet use at home (β = .12) explain 58.0% of the variance in the computer competency of the participants. The results of the present study suggest that the following measures may help increase the computer competency of LTCF nurses. (1) Nurses should be encouraged to use electronic nursing records rather than handwritten records. (2) On-the-job training programs should emphasize participant competency with the Excel software package in order to maintain efficient, good-quality LTC services after implementation of the LTC insurance policy.

  3. Access to the energy system network simulator (ESNS), via remote computer terminals. [BNL CDC 7600/6600 computer facility

    Energy Technology Data Exchange (ETDEWEB)

    Reisman, A W

    1976-08-15

    The Energy System Network Simulator (ESNS) flow model is installed on the Brookhaven National Laboratory (BNL) CDC 7600/6600 computer facility for access by off-site users. The method of access available to outside users is through a system called CDC-INTERCOM, which allows communication between the BNL machines and remote teletype terminals. This write-up gives a brief description of INTERCOM for users unfamiliar with this system and a step-by-step guide to using INTERCOM in order to access ESNS.

  4. A ground-up approach to High Throughput Cloud Computing in High-Energy Physics

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00245123; Ganis, Gerardo; Bagnasco, Stefano

    The thesis explores various practical approaches in making existing High Throughput computing applications common in High Energy Physics work on cloud-provided resources, as well as opening the possibility for running new applications. The work is divided into two parts: firstly we describe the work done at the computing facility hosted by INFN Torino to entirely convert former Grid resources into cloud ones, eventually running Grid use cases on top along with many others in a more flexible way. Integration and conversion problems are duly described. The second part covers the development of solutions for automatizing the orchestration of cloud workers based on the load of a batch queue and the development of HEP applications based on ROOT's PROOF that can adapt at runtime to a changing number of workers.

  5. Self contamination effects in the TAUVEX UV Telescope: Ground testing and computer simulation

    Science.gov (United States)

    Lifshitz, Y.; Noter, Y.; Grossman, E.; Genkin, L.; Murat, M.; Saar, N.; Blasberger, A.

    1994-01-01

    The contamination effects due to outgassing from construction materials of the TAUVEX (Tel Aviv University UV Telescope) were evaluated using a combination of ground testing and computer simulations. Tests were performed from the material level up to the system level, including: (1) high-sensitivity CVCM (10^-3 percent) measurements of critical materials; (2) optical degradation measurements of samples specially contaminated by outgassing products at different contamination levels; (3) FTIR studies of the chemical composition of the outgassed products on the above samples; and (4) high-resolution AFM studies of the surface morphology of contaminated surfaces. The expected degradation of TAUVEX performance in mission was evaluated by applying a computer simulation code using input parameters determined experimentally in the above tests. The results have served as guidelines for the proper selection of materials, cleanliness requirements, determination of the thermal conditions of the system, and bakeout processes.

  6. Computational architecture for image processing on a small unmanned ground vehicle

    Science.gov (United States)

    Ho, Sean; Nguyen, Hung

    2010-08-01

    Man-portable Unmanned Ground Vehicles (UGVs) have been fielded on the battlefield with limited computing power. This limitation constrains their use primarily to teleoperation control mode for clearing areas and bomb defusing. In order to extend their capability to include the reconnaissance and surveillance missions of dismounted soldiers, a separate processing payload is desired. This paper presents a processing architecture and the design details on the payload module that enables the PackBot to perform sophisticated, real-time image processing algorithms using data collected from its onboard imaging sensors including LADAR, IMU, visible, IR, stereo, and the Ladybug spherical cameras. The entire payload is constructed from currently available Commercial off-the-shelf (COTS) components including an Intel multi-core CPU and a Nvidia GPU. The result of this work enables a small UGV to perform computationally expensive image processing tasks that once were only feasible on a large workstation.

  7. Petascale Computing for Ground-Based Solar Physics with the DKIST Data Center

    Science.gov (United States)

    Berukoff, Steven J.; Hays, Tony; Reardon, Kevin P.; Spiess, DJ; Watson, Fraser; Wiant, Scott

    2016-05-01

    When construction is complete in 2019, the Daniel K. Inouye Solar Telescope will be the most capable large-aperture, high-resolution, multi-instrument solar physics facility in the world. The telescope is designed as a four-meter off-axis Gregorian, with a rotating Coude laboratory designed to simultaneously house and support five first-light imaging and spectropolarimetric instruments. In the current design, the facility and its instruments will generate data volumes of 3 PB per year and produce 10^7-10^9 metadata elements. The DKIST Data Center is being designed to store, curate, and process this flood of information, while providing association of science data and metadata with their acquisition and processing provenance. The Data Center will produce quality-controlled calibrated data sets, and make them available freely and openly through modern search interfaces and APIs. Documented software and algorithms will also be made available through community repositories like Github for further collaboration and improvement. We discuss the current design and approach of the DKIST Data Center, describing the development cycle, early technology analysis and prototyping, and the roadmap ahead. We discuss our iterative development approach, the underappreciated challenges of calibrating ground-based solar data, the crucial integration of the Data Center within the larger Operations lifecycle, and how software and hardware support, intelligently deployed, will enable high-caliber solar physics research and community growth for the DKIST's 40-year lifespan.

  8. Review of analytical results from the proposed agent disposal facility site, Aberdeen Proving Ground

    Energy Technology Data Exchange (ETDEWEB)

    Brubaker, K.L.; Reed, L.L.; Myers, S.W.; Shepard, L.T.; Sydelko, T.G.

    1997-09-01

    Argonne National Laboratory reviewed the analytical results from 57 composite soil samples collected in the Bush River area of Aberdeen Proving Ground, Maryland. A suite of 16 analytical tests involving 11 different SW-846 methods was used to detect a wide range of organic and inorganic contaminants. One method (BTEX) was considered redundant, and two "single-number" methods (TPH and TOX) were found to lack the required specificity to yield unambiguous results, especially in a preliminary investigation. Volatile analytes detected at the site include 1,1,2,2-tetrachloroethane, trichloroethylene, and tetrachloroethylene, all of which probably represent residual site contamination from past activities. Other volatile analytes detected include toluene, tridecane, methylene chloride, and trichlorofluoromethane. These compounds are probably not associated with site contamination but likely represent cross-contamination or, in the case of tridecane, a naturally occurring material. Semivolatile analytes detected include three different phthalates and low part-per-billion amounts of the pesticide DDT and its degradation product DDE. The pesticide could represent residual site contamination from past activities, and the phthalates are likely due, in part, to cross-contamination during sample handling. A number of high-molecular-weight hydrocarbons and hydrocarbon derivatives were detected and were probably naturally occurring compounds. 4 refs., 1 fig., 8 tabs.

  9. Historic Seismicity, Computed Peak Ground Accelerations, and Seismic Site Conditions for Northeast Mexico

    Science.gov (United States)

    Montalvo-Arriet, J. C.; Galván-Ramírez, I. N.; Ramos-Zuñiga, L. G.; Navarro de León, I.; Ramírez-Fernández, J. A.; Quintanilla-López, Y.; Cavazos-Tovar, N. P.

    2007-05-01

    In this study we present the historic seismicity, computed peak ground accelerations, and mapping of seismic site conditions for northeast Mexico. We start with a compilation of the regional seismicity in northeast Mexico (24-31°N, 87-106°W) for the 1787-2006 period. Our study area lies within three morphotectonic provinces: Basin and Range and Rio Grande rift, Sierra Madre Oriental, and Gulf Coastal Plain. Peak ground acceleration (PGA) maps were computed for three different scenarios: the 1928 Parral, Chihuahua (MW = 6.5) earthquake; the 1931 Valentine, Texas (MW = 6.4) earthquake; and a hypothetical earthquake located in central Coahuila (MW = 6.5). Ground acceleration values were computed using attenuation relations developed for central and eastern North America and the Basin and Range province. The hypothetical earthquake in central Coahuila is considered a critical scenario for the main cities of northeast Mexico. The damage associated with this hypothetical earthquake could be severe because the majority of the buildings were constructed without allowance for seismic accelerations. The expected PGA values in Monterrey, Saltillo and Monclova range from 30 to 70 cm/s2 (0.03 to 0.07 g). This earthquake might also produce or trigger significant landslides and rock falls in the Sierra Madre Oriental, where several cities are located (e.g. suburbs of Monterrey). Additionally, the Vs30 distribution for the state of Nuevo Leon and the cities of Linares and Monterrey is presented. The Vs30 data were obtained using seismic refraction profiling correlated with borehole information. According to the NEHRP soil classification, site classes A, B and C are dominant. Sites with class D occupy minor areas in both cities. Due to the semi-arid conditions in northeast Mexico, we obtained the highest values of Vs30 in Quaternary deposits (alluvium) cemented by caliche. Similar values of Vs30 were obtained in Reno and Las Vegas, Nevada. This work constitutes the first attempt at understanding and
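    The scenario calculation behind such PGA maps reduces, site by site, to evaluating an attenuation relation. The sketch below uses a generic functional form with hypothetical coefficients tuned only to give values of the same order as those quoted above; it is not one of the central/eastern North America or Basin and Range relations actually used in the study.

```python
# Generic ground-motion attenuation sketch with made-up coefficients.
import numpy as np

def pga_cm_s2(magnitude, r_km, c=(-3.2, 0.85, -1.0, -0.002)):
    """Generic form ln(PGA[g]) = c0 + c1*M + c2*ln(R + 10) + c3*R.
    Coefficients are hypothetical placeholders, not a published relation."""
    c0, c1, c2, c3 = c
    ln_pga_g = c0 + c1 * magnitude + c2 * np.log(r_km + 10.0) + c3 * r_km
    return np.exp(ln_pga_g) * 981.0          # convert from g to cm/s^2

# hypothetical Mw 6.5 scenario evaluated at three source-to-site distances
for r in (100.0, 150.0, 200.0):
    print(f"R = {r:5.0f} km   PGA ~ {pga_cm_s2(6.5, r):5.1f} cm/s^2")
```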

  10. Mechanisms which help explain implementation of evidence-based practice in residential aged care facilities: a grounded theory study.

    Science.gov (United States)

    Masso, Malcolm; McCarthy, Grace; Kitson, Alison

    2014-07-01

    The context for the study was a nation-wide programme in Australia to implement evidence-based practice in residential aged care, in nine areas of practice, using a wide range of implementation strategies and involving 108 facilities. The study drew on the experiences of those involved in the programme to answer the question: what mechanisms influence the implementation of evidence-based practice in residential aged care and how do those mechanisms interact? The methodology used grounded theory from a critical realist perspective, informed by a conceptual framework that differentiates between the context, process and content of change. People were purposively sampled and invited to participate in semi-structured interviews, resulting in 44 interviews involving 51 people during 2009 and 2010. Participants had direct experience of implementation in 87 facilities, across nine areas of practice, in diverse locations. Sampling continued until data saturation was reached. The quality of the research was assessed using four criteria for judging trustworthiness: credibility, transferability, dependability and confirmability. Data analysis resulted in the identification of four mechanisms that accounted for what took place and participants' experiences. The core category that provided the greatest understanding of the data was the mechanism On Common Ground, comprising several constructs that formed a 'common ground' for change to occur. The mechanism Learning by Connecting recognised the ability to connect new knowledge with existing practice and knowledge, and make connections between actions and outcomes. Reconciling Competing Priorities was an ongoing mechanism whereby new practices had to compete with an existing set of constantly shifting priorities. Strategies for reconciling priorities ranged from structured approaches such as care planning to more informal arrangements such as conversations during daily work. The mechanism Exercising Agency bridged the gap between

  11. SynapSense Wireless Environmental Monitoring System of the RHIC & ATLAS Computing Facility at BNL

    Science.gov (United States)

    Casella, K.; Garcia, E.; Hogue, R.; Hollowell, C.; Strecker-Kellogg, W.; Wong, A.; Zaytsev, A.

    2014-06-01

    RHIC & ATLAS Computing Facility (RACF) at BNL is a 15000 sq. ft. facility hosting the IT equipment of the BNL ATLAS WLCG Tier-1 site, offline farms for the STAR and PHENIX experiments operating at the Relativistic Heavy Ion Collider (RHIC), the BNL Cloud installation, various Open Science Grid (OSG) resources, and many other small physics research oriented IT installations. The facility originated in 1990 and grew steadily up to the present configuration with 4 physically isolated IT areas with the maximum rack capacity of about 1000 racks and the total peak power consumption of 1.5 MW. In June 2012 a project was initiated with the primary goal to replace several environmental monitoring systems deployed earlier within RACF with a single commercial hardware and software solution by SynapSense Corporation based on wireless sensor groups and proprietary SynapSense™ MapSense™ software that offers a unified solution for monitoring the temperature and humidity within the rack/CRAC units as well as pressure distribution underneath the raised floor across the entire facility. The deployment was completed successfully in 2013. The new system also supports a set of additional features such as capacity planning based on measurements of total heat load, power consumption monitoring and control, CRAC unit power consumption optimization based on feedback from the temperature measurements and overall power usage efficiency estimations that are not currently implemented within RACF but may be deployed in the future.

  12. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Allcock, William; Beggio, Chris; Campbell, Stuart; Cherry, Andrew; Cholia, Shreyas; Dart, Eli; England, Clay; Fahey, Tim; Foertter, Fernanda; Goldstone, Robin; Hick, Jason; Karelitz, David; Kelly, Kaki; Monroe, Laura; Prabhat,; Skinner, David; White, Julia

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014 representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  13. Improvement of the Computing - Related Procurement Process at a Government Research Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gittins, C.

    2000-04-03

    The purpose of the project was to develop, implement, and market value-added services through the Computing Resource Center in an effort to streamline computing-related procurement processes across the Lawrence Livermore National Laboratory (LLNL). The power of the project was in focusing attention on the value of centralizing the delivery of computer-related products and services to the institution. The project required a plan and marketing strategy that would draw attention to the facility's value-added offerings and services. A significant outcome of the project has been the change in the CRC internal organization. The realignment of internal policies and practices, together with additions to its product and service offerings, has brought an increased focus to the facility. This movement from a small, fractious organization into one that is still small yet well organized and focused on its mission and goals has been a significant transition. Indicative of this turnaround was the sharing of information. One-on-one and small group meetings, together with statistics showing work activity, were invaluable in gaining support for more equitable workload distribution and the removal of blame and finger pointing. Sharing monthly reports on sales and operating costs also had a positive impact.

  14. Conceptual design of an ALICE Tier-2 centre. Integrated into a multi-purpose computing facility

    Energy Technology Data Exchange (ETDEWEB)

    Zynovyev, Mykhaylo

    2012-06-29

    This thesis discusses the issues and challenges associated with the design and operation of a data analysis facility for a high-energy physics experiment at a multi-purpose computing centre. In the spotlight is a Tier-2 centre of the distributed computing model of the ALICE experiment at the Large Hadron Collider at CERN in Geneva, Switzerland. The design steps examined in the thesis include analysis and optimization of the I/O access patterns of the user workload, integration of the storage resources, and development of techniques for effective system administration and operation of the facility in a shared computing environment. A number of I/O performance issues on multiple levels of the I/O subsystem, introduced by the use of hard disks for data storage, have been addressed by means of exhaustive benchmarking and thorough analysis of the I/O of the user applications in the ALICE software framework. Defining the set of requirements for the storage system, describing the potential performance bottlenecks and single points of failure, and examining possible ways to avoid them allows one to develop guidelines for selecting how to integrate the storage resources. The solution for preserving a specific software stack for the experiment in a shared environment is presented along with its effects on user workload performance. The proposal of a flexible model to deploy and operate the ALICE Tier-2 infrastructure and applications in a virtual environment through adoption of cloud computing technology and the 'Infrastructure as Code' concept completes the thesis. Scientific software applications can be efficiently computed in a virtual environment, and there is an urgent need to adapt the infrastructure for effective usage of cloud resources.

  15. Distributed computer control system in the Nova Laser Fusion Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    1985-09-01

    The EE Technical Review has two purposes - to inform readers of various activities within the Electronics Engineering Department and to promote the exchange of ideas. The articles, by design, are brief summaries of EE work. The articles included in this report are as follows: Overview - Nova Control System; Centralized Computer-Based Controls for the Nova Laser Facility; Nova Pulse-Power Control System; Nova Laser Alignment Control System; Nova Beam Diagnostic System; Nova Target-Diagnostics Control System; and Nova Shot Scheduler. The 7 papers are individually abstracted.

  16. Horizontal Air-Ground Heat Exchanger Performance and Humidity Simulation by Computational Fluid Dynamic Analysis

    Directory of Open Access Journals (Sweden)

    Paolo Maria Congedo

    2016-11-01

    Improving energy efficiency in buildings and promoting renewables are key objectives of European energy policies. Several technological measures are being developed to enhance the energy performance of buildings. Among these, geothermal systems present a huge potential to reduce energy consumption for mechanical ventilation and cooling, but their behavior under varying parameters and boundary and climatic conditions is not fully established. In this paper a horizontal air-ground heat exchanger (HAGHE) system is studied through the development of a computational fluid dynamics (CFD) model. Summer and winter conditions representative of the Mediterranean climate are analyzed to evaluate differences in operation and thermal performance. A particular focus is given to humidity variations, as this parameter has a major impact on indoor air quality and comfort. Results show the benefits that HAGHE systems can provide in reducing energy consumption in all seasons, particularly in summer, when free cooling can be implemented, avoiding further air treatment by heat pumps.
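    As a back-of-the-envelope complement to the CFD model discussed above, the outlet temperature of a buried pipe can be estimated with a steady-state effectiveness relation in which the air temperature approaches the soil temperature exponentially along the pipe. The sketch below is a lumped simplification, not the paper's model, and every parameter value is a hypothetical placeholder.

```python
# Lumped steady-state estimate of the air temperature leaving a buried pipe.
import numpy as np

def outlet_temperature(T_in, T_soil, U, d, length, m_dot, cp=1005.0):
    """Exponential approach of the air temperature to the soil temperature.
    U: overall air-to-soil heat transfer coefficient [W/m^2/K], d: pipe diameter [m],
    m_dot: air mass flow rate [kg/s], cp: air heat capacity [J/kg/K]."""
    NTU = U * np.pi * d * length / (m_dot * cp)
    return T_soil + (T_in - T_soil) * np.exp(-NTU)

# summer free-cooling example: 32 C outdoor air drawn through 50 m of buried pipe
T_out = outlet_temperature(T_in=32.0, T_soil=17.0, U=6.0, d=0.25, length=50.0, m_dot=0.10)
print(f"supply air temperature after the buried pipe: {T_out:.1f} C")
```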

  17. Validation of space/ground antenna control algorithms using a computer-aided design tool

    Science.gov (United States)

    Gantenbein, Rex E.

    1995-01-01

    The validation of the algorithms for controlling the space-to-ground antenna subsystem for Space Station Alpha is an important step in assuring reliable communications. These algorithms have been developed and tested using a simulation environment based on a computer-aided design tool that can provide a time-based execution framework with variable environmental parameters. Our work this summer has involved the exploration of this environment and the documentation of the procedures used to validate these algorithms. We have installed a variety of tools in a laboratory of the Tracking and Communications division for reproducing the simulation experiments carried out on these algorithms to verify that they do meet their requirements for controlling the antenna systems. In this report, we describe the processes used in these simulations and our work in validating the tests used.

  18. Preoperative computed tomography-guided percutaneous localization of ground glass pulmonary opacity with polylactic acid injection.

    Science.gov (United States)

    Hu, Mu; Zhi, Xiuyi; Zhang, Jian

    2015-07-01

    Localization of a ground glass nodule is a difficult challenge for thoracic surgeons, especially for ground glass opacities (GGOs) less than 10 mm in diameter. In this study we implement a new method for preoperative localization of pulmonary GGOs. From October 2013 to December 2014, computed tomography-guided percutaneous polylactic acid injection localizations were performed for five pulmonary nodules in five patients (2 men and 3 women; mean age, 59.8 years; range, 54-65 years). The injection was feasible in all patients and the localization effect was excellent. The mean procedure duration was 12.6 minutes (range, 10-15 minutes) and the mean volume of polylactic acid injected was 0.38 mL. The wedge resections were easily and successfully performed in all five cases. The cutting margin was no less than 2 cm from the lesion. This technique is promising for the determination of GGO location in thoracoscopic surgery for wedge resection.

  19. Soft Computing Approach to Evaluate and Predict Blast-Induced Ground Vibration

    Science.gov (United States)

    Khandelwal, Manoj

    2010-05-01

    At the same excavation site, different predictors give different values of safe PPV vis-à-vis safe charge per delay; there is no uniformity in the results predicted by different predictors. All vibration predictor equations have their own site-specific constants, and therefore they cannot be used in a generalized way with confidence and zero risk. To overcome this limitation, soft computing tools such as the artificial neural network (ANN) have attracted attention because of their ability to learn from previously acquired patterns. An ANN is a highly interconnected network of a large number of processing elements, called neurons, in an architecture inspired by the brain. ANNs can be massively parallel and hence are said to exhibit parallel distributed processing. Once the network has been trained with a sufficient number of sample data sets, it can make reliable and trustworthy predictions, on the basis of its previous learning, about the output related to a new input data set of a similar pattern. This paper deals with the application of an ANN to the prediction of ground vibration, taking into consideration the maximum charge per delay and the distance between the blast face and the monitoring point. To investigate the appropriateness of this approach, the ANN predictions have also been compared with other vibration predictor equations.
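    The two-input mapping described above (maximum charge per delay and distance in, PPV out) can be sketched with a small network. The fragment below trains a scikit-learn multilayer perceptron on synthetic data generated from a USBM-style scaled-distance law with noise, then compares its prediction with that empirical predictor; the site constants and all data are illustrative, not field measurements.

```python
# Toy ANN predictor of blast-induced PPV versus a scaled-distance empirical law.
# Training data are synthetic, generated from the empirical law plus noise.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n = 300
Q = rng.uniform(50.0, 500.0, n)               # maximum charge per delay [kg]
D = rng.uniform(50.0, 1000.0, n)              # blast face to monitoring point distance [m]

K, B = 1140.0, 1.6                            # illustrative site constants, not measured values
ppv = K * (D / np.sqrt(Q)) ** (-B)            # USBM-style scaled-distance predictor [mm/s]
ppv_noisy = ppv * rng.lognormal(0.0, 0.25, n) # synthetic "measurements" with scatter

X = np.column_stack([Q, D])
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0))
ann.fit(X, np.log(ppv_noisy))                 # learn log-PPV for better conditioning

test = np.array([[250.0, 300.0]])             # 250 kg per delay, monitored 300 m away
ppv_ann = float(np.exp(ann.predict(test))[0])
ppv_usbm = K * (test[0, 1] / np.sqrt(test[0, 0])) ** (-B)
print(f"ANN prediction            : {ppv_ann:6.1f} mm/s")
print(f"scaled-distance predictor : {ppv_usbm:6.1f} mm/s")
```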

  20. Resource Conservation and Recovery Act ground-water monitoring projects for Hanford facilities: Progress Report for the Period April 1 to June 30, 1989

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.M.; Bates, D.J.; Lundgren, R.E.

    1989-09-01

    This report describes the progress of 13 Hanford ground-water monitoring projects for the period April 1 to June 30, 1989. These projects are for the 300 area process trenches (300 area), 183-H solar evaporation basins (100-H area), 200 areas low-level burial grounds, nonradioactive dangerous waste landfill (southeast of the 200 areas), 1301-N liquid waste disposal facility (100-N area), 1324-N surface impoundment and 1324-NA percolation pond (100-N area), 1325-N liquid waste disposal facility (100-N area), 216-A-10 crib (200-east area), 216-A-29 ditch (200-east area), 216-A-36B crib (200-east area), 216-B-36B crib (200-east area), 216-B-3 pond (east of the 200-east area), 2101-M pond (200-east area), grout treatment facility (200-east area).

  1. The HEPCloud Facility: elastic computing for High Energy Physics – The NOvA Use Case

    Energy Technology Data Exchange (ETDEWEB)

    Fuess, S. [Fermilab; Garzoglio, G. [Fermilab; Holzman, B. [Fermilab; Kennedy, R. [Fermilab; Norman, A. [Fermilab; Timm, S. [Fermilab; Tiradani, A. [Fermilab

    2017-03-15

    The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdowns, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial Clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project has demonstrated the use of the “elastic” provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores - a 25 percent increase - in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the locally allocated resources and utilizing the local AWS S3 storage to optimize data handling operations and costs. NOvA used the same familiar services used for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper

  2. Computational dosimetry for grounded and ungrounded human models due to contact current

    Science.gov (United States)

    Chan, Kwok Hung; Hattori, Junya; Laakso, Ilkka; Hirata, Akimasa; Taki, Masao

    2013-08-01

    This study presents the computational dosimetry of contact currents for grounded and ungrounded human models. The uncertainty of the quasi-static (QS) approximation of the in situ electric field induced in a grounded/ungrounded human body due to the contact current is first estimated. Different scenarios of cylindrical and anatomical human body models are considered, and the results are compared with the full-wave analysis. In the QS analysis, the induced field in the grounded cylindrical model is calculated by the QS finite-difference time-domain (QS-FDTD) method, and compared with the analytical solution. Because no analytical solution is available for the grounded/ungrounded anatomical human body model, the results of the QS-FDTD method are then compared with those of the conventional FDTD method. The upper frequency limit for the QS approximation in the contact current dosimetry is found to be 3 MHz, with a relative local error of less than 10%. The error increases above this frequency, which can be attributed to the neglect of the displacement current. The QS or conventional FDTD method is used for the dosimetry of induced electric field and/or specific absorption rate (SAR) for a contact current injected into the index finger of a human body model in the frequency range from 10 Hz to 100 MHz. The in situ electric fields or SAR are compared with the basic restrictions in the international guidelines/standards. The maximum electric field or the 99th percentile value of the electric fields appear not only in the fat and muscle tissues of the finger, but also around the wrist, forearm, and the upper arm. Some discrepancies are observed between the basic restrictions for the electric field and SAR and the reference levels for the contact current, especially in the extremities. These discrepancies are shown by an equation that relates the current density, tissue conductivity, and induced electric field in the finger with a cross-sectional area of 1 cm2.
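
    The equation mentioned in the final sentence is not reproduced in the record. Under the quasi-static assumption used in the study, a standard estimate relating the injected contact current I, the finger cross-section A, the tissue conductivity sigma, and the induced in situ field E is sketched below; the numerical values in the comment are illustrative assumptions, not results from the paper.

      % Quasi-static estimate of the in situ field in the finger (sketch):
      % J = current density, E = induced electric field.
      \[
        J = \frac{I}{A}, \qquad E = \frac{J}{\sigma} = \frac{I}{\sigma A}.
      \]
      % Example with assumed values: I = 1 mA, A = 1 cm^2 = 10^{-4} m^2, sigma = 0.2 S/m
      % gives E = 10^{-3} / (0.2 \times 10^{-4}) = 50 V/m.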

  3. Enhanced Computational Infrastructure for Data Analysis at the DIII-D National Fusion Facility

    Energy Technology Data Exchange (ETDEWEB)

    Schissel, D.P.; Peng, Q.; Schachter, J.; Terpstra, T.B.; Casper, T.A.; Freeman, J.; Jong, R.; Keith, K.M.; Meyer, W.H.; Parker, C.T.

    1999-08-01

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available between pulses, giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator, since the DIII-D National Team consists of scientists from 9 national laboratories, 19 foreign laboratories, 16 universities, and 5 industrial partnerships. As a result of this work, DIII-D data is available on a 24 x 7 basis from a set of viewing and analysis tools that can be run either on the collaborators' or DIII-D's computer systems. Additionally, a Web-based data and code documentation system has been created to aid the novice and expert user alike.

  4. Calculation of shielding of X rays in radiotherapy facilities with computer aid; Calculo de blindagem para instalacoes de radioterapia por raios X com auxilio de computador

    Energy Technology Data Exchange (ETDEWEB)

    Pedrosa, Paulo Sergio; Farias, Marcos Santana [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)]. E-mail: pedrosa@ien.gov.br; msantana@ien.gov.br; Gavazza, Sergio [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil). Dept. das Ciencias Fundamentais, Radiacao e Meio Ambiente]. E-mail: gavazza@ugf.br

    2005-07-01

    This work presents a methodology for computer-aided calculation of X-ray shielding in radiotherapy facilities. A user-friendly program, called RadTeraX, was developed in the Delphi programming language; from manually entered data describing a basic architectural plan and a few parameters, it interprets the geometry and calculates the shielding of the walls, floor, and roof of an X-ray radiotherapy installation. As a final product, the program displays a graphical screen with all the input data and the calculated shielding, together with the corresponding calculation report. In Brazil, shielding calculations for X-ray radiotherapy facilities are still based on the recommendations of NCRP-49, which establishes the calculation methodology required for a shielding project. However, at high energies, where a maze must be constructed, NCRP-49 is insufficient; studies in this area produced an article that proposes a solution to the problem, and this solution was implemented in the program. The program can be applied to the practical execution of shielding projects for radiotherapy facilities and used didactically in comparison with NCRP-49, and has been registered under number 00059420 at INPI - Instituto Nacional da Propriedade Industrial (National Institute of Industrial Property). (author)

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  6. Control computers and automation subsystem equipment in Del'fin facility

    Energy Technology Data Exchange (ETDEWEB)

    Allin, A.P.; Belen'kiy, Yu.M.; Borzyak, Yu.V.; Bykovskiy, N.E.; Grigor'ev, V.E.; Gusyatnikov, B.S.; Doroshkevich, I.L.; Ivanov, V.V.; Kuchinskiy, A.G.; Savchenko, V.M.

    1983-01-01

    The power equipment of the Del'fin laser facility contains a 10^7 J capacitor bank divided into four identical sections feeding a power preamplifier and three output stages each, with 328 IFP-20,000 flash tubes designed to produce 2.5 kJ laser radiation. The system for controlling and automating laser experiments, modeled after the SHIVA system (Lawrence Livermore Laboratory), includes a computer complex in the central console and the sequential ring bus, with CAMAC peripheral stations inside the optical chamber including three microcomputers (Polon, Nuclear Enterprise, HENESA). The control computer with a 28 K memory is linked to the CAMAC stations and to a terminal DECWRITER-II. The system crate contains a 9030/32 interface, a 3992 driver for the sequential bus, and a 064 LAM GRADER interrogation processing module. The computer complex also includes an RSX-11M multiprogram multipurpose real-time disk operating system which uses standard DECNET-11 software and includes a translator from MACRO-11 assembler language to FORTRAN-4, BASIC-11, COBOL and a few other languages. The laser power is automatically controlled through the CAMAC stations according to a main program as well as dialog maintenance programs (BCE, KASKN, KASN, DIAL, STRB, MODB, BKYM, STRB, BKYB, BKY) and measurement programs (CONTRO, CNTRO, KOD) designed to ensure simple and reliable high-speed control of laser experiments. All alignment and regulation of the laser facility is automated through optical channels (aligning LTI-501 laser, collimators, lenses, auxiliary optics) and servomechanisms (coordinate photoreceiver-homing signal module-step motors) designed for positioning and orientating mirrors 80 mm and 30 mm in diameter. 25 references, 31 figures, 2 tables.

  7. Computer model of two-dimensional solute transport and dispersion in ground water

    Science.gov (United States)

    Konikow, Leonard F.; Bredehoeft, J.D.

    1978-01-01

    This report presents a model that simulates solute transport in flowing ground water. The model is both general and flexible in that it can be applied to a wide range of problem types. It is applicable to one- or two-dimensional problems involving steady-state or transient flow. The model computes changes in concentration over time caused by the processes of convective transport, hydrodynamic dispersion, and mixing (or dilution) from fluid sources. The model assumes that the solute is non-reactive and that gradients of fluid density, viscosity, and temperature do not affect the velocity distribution. However, the aquifer may be heterogeneous and (or) anisotropic. The model couples the ground-water flow equation with the solute-transport equation. The digital computer program uses an alternating-direction implicit procedure to solve a finite-difference approximation to the ground-water flow equation, and it uses the method of characteristics to solve the solute-transport equation. The latter uses a particle-tracking procedure to represent convective transport and a two-step explicit procedure to solve a finite-difference equation that describes the effects of hydrodynamic dispersion, fluid sources and sinks, and divergence of velocity. This explicit procedure has several stability criteria, but the consequent time-step limitations are automatically determined by the program. The report includes a listing of the computer program, which is written in FORTRAN IV and contains about 2,000 lines. The model is based on a rectangular, block-centered, finite-difference grid. It allows the specification of any number of injection or withdrawal wells and of spatially varying diffuse recharge or discharge, saturated thickness, transmissivity, boundary conditions, and initial heads and concentrations. The program also permits the designation of up to five nodes as observation points, for which a summary table of head and concentration versus time is printed at the end of the
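
    For orientation, the depth-averaged solute-transport equation solved by models of this type can be written as below; this is a generic textbook form (an assumption here), and the report's exact notation and source-term convention may differ.

      % C = concentration, V_i = seepage velocity, D_ij = dispersion tensor,
      % b = saturated thickness, eps = effective porosity, W = source/sink flux, C' = source concentration.
      \[
        \frac{\partial C}{\partial t}
          = \frac{\partial}{\partial x_i}\!\left( D_{ij} \frac{\partial C}{\partial x_j} \right)
            - V_i \frac{\partial C}{\partial x_i}
            + \frac{(C' - C)\, W}{\varepsilon\, b},
        \qquad i, j = 1, 2.
      \]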

  8. The Overview of the National Ignition Facility Distributed Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L J; Bettenhausen, R C; Carey, R A; Estes, C M; Fisher, J M; Krammen, J E; Reed, R K; VanArsdall, P J; Woodruff, J P

    2001-10-15

    The Integrated Computer Control System (ICCS) for the National Ignition Facility (NIF) is a layered architecture of 300 front-end processors (FEP) coordinated by supervisor subsystems including automatic beam alignment and wavefront control, laser and target diagnostics, pulse power, and shot control timed to 30 ps. FEP computers incorporate either VxWorks on PowerPC or Solaris on UltraSPARC processors that interface to over 45,000 control points attached to VME-bus or PCI-bus crates respectively. Typical devices are stepping motors, transient digitizers, calorimeters, and photodiodes. The front-end layer includes another segment comprising an additional 14,000 control points for industrial controls including vacuum, argon, synthetic air, and safety interlocks implemented with Allen-Bradley programmable logic controllers (PLCs). The computer network is augmented with asynchronous transfer mode (ATM) links that deliver video streams from 500 sensor cameras monitoring the 192 laser beams to operator workstations. Software is based on an object-oriented framework using CORBA distribution that incorporates services for archiving, machine configuration, graphical user interface, monitoring, event logging, scripting, alert management, and access control. Software coding using a mixed language environment of Ada95 and Java is one-third complete at over 300 thousand source lines. Control system installation is currently under way for the first 8 beams, with project completion scheduled for 2008.

  9. RCRA (Resource Conservation and Recovery Act of 1976) ground-water monitoring projects for Hanford facilities: Progress report, October 1--December 31, 1988: Volume 1, Text

    Energy Technology Data Exchange (ETDEWEB)

    Fruland, R.M.; Bates, D.J.; Lundgren, R.E.

    1989-04-01

    This report describes the progress of 13 Hanford ground-water monitoring projects for the period October 1 to December 31, 1988. There are 16 individual hazardous waste facilities covered by the 13 ground-water monitoring projects. The Grout Treatment Facility is included in this series of quarterly reports for the first time. The 13 projects discussed in this report were designed according to applicable interim-status ground-water monitoring requirements specified in the Resource Conservation and Recovery Act of 1976 (RCRA). During this quarter, field activities primarily consisted of sampling and analyses, and water-level monitoring. The 200 Areas Low-Level Burial Grounds section includes sediment analyses in addition to ground-water monitoring results. Twelve new wells were installed during the previous quarter: two at the 216-A-29 Ditch, six at the 216-A-10 Crib, and four at the 216-B-3 Pond. Preliminary characterization data for these new wells include drillers' logs and other drilling and site characterization data, and are provided in Volume 2 or on microfiche in the back of Volume 1. 26 refs., 28 figs., 74 tabs.

  10. Fair Grounds, Fair Grounds point locations in Critical Facilities data layer, Published in 2008, 1:4800 (1in=400ft) scale, Gove County.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Fair Grounds dataset, published at 1:4800 (1in=400ft) scale, was produced all or in part from Field Survey/GPS information as of 2008. It is described as 'Fair...

  11. ARADISH - Development of a Standardized Plant Growth Chamber for Experiments in Gravitational Biology Using Ground Based Facilities

    Science.gov (United States)

    Schüler, Oliver; Krause, Lars; Görög, Mark; Hauslage, Jens; Kesseler, Leona; Böhmer, Maik; Hemmersbach, Ruth

    2016-06-01

    Plant development strongly depends on environmental conditions. Growing plants in Biological Life Support Systems (BLSS), which are a necessity for human survival during long-term space exploration missions, poses a particular problem because, in addition to the traditional environmental factors, microgravity (or reduced gravity such as on the Moon or Mars) and limited gas exchange hamper plant growth. Studying the effects of reduced gravity on plants requires real or simulated microgravity experiments under highly standardized conditions, in order to avoid the influence of other environmental factors. Analysis of a large number of biological replicates, which is necessary for the detection of subtle phenotypical differences, can so far only be achieved in Ground Based Facilities (GBF). Besides differing experimental conditions, the use of a variety of different plant growth chambers was a major factor that led to a lack of reproducibility and comparability in previous studies. We have developed a flexible and customizable plant growth chamber, called ARAbidopsis DISH (ARADISH), which supports plant growth from seed to seedling either in a hydroponic system or on agar. By developing a special holder, the ARADISH can be used for experiments with Arabidopsis thaliana, or a plant with a similar habitus, on common GBF hardware, including 2D clinostats and Random Positioning Machines (RPM). The ARADISH growth chamber has a controlled illumination system of red and blue light-emitting diodes (LED), which allows the user to apply defined light conditions. As a proof of concept, we tested a prototype in a proteomic experiment in which plants were exposed to simulated microgravity or a 90° stimulus. We optimized the design and performed viability tests after several days of growth in the hardware that underline the utility of ARADISH in microgravity research.

  12. On the importance of a rich embodiment in the grounding of concepts: perspectives from embodied cognitive science and computational linguistics.

    Science.gov (United States)

    Thill, Serge; Padó, Sebastian; Ziemke, Tom

    2014-07-01

    The recent trend in cognitive robotics experiments on language learning, symbol grounding, and related issues necessarily entails a reduction of sensorimotor aspects from those provided by a human body to those that can be realized in machines, limiting robotic models of symbol grounding in this respect. Here, we argue that there is a need for modeling work in this domain to explicitly take into account the richer human embodiment even for concrete concepts that prima facie relate merely to simple actions, and illustrate this using distributional methods from computational linguistics which allow us to investigate grounding of concepts based on their actual usage. We also argue that these techniques have applications in theories and models of grounding, particularly in machine implementations thereof. Similarly, considering the grounding of concepts in human terms may be of benefit to future work in computational linguistics, in particular in going beyond "grounding" concepts in the textual modality alone. Overall, we highlight the overall potential for a mutually beneficial relationship between the two fields.

  13. Addressing capability computing challenges of high-resolution global climate modelling at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin

    2014-05-01

    During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25 degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in size from 20 MB to over 100 GB. Effective utilization of leadership-class resources requires careful planning and preparation. Application software, such as CESM, needs to be ported, optimized, and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource-intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours", are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan is a Cray XK7 system capable of a theoretical peak performance of over 27 PFlop/s; it consists of 18,688 compute nodes, with an NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU in every node, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560

  14. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality-affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  15. Lustre Distributed Name Space (DNE) Evaluation at the Oak Ridge Leadership Computing Facility (OLCF)

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, James S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Leverman, Dustin B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Hanley, Jesse A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Oral, Sarp [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences

    2016-08-22

    This document describes the Lustre Distributed Name Space (DNE) evaluation carried out at the Oak Ridge Leadership Computing Facility (OLCF) between 2014 and 2015. DNE is a development project funded by OpenSFS to improve Lustre metadata performance and scalability. The development effort was split into two parts: the first phase (DNE P1) provided support for remote directories over remote Lustre Metadata Server (MDS) nodes and Metadata Target (MDT) devices, while the second phase (DNE P2) addressed striped directories over multiple remote MDS nodes and MDT devices. The OLCF has been actively evaluating the performance, reliability, and functionality of both DNE phases. For these tests, an internal OLCF testbed was used. Results are promising, and OLCF is planning a full DNE deployment on production systems in the mid-2016 timeframe.

  16. Status Of The National Ignition Campaign And National Ignition Facility Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L; Brunton, G; Carey, R; Demaret, R; Fisher, J; Fishler, B; Ludwigsen, P; Marshall, C; Reed, R; Shelton, R; Townsend, S

    2011-03-18

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility that contains a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn. NIF is operated by the Integrated Computer Control System (ICCS), an object-oriented, CORBA-based system distributed among over 1800 front-end processors, embedded controllers, and supervisory servers. In the fall of 2010, a set of experiments began with deuterium- and tritium-filled targets as part of the National Ignition Campaign (NIC). At present, all 192 laser beams routinely fire to target chamber center to conduct fusion and high-energy-density experiments. During the past year, the control system was expanded to include automation of the cryogenic target system, and over 20 diagnostic systems were deployed and used in experiments to support fusion research. This talk discusses the current status of the NIC and the plan for controls and information systems to support these experiments on the path to ignition.

  17. Numerical methods for computing the ground state of spin-1 Bose-Einstein condensates in a uniform magnetic field.

    Science.gov (United States)

    Lim, Fong Yin; Bao, Weizhu

    2008-12-01

    We propose efficient and accurate numerical methods for computing the ground-state solution of spin-1 Bose-Einstein condensates subjected to a uniform magnetic field. The key idea in designing the numerical method is based on the normalized gradient flow with the introduction of a third normalization condition, together with two physical constraints on the conservation of total mass and conservation of total magnetization. Different treatments of the Zeeman energy terms are found to yield different numerical accuracies and stabilities. Numerical comparison between different numerical schemes is made, and the best scheme is identified. The numerical scheme is then applied to compute the condensate ground state in a harmonic plus optical lattice potential, and the effect of the periodic potential, in particular to the relative population of each hyperfine component, is investigated through comparison to the condensate ground state in a pure harmonic trap.
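
    For reference, the two physical constraints mentioned in the abstract take the following standard form for the three hyperfine components of a spin-1 condensate; the paper's third normalization condition supplies the remaining degree of freedom in the gradient-flow projection and is not reproduced in the record.

      % psi_1, psi_0, psi_{-1}: hyperfine components; N = total mass, M = total magnetization.
      \[
        N[\Psi] = \int \left( |\psi_1|^2 + |\psi_0|^2 + |\psi_{-1}|^2 \right) \mathrm{d}\mathbf{x} = 1,
        \qquad
        M[\Psi] = \int \left( |\psi_1|^2 - |\psi_{-1}|^2 \right) \mathrm{d}\mathbf{x} = M.
      \]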

  18. Computer simulation models relevant to ground water contamination from EOR or other fluids - state-of-the-art

    Energy Technology Data Exchange (ETDEWEB)

    Kayser, M.B.; Collins, A.G.

    1986-03-01

    Ground water contamination is a serious national problem. The use of computers to simulate the behavior of fluids in the subsurface has proliferated extensively over the last decade. Numerical models are being used to solve water supply problems, various kinds of energy production problems, and ground water contamination problems. Modeling techniques have progressed to the point that their accuracy is only limited by the modeller's ability to describe the reservoir in question and the heterogeneities therein. Pursuant to the Task and Milestone Update of Project BE3A, this report summarizes the state of the art of computer simulation models relevant to contamination of ground water by enhanced oil recovery (EOR) chemicals and/or waste fluids. 150 refs., 6 tabs.

  19. Evaluating Cloud Computing in the Proposed NASA DESDynI Ground Data System

    Science.gov (United States)

    Tran, John J.; Cinquini, Luca; Mattmann, Chris A.; Zimdars, Paul A.; Cuddy, David T.; Leung, Kon S.; Kwoun, Oh-Ig; Crichton, Dan; Freeborn, Dana

    2011-01-01

    The proposed NASA Deformation, Ecosystem Structure and Dynamics of Ice (DESDynI) mission would be a first-of-breed endeavor that would fundamentally change the paradigm by which Earth Science data systems at NASA are built. DESDynI is evaluating a distributed architecture where expert science nodes around the country all engage in some form of mission processing and data archiving. This is compared to the traditional NASA Earth Science missions, where the science processing is typically centralized. What's more, DESDynI is poised to profoundly increase the amount of data collection and processing well into the 5 terabyte/day and tens of thousands of jobs range, both of which comprise a tremendous challenge to DESDynI's proposed distributed data system architecture. In this paper, we report on a set of architectural trade studies and benchmarks meant to inform the DESDynI mission and the broader community of the impacts of these unprecedented requirements. In particular, we evaluate the benefits of cloud computing and its integration with our existing NASA ground data system software called Apache Object Oriented Data Technology (OODT). The preliminary conclusions of our study suggest that the use of the cloud and OODT together synergistically form an effective, efficient and extensible combination that could meet the challenges of NASA science missions requiring DESDynI-like data collection and processing volumes at reduced costs.

  1. NuSTAR calibration facility and multilayer reference database: Optic response model comparison to NuSTAR on-ground calibration data

    DEFF Research Database (Denmark)

    Brejnholt, Nicolai

    ... the optic response for both on- and off-axis NuSTAR observations, detailed knowledge of the as-coated multilayer is required. The purpose of this thesis is to establish a multilayer reference database. As an integral part of this effort, a hard X-ray calibration facility was designed and constructed. Each ... on-ground calibration data. The on-ground calibration and flight witness sample investigations were carried out at a hard X-ray facility constructed for the same purpose. This thesis established the NuSTAR multilayer reference database and found that it provides a good description of the as-coated multilayers of the NuSTAR optics. A thorough quantitative study of the NuSTAR effective area requires the utilized ray tracing tool to mature further. Currently, the effective area estimated from the multilayer reference database represents an optimistic upper limit. Along with a conservative estimate derived from on ...

  2. New-Measurement Techniques to Diagnose Charged Dust and Plasma Layers in the Near-Earth Space Environment Using Ground-Based Ionospheric Heating Facilities

    OpenAIRE

    Mahmoudian, Alireza

    2013-01-01

    Recently, experimental observations have shown that radar echoes from the irregularity source region associated with mesospheric dusty space plasmas may be modulated by radio wave heating with ground-based ionospheric heating facilities. These experiments show great promise as a diagnostic for the associated dusty plasma in the Near-Earth Space Environment which is believed to have links to global change. This provides an alternative to more complicated and costly space-based observational app...

  3. Estimation of natural ground water recharge for the performance assessment of a low-level waste disposal facility at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Rockhold, M.L.; Fayer, M.J.; Kincaid, C.T.; Gee, G.W.

    1995-03-01

    In 1994, the Pacific Northwest Laboratory (PNL) initiated the Recharge Task, under the PNL Vitrification Technology Development (PVTD) project, to assist Westinghouse Hanford Company (WHC) in designing and assessing the performance of a low-level waste (LLW) disposal facility for the US Department of Energy (DOE). The Recharge Task was established to address the issue of ground water recharge in and around the LLW facility and throughout the Hanford Site as it affects the unconfined aquifer under the facility. The objectives of this report are to summarize the current knowledge of natural ground water recharge at the Hanford Site and to outline the work that must be completed in order to provide defensible estimates of recharge for use in the performance assessment of this LLW disposal facility. Recharge studies at the Hanford Site indicate that recharge rates are highly variable, ranging from nearly zero to greater than 100 mm/yr depending on precipitation, vegetative cover, and soil types. Coarse-textured soils without plants yielded the greatest recharge. Finer-textured soils, with or without plants, yielded the least. Lysimeters provided accurate, short-term measurements of recharge as well as water-balance data for the soil-atmosphere interface and root zone. Tracers provided estimates of longer-term average recharge rates in undisturbed settings. Numerical models demonstrated the sensitivity of recharge rates to different processes and forecast recharge rates for different conditions. All of these tools (lysimetry, tracers, and numerical models) are considered vital to the development of defensible estimates of natural ground water recharge rates for the performance assessment of a LLW disposal facility at the Hanford Site.

  4. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation Petascale-capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and

  5. Investigation of seismicity and related effects at NASA Ames-Dryden Flight Research Facility, Computer Center, Edwards, California

    Science.gov (United States)

    Cousineau, R. D.; Crook, R., Jr.; Leeds, D. J.

    1985-01-01

    This report discusses a geological and seismological investigation of the NASA Ames-Dryden Flight Research Facility site at Edwards, California. Results are presented as seismic design criteria, with design values of the pertinent ground motion parameters, probability of recurrence, and recommended analogous time-history accelerograms with their corresponding spectra. The recommendations apply specifically to the Dryden site and should not be extrapolated to other sites with varying foundation and geologic conditions or different seismic environments.

  6. The National Ignition Facility: Status of the Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Van Arsdall, P J; Bryant, R; Carey, R; Casavant, D; Demaret, R; Edwards, O; Ferguson, W; Krammen, J; Lagin, L; Larson, D; Lee, A; Ludwigsen, P; Miller, M; Moses, E; Nyholm, R; Reed, R; Shelton, R; Wuest, C

    2003-10-13

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately three quarters complete with over 750 thousand source lines of code having undergone off-line verification tests and deployed to the facility. NIF has entered the first phases of its laser commissioning program. NIF's highest 3ω single laser beam performance is 10.4 kJ, equivalent to 2 MJ

  7. Status of the National Ignition Facility Integrated Computer Control System (ICCS) on the path to ignition

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L.J. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94550 (United States)], E-mail: lagin1@llnl.gov; Bettenhausen, R.C.; Bowers, G.A.; Carey, R.W.; Edwards, O.D.; Estes, C.M.; Demaret, R.D.; Ferguson, S.W.; Fisher, J.M.; Ho, J.C.; Ludwigsen, A.P.; Mathisen, D.G.; Marshall, C.D.; Matone, J.T.; McGuigan, D.L.; Sanchez, R.J.; Stout, E.A.; Tekle, E.A.; Townsend, S.L.; Van Arsdall, P.J. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94550 (United States)] (and others)

    2008-04-15

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility under construction that will contain a 192-beam, 1.8-MJ, 500-TW, ultraviolet laser system together with a 10-m diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. NIF comprises 24 independent bundles of eight beams each using laser hardware that is modularized into more than 6000 line replaceable units such as optical assemblies, laser amplifiers, and multi-function sensor packages containing 60,000 control and diagnostic points. NIF is operated by the large-scale Integrated Computer Control System (ICCS) in an architecture partitioned by bundle and distributed among over 800 front-end processors and 50 supervisory servers. NIF's automated control subsystems are built from a common object-oriented software framework based on CORBA distribution that deploys the software across the computer network and achieves interoperation between different languages and target architectures. A shot automation framework has been deployed during the past year to orchestrate and automate shots performed at the NIF using the ICCS. In December 2006, a full cluster of 48 beams of NIF was fired simultaneously, demonstrating that the independent bundle control system will scale to the full 192 beams. At present, 72 beams have been commissioned and have demonstrated 1.4-MJ capability of infrared light. During the next 2 years, the control system will be expanded in preparation for project completion in 2009 to include automation of target area systems including

  8. Unsteady Computations of a Jet in a Crossflow with Ground Effect

    Science.gov (United States)

    Pandya, Shishir; Murman, Scott; Venkateswaran, Sankaran; Kwak, Dochan (Technical Monitor)

    2002-01-01

    A numerical study of a jet in crossflow with ground effect is conducted using OVERFLOW with dual time-stepping and low Mach number preconditioning. The results of the numerical study are compared to an experiment to show that the numerical methods are capable of capturing the dominant features of the flow field as well as the unsteadiness associated with the ground vortex.

  9. A service-based SLA (Service Level Agreement) for the RACF (RHIC and ATLAS Computing Facility) at Brookhaven National Lab

    Science.gov (United States)

    Karasawa, Mizuka; Chan, Tony; Smith, Jason

    2010-04-01

    The RACF provides computing support to a broad spectrum of scientific programs at Brookhaven. The continuing growth of the facility, the diverse needs of the scientific programs and the increasingly prominent role of distributed computing require the RACF to change from a system-based to a service-based SLA with our user communities. A service-based SLA allows the RACF to coordinate more efficiently the operation, maintenance and development of the facility by mapping out a matrix of system and service dependencies and by creating a new, configurable alarm management layer that automates service alerts and notification of operations staff. This paper describes the adjustments made by the RACF to transition to a service-based SLA, including the integration of its monitoring software, alarm notification mechanism and service ticket system at the facility to make the new SLA a reality.

  10. Resource Conservation and Recovery Act ground-water monitoring projects for Hanford facilities: Progress report for the period July 1 to September 30, 1988: Volume 1, Text

    Energy Technology Data Exchange (ETDEWEB)

    Fruland, R.M.; Bates, D.J.; Lundgren, R.E.

    1989-02-01

    This report describes the progress of 12 Hanford ground-water monitoring projects for the period July 1 to September 30, 1988. During this quarter, field activities at the 300 Area process trenches, the Nonradioactive Dangerous Waste Landfill, the 183-H Solar Evaporation Basins, the 1324-N/NA Surface Impoundment and Percolation Ponds, the 1301-N and 1325-N Liquid Waste Disposal Facilities, and the 216-A-36B Crib consisted of ground-water sampling and analyses, and water-level monitoring. The 200 Area Low-Level Burial Grounds section includes well development data, sediment analysis, and water-level measurements. Ground-water sampling was begun at this site, and results will be included in next quarter's report. Twelve new wells were installed during the quarter: two at the 216-A-29 Ditch, six at the 216-A-10 Crib, and four at the 216-B-3 Pond. Preliminary characterization data for these new wells are included in this report. Driller's logs and other drilling and site characterization data will be provided in the next quarterly report. At the 2101-M Pond, construction was completed on four wells, and initial ground-water samples were taken. The drilling logs, geophysical logging data, and as-built diagrams are included in this report in Volume 2. 19 refs., 24 figs., 39 tabs.

  11. In Situ Production of Chlorine-36 in the Eastern Snake River Plain Aquifer, Idaho: Implications for Describing Ground-Water Contamination Near a Nuclear Facility

    Energy Technology Data Exchange (ETDEWEB)

    L. D. Cecil; L. L. Knobel; J. R. Green (USGS); S. K. Frape (University of Waterloo)

    2000-06-01

    The purpose of this report is to describe the calculated contribution to ground water of natural, in situ produced 36Cl in the eastern Snake River Plain aquifer and to compare these concentrations in ground water with measured concentrations near a nuclear facility in southeastern Idaho. The scope focused on isotopic and chemical analyses and associated 36Cl in situ production calculations on 25 whole-rock samples from 6 major water-bearing rock types present in the eastern Snake River Plain. The rock types investigated were basalt, rhyolite, limestone, dolomite, shale, and quartzite. Determining the contribution of in situ production to 36Cl inventories in ground water facilitated the identification of the source for this radionuclide in environmental samples. On the basis of calculations reported here, in situ production of 36Cl was determined to be insignificant compared to concentrations measured in ground water near buried and injected nuclear waste at the INEEL. Maximum estimated 36Cl concentrations in ground water from in situ production are on the same order of magnitude as natural concentrations in meteoric water.

  12. EXPERIMENTAL AND COMPUTATIONAL ACTIVITIES AT THE OREGON STATE UNIVERSITY NEES TSUNAMI RESEARCH FACILITY

    Directory of Open Access Journals (Sweden)

    S.C. Yim

    2009-01-01

    A diverse series of research projects have taken place or are underway at the NEES Tsunami Research Facility at Oregon State University. Projects range from the simulation of the processes and effects of tsunamis generated by sub-aerial and submarine landslides (NEESR, Georgia Tech.), model comparisons of tsunami wave effects on bottom profiles and scouring (NEESR, Princeton University), model comparisons of wave-induced motions on rigid and free bodies (Shared-Use, Cornell), numerical model simulations and testing of breaking waves and inundation over topography (NEESR, TAMU), structural testing and development of standards for tsunami engineering and design (NEESR, University of Hawaii), and wave loads on coastal bridge structures (non-NEES), to upgrading the two-dimensional wave generator of the Large Wave Flume. A NEESR payload project (Colorado State University) was undertaken that seeks to improve the understanding of the stresses from wave loading and run-up on residential structures. Advanced computational tools for coupling fluid-structure interaction, including turbulence, contact and impact, are being developed to assist with the design of experiments and complement parametric studies. These projects will contribute towards understanding the physical processes that occur during earthquake-generated tsunamis, including structural stress, debris flow and scour, inundation and overland flow, and landslide-generated tsunamis. Analytical and numerical model development and comparisons with the experimental results give engineers additional predictive tools to assist in the development of robust structures as well as identification of hazard zones and formulation of hazard plans.

  13. Assess and improve the sustainability of water treatment facility using Computational Fluid Dynamics

    Science.gov (United States)

    Zhang, Jie; Tejada-Martinez, Andres; Lei, Hongxia; Zhang, Qiong

    2016-11-01

    Fluid-flow problems in the water treatment industry are often simplified or omitted, since the focus is usually on the chemical process only. However, hydraulics also plays an important role in determining effluent water quality. Recent studies have demonstrated that computational fluid dynamics (CFD) has the ability to simulate the physical and chemical processes in reactive flows in water treatment facilities, such as in chlorine and ozone disinfection tanks. This study presents results from CFD simulations of reactive flow in an existing full-scale ozone disinfection tank and in potential designs. Through analysis of the simulation results, we found that the baffling factor and CT10 are not optimal indicators of disinfection performance. We also found that the relationship between effluent CT (the product of disinfectant concentration and contact time) obtained from CT transport simulation and the baffling factor depends on the location of ozone release. In addition, we analyzed the environmental and economic impacts of ozone disinfection tank designs and developed a composite indicator to quantify the sustainability of an ozone disinfection tank in the technological, environmental and economic dimensions.
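
    For readers unfamiliar with the indicators named above, the usual tracer-based definitions are sketched below; these follow common disinfection practice and are assumptions here, since the study's exact formulations are not given in the record.

      % Q = flow rate, V = tank volume, T10 = time for the first 10% of a tracer pulse to exit,
      % C_out = disinfectant residual at the outlet.
      \[
        \mathrm{TDT} = \frac{V}{Q}, \qquad
        \mathrm{BF} = \frac{T_{10}}{\mathrm{TDT}}, \qquad
        \mathrm{CT}_{10} = C_{\mathrm{out}} \times T_{10}.
      \]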

  14. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies, and the corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and the probabilistic seismic hazard. Using the PSHA computer program, the Cumulative Probability Distribution Functions (CPDF) and Probability Distribution Functions (PDF) of the annual exceedance have been investigated to analyze the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared to the results from the different input parameter spaces.
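
    As a minimal sketch of how annual exceedance probabilities can be tabulated at the ten hazard levels mentioned above, the Python fragment below applies the usual Poisson assumption P = 1 - exp(-lambda(a)) to a hypothetical hazard curve; a real PSHA code integrates over source zones, magnitude recurrence, and ground-motion attenuation relations, none of which are reproduced here.

      # Illustrative only: convert an assumed annual exceedance-rate curve lambda(a)
      # into annual exceedance probabilities under a Poisson occurrence model.
      import numpy as np

      def annual_exceedance_rate(a_g, k0=1e-2, b=2.5):
          """Hypothetical hazard curve: mean annual rate of exceeding PGA a_g (in g)."""
          return k0 * a_g ** (-b)

      pga_levels = np.linspace(0.1, 0.99, 10)      # ten hazard levels, 0.1 g to 0.99 g
      rates = annual_exceedance_rate(pga_levels)
      p_annual = 1.0 - np.exp(-rates)              # annual probability of exceedance

      for a, p in zip(pga_levels, p_annual):
          print(f"PGA {a:4.2f} g : annual exceedance probability {p:.3e}")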

  15. Ground-water monitoring compliance projects for Hanford Site facilities: Progress report for the period January 1--March 31, 1988: Volume 1, Text

    Energy Technology Data Exchange (ETDEWEB)

    1988-05-01

    This report describes the progress of eight Hanford Site ground-water monitoring projects for the period January 1 to March 31, 1988. The facilities represented by the eight projects are the 300 Area Process trenches, 183-H Solar Evaporation Basins, 200 Areas Low-Level Burial Grounds, Nonradioactive Dangerous Waste Landfill, 216-A-36B Crib, 1301-N Liquid Waste Disposal Facility, 1325-N Liquid Waste Disposal Facility, and 1324-N/NA Surface Impoundment and Percolation Ponds. The latter four projects are included in this series of quarterly reports for the first time. This report is the seventh in a series of periodic status reports; the first six cover the period from May 1, 1986, through December 31, 1987 (PNL 1986; 1987a, b, c, d; 1988a). This report satisfies the requirements of Section 17B(3) of the Consent Agreement and Compliance Order issued by the Washington State Department of Ecology (1986a) to the US Department of Energy-Richland Operations Office. 13 refs., 19 figs., 24 tabs.

  16. Introduction of a terrestrial free-space optical communications network facility: IN-orbit and Networked Optical ground stations experimental Verification Advanced testbed (INNOVA)

    Science.gov (United States)

    Toyoshima, Morio; Munemasa, Yasushi; Takenaka, Hideki; Takayama, Yoshihisa; Koyama, Yoshisada; Kunimori, Hiroo; Kubooka, Toshihiro; Suzuki, Kenji; Yamamoto, Shinichi; Taira, Shinichi; Tsuji, Hiroyuki; Nakazawa, Isao; Akioka, Maki

    2014-03-01

    A terrestrial free-space optical communications network facility, named IN-orbit and Networked Optical ground stations experimental Verification Advanced testbed (INNOVA) is introduced. Many demonstrations have been conducted to verify the usability of sophisticated optical communications equipment in orbit. However, the influence of terrestrial weather conditions remains as an issue to be solved. One potential solution is site diversity, where several ground stations are used. In such systems, implementing direct high-speed optical communications links for transmission of data from satellites to terrestrial sites requires that links can be established even in the presence of clouds and rain. NICT is developing a terrestrial free-space optical communications network called INNOVA for future airborne and satellitebased optical communications projects. Several ground stations and environmental monitoring stations around Japan are being used to explore the site diversity concept. This paper describes the terrestrial free-space optical communications network facility, the monitoring stations around Japan for free-space laser communications, and potential research at NICT.

  17. Resource conservation and recovery act ground-water monitoring projects for Hanford facilities: Progress report, January 1--March 31, 1989

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.M.; Bates, D.J.; Lundgren, R.E.

    1989-06-01

    This document describes the progress of 13 Hanford Site ground-water monitoring projects for the period January 1 to March 31, 1989. The work described in this document is conducted by the Pacific Northwest Laboratory under the management of Westinghouse Hanford Company for the US Department of Energy. Concentrations of ground-water constituents are compared to federal drinking water standards throughout this document for reference purposes. All drinking water supplied from the sampled aquifer meets regulatory standards for drinking water quality. 32 refs., 30 figs., 103 tabs.

  18. X-ray facility for the ground calibration of the X-ray monitor JEM-X on board INTEGRAL

    DEFF Research Database (Denmark)

    Loffredo, G.; Pelliciari, C.; Frontera, F.;

    2003-01-01

    We describe the X-ray facility developed for the calibration of the X-ray monitor JEM-X on board the INTEGRAL satellite. The apparatus allowed the scanning of the detector geometric area with a pencil beam of desired energy over the major part of the passband of the instrument. The monochromatic...

  19. From the Ground Up: Floorcovering Recommendations from an IAQ Consortium. Issuetrak: A CEFPI Brief on Educational Facility Issues.

    Science.gov (United States)

    Frank, David

    This brief describes the findings of a consortium on indoor air quality (IAQ) in educational facilities held in Chattanooga, Tennessee. The objective was to determine the impact floorcoverings have on indoor air quality in schools relative to maintenance, volatile organic compounds (VOCs), airborne contaminants, moisture, surface contaminants, and…

  20. Computer Simulation and Optimization of the Process of Thawing of Grounds Using Microwave Energy

    Science.gov (United States)

    Nekrasov, S. A.; Volkov, V. S.

    2017-01-01

    In this article, consideration is given to a mathematical model and a numerical method to calculate and optimize the process of high-speed thawing of grounds using microwave energy. Relevant examples of calculations and an analysis of results are presented.

  1. Nonlinear Site Response Due to Large Ground Acceleration: Observation and Computer Simulation

    Science.gov (United States)

    Noguchi, S.; Furumura, T.; Sasatani, T.

    2009-12-01

    We studied nonlinear site response due to large ground acceleration during the 2003 off-Miyagi Earthquake (Mw7.0) in Japan by means of horizontal-to-vertical spectral ratio analysis of S-wave motion. The results were then confirmed by finite-difference method (FDM) simulation of nonlinear seismic wave propagation. A nonlinear site response is often observed at soft sediment sites, and even at hard bedrock sites which are covered by thin soil layers. Nonlinear site response can be induced by strong ground motion whose peak ground acceleration (PGA) exceeds about 100 cm/s/s, and seriously affects the amplification of high frequency ground motion and PGA. Noguchi and Sasatani (2008) developed an efficient technique for quantitative evaluation of nonlinear site response using the horizontal-to-vertical spectral ratio of S-wave (S-H/V) derived from strong ground motion records, based on Wen et al. (2006). We applied this technique to perform a detailed analysis of the properties of nonlinear site response based on a large amount of data recorded at 132 K-NET and KiK-net strong motion stations in Northern Japan during the off-Miyagi Earthquake. We succeeded in demonstrating a relationship between ground motion level, nonlinear site response and surface soil characteristics. For example, the seismic data recorded at KiK-net IWTH26 showed obvious characteristics of nonlinear site response when the PGA exceeded 100 cm/s/s. As the ground motion level increased, the dominant peak of S-H/V shifted to lower frequency, the high frequency level of S-H/V dropped, and PGA amplification decreased. On the other hand, the records at MYGH03 seemed not to be affected by nonlinear site response even for high ground motion levels in which PGA exceeds 800 cm/s/s. The characteristics of such nonlinear site amplification can be modeled by evaluating Murnaghan constants (e.g. McCall, 1994), which are the third-order elastic constants. In order to explain the observed characteristics of
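
    For orientation only, a minimal sketch of an S-wave horizontal-to-vertical spectral ratio computation of the general kind referenced above is given below; the windowing, smoothing, and horizontal-component combination are simplified placeholders rather than the specific procedure of Noguchi and Sasatani (2008).

```python
import numpy as np

def s_hv_ratio(ns, ew, ud, dt):
    """H/V spectral ratio for one pre-selected S-wave window (three components)."""
    n = len(ns)
    freqs = np.fft.rfftfreq(n, d=dt)
    taper = np.hanning(n)
    amp = lambda x: np.abs(np.fft.rfft((x - np.mean(x)) * taper))
    h = np.sqrt(amp(ns) ** 2 + amp(ew) ** 2)      # combined horizontal amplitude spectrum
    v = amp(ud)
    k = np.ones(5) / 5.0                          # simple running-mean smoothing
    return freqs, np.convolve(h, k, "same") / np.convolve(v, k, "same")

# Usage with synthetic records (100 Hz sampling); real use would pass S-wave windows
# cut from the K-NET/KiK-net accelerograms.
rng = np.random.default_rng(0)
f, hv = s_hv_ratio(rng.normal(size=2048), rng.normal(size=2048),
                   rng.normal(size=2048), dt=0.01)
print("dominant S-H/V frequency [Hz]:", round(float(f[1:][np.argmax(hv[1:])]), 2))
```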

  2. Design of a Computer-Based Control System Using LabVIEW for the NEMESYS Electromagnetic Launcher Facility

    Science.gov (United States)

    2007-06-01

    ... quickly was necessary. A railgun shot typically occurs in less than 10 ms, and firing capacitor banks to shape the current pulse are in the 100s of ... (B. M. Huhman, J. M. Neri) ... has assembled a facility to develop and test materials for the study of barrel lifetime in electromagnetic launchers (EML) for surface-fire support

  3. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 1; Steady Predictions

    Science.gov (United States)

    Allgood, Daniel C.; Graham, Jason S.; Ahuja, Vineet; Hosangadi, Ashvin

    2010-01-01

    levels in CFD based flowpath modeling of the facility. The analyses tools used here expand on the multi-element unstructured CFD which has been tailored and validated for impingement dynamics of dry plumes, complex valve/feed systems, and high pressure propellant delivery systems used in engine and component test stands at NASA SSC. The analyses performed in the evaluation of the sub-scale diffuser facility explored several important factors that influence modeling and understanding of facility operation such as (a) importance of modeling the facility with Real Gas approximation, (b) approximating the cluster of steam ejector nozzles as a single annular nozzle, (c) existence of mixed subsonic/supersonic flow downstream of the turning duct, and (d) inadequacy of two-equation turbulence models in predicting the correct pressurization in the turning duct and expansion of the second stage steam ejectors. The procedure used for modeling the facility was as follows: (i) The engine, test cell and first stage ejectors were simulated with an axisymmetric approximation (ii) the turning duct, second stage ejectors and the piping downstream of the second stage ejectors were analyzed with a three-dimensional simulation utilizing a half-plane symmetry approximation. The solution i.e. primitive variables such as pressure, velocity components, temperature and turbulence quantities were passed from the first computational domain and specified as a supersonic boundary condition for the second simulation. (iii) The third domain comprised of the exit diffuser and the region in the vicinity of the facility (primary included to get the correct shock structure at the exit of the facility and entrainment characteristics). The first set of simulations comprising the engine, test cell and first stage ejectors was carried out both as a turbulent real gas calculation as well as a turbulent perfect gas calculation. A comparison for the two cases (Real Turbulent and Perfect gas turbulent) of the Ma

  4. Analysing the scalability of thermal hydraulic test facility data to reactor scale with a computer code; Vertailuanalyysin kaeyttoe termohydraulisten koelaitteistojen tulosten laitosmittakaavaan skaalautumisen tutkimisessa

    Energy Technology Data Exchange (ETDEWEB)

    Suikkanen, P.

    2009-01-15

    The objective of this Master's thesis was to study guidelines and procedures for the scaling of thermal hydraulic test facilities and to compare results from two test facility models with those from an EPR model. The aim was to assess how well the studied test facilities describe plant-scale behaviour during accident scenarios simulated with computer codes. The models were used to determine the influence of primary circuit mass inventory on the behaviour of the circuit. The data from the test facility models represent the same phenomena as the data from the EPR model. The results calculated with the PKL model were also compared against PKL test facility data and showed good agreement. Test facility data are used to validate the computer codes employed in nuclear safety analysis. The scale of a facility affects the behaviour of the phenomena, and therefore special care must be taken when using the data. (orig.)

  5. Ground truth evaluation of computer vision based 3D reconstruction of synthesized and real plant images

    DEFF Research Database (Denmark)

    Nielsen, Michael; Andersen, Hans Jørgen; Slaughter, David

    2007-01-01

    There is an increasing interest in using 3D computer vision in precision agriculture. This calls for better quantitative evaluation and understanding of computer vision methods. This paper proposes a test framework using ray traced crop scenes that allows in-depth analysis of algorithm performance...

  6. On the occurrence of ground observations of ELF/VLF magnetospheric amplification induced by the HAARP facility

    OpenAIRE

    İnan, Umran Savaş; Golkowski, M.; Cohen, M. B.; Carpenter, D. L.

    2011-01-01

    The ionospheric heating facility of the High Frequency Active Auroral Research Program (HAARP) has been used extensively in the last 3 years for injection of ELF/VLF waves into the magnetosphere via modulated heating of the overhead auroral electrojet currents. Of particular interest are waves that are observed to be nonlinearly amplified after interaction with hot plasma electrons in the Earth's radiation belts. Past results have shown HAARP to be an effective platform for controlled studies...

  7. Data on the quantitative assessment pulmonary ground-glass opacification from coronary computed tomography angiography datasets

    DEFF Research Database (Denmark)

    Kühl, J Tobias; Kristensen, Thomas S; Thomsen, Anna F

    2017-01-01

    We assessed the CT attenuation density of the pulmonary tissue adjacent to the heart in patients with acute non-ST segment elevation myocardial infarction (J.T. Kuhl, T.S. Kristensen, A.F. Thomsen et al., 2016) [1]. This data was related to the level of ground-glass opacification evaluated ... by a radiologist, and data on the interobserver variability of semi-automated assessment of pulmonary attenuation density was provided...

  8. Advantages of analytically computing the ground heat flux in land surface models

    Science.gov (United States)

    Pauwels, Valentijn R. N.; Daly, Edoardo

    2016-11-01

    It is generally accepted that the ground heat flux accounts for a significant fraction of the surface energy balance. In land surface models, the ground heat flux is typically estimated through a numerical solution of the heat conduction equation. Recent research has shown that this approach introduces errors in the estimation of the energy balance. In this paper, we calibrate a land surface model using a numerical solution of the heat conduction equation with four different vertical spatial resolutions. It is found that the thermal conductivity is the most sensitive parameter to the spatial resolution. More importantly, the thermal conductivity values are directly related to the spatial resolution, thus rendering any physical interpretation of this value irrelevant. The numerical solution is then replaced by an analytical solution. The results of the numerical and analytical solutions are identical when fine spatial and temporal resolutions are used. However, when using resolutions that are typical of land surface models, significant differences are found. When using the analytical solution, the ground heat flux is directly calculated without calculating the soil temperature profile. The calculation of the temperature at each node in the soil profile is thus no longer required, unless the model contains parameters that depend on the soil temperature, which in this study is not the case. The calibration is repeated, and thermal conductivity values independent of the vertical spatial resolution are obtained. The main conclusion of this study is that care must be taken when interpreting land surface model results that have been obtained using numerical ground heat flux estimates. The use of exact analytical solutions, when available, is recommended.
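
    To make the analytical route concrete, the sketch below evaluates the textbook closed-form surface ground heat flux for a homogeneous semi-infinite soil driven by a sinusoidal surface temperature, G(t) = A sqrt(omega*lambda*C) sin(omega*t + pi/4); the solution used in the paper is more general, and the soil properties here are hypothetical.

```python
import numpy as np

lam = 1.0                   # thermal conductivity [W m-1 K-1] (hypothetical)
C = 2.0e6                   # volumetric heat capacity [J m-3 K-1] (hypothetical)
A = 10.0                    # amplitude of the surface temperature wave [K]
w = 2 * np.pi / 86400.0     # diurnal angular frequency [s-1]

t = np.arange(0.0, 86400.0, 1800.0)                        # one day in 30-min steps
G = A * np.sqrt(w * lam * C) * np.sin(w * t + np.pi / 4)   # closed-form surface flux

# No soil temperature profile is needed: the flux follows directly from the forcing.
print(f"peak ground heat flux: {G.max():.1f} W m-2 "
      f"(leads the temperature wave by an eighth of a period)")
```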

  9. Exponential vanishing of the ground-state gap of the quantum random energy model via adiabatic quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Adame, J.; Warzel, S., E-mail: warzel@ma.tum.de [Zentrum Mathematik, TU München, Boltzmannstr. 3, 85747 Garching (Germany)

    2015-11-15

    In this note, we use ideas of Farhi et al. [Int. J. Quantum. Inf. 6, 503 (2008) and Quantum Inf. Comput. 11, 840 (2011)] who link a lower bound on the run time of their quantum adiabatic search algorithm to an upper bound on the energy gap above the ground-state of the generators of this algorithm. We apply these ideas to the quantum random energy model (QREM). Our main result is a simple proof of the conjectured exponential vanishing of the energy gap of the QREM.

  10. Use of borehole and surface geophysics to investigate ground-water quality near a road-deicing salt-storage facility, Valparaiso, Indiana

    Science.gov (United States)

    Risch, M.R.; Robinson, B.A.

    2001-01-01

    Borehole and surface geophysics were used to investigate ground-water quality affected by a road-deicing salt-storage facility located near a public water-supply well field. From 1994 through 1998, borehole geophysical logs were made in an existing network of monitoring wells completed near the bottom of a thick sand aquifer. Logs of natural gamma activity indicated a uniform and negligible contribution of clay to the electromagnetic conductivity of the aquifer, so that the logs of electromagnetic conductivity primarily measured the amount of dissolved solids in the ground water near the wells. Electromagnetic-conductivity data indicated the presence of a saltwater plume near the bottom of the aquifer. Increases in electromagnetic conductivity, observed from sequential logging of wells, indicated the saltwater plume had moved north about 60 to 100 feet per year between 1994 and 1998. These rates were consistent with estimates of horizontal ground-water flow based on velocity calculations made with hydrologic data from the study area.
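
    For context, a back-of-the-envelope seepage (average linear) velocity estimate of the kind alluded to above, v = K*i/n, is sketched below; the parameter values are hypothetical and are not the study's hydrologic data.

```python
# Average linear (seepage) velocity from Darcy's law, v = K * i / n.
K = 100.0      # hydraulic conductivity [ft/day] (hypothetical sand aquifer value)
i = 0.0008     # horizontal hydraulic gradient [ft/ft] (hypothetical)
n = 0.30       # effective porosity [-] (hypothetical)

v = K * i / n                              # [ft/day]
print(f"seepage velocity: {v:.3f} ft/day  =  {v * 365:.0f} ft/yr")
```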

  11. Computational Fluid Dynamic Modeling of Horizontal Air-Ground Heat Exchangers (HAGHE) for HVAC Systems

    Directory of Open Access Journals (Sweden)

    Paolo Maria Congedo

    2014-12-01

    In order to satisfy the requirements of Directive 2010/31/EU for Zero Energy Buildings (ZEB), innovative solutions were investigated for building HVAC systems. Horizontal air-ground heat exchangers (HAGHE) offer a significant contribution to reducing energy consumption for ventilation by using the thermal energy stored underground to pre-heat or pre-cool the ventilation air in winter and summer, respectively. This is particularly interesting in applications for industrial, commercial and education buildings, where keeping the indoor air quality under control is extremely important. Experimental measurements show that, throughout the year, outside air temperature fluctuations are mitigated at sufficient ground depth (about 3 m): because of the high thermal inertia of the soil, the ground temperature is relatively constant, higher than the outside air temperature in winter and lower in summer. The study aims to numerically investigate the behavior of HAGHE by varying the air flow rate and soil conductivity under unsteady conditions, using annual weather data for South-East Italy. The analysis shows that, in warm climates, the HAGHE brings a real advantage for only a few hours daily in winter, while in summer it shows significant benefits, cooling the ventilation air by up to several degrees even with a short pipe.
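
    The near-constant deep-ground temperature described above follows the standard damped, phase-shifted annual temperature wave for a homogeneous soil; a minimal sketch with hypothetical soil and climate values (not the South-East Italy data used in the study) is given below.

```python
import numpy as np

alpha = 0.6e-6                    # soil thermal diffusivity [m2/s] (hypothetical)
w = 2 * np.pi / (365 * 86400.0)   # annual angular frequency [s-1]
d = np.sqrt(2 * alpha / w)        # damping depth [m]

Tm, A = 16.0, 10.0                # mean annual surface temperature and amplitude [degC]
t = np.linspace(0.0, 365 * 86400.0, 365)

for z in (0.0, 1.0, 3.0):         # depths [m]
    # T(z,t) = Tm + A * exp(-z/d) * sin(w*t - z/d): attenuated and phase-lagged with depth
    T = Tm + A * np.exp(-z / d) * np.sin(w * t - z / d)
    print(f"z = {z:.0f} m: annual temperature swing = {T.max() - T.min():.1f} degC")
```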

  12. Testing SLURM open source batch system for a Tier1/Tier2 HEP computing facility

    Science.gov (United States)

    Donvito, Giacinto; Salomoni, Davide; Italiano, Alessandro

    2014-06-01

    In this work the testing activities that were carried out to verify if the SLURM batch system could be used as the production batch system of a typical Tier1/Tier2 HEP computing center are shown. SLURM (Simple Linux Utility for Resource Management) is an Open Source batch system developed mainly by the Lawrence Livermore National Laboratory, SchedMD, Linux NetworX, Hewlett-Packard, and Groupe Bull. Testing was focused both on verifying the functionalities of the batch system and the performance that SLURM is able to offer. We first describe our initial set of requirements. Functionally, we started configuring SLURM so that it replicates all the scheduling policies already used in production in the computing centers involved in the test, i.e. INFN-Bari and the INFN-Tier1 at CNAF, Bologna. Currently, the INFN-Tier1 is using IBM LSF (Load Sharing Facility), while INFN-Bari, an LHC Tier2 for both CMS and Alice, is using Torque as resource manager and MAUI as scheduler. We show how we configured SLURM in order to enable several scheduling functionalities such as Hierarchical FairShare, Quality of Service, user-based and group-based priority, limits on the number of jobs per user/group/queue, job age scheduling, job size scheduling, and scheduling of consumable resources. We then show how different job typologies, like serial, MPI, multi-thread, whole-node and interactive jobs, can be managed. Tests on the use of ACLs on queues or in general other resources are then described. A peculiar SLURM feature we also verified is event triggers, useful to configure specific actions on each possible event in the batch system. We also tested highly available configurations for the master node. This feature is of paramount importance since a mandatory requirement in our scenarios is to have a working farm cluster even in case of hardware failure of the server(s) hosting the batch system. Among our requirements there is also the possibility to deal with pre-execution and post

  13. A new formulation to compute self-potential signals associated with ground water flow

    Directory of Open Access Journals (Sweden)

    A. Bolève

    2007-06-01

    The classical formulation of the coupled hydroelectrical flow in porous media is based on a linear formulation of two coupled constitutive equations for the electrical current density and the seepage velocity of the water phase, obeying Onsager's reciprocity. This formulation shows that the streaming current density is controlled by the gradient of the fluid pressure of the water phase and a streaming current coupling coefficient that depends on the so-called zeta potential. Recently a new formulation has been introduced in which the streaming current density is directly connected to the seepage velocity of the water phase and to the excess of electrical charge per unit pore volume in the porous material. The advantages of this formulation are numerous. First, this new formulation is more intuitive not only in terms of the constitutive equation for the generalized Ohm's law but also in specifying boundary conditions for the influence of the flow field upon the streaming potential. With the new formulation, the streaming potential coupling coefficient shows a decrease of its magnitude with permeability, in agreement with published results. The new formulation is also easily extendable to non-viscous laminar flow problems (high Reynolds number ground water flow in cracks, for example) and to unsaturated conditions, with applications to the vadose zone. We demonstrate here that this formulation is suitable to model self-potential signals in the field. We investigate infiltration of water from an agricultural ditch, vertical infiltration of water into a sinkhole, and preferential horizontal flow of ground water in a paleochannel. For the three cases reported in the present study, a good match is obtained between the finite element simulations performed with the finite element code Comsol Multiphysics 3.3 and field observations. Finally, this formulation also seems very promising for the inversion of the geometry of ground water flow from the
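
    The two constitutive choices contrasted in the abstract can be summarized as follows, with notation as commonly used in this literature; treat the exact symbols as an assumption rather than a quotation of the paper.

```latex
% Classical coupled-flow form: streaming current driven by the pore-pressure gradient
% through a coupling term L tied to the zeta potential, with coupling coefficient C.
\mathbf{j} = -\,\sigma\,\nabla\varphi - L\left(\nabla p - \rho_f\,\mathbf{g}\right),
\qquad
C = \left.\frac{\partial \varphi}{\partial p}\right|_{\mathbf{j}=\mathbf{0}} = -\frac{L}{\sigma}

% Newer form: the streaming current density is the excess charge per unit pore
% volume \hat{Q}_V advected by the seepage (Darcy) velocity u, which acts as the
% source term of a Poisson equation for the self-potential field.
\mathbf{j}_s = \hat{Q}_V\,\mathbf{u},
\qquad
\nabla\cdot\!\left(\sigma\,\nabla\varphi\right) = \nabla\cdot\!\left(\hat{Q}_V\,\mathbf{u}\right)
```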

  14. Auditory Power-Law Activation Avalanches Exhibit a Fundamental Computational Ground State

    Science.gov (United States)

    Stoop, Ruedi; Gomez, Florian

    2016-07-01

    The cochlea provides a biological information-processing paradigm that we are only beginning to understand in its full complexity. Our work reveals an interacting network of strongly nonlinear dynamical nodes, on which even a simple sound input triggers subnetworks of activated elements that follow power-law size statistics ("avalanches"). From dynamical systems theory, power-law size distributions relate to a fundamental ground state of biological information processing. Learning destroys these power laws. These results strongly modify the models of mammalian sound processing and provide a novel methodological perspective for understanding how the brain processes information.

  15. Production of Referring Expressions for an Unknown Audience: A Computational Model of Communal Common Ground.

    Science.gov (United States)

    Kutlak, Roman; van Deemter, Kees; Mellish, Chris

    2016-01-01

    This article presents a computational model of the production of referring expressions under uncertainty over the hearer's knowledge. Although situations where the hearer's knowledge is uncertain have seldom been addressed in the computational literature, they are common in ordinary communication, for example when a writer addresses an unknown audience, or when a speaker addresses a stranger. We propose a computational model composed of three complementary heuristics based on, respectively, an estimation of the recipient's knowledge, an estimation of the extent to which a property is unexpected, and the question of what is the optimum number of properties in a given situation. The model was tested in an experiment with human readers, in which it was compared against the Incremental Algorithm and human-produced descriptions. The results suggest that the new model outperforms the Incremental Algorithm in terms of the proportion of correctly identified entities and in terms of the perceived quality of the generated descriptions.

  16. Ground-water quality near the northwest 58th Street solid-waste disposal facility, Dade County, Florida

    Science.gov (United States)

    Mattraw, H.C.; Hull, John E.; Klein, Howard

    1978-01-01

    The Northwest 58th Street solid-waste disposal facility, 3 miles west of a major Dade County municipal water-supply well field, overlays the Biscayne aquifer, a permeable, solution-riddled limestone which transmits leachates eastward at a calculated rate of 2.9 feet per day. A discrete, identifiable leachate plume has been recognized under and downgradient from the waste disposal facility. Concentrations of sodium, ammonia, and dissolved solids decreased with depth beneath the disposal area and downgradient in response to an advective and convective dispersion. At a distance of about one-half downgradient, the rate of contribution of leachate from the source to the leading edge of the plume was about equal to the rate of loss of leachate from the leading edge of the plume by diffusion and dilution by rainfall infiltration during the period August 1973 - July 1975. Heavy metals and pesticides are filtered, adsorbed by aquifer materials, or are precipitated near the disposal area. (Woodard-USGS)

  17. Developing a project-based computational physics course grounded in expert practice

    Science.gov (United States)

    Burke, Christopher J.; Atherton, Timothy J.

    2017-04-01

    We describe a project-based computational physics course developed using a backwards course design approach. From an initial competency-based model of problem solving in computational physics, we interviewed faculty who use these tools in their own research to determine indicators of expert practice. From these, a rubric was formulated that enabled us to design a course intended to allow students to learn these skills. We also report an initial implementation of the course and, by having the interviewees regrade student work, show that students acquired many of the expert practices identified.

  18. Developing a project-based computational physics course grounded in expert practice

    CERN Document Server

    Burke, Christopher

    2016-01-01

    We describe a project-based computational physics course developed using a backwards course design approach. From an initial competency-based model of problem solving in computational physics, we interviewed faculty who use these tools in their own research to determine indicators of expert practice. From these, a rubric was formulated that enabled us to design a course intended to allow students to learn these skills. We also report an initial implementation of the course and, by having the interviewees regrade student work, show that students acquired many of the expert practices identified.

  19. Prediction of peak ground acceleration of Iran's tectonic regions using a hybrid soft computing technique

    Directory of Open Access Journals (Sweden)

    Mostafa Gandomi

    2016-01-01

    A new model is derived to predict the peak ground acceleration (PGA) utilizing a hybrid method coupling an artificial neural network (ANN) and simulated annealing (SA), called SA-ANN. The proposed model relates PGA to earthquake source-to-site distance, earthquake magnitude, average shear-wave velocity, faulting mechanism, and focal depth. A database of strong ground-motion recordings of 36 earthquakes that occurred in Iran's tectonic regions is used to establish the model. For further validity verification, the SA-ANN model is employed to predict the PGA of a part of the database beyond the training data domain. The proposed SA-ANN model is compared with a simple ANN as well as 10 well-known models proposed in the literature. The proposed model's performance is superior to the single ANN and other existing attenuation models. The SA-ANN model is highly correlated with the actual records (R = 0.835 and ρ = 0.0908) and is subsequently converted into a tractable design equation.
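
    As a schematic illustration only, the sketch below fits a plain feed-forward ANN to the five predictors named above using synthetic placeholder data; the simulated-annealing optimization that distinguishes the paper's SA-ANN, and the real Iranian strong-motion database, are not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 300
X = np.column_stack([
    rng.uniform(5, 200, n),     # source-to-site distance [km]      (hypothetical)
    rng.uniform(4.5, 7.5, n),   # moment magnitude                  (hypothetical)
    rng.uniform(150, 800, n),   # average shear-wave velocity [m/s] (hypothetical)
    rng.integers(0, 3, n),      # faulting mechanism, coded 0-2     (hypothetical)
    rng.uniform(5, 40, n),      # focal depth [km]                  (hypothetical)
])
# Placeholder target: log-PGA rising with magnitude and decaying with distance.
y = 0.9 * X[:, 1] - 1.2 * np.log10(X[:, 0]) + rng.normal(0.0, 0.2, n)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                                   random_state=0))
model.fit(X, y)
print("predicted log-PGA for three records:", np.round(model.predict(X[:3]), 2))
```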

  20. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  1. A high performance computing framework for physics-based modeling and simulation of military ground vehicles

    Science.gov (United States)

    Negrut, Dan; Lamb, David; Gorsich, David

    2011-06-01

    This paper describes a software infrastructure made up of tools and libraries designed to assist developers in implementing computational dynamics applications running on heterogeneous and distributed computing environments. Together, these tools and libraries compose a so-called Heterogeneous Computing Template (HCT). The heterogeneous and distributed computing hardware infrastructure is assumed herein to be made up of a combination of CPUs and Graphics Processing Units (GPUs). The computational dynamics applications targeted to execute on such a hardware topology include many-body dynamics, smoothed-particle hydrodynamics (SPH) fluid simulation, and fluid-solid interaction analysis. The underlying theme of the solution approach embraced by HCT is that of partitioning the domain of interest into a number of subdomains that are each managed by a separate core/accelerator (CPU/GPU) pair. Five components at the core of HCT enable the envisioned distributed computing approach to large-scale dynamical system simulation: (a) the ability to partition the problem according to the one-to-one mapping, i.e., the spatial subdivision discussed above (pre-processing); (b) a protocol for passing data between any two co-processors; (c) algorithms for element proximity computation; and (d) the ability to carry out post-processing in a distributed fashion. In this contribution the components (a) and (b) of the HCT are demonstrated via the example of the Discrete Element Method (DEM) for rigid body dynamics with friction and contact. The collision detection task required in frictional-contact dynamics (task (c) above) is shown to benefit from a two-order-of-magnitude gain in efficiency on the GPU when compared to traditional sequential implementations. Note: Reference herein to any specific commercial products, process, or service by trade name, trademark, manufacturer, or otherwise, does not imply its endorsement, recommendation, or favoring by the United States Army. The views and
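
    A toy illustration of the spatial-subdivision idea behind component (a), i.e. a one-to-one mapping of subdomains to processing units, is sketched below; it is not the HCT code, and all names are hypothetical.

```python
import numpy as np

def assign_subdomains(positions, lo, hi, grid_shape):
    """Bin body positions into regular-grid subdomains (one per CPU/GPU pair)."""
    cell = (hi - lo) / np.asarray(grid_shape, dtype=float)
    idx = np.clip(((positions - lo) / cell).astype(int), 0, np.asarray(grid_shape) - 1)
    return np.ravel_multi_index(idx.T, grid_shape)   # flat subdomain (device) index

rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 1.0, size=(1000, 3))          # rigid-body positions
owner = assign_subdomains(pos, np.zeros(3), np.ones(3), (2, 2, 2))
for sub in range(8):
    print(f"subdomain {sub}: {np.count_nonzero(owner == sub)} bodies")
```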

  2. Stochastic modeling and control system designs of the NASA/MSFC Ground Facility for large space structures: The maximum entropy/optimal projection approach

    Science.gov (United States)

    Hsia, Wei-Shen

    1986-01-01

    In the Control Systems Division of the Systems Dynamics Laboratory of the NASA/MSFC, a Ground Facility (GF), in which the dynamics and control system concepts being considered for Large Space Structures (LSS) applications can be verified, was designed and built. One of the important aspects of the GF is to design an analytical model which will be as close to experimental data as possible so that a feasible control law can be generated. Using Hyland's Maximum Entropy/Optimal Projection Approach, a procedure was developed in which the maximum entropy principle is used for stochastic modeling and the optimal projection technique is used for a reduced-order dynamic compensator design for a high-order plant.

  3. Methods for using computer training facilities in studies of special disciplines

    Directory of Open Access Journals (Sweden)

    O.L. Tashlykov

    2016-12-01

    The use of the analytical simulator is illustrated by a laboratory research project entitled “BN-800 Reactor Power Maneuvering”, which investigates the reactor facility power control modes in a power range of 100–80–100% of the rated power.

  4. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    Energy Technology Data Exchange (ETDEWEB)

    Kostin, Mikhail [FRIB, MSU; Mokhov, Nikolai [FNAL; Niita, Koji [RIST, Japan

    2013-09-25

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran77, Fortran 90 or C. The module is largely independent of the radiation transport codes it can be used with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations with a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and networks of workstations, where interference from other users is possible.
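
    A highly simplified sketch of the two framework features mentioned above, splitting work across MPI ranks and restarting from a per-rank checkpoint file, is shown below, assuming mpi4py is available; the file name, batch function, and tally are hypothetical stand-ins rather than the C++/MARS15 interface.

```python
import os
import pickle
import random
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def run_batch(seed, n_histories=10_000):
    """Stand-in for a call into the transport code through the interface functions."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n_histories)) / n_histories

ckpt = f"checkpoint_rank{rank:04d}.pkl"          # hypothetical checkpoint file name
batches_done, tally = (pickle.load(open(ckpt, "rb"))
                       if os.path.exists(ckpt) else (0, 0.0))

total_batches = 10
for b in range(batches_done, total_batches):
    tally += run_batch(seed=rank * total_batches + b)
    with open(ckpt, "wb") as fh:                 # restart point after every batch
        pickle.dump((b + 1, tally), fh)

# Combine the per-rank partial results on rank 0.
total = comm.reduce(tally / total_batches, op=MPI.SUM, root=0)
if rank == 0:
    print("mean score over all ranks:", total / size)
```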

  5. Comparison of Stereo-PIV and Plenoptic-PIV Measurements on the Wake of a Cylinder in NASA Ground Test Facilities.

    Science.gov (United States)

    Fahringer, Timothy W.; Thurow, Brian S.; Humphreys, William M., Jr.; Bartram, Scott M.

    2017-01-01

    A series of comparison experiments have been performed using a single-camera plenoptic PIV measurement system to ascertain the system's performance capabilities in terms of suitability for use in NASA ground test facilities. A proof-of-concept demonstration was performed in the Langley Advanced Measurements and Data Systems Branch 13-inch (33-cm) Subsonic Tunnel to examine the wake of a series of cylinders at a Reynolds number of 2500. Accompanying the plenoptic-PIV measurements were an ensemble of complementary stereo-PIV measurements. The stereo-PIV measurements were used as a truth measurement to assess the ability of the plenoptic-PIV system to capture relevant 3D/3C flow field features in the cylinder wake. Six individual tests were conducted as part of the test campaign using three different cylinder diameters mounted in two orientations in the tunnel test section. This work presents a comparison of measurements with the cylinders mounted horizontally (generating a 2D flow field in the x-y plane). Results show that in general the plenoptic-PIV measurements match those produced by the stereo-PIV system. However, discrepancies were observed in extracted profiles of the fluctuating velocity components. It is speculated that spatial smoothing of the vector fields in the stereo-PIV system could account for the observed differences. Nevertheless, the plenoptic-PIV system performed extremely well at capturing the flow field features of interest and can be considered a viable alternative to traditional PIV systems in smaller NASA ground test facilities with limited optical access.

  6. Taking the High Ground: A Case for Department of Defense Application of Public Cloud Computing

    Science.gov (United States)

    2011-06-01

    IT cannot be sustained in a declining budget environment with users demanding better services. Wyld captures the essence of much of the problem for ... the DoD laboratory data centers into model versions of public providers. An open source project, called Eucalyptus (http://www.eucalyptus.com), would ... be an excellent starting point for such a project. Eucalyptus is a software platform for implementing private cloud computing solutions on top of

  7. National Ignition Facility sub-system design requirements computer system SSDR 1.5.1

    Energy Technology Data Exchange (ETDEWEB)

    Spann, J.; VanArsdall, P.; Bliss, E.

    1996-09-05

    This System Design Requirement document establishes the performance, design, development and test requirements for the Computer System, WBS 1.5.1 which is part of the NIF Integrated Computer Control System (ICCS). This document responds directly to the requirements detailed in ICCS (WBS 1.5) which is the document directly above.

  8. A high-performance ground-based prototype of horn-type sequential vegetable production facility for life support system in space

    Science.gov (United States)

    Fu, Yuming; Liu, Hui; Shao, Lingzhi; Wang, Minjuan; Berkovich, Yu A.; Erokhin, A. N.; Liu, Hong

    2013-07-01

    Vegetable cultivation plays a crucial role in the dietary supplementation and psychosocial benefits of the crew during manned space flight. Here we developed a ground-based prototype of a horn-type sequential vegetable production facility, named the Horn-type Producer (HTP), which is capable of simulating the microgravity effect and the continuous cultivation of leaf vegetables on root modules. The growth chamber of the facility had a volume of 0.12 m3, characterized by a three-stage space expansion with plant growth. The planting surface of 0.154 m2 comprised six ring-shaped root modules with a fibrous ion-exchange resin substrate. Root modules were fastened to a central porous tube supplying water, and moved forward with plant growth. The total illuminated crop area of 0.567 m2 was provided by a combination of red and white light-emitting diodes on the internal surfaces. In tests with a 24-h photoperiod, the HTP operating at 0.3 kW achieved a lettuce productivity of 254.3 g of edible biomass per week. Long-term operation of the HTP did not alter the nutritional composition of the vegetables to any great extent. Furthermore, the efficiency of the HTP, based on the Q-criterion, was 7 × 10-4 g2 m-3 J-1. These results show that the HTP exhibited high productivity, stable quality, and good efficiency in the process of planting lettuce, indicative of an interesting design for space vegetable production.

  9. Effects on radionuclide concentrations by cement/ground-water interactions in support of performance assessment of low-level radioactive waste disposal facilities

    Energy Technology Data Exchange (ETDEWEB)

    Krupka, K.M.; Serne, R.J. [Pacific Northwest National Lab., Richland, WA (United States)

    1998-05-01

    The US Nuclear Regulatory Commission is developing a technical position document that provides guidance regarding the performance assessment of low-level radioactive waste disposal facilities. This guidance considers the effects that the chemistry of the vault disposal system may have on radionuclide release. The geochemistry of pore waters buffered by cementitious materials in the disposal system will be different from the local ground water. Therefore, the cement-buffered environment needs to be considered within the source term calculations if credit is taken for solubility limits and/or sorption of dissolved radionuclides within disposal units. A literature review was conducted on methods to model pore-water compositions resulting from reactions with cement, experimental studies of cement/water systems, natural analogue studies of cement and concrete, and radionuclide solubilities experimentally determined in cement pore waters. Based on this review, geochemical modeling was used to calculate maximum concentrations for americium, neptunium, nickel, plutonium, radium, strontium, thorium, and uranium for pore-water compositions buffered by cement and by the local ground water. Another literature review was completed on radionuclide sorption behavior onto fresh cement/concrete, where the pore water pH will be greater than or equal to 10. Based on this review, a database was developed of preferred minimum distribution coefficient values for these radionuclides in cement/concrete environments.

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  11. Norovirus contamination levels in ground water treatment systems used for food-catering facilities in South Korea.

    Science.gov (United States)

    Lee, Bo-Ram; Lee, Sung-Geun; Park, Jong-Hyun; Kim, Kwang-Yup; Ryu, Sang-Ryeol; Rhee, Ok-Jae; Park, Jeong-Woong; Lee, Jeong-Su; Paik, Soon-Young

    2013-07-02

    This study aimed to inspect norovirus contamination of groundwater treatment systems used in food-catering facilities located in South Korea. A nationwide study was performed in 2010. Water samples were collected and, for the analysis of water quality, the temperature, pH, turbidity, and residual chlorine content were assessed. To detect norovirus genotypes GI and GII, RT-PCR and semi-nested PCR were performed with specific NV-GI and NV-GII primer sets, respectively. The PCR products amplified from the detected strains were then subjected to sequence analyses. Of 1,090 samples collected in 2010, seven (0.64%) were found to be norovirus-positive. Specifically, one norovirus strain was identified to have the GI-6 genotype, and six GII strains had the GII, GII-3, GII-4, and GII-17 genotypes. The very low detection rate of norovirus most likely reflects the preventative measures used. However, this virus can spread rapidly from person to person in crowded, enclosed places such as the schools investigated in this study. To promote better public health and sanitary conditions, it is necessary to periodically monitor noroviruses that frequently cause epidemic food poisoning in South Korea.

  12. Computational Studies of X-ray Framing Cameras for the National Ignition Facility

    Science.gov (United States)

    2013-06-01

    (Livermore National Laboratory, 7000 East Avenue, Livermore, CA 94550, USA) The NIF is the world's most powerful laser facility and is ... a phosphor screen where the output is recorded. The x-ray framing cameras have provided excellent information. As the yields at NIF have increased ... experiments on the NIF. The basic operation of these cameras is shown in Fig. 1. Incident photons generate photoelectrons both in the pores of the MCP and

  13. Grounded cognition.

    Science.gov (United States)

    Barsalou, Lawrence W

    2008-01-01

    Grounded cognition rejects traditional views that cognition is computation on amodal symbols in a modular system, independent of the brain's modal systems for perception, action, and introspection. Instead, grounded cognition proposes that modal simulations, bodily states, and situated action underlie cognition. Accumulating behavioral and neural evidence supporting this view is reviewed from research on perception, memory, knowledge, language, thought, social cognition, and development. Theories of grounded cognition are also reviewed, as are origins of the area and common misperceptions of it. Theoretical, empirical, and methodological issues are raised whose future treatment is likely to affect the growth and impact of grounded cognition.

  14. Laser performance operations model (LPOM): The computational system that automates the setup and performance analysis of the National Ignition Facility

    Science.gov (United States)

    Shaw, Michael; House, Ronald

    2015-02-01

    The National Ignition Facility (NIF) is a stadium-sized facility containing a 192-beam, 1.8 MJ, 500-TW, 351-nm laser system together with a 10-m diameter target chamber with room for many target diagnostics. NIF is the world's largest laser experimental system, providing a national center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. A computational system, the Laser Performance Operations Model (LPOM), has been developed that automates the laser setup process and accurately predicts laser energetics. LPOM uses diagnostic feedback from previous NIF shots to maintain accurate energetics models (gains and losses), as well as links to operational databases to provide 'as currently installed' optical layouts for each of the 192 NIF beamlines. LPOM deploys a fully integrated laser physics model, the Virtual Beamline (VBL), in its predictive calculations in order to meet the accuracy requirements of NIF experiments, and to provide the ability to determine the damage risk to optical elements throughout the laser chain. LPOM determines the settings of the injection laser system required to achieve the desired laser output, provides equipment protection, and determines the diagnostic setup. Additionally, LPOM provides real-time post-shot data analysis and reporting for each NIF shot. The LPOM computation system is designed as a multi-host computational cluster (with 200 compute nodes, providing the capability to run full NIF simulations fully in parallel) to meet the demands of both the control systems within a shot cycle and the NIF user community outside of a shot cycle.

  15. A computational model of the lexical-semantic system based on a grounded cognition approach.

    Science.gov (United States)

    Ursino, Mauro; Cuppini, Cristiano; Magosso, Elisa

    2010-01-01

    This work presents a connectionist model of the semantic-lexical system based on grounded cognition. The model assumes that the lexical and semantic aspects of language are memorized in two distinct stores. The semantic properties of objects are represented as a collection of features, whose number may vary among objects. Features are described as activation of neural oscillators in different sensory-motor areas (one area for each feature) topographically organized to implement a similarity principle. Lexical items are represented as activation of neural groups in a different layer. Lexical and semantic aspects are then linked together on the basis of previous experience, using physiological learning mechanisms. After training, features which frequently occurred together, and the corresponding word-forms, become linked via reciprocal excitatory synapses. The model also includes some inhibitory synapses: features in the semantic network tend to inhibit words not associated with them during the previous learning phase. Simulations show that after learning, presentation of a cue can evoke the overall object and the corresponding word in the lexical area. Moreover, different objects and the corresponding words can be simultaneously retrieved and segmented via a time division in the gamma-band. Word presentation, in turn, activates the corresponding features in the sensory-motor areas, recreating the same conditions occurring during learning. The model simulates the formation of categories, assuming that objects belong to the same category if they share some features. Simple examples are shown to illustrate how words representing a category can be distinguished from words representing individual members. Finally, the model can be used to simulate patients with focalized lesions, assuming an impairment of synaptic strength in specific feature areas.

  16. Water Treatment Unit Breadboard: Ground test facility for the recycling of urine and shower water for one astronaut

    Science.gov (United States)

    Lindeboom, Ralph E. F.; Lamaze, Brigitte; Clauwaert, Peter; Christiaens, Marlies E. R.; Rabaey, Korneel; Vlaeminck, Siegfried; Vanoppen, Marjolein; Demey, Dries; Farinas, Bernabé Alonso; Coessens, Wout; De Paepe, Jolien; Dotremont, Chris; Beckers, Herman; Verliefde, Arne

    2016-07-01

    One of the major challenges for long-term manned Space missions is the requirement of a regenerative life support system. Average water consumption in Western countries is >100 L d-1. Even when minimizing the amount of water available per astronaut to 13 L d-1, a mission of 6 crew members requires almost 30 tons of fresh water supplies per year. Note that the International Space Station (ISS) weighs approximately 400 tons. Therefore the development of an efficient water recovery system is essential to future Space exploration. The ISS currently uses a Vapor Compression Distillation (VCD) unit following the addition of chromic and sulphuric acid for the microbial stabilization of urine (Carter, Tobias et al. 2012), yielding a water recovery percentage of only 70% due to scaling control. Additionally, Vapor Compression Distillation of 1.5 L urine cap-1 d-1 has a significantly higher power demand, 6.5 W cap-1, compared to a combination of electrodialysis (ED) and reverse osmosis (RO) with 1.9 and 0.6 W cap-1, respectively (Udert and Wächter 2012). A Water Treatment Unit Breadboard (WTUB) has been developed which combines a physicochemical and biological treatment. The aim was to recover 90% of the water in urine, condensate and shower water produced by one crew member, and this life support testbed facility was inspired by the MELiSSA loop concept, ESA's Life Support System. Our experimental results showed that: 1) using a crystallisation reactor prior to the nitrification reduced scaling risks by Ca2+ and Mg2+ removal; 2) the stabilization of urine diluted with condensate resulted in the biological conversion of 99% of Total Kjeldahl nitrogen into nitrate in the biological nitrification reactor; 3) salinity and nitrate produced could be removed by 60-80% by electrodialysis; 4) shower water contaminated with skin microbiota and Neutrogena® soap could be mixed with electrodialysis diluate and filtered directly over a ceramic nanofiltration at 93% water recovery and 5

  17. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 1; Steady Predictions

    Science.gov (United States)

    Allgood, Daniel C.; Graham, Jason S.; Ahuja, Vineet; Hosangadi, Ashvin

    2008-01-01

    levels in CFD based flowpath modeling of the facility. The analyses tools used here expand on the multi-element unstructured CFD which has been tailored and validated for impingement dynamics of dry plumes, complex valve/feed systems, and high pressure propellant delivery systems used in engine and component test stands at NASA SSC. The analyses performed in the evaluation of the sub-scale diffuser facility explored several important factors that influence modeling and understanding of facility operation such as (a) importance of modeling the facility with Real Gas approximation, (b) approximating the cluster of steam ejector nozzles as a single annular nozzle, (c) existence of mixed subsonic/supersonic flow downstream of the turning duct, and (d) inadequacy of two-equation turbulence models in predicting the correct pressurization in the turning duct and expansion of the second stage steam ejectors. The procedure used for modeling the facility was as follows: (i) The engine, test cell and first stage ejectors were simulated with an axisymmetric approximation (ii) the turning duct, second stage ejectors and the piping downstream of the second stage ejectors were analyzed with a three-dimensional simulation utilizing a half-plane symmetry approximation. The solution i.e. primitive variables such as pressure, velocity components, temperature and turbulence quantities were passed from the first computational domain and specified as a supersonic boundary condition for the second simulation. (iii) The third domain comprised of the exit diffuser and the region in the vicinity of the facility (primary included to get the correct shock structure at the exit of the facility and entrainment characteristics). The first set of simulations comprising the engine, test cell and first stage ejectors was carried out both as a turbulent real gas calculation as well as a turbulent perfect gas calculation. A comparison for the two cases (Real Turbulent and Perfect gas turbulent) of the Ma

  18. Clinical, pathological, and radiological characteristics of solitary ground-glass opacity lung nodules on high-resolution computed tomography

    Directory of Open Access Journals (Sweden)

    Qiu ZX

    2016-09-01

    Zhi-Xin Qiu,1 Yue Cheng,1 Dan Liu,1 Wei-Ya Wang,2 Xia Wu,2 Wei-Lu Wu,2 Wei-Min Li1,2 (1Department of Respiratory Medicine, 2Department of Pathology, West China Hospital, Sichuan University, Chengdu, People's Republic of China). Background: Lung nodules are being detected at an increasing rate year by year as high-resolution computed tomography (HRCT) becomes widely used. The ground-glass opacity nodule is one of the special types of pulmonary nodules confirmed to be closely associated with early-stage lung cancer. Very little is known about solitary ground-glass opacity nodules (SGGNs). In this study, we analyzed the clinical, pathological, and radiological characteristics of SGGNs on HRCT. Methods: A total of 95 resected SGGNs were evaluated with HRCT scans. The clinical, pathological, and radiological characteristics of these cases were analyzed. Results: Eighty-one adenocarcinomas and 14 benign nodules were observed. The nodules included 12 (15%) adenocarcinoma in situ (AIS), 14 (17%) minimally invasive adenocarcinoma (MIA), and 55 (68%) invasive adenocarcinoma (IA). No patients with recurrence have been identified to date. The positive expression rates of anaplastic lymphoma kinase and ROS-1 (proto-oncogene tyrosine-protein kinase ROS) were only 2.5% and 8.6%, respectively. The specificity and accuracy of HRCT for invasive lung adenocarcinoma were 85.2% and 87.4%. The standard uptake values of only two patients determined by 18F-FDG positron emission tomography/computed tomography (PET/CT) were above 2.5. The size, density, shape, and pleural tag of nodules were significant factors that differentiated IA from AIS and MIA. Moreover, the size, shape, margin, pleural tag, vascular cluster, bubble-like sign, and air bronchogram of nodules were significant determinants for mixed ground-glass opacity nodules (all P<0.05). Conclusion: We analyzed the clinical, pathological, and radiological characteristics of SGGNs on HRCT and found that the size, density

  19. Injection and extraction computer control system of HIRFL-SSC (Heavy Ion Research Facility of Lanzhou - Separated Sector Cyclotron)

    CERN Document Server

    Zhang Wei; Chen Yun; Zhang Xia; Hu Jian Jun; Xu Xing Ming

    2002-01-01

    The injection and extraction computer control system of HIRFL-SSC (Heavy Ion Research Facility of Lanzhou - Separated Sector Cyclotron) is introduced. The software is described briefly, and the hardware structure is presented in detail. The computer control system allows the injection and extraction settings to be adjusted from a PC through a Windows-style operator interface, making the adjustment convenient and accurate.

  20. A computer-controlled experimental facility for krypton and xenon adsorption coefficient measurements on activated carbons

    Energy Technology Data Exchange (ETDEWEB)

    Del Serra, Daniele; Aquaro, Donato; Mazed, Dahmane; Pazzagli, Fabio; Ciolini, Riccardo, E-mail: r.ciolini@ing.unipi.it

    2015-07-15

    Highlights: • An experimental test facility for qualification of the krypton and xenon adsorption properties of activated carbons. • The measurement of the adsorption coefficient by using the elution curve method. • The simultaneous on-line control of the main physical parameters influencing the adsorption property of activated carbon. - Abstract: An automated experimental test facility, intended specifically for qualification of the krypton and xenon adsorption properties of activated carbon samples, was designed and constructed. The experimental apparatus was designed to allow an on-line control of the main physical parameters influencing greatly the adsorption property of activated carbon. The measurement of the adsorption coefficient, based upon the elution curve method, can be performed with a precision better than 5% at gas pressure values ranging from atmospheric pressure up to 9 bar and bed temperature from 0 up to 80 °C. The carrier gas flow rate can be varied from 40 up to 4000 N cm3 min-1, allowing measurement of the dynamic adsorption coefficient with face velocities from 0.3 up to 923 cm min-1, depending on the gas pressure and the test cell being used. The moisture content of the activated carbon can be precisely controlled during measurement, through the relative humidity of the carrier gas.

  1. Computer simulations of comet- and asteroidlike bodies passing through the Venusian atmosphere: Preliminary results on atmospheric and ground shock effects

    Science.gov (United States)

    Roddy, D.; Hatfield, D.; Hassig, P.; Rosenblatt, M.; Soderblom, L.; Dejong, E.

    1992-01-01

    We have completed computer simulations that model shock effects in the venusian atmosphere caused during the passage of two cometlike bodies 100 m and 1000 m in diameter and an asteroidlike body 10 km in diameter. Our objective is to examine hypervelocity-generated shock effects in the venusian atmosphere for bodies of different types and sizes in order to understand the following: (1) their deceleration and depth of penetration through the atmosphere; and (2) the onset of possible ground-surface shock effects such as splotches, craters, and ejecta formations. The three bodies were chosen to include both a range of general conditions applicable to Venus as well as three specific cases of current interest. These calculations use a new multiphase computer code (DICE-MAZ) designed by California Research & Technology for shock-dynamics simulations in complex environments. The code was tested and calibrated in large-scale explosion, cratering, and ejecta research. It treats a wide range of different multiphase conditions, including material types (vapor, melt, solid), particle-size distributions, and shock-induced dynamic changes in velocities, pressures, temperatures (internal energies), densities, and other related parameters, all of which were recorded in our calculations.

  2. The Use of Public Computing Facilities by Library Patrons: Demography, Motivations, and Barriers

    Science.gov (United States)

    DeMaagd, Kurt; Chew, Han Ei; Huang, Guanxiong; Khan, M. Laeeq; Sreenivasan, Akshaya; LaRose, Robert

    2013-01-01

    Public libraries play an important part in the development of a community. Today, they are seen as more than storehouses of books; they are also responsible for the dissemination of online and offline information. Public access computers are becoming increasingly popular as more and more people understand the need for Internet access. Using a…

  3. Computer Simulation of an Anesthesia Service at a U.S. Army Medical Treatment Facility

    Science.gov (United States)

    1999-08-01

    Running head: ANESTHESIA SIMULATION. Computer Simulation of an Anesthesia Service at a U.S. Army Medical Treatment... bettering marketing efforts). There are several articles that address staffing from the perspective of what type of provider is the most cost

  4. Prognostic value of the ratio of ground glass opacity on computed tomography in small lung adenocarcinoma: A meta-analysis

    Science.gov (United States)

    Miao, Xiao-Hui; Yao, Yan-Wen; Yuan, Dong-Mei; Lv, Yan-Ling; Zhan, Ping; Lv, Tang-Feng; Liu, Hong-Bing

    2012-01-01

    Introduction Lung cancer is the leading cause of cancer-associated death. In many countries, adenocarcinoma is the most common histologic type of lung cancer. Previously, few factors had been identified as prognostic indicators for patients with small lung adenocarcinoma. Recently, the ground glass opacity (GGO) area found on high-resolution computed tomography (HRCT) scanning was identified as a prognostic indicator in some studies, but no clear consensus has been reached. Methods The PubMed/MEDLINE, EMBASE, Cochrane Library and SpringerLink electronic databases were searched for articles related to ground glass opacity on computed tomography in patients with small lung adenocarcinoma. Data were extracted and analyzed independently by two investigators. An estimate of the hazard ratio (HR) comparing high GGO ratio with low GGO ratio was extracted from each study. The respective HRs were combined into a pooled HR, and a 95% confidence interval (CI) was calculated for each study. Publication heterogeneity was assessed graphically using Begg's funnel plot. All the statistical tests used in our meta-analysis were performed with STATA version 11. Results Thirteen studies, encompassing 2,027 patients, were included in our meta-analysis. Ten of these studies reported that the GGO ratio in small lung adenocarcinoma is a good prognostic indicator. Seven studies were combined in a meta-analysis using overall survival (OS) as the end point of interest. The weighted HR of these 7 studies was 0.85, with a 95% CI ranging from 0.78 to 0.93 (P=0.009). For the surgical patient population, the primary endpoint of relapse-free survival (RFS) was superior with a high GGO area on computed tomography (combined HR 0.82, 95% CI 0.74-0.90; P=0.007). Conclusions The result of our meta-analysis suggests that the GGO area measured on HRCT has prognostic value for overall survival and relapse-free survival in small lung adenocarcinoma. The GGO ratio may be an independent prognostic
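    For orientation only, the following sketch shows the standard fixed-effect (inverse-variance) pooling of log hazard ratios that underlies a pooled HR of this kind; the input values are illustrative placeholders, not the studies analyzed in the paper.

```python
import math

def pool_hazard_ratios(hrs, cis):
    """Fixed-effect (inverse-variance) pooling of hazard ratios.

    hrs : per-study hazard ratios
    cis : per-study (lower, upper) 95% confidence limits
    Returns the pooled HR and its 95% CI.
    """
    weights, log_hrs = [], []
    for hr, (lo, hi) in zip(hrs, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE of log(HR) from the CI width
        weights.append(1.0 / se**2)
        log_hrs.append(math.log(hr))
    pooled_log = sum(w * x for w, x in zip(weights, log_hrs)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (math.exp(pooled_log - 1.96 * pooled_se), math.exp(pooled_log + 1.96 * pooled_se))
    return math.exp(pooled_log), ci

# Illustrative input only (not the studies from the meta-analysis).
hr, ci = pool_hazard_ratios([0.80, 0.90, 0.85], [(0.70, 0.92), (0.78, 1.04), (0.72, 1.00)])
print(f"pooled HR = {hr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```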

  5. Workplan/RCRA Facility Investigation/Remedial Investigation Report for the Old Radioactive Waste Burial Ground 643-E, S01-S22 - Volume I - Text and Volume II - Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Conner, K.R.

    2000-12-12

    This document presents the assessment of environmental impacts resulting from releases of hazardous substances from the facilities in the Old Radioactive Waste Burial Ground 643-E, including Solvent Tanks 650-01E to 650-22E, also referred to as Solvent Tanks at the Savannah River Site, Aiken, South Carolina.

  6. Cambridge-Cranfield High Performance Computing Facility (HPCF) purchases ten Sun Fire(TM) 15K servers to dramatically increase power of eScience research

    CERN Multimedia

    2002-01-01

    "The Cambridge-Cranfield High Performance Computing Facility (HPCF), a collaborative environment for data and numerical intensive computing privately run by the University of Cambridge and Cranfield University, has purchased 10 Sun Fire(TM) 15K servers from Sun Microsystems, Inc.. The total investment, which includes more than $40 million in Sun technology, will dramatically increase the computing power, reliability, availability and scalability of the HPCF" (1 page).

  7. Computer simulated building energy consumption for verification of energy conservation measures in network facilities

    Science.gov (United States)

    Plankey, B.

    1981-01-01

    A computer program called ECPVER (Energy Consumption Program - Verification) was developed to simulate all energy loads for any number of buildings. The program computes simulated daily, monthly, and yearly energy consumption, which can be compared with actual meter readings for the same time period. Such comparison can lead to validation of the model under a variety of conditions, which allows it to be used to predict future energy savings due to energy conservation measures. Predicted savings can then be compared with actual savings to verify the effectiveness of those energy conservation changes. This verification procedure is planned to be an important advancement in the Deep Space Network Energy Project, which seeks to reduce energy cost and consumption at all DSN Deep Space Stations.
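    A minimal sketch of the verification logic described above, assuming simple monthly totals: validate the model against meter readings, then compare predicted with actual savings after a conservation change. All names and numbers are hypothetical and are not part of ECPVER.

```python
def percent_error(simulated, metered):
    """Model-validation metric: relative error of simulated vs. metered consumption."""
    return 100.0 * (sum(simulated) - sum(metered)) / sum(metered)

def verify_conservation_measure(baseline_metered, post_metered, baseline_sim, post_sim):
    """Compare predicted savings (from the validated model) with actual metered savings."""
    predicted_savings = sum(baseline_sim) - sum(post_sim)
    actual_savings = sum(baseline_metered) - sum(post_metered)
    return predicted_savings, actual_savings

# Illustrative monthly kWh figures (hypothetical, not DSN data).
baseline_metered = [120e3, 118e3, 121e3]
baseline_sim     = [119e3, 117e3, 122e3]
post_metered     = [109e3, 108e3, 110e3]
post_sim         = [108e3, 107e3, 111e3]

print(f"validation error: {percent_error(baseline_sim, baseline_metered):.1f}%")
pred, actual = verify_conservation_measure(baseline_metered, post_metered, baseline_sim, post_sim)
print(f"predicted savings {pred:.0f} kWh vs actual savings {actual:.0f} kWh")
```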

  8. Reliability Lessons Learned From GPU Experience With The Titan Supercomputer at Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gallarno, George [Christian Brothers University]; Rogers, James H [ORNL]; Maxwell, Don E [ORNL]

    2015-01-01

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large-scale. The world's second fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large-scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  9. Universal Drive Train Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This vehicle drive train research facility is capable of evaluating helicopter and ground vehicle power transmission technologies in a system level environment. The...

  10. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground Based Computation and Control Systems and Human Health and Safety

    Science.gov (United States)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

    In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems and on human health and safety, as well as the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in Earth-surface, atmospheric flight, and space flight environments. Three twentieth century technological developments, 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools (e.g. ground based test methods as well as high energy particle transport and reaction codes) needed to design, test, and verify the safety and reliability of modern complex electronic systems, as well as to evaluate effects on human health and safety. The effects of primary cosmic ray particles, and secondary particle showers produced by nuclear reactions with spacecraft materials, can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground based computational and control systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Accumulation of both primary cosmic ray and secondary cosmic ray induced particle shower radiation dose is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO).

  11. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C.
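    For readers unfamiliar with the underlying formula, the sketch below shows a sector-averaged, ground-level, straight-line Gaussian plume concentration of the kind ANEMOS evaluates on its 16-sector grid. It is not the ANEMOS code itself; the power-law coefficients for the vertical dispersion parameter are illustrative assumptions.

```python
import math

def sector_average_concentration(Q, u, x, H=0.0, n_sectors=16, a=0.06, b=0.92):
    """Sector-averaged ground-level air concentration from a straight-line Gaussian plume
    (if Q is in Bq/s, the result is in Bq/m^3).

    Q         : release rate
    u         : wind speed at the height of release (m/s)
    x         : downwind distance (m)
    H         : effective release height (m)
    n_sectors : number of wind-rose sectors (16 in an ANEMOS-style grid)
    a, b      : illustrative power-law coefficients for sigma_z = a * x**b (m)
    """
    sigma_z = a * x**b                            # vertical dispersion parameter
    sector_width = 2.0 * math.pi * x / n_sectors  # arc length of one 22.5-degree sector
    return (Q * math.sqrt(2.0 / math.pi)
            / (u * sigma_z * sector_width)
            * math.exp(-H**2 / (2.0 * sigma_z**2)))

# Illustrative: a 1 Bq/s ground-level release, 3 m/s wind, at several downwind distances.
for x in (500.0, 1000.0, 5000.0):
    print(x, sector_average_concentration(Q=1.0, u=3.0, x=x))
```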

  12. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground-Based Computation and Control Systems, and Human Health and Safety

    Science.gov (United States)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

    Three twentieth century technological developments, 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools needed to design, test, and verify the safety and reliability of modern complex technological systems. The effects of primary cosmic ray particles and secondary particle showers produced by nuclear reactions with the atmosphere can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground based computational and control systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Finally, accumulation of both primary cosmic ray and secondary cosmic ray induced particle shower radiation dose is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO). In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems as well as on human health, and the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in ground-based, atmospheric flight, and space flight environments. Ground test methods applied to microelectronic components and systems are used in combination with radiation transport and reaction codes to predict the performance of microelectronic systems in their operating environments. Similar radiation transport

  13. A Model of an Expert Computer Vision and Recognition Facility with Applications of a Proportion Technique.

    Science.gov (United States)

    2014-09-26

    ... a function called WHATISFACE [Rhodes][Tucker][Hogg][Sowa]. The model offering the most specific information about structure and... Hogg, D., "Model-based vision: a program to see a walking person", Image and Vision Computing, Vol. 1, No. 1, February 1983, pp. 5-20; "...Systems", Addison-Wesley Publishing Company, Inc., Massachusetts, 1983.

  14. Successful marriage: American Panel Corporation and LG Philips LCD custom-designed avionic, shipboard, and rugged ground vehicle display modules from a consumer-oriented fabrication facility

    Science.gov (United States)

    Dunn, William; Garrett, Kimberly S.

    2001-09-01

    American Panel Corporation (APC) believes the use of custom-designed (instead of ruggedized commercial) AMLCD cells is the only way to meet the specific environmental and performance requirements of the military/commercial avionic, shipboard and rugged ground vehicle markets. The APC/LG.Philips LCD (LG) custom approach mitigates risk to the end-user in many ways. As a part of the APC/LG long-term agreement, LG has committed to provide module-level equivalent (form, fit and function equivalent) panels for a period of ten years. No other commercial glass manufacturer has provided such an agreement. With the use of LG's commercial production manufacturing capabilities, APC/LG can provide the opportunity to procure a lifetime buy for any program, with delivery of the entire lot within six months of order placement. This ensures that the entire production program will receive identical glass for every unit. The APC/LG relationship works where others have failed due to the number of years spent cultivating the mutual trust and respect necessary for establishing such a partnership, LG's interest in capturing the market share of this niche application, and the magnitude of the initial up-front investment by APC in engineering, tooling, facilities, production equipment, and LCD cell inventory.

  15. Criteria report for an intermediate storage facility. Criteria for the evaluation of potential sites for an above-ground intermediate storage facility for the retrieved radioactive waste from the Asse II cavern

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-01-10

    The BfS judged that the retrieval of the radioactive wastes from the Schacht Asse II is the best option for decommissioning. The recovered radioactive wastes shall be transported in special containers and conditioned in facilities near the site for transport to a final repository. The criteria for the selection of a site for the required above-ground intermediate storage facility are defined, including the criteria for the evaluation procedure.

  16. System Requirements Analysis for a Computer-based Procedure in a Research Reactor Facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jaek Wan; Jang, Gwi Sook; Seo, Sang Moon; Shin, Sung Ki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    This can address many of the routine problems related to human error in the use of conventional, hard-copy operating procedures. An operation support system is also required in a research reactor. A well-made CBP can address the staffing issues of a research reactor and reduce human errors by minimizing the operator's routine tasks. A CBP for a research reactor has not been proposed yet. Also, CBPs developed for nuclear power plants have various powerful technical functions to cover complicated plant operation situations. However, many of these functions may not be required for a research reactor. Thus, it is not reasonable to apply such a CBP to a research reactor directly, and customizing it is not cost-effective. Therefore, a compact CBP should be developed for a research reactor. This paper introduces the high-level requirements derived from the system requirements analysis activity as the first stage of system implementation. Operation support tools are under consideration for application to research reactors. In particular, with the full digitalization of the main control room, a computer-based procedure system is required as part of the man-machine interface system because it affects the operating staff and the human errors of a research reactor. To establish computer-based procedure system requirements for a research reactor, this paper examined international standards and previous practice at nuclear power plants.

  17. Facile identification of dual FLT3-Aurora A inhibitors: a computer-guided drug design approach.

    Science.gov (United States)

    Chang Hsu, Yung; Ke, Yi-Yu; Shiao, Hui-Yi; Lee, Chieh-Chien; Lin, Wen-Hsing; Chen, Chun-Hwa; Yen, Kuei-Jung; Hsu, John T-A; Chang, Chungming; Hsieh, Hsing-Pang

    2014-05-01

    Computer-guided drug design is a powerful tool for drug discovery. Herein we disclose the use of this approach for the discovery of dual FMS-like receptor tyrosine kinase-3 (FLT3)-Aurora A inhibitors against cancer. An Aurora hit compound was selected as a starting point, from which 288 virtual molecules were screened. Subsequently, some of these were synthesized and evaluated for their capacity to inhibit FLT3 and Aurora kinase A. To further enhance FLT3 inhibition, structure-activity relationship studies of the lead compound were conducted through a simplification strategy and bioisosteric replacement, followed by the use of computer-guided drug design to prioritize molecules bearing a variety of different terminal groups in terms of favorable binding energy. Selected compounds were then synthesized, and their bioactivity was evaluated. Of these, one novel inhibitor was found to exhibit excellent inhibition of FLT3 and Aurora kinase A and exert a dramatic antiproliferative effect on MOLM-13 and MV4-11 cells, with an IC50 value of 7 nM. Accordingly, it is considered a highly promising candidate for further development.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. Computational design of high efficiency release targets for use at ISOL facilities

    CERN Document Server

    Liu, Y

    1999-01-01

    This report describes efforts made at the Oak Ridge National Laboratory to design high-efficiency-release targets that simultaneously incorporate the short diffusion lengths, high permeabilities, controllable temperatures, and heat-removal properties required for the generation of useful radioactive ion beam (RIB) intensities for nuclear physics and astrophysics research using the isotope separation on-line (ISOL) technique. Short diffusion lengths are achieved either by using thin fibrous target materials or by coating thin layers of selected target material onto low-density carbon fibers such as reticulated-vitreous-carbon fiber (RVCF) or carbon-bonded-carbon fiber (CBCF) to form highly permeable composite target matrices. Computational studies that simulate the generation and removal of primary beam deposited heat from target materials have been conducted to optimize the design of target/heat-sink systems for generating RIBs. The results derived from diffusion release-rate simulation studies for selected t...

  20. A study of coupled rotor-fuselage vibration with higher harmonic control using a symbolic computing facility

    Science.gov (United States)

    Papavassiliou, I.; Venkatesan, C.; Friedmann, P. P.

    1990-01-01

    A fundamental study of vibration prediction and vibration reduction in helicopters using active controls was performed. The nonlinear equations of motion for a coupled rotor/flexible-fuselage system have been derived using computer algebra on a special-purpose symbolic computing facility. The details of the derivation using the MACSYMA program are described. The trim state and vibratory response of the helicopter are obtained in a single pass by applying the harmonic balance technique and simultaneously satisfying the trim and the vibratory response of the helicopter for all rotor and fuselage degrees of freedom. The influence of the fuselage flexibility on the vibratory response is studied. It is shown that the conventional single-frequency higher harmonic control (HHC) is capable of reducing either the hub loads or the fuselage vibrations, but not both simultaneously. It is demonstrated that for simultaneous reduction of hub shears and fuselage vibrations a new scheme called multiple higher harmonic control (MHHC) is required. The fundamental aspects of this scheme and its uniqueness are described in detail, providing new insight on vibration reduction in helicopters using HHC.

  1. Simulation concept of NICA-MPD-SPD Tier0-Tier1 computing facilities

    Science.gov (United States)

    Korenkov, V. V.; Nechaevskiy, A. V.; Ososkov, G. A.; Pryahina, D. I.; Trofomov, V. V.; Uzhinskiy, A. V.

    2016-09-01

    The simulation concept for grid-cloud services of contemporary HENP experiments of the Big Data scale was formulated on the basis of experience with the simulation system developed at LIT JINR, Dubna. This system is intended to improve the efficiency of the design and development of a wide class of grid-cloud structures by using the work quality indicators of a real system to design and predict its evolution. For these purposes the simulation program is combined with a real monitoring system of the grid-cloud service through a special database (DB). The DB performs acquisition and analysis of monitoring data to carry out dynamic corrections of the simulation. Such an approach allows us to construct a general model pattern that does not depend on a specific simulated object, while the parameters describing this object can be used as input to run the pattern. The simulation of some processes of the NICA-MPD-SPD Tier0-Tier1 distributed computing is considered as an example application of our approach.
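    As a toy illustration of the kind of model pattern described above, the sketch below simulates Poisson job arrivals competing for a fixed number of job slots at one tier; in the concept above, the arrival and service parameters would be supplied (and dynamically corrected) from the monitoring database. All parameter values here are hypothetical.

```python
import heapq, random

def simulate_tier(jobs, arrival_rate, service_rate, n_slots, seed=1):
    """Toy discrete-event simulation of a computing tier: Poisson job arrivals
    served in order by n_slots identical job slots. Returns the mean job wait time."""
    random.seed(seed)
    t = 0.0
    free = [0.0] * n_slots          # next-free time of each job slot
    heapq.heapify(free)
    waits = []
    for _ in range(jobs):
        t += random.expovariate(arrival_rate)      # next job arrival
        slot_free = heapq.heappop(free)            # earliest-available slot
        start = max(t, slot_free)                  # wait if all slots are busy
        waits.append(start - t)
        heapq.heappush(free, start + random.expovariate(service_rate))
    return sum(waits) / len(waits)

# Hypothetical parameters: 100 jobs/hour, mean job length ~1.7 hours, 180 job slots.
print("mean wait (h):", simulate_tier(jobs=50_000, arrival_rate=100.0,
                                      service_rate=0.6, n_slots=180))
```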

  2. Computational Design of High Efficiency Release Targets for Use at ISOL Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Alton, G.D.; Liu, Y.; Middleton, J.W.

    1998-11-04

    This report describes efforts made at the Oak Ridge National Laboratory to design high-efficiency-release targets that simultaneously incorporate the short diffusion lengths, high permeabilities, controllable temperatures, and heat removal properties required for the generation of useful radioactive ion beam (RIB) intensities for nuclear physics and astrophysics research using the isotope separation on-line (ISOL) technique. Short diffusion lengths are achieved either by using thin fibrous target materials or by coating thin layers of selected target material onto low-density carbon fibers such as reticulated vitreous carbon fiber (RVCF) or carbon-bonded carbon fiber (CBCF) to form highly permeable composite target matrices. Computational studies which simulate the generation and removal of primary-beam-deposited heat from target materials have been conducted to optimize the design of target/heat-sink systems for generating RIBs. The results derived from diffusion release-rate simulation studies for selected targets and thermal analyses of temperature distributions within a prototype target/heat-sink system subjected to primary ion beam irradiation will be presented in this report.

  3. An interactive computer program for randomization analysis of response curves with facilities for multiple comparisons.

    Science.gov (United States)

    Tan, E S; Roos, J M; Volovics, A; Van Baak, M A; Does, R J

    1992-04-01

    An interactive Fortran program, MUCRA, is presented. The program can perform randomization analysis of a completely randomized or randomized-blocks design extended to growth and response curves. A single-step Scheffé-type procedure as well as Peritz's closed step-down procedure have been implemented, which control the familywise type I error rate. In general, MUCRA is suitable as a computer tool for a distribution-free analysis of variance with repeated measures. The use of MUCRA is demonstrated by analyzing the effects of oxprenolol and atenolol on exercise heart rate. Oxprenolol is a non-selective beta-blocker with moderate intrinsic sympathomimetic activity (ISA), given by the Oros delivery system. Atenolol is a beta 1-selective blocker without ISA. A randomized placebo-controlled crossover design was used to compare the effects of the beta-blockers on heart rate during a progressive maximal exercise test on a bicycle ergometer. Application of the Scheffé-type procedure showed that the two drugs significantly (alpha = .05) reduce the heart rate during the exercise test at the three prechosen times (2, 5, and 24 hr) after intake. The reduction from atenolol is more pronounced than from oxprenolol Oros at 2 and 5 hr.
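    The core randomization idea behind such an analysis can be sketched for a two-treatment crossover as a sign-flip permutation test on within-subject differences; this is a simplified stand-in for MUCRA's single-step and step-down multiple-comparison procedures, and the data below are invented for illustration.

```python
import random

def randomization_test_paired(diffs, n_perm=10_000, seed=0):
    """Randomization (permutation) test for paired data, e.g. heart rate under
    drug A minus drug B for each subject in a crossover design.

    Under the null hypothesis the sign of each within-subject difference is
    exchangeable, so the observed mean difference is compared against the
    distribution obtained by random sign flips.
    """
    random.seed(seed)
    observed = abs(sum(diffs) / len(diffs))
    count = 0
    for _ in range(n_perm):
        flipped = [d if random.random() < 0.5 else -d for d in diffs]
        if abs(sum(flipped) / len(flipped)) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)   # two-sided p-value

# Illustrative within-subject heart-rate differences (beats/min), not the trial data.
diffs = [-12, -8, -15, -6, -11, -9, -4, -13, -7, -10]
print("p =", randomization_test_paired(diffs))
```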

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  7. Patterns of computed tomography surveillance in survivors of colorectal cancer at Veterans Health Administration facilities.

    Science.gov (United States)

    Sehdev, Amikar; Sherer, Eric A; Hui, Siu L; Wu, Jingwei; Haggstrom, David A

    2017-06-15

    Annual computed tomography (CT) scans are a component of the current standard of care for the posttreatment surveillance of survivors of colorectal cancer (CRC) after curative-intent resection. The authors conducted a retrospective study with the primary aim of assessing patient, physician, and organizational characteristics associated with the receipt of CT surveillance among veterans. The Department of Veterans Affairs Central Cancer Registry was used to identify patients diagnosed with AJCC collaborative stage I to III CRC between 2001 and 2009. Patient sociodemographic and clinical (ie, CRC stage and comorbidity) characteristics, provider specialty, and organizational characteristics were measured. Hierarchical multivariable logistic regression models were used to assess the association of patient, provider, and organizational characteristics with receipt of 1) consistently guideline-concordant care (at least 1 CT every 12 months for both of the first 2 years of CRC surveillance) versus no CT receipt and 2) potential overuse (>1 CT every 12 months during the first 2 years of CRC surveillance) of CRC surveillance using CT. The authors also analyzed the impact of the 2005 American Society of Clinical Oncology update in CRC surveillance guidelines on care received over time. For 2263 survivors of stage II/III CRC who were diagnosed after 2005, 19.4% of patients received no surveillance CT, whereas potential overuse occurred in both surveillance years for 14.9% of patients. Guideline-concordant care was associated with younger age, higher stage of disease (stage III vs stage II), and geographic region. In adjusted analyses, younger age and higher stage of disease (stage III vs stage II) were found to be associated with overuse. There was no significant difference in the annual rate of CT scanning noted across time periods (year ≤ 2005 vs year > 2005). Among a minority of veteran survivors of CRC, both underuse and potential overuse of CT surveillance

  8. The Design of Ground Test Equipment for a Special Computer

    Institute of Scientific and Technical Information of China (English)

    田琨

    2001-01-01

    The research and development of airborne electronic equipment requires ground test equipment with complete functions. This paper briefly introduces the hardware and software design of the microprogram-based ground test equipment for an airborne computer and the way it is implemented.

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and components installation is now also deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  10. Assessment of the integrity of structural shielding of four computed tomography facilities in the greater Accra region of Ghana.

    Science.gov (United States)

    Nkansah, A; Schandorf, C; Boadu, M; Fletcher, J J

    2013-08-01

    The structural shielding thicknesses of the walls of four computed tomography (CT) facilities in Ghana were re-evaluated to verify the shielding integrity using the new shielding design methods recommended by the National Council on Radiation Protection and Measurements (NCRP). The shielding thicknesses obtained ranged from 120 to 155 mm using the default DLP values proposed by the European Commission and from 110 to 168 mm using DLP values derived from the four CT manufacturers. These values are within the accepted standard concrete wall thickness of 102 to 152 mm prescribed by the NCRP. Ultrasonic pulse testing of all walls indicated that they are of good quality and free of voids, since the estimated pulse velocities were within the range of 3.496±0.005 km s⁻¹. The average dose equivalent rate estimated for supervised areas is 3.4±0.27 µSv week⁻¹ and that for the controlled area is 18.0±0.15 µSv week⁻¹, both of which are within acceptable values.
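    For context, a simplified NCRP-147-style barrier calculation of the kind used in such re-evaluations is sketched below: derive the required barrier transmission from the design goal, workload and distance, then convert it to a concrete thickness via tenth-value layers. The design-goal, workload, and TVL numbers are assumptions for illustration and should be taken from NCRP 147 and the facility data in practice.

```python
import math

def required_concrete_thickness_mm(P_mGy_per_week, K1_mGy_per_patient, patients_per_week,
                                   distance_m, tvl1_mm=120.0, tvle_mm=100.0):
    """Sketch of an NCRP-147-style barrier sizing calculation.

    P_mGy_per_week     : shielding design goal behind the barrier
    K1_mGy_per_patient : scattered air kerma at 1 m per patient (derived from DLP for CT)
    patients_per_week  : weekly workload
    distance_m         : source-to-barrier distance
    tvl1_mm, tvle_mm   : assumed first and equilibrium tenth-value layers of concrete
    """
    # Required transmission factor of the barrier.
    B = P_mGy_per_week * distance_m**2 / (K1_mGy_per_patient * patients_per_week)
    if B >= 1.0:
        return 0.0                          # distance alone meets the design goal
    n_tvl = -math.log10(B)                  # number of tenth-value layers needed
    return max(0.0, tvl1_mm + (n_tvl - 1.0) * tvle_mm)

# Illustrative numbers only (not the values measured at the Ghanaian facilities).
print(round(required_concrete_thickness_mm(P_mGy_per_week=0.02,
                                           K1_mGy_per_patient=0.09,
                                           patients_per_week=60,
                                           distance_m=3.0), 1), "mm of concrete")
```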

  11. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  12. Computer code analysis of the steam generator in a thermal-hydraulic test facility simulating a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Virtanen, E.

    1995-12-31

    In this study, three loss-of-feedwater-type experiments performed with the PACTEL facility were calculated with two computer codes. The purpose of the experiments was to gain information about the behaviour of a horizontal steam generator in a situation where the water level on the secondary side of the steam generator is decreasing. At the same time, data that can be used in the assessment of thermal-hydraulic computer codes were assembled. The purpose of the work was to study the capabilities of two computer codes, APROS version 2.11 and RELAP5/MOD3.1, to calculate the phenomena in a horizontal steam generator. In order to make the comparison of the calculation results easier, the same kind of steam generator model was built for both codes. Only the steam generator was modelled; the rest of the facility was given to the codes as a boundary condition. (23 refs.).

  13. The Repeated Computation of the Bond Length and Ground-State Energy for H2+

    Institute of Scientific and Technical Information of China (English)

    李旭; 胡先权

    2002-01-01

    The Ritz variational method was used to find the numerical relation between the energy near the ground state of the hydrogen molecular ion H2+ and the changes of the variational parameter and the bond length. The computational formula for the bond length and ground-state energy of H2+ was also obtained by means of parabolic interpolation. The computed results are much closer to the experimental values than those of Refs. [1,2].
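    The parabolic-interpolation step mentioned above can be sketched as follows: fit a parabola through three (bond length, variational energy) points bracketing the minimum and take its vertex as the estimated equilibrium bond length and ground-state energy. The sample energies below are placeholders, not the values computed in the paper.

```python
import numpy as np

def parabolic_minimum(r, e):
    """Fit E(R) = a*R^2 + b*R + c through three (R, E) points and return the
    vertex (R_min, E_min) as the estimated equilibrium bond length and energy."""
    a, b, c = np.polyfit(r, e, 2)            # exact for three points
    r_min = -b / (2.0 * a)
    e_min = np.polyval([a, b, c], r_min)
    return r_min, e_min

# Illustrative variational energies (atomic units) near the H2+ minimum;
# placeholders only, not the paper's computed values.
r = np.array([1.8, 2.0, 2.2])                # bond lengths in Bohr radii
e = np.array([-0.598, -0.603, -0.600])
r_eq, e_eq = parabolic_minimum(r, e)
print(f"R_eq ≈ {r_eq:.3f} a0, E_min ≈ {e_eq:.4f} hartree")
```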

  14. Environmental Assessment and Finding of No Significant Impact: Interim Measures for the Mixed Waste Management Facility Groundwater at the Burial Ground Complex at the Savannah River Site

    Energy Technology Data Exchange (ETDEWEB)

    N/A

    1999-12-08

    The U. S. Department of Energy (DOE) prepared this environmental assessment (EA) to analyze the potential environmental impacts associated with the proposed interim measures for the Mixed Waste Management Facility (MWMF) groundwater at the Burial Ground Complex (BGC) at the Savannah River Site (SRS), located near Aiken, South Carolina. DOE proposes to install a small metal sheet pile dam to impound water around and over the BGC groundwater seepline. In addition, a drip irrigation system would be installed. Interim measures will also address the reduction of volatile organic compounds (VOCs) from "hot-spot" regions associated with the Southwest Plume Area (SWPA). This action is taken as an interim measure for the MWMF in cooperation with the South Carolina Department of Health and Environmental Control (SCDHEC) to reduce the amount of tritium seeping from the BGC southwest groundwater plume. The proposed action of this EA is being planned and would be implemented concurrently with a groundwater corrective action program under the Resource Conservation and Recovery Act (RCRA). On September 30, 1999, SCDHEC issued a modification to the SRS RCRA Part B permit that adds corrective action requirements for four plumes that are currently emanating from the BGC. One of those plumes is the southwest plume. The RCRA permit requires SRS to submit a corrective action plan (CAP) for the southwest plume by March 2000. The permit requires that the initial phase of the CAP prescribe a remedy that achieves a 70-percent reduction in the annual amount of tritium being released from the southwest plume area to Fourmile Branch, a nearby stream. Approval and actual implementation of the corrective measure in that CAP may take several years. As an interim measure, the actions described in this EA would manage the release of tritium from the southwest plume area until the final actions under the CAP can be implemented. This proposed action is expected to reduce the

  15. Computed Tomography Scanning Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION:Advances research in the areas of marine geosciences, geotechnical, civil, and chemical engineering, physics, and ocean acoustics by using high-resolution,...

  16. Design concepts for the Cherenkov Telescope Array CTA: an advanced facility for ground-based high-energy gamma-ray astronomy

    NARCIS (Netherlands)

    Actis, M.; Agnetta, G.; Aharonian, F.; Akhperjanian, A.; Aleksić, J.; Aliu, E.; Allan, D.; Allekotte, I.; Antico, F.; Antonelli, L. A.; Antoranz, P.; Aravantinos, A.; Arlen, T.; Arnaldi, H.; Artmann, S.; Asano, K.; Asorey, H.; Bähr, J.; Bais, A.; Baixeras, C.; Bajtlik, S.; Balis, D.; Bamba, A.; Barbier, C.; Barceló, M.; Barnacka, A.; Barnstedt, J.; Barres de Almeida, U.; Barrio, J. A.; Basso, S.; Bastieri, D.; Bauer, C.; Becerra, J.; Becherini, Y.; Bechtol, K.; Becker, J.; Beckmann, V.; Bednarek, W.; Behera, B.; Beilicke, M.; Belluso, M.; Benallou, M.; Benbow, W.; Berdugo, J.; Berger, K.; Bernardino, T.; Bernlöhr, K.; Biland, A.; Billotta, S.; Bird, T.; Birsin, E.; Bissaldi, E.; Blake, S.; Blanch, O.; Bobkov, A. A.; Bogacz, L.; Bogdan, M.; Boisson, C.; Boix, J.; Bolmont, J.; Bonanno, G.; Bonardi, A.; Bonev, T.; Borkowski, J.; Botner, O.; Bottani, A.; Bourgeat, M.; Boutonnet, C.; Bouvier, A.; Brau-Nogué, S.; Braun, I.; Bretz, T.; Briggs, M. S.; Brun, P.; Brunetti, L.; Buckley, J. H.; Bugaev, V.; Bühler, R.; Bulik, T.; Busetto, G.; Buson, S.; Byrum, K.; Cailles, M.; Cameron, R.; Canestrari, R.; Cantu, S.; Carmona, E.; Carosi, A.; Carr, J.; Carton, P. H.; Casiraghi, M.; Castarede, H.; Catalano, O.; Cavazzani, S.; Cazaux, S.; Cerruti, B.; Cerruti, M.; Chadwick, P. M.; Chiang, J.; Chikawa, M.; Cieślar, M.; Ciesielska, M.; Cillis, A.; Clerc, C.; Colin, P.; Colomé, J.; Compin, M.; Conconi, P.; Connaughton, V.; Conrad, J.; Contreras, J. L.; Coppi, P.; Corlier, M.; Corona, P.; Corpace, O.; Corti, D.; Cortina, J.; Costantini, H.; Cotter, G.; Courty, B.; Couturier, S.; Covino, S.; Croston, J.; Cusumano, G.; Daniel, M. K.; Dazzi, F.; Angelis, A. De; de Cea Del Pozo, E.; de Gouveia Dal Pino, E. M.; de Jager, O.; de La Calle Pérez, I.; de La Vega, G.; de Lotto, B.; de Naurois, M.; de Oña Wilhelmi, E.; de Souza, V.; Decerprit, B.; Deil, C.; Delagnes, E.; Deleglise, G.; Delgado, C.; Dettlaff, T.; di Paolo, A.; di Pierro, F.; Díaz, C.; Dick, J.; Dickinson, H.; Digel, S. W.; Dimitrov, D.; Disset, G.; Djannati-Ataï, A.; Doert, M.; Domainko, W.; Dorner, D.; Doro, M.; Dournaux, J.-L.; Dravins, D.; Drury, L.; Dubois, F.; Dubois, R.; Dubus, G.; Dufour, C.; Durand, D.; Dyks, J.; Dyrda, M.; Edy, E.; Egberts, K.; Eleftheriadis, C.; Elles, S.; Emmanoulopoulos, D.; Enomoto, R.; Ernenwein, J.-P.; Errando, M.; Etchegoyen, A.; Falcone, A. D.; Farakos, K.; Farnier, C.; Federici, S.; Feinstein, F.; Ferenc, D.; Fillin-Martino, E.; Fink, D.; Finley, C.; Finley, J. P.; Firpo, R.; Florin, D.; Föhr, C.; Fokitis, E.; Font, Ll.; Fontaine, G.; Fontana, A.; Förster, A.; Fortson, L.; Fouque, N.; Fransson, C.; Fraser, G. W.; Fresnillo, L.; Fruck, C.; Fujita, Y.; Fukazawa, Y.; Funk, S.; Gäbele, W.; Gabici, S.; Gadola, A.; Galante, N.; Gallant, Y.; García, B.; García López, R. J.; Garrido, D.; Garrido, L.; Gascón, D.; Gasq, C.; Gaug, M.; Gaweda, J.; Geffroy, N.; Ghag, C.; Ghedina, A.; Ghigo, M.; Gianakaki, E.; Giarrusso, S.; Giavitto, G.; Giebels, B.; Giro, E.; Giubilato, P.; Glanzman, T.; Glicenstein, J.-F.; Gochna, M.; Golev, V.; Gómez Berisso, M.; González, A.; González, F.; Grañena, F.; Graciani, R.; Granot, J.; Gredig, R.; Green, A.; Greenshaw, T.; Grimm, O.; Grube, J.; Grudzińska, M.; Grygorczuk, J.; Guarino, V.; Guglielmi, L.; Guilloux, F.; Gunji, S.; Gyuk, G.; Hadasch, D.; Haefner, D.; Hagiwara, R.; Hahn, J.; Hallgren, A.; Hara, S.; Hardcastle, M. J.; Hassan, T.; Haubold, T.; Hauser, M.; Hayashida, M.; Heller, R.; Henri, G.; Hermann, G.; Herrero, A.; Hinton, J. 
A.; Hoffmann, D.; Hofmann, W.; Hofverberg, P.; Horns, D.; Hrupec, D.; Huan, H.; Huber, B.; Huet, J.-M.; Hughes, G.; Hultquist, K.; Humensky, T. B.; Huppert, J.-F.; Ibarra, A.; Illa, J. M.; Ingjald, J.; Inoue, Y.; Inoue, S.; Ioka, K.; Jablonski, C.; Jacholkowska, A.; Janiak, M.; Jean, P.; Jensen, H.; Jogler, T.; Jung, I.; Kaaret, P.; Kabuki, S.; Kakuwa, J.; Kalkuhl, C.; Kankanyan, R.; Kapala, M.; Karastergiou, A.; Karczewski, M.; Karkar, S.; Karlsson, N.; Kasperek, J.; Katagiri, H.; Katarzyński, K.; Kawanaka, N.; Kȩdziora, B.; Kendziorra, E.; Khélifi, B.; Kieda, D.; Kifune, T.; Kihm, T.; Klepser, S.; Kluźniak, W.; Knapp, J.; Knappy, A. R.; Kneiske, T.; Knödlseder, J.; Köck, F.; Kodani, K.; Kohri, K.; Kokkotas, K.; Komin, N.; Konopelko, A.; Kosack, K.; Kossakowski, R.; Kostka, P.; Kotuła, J.; Kowal, G.; Kozioł, J.; Krähenbühl, T.; Krause, J.; Krawczynski, H.; Krennrich, F.; Kretzschmann, A.; Kubo, H.; Kudryavtsev, V. A.; Kushida, J.; La Barbera, N.; La Parola, V.; La Rosa, G.; López, A.; Lamanna, G.; Laporte, P.; Lavalley, C.; Le Flour, T.; Le Padellec, A.; Lenain, J.-P.; Lessio, L.; Lieunard, B.; Lindfors, E.; Liolios, A.; Lohse, T.; Lombardi, S.; Lopatin, A.; Lorenz, E.; Lubiński, P.; Luz, O.; Lyard, E.; Maccarone, M. C.; Maccarone, T.; Maier, G.; Majumdar, P.; Maltezos, S.; Małkiewicz, P.; Mañá, C.; Manalaysay, A.; Maneva, G.; Mangano, A.; Manigot, P.; Marín, J.; Mariotti, M.; Markoff, S.; Martínez, G.; Martínez, M.; Mastichiadis, A.; Matsumoto, H.; Mattiazzo, S.; Mazin, D.; McComb, T. J. L.; McCubbin, N.; McHardy, I.; Medina, C.; Melkumyan, D.; Mendes, A.; Mertsch, P.; Meucci, M.; Michałowski, J.; Micolon, P.; Mineo, T.; Mirabal, N.; Mirabel, F.; Miranda, J. M.; Mirzoyan, R.; Mizuno, T.; Moal, B.; Moderski, R.; Molinari, E.; Monteiro, I.; Moralejo, A.; Morello, C.; Mori, K.; Motta, G.; Mottez, F.; Moulin, E.; Mukherjee, R.; Munar, P.; Muraishi, H.; Murase, K.; Murphy, A. Stj.; Nagataki, S.; Naito, T.; Nakamori, T.; Nakayama, K.; Naumann, C.; Naumann, D.; Nayman, P.; Nedbal, D.; Niedźwiecki, A.; Niemiec, J.; Nikolaidis, A.; Nishijima, K.; Nolan, S. J.; Nowak, N.; O'Brien, P. T.; Ochoa, I.; Ohira, Y.; Ohishi, M.; Ohka, H.; Okumura, A.; Olivetto, C.; Ong, R. A.; Orito, R.; Orr, M.; Osborne, J. P.; Ostrowski, M.; Otero, L.; Otte, A. N.; Ovcharov, E.; Oya, I.; Oziȩbło, A.; Paiano, S.; Pallota, J.; Panazol, J. L.; Paneque, D.; Panter, M.; Paoletti, R.; Papyan, G.; Paredes, J. M.; Pareschi, G.; Parsons, R. D.; Paz Arribas, M.; Pedaletti, G.; Pepato, A.; Persic, M.; Petrucci, P. O.; Peyaud, B.; Piechocki, W.; Pita, S.; Pivato, G.; Płatos, Ł.; Platzer, R.; Pogosyan, L.; Pohl, M.; Pojmański, G.; Ponz, J. D.; Potter, W.; Prandini, E.; Preece, R.; Prokoph, H.; Pühlhofer, G.; Punch, M.; Quel, E.; Quirrenbach, A.; Rajda, P.; Rando, R.; Rataj, M.; Raue, M.; Reimann, C.; Reimann, O.; Reimer, A.; Reimer, O.; Renaud, M.; Renner, S.; Reymond, J.-M.; Rhode, W.; Ribó, M.; Ribordy, M.; Rico, J.; Rieger, F.; Ringegni, P.; Ripken, J.; Ristori, P.; Rivoire, S.; Rob, L.; Rodriguez, S.; Roeser, U.; Romano, P.; Romero, G. E.; Rosier-Lees, S.; Rovero, A. C.; Roy, F.; Royer, S.; Rudak, B.; Rulten, C. B.; Ruppel, J.; Russo, F.; Ryde, F.; Sacco, B.; Saggion, A.; Sahakian, V.; Saito, K.; Saito, T.; Sakaki, N.; Salazar, E.; Salini, A.; Sánchez, F.; Sánchez Conde, M. Á.; Santangelo, A.; Santos, E. 
M.; Sanuy, A.; Sapozhnikov, L.; Sarkar, S.; Scalzotto, V.; Scapin, V.; Scarcioffolo, M.; Schanz, T.; Schlenstedt, S.; Schlickeiser, R.; Schmidt, T.; Schmoll, J.; Schroedter, M.; Schultz, C.; Schultze, J.; Schulz, A.; Schwanke, U.; Schwarzburg, S.; Schweizer, T.; Seiradakis, J.; Selmane, S.; Seweryn, K.; Shayduk, M.; Shellard, R. C.; Shibata, T.; Sikora, M.; Silk, J.; Sillanpää, A.; Sitarek, J.; Skole, C.; Smith, N.; Sobczyńska, D.; Sofo Haro, M.; Sol, H.; Spanier, F.; Spiga, D.; Spyrou, S.; Stamatescu, V.; Stamerra, A.; Starling, R. L. C.; Stawarz, Ł.; Steenkamp, R.; Stegmann, C.; Steiner, S.; Stergioulas, N.; Sternberger, R.; Stinzing, F.; Stodulski, M.; Straumann, U.; Suárez, A.; Suchenek, M.; Sugawara, R.; Sulanke, K. H.; Sun, S.; Supanitsky, A. D.; Sutcliffe, P.; Szanecki, M.; Szepieniec, T.; Szostek, A.; Szymkowiak, A.; Tagliaferri, G.; Tajima, H.; Takahashi, H.; Takahashi, K.; Takalo, L.; Takami, H.; Talbot, R. G.; Tam, P. H.; Tanaka, M.; Tanimori, T.; Tavani, M.; Tavernet, J.-P.; Tchernin, C.; Tejedor, L. A.; Telezhinsky, I.; Temnikov, P.; Tenzer, C.; Terada, Y.; Terrier, R.; Teshima, M.; Testa, V.; Tibaldo, L.; Tibolla, O.; Tluczykont, M.; Todero Peixoto, C. J.; Tokanai, F.; Tokarz, M.; Toma, K.; Torres, D. F.; Tosti, G.; Totani, T.; Toussenel, F.; Vallania, P.; Vallejo, G.; van der Walt, J.; van Eldik, C.; Vandenbroucke, J.; Vankov, H.; Vasileiadis, G.; Vassiliev, V. V.; Vegas, I.; Venter, L.; Vercellone, S.; Veyssiere, C.; Vialle, J. P.; Videla, M.; Vincent, P.; Vink, J.; Vlahakis, N.; Vlahos, L.; Vogler, P.; Vollhardt, A.; Volpe, F.; von Gunten, H. P.; Vorobiov, S.; Wagner, S.; Wagner, R. M.; Wagner, B.; Wakely, S. P.; Walter, P.; Walter, R.; Warwick, R.; Wawer, P.; Wawrzaszek, R.; Webb, N.; Wegner, P.; Weinstein, A.; Weitzel, Q.; Welsing, R.; Wetteskind, H.; White, R.; Wierzcholska, A.; Wilkinson, M. I.; Williams, D. A.; Winde, M.; Wischnewski, R.; Wiśniewski, Ł.; Wolczko, A.; Wood, M.; Xiong, Q.; Yamamoto, T.; Yamaoka, K.; Yamazaki, R.; Yanagita, S.; Yoffo, B.; Yonetani, M.; Yoshida, A.; Yoshida, T.; Yoshikoshi, T.; Zabalza, V.; Zagdański, A.; Zajczyk, A.; Zdziarski, A.; Zech, A.; Ziȩtara, K.; Ziółkowski, P.; Zitelli, V.; Zychowski, P.

    2011-01-01

    Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes. Ground-based gamma-ray astronomy has a huge potential in astrophysics, particle physics and cosmology. CTA is an international initiative to b

  18. Design concepts for the Cherenkov Telescope Array CTA: An advanced facility for ground-based high-energy gamma-ray astronomy

    NARCIS (Netherlands)

    Actis et al., M.; Cazaux, Stéphanie

    2011-01-01

    Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes. Ground-based gamma-ray astronomy has a huge potential in astrophysics, particle physics and cosmology. CTA is an international initiative to b

  19. Facile formation of dendrimer-stabilized gold nanoparticles modified with diatrizoic acid for enhanced computed tomography imaging applications.

    Science.gov (United States)

    Peng, Chen; Li, Kangan; Cao, Xueyan; Xiao, Tingting; Hou, Wenxiu; Zheng, Linfeng; Guo, Rui; Shen, Mingwu; Zhang, Guixiang; Shi, Xiangyang

    2012-11-07

    We report a facile approach to forming dendrimer-stabilized gold nanoparticles (Au DSNPs) through the use of amine-terminated fifth-generation poly(amidoamine) (PAMAM) dendrimers modified by diatrizoic acid (G5.NH(2)-DTA) as stabilizers for enhanced computed tomography (CT) imaging applications. In this study, by simply mixing G5.NH(2)-DTA dendrimers with gold salt in aqueous solution at room temperature, dendrimer-entrapped gold nanoparticles (Au DENPs) with a mean core size of 2.5 nm were able to be spontaneously formed. Followed by an acetylation reaction to neutralize the dendrimer remaining terminal amines, Au DSNPs with a mean size of 6 nm were formed. The formed DTA-containing [(Au(0))(50)-G5.NHAc-DTA] DSNPs were characterized via different techniques. We show that the Au DSNPs are colloid stable in aqueous solution under different pH and temperature conditions. In vitro hemolytic assay, cytotoxicity assay, flow cytometry analysis, and cell morphology observation reveal that the formed Au DSNPs have good hemocompatibility and are non-cytotoxic at a concentration up to 3.0 μM. X-ray absorption coefficient measurements show that the DTA-containing Au DSNPs have enhanced attenuation intensity, much higher than that of [(Au(0))(50)-G5.NHAc] DENPs without DTA or Omnipaque at the same molar concentration of the active element (Au or iodine). The formed DTA-containing Au DSNPs can be used for CT imaging of cancer cells in vitro as well as for blood pool CT imaging of mice in vivo with significantly improved signal enhancement. With the two radiodense elements of Au and iodine incorporated within one particle, the formed DTA-containing Au DSNPs may be applicable for CT imaging of various biological systems with enhanced X-ray attenuation property and detection sensitivity.

  20. Studies of Plasma Instabilities Excited by Ground-Based High Power HF (Heating) Facilities and of X and Gamma Ray Emission in Runaway Breakdown Processes

    Science.gov (United States)

    2006-08-01

    latitude (HAARP, TROMSO) and mid-latitude (SURA) facilities [1]. The very strong and fully reproducible plasma perturbations in the ionosphere are observed ... beam propagating along the magnetic field (θ = 0), in which case the factor κs ≈ 1. As an example we now consider the HAARP facility. The ERP for HAARP ... as a function of frequency f0 is presented in Table 1 of the report, which lists ERP as a function of wave frequency for HAARP (2001).

  1. The method of matching computed images with a reference standard as a method for identification of moving ground objects

    Directory of Open Access Journals (Sweden)

    B. V. Kazbekov

    2014-01-01

    Full Text Available The article focuses on the identification of moving ground targets on board an unmanned aerial vehicle. The feasibility of a real-time identification algorithm that compares the image of the object under consideration with a set of reference images of the candidate object classes is considered. The merits of the developed modification and the results of the experiments are given.
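
    As an illustrative aside, the record above describes matching an observed image against a set of reference (template) images. The following is a minimal sketch of that general idea using normalized cross-correlation in plain NumPy; the class labels and the 64x64 image size are hypothetical and are not taken from the article.

        import numpy as np

        def ncc(image, template):
            """Normalized cross-correlation score between two equally sized arrays."""
            a = image - image.mean()
            b = template - template.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            return float((a * b).sum() / denom) if denom > 0 else 0.0

        def identify(observed, references):
            """Return the class whose reference image best matches the observation."""
            scores = {label: ncc(observed, tmpl) for label, tmpl in references.items()}
            return max(scores, key=scores.get), scores

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            refs = {"truck": rng.random((64, 64)), "tank": rng.random((64, 64))}
            noisy_truck = refs["truck"] + 0.1 * rng.standard_normal((64, 64))
            label, scores = identify(noisy_truck, refs)
            print(label, {k: round(v, 3) for k, v in scores.items()})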

  2. Functional requirements document for the Earth Observing System Data and Information System (EOSDIS) Scientific Computing Facilities (SCF) of the NASA/MSFC Earth Science and Applications Division, 1992

    Science.gov (United States)

    Botts, Michael E.; Phillips, Ron J.; Parker, John V.; Wright, Patrick D.

    1992-01-01

    Five scientists at MSFC/ESAD have EOS SCF investigator status. Each SCF has unique tasks which require the establishment of a computing facility dedicated to accomplishing those tasks. A SCF Working Group was established at ESAD with the charter of defining the computing requirements of the individual SCFs and recommending options for meeting these requirements. The primary goal of the working group was to determine which computing needs can be satisfied using either shared resources or separate but compatible resources, and which needs require unique individual resources. The requirements investigated included CPU-intensive vector and scalar processing, visualization, data storage, connectivity, and I/O peripherals. A review of computer industry directions and a market survey of computing hardware provided information regarding important industry standards and candidate computing platforms. It was determined that the total SCF computing requirements might be most effectively met using a hierarchy consisting of shared and individual resources. This hierarchy is composed of five major system types: (1) a supercomputer class vector processor; (2) a high-end scalar multiprocessor workstation; (3) a file server; (4) a few medium- to high-end visualization workstations; and (5) several low- to medium-range personal graphics workstations. Specific recommendations for meeting the needs of each of these types are presented.

  3. Aircraft Test & Evaluation Facility (Hush House)

    Data.gov (United States)

    Federal Laboratory Consortium — The Aircraft Test and Evaluation Facility (ATEF), or Hush House, is a noise-abated ground test sub-facility. The facility's controlled environment provides 24-hour...

  4. Robotics Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This 60 feet x 100 feet structure on the grounds of the Fort Indiantown Gap Pennsylvania National Guard (PNG) Base is a mixed-use facility comprising office space,...

  6. Resource Conservation and Recovery Act ground-water monitoring projects for Hanford facilities: Progress report for the period October 1 to December 31, 1989

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.M.; Bates, D.J.; Lundgren, R.E. (eds.)

    1990-03-01

    This is Volume 1 of a two-volume document that describes the progress of 15 Hanford Site ground-water monitoring projects for the period October 1 to December 31, 1989. This volume discusses the projects. The work described in this document is conducted by the Pacific Northwest Laboratory under the management of Westinghouse Hanford Company for the US Department of Energy. Concentrations of ground-water constituents are compared to federal drinking water standards throughout this document for reference purposes. All drinking water supplied from the sampled aquifer meets regulatory standards for drinking water quality. 51 refs., 35 figs., 86 tabs.

  7. Design concepts for the Cherenkov Telescope Array CTA: an advanced facility for ground-based high-energy gamma-ray astronomy

    OpenAIRE

    Actis, M.; Agnetta, G.; Aharonian, F.; Akhperjanian, A.; Aleksić, J.; Aliu, E.; Allan, D.; Allekotte, I.; Antico, F.; Antonelli, L.A.; Antoranz, P.; Aravantinos, A.; Arlen, T.; Arnaldi, H.; Artmann, S

    2011-01-01

    Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes. Ground-based gamma-ray astronomy has a huge potential in astrophysics, particle physics and cosmology. CTA is an international initiative to build the next generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV to 10 TeV range and the extension to energies well below 100 GeV and above 100 TeV. CTA will con...

  8. Shieldings for X-ray radiotherapy facilities calculated by computer; Blindagens para instalacoes de radioterapia por raios-X calculada por computador

    Energy Technology Data Exchange (ETDEWEB)

    Pedrosa, Paulo S.; Farias, Marcos S. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Gavazza, Sergio [Universidade Gama Filho (UGF), Rio de Janeiro, RJ (Brazil)

    2005-07-01

    This work presents a computer-aided methodology for calculating X-ray shielding in radiotherapy facilities. In Brazil, shielding calculations for X-ray radiotherapy are still based on the NCRP-49 recommendation, which establishes the methodology required for preparing a shielding design. For high energies, where a labyrinth (maze) must be built, NCRP-49 is not very clear; studies in this area resulted in an article that proposes a solution to the problem. A user-friendly program was developed in the Delphi programming language that, from manual entry of a basic architectural layout and a few parameters, interprets the geometry and calculates the shielding of the walls, ceiling and floor of an X-ray radiotherapy facility. As its final product, the program provides a graphical screen containing all the input data, the calculated shielding and the calculation memory. The program can be applied to practical shielding projects for radiotherapy facilities and can also be used didactically alongside NCRP-49.
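
    For a primary barrier, an NCRP-49-style calculation of the kind the record describes reduces to computing a required transmission factor B from the design limit, workload, use and occupancy factors, and then converting B into a thickness through tenth-value layers. The sketch below only illustrates that arithmetic; the numerical values (design limit, workload, TVL of concrete) are placeholder assumptions, not data from the paper or from NCRP-49.

        import math

        def primary_barrier_thickness(P, d, W, U, T, tvl_cm):
            """B = P*d^2 / (W*U*T); thickness = n * TVL with n = log10(1/B)."""
            B = P * d * d / (W * U * T)
            n = max(math.log10(1.0 / B), 0.0)   # number of tenth-value layers needed
            return B, n * tvl_cm

        if __name__ == "__main__":
            # Placeholder numbers for illustration only.
            B, t = primary_barrier_thickness(
                P=0.02,       # design limit at the occupied point, mGy/week
                d=3.0,        # distance from source to the occupied point, m
                W=1000.0,     # workload, mGy/week at 1 m
                U=0.25,       # use factor for this wall
                T=1.0,        # occupancy factor
                tvl_cm=30.0,  # assumed tenth-value layer of concrete, cm
            )
            print(f"required transmission B = {B:.2e}, barrier = {t:.0f} cm of concrete")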

  9. Prediction of peak ground acceleration of Iran’s tectonic regions using a hybrid soft computing technique

    Institute of Scientific and Technical Information of China (English)

    Mostafa Gandomi; Mohsen Soltanpour; Mohammad R. Zolfaghari; Amir H. Gandomi

    2016-01-01

    A new model is derived to predict the peak ground acceleration (PGA) utilizing a hybrid method coupling artificial neural network (ANN) and simulated annealing (SA), called SA-ANN. The proposed model relates PGA to earthquake source-to-site distance, earthquake magnitude, average shear-wave velocity, faulting mechanisms, and focal depth. A database of strong ground-motion recordings of 36 earthquakes, which happened in Iran's tectonic regions, is used to establish the model. For more validity verification, the SA-ANN model is employed to predict the PGA of a part of the database beyond the training data domain. The proposed SA-ANN model is compared with the simple ANN in addition to 10 well-known models proposed in the literature. The proposed model performance is superior to the single ANN and other existing attenuation models. The SA-ANN model is highly correlated to the actual records (R = 0.835 and r = 0.0908) and it is subsequently converted into a tractable design equation.
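
    As a toy sketch of the coupled simulated-annealing/neural-network idea, the snippet below trains a tiny one-hidden-layer network on synthetic data by annealing its weights. It is not the authors' SA-ANN model: the network size, cooling schedule and synthetic predictors are assumptions made purely for illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        HIDDEN = 6

        # Synthetic stand-ins for the predictors named in the abstract
        # (distance, magnitude, shear-wave velocity, mechanism, depth) and for log-PGA.
        X = rng.random((200, 5))
        y = 0.8 * X[:, 1] - 0.5 * X[:, 0] + 0.1 * rng.standard_normal(200)

        def predict(params, X):
            """Tiny network: 5 inputs -> HIDDEN tanh units -> 1 output."""
            W1 = params[: 5 * HIDDEN].reshape(5, HIDDEN)
            b1 = params[5 * HIDDEN : 6 * HIDDEN]
            W2 = params[6 * HIDDEN : 7 * HIDDEN]
            b2 = params[-1]
            return np.tanh(X @ W1 + b1) @ W2 + b2

        def mse(params):
            return float(np.mean((predict(params, X) - y) ** 2))

        # Simulated annealing over the flattened network weights.
        current = 0.1 * rng.standard_normal(7 * HIDDEN + 1)
        current_err = mse(current)
        best, best_err = current.copy(), current_err
        temperature = 1.0
        for step in range(5000):
            candidate = current + 0.05 * rng.standard_normal(current.shape)
            err = mse(candidate)
            # Metropolis acceptance: always keep improvements, sometimes accept worse moves.
            if err < current_err or rng.random() < np.exp((current_err - err) / temperature):
                current, current_err = candidate, err
                if err < best_err:
                    best, best_err = candidate.copy(), err
            temperature *= 0.999  # geometric cooling schedule

        print(f"best training MSE found: {best_err:.4f}")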

  10. Modeling Potential Carbon Monoxide Exposure Due to Operation of a Major Rocket Engine Altitude Test Facility Using Computational Fluid Dynamics

    Science.gov (United States)

    Blotzer, Michael J.; Woods, Jody L.

    2009-01-01

    This viewgraph presentation reviews computational fluid dynamics as a tool for modelling the dispersion of carbon monoxide at the Stennis Space Center's A3 Test Stand. The contents include: 1) Constellation Program; 2) Constellation Launch Vehicles; 3) J2X Engine; 4) A-3 Test Stand; 5) Chemical Steam Generators; 6) Emission Estimates; 7) Located in Existing Test Complex; 8) Computational Fluid Dynamics; 9) Computational Tools; 10) CO Modeling; 11) CO Model results; and 12) Next steps.

  11. Automated Computer-Based Facility for Measurement of Near-Field Structure of Microwave Radiators and Scatterers

    DEFF Research Database (Denmark)

    Mishra, Shantnu R.; Pavlasek, Tomas J. F.; Muresan, Letitia V.

    1980-01-01

    An automatic facility for measuring the three-dimensional structure of the near fields of microwave radiators and scatterers is described. The amplitude and phase for different polarization components can be recorded in analog and digital form using a microprocessor-based system. The stored data...

  12. Design concepts for the Cherenkov Telescope Array CTA: an advanced facility for ground-based high-energy gamma-ray astronomy

    Science.gov (United States)

    Actis, M.; Agnetta, G.; Aharonian, F.; Akhperjanian, A.; Aleksić, J.; Aliu, E.; Allan, D.; Allekotte, I.; Antico, F.; Antonelli, L. A.; Antoranz, P.; Aravantinos, A.; Arlen, T.; Arnaldi, H.; Artmann, S.; Asano, K.; Asorey, H.; Bähr, J.; Bais, A.; Baixeras, C.; Bajtlik, S.; Balis, D.; Bamba, A.; Barbier, C.; Barceló, M.; Barnacka, A.; Barnstedt, J.; Barres de Almeida, U.; Barrio, J. A.; Basso, S.; Bastieri, D.; Bauer, C.; Becerra, J.; Becherini, Y.; Bechtol, K.; Becker, J.; Beckmann, V.; Bednarek, W.; Behera, B.; Beilicke, M.; Belluso, M.; Benallou, M.; Benbow, W.; Berdugo, J.; Berger, K.; Bernardino, T.; Bernlöhr, K.; Biland, A.; Billotta, S.; Bird, T.; Birsin, E.; Bissaldi, E.; Blake, S.; Blanch, O.; Bobkov, A. A.; Bogacz, L.; Bogdan, M.; Boisson, C.; Boix, J.; Bolmont, J.; Bonanno, G.; Bonardi, A.; Bonev, T.; Borkowski, J.; Botner, O.; Bottani, A.; Bourgeat, M.; Boutonnet, C.; Bouvier, A.; Brau-Nogué, S.; Braun, I.; Bretz, T.; Briggs, M. S.; Brun, P.; Brunetti, L.; Buckley, J. H.; Bugaev, V.; Bühler, R.; Bulik, T.; Busetto, G.; Buson, S.; Byrum, K.; Cailles, M.; Cameron, R.; Canestrari, R.; Cantu, S.; Carmona, E.; Carosi, A.; Carr, J.; Carton, P. H.; Casiraghi, M.; Castarede, H.; Catalano, O.; Cavazzani, S.; Cazaux, S.; Cerruti, B.; Cerruti, M.; Chadwick, P. M.; Chiang, J.; Chikawa, M.; Cieślar, M.; Ciesielska, M.; Cillis, A.; Clerc, C.; Colin, P.; Colomé, J.; Compin, M.; Conconi, P.; Connaughton, V.; Conrad, J.; Contreras, J. L.; Coppi, P.; Corlier, M.; Corona, P.; Corpace, O.; Corti, D.; Cortina, J.; Costantini, H.; Cotter, G.; Courty, B.; Couturier, S.; Covino, S.; Croston, J.; Cusumano, G.; Daniel, M. K.; Dazzi, F.; Angelis, A. De; de Cea Del Pozo, E.; de Gouveia Dal Pino, E. M.; de Jager, O.; de La Calle Pérez, I.; de La Vega, G.; de Lotto, B.; de Naurois, M.; de Oña Wilhelmi, E.; de Souza, V.; Decerprit, B.; Deil, C.; Delagnes, E.; Deleglise, G.; Delgado, C.; Dettlaff, T.; di Paolo, A.; di Pierro, F.; Díaz, C.; Dick, J.; Dickinson, H.; Digel, S. W.; Dimitrov, D.; Disset, G.; Djannati-Ataï, A.; Doert, M.; Domainko, W.; Dorner, D.; Doro, M.; Dournaux, J.-L.; Dravins, D.; Drury, L.; Dubois, F.; Dubois, R.; Dubus, G.; Dufour, C.; Durand, D.; Dyks, J.; Dyrda, M.; Edy, E.; Egberts, K.; Eleftheriadis, C.; Elles, S.; Emmanoulopoulos, D.; Enomoto, R.; Ernenwein, J.-P.; Errando, M.; Etchegoyen, A.; Falcone, A. D.; Farakos, K.; Farnier, C.; Federici, S.; Feinstein, F.; Ferenc, D.; Fillin-Martino, E.; Fink, D.; Finley, C.; Finley, J. P.; Firpo, R.; Florin, D.; Föhr, C.; Fokitis, E.; Font, Ll.; Fontaine, G.; Fontana, A.; Förster, A.; Fortson, L.; Fouque, N.; Fransson, C.; Fraser, G. W.; Fresnillo, L.; Fruck, C.; Fujita, Y.; Fukazawa, Y.; Funk, S.; Gäbele, W.; Gabici, S.; Gadola, A.; Galante, N.; Gallant, Y.; García, B.; García López, R. J.; Garrido, D.; Garrido, L.; Gascón, D.; Gasq, C.; Gaug, M.; Gaweda, J.; Geffroy, N.; Ghag, C.; Ghedina, A.; Ghigo, M.; Gianakaki, E.; Giarrusso, S.; Giavitto, G.; Giebels, B.; Giro, E.; Giubilato, P.; Glanzman, T.; Glicenstein, J.-F.; Gochna, M.; Golev, V.; Gómez Berisso, M.; González, A.; González, F.; Grañena, F.; Graciani, R.; Granot, J.; Gredig, R.; Green, A.; Greenshaw, T.; Grimm, O.; Grube, J.; Grudzińska, M.; Grygorczuk, J.; Guarino, V.; Guglielmi, L.; Guilloux, F.; Gunji, S.; Gyuk, G.; Hadasch, D.; Haefner, D.; Hagiwara, R.; Hahn, J.; Hallgren, A.; Hara, S.; Hardcastle, M. J.; Hassan, T.; Haubold, T.; Hauser, M.; Hayashida, M.; Heller, R.; Henri, G.; Hermann, G.; Herrero, A.; Hinton, J. 
A.; Hoffmann, D.; Hofmann, W.; Hofverberg, P.; Horns, D.; Hrupec, D.; Huan, H.; Huber, B.; Huet, J.-M.; Hughes, G.; Hultquist, K.; Humensky, T. B.; Huppert, J.-F.; Ibarra, A.; Illa, J. M.; Ingjald, J.; Inoue, Y.; Inoue, S.; Ioka, K.; Jablonski, C.; Jacholkowska, A.; Janiak, M.; Jean, P.; Jensen, H.; Jogler, T.; Jung, I.; Kaaret, P.; Kabuki, S.; Kakuwa, J.; Kalkuhl, C.; Kankanyan, R.; Kapala, M.; Karastergiou, A.; Karczewski, M.; Karkar, S.; Karlsson, N.; Kasperek, J.; Katagiri, H.; Katarzyński, K.; Kawanaka, N.; Kȩdziora, B.; Kendziorra, E.; Khélifi, B.; Kieda, D.; Kifune, T.; Kihm, T.; Klepser, S.; Kluźniak, W.; Knapp, J.; Knappy, A. R.; Kneiske, T.; Knödlseder, J.; Köck, F.; Kodani, K.; Kohri, K.; Kokkotas, K.; Komin, N.; Konopelko, A.; Kosack, K.; Kossakowski, R.; Kostka, P.; Kotuła, J.; Kowal, G.; Kozioł, J.; Krähenbühl, T.; Krause, J.; Krawczynski, H.; Krennrich, F.; Kretzschmann, A.; Kubo, H.; Kudryavtsev, V. A.; Kushida, J.; La Barbera, N.; La Parola, V.; La Rosa, G.; López, A.; Lamanna, G.; Laporte, P.; Lavalley, C.; Le Flour, T.; Le Padellec, A.; Lenain, J.-P.; Lessio, L.; Lieunard, B.; Lindfors, E.; Liolios, A.; Lohse, T.; Lombardi, S.; Lopatin, A.; Lorenz, E.; Lubiński, P.; Luz, O.; Lyard, E.; Maccarone, M. C.; Maccarone, T.; Maier, G.; Majumdar, P.; Maltezos, S.; Małkiewicz, P.; Mañá, C.; Manalaysay, A.; Maneva, G.; Mangano, A.; Manigot, P.; Marín, J.; Mariotti, M.; Markoff, S.; Martínez, G.; Martínez, M.; Mastichiadis, A.; Matsumoto, H.; Mattiazzo, S.; Mazin, D.; McComb, T. J. L.; McCubbin, N.; McHardy, I.; Medina, C.; Melkumyan, D.; Mendes, A.; Mertsch, P.; Meucci, M.; Michałowski, J.; Micolon, P.; Mineo, T.; Mirabal, N.; Mirabel, F.; Miranda, J. M.; Mirzoyan, R.; Mizuno, T.; Moal, B.; Moderski, R.; Molinari, E.; Monteiro, I.; Moralejo, A.; Morello, C.; Mori, K.; Motta, G.; Mottez, F.; Moulin, E.; Mukherjee, R.; Munar, P.; Muraishi, H.; Murase, K.; Murphy, A. Stj.; Nagataki, S.; Naito, T.; Nakamori, T.; Nakayama, K.; Naumann, C.; Naumann, D.; Nayman, P.; Nedbal, D.; Niedźwiecki, A.; Niemiec, J.; Nikolaidis, A.; Nishijima, K.; Nolan, S. J.; Nowak, N.; O'Brien, P. T.; Ochoa, I.; Ohira, Y.; Ohishi, M.; Ohka, H.; Okumura, A.; Olivetto, C.; Ong, R. A.; Orito, R.; Orr, M.; Osborne, J. P.; Ostrowski, M.; Otero, L.; Otte, A. N.; Ovcharov, E.; Oya, I.; Oziȩbło, A.; Paiano, S.; Pallota, J.; Panazol, J. L.; Paneque, D.; Panter, M.; Paoletti, R.; Papyan, G.; Paredes, J. M.; Pareschi, G.; Parsons, R. D.; Paz Arribas, M.; Pedaletti, G.; Pepato, A.; Persic, M.; Petrucci, P. O.; Peyaud, B.; Piechocki, W.; Pita, S.; Pivato, G.; Płatos, Ł.; Platzer, R.; Pogosyan, L.; Pohl, M.; Pojmański, G.; Ponz, J. D.; Potter, W.; Prandini, E.; Preece, R.; Prokoph, H.; Pühlhofer, G.; Punch, M.; Quel, E.; Quirrenbach, A.; Rajda, P.; Rando, R.; Rataj, M.; Raue, M.; Reimann, C.; Reimann, O.; Reimer, A.; Reimer, O.; Renaud, M.; Renner, S.; Reymond, J.-M.; Rhode, W.; Ribó, M.; Ribordy, M.; Rico, J.; Rieger, F.; Ringegni, P.; Ripken, J.; Ristori, P.; Rivoire, S.; Rob, L.; Rodriguez, S.; Roeser, U.; Romano, P.; Romero, G. E.; Rosier-Lees, S.; Rovero, A. C.; Roy, F.; Royer, S.; Rudak, B.; Rulten, C. B.; Ruppel, J.; Russo, F.; Ryde, F.; Sacco, B.; Saggion, A.; Sahakian, V.; Saito, K.; Saito, T.; Sakaki, N.; Salazar, E.; Salini, A.; Sánchez, F.; Sánchez Conde, M. Á.; Santangelo, A.; Santos, E. 
M.; Sanuy, A.; Sapozhnikov, L.; Sarkar, S.; Scalzotto, V.; Scapin, V.; Scarcioffolo, M.; Schanz, T.; Schlenstedt, S.; Schlickeiser, R.; Schmidt, T.; Schmoll, J.; Schroedter, M.; Schultz, C.; Schultze, J.; Schulz, A.; Schwanke, U.; Schwarzburg, S.; Schweizer, T.; Seiradakis, J.; Selmane, S.; Seweryn, K.; Shayduk, M.; Shellard, R. C.; Shibata, T.; Sikora, M.; Silk, J.; Sillanpää, A.; Sitarek, J.; Skole, C.; Smith, N.; Sobczyńska, D.; Sofo Haro, M.; Sol, H.; Spanier, F.; Spiga, D.; Spyrou, S.; Stamatescu, V.; Stamerra, A.; Starling, R. L. C.; Stawarz, Ł.; Steenkamp, R.; Stegmann, C.; Steiner, S.; Stergioulas, N.; Sternberger, R.; Stinzing, F.; Stodulski, M.; Straumann, U.; Suárez, A.; Suchenek, M.; Sugawara, R.; Sulanke, K. H.; Sun, S.; Supanitsky, A. D.; Sutcliffe, P.; Szanecki, M.; Szepieniec, T.; Szostek, A.; Szymkowiak, A.; Tagliaferri, G.; Tajima, H.; Takahashi, H.; Takahashi, K.; Takalo, L.; Takami, H.; Talbot, R. G.; Tam, P. H.; Tanaka, M.; Tanimori, T.; Tavani, M.; Tavernet, J.-P.; Tchernin, C.; Tejedor, L. A.; Telezhinsky, I.; Temnikov, P.; Tenzer, C.; Terada, Y.; Terrier, R.; Teshima, M.; Testa, V.; Tibaldo, L.; Tibolla, O.; Tluczykont, M.; Todero Peixoto, C. J.; Tokanai, F.; Tokarz, M.; Toma, K.; Torres, D. F.; Tosti, G.; Totani, T.; Toussenel, F.; Vallania, P.; Vallejo, G.; van der Walt, J.; van Eldik, C.; Vandenbroucke, J.; Vankov, H.; Vasileiadis, G.; Vassiliev, V. V.; Vegas, I.; Venter, L.; Vercellone, S.; Veyssiere, C.; Vialle, J. P.; Videla, M.; Vincent, P.; Vink, J.; Vlahakis, N.; Vlahos, L.; Vogler, P.; Vollhardt, A.; Volpe, F.; von Gunten, H. P.; Vorobiov, S.; Wagner, S.; Wagner, R. M.; Wagner, B.; Wakely, S. P.; Walter, P.; Walter, R.; Warwick, R.; Wawer, P.; Wawrzaszek, R.; Webb, N.; Wegner, P.; Weinstein, A.; Weitzel, Q.; Welsing, R.; Wetteskind, H.; White, R.; Wierzcholska, A.; Wilkinson, M. I.; Williams, D. A.; Winde, M.; Wischnewski, R.; Wiśniewski, Ł.; Wolczko, A.; Wood, M.; Xiong, Q.; Yamamoto, T.; Yamaoka, K.; Yamazaki, R.; Yanagita, S.; Yoffo, B.; Yonetani, M.; Yoshida, A.; Yoshida, T.; Yoshikoshi, T.; Zabalza, V.; Zagdański, A.; Zajczyk, A.; Zdziarski, A.; Zech, A.; Ziȩtara, K.; Ziółkowski, P.; Zitelli, V.; Zychowski, P.

    2011-12-01

    Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes. Ground-based gamma-ray astronomy has a huge potential in astrophysics, particle physics and cosmology. CTA is an international initiative to build the next generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV-10 TeV range and the extension to energies well below 100 GeV and above 100 TeV. CTA will consist of two arrays (one in the north, one in the south) for full sky coverage and will be operated as open observatory. The design of CTA is based on currently available technology. This document reports on the status and presents the major design concepts of CTA.

  13. Resource Conservation and Recovery Act ground-water monitoring projects for Hanford Facilities: Progress report for the period July 1 to September 30, 1989 - Volume 1 - Text

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.M.; Bates, D.J.; Lundgren, R.E.

    1989-12-01

    This is Volume 1 of a two-volume document that describes the progress of 14 Hanford Site ground-water monitoring projects for the period July 1 to September 30, 1989. This volume discusses the projects; Volume 2 provides as-built diagrams, completion/inspection reports, drilling logs, and geophysical logs for wells drilled, completed, or logged during this period. Volume 2 can be found on microfiche in the back pocket of Volume 1. The work described in this document is conducted by the Pacific Northwest Laboratory under the management of Westinghouse Hanford Company for the US Department of Energy. Concentrations of ground-water constituents are compared to federal drinking water standards throughout this document for reference purposes. All drinking water supplied from the sampled aquifer meets regulatory standards for drinking water quality.

  14. Design Concepts for the Cherenkov Telescope Array CTA: An Advanced Facility for Ground-Based High-Energy Gamma-Ray Astronomy

    Energy Technology Data Exchange (ETDEWEB)

    Actis, M

    2012-04-17

    Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes. Ground-based gamma-ray astronomy has a huge potential in astrophysics, particle physics and cosmology. CTA is an international initiative to build the next generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV-10 TeV range and the extension to energies well below 100 GeV and above 100 TeV. CTA will consist of two arrays (one in the north, one in the south) for full sky coverage and will be operated as open observatory. The design of CTA is based on currently available technology. This document reports on the status and presents the major design concepts of CTA.

  15. Ground-water monitoring compliance projects for Hanford Site Facilities: Progress report for the period April 1--June 30, 1988: Volume 1, Text

    Energy Technology Data Exchange (ETDEWEB)

    1988-09-01

    This is Volume 1 of a two-volume set of documents that describes the progress of 10 Hanford Site ground-water monitoring projects for the period April 1 to June 30, 1988. This volume discusses the projects; Volume 2 provides as-built diagrams, drilling logs, and geophysical logs for wells drilled during this period in the 100-N Area and near the 216-A-36B Crib.

  16. Atmospheric transfer of radiation above an inhomogeneous non-Lambertian reflective ground. II - Computational considerations and results

    Science.gov (United States)

    Diner, D. J.; Martonchik, J. V.

    1984-10-01

    The theoretical foundation for solution of the three-dimensional radiative transfer problem described in the preceding paper is reviewed. Practical considerations involved in implementing the Fourier transform/Gauss-Seidel method on a minicomputer are discussed, along with derivations of symmetry relations and approximations which can be used to enhance the computational efficiency. Model results for a surface whose albedo varies as a step function are presented and compared with published solutions obtained by using the Monte Carlo method.
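
    The Gauss-Seidel part of the Fourier transform/Gauss-Seidel method mentioned above is, at its core, the standard sweep in which each unknown is updated in place using the most recent values of the others. A generic sketch for a linear system Ax = b (not the radiative-transfer equations of the paper) follows.

        import numpy as np

        def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=500):
            """Solve Ax = b by in-place Gauss-Seidel sweeps (A assumed diagonally dominant)."""
            n = len(b)
            x = np.zeros(n) if x0 is None else x0.astype(float).copy()
            for _ in range(max_iter):
                x_old = x.copy()
                for i in range(n):
                    # Use already-updated values for j < i and previous values for j > i.
                    sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
                    x[i] = (b[i] - sigma) / A[i, i]
                if np.linalg.norm(x - x_old, ord=np.inf) < tol:
                    break
            return x

        if __name__ == "__main__":
            A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
            b = np.array([15.0, 10.0, 10.0])
            print(gauss_seidel(A, b))   # should agree with np.linalg.solve(A, b)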

  17. Validation of Advanced Computer Codes for VVER Technology: LB-LOCA Transient in PSB-VVER Facility

    Directory of Open Access Journals (Sweden)

    A. Del Nevo

    2012-01-01

    Full Text Available The OECD/NEA PSB-VVER project provided unique and useful experimental data for code validation from the PSB-VVER test facility. This facility represents the scaled-down layout of the Russian-designed pressurized water reactor, namely, VVER-1000. Five experiments were executed, dealing with loss of coolant scenarios (small, intermediate, and large break loss of coolant accidents), a primary-to-secondary leak, and a parametric study (natural circulation test) aimed at characterizing the VVER system at reduced mass inventory conditions. The comparative analysis presented in the paper concerns the large break loss of coolant accident experiment. Four participants from three different institutions were involved in the benchmark and applied their own models and set-ups for four different thermal-hydraulic system codes. The benchmark demonstrated the performance of such codes in predicting phenomena relevant for safety on the basis of fixed criteria.

  18. A Computer-Aided Instruction Program for Teaching the TOPS20-MM Facility on the DDN (Defense Data Network)

    Science.gov (United States)

    1988-06-01

    provide learning-by-doing, which means a student acquires knowledge by solving real-world problems [Ref. 21]. To tutor well, a system has to specify...apply artificial intelligence (AI) techniques. AI research such as natural-language understanding, knowledge representation, and inferencing has been...programming language by providing a programming environment and online help facility. It has a tutoring module called Soft Tutor that examines the

  19. Ground-water monitoring compliance projects for Hanford Site facilities: Progress report for the period April 1 to June 30, 1988: Volume 2, Appendices

    Energy Technology Data Exchange (ETDEWEB)

    1988-09-01

    This is Volume 2 of a two-volume set of documents that describes the progress of 10 Hanford Site ground-water monitoring projects for the period April 1 to June 30, 1988. This volume discusses as-built diagrams, drilling logs, and geophysical logs for wells drilled during this period in the 100-N Area (Appendix A) and near the 216-A-36B Crib (Appendix B). Volume 1 discusses the 10 projects. This work was supported by the US Department of Energy under Contract AC06-76RL01830.

  20. Ground-state properties of the retinal molecule: from quantum mechanical to classical mechanical computations of retinal proteins

    Energy Technology Data Exchange (ETDEWEB)

    Suhai, Sandor [German Cancer Research Center, Heidelberg

    2011-01-01

    Retinal proteins are excellent systems for understanding essential physiological processes such as signal transduction and ion pumping. Although the conjugated polyene system of the retinal chromophore is best described with quantum mechanics, simulations of the long-timescale dynamics of a retinal protein in its physiological, flexible, lipid-membrane environment can only be performed at the classical mechanical level. Torsional energy barriers are a critical ingredient of the classical force-field parameters. Here we review briefly current retinal force fields and discuss new quantum mechanical computations to assess how the retinal Schiff base model and the approach used to derive the force-field parameters may influence the torsional potentials.
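
    The torsional terms discussed above are typically represented in classical force fields by a cosine series over the dihedral angle, and the barrier heights are exactly the parameters that quantum-mechanical torsional scans are used to fit. A generic sketch of such a potential is shown below; the coefficients are arbitrary example values, not retinal force-field parameters.

        import math

        def torsion_energy(phi_deg, terms):
            """V(phi) = sum_n k_n * (1 + cos(n*phi - delta_n))."""
            phi = math.radians(phi_deg)
            return sum(k * (1.0 + math.cos(n * phi - math.radians(delta)))
                       for k, n, delta in terms)

        if __name__ == "__main__":
            # (k_n [kcal/mol], multiplicity n, phase delta [deg]) -- illustrative values only.
            example_terms = [(2.0, 1, 0.0), (1.5, 2, 180.0), (0.4, 3, 0.0)]
            for phi in (0, 60, 90, 120, 180):
                print(phi, round(torsion_energy(phi, example_terms), 3))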

  1. Computational and Experimental Characterization of the Mach 6 Facility Nozzle Flow for the Enhanced Injection and Mixing Project at NASA Langley Research Center

    Science.gov (United States)

    Drozda, Tomasz G.; Cabell, Karen F.; Passe, Bradley J.; Baurle, Robert A.

    2017-01-01

    Computational fluid dynamics analyses and experimental data are presented for the Mach 6 facility nozzle used in the Arc-Heated Scramjet Test Facility for the Enhanced Injection and Mixing Project (EIMP). This project, conducted at the NASA Langley Research Center, aims to investigate supersonic combustion ramjet (scramjet) fuel injection and mixing physics relevant to flight Mach numbers greater than 8. The EIMP experiments use a two-dimensional Mach 6 facility nozzle to provide the high-speed air simulating the combustor entrance flow of a scramjet engine. Of interest are the physical extent and the thermodynamic properties of the core flow at the nozzle exit plane. The detailed characterization of this flow is obtained from three-dimensional, viscous, Reynolds-averaged simulations. Thermodynamic nonequilibrium effects are also investigated. The simulations are compared with the available experimental data, which includes wall static pressures as well as in-stream static pressure, pitot pressure and total temperature obtained via in-stream probes positioned just downstream of the nozzle exit plane.
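
    For rough orientation on what the thermodynamic state of the core flow at a Mach 6 nozzle exit looks like, the ideal-gas isentropic relations below give the static-to-total temperature and pressure ratios. This back-of-the-envelope estimate deliberately ignores the viscous, three-dimensional and thermodynamic-nonequilibrium effects that the simulations in the paper are there to capture, and the value gamma = 1.4 is an assumption.

        def isentropic_ratios(M, gamma=1.4):
            """Static-to-total temperature and pressure ratios for isentropic ideal-gas flow."""
            T_ratio = 1.0 / (1.0 + 0.5 * (gamma - 1.0) * M * M)
            p_ratio = T_ratio ** (gamma / (gamma - 1.0))
            return T_ratio, p_ratio

        if __name__ == "__main__":
            T_ratio, p_ratio = isentropic_ratios(6.0)
            print(f"T/T0 = {T_ratio:.4f}, p/p0 = {p_ratio:.3e}")  # about 0.122 and 6.3e-4 at Mach 6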

  2. In Pursuit of Improving Airburst and Ground Damage Predictions: Recent Advances in Multi-Body Aerodynamic Testing and Computational Tools Validation

    Science.gov (United States)

    Venkatapathy, Ethiraj; Gulhan, Ali; Aftosmis, Michael; Brock, Joseph; Mathias, Donovan; Need, Dominic; Rodriguez, David; Seltner, Patrick; Stern, Eric; Wiles, Sebastian

    2017-01-01

    An airburst from a large asteroid during entry can cause significant ground damage. The damage depends on the energy and the altitude of airburst. Breakup of asteroids into fragments and their lateral spread have been observed. Modeling the underlying physics of fragmented bodies interacting at hypersonic speeds and the spread of fragments is needed for a true predictive capability. Current models use heuristic arguments and assumptions such as pancaking or point source explosive energy release at pre-determined altitude or an assumed fragmentation spread rate to predict airburst damage. A multi-year collaboration between German Aerospace Center (DLR) and NASA has been established to develop validated computational tools to address the above challenge.

  3. First record of single event upset on the ground, Cray-1 computer memory at Los Alamos in 1976

    Energy Technology Data Exchange (ETDEWEB)

    Michalak, Sarah E [Los Alamos National Laboratory; Quinn, Heather M [Los Alamos National Laboratory; Grider, Gary A [Los Alamos National Laboratory; Iwanchuk, Paul N [Los Alamos National Laboratory; Morrison, John F [Los Alamos National Laboratory; Wender, Stephen A [Los Alamos National Laboratory; Normand, Eugene [EN ASSOCIATES, LLC; Wert, Jerry L [BOEING RESEARCH AND TEC; Johnson, Steve [CRAY, INC.

    2010-01-01

    Records of bit flips in the Cray-1 computer installed at Los Alamos in 1976 lead to an upset rate in the Cray-1's bipolar SRAMs that correlates with the SEUs being induced by the atmospheric neutrons. In 1976 the Cray Research Company delivered its first supercomputer, the Cray-1, installing it at Los Alamos National Laboratory. Los Alamos had competed with the Lawrence Livermore National Laboratory for the Cray-1 and won, reaching an agreement with Seymour Cray to install the machine for a period of six months for free, after which they could decide whether to buy, lease or return it. As a result, Los Alamos personnel kept track of the computer reliability and performance and so we know that during those six months of operation, 152 memory parity errors were recorded. The computer memory consisted of approximately 70,000 1Kx1 bipolar ECL static RAMs, the Fairchild 10415. What the Los Alamos engineers didn't know is that those bit flips were the result of single event upsets (SEUs) caused by the atmospheric neutrons. Thus, these 152 bit flips were the first recorded SEUs on the earth, and were observed 2 years before the SEUs in the Intel DRAMs that had been found by May and Woods in 1978. The upsets in the DRAMs were shown to have been caused by alpha particles from the chip packaging material. In this paper we will demonstrate that the Cray-1 bit flips, which were found through the use of parity bits in the Cray-1, were likely due to atmospheric neutrons. This paper will follow the same approach as that of the very first paper to demonstrate single event effects, which occurred in satellite flip-flop circuits in 1975. The main difference is that the four events that occurred over the course of 17 satellite years of operation were shown to be due to single event effects just a few years after those satellite anomalies were recorded. In the case of the Cray-1 bit flips, there has been a delay of more than 30 years between the occurrence of the bit
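
    The numbers quoted in the record (152 parity errors over roughly six months in about 70,000 1Kx1 SRAMs) are enough for the back-of-the-envelope upset-rate arithmetic below. The assumption of continuous operation over roughly 4,320 hours is our simplification, and FIT is the usual failures per 10^9 device-hours.

        errors = 152            # recorded memory parity errors
        chips = 70_000          # approximate number of 1K x 1 bipolar SRAMs
        bits_per_chip = 1024
        hours = 6 * 30 * 24     # about six months of operation, assumed continuous

        chip_hours = chips * hours
        bit_hours = chip_hours * bits_per_chip

        fit_per_chip = errors / chip_hours * 1e9   # failures per 1e9 device-hours
        fit_per_bit = errors / bit_hours * 1e9
        mean_time_between_upsets_h = hours / errors

        print(f"~{fit_per_chip:.0f} FIT per chip, ~{fit_per_bit:.2f} FIT per bit")
        print(f"one upset roughly every {mean_time_between_upsets_h:.1f} machine-hours")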

  4. Forward Modeling and validation of a new formulation to compute self-potential signals associated with ground water flow

    Directory of Open Access Journals (Sweden)

    A. Bolève

    2007-10-01

    Full Text Available The classical formulation of the coupled hydroelectrical flow in porous media is based on a linear formulation of two coupled constitutive equations for the electrical current density and the seepage velocity of the water phase and obeying Onsager's reciprocity. This formulation shows that the streaming current density is controlled by the gradient of the fluid pressure of the water phase and a streaming current coupling coefficient that depends on the so-called zeta potential. Recently a new formulation has been introduced in which the streaming current density is directly connected to the seepage velocity of the water phase and to the excess of electrical charge per unit pore volume in the porous material. The advantages of this formulation are numerous. First this new formulation is more intuitive not only in terms of establishing a constitutive equation for the generalized Ohm's law but also in specifying boundary conditions for the influence of the flow field upon the streaming potential. With the new formulation, the streaming potential coupling coefficient shows a decrease of its magnitude with permeability in agreement with published results. The new formulation has been extended in the inertial laminar flow regime and to unsaturated conditions with applications to the vadose zone. This formulation is suitable to model self-potential signals in the field. We investigate infiltration of water from an agricultural ditch, vertical infiltration of water into a sinkhole, and preferential horizontal flow of ground water in a paleochannel. For the three cases reported in the present study, a good match is obtained between finite element simulations performed and field observations. Thus, this formulation could be useful for the inverse mapping of the geometry of groundwater flow from self-potential field measurements.
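
    In the newer formulation summarized above, the source (streaming) current density is simply the excess charge per unit pore volume carried along by the seepage velocity, j_s = Q_v * u, rather than a pressure-gradient term scaled by a zeta-potential-dependent coupling coefficient. A minimal numerical illustration is given below; all values are placeholders, not data from the paper.

        def streaming_current_density(Q_v, u):
            """j_s = Q_v * u: excess charge per pore volume [C/m^3] times seepage velocity [m/s]."""
            return Q_v * u

        def darcy_velocity(k, mu, dp_dx):
            """u = -(k / mu) * dp/dx for single-phase laminar flow."""
            return -(k / mu) * dp_dx

        if __name__ == "__main__":
            k = 1e-12      # permeability, m^2 (placeholder)
            mu = 1e-3      # water viscosity, Pa.s
            dp_dx = -1e4   # pressure gradient, Pa/m (flow in +x direction)
            Q_v = 0.1      # excess charge per unit pore volume, C/m^3 (placeholder)
            u = darcy_velocity(k, mu, dp_dx)
            print(f"u = {u:.2e} m/s, j_s = {streaming_current_density(Q_v, u):.2e} A/m^2")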

  5. Memory reliability of spintronic materials and devices for disaster-resilient computing against radiation-induced bit flips on the ground

    Science.gov (United States)

    Hirose, Kazuyuki; Kobayashi, Daisuke; Ito, Taichi; Endoh, Tetsuo

    2017-08-01

    The memory reliability of magnetic tunnel junctions has been examined from the aspect of their potential use in disaster-resilient computing. This computing technology requires memories that can keep stored information intact even in power-cut emergency situations. Such a requirement has been quantified as a score of acceptable flip probability, which is the failure in time (FIT) rate of 1 for a single-interface perpendicular magnetic tunnel junction (p-MTJ) with a disk diameter of 20 nm. For comparison with this acceptable probability, p-MTJ memory reliability has been evaluated. The risk of particle radiation bombardments, i.e., alpha particles and neutrons — the well-known soft error sources on the ground — has been evaluated from the aspects of both frequency of bombardments and the hazardous effects of bombardments. This study highlights that high-energy terrestrial neutrons may lead to soft errors in p-MTJs, but the flip probability, or the risk, is expected to be lower than 1 × 10⁻⁶ FIT/p-MTJ, which is much smaller than the target probability. It has also been found that the use of p-MTJs can reduce the risk by three orders of magnitude compared with that of the conventional SRAMs. Few risks have been suggested for other radiation particles, such as alpha particles and thermal neutrons.

  6. Facile Formation of Acetic Sulfuric Anhydride in a Supersonic Jet: Characterization by Microwave Spectroscopy and Computational Chemistry

    Science.gov (United States)

    Huff, Anna; Smith, CJ; Mackenzie, Becca; Leopold, Ken

    2017-06-01

    Sulfur trioxide and acetic acid are shown to react under supersonic jet conditions to form acetic sulfuric anhydride, CH3COOSO2OH. Rotational spectra of the parent, 34S, methyl 13C, and fully deuterated isotopologues have been observed by chirped-pulse and conventional cavity microwave spectroscopy. A and E internal rotation states have been observed for each isotopologue studied and the methyl group internal rotation barriers have been determined (241.043(65) cm⁻¹ for the parent species). The reaction is analogous to that of our previous report on the reaction of sulfur trioxide and formic acid. DFT and CCSD calculations are also presented which indicate that the reaction proceeds via a π2 + π2 + σ2 cycloaddition reaction. These results support our previous conjecture that the reaction of SO3 with carboxylic acids is both facile and general. Possible implications for atmospheric aerosol formation are discussed.

  7. Dynamics of large-scale ionospheric inhomogeneities caused by a powerful radio emission of the Sura facility from the data collected onto ground-based GNSS network

    Science.gov (United States)

    Kogogin, D. A.; Nasyrov, I. A.; Grach, S. M.; Shindin, A. V.; Zagretdinov, R. V.

    2017-01-01

    The measurements of variations in the total electron content of the Earth's ionosphere along the GPS satellite signal propagation path are described. The signal parameters were measured at a network of receivers at three distant sites: Sura (Vasilsursk), Zelenodolsk, and Kazan. They are arranged along the geomagnetic latitude of the Sura Facility under short-wave radio irradiation of the ionosphere. One feature of the experiment is that the radio path between a GPS satellite and Vasilsursk crossed the disturbed region, a consequence of the angular size of the Sura array pattern, whereas the radio paths between a GPS satellite and Zelenodolsk and a GPS satellite and Kazan did not. Variations in the total electron content of up to 0.15-0.3 TECU were revealed at all three sites during four experimental campaigns (March 2010, March 2013, May 2013, and November 2013). The lateral scale of an ionospheric disturbance stimulated by a high-power radio wave and the velocity of its west-to-east propagation along the geomagnetic latitude were 30-60 km and 270-350 m/s, respectively. A decrease in the total electron content (down to 0.55 TECU) was recorded along the Kazan-Zelenodolsk-Vasilsursk line, which is connected with the solar terminator transit; the lateral scale of the related ionospheric inhomogeneities was 65-80 km.

  8. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    Energy Technology Data Exchange (ETDEWEB)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables.

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  10. Reliable Facility Location Problem with Facility Protection.

    Science.gov (United States)

    Tang, Luohao; Zhu, Cheng; Lin, Zaili; Shi, Jianmai; Zhang, Weiming

    2016-01-01

    This paper studies a reliable facility location problem with facility protection that aims to hedge against random facility disruptions by both strategically protecting some facilities and using backup facilities for the demands. An Integer Programming model is proposed for this problem, in which the failure probabilities of facilities are site-specific. A solution approach combining Lagrangian Relaxation and local search is proposed and is demonstrated to be both effective and efficient based on computational experiments on random numerical examples with 49, 88, 150 and 263 nodes in the network. A real case study for a 100-city network in Hunan province, China, is presented, based on which the properties of the model are discussed and some managerial insights are analyzed.
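
    As a small illustration of the kind of model described above, the sketch below evaluates the expected service cost of an assignment when each unprotected facility fails independently with a site-specific probability (a protected facility never fails, and a demand falls back to its backup facility when its primary is down), and greedily chooses which facilities to protect. It is a toy stand-in under those assumptions, not the paper's Integer Programming model or its Lagrangian Relaxation algorithm.

        def expected_cost(assign, dist, fail_p, protected, penalty=100.0):
            """Expected service cost when unprotected facilities fail independently."""
            total = 0.0
            for demand, (primary, backup) in assign.items():
                p1 = 0.0 if primary in protected else fail_p[primary]
                p2 = 0.0 if backup in protected else fail_p[backup]
                total += (1 - p1) * dist[demand][primary] \
                         + p1 * ((1 - p2) * dist[demand][backup] + p2 * penalty)
            return total

        def greedy_protect(assign, dist, fail_p, budget):
            """Protect, one at a time, the facility with the largest marginal saving."""
            protected = set()
            for _ in range(budget):
                base = expected_cost(assign, dist, fail_p, protected)
                best = min((f for f in fail_p if f not in protected),
                           key=lambda f: expected_cost(assign, dist, fail_p, protected | {f}))
                if expected_cost(assign, dist, fail_p, protected | {best}) < base:
                    protected.add(best)
            return protected

        if __name__ == "__main__":
            dist = {"d1": {"A": 3, "B": 9}, "d2": {"A": 8, "B": 2}, "d3": {"A": 5, "B": 6}}
            assign = {"d1": ("A", "B"), "d2": ("B", "A"), "d3": ("A", "B")}
            fail_p = {"A": 0.3, "B": 0.05}
            chosen = greedy_protect(assign, dist, fail_p, budget=1)
            print("protect:", chosen,
                  "expected cost:", round(expected_cost(assign, dist, fail_p, chosen), 2))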

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. [Figure 3: Number of events per month (data)] In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  12. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television as many long feared, but the computer, the ubiquitous portal of our work and personal lives. At this point, the computer is almost so common we don't notice it in our view. It's difficult to envision that not that long ago it was a gigantic, room-sized structure accessible only by a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  13. Sensor test facilities and capabilities at the Nevada Test Site

    Energy Technology Data Exchange (ETDEWEB)

    Boyer, W.B.; Burke, L.J.; Gomez, B.J.; Livingston, L.; Nelson, D.S.; Smathers, D.C.

    1996-12-31

    Sandia National Laboratories has recently developed two major field test capabilities for unattended ground sensor systems at the Department of Energy's Nevada Test Site (NTS). The first capability utilizes the NTS large area, varied terrain, and intrasite communications systems for testing sensors for detecting and tracking vehicular traffic. Sensor and ground truth data can be collected at either of two secure control centers. This system also includes an automated ground truth capability that consists of differential Global Positioning Satellite (GPS) receivers on test vehicles and live TV coverage of critical road sections. Finally there is a high-speed, secure computer network link between the control centers and the Air Force's Theater Air Command and Control Simulation Facility in Albuquerque NM. The second capability is Bunker 2-300. It is a facility for evaluating advanced sensor systems for monitoring activities in underground cut-and-cover facilities. The main part of the facility consists of an underground bunker with three large rooms for operating various types of equipment. This equipment includes simulated chemical production machinery and controlled seismic and acoustic signal sources. There has been a thorough geologic and electromagnetic characterization of the region around the bunker. Since the facility is in a remote location, it is well-isolated from seismic, acoustic, and electromagnetic interference.

  14. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  15. The small impact of various partial charge distributions in ground and excited state on the computational Stokes shift of 1-methyl-6-oxyquinolinium betaine in diverse water models

    Science.gov (United States)

    Heid, Esther; Harringer, Sophia; Schröder, Christian

    2016-10-01

    The influence of the partial charge distribution obtained from quantum mechanics of the solute 1-methyl-6-oxyquinolinium betaine in the ground- and first excited state on the time-dependent Stokes shift is studied via molecular dynamics computer simulation. Furthermore, the effect of the employed solvent model — here the non-polarizable SPC, TIP4P and TIP4P/2005 and the polarizable SWM4 water model — on the solvation dynamics of the system is investigated. The use of different functionals and calculation methods influences the partial charge distribution and the magnitude of the dipole moment of the solute, but not the orientation of the dipole moment. Simulations based on the calculated charge distributions show nearly the same relaxation behavior. Approximating the whole solute molecule by a dipole results in the same relaxation behavior, but lower solvation energies, indicating that the time scale of the Stokes shift does not depend on peculiarities of the solute. However, the SPC and TIP4P water models show too fast dynamics which can be ascribed to a too large diffusion coefficient and too low viscosity. The calculated diffusion coefficient and viscosity for the SWM4 and TIP4P/2005 models coincide well with experimental values and the corresponding relaxation behavior is comparable to experimental values. Furthermore we found that for a quantitative description of the Stokes shift of the applied system at least two solvation shells around the solute have to be taken into account.
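
    The time-dependent Stokes shift studied above is conventionally reported through the normalized response function S(t) = (nu(t) - nu(inf)) / (nu(0) - nu(inf)), computed from the solute-solvent energy gap along the nonequilibrium trajectories. The sketch below only shows that bookkeeping on a synthetic biexponential decay; it is not the authors' analysis code, and the decay times are invented.

        import numpy as np

        def stokes_response(nu_t):
            """S(t) = (nu(t) - nu(inf)) / (nu(0) - nu(inf)), taking nu(inf) as the final value."""
            nu_inf = nu_t[-1]
            return (nu_t - nu_inf) / (nu_t[0] - nu_inf)

        if __name__ == "__main__":
            t = np.linspace(0.0, 20.0, 400)  # picoseconds (synthetic)
            nu_t = 18000.0 + 900.0 * (0.6 * np.exp(-t / 0.3) + 0.4 * np.exp(-t / 4.0))  # cm^-1
            S = stokes_response(nu_t)
            # report the 1/e relaxation time of the synthetic decay
            t_1e = t[np.argmax(S < np.exp(-1.0))]
            print(f"S(0) = {S[0]:.2f}, 1/e time = {t_1e:.2f} ps")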

  16. High-resolution computed tomography to differentiate chronic diffuse interstitial lung diseases with predominant ground-glass pattern using logical analysis of data

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Sophie Grivaud; Brauner, Michel W.; Rety, Frederique [Universite Paris 13, Assistance Publique-Hopitaux de Paris, Hopital Avicenne, UPRES EA 2363, Department of Radiology, Bobigny (France); Kronek, Louis-Philippe; Brauner, Nadia [Universite Joseph Fourier, Laboratoire G-SCOP, Grenoble (France); Valeyre, Dominique; Nunes, Hilario [Universite Paris 13, Assistance Publique-Hopitaux de Paris, Hopital Avicenne, UPRES EA 2363, Department of Pneumology, Bobigny (France); Brillet, Pierre-Yves [Universite Paris 13, Assistance Publique-Hopitaux de Paris, Hopital Avicenne, UPRES EA 2363, Department of Radiology, Bobigny (France); Hopital Avicenne, Service de Radiologie, Bobigny Cedex (France)

    2010-06-15

    We evaluated the performance of high-resolution computed tomography (HRCT) to differentiate chronic diffuse interstitial lung diseases (CDILD) with predominant ground-glass pattern by using logical analysis of data (LAD). A total of 162 patients were classified into seven categories: sarcoidosis (n = 38), connective tissue disease (n = 32), hypersensitivity pneumonitis (n = 18), drug-induced lung disease (n = 15), alveolar proteinosis (n = 12), idiopathic non-specific interstitial pneumonia (n = 10) and miscellaneous (n = 37). First, 40 CT attributes were investigated by the LAD to build up patterns characterising a category. From the association of patterns, LAD determined models specific to each CDILD. Second, data were recomputed by adding eight clinical attributes to the analysis. The 20 × 5 cross-folding method was used for validation. Models could be individualised for sarcoidosis, hypersensitivity pneumonitis, connective tissue disease and alveolar proteinosis. An additional model was individualised for drug-induced lung disease by adding clinical data. No model was demonstrated for idiopathic non-specific interstitial pneumonia and the miscellaneous category. The results showed that HRCT had a good sensitivity (≥64%) and specificity (≥78%) and a high negative predictive value (≥93%) for diseases with a model. Higher sensitivity (≥78%) and specificity (≥89%) were achieved by adding clinical data. The diagnostic performance of HRCT is high and can be increased by adding clinical data. (orig.)
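
    For context on how the reported sensitivity, specificity and negative predictive value relate, the snippet below evaluates the standard relationship NPV = Sp(1 - p) / (Sp(1 - p) + (1 - Se)p) for an assumed prevalence p; the prevalence value is illustrative only and is not taken from the study.

        def npv(sensitivity, specificity, prevalence):
            """Negative predictive value from sensitivity, specificity and prevalence."""
            true_neg = specificity * (1.0 - prevalence)
            false_neg = (1.0 - sensitivity) * prevalence
            return true_neg / (true_neg + false_neg)

        if __name__ == "__main__":
            # e.g. the reported minimums Se = 0.64, Sp = 0.78 at an assumed 20% prevalence
            print(f"NPV = {npv(0.64, 0.78, 0.20):.2f}")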

  17. Development of a ground facility for simulating wide-band angular vibration

    Institute of Scientific and Technical Information of China (English)

    邢志钢; 邢建伟; 王立; 郑钢铁

    2013-01-01

    To understand and reduce the mission risk that wide-band angular vibration poses to the optical and microwave sensors of lunar and deep-space landers, ground simulation tests are necessary. This paper first proposes a vibration conversion structure and a measurement method, on the basis of which a ground facility for simulating wide-band angular vibration was developed. A measurement test was then carried out for the angular vibration environment of the imaging sensor used on a lunar lander. The results show that the simulator can produce angular vibrations in a frequency range of 0 to 2000 Hz with a control precision of ±15% (2σ), and that the rotational and translational rigid-body motions of the lander can also be set to the required values. The facility has been used to validate the performance of optical sensors in an angular vibration environment.

  18. NASA Ames's electric arc-driven shock tube facility and research on nonequilibrium phenomena in low density hypersonic flows

    Science.gov (United States)

    Sharma, Surendra P.

    1992-01-01

    Basic requirements for a ground test facility simulating low density hypersonic flows are discussed. Such a facility should be able to produce shock velocities in the range of 10-17 km/sec at initial pressures of 0.010 to 0.050 Torr, and should be equipped with diagnostic systems able to measure the emitted radiation, characteristic temperatures and populations of the various energy levels. In the light of these requirements, NASA Ames's electric arc-driven low density shock tube facility is described, and the available experimental diagnostic systems and computational tools are discussed.

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  20. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  3. Clinical Study of Intra-operative Computed Tomography Guided Localization with A Hook-wire System for Small Ground Glass Opacities in Minimally Invasive Resection

    Directory of Open Access Journals (Sweden)

    Xiangyang CHU

    2014-12-01

    Full Text Available Background and objective Localization of small pulmonary ground glass nodules is a technical difficulty in minimally invasive resection. The aim of this study is to evaluate the value of intraoperative computed tomography (CT)-guided localization using a hook-wire system for small ground glass opacities (GGOs) in minimally invasive resection, as well as to discuss the necessity and feasibility of surgical resection of small GGOs (<10 mm) through a minimally invasive approach. Methods The records of 32 patients with 41 small GGOs who underwent intraoperative CT-guided double-thorn hook wire localization prior to video-assisted thoracoscopic wedge resection from October 2009 to October 2013 were retrospectively reviewed. All patients received video-assisted thoracoscopic surgery (VATS) within 10 min after wire localization. The efficacy of intraoperative localization was evaluated in terms of procedure time, VATS success rate, and associated complications of localization. Results A total of 32 patients (15 males and 17 females) underwent 41 VATS resections, with 2 simultaneous nodule resections performed in 3 patients, 3 lesion resections in 1 patient, and 5 lesions resected in 1 patient. Nodule diameters ranged from 2 mm to 10 mm (mean: 5 mm). The distance of the lung lesions from the nearest pleural surface ranged from 5 mm to 24 mm (mean: 12.5 mm). All resections of lesions guided by the inserted hook wires were successfully performed by VATS (100% success rate). The mean procedure time for the CT-guided hook wire localization was 8.4 min (range: 4 min-18 min), and the mean procedure time for VATS was 32 min (range: 14 min-98 min). The median hospital stay was 8 d (range: 5 d-14 d). Pathological examination revealed 28 primary lung cancers, 9 atypical adenomatous hyperplasias, and 4 nonspecific chronic inflammations. No major complication related to the intraoperative hook wire localization and VATS was noted. Conclusion Intraoperative CT-guided hook wire

  4. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  6. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  7. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  10. Facilities & Leadership

    Data.gov (United States)

    Department of Veterans Affairs — The facilities web service provides VA facility information. The VA facilities locator is a feature that is available across the enterprise, on any webpage, for the...

  11. Ground energy coupling

    Science.gov (United States)

    Metz, P. D.

    The feasibility of ground coupling for various heat pump systems was investigated. Analytical heat flow models were developed to approximate the design of ground coupling devices for use in solar heat pump space conditioning systems. A digital computer program called GROCS (GRound Coupled Systems) was written to model 3-dimensional underground heat flow in order to simulate the behavior of ground coupling experiments and to provide performance predictions, which have been compared to experimental results. GROCS has also been integrated with TRNSYS. Soil thermal property and ground coupling device experiments are described. Buried tanks, serpentine earth coils in various configurations, lengths and depths, and sealed vertical wells are being investigated. An earth coil used to heat a house without the use of resistance heating is described.
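
    GROCS itself is not reproduced here, so the following is only a generic sketch of the kind of finite-difference heat-flow calculation such a ground-coupling model performs: explicit time stepping of the heat equation on a small 3-D soil grid around a buried heat exchanger. The grid size, soil diffusivity and the fixed-temperature source cell are illustrative assumptions.

    # Generic sketch of explicit finite-difference heat conduction in soil,
    # in the spirit of a ground-coupling simulator such as GROCS (not its actual code).
    import numpy as np

    nx = ny = nz = 20                       # small illustrative grid
    dx = 0.5                                # m, cell size (assumed)
    alpha = 8.0e-7                          # m^2/s, typical soil thermal diffusivity (assumed)
    dt = 0.2 * dx**2 / alpha / 6.0          # comfortably below the explicit stability limit

    T = np.full((nx, ny, nz), 10.0)         # initial ground temperature, degC (assumed)
    source = (nx // 2, ny // 2, nz // 2)    # buried heat-exchanger cell

    def step(T):
        """One explicit step of dT/dt = alpha * laplacian(T); boundaries held fixed."""
        Tn = T.copy()
        lap = (T[2:, 1:-1, 1:-1] + T[:-2, 1:-1, 1:-1] +
               T[1:-1, 2:, 1:-1] + T[1:-1, :-2, 1:-1] +
               T[1:-1, 1:-1, 2:] + T[1:-1, 1:-1, :-2] - 6.0 * T[1:-1, 1:-1, 1:-1])
        Tn[1:-1, 1:-1, 1:-1] += alpha * dt / dx**2 * lap
        Tn[source] = 40.0                   # heat-exchanger cell held at 40 degC (assumed)
        return Tn

    for _ in range(1000):
        T = step(T)
    print("Temperature 1 m from the source cell:",
          round(T[source[0] + 2, source[1], source[2]], 2), "degC")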

  12. Pure ground glass nodular adenocarcinomas: Are preoperative positron emission tomography/computed tomography and brain magnetic resonance imaging useful or necessary?

    Science.gov (United States)

    Cho, Hyoun; Lee, Ho Yun; Kim, Jhingook; Kim, Hong Kwan; Choi, Joon Young; Um, Sang-Won; Lee, Kyung Soo

    2015-09-01

    The utility of (18)F-Fluorodeoxyglucose positron emission tomography/computed tomography (FDG PET/CT) scanning and brain magnetic resonance imaging (MRI) as a staging workup for lung adenocarcinoma manifesting as pure ground glass opacity (GGO) is unknown. The purpose of this study was to determine the utility of these 2 tests for preoperative staging of pure GGO nodular lung adenocarcinoma. The study included 164 patients (male:female, 73:91; mean age, 62 years) with pure GGO nodular lung adenocarcinoma who underwent PET/CT (in 136 patients) and/or brain MRI (in 109 patients) before surgery. Pathologic N staging and dedicated standard imaging or follow-up imaging findings for M staging were used as reference standards. The median follow-up time was 47.9 months. On PET/CT scan, abnormal FDG uptake of lymph nodes was found in 2 of 136 patients (1.5%); both were negative on final pathology. Abnormal FDG uptake of the liver was detected in 1 patient, which was also confirmed to be negative by dedicated abdominal CT. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of PET/CT in detecting metastases were not applicable, 98% (95% confidence interval [CI], 94%-100%), 0% (95% CI, 0%-71%), 100% (95% CI, 97%-100%), and 98% (95% CI, 94%-100%), respectively. No brain metastasis was found in preoperative brain MRI of 109 patients. Of 109 patients, 1 (0.9%) developed brain metastasis 30 months after surgical resection. PET/CT and brain MRI is not necessary in the staging of pure GGO nodular lung adenocarcinoma. Copyright © 2015 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.

  13. Numerical Simulation of Ground Coupling of Low Yield Nuclear Detonation

    Science.gov (United States)

    2010-06-01

    Without nuclear testing, advanced simulation and experimental facilities, such as the National Ignition Facility (NIF), are essential to assuring safety, reliability, and effectiveness ... in planning future experimental work at NIF. Subject terms: National Ignition Facility, GEODYN, Ground Coupling.

  14. Grounded theory.

    Science.gov (United States)

    Harris, Tina

    2015-04-29

    Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.

  15. Biochemistry Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Biochemistry Facility provides expert services and consultation in biochemical enzyme assays and protein purification. The facility currently features 1) Liquid...

  16. 40 CFR 265.91 - Ground-water monitoring system.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment (2010-07-01) ... DISPOSAL FACILITIES, Ground-Water Monitoring, § 265.91 Ground-water monitoring system. (a) A ground-water monitoring system must be capable of yielding ground-water samples for analysis and must consist of: (1) ...

  17. Development of a computer code for shielding calculation in X-ray facilities; Desenvolvimento de um codigo computacional para calculos de blindagem em salas radiograficas

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Diogo da S.; Lava, Deise D.; Affonso, Renato R.W.; Moreira, Maria de L.; Guimaraes, Antonio C.F., E-mail: diogosb@outlook.com, E-mail: deise_dy@hotmail.com, E-mail: raoniwa@yahoo.com.br, E-mail: malu@ien.gov.br, E-mail: tony@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    The construction of an effective barrier against the ionizing radiation present in X-ray rooms requires consideration of many variables. The methodology used for specifying the thickness of the primary and secondary shielding of a traditional X-ray room considers the following factors: use factor, occupancy factor, distance between the source and the wall, workload, air kerma and distance between the patient and the receptor. With these data it was possible to develop a computer program that identifies these variables and uses them in functions obtained through regression of the graphs provided in NCRP Report 147 (Structural Shielding Design for Medical X-Ray Imaging Facilities) to calculate the shielding of the room walls as well as of the darkroom wall and adjacent areas. With the methodology built, the program was validated by comparing its results with a base case provided by that report. The thicknesses obtained cover various materials such as steel, wood and concrete. After validation, the program was applied to a real case of a radiographic room, whose visual construction was done with the help of software used for modelling interiors and exteriors. The barrier-calculation program resulted in a user-friendly tool for planning radiographic rooms that comply with the limits established by CNEN-NN-3:01, published in September 2011.
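
    The report's own calculation chain is not reproduced in the abstract, but NCRP Report 147 shielding problems are commonly expressed through the Archer transmission model, in which a required barrier transmission B is converted into a material thickness. The sketch below is a generic illustration of that step only; the design goal, workload and the alpha/beta/gamma fitting parameters are placeholders, not values taken from the report or from the program described above.

    # Generic sketch of a barrier-thickness calculation in the style of the
    # Archer transmission model used with NCRP Report 147 methodology.
    # All numerical parameters below are placeholders, not report values.
    import math

    def required_transmission(P, d, K1, N, T):
        """B = P * d^2 / (K1 * N * T): design goal P (mGy/wk), distance d (m),
        unshielded air kerma per patient at 1 m K1 (mGy), N patients/week,
        occupancy factor T."""
        return P * d**2 / (K1 * N * T)

    def archer_thickness(B, alpha, beta, gamma):
        """Invert B(x) = [(1 + beta/alpha) * exp(alpha*gamma*x) - beta/alpha]^(-1/gamma)."""
        return (1.0 / (alpha * gamma)) * math.log((B**(-gamma) + beta / alpha) /
                                                  (1.0 + beta / alpha))

    # Placeholder numbers for illustration only.
    B = required_transmission(P=0.02, d=3.0, K1=5.0, N=100, T=1.0)
    x_mm = archer_thickness(B, alpha=2.3, beta=15.0, gamma=0.5)  # hypothetical lead parameters
    print(f"required transmission B = {B:.4f}, barrier thickness ~ {x_mm:.2f} mm")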

  18. Development of a computational code for calculations of shielding in dental facilities; Desenvolvimento de um codigo computacional para calculos de blindagem em instalacoes odontologicas

    Energy Technology Data Exchange (ETDEWEB)

    Lava, Deise D.; Borges, Diogo da S.; Affonso, Renato R.W.; Guimaraes, Antonio C.F.; Moreira, Maria de L., E-mail: deise_dy@hotmail.com, E-mail: diogosb@outlook.com, E-mail: raoniwa@yahoo.com.br, E-mail: tony@ien.gov.br, E-mail: malu@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    This paper addresses shielding calculations intended to minimize the exposure of patients and/or personnel to ionizing radiation. The work makes use of the radiation protection report NCRP-145 (Radiation Protection in Dentistry), which establishes the calculations and standards to be adopted to ensure the safety of those who may be exposed to ionizing radiation in dental facilities, in accordance with the dose limits established by the CNEN-NN-3.1 standard published in September 2011. The methodology comprises the use of a programming language to process the data provided by that report, together with a commercial application used for creating residential and interior-design projects. The FORTRAN language was adopted and the method was applied to a real case. The result is a program capable of returning the thickness of materials such as steel, lead, wood, glass, plaster, acrylic and leaded glass, which can be used for effective shielding against single or continuous pulse beams. Several variables are used to calculate the thickness of the shield, such as: the number of films used per week, film load, use factor, occupancy factor, distance between the wall and the source, transmission factor, workload, area definition, beam intensity, and intraoral and panoramic examinations. Before the methodology is applied, the results are validated against examples provided by NCRP-145; the recalculated examples give answers consistent with the report.

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  20. Documentation of a computer program to simulate lake-aquifer interaction using the MODFLOW ground water flow model and the MOC3D solute-transport model

    Science.gov (United States)

    Merritt, Michael L.; Konikow, Leonard F.

    2000-01-01

    Heads and flow patterns in surficial aquifers can be strongly influenced by the presence of stationary surface-water bodies (lakes) that are in direct contact, vertically and laterally, with the aquifer. Conversely, lake stages can be significantly affected by the volume of water that seeps through the lakebed that separates the lake from the aquifer. For these reasons, a set of computer subroutines called the Lake Package (LAK3) was developed to represent lake/aquifer interaction in numerical simulations using the U.S. Geological Survey three-dimensional, finite-difference, modular ground-water flow model MODFLOW and the U.S. Geological Survey three-dimensional method-of-characteristics solute-transport model MOC3D. In the Lake Package described in this report, a lake is represented as a volume of space within the model grid which consists of inactive cells extending downward from the upper surface of the grid. Active model grid cells bordering this space, representing the adjacent aquifer, exchange water with the lake at a rate determined by the relative heads and by conductances that are based on grid cell dimensions, hydraulic conductivities of the aquifer material, and user-specified leakance distributions that represent the resistance to flow through the material of the lakebed. Parts of the lake may become "dry" as upper layers of the model are dewatered, with a concomitant reduction in lake surface area, and may subsequently rewet when aquifer heads rise. An empirical approximation has been encoded to simulate the rewetting of a lake that becomes completely dry. The variations of lake stages are determined by independent water budgets computed for each lake in the model grid. This lake budget process makes the package a simulator of the response of lake stage to hydraulic stresses applied to the aquifer. Implementation of a lake water budget requires input of parameters including those representing the rate of lake atmospheric recharge and evaporation
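
    The core of the lake/aquifer coupling described above is a head-dependent leakage term plus a per-lake water budget. The sketch below is only a schematic illustration of that idea (conductance-based seepage and a stage update), not code from the Lake Package itself; the conductances, areas and stress rates are placeholder assumptions.

    # Schematic illustration of lake/aquifer exchange: seepage Q = C * (h_lake - h_aquifer)
    # through each lakebed cell, plus a lake water budget that updates the stage.
    # Not LAK3 source code; all numbers are placeholders.

    def lakebed_conductance(K_bed, area, thickness):
        """Conductance of one lakebed cell: C = K * A / b."""
        return K_bed * area / thickness

    def lake_stage_step(stage, aquifer_heads, cells, lake_area, recharge, evaporation, dt):
        """Advance the lake stage by one time step from its water budget."""
        seepage_out = 0.0
        for (K_bed, area, thickness), h_aq in zip(cells, aquifer_heads):
            C = lakebed_conductance(K_bed, area, thickness)
            seepage_out += C * (stage - h_aq)        # positive = lake loses water
        dV = (recharge - evaporation) * lake_area - seepage_out
        return stage + dV * dt / lake_area

    # Placeholder example: three lakebed cells in contact with the aquifer.
    cells = [(1e-6, 400.0, 0.5)] * 3                  # (K m/s, A m^2, b m)
    heads = [9.8, 9.7, 9.9]                           # aquifer heads, m
    stage = 10.0                                      # initial lake stage, m
    for _ in range(10):
        stage = lake_stage_step(stage, heads, cells, lake_area=1200.0,
                                recharge=1e-8, evaporation=2e-8, dt=86400.0)
    print(round(stage, 3))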

  1. Central Computer Facility Software Directory.

    Science.gov (United States)

    1986-05-01

    CHAPTER 1: Introduction ... CHAPTER 2: Editors and Text Processors ... in some cases, a table of contents or an index. Digital Standard Runoff (VAX). OPERATING SYSTEM: VMS. DESCRIPTION: Digital Standard Runoff (DSR) is a ...

  2. Fast ground filtering for TLS data via Scanline Density Analysis

    Science.gov (United States)

    Che, Erzhuo; Olsen, Michael J.

    2017-07-01

    Terrestrial Laser Scanning (TLS) efficiently collects 3D information based on lidar (light detection and ranging) technology. TLS has been widely used in topographic mapping, engineering surveying, forestry, industrial facilities, cultural heritage, and so on. Ground filtering is a common procedure in lidar data processing, which separates the point cloud data into ground points and non-ground points. Effective ground filtering is helpful for subsequent procedures such as segmentation, classification, and modeling. Numerous ground filtering algorithms have been developed for Airborne Laser Scanning (ALS) data. However, many of these are error prone in application to TLS data because of its different angle of view and highly variable resolution. Further, many ground filtering techniques are limited in application within challenging topography and experience difficulty coping with some objects such as short vegetation, steep slopes, and so forth. Lastly, due to the large size of point cloud data, operations such as data traversing, multiple iterations, and neighbor searching significantly affect the computation efficiency. In order to overcome these challenges, we present an efficient ground filtering method for TLS data via a Scanline Density Analysis, which is very fast because it exploits the grid structure storing TLS data. The process first separates the ground candidates, density features, and unidentified points based on an analysis of point density within each scanline. Second, a region growth using the scan pattern is performed to cluster the ground candidates and further refine the ground points (clusters). In the experiment, the effectiveness, parameter robustness, and efficiency of the proposed method are demonstrated with datasets collected from an urban scene and a natural scene, respectively.
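
    The abstract does not give the paper's exact criteria, so the following is only a schematic sketch of the general idea: for a TLS scan stored scanline by scanline, points on near-horizontal ground far from the scanner tend to produce lower local point density along a scanline than points on vertical objects, so a per-scanline density threshold can pre-classify ground candidates before a region-growing refinement. The density measure and thresholds are placeholder assumptions, not the authors' values.

    # Schematic sketch of scanline-based density pre-classification for TLS data.
    # Not the authors' algorithm; the density measure and thresholds are assumptions.
    import numpy as np

    def scanline_density(points, window=5):
        """Inverse local mean spacing of consecutive points in one scanline (N x 3 array)."""
        d = np.linalg.norm(np.diff(points, axis=0), axis=1)          # spacing between shots
        d = np.convolve(d, np.ones(window) / window, mode="same")    # local mean spacing
        dens = 1.0 / np.maximum(d, 1e-6)
        return np.r_[dens, dens[-1]]                                 # pad to N values

    def classify_scanline(points, low=3.0, high=8.0):
        """Label each point: 0 = ground candidate, 1 = density feature, 2 = unidentified."""
        dens = scanline_density(points)
        labels = np.full(len(points), 2, dtype=int)
        labels[dens < low] = 0        # sparse along the line: likely flat ground far away
        labels[dens > high] = 1       # dense clusters: likely vertical objects
        return labels

    # Tiny synthetic scanline: flat ground points followed by a vertical feature.
    ground = np.c_[np.linspace(1, 20, 40), np.zeros(40), np.zeros(40)]
    wall = np.c_[np.full(20, 20.0), np.zeros(20), np.linspace(0, 2, 20)]
    print(classify_scanline(np.vstack([ground, wall])))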

  3. Design of satellite ground application system based on Cloud Computing; 基于云计算的卫星地面应用系统设计

    Institute of Scientific and Technical Information of China (English)

    温志军

    2016-01-01

    This article centres on the design of a satellite ground system. It first describes the challenges faced in designing a satellite ground application system; on the basis of an analysis of those challenges, it then elaborates the concrete scheme design of the satellite ground application system, and finally presents the implementation and deployment of the system.

  4. A cryogenic test facility

    Science.gov (United States)

    Veenendaal, Ian

    The next generation of space-borne instruments for far-infrared spectroscopy will utilize large-diameter, cryogenically cooled telescopes in order to achieve unprecedented sensitivities. Low-background, ground-based cryogenic facilities are required for the cryogenic testing of materials, components and subsystems. The Test Facility Cryostat (TFC) at the University of Lethbridge is a large-volume, closed-cycle, 4 K cryogenic facility developed for this purpose. This thesis discusses the design and performance of the facility and associated external instrumentation. An apparatus for measuring the thermal properties of materials is presented, and measurements of the thermal expansion and conductivity of carbon fibre reinforced polymers (CFRPs) at cryogenic temperatures are reported. Finally, I discuss the progress towards the design and fabrication of a demonstrator cryogenic, far-infrared Fourier transform spectrometer.

  5. Ground based materials science experiments

    Science.gov (United States)

    Meyer, M. B.; Johnston, J. C.; Glasgow, T. K.

    1988-01-01

    The facilities at the Microgravity Materials Science Laboratory (MMSL) at the Lewis Research Center, created to offer immediate and low-cost access to ground-based testing facilities for industrial, academic, and government researchers, are described. The equipment in the MMSL falls into three categories: (1) devices which emulate some aspect of low gravitational forces, (2) specialized capabilities for 1-g development and refinement of microgravity experiments, and (3) functional duplicates of flight hardware. Equipment diagrams are included.

  7. Wind Energy Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Laurie, Carol

    2017-02-01

    This book takes readers inside the places where daily discoveries shape the next generation of wind power systems. Energy Department laboratory facilities span the United States and offer wind research capabilities to meet industry needs. The facilities described in this book make it possible for industry players to increase reliability, improve efficiency, and reduce the cost of wind energy -- one discovery at a time. Whether you require blade testing or resource characterization, grid integration or high-performance computing, Department of Energy laboratory facilities offer a variety of capabilities to meet your wind research needs.

  8. Wind Energy Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Office of Energy Efficiency and Renewable Energy

    2017-02-01

    This book takes readers inside the places where daily discoveries shape the next generation of wind power systems. Energy Department laboratory facilities span the United States and offer wind research capabilities to meet industry needs. The facilities described in this book make it possible for industry players to increase reliability, improve efficiency, and reduce the cost of wind energy -- one discovery at a time. Whether you require blade testing or resource characterization, grid integration or high-performance computing, Department of Energy laboratory facilities offer a variety of capabilities to meet your wind research needs.

  9. Fabrication Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Fabrication Facilities are a direct result of years of testing support. Through years of experience, the three fabrication facilities (Fort Hood, Fort Lewis, and...

  10. Facility Microgrids

    Energy Technology Data Exchange (ETDEWEB)

    Ye, Z.; Walling, R.; Miller, N.; Du, P.; Nelson, K.

    2005-05-01

    Microgrids are receiving considerable interest from the power industry, partly because their business and technical structure shows promise as a means of taking full advantage of distributed generation. This report investigates three issues associated with facility microgrids: (1) unintentional-islanding protection for facility microgrids with multiple distributed generators, (2) the response of facility microgrids to bulk grid disturbances, and (3) intentional islanding of facility microgrids.

  11. Ground-glass opacity in diffuse lung diseases: high-resolution computed tomography-pathology correlation; Opacidades em vidro fosco nas doencas pulmonares difusas: correlacao da tomografia computadorizada de alta resolucao com a anatomopatologia

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Maria Lucia de Oliveira; Vianna, Alberto Domingues; Marchiori, Edson [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Dept. de Radiologia; Souza Junior, Arthur Soares [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Disciplina de Radiologia; Moraes, Heleno Pinto de [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Dept. de Patologia]. E-mail: edmarchiori@zipmail.com.br

    2003-12-01

    Ground-glass opacity is a finding frequently seen in high-resolution computed tomography examinations of the chest and is characterized by hazy increased attenuation of the lung without blurring of bronchial and vascular margins. Because it is nonspecific, association with other radiological, clinical and pathological findings must be considered for an accurate diagnostic interpretation. In this paper, 62 computed tomography examinations of patients with diffuse pulmonary diseases of 14 different etiologies, in which ground-glass opacity was the only or the most remarkable finding, were reviewed, and these findings were correlated with the pathological abnormalities seen in specimens obtained from biopsies or necropsies. In pneumocystosis, ground-glass opacities correlated histologically with alveolar occupation by a foaming material containing parasites; in bronchioloalveolar cell carcinoma, with thickening of the alveolar septa and occupation of the lumen by mucus and tumoral cells; in paracoccidioidomycosis, with thickening of the alveolar septa, areas of fibrosis and alveolar bronchopneumonic exudate; in sarcoidosis, with fibrosis or clustering of granulomas; and in idiopathic pulmonary fibrosis, with alveolar septal thickening due to fibrosis. Alveolar occupation by blood was found in cases of leptospirosis, idiopathic hemosiderosis, metastatic kidney tumor and invasive aspergillosis, whereas oily vacuoles were seen in lipoid pneumonia, proteinaceous and lipoproteinaceous material in silicoproteinosis and pulmonary alveolar proteinosis, and edematous fluid in cardiac failure. (author)

  12. Avoiding Computer Viruses.

    Science.gov (United States)

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  13. Ground Wars

    DEFF Research Database (Denmark)

    Nielsen, Rasmus Kleis

    Political campaigns today are won or lost in the so-called ground war--the strategic deployment of teams of staffers, volunteers, and paid part-timers who work the phones and canvass block by block, house by house, voter by voter. Ground Wars provides an in-depth ethnographic portrait of two...... infrastructures that utilize large databases with detailed individual-level information for targeting voters, and armies of dedicated volunteers and paid part-timers. Nielsen challenges the notion that political communication in America must be tightly scripted, controlled, and conducted by a select coterie...... of professionals. Yet he also quashes the romantic idea that canvassing is a purer form of grassroots politics. In today's political ground wars, Nielsen demonstrates, even the most ordinary-seeming volunteer knocking at your door is backed up by high-tech targeting technologies and party expertise. Ground Wars...

  14. 47 CFR 69.110 - Entrance facilities.

    Science.gov (United States)

    2010-10-01

    Title 47, Telecommunication (2010-10-01) ... Computation of Charges, § 69.110 Entrance facilities. (a) A flat-rated entrance facilities charge expressed in ... that use telephone company facilities between the interexchange carrier or other person's point of ...

  15. COMPUTER REALIZATION OF SEARCH TASK OF THE SHORTEST ROUTE WITH THE HELP OF EXCEL AND VBA

    OpenAIRE

    2008-01-01

    The need to develop simple applied software, accessible to end users, for solving local optimization problems in transport technologies is substantiated using Excel and VBA facilities. The computer solution of the shortest-route search problem is described.
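
    The record does not reproduce its Excel/VBA implementation, but the underlying shortest-route search can be illustrated with a standard Dijkstra computation; the small road network below is a made-up example, not data from the original work.

    # Illustrative shortest-route computation (Dijkstra's algorithm) on a small
    # made-up road network; the original work implemented the search in Excel/VBA.
    import heapq

    def dijkstra(graph, start, goal):
        """graph: {node: [(neighbour, distance), ...]}. Returns (distance, path)."""
        queue = [(0.0, start, [start])]
        seen = set()
        while queue:
            dist, node, path = heapq.heappop(queue)
            if node == goal:
                return dist, path
            if node in seen:
                continue
            seen.add(node)
            for nxt, w in graph.get(node, []):
                if nxt not in seen:
                    heapq.heappush(queue, (dist + w, nxt, path + [nxt]))
        return float("inf"), []

    roads = {
        "depot": [("A", 4.0), ("B", 2.0)],
        "A": [("C", 5.0), ("customer", 10.0)],
        "B": [("A", 1.0), ("C", 8.0)],
        "C": [("customer", 3.0)],
    }
    print(dijkstra(roads, "depot", "customer"))   # -> (11.0, ['depot', 'B', 'A', 'C', 'customer'])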

  16. Water Activities in Laxemar Simpevarp. The final disposal facility for spent nuclear fuel - removal of groundwater and water activities above ground; Vattenverksamhet i Laxemar-Simpevarp. Slutfoervarsanlaeggning foer anvaent kaernbraensle - bortledande av grundvatten samt vattenverksamheter ovan mark

    Energy Technology Data Exchange (ETDEWEB)

    Werner, Kent (EmpTec (Sweden)); Hamren, Ulrika; Collinder, Per (Ekologigruppen AB (Sweden))

    2010-12-15

    operations would include a bridge across Laxemaraan and measures in the vicinity of the surface facility (the industrial area) for the repository, in Laxemaraan and in a ditch (Oxhagsbaecken). During construction of the bridge, measures would be taken to reduce the consequences of turbid water, for instance for spawning fish. No intermediate support in the stream would be required, and the bridge would be constructed so as not to influence the flow conditions of the stream and not to form a barrier to the passage of people and animals. Other water operations above ground would be executed for handling of drainage water from the underground part of the repository and leachate from a rock dump. These waters would be diverted to Laxemaraan via a constructed 'lake' adjacent to the stream. The leachate would also be treated in a broad irrigation area with a recirculation and detention pond (Laxemarkaerren).

  17. Ground and excited state behavior of 1,4-dimethoxy-3-methyl-anthracene-9,10-dione in silver nanoparticles: Spectral and computational investigations

    Energy Technology Data Exchange (ETDEWEB)

    Umadevi, M., E-mail: ums10@yahoo.com [Department of Physics, Mother Teresa Women' s University, Kodaikanal 624101, Tamil Nadu (India); Kavitha, S.R. [Department of Physics, Mother Teresa Women' s University, Kodaikanal 624101, Tamil Nadu (India); Vanelle, P.; Terme, T.; Khoumeri, O. [Laboratoire de Pharmaco-Chimie Radicalaire, Faculté de Pharmacie, Aix-Marseille Univ, CNRS, Institut de Chimie Radicalaire ICR, UMR 7273, 27 Boulevard Jean Moulin, 13385 Marseille Cedex 05 (France)

    2013-10-15

    Silver nanoparticles (Ag NPs) of various sizes have been successfully synthesized by the simple and convenient Creighton method using sodium borohydride as the reducing agent under microwave irradiation. Optical absorption and fluorescence emission spectroscopic techniques were employed to investigate the effect of silver nanoparticles on the ground and excited states of 1,4-dimethoxy-3-methylanthracene-9,10-dione (DMMAD). The surface plasmon resonance (SPR) peak of the prepared silver colloidal solution was observed at 400 nm. Fluorescence quenching of DMMAD by silver nanoparticles has been found to increase with increasing size of the Ag NPs. The fluorescence quenching has been explained by Förster Resonance Energy Transfer (FRET) theory between DMMAD and the silver nanoparticles. The Stern–Volmer quenching constant and Benesi–Hildebrand association constant for the above system were calculated. DFT calculations were also performed to study the charge distribution of DMMAD in Ag in both the ground and excited states. -- Highlights: • Silver nanoparticles (Ag NPs) have been synthesized using the Creighton method. • Effect of Ag NPs on the ground state of DMMAD was studied. • Influence of Ag NPs on the excited state of DMMAD was investigated. • Fluorescence quenching has been explained by Förster Resonance Energy Transfer. • Quenching and binding constants were also calculated.
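
    The Stern–Volmer constant mentioned above comes from the standard relation F0/F = 1 + Ksv[Q]; the sketch below shows a generic linear fit of that relation to quenching data. The quencher concentrations and intensities are made-up placeholders, not the paper's measurements.

    # Generic Stern-Volmer analysis: fit F0/F = 1 + Ksv * [Q] to quenching data.
    # The concentrations and intensities below are made-up placeholders.
    import numpy as np

    Q = np.array([0.0, 0.5e-6, 1.0e-6, 1.5e-6, 2.0e-6])   # quencher (Ag NP) concentration, M
    F = np.array([100.0, 83.0, 71.0, 62.0, 55.0])          # fluorescence intensity, a.u.

    ratio = F[0] / F                                       # F0/F
    slope, intercept = np.polyfit(Q, ratio, 1)             # linear Stern-Volmer plot
    print(f"Ksv ~ {slope:.3g} M^-1 (intercept {intercept:.2f}, ideally 1)")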

  18. Variable gravity research facility

    Science.gov (United States)

    Allan, Sean; Ancheta, Stan; Beine, Donna; Cink, Brian; Eagon, Mark; Eckstein, Brett; Luhman, Dan; Mccowan, Daniel; Nations, James; Nordtvedt, Todd

    1988-01-01

    Spin and despin requirements; sequence of activities required to assemble the Variable Gravity Research Facility (VGRF); power systems technology; life support; thermal control systems; emergencies; communication systems; space station applications; experimental activities; computer modeling and simulation of tether vibration; cost analysis; configuration of the crew compartments; and tether lengths and rotation speeds are discussed.

  19. National facilities study. Volume 2: Task group on aeronautical research and development facilities report

    Science.gov (United States)

    1994-01-01

    The Task Group on Aeronautics R&D Facilities examined the status and requirements for aeronautics facilities against the competitive need. Emphasis was placed on ground-based facilities for subsonic, supersonic and hypersonic aerodynamics, and propulsion. Subsonic and transonic wind tunnels were judged to be most critical and of highest priority. Results of the study are presented.

  20. Experimenting with Science Facility Design.

    Science.gov (United States)

    Butterfield, Eric

    1999-01-01

    Discusses the modern school science facility and how computers and teaching methods are changing their design. Issues include power, lighting, and space requirements; funding for planning; architect assessment; materials requirements for work surfaces; and classroom flexibility. (GR)

  1. The VLT Adaptive Optics Facility Project: Adaptive Optics Modules

    Science.gov (United States)

    Arsenault, Robin; Hubin, Norbert; Stroebele, Stefan; Fedrigo, Enrico; Oberti, Sylvain; Kissler-Patig, Markus; Bacon, Roland; McDermid, Richard; Bonaccini-Calia, Domenico; Biasi, Roberto; Gallieni, Daniele; Riccardi, Armando; Donaldson, Rob; Lelouarn, Miska; Hackenberg, Wolfgang; Conzelman, Ralf; Delabre, Bernard; Stuik, Remko; Paufique, Jerome; Kasper, Markus; Vernet, Elise; Downing, Mark; Esposito, Simone; Duchateau, Michel; Franx, Marijn; Myers, Richard; Goodsell, Steven

    2006-03-01

    The Adaptive Optics Facility is a project to convert UT4 into a specialised Adaptive Telescope with the help of a Deformable Secondary Mirror (see previous article). The two instruments that have been identified for the two Nasmyth foci are: Hawk-I with its AO module GRAAL allowing a Ground Layer Adaptive Optics correction (GLAO) and MUSE with GALACSI for GLAO correction and Laser Tomography Adaptive Optics correction. This article describes the AO modules GRAAL and GALACSI and their Real-Time Computers based on SPARTA.

  2. Ground Control System Description Document

    Energy Technology Data Exchange (ETDEWEB)

    Eric Loros

    2001-07-31

    The Ground Control System contributes to the safe construction and operation of the subsurface facility, including accesses and waste emplacement drifts, by maintaining the configuration and stability of the openings during construction, development, emplacement, and caretaker modes for the duration of preclosure repository life. The Ground Control System consists of ground support structures installed within the subsurface excavated openings, any reinforcement made to the rock surrounding the opening, and inverts if designed as an integral part of the system. The Ground Control System maintains stability for the range of geologic conditions expected at the repository and for all expected loading conditions, including in situ rock, construction, operation, thermal, and seismic loads. The system maintains the size and geometry of operating envelopes for all openings, including alcoves, accesses, and emplacement drifts. The system provides for the installation and operation of sensors and equipment for any required inspection and monitoring. In addition, the Ground Control System provides protection against rockfall for all subsurface personnel, equipment, and the engineered barrier system, including the waste package during the preclosure period. The Ground Control System uses materials that are sufficiently maintainable and that retain the necessary engineering properties for the anticipated conditions of the preclosure service life. These materials are also compatible with postclosure waste isolation performance requirements of the repository. The Ground Control System interfaces with the Subsurface Facility System for operating envelopes, drift orientation, and excavated opening dimensions, Emplacement Drift System for material compatibility, Monitored Geologic Repository Operations Monitoring and Control System for ground control instrument readings, Waste Emplacement/Retrieval System to support waste emplacement operations, and the Subsurface Excavation System

  3. 6th July 2010 - United Kingdom Science and Technology Facilities Council W. Whitehorn signing the guest book with Head of International Relations F. Pauss, visiting the Computing Centre with Information Technology Department Deputy Head D. Foster, the LHC superconducting magnet test hall with Technology Department P. Strubin, the CERN Control Centre with Operation Group Leader M. Lamont and the CLIC/CTF3 facility with Project Leader J.-P. Delahaye.

    CERN Multimedia

    Teams : M. Brice, JC Gadmer

    2010-01-01

    6th July 2010 - United Kingdom Science and Technology Facilities Council W. Whitehorn signing the guest book with Head of International Relations F. Pauss, visiting the Computing Centre with Information Technology Department Deputy Head D. Foster, the LHC superconducting magnet test hall with Technology Department P. Strubin, the CERN Control Centre with Operation Group Leader M. Lamont and the CLIC/CTF3 facility with Project Leader J.-P. Delahaye.

  4. Evaluation of ground level concentration of pollutant due to gas flaring by computer simulation: A case study of Niger - Delta area of Nigeria

    Directory of Open Access Journals (Sweden)

    A. S. ABDULKAREEM

    2005-01-01

    Full Text Available The disposal of associated gases through flaring has been a major problem for the Nigerian oil and gas industries, and most of these gases are flared due to the lack of commercial outlets. The resultant effects of gas flaring are damage to the environment through acid rain formation, the greenhouse effect, global warming and ozone depletion. This write-up is aimed at evaluating the ground level concentrations of CO2, SO2, NO2 and total hydrocarbon (THC), which are products of the gas flared in oil producing areas. Volumes of gas flared at different flow stations were collected, as well as geometrical parameters. The results of the simulation of a model developed based on the principles of Gaussian gaseous dispersion showed good agreement with the dispersion pattern. The results show that the dispersion pattern of pollutants at ground level depends on the volume of gas flared, wind speed, velocity of discharge and nearness to the source of flaring. The results also show that continuous gas flaring, irrespective of the quantity deposited in the immediate environment, will in the long run lead to changes in the physicochemical properties of the soil.
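
    The abstract refers to the standard Gaussian dispersion approach; a minimal illustration of the ground-level formula C(x, y, 0) = Q / (pi * u * sigma_y * sigma_z) * exp(-y^2 / (2 sigma_y^2)) * exp(-H^2 / (2 sigma_z^2)) is sketched below. The emission rate, wind speed, flare stack height and the simple power-law dispersion coefficients are placeholder assumptions, not values from the study.

    # Generic ground-level Gaussian plume concentration; not the study's model.
    # Emission rate, wind speed, effective flare height and the power-law
    # sigma coefficients are placeholder assumptions.
    import math

    def ground_level_concentration(Q, u, x, y, H, a=0.08, b=0.90, c=0.06, d=0.92):
        """C(x, y, 0) in g/m^3 for emission Q (g/s), wind speed u (m/s),
        downwind x (m), crosswind y (m), effective release height H (m).
        sigma_y = a*x^b and sigma_z = c*x^d are simple placeholder fits."""
        sigma_y = a * x**b
        sigma_z = c * x**d
        return (Q / (math.pi * u * sigma_y * sigma_z)
                * math.exp(-y**2 / (2.0 * sigma_y**2))
                * math.exp(-H**2 / (2.0 * sigma_z**2)))

    # Centreline concentration profile downwind of a hypothetical flare stack.
    for x in (200.0, 500.0, 1000.0, 2000.0):
        c = ground_level_concentration(Q=50.0, u=3.0, x=x, y=0.0, H=40.0)
        print(f"x = {x:6.0f} m  C = {c:.3e} g/m^3")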

  5. Comparison of Knowledge and Attitudes Using Computer-Based and Face-to-Face Personal Hygiene Training Methods in Food Processing Facilities

    Science.gov (United States)

    Fenton, Ginger D.; LaBorde, Luke F.; Radhakrishna, Rama B.; Brown, J. Lynne; Cutter, Catherine N.

    2006-01-01

    Computer-based training is increasingly favored by food companies for training workers due to convenience, self-pacing ability, and ease of use. The objectives of this study were to determine if personal hygiene training, offered through a computer-based method, is as effective as a face-to-face method in knowledge acquisition and improved…

  6. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 2; Unsteady Analyses and Risk Assessment

    Science.gov (United States)

    Ahuja, Vineet; Hosangadi, Ashvin; Allgood, Daniel

    2008-01-01

    Simulation technology can play an important role in rocket engine test facility design and development by assessing risks, providing analysis of dynamic pressure and thermal loads, identifying failure modes and predicting anomalous behavior of critical systems. This is especially true for facilities such as the proposed A-3 facility at NASA SSC because of a challenging operating envelope linked to variable throttle conditions at relatively low chamber pressures. Design support for the feasibility of operating conditions and procedures is critical in such cases due to the possibility of startup/shutdown transients, moving shock structures, unsteady shock-boundary layer interactions and engine and diffuser unstart modes that can result in catastrophic failure. Analyses of such systems are difficult due to the resolution requirements needed to accurately capture moving shock structures, shock-boundary layer interactions, two-phase flow regimes and engine unstart modes. In a companion paper, we demonstrate, with the use of steady CFD analyses, an advanced capability to evaluate supersonic diffuser and steam ejector performance in the sub-scale A-3 facility. In this paper we address transient issues with the operation of the facility, especially at startup and shutdown, and assess risks related to afterburning due to the interaction of a fuel-rich plume with oxygen that is a by-product of the steam ejectors. The primary areas addressed in this paper are: (1) analyses of unstart modes due to flow transients, especially during startup/ignition, (2) engine safety during the shutdown process, and (3) interaction of the steam ejectors with the primary plume, i.e. flow transients as well as the probability of afterburning. In this abstract we discuss unsteady analyses of the engine shutdown process. However, the final paper will include analyses of a staged startup, drawdown of the engine test cell pressure, and risk assessment of potential afterburning in the facility. Unsteady

  7. Computational modeling of on-contact antennas for the detection and localization of anti-personnel landmines via ground penetrating radar

    Science.gov (United States)

    Hines, Margery Jeanne

    Ground-penetrating radar (GPR) is a mature technology which has developed into a popular tool for subsurface imaging; however its application in landmine detection is still in its infancy. Landmines are typically buried in dispersive soils below a rough surface where the effectiveness of conventional air-coupled GPR is limited. By utilizing ground-contact antennas the signal penetration is dramatically improved and data analysis is simplified. In order to canvas an area while achieving ground-contact with the antennas, this research proposes that the antennas be mounted to the bottom of the feet of a walking robotic platform developed by Square One Systems Design, called the Tri-Sphere Multi-Mode Mobility Platform. Using three antennas in both the transmitting and receiving modes, three unique bistatic GPR traces can be obtained from which a novel anti-personnel landmine detection and localization method is proposed. For each GPR trace, the target reflection is enhanced using circular polarization and is extracted using background removal. The full-path travel times are then determined by correlating the target reflections with a reference signal. These travel times are used to geometrically determine the target position to a single subsurface scattering point, which is identified as the potential target location. This detection method is fully autonomous, thereby allowing the robot to canvas a large amount of area and mark potential threats without any human interaction. Using a 3-dimensional finite-difference time domain model, GPR data is simulated for sixteen statistically different rough surfaces, nine different target locations, and two target casings, amounting to 288 unique simulations. The soil modeled is 10% wet Bosnian soil, which is both lossy and dispersive. For comparison, the various simulations are analyzed with both the exact simulated background response and the statistically approximated background response. Ultimately, using the approximated
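
    The travel-time step described above (correlating extracted target reflections with a reference signal) can be illustrated generically with a cross-correlation delay estimate; the synthetic waveform, sampling rate and pulse shape below are placeholder assumptions, not the dissertation's data.

    # Generic sketch of travel-time estimation by cross-correlating a received
    # GPR trace with a reference pulse; the synthetic signals are placeholders.
    import numpy as np

    fs = 10e9                                  # sampling rate, 10 GS/s (assumed)
    t = np.arange(0, 20e-9, 1.0 / fs)          # 20 ns trace

    def ricker(t, t0, fc=1.5e9):
        """Ricker (Mexican-hat) pulse centred at t0 with centre frequency fc."""
        a = (np.pi * fc * (t - t0))**2
        return (1.0 - 2.0 * a) * np.exp(-a)

    reference = ricker(t, t0=2e-9)
    true_delay = 6.4e-9                        # simulated extra two-way travel time
    trace = 0.3 * ricker(t, t0=2e-9 + true_delay) + 0.01 * np.random.randn(t.size)

    # Full cross-correlation; the lag of its peak estimates the extra travel time.
    xcorr = np.correlate(trace, reference, mode="full")
    lag = np.argmax(xcorr) - (reference.size - 1)
    print(f"estimated delay: {lag / fs * 1e9:.2f} ns (true {true_delay * 1e9:.2f} ns)")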

  8. A Fast Method for Embattling Optimization of Ground-Based Radar Surveillance Network

    Science.gov (United States)

    Jiang, H.; Cheng, H.; Zhang, Y.; Liu, J.

    A growing number of space activities have created an orbital debris environment that poses increasing impact risks to existing space systems and human space flight. For the safety of in-orbit spacecraft, many observation facilities are needed to catalogue space objects, especially in low Earth orbit. Surveillance of low Earth orbit objects relies mainly on ground-based radar; owing to the capability limitations of existing radar facilities, a large number of ground-based radars will need to be built in the next few years to meet current space surveillance demands. How to optimize the deployment (embattling) of the ground-based radar surveillance network is therefore a problem that needs to be solved. The traditional method for embattling optimization of a ground-based radar surveillance network is mainly to run detection simulations of all possible stations with catalogued data, make a comprehensive comparative analysis of the various simulation results with a combinational method, and then select an optimal result as the station layout scheme. This method is time consuming for a single simulation and has high computational complexity for the combinational analysis; when the number of stations increases, the complexity of the optimization problem grows exponentially and cannot be handled with the traditional method, and no better approach has been available until now. In this paper, the target detection procedure is simplified. Firstly, the space coverage of a ground-based radar is simplified and a space coverage projection model of radar facilities at different orbit altitudes is built; then a simplified model of objects crossing the radar coverage is established according to the characteristics of space object orbital motion. After these two simplification steps, the computational complexity of target detection is greatly reduced, and simulation results show the correctness of the simplified results. In addition, the detection areas of the ground-based radar network can be easily computed with the
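
    The paper's projection model is only summarised above, so the sketch below merely illustrates the kind of geometric simplification involved: projecting a ground radar's coverage onto the spherical shell at a given orbit altitude, where the Earth-central half-angle of the footprint follows from a minimum elevation mask (a placeholder assumption) via lambda = arccos(Re cos(el) / (Re + h)) - el.

    # Illustrative projection of a ground-based radar's coverage onto the shell at
    # orbit altitude h: Earth-central half-angle and footprint area on that shell.
    # The minimum-elevation mask is a placeholder assumption, not from the paper.
    import math

    RE = 6371.0  # mean Earth radius, km

    def coverage_half_angle(h_km, min_elevation_deg):
        """Earth-central angle (rad) from the sub-radar point to the edge of
        coverage at altitude h, for a minimum elevation mask."""
        el = math.radians(min_elevation_deg)
        return math.acos(RE * math.cos(el) / (RE + h_km)) - el

    def coverage_area(h_km, min_elevation_deg):
        """Spherical-cap area (km^2) covered on the shell of radius RE + h."""
        lam = coverage_half_angle(h_km, min_elevation_deg)
        r = RE + h_km
        return 2.0 * math.pi * r**2 * (1.0 - math.cos(lam))

    for h in (400.0, 800.0, 1500.0):
        lam = math.degrees(coverage_half_angle(h, 10.0))
        print(f"h = {h:6.0f} km: half-angle {lam:5.1f} deg, "
              f"area {coverage_area(h, 10.0):.3e} km^2")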

  9. Mammography Facilities

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Mammography Facility Database is updated periodically based on information received from the four FDA-approved accreditation bodies: the American College of...

  10. Health Facilities

    Science.gov (United States)

    Health facilities are places that provide health care. They include hospitals, clinics, outpatient care centers, and specialized care centers, such as birthing centers and psychiatric care centers. When you ...

  11. Canyon Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — B Plant, T Plant, U Plant, PUREX, and REDOX (see their links) are the five facilities at Hanford where the original objective was plutonium removal from the uranium...

  12. Investigation of Pharmaceutical Residues in Hospital Effluents, in Ground- and Drinking Water from Bundeswehr Facilities, and their Removal During Drinking Water Purification (Arzneimittelrueckstaende in Trinkwasser(versorgungsanlagen) und Krankenhausabwaessern der Bundeswehr: Methodenentwicklung - Verkommen - Wasseraufbereitung)

    Science.gov (United States)

    1999-11-01

    Fluorchinolo- ne ( Ciprofloxacin , Norfloxacin , Enrofloxacin, Ofloxacin), Chloramphenicol, Lincomycin, Clindamycin und Trimethoprim mit Konzentrationen bis in den...water from Bundeswehr facilities, and their removal during drinking water purification) 6. AUTHOR(S) Th. Heberer, Dirk Feldmann, Marc Adam, Kirsten...occurrence and the removal of pharmaceutical residues was investigated In a scientific research project (InSan I 1299-V-7502) entitled "Investigation

  13. Implementation is crucial but must be neurobiologically grounded. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    Science.gov (United States)

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias; Small, Steven L.

    2014-09-01

    From the perspective of language, Fitch's [1] claim that theories of cognitive computation should not be separated from those of implementation surely deserves applauding. Recent developments in the Cognitive Neuroscience of Language, leading to the new field of the Neurobiology of Language [2-4], emphasise precisely this point: rather than attempting to simply map cognitive theories of language onto the brain, we should aspire to understand how the brain implements language. This perspective resonates with many of the points raised by Fitch in his review, such as the discussion of unhelpful dichotomies (e.g., Nature versus Nurture). Cognitive dichotomies and debates have repeatedly turned out to be of limited usefulness when it comes to understanding language in the brain. The famous modularity-versus-interactivity and dual route-versus-connectionist debates are cases in point: in spite of hundreds of experiments using neuroimaging (or other techniques), or the construction of myriad computer models, little progress has been made in their resolution. This suggests that dichotomies proposed at a purely cognitive (or computational) level without consideration of biological grounding appear to be "asking the wrong questions" about the neurobiology of language. In accordance with these developments, several recent proposals explicitly consider neurobiological constraints while seeking to explain language processing at a cognitive level (e.g. [5-7]).

  14. 'Grounded' Politics

    DEFF Research Database (Denmark)

    Schmidt, Garbi

    2012-01-01

    play within one particular neighbourhood: Nørrebro in the Danish capital, Copenhagen. The article introduces the concept of grounded politics to analyse how groups of Muslim immigrants in Nørrebro use the space, relationships and history of the neighbourhood for identity political statements....... The article further describes how national political debates over the Muslim presence in Denmark affect identity political manifestations within Nørrebro. By using Duncan Bell’s concept of mythscape (Bell, 2003), the article shows how some political actors idealize Nørrebro’s past to contest the present...

  15. Engineering a Multimission Approach to Navigation Ground Data System Operations

    Science.gov (United States)

    Gerasimatos, Dimitrios V.; Attiyah, Ahlam A.

    2012-01-01

    The Mission Design and Navigation (MDNAV) Section at the Jet Propulsion Laboratory (JPL) supports many deep space and earth orbiting missions from formulation to end of mission operations. The requirements of these missions are met with a multimission approach to MDNAV ground data system (GDS) infrastructure capable of being shared and allocated in a seamless and consistent manner across missions. The MDNAV computing infrastructure consists of compute clusters, network attached storage, mission support area facilities, and desktop hardware. The multimission architecture allows these assets, and even personnel, to be leveraged effectively across the project lifecycle and across multiple missions simultaneously. It provides a more robust and capable infrastructure to each mission than might be possible if each constructed its own. It also enables a consistent interface and environment within which teams can conduct all mission analysis and navigation functions including: trajectory design; ephemeris generation; orbit determination; maneuver design; and entry, descent, and landing analysis. The savings of these efficiencies more than offset the costs of increased complexity and other challenges that had to be addressed: configuration management, scheduling conflicts, and competition for resources. This paper examines the benefits of the multimission MDNAV ground data system infrastructure, focusing on the hardware and software architecture. The result is an efficient, robust, scalable MDNAV ground data system capable of supporting more than a dozen active missions at once.

  16. 30 CFR 77.1608 - Dumping facilities.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Dumping facilities. 77.1608 Section 77.1608... Haulage § 77.1608 Dumping facilities. (a) Dumping locations and haulage roads shall be kept reasonably free of water, debris, and spillage. (b) Where the ground at a dumping place may fail to support...

  17. Brayton Isotope Power System (BIPS) facility specification

    Energy Technology Data Exchange (ETDEWEB)

    1976-05-31

    General requirements for the Brayton Isotope Power System (BIPS)/Ground Demonstration System (GDS) assembly and test facility are defined. The facility will include provisions for a complete test laboratory for GDS checkout, performance, and endurance testing, and a contamination-controlled area for assembly, fabrication, storage, and storage preparation of GDS components. Specifications, schedules, and drawings are included.

  18. Enhanced computational efficiency in the direct determination of the two-electron reduced density matrix from the anti-Hermitian contracted Schrödinger equation with application to ground and excited states of conjugated π-systems

    Energy Technology Data Exchange (ETDEWEB)

    Sand, Andrew M.; Mazziotti, David A., E-mail: damazz@uchicago.edu [Department of Chemistry and The James Franck Institute, The University of Chicago, Chicago, Illinois 60637 (United States)

    2015-10-07

    Determination of the two-electron reduced density matrix (2-RDM) from the solution of the anti-Hermitian contracted Schrödinger equation (ACSE) yields accurate energies and properties for both ground and excited states. Here, we develop a more efficient method to solving the ACSE that uses second-order information to select a more optimal step towards the solution. Calculations on the ground and excited states of water, hydrogen fluoride, and conjugated π systems show that the improved ACSE algorithm is 10-20 times faster than the previous ACSE algorithm. The ACSE can treat both single- and multi-reference electron correlation with the initial 2-RDM from a complete-active-space self-consistent-field (CASSCF) calculation. Using the improved algorithm, we explore the relationship between truncation of the active space in the CASSCF calculation and the accuracy of the energy and 2-RDM from the ACSE calculation. The accuracy of the ACSE, we find, is less sensitive to the size of the active space than the accuracy of other wavefunction methods, which is useful when large active space calculations are computationally infeasible.

  19. Enhanced computational efficiency in the direct determination of the two-electron reduced density matrix from the anti-Hermitian contracted Schrödinger equation with application to ground and excited states of conjugated π-systems.

    Science.gov (United States)

    Sand, Andrew M; Mazziotti, David A

    2015-10-01

    Determination of the two-electron reduced density matrix (2-RDM) from the solution of the anti-Hermitian contracted Schrödinger equation (ACSE) yields accurate energies and properties for both ground and excited states. Here, we develop a more efficient method to solving the ACSE that uses second-order information to select a more optimal step towards the solution. Calculations on the ground and excited states of water, hydrogen fluoride, and conjugated π systems show that the improved ACSE algorithm is 10-20 times faster than the previous ACSE algorithm. The ACSE can treat both single- and multi-reference electron correlation with the initial 2-RDM from a complete-active-space self-consistent-field (CASSCF) calculation. Using the improved algorithm, we explore the relationship between truncation of the active space in the CASSCF calculation and the accuracy of the energy and 2-RDM from the ACSE calculation. The accuracy of the ACSE, we find, is less sensitive to the size of the active space than the accuracy of other wavefunction methods, which is useful when large active space calculations are computationally infeasible.

  20. Thin-section computed tomography-histopathologic comparisons of pulmonary focal interstitial fibrosis, atypical adenomatous hyperplasia, adenocarcinoma in situ, and minimally invasive adenocarcinoma with pure ground-glass opacity.

    Science.gov (United States)

    Si, Ming-Jue; Tao, Xiao-Feng; Du, Guang-Ye; Cai, Ling-Ling; Han, Hong-Xiu; Liang, Xi-Zi; Zhao, Jiang-Min

    2016-10-01

    To retrospectively compare focal interstitial fibrosis (FIF), atypical adenomatous hyperplasia (AAH), adenocarcinoma in situ (AIS), and minimally invasive adenocarcinoma (MIA) with pure ground-glass opacity (GGO) using thin-section computed tomography (CT). Sixty pathologically confirmed cases were reviewed including 7 cases of FIF, 17 of AAH, 23of AIS, and 13 of MIA. All nodules kept pure ground glass appearances before surgical resection and their last time of thin-section CT imaging data before operation were collected. Differences of patient demographics and CT features were compared among these four types of lesions. FIF occurred more frequently in males and smokers while the others occurred more frequently in female nonsmokers. Nodule size was significant larger in MIA (P0.05) in age, malignant history, attenuation value, location, and presence of bubble-like lucency. A nodule size >7.5mm increases the possibility of MIA. A concave margin could be useful for differentiation of FIF from the other malignant or pre-malignant GGO nodules. The presence of spiculation or pleural indentation may preclude the diagnosis of AAH. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  1. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground-Based Computation and Control Systems, Exploration, and Human Health and Safety

    Science.gov (United States)

    Koontz, Steve

    2015-01-01

    In this presentation a review of galactic cosmic ray (GCR) effects on microelectronic systems and human health and safety is given. The methods used to evaluate and mitigate unwanted cosmic ray effects in ground-based, atmospheric flight, and space flight environments are also reviewed. However not all GCR effects are undesirable. We will also briefly review how observation and analysis of GCR interactions with planetary atmospheres and surfaces and reveal important compositional and geophysical data on earth and elsewhere. About 1000 GCR particles enter every square meter of Earth’s upper atmosphere every second, roughly the same number striking every square meter of the International Space Station (ISS) and every other low- Earth orbit spacecraft. GCR particles are high energy ionized atomic nuclei (90% protons, 9% alpha particles, 1% heavier nuclei) traveling very close to the speed of light. The GCR particle flux is even higher in interplanetary space because the geomagnetic field provides some limited magnetic shielding. Collisions of GCR particles with atomic nuclei in planetary atmospheres and/or regolith as well as spacecraft materials produce nuclear reactions and energetic/highly penetrating secondary particle showers. Three twentieth century technology developments have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools needed to design, test, and verify the safety and reliability of modern complex technological systems and assess effects on human health and safety effects. The key technology developments are: 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems. Space and geophysical exploration needs drove the development of the instruments and analytical tools needed to recover compositional and structural data from GCR induced nuclear reactions and secondary particle showers. Finally, the

  2. Significant RF-EMF and thermal levels observed in a computational model of a person with a tibial plate for grounded 40 MHz exposure.

    Science.gov (United States)

    McIntosh, Robert L; Iskra, Steve; Anderson, Vitas

    2014-05-01

    Using numerical modeling, a worst-case scenario is considered when a person with a metallic implant is exposed to a radiofrequency (RF) electromagnetic field (EMF). An adult male standing on a conductive ground plane was exposed to a 40 MHz vertically polarized plane wave field, close to whole-body resonance where maximal induced current flows are expected in the legs. A metal plate (50-300 mm long) was attached to the tibia in the left leg. The findings from this study re-emphasize the need to ensure compliance with limb current reference levels for exposures near whole-body resonance, and not just rely on compliance with ambient electric (E) and magnetic (H) field reference levels. Moreover, we emphasize this recommendation for someone with a tibial plate, as failure to comply may result in significant tissue damage (increases in the localized temperature of 5-10 °C were suggested by the modeling for an incident E-field of 61.4 V/m root mean square (rms)). It was determined that the occupational reference level for limb current (100 mA rms), as stipulated in the 1998 guidelines of the International Commission on Non-Ionizing Radiation Protection (ICNIRP), is satisfied if the plane wave incident E-field levels are no more than 29.8 V/m rms without an implant and 23.4 V/m rms for the model with a 300 mm implant.

  3. Cleanup Verification Package for the 618-2 Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    W. S. Thompson

    2006-12-28

    This cleanup verification package documents completion of remedial action for the 618-2 Burial Ground, also referred to as Solid Waste Burial Ground No. 2; Burial Ground No. 2; 318-2; and Dry Waste Burial Site No. 2. This waste site was used primarily for the disposal of contaminated equipment, materials and laboratory waste from the 300 Area Facilities.

  4. Ground crewmen maneuver the Helios Prototype flying wing on its ground support dolly during function

    Science.gov (United States)

    2001-01-01

    Ground crewmen maneuver AeroVironment's solar-powered Helios Prototype flying wing on its ground support dolly during functional checkouts prior to its first flights under solar power from the U.S. Navy's Pacific Missile Range Facility on Kaua'i, Hawaii.

  5. Measurement of ground motion in various sites

    Energy Technology Data Exchange (ETDEWEB)

    Bialowons, W.; Amirikas, R.; Bertolini, A.; Kruecker, D.

    2007-04-15

    Ground vibrations may affect low emittance beam transport in linear colliders, Free Electron Lasers (FEL) and synchrotron radiation facilities. This paper is an overview of a study program to measure ground vibrations in various sites which can be used for site characterization in relation to accelerator design. Commercial broadband seismometers have been used to measure ground vibrations and the resultant database is available to the scientific community. The methodology employed is to use the same equipment and data analysis tools for ease of comparison. This database of ground vibrations taken in 19 sites around the world is first of its kind. (orig.)

  6. Reading visually embodied meaning from the brain: Visually grounded computational models decode visual-object mental imagery induced by written text.

    Science.gov (United States)

    Anderson, Andrew James; Bruni, Elia; Lopopolo, Alessandro; Poesio, Massimo; Baroni, Marco

    2015-10-15

    Embodiment theory predicts that mental imagery of object words recruits neural circuits involved in object perception. The degree of visual imagery present in routine thought and how it is encoded in the brain is largely unknown. We test whether fMRI activity patterns elicited by participants reading objects' names include embodied visual-object representations, and whether we can decode the representations using novel computational image-based semantic models. We first apply the image models in conjunction with text-based semantic models to test predictions of visual-specificity of semantic representations in different brain regions. Representational similarity analysis confirms that fMRI structure within ventral-temporal and lateral-occipital regions correlates most strongly with the image models and conversely text models correlate better with posterior-parietal/lateral-temporal/inferior-frontal regions. We use an unsupervised decoding algorithm that exploits commonalities in representational similarity structure found within both image model and brain data sets to classify embodied visual representations with high accuracy (8/10) and then extend it to exploit model combinations to robustly decode different brain regions in parallel. By capturing latent visual-semantic structure our models provide a route into analyzing neural representations derived from past perceptual experience rather than stimulus-driven brain activity. Our results also verify the benefit of combining multimodal data to model human-like semantic representations. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Asian Facilities

    Science.gov (United States)

    Nakahata, M.

    2011-04-01

    Asian underground facilities are reviewed. The YangYang underground Laboratory in Korea and the Kamioka observatory in Japan are operational and several astrophysical experiments are running. Indian Neutrino Observatory(INO) and China JinPing Underground Laboratory (CJPL) are under construction and underground experiments are being prepared. Current activities and future prospects at those underground sites are described.

  8. Regional analysis of ground and above-ground climate

    Energy Technology Data Exchange (ETDEWEB)

    1981-12-01

    The regional suitability of underground construction as a climate control technique is discussed with reference to (1) a bioclimatic analysis of long-term weather data for 29 locations in the United States to determine appropriate above ground climate control techniques, (2) a data base of synthesized ground temperatures for the coterminous United States, and (3) monthly dew point ground temperature comparisons for identifying the relative likelihood of condensation from one region to another. It is concluded that the suitability of earth tempering as a practice and of specific earth-sheltered design stereotypes varies geographically; while the subsurface almost always provides a thermal advantage on its own terms when compared to above ground climatic data, it can, nonetheless, compromise the effectiveness of other, regionally more important climate control techniques. Also contained in the report are reviews of above and below ground climate mapping schemes related to human comfort and architectural design, and detailed description of a theoretical model of ground temperature, heat flow, and heat storage in the ground. Strategies of passive climate control are presented in a discussion of the building bioclimatic analysis procedure which has been applied in a computer analysis of 30 years of weather data for each of 29 locations in the United States.

  9. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-94, with projections to 2020; (supplement one to U.S. Geological Survey Water-resources investigations report 94-4251)

    Science.gov (United States)

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.). Output files resulting from the computer simulations are included for reference.

  10. Spacelab Ground Processing

    Science.gov (United States)

    Scully, Edward J.; Gaskins, Roger B.

    1982-02-01

    Spacelab (SL) ground processing is active at the Kennedy Space Center (KSC). The palletized payload for the second Shuttle launch is staged and integrated with interface verification active. The SL Engineering Model is being assembled for subsequent test and checkout activities. After delivery of SL flight elements from Europe, prelaunch operations for the first SL flight start with receipt of the flight experiment packages and staging of the SL hardware. Experiment operations consist of integrating the various experiment elements into the SL racks, floors and pallets. Rack and floor assemblies with the experiments installed, are integrated into the flight module. Aft end-cone installation, pallet connections, and SL subsystems interface verifications are accomplished, and SL-Orbiter interfaces verified. The Spacelab cargo is then transferred to the Orbiter Processing Facility (OPF) in a controlled environment using a canister/transporter. After the SL is installed into the Orbiter payload bay, physical and functional integrity of all payload-to-Orbiter interfaces are verified and final close-out operations conducted. Spacelab payload activities at the launch pad are minimal with the payload bay doors remaining closed. Limited access is available to the module through the Spacelab Transfer Tunnel. After mission completion, the SL is removed from the Orbiter in the OPF and returned to the SL processing facility for experiment equipment removal and reconfiguration for the subsequent mission.

  11. Scalable distributed computing hierarchy: cloud, fog and dew computing

    OpenAIRE

    Skala, Karolj; Davidović, Davor; Afgan, Enis; Sović, Ivan; Šojat, Zorislav

    2015-01-01

    The paper considers the conceptual approach for organization of the vertical hierarchical links between the scalable distributed computing paradigms: Cloud Computing, Fog Computing and Dew Computing. In this paper, the Dew Computing is described and recognized as a new structural layer in the existing distributed computing hierarchy. In the existing computing hierarchy, the Dew computing is positioned as the ground level for the Cloud and Fog computing paradigms. Vertical, complementary, hier...

  12. Arc Heating Facility and Test Technique for Planetary Entry Missions

    OpenAIRE

    2003-01-01

    A 1-MW segmented-type arc heater has been designed and installed in the ISAS high enthalpy flow facility for the purpose of basic study of aerothermophysics and the development of thermal protection materials for the atmospheric hypersonic vehicles. The aerothermophysical flight environment for the vehicles, generally speaking, can not be duplicated in the ground facility. In most cases of vehicles reentering with super-orbital velocity, the flow enthalpy of the ground facility submits to be ...

  13. Research and design of asset management system for coal mine facilities based on cloud computing%基于云计算的煤矿设备资产管理系统的研究与设计

    Institute of Scientific and Technical Information of China (English)

    李勇

    2014-01-01

    The current situation of asset management of coal mine facilities was introduced, the existing typical problems were analyzed,and the new asset management plan was proposed on the basis of cloud computing. Furthermore,the general plan and the system frame were intro-duced in detail. And the key techniques for the system were discussed at last.%介绍了煤矿设备资产管理方案的现状,分析了煤矿设备资产管理存在的典型问题,提出了基于云计算技术的煤矿设备资产管理方案,详细介绍了系统的总体方案和系统框架,最后对系统所涉及的关键技术进行了论述。

  14. Guide to user facilities at the Lawrence Berkeley Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    1984-04-01

    Lawrence Berkeley Laboratories' user facilities are described. Specific facilities include: the National Center for Electron Microscopy; the Bevalac; the SuperHILAC; the Neutral Beam Engineering Test Facility; the National Tritium Labeling Facility; the 88 inch Cyclotron; the Heavy Charged-Particle Treatment Facility; the 2.5 MeV Van de Graaff; the Sky Simulator; the Center for Computational Seismology; and the Low Background Counting Facility. (GHT)

  15. On the validation of SPDM task verification facility

    NARCIS (Netherlands)

    Ma, Ou; Wang, Jiegao; Misra, Sarthak; Liu, Michael

    2004-01-01

    This paper describes a methodology for validating a ground-based, hardware-in-the-loop, space-robot simulation facility. This facility, called ‘‘SPDM task verification facility,’’ is being developed by the Canadian Space Agency for the purpose of verifying the contact dynamics performance of the spe

  16. Space and Ground-Based Infrastructures

    Science.gov (United States)

    Weems, Jon; Zell, Martin

    This chapter deals first with the main characteristics of the space environment, outside and inside a spacecraft. Then the space and space-related (ground-based) infrastructures are described. The most important infrastructure is the International Space Station, which holds many European facilities (for instance the European Columbus Laboratory). Some of them, such as the Columbus External Payload Facility, are located outside the ISS to benefit from external space conditions. There is only one other example of orbital platforms, the Russian Foton/Bion Recoverable Orbital Capsule. In contrast, non-orbital weightless research platforms, although limited in experimental time, are more numerous: sounding rockets, parabolic flight aircraft, drop towers and high-altitude balloons. In addition to these facilities, there are a number of ground-based facilities and space simulators, for both life sciences (for instance: bed rest, clinostats) and physical sciences (for instance: magnetic compensation of gravity). Hypergravity can also be provided by human and non-human centrifuges.

  17. A Bioinformatics Facility for NASA

    Science.gov (United States)

    Schweighofer, Karl; Pohorille, Andrew

    2006-01-01

    Building on an existing prototype, we have fielded a facility with bioinformatics technologies that will help NASA meet its unique requirements for biological research. This facility consists of a cluster of computers capable of performing computationally intensive tasks, software tools, databases and knowledge management systems. Novel computational technologies for analyzing and integrating new biological data and already existing knowledge have been developed. With continued development and support, the facility will fulfill strategic NASA s bioinformatics needs in astrobiology and space exploration. . As a demonstration of these capabilities, we will present a detailed analysis of how spaceflight factors impact gene expression in the liver and kidney for mice flown aboard shuttle flight STS-108. We have found that many genes involved in signal transduction, cell cycle, and development respond to changes in microgravity, but that most metabolic pathways appear unchanged.

  18. Trends in Facility Management Technology: The Emergence of the Internet, GIS, and Facility Assessment Decision Support.

    Science.gov (United States)

    Teicholz, Eric

    1997-01-01

    Reports research on trends in computer-aided facilities management using the Internet and geographic information system (GIS) technology for space utilization research. Proposes that facility assessment software holds promise for supporting facility management decision making, and outlines four areas for its use: inventory; evaluation; reporting;…

  19. Ground Truth Annotation in T Analyst

    DEFF Research Database (Denmark)

    2015-01-01

    This video shows how to annotate the ground truth tracks in the thermal videos. The ground truth tracks are produced to be able to compare them to tracks obtained from a Computer Vision tracking approach. The program used for annotation is T-Analyst, which is developed by Aliaksei Laureshyn, Ph...

  20. Ground Truth Annotation in T Analyst

    DEFF Research Database (Denmark)

    2015-01-01

    This video shows how to annotate the ground truth tracks in the thermal videos. The ground truth tracks are produced to be able to compare them to tracks obtained from a Computer Vision tracking approach. The program used for annotation is T-Analyst, which is developed by Aliaksei Laureshyn, Ph...

  1. Emission Facilities - Erosion & Sediment Control Facilities

    Data.gov (United States)

    NSGIC Education | GIS Inventory — An Erosion and Sediment Control Facility is a DEP primary facility type related to the Water Pollution Control program. The following sub-facility types related to...

  2. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020; (supplement three to U.S. Geological Survey Water-resources investigations report 94-4251)

    Science.gov (United States)

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) and revised by Kernodle (Kernodle, J.M., 1998, Simulation of ground-water flow in the Albuquerque Basin, 1901-95, with projections to 2020 (supplement two to U.S. Geological Survey Water-Resources Investigations Report 94-4251): U.S. Geological Survey Open-File Report 96-209, 54 p.). Output files resulting from the computer simulations are included for reference.

  3. Are There Any Additional Benefits to Performing Positron Emission Tomography/Computed Tomography Scans and Brain Magnetic Resonance Imaging on Patients with Ground-Glass Nodules Prior to Surgery?

    Science.gov (United States)

    Song, Jae Uk; Song, Junwhi; Lee, Kyung Jong; Kim, Hojoong; Kwon, O Jung; Choi, Joon Young; Kim, Jhingook; Han, Joungho; Um, Sang Won

    2017-10-01

    A ground-glass nodule (GGN) represents early-stage lung adenocarcinoma. However, there is still no consensus for preoperative staging of GGNs. Therefore, we evaluated the need for the routine use of positron emission tomography/computed tomography (PET)/computed tomography (CT) scans and brain magnetic resonance imaging (MRI) during staging. A retrospective analysis was undertaken in 72 patients with 74 GGNs of less than 3 cm in diameter, which were confirmed via surgery as malignancy, at the Samsung Medical Center between May 2010 and December 2011. The median age of the patients was 59 years. The median GGN diameter was 18 mm. Pure and part-solid GGNs were identified in 35 (47.3%) and 39 (52.7%) cases, respectively. No mediastinal or distant metastasis was observed in these patients. In preoperative staging, all of the 74 GGNs were categorized as stage IA via chest CT scans. Additional PET/CT scans and brain MRIs classified 71 GGNs as stage IA, one as stage IIIA, and two as stage IV. However, surgery and additional diagnostic work-ups for abnormal findings from PET/CT scans classified 70 GGNs as stage IA, three as stage IB, and one as stage IIA. The chest CT scans did not differ from the combined modality of PET/CT scans and brain MRIs for the determination of the overall stage (94.6% vs. 90.5%; kappa value, 0.712). PET/CT scans in combination with brain MRIs have no additional benefit for the staging of patients with GGN lung adenocarcinoma before surgery.

  4. A non-randomized confirmatory trial of segmentectomy for clinical T1N0 lung cancer with dominant ground glass opacity based on thin-section computed tomography (JCOG1211).

    Science.gov (United States)

    Aokage, Keiju; Saji, Hisashi; Suzuki, Kenji; Mizutani, Tomonori; Katayama, Hiroshi; Shibata, Taro; Watanabe, Syunichi; Asamura, Hisao

    2017-05-01

    Lobectomy has been the standard surgery for even stage I lung cancer since the validity of limited resection for stage I lung cancer was denied by the randomized study reported in 1995. The aim of this non-randomized confirmatory going on since September 2013 is to confirm the efficacy of a segmentectomy for clinical T1N0 lung cancer with dominant ground glass opacity based on thin-slice computed tomography. A total of 390 patients from 42 Japanese institutions are recruited within 4 years. The primary endpoint of this study is a 5-year relapse-free survival in all of the patients who undergo a segmentectomy for a lung nodule. The secondary endpoints are overall survival, annual relapse-free survival, disease-free survival, proportion of local relapse, postoperative pulmonary function, proportion of segmentectomy completion, proportion of R0 resection completion by segmentectomy, adverse events, and serious adverse events. This trial has been registered at the UMIN Clinical Trials Registry as UMIN000011819 ( http://www.umin.ac.jp/ctr/ ). Patient's accrual has been already finished in November, 2015 and the primary analysis will be performed in 2021. This study is one of the pivotal trial of lung segmentectomy for early lung cancer. The result will provide a clear evidence for our daily clinics and will be possible contribution to preserving pulmonary function for lung cancer patients.

  5. Facile synthesis of silver nanoparticles and its antibacterial activity against Escherichia coli and unknown bacteria on mobile phone touch surfaces/computer keyboards

    Science.gov (United States)

    Reddy, T. Ranjeth Kumar; Kim, Hyun-Joong

    2016-07-01

    In recent years, there has been significant interest in the development of novel metallic nanoparticles using various top-down and bottom-up synthesis techniques. Kenaf is a huge biomass product and a potential component for industrial applications. In this work, we investigated the green synthesis of silver nanoparticles (AgNPs) by using kenaf ( Hibiscus cannabinus) cellulose extract and sucrose, which act as stabilizing and reducing agents in solution. With this method, by changing the pH of the solution as a function of time, we studied the optical, morphological and antibacterial properties of the synthesized AgNPs. In addition, these nanoparticles were characterized by Ultraviolet-visible spectroscopy, transmission electron microscopy (TEM), field-emission scanning electron microscopy, Fourier transform infrared (FTIR) spectroscopy and energy-dispersive X-ray spectroscopy (EDX). As the pH of the solution varies, the surface plasmon resonance peak also varies. A fast rate of reaction at pH 10 compared with that at pH 5 was identified. TEM micrographs confirm that the shapes of the particles are spherical and polygonal. Furthermore, the average size of the nanoparticles synthesized at pH 5, pH 8 and pH 10 is 40.26, 28.57 and 24.57 nm, respectively. The structure of the synthesized AgNPs was identified as face-centered cubic (fcc) by XRD. The compositional analysis was determined by EDX. FTIR confirms that the kenaf cellulose extract and sucrose act as stabilizing and reducing agents for the silver nanoparticles. Meanwhile, these AgNPs exhibited size-dependent antibacterial activity against Escherichia coli ( E. coli) and two other unknown bacteria from mobile phone screens and computer keyboard surfaces.

  6. Ground Motion Models and Computer Techniques

    Science.gov (United States)

    1972-04-01

    of the Ouid pressure through the pores of the rock. Both the cime rate of loading in the rock matrix and the pore fluid are depicted. The...oU 10 m«t the lull range" of pressure and strain encountered in undo rg.oun tests Gene , aliped Moh.-Coulonb nodels include one without work

  7. Computer surety: computer system inspection guidance. [Contains glossary

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  8. Air Quality Facilities

    Data.gov (United States)

    Iowa State University GIS Support and Research FacilityFacilities with operating permits for Title V of the Federal Clean Air Act, as well as facilities required to submit an air emissions inventory, and other facilities...

  9. Theme: Laboratory Facilities Improvement.

    Science.gov (United States)

    Miller, Glen M.; And Others

    1993-01-01

    Includes "Laboratory Facilities Improvement" (Miller); "Remodeling Laboratories for Agriscience Instruction" (Newman, Johnson); "Planning for Change" (Mulcahy); "Laboratory Facilities Improvement for Technology Transfer" (Harper); "Facilities for Agriscience Instruction" (Agnew et al.); "Laboratory Facility Improvement" (Boren, Dwyer); and…

  10. The Galileo Ground Segment Integrity Algorithms: Design and Performance

    Directory of Open Access Journals (Sweden)

    Carlos Hernández Medel

    2008-01-01

    Full Text Available Galileo, the European Global Navigation Satellite System, will provide to its users highly accurate global positioning services and their associated integrity information. The element in charge of the computation of integrity messages within the Galileo Ground Mission Segment is the integrity processing facility (IPF, which is developed by GMV Aerospace and Defence. The main objective of this paper is twofold: to present the integrity algorithms implemented in the IPF and to show the achieved performance with the IPF software prototype, including aspects such as: implementation of the Galileo overbounding concept, impact of safety requirements on the algorithm design including the threat models for the so-called feared events, and finally the achieved performance with real GPS and simulated Galileo scenarios.

  11. Water Activities in Laxemar Simpevarp. The final disposal facility for spent nuclear fuel - removal of groundwater and water activities above ground; Vattenverksamhet i Laxemar-Simpevarp. Slutfoervarsanlaeggning foer anvaent kaernbraensle - bortledande av grundvatten samt vattenverksamheter ovan mark

    Energy Technology Data Exchange (ETDEWEB)

    Werner, Kent (EmpTec (Sweden)); Hamren, Ulrika; Collinder, Per (Ekologigruppen AB (Sweden))

    2010-12-15

    This report concerns water operations (Chapter 11 in the Environmental Code) below and above ground associated with construction, operation, and decommissioning of a repository for spent nuclear fuel in Laxemar in the municipality of Oskarshamn. SKB has chosen Forsmark in the municipality of Oesthammar as site for the repository, and the report hence describes a non-chosen alternative. The report provides a comprehensive description of how the water operations would be executed, their hydrogeological and hydrological effects and the resulting consequences. The description is a background material for comparisons between the two sites in terms of water operations. The underground part of a repository in Laxemar would, among other things, consist of an access ramp and a repository area at a depth of approximately 500 metres. The construction, operation, and decommissioning phases would in total comprise a time period of 60-70 years. Inflowing groundwater would be diverted during construction and operation. The modelling tool MIKE SHE has been used to assess the effects of the groundwater diversion, for instance in terms of groundwater levels and stream discharges. According to MIKE SHE calculations for a hypothetical case with a fully open repository, the total groundwater inflow would be in the order of 55-90 litres per second depending on the permeability of the grouted zone around ramp, shafts and tunnels. In reality, the whole repository would not be open simultaneously, and the inflow would therefore be less. The groundwater diversion would cause groundwater- level drawdown in the rock, which in turn would lead to drawdown of the groundwater table in relatively large areas above and around the repository. According to model calculations, there would be an insignificant drawdown of the water level in Lake Frisksjoen, the largest lake in the area. The discharge in the most important stream of the area (Laxemaraan) would be reduced by less than ten percent

  12. 超声速燃烧地面试验的蓄热式加热器及其关键技术%Thermal energy storage heater and its key technologies for supersonic combustion ground test facilities

    Institute of Scientific and Technical Information of China (English)

    李龙飞; 王延涛; 杨伟东; 洪流

    2012-01-01

    为了模拟飞行状态下进入超燃冲压发动机燃烧室的高焓空气,在地面模拟试验中需要对空气加热,可再生蓄热式加热器是一种能提供相对纯净高焓空气的试验设备。介绍了蓄热式加热器的工作原理与特点,分析了关键技术。结果表明,蓄热式加热器具有加热空气总温高、流量大和相对纯净的优点,是我国超燃冲压发动机地面试验的发展趋势,但蓄热阵材料、加热器结构、超高温阀和大范围调节预热燃烧器等是关键技术,有待进一步研究和攻关。%In order to simulate the air temperature in the combustion chamber of scramjet,the air used by the ground simulation test should be heated.This paper presents the design specification of a cored brick storage heater,which can supply high temperature clean air to meet the demands of supersonic combustion experiments.Key technologies of developing thermal energy storage heater are analyzed.The results show that with proper material and structural design,it is possible to use a thermal energy storage heater to obtain clean air flow of Mach 6.However,real performance of materials,the heater structure,the ultra high-temperature valve and gas generator still need to be studied to solve the remaining issues in the thermal energy storage heater.

  13. Procedures for computing site seismicity

    Science.gov (United States)

    Ferritto, John

    1994-02-01

    This report was prepared as part of the Navy's Seismic Hazard Mitigation Program. The Navy has numerous bases located in seismically active regions throughout the world. Safe effective design of waterfront structures requires determining expected earthquake ground motion. The Navy's problem is further complicated by the presence of soft saturated marginal soils that can significantly amplify the levels of seismic shaking as evidenced in the 1989 Loma Prieta earthquake. The Naval Facilities Engineering Command's seismic design manual, NAVFAC P355.l, requires a probabilistic assessment of ground motion for design of essential structures. This report presents the basis for the Navy's Seismic Hazard Analysis procedure that was developed and is intended to be used with the Seismic Hazard Analysis computer program and user's manual. This report also presents data on geology and seismology to establish the background for the seismic hazard model developed. The procedure uses the historical epicenter data base and available geologic data, together with source models, recurrence models, and attenuation relationships to compute the probability distribution of site acceleration and an appropriate spectra. This report discusses the developed stochastic model for seismic hazard evaluation and the associated research.

  14. A ship-borne meteorological station for ground truth measurements

    Digital Repository Service at National Institute of Oceanography (India)

    Desai, R.G.P.; Desa, B.A.E.

    Oceanographic upwelling studies required ground truth measurements of meteorological parameters and sea surface temperature to be made from a research vessel which did not have the necessary facilities. A ship-borne station was therefore designed...

  15. Subsurface investigation with ground penetrating radar

    Science.gov (United States)

    Ground penetrating radar (GPR) data was collected on a small test plot at the OTF/OSU Turfgrass Research & Education Facility in Columbus, Ohio. This test plot was built to USGA standards for a golf course green, with a constructed sand layer just beneath the surface overlying a gravel layer, that i...

  16. Biotechnology Facility (BTF) for ISS

    Science.gov (United States)

    1998-01-01

    Engineering mockup shows the general arrangement of the plarned Biotechnology Facility inside an EXPRESS rack aboard the International Space Station. This layout includes a gas supply module (bottom left), control computer and laptop interface (bottom right), two rotating wall vessels (top right), and support systems.

  17. Cad Graphics in Facilities Planning.

    Science.gov (United States)

    Collier, Linda M.

    1984-01-01

    By applying a computer-aided drafting system to a range of facilities layouts and plans, a division of Tektronix, Inc., Oregon, is maintaining staffing levels with an added workload. The tool is also being used in other areas of the company for illustration, design, and administration. (MLF)

  18. LLNL superconducting magnets test facility

    Energy Technology Data Exchange (ETDEWEB)

    Manahan, R; Martovetsky, N; Moller, J; Zbasnik, J

    1999-09-16

    The FENIX facility at Lawrence Livermore National Laboratory was upgraded and refurbished in 1996-1998 for testing CICC superconducting magnets. The FENIX facility was used for superconducting high current, short sample tests for fusion programs in the late 1980s--early 1990s. The new facility includes a 4-m diameter vacuum vessel, two refrigerators, a 40 kA, 42 V computer controlled power supply, a new switchyard with a dump resistor, a new helium distribution valve box, several sets of power leads, data acquisition system and other auxiliary systems, which provide a lot of flexibility in testing of a wide variety of superconducting magnets in a wide range of parameters. The detailed parameters and capabilities of this test facility and its systems are described in the paper.

  19. Space shuttle/food system study. Volume 2, Appendix G: Ground support system analysis. Appendix H: Galley functional details analysis

    Science.gov (United States)

    1974-01-01

    The capabilities for preflight feeding of flight personnel and the supply and control of the space shuttle flight food system were investigated to determine ground support requirements; and the functional details of an onboard food system galley are shown in photographic mockups. The elements which were identified as necessary to the efficient accomplishment of ground support functions include the following: (1) administration; (2) dietetics; (3) analytical laboratories; (4) flight food warehouse; (5) stowage module assembly area; (6) launch site module storage area; (7) alert crew restaurant and disperse crew galleys; (8) ground food warehouse; (9) manufacturing facilities; (10) transport; and (11) computer support. Each element is discussed according to the design criteria of minimum cost, maximum flexibility, reliability, and efficiency consistent with space shuttle requirements. The galley mockup overview illustrates the initial operation configuration, food stowage locations, meal assembly and serving trays, meal preparation configuration, serving, trash management, and the logistics of handling and cleanup equipment.

  20. Underground Facilities, Technological Challenges

    CERN Document Server

    Spooner, N

    2010-01-01

    This report gives a summary overview of the status of international under- ground facilities, in particular as relevant to long-baseline neutrino physics and neutrino astrophysics. The emphasis is on the technical feasibility aspects of creating the large underground infrastructures that will be needed in the fu- ture to house the necessary detectors of 100 kton to 1000 kton scale. There is great potential in Europe to build such a facility, both from the technical point of view and because Europe has a large concentration of the necessary engi- neering and geophysics expertise. The new LAGUNA collaboration has made rapid progress in determining the feasibility for a European site for such a large detector. It is becoming clear in fact that several locations are technically fea- sible in Europe. Combining this with the possibility of a new neutrino beam from CERN suggests a great opportunity for Europe to become the leading centre of neutrino studies, combining both neutrino astrophysics and neutrino beam stu...

  1. FRACTURING FLUID CHARACTERIZATION FACILITY

    Energy Technology Data Exchange (ETDEWEB)

    Subhash Shah

    2000-08-01

    Hydraulic fracturing technology has been successfully applied for well stimulation of low and high permeability reservoirs for numerous years. Treatment optimization and improved economics have always been the key to the success and it is more so when the reservoirs under consideration are marginal. Fluids are widely used for the stimulation of wells. The Fracturing Fluid Characterization Facility (FFCF) has been established to provide the accurate prediction of the behavior of complex fracturing fluids under downhole conditions. The primary focus of the facility is to provide valuable insight into the various mechanisms that govern the flow of fracturing fluids and slurries through hydraulically created fractures. During the time between September 30, 1992, and March 31, 2000, the research efforts were devoted to the areas of fluid rheology, proppant transport, proppant flowback, dynamic fluid loss, perforation pressure losses, and frictional pressure losses. In this regard, a unique above-the-ground fracture simulator was designed and constructed at the FFCF, labeled ''The High Pressure Simulator'' (HPS). The FFCF is now available to industry for characterizing and understanding the behavior of complex fluid systems. To better reflect and encompass the broad spectrum of the petroleum industry, the FFCF now operates under a new name of ''The Well Construction Technology Center'' (WCTC). This report documents the summary of the activities performed during 1992-2000 at the FFCF.

  2. Medical Image Analysis Facility

    Science.gov (United States)

    1978-01-01

    To improve the quality of photos sent to Earth by unmanned spacecraft. NASA's Jet Propulsion Laboratory (JPL) developed a computerized image enhancement process that brings out detail not visible in the basic photo. JPL is now applying this technology to biomedical research in its Medical lrnage Analysis Facility, which employs computer enhancement techniques to analyze x-ray films of internal organs, such as the heart and lung. A major objective is study of the effects of I stress on persons with heart disease. In animal tests, computerized image processing is being used to study coronary artery lesions and the degree to which they reduce arterial blood flow when stress is applied. The photos illustrate the enhancement process. The upper picture is an x-ray photo in which the artery (dotted line) is barely discernible; in the post-enhancement photo at right, the whole artery and the lesions along its wall are clearly visible. The Medical lrnage Analysis Facility offers a faster means of studying the effects of complex coronary lesions in humans, and the research now being conducted on animals is expected to have important application to diagnosis and treatment of human coronary disease. Other uses of the facility's image processing capability include analysis of muscle biopsy and pap smear specimens, and study of the microscopic structure of fibroprotein in the human lung. Working with JPL on experiments are NASA's Ames Research Center, the University of Southern California School of Medicine, and Rancho Los Amigos Hospital, Downey, California.

  3. Leaders break ground for INFINITY

    Science.gov (United States)

    2008-01-01

    Community leaders from Mississippi and Louisiana break ground for the new INFINITY at NASA Stennis Space Center facility during a Nov. 20 ceremony. Groundbreaking participants included (l to r): Gottfried Construction representative John Smith, Mississippi Highway Commissioner Wayne Brown, INFINITY board member and Apollo 13 astronaut Fred Haise, Stennis Director Gene Goldman, Studio South representative David Hardy, Leo Seal Jr. family representative Virginia Wagner, Hancock Bank President George Schloegel, Mississippi Rep. J.P. Compretta, Mississippi Band of Choctaw Indians representative Charlie Benn and Louisiana Sen. A.G. Crowe.

  4. The Envisat-1 ground segment

    Science.gov (United States)

    Harris, Ray; Ashton, Martin

    1995-03-01

    The European Space Agency (ESA) Earth Remote Sensing Satellite (ERS-1 and ERS-2) missions will be followed by the Polar Orbit Earth Mission (POEM) program. The first of the POEM missions will be Envisat-1. ESA has completed the design phase of the ground segment. This paper presents the main elements of that design. The main part of this paper is an overview of the Payload Data Segment (PDS) which is the core of the Envisat-1 ground segment, followed by two further sections which describe in more detail the facilities to be offered by the PDS for archiving and for user servcies. A further section describes some future issues for ground segment development. Logica was the prime contractor of a team of 18 companies which undertook the ESA financed architectural design study of the Envisat-1 ground segment. The outputs of the study included detailed specifications of the components that will acquire, process, archive and disseminate the payload data, together with the functional designs of the flight operations and user data segments.

  5. ECR ion source based low energy ion beam facility

    Indian Academy of Sciences (India)

    P Kumar; G Rodrigues; U K Rao; C P Safvan; D Kanjilal; A Roy

    2002-11-01

    Mass analyzed highly charged ion beams of energy ranging from a few keV to a few MeV plays an important role in various aspects of research in modern physics. In this paper a unique low energy ion beam facility (LEIBF) set up at Nuclear Science Centre (NSC) for providing low and medium energy multiply charged ion beams ranging from a few keV to a few MeV for research in materials sciences, atomic and molecular physics is described. One of the important features of this facility is the availability of relatively large currents of multiply charged positive ions from an electron cyclotron resonance (ECR) source placed entirely on a high voltage platform. All the electronic and vacuum systems related to the ECR source including 10 GHz ultra high frequency (UHF) transmitter, high voltage power supplies for extractor and Einzel lens are placed on a high voltage platform. All the equipments are controlled using a personal computer at ground potential through optical fibers for high voltage isolation. Some of the experimental facilities available are also described.

  6. Ground state of a confined Yukawa plasma

    CERN Document Server

    Henning, C; Block, D; Bonitz, M; Golubnichiy, V; Ludwig, P; Piel, A

    2006-01-01

    The ground state of an externally confined one-component Yukawa plasma is derived analytically. In particular, the radial density profile is computed. The results agree very well with computer simulations on three-dimensional spherical Coulomb crystals. We conclude by presenting an exact equation for the density distribution for a confinement potential of arbitrary geometry.

  7. Ground state of 16O

    Science.gov (United States)

    Pieper, Steven C.; Wiringa, R. B.; Pandharipande, V. R.

    1990-01-01

    A variational method is used to study the ground state of 16O. Expectation values are computed with a cluster expansion for the noncentral correlations in the wave function; the central correlations and exchanges are treated to all orders by Monte Carlo integration. The expansion has good convergence. Results are reported for the Argonne v14 two-nucleon and Urbana VII three-nucleon potentials.

  8. Ground water and energy

    Energy Technology Data Exchange (ETDEWEB)

    1980-11-01

    This national workshop on ground water and energy was conceived by the US Department of Energy's Office of Environmental Assessments. Generally, OEA needed to know what data are available on ground water, what information is still needed, and how DOE can best utilize what has already been learned. The workshop focussed on three areas: (1) ground water supply; (2) conflicts and barriers to ground water use; and (3) alternatives or solutions to the various issues relating to ground water. (ACR)

  9. 40 CFR 257.3-4 - Ground water.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Ground water. 257.3-4 Section 257.3-4... and Practices § 257.3-4 Ground water. (a) A facility or practice shall not contaminate an underground drinking water source beyond the solid waste boundary or beyond an alternative boundary specified...

  10. DynaMax+ ground-tracking algorithm

    Science.gov (United States)

    Smock, Brandon; Gader, Paul; Wilson, Joseph

    2011-06-01

    In this paper, we propose a new method for performing ground-tracking using ground-penetrating radar (GPR). Ground-tracking involves identifying the air-ground interface, which is usually the dominant feature in a radar image but frequently is obscured or mimicked by other nearby elements. It is an important problem in landmine detection using vehicle-mounted systems because antenna motion, caused by bumpy ground, can introduce distortions in downtrack radar images, which ground-tracking makes it possible to correct. Because landmine detection is performed in real-time, any algorithm for ground-tracking must be able to run quickly, prior to other, more computationally expensive algorithms for detection. In this investigation, we first describe an efficient algorithm, based on dynamic programming, that can be used in real-time for tracking the ground. We then demonstrate its accuracy through a quantitative comparison with other proposed ground-tracking methods, and a qualitative comparison showing that its ground-tracking is consistent with human observations in challenging terrain.
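
    The dynamic-programming idea described above can be illustrated compactly: the air-ground interface is recovered as the depth path through a B-scan that maximizes accumulated signal strength while penalizing large depth jumps between adjacent scans. The Python sketch below is a generic formulation under assumed array shapes and penalty weights, not the authors' DynaMax+ implementation.

        import numpy as np

        def track_ground(bscan, smoothness=2.0, max_jump=3):
            """Estimate the air-ground interface in a GPR B-scan (depth x scans)
            by dynamic programming: maximize accumulated amplitude along a path
            whose depth changes slowly from one scan to the next."""
            n_depth, n_scans = bscan.shape
            score = np.full((n_depth, n_scans), -np.inf)
            back = np.zeros((n_depth, n_scans), dtype=int)
            score[:, 0] = bscan[:, 0]
            for j in range(1, n_scans):
                for d in range(n_depth):
                    lo, hi = max(0, d - max_jump), min(n_depth, d + max_jump + 1)
                    prev = score[lo:hi, j - 1] - smoothness * np.abs(np.arange(lo, hi) - d)
                    k = int(np.argmax(prev))
                    score[d, j] = bscan[d, j] + prev[k]
                    back[d, j] = lo + k
            # Backtrack from the best final-scan depth to recover the interface.
            path = np.zeros(n_scans, dtype=int)
            path[-1] = int(np.argmax(score[:, -1]))
            for j in range(n_scans - 1, 0, -1):
                path[j - 1] = back[path[j], j]
            return path  # estimated interface depth index for each scan

    Because each column transition only examines a window of at most 2*max_jump + 1 depths, the cost grows linearly with the number of scans, which is consistent with the real-time constraint noted above.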

  11. Nuclear thermal propulsion test facility requirements and development strategy

    Science.gov (United States)

    Allen, George C.; Warren, John; Clark, J. S.

    1991-01-01

    The Nuclear Thermal Propulsion (NTP) subpanel of the Space Nuclear Propulsion Test Facilities Panel evaluated facility requirements and strategies for nuclear thermal propulsion systems development. High pressure, solid core concepts were considered as the baseline for the evaluation, with low pressure concepts an alternative. The work of the NTP subpanel revealed that a wealth of facilities already exists to support NTP development, and that only a few new facilities must be constructed. Some modifications to existing facilities will be required. Present funding emphasis should be on long-lead-time items for the major new ground test facility complex and on facilities supporting nuclear fuel development, hot hydrogen flow test facilities, and low power critical facilities.

  12. Nuclear thermal propulsion test facility requirements and development strategy

    Science.gov (United States)

    Allen, George C.; Clark, John S.; Warren, John; Perkins, David R.; Martinell, John

    1992-01-01

    The Nuclear Thermal Propulsion (NTP) subpanel of the Space Nuclear Propulsion Test Facilities Panel evaluated facility requirements and strategies for nuclear thermal propulsion systems development. High pressure, solid core concepts were considered as the baseline for the evaluation, with low pressure concepts an alternative. The work of the NTP subpanel revealed that a wealth of facilities already exists to support NTP development, and that only a few new facilities must be constructed. Some modifications to existing facilities will be required. Present funding emphasis should be on long-lead-time items for the major new ground test facility complex and on facilities supporting nuclear fuel development, hot hydrogen flow test facilities, and low power critical facilities.

  13. North Slope, Alaska ESI: FACILITY (Facility Points)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains data for oil field facilities for the North Slope of Alaska. Vector points in this data set represent oil field facility locations. This data...

  14. INTEGRITY -- Integrated Human Exploration Mission Simulation Facility

    Science.gov (United States)

    Henninger, D.; Tri, T.; Daues, K.

    It is proposed to develop a high-fidelity ground facility to carry out long-duration human exploration mission simulations. These would not be merely computer simulations - they would in fact comprise a series of actual missions that just happen to stay on Earth. These missions would include all elements of an actual mission, using actual technologies that would be used for the real mission. These missions would also include such elements as extravehicular activities, robotic systems, telepresence and teleoperation, surface drilling technology--all using a simulated planetary landscape. A sequence of missions would be defined that get progressively longer and more robust, perhaps a series of five or six missions over a span of 10 to 15 years ranging in duration from 180 days up to 1000 days. This high-fidelity ground facility would operate hand-in-hand with a host of other terrestrial analog sites such as the Antarctic, Haughton Crater, and the Arizona desert. Of course, all of these analog mission simulations will be conducted here on Earth in 1-g, and NASA will still need the Shuttle and ISS to carry out all the microgravity and hypogravity science experiments and technology validations. The proposed missions would have sufficient definition such that definitive requirements could be derived from them to serve as direction for all the program elements of the mission. Additionally, specific milestones would be established for the "launch" date of each mission so that R&D programs would have both good requirements and solid milestones from which to build their implementation plans. Mission aspects that could not be directly incorporated into the ground facility would be simulated via software. New management techniques would be developed for evaluation in this ground test facility program. These new techniques would have embedded metrics which would allow them to be continuously evaluated and adjusted so that by the time the sequence of missions is completed

  15. 7 CFR 500.23 - Fees for commercial photography and cinematography on grounds.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 6 2010-01-01 2010-01-01 false Fees for commercial photography and cinematography on... National Arboretum Facilities and Grounds § 500.23 Fees for commercial photography and cinematography on... photography or cinematography as specified in § 500.24. Facilities and grounds are available for use...

  16. Computational requirements for on-orbit identification of space systems

    Science.gov (United States)

    Hadaegh, Fred Y.

    1988-01-01

    For future space systems, on-orbit identification (ID) capability will be required to complement on-orbit control, because the dynamics of large space structures, spacecraft, and antennas will not be known sufficiently from ground modeling and testing. The computational requirements for ID of flexible structures such as the space station (SS) or the large deployable reflectors (LDR) are, however, extensive due to the large number of modes, sensors, and actuators. For these systems the ID algorithm operations need not be computed in real time, only in near real time, or an appropriate mission time. Consequently, the space systems will need advanced processors and efficient parallel processing algorithm designs and architectures to implement the identification algorithms in near real time. The MAX computer currently being developed may handle such computational requirements. The purpose is to specify the on-board computational requirements for dynamic and static identification for large space structures. The computational requirements for six ID algorithms are presented in the context of three examples: the JPL/AFAL ground antenna facility, the space station (SS), and the large deployable reflector (LDR).
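
    As a minimal illustration of the kind of arithmetic whose operation counts drive such requirements, the sketch below fits a discrete-time ARX input-output model by recursive least squares. It is a generic textbook estimator written under assumed model orders, not one of the six identification algorithms evaluated in the paper.

        import numpy as np

        def rls_arx(u, y, na=4, nb=4, lam=0.99):
            """Recursive least-squares fit of an ARX model
            y[k] = a_1*y[k-1] + ... + a_na*y[k-na] + b_1*u[k-1] + ... + b_nb*u[k-nb]."""
            n = na + nb
            theta = np.zeros(n)        # parameter estimates [a_1..a_na, b_1..b_nb]
            P = 1e4 * np.eye(n)        # estimate covariance (large = uninformative prior)
            for k in range(max(na, nb), len(y)):
                phi = np.concatenate([y[k - na:k][::-1], u[k - nb:k][::-1]])
                K = P @ phi / (lam + phi @ P @ phi)       # gain vector
                theta = theta + K * (y[k] - phi @ theta)  # correct with prediction error
                P = (P - np.outer(K, phi @ P)) / lam      # forgetting-factor covariance update
            return theta

    Each update costs on the order of n squared operations per sample, so the per-sample cost grows quadratically with the number of identified parameters; this is the kind of scaling that makes near-real-time identification of systems with many modes, sensors, and actuators demanding.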

  17. Jupiter Laser Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Jupiter Laser Facility is an institutional user facility in the Physical and Life Sciences Directorate at LLNL. The facility is designed to provide a high degree...

  18. Basic Research Firing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Basic Research Firing Facility is an indoor ballistic test facility that has recently transitioned from a customer-based facility to a dedicated basic research...

  19. Facility Registry Service (FRS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Facility Registry Service (FRS) provides an integrated source of comprehensive (air, water, and waste) environmental information about facilities across EPA,...

  20. Licensed Healthcare Facilities

    Data.gov (United States)

    California Department of Resources — The Licensed Healthcare Facilities point layer represents the locations of all healthcare facilities licensed by the State of California, Department of Health...

  1. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  2. Aperture area measurement facility

    Data.gov (United States)

    Federal Laboratory Consortium — NIST has established an absolute aperture area measurement facility for circular and near-circular apertures use in radiometric instruments. The facility consists of...

  3. Environmental Toxicology Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Fully-equipped facilities for environmental toxicology research The Environmental Toxicology Research Facility (ETRF) located in Vicksburg, MS provides over 8,200 ft...

  4. Licensed Healthcare Facilities

    Data.gov (United States)

    California Department of Resources — The Licensed Healthcare Facilities point layer represents the locations of all healthcare facilities licensed by the State of California, Department of Health...

  5. The DOE ARM Aerial Facility

    Energy Technology Data Exchange (ETDEWEB)

    Schmid, Beat; Tomlinson, Jason M.; Hubbe, John M.; Comstock, Jennifer M.; Mei, Fan; Chand, Duli; Pekour, Mikhail S.; Kluzek, Celine D.; Andrews, Elisabeth; Biraud, S.; McFarquhar, Greg

    2014-05-01

    The Department of Energy Atmospheric Radiation Measurement (ARM) Program is a climate research user facility operating stationary ground sites that provide long-term measurements of climate relevant properties, mobile ground- and ship-based facilities to conduct shorter field campaigns (6-12 months), and the ARM Aerial Facility (AAF). The airborne observations acquired by the AAF enhance the surface-based ARM measurements by providing high-resolution in-situ measurements for process understanding, retrieval-algorithm development, and model evaluation that are not possible using ground- or satellite-based techniques. Several ARM aerial efforts were consolidated into the AAF in 2006. With the exception of a small aircraft used for routine measurements of aerosols and carbon cycle gases, AAF at the time had no dedicated aircraft and only a small number of instruments at its disposal. In this "virtual hangar" mode, AAF successfully carried out several missions contracting with organizations and investigators who provided their research aircraft and instrumentation. In 2009, AAF started managing operations of the Battelle-owned Gulfstream I (G-1) large twin-turboprop research aircraft. Furthermore, the American Recovery and Reinvestment Act of 2009 provided funding for the procurement of over twenty new instruments to be used aboard the G-1 and other AAF virtual-hangar aircraft. AAF now executes missions in the virtual- and real-hangar mode producing freely available datasets for studying aerosol, cloud, and radiative processes in the atmosphere. AAF is also engaged in the maturation and testing of newly developed airborne sensors to help foster the next generation of airborne instruments.

  6. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  7. A global approach to ground state solutions

    Directory of Open Access Journals (Sweden)

    Philip Korman

    2008-08-01

    Full Text Available We study radial solutions of semilinear Laplace equations. We try to understand all solutions of the problem, regardless of the boundary behavior. It turns out that one can study uniqueness or multiplicity properties of ground state solutions by considering curves of solutions of the corresponding Dirichlet and Neumann problems. We show that uniqueness of ground state solutions can sometimes be approached by a numerical computation.

  8. A global approach to ground state solutions

    OpenAIRE

    2008-01-01

    We study radial solutions of semilinear Laplace equations. We try to understand all solutions of the problem, regardless of the boundary behavior. It turns out that one can study uniqueness or multiplicity properties of ground state solutions by considering curves of solutions of the corresponding Dirichlet and Neumann problems. We show that uniqueness of ground state solutions can sometimes be approached by a numerical computation.

  9. Language and Computers

    CERN Document Server

    Dickinson, Markus; Meurers, Detmar

    2012-01-01

    Language and Computers introduces students to the fundamentals of how computers are used to represent, process, and organize textual and spoken information. Concepts are grounded in real-world examples familiar to students’ experiences of using language and computers in everyday life. A real-world introduction to the fundamentals of how computers process language, written specifically for the undergraduate audience, introducing key concepts from computational linguistics. Offers a comprehensive explanation of the problems computers face in handling natural language Covers a broad spectru

  10. The ESO Adaptive Optics Facility

    Science.gov (United States)

    Ströbele, S.; Arsenault, R.; Bacon, R.; Biasi, R.; Bonaccini-Calia, D.; Downing, M.; Conzelmann, R. D.; Delabre, B.; Donaldson, R.; Duchateau, M.; Esposito, S.; Fedrigo, E.; Gallieni, D.; Hackenberg, W. K. P.; Hubin, N.; Kasper, M.; Kissler-Patig, M.; Le Louarn, M.; McDermid, R.; Oberti, S.; Paufique, J.; Riccardi, A.; Stuik, R.; Vernet, E.

    2006-06-01

    The Adaptive Optics Facility is a project to convert one VLT-UT into a specialized Adaptive Telescope. The present secondary mirror (M2) will be replaced by a new M2-Unit hosting a 1170-actuator deformable mirror. The 3 focal stations will be equipped with instruments adapted to the new capability of this UT. Two instruments are in development for the 2 Nasmyth foci: Hawk-I with its AO module GRAAL allowing a Ground Layer Adaptive Optics correction, and MUSE with GALACSI for GLAO correction and Laser Tomography Adaptive Optics correction. A future instrument still needs to be defined for the Cassegrain focus. Several guide stars are required for the type of adaptive corrections needed, and a four Laser Guide Star facility (4LGSF) is being developed in the scope of the AO Facility. Convex mirrors like the VLT M2 represent a major challenge for testing, and a substantial effort is dedicated to this. ASSIST is a test bench that will allow testing of the Deformable Secondary Mirror and both instruments with simulated turbulence. This article describes the systems composing the Adaptive Optics Facility.

  11. The Rainwater Memorial Calibration Facility for X-Ray Optics

    DEFF Research Database (Denmark)

    Brejnholt, Nicolai; Christensen, Finn Erland; Hailey, Charles J.;

    2011-01-01

    and the energy range of interest were unique requirements not met by any existing facility. In this paper we present the requirements for the NuSTAR optics ground calibration, and how the Rainwater Memorial Calibration Facility, RaMCaF, is designed to meet the calibration requirements. The nearly 175 m long...

  12. Hanford site ground water protection management plan

    Energy Technology Data Exchange (ETDEWEB)

    1994-10-01

    Ground water protection at the Hanford Site consists of preventative and remedial measures that are implemented in compliance with a variety of environmental regulations at local, state, and federal levels. These measures seek to ensure that the resource can sustain a broad range of beneficial uses. To effectively coordinate and ensure compliance with applicable regulations, the U.S. Department of Energy has issued DOE Order 5400.1 (DOE 1988a). This order requires all U.S. Department of Energy facilities to prepare separate ground water protection program descriptions and plans. This document describes the Ground Water Protection Management Plan (GPMP) for the Hanford Site located in the state of Washington. DOE Order 5400.1 specifies that the GPMP covers the following general topical areas: (1) documentation of the ground water regime; (2) design and implementation of a ground water monitoring program to support resource management and comply with applicable laws and regulations; (3) a management program for ground water protection and remediation; (4) a summary and identification of areas that may be contaminated with hazardous waste; (5) strategies for controlling hazardous waste sources; (6) a remedial action program; and (7) decontamination, decommissioning, and related remedial action requirements. Many of the above elements are currently covered by existing programs at the Hanford Site; thus, one of the primary purposes of this document is to provide a framework for coordination of existing ground water protection activities. The GPMP provides the ground water protection policy and strategies for ground water protection/management at the Hanford Site, as well as an implementation plan to improve coordination of site ground water activities.

  13. Guide to research facilities

    Energy Technology Data Exchange (ETDEWEB)

    1993-06-01

    This Guide provides information on facilities at US Department of Energy (DOE) and other government laboratories that focus on research and development of energy efficiency and renewable energy technologies. These laboratories have opened these facilities to outside users within the scientific community to encourage cooperation between the laboratories and the private sector. The Guide features two types of facilities: designated user facilities and other research facilities. Designated user facilities are one-of-a-kind DOE facilities that are staffed by personnel with unparalleled expertise and that contain sophisticated equipment. Other research facilities are facilities at DOE and other government laboratories that provide sophisticated equipment, testing areas, or processes that may not be available at private facilities. Each facility listing includes the name and phone number of someone you can call for more information.

  14. Ground motion estimation and nonlinear seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    McCallen, D.B.; Hutchings, L.J.

    1995-08-14

    Site specific predictions of the dynamic response of structures to extreme earthquake ground motions are a critical component of seismic design for important structures. With the rapid development of computationally based methodologies and powerful computers over the past few years, engineers and scientists now have the capability to perform numerical simulations of many of the physical processes associated with the generation of earthquake ground motions and dynamic structural response. This paper describes application of a physics based, deterministic, computational approach for estimation of earthquake ground motions which relies on site measurements of frequently occurring small (i.e. M < 3 ) earthquakes. Case studies are presented which illustrate application of this methodology for two different sites, and nonlinear analyses of a typical six story steel frame office building are performed to illustrate the potential sensitivity of nonlinear response to site conditions and proximity to the causative fault.

  15. 45 CFR 3.41 - Admission to facilities or grounds.

    Science.gov (United States)

    2010-10-01

    ....41 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CONDUCT OF PERSONS... visiting hours and for approved public events. The enclave is closed to the public at all other times, and the Director may also officially close all or part of the enclave, or any building, in...

  16. Tissue Engineering of Cartilage on Ground-Based Facilities

    DEFF Research Database (Denmark)

    Aleshcheva, Ganna; Bauer, Johann; Hemmersbach, Ruth

    2016-01-01

    Investigations under simulated microgravity offer the opportunity for a better understanding of the influence of altered gravity on cells and on scaffold-free three-dimensional (3D) tissue formation. To investigate the short-term influence, human chondrocytes were cultivated for 2 h, 4 h, 16 h...

  17. 9 CFR 416.2 - Establishment grounds and facilities.

    Science.gov (United States)

    2010-01-01

    ... FSIS with the letter of approval from that authority upon request. (g) Water supply and water, ice, and... must make available to FSIS, upon request, a water report, issued under the authority of the State or... uses a private well for its water supply, it must make available to FSIS, upon request,...

  18. Preliminary results of ground-motion characteristics

    Directory of Open Access Journals (Sweden)

    Francesca Bozzoni

    2012-10-01

    Full Text Available The preliminary results are presented herein for the engineering applications of the characteristics of the ground motion induced by the May 20, 2012, Emilia earthquake. Shake maps are computed to provide estimates of the spatial distribution of the induced ground motion. The signals recorded at the Mirandola (MRN station, the closest to the epicenter, have been processed to obtain acceleration, velocity and displacement response spectra. Ground-motion parameters from the MRN recordings are compared with the corresponding estimates from recent ground-motion prediction equations, and with the spectra prescribed by the current Italian Building Code for different return periods. The records from the MRN station are used to plot the particle orbit (hodogram described by the waveform. The availability of results from geotechnical field tests that were performed at a few sites in the Municipality of Mirandola prior to this earthquake of May 2012 has allowed preliminary assessment of the ground response. The amplification effects at Mirandola are estimated using fully stochastic site-response analyses. The seismic input comprises seven actual records that are compatible with the Italian code-based spectrum that refers to a 475-year return period. The computed acceleration response spectrum and the associated dispersion are compared to the spectra calculated from the recordings of the MRN station. Good agreement is obtained for periods up to 1 s, especially for the peak ground acceleration. For the other periods, the spectral acceleration of the MRN recordings exceeds that of the computed spectra.
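
    For readers unfamiliar with the processing involved, an acceleration response spectrum of the kind compared above is obtained by driving a family of damped single-degree-of-freedom oscillators with the recorded ground acceleration and collecting their peak responses. The sketch below uses simple explicit time stepping and 5% damping; it only illustrates the general procedure and is not the processing actually applied to the MRN records.

        import numpy as np

        def response_spectrum(acc, dt, periods, zeta=0.05):
            """Pseudo-acceleration response spectrum of a ground-acceleration record
            `acc` sampled at interval `dt`, for a list of oscillator periods."""
            sa = []
            for T in periods:
                wn = 2.0 * np.pi / T          # natural circular frequency
                u, v, u_max = 0.0, 0.0, 0.0   # relative displacement, velocity, peak
                for ag in acc:                # semi-implicit Euler; requires dt << T
                    a = -ag - 2.0 * zeta * wn * v - wn ** 2 * u
                    v += a * dt
                    u += v * dt
                    u_max = max(u_max, abs(u))
                sa.append(wn ** 2 * u_max)    # pseudo-spectral acceleration
            return np.array(sa)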

  19. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  20. Bucharest heavy ion accelerator facility

    Energy Technology Data Exchange (ETDEWEB)

    Ceausescu, V.; Dobrescu, S.; Duma, M.; Indreas, G.; Ivascu, M.; Papureanu, S.; Pascovici, G.; Semenescu, G.

    1986-02-15

    The heavy ion accelerator facility of the Heavy Ion Physics Department at the Institute of Physics and Nuclear Engineering in Bucharest is described. The Tandem accelerator development and the operation of the first stage of the heavy ion postaccelerating system are discussed. Details are given concerning the resonance cavities, the pulsing system matching the dc beam to the RF cavities and the computer control system.

  1. The Ground State of (CS)4 Is Different from That of (CO)4: An Experimental Test of a Computational Prediction by Negative Ion Photoelectron Spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jian; Hrovat, David A.; Sun, Zhenrong; Bao, Xiaoguang; Borden, Weston Thatcher; Wang, Xue-Bin

    2013-08-22

    Cyclobutane-1,2,3,4-tetrathione, (CS)4, has recently been calculated to have a singlet ground state, 1A1g, in which the highest b2g MO is doubly occupied and the lowest a2u MO is empty. Thus, (CS)4 is predicted to have a different ground state than its lighter congener, (CO)4, which has a triplet ground state, 3B1u, in which these two MOs are each singly occupied. Here we report the results of a negative ion photoelectron spectroscopy (NIPES) study of the radical anion (CS)4∙-, designed to test the prediction that (CS)4 has a singlet ground state. The NIPE spectrum reveals that (CS)4 does, indeed, have a singlet ground state with electron affinity (EA) = 3.75 eV. The lowest triplet state is found to lie 0.31 eV higher in energy than the ground state, and the open-shell singlet is 0.14 eV higher in energy than the triplet state. Calculations at the (U)CCSD(T)/aug-cc-pVTZ//(U)B3LYP/6-311+G(2df) level support the spectral assignments, giving EA = 3.71 eV, EST = 0.44 eV. These calculated values are, respectively, 0.04 eV (0.9 kcal/mol) smaller, and 0.13 eV (3.0 kcal/mol) larger than the corresponding experimental values. In addition, RASPT2 calculations with various active spaces converge on a 1B1u-3B1u energy gap of 0.137 eV, in excellent agreement with the 0.14 eV energy difference obtained from the NIPE spectrum. Finally, calculations of the Franck-Condon factors for transitions from the ground state of (CS)4∙- to the ground (1A1g) and two excited states (3B1u, 1B1u) of (CS)4 account for all of the major spectral peaks, and nicely reproduce vibrational structure observed in each electronic transition. The close correspondence between the calculated and the observed features in the NIPE spectrum of (CS)4∙- provides unequivocal proof that (CS)4, unlike (CO)4, has a singlet ground state.
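
    The unit conversions quoted above can be checked directly with 1 eV ≈ 23.06 kcal/mol:

        \Delta(\mathrm{EA}) = 3.75 - 3.71 = 0.04\ \mathrm{eV} \approx 0.04 \times 23.06 \approx 0.9\ \mathrm{kcal/mol},
        \qquad
        \Delta E_{\mathrm{ST}}^{\mathrm{calc}} - \Delta E_{\mathrm{ST}}^{\mathrm{expt}} = 0.44 - 0.31 = 0.13\ \mathrm{eV} \approx 3.0\ \mathrm{kcal/mol},

    in agreement with the deviations stated in the abstract.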

  2. 40 CFR 264.97 - General ground-water monitoring requirements.

    Science.gov (United States)

    2010-07-01

    ... FACILITIES Releases From Solid Waste Management Units § 264.97 General ground-water monitoring requirements. The owner or operator must comply with the following requirements for any ground-water monitoring... 40 Protection of Environment 25 2010-07-01 2010-07-01 false General ground-water...

  3. Radiation protection at synchrotron radiation facilities.

    Science.gov (United States)

    Liu, J C; Vylet, V

    2001-01-01

    A synchrotron radiation (SR) facility typically consists of an injector, a storage ring, and SR beamlines. The latter two features are unique to SR facilities, when compared to other types of accelerator facilities. The SR facilities have the characteristics of low injection beam power, but high stored beam power. The storage ring is generally above ground with people occupying the experimental floor around a normally thin concrete ring wall. This paper addresses the radiation issues, in particular the shielding design, associated with the storage ring and SR beamlines. Normal and abnormal beam losses for injection and stored beams, as well as typical storage ring operation, are described. Ring shielding design for photons and neutrons from beam losses in the ring is discussed. Radiation safety issues and shielding design for SR beamlines, considering gas bremsstrahlung and synchrotron radiation, are reviewed. Radiation source terms and the methodologies for shielding calculations are presented.

  4. Airport Ground Staff Scheduling

    DEFF Research Database (Denmark)

    Clausen, Tommy

    travels safely and efficiently through the airport. When an aircraft lands, a significant number of tasks must be performed by different groups of ground crew, such as fueling, baggage handling and cleaning. These tasks must be completed before the aircraft is able to depart, as well as check-in and security services. These tasks are collectively known as ground handling, and are the major source of activity at airports. The business environments of modern airports are becoming increasingly competitive, as both airports themselves and their ground handling operations are changing to private ownership. As airports are in competition to attract airline routes, efficient and reliable ground handling operations are imperative for the viability and continued growth of both airports and airlines. The increasing liberalization of the ground handling market prompts ground handling operators...

  5. [Introduction to grounded theory].

    Science.gov (United States)

    Wang, Shou-Yu; Windsor, Carol; Yates, Patsy

    2012-02-01

    Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The theory is grounded in a critique of the dominant contemporary approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful in investigating complex issues and behaviours not previously addressed and concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis processes include open, axial and selective coding levels. The purpose of this article was to explore the grounded theory research process and provide an initial understanding of this methodology.

  6. Ground Vehicle Robotics

    Science.gov (United States)

    2013-08-20

    Briefing charts by Jim Parker, Associate Director, Ground Vehicle Robotics (UNCLASSIFIED: Distribution Statement A, approved for public release; dates covered 09-05-2013 to 15-08-2013). Topics include willingness to take risk on technology, user evaluation, contested environments, operational data, and applied robotics for installation and base operations (low risk).

  7. The Grounded Theory Bookshelf

    Directory of Open Access Journals (Sweden)

    Vivian B. Martin, Ph.D.

    2005-03-01

    Full Text Available Bookshelf will provide critical reviews and perspectives on books on theory and methodology of interest to grounded theory. This issue includes a review of Heaton's Reworking Qualitative Data, of special interest for some of its references to grounded theory as a secondary analysis tool; and Goulding's Grounded Theory: A practical guide for management, business, and market researchers, a book that attempts to explicate the method and presents a grounded theory study that falls a little short of the mark of a fully elaborated theory. Reworking Qualitative Data, Janet Heaton (Sage, 2004). Paperback, 176 pages, $29.95. Hardcover also available.

  8. A geophysical shock and air blast simulator at the National Ignition Facility

    Energy Technology Data Exchange (ETDEWEB)

    Fournier, K. B.; Brown, C. G.; May, M. J.; Compton, S.; Walton, O. R.; Shingleton, N.; Kane, J. O.; Holtmeier, G.; Loey, H.; Mirkarimi, P. B.; Dunlop, W. H. [Lawrence Livermore National Laboratory, P.O. Box 808, L-481, Livermore, California 94550 (United States); Guyton, R. L.; Huffman, E. [National Security Technologies, Vasco Rd., Livermore, California 94551 (United States)]

    2014-09-15

    The energy partitioning energy coupling experiments at the National Ignition Facility (NIF) have been designed to measure simultaneously the coupling of energy from a laser-driven target into both ground shock and air blast overpressure to nearby media. The source target for the experiment is positioned at a known height above the ground-surface simulant and is heated by four beams from the NIF. The resulting target energy density and specific energy are equal to those of a low-yield nuclear device. The ground-shock stress waves and atmospheric overpressure waveforms that result in our test system are hydrodynamically scaled analogs of full-scale seismic and air blast phenomena. This report summarizes the development of the platform, the simulations, and calculations that underpin the physics measurements that are being made, and finally the data that were measured. Agreement between the data and simulation of the order of a factor of two to three is seen for air blast quantities such as peak overpressure. Historical underground test data for seismic phenomena measured sensor displacements; we measure the stresses generated in our ground-surrogate medium. We find factors-of-a-few agreement between our measured peak stresses and predictions with modern geophysical computer codes.
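
    For context, hydrodynamic scaling of blast phenomena is conventionally expressed through Hopkinson-Cranz (cube-root) scaling, under which measurements at range R from a source of yield (energy release) W collapse onto common curves in the scaled distance

        Z = \frac{R}{W^{1/3}},

    with peak overpressure a function of Z alone and times and impulses scaling as t/W^{1/3} and I/W^{1/3}. This is quoted here as standard background on why a laser-driven source with the energy density of a low-yield device can serve as a scaled analog; it is not the specific scaling analysis used by the NIF team.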

  9. A geophysical shock and air blast simulator at the National Ignition Facility

    Energy Technology Data Exchange (ETDEWEB)

    Fournier, K. B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brown, C. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); May, M. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Compton, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walton, O. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shingleton, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, J. O. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Holtmeier, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Loey, H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mirkarimi, P. B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dunlop, W. H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Guyton, R. L. [National Security Technologies, Livermore, CA (United States); Huffman, E. [National Security Technologies, Livermore, CA (United States)

    2014-09-01

    The energy partitioning energy coupling experiments at the National Ignition Facility (NIF) have been designed to measure simultaneously the coupling of energy from a laser-driven target into both ground shock and air blast overpressure to nearby media. The source target for the experiment is positioned at a known height above the ground-surface simulant and is heated by four beams from the NIF. The resulting target energy density and specific energy are equal to those of a low-yield nuclear device. The ground-shock stress waves and atmospheric overpressure waveforms that result in our test system are hydrodynamically scaled analogs of full-scale seismic and air blast phenomena. This report summarizes the development of the platform, the simulations, and calculations that underpin the physics measurements that are being made, and finally the data that were measured. Agreement between the data and simulation of the order of a factor of two to three is seen for air blast quantities such as peak overpressure. Historical underground test data for seismic phenomena measured sensor displacements; we measure the stresses generated in our ground-surrogate medium. We find factors-of-a-few agreement between our measured peak stresses and predictions with modern geophysical computer codes.

  10. Mine-detection test facilities at TNO-FEL test location "Waalsdorp"

    NARCIS (Netherlands)

    Rhebergen, J.B.; Zwamborn, A.P.M.

    1998-01-01

    As part of the TNO-FEL Ultra-Wide-Band Ground-Penetrating-Radar (UWB-GPR) project, a test facility for controlled GPR experiments was planned. Construction of this sand-box test facility has recently been completed. At the same site another test facility, for evaluating various commercial of the

  11. Mine-detection test facilities at TNO-FEL test location "Waalsdorp"

    NARCIS (Netherlands)

    Rhebergen, J.B.; Zwamborn, A.P.M.

    1998-01-01

    As part of the TNO-FEL Ultra-Wide-Band Ground-Penetrating-Radar (UWB-GPR) project, a test facility for controlled GPR experiments was planned. Construction of this sand-box test facility has recently been completed. At the same site another test facility, for evaluating various commercial of the she

  12. Pesticides in Ground Water

    DEFF Research Database (Denmark)

    Bjerg, Poul Løgstrup

    1996-01-01

    Review of: Jack E. Barbash & Elizabeth A. Resek (1996). Pesticides in Ground Water: Distribution Trends and Governing Factors. Ann Arbor Press, Inc., Chelsea, Michigan. 588 pp.

  13. Pesticides in Ground Water

    DEFF Research Database (Denmark)

    Bjerg, Poul Løgstrup

    1996-01-01

    Review of: Jack E. Barbash & Elizabeth A. Resek (1996). Pesticides in Ground Water: Distribution Trends and Governing Factors. Ann Arbor Press, Inc., Chelsea, Michigan. 588 pp.

  14. Ouellette Thermal Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Thermal Test Facility is a joint Army/Navy state-of-the-art facility (8,100 ft2) that was designed to:Evaluate and characterize the effect of flame and thermal...

  15. Cold Vacuum Drying Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Located near the K-Basins (see K-Basins link) in Hanford's 100 Area is a facility called the Cold Vacuum Drying Facility (CVDF).Between 2000 and 2004, workers at the...

  16. Dialysis Facility Compare

    Data.gov (United States)

    U.S. Department of Health & Human Services — Dialysis Facility Compare helps you find detailed information about Medicare-certified dialysis facilities. You can compare the services and the quality of care that...

  17. Explosive Components Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The 98,000 square foot Explosive Components Facility (ECF) is a state-of-the-art facility that provides a full-range of chemical, material, and performance analysis...

  18. Materiel Evaluation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — CRREL's Materiel Evaluation Facility (MEF) is a large cold-room facility that can be set up at temperatures ranging from −20°F to 120°F with a temperature change...

  19. Armament Technology Facility (ATF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Armament Technology Facility is a 52,000 square foot, secure and environmentally-safe, integrated small arms and cannon caliber design and evaluation facility....

  20. Integrated Disposal Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Located near the center of the 586-square-mile Hanford Site is the Integrated Disposal Facility, also known as the IDF.This facility is a landfill similar in concept...

  1. Facilities for US Radioastronomy.

    Science.gov (United States)

    Thaddeus, Patrick

    1982-01-01

    Discusses major developments in radioastronomy since 1945. Topics include proposed facilities, very-long-baseline interferometric array, millimeter-wave telescope, submillimeter-wave telescope, and funding for radioastronomy facilities and projects. (JN)

  2. Wastewater Treatment Facilities

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Individual permits for municipal, industrial, and semi-public wastewater treatment facilities in Iowa for the National Pollutant Discharge Elimination System (NPDES)...

  3. Facility Response Plan (FRP)

    Data.gov (United States)

    U.S. Environmental Protection Agency — A Facility Response Plan (FRP) demonstrates a facility's preparedness to respond to a worst case oil discharge. Under the Clean Water Act, as amended by the Oil...

  4. Financing Professional Sports Facilities

    OpenAIRE

    Baade, Robert A.; Victor A. Matheson

    2011-01-01

    This paper examines public financing of professional sports facilities with a focus on both early and recent developments in taxpayer subsidization of spectator sports. The paper explores both the magnitude and the sources of public funding for professional sports facilities.

  5. FDA Certified Mammography Facilities

    Science.gov (United States)

    Program Consumer Information (MQSA): Search for a Certified Facility. This list of FDA Certified Mammography Facilities is updated weekly. If you click on Search ...

  6. Energetics Conditioning Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Energetics Conditioning Facility is used for long term and short term aging studies of energetic materials. The facility has 10 conditioning chambers of which 2...

  7. Energetics Conditioning Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Energetics Conditioning Facility is used for long term and short term aging studies of energetic materials. The facility has 10 conditioning chambers of which 2...

  8. Environmental Toxicology Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Fully-equipped facilities for environmental toxicology researchThe Environmental Toxicology Research Facility (ETRF) located in Vicksburg, MS provides over 8,200 ft...

  9. Ouellette Thermal Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Thermal Test Facility is a joint Army/Navy state-of-the-art facility (8,100 ft2) that was designed to: Evaluate and characterize the effect of flame and thermal...

  10. Projectile Demilitarization Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Projectile Wash Out Facility is US Army Ammunition Peculiar Equipment (APE 1300). It is a pilot scale wash out facility that uses high pressure water and steam...

  11. Communication, concepts and grounding.

    Science.gov (United States)

    van der Velde, Frank

    2015-02-01

    This article discusses the relation between communication and conceptual grounding. In the brain, neurons, circuits and brain areas are involved in the representation of a concept, grounding it in perception and action. In terms of grounding we can distinguish between communication within the brain and communication between humans or between humans and machines. In the first form of communication, a concept is activated by sensory input. Due to grounding, the information provided by this communication is not just determined by the sensory input but also by the outgoing connection structure of the conceptual representation, which is based on previous experiences and actions. The second form of communication, that between humans or between humans and machines, is influenced by the first form. In particular, a more successful interpersonal communication might require forms of situated cognition and interaction in which the entire representations of grounded concepts are involved.

  12. Stochastic ground motion simulation

    Science.gov (United States)

    Rezaeian, Sanaz; Xiaodan, Sun; Beer, Michael; Kougioumtzoglou, Ioannis A.; Patelli, Edoardo; Siu-Kui Au, Ivan

    2014-01-01

    Strong earthquake ground motion records are fundamental in engineering applications. Ground motion time series are used in response-history dynamic analysis of structural or geotechnical systems. In such analysis, the validity of predicted responses depends on the validity of the input excitations. Ground motion records are also used to develop ground motion prediction equations (GMPEs) for intensity measures such as spectral accelerations that are used in response-spectrum dynamic analysis. Despite the thousands of available strong ground motion records, there remains a shortage of records for large-magnitude earthquakes at short distances or in specific regions, as well as records that sample specific combinations of source, path, and site characteristics.
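
    A minimal stochastic simulation in the spirit of the methods surveyed here shapes Gaussian white noise with a time envelope and a band-pass filter to produce a synthetic acceleration record. The envelope shape, corner frequencies, and duration in the sketch below are illustrative assumptions, not parameters of any published model or GMPE.

        import numpy as np
        from scipy.signal import butter, lfilter

        def synthetic_ground_motion(duration=20.0, dt=0.01, f_lo=0.3, f_hi=15.0, seed=0):
            """Generate a synthetic acceleration time series by modulating
            band-passed white noise with a build-up/decay envelope."""
            rng = np.random.default_rng(seed)
            t = np.arange(0.0, duration, dt)
            noise = rng.standard_normal(t.size)
            b, a = butter(4, [f_lo, f_hi], btype="band", fs=1.0 / dt)
            filtered = lfilter(b, a, noise)
            envelope = (t / 2.0) ** 2 * np.exp(-t / 2.0)   # rises, peaks, then decays
            envelope /= envelope.max()
            acc = envelope * filtered
            return t, acc / np.abs(acc).max()              # record normalized to unit peak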

  13. Nuclear fuel cycle facility accident analysis handbook

    Energy Technology Data Exchange (ETDEWEB)

    Ayer, J E; Clark, A T; Loysen, P; Ballinger, M Y; Mishima, J; Owczarski, P C; Gregory, W S; Nichols, B D

    1988-05-01

    The Accident Analysis Handbook (AAH) covers four generic facilities: fuel manufacturing, fuel reprocessing, waste storage/solidification, and spent fuel storage; and six accident types: fire, explosion, tornado, criticality, spill, and equipment failure. These are the accident types considered to make major contributions to the radiological risk from accidents in nuclear fuel cycle facility operations. The AAH will enable the user to calculate source term releases from accident scenarios manually or by computer. A major feature of the AAH is development of accident sample problems to provide input to source term analysis methods and transport computer codes. Sample problems and illustrative examples for different accident types are included in the AAH.

  14. Ground vibration test and flutter analysis of air sampling probe

    Science.gov (United States)

    Ellison, J. F.

    1986-01-01

    The Dryden Flight Research Facility of NASA Ames Research Center conducted a ground vibration test and a flutter analysis of an air sampling probe that was to be mounted on a Convair 990 airplane. The probe was a steel, wing-shaped structure used to gather atmospheric data. The ground vibration test was conducted to update the finite-element model used in the flutter analysis. The analysis predicted flutter speeds well outside the operating flight envelope of the Convair 990 airplane.

  15. WSO-UV ground segment for observation optimisation

    Science.gov (United States)

    Basargina, O.; Sachkov, M.; Kazakevich, Y.; Kanev, E.; Sichevskij, S.

    2016-07-01

    The World Space Observatory-Ultraviolet (WSO-UV) is a Russian-Spanish space mission born as a response to the growing demand for UV facilities by the astronomical community. The main components of the WSO-UV Ground Segment, the Mission Control Centre and the Science Operation Centre, are being developed through international cooperation. In this paper the fundamental components of the WSO-UV ground segment are described, and approaches to optimizing the observatory scheduling problem are discussed.

  16. Hypergravity facilities in the ESA ground-based facility program: current research activities and future tasks

    NARCIS (Netherlands)

    Frett, T.; Petrat, G.; van Loon, J.J.W.A.; Hemmersbach, R.; Anken, R.

    2016-01-01

    Research on Artificial Gravity (AG) created by linear acceleration or centrifugation has a long history and could significantly contribute to realize long-term human spaceflight in the future. Employing centrifuges plays a prominent role in human physiology and gravitational biology. This article

  17. Pressurized burner test facility

    Energy Technology Data Exchange (ETDEWEB)

    Maloney, D.J.; Norton, T.S.; Hadley, M.A. [Morgantown Energy Technology Center, WV (United States)

    1993-06-01

    The Morgantown Energy Technology Center (METC) is currently fabricating a high-pressure burner test facility. The facility was designed to support the development of gas turbine combustion systems fired on natural gas and coal-derived gaseous fuels containing fuel-bound nitrogen. Upon completion of fabrication and shake-down testing in October 1993, the facility will be available for use by industrial and university partners through Cooperative Research and Development Agreements (CRADAs) or through other cooperative arrangements. This paper describes the burner test facility and associated operating parameter ranges and informs interested parties of the availability of the facility.

  18. Inhibitory Competition between Shape Properties in Figure-Ground Perception

    Science.gov (United States)

    Peterson, Mary A.; Skow, Emily

    2008-01-01

    Theories of figure-ground perception entail inhibitory competition between either low-level units (edge or feature units) or high-level shape properties. Extant computational models instantiate the 1st type of theory. The authors investigated a prediction of the 2nd type of theory: that shape properties suggested on the ground side of an edge are…

  19. Bioinformatics and Computational Core Technology Center

    Data.gov (United States)

    Federal Laboratory Consortium — SERVICES PROVIDED BY THE COMPUTER CORE FACILITY Evaluation, purchase, set up, and maintenance of the computer hardware and network for the 170 users in the research...

  20. JPL control/structure interaction test bed real-time control computer architecture

    Science.gov (United States)

    Briggs, Hugh C.

    1989-01-01

    The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts - such as active structure - and new tools - such as combined structure and control optimization algorithm - and their verification in ground and possibly flight test. A focus mission spacecraft was designed based upon a space interferometer and is the basis for design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computation capacity and control architectures of space qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The Test Bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters and perform data logging; and a multilevel real-time control computing system. The design of the Test Bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working in the verification of control concepts for large structures.

  1. Steady State Vacuum Ultraviolet Exposure Facility With Automated Calibration Capability

    Science.gov (United States)

    Stueber, Thomas J.; Sechkar, Edward A.; Dever, Joyce A.; Banks, Bruce A.

    2000-01-01

    NASA Glenn Research Center at Lewis Field designed and developed a steady state vacuum ultraviolet automated (SSVUVa) facility with in situ VUV intensity calibration capability. The automated feature enables a constant accelerated VUV radiation exposure over long periods of testing without breaking vacuum. This test facility is designed to simultaneously accommodate four isolated radiation exposure tests within the SSVUVa vacuum chamber. Computer control of the facility for long-term continuous operation also provides control and recording of thermocouple temperatures, periodic recording of VUV lamp intensity, and monitoring of vacuum facility status. This paper discusses the design and capabilities of the SSVUVa facility.

  2. Guide to computing at ANL

    Energy Technology Data Exchange (ETDEWEB)

    Peavler, J. (ed.)

    1979-06-01

    This publication gives details about hardware, software, procedures, and services of the Central Computing Facility, as well as information about how to become an authorized user. Languages, compilers, libraries, and applications packages available are described. 17 tables. (RWR)

  3. Environmental practices for biomedical research facilities.

    Science.gov (United States)

    Medlin, E L; Grupenhoff, J T

    2000-12-01

    As a result of the Leadership Conference on Biomedical Research and the Environment, the Facilities Committee focused its work on the development of best environmental practices at biomedical research facilities at the university and independent research facility level as well as consideration of potential involvement of for-profit companies and government agencies. The designation "facilities" includes all related buildings and grounds, "green auditing" of buildings and programs, purchasing of furnishings and sources, energy efficiency, and engineering services (lighting, heating, air conditioning), among other activities. The committee made a number of recommendations, including development of a national council for environmental stewardship in biomedical research, development of a system of green auditing of such research facilities, and creation of programs for sustainable building and use. In addition, the committee recommended extension of education and training programs for environmental stewardship, in cooperation with facilities managers, for all research administrators and researchers. These programs would focus especially on graduate fellows and other students, as well as on science labs at levels K-12.

  4. Ground State Spin Logic

    CERN Document Server

    Whitfield, J D; Biamonte, J D

    2012-01-01

    Designing and optimizing cost functions and energy landscapes is a problem encountered in many fields of science and engineering. These landscapes and cost functions can be embedded and annealed in experimentally controllable spin Hamiltonians. Using an approach based on group theory and symmetries, we examine the embedding of Boolean logic gates into the ground state subspace of such spin systems. We describe parameterized families of diagonal Hamiltonians and symmetry operations which preserve the ground state subspace encoding the truth tables of Boolean formulas. The ground state embeddings of adder circuits are used to illustrate how gates are combined and simplified using symmetry. Our work is relevant to experimental demonstrations of ground state embeddings in both classical optimization and adiabatic quantum optimization.
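
    A standard textbook-style illustration of this idea (my own example, not taken from the paper) is the embedding of an AND gate into the ground-state subspace of a diagonal three-variable Hamiltonian. The short Python check below enumerates all assignments and confirms that exactly the zero-energy states reproduce the AND truth table.

    ```python
    # Standard QUBO-style penalty for AND:  H(x1, x2, x3) = x1*x2 - 2*(x1 + x2)*x3 + 3*x3
    # is zero exactly when x3 = x1 AND x2, and strictly positive otherwise.
    from itertools import product

    def H(x1, x2, x3):
        return x1 * x2 - 2 * (x1 + x2) * x3 + 3 * x3

    ground_states = [bits for bits in product((0, 1), repeat=3) if H(*bits) == 0]

    # Ground states reproduce the AND truth table.
    assert ground_states == [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]
    print(ground_states)
    ```

    Analogous penalties exist for the other basic gates, which is how larger circuits such as adders can be composed before applying symmetry-based simplification.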

  5. FDTD simulation of LEMP propagation over lossy ground: Influence of distance, ground conductivity, and source parameters

    Science.gov (United States)

    Aoki, Masanori; Baba, Yoshihiro; Rakov, Vladimir A.

    2015-08-01

    We have computed lightning electromagnetic pulses (LEMPs), including the azimuthal magnetic field Hφ, vertical electric field Ez, and horizontal (radial) electric field Eh that propagated over 5 to 200 km of flat lossy ground, using the finite difference time domain (FDTD) method in the 2-D cylindrical coordinate system. This is the first systematic full-wave study of LEMP propagation effects based on a realistic return-stroke model and including the complete return-stroke frequency range. Influences of the return-stroke wavefront speed (ranging from c/2 to c, where c is the speed of light), current risetime (ranging from 0.5 to 5 µs), and ground conductivity (ranging from 0.1 mS/m to ∞) on Hφ, Ez, and Eh have been investigated. Also, the FDTD-computed waveforms of Eh have been compared with the corresponding ones computed using the Cooray-Rubinstein formula. Peaks of Hφ, Ez, and Eh are nearly proportional to the return-stroke wavefront speed. The peak of Eh decreases with increasing current risetime, while those of Hφ and Ez are only slightly influenced by it. The peaks of Hφ and Ez are essentially independent of the ground conductivity at a distance of 5 km. Beyond this distance, they appreciably decrease relative to the perfectly conducting ground case, and the decrease is stronger for lower ground conductivity values. The peak of Eh increases with decreasing ground conductivity. The computed Eh/Ez is consistent with measurements of Thomson et al. (1988). The observed decrease of Ez peak and increase of Ez risetime due to propagation over 200 km of Florida soil are reasonably well reproduced by the FDTD simulation with ground conductivity of 1 mS/m.
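
    As a rough illustration of the Cooray-Rubinstein approximation referred to above, the following Python sketch applies the frequency-domain surface-impedance correction to assumed, synthetic perfect-ground waveforms; it is not a reproduction of the paper's FDTD computation, and the input waveforms and parameters are placeholders.

    ```python
    # Cooray-Rubinstein approximation (sketch):
    #   Eh(w) ~= Eh_p(w) - Hphi_p(w) * eta0 / sqrt(eps_rg + sigma_g / (j*w*eps0)),
    # where Eh_p and Hphi_p are the fields computed over perfectly conducting ground.
    import numpy as np

    eps0 = 8.854e-12              # vacuum permittivity, F/m
    mu0 = 4.0e-7 * np.pi          # vacuum permeability, H/m
    eta0 = np.sqrt(mu0 / eps0)    # free-space impedance, ~377 ohm

    def cooray_rubinstein(eh_perfect, hphi_perfect, dt, sigma_g, eps_rg):
        """Correct time-domain perfect-ground waveforms for finite ground conductivity."""
        n = len(eh_perfect)
        jw = 2j * np.pi * np.fft.rfftfreq(n, dt)
        jw[0] = 1e-12j            # avoid division by zero at DC
        zg = eta0 / np.sqrt(eps_rg + sigma_g / (jw * eps0))
        Eh = np.fft.rfft(eh_perfect) - np.fft.rfft(hphi_perfect) * zg
        return np.fft.irfft(Eh, n)

    # Hypothetical double-exponential waveforms standing in for FDTD perfect-ground results.
    dt = 1.0e-8                                                # 10 ns sampling
    t = np.arange(2048) * dt
    eh_p = 50.0 * (np.exp(-t / 5e-6) - np.exp(-t / 5e-7))      # V/m, synthetic
    hphi_p = 0.2 * (np.exp(-t / 5e-6) - np.exp(-t / 5e-7))     # A/m, synthetic
    eh_lossy = cooray_rubinstein(eh_p, hphi_p, dt, sigma_g=1e-3, eps_rg=10.0)
    print(eh_lossy[:5])
    ```

    Because the correction term scales as 1/sqrt(eps_rg + sigma_g/(j*w*eps0)), it grows as the conductivity decreases, consistent with the reported increase of the Eh peak over poorly conducting ground.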

  6. Ground Vehicle Robotics Presentation

    Science.gov (United States)

    2012-08-14

    Mr. Jim Parker, Associate Director, Ground Vehicle Robotics. Distribution Statement A: approved for public release. Briefing; dates covered: 01-07-2012 to 01-08-2012. Abstract: provide transition-ready, cost-effective, and innovative robotics and control system solutions for manned, optionally-manned, and unmanned ground vehicles.

  7. The ISOLDE Facility: Radioactive beams at CERN

    CERN Document Server

    CERN. Geneva

    2007-01-01

    The Isotope Separation On-Line (ISOL) technique evolved from chemical techniques used to separate radioactive isotopes off-line from irradiated "targets". Today's ISOL targets, used at e.g. ISOLDE, can be of many different types and in different phases, but the isotopes are always delivered at very low energies, making the technique ideal for the study of ground state properties and for collections for other applications such as solid state physics and medical physics. The possibility of accelerating these low-energy beams for nuclear structure studies, and in the longer term for neutrino physics, is now being explored at first-generation radioactive beam facilities. The upgrade towards HIE-ISOLDE aims to consolidate ISOLDE's position as a world-leading radioactive nuclear beam facility and will be a precursor to a future all-European ISOL facility, EURISOL, with orders of magnitude higher radioactive beam intensities and energies. Prerequisite knowledge and references: none.

  8. Study on Storage Facilities of Agricultural Products in Courtyard in China

    Institute of Scientific and Technical Information of China (English)

    CHEN Li; LI Xi-Hong; XIA Qiu-Yu; HU Yun-feng; GUAN Wen-qiang

    2002-01-01

    Mini storage facilities applicable to rural areas in China have been developed after nine years of research. The optimal design of the structure and refrigeration system, facilities optimization, and computer control and management technology have been studied and developed.

  9. Relational grounding facilitates development of scientifically useful multiscale models

    Directory of Open Access Journals (Sweden)

    Lam Tai

    2011-09-01

    We review grounding issues that influence the scientific usefulness of any biomedical multiscale model (MSM). Groundings are the collection of units, dimensions, and/or objects to which a variable or model constituent refers. To date, models that primarily use continuous mathematics rely heavily on absolute grounding, whereas those that primarily use discrete software paradigms (e.g., object-oriented, agent-based, actor) typically employ relational grounding. We review grounding issues and identify strategies to address them. We maintain that grounding issues should be addressed at the start of any MSM project and should be reevaluated throughout the model development process. We make the following points. Grounding decisions influence model flexibility, adaptability, and thus reusability. Grounding choices should be influenced by measures, uncertainty, system information, and the nature of available validation data. Absolute grounding complicates the process of combining models to form larger models unless all are grounded absolutely. Relational grounding facilitates referent knowledge embodiment within computational mechanisms but requires separate model-to-referent mappings. Absolute grounding can simplify integration by forcing common units and, hence, a common integration target, but context change may require model reengineering. Relational grounding enables synthesis of large, composite (multi-module) models that can be robust to context changes. Because biological components have varying degrees of autonomy, corresponding components in MSMs need to exhibit the same. Relational grounding facilitates achieving such autonomy. Biomimetic analogues designed to facilitate translational research and development must have long lifecycles. Exploring mechanisms of normal-to-disease transition requires model components that are grounded relationally. Multi-paradigm modeling requires both hyperspatial and relational grounding.
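
    To make the distinction concrete, here is a toy Python illustration of my own (not drawn from the article): an absolutely grounded quantity carries fixed units that every model using it must share, whereas a relationally grounded quantity is meaningful only through an explicit, per-context mapping to its referent.

    ```python
    # Toy contrast between absolute and relational grounding of a model variable.
    from dataclasses import dataclass

    @dataclass
    class AbsoluteQuantity:
        value: float
        unit: str                  # fixed units, e.g. "mg/L", shared across models

    @dataclass
    class RelationalQuantity:
        value: float               # dimensionless within the model
        referent: str              # what the value refers to in this model's context
        to_referent_scale: float   # separate model-to-referent mapping, supplied per use

    # Hypothetical example values for illustration only.
    conc_absolute = AbsoluteQuantity(value=3.2, unit="mg/L")
    conc_relational = RelationalQuantity(value=0.4,
                                         referent="uptake saturation level",
                                         to_referent_scale=8.0)  # 0.4 * 8.0 mg/L when mapped

    print(conc_absolute)
    print(conc_relational, "->", conc_relational.value * conc_relational.to_referent_scale, "mg/L")
    ```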

  10. Adiabatic quantum computing

    OpenAIRE

    Lobe, Elisabeth; Stollenwerk, Tobias; Tröltzsch, Anke

    2015-01-01

    In recent years, the field of adiabatic quantum computing has gained importance due to advances in the realisation of such machines, especially by the company D-Wave Systems. These machines are suited to solving discrete optimisation problems that are typically very hard to solve on a classical computer. Due to the quantum nature of the device, it is assumed that there is a substantial speedup compared to classical HPC facilities. We explain the basic principles of adiabatic ...
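
    The basic principle can be sketched numerically: the machine interpolates between a driver Hamiltonian H_B, whose ground state is easy to prepare, and a problem Hamiltonian H_P, whose ground state encodes the optimum. The toy two-qubit Python example below (my own illustration, not from the report) tracks the ground energy and spectral gap along the interpolation H(s) = (1 - s) H_B + s H_P.

    ```python
    # Toy illustration of the adiabatic interpolation H(s) = (1 - s)*H_B + s*H_P
    # for two qubits; the problem Hamiltonian is a small Ising cost function with a
    # unique ground state, so the final ground state encodes the optimal assignment.
    import numpy as np

    I = np.eye(2)
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    Z = np.array([[1.0, 0.0], [0.0, -1.0]])

    def two_qubit(a, b):
        return np.kron(a, b)

    # Driver: transverse field; its ground state is the uniform superposition.
    H_B = -(two_qubit(X, I) + two_qubit(I, X))

    # Problem: Ising cost with a coupling and a bias; unique ground state |10>, energy -1.5.
    H_P = two_qubit(Z, Z) + 0.5 * two_qubit(Z, I)

    for s in np.linspace(0.0, 1.0, 5):
        H = (1.0 - s) * H_B + s * H_P
        evals = np.linalg.eigvalsh(H)
        print(f"s={s:.2f}  ground energy={evals[0]:+.3f}  gap={evals[1] - evals[0]:.3f}")
    ```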

  11. Facility Effluent Monitoring Plan determinations for the 600 Area facilities

    Energy Technology Data Exchange (ETDEWEB)

    Nickels, J.M.

    1991-08-01

    This document determines the need for Facility Effluent Monitoring Plans for Westinghouse Hanford Company's 600 Area facilities on the Hanford Site. The Facility Effluent Monitoring Plan determinations were prepared in accordance with A Guide For Preparing Hanford Site Facility Effluent Monitoring Plans (WHC 1991). Five major Westinghouse Hanford Company facilities in the 600 Area were evaluated: the Purge Water Storage Facility, 212-N, -P, and -R Facilities, the 616 Facility, and the 213-J K Storage Vaults. Of the five major facilities evaluated in the 600 Area, none will require preparation of a Facility Effluent Monitoring Plan.

  12. Ground-based Infrared Observations of Water Vapor and Hydrogen Peroxide in the Atmosphere of Mars

    Science.gov (United States)

    Encrenaz, T.; Greathouse, T. K.; Bitner, M.; Kruger, A.; Richter, M. J.; Lacy, J. H.; Bézard, B.; Fouchet, T.; Lefevre, F.; Forget, F.; Atreya, S. K.

    2008-11-01

    Ground-based observations of water vapor and hydrogen peroxide have been obtained in the thermal infrared range, using the TEXES instrument at the NASA Infrared Telescope Facility, for different times of the seasonal cycle.

  13. Miscellaneous information regarding operation and inventory of 618-11 Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    Webb, C.R.

    1993-06-01

    This report is a compilation of inventories and radiation surveys taken for the 618-11 Burial Ground at Hanford. This report deals with waste management activities at the facility during the early to mid-1960s.

  14. Ground Enterprise Management System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Emergent Space Technologies Inc. proposes to develop the Ground Enterprise Management System (GEMS) for spacecraft ground systems. GEMS will provide situational...

  15. Integration and use of Microgravity Research Facility: Lessons learned by the crystals by vapor transport experiment and Space Experiments Facility programs

    Science.gov (United States)

    Heizer, Barbara L.

    1992-01-01

    The Crystals by Vapor Transport Experiment (CVTE) and Space Experiments Facility (SEF) are materials processing facilities designed and built for use on the Space Shuttle middeck. The CVTE was built as a commercial facility owned by the Boeing Company. The SEF was built under contract to the UAH Center for Commercial Development of Space (CCDS). Both facilities include up to three furnaces capable of reaching at least 850 °C, stand-alone electronics and software, and independent cooling control. In addition, the CVTE includes a dedicated stowage locker for cameras, a laptop computer, and other ancillary equipment. Both systems are designed to fly in a Middeck Accommodations Rack (MAR), though the SEF is currently being integrated into a Spacehab rack. The CVTE hardware includes two transparent furnaces capable of achieving temperatures in the 850 to 870 °C range. The transparency allows scientists/astronauts to directly observe and affect crystal growth both on the ground and in space. Cameras mounted to the rack provide photodocumentation of the crystal growth. The basic furnace design allows for modification to accommodate techniques other than vapor crystal growth. Early in the CVTE program, the decision was made to assign a principal scientist to develop the experiment plan, influence the hardware/software design, run the ground and flight research effort, and interface with the scientific community. The principal scientist is responsible to the program manager and is a critical member of the engineering development team. As a result of this decision, the hardware/experiment requirements were established in a way that balances the engineering and science demands on the equipment. Program schedules for hardware development, experiment definition and material selection, flight operations development, and crew training (both ground support personnel and astronauts) were all planned and carried out with the understanding that the success of the program science

  16. Instrumentation of the ESRF medical imaging facility

    CERN Document Server

    Elleaume, H; Berkvens, P; Berruyer, G; Brochard, T; Dabin, Y; Domínguez, M C; Draperi, A; Fiedler, S; Goujon, G; Le Duc, G; Mattenet, M; Nemoz, C; Pérez, M; Renier, M; Schulze, C; Spanne, P; Suortti, P; Thomlinson, W; Estève, F; Bertrand, B; Le Bas, J F

    1999-01-01

    At the European Synchrotron Radiation Facility (ESRF) a beamport has been instrumented for medical research programs. Two facilities have been constructed for alternative operation. The first one is devoted to medical imaging and is focused on intravenous coronary angiography and computed tomography (CT). The second facility is dedicated to pre-clinical microbeam radiotherapy (MRT). This paper describes the instrumentation for the imaging facility. Two monochromators have been designed, both are based on bent silicon crystals in the Laue geometry. A versatile scanning device has been built for pre-alignment and scanning of the patient through the X-ray beam in radiography or CT modes. An intrinsic germanium detector is used together with large dynamic range electronics (16 bits) to acquire the data. The beamline is now at the end of its commissioning phase; intravenous coronary angiography is intended to start in 1999 with patients and the CT pre-clinical program is underway on small animals. The first in viv...

  17. Synchrotron radiation facilities

    CERN Multimedia

    1972-01-01

    Particularly in the past few years, interest in using the synchrotron radiation emanating from high energy, circular electron machines has grown considerably. In our February issue we included an article on the synchrotron radiation facility at Frascati. This month we are spreading the net wider — saying something about the properties of the radiation, listing the centres where synchrotron radiation facilities exist, adding a brief description of three of them and mentioning areas of physics in which the facilities are used.

  18. Thermal distortion test facility

    Science.gov (United States)

    Stapp, James L.

    1995-02-01

    The thermal distortion test facility (TDTF) at Phillips Laboratory provides precise measurements of the distortion of mirrors that occurs when their surfaces are heated. The TDTF has been used for several years to evaluate mirrors being developed for high-power lasers. The facility has recently undergone some significant upgrades to improve the accuracy with which mirrors can be heated and the resulting distortion measured. The facility and its associated instrumentation are discussed.

  19. Materials Characterization Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Materials Characterization Facility enables detailed measurements of the properties of ceramics, polymers, glasses, and composites. It features instrumentation...

  20. Mobile Solar Tracker Facility

    Data.gov (United States)

    Federal Laboratory Consortium — NIST's mobile solar tracking facility is used to characterize the electrical performance of photovoltaic panels. It incorporates meteorological instruments, a solar...