WorldWideScience

Sample records for central computing facility

  1. Centralized computer-based controls of the Nova Laser Facility

    International Nuclear Information System (INIS)

    This article introduces the overall architecture of the computer-based Nova Laser Control System and describes its basic components. Use of standard hardware and software components ensures that the system, while specialized and distributed throughout the facility, is adaptable. 9 references, 6 figures

  2. On-line satellite/central computer facility of the Multiparticle Argo Spectrometer System

    International Nuclear Information System (INIS)

An on-line satellite/central computer facility has been developed at Brookhaven National Laboratory as part of the Multiparticle Argo Spectrometer System (MASS). This facility, consisting of a PDP-9 and a CDC-6600, has been successfully used in the study of proton-proton interactions at 28.5 GeV/c. (U.S.)

  3. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  4. Advantages for the introduction of computer techniques in centralized supervision of radiation levels in nuclear facilities

    International Nuclear Information System (INIS)

    A new computerized information system at the Saclay Center comprising 120 measuring channels is described. The advantages offered by this system with respect to the systems in use up to now are presented. Experimental results are given which support the argument that the system can effectively supervise the radioisotope facility at the Center. (B.G.)

  5. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  6. Securing a HENP Computing Facility

    CERN Document Server

    Misawa, S; Throwe, T

    2003-01-01

Traditionally, HENP computing facilities have been open facilities accessed in many different ways by users both internal and external to the facility. However, the need to protect the facility from cybersecurity threats has made it difficult to maintain this openness for off-site and on-site users. In this paper, we discuss the strategy we have used and the architecture we have developed and deployed to increase the security of the US ATLAS and RHIC Computing Facilities, while trying to maintain the openness and accessibility that our user community has come to expect. Included in this discussion are the tools that we have used and the operational experience we have had with the deployed architecture.

  7. Physics Division computer facilities

    Energy Technology Data Exchange (ETDEWEB)

    Cyborski, D.R.; Teh, K.M.

    1995-08-01

The Physics Division maintains several computer systems for data analysis, general-purpose computing, and word processing. While the VMS VAX clusters are still used, this past year saw a greater shift to the Unix cluster with the addition of more RISC-based Unix workstations. The main Divisional VAX cluster consists of two VAX 3300s configured as a dual-host system that serves as boot node and disk server to seven satellite nodes: two VAXstation 3200s, three VAXstation 3100s, a VAX-11/750, and a MicroVAX II. Three 6250/1600 bpi 9-track tape drives, six 8-mm tape drives, and about 9.1 GB of disk storage are served to the cluster by the various satellites. Two of the satellites (the MicroVAX and VAX-11/750) have DAPHNE front-end interfaces for data acquisition. Since the tape drives are accessible cluster-wide via a software package, they are used for tape-to-tape copies in addition to replay; there is, however, a satellite node outfitted with two 8-mm drives available specifically for this purpose. Although not part of the main cluster, a DEC 3000 Alpha machine obtained for data acquisition is also available for data replay. In one case, users reported a performance increase by a factor of 10 when using this machine.

  8. AMRITA -- A computational facility

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, J.E. [California Inst. of Tech., CA (US); Quirk, J.J.

    1998-02-23

Amrita is a software system for automating numerical investigations. The system is driven by its own powerful scripting language, Amrita, which facilitates both the composition and the archiving of complete numerical investigations, as distinct from isolated computations. Once archived, an Amrita investigation can later be reproduced by any interested party, not just the original investigator, at no cost other than the raw CPU time needed to parse the archived script. In fact, this entire lecture can be reconstructed in such a fashion. To do this, the script: constructs a number of shock-capturing schemes; runs a series of test problems; generates the plots shown; outputs the LaTeX to typeset the notes; and performs a myriad of behind-the-scenes tasks to glue everything together. Thus Amrita has all the characteristics of an operating system and should not be mistaken for a common-or-garden code.

  9. Facility for positron computed tomography

    International Nuclear Information System (INIS)

The positron computed tomography facility has scintillator detector rings that simultaneously record more than one tomographic image of different cross-sections of the patient. The detectors in neighboring rings are staggered and can be rotated with respect to each other in order to increase the count rate without loss of efficiency. (DG)

  10. Facility for positron computed tomography

    International Nuclear Information System (INIS)

For positron computed tomography, two or more rings of scintillation detectors are used, by which three or more sections of the object may be determined at a time. The rings are placed in parallel planes at some distance from each other, with axially movable collimator rings provided. Each collimator can be moved towards the opposite collimator and towards a central collimator, which is also ring-shaped and placed between the rows of detectors. The external and internal collimators are used for data selection and image formation. (DG)

  11. Central Facilities Area Sewage Lagoon Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Giesbrecht, Alan [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

The Central Facilities Area (CFA), located in Butte County, Idaho, at Idaho National Laboratory (INL), has an existing wastewater system to collect and treat sanitary wastewater and non-contact cooling water from the facility. The existing treatment facility consists of three cells: Cell 1 has a surface area of 1.7 acres, Cell 2 has a surface area of 10.3 acres, and Cell 3 has a surface area of 0.5 acres. If flows exceed the evaporative capacity of the cells, wastewater is discharged to a 73.5-acre land application site that utilizes a center-pivot irrigation sprinkler system. The purpose of this current study is to update the analysis and conclusions of the December 2013 study. In this current study, new seepage rate and influent flow rate data have been used to update the calculations, model, and analysis.

  12. 2015 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  13. 2014 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  14. SERC Central Laser Facility annual report 1992

    International Nuclear Information System (INIS)

In this 1992 Annual Report to the Laser Facility Committee of the Science and Engineering Research Council, technical progress at the Central Laser Facility, Rutherford Appleton Laboratory, is described and mid-term organizational goals are outlined. Outstanding among recent achievements is the work on plasma heating being undertaken on the Sprite facility using the ultra-bright KrF-laser-pumped Raman beams. Two-beam operation at power levels approaching 2 TW in 10 ps is hoped for. On a four-year timescale the Titania system will provide four Raman beams of exceptional brightness and power up to 20 TW in 10 ps. The other high-power laser facility, Vulcan, is also producing exciting work. Progress in nanosecond studies using Raman spectroscopy has produced the first Raman spectrum of solvated buckminsterfullerene and direct observation of the separation of geminate ion pairs, as well as information on the behaviour of a single base in an oligonucleotide chain. Phase boundaries for the solidification of a two-dimensional electron fluid have been determined in a gallium arsenide heterojunction. Despite staff attrition, operation and development of the facilities have continued successfully. (UK)

  15. Science and Engineering Research Council Central Laser Facility

    International Nuclear Information System (INIS)

This report covers the work done at, or in association with, the Central Laser Facility during the year April 1980 to March 1981. The first chapter describes the major reconstruction and upgrade of the glass laser, undertaken in order to increase the versatility of the facility. The work of the six groups of the Glass Laser Scientific Programme and Scheduling Committee is described in further chapters entitled: glass laser development; laser plasma interactions; transport and particle emission studies; ablative acceleration and compression studies; spectroscopy and XUV lasers; and theory and computation. Publications based on the work of the facility which have either appeared or been accepted for publication during the year are listed. (U.K.)

  16. Central Facilities Area Sewage Lagoon Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Mark R. Cole

    2013-12-01

    The Central Facilities Area (CFA), located in Butte County, Idaho, at the Idaho National Laboratory has an existing wastewater system to collect and treat sanitary wastewater and non-contact cooling water from the facility. The existing treatment facility consists of three cells: Cell #1 has a surface area of 1.7 acres, Cell #2 has a surface area of 10.3 acres, and Cell #3 has a surface area of 0.5 acres. If flows exceed the evaporative capacity of the cells, wastewater is discharged to a 73.5-acre land application site that uses a center-pivot irrigation sprinkler system. As flows at CFA have decreased in recent years, the amount of wastewater discharged to the land application site has decreased from 13.64 million gallons in 2004 to no discharge in 2012 and 2013. In addition to the decreasing need for land application, approximately 7.7 MG of supplemental water was added to the system in 2013 to maintain a water level and prevent the clay soil liners in the cells from drying out and “cracking.” The Idaho National Laboratory is concerned that the sewage lagoons and land application site may be oversized for current and future flows. A further concern is the sustainability of the large volumes of supplemental water that are added to the system according to current operational practices. Therefore, this study was initiated to evaluate the system capacity, operational practices, and potential improvement alternatives, as warranted.
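As an illustration of the sizing arithmetic behind this kind of lagoon evaluation, the sketch below estimates annual evaporative capacity from the cell areas given in the record. The net evaporation rate used here is an assumed placeholder, not a figure from the study.

```python
ACRE_FT_TO_GAL = 325_851  # US gallons in one acre-foot

# Surface areas of the three lagoon cells (acres), as given in the record
cells = {"Cell 1": 1.7, "Cell 2": 10.3, "Cell 3": 0.5}
total_area_acres = sum(cells.values())  # 12.5 acres

# Assumed net annual evaporation (evaporation minus precipitation), ft/yr.
# Hypothetical placeholder for illustration only, not a value from the report.
net_evaporation_ft = 2.5

capacity_gal = total_area_acres * net_evaporation_ft * ACRE_FT_TO_GAL
print(f"Total cell area:      {total_area_acres:.1f} acres")
print(f"Evaporative capacity: {capacity_gal / 1e6:.1f} million gallons/yr")
```

Under these assumptions the cells evaporate on the order of 10 million gallons per year, which makes it easy to see why influent flows well below that level leave nothing for the 73.5-acre land application site and require supplemental water to keep the clay liners wet.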

  17. Computer facilities at the Research Centre Seibersdorf

    International Nuclear Information System (INIS)

    The computer facilities available at the Mathematics Division of the Institute are outlined including their development since 1966. The major areas of use of the computers by the science divisions and in administration are described as well as the tasks performed for industry. Two examples of the computer applications are considered in some detail: 1) A system developed for control and data acquisition in asbestos-cement plate production; 2) A model treatment of safety calculations for the steam generating systems of light-water reactors. (S.R.)

  18. Computer simulation of tritium removal facility design

    International Nuclear Information System (INIS)

    In this study, a computer simulation of tritium diffusion out of molten salt is performed using COMSOL Multiphysics. The purpose of the simulation is to investigate the efficiency of the permeation window type tritium removal facility, which is proposed for tritium control in FHRs. The result of the simulation suggests a large surface area is one of the key issues in the design of the tritium removal facility, and the simple tube bundle concept is insufficient to provide the surface area needed for an efficient tritium removal process. (author)

  19. The Central Laser Facility at the Pierre Auger Observatory

    CERN Document Server

    Arqueros, F; Covault, C; D'Urso, D; Giulio, C D; Facal, P; Fick, B; Guarino, F; Malek, M; Matthews, J A J; Matthews, J; Meyhandan, R; Monasor, M; Mostafa, M; Petrinca, P; Roberts, M; Sommers, P; Travnicek, P; Valore, L; Verzi, V; Wiencke, L

    2005-01-01

    The Central Laser Facility is located near the middle of the Pierre Auger Observatory in Argentina. It features a UV laser and optics that direct a beam of calibrated pulsed light into the sky. Light scattered from this beam produces tracks in the Auger optical detectors which normally record nitrogen fluorescence tracks from cosmic ray air showers. The Central Laser Facility provides a "test beam" to investigate properties of the atmosphere and the fluorescence detectors. The laser can send light via optical fiber simultaneously to the nearest surface detector tank for hybrid timing analyses. We describe the facility and show some examples of its many uses.

  20. The Central laser facility at the Pierre Auger Observatory

    Energy Technology Data Exchange (ETDEWEB)

    Arqueros, F.; Bellido, J.; Covault, C.; D' Urso, D.; Di Giulio, C.; Facal, P.; Fick, B.; Guarino, F.; Malek, M.; Matthews, J.A.J.; Matthews, J.; Meyhandan, R.; Monasor,; Mostafa, M.; Petrinca, P.; Roberts, M.; Sommers, P.; Travnicek, P.; Valore, L.; Verzi, V.; Wiencke, Lawrence; /Utah U.

    2005-07-01

The Central Laser Facility is located near the middle of the Pierre Auger Observatory in Argentina. It features a UV laser and optics that direct a beam of calibrated pulsed light into the sky. Light scattered from this beam produces tracks in the Auger optical detectors which normally record nitrogen fluorescence tracks from cosmic ray air showers. The Central Laser Facility provides a "test beam" to investigate properties of the atmosphere and the fluorescence detectors. The laser can send light via optical fiber simultaneously to the nearest surface detector tank for hybrid timing analyses. We describe the facility and show some examples of its many uses.

  1. Computing betweenness centrality in external memory

    DEFF Research Database (Denmark)

    Arge, Lars; Goodrich, Michael T.; Walderveen, Freek van

    2013-01-01

Betweenness centrality is one of the most well-known measures of the importance of nodes in a social-network graph. In this paper we describe the first known external-memory and cache-oblivious algorithms for computing betweenness centrality. We present four different external-memory algorithms … exhibiting various tradeoffs with respect to performance. Two of the algorithms are cache-oblivious. We describe general algorithms for networks with weighted and unweighted edges and a specialized algorithm for networks with small diameters, as is common in social networks exhibiting the “small worlds…
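For readers unfamiliar with the measure, here is a minimal in-memory sketch of Brandes' algorithm for unweighted graphs; the external-memory and cache-oblivious variants described in the record target graphs too large for this simple approach.

```python
from collections import deque

def betweenness_centrality(adj):
    """Brandes' algorithm for unweighted graphs.

    adj: dict mapping each node to a list of its neighbours.
    Returns raw (unnormalized) scores; in an undirected graph each
    node pair contributes in both directions.
    """
    bc = {v: 0.0 for v in adj}
    for s in adj:
        # Phase 1: BFS from s, counting shortest paths (sigma) and
        # recording predecessors on those paths.
        stack, queue = [], deque([s])
        pred = {v: [] for v in adj}
        sigma = dict.fromkeys(adj, 0); sigma[s] = 1
        dist = dict.fromkeys(adj, -1); dist[s] = 0
        while queue:
            v = queue.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:                 # w discovered for the first time
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:      # shortest path to w runs via v
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        # Phase 2: accumulate pair dependencies in reverse BFS order.
        delta = dict.fromkeys(adj, 0.0)
        while stack:
            w = stack.pop()
            for v in pred[w]:
                delta[v] += (sigma[v] / sigma[w]) * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc
```

For the path graph a–b–c, only b lies on a shortest path between other nodes, so it receives all the centrality (2.0, since each unordered pair is counted in both directions).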

  2. Do central bank liquidity facilities affect interbank lending rates?

    OpenAIRE

    Jens H. E. Christensen; Lopez, Jose A.; Rudebusch, Glenn D.

    2009-01-01

    In response to the global financial crisis that started in August 2007, central banks provided extraordinary amounts of liquidity to the financial system. To investigate the effect of central bank liquidity facilities on term interbank lending rates, we estimate a six-factor arbitrage-free model of U.S. Treasury yields, financial corporate bond yields, and term interbank rates. This model can account for fluctuations in the term structure of credit risk and liquidity risk. A significant shift...

  3. Conceptual design report for Central Waste Disposal Facility

    International Nuclear Information System (INIS)

The permanent facilities are defined, and cost estimates are provided for the disposal of Low-Level Radioactive Wastes (LLW) at the Central Waste Disposal Facility (CWDF). The waste designated for the Central Waste Disposal Facility will be generated by the Y-12 Plant, the Oak Ridge Gaseous Diffusion Plant, and the Oak Ridge National Laboratory. The facility will be operated by ORNL for the Office of Defense Waste and By-Products Management of the Department of Energy. The CWDF will be located on the Department of Energy's Oak Ridge Reservation, west of Highway 95 and south of Bear Creek Road. The body of this Conceptual Design Report (CDR) describes the permanent facilities required for the operation of the CWDF. Initial facilities, trenches, and minimal operating equipment will be provided in earlier projects. The disposal of LLW will be by shallow land burial in engineered trenches. DOE Order 5820 was used as the performance standard for the proper disposal of radioactive waste. The permanent facilities are intended for beneficial occupancy during the first quarter of fiscal year 1989. 3 references, 9 figures, 7 tables

  4. Centralization and Decentralization of Schools' Physical Facilities Management in Nigeria

    Science.gov (United States)

    Ikoya, Peter O.

    2008-01-01

    Purpose: This research aims to examine the difference in the availability, adequacy and functionality of physical facilities in centralized and decentralized schools districts, with a view to making appropriate recommendations to stakeholders on the reform programmes in the Nigerian education sector. Design/methodology/approach: Principals,…

  5. Computational evaluation of a neutron field facility

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Jose Julio de O.; Pazianotto, Mauricio T., E-mail: jjfilos@hotmail.com, E-mail: mpazianotto@gmail.com [Instituto Tecnologico de Aeronautica (ITA/DCTA), Sao Jose dos Campos, SP (Brazil); Federico, Claudio A.; Passaro, Angelo, E-mail: claudiofederico@ieav.cta.br, E-mail: angelo@ieav.cta.br [Instituto de Estudos Avancados (IEAv/DCTA), Sao Jose dos Campos, SP (Brazil)

    2015-07-01

This paper describes the results of a study based on computer simulation of a realistic 3D model of the Ionizing Radiation Laboratory of the Institute for Advanced Studies (IEAv), using the MCNP5 (Monte Carlo N-Particle) code, in order to guide the installation of a neutron generator based on the ³H(d,n)⁴He reaction. The equipment produces 14.1 MeV neutrons at a production rate of 2 × 10⁸ n/s in 4π geometry, and can also be used for neutron dosimetry studies. This work evaluated the neutron spectra and fluence at previously selected positions inside the facility, chosen in order to assess the ambient dose equivalent, so that the necessary adjustments can be made to the installation to comply with the radiation protection and radiation safety guidelines determined by the standards of the National Nuclear Energy Commission (CNEN). (author)

  6. Computational analysis of irradiation facilities at the JSI TRIGA reactor.

    Science.gov (United States)

    Snoj, Luka; Zerovnik, Gašper; Trkov, Andrej

    2012-03-01

Characterization and optimization of the irradiation facilities in a research reactor are important for optimal performance. Nowadays this is commonly done with advanced Monte Carlo neutron transport codes such as MCNP. However, the computational model in such calculations should be verified and validated against experiments. In this paper we describe the irradiation facilities at the JSI TRIGA reactor and demonstrate their computational characterization in support of experimental campaigns, providing information on the characteristics of the irradiation facilities. PMID:22154389

  7. Technical evaluation of proposed Ukrainian Central Radioactive Waste Processing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gates, R.; Glukhov, A.; Markowski, F.

    1996-06-01

This technical report is a comprehensive evaluation of the proposal by the Ukrainian State Committee on Nuclear Power Utilization to create a central facility for radioactive waste (not spent fuel) processing. The central facility is intended to process liquid and solid radioactive wastes generated by all of the Ukrainian nuclear power plants and the waste generated as a result of the Chernobyl 1, 2, and 3 decommissioning efforts. In addition, this report provides general information on the quantity and total activity of radioactive waste in the 30-km Zone and the Sarcophagus from the Chernobyl accident. Processing options are described that may ultimately be used in the long-term disposal of selected 30-km Zone and Sarcophagus wastes. A detailed report on the issues concerning the construction of a Ukrainian Central Radioactive Waste Processing Facility (CRWPF) was obtained from the Ukrainian Scientific Research and Design Institute for Industrial Technology and incorporated into this report. This report outlines various processing options, their associated costs, and construction schedules, which can be applied to solving the operating and decommissioning radioactive waste management problems in Ukraine. The costs and schedules are best estimates based upon the most current US industry practice and vendor information. This report focuses primarily on the handling and processing of what is defined in the US as low-level radioactive waste.

  8. Technical evaluation of proposed Ukrainian Central Radioactive Waste Processing Facility

    International Nuclear Information System (INIS)

This technical report is a comprehensive evaluation of the proposal by the Ukrainian State Committee on Nuclear Power Utilization to create a central facility for radioactive waste (not spent fuel) processing. The central facility is intended to process liquid and solid radioactive wastes generated by all of the Ukrainian nuclear power plants and the waste generated as a result of the Chernobyl 1, 2, and 3 decommissioning efforts. In addition, this report provides general information on the quantity and total activity of radioactive waste in the 30-km Zone and the Sarcophagus from the Chernobyl accident. Processing options are described that may ultimately be used in the long-term disposal of selected 30-km Zone and Sarcophagus wastes. A detailed report on the issues concerning the construction of a Ukrainian Central Radioactive Waste Processing Facility (CRWPF) was obtained from the Ukrainian Scientific Research and Design Institute for Industrial Technology and incorporated into this report. This report outlines various processing options, their associated costs, and construction schedules, which can be applied to solving the operating and decommissioning radioactive waste management problems in Ukraine. The costs and schedules are best estimates based upon the most current US industry practice and vendor information. This report focuses primarily on the handling and processing of what is defined in the US as low-level radioactive waste.

  9. Computer software design description for the Treated Effluent Disposal Facility (TEDF), Project L-045H, Operator Training Station (OTS)

    International Nuclear Information System (INIS)

    The Treated Effluent Disposal Facility (TEDF) Operator Training Station (OTS) is a computer-based training tool designed to aid plant operations and engineering staff in familiarizing themselves with the TEDF Central Control System (CCS)

  10. High Performance Computing Facility Operational Assessment, CY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Bland, Arthur S Buddy [ORNL; Boudwin, Kathlyn J. [ORNL; Hack, James J [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL; Hudson, Douglas L [ORNL

    2012-02-01

Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources; of these, we report in this review the 300 that are consistent with the guidance provided. Scientific achievements by OLCF users cut across all range scales, from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source with simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source with simulations on Jaguar to reveal the molecular structure of the ligno-cellulosic material used in bioethanol production. This year, Jaguar has been used for billion-cell CFD calculations to develop shock-wave-compression turbomachinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbomachinery to learn what efficiencies the traditional steady-flow assumption is hiding from designers. Even a 1% improvement in turbine design can save the nation

  11. High Performance Computing Facility Operational Assessment 2015: Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Ashley D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bernholdt, David E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bland, Arthur S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Gary, Jeff D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Hack, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; McNally, Stephen T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Rogers, James H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Smith, Brian E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Straatsma, T. P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Sukumar, Sreenivas Rangan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Thach, Kevin G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Tichenor, Suzy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Vazhkudai, Sudharshan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Wells, Jack C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility

    2016-03-01

    Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others. The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example Titan enabled engineers in Los Angeles and the surrounding region to design and begin building improved critical infrastructure by enabling the highest-resolution Cybershake map for Southern

  12. High Performance Computing Facility Operational Assessment, FY 2010 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; White, Julia C [ORNL

    2010-08-01

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools

  13. The Cost of Supplying Segmented Consumers From a Central Facility

    DEFF Research Database (Denmark)

    Turkensteen, Marcel; Klose, Andreas

    Organizations regularly face the strategic marketing decision which groups of consumers they should target. A potential problem, highlighted in Steenkamp et al. (2002), is that the target consumers may be so widely dispersed that an organization cannot serve its customers cost-effectively. We consider three measures of dispersion of demand points: the average distance between demand points, the maximum distance, and the surface size. In our distribution model, all demand points are restocked from a central facility. The observed logistics costs are determined using the tour length estimations described in Daganzo (2004). Normal, continuous travel distance estimates require that demand locations are uniformly distributed across the plane, but we also consider scenarios with non-uniformly distributed demand locations. The resulting travel distances are highly correlated with our surface size...

  14. Computational analysis of irradiation facilities at the JSI TRIGA reactor

    Energy Technology Data Exchange (ETDEWEB)

    Snoj, Luka, E-mail: luka.snoj@ijs.si [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Zerovnik, Gasper, E-mail: gasper.zerovnik@ijs.si [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, Andrej, E-mail: andrej.trkov@ijs.si [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia)

    2012-03-15

    Characterization and optimization of irradiation facilities in a research reactor is important for optimal performance. Nowadays this is commonly done with advanced Monte Carlo neutron transport computer codes such as MCNP. However, the computational model in such calculations should be verified and validated with experiments. In the paper we describe the irradiation facilities at the JSI TRIGA reactor and demonstrate their computational characterization to support experimental campaigns by providing information on the characteristics of the irradiation facilities. - Highlights: the TRIGA reactor at JSI is suitable for irradiation under well-defined conditions; it features irradiation channels with different fluxes, spectra, and dimensions; a computational model has been developed and experimentally verified; the model is used for optimization of experiments and evaluation of uncertainties.

  15. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL

    2011-08-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and

  16. National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    The NIF design team is developing the Integrated Computer Control System (ICCS), which is based on an object-oriented software framework applicable to event-driven control systems. The framework provides an open, extensible architecture that is sufficiently abstract to construct future mission-critical control systems. The ICCS will become operational when the first 8 out of 192 beams are activated in mid 2000. The ICCS consists of 300 front-end processors attached to 60,000 control points coordinated by a supervisory system. Computers running either Solaris or VxWorks are networked over a hybrid configuration of switched fast Ethernet and asynchronous transfer mode (ATM). ATM carries digital motion video from sensors to operator consoles. Supervisory software is constructed by extending the reusable framework components for each specific application. The framework incorporates services for database persistence, system configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. More than twenty collaborating software applications are derived from the common framework. The framework is interoperable among different kinds of computers and functions as a plug-in software bus by leveraging a common object request brokering architecture (CORBA). CORBA transparently distributes the software objects across the network. Because of the pivotal role played, CORBA was tested to ensure adequate performance

  17. National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Van Arsdall, P.J., LLNL

    1998-06-01

    The NIF design team is developing the Integrated Computer Control System (ICCS), which is based on an object-oriented software framework applicable to event-driven control systems. The framework provides an open, extensible architecture that is sufficiently abstract to construct future mission-critical control systems. The ICCS will become operational when the first 8 out of 192 beams are activated in mid 2000. The ICCS consists of 300 front-end processors attached to 60,000 control points coordinated by a supervisory system. Computers running either Solaris or VxWorks are networked over a hybrid configuration of switched fast Ethernet and asynchronous transfer mode (ATM). ATM carries digital motion video from sensors to operator consoles. Supervisory software is constructed by extending the reusable framework components for each specific application. The framework incorporates services for database persistence, system configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. More than twenty collaborating software applications are derived from the common framework. The framework is interoperable among different kinds of computers and functions as a plug-in software bus by leveraging a common object request brokering architecture (CORBA). CORBA transparently distributes the software objects across the network. Because of the pivotal role played, CORBA was tested to ensure adequate performance.
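    The "plug-in software bus" role that CORBA plays in the ICCS, as described in the abstract above, can be illustrated with a minimal in-process sketch. This is a hypothetical Python analogue for illustration only: the names `Broker` and `AlignmentController` are invented here, and the real ICCS uses CORBA to distribute objects across Ada and Java processes on the network.

```python
# Minimal sketch of a "software bus" in the spirit of an object request
# broker: clients invoke named objects without knowing where they live.
# Hypothetical analogue only; everything here runs in one process.

class Broker:
    """Name-to-object registry standing in for a CORBA naming service."""
    def __init__(self):
        self._objects = {}

    def register(self, name, obj):
        self._objects[name] = obj

    def invoke(self, name, method, *args):
        # In real CORBA this call could transparently cross the network;
        # here it is a local dispatch by name.
        return getattr(self._objects[name], method)(*args)

class AlignmentController:
    """Stand-in for one front-end application built on the framework."""
    def status(self):
        return "aligned"

bus = Broker()
bus.register("beam.alignment", AlignmentController())
print(bus.invoke("beam.alignment", "status"))  # -> aligned
```

The point of the pattern is that supervisory applications depend only on object names and interfaces, not on which processor hosts each of the 300 front-end applications.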

  18. Accelerator system for the Central Japan Synchrotron Radiation Facility

    International Nuclear Information System (INIS)

    The accelerator system for the Central Japan Synchrotron Radiation Research Facility, consisting of a 50 MeV S-band electron linac, a 1.2 GeV full-energy booster synchrotron, and a 1.2 GeV storage ring, has been constructed. Eight 1.4 T bending magnets and four 5 T superconducting magnets with compact refrigerator systems serve the beam lines. For top-up operation, the 1 ns single-bunch electron beam from the 50 MeV injector linac is injected by an on-axis injection scheme and accelerated up to 1.2 GeV in the booster synchrotron. The timing system is designed so that injection from the booster ring is possible into any bunch position of the storage ring. To improve the efficiency of booster injection, the electron gun trigger and the RF frequency of 2856 MHz are synchronized with the storage ring frequency of 499.654 MHz. The EPICS control system, together with the timing control system, is used for the linac, the pulse magnets, and the booster pattern memory system. Beam commissioning of the 1.2 GeV storage ring is in progress. (author)

  19. Computational Modeling in Support of National Ignition Facility Operations

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, M J; Sacks, R A; Haynam, C A; Williams, W H

    2001-10-23

    Numerical simulation of the National Ignition Facility (NIF) laser performance and automated control of laser setup process are crucial to the project's success. These functions will be performed by two closely coupled computer codes: the virtual beamline (VBL) and the laser operations performance model (LPOM).

  20. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    Science.gov (United States)

    du Plessis, Anton; le Roux, Stephan Gerhard; Guelpa, Anina

    2016-10-01

    The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high-performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers and companies as well as remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments: a micro-CT system and a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers and data analysis software packages, which are at the disposal of facility users, along with expert supervision if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility has accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT. This paper summarises the laboratory's first four years by way of selected examples, from both published and unpublished projects. In the process, a detailed description of the capabilities and facilities available to users is presented.

  1. Functional Analysis and Preliminary Specifications for a Single Integrated Central Computer System for Secondary Schools and Junior Colleges. A Feasibility and Preliminary Design Study. Interim Report.

    Science.gov (United States)

    Computation Planning, Inc., Bethesda, MD.

    A feasibility analysis of a single integrated central computer system for secondary schools and junior colleges finds that a central computing facility capable of serving 50 schools with a total enrollment of 100,000 students is feasible at a cost of $18 per student per year. The recommended system is a multiprogrammed-batch operation. Preliminary…
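    The feasibility figures quoted above imply a simple budget check, which can be worked through directly from the numbers in the abstract (50 schools, 100,000 students, $18 per student per year):

```python
# Back-of-the-envelope check of the feasibility figures quoted above.
students = 100_000
schools = 50
cost_per_student = 18  # dollars per student per year

annual_cost = students * cost_per_student
per_school = annual_cost / schools

print(f"Total annual cost: ${annual_cost:,}")    # $1,800,000
print(f"Average per school: ${per_school:,.0f}")  # $36,000
```

At roughly $1.8 million per year overall, the average burden per participating school is about $36,000, which is the kind of comparison the study's multiprogrammed-batch recommendation rests on.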

  2. Central Facilities Area Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Brion Bennett

    2011-11-01

    Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Central Facilities Area facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool for developing the radioactive waste management basis.

  3. Empirical Foundation of Central Concepts for Computer Science Education

    Science.gov (United States)

    Zendler, Andreas; Spannagel, Christian

    2008-01-01

    The design of computer science curricula should rely on central concepts of the discipline rather than on technical short-term developments. Several authors have proposed lists of basic concepts or fundamental ideas in the past. However, these catalogs were based on subjective decisions without any empirical support. This article describes the…

  4. Distributed Computing with Centralized Support Works at Brigham Young.

    Science.gov (United States)

    McDonald, Kelly; Stone, Brad

    1992-01-01

    Brigham Young University (Utah) has addressed the need for maintenance and support of distributed computing systems on campus by implementing a program patterned after a national business franchise, providing the support and training of a centralized administration but allowing each unit to operate much as an independent small business.…

  5. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  6. A central tower solar test facility /RM/CTSTF/

    Science.gov (United States)

    Bevilacqua, S.; Gislon, R.

    The facility is intended for test work in connection with studies of receivers, thermodynamic cycles, heliostats, components, and subassemblies. Major components of the test facility include a mirror field with a reflecting surface of 800 sq m, a 40 m tower, an electronic control system, a data-acquisition system, and a meteorological station. A preliminary experimental program is discussed, taking into account investigations related to facility characterization, an evaluation of advanced low-cost heliostats, materials and components tests, high-concentration photovoltaic experiments, and a study of advanced solar thermal cycles.

  7. Security central processing unit applications in the protection of nuclear facilities

    International Nuclear Information System (INIS)

    New or upgraded electronic security systems protecting nuclear facilities or complexes will be heavily computer dependent. Proper planning for new systems and the employment of new state-of-the-art 32-bit processors in the processing of subsystem reports are key elements in effective security systems. The processing of subsystem reports represents only a small segment of system overhead. In selecting a security system to meet the current and future needs for nuclear security applications, the central processing unit (CPU) applied in the system architecture is the critical element in system performance. New 32-bit technology eliminates the need for program overlays while providing system programmers with well documented program tools to develop effective systems to operate in all phases of nuclear security applications

  8. New construction of an inside-container drying facility in the central decontamination and water treatment facility (ZDW)

    International Nuclear Information System (INIS)

    For the future conditioning of radioactive liquid waste during the ongoing dismantling of the NPP Greifswald, the GNS company is providing an inside-container drying facility as part of the new construction of the ZDW (central decontamination and water treatment facility), including the related infrastructure and media supply. The concept of the FAVORIT facility, which has been in operation for years, has been refined; a fully automated version was realized so that no handling by personnel is necessary for loading and unloading of the container station. Components of the vacuum system were also optimized.

  9. Shielding Calculations for Positron Emission Tomography - Computed Tomography Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Baasandorj, Khashbayar [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Yang, Jeongseon [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-10-15

    Integrated PET-CT has been shown to be more accurate for lesion localization and characterization than PET or CT alone, or than results obtained from PET and CT separately and interpreted side by side or after software-based fusion of the PET and CT datasets. At the same time, PET-CT scans can result in high patient and staff doses; therefore, careful site planning and shielding of this imaging modality have become challenging issues in the field. In Mongolia, the introduction of PET-CT facilities is currently being considered in many hospitals. Thus, additional regulatory legislation for nuclear and radiation applications is necessary, for example, in regulating licensee processes and ensuring radiation safety during operations. This paper aims to determine appropriate PET-CT shielding designs using numerical formulas and computer code. Since there are presently no PET-CT facilities in Mongolia, contact was made with radiological staff at the Nuclear Medicine Center of the National Cancer Center of Mongolia (NCCM) to obtain information about facilities where the introduction of PET-CT is being considered. Well-designed facilities do not require additional shielding, which should help cut down the overall costs of PET-CT installation. According to the results of this study, the barrier thicknesses of the NCCM building are not sufficient to keep radiation doses within the limits.
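    The kind of "numerical formulas and computer code" shielding estimate the abstract refers to can be sketched as a broad-beam attenuation calculation, loosely in the spirit of AAPM Task Group 108. All of the numbers below (dose-rate constant, tenth-value layer, workload, design limit) are illustrative assumptions for the sketch, not design values; an actual design must use the report's parameters, decay and uptake corrections, and local regulations.

```python
import math

# Illustrative broad-beam shielding estimate for a PET facility wall.
# GAMMA and TVL_CONCRETE are representative textbook-style values for
# F-18 annihilation photons, assumed here for illustration only.
GAMMA = 0.143        # uSv*m^2/(MBq*h), approx. dose-rate constant for F-18
TVL_CONCRETE = 17.6  # cm, approx. tenth-value layer of concrete at 511 keV

def barrier_thickness_cm(activity_mbq, hours_per_patient,
                         patients_per_week, distance_m, limit_usv_week):
    """Concrete thickness needed to bring the weekly dose behind the
    barrier down to the design limit (ignoring decay and patient
    self-attenuation, which a full TG-108 calculation would credit)."""
    unshielded = (GAMMA * activity_mbq * hours_per_patient
                  * patients_per_week) / distance_m**2
    transmission = limit_usv_week / unshielded
    if transmission >= 1.0:
        return 0.0  # inverse-square falloff alone meets the limit
    # Each tenth-value layer cuts transmission by a factor of 10.
    return math.log10(1.0 / transmission) * TVL_CONCRETE

# Example: 40 patients/week, 400 MBq each, 0.5 h in the room,
# wall at 3 m, 20 uSv/week public-area design limit.
print(round(barrier_thickness_cm(400, 0.5, 40, 3.0, 20.0), 1))
```

The structure mirrors the paper's workflow: compute the unshielded weekly dose at the point of interest, derive the required transmission factor, and convert it to a barrier thickness via tenth-value layers.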

  10. Central Design Team (CDT) for an experimental fast spectra transmutation facility

    Energy Technology Data Exchange (ETDEWEB)

    Perezagua Aguado, M.

    2011-07-01

    The Central Design Team project comes as part of the 7th Framework Programme of the EC. Its purpose is to develop the basic engineering of facilities for use as a test-bed for transmutation and fast-spectrum irradiation installations, operating as an accelerator-driven subcritical system and/or as a critical reactor. These facilities will be installed in the upcoming SCK complex in Mol (Belgium) called MYRRHA (a multi-purpose flexible irradiation facility).

  11. Computational investigations of low-emission burner facilities for char gas burning in a power boiler

    Science.gov (United States)

    Roslyakov, P. V.; Morozov, I. V.; Zaychenko, M. N.; Sidorkin, V. T.

    2016-04-01

    Various design variants of low-emission burner facilities intended for char gas burning in an operating TP-101 boiler at the Estonia power plant are considered. The planned increase in the volume of shale reprocessing and, correspondingly, in char gas volumes makes co-combustion of the gas necessary. There was therefore a need to develop a burner facility of a given capacity that burns char gas effectively while meeting reliability and environmental requirements. To this end, the burner design was based on staged fuel combustion with gas recirculation. From a preliminary analysis of possible design variants, three types of previously proven burner facilities were chosen: a vortex burner with supply of recirculation gases into the secondary air, a vortex burner with baffle supply of recirculation gases between the primary and secondary air flows, and a burner facility with a vortex pilot burner. Optimum structural characteristics and operating parameters were determined in numerical experiments. These experiments, carried out with the ANSYS CFX computational fluid dynamics software, simulated the mixing, ignition, and burning of char gas. For each type of burner facility, the numerical experiments determined the structural and operating parameters that give effective char gas burning and meet the required environmental standard on nitrogen oxide emissions. Based on the computation results, a burner facility for char gas burning with a pilot diffusion burner in the central part was developed and manufactured. Preliminary verification tests on the TP-101 boiler showed that the actual content of nitrogen oxides in the char gas burner flames did not exceed the declared concentration of 150 ppm (200 mg/m3).

  12. The Argonne Leadership Computing Facility 2010 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Drugan, C. (LCF)

    2011-05-09

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers

  13. 78 FR 18353 - Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility...

    Science.gov (United States)

    2013-03-26

    ... HUMAN SERVICES Food and Drug Administration Guidance for Industry: Blood Establishment Computer System... ``Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility'' dated April... document entitled ``Guidance for Industry: Blood Establishment Computer System Validation in the...

  14. Status of the National Ignition Facility Integrated Computer Control System

    International Nuclear Information System (INIS)

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately 3/4 complete with over 750 thousand source lines of code having undergone off-line verification tests and deployed to the facility. NIF has entered the first phases of its laser commissioning program. NIF has now demonstrated the highest energy 1ω, 2ω, and 3ω beamlines in the world. NIF's target experimental

  15. Improvement of the Computing - Related Procurement Process at a Government Research Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gittins, C.

    2000-04-03

    The purpose of the project was to develop, implement, and market value-added services through the Computing Resource Center (CRC) in an effort to streamline computing-related procurement processes across the Lawrence Livermore National Laboratory (LLNL). The power of the project was in focusing attention on the value of centralizing the delivery of computer-related products and services to the institution. The project required a plan and marketing strategy that would draw attention to the facility's value-added offerings and services. A significant outcome of the project has been the change in the CRC's internal organization. The realignment of internal policies and practices, together with additions to its product and service offerings, has brought an increased focus to the facility. This movement from a small, fractious organization into one that is still small yet well organized and focused on its mission and goals has been a significant transition. Indicative of this turnaround was the sharing of information: one-on-one and small-group meetings, together with statistics showing work activity, were invaluable in gaining support for more equitable workload distribution and in removing blame and finger-pointing. Sharing monthly reports on sales and operating costs also had a positive impact.

  16. High resolution muon computed tomography at neutrino beam facilities

    International Nuclear Information System (INIS)

    X-ray computed tomography (CT) has an indispensable role in constructing 3D images of objects made from light materials. However, limited by absorption coefficients, X-rays cannot deeply penetrate materials such as copper and lead. Here we show via simulation that muon beams can provide high resolution tomographic images of dense objects and of structures within the interior of dense objects. The effects of resolution broadening from multiple scattering diminish with increasing muon momentum. As the momentum of the muon increases, the contrast of the image goes down and therefore requires higher resolution in the muon spectrometer to resolve the image. The variance of the measured muon momentum reaches a minimum and then increases with increasing muon momentum. The impact of the increase in variance is to require a higher integrated muon flux to reduce fluctuations. The flux requirements and level of contrast needed for high resolution muon computed tomography are well matched to the muons produced in the pion decay pipe at a neutrino beam facility and to what can be achieved for momentum resolution in a muon spectrometer. Such an imaging system can be applied in archaeology, art history, engineering, material identification and whenever there is a need to image inside a transportable object constructed of dense materials
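    The claim that multiple-scattering blur diminishes with increasing muon momentum can be made concrete with the Highland formula from the Particle Data Group's review of particle passage through matter. The target choice (10 cm of copper) and the momenta are illustrative assumptions, not parameters from the paper:

```python
import math

# RMS projected multiple-scattering angle for muons traversing copper,
# via the Highland formula (PDG form). The 1/p scaling is what makes
# higher-momentum muons sharper imaging probes.

MUON_MASS = 105.66  # MeV/c^2
X0_COPPER = 1.436   # cm, radiation length of copper

def scattering_angle_mrad(p_mev, thickness_cm, x0_cm=X0_COPPER):
    """Highland estimate of the RMS projected scattering angle."""
    t = thickness_cm / x0_cm                     # thickness in radiation lengths
    beta = p_mev / math.hypot(p_mev, MUON_MASS)  # v/c for a muon of momentum p
    theta0 = (13.6 / (beta * p_mev)) * math.sqrt(t) * (1 + 0.038 * math.log(t))
    return theta0 * 1e3  # radians -> milliradians

for p in (1_000, 5_000, 10_000):  # MeV/c
    print(f"{p / 1000:>4.0f} GeV/c: {scattering_angle_mrad(p, 10.0):5.1f} mrad")
```

For relativistic muons the angle falls almost exactly as 1/p, so a tenfold increase in momentum buys roughly a tenfold reduction in scattering blur, at the cost of the contrast and momentum-resolution trade-offs the abstract describes.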

  17. Status of the National Ignition Facility Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L; Bryant, R; Carey, R; Casavant, D; Edwards, O; Ferguson, W; Krammen, J; Larson, D; Lee, A; Ludwigsen, P; Miller, M; Moses, E; Nyholm, R; Reed, R; Shelton, R; Van Arsdall, P J; Wuest, C

    2003-10-13

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately three-quarters complete, with over 750 thousand source lines of code having undergone off-line verification tests and been deployed to the facility. NIF has entered the first phases of its laser commissioning program and has now demonstrated the highest energy 1ω, 2ω, and 3ω beamlines in the world.

  18. Safety assessments for centralized waste treatment and disposal facility in Puspokszilagy Hungary

    International Nuclear Information System (INIS)

    The centralized waste treatment and disposal facility Puspokszilagy is a shallow land, near surface engineered type disposal unit. The site, together with its geographic, geological and hydrogeological characteristics, is described. Data are given on the radioactive inventory. The operational safety assessment and the post-closure safety assessment are outlined. (author)

  19. Operation and Maintenance Manual for the Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Norm Stanley

    2011-02-01

    This Operation and Maintenance Manual lists operator and management responsibilities, permit standards, general operating procedures, maintenance requirements and monitoring methods for the Sewage Treatment Plant at the Central Facilities Area at the Idaho National Laboratory. The manual is required by the Municipal Wastewater Reuse Permit (LA-000141-03) for the sewage treatment plant.

  20. Evaluation of Milking Facilities and Machinery in Dairy Operations of Central Anatolia

    Directory of Open Access Journals (Sweden)

    Cevdet Sağlam

    2014-09-01

    The present study was conducted to investigate recent changes in the dairy operations of Central Anatolia and to evaluate the milking facilities and machinery capacities of those operations. The basic objective was to document the recent changes brought about by the state support provided to dairy facilities in the Central Anatolian provinces. In evaluating these changes, the milking parlors, milking machinery, and livestock inventories of the facilities were taken into consideration, as were milk yields per animal and the number of animals per machine. The relevant data were gathered from recent records of the Turkish Statistical Institute (TUIK), the Agriculture and Rural Development Support Institution (TKDK), and the Ministry of Economy. According to TUIK records for the last six years, 384 milking parlors were built and 8877 milking machines were bought, while livestock inventories and milk yields per animal increased by 62.9% and 9.5%, respectively. The support provided to dairy operations is thought to have had a great effect on these developments. Such support should also be provided in the future to furnish all dairy operations with modern machinery and equipment, and thus to improve yields and product quality.

  1. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architectures. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  2. Computing collinear 4-Body Problem central configurations with given masses

    CERN Document Server

    Piña, E

    2011-01-01

    An interesting description of a collinear configuration of four particles is found in terms of two spherical coordinates. An algorithm to compute the four coordinates of the particles of a collinear Four-Body central configuration is presented, using an orthocentric tetrahedron whose edge lengths are functions of the given masses. Each mass is placed at the corresponding vertex of the tetrahedron. The center of mass (and orthocenter) of the tetrahedron is at the origin of coordinates. The tetrahedron is initially positioned with two pairs of vertices each in a coordinate plane, the lines joining each pair parallel to a coordinate axis, and the centers of mass of each pair, as well as that of all four particles, on one coordinate axis. From this original position the tetrahedron is rotated by two angles around the center of mass until the direction of the configuration coincides with one axis of coordinates. The four coordinates of the vertices of the tetrahedron along this direction determine the central configurati...
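
    For reference, the notion of a central configuration used in the abstract has a standard definition (stated here in general form, not taken from the paper): the gravitational acceleration of every body points toward the center of mass with a common proportionality constant.

    ```latex
    % Central configuration of n bodies with masses m_i at positions x_i
    % and center of mass c (units with G = 1): for some common \lambda > 0,
    \sum_{j \neq i} \frac{m_j \,(x_j - x_i)}{\lVert x_j - x_i \rVert^{3}}
      = -\lambda \,(x_i - c), \qquad i = 1, \dots, 4.
    ```

    For the collinear four-body problem treated in the paper, all $x_i$ lie on a single line, and the algorithm recovers positions satisfying this condition from the tetrahedron's geometry.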

  3. CAD/CAM transtibial prosthetic sockets from central fabrication facilities: How accurate are they?

    Science.gov (United States)

    Sanders, Joan E.; Rogers, Ellen L.; Sorenson, Elizabeth A.; Lee, Gregory S.; Abrahamson, Daniel C.

    2014-01-01

    This research compares transtibial prosthetic sockets made by central fabrication facilities with their corresponding American Academy of Orthotists and Prosthetists (AAOP) electronic shape files and assesses the central fabrication process. We ordered three different socket shapes from each of 10 manufacturers. Then we digitized the sockets using a very accurate custom mechanical digitizer. Results showed that quality varied considerably among the different manufacturers. Four of the companies consistently made sockets within ±1.1% volume (approximately 1 sock ply) of the AAOP electronic shape file, while six other companies did not. Six of the companies showed consistent undersizing or oversizing in their sockets, which suggests a consistent calibration or manufacturing error. Other companies showed inconsistent sizing or shape distortion, a difficult problem that represents a most challenging limitation for central fabrication facilities. PMID:18247236

  4. Central Laser Facility. Rutherford Appleton Laboratory; annual report 1996/97

    International Nuclear Information System (INIS)

    The Central Laser Facility (CLF) is one of the UK's major research facilities and is devoted to the provision of advanced laser systems for pure and applied research. Based at Rutherford Appleton Laboratory near Didcot in Oxfordshire, the CLF was set up twenty years ago to provide the UK university community with a world class high power laser system. The initial impetus behind the establishment of the CLF was the then recently released idea of laser induced thermonuclear fusion. These days, laser fusion is just one topic amongst many that are studied at the CLF, and the extent and capabilities of the laser systems have expanded enormously as new discoveries in the rapidly changing field of laser science have been incorporated into the facilities provided. This overview looks at these facilities as they were at the end of March 1997. (author)

  5. Single-computer HWIL simulation facility for real-time vision systems

    Science.gov (United States)

    Fuerst, Simon; Werner, Stefan; Dickmanns, Ernst D.

    1998-07-01

    UBM has been working on autonomous vision systems for aircraft for more than a decade and a half. The systems developed use standard on-board sensors and two additional monochrome cameras for state estimation of the aircraft. A common task is to detect and track a runway for an autonomous landing approach. The cameras have different focal lengths and are mounted on a special pan-tilt camera platform. As the platform is equipped with two resolvers and two gyros, it can be stabilized inertially, and the system has the ability to actively focus on the objects of highest interest. For verification and testing, UBM has a special HWIL simulation facility for real-time vision systems. The central part of this simulation facility is a three-axis motion simulator (DBS), used to realize the computed orientation in the rotational degrees of freedom of the aircraft. The two-axis camera platform with its two CCD cameras is mounted on the inner frame of the DBS and points at a cylindrical projection screen on which a synthetic view is displayed. As the performance of visual perception systems has increased significantly in recent years, a new, more powerful synthetic vision system was required. A single Onyx2 machine replaced all the former simulation computers. This computer is powerful enough to simulate the aircraft, generate a high-resolution synthetic view, control the DBS, and communicate with the image-processing computers. Further improvements are the significantly reduced delay times for closed-loop simulations and the elimination of communication overhead.

  6. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in propulsion test facility design and development by assessing risks, identifying failure modes and predicting...

  7. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in rocket engine test facility design and development by assessing risks, identifying failure modes and predicting...

  8. Recycled Water Reuse Permit Renewal Application for the Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Mike Lewis

    2014-09-01

    This renewal application for a Recycled Water Reuse Permit is being submitted in accordance with the Idaho Administrative Procedures Act 58.01.17 “Recycled Water Rules” and the Municipal Wastewater Reuse Permit LA-000141-03 for continuing the operation of the Central Facilities Area Sewage Treatment Plant located at the Idaho National Laboratory. The permit expires March 16, 2015. The permit requires a renewal application to be submitted six months prior to the expiration date of the existing permit. For the Central Facilities Area Sewage Treatment Plant, the renewal application must be submitted by September 16, 2014. The information in this application is consistent with the Idaho Department of Environmental Quality’s Guidance for Reclamation and Reuse of Municipal and Industrial Wastewater and discussions with Idaho Department of Environmental Quality personnel.

  9. Characterization and reclamation assessment for the central shops diesel storage facility at Savannah River Site

    Energy Technology Data Exchange (ETDEWEB)

    Fliermans, C.B.; Hazen, T.C.; Bledsoe, H.W.

    1994-12-31

    The contamination of subsurface terrestrial environments by organic contaminants is a global phenomenon. The remediation of such environments requires innovative assessment techniques and strategies for successful cleanups. Using innovative approaches, the Central Shops Diesel Storage Facility at the Savannah River Site (SRS) was characterized to determine the extent of subsurface diesel fuel contamination. Effective bioremediation techniques for cleanup of the contaminant plume were established.

  10. Wastewater Land Application Permit LA-000141 Renewal Information for the Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Laboratory, Idaho National

    1999-02-01

    On July 25, 1994, the State of Idaho Division of Environmental Quality (DEQ) issued a Wastewater Land Application Permit (WLAP) for the Idaho National Engineering Laboratory's (INEL, now the Idaho National Engineering and Environmental Laboratory [INEEL]) Central Facilities Area (CFA) Sewage Treatment Plant (STP). The permit expires August 7, 1999. In addition to the renewal application, this report was prepared to provide the following information as requested by DEQ.

  11. Safety report for Central Interim Storage facility for radioactive waste from small producers

    International Nuclear Information System (INIS)

    In 1999 the Agency for Radwaste Management (ARAO) took over the management of the Central Interim Storage (CIS) facility in Brinje, intended only for radioactive waste from industrial, medical and research applications. With the transfer of responsibility for the storage operation, ARAO, the new operator of the facility, also received a request from the Slovenian Nuclear Safety Administration for refurbishment and reconstruction of the storage and for preparation of a safety report for the storage with its operational conditions and limitations. In order to fulfill these requirements, ARAO first thoroughly reviewed the existing documentation on the facility, the facility itself, and the stored inventory. Based on the findings of this review, ARAO prepared several basic documents for improvement of the conditions in the storage facility. In October 2000 the Plan for refurbishment and modernization of the CIS was prepared, providing an integral approach towards remediation and refurbishment of the facility, optimization of the inventory arrangement, and modernization of the storage and its utilization. In October 2001 project documentation for renewal of the electric installations, water supply and sewage system, and ventilation system, the improvement of fire protection, and remediation of minor defects discovered in the building was completed according to the Act on Construction. In July 2003 the safety report was prepared, based on the facility status after the completion of the reconstruction works. It takes into account all improvements and changes introduced by the refurbishment and reconstruction of the facility according to the project documentation. Besides the basic characteristics of the location and its surroundings, it also gives the technical description of the facility together with proposed solutions for the renewal of electric installations, renovation of water supply and sewage system, refurbishment of the ventilation system, the improvement of fire

  12. Evaluating Computer Technology Integration in a Centralized School System

    Science.gov (United States)

    Eteokleous, N.

    2008-01-01

    The study evaluated the current situation in Cyprus elementary classrooms regarding computer technology integration in an attempt to identify ways of expanding teachers' and students' experiences with computer technology. It examined how Cypriot elementary teachers use computers, and the factors that influence computer integration in their…

  13. National facility for advanced computational science: A sustainable path to scientific discovery

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  14. Fluid Centrality: A Social Network Analysis of Social-Technical Relations in Computer-Mediated Communication

    Science.gov (United States)

    Enriquez, Judith Guevarra

    2010-01-01

    In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measure is quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC, and computer-supported collaborative learning research. It argues that measuring…
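
    The centrality measure discussed in the abstract can be illustrated with Freeman degree centrality, the most common SNA variant, applied to a toy CMC interaction network. Participant names and message links are invented for illustration.

    ```python
    # Minimal sketch of degree centrality for a CMC interaction network:
    # each node's degree divided by (n - 1), the maximum possible degree.
    def degree_centrality(edges):
        """Return Freeman degree centrality for an undirected edge list."""
        nodes = {n for e in edges for n in e}
        deg = {n: 0 for n in nodes}
        for a, b in edges:
            deg[a] += 1
            deg[b] += 1
        n = len(nodes)
        return {v: d / (n - 1) for v, d in deg.items()}

    # Hypothetical reply links in a discussion forum.
    posts = [("ann", "ben"), ("ann", "cat"), ("ann", "dee"), ("ben", "cat")]
    print(degree_centrality(posts))  # "ann" replies to everyone: centrality 1.0
    ```

    A participant linked to every other participant scores 1.0; peripheral members score near 0, which is how centrality operationalizes influence and cohesion in CMC research.
    
    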

  15. Techniques for Measuring Aerosol Attenuation using the Central Laser Facility at the Pierre Auger Observatory

    CERN Document Server


    2013-01-01

    The Pierre Auger Observatory in Malargüe, Argentina, is designed to study the properties of ultra-high energy cosmic rays with energies above 10^18 eV. It is a hybrid facility that employs a Fluorescence Detector (FD) to perform nearly calorimetric measurements of Extensive Air Shower energies. To obtain reliable calorimetric information from the FD, the atmospheric conditions at the observatory need to be continuously monitored during data acquisition. In particular, light attenuation due to aerosols is an important atmospheric correction. The aerosol concentration is highly variable, so the aerosol attenuation needs to be evaluated hourly. We use light from the Central Laser Facility (CLF), located near the center of the observatory site, having an optical signature comparable to that of the highest energy showers detected by the FD. This paper presents two procedures developed to retrieve the aerosol attenuation of fluorescence light from CLF laser shots. Cross checks between the two methods demonstrate that re...

  16. Implementation of computer security at nuclear facilities in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Lochthofen, Andre; Sommer, Dagmar [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    2013-07-01

    In recent years, electrical and I&C components in nuclear power plants (NPPs) have been replaced by software-based components. With the increased number of software-based systems, the threat of malevolent interference and cyber-attacks on NPPs has also grown. In order to maintain nuclear security, conventional physical protection measures and protection measures in the field of computer security have to be implemented. Therefore, the existing security management process of the NPPs has to be expanded to cover computer security aspects. In this paper, we give an overview of computer security requirements for German NPPs. Furthermore, some examples of the implementation of computer security projects based on a GRS best-practice approach are shown. (orig.)

  17. Wastewater Land Application Permit LA-000141 Renewal Information for the Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    None

    1999-02-01

    On July 25, 1994, the State of Idaho Division of Environmental Quality issued a Wastewater Land Application Permit, #LA-000141-01, for the Central Facilities Area Sewage Treatment Plant. The permit expires August 7, 1999. This report is being submitted with the renewal application and specifically addresses: wastewater flow; wastewater characteristics; impacts to vegetation in the irrigation area; impacts to soil in the irrigation area; evaluation of groundwater monitoring wells for Wastewater Land Application Permit purposes; a summary of trends observed during the 5-year reporting period; and a projection of changes and new processes.

  18. Process as Content in Computer Science Education: Empirical Determination of Central Processes

    Science.gov (United States)

    Zendler, A.; Spannagel, C.; Klaudt, D.

    2008-01-01

    Computer science education should not be based on short-term developments but on content that is observable in multiple domains of computer science, may be taught at every intellectual level, will be relevant in the longer term, and is related to everyday language and/or thinking. Recently, a catalogue of "central concepts" for computer science…

  19. 2012 Annual Wastewater Reuse Report for the Idaho National Laboratory Site's Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Mike Lewis

    2013-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2011, through October 31, 2012. The report contains the following information: • Site description • Facility and system description • Permit required monitoring data and loading rates • Status of compliance conditions and activities • Discussion of the facility’s environmental impacts. During the 2012 permit year, no wastewater was land-applied to the irrigation area of the Central Facilities Area Sewage Treatment Plant.

  20. 2010 Annual Wastewater Reuse Report for the Idaho National Laboratory Site's Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Mike Lewis

    2011-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2009, through October 31, 2010. The report contains the following information: • Site description • Facility and system description • Permit required monitoring data and loading rates • Status of special compliance conditions • Discussion of the facility’s environmental impacts. During the 2010 permit year, approximately 2.2 million gallons of treated wastewater was land-applied to the irrigation area at Central Facilities Area Sewage Treatment plant.

  1. NNS computing facility manual P-17 Neutron and Nuclear Science

    International Nuclear Information System (INIS)

    This document describes basic policies and provides information and examples on using the computing resources provided by P-17, the Neutron and Nuclear Science (NNS) group. Information on user accounts, getting help, network access, electronic mail, disk drives, tape drives, printers, batch processing software, XSYS hints, PC networking hints, and Mac networking hints is given

  2. High Resolution Muon Computed Tomography at Neutrino Beam Facilities

    CERN Document Server

    Suerfu, Burkhant

    2015-01-01

    X-ray computed tomography (CT) has an indispensable role in constructing 3D images of objects made from light materials. However, limited by absorption coefficients, X-rays cannot deeply penetrate materials such as copper and lead. Here we show via simulation that muon beams can provide high resolution tomographic images of dense objects and of structures within the interior of dense objects. The effects of resolution broadening from multiple scattering diminish with increasing muon momentum. As the momentum of the muon increases, the contrast of the image goes down and therefore requires higher resolution in the muon spectrometer to resolve the image. The variance of the measured muon momentum reaches a minimum and then increases with increasing muon momentum. The impact of the increase in variance is to require a higher integrated muon flux to reduce fluctuations. The flux requirements and level of contrast needed for high resolution muon computed tomography are well matched to the muons produced in the pio...

  3. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, Jose

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on the...
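
    The kind of per-link statistics collection the contribution describes can be sketched as follows: group completed transfers by their (source, destination) link and report a success rate per link. The record format and sample site names are hypothetical; the real PhEDEx/FTS schemas and interfaces differ.

    ```python
    # Hedged sketch: aggregate transfer outcomes per (source, destination)
    # link, the unit at which FTS-style transfer policy is tuned.
    from collections import defaultdict

    transfers = [  # invented sample records
        {"src": "T1_ES_PIC", "dst": "T2_ES_IFCA", "ok": True},
        {"src": "T1_ES_PIC", "dst": "T2_ES_IFCA", "ok": False},
        {"src": "T0_CH_CERN", "dst": "T1_ES_PIC", "ok": True},
    ]

    stats = defaultdict(lambda: [0, 0])  # link -> [succeeded, failed]
    for t in transfers:
        stats[(t["src"], t["dst"])][0 if t["ok"] else 1] += 1

    for (src, dst), (done, failed) in sorted(stats.items()):
        rate = done / (done + failed)
        print(f"{src} -> {dst}: {done} ok, {failed} failed ({rate:.0%})")
    ```

    Statistics like these are what let operators spot congested or misconfigured links and adjust per-site concurrency limits accordingly.
    
    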

  4. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, J; Sartirana, A

    2001-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on thei...

  5. Implementation of the Facility Integrated Inventory Computer System (FICS)

    International Nuclear Information System (INIS)

    This paper describes a computer system which has been developed for nuclear material accountability and implemented in an active radiochemical processing plant involving remote operations. The system possesses the following features: comprehensive, timely records of the location and quantities of special nuclear materials; automatically updated book inventory files on the plant and sub-plant levels of detail; material transfer coordination and cataloging; automatic inventory estimation; sample transaction coordination and cataloging; automatic on-line volume determination, limit checking, and alarming; extensive information retrieval capabilities; and terminal access and application software monitoring and logging
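
    The "on-line volume determination, limit checking, and alarming" feature can be sketched in a few lines: each measured volume is checked against a per-vessel band and an alarm state is raised when it leaves the band. Vessel names and limits below are invented for illustration; the real FICS interfaces are not described in the abstract.

    ```python
    # Hedged sketch of limit checking with alarming for measured volumes.
    def check_limits(volume_l, lo, hi):
        """Classify a measured volume (liters) against its allowed band."""
        if volume_l < lo:
            return "LOW"
        if volume_l > hi:
            return "HIGH"
        return "OK"

    readings = {"tank_A": 480.0, "tank_B": 912.5, "tank_C": 700.0}
    limits = {name: (500.0, 900.0) for name in readings}  # assumed bands

    for tank, vol in sorted(readings.items()):
        lo, hi = limits[tank]
        state = check_limits(vol, lo, hi)
        print(f"{tank}: {vol:.1f} L -> {state}")
    ```

    In a real accountability system such checks would run against on-line instrument readings and feed both the alarm display and the audit log.
    
    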

  6. Health Risks to Computer Workers in Various Indoor Facilities

    OpenAIRE

    Tint, Piia; Tuulik, Viiu; Karai, Deniss; Meigas, Kalju

    2014-01-01

    Musculoskeletal disorders (MSDs) are the main cause of occupational diseases in Estonia and Europe. The number of MSDs has increased with the increasing number of computer workers and the increasing workload overall. In the current paper, a survey of 295 workers in Estonian enterprises is carried out to clarify the causes of occupational stress. The causes of occupational stress include non-ergonomically designed workplaces and social and human factors. The main questionnaires used were KIVA and G...

  7. Computer Support for Document Management in the Danish Central Government

    DEFF Research Database (Denmark)

    Hertzum, Morten

    1995-01-01

    Document management systems are generally assumed to hold a potential for delegating the recording and retrieval of documents to professionals such as civil servants and for supporting the coordination and control of work, so-called workflow management. This study investigates the use and organizational impact of document management systems in the Danish central government. The currently used systems unfold around the recording of incoming and outgoing paper mail and have typically not been accompanied by organizational changes. Rather, document management tends to remain an appendix. It is applied most extensively in an institution with certain mass production characteristics, and the systems do not address needs specific to the civil servants.

  8. Demonstration of neutron radiography and computed tomography at the University of Texas thermal neutron imaging facility

    International Nuclear Information System (INIS)

    A thermal neutron imaging facility for real-time neutron radiography and computed tomography has recently been developed and built at the University of Texas TRIGA reactor. Herein the authors present preliminary results of radiography and tomography test experiments. These preliminary results showed that the beam is of high quality and is suitable for radiography and tomography applications. A more detailed description of the facility is given elsewhere

  9. Performance of grid-tied PV facilities based on real data in Spain: Central inverter versus string system

    International Nuclear Information System (INIS)

    Highlights: • The operation of two grid-tied PV facilities over a two-year period is presented. • The central inverter system is compared to the string inverter system. • A procedure based on a small number of easily obtained parameters is used. • The string inverter outperforms the central inverter. • Conclusions of use to maintenance firms and operational facilities are obtained. - Abstract: Two complete years of operation of two grid-tied PV facilities are presented, and the energetic and economic performance of both installations has been compared. Located in the same place, the facilities were installed following the same construction criteria – PV panels, panel support system and wiring – and are exposed to the same atmospheric temperature and solar radiation. They differ in the inverter topology used: one facility uses a central inverter and the other a string inverter configuration. The performance of the facilities has been determined using a procedure based on a small number of easily obtained parameters and knowledge of the analyzed system and its operation mode. Electrical losses have been calculated for both systems and a complete comparison between them has been carried out. The results have shown better performance for the distributed (string inverter) system in both economic and energetic terms.

  10. Solar heating and hot water system for the central administrative office facility. Technical progress report

    Energy Technology Data Exchange (ETDEWEB)

    1978-11-01

    Progress on the solar heating and hot water system for the central administrative office facility of the Lincoln Housing Authority, Lincoln, NE is covered. An acceptance test plan is presented and the results of the test are tabulated. A complete blueprint of the system as built is provided. The monitoring system is drawn, and its settings and installation are described. An operation and maintenance manual discusses procedures for start-up, shut-down and seasonal changeover and includes a valve list and pictures and specifications of components and materials used. Photographs of the final installation are included, and technical data and performance data are given. Finally, there is a brief description of system design and operation and a discussion of major maintenance problems encountered and their solutions. (LEW)

  11. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than that in common use in HEP, but they also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  12. ATLAS experience with HEP software at the Argonne leadership computing facility

    International Nuclear Information System (INIS)

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than that in common use in HEP, but they also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  13. Feasibility of a medium-size central cogenerated energy facility, energy management memorandum

    Science.gov (United States)

    Porter, R. W.

    1982-09-01

    The thermal-economic feasibility of a medium-size central cogenerated energy facility designed to serve five varied industries was studied. Generation options included one dual-fuel diesel and one gas turbine, both with waste-heat boilers, and five fired boilers. Fuels included natural gas and, for the fired-boiler cases, also low-sulphur coal and municipal refuse. The fired-boiler cogeneration systems employed back-pressure steam turbines. For coal and refuse, the option of steam only, without cogeneration, was also assessed. The refuse-fired cases utilized modular incinerators. The options provided for a wide range of steam and electrical capacities. Deficient steam was assumed to be generated independently in existing equipment. Excess electrical power over that which could be displaced was assumed to be sold to the Commonwealth Edison Company under PURPA (the Public Utility Regulatory Policies Act). The facility was assumed to be operated by a mutually owned corporation formed by the cogenerated power users. The economic analysis was predicated on currently applicable energy-investment tax credits and accelerated depreciation for a January 1985 startup date. Based on 100% equity financing, the results indicated that the best alternative was the modular-incinerator cogeneration system.

  14. Operational Circular nr 5 - October 2000 USE OF CERN COMPUTING FACILITIES

    CERN Multimedia

    Division HR

    2000-01-01

    New rules covering the use of CERN computing facilities have been drawn up. All users of CERN’s computing facilities are subject to these rules, as well as to the subsidiary rules of use. The Computing Rules explicitly address your responsibility for taking reasonable precautions to protect computing equipment and accounts. In particular, passwords must not be easily guessed or obtained by others. Given the difficulty of completely separating work and personal use of computing facilities, the rules define the conditions under which limited personal use is tolerated. For example, limited personal use of e-mail, news groups or web browsing is tolerated in your private time, provided CERN resources and your official duties are not adversely affected. The full conditions governing use of CERN’s computing facilities are contained in Operational Circular N° 5, which you are requested to read. Full details are available at : http://www.cern.ch/ComputingRules Copies of the circular are also available in the Divis...

  15. Thermal analysis of the unloading cell of the Spanish centralized interim storage facility (CISF); Analisis termico de la celda de desarga del almacen temporal centralizado (ATC)

    Energy Technology Data Exchange (ETDEWEB)

    Perez Dominguez, J. R.; Perez Vara, R.; Huelamo Martinez, E.

    2016-08-01

    This article deals with the thermal analysis performed for the Unloading Cell of the Spanish Centralized Interim Storage Facility, CISF (ATC, in Spanish). The analyses are done using computational fluid dynamics (CFD) simulation, with the aim of obtaining the air flow required to remove the residual heat of the elements stored in the cell. Compliance with the admissible heat limits is checked against the results obtained in the various operation and accident modes. The calculation model is flexible enough to allow a number of sensitivity analyses to be carried out with the different parameters involved in the process. (Author)

  16. Central and Eastern United States (CEUS) Seismic Source Characterization (SSC) for Nuclear Facilities Project

    Energy Technology Data Exchange (ETDEWEB)

    Kevin J. Coppersmith; Lawrence A. Salomone; Chris W. Fuller; Laura L. Glaser; Kathryn L. Hanson; Ross D. Hartleb; William R. Lettis; Scott C. Lindvall; Stephen M. McDuffie; Robin K. McGuire; Gerry L. Stirewalt; Gabriel R. Toro; Robert R. Youngs; David L. Slayter; Serkan B. Bozkurt; Randolph J. Cumbest; Valentina Montaldo Falero; Roseanne C. Perman; Allison M. Shumway; Frank H. Syms; Martitia (Tish) P. Tuttle

    2012-01-31

    Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts. The model will be used to assess the present-day composite distribution for seismic sources along with their characterization in the CEUS and uncertainty. In addition, this model is in a form suitable for use in PSHA evaluations for regulatory activities, such as Early Site Permit (ESPs) and Combined Operating License Applications (COLAs). Applications, Values, and Use Development of a regional CEUS seismic source model will provide value to those who (1) have submitted an ESP or COLA for Nuclear Regulatory Commission (NRC) review before 2011; (2) will submit an ESP or COLA for NRC review after 2011; (3) must respond to safety issues resulting from NRC Generic Issue 199 (GI-199) for existing plants and (4) will prepare PSHAs to meet design and periodic review requirements for current and future nuclear facilities. This work replaces a previous study performed approximately 25 years ago. Since that study was completed, substantial work has been done to improve the understanding of seismic sources and their characterization in the CEUS. Thus, a new regional SSC model provides a consistent, stable basis for computing PSHA for a future time span. Use of a new SSC model reduces the risk of delays in new plant licensing due to more conservative interpretations in the existing and future literature. Perspective The purpose of this study, jointly sponsored by EPRI, the U.S. Department of Energy (DOE), and the NRC was to develop a new CEUS SSC model. The team assembled to accomplish this purpose was composed of distinguished subject matter experts from industry, government, and academia. The resulting model is unique, and because this project has solicited input from the present-day larger technical community, it is not likely that there will be a need for significant revision for a number of years. See also Sponsors Perspective for more details. 
The goal of this project was to implement the CEUS SSC work plan

  17. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    Science.gov (United States)

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. Taking the classical large audience university lecture online using tablet computer and webconferencing facilities

    DEFF Research Database (Denmark)

    Brockhoff, Per B.

    2011-01-01

    During four offerings (September 2008 – May 2011) of the course 02402 Introduction to Statistics for Engineering students at DTU, with an average of 256 students, the lecturing was carried out 100% through a tablet computer combined with the web conferencing facility Adobe Connect (version 7...

  20. The development and operation of the international solar-terrestrial physics central data handling facility

    Science.gov (United States)

    Lehtonen, Kenneth

    1994-01-01

    The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) International Solar-Terrestrial Physics (ISTP) Program is committed to the development of a comprehensive, multi-mission ground data system which will support a variety of national and international scientific missions in an effort to study the flow of energy from the sun through the Earth-space environment, known as the geospace. A major component of the ISTP ground data system is an ISTP-dedicated Central Data Handling Facility (CDHF). Acquisition, development, and operation of the ISTP CDHF were delegated by the ISTP Project Office within the Flight Projects Directorate to the Information Processing Division (IPD) within the Mission Operations and Data Systems Directorate (MO&DSD). The ISTP CDHF supports the receipt, storage, and electronic access of the full complement of ISTP Level-zero science data; serves as the linchpin for the centralized processing and long-term storage of all key parameters generated either by the ISTP CDHF itself or received from external, ISTP Program approved sources; and provides the required networking and 'science-friendly' interfaces for the ISTP investigators. Once connected to the ISTP CDHF, the online catalog of key parameters can be browsed from their remote processing facilities for the immediate electronic receipt of selected key parameters using the NASA Science Internet (NSI), managed by NASA's Ames Research Center. The purpose of this paper is twofold: (1) to describe how the ISTP CDHF was successfully implemented and operated to support initially the Japanese Geomagnetic Tail (GEOTAIL) mission and correlative science investigations, and (2) to describe how the ISTP CDHF has been enhanced to support ongoing as well as future ISTP missions. Emphasis will be placed on how various project management approaches were undertaken that proved to be highly effective in delivering an operational ISTP CDHF to the Project on schedule and

  1. 2015 Annual Wastewater Reuse Report for the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Michael George [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2014, through October 31, 2015.

  2. 2013 Annual Wastewater Reuse Report for the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Mike Lewis

    2014-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2012, through October 31, 2013. The report contains, as applicable, the following information: • Site description • Facility and system description • Permit required monitoring data and loading rates • Status of compliance conditions and activities • Discussion of the facility’s environmental impacts. During the 2013 permit year, no wastewater was land-applied to the irrigation area of the Central Facilities Area Sewage Treatment Plant and therefore, no effluent flow volumes or samples were collected from wastewater sampling point WW-014102. However, soil samples were collected in October from soil monitoring unit SU-014101.

  3. A Fruitful Collaboration between ESO and the Max Planck Computing and Data Facility

    Science.gov (United States)

    Fourniol, N.; Zampieri, S.; Panea, M.

    2016-06-01

    The ESO Science Archive Facility (SAF) contains all La Silla Paranal Observatory raw data as well as, more recently introduced, processed data created at ESO with state-of-the-art pipelines or returned by the astronomical community. The SAF has been established for over 20 years and its current holdings exceed 700 terabytes. An overview of the content of the SAF and the preservation of its content is provided. The latest development to ensure the preservation of the SAF data, the provision of an independent backup copy of the whole SAF at the Max Planck Computing and Data Facility in Garching, is described.

  4. Money for Research, Not for Energy Bills: Finding Energy and Cost Savings in High Performance Computer Facility Designs

    Energy Technology Data Exchange (ETDEWEB)

    Drewmark Communications; Sartor, Dale; Wilson, Mark

    2010-07-01

    High-performance computing facilities in the United States consume an enormous amount of electricity, cutting into research budgets and challenging public- and private-sector efforts to reduce energy consumption and meet environmental goals. However, these facilities can greatly reduce their energy demand through energy-efficient design of the facility itself. Using a case study of a facility under design, this article discusses strategies and technologies that can be used to help achieve energy reductions.

  5. The Organization and Evaluation of a Computer-Assisted, Centralized Immunization Registry.

    Science.gov (United States)

    Loeser, Helen; And Others

    1983-01-01

    Evaluation of a computer-assisted, centralized immunization registry after one year shows that 93 percent of eligible health practitioners initially agreed to provide data and that 73 percent continue to do so. Immunization rates in audited groups have improved significantly. (GC)

  6. Specialized, multi-user computer facility for the high-speed, interactive processing of experimental data

    International Nuclear Information System (INIS)

    A proposal has been made at LBL to develop a specialized computer facility specifically designed to deal with the problems associated with the reduction and analysis of experimental data. Such a facility would provide a highly interactive, graphics-oriented, multi-user environment capable of handling relatively large data bases for each user. By conceptually separating the general problem of data analysis into two parts, cyclic batch calculations and real-time interaction, a multilevel, parallel processing framework may be used to achieve high-speed data processing. In principle such a system should be able to process a mag tape equivalent of data through typical transformations and correlations in under 30 s. The throughput for such a facility, for five users simultaneously reducing data, is estimated to be 2 to 3 times greater than is possible, for example, on a CDC7600. 3 figures

  7. Opportunities for artificial intelligence application in computer- aided management of mixed waste incinerator facilities

    International Nuclear Information System (INIS)

    The Department of Energy/Oak Ridge Field Office (DOE/OR) operates a mixed waste incinerator facility at the Oak Ridge K-25 Site. It is designed for the thermal treatment of incinerable liquid, sludge, and solid waste regulated under the Toxic Substances Control Act (TSCA) and the Resource Conservation and Recovery Act (RCRA). This facility, known as the TSCA Incinerator, services seven DOE/OR installations. This incinerator was recently authorized for production operation in the United States for the processing of mixed (radioactively contaminated-chemically hazardous) wastes as regulated under TSCA and RCRA. Operation of the TSCA Incinerator is highly constrained as a result of the regulatory, institutional, technical, and resource availability requirements. These requirements impact the characteristics and disposition of incinerator residues, limit the quality of liquid and gaseous effluents, limit the characteristics and rates of waste feeds and operating conditions, and restrict the handling of the waste feed inventories. This incinerator facility presents an opportunity for applying computer technology as a technical resource for mixed waste incinerator operation to facilitate promoting and sustaining a continuous performance improvement process while demonstrating compliance. Demonstrated computer-aided management systems could be transferred to future mixed waste incinerator facilities

  8. Automation of a cryogenic facility by commercial process-control computer

    International Nuclear Information System (INIS)

    To ensure that Brookhaven's superconducting magnets are reliable and their field quality meets accelerator requirements, each magnet is pre-tested at operating conditions after construction. MAGCOOL, the production magnet test facility, was designed to perform these tests, having the capacity to test ten magnets per five-day week. This paper describes the control aspects of MAGCOOL and the advantages afforded to the designers by the implementation of a commercial process control computer system

  9. Trends in health facility based maternal mortality in Central Region, Kenya: 2008-2012

    Science.gov (United States)

    Muchemi, Onesmus Maina; Gichogo, Agnes Wangechi; Mungai, Jane Githuku; Roka, Zeinab Gura

    2016-01-01

    Introduction: WHO classifies Kenya as having high maternal mortality. Regional data on maternal mortality trends are only available in selected areas. This study reviewed health facility maternal mortality trends, causes and distribution in Central Region of Kenya, 2008-2012. Methods: We reviewed health records from July 2008 to June 2012. A maternal death was defined according to the ICD-10 criterion. The variables reviewed included socio-demographic and obstetric characteristics, reasons for admission, causes of death and contributing factors. We estimated the maternal mortality ratio for each year, and overall for the four-year period, using a standard equation, and used frequencies, means/medians and proportions for other descriptive variables. Results: A total of 421 deaths occurred among 344,191 live births; 335 (80%) deaths were audited. Maternal mortality ratios were 127/100,000 live births in 2008/09, 124/100,000 in 2009/10, 129/100,000 in 2010/11 and 111/100,000 in 2011/12. Direct causes, including hemorrhage, infection and pre-eclampsia/eclampsia, contributed the majority of deaths (77%, n=234). Mean age was 30 (±6) years; 147 (71%) attended fewer than four antenatal visits, and median gestation at birth was 38 weeks (IQR=9). One hundred ninety (59%) died within 24 hours of admission. There were 111 (46%) caesarean births, 95 (39%) skilled vaginal deliveries, 31 (13%) unskilled deliveries, 5 (2%) vacuum deliveries and 1 (<1%) destructive operation. Conclusion: The region recorded an unsteady declining trend. Direct causes, including hemorrhage, infection and pre-eclampsia/eclampsia, contributed the majority of deaths. We recommend health education on individualized birth plans and mentorship on emergency obstetric care. Further studies are necessary to clarify and expand the findings of this study. PMID:27516824
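The maternal mortality ratios above follow the standard equation the authors reference: maternal deaths per 100,000 live births. A minimal sketch, using the four-year aggregate figures reported in the abstract:

```python
# Maternal mortality ratio (MMR): maternal deaths per 100,000 live births,
# the standard equation referenced in the abstract.
def maternal_mortality_ratio(deaths: int, live_births: int) -> float:
    """MMR = (maternal deaths / live births) * 100,000."""
    return deaths / live_births * 100_000

# Four-year aggregate figures reported for Central Region, Kenya (2008-2012):
# 421 deaths among 344,191 live births.
overall = maternal_mortality_ratio(421, 344_191)
print(f"{overall:.1f} per 100,000 live births")  # ~122.3
```

The aggregate (~122/100,000) sits, as expected, between the lowest (111) and highest (129) annual ratios reported.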

  10. Central washout sign in computer-aided evaluation of breast MRI: preliminary results

    International Nuclear Information System (INIS)

    Background: Although computer-aided evaluation (CAE) programs were introduced to help differentiate benign tumors from malignant ones, the set of CAE-measured parameters that best predict malignancy has not yet been established. Purpose: To assess the value of the central washout sign on CAE color overlay images of breast MRI. Material and Methods: We evaluated the frequency of the central washout sign using CAE. The central washout sign was defined as thin, rim-like, persistent kinetics in the periphery of the tumor, followed sequentially by plateau and washout kinetics toward the center. Two additional CAE delayed-kinetic variables were compared with the central washout sign for assessment of diagnostic utility: the predominant enhancement type (washout, plateau, or persistent) and the most suspicious enhancement type (any washout > any plateau > any persistent kinetics). Results: One hundred and forty-nine pathologically proven breast lesions (130 malignant, 19 benign) were evaluated. A central washout sign was present in 87% of malignant lesions but only 11% of benign lesions. Significant differences were found when delayed-phase kinetics were categorized by the most suspicious enhancement type (P<0.001) and by the presence of the central washout sign (P<0.001). Under the criteria of the most suspicious kinetics, 68% of benign lesions were assigned a plateau or washout pattern. Conclusion: The central washout sign is a reliable indicator of malignancy on CAE color overlay images of breast MRI
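The diagnostic value reported above can be restated as sensitivity and specificity. A minimal sketch, with per-lesion counts reconstructed approximately from the reported percentages (87% of 130 malignant and 11% of 19 benign lesions showing the sign) rather than taken from the paper itself:

```python
# Reconstruct an approximate 2x2 table from the reported percentages.
tp = round(0.87 * 130)   # malignant lesions showing the sign
fn = 130 - tp            # malignant lesions without it
fp = round(0.11 * 19)    # benign lesions showing the sign
tn = 19 - fp             # benign lesions without it

# Standard definitions: sensitivity = TP/(TP+FN), specificity = TN/(TN+FP).
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity ~ {sensitivity:.2f}, specificity ~ {specificity:.2f}")
```

On these reconstructed counts the sign is both sensitive (~0.87) and specific (~0.89), consistent with the conclusion that it is a reliable malignancy indicator.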

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  12. Interventions on central computing services during the weekend of 21 and 22 August

    CERN Multimedia

    2004-01-01

    As part of the planned upgrade of the computer centre infrastructure to meet the LHC computing needs, approximately 150 servers, hosting in particular the NICE home directories, Mail services and Web services, will need to be physically relocated to another part of the computing hall during the weekend of 21 and 22 August. On Saturday 21 August, starting from 8:30 a.m., interruptions of typically 60 minutes will take place on the following central computing services: NICE and the whole Windows infrastructure, Mail services, file services (including home directories and DFS workspaces), Web services, VPN access, and Windows Terminal Services. During any interruption, incoming mail from outside CERN will be queued and delivered as soon as the service is operational again. All services should be available again on Saturday 21 August at 17:30, but a few additional interruptions will be possible after that time and on Sunday 22 August. IT Department

  13. Nuclear fuel cycle facilities in the world (excluding the centrally planned economies)

    International Nuclear Information System (INIS)

    Information on the existing, under construction and planned fuel cycle facilities in the various countries is presented. Some thirty countries have activities related to different nuclear fuel cycle steps and the information covers the capacity, status, location, and the names of owners of the facilities

  14. Clinical radiodiagnosis of metastases of central lung cancer in regional lymph nodes using computers

    International Nuclear Information System (INIS)

    On the basis of published data and clinical examination (112 patients), methods of computer-assisted clinical radiodiagnosis of metastases of central lung cancer in regional lymph nodes were developed. The methods were tested on control clinical material (110 patients). Using computers (the Bayes and Wald methods), 57.3% and 65.5% correct answers, respectively, were obtained, which is 14.6% and 22.8% higher than the level of clinical diagnosis of metastases. Diagnostic errors are analysed. Complexes of clinical-radiological signs of symptoms of metastases are outlined

  15. SynapSense Wireless Environmental Monitoring System of the RHIC & ATLAS Computing Facility at BNL

    Science.gov (United States)

    Casella, K.; Garcia, E.; Hogue, R.; Hollowell, C.; Strecker-Kellogg, W.; Wong, A.; Zaytsev, A.

    2014-06-01

    RHIC & ATLAS Computing Facility (RACF) at BNL is a 15000 sq. ft. facility hosting the IT equipment of the BNL ATLAS WLCG Tier-1 site, offline farms for the STAR and PHENIX experiments operating at the Relativistic Heavy Ion Collider (RHIC), the BNL Cloud installation, various Open Science Grid (OSG) resources, and many other small physics research oriented IT installations. The facility originated in 1990 and grew steadily up to the present configuration of 4 physically isolated IT areas with a maximum capacity of about 1000 racks and a total peak power consumption of 1.5 MW. In June 2012 a project was initiated with the primary goal to replace several environmental monitoring systems deployed earlier within RACF with a single commercial hardware and software solution by SynapSense Corporation based on wireless sensor groups and proprietary SynapSense™ MapSense™ software that offers a unified solution for monitoring the temperature and humidity within the rack/CRAC units as well as pressure distribution underneath the raised floor across the entire facility. The deployment was completed successfully in 2013. The new system also supports a set of additional features, such as capacity planning based on measurements of total heat load, power consumption monitoring and control, CRAC unit power consumption optimization based on feedback from the temperature measurements, and overall power usage efficiency estimations, that are not currently implemented within RACF but may be deployed in the future.
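The "power usage efficiency estimations" mentioned above are conventionally computed as PUE (power usage effectiveness): the ratio of total facility power to the power delivered to IT equipment. A minimal sketch with hypothetical figures, not measured RACF values:

```python
# PUE = total facility power / IT equipment power (dimensionless, >= 1.0).
# A PUE of 1.0 would mean every watt drawn goes to IT load; real data
# centres add cooling and distribution overhead on top.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

# Hypothetical illustration: a facility drawing 1500 kW total for 1000 kW
# of IT load has a PUE of 1.5.
print(pue(1500.0, 1000.0))  # -> 1.5
```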

  16. Central Nervous System Based Computing Models for Shelf Life Prediction of Soft Mouth Melting Milk Cakes

    Directory of Open Access Journals (Sweden)

    Gyanendra Kumar Goyal

    2012-04-01

    This paper presents the latency and potential of a central nervous system based intelligent computing model for detecting the shelf life of soft mouth melting milk cakes stored at 10 °C. Soft mouth melting milk cakes are an exquisite sweetmeat cuisine made of heat- and acid-thickened solidified sweetened milk. In today’s highly competitive market, consumers look for good quality food products, and shelf life is a good and accurate indicator of food quality and safety; to achieve good quality of food products, detection of shelf life is important. The central nervous system based intelligent computing model that was developed predicted a shelf life of 19.82 days, as against the experimental shelf life of 21 days.
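The gap between the model's 19.82-day prediction and the 21-day experimental shelf life can be expressed as a relative error, a common way to quote such model accuracy (the abstract itself does not state which error metric the authors used):

```python
# Relative error of the model's shelf-life prediction against the
# experimental value, both taken from the abstract.
predicted_days = 19.82
experimental_days = 21.0

relative_error = abs(predicted_days - experimental_days) / experimental_days
print(f"{relative_error:.1%}")  # ~5.6%
```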

  17. Central Nervous System Based Computing Models for Shelf Life Prediction of Soft Mouth Melting Milk Cakes

    OpenAIRE

    Gyanendra Kumar Goyal; Sumit Goyal

    2012-01-01

    This paper presents the capability and potential of a central nervous system based intelligent computing model for predicting the shelf life of soft mouth melting milk cakes stored at 10 °C. Soft mouth melting milk cakes are an exquisite sweetmeat made from heat- and acid-thickened solidified sweetened milk. In today’s highly competitive market, consumers look for good-quality food products, and shelf life is a good and accurate indicator of food quality and safety. To achieve g...

  18. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Allcock, William; Beggio, Chris; Campbell, Stuart; Cherry, Andrew; Cholia, Shreyas; Dart, Eli; England, Clay; Fahey, Tim; Foertter, Fernanda; Goldstone, Robin; Hick, Jason; Karelitz, David; Kelly, Kaki; Monroe, Laura; Prabhat,; Skinner, David; White, Julia

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and levels of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Computing Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. This report contains findings from that review.

  19. Computational Analysis of Shock Layer Emission Measurements in an Arc-Jet Facility

    Science.gov (United States)

    Gokcen, Tahir; Park, Chung S.; Newfield, Mark E.; Fletcher, Douglas G.

    1998-01-01

    This paper reports a computational analysis of radiation emission experiments in a high-enthalpy arc-jet wind tunnel at NASA Ames Research Center. Recently, as part of ongoing arc-jet characterization work, spectroscopic radiation emission experiments have been conducted at the 20 MW NASA Ames arc-jet facility. The emission measurements were obtained from the arc-jet freestream and from a shock layer formed in front of flat-faced models. Analysis of these data is expected to provide valuable information about the thermodynamic state of the gas in the arc-jet freestream and in the shock layer, as well as the thermochemical equilibration processes behind the shock in arc-jet flows. Knowledge of the thermodynamic state of the gas in arc-jet test flows, and especially within the shock layer, is essential to interpret heat transfer measurements such as those from surface catalysis experiments. The present work is a continuation of previous work and focuses on analysis of the emission data obtained at relatively low-pressure conditions, for which the arc-jet shock layer is expected to be in thermal and chemical nonequilibrium. The building blocks of the present computational analysis are: (1) simulation of the nonequilibrium expanding flow in the converging-diverging conical nozzle and supersonic jet; (2) simulation of the nonequilibrium shock layer formed in front of the flat-faced cylinder model; and (3) prediction of line-of-sight radiation from the computed flowfield. For computations of the nonequilibrium flow in the conical nozzle and shock layer, multi-temperature nonequilibrium codes with an axisymmetric formulation are used. For computations of line-of-sight radiation, a nonequilibrium radiation code (NEQAIR) is used to predict emission spectra from the computed flowfield. Computed line-of-sight averaged flow properties, such as vibrational and rotational temperatures and species number densities within the shock layer, will be compared with those deduced from the experimental spectra.
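
Deducing temperatures from emission spectra, as described above, typically rests on the Boltzmann population ratio between two energy levels; a minimal sketch of the inversion (hypothetical level data, not NEQAIR itself):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_temperature(ratio, g1, g2, e1_j, e2_j):
    """Infer temperature from a measured population ratio n2/n1 of two levels.

    Inverts n2/n1 = (g2/g1) * exp(-(E2 - E1) / (kB * T)).
    """
    return (e2_j - e1_j) / (K_B * math.log((g2 / g1) / ratio))

# Round-trip check with hypothetical levels: generate a ratio at a known
# temperature, then recover that temperature from the ratio.
T = 6000.0
e1, e2 = 0.0, 3.0e-19          # level energies in joules
g1, g2 = 1, 3                  # level degeneracies
ratio = (g2 / g1) * math.exp(-(e2 - e1) / (K_B * T))
print(round(boltzmann_temperature(ratio, g1, g2, e1, e2)))  # 6000
```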

  20. Conceptual design of an ALICE Tier-2 centre. Integrated into a multi-purpose computing facility

    International Nuclear Information System (INIS)

    This thesis discusses the issues and challenges associated with the design and operation of a data analysis facility for a high-energy physics experiment at a multi-purpose computing centre. In the spotlight is a Tier-2 centre of the distributed computing model of the ALICE experiment at the Large Hadron Collider at CERN in Geneva, Switzerland. The design steps examined in the thesis include analysis and optimization of the I/O access patterns of the user workload, integration of the storage resources, and development of techniques for effective system administration and operation of the facility in a shared computing environment. A number of I/O access performance issues on multiple levels of the I/O subsystem, introduced by the utilization of hard disks for data storage, have been addressed by means of exhaustive benchmarking and thorough analysis of the I/O of the user applications in the ALICE software framework. Defining the set of requirements for the storage system, describing the potential performance bottlenecks and single points of failure, and examining possible ways to avoid them allows one to develop guidelines for how to integrate the storage resources. A solution for preserving an experiment-specific software stack in a shared environment is presented, along with its effects on user workload performance. The proposal of a flexible model to deploy and operate the ALICE Tier-2 infrastructure and applications in a virtual environment, through adoption of cloud computing technology and the 'Infrastructure as Code' concept, completes the thesis. Scientific software applications can be efficiently computed in a virtual environment, and there is an urgent need to adapt the infrastructure for effective usage of cloud resources.

  1. Conceptual design of an ALICE Tier-2 centre. Integrated into a multi-purpose computing facility

    Energy Technology Data Exchange (ETDEWEB)

    Zynovyev, Mykhaylo

    2012-06-29

    This thesis discusses the issues and challenges associated with the design and operation of a data analysis facility for a high-energy physics experiment at a multi-purpose computing centre. In the spotlight is a Tier-2 centre of the distributed computing model of the ALICE experiment at the Large Hadron Collider at CERN in Geneva, Switzerland. The design steps examined in the thesis include analysis and optimization of the I/O access patterns of the user workload, integration of the storage resources, and development of techniques for effective system administration and operation of the facility in a shared computing environment. A number of I/O access performance issues on multiple levels of the I/O subsystem, introduced by the utilization of hard disks for data storage, have been addressed by means of exhaustive benchmarking and thorough analysis of the I/O of the user applications in the ALICE software framework. Defining the set of requirements for the storage system, describing the potential performance bottlenecks and single points of failure, and examining possible ways to avoid them allows one to develop guidelines for how to integrate the storage resources. A solution for preserving an experiment-specific software stack in a shared environment is presented, along with its effects on user workload performance. The proposal of a flexible model to deploy and operate the ALICE Tier-2 infrastructure and applications in a virtual environment, through adoption of cloud computing technology and the 'Infrastructure as Code' concept, completes the thesis. Scientific software applications can be efficiently computed in a virtual environment, and there is an urgent need to adapt the infrastructure for effective usage of cloud resources.

  2. A personal computer code for seismic evaluations of nuclear power plant facilities

    International Nuclear Information System (INIS)

    A wide range of computer programs and modeling approaches are often used to justify the safety of nuclear power plants. It is often difficult to assess the validity and accuracy of the results submitted by various utilities without developing comparable computer solutions. Taking this into consideration, CARES was designed as an integrated computational system that can perform rapid evaluations of structural behavior and examine the capability of nuclear power plant facilities; thus CARES may be used by the NRC to determine the validity and accuracy of the analysis methodologies employed for structural safety evaluations of nuclear power plants. CARES has been designed to operate on a PC, provide a user-friendly input/output interface, and offer quick turnaround. The CARES program is structured in a modular format; each module performs a specific type of analysis. The basic modules of the system are associated with capabilities for static, seismic, and nonlinear analyses. This paper describes the various features which have been implemented into the Seismic Module of CARES version 1.0. In Section 2 a description of the Seismic Module is provided. The methodologies and computational procedures thus far implemented into the Seismic Module are described in Section 3. Finally, a complete demonstration of the computational capability of CARES in a typical soil-structure interaction analysis is given in Section 4, and conclusions are presented in Section 5. 5 refs., 4 figs
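
The core of a seismic analysis module such as the one described is time integration of the structural equation of motion; a minimal sketch using the standard average-acceleration Newmark method for a single-degree-of-freedom oscillator (this is the textbook algorithm, not CARES code):

```python
import math

def newmark_sdof(m, c, k, load, dt, n, x0=0.0, v0=0.0, beta=0.25, gamma=0.5):
    """Average-acceleration Newmark integration of m*x'' + c*x' + k*x = p(t).

    Returns the displacement history [x_0, ..., x_n].
    """
    x, v = x0, v0
    a = (load(0.0) - c * v - k * x) / m          # initial acceleration
    k_eff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    xs = [x]
    for i in range(1, n + 1):
        p = load(i * dt)
        p_eff = (p
                 + m * (x / (beta * dt**2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
                 + c * (gamma * x / (beta * dt) + (gamma / beta - 1) * v
                        + dt * (gamma / (2 * beta) - 1) * a))
        x_new = p_eff / k_eff
        v_new = (gamma / (beta * dt) * (x_new - x) + (1 - gamma / beta) * v
                 + dt * (1 - gamma / (2 * beta)) * a)
        a = (x_new - x) / (beta * dt**2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
        x, v = x_new, v_new
        xs.append(x)
    return xs

# Sanity check: undamped free vibration returns to its initial amplitude after
# one natural period (omega = sqrt(k/m) = 2*pi, so T = 1 s).
m, k = 1.0, (2 * math.pi) ** 2
dt, steps = 0.001, 1000
xs = newmark_sdof(m, 0.0, k, lambda t: 0.0, dt, steps, x0=1.0)
print(round(xs[-1], 3))  # 1.0
```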

  3. Software quality assurance plan for the National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project

  4. Neuromotor recovery from stroke: Computational models at central, functional, and muscle synergy level

    Directory of Open Access Journals (Sweden)

    Maura eCasadio

    2013-08-01

    Full Text Available Computational models of neuromotor recovery after a stroke might help to unveil the underlying physiological mechanisms and might suggest how to make recovery faster and more effective. At least in principle, these models could serve: (i) to provide testable hypotheses on the nature of recovery; (ii) to predict the recovery of individual patients; and (iii) to design patient-specific 'optimal' therapy, by setting the treatment variables to maximize the amount of recovery or to achieve a better generalization of the learned abilities across different tasks. Here we review the state of the art of computational models for neuromotor recovery through exercise, and their implications for treatment. We show that to properly account for the computational mechanisms of neuromotor recovery, multiple levels of description need to be taken into account. The review specifically covers models of recovery at the central, functional, and muscle synergy levels.
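
Models of recovery through exercise are often built around saturating learning curves; one common functional form (exponential approach to a plateau), with purely illustrative parameter values, is:

```python
import math

def recovery(t, p0, p_inf, tau):
    """Exponential recovery curve: performance starts at p0 and approaches
    the plateau p_inf with time constant tau (e.g., weeks of therapy)."""
    return p_inf - (p_inf - p0) * math.exp(-t / tau)

# Illustrative parameters: initial performance 0.2, plateau 0.9, tau = 4 weeks.
print(round(recovery(0.0, 0.2, 0.9, 4.0), 2))    # 0.2  (baseline)
print(round(recovery(100.0, 0.2, 0.9, 4.0), 2))  # 0.9  (plateau reached)
```

Fitting tau and p_inf to an individual patient's early measurements is one simple way such a model could, in principle, support the per-patient predictions mentioned in the abstract.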

  5. 2014 Annual Wastewater Reuse Report for the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Mike [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2013, through October 31, 2014. The report contains, as applicable, the following information: site description; facility and system description; permit-required monitoring data and loading rates; status of compliance conditions and activities; and a discussion of the facility’s environmental impacts. The current permit expires on March 16, 2015. A permit renewal application was submitted to the Idaho Department of Environmental Quality on September 15, 2014. During the 2014 permit year, no wastewater was land-applied to the irrigation area of the Central Facilities Area Sewage Treatment Plant; therefore, no effluent flow volumes or samples were collected from wastewater sampling point WW-014102. Seepage testing of the three lagoons was performed between August 26, 2014 and September 22, 2014. Seepage rates from Lagoons 1 and 2 were below the 0.25 inches/day requirement; however, Lagoon 3 was above 0.25 inches/day. Lagoon 3 has been isolated and is being evaluated for future use or permanent removal from service.

  6. Computer-based data acquisition system in the Large Coil Test Facility

    International Nuclear Information System (INIS)

    The utilization of computers for data acquisition and control is of paramount importance in large-scale fusion experiments because they provide the ability to acquire data from a large number of sensors at various sample rates and allow flexible data interpretation, presentation, reduction, and analysis. In the Large Coil Test Facility (LCTF) a Digital Equipment Corporation (DEC) PDP-11/60 host computer with the DEC RSX-11M operating system coordinates the activities of five DEC LSI-11/23 front-end processors (FEPs) via direct memory access (DMA) communication links. This provides host control of scheduled data acquisition and FEP event-triggered data collection tasks. Four of the five FEPs have no operating system

  7. Computer-guided facility for the study of single crystals at the gamma diffractometer GADI

    International Nuclear Information System (INIS)

    In the study of solid-state properties it is in many cases necessary to work with single crystals. Increased demand from industry and research, as well as the desire for better characterization by means of γ-diffractometry, made it necessary to improve and modernize the existing instrument. The advantages of a computer-guided facility over conventional, semiautomatic operation are manifold: not only process control, but also data acquisition and evaluation, are performed by the computer. Using a remote control, the operator is able to quickly find a reflection and drive the crystal to any desired measuring position. The complete logging of all important measurement parameters, the convenient data storage, and the automatic evaluation are of great use to the user. Finally, the measuring time can be extended to practically 24 hours per day. This puts characterization by means of γ-diffractometry on a completely new level. (orig.)
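
Driving the crystal to a reflection presupposes computing the diffraction angle from Bragg's law; a minimal sketch with a hypothetical gamma-ray wavelength and lattice spacing (not the GADI control code):

```python
import math

def bragg_angle_deg(wavelength, d_spacing, order=1):
    """Diffraction angle theta (degrees) from Bragg's law: n*lambda = 2*d*sin(theta)."""
    s = order * wavelength / (2.0 * d_spacing)
    if not 0 < s <= 1:
        raise ValueError("no diffraction possible for these parameters")
    return math.degrees(math.asin(s))

# Hypothetical values: 0.03 Angstrom gamma wavelength, 2.0 Angstrom lattice spacing.
# Gamma diffraction angles are small, which is why precise motor control matters.
print(round(bragg_angle_deg(0.03, 2.0), 3))  # 0.43
```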

  8. Enhanced Computational Infrastructure for Data Analysis at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available between pulses, giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator, since the DIII-D National Team consists of scientists from 9 national laboratories, 19 foreign laboratories, 16 universities, and 5 industrial partnerships. As a result of this work, DIII-D data is available on a 24 x 7 basis from a set of viewing and analysis tools that can be run either on the collaborators' or DIII-D's computer systems. Additionally, a Web-based data and code documentation system has been created to aid the novice and expert user alike.

  9. Stand alone computer system to aid the development of Mirror Fusion Test Facility rf heating systems

    International Nuclear Information System (INIS)

    The Mirror Fusion Test Facility (MFTF-B) control system architecture requires the Supervisory Control and Diagnostic System (SCDS) to communicate with an LSI-11 Local Control Computer (LCC) that in turn communicates via a fiber optic link to CAMAC-based control hardware located near the machine. In many cases, the control hardware is very complex and requires a sizable development effort prior to being integrated into the overall MFTF-B system. One such effort was the development of the Electron Cyclotron Resonance Heating (ECRH) system. It became clear that a stand-alone computer system was needed to simulate the functions of SCDS. This paper describes the hardware and software necessary to implement the SCDS Simulation Computer (SSC). It consists of a Digital Equipment Corporation (DEC) LSI-11 computer and a Winchester/floppy disk operating under the DEC RT-11 operating system. All application software for MFTF-B is programmed in PASCAL, which allowed us to adapt procedures originally written for SCDS to the SSC. This nearly identical software interface means that software written during the equipment development will be useful to the SCDS programmers in the integration phase

  10. Enhanced Computational Infrastructure for Data Analysis at the DIII-D National Fusion Facility

    Energy Technology Data Exchange (ETDEWEB)

    Schissel, D.P.; Peng, Q.; Schachter, J.; Terpstra, T.B.; Casper, T.A.; Freeman, J.; Jong, R.; Keith, K.M.; Meyer, W.H.; Parker, C.T.

    1999-08-01

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available between pulses, giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator, since the DIII-D National Team consists of scientists from 9 national laboratories, 19 foreign laboratories, 16 universities, and 5 industrial partnerships. As a result of this work, DIII-D data is available on a 24 x 7 basis from a set of viewing and analysis tools that can be run either on the collaborators' or DIII-D's computer systems. Additionally, a Web-based data and code documentation system has been created to aid the novice and expert user alike.

  11. Investigation of Storage Options for Scientific Computing on Grid and Cloud Facilities

    International Nuclear Information System (INIS)

    In recent years, several new storage technologies, such as Lustre, Hadoop, OrangeFS, and BlueArc, have emerged. While several groups have run benchmarks to characterize them under a variety of configurations, more work is needed to evaluate these technologies for the use cases of scientific computing on Grid clusters and Cloud facilities. This paper discusses our evaluation of the technologies as deployed on a test bed at FermiCloud, one of the Fermilab infrastructure-as-a-service Cloud facilities. The test bed consists of 4 server-class nodes with 40 TB of disk space and up to 50 virtual machine clients, some running on the storage server nodes themselves. With this configuration, the evaluation compares the performance of some of these technologies when deployed on virtual machines and on “bare metal” nodes. In addition to running standard benchmarks such as IOZone to check the sanity of our installation, we have run I/O intensive tests using physics-analysis applications. This paper presents how the storage solutions perform in a variety of realistic use cases of scientific computing. One interesting difference among the storage systems tested is found in a decrease in total read throughput with increasing number of client processes, which occurs in some implementations but not others.
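
The observed decrease in total read throughput with an increasing number of client processes can be probed with a much smaller home-grown benchmark; a minimal sketch using concurrent reader threads on a scratch file (not the paper's IOZone or physics-analysis workload, and with a file far smaller than a realistic dataset):

```python
import os
import tempfile
import time
from concurrent.futures import ThreadPoolExecutor

def read_file(path, block=1 << 20):
    """Sequentially read a file in 1 MiB blocks; return total bytes read."""
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(block):
            total += len(chunk)
    return total

def aggregate_throughput(path, n_clients):
    """Aggregate MB/s when n_clients threads read the same file concurrently."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=n_clients) as pool:
        totals = list(pool.map(lambda _: read_file(path), range(n_clients)))
    elapsed = time.perf_counter() - start
    return sum(totals) / elapsed / 1e6

# Hypothetical 8 MiB test file; a real evaluation would use multi-GB datasets
# and separate client hosts to defeat the page cache.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(8 * 1024 * 1024))
for n in (1, 2, 4):
    print(f"{n} clients: {aggregate_throughput(tmp.name, n):.0f} MB/s")
os.unlink(tmp.name)
```

Plotting aggregate throughput against client count is exactly the shape of measurement in which the paper's per-implementation scaling differences show up.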

  12. Current situation with the centralized storage facilities for non-power radioactive wastes in Latin American countries

    International Nuclear Information System (INIS)

    Full text: Several Latin American (LA) countries have been firmly committed to the peaceful applications of ionizing radiation in medicine, industry, agriculture and research in order to achieve socioeconomic development in diverse sectors. Consequently, the use of radioactive materials and radiation sources, as well as the production of radioisotopes and labeled compounds, inevitably produces radioactive wastes which require adequate management and, in the end, disposal. However, there are countries in the Latin American region whose radioactive waste volumes do not easily justify a national repository. Moreover, such facilities are extremely expensive to develop, and it is unlikely that such an option will become available in the foreseeable future for most of these countries, which do not have nuclear industries. Storage has long been incorporated as a step in the management of radioactive wastes. In recent years, there have been developments that have led some countries to consider whether the role of storage might be expanded to provide longer-term care of long-lived radioactive wastes. The aim of this paper is to discuss the current situation with the storage facilities/conditions for radioactive wastes and disused sealed radioactive sources in Latin American countries. In some cases a brief description of the existing facilities is provided; in other cases, when no centralized facility exists, general information on the radioactive inventories and disused sealed sources is given. (author)

  13. Investigating Preterm Care at the Facility Level: Stakeholder Qualitative Study in Central and Southern Malawi.

    Science.gov (United States)

    Gondwe, Austrida; Munthali, Alister; Ashorn, Per; Ashorn, Ulla

    2016-07-01

    Objectives Malawi is estimated to have one of the highest preterm birth rates in the world. However, care of preterm infants at the facility level in Malawi has not been explored. We aimed to explore the views of health stakeholders about the care of preterm infants in health facilities and the existence of any policy protocol documents guiding the delivery of care to these infants. Methods We conducted 16 in-depth interviews with health stakeholders (11 service providers and 5 policy makers) using an interview guide and asked for any existing policy protocol documents guiding care for preterm infants in the health facilities in Malawi. The collected documents were reviewed and all the interviews were digitally recorded, transcribed and translated. All data were analysed using a content analysis approach. Results We identified four policy protocol documents and, of these, one had detailed information explaining the care of preterm infants. Policy makers reported that policy protocol documents to guide care for preterm infants were available in the health facilities, but the majority (63.6 %) of the service providers lacked knowledge about the existence of these documents. Health stakeholders reported several challenges in caring for preterm infants, including a lack of trained staff in preterm infant care, antibiotics, space, and supervision, and a poor referral system. Conclusions Our study highlights that improving health care service provider knowledge of preterm infant care is an integral part of care in preterm childbirth. Our findings suggest that policy makers and health decision makers should retain those trained in preterm newborn care in the health facility's preterm unit.

  14. Investigating Preterm Care at the Facility Level: Stakeholder Qualitative Study in Central and Southern Malawi.

    Science.gov (United States)

    Gondwe, Austrida; Munthali, Alister; Ashorn, Per; Ashorn, Ulla

    2016-07-01

    Objectives Malawi is estimated to have one of the highest preterm birth rates in the world. However, care of preterm infants at the facility level in Malawi has not been explored. We aimed to explore the views of health stakeholders about the care of preterm infants in health facilities and the existence of any policy protocol documents guiding the delivery of care to these infants. Methods We conducted 16 in-depth interviews with health stakeholders (11 service providers and 5 policy makers) using an interview guide and asked for any existing policy protocol documents guiding care for preterm infants in the health facilities in Malawi. The collected documents were reviewed and all the interviews were digitally recorded, transcribed and translated. All data were analysed using a content analysis approach. Results We identified four policy protocol documents and, of these, one had detailed information explaining the care of preterm infants. Policy makers reported that policy protocol documents to guide care for preterm infants were available in the health facilities, but the majority (63.6 %) of the service providers lacked knowledge about the existence of these documents. Health stakeholders reported several challenges in caring for preterm infants, including a lack of trained staff in preterm infant care, antibiotics, space, and supervision, and a poor referral system. Conclusions Our study highlights that improving health care service provider knowledge of preterm infant care is an integral part of care in preterm childbirth. Our findings suggest that policy makers and health decision makers should retain those trained in preterm newborn care in the health facility's preterm unit. PMID:26976282

  15. A framework design of computing environment for the material and life science facility at J-PARC

    International Nuclear Information System (INIS)

    We have started to design a framework for the computing environment of the Material and Life Science Facility at J-PARC, and some developments are already in progress. We have also begun to collaborate with the muon group of J-PARC on common software development and with the computing center of JAERI on large-scale computing environments. Security control is always a delicate and difficult problem: not only network security but also the definition of access privileges to databases must be handled carefully. International collaboration is also under consideration. It is not easy to have the same computing environment as other facilities, but data formats can be standardized (e.g., the NeXus format) and the basic usage of experiment control can be the same. Such standardization benefits users who often use several facilities. (author)

  16. Addressing capability computing challenges of high-resolution global climate modelling at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin

    2014-05-01

    During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25 degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in size from 20 MB to over 100 GB. Effective utilization of leadership-class resources requires careful planning and preparation. The application software, such as CESM, needs to be ported, optimized and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource-intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours", are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan is a Cray XK7 system, capable of a theoretical peak performance of over 27 PFlop/s, and consists of 18,688 compute nodes, each with an NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560
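
The "titan core hours" accounting mentioned above follows directly from Titan's 16 Opteron cores per node; a minimal sketch (the job size below is hypothetical):

```python
def titan_core_hours(nodes: int, wall_hours: float, cores_per_node: int = 16) -> float:
    """Charged core hours for a Titan job: nodes * cores-per-node * wallclock hours."""
    return nodes * cores_per_node * wall_hours

# A hypothetical 0.25-degree CESM run on 4096 nodes for 12 hours of wallclock:
print(f"{titan_core_hours(4096, 12.0):,.0f} core hours")  # 786,432 core hours
```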

  17. The Overview of the National Ignition Facility Distributed Computer Control System

    International Nuclear Information System (INIS)

    The Integrated Computer Control System (ICCS) for the National Ignition Facility (NIF) is a layered architecture of 300 front-end processors (FEP) coordinated by supervisor subsystems including automatic beam alignment and wavefront control, laser and target diagnostics, pulse power, and shot control timed to 30 ps. FEP computers incorporate either VxWorks on PowerPC or Solaris on UltraSPARC processors that interface to over 45,000 control points attached to VME-bus or PCI-bus crates respectively. Typical devices are stepping motors, transient digitizers, calorimeters, and photodiodes. The front-end layer is divided into another segment comprised of an additional 14,000 control points for industrial controls including vacuum, argon, synthetic air, and safety interlocks implemented with Allen-Bradley programmable logic controllers (PLCs). The computer network is augmented with asynchronous transfer mode (ATM) links that deliver video streams from 500 sensor cameras monitoring the 192 laser beams to operator workstations. Software is based on an object-oriented framework using CORBA distribution that incorporates services for archiving, machine configuration, graphical user interface, monitoring, event logging, scripting, alert management, and access control. Software coding using a mixed language environment of Ada95 and Java is one-third complete at over 300 thousand source lines. Control system installation is currently under way for the first 8 beams, with project completion scheduled for 2008

  18. The Overview of the National Ignition Facility Distributed Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L J; Bettenhausen, R C; Carey, R A; Estes, C M; Fisher, J M; Krammen, J E; Reed, R K; VanArsdall, P J; Woodruff, J P

    2001-10-15

    The Integrated Computer Control System (ICCS) for the National Ignition Facility (NIF) is a layered architecture of 300 front-end processors (FEP) coordinated by supervisor subsystems, including automatic beam alignment and wavefront control, laser and target diagnostics, pulse power, and shot control timed to 30 ps. FEP computers incorporate either VxWorks on PowerPC or Solaris on UltraSPARC processors that interface to over 45,000 control points attached to VME-bus or PCI-bus crates, respectively. Typical devices are stepping motors, transient digitizers, calorimeters, and photodiodes. The front-end layer is complemented by another segment comprising an additional 14,000 control points for industrial controls, including vacuum, argon, synthetic air, and safety interlocks, implemented with Allen-Bradley programmable logic controllers (PLCs). The computer network is augmented by asynchronous transfer mode (ATM) links that deliver video streams from 500 sensor cameras monitoring the 192 laser beams to operator workstations. Software is based on an object-oriented framework using CORBA distribution that incorporates services for archiving, machine configuration, graphical user interface, monitoring, event logging, scripting, alert management, and access control. Software coding, using a mixed-language environment of Ada95 and Java, is one-third complete at over 300 thousand source lines. Control system installation is currently under way for the first 8 beams, with project completion scheduled for 2008.

  19. A centralized hazardous waste treatment plant: the facilities of the ZVSMM at Schwabach as an example

    Energy Technology Data Exchange (ETDEWEB)

    Amsoneit, Norbert [Zweckverband Sondermuell-Entsorgung Mittelfranken, Rednitzhembach (Germany)

    1993-12-31

    In this work a centralized hazardous waste treatment plant is described and its infrastructure is presented. Special emphasis is given to the handling of the residues produced and to the different treatment processes through to final disposal. 2 refs., 4 figs.

  20. Techniques for measuring aerosol attenuation using the Central Laser Facility at the Pierre Auger Observatory

    OpenAIRE

    PIERRE AUGER Collaboration; Abreu, P; Pastor, Sergio

    2013-01-01

    The Pierre Auger Observatory in Malargue, Argentina, is designed to study the properties of ultra-high energy cosmic rays with energies above 10^18 eV. It is a hybrid facility that employs a Fluorescence Detector to perform nearly calorimetric measurements of Extensive Air Shower energies. To obtain reliable calorimetric information from the FD, the atmospheric conditions at the observatory need to be continuously monitored during data acquisition. In particular, light attenuation due to aer...

  1. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  2. Lustre Distributed Name Space (DNE) Evaluation at the Oak Ridge Leadership Computing Facility (OLCF)

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, James S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Leverman, Dustin B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Hanley, Jesse A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Oral, Sarp [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences

    2016-08-22

    This document describes the Lustre Distributed Name Space (DNE) evaluation carried out at the Oak Ridge Leadership Computing Facility (OLCF) between 2014 and 2015. DNE is a development project funded by OpenSFS to improve Lustre metadata performance and scalability. The development effort was split into two parts: the first (DNE P1) provides support for remote directories over remote Lustre Metadata Server (MDS) nodes and Metadata Target (MDT) devices, while the second phase (DNE P2) addresses split directories over multiple remote MDS nodes and MDT devices. The OLCF has been actively evaluating the performance, reliability, and functionality of both DNE phases. For these tests, an internal OLCF testbed was used. Results are promising, and OLCF is planning a full DNE deployment on production systems in the mid-2016 timeframe.
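
    The split-directory idea in DNE P2 is that entries of one directory are distributed across several MDTs. The sketch below simulates that distribution conceptually; the hash function and stripe count are illustrative, not Lustre's actual on-disk layout.

```python
# Conceptual sketch: entries of a "striped" directory spread over
# several metadata targets, each filename hashing to one stripe.
# The md5-based hash here is illustrative only, not Lustre's.
import hashlib
from collections import Counter

def mdt_index(name: str, stripe_count: int) -> int:
    """Map a filename to one of `stripe_count` MDT stripes."""
    digest = hashlib.md5(name.encode()).digest()
    return int.from_bytes(digest[:4], "little") % stripe_count

names = [f"file{i:05d}" for i in range(10_000)]
placement = Counter(mdt_index(n, 4) for n in names)
print(placement)  # entries land roughly evenly on the 4 stripes
```

    With a reasonably uniform hash, metadata load balances across MDTs, which is the scalability benefit the evaluation measures.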

  3. [Elderly people in street situations or social vulnerability: facilities and difficulties in the use of computational tools].

    Science.gov (United States)

    Frias, Marcos Antonio da Eira; Peres, Heloisa Helena Ciqueto; Pereira, Valclei Aparecida Gandolpho; Negreiros, Maria Célia de; Paranhos, Wana Yeda; Leite, Maria Madalena Januário

    2014-01-01

    This study aimed to identify the advantages of, and difficulties encountered by, older people living on the streets or in social vulnerability in using computers or the internet. It is an exploratory qualitative study in which five elderly people served by a non-governmental organization located in the city of São Paulo participated. The discourses were analyzed by the content analysis technique and revealed, among the facilitators, the chance to clarify doubts with the monitors, the stimulus for new discoveries coupled with proactivity and curiosity, and the development of new skills. The difficulties mentioned were related to physical or cognitive issues, the absence of an instructor, and lack of knowledge of how to interact with the machine. Studies focusing on the elderly population living on the streets or in social vulnerability may contribute evidence to guide the formulation of public policies for this population. PMID:25517671

  4. Status Of The National Ignition Campaign And National Ignition Facility Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L; Brunton, G; Carey, R; Demaret, R; Fisher, J; Fishler, B; Ludwigsen, P; Marshall, C; Reed, R; Shelton, R; Townsend, S

    2011-03-18

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility that contains a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn. NIF is operated by the Integrated Computer Control System (ICCS), an object-oriented, CORBA-based system distributed among over 1800 front-end processors, embedded controllers and supervisory servers. In the fall of 2010, a set of experiments began with deuterium- and tritium-filled targets as part of the National Ignition Campaign (NIC). At present, all 192 laser beams routinely fire to target chamber center to conduct fusion and high-energy-density experiments. During the past year, the control system was expanded to include automation of the cryogenic target system, and over 20 diagnostic systems were deployed and utilized in fusion experiments. This talk discusses the current status of the NIC and the plan for controls and information systems to support these experiments on the path to ignition.

  5. Navier-Stokes Simulation of the Air-Conditioning Facility of a Large Modern Computer Room

    Science.gov (United States)

    2005-01-01

    NASA recently assembled one of the world's fastest operational supercomputers to meet the agency's new high-performance computing needs. This large-scale system, named Columbia, consists of 20 interconnected SGI Altix 512-processor systems, for a total of 10,240 Intel Itanium-2 processors. High-fidelity CFD simulations were performed for the NASA Advanced Supercomputing (NAS) computer room at Ames Research Center. The purpose of the simulations was to assess the adequacy of the existing air handling and conditioning system and make recommendations for changes in the design of the system if needed. The simulations were performed with NASA's OVERFLOW-2 CFD code, which utilizes overset structured grids. A new set of boundary conditions was developed and added to the flow solver for modeling the room's air-conditioning and proper cooling of the equipment. Boundary condition parameters for the flow solver are based on cooler CFM (flow rate) ratings and some reasonable assumptions of flow and heat transfer data for the floor and central processing units (CPUs). The geometry modeling from blueprints and grid generation were handled by the NASA Ames software package Chimera Grid Tools (CGT). This geometric model was developed as a CGT-scripted template, which can be easily modified to accommodate any changes in shape and size of the room, locations and dimensions of the CPU racks, disk racks, coolers, power distribution units, and mass-storage system. The compute nodes are grouped in pairs of racks with an aisle in the middle. High-speed connection cables connect the racks with overhead cable trays. The cool air from the cooling units is pumped into the computer room from a sub-floor through perforated floor tiles. The CPU cooling fans draw cool air from the floor tiles, which run along the outside length of each rack, and eject warm air into the center aisle between the racks. This warm air is eventually drawn into the cooling units located near the walls of the room.
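
    Deriving an inflow boundary condition from a cooler's CFM rating, as the abstract describes, is a unit conversion over the perforated-tile open area. The CFM rating and tile geometry below are hypothetical round numbers, not values from the NAS room.

```python
# Convert a cooler's volumetric flow rating (CFM) into a mean inflow
# velocity through the perforated-tile open area feeding the racks.
CFM_TO_M3S = 0.000471947  # 1 ft^3/min in m^3/s

def inflow_velocity(cfm: float, open_area_m2: float) -> float:
    """Mean velocity (m/s) through the perforated-tile open area."""
    return cfm * CFM_TO_M3S / open_area_m2

# e.g. a hypothetical 10,000 CFM cooler feeding 8 tiles of 0.09 m^2
# open area each
v = inflow_velocity(10_000, 8 * 0.09)
print(f"{v:.2f} m/s")
```

    A solver boundary condition would then impose this velocity (with an inflow temperature) on the tile faces.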

  6. The effects of buffer strips and bioretention facilities on agricultural productivity and environmental quality in Central Africa

    Science.gov (United States)

    Gilroy, Kristin L.; McCuen, Richard H.

    2010-05-01

    Land degradation is a growing concern in Central Africa as poor management practices continue to cause erosion and increase water runoff in agricultural fields. The implementation of best management practices (BMPs) is needed; however, productivity is often indirectly related to the environmental benefits of such practices, and resource constraints often exist. The purpose of this study was to determine the effects of bioretention facilities and buffer strips on environmental quality with productivity and resources as constraints. A water quantity and quality model for an agricultural field in Central Africa was developed. Analyses were conducted to assess the marginal benefits of each BMP, the effect of different BMP combinations on environmental quality and productivity, and the effect of data uncertainty and location uncertainty on model predictions. The results showed that bioretention pits were more effective than buffer strips in increasing environmental quality. Productivity was shown to be directly related to bioretention pits, thus environmental quality can be attained without sacrificing productivity. Data uncertainties resulted in changes in the environmental quality values, but trends remained the same. Guidelines were provided to assist design engineers in developing BMP scenarios that provide the greatest productivity and environmental quality for the constraints involved. The results of this study will raise awareness of the possibility of attaining environmental quality without sacrificing productivity, as well as of the need for accurate data in Central Africa.

  7. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data
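
    The third of the computations listed above, a cost index, is in its simplest form a weighted ratio of current to base-year component costs. The sketch below shows that generic calculation; the component names and prices are hypothetical, and this is not the INDEXER code itself.

```python
# A generic weighted cost index: ratio of current to base-year
# component costs, expressed with base year = 100. Components and
# prices are invented placeholders.
def cost_index(base_costs: dict, current_costs: dict) -> float:
    """Index (base = 100) over the components present in base_costs."""
    base_total = sum(base_costs.values())
    return 100.0 * sum(current_costs[k] for k in base_costs) / base_total

base = {"structures": 40.0, "equipment": 50.0, "labor": 10.0}
now  = {"structures": 48.0, "equipment": 60.0, "labor": 13.0}
print(cost_index(base, now))  # 121.0 -> costs escalated 21% overall
```

    Such an index lets a capital cost estimate made in one year be escalated to another year's dollars.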

  8. Integrated assessment of a new Waste-to-Energy facility in Central Greece in the context of regional perspectives.

    Science.gov (United States)

    Perkoulidis, G; Papageorgiou, A; Karagiannidis, A; Kalogirou, S

    2010-07-01

    The main aim of this study is the integrated assessment of a proposed Waste-to-Energy facility that could contribute to the Municipal Solid Waste Management system of the Region of Central Greece. In the context of this paper, alternative transfer schemes for supplying the candidate facility were assessed considering local conditions and economic criteria. A mixed-integer linear programming model was applied to determine optimum locations of Transfer Stations for an efficient supply chain between the waste producers and the Waste-to-Energy facility. Moreover, different Regional Waste Management Scenarios were assessed against multiple criteria via the Multi-Criteria Decision Making method ELECTRE III. The chosen criteria were total cost, Biodegradable Municipal Waste diversion from landfill, energy recovery, and Greenhouse Gas emissions; the analysis demonstrated that a Waste Management Scenario based on a Waste-to-Energy plant with an adjacent landfill for disposal of the residues would be the best-performing option for the Region, depending however on the priorities of the decision makers. In addition, the study demonstrated that efficient planning is necessary and that the case of three sanitary landfills operating in parallel with the WtE plant in the study area should be avoided. Moreover, alternative cases of energy recovery at the candidate Waste-to-Energy facility were evaluated against the requirements of the new European Commission Directive on waste in order for the facility to be recognized as a recovery operation. The latter issue is of high significance, and decision makers in European Union countries should take it into account from now on in order to plan and implement facilities that recover energy efficiently. Finally, a sensitivity check was performed to evaluate the effects of an increased recycling rate on the calorific value of treated Municipal Solid Waste and the gate fee of the candidate plant, and found that increased
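
    The transfer-station siting problem the abstract solves with a MILP can be illustrated at toy scale by brute force: choose the subset of candidate sites that minimises waste-weighted haul distance. All towns, tonnages, distances, and the choice of two stations below are invented for illustration and are not the study's model or data.

```python
# Toy facility-location sketch: pick 2 of 3 candidate transfer-station
# sites minimising total tonne-km from towns to their nearest station.
from itertools import combinations

towns = {"A": 120, "B": 80, "C": 60}          # tonnes/day (invented)
dist = {  # town -> candidate site -> km (invented)
    "A": {"s1": 10, "s2": 40, "s3": 25},
    "B": {"s1": 35, "s2": 12, "s3": 20},
    "C": {"s1": 50, "s2": 22, "s3": 15},
}

def total_cost(sites):
    """Tonne-km if every town hauls to its nearest open site."""
    return sum(t * min(dist[town][s] for s in sites)
               for town, t in towns.items())

best = min(combinations(["s1", "s2", "s3"], 2), key=total_cost)
print(best, total_cost(best))
```

    A real instance, with fixed station costs and capacities, needs the MILP formulation; enumeration only works at this toy size.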

  9. Integrated assessment of a new Waste-to-Energy facility in Central Greece in the context of regional perspectives

    International Nuclear Information System (INIS)

    The main aim of this study is the integrated assessment of a proposed Waste-to-Energy facility that could contribute to the Municipal Solid Waste Management system of the Region of Central Greece. In the context of this paper, alternative transfer schemes for supplying the candidate facility were assessed considering local conditions and economic criteria. A mixed-integer linear programming model was applied to determine optimum locations of Transfer Stations for an efficient supply chain between the waste producers and the Waste-to-Energy facility. Moreover, different Regional Waste Management Scenarios were assessed against multiple criteria via the Multi-Criteria Decision Making method ELECTRE III. The chosen criteria were total cost, Biodegradable Municipal Waste diversion from landfill, energy recovery, and Greenhouse Gas emissions; the analysis demonstrated that a Waste Management Scenario based on a Waste-to-Energy plant with an adjacent landfill for disposal of the residues would be the best-performing option for the Region, depending however on the priorities of the decision makers. In addition, the study demonstrated that efficient planning is necessary and that the case of three sanitary landfills operating in parallel with the WtE plant in the study area should be avoided. Moreover, alternative cases of energy recovery at the candidate Waste-to-Energy facility were evaluated against the requirements of the new European Commission Directive on waste in order for the facility to be recognized as a recovery operation. The latter issue is of high significance, and decision makers in European Union countries should take it into account from now on in order to plan and implement facilities that recover energy efficiently. Finally, a sensitivity check was performed to evaluate the effects of an increased recycling rate on the calorific value of treated Municipal Solid Waste and the gate fee of the candidate plant, and found that increased

  10. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA; in fact, international research projects account for 12% of the INCITE awards in 2014, and the INCITE scientific review panel includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation petascale simulation codes toward the performance levels required for running on future exascale systems.
One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and
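
    The overlap technique mentioned above can be shown in miniature: start the communication step in the background, do interior work while it is in flight, and synchronise before the exchanged data is used. The sketch below is a Python threading analogue of that pattern, not Fortran coarrays; the sleep stands in for network latency.

```python
# Overlapping computation with communication: the "halo exchange"
# (a sleep standing in for a network transfer) runs in a background
# thread while the interior computation proceeds.
import threading
import time

def halo_exchange(result):
    time.sleep(0.2)               # stand-in for communication latency
    result["halo"] = [0.0] * 4    # exchanged boundary values arrive

result = {}
t = threading.Thread(target=halo_exchange, args=(result,))
start = time.perf_counter()
t.start()                                        # communication begins
interior = sum(i * i for i in range(200_000))    # interior work overlaps
t.join()                                         # halo now available
elapsed = time.perf_counter() - start
print(f"overlapped in {elapsed:.2f}s, halo={result['halo']}")
```

    Because the compute finishes while the transfer is in flight, the total time is close to the communication time alone rather than the sum of the two, which is the payoff the coarray work targets.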

  11. Techno-economic assessment of central sorting at material recovery facilities

    DEFF Research Database (Denmark)

    Cimpan, Ciprian; Maul, Anja; Wenzel, Henrik;

    2016-01-01

    Simulation of technical and economic performance for materials recovery facilities (MRFs) is a basic requirement for planning new, or evaluating existing, separate waste collection and recycling systems. This study mitigates the current pervasive scarcity of data on process efficiency and costs by documenting typical steps taken in a techno-economic assessment of MRFs, using the specific example of lightweight packaging waste (LWP) sorting in Germany. Thus, the study followed the steps of dimensioning of buildings and equipment, calculation of processing costs and projections of revenues from material sales and sorting residues disposal costs. Material flows through the plants were simulated considering both optimal process conditions and real or typical conditions characterised by downtime and frequent operation at overcapacity. By modelling four plants of progressively higher capacity (size...
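
    The costing steps listed in the abstract reduce, at their simplest, to annual cost divided by realised throughput, where availability captures the downtime the study emphasises. Every figure in the sketch below is a hypothetical placeholder, not data from the study.

```python
# Processing cost per tonne for a sorting plant: annualised capital
# plus operating cost, divided by throughput degraded by downtime.
# All numbers are invented placeholders.
def cost_per_tonne(nominal_tph: float, hours_per_year: float,
                   availability: float, capex_annuity: float,
                   opex_per_year: float) -> float:
    """EUR per tonne actually processed in a year."""
    throughput = nominal_tph * hours_per_year * availability  # t/yr
    return (capex_annuity + opex_per_year) / throughput

c = cost_per_tonne(nominal_tph=20, hours_per_year=4000,
                   availability=0.85, capex_annuity=1_200_000,
                   opex_per_year=2_200_000)
print(f"{c:.1f} EUR/t")
```

    Lower availability raises the per-tonne cost directly, which is why the study's "real or typical" runs differ from the optimal ones.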

  12. Simulation of an EPRI-Nevada Test Site (NTS) hydrogen burn test at the Central Receiver Test Facility

    International Nuclear Information System (INIS)

    In order to augment results obtained from the hydrogen burn equipment survival tests performed by the Electric Power Research Institute (EPRI) at the Nevada Test Site (NTS), a series of tests was conducted at the Sandia National Laboratories Central Receiver Test Facility (CRTF). The CRTF tests simulated a 13 volume-percent burn from the EPRI-NTS series. During the CRTF tests, the thermal and operational responses of several specimens of nuclear power plant safety-related equipment were monitored when subjected to a solar heat flux simulation of a hydrogen burn. The simulation was conducted with and without steam in the vicinity of the test specimens. Prior to exposure, the specimens were preheated to temperatures corresponding to the precombustion environment in the EPRI-NTS test vessel

  13. Computational simulation of multi-strut central lobed injection of hydrogen in a scramjet combustor

    Directory of Open Access Journals (Sweden)

    Gautam Choubey

    2016-09-01

    Multi-strut injection is an approach to increase the overall performance of a scramjet while reducing the risk of thermal choking in a supersonic combustor. Hence, a computational simulation of a scramjet combustor at Mach 2.5 with multiple central lobed struts (three struts) has been presented and discussed in the present research article. The geometry and model used here are a slight modification of the DLR (German Aerospace Center) scramjet model. Present results show that the three-strut injector improves the performance of the scramjet combustor as compared to a single-strut injector. The combustion efficiency is also found to be highest in the case of the three-strut fuel injection system. In order to validate the results, the numerical data for single-strut injection are compared with experimental results taken from the literature.

  14. The National Ignition Facility: Status of the Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Van Arsdall, P J; Bryant, R; Carey, R; Casavant, D; Demaret, R; Edwards, O; Ferguson, W; Krammen, J; Lagin, L; Larson, D; Lee, A; Ludwigsen, P; Miller, M; Moses, E; Nyholm, R; Reed, R; Shelton, R; Wuest, C

    2003-10-13

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately three quarters complete with over 750 thousand source lines of code having undergone off-line verification tests and deployed to the facility. NIF has entered the first phases of its laser commissioning program. NIF's highest 3ω single laser beam performance is 10.4 kJ, equivalent to 2 MJ

  15. The National Ignition Facility: Status of the Integrated Computer Control System

    International Nuclear Information System (INIS)

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately three quarters complete with over 750 thousand source lines of code having undergone off-line verification tests and deployed to the facility. NIF has entered the first phases of its laser commissioning program. 
NIF's highest 3ω single laser beam performance is 10.4 kJ, equivalent to 2 MJ for a fully activated

  16. Status of the National Ignition Facility Integrated Computer Control System (ICCS) on the path to ignition

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L.J. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94550 (United States)], E-mail: lagin1@llnl.gov; Bettenhausen, R.C.; Bowers, G.A.; Carey, R.W.; Edwards, O.D.; Estes, C.M.; Demaret, R.D.; Ferguson, S.W.; Fisher, J.M.; Ho, J.C.; Ludwigsen, A.P.; Mathisen, D.G.; Marshall, C.D.; Matone, J.T.; McGuigan, D.L.; Sanchez, R.J.; Stout, E.A.; Tekle, E.A.; Townsend, S.L.; Van Arsdall, P.J. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94550 (United States)] (and others)

    2008-04-15

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility under construction that will contain a 192-beam, 1.8-MJ, 500-TW, ultraviolet laser system together with a 10-m diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. NIF is comprised of 24 independent bundles of eight beams each using laser hardware that is modularized into more than 6000 line replaceable units such as optical assemblies, laser amplifiers, and multi-function sensor packages containing 60,000 control and diagnostic points. NIF is operated by the large-scale Integrated Computer Control System (ICCS) in an architecture partitioned by bundle and distributed among over 800 front-end processors and 50 supervisory servers. NIF's automated control subsystems are built from a common object-oriented software framework based on CORBA distribution that deploys the software across the computer network and achieves interoperation between different languages and target architectures. A shot automation framework has been deployed during the past year to orchestrate and automate shots performed at the NIF using the ICCS. In December 2006, a full cluster of 48 beams of NIF was fired simultaneously, demonstrating that the independent bundle control system will scale to full scale of 192 beams. At present, 72 beams have been commissioned and have demonstrated 1.4-MJ capability of infrared light. During the next 2 years, the control system will be expanded in preparation for project completion in 2009 to include automation of target area systems including

  17. A service-based SLA (Service Level Agreement) for the RACF (RHIC and ATLAS Computing Facility) at Brookhaven National Lab

    Science.gov (United States)

    Karasawa, Mizuka; Chan, Tony; Smith, Jason

    2010-04-01

    The RACF provides computing support to a broad spectrum of scientific programs at Brookhaven. The continuing growth of the facility, the diverse needs of the scientific programs, and the increasingly prominent role of distributed computing require the RACF to change from a system-based to a service-based SLA with our user communities. A service-based SLA allows the RACF to coordinate the operation, maintenance, and development of the facility more efficiently by mapping out a matrix of system and service dependencies and by creating a new, configurable alarm-management layer that automates service alerts and notification of operations staff. This paper describes the adjustments made by the RACF to transition to a service-based SLA, including the integration of its monitoring software, alarm notification mechanism, and service ticket system to make the new SLA a reality.
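
    The dependency matrix described in the abstract can be sketched minimally: map backend systems to the user-facing services that depend on them, so a system alarm translates into service alerts. The system and service names below are invented, not RACF's actual inventory.

```python
# Minimal dependency-matrix sketch: which user-facing services are
# affected when a backend system raises an alarm. Names are invented.
depends_on = {
    "batch-queue":  ["gpfs", "ldap"],
    "interactive":  ["ldap", "home-nfs"],
    "grid-gateway": ["gpfs", "certs"],
}

def affected_services(failed_system: str) -> list:
    """Services whose dependency list contains the failed system."""
    return sorted(svc for svc, systems in depends_on.items()
                  if failed_system in systems)

print(affected_services("gpfs"))   # ['batch-queue', 'grid-gateway']
```

    An alarm layer built on such a matrix can notify only the operators and user communities whose services are actually impacted, which is the efficiency gain the paper describes.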

  18. A service-based SLA (Service Level Agreement) for the RACF (RHIC and ATLAS Computing Facility) at Brookhaven National Lab

    International Nuclear Information System (INIS)

    The RACF provides computing support to a broad spectrum of scientific programs at Brookhaven. The continuing growth of the facility, the diverse needs of the scientific programs, and the increasingly prominent role of distributed computing require the RACF to change from a system-based to a service-based SLA with our user communities. A service-based SLA allows the RACF to coordinate the operation, maintenance, and development of the facility more efficiently by mapping out a matrix of system and service dependencies and by creating a new, configurable alarm-management layer that automates service alerts and notification of operations staff. This paper describes the adjustments made by the RACF to transition to a service-based SLA, including the integration of its monitoring software, alarm notification mechanism, and service ticket system to make the new SLA a reality.

  19. Surface Water Modeling Using an EPA Computer Code for Tritiated Waste Water Discharge from the Heavy Water Facility

    International Nuclear Information System (INIS)

Tritium releases from the D-Area Heavy Water Facilities to the Savannah River have been analyzed. The U.S. EPA WASP5 computer code was used to simulate surface water transport for tritium releases from the D-Area Drum Wash, Rework, and DW facilities. The WASP5 model was qualified against the 1993 tritium measurements at U.S. Highway 301. At the maximum tritiated waste water concentrations, the calculated tritium concentration in the Savannah River at U.S. Highway 301 due to concurrent releases from the D-Area Heavy Water Facilities varies from 5.9 to 18.0 pCi/ml as a function of the operating conditions of these facilities. The calculated concentration is lowest when the batch release method for the Drum Wash Waste Tanks is adopted.
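
The order of magnitude of such river concentrations can be checked with a steady-state, fully mixed dilution estimate. The sketch below is illustrative only; WASP5 resolves the transport, dispersion and time dependence that this simple mass balance ignores, and the numbers are placeholders.

```python
def river_concentration(release_rate_ci_per_day, river_flow_m3_per_s):
    """Steady-state, fully mixed dilution estimate: C = Q_release / Q_river.

    Returns a tritium concentration in pCi/ml. Purely illustrative; a code
    like WASP5 models the actual transport rather than assuming full mixing.
    """
    release_pci_per_day = release_rate_ci_per_day * 1e12      # Ci -> pCi
    flow_ml_per_day = river_flow_m3_per_s * 1e6 * 86400.0     # m^3/s -> ml/day
    return release_pci_per_day / flow_ml_per_day
```

For a hypothetical 100 Ci/day release into a 300 m³/s river flow, this yields a few pCi/ml, the same order as the 5.9-18.0 pCi/ml range reported above.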

  20. Analysis of shielding calculation methods for 16- and 64-slice computed tomography facilities

    Energy Technology Data Exchange (ETDEWEB)

Moreno, C; Cenizo, E; Bodineau, C; Mateo, B; Ortega, E M, E-mail: c_morenosaiz@yahoo.e [Servicio de Radiofísica Hospitalaria, Hospital Regional Universitario Carlos Haya, Málaga (Spain)]

    2010-09-15

The new multislice computed tomography (CT) machines require new methods of shielding calculation, which need to be analysed. NCRP Report No. 147 proposes three shielding calculation methods based on the following dosimetric parameters: weighted CT dose index for the peripheral axis (CTDI{sub w,per}), dose-length product (DLP) and isodose maps. A survey of these three methods has been carried out. For this analysis, we have used measured values of the dosimetric quantities involved as well as those provided by the manufacturer, and compared the results obtained. The barrier thicknesses required when installing two different multislice CT machines, a Philips Brilliance 16 or a Philips Brilliance 64, in the same room are also compared. Shielding calculation from isodose maps provides more reliable results than the other two methods, since it is the only method that takes the actual scattered radiation distribution into account. It is concluded therefore that the most suitable method for calculating the barrier thicknesses of a CT facility is the one based on isodose maps. This study also shows that for different multislice CT machines the barrier thicknesses do not necessarily increase with the number of slices, because of the strong dependence on the techniques used in CT protocols for different anatomical regions.
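
The DLP-based method mentioned above reduces to a short calculation. The sketch below follows the general NCRP-147 pattern (scatter proportional to DLP, then a tenth-value-layer thickness); the scatter factor and TVL are deliberately left as arguments because their values must come from the report's tables or from measurements, not from this code.

```python
import math

def barrier_thickness(P_mGy_per_week, d_m, patients_per_week,
                      dlp_per_patient_mGy_cm, kappa_per_cm, tvl_mm):
    """NCRP-147-style DLP shielding method, sketched.

    kappa_per_cm: scatter-to-DLP factor (head vs body; placeholder argument,
                  take the value from NCRP 147 or from measurements).
    tvl_mm:       tenth-value layer of the barrier material at the CT kV.
    Returns the required barrier thickness in mm of that material.
    """
    # Scattered air kerma per patient at 1 m from the isocentre (mGy).
    scatter_at_1m = kappa_per_cm * dlp_per_patient_mGy_cm
    # Unshielded weekly kerma at the occupied position (inverse square law).
    unshielded = patients_per_week * scatter_at_1m / d_m**2
    if unshielded <= P_mGy_per_week:
        return 0.0                                   # design goal already met
    transmission = P_mGy_per_week / unshielded       # required transmission B
    return tvl_mm * math.log10(1.0 / transmission)   # n TVLs -> mm of material
```

The isodose-map method the authors prefer replaces the `kappa * DLP / d**2` estimate with the measured scatter distribution, but the transmission-to-thickness step is the same.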

  1. EXPERIMENTAL AND COMPUTATIONAL ACTIVITIES AT THE OREGON STATE UNIVERSITY NEES TSUNAMI RESEARCH FACILITY

    Directory of Open Access Journals (Sweden)

    S.C. Yim

    2009-01-01

A diverse series of research projects have taken place or are underway at the NEES Tsunami Research Facility at Oregon State University. Projects range from the simulation of the processes and effects of tsunamis generated by sub-aerial and submarine landslides (NEESR, Georgia Tech.), model comparisons of tsunami wave effects on bottom profiles and scouring (NEESR, Princeton University), model comparisons of wave-induced motions on rigid and free bodies (Shared-Use, Cornell), numerical model simulations and testing of breaking waves and inundation over topography (NEESR, TAMU), structural testing and development of standards for tsunami engineering and design (NEESR, University of Hawaii), and wave loads on coastal bridge structures (non-NEES), to upgrading the two-dimensional wave generator of the Large Wave Flume. A NEESR payload project (Colorado State University) was undertaken that seeks to improve the understanding of the stresses from wave loading and run-up on residential structures. Advanced computational tools for coupling fluid-structure interaction, including turbulence, contact and impact, are being developed to assist with the design of experiments and to complement parametric studies. These projects will contribute towards understanding the physical processes that occur during earthquake-generated tsunamis, including structural stress, debris flow and scour, inundation and overland flow, and landslide-generated tsunamis. Analytical and numerical model development and comparisons with the experimental results give engineers additional predictive tools to assist in the development of robust structures as well as the identification of hazard zones and the formulation of hazard plans.

  2. Development of a computer code for shielding calculation in X-ray facilities

    International Nuclear Information System (INIS)

The construction of an effective barrier against the ionizing radiation present in X-ray rooms requires consideration of many variables. The methodology used for specifying the thickness of primary and secondary shielding of a traditional X-ray room considers the following factors: use factor, occupancy factor, distance between the source and the wall, workload, air kerma, and distance between the patient and the receptor. With these data it was possible to develop a computer program that identifies and uses these variables in functions obtained through regressions of the graphs provided by NCRP Report No. 147 (Structural Shielding Design for Medical X-Ray Imaging Facilities) to calculate the shielding of the room walls as well as the walls of the darkroom and adjacent areas. With the methodology in place, the program is validated by comparing its results with a base case provided by that report. The thicknesses obtained cover various materials such as steel, wood and concrete. After validation, the program is applied to a real case of a radiographic room, whose visual construction is done with the help of software used for indoor and outdoor modeling. The barrier calculation program resulted in a user-friendly tool for planning radiographic rooms that comply with the limits established by CNEN-NN-3.01, published in September 2011.
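
The core of such a program is the broad-beam transmission model that NCRP 147 fits to its graphs (the Archer model), inverted to get a thickness from a required transmission. The sketch below shows that inversion; the fitting parameters α, β, γ and the kerma-per-workload factor are placeholders to be taken from the report's tables for the material and tube potential.

```python
import math

def archer_thickness(B, alpha, beta, gamma):
    """Invert the Archer broad-beam transmission model used by NCRP 147:
        B(x) = [(1 + beta/alpha) * exp(alpha*gamma*x) - beta/alpha] ** (-1/gamma)
    Returns the barrier thickness x giving transmission B. alpha, beta, gamma
    are material/kVp fitting parameters from published tables (placeholders).
    """
    r = beta / alpha
    return math.log((B ** -gamma + r) / (1.0 + r)) / (alpha * gamma)

def required_transmission(P, d, W, U, T, K1):
    """Required barrier transmission (sketch): the unshielded weekly kerma is
    K1*W*U*T/d^2, with K1 the kerma per workload unit at 1 m (placeholder),
    W the workload, U and T the use and occupancy factors, d the distance."""
    return P * d * d / (K1 * W * U * T)
```

Chaining the two (`archer_thickness(required_transmission(...), ...)`) reproduces the overall workflow the abstract describes: room parameters in, wall thickness out.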

  3. Guide to computing at ANL

    Energy Technology Data Exchange (ETDEWEB)

    Peavler, J. (ed.)

    1979-06-01

This publication gives details about the hardware, software, procedures, and services of the Central Computing Facility, as well as information about how to become an authorized user. The languages, compilers, libraries, and applications packages available are described. 17 tables. (RWR)

  4. In-Core Computation of Geometric Centralities with HyperBall: A Hundred Billion Nodes and Beyond

    CERN Document Server

    Boldi, Paolo

    2013-01-01

    Given a social network, which of its nodes are more central? This question has been asked many times in sociology, psychology and computer science, and a whole plethora of centrality measures (a.k.a. centrality indices, or rankings) were proposed to account for the importance of the nodes of a network. In this paper, we approach the problem of computing geometric centralities, such as closeness and harmonic centrality, on very large graphs; traditionally this task requires an all-pairs shortest-path computation in the exact case, or a number of breadth-first traversals for approximated computations, but these techniques yield very weak statistical guarantees on highly disconnected graphs. We rather assume that the graph is accessed in a semi-streaming fashion, that is, that adjacency lists are scanned almost sequentially, and that a very small amount of memory (in the order of a dozen bytes) per node is available in core memory. We leverage the newly discovered algorithms based on HyperLogLog counters, making...
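
The HyperBall iteration underlying this approach is easy to see with exact sets standing in for the HyperLogLog counters: each node's counter holds an estimate of its ball of radius t, and one pass unions every node's counter with its neighbours'. The sketch below uses exact sets to show the structure; in HyperBall proper, each set becomes an HLL counter of a few bytes and the union becomes a pointwise maximum of registers.

```python
def hyperball_harmonic(adj):
    """Harmonic centrality via the HyperBall iteration, with exact sets
    standing in for HyperLogLog counters (illustrative, undirected graph).

    adj: dict mapping each node to an iterable of its neighbours.
    """
    ball = {v: {v} for v in adj}          # ball[v] ~ B(v, t), here t = 0
    prev_size = {v: 1 for v in adj}
    harmonic = {v: 0.0 for v in adj}
    t = 0
    changed = True
    while changed:
        t += 1
        changed = False
        new_ball = {}
        for v in adj:
            b = set(ball[v])
            for w in adj[v]:
                b |= ball[w]              # in HyperBall: pointwise max of registers
            new_ball[v] = b
        for v in adj:
            grown = len(new_ball[v]) - prev_size[v]   # nodes at distance exactly t
            if grown:
                harmonic[v] += grown / t              # sum of 1/d(v, u)
                changed = True
            prev_size[v] = len(new_ball[v])
        ball = new_ball
    return harmonic
```

Because only ball *sizes* are needed, the exact sets can be replaced by approximate cardinality counters, which is what makes the hundred-billion-node scale of the title feasible.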

  5. COMPUTING

    CERN Multimedia

    P. McBride

The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  6. Computer based plant display and digital control system of Wolsong NPP Tritium Removal Facility

    International Nuclear Information System (INIS)

The Wolsong Tritium Removal Facility (WTRF) is an AECL-designed, first-of-a-kind facility that removes tritium from the heavy water used in systems of the CANDU reactors in operation at the Wolsong Nuclear Power Plant in South Korea. The Plant Display and Control System (PDCS) provides digital plant monitoring and control for the WTRF and offers the advantages of state-of-the-art digital control system technologies for operations and maintenance. The overall features of the PDCS will be described, and some of the specific approaches taken on the project to save construction time and costs, to reduce in-service life-cycle costs and to improve quality will be presented. The PDCS consists of two separate computer sub-systems: the Digital Control System (DCS) and the Plant Display System (PDS). The PDS provides the computer-based Human-Machine Interface (HMI) for operators, and permits efficient supervisory or device-level monitoring and control. A System Maintenance Console (SMC) is included in the PDS for the purpose of software and hardware configuration and on-line maintenance. A Historical Data System (HDS) is also included in the PDS as a data server that continuously captures and logs process data and events for long-term storage and on-demand selective retrieval. The PDCS of the WTRF has been designed and implemented based on an off-the-shelf PDS/DCS product combination, the DeltaV system from Emerson. The design includes fully redundant Ethernet network communications, controllers, power supplies and redundancy on selected I/O modules. The DCS provides field bus communications to interface with third-party controllers supplied on specialized skids, and supports HART communication with field transmitters. The DCS control logic was configured using a modular and graphical approach. The control strategies are primarily device control modules implemented as autonomous control loops using IEC 61131-3 Function Block Diagram (FBD) and Structured

  7. Comparison of measured and computed plasma loading resistance in the tandem mirror experiment-upgrade (TMX-U) central cell

    International Nuclear Information System (INIS)

The plasma loading resistance vs. density plots computed with McVey's code XANTENA1 agree well with experimental measurements in the TMX-U central cell. The agreement is much better for frequencies where ω/ω_ci < 1 than for ω/ω_ci ≥ 1.

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  9. Radiation exposure and central nervous system cancers: A case-control study among workers at two nuclear facilities

    International Nuclear Information System (INIS)

A nested case-control study was conducted among workers employed between 1943 and 1977 at two nuclear facilities to investigate the possible association of primary malignant neoplasms of the central nervous system (CNS) with occupational exposure to ionizing radiation from external and internal sources. Eighty-nine white male and female workers who, according to the information on death certificates, died of primary CNS cancers were identified as cases. Four matched controls were selected for each case. External radiation exposure data were available from film badge readings for individual workers, whereas the radiation dose to the lung from internally deposited radionuclides, mainly uranium, was estimated from area and personnel monitoring data and was used in the analyses in lieu of the dose to the brain. Matched sets were included in the analyses only if information was available for the case and at least one of the corresponding controls. Thus, the analyses of external radiation included 27 cases and 90 matched controls, and 47 cases and 120 matched controls were analyzed for the effects of radiation from internally deposited uranium. No association was observed between deaths from CNS cancers and occupational exposure to ionizing radiation from external or internal sources. However, due to the small number of monitored subjects and low doses, a weak association could not be ruled out. 43 refs., 1 fig., 15 tabs

  10. Safety Assessment Of The Centralized Storage Facility For Disused Sealed Radioactive Sources In United Republic Of Tanzania

    Energy Technology Data Exchange (ETDEWEB)

    Abel, Vitus [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Lee, JaeSeong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-10-15

When SRS are no longer in use, they are declared as disused and transferred to the central radioactive management facility (CRMF) belonging to the Tanzania Atomic Energy Commission (the regulatory body), where they are managed as radioactive waste. In order to reduce the risk associated with disused sealed radioactive sources (DSRS), the first priority is to bring them under appropriate control by the regulatory body. Once DSRS are safely managed, the regulatory body needs to assess the likelihood and potential impact of incidents, accidents and hazards in order to develop a proper management plan. This paper applies the Analytical Hierarchy Process (AHP) to assess and allocate weights and priorities, addressing the multi-criteria nature of the management-planning problem. Using pairwise comparisons, the relative importance of one criterion over another can be expressed. The method allows decision makers to provide judgments about the relative importance of each criterion and to estimate radiological risk using expert judgment or probability of occurrence. AHP proceeds in a step-by-step manner in which the resulting priorities are shown and possible inconsistencies are determined. The information provided by experts helps to rank hazards according to probability of occurrence, potential impact and mitigation cost. The strength of the AHP method lies in its ability to incorporate both qualitative and quantitative data in decision making, and it presents a powerful tool for weighting and prioritizing hazards in terms of occurrence probability. However, AHP also has some weak points: it requires data based on experience, knowledge and judgment, which are subjective for each decision-maker.
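
The pairwise-comparison step of AHP can be sketched compactly. The following uses the common geometric-mean approximation to Saaty's principal-eigenvector weights, plus the standard consistency ratio; the comparison matrix in the usage example is invented for illustration.

```python
import math

def ahp_weights(A):
    """Priority weights from a pairwise-comparison matrix A, where A[i][j]
    is the judged importance of criterion i over criterion j, using the
    geometric-mean approximation to the principal eigenvector."""
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]
    s = sum(gm)
    return [g / s for g in gm]

def consistency_ratio(A):
    """Saaty's consistency ratio CR = CI / RI; CR < 0.1 is conventionally
    taken as an acceptable level of inconsistency in the judgments."""
    n = len(A)
    w = ahp_weights(A)
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n      # estimate of lambda_max
    ci = (lam - n) / (n - 1)                           # consistency index
    RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32} # Saaty's random indices
    return ci / RI[n]
```

For a perfectly consistent 3x3 matrix such as `[[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]]`, the weights come out 4/7, 2/7, 1/7 and the consistency ratio is zero.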

  11. Potential applications of artificial intelligence in computer-based management systems for mixed waste incinerator facility operation

    International Nuclear Information System (INIS)

The Department of Energy/Oak Ridge Field Office (DOE/OR) operates a mixed waste incinerator facility at the Oak Ridge K-25 Site, designed for the thermal treatment of incinerable liquid, sludge, and solid waste regulated under the Toxic Substances Control Act (TSCA) and the Resource Conservation and Recovery Act (RCRA). Operation of the TSCA Incinerator is highly constrained as a result of the regulatory, institutional, technical, and resource availability requirements. This presents an opportunity for applying computer technology as a technical resource for mixed waste incinerator operation to facilitate promoting and sustaining a continuous performance improvement process while demonstrating compliance. This paper describes mixed waste incinerator facility performance-oriented tasks that could be assisted by Artificial Intelligence (AI) and the requirements for AI tools that would implement these algorithms in a computer-based system. 4 figs., 1 tab

  12. Potential applications of artificial intelligence in computer-based management systems for mixed waste incinerator facility operation

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, A.L.; Singh, S.P.N.; Ferrada, J.J.

    1991-01-01

The Department of Energy/Oak Ridge Field Office (DOE/OR) operates a mixed waste incinerator facility at the Oak Ridge K-25 Site, designed for the thermal treatment of incinerable liquid, sludge, and solid waste regulated under the Toxic Substances Control Act (TSCA) and the Resource Conservation and Recovery Act (RCRA). Operation of the TSCA Incinerator is highly constrained as a result of the regulatory, institutional, technical, and resource availability requirements. This presents an opportunity for applying computer technology as a technical resource for mixed waste incinerator operation to facilitate promoting and sustaining a continuous performance improvement process while demonstrating compliance. This paper describes mixed waste incinerator facility performance-oriented tasks that could be assisted by Artificial Intelligence (AI) and the requirements for AI tools that would implement these algorithms in a computer-based system. 4 figs., 1 tab.

  13. Central Weighted Non-Oscillatory (CWENO) and Operator Splitting Schemes in Computational Astrophysics

    Science.gov (United States)

    Ivanovski, Stavro

    2011-05-01

High-resolution shock-capturing (HRSC) schemes are known to be the most adequate and advanced technique for the numerical approximation of solutions to hyperbolic systems of conservation laws. Since most astrophysical phenomena can be described by means of systems of (M)HD conservation equations, finding accurate, computationally inexpensive and robust numerical approaches for their solution is a task of great importance for numerical astrophysics. Based on the Central Weighted Non-Oscillatory (CWENO) reconstruction approach, which relies on the adaptive choice of the smoothest stencil for resolving strong shocks and discontinuities in a central framework on a staggered grid, we present a new algorithm for systems of conservation laws using the key idea of evolving the intermediate stages of the Runge-Kutta time discretization in primitive variables. In this thesis, we introduce a new so-called conservative-primitive variables strategy (CPVS) by integrating the latter into the earlier proposed Central Runge-Kutta schemes (Pareschi et al., 2005). The advantages of the new shock-capturing algorithm with respect to the state-of-the-art HRSC schemes used in astrophysics, like upwind Godunov-type schemes, can be summarized as follows: (i) a Riemann-solver-free central approach; (ii) built-in numerical dissipation (especially needed for multidimensional applications in astrophysics) owing to the diffusivity coming from the design of the scheme; (iii) high accuracy and speed of the method. The latter stems from the fact that advancing in time in the predictor step does not need inversion between the primitive and conservative variables, which is essential in applications where the conservative variables are neither trivial to compute nor to invert into the set of primitive ones, as in relativistic hydrodynamics. The main objective of the research adopted in the thesis is to outline the promising application of the CWENO (with CPVS) in the problems of the
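
The conservative-to-primitive inversion the abstract alludes to is easiest to see in the Newtonian ideal-gas case, where the map is explicit in both directions. A minimal sketch for the 1-D Euler equations (ideal-gas equation of state assumed):

```python
GAMMA = 1.4  # ratio of specific heats (ideal-gas assumption)

def primitive_to_conservative(rho, u, p):
    """(density, velocity, pressure) -> (density, momentum, total energy)."""
    E = p / (GAMMA - 1.0) + 0.5 * rho * u * u
    return rho, rho * u, E

def conservative_to_primitive(rho, mom, E):
    """Invert the map above. Trivial for Newtonian ideal-gas hydrodynamics,
    but an implicit root-find per cell in relativistic hydrodynamics --
    which is why advancing the Runge-Kutta stages directly in primitive
    variables (the CPVS idea) saves work there."""
    u = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * u * u)
    return rho, u, p
```

In relativistic hydrodynamics the analogue of `conservative_to_primitive` has no closed form, so skipping it inside each predictor step is a genuine saving.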

  14. Transmutation of alloys in MFE facilities as calculated by REAC (a computer code system for activation and transmutation)

    International Nuclear Information System (INIS)

A computer code system for fast calculation of activation and transmutation has been developed. The system consists of a driver code, cross-section libraries, flux libraries, a material library, and a decay library. The code is used to predict transmutations in a Ti-modified 316 stainless steel, a commercial ferritic alloy (HT9), and a V-15%Cr-5%Ti alloy in various magnetic fusion energy (MFE) test facilities and conceptual reactors.
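
The single-step special case of the activation problem such a code solves can be written down directly. The sketch below assumes a constant neutron flux and a single product nuclide; a code system like REAC chains terms of this form over full transmutation networks using its cross-section, flux, and decay libraries.

```python
import math

def activation_activity(n_atoms, sigma_cm2, flux, lam, t_seconds):
    """Activity A(t) of a product nuclide under constant irradiation:

        A(t) = N * sigma * phi * (1 - exp(-lambda * t))

    n_atoms:   number of target atoms N
    sigma_cm2: activation cross section (cm^2)
    flux:      neutron flux (n / cm^2 / s)
    lam:       decay constant of the product (1/s)
    Returns the activity in decays per second.
    """
    return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_seconds))
```

At long irradiation times the activity saturates at N·σ·φ, the familiar saturation activity; real transmutation chains simply couple many such production and decay terms.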

  15. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 1; Steady Predictions

    Science.gov (United States)

    Allgood, Daniel C.; Graham, Jason S.; Ahuja, Vineet; Hosangadi, Ashvin

    2010-01-01

levels in CFD-based flowpath modeling of the facility. The analysis tools used here expand on the multi-element unstructured CFD which has been tailored and validated for impingement dynamics of dry plumes, complex valve/feed systems, and high pressure propellant delivery systems used in engine and component test stands at NASA SSC. The analyses performed in the evaluation of the sub-scale diffuser facility explored several important factors that influence modeling and understanding of facility operation, such as (a) the importance of modeling the facility with a real-gas approximation, (b) approximating the cluster of steam ejector nozzles as a single annular nozzle, (c) the existence of mixed subsonic/supersonic flow downstream of the turning duct, and (d) the inadequacy of two-equation turbulence models in predicting the correct pressurization in the turning duct and the expansion of the second stage steam ejectors. The procedure used for modeling the facility was as follows: (i) the engine, test cell and first stage ejectors were simulated with an axisymmetric approximation; (ii) the turning duct, second stage ejectors and the piping downstream of the second stage ejectors were analyzed with a three-dimensional simulation utilizing a half-plane symmetry approximation. The solution, i.e. primitive variables such as pressure, velocity components, temperature and turbulence quantities, was passed from the first computational domain and specified as a supersonic boundary condition for the second simulation. (iii) The third domain comprised the exit diffuser and the region in the vicinity of the facility (primarily included to get the correct shock structure at the exit of the facility and the entrainment characteristics). The first set of simulations, comprising the engine, test cell and first stage ejectors, was carried out both as a turbulent real-gas calculation and as a turbulent perfect-gas calculation. A comparison for the two cases (real-gas turbulent and perfect-gas turbulent) of the Ma

  16. Testing SLURM open source batch system for a Tier1/Tier2 HEP computing facility

    Science.gov (United States)

    Donvito, Giacinto; Salomoni, Davide; Italiano, Alessandro

    2014-06-01

In this work we describe the testing activities carried out to verify whether the SLURM batch system could be used as the production batch system of a typical Tier1/Tier2 HEP computing center. SLURM (Simple Linux Utility for Resource Management) is an open-source batch system developed mainly by the Lawrence Livermore National Laboratory, SchedMD, Linux NetworX, Hewlett-Packard, and Groupe Bull. Testing was focused both on verifying the functionalities of the batch system and on the performance that SLURM is able to offer. We first describe our initial set of requirements. Functionally, we started configuring SLURM so that it replicates all the scheduling policies already used in production in the computing centers involved in the test, i.e. INFN-Bari and the INFN-Tier1 at CNAF, Bologna. Currently, the INFN-Tier1 is using IBM LSF (Load Sharing Facility), while INFN-Bari, an LHC Tier2 for both CMS and Alice, is using Torque as resource manager and MAUI as scheduler. We show how we configured SLURM in order to enable several scheduling functionalities such as hierarchical fair share, quality of service, user-based and group-based priority, limits on the number of jobs per user/group/queue, job age scheduling, job size scheduling, and scheduling of consumable resources. We then show how different job typologies, like serial, MPI, multi-thread, whole-node and interactive jobs, can be managed. Tests on the use of ACLs on queues or, in general, on other resources are then described. A peculiar SLURM feature we also verified is triggers on events, useful for configuring specific actions on each possible event in the batch system. We also tested highly available configurations for the master node. This feature is of paramount importance since a mandatory requirement in our scenarios is to have a working farm cluster even in case of hardware failure of the server(s) hosting the batch system. Among our requirements there is also the possibility to deal with pre-execution and post
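
Several of the scheduling functionalities exercised in these tests map directly onto `slurm.conf` parameters. The excerpt below is an illustrative sketch: the parameter names are real SLURM configuration options, but the values are placeholders, not a site recommendation.

```ini
# slurm.conf excerpt (illustrative values only):
# backfill scheduling plus multifactor priority, covering the fair-share,
# QOS, job-age and job-size policies discussed above.
SchedulerType=sched/backfill
PriorityType=priority/multifactor
PriorityDecayHalfLife=7-0          # fair-share usage memory: 7 days
PriorityWeightFairshare=100000     # hierarchical fair share
PriorityWeightAge=1000             # job age scheduling
PriorityWeightJobSize=1000         # job size scheduling
PriorityWeightQOS=10000            # quality-of-service contribution
AccountingStorageEnforce=limits,qos
```

Per-user and per-group job limits of the kind mentioned above are then attached to accounts and QOS records through the accounting database rather than in `slurm.conf` itself.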

  17. Radionuclide migration pathways analysis for the Oak Ridge Central Waste Disposal Facility on the West Chestnut Ridge site

    International Nuclear Information System (INIS)

A dose-to-man pathways analysis is performed for disposal of low-level radioactive waste at the Central Waste Disposal Facility on the West Chestnut Ridge Site. Both shallow land burial (trench) and aboveground (tumulus) disposal methods are considered. The waste volumes, characteristics, and radionuclide concentrations are those of waste streams anticipated from the Oak Ridge National Laboratory, the Y-12 Plant, and the Oak Ridge Gaseous Diffusion Plant. The site capacity for the waste streams is determined on the basis of the pathways analysis. The exposure pathways examined include (1) migration and transport of leachate from the waste disposal units to the Clinch River (via the groundwater medium for trench disposal and Ish Creek for tumulus disposal) and (2) those potentially associated with inadvertent intrusion following a 100-year period of institutional control: an individual resides on the site, inhales suspended particles of contaminated dust, ingests vegetables grown on the plot, consumes contaminated water from either an on-site well or from a nearby surface stream, and receives direct exposure from the contaminated soil. It is found that either disposal method would provide effective containment and isolation for the anticipated waste inventory. However, the proposed trench disposal method would provide more effective containment than tumuli because of the sorption of some radionuclides in the soil. Persons outside the site boundary would receive radiation doses well below regulatory limits if they were to ingest water from the Clinch River. An inadvertent intruder could receive doses that approach regulatory limits; however, the likelihood of such intrusions and subsequent exposures is remote. 33 references, 31 figures, 28 tables
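
Each exposure pathway in such an analysis contributes a term of the same basic shape: concentration times intake times a dose coefficient. A minimal sketch of the drinking-water term (the dose coefficient argument is a placeholder to be taken from ICRP tabulations for the nuclide in question):

```python
def annual_ingestion_dose_mSv(conc_bq_per_l, intake_l_per_year, dcf_sv_per_bq):
    """Drinking-water pathway, the simplest term in a dose-to-man analysis:

        dose = concentration * annual intake * ingestion dose coefficient

    dcf_sv_per_bq: nuclide-specific ingestion dose coefficient (Sv/Bq),
                   a placeholder here -- take it from ICRP tabulations.
    Returns the committed annual dose in mSv.
    """
    return conc_bq_per_l * intake_l_per_year * dcf_sv_per_bq * 1e3  # Sv -> mSv
```

A full pathways analysis sums many such terms (inhalation, vegetable ingestion, direct exposure) after a transport model has supplied the concentration at the receptor.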

  18. Intensive archaeological survey of the proposed Central Sanitary Wastewater Treatment Facility, Savannah River Site, Aiken and Barnwell Counties, South Carolina

    Energy Technology Data Exchange (ETDEWEB)

    Stephenson, D.K.; Sassaman, K.E.

    1993-11-01

    The project area for the proposed Central Sanitary Wastewater Treatment Facility on the Savannah River Site includes a six-acre tract along Fourmile Branch and 18 mi of trunk line corridors. Archaeological investigations of the six-acre parcel resulted in the discovery of one small prehistoric site designated 38AK465. This cultural resource does not have the potential to add significantly to archaeological knowledge of human occupation in the region. The Savannah River Archaeological Research Program (SRARP) therefore recommends that 38AK465 is not eligible for nomination to the National Register of Historic Places (NRHP) and further recommends a determination of no effect. Archaeological survey along the trunk line corridors implicated previously recorded sites 38AK92, 38AK145, 38AK415, 38AK417, 38AK419, and 38AK436. Past disturbance from construction had severely disturbed 38AK92 and no archaeological evidence of 38AK145, 38AK419, and 38AK436 was recovered during survey. Lacking further evidence for the existence of these sites, the SRARP recommends that 38AK92, 38AK145, 38AK419, and 38AK436 are not eligible for nomination to the NRHP and thus warrant a determination of no effect. Two of these sites, 38Ak415 and 38AK417, required further investigation to evaluate their archaeological significance. Both of the sites have the potential to yield significant data on the prehistoric period occupation of the Aiken Plateau and the SRARP recommends that they are eligible for nomination to the NRHP. The Savannah River Archaeological Research Program recommends that adverse effects to sites 38AK415 and 38AK417 from proposed construction can be mitigated through avoidance.

  19. Geochemical information for the West Chestnut Ridge Central Waste Disposal Facility for low-level radioactive waste

    Energy Technology Data Exchange (ETDEWEB)

    Seeley, F.G.; Kelmers, A.D.

    1984-06-01

    Geochemical support activities for the Central Waste Disposal Facility (CWDF) project included characterization of site materials, as well as measurement of radionuclide sorption and desorption isotherms and apparent concentration limit values under site-relevant laboratory test conditions. The radionuclide sorption and solubility information is needed as input data for the pathways analysis calculations to model expected radioactivity releases from emplaced waste to the accessible environment under various release scenarios. Batch contact methodology was used to construct sorption and desorption isotherms for a number of radionuclides likely to be present in waste to be disposed of at the site. The sorption rates for uranium and europium were rapid (> 99.8% of the total radionuclide present was adsorbed in approx. 30 min). With a constant-pH isotherm technique, uranium, strontium, cesium, and curium exhibited maximum Rs values of 4800 to > 30,000 L/kg throughout the pH range 5 to 7. Sorption ratios were generally lower at higher or lower pH levels. Retardation factors for uranium, strontium, and cesium, explored by column chromatographic tests, were consistent with the high sorption ratios measured in batch tests for these radionuclides. The addition of as little as 0.01 M organic reagent capable of forming strong soluble complexes with metals (e.g., ethylenediaminetetraacetic acid (EDTA) or citric acid) was found to reduce the sorption ratio for uranium by as much as two orders of magnitude. Substitution of an actual low-level waste site trench water for groundwater in these tests was found to give a similar reduction in the sorption ratio.

  20. Geochemical information for the West Chestnut Ridge Central Waste Disposal Facility for low-level radioactive waste

    International Nuclear Information System (INIS)

    Geochemical support activities for the Central Waste Disposal Facility (CWDF) project included characterization of site materials, as well as measurement of radionuclide sorption and desorption isotherms and apparent concentration limit values under site-relevant laboratory test conditions. The radionuclide sorption and solubility information is needed as input data for the pathways analysis calculations to model expected radioactivity releases from emplaced waste to the accessible environment under various release scenarios. Batch contact methodology was used to construct sorption and desorption isotherms for a number of radionuclides likely to be present in waste to be disposed of at the site. The sorption rates for uranium and europium were rapid (> 99.8% of the total radionuclide present was adsorbed in approx. 30 min). With a constant-pH isotherm technique, uranium, strontium, cesium, and curium exhibited maximum Rs values of 4800 to > 30,000 L/kg throughout the pH range 5 to 7. Sorption ratios were generally lower at higher or lower pH levels. Retardation factors for uranium, strontium, and cesium, explored by column chromatographic tests, were consistent with the high sorption ratios measured in batch tests for these radionuclides. The addition of as little as 0.01 M organic reagent capable of forming strong soluble complexes with metals [e.g., ethylenediaminetetraacetic acid (EDTA) or citric acid] was found to reduce the sorption ratio for uranium by as much as two orders of magnitude. Substitution of an actual low-level waste site trench water for groundwater in these tests was found to give a similar reduction in the sorption ratio
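
In the pathways-analysis calculations these data feed, batch sorption ratios enter as a retardation factor on radionuclide transport relative to groundwater. A minimal sketch of the standard relation, treating the measured sorption ratio Rs as a distribution coefficient Kd (a common simplifying assumption):

```python
def retardation_factor(kd_l_per_kg, bulk_density_kg_per_l, porosity):
    """Retardation of a sorbing radionuclide relative to groundwater flow:

        R = 1 + (rho_b / n) * Kd

    kd_l_per_kg:           distribution coefficient Kd (L/kg); here the
                           batch sorption ratio Rs stands in for Kd.
    bulk_density_kg_per_l: dry bulk density rho_b of the soil (kg/L).
    porosity:              effective porosity n (dimensionless).
    """
    return 1.0 + (bulk_density_kg_per_l / porosity) * kd_l_per_kg
```

With the Rs values of 4800 to >30,000 L/kg reported above and typical soil properties, R is of order 10^4 or more, i.e. these nuclides move tens of thousands of times slower than the water, which is why trench disposal benefits from soil sorption.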

  1. Computer Security for Commercial Nuclear Power Plants - Literature Review for Korea Hydro Nuclear Power Central Research Institute

    Energy Technology Data Exchange (ETDEWEB)

    Duran, Felicia Angelica [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Security Systems Analysis Dept.; Waymire, Russell L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Security Systems Analysis Dept.

    2013-10-01

    Sandia National Laboratories (SNL) is providing training and consultation activities on security planning and design for the Korea Hydro and Nuclear Power Central Research Institute (KHNP-CRI). As part of this effort, SNL performed a literature review on computer security requirements, guidance, and best practices that are applicable to an advanced nuclear power plant. This report documents the review of reports generated by SNL and other organizations (U.S. Nuclear Regulatory Commission, Nuclear Energy Institute, and International Atomic Energy Agency) related to the protection of information technology resources, primarily digital controls, computer resources, and their data networks. Copies of the key documents have also been provided to KHNP-CRI.

  2. Prevalence of enterobacteriaceae in Tupinambis merianae (Squamata: Teiidae) from a captive facility in Central Brazil, with a profile of antimicrobial drug resistance in Salmonella enterica

    OpenAIRE

    Andréa de Moraes Carvalho; Ayrton Klier Péres Júnior; Maria Auxiliadora Andrade; Valéria de Sá Jayme

    2013-01-01

    The present study reports the presence of enterobacteriaceae in Tegu Lizards (Tupinambis merianae) from a captive facility in central Brazil. From a total of 30 animals, 10 juveniles and 20 adults (10 males, 10 females), 60 samples were collected in two periods separated by 15 days. The samples were cultivated in Xylose-lysine-deoxycholate agar (XLT4) and MacConkey agar. The Salmonella enterica isolates were tested for antimicrobial susceptibility. A total of 78 bacteria were isolated, of which 27 were ...

  3. Evaluation and use of geosphere flow and migration computer programs for near surface trench type disposal facilities

    International Nuclear Information System (INIS)

    This report describes calculations of groundwater flow and radionuclide migration for near surface trench type radioactive waste disposal facilities. Aspects covered are verification of computer programs, detailed groundwater flow calculations for the Elstow site, radionuclide migration for the Elstow site and the effects of using non-linear sorption models. The Elstow groundwater flows are for both the current situation and for projected developments to the site. The Elstow migration calculations serve to demonstrate a methodology for predicting radionuclide transport from near surface trench type disposal facilities. The majority of the work was carried out at the request of and in close collaboration with ANS, the coordinators for the preliminary assessment of a proposed radioactive waste disposal site at Elstow. Hence a large part of the report contains results which were generated for ANS to use in their assessment. (author)

  4. National Ignition Facility system design requirements NIF integrated computer controls SDR004

    International Nuclear Information System (INIS)

    This System Design Requirement document establishes the performance, design, development, and test requirements for the NIF Integrated Computer Control System (ICCS). The ICCS is covered in NIF WBS element 1.5. This document responds directly to the requirements detailed in the NIF Functional Requirements/Primary Criteria, and is supported by subsystem design requirements documents for each major ICCS subsystem.

  5. National Ignition Facility sub-system design requirements computer system SSDR 1.5.1

    International Nuclear Information System (INIS)

    This Subsystem Design Requirement document establishes the performance, design, development, and test requirements for the Computer System (WBS 1.5.1), which is part of the NIF Integrated Computer Control System (ICCS). This document responds directly to the requirements detailed in the ICCS System Design Requirements (WBS 1.5), the document directly above it in the requirements hierarchy.

  6. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    Energy Technology Data Exchange (ETDEWEB)

    Kostin, Mikhail [FRIB, MSU; Mokhov, Nikolai [FNAL; Niita, Koji [RIST, Japan

    2013-09-25

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90, or C. The module is largely independent of the radiation transport code it is paired with, and is connected to the code by means of a small number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It can be used with other codes, such as PHITS, FLUKA, and MCNP, after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows calculations to be restarted from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. It can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.

  7. A Climatology of Midlatitude Continental Clouds from the ARM SGP Central Facility. Part II; Cloud Fraction and Radiative Forcing

    Science.gov (United States)

    Dong, Xiquan; Xi, Baike; Minnis, Patrick

    2006-01-01

    Data collected at the Department of Energy Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) central facility are analyzed to determine the variability of cloud fraction and radiative forcing at several temporal scales between January 1997 and December 2002. Cloud fractions are estimated for total cloud cover and for single-layer low (0-3 km), middle (3-6 km), and high (greater than 6 km) clouds using ARM SGP ground-based paired lidar-radar measurements. Shortwave (SW), longwave (LW), and net cloud radiative forcings (CRF) are derived from up- and down-looking standard precision spectral pyranometers and precision infrared radiometer measurements. The annual averages of total and single-layer, nonoverlapped low, middle, and high cloud fractions are 0.49, 0.11, 0.03, and 0.17, respectively. Total and low cloud amounts were greatest from December through March and least during July and August. The monthly variation of high cloud amount is relatively small, with a broad maximum from May to August. During winter, total cloud cover varies diurnally with a small amplitude, a mid-morning maximum, and an early evening minimum; during summer it changes by more than 0.14 over the daily cycle, with a pronounced early evening minimum. The diurnal variations of mean single-layer cloud cover change with season and cloud height. Annual averages of all-sky, total, and single-layer high, middle, and low LW CRFs are 21.4, 40.2, 16.7, 27.2, and 55.0 W m⁻², respectively; their SW CRFs are -41.5, -77.2, -37.0, -47.0, and -90.5 W m⁻². Their net CRFs range from -20 to -37 W m⁻². For all-sky, total, and low clouds, the maximum negative net CRFs of -40.1, -70, and -69.5 W m⁻² occur during April, while the respective minimum values of -3.9, -5.7, and -4.6 W m⁻² are found during December. July is the month with the maximum negative net CRF of -46.2 W m⁻² for middle clouds, and May has the maximum value of -45.9 W m⁻² for high clouds. An
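
As a quick consistency check on the figures above, the net cloud radiative forcing is simply the sum of the LW and SW components; recomputing it from the quoted annual averages reproduces the stated range of roughly -20 to -37 W m⁻².

```python
# Annual-average LW and SW cloud radiative forcings (W m^-2) quoted above.
lw = {"all-sky": 21.4, "total": 40.2, "high": 16.7, "middle": 27.2, "low": 55.0}
sw = {"all-sky": -41.5, "total": -77.2, "high": -37.0, "middle": -47.0, "low": -90.5}

# Net CRF = LW CRF + SW CRF for each cloud category.
net = {k: round(lw[k] + sw[k], 1) for k in lw}
# net: all-sky -20.1, total -37.0, high -20.3, middle -19.8, low -35.5
```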

  8. Recent development of spent fuel management in the Central Storage Facility for Spent Fuel (CLAB) in Sweden

    International Nuclear Information System (INIS)

    The Swedish nuclear waste management programme was presented at the Advisory Group Meeting held 15-18 March 1988. As there have been only minor changes in the main strategy since then, this paper focuses on the ongoing work to expand the capacity of the Central Storage Facility for Spent Fuel, CLAB, from 3,000 tonnes of uranium (tU) to 5,000 tU within the existing storage pools. This expansion makes it possible to postpone the investment in new storage pools in a second rock cavern by 6-8 years, which gives a considerable economic advantage. Preliminary studies of several methods to realize the expansion led to a decision to use storage canisters similar to the existing ones. The new BWR canister will be designed to hold 25 fuel assemblies instead of 16, and the new PWR canister 9 assemblies instead of 5. The main problem is to maintain a sufficient margin against criticality with the new, denser fuel pattern. Two methods to achieve this have been investigated more closely: credit for the burnup of the fuel, and neutron-absorbing material in the canisters for reactivity control. Calculational work has been performed to analyze the margins which must be applied in the first case. One of the more important factors proved to be the margin required to cover the influence of axial profiles of burnup and Pu build-up for BWR fuel. Together with other necessary reactivity margins, the total penalty that has to be applied amounted to about 12 reactivity percent units. With the requirement of a final keff less than or equal to 0.95, this implied that far too great a proportion of the fuel arriving at CLAB would fall outside the acceptable region defined by initial enrichment and actual burnup. The method of taking full credit for fuel burnup therefore had to be abandoned in favour of the other method. A preliminary analysis has been performed involving different degrees of absorber material in the canister and the fact that burnable absorbers in the fuel
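
The acceptance criterion described above reduces to a simple margin check. In this sketch the 0.95 limit and the roughly 12-percent-unit total penalty come from the abstract, while the sample keff values are invented for illustration only.

```python
KEFF_LIMIT = 0.95      # acceptance criterion: final keff <= 0.95
TOTAL_PENALTY = 0.12   # ~12 reactivity percent units of combined margins

def acceptable(keff_calculated):
    """Is a canister loading acceptable once all reactivity penalties apply?"""
    return keff_calculated + TOTAL_PENALTY <= KEFF_LIMIT

# With the full 12% penalty, only loadings computing well below keff = 0.83
# pass, which is why burnup credit alone excluded much of the arriving fuel.
```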

  9. Central Issues in the Use of Computer-Based Materials for High Volume Entrepreneurship Education

    Science.gov (United States)

    Cooper, Billy

    2007-01-01

    This article discusses issues relating to the use of computer-based learning (CBL) materials for entrepreneurship education at university level. It considers CBL as a means of addressing the increased volume and range of provision required in the current context. The issues raised in this article have importance for all forms of computer-based…

  10. Improvement of Auditing Data Centralization, in Particular by Using Electronic Computers

    International Nuclear Information System (INIS)

    Nuclear materials in France are distributed among numerous installations of all types: mines, factories, reactors, research centres, storage depots. One of the problems to be solved in the management of these materials is that of centralizing the auditing data produced by all these installations. The first part of the paper gives a brief account of the French system of nuclear materials management and of the problem of auditing data centralization as it arises in this system. The main difficulty in recent years has been the time required for the monthly closing and centralization of accounts for all the installations. The steps taken to cope with these difficulties are described. The second part describes the main method adopted, namely, the use of office machinery for centralization of data, the preparation of balances and the publication of a number of periodical documents. The paper explains why and under what conditions the use of office machinery was considered advantageous, and in particular it describes the coding system employed, the document circuits, and the time involved, mention also being made of the modifications which had to be made to the existing auditing system to meet the requirements imposed by the office machinery. The third part deals with various conclusions, resulting from the introduction of machine accounting, in respect of the cost in manpower and money of this centralization of auditing data and, more generally, the cost of nuclear materials management. (author)

  11. Laser performance operations model (LPOM): The computational system that automates the setup and performance analysis of the National Ignition Facility

    Science.gov (United States)

    Shaw, Michael; House, Ronald

    2015-02-01

    The National Ignition Facility (NIF) is a stadium-sized facility containing a 192-beam, 1.8 MJ, 500-TW, 351-nm laser system together with a 10-m diameter target chamber with room for many target diagnostics. NIF is the world's largest laser experimental system, providing a national center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. A computational system, the Laser Performance Operations Model (LPOM), has been developed that automates the laser setup process and accurately predicts laser energetics. LPOM uses diagnostic feedback from previous NIF shots to maintain accurate energetics models (gains and losses), as well as links to operational databases to provide 'as currently installed' optical layouts for each of the 192 NIF beamlines. LPOM deploys a fully integrated laser physics model, the Virtual Beamline (VBL), in its predictive calculations in order to meet the accuracy requirements of NIF experiments and to provide the ability to determine the damage risk to optical elements throughout the laser chain. LPOM determines the settings of the injection laser system required to achieve the desired laser output, provides equipment protection, and determines the diagnostic setup. Additionally, LPOM provides real-time post-shot data analysis and reporting for each NIF shot. The LPOM computational system is designed as a multi-host computational cluster (with 200 compute nodes, providing the capability to run full NIF simulations fully in parallel) to meet the demands of both the control systems within a shot cycle and the NIF user community outside of a shot cycle.
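
The diagnostic-feedback idea, using measured shot energetics to keep the model's gains accurate, can be sketched as a simple recalibration loop. The function name and smoothing factor below are hypothetical; LPOM's actual model is a full laser-physics calculation, not a scalar update.

```python
def update_gain(model_gain, predicted_energy, measured_energy, alpha=0.3):
    """Nudge a modeled gain toward what the last shot actually delivered.

    Exponential smoothing of the measured/predicted ratio -- a toy stand-in
    for maintaining per-beamline gain and loss models from shot diagnostics.
    """
    ratio = measured_energy / predicted_energy
    return model_gain * (1.0 + alpha * (ratio - 1.0))

# After repeated shots measuring 1050 (arbitrary units) against the model's
# own prediction, the modeled value converges toward the measurement.
g = 1000.0
for _ in range(30):
    g = update_gain(g, predicted_energy=g, measured_energy=1050.0)
```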

  12. Cardiac tamponade in an infant during contrast infusion through central venous catheter for chest computed tomography; Tamponamento cardiaco durante infusao de contraste em acesso venoso central para realizacao de tomografia computadorizada do torax em lactente

    Energy Technology Data Exchange (ETDEWEB)

    Daud, Danilo Felix; Campos, Marcos Menezes Freitas de; Fleury Neto, Augusto de Padua [Hospital Geral de Palmas, TO (Brazil)

    2013-11-15

    Complications from central venous catheterization include infectious conditions, pneumothorax, hemothorax and venous thrombosis. Pericardial effusion with cardiac tamponade rarely occurs, and in infants is generally caused by umbilical catheterization. The authors describe a case of cardiac tamponade that occurred in an infant during chest computed tomography with contrast infusion through a central venous catheter inserted into the right internal jugular vein. (author)

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing, with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of user analysis jobs per day, which was already meeting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  14. COMPUTING

    CERN Document Server

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  15. The significance of indirect costs—application to clinical laboratory test economics using computer facilities

    OpenAIRE

    Hindriks, F. R.; Bosman, A.; Rademaker, P. F.

    1989-01-01

    The significance of indirect costs in the cost price calculation of clinical chemistry laboratory tests by way of the production centres method has been investigated. A cost structure model based on the ‘production centres’ method, the Academisch Ziekenhuis Groningen (AZG) 1-2-3 model, is used for the calculation of cost and cost prices as an add-in tool to the spreadsheet program Lotus 1-2-3. The system specifications of the AZG 1-2-3 cost structure model have been extended with facilities t...

  16. Environmental assessment: Solid waste retrieval complex, enhanced radioactive and mixed waste storage facility, infrastructure upgrades, and central waste support complex, Hanford Site, Richland, Washington

    International Nuclear Information System (INIS)

    The U.S. Department of Energy (DOE) needs to take action to: retrieve transuranic (TRU) waste, because interim storage waste containers have exceeded their 20-year design life and could fail, causing a radioactive release to the environment; provide storage capacity for retrieved and newly generated TRU, Greater-than-Category 3 (GTC3), and mixed waste before treatment and/or shipment to the Waste Isolation Pilot Plant (WIPP); and upgrade the infrastructure network in the 200 West Area to enhance operational efficiencies and reduce the cost of operating the Solid Waste Operations Complex. This proposed action would initiate the retrieval activities (Retrieval) from Trench 4C-T04 in the 200 West Area, including the construction of support facilities necessary to carry out the retrieval operations. In addition, the proposed action includes the construction and operation of a facility (Enhanced Radioactive Mixed Waste Storage Facility) in the 200 West Area to store newly generated and retrieved waste while it awaits shipment to a final disposal site. Infrastructure Upgrades and a Central Waste Support Complex are also necessary to support the Hanford Site's centralized waste management area in the 200 West Area. The proposed action also includes mitigation for the loss of priority shrub-steppe habitat resulting from construction. The estimated total cost of the proposed action is $66 million.

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS installation and its components are now deployed at CERN, in addition to the GlideInWMS factory in the US. There is new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  18. Design of a central pattern generator using reservoir computing for learning human motion

    OpenAIRE

    Wyffels, Francis; Schrauwen, Benjamin

    2009-01-01

    To generate coordinated periodic movements, robot locomotion demands mechanisms that are able to learn and produce stable rhythmic motion in a controllable way. Because systems based on biological central pattern generators (CPGs) can cope with these demands, such systems are gaining in popularity. In this work we introduce a novel methodology that uses the dynamics of a randomly connected recurrent neural network for the design of CPGs. When a randomly connected recurrent neural netwo...
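
The underlying technique, a fixed random recurrent network with only a trained linear readout, can be sketched as a minimal echo state network fitted to a rhythmic target. The hyperparameters below are illustrative assumptions, not the authors' actual CPG design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: only the linear readout is trained.
N = 200
W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1
W_in = rng.uniform(-0.5, 0.5, size=N)

# Drive the reservoir with a phase signal; learn a phase-shifted rhythm.
T = 1000
t = np.arange(T)
u = np.sin(2 * np.pi * t / 50)                   # input oscillation
target = np.sin(2 * np.pi * t / 50 + 1.0)        # desired rhythmic output

x = np.zeros(N)
states = np.empty((T, N))
for k in range(T):
    x = np.tanh(W @ x + W_in * u[k])             # reservoir update
    states[k] = x

# Ridge-regression readout, discarding an initial washout period.
wash = 100
S, y = states[wash:], target[wash:]
w_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ y)
pred = states @ w_out
err = np.sqrt(np.mean((pred[wash:] - y) ** 2))   # RMS readout error
```

Only `w_out` is learned; the recurrent weights stay random and fixed, which is what makes reservoir computing attractive for learning stable rhythmic motion.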

  19. Crustal seismic velocity in the Marche region (Central Italy): computation of a minimum 1-D model with seismic station corrections.

    OpenAIRE

    Scarfì, L.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Catania, Catania, Italia; Imposa, S.; Dipartimento di Scienze Geologiche, University of Catania, Italy; Raffaele, R.; Dipartimento di Scienze Geologiche, Università di Catania, Italy; Scaltrito, A.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Catania, Catania, Italia

    2008-01-01

    A 1-D velocity model for the Marche region (central Italy) was computed by inverting P- and S-wave arrival times of local earthquakes. A total of 160 seismic events with a minimum of ten observations, a travel time residual ≤ 0.8 s and an azimuthal gap lower than 180° have been selected. This “minimum 1-D velocity model” is complemented by station corrections, which can be used to take into account possible near-surface velocity heterogeneities beneath each station. Using this new P-wave ...

  20. Computer simulated building energy consumption for verification of energy conservation measures in network facilities

    Science.gov (United States)

    Plankey, B.

    1981-01-01

    A computer program called ECPVER (Energy Consumption Program - Verification) was developed to simulate all energy loads for any number of buildings. The program computes simulated daily, monthly, and yearly energy consumption, which can be compared with actual meter readings for the same time period. Such comparison can lead to validation of the model under a variety of conditions, allowing it to be used to predict future energy savings due to energy conservation measures. Predicted savings can then be compared with actual savings to verify the effectiveness of those energy conservation changes. This verification procedure is planned to be an important advancement in the Deep Space Network Energy Project, which seeks to reduce energy cost and consumption at all DSN Deep Space Stations.

  1. Cambridge-Cranfield High Performance Computing Facility (HPCF) purchases ten Sun Fire(TM) 15K servers to dramatically increase power of eScience research

    CERN Multimedia

    2002-01-01

    "The Cambridge-Cranfield High Performance Computing Facility (HPCF), a collaborative environment for data and numerical intensive computing privately run by the University of Cambridge and Cranfield University, has purchased 10 Sun Fire(TM) 15K servers from Sun Microsystems, Inc.. The total investment, which includes more than $40 million in Sun technology, will dramatically increase the computing power, reliability, availability and scalability of the HPCF" (1 page).

  2. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  3. Hanford facility dangerous waste Part A, Form 3 and Part B permit application documentation, Central Waste Complex (WA7890008967)(TSD: TS-2-4)

    Energy Technology Data Exchange (ETDEWEB)

    Saueressig, D.G.

    1998-05-20

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, operating, treatment, storage, and/or disposal units, such as the Central Waste Complex (this document, DOE/RL-91-17). Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needed by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology alpha-numeric section identifiers from the permit application guidance documentation (Ecology 1996) follow, in brackets, the chapter headings and subheadings. A checklist indicating where information is contained in the Central Waste Complex permit application documentation, in relation to the Washington State Department of Ecology guidance, is located in the Contents section. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in the General Information Portion). Wherever appropriate, the Central Waste Complex permit application documentation makes cross-reference to the General Information Portion, rather than duplicating text. Information provided in this Central Waste Complex permit application documentation is current as of May 1998.

  4. Hanford facility dangerous waste Part A, Form 3, and Part B permit application documentation for the Central Waste Complex (WA7890008967) (TSD: TS-2-4)

    International Nuclear Information System (INIS)

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, operating, treatment, storage, and/or disposal units, such as the Central Waste Complex (this document, DOE/RL-91-17). Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needed by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology alpha-numeric section identifiers from the permit application guidance documentation (Ecology 1996) follow, in brackets, the chapter headings and subheadings. A checklist indicating where information is contained in the Central Waste Complex permit application documentation, in relation to the Washington State Department of Ecology guidance, is located in the Contents section. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in the General Information Portion). Wherever appropriate, the Central Waste Complex permit application documentation makes cross-reference to the General Information Portion, rather than duplicating text. Information provided in this Central Waste Complex permit application documentation is current as of May 1998

  5. A computer code to estimate accidental fire and radioactive airborne releases in nuclear fuel cycle facilities: User's manual for FIRIN

    International Nuclear Information System (INIS)

    This manual describes the technical bases and use of the computer code FIRIN. This code was developed to estimate the source term release of smoke and radioactive particles from potential fires in nuclear fuel cycle facilities. FIRIN is a product of a broader study, Fuel Cycle Accident Analysis, which Pacific Northwest Laboratory conducted for the US Nuclear Regulatory Commission. The technical bases of FIRIN consist of a nonradioactive fire source term model, compartment effects modeling, and radioactive source term models. These three elements interact with each other in the code affecting the course of the fire. This report also serves as a complete FIRIN user's manual. Included are the FIRIN code description with methods/algorithms of calculation and subroutines, code operating instructions with input requirements, and output descriptions. 40 refs., 5 figs., 31 tabs

  6. Computational design of high efficiency release targets for use at ISOL facilities

    CERN Document Server

    Liu, Y

    1999-01-01

    This report describes efforts made at the Oak Ridge National Laboratory to design high-efficiency-release targets that simultaneously incorporate the short diffusion lengths, high permeabilities, controllable temperatures, and heat-removal properties required for the generation of useful radioactive ion beam (RIB) intensities for nuclear physics and astrophysics research using the isotope separation on-line (ISOL) technique. Short diffusion lengths are achieved either by using thin fibrous target materials or by coating thin layers of selected target material onto low-density carbon fibers such as reticulated-vitreous-carbon fiber (RVCF) or carbon-bonded-carbon fiber (CBCF) to form highly permeable composite target matrices. Computational studies that simulate the generation and removal of primary beam deposited heat from target materials have been conducted to optimize the design of target/heat-sink systems for generating RIBs. The results derived from diffusion release-rate simulation studies for selected t...

  7. Development of a high spectral resolution surface albedo product for the ARM Southern Great Plains central facility

    OpenAIRE

    S. A. McFarlane; K. L. Gaustad; E. J. Mlawer; C. N. Long; Delamere, J

    2011-01-01

    We present a method for identifying dominant surface type and estimating high spectral resolution surface albedo at the Atmospheric Radiation Measurement (ARM) facility at the Southern Great Plains (SGP) site in Oklahoma for use in radiative transfer calculations. Given a set of 6-channel narrowband visible and near-infrared irradiance measurements from upward and downward looking multi-filter radiometers (MFRs), four different surface types (snow-covered, green vegetation, partial vegetation...
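
The surface-type decision from the narrowband channels can be sketched with a crude NDVI-style rule. The two-band simplification and the thresholds below are illustrative assumptions, not the paper's actual six-channel algorithm.

```python
def classify_surface(vis_albedo, nir_albedo):
    """Crude surface-type pick from visible and near-infrared albedos.

    Illustrative thresholds only; the ARM SGP product uses six MFR
    channels and a more careful decision procedure.
    """
    if vis_albedo > 0.6 and nir_albedo > 0.6:
        return "snow-covered"          # bright in both bands
    ndvi = (nir_albedo - vis_albedo) / (nir_albedo + vis_albedo)
    if ndvi > 0.4:
        return "green vegetation"      # strong red-edge contrast
    if ndvi > 0.15:
        return "partial vegetation"
    return "bare soil"
```

Once a dominant type is identified, a stored high-spectral-resolution albedo shape for that type can be scaled to match the measured broadband values.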

  9. Central nervous system involvement in systemic lupus erythematosus. The application of cranial computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nagaoka, S.; Ishigatsubo, Y.; Katou, K.; Sakamoto, H.; Chiba, J. (Yokohama City Univ. (Japan). Faculty of Medicine)

    1982-06-01

    Cranial computed tomography scans were performed on 47 patients with systemic lupus erythematosus (SLE). Abnormal findings in the computed tomograms (CT) were observed in 17 patients (36.2%). Cerebral atrophy was the most common feature (eight cases), followed by abnormal high density areas (five cases), abnormal low density areas (three cases), sulcal enlargement (two cases), intracranial hemorrhage (one case) and others (two cases). The abnormal cranial CT group of SLE was associated with a significantly higher incidence of urinary casts and of thrombocytopenia. In particular, the frequency of urinary casts was greater in the group with cerebral atrophy than in the group with normal CT findings, and there was a higher incidence of alopecia, leukopenia and thrombocytopenia in the group with intracranial calcifications. Neuropsychiatric involvements were noted in 70.6% of patients with CT abnormalities, but neuropsychiatric features (20.7%) and electroencephalographic abnormalities (44.8%) were also observed in patients with normal CT findings. The age at onset of SLE, the mean duration of the disease and the survival rate were not significantly different between the groups with and without CT abnormalities, but the mortality rate was significantly greater in the group with CT abnormalities, especially among those with brain atrophy. Concerning the relationship between the findings of cranial CT and corticosteroid treatment, there was no significant difference in either the total dose or the mean duration of prednisolone therapy. Although SLE patients with cerebral atrophy were taking a larger maintenance dose of corticosteroids, the differences were not statistically significant.

  10. Simulation concept of NICA-MPD-SPD Tier0-Tier1 computing facilities

    Science.gov (United States)

    Korenkov, V. V.; Nechaevskiy, A. V.; Ososkov, G. A.; Pryahina, D. I.; Trofomov, V. V.; Uzhinskiy, A. V.

    2016-09-01

    The simulation concept for grid-cloud services of contemporary HENP experiments at the Big Data scale was formulated through practical use of the simulation system developed at LIT JINR, Dubna. This system is intended to improve the efficiency of the design and development of a wide class of grid-cloud structures by using the work-quality indicators of a real system to design and predict its evolution. For these purposes, the simulation program is combined with a real monitoring system of the grid-cloud service through a special database (DB). The DB performs acquisition and analysis of monitoring data to carry out dynamical corrections of the simulation. Such an approach allows us to construct a general model pattern that does not depend on a specific simulated object, while the parameters describing this object can be used as input to run the pattern. The simulation of some processes of the NICA-MPD-SPD Tier0-Tier1 distributed computing is considered as an example application of our approach.
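The monitoring-driven simulation described above can be illustrated with a minimal discrete-event sketch of a computing farm; the job counts and service times below are invented placeholders, not NICA monitoring data, and this is not the LIT JINR simulation system itself:

```python
import heapq
import random

def simulate_jobs(n_jobs, n_workers, mean_service, seed=1):
    """Toy discrete-event model of a computing farm: n_jobs queued at t=0
    are served FIFO by n_workers; returns the mean job turnaround time."""
    random.seed(seed)
    free_at = [0.0] * n_workers      # heap of times at which workers free up
    heapq.heapify(free_at)
    turnaround = []
    for _ in range(n_jobs):
        start = heapq.heappop(free_at)               # earliest free worker
        finish = start + random.expovariate(1.0 / mean_service)
        heapq.heappush(free_at, finish)
        turnaround.append(finish)    # all jobs arrive at t=0
    return sum(turnaround) / len(turnaround)

# Doubling the workers should roughly halve the mean turnaround.
print(round(simulate_jobs(1000, 10, 5.0), 1),
      round(simulate_jobs(1000, 20, 5.0), 1))
```

In a monitoring-driven setup, `mean_service` and the arrival pattern would be re-fitted periodically from the monitoring database rather than fixed as here.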

  11. An Examination of Computer Attitudes, Anxieties, and Aversions Among Diverse College Populations: Issues Central to Understanding Information Sciences in the New Millennium

    OpenAIRE

    William H. Burkett; David M. Compton; Gail G. Burkett

    2001-01-01

    Studying the impact of computer attitudes on the production of knowledge is central to the understanding of information sciences in the new millennium. The major results from a survey of diverse college populations suggest that Liberal Arts College (LAC) students, in this demographic, have somewhat more ambivalence toward computers than students in a Community College (CC) or a nontraditional Business College (BC) environment. The respondents generally agreed that computers were an important ...

  12. Use of Monte Carlo simulation for computational analysis of critical systems on IPPE's facility addressing needs of nuclear safety

    Energy Technology Data Exchange (ETDEWEB)

    Pavlova, Olga; Tsibulya, Anatoly [FSUE ' SSC RF-IPPE' , 249033, Bondarenko Square 1, Obninsk (Russian Federation)

    2008-07-01

    The BFS-1 critical facility was built at the Institute of Physics and Power Engineering (Obninsk, Russia) for full-scale modeling of fast-reactor cores, blankets, in-vessel shielding, and storage. Although BFS-1 is a fast-reactor assembly, it is very flexible and can easily be reconfigured to represent numerous other types of reactor designs. This paper describes specific problems in calculating evaluated neutron-physics characteristics of integral experiments performed on the BFS facility. Analyses of the available integral experiments performed on different critical configurations of the BFS facility were carried out. Calculations of criticality, central reaction rate ratios, and fission rate distributions were performed with the MCNP5 Monte Carlo code using different files of evaluated nuclear data. MCNP calculations with a multigroup library with 299 energy groups were also made for comparison with pointwise library calculations. (authors)

  13. Animal facilities

    International Nuclear Information System (INIS)

    The animal facilities in the Division are described. They consist of kennels, animal rooms, service areas, and technical areas (examining rooms, operating rooms, pathology labs, x-ray rooms, and 60Co exposure facilities). The computer support facility is also described. The advent of the Conversational Monitor System at Argonne has launched a new effort to set up conversational computing and graphics software for users. The existing LS-11 data acquisition systems have been further enhanced and expanded. The divisional radiation facilities include a number of gamma, neutron, and x-ray radiation sources with accompanying areas for related equipment. There are five 60Co irradiation facilities; a research reactor, Janus, is a source for fission-spectrum neutrons; two other neutron sources in the Chicago area are also available to the staff for cell biology studies. The electron microscope facilities are also described

  14. Medication supply to residential aged care facilities in Western Australia using a centralized medication chart to replace prescriptions

    OpenAIRE

    Hoti Kreshnik; Hughes Jeffery; Sunderland Bruce

    2012-01-01

    Abstract Background The current model of medication supply to residential aged care facilities (RACFs) in Australia is dependent on paper-based prescriptions. This study aimed to assess the use of a centralized medication chart as a prescription-less model for supplying medications to RACFs. Methods Two separate focus groups were conducted with general practitioners (GPs) and pharmacists, and another three with registered nurses (RNs) and carers combined. All focus group participants were working with RACFs. Audio-recorde...

  15. ECED 2013: Eastern and Central Europe Decommissioning. International Conference on Decommissioning of Nuclear Facilities. Conference Guide and Book of Abstracts

    International Nuclear Information System (INIS)

    The Conference included the following sessions: (I) Opening session (2 contributions); (II) Managerial and Funding Aspects of Decommissioning (5 contributions); (III) Technical Aspects of Decommissioning I (6 contributions); (IV) Experience with Present Decommissioning Projects (4 contributions); (V) Poster Session (14 contributions); (VI) Eastern and Central Europe Decommissioning - Panel Discussion; (VII) Release of Materials, Waste Management and Spent Fuel Management (6 contributions); (VIII) Technical Aspects of Decommissioning II (5 contributions).

  16. Assessing the potential of rural and urban private facilities in implementing child health interventions in Mukono district, central Uganda

    DEFF Research Database (Denmark)

    Rutebemberwa, Elizeus; Buregyeya, Esther; Lal, Sham;

    2016-01-01

    in diagnostic capabilities, operations and human resource in the management of malaria, pneumonia and diarrhoea. METHODS: A survey was conducted in pharmacies, private clinics and drug shops in Mukono district in October 2014. An assessment was done on availability of diagnostic equipment for malaria, record...... attended to at least one sick child in the week prior to the interview. CONCLUSION: There were big gaps between rural and urban private facilities with rural ones having less trained personnel and less zinc tablets' availability. In both rural and urban areas, record keeping was low. Child health...

  17. Study on cranial computed tomography in infants and children with central nervous system disorders, 2

    International Nuclear Information System (INIS)

    110 patients with cerebral palsy were studied by cranial computed tomography (CT) and electroencephalography (EEG), and the following results were obtained: 1) Abnormal brain findings on CT were present in 69% of the spastic quadriplegia type, 75% of the spastic hemiplegia type, 23% of the athetotic type and 50% of the mixed type. 2) Most patients with spastic quadriplegia showed diffuse cerebral atrophy, and patients with spastic hemiplegia mostly showed hemispheric cerebral atrophy on the side contralateral to the motor paralysis on CT. Most patients with athetosis had normal CT findings, but a few showed slight diffuse cerebral atrophy. 3) The more severe the patients' mental retardation, the more frequent and severe were the CT abnormalities. 4) Patients with epileptic seizures showed CT abnormalities more often than patients without seizures. 5) There was a good correlation between the abnormality of background activities on EEG and that on CT, and their laterality coincided in most cases. 6) The sides of seizure discharges on EEG were the same as those of the CT abnormalities in 1/3 to 1/2 of the patients, but the localization of seizure discharges corresponded to that of the CT abnormalities in only 11% of the cases. (author)

  18. Annual report to the Laser Facility Committee, 1982

    International Nuclear Information System (INIS)

    The report covers the work done at, or in association with, the Central Laser Facility during the year April 1981 to March 1982 under the headings; glass laser facility development, gas laser development, laser plasma interactions, transport and particle emission studies, ablative acceleration and compression studies, spectroscopy and XUV lasers, and, theory and computation. Publications based on the work of the facility which have either appeared or been accepted for publication during the year are listed. (U.K.)

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  20. Medication supply to residential aged care facilities in Western Australia using a centralized medication chart to replace prescriptions

    Directory of Open Access Journals (Sweden)

    Hoti Kreshnik

    2012-07-01

    Abstract Background The current model of medication supply to residential aged care facilities (RACFs) in Australia is dependent on paper-based prescriptions. This study aimed to assess the use of a centralized medication chart as a prescription-less model for supplying medications to RACFs. Methods Two separate focus groups were conducted with general practitioners (GPs) and pharmacists, and another three with registered nurses (RNs) and carers combined. All focus group participants were working with RACFs. Audio-recorded data were compared with field notes, transcribed, and imported into NVivo®, where they were thematically analyzed. Results A prescription-less medication chart model was supported and appeared likely to improve medication supply to RACF residents. Centralization of medication supply, clarification of medication orders, and responding in real time to therapy changes made by GPs were reasons for supporting the medication chart model. Pharmacists preferred an electronic version of this model. All health professionals cautioned that GPs would need to review the medication chart regularly and proposed an interval of four to six months for this review. Therapy changes during weekends appeared to be a potential difficulty for RNs and carers, whereas pharmacists cautioned about legible writing and claiming of medications dispensed according to a paper-based model. GPs cautioned on the need to monitor the amount of medication dispensed by the pharmacy. Conclusion The current use of paper prescriptions in nursing homes was identified as burdensome. A prescription-less medication chart model was suggested as potentially improving medication supply to RACF residents. An electronic version of this model could address the main potential difficulties raised.

  1. Overview on CSNI Separate Effects Test Facility Matrices for Validation of Best Estimate Thermal-Hydraulic Computer Codes

    International Nuclear Information System (INIS)

    An internationally agreed Separate Effects Test (SET) validation matrix for thermal-hydraulic system codes has been established by a sub-group of the Task Group on Thermal Hydraulic System Behaviour, as requested by the OECD/NEA Committee on the Safety of Nuclear Installations (CSNI) Principal Working Group No. 2 on Coolant System Behaviour. The construction of such a matrix is an attempt to collect together, in a systematic way, the best sets of openly available test data for code validation, assessment and improvement, and also for quantitative code assessment with respect to the quantification of uncertainties in the modelling of individual phenomena by the codes. The methodology developed while establishing the CSNI SET validation matrix was an important outcome of the work. In the paper, this methodology is discussed in detail, together with examples from the SET matrix. In addition, the choices made from the 187 identified facilities covering the 67 phenomena are examined, with some discussion of the database. Facilities and phenomena have been cross-referenced in a separate effects test cross-reference matrix, while the selected separate effects tests themselves are listed against the thermal-hydraulic phenomena for which they can provide validation data. As a preliminary to the classification of facilities and test data, it was necessary to identify a sufficiently complete list of relevant phenomena for LOCA and non-LOCA transient applications of PWRs and BWRs. The majority of these phenomena are also relevant to advanced water-cooled reactors. To this end, 67 phenomena were identified for inclusion in the SET matrix and, in all, about 2094 tests are included in it. The SET matrix, as it stands, is representative of the major part of the experimental work which has been carried out in the LWR-safety thermal-hydraulics field, covering a large number of
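The facility/phenomena cross-referencing described above can be sketched as a small lookup structure; the facility and phenomenon names below are invented placeholders, not entries from the actual SET matrix:

```python
# Hypothetical facility -> set of covered phenomena (placeholder names).
cross_ref = {
    "Facility-A": {"critical flow", "reflooding"},
    "Facility-B": {"reflooding", "condensation"},
    "Facility-C": {"condensation"},
}

def facilities_for(phenomenon):
    """Return the facilities that provide validation data for a phenomenon."""
    return sorted(f for f, ph in cross_ref.items() if phenomenon in ph)

def coverage(phenomena):
    """Fraction of the listed phenomena covered by at least one facility."""
    covered = set().union(*cross_ref.values())
    return sum(p in covered for p in phenomena) / len(phenomena)

print(facilities_for("reflooding"))   # ['Facility-A', 'Facility-B']
print(coverage(["critical flow", "condensation", "entrainment"]))
```

Querying by phenomenon, as above, mirrors how a code validator would use the matrix: pick a phenomenon, then select the facilities and tests that exercise it.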

  2. Distribution of lithostratigraphic units within the central block of Yucca Mountain, Nevada: A three-dimensional computer-based model, Version YMP.R2.0

    International Nuclear Information System (INIS)

    Yucca Mountain, Nevada is underlain by 14.0 to 11.6 Ma volcanic rocks tilted eastward 3° to 20° and cut by faults that were primarily active between 12.7 and 11.6 Ma. A three-dimensional computer-based model of the central block of the mountain consists of seven structural subblocks composed of six formations and the interstratified-bedded tuffaceous deposits. Rocks from the 12.7 Ma Tiva Canyon Tuff, which forms most of the exposed rocks on the mountain, to the 13.1 Ma Prow Pass Tuff are modeled with 13 surfaces. Modeled units represent single formations such as the Pah Canyon Tuff, grouped units such as the combination of the Yucca Mountain Tuff with the superjacent bedded tuff, and divisions of the Topopah Spring Tuff such as the crystal-poor vitrophyre interval. The model is based on data from 75 boreholes, from which a structure contour map at the base of the Tiva Canyon Tuff and isochore maps for each unit are constructed to serve as primary input. Modeling consists of an iterative cycle that begins with the primary structure-contour map, from which isochore values of the subjacent model unit are subtracted to produce the structure contour map on the base of that unit. This new structure contour map forms the input for another cycle of isochore subtraction to produce the next structure contour map. In this method of solids modeling, the model units are represented by surfaces (structure contour maps), and all surfaces are stored in the model. Surfaces can be converted to form volumes of model units with additional effort. This lithostratigraphic and structural model can be used for (1) storing data from, and planning future, site characterization activities, (2) providing preliminary geometry of units for design of the Exploratory Studies Facility and potential repository, and (3) performance assessment evaluations
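The iterative isochore-subtraction cycle described above can be sketched numerically: each unit's thickness grid is subtracted from the current base surface to obtain the base of the next unit down. The 2×2 grids and thicknesses below are toy values, not Yucca Mountain data:

```python
def build_surfaces(top_surface, isochores):
    """Given the elevation grid of the topmost base surface and an ordered
    list of (unit_name, thickness_grid) pairs, return the base-elevation
    grid of every model unit by repeated isochore subtraction."""
    surfaces = {"top": [row[:] for row in top_surface]}
    current = surfaces["top"]
    for name, thickness in isochores:
        current = [[c - t for c, t in zip(crow, trow)]
                   for crow, trow in zip(current, thickness)]
        surfaces[name] = current
    return surfaces

# Toy example: base of the uppermost unit at ~1400 m, two units below it.
top = [[1400.0, 1380.0], [1390.0, 1370.0]]
units = [("unit_1_base", [[30.0, 25.0], [28.0, 26.0]]),
         ("unit_2_base", [[50.0, 55.0], [52.0, 54.0]])]
surfaces = build_surfaces(top, units)
print(surfaces["unit_1_base"][0][0])   # 1400 - 30 = 1370.0
print(surfaces["unit_2_base"][0][0])   # 1370 - 50 = 1320.0
```

Because every surface is retained in the output dictionary, the result matches the paper's convention of storing all structure contour maps in the model rather than only the final one.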

  3. Computed Tomography Scanning Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION:Advances research in the areas of marine geosciences, geotechnical, civil, and chemical engineering, physics, and ocean acoustics by using high-resolution,...

  4. Quality of Sulfadoxine-Pyrimethamine Given as Antimalarial Prophylaxis in Pregnant Women in Selected Health Facilities in Central Region of Ghana

    Directory of Open Access Journals (Sweden)

    Danny F. Yeboah

    2016-01-01

    The use of sulfadoxine-pyrimethamine (SP) as an intermittent preventive treatment (IPT) against malaria during pregnancy has become policy in most sub-Saharan African countries and depends crucially on the efficacy of SP. This study set out to evaluate the effectiveness of the SP given to pregnant women in selected health facilities in the Central Region of Ghana to prevent maternal malaria. A total of 543 pregnant women recruited from 7 selected health centres in the Central Region of Ghana participated in the study. Parasite density of Plasmodium falciparum was determined from the peripheral blood of the pregnant women using microscopy. High-performance liquid chromatography (HPLC) and a dissolution tester were used to determine the quality of the SP. Malaria infection was recorded in 11.2% of pregnant women who had a history of SP consumption. The SP failed the dissolution test. Pregnant women who did not receive IPT-SP made up 44%. Low haemoglobin levels were recorded in 73.5% of the pregnant women. The results indicated that the SP was substandard and that IPT-SP is ineffective in preventing malaria infection.

  5. Focal axonal swellings and associated ultrastructural changes attenuate conduction velocity in central nervous system axons: a computer modeling study.

    Science.gov (United States)

    Kolaric, Katarina V; Thomson, Gemma; Edgar, Julia M; Brown, Angus M

    2013-08-01

    The constancy of action potential conduction in the central nervous system (CNS) relies on uniform axon diameter coupled with fidelity of the overlying myelin providing high-resistance, low capacitance insulation. Whereas the effects of demyelination on conduction have been extensively studied/modeled, equivalent studies on the repercussions for conduction of axon swelling, a common early pathological feature of (potentially reversible) axonal injury, are lacking. The recent description of experimentally acquired morphological and electrical properties of small CNS axons and oligodendrocytes prompted us to incorporate these data into a computer model, with the aim of simulating the effects of focal axon swelling on action potential conduction. A single swelling on an otherwise intact axon, as occurs in optic nerve axons of Cnp1 null mice caused a small decrease in conduction velocity. The presence of single swellings on multiple contiguous internodal regions (INR), as likely occurs in advanced disease, caused qualitatively similar results, except the dimensions of the swellings required to produce equivalent attenuation of conduction were significantly decreased. Our simulations of the consequences of metabolic insult to axons, namely, the appearance of multiple swollen regions, accompanied by perturbation of overlying myelin and increased axolemmal permeability, contained within a single INR, revealed that conduction block occurred when the dimensions of the simulated swellings were within the limits of those measured experimentally, suggesting that multiple swellings on a single axon could contribute to axonal dysfunction, and that increased axolemmal permeability is the decisive factor that promotes conduction block. PMID:24303138

  6. Computer-Enriched Instruction (CEI) Is Better for Preview Material Instead of Review Material: An Example of a Biostatistics Chapter, the Central Limit Theorem

    Science.gov (United States)

    See, Lai-Chu; Huang, Yu-Hsun; Chang, Yi-Hu; Chiu, Yeo-Ju; Chen, Yi-Fen; Napper, Vicki S.

    2010-01-01

    This study examines the timing of computer-enriched instruction (CEI) relative to a traditional lecture, given before or after it, to determine the cross-over effect, period effect, and learning effect arising from the sequencing of instruction. A 2 x 2 cross-over design was used with CEI to teach the central limit theorem (CLT). Two sequences of graduate students in nursing…
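The central limit theorem taught in the chapter is easy to demonstrate by simulation (a generic stdlib illustration, not the CEI materials used in the study): means of n draws from a skewed distribution become approximately normal, with spread shrinking as 1/√n.

```python
import random
import statistics

def sample_means(n, trials, seed=0):
    """Distribution of the mean of n exponential(1) draws."""
    rng = random.Random(seed)
    return [statistics.fmean(rng.expovariate(1.0) for _ in range(n))
            for _ in range(trials)]

# Exponential(1) is strongly skewed, with mean 1 and standard deviation 1.
# By the CLT the mean of n draws is approximately normal with mean 1 and
# standard deviation 1/sqrt(n): watch the spread shrink as n grows.
for n in (1, 16, 64):
    means = sample_means(n, trials=2000)
    print(n, round(statistics.fmean(means), 2), round(statistics.stdev(means), 3))
```

Plotting a histogram of `sample_means(64, 2000)` makes the convergence to a bell shape visible even though the underlying distribution is far from normal.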

  7. GammaCHI: a package for the inversion and computation of the gamma and chi-square cumulative distribution functions (central and noncentral)

    NARCIS (Netherlands)

    Gil, A.; Segura, J.; Temme, N.M.

    2015-01-01

    A Fortran 90 module GammaCHI for computing and inverting the gamma and chi-square cumulative distribution functions (central and noncentral) is presented. The main novelty of this package is the reliable and accurate inversion routines for the noncentral cumulative distribution functions. Additional
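GammaCHI itself is a Fortran 90 module; a self-contained Python sketch of the same functionality treats the noncentral chi-square CDF as a Poisson mixture of central chi-square CDFs (i.e. regularized incomplete gamma functions) and inverts it by simple bisection. GammaCHI's own inversion routines are analytic and far more accurate and efficient; this is only an illustration of the underlying definitions.

```python
import math

def _gamma_series(a, x):
    # Power series for P(a, x), reliable for x < a + 1.
    ap, term = a, 1.0 / a
    total = term
    for _ in range(500):
        ap += 1.0
        term *= x / ap
        total += term
        if abs(term) < abs(total) * 1e-15:
            break
    return total * math.exp(-x + a * math.log(x) - math.lgamma(a))

def _gamma_cf(a, x):
    # Lentz continued fraction for the regularized upper gamma Q(a, x).
    tiny = 1e-300
    b = x + 1.0 - a
    c, d = 1.0 / tiny, 1.0 / b
    h = d
    for i in range(1, 500):
        an = -i * (i - a)
        b += 2.0
        d = an * d + b
        if abs(d) < tiny:
            d = tiny
        c = b + an / c
        if abs(c) < tiny:
            c = tiny
        d = 1.0 / d
        delta = d * c
        h *= delta
        if abs(delta - 1.0) < 1e-15:
            break
    return h * math.exp(-x + a * math.log(x) - math.lgamma(a))

def reg_lower_gamma(a, x):
    """Regularized lower incomplete gamma P(a, x) = gamma(a, x) / Gamma(a)."""
    if x <= 0.0:
        return 0.0
    if x < a + 1.0:
        return _gamma_series(a, x)
    return 1.0 - _gamma_cf(a, x)

def ncx2_cdf(x, k, lam, terms=200):
    """Noncentral chi-square CDF as a Poisson(lam/2) mixture of central CDFs."""
    w = math.exp(-lam / 2.0)   # Poisson weight for j = 0
    cdf = 0.0
    for j in range(terms):
        cdf += w * reg_lower_gamma(k / 2.0 + j, x / 2.0)
        w *= (lam / 2.0) / (j + 1)
        if w == 0.0:
            break
    return cdf

def ncx2_ppf(p, k, lam, iters=100):
    """Invert the CDF by bracketing and bisection (a simple stand-in for
    GammaCHI's dedicated inversion routines)."""
    lo, hi = 0.0, max(k + lam, 1.0)
    while ncx2_cdf(hi, k, lam) < p:
        hi *= 2.0
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if ncx2_cdf(mid, k, lam) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Round trip: compute a CDF value, then invert it.
p = ncx2_cdf(4.0, k=2, lam=1.0)
print(round(ncx2_ppf(p, 2, 1.0), 6))   # recovers 4.0
```

Setting `lam=0` reduces the mixture to the central chi-square case, and the gamma CDF is `reg_lower_gamma(a, x)` directly, so the sketch covers all four distribution functions the package handles.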

  8. Prevalence of enterobacteriaceae in Tupinambis merianae (Squamata: Teiidae from a captive facility in Central Brazil, with a profile of antimicrobial drug resistance in Salmonella enterica

    Directory of Open Access Journals (Sweden)

    Andréa de Moraes Carvalho

    2013-06-01

    The present study reports the presence of Enterobacteriaceae in tegu lizards (Tupinambis merianae) from a captive facility in central Brazil. From a total of 30 animals, 10 juveniles and 20 adults (10 males, 10 females), 60 samples were collected in two periods separated by 15 days. The samples were cultivated on xylose-lysine-deoxycholate agar (XLT4) and MacConkey agar. The Salmonella enterica isolates were tested for antimicrobial susceptibility. A total of 78 bacteria were isolated, of which 27 were from juveniles of T. merianae, 30 from adult males and 21 from adult females. Salmonella enterica was the most frequent bacterium, followed by Citrobacter freundii, Escherichia coli, Enterobacter sakazakii, Kluyvera sp., Citrobacter amalonaticus, Serratia marcescens, Citrobacter diversus, Yersinia frederiksenii, Serratia odorifera, and Serratia liquefaciens. Salmonella enterica subsp. diarizonae and houtenae showed resistance to cotrimoxazole, and serovar Salmonella enterica Worthington showed resistance to tetracycline and gentamicin. Salmonella enterica Panama and S. enterica subsp. diarizonae showed intermediate sensitivity to cotrimoxazole. In addition to Enterobacteriaceae, pathogenic serotypes of S. enterica also occur in the tegu lizard, and their antimicrobial resistance was confirmed.

  9. International standard problem (ISP) No. 41. Containment iodine computer code exercise based on a radioiodine test facility (RTF) experiment

    International Nuclear Information System (INIS)

    International Standard Problem (ISP) exercises are comparative exercises in which predictions of different computer codes for a given physical problem are compared with each other or with the results of a carefully controlled experimental study. The main goal of ISP exercises is to increase confidence in the validity and accuracy of the tools used in assessing the safety of nuclear installations. Moreover, they enable code users to gain experience and demonstrate their competence. The ISP No. 41 exercise, a computer code exercise based on a Radioiodine Test Facility (RTF) experiment on iodine behaviour in containment under severe accident conditions, is one such exercise. The ISP No. 41 exercise arose from a recommendation at the Fourth Iodine Chemistry Workshop held at PSI, Switzerland in June 1996: 'the performance of an International Standard Problem as the basis of an in-depth comparison of the models as well as contributing to the database for validation of iodine codes' [Proceedings NEA/CSNI/R(96)6, Summary and Conclusions NEA/CSNI/R(96)7]. COG (CANDU Owners Group), comprising AECL and the Canadian nuclear utilities, offered to make the results of a Radioiodine Test Facility (RTF) test available for such an exercise. The ISP No. 41 exercise was endorsed in turn by the FPC (PWG4's Task Group on Fission Product Phenomena in the Primary Circuit and the Containment), PWG4 (the CSNI Principal Working Group on the Confinement of Accidental Radioactive Releases), and the CSNI. The OECD/NEA Committee on the Safety of Nuclear Installations (CSNI) has sponsored forty-five ISP exercises over the last twenty-four years, thirteen of them in the area of severe accidents. The criteria for selecting the RTF test as the basis for the ISP-41 exercise were: (1) complementarity with other RTF tests available through the PHEBUS and ACE programmes, (2) simplicity for ease of modelling, and (3) good quality data. A simple RTF experiment performed under controlled

  10. Review of recent progress in laser plasma interaction and laser compression work at the Science Research Council Rutherford Laboratory Central Laser Facility

    International Nuclear Information System (INIS)

    The SRC two beam Nd glass laser facility with output of up to 40J per beam in 100 ps pulses and 80J per beam in 1.6 ns pulses has been used to study interactions in the critical density region, energy transport in solid targets and compression of spherical targets. Magnetic field generation and ponderomotive force modifications to density gradients have been observed by optical probing using Faraday rotation and interferometry. Electron energy transport has been studied with layered targets. The ablation front plasma parameters and the ablation depth have been recorded for spherical targets and related to numerical modelling of the inhibited thermal transport. The temporal and spatial development of transmission in thin plane polymer film has been used as an indicator of lateral thermal transport. X-ray spectroscopy of compression cores in exploding pusher targets has been refined by new line profile calculations and improved computer fitting methods, and some further results are presented. Dense implosions created by exploding glass shell compression of a high density gas fill have been studied by pulsed X-ray shadowgraphy. 40 references. (UK)

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months, activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  12. Automated Computer-Based Facility for Measurement of Near-Field Structure of Microwave Radiators and Scatterers

    DEFF Research Database (Denmark)

    Mishra, Shantnu R.;; Pavlasek, Tomas J. F.;; Muresan, Letitia V.

    1980-01-01

    An automatic facility for measuring the three-dimensional structure of the near fields of microwave radiators and scatterers is described. The amplitude and phase for different polarization components can be recorded in analog and digital form using a microprocessor-based system. The stored data...

  13. COMPUTING

    CERN Document Server

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team has successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is on increasing the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and on the full implementation of the xrootd federation ...

  14. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer: the ubiquitous portal of work and personal lives. At this point, the computer is so common we hardly notice it in our view. It's difficult to envision that not that long ago it was a gigantic, room-sized structure accessible only to a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  15. A rare case of dilated invaginated odontome with talon cusp in a permanent maxillary central incisor diagnosed by cone beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Jaya, Ranganathan; Kumar, Rangarajan Sundaresan Mohan; Srinivasan, Ramasamy [Dept. of Conservative Dentistry and Endodontics, Priyadarshini Dental College and Hospital, Chennai (India)

    2013-09-15

    It has been a challenge to establish the accurate diagnosis of developmental tooth anomalies based on periapical radiographs. Recently, three-dimensional imaging by cone beam computed tomography has provided useful information to investigate the complex anatomy of and establish the proper management for tooth anomalies. The most severe variant of dens invaginatus, known as dilated odontome, is a rare occurrence, and the cone beam computed tomographic findings of this anomaly have never been reported for an erupted permanent maxillary central incisor. The occurrence of talon cusp along with dens invaginatus is also unusual. The aim of this report was to show the importance of cone beam computed tomography in contributing to the accurate diagnosis and evaluation of the complex anatomy of this rare anomaly.

  16. A rare case of dilated invaginated odontome with talon cusp in a permanent maxillary central incisor diagnosed by cone beam computed tomography

    International Nuclear Information System (INIS)

    It has been a challenge to establish the accurate diagnosis of developmental tooth anomalies based on periapical radiographs. Recently, three-dimensional imaging by cone beam computed tomography has provided useful information to investigate the complex anatomy of and establish the proper management for tooth anomalies. The most severe variant of dens invaginatus, known as dilated odontome, is a rare occurrence, and the cone beam computed tomographic findings of this anomaly have never been reported for an erupted permanent maxillary central incisor. The occurrence of talon cusp along with dens invaginatus is also unusual. The aim of this report was to show the importance of cone beam computed tomography in contributing to the accurate diagnosis and evaluation of the complex anatomy of this rare anomaly.

  17. Assessing the potential of rural and urban private facilities in implementing child health interventions in Mukono district, central Uganda-a cross sectional study

    DEFF Research Database (Denmark)

    Rutebemberwa, Elizeus; Buregyeya, Esther; Lal, Sham;

    2016-01-01

    BACKGROUND: Private facilities are the first place of care seeking for many sick children. Involving these facilities in child health interventions may provide opportunities to improve child welfare. The objective of this study was to assess the potential of rural and urban private facilities...... attended to at least one sick child in the week prior to the interview. CONCLUSION: There were big gaps between rural and urban private facilities, with rural ones having fewer trained personnel and lower availability of zinc tablets. In both rural and urban areas, record keeping was low. Child health...... keeping, essential drugs for the treatment of malaria, pneumonia and diarrhoea; the sex, level of education, professional and in-service training of the persons found attending to patients in these facilities. A comparison was made between urban and rural facilities. Univariate and bivariate analysis...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride. Edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  19. Validation of Advanced Computer Codes for VVER Technology: LB-LOCA Transient in PSB-VVER Facility

    Directory of Open Access Journals (Sweden)

    A. Del Nevo

    2012-01-01

    The OECD/NEA PSB-VVER project provided unique and useful experimental data for code validation from the PSB-VVER test facility. This facility represents the scaled-down layout of the Russian-designed pressurized water reactor, namely, VVER-1000. Five experiments were executed, dealing with loss of coolant scenarios (small, intermediate, and large break loss of coolant accidents), a primary-to-secondary leak, and a parametric study (a natural circulation test) aimed at characterizing the VVER system at reduced mass inventory conditions. The comparative analysis presented in the paper regards the large break loss of coolant accident experiment. Four participants from three different institutions were involved in the benchmark and applied their own models and set-ups for four different thermal-hydraulic system codes. The benchmark demonstrated the performance of these codes in predicting phenomena relevant for safety on the basis of fixed criteria.
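The benchmark above judges code predictions against experimental data on the basis of fixed quantitative criteria. As a minimal illustration of such a criterion, the sketch below computes a normalized root-mean-square error between an experimental time trace and a code prediction; the traces are hypothetical, and the project's actual acceptance criteria are more elaborate.

```python
import numpy as np

def nrmse(experiment, prediction):
    """Root-mean-square error between a code prediction and an
    experimental time trace, normalized by the experimental range."""
    exp = np.asarray(experiment, dtype=float)
    pred = np.asarray(prediction, dtype=float)
    return float(np.sqrt(np.mean((pred - exp) ** 2)) / (exp.max() - exp.min()))

# Hypothetical primary-pressure traces (MPa) during a large-break blowdown.
exp_trace = [15.7, 12.0, 8.5, 4.2, 1.0]
code_trace = [15.7, 11.5, 8.9, 4.0, 1.2]
print(nrmse(exp_trace, code_trace))  # a small value indicates close agreement
```

A real benchmark would apply such a metric to many variables (pressures, temperatures, mass inventories) and compare the scores of the participating codes against fixed thresholds.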

  20. Expertise concerning the request by the ZWILAG Intermediate Storage Facility Wuerenlingen AG for granting of a licence for the building and operation of the Central Intermediate Storage Facility for radioactive wastes

    International Nuclear Information System (INIS)

    On July 15, 1993, the Intermediate Storage Facility Wuerenlingen AG (ZWILAG) submitted a request to the Swiss Federal Council for the granting of a license for the construction and operation of a central intermediate storage facility for radioactive wastes. The project foresees intermediate storage halls as well as conditioning and incineration installations. The Federal Agency for the Safety of Nuclear Installations (HSK) has to examine the project from the point of view of nuclear safety. The present report presents the results of this examination. Different waste types have to be treated in ZWILAG: spent fuel assemblies from Swiss nuclear power plants (KKWs); vitrified, highly radioactive wastes from reprocessing; intermediate- and low-level radioactive wastes from KKWs and from reprocessing; wastes from the dismantling of nuclear installations; and wastes from medicine, industry and research. The wastes are partitioned into three categories: high-level (HAA) radioactive wastes containing, amongst others, α-active nuclides; intermediate-level (MAA) radioactive wastes; and low-level (SAA) radioactive wastes. The projected installation consists of repository halls for each of the three waste categories, a hot cell, a conditioning plant and an incineration and melting installation. The HAA repository can accept 200 transport and storage containers with vitrified high-level wastes or spent fuel assemblies. The expected radioactivity amounts to 10²⁰ Bq, including 10¹⁸ Bq of α-active nuclides. The thermal power produced by decay is released to the environment by natural circulation of air. The ventilation system is designed for a maximum power of 5.8 MW. Severe conditions are imposed on the containers as far as tightness and shielding against radiation are concerned. In the repository for MAA wastes the maximum radioactivity is 10¹⁸ Bq, with 10¹⁵ Bq of α-active nuclides. The maximum thermal power of 250 kW is removed by forced air cooling. Because of the high level of radiation the...

  1. Assessing the potential of rural and urban private facilities in implementing child health interventions in Mukono district, central Uganda–a cross sectional study

    OpenAIRE

    Rutebemberwa, Elizeus; Buregyeya, Esther; Lal, Sham; Clarke, Sîan E.; Hansen, Kristian S; Magnussen, Pascal; LaRussa, Philip; Mbonye, Anthony K

    2016-01-01

    Background Private facilities are the first place of care seeking for many sick children. Involving these facilities in child health interventions may provide opportunities to improve child welfare. The objective of this study was to assess the potential of rural and urban private facilities in diagnostic capabilities, operations and human resource in the management of malaria, pneumonia and diarrhoea. Methods A survey was conducted in pharmacies, private clinics and drug shops in Mukono dist...

  2. Developing and implementing a computer assisted emergency facility for assessing off-site consequences due to accidents in UK nuclear power reactors

    International Nuclear Information System (INIS)

    This paper outlines considerations in the development of the RAD computer code as used in the Emergency Room at HM NII for assessing the off-site consequences of accidents in UK civil nuclear power reactors. A wide range of requirements has been accommodated within the facility, particularly the need of HM NII to meet its responsibilities by producing realistic and timely estimates of a suitably high quality for propagating advice. The development of the computer code has required the balancing of many competing factors. Valuable experience has been gained by using the code during emergency exercises. Importance is laid on the feedback of field measurements to enhance the accuracy of estimated radiological consequences. (author)

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  5. COMPUTING

    CERN Document Server

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  7. COMPUTING

    CERN Document Server

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  8. Analysis on Present Situation of Universities' Sport Facility Resources Construction in Central China

    Institute of Scientific and Technical Information of China (English)

    张大超; 陈海青

    2012-01-01

    Taking 16 "211 Project" universities and 12 first-tier universities of six provinces in Central China as research objects, we investigated and analyzed the present situation of sports facility resources construction of universities in Central China through the methods of literature study, interview survey and mathematical statistics. The results showed that: 1) at present, indoor sports facility development in Central China is good and the per capita area of sports facilities reaches the standards prescribed by the state; in general, however, the quantity of outdoor sports facilities is inadequate, the per capita area cannot reach the prescribed standards, and it cannot meet the needs of students' extracurricular sports training and extracurricular physical exercise; 2) the provision rate of sports facilities at first-tier universities is lower than that at "211 Project" universities; 3) swimming pools and covered playgrounds are relatively few in number, and the utilization rate of swimming pools is also low; 4) venues for activities that develop students' intelligence and physical strength and foster an adventurous spirit, such as outdoor activity bases and climbing facilities, are severely lacking. Suggestions, including expanding construction, rational planning, improving utilization and developing off-campus resources, are put forward to promote the healthy development of university sports facility resources in Central China.

  9. Estimating Groundwater Concentrations from Mass Releases to the Aquifer at Integrated Disposal Facility and Tank Farm Locations Within the Central Plateau of the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Bergeron, Marcel P.; Freeman, Eugene J.

    2005-06-09

    This report summarizes groundwater-related numerical calculations that will support groundwater flow and transport analyses associated with the scheduled 2005 performance assessment of the Integrated Disposal Facility (IDF) at the Hanford Site. The report also provides potential supporting information to other ongoing Hanford Site risk analyses associated with the closure of single-shell tank farms and related actions. The IDF 2005 performance assessment analysis uses well intercept factors (WIFs), as outlined in the 2001 performance assessment of the IDF. The flow and transport analyses applied to these calculations use both a site-wide regional-scale model and a local-scale model of the area near the IDF. The regional-scale model is used to evaluate flow conditions, groundwater transport, and impacts from the IDF in the central part of the Hanford Site, at the core zone boundary around the 200 East and 200 West Areas, and along the Columbia River. The local-scale model is used to evaluate impacts from transport of contaminants to a hypothetical well 100 m downgradient from the IDF boundaries. Analyses similar to the regional-scale analysis of IDF releases are also provided at individual tank farm areas as additional information. To gain insight on how the WIF approach compares with other approaches for estimating groundwater concentrations from mass releases to the unconfined aquifer, groundwater concentrations were estimated with the WIF approach for two hypothetical release scenarios and compared with similar results using a calculational approach (the convolution approach). One release scenario evaluated with both approaches (WIF and convolution) involved a long-term source release from immobilized low-activity waste glass containing 25,550 Ci of technetium-99 near the IDF; another involved a hypothetical shorter-term release of ~0.7 Ci of technetium over 600 years from the S-SX tank farm area. In addition, direct simulation results for both release...
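The convolution approach mentioned in the record above can be sketched in a few lines: the concentration at a receptor well is the release-rate history convolved with the aquifer's response to a unit release. Everything below is an illustrative toy, not Hanford data; the time step, release history and unit-response function are invented for the sketch.

```python
import numpy as np

def concentration_history(release_rate, unit_response):
    """Estimate concentration versus time at a receptor well by
    convolving a mass-release history with the aquifer's response
    to a unit release (the 'convolution approach')."""
    return np.convolve(release_rate, unit_response)[: len(release_rate)]

# Toy inputs (NOT Hanford values): ~0.7 Ci released uniformly over
# 600 yearly steps, and a hypothetical exponentially decaying
# unit-response function with a 100-year time constant.
years = 600
release = np.full(years, 0.7 / years)
response = np.exp(-np.arange(years) / 100.0)

conc = concentration_history(release, response)  # rises toward a plateau
```

With a constant release and a decaying response, the concentration climbs toward an asymptote set by the product of the release rate and the integral of the response function, which is the intuition behind comparing a convolution result with a steady-state WIF estimate.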

  10. FMIT facility control system

    International Nuclear Information System (INIS)

    The control system for the Fusion Materials Irradiation Test (FMIT) Facility, under construction at Richland, Washington, uses current techniques in distributed processing to achieve responsiveness, maintainability and reliability. Developmental experience with the system on the FMIT Prototype Accelerator (FPA) being designed at the Los Alamos National Laboratory is described as a function of the system's design goals and details. The functional requirements of the FMIT control system dictated the use of a highly operator-responsive, display-oriented structure, using state-of-the-art console devices for man-machine communications. Further, current technology has allowed the movement of device-dependent tasks into the area traditionally occupied by remote input-output equipment; the system's dual central process computers communicate with remote communications nodes containing microcomputers that are architecturally similar to the top-level machines. The system has been designed to take advantage of commercially available hardware and software

  11. COMPUTING

    CERN Document Server

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  12. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  14. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  15. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, after which the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  17. Focal axonal swellings and associated ultrastructural changes attenuate conduction velocity in central nervous system axons: a computer modeling study

    OpenAIRE

    Kolaric, Katarina V; Thomson, Gemma; Edgar, Julia M; Brown, Angus M.

    2013-01-01

    The constancy of action potential conduction in the central nervous system (CNS) relies on uniform axon diameter coupled with fidelity of the overlying myelin providing high-resistance, low capacitance insulation. Whereas the effects of demyelination on conduction have been extensively studied/modeled, equivalent studies on the repercussions for conduction of axon swelling, a common early pathological feature of (potentially reversible) axonal injury, are lacking. The recent description of ex...

  18. Computational approaches to the prediction of blood-brain barrier permeability: A comparative analysis of central nervous system drugs versus secretase inhibitors for Alzheimer's disease.

    Science.gov (United States)

    Rishton, Gilbert M; LaBonte, Kristen; Williams, Antony J; Kassam, Karim; Kolovanov, Eduard

    2006-05-01

    This review summarizes progress made in the development of fully computational approaches to the prediction of blood-brain barrier (BBB) permeability of small molecules, with a focus on rapid computational methods suitable for the analysis of large compound sets and virtual screening. A comparative analysis using the recently developed Advanced Chemistry Development (ACD/Labs) Inc BBB permeability algorithm for the calculation of logBB values for known Alzheimer's disease medicines, selected central nervous system drugs and new secretase inhibitors for Alzheimer's disease, is presented. The trends in logBB values and the associated physicochemical properties of these agents as they relate to the potential for BBB permeability are also discussed. PMID:16729726
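Descriptor-based logBB models of the kind surveyed in this record are often simple linear fits on a few physicochemical properties. The sketch below uses coefficients in the style of Clark's published two-descriptor model (lipophilicity and polar surface area); it is an illustration only, not the proprietary ACD/Labs algorithm discussed in the abstract, and the coefficients should be treated as approximate.

```python
def log_bb_estimate(clogp: float, psa: float) -> float:
    """Simple linear logBB estimate from calculated logP and polar
    surface area (Å²), in the style of Clark's model
    (logBB ≈ 0.152·ClogP − 0.0148·PSA + 0.139). Illustrative only."""
    return 0.152 * clogp - 0.0148 * psa + 0.139

# A lipophilic, low-PSA molecule scores as likely BBB-permeant
# (logBB > 0); a polar, high-PSA molecule does not.
print(log_bb_estimate(clogp=3.0, psa=30.0))   # positive
print(log_bb_estimate(clogp=0.5, psa=120.0))  # negative
```

The same two-descriptor pattern (favoring lipophilicity, penalizing polar surface area) underlies many of the rapid screening methods the review compares.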

  19. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    Energy Technology Data Exchange (ETDEWEB)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables.

  20. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    International Nuclear Information System (INIS)

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables

  1. The John von Neumann Institute for Computing (NIC): A survey of its supercomputer facilities and its Europe-wide computational science activities

    International Nuclear Information System (INIS)

    The John von Neumann Institute for Computing (NIC) at the Research Centre Juelich, Germany, is one of the leading supercomputing centres in Europe. Founded as a national centre in the mid-eighties, it now provides more and more resources to European scientists. This happens within EU-funded projects (I3HP, DEISA) or Europe-wide scientific collaborations. Beyond these activities, NIC started an initiative towards the new EU member states in summer 2004. Outstanding research groups are invited to exploit the supercomputers at NIC to accelerate their investigations using leading-edge technology. The article gives an overview of the organisational structure of NIC, its current supercomputer systems, and its user support. Transnational Access (TA) within I3HP is described, as well as access through the initiative for new EU member states. The scope of these offers and the procedure for applying for supercomputer resources are described in detail

  2. The John von Neumann Institute for Computing (NIC): A survey of its supercomputer facilities and its Europe-wide computational science activities

    Science.gov (United States)

    Attig, N.

    2006-03-01

    The John von Neumann Institute for Computing (NIC) at the Research Centre Jülich, Germany, is one of the leading supercomputing centres in Europe. Founded as a national centre in the mid-eighties, it now provides more and more of its resources to European scientists, within EU-funded projects (I3HP, DEISA) and Europe-wide scientific collaborations. Beyond these activities, NIC started an initiative towards the new EU member states in summer 2004: outstanding research groups are offered the opportunity to use the supercomputers at NIC to accelerate their investigations with leading-edge technology. The article gives an overview of the organisational structure of NIC, its current supercomputer systems, and its user support. Transnational Access (TA) within I3HP is described, as well as access through the initiative for new EU member states. The volume of these offers and the procedure for applying for supercomputer resources are presented in detail.

  3. Annual report to the Laser Facility Committee 1980

    International Nuclear Information System (INIS)

    The work done at, or in conjunction with, the Central Laser Facility during the year April 1979 to March 1980, is reported and discussed. The fields covered in the seven chapters are: (1) Glass laser facility development. (2) Gas laser development. (3) Laser plasma interaction. (4) Transport and particle emission. (5) Ablative compression studies. (6) X-ray spectroscopy and XUV lasers. (7) Theory and computation. Publications based on work at, or in conjunction with, the Facility, which have appeared or been accepted for publication during the year are listed in an appendix. (U.K.)

  4. Design/Installation and Structural Integrity Assessment of Bethel Valley Low-Level Waste collection and transfer system upgrade for Building 3092 (central off-gas scrubber facility) at Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    This document describes and assesses planned modifications to be made to the Building 3092 Central Off-Gas Scrubber Facility of the Oak Ridge National Laboratory, Oak Ridge, Tennessee. The modifications are made in response to the requirements of 40CFR264 Subpart J, relating to environmental protection requirements for buried tank systems. The modifications include the provision of a new scrubber recirculation tank in a new, below ground, lined concrete vault, replacing an existing recirculation sump that does not provide double containment. A new buried, double contained pipeline is provided to permit discharge of spent scrubber recirculation fluid to the Central Waste Collection Header. The new vault, tank, and discharge line are provided with leak detection and provisions to remove accumulated liquid. New scrubber recirculation pumps, piping, and accessories are also provided. This assessment concludes that the planned modifications comply with applicable requirements of 40CFR264 Subpart J, as set forth in Appendix F to the Federal Facility Agreement, Docket No. 89-04-FF, covering the Oak Ridge Reservation. A formal design certification statement is included herein on Page 53; a certification covering the installation shall be executed prior to placing the modified facility into service

  5. Case of acute meningitis with clear cerebrospinal fluid: value of computed tomography for the diagnosis of central nervous system tuberculosis

    Energy Technology Data Exchange (ETDEWEB)

    Cesari, V.

    1986-11-06

    The author reports a case of acute meningitis with clear cerebrospinal fluid in which extensive bacteriologic investigations were negative, making the etiologic diagnosis exceedingly difficult. Initiation of empiric antituberculous therapy was rapidly followed by clinical and biological improvement, without complications, and by resolution of abnormal findings on computed tomography of the brain. On these grounds, meningitis secondary to a tuberculoma in the temporal lobe was diagnosed. The author points out that tuberculous meningitis is still a severe, potentially fatal condition; this, together with the fact that tubercle bacilli are often very scarce or absent, requires that tuberculous meningitis be routinely considered in every patient with clear cerebrospinal fluid meningitis whose condition deteriorates. Computed tomography of the brain is essential to ensure rapid diagnosis and prompt initiation of antituberculous therapy. Lastly, the author points out that nowadays herpes simplex virus encephalopathy should also be considered.

  6. Liquid Effluent Retention Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Liquid Effluent Retention Facility (LERF) is located in the central part of the Hanford Site. LERF is permitted by the State of Washington and has three liquid...

  7. A computer code to estimate accidental fire and radioactive airborne releases in nuclear fuel cycle facilities: User's manual for FIRIN

    Energy Technology Data Exchange (ETDEWEB)

    Chan, M.K.; Ballinger, M.Y.; Owczarski, P.C.

    1989-02-01

    This manual describes the technical bases and use of the computer code FIRIN. This code was developed to estimate the source term release of smoke and radioactive particles from potential fires in nuclear fuel cycle facilities. FIRIN is a product of a broader study, Fuel Cycle Accident Analysis, which Pacific Northwest Laboratory conducted for the US Nuclear Regulatory Commission. The technical bases of FIRIN consist of a nonradioactive fire source term model, compartment effects modeling, and radioactive source term models. These three elements interact with each other in the code affecting the course of the fire. This report also serves as a complete FIRIN user's manual. Included are the FIRIN code description with methods/algorithms of calculation and subroutines, code operating instructions with input requirements, and output descriptions. 40 refs., 5 figs., 31 tabs.

  8. Central nervous system abnormalities on midline facial defects with hypertelorism detected by magnetic resonance image and computed tomography; Anomalias de sistema nervoso central em defeitos de linha media facial com hipertelorismo detectados por ressonancia magnetica e tomografia computadorizada

    Energy Technology Data Exchange (ETDEWEB)

    Lopes, Vera Lucia Gil da Silva; Giffoni, Silvio David Araujo [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Ciencias Medicas. Dep. de Genetica Medica]. E-mail: vlopes@fcm.unicamp.br

    2006-10-15

    The aims of this study were to describe and compare structural central nervous system (CNS) anomalies detected by magnetic resonance imaging (MRI) and computed tomography (CT) in individuals affected by midline facial defects with hypertelorism (MFDH), either isolated or associated with multiple congenital anomalies (MCA). The investigation protocol included dysmorphological examination, skull and facial X-rays, and brain CT and/or MRI. We studied 24 individuals, 12 of whom had the isolated form (Group I) and the others MCA of unknown etiology (Group II). There was no significant difference between Groups I and II, and the results are presented together. In addition to the several CNS anomalies previously described, MRI (n=18) was useful for the detection of neuronal migration errors. These data suggest that structural CNS anomalies and MFDH seem to have an intrinsic embryological relationship, which should be taken into account during clinical follow-up. (author)

  9. A Study of Computer Facility Sharing for Teaching Purposes

    Institute of Scientific and Technical Information of China (English)

    陶燕丽

    2014-01-01

    This essay analyses the utilization rate of computer facilities used mainly for teaching and studies how to maximize the utilization of computer equipment in independent laboratories. Using the Shapley-value method from game theory, it derives a cost-sharing approach and designs a reasonable cost-apportioning mechanism among the independent laboratories for the shared management of laboratory resources.
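
    The Shapley-value cost-sharing method named in the abstract can be sketched in a few lines. The lab names and the fixed-plus-per-lab cost function below are hypothetical, purely for illustration; a real deployment would plug in measured equipment and utilization costs.

```python
import math
from itertools import permutations

def shapley_cost_shares(labs, cost):
    """Shapley value: average each lab's marginal cost over every
    order in which the labs could join the sharing coalition."""
    shares = {lab: 0.0 for lab in labs}
    for order in permutations(labs):
        coalition = frozenset()
        for lab in order:
            with_lab = coalition | {lab}
            shares[lab] += cost(with_lab) - cost(coalition)
            coalition = with_lab
    n_orders = math.factorial(len(labs))
    return {lab: s / n_orders for lab, s in shares.items()}

# Hypothetical cost function: a shared facility costs a fixed 100 units
# plus 20 units per participating lab; the empty coalition costs nothing.
def facility_cost(coalition):
    return 0 if not coalition else 100 + 20 * len(coalition)

shares = shapley_cost_shares(["lab_a", "lab_b", "lab_c"], facility_cost)
```

    By construction the shares always sum to the cost of the full coalition (the efficiency property), which is what makes the method attractive for apportioning shared equipment costs.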

  10. A mathematical model using computed tomography for the diagnosis of metastatic central compartment lymph nodes in papillary thyroid carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Tianrun [The Sixth Affiliated Hospital of Sun Yat-sen University, Guangzhou (China); Su, Xuan; Chen, Weichao; Zheng, Lie; Li, Li; Yang, AnKui [Sun Yat-sen University Cancer Center, Guangzhou (China)

    2014-11-15

    The purpose of this study was to establish a potential mathematical model for the diagnosis of central compartment lymph node (LN) metastases of papillary thyroid carcinoma (PTC) using CT imaging. 303 patients with PTC were enrolled. We determined the optimal cut-off points of LN size and nodal grouping by calculating the diagnostic value of each cut-off point. Then, calcification, cystic or necrotic change, abnormal enhancement, size, and nodal grouping were analysed by univariate and multivariate statistical methods. The mathematical model was obtained using binary logistic regression analysis, and a scoring system was developed for convenient use in clinical practice. An area of 30 mm² for LN size and two LNs as the nodal grouping criterion had the best diagnostic value. The mathematical model was: p = e^y/(1 + e^y), where y = -0.670 - 0.087 × size + 1.010 × cystic or necrotic change + 1.371 × abnormal enhancement + 0.828 × nodal grouping + 0.909 × area. We assigned values of 25, 33, 20, and 22 to cystic or necrotic change, abnormal enhancement, size, and nodal grouping, respectively, yielding a scoring system. This mathematical model has a high diagnostic value and is a convenient clinical tool. (orig.)
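
    The regression model quoted in the abstract can be evaluated directly. The 0/1 coding of the imaging findings in this sketch is an assumption for illustration; the paper defines the exact variable coding.

```python
import math

def metastasis_probability(size_mm2, cystic_necrotic, abnormal_enhancement,
                           nodal_grouping, area_flag):
    """Evaluate p = e^y / (1 + e^y) with the coefficients quoted in the
    abstract; the last four arguments are assumed to be 0/1 indicators."""
    y = (-0.670
         - 0.087 * size_mm2
         + 1.010 * cystic_necrotic
         + 1.371 * abnormal_enhancement
         + 0.828 * nodal_grouping
         + 0.909 * area_flag)
    return math.exp(y) / (1.0 + math.exp(y))

def risk_score(cystic_necrotic, abnormal_enhancement, size_flag, nodal_grouping):
    """The simplified scoring system: 25, 33, 20, and 22 points for the
    four findings, as assigned in the abstract."""
    return (25 * cystic_necrotic + 33 * abnormal_enhancement
            + 20 * size_flag + 22 * nodal_grouping)
```

    For a node with all four findings present and a 30 mm² area, the logistic model returns a probability of roughly 0.70, and the simplified score sums to 100.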

  11. Development of a model for geomorphological assessment at U.S. DOE chemical/radioactive waste disposal facilities in the central and eastern United States; Weldon spring site remedial action project, Weldon Spring, Missouri

    International Nuclear Information System (INIS)

    Landform development and long-term geomorphic stability are the result of a complex interaction of a number of geomorphic processes. These processes may be highly variable in intensity and duration under different physiographic settings. This limitation has restricted the applicability of previous geomorphological stability assessments, conducted in the arid or semi-arid western United States, to site evaluations in more temperate and humid climates. The purpose of this study was to develop a model suitable for evaluating both long-term and short-term geomorphic processes which may impact landform stability and hence the stability of disposal facilities located in the central and eastern United States. The model developed for the geomorphological stability assessment at the Weldon Spring Site Remedial Action Project (WSSRAP) near St. Louis, Missouri, included an evaluation of existing landforms and consideration of the impact of both long-term and short-term geomorphic processes. These parameters were evaluated with respect to their impact on and contribution to the three assessment criteria considered most important for the stability analysis: evaluation of landform age; evaluation of present geomorphic process activity; and determination of the impact of the completed facility on existing geomorphic processes. The geomorphological assessment at the Weldon Spring site indicated that the facility is located in an area of excellent geomorphic stability. The only geomorphic process determined to have a potential detrimental effect on long-term facility performance is an extension of the drainage network. A program of mitigating measures has been proposed to minimize the impact that future gully extension could have on the integrity of the facility

  12. Central odontogenic fibroma (simple type) in a four year old boy: Atypical cone-beam computed tomographic appearance with periosteal reaction

    Energy Technology Data Exchange (ETDEWEB)

    Anbiaee, Najme; Ebrahimnejad, Hamed; Sanaei, Alireza [Dept. of Oral and Maxillofacial Radiology, Maxillofacial Diseases Research Center, School of Dentistry, Mashhad University of Medical Sciences, Mashhad (Iran, Islamic Republic of)

    2015-06-15

    Central odontogenic fibroma (COF) is a rare benign tumor that accounts for 0.1% of all odontogenic tumors. A case of COF (simple type) of the mandible in a four-year-old boy is described in this report. The patient showed asymptomatic swelling in the right inferior border of the lower jaw for one week. A panoramic radiograph showed a poorly-defined destructive unilocular radiolucent area. Cone-beam computed tomography showed expansion and perforation of the adjacent cortical bone plates. A periosteal reaction with the Codman triangle pattern was clearly visible in the buccal cortex. Since the tumor had destroyed a considerable amount of bone, surgical resection was performed. No recurrence was noted.

  13. Endodontic and Esthetic Management of a Dilacerated Maxillary Central Incisor Having Two Root Canals Using Cone Beam Computed Tomography as a Diagnostic Aid

    Directory of Open Access Journals (Sweden)

    Sarang Sharma

    2014-01-01

    Traumatic injuries to the primary dentition are quite common. When primary teeth are subjected to trauma, force transmission and/or invasion of the underlying tooth germs lying in close proximity can result in a variety of disturbances in the permanent successors, including hypoplasia, dilaceration, and alteration in the eruption sequence and pattern. Dilaceration is defined as an angulation, or sharp bend or curve, in the linear relationship of the crown of a tooth to its root. A rare case of a maxillary left central incisor with crown dilaceration and Vertucci's type II canal configuration, presenting with symptomatic periapical periodontitis, is reported. Cone beam computed tomography was used for better understanding of the anomaly and the complicated root canal morphology. The tooth was successfully managed by nonsurgical root canal therapy and restoration with resin composite to restore esthetics.

  14. Evaluation of the DYMAC demonstration program. Phase III report. [LASL Plutonium Processing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Malanify, J.J.; Bearse, R.C. (comps.)

    1980-12-31

    An accountancy system based on the Dynamic Materials Accountability (DYMAC) System has been in operation at the Plutonium Processing Facility at the Los Alamos National Laboratory since January 1978. This system, now designated the Plutonium Facility/Los Alamos Safeguards System (PF/LASS), has enhanced nuclear material accountability and process control at the Los Alamos facility. The nondestructive assay instruments and the central computer system are operating accurately and reliably. As anticipated, several uses of the system, notably scrap control and quality control, have developed in addition to safeguards. The successes of this experiment strongly suggest that implementation of DYMAC-based systems should be attempted at other facilities.

  15. Procedures and techniques for monitoring the radiation detection, signalization and alarm systems in the centralized ambience monitoring systems of the basic nuclear facilities of the CEN Saclay

    International Nuclear Information System (INIS)

    After referring to the regulations governing 'systematic ambience monitoring' in basic nuclear facilities, the main radiation detection, signalization and alarm devices currently in use in these facilities of the Saclay Nuclear Study Centre are described. Analysis of the operating defects of the measuring channels and of the detection possibilities leads to the anomalies being classified in two separate groups: logical 'all or nothing' anomalies, all of whose possible origins are integrated into a so-called 'continuity' line; and evolutive anomalies of various origins, corresponding to poor functioning that may extend to a complete absence of signal. The techniques for testing the detection devices of the radiation monitoring board set up in the 'Departement de Rayonnements' at the Saclay Nuclear Study Centre are also described

  16. Physics detector simulation facility system software description

    International Nuclear Information System (INIS)

    Large and costly detectors will be constructed during the next few years to study the interactions produced by the SSC. Efficient, cost-effective designs for these detectors will require careful thought and planning. Because it is not possible to fully test a proposed design in a scaled-down version, the adequacy of a proposed design will be determined by a detailed computer model of the detectors. Physics and detector simulations will be performed on the computer model using the high-powered computing system at the Physics Detector Simulation Facility (PDSF). The SSCL has particular computing requirements for high-energy physics (HEP) Monte Carlo calculations for the simulation of SSCL physics and detectors. The numerical calculations to be performed in each simulation are lengthy and detailed; they could require many months per run on a VAX 11/780 computer and may produce several gigabytes of data per run. Consequently, a distributed computing environment of several networked high-speed computing engines is envisioned to meet these needs. These networked computers will form the basis of a centralized facility for SSCL physics and detector simulation work. Our computer planning groups have determined that the most efficient, cost-effective way to provide these high-performance computing resources at this time is with RISC-based UNIX workstations. The modeling and simulation application software that will run on the computing system is usually written by physicists in the FORTRAN language and may need thousands of hours of supercomputing time. The system software is the "glue" which integrates the distributed workstations and allows them to be managed as a single entity. This report addresses the computing strategy for the SSC

  17. Using On-line Analytical Processing (OLAP) and data mining to estimate emergency room activity in DoD Medical Treatment Facilities in the Tricare Central Region

    OpenAIRE

    Ferguson, Cary V.

    2001-01-01

    On-line Analytical Processing (OLAP) and data mining can greatly enhance the ability of the Military Medical Treatment Facility (MTF) emergency room (ER) manager to improve ER staffing and utilization. MTF ER managers use statistical data analysis to help manage the efficient operation and use of ERs. As the size and complexity of databases increase, traditional statistical analysis becomes limited in the amount and type of information it can extract. OLAP tools enable the analysis of multi-d...

  18. A Remote Centralized Control Management Strategy for Campus-Network-Based Multimedia Facilities

    Institute of Scientific and Technical Information of China (English)

    李伽

    2012-01-01

    A remote centralized control management system based on the campus network was designed for a university with a large number of multimedia facilities. The structure and functions of the system are introduced, with emphasis on the operating mode managed by the central control unit and on remote management implemented with Intel AMT technology. Drawing on years of experience in managing multimedia facilities, a series of measures to improve the university's multimedia facility management is proposed, such as building a classroom teaching-support platform and a teaching management platform, and integrating the multiple systems.

  19. Design/installation and structural integrity assessment of Bethel Valley low-level waste collection and transfer system upgrade for Building 3092 (Central Off-Gas Scrubber Facility) at Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    This document describes and assesses planned modifications to be made to the Building 3092 Central Off-Gas Scrubber Facility of the Oak Ridge National Laboratory, Oak Ridge, Tennessee. The modifications are made in response to the requirements of 40CFR264 Subpart J, relating to environmental protection requirements for buried tank systems. The modifications include the provision of a new scrubber recirculation tank in a new, below ground, lined concrete vault, replacing an existing recirculation sump that does not provide double containment. A new buried, double contained pipeline is provided to permit discharge of spent scrubber recirculation fluid to the Central Waste Collection Header. The new vault, tank, and discharge line are provided with leak detection and provisions to remove accumulated liquid. New scrubber recirculation pumps, piping, and accessories are also provided. This assessment concludes that the planned modifications comply with applicable requirements of 40CFR264 Subpart J, as set forth in Appendix F to the Federal Facility Agreement, Docket No. 89-04-FF, covering the Oak Ridge Reservation

  20. Reliable Facility Location Problem with Facility Protection.

    Science.gov (United States)

    Tang, Luohao; Zhu, Cheng; Lin, Zaili; Shi, Jianmai; Zhang, Weiming

    2016-01-01

    This paper studies a reliable facility location problem with facility protection that aims to hedge against random facility disruptions by both strategically protecting some facilities and using backup facilities for the demands. An Integer Programming model is proposed for this problem, in which the failure probabilities of facilities are site-specific. A solution approach combining Lagrangian Relaxation and local search is proposed and is demonstrated to be both effective and efficient based on computational experiments on random numerical examples with 49, 88, 150 and 263 nodes in the network. A real case study for a 100-city network in Hunan province, China, is presented, based on which the properties of the model are discussed and some managerial insights are analyzed. PMID:27583542
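
    The reliability objective behind such models can be illustrated with a minimal two-level sketch: a demand is served by its primary facility when that facility survives, by a backup otherwise, and incurs a penalty cost when both fail. This is a simplified illustration of the idea, not the paper's Integer Programming formulation, and it assumes independent failures; all numbers are hypothetical.

```python
def expected_service_cost(d_primary, p_primary, d_backup, p_backup, penalty):
    """Expected cost for one demand point with site-specific failure
    probabilities p_primary and p_backup (independent failures assumed):
    serve at distance d_primary if the primary survives, fall back to
    d_backup if only the backup survives, pay the penalty if both fail."""
    return ((1.0 - p_primary) * d_primary
            + p_primary * (1.0 - p_backup) * d_backup
            + p_primary * p_backup * penalty)

# Protecting the primary facility (driving its failure probability to 0)
# removes both the backup and penalty terms from the expectation.
unprotected = expected_service_cost(10.0, 0.1, 20.0, 0.2, 100.0)
protected = expected_service_cost(10.0, 0.0, 20.0, 0.2, 100.0)
```

    Comparing the two expectations against the cost of protection is exactly the trade-off the model's strategic protection decisions capture.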

  1. Computational and biochemical docking of the irreversible cocaine analog RTI 82 directly demonstrates ligand positioning in the dopamine transporter central substrate-binding site.

    Science.gov (United States)

    Dahal, Rejwi Acharya; Pramod, Akula Bala; Sharma, Babita; Krout, Danielle; Foster, James D; Cha, Joo Hwan; Cao, Jianjing; Newman, Amy Hauck; Lever, John R; Vaughan, Roxanne A; Henry, L Keith

    2014-10-24

    The dopamine transporter (DAT) functions as a key regulator of dopaminergic neurotransmission via re-uptake of synaptic dopamine (DA). Cocaine binding to DAT blocks this activity and elevates extracellular DA, leading to psychomotor stimulation and addiction, but the mechanisms by which cocaine interacts with DAT and inhibits transport remain incompletely understood. Here, we addressed these questions using computational and biochemical methodologies to localize the binding and adduction sites of the photoactivatable irreversible cocaine analog 3β-(p-chlorophenyl)tropane-2β-carboxylic acid, 4'-azido-3'-iodophenylethyl ester ([(125)I]RTI 82). Comparative modeling and small molecule docking indicated that the tropane pharmacophore of RTI 82 was positioned in the central DA active site with an orientation that juxtaposed the aryliodoazide group for cross-linking to rat DAT Phe-319. This prediction was verified by focused methionine substitution of residues flanking this site followed by cyanogen bromide mapping of the [(125)I]RTI 82-labeled mutants and by the substituted cysteine accessibility method protection analyses. These findings provide positive functional evidence linking tropane pharmacophore interaction with the core substrate-binding site and support a competitive mechanism for transport inhibition. This synergistic application of computational and biochemical methodologies overcomes many uncertainties inherent in other approaches and furnishes a schematic framework for elucidating the ligand-protein interactions of other classes of DA transport inhibitors. PMID:25179220

  2. Computational and Biochemical Docking of the Irreversible Cocaine Analog RTI 82 Directly Demonstrates Ligand Positioning in the Dopamine Transporter Central Substrate-binding Site*

    Science.gov (United States)

    Dahal, Rejwi Acharya; Pramod, Akula Bala; Sharma, Babita; Krout, Danielle; Foster, James D.; Cha, Joo Hwan; Cao, Jianjing; Newman, Amy Hauck; Lever, John R.; Vaughan, Roxanne A.; Henry, L. Keith

    2014-01-01

    The dopamine transporter (DAT) functions as a key regulator of dopaminergic neurotransmission via re-uptake of synaptic dopamine (DA). Cocaine binding to DAT blocks this activity and elevates extracellular DA, leading to psychomotor stimulation and addiction, but the mechanisms by which cocaine interacts with DAT and inhibits transport remain incompletely understood. Here, we addressed these questions using computational and biochemical methodologies to localize the binding and adduction sites of the photoactivatable irreversible cocaine analog 3β-(p-chlorophenyl)tropane-2β-carboxylic acid, 4′-azido-3′-iodophenylethyl ester ([125I]RTI 82). Comparative modeling and small molecule docking indicated that the tropane pharmacophore of RTI 82 was positioned in the central DA active site with an orientation that juxtaposed the aryliodoazide group for cross-linking to rat DAT Phe-319. This prediction was verified by focused methionine substitution of residues flanking this site followed by cyanogen bromide mapping of the [125I]RTI 82-labeled mutants and by the substituted cysteine accessibility method protection analyses. These findings provide positive functional evidence linking tropane pharmacophore interaction with the core substrate-binding site and support a competitive mechanism for transport inhibition. This synergistic application of computational and biochemical methodologies overcomes many uncertainties inherent in other approaches and furnishes a schematic framework for elucidating the ligand-protein interactions of other classes of DA transport inhibitors. PMID:25179220

  3. Discussion paper: siting a low-level radioactive waste disposal facility in the Central Interstate Low-Level Radioactive Waste Compact region. A model for other regions

    International Nuclear Information System (INIS)

    While this paper is intended to be broad enough in scope to cover the concerns and implementation guidelines of any compact commission, the focus will be on the procedures and guidelines that must be followed under the terms of the Central Interstate Low-Level Radioactive Waste Compact. This is not necessarily an endorsement of the Central States Compact approach to siting but is rather an attempt to discuss the siting process by using an existing compact as an example. As stated earlier, each compact commission will have to follow the specific guidelines of its compact. Many of the procedures to be followed and technical standards to be considered, however, apply to all the compacts

  4. Discussion paper: siting a low-level radioactive waste disposal facility in the Central Interstate Low-Level Radioactive Waste Compact region. A model for other regions

    Energy Technology Data Exchange (ETDEWEB)

    Peery, R.J.

    1984-07-01

    While this paper is intended to be broad enough in scope to cover the concerns and implementation guidelines of any compact commission, the focus will be on the procedures and guidelines that must be followed under the terms of the Central Interstate Low-Level Radioactive Waste Compact. This is not necessarily an endorsement of the Central States Compact approach to siting but is rather an attempt to discuss the siting process by using an existing compact as an example. As stated earlier, each compact commission will have to follow the specific guidelines of its compact. Many of the procedures to be followed and technical standards to be considered, however, apply to all the compacts.

  5. Development of a computer code for shielding calculation in X-ray facilities; Desenvolvimento de um codigo computacional para calculos de blindagem em salas radiograficas

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Diogo da S.; Lava, Deise D.; Affonso, Renato R.W.; Moreira, Maria de L.; Guimaraes, Antonio C.F., E-mail: diogosb@outlook.com, E-mail: deise_dy@hotmail.com, E-mail: raoniwa@yahoo.com.br, E-mail: malu@ien.gov.br, E-mail: tony@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    The construction of an effective barrier against the ionizing radiation present in X-ray rooms requires consideration of many variables. The methodology used to specify the thickness of the primary and secondary shielding of a traditional X-ray room considers the following factors: use factor, occupancy factor, distance between the source and the wall, workload, air kerma, and distance between the patient and the receptor. With these data it was possible to develop a computer program that identifies these variables and uses them in functions obtained through regressions of the graphs provided by NCRP Report No. 147 (Structural Shielding Design for Medical X-Ray Imaging Facilities) to calculate the shielding of the room walls as well as of the darkroom wall and adjacent areas. The program is validated by comparing its results with a base case provided by that report. The thicknesses obtained cover various materials, such as steel, wood, and concrete. After validation, the program is applied to a real radiographic room, whose visual model is built with the help of software for modeling interiors and exteriors. The result is a user-friendly tool for planning radiographic rooms that comply with the limits established by CNEN-NN-3.01, published in September 2011.
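
    The core relations behind such a shielding code can be sketched with the standard NCRP-147 quantities named in the abstract: the allowed barrier transmission B = P·d²/(W·U·T), and the Archer model for converting B into a material thickness. The alpha, beta, gamma values used in the usage check are placeholder fit parameters, not NCRP-147 values for any real material or beam spectrum.

```python
import math

def required_transmission(P, d, W, U, T):
    """Allowed barrier transmission B = P * d**2 / (W * U * T), where P is
    the shielding design goal, d the source-to-barrier distance, W the
    workload, U the use factor, and T the occupancy factor."""
    return P * d ** 2 / (W * U * T)

def archer_thickness(B, alpha, beta, gamma):
    """Invert the Archer transmission model to obtain barrier thickness;
    alpha, beta, gamma are material- and spectrum-specific fit parameters."""
    return (1.0 / (alpha * gamma)) * math.log(
        (B ** -gamma + beta / alpha) / (1.0 + beta / alpha))
```

    A transmission of 1 requires zero thickness, and smaller allowed transmissions require monotonically thicker barriers, which is the sanity check any such implementation should pass.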

  6. Development of a computational code for calculations of shielding in dental facilities; Desenvolvimento de um codigo computacional para calculos de blindagem em instalacoes odontologicas

    Energy Technology Data Exchange (ETDEWEB)

    Lava, Deise D.; Borges, Diogo da S.; Affonso, Renato R.W.; Guimaraes, Antonio C.F.; Moreira, Maria de L., E-mail: deise_dy@hotmail.com, E-mail: diogosb@outlook.com, E-mail: raoniwa@yahoo.com.br, E-mail: tony@ien.gov.br, E-mail: malu@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    This paper addresses shielding calculations intended to minimize the exposure of patients and personnel to ionizing radiation. The work uses the report Radiation Protection in Dentistry (NCRP-145), which establishes the calculations and standards to be adopted to ensure the safety of those who may be exposed to ionizing radiation in dental facilities, according to the dose limits established by the CNEN-NN-3.01 standard published in September 2011. The methodology comprises the use of a computer language for processing the data provided by that report, together with a commercial application used for creating residential projects and decoration. The FORTRAN language was adopted and applied to a real case. The result is a program capable of returning the thickness of materials such as steel, lead, wood, glass, plaster, acrylic, and leaded glass, which can be used for effective shielding against single-pulse or continuous beams. Several variables are used to calculate the thickness of the shield, such as: number of films used per week, film load, use factor, occupancy factor, distance between the wall and the source, transmission factor, workload, area definition, beam intensity, and intraoral and panoramic examinations. Before the methodology was applied, the results were validated against examples provided by NCRP-145; the recalculated examples agree with the report.

  7. 10-MWe solar-thermal central-receiver pilot plant, solar facilities design integration: collector-field optimization report (RADL item 2-25)

    Energy Technology Data Exchange (ETDEWEB)

    1981-01-01

    Appropriate cost and performance models and computer codes have been developed to carry out the collector field optimization, as well as additional computer codes to define the actual heliostat locations in the optimized field and to compute in detail the performance to be expected of the defined field. The range of capabilities of the available optimization and performance codes is described. The role of the optimization code in the definition of the pilot plant is specified, and a complete description of the optimization process itself is given. The detailed cost model used by the optimizer for the commercial system optimization is presented in the form of equations relating the cost element to each of the factors that determine it. The design basis for the commercial system is presented together with the rationale for its selection. The development of the individual heliostat performance code is presented. Use of the individual heliostat code in a completed study of receiver panel power under sunrise startup conditions is described. The procedure whereby performance and heliostat spacing data from the representative commercial-scale system are converted into coefficients of use in the layout processor is described, and the actual procedure used in the layout processor is described. Numerous special studies in support of the pilot plant design are described. (LEW)

  8. Avoiding Computer Viruses.

    Science.gov (United States)

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  9. Computer programming to calculate the variations of characteristic angles of heliostats as a function of time and position in a central receiver solar power plant

    Energy Technology Data Exchange (ETDEWEB)

    Mehrabian, M.A.; Aseman, R.D. [Mechanical Engineering Dept., Shahid Bahonar Univ., Kerman (Iran, Islamic Republic of)

    2008-07-01

    The central receiver solar power plant is composed of a large number of individually steered mirrors (heliostats) focusing the solar radiation onto a tower-mounted receiver. In this paper, an algorithm is developed based on vector geometry to pick an individual heliostat and calculate its characteristic angles at different times of the day and on different days of the year. The algorithm then picks the other heliostats one by one and performs the same calculations as for the first. These data are used to control the orientation of the heliostats to improve the performance of the field. The procedure is relatively straightforward and well suited to computer programming. The effect of major parameters such as shading and blocking on the performance of the heliostat field is also studied using this algorithm. The results of the computer simulation are presented in three sections: (1) the characteristic angles of individual heliostats, (2) the incidence angle of the sun rays striking individual heliostats, and (3) the blocking and shading effect of each heliostat. The calculations and comparisons show that: (a) in the northern hemisphere the average incidence angle on the north side of the tower is less than that on its south side, (b) the cosine losses are reduced as the latitude or the tower height is increased, (c) blocking is more important in winter and its effect is much more noticeable than shading for large fields, (d) the height of the tower does not considerably affect shading but significantly reduces blocking, and (e) to avoid blocking throughout the year, the field should be designed for noon on the winter solstice. (orig.)
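
The vector-geometry step at the heart of such an algorithm is easy to sketch: a heliostat's normal must bisect the sun direction and the mirror-to-receiver direction, and the incidence angle (hence the cosine loss) follows from their dot product. The Python sketch below uses invented coordinates and axis conventions; the paper's own algorithm and conventions may differ.

```python
import math

def unit(v):
    """Normalize a 3-vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def sun_vector(elevation_deg, azimuth_deg):
    """Unit vector toward the sun; x = east, y = north, z = up,
    azimuth measured clockwise from north."""
    el, az = math.radians(elevation_deg), math.radians(azimuth_deg)
    return (math.cos(el) * math.sin(az), math.cos(el) * math.cos(az), math.sin(el))

def heliostat_angles(heliostat_pos, receiver_pos, sun):
    """Return (normal elevation, normal azimuth, incidence angle) in degrees.

    The mirror normal bisects the sun direction and the direction from
    the heliostat to the receiver, so the incidence angle is the angle
    between the normal and either of those directions.
    """
    to_receiver = unit(tuple(r - h for r, h in zip(receiver_pos, heliostat_pos)))
    normal = unit(tuple(s + t for s, t in zip(sun, to_receiver)))
    elevation = math.degrees(math.asin(normal[2]))
    azimuth = math.degrees(math.atan2(normal[0], normal[1])) % 360.0
    incidence = math.degrees(math.acos(sum(n * s for n, s in zip(normal, sun))))
    return elevation, azimuth, incidence

# Hypothetical field geometry: heliostat 100 m north of a 50 m tower.
sun = sun_vector(elevation_deg=45.0, azimuth_deg=180.0)  # sun due south
el, az, inc = heliostat_angles((0.0, 100.0, 0.0), (0.0, 0.0, 50.0), sun)
print(f"normal elevation {el:.1f} deg, azimuth {az:.1f} deg, incidence {inc:.1f} deg")
# The cosine loss of this heliostat is cos(incidence).
```

Repeating the calculation over all heliostats and over time gives exactly the tracking-angle tables the paper describes.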

  10. FRS (Facility Registration System) Sites, Geographic NAD83, EPA (2007) [facility_registration_system_sites_LA_EPA_2007

    Data.gov (United States)

    Louisiana Geographic Information Center — This dataset contains locations of Facility Registry System (FRS) sites which were pulled from a centrally managed database that identifies facilities, sites or...

  11. Reference design for a centralized waste processing and storage facility. Technical manual for the management of low and intermediate level wastes generated at small nuclear research centres and by radioisotope users in medicine, research and industry

    International Nuclear Information System (INIS)

    The objective of this report is to present the generic reference design of a centralized waste processing and storage facility (WPSF) intended for countries producing small but significant quantities of liquid and solid radioactive wastes. These wastes are generated through the use of radionuclides for research, medical, industrial and other institutional activities in IAEA Member States that have not yet developed the infrastructure for a complete nuclear fuel cycle. The WPSF comprises two separate buildings. The first, for receiving and processing waste from the producers, includes the necessary equipment and support services for treating and conditioning the waste. The second building acts as a simple but adequate warehouse for storing a ten year inventory of the conditioned waste. In developing the design, it was a requirement of the IAEA that options for waste management techniques for each of the waste streams should be evaluated, in order to demonstrate that the reference design is based on the most appropriate technology. Refs, figs and tabs

  12. Screening and life-cycle cost models for new pulverized-coal heating plants: An integrated computer-based module for the central heating plant economic evaluation program (CHPECON). Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sheng, R.; Kinast, J.A.; Biederman, R.; Blazek, C.F.; Lin, M.C.

    1995-07-01

    Public Law 99-190 requires the Department of Defense (DOD) to increase the use of coal for steam generation, but DOD also has an obligation to use the most economical fuel. In support of the coal conversion effort, the U.S. Army Construction Engineering Research Laboratories (USACERL) has been tasked to develop a series of screening and life-cycle cost models to determine when and where specific coal-combustion technologies can economically be implemented in Army central heating plants. This report documents a pulverized coal-fired boiler analysis model, part of the USACERL-developed Central Heating Plant Economics model (CHPECON). The model is divided into two parts. A preliminary screening model contains options for evaluating new heating plants and cogeneration facilities fueled with pulverized coal, as well as the previous options. A cost model uses the entries provided by the screening model to provide a conceptual facility design, capital (installed) costs of the facility, operation and maintenance costs over the life of the facility, and life-cycle costs. Using these numbers the model produces a summary value for the total life-cycle cost of the plant, and a levelized cost of service.
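
The life-cycle and levelized cost arithmetic a model like CHPECON performs can be illustrated in a few lines. This is a generic sketch in Python, not the model's actual equations: it annualizes the capital cost with a capital recovery factor, adds annual O&M and fuel costs, and divides by annual heat output to obtain a levelized cost of service. All plant numbers are invented.

```python
def capital_recovery_factor(rate, years):
    """Annualize a present capital cost over `years` at discount `rate`."""
    if rate == 0.0:
        return 1.0 / years
    return rate * (1.0 + rate) ** years / ((1.0 + rate) ** years - 1.0)

def levelized_cost(capital, annual_om, annual_fuel, annual_output_mmbtu,
                   rate=0.07, years=25):
    """Levelized cost of service ($/MMBtu) for a heating plant:
    annualized capital + O&M + fuel, divided by annual output."""
    crf = capital_recovery_factor(rate, years)
    annual_cost = capital * crf + annual_om + annual_fuel
    return annual_cost / annual_output_mmbtu

# Hypothetical pulverized-coal plant numbers, for illustration only.
cost = levelized_cost(capital=25e6, annual_om=1.2e6, annual_fuel=2.0e6,
                      annual_output_mmbtu=600_000)
print(f"levelized cost of service: ${cost:.2f}/MMBtu")
```

The capital recovery factor is the inverse of the present value of an n-year annuity, which is what makes the levelized figure comparable across plants with different lifetimes.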

  13. RCRA Facility Investigation/Remedial Investigation Report with Baseline Risk Assessment for the Central Shops Burning/Rubble Pit (631-6G), Volume 1 Final

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

    The Burning/Rubble Pits at the Savannah River Site were usually shallow excavations approximately 3 to 4 meters in depth. Operations at the pits consisted of collecting waste on a continuous basis and burning it on a monthly basis. The Central Shops Burning/Rubble Pit 631-6G (BRP6G) was constructed in 1951 as an unlined earthen pit in surficial sediments for disposal of paper, lumber, cans and empty galvanized steel drums. The unit may have received other materials such as plastics, rubber, rags, cardboard, oil, degreasers, or drummed solvents. The BRP6G was operated from 1951 until 1955. After disposal activities ceased, the area was covered with soil. Hazardous substances, if present, may have migrated into the surrounding soil and/or groundwater. Because of this possibility, the United States Environmental Protection Agency (EPA) has designated the BRP6G as a Solid Waste Management Unit (SWMU) subject to the Resource Conservation and Recovery Act/Comprehensive Environmental Response, Compensation and Liability Act (RCRA/CERCLA) process.

  14. Facilities & Leadership

    Data.gov (United States)

    Department of Veterans Affairs — The facilities web service provides VA facility information. The VA facilities locator is a feature that is available across the enterprise, on any webpage, for the...

  15. Biochemistry Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Biochemistry Facility provides expert services and consultation in biochemical enzyme assays and protein purification. The facility currently features 1) Liquid...

  16. Fault Data Processing Technology Applied in the Central Maintenance Computer System

    Institute of Scientific and Technical Information of China (English)

    Li Wenjuan; He Erming; Ma Cunbao

    2014-01-01

    Based on knowledge of how failures are generated and processed on board the aircraft, the functions, modeling strategy, and processing flow of the fault data processing module of the central maintenance computer system (CMCS) are analyzed, covering fault message generation, cascaded fault screening, consolidation of repeated faults, and the correlation of flight deck effects (FDEs) with maintenance messages. Taking an advanced foreign aircraft model as an example, logic equation-based fault isolation is examined in depth, linking the system design approach to the onboard fault diagnosis model. As CMCS technology develops and matures, it will inevitably change the traditional maintenance and operating modes of commercial aircraft.
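
Two of the steps this abstract names, cascaded fault screening and logic-equation correlation of flight deck effects with maintenance messages, can be sketched as simple set and boolean operations. The message names and equations below are invented for illustration; a real CMCS uses fault models supplied by each member system's vendor.

```python
# Active fault messages reported by member systems this flight leg.
active = {"GEN1_FAIL", "AC_BUS1_LOW", "GALLEY_PWR_LOST"}

# Cascade model: effect -> upstream cause. If the cause is also active,
# the downstream (cascaded) message is screened out as a mere consequence.
cascade = {"AC_BUS1_LOW": "GEN1_FAIL", "GALLEY_PWR_LOST": "AC_BUS1_LOW"}

screened = {m for m in active if cascade.get(m) not in active}

# Logic equations correlating flight deck effects (FDEs) with the
# screened maintenance messages that can explain them.
fde_logic = {
    "ELEC GEN 1 (EICAS)": lambda faults: "GEN1_FAIL" in faults,
    "FUEL PUMP L (EICAS)": lambda faults: "FUEL_PUMP_L_FAIL" in faults,
}
correlated = [fde for fde, eq in fde_logic.items() if eq(screened)]

print("root-cause messages:", screened)   # cascaded effects removed
print("correlated FDEs:", correlated)
```

Here the two downstream electrical messages are suppressed because their upstream cause is active, so maintenance sees one root-cause message tied to one flight deck effect.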

  17. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    Full Text Available The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonability of existing urban healthcare facility distribution and optimize the location of new healthcare facilities.
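
The final correlation step in a study like this pairs a street-centrality value with a facility density for each street segment and computes an ordinary correlation coefficient. A minimal Pearson sketch in Python, with made-up numbers rather than the Nanjing data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: closeness centrality of street segments vs. the
# kernel-density estimate of private hospitals along each segment.
centrality = [0.12, 0.18, 0.25, 0.31, 0.40, 0.47]
hospital_density = [0.9, 1.4, 2.1, 2.3, 3.0, 3.8]
r = pearson(centrality, hospital_density)
print(f"correlation between centrality and hospital density: r = {r:.3f}")
```

A high r, as the paper reports for private and small hospitals, indicates that facilities concentrate along the most central streets of the network.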

  18. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis.

    Science.gov (United States)

    Ni, Jianhua; Qian, Tianlu; Xi, Changbai; Rui, Yikang; Wang, Jiechen

    2016-01-01

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonability of existing urban healthcare facility distribution and optimize the location of new healthcare facilities. PMID:27548197

  20. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers, and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers make up a large part of many people's lives and traditionally have been extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computer and electronic waste. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  1. A computer control system for a research reactor

    International Nuclear Information System (INIS)

    Most reactor applications until now have not required computer control of core output. Commercial reactors are generally operated at a constant power output to provide baseline power. However, if commercial reactor cores are to become load-following over a wide range, then centralized digital computer control is required to make the entire facility respond as a single unit to continual changes in power demand. Navy and research reactors are much smaller and simpler and are operated at constant power levels as required, without concern for the number of operators needed to run the facility. For navy reactors, centralized digital computer control may provide space savings and reduced personnel requirements. Computer control offers research reactors the versatility to change a system efficiently to develop new ideas. The operation of any reactor facility would be enhanced by a controller that does not panic and continually monitors all facility parameters. Eventually, very sophisticated computer control systems may be developed that will sense operational problems, diagnose the problem, and, depending on its severity, immediately activate safety systems or consult with operators before taking action

  2. Directory of computer users in nuclear medicine

    International Nuclear Information System (INIS)

    The Directory of Computer Users in Nuclear Medicine consists primarily of detailed descriptions and indexes to these descriptions. A typical Installation Description contains the name, address, type, and size of the institution and the names of persons within the institution who can be contacted for further information. If the department has access to a central computer facility for data analysis or timesharing, the type of equipment available and the method of access to that central computer is included. The dedicated data processing equipment used by the department in its nuclear medicine studies is described, including the peripherals, languages used, modes of data collection, and other pertinent information. Following the hardware descriptions are listed the types of studies for which the data processing equipment is used, including the language(s) used, the method of output, and an estimate of the frequency of the particular study. An Installation Index and an Organ Studies Index are also included

  3. Directory of computer users in nuclear medicine

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, J.J.; Gurney, J.; McClain, W.J. (eds.)

    1979-09-01

    The Directory of Computer Users in Nuclear Medicine consists primarily of detailed descriptions and indexes to these descriptions. A typical Installation Description contains the name, address, type, and size of the institution and the names of persons within the institution who can be contacted for further information. If the department has access to a central computer facility for data analysis or timesharing, the type of equipment available and the method of access to that central computer is included. The dedicated data processing equipment used by the department in its nuclear medicine studies is described, including the peripherals, languages used, modes of data collection, and other pertinent information. Following the hardware descriptions are listed the types of studies for which the data processing equipment is used, including the language(s) used, the method of output, and an estimate of the frequency of the particular study. An Installation Index and an Organ Studies Index are also included. (PCS)

  4. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti
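
The network calculations such a text covers reduce, in the simplest lossless ("DC") approximation, to one linear solve: B'θ = P for the bus voltage angles, with line flows b_ij(θ_i − θ_j). A toy three-bus example in Python (bus 1 is the slack; the susceptances and injections are invented, and this is an illustrative sketch rather than the book's own software):

```python
def solve2(a11, a12, a21, a22, b1, b2):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - a12 * b2) / det, (a11 * b2 - b1 * a21) / det

# Three-bus network, all line susceptances b = 10 p.u.; bus 1 is slack.
b12 = b13 = b23 = 10.0
# Net injections (p.u.): bus 2 generates 0.5, bus 3 consumes 1.0.
p2, p3 = 0.5, -1.0

# Reduced susceptance matrix B' over the non-slack buses 2 and 3.
theta2, theta3 = solve2(b12 + b23, -b23,
                        -b23, b13 + b23,
                        p2, p3)

theta1 = 0.0  # slack reference angle
flows = {
    "1->2": b12 * (theta1 - theta2),
    "1->3": b13 * (theta1 - theta3),
    "2->3": b23 * (theta2 - theta3),
}
print("angles (rad):", theta2, theta3)
print("line flows (p.u.):", flows)
```

The slack bus implicitly supplies the 0.5 p.u. mismatch between generation and load, which is exactly the role the central control facility's state estimator and dispatch functions are built around.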

  5. Experimental facilities

    International Nuclear Information System (INIS)

    We have completed an engineering feasibility study of a major modification of the HFIR facility and are now beginning a similar study of an entirely new facility. The design of the reactor itself is common to both options. In this paper, a general description of the modified HFIR is presented with some indications of the additional facilities that might be available in an entirely new facility

  6. Facility Microgrids

    Energy Technology Data Exchange (ETDEWEB)

    Ye, Z.; Walling, R.; Miller, N.; Du, P.; Nelson, K.

    2005-05-01

    Microgrids are receiving considerable interest from the power industry, partly because their business and technical structure shows promise as a means of taking full advantage of distributed generation. This report investigates three issues associated with facility microgrids: (1) unintentional islanding protection for facility microgrids with multiple distributed generators, (2) facility microgrids' response to bulk grid disturbances, and (3) facility microgrids' intentional islanding.

  7. Comparison of Knowledge and Attitudes Using Computer-Based and Face-to-Face Personal Hygiene Training Methods in Food Processing Facilities

    Science.gov (United States)

    Fenton, Ginger D.; LaBorde, Luke F.; Radhakrishna, Rama B.; Brown, J. Lynne; Cutter, Catherine N.

    2006-01-01

    Computer-based training is increasingly favored by food companies for training workers due to convenience, self-pacing ability, and ease of use. The objectives of this study were to determine if personal hygiene training, offered through a computer-based method, is as effective as a face-to-face method in knowledge acquisition and improved…

  8. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating … the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production …), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e…

  9. 6th July 2010 - United Kingdom Science and Technology Facilities Council W. Whitehorn signing the guest book with Head of International relations F. Pauss, visiting the Computing Centre with Information Technology Department Head Deputy D. Foster, the LHC superconducting magnet test hall with Technology Department P. Strubin,the Centre Control Centre with Operation Group Leader M. Lamont and the CLIC/CTF3 facility with Project Leader J.-P. Delahaye.

    CERN Multimedia

    Teams : M. Brice, JC Gadmer

    2010-01-01

    6th July 2010 - United Kingdom Science and Technology Facilities Council W. Whitehorn signing the guest book with Head of International relations F. Pauss, visiting the Computing Centre with Information Technology Department Head Deputy D. Foster, the LHC superconducting magnet test hall with Technology Department P. Strubin,the Centre Control Centre with Operation Group Leader M. Lamont and the CLIC/CTF3 facility with Project Leader J.-P. Delahaye.

  10. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 2; Unsteady Analyses and Risk Assessment

    Science.gov (United States)

    Ahuja, Vineet; Hosangadi, Ashvin; Allgood, Daniel

    2008-01-01

    Simulation technology can play an important role in rocket engine test facility design and development by assessing risks, providing analysis of dynamic pressure and thermal loads, identifying failure modes, and predicting anomalous behavior of critical systems. This is especially true for facilities such as the proposed A-3 facility at NASA SSC, because of a challenging operating envelope linked to variable throttle conditions at relatively low chamber pressures. Assessing the feasibility of operating conditions and procedures is critical in such cases due to the possibility of startup/shutdown transients, moving shock structures, unsteady shock-boundary layer interactions, and engine and diffuser unstart modes that can result in catastrophic failure. Analysis of such systems is difficult due to the resolution requirements needed to accurately capture moving shock structures, shock-boundary layer interactions, two-phase flow regimes, and engine unstart modes. In a companion paper, we demonstrate an advanced steady-state CFD capability to evaluate supersonic diffuser and steam ejector performance in the sub-scale A-3 facility. In this paper we address transient issues with the operation of the facility, especially at startup and shutdown, and assess risks related to afterburning due to the interaction of a fuel-rich plume with oxygen that is a by-product of the steam ejectors. The primary areas addressed are: (1) analysis of unstart modes due to flow transients, especially during startup/ignition, (2) engine safety during the shutdown process, and (3) interaction of the steam ejectors with the primary plume, i.e., flow transients as well as the probability of afterburning. In this abstract we discuss unsteady analyses of the engine shutdown process; the final paper will include analyses of a staged startup, drawdown of the engine test cell pressure, and risk assessment of potential afterburning in the facility.

  11. Fifty cell test facility

    Energy Technology Data Exchange (ETDEWEB)

    Arntzen, J. D.; Kolba, V. M.; Miller, W. E.; Gay, E. C.

    1980-07-01

    This report describes the design of a facility capable of simultaneously testing up to 50 high-temperature (400 to 500°C) lithium alloy/iron sulfide cells; this facility is located in the Chemical Engineering Division of Argonne National Laboratory (ANL). The emphasis will be on the lifetime testing of cells fabricated by ANL and industrial contractors to acquire statistical data on the performance of cells of various designs. A computer-based data-acquisition system processes the cell performance data generated from the cells on test. The terminals and part of the data-acquisition equipment are housed in an air-conditioned enclosure adjacent to the testing facility; the computer is located remotely.

  12. Centralized digital control of accelerators

    International Nuclear Information System (INIS)

    In contrasting the title of this paper with a second paper to be presented at this conference entitled Distributed Digital Control of Accelerators, a potential reader might be led to believe that this paper will focus on systems whose computing intelligence is centered in one or more computers in a centralized location. Instead, this paper will describe the architectural evolution of SLAC's computer based accelerator control systems with respect to the distribution of their intelligence. However, the use of the word centralized in the title is appropriate because these systems are based on the use of centralized large and computationally powerful processors that are typically supported by networks of smaller distributed processors

  13. The CUTLASS database facilities

    International Nuclear Information System (INIS)

    The enhancement of the CUTLASS database management system to provide improved facilities for data handling is seen as a prerequisite to its effective use for future power station data processing and control applications. This particularly applies to the larger projects, such as AGR data processing system refurbishments and the data processing systems required for the new Coal Fired Reference Design stations. In anticipation of the need for improved data handling facilities in CUTLASS, the CEGB established a User Sub-Group in the early 1980s to define the database facilities required by users. Following the endorsement of the resulting specification and a detailed design study, the database facilities have been implemented as an integral part of the CUTLASS system. This paper provides an introduction to the range of CUTLASS database facilities, and emphasises the role of the database as the central facility around which future Kit 1 and (particularly) Kit 6 CUTLASS-based data processing and control systems will be designed and implemented. (author)

  14. High-speed packet switching network to link computers

    CERN Document Server

    Gerard, F M

    1980-01-01

    Virtually all of the experiments conducted at CERN use minicomputers today; some simply acquire data and store results on magnetic tape while others actually control experiments and help to process the resulting data. Currently there are more than two hundred minicomputers being used in the laboratory. In order to provide the minicomputer users with access to facilities available on mainframes and also to provide intercommunication between various experimental minicomputers, CERN opted for a packet switching network back in 1975. It was decided to use Modcomp II computers as switching nodes. The only software to be taken was a communications-oriented operating system called Maxcom. Today eight Modcomp II 16-bit computers plus six newer Classic minicomputers from Modular Computer Services have been purchased for the CERNET data communications networks. The current configuration comprises 11 nodes connecting more than 40 user machines to one another and to the laboratory's central computing facility. (0 refs).

  15. Survey of solar thermal test facilities

    Energy Technology Data Exchange (ETDEWEB)

    Masterson, K.

    1979-08-01

    The facilities that are presently available for testing solar thermal energy collection and conversion systems are briefly described. Facilities that are known to meet ASHRAE standard 93-77 for testing flat-plate collectors are listed. The DOE programs and test needs for distributed concentrating collectors are identified. Existing and planned facilities that meet these needs are described and continued support for most of them is recommended. The needs and facilities that are suitable for testing components of central receiver systems, several of which are located overseas, are identified. The central contact point for obtaining additional details and test procedures for these facilities is the Solar Thermal Test Facilities Users' Association in Albuquerque, N.M. The appendices contain data sheets and tables which give additional details on the technical capabilities of each facility. Also included is the 1975 Aerospace Corporation report on test facilities that is frequently referenced in the present work.

  16. Centralized Fabric Management Using Puppet, Git, and GLPI

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Managing the infrastructure of a large and complex data center can be extremely difficult without taking advantage of automated services. Puppet is a seasoned, open-source tool designed for enterprise-class centralized configuration management. At the RHIC/ATLAS Computing Facility at Brookhaven National Laboratory, we have adopted Puppet as part of a suite of tools, including Git, GLPI, and some custom scripts, that comprise our centralized configuration management system. In this paper, we discuss the use of these tools for centralized configuration management of our servers and services; change management, which requires authorized approval of production changes; a complete, version-controlled history of all changes made; separation of production, testing, and development systems using Puppet environments; semi-automated server inventory using GLPI; and configuration change monitoring and reporting via the Puppet dashboard. We will also discuss scalability and performance results from using these tools on a...

  17. Report on the control of the safety and security of nuclear facilities. Part 2: the reconversion of military plutonium stocks. The use of the aid given to central and eastern European countries and to the newly independent states

    Energy Technology Data Exchange (ETDEWEB)

    Birraux, C

    2002-07-01

    This report deals with two different aspects of the safety and security of nuclear facilities. The first aspect concerns the reconversion of weapon-grade plutonium stocks: the plutonium in excess, plutonium hazards and nuclear fuel potentialities, the US program, the Russian program, the actions of European countries (France, Germany), the intervention of other countries, the unanswered questions (political aspects, uncertainties), and the solutions of the future (improvement of reactors, the helium-cooled high-temperature reactor technology (gas-turbine modular helium reactor: GT-MHR), and Carlo Rubbia's project). The second aspect concerns the actions carried out by the European Union in favor of the civil nuclear facilities of central and eastern Europe: the European Union's competencies under the Euratom treaty, the conclusions of the European Court of Auditors on the PHARE and TACIS nuclear programs, the status of actions already committed, the planned upcoming actions, and a critical analysis of the policy adopted so far. (J.S.)

  18. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
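
    As a taste of the techniques the book covers, numerical quadrature by the composite Simpson's rule can be written in a few lines of Python (a generic sketch, not code taken from the book):

```python
import math

def simpson(f, a, b, n):
    """Composite Simpson's rule with n (even) subintervals on [a, b]."""
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# Integrate sin(x) on [0, pi]; the exact value is 2.
approx = simpson(math.sin, 0.0, math.pi, 100)
print(approx)  # ≈ 2.0, with error on the order of 1e-8 for n = 100
```

    The fourth-order convergence (error shrinking like h⁴) is exactly the kind of behavior finite-difference and quadrature chapters in such texts analyze.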

  19. A Review on Modern Distributed Computing Paradigms: Cloud Computing, Jungle Computing and Fog Computing

    OpenAIRE

    Hajibaba, Majid; Gorgin, Saeid

    2014-01-01

    Distributed computing attempts to improve performance in large-scale computing problems by resource sharing. Moreover, rising low-cost computing power, coupled with advances in communications/networking and the advent of big data, now enables new distributed computing paradigms such as Cloud, Jungle and Fog computing. Cloud computing brings a number of advantages to consumers in terms of accessibility and elasticity. It is based on centralization of resources that possess huge processing po...

  20. A case of acute meningitis with clear cerebrospinal fluid: value of computed tomography for the diagnosis of central nervous system tuberculosis

    International Nuclear Information System (INIS)

    The author reports a case of acute meningitis with clear cerebrospinal fluid in which extensive bacteriologic investigations were negative, making the etiologic diagnosis exceedingly difficult. Initiation of empiric antituberculous therapy was rapidly followed by clinical and biological improvement, without complications, and by resolution of abnormal findings on computed tomography of the brain. On these grounds, meningitis secondary to a tuberculoma in the temporal lobe was diagnosed. The author points out that tuberculous meningitis is still a severe, potentially fatal condition; this, together with the fact that tubercle bacilli are often very scarce or absent, requires that tuberculous meningitis be routinely considered in every patient with clear cerebrospinal fluid meningitis whose condition deteriorates. Computed tomography of the brain is essential to ensure rapid diagnosis and prompt initiation of antituberculous therapy. Lastly, the author notes that nowadays herpes simplex virus encephalopathy should also be considered

  1. Mammography Facilities

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Mammography Facility Database is updated periodically based on information received from the four FDA-approved accreditation bodies: the American College of...

  2. Health Facilities

    Science.gov (United States)

    Health facilities are places that provide health care. They include hospitals, clinics, outpatient care centers, and specialized care centers, such as birthing centers and psychiatric care centers. When you ...

  3. Canyon Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — B Plant, T Plant, U Plant, PUREX, and REDOX (see their links) are the five facilities at Hanford where the original objective was plutonium removal from the uranium...

  4. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  5. Computer Aided Mathematics

    DEFF Research Database (Denmark)

    Sinclair, Robert

    1998-01-01

    Course notes of a PhD course held in 1998. The central idea is to introduce students to computational mathematics using object-oriented programming in C++.

  6. Computer surety: computer system inspection guidance. [Contains glossary]

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  7. Unique life sciences research facilities at NASA Ames Research Center

    Science.gov (United States)

    Mulenburg, G. M.; Vasques, M.; Caldwell, W. F.; Tucker, J.

    1994-01-01

    The Life Science Division at NASA's Ames Research Center has a suite of specialized facilities that enable scientists to study the effects of gravity on living systems. This paper describes some of these facilities and their use in research. Seven centrifuges, each with its own unique capabilities, allow testing of a variety of parameters on test subjects ranging from single cells through hardware to humans. The Vestibular Research Facility allows the study of both centrifugation and linear acceleration on animals and humans. The Biocomputation Center uses computers for 3D reconstruction of physiological systems and provides interactive research tools for virtual reality modeling. Psychophysiological, cardiovascular, exercise physiology, and biomechanical studies are conducted in the 12-bed Human Research Facility, and samples are analyzed in the certified Central Clinical Laboratory and other laboratories at Ames. Human bedrest, water immersion, and lower body negative pressure equipment are also available to study physiological changes associated with weightlessness. These and other weightlessness models are used in specialized laboratories for the study of basic physiological mechanisms, metabolism, and cell biology. Visual-motor performance, perception, and adaptation are studied using ground-based models as well as short-term weightlessness experiments (parabolic flights). The unique combination of Life Science research facilities, laboratories, and equipment at Ames Research Center is described in detail in relation to their research contributions.

  8. 20 CFR 654.413 - Cooking and eating facilities.

    Science.gov (United States)

    2010-04-01

    ... cleaned materials. (c) When central mess facilities are provided, the kitchen and mess hall shall be in... physical facilities, equipment and operation shall be in accordance with provisions of applicable...

  9. Canastota Renewable Energy Facility Project

    Energy Technology Data Exchange (ETDEWEB)

    Blake, Jillian; Hunt, Allen

    2013-12-13

    The project was implemented at the Madison County Landfill located in the Town of Lincoln, Madison County, New York. Madison County has owned and operated the solid waste and recycling facilities at the Buyea Road site since 1974. At the onset of the project, the County owned and operated facilities there including three separate landfills, a residential solid waste disposal and recycled material drop-off facility, a recycling facility, and associated administrative, support, and environmental control facilities. Putrescible waste in the landfills undergoes anaerobic decomposition within the waste mass and generates landfill gas, which is approximately 50% methane. In order to recover this gas, the landfill was equipped with gas collection systems on both the east and west sides of Buyea Road, which bring the gas to a central point for destruction. In order to derive a beneficial use from the collected landfill gas, the County decided to issue a Request for Proposals (RFP) for the future use of the generated gas.

  10. Facility level thermal systems for the Advanced Technology Solar Telescope

    Science.gov (United States)

    Phelps, LeEllen; Murga, Gaizka; Fraser, Mark; Climent, Tània

    2012-09-01

    The management and control of the local aero-thermal environment is critical for success of the Advanced Technology Solar Telescope (ATST). In addition to minimizing disturbances to local seeing, the facility thermal systems must meet stringent energy efficiency requirements to minimize impact on the surrounding environment and meet federal requirements along with operational budgetary constraints. This paper describes the major facility thermal equipment and systems to be implemented along with associated energy management features. The systems presented include the central plant, the climate control systems for the computer room and coudé laboratory, the carousel cooling system which actively controls the surface temperature of the rotating telescope enclosure, and the systems used for active and passive ventilation of the telescope chamber.

  11. Expertise concerning the request by the ZWILAG Intermediate Storage Facility Wuerenlingen AG for granting of a licence for the building and operation of the Central Intermediate Storage Facility for radioactive wastes

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-15

    On July 15, 1993, the Intermediate Storage Facility Wuerenlingen AG (ZWILAG) submitted a request to the Swiss Federal Council for the granting of a license for the construction and operation of a central intermediate storage facility for radioactive wastes. The project foresees intermediate storage halls as well as conditioning and incineration installations. The Federal Agency for the Safety of Nuclear Installations (HSK) has to examine the project from the point of view of nuclear safety. The present report presents the results of this examination. Different waste types have to be treated in ZWILAG: spent fuel assemblies from Swiss nuclear power plants (KKWs); vitrified, high-level radioactive wastes from reprocessing; intermediate- and low-level radioactive wastes from KKWs and from reprocessing; wastes from the dismantling of nuclear installations; and wastes from medicine, industry and research. The wastes are partitioned into three categories: high-level (HAA) radioactive wastes containing, among others, alpha-active nuclides; intermediate-level (MAA) radioactive wastes; and low-level (SAA) radioactive wastes. The projected installation consists of three repository halls, one for each waste category, a hot cell, a conditioning plant, and an incineration and melting installation. The HAA repository can accept 200 transport and storage containers with vitrified high-level wastes or spent fuel assemblies. The expected radioactivity amounts to 10^20 Bq, including 10^18 Bq of alpha-active nuclides. The thermal power produced by decay is released to the environment by natural circulation of air. The ventilation system is designed for a maximum power of 5.8 MW. Strict requirements are imposed on the containers with regard to tightness and radiation shielding. In the repository for MAA wastes the maximum radioactivity is 10^18 Bq, with 10^15 Bq of alpha-active nuclides. The maximum thermal power of 250 kW is removed by forced air cooling. Because

  12. Multipurpose dry storage facilities

    International Nuclear Information System (INIS)

    SGN has gained considerable experience in the design and construction of interim storage facilities for spent fuel and various nuclear wastes, and can therefore propose single-product and multiproduct facilities capable of accommodating all types of waste in a single structure. The pooling of certain functions (transport cask reception, radiation protection), the choice of optimized technologies to meet the specific needs of the clients (automatic transfer by shielded cask or nuclearized crane), and the use of the same type of well to cool the heat-releasing packages (glass canisters, fuel elements) make it possible to propose industrially proven and cost-effective solutions. The studies conducted by SGN on behalf of the Dutch company COVRA (Centrale Organisatie Voor Radioactief Afval) offer an example of the application of this new concept. This paper first presents the SGN experience through a short description of reference storage facilities for various types of products (MLW, HLW and spent fuel). It goes on with a typical application to show how these proven technologies are combined to obtain single-product or multiproduct facilities tailored to the client's specific requirements. (author)

  13. Facile synthesis of silver nanoparticles and its antibacterial activity against Escherichia coli and unknown bacteria on mobile phone touch surfaces/computer keyboards

    Science.gov (United States)

    Reddy, T. Ranjeth Kumar; Kim, Hyun-Joong

    2016-07-01

    In recent years, there has been significant interest in the development of novel metallic nanoparticles using various top-down and bottom-up synthesis techniques. Kenaf is a huge biomass product and a potential component for industrial applications. In this work, we investigated the green synthesis of silver nanoparticles (AgNPs) by using kenaf (Hibiscus cannabinus) cellulose extract and sucrose, which act as stabilizing and reducing agents in solution. With this method, by changing the pH of the solution as a function of time, we studied the optical, morphological and antibacterial properties of the synthesized AgNPs. In addition, these nanoparticles were characterized by ultraviolet-visible spectroscopy, transmission electron microscopy (TEM), field-emission scanning electron microscopy, Fourier transform infrared (FTIR) spectroscopy, X-ray diffraction (XRD) and energy-dispersive X-ray spectroscopy (EDX). As the pH of the solution varies, the surface plasmon resonance peak also varies. A faster rate of reaction at pH 10 compared with that at pH 5 was identified. TEM micrographs confirm that the shapes of the particles are spherical and polygonal. Furthermore, the average size of the nanoparticles synthesized at pH 5, pH 8 and pH 10 is 40.26, 28.57 and 24.57 nm, respectively. The structure of the synthesized AgNPs was identified as face-centered cubic (fcc) by XRD. The compositional analysis was determined by EDX. FTIR confirms that the kenaf cellulose extract and sucrose act as stabilizing and reducing agents for the silver nanoparticles. Meanwhile, these AgNPs exhibited size-dependent antibacterial activity against Escherichia coli (E. coli) and two other unknown bacteria from mobile phone screens and computer keyboard surfaces.

  14. Treated Effluent Disposal Facility (TEDF) Operator Training Station (OTS) System Configuration Management Plan

    Energy Technology Data Exchange (ETDEWEB)

    Carter, R.L. Jr.

    1994-06-01

    The Treated Effluent Disposal Facility Operator Training Station (TEDF OTS) is a computer-based training tool designed to aid plant operations and engineering staff in familiarizing themselves with the TEDF Central Control System (CCS). It consists of PC-compatible computers and a Programmable Logic Controller (PLC) designed to emulate the responses of various plant components connected to or under the control of the CCS. The system trains operators by simulating normal operation, but it can also force failures of individual pieces of equipment, allowing the operator to react to and observe the events. The paper describes the organization, responsibilities, system configuration management activities, software, and action plans for fully utilizing the simulation program.

  15. Marina Facilities

    OpenAIRE

    2014-01-01

    The CIRPAS main facility and headquarters are at Marina Municipal Airport (formerly Fritzsche Field, Fort Ord) in Marina, California. CIRPAS has a 30,000 sq. ft. maintenance hangar there, which houses staff offices, an instrument and calibration laboratory, maintenance and payload integration shops, conference rooms, and a flight planning and operations control center.

  16. Computed Tomography (CT) -- Head

    Medline Plus


  17. Computed Tomography (CT) -- Sinuses

    Medline Plus


  18. A Bioinformatics Facility for NASA

    Science.gov (United States)

    Schweighofer, Karl; Pohorille, Andrew

    2006-01-01

    Building on an existing prototype, we have fielded a facility with bioinformatics technologies that will help NASA meet its unique requirements for biological research. This facility consists of a cluster of computers capable of performing computationally intensive tasks, software tools, databases, and knowledge management systems. Novel computational technologies for analyzing and integrating new biological data and existing knowledge have been developed. With continued development and support, the facility will fulfill NASA's strategic bioinformatics needs in astrobiology and space exploration. As a demonstration of these capabilities, we present a detailed analysis of how spaceflight factors impact gene expression in the liver and kidney for mice flown aboard shuttle flight STS-108. We have found that many genes involved in signal transduction, cell cycle, and development respond to changes in microgravity, but that most metabolic pathways appear unchanged.

  19. Computation of the time-varying flow rate from an artesian well in central Dade County, Florida, by analytical and numerical simulation methods

    Science.gov (United States)

    Merritt, Michael L.

    1995-01-01

    To construct a digital simulation of a plume of brackish water in the surficial Biscayne aquifer of central Dade County, Florida, that originated from a flowing artesian well, it was necessary to quantify the rate of spillage and the consequent point-source loading of the aquifer. However, a flow-rate measurement (2,350 gallons per minute) made 2 months after drilling of the well in 1944 was inconsistent with later measurements (1,170 gallons per minute) in 1964, 1965, and 1969. Possible explanations were: (1) drawdown of the aquifer over time; (2) raising of the altitude at which the water was discharged; (3) installation of 80 feet of 8-inch liner; (4) an increase in the density of the flowing water; and (5) gradual deterioration of the well casing. The first approach to reconciling the measured flow rates was to apply a form of the equation for constant-drawdown analysis often used to estimate aquifer transmissivity. Next, a numerical simulation analysis was made that provided the means to account for friction loss in the well and recharge across vertically adjacent confining layers and from lateral boundaries. The numerical analysis required the construction of a generalized model of the subsurface from the surficial Biscayne aquifer to the cavernous, dolomitic Boulder Zone at a depth of 3,000 feet. Calibration of the generalized flow model required that the middle confining unit of the Floridan aquifer system separating the artesian flow zone in the Upper Floridan aquifer from the Lower Floridan aquifer (the Boulder Zone) have a vertical hydraulic conductivity of at least 1 foot per day. The intermediate confining unit separating the flow zone from the surficial Biscayne aquifer was assigned a much lower hydraulic conductivity (0.01 foot per day or less). The model indicated that the observed mounding of Upper Floridan aquifer heads along the axis of the Florida Peninsula was related to the variable depth of the freshwater and brackish-water zone
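
    The constant-drawdown analysis referred to above is commonly based on the Jacob and Lohman (1952) solution; at late times the discharge of a freely flowing well can be approximated as Q(t) ≈ 4πT·s_w / ln(2.25·T·t / (r_w²·S)). A sketch with hypothetical parameter values (not those of the Dade County well) shows the expected decline of flow rate with time:

```python
import math

def flowing_well_rate(T, S, s_w, r_w, t):
    """Late-time Jacob-Lohman approximation for a constant-drawdown flowing well.
    T: transmissivity [ft^2/d], S: storativity [-], s_w: drawdown at the well [ft],
    r_w: well radius [ft], t: time since flow began [d]. Returns Q [ft^3/d]."""
    return 4.0 * math.pi * T * s_w / math.log(2.25 * T * t / (r_w**2 * S))

# Hypothetical values, not from the study; the declining-rate behavior is the point.
T, S, s_w, r_w = 10_000.0, 1e-4, 40.0, 0.5
q_early = flowing_well_rate(T, S, s_w, r_w, t=60.0)       # about 2 months of flow
q_late = flowing_well_rate(T, S, s_w, r_w, t=20 * 365.0)  # about 20 years of flow
print(q_early > q_late)  # flow rate declines as the cone of depression expands
```

    This logarithmic decline is one candidate explanation for the lower measured rates two decades after drilling, alongside the mechanical changes to the well listed in the abstract.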

  20. Integral lightning protection system in petroleum facilities

    Energy Technology Data Exchange (ETDEWEB)

    Torres, Horacio; Gallego, Luis; Montana, Johny; Younes, Camilo; Rondon, Daniel; Gonzalez, Diego; Herrera, Javier; Perez, Ernesto; Vargas, Mauricio; Quintana, Carlos; Salgado, Milton [Universidad Nacional de Colombia, Bogota (Colombia)]. E-mail: paas@paas.unal.edu.co

    2001-07-01

    This paper presents an Integral Lightning Protection System, focused mainly on petroleum facilities and applied to a real case in Colombia, South America. By way of introduction, it summarizes the incidents of recent years, presents a diagnosis, and proposes a solution. Finally, as part of the analysis, a lightning risk assessment for the Central Process Facility is presented. (author)

  1. Central Libraries in Uncertain Times.

    Science.gov (United States)

    Kenney, Brian J.

    2001-01-01

    Discusses security and safety issues for public libraries, especially high-profile central facilities, in light of the September 11 terrorist attacks. Highlights include inspecting bags as patrons enter as well as exit; the need for security guidelines for any type of disaster or emergency; building design; and the importance of communication.

  2. The Effect of Central Executive Load on Adults' Strategy Use in Computational Estimation

    Institute of Scientific and Technical Information of China (English)

    司继伟; 杨佳; 贾国敬; 周超

    2012-01-01

    Many models of strategy choice in arithmetic cognition have been proposed; however, the role of central executive load in strategy utilization is still far from clear. A previous study using the Choice Reaction Time task (CRT) found that working memory load did not affect children's strategy utilization (Imbo & Vandierendonck, 2007). In another study, Logie, Gilhooly and Wynn (1994) reported that different sub-tasks affected mental arithmetic. More recent studies also revealed that inhibition and shifting capacities mediated age-related differences in strategy selection (Lemaire & Lecacheur, 2011; Hodzik & Lemaire, 2011). The dual-task paradigm is commonly used to explore working memory load in arithmetic performance, and choice/no-choice is a standard method for obtaining unbiased data about strategy utilization. In this study, we employed the dual-task paradigm and the choice/no-choice method to investigate the influence of central executive load on individual strategy utilization during arithmetic processing. 128 college students were tested with the dual-task paradigm and the choice/no-choice method. They were asked to finish a two-digit addition computational estimation task and a secondary task at the same time. The experimental design was as follows: 5 (consistent-high load, consistent-low load, inconsistent-high load, inconsistent-low load, no load) x 4 (free-choice condition, best-strategy choice condition, no-choice/rounding-up condition, no-choice/rounding-down condition). The main task was to finish 30 two-digit addition questions, and the secondary task was Han and Kim's (2004) design with some modifications. Results showed that: 1) Central executive load did not affect adults' strategy distribution, but compared with the free-choice condition, adults used the rounding-down strategy less under the best-strategy choice condition (F(4,123) = 0.58, p > 0.05); 2) Central executive load affected how participants selected (F(4,123) = 11.10, p < 0.05) and executed (F

  3. A Computable Economist’s Perspective on Computational Complexity

    OpenAIRE

    Vela Velupillai, K.

    2007-01-01

    A computable economist's view of the world of computational complexity theory is described. This means the model of computation underpinning theories of computational complexity plays a central role. The emergence of computational complexity theories from diverse traditions is emphasised. The unifications that emerged in the modern era were codified by means of the notions of efficiency of computations, non-deterministic computations, completeness, reducibility and verifiability - all three of...

  4. Main Facilities

    International Nuclear Information System (INIS)

    This chapter discusses the main nuclear facilities available at the Malaysian Institute for Nuclear Technology Research (MINT). As a national research institute whose core activities are nuclear science and technology, MINT operates commercializable radiation irradiators, a pilot plant, and fully equipped laboratories. The characteristics and functions of the RTP (PUSPATI TRIGA reactor), the cobalt-60 gamma irradiator, the electron beam accelerators, and the radioactive waste management center are elaborated.

  5. Nuclear facility

    International Nuclear Information System (INIS)

    In nuclear facilities with a fuel storage pool in a spent-fuel pit building, each pool has a filter through which the fuel pit water is pumped. According to the invention, the filter is provided with an independently movable housing placed beneath the surface of the pool water and fixed to the lateral side of the pool by means of detachable fixtures. (orig./RW)

  6. Centrally Banked Cryptocurrencies

    OpenAIRE

    Danezis, George; Meiklejohn, Sarah

    2015-01-01

    Current cryptocurrencies, starting with Bitcoin, build a decentralized blockchain-based transaction ledger, maintained through proofs-of-work that also generate a monetary supply. Such decentralization has benefits, such as independence from national political control, but also significant limitations in terms of scalability and computational cost. We introduce RSCoin, a cryptocurrency framework in which central banks maintain complete control over the monetary supply, but rely on a distribut...

  7. Centrally Banked Cryptocurrencies

    OpenAIRE

    Danezis, G.; Meiklejohn, S.

    2016-01-01

    Current cryptocurrencies, starting with Bitcoin, build a decentralized blockchain-based transaction ledger, maintained through proofs-of-work that also serve to generate a monetary supply. Such decentralization has benefits, such as independence from national political control, but also significant limitations in terms of computational costs and scalability. We introduce RSCoin, a cryptocurrency framework in which central banks maintain complete control over the monetary supply, but rely on...

  8. Large-coil-test-facility fault-tree analysis

    International Nuclear Information System (INIS)

    An operating-safety study is being conducted for the Large Coil Test Facility (LCTF). The purpose of this study is to provide the facility operators and users with added insight into potential problem areas that could affect the safety of personnel or the availability of equipment. This is a preliminary report on Phase I of that study. A central feature of the study is the incorporation of engineering judgements (by LCTF personnel) into an outside, overall view of the facility. The LCTF was analyzed in terms of 32 subsystems, each of which is subject to failure from any of 15 generic failure initiators. The study identified approximately 40 primary areas of concern, which were subjected to a computer analysis as an aid in understanding the complex subsystem interactions that can occur within the facility. The study did not analyze in detail the internal structure of the subsystems at the individual component level. A companion study using traditional fault-tree techniques did analyze approximately 20% of the LCTF at the component level. A comparison between these two analysis techniques is included in Section 7
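
    The fault-tree technique used in the companion study evaluates a top-event probability from basic-event probabilities through AND and OR gates. A minimal sketch with hypothetical events and probabilities (not the actual LCTF model), assuming independent basic events:

```python
# Minimal fault-tree evaluation: OR and AND gates over independent basic events.
# The events and probabilities below are hypothetical illustrations only.

def p_or(*probs):
    """P(at least one event occurs), for independent events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def p_and(*probs):
    """P(all events occur), for independent events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical top event: loss of coil cooling =
#   (pump A fails AND pump B fails) OR isolation valve fails.
pump_a, pump_b, valve = 0.01, 0.01, 0.001
top = p_or(p_and(pump_a, pump_b), valve)
print(round(top, 6))  # → 0.0011 (dominated by the single valve failure)
```

    Even this toy tree shows the kind of insight such studies seek: the redundant pump pair contributes far less to the top event than the single unprotected valve.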

  9. Enhancement of central heating plant economic evaluation program for retrofit to coal. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Kinast, J.J.; Biederman, R.; Blazek, C.F.; Lin, M.C.; Moshagge, R.E.

    1994-09-01

    Public Law 99-190 requires the Department of Defense (DOD) to increase the use of coal for steam generation. However, DOD also has an obligation to use the most economical fuel. Supporting the coal conversion effort, the U.S. Army Construction Engineering Research Laboratories (USACERL) has developed a computer program for engineering personnel at Major Army Commands, installations, the Defense Logistics Agency, and other DOD facilities to analyze the technical and economic feasibility of specific coal-combustion technologies at central heating plant facilities on military bases. The program, Central Heating Plant Economic Evaluation (CHPECON), can model plants that have a capacity of 50,000 to 600,000 MBtu/hr of steam, with individual boiler sizes from 25,000 to 200,000 MBtu/hr. The technologies examined include coal-fired stoker and fluidized-bed boilers, oil/natural gas boilers, and coal-slurry boilers. This report documents enhancements to existing CHPECON procedures for analyzing the retrofit or reconversion of a central heating plant to coal firing. They include: improving the screening and scoring process for boiler facilities considered for retrofit; adding options for converting a facility back to coal firing; detailing retrofit costs; upgrading economic analysis of a retrofit from an operating cost evaluation to a life-cycle cost analysis; and expanding the economic analysis to include examination of the condition of existing equipment.
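
    The upgrade from an operating-cost evaluation to a life-cycle cost analysis amounts to comparing present values of capital plus discounted annual costs across alternatives. A hedged sketch with illustrative numbers (not CHPECON's actual cost model, rates, or results):

```python
# Present-value life-cycle cost (LCC) comparison sketch for a coal retrofit
# decision. All dollar figures, horizon, and discount rate are hypothetical.

def life_cycle_cost(capital, annual_om, annual_fuel, years, discount_rate):
    """Present value of capital plus discounted O&M and fuel costs."""
    pv = capital
    for year in range(1, years + 1):
        pv += (annual_om + annual_fuel) / (1.0 + discount_rate) ** year
    return pv

# Hypothetical alternatives over a 25-year horizon at a 7% discount rate:
gas_lcc = life_cycle_cost(capital=0.0, annual_om=1.0e6, annual_fuel=6.0e6,
                          years=25, discount_rate=0.07)
coal_lcc = life_cycle_cost(capital=20.0e6, annual_om=2.0e6, annual_fuel=3.0e6,
                           years=25, discount_rate=0.07)
print(coal_lcc < gas_lcc)  # retrofit pays off if discounted savings exceed capital
```

    The design choice here mirrors the report's point: a pure operating-cost comparison would ignore the retrofit capital entirely, while the life-cycle view weighs it against discounted fuel savings.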

  10. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C^3P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C^3P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high-performance computing facility based exclusively on parallel computers. While the initial focus of C^3P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  11. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-94, with projections to 2020; (supplement one to U.S. Geological Survey Water-resources investigations report 94-4251)

    Science.gov (United States)

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.). Output files resulting from the computer simulations are included for reference.

  12. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to the renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research, including theoretical developments and new computational algorithms.

  13. Women and Computers: An Introduction.

    Science.gov (United States)

    Perry, Ruth; Greber, Lisa

    1990-01-01

    Discusses women's central role in the development of the computer and their present day peripheral position, a progression paralleled in the fields of botany, medical care, and obstetrics. Affirms the importance of computer education to women. (DM)

  14. Facility Management

    OpenAIRE

    Král, David

    2012-01-01

    The topic of this bachelor's thesis is finding a way to increase the long-term efficiency and prosperity of a company that, as part of its business activities, manages and maintains its own real estate in the centre of Brno. The thesis starts from the current state of the company's facility management and a definition of its strengths and weaknesses. The basis for the proposed effective facility management is a financial analysis of the company and the tracking of its costs, including their optimization. The main contribution of my bachelor's thesis...

  15. Central heating plant economic evaluation program. Volume 4. Coalfield properties information data management program. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lin, M.C.; Moshage, R.; Schanche, G.; Blazek, C.; Biederman, R.

    1995-01-01

    Public law has directed the Department of Defense (DoD) to rehabilitate and convert its existing domestic power plants to burn more coal. Other Federal legislation requires DoD to use the most economic fuel for any new heating system. This five-volume report discusses the Central Heating Plant Economic Evaluation Program (CHPECON), a computer program for screening potential new and retrofit steam/power generation facilities. Volume 4 is the Coalfield Properties Information Data Management Program. CHPECON provides screening criteria to evaluate competing combustion technologies using coal, gas, or oil; detailed conceptual facility design information; budgetary facility costs; and economic measures of project acceptability including total life cycle costs and levelized cost of service. The program provides sufficient flexibility to vary critical design and operating parameters to determine project sensitivity and parametric evaluation.

  16. Biotechnology Facility (BTF) for ISS

    Science.gov (United States)

    1998-01-01

    Engineering mockup shows the general arrangement of the planned Biotechnology Facility inside an EXPRESS rack aboard the International Space Station. This layout includes a gas supply module (bottom left), control computer and laptop interface (bottom right), two rotating wall vessels (top right), and support systems.

  17. Facility model for the Los Alamos Plutonium Facility

    International Nuclear Information System (INIS)

    The Los Alamos Plutonium Facility contains more than sixty unit processes and handles a large variety of nuclear materials, including many forms of plutonium-bearing scrap. The management of the Plutonium Facility is supporting the development of a computer model of the facility as a means of effectively integrating the large amount of information required for material control, process planning, and facility development. The model is designed to provide a flexible, easily maintainable facility description that allows the facility to be represented at any desired level of detail within a single modeling framework, and to do this using a model program and data files that can be read and understood by a technically qualified person without modeling experience. These characteristics were achieved by structuring the model so that all facility data is contained in data files, formulating the model in a simulation language that provides a flexible set of data structures and permits a near-English-language syntax, and using a description for unit processes that can represent either a true unit process or a major subsection of the facility. Use of the model is illustrated by applying it to two configurations of a fictitious nuclear material processing line.

  18. Theme: Laboratory Facilities Improvement.

    Science.gov (United States)

    Miller, Glen M.; And Others

    1993-01-01

    Includes "Laboratory Facilities Improvement" (Miller); "Remodeling Laboratories for Agriscience Instruction" (Newman, Johnson); "Planning for Change" (Mulcahy); "Laboratory Facilities Improvement for Technology Transfer" (Harper); "Facilities for Agriscience Instruction" (Agnew et al.); "Laboratory Facility Improvement" (Boren, Dwyer); and…

  19. [DNA computing].

    Science.gov (United States)

    Błasiak, Janusz; Krasiński, Tadeusz; Popławski, Tomasz; Sakowski, Sebastian

    2011-01-01

    Biocomputers can be an alternative to traditional "silicon-based" computers, whose continued development may be limited by further miniaturization (imposed by the Heisenberg uncertainty principle) and by the growing traffic of information between the central processing unit and the main memory (the von Neumann bottleneck). The idea of DNA computing was first realized in 1994, when Adleman solved the Hamiltonian Path Problem using short DNA oligomers and DNA ligase. In the early 2000s a series of biocomputer models was presented, including the seminal work of Shapiro and his colleagues, who presented a molecular two-state finite automaton in which the restriction enzyme FokI constituted the hardware and short DNA oligomers served as software as well as input/output signals; the DNA molecules also provided the energy for this machine. DNA computing can be exploited in many applications, from studies of gene expression patterns to the diagnosis and therapy of cancer. Research on DNA computing is still in progress, both in vitro and in vivo, and the promising results of this research give hope for a breakthrough in computer science. PMID:21735816
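    Adleman's 1994 experiment solved the Hamiltonian Path Problem by generating an enormous pool of candidate paths in parallel (as DNA strands) and filtering out the invalid ones. A minimal Python sketch of the same generate-and-filter idea, using a small made-up graph rather than Adleman's original instance:

```python
from itertools import permutations

def hamiltonian_path(vertices, edges, start, end):
    """Brute-force generate-and-filter search for a Hamiltonian path.

    Every ordering of the intermediate vertices is 'generated' (in the
    DNA experiment, as oligomer concatenations) and orderings that do
    not follow the graph's directed edges are filtered out.
    """
    edge_set = set(edges)
    middle = [v for v in vertices if v not in (start, end)]
    for perm in permutations(middle):
        path = (start, *perm, end)
        if all((a, b) in edge_set for a, b in zip(path, path[1:])):
            return list(path)
    return None

# A small hypothetical 7-vertex graph (not Adleman's original one).
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (0, 3), (2, 5)]
print(hamiltonian_path(range(7), edges, 0, 6))  # → [0, 1, 2, 3, 4, 5, 6]
```

    The factorial blow-up of this search is exactly what made Adleman's approach notable: a test tube of DNA explores all candidate strands simultaneously.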

  20. About the Facilities for the Aged in Central Urban Areas of Ningbo

    Institute of Scientific and Technical Information of China (English)

    2013-01-01

    As China's population ages, enabling the elderly to live out their later years comfortably has become one of the main problems facing the country's major cities. The supply of facilities for the aged is the material basis for this goal. Focusing on the gap between the supply of such facilities and the care needs of the elderly population in central Ningbo, this article analyzes the supply of and demand for facilities for the aged, describes the problems in their provision, and proposes measures, at both the physical-facility and service levels, to resolve the current difficulties.

  1. Medical Image Analysis Facility

    Science.gov (United States)

    1978-01-01

    To improve the quality of photos sent to Earth by unmanned spacecraft, NASA's Jet Propulsion Laboratory (JPL) developed a computerized image enhancement process that brings out detail not visible in the basic photo. JPL is now applying this technology to biomedical research in its Medical Image Analysis Facility, which employs computer enhancement techniques to analyze x-ray films of internal organs, such as the heart and lung. A major objective is the study of the effects of stress on persons with heart disease. In animal tests, computerized image processing is being used to study coronary artery lesions and the degree to which they reduce arterial blood flow when stress is applied. The photos illustrate the enhancement process. The upper picture is an x-ray photo in which the artery (dotted line) is barely discernible; in the post-enhancement photo at right, the whole artery and the lesions along its wall are clearly visible. The Medical Image Analysis Facility offers a faster means of studying the effects of complex coronary lesions in humans, and the research now being conducted on animals is expected to have important application to the diagnosis and treatment of human coronary disease. Other uses of the facility's image processing capability include analysis of muscle biopsy and pap smear specimens, and study of the microscopic structure of fibroprotein in the human lung. Working with JPL on the experiments are NASA's Ames Research Center, the University of Southern California School of Medicine, and Rancho Los Amigos Hospital, Downey, California.

  2. CRYSNET manual. Informal report. [Hardware and software of crystallographic computing network

    Energy Technology Data Exchange (ETDEWEB)

    None,

    1976-07-01

    This manual describes the hardware and software which together make up the crystallographic computing network (CRYSNET). The manual is intended as a users' guide and also provides general information for persons without any experience with the system. CRYSNET is a network of intelligent remote graphics terminals that are used to communicate with the CDC Cyber 70/76 computing system at the Brookhaven National Laboratory (BNL) Central Scientific Computing Facility. Terminals are in active use by four research groups in the field of crystallography. A protein data bank has been established at BNL to store in machine-readable form atomic coordinates and other crystallographic data for macromolecules. The bank currently includes data for more than 20 proteins. This structural information can be accessed at BNL directly by the CRYSNET graphics terminals. More than two years of experience has been accumulated with CRYSNET. During this period, it has been demonstrated that the terminals, which provide access to a large, fast third-generation computer, plus stand-alone interactive graphics capability, are useful for computations in crystallography, and in a variety of other applications as well. The terminal hardware, the actual operations of the terminals, and the operations of the BNL Central Facility are described in some detail, and documentation of the terminal and central-site software is given. (RWR)

  3. Facility management in German hospitals.

    Science.gov (United States)

    Gudat, H

    2000-04-01

    Facility management and optimum building management offer for hospitals a chance to reduce costs and to increase quality, process sequences, employee motivation and customer satisfaction. Some years ago simple services such as cleaning, catering or laundry were outsourced. Now, German hospitals progress to more complex fields such as building and medical technology, clinical support processes such as pharmacy, central laboratory and sterilization, goods and logistics services. PMID:11066999

  4. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent-based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class, where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.
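    As an illustration only, the index-based decomposition of entities across workers can be sketched in a few lines of Python. The update rule below is a hypothetical stand-in for the heavy per-entity sub-models mentioned in the report, and this is not the Parallel Particle Data Model's API (which targets Java):

```python
from concurrent.futures import ThreadPoolExecutor

def step_entity(state):
    # Hypothetical per-entity update; real sub-models (e.g. Seldon's
    # cognitive models) are far more computationally intensive.
    return state + 1

def parallel_tick(entities, workers=4):
    """Advance every entity by one tick, spread across worker threads.

    With no spatial organizing principle to guide the decomposition,
    entities are simply split by index, as map() does here.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(step_entity, entities))

states = parallel_tick([0] * 8)  # → [1, 1, 1, 1, 1, 1, 1, 1]
```

    The emergent-result property described above means the interesting output only appears when the population is large, which is why the decomposition itself, not the per-entity arithmetic, is the hard part.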

  5. Designing for the home: a comparative study of support aids for central heating systems.

    Science.gov (United States)

    Sauer, J; Wastell, D G; Schmeink, C

    2009-03-01

    The study examined the influence of different types of enhanced system support on user performance during the management of a central heating system. A computer-based simulation of a central heating system, called CHESS V2.0, was used to model different interface options, providing different support facilities to the user (e.g., historical, predictive, and instructional displays). Seventy-five participants took part in the study and completed a series of operational scenarios under different support conditions. The simulation environment allowed the collection of performance measures (e.g., energy consumption), information sampling, and system control behaviour. Subjective user evaluations of various aspects of the system were also measured. The results showed performance gains for predictive displays whereas no such benefits were observed for the other display types. The data also revealed that status and predictive displays were valued most highly by users. The implications of the findings for designers of central heating systems are discussed. PMID:18433730

  7. Indoor Lighting Facilities

    Science.gov (United States)

    Matsushima, Koji; Saito, Yoshinori; Ichikawa, Shigenori; Kawauchi, Takao; Tanaka, Tsuneo; Hirano, Rika; Tazuke, Fuyuki

    According to the statistics by the Ministry of Land, Infrastructure and Transport, the total floor space of all building construction started was 188.87 million m2 (1.5% increase y/y), marking the fourth straight year of increase. Many large-scale buildings under construction in central Tokyo become fully occupied by tenants before completion. As for office buildings, it is required to develop comfortable and functional office spaces as working styles are becoming more and more diversified, and lighting is also an element of such functionalities. The total floor space of construction started for exhibition pavilions, multipurpose halls, conference halls and religious architectures decreased 11.1% against the previous year. This marked a decline for 10 consecutive years and the downward trend continues. In exhibition pavilions, the light radiation is measured and adjusted throughout the year so as not to damage the artworks by lighting. Hospitals, while providing higher quality medical services and enhancing the dwelling environment of patients, are expected to meet various restrictions and requirements, including the respect for privacy. Meanwhile, lighting designs for school classrooms tend to be homogeneous, yet new ideas are being promoted to strike a balance between the economical and functional aspects. The severe economic environment continues to be hampering the growth of theaters and halls in both the private and public sectors. Contrary to the downsizing trend of such facilities, additional installations of lighting equipment were conspicuous, and the adoption of high efficacy lighting appliances and intelligent function control circuits are becoming popular. In the category of stores/commercial facilities, the construction of complex facilities is a continuing trend. Indirect lighting, high luminance discharge lamps with excellent color rendition and LEDs are being effectively used in these facilities, together with the introduction of lighting designs

  8. central t

    Directory of Open Access Journals (Sweden)

    Manuel R. Piña Monarrez

    2007-01-01

    Full Text Available Since ridge regression (RR) is a biased estimation that starts from the ordinary least squares (OLS) regression solution, it is vital to establish the conditions under which the central Student's t distribution used for hypothesis testing in OLS also applies to RR. The proof of this important result is presented in this article.
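    For reference, the two estimators the article relates can be written side by side: the OLS estimate is b = (X'X)⁻¹X'y, while the ridge estimate is b(k) = (X'X + kI)⁻¹X'y, which coincides with OLS at k = 0. A brief NumPy sketch with made-up data:

```python
import numpy as np

def ols(X, y):
    """Ordinary least-squares estimate: solve (X'X) b = X'y."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, k):
    """Ridge estimate: solve (X'X + kI) b = X'y; equals OLS at k = 0."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

# Made-up data; for k > 0 ridge shrinks the coefficient vector.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 2.0, 2.9])
print(ols(X, y), ridge(X, y, 5.0))
```

    The fact that ridge "starts from" the OLS solution is visible here: setting k = 0 recovers the OLS estimate exactly.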

  9. Expertise concerning the request by the ZWILAG Intermediate Storage Wuerenlingen AG for delivery of the operation licence for the conditioning installation as well as for the burning and melting installation of the central intermediate storage facility for radioactive wastes in Wuerenlingen

    International Nuclear Information System (INIS)

    On December 15, 1997, the Central Intermediate Storage Facility for radioactive materials (ZWILAG) submitted a request to the Swiss Federal Council for an operating licence for the conditioning installation and the incineration and melting installation at the Central Intermediate Storage Facility (ZZL). The general licence for the whole project was granted on June 23, 1993. The construction of the whole installation, as well as operation of the storage halls with reception area, hot cell, and transhipment station, was licensed on August 12, 1999. The Federal Agency for the Safety of Nuclear Installations (HSK) examined ZWILAG's request documentation from the point of view of nuclear safety and radiation protection; the results of that examination are contained in this report. Overall, the examination leads to a positive conclusion. During the realisation process, many project adaptations have improved the installation with respect to radiation protection and the treatment of wastes, and HSK requests from an earlier licensing process were taken into account by ZWILAG. The present examination contains many remarks and requests that HSK considers necessary. The most important remarks are issued as special references insofar as they do not already appear as proposals among the licence obligations. In general, the reference form was chosen where fulfilment of the request can be guaranteed by HSK within the framework of a release process. Several references inform ZWILAG of problems for which HSK wishes to receive the results of extended inquiries, or where HSK will assist in testing or demonstrations. Other references concern requests for plant documentation. Further, calculations of radioactive material releases during normal operation are considered necessary.
Based on its examination, HSK concludes that the conditions are fulfilled for the safe operation of the conditioning installation and of the incineration and melting installation as long as

  10. The centralized monitoring system solution for auxiliary equipment and environmental facilities in station operation

    Institute of Scientific and Technical Information of China (English)

    张振华

    2013-01-01

    With the gradual expansion of power system construction, unattended substations have become inevitable. Because the auxiliary and environmental equipment involved in station operation is relatively numerous and varied, efficient centralized monitoring and management of this equipment is the first difficulty to be solved. Based on switch-state information collected through automatic control and new sensing technology, a new solution for a centralized monitoring system for auxiliary and environmental equipment in station operation is designed and proposed. The power communication network and the internal information network send back the running-state data of the auxiliary and environmental equipment, allowing the dispatching terminal to carry out centralized, real-time monitoring and management of the station's auxiliary equipment and environment. This not only greatly improves the quality and efficiency of equipment operation and maintenance, but also provides a new solution for the operation and maintenance of substation communication networks, effectively supporting the development of unmanned substations.

  11. Example of shared ATLAS Tier2 and Tier3 facilities

    CERN Document Server

    Gonzalez de la Hoz, S; Kemp, Y; Wolters, H; Severini, H; Bhimji, W

    2012-01-01

    The ATLAS computing and data models have moved, and are still moving, away from the strict hierarchical MONARC model to a mesh model. This evolution of the computing model also requires evolution of the network infrastructure, to enable any Tier2 and Tier3 to connect easily to any Tier1 or Tier2, and it requires several changes to the data model: a) any site can replicate data from any other site; b) dynamic data caching, in which analysis sites receive datasets from any other site "on demand" based on usage patterns, possibly using dynamic placement of datasets by centrally managed replication of whole datasets, with unused data removed; c) remote data access, in which local jobs can access data stored at remote sites using local caching at the file or sub-file level. In this contribution, the model of shared ATLAS Tier2 and Tier3 facilities in the EGI/gLite flavour is explained. The Tier3s in the US and the Tier3s in Europe are rather different, because in Europe we have facilities that are Tier2s with a Tier3 component (Tier3 with a c...

  12. North Slope, Alaska ESI: FACILITY (Facility Points)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains data for oil field facilities for the North Slope of Alaska. Vector points in this data set represent oil field facility locations. This data...

  13. Air Quality Facilities

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Facilities with operating permits for Title V of the Federal Clean Air Act, as well as facilities required to submit an air emissions inventory, and other...

  14. Basic Research Firing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Basic Research Firing Facility is an indoor ballistic test facility that has recently transitioned from a customer-based facility to a dedicated basic research...

  15. Central nervous system abnormalities in midline facial defects with hypertelorism detected by magnetic resonance imaging and computed tomography

    Directory of Open Access Journals (Sweden)

    Vera Lúcia Gil-da-Silva-Lopes

    2006-12-01

    Full Text Available The aim of this study was to describe and compare structural central nervous system (CNS) anomalies detected by magnetic resonance imaging (MRI) and computed tomography (CT) in individuals affected by midline facial defects with hypertelorism (MFDH), either isolated or associated with multiple congenital anomalies (MCA). The investigation protocol included a dysmorphological examination, skull and facial X-rays, and brain CT and/or MRI. We studied 24 individuals, 12 of whom had the isolated form (Group I) and the others MCA of unknown etiology (Group II). There was no significant difference between Groups I and II, so the results are presented together. In addition to the several CNS anomalies previously described, MRI (n=18) was useful for the detection of neuronal migration errors. These data suggest that structural CNS anomalies and MFDH have an intrinsic embryological relationship, which should be taken into account during clinical follow-up.

  16. UNI C - A True Internet Pioneer, the Danish Computing Centre for Research and Education

    DEFF Research Database (Denmark)

    Olesen, Dorte

    2015-01-01

    …that small computers could now be purchased for local use by the university departments, whereas the need for high-performance computing could only be satisfied by a joint national purchase and advanced network access to this central computer facility. The new center was named UNI-C and succeeded in helping Danish frontline research to use innovative computing techniques and achieve major breakthroughs using the first massively parallel computer architectures, but the greatest impact of UNI-C on Danish society was the successful early roll-out of the Internet to universities, with a follow-up of establishing the first Danish Internet service for ordinary PC users. This very first Internet service became a great success and helped to put Denmark on the international map as one of the very early Internet adopters. It also meant that UNI-C was tasked by the Ministry of Education with delivering a number…

  17. Facility Registry Service (FRS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Facility Registry Service (FRS) provides an integrated source of comprehensive (air, water, and waste) environmental information about facilities across EPA,...

  18. Licensed Healthcare Facilities

    Data.gov (United States)

    California Department of Resources — The Licensed Healthcare Facilities point layer represents the locations of all healthcare facilities licensed by the State of California, Department of Health...

  19. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  20. Bioinformatics and Computational Core Technology Center

    Data.gov (United States)

    Federal Laboratory Consortium — Services provided by the Computer Core Facility: evaluation, purchase, setup, and maintenance of the computer hardware and network for the 170 users in the research...

  1. Miniaturized Airborne Imaging Central Server System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is a miniaturized airborne imaging central server system (MAICSS). MAICSS is designed as a high-performance computer-based electronic backend that...

  3. Centralized Fabric Management Using Puppet, Git, and GLPI

    Science.gov (United States)

    Smith, Jason A.; De Stefano, John S., Jr.; Fetzko, John; Hollowell, Christopher; Ito, Hironori; Karasawa, Mizuki; Pryor, James; Rao, Tejas; Strecker-Kellogg, William

    2012-12-01

    Managing the infrastructure of a large and complex data center can be extremely difficult without taking advantage of recent technological advances in administrative automation. Puppet is a seasoned open-source tool that is designed for enterprise class centralized configuration management. At the RHIC and ATLAS Computing Facility (RACF) at Brookhaven National Laboratory, we use Puppet along with Git, GLPI, and some custom scripts as part of our centralized configuration management system. In this paper, we discuss how we use these tools for centralized configuration management of our servers and services, change management requiring authorized approval of production changes, a complete version controlled history of all changes made, separation of production, testing and development systems using puppet environments, semi-automated server inventory using GLPI, and configuration change monitoring and reporting using the Puppet dashboard. We will also discuss scalability and performance results from using these tools on a 2,000+ node cluster and 400+ infrastructure servers with an administrative staff of approximately 25 full-time employees (FTEs).
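    The core idea behind an agent run in a tool like Puppet — converging a machine toward a centrally declared desired state — can be caricatured in a few lines of Python. The resource names below are hypothetical, and this is only a toy model of desired-state convergence, not Puppet's actual catalog mechanism:

```python
def converge(desired, actual):
    """Diff a desired-state catalog against the observed system state.

    Returns the resources that must change and those that should be
    removed; a real configuration-management agent would then apply
    (and report) exactly this diff rather than rebuilding everything.
    """
    changes = {r: s for r, s in desired.items() if actual.get(r) != s}
    removals = [r for r in actual if r not in desired]
    return changes, removals

# Hypothetical resources on one managed node.
desired = {"ntp.service": "running", "sshd.config": "v2"}
actual = {"ntp.service": "stopped", "httpd.service": "running"}
print(converge(desired, actual))
```

    Because the desired state is declared centrally (and, as in the RACF setup, version-controlled in Git), every change applied to the fleet is reviewable and reproducible.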

  4. Centralized Fabric Management Using Puppet, Git, and GLPI

    International Nuclear Information System (INIS)

    Managing the infrastructure of a large and complex data center can be extremely difficult without taking advantage of recent technological advances in administrative automation. Puppet is a seasoned open-source tool that is designed for enterprise-class centralized configuration management. At the RHIC and ATLAS Computing Facility (RACF) at Brookhaven National Laboratory, we use Puppet along with Git, GLPI, and some custom scripts as part of our centralized configuration management system. In this paper, we discuss how we use these tools for centralized configuration management of our servers and services; change management requiring authorized approval of production changes; a complete version-controlled history of all changes made; separation of production, testing, and development systems using Puppet environments; semi-automated server inventory using GLPI; and configuration change monitoring and reporting using the Puppet dashboard. We will also discuss scalability and performance results from using these tools on a 2,000+ node cluster and 400+ infrastructure servers with an administrative staff of approximately 25 full-time employees (FTEs).

  5. Experiments in computing: a survey.

    Science.gov (United States)

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  6. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020; (supplement three to U.S. Geological Survey Water-resources investigations report 94-4251)

    Science.gov (United States)

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) and revised by Kernodle (Kernodle, J.M., 1998, Simulation of ground-water flow in the Albuquerque Basin, 1901-95, with projections to 2020 (supplement two to U.S. Geological Survey Water-Resources Investigations Report 94-4251): U.S. Geological Survey Open-File Report 96-209, 54 p.). Output files resulting from the computer simulations are included for reference.

  7. Computing by Observing Changes

    Science.gov (United States)

    Cavaliere, Matteo; Leupold, Peter

    Computing by Observing is a paradigm for the implementation of models of Natural Computing. It was inspired by the setup of experiments in biochemistry. One central feature is an observer that translates the evolution of an underlying observed system into sequences over a finite alphabet. We take a step toward more realistic observers by allowing them to notice only an occurring change in the observed system rather than to read the system's entire configuration. Compared to previous implementations of the Computing by Observing paradigm, this decreases the computational power; but with relatively simple systems we still obtain the language class generated by matrix grammars.
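The observer the abstract describes, one that notices only that (and where) a change occurred rather than reading the entire configuration, can be sketched as follows. The encoding chosen here (emit the symbol written at the changed position) is a hypothetical illustration, not the specific observer family studied in the paper.

```python
# Sketch of a change-noticing observer: it watches successive configurations
# of an underlying system and emits one output symbol per observed change,
# producing a word over a finite alphabet. The encoding (emit the new symbol
# at the changed position) is a hypothetical illustration.

def observe(states):
    """Translate a run of equal-length configurations into an output word."""
    word = []
    for prev, curr in zip(states, states[1:]):
        changed = [i for i in range(len(prev)) if prev[i] != curr[i]]
        if not changed:
            continue            # no change: the observer stays silent
        i = changed[0]          # assume at most one rewrite per step
        word.append(curr[i])    # emit the symbol that was written
    return "".join(word)
```

For the run `["aaa", "aba", "abb"]` the observer emits `"bb"`: it saw `b` written at position 1, then `b` at position 2, without ever reporting a full configuration.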

  8. Continuous real-time water-quality monitoring and regression analysis to compute constituent concentrations and loads in the North Fork Ninnescah River upstream from Cheney Reservoir, south-central Kansas, 1999–2012

    Science.gov (United States)

    Stone, Mandy L.; Graham, Jennifer L.; Gatotho, Jackline W.

    2013-01-01

    Cheney Reservoir, located in south-central Kansas, is the primary water supply for the city of Wichita. The U.S. Geological Survey has operated a continuous real-time water-quality monitoring station since 1998 on the North Fork Ninnescah River, the main source of inflow to Cheney Reservoir. Continuously measured water-quality physical properties include streamflow, specific conductance, pH, water temperature, dissolved oxygen, and turbidity. Discrete water-quality samples were collected during 1999 through 2009 and analyzed for sediment, nutrients, bacteria, and other water-quality constituents. Regression models were developed to establish relations between discretely sampled constituent concentrations and continuously measured physical properties to compute concentrations of those constituents of interest that are not easily measured in real time because of limitations in sensor technology and fiscal constraints. Regression models were published in 2006 that were based on data collected during 1997 through 2003. This report updates those models using discrete and continuous data collected during January 1999 through December 2009. Models also were developed for four new constituents, including additional nutrient species and indicator bacteria. In addition, a conversion factor of 0.68 was established to convert the Yellow Springs Instruments (YSI) model 6026 turbidity sensor measurements to the newer YSI model 6136 sensor at the North Fork Ninnescah River upstream from Cheney Reservoir site. Newly developed models and 14 years of hourly continuously measured data were used to calculate selected constituent concentrations and loads during January 1999 through December 2012.
The water-quality information in this report is important to the city of Wichita because it allows the concentrations of many potential pollutants of interest to Cheney Reservoir, including nutrients and sediment, to be estimated in real time and characterized over conditions and time scales that
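The two computational steps the record describes, converting the older turbidity sensor's readings with the 0.68 factor and then evaluating a regression model against the continuous record, can be sketched like this. The log-linear model form and its coefficients `b0` and `b1` are hypothetical placeholders, not the models actually published for this site.

```python
import math

# Conversion factor from the report: multiply YSI model 6026 turbidity
# readings by 0.68 to express them on the YSI model 6136 sensor's scale.
YSI_6026_TO_6136 = 0.68

def convert_turbidity(turb_6026: float) -> float:
    return turb_6026 * YSI_6026_TO_6136

def estimate_concentration(turbidity: float, b0: float, b1: float) -> float:
    """Evaluate a log-linear surrogate regression:
        log10(C) = b0 + b1 * log10(turbidity)
    The model form and coefficients are illustrative placeholders, not the
    models published for the North Fork Ninnescah River site."""
    return 10 ** (b0 + b1 * math.log10(turbidity))
```

For example, a model 6026 reading of 100 turbidity units converts to 68 units on the 6136 scale before being fed to the regression.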

  9. Computation Directorate 2007 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Henson, V E; Guse, J A

    2008-03-06

    software; from expanding BlueGene/L, the world's most powerful computer, by 60% and using it to capture the most prestigious prize in the field of computing, to helping create an automated control system for the National Ignition Facility (NIF) that monitors and adjusts more than 60,000 control and diagnostic points; from creating a microarray probe that rapidly detects virulent high-threat organisms, natural or bioterrorist in origin, to replacing large numbers of physical computer servers with small numbers of virtual servers, reducing operating expense by 60%, the people in Computation have been at the center of weighty projects whose impacts are felt across the Laboratory and the DOE community. The accomplishments I just mentioned, and another two dozen or so, make up the stories contained in this report. While they form an exceptionally diverse set of projects and topics, it is what they have in common that excites me. They share the characteristic of being central, often crucial, to the mission-driven business of the Laboratory. Computational science has become fundamental to nearly every aspect of the Laboratory's approach to science and even to the conduct of administration. It is difficult to consider how we would proceed without computing, which occurs at all scales, from handheld and desktop computing to the systems controlling the instruments and mechanisms in the laboratories to the massively parallel supercomputers. The reasons for the dramatic increase in the importance of computing are manifest. Practical, fiscal, or political realities make the traditional approach to science, the cycle of theoretical analysis leading to experimental testing, leading to adjustment of theory, and so on, impossible, impractical, or forbidden. How, for example, can we understand the intricate relationship between human activity and weather and climate? We cannot test our hypotheses by experiment, which would require controlled use of the entire earth over centuries

  10. AOV Facility Tool/Facility Safety Specifications

    Data.gov (United States)

    Department of Transportation — Develop and maintain authorizing documents that are standards that facilities must follow. These standards are references of FAA regulations and are specific to the...

  11. 75 FR 76041 - Notice; Applications and Amendments to Facility Operating Licenses Involving Proposed No...

    Science.gov (United States)

    2010-12-07

    ... attack threat, thereby achieving high assurance that the facility's digital computer and communications... security program to protect digital computer and communication systems and networks against cyber attacks... cyber attacks. The Plan describes how plant modifications that involve digital computer systems...

  12. The Education Value of Cloud Computing

    Science.gov (United States)

    Katzan, Harry, Jr.

    2010-01-01

    Cloud computing is a technique for supplying computer facilities and providing access to software via the Internet. Cloud computing represents a contextual shift in how computers are provisioned and accessed. One of the defining characteristics of cloud software service is the transfer of control from the client domain to the service provider.…

  13. Composite analysis E-area vaults and saltstone disposal facilities

    International Nuclear Information System (INIS)

    This report documents the Composite Analysis (CA) performed on the two active Savannah River Site (SRS) low-level radioactive waste (LLW) disposal facilities. The facilities are the Z-Area Saltstone Disposal Facility and the E-Area Vaults (EAV) Disposal Facility. The analysis calculated potential releases to the environment from all sources of residual radioactive material expected to remain in the General Separations Area (GSA). The GSA is the central part of SRS and contains all of the waste disposal facilities, chemical separations facilities and associated high-level waste storage facilities as well as numerous other sources of radioactive material. The analysis considered 114 potential sources of radioactive material containing 115 radionuclides. The results of the CA clearly indicate that continued disposal of low-level waste in the saltstone and EAV facilities, consistent with their respective radiological performance assessments, will have no adverse impact on future members of the public

  14. Composite analysis E-area vaults and saltstone disposal facilities

    Energy Technology Data Exchange (ETDEWEB)

    Cook, J.R.

    1997-09-01

    This report documents the Composite Analysis (CA) performed on the two active Savannah River Site (SRS) low-level radioactive waste (LLW) disposal facilities. The facilities are the Z-Area Saltstone Disposal Facility and the E-Area Vaults (EAV) Disposal Facility. The analysis calculated potential releases to the environment from all sources of residual radioactive material expected to remain in the General Separations Area (GSA). The GSA is the central part of SRS and contains all of the waste disposal facilities, chemical separations facilities and associated high-level waste storage facilities as well as numerous other sources of radioactive material. The analysis considered 114 potential sources of radioactive material containing 115 radionuclides. The results of the CA clearly indicate that continued disposal of low-level waste in the saltstone and EAV facilities, consistent with their respective radiological performance assessments, will have no adverse impact on future members of the public.

  15. Central Virginia Community College 1999 Fact Book.

    Science.gov (United States)

    Hicks, Geoffrey

    The Fact Book is an annual publication of the Office of Research, Assessment and Planning at Central Virginia Community College (CVCC). It includes data and trends from 1994-1998 and is divided into six parts: (1) "The College" includes information on the College Board, institutional history, facilities, mission and goals, revenues and…

  16. In-facility transport code review

    Energy Technology Data Exchange (ETDEWEB)

    Spore, J.W.; Boyack, B.E.; Bohl, W.R. [and others]

    1996-07-01

    The following computer codes were reviewed by the In-Facility Transport Working Group for application to the in-facility transport of radioactive aerosols, flammable gases, and/or toxic gases: (1) CONTAIN, (2) FIRAC, (3) GASFLOW, (4) KBERT, and (5) MELCOR. Based on the review criteria as described in this report and the versions of each code available at the time of the review, MELCOR is the best code for the analysis of in-facility transport when multidimensional effects are not significant. When multi-dimensional effects are significant, GASFLOW should be used.

  17. Central differences, Euler numbers and symbolic methods

    CERN Document Server

    Dowker, J S

    2013-01-01

    I relate some coefficients encountered when computing the functional determinants on spheres to the central differentials of nothing. In doing this I use some historic works, in particular transcribing the elegant symbolic formalism of Jeffery (1861) into central difference form which has computational advantages for Euler numbers, as discovered by Shovelton (1915). I derive sum rules for these, and for the central differentials, the proof of which involves an interesting expression for powers of sech x as multiple derivatives. I present a more general, symbolic treatment of central difference calculus which allows known, and unknown, things to be obtained in an elegant and compact fashion gaining, at no cost, the expansion of the powers of the inverse sinh, a basic central function. Systematic use is made of the operator 2 asinh(D/2). Umbral calculus is employed to compress the operator formalism. For example the orthogonality/completeness of the factorial numbers, of the first and second kinds, translates, ...

  18. Nuclear fuel cycle facility accident analysis handbook

    International Nuclear Information System (INIS)

    The Accident Analysis Handbook (AAH) covers four generic facilities: fuel manufacturing, fuel reprocessing, waste storage/solidification, and spent fuel storage; and six accident types: fire, explosion, tornado, criticality, spill, and equipment failure. These are the accident types considered to make major contributions to the radiological risk from accidents in nuclear fuel cycle facility operations. The AAH will enable the user to calculate source term releases from accident scenarios manually or by computer. A major feature of the AAH is development of accident sample problems to provide input to source term analysis methods and transport computer codes. Sample problems and illustrative examples for different accident types are included in the AAH
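One widely used way to compute a source-term release of the kind the handbook addresses is the five-factor formula familiar from DOE-HDBK-3010-style analyses. Whether the AAH uses exactly this formulation is an assumption here, and the quantities in the usage example are purely illustrative.

```python
def source_term(mar: float, dr: float, arf: float, rf: float, lpf: float) -> float:
    """Five-factor source-term formula (DOE-HDBK-3010-style; whether the
    AAH's own methods match this exact form is an assumption):
        ST = MAR * DR * ARF * RF * LPF
    where MAR = material at risk (e.g., grams), DR = damage ratio,
    ARF = airborne release fraction, RF = respirable fraction,
    and LPF = leak path factor."""
    return mar * dr * arf * rf * lpf
```

For instance, 100 g of material at risk with DR = 0.5, ARF = 1e-3, RF = 0.1, and LPF = 0.01 yields a respirable release of 5e-5 g.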

  19. Nuclear fuel cycle facility accident analysis handbook

    Energy Technology Data Exchange (ETDEWEB)

    Ayer, J E; Clark, A T; Loysen, P; Ballinger, M Y; Mishima, J; Owczarski, P C; Gregory, W S; Nichols, B D

    1988-05-01

    The Accident Analysis Handbook (AAH) covers four generic facilities: fuel manufacturing, fuel reprocessing, waste storage/solidification, and spent fuel storage; and six accident types: fire, explosion, tornado, criticality, spill, and equipment failure. These are the accident types considered to make major contributions to the radiological risk from accidents in nuclear fuel cycle facility operations. The AAH will enable the user to calculate source term releases from accident scenarios manually or by computer. A major feature of the AAH is development of accident sample problems to provide input to source term analysis methods and transport computer codes. Sample problems and illustrative examples for different accident types are included in the AAH.

  20. Computational physics: a perspective.

    Science.gov (United States)

    Stoneham, A M

    2002-06-15

    Computing comprises three distinct strands: hardware, software and the ways they are used in real or imagined worlds. Its use in research is more than writing or running code. Having something significant to compute and deploying judgement in what is attempted and achieved are especially challenging. In science or engineering, one must define a central problem in computable form, run such software as is appropriate and, last but by no means least, convince others that the results are both valid and useful. These several strands are highly interdependent. A major scientific development can transform disparate aspects of information and computer technologies. Computers affect the way we do science, as well as changing our personal worlds. Access to information is being transformed, with consequences beyond research or even science. Creativity in research is usually considered uniquely human, with inspiration a central factor. Scientific and technological needs are major forces in innovation, and these include hardware and software opportunities. One can try to define the scientific needs for established technologies (atomic energy, the early semiconductor industry), for rapidly developing technologies (advanced materials, microelectronics) and for emerging technologies (nanotechnology, novel information technologies). Did these needs define new computing, or was science diverted into applications of then-available codes? Regarding credibility, why is it that engineers accept computer realizations when designing engineered structures, whereas predictive modelling of materials has yet to achieve industrial confidence outside very special cases? The tensions between computing and traditional science are complex, unpredictable and potentially powerful.

  1. Central Pain Syndrome

    Science.gov (United States)

    ... Enhancing Diversity Find People About NINDS NINDS Central Pain Syndrome Information Page Table of Contents (click to ... being done? Clinical Trials Organizations What is Central Pain Syndrome? Central pain syndrome is a neurological condition ...

  2. Central venous line - infants

    Science.gov (United States)

    CVL - infants; Central catheter - infants - surgically placed ... plastic tube that is put into a large vein in the chest. WHY IS A ... central catheter (PICC) or midline central catheter (MCC). A CVL ...

  3. Design of the disposal facility 2012

    International Nuclear Information System (INIS)

    The spent nuclear fuel accumulated from the nuclear power plants in Olkiluoto in Eurajoki and in Haestholmen in Loviisa will be disposed of in Olkiluoto. A facility complex will be constructed at Olkiluoto, and it will include two nuclear waste facilities according to Government Decree 736/2008. The nuclear waste facilities are an encapsulation plant, constructed to encapsulate spent nuclear fuel, and a disposal facility consisting of an underground repository and other underground rooms and above-ground service spaces. The repository is planned to be excavated to a depth of 400-450 meters. Access routes to the disposal facility are an inclined access tunnel and vertical shafts. The encapsulated fuel is transferred to the disposal facility in the canister lift. The canisters are transferred from the technical rooms to the disposal area via a central tunnel and deposited in the deposition holes which are bored in the floors of the deposition tunnels and are lined beforehand with compacted bentonite blocks. Two parallel central tunnels connect all the deposition tunnels and these central tunnels are interconnected at regular intervals. The solution improves the fire safety of the underground rooms and allows flexible backfilling and closing of the deposition tunnels in stages during the operational phase of the repository. An underground rock characterization facility, ONKALO, is excavated at the disposal level. ONKALO is designed and constructed so that it can later serve as part of the repository. The goal is that the first part of the disposal facility will be constructed under the building permit phase in the 2010s and operations will start in the 2020s. The fuel from the 4 operating reactors, as well as the fuel from the fifth nuclear power plant under construction, has been taken into account in designing the disposal facility. According to the information from TVO and Fortum, the amount of the spent nuclear fuel is 5,440 tU. 
The disposal facility is being excavated

  4. Human factors engineering report for the cold vacuum drying facility

    Energy Technology Data Exchange (ETDEWEB)

    IMKER, F.W.

    1999-06-30

    The purpose of this report is to present the results and findings of the final Human Factors Engineering (HFE) technical analysis and evaluation of the Cold Vacuum Drying Facility (CVDF). Ergonomics issues are also addressed in this report, as appropriate. This report follows up and completes the preliminary work accomplished and reported by the Preliminary HFE Analysis report (SNF-2825, Spent Nuclear Fuel Project Cold Vacuum Drying Facility Human Factors Engineering Analysis: Results and Findings). This analysis avoids redundancy of effort except for ensuring that previously recommended HFE design changes have not affected other parts of the system. Changes in one part of the system may affect other parts of the system where those changes were not applied. The final HFE analysis and evaluation of the CVDF human-machine interactions (HMI) was expanded to include: the physical work environment, human-computer interface (HCI) including workstation and software, operator tasks, tools, maintainability, communications, staffing, training, and the overall ability of humans to accomplish their responsibilities, as appropriate. Key focal areas for this report are the process bay operations, process water conditioning (PWC) skid, tank room, and Central Control Room operations. These key areas contain the system safety-class components and are the foundation for the human factors design basis of the CVDF.

  5. National Biomedical Tracer Facility. Project definition study

    International Nuclear Information System (INIS)

    We request a $25 million government-guaranteed, interest-free loan to be repaid over a 30-year period for construction and initial operations of a cyclotron-based National Biomedical Tracer Facility (NBTF) in North Central Texas. The NBTF will be co-located with a linear accelerator-based commercial radioisotope production facility, funded by the private sector at approximately $28 million. In addition, research radioisotope production by the NBTF will be coordinated through an association with an existing U.S. nuclear reactor center that will produce research and commercial radioisotopes through neutron reactions. The combined facilities will provide the full range of technology for radioisotope production and research: fast neutrons, thermal neutrons, and particle beams (H-, H+, and D+). The proposed NBTF facility includes an 80 MeV, 1 mA H- cyclotron that will produce proton-induced (neutron deficient) research isotopes

  6. National Biomedical Tracer Facility. Project definition study

    Energy Technology Data Exchange (ETDEWEB)

    Schafer, R.

    1995-02-14

    We request a $25 million government-guaranteed, interest-free loan to be repaid over a 30-year period for construction and initial operations of a cyclotron-based National Biomedical Tracer Facility (NBTF) in North Central Texas. The NBTF will be co-located with a linear accelerator-based commercial radioisotope production facility, funded by the private sector at approximately $28 million. In addition, research radioisotope production by the NBTF will be coordinated through an association with an existing U.S. nuclear reactor center that will produce research and commercial radioisotopes through neutron reactions. The combined facilities will provide the full range of technology for radioisotope production and research: fast neutrons, thermal neutrons, and particle beams (H-, H+, and D+). The proposed NBTF facility includes an 80 MeV, 1 mA H- cyclotron that will produce proton-induced (neutron deficient) research isotopes.

  7. Energetics Conditioning Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Energetics Conditioning Facility is used for long term and short term aging studies of energetic materials. The facility has 10 conditioning chambers of which 2...

  8. Dialysis Facility Compare

    Data.gov (United States)

    U.S. Department of Health & Human Services — Dialysis Facility Compare helps you find detailed information about Medicare-certified dialysis facilities. You can compare the services and the quality of care...

  9. Facility Response Plan (FRP)

    Data.gov (United States)

    U.S. Environmental Protection Agency — A Facility Response Plan (FRP) demonstrates a facility's preparedness to respond to a worst case oil discharge. Under the Clean Water Act, as amended by the Oil...

  10. Projectile Demilitarization Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Projectile Wash Out Facility is US Army Ammunition Peculiar Equipment (APE 1300). It is a pilot scale wash out facility that uses high pressure water and steam...

  11. Explosive Components Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The 98,000 square foot Explosive Components Facility (ECF) is a state-of-the-art facility that provides a full-range of chemical, material, and performance analysis...

  12. Armament Technology Facility (ATF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Armament Technology Facility is a 52,000 square foot, secure and environmentally-safe, integrated small arms and cannon caliber design and evaluation facility....

  13. Wastewater Treatment Facilities

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Individual permits for municipal, industrial, and semi-public wastewater treatment facilities in Iowa for the National Pollutant Discharge Elimination System...

  14. The Educational Facilities Charrette

    Science.gov (United States)

    Chase, William W.

    1970-01-01

    The deputy director for the Division of Facilities Development of the U.S. Office of Education discusses a technique for studying and resolving educational facilities development problems "within the context of total community planning needs." (Author/AA)

  15. Financing Professional Sports Facilities

    OpenAIRE

    Baade, Robert A.; Victor A. Matheson

    2011-01-01

    This paper examines public financing of professional sports facilities with a focus on both early and recent developments in taxpayer subsidization of spectator sports. The paper explores both the magnitude and the sources of public funding for professional sports facilities.

  16. Ouellette Thermal Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Thermal Test Facility is a joint Army/Navy state-of-the-art facility (8,100 ft2) that was designed to: Evaluate and characterize the effect of flame and thermal...

  17. Reminder: Mandatory Computer Security Course

    CERN Multimedia

    IT Department

    2011-01-01

    Just like any other organization, CERN is permanently under attack – even right now. Consequently it's important to be vigilant about security risks, protecting CERN's reputation - and your work. The availability, integrity and confidentiality of CERN's computing services and the unhindered operation of its accelerators and experiments come down to the combined efforts of the CERN Security Team and you. In order to keep pace with attack trends, the Security Team regularly reminds CERN users about the computer security risks, and about the rules for using CERN’s computing facilities. Therefore, a new dedicated basic computer security course has been designed to inform you about the “Do’s” and “Don’ts” when using CERN's computing facilities. This course is mandatory for all persons owning a CERN computer account and must be followed once every three years. Users who have never done the course, or whose course needs to be renewe...

  18. New Mandatory Computer Security Course

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    Just like any other organization, CERN is permanently under attack - even right now. Consequently it's important to be vigilant about security risks, protecting CERN's reputation - and your work. The availability, integrity and confidentiality of CERN's computing services and the unhindered operation of its accelerators and experiments come down to the combined efforts of the CERN Security Team and you. In order to keep pace with attack trends, the Security Team regularly reminds CERN users about the computer security risks, and about the rules for using CERN’s computing facilities. Since 2007, newcomers have to follow a dedicated basic computer security course informing them about the “Do’s” and “Don’ts” when using CERN's computing facilities. This course has recently been redesigned. It is now mandatory for all CERN members (users and staff) owning a CERN computer account and must be followed once every three years. Members who...

  19. Nuclear physics accelerator facilities

    International Nuclear Information System (INIS)

    The Department of Energy's Nuclear Physics program is a comprehensive program of interdependent experimental and theoretical investigation of atomic nuclei. Long range goals are an understanding of the interactions, properties, and structures of atomic nuclei and nuclear matter at the most elementary level possible and an understanding of the fundamental forces of nature by using nuclei as a proving ground. Basic ingredients of the program are talented and imaginative scientists and a diversity of facilities to provide the variety of probes, instruments, and computational equipment needed for modern nuclear research. Approximately 80% of the total Federal support of basic nuclear research is provided through the Nuclear Physics program; almost all of the remaining 20% is provided by the National Science Foundation. Thus, the Department of Energy (DOE) has a unique responsibility for this important area of basic science and its role in high technology. Experimental and theoretical investigations are leading us to conclude that a new level of understanding of atomic nuclei is achievable. This optimism arises from evidence that: (1) the mesons, protons, and neutrons which are inside nuclei are themselves composed of quarks and gluons and (2) quantum chromodynamics can be developed into a theory which both describes correctly the interaction among quarks and gluons and is also an exact theory of the strong nuclear force. These concepts are important drivers of the Nuclear Physics program

  20. Generalized plotting facility

    Energy Technology Data Exchange (ETDEWEB)

    Burris, R.D.; Gray, W.H.

    1978-01-01

    A command which causes the translation of any supported graphics file format to a format acceptable to any supported device was implemented on two linked DECsystem-10s. The processing of the command is divided into parsing and translating phases. In the parsing phase, information is extracted from the command and augmented by default data. The results of this phase are saved on disk, and the appropriate translating routine is invoked. Twenty-eight translating programs were implemented in this system. They support four different graphics file formats, including the DISSPLA and Calcomp formats, and seven different types of plotters, including Tektronix, Calcomp, and Versatec devices. Some of the plotters are devices linked to the DECsystem-10s, and some are driven by IBM System/360 computers linked via a communications network to the DECsystem-10s. The user of this facility can use any of the supported packages to create a file of graphics data, preview the file on an on-line scope, and, when satisfied, cause the same data to be plotted on a hard-copy device. All of the actions utilize a single simple command format. 2 figures.
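The two-phase design the abstract describes, a parsing phase that fills in defaults followed by a translating phase selected by (input format, output device), amounts to a dispatch table. The format and device names below come from the record; the translator bodies, the `DEFAULTS` values, and the request fields are hypothetical stand-ins.

```python
# Sketch of the parse-then-translate design: parse the command, apply
# default data, then dispatch to a translator keyed by (format, device).
# Translator bodies and default values are hypothetical stand-ins.

DEFAULTS = {"copies": 1, "device": "tektronix"}

def parse_command(args: dict) -> dict:
    """Parsing phase: merge the user's command with default data."""
    return {**DEFAULTS, **args}

def _disspla_to_tektronix(request: dict) -> str:
    return f"tektronix stream for {request['file']}"

def _calcomp_to_versatec(request: dict) -> str:
    return f"versatec raster for {request['file']}"

# One entry per supported (format, device) pair; the real system had 28.
TRANSLATORS = {
    ("disspla", "tektronix"): _disspla_to_tektronix,
    ("calcomp", "versatec"): _calcomp_to_versatec,
}

def plot(args: dict) -> str:
    """Translating phase: look up and invoke the matching translator."""
    request = parse_command(args)
    translator = TRANSLATORS[(request["format"], request["device"])]
    return translator(request)
```

With this shape, supporting a new format or device means adding one translator function and one dispatch entry, which matches how a single command could front 28 translating programs.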

  1. Dual-Spool Turbine Facility Design Overview

    Science.gov (United States)

    Giel, Paul; Pachlhofer, Pete

    2003-01-01

    The next generation of aircraft engines, both commercial and military, will attempt to capitalize on the benefits of close-coupled, vaneless, counter-rotating turbine systems. Experience has shown that significant risks and challenges are present with close-coupled systems in terms of efficiency and durability. The UEET program needs to demonstrate aerodynamic loading and efficiency goals for close-coupled, reduced-stage HP/LP turbine systems as a Level 1 Milestone for FY05. No research facility exists in the U.S. to provide risk reduction for successful development of close-coupled, high and low pressure turbine systems for the next generations of engines. To meet these objectives, the design, construction, and integrated systems testing of a Dual-Spool Turbine Facility (DSTF) has been initiated at the NASA Glenn Research Center. The facility will be a warm (~1000 °F), continuous-flow facility for overall aerodynamic performance and detailed flow field measurement acquisition. The facility will have state-of-the-art instrumentation to capture flow physics details. Accurate and reliable speed control will be achieved by utilizing the existing Variable Frequency Drive System. Utilization of this and other existing GRC centralized utilities will reduce the overall construction costs. The design allows for future installation of a turbine inlet combustor profile simulator. This presentation details the objectives of the facility and the concepts used in specifying its capabilities. Some preliminary design results will be presented along with a discussion of plans and schedules.

  2. Facility Description 2012. Summary report of the encapsulation plant and disposal facility designs

    International Nuclear Information System (INIS)

    The purpose of the facility description is to serve as a concise summary report of the scope of Posiva's nuclear facilities (the encapsulation plant and the disposal facility) in Olkiluoto. This facility description is based on the 2012 designs and the accompanying Posiva working reports. It depicts the nuclear facilities and their operation as the disposal of spent nuclear fuel starts in Olkiluoto in about 2020. According to the decisions-in-principle of the government, the spent nuclear fuel from the Loviisa and Olkiluoto nuclear power plants in operation, and the future cumulative spent nuclear fuel from the Loviisa 1 and 2 and Olkiluoto 1, 2, 3 and 4 nuclear power plants, is permitted to be disposed of in the Olkiluoto bedrock. The design of the disposal facility is based on the KBS-3V concept (vertical disposal). The long-term safety concept is based on the multi-barrier principle, i.e. several release barriers that back one another up, so that a deficiency in the performance of one barrier does not jeopardize the long-term safety of the disposal. The release barriers are the following: the canister, the bentonite buffer and deposition tunnel backfill, and the host rock around the repository. The canisters are installed in deposition holes bored into the floor of the deposition tunnels. The canisters are enveloped in compacted bentonite blocks, which swell after absorbing water. The surrounding bedrock and the central and access tunnel backfill provide additional retardation, retention, and dilution. The nuclear facilities consist of an encapsulation plant and an underground final disposal facility, together with other aboveground buildings and surface structures serving the facility. The access tunnel and ventilation shafts to the underground disposal facility and some auxiliary rooms are constructed as a part of the ONKALO underground rock characterization facility during the years 2004-2014. The construction works needed for the repository start after obtaining the construction

  3. Cloud Computing Vs. Grid Computing

    OpenAIRE

    Seyyed Mohsen Hashemi; Amid Khatibi Bardsiri

    2012-01-01

    Cloud computing emerges as one of the hottest topics in the field of information technology. Cloud computing is based on several other computing research areas such as HPC, virtualization, utility computing and grid computing. In order to make clear the essence of cloud computing, we propose the characteristics of this area which make cloud computing what it is and distinguish it from other research areas. The service oriented, loose coupling, strong fault tolerant, business model and...

  4. Coverage centralities for temporal networks*

    Science.gov (United States)

    Takaguchi, Taro; Yano, Yosuke; Yoshida, Yuichi

    2016-02-01

    Structure of real networked systems, such as social relationships, can be modeled as temporal networks in which each edge appears only at the prescribed time. Understanding the structure of temporal networks requires quantifying the importance of a temporal vertex, which is a pair of vertex index and time. In this paper, we define two centrality measures of a temporal vertex based on the fastest temporal paths which use the temporal vertex. The definition is free from parameters and robust against the change in time scale on which we focus. In addition, we can efficiently compute these centrality values for all temporal vertices. Using the two centrality measures, we reveal that distributions of these centrality values of real-world temporal networks are heterogeneous. For various datasets, we also demonstrate that a majority of the highly central temporal vertices are located within a narrow time window around a particular time. In other words, there is a bottleneck time at which most information sent in the temporal network passes through a small number of temporal vertices, which suggests an important role of these temporal vertices in spreading phenomena. Contribution to the Topical Issue "Temporal Network Theory and Applications", edited by Petter Holme. Supplementary material in the form of one pdf file available from the Journal web page at http://dx.doi.org/10.1140/epjb/e2016-60498-7
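    To illustrate the idea of a temporal vertex (a vertex paired with a time) being "used" by fast routes, the sketch below computes a simplified proxy on a toy temporal network: for each temporal vertex (v, t), it counts how many sources' earliest-arrival trees pass through it. This is an assumption-laden illustration, not the paper's exact fastest-path coverage definition.

```python
from collections import defaultdict
import math

def earliest_arrival_uses(edges, source):
    """Scan time-ordered contacts (u, v, t); return the set of temporal
    vertices (v, t) at which the earliest-arrival time from `source`
    strictly improves (a proxy for lying on fastest temporal paths)."""
    arrival = {source: -math.inf}  # present at the source from the start
    used = set()
    for u, v, t in sorted(edges, key=lambda e: e[2]):
        if arrival.get(u, math.inf) <= t and t < arrival.get(v, math.inf):
            arrival[v] = t
            used.add((v, t))
    return used

def coverage_counts(edges, n):
    """For each temporal vertex, count how many sources' earliest-arrival
    trees pass through it (an illustrative variant of coverage centrality)."""
    counts = defaultdict(int)
    for s in range(n):
        for tv in earliest_arrival_uses(edges, s):
            counts[tv] += 1
    return counts

# Toy temporal network: reaching vertex 2 quickly requires passing
# through the temporal vertex (1, 1) and then the contact at time 2.
edges = [(0, 1, 1), (1, 2, 2), (0, 2, 5)]
counts = coverage_counts(edges, 3)
print(dict(counts))
```

    On this toy input, the temporal vertex (2, 2) is used from two sources and (1, 1) from one, illustrating how a few temporal vertices can dominate the counts.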

  5. Coverage centralities for temporal networks

    CERN Document Server

    Takaguchi, Taro; Yoshida, Yuichi

    2015-01-01

    Structure of real networked systems, such as social relationships, can be modeled as temporal networks in which each edge appears only at the prescribed time. Understanding the structure of temporal networks requires quantifying the importance of a temporal vertex, which is a pair of vertex index and time. In this paper, we define two centrality measures of a temporal vertex by the proportion of (normal) vertex pairs, the quickest routes between which can (or should) use the temporal vertex. The definition is free from parameters and robust against the change in time scale on which we focus. In addition, we can efficiently compute these centrality values for all temporal vertices. Using the two centrality measures, we reveal that distributions of these centrality values of real-world temporal networks are heterogeneous. For various datasets, we also demonstrate that a majority of the highly central temporal vertices are located within a narrow time window around a particular time. In other words, there is a bo...

  6. Computation and Analysis of the Global Distribution of the Radioxenon Isotope 133Xe based on Emissions from Nuclear Power Plants and Radioisotope Production Facilities and its Relevance for the Verification of the Nuclear-Test-Ban Treaty

    Science.gov (United States)

    Wotawa, Gerhard; Becker, Andreas; Kalinowski, Martin; Saey, Paul; Tuma, Matthias; Zähringer, Matthias

    2010-05-01

    Monitoring of radioactive noble gases, in particular xenon isotopes, is a crucial element of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The capability of the noble gas network, which is currently under construction, to detect signals from a nuclear explosion critically depends on the background created by other sources. Therefore, the global distribution of these isotopes based on emissions and transport patterns needs to be understood. A significant xenon background exists in the reactor regions of North America, Europe and Asia. An emission inventory of the four relevant xenon isotopes has recently been created, which specifies source terms for each power plant. As the major emitters of xenon isotopes worldwide, a few medical radioisotope production facilities have been recently identified, in particular the facilities in Chalk River (Canada), Fleurus (Belgium), Pelindaba (South Africa) and Petten (Netherlands). Emissions from these sites are expected to exceed those of the other sources by orders of magnitude. In this study, emphasis is put on 133Xe, which is the most prevalent xenon isotope. First, based on the emissions known, the resulting 133Xe concentration levels at all noble gas stations of the final CTBT verification network were calculated and found to be consistent with observations. Second, it turned out that emissions from the radioisotope facilities can explain a number of observed peaks, meaning that atmospheric transport modelling is an important tool for the categorization of measurements. Third, it became evident that Nuclear Power Plant emissions are more difficult to treat in the models, since their temporal variation is high and not generally reported. Fourth, there are indications that the assumed annual emissions may be underestimated by factors of two to ten, while the general emission patterns seem to be well understood. Finally, it became evident that 133Xe sources mainly influence the sensitivity of the

  7. Ocean Thermal Energy Conversion (OTEC) test facilities study program. Final report. Volume I

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-01-17

    A comprehensive test program has been envisioned by ERDA to accomplish the OTEC program objectives of developing an industrial and technological base that will lead to the commercial capability to successfully construct and economically operate OTEC plants. This study was performed to develop alternative non-site specific OTEC test facilities/platform requirements for an integrated OTEC test program including both land and floating test facilities. A progression of tests was established in which OTEC power cycle component designs proceed through advanced research and technology, component, and systems test phases. This progression leads to the first OTEC pilot plant and provides support for following developments which potentially reduce the cost of OTEC energy. It also includes provisions for feedback of results from all test phases to enhance modifications to existing designs or development of new concepts. The tests described should be considered as representative of generic types since specifics can be expected to change as the OTEC plant design evolves. Emphasis is placed on defining the test facility which is capable of supporting the spectrum of tests envisioned. All test support facilities and equipment have been identified and included in terms of space, utilities, cost, schedule, and constraints or risks. A highly integrated data acquisition and control system has been included to improve test operations and facility effectiveness through a centralized computer system capable of automatic test control, real-time data analysis, engineering analyses, and selected facility control including safety alarms. Electrical power, hydrogen, and ammonia are shown to be technically feasible as means for transmitting OTEC power to a land-based distribution point. (WHK)

  8. DOE facilities solar design handbook

    Energy Technology Data Exchange (ETDEWEB)

    Hunn, Bruce D.; Balcomb, J. Douglas

    1978-01-01

    This handbook covers design of solar heating systems for commercial and laboratory buildings at Department of Energy Facilities. It includes discussions of solar energy fundamentals, solar heating and cooling technology, systems, and components, as well as a discussion of solar system economics. Quantitative analysis, with generalized design and sizing curves, is presented for solar heating so that collector and other system parameters can be cost-economically sized without a computer simulation. Solar system design considerations and guidelines, as well as guidelines for developing subsystem specifications, are presented. Thus this handbook is both a primer for the solar novice and a reference manual for the solar system designer.

  9. To centralize or not to centralize?

    OpenAIRE

    Campbell, Andrew; Kunisch, Sven; Müller-Stewens, Günter

    2011-01-01

    The CEO's dilemma-were the gains of centralization worth the pain it could cause?-is a perennial one. Business leaders dating back at least to Alfred Sloan, who laid out GM's influential philosophy of decentralization in a series of memos during the 1920s, have recognized that badly judged centralization can stifle initiative, constrain the ability to tailor products and services locally, and burden business divisions with high costs and poor service.1 Insufficient centralization can deny bus...

  10. Computational invariant theory

    CERN Document Server

    Derksen, Harm

    2015-01-01

    This book is about the computational aspects of invariant theory. Of central interest is the question how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...
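    A cornerstone algorithm for finite groups in this area is averaging a polynomial over the group action (the Reynolds operator), which always yields an invariant. The minimal numerical sketch below is illustrative only: the group C4 acting by 90-degree rotations of the plane and the test polynomial are my own choices, not examples from the book.

```python
import math

# The cyclic group C4, acting on the plane by quarter-turn rotations.
ROTATIONS = [
    lambda x, y: (x, y),
    lambda x, y: (-y, x),
    lambda x, y: (-x, -y),
    lambda x, y: (y, -x),
]

def reynolds(f):
    """Reynolds operator: average f over the group orbit of the argument.
    The result is invariant under every group element."""
    def averaged(x, y):
        return sum(f(*g(x, y)) for g in ROTATIONS) / len(ROTATIONS)
    return averaged

f = lambda x, y: x * x          # not invariant on its own
inv = reynolds(f)               # averages to (x^2 + y^2) / 2

# invariance check: rotating the argument leaves the value unchanged
assert math.isclose(inv(3.0, 1.0), inv(-1.0, 3.0))
print(inv(3.0, 1.0))  # (9 + 1) / 2 = 5.0
```

    Symbolic algorithms in the book go much further (e.g. computing generators of the whole invariant ring), but this averaging step is the basic mechanism for finite groups.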

  11. Fostering Computational Thinking

    CERN Document Server

    Caballero, Marcos D; Schatz, Michael F

    2011-01-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term 1357 students in this course solved a suite of fourteen computational modeling homework questions delivered using an online commercial course management system. Their proficiency with computational modeling was evaluated in a proctored environment using a novel central force problem. The majority of students (60.4%) successfully completed the evaluation. Analysis of erroneous student-submitted programs indicated that a small set of student errors explained why most programs failed. We discuss the design and implementation of the computational modeling homework and evaluation, the results from the evaluation and the implications for instruction in computational modeling in introductory STEM courses.
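    A central-force exercise of the kind evaluated above can be written in a few lines of plain Python rather than VPython; the Euler-Cromer update scheme and the parameter values below are illustrative assumptions, not the course's actual assignment.

```python
# Euler-Cromer integration of motion under an attractive inverse-square
# central force (assumed units with GM = 1).

GM = 1.0  # gravitational parameter of the central body

def step(pos, vel, dt):
    """One Euler-Cromer step: update velocity from the central force,
    then position from the *new* velocity."""
    x, y = pos
    r = (x * x + y * y) ** 0.5
    ax, ay = -GM * x / r**3, -GM * y / r**3
    vx, vy = vel[0] + ax * dt, vel[1] + ay * dt
    return (x + vx * dt, y + vy * dt), (vx, vy)

# initial conditions for a circular orbit: r = 1, speed = sqrt(GM / r)
pos, vel = (1.0, 0.0), (0.0, 1.0)
for _ in range(10000):
    pos, vel = step(pos, vel, 0.001)

r = (pos[0] ** 2 + pos[1] ** 2) ** 0.5
print(f"radius after 10 time units: {r:.3f}")  # stays close to 1
```

    Using the updated velocity in the position step (rather than naive Euler) is what keeps the orbit from spiraling outward, which is one of the conceptual points such exercises test.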

  12. The Decommissioning Facility Characterization DB System (DEFACS)

    International Nuclear Information System (INIS)

    The computer system for the characterization of nuclear facilities has been established under the name DEFACS (DEcommissioning FAcility Characterization DB System). This system consists of four main parts: grouping of items with code creation and management, data input, data processing, and data output. All data are processed in a simplified, formatted manner to provide useful information to the decommissioning planner. Four nuclear facilities are covered by the system: the KRR-1 and 2 (research reactors), the uranium conversion plant (nuclear chemical plant), the UF4 pilot plant, and the North Korea nuclear facility (5 MWe research reactor). All data from a facility were categorized and entered into data fields in the input system, chosen to reflect the facility's characteristics. The hardware comprises workstations for the Web and DB servers and PC-grade computers for the users; the selected software, Oracle RDBMS 11g, runs on the Windows 2008 operating system
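    The grouping/input/processing/output pattern described above can be illustrated with a small relational sketch. The table, field names, and sample values below are invented for illustration and are not DEFACS's actual schema (which runs on Oracle, not SQLite).

```python
import sqlite3

# Hypothetical characterization table: coded items grouped by category,
# with a formatted aggregate query as the "output" step for the planner.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE characterization (
    facility TEXT, item_code TEXT, category TEXT,
    nuclide TEXT, activity_bq REAL)""")

rows = [
    ("KRR-1", "RX-001", "reactor block", "Co-60", 4.2e9),
    ("KRR-1", "RX-002", "reactor block", "Eu-152", 1.1e8),
    ("UCP", "CH-010", "process vessel", "U-238", 7.5e6),
]
db.executemany("INSERT INTO characterization VALUES (?,?,?,?,?)", rows)

# formatted output: total radioactive inventory per facility
for fac, total in db.execute(
    "SELECT facility, SUM(activity_bq) FROM characterization GROUP BY facility"
):
    print(f"{fac}: {total:.2e} Bq")
```

    The point of the pattern is that once items carry consistent codes and categories, planning summaries reduce to simple grouped queries.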

  13. The OSG Open Facility: A Sharing Ecosystem

    Energy Technology Data Exchange (ETDEWEB)

    Jayatilaka, B. [Fermilab; Levshina, T. [Fermilab; Rynge, M. [USC - ISI, Marina del Rey; Sehgal, C. [Fermilab; Slyz, M. [USC - ISI, Marina del Rey

    2015-12-23

    The Open Science Grid (OSG) ties together individual experiments’ computing power, connecting their resources to create a large, robust computing grid. This computing infrastructure started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero. In the years since, the OSG has broadened its focus to also address the needs of other US researchers and increased delivery of Distributed High Throughput Computing (DHTC) to users from a wide variety of disciplines via the OSG Open Facility. Presently, the Open Facility delivers about 100 million computing wall hours per year to researchers who are not already associated with the owners of the computing sites. This is accomplished primarily by harvesting and organizing the temporarily unused capacity (i.e. opportunistic cycles) of the sites in the OSG. Using these methods, OSG resource providers and scientists share computing hours with researchers in many other fields to enable their science, striving to ensure that this computing power is used with maximal efficiency. We believe that expanded access to DHTC is an essential tool for scientific innovation, and work continues on expanding this service.

  14. Facilities projects performance measurement system

    International Nuclear Information System (INIS)

    The two DOE-owned facilities at Hanford, the Fuels and Materials Examination Facility (FMEF), and the Fusion Materials Irradiation Test Facility (FMIT), are described. The performance measurement systems used at these two facilities are next described

  15. Facility transition instruction

    International Nuclear Information System (INIS)

    The Bechtel Hanford, Inc. facility transition instruction was initiated in response to the need for a common, streamlined process for facility transitions and to capture the knowledge and experience that has accumulated over the last few years. The instruction serves as an educational resource and defines the process for transitioning facilities to long-term surveillance and maintenance (S and M). Generally, these facilities do not have identified operations missions and must be transitioned from operational status to a safe and stable configuration for long-term S and M. The instruction can be applied to a wide range of facilities--from process canyon complexes like the Plutonium Uranium Extraction Facility or B Plant, to stand-alone, lower hazard facilities like the 242B/BL facility. The facility transition process is implemented (under the direction of the US Department of Energy, Richland Operations Office [RL] Assistant Manager-Environmental) by Bechtel Hanford, Inc. management, with input and interaction with the appropriate RL division and Hanford site contractors as noted in the instruction. The application of the steps identified herein and the early participation of all organizations involved are expected to provide a cost-effective, safe, and smooth transition from operational status to deactivation and S and M for a wide range of Hanford Site facilities

  16. Nuclear Physics accelerator facilities

    International Nuclear Information System (INIS)

    The Nuclear Physics program requires the existence and effective operation of large and complex accelerator facilities. These facilities provide the variety of projectile beams upon which virtually all experimental nuclear research depends. Their capabilities determine which experiments can be performed and which cannot. Seven existing accelerator facilities are operated by the Nuclear Physics program as national facilities. These are made available to all the Nation's scientists on the basis of scientific merit and technical feasibility of proposals. The national facilities are the Clinton P. Anderson Meson Physics Facility (LAMPF) at Los Alamos National Laboratory; the Bates Linear Accelerator Center at Massachusetts Institute of Technology; the Bevalac at Lawrence Berkeley Laboratory; the Tandem/AGS Heavy Ion Facility at Brookhaven National Laboratory; the ATLAS facility at Argonne National Laboratory; the 88-Inch Cyclotron at Lawrence Berkeley Laboratory; and the Holifield Heavy Ion Research Facility at Oak Ridge National Laboratory. The Nuclear Physics Injector at the Stanford Linear Accelerator Center (SLAC) enables the SLAC facility to provide a limited amount of beam time for nuclear physics research on the same basis as the other national facilities. To complement the national facilities, the Nuclear Physics program supports on-campus accelerators at Duke University, Texas A and M University, the University of Washington, and Yale University. The facility at Duke University, called the Triangle Universities Nuclear Laboratory (TUNL), is jointly staffed by Duke University, North Carolina State University, and the University of North Carolina. These accelerators are operated primarily for the research use of the local university faculty, junior scientists, and graduate students.

  17. Nuclear physics accelerator facilities of the world

    International Nuclear Information System (INIS)

    This report is intended to provide a convenient summary of the world's major nuclear physics accelerator facilities, with emphasis on those facilities supported by the US Department of Energy (DOE). Previous editions of this report have contained only DOE facilities. However, as the extent of global collaborations in nuclear physics grows, gathering summary information on the world's nuclear physics accelerator facilities in one place is useful. Therefore, the present report adds facilities operated by the National Science Foundation (NSF) as well as the leading foreign facilities, with emphasis on foreign facilities that have significant outside user programs. The principal motivation for building and operating these facilities is, of course, basic research in nuclear physics. The scientific objectives for this research were recently reviewed by the DOE/NSF Nuclear Science Advisory Committee, who developed a long range plan, Nuclei, Nucleons, and Quarks -- Nuclear Science in the 1990's. Their report begins as follows: The central thrust of nuclear science is the study of strongly interacting matter and of the forces that govern its structure and dynamics; this agenda ranges from large-scale collective nuclear behavior, through the motions of individual nucleons and mesons in atomic nuclei, to the underlying distribution of quarks and gluons. It extends to conditions at the extremes of temperature and density which are of significance to astrophysics and cosmology and are conducive to the creation of new forms of strongly interacting matter; another important focus is on the study of the electroweak force, which plays an important role in nuclear stability, and on precision tests of fundamental interactions. The present report provides brief descriptions of the accelerator facilities available for carrying out this agenda and their research programs.

  18. Instrumentation of the ESRF medical imaging facility

    CERN Document Server

    Elleaume, H; Berkvens, P; Berruyer, G; Brochard, T; Dabin, Y; Domínguez, M C; Draperi, A; Fiedler, S; Goujon, G; Le Duc, G; Mattenet, M; Nemoz, C; Pérez, M; Renier, M; Schulze, C; Spanne, P; Suortti, P; Thomlinson, W; Estève, F; Bertrand, B; Le Bas, J F

    1999-01-01

    At the European Synchrotron Radiation Facility (ESRF) a beamport has been instrumented for medical research programs. Two facilities have been constructed for alternative operation. The first one is devoted to medical imaging and is focused on intravenous coronary angiography and computed tomography (CT). The second facility is dedicated to pre-clinical microbeam radiotherapy (MRT). This paper describes the instrumentation for the imaging facility. Two monochromators have been designed, both are based on bent silicon crystals in the Laue geometry. A versatile scanning device has been built for pre-alignment and scanning of the patient through the X-ray beam in radiography or CT modes. An intrinsic germanium detector is used together with large dynamic range electronics (16 bits) to acquire the data. The beamline is now at the end of its commissioning phase; intravenous coronary angiography is intended to start in 1999 with patients and the CT pre-clinical program is underway on small animals. The first in viv...

  19. Computing issues for large detectors

    International Nuclear Information System (INIS)

    The computing issues that will affect planning for experiments at RHIC are reviewed. It is traditional in workshops on new facilities to discuss the computing requirements for the experiments. Usually this discussion will focus on the number of MIPS (Million Instructions Per Second) needed to analyze the data tapes and, perhaps, the number of Terabytes of data storage. These are important parameters for planning new facilities. They are, however, only two of the features that define the computing requirements for new experiments. In this paper, the author discusses a broader range of computing issues and indicates how the problems of large experiments might be addressed on the time-scale of RHIC. In this discussion, he draws on the experiences of high energy physics groups working at colliding beam facilities, especially the Tevatron and LEP. In the colliding beam experiments, it has become increasingly important to consider the computing system as a component of the full detector system. The computing system includes online and offline computers, workstations, networks and mass-storage. It also includes software, both commercial and experiment-specific. In the next section, he reviews recent developments in computing equipment as well as mass storage and networks. In the following section, he turns to the problems of software development, testing and maintenance

  20. Feasibility study: PASS computer environment

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-03-10

    The Policy Analysis Screening System (PASS) is a computerized information-retrieval system designed to provide analysts in the Department of Energy, Assistant Secretary for Environment, Office of Technology Impacts (DOE-ASEV-OTI) with automated access to articles, computer simulation outputs, energy-environmental statistics, and graphics. Although it is essential that PASS respond quickly to user queries, problems at the computer facility where it was originally installed seriously slowed PASS's operations. Users attempting to access the computer by telephone repeatedly encountered busy signals and, once logged on, experienced unsatisfactory delays in response to commands. Many of the problems stemmed from the system's facility manager having brought another large user onto the system shortly after PASS was implemented, thereby significantly oversubscribing the facility. Although in March 1980 Energy Information Administration (EIA) transferred operations to its own computer facility, OTI has expressed concern that any improvement in computer access time and response time may not be sufficient or permanent. Consequently, a study was undertaken to assess the current status of the system, to identify alternative computer environments, and to evaluate the feasibility of each alternative in terms of its cost and its ability to alleviate current problems.

  1. Quantum information. Teleportation - cryptography - quantum computer

    International Nuclear Information System (INIS)

    The following topics are dealt with: Reality in the test facility, quantum teleportation, the reality of quanta, interaction-free quantum measurement, rules for quantum computers, quantum computers with ions, spintronics with diamond, the limits of the quantum computers, a view in the future of quantum optics. (HSI)

  2. 340 Facility compliance assessment

    International Nuclear Information System (INIS)

    This study provides an environmental compliance evaluation of the RLWS and the RPS systems of the 340 Facility. The emphasis of the evaluation centers on compliance with WAC requirements for hazardous and mixed waste facilities, federal regulations, and Westinghouse Hanford Company (WHC) requirements pertinent to the operation of the 340 Facility. The 340 Facility is not covered under either an interim status Part A permit or a RCRA Part B permit. The detailed discussion of compliance deficiencies is summarized in Section 2.0. This includes items of significance that require action to ensure facility compliance with WAC, federal regulations, and WHC requirements. Outstanding issues exist for radioactive airborne effluent sampling and monitoring, radioactive liquid effluent sampling and monitoring, non-radioactive liquid effluent sampling and monitoring, less than 90 day waste storage tanks, and requirements for a permitted facility.

  3. Cloud Computing

    OpenAIRE

    Bhavana Gupta

    2012-01-01

    Cloud computing is a type of computing environment in which business owners outsource their computing needs, including application software services, to a third party; when they need to use the computing power, or employees need to use application resources such as databases, email, etc., they access the resources via the Internet. Cloud computing is the use of computing resources (hardware and software) that are delivered as a service over a network (typically the Internet).

  4. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).
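    On the digital signal processing side mentioned above, the most elementary building block is direct synthesis of a sinusoid as a buffer of samples. The sketch below assumes a 44.1 kHz sample rate and a 440 Hz (concert A) tone; it is a generic illustration, not an example from the chapter.

```python
import math

SAMPLE_RATE = 44100  # samples per second (CD-quality rate)

def sine_tone(freq_hz, duration_s, amplitude=0.5):
    """Return `duration_s` seconds of a sine wave as a list of samples
    in the range [-amplitude, amplitude]."""
    n = int(SAMPLE_RATE * duration_s)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)
            for i in range(n)]

samples = sine_tone(440.0, 1.0)
print(len(samples))  # 44100 samples for one second of audio
```

    Real computer-music systems layer oscillators, envelopes, and filters on top of exactly this kind of sample-by-sample computation.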

  5. 33 CFR 106.305 - Facility Security Assessment (FSA) requirements.

    Science.gov (United States)

    2010-07-01

    ..., including computer systems and networks; (vi) Existing agreements with private security companies; (vii) Any... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Facility Security Assessment (FSA... SECURITY MARITIME SECURITY MARINE SECURITY: OUTER CONTINENTAL SHELF (OCS) FACILITIES Outer...

  6. Synchrotron radiation facilities

    CERN Multimedia

    1972-01-01

    Particularly in the past few years, interest in using the synchrotron radiation emanating from high energy, circular electron machines has grown considerably. In our February issue we included an article on the synchrotron radiation facility at Frascati. This month we are spreading the net wider — saying something about the properties of the radiation, listing the centres where synchrotron radiation facilities exist, adding a brief description of three of them and mentioning areas of physics in which the facilities are used.

  7. Air gun test facility

    International Nuclear Information System (INIS)

    This paper describes a facility that is potentially useful in providing data for models to predict the effects of nuclear explosions on cities. IIT Research Institute has a large air gun facility capable of launching heavy items of a wide variety of geometries to velocities ranging from about 80 fps to 1100 fps. The facility and its capabilities are described, and city model problem areas capable of investigation using the air gun are presented

  8. Computer Music

    Science.gov (United States)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  9. Textiles Performance Testing Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Textiles Performance Testing Facilities has the capabilities to perform all physical wet and dry performance testing, and visual and instrumental color analysis...

  10. Materials Characterization Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Materials Characterization Facility enables detailed measurements of the properties of ceramics, polymers, glasses, and composites. It features instrumentation...

  11. Region 9 NPDES Facilities

    Data.gov (United States)

    U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates...

  12. Neutron Therapy Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Neutron Therapy Facility provides a moderate intensity, broad energy spectrum neutron beam that can be used for short term irradiations for radiobiology (cells)...

  13. DUPIC facility engineering

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. J.; Lee, H. H.; Kim, K. H. and others

    2000-03-01

    The objectives of this study are (1) the refurbishment of PIEF (Post Irradiation Examination Facility) and the M6 hot-cell in IMEF (Irradiated Material Examination Facility), (2) the establishment of a facility suitable for DUPIC fuel fabrication experiments, licensed by the responsible government organization, and (3) the establishment of a transportation system and transportation cask for moving nuclear material between the facilities. The report describes the objectives, necessity, scope, and contents of the project, the results of the current step, and the future R&D plan.

  14. High Combustion Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — At NETL's High-Pressure Combustion Research Facility in Morgantown, WV, researchers can investigate new high-pressure, high-temperature hydrogen turbine combustion...

  15. Target Assembly Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Target Assembly Facility integrates new armor concepts into actual armored vehicles. Featuring the capability of machining and cutting radioactive materials, it...

  16. Pavement Testing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Comprehensive Environmental and Structural Analyses The ERDC Pavement Testing Facility, located on the ERDC Vicksburg campus, was originally constructed to provide...

  17. Composite Structures Manufacturing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Composite Structures Manufacturing Facility specializes in the design, analysis, fabrication and testing of advanced composite structures and materials for both...

  18. Geodynamics Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This GSL facility has evolved over the last three decades to support survivability and protective structures research. Experimental devices include three gas-driven...

  19. DUPIC facility engineering

    International Nuclear Information System (INIS)

    The objectives of this study are 1) the refurbishment of PIEF (Post Irradiation Examination Facility) and the M6 hot-cell in IMEF (Irradiated Material Examination Facility), 2) the establishment of a facility suitable for DUPIC fuel fabrication experiments, licensed by the responsible government organization, and 3) the establishment of a transportation system and transportation cask for moving nuclear material between the facilities. The report describes the objectives, necessity, scope, and contents of the project, the results of the current step, and the future R&D plan.

  20. Universal Drive Train Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This vehicle drive train research facility is capable of evaluating helicopter and ground vehicle power transmission technologies in a system level environment. The...

  1. Manufacturing Demonstration Facility (MDF)

    Data.gov (United States)

    Federal Laboratory Consortium — The U.S. Department of Energy Manufacturing Demonstration Facility (MDF) at Oak Ridge National Laboratory (ORNL) provides a collaborative, shared infrastructure to...

  2. Proximal Probes Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Proximal Probes Facility consists of laboratories for microscopy, spectroscopy, and probing of nanostructured materials and their functional properties. At the...

  3. GPS Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Global Positioning System (GPS) Test Facility Instrumentation Suite (GPSIS) provides great flexibility in testing receivers by providing operational control of...

  4. Assessment of Recreational Facilities in Federal Capital City, Abuja, Nigeria

    Directory of Open Access Journals (Sweden)

    Cyril Kanayo Ezeamaka

    2016-06-01

    Full Text Available The Abuja Master Plan provided for the development of adequate green areas and other recreational facilities within the Federal Capital City (FCC) as part of its sustainability principles, and provided for these facilities within each neighborhood (FCDA, 1979). However, there have been several recent outcries about the poor development of recreational facilities and the abuse of the Master Plan in the FCC. The motivation for this study arose from the observation that recreational facilities in Phase 1 of the Federal Capital City Abuja have not been developed as intended by the policy makers, hence the need to identify the recreational facilities in Phase 1 of the FCC and to assess their level of development and usage. The field survey revealed that the Central Business District and Gazupe have the highest numbers of recreational facilities, with 45 and 56 respectively, while the Wuse II (A08) and Wuse II (A07) Districts have fewer, with 10 and 17. The survey further revealed that every district in Phase 1 has land-use changes from recreational facilities to other uses in over 35% of cases. It also shows that over 65% of these recreational facilities are fully developed, but that only about 11% of the recreational sporting facilities were developed in line with the Abuja Master Plan. The study concludes that recreational facilities in Phase 1 of the FCC, Abuja have not been developed in compliance with the Abuja Master Plan.

  5. NIDDK Central Repository

    Data.gov (United States)

    U.S. Department of Health & Human Services — The NIDDK Central Repository stores biosamples, genetic and other data collected in designated NIDDK-funded clinical studies. The purpose of the NIDDK Central...

  6. Ubiquitous Computing: Potentials and Challenges

    CERN Document Server

    Sen, Jaydip

    2010-01-01

    The world is witnessing the birth of a revolutionary computing paradigm that promises to have a profound effect on the way we interact with computers, devices, physical spaces, and other people. This new technology, called ubiquitous computing, envisions a world where embedded processors, computers, sensors, and digital communications are inexpensive commodities that are available everywhere. Ubiquitous computing will surround users with a comfortable and convenient information environment that merges physical and computational infrastructures into an integrated habitat. This habitat will feature a proliferation of hundreds or thousands of computing devices and sensors that will provide new functionality, offer specialized services, and boost productivity and interaction. This paper presents a comprehensive discussion of the central trends in ubiquitous computing, considering them from technical, social, and economic perspectives. It clearly identifies different application areas and sectors that will benefit f...

  7. Physical computation and cognitive science

    CERN Document Server

    Fresco, Nir

    2014-01-01

    This book presents a study of digital computation in contemporary cognitive science. Digital computation is a highly ambiguous concept, as there is no common core definition for it in cognitive science. Since this concept plays a central role in cognitive theory, an adequate cognitive explanation requires an explicit account of digital computation. More specifically, it requires an account of how digital computation is implemented in physical systems. The main challenge is to deliver an account encompassing the multiple types of existing models of computation without ending up in pancomputationalism, that is, the view that every physical system is a digital computing system. This book shows that only two accounts, among the ones examined by the author, are adequate for explaining physical computation. One of them is the instructional information processing account, which is developed here for the first time.   “This book provides a thorough and timely analysis of differing accounts of computation while adv...

  8. Centrality in Interconnected Multilayer Networks

    CERN Document Server

    De Domenico, Manlio; Omodei, Elisa; Gómez, Sergio; Arenas, Alex

    2013-01-01

    Real-world complex systems exhibit multiple levels of relationships. In many cases, they need to be modeled by interconnected multilayer networks, characterizing interactions on several levels simultaneously. In many fields, from economics to biology, from urban planning to social sciences, it is of crucial importance to identify the most (or the least) influential nodes in a network. However, defining the centrality of actors in an interconnected structure is not trivial. In this paper, we capitalize on the tensorial formalism, recently proposed to characterize and investigate this kind of complex topology, to show how several centrality measures -- well known in the case of standard ("monoplex") networks -- can be extended naturally to the realm of interconnected multiplexes. We consider diagnostics widely used in different fields, e.g., computer science, biology, communication and social sciences, to cite only some of them. We show, both theoretically and numerically, that using the weighted monoplex obta...
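One of the simplest multiplex centrality extensions, the overlapping degree (a node's degree summed across all layers), can be sketched as follows. This is a toy illustration of the idea, not the tensorial formalism developed in the paper:

```python
def overlapping_degree(layers):
    """Overlapping degree: each node's degree summed over all layers of a
    multiplex, where every layer is an adjacency matrix over the same nodes."""
    n = len(layers[0])
    return [sum(layer[i][j] for layer in layers for j in range(n))
            for i in range(n)]

# Two-layer multiplex on three nodes: node 1 is a hub in layer 1,
# and layer 2 adds a single extra edge between nodes 1 and 2.
layer1 = [[0, 1, 0],
          [1, 0, 1],
          [0, 1, 0]]
layer2 = [[0, 0, 0],
          [0, 0, 1],
          [0, 1, 0]]
print(overlapping_degree([layer1, layer2]))  # → [1, 3, 2]
```

Node 1 comes out most central because its participation accumulates across layers, which a single-layer analysis of layer 2 alone would miss.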

  9. A centralized audio presentation manager

    Energy Technology Data Exchange (ETDEWEB)

    Papp, A.L. III; Blattner, M.M.

    1994-05-16

    The centralized audio presentation manager addresses the problems which occur when multiple programs running simultaneously attempt to use the audio output of a computer system. Time dependence of sound means that certain auditory messages must be scheduled simultaneously, which can lead to perceptual problems due to psychoacoustic phenomena. Furthermore, the combination of speech and nonspeech audio is examined; each presents its own problems of perceptibility in an acoustic environment composed of multiple auditory streams. The centralized audio presentation manager receives abstract parameterized message requests from the currently running programs, and attempts to create and present a sonic representation in the most perceptible manner through the use of a theoretically and empirically designed rule set.
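The scheduling idea can be sketched with a simple serialize-with-minimum-gap rule; this toy stand-in (names and the gap rule are illustrative) replaces the paper's perceptually derived rule set:

```python
def schedule_messages(requests, min_gap):
    """Serialize audio message requests so only one auditory stream plays
    at a time: each request is a (requested_time, message) pair, and any
    request that would start within min_gap seconds of the previous
    message is pushed back."""
    scheduled = []
    next_free = float("-inf")
    for requested_time, message in sorted(requests):
        start = max(requested_time, next_free)
        scheduled.append((start, message))
        next_free = start + min_gap
    return scheduled

# Three programs request sounds; "mail" would mask "alert", so it is delayed.
print(schedule_messages([(0.0, "alert"), (0.2, "mail"), (1.5, "done")], 0.5))
```

A real presentation manager would weigh psychoacoustic masking and message priority rather than a fixed gap, but the central idea is the same: arbitrate a shared audio channel among independent programs.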

  10. The science of computing shaping a discipline

    CERN Document Server

    Tedre, Matti

    2014-01-01

    The identity of computing has been fiercely debated throughout its short history. Why is it still so hard to define computing as an academic discipline? Is computing a scientific, mathematical, or engineering discipline? By describing the mathematical, engineering, and scientific traditions of computing, The Science of Computing: Shaping a Discipline presents a rich picture of computing from the viewpoints of the field's champions. The book helps readers understand the debates about computing as a discipline. It explains the context of computing's central debates and portrays a broad perspecti

  11. Central and peripheral demyelination

    OpenAIRE

    Man Mohan Mehndiratta; Natasha Singh Gulati

    2014-01-01

    Several conditions cause damage to the inherently normal myelin of the central nervous system, the peripheral nervous system, or both, and are hence termed central demyelinating diseases, peripheral demyelinating diseases, and combined central and peripheral demyelinating diseases, respectively. Here we analyse and focus on the etiology, prevalence, incidence, and age of these demyelinating disorders. Clinical attention and various diagnostic tests are needed to ad...

  12. Cloud Computing

    OpenAIRE

    Bhavana Gupta

    2012-01-01

    Cloud computing is a type of computing environment in which business owners outsource their computing needs, including application software services, to a third party; when they need computing power, or when employees need application resources such as databases or email, they access those resources via the Internet

  13. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    高振桥

    2002-01-01

    If you work with a computer, it is certain that you cannot avoid dealing with at least one computer virus. But how much do you know about it? Well, actually, a computer virus is not a biological one like those that cause illnesses in people. It is a kind of computer program

  14. Grid Computing

    Indian Academy of Sciences (India)

    2016-05-01

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, in which vendors provide computing resources to customers on demand. In this article, we describe the grid computing model and enumerate the major differences between grid and cloud computing.

  15. BLT-MS (Breach, Leach, and Transport -- Multiple Species) data input guide. A computer model for simulating release of contaminants from a subsurface low-level waste disposal facility

    International Nuclear Information System (INIS)

    The BLT-MS computer code has been developed, implemented, and tested. BLT-MS is a two-dimensional finite element computer code capable of simulating the time evolution of concentration resulting from the time-dependent release and transport of aqueous phase species in a subsurface soil system. BLT-MS contains models to simulate the processes (water flow, container degradation, waste form performance, transport, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is simulated through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, or solubility. Radioactive production and decay in the waste form are simulated. Transport considers the processes of advection, dispersion, diffusion, radioactive production and decay, reversible linear sorption, and sources (waste forms releases). To improve the usefulness of BLT-MS a preprocessor, BLTMSIN, which assists in the creation of input files, and a post-processor, BLTPLOT, which provides a visual display of the data have been developed. This document reviews the models implemented in BLT-MS and serves as a guide to creating input files for BLT-MS
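As an illustration of the diffusion-limited release mechanism, the cumulative release fraction is commonly approximated by the semi-infinite-medium solution. The sketch below uses that generic textbook approximation with illustrative parameter values; it is not BLT-MS's internal implementation:

```python
import math

def diffusive_release_fraction(d_eff, t, surface_to_volume):
    """Cumulative fractional release from a waste form by diffusion,
    using the standard semi-infinite-medium approximation (valid while
    the fraction is well below 1):
        f(t) = (S/V) * 2 * sqrt(d_eff * t / pi)
    d_eff: effective diffusivity (m^2/s); t: time (s);
    surface_to_volume: S/V of the waste form (1/m)."""
    return surface_to_volume * 2.0 * math.sqrt(d_eff * t / math.pi)

# Illustrative cementitious waste form: d_eff ~ 1e-12 m^2/s, S/V = 30 /m,
# evaluated after one year (~3.156e7 s).
f_one_year = diffusive_release_fraction(1e-12, 3.156e7, 30.0)
```

With these assumed values the one-year fraction is on the order of tens of percent, which shows why codes like BLT-MS switch between competing mechanisms: whichever of rinse, diffusion, surface degradation, or solubility is limiting controls the release.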

  16. BLT-MS (Breach, Leach, and Transport -- Multiple Species) data input guide. A computer model for simulating release of contaminants from a subsurface low-level waste disposal facility

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.M.; Kinsey, R.R.; Aronson, A.; Divadeenam, M. [Brookhaven National Lab., Upton, NY (United States); MacKinnon, R.J. [Brookhaven National Lab., Upton, NY (United States)]|[Ecodynamics Research Associates, Inc., Albuquerque, NM (United States)

    1996-11-01

    The BLT-MS computer code has been developed, implemented, and tested. BLT-MS is a two-dimensional finite element computer code capable of simulating the time evolution of concentration resulting from the time-dependent release and transport of aqueous phase species in a subsurface soil system. BLT-MS contains models to simulate the processes (water flow, container degradation, waste form performance, transport, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is simulated through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, or solubility. Radioactive production and decay in the waste form are simulated. Transport considers the processes of advection, dispersion, diffusion, radioactive production and decay, reversible linear sorption, and sources (waste forms releases). To improve the usefulness of BLT-MS a preprocessor, BLTMSIN, which assists in the creation of input files, and a post-processor, BLTPLOT, which provides a visual display of the data have been developed. This document reviews the models implemented in BLT-MS and serves as a guide to creating input files for BLT-MS.

  17. School Facilities Survey.

    Science.gov (United States)

    Saunders, Harry B.

    This survey of facility needs includes an evaluation of staff organization and operating procedures for the Philadelphia Public School District. The educational policies adopted by the Philadelphia Board of Education relating to school facilities are discussed, and existing sites and buildings, population enrollment data, and financial data are…

  18. Science Facilities Bibliography.

    Science.gov (United States)

    National Science Foundation, Washington, DC.

    A bibliographic collection on science buildings and facilities is cited with many different reference sources for those concerned with the design, planning, and layout of science facilities. References are given covering a broad scope of information on--(1) physical plant planning, (2) management and safety, (3) building type studies, (4) design…

  19. FACILITIES FOR PHYSICAL FITNESS.

    Science.gov (United States)

    MUSIAL, STAN

    THIS ARTICLE CITES THE LOW PRIORITY THAT PHYSICAL EDUCATION GENERALLY HAS IN CURRICULUM AND SCHOOL FACILITY PLANNING. IT ALSO CITES THE REASONS FOR DEVELOPING MORE ADEQUATE PHYSICAL EDUCATION FACILITIES--(1) OUR WAY OF LIFE NO LONGER PROVIDES VIGOROUS PHYSICAL ACTIVITY NECESSARY FOR HEALTHY DEVELOPMENT, (2) A DIRECT RELATIONSHIP EXISTS BETWEEN…

  20. Florida Educational Facilities, 1999.

    Science.gov (United States)

    Florida State Dept. of Education, Tallahassee. Office of Educational Facilities.

    This publication describes Florida school and community college facilities completed in 1999, including photographs and floor plans. The facilities profiled are: Buchholz High School (Alachua County); Gator Run Elementary School (Broward); Corkscrew Elementary School (Collier); The 500 Role Models Academy of Excellence (Miami-Dade); Caribbean…

  1. Bevalac biomedical facility

    International Nuclear Information System (INIS)

    This paper describes the physical layout of the Bevalac Facility and the research programs carried out at the facility. Beam time on the Bevalac is divided between two disciplines: one-third for biomedical research and two-thirds for nuclear science studies. The remainder of the paper discusses the beam delivery system including dosimetry, beam sharing and beam scanning

  2. Computer utility for interactive instrument control

    International Nuclear Information System (INIS)

    A careful study of the ANL laboratory automation needs in 1967 led to the conclusion that a central computer could support all of the real-time needs of a diverse collection of research instruments. A suitable hardware configuration would require an operating system to provide effective protection, fast real-time response and efficient data transfer. An SDS Sigma 5 satisfied all hardware criteria; however, it was necessary to write an original operating system. Services include program generation, experiment control, real-time analysis, interactive graphics and final analysis. The system is providing real-time support for 21 concurrently running experiments, including an automated neutron diffractometer, a pulsed NMR spectrometer and multi-particle detection systems. It guarantees the protection of each user's interests and dynamically assigns core memory, disk space and 9-track magnetic tape usage. Multiplexor hardware capability allows the transfer of data between a user's device and assigned core area at rates of 100,000 bytes/sec. Real-time histogram generation for a user can proceed at rates of 50,000 points/sec. The facility has been self-running (no computer operator) for five years with a mean time between failures of 10

  3. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  4. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view...

  5. Grout treatment facility operation

    International Nuclear Information System (INIS)

    This paper summarizes the operation of the Grout Treatment Facility from initial testing to the final disposal to date of 3.8 x 10³ m³ (1 Mgal) of low-level radioactive waste. It describes actual component testing and verification, testing of the full-scale system with simulated waste feed, summary of the radioactive disposal operation, lessons learned and equipment performance summary, facility impacts from safety analyses, long-term performance assessments, the Part B application, and projected facility modifications. The Grout Treatment Facility is one of two operations for permanently disposing of liquid defense wastes at the Hanford site near Richland, Washington, for the U.S. Department of Energy. High- and low-level radioactive wastes have been accumulating from defense material production since the mid-1940s at the Hanford site. All radioactive low-level and low-level mixed liquid wastes will be disposed of at the future Hanford Vitrification Facility

  6. Nuclear physics accelerator facilities

    International Nuclear Information System (INIS)

    This paper describes many of the nuclear physics heavy-ion accelerator facilities in the US and the research programs being conducted. The accelerators described are: Argonne National Laboratory--ATLAS; Brookhaven National Laboratory--Tandem/AGS Heavy Ion Facility; Brookhaven National Laboratory--Relativistic Heavy Ion Collider (RHIC) (Proposed); Continuous Electron Beam Accelerator Facility; Lawrence Berkeley Laboratory--Bevalac; Lawrence Berkeley Laboratory--88-Inch Cyclotron; Los Alamos National Laboratory--Clinton P. Anderson Meson Physics Facility (LAMPF); Massachusetts Institute of Technology--Bates Linear Accelerator Center; Oak Ridge National Laboratory--Holifield Heavy Ion Research Facility; Oak Ridge National Laboratory--Oak Ridge Electron Linear Accelerator; Stanford Linear Accelerator Center--Nuclear Physics Injector; Texas A&M University--Texas A&M Cyclotron; Triangle Universities Nuclear Laboratory (TUNL); University of Washington--Tandem/Superconducting Booster; and Yale University--Tandem Van de Graaff

  7. Computational chemistry

    OpenAIRE

    Truhlar, Donald G.; McKoy, Vincent

    2000-01-01

    Computational chemistry has come of age. With significant strides in computer hardware and software over the last few decades, computational chemistry has achieved full partnership with theory and experiment as a tool for understanding and predicting the behavior of a broad range of chemical, physical, and biological phenomena. The Nobel Prize award to John Pople and Walter Kohn in 1998 highlighted the importance of these advances in computational chemistry. With massively parallel computers ...

  8. NASA Center for Computational Sciences: History and Resources

    Science.gov (United States)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  9. Expertise concerning the request by the ZWILAG Intermediate Storage Wuerenlingen AG for delivery of the operation licence for the conditioning installation as well as for the burning and melting installation of the central intermediate storage facility for radioactive wastes in Wuerenlingen; Gutachten zum Gesuch der ZWILAG Zwischenlager Wuerenlingen AG um Erteilung der Betriebsbewilligung fuer die Konditionierungsanlage sowie fuer die Verbrennungs- und Schmelzanlage des Zentralen Zwischenlagers fuer radioaktive Abfaelle in Wuerenlingen

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-08-15

    On December 15, 1997, the Central Intermediate Storage Facility for radioactive materials (ZWILAG) delivered a request to the Swiss Federal Council for an operational licence for the conditioning installation and the incineration and melting installation at the Central Intermediate Storage Facility (ZZL). For the whole project, the general licence was granted on June 23, 1993. The construction of the whole installation as well as the operation in the storage halls with reception, hot cell and transhipment station were licensed on August 12, 1999. The Federal Agency for the Safety of Nuclear Installations (HSK) examined the request documentation of ZWILAG from the point of view of nuclear safety and radiation protection. The results of the examination are contained in this report. In general, the examination leads to a positive decision. During the realisation process many project adaptations have brought improvements to the installation as far as radiation protection and the treatment of wastes are concerned. HSK requests from a former licensing process were taken into account by ZWILAG. The present examination contains many remarks and requests that the HSK considers necessary. The most important remarks are issued as special references as far as they do not appear as proposals in the licence obligations. In general, the reference form was chosen when the fulfilment of the request can be guaranteed by HSK within the framework of a release process. Several references inform ZWILAG of problems for which HSK wishes to receive the results of extended inquiries or where HSK will assist in testing or demonstrations. Other references concern requests for plant documentation. Further, calculations of radioactive material release during normal operation are considered necessary.
Based on its examination, HSK concludes that the conditions are fulfilled for the safe operation of the conditioning installation and of the incineration and melting installation as long as

  10. BLT-EC (Breach, Leach and Transport-Equilibrium Chemistry) data input guide. A computer model for simulating release and coupled geochemical transport of contaminants from a subsurface disposal facility

    Energy Technology Data Exchange (ETDEWEB)

    MacKinnon, R.J. [Brookhaven National Lab., Upton, NY (United States)]|[Ecodynamic Research Associates, Inc., Albuquerque, NM (United States); Sullivan, T.M.; Kinsey, R.R. [Brookhaven National Lab., Upton, NY (United States)

    1997-05-01

    The BLT-EC computer code has been developed, implemented, and tested. BLT-EC is a two-dimensional finite element computer code capable of simulating the time-dependent release and reactive transport of aqueous phase species in a subsurface soil system. BLT-EC contains models to simulate the processes (container degradation, waste-form performance, transport, chemical reactions, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is provided through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste-form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, and solubility. Transport considers the processes of advection, dispersion, diffusion, chemical reaction, radioactive production and decay, and sources (waste form releases). Chemical reactions accounted for include complexation, sorption, dissolution-precipitation, oxidation-reduction, and ion exchange. Radioactive production and decay in the waste form is simulated. To improve the usefulness of BLT-EC, a pre-processor, ECIN, which assists in the creation of chemistry input files, and a post-processor, BLTPLOT, which provides a visual display of the data have been developed. BLT-EC also includes an extensive database of thermodynamic data that is also accessible to ECIN. This document reviews the models implemented in BLT-EC and serves as a guide to creating input files and applying BLT-EC.
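The radioactive production and decay that both BLT codes simulate reduces, in the simplest case, to a two-member decay chain. The sketch below is the generic textbook Bateman solution for that case, not code taken from BLT-EC:

```python
import math

def bateman_two_member(n1_0, lam1, lam2, t):
    """Atoms of parent (n1) and daughter (n2) at time t for a two-member
    decay chain with decay constants lam1, lam2 (lam1 != lam2) and no
    initial daughter inventory: the classic Bateman solution."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = (n1_0 * lam1 / (lam2 - lam1)
          * (math.exp(-lam1 * t) - math.exp(-lam2 * t)))
    return n1, n2

# After one parent half-life (lam1 = 0.1 /s), half the parent remains
# and the shorter-lived daughter (lam2 = 0.5 /s) has partly decayed away.
half_life = math.log(2) / 0.1
n1, n2 = bateman_two_member(1000.0, 0.1, 0.5, half_life)
```

Transport codes chain such terms (or integrate the ODEs numerically) for longer decay series and couple the resulting source term to advection and dispersion.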

  11. BLT-EC (Breach, Leach and Transport-Equilibrium Chemistry) data input guide. A computer model for simulating release and coupled geochemical transport of contaminants from a subsurface disposal facility

    International Nuclear Information System (INIS)

    The BLT-EC computer code has been developed, implemented, and tested. BLT-EC is a two-dimensional finite element computer code capable of simulating the time-dependent release and reactive transport of aqueous-phase species in a subsurface soil system. BLT-EC contains models to simulate the processes (container degradation, waste-form performance, transport, chemical reactions, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is provided through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste-form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, and solubility. Transport considers the processes of advection, dispersion, diffusion, chemical reaction, radioactive production and decay, and sources (waste-form releases). Chemical reactions accounted for include complexation, sorption, dissolution-precipitation, oxidation-reduction, and ion exchange. Radioactive production and decay in the waste form is simulated. To improve the usefulness of BLT-EC, a pre-processor, ECIN, which assists in the creation of chemistry input files, and a post-processor, BLTPLOT, which provides a visual display of the data, have been developed. BLT-EC also includes an extensive database of thermodynamic data that is accessible to ECIN. This document reviews the models implemented in BLT-EC and serves as a guide to creating input files and applying BLT-EC

  12. Assessment of the proposed decontamination and waste treatment facility at LLNL

    International Nuclear Information System (INIS)

    To provide a centralized decontamination and waste treatment facility (DWTF) at LLNL, the construction of a new installation has been planned. The objectives of this new facility were to replace obsolete, structurally and environmentally sub-marginal liquid- and solid-waste processing facilities and the decontamination facility, and to bring these operations into compliance with existing federal, state, and local regulations as well as DOE orders. In a previous study, SAIC conducted a preliminary review and evaluation of the existing facilities at LLNL and of the cost effectiveness of the proposed DWTF. This document reports on a detailed review of specific aspects of the proposed DWTF

  13. Reactor coolant cleanup facility

    International Nuclear Information System (INIS)

    A depressurization device is disposed in the pipelines upstream of the recycling pumps of a reactor coolant cleanup facility to reduce the pressure between the depressurization device and the downstream recycling pump, thereby enabling high-pressure coolant injection from other systems by way of the recycling pumps. In an emergency, the recycling pumps of the coolant cleanup facility can thus be shared with the emergency reactor core cooling facility and the reactor shutdown facility. Since the existing pumps of the emergency reactor core cooling facility and the reactor shutdown facility, which are usually in a stand-by state, can be removed, operation confirmation tests and equipment maintenance in both facilities can be saved, so that plant maintainability and reliability are improved and burdens on operators are mitigated. Moreover, a low-pressure design can be adopted for the non-regenerative heat exchanger and the recycling coolant pumps, which improves reliability and economy by reducing the possibility of leakage. (N.H.)

  14. DUPIC facility engineering

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. S.; Choi, J. W.; Go, W. I.; Kim, H. D.; Song, K. C.; Jeong, I. H.; Park, H. S.; Im, C. S.; Lee, H. M.; Moon, K. H.; Hong, K. P.; Lee, K. S.; Suh, K. S.; Kim, E. K.; Min, D. K.; Lee, J. C.; Chun, Y. B.; Paik, S. Y.; Lee, E. P.; Yoo, G. S.; Kim, Y. S.; Park, J. C.

    1997-09-01

    In the early stage of the project, a comprehensive survey was conducted to assess the feasibility of using available facilities and the interfaces between them. It was found that the shielded cell M6 of IMEF could be used for the main process experiments of DUPIC fuel fabrication in regard to space adequacy, material flow, equipment layout, etc. Based on this examination, a suitable adapter system for material transfer around the M6 cell was engineered. Regarding the PIEF facility, where spent PWR fuel assemblies are stored in an annex pool, disassembly devices in the pool were retrofitted, and a spent-fuel rod cutting and shipping system to the IMEF was designed and built. For acquisition of casks for radioactive material transport between the facilities, adaptive refurbishment was applied to the available cask (Padirac) based on extensive analysis of safety requirements. A mockup test facility was newly acquired for remote testing of DUPIC fuel fabrication process equipment prior to installation in the M6 cell of the IMEF facility. (author). 157 refs., 57 tabs., 65 figs.

  15. DUPIC facility engineering

    International Nuclear Information System (INIS)

    In the early stage of the project, a comprehensive survey was conducted to assess the feasibility of using available facilities and the interfaces between them. It was found that the shielded cell M6 of IMEF could be used for the main process experiments of DUPIC fuel fabrication in regard to space adequacy, material flow, equipment layout, etc. Based on this examination, a suitable adapter system for material transfer around the M6 cell was engineered. Regarding the PIEF facility, where spent PWR fuel assemblies are stored in an annex pool, disassembly devices in the pool were retrofitted, and a spent-fuel rod cutting and shipping system to the IMEF was designed and built. For acquisition of casks for radioactive material transport between the facilities, adaptive refurbishment was applied to the available cask (Padirac) based on extensive analysis of safety requirements. A mockup test facility was newly acquired for remote testing of DUPIC fuel fabrication process equipment prior to installation in the M6 cell of the IMEF facility. (author). 157 refs., 57 tabs., 65 figs

  16. SECURITY FOR DATA STORAGE IN CLOUD COMPUTING

    OpenAIRE

    Prof. S.A.Gade; Mukesh P.Patil

    2015-01-01

    Cloud computing is a style of computing in which everything from computing power to business applications is provided as a service. Such applications move data to large data centers, where security must be fully trustworthy. The data stored in the cloud may be frequently updated by registered users through actions such as insertion, deletion, and modification. To ensure secure data storage in the cloud, this web applicatio...
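    Verifying that cloud-stored data has not been tampered with typically reduces to comparing an owner-held fingerprint against one recomputed over the retrieved data. A minimal hash-based sketch of that idea (an illustration of the general technique, not the scheme proposed in this paper):

    ```python
    import hashlib

    def fingerprint(data: bytes) -> str:
        # SHA-256 digest kept locally by the data owner.
        return hashlib.sha256(data).hexdigest()

    def verify(data_from_cloud: bytes, stored: str) -> bool:
        # Recompute the digest over the retrieved data and compare.
        return fingerprint(data_from_cloud) == stored
    ```

    Schemes that support frequent insertions and deletions usually replace the single digest with a Merkle hash tree or homomorphic authenticators, so an update does not force re-hashing the entire file.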

  17. Duality Computing in Quantum Computers

    Institute of Scientific and Technical Information of China (English)

    LONG Gui-Lu; LIU Yang

    2008-01-01

    In this letter, we propose a duality computing mode, which resembles the particle-wave duality exhibited when a quantum system such as a quantum computer passes through a double-slit. In this mode, computing operations are not necessarily unitary. The duality mode provides a natural link between classical computing and quantum computing. In addition, the duality mode provides a new tool for quantum algorithm design.
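    The non-unitary operations of such a duality mode can be expressed as linear combinations of unitaries. A small numerical sketch (hypothetical, not taken from the letter) applying the non-unitary combination 0.5*(I + X) to the state |0>:

    ```python
    import numpy as np

    I2 = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)  # Pauli-X gate

    # A duality-mode operation: an equal-weight sum of two unitaries.
    # 0.5*(I + X) is a projector, hence not unitary.
    D = 0.5 * (I2 + X)

    psi = np.array([1, 0], dtype=complex)  # the state |0>
    out = D @ psi                          # equals 0.5*(|0> + |1>)
    ```

    The output norm is below 1; in a physical realization this corresponds to the operation succeeding with probability less than one.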

  18. Replication of Space-Shuttle Computers in FPGAs and ASICs

    Science.gov (United States)

    Ferguson, Roscoe C.

    2008-01-01

    A document discusses the replication of the functionality of the onboard space-shuttle general-purpose computers (GPCs) in field-programmable gate arrays (FPGAs) and application-specific integrated circuits (ASICs). The purpose of the replication effort is to enable utilization of proven space-shuttle flight software and software-development facilities to the extent possible during development of software for flight computers for a new generation of launch vehicles derived from the space shuttles. The replication involves specifying the instruction set of the central processing unit and the input/output processor (IOP) of the space-shuttle GPC in a hardware description language (HDL). The HDL is synthesized to form a "core" processor in an FPGA or, less preferably, in an ASIC. The core processor can be used to create a flight-control card to be inserted into a new avionics computer. The IOP of the GPC as implemented in the core processor could be designed to support data-bus protocols other than that of a multiplexer interface adapter (MIA) used in the space shuttle. Hence, a computer containing the core processor could be tailored to communicate via the space-shuttle GPC bus and/or one or more other buses.
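    An instruction-set replication of this kind is, at its core, a fetch-decode-execute machine. The toy accumulator interpreter below is only a loose illustration of that structure (the opcodes are invented; the GPC's real instruction set is not reproduced here):

    ```python
    def run(program, memory):
        # Toy fetch-decode-execute loop for a hypothetical accumulator ISA.
        # program: list of (opcode, operand) pairs; memory: dict addr -> value.
        acc, pc = 0, 0
        while pc < len(program):
            op, arg = program[pc]   # fetch
            pc += 1
            if op == "LOAD":        # decode and execute
                acc = memory[arg]
            elif op == "ADD":
                acc += memory[arg]
            elif op == "STORE":
                memory[arg] = acc
            elif op == "HALT":
                break
        return memory
    ```

    In an HDL synthesis the same loop becomes a clocked state machine; an FPGA core must additionally reproduce the IOP's bus timing, which a software model like this ignores.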

  19. Biotechnology Facility: An ISS Microgravity Research Facility

    Science.gov (United States)

    Gonda, Steve R.; Tsao, Yow-Min

    2000-01-01

    The International Space Station (ISS) will support several facilities dedicated to scientific research. One such facility, the Biotechnology Facility (BTF), is sponsored by the Microgravity Sciences and Applications Division (MSAD) and developed at NASA's Johnson Space Center. The BTF is scheduled for delivery to the ISS via Space Shuttle in April 2005. The purpose of the BTF is to provide: (1) the support structure and integration capabilities for the individual modules in which biotechnology experiments will be performed, (2) the capability for human-tended, repetitive, long-duration biotechnology experiments, and (3) opportunities to perform repetitive experiments in a short period by allowing continuous access to microgravity. The MSAD has identified cell culture and tissue engineering, protein crystal growth, and fundamentals of biotechnology as areas that contain promising opportunities for significant advancement through low-gravity experiments. The focus of this coordinated ground- and space-based research program is the use of the low-gravity environment of space to conduct fundamental investigations leading to major advances in the understanding of basic and applied biotechnology. Results from planned investigations can be used in applications ranging from rational drug design and testing, to cancer diagnosis and treatment, to tissue engineering leading to replacement tissues.

  20. Computational manufacturing

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a general framework for computational manufacturing. The methodology of computational manufacturing aims at integrating computational geometry, machining principles, sensor information fusion, optimization, computational intelligence, and virtual prototyping to solve problems of the modeling, reasoning, control, planning, and scheduling of manufacturing processes and systems. There are three typical problems in computational manufacturing, i.e., scheduling (time-domain), geometric reasoning (space-domain), and decision-making (interaction between time-domain and space-domain). Some theoretical fundamentals of computational manufacturing are also discussed.
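    The time-domain scheduling problem named above has a classical single-machine instance: ordering jobs by the shortest-processing-time (SPT) rule minimizes total completion time. A textbook sketch of the rule (an illustration, not an algorithm from the paper):

    ```python
    def spt_schedule(jobs):
        # jobs: dict mapping job name -> processing time.
        # SPT is optimal for minimizing total completion time on one
        # machine. Returns (job order, total completion time).
        order = sorted(jobs, key=jobs.get)
        clock, total = 0, 0
        for j in order:
            clock += jobs[j]   # job j finishes at time `clock`
            total += clock
        return order, total
    ```

    For three jobs with processing times 3, 1, and 2, SPT runs them in order 1, 2, 3, giving completion times 1, 3, and 6 and a total of 10.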