WorldWideScience

Sample records for central computing facility

  1. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  2. Centralized computer-based controls of the Nova Laser Facility

    International Nuclear Information System (INIS)

    This article introduces the overall architecture of the computer-based Nova Laser Control System and describes its basic components. Use of standard hardware and software components ensures that the system, while specialized and distributed throughout the facility, is adaptable. 9 references, 6 figures

  3. On-line satellite/central computer facility of the Multiparticle Argo Spectrometer System

    International Nuclear Information System (INIS)

    An on-line satellite/central computer facility has been developed at Brookhaven National Laboratory as part of the Multiparticle Argo Spectrometer System (MASS). This facility, consisting of a PDP-9 and a CDC-6600, has been successfully used in the study of proton-proton interactions at 28.5 GeV/c. (U.S.)

  4. Fermilab Central Computing Facility: Energy conservation report and mechanical systems design optimization and cost analysis study

    Energy Technology Data Exchange (ETDEWEB)

    Krstulovich, S.F.

    1986-11-12

    This report is developed as part of the Fermilab Central Computing Facility Project Title II Design Documentation Update under the provisions of DOE Document 6430.1, Chapter XIII-21, Section 14, paragraph a. As such, it concentrates primarily on HVAC mechanical systems design optimization and cost analysis, and should be considered a supplement to the Title I Design Report dated March 1986, wherein energy-related issues pertaining to the building envelope and orientation as well as the electrical systems design are discussed.

  5. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  6. Advantages for the introduction of computer techniques in centralized supervision of radiation levels in nuclear facilities

    International Nuclear Information System (INIS)

    A new computerized information system at the Saclay Center comprising 120 measuring channels is described. The advantages offered by this system with respect to the systems in use up to now are presented. Experimental results are given which support the argument that the system can effectively supervise the radioisotope facility at the Center. (B.G.)

  7. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  8. Securing a HENP Computing Facility

    CERN Document Server

    Misawa, S; Throwe, T

    2003-01-01

    Traditionally, HENP computing facilities have been open facilities that are accessed in many different ways by users that are both internal and external to the facility. However, the need to protect the facility from cybersecurity threats has made it difficult to maintain the openness of the facility to off-site and on-site users. In this paper, we discuss the strategy we have used and the architecture we have developed and deployed to increase the security of the US ATLAS and RHIC Computing Facilities, while trying to maintain the openness and accessibility that our user community has come to expect. Included in this discussion are the tools that we have used and the operational experience we have had with the deployed architecture.

  9. AMRITA -- A computational facility

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, J.E. [California Inst. of Tech., CA (US); Quirk, J.J.

    1998-02-23

    Amrita is a software system for automating numerical investigations. The system is driven using its own powerful scripting language, Amrita, which facilitates both the composition and archiving of complete numerical investigations, as distinct from isolated computations. Once archived, an Amrita investigation can later be reproduced by any interested party, and not just the original investigator, for no cost other than the raw CPU time needed to parse the archived script. In fact, this entire lecture can be reconstructed in such a fashion. To do this, the script constructs a number of shock-capturing schemes; runs a series of test problems; generates the plots shown; outputs the LaTeX to typeset the notes; and performs a myriad of behind-the-scenes tasks to glue everything together. Thus Amrita has all the characteristics of an operating system and should not be mistaken for a common-or-garden code.
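
    The following is an illustrative sketch (in Python, not Amrita's own scripting language) of the idea the abstract describes: a single self-contained script that runs a small numerical study, archives every parameter and result, and emits the LaTeX needed to typeset a table, so that the whole investigation can be reproduced simply by re-running the script. The scheme, test problem, and file names are hypothetical.

```python
# Hypothetical sketch (not Amrita itself): one self-contained script that runs a
# small numerical study, archives every input and result, and emits LaTeX, so the
# whole "investigation" can be reproduced later simply by re-running the script.
import json
import math
from pathlib import Path

ARCHIVE = Path("investigation_archive")
ARCHIVE.mkdir(exist_ok=True)

def upwind_advection(u0, c, dx, dt, steps):
    """First-order upwind scheme for u_t + c u_x = 0 on a periodic domain."""
    u = list(u0)
    nu = c * dt / dx                      # Courant number
    for _ in range(steps):
        u = [u[i] - nu * (u[i] - u[i - 1]) for i in range(len(u))]
    return u

# The "investigation": compare grid resolutions on the same test problem.
results = {}
for n in (50, 100, 200):
    dx = 1.0 / n
    dt = 0.4 * dx                                          # CFL number 0.4
    x = [i * dx for i in range(n)]
    u0 = [math.sin(2 * math.pi * xi) for xi in x]
    u = upwind_advection(u0, c=1.0, dx=dx, dt=dt, steps=round(1.0 / dt))
    results[n] = max(abs(a - b) for a, b in zip(u, u0))    # error after one period

# Archive parameters and results together, then typeset a small LaTeX table.
(ARCHIVE / "results.json").write_text(json.dumps(results, indent=2))
rows = "\n".join(rf"{n} & {err:.3e} \\" for n, err in sorted(results.items()))
(ARCHIVE / "table.tex").write_text(
    "\\begin{tabular}{rr}\nN & max error \\\\\n" + rows + "\n\\end{tabular}\n"
)
print("archived:", sorted(p.name for p in ARCHIVE.iterdir()))
```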

  10. Facility for positron computed tomography

    International Nuclear Information System (INIS)

    The positron computed tomography facility has scintillator detector rings that simultaneously record more than one tomographic image of different cross-sections of the patient. The detectors in neighboring rings are staggered and can be rotated with respect to each other in order to increase the count rate without loss of efficiency. (DG)

  11. Computer Security at Nuclear Facilities

    International Nuclear Information System (INIS)

    This series of slides presents the IAEA policy concerning the development of recommendations and guidelines for computer security at nuclear facilities. A document of the Nuclear Security Series dedicated to this issue is in the final stage prior to publication. It will be the first IAEA document specifically addressing computer security. This document was necessary for three main reasons: first, not all national infrastructures have recognized and standardized computer security; second, existing international guidance is not industry-specific and fails to capture some of the key issues; and third, the presence of more or less interconnected digital systems is increasing in the design of nuclear power plants. The security of computer systems must be based on a graded approach: the assignment of computer systems to different levels and zones should be based on their relevance to safety and security, and the risk assessment process should be allowed to feed back into and influence the graded approach

  12. Facility for positron computed tomography

    International Nuclear Information System (INIS)

    For positron computed tomography, two or more rings of scintillation detectors are used, so that three or more sections of the object may be imaged at a time. The rings are placed in parallel planes at some distance from each other, and axially movable collimator rings are provided. Each collimator can be moved towards the opposite collimator and towards a central collimator, which is also ring-shaped and is placed between the rows of detectors. The external and internal collimators are used for data selection and image formation. (DG)

  13. Central Facilities Area Sewage Lagoon Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Giesbrecht, Alan [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

    The Central Facilities Area (CFA), located in Butte County, Idaho, at Idaho National Laboratory (INL), has an existing wastewater system to collect and treat sanitary wastewater and non-contact cooling water from the facility. The existing treatment facility consists of three cells: Cell 1 has a surface area of 1.7 acres, Cell 2 has a surface area of 10.3 acres, and Cell 3 has a surface area of 0.5 acres. If flows exceed the evaporative capacity of the cells, wastewater is discharged to a 73.5 acre land application site that utilizes a center pivot irrigation sprinkler system. The purpose of this current study is to update the analysis and conclusions of the December 2013 study. In this current study, the new seepage rate and influent flow rate data have been used to update the calculations, model, and analysis.

  14. Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities

    Science.gov (United States)

    Atkinson, John S.; Spenneman, Dirk H. R.; Cornforth, David

    2005-01-01

    Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratory facilities into a distributed high performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…

  15. SERC Central Laser Facility annual report 1992

    International Nuclear Information System (INIS)

    This 1992 annual report to the Laser Facility Committee of the Science and Engineering Research Council describes technical progress at the Central Laser Facility, Rutherford Appleton Laboratory, and outlines mid-term organizational goals. Outstanding among recent achievements is the work on plasma heating being undertaken on the Sprite facility using the ultra-bright KrF-laser-pumped Raman beams. Two-beam operation at power levels approaching 2 TW in 10 ps is hoped for. On a four-year timescale the Titania system will provide four Raman beams of exceptional brightness and power up to 20 TW in 10 ps. The other high-power laser facility, Vulcan, is also producing exciting work. Progress in nanosecond studies using Raman spectroscopy has produced the first Raman spectrum of solvated buckminsterfullerene and direct observation of the separation of geminate ion pairs, as well as information on the behaviour of a single base in an oligonucleotide chain. Phase boundaries for the solidification of a two-dimensional electron fluid have been determined in a gallium arsenide heterojunction. Despite staff attrition, operation and development of the facilities have continued successfully. (UK)

  16. 2015 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  17. 2014 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  18. Computer Security at Nuclear Facilities

    International Nuclear Information System (INIS)

    category of the IAEA Nuclear Security Series, and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series has been made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  19. Central Facilities Area Sewage Lagoon Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Mark R. Cole

    2013-12-01

    The Central Facilities Area (CFA), located in Butte County, Idaho, at the Idaho National Laboratory has an existing wastewater system to collect and treat sanitary wastewater and non-contact cooling water from the facility. The existing treatment facility consists of three cells: Cell #1 has a surface area of 1.7 acres, Cell #2 has a surface area of 10.3 acres, and Cell #3 has a surface area of 0.5 acres. If flows exceed the evaporative capacity of the cells, wastewater is discharged to a 73.5-acre land application site that uses a center-pivot irrigation sprinkler system. As flows at CFA have decreased in recent years, the amount of wastewater discharged to the land application site has decreased from 13.64 million gallons in 2004 to no discharge in 2012 and 2013. In addition to the decreasing need for land application, approximately 7.7 MG of supplemental water was added to the system in 2013 to maintain a water level and prevent the clay soil liners in the cells from drying out and “cracking.” The Idaho National Laboratory is concerned that the sewage lagoons and land application site may be oversized for current and future flows. A further concern is the sustainability of the large volumes of supplemental water that are added to the system according to current operational practices. Therefore, this study was initiated to evaluate the system capacity, operational practices, and potential improvement alternatives, as warranted.
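
    As a rough illustration of the kind of water-balance check such an evaluation rests on, the sketch below compares an assumed influent flow against evaporation and seepage losses over the three cells; all rates are placeholder values, not the report's data.

```python
# Illustrative annual water balance for an evaporation lagoon (hypothetical
# numbers, not the INL report's data): if inflow exceeds evaporation plus
# seepage, the surplus must go to land application; if not, supplemental
# water is needed to keep the clay liners wet.
GAL_PER_ACRE_FT = 325_851  # gallons in one acre-foot

cells_acres = {"Cell 1": 1.7, "Cell 2": 10.3, "Cell 3": 0.5}
evap_ft_per_yr = 3.0       # assumed net lake evaporation, ft/yr
seep_ft_per_yr = 0.5       # assumed liner seepage, ft/yr
influent_mg_per_yr = 5.0   # assumed facility influent, million gallons/yr

area = sum(cells_acres.values())                              # 12.5 acres
losses_mg = area * (evap_ft_per_yr + seep_ft_per_yr) * GAL_PER_ACRE_FT / 1e6
balance_mg = influent_mg_per_yr - losses_mg

if balance_mg > 0:
    print(f"surplus {balance_mg:.1f} MG/yr -> land application required")
else:
    print(f"deficit {abs(balance_mg):.1f} MG/yr -> supplemental water required")
```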

  20. Science and Engineering Research Council Central Laser Facility

    International Nuclear Information System (INIS)

    This report covers the work done at, or in association with, the Central Laser Facility during the year April 1980 to March 1981. In the first chapter the major reconstruction and upgrade of the glass laser, which has been undertaken in order to increase the versatility of the facility, is described. The work of the six groups of the Glass Laser Scientific Programme and Scheduling Committee is described in further chapters entitled: glass laser development, laser plasma interactions, transport and particle emission studies, ablative acceleration and compression studies, spectroscopy and XUV lasers, and theory and computation. Publications based on the work of the facility which have either appeared or been accepted for publication during the year are listed. (U.K.)

  1. Computing facility at SSC for detectors

    International Nuclear Information System (INIS)

    The RISC-based distributed computing facility for detector simulation being developed at the SSC Laboratory is described. The first phase of this facility is scheduled for completion in early 1991. Included are the status of the project, an overview of the concepts used to model and define the system architecture, networking capabilities for user access, plans for support of physics codes, and related topics concerning the implementation of this facility

  2. Central control element expands computer capability

    Science.gov (United States)

    Easton, R. A.

    1975-01-01

    Redundant processing and multiprocessing modes can be obtained from one computer by using a logic configuration. The configuration serves as a central control element that can automatically alternate between a high-capacity multiprocessing mode and a high-reliability redundant mode, using dynamic mode switching in real time.

  3. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    The combination of computer system technology and numerical modeling has advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility, and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  4. Computer facilities at the Research Centre Seibersdorf

    International Nuclear Information System (INIS)

    The computer facilities available at the Mathematics Division of the Institute are outlined including their development since 1966. The major areas of use of the computers by the science divisions and in administration are described as well as the tasks performed for industry. Two examples of the computer applications are considered in some detail: 1) A system developed for control and data acquisition in asbestos-cement plate production; 2) A model treatment of safety calculations for the steam generating systems of light-water reactors. (S.R.)

  5. Computational Science at the Argonne Leadership Computing Facility

    Science.gov (United States)

    Romero, Nichols

    2014-03-01

    The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer - Mira, an IBM Blue Gene/Q system - has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.

  6. A new conception for the central control facility of MEA

    International Nuclear Information System (INIS)

    To control the AFBU from the Central Control Facility, some changes will be necessary. This has led to a complete revision of the facilities of both consoles. Some proposals are made to improve the response time of the control systems. These improvements are feasible at short notice. (G.J.P.)

  7. Computer simulation of tritium removal facility design

    International Nuclear Information System (INIS)

    In this study, a computer simulation of tritium diffusion out of molten salt is performed using COMSOL Multiphysics. The purpose of the simulation is to investigate the efficiency of the permeation-window-type tritium removal facility proposed for tritium control in FHRs. The results of the simulation suggest that a large surface area is one of the key issues in the design of the tritium removal facility, and that the simple tube-bundle concept is insufficient to provide the surface area needed for an efficient tritium removal process. (author)
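
    The sizing argument behind the permeation-window concept can be sketched with Richardson's law for diffusion-limited hydrogen-isotope transport through a metal wall, J = (Φ/d)(√p_up − √p_down). The sketch below is not the paper's COMSOL model; the permeability, pressures, and source term are placeholder values chosen only to show how quickly the required surface area grows when the driving pressure is small.

```python
# Hedged sketch of the sizing argument, not the paper's COMSOL model: for a
# metal "permeation window", diffusion-limited hydrogen-isotope transport is
# often described by Richardson's law, J = (Phi/d) * (sqrt(p_up) - sqrt(p_down)).
# All numbers below are placeholders.
import math

phi = 1.0e-10        # assumed permeability, mol / (m * s * Pa^0.5)
d = 1.0e-3           # assumed wall thickness, m
p_up = 1.0e-2        # assumed tritium partial pressure on the salt side, Pa
p_down = 0.0         # swept/vacuum side, Pa

flux = phi / d * (math.sqrt(p_up) - math.sqrt(p_down))     # mol / (m^2 * s)

generation = 1.0e-6  # assumed tritium source term to be removed, mol/s
area_needed = generation / flux
print(f"flux = {flux:.2e} mol/m^2/s, required area ~ {area_needed:.0f} m^2")
```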

  8. The Central Raman Laser Facility at the Pierre Auger Observatory

    Science.gov (United States)

    medina, C.; Mayotte, E.; Wiencke, L. R.; Rizi, V.; Grillo, A.

    2013-12-01

    We describe the newly upgraded Central Raman Laser Facility (CRLF) located close to the center of the Pierre Auger Observatory (PAO) in Argentina. The CRLF features a Raman lidar receiver, a 335 nm wavelength solid state laser, a robotic beam energy calibration system, and a weather station, all powered by solar energy and operated autonomously using a single board computer. The system optics are arranged to direct the laser beam into the atmosphere in steered and vertical modes with adjustable polarization settings, and it is measured in a bi-static configuration by the 4 fluorescence stations of the Pierre Auger Observatory. Additionally, the system optics can be easily switched to provide a fixed vertical beam that is measured by a Raman lidar receiver in mono-static configuration, allowing an independent measurement of the aerosol optical depth τ(z,t) and other properties of the atmosphere. A description of the CRLF's installation, hardware and software integration, initial operations, and examples of data collected will also be presented.

  9. Conceptual design report for Central Waste Disposal Facility

    International Nuclear Information System (INIS)

    The permanent facilities are defined, and cost estimates are provided, for the disposal of Low-Level Radioactive Wastes (LLW) at the Central Waste Disposal Facility (CWDF). The waste designated for the Central Waste Disposal Facility will be generated by the Y-12 Plant, the Oak Ridge Gaseous Diffusion Plant, and the Oak Ridge National Laboratory. The facility will be operated by ORNL for the Office of Defense Waste and By-Products Management of the Department of Energy. The CWDF will be located on the Department of Energy's Oak Ridge Reservation, west of Highway 95 and south of Bear Creek Road. The body of this Conceptual Design Report (CDR) describes the permanent facilities required for the operation of the CWDF. Initial facilities, trenches, and minimal operating equipment will be provided in earlier projects. The disposal of LLW will be by shallow land burial in engineered trenches. DOE Order 5820 was used as the performance standard for the proper disposal of radioactive waste. The permanent facilities are intended for beneficial occupancy during the first quarter of fiscal year 1989. 3 references, 9 figures, 7 tables

  10. Computer modeling of commercial refrigerated warehouse facilities

    International Nuclear Information System (INIS)

    The use of computer models to simulate the energy performance of large commercial refrigeration systems typically found in food processing facilities is an area of engineering practice that has seen little development to date. Current techniques employed in predicting energy consumption by such systems have focused on temperature bin methods of analysis. Existing simulation tools such as DOE2 are designed to model commercial buildings and grocery store refrigeration systems. The HVAC and refrigeration system performance models in these simulation tools model equipment common to commercial buildings and groceries, and respond to energy-efficiency measures likely to be applied to these building types. The applicability of traditional building energy simulation tools to model refrigerated warehouse performance and analyze energy-saving options is limited. The paper will present the results of modeling work undertaken to evaluate energy savings resulting from incentives offered by a California utility to its Refrigerated Warehouse Program participants. The TRNSYS general-purpose transient simulation model was used to predict facility performance and estimate program savings. Custom TRNSYS components were developed to address modeling issues specific to refrigerated warehouse systems, including warehouse loading door infiltration calculations, an evaporator model, single-stage and multi-stage compressor models, evaporative condenser models, and defrost energy requirements. The main focus of the paper will be on the modeling approach. The results from the computer simulations, along with overall program impact evaluation results, will also be presented
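
    One of the modeling issues mentioned above, loading-door infiltration, can be illustrated with a much simpler calculation than the custom TRNSYS component: the refrigeration load equals the mass of warm air exchanged through the open door times the air-enthalpy difference across it. All numbers in the sketch are assumed placeholders.

```python
# Minimal sketch (not the custom TRNSYS component) of one piece of the load
# model discussed above: the sensible-plus-latent refrigeration load caused by
# warm dock air infiltrating through a loading door while it stands open.
air_density = 1.2          # kg/m^3, cold-side air (assumed)
exchange_rate = 1.5        # m^3/s of air swapped while the door is open (assumed)
door_open_hours = 4.0      # hours per day the door is effectively open (assumed)
h_outside = 60.0           # kJ/kg, warm dock air enthalpy (assumed)
h_inside = 5.0             # kJ/kg, freezer air enthalpy (assumed)

seconds_open = door_open_hours * 3600.0
mass_exchanged = air_density * exchange_rate * seconds_open       # kg/day
infiltration_load_kj = mass_exchanged * (h_outside - h_inside)    # kJ/day
avg_load_kw = infiltration_load_kj / 86400.0                      # average kW

print(f"infiltration load ~ {infiltration_load_kj/1e3:.0f} MJ/day "
      f"(average {avg_load_kw:.1f} kW of refrigeration)")
```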

  11. Centralization and Decentralization of Schools' Physical Facilities Management in Nigeria

    Science.gov (United States)

    Ikoya, Peter O.

    2008-01-01

    Purpose: This research aims to examine the difference in the availability, adequacy and functionality of physical facilities in centralized and decentralized schools districts, with a view to making appropriate recommendations to stakeholders on the reform programmes in the Nigerian education sector. Design/methodology/approach: Principals,…

  12. Computational evaluation of a neutron field facility

    International Nuclear Information System (INIS)

    This paper describes the results of a study based on computer simulation of a realistic 3D model of the Ionizing Radiation Laboratory of the Institute for Advanced Studies (IEAv) using the MCNP5 (Monte Carlo N-Particle) code, in order to guide the installation of a neutron generator in which neutrons are produced by the 3H(d,n)4He reaction. The equipment produces neutrons with an energy of 14.1 MeV at a production rate of 2 x 10^8 n/s into 4π geometry, and can also be used for neutron dosimetry studies. This work evaluated the neutron spectra and fluence at previously selected positions inside the facility, chosen for assessment of the ambient dose equivalent, so that the necessary adjustments can be made to the installation to comply with the radiation protection and radiation safety guidelines determined by the standards of the National Nuclear Energy Commission (CNEN). (author)

  13. Computational evaluation of a neutron field facility

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Jose Julio de O.; Pazianotto, Mauricio T., E-mail: jjfilos@hotmail.com, E-mail: mpazianotto@gmail.com [Instituto Tecnologico de Aeronautica (ITA/DCTA), Sao Jose dos Campos, SP (Brazil); Federico, Claudio A.; Passaro, Angelo, E-mail: claudiofederico@ieav.cta.br, E-mail: angelo@ieav.cta.br [Instituto de Estudos Avancados (IEAv/DCTA), Sao Jose dos Campos, SP (Brazil)

    2015-07-01

    This paper describes the results of a study based on computer simulation of a realistic 3D model of the Ionizing Radiation Laboratory of the Institute for Advanced Studies (IEAv) using the MCNP5 (Monte Carlo N-Particle) code, in order to guide the installation of a neutron generator in which neutrons are produced by the 3H(d,n)4He reaction. The equipment produces neutrons with an energy of 14.1 MeV at a production rate of 2 x 10^8 n/s into 4π geometry, and can also be used for neutron dosimetry studies. This work evaluated the neutron spectra and fluence at previously selected positions inside the facility, chosen for assessment of the ambient dose equivalent, so that the necessary adjustments can be made to the installation to comply with the radiation protection and radiation safety guidelines determined by the standards of the National Nuclear Energy Commission (CNEN). (author)
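
    For orientation, the uncollided fluence rate from such a generator can be estimated with the point-source relation φ = S/(4πr²); this back-of-the-envelope sketch is not a substitute for the MCNP5 model, which also accounts for scattering and room return. The distances are illustrative.

```python
# Back-of-the-envelope check of the source term quoted above (a sketch, not the
# MCNP5 model): uncollided fluence rate at distance r from an isotropic point
# source is phi = S / (4 * pi * r^2).
import math

S = 2.0e8                        # source strength, n/s (from the abstract)
for r_m in (0.5, 1.0, 2.0, 5.0): # illustrative detector distances, metres
    phi_per_m2 = S / (4.0 * math.pi * r_m**2)   # n / (m^2 s)
    phi_per_cm2 = phi_per_m2 / 1.0e4            # n / (cm^2 s)
    print(f"r = {r_m:>3.1f} m : uncollided fluence rate ~ {phi_per_cm2:.2e} n/cm^2/s")
```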

  14. Computational analysis of PARAMETR facility experiments

    International Nuclear Information System (INIS)

    Full text of publication follows: Results of calculations of PARAMETR experiments are given in the paper. The PARAMETR facility is designed to research the phenomena relevant to typical LOCA scenarios (including severe accidents) of VVER-type reactors. The investigations at the PARAMETR facility are directed at experimental research of the behavior of fuel rods and core materials, hydrogen generation processes, and the melting and interaction of core materials during severe accidents. The main facility components are a rod bundle of 1250 mm heated length (up to 37 rods can be used), an electrical power source, steam and water supply systems, and instrumentation. The bundle is a mix of fresh fuel rods and electrically heated rods with uranium tablets and a tungsten heater inside. The main objectives of the calculations are analysis of the capability of computer codes, in particular RELAP/SCDAPSIM, to model severe accidents, and identification of the impact of major parameters on calculation results, and thus improvement of accident analysis. RELAP/SCDAPSIM calculations were used to choose the key parameters of the experiments. The influence of thermal insulation properties, uncertainties in heater geometry, and insulation thermal conductivity was analyzed. Conditions and parameters needed for an intensive zirconium reaction to start were investigated. As a whole, the calculation results showed good agreement with the experiments. Some key points were observed, such as the essential impact of the preheating phase and the importance of thermal insulation material properties. Proper modeling of particular processes during the preheating phase was very important, since this phase defined the bundle temperature level at the heating phase. There were some difficulties here. For instance, overestimation of temperatures had been observed until axial profiling of thermal conductivity was introduced. Some improved models were used to reach better agreement with the experiments. The work done can be used in safety analysis of VVER-type reactors and allow improving of

  15. Technical evaluation of proposed Ukrainian Central Radioactive Waste Processing Facility

    International Nuclear Information System (INIS)

    This technical report is a comprehensive evaluation of the proposal by the Ukrainian State Committee on Nuclear Power Utilization to create a central facility for radioactive waste (not spent fuel) processing. The central facility is intended to process liquid and solid radioactive wastes generated from all of the Ukrainian nuclear power plants and the waste generated as a result of Chernobyl 1, 2 and 3 decommissioning efforts. In addition, this report provides general information on the quantity and total activity of radioactive waste in the 30-km Zone and the Sarcophagus from the Chernobyl accident. Processing options are described that may ultimately be used in the long-term disposal of selected 30-km Zone and Sarcophagus wastes. A detailed report on the issues concerning the construction of a Ukrainian Central Radioactive Waste Processing Facility (CRWPF) was obtained from the Ukrainian Scientific Research and Design Institute for Industrial Technology and incorporated into this report. This report outlines various processing options, and their associated costs and construction schedules, which can be applied to solving the operating and decommissioning radioactive waste management problems in Ukraine. The costs and schedules are best estimates based upon the most current US industry practice and vendor information. This report focuses primarily on the handling and processing of what is defined in the US as low-level radioactive wastes

  16. Computational analysis of irradiation facilities at the JSI TRIGA reactor.

    Science.gov (United States)

    Snoj, Luka; Zerovnik, Gašper; Trkov, Andrej

    2012-03-01

    Characterization and optimization of irradiation facilities in a research reactor are important for optimal performance. Nowadays this is commonly done with advanced Monte Carlo neutron transport computer codes such as MCNP. However, the computational model used in such calculations should be verified and validated against experiments. In the paper we describe the irradiation facilities at the JSI TRIGA reactor and demonstrate their computational characterization to support experimental campaigns by providing information on the characteristics of the irradiation facilities. PMID:22154389

  17. Computer Security at Nuclear Facilities (French Edition)

    International Nuclear Information System (INIS)

    category of the IAEA Nuclear Security Series, and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series has been made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  18. Computer software design description for the Treated Effluent Disposal Facility (TEDF), Project L-045H, Operator Training Station (OTS)

    International Nuclear Information System (INIS)

    The Treated Effluent Disposal Facility (TEDF) Operator Training Station (OTS) is a computer-based training tool designed to aid plant operations and engineering staff in familiarizing themselves with the TEDF Central Control System (CCS)

  19. High Performance Computing Facility Operational Assessment, CY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Bland, Arthur S Buddy [ORNL; Boudwin, Kathlyn J. [ORNL; Hack, James J [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL; Hudson, Douglas L [ORNL

    2012-02-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources. Of these we report the 300 in this review that are consistent with guidance provided. Scientific achievements by OLCF users cut across all range scales from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of Carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source and simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source and simulations on Jaguar to reveal the molecular structure of ligno-cellulosic material used in bioethanol production. This year, Jaguar has been used to do billion-cell CFD calculations to develop shock wave compression turbo machinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbo machinery to learn what efficiencies the traditional steady flow assumption is hiding from designers. Even a 1% improvement in turbine design can save the nation

  20. High Performance Computing Facility Operational Assessment, FY 2010 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; White, Julia C [ORNL

    2010-08-01

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools

  1. Radiation analysis for a generic centralized interim storage facility

    International Nuclear Information System (INIS)

    This paper documents the radiation analysis performed for the storage area of a generic Centralized Interim Storage Facility (CISF) for commercial spent nuclear fuel (SNF). The purpose of the analysis is to establish the CISF Protected Area and Restricted Area boundaries by modeling a representative SNF storage array, calculating the radiation dose at selected locations outside the storage area, and comparing the results with regulatory radiation dose limits. The particular challenge for this analysis is to adequately model a large (6000-cask) storage array with a reasonable amount of analysis time and effort. Previous analyses of SNF storage systems for Independent Spent Fuel Storage Installations at nuclear plant sites (for example in References 5.1 and 5.2) had only considered small arrays of storage casks. For such analyses, the dose contribution from each storage cask can be modeled individually. Since the large number of casks in the CISF storage array makes such an approach unrealistic, a simplified model is required
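
    A toy superposition calculation illustrates why the array as a whole, rather than any individual cask, drives the boundary dose, and why cask-by-cask Monte Carlo modeling is impractical at this scale. The per-cask dose rate, spacing, and fence distance below are placeholders, air attenuation and self-shielding by intervening casks are ignored, and this is not the report's method.

```python
# Toy illustration (not the report's method): sum the 1/r^2 contributions of
# every cask in a square array at a fence-line point. All values are placeholders.
import math

dose_1m = 0.1        # assumed effective dose rate 1 m from a single cask, mrem/h
pitch = 6.0          # assumed cask spacing, m
n_side = 78          # 78 x 78 ~ 6000 casks
fence_offset = 100.0 # assumed distance from the array edge to the dose point, m

# Dose point sits on the array midline, fence_offset metres beyond the last row.
px = (n_side - 1) * pitch / 2.0
py = (n_side - 1) * pitch + fence_offset

total = 0.0
for i in range(n_side):
    for j in range(n_side):
        r = math.hypot(i * pitch - px, j * pitch - py)
        total += dose_1m * (1.0 / r) ** 2      # bare 1/r^2, no attenuation

print(f"{n_side * n_side} casks -> ~{total * 8760:.1f} mrem/yr at the fence "
      "(no air attenuation or cask self-shielding included)")
```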

  2. Computed tomography of the central nervous system in small animals

    International Nuclear Information System (INIS)

    Using computed tomography in 44 small animals, some well-defined anatomical structures and pathological processes of the central nervous system are described. Computed tomography is useful not only for the diagnosis of tumors; malformations, inflammatory, degenerative and vascular diseases, and traumas are also visible

  3. Configuration and Management of a Cluster Computing Facility in Undergraduate Student Computer Laboratories

    Science.gov (United States)

    Cornforth, David; Atkinson, John; Spennemann, Dirk H. R.

    2006-01-01

    Purpose: Many researchers require access to computer facilities beyond those offered by desktop workstations. Traditionally, these are offered either through partnerships, to share the cost of supercomputing facilities, or through purpose-built cluster facilities. However, funds are not always available to satisfy either of these options, and…

  4. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL

    2011-08-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and

  5. National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Van Arsdall, P.J., LLNL

    1998-06-01

    The NIF design team is developing the Integrated Computer Control System (ICCS), which is based on an object-oriented software framework applicable to event-driven control systems. The framework provides an open, extensible architecture that is sufficiently abstract to construct future mission-critical control systems. The ICCS will become operational when the first 8 out of 192 beams are activated in mid 2000. The ICCS consists of 300 front-end processors attached to 60,000 control points coordinated by a supervisory system. Computers running either Solaris or VxWorks are networked over a hybrid configuration of switched fast Ethernet and asynchronous transfer mode (ATM). ATM carries digital motion video from sensors to operator consoles. Supervisory software is constructed by extending the reusable framework components for each specific application. The framework incorporates services for database persistence, system configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. More than twenty collaborating software applications are derived from the common framework. The framework is interoperable among different kinds of computers and functions as a plug-in software bus by leveraging a common object request brokering architecture (CORBA). CORBA transparently distributes the software objects across the network. Because of the pivotal role played, CORBA was tested to ensure adequate performance.
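
    The "extend the reusable framework components for each specific application" pattern can be sketched in a few lines; the classes below are a hypothetical illustration of the pattern only, not the actual ICCS software, which distributes its objects over CORBA.

```python
# Minimal sketch of the "extend the framework for each application" idea
# described above (a pattern illustration only, not the actual ICCS classes).
# The framework supplies generic services such as status monitoring and event
# logging; each application subclass adds device-specific behaviour.
import time


class FrameworkComponent:
    """Generic services every supervisory application inherits."""

    def __init__(self, name):
        self.name = name
        self.status = "IDLE"

    def log_event(self, message):
        print(f"{time.strftime('%H:%M:%S')} [{self.name}] {message}")

    def set_status(self, status):
        self.status = status
        self.log_event(f"status -> {status}")


class ShutterController(FrameworkComponent):
    """Hypothetical application built by extending the framework component."""

    def open_shutter(self):
        self.set_status("OPENING")
        # ...device I/O would happen here, behind the same generic services...
        self.set_status("OPEN")


ctrl = ShutterController("beamline-07-shutter")
ctrl.open_shutter()
```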

  6. National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    The NIF design team is developing the Integrated Computer Control System (ICCS), which is based on an object-oriented software framework applicable to event-driven control systems. The framework provides an open, extensible architecture that is sufficiently abstract to construct future mission-critical control systems. The ICCS will become operational when the first 8 out of 192 beams are activated in mid 2000. The ICCS consists of 300 front-end processors attached to 60,000 control points coordinated by a supervisory system. Computers running either Solaris or VxWorks are networked over a hybrid configuration of switched fast Ethernet and asynchronous transfer mode (ATM). ATM carries digital motion video from sensors to operator consoles. Supervisory software is constructed by extending the reusable framework components for each specific application. The framework incorporates services for database persistence, system configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. More than twenty collaborating software applications are derived from the common framework. The framework is interoperable among different kinds of computers and functions as a plug-in software bus by leveraging a common object request brokering architecture (CORBA). CORBA transparently distributes the software objects across the network. Because of the pivotal role played, CORBA was tested to ensure adequate performance

  7. Accelerator system for the Central Japan Synchrotron Radiation Facility

    International Nuclear Information System (INIS)

    The accelerator system for the Central Japan Synchrotron Radiation Research Facility, which consists of a 50 MeV electron S-band linac, a 1.2 GeV full-energy booster synchrotron, and a 1.2 GeV storage ring, has been constructed. Eight 1.4 T bending magnets and four 5 T superconducting magnets with compact refrigerator systems provide the beam lines. For top-up operation, the 1 ns single-bunch electron beam from the 50 MeV injector linac is injected by an on-axis injection scheme and accelerated up to 1.2 GeV in the booster synchrotron. The timing system is designed so that injection from the booster ring is possible into any bunch position of the storage ring. To improve the efficiency of booster injection, the electron gun trigger and the RF frequency of 2856 MHz are synchronized with the storage ring frequency of 499.654 MHz. The EPICS control system, together with the timing control system, is used for the linac, the pulse magnets, and the booster pattern memory system. Beam commissioning of the 1.2 GeV storage ring is progressing. (author)

  8. Beginner's manual of TSS terminal in connection with the INS central computer

    International Nuclear Information System (INIS)

    One of the facilities of the newly installed INS central computer system, FACOM M-180IIAD, is the TSS (Time Sharing System) service. A manual that allows beginners to use the TSS terminal without special education or prior knowledge seemed necessary. This manual describes how to operate the terminal, with several basic and practical examples and an explanation of selected important commands. (author)

  9. Establishing a central waste processing and storage facility in Ghana

    International Nuclear Information System (INIS)

    regulations. About 50 delegates from various ministries and establishments participated in the seminar. The final outcome of the draft regulation was sent to the Attorney General's office for the necessary legal review before being presented to Parliament through the Ministry of Environment, Science and Technology. A radiation source and radioactive waste inventory has been established using the Regulatory Authority Information System (RAIS) and the Sealed Radiation Sources Registry System (SRS). A central waste processing and storage facility was constructed in the mid-sixties to handle waste from a 2 MW reactor that was never installed. The facility consists of a decontamination unit, two concrete vaults (about 5x15 m and 4 m deep) intended for low and intermediate level waste storage, and 60 wells (about 0.5 m diameter x 4.6 m) for storage of spent fuel. This facility will require significant rehabilitation. Safety and performance assessment studies have been carried out with the help of three IAEA experts. The recommendations from the assessment indicate that the vaults are too old and deteriorated to be considered for any future waste storage. However, the decontamination unit and the wells are still in good condition and were earmarked for refurbishment and use as waste processing and storage facilities, respectively. The decontamination unit has a surface area of 60 m2 and a laboratory of surface area 10 m2. The decontamination unit will have four technological areas: an area for cementation of non-compactible solid waste and spent sealed sources; an area for compaction of compactable solid waste; and a controlled area for conditioned wastes in 200 L drums. Provision has also been made to condition liquid waste. There will be a section for receipt and segregation of the waste. The laboratory will be provided with the necessary equipment for quality control. Research to support technological processes will be carried out in the laboratory. A quality assurance and control system shall

  10. Neutronic computational modeling of the ASTRA critical facility using MCNPX

    International Nuclear Information System (INIS)

    The Pebble Bed Very High Temperature Reactor is considered a prominent candidate among Generation IV nuclear energy systems. Nevertheless, the Pebble Bed Very High Temperature Reactor faces an important challenge due to the insufficient validation of the computer codes currently available for use in its design and safety analysis. In this paper a detailed IAEA computational benchmark, announced in IAEA-TECDOC-1694 in the framework of the Coordinated Research Project 'Evaluation of High Temperature Gas Cooled Reactor (HTGR) Performance', was solved in support of the Generation IV computer code validation effort using the MCNPX ver. 2.6e computational code. IAEA-TECDOC-1694 summarizes a set of four calculational benchmark problems performed at the ASTRA critical facility. The benchmark problems include criticality experiments, control rod worth measurements, and reactivity measurements. The ASTRA critical facility at the Kurchatov Institute in Moscow was used to simulate the neutronic behavior of nuclear pebble bed reactors. (Author)
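
    For context, a control rod worth in such a benchmark is typically derived from two eigenvalue calculations using the static reactivity definition ρ = (k − 1)/k. The k-eff values in the sketch below are illustrative placeholders, not ASTRA benchmark results.

```python
# Quick sketch of how a control rod worth is extracted from two eigenvalue
# calculations (the standard static definition, not a procedure specific to
# the ASTRA benchmark): reactivity rho = (k - 1) / k, worth = rho_out - rho_in.
def reactivity(k_eff):
    return (k_eff - 1.0) / k_eff

# Illustrative k-eff values such as MCNPX might report (placeholders).
k_rod_out = 1.00512
k_rod_in = 0.99874

worth_pcm = (reactivity(k_rod_out) - reactivity(k_rod_in)) * 1.0e5
print(f"control rod worth ~ {worth_pcm:.0f} pcm")
```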

  11. Empirical Foundation of Central Concepts for Computer Science Education

    Science.gov (United States)

    Zendler, Andreas; Spannagel, Christian

    2008-01-01

    The design of computer science curricula should rely on central concepts of the discipline rather than on technical short-term developments. Several authors have proposed lists of basic concepts or fundamental ideas in the past. However, these catalogs were based on subjective decisions without any empirical support. This article describes the…

  12. Central Facilities Area Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Brion Bennett

    2011-11-01

    Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Central Facilities Area facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool for developing the radioactive waste management basis.

  13. A central tower solar test facility /RM/CTSTF/

    Science.gov (United States)

    Bevilacqua, S.; Gislon, R.

    The facility is intended for conducting test work in connection with studies of receivers, thermodynamic cycles, heliostats, components, and subassemblies. Major components of the test facility include a mirror field with a reflecting surface of 800 sq m, a 40 m tower, an electronic control system, a data-acquisition system, and a meteorological station. A preliminary experimental program is discussed, taking into account investigations related to facility characterization, an evaluation of advanced low-cost heliostats, materials and components tests, high-concentration photovoltaic experiments, and a study of advanced solar thermal cycles.

  14. Security central processing unit applications in the protection of nuclear facilities

    International Nuclear Information System (INIS)

    New or upgraded electronic security systems protecting nuclear facilities or complexes will be heavily computer dependent. Proper planning for new systems and the employment of new state-of-the-art 32-bit processors in the processing of subsystem reports are key elements in effective security systems. The processing of subsystem reports represents only a small segment of system overhead. In selecting a security system to meet the current and future needs of nuclear security applications, the central processing unit (CPU) applied in the system architecture is the critical element in system performance. New 32-bit technology eliminates the need for program overlays while providing system programmers with well-documented program tools to develop effective systems to operate in all phases of nuclear security applications

  15. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  16. New construction of an inside-container drying facility in the central decontamination and water treatment facility (ZDW)

    International Nuclear Information System (INIS)

    For the future conditioning of radioactive liquid waste during the ongoing dismantling of the NPP Greifswald, the GNS company is providing an inside-container drying facility as part of the new construction of the ZDW (central decontamination and water treatment facility), including the related infrastructure and media supply. The concept of the FAVORIT facility, which has been in operation for years, has been refined; a fully automated version was realized so that no handling by the personnel is necessary for loading and unloading of the container station. Components of the vacuum system were optimized.

  17. Outline and operation results of centralized radwaste treatment facility in Fukushima Daiichi Nuclear Power Station

    International Nuclear Information System (INIS)

    In the Fukushima Nuclear Power Station I (Daiichi), Unit 1 started operation in 1971 and Unit 6 in 1979; the six power plants are now in steady operation. The Centralized Radwaste Treatment Facility, whose construction was started in 1980 and completed in 1984, is located south of Unit 4. Its total floor space is 36,000 m2, the main building being of the size of a 1,100 MW reactor building. The following equipment in the Centralized Radwaste Treatment Facility is described, including features and operation results: the radioactive liquid volume reduction treatment facility, the laundry-center waste water concentration treatment facility, the machinery drain water treatment facility, and the combustible solids incineration facility. (Mori, K.)

  18. Shieldings for X-ray radiotherapy facilities calculated by computer

    International Nuclear Information System (INIS)

    This work presents a methodology for the calculation of X-ray shielding in radiotherapy facilities with the help of a computer. Even today, in Brazil, the calculation of shielding for X-ray radiotherapy is done based on the NCRP-49 recommendation, which establishes the methodology required for the elaboration of a shielding project. With regard to high energies, where the construction of a labyrinth is necessary, NCRP-49 is not very clear, so studies were made in this field, resulting in an article that proposes a solution to the problem. A user-friendly program was developed in the Delphi programming language that, through manual data entry of a basic architectural design and some parameters, interprets the geometry and calculates the shielding of the walls, ceiling and floor of an X-ray radiotherapy facility. As the final product, this program provides a graphical screen on the computer with all the input data, the calculated shielding and the calculation memory. The program can be applied in the practical implementation of shielding projects for radiotherapy facilities and can be used in a didactic way in comparison with NCRP-49.
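
    As a rough, illustrative sketch of the kind of barrier calculation that the NCRP-49 methodology prescribes (not the Delphi program described above), the snippet below derives the required primary-barrier transmission from the design dose limit, workload, use and occupancy factors, and converts it into a concrete thickness via tenth-value layers; all numerical values are assumed for the example.

```python
import math

def required_transmission(P, d, W, U, T):
    """NCRP-49 style barrier transmission factor: B = P * d^2 / (W * U * T)."""
    return P * d ** 2 / (W * U * T)

def barrier_thickness(B, tvl1, tvl_e):
    """Thickness from the number of tenth-value layers (first TVL + equilibrium TVLs)."""
    n = -math.log10(B)                      # number of TVLs needed
    return tvl1 + max(n - 1.0, 0.0) * tvl_e

# Assumed example: primary barrier, controlled area behind the wall
P = 1.0e-4     # design dose limit, Sv/week
d = 5.0        # distance from the target to the point of interest, m
W = 500.0      # workload, Gy·m^2/week
U = 0.25       # use factor for this wall
T = 1.0        # occupancy factor
B = required_transmission(P, d, W, U, T)

# Assumed concrete TVLs for a megavoltage beam, in metres
thickness = barrier_thickness(B, tvl1=0.37, tvl_e=0.33)
print(f"required transmission B = {B:.2e}, concrete thickness ≈ {thickness:.2f} m")
```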

  19. Computational investigations of low-emission burner facilities for char gas burning in a power boiler

    Science.gov (United States)

    Roslyakov, P. V.; Morozov, I. V.; Zaychenko, M. N.; Sidorkin, V. T.

    2016-04-01

    Various structural variants of low-emission burner facilities intended for char gas burning in an operating TP-101 boiler of the Estonia power plant are considered. The planned increase in the volume of shale reprocessing and, correspondingly, the rise in char gas volumes make co-combustion of the char gas necessary. In this connection, there was a need to develop a burner facility of a given capacity that yields effective char gas burning while fulfilling reliability and environmental requirements. For this purpose, the burner design was based on staged fuel combustion with gas recirculation. As a result of a preliminary analysis of possible structural variants, three types of previously proven burner designs were chosen: a vortex burner with the supply of recirculation gases into the secondary air, a vortex burner with the baffle supply of recirculation gases between the flows of primary and secondary air, and a burner facility with a vortex pilot burner. Optimum structural characteristics and operating parameters were determined using numerical experiments. These experiments, carried out with the ANSYS CFX computational fluid dynamics software, simulated the mixing, ignition, and burning of char gas. The numerical experiments determined, for every type of burner facility, the structural and operating parameters that give effective char gas burning and meet the required environmental standard on nitrogen oxide emissions. The burner facility for char gas burning with a pilot diffusion burner in the central part was developed and manufactured according to the computation results. Preliminary verification field tests on the TP-101 boiler showed that the actual content of nitrogen oxides in burner flames of char gas did not exceed the claimed concentration of 150 ppm (200 mg/m3).

  20. Computer program for selection of a commercial hazardous waste facility

    International Nuclear Information System (INIS)

    The computer program discussed in this paper is part of a methodology entitled "Complete Methodology for Minimizing Generator Liability While Disposing Hazardous Waste" developed at the University of Oklahoma. The program is designed to aid hazardous waste generators in evaluating their relative potential liability associated with the use of commercial off-site treatment, storage and disposal facilities (TSDFs). The program can also assist in record-keeping of information related to hazardous waste off-site disposal. There are menu-driven options to retrieve information in the database, both on TSDFs and disposed waste, including possible errors in the input data. The program bases the potential liability evaluation on two parts: Part I has facility evaluation data files, evaluation programs and reporting files, and Part II has a scoring library with an editing option. The first subpart of each part deals with 30 common facility decision factors (grouped into 8 categories), and the second subpart deals with 5 waste/disposal-technology-specific factors (grouped into 1 category). While using the program, a database on the evaluated facilities (identified by U.S. EPA identification number) can be created for selected factors. A relative risk/liability representation score for each factor is established in the scoring library of the program. A paired comparison technique can be used to score the factors. Then, according to the user's choice, evaluation and reporting of final scores can be made. The program can be used on an IBM PC (or compatible). It is user-friendly, with easy-to-use menu-driven options. If required, the program can incorporate changes in the definitions of factors and in factor scoring. Thus the program can be adapted to changing regulations and technologies.
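
    The paired-comparison scoring mentioned above can be illustrated with a minimal sketch: each pair of decision factors is judged once, wins are counted, and the counts are normalized into relative weights. The factor names and preferences below are hypothetical and are not taken from the program.

```python
from itertools import combinations

# Hypothetical decision factors (the real program uses 30 common factors)
factors = ["groundwater monitoring", "financial assurance", "compliance history"]

# For each unordered pair, the factor judged more important (assumed input)
preferred = {
    ("groundwater monitoring", "financial assurance"): "groundwater monitoring",
    ("groundwater monitoring", "compliance history"): "compliance history",
    ("financial assurance", "compliance history"): "compliance history",
}

# Count wins for every factor over all pairwise comparisons
wins = {f: 0 for f in factors}
for pair in combinations(factors, 2):
    wins[preferred[pair]] += 1

# Normalize win counts into relative weights
total = sum(wins.values())
weights = {f: wins[f] / total for f in factors}
print(weights)
```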

  1. The Argonne Leadership Computing Facility 2010 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Drugan, C. (LCF)

    2011-05-09

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers

  2. Computer-based tools for radiological assessment of emergencies at nuclear facilities in the United Kingdom

    International Nuclear Information System (INIS)

    HMNII is responsible for regulating the activities at licensed nuclear sites in the UK. In the event of an emergency being declared at any of these sites, HMNII would mobilize an emergency response team. This team would, inter alia, monitor the activities of the operator at the affected site, assess the actual or potential radiological consequences of the event and provide briefings to senior members of government. Central to this response to an emergency is the assessment effort that would be provided by the Bootle Headquarters Emergency Room. To facilitate the assessments carried out at Bootle, computer-based tools have been developed. The major licensed nuclear facilities in the UK fall into two broad groups, civil power reactors and nuclear chemical plant. These two types of facilities pose different levels of radiological hazard as a result of their different radioactive inventories and the different physical processes in operation. Furthermore, these two groups of facilities pose different problems in assessing the radiological hazard in emergency situations. This paper describes the differences in approach used in designing and using computer-based tools to assess the radiological consequences of emergencies at power reactor and chemical plant sites.

  3. Status of the National Ignition Facility Integrated Computer Control System

    International Nuclear Information System (INIS)

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately 3/4 complete with over 750 thousand source lines of code having undergone off-line verification tests and deployed to the facility. NIF has entered the first phases of its laser commissioning program. NIF has now demonstrated the highest energy 1ω, 2ω, and 3ω beamlines in the world. NIF's target experimental

  4. Improvement of the Computing - Related Procurement Process at a Government Research Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gittins, C.

    2000-04-03

    The purpose of the project was to develop, implement, and market value-added services through the Computing Resource Center in an effort to streamline computing-related procurement processes across the Lawrence Livermore National Laboratory (LLNL). The power of the project lay in focusing attention on the value of centralizing the delivery of computer-related products and services to the institution. The project required a plan and marketing strategy that would draw attention to the facility's value-added offerings and services. A significant outcome of the project has been the change in the CRC's internal organization. The realignment of internal policies and practices, together with additions to its product and service offerings, has brought an increased focus to the facility. This movement from a small, fractious organization into one that is still small yet well organized and focused on its mission and goals has been a significant transition. Indicative of this turnaround was the sharing of information. One-on-one and small group meetings, together with statistics showing work activity, were invaluable in gaining support for more equitable workload distribution and the removal of blame and finger pointing. Sharing monthly reports on sales and operating costs also had a positive impact.

  5. Safety assessments for centralized waste treatment and disposal facility in Puspokszilagy Hungary

    International Nuclear Information System (INIS)

    The centralized waste treatment and disposal facility at Puspokszilagy is a shallow-land, near-surface engineered type of disposal unit. The site, together with its geographic, geological and hydrogeological characteristics, is described. Data are given on the radioactive inventory. The operational safety assessment and the post-closure safety assessment are outlined. (author)

  6. Operation and Maintenance Manual for the Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Norm Stanley

    2011-02-01

    This Operation and Maintenance Manual lists operator and management responsibilities, permit standards, general operating procedures, maintenance requirements and monitoring methods for the Sewage Treatment Plant at the Central Facilities Area at the Idaho National Laboratory. The manual is required by the Municipal Wastewater Reuse Permit (LA-000141-03) for the sewage treatment plant.

  7. High resolution muon computed tomography at neutrino beam facilities

    International Nuclear Information System (INIS)

    X-ray computed tomography (CT) has an indispensable role in constructing 3D images of objects made from light materials. However, limited by absorption coefficients, X-rays cannot deeply penetrate materials such as copper and lead. Here we show via simulation that muon beams can provide high resolution tomographic images of dense objects and of structures within the interior of dense objects. The effects of resolution broadening from multiple scattering diminish with increasing muon momentum. As the momentum of the muon increases, the contrast of the image goes down and therefore requires higher resolution in the muon spectrometer to resolve the image. The variance of the measured muon momentum reaches a minimum and then increases with increasing muon momentum. The impact of the increase in variance is to require a higher integrated muon flux to reduce fluctuations. The flux requirements and level of contrast needed for high resolution muon computed tomography are well matched to the muons produced in the pion decay pipe at a neutrino beam facility and what can be achieved for momentum resolution in a muon spectrometer. Such an imaging system can be applied in archaeology, art history, engineering, material identification and whenever there is a need to image inside a transportable object constructed of dense materials
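
    A small sketch of why scattering-induced blurring falls with momentum: the RMS deflection from the Highland formula scales roughly as 1/p. The absorber material and thickness below are assumed for illustration and are not parameters from this study.

```python
import math

def highland_theta0(p_mev, x_over_x0, beta=1.0, z=1.0):
    """RMS projected multiple-scattering angle (rad) from the Highland formula."""
    return (13.6 / (beta * p_mev)) * z * math.sqrt(x_over_x0) * (
        1.0 + 0.038 * math.log(x_over_x0)
    )

# Assumed example: muons crossing 10 cm of copper (radiation length ~1.44 cm)
x_over_x0 = 10.0 / 1.44
for p in (1_000.0, 5_000.0, 20_000.0):          # momentum in MeV/c
    theta_mrad = highland_theta0(p, x_over_x0) * 1e3
    print(f"p = {p / 1000:5.1f} GeV/c  ->  theta0 ≈ {theta_mrad:5.2f} mrad")
```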

  8. 78 FR 18353 - Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility...

    Science.gov (United States)

    2013-03-26

    ... validation of computer systems in the user's facility, the guidance does not address the software... HUMAN SERVICES Food and Drug Administration Guidance for Industry: Blood Establishment Computer System... ``Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility'' dated...

  9. CAD/CAM transtibial prosthetic sockets from central fabrication facilities: How accurate are they?

    Science.gov (United States)

    Sanders, Joan E.; Rogers, Ellen L.; Sorenson, Elizabeth A.; Lee, Gregory S.; Abrahamson, Daniel C.

    2014-01-01

    This research compares transtibial prosthetic sockets made by central fabrication facilities with their corresponding American Academy of Orthotists and Prosthetists (AAOP) electronic shape files and assesses the central fabrication process. We ordered three different socket shapes from each of 10 manufacturers. Then we digitized the sockets using a very accurate custom mechanical digitizer. Results showed that quality varied considerably among the different manufacturers. Four of the companies consistently made sockets within ±1.1% volume (approximately 1 sock ply) of the AAOP electronic shape file, while six other companies did not. Six of the companies showed consistent undersizing or oversizing in their sockets, which suggests a consistent calibration or manufacturing error. Other companies showed inconsistent sizing or shape distortion, a difficult problem that represents a most challenging limitation for central fabrication facilities. PMID:18247236

  10. Central Laser Facility. Rutherford Appleton Laboratory; annual report 1996/97

    International Nuclear Information System (INIS)

    The Central Laser Facility (CLF) is one of the UK's major research facilities and is devoted to the provision of advanced laser systems for pure and applied research. Based at Rutherford Appleton Laboratory near Didcot in Oxfordshire, the CLF was set up twenty years ago to provide the UK university community with a world class high power laser system. The initial impetus behind the establishment of the CLF was the then recently released idea of laser induced thermonuclear fusion. These days, laser fusion is just one topic amongst many that is studied at the CLF and the extent and capabilities of the laser systems have expanded enormously as new discoveries in the rapidly changing field of laser science have been incorporated into the facilities provided. This overview looks at these facilities as they were at the end of March 1997. (author)

  11. Status of the National Ignition Facility Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Lagin, L; Bryant, R; Carey, R; Casavant, D; Edwards, O; Ferguson, W; Krammen, J; Larson, D; Lee, A; Ludwigsen, P; Miller, M; Moses, E; Nyholm, R; Reed, R; Shelton, R; Van Arsdall, P J; Wuest, C

    2003-10-13

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately 3/4 complete with over 750 thousand source lines of code having undergone off-line verification tests and deployed to the facility. NIF has entered the first phases of its laser commissioning program. NIF has now demonstrated the highest energy 1ω, 2ω, and 3ω beamlines in the world.

  12. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computation science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  13. Computer Security at Nuclear Facilities. Reference Manual (Chinese Edition)

    International Nuclear Information System (INIS)

    category of the IAEA Nuclear Security Series, and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series has been made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  14. Computer Security at Nuclear Facilities. Reference Manual (Russian Edition)

    International Nuclear Information System (INIS)

    category of the IAEA Nuclear Security Series, and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series has been made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  15. Computer Security at Nuclear Facilities. Reference Manual (Arabic Edition)

    International Nuclear Information System (INIS)

    category of the IAEA Nuclear Security Series, and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series has been made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  16. Single-computer HWIL simulation facility for real-time vision systems

    Science.gov (United States)

    Fuerst, Simon; Werner, Stefan; Dickmanns, Ernst D.

    1998-07-01

    UBM has been working on autonomous vision systems for aircraft for more than one and a half decades. The systems developed use standard on-board sensors and two additional monochrome cameras for state estimation of the aircraft. A common task is to detect and track a runway for an autonomous landing approach. The cameras have different focal lengths and are mounted on a special pan-and-tilt camera platform. As the platform is equipped with two resolvers and two gyros, it can be stabilized inertially and the system has the ability to actively focus on the objects of highest interest. For verification and testing, UBM has a special HWIL simulation facility for real-time vision systems. The central part of this simulation facility is a three-axis motion simulator (DBS). It is used to realize the computed orientation in the rotational degrees of freedom of the aircraft. The two-axis camera platform with its two CCD cameras is mounted on the inner frame of the DBS and points at a cylindrical projection screen with a synthetic view displayed on it. As the performance of visual perception systems has increased significantly in recent years, a new, more powerful synthetic vision system was required. A single Onyx2 machine replaced all the former simulation computers. This computer is powerful enough to simulate the aircraft, generate a high-resolution synthetic view, control the DBS and communicate with the image processing computers. Further improvements are the significantly reduced delay times for closed-loop simulations and the elimination of communication overhead.

  17. Evaluating Computer Technology Integration in a Centralized School System

    Science.gov (United States)

    Eteokleous, N.

    2008-01-01

    The study evaluated the current situation in Cyprus elementary classrooms regarding computer technology integration in an attempt to identify ways of expanding teachers' and students' experiences with computer technology. It examined how Cypriot elementary teachers use computers, and the factors that influence computer integration in their…

  18. Safe operation of a national centralized processing facility for radioactive wastes

    International Nuclear Information System (INIS)

    Peru has a centralized waste processing and storage facility for radioactive wastes. It is located at the Nuclear Research Center 'RACSO', 40 km north of Lima, and started operation at the beginning of 1989. Currently, RACSO operates a 10 MW pool-type research reactor to produce radioisotopes, and there are many laboratories where a small research program is carried out. Iodine-131 and technetium-99 are produced twice a week, and several kinds of radioactive waste are generated. Other sources are the applications of radioisotopes in hospitals, industry and other institutions. This paper presents the experience gained in the operation of this centralized facility. In twenty years of operation, 15 cubic metres of solid waste, 532 TBq of activity and approximately 52 standardized 200-litre drums of conditioned solid waste have been produced. During the period of operation no accidents have occurred. (author)

  19. Recycled Water Reuse Permit Renewal Application for the Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Mike Lewis

    2014-09-01

    This renewal application for a Recycled Water Reuse Permit is being submitted in accordance with the Idaho Administrative Procedures Act 58.01.17 “Recycled Water Rules” and the Municipal Wastewater Reuse Permit LA-000141-03 for continuing the operation of the Central Facilities Area Sewage Treatment Plant located at the Idaho National Laboratory. The permit expires March 16, 2015. The permit requires a renewal application to be submitted six months prior to the expiration date of the existing permit. For the Central Facilities Area Sewage Treatment Plant, the renewal application must be submitted by September 16, 2014. The information in this application is consistent with the Idaho Department of Environmental Quality’s Guidance for Reclamation and Reuse of Municipal and Industrial Wastewater and discussions with Idaho Department of Environmental Quality personnel.

  20. Characterization and reclamation assessment for the central shops diesel storage facility at Savannah River Site

    Energy Technology Data Exchange (ETDEWEB)

    Fliermans, C.B.; Hazen, T.C.; Bledsoe, H.W.

    1994-12-31

    The contamination of subsurface terrestrial environments by organic contaminants is a global phenomenon. The remediation of such environments requires innovative assessment techniques and strategies for successful cleanups. Using innovative approaches, the Central Shops Diesel Storage Facility at the Savannah River Site (SRS) was characterized to determine the extent of subsurface diesel fuel contamination. Effective bioremediation techniques for cleaning up the contaminant plume were established.

  1. Wastewater Land Application Permit LA-000141 Renewal Information for the Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Laboratory, Idaho National

    1999-02-01

    On July 25, 1994, the State of Idaho Division of Environmental Quality (DEQ) issued a Wastewater Land Application Permit (WLAP) for the Idaho National Engineering Laboratory's (INEL, now the Idaho National Engineering and Environmental Laboratory [INEEL]) Central Facilities Area (CFA) Sewage Treatment Plant (STP). The permit expires August 7, 1999. In addition to the renewal application, this report was prepared to provide the following information as requested by DEQ.

  2. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in rocket engine test facility design and development by assessing risks, identifying failure modes and predicting...

  3. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in propulsion test facility design and development by assessing risks, identifying failure modes and predicting...

  4. Safety report for Central Interim Storage facility for radioactive waste from small producers

    International Nuclear Information System (INIS)

    In 1999 the Agency for Radwaste Management took over the management of the Central Interim Storage (CIS) in Brinje, intended only for radioactive waste from industrial, medical and research applications. With the transfer of the responsibilities for the storage operation, ARAO, the new operator of the facility, also received a request from the Slovenian Nuclear Safety Administration for refurbishment and reconstruction of the storage and for preparation of a safety report for the storage with the operational conditions and limitations. In order to fulfill these requirements, ARAO first thoroughly reviewed the existing documentation on the facility, the facility itself and the stored inventory. Based on the findings of this review, ARAO prepared several basic documents for improvement of the current conditions in the storage facility. In October 2000 the Plan for refurbishment and modernization of the CIS was prepared, providing an integral approach towards remediation and refurbishment of the facility, optimization of the inventory arrangement, and modernization of the storage and its utilization. In October 2001 project documentation for the renewal of electric installations, water supply and sewage system, ventilation system, the improvement of fire protection, and remediation of minor defects discovered in the building was completed according to the Act on Construction. In July 2003 the safety report was prepared, based on the facility status after the completion of the reconstruction works. It takes into account all improvements and changes introduced by the refurbishment and reconstruction of the facility according to the project documentation. Besides the basic characteristics of the location and its surroundings, it also gives the technical description of the facility together with proposed solutions for the renewal of electric installations, renovation of water supply and sewage system, refurbishment of the ventilation system, the improvement of fire

  5. Developing a Central “Point-of-Truth” Database for Core Facility Information

    Science.gov (United States)

    Cheng, S.; Webber, G.; Bourges-Waldegg, D.; Pearse, R.V.

    2014-01-01

    Core facilities need customers, yet an average researcher is aware of a very small fraction of the facilities available near them. Diligent facilities combat this knowledge gap by broadcasting information about their facility either locally, or by publishing information on one or multiple public-facing websites or third-party repositories of information (e.g. VGN cores database or Science Exchange). Each additional site of information about a facility increases visibility but also impairs the facility's ability to maintain up-to-date information on all sites. Additionally, most third-party repositories house their data in traditional relational databases that are not indexable by common search engines. To start addressing these problems, the eagle-i project (a free, open-source, open-access publication platform) has begun integrating its core facility database with external websites and web applications, allowing them to synchronize their information in real time. We present here two experimental integrations. The Harvard Catalyst Cores webpage originally required independent updates which were not within the direct control of the core directors themselves. The linked open data architecture developed for eagle-i now allows the Catalyst cores page to pull information from the Harvard eagle-i server and update all data on its page accordingly. Additionally, Harvard's "Profiles" web application references eagle-i data and links resource information from eagle-i to personnel information in the Profiles database. Because of these direct links, updating information in Harvard's eagle-i server (which can be accessed directly by facility directors through an account on the eagle-i SWEET) automatically updates information on the Catalyst Cores webpage and updates resource counts linked to a researcher's public profile. This functionality of the eagle-i platform as a central "point-of-truth" for information has the potential to drastically reduce the effort required to

  6. Fluid Centrality: A Social Network Analysis of Social-Technical Relations in Computer-Mediated Communication

    Science.gov (United States)

    Enriquez, Judith Guevarra

    2010-01-01

    In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measure is quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC, and computer-supported collaborative learning research. It argues that measuring…
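
    For readers unfamiliar with the measure, the sketch below computes two common social network analysis centrality scores on a tiny, invented discussion-forum network; it illustrates the general technique only and does not reproduce the study's data or analysis.

```python
import networkx as nx

# Tiny invented reply network: nodes are participants, edges are reply ties
G = nx.Graph()
G.add_edges_from([
    ("tutor", "alice"), ("tutor", "bob"), ("tutor", "carol"),
    ("alice", "bob"), ("carol", "dave"),
])

degree = nx.degree_centrality(G)            # share of possible ties a node holds
betweenness = nx.betweenness_centrality(G)  # how often a node sits on shortest paths

for node in sorted(G.nodes):
    print(f"{node:6s} degree={degree[node]:.2f} betweenness={betweenness[node]:.2f}")
```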

  7. A methodology for assessing computer software applicability to inventory and facility management

    OpenAIRE

    Paul, Debashis

    1989-01-01

    Computer applications have become popular and widespread in architecture and other related fields. While the architect uses a computer for the design and construction of a building, the user takes advantage of the computer for the maintenance of the building. Inventory and facility management are two such fields where computer applications have become predominant. The project has investigated the use and application of different commercially available computer software in the above men...

  8. Hyperpolarized 3-helium MR imaging of the lungs: testing the concept of a central production facility

    International Nuclear Information System (INIS)

    The aim of this study was to test the feasibility of a central production facility with distribution network for implementation of hyperpolarized 3-helium MRI. The 3-helium was hyperpolarized to 50-65% using a large-scale production facility based at a university in Germany. Using a specially designed transport box, containing a permanent low-field shielded magnet and dedicated iron-free glass cells, the hyperpolarized 3-helium gas was transported via airfreight to a university in the UK. At this location, the gas was used to perform in vivo MR experiments in normal volunteers and patients with chronic obstructive lung diseases. Following initial tests, the transport (road-air-road cargo) was successfully arranged on six occasions (approximately once per month). The duration of transport to imaging averaged 18 h (range 16-20 h), which was due mainly to organizational issues such as working times and flight connections. During the course of the project, polarization at imaging increased from 20% to more than 30%. A total of 4 healthy volunteers and 8 patients with chronic obstructive pulmonary disease were imaged. The feasibility of a central production facility for hyperpolarized 3-helium was demonstrated. This should enable a wider distribution of gas for this novel technology without the need for local start-up costs. (orig.)
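
    The loss of signal during transport follows, to first order, an exponential relaxation P(t) = P0·exp(-t/T1). The sketch below illustrates this relation with an assumed storage-cell T1 value; it is not a measured figure from the study.

```python
import math

def remaining_polarization(p0, hours, t1_hours):
    """Exponential relaxation of hyperpolarization: P(t) = P0 * exp(-t / T1)."""
    return p0 * math.exp(-hours / t1_hours)

p0 = 0.60      # polarization leaving the production facility (60%, assumed)
t1 = 25.0      # assumed effective T1 of the transport cell, hours
for t in (16, 18, 20):   # transport-to-imaging times reported in the study, hours
    p = remaining_polarization(p0, t, t1)
    print(f"after {t} h: about {p * 100:.0f}% polarization remains")
```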

  9. National facility for advanced computational science: A sustainable path to scientific discovery

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  10. Process as Content in Computer Science Education: Empirical Determination of Central Processes

    Science.gov (United States)

    Zendler, A.; Spannagel, C.; Klaudt, D.

    2008-01-01

    Computer science education should not be based on short-term developments but on content that is observable in multiple domains of computer science, may be taught at every intellectual level, will be relevant in the longer term, and is related to everyday language and/or thinking. Recently, a catalogue of "central concepts" for computer science…

  11. Status of accelerator design for Central Japan Synchrotron Radiation Research Facility project

    International Nuclear Information System (INIS)

    Nagoya University had a project to construct a new synchrotron light facility, called the Photo-Science Nanofactory, to develop a wide range of research in basic science, industrial applications, life science and environmental engineering. The project has now developed into the 'Central Japan Synchrotron Radiation Research Facility', the principal facility of the Aichi prefecture 'Knowledge Hub' project to establish a new research center for technological innovations. The key equipment of the facility is a small electron storage ring, which is able to supply hard x-rays. The energy of the stored electron beam is 1.2 GeV, the circumference is 62.4 m, the current is 300 mA, and the natural emittance is about 53 nm-rad. The storage ring consists of four triple-bend cells. Eight of the twelve bending magnets are normal-conducting ones; the other four are 5 T superconducting magnets (super-bends). The bending angle of a super-bend is 12 degrees and three hard x-ray beam lines can be extracted from each super-bend. Two insertion devices will be installed in the straight sections. The electron beam is injected from a booster synchrotron at an energy of 1.2 GeV as full-energy injection. A 50 MeV linac is used as the injector to the booster synchrotron. Top-up operation is also planned. (author)
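
    A short illustration of why the 5 T super-bends allow a compact 1.2 GeV ring to deliver hard X-rays: the critical photon energy of bending-magnet radiation scales with E²B. The normal-conducting field used for comparison below is an assumed value, not a quoted design parameter.

```python
def critical_energy_kev(e_gev, b_tesla):
    """Critical photon energy of bending-magnet radiation: ~0.665 * E^2 * B (keV)."""
    return 0.665 * e_gev ** 2 * b_tesla

E = 1.2   # stored beam energy, GeV
print(f"super-bend (5 T):            {critical_energy_kev(E, 5.0):.1f} keV")
print(f"normal bend (assumed 1.4 T): {critical_energy_kev(E, 1.4):.1f} keV")
```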

  12. Techniques for Measuring Aerosol Attenuation using the Central Laser Facility at the Pierre Auger Observatory

    CERN Document Server

    2013-01-01

    The Pierre Auger Observatory in Malargüe, Argentina, is designed to study the properties of ultra-high energy cosmic rays with energies above 10^18 eV. It is a hybrid facility that employs a Fluorescence Detector (FD) to perform nearly calorimetric measurements of Extensive Air Shower energies. To obtain reliable calorimetric information from the FD, the atmospheric conditions at the observatory need to be continuously monitored during data acquisition. In particular, light attenuation due to aerosols is an important atmospheric correction. The aerosol concentration is highly variable, so that the aerosol attenuation needs to be evaluated hourly. We use light from the Central Laser Facility (CLF), located near the center of the observatory site, having an optical signature comparable to that of the highest energy showers detected by the FD. This paper presents two procedures developed to retrieve the aerosol attenuation of fluorescence light from CLF laser shots. Cross checks between the two methods demonstrate that re...

  13. Reference equilibrium core with central flux irradiation facility for Pakistan research reactor-1

    International Nuclear Information System (INIS)

    In order to assess various core parameters, a reference equilibrium core with Low Enriched Uranium (LEU) fuel was assembled for the Pakistan Research Reactor-1 (PARR-1). Due to the increased volume of the reference core, the average neutron flux was reduced compared to that of the first high-power operation. To obtain a higher neutron flux, an irradiation facility was created in the centre of the reference equilibrium core, taking advantage of the neutron flux peaking there. Various low-power experiments were performed in order to evaluate the control rod worths and to map the neutron flux inside the core. The neutron flux inside the central irradiation facility almost doubled. With this arrangement, the reactor operation time for the production of the required specific radioactivity was cut from 72 hours to 48 hours. (author)
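
    A simplified sketch of the activation relation behind the shorter irradiation time: the produced activity scales as φ(1 − e^(−λt)), so a higher flux reaches a given target activity sooner. The isotope half-life and flux ratio below are assumptions for illustration; the real production involves additional effects not modelled here.

```python
import math

def time_to_reach(fraction_of_saturation, half_life_h):
    """Irradiation time needed to reach a given fraction of saturation activity."""
    lam = math.log(2) / half_life_h
    return -math.log(1.0 - fraction_of_saturation) / lam

half_life = 8.02 * 24           # assumed product half-life (e.g. I-131), hours
lam = math.log(2) / half_life

t_old = 72.0                                 # hours at the original flux
reached = 1.0 - math.exp(-lam * t_old)       # fraction of saturation after 72 h
flux_gain = 1.8                              # assumed flux increase ("almost doubled")
t_new = time_to_reach(reached / flux_gain, half_life)
print(f"same activity reached in about {t_new:.0f} h instead of {t_old:.0f} h")
```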

  14. Wastewater Land Application Permit LA-000141 Renewal Information for the Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    None

    1999-02-01

    On July 25, 1994, the State of Idaho Division of Environmental Quality issued a Wastewater Land Application Permit, #LA-000141-01, for the Central Facilities Area Sewage Treatment Plant. The permit expires August 7, 1999. This report is being submitted with the renewal application and specifically addresses: wastewater flow; wastewater characteristics; impacts to vegetation in the irrigation area; impacts to soil in the irrigation area; evaluation of groundwater monitoring wells for Wastewater Land Application Permit purposes; a summary of trends observed during the 5-year reporting period; and a projection of changes and new processes.

  15. Implementation of computer security at nuclear facilities in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Lochthofen, Andre; Sommer, Dagmar [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    2013-07-01

    In recent years, electrical and I&C components in nuclear power plants (NPPs) have been replaced by software-based components. Due to the increased number of software-based systems, the threat of malevolent interference and cyber-attacks on NPPs has also increased. In order to maintain nuclear security, conventional physical protection measures and protection measures in the field of computer security have to be implemented. Therefore, the existing security management process of the NPPs has to be expanded to cover computer security aspects. In this paper, we give an overview of computer security requirements for German NPPs. Furthermore, some examples of the implementation of computer security projects based on a GRS best-practice approach are shown. (orig.)

  16. 2010 Annual Wastewater Reuse Report for the Idaho National Laboratory Site's Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Mike Lewis

    2011-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2009, through October 31, 2010. The report contains the following information: • Site description • Facility and system description • Permit required monitoring data and loading rates • Status of special compliance conditions • Discussion of the facility’s environmental impacts. During the 2010 permit year, approximately 2.2 million gallons of treated wastewater was land-applied to the irrigation area at Central Facilities Area Sewage Treatment plant.

  17. 2012 Annual Wastewater Reuse Report for the Idaho National Laboratory Site's Central facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Mike Lewis

    2013-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2011, through October 31, 2012. The report contains the following information: • Site description • Facility and system description • Permit required monitoring data and loading rates • Status of compliance conditions and activities • Discussion of the facility’s environmental impacts. During the 2012 permit year, no wastewater was land-applied to the irrigation area of the Central Facilities Area Sewage Treatment Plant.

  18. Computer Support for Document Management in the Danish Central Government

    DEFF Research Database (Denmark)

    Hertzum, Morten

    1995-01-01

    Document management systems are generally assumed to hold a potential for delegating the recording and retrieval of documents to professionals such as civil servants and for supporting the coordination and control of work, so-called workflow management. This study investigates the use and organizational impact of document management systems in the Danish central government. The currently used systems unfold around the recording of incoming and outgoing paper mail and have typically not been accompanied by organizational changes. Rather, document management tends to remain an appendix to the primary work and be delegated to a specialized organizational unit. Several factors contribute to the present document management practices, for example it takes an extraordinary effort to achieve the benefits, and few institutions are forced to pursue them. Furthermore, document and workflow management

  19. NNS computing facility manual P-17 Neutron and Nuclear Science

    International Nuclear Information System (INIS)

    This document describes basic policies and provides information and examples on using the computing resources provided by P-17, the Neutron and Nuclear Science (NNS) group. Information on user accounts, getting help, network access, electronic mail, disk drives, tape drives, printers, batch processing software, XSYS hints, PC networking hints, and Mac networking hints is given

  20. High Resolution Muon Computed Tomography at Neutrino Beam Facilities

    CERN Document Server

    Suerfu, Burkhant

    2015-01-01

    X-ray computed tomography (CT) has an indispensable role in constructing 3D images of objects made from light materials. However, limited by absorption coefficients, X-rays cannot deeply penetrate materials such as copper and lead. Here we show via simulation that muon beams can provide high resolution tomographic images of dense objects and of structures within the interior of dense objects. The effects of resolution broadening from multiple scattering diminish with increasing muon momentum. As the momentum of the muon increases, the contrast of the image goes down and therefore requires higher resolution in the muon spectrometer to resolve the image. The variance of the measured muon momentum reaches a minimum and then increases with increasing muon momentum. The impact of the increase in variance is to require a higher integrated muon flux to reduce fluctuations. The flux requirements and level of contrast needed for high resolution muon computed tomography are well matched to the muons produced in the pio...

  1. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, J; Sartirana, A

    2001-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on thei...

  2. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, Jose

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on the...

  3. Site safety progress review of spent fuel central interim storage facility. Final report

    International Nuclear Information System (INIS)

    Following the request of the Czech Power Board (CEZ) and within the scope of the Technical Cooperation Project CZR/9/003, a progress review of the site safety of the Spent Fuel Central Interim Storage Facility (SFCISF) was performed. The review covered the first two stages of the works, comprising the regional survey and the identification of candidate sites for the underground and surface storage options. Five sites have been identified as a result of the previous works. The following two stages will involve the identification of the preferred candidate sites for the two options and the final site qualification. The present review had the purpose of assessing the work already performed and making recommendations for the next two stages of works

  4. Performance of grid-tied PV facilities based on real data in Spain: Central inverter versus string system

    International Nuclear Information System (INIS)

    Highlights: • The operation of two grid-tied PV facilities over a two-year period is presented. • The central inverter system is compared to the string inverter system. • A procedure based on a small number of easily obtained parameters is used. • The string inverter outperforms the central inverter. • Conclusions of use to maintenance firms and operational facilities are obtained. - Abstract: Two complete years of operation of two grid-tied PV facilities are presented. The energetic and economic performance of both installations has been compared. Located in the same place, the installation of these facilities followed the same construction criteria – PV panels, panel support system and wiring – and the facilities are exposed to the same atmospheric temperature and solar radiation. They differ in the inverter topology used: one facility uses a central inverter and the other a string inverter configuration. The performance of the facilities has been determined using a procedure based on a small number of easily obtained parameters and knowledge of the analyzed system and its operation mode. Electrical losses have been calculated for both systems and a complete comparison between them has been carried out. The results have shown better performance for the distributed (string) system in economic and energetic terms.
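
    One widely used figure of merit for such a comparison is the performance ratio of IEC 61724 (final yield divided by reference yield). The sketch below shows the calculation with invented placeholder numbers; it is not the procedure or data used in the study.

```python
G_STC = 1.0   # irradiance at standard test conditions, kW/m^2

def performance_ratio(e_ac_kwh, p_rated_kw, h_poa_kwh_m2):
    """IEC 61724 performance ratio: final yield divided by reference yield."""
    final_yield = e_ac_kwh / p_rated_kw        # kWh delivered per kWp installed
    reference_yield = h_poa_kwh_m2 / G_STC     # equivalent full-sun hours
    return final_yield / reference_yield

h_poa = 1900.0   # assumed annual plane-of-array irradiation, kWh/m^2
central = performance_ratio(e_ac_kwh=138_000, p_rated_kw=100.0, h_poa_kwh_m2=h_poa)
string_ = performance_ratio(e_ac_kwh=143_000, p_rated_kw=100.0, h_poa_kwh_m2=h_poa)
print(f"central inverter PR = {central:.2f}, string inverter PR = {string_:.2f}")
```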

  5. Implementation of the Facility Integrated Inventory Computer System (FICS)

    International Nuclear Information System (INIS)

    This paper describes a computer system which has been developed for nuclear material accountability and implemented in an active radiochemical processing plant involving remote operations. The system possesses the following features: comprehensive, timely records of the location and quantities of special nuclear materials; automatically updated book inventory files on the plant and sub-plant levels of detail; material transfer coordination and cataloging; automatic inventory estimation; sample transaction coordination and cataloging; automatic on-line volume determination, limit checking, and alarming; extensive information retrieval capabilities; and terminal access and application software monitoring and logging.

  6. Health Risks to Computer Workers in Various Indoor Facilities

    OpenAIRE

    Tint, Piia; Tuulik, Viiu; Karai, Deniss; Meigas, Kalju

    2014-01-01

    Musculoskeletal disorders (MSDs) are the main cause of occupational diseases in Estonia and Europe. The number of MSDs has increased with the increasing number of computer workers and the increasing workload overall. In the current paper, a survey of 295 workers in Estonian enterprises is carried out to clarify the reasons for occupational stress. The causes of occupational stress are a non-ergonomically designed workplace and social and human factors. The main questionnaires used were KIVA and G...

  7. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    Science.gov (United States)

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  8. Computer-controlled facility for laser switching and damage testing

    International Nuclear Information System (INIS)

    Apparatus has been assembled for mostly-automatic laser switching and damage measurements in which dynamic events are followed in real time. Lasers, optical elements, and detectors are available for operation at 2.8 μm and at 10.6 μm. Sample exposure time is variable from 50 μs to CW. Sample positioning in X:Y:Z is controllable to 0.2 μm. Testing can be done with a static beam position or with a scanning beam. Samples can be tested at ambient temperature, or from near 77 K to 370 K. Software provides for test control, data processing, and graphic, hard-copy output. Beam profiling can be done automatically at the sample position, and the profile fit to a Gaussian of the same 1/e² width. Sample data acquisition is sequenced by the computer after an enable pulse is given at the discretion of the experimenter. The computer records and processes the incoming data, presents intermediate data displays, completes data manipulation, and compares processed measurement data with internally available prediction models in graphic form. Representative output for a thin film nonlinear optical material is shown
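
    The beam-profiling step lends itself to a short worked example. The sketch below fits synthetic detector readings to a Gaussian and reports the 1/e² width, in the spirit of the fit described above; the data, scan range and function names are illustrative assumptions, not the facility's actual software.

        # Fit a measured beam profile to a Gaussian and report its 1/e^2 width.
        import numpy as np
        from scipy.optimize import curve_fit

        def gaussian(x, amplitude, x0, w):
            # w is the 1/e^2 half-width: intensity drops to amplitude/e^2 at |x - x0| = w
            return amplitude * np.exp(-2.0 * (x - x0) ** 2 / w ** 2)

        x = np.linspace(-3.0, 3.0, 121)   # mm, detector scan positions (synthetic)
        measured = gaussian(x, 1.0, 0.1, 0.8) + 0.02 * np.random.default_rng(0).normal(size=x.size)

        popt, _ = curve_fit(gaussian, x, measured, p0=[measured.max(), 0.0, 1.0])
        amplitude, x0, w = popt
        print(f"fitted 1/e^2 full width: {2 * w:.3f} mm (centre at {x0:.3f} mm)")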

  9. Central and Eastern United States (CEUS) Seismic Source Characterization (SSC) for Nuclear Facilities

    International Nuclear Information System (INIS)

    Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts. The model will be used to assess the present-day composite distribution for seismic sources along with their characterization in the CEUS and uncertainty. In addition, this model is in a form suitable for use in PSHA evaluations for regulatory activities, such as Early Site Permits (ESPs) and Combined Operating License Applications (COLAs). Applications, Values, and Use: Development of a regional CEUS seismic source model will provide value to those who (1) have submitted an ESP or COLA for Nuclear Regulatory Commission (NRC) review before 2011; (2) will submit an ESP or COLA for NRC review after 2011; (3) must respond to safety issues resulting from NRC Generic Issue 199 (GI-199) for existing plants; and (4) will prepare PSHAs to meet design and periodic review requirements for current and future nuclear facilities. This work replaces a previous study performed approximately 25 years ago. Since that study was completed, substantial work has been done to improve the understanding of seismic sources and their characterization in the CEUS. Thus, a new regional SSC model provides a consistent, stable basis for computing PSHA for a future time span. Use of a new SSC model reduces the risk of delays in new plant licensing due to more conservative interpretations in the existing and future literature. Perspective: The purpose of this study, jointly sponsored by EPRI, the U.S. Department of Energy (DOE), and the NRC, was to develop a new CEUS SSC model. The team assembled to accomplish this purpose was composed of distinguished subject matter experts from industry, government, and academia. The resulting model is unique, and because this project has solicited input from the present-day larger technical community, it is not likely that there will be a need for significant revision for a number of years. See also the Sponsors Perspective for more details. The goal of this project was to implement the CEUS SSC work plan

  10. Central and Eastern United States (CEUS) Seismic Source Characterization (SSC) for Nuclear Facilities Project

    Energy Technology Data Exchange (ETDEWEB)

    Kevin J. Coppersmith; Lawrence A. Salomone; Chris W. Fuller; Laura L. Glaser; Kathryn L. Hanson; Ross D. Hartleb; William R. Lettis; Scott C. Lindvall; Stephen M. McDuffie; Robin K. McGuire; Gerry L. Stirewalt; Gabriel R. Toro; Robert R. Youngs; David L. Slayter; Serkan B. Bozkurt; Randolph J. Cumbest; Valentina Montaldo Falero; Roseanne C. Perman; Allison M. Shumway; Frank H. Syms; Martitia (Tish) P. Tuttle

    2012-01-31

    Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts. The model will be used to assess the present-day composite distribution for seismic sources along with their characterization in the CEUS and uncertainty. In addition, this model is in a form suitable for use in PSHA evaluations for regulatory activities, such as Early Site Permits (ESPs) and Combined Operating License Applications (COLAs). Applications, Values, and Use: Development of a regional CEUS seismic source model will provide value to those who (1) have submitted an ESP or COLA for Nuclear Regulatory Commission (NRC) review before 2011; (2) will submit an ESP or COLA for NRC review after 2011; (3) must respond to safety issues resulting from NRC Generic Issue 199 (GI-199) for existing plants; and (4) will prepare PSHAs to meet design and periodic review requirements for current and future nuclear facilities. This work replaces a previous study performed approximately 25 years ago. Since that study was completed, substantial work has been done to improve the understanding of seismic sources and their characterization in the CEUS. Thus, a new regional SSC model provides a consistent, stable basis for computing PSHA for a future time span. Use of a new SSC model reduces the risk of delays in new plant licensing due to more conservative interpretations in the existing and future literature. Perspective: The purpose of this study, jointly sponsored by EPRI, the U.S. Department of Energy (DOE), and the NRC, was to develop a new CEUS SSC model. The team assembled to accomplish this purpose was composed of distinguished subject matter experts from industry, government, and academia. The resulting model is unique, and because this project has solicited input from the present-day larger technical community, it is not likely that there will be a need for significant revision for a number of years. See also the Sponsors Perspective for more details. The goal of this project was to implement the CEUS SSC work plan

  11. Thermal analysis of the unloading cell of the Spanish centralized interim storage facility (CISF); Analisis termico de la celda de desarga del almacen temporal centralizado (ATC)

    Energy Technology Data Exchange (ETDEWEB)

    Perez Dominguez, J. R.; Perez Vara, R.; Huelamo Martinez, E.

    2016-08-01

    This article deals with the thermal analysis performed for the Unloading Cell of the Spanish Centralized Interim Storage Facility, CISF (ATC, in Spanish). The analyses are done using computational fluid dynamics (CFD) simulation, with the aim of obtaining the air flow required to remove the residual heat of the elements stored in the cell. Compliance with the admissible heat limits is checked against the results obtained in the various operation and accident modes. The calculation model is flexible enough to allow carrying out a number of sensitivity analyses with the different parameters involved in the process. (Author)
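
    The quantity such a study is after can be bounded with a simple energy balance: the air mass flow needed to carry a residual heat load Q with an allowed temperature rise ΔT is m = Q/(cp·ΔT). The sketch below applies this balance with purely illustrative values; it is not the CISF design calculation.

        # Back-of-the-envelope airflow estimate from Q = m_dot * cp * dT.
        CP_AIR = 1005.0    # J/(kg K), specific heat of air
        RHO_AIR = 1.1      # kg/m^3, approximate warm-air density

        def required_airflow(heat_load_w, delta_t_k):
            """Mass and volumetric flow of air removing heat_load_w with a delta_t_k rise."""
            m_dot = heat_load_w / (CP_AIR * delta_t_k)   # kg/s
            return m_dot, m_dot / RHO_AIR                # (kg/s, m^3/s)

        m_dot, v_dot = required_airflow(heat_load_w=200_000.0, delta_t_k=15.0)
        print(f"~{m_dot:.1f} kg/s ({v_dot:.1f} m^3/s) of cooling air")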

  12. Computer software configuration management plan for 200 East/West Liquid Effluent Facilities

    International Nuclear Information System (INIS)

    This computer software configuration management plan covers control of the software for the monitor and control system that operates the Effluent Treatment Facility and its associated truck load-in station, as well as some key aspects of the Liquid Effluent Retention Facility, which stores condensate to be processed. Also controlled are the Treated Effluent Disposal System's pumping stations; the system also monitors waste generator flows in this system as well as in the Phase Two Effluent Collection System

  13. Demonstration of neutron radiography and computed tomography at the University of Texas thermal neutron imaging facility

    International Nuclear Information System (INIS)

    A thermal neutron imaging facility for real-time neutron radiography and computed tomography has recently been developed and built at the University of Texas TRIGA reactor. Herein the authors present preliminary results of radiography and tomography test experiments. These preliminary results showed that the beam is of high quality and is suitable for radiography and tomography applications. A more detailed description of the facility is given elsewhere

  14. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    Science.gov (United States)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including, real-time simulations, immersive systems, collaborative engineering environment, Web-based tools and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction: It has been a very active quarter in Computing, with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion: An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  16. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  17. ATLAS experience with HEP software at the Argonne leadership computing facility

    International Nuclear Information System (INIS)

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  18. Computer security at ukrainian nuclear facilities: interface between nuclear safety and security

    International Nuclear Information System (INIS)

    Active introduction of information technology and computer-based instrumentation and control systems (I and C systems) in the nuclear field leads to greater efficiency and better management of technological processes at nuclear facilities. However, this trend brings a number of challenges related to cyber-attacks on these elements, which can violate computer security as well as the nuclear safety and security of a facility. This paper considers regulatory support for computer security at nuclear facilities in Ukraine. The issue of computer and information security is considered in the context of physical protection, of which it is an integral component. The paper focuses on the computer security of I and C systems important to nuclear safety. These systems are potentially vulnerable to cyber threats and, in case of cyber-attacks, the potential negative impact on normal operational processes can lead to a breach of the nuclear facility's security. Because ensuring the security of I and C systems interacts with nuclear safety, the paper considers an example of an integrated approach to the requirements of nuclear safety and security

  19. Determination of computed tomography diagnostic reference levels in North-Central Nigeria

    International Nuclear Information System (INIS)

    The aim of this study is to estimate computed tomography (CT) dose levels for common CT examinations in North-Central Nigeria. Dose parameters and scan parameters for the most commonly performed CT examinations (head, chest and abdominal CT scans) were surveyed during a four-month period in 4 CT centres with multislice scanning capabilities (4-64 slices). Data on the volume CT dose index (CTDIvol) and dose length product (DLP) displayed on the scanner console were recorded for a minimum of 10 average-sized (70±10 kg) patients at each facility to estimate the DRLs. The rounded 75th percentile of the distribution was then used to calculate a DRL for each centre, and for the region by compiling all results from the centres surveyed. Data for 226 patients were collected. The CT dosimetry software Impact CT patient dosimetry calculator, version 1.0.4, with the National Radiation Protection Board SR250 data set was used to validate and compare the surveyed scanner-generated dose values. Estimated regional DRLs for head, chest and abdominal scans are (60 mGy and 1024 mGy.cm), (10 mGy and 407 mGy.cm) and (15 mGy and 757 mGy.cm) for CTDIvol and DLP respectively. Mean effective dose values are 1.7 mSv, 5 mSv and 11.9 mSv for head, chest and abdominal scans respectively. A wide variation of mean doses was observed across the centres; however, the DRL estimates were lower than EC (1999) values but above UK (2003) DRLs, except for the chest examination, which indicates a need for optimization. Validation results show agreement (<10% overall variation) between scanner-generated and software-calculated dose values. (au)
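
    The DRL estimation step described above reduces to taking the rounded 75th percentile of the per-patient dose quantities collected at each centre. The sketch below shows that computation on made-up CTDIvol and DLP values for ten average-sized patients; the numbers are not the study's data.

        # Estimate a centre's DRL as the rounded 75th percentile of surveyed values.
        import numpy as np

        ctdi_vol_head = np.array([52, 58, 61, 47, 66, 59, 55, 63, 60, 49], dtype=float)            # mGy
        dlp_head = np.array([880, 1010, 950, 790, 1120, 990, 930, 1060, 1005, 845], dtype=float)    # mGy.cm

        drl_ctdi = round(float(np.percentile(ctdi_vol_head, 75)))
        drl_dlp = round(float(np.percentile(dlp_head, 75)))
        print(f"head CT DRL estimate: CTDIvol {drl_ctdi} mGy, DLP {drl_dlp} mGy.cm")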

  20. Experience from the operation of the Swedish Central Interim Storage Facility for Spent Nuclear Fuel, CLAB

    International Nuclear Information System (INIS)

    Currently, about 50% of the electric power in Sweden is generated by means of nuclear power. The Swedish nuclear programme comprises 12 plants. According to political decisions, no more nuclear power plants will be built and the existing plants will not be operated beyond the year 2010. The programme will give rise to not more than 7800 t U of spent fuel, which will be directly disposed of in the crystalline bedrock without reprocessing. A keystone in the spent fuel management strategy is the Central Interim Storage Facility for Spent Nuclear Fuel, CLAB. After intensive pre-project work, the licensing of CLAB according to the Building, Environment Protection and Atomic Energy Acts took place in 1978-1979. After a total licensing time of about 20 months, the last permit was obtained in August 1979. By August 1994, CLAB had received and unloaded some 720 fuel transport casks, corresponding to about 2000 t U, and 60 casks containing highly active core components. The performance of the plant has been very satisfactory and with increasing experience it has been possible to reduce the operating and maintenance costs. The extensive efforts during the design phase have resulted in a collective dose of 25-30% of the dose calculated in the final safety report. Owing to a low activity release from the fuel and optimized management of the used water filtering agents, the number of waste packages emanating from CLAB has been less than 10% of what was originally expected. The activity release to air and water from the facility during the first five years of operation has been around 0.01% of the permissible release. In order to postpone the building of additional storage pools, new storage canisters have been developed which have increased the storage capacity from 3000 to 5000 t U. (author). 1 fig

  1. The development and operation of the international solar-terrestrial physics central data handling facility

    Science.gov (United States)

    Lehtonen, Kenneth

    1994-01-01

    The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) International Solar-Terrestrial Physics (ISTP) Program is committed to the development of a comprehensive, multi-mission ground data system which will support a variety of national and international scientific missions in an effort to study the flow of energy from the sun through the Earth-space environment, known as the geospace. A major component of the ISTP ground data system is an ISTP-dedicated Central Data Handling Facility (CDHF). Acquisition, development, and operation of the ISTP CDHF were delegated by the ISTP Project Office within the Flight Projects Directorate to the Information Processing Division (IPD) within the Mission Operations and Data Systems Directorate (MO&DSD). The ISTP CDHF supports the receipt, storage, and electronic access of the full complement of ISTP Level-zero science data; serves as the linchpin for the centralized processing and long-term storage of all key parameters generated either by the ISTP CDHF itself or received from external, ISTP Program approved sources; and provides the required networking and 'science-friendly' interfaces for the ISTP investigators. Once connected to the ISTP CDHF, investigators can browse the online catalog of key parameters from their remote processing facilities for the immediate electronic receipt of selected key parameters using the NASA Science Internet (NSI), managed by NASA's Ames Research Center. The purpose of this paper is twofold: (1) to describe how the ISTP CDHF was successfully implemented and operated to support initially the Japanese Geomagnetic Tail (GEOTAIL) mission and correlative science investigations, and (2) to describe how the ISTP CDHF has been enhanced to support ongoing as well as future ISTP missions. Emphasis will be placed on how various project management approaches were undertaken that proved to be highly effective in delivering an operational ISTP CDHF to the Project on schedule and

  2. Evolution of facility layout requirements and CAD [computer-aided design] system development

    International Nuclear Information System (INIS)

    The overall configuration of the Superconducting Super Collider (SSC) including the infrastructure and land boundary requirements were developed using a computer-aided design (CAD) system. The evolution of the facility layout requirements and the use of the CAD system are discussed. The emphasis has been on minimizing the amount of input required and maximizing the speed by which the output may be obtained. The computer system used to store the data is also described

  3. 2015 Annual Wastewater Reuse Report for the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Michael George [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2014, through October 31, 2015.

  4. 2013 Annual Wastewater Reuse Report for the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Mike Lewis

    2014-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2012, through October 31, 2013. The report contains, as applicable, the following information: • Site description • Facility and system description • Permit required monitoring data and loading rates • Status of compliance conditions and activities • Discussion of the facility’s environmental impacts. During the 2013 permit year, no wastewater was land-applied to the irrigation area of the Central Facilities Area Sewage Treatment Plant and therefore, no effluent flow volumes or samples were collected from wastewater sampling point WW-014102. However, soil samples were collected in October from soil monitoring unit SU-014101.

  5. An enhanced aerobic bioremediation system at a central production facility -- system design and data analysis

    International Nuclear Information System (INIS)

    A successful field demonstration of enhanced in-situ aerobic bioremediation, with remarkable results, took place from August 1, 1991 through year-end 1992 at a central production facility in Michigan. In-situ soil logging and groundwater sampling by the cone penetrometer/porous probe system provided a real-time definition of the groundwater flow 'channel' and a clear delineation of the plume extent. That facilitated the design of the closed-loop bioremediation system, consisting of two downgradient pumping wells to completely capture the plume and two pairs of bi-level injection wells located upgradient of the plume. The purged groundwater from the two pumping wells, after being amended with dissolved oxygen, is reinjected directly into the two pairs of upgradient bi-level injection wells. In addition, the performance of the system is monitored by 17 multilevel piezometers. Each piezometer consists of four vertical sampling levels, providing a total of 68 sampling points to fully define the three-dimensional characteristics of the BTEX and DO plumes. Based on a hydrograph analysis of the groundwater data, the closed-loop bioremediation system has been operating properly. In addition, a particle tracking analysis showed that groundwater flowlines converge to the pumping wells, demonstrating the effectiveness of the plume capture. The trend analysis showed a consistent decline of BTEX concentrations at all of the 68 sampling points

  6. The Organization and Evaluation of a Computer-Assisted, Centralized Immunization Registry.

    Science.gov (United States)

    Loeser, Helen; And Others

    1983-01-01

    Evaluation of a computer-assisted, centralized immunization registry after one year shows that 93 percent of eligible health practitioners initially agreed to provide data and that 73 percent continue to do so. Immunization rates in audited groups have improved significantly. (GC)

  7. Central washout sign in computer-aided evaluation of breast MRI: preliminary results

    International Nuclear Information System (INIS)

    Background: Although computer-aided evaluation (CAE) programs were introduced to help differentiate benign tumors from malignant ones, the set of CAE-measured parameters that best predicts malignancy has not yet been established. Purpose: To assess the value of the central washout sign on CAE color overlay images of breast MRI. Material and Methods: We evaluated the frequency of the central washout sign using CAE. The central washout sign was defined as thin, rim-like persistent kinetics in the periphery of the tumor, followed sequentially by plateau and washout kinetics toward the center. Two additional CAE delayed-kinetic variables were compared with the central washout sign to assess diagnostic utility: the predominant enhancement type (washout, plateau, or persistent) and the most suspicious enhancement type (any washout > any plateau > any persistent kinetics). Results: One hundred and forty-nine pathologically proven breast lesions (130 malignant, 19 benign) were evaluated. A central washout sign was associated with 87% of malignant lesions but only 11% of benign lesions. Significant differences were found when delayed-phase kinetics were categorized by the most suspicious enhancement type (P< 0.001) and the presence of the central washout sign (P< 0.001). Under the criteria of the most suspicious kinetics, 68% of benign lesions were assigned a plateau or washout pattern. Conclusion: The central washout sign is a reliable indicator of malignancy on CAE color overlay images of breast MRI
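
    For illustration, the sketch below classifies delayed-phase kinetics and flags a lesion whose rim is persistent while its core shows plateau or washout, mirroring the definition above. The ±10% thresholds are the common DCE-MRI convention, assumed here; the commercial CAE package used in the study may apply different criteria, and all input values are hypothetical.

        # Toy classifier for delayed-phase enhancement kinetics (assumed +/-10% thresholds).
        def delayed_kinetics(initial_signal, delayed_signal):
            """Return 'persistent', 'plateau' or 'washout' for one curve."""
            change = (delayed_signal - initial_signal) / initial_signal
            if change > 0.10:
                return "persistent"
            if change < -0.10:
                return "washout"
            return "plateau"

        def has_central_washout(rim_curves, core_curves):
            """Persistent kinetics at the rim with plateau/washout kinetics toward the centre."""
            rim = {delayed_kinetics(*c) for c in rim_curves}
            core = {delayed_kinetics(*c) for c in core_curves}
            return rim == {"persistent"} and bool(core) and core <= {"plateau", "washout"}

        print(has_central_washout([(100, 118), (95, 110)], [(120, 121), (130, 112)]))  # True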

  8. Final Safety Report for Central Interim Storage Facility for Radioactive Waste from Small Producers in Brinje near Ljubljana

    International Nuclear Information System (INIS)

    In 1999 the Agency for Radwaste Management (ARAO) took over the management of the Central Interim Storage for radioactive waste from small producers in Brinje. At the same time, in accordance with the Decree on the Mode, Subject and Terms of Performing the Public Service of Radioactive Waste Management (Official Gazette of RS, 32/99), the ARAO agency was appointed to perform the public service of radioactive waste management for waste produced in Slovenia in industry, medicine and research. After taking over responsibility for the storage, the ARAO thoroughly reviewed the facility and the stored inventory. Based on the findings of this review, the ARAO agency prepared a basic document for the facility, i.e. the Final Safety Report for the Central Interim Storage for radioactive waste from small producers in Brinje. The safety report is based on the present state of the facility, its location, the stored inventory and present operation. The latter is limited to storage of the already stored waste, acceptance of new waste only in emergency cases, and internal transport of accepted waste by forklift or manually. The Final Safety Report is prepared in accordance with the requirements of Regulation E2 and deals with the following areas: the safety approach to LILW storage; description and location analysis of the Central interim storage; technical features of the Central interim storage; safety analysis of the Central interim storage; organisational measures for normal operation of the Central interim storage; operational conditions and limitations; the ionising radiation protection service, its methods and equipment; radioactive waste management and disposal; review of the plans, measures and procedures to prevent radiological accidents; the quality assurance programme; review of the measures for physical protection of the LILW storage and stored radioactive waste; and planned measures and necessary equipment for the closure of the Central interim storage. Safety analysis proves that the facility

  9. Trends in health facility based maternal mortality in Central Region, Kenya: 2008-2012

    Science.gov (United States)

    Muchemi, Onesmus Maina; Gichogo, Agnes Wangechi; Mungai, Jane Githuku; Roka, Zeinab Gura

    2016-01-01

    Introduction: WHO classifies Kenya as having a high maternal mortality. Regional data on maternal mortality trends are only available in selected areas. This study reviewed health facility maternal mortality trends, causes and distribution in the Central Region of Kenya, 2008-2012. Methods: We reviewed health records from July 2008 to June 2012. A maternal death was defined according to the ICD-10 criterion. The variables reviewed included socio-demographic and obstetric characteristics, reasons for admission, causes of death and contributing factors. We estimated the maternal mortality ratio for each year, and overall for the four-year period, using a standard equation, and used frequencies, means/medians and proportions for the other descriptive variables. Results: A total of 421 deaths occurred among 344,191 live births; 335 (80%) deaths were audited. Maternal mortality ratios were: 127/100,000 live births in 2008/09; 124/100,000 live births in 2009/2010; 129/100,000 live births in 2010/2011 and 111/100,000 live births in 2011/2012. Direct causes contributed the majority of deaths (77%, n=234), including hemorrhage, infection and pre-eclampsia/eclampsia. Mean age was 30 (±6) years; 147 (71%) attended fewer than four antenatal visits and median gestation at birth was 38 weeks (IQR=9). One hundred ninety (59%) died within 24 hours after admission. There were 111 (46%) caesarean births, 95 (39%) skilled vaginal deliveries, 31 (13%) unskilled deliveries, 5 (2%) vacuum deliveries and 1 (<1%) destructive operation. Conclusion: The region recorded an unsteady declining trend. Direct causes contributed the majority of deaths, including hemorrhage, infection and pre-eclampsia/eclampsia. We recommend health education on individualized birth plans and mentorship on emergency obstetric care. Further studies are necessary to clarify and expand the findings of this study. PMID:27516824

  10. Interventions on central computing services during the weekend of 21 and 22 August

    CERN Multimedia

    2004-01-01

    As part of the planned upgrade of the computer centre infrastructure to meet the LHC computing needs, approximately 150 servers, hosting in particular the NICE home directories, Mail services and Web services, will need to be physically relocated to another part of the computing hall during the weekend of 21 and 22 August. On Saturday 21 August, starting from 8:30 a.m., interruptions of typically 60 minutes will take place on the following central computing services: NICE and the whole Windows infrastructure, Mail services, file services (including home directories and DFS workspaces), Web services, VPN access, and Windows Terminal Services. During any interruption, incoming mail from outside CERN will be queued and delivered as soon as the service is operational again. All services should be available again on Saturday 21 August at 17:30, but a few additional interruptions will be possible after that time and on Sunday 22 August. IT Department

  11. Specialized, multi-user computer facility for the high-speed, interactive processing of experimental data

    International Nuclear Information System (INIS)

    A proposal has been made at LBL to develop a specialized computer facility specifically designed to deal with the problems associated with the reduction and analysis of experimental data. Such a facility would provide a highly interactive, graphics-oriented, multi-user environment capable of handling relatively large data bases for each user. By conceptually separating the general problem of data analysis into two parts, cyclic batch calculations and real-time interaction, a multilevel, parallel processing framework may be used to achieve high-speed data processing. In principle such a system should be able to process a mag tape equivalent of data through typical transformations and correlations in under 30 s. The throughput for such a facility, for five users simultaneously reducing data, is estimated to be 2 to 3 times greater than is possible, for example, on a CDC7600. 3 figures

  12. Algorithms and Heuristics for Scalable Betweenness Centrality Computation on Multi-GPU Systems

    OpenAIRE

    Vella, Flavio; Carbone, Giancarlo; Bernaschi, Massimo

    2016-01-01

    Betweenness Centrality (BC) is steadily growing in popularity as a metric of the influence of a vertex in a graph. The BC score of a vertex is proportional to the number of all-pairs shortest paths passing through it. However, complete and exact BC computation for a large-scale graph is an extraordinary challenge that requires high performance computing techniques to provide results in a reasonable amount of time. Our approach combines bi-dimensional (2-D) decomposition of the graph and mult...
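
    For reference, the per-source stage that such GPU implementations parallelise is Brandes' algorithm: a BFS from each source to count shortest paths, followed by a backward dependency accumulation. The plain-Python sketch below illustrates the idea on an unweighted graph; it is not the paper's multi-GPU code, and the example graph is arbitrary.

        # Brandes' algorithm for betweenness centrality on an unweighted graph.
        from collections import deque

        def betweenness_centrality(adj):
            """adj: dict mapping each vertex to an iterable of its neighbours."""
            bc = {v: 0.0 for v in adj}
            for s in adj:
                sigma = {v: 0 for v in adj}; sigma[s] = 1      # shortest-path counts
                dist = {v: -1 for v in adj}; dist[s] = 0
                preds = {v: [] for v in adj}
                order, queue = [], deque([s])
                while queue:                                   # BFS from source s
                    v = queue.popleft(); order.append(v)
                    for w in adj[v]:
                        if dist[w] < 0:
                            dist[w] = dist[v] + 1; queue.append(w)
                        if dist[w] == dist[v] + 1:
                            sigma[w] += sigma[v]; preds[w].append(v)
                delta = {v: 0.0 for v in adj}
                for w in reversed(order):                      # dependency accumulation
                    for v in preds[w]:
                        delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
                    if w != s:
                        bc[w] += delta[w]
            return bc

        graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
        print(betweenness_centrality(graph))   # undirected: each pair is counted twice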

  13. Clinical radiodiagnosis of metastases of central lung cancer in regional lymph nodes using computers

    International Nuclear Information System (INIS)

    On the basis of literature data and clinical examination (112 patients), methods for the clinical radiodiagnosis of metastases of central lung cancer in regional lymph nodes using computers are developed. The methods are tested on control clinical material (110 patients). Using computers (Bayes and Wald methods), 57.3% and 65.5% correct answers, respectively, are obtained, which is 14.6% and 22.8% higher than the level of clinical diagnosis of metastases. Diagnostic errors are analysed. Complexes of clinical-radiological signs symptomatic of metastases are outlined

  14. Central Nervous System Based Computing Models for Shelf Life Prediction of Soft Mouth Melting Milk Cakes

    Directory of Open Access Journals (Sweden)

    Gyanendra Kumar Goyal

    2012-04-01

    This paper presents the latency and potential of a central nervous system based intelligent computing system for detecting the shelf life of soft mouth melting milk cakes stored at 10 °C. Soft mouth melting milk cakes are an exquisite sweetmeat cuisine made from heat- and acid-thickened solidified sweetened milk. In today’s highly competitive market consumers look for good quality food products. Shelf life is a good and accurate indicator of food quality and safety. To achieve good quality of food products, detection of shelf life is important. A central nervous system based intelligent computing model was developed which predicted a shelf life of 19.82 days, as against the experimental shelf life of 21 days.
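
    As a stand-in for the kind of feed-forward model the abstract describes, the sketch below trains a small multilayer perceptron to regress shelf life from quality measurements. The feature set, the synthetic data and the use of scikit-learn's MLPRegressor are all assumptions for illustration; the authors' actual network and inputs are not specified here.

        # Toy shelf-life regression with a small neural network (illustrative only).
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(1)
        # Hypothetical inputs: storage day, moisture %, titratable acidity, sensory score
        X = rng.uniform([0, 20, 0.2, 5.0], [25, 35, 1.0, 9.0], size=(200, 4))
        y = 21 - 0.6 * X[:, 0] + rng.normal(0, 0.8, 200)   # synthetic remaining shelf life (days)

        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(8, 4), max_iter=5000, random_state=0))
        model.fit(X, y)
        print("predicted shelf life (days):",
              round(float(model.predict([[10, 27.5, 0.5, 7.2]])[0]), 1))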

  15. Opportunities for artificial intelligence application in computer- aided management of mixed waste incinerator facilities

    International Nuclear Information System (INIS)

    The Department of Energy/Oak Ridge Field Office (DOE/OR) operates a mixed waste incinerator facility at the Oak Ridge K-25 Site. It is designed for the thermal treatment of incinerable liquid, sludge, and solid waste regulated under the Toxic Substances Control Act (TSCA) and the Resource Conservation and Recovery Act (RCRA). This facility, known as the TSCA Incinerator, services seven DOE/OR installations. The incinerator was recently authorized for production operation in the United States for the processing of mixed (radioactively contaminated-chemically hazardous) wastes as regulated under TSCA and RCRA. Operation of the TSCA Incinerator is highly constrained as a result of regulatory, institutional, technical, and resource availability requirements. These requirements affect the characteristics and disposition of incinerator residues, limit the quality of liquid and gaseous effluents, limit the characteristics and rates of waste feeds and operating conditions, and restrict the handling of the waste feed inventories. This incinerator facility presents an opportunity for applying computer technology as a technical resource for mixed waste incinerator operation, to help promote and sustain a continuous performance improvement process while demonstrating compliance. Demonstrated computer-aided management systems could be transferred to future mixed waste incinerator facilities

  16. Central Nervous System Based Computing Models for Shelf Life Prediction of Soft Mouth Melting Milk Cakes

    OpenAIRE

    Gyanendra Kumar Goyal; Sumit Goyal

    2012-01-01

    This paper presents the latency and potential of a central nervous system based intelligent computing system for detecting shelf life of soft mouth melting milk cakes stored at 10 °C. Soft mouth melting milk cakes are exquisite sweetmeat cuisine made out of heat and acid thickened solidified sweetened milk. In today’s highly competitive market consumers look for good quality food products. Shelf life is a good and accurate indicator to the food quality and safety. To achieve g...

  17. Automation of a cryogenic facility by commercial process-control computer

    International Nuclear Information System (INIS)

    To ensure that Brookhaven's superconducting magnets are reliable and their field quality meets accelerator requirements, each magnet is pre-tested at operating conditions after construction. MAGCOOL, the production magnet test facility, was designed to perform these tests, having the capacity to test ten magnets per five-day week. This paper describes the control aspects of MAGCOOL and the advantages afforded the designers by the implementation of a commercial process control computer system

  18. Taking the classical large audience university lecture online using tablet computer and webconferencing facilities

    DEFF Research Database (Denmark)

    Brockhoff, Per B.

    2011-01-01

    During four offerings (September 2008 – May 2011) of the course 02402 Introduction to Statistics for Engineering students at DTU, with an average of 256 students, the lecturing was carried out 100% through a tablet computer combined with the web conferencing facility Adobe Connect (version 7). This...... reacted positive to the initiative. The one year hit statistics of almost 10.000 during 2010 on the resulting output lecture videos indicates that the approach has a clear impact....

  19. Report on the Best Available Technology (BAT) for the treatment of the INEL Central Laundry and Respirator Facility (CFA-617)

    International Nuclear Information System (INIS)

    The Central Laundry and Respirator Facility (CLRF), designated by building number CFA-617, has been identified as a potential source of contamination of the Central Facilities Area (CFA) subsurface drainage field, which also receives waste water from the current CFA Sewage Treatment Plant (STP). To date, discharges from the CLRF have been below the set guidelines (DCG). A new STP has been proposed for the CFA. Since the CLRF has been designated as a potential source of contamination, a Best Available Technology (BAT) assessment was requested to determine what action should be taken with respect to the aqueous discharges from the CLRF. The BAT assessment involved source definition, technology evaluation, BAT matrix development, BAT selection, and BAT documentation. The BAT selected for the Central Laundry and Respirator Facility was the treatment that would impact the CLRF and the new STP the least in all aspects considered: a system of filtration and a lined pond for natural evaporation of the water. The system will isolate this waste stream from all other CFA waste water, which will be treated at the new STP. Waste minimization possibilities exist within the laundry process and are considered. These minimization actions will reduce the amount of waste water being released, but will result in raising the contaminant concentrations (the total mass will remain the same). The second option was the use of ion exchange to remove the contaminants and recycle the water back to the wash and rinse cycles in the laundry. 3 refs., 9 figs., 11 tabs

  20. Nuclear fuel cycle facilities in the world (excluding the centrally planned economies)

    International Nuclear Information System (INIS)

    Information on the existing, under construction and planned fuel cycle facilities in the various countries is presented. Some thirty countries have activities related to different nuclear fuel cycle steps and the information covers the capacity, status, location, and the names of owners of the facilities

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction: The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and events that are harder to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme: The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations: Facility and infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  2. Temporary septic holding tank at the 300-FF-1 remedial action central support facility -- Engineering report

    International Nuclear Information System (INIS)

    The 300-FF-1 Remedial Action Support Facility will be required in the 300 Area (at the Hanford Site in Richland, Washington) to support the remedial actions planned for the 300-FF-1 Operable Unit. In conjunction with this project, soils laden with radiological contamination will be excavated, removed, and transported to a permitted disposal facility, if required based upon characterization. This facility will be a temporary, modular building sized to provide office and work space for the supervisors, engineers, and technicians assigned to the project and engaged in the associated field work. Electrical and potable water service to the 300-FF-1 Support Facility will be provided via permanent connections to existing systems. A temporary septic system is desired as opposed to connecting to the existing sewer system due to regulatory issues. The paper describes the project location, geology and flooding potential, design criteria, operations, and maintenance

  3. Value of computed tomography and magnetic resonance imaging in diagnosis of central nervous system

    International Nuclear Information System (INIS)

    Systemic sclerosis is an autoimmune connective tissue disease characterized by vascular abnormalities and fibrotic changes in the skin and internal organs. The aim of the study was to investigate involvement of the central nervous system in systemic sclerosis and the value of computed tomography (CT) and magnetic resonance imaging (MRI) in evaluating that involvement. 26 patients with neuropsychiatric symptoms in the course of systemic sclerosis were investigated for central nervous system abnormalities by computed tomography (CT) and magnetic resonance imaging (MRI). Among these 26 symptomatic patients, lesions on brain MRI and CT examinations were present in 54% and 50% of patients, respectively. The most common findings (in 46% of all patients) were signs of cortical and subcortical atrophy, seen in both MRI and CT. Single and multiple focal lesions, predominantly in the white matter, were detected by MRI significantly more frequently than by CT (62% and 15% of patients, respectively). These data indicate that brain involvement is common in patients with severe systemic sclerosis. MRI shows significantly higher sensitivity than CT in detecting focal brain lesions in these patients. (author)

  4. Computer mapping and visualization of facilities for planning of D and D operations

    International Nuclear Information System (INIS)

    The lack of as-built drawings for many old nuclear facilities impedes planning for decontamination and decommissioning. Traditional manual walkdowns subject workers to lengthy exposure to radiological and other hazards. The authors have applied close-range photogrammetry, 3D solid modeling, computer graphics, database management, and virtual reality technologies to create geometrically accurate 3D computer models of the interiors of facilities. The required input to the process is a set of photographs that can be acquired in a brief time. They fit 3D primitive shapes to objects of interest in the photos and, at the same time, record attributes such as material type and link patches of texture from the source photos to facets of modeled objects. When they render the model as either static images or at video rates for a walk-through simulation, the phototextures are warped onto the objects, giving a photo-realistic impression. The authors have exported the data to commercial CAD, cost estimating, robotic simulation, and plant design applications. Results from several projects at old nuclear facilities are discussed

  5. Calculation of shielding of X rays in radiotherapy facilities with computer aid

    International Nuclear Information System (INIS)

    This work presents a methodology for the computer-aided calculation of X-ray shielding in radiotherapy facilities. A user-friendly program, called RadTeraX, was developed in the Delphi programming language; from manually entered data describing a basic architectural plan and a few parameters, it interprets the geometry and calculates the shielding of the walls, floor and roof of an X-ray radiotherapy installation. As a final product, the program supplies a graphic screen with all the input data and the calculated shielding, together with the corresponding calculation report. To this day in Brazil, the calculation of shielding for radiotherapy facilities with X rays has been based on the recommendations of NCRP-49, which establishes the calculation methodology needed to prepare a shielding project. However, at high energies, where the construction of a maze is necessary, NCRP-49 is insufficient; studies in this field produced an article that proposes a solution to the problem, and this solution was implemented in the program. The program can be applied in the practical execution of shielding projects for radiotherapy facilities and, didactically, in comparison with NCRP-49; it has been registered under number 00059420 at INPI - Instituto Nacional da Propriedade Industrial (National Institute of Industrial Property). (author)
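
    The primary-barrier calculation that such a program automates follows the standard NCRP-49 pattern: find the required transmission B = P·d²/(W·U·T) from the design dose limit, then convert B to a thickness using tenth-value layers for the barrier material. The sketch below shows that arithmetic with purely illustrative inputs; it is not RadTeraX, and the TVL value is an assumption.

        # NCRP-49-style primary barrier estimate (illustrative values only).
        import math

        def required_transmission(p_sv_week, d_m, w_gy_week, u, t):
            """B = P d^2 / (W U T) for a primary barrier."""
            return p_sv_week * d_m ** 2 / (w_gy_week * u * t)

        def barrier_thickness(b, tvl_cm):
            """Shielding thickness given transmission B and the material's tenth-value layer."""
            return max(0.0, math.log10(1.0 / b)) * tvl_cm

        # Hypothetical controlled area 5 m from the target, 6 MV beam, concrete barrier.
        B = required_transmission(p_sv_week=1e-4, d_m=5.0, w_gy_week=500.0, u=0.25, t=1.0)
        print(f"required transmission B = {B:.2e}")
        print(f"concrete thickness ~ {barrier_thickness(B, tvl_cm=37.0):.0f} cm (TVL assumed 37 cm)")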

  6. SynapSense wireless environmental monitoring system of the RHIC and ATLAS computing facility at BNL

    International Nuclear Information System (INIS)

    The RHIC and ATLAS Computing Facility (RACF) at BNL is a 15,000 sq. ft. facility hosting the IT equipment of the BNL ATLAS WLCG Tier-1 site, offline farms for the STAR and PHENIX experiments operating at the Relativistic Heavy Ion Collider (RHIC), the BNL Cloud installation, various Open Science Grid (OSG) resources, and many other small physics-research-oriented IT installations. The facility originated in 1990 and grew steadily to the present configuration, with 4 physically isolated IT areas, a maximum rack capacity of about 1000 racks and a total peak power consumption of 1.5 MW. In June 2012 a project was initiated with the primary goal of replacing several environmental monitoring systems deployed earlier within RACF with a single commercial hardware and software solution from SynapSense Corporation, based on wireless sensor groups and proprietary SynapSense™ MapSense™ software, that offers a unified solution for monitoring the temperature and humidity within the rack/CRAC units as well as the pressure distribution underneath the raised floor across the entire facility. The deployment was completed successfully in 2013. The new system also supports a set of additional features, such as capacity planning based on measurements of total heat load, power consumption monitoring and control, CRAC unit power consumption optimization based on feedback from the temperature measurements, and overall power usage efficiency estimations, that are not currently implemented within RACF but may be deployed in the future.
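
    The power usage efficiency estimate mentioned above is a simple ratio of total facility power to IT load, computed from the aggregated readings such a monitoring system exposes. The sketch below shows the arithmetic with made-up numbers; the reading names are placeholders, not SynapSense data fields.

        # Power Usage Effectiveness (PUE) from aggregated facility readings (illustrative).
        def pue(total_facility_kw, it_load_kw):
            return total_facility_kw / it_load_kw

        readings = {"it_load_kw": 950.0, "cooling_kw": 310.0, "lighting_and_losses_kw": 45.0}
        total = sum(readings.values())
        print(f"estimated PUE = {pue(total, readings['it_load_kw']):.2f}")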

  7. 2014 Annual Wastewater Reuse Report for the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Mike [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-02-01

    This report describes conditions, as required by the state of Idaho Wastewater Reuse Permit (#LA-000141-03), for the wastewater land application site at the Idaho National Laboratory Site’s Central Facilities Area Sewage Treatment Plant from November 1, 2013, through October 31, 2014. The report contains, as applicable, the following information: site description; facility and system description; permit-required monitoring data and loading rates; status of compliance conditions and activities; and discussion of the facility’s environmental impacts. The current permit expires on March 16, 2015. A permit renewal application was submitted to the Idaho Department of Environmental Quality on September 15, 2014. During the 2014 permit year, no wastewater was land-applied to the irrigation area of the Central Facilities Area Sewage Treatment Plant and therefore no effluent flow volumes or samples were collected from wastewater sampling point WW-014102. Seepage testing of the three lagoons was performed between August 26, 2014 and September 22, 2014. Seepage rates from Lagoons 1 and 2 were below the 0.25 inches/day requirement; however, Lagoon 3 was above 0.25 inches/day. Lagoon 3 has been isolated and is being evaluated for future use or permanent removal from service.

  8. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Allcock, William; Beggio, Chris; Campbell, Stuart; Cherry, Andrew; Cholia, Shreyas; Dart, Eli; England, Clay; Fahey, Tim; Foertter, Fernanda; Goldstone, Robin; Hick, Jason; Karelitz, David; Kelly, Kaki; Monroe, Laura; Prabhat,; Skinner, David; White, Julia

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Computing Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  9. A personal computer code for seismic evaluations of nuclear power plant facilities

    International Nuclear Information System (INIS)

    In the process of reviewing and evaluating licensing issues related to nuclear power plants, it is essential to understand the behavior of seismic loading and of foundation and structural properties, and their impact on the overall structural response. In most cases, such knowledge can be obtained by using simplified engineering models which, when properly implemented, capture the essential parameters describing the physics of the problem. Such models do not require execution on large computer systems and can be implemented through a personal computer (PC) based capability. Recognizing the need for a PC software package that can perform the structural response computations required for typical licensing reviews, the US Nuclear Regulatory Commission sponsored the development of the PC-operated software package CARES (Computer Analysis for Rapid Evaluation of Structures). This development was undertaken by Brookhaven National Laboratory (BNL) during FYs 1988 and 1989. A wide range of computer programs and modeling approaches are often used to justify the safety of nuclear power plants. It is often difficult to assess the validity and accuracy of the results submitted by various utilities without developing comparable computer solutions. Taking this into consideration, CARES is designed as an integrated computational system which can perform rapid evaluations of structural behavior and examine the capability of nuclear power plant facilities; thus CARES may be used by the NRC to determine the validity and accuracy of analysis methodologies employed for structural safety evaluations of nuclear power plants. CARES has been designed to operate on a PC, to have a user-friendly input/output interface, and to provide quick turnaround. This paper describes the various features which have been implemented into the seismic module of CARES version 1.0

  10. Conceptual design of an ALICE Tier-2 centre. Integrated into a multi-purpose computing facility

    Energy Technology Data Exchange (ETDEWEB)

    Zynovyev, Mykhaylo

    2012-06-29

    This thesis discusses the issues and challenges associated with the design and operation of a data analysis facility for a high-energy physics experiment at a multi-purpose computing centre. In the spotlight is a Tier-2 centre of the distributed computing model of the ALICE experiment at the Large Hadron Collider at CERN in Geneva, Switzerland. The design steps examined in the thesis include analysis and optimization of the I/O access patterns of the user workload, integration of the storage resources, and development of techniques for effective system administration and operation of the facility in a shared computing environment. A number of I/O access performance issues on multiple levels of the I/O subsystem, introduced by the use of hard disks for data storage, have been addressed by means of exhaustive benchmarking and thorough analysis of the I/O of the user applications in the ALICE software framework. Defining the set of requirements for the storage system, describing the potential performance bottlenecks and single points of failure, and examining possible ways to avoid them allows one to develop guidelines for selecting how to integrate the storage resources. A solution for preserving a specific software stack for the experiment in a shared environment is presented, along with its effects on user workload performance. The proposal for a flexible model to deploy and operate the ALICE Tier-2 infrastructure and applications in a virtual environment through adoption of cloud computing technology and the 'Infrastructure as Code' concept completes the thesis. Scientific software applications can be efficiently computed in a virtual environment, and there is an urgent need to adapt the infrastructure for effective usage of cloud resources.

  11. Conceptual design of an ALICE Tier-2 centre. Integrated into a multi-purpose computing facility

    International Nuclear Information System (INIS)

    This thesis discusses the issues and challenges associated with the design and operation of a data analysis facility for a high-energy physics experiment at a multi-purpose computing centre. In the spotlight is a Tier-2 centre of the distributed computing model of the ALICE experiment at the Large Hadron Collider at CERN in Geneva, Switzerland. The design steps examined in the thesis include analysis and optimization of the I/O access patterns of the user workload, integration of the storage resources, and development of techniques for effective system administration and operation of the facility in a shared computing environment. A number of I/O access performance issues on multiple levels of the I/O subsystem, introduced by the use of hard disks for data storage, have been addressed by means of exhaustive benchmarking and thorough analysis of the I/O of the user applications in the ALICE software framework. Defining the set of requirements for the storage system, describing the potential performance bottlenecks and single points of failure, and examining possible ways to avoid them allows one to develop guidelines for selecting how to integrate the storage resources. A solution for preserving a specific software stack for the experiment in a shared environment is presented, along with its effects on user workload performance. The proposal for a flexible model to deploy and operate the ALICE Tier-2 infrastructure and applications in a virtual environment through adoption of cloud computing technology and the 'Infrastructure as Code' concept completes the thesis. Scientific software applications can be efficiently computed in a virtual environment, and there is an urgent need to adapt the infrastructure for effective usage of cloud resources.

  12. A personal computer code for seismic evaluations of nuclear power plant facilities

    International Nuclear Information System (INIS)

    A wide range of computer programs and modeling approaches are often used to justify the safety of nuclear power plants. It is often difficult to assess the validity and accuracy of the results submitted by various utilities without developing comparable computer solutions. Taking this into consideration, CARES is designed as an integrated computational system which can perform rapid evaluations of structural behavior and examine the capability of nuclear power plant facilities; thus CARES may be used by the NRC to determine the validity and accuracy of analysis methodologies employed for structural safety evaluations of nuclear power plants. CARES has been designed to operate on a PC, have a user-friendly input/output interface, and have quick turnaround. The CARES program is structured in a modular format. Each module performs a specific type of analysis. The basic modules of the system are associated with capabilities for static, seismic and nonlinear analyses. This paper describes the various features which have been implemented into the Seismic Module of CARES version 1.0. In Section 2 a description of the Seismic Module is provided. The methodologies and computational procedures thus far implemented into the Seismic Module are described in Section 3. Finally, a complete demonstration of the computational capability of CARES in a typical soil-structure interaction analysis is given in Section 4 and conclusions are presented in Section 5. 5 refs., 4 figs

  13. Current situation with the centralized storage facilities for non-power radioactive wastes in Latin American countries

    International Nuclear Information System (INIS)

    Full text: Several Latin American (LA) countries have been firmly committed to the peaceful applications of ionizing radiation in medicine, industry, agriculture and research in order to achieve socioeconomic development in diverse sectors. Consequently, the use of radioactive materials and radiation sources, as well as the production of radioisotopes and labeled compounds, will always produce radioactive wastes which require adequate management and, in the end, disposal. However, there are countries in the Latin American region whose radioactive waste volumes do not easily justify a national repository. Moreover, such facilities are extremely expensive to develop. It is unlikely that such an option will become available in the foreseeable future for most of these countries, which do not have nuclear industries. Storage has long been incorporated as a step in the management of radioactive wastes. In recent years, there have been developments that have led some countries to consider whether the role of storage might be expanded to provide longer-term care of long-lived radioactive wastes. The aim of this paper is to discuss the current situation with the storage facilities and conditions for radioactive wastes and disused sealed radioactive sources in Latin American countries. In some cases a brief description of the existing facilities for certain countries is provided. In other cases, when no centralized facility exists, general information on the radioactive inventories and disused sealed sources is given. (author)

  14. Studies of the thermohydraulics of the Irradiation Research Facility (IRF) chimney using computational fluid dynamics

    International Nuclear Information System (INIS)

    AECL is developing a concept for a new Irradiation Research Facility (IRF) that will be used to support ongoing development of CANDU technology and advanced materials research after the NRU reactor shuts down. As part of the IRF Pre-Project Engineering Program, computational fluid dynamics (CFD) analyses of the flow patterns and heat transfer within four reactor components - the inlet plenum, reflector tank, chimney, and the pool - were done to support the design. This paper describes the results of the CFD analyses of the IRF chimney. (author)

  15. Software quality assurance plan for the National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project

  16. Computer-guided facility for the study of single crystals at the gamma diffractometer GADI

    International Nuclear Information System (INIS)

    In the study of solid-state properties it is in many cases necessary to work with single crystals. The increased demand in industry and research, as well as the desire for better characterization by means of γ-diffractometry, made it necessary to improve and modernize the existing instrument. The advantages of a computer-guided facility over conventional, semi-automatic operation are manifold. Not only the process control, but also the data acquisition and evaluation are performed by the computer. Using a remote control, the operator is able to quickly find a reflection and to drive the crystal into any desired measuring position. The complete logging of all important measuring parameters, the convenient data storage, as well as the automatic evaluation are very useful for the user. Finally, the measuring time can be extended to practically 24 hours per day. This puts characterization by means of γ-diffractometry on a completely new level. (orig.)

  17. Computer-based data acquisition system in the Large Coil Test Facility

    International Nuclear Information System (INIS)

    The utilization of computers for data acquisition and control is of paramount importance on large-scale fusion experiments because they feature the ability to acquire data from a large number of sensors at various sample rates and provide for flexible data interpretation, presentation, reduction, and analysis. In the Large Coil Test Facility (LCTF) a Digital Equipment Corporation (DEC) PDP-11/60 host computer with the DEC RSX-11M operating system coordinates the activities of five DEC LSI-11/23 front-end processors (FEPs) via direct memory access (DMA) communication links. This provides host control of scheduled data acquisition and FEP event-triggered data collection tasks. Four of the five FEPs have no operating system

  18. Investigating Preterm Care at the Facility Level: Stakeholder Qualitative Study in Central and Southern Malawi.

    Science.gov (United States)

    Gondwe, Austrida; Munthali, Alister; Ashorn, Per; Ashorn, Ulla

    2016-07-01

    Objectives: Malawi is estimated to have one of the highest preterm birth rates in the world. However, care of preterm infants at the facility level in Malawi has not been explored. We aimed to explore the views of health stakeholders about the care of preterm infants in health facilities and the existence of any policy protocol documents guiding the delivery of care to these infants. Methods: We conducted 16 in-depth interviews with health stakeholders (11 service providers and 5 policy makers) using an interview guide and asked for any existing policy protocol documents guiding care for preterm infants in the health facilities in Malawi. The collected documents were reviewed and all the interviews were digitally recorded, transcribed and translated. All data were analysed using a content analysis approach. Results: We identified four policy protocol documents and, out of these, one had detailed information explaining the care of preterm infants. Policy makers reported that policy protocol documents to guide care for preterm infants were available in the health facilities, but the majority (63.6 %) of the service providers lacked knowledge about the existence of these documents. Health stakeholders reported several challenges in caring for preterm infants, including a lack of staff trained in preterm infant care, antibiotics, space and supervision, and a poor referral system. Conclusions: Our study highlights that improving health care service providers' knowledge of preterm infant care is an integral part of care around preterm childbirth. Our findings suggest that policy makers and health decision makers should retain those trained in preterm newborn care in the health facility's preterm unit. PMID:26976282

  19. Stand alone computer system to aid the development of Mirror Fusion Test Facility rf heating systems

    International Nuclear Information System (INIS)

    The Mirror Fusion Test Facility (MFTF-B) control system architecture requires the Supervisory Control and Diagnostic System (SCDS) to communicate with a LSI-11 Local Control Computer (LCC) that in turn communicates via a fiber optic link to CAMAC based control hardware located near the machine. In many cases, the control hardware is very complex and requires a sizable development effort prior to being integrated into the overall MFTF-B system. One such effort was the development of the Electron Cyclotron Resonance Heating (ECRH) system. It became clear that a stand alone computer system was needed to simulate the functions of SCDS. This paper describes the hardware and software necessary to implement the SCDS Simulation Computer (SSC). It consists of a Digital Equipment Corporation (DEC) LSI-11 computer and a Winchester/Floppy disk operating under the DEC RT-11 operating system. All application software for MFTF-B is programmed in PASCAL, which allowed us to adapt procedures originally written for SCDS to the SSC. This nearly identical software interface means that software written during the equipment development will be useful to the SCDS programmers in the integration phase

  20. Enhanced Computational Infrastructure for Data Analysis at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available in between pulses, giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator, since the DIII-D National Team consists of scientists from 9 national laboratories, 19 foreign laboratories, 16 universities, and 5 industrial partnerships. As a result of this work, DIII-D data is available on a 24 x 7 basis from a set of viewing and analysis tools that can be run on either the collaborators' or DIII-D's computer systems. Additionally, a web-based data and code documentation system has been created to aid the novice and expert user alike
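    As an illustration of the unified data-access interface mentioned above, the sketch below shows how a remote collaborator might read an analyzed signal through an MDSplus thin-client connection from Python. The server address, tree name, shot number and node path are hypothetical placeholders, not actual DIII-D values.

```python
# Minimal sketch of remote access to analyzed data via an MDSplus thin-client
# connection. Server, tree, shot number and node path are hypothetical.
from MDSplus import Connection

conn = Connection("mdsplus.example.org")      # hypothetical data server
conn.openTree("analysis", 123456)             # hypothetical tree and shot number

# Fetch a signal and its timebase; dim_of() is the TDI function for a node's dimension.
density = conn.get(r"\TOP.RESULTS:DENSITY").data()
time = conn.get(r"dim_of(\TOP.RESULTS:DENSITY)").data()
print(f"{len(density)} samples spanning t = {time[0]:.3f} .. {time[-1]:.3f} s")
```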

  1. Enhanced computational infrastructure for data analysis at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available in between pulses giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator since the DIII-D National Team consists of scientists from nine national laboratories, 19 foreign laboratories, 16 universities, and five industrial partnerships. As a result of this work, DIII-D data is available on a 24x7 basis from a set of viewing and analysis tools that can be run on either the collaborators' or DIII-D's computer systems. Additionally, a web based data and code documentation system has been created to aid the novice and expert user alike

  2. A centralized hazardous waste treatment plant: the facilities of the ZVSMM at Schwabach as an example

    Energy Technology Data Exchange (ETDEWEB)

    Amsoneit, Norbert [Zweckverband Sondermuell-Entsorgung Mittelfranken, Rednitzhembach (Germany)

    1993-12-31

    In this work a centralized hazardous waste treatment plant is described and its infrastructure is presented. Special emphasis is given to the handling of the residues produced, the different treatment processes and the final disposal. 2 refs., 4 figs.

  3. Application of personal computer to development of entrance management system for radiating facilities

    International Nuclear Information System (INIS)

    The report describes a system for managing the entrance and exit of personnel at radiation facilities. A personal computer is applied to its development. The major features of the system are outlined first. The computer is connected to the gate and to two magnetic card readers provided at the gate. The gate, which is installed at the entrance to a room under control, opens only for those who have a valid card. The entrance-exit management program developed is described next. The following three files are used: an ID master file (a random-access file of the magnetic card number, name, qualification, etc., of each card carrier), an entrance-exit management file (a random-access file of the times of entrance/exit, etc., updated every day), and an entrance-exit record file (a sequential file of card number, name, date, etc.); all are stored on floppy disks. A display is provided to show various lists, including a list of workers currently in the room and a list of workers who left the room earlier in the day. This system is useful for entrance management of a relatively small facility. Despite its small cost, it enables effective personnel management with only a few operators. (N.K.)
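    The three files described above suggest a simple record layout; a minimal sketch of such records and a "who is in the room" query is given below. The field names and types are illustrative assumptions, not taken from the original system.

```python
# Sketch of the three record types described above (field names are illustrative).
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class IdMasterRecord:            # random-access master file: one record per card
    card_number: str
    name: str
    qualification: str

@dataclass
class EntranceExitStatus:        # daily management file, updated on each gate event
    card_number: str
    entered_at: Optional[datetime]
    left_at: Optional[datetime]

@dataclass
class EntranceExitLogEntry:      # sequential record file, appended for every event
    card_number: str
    name: str
    timestamp: datetime
    direction: str               # "in" or "out"

def workers_in_room(status: List[EntranceExitStatus]) -> List[str]:
    """Card numbers of workers currently inside (entered but not yet left)."""
    return [r.card_number for r in status
            if r.entered_at is not None and r.left_at is None]
```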

  4. Investigation of Storage Options for Scientific Computing on Grid and Cloud Facilities

    International Nuclear Information System (INIS)

    In recent years, several new storage technologies, such as Lustre, Hadoop, OrangeFS, and BlueArc, have emerged. While several groups have run benchmarks to characterize them under a variety of configurations, more work is needed to evaluate these technologies for the use cases of scientific computing on Grid clusters and Cloud facilities. This paper discusses our evaluation of the technologies as deployed on a test bed at FermiCloud, one of the Fermilab infrastructure-as-a-service Cloud facilities. The test bed consists of 4 server-class nodes with 40 TB of disk space and up to 50 virtual machine clients, some running on the storage server nodes themselves. With this configuration, the evaluation compares the performance of some of these technologies when deployed on virtual machines and on “bare metal” nodes. In addition to running standard benchmarks such as IOZone to check the sanity of our installation, we have run I/O intensive tests using physics-analysis applications. This paper presents how the storage solutions perform in a variety of realistic use cases of scientific computing. One interesting difference among the storage systems tested is found in a decrease in total read throughput with increasing number of client processes, which occurs in some implementations but not others.
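    The throughput-versus-client-count behaviour noted above can be probed with a very simple multi-process read test; the sketch below only illustrates that kind of measurement and is not the benchmark suite used in the study. The mount point and file names are placeholders.

```python
# Toy aggregate-read-throughput test: N reader processes stream files from the
# storage system under test and the combined MB/s is reported. Paths are placeholders.
import os
import time
from multiprocessing import Pool

MOUNT = "/mnt/teststorage"          # hypothetical mount point of the storage under test
BLOCK = 1024 * 1024                 # 1 MiB read size

def read_file(path: str) -> int:
    """Sequentially read one file and return the number of bytes read."""
    total = 0
    with open(path, "rb", buffering=0) as f:
        while chunk := f.read(BLOCK):
            total += len(chunk)
    return total

def aggregate_throughput(num_clients: int) -> float:
    files = [os.path.join(MOUNT, f"testfile_{i}.dat") for i in range(num_clients)]
    start = time.time()
    with Pool(num_clients) as pool:
        bytes_read = sum(pool.map(read_file, files))
    return bytes_read / (time.time() - start) / 1e6   # MB/s, all clients combined

if __name__ == "__main__":
    for n in (1, 2, 4, 8, 16):
        print(f"{n:2d} clients: {aggregate_throughput(n):8.1f} MB/s aggregate")
```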

  5. Techniques for measuring aerosol attenuation using the Central Laser Facility at the Pierre Auger Observatory

    OpenAIRE

    PIERRE AUGER Collaboration; Abreu, P; Pastor, Sergio

    2013-01-01

    The Pierre Auger Observatory in Malargue, Argentina, is designed to study the properties of ultra-high energy cosmic rays with energies above 10(18) eV. It is a hybrid facility that employs a Fluorescence Detector to perform nearly calorimetric measurements of Extensive Air Shower energies. To obtain reliable calorimetric information from the FD, the atmospheric conditions at the observatory need to be continuously monitored during data acquisition. In particular, light attenuation due to aer...

  6. Addressing capability computing challenges of high-resolution global climate modelling at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin

    2014-05-01

    During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25 degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in size from 20 MB to over 100 GB. Effective utilization of leadership class resources requires careful planning and preparation. The application software, such as CESM, needs to be ported, optimized and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours", are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan is a Cray XK7 system, capable of a theoretical peak performance of over 27 PFlop/s, and consists of 18,688 compute nodes, each with an NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560

  7. A framework design of computing environment for the material and life science facility at J-PARC

    International Nuclear Information System (INIS)

    We have started to design a framework for the computing environment of the Material and Life Science Facility at J-PARC. Some developments are in progress. We have also just started to collaborate with the muon group of J-PARC on common software developments and with the computing center of JAERI on a large-scale computing environment. Security control is always a delicate and difficult problem. Not only network security but also the definition of access privileges to databases must be handled carefully. International collaboration is also under consideration. It is not easy to have the same computing environment as other facilities, but the data format can be standardized (e.g. the NeXus format) and the basic usage of experiment control can be the same. Such standardization brings benefits to users who often use several facilities. (author)
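    Since the abstract singles out the NeXus format as a candidate for data standardization, a minimal sketch of writing a NeXus-style HDF5 file with h5py is given below. The group and field names follow common NeXus conventions (NXentry/NXdata); the actual J-PARC file layout may differ.

```python
# Minimal NeXus-style HDF5 file written with h5py; names follow generic NeXus
# conventions and are not the actual J-PARC layout.
import numpy as np
import h5py

with h5py.File("scan0001.nxs", "w") as f:
    entry = f.create_group("entry")
    entry.attrs["NX_class"] = "NXentry"
    entry["title"] = "example time-of-flight scan"

    data = entry.create_group("data")
    data.attrs["NX_class"] = "NXdata"
    tof = data.create_dataset("time_of_flight", data=np.linspace(0.0, 20.0, 1000))
    tof.attrs["units"] = "ms"
    data.create_dataset("counts", data=np.random.poisson(100, size=1000))
    data.attrs["signal"] = "counts"           # dataset holding the plottable signal
    data.attrs["axes"] = "time_of_flight"
```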

  8. Centralized Monitoring of the Microsoft Windows-based computers of the LHC Experiment Control Systems

    Science.gov (United States)

    Varela Rodriguez, F.

    2011-12-01

    The control system of each of the four major Experiments at the CERN Large Hadron Collider (LHC) is distributed over up to 160 computers running either Linux or Microsoft Windows. A quick response to abnormal situations of the computer infrastructure is crucial to maximize the physics usage. For this reason, a tool was developed to supervise, identify errors and troubleshoot such a large system. Although the monitoring of the performance of the Linux computers and their processes has been available since the first versions of the tool, it is only recently that the software package has been extended to provide similar functionality for the nodes running Microsoft Windows, as this platform is the most commonly used in the LHC detector control systems. In this paper, the architecture and the functionality of the Windows Management Instrumentation (WMI) client developed to provide centralized monitoring of the nodes running different flavours of the Microsoft platform, as well as the interface to the SCADA software of the control systems, are presented. The tool is currently being commissioned by the Experiments and it has already proven to be very effective in optimizing the running systems and in detecting misbehaving processes or nodes.
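    To make the idea of WMI-based monitoring concrete, the sketch below shows the kind of per-node query such a client could issue, using the third-party 'wmi' Python package on Windows. It is purely illustrative; the actual tool and its SCADA interface are not reproduced here.

```python
# Illustrative WMI queries of the kind a node monitor might run (requires the
# third-party 'wmi' package and Windows). Not the actual monitoring tool.
import wmi

node = wmi.WMI()                  # local node; use wmi.WMI(computer="host") for remote

# Operating-system level health, e.g. free physical memory.
for osinfo in node.Win32_OperatingSystem():
    print("Free physical memory (kB):", osinfo.FreePhysicalMemory)

# Per-process resource usage, useful for spotting misbehaving processes.
for proc in node.Win32_Process():
    print(proc.ProcessId, proc.Name, proc.WorkingSetSize)
```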

  9. Achieving production-level use of HEP software at the Argonne Leadership Computing Facility

    Science.gov (United States)

    Uram, T. D.; Childers, J. T.; LeCompte, T. J.; Papka, M. E.; Benjamin, D.

    2015-12-01

    HEP's demand for computing resources has grown beyond the capacity of the Grid, and these demands will accelerate with the higher energy and luminosity planned for Run II. Mira, the ten petaFLOPs supercomputer at the Argonne Leadership Computing Facility, is a potentially significant compute resource for HEP research. Through an award of fifty million hours on Mira, we have delivered millions of events to LHC experiments by establishing the means of marshaling jobs through serial stages on local clusters, and parallel stages on Mira. We are running several HEP applications, including Alpgen, Pythia, Sherpa, and Geant4. Event generators, such as Sherpa, typically have a split workload: a small scale integration phase, and a second, more scalable, event-generation phase. To accommodate this workload on Mira we have developed two Python-based Django applications, Balsam and ARGO. Balsam is a generalized scheduler interface which uses a plugin system for interacting with scheduler software such as HTCondor, Cobalt, and TORQUE. ARGO is a workflow manager that submits jobs to instances of Balsam. Through these mechanisms, the serial and parallel tasks within jobs are executed on the appropriate resources. This approach and its integration with the PanDA production system will be discussed.
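    The scheduler-plugin pattern described above can be illustrated with a small, hypothetical interface: one abstract submit/status contract plus a backend-specific implementation (here Cobalt, the scheduler used on Mira). This is a sketch of the pattern only, not the actual Balsam or ARGO code or API.

```python
# Hypothetical scheduler-plugin interface illustrating the pattern described above.
# Not the Balsam API; Cobalt's qsub/qstat commands are used as the example backend.
from abc import ABC, abstractmethod
import subprocess

class SchedulerPlugin(ABC):
    @abstractmethod
    def submit(self, script_path: str, nodes: int, walltime_min: int) -> str:
        """Submit a job script and return the scheduler's job id."""

    @abstractmethod
    def status(self, job_id: str) -> str:
        """Return a normalized state such as 'queued', 'running' or 'finished'."""

class CobaltPlugin(SchedulerPlugin):
    def submit(self, script_path, nodes, walltime_min):
        result = subprocess.run(
            ["qsub", "-n", str(nodes), "-t", str(walltime_min), script_path],
            capture_output=True, text=True, check=True)
        return result.stdout.strip()        # Cobalt prints the job id on stdout

    def status(self, job_id):
        result = subprocess.run(["qstat", job_id], capture_output=True, text=True)
        return "finished" if result.returncode != 0 else "queued-or-running"
```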

  10. The Overview of the National Ignition Facility Distributed Computer Control System

    International Nuclear Information System (INIS)

    The Integrated Computer Control System (ICCS) for the National Ignition Facility (NIF) is a layered architecture of 300 front-end processors (FEP) coordinated by supervisor subsystems including automatic beam alignment and wavefront control, laser and target diagnostics, pulse power, and shot control timed to 30 ps. FEP computers incorporate either VxWorks on PowerPC or Solaris on UltraSPARC processors that interface to over 45,000 control points attached to VME-bus or PCI-bus crates respectively. Typical devices are stepping motors, transient digitizers, calorimeters, and photodiodes. The front-end layer is divided into another segment comprised of an additional 14,000 control points for industrial controls including vacuum, argon, synthetic air, and safety interlocks implemented with Allen-Bradley programmable logic controllers (PLCs). The computer network is augmented by asynchronous transfer mode (ATM) links that deliver video streams from the 500 sensor cameras monitoring the 192 laser beams to operator workstations. Software is based on an object-oriented framework using CORBA distribution that incorporates services for archiving, machine configuration, graphical user interface, monitoring, event logging, scripting, alert management, and access control. Software coding using a mixed language environment of Ada95 and Java is one-third complete at over 300 thousand source lines. Control system installation is currently under way for the first 8 beams, with project completion scheduled for 2008

  11. Central Storage Facility Project In Colombia To Provide The Safe Storage And Protection Of High-Activity Radioactive Sources

    International Nuclear Information System (INIS)

    The Global Threat Reduction Initiative (GTRI) reduces and protects vulnerable nuclear and radiological material located at civilian sites worldwide. Internationally, over 40 countries are cooperating with GTRI to enhance the security of these materials. The GTRI program has worked successfully with foreign countries to remove and protect nuclear and radioactive materials, including orphaned and disused high-activity sources. GTRI began cooperation with the Republic of Colombia in April 2004. This cooperation has been a resounding success, securing forty high-risk sites, consolidating disused and orphan sources at an interim secure national storage facility, and developing a comprehensive approach to security, training, and sustainability. In 2005 the Colombian Ministry of Mines and Energy requested the Department of Energy's support in the construction of a new Central Storage Facility (CSF). In December 2005, the Ministry decided to construct this facility at the Institute of Geology and Mining (Ingeominas) site in Bogota. This site already served as Colombia's national repository, where disused sources were housed in various buildings around the complex. The CSF project was placed under contract in May 2006, but environmental issues and public protests, which led to a class action lawsuit against the Colombian Government, forced the Ministry to quickly suspend activities, thereby placing the project in jeopardy. Despite these challenges, however, the Ministry of Mines and Energy worked closely with public and environmental authorities to resolve these issues, and continued to be a strong advocate of the GTRI program. In June 2008, the Ministry of Mines and Energy was granted the construction and environmental licenses. As a result, construction immediately resumed and the CSF was completed by December 2008. A commissioning ceremony was held for the new facility in January 2009, which was attended by representatives from the Department of Energy, U.S. Embassy, and

  12. Multiplanar and two-dimensional imaging of central airway stenting with multidetector computed tomography

    Directory of Open Access Journals (Sweden)

    Ozgul Mehmet

    2012-08-01

    Background: Multidetector computed tomography (MDCT) provides guidance for primary screening of the central airways. The aim of our study was to assess the contribution of multidetector computed tomography with two-dimensional reconstruction to the management of patients with tracheobronchial stenosis prior to the procedure and during a short follow-up period of 3 months after the endobronchial treatment. Methods: This is a retrospective study with data collected from an electronic database and from the medical records. Patients evaluated with MDCT who had undergone a stenting procedure were included. A Philips RSGDT 07605 model MDCT was used with the following parameters: slice thickness, 3 mm; overlap, 1.5 mm; matrix, 512x512; mAs, 90; and kV, 120. The diameters of the airways 10 mm proximal and 10 mm distal to the obstruction were measured, and the stent diameter (D) was determined as the average of the upper and lower diameters. Results: Fifty-six patients, 14 (25%) women and 42 (75%) men, mean age 55.3 ± 13.2 years (range: 16-79 years), were assessed by MDCT and then treated with placement of an endobronchial stent. The computed tomography review was made with a 6-detector Philips RSGDT 07605 multidetector computed tomography device. Endobronchial therapy was provided for the patients with endoluminal lesions. Stents were placed into the area of stenosis in patients with external compression after dilatation and debulking procedures had been carried out. In one patient, migration of a stent was detected during the follow-up period by using MDCT. Conclusions: MDCT helps to define stent size, length and type in patients who are suitable for endobronchial stenting. This is a non-invasive, reliable method that helps decisions about optimal stent size and position, thus reducing complications.

  13. Integrated assessment of a new Waste-to-Energy facility in Central Greece in the context of regional perspectives.

    Science.gov (United States)

    Perkoulidis, G; Papageorgiou, A; Karagiannidis, A; Kalogirou, S

    2010-07-01

    The main aim of this study is the integrated assessment of a proposed Waste-to-Energy facility that could contribute to the Municipal Solid Waste Management system of the Region of Central Greece. In the context of this paper, alternative transfer schemes for supplying the candidate facility were assessed, considering local conditions and economic criteria. A mixed-integer linear programming model was applied for the determination of optimum locations of Transfer Stations for an efficient supply chain between the waste producers and the Waste-to-Energy facility. Moreover, different Regional Waste Management Scenarios were assessed against multiple criteria, via the Multi Criteria Decision Making method ELECTRE III. The chosen criteria were total cost, Biodegradable Municipal Waste diversion from landfill, energy recovery and Greenhouse Gas emissions, and the analysis demonstrated that a Waste Management Scenario based on a Waste-to-Energy plant with an adjacent landfill for disposal of the residues would be the best performing option for the Region, depending however on the priorities of the decision makers. In addition, the study demonstrated that efficient planning is necessary and that the case of three sanitary landfills operating in parallel with the WtE plant in the study area should be avoided. Moreover, alternative cases of energy recovery of the candidate Waste-to-Energy facility were evaluated against the requirements of the new European Commission Directive on waste in order for the facility to be recognized as a recovery operation. The latter issue is of high significance and the decision makers in European Union countries should take it into account from now on, in order to plan and implement facilities that recover energy efficiently. Finally, a sensitivity check was performed in order to evaluate the effects of an increased recycling rate on the calorific value of treated Municipal Solid Waste and the gate fee of the candidate plant and found that increased

  14. Integrated assessment of a new Waste-to-Energy facility in Central Greece in the context of regional perspectives

    International Nuclear Information System (INIS)

    The main aim of this study is the integrated assessment of a proposed Waste-to-Energy facility that could contribute to the Municipal Solid Waste Management system of the Region of Central Greece. In the context of this paper, alternative transfer schemes for supplying the candidate facility were assessed, considering local conditions and economic criteria. A mixed-integer linear programming model was applied for the determination of optimum locations of Transfer Stations for an efficient supply chain between the waste producers and the Waste-to-Energy facility. Moreover, different Regional Waste Management Scenarios were assessed against multiple criteria, via the Multi Criteria Decision Making method ELECTRE III. The chosen criteria were total cost, Biodegradable Municipal Waste diversion from landfill, energy recovery and Greenhouse Gas emissions, and the analysis demonstrated that a Waste Management Scenario based on a Waste-to-Energy plant with an adjacent landfill for disposal of the residues would be the best performing option for the Region, depending however on the priorities of the decision makers. In addition, the study demonstrated that efficient planning is necessary and that the case of three sanitary landfills operating in parallel with the WtE plant in the study area should be avoided. Moreover, alternative cases of energy recovery of the candidate Waste-to-Energy facility were evaluated against the requirements of the new European Commission Directive on waste in order for the facility to be recognized as a recovery operation. The latter issue is of high significance and the decision makers in European Union countries should take it into account from now on, in order to plan and implement facilities that recover energy efficiently. Finally, a sensitivity check was performed in order to evaluate the effects of an increased recycling rate on the calorific value of treated Municipal Solid Waste and the gate fee of the candidate plant and found that increased

  15. The FOSS GIS Workbench on the GFZ Load Sharing Facility compute cluster

    Science.gov (United States)

    Löwe, P.; Klump, J.; Thaler, J.

    2012-04-01

    Compute clusters can be used as GIS workbenches; their wealth of resources allows us to take on geocomputation tasks which exceed the limitations of smaller systems. Harnessing these capabilities requires a Geographic Information System (GIS) able to utilize the available cluster configuration/architecture and a sufficient degree of user friendliness to allow for wide application. In this paper we report on the first successful porting of GRASS GIS, the oldest and largest Free Open Source (FOSS) GIS project, onto a compute cluster using Platform Computing's Load Sharing Facility (LSF). In 2008, GRASS 6.3 was installed on the GFZ compute cluster, which at that time comprised 32 nodes. The interaction with the GIS was limited to the command line interface, which required further development to encapsulate the GRASS GIS business layer and facilitate its use by users not familiar with GRASS GIS. During the summer of 2011, multiple versions of GRASS GIS (v6.4, 6.5 and 7.0) were installed on the upgraded GFZ compute cluster, now consisting of 234 nodes with 480 CPUs providing 3084 cores. The GFZ compute cluster currently offers 19 different processing queues with varying hardware capabilities and priorities, allowing for fine-grained scheduling and load balancing. After successful testing of core GIS functionalities, including the graphical user interface, mechanisms were developed to deploy scripted geocomputation tasks onto dedicated processing queues. The mechanisms are based on earlier work by NETELER et al. (2008). A first application of the new GIS functionality was the generation of maps of simulated tsunamis in the Mediterranean Sea for the Tsunami Atlas of the FP-7 TRIDEC Project (www.tridec-online.eu). For this, up to 500 processing nodes were used in parallel. Further trials included the processing of geometrically complex problems, requiring significant amounts of processing time. The GIS cluster successfully completed all these tasks, with processing times
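    As an illustration of dispatching a scripted GRASS task onto an LSF queue, the sketch below wraps LSF's bsub command around a GRASS batch run (GRASS 6.x executes the script pointed to by GRASS_BATCH_JOB and then exits). Queue name, location, mapset and script path are hypothetical.

```python
# Sketch of submitting a scripted GRASS GIS job to an LSF processing queue.
# Queue, location, mapset and script below are placeholders.
import os
import subprocess

def submit_grass_job(script: str, location_path: str, mapset: str, queue: str) -> None:
    env = os.environ.copy()
    env["GRASS_BATCH_JOB"] = script          # GRASS runs this script non-interactively
    cmd = [
        "bsub",
        "-q", queue,                                   # target LSF processing queue
        "-o", f"{os.path.basename(script)}.%J.out",    # job log (%J = LSF job id)
        "grass64", "-text", f"{location_path}/{mapset}",
    ]
    subprocess.run(cmd, env=env, check=True)

# Hypothetical usage: render one tsunami scenario map on a long-running queue.
submit_grass_job("/home/user/make_tsunami_map.sh",
                 "/data/grassdata/med_utm33", "PERMANENT", "geocomp_long")
```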

  16. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  17. Development of a personal computer based facility-level SSAC component and inspector support system

    International Nuclear Information System (INIS)

    Research Contract No. 4658/RB was conducted between the IAEA and the Bulgarian Committee on Use of Atomic Energy for Peaceful Purposes. The contract required the Committee to develop and program a personal computer based software package to be used as a facility-level computerized State System of Accounting and Control (SSAC) at an off-load power reactor. The software delivered, called the National Safeguards System (NSS), keeps track of all fuel assembly activity at a power reactor and generates all ledgers, MBA material balances and any required reports to national or international authorities. The NSS is designed to operate on a PC/AT or compatible equipment with a hard disk of 20 MB, a color graphics monitor or adaptor and at least one floppy disk drive (360 Kb). The programs are written in Basic (compiler 2.0). They are executed under MS DOS 3.1 or later

  18. Lustre Distributed Name Space (DNE) Evaluation at the Oak Ridge Leadership Computing Facility (OLCF)

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, James S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Leverman, Dustin B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Hanley, Jesse A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Oral, Sarp [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences

    2016-08-22

    This document describes the Lustre Distributed Name Space (DNE) evaluation carried out at the Oak Ridge Leadership Computing Facility (OLCF) between 2014 and 2015. DNE is a development project funded by OpenSFS to improve Lustre metadata performance and scalability. The development effort has been split into two parts, the first part (DNE P1) providing support for remote directories over remote Lustre Metadata Server (MDS) nodes and Metadata Target (MDT) devices, while the second phase (DNE P2) addressed striped directories over multiple remote MDS nodes and MDT devices. The OLCF has been actively evaluating the performance, reliability, and functionality of both DNE phases. For these tests, internal OLCF testbeds were used. Results are promising and OLCF is planning a full DNE deployment on production systems in the mid-2016 timeframe.

  19. [Elderlies in street situation or social vulnerability: facilities and difficulties in the use of computational tools].

    Science.gov (United States)

    Frias, Marcos Antonio da Eira; Peres, Heloisa Helena Ciqueto; Pereira, Valclei Aparecida Gandolpho; Negreiros, Maria Célia de; Paranhos, Wana Yeda; Leite, Maria Madalena Januário

    2014-01-01

    This study aimed to identify the advantages and difficulties encountered by older people living on the streets or in social vulnerability when using the computer or the internet. It is an exploratory qualitative study in which five elderly people, attended by a non-governmental organization located in the city of São Paulo, participated. The discourses were analyzed by the content analysis technique and showed, among the facilitating factors, clarifying doubts with the monitors, the stimulus for new discoveries coupled with proactivity and curiosity, and developing new skills. The difficulties mentioned were related to physical or cognitive issues, the lack of an instructor, and the lack of knowledge of how to interact with the machine. Studies focusing on the elderly population living on the streets or in social vulnerability may contribute evidence to guide the formulation of public policies for this population. PMID:25517671

  20. MONITOR: A computer model for estimating the costs of an integral monitored retrievable storage facility

    International Nuclear Information System (INIS)

    The MONITOR model is a FORTRAN 77-based computer code that provides parametric life-cycle cost estimates for a monitored retrievable storage (MRS) facility. MONITOR is very flexible in that it can estimate the costs of an MRS facility operating under almost any conceivable nuclear waste logistics scenario. The model can also accommodate input data of varying degrees of complexity and detail (ranging from very simple to more complex), which makes it ideal for use in the MRS program, where new designs and new cost data are frequently offered for consideration. MONITOR can be run as an independent program, or it can be interfaced with the Waste System Transportation and Economic Simulation (WASTES) model, a program that simulates the movement of waste through a complete nuclear waste disposal system. The WASTES model drives the MONITOR model by providing it with the annual quantities of waste that are received, stored, and shipped at the MRS facility. Three runs of MONITOR are documented in this report. Two of the runs are for Version 1 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2A (backup) version of the MRS cost estimate. In one of these runs MONITOR was run as an independent model, and in the other run MONITOR was run using an input file generated by the WASTES model. The two runs correspond to identical cases, and the fact that they gave identical results verified that the code performed the same calculations in both modes of operation. The third run was made for Version 2 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2B (integral) version of the MRS cost estimate. This run was made with MONITOR being run as an independent model. The results of several cases have been verified by hand calculations
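    A toy version of the parametric calculation described above is sketched below: annual quantities of waste received, stored and shipped drive variable costs on top of a fixed operating cost, and the stream is discounted to a present value. All unit costs and flows are invented placeholders, not the Parsons 2A/2B estimates or MONITOR's actual algorithms.

```python
# Toy parametric life-cycle cost driven by annual waste flows; all numbers are
# illustrative placeholders, not MONITOR's cost data.
def lifecycle_cost(annual_flows, fixed_om=25e6, unit_receive=4000.0,
                   unit_store=300.0, unit_ship=3500.0, discount_rate=0.04):
    """annual_flows: list of (received_MTU, stored_MTU, shipped_MTU), one tuple per year."""
    total = 0.0
    for year, (received, stored, shipped) in enumerate(annual_flows):
        annual_cost = (fixed_om + unit_receive * received
                       + unit_store * stored + unit_ship * shipped)
        total += annual_cost / (1.0 + discount_rate) ** year
    return total

flows = [(400, 400, 0), (800, 1200, 0), (800, 1800, 200)]   # illustrative 3-year ramp-up
print(f"Discounted life-cycle cost: ${lifecycle_cost(flows) / 1e6:.1f} M")
```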

  1. MONITOR: A computer model for estimating the costs of an integral monitored retrievable storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Reimus, P.W.; Sevigny, N.L.; Schutz, M.E.; Heller, R.A.

    1986-12-01

    The MONITOR model is a FORTRAN 77-based computer code that provides parametric life-cycle cost estimates for a monitored retrievable storage (MRS) facility. MONITOR is very flexible in that it can estimate the costs of an MRS facility operating under almost any conceivable nuclear waste logistics scenario. The model can also accommodate input data of varying degrees of complexity and detail (ranging from very simple to more complex), which makes it ideal for use in the MRS program, where new designs and new cost data are frequently offered for consideration. MONITOR can be run as an independent program, or it can be interfaced with the Waste System Transportation and Economic Simulation (WASTES) model, a program that simulates the movement of waste through a complete nuclear waste disposal system. The WASTES model drives the MONITOR model by providing it with the annual quantities of waste that are received, stored, and shipped at the MRS facility. Three runs of MONITOR are documented in this report. Two of the runs are for Version 1 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2A (backup) version of the MRS cost estimate. In one of these runs MONITOR was run as an independent model, and in the other run MONITOR was run using an input file generated by the WASTES model. The two runs correspond to identical cases, and the fact that they gave identical results verified that the code performed the same calculations in both modes of operation. The third run was made for Version 2 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2B (integral) version of the MRS cost estimate. This run was made with MONITOR being run as an independent model. The results of several cases have been verified by hand calculations.

  2. A design study for the upgraded ALICE O2 computing facility

    Science.gov (United States)

    Richter, Matthias

    2015-12-01

    An upgrade of the ALICE detector is currently being prepared for the Run 3 period of the Large Hadron Collider (LHC) at CERN, starting in 2020. The physics topics under study by ALICE during this period will require the inspection of all collisions at a rate of 50 kHz for minimum bias Pb-Pb and 200 kHz for pp and p-Pb collisions in order to extract physics signals embedded in a large background. The upgraded ALICE detector will produce more than 1 TByte/s of data. Both the collision and data rates impose new challenges on the detector readout and on the compute system. Some detectors will not use a triggered readout, which will require continuous processing of the detector data. The challenging requirements will be met by a combined online and offline facility developed and managed by the ALICE O2 project. The combined facility will accommodate the necessary substantial increase of the data taking rate. In this paper we present first results of a prototype with estimates of scalability and feasibility for a full-scale system.

  3. Centralized configuration system for a large scale farm of network booted computers

    International Nuclear Information System (INIS)

    The ATLAS trigger and data acquisition online farm is composed of nearly 3,000 computing nodes with various configurations, functions and requirements. Maintaining such a cluster is a big challenge from the computer administration point of view; thus various tools have been adopted by the System Administration team to help manage the farm efficiently. In particular, a custom central configuration system, ConfDBv2, was developed for the overall farm management. The majority of the systems are network booted and run an operating system image provided by a Local File Server (LFS) via the local area network (LAN). This method guarantees the uniformity of the system and allows, in case of issues, very fast recovery of the local disks, which can be used as scratch area. It also provides greater flexibility, as the nodes can be reconfigured and restarted with a different operating system in a very timely manner. A user-friendly web interface offers a quick overview of the current farm configuration and status, allowing changes to be applied to selected subsets or to the whole farm in an efficient and consistent manner. Also, various actions that would otherwise be time consuming and error prone can be quickly and safely executed. We describe the design, functionality and performance of this system and its web-based interface, including its integration with other CERN and ATLAS databases and with the monitoring infrastructure.

  4. Simulation of an EPRI-Nevada Test Site (NTS) hydrogen burn test at the Central Receiver Test Facility

    International Nuclear Information System (INIS)

    In order to augment results obtained from the hydrogen burn equipment survival tests performed by the Electric Power Research Institute (EPRI) at the Nevada Test Site (NTS), a series of tests was conducted at the Sandia National Laboratories Central Receiver Test Facility (CRTF). The CRTF tests simulated a 13 volume-percent burn from the EPRI-NTS series. During the CRTF tests, the thermal and operational responses of several specimens of nuclear power plant safety-related equipment were monitored when subjected to a solar heat flux simulation of a hydrogen burn. The simulation was conducted with and without steam in the vicinity of the test specimens. Prior to exposure, the specimens were preheated to temperatures corresponding to the precombustion environment in the EPRI-NTS test vessel

  5. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data
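    The kind of cost-index calculation performed by INDEXER can be illustrated with a small weighted index: component prices are compared with a base year and combined with fixed weights. The components, weights and prices below are invented for illustration and are not taken from the report.

```python
# Sketch of a composite (Laspeyres-style) facility cost index, base year = 100.
# Components, weights and prices are illustrative only.
def cost_index(base_prices, current_prices, weights):
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return 100.0 * sum(w * current_prices[k] / base_prices[k]
                       for k, w in weights.items())

base    = {"labor": 32.0, "steel": 540.0, "concrete": 95.0}    # $/unit, base year
current = {"labor": 41.5, "steel": 610.0, "concrete": 118.0}   # $/unit, current year
weights = {"labor": 0.5,  "steel": 0.3,   "concrete": 0.2}
print(f"Facility cost index: {cost_index(base, current, weights):.1f}")
```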

  6. The method of program design and C language realization of computing irradiation field in a γ-ray irradiation facility

    International Nuclear Information System (INIS)

    The methods and steps of program design for computing the irradiation field in a γ-ray irradiation facility with a pair of irradiation source frames are discussed. The following are treated separately: setting up a coordinate system in the irradiation field and calibrating the coordinates of the source rods; computing the irradiation dose rate at any point in space from a single line source; computing the irradiation dose rate at any point by superposition of multiple line sources, namely one frame; and computing the irradiation dose rate at any point by superposition of a pair of frames. A program written in a language analogous to C is given at the end.
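    The superposition described above can be illustrated with a simple point-kernel sketch: each source rod is treated as a line source, discretized into short segments, and the dose rate at a point is summed over all segments of both frames. Attenuation and build-up are omitted for brevity, and the dose-rate constant, activities and geometry are illustrative assumptions rather than values from the paper.

```python
# Point-kernel sketch: inverse-square superposition of discretized line sources
# from a pair of source frames. All numbers are illustrative; attenuation and
# build-up factors are omitted.
import numpy as np

GAMMA = 0.35  # illustrative dose-rate constant, roughly Gy*m^2/(h*TBq) for Co-60

def line_source_dose_rate(point, rod_bottom, rod_top, activity_TBq, n_seg=200):
    """Dose rate at `point` (m) from one rod, summed over n_seg point-source segments."""
    point = np.asarray(point, dtype=float)
    ends = np.linspace(np.asarray(rod_bottom, float),
                       np.asarray(rod_top, float), n_seg + 1)
    centers = 0.5 * (ends[:-1] + ends[1:])
    seg_activity = activity_TBq / n_seg
    r2 = np.sum((centers - point) ** 2, axis=1)       # squared distances, m^2
    return float(np.sum(GAMMA * seg_activity / r2))   # inverse-square superposition

def frame_dose_rate(point, rods):
    """Superpose all rods of one frame; `rods` is a list of (bottom, top, activity)."""
    return sum(line_source_dose_rate(point, b, t, a) for b, t, a in rods)

# Hypothetical pair of frames with one rod each, either side of the product path.
frame_a = [((-0.5, 0.0, 0.0), (-0.5, 0.0, 1.0), 100.0)]
frame_b = [(( 0.5, 0.0, 0.0), ( 0.5, 0.0, 1.0), 100.0)]
point = (0.0, 0.3, 0.5)
total = frame_dose_rate(point, frame_a) + frame_dose_rate(point, frame_b)
print(f"Dose rate at {point}: {total:.1f} Gy/h")
```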

  7. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009), and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current generation Petascale capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran2008 coarrays to overlap computations and communications and

  8. Use of Monte Carlo simulation for computational analysis of critical systems on IPPE's facility addressing needs of nuclear safety

    International Nuclear Information System (INIS)

    The BFS-1 critical facility was built at the Institute of Physics and Power Engineering (Obninsk, Russia) for full-scale modeling of fast-reactor cores, blankets, in-vessel shielding, and storage. BFS-1 is a fast-reactor assembly; however, it is very flexible and can easily be reconfigured to represent numerous other types of reactor designs. This paper describes specific problems in the calculation and evaluation of the neutron physics characteristics of integral experiments performed at the BFS facility. Analyses of the available integral experiments performed on different critical configurations of the BFS facility were carried out. Calculations of criticality, central reaction rate ratios, and fission rate distributions were performed with the MCNP5 Monte Carlo code using different files of evaluated nuclear data. MCNP calculations with a multigroup library of 299 energy groups were also made for comparison with the pointwise library calculations. (authors)

  9. The National Ignition Facility: Status of the Integrated Computer Control System

    International Nuclear Information System (INIS)

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately three quarters complete with over 750 thousand source lines of code having undergone off-line verification tests and deployed to the facility. NIF has entered the first phases of its laser commissioning program. NIF's highest 3ω single laser beam performance is 10.4 kJ, equivalent to 2 MJ for a fully activated

  10. The National Ignition Facility: Status of the Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Van Arsdall, P J; Bryant, R; Carey, R; Casavant, D; Demaret, R; Edwards, O; Ferguson, W; Krammen, J; Lagin, L; Larson, D; Lee, A; Ludwigsen, P; Miller, M; Moses, E; Nyholm, R; Reed, R; Shelton, R; Wuest, C

    2003-10-13

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. When completed, NIF will be the world's largest and most energetic laser experimental system, providing an international center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. NIF's 192 energetic laser beams will compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. Laser hardware is modularized into line replaceable units such as deformable mirrors, amplifiers, and multi-function sensor packages that are operated by the Integrated Computer Control System (ICCS). ICCS is a layered architecture of 300 front-end processors attached to nearly 60,000 control points and coordinated by supervisor subsystems in the main control room. The functional subsystems--beam control including automatic beam alignment and wavefront correction, laser pulse generation and pre-amplification, diagnostics, pulse power, and timing--implement automated shot control, archive data, and support the actions of fourteen operators at graphic consoles. Object-oriented software development uses a mixed language environment of Ada (for functional controls) and Java (for user interface and database backend). The ICCS distributed software framework uses CORBA to communicate between languages and processors. ICCS software is approximately three quarters complete with over 750 thousand source lines of code having undergone off-line verification tests and deployed to the facility. NIF has entered the first phases of its laser commissioning program. NIF's highest 3ω single laser beam performance is 10.4 kJ, equivalent to 2 MJ

  11. Guide to computing at ANL

    Energy Technology Data Exchange (ETDEWEB)

    Peavler, J. (ed.)

    1979-06-01

    This publication gives details about hardware, software, procedures, and services of the Central Computing Facility, as well as information about how to become an authorized user. Languages, compilers, libraries, and applications packages available are described. 17 tables. (RWR)

  12. Status of the National Ignition Facility Integrated Computer Control System (ICCS) on the Path to Ignition

    International Nuclear Information System (INIS)

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility under construction that will contain a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. NIF is comprised of 24 independent bundles of 8 beams each using laser hardware that is modularized into more than 6,000 line replaceable units such as optical assemblies, laser amplifiers, and multifunction sensor packages containing 60,000 control and diagnostic points. NIF is operated by the large-scale Integrated Computer Control System (ICCS) in an architecture partitioned by bundle and distributed among over 800 front-end processors and 50 supervisory servers. NIF's automated control subsystems are built from a common object-oriented software framework based on CORBA distribution that deploys the software across the computer network and achieves interoperation between different languages and target architectures. A shot automation framework has been deployed during the past year to orchestrate and automate shots performed at the NIF using the ICCS. In December 2006, a full cluster of 48 beams of NIF was fired simultaneously, demonstrating that the independent bundle control system will scale to full scale of 192 beams. At present, 72 beams have been commissioned and have demonstrated 1.4-Megajoule capability of infrared light. During the next two years, the control system will be expanded to include automation of target area systems including final optics, target positioners and

  13. Computational investigation of the discharge coefficient of bellmouth flow meters in engine test facilities

    Science.gov (United States)

    Sebourn, Charles Lynn

    2002-11-01

    In this thesis computation of the discharge coefficient of bellmouth flow meters installed in engine test facilities is presented. The discharge coefficient is a critical parameter for accurately calculating flow rate in any flow meter which operates by means of creating a pressure differential. Engine airflow is a critical performance parameter and therefore, it is necessary for engine test facilities to accurately measure airflow. In this report the author investigates the use of computational fluid dynamics using finite difference methods to calculate the flow in bellmouth flow meters and hence the discharge coefficient at any measurement station desired. Experimental boundary layer and core flow data was used to verify the capability of the WIND code to calculate the discharge coefficient accurately. Good results were obtained for Reynolds numbers equal to or greater than about three million which is the primary range of interest. After verifying the WIND code performance, results were calculated for a range of Reynolds numbers and Mach numbers. Also the variation in discharge coefficient as a function of measurement location was examined. It is demonstrated that by picking the proper location for pressure measurement, sensitivity to measurement location can be minimized. Also of interest was the effect of bellmouth geometry. Calculations were performed to investigate the effect of duct to bellmouth diameter ratio and the eccentricity of the bellmouth contraction. In general the effects of the beta ratio were seen to be quite small. For the eccentricity, the variation in discharge coefficient was as high as several percent for axial locations less than half a diameter downstream from the throat. The second portion of the thesis examined the effect of a turbofan engine stationed just downstream of the bellmouth flow meter. The study approximated this effect by examining a single fan stage installed in the duct. This calculation was performed by making use of a
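
    For orientation, the discharge coefficient computed in this work is conventionally defined as the ratio of the actual mass flow to the ideal mass flow implied by the measured pressures. The relation below states this in the usual one-dimensional isentropic form; it is an editorial reminder of the definition, not an equation quoted from the thesis.

        C_d = \frac{\dot{m}_{\mathrm{actual}}}{\dot{m}_{\mathrm{ideal}}}, \qquad
        \dot{m}_{\mathrm{ideal}} = \frac{p_0 A}{\sqrt{T_0}} \sqrt{\frac{\gamma}{R}}\, M \left( 1 + \frac{\gamma - 1}{2} M^2 \right)^{-\frac{\gamma + 1}{2(\gamma - 1)}}

    where the Mach number M at the measurement station follows from the measured static-to-total pressure ratio p/p_0 = (1 + \frac{\gamma-1}{2} M^2)^{-\gamma/(\gamma-1)}.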

  14. Status of the National Ignition Facility Integrated Computer Control System (ICCS) on the path to ignition

    International Nuclear Information System (INIS)

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility under construction that will contain a 192-beam, 1.8-MJ, 500-TW, ultraviolet laser system together with a 10-m diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. NIF is comprised of 24 independent bundles of eight beams each using laser hardware that is modularized into more than 6000 line replaceable units such as optical assemblies, laser amplifiers, and multi-function sensor packages containing 60,000 control and diagnostic points. NIF is operated by the large-scale Integrated Computer Control System (ICCS) in an architecture partitioned by bundle and distributed among over 800 front-end processors and 50 supervisory servers. NIF's automated control subsystems are built from a common object-oriented software framework based on CORBA distribution that deploys the software across the computer network and achieves interoperation between different languages and target architectures. A shot automation framework has been deployed during the past year to orchestrate and automate shots performed at the NIF using the ICCS. In December 2006, a full cluster of 48 beams of NIF was fired simultaneously, demonstrating that the independent bundle control system will scale to full scale of 192 beams. At present, 72 beams have been commissioned and have demonstrated 1.4-MJ capability of infrared light. During the next 2 years, the control system will be expanded in preparation for project completion in 2009 to include automation of target area systems including final optics

  15. New challenges for HEP computing: RHIC [Relativistic Heavy Ion Collider] and CEBAF [Continuous Electron Beam Accelerator Facility]

    International Nuclear Information System (INIS)

    We will look at two facilities: RHIC and CEBAF. CEBAF is in the construction phase; RHIC is about to begin construction. For each of them, we examine the kinds of physics measurements that motivated their construction, and the implications of these experiments for computing. Emphasis will be on on-line requirements, driven by the data rates produced by these experiments.

  16. A service-based SLA (Service Level Agreement) for the RACF (RHIC and ATLAS Computing Facility) at Brookhaven National Lab

    International Nuclear Information System (INIS)

    The RACF provides computing support to a broad spectrum of scientific programs at Brookhaven. The continuing growth of the facility, the diverse needs of the scientific programs and the increasingly prominent role of distributed computing require the RACF to move from a system-based to a service-based SLA with its user communities. A service-based SLA allows the RACF to coordinate the operation, maintenance and development of the facility more efficiently by mapping out a matrix of system and service dependencies and by creating a new, configurable alarm management layer that automates service alerts and notification of operations staff. This paper describes the adjustments made by the RACF to transition to a service-based SLA, including the integration of its monitoring software, alarm notification mechanism and service ticket system at the facility to make the new SLA a reality.
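
    As a sketch of what the dependency matrix and alarm layer described above might look like in practice, the short Python fragment below maps services onto the systems they depend on and turns a system fault into service-level alerts. All names (services, systems, addresses) are hypothetical illustrations, not RACF configuration.

        # Hypothetical sketch: service -> systems it depends on.
        SERVICE_DEPENDENCIES = {
            "batch-farm":   ["job-scheduler", "shared-filesystem", "network-core"],
            "mass-storage": ["tape-library", "disk-cache", "network-core"],
            "grid-gateway": ["auth-service", "network-core"],
        }

        # Hypothetical notification targets for the operations staff.
        NOTIFY = {
            "batch-farm":   ["farm-oncall@example.org"],
            "mass-storage": ["storage-oncall@example.org"],
            "grid-gateway": ["grid-oncall@example.org"],
        }

        def affected_services(failed_system):
            """Return the services whose SLA is at risk when a given system fails."""
            return [svc for svc, deps in SERVICE_DEPENDENCIES.items() if failed_system in deps]

        def raise_alerts(failed_system):
            """Generate (service, recipients) alert tuples for automated notification."""
            return [(svc, NOTIFY[svc]) for svc in affected_services(failed_system)]

        if __name__ == "__main__":
            # A fault in the core network puts every dependent service's SLA at risk.
            print(raise_alerts("network-core"))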

  17. Comparison of measured and computed plasma loading resistance in the tandem mirror experiment-upgrade (TMX-U) central cell

    International Nuclear Information System (INIS)

    The plasma loading resistance vs density plots computed with McVey's code XANTENA1 agree well with experimental measurements in the TMX-U central cell. The agreement is much better for frequencies where ω/ω_ci < 1 than for ω/ω_ci ≥ 1

  18. Surface Water Modeling Using an EPA Computer Code for Tritiated Waste Water Discharge from the Heavy Water Facility

    International Nuclear Information System (INIS)

    Tritium releases from the D-Area Heavy Water Facilities to the Savannah River have been analyzed. The U.S. EPA WASP5 computer code was used to simulate surface water transport for tritium releases from the D-Area Drum Wash, Rework, and DW facilities. The WASP5 model was qualified against the 1993 tritium measurements at U.S. Highway 301. At the maximum tritiated waste water concentrations, the calculated tritium concentration in the Savannah River at U.S. Highway 301 due to concurrent releases from the D-Area Heavy Water Facilities varies from 5.9 to 18.0 pCi/ml, depending on the operating conditions of these facilities. The calculated concentration is lowest when the batch-release method for the Drum Wash Waste Tanks is adopted

  19. Analysis of shielding calculation methods for 16- and 64-slice computed tomography facilities

    Energy Technology Data Exchange (ETDEWEB)

    Moreno, C; Cenizo, E; Bodineau, C; Mateo, B; Ortega, E M, E-mail: c_morenosaiz@yahoo.e [Servicio de Radiofísica Hospitalaria, Hospital Regional Universitario Carlos Haya, Malaga (Spain)]

    2010-09-15

    The new multislice computed tomography (CT) machines require new methods of shielding calculation, which need to be analysed. NCRP Report No. 147 proposes three shielding calculation methods based on the following dosimetric parameters: the weighted CT dose index for the peripheral axis (CTDI_w,per), the dose-length product (DLP) and isodose maps. A survey of these three methods has been carried out. For this analysis, we have used measured values of the dosimetric quantities involved and also those provided by the manufacturer, making a comparison between the results obtained. The barrier thicknesses obtained when setting up two different multislice CT instruments, a Philips Brilliance 16 or a Philips Brilliance 64, in the same room are also compared. Shielding calculation from isodose maps provides more reliable results than the other two methods, since it is the only method that takes the actual scattered radiation distribution into account. It is therefore concluded that the most suitable method for calculating the barrier thicknesses of the CT facility is the one based on isodose maps. This study also shows that for different multislice CT machines the barrier thicknesses do not necessarily become bigger as the number of slices increases, because of the great dependence on the technique used in CT protocols for different anatomical regions.
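
    For readers unfamiliar with the DLP-based method, it amounts to estimating the unshielded weekly secondary air kerma from the total dose-length product, scaling it to the occupied distance, and inverting a transmission curve (for example the Archer model) to obtain a barrier thickness. The Python sketch below only illustrates that sequence; the scatter-to-DLP factor, workload and fit parameters are placeholders that would have to be taken from NCRP Report No. 147 and the measured data discussed in the paper.

        import math

        # Placeholder inputs: replace with values from NCRP Report No. 147 and the site survey.
        kappa = 3.0e-4          # scatter air kerma per unit DLP at 1 m, cm^-1 (illustrative only)
        dlp_per_patient = 900.0 # mGy*cm per examination (illustrative)
        patients_per_week = 120
        distance_m = 3.0        # source-to-occupied-area distance
        P = 0.02                # weekly shielding design goal behind the barrier, mGy/week
        T = 1.0                 # occupancy factor

        # Unshielded weekly secondary air kerma at the occupied location and required transmission.
        K_unshielded = kappa * dlp_per_patient * patients_per_week / distance_m**2
        B_required = P / (T * K_unshielded)

        # Archer model B(x) = [(1 + b/a) * exp(a*g*x) - b/a]^(-1/g), inverted for the thickness x.
        # The fit parameters are placeholders standing in for the tabulated coefficients for lead.
        a, b, g = 2.25, 5.7, 0.55
        x_mm = (1.0 / (a * g)) * math.log((B_required**(-g) + b / a) / (1.0 + b / a))

        print("required transmission B = %.3g, barrier thickness ~ %.2f mm" % (B_required, x_mm))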

  20. EXPERIMENTAL AND COMPUTATIONAL ACTIVITIES AT THE OREGON STATE UNIVERSITY NEES TSUNAMI RESEARCH FACILITY

    Directory of Open Access Journals (Sweden)

    S.C. Yim

    2009-01-01

    A diverse series of research projects has taken place or is underway at the NEES Tsunami Research Facility at Oregon State University. Projects range from the simulation of the processes and effects of tsunamis generated by sub-aerial and submarine landslides (NEESR, Georgia Tech.), model comparisons of tsunami wave effects on bottom profiles and scouring (NEESR, Princeton University), model comparisons of wave-induced motions on rigid and free bodies (Shared-Use, Cornell), numerical model simulations and testing of breaking waves and inundation over topography (NEESR, TAMU), structural testing and development of standards for tsunami engineering and design (NEESR, University of Hawaii), and wave loads on coastal bridge structures (non-NEES), to upgrading the two-dimensional wave generator of the Large Wave Flume. A NEESR payload project (Colorado State University) was undertaken that seeks to improve the understanding of the stresses from wave loading and run-up on residential structures. Advanced computational tools for coupling fluid-structure interaction, including turbulence, contact and impact, are being developed to assist with the design of experiments and complement parametric studies. These projects will contribute towards understanding the physical processes that occur during earthquake-generated tsunamis, including structural stress, debris flow and scour, inundation and overland flow, and landslide-generated tsunamis. Analytical and numerical model development and comparisons with the experimental results give engineers additional predictive tools to assist in the development of robust structures as well as identification of hazard zones and formulation of hazard plans.

  1. Development of a computer code for shielding calculation in X-ray facilities

    International Nuclear Information System (INIS)

    The construction of an effective barrier against the ionizing radiation present in X-ray rooms requires consideration of many variables. The methodology used to specify the thickness of primary and secondary shielding for a traditional X-ray room considers the following factors: use factor, occupancy factor, distance between the source and the wall, workload, air kerma, and distance between the patient and the receptor. With these data, a computer program was developed to identify and use the variables in functions obtained through regression of the graphs provided by NCRP Report No. 147 (Structural Shielding Design for Medical X-Ray Imaging Facilities) for the calculation of the shielding of the room walls as well as the walls of the darkroom and adjacent areas. The program was validated by comparing its results with a base case provided by that report. The calculated thicknesses are given for various materials such as steel, wood and concrete. After validation, the program was applied to a real radiographic room, whose layout was visualized with the help of indoor and outdoor modeling software. The result is a user-friendly tool for planning radiographic rooms that comply with the limits established by CNEN NN 3.01, published in September 2011
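
    As a reminder of how the factors listed above enter the calculation, the classic transmission requirement for a barrier can be written as follows; this is the textbook NCRP-style expression and is included here only as an editorial illustration of the method the program automates.

        B_{\mathrm{req}} = \frac{P\, d^2}{W\, U\, T}

    where P is the weekly design dose (air kerma) limit for the occupied area, d the source-to-occupied-area distance, W the workload, U the use factor and T the occupancy factor; the wall thickness is then read from the transmission curve B(x) of the chosen material (concrete, steel, wood, ...).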

  2. Central Weighted Non-Oscillatory (CWENO) and Operator Splitting Schemes in Computational Astrophysics

    Science.gov (United States)

    Ivanovski, Stavro

    2011-05-01

    High-resolution shock-capturing (HRSC) schemes are known to be the most adequate and advanced technique for the numerical approximation of solutions of hyperbolic systems of conservation laws. Since most astrophysical phenomena can be described by means of systems of (M)HD conservation equations, finding accurate, computationally inexpensive and robust numerical approaches for their solution is a task of great importance for numerical astrophysics. Based on the Central Weighted Non-Oscillatory (CWENO) reconstruction approach, which relies on the adaptive choice of the smoothest stencil for resolving strong shocks and discontinuities in a central framework on a staggered grid, we present a new algorithm for systems of conservation laws that uses the key idea of evolving the intermediate stages of the Runge-Kutta time discretization in primitive variables. In this thesis, we introduce a new, so-called conservative-primitive variables strategy (CPVS) by integrating the latter into the earlier proposed Central Runge-Kutta schemes (Pareschi et al., 2005). The advantages of the new shock-capturing algorithm with respect to the state-of-the-art HRSC schemes used in astrophysics, such as upwind Godunov-type schemes, can be summarized as follows: (i) a Riemann-solver-free central approach; (ii) dissipation favorable for multidimensional applications in astrophysics, owing to the diffusivity coming from the design of the scheme; (iii) high accuracy and speed of the method. The latter stems from the fact that advancing in time in the predictor step does not require inversion between the primitive and conservative variables, which is essential in applications where the conservative variables are neither trivial to compute nor to invert into the set of primitive ones, as in relativistic hydrodynamics. The main objective of the research adopted in the thesis is to outline the promising application of the CWENO (with CPVS) in the problems of the
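
    To make the reconstruction idea concrete, a CWENO-type scheme assembles the reconstruction in each cell as a convex combination of low-order polynomials, with nonlinear weights that collapse onto the smoothest stencil near shocks. The generic form of these weights is sketched below; it is an editorial illustration, not the specific coefficients used in the thesis.

        R_j(x) = \sum_k w_k\, P_k(x), \qquad w_k = \frac{\alpha_k}{\sum_l \alpha_l}, \qquad \alpha_k = \frac{c_k}{(\epsilon + IS_k)^p}

    where the c_k are the optimal linear weights, IS_k are smoothness indicators of the candidate polynomials, \epsilon prevents division by zero and p controls how strongly oscillatory stencils are suppressed.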

  3. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  4. Safety Assessment Of The Centralized Storage Facility For Disused Sealed Radioactive Sources In United Republic Of Tanzania

    International Nuclear Information System (INIS)

    When SRS are no longer in use, they are declared disused and transferred to the central radioactive management facility (CRMF) belonging to the Tanzania Atomic Energy Commission (the regulatory body) and managed as radioactive waste. In order to reduce the risk associated with disused sealed radioactive sources (DSRS), the first priority is to bring them under appropriate control by the regulatory body. Once DSRS are safely managed, the regulatory body needs to assess the likelihood and potential impact of incidents, accidents and hazards in order to draw up a proper management plan. The paper applies the Analytical Hierarchy Process (AHP) to assess and allocate weights and priorities, addressing the multi-criteria nature of the management plan. Using pairwise comparisons, the relative importance of one criterion over another can be expressed. The method allows decision makers to provide judgments about the relative importance of each criterion and to estimate radiological risk by using experts' judgments of the probability of occurrence. AHP proceeds in a step-by-step manner in which the resulting priorities are shown and possible inconsistencies are determined. The information provided by experts helps to rank hazards according to probability of occurrence, potential impact and mitigation cost. The strength of the AHP method lies in its ability to incorporate both qualitative and quantitative data in decision making. AHP presents a powerful tool for weighting and prioritizing hazards in terms of occurrence probability. However, AHP also has some weak points: it requires data based on experience, knowledge and judgment, which are subjective for each decision-maker
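
    The AHP machinery referred to above is easy to sketch: a pairwise comparison matrix is reduced to a priority vector (its principal eigenvector) and a consistency ratio flags judgments that are too contradictory to use. The Python example below is generic, with made-up judgments on Saaty's 1-9 scale; it is not data from the Tanzanian facility.

        import numpy as np

        # Pairwise comparisons for three hypothetical criteria:
        # probability of occurrence, potential impact, mitigation cost.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 3.0],
                      [1/5, 1/3, 1.0]])

        # Priority vector = principal eigenvector, normalized to sum to one.
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()

        # Consistency index CI = (lambda_max - n)/(n - 1); CR = CI/RI, with RI = 0.58 for n = 3.
        n = A.shape[0]
        CI = (eigvals.real[k] - n) / (n - 1)
        CR = CI / 0.58

        print("priorities:", np.round(w, 3), " consistency ratio:", round(CR, 3))
        # Judgments are usually accepted when CR < 0.1.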

  5. Recent ORNL experience in site performance prediction: the Gas Centrifuge Enrichment Plant and the Oak Ridge Central Waste Disposal Facility

    International Nuclear Information System (INIS)

    The suitability of the Portsmouth Gas Centrifuge Enrichment Plant Landfill and the Oak Ridge, Tennessee, Central Waste Disposal Facility for disposal of low-level radioactive waste was evaluated using pathways analyses. For these evaluations, a conservative approach was selected; that is, conservatism was built into the analyses when assumptions concerning future events had to be made or when uncertainties concerning site or waste characteristics existed. Data from comprehensive laboratory and field investigations were used in developing the conceptual and numerical models that served as the basis for the numerical simulations of the long-term transport of contamination to man. However, the analyses relied on conservative scenarios to describe the generation and migration of contamination and the potential human exposure to the waste. Maximum potential doses to man were calculated and compared to the appropriate standards. Even under this conservative framework, the sites were found to provide an adequate buffer to persons outside the DOE reservations, and conclusions concerning site capacity and site acceptability were drawn. Our experience has shown that in reaching conclusions in such studies, some consideration must be given to the uncertainties and conservatisms involved in the analyses. Analytical methods to quantitatively assess the probability of future events and to quantitatively determine the sensitivity of the results to data uncertainty may prove useful in relaxing some of the conservatism built into the analyses. The applicability of such methods to pathways analyses is briefly discussed.

  6. Radiation exposure and central nervous system cancers: A case-control study among workers at two nuclear facilities

    International Nuclear Information System (INIS)

    A nested case-control study was conducted among workers employed between 1943 and 1977 at two nuclear facilities to investigate the possible association of primary malignant neoplasms of the central nervous system (CNS) with occupational exposure to ionizing radiation from external and internal sources. Eighty-nine white male and female workers who, according to the information on death certificates, died of primary CNS cancers were identified as cases. Four matched controls were selected for each case. External radiation exposure data were available from film badge readings for individual workers, whereas the radiation dose to the lung from internally deposited radionuclides, mainly uranium, was estimated from area and personnel monitoring data and was used in the analyses in lieu of the dose to the brain. Matched sets were included in the analyses only if information was available for the case and at least one of the corresponding controls. Thus, the analyses of external radiation included 27 cases and 90 matched controls, and 47 cases and 120 matched controls were analyzed for the effects of radiation from internally deposited uranium. No association was observed between deaths from CNS cancers and occupational exposure to ionizing radiation from external or internal sources. However, due to the small number of monitored subjects and low doses, a weak association could not be ruled out. 43 refs., 1 fig., 15 tabs

  7. Dynamic control of a central pattern generator circuit: a computational model of the snail feeding network.

    Science.gov (United States)

    Vavoulis, Dimitris V; Straub, Volko A; Kemenes, Ildikó; Kemenes, György; Feng, Jianfeng; Benjamin, Paul R

    2007-05-01

    Central pattern generators (CPGs) are networks underlying rhythmic motor behaviours and they are dynamically regulated by neuronal elements that are extrinsic or intrinsic to the rhythmogenic circuit. In the feeding system of the pond snail, Lymnaea stagnalis, the extrinsic slow oscillator (SO) interneuron controls the frequency of the feeding rhythm and the N3t (tonic) has a dual role; it is an intrinsic CPG interneuron, but it also suppresses CPG activity in the absence of food, acting as a decision-making element in the feeding circuit. The firing patterns of the SO and N3t neurons and their synaptic connections with the rest of the CPG are known, but how these regulate network function is not well understood. This was investigated by building a computer model of the feeding network based on a minimum number of cells (N1M, N2v and N3t) required to generate the three-phase motor rhythm together with the SO that was used to activate the system. The intrinsic properties of individual neurons were represented using two-compartment models containing currents of the Hodgkin-Huxley type. Manipulations of neuronal activity in the N3t and SO neurons in the model produced similar quantitative effects to food and electrical stimulation in the biological network indicating that the model is a useful tool for studying the dynamic properties of the feeding circuit. The model also predicted novel effects of electrical stimulation of two CPG interneurons (N1M and N2v). When tested experimentally, similar effects were found in the biological system providing further validation of our model. PMID:17561845
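
    The two-compartment, Hodgkin-Huxley-type formulation mentioned above has the generic form sketched below (a somatic and an axonal compartment coupled by an axial conductance); this is an editorial statement of the standard equations, not the specific currents or parameters of the Lymnaea model.

        C_m \frac{dV_s}{dt} = -\sum_i g_i\, m_i^{p_i} h_i^{q_i} (V_s - E_i) + g_c (V_a - V_s) + I_{\mathrm{syn}}, \qquad \frac{dx}{dt} = \frac{x_\infty(V) - x}{\tau_x(V)} \quad (x = m_i, h_i)

    with an analogous equation for the axonal potential V_a.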

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  9. Computer based plant display and digital control system of Wolsong NPP Tritium Removal Facility

    International Nuclear Information System (INIS)

    The Wolsong Tritium Removal Facility (WTRF) is an AECL-designed, first-of-a-kind facility that removes tritium from the heavy water that is used in systems of the CANDU reactors in operation at the Wolsong Nuclear Power Plant in South Korea. The Plant Display and Control System (PDCS) provides digital plant monitoring and control for the WTRF and offers the advantages of state-of-the-art digital control system technologies for operations and maintenance. The overall features of the PDCS will be described, and some of the specific approaches taken on the project to save construction time and costs, to reduce in-service life-cycle costs and to improve quality will be presented. The PDCS consists of two separate computer sub-systems: the Digital Control System (DCS) and the Plant Display System (PDS). The PDS provides the computer-based Human Machine Interface (HMI) for operators, and permits efficient supervisory or device-level monitoring and control. A System Maintenance Console (SMC) is included in the PDS for the purpose of software and hardware configuration and on-line maintenance. A Historical Data System (HDS) is also included in the PDS as a data server that continuously captures and logs process data and events for long-term storage and on-demand selective retrieval. The PDCS of WTRF has been designed and implemented based on an off-the-shelf PDS/DCS product combination, the DeltaV system from Emerson. The design includes fully redundant Ethernet network communications, controllers, power supplies and redundancy on selected I/O modules. The DCS provides fieldbus communications to interface with third-party controllers supplied on specialized skids, and supports HART communication with field transmitters. The DCS control logic was configured using a modular and graphical approach. The control strategies are primarily device control modules implemented as autonomous control loops, and implemented using IEC 61131-3 Function Block Diagram (FBD) and Structured

  10. Potential applications of artificial intelligence in computer-based management systems for mixed waste incinerator facility operation

    International Nuclear Information System (INIS)

    The Department of Energy/Oak Ridge Field Office (DOE/OR) operates a mixed waste incinerator facility at the Oak Ridge K-25 Site, designed for the thermal treatment of incinerable liquid, sludge, and solid waste regulated under the Toxic Substances Control Act (TSCA) and the Resource Conservation and Recovery Act (RCRA). Operation of the TSCA Incinerator is highly constrained as a result of the regulatory, institutional, technical, and resource availability requirements. This presents an opportunity for applying computer technology as a technical resource for mixed waste incinerator operation, to help promote and sustain a continuous performance improvement process while demonstrating compliance. This paper describes mixed waste incinerator facility performance-oriented tasks that could be assisted by Artificial Intelligence (AI) and the requirements for AI tools that would implement these algorithms in a computer-based system. 4 figs., 1 tab

  11. Verification of computer codes for dynamic processes in nuclear reactors against experiments at loop facility of IGR-1 pulse reactor

    International Nuclear Information System (INIS)

    The basic principles of the structure of the PRISDG and PRISET computer codes for analyzing dynamic processes in nuclear reactors are presented. The codes were verified against experimental studies of dynamic processes related to flow stop and power surge. The experimental data were obtained at the loop facility of the IGR-1 pulse reactor using fuel assemblies of the IVV-2M research reactor. The accuracy of the codes is the same as that achieved in the experiments. The analysis can be performed on PS-2-type personal computers, with running times of no more than several tens of minutes. (author)

  12. Transmutation of alloys in MFE facilities as calculated by REAC (a computer code system for activation and transmutation)

    International Nuclear Information System (INIS)

    A computer code system for fast calculation of activation and transmutation has been developed. The system consists of a driver code, cross-section libraries, flux libraries, a material library, and a decay library. The code is used to predict transmutations in a Ti-modified 316 stainless steel, a commercial ferritic alloy (HT9), and a V-15%Cr-5%Ti alloy in various magnetic fusion energy (MFE) test facilities and conceptual reactors
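
    The quantity such an activation and transmutation code evolves is the nuclide inventory under a given flux history; the governing balance is the flux-augmented Bateman system sketched below (an editorial reminder of the standard form, not REAC's internal notation).

        \frac{dN_i}{dt} = \sum_{j \ne i} \left( \lambda_{j \to i} + \phi\, \sigma_{j \to i} \right) N_j - \left( \lambda_i + \phi\, \sigma_i^{\mathrm{abs}} \right) N_i

    where N_i is the number density of nuclide i, \lambda are decay constants, \sigma are spectrum-averaged cross sections and \phi is the neutron flux; the cross-section, flux and decay libraries of the code system supply these data.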

  13. Intensive archaeological survey of the proposed Central Sanitary Wastewater Treatment Facility, Savannah River Site, Aiken and Barnwell Counties, South Carolina

    Energy Technology Data Exchange (ETDEWEB)

    Stephenson, D.K.; Sassaman, K.E.

    1993-11-01

    The project area for the proposed Central Sanitary Wastewater Treatment Facility on the Savannah River Site includes a six-acre tract along Fourmile Branch and 18 mi of trunk line corridors. Archaeological investigations of the six-acre parcel resulted in the discovery of one small prehistoric site designated 38AK465. This cultural resource does not have the potential to add significantly to archaeological knowledge of human occupation in the region. The Savannah River Archaeological Research Program (SRARP) therefore recommends that 38AK465 is not eligible for nomination to the National Register of Historic Places (NRHP) and further recommends a determination of no effect. Archaeological survey along the trunk line corridors involved previously recorded sites 38AK92, 38AK145, 38AK415, 38AK417, 38AK419, and 38AK436. Previous construction had severely disturbed 38AK92, and no archaeological evidence of 38AK145, 38AK419, and 38AK436 was recovered during the survey. Lacking further evidence for the existence of these sites, the SRARP recommends that 38AK92, 38AK145, 38AK419, and 38AK436 are not eligible for nomination to the NRHP and thus warrant a determination of no effect. Two of these sites, 38AK415 and 38AK417, required further investigation to evaluate their archaeological significance. Both of the sites have the potential to yield significant data on the prehistoric period occupation of the Aiken Plateau, and the SRARP recommends that they are eligible for nomination to the NRHP. The Savannah River Archaeological Research Program recommends that adverse effects to sites 38AK415 and 38AK417 from proposed construction can be mitigated through avoidance.

  14. Geochemical information for the West Chestnut Ridge Central Waste Disposal Facility for low-level radioactive waste

    Energy Technology Data Exchange (ETDEWEB)

    Seeley, F.G.; Kelmers, A.D.

    1984-06-01

    Geochemical support activities for the Central Waste Disposal Facility (CWDF) project included characterization of site materials, as well as measurement of radionuclide sorption and desorption isotherms and apparent concentration limit values under site-relevant laboratory test conditions. The radionuclide sorption and solubility information is needed as input data for the pathways analysis calculations to model expected radioactivity releases from emplaced waste to the accessible environment under various release scenarios. Batch contact methodology was used to construct sorption and desorption isotherms for a number of radionuclides likely to be present in waste to be disposed of at the site. The sorption rates for uranium and europium were rapid (> 99.8% of the total radionuclide present was adsorbed in approx. 30 min). With a constant-pH isotherm technique, uranium, strontium, cesium, and curium exhibited maximum Rs values of 4800 to > 30,000 L/kg throughout the pH range 5 to 7. Sorption ratios were generally lower at higher or lower pH levels. Retardation factors for uranium, strontium, and cesium, explored by column chromatographic tests, were consistent with the high sorption ratios measured in batch tests for these radionuclides. The addition of as little as 0.01 M organic reagent capable of forming strong soluble complexes with metals (e.g., ethylenediaminetetraacetic acid (EDTA) or citric acid) was found to reduce the sorption ratio for uranium by as much as two orders of magnitude. Substitution of an actual low-level waste site trench water for groundwater in these tests was found to give a similar reduction in the sorption ratio.
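
    For readers unfamiliar with the quantities quoted above, the batch sorption ratio and the column retardation factor are related in the standard way sketched below; this relation is an editorial note, included because it is the reason high Rs values are consistent with the column results.

        R_s \approx K_d = \frac{A_{\mathrm{solid}} / m_{\mathrm{solid}}}{A_{\mathrm{solution}} / V_{\mathrm{solution}}} \; [\mathrm{L/kg}], \qquad R_f = 1 + \frac{\rho_b}{\theta} K_d, \qquad v_{\mathrm{nuclide}} = \frac{v_{\mathrm{water}}}{R_f}

    where \rho_b is the bulk density and \theta the effective porosity of the medium.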

  15. Geochemical information for the West Chestnut Ridge Central Waste Disposal Facility for low-level radioactive waste

    International Nuclear Information System (INIS)

    Geochemical support activities for the Central Waste Disposal Facility (CWDF) project included characterization of site materials, as well as measurement of radionuclide sorption and desorption isotherms and apparent concentration limit values under site-relevant laboratory test conditions. The radionuclide sorption and solubility information is needed as input data for the pathways analysis calculations to model expected radioactivity releases from emplaced waste to the accessible environment under various release scenarios. Batch contact methodology was used to construct sorption and desorption isotherms for a number of radionuclides likely to be present in waste to be disposed of at the site. The sorption rates for uranium and europium were rapid (> 99.8% of the total radionuclide present was adsorbed in approx. 30 min). With a constant-pH isotherm technique, uranium, strontium, cesium, and curium exhibited maximum Rs values of 4800 to > 30,000 L/kg throughout the pH range 5 to 7. Sorption ratios were generally lower at higher or lower pH levels. Retardation factors for uranium, strontium, and cesium, explored by column chromatographic tests, were consistent with the high sorption ratios measured in batch tests for these radionuclides. The addition of as little as 0.01 M organic reagent capable of forming strong soluble complexes with metals [e.g., ethylenediaminetetraacetic acid (EDTA) or citric acid] was found to reduce the sorption ratio for uranium by as much as two orders of magnitude. Substitution of an actual low-level waste site trench water for groundwater in these tests was found to give a similar reduction in the sorption ratio

  16. Radionuclide migration pathways analysis for the Oak Ridge Central Waste Disposal Facility on the West Chestnut Ridge site

    International Nuclear Information System (INIS)

    A dose-to-man pathways analysis is performed for disposal of low-level radioactive waste at the Central Waste Disposal Facility on the West Chestnut Ridge Site. Both shallow land burial (trench) and aboveground (tumulus) disposal methods are considered. The waste volumes, characteristics, and radionuclide concentrations are those of waste streams anticipated from the Oak Ridge National Laboratory, the Y-12 Plant, and the Oak Ridge Gaseous Diffusion Plant. The site capacity for the waste streams is determined on the basis of the pathways analysis. The exposure pathways examined include (1) migration and transport of leachate from the waste disposal units to the Clinch River (via the groundwater medium for trench disposal and Ish Creek for tumulus disposal) and (2) those potentially associated with inadvertent intrusion following a 100-year period of institutional control: an individual resides on the site, inhales suspended particles of contaminated dust, ingests vegetables grown on the plot, consumes contaminated water from either an on-site well or from a nearby surface stream, and receives direct exposure from the contaminated soil. It is found that either disposal method would provide effective containment and isolation for the anticipated waste inventory. However, the proposed trench disposal method would provide more effective containment than tumuli because of sorption of some radionuclides in the soil. Persons outside the site boundary would receive radiation doses well below regulatory limits if they were to ingest water from the Clinch River. An inadvertent intruder could receive doses that approach regulatory limits; however, the likelihood of such intrusions and subsequent exposures is remote. 33 references, 31 figures, 28 tables

  17. Testing SLURM open source batch system for a Tier1/Tier2 HEP computing facility

    International Nuclear Information System (INIS)

    In this work the testing activities that were carried out to verify whether the SLURM batch system could be used as the production batch system of a typical Tier1/Tier2 HEP computing center are shown. SLURM (Simple Linux Utility for Resource Management) is an open source batch system developed mainly by the Lawrence Livermore National Laboratory, SchedMD, Linux NetworX, Hewlett-Packard, and Groupe Bull. Testing was focused both on verifying the functionalities of the batch system and on the performance that SLURM is able to offer. We first describe our initial set of requirements. Functionally, we started configuring SLURM so that it replicates all the scheduling policies already used in production in the computing centers involved in the test, i.e. INFN-Bari and the INFN-Tier1 at CNAF, Bologna. Currently, the INFN-Tier1 is using IBM LSF (Load Sharing Facility), while INFN-Bari, an LHC Tier2 for both CMS and ALICE, is using Torque as resource manager and MAUI as scheduler. We show how we configured SLURM in order to enable several scheduling functionalities such as hierarchical fair-share, Quality of Service, user-based and group-based priority, limits on the number of jobs per user/group/queue, job age scheduling, job size scheduling, and scheduling of consumable resources. We then show how different job typologies, like serial, MPI, multi-thread, whole-node and interactive jobs, can be managed. Tests on the use of ACLs on queues or, in general, other resources are then described. A peculiar SLURM feature we also verified is event triggers, which can be used to configure specific actions for each possible event in the batch system. We also tested highly available configurations for the master node. This feature is of paramount importance since a mandatory requirement in our scenarios is to have a working farm cluster even in case of hardware failure of the server(s) hosting the batch system. Among our requirements there is also the possibility to deal with pre-execution and post
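
    A minimal sketch of the kind of slurm.conf fragment that enables the scheduling features listed above (backfill scheduling, multifactor priority with fair-share, age, job-size and QOS weights, and accounting-enforced limits). The numeric weights are placeholders, not the values used at INFN-Bari or CNAF.

        # slurm.conf fragment (illustrative values only)
        SchedulerType=sched/backfill
        PriorityType=priority/multifactor
        PriorityDecayHalfLife=7-0
        PriorityWeightFairshare=100000
        PriorityWeightAge=1000
        PriorityWeightJobSize=1000
        PriorityWeightQOS=10000
        AccountingStorageEnforce=limits,qos
        # Per-user/group limits and QOS definitions are then managed with sacctmgr,
        # e.g. "sacctmgr modify user someuser set MaxJobs=100" (hypothetical user name).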

  18. Prevalence of enterobacteriaceae in Tupinambis merianae (Squamata: Teiidae) from a captive facility in Central Brazil, with a profile of antimicrobial drug resistance in Salmonella enterica

    OpenAIRE

    Andréa de Moraes Carvalho; Ayrton Klier Péres Júnior; Maria Auxiliadora Andrade; Valéria de Sá Jayme

    2013-01-01

    The present study reports the presence of Enterobacteriaceae in tegu lizards (Tupinambis merianae) from a captive facility in central Brazil. From a total of 30 animals, 10 juveniles and 20 adults (10 males, 10 females), 60 samples were collected in two periods separated by 15 days. The samples were cultivated on xylose-lysine-deoxycholate (XLT4) agar and MacConkey agar. The Salmonella enterica isolates were tested for antimicrobial susceptibility. A total of 78 bacteria were isolated, of which 27 were ...

  19. Improvement of Auditing Data Centralization, in Particular by Using Electronic Computers

    International Nuclear Information System (INIS)

    Nuclear materials in France are distributed among numerous installations of all types: mines, factories, reactors, research centres, storage depots. One of the problems to be solved in the management of these materials is that of centralizing the auditing data produced by all these installations. The first part of the paper gives a brief account of the French system of nuclear materials management and of the problem of auditing data centralization as it arises in this system. The main difficulty in recent years has been the time required for the monthly closing and centralization of accounts for all the installations. The steps taken to cope with these difficulties are described. The second part describes the main method adopted, namely, the use of office machinery for centralization of data, the preparation of balances and the publication of a number of periodical documents. The paper explains why and under what conditions the use of office machinery was considered advantageous, and in particular it describes the coding system employed, the document circuits, and the time involved, mention also being made of the modifications which had to be made to the existing auditing system to meet the requirements imposed by the office machinery. The third part deals with various conclusions, resulting from the introduction of machine accounting, in respect of the cost in manpower and money of this centralization of auditing data and, more generally, the cost of nuclear materials management. (author)

  20. TORAC User's Manual. A computer code for analyzing tornado-induced flow and material transport in nuclear facilities

    International Nuclear Information System (INIS)

    This manual describes the TORAC computer code, which can model tornado-induced flows, pressures, and material transport within structures. Future versions of this code will have improved analysis capabilities. In addition, it is part of a family of computer codes that is designed to provide improved methods of safety analysis for the nuclear industry. TORAC is directed toward the analysis of facility ventilation systems, including interconnected rooms and corridors. TORAC is an improved version of the TVENT computer code. In TORAC, blowers can be turned on and off and dampers can be controlled with an arbitrary time function. The material transport capability is very basic and includes convection, depletion, entrainment, and filtration of material. The input specifications for the code and a variety of sample problems are provided. 53 refs., 62 figs

  1. A Climatology of Midlatitude Continental Clouds from the ARM SGP Central Facility. Part II: Cloud Fraction and Radiative Forcing

    Science.gov (United States)

    Dong, Xiquan; Xi, Baike; Minnis, Patrick

    2006-01-01

    Data collected at the Department of Energy Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) central facility are analyzed to determine the variability of cloud fraction and radiative forcing at several temporal scales between January 1997 and December 2002. Cloud fractions are estimated for total cloud cover and for single-layer low (0-3 km), middle (3-6 km), and high clouds (greater than 6 km) using ARM SGP ground-based paired lidar-radar measurements. Shortwave (SW), longwave (LW), and net cloud radiative forcings (CRF) are derived from up- and down-looking standard precision spectral pyranometer and precision infrared radiometer measurements. The annual averages of total and single-layer, nonoverlapped low, middle and high cloud fractions are 0.49, 0.11, 0.03, and 0.17, respectively. Total and low cloud amounts were greatest from December through March and least during July and August. The monthly variation of high cloud amount is relatively small, with a broad maximum from May to August. During winter, total cloud cover varies diurnally with a small amplitude, a mid-morning maximum and an early evening minimum; during summer it changes by more than 0.14 over the daily cycle, with a pronounced early evening minimum. The diurnal variations of mean single-layer cloud cover change with season and cloud height. Annual averages of all-sky, total, and single-layer high, middle, and low LW CRFs are 21.4, 40.2, 16.7, 27.2, and 55.0 W m⁻², respectively; their SW CRFs are -41.5, -77.2, -37.0, -47.0, and -90.5 W m⁻². Their net CRFs range from -20 to -37 W m⁻². For all-sky, total, and low clouds, the maximum negative net CRFs of -40.1, -70, and -69.5 W m⁻² occur during April, while the respective minimum values of -3.9, -5.7, and -4.6 W m⁻² are found during December. July is the month with the maximum negative net CRF of -46.2 W m⁻² for middle clouds, and May has the maximum value of -45.9 W m⁻² for high clouds. An
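
    The forcing values quoted above follow the conventional surface definition, i.e. the difference between all-sky and clear-sky net fluxes at the surface; the sign convention below (an editorial note) is why the SW forcings are negative and the LW forcings positive.

        \mathrm{CRF_{SW}} = (SW^{\downarrow} - SW^{\uparrow})_{\mathrm{all}} - (SW^{\downarrow} - SW^{\uparrow})_{\mathrm{clear}}, \qquad \mathrm{CRF_{LW}} = (LW^{\downarrow} - LW^{\uparrow})_{\mathrm{all}} - (LW^{\downarrow} - LW^{\uparrow})_{\mathrm{clear}}, \qquad \mathrm{CRF_{net}} = \mathrm{CRF_{SW}} + \mathrm{CRF_{LW}}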

  2. Recent development of spent fuel management in the Central Storage Facility for Spent Fuel (CLAB) in Sweden

    International Nuclear Information System (INIS)

    The Swedish nuclear waste management programme was presented at the Advisory Group Meeting held 15-18 March 1988. As there have been only minor changes in the main strategy since then, this paper focuses on the ongoing work to expand the capacity of the Central Storage Facility for Spent Fuel, CLAB, from 3,000 tonnes of uranium (tU) to 5,000 tU within the existing storage pools. This expansion makes it possible to postpone the investment in new storage pools in a second rock cavern by 6-8 years, which gives a considerable economic advantage. Preliminary studies of several methods to realize the expansion led to a decision to use storage canisters similar to the existing ones. The new BWR canister will be designed to hold 25 fuel assemblies instead of 16, and the new PWR canister 9 assemblies instead of 5. The main problem is to maintain a sufficient margin against criticality with the new and denser fuel pattern. Two methods to achieve this have been investigated more closely: credit for the burnup of the fuel; and neutron-absorbing material in the canisters for reactivity control. Calculational work has been performed to analyze the margins which must be applied in the first case. One of the more important factors proved to be the margin required to cover the influence of axial profiles of burnup and Pu build-up for BWR fuel. Together with other necessary reactivity margins, the total penalty which has to be applied amounted to about 12 percentage points of reactivity. With the requirement of a final keff value smaller than or equal to 0.95, this implied that far too great a proportion of the fuel arriving at CLAB would fall outside the acceptable region defined by initial enrichment and actual burnup. The method of taking full credit for fuel burnup therefore had to be abandoned in favour of the other method. A preliminary analysis has been performed involving different degrees of absorber material in the canister and the fact that burnable absorbers in the fuel
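
    The burnup-credit argument above can be restated as a simple budget: the calculated multiplication factor plus all reactivity penalties must stay below the acceptance limit, and with the roughly 12 percentage points of penalty quoted in the record little room is left for much of the arriving fuel.

        k_{\mathrm{eff}}^{\mathrm{calc}}(E_0, BU) + \Delta k_{\mathrm{axial\ profile}} + \Delta k_{\mathrm{other\ margins}} \le 0.95, \qquad \Delta k_{\mathrm{total}} \approx 0.12

    where E_0 is the initial enrichment and BU the declared burnup of the assembly.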

  3. Cardiac tamponade in an infant during contrast infusion through central venous catheter for chest computed tomography; Tamponamento cardiaco durante infusao de contraste em acesso venoso central para realizacao de tomografia computadorizada do torax em lactente

    Energy Technology Data Exchange (ETDEWEB)

    Daud, Danilo Felix; Campos, Marcos Menezes Freitas de; Fleury Neto, Augusto de Padua [Hospital Geral de Palmas, TO (Brazil)]

    2013-11-15

    Complications of central venous catheterization include infection, pneumothorax, hemothorax and venous thrombosis. Pericardial effusion with cardiac tamponade rarely occurs and, in infants, is generally caused by umbilical catheterization. The authors describe a case of cardiac tamponade that occurred in an infant during chest computed tomography with contrast infusion through a central venous catheter inserted into the right internal jugular vein. (author)

  4. Installation of a small computer for central EDV support of a radiologic clinic

    International Nuclear Information System (INIS)

    The computer of the Department of Radiology of the University of Graz was installed to solve EDP problems in the fields of nuclear medicine, radiotherapy and clinical documentation. This is accomplished by applying off-line procedures for data acquisition. The computer is used primarily for the analysis of dynamic studies in nuclear medicine, for interactive treatment planning in radiotherapy and for report documentation by free-text analysis. (orig.)

  5. National Ignition Facility system design requirements NIF integrated computer controls SDR004

    International Nuclear Information System (INIS)

    This System Design Requirement document establishes the performance, design, development, and test requirements for the NIF Integrated Computer Control System. The Integrated Computer Control System (ICCS) is covered in NIF WBS element 1.5. This document responds directly to the requirements detailed in the NIF Functional Requirements/Primary Criteria, and is supported by subsystem design requirements documents for each major ICCS Subsystem

  6. National Ignition Facility sub-system design requirements computer system SSDR 1.5.1

    International Nuclear Information System (INIS)

    This System Design Requirement document establishes the performance, design, development and test requirements for the Computer System, WBS 1.5.1, which is part of the NIF Integrated Computer Control System (ICCS). This document responds directly to the requirements detailed in the ICCS document (WBS 1.5), which is directly above it in the requirements hierarchy

  7. Evaluation and use of geosphere flow and migration computer programs for near surface trench type disposal facilities

    International Nuclear Information System (INIS)

    This report describes calculations of groundwater flow and radionuclide migration for near-surface trench-type radioactive waste disposal facilities. Aspects covered are verification of computer programs, detailed groundwater flow calculations for the Elstow site, radionuclide migration for the Elstow site, and the effects of using non-linear sorption models. The Elstow groundwater flows are for both the current situation and for projected developments to the site. The Elstow migration calculations serve to demonstrate a methodology for predicting radionuclide transport from near-surface trench-type disposal facilities. The majority of the work was carried out at the request of, and in close collaboration with, ANS, the coordinators for the preliminary assessment of a proposed radioactive waste disposal site at Elstow. Hence a large part of the report contains results which were generated for ANS to use in their assessment. (author)

  8. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    Energy Technology Data Exchange (ETDEWEB)

    Kostin, Mikhail [FRIB, MSU]; Mokhov, Nikolai [FNAL]; Niita, Koji [RIST, Japan]

    2013-09-25

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90 or C. The module is largely independent of the radiation transport codes it is used with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes, such as PHITS, FLUKA and MCNP, after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows calculations to be restarted from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
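
    To illustrate the master/worker scheduling and checkpointing pattern the framework provides (the framework itself is a C++/MPI module; the sketch below is an editorial Python/mpi4py analogue with hypothetical task and file names, not an interface to MARS15 or PHITS):

        import os
        import pickle
        from mpi4py import MPI

        CHECKPOINT = "tally.ckpt"   # hypothetical checkpoint file name
        TOTAL_BATCHES = 20          # toy problem size

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        def run_batch(batch_id):
            """Stand-in for a batch of particle histories; returns a partial tally."""
            return {"batch": batch_id, "score": float(batch_id)}

        if rank == 0:
            # Master: hand out batches, merge tallies, checkpoint after every merge.
            done, tally = 0, []
            if os.path.exists(CHECKPOINT):              # restart from a saved checkpoint
                with open(CHECKPOINT, "rb") as f:
                    done, tally = pickle.load(f)
            next_batch, active = done, 0
            status = MPI.Status()
            for worker in range(1, size):               # prime the workers
                if next_batch < TOTAL_BATCHES:
                    comm.send(next_batch, dest=worker)
                    next_batch += 1
                    active += 1
                else:
                    comm.send(None, dest=worker)
            while active:
                result = comm.recv(source=MPI.ANY_SOURCE, status=status)
                tally.append(result)
                done += 1
                with open(CHECKPOINT, "wb") as f:       # checkpoint the merged tally
                    pickle.dump((done, tally), f)
                if next_batch < TOTAL_BATCHES:          # keep the worker busy or retire it
                    comm.send(next_batch, dest=status.Get_source())
                    next_batch += 1
                else:
                    comm.send(None, dest=status.Get_source())
                    active -= 1
            print("merged", done, "batches")
        else:
            # Workers: compute batches until the master sends a stop signal (None).
            while True:
                batch = comm.recv(source=0)
                if batch is None:
                    break
                comm.send(run_batch(batch), dest=0)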

  9. Managing drought risk with a computer model of the Raritan River Basin water-supply system in central New Jersey

    Science.gov (United States)

    Dunne, Paul; Tasker, Gary

    1996-01-01

    The reservoirs and pumping stations that comprise the Raritan River Basin water-supply system and its interconnections to the Delaware-Raritan Canal water-supply system, operated by the New Jersey Water Supply Authority (NJWSA), provide potable water to central New Jersey communities. The water reserve of this combined system can easily be depleted by an extended period of below-normal precipitation. Efficient operation of the combined system is vital to meeting the water-supply needs of central New Jersey. In an effort to improve the efficiency of the system operation, the U.S. Geological Survey (USGS), in cooperation with the NJWSA, has developed a computer model that provides a technical basis for evaluating the effects of alternative patterns of operation of the Raritan River Basin water-supply system. This fact sheet describes the model, its technical basis, and its operation.

  10. Design of a central pattern generator using reservoir computing for learning human motion

    OpenAIRE

    Wyffels, Francis; Schrauwen, Benjamin

    2009-01-01

    To generate coordinated periodic movements, robot locomotion demands mechanisms that are able to learn and produce stable rhythmic motion in a controllable way. Because systems based on biological central pattern generators (CPGs) can cope with these demands, such systems are gaining in popularity. In this work we introduce a novel methodology that uses the dynamics of a randomly connected recurrent neural network for the design of CPGs. When a randomly connected recurrent neural netwo...

  11. Simple computational modeling for human extracorporeal irradiation using the BNCT facility of the RA-3 Reactor

    International Nuclear Information System (INIS)

    We present a simple computational model of the RA-3 reactor developed using the Monte Carlo transport code MCNP. The model parameters are adjusted in order to reproduce experimentally measured points in air, and the source validation is performed in an acrylic phantom. Performance analysis is carried out using computational models of animal extracorporeal irradiation in liver and lung. Analysis is also performed inside a neutron-shielded receptacle used for the irradiation of rats with a model of hepatic metastases. The computational model reproduces the experimental behavior in all the analyzed cases with a maximum difference of 10 percent. (author)

  12. Crustal seismic velocity in the Marche region (Central Italy): computation of a minimum 1-D model with seismic station corrections.

    OpenAIRE

    Scarfì, L.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Catania, Catania, Italia; Imposa, S.; Dipartimento di Scienze Geologiche, University of Catania, Italy; Raffaele, R.; Dipartimento di Scienze Geologiche, Università di Catania, Italy; Scaltrito, A.; Istituto Nazionale di Geofisica e Vulcanologia, Sezione Catania, Catania, Italia

    2008-01-01

    A 1-D velocity model for the Marche region (central Italy) was computed by inverting P- and S-wave arrival times of local earthquakes. A total of 160 seismic events with a minimum of ten observations, a travel time residual ≤ 0.8 s and an azimuthal gap lower than 180° have been selected. This “minimum 1-D velocity model” is complemented by station corrections, which can be used to take into account possible near-surface velocity heterogeneities beneath each station. Using this new P-wave ...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  15. Environmental assessment: Solid waste retrieval complex, enhanced radioactive and mixed waste storage facility, infrastructure upgrades, and central waste support complex, Hanford Site, Richland, Washington

    International Nuclear Information System (INIS)

    The U.S. Department of Energy (DOE) needs to take action to: (1) retrieve transuranic (TRU) waste, because interim storage waste containers have exceeded their 20-year design life and could fail, causing a radioactive release to the environment; (2) provide storage capacity for retrieved and newly generated TRU, Greater-than-Category 3 (GTC3), and mixed waste before treatment and/or shipment to the Waste Isolation Pilot Project (WIPP); and (3) upgrade the infrastructure network in the 200 West Area to enhance operational efficiencies and reduce the cost of operating the Solid Waste Operations Complex. This proposed action would initiate the retrieval activities (Retrieval) from Trench 4C-T04 in the 200 West Area including the construction of support facilities necessary to carry out the retrieval operations. In addition, the proposed action includes the construction and operation of a facility (Enhanced Radioactive Mixed Waste Storage Facility) in the 200 West Area to store newly generated and the retrieved waste while it awaits shipment to a final disposal site. Also, Infrastructure Upgrades and a Central Waste Support Complex are necessary to support the Hanford Site's centralized waste management area in the 200 West Area. The proposed action also includes mitigation for the loss of priority shrub-steppe habitat resulting from construction. The estimated total cost of the proposed action is $66 million

  16. Development of a personal computer code for fire protection analysis of DOE facility air-cleaning systems

    International Nuclear Information System (INIS)

    The United States Department of Energy (DOE) has sponsored development of a computer code to aid analysts performing fire hazards analyses for DOE facilities. The code selected for this application was the FIRAC code, developed by the Los Alamos National Laboratory for the Nuclear Regulatory Commission. The original code has been modified by the Westinghouse-Hanford Company. The FIRAC code simulates fire accidents in nuclear facilities and predicts the effects of a hypothetical fire within a compartment and its effects throughout the rest of the facility, particularly on the air-cleaning systems. The FIRAC code was designed to run on Cray supercomputers, and its input format is difficult to use. For this code to be useful to the DOE fire protection community, it had to be converted to run on an IBM PC and coupled with a menu-driven pre-processor that makes preparation of the input easy for fire protection engineers. In addition, a graphical display of the analysis results was required. In this paper the authors describe the pre-processor, the PC version of FIRAC, and the post-processor graphics package. In the presentation, a demonstration of how to set up a problem and use the code is made. 4 figs

  17. AIRDOS-II computer code for estimating radiation dose to man from airborne radionuclides in areas surrounding nuclear facilities

    International Nuclear Information System (INIS)

    The AIRDOS-II computer code estimates individual and population doses resulting from the simultaneous atmospheric release of as many as 36 radionuclides from a nuclear facility. This report describes the meteorological and environmental models used in the code, their computer implementation, and the applicability of the code to assessments of radiological impact. Atmospheric dispersion and surface deposition of released radionuclides are estimated as a function of direction and distance from a nuclear power plant or fuel-cycle facility, and doses to man through inhalation, air immersion, exposure to contaminated ground, food ingestion, and water immersion are estimated in the surrounding area. Annual doses are estimated for total body, GI tract, bone, thyroid, lungs, muscle, kidneys, liver, spleen, testes, and ovaries. Either the annual population doses (man-rems/year) or the highest annual individual doses in the assessment area (rems/year), whichever are applicable, are summarized in output tables in several ways: by nuclides, modes of exposure, and organs. The location of the highest individual doses for each reference organ estimated for the area is specified in the output data.
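
    A toy illustration of the dispersion-and-inhalation part of such an assessment: a ground-level Gaussian plume with Briggs-type dispersion coefficients for neutral stability (the coefficients, breathing rate and dose coefficient are assumed illustrative values and do not reproduce the AIRDOS-II models):

        import math

        def sigma_yz(x):
            """Approximate Briggs open-country dispersion coefficients, neutral (D) stability."""
            sig_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)
            sig_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)
            return sig_y, sig_z

        def chi_over_q(x, H, u):
            """Ground-level centreline concentration per unit release rate (s/m^3)."""
            sig_y, sig_z = sigma_yz(x)
            return (1.0 / (math.pi * sig_y * sig_z * u)) * math.exp(-H**2 / (2 * sig_z**2))

        # Illustrative case: 1 Bq/s continuous release, 30 m stack, 3 m/s wind, receptor at 1 km.
        Q, H, u, x = 1.0, 30.0, 3.0, 1000.0
        breathing_rate = 2.66e-4   # m^3/s (assumed adult value)
        dcf = 1.0e-8               # Sv per Bq inhaled (assumed illustrative dose coefficient)

        conc = Q * chi_over_q(x, H, u)                      # Bq/m^3
        annual_dose = conc * breathing_rate * dcf * 3.15e7  # Sv per year of continuous exposure
        print(f"concentration {conc:.3e} Bq/m^3, annual inhalation dose {annual_dose:.3e} Sv")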

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  19. User's Manual for the SOURCE1 and SOURCE2 Computer Codes: Models for Evaluating Low-Level Radioactive Waste Disposal Facility Source Terms (Version 2.0)

    International Nuclear Information System (INIS)

    The SOURCE1 and SOURCE2 computer codes calculate source terms (i.e. radionuclide release rates) for performance assessments of low-level radioactive waste (LLW) disposal facilities. SOURCE1 is used to simulate radionuclide releases from tumulus-type facilities. SOURCE2 is used to simulate releases from silo-, well-, well-in-silo-, and trench-type disposal facilities. The SOURCE codes (a) simulate the degradation of engineered barriers and (b) provide an estimate of the source term for LLW disposal facilities. This manual summarizes the major changes that have been effected since the codes were originally developed

  20. The significance of indirect costs—application to clinical laboratory test economics using computer facilities

    OpenAIRE

    Hindriks, F. R.; Bosman, A.; Rademaker, P. F.

    1989-01-01

    The significance of indirect costs in the cost price calculation of clinical chemistry laboratory tests by way of the production centres method has been investigated. A cost structure model based on the ‘production centres’ method, the Academisch Ziekenhuis Groningen (AZG) 1-2-3 model, is used for the calculation of cost and cost prices as an add-in tool to the spreadsheet program Lotus 1-2-3. The system specifications of the AZG 1-2-3 cost structure model have been extended with facilities t...
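
    A minimal sketch of the production-centres idea: indirect costs of support centres are first allocated to the production centres by an allocation key, and the total cost of each production centre is then divided over its test volume to give a cost price per test (the centre names, keys and figures are invented for illustration and do not come from the AZG 1-2-3 model):

        # Direct annual costs per centre (illustrative figures).
        production = {"chemistry": 400_000, "haematology": 250_000}
        support = {"computing": 120_000, "administration": 80_000}

        # Allocation keys: share of each support centre charged to each production centre.
        keys = {
            "computing":      {"chemistry": 0.7, "haematology": 0.3},   # e.g. by computer usage
            "administration": {"chemistry": 0.5, "haematology": 0.5},   # e.g. by staff count
        }
        tests_per_year = {"chemistry": 800_000, "haematology": 300_000}

        full_cost = dict(production)
        for sup, cost in support.items():            # indirect costs flow to production centres
            for prod, share in keys[sup].items():
                full_cost[prod] += cost * share

        for prod, cost in full_cost.items():
            print(f"{prod}: total {cost:,.0f}, cost price per test {cost / tests_per_year[prod]:.3f}")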

  1. Computational Insights into the Central Role of Nonbonding Interactions in Modern Covalent Organocatalysis.

    Science.gov (United States)

    Walden, Daniel M; Ogba, O Maduka; Johnston, Ryne C; Cheong, Paul Ha-Yeon

    2016-06-21

    The flexibility, complexity, and size of contemporary organocatalytic transformations pose interesting and powerful opportunities to computational and experimental chemists alike. In this Account, we disclose our recent computational investigations of three branches of organocatalysis in which nonbonding interactions, such as C-H···O/N interactions, play a crucial role in the organization of transition states, catalysis, and selectivity. We begin with two examples of N-heterocyclic carbene (NHC) catalysis, both collaborations with the Scheidt laboratory at Northwestern. In the first example, we discuss the discovery of an unusual diverging mechanism in a catalytic kinetic resolution of a dynamic racemate that depends on the stereochemistry of the product being formed. Specifically, the major product is formed through a concerted asynchronous [2 + 2] aldol-lactonization, while the minor products come from a stepwise spiro-lactonization pathway. Stereoselectivity and catalysis are the results of electrophilic activation from C-H···O interactions between the catalyst and the substrate and conjugative stabilization of the electrophile. In the second example, we show how knowledge and understanding of the computed transition states led to the development of a more enantioselective NHC catalyst for the butyrolactonization of acyl phosphonates. The identification of mutually exclusive C-H···O interactions in the computed major and minor TSs directly resulted in structural hypotheses that would lead to targeted destabilization of the minor TS, leading to enhanced stereoinduction. Synthesis and evaluation of the newly designed NHC catalyst validated our hypotheses. Next, we discuss two works related to Lewis base catalysis involving 4-dimethylaminopyridine (DMAP) and its derivatives. In the first, we discuss our collaboration with the Smith laboratory at St Andrews, in which we discovered the origins of the regioselectivity in carboxyl transfer reactions. We

  2. Hanford facility dangerous waste Part A, Form 3 and Part B permit application documentation, Central Waste Complex (WA7890008967)(TSD: TS-2-4)

    Energy Technology Data Exchange (ETDEWEB)

    Saueressig, D.G.

    1998-05-20

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, operating, treatment, storage, and/or disposal units, such as the Central Waste Complex (this document, DOE/RL-91-17). Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needed by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology alpha-numeric section identifiers from the permit application guidance documentation (Ecology 1996) follow, in brackets, the chapter headings and subheadings. A checklist indicating where information is contained in the Central Waste Complex permit application documentation, in relation to the Washington State Department of Ecology guidance, is located in the Contents section. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in the General Information Portion). Wherever appropriate, the Central Waste Complex permit application documentation makes cross-reference to the General Information Portion, rather than duplicating text. Information provided in this Central Waste Complex permit application documentation is current as of May 1998.

  3. Hanford facility dangerous waste Part A, Form 3, and Part B permit application documentation for the Central Waste Complex (WA7890008967) (TSD: TS-2-4)

    International Nuclear Information System (INIS)

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, operating, treatment, storage, and/or disposal units, such as the Central Waste Complex (this document, DOE/RL-91-17). Both the General Information and Unit-Specific portions of the Hanford Facility Dangerous Waste Permit Application address the content of the Part B permit application guidance prepared by the Washington State Department of Ecology (Ecology 1996) and the U.S. Environmental Protection Agency (40 Code of Federal Regulations 270), with additional information needed by the Hazardous and Solid Waste Amendments and revisions of Washington Administrative Code 173-303. For ease of reference, the Washington State Department of Ecology alpha-numeric section identifiers from the permit application guidance documentation (Ecology 1996) follow, in brackets, the chapter headings and subheadings. A checklist indicating where information is contained in the Central Waste Complex permit application documentation, in relation to the Washington State Department of Ecology guidance, is located in the Contents section. Documentation contained in the General Information Portion is broader in nature and could be used by multiple treatment, storage, and/or disposal units (e.g., the glossary provided in the General Information Portion). Wherever appropriate, the Central Waste Complex permit application documentation makes cross-reference to the General Information Portion, rather than duplicating text. Information provided in this Central Waste Complex permit application documentation is current as of May 1998

  4. Computer simulated building energy consumption for verification of energy conservation measures in network facilities

    Science.gov (United States)

    Plankey, B.

    1981-01-01

    A computer program called ECPVER (Energy Consumption Program - Verification) was developed to simulate all energy loads for any number of buildings. The program computes simulated daily, monthly, and yearly energy consumption which can be compared with actual meter readings for the same time period. Such comparison can lead to validation of the model under a variety of conditions, which allows it to be used to predict future energy saving due to energy conservation measures. Predicted energy saving can then be compared with actual saving to verify the effectiveness of those energy conservation changes. This verification procedure is planned to be an important advancement in the Deep Space Network Energy Project, which seeks to reduce energy cost and consumption at all DSN Deep Space Stations.
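
    A toy version of the verification idea: simulate monthly consumption with a simple base-load-plus-heating-degree-day model and compare it with metered values (the model form, coefficients and readings are invented for illustration; ECPVER itself simulates all building energy loads in far more detail):

        # Illustrative monthly heating degree-days and metered electricity use (kWh).
        hdd =     [420, 360, 300, 180,  80,  10,   0,   0,  60, 200, 330, 410]
        metered = [9300, 8700, 8000, 6900, 6000, 5400, 5300, 5350, 5800, 7200, 8400, 9200]

        BASE_LOAD = 5300      # kWh/month independent of weather (assumed)
        HEATING_COEF = 9.5    # kWh per heating degree-day (assumed)

        simulated = [BASE_LOAD + HEATING_COEF * d for d in hdd]

        for month, (sim, act) in enumerate(zip(simulated, metered), start=1):
            dev = 100.0 * (sim - act) / act
            print(f"month {month:2d}: simulated {sim:7.0f}  metered {act:7.0f}  deviation {dev:+5.1f}%")

        annual_dev = 100.0 * (sum(simulated) - sum(metered)) / sum(metered)
        print(f"annual deviation {annual_dev:+.1f}% (a consistently small value would validate the model)")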

  5. Activity flow assessment based on a central radiological computer system 'CRCS' and a comprehensive plant-wide radiation monitoring system 'PW-RMS'

    International Nuclear Information System (INIS)

    The comprehensive plant-wide radiation monitoring concept combines radiation measurements of processes and systems in the plant, online environmental measurements and personal doses in order to allow assessment of the activity transport and planning of event-related job doses. The plant-wide radiation monitoring system 'PW-RMS' consists of a network of radiation monitors and their auxiliary measurements inside the buildings of the plant, at points of release to the environment and in the vicinity of the plant. Data are transmitted to the central radiological computer system 'CRCS' with its radiological assessment features. CRCS is of modular design and, if necessary, can be enlarged and modified according to the plant-related performance profile. The system is the result of a long-term development based on experience with radiation monitoring systems 'RMS' in power plants and in the environment, and with radiological computer systems CRCS installed in nuclear facilities and at authorities. RM systems with the basic functions of CRCS have been running in German NPPs for a long time. CRCS installations, provided with process data from the plant and its environment, have been operating successfully in several networks in German federal states and in Switzerland. A plant-wide radiation monitoring system based on a combination of Russian and German measuring units and a CRCS has been in operation in a Slovak nuclear power plant since 2002/2003. The new Olkiluoto 3 plant in Finland, which is currently being designed, will be equipped with a plant-wide radiation monitoring system and a CRCS. (authors)

  6. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  7. COMPUTER DYNAMICS MODELING OF TRANSPORT FACILITIES WITH ANTI-LOCKING AND ANTI-SLIPPAGE SYSTEMS

    Directory of Open Access Journals (Sweden)

    N. N. Hurski

    2014-12-01

    The paper considers a methodology for testing an anti-locking system (ALS) of a new generation. Results of computer dynamics modeling of automobile motion during braking with and without ALS are given in the paper. The paper also contains an analysis of basic characteristics: the braking distance, the value of longitudinal deceleration, and the influence of modulator speed on the efficiency of braking.
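
    A back-of-the-envelope illustration of why an anti-locking system shortens the braking distance: a locked, sliding wheel uses the lower sliding friction coefficient, whereas the ALS keeps the wheel near peak adhesion (the friction values are assumed illustrative numbers, not results from the paper):

        G = 9.81  # m/s^2

        def braking_distance(v_kmh, mu):
            """Stopping distance (m) for constant deceleration mu*g from speed v."""
            v = v_kmh / 3.6
            return v**2 / (2 * mu * G)

        MU_PEAK = 0.8      # assumed peak adhesion held by the ALS on dry asphalt
        MU_SLIDING = 0.55  # assumed sliding friction of a locked wheel

        for v in (50, 90, 120):
            with_als = braking_distance(v, MU_PEAK)
            locked = braking_distance(v, MU_SLIDING)
            print(f"{v:3d} km/h: with ALS {with_als:5.1f} m, wheels locked {locked:5.1f} m")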

  8. Cambridge-Cranfield High Performance Computing Facility (HPCF) purchases ten Sun Fire(TM) 15K servers to dramatically increase power of eScience research

    CERN Multimedia

    2002-01-01

    "The Cambridge-Cranfield High Performance Computing Facility (HPCF), a collaborative environment for data and numerical intensive computing privately run by the University of Cambridge and Cranfield University, has purchased 10 Sun Fire(TM) 15K servers from Sun Microsystems, Inc.. The total investment, which includes more than $40 million in Sun technology, will dramatically increase the computing power, reliability, availability and scalability of the HPCF" (1 page).

  9. Land classification of south-central Iowa from computer enhanced images

    Science.gov (United States)

    Lucas, J. R.; Taranik, J. V.; Billingsley, F. C. (Principal Investigator)

    1977-01-01

    The author has identified the following significant results. Enhanced LANDSAT imagery was most useful for land classification purposes, because these images could be photographically printed at large scales such as 1:63,360. The ability to see individual picture elements was no hindrance as long as general image patterns could be discerned. Low cost photographic processing systems for color printings have proved to be effective in the utilization of computer enhanced LANDSAT products for land classification purposes. The initial investment for this type of system was very low, ranging from $100 to $200 beyond a black and white photo lab. The technical expertise can be acquired from reading a color printing and processing manual.

  10. An Examination of Computer Attitudes, Anxieties, and Aversions Among Diverse College Populations: Issues Central to Understanding Information Sciences in the New Millennium

    OpenAIRE

    William H. Burkett; David M. Compton; Gail G. Burkett

    2001-01-01

    Studying the impact of computer attitudes on the production of knowledge is central to the understanding of information sciences in the new millennium. The major results from a survey of diverse college populations suggest that Liberal Arts College (LAC) students, in this demographic, have somewhat more ambivalence toward computers than students in a Community College (CC) or a nontraditional Business College (BC) environment. The respondents generally agreed that computers were an important ...

  11. Development of a high spectral resolution surface albedo product for the ARM Southern Great Plains central facility

    OpenAIRE

    S. A. McFarlane; K. L. Gaustad; E. J. Mlawer; C. N. Long; Delamere, J

    2011-01-01

    We present a method for identifying dominant surface type and estimating high spectral resolution surface albedo at the Atmospheric Radiation Measurement (ARM) facility at the Southern Great Plains (SGP) site in Oklahoma for use in radiative transfer calculations. Given a set of 6-channel narrowband visible and near-infrared irradiance measurements from upward and downward looking multi-filter radiometers (MFRs), four different surface types (snow-covered, green vegetation, partial vegetation...

  13. On dosimetry of radiodiagnosis facilities, mainly focused on computed tomography units

    International Nuclear Information System (INIS)

    The talk addresses the dosimetry of computed tomography units and is structured in three parts, with different emphasis on each: 1) basics of image acquisition using the computed tomography technique; 2) effective dose calculation for a patient and its assessment using the BERT concept; 3) recommended actions for achieving a good compromise between the delivered dose and the image quality. The aim of the first part is to acquaint the reader with the CT technique so that the given example of an effective dose calculation, and its conversion into time units using the BERT concept, can be understood. The conclusion drawn is that the effective dose calculation, accomplished by the medical physicist (using dedicated software for the CT scanner and the examination type) and converted into time units through the BERT concept, could then be communicated by the radiologist together with the diagnostic notes. Thus, a minimum of information for the patients is obviously necessary as regards the nature and type of radiation, for instance by means of leaflets. The third part discusses the factors which lead to good image quality, taking into account the ALARA principle of radiation protection, which states that the dose should be 'as low as reasonably achievable'. (author)
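
    A minimal sketch of the BERT conversion mentioned in the second part: the effective dose of an examination is expressed as the time over which natural background radiation would deliver the same dose (the background level and the example doses are assumed representative values, not figures from the talk):

        BACKGROUND_MSV_PER_YEAR = 2.4   # assumed average natural background (varies with location)

        def bert_months(effective_dose_msv, background=BACKGROUND_MSV_PER_YEAR):
            """Background Equivalent Radiation Time, expressed in months."""
            return 12.0 * effective_dose_msv / background

        # Illustrative examinations (effective doses are typical order-of-magnitude values).
        for exam, dose in [("chest radiograph", 0.02), ("head CT", 2.0), ("abdomen CT", 8.0)]:
            print(f"{exam}: {dose} mSv ~ {bert_months(dose):.1f} months of natural background")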

  14. System Requirements Analysis for a Computer-based Procedure in a Research Reactor Facility

    International Nuclear Information System (INIS)

    A computer-based procedure (CBP) can address many of the routine problems related to human error in the use of conventional, hard-copy operating procedures. An operation support system is also required in a research reactor. A well-made CBP can address the staffing issues of a research reactor and reduce human errors by minimizing the operator's routine tasks. A CBP for a research reactor has not been proposed yet. CBPs developed for nuclear power plants have powerful and varied technical functions to cover complicated plant operation situations; however, many of these functions may not be required for a research reactor. Thus, it is not reasonable to apply such a CBP to a research reactor directly, and customizing it is not cost-effective. Therefore, a compact CBP should be developed for a research reactor. This paper introduces the high-level requirements derived by the system requirements analysis activity as the first stage of system implementation. Operation support tools are under consideration for application to research reactors. In particular, as part of a full digitalization of the main control room, application of a computer-based procedure system has been required as part of the man-machine interface system, because it has an impact on the operating staffing and human errors of a research reactor. To establish computer-based system requirements for a research reactor, this paper reviews international standards and previous practice at nuclear power plants.

  15. Computer assisted quantitative densitometric analysis of 125I-apamin binding sites in the central nervous system.

    Science.gov (United States)

    Janicki, P K; Seibold, G; Siembab, D; Paulo, E A; Szreniawski, Z

    1989-01-01

    The binding sites for the 125I-monoiodo derivative of apamin in the central nervous system of rat, guinea-pig, chicken and frog were analysed and compared by computer-assisted quantitative densitometric autoradiography on X-ray film. The highest level of binding sites in the rat and guinea-pig brain was found in the limbic-olfactory system and in the substantia gelatinosa of the spinal cord. In the chicken brain apamin binds preferentially to the tectum opticum and nuclei isthmi. In the frog brain no specific apamin binding sites were found. The role of the presented topography of apamin binding sites is discussed in relation to the neurotoxic properties of apamin. PMID:2641421

  16. Experimental and computational investigation of LBE-water interaction in LIFUS 5 facility

    Energy Technology Data Exchange (ETDEWEB)

    Ciampichetti, A. [ENEA FIS ING, C.R. Brasimone, 40032 Camugnano, BO (Italy); Pellini, D. [Universita di Pisa, DIMNP, Via Diotisalvi 2, 56100 Pisa (Italy); Agostini, P.; Benamati, G. [ENEA FIS ING, C.R. Brasimone, 40032 Camugnano, BO (Italy); Forgione, N. [Universita di Pisa, DIMNP, Via Diotisalvi 2, 56100 Pisa (Italy)], E-mail: nicola.forgione@ing.unipi.it; Oriolo, F. [Universita di Pisa, DIMNP, Via Diotisalvi 2, 56100 Pisa (Italy)

    2009-11-15

    The interaction between a heavy liquid metal, such as LBE, and pressurized water has been analyzed at the ENEA Brasimone Research Centre, in order to investigate the evolution of this phenomenon over a wide range of conditions. The study includes an experimental campaign on the LIFUS 5 facility and a numerical simulation activity performed with the SIMMER III code. The first test of the experimental program was carried out by injecting water at 7 MPa and 235 °C into a reaction vessel containing LBE at 350 °C. A pressurization up to 8 MPa was observed in the test section during the short term (about 2 s) of the transient. In the post-test analysis performed with the SIMMER III code, two different geometrical models were developed in order to best reproduce the experimental results and, therefore, to confirm the code's capability to reproduce the phenomenology of the LBE-water interaction. The data calculated with both models agreed well with the experimental results, despite the necessary simplifications adopted in the models due to the 2D features of the code.

  17. Establishment of scatter factors for use in shielding calculations and risk assessment for computed tomography facilities

    International Nuclear Information System (INIS)

    The specification of shielding for CT facilities in the UK and many other countries has been based on isodose scatter curves supplied by the manufacturers combined with the scanner's mAs workload. Shielding calculations for radiography and fluoroscopy are linked to a dose measurement of radiation incident on the patient called the kerma–area product (KAP), and a related quantity, the dose-length product (DLP), is now employed for assessment of CT patient doses. In this study the link between scatter air kerma and DLP has been investigated for CT scanners from different manufacturers. Scatter air kerma values have been measured and scatter factors established that can be used to estimate air kerma levels within CT scanning rooms. Factors recommended to derive the scatter air kerma at 1 m from the isocentre are 0.36 µGy (mGy cm)−1 for the body and 0.14 µGy (mGy cm)−1 for head scans. The CT scanner gantries only transmit 10% of the scatter air kerma level and this can also be taken into account when designing protection. The factors can be used to predict scatter air kerma levels within a scanner room that might be used in risk assessments relating to personnel whose presence may be required during CT fluoroscopy procedures.
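
    A minimal sketch of how the quoted factors can be used: the unshielded weekly scatter air kerma at a point of interest is the factor times the weekly DLP workload, scaled by the inverse square of the distance from the isocentre, with the 10% gantry transmission applied where the gantry lies in the line of sight (the workload figures and distances are assumed illustrative values; the inverse-square scaling is the usual assumption in such calculations):

        # Scatter factors at 1 m from the isocentre, as recommended in the study above.
        KAPPA = {"body": 0.36e-6, "head": 0.14e-6}   # Gy per (mGy*cm) of DLP

        def weekly_scatter_kerma(dlp_body, dlp_head, distance_m, behind_gantry=False):
            """Unshielded weekly scatter air kerma (Gy) assuming inverse-square scaling."""
            kerma = (KAPPA["body"] * dlp_body + KAPPA["head"] * dlp_head) / distance_m**2
            return 0.10 * kerma if behind_gantry else kerma   # gantry transmits ~10%

        # Illustrative weekly workload: 80 body scans of 700 mGy*cm, 40 head scans of 1000 mGy*cm.
        dlp_body, dlp_head = 80 * 700.0, 40 * 1000.0
        for d in (1.5, 2.5, 4.0):
            k = weekly_scatter_kerma(dlp_body, dlp_head, d)
            print(f"{d} m from isocentre: {k * 1e6:.0f} uGy/week unshielded")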

  18. Study on cranial computed tomography in infants and children with central nervous system disorders, 2

    International Nuclear Information System (INIS)

    110 patients with cerebral palsy were studied by cranial computed tomography (CT) and electroencephalography (EEG), and the following results were obtained: 1) Abnormal brain findings on CT were present in 69% of the spastic quadriplegia type, 75% of the spastic hemiplegia type, 23% of the athetotic type and 50% of the mixed type. 2) Most patients with spastic quadriplegia showed diffuse cerebral atrophy on CT, and patients with spastic hemiplegia mostly showed hemispheric cerebral atrophy on the side contralateral to the motor paralysis. Most patients with athetosis had normal CT findings, but a few showed slight diffuse cerebral atrophy. 3) The more severe the mental retardation of the patients, the more frequent and more severe were the CT abnormalities. 4) Patients with epileptic seizures showed CT abnormalities more often than patients without seizures. 5) There was a good correlation between the abnormality of background activity on EEG and that on CT, and their laterality coincided in most cases. 6) The sides of seizure discharges on EEG were the same as those of CT abnormalities in 1/3 to 1/2 of the patients, but the localization of seizure discharges corresponded to that of CT abnormalities in only 11% of the cases. (author)

  19. A computer code to estimate accidental fire and radioactive airborne releases in nuclear fuel cycle facilities: User's manual for FIRIN

    International Nuclear Information System (INIS)

    This manual describes the technical bases and use of the computer code FIRIN. This code was developed to estimate the source term release of smoke and radioactive particles from potential fires in nuclear fuel cycle facilities. FIRIN is a product of a broader study, Fuel Cycle Accident Analysis, which Pacific Northwest Laboratory conducted for the US Nuclear Regulatory Commission. The technical bases of FIRIN consist of a nonradioactive fire source term model, compartment effects modeling, and radioactive source term models. These three elements interact with each other in the code affecting the course of the fire. This report also serves as a complete FIRIN user's manual. Included are the FIRIN code description with methods/algorithms of calculation and subroutines, code operating instructions with input requirements, and output descriptions. 40 refs., 5 figs., 31 tabs

  20. MASFLO: a computer code to calculate mass flow rates in the Thermal-Hydraulic Test Facility (THTF). Technical report

    International Nuclear Information System (INIS)

    This report documents a modular data interpretation computer code. The MASFLO code is a Fortran code used in the Oak Ridge National Laboratory Blowdown Heat Transfer Program to convert measured quantities of density, volumetric flow, and momentum flux into a calculated quantity: mass flow rate. The code performs both homogeneous and two-velocity calculations. The homogeneous models incorporate various combinations of the Thermal-Hydraulic Test Facility instrumented spool piece turbine flow meter, gamma densitometer, and drag disk readings. The two-velocity calculations also incorporate these instruments, but in models developed by Aya, Rouhani, and Popper. Each subroutine is described briefly, and input instructions are provided in the appendix along with a sample of the code output
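
    Two of the homogeneous spool-piece combinations implied above, sketched for illustration: the gamma densitometer gives density, the turbine meter gives volumetric flow, and the drag disk gives momentum flux. The instrument readings and pipe area below are invented, and the actual MASFLO correlations (including the Aya, Rouhani, and Popper two-velocity models) are not reproduced:

        import math

        AREA = 8.0e-3   # pipe flow area, m^2 (assumed)

        def mdot_density_turbine(rho, q_vol):
            """Homogeneous mass flow from densitometer (kg/m^3) and turbine meter (m^3/s)."""
            return rho * q_vol

        def mdot_density_dragdisk(rho, momentum_flux, area=AREA):
            """Homogeneous mass flow from densitometer and drag disk (momentum flux = rho*v^2, in Pa)."""
            return area * math.sqrt(rho * momentum_flux)

        # Illustrative single-phase check: water at 740 kg/m^3 flowing at 4 m/s.
        rho, v = 740.0, 4.0
        q_vol = v * AREA
        momentum_flux = rho * v**2

        print("rho*Q combination:          %.2f kg/s" % mdot_density_turbine(rho, q_vol))
        print("A*sqrt(rho*M) combination:  %.2f kg/s" % mdot_density_dragdisk(rho, momentum_flux))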

  1. ECED 2013: Eastern and Central Europe Decommissioning. International Conference on Decommissioning of Nuclear Facilities. Conference Guide and Book of Abstracts

    International Nuclear Information System (INIS)

    The Conference included the following sessions: (I) Opening session (2 contributions); (II) Managerial and Funding Aspects of Decommissioning (5 contributions); (III) Technical Aspects of Decommissioning I (6 contributions); (IV) Experience with Present Decommissioning Projects (4 contributions); (V) Poster Session (14 contributions); (VI) Eastern and Central Europe Decommissioning - Panel Discussion; (VII) Release of Materials, Waste Management and Spent Fuel Management (6 contributions); (VIII) Technical Aspects of Decommissioning II (5 contributions).

  2. Medication supply to residential aged care facilities in Western Australia using a centralized medication chart to replace prescriptions

    OpenAIRE

    Hoti Kreshnik; Hughes Jeffery; Sunderland Bruce

    2012-01-01

    Background: The current model of medication supply to residential aged care facilities (RACFs) in Australia is dependent on paper-based prescriptions. This study is aimed at assessing the use of a centralized medication chart as a prescription-less model for supplying medications to RACFs. Methods: Two separate focus groups were conducted with general practitioners (GPs) and pharmacists, and another three with registered nurses (RNs) and carers combined. All focus group participants were working with RACFs. Audio-recorde...

  3. Computational analysis of vapour condensation in presence of air in the TOSQAN facility

    International Nuclear Information System (INIS)

    Condensation heat transfer in the presence of noncondensable gases is a relevant aspect in many industrial applications, including nuclear reactors. In particular, the design of nuclear reactor containments is heavily dependent on an accurate description of heat and mass transfer phenomena: after a postulated accident, the steam release into the containment atmosphere causes the pressure of the steam-air mixture to rise up to levels which are affected by condensation on the containment walls. At present, in addition to the quantitative prediction of the phenomenon by engineering laws and correlations, normally based on the heat and mass transfer analogy and conceived for design purposes, interest is greatly shifting towards detailed analysis by computational fluid-dynamics codes, which are capable of a mechanistic approach to the problem, at the price of a computational effort which is becoming more and more affordable as computer speed and general capabilities improve. The present work is focused on the computational analysis of the TOSQAN benchmark known as International Standard Problem (ISP) 47. In the test, air was initially present in the vessel (about 7 m3) and steam, air and helium were injected during different phases at various mass flow rates. The thermal-hydraulic behaviour of the containment atmosphere was determined by the dominant physical phenomena: gas injection, steam condensation, heat transfer and buoyant flow. During certain phases, steady states were reached when the steam condensation rate became equal to the steam injection rate, with all boundary conditions (wall temperatures) remaining constant. The aim of the study is to contribute to the understanding of the heat and mass transfer mechanisms involved in the problem and to check the possibility of using a multipurpose commercial CFD code for simulating the containment transient and the mass transfer phenomena of interest in the nuclear field.

  4. Computational design of high efficiency release targets for use at ISOL facilities

    CERN Document Server

    Liu, Y

    1999-01-01

    This report describes efforts made at the Oak Ridge National Laboratory to design high-efficiency-release targets that simultaneously incorporate the short diffusion lengths, high permeabilities, controllable temperatures, and heat-removal properties required for the generation of useful radioactive ion beam (RIB) intensities for nuclear physics and astrophysics research using the isotope separation on-line (ISOL) technique. Short diffusion lengths are achieved either by using thin fibrous target materials or by coating thin layers of selected target material onto low-density carbon fibers such as reticulated-vitreous-carbon fiber (RVCF) or carbon-bonded-carbon fiber (CBCF) to form highly permeable composite target matrices. Computational studies that simulate the generation and removal of primary beam deposited heat from target materials have been conducted to optimize the design of target/heat-sink systems for generating RIBs. The results derived from diffusion release-rate simulation studies for selected t...

  5. Coupled Monte Carlo - Discrete ordinates computational scheme for three-dimensional shielding calculations of large and complex nuclear facilities

    International Nuclear Information System (INIS)

    Shielding calculations of advanced nuclear facilities such as accelerator based neutron sources or fusion devices of the tokamak type are complicated due to their complex geometries and their large dimensions, including bulk shields of several meters thickness. While the complexity of the geometry in the shielding calculation can be hardly handled by the discrete ordinates method, the deep penetration of radiation through bulk shields is a severe challenge for the Monte Carlo particle transport simulation technique. This work proposes a dedicated computational approach for coupled Monte Carlo - deterministic transport calculations to handle this kind of shielding problems. The Monte Carlo technique is used to simulate the particle generation and transport in the target region with both complex geometry and reaction physics, and the discrete ordinates method is used to treat the deep penetration problem in the bulk shield. To enable the coupling of these two different computational methods, a mapping approach has been developed for calculating the discrete ordinates angular flux distribution from the scored data of the Monte Carlo particle tracks crossing a specified surface. The approach has been implemented in an interface program and validated by means of test calculations using a simplified three-dimensional geometric model. Satisfactory agreement was obtained for the angular fluxes calculated by the mapping approach using the MCNP code for the Monte Carlo calculations and direct three-dimensional discrete ordinates calculations using the TORT code. In the next step, a complete program system has been developed for coupled three-dimensional Monte Carlo deterministic transport calculations by integrating the Monte Carlo transport code MCNP, the three-dimensional discrete ordinates code TORT and the mapping interface program. Test calculations with two simple models have been performed to validate the program system by means of comparison calculations using the
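
    A minimal sketch of the mapping idea: particle tracks crossing the coupling surface, recorded by the Monte Carlo run as (weight, direction cosine, azimuth), are binned into discrete angular intervals and normalized to give an angular flux that can serve as a boundary source for the discrete ordinates calculation. The synthetic track list, bin structure and flat-surface normalization below are illustrative assumptions, not the interface program described in the paper:

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic surface-crossing records: statistical weight, polar cosine mu in [0,1), azimuth phi.
        n_tracks = 100_000
        weights = np.full(n_tracks, 1.0)
        mu = rng.random(n_tracks) ** 0.5           # forward-peaked toy distribution
        phi = rng.uniform(0.0, 2 * np.pi, n_tracks)

        AREA = 100.0                               # coupling surface area, cm^2 (assumed)
        n_mu_bins, n_phi_bins = 8, 12
        mu_edges = np.linspace(0.0, 1.0, n_mu_bins + 1)
        phi_edges = np.linspace(0.0, 2 * np.pi, n_phi_bins + 1)

        # Current through the surface per angular bin: sum of track weights.
        current, _, _ = np.histogram2d(mu, phi, bins=[mu_edges, phi_edges], weights=weights)

        # Convert to angular flux: divide by area, solid angle of each bin, and |mu|
        # (surface-crossing estimator, evaluated here at the bin-average cosine).
        d_omega = np.outer(np.diff(mu_edges), np.diff(phi_edges))      # dmu * dphi
        mu_mid = 0.5 * (mu_edges[:-1] + mu_edges[1:])
        angular_flux = current / (AREA * d_omega * mu_mid[:, None])

        print("angular flux per (mu, phi) bin, arbitrary units:")
        print(np.array2string(angular_flux, precision=1))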

  6. Central axis dose verification in patients treated with total body irradiation of photons using a Computed Radiography system

    International Nuclear Information System (INIS)

    The aim is to propose and evaluate a method for central-axis dose verification in patients treated with total body irradiation (TBI) with photons, using images obtained through a Computed Radiography (CR) system. Readings from the Computed Radiography (Fuji) portal imaging cassettes were correlated with the absorbed dose in water measured with an ionization chamber in 10 x 10 irradiation fields on the 60Co unit. The analytical and graphical expression was obtained with the software Origin 8; the TBI patient portal verification images were processed using the software ImageJ to obtain the patient dose. To validate the results, the absorbed dose in RW3 phantoms of different thicknesses was measured with an ionization chamber, simulating real TBI conditions. Finally, a retrospective study over the last 4 years was performed, obtaining the patients' absorbed dose from the image readings and comparing it with the planned dose. The analytical equation obtained permits estimation of the absorbed dose from the image pixel value and the dose measured with the ionization chamber, correlated with the patients' clinical records. These results were compared with reported evidence, with a difference of less than 2%; the three methods were compared and the results agree within 10%. (Author)
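
    A minimal sketch of the calibration step described above: chamber-measured doses are fitted against the pixel values of the corresponding CR images, and the fit is then applied to a patient portal image to estimate the delivered central-axis dose (the calibration points, the linear fit form and the tolerance are illustrative assumptions; the study itself used Origin 8 and ImageJ for these steps):

        import numpy as np

        # Illustrative calibration pairs: mean pixel value of the CR image vs chamber dose (cGy).
        pixel_values = np.array([1850.0, 2100.0, 2380.0, 2650.0, 2910.0])
        doses_cgy    = np.array([  60.0,   80.0,  100.0,  120.0,  140.0])

        # Linear calibration dose = a * pixel + b (the actual fit form may differ).
        a, b = np.polyfit(pixel_values, doses_cgy, deg=1)
        estimate_dose = lambda pixel: a * pixel + b

        # Apply to a patient portal image reading and compare with the planned dose.
        patient_pixel = 2500.0
        planned_cgy = 110.0
        measured_cgy = estimate_dose(patient_pixel)
        deviation = 100.0 * (measured_cgy - planned_cgy) / planned_cgy
        print(f"estimated dose {measured_cgy:.1f} cGy, planned {planned_cgy:.1f} cGy, "
              f"deviation {deviation:+.1f}% (flag if outside a chosen tolerance, e.g. 10%)")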

  7. Registration of central paths and colonic polyps between supine and prone scans in computed tomography colonography: Pilot study

    International Nuclear Information System (INIS)

    Computed tomography colonography (CTC) is a minimally invasive method that allows the evaluation of the colon wall from CT sections of the abdomen/pelvis. The primary goal of CTC is to detect colonic polyps, precursors to colorectal cancer. Because imperfect cleansing and distension can cause portions of the colon wall to be collapsed, covered with water, and/or covered with retained stool, patients are scanned in both prone and supine positions. We believe that both reading efficiency and computer aided detection (CAD) of CTC images can be improved by accurate registration of data from the supine and prone positions. We developed a two-stage approach that first registers the colonic central paths using a heuristic and automated algorithm and then matches polyps or polyp candidates (CAD hits) by a statistical approach. We evaluated the registration algorithm on 24 patient cases. After path registration, the mean misalignment distance between prone and supine identical anatomic landmarks was reduced from 47.08 to 12.66 mm, a 73% improvement. The polyp registration algorithm was specifically evaluated using eight patient cases for which radiologists identified polyps separately for both supine and prone data sets, and then manually registered corresponding pairs. The algorithm correctly matched 78% of these pairs without user input. The algorithm was also applied to the 30 highest-scoring CAD hits in the prone and supine scans and showed a success rate of 50% in automatically registering corresponding polyp pairs. Finally, we computed the average number of CAD hits that need to be manually compared in order to find the correct matches among the top 30 CAD hits. With polyp registration, the average number of comparisons was 1.78 per polyp, as opposed to 4.28 comparisons without polyp registration

  8. Use of Monte Carlo simulation for computational analysis of critical systems on IPPE's facility addressing needs of nuclear safety

    Energy Technology Data Exchange (ETDEWEB)

    Pavlova, Olga; Tsibulya, Anatoly [FSUE 'SSC RF-IPPE', 249033, Bondarenko Square 1, Obninsk (Russian Federation)

    2008-07-01

    The BFS-1 critical facility was built at the Institute of Physics and Power Engineering (Obninsk, Russia) for full-scale modeling of fast-reactor cores, blankets, in-vessel shielding, and storage. Although BFS-1 is a fast-reactor assembly, it is very flexible and can easily be reconfigured to represent numerous other types of reactor designs. This paper describes specific problems in the calculation of neutron physics characteristics of integral experiments performed on the BFS facility. Analyses of the available integral experiments performed on different critical configurations of the BFS facility were carried out. Calculations of criticality, central reaction rate ratios, and fission rate distributions were performed with the MCNP5 Monte Carlo code using different files of evaluated nuclear data. MCNP calculations with a 299-group multigroup library were also made for comparison with the pointwise library calculations. (authors)

  9. A computer code TERFOC-N to calculate doses to the public due to atmospheric releases of radionuclides in normal operations of nuclear facilities

    International Nuclear Information System (INIS)

    A computer code, TERFOC-N, has been developed to calculate doses to the public due to atmospheric releases of radionuclides in normal operation of nuclear facilities. The code calculates the highest individual dose and the collective dose from four exposure pathways: internal doses from ingestion and inhalation, and external doses from cloudshine and groundshine. The foodchain models, originally based on U.S. Nuclear Regulatory Commission Regulatory Guide 1.109, have been improved to apply not only to LWRs but also to other nuclear facilities. This report describes the models employed and the computer code, and gives a sample run performed with the code. (author)

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  11. Distribution of lithostratigraphic units within the central block of Yucca Mountain, Nevada: A three-dimensional computer-based model, Version YMP.R2.0

    International Nuclear Information System (INIS)

    Yucca Mountain, Nevada, is underlain by 14.0 to 11.6 Ma volcanic rocks tilted eastward 3° to 20° and cut by faults that were primarily active between 12.7 and 11.6 Ma. A three-dimensional computer-based model of the central block of the mountain consists of seven structural subblocks composed of six formations and the interstratified-bedded tuffaceous deposits. Rocks from the 12.7 Ma Tiva Canyon Tuff, which forms most of the exposed rocks on the mountain, to the 13.1 Ma Prow Pass Tuff are modeled with 13 surfaces. Modeled units represent single formations such as the Pah Canyon Tuff, grouped units such as the combination of the Yucca Mountain Tuff with the superjacent bedded tuff, and divisions of the Topopah Spring Tuff such as the crystal-poor vitrophyre interval. The model is based on data from 75 boreholes from which a structure contour map at the base of the Tiva Canyon Tuff and isochore maps for each unit are constructed to serve as primary input. Modeling consists of an iterative cycle that begins with the primary structure-contour map from which isochore values of the subjacent model unit are subtracted to produce the structure contour map on the base of the unit. This new structure contour map forms the input for another cycle of isochore subtraction to produce the next structure contour map. In this method of solids modeling, the model units are represented by surfaces (structure contour maps), and all surfaces are stored in the model. Surfaces can be converted to form volumes of model units with additional effort. This lithostratigraphic and structural model can be used for (1) storing data from, and planning future, site characterization activities, (2) preliminary geometry of units for design of the Exploratory Studies Facility and potential repository, and (3) performance assessment evaluations.

  12. Medication supply to residential aged care facilities in Western Australia using a centralized medication chart to replace prescriptions

    Directory of Open Access Journals (Sweden)

    Hoti Kreshnik

    2012-07-01

    Background: The current model of medication supply to residential aged care facilities (RACFs) in Australia is dependent on paper-based prescriptions. This study is aimed at assessing the use of a centralized medication chart as a prescription-less model for supplying medications to RACFs. Methods: Two separate focus groups were conducted with general practitioners (GPs) and pharmacists, and another three with registered nurses (RNs) and carers combined. All focus group participants were working with RACFs. Audio-recorded data were compared with field notes, transcribed and imported into NVivo®, where they were thematically analyzed. Results: A prescription-less medication chart model was supported, and it appeared to have the potential to improve medication supply to RACF residents. Centralization of medication supply, clarification of medication orders and responding in real time to therapy changes made by GPs were reasons for supporting the medication chart model. Pharmacists preferred an electronic version of this model. All health professionals cautioned on the need for GPs to regularly review the medication chart and proposed a time interval of four to six months for this review to occur. Therapy changes during weekends appeared to be a potential difficulty for RNs and carers, whereas pharmacists cautioned about legible writing and about claiming of medications dispensed according to a paper-based model. GPs cautioned on the need to monitor the amount of medication dispensed by the pharmacy. Conclusion: The current use of paper prescriptions in nursing homes was identified as burdensome. A prescription-less medication chart model was suggested as a way to improve medication supply to RACF residents. An electronic version of this model could address the main potential difficulties raised.

  13. Animal facilities

    International Nuclear Information System (INIS)

    The animal facilities in the Division are described. They consist of kennels, animal rooms, service areas, and technical areas (examining rooms, operating rooms, pathology labs, x-ray rooms, and 60Co exposure facilities). The computer support facility is also described. The advent of the Conversational Monitor System at Argonne has launched a new effort to set up conversational computing and graphics software for users. The existing LS-11 data acquisition systems have been further enhanced and expanded. The divisional radiation facilities include a number of gamma, neutron, and x-ray radiation sources with accompanying areas for related equipment. There are five 60Co irradiation facilities; a research reactor, Janus, is a source for fission-spectrum neutrons; two other neutron sources in the Chicago area are also available to the staff for cell biology studies. The electron microscope facilities are also described

  14. Annual report to the Laser Facility Committee, 1982

    International Nuclear Information System (INIS)

    The report covers the work done at, or in association with, the Central Laser Facility during the year April 1981 to March 1982 under the headings: glass laser facility development, gas laser development, laser plasma interactions, transport and particle emission studies, ablative acceleration and compression studies, spectroscopy and XUV lasers, and theory and computation. Publications based on the work of the facility which have either appeared or been accepted for publication during the year are listed. (U.K.)

  15. Computer-assisted cartography using topographic properties: precision and accuracy of local soil maps in central Mexico

    Directory of Open Access Journals (Sweden)

    Gustavo Cruz-Cárdenas

    2011-06-01

    Map units directly related to soil-landscape properties are generated from local soil classes; therefore, taking farmers' knowledge into consideration is essential for automating the procedure. The aim of this study was to map local soil classes by computer-assisted cartography (CAC), using several combinations of topographic properties produced by GIS (digital elevation model, aspect, slope, and profile curvature). A decision tree was used to find the number of topographic properties required for digital cartography of the local soil classes. The maps produced were evaluated based on the attributes of map quality, defined as the precision and accuracy of the CAC-based maps. The evaluation was carried out in Central Mexico using three maps of local soil classes with contrasting landscape and climatic conditions (desert, temperate, and tropical). In the three areas, the precision (56%) of the CAC maps based on elevation as the topographic property was higher than when they were based on slope, aspect and profile curvature. The accuracy of the maps (boundary locations), however, was low (33%); in other words, further research is required to improve this indicator.
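
    A minimal sketch of the mapping step: a decision tree is trained on GIS-derived topographic properties (elevation, slope, aspect, profile curvature) at locations whose local soil class is known, and is then applied to unvisited raster cells. The synthetic data and scikit-learn classifier below are illustrative assumptions, not the procedure used in the study:

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)

        # Synthetic training points: [elevation (m), slope (deg), aspect (deg), profile curvature].
        n = 300
        X = np.column_stack([
            rng.uniform(1800, 2600, n),     # elevation
            rng.uniform(0, 25, n),          # slope
            rng.uniform(0, 360, n),         # aspect
            rng.normal(0, 0.5, n),          # profile curvature
        ])
        # Invented rule standing in for the farmers' local soil classes (e.g. by elevation and slope).
        y = np.where(X[:, 0] > 2300, "tepetate", np.where(X[:, 1] > 12, "barro", "polvo"))

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        print("training accuracy:", tree.score(X, y))

        # Classify a few unvisited raster cells described by the same four properties.
        cells = np.array([[2450, 5, 120, 0.1], [1950, 18, 300, -0.3], [2050, 4, 40, 0.0]])
        print("predicted local soil classes:", tree.predict(cells))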

  16. Focal axonal swellings and associated ultrastructural changes attenuate conduction velocity in central nervous system axons: a computer modeling study.

    Science.gov (United States)

    Kolaric, Katarina V; Thomson, Gemma; Edgar, Julia M; Brown, Angus M

    2013-08-01

    The constancy of action potential conduction in the central nervous system (CNS) relies on uniform axon diameter coupled with fidelity of the overlying myelin providing high-resistance, low capacitance insulation. Whereas the effects of demyelination on conduction have been extensively studied/modeled, equivalent studies on the repercussions for conduction of axon swelling, a common early pathological feature of (potentially reversible) axonal injury, are lacking. The recent description of experimentally acquired morphological and electrical properties of small CNS axons and oligodendrocytes prompted us to incorporate these data into a computer model, with the aim of simulating the effects of focal axon swelling on action potential conduction. A single swelling on an otherwise intact axon, as occurs in optic nerve axons of Cnp1 null mice caused a small decrease in conduction velocity. The presence of single swellings on multiple contiguous internodal regions (INR), as likely occurs in advanced disease, caused qualitatively similar results, except the dimensions of the swellings required to produce equivalent attenuation of conduction were significantly decreased. Our simulations of the consequences of metabolic insult to axons, namely, the appearance of multiple swollen regions, accompanied by perturbation of overlying myelin and increased axolemmal permeability, contained within a single INR, revealed that conduction block occurred when the dimensions of the simulated swellings were within the limits of those measured experimentally, suggesting that multiple swellings on a single axon could contribute to axonal dysfunction, and that increased axolemmal permeability is the decisive factor that promotes conduction block. PMID:24303138

  17. Computer-Enriched Instruction (CEI) Is Better for Preview Material Instead of Review Material: An Example of a Biostatistics Chapter, the Central Limit Theorem

    Science.gov (United States)

    See, Lai-Chu; Huang, Yu-Hsun; Chang, Yi-Hu; Chiu, Yeo-Ju; Chen, Yi-Fen; Napper, Vicki S.

    2010-01-01

    This study examines the timing using computer-enriched instruction (CEI), before or after a traditional lecture to determine cross-over effect, period effect, and learning effect arising from sequencing of instruction. A 2 x 2 cross-over design was used with CEI to teach central limit theorem (CLT). Two sequences of graduate students in nursing…

  18. Assessment of the structural shielding integrity of some selected computed tomography facilities in the Greater Accra Region of Ghana

    International Nuclear Information System (INIS)

    The structural shielding integrity was assessed for four of the CT facilities at Trust Hospital, Korle-Bu Teaching Hospital, the 37 Military Hospital and Medical Imaging Ghana Ltd. in the Greater Accra Region of Ghana. From the shielding calculations, the concrete wall thicknesses computed are 120, 145, 140 and 155 mm for Medical Imaging Ghana Ltd., 37 Military Hospital, Trust Hospital and Korle-Bu Teaching Hospital respectively, using default DLP values. The wall thicknesses using derived DLP values are 110, 110, 120 and 168 mm for Medical Imaging Ghana Ltd., 37 Military Hospital, Trust Hospital and Korle-Bu Teaching Hospital respectively. These values are within the accepted standard concrete thickness of 102-152 mm prescribed by the National Council on Radiation Protection and Measurements. The ultrasonic pulse testing indicated that all the sandcrete walls are of good quality and free of voids, since the pulse velocities estimated were approximately equal to 3.45 km/s. The average dose rate measured for supervised areas is 3.4 μSv/wk and for controlled areas is 18.0 μSv/wk. These dose rates were below the acceptable levels of 100 μSv per week for the occupationally exposed and 20 μSv per week for members of the public provided by the ICRU. The results mean that the structural shielding thicknesses are adequate to protect members of the public and occupationally exposed workers (au).

  19. Assessment of the integrity of structural shielding of four computed tomography facilities in the greater Accra region of Ghana

    International Nuclear Information System (INIS)

    The structural shielding thicknesses of the walls of four computed tomography (CT) facilities in Ghana were re-evaluated to verify the shielding integrity using the new shielding design methods recommended by the National Council on Radiation Protection and Measurements (NCRP). The shielding thickness obtained ranged from 120 to 155 mm using default DLP values proposed by the European Commission and 110 to 168 mm using derived DLP values from the four CT manufacturers. These values are within the accepted standard concrete wall thickness ranging from 102 to 152 mm prescribed by the NCRP. The ultrasonic pulse testing of all walls indicated that these are of good quality and free of voids since pulse velocities estimated were within the range of 3.496±0.005 km s⁻¹. An average dose equivalent rate estimated for supervised areas is 3.4±0.27 μSv week⁻¹ and that for the controlled area is 18.0±0.15 μSv week⁻¹, which are within acceptable values. (authors)

  20. Overview on CSNI Separate Effects Test Facility Matrices for Validation of Best Estimate Thermal-Hydraulic Computer Codes

    International Nuclear Information System (INIS)

    An internationally agreed separate effects test (SET) validation matrix for thermal-hydraulic system codes has been established by a sub-group of the Task Group on Thermal Hydraulic System Behaviour, as requested by the OECD/NEA Committee on Safety of Nuclear Installations (CSNI) Principal Working Group No. 2 on Coolant System Behaviour. The construction of such a matrix is an attempt to collect together, in a systematic way, the best sets of openly available test data for code validation, assessment and improvement, and also for quantitative code assessment with respect to the quantification of uncertainties in the modelling of individual phenomena by the codes. The methodology developed while establishing the CSNI SET validation matrix was itself an important outcome of the work. In the paper, this methodology is discussed in detail, together with examples from the SET matrix. In addition, the choices made from the 187 identified facilities covering the 67 phenomena are examined, together with some discussion of the database. Facilities and phenomena have been cross-referenced in a separate effects test cross-reference matrix, while the selected separate effects tests themselves are listed against the thermal-hydraulic phenomena for which they can provide validation data. As a preliminary to the classification of facilities and test data, it was necessary to identify a sufficiently complete list of relevant phenomena for LOCA and non-LOCA transient applications of PWRs and BWRs. The majority of these phenomena are also relevant to Advanced Water Cooled Reactors. To this end, 67 phenomena were identified for inclusion in the SET matrix and, in all, about 2094 tests are included in it. The SET matrix, as it stands, is representative of the major part of the experimental work which has been carried out in the LWR-safety thermal hydraulics field, covering a large number of
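
    The cross-reference matrix described here is, in essence, a facility-by-phenomenon lookup table from which suitable validation data sets can be selected. A minimal sketch of that data structure follows; the facility names and phenomena are illustrative placeholders, not entries from the actual CSNI SET matrix.

    # Minimal sketch: a separate-effects-test cross-reference matrix as a
    # facility-by-phenomenon lookup table (placeholder entries only).
    from collections import defaultdict

    # Phenomena for which each facility provides validation data (placeholders).
    matrix = {
        "Facility A": {"critical flow", "counter-current flow limitation"},
        "Facility B": {"critical flow", "horizontal SG heat transfer"},
        "Facility C": {"reflood heat transfer"},
    }

    # Invert the matrix: for each phenomenon, list the facilities covering it.
    by_phenomenon = defaultdict(list)
    for facility, phenomena in matrix.items():
        for p in phenomena:
            by_phenomenon[p].append(facility)

    for phenomenon, facilities in sorted(by_phenomenon.items()):
        print(f"{phenomenon}: covered by {', '.join(sorted(facilities))}")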

  1. Scientific workflow and support for high resolution global climate modeling at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V.; Mayer, B.; Wang, F.; Hack, J.; McKenna, D.; Hartman-Baker, R.

    2012-04-01

    The Oak Ridge Leadership Computing Facility (OLCF) facilitates the execution of computational experiments that require tens of millions of CPU hours (typically using thousands of processors simultaneously) while generating hundreds of terabytes of data. A set of ultra high resolution climate experiments in progress, using the Community Earth System Model (CESM), will produce over 35,000 files, ranging in size from 21 MB to 110 GB each. The execution of the experiments will require nearly 70 million CPU hours on the Jaguar and Titan supercomputers at OLCF. The total volume of the output from these climate modeling experiments will be in excess of 300 TB. This model output must then be archived, analyzed, distributed to the project partners in a timely manner, and also made available more broadly. Meeting this challenge would require efficient movement of the data, staging the simulation output to a large and fast file system that provides high volume access to other computational systems used to analyze the data and synthesize results. This file system also needs to be accessible via high speed networks to an archival system that can provide long term reliable storage. Ideally this archival system is itself directly available to other systems that can be used to host services making the data and analysis available to the participants in the distributed research project and to the broader climate community. The various resources available at the OLCF now support this workflow. The available systems include the new Jaguar Cray XK6 2.63 petaflops (estimated) supercomputer, the 10 PB Spider center-wide parallel file system, the Lens/EVEREST analysis and visualization system, the HPSS archival storage system, the Earth System Grid (ESG), and the ORNL Climate Data Server (CDS). The ESG features federated services, search & discovery, extensive data handling capabilities, deep storage access, and Live Access Server (LAS) integration. The scientific workflow enabled on
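
    The data-movement figures quoted above lend themselves to a simple capacity estimate; a minimal sketch follows, using the 300 TB output volume from the abstract and an assumed sustained network rate (the rate is an illustrative assumption, not an OLCF specification).

    # Minimal sketch: back-of-envelope transfer time for the quoted 300 TB of
    # model output. The sustained rate is an assumed figure, not an OLCF number.
    output_tb = 300.0                     # total model output quoted above
    sustained_rate_gbit_s = 10.0          # assumed sustained network rate

    total_bits = output_tb * 1e12 * 8
    seconds = total_bits / (sustained_rate_gbit_s * 1e9)
    print(f"{output_tb:.0f} TB at {sustained_rate_gbit_s:.0f} Gbit/s "
          f"~ {seconds / 86400:.1f} days of continuous transfer")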

  2. Computer code analysis of steam generator in thermal-hydraulic test facility simulating nuclear power plant; Ydinvoimalaitosta kuvaavan koelaitteiston hoeyrystimien analysointi tietokoneohjelmilla

    Energy Technology Data Exchange (ETDEWEB)

    Virtanen, E.

    1995-12-31

    In the study, three loss-of-feedwater type experiments which were performed with the PACTEL facility have been calculated with two computer codes. The purpose of the experiments was to gain information about the behaviour of a horizontal steam generator in a situation where the water level on the secondary side of the steam generator is decreasing. At the same time, data that can be used in the assessment of thermal-hydraulic computer codes were assembled. The purpose of the work was to study the capabilities of two computer codes, APROS version 2.11 and RELAP5/MOD3.1, to calculate the phenomena in a horizontal steam generator. In order to make the comparison of the calculation results easier, the same kind of model of the steam generator was made for both codes. Only the steam generator was modelled; the rest of the facility was given to the codes as a boundary condition. (23 refs.).

  3. Computed Tomography Scanning Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Advances research in the areas of marine geosciences, geotechnical, civil, and chemical engineering, physics, and ocean acoustics by using high-resolution,...

  4. Dynamic Thermal Loads and Cooling Requirements Calculations for VAC Systems in Nuclear Fuel Processing Facilities Using Computer Aided Energy Conservation Models

    International Nuclear Information System (INIS)

    In terms of nuclear safety, the most important function of ventilation air conditioning (VAC) systems is to maintain safe ambient conditions for components and structures important to safety inside the nuclear facility and to maintain appropriate working conditions for the plant's operating and maintenance staff. As part of a study aimed at evaluating the performance of the VAC system of a nuclear fuel cycle facility (NFCF), a computer model was developed and verified to evaluate the thermal loads and cooling requirements for the different zones of a fuel processing facility. The program is based on the transfer function method (TFM) and is used to calculate the dynamic heat gain through various multilayer wall constructions and windows, hour by hour, for any orientation of the building. The developed model was verified by comparing the calculated solar heat gain for a given building with the corresponding values calculated using the finite difference method (FDM) and the total equivalent temperature difference method (TETD). As an example, the developed program was used to calculate the cooling loads of the different zones of a typical nuclear fuel facility; the results showed that the cooling capacities of the different cooling units in each zone of the facility meet the design requirements according to safety regulations for nuclear facilities.
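
    The hourly transfer-function step can be pictured with a minimal sketch of a conduction-transfer-function (CTF) recursion for a single wall, assuming hypothetical coefficients and a made-up 24-hour sol-air temperature profile; it shows only the general form of the method, not the developed program or the facility's actual wall data.

    # Minimal sketch: hourly conduction heat gain through one wall using
    # conduction transfer functions (hypothetical coefficients and weather).
    import math

    b = [0.003, 0.035, 0.025, 0.004]   # CTF coefficients on past sol-air temps (W/m^2K)
    d = [1.0, -1.15, 0.28, -0.02]      # CTF coefficients on past fluxes (d[0] = 1)
    c_sum = sum(b)                     # steady-state consistency: sum(c) = sum(b)

    T_room = 24.0                      # indoor design temperature (deg C)
    # Hypothetical 24-hour sol-air temperature profile (deg C).
    T_solair = [30 + 12 * math.sin(2 * math.pi * (h - 9) / 24) for h in range(24)]

    q = [0.0, 0.0, 0.0]                # history of heat flux (W/m^2)
    history_T = [T_solair[0]] * len(b) # history of sol-air temperatures

    for hour in range(48):             # run two days so the recursion settles
        history_T = [T_solair[hour % 24]] + history_T[:-1]
        q_now = (sum(bi * Ti for bi, Ti in zip(b, history_T))
                 - sum(di * qi for di, qi in zip(d[1:], q))
                 - T_room * c_sum)
        q = [q_now] + q[:-1]
        if hour >= 24:
            print(f"hour {hour - 24:2d}: conduction heat gain = {q_now:6.2f} W/m^2")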

  5. Coldleg Loss of Coolant Accident (LOCA) Analysis of the Modified Reactor Thermohydraulic Test Facility Using CATHENA Computer Code

    International Nuclear Information System (INIS)

    A cold-leg LOCA analysis of the reactor thermal-hydraulic test facility using the CATHENA computer code has been conducted. The analysis was performed by modeling the reactor thermal-hydraulic test loop with the generic CATHENA components PUMP, VALVE, VOLUME, ACCUMULATOR, TANK, RESERVOIR, DISCHARGE, GENERALIZED TANK, and GENHTP. The primary system was simulated at a power of 1 MW with pressure and mass flow of 15.5 MPa and 9.4 kg/s, respectively. On the secondary side, feedwater flowed at 15.0 kg/s with a temperature of 27 °C and a pressure of 0.8 MPa. The calculation showed that during steady state the inlet and outlet temperatures of the test section were 121 °C and 146 °C. The steady-state calculation was followed by a transient calculation. The transient was triggered by a pipe break at the cold leg with a break diameter of 2 mm. Due to this break, the pressure decreased dramatically. When the pressure reached 4.2 MPa, the accumulator started supplying water to the system. A moment later, the pump was tripped because of the continuing pressure drop, which reached 4.0 MPa. As a consequence, the coolant flow also dropped. When the coolant flow reached 40 % of normal, the heated rod power was shut down. The calculation showed that during the transient the maximum coolant temperature was 159 °C and the maximum temperature of the heated rods was 223 °C. Based on these results, it can be concluded that the heated rods were not endangered during the transient. (author)

  6. Application of pathways analyses for site performance prediction for the Gas Centrifuge Enrichment Plant and Oak Ridge Central Waste Disposal Facility

    International Nuclear Information System (INIS)

    The suitability of the Gas Centrifuge Enrichment Plant and the Oak Ridge Central Waste Disposal Facility for shallow-land burial of low-level radioactive waste is evaluated using pathways analyses. The analyses rely on conservative scenarios to describe the generation and migration of contamination and the potential human exposure to the waste. Conceptual and numerical models are developed using data from comprehensive laboratory and field investigations and are used to simulate the long-term transport of contamination to man. Conservatism is built into the analyses when assumptions concerning future events have to be made or when uncertainties concerning site or waste characteristics exist. Maximum potential doses to man are calculated and compared to the appropriate standards. The sites are found to provide adequate buffer to persons outside the DOE reservations. Conclusions concerning site capacity and site acceptability are drawn. In reaching these conclusions, some consideration is given to the uncertainties and conservatisms involved in the analyses. Analytical methods to quantitatively assess the probability of future events to occur and the sensitivity of the results to data uncertainty may prove useful in relaxing some of the conservatism built into the analyses. The applicability of such methods to pathways analyses is briefly discussed. 18 refs., 9 figs

  7. Validation of Advanced Computer Codes for VVER Technology: LB-LOCA Transient in PSB-VVER Facility

    OpenAIRE

    M. Benčík; Zakutaev, M. O.; Zaitsev, S. I.; Schekoldin, V. I.; F. D'Auria; I. V. Elkin; Melikhov, O. I.; Adorni, M.; Del Nevo, A.

    2012-01-01

    The OECD/NEA PSB-VVER project provided unique and useful experimental data for code validation from PSB-VVER test facility. This facility represents the scaled-down layout of the Russian-designed pressurized water reactor, namely, VVER-1000. Five experiments were executed, dealing with loss of coolant scenarios (small, intermediate, and large break loss of coolant accidents), a primary-to-secondary leak, and a parametric study (natural circulation test) aimed at characterizing the VVER system...

  8. International standard problem (ISP) No. 41. Containment iodine computer code exercise based on a radioiodine test facility (RTF) experiment

    International Nuclear Information System (INIS)

    International Standard Problem (ISP) exercises are comparative exercises in which predictions of different computer codes for a given physical problem are compared with each other or with the results of a carefully controlled experimental study. The main goal of ISP exercises is to increase confidence in the validity and accuracy of the tools used in assessing the safety of nuclear installations. Moreover, they enable code users to gain experience and demonstrate their competence. The ISP No. 41 exercise, a computer code exercise based on a Radioiodine Test Facility (RTF) experiment on iodine behaviour in containment under severe accident conditions, is one such ISP exercise. The ISP No. 41 exercise arose from a recommendation of the Fourth Iodine Chemistry Workshop held at PSI, Switzerland in June 1996: 'the performance of an International Standard Problem as the basis of an in-depth comparison of the models as well as contributing to the database for validation of iodine codes'. [Proceedings NEA/CSNI/R(96)6, Summary and Conclusions NEA/CSNI/R(96)7]. COG (CANDU Owners Group), comprising AECL and the Canadian nuclear utilities, offered to make the results of a Radioiodine Test Facility (RTF) test available for such an exercise. The ISP No. 41 exercise was endorsed in turn by the FPC (PWG4's Task Group on Fission Product Phenomena in the Primary Circuit and the Containment), PWG4 (CSNI Principal Working Group on the Confinement of Accidental Radioactive Releases), and the CSNI. The OECD/NEA Committee on the Safety of Nuclear Installations (CSNI) has sponsored forty-five ISP exercises over the last twenty-four years, thirteen of them in the area of severe accidents. The criteria for the selection of the RTF test as the basis for the ISP-41 exercise were: (1) complementarity with other RTF tests available through the PHEBUS and ACE programmes, (2) simplicity for ease of modelling, and (3) good quality data. A simple RTF experiment performed under controlled

  9. Review of recent progress in laser plasma interaction and laser compression work at the Science Research Council Rutherford Laboratory Central Laser Facility

    International Nuclear Information System (INIS)

    The SRC two-beam Nd glass laser facility with output of up to 40 J per beam in 100 ps pulses and 80 J per beam in 1.6 ns pulses has been used to study interactions in the critical density region, energy transport in solid targets and compression of spherical targets. Magnetic field generation and ponderomotive force modifications to density gradients have been observed by optical probing using Faraday rotation and interferometry. Electron energy transport has been studied with layered targets. The ablation front plasma parameters and the ablation depth have been recorded for spherical targets and related to numerical modelling of the inhibited thermal transport. The temporal and spatial development of transmission in thin plane polymer film has been used as an indicator of lateral thermal transport. X-ray spectroscopy of compression cores in exploding pusher targets has been refined by new line profile calculations and improved computer fitting methods, and some further results are presented. Dense implosions created by exploding glass shell compression of a high density gas fill have been studied by pulsed X-ray shadowgraphy. 40 references. (UK)

  10. Experience with a mobile data storage device for transfer of studies from the critical care unit to a central nuclear medicine computer

    International Nuclear Information System (INIS)

    The introduction of mobile scintillation cameras has enabled the more immediate provision of nuclear medicine services in areas remote from the central nuclear medicine laboratory. Since a large number of such studies involve the use of a computer for data analysis, the concurrent problem of how to transmit those data to the computer becomes critical. A device is described that uses hard magnetic discs as the recording medium and can be wheeled from the patient's bedside to the central computer for playback. Some initial design problems, primarily associated with the critical timing which is necessary for the collection of gated studies, were overcome and the unit has been in service for the past two years. The major limitations are the relatively small capacity of the discs and the fact that the data are recorded in list mode. These constraints result in studies having poor statistical validity. The slow turn-around time, which results from the necessity to transport the system to the department and replay the study into the computer before analysis can begin, is also of particular concern. The use of this unit has clearly demonstrated the very important role that nuclear medicine can play in the care of the critically ill patient. The introduction of a complete acquisition and analysis unit is planned so that prompt diagnostic decisions can be made available within the intensive care unit. (author)

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of and for addressing issues and solutions to the main challenges facing CMS computing. The lack of manpower is particul...

  12. A rare case of dilated invaginated odontome with talon cusp in a permanent maxillary central incisor diagnosed by cone beam computed tomography

    International Nuclear Information System (INIS)

    It has been a challenge to establish the accurate diagnosis of developmental tooth anomalies based on periapical radiographs. Recently, three-dimensional imaging by cone beam computed tomography has provided useful information to investigate the complex anatomy of and establish the proper management for tooth anomalies. The most severe variant of dens invaginatus, known as dilated odontome, is a rare occurrence, and the cone beam computed tomographic findings of this anomaly have never been reported for an erupted permanent maxillary central incisor. The occurrence of talon cusp occurring along with dens invaginatus is also unusual. The aim of this report was to show the importance of cone beam computed tomography in contributing to the accurate diagnosis and evaluation of the complex anatomy of this rare anomaly.

  13. A rare case of dilated invaginated odontome with talon cusp in a permanent maxillary central incisor diagnosed by cone beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Jaya, Ranganathan; Kumar, Rangarajan Sundaresan Mohan; Srinivasan, Ramasamy [Dept. of Conservative Dentistry and Endodontics, Priyadarshini Dental College and Hospital, Chennai (India)

    2013-09-15

    It has been a challenge to establish the accurate diagnosis of developmental tooth anomalies based on periapical radiographs. Recently, three-dimensional imaging by cone beam computed tomography has provided useful information to investigate the complex anatomy of and establish the proper management for tooth anomalies. The most severe variant of dens invaginatus, known as dilated odontome, is a rare occurrence, and the cone beam computed tomographic findings of this anomaly have never been reported for an erupted permanent maxillary central incisor. The occurrence of talon cusp occurring along with dens invaginatus is also unusual. The aim of this report was to show the importance of cone beam computed tomography in contributing to the accurate diagnosis and evaluation of the complex anatomy of this rare anomaly.

  14. User guide to the SRS data logging facility

    International Nuclear Information System (INIS)

    The state of the SRS is recorded every two minutes, thus providing a detailed History of its parameters. Recording of History is done via the SRS Computer Network. This consists of a Master Computer, an Interdata 7/32, and three Minicomputers, Interdata 7/16s. Each of the Minicomputers controls one of the accelerators, Linac, Booster and Storage Ring. The Master Computer is connected to the Central Computer, an IBM 370/165, for jobs where greater computing power and storage are required. The Master Computer has a total of 20 Megabytes of fixed and movable disc space but only about 5 Megabytes are available for History storage. The Minicomputers have no storage facilities. The user guide is set out as follows: History filing system, History storage on the Master Computer, transfer of the History to the Central Computer, transferring History to tapes, job integrity, the SRS tape catalogue system. (author)

  15. Modeling Potential Carbon Monoxide Exposure Due to Operation of a Major Rocket Engine Altitude Test Facility Using Computational Fluid Dynamics

    Science.gov (United States)

    Blotzer, Michael J.; Woods, Jody L.

    2009-01-01

    This viewgraph presentation reviews computational fluid dynamics as a tool for modelling the dispersion of carbon monoxide at the Stennis Space Center's A3 Test Stand. The contents include: 1) Constellation Program; 2) Constellation Launch Vehicles; 3) J2X Engine; 4) A-3 Test Stand; 5) Chemical Steam Generators; 6) Emission Estimates; 7) Located in Existing Test Complex; 8) Computational Fluid Dynamics; 9) Computational Tools; 10) CO Modeling; 11) CO Model results; and 12) Next steps.

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  17. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television as many long feared, but the computer, the ubiquitous portal of work and personal lives. At this point, the computer is so common we don't notice it in our view. It's difficult to envision that not that long ago it was a gigantic, room-sized structure accessible only to a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  18. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  19. Automated Computer-Based Facility for Measurement of Near-Field Structure of Microwave Radiators and Scatterers

    DEFF Research Database (Denmark)

    Mishra, Shantnu R.; ; Pavlasek, Tomas J. F.; ; Muresan, Letitia V.

    1980-01-01

    An automatic facility for measuring the three-dimensional structure of the near fields of microwave radiators and scatterers is described. The amplitude and phase for different polarization components can be recorded in analog and digital form using a microprocessor-based system. The stored data..., and for the fields inside an absorber-lined chamber...

  20. Assessing the potential of rural and urban private facilities in implementing child health interventions in Mukono district, central Uganda–a cross sectional study

    OpenAIRE

    Rutebemberwa, Elizeus; Buregyeya, Esther; Lal, Sham; Clarke, Sîan E.; Hansen, Kristian S; Magnussen, Pascal; LaRussa, Philip; Mbonye, Anthony K

    2016-01-01

    Background: Private facilities are the first place of care seeking for many sick children. Involving these facilities in child health interventions may provide opportunities to improve child welfare. The objective of this study was to assess the potential of rural and urban private facilities in diagnostic capabilities, operations and human resource in the management of malaria, pneumonia and diarrhoea. Methods: A survey was conducted in pharmacies, private clinics and drug shops in Mukono dist...

  1. Expertise concerning the request by the ZWILAG Intermediate Storage Facility Wuerenlingen AG for granting of a licence for the building and operation of the Central Intermediate Storage Facility for radioactive wastes

    International Nuclear Information System (INIS)

    On July 15, 1993, the Intermediate Storage Facility Wuerenlingen AG (ZWILAG) submitted a request to the Swiss Federal Council for granting of a license for the construction and operation of a central intermediate storage facility for radioactive wastes. The project foresees intermediate storage halls as well as conditioning and incineration installations. The Federal Agency for the Safety of Nuclear Installations (HSK) has to examine the project from the point of view of nuclear safety. The present report presents the results of this examination. Different waste types have to be treated in ZWILAG: spent fuel assemblies from Swiss nuclear power plants (KKWs); vitrified, highly radioactive wastes from reprocessing; intermediate and low-level radioactive wastes from KKWs and from reprocessing; wastes from the dismantling of nuclear installations; wastes from medicine, industry and research. The wastes are partitioned into three categories: high-level (HAA) radioactive wastes containing, amongst others, α-active nuclides, intermediate-level (MAA) radioactive wastes and low-level (SAA) radioactive wastes. The projected installation consists of three repository halls for each waste category, a hot cell, a conditioning plant and an incineration and melting installation. The HAA repository can accept 200 transport and storage containers with vitrified high-level wastes or spent fuel assemblies. The expected radioactivity amounts to 10²⁰ Bq, including 10¹⁸ Bq of α-active nuclides. The thermal power produced by decay is released to the environment by natural circulation of air. The ventilation system is designed for a maximum power of 5.8 MW. Severe conditions are imposed on the containers as far as tightness and shielding against radiation are concerned. In the repository for MAA wastes the maximum radioactivity is 10¹⁸ Bq with 10¹⁵ Bq of α-active nuclides. The maximum thermal power of 250 kW is removed by forced air cooling. Because of the high level of radiation the

  2. Non-symbolically mediated programming environments for the development of computational thinking. An experience in the training of computer science professors of Ecuador Central University

    OpenAIRE

    Pérez Narváez, Hamilton Omar; Roig-Vila, Rosabel

    2015-01-01

    This article addresses research carried out with first-semester students of the Computer Science degree programme of the Facultad de Filosofía, Letras y Ciencias de la Educación of the Universidad Central del Ecuador, whose purpose was to analyse the use of non-symbolically mediated programming environments as a didactic tool for the development of computational thinking. It seeks to establish the possible advantages of applying this type of environment so that students...

  3. Integrated computer network high-speed parallel interface

    International Nuclear Information System (INIS)

    As the number and variety of computers within Los Alamos Scientific Laboratory's Central Computer Facility grow, the need for a standard, high-speed intercomputer interface has become more apparent. This report details the development of a High-Speed Parallel Interface from conceptual through implementation stages to meet current and future needs for large-scale network computing within the Integrated Computer Network. 4 figures

  4. Focal axonal swellings and associated ultrastructural changes attenuate conduction velocity in central nervous system axons: a computer modeling study

    OpenAIRE

    Kolaric, Katarina V; Thomson, Gemma; Edgar, Julia M; Brown, Angus M.

    2013-01-01

    The constancy of action potential conduction in the central nervous system (CNS) relies on uniform axon diameter coupled with fidelity of the overlying myelin providing high-resistance, low capacitance insulation. Whereas the effects of demyelination on conduction have been extensively studied/modeled, equivalent studies on the repercussions for conduction of axon swelling, a common early pathological feature of (potentially reversible) axonal injury, are lacking. The recent description of ex...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  8. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  10. Computational approaches to the prediction of blood-brain barrier permeability: A comparative analysis of central nervous system drugs versus secretase inhibitors for Alzheimer's disease.

    Science.gov (United States)

    Rishton, Gilbert M; LaBonte, Kristen; Williams, Antony J; Kassam, Karim; Kolovanov, Eduard

    2006-05-01

    This review summarizes progress made in the development of fully computational approaches to the prediction of blood-brain barrier (BBB) permeability of small molecules, with a focus on rapid computational methods suitable for the analysis of large compound sets and virtual screening. A comparative analysis using the recently developed Advanced Chemistry Development, Inc. (ACD/Labs) BBB permeability algorithm for the calculation of logBB values for known Alzheimer's disease medicines, selected central nervous system drugs and new secretase inhibitors for Alzheimer's disease, is presented. The trends in logBB values and the associated physicochemical properties of these agents as they relate to the potential for BBB permeability are also discussed. PMID:16729726
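
    A rapid, fully computational logBB screen of the kind reviewed here typically reduces to a small regression over a few physicochemical descriptors. A minimal sketch follows, assuming an illustrative linear relation in clogP and polar surface area with made-up coefficients; it is not the ACD/Labs algorithm discussed in the review.

    # Minimal sketch: descriptor-based logBB screen for a compound set.
    # The linear relation, its coefficients and the descriptor values are
    # illustrative placeholders, not the ACD/Labs BBB permeability algorithm.

    def log_bb(clogp: float, tpsa: float) -> float:
        """Toy logBB estimate from lipophilicity (clogP) and topological polar
        surface area (TPSA, in square Angstroms); coefficients are hypothetical."""
        return 0.15 * clogp - 0.015 * tpsa + 0.14

    # Hypothetical descriptor values for a small screening set.
    compounds = {
        "candidate_1": {"clogp": 2.8, "tpsa": 45.0},
        "candidate_2": {"clogp": 1.1, "tpsa": 95.0},
        "candidate_3": {"clogp": 3.9, "tpsa": 70.0},
    }

    for name, desc in compounds.items():
        value = log_bb(desc["clogp"], desc["tpsa"])
        # Common rule of thumb, used here only as a label: logBB > 0.3 suggests
        # good CNS penetration, logBB < -1 suggests poor penetration.
        label = "CNS+" if value > 0.3 else ("CNS-" if value < -1.0 else "borderline")
        print(f"{name}: logBB ~ {value:+.2f} ({label})")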

  11. Developing and implementing a computer assisted emergency facility for assessing off-site consequences due to accidents in UK nuclear power reactors

    International Nuclear Information System (INIS)

    This paper outlines considerations in the development of the RAD computer code as used in the Emergency Room at HM NII for assessing off-site consequences of accidents in UK civil nuclear power reactors. A wide range of requirements have been accommodated within the facility, particularly the need of HM NII to meet its responsibilities by producing realistic and timely estimates of a suitably high quality for propagating advice. The development of the computer code has required the balancing of many competing factors. Valuable experience has been gained by using the code during emergency exercises. Importance is laid on the feedback of field measurements to enhance the accuracy of estimated radiological consequences. (author)

  12. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  13. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  15. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  18. Development of application program and building database to increase facilities for using the radiation effect assessment computer codes

    International Nuclear Information System (INIS)

    The current radiation effect assessment system requires skill in applying the various codes and a high level of specialized knowledge in each field. In practice, it is therefore very difficult for radiation users who lack such specialized knowledge to assess or interpret radiation effects properly. To address this, we previously developed five Windows-based computer codes, constituting the radiation effect assessment system, for fields that use radiation, including nuclear power generation. A computer program is needed that allows non-specialists to use these five codes with ease. We therefore implemented an AI-based specialist system that can infer the appropriate assessment approach by itself, according to the characteristics of a given problem. The specialist program can guide users, search data, and put questions directly to the administrator. Conceptually, considering the circumstances a user applying the five computer codes may actually encounter, the implementation addressed the following aspects. First, access to the necessary concepts and data had to be improved. Second, the relevant theory and the corresponding computer code had to be easy to obtain and use. Third, a question-and-answer function was needed for resolving user questions not considered previously. Finally, the database had to be continuously updated. To meet these needs, we developed a client program that organizes reference data, provides a query-based access method to the organized data, and displays the retrieved data. An instruction method (an effective procedure for acquiring the theory underlying the five computer codes) was implemented, and a data-management program (DBMS) was developed so that the data can easily be kept up to date. For the Q and A function, a Q and A board was embedded in the client program so that users can search the content of questions and answers. (authors)
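
    The inference step of such a specialist (expert) system can be pictured as a small rule base that maps the characteristics of a user's problem to one of the assessment codes; a minimal sketch follows, with hypothetical rule conditions and code names, since the record does not identify the five codes or their actual selection logic.

    # Minimal sketch: rule-based selection of an assessment code from problem
    # characteristics. Code names and rules are hypothetical placeholders.

    RULES = [
        # (predicate on the problem description, recommended code)
        (lambda p: p["exposure"] == "external" and p["pathway"] == "air",
         "CODE_A (atmospheric external dose)"),
        (lambda p: p["exposure"] == "internal" and p["pathway"] == "ingestion",
         "CODE_B (ingestion dose)"),
        (lambda p: p["facility"] == "nuclear power plant",
         "CODE_C (reactor effluent assessment)"),
    ]
    DEFAULT = "refer the question to the administrator (Q&A board)"

    def recommend(problem: dict) -> str:
        """Return the first rule-matched code, else fall back to the Q&A path."""
        for predicate, code in RULES:
            if predicate(problem):
                return code
        return DEFAULT

    query = {"exposure": "internal", "pathway": "ingestion",
             "facility": "research laboratory"}
    print("recommended:", recommend(query))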

  19. Unifying model of carpal mechanics based on computationally derived isometric constraints and rules-based motion - the stable central column theory.

    Science.gov (United States)

    Sandow, M J; Fisher, T J; Howard, C Q; Papas, S

    2014-05-01

    This study was part of a larger project to develop a (kinetic) theory of carpal motion based on computationally derived isometric constraints. Three-dimensional models were created from computed tomography scans of the wrists of ten normal subjects and carpal spatial relationships at physiological motion extremes were assessed. Specific points on the surface of the various carpal bones and the radius that remained isometric through range of movement were identified. Analysis of the isometric constraints and intercarpal motion suggests that the carpus functions as a stable central column (lunate-capitate-hamate-trapezoid-trapezium) with a supporting lateral column (scaphoid), which behaves as a 'two gear four bar linkage'. The triquetrum functions as an ulnar translation restraint, as well as controlling lunate flexion. The 'trapezoid'-shaped trapezoid places the trapezium anterior to the transverse plane of the radius and ulna, and thus rotates the principal axis of the central column to correspond to that used in the 'dart thrower's motion'. This study presents a forward kinematic analysis of the carpus that provides the basis for the development of a unifying kinetic theory of wrist motion based on isometric constraints and rules-based motion. PMID:24072199
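
    The computationally derived isometric constraints described above can, in principle, be found by searching for pairs of surface points on different bones whose separation stays nearly constant across all scanned wrist poses. A minimal sketch of that search follows, using synthetic coordinates in place of the CT-derived bone surfaces; the point labels, tolerance and data are illustrative, not the study's values.

    # Minimal sketch: find near-isometric point pairs across wrist poses.
    # Synthetic coordinates stand in for CT-derived bone-surface points.
    import numpy as np

    rng = np.random.default_rng(1)
    n_poses = 8

    # Hypothetical 3-D positions of candidate points on two bones, per pose.
    radius_pts = {"R1": rng.normal(0, 1, (n_poses, 3)),
                  "R2": rng.normal(0, 1, (n_poses, 3))}
    scaphoid_pts = {"S1": rng.normal(5, 1, (n_poses, 3)),
                    "S2": rng.normal(5, 1, (n_poses, 3))}
    # Force one pair to be exactly isometric, for demonstration only.
    scaphoid_pts["S1"] = radius_pts["R1"] + np.array([3.0, 0.0, 0.0])

    def distance_spread(a: np.ndarray, b: np.ndarray) -> float:
        """Range of the inter-point distance across poses (0 = truly isometric)."""
        d = np.linalg.norm(a - b, axis=1)
        return float(d.max() - d.min())

    TOLERANCE_MM = 0.5                      # illustrative isometry tolerance
    for r_name, r in radius_pts.items():
        for s_name, s in scaphoid_pts.items():
            spread = distance_spread(r, s)
            tag = "isometric candidate" if spread < TOLERANCE_MM else "varies"
            print(f"{r_name}-{s_name}: distance spread {spread:5.2f} -> {tag}")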

  20. Temporal evaluation of computed tomographic scans at a Level 1 trauma department in a central South African hospital

    OpenAIRE

    Tony Tiemesmann; Jacques Raubenheimer; Coert de Vries

    2016-01-01

    Background: Time is a precious commodity, especially in the trauma setting, which requires continuous evaluation to ensure streamlined service delivery, quality patient care and employee efficiency. Objectives: The present study analyses the authors’ institution’s multi-detector computed tomography (MDCT) scan process as part of the imaging turnaround time of trauma patients. It is intended to serve as a baseline for the institution, to offer a comparison with institutions worldwide and to imp...

  1. FMIT facility control system

    International Nuclear Information System (INIS)

    The control system for the Fusion Materials Irradiation Test (FMIT) Facility, under construction at Richland, Washington, uses current techniques in distributed processing to achieve responsiveness, maintainability and reliability. Developmental experience with the system on the FMIT Prototype Accelerator (FPA) being designed at the Los Alamos National Laboratory is described as a function of the system's design goals and details. The functional requirements of the FMIT control system dictated the use of a highly operator-responsive, display-oriented structure, using state-of-the-art console devices for man-machine communications. Further, current technology has allowed the movement of device-dependent tasks into the area traditionally occupied by remote input-output equipment; the system's dual central process computers communicate with remote communications nodes containing microcomputers that are architecturally similar to the top-level machines. The system has been designed to take advantage of commercially available hardware and software

  2. Design/Installation and Structural Integrity Assessment of Bethel Valley Low-Level Waste collection and transfer system upgrade for Building 3092 (central off-gas scrubber facility) at Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    This document describes and assesses planned modifications to be made to the Building 3092 Central Off-Gas Scrubber Facility of the Oak Ridge National Laboratory, Oak Ridge, Tennessee. The modifications are made in response to the requirements of 40CFR264 Subpart J, relating to environmental protection requirements for buried tank systems. The modifications include the provision of a new scrubber recirculation tank in a new, below ground, lined concrete vault, replacing an existing recirculation sump that does not provide double containment. A new buried, double contained pipeline is provided to permit discharge of spent scrubber recirculation fluid to the Central Waste Collection Header. The new vault, tank, and discharge line are provided with leak detection and provisions to remove accumulated liquid. New scrubber recirculation pumps, piping, and accessories are also provided. This assessment concludes that the planned modifications comply with applicable requirements of 40CFR264 Subpart J, as set forth in Appendix F to the Federal Facility Agreement, Docket No. 89-04-FF, covering the Oak Ridge Reservation. A formal design certification statement is included herein on Page 53; a certification covering the installation shall be executed prior to placing the modified facility into service

  3. The John von Neumann Institute for Computing (NIC): A survey of its supercomputer facilities and its Europe-wide computational science activities

    International Nuclear Information System (INIS)

    The John von Neumann Institute for Computing (NIC) at the Research Centre Juelich, Germany, is one of the leading supercomputing centres in Europe. Founded as a national centre in the mid-eighties it now provides more and more resources to European scientists. This happens within EU-funded projects (I3HP, DEISA) or Europe-wide scientific collaborations. Beyond these activities NIC started an initiative towards the new EU member states in summer 2004. Outstanding research groups are offered to exploit the supercomputers at NIC to accelerate their investigations on leading-edge technology. The article gives an overview of the organisational structure of NIC, its current supercomputer systems, and its user support. Transnational Access (TA) within I3HP is described as well as access by the initiative for new EU member states. The volume of these offers and the procedure of how to apply for supercomputer resources is introduced in detail

  4. The John von Neumann Institute for Computing (NIC): A survey of its supercomputer facilities and its Europe-wide computational science activities

    Science.gov (United States)

    Attig, N.

    2006-03-01

    The John von Neumann Institute for Computing (NIC) at the Research Centre Jülich, Germany, is one of the leading supercomputing centres in Europe. Founded as a national centre in the mid-eighties it now provides more and more resources to European scientists. This happens within EU-funded projects (I3HP, DEISA) or Europe-wide scientific collaborations. Beyond these activities NIC started an initiative towards the new EU member states in summer 2004. Outstanding research groups are offered to exploit the supercomputers at NIC to accelerate their investigations on leading-edge technology. The article gives an overview of the organisational structure of NIC, its current supercomputer systems, and its user support. Transnational Access (TA) within I3HP is described as well as access by the initiative for new EU member states. The volume of these offers and the procedure of how to apply for supercomputer resources is introduced in detail.

  5. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    Energy Technology Data Exchange (ETDEWEB)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables.

  6. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    International Nuclear Information System (INIS)

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables

  7. A mathematical model using computed tomography for the diagnosis of metastatic central compartment lymph nodes in papillary thyroid carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Tianrun [The Sixth Affiliated Hospital of Sun Yat-sen University, Guangzhou (China); Su, Xuan; Chen, Weichao; Zheng, Lie; Li, Li; Yang, AnKui [Sun Yat-sen University Cancer Center, Guangzhou (China)

    2014-11-15

    The purpose of this study was to establish a potential mathematical model for the diagnosis of central compartment lymph node (LN) metastases of papillary thyroid carcinoma (PTC) using CT imaging. 303 patients with PTC were enrolled. We determined the optimal cut-off points of LN size and nodal grouping by calculating the diagnostic value of each cut-off point. Then, calcification, cystic or necrotic change, abnormal enhancement, size, and nodal grouping were analysed by univariate and multivariate statistical methods. The mathematical model was obtained using binary logistic regression analysis, and a scoring system was developed for convenient use in clinical practice. An area of 30 mm² for LN size and two LNs as the nodal grouping criterion had the best diagnostic value. The mathematical model was: p = e^y/(1 + e^y), y = -0.670 - 0.087 × size + 1.010 × cystic or necrotic change + 1.371 × abnormal enhancement + 0.828 × nodal grouping + 0.909 × area. We assigned the values for cystic or necrotic change, abnormal enhancement, size, and nodal grouping as 25, 33, 20, and 22, respectively, yielding a scoring system. This mathematical model has a high diagnostic value and is a convenient clinical tool. (orig.)
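
    The logistic form of the model lends itself to a quick numerical check. The Python sketch below evaluates the published regression and the simplified point score; the coding of the predictors (0/1 indicators for the qualitative findings, and the exact meaning of the separate 'size' and 'area' terms) is an assumption made here for illustration, not something the record specifies.

      import math

      def metastasis_probability(size, cystic_or_necrotic, abnormal_enhancement,
                                 nodal_grouping, area):
          # Logistic model quoted in the record: p = e^y / (1 + e^y).
          # Qualitative predictors are assumed to be coded as 0/1 indicators.
          y = (-0.670
               - 0.087 * size
               + 1.010 * cystic_or_necrotic
               + 1.371 * abnormal_enhancement
               + 0.828 * nodal_grouping
               + 0.909 * area)
          return math.exp(y) / (1.0 + math.exp(y))

      def point_score(cystic_or_necrotic, abnormal_enhancement, size_positive, nodal_grouping):
          # Simplified score from the record: 25, 33, 20 and 22 points for the
          # four findings, respectively (all assumed to be 0/1 indicators here).
          return (25 * cystic_or_necrotic + 33 * abnormal_enhancement
                  + 20 * size_positive + 22 * nodal_grouping)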

  8. Annual report to the Laser Facility Committee 1980

    International Nuclear Information System (INIS)

    The work done at, or in conjunction with, the Central Laser Facility during the year April 1979 to March 1980, is reported and discussed. The fields covered in the seven chapters are: (1) Glass laser facility development. (2) Gas laser development. (3) Laser plasma interaction. (4) Transport and particle emission. (5) Ablative compression studies. (6) X-ray spectroscopy and XUV lasers. (7) Theory and computation. Publications based on work at, or in conjunction with, the Facility, which have appeared or been accepted for publication during the year are listed in an appendix. (U.K.)

  9. Report on the control of the safety and security of nuclear facilities. Part 2: the reconversion of military plutonium stocks. The use of the helps given to central and eastern Europe countries and to the new independent states

    International Nuclear Information System (INIS)

    This report deals with two different aspects of the safety and security of nuclear facilities. The first aspect concerns the reconversion of weapon-grade plutonium stocks: the plutonium in excess, plutonium hazards and nuclear fuel potentialities, the US program, the Russian program, the actions of European countries (France, Germany), the intervention of other countries, the unanswered questions (political aspects, uncertainties), the solutions of the future (improvement of reactors, the helium-cooled high temperature reactor technology (gas-turbine modular helium reactor: GT-MHR), Carlo Rubbia's project). The second aspect concerns the actions carried out by the European Union in favor of the civil nuclear facilities of central and eastern Europe: the European Union competencies through the Euratom treaty, the conclusions of the European audit office about the PHARE and TACIS nuclear programs, the status of committed actions, the coming planned actions, and the critical analysis of the policy adopted so far. (J.S.)

  10. Liquid Effluent Retention Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Liquid Effluent Retention Facility (LERF) is located in the central part of the Hanford Site. LERF is permitted by the State of Washington and has three liquid...

  11. Central odontogenic fibroma (simple type) in a four year old boy: Atypical cone-beam computed tomographic appearance with periosteal reaction

    International Nuclear Information System (INIS)

    Central odontogenic fibroma (COF) is a rare benign tumor that accounts for 0.1% of all odontogenic tumors. A case of COF (simple type) of the mandible in a four-year-old boy is described in this report. The patient showed asymptomatic swelling in the right inferior border of the lower jaw for one week. A panoramic radiograph showed a poorly-defined destructive unilocular radiolucent area. Cone-beam computed tomography showed expansion and perforation of the adjacent cortical bone plates. A periosteal reaction with the Codman triangle pattern was clearly visible in the buccal cortex. Since the tumor had destroyed a considerable amount of bone, surgical resection was performed. No recurrence was noted

  12. Retrospective study of root canal configurations of maxillary third molars in a Central India population using cone beam computed tomography: Part I

    Science.gov (United States)

    Rawtiya, Manjusha; Somasundaram, Pavithra; Wadhwani, Shefali; Munuga, Swapna; Agarwal, Manish; Sethi, Priyank

    2016-01-01

    Objective: The aim of this study was to investigate the root and canal morphology of maxillary third molars in a Central India population using cone-beam computed tomography (CBCT) analysis. Materials and Methods: CBCT images of 116 maxillary third molars were observed, and data regarding the number of roots, the number of canals, and Vertucci's classification for each root were statistically evaluated. Results: The majority of maxillary third molars had three roots (55.2%) and three canals (37.9%). Most mesiobuccal (MB) roots (43.8%), distobuccal (DB) roots (87.5%), and palatal roots (100%) of maxillary third molars had a Vertucci Type I configuration. The mesiobuccal roots of three-rooted maxillary third molars had Vertucci Type I (43.8%) and Type IV (40.6%) configurations. The overall prevalence of C-shaped canals in maxillary third molars was 3.4%. Conclusion: There was a high prevalence of three-rooted maxillary molars with three canals. PMID:27011747

  13. Central odontogenic fibroma (simple type) in a four year old boy: Atypical cone-beam computed tomographic appearance with periosteal reaction

    Energy Technology Data Exchange (ETDEWEB)

    Anbiaee, Najme; Ebrahimnejad, Hamed; Sanaei, Alireza [Dept. of Oral and Maxillofacial Radiology, Maxillofacial Diseases Research Center, School of Dentistry, Mashhad University of Medical Sciences, Mashhad (Iran, Islamic Republic of)

    2015-06-15

    Central odontogenic fibroma (COF) is a rare benign tumor that accounts for 0.1% of all odontogenic tumors. A case of COF (simple type) of the mandible in a four-year-old boy is described in this report. The patient showed asymptomatic swelling in the right inferior border of the lower jaw for one week. A panoramic radiograph showed a poorly-defined destructive unilocular radiolucent area. Cone-beam computed tomography showed expansion and perforation of the adjacent cortical bone plates. A periosteal reaction with the Codman triangle pattern was clearly visible in the buccal cortex. Since the tumor had destroyed a considerable amount of bone, surgical resection was performed. No recurrence was noted.

  14. A computer code to estimate accidental fire and radioactive airborne releases in nuclear fuel cycle facilities: User's manual for FIRIN

    Energy Technology Data Exchange (ETDEWEB)

    Chan, M.K.; Ballinger, M.Y.; Owczarski, P.C.

    1989-02-01

    This manual describes the technical bases and use of the computer code FIRIN. This code was developed to estimate the source term release of smoke and radioactive particles from potential fires in nuclear fuel cycle facilities. FIRIN is a product of a broader study, Fuel Cycle Accident Analysis, which Pacific Northwest Laboratory conducted for the US Nuclear Regulatory Commission. The technical bases of FIRIN consist of a nonradioactive fire source term model, compartment effects modeling, and radioactive source term models. These three elements interact with each other in the code affecting the course of the fire. This report also serves as a complete FIRIN user's manual. Included are the FIRIN code description with methods/algorithms of calculation and subroutines, code operating instructions with input requirements, and output descriptions. 40 refs., 5 figs., 31 tabs.

  15. Description of NORMTRI: a computer program for assessing the off-site consequences from air-borne releases of tritium during normal operation of nuclear facilities

    International Nuclear Information System (INIS)

    The computer program NORMTRI has been developed to calculate the environmental behaviour of tritium released into the atmosphere during normal operation of nuclear facilities. The two chemical forms, tritium gas and tritiated water vapour, can be investigated. The conversion of tritium gas into tritiated water, followed by its re-emission to the atmosphere, as well as the conversion into organically bound tritium, is considered. NORMTRI is based on the statistical Gaussian dispersion model ISOLA, which calculates the near-ground activity concentration in air and the ground contamination due to dry and wet deposition at specified locations in a polar grid system. ISOLA requires a four-parameter meteorological statistic derived from one or more years of synoptic recordings of 1-hour averages of wind speed, wind direction, stability class and precipitation intensity. An additional feature of NORMTRI is the possibility to choose among several dose calculation procedures, ranging from the equations of the German regulatory guidelines to a pure specific-equilibrium approach. (orig.)

  16. ORION-WIN. A computer code to estimate environmental concentration and radiation dose due to airborne discharge of radioactive materials from nuclear facilities

    International Nuclear Information System (INIS)

    A computer code, ORION-WIN, has been developed to estimate environmental concentrations and the radiation dose to the public due to airborne discharge of radioactive materials from multiple sources at nuclear fuel cycle facilities. A modified Gaussian plume model is applied to calculate atmospheric dispersion of the discharged radioactive material. Plume depletion processes such as gravitational settling, dry deposition, precipitation scavenging and radioactive decay are considered, as are resuspension from the ground and production of progeny from the parent radionuclides. Inhalation and oral intake are considered as internal pathways, and submersion in the radioactive cloud and external exposure to the contaminated ground surface as external pathways. The radiation dose to an individual is calculated. Because ORION-WIN has many options, it is applicable to sensitivity studies in the safety assessment of nuclear fuel cycle facilities and to dose assessment based on the amount of radioactive material discharged from the stack. ORION-WIN is an updated version of ORION-II; it runs on a PC with the Windows OS and has a graphical user interface for entering parameters and browsing output files. (author)
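
    For orientation only, the Python sketch below evaluates the textbook Gaussian plume concentration with ground reflection, which is the starting point for dispersion codes of this kind; it is a generic illustration and does not reproduce ORION-WIN's modified model or its depletion, resuspension and progeny terms.

      import math

      def plume_concentration(Q, u, sigma_y, sigma_z, y, z, H):
          # Textbook Gaussian plume with ground reflection:
          #   C = Q / (2*pi*u*sy*sz) * exp(-y^2 / (2*sy^2))
          #       * [exp(-(z-H)^2 / (2*sz^2)) + exp(-(z+H)^2 / (2*sz^2))]
          # Q: release rate, u: wind speed, sigma_y/sigma_z: dispersion parameters
          # at the downwind distance of interest, H: effective release height,
          # y/z: crosswind and vertical receptor coordinates.
          lateral = math.exp(-y**2 / (2 * sigma_y**2))
          vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                      + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
          return Q * lateral * vertical / (2 * math.pi * u * sigma_y * sigma_z)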

  17. Quality assessment of outsourced after-hours computed tomography teleradiology reports in a Central London University Hospital

    International Nuclear Information System (INIS)

    The study was designed to assess the quality of outsourced after-hours computed tomography teleradiology service reports. We evaluated 1028 patients over a period of five months in 2009/2010 (437 female, 591 male, mean age: 51 years, range: 0–97 years) who were referred either by the A&E or other in-house departments from 7 pm to 8 am for different reasons. Reporting was done by a teleradiology service provider located in the UK and Australia. Reports were assessed during the routinely performed morning meeting by a panel of in-house radiologists. Assessment was done using a five-point agreement scale (5 = “No disagreement”, 1 = “…unequivocal potential for serious morbidity or threat to life”). In 811 (79%) patients no disagreement was found; 164 (16%) were rated as category 4 and 40 (4%) as category 3 (“…likelihood of harm is low”). In 13 (1.3%) patients a decision of category 2 was made (“…strong likelihood of moderate morbidity but not threat to life”). No category 1 decision was made. As this was just a discrepancy decision, a follow-up of the category 2 patients was done over a period of a maximum of 6 months. In 8 (0.8%) patients the in-house reports were correct, in 2 (0.2%) patients the teleradiology service provider was right, and in 3 (0.3%) patients the final diagnoses remained unclear. In conclusion, there was a small rate (0.8%) of proven serious misinterpretations by the teleradiology service provider, but this was lower than in comparable studies with preliminary in-house staff reports (1.6–24.6%).

  18. Temporal evaluation of computed tomographic scans at a Level 1 trauma department in a central South African hospital

    Directory of Open Access Journals (Sweden)

    Tony Tiemesmann

    2016-03-01

    Full Text Available Background: Time is a precious commodity, especially in the trauma setting, which requires continuous evaluation to ensure streamlined service delivery, quality patient care and employee efficiency. Objectives: The present study analyses the authors' institution's multi-detector computed tomography (MDCT) scan process as part of the imaging turnaround time of trauma patients. It is intended to serve as a baseline for the institution, to offer a comparison with institutions worldwide and to improve service delivery. Method: Relevant categorical data were collected from the trauma patient register and radiological information system (RIS) from 01 February 2013 to 31 January 2014. A population of 1107 trauma patients who received an MDCT scan was included in the study. Temporal data were analysed as a continuum with reference to triage priority, time of day, type of CT scan and admission status. Results: The median trauma arrival to MDCT scan time (TTS) and reporting turnaround time (RTAT) were 69 (39–126) and 86 (53–146) minutes respectively. TTS was subdivided into the time from the patient's arrival at trauma to the radiology referral (TTRef) and from submission of the radiology request to arrival at the MDCT location (RefTS). TTRef was statistically significantly longer than RefTS (p < 0.0001). RTAT was subdivided into the time from arrival at the MDCT to the start of the radiology report (STR) and the time taken to complete the report (RT). STR was statistically significantly longer than RT (p < 0.0001). Conclusion: The time to scan (TTS) was comparable to the findings of some first-world institutions, but the report turnaround time (RTAT) unfortunately lagged behind.

  19. Development of a model for geomorphological assessment at U.S. DOE chemical/radioactive waste disposal facilities in the central and eastern United States; Weldon spring site remedial action project, Weldon Spring, Missouri

    International Nuclear Information System (INIS)

    Landform development and long-term geomorphic stability are the result of a complex interaction of a number of geomorphic processes. These processes may be highly variable in intensity and duration under different physiographic settings. This variability has limited the applicability of previous geomorphological stability assessments, conducted in the arid or semi-arid western United States, to site evaluations in more temperate and humid climates. The purpose of this study was to develop a model suitable for evaluating both long-term and short-term geomorphic processes which may impact landform stability and hence the stability of disposal facilities located in the central and eastern United States. The model developed for the geomorphological stability assessment at the Weldon Spring Site Remedial Action Project (WSSRAP) near St. Louis, Missouri, included an evaluation of existing landforms and consideration of the impact of both long-term and short-term geomorphic processes. These parameters were evaluated with respect to their impact on and contribution to the three assessment criteria considered most important for the stability analysis: evaluation of landform age; evaluation of present geomorphic process activity; and determination of the impact of the completed facility on existing geomorphic processes. The geomorphological assessment at the Weldon Spring site indicated that the facility is located in an area of excellent geomorphic stability. The only geomorphic process determined to have a potential detrimental effect on long-term facility performance is an extension of the drainage network. A program of mitigating measures has been proposed to minimize the impact that future gully extension could have on the integrity of the facility

  20. Computational fluid dynamics as a virtual facility for R and D in the IRIS project: an overview

    International Nuclear Information System (INIS)

    The pressurized light water cooled, medium power (1000 MWt) IRIS (International Reactor Innovative and Secure) has been under development for four years by an international consortium of over 21 organizations from ten countries. The plant conceptual design was completed in 2001 and the preliminary design is nearing completion. The pre-application licensing process with the NRC started in October 2002, and IRIS is one of the designs considered by US utilities as part of the ESP (Early Site Permit) process. The development of a new nuclear plant concept presents the opportunity and potential for a significant usage of computational fluid dynamics (CFD) in the design process, as it is in many conventional applications related to power generation. A CFD-Group of international scientists has been given the mission of investigating the many application opportunities for CFD related to the IRIS project and of verifying the support that the IRIS design process may gain from CFD in terms of time, costs, resource saving, and visibility. The key objective identified is the use of CFD as a design tool for virtual tests in order to simplify the optimization effort for the nuclear plant's components and support the IRIS testing program. In this paper, the CFD-Group is described in terms of their resources and capabilities. A program of activities with identified goals and a possible schedule is also presented. (author)

  1. Procedures and techniques for monitoring the radiation detection, signalization and alarm systems in the centralized ambience monitoring systems of the basic nuclear facilities of the CEN Saclay

    International Nuclear Information System (INIS)

    After referring to the regulations governing the 'systematic ambience monitoring' in the basic nuclear facilities, the main radiation detection, signalization and alarm devices existing at present in these facilities of the Saclay Nuclear Study Centre are described. The analysis of the operating defects of the measuring channels and detection possibilities leads to the anomalies being classified in two separate groups: the anomalies of the logical 'all or nothing' type of which all the possible origins are integrated into a so-called 'continuity' line and the evolutive anomalies of various origins corresponding to poor functioning extending possibly to a complete absence of signal. The techniques for testing the detection devices of the radiation monitoring board set up in the 'Departement de Rayonnements' at the Saclay Nuclear Study Centre are also described

  2. Using On-line Analytical Processing (OLAP) and data mining to estimate emergency room activity in DoD Medical Treatment Facilities in the Tricare Central Region

    OpenAIRE

    Ferguson, Cary V.

    2001-01-01

    On-line Analytical Processing (OLAP) and data mining can greatly enhance the ability of the Military Medical Treatment Facility (MTF) emergency room (ER) manager to improve ER staffing and utilization. MTF ER managers use statistical data analysis to help manage the efficient operation and use of ERs. As the size and complexity of databases increase, traditional statistical analysis becomes limited in the amount and type of information it can extract. OLAP tools enable the analysis of multi-d...

  3. Design/installation and structural integrity assessment of Bethel Valley low-level waste collection and transfer system upgrade for Building 3092 (Central Off-Gas Scrubber Facility) at Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    This document describes and assesses planned modifications to be made to the Building 3092 Central Off-Gas Scrubber Facility of the Oak Ridge National Laboratory, Oak Ridge, Tennessee. The modifications are made in response to the requirements of 40CFR264 Subpart J, relating to environmental protection requirements for buried tank systems. The modifications include the provision of a new scrubber recirculation tank in a new, below ground, lined concrete vault, replacing an existing recirculation sump that does not provide double containment. A new buried, double contained pipeline is provided to permit discharge of spent scrubber recirculation fluid to the Central Waste Collection Header. The new vault, tank, and discharge line are provided with leak detection and provisions to remove accumulated liquid. New scrubber recirculation pumps, piping, and accessories are also provided. This assessment concludes that the planned modifications comply with applicable requirements of 40CFR264 Subpart J, as set forth in Appendix F to the Federal Facility Agreement, Docket No. 89-04-FF, covering the Oak Ridge Reservation

  4. Discussion paper: siting a low-level radioactive waste disposal facility in the Central Interstate Low-Level Radioactive Waste Compact region. A model for other regions

    International Nuclear Information System (INIS)

    While this paper is intended to be broad enough in scope to cover the concerns and implementation guidelines of any compact commission, the focus will be on the procedures and guidelines that must be followed under the terms of the Central Interstate Low-Level Radioactive Waste Compact. This is not necessarily an endorsement of the Central States Compact approach to siting but is rather an attempt to discuss the siting process by using an existing compact as an example. As stated earlier, each compact commission will have to follow the specific guidelines of its compact. Many of the procedures to be followed and technical standards to be considered, however, apply to all the compacts

  5. Evaluation of the DYMAC demonstration program. Phase III report. [LASL Plutonium Processing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Malanify, J.J.; Bearse, R.C. (comps.)

    1980-12-31

    An accountancy system based on the Dynamic Materials Accountability (DYMAC) System has been in operation at the Plutonium Processing Facility at the Los Alamos National Laboratory since January 1978. This system, now designated the Plutonium Facility/Los Alamos Safeguards System (PF/LASS), has enhanced nuclear material accountability and process control at the Los Alamos facility. The nondestructive assay instruments and the central computer system are operating accurately and reliably. As anticipated, several uses of the system, notably scrap control and quality control, have developed in addition to safeguards. The successes of this experiment strongly suggest that implementation of DYMAC-based systems should be attempted at other facilities.

  6. Assessing the potential of rural and urban private facilities in implementing child health interventions in Mukono district, central Uganda-a cross sectional study

    DEFF Research Database (Denmark)

    Rutebemberwa, Elizeus; Buregyeya, Esther; Lal, Sham;

    2016-01-01

    in diagnostic capabilities, operations and human resource in the management of malaria, pneumonia and diarrhoea. METHODS: A survey was conducted in pharmacies, private clinics and drug shops in Mukono district in October 2014. An assessment was done on availability of diagnostic equipment for malaria, record...... attended to at least one sick child in the week prior to the interview. CONCLUSION: There were big gaps between rural and urban private facilities with rural ones having less trained personnel and less zinc tablets' availability. In both rural and urban areas, record keeping was low. Child health...

  7. Physics detector simulation facility system software description

    International Nuclear Information System (INIS)

    Large and costly detectors will be constructed during the next few years to study the interactions produced by the SSC. Efficient, cost-effective designs for these detectors will require careful thought and planning. Because it is not possible to test fully a proposed design in a scaled-down version, the adequacy of a proposed design will be determined by a detailed computer model of the detectors. Physics and detector simulations will be performed on the computer model using a high-powered computing system at the Physics Detector Simulation Facility (PDSF). The SSCL has particular computing requirements for high-energy physics (HEP) Monte Carlo calculations for the simulation of SSCL physics and detectors. The numerical calculations to be performed in each simulation are lengthy and detailed; they could require many months per run on a VAX 11/780 computer and may produce several gigabytes of data per run. Consequently, a distributed computing environment of several networked high-speed computing engines is envisioned to meet these needs. These networked computers will form the basis of a centralized facility for SSCL physics and detector simulation work. Our computer planning groups have determined that the most efficient, cost-effective way to provide these high-performance computing resources at this time is with RISC-based UNIX workstations. The modeling and simulation application software that will run on the computing system is usually written by physicists in the FORTRAN language and may need thousands of hours of supercomputing time. The system software is the "glue" which integrates the distributed workstations and allows them to be managed as a single entity. This report will address the computing strategy for the SSC

  8. Computer programming to calculate the variations of characteristic angles of heliostats as a function of time and position in a central receiver solar power plant

    Energy Technology Data Exchange (ETDEWEB)

    Mehrabian, M.A.; Aseman, R.D. [Mechanical Engineering Dept., Shahid Bahonar Univ., Kerman (Iran, Islamic Republic of)

    2008-07-01

    The central receiver solar power plant is composed of a large number of individually steered mirrors (heliostats), focusing the solar radiation onto a tower-mounted receiver. In this paper, an algorithm is developed based on vector geometry to pick an individual heliostat and calculate its characteristic angles at different times of the day and different days of the year. The algorithm then picks the other heliostats one by one and performs the same calculations as it did for the first one. These data are used to control the orientation of heliostats for improving the performance of the field. This procedure is relatively straightforward and quite suitable for computer programming. The effect of major parameters such as shading and blocking on the performance of the heliostat field is also studied using this algorithm. The results of computer simulation are presented in three sections: (1) the characteristic angles of individual heliostats, (2) the incidence angle of the sun rays striking individual heliostats, and (3) the blocking and shading effect of each heliostat. The calculations and comparisons of results show that: (a) the average incidence angle in the northern hemisphere at the north side of the tower is less than that at its south side, (b) the cosine losses are reduced as the latitude is increased or the tower height is increased, (c) the blocking effect is more important in winter and its effect is much more noticeable than shading for large fields, (d) the height of the tower does not considerably affect shading, but significantly reduces the blocking effect, and (e) to have no blocking effect throughout the year, the field design should be performed for the winter solstice noon. (orig.)
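
    As a rough illustration of the vector geometry involved, the Python sketch below computes an ideal heliostat's mirror normal and incidence angle from a sun direction and a tower-mounted receiver position. It is a generic construction under simplifying assumptions (flat mirror, no shading or blocking), not the paper's algorithm, and all names are hypothetical.

      import numpy as np

      def heliostat_aim(sun_dir, heliostat_pos, receiver_pos):
          # For an ideal flat heliostat the mirror normal bisects the unit vector
          # towards the sun and the unit vector from the heliostat to the receiver;
          # the incidence angle is half the angle between those two directions.
          s = np.array(sun_dir, dtype=float)
          s /= np.linalg.norm(s)
          r = np.array(receiver_pos, dtype=float) - np.array(heliostat_pos, dtype=float)
          r /= np.linalg.norm(r)
          n = s + r
          n /= np.linalg.norm(n)
          incidence = 0.5 * np.degrees(np.arccos(np.clip(np.dot(s, r), -1.0, 1.0)))
          elevation = np.degrees(np.arcsin(n[2]))        # tilt of the normal above horizontal
          azimuth = np.degrees(np.arctan2(n[0], n[1]))   # facing direction, measured from +y
          return n, incidence, elevation, azimuth

    Looping such a primitive over the heliostat field and over the hours of the day, with a solar position routine supplying sun_dir, would reproduce the kind of field-wide angle tables the paper describes; shading and blocking checks would then be layered on top.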

  9. Reliable Facility Location Problem with Facility Protection.

    Science.gov (United States)

    Tang, Luohao; Zhu, Cheng; Lin, Zaili; Shi, Jianmai; Zhang, Weiming

    2016-01-01

    This paper studies a reliable facility location problem with facility protection that aims to hedge against random facility disruptions by both strategically protecting some facilities and using backup facilities for the demands. An Integer Programming model is proposed for this problem, in which the failure probabilities of facilities are site-specific. A solution approach combining Lagrangian Relaxation and local search is proposed and is demonstrated to be both effective and efficient based on computational experiments on random numerical examples with 49, 88, 150 and 263 nodes in the network. A real case study for a 100-city network in Hunan province, China, is presented, based on which the properties of the model are discussed and some managerial insights are analyzed. PMID:27583542
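
    To make the modelling idea concrete, the small Python sketch below evaluates the expected service cost of a candidate solution with site-specific failure probabilities, a protected subset assumed never to fail, and a nearest/second-nearest backup assignment. It is a toy evaluation consistent with the problem description, not the paper's integer-programming formulation or its Lagrangian-relaxation solution approach, and all names are hypothetical.

      def expected_cost(open_sites, protected, demands, dist, fail_prob, penalty):
          # demands: {demand: weight}; dist[demand][site]: distance; fail_prob[site]: p.
          total = 0.0
          for d, w in demands.items():
              ranked = sorted(open_sites, key=lambda s: dist[d][s])[:2]
              if not ranked:
                  total += w * penalty                      # no open facility at all
                  continue
              p = [0.0 if s in protected else fail_prob[s] for s in ranked]
              cost = (1 - p[0]) * dist[d][ranked[0]]        # primary facility survives
              if len(ranked) > 1:
                  cost += p[0] * (1 - p[1]) * dist[d][ranked[1]]   # backup serves the demand
                  cost += p[0] * p[1] * penalty                    # both primary and backup fail
              else:
                  cost += p[0] * penalty
              total += w * cost
          return total

    A local search of the kind mentioned in the abstract could then swap facilities in and out of open_sites and protected, keeping any move that lowers this objective plus the fixed opening and protection costs.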

  10. Diagnosis of an infected central venous catheter with ultrasound and computed tomography; Diagnose eines infizierten Thrombus der Vena cava inferior mit Sonographie und Computertomographie

    Energy Technology Data Exchange (ETDEWEB)

    Tacke, J. [RWTH Aachen (Germany). Klinik fuer Radiologische Diagnostik; Adam, G. [RWTH Aachen (Germany). Klinik fuer Radiologische Diagnostik; Sliwka, U. [RWTH Aachen (Germany). Neurologische Klinik; Klosterhalfen, B. [RWTH Aachen (Germany). Inst. fuer Pathologie; Schoendube, F. [RWTH Aachen (Germany). Klinik fuer Thorax- Herz- und Gefaesschirurgie

    1995-08-01

    The authors report the case of a 16-year-old male patient, who suffered from meningitis and Waterhouse-Friderichsen syndrome. After initial improvement in the intensive care unit, he developed septic temperatures, caused by an infected thrombus of a central venous catheter in the inferior vena cava. Color-coded ultrasound showed hyperechogenic signals and missing flow detection at the catheter tip. Computed tomography showed air bubbles in the thrombosed catheter tip and confirmed the diagnosis. Vascular surgery was performed and an infected, 17-cm-long thrombus was removed. (orig./VHE) [Deutsch] The authors report the case of a 16-year-old patient in whom a femoral central venous catheter was placed because of meningitis and signs of Waterhouse-Friderichsen syndrome. After an initial resolution of fever, sepsis developed, caused by an infected thrombus of the central venous catheter. The diagnosis was made sonographically and subsequently confirmed by computed tomography. In both modalities, air inclusions in the catheter thrombus pointed to the infection. The finding was confirmed and treated by surgical thrombectomy. (orig./VHE)

  11. Development of a computational code for calculations of shielding in dental facilities; Desenvolvimento de um codigo computacional para calculos de blindagem em instalacoes odontologicas

    Energy Technology Data Exchange (ETDEWEB)

    Lava, Deise D.; Borges, Diogo da S.; Affonso, Renato R.W.; Guimaraes, Antonio C.F.; Moreira, Maria de L., E-mail: deise_dy@hotmail.com, E-mail: diogosb@outlook.com, E-mail: raoniwa@yahoo.com.br, E-mail: tony@ien.gov.br, E-mail: malu@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    This paper addresses shielding calculations intended to minimize the exposure of patients and/or personnel to ionizing radiation. The work draws on the radiation protection report NCRP-145 (Radiation Protection in Dentistry), which establishes the calculations and standards to be adopted to ensure the safety of those who may be exposed to ionizing radiation in dental facilities, in accordance with the dose limits established by the CNEN-NN-3.1 standard published in September 2011. The methodology comprises the use of a programming language for processing data provided by that report, together with a commercial application used for creating residential layouts and decoration. The FORTRAN language was adopted for application to a real case. The result is a program capable of returning the thickness of materials such as steel, lead, wood, glass, plaster, acrylic, and leaded glass that can be used for effective shielding against single-pulse or continuous beams. Several variables are used to calculate the thickness of the shield, such as: the number of films used per week, film load, use factor, occupancy factor, distance between the wall and the source, transmission factor, workload, area definition, beam intensity, and whether the exam is intraoral or panoramic. Before the methodology was applied, the results were validated against examples provided by NCRP-145. The calculations redone from the examples give answers consistent with the report.
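
    The record does not reproduce the equations, so as a generic orientation the Python sketch below applies the classic broad-beam structural-shielding relation (required transmission B = P·d²/(W·U·T), followed by a thickness estimate from a tenth-value layer). This is a simplified stand-in for the kind of calculation described, not the IEN code itself, and the numbers and names are placeholders rather than NCRP-145 data.

      import math

      def required_transmission(P, d, W, U, T):
          # P: shielding design goal (dose per week) behind the barrier,
          # d: source-to-barrier distance, W: workload (dose per week at unit distance),
          # U: use factor, T: occupancy factor.
          return P * d**2 / (W * U * T)

      def barrier_thickness(B, tvl):
          # Thickness as the number of tenth-value layers needed to reach transmission B.
          if B >= 1.0:
              return 0.0
          return tvl * math.log10(1.0 / B)

      # Illustrative use with placeholder values (not taken from NCRP-145):
      B = required_transmission(P=0.02, d=2.0, W=1.0, U=1.0, T=0.2)
      print(barrier_thickness(B, tvl=0.3), "mm of a material with a 0.3 mm TVL")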

  12. Reference design for a centralized waste processing and storage facility. Technical manual for the management of low and intermediate level wastes generated at small nuclear research centres and by radioisotope users in medicine, research and industry

    International Nuclear Information System (INIS)

    The objective of this report is to present the generic reference design of a centralized waste processing and storage facility (WPSF) intended for countries producing small but significant quantities of liquid and solid radioactive wastes. These wastes are generated through the use of radionuclides for research, medical, industrial and other institutional activities in IAEA Member States that have not yet developed the infrastructure for a complete nuclear fuel cycle. The WPSF comprises two separate buildings. The first, for receiving and processing waste from the producers, includes the necessary equipment and support services for treating and conditioning the waste. The second building acts as a simple but adequate warehouse for storing a ten year inventory of the conditioned waste. In developing the design, it was a requirement of the IAEA that options for waste management techniques for each of the waste streams should be evaluated, in order to demonstrate that the reference design is based on the most appropriate technology. Refs, figs and tabs

  13. Support facilities

    International Nuclear Information System (INIS)

    Computer support is centered on the Remote Access Data Station (RADS), which is equipped with a 1000 lpm printer, 1000 cpm reader, and a 300 cps paper tape reader with 500-foot spools. The RADS is located in a data preparation room with four 029 key punches (two of which interpret), a storage vault for archival magnetic tapes, card files, and a 30 cps interactive terminal principally used for job inquiry and routing. An adjacent room provides work space for users, with a documentation library and a consultant's office, plus file storage for programs and their documentation. The facility has approximately 2,600 square feet of working laboratory space, and includes two fully equipped photographic darkrooms, sectioning and autoradiographic facilities, six microscope cubicles, and five transmission electron microscopes and one Cambridge scanning electron microscope equipped with an x-ray energy dispersive analytical system. Ancillary specimen preparative equipment includes vacuum evaporators, freeze-drying and freeze-etching equipment, ultramicrotomes, and assorted photographic and light microscopic equipment. The extensive physical plant of the animal facilities includes provisions for holding all species of laboratory animals under controlled conditions of temperature, humidity, and lighting. More than forty rooms are available for studies of the smaller species. These have a potential capacity of more than 75,000 mice, or smaller numbers of larger species and those requiring special housing arrangements. There are also six dog kennels to accommodate approximately 750 dogs housed in runs that consist of heated indoor compartments and outdoor exercise areas

  14. Geohydrology of the High Energy Laser System Test Facility site, White Sands Missile Range, Tularosa Basin, south-central New Mexico

    Science.gov (United States)

    Basabilvazo, G.T.; Nickerson, E.L.; Myers, R.G.

    1994-01-01

    The Yesum-HoHoman and Gypsum land (hummocky) soils at the High Energy Laser System Test Facility (HELSTF) represent wind deposits from recently desiccated lacustrine deposits and deposits from the ancestral Lake Otero. The upper 15-20 feet of the subsurface consists of varved gypsiferous clay and silt. Below these surficial deposits the lithology consists of interbedded clay units, silty-clay units, and fine- to medium-grained quartz arenite units in continuous and discontinuous horizons. Clay horizons can cause perched water above the water table. Analyses of selected clay samples indicate that the clay units are composed chiefly of kaolinite and mixed-layer illite/smectite. The main aquifer is representative of a leaky-confined aquifer. Estimated aquifer properties are: transmissivity (T) = 780 feet squared per day, storage coefficient (S) = 3.1 x 10^-3, and hydraulic conductivity (K) = 6.0 feet per day. Ground water flows south and southwest; the estimated hydraulic gradient is 5.3 feet per mile. Analyses of water samples indicate that ground water at the HELSTF site is brackish to slightly saline at the top of the main aquifer. Dissolved-solids concentration near the top of the main aquifer ranges from 5,940 to 11,800 milligrams per liter. Predominant ions are sodium and sulfate. At 815 feet below land surface, the largest dissolved-solids concentration measured is 111,000 milligrams per liter, which indicates increasing salinity with depth. Predominant ions are sodium and chloride.
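
    As a quick consistency check on the reported aquifer properties, the standard relation T = K·b links transmissivity, hydraulic conductivity and saturated thickness; the short Python lines below apply it to the values quoted above. The implied thickness is an illustration only and is not stated in the record.

      T = 780.0   # transmissivity, ft^2/day (from the record)
      K = 6.0     # hydraulic conductivity, ft/day (from the record)
      b = T / K   # implied saturated thickness under T = K * b
      print(f"Implied saturated thickness: {b:.0f} ft")   # -> 130 ft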

  15. RCRA Facility Investigation/Remedial Investigation Report with Baseline Risk Assessment for the Central Shops Burning/Rubble Pit (631-6G), Volume 1 Final

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

    The Burning/Rubble Pits at the Savannah River Site were usually shallow excavations approximately 3 to 4 meters in depth. Operations at the pits consisted of collecting waste on a continuous basis and burning on a monthly basis. The Central Shops Burning/Rubble Pit 631-6G (BRP6G) was constructed in 1951 as an unlined earthen pit in surficial sediments for disposal of paper, lumber, cans and empty galvanized steel drums. The unit may have received other materials such as plastics, rubber, rags, cardboard, oil, degreasers, or drummed solvents. The BRP6G was operated from 1951 until 1955. After disposal activities ceased, the area was covered with soil. Hazardous substances, if present, may have migrated into the surrounding soil and/or groundwater. Because of this possibility, the United States Environmental Protection Agency (EPA) has designated the BRP6G as a Solid Waste Management Unit (SWMU) subject to the Resource Conservation and Recovery Act/Comprehensive Environmental Response, Compensation and Liability Act (RCRA/CERCLA) process.

  16. FRS (Facility Registration System) Sites, Geographic NAD83, EPA (2007) [facility_registration_system_sites_LA_EPA_2007

    Data.gov (United States)

    Louisiana Geographic Information Center — This dataset contains locations of Facility Registry System (FRS) sites which were pulled from a centrally managed database that identifies facilities, sites or...

  17. Nova control system - goals, architecture, and central system design

    International Nuclear Information System (INIS)

    Control and data acquisition functions for the Nova Laser and Target Irradiation facility are performed by a distributed, hierarchically organized network of computers and devices interconnected through high-speed fiber optic communication links. The architecture established for this control system provides the flexibility within each of its fundamental subsystems (Power Conditioning, Alignment, Laser Diagnostics and Target Diagnostics) to optimize internal design and organization according to their specific criteria. Control system integration, support of common functions, and centralization of operation are achieved using a fifth unifying subsystem called Central Controls

  18. Annual report to the Laser Facility Committee 1979

    International Nuclear Information System (INIS)

    The report covers the work done at the Central Laser Facility, Rutherford Laboratory during the year preceding 31 March 1979. Preliminary work already undertaken on the upgrade of the glass laser and target areas, consisting of the relocation of the two-beam target chamber and tests on phosphate glass, together with the completion of the electron beam generator for use by researchers on high-power gas laser systems, is described. The work of the groups using the glass laser facility is considered under the headings: glass laser development, gas laser development, laser plasma interactions, transport and particle emission, ablative compression studies, atomic and radiation physics, XUV lasers, theory and computation. (U.K.)

  19. Facilities & Leadership

    Data.gov (United States)

    Department of Veterans Affairs — The facilities web service provides VA facility information. The VA facilities locator is a feature that is available across the enterprise, on any webpage, for the...

  20. Biochemistry Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Biochemistry Facility provides expert services and consultation in biochemical enzyme assays and protein purification. The facility currently features 1) Liquid...

  1. 21 CFR 1305.24 - Central processing of orders.

    Science.gov (United States)

    2010-04-01

    ... order with all linked records on the central computer system. (b) A company that has central processing... or more registered locations and maintains a central processing computer system in which orders are... the company owns and operates....

  2. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, a cost that is high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  3. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way they can. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  4. Irreversible computable functions

    OpenAIRE

    Hoyrup, Mathieu

    2014-01-01

    The strong relationship between topology and computations has played a central role in the development of several branches of theoretical computer science: foundations of functional programming, computational geometry, computability theory, computable analysis. Often it happens that a given function is not computable simply because it is not continuous. In many cases, the function can moreover be proved to be non-computable in the stronger sense that it does not preserve computability: it map...

  5. Directory of computer users in nuclear medicine

    International Nuclear Information System (INIS)

    The Directory of Computer Users in Nuclear Medicine consists primarily of detailed descriptions and indexes to these descriptions. A typical Installation Description contains the name, address, type, and size of the institution and the names of persons within the institution who can be contacted for further information. If the department has access to a central computer facility for data analysis or timesharing, the type of equipment available and the method of access to that central computer is included. The dedicated data processing equipment used by the department in its nuclear medicine studies is described, including the peripherals, languages used, modes of data collection, and other pertinent information. Following the hardware descriptions are listed the types of studies for which the data processing equipment is used, including the language(s) used, the method of output, and an estimate of the frequency of the particular study. An Installation Index and an Organ Studies Index are also included

  6. Directory of computer users in nuclear medicine

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, J.J.; Gurney, J.; McClain, W.J. (eds.)

    1979-09-01

    The Directory of Computer Users in Nuclear Medicine consists primarily of detailed descriptions and indexes to these descriptions. A typical Installation Description contains the name, address, type, and size of the institution and the names of persons within the institution who can be contacted for further information. If the department has access to a central computer facility for data analysis or timesharing, the type of equipment available and the method of access to that central computer is included. The dedicated data processing equipment used by the department in its nuclear medicine studies is described, including the peripherals, languages used, modes of data collection, and other pertinent information. Following the hardware descriptions are listed the types of studies for which the data processing equipment is used, including the language(s) used, the method of output, and an estimate of the frequency of the particular study. An Installation Index and an Organ Studies Index are also included. (PCS)

  7. A computer control system for a research reactor

    International Nuclear Information System (INIS)

    Most reactor applications until now have not required computer control of core output. Commercial reactors are generally operated at a constant power output to provide baseline power. However, if commercial reactor cores are to become load following over a wide range, then centralized digital computer control is required to make the entire facility respond as a single unit to continual changes in power demand. Navy and research reactors are much smaller and simpler and are operated at constant power levels as required, without concern for the number of operators required to operate the facility. For navy reactors, centralized digital computer control may provide space savings and reduced personnel requirements. Computer control offers research reactors versatility to efficiently change a system to develop new ideas. The operation of any reactor facility would be enhanced by a controller that does not panic and is continually monitoring all facility parameters. Eventually very sophisticated computer control systems may be developed which will sense operational problems, diagnose the problem, and depending on the severity of the problem, immediately activate safety systems or consult with operators before taking action

  8. Centralized digital control of accelerators

    International Nuclear Information System (INIS)

    In contrasting the title of this paper with a second paper to be presented at this conference entitled Distributed Digital Control of Accelerators, a potential reader might be led to believe that this paper will focus on systems whose computing intelligence is centered in one or more computers in a centralized location. Instead, this paper will describe the architectural evolution of SLAC's computer based accelerator control systems with respect to the distribution of their intelligence. However, the use of the word centralized in the title is appropriate because these systems are based on the use of centralized large and computationally powerful processors that are typically supported by networks of smaller distributed processors

  9. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  10. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  11. Comparison of Knowledge and Attitudes Using Computer-Based and Face-to-Face Personal Hygiene Training Methods in Food Processing Facilities

    Science.gov (United States)

    Fenton, Ginger D.; LaBorde, Luke F.; Radhakrishna, Rama B.; Brown, J. Lynne; Cutter, Catherine N.

    2006-01-01

    Computer-based training is increasingly favored by food companies for training workers due to convenience, self-pacing ability, and ease of use. The objectives of this study were to determine if personal hygiene training, offered through a computer-based method, is as effective as a face-to-face method in knowledge acquisition and improved…

  12. Experimental facilities

    International Nuclear Information System (INIS)

    We have completed an engineering feasibility study of a major modification of the HFIR facility and are now beginning a similar study of an entirely new facility. The design of the reactor itself is common to both options. In this paper, a general description of the modified HFIR is presented with some indications of the additional facilities that might be available in an entirely new facility

  13. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 2; Unsteady Analyses and Risk Assessment

    Science.gov (United States)

    Ahuja, Vineet; Hosangadi, Ashvin; Allgood, Daniel

    2008-01-01

    Simulation technology can play an important role in rocket engine test facility design and development by assessing risks, providing analysis of dynamic pressure and thermal loads, identifying failure modes and predicting anomalous behavior of critical systems. This is especially true for facilities such as the proposed A-3 facility at NASA SSC because of a challenging operating envelope linked to variable throttle conditions at relatively low chamber pressures. Design support regarding the feasibility of operating conditions and procedures is critical in such cases due to the possibility of startup/shutdown transients, moving shock structures, unsteady shock-boundary layer interactions and engine and diffuser unstart modes that can result in catastrophic failure. Analyses of such systems are difficult due to the resolution requirements needed to accurately capture moving shock structures, shock-boundary layer interactions, two-phase flow regimes and engine unstart modes. In a companion paper, we will demonstrate, with the use of steady CFD analyses, an advanced capability to evaluate supersonic diffuser and steam ejector performance in the sub-scale A-3 facility. In this paper we will address transient issues with the operation of the facility, especially at startup and shutdown, and assess risks related to afterburning due to the interaction of a fuel-rich plume with oxygen that is a by-product of the steam ejectors. The primary areas that will be addressed in this paper are: (1) analyses of unstart modes due to flow transients, especially during startup/ignition, (2) engine safety during the shutdown process, and (3) interaction of the steam ejectors with the primary plume, i.e. flow transients as well as the probability of afterburning. In this abstract we discuss unsteady analyses of the engine shutdown process. However, the final paper will include analyses of a staged startup, drawdown of the engine test cell pressure, and risk assessment of potential afterburning in the facility. Unsteady

  14. Centralized Fabric Management Using Puppet, Git, and GLPI

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Managing the infrastructure of a large and complex data center can be extremely difficult without taking advantage of automated services. Puppet is a seasoned, open-source tool designed for enterprise-class centralized configuration management. At the RHIC/ATLAS Computing Facility at Brookhaven National Laboratory, we have adopted Puppet as part of a suite of tools, including Git, GLPI, and some custom scripts, that comprise our centralized configuration management system. In this paper, we discuss the use of these tools for centralized configuration management of our servers and services; change management, which requires authorized approval of production changes; a complete, version-controlled history of all changes made; separation of production, testing, and development systems using Puppet environments; semi-automated server inventory using GLPI; and configuration change monitoring and reporting via the Puppet dashboard. We will also discuss scalability and performance results from using these tools on a...

  15. Use of the IBM ASTAP program for computer design of the sustaining neutral-beam power supply for the Mirror Fusion Test Facility with emphasis on the need for a shunt preconditioner

    International Nuclear Information System (INIS)

    The power supplies and regulators for the 24 neutral-beam guns to be used in the Lawrence Livermore Laboratory Mirror Fusion Test Facility (MFTF) have been analyzed. The initial results showed that transients involved in long pulses caused power-rating difficulties in the 13.8-kV line-voltage adjusting equipment, and in the regulator-modulator switch tube. A shunt preconditioner circuit was investigated, and appears to have sufficiently desirable features to warrant its inclusion in the system. In addition, considerable computation was carried out, so that most important components throughout the system could be selected. 8 refs

  16. 6th July 2010 - United Kingdom Science and Technology Facilities Council W. Whitehorn signing the guest book with Head of International relations F. Pauss, visiting the Computing Centre with Information Technology Department Head Deputy D. Foster, the LHC superconducting magnet test hall with Technology Department P. Strubin, the Centre Control Centre with Operation Group Leader M. Lamont and the CLIC/CTF3 facility with Project Leader J.-P. Delahaye.

    CERN Multimedia

    Teams : M. Brice, JC Gadmer

    2010-01-01

    6th July 2010 - United Kingdom Science and Technology Facilities Council W. Whitehorn signing the guest book with Head of International relations F. Pauss, visiting the Computing Centre with Information Technology Department Head Deputy D. Foster, the LHC superconducting magnet test hall with Technology Department P. Strubin, the Centre Control Centre with Operation Group Leader M. Lamont and the CLIC/CTF3 facility with Project Leader J.-P. Delahaye.

  17. Reference design for a centralized spent sealed sources facility. Technical manual for the management of low and intermediate level wastes generated at small nuclear research centres and by radioisotope users in medicine, research and industry

    International Nuclear Information System (INIS)

    To assist Member States in establishing facilities in which the most frequently occurring spent sealed sources can be safely conditioned, the IAEA has financed the development of a generic design for a Spent Sealed Sources Facility (SSS Facility). The purpose of this TECDOC is to provide enough general information about the functions and capabilities of the SSS Facility to enable the reader to understand what the facility can do to contribute towards the management of spent sealed sources without providing all the technical and/or design information available. Sufficient information is provided to enable the reader to judge how and to what extent such a facility can contribute to national radioactive waste management infrastructure. 2 refs, 5 figs, 1 tab

  18. Automatic Estimation of the Radiological Inventory for the Dismantling of Nuclear Facilities

    International Nuclear Information System (INIS)

    The estimation of the radiological inventory of nuclear facilities to be dismantled is a process that combines information on the physical inventory of the whole plant with radiological survey data. The radiological inventory of all the components and civil structures of the plant can be estimated with mathematical models using a statistical approach. A computer application has been developed to obtain the radiological inventory automatically. Results: a computer application able to estimate the radiological inventory from the radiological measurements or the characterization program has been developed. This application includes the statistical functions needed for the estimation of central tendency and variability, e.g. mean, median, variance, confidence intervals, coefficients of variation, etc. It is a necessary tool for estimating the radiological inventory of a nuclear facility and a powerful aid for decision making in future sampling surveys
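
    A minimal sketch of the statistical summary such a tool needs (central tendency, variability and a confidence interval for the mean) is shown below; the function and the contamination readings in the example are illustrative placeholders, not taken from the cited application.

```python
# Summary statistics of the kind a radiological-inventory tool relies on.
import math
import statistics

def summarize(measurements, z=1.96):
    """Mean, median, sample variance, coefficient of variation and an
    approximate 95% confidence interval for the mean (normal approximation)."""
    n = len(measurements)
    mean = statistics.fmean(measurements)
    var = statistics.variance(measurements)
    half_width = z * math.sqrt(var / n)
    return {
        "mean": mean,
        "median": statistics.median(measurements),
        "variance": var,
        "cv": math.sqrt(var) / mean,
        "ci95": (mean - half_width, mean + half_width),
    }

# Hypothetical surface-contamination readings (Bq/cm^2).
print(summarize([0.8, 1.1, 0.9, 1.4, 1.0, 1.2]))
```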

  19. Report on the control of the safety and security of nuclear facilities. Part 2: the reconversion of military plutonium stocks. The use of the helps given to central and eastern Europe countries and to the new independent states; Rapport sur le controle de la surete et de la securite des installations nucleaires. Deuxieme partie: la reconversion des stocks de plutonium militaire. L'utilisation des aides accordees aux pays d'Europe centrale et orientale et aux nouveaux etats independants

    Energy Technology Data Exchange (ETDEWEB)

    Birraux, C

    2002-07-01

    This report deals with two different aspects of the safety and security of nuclear facilities. The first aspect concerns the reconversion of weapon-grade plutonium stocks: the plutonium in excess, plutonium hazards and nuclear fuel potential, the US program, the Russian program, the actions of European countries (France, Germany), the involvement of other countries, the unanswered questions (political aspects, uncertainties), and the solutions of the future (improvement of reactors, the helium-cooled high-temperature reactor technology (gas-turbine modular helium reactor: GT-MHR), and Carlo Rubbia's project). The second aspect concerns the actions carried out by the European Union in favor of the civil nuclear facilities of central and eastern Europe: the European Union's competencies under the Euratom treaty, the conclusions of the European audit office about the PHARE and TACIS nuclear programs, the status of committed actions, the coming planned actions, and a critical analysis of the policy adopted so far. (J.S.)

  20. Patient-specific radiation dose and cancer risk in computed tomography examinations in some selected CT facilities in the Greater Accra Region of Ghana

    International Nuclear Information System (INIS)

    The effective dose and cancer risk were determined for patients undergoing seven different types of CT examinations in two CT facilities in the Greater Accra region of Ghana. The two facilities, namely the Diagnostic Centre Ltd and Cocoa Clinic, were chosen because of their significant patient throughput. The effective dose was estimated from patient data, namely age, sex, height and weight, and from technique factors, namely scan length, kVp (peak kilovoltage), mAs (milliampere-seconds) and CTDIvol read from the control console of the CT machines. The effective dose was also estimated using the dose-length product (DLP) and k coefficients, which are the anatomic-region-specific conversion factors. The cancer risk for each patient for a particular examination was determined from the effective dose, age and sex of each patient with the help of BEIR VII. In all, a total of 800 adult patients, 400 from each of the two CT facilities, were compiled. At the Diagnostic Centre Ltd, the average effective dose was 5.61 mSv, in the range of 1.41 mSv to 13.34 mSv, with an average BMI of 26.19 kg/m2 in the range of 16.90 kg/m2 to 48.28 kg/m2 for all types of examinations. The average cancer risk was 0.0458 Sv-1 for 400 patients, in the range of 0.0001 Sv-1 to 0.3036 Sv-1, compared with a population of 900 patients undergoing CT examination per year. At Cocoa Clinic, the average effective dose was 3.91 mSv, in the range of 0.54 mSv to 27.32 mSv, with an average BMI of 25.59 kg/m2 in the range of 17.18 kg/m2 to 35.34 kg/m2, and the average cancer risk was 0.0371 Sv-1, in the range of 0.0001 Sv-1 to 0.7125 Sv-1. Some of the values were within the range of typical effective doses for CT examinations reported by the ICRP. It was evident from this study that variations in scanning parameters had a significant impact on the effective doses to patients for similar CT examinations between the two facilities.(au)
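
    The dose and risk bookkeeping described above reduces to simple arithmetic, sketched below with placeholder coefficients; in practice the anatomic-region-specific k factor and the BEIR VII risk coefficient would be looked up rather than hard-coded, and the values here are not the study's.

```python
# Effective dose from the dose-length product, E = DLP x k, and a rough
# linear risk estimate. The k value and risk coefficient are placeholders.

def effective_dose_mSv(dlp_mGy_cm: float, k_mSv_per_mGy_cm: float) -> float:
    """Effective dose (mSv) from DLP (mGy*cm) and a region-specific k factor."""
    return dlp_mGy_cm * k_mSv_per_mGy_cm

def lifetime_risk(dose_mSv: float, risk_per_Sv: float) -> float:
    """Crude linear estimate: dose in Sv multiplied by a risk coefficient."""
    return (dose_mSv / 1000.0) * risk_per_Sv

# Hypothetical head CT: DLP = 950 mGy*cm, k = 0.0021 mSv/(mGy*cm),
# risk coefficient ~0.057 per Sv (an illustrative BEIR VII-style value).
dose = effective_dose_mSv(950.0, 0.0021)
print(f"E = {dose:.2f} mSv, estimated lifetime risk = {lifetime_risk(dose, 0.057):.5f}")
```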

  1. The CUTLASS database facilities

    International Nuclear Information System (INIS)

    The enhancement of the CUTLASS database management system to provide improved facilities for data handling is seen as a prerequisite to its effective use for future power station data processing and control applications. This particularly applies to the larger projects such as AGR data processing system refurbishments, and the data processing systems required for the new Coal Fired Reference Design stations. In anticipation of the need for improved data handling facilities in CUTLASS, the CEGB established a User Sub-Group in the early 1980's to define the database facilities required by users. Following the endorsement of the resulting specification and a detailed design study, the database facilities have been implemented as an integral part of the CUTLASS system. This paper provides an introduction to the range of CUTLASS Database facilities, and emphasises the role of Database as the central facility around which future Kit 1 and (particularly) Kit 6 CUTLASS based data processing and control systems will be designed and implemented. (author)

  2. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  3. A case of acute meningitis with clear cerebrospinal fluid: value of computed tomography for the diagnosis of central nervous system tuberculosis

    International Nuclear Information System (INIS)

    The author reports a case of acute meningitis with clear cerebrospinal fluid in which extensive bacteriologic investigations were negative making the etiologic diagnosis exceedingly difficult. Initiation of empiric antituberculous therapy was rapidly followed by clinical and biological improvement, without complications, and by resolution of abnormal findings on computed tomography of the brain. On these grounds, meningitis secondary to a tuberculoma in the temporal lobe was diagnosed. The author points out that tuberculous meningitis is still a severe, potentially fatal condition; this, together with the fact that tubercle bacilli are often very scarce or absent, requires that tuberculous meningitis be routinely considered in every patient with clear cerebrospinal fluid meningitis whose condition deteriorates. Computed tomography of the brain is essential to ensure rapid diagnosis and prompt initiation of antituberculous therapy. Lastly, the author points out that nowadays herpes simplex virus encephalopathy should also be considered

  4. Flux mapping system for AHWR critical facility

    International Nuclear Information System (INIS)

    Software for a flux mapping system (FMS) for the AHWR critical facility has been developed. The system consists of 25 LEU-based pulse detectors and the associated software. The objective of the FMS is to obtain the flux profiles over the central 5 x 5 lattice locations. Developing the flux mapping system requires computing the higher harmonics of the diffusion equation. These harmonics (also called λ-modes) are the eigenfunctions of the multi-group diffusion equation. The fundamental mode is found by the power iteration method. Apart from the fundamental mode, the higher modes are evaluated by a subtraction technique. In the present paper, the fundamental eigenvalue and eigenfunction are evaluated by the finite difference method. The bi-orthogonality relations between the direct and adjoint eigenvectors are used for this purpose. The reference core for AHWR has been simulated by the computer code FINSQR. Two-group lattice cell data have been generated using the transport theory code WIMSD and its associated 69-group nuclear data library. The reactor core, along with the surrounding radial and axial reflectors, was represented using 3243 mesh points. In total, 5 λ-modes and the corresponding eigenvalues have been estimated. The computer code FMS has been developed specifically for the AHWR/PHWR critical facility. Flux reconstruction at the 5 x 5 lattice locations of the core is achieved by using the observed fluxes at the 25 detector locations and linear combinations of the precalculated eigenfunctions. The combining coefficients are computed by the least squares method. To validate the code, a computer code based on the Monte Carlo method was used to estimate thermal fluxes at a few discrete locations. The fluxes were estimated using the ENDF/B-VI point nuclear data library. (author)
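
    The least-squares flux synthesis described above can be sketched as follows; the mode shapes and detector readings are randomly generated placeholders rather than AHWR data, and the array shapes merely mirror the 25-detector, 5-mode arrangement mentioned in the abstract.

```python
# Least-squares combination of precalculated lambda-modes: fit combining
# coefficients to detector readings, then evaluate the flux at target cells.
import numpy as np

def reconstruct_flux(detector_flux, modes_at_detectors, modes_at_targets):
    coeffs, *_ = np.linalg.lstsq(modes_at_detectors, detector_flux, rcond=None)
    return modes_at_targets @ coeffs

rng = np.random.default_rng(0)
A = rng.random((25, 5))              # mode shapes sampled at 25 detector locations
B = rng.random((25, 5))              # mode shapes at the 5 x 5 target lattice cells
true_coeffs = np.array([1.0, 0.2, 0.1, 0.05, 0.02])
readings = A @ true_coeffs + 0.01 * rng.standard_normal(25)   # noisy "measurements"

print(reconstruct_flux(readings, A, B)[:5])   # flux estimate at the first 5 cells
```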

  5. Enlargement of the Assessment Database for Advanced Computer Codes in Relation to the VVER Technology: Benchmark on LB-LOCA Transient in PSB-VVER Facility

    International Nuclear Information System (INIS)

    The OECD/NEA PSB-VVER project provided unique and useful experimental data from the large-scale PSB-VVER test facility for code validation. This facility represents a scaled-down layout of the Russian-designed PWR, namely the VVER-1000. Five experiments were executed in the project, dealing mainly with loss of coolant scenarios (small, intermediate and large break loss of coolant accidents), a primary to secondary leak, and a parametric study (natural circulation test) aimed at the characterization of the VVER system at reduced mass inventory conditions. The comparative analysis described in the paper deals with the analytical exercise on the large break loss of coolant accident experiment (Test 5). Four participants from three different institutions were involved in the benchmark and applied their own analytical models, set up for four different thermal-hydraulic system codes. The benchmark demonstrated that almost all of the post-test calculations were qualified against fixed criteria. The few mismatches between the results and the acceptability thresholds are discussed and understood. The analysis covers the relevant features of the input models developed, the steady-state conditions and the results of the simulations. The results submitted by the participants are discussed in the paper considering the resulting sequence of main events, the qualitative comparison of selected time trends, the analysis of the relevant thermal-hydraulic aspects and, finally, the application of the Fast Fourier Transform based method. (author)
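
    The Fast Fourier Transform based method referred to above condenses the code-versus-experiment discrepancy into an average-amplitude figure of merit. The sketch below is a generic illustration of that idea on synthetic traces; it is not the benchmark's tooling or data.

```python
# Average amplitude (AA): summed spectral magnitude of the calculation error
# divided by that of the experimental signal; smaller AA means closer agreement.
import numpy as np

def fft_average_amplitude(exp_signal, calc_signal):
    err = np.asarray(calc_signal) - np.asarray(exp_signal)
    return np.abs(np.fft.rfft(err)).sum() / np.abs(np.fft.rfft(exp_signal)).sum()

# Synthetic stand-ins for a measured pressure trend and a code prediction.
t = np.linspace(0.0, 100.0, 2001)
experiment = 7.0 * np.exp(-t / 40.0)
calculation = 7.2 * np.exp(-t / 38.0)
print(f"AA = {fft_average_amplitude(experiment, calculation):.3f}")
```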

  6. Survey of solar thermal test facilities

    Energy Technology Data Exchange (ETDEWEB)

    Masterson, K.

    1979-08-01

    The facilities that are presently available for testing solar thermal energy collection and conversion systems are briefly described. Facilities that are known to meet ASHRAE standard 93-77 for testing flat-plate collectors are listed. The DOE programs and test needs for distributed concentrating collectors are identified. Existing and planned facilities that meet these needs are described and continued support for most of them is recommended. The needs and facilities that are suitable for testing components of central receiver systems, several of which are located overseas, are identified. The central contact point for obtaining additional details and test procedures for these facilities is the Solar Thermal Test Facilities Users' Association in Albuquerque, N.M. The appendices contain data sheets and tables which give additional details on the technical capabilities of each facility. Also included is the 1975 Aerospace Corporation report on test facilities that is frequently referenced in the present work.

  7. A Review on Modern Distributed Computing Paradigms: Cloud Computing, Jungle Computing and Fog Computing

    OpenAIRE

    Hajibaba, Majid; Gorgin, Saeid

    2014-01-01

    Distributed computing attempts to improve performance in large-scale computing problems by resource sharing. Moreover, rising low-cost computing power, coupled with advances in communications/networking and the advent of big data, now enables new distributed computing paradigms such as Cloud, Jungle and Fog computing. Cloud computing brings a number of advantages to consumers in terms of accessibility and elasticity. It is based on centralization of resources that possess huge processing po...

  8. Computer Aided Mathematics

    DEFF Research Database (Denmark)

    Sinclair, Robert

    1998-01-01

    Course notes of a PhD course held in 1998. The central idea is to introduce students to computational mathematics using object-oriented programming in C++.

  9. Computer applications in controlled fusion research

    International Nuclear Information System (INIS)

    The application of computers to controlled thermonuclear research (CTR) is essential. In the near future the use of computers in the numerical modeling of fusion systems should increase substantially. A recent panel has identified five categories of computational models to study the physics of magnetically confined plasmas. A comparable number of types of models for engineering studies is called for. The development and application of computer codes to implement these models is a vital step in reaching the goal of fusion power. To meet the needs of the fusion program the National CTR Computer Center has been established at the Lawrence Livermore Laboratory. A large central computing facility is linked to smaller computing centers at each of the major CTR Laboratories by a communication network. The crucial element needed for success is trained personnel. The number of people with knowledge of plasma science and engineering trained in numerical methods and computer science must be increased substantially in the next few years. Nuclear engineering departments should encourage students to enter this field and provide the necessary courses and research programs in fusion computing

  10. User's manual of a computer code for seismic hazard evaluation for assessing the threat to a facility by fault model. SHEAT-FM

    International Nuclear Information System (INIS)

    To establish a reliability evaluation method for aged structural components, we developed a probabilistic seismic hazard evaluation code, SHEAT-FM (Seismic Hazard Evaluation for Assessing the Threat to a facility site - Fault Model), using a seismic motion prediction method based on a fault model. To improve the seismic hazard evaluation, this code takes the latest knowledge in the field of earthquake engineering into account; for example, it incorporates the group delay time of observed records and an updating process model of active faults. This report describes the user's guide of SHEAT-FM, including an outline of the seismic hazard evaluation, the specification of input data, a sample problem for a model site, system information and the execution method. (author)

  11. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  12. The performance assessment of near-surface waste disposal facilities. Mathematical model and computer code for ground water flow in variably saturated porous media

    International Nuclear Information System (INIS)

    The safe disposal of radioactive waste, which is the final step in the waste management operation chain, is of vital importance for ensuring the long-term protection of man and his environment. Safety assessment is necessary to estimate the expected performance of a system considered for near-surface disposal of radioactive wastes and to compare it with acceptability criteria, both for the operational and post-operational phases. A ground water flow field is necessary for modelling both the source term and the rate of radionuclide transport through ground water. This paper describes the physical process of fluid flow in variably saturated porous media, the mathematical model, the computer code, and model verification. With respect to water flow, based on the benchmarking exercise, J.T. McCord identified the VS2DT code as the 'code of choice' for two-dimensional problems in which flow described by the Richards equation and transport described by the convection-dispersion equation are deemed adequate. VS2DT is a finite-difference flow and solute transport code developed by the U.S. Geological Survey. The code was implemented on an IBM-compatible personal computer with a 486/33 processor at the Pitesti Institute. In terms of flow, VS2DT can handle seepage faces, plant-root uptake, surface-ponding infiltration and profile-limited evaporation. In terms of ease of use, VS2DT is clearly superior to many codes, because it takes free-formatted input and requires only a small number of parameters on each line of input. Three test cases were considered for model and computer code verification. The first was one-dimensional vertical infiltration into a medium of uniform initial pressure head. The second was a two-dimensional problem involving infiltration, evaporation, and evapotranspiration. Finally, a simple hypothetical repository was simulated. (authors)
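
    For reference, the two governing equations named above are commonly written, in a standard textbook form for one-dimensional vertical flow (not reproduced from the report), as:

```latex
% Richards equation for variably saturated vertical flow (z positive upward)
\frac{\partial \theta(h)}{\partial t}
  = \frac{\partial}{\partial z}\left[ K(h)\left(\frac{\partial h}{\partial z} + 1\right)\right] - S(z,t)

% Convection-dispersion equation for solute transport
\frac{\partial (\theta c)}{\partial t}
  = \frac{\partial}{\partial z}\left(\theta D \frac{\partial c}{\partial z}\right)
  - \frac{\partial (q c)}{\partial z}
```

    Here θ is the moisture content, h the pressure head, K(h) the hydraulic conductivity, S a sink term, c the solute concentration, D the dispersion coefficient and q the Darcy flux.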

  13. Computer surety: computer system inspection guidance. [Contains glossary]

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  14. Primary central nervous system lymphoma in an human immunodeficiency virus-infected patient mimicking bilateral eye sign in brain seen in fluorine-18 fluorodeoxyglucose-positron emission tomography/computed tomography

    International Nuclear Information System (INIS)

    Fluorodeoxyglucose-positron emission tomography/computed tomography (FDG PET/CT) has proven useful in the diagnosis, staging, detection of metastasis and posttreatment monitoring of several malignancies in human immunodeficiency virus (HIV)-infected patients. It also has the ability to make the important distinction between malignancy and infection in the evaluation of central nervous system (CNS) lesions, leading to the initiation of the appropriate treatment and precluding the need for invasive biopsy. We report an interesting case of an HIV-positive 35-year-old woman who presented with headache, disorientation, and a decreased level of consciousness. She underwent whole-body PET/CT, which showed multiple lesions in the cerebrum mimicking a bilateral eye sign in the brain. A diagnosis of primary CNS lymphoma was made and the patient was started on chemotherapy

  15. 20 CFR 654.413 - Cooking and eating facilities.

    Science.gov (United States)

    2010-04-01

    ... cleaned materials. (c) When central mess facilities are provided, the kitchen and mess hall shall be in... physical facilities, equipment and operation shall be in accordance with provisions of applicable...

  16. Designing Facilities for Collaborative Operations

    Science.gov (United States)

    Norris, Jeffrey; Powell, Mark; Backes, Paul; Steinke, Robert; Tso, Kam; Wales, Roxana

    2003-01-01

    A methodology for designing operational facilities for collaboration by multiple experts has begun to take shape as an outgrowth of a project to design such facilities for scientific operations of the planned 2003 Mars Exploration Rover (MER) mission. The methodology could also be applicable to the design of military "situation rooms" and other facilities for terrestrial missions. It was recognized in this project that modern mission operations depend heavily upon the collaborative use of computers. It was further recognized that tests have shown that layout of a facility exerts a dramatic effect on the efficiency and endurance of the operations staff. The facility designs (for example, see figure) and the methodology developed during the project reflect this recognition. One element of the methodology is a metric, called effective capacity, that was created for use in evaluating proposed MER operational facilities and may also be useful for evaluating other collaboration spaces, including meeting rooms and military situation rooms. The effective capacity of a facility is defined as the number of people in the facility who can be meaningfully engaged in its operations. A person is considered to be meaningfully engaged if the person can (1) see, hear, and communicate with everyone else present; (2) see the material under discussion (typically data on a piece of paper, computer monitor, or projection screen); and (3) provide input to the product under development by the group. The effective capacity of a facility is less than the number of people that can physically fit in the facility. For example, a typical office that contains a desktop computer has an effective capacity of .4, while a small conference room that contains a projection screen has an effective capacity of around 10. Little or no benefit would be derived from allowing the number of persons in an operational facility to exceed its effective capacity: At best, the operations staff would be underutilized

  17. Use of single photon emission computed tomography and magnetic resonance to evaluate central nervous system involvement in patients with juvenile systemic lupus erythematosus

    Directory of Open Access Journals (Sweden)

    Prismich G.

    2002-01-01

    The objective of the present study was to identify the single photon emission computed tomography (SPECT) and magnetic resonance (MR) findings in juvenile systemic lupus erythematosus (JSLE) patients with CNS involvement and to try to correlate them with neurological clinical history data and the neurological clinical examination. Nineteen patients with JSLE (16 girls and 3 boys, mean age at onset 9.2 years) were submitted to neurological examination, electroencephalography, cerebrospinal fluid analysis, SPECT and MR. All the evaluations were made separately within a period of 15 days. SPECT and MR findings were analyzed independently by two radiologists. Electroencephalography and cerebrospinal fluid analysis revealed no relevant alterations. Ten of 19 patients (53%) presented neurological abnormalities, including a present or past neurological clinical history (8/19, 42%), an abnormal neurological clinical examination (5/19, 26%), and abnormal SPECT or MR (8/19, 42% and 3/19, 16%, respectively). The most common changes in SPECT were cerebral hypoperfusion and heterogeneous distribution of blood flow. The most common abnormalities in MR were leukomalacia and diffuse alterations of white matter. There was a correlation between SPECT and MR (P<0.05). We conclude that SPECT and MR are complementary and useful exams in the evaluation of neurological involvement in lupus.

  18. Mammography Facilities

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Mammography Facility Database is updated periodically based on information received from the four FDA-approved accreditation bodies: the American College of...

  19. Health Facilities

    Science.gov (United States)

    Health facilities are places that provide health care. They include hospitals, clinics, outpatient care centers, and specialized care centers, such as birthing centers and psychiatric care centers. When you ...

  20. Canyon Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — B Plant, T Plant, U Plant, PUREX, and REDOX (see their links) are the five facilities at Hanford where the original objective was plutonium removal from the uranium...

  1. Computation of the time-varying flow rate from an artesian well in central Dade County, Florida, by analytical and numerical simulation methods

    Science.gov (United States)

    Merritt, Michael L.

    1995-01-01

    To construct a digital simulation of a plume of brackish water in the surficial Biscayne aquifer of central Dade County, Florida, that originated from a flowing artesian well, it was necessary to quantify the rate of spillage and the consequent point-source loading of the aquifer. However, a flow-rate measurement (2,350 gallons per minute) made 2 months after drilling of the well in 1944 was inconsistent with later measurements (1,170 gallons per minute) in 1964, 1965, and 1969. Possible explanations were: (1) drawdown of the aquifer over time; (2) raising of the altitude at which the water was discharged; (3) installation of 80 feet of 8-inch liner; (4) an increase in the density of the flowing water; and (5) gradual deterioration of the well casing. The first approach to reconciling the measured flow rates was to apply a form of the equation for constant-drawdown analysis often used to estimate aquifer transmissivity. Next, a numerical simulation analysis was made that provided the means to account for friction loss in the well and recharge across vertically adjacent confining layers and from lateral boundaries. The numerical analysis required the construction of a generalized model of the subsurface from the surficial Biscayne aquifer to the cavernous, dolomitic Boulder Zone at a depth of 3,000 feet. Calibration of the generalized flow model required that the middle confining unit of the Floridan aquifer system separating the artesian flow zone in the Upper Floridan aquifer from the Lower Floridan aquifer (the Boulder Zone) have a vertical hydraulic conductivity of at least 1 foot per day. The intermediate confining unit separating the flow zone from the surficial Biscayne aquifer was assigned a much lower hydraulic conductivity (0.01 foot per day or less). The model indicated that the observed mounding of Upper Floridan aquifer heads along the axis of the Florida Peninsula was related to the variable depth of the freshwater and brackish-water zone
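
    For orientation, a standard large-time approximation for the discharge of a flowing well held at constant drawdown (a generic textbook form, not an equation quoted from the report) is:

```latex
Q(t) \approx \frac{4\pi T\, s_w}{\ln\left( \dfrac{2.25\, T\, t}{r_w^{2} S} \right)}
```

    where T is the transmissivity, S the storage coefficient, r_w the well radius and s_w the constant drawdown; the logarithmic denominator grows with time, which is consistent with the declining flow rates reported above.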

  2. Unique life sciences research facilities at NASA Ames Research Center

    Science.gov (United States)

    Mulenburg, G. M.; Vasques, M.; Caldwell, W. F.; Tucker, J.

    1994-01-01

    The Life Science Division at NASA's Ames Research Center has a suite of specialized facilities that enable scientists to study the effects of gravity on living systems. This paper describes some of these facilities and their use in research. Seven centrifuges, each with its own unique abilities, allow testing of a variety of parameters on test subjects ranging from single cells through hardware to humans. The Vestibular Research Facility allows the study of both centrifugation and linear acceleration on animals and humans. The Biocomputation Center uses computers for 3D reconstruction of physiological systems, and interactive research tools for virtual reality modeling. Psychophysiological, cardiovascular, exercise physiology, and biomechanical studies are conducted in the 12-bed Human Research Facility and samples are analyzed in the certified Central Clinical Laboratory and other laboratories at Ames. Human bedrest, water immersion and lower body negative pressure equipment are also available to study physiological changes associated with weightlessness. These and other weightlessness models are used in specialized laboratories for the study of basic physiological mechanisms, metabolism and cell biology. Visual-motor performance, perception, and adaptation are studied using ground-based models as well as short term weightlessness experiments (parabolic flights). The unique combination of Life Science research facilities, laboratories, and equipment at Ames Research Center is described in detail in relation to their research contributions.

  3. Relativistic heavy ion facilities: worldwide

    International Nuclear Information System (INIS)

    A review of relativistic heavy ion facilities which exist, are in a construction phase, or are on the drawing boards as proposals is presented. These facilities span the energy range from fixed target machines in the 1 to 2 GeV/nucleon regime, up to heavy ion colliders of 100 GeV/nucleon on 100 GeV/nucleon. In addition to specifying the general features of such machines, an outline of the central physics themes to be carried out at these facilities is given, along with a sampling of the detectors which will be used to extract the physics. 22 refs., 17 figs., 3 tabs

  4. Expertise concerning the request by the ZWILAG Intermediate Storage Facility Wuerenlingen AG for granting of a licence for the building and operation of the Central Intermediate Storage Facility for radioactive wastes; Gutachten zum Gesuch der ZWILAG Zwischenlager Wuerenlingen AG um Erteilung der Bewilligung fuer Bau und Betrieb des Zentralen Zwischenlagers fuer radioaktive Abfaelle

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-15

    On July 15, 1993, the Intermediate Storage Facility Wuerenlingen AG (ZWILAG) submitted a request to the Swiss Federal Council for granting of a license for the construction and operation of a central intermediate storage facility for radioactive wastes. The project foresees intermediate storage halls as well as conditioning and incineration installations. The Federal Agency for the Safety of Nuclear Installations (HSK) has to examine the project from the point of view of nuclear safety. The present report presents the results of this examination. Different waste types have to be treated in ZWILAG: spent fuel assemblies from Swiss nuclear power plants (KKWs); vitrified, highly radioactive wastes from reprocessing; intermediate- and low-level radioactive wastes from KKWs and from reprocessing; wastes from the dismantling of nuclear installations; and wastes from medicine, industry and research. The wastes are partitioned into three categories: high-level (HAA) radioactive wastes containing, amongst others, α-active nuclides; intermediate-level (MAA) radioactive wastes; and low-level (SAA) radioactive wastes. The projected installation consists of three repository halls (one for each waste category), a hot cell, a conditioning plant and an incineration and melting installation. The HAA repository can accept 200 transport and storage containers with vitrified high-level wastes or spent fuel assemblies. The expected radioactivity amounts to 10^20 Bq, including 10^18 Bq of α-active nuclides. The thermal power produced by decay is released to the environment by natural circulation of air. The ventilation system is designed for a maximum power of 5.8 MW. Severe requirements are imposed on the containers as far as tightness and shielding against radiation are concerned. In the repository for MAA wastes the maximum radioactivity is 10^18 Bq, with 10^15 Bq of α-active nuclides. The maximum thermal power of 250 kW is removed by forced air cooling. Because

  5. Canastota Renewable Energy Facility Project

    Energy Technology Data Exchange (ETDEWEB)

    Blake, Jillian; Hunt, Allen

    2013-12-13

    The project was implemented at the Madison County Landfill located in the Town of Lincoln, Madison County, New York. Madison County has owned and operated the solid waste and recycling facilities at the Buyea Road site since 1974. At the onset of the project, the County owned and operated facilities there including three separate landfills, a residential solid waste disposal and recycled material drop-off facility, a recycling facility, and associated administrative, support and environmental control facilities. The putrescible waste in the landfills undergoes anaerobic decomposition within the waste mass and generates landfill gas, which is approximately 50% methane. In order to recover this gas, the landfill was equipped with gas collection systems on both the east and west sides of Buyea Road, which bring the gas to a central point for destruction. In order to derive a beneficial use from the collected landfill gas, the County decided to issue a Request for Proposals (RFP) for the future use of the generated gas.

  6. The Effect of Central Executive Load on Adults' Strategy Use in Computational Estimation

    Institute of Scientific and Technical Information of China (English)

    司继伟; 杨佳; 贾国敬; 周超

    2012-01-01

    Many models of strategy choice in arithmetic cognition have been proposed; however, the role of central executive load in strategy utilization is still far from clear. A previous study using the Choice Reaction Time task (CRT) found that working memory load did not affect children's strategy utilization (Imbo & Vandierendonck, 2007). In another study, Logie, Gilhooly and Wynn (1994) reported that different sub-tasks affected mental arithmetic. More recent studies also revealed that inhibition and shifting capacities mediated age-related differences in strategy selection (Lemaire & Lecacheur, 2011; Hodzik & Lemaire, 2011). The dual-task paradigm is commonly used to explore working memory load in arithmetic performance, and the choice/no-choice method is the standard way to obtain unbiased data about strategy utilization. In this study, we employed the dual-task paradigm and the choice/no-choice method to investigate the influence of central executive load on individual strategy utilization during arithmetic processing. 128 college students were tested with the dual-task paradigm and the choice/no-choice method. They were asked to complete a two-digit addition computational estimation task and a secondary task at the same time. The experimental design was: 5 (consistent-high load, consistent-low load, inconsistent-high load, inconsistent-low load, no load) x 4 (free-choice condition, best-strategy choice condition, no-choice/rounding-up condition, no-choice/rounding-down condition). The main task was to complete 30 two-digit addition questions, and the secondary task was Han and Kim's (2004) design with some modifications. Results showed that: 1) central executive load did not affect adults' strategy distribution, but compared with the free-choice condition, adults used the rounding-down strategy less under the best-strategy choice condition (F(4,123) = 0.58, p > 0.05); 2) central executive load affected how participants selected (F(4,123) = 11.10, p < 0.05) and executed (F

  7. Centrally Banked Cryptocurrencies

    OpenAIRE

    Danezis, G.; Meiklejohn, S.

    2016-01-01

    Current cryptocurrencies, starting with Bitcoin, build a decentralized blockchain-based transaction ledger, maintained through proofs-of-work that also serve to generate a monetary supply. Such decentralization has benefits, such as independence from national political control, but also significant limitations in terms of computational costs and scalability. We introduce RSCoin, a cryptocurrency framework in which central banks maintain complete control over the monetary supply, but rely on...

  8. Centrally Banked Cryptocurrencies

    OpenAIRE

    Danezis, George; Meiklejohn, Sarah

    2015-01-01

    Current cryptocurrencies, starting with Bitcoin, build a decentralized blockchain-based transaction ledger, maintained through proofs-of-work that also generate a monetary supply. Such decentralization has benefits, such as independence from national political control, but also significant limitations in terms of scalability and computational cost. We introduce RSCoin, a cryptocurrency framework in which central banks maintain complete control over the monetary supply, but rely on a distribut...

  9. Computer based training simulator for Hunterston Nuclear Power Station

    International Nuclear Information System (INIS)

    For reasons which are stated, the Hunterston-B nuclear power station automatic control system includes a manual over-ride facility. It is therefore essential for the station engineers to be trained to recognise and control all feasible modes of plant and logic malfunction. A training simulator has been built which consists of a replica of the shutdown monitoring panel in the Central Control Room and is controlled by a mini-computer. This paper highlights the computer aspects of the simulator and relevant derived experience, under the following headings: engineering background; shutdown sequence equipment; simulator equipment; features; software; testing; maintenance. (U.K.)

  10. Centralized digital control of accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Melen, R.E.

    1983-09-01

    In contrasting the title of this paper with a second paper to be presented at this conference entitled Distributed Digital Control of Accelerators, a potential reader might be led to believe that this paper will focus on systems whose computing intelligence is centered in one or more computers in a centralized location. Instead, this paper will describe the architectural evolution of SLAC's computer based accelerator control systems with respect to the distribution of their intelligence. However, the use of the word centralized in the title is appropriate because these systems are based on the use of centralized large and computationally powerful processors that are typically supported by networks of smaller distributed processors.

  11. Social Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Adam Mcmahon

    2011-08-01

    While both volunteer computing and social networks have proved successful, the merging of these two models is a new field: Social Volunteer Computing. A Social Volunteer Computing system utilizes the relationships within a social network to determine how computational resources flow towards tasks that need to be completed, and the results of these computations are added back into the social network as content. Such a system will provide scientists and artists a new facility to obtain computational resources and disseminate their work. RenderWeb 2.0, a prototype Social Volunteer Computing system, is introduced that allows animations created in Blender to be distributed and rendered within Facebook.

  12. Treated Effluent Disposal Facility (TEDF) Operator Training Station (OTS) System Configuration Management Plan

    Energy Technology Data Exchange (ETDEWEB)

    Carter, R.L. Jr.

    1994-06-01

    The Treated Effluent Disposal Facility Operator Training Station (TEDF OTS) is a computer-based training tool designed to aid plant operations and engineering staff in familiarizing themselves with the TEDF Central Control System (CCS). It consists of PC-compatible computers and a Programmable Logic Controller (PLC) designed to emulate the responses of various plant components connected to or under the control of the CCS. The system trains operators by simulating normal operation, but it can also force failures of different equipment, allowing the operator to react to and observe the events. The paper describes the organization, responsibilities, system configuration management activities, software, and action plans for fully utilizing the simulation program.

  13. Facile synthesis of silver nanoparticles and its antibacterial activity against Escherichia coli and unknown bacteria on mobile phone touch surfaces/computer keyboards

    Science.gov (United States)

    Reddy, T. Ranjeth Kumar; Kim, Hyun-Joong

    2016-07-01

    In recent years, there has been significant interest in the development of novel metallic nanoparticles using various top-down and bottom-up synthesis techniques. Kenaf is a huge biomass product and a potential component for industrial applications. In this work, we investigated the green synthesis of silver nanoparticles (AgNPs) by using kenaf (Hibiscus cannabinus) cellulose extract and sucrose, which act as stabilizing and reducing agents in solution. With this method, by changing the pH of the solution as a function of time, we studied the optical, morphological and antibacterial properties of the synthesized AgNPs. In addition, these nanoparticles were characterized by ultraviolet-visible spectroscopy, transmission electron microscopy (TEM), field-emission scanning electron microscopy, Fourier transform infrared (FTIR) spectroscopy and energy-dispersive X-ray spectroscopy (EDX). As the pH of the solution varies, the surface plasmon resonance peak also varies. A faster rate of reaction at pH 10 compared with that at pH 5 was identified. TEM micrographs confirm that the shapes of the particles are spherical and polygonal. Furthermore, the average size of the nanoparticles synthesized at pH 5, pH 8 and pH 10 is 40.26, 28.57 and 24.57 nm, respectively. The structure of the synthesized AgNPs was identified as face-centered cubic (fcc) by X-ray diffraction (XRD). The compositional analysis was determined by EDX. FTIR confirms that the kenaf cellulose extract and sucrose act as stabilizing and reducing agents for the silver nanoparticles. Meanwhile, these AgNPs exhibited size-dependent antibacterial activity against Escherichia coli (E. coli) and two other unknown bacteria from mobile phone screens and computer keyboard surfaces.

  14. A central spent fuel storage in Sweden

    International Nuclear Information System (INIS)

    A planned central spent fuel storage facility in Sweden is described. The nuclear power program and quantities of spent fuel generated in Sweden is discussed. A general description of the facility is given with emphasis on the lay-out of the buildings, transport casks and fuel handling. Finally a possible design of a Swedish transportation system is discussed. (author)

  15. Multipurpose dry storage facilities

    International Nuclear Information System (INIS)

    SGN has gained considerable experience in the design and construction of interim storage facilities for spent fuel and various nuclear waste, and can therefore propose single-product and multiproduct facilities capable of accommodating all types of waste in a single structure. The pooling of certain functions (transport cask reception, radiation protection), the choice of optimized technologies to meet the specific needs of the clients (automatic transfer by shielded cask or nuclearized crane) and the use of the same type of well to cool the heat-releasing packages (glass canisters, fuel elements) make it possible to propose industrially proven and cost-effective solutions. The studies conducted by SGN on behalf of the Dutch company COVRA (Centrale Organisatie Voor Radioactief Afval) offer an example of the application of this new concept. This paper first presents the SGN experience through a short description of reference storage facilities for various types of products (MLW, HLW and spent fuel). It goes on with a typical application to show how these proven technologies are combined to obtain single-product or multiproduct facilities tailored to the client's specific requirements. (author)

  16. Computed Tomography (CT) -- Head

    Medline Plus


  17. Computed Tomography (CT) -- Sinuses

    Medline Plus


  18. A Computable Economist’s Perspective on Computational Complexity

    OpenAIRE

    Vela Velupillai, K.

    2007-01-01

    A computable economist's view of the world of computational complexity theory is described. This means that the model of computation underpinning theories of computational complexity plays a central role. The emergence of computational complexity theories from diverse traditions is emphasised. The unifications that emerged in the modern era were codified by means of the notions of efficiency of computations, non-deterministic computations, completeness, reducibility and verifiability - all three of...

  19. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-94, with projections to 2020; (supplement one to U.S. Geological Survey Water-resources investigations report 94-4251)

    Science.gov (United States)

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.). Output files resulting from the computer simulations are included for reference.

  20. Marina Facilities

    OpenAIRE

    2014-01-01

    The CIRPAS main facility and headquarters are at Marina Municipal Airport (formerly Fritchie Field, Fort Ord) in Marina, California. CIRPAS has a 30,000 sq. ft. maintenance hangar there, which houses staff offices, an instrument and calibration laboratory, maintenance and payload integration shops, conference rooms, and a flight planning and operations control center.

  1. Correlation among Library Facilities: An Analytical Study

    OpenAIRE

    Y, Srinivasa Rao

    2011-01-01

    Libraries are a central part of the academic system. They are empowered and enriched by their facilities. These facilities help users (faculty, students and researchers) not only to borrow physically available resources but also to browse and search catalogues, access databases and make use of services simultaneously. Provision of multiple library facilities and their correlation can have a strong impact on institutional outcomes. Therefore, this study has been conduct...

  2. A Bioinformatics Facility for NASA

    Science.gov (United States)

    Schweighofer, Karl; Pohorille, Andrew

    2006-01-01

    Building on an existing prototype, we have fielded a facility with bioinformatics technologies that will help NASA meet its unique requirements for biological research. This facility consists of a cluster of computers capable of performing computationally intensive tasks, software tools, databases and knowledge management systems. Novel computational technologies for analyzing and integrating new biological data and already existing knowledge have been developed. With continued development and support, the facility will fulfill NASA's strategic bioinformatics needs in astrobiology and space exploration. As a demonstration of these capabilities, we present a detailed analysis of how spaceflight factors impact gene expression in the liver and kidney of mice flown aboard shuttle flight STS-108. We have found that many genes involved in signal transduction, cell cycle, and development respond to changes in microgravity, but that most metabolic pathways appear unchanged.

  3. Europa central [Central Europe]

    Directory of Open Access Journals (Sweden)

    Karel BARTOSEK

    2010-02-01

    French research continues to take an interest in Central Europe. There are, of course, limits to this interest in the general climate of my new homeland: in the ignorance produced by France's long lack of interest in this region after the Second World War, and in the behaviour and thinking of the political class and the media (an anecdote to illustrate this climate: during the preparation of our colloquium «Refugees and immigrants from Central Europe in the antifascist movement and the Resistance in France, 1933-1945», held in Paris in October 1986, the problem of definition was raised in concrete and «practical» terms. And there was then an eminent historian for whom Germany was not part of Central Europe!).

  4. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to the renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research, including theoretic developments, new computational alg

  5. Expert systems for protective monitoring of facilities

    International Nuclear Information System (INIS)

    In complex plants, the possibility of serious operator error always exists to some extent, but this can be especially true during an experiment or some other unusual exercise. Possible contributing factors to operational error include personnel fatigue, misunderstanding in communication, mistakes in executing orders, uncertainty about the delegated authority, pressure to meet a demanding schedule, and a lack of understanding of the possible consequences of deliberate violations of the facility's established operating procedures. Authoritative reports indicate that most of these factors were involved in the disastrous Soviet Chernobyl-4 nuclear reactor accident in April 1986, which, ironically, occurred when a safety experiment was being conducted. Given the computer hardware and software now available for implementing expert systems together with integrated signal monitoring and communications, plant protection could be enhanced by an expert system with extended features to monitor the plant. The system could require information from the operators on a rigidly enforced schedule and automatically log in and report on a scheduled time basis to authorities at a central remote site during periods of safe operation. Additionally, the system could warn an operator or automatically shut down the plant in case of dangerous conditions, while simultaneously notifying independent, responsible, off-site personnel of the action taken. This approach would provide protection beyond that provided by typical facility scram circuits. This paper presents such an approach to implementing an expert system for plant protection, together with specific hardware and software configurations. The Chernobyl accident is used as the basis of discussion

  6. Central Control Over Distributed Routing

    OpenAIRE

    Vissicchio, Stefano; Tilmans, Olivier; Vanbever, Laurent; Rexford, Jennifer; ACM SIGCOMM

    2015-01-01

    Centralizing routing decisions offers tremendous flexibility, but sacrifices the robustness of distributed protocols. In this paper, we present Fibbing, an architecture that achieves both flexibility and robustness through central control over distributed routing. Fibbing introduces fake nodes and links into an underlying link-state routing protocol, so that routers compute their own forwarding tables based on the augmented topology. Fibbing is expressive, and readily supports flexible load b...
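    The core mechanism can be illustrated with ordinary shortest-path routing: because each router forwards along the paths it computes from the topology it sees, injecting a fake node with suitably chosen link costs changes the resulting forwarding table. The sketch below is illustrative only (it is not the Fibbing implementation), and the topology and costs are invented.

      # Sketch of augmented-topology routing: adding a fake node changes the
      # shortest path, and therefore the next hop, that a router computes.
      import heapq

      def dijkstra(graph, src):
          """graph: {node: {neighbor: cost}}. Returns {node: (dist, first_hop)}."""
          dist = {src: (0, None)}
          pq = [(0, src, None)]
          while pq:
              d, u, first = heapq.heappop(pq)
              if d > dist[u][0]:
                  continue                      # stale queue entry
              for v, w in graph.get(u, {}).items():
                  nd = d + w
                  nf = v if u == src else first  # remember the first hop from src
                  if v not in dist or nd < dist[v][0]:
                      dist[v] = (nd, nf)
                      heapq.heappush(pq, (nd, v, nf))
          return dist

      # Real topology: A prefers the direct A-D link to reach D.
      real = {"A": {"B": 1, "D": 2}, "B": {"A": 1, "C": 1},
              "C": {"B": 1, "D": 1}, "D": {"A": 2, "C": 1}}
      print("next hop A->D (real):", dijkstra(real, "A")["D"][1])        # D

      # Augmented topology: a fake node F advertises a cheap path to D via B,
      # so A's shortest path (and its forwarding table entry) now goes through B.
      augmented = {n: dict(adj) for n, adj in real.items()}
      augmented["B"]["F"] = 0.1
      augmented["F"] = {"D": 0.1}
      print("next hop A->D (augmented):", dijkstra(augmented, "A")["D"][1])  # B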

  7. Women and Computers: An Introduction.

    Science.gov (United States)

    Perry, Ruth; Greber, Lisa

    1990-01-01

    Discusses women's central role in the development of the computer and their present day peripheral position, a progression paralleled in the fields of botany, medical care, and obstetrics. Affirms the importance of computer education to women. (DM)

  8. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  9. Large-coil-test-facility fault-tree analysis

    International Nuclear Information System (INIS)

    An operating-safety study is being conducted for the Large Coil Test Facility (LCTF). The purpose of this study is to provide the facility operators and users with added insight into potential problem areas that could affect the safety of personnel or the availability of equipment. This is a preliminary report on Phase I of that study. A central feature of the study is the incorporation of engineering judgements (by LCTF personnel) into an outside, overall view of the facility. The LCTF was analyzed in terms of 32 subsystems, each of which is subject to failure from any of 15 generic failure initiators. The study identified approximately 40 primary areas of concern, which were subjected to a computer analysis as an aid in understanding the complex subsystem interactions that can occur within the facility. The study did not analyze in detail the internal structure of the subsystems at the individual component level. A companion study using traditional fault-tree techniques did analyze approximately 20% of the LCTF at the component level. A comparison between these two analysis techniques is included in Section 7.
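    For readers unfamiliar with fault-tree evaluation, the sketch below shows the basic computation the companion study relies on: the probability of a top event is derived from basic-event probabilities through AND/OR gates, assuming independent events. The tree and the probabilities are hypothetical and have no relation to the LCTF study.

      # Minimal fault-tree evaluation sketch: OR gates combine as 1 - prod(1 - p),
      # AND gates as prod(p), assuming independent basic events.
      from math import prod

      def evaluate(gate, basic):
          """gate: ('AND'|'OR', [children]) or a basic-event name (str)."""
          if isinstance(gate, str):
              return basic[gate]
          kind, children = gate
          probs = [evaluate(c, basic) for c in children]
          if kind == "AND":
              return prod(probs)
          if kind == "OR":
              return 1 - prod(1 - p for p in probs)
          raise ValueError(f"unknown gate type: {kind}")

      # Hypothetical example: loss of cooling requires (both pumps fail OR a
      # line break) AND failure of the backup supply.
      basic_events = {"pump_A": 1e-3, "pump_B": 1e-3,
                      "line_break": 5e-4, "backup_fails": 1e-2}
      tree = ("AND",
              [("OR", [("AND", ["pump_A", "pump_B"]), "line_break"]),
               "backup_fails"])
      print(f"top event probability ~ {evaluate(tree, basic_events):.2e}")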

  10. Main Facilities

    International Nuclear Information System (INIS)

    This chapter discusses the main nuclear facilities available at the Malaysian Institute for Nuclear Technology Research (MINT). As a national research institute whose core activities are nuclear science and technology, MINT operates commercializable radiation irradiators, pilot plants and fully equipped laboratories. The characteristics and functions of the RTP (PUSPATI TRIGA reactor), the Cobalt-60 gamma irradiator, the electron beam accelerators, and the radioactive waste management centre are elaborated.

  11. Nuclear facility

    International Nuclear Information System (INIS)

    In nuclear facilities that have a fuel storage pool in a spent fuel pit building, there is a filter for each pool through which the fuel pit water is pumped. According to the invention, the filter is provided with an independently movable housing placed beneath the surface of the pool water and fixed to the lateral side of the pool by means of detachable fixtures. (orig./RW)

  12. Operating procedures: Fusion Experiments Analysis Facility

    International Nuclear Information System (INIS)

    The Fusion Experiments Analysis Facility (FEAF) is a computer facility based on a DEC VAX 11/780 computer. It became operational in late 1982. At that time two manuals were written to aid users and staff in their interactions with the facility. This manual is designed as a reference to assist the FEAF staff in carrying out their responsibilities. It is meant to supplement equipment and software manuals supplied by the vendors. Also this manual provides the FEAF staff with a set of consistent, written guidelines for the daily operation of the facility

  13. [DNA computing].

    Science.gov (United States)

    Błasiak, Janusz; Krasiński, Tadeusz; Popławski, Tomasz; Sakowski, Sebastian

    2011-01-01

    Biocomputers can be an alternative to traditional "silicon-based" computers, whose continued development may be limited by further miniaturization (imposed by the Heisenberg uncertainty principle) and by the increasing amount of information exchanged between the central processing unit and the main memory (the von Neumann bottleneck). The idea of DNA computing came true for the first time in 1994, when Adleman solved the Hamiltonian Path Problem using short DNA oligomers and DNA ligase. In the early 2000s a series of biocomputer models was presented, with a seminal work by Shapiro and his colleagues who presented a molecular two-state finite automaton in which the restriction enzyme FokI constituted the hardware and short DNA oligomers served as software as well as input/output signals. DNA molecules also provided the energy for this machine. DNA computing can be exploited in many applications, from studies of gene expression patterns to the diagnosis and therapy of cancer. Research on DNA computing is still in progress both in vitro and in vivo, and the promising results of this research give hope for a breakthrough in computer science. PMID:21735816
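    The problem Adleman encoded in DNA can, of course, also be solved in software by exhaustive search; the sketch below checks all vertex orderings of a small directed graph for a Hamiltonian path. The example graph is invented and is not Adleman's original seven-vertex instance.

      # Brute-force Hamiltonian path search for a small directed graph.
      from itertools import permutations

      def hamiltonian_path(vertices, edges, start, end):
          """Return a path visiting every vertex exactly once, or None."""
          edge_set = set(edges)
          for middle in permutations(v for v in vertices if v not in (start, end)):
              path = (start, *middle, end)
              if all((a, b) in edge_set for a, b in zip(path, path[1:])):
                  return path
          return None

      # Hypothetical 5-vertex instance.
      vertices = [0, 1, 2, 3, 4]
      edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 2), (2, 4), (1, 3)]
      print(hamiltonian_path(vertices, edges, start=0, end=4))  # (0, 1, 2, 3, 4)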

  14. New super-computing facility in RIKEN

    International Nuclear Information System (INIS)

    A new supercomputer, a Fujitsu VPP500/28, was installed at the Institute of Physical and Chemical Research (RIKEN) at the end of March, 1994. It consists of 28 processing elements (PEs) connected by a high-speed crossbar switch. The switch is a combination of GaAs and ECL circuitry with a peak bandwidth of 800 Mbyte per second. Each PE consists of a GaAs/ECL vector processor with 1.6 Gflops peak speed and 256 Mbyte of SRAM local memory. In addition, there are 8 GByte of DRAM space, two 100 Gbyte RAID disks and a 10 TByte archive based on the SONY File Bank system. The author ran three major benchmarks on this machine: modified LINPACK, lattice QCD and FFT. In the modified LINPACK benchmark, a sustained speed of about 28 Gflops is achieved by removing the restriction on the size of the matrices. In the lattice QCD benchmark, a sustained speed of about 30 Gflops is achieved for inverting the staggered fermion propagation matrix on a 32^4 lattice. In the FFT benchmark, real data of 32, 128, 512, and 2048 MByte are Fourier-transformed. The sustained speeds are respectively 21, 21, 20, and 19 Gflops. These numbers were obtained after only a few weeks of coding effort and can be improved further.
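    The FFT benchmark idea is easy to reproduce on any machine: time a one-dimensional complex transform and convert to flop/s using the conventional 5*N*log2(N) operation count. The sketch below is illustrative only; its sizes and results have nothing to do with the VPP500 figures quoted above.

      # Rough FFT throughput estimate: time a 1-D complex FFT and convert to
      # Gflop/s using the standard 5*N*log2(N) operation count.
      import time
      import numpy as np

      def fft_gflops(n, repeats=5):
          data = np.random.rand(n) + 1j * np.random.rand(n)   # complex128 input
          best = float("inf")
          for _ in range(repeats):
              t0 = time.perf_counter()
              np.fft.fft(data)
              best = min(best, time.perf_counter() - t0)
          flops = 5.0 * n * np.log2(n)                        # radix-2 estimate
          return flops / best / 1e9

      for n in (2**20, 2**22, 2**24):
          print(f"N = {n:>9d}: ~{fft_gflops(n):.1f} Gflop/s")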

  15. Designing for the home: a comparative study of support aids for central heating systems.

    Science.gov (United States)

    Sauer, J; Wastell, D G; Schmeink, C

    2009-03-01

    The study examined the influence of different types of enhanced system support on user performance during the management of a central heating system. A computer-based simulation of a central heating system, called CHESS V2.0, was used to model different interface options, providing different support facilities to the user (e.g., historical, predictive, and instructional displays). Seventy-five participants took part in the study and completed a series of operational scenarios under different support conditions. The simulation environment allowed the collection of performance measures (e.g., energy consumption), information sampling, and system control behaviour. Subjective user evaluations of various aspects of the system were also measured. The results showed performance gains for predictive displays whereas no such benefits were observed for the other display types. The data also revealed that status and predictive displays were valued most highly by users. The implications of the findings for designers of central heating systems are discussed. PMID:18433730

  16. About the Facilities for the Aged in Central Urban Areas of Ningbo

    Institute of Scientific and Technical Information of China (English)

    2013-01-01

    As the population aging process speeds up in China, enabling the elderly to live comfortably in their later years has become one of the main problems faced by major cities. On the grounds that the supply-demand balance of facilities for the aged is the material basis for this, this article analyzes the supply of and demand for such facilities in the central urban areas of Ningbo, describes the problems in their provision, and suggests measures, at both the physical and service levels, to resolve the current difficulties.

  17. CRYSNET manual. Informal report. [Hardware and software of crystallographic computing network

    Energy Technology Data Exchange (ETDEWEB)

    None,

    1976-07-01

    This manual describes the hardware and software which together make up the crystallographic computing network (CRYSNET). The manual is intended as a users' guide and also provides general information for persons without any experience with the system. CRYSNET is a network of intelligent remote graphics terminals that are used to communicate with the CDC Cyber 70/76 computing system at the Brookhaven National Laboratory (BNL) Central Scientific Computing Facility. Terminals are in active use by four research groups in the field of crystallography. A protein data bank has been established at BNL to store in machine-readable form atomic coordinates and other crystallographic data for macromolecules. The bank currently includes data for more than 20 proteins. This structural information can be accessed at BNL directly by the CRYSNET graphics terminals. More than two years of experience has been accumulated with CRYSNET. During this period, it has been demonstrated that the terminals, which provide access to a large, fast third-generation computer, plus stand-alone interactive graphics capability, are useful for computations in crystallography, and in a variety of other applications as well. The terminal hardware, the actual operations of the terminals, and the operations of the BNL Central Facility are described in some detail, and documentation of the terminal and central-site software is given. (RWR)

  18. LEGS data acquisition facility

    International Nuclear Information System (INIS)

    The data acquisition facility for the LEGS medium energy photonuclear beam line is composed of an auxiliary crate controller (ACC) acting as a front-end processor, loosely coupled to a time-sharing host computer based on a UNIX-like environment. The ACC services all real-time demands in the CAMAC crate: it responds to LAMs generated by data acquisition modules, to keyboard commands, and it refreshes the graphics display at frequent intervals. The host processor is needed only for printing histograms and recording event buffers on magnetic tape. The host also provides the environment for software development. The CAMAC crate is interfaced by a VERSAbus CAMAC branch driver

  19. Radiation interlocks: The choice between conventional hard-wired logic and computer-based systems

    International Nuclear Information System (INIS)

    During the past few years, the use of computers in radiation safety systems has become more widespread. This is not surprising given the ubiquitous nature of computers in the modern technological world. But is a computer a good choice for the central logic element of a personnel safety system? Recent accidents at computer-controlled medical accelerators indicate that extreme care must be exercised if malfunctions are to be avoided. The Department of Energy has recently established a sub-committee to formulate recommendations on the use of computers in safety systems for accelerators. This paper will review the status of the committee's recommendations and describe radiation protection interlock systems as applied to both accelerators and irradiation facilities. Comparisons are made between the conventional relay approach and designs using computers. 6 refs., 6 figs

  20. Facility Management

    OpenAIRE

    Král, David

    2012-01-01

    The topic of this bachelor's thesis is finding a way to increase the long-term efficiency and prosperity of a company which, as part of its business activities, manages and maintains its own real estate in the centre of Brno. The thesis starts from the current state of the company's facility management and a definition of its strengths and weaknesses. The basis for the proposal of effective facility management is a financial analysis of the company and the monitoring of costs, including their optimization. The main contribution of my bachelor's thesis ...

  1. Biotechnology Facility (BTF) for ISS

    Science.gov (United States)

    1998-01-01

    Engineering mockup shows the general arrangement of the planned Biotechnology Facility inside an EXPRESS rack aboard the International Space Station. This layout includes a gas supply module (bottom left), control computer and laptop interface (bottom right), two rotating wall vessels (top right), and support systems.

  2. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'Entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent-based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  3. Facility model for the Los Alamos Plutonium Facility

    International Nuclear Information System (INIS)

    The Los Alamos Plutonium Facility contains more than sixty unit processes and handles a large variety of nuclear materials, including many forms of plutonium-bearing scrap. The management of the Plutonium Facility is supporting the development of a computer model of the facility as a means of effectively integrating the large amount of information required for material control, process planning, and facility development. The model is designed to provide a flexible, easily maintainable facility description that allows the facility to be represented at any desired level of detail within a single modeling framework, and to do this using a model program and data files that can be read and understood by a technically qualified person without modeling experience. These characteristics were achieved by structuring the model so that all facility data is contained in data files, formulating the model in a simulation language that provides a flexible set of data structures and permits a near-English-language syntax, and using a description for unit processes that can represent either a true unit process or a major subsection of the facility. Use of the model is illustrated by applying it to two configurations of a fictitious nuclear material processing line.

  4. Safeguards Automated Facility Evaluation (SAFE) methodology

    International Nuclear Information System (INIS)

    An automated approach to facility safeguards effectiveness evaluation has been developed. This automated process, called Safeguards Automated Facility Evaluation (SAFE), consists of a collection of operational modules for facility characterization, the selection of critical paths, and the evaluation of safeguards effectiveness along these paths. The technique has been implemented on an interactive computer time-sharing system and makes use of computer graphics for the processing and presentation of information. Using this technique, a comprehensive evaluation of a safeguards system can be provided by systematically varying the parameters that characterize the physical protection components of a facility to reflect the perceived adversary attributes and strategy, environmental conditions, and site operational conditions. The SAFE procedure has broad applications in the nuclear facility safeguards field as well as in the security field in general. Any fixed facility containing valuable materials or components to be protected from theft or sabotage could be analyzed using this same automated evaluation technique.

  5. Radiation safety training for accelerator facilities

    International Nuclear Information System (INIS)

    In November 1992, a working group was formed within the U.S. Department of Energy's (DOE's) accelerator facilities to develop a generic safety training program to meet the basic requirements for individuals working in accelerator facilities. This training, by necessity, includes sections for inserting facility-specific information. The resulting course materials were issued by DOE as a handbook under its technical standards in 1996. Because experimenters may be at a facility for only a short time and often at odd times during the day, the working group felt that computer-based training would be useful. To that end, Lawrence Livermore National Laboratory (LLNL) and Argonne National Laboratory (ANL) together have developed a computer-based safety training program for accelerator facilities. This interactive course not only enables trainees to receive facility-specific information, but also allows them to time the training to their schedule and tailor it to their level of expertise.

  6. Medical Image Analysis Facility

    Science.gov (United States)

    1978-01-01

    To improve the quality of photos sent to Earth by unmanned spacecraft, NASA's Jet Propulsion Laboratory (JPL) developed a computerized image enhancement process that brings out detail not visible in the basic photo. JPL is now applying this technology to biomedical research in its Medical Image Analysis Facility, which employs computer enhancement techniques to analyze x-ray films of internal organs, such as the heart and lung. A major objective is study of the effects of stress on persons with heart disease. In animal tests, computerized image processing is being used to study coronary artery lesions and the degree to which they reduce arterial blood flow when stress is applied. The photos illustrate the enhancement process. The upper picture is an x-ray photo in which the artery (dotted line) is barely discernible; in the post-enhancement photo at right, the whole artery and the lesions along its wall are clearly visible. The Medical Image Analysis Facility offers a faster means of studying the effects of complex coronary lesions in humans, and the research now being conducted on animals is expected to have important application to diagnosis and treatment of human coronary disease. Other uses of the facility's image processing capability include analysis of muscle biopsy and pap smear specimens, and study of the microscopic structure of fibroprotein in the human lung. Working with JPL on experiments are NASA's Ames Research Center, the University of Southern California School of Medicine, and Rancho Los Amigos Hospital, Downey, California.

  7. Expertise concerning the request by the ZWILAG Intermediate Storage Wuerenlingen AG for delivery of the operation licence for the conditioning installation as well as for the burning and melting installation of the central intermediate storage facility for radioactive wastes in Wuerenlingen

    International Nuclear Information System (INIS)

    On December 15, 1997, the Central Intermediate Storage Facility for radioactive materials (ZWILAG) delivered a request to the Swiss Federal Council for an operational licence for the conditioning installation and the incineration and melting installation at the Central Intermediate Storage Facility (ZZL). For the whole project, the general licence was granted on June 23, 1993. The construction of the whole installation as well as the operation of the storage halls with reception, hot cell and transhipment station were licensed on August 12, 1999. The Federal Agency for the Safety of Nuclear Installations (HSK) examined the request documentation of ZWILAG from the point of view of nuclear safety and radiation protection. The results of the examination are contained in this report. In general, the examination leads to a positive decision. During the realisation process many project adaptations have brought improvements to the installation as far as radiation protection and the treatment of wastes are concerned. HSK requests from a former licensing process were taken into account by ZWILAG. The present examination contains many remarks and requests that the HSK considers necessary. The most important remarks are issued as special references insofar as they do not appear as proposals in the licence obligations. In general, the reference form was chosen when the fulfilment of the request can be guaranteed by HSK within the framework of a release process. Several references inform ZWILAG of problems for which HSK wishes to receive the results of extended inquiries or where HSK will assist in testing or demonstrations. Other references concern requests for plant documentation. Further, calculations of radioactive material release during normal operation are considered necessary. Based on its examination, HSK concludes that the conditions are fulfilled for the safe operation of the conditioning installation and of the incineration and melting installation as long as

  8. The centralized monitoring system solution for auxiliary equipment and environmental facilities in station operation

    Institute of Scientific and Technical Information of China (English)

    张振华

    2013-01-01

    With the gradual expansion of power system construction, unattended substations have become inevitable. Because the auxiliary and environmental equipment involved in station operation is relatively numerous and varied, achieving efficient, centralized monitoring and management of this equipment is the first difficulty to be overcome. Based on automatic control and new sensing technology for collecting the relevant switch-status information of the equipment, a new solution for a centralized monitoring system for station auxiliary and environmental equipment is designed and proposed. The operating-state data of the auxiliary and environmental equipment are sent back over the electric power communication network and the internal information network, and the dispatching master station then performs centralized, real-time monitoring and management of the auxiliary equipment and the station environment. This not only greatly improves the quality and efficiency of equipment operation and maintenance, but also provides a new solution for the operation and maintenance of substation communication networks and effectively supports the development of unattended substations.

  9. Miniaturized Airborne Imaging Central Server System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is a miniaturized airborne imaging central server system (MAICSS). MAICSS is designed as a high-performance computer-based electronic backend that...

  10. Miniaturized Airborne Imaging Central Server System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is a miniaturized airborne imaging central server system (MAICSS). MAICSS is designed as a high-performance-computer-based electronic backend that...

  11. Indoor Lighting Facilities

    Science.gov (United States)

    Matsushima, Koji; Saito, Yoshinori; Ichikawa, Shigenori; Kawauchi, Takao; Tanaka, Tsuneo; Hirano, Rika; Tazuke, Fuyuki

    According to statistics from the Ministry of Land, Infrastructure and Transport, the total floor space of all building construction started was 188.87 million m2 (a 1.5% increase year on year), marking the fourth straight year of increase. Many large-scale buildings under construction in central Tokyo become fully occupied by tenants before completion. As for office buildings, comfortable and functional office spaces are required as working styles become more and more diversified, and lighting is one element of such functionality. The total floor space of construction started for exhibition pavilions, multipurpose halls, conference halls and religious architecture decreased 11.1% against the previous year. This marked a decline for 10 consecutive years and the downward trend continues. In exhibition pavilions, the light radiation is measured and adjusted throughout the year so that the lighting does not damage the artworks. Hospitals, while providing higher quality medical services and enhancing the dwelling environment of patients, are expected to meet various restrictions and requirements, including respect for privacy. Meanwhile, lighting designs for school classrooms tend to be homogeneous, yet new ideas are being promoted to strike a balance between the economical and functional aspects. The severe economic environment continues to hamper the growth of theaters and halls in both the private and public sectors. Contrary to the downsizing trend of such facilities, additional installations of lighting equipment were conspicuous, and the adoption of high-efficacy lighting appliances and intelligent function control circuits is becoming popular. In the category of stores/commercial facilities, the construction of complex facilities is a continuing trend. Indirect lighting, high-luminance discharge lamps with excellent color rendition and LEDs are being used effectively in these facilities, together with the introduction of lighting designs

  12. Modification of existing computing capabilities

    International Nuclear Information System (INIS)

    To upgrade the existing computing facilities for whole body counting data analysis, a system renovation has been undertaken which: (1) expands the existing storage and computational memory, (2) provides a multi-language computer-based system, and (3) increases data input/output speed

  13. Online analytical systems for the uranium solidification facility at SRS

    International Nuclear Information System (INIS)

    A new Uranium Solidification Facility (USF) is scheduled for completion in the latter part of 1990. This facility will convert liquid uranyl nitrate product from the Savannah River Site reactor reprocessing plant to solid uranium trioxide for shipment to Oak Ridge Y-12 plant. High- and low-level online uranium measurement systems have been developed to provide nuclear safety, process control, material control, and accountability information to the facility operator. The low level uranium concentration will be determined by an improved online alpha monitoring system for aqueous streams developed at the Savannah River Site. High uranium concentrations will be measured by a fiber-optic-based diode array absorption spectrophotometer. The online alpha monitor (OLAM-100S) consists of a 5-in. circular sample chamber and an alpha particle detector (a thin film of polyvinyl toluene plastic scintillation material) attached to a light guide and processing electronics. The light guide is optically coupled to a photomultiplier tube. Processing electronics include: amplifier/single channel analyzer, a dual counter/timer, and a high and low level alarm module. In order to implement the system online, a cell drain, automatic flushing mechanism, heat exchanger, temperature readout, and flow monitor will be used. The absorption spectrophotometer system consists of a xenon arc lamp, a nine-position fiber optic multiplexer (to allow a single system to monitor up to nine process locations), automated process samplers, diode array spectrometer, and a central computer system. 5 refs., 10 figs., 1 tab

  14. Centralized Fabric Management Using Puppet, Git, and GLPI

    Science.gov (United States)

    Smith, Jason A.; De Stefano, John S., Jr.; Fetzko, John; Hollowell, Christopher; Ito, Hironori; Karasawa, Mizuki; Pryor, James; Rao, Tejas; Strecker-Kellogg, William

    2012-12-01

    Managing the infrastructure of a large and complex data center can be extremely difficult without taking advantage of recent technological advances in administrative automation. Puppet is a seasoned open-source tool that is designed for enterprise class centralized configuration management. At the RHIC and ATLAS Computing Facility (RACF) at Brookhaven National Laboratory, we use Puppet along with Git, GLPI, and some custom scripts as part of our centralized configuration management system. In this paper, we discuss how we use these tools for centralized configuration management of our servers and services, change management requiring authorized approval of production changes, a complete version controlled history of all changes made, separation of production, testing and development systems using puppet environments, semi-automated server inventory using GLPI, and configuration change monitoring and reporting using the Puppet dashboard. We will also discuss scalability and performance results from using these tools on a 2,000+ node cluster and 400+ infrastructure servers with an administrative staff of approximately 25 full-time employees (FTEs).

  15. Centralized Fabric Management Using Puppet, Git, and GLPI

    International Nuclear Information System (INIS)

    Managing the infrastructure of a large and complex data center can be extremely difficult without taking advantage of recent technological advances in administrative automation. Puppet is a seasoned open-source tool that is designed for enterprise class centralized configuration management. At the RHIC and ATLAS Computing Facility (RACF) at Brookhaven National Laboratory, we use Puppet along with Git, GLPI, and some custom scripts as part of our centralized configuration management system. In this paper, we discuss how we use these tools for centralized configuration management of our servers and services, change management requiring authorized approval of production changes, a complete version controlled history of all changes made, separation of production, testing and development systems using puppet environments, semi-automated server inventory using GLPI, and configuration change monitoring and reporting using the Puppet dashboard. We will also discuss scalability and performance results from using these tools on a 2,000+ node cluster and 400+ infrastructure servers with an administrative staff of approximately 25 full-time employees (FTEs).

  16. Control computer system for KEK 12 GeV proton synchrotron

    International Nuclear Information System (INIS)

    The control computer system for the KEK proton synchrotron is described. The system is composed of 8 minicomputers; one is the central computer, one is the software development computer, and the others are satellite computers. The central computer and the satellites are connected by high-speed data links. The satellite computers are distributed around the accelerator to shorten the cable length between signal sources and computers. Data taken by the satellite computers are sent to the central computer, where they are analyzed, displayed or printed out on various peripheral devices in the central control room and used for accelerator control. The central computer monitors the other computers and manages the programs of the satellite computers. Programs for the satellite computers are generated on this computer and saved in disk files. The software development computer is used for compilation of source programs, for linking of object program modules, and for debugging of programs. This computer is also connected to the central computer system of the laboratory. (auth.)

  17. Auto-control facility for sodium removal by chemical clean

    International Nuclear Information System (INIS)

    The author describes the technological process of sodium removal and the microcomputer control system of the sodium cleaning facility. Automatic microcomputer monitoring and control are realized in the cleaning process, and the control results are satisfactory.

  18. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020; (supplement three to U.S. Geological Survey Water-resources investigations report 94-4251)

    Science.gov (United States)

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) and revised by Kernodle (Kernodle, J.M., 1998, Simulation of ground-water flow in the Albuquerque Basin, 1901-95, with projections to 2020 (supplement two to U.S. Geological Survey Water-Resources Investigations Report 94-4251): U.S. Geological Survey Open-File Report 96-209, 54 p.). Output files resulting from the computer simulations are included for reference.

  19. UNI C - A True Internet Pioneer, the Danish Computing Centre for Research and Education

    DEFF Research Database (Denmark)

    Olesen, Dorte

    2015-01-01

    that small computers could now be purchased for local use by the university departments whereas the need for high performance computing could only be satisfied by a joint national purchase and advanced network access to this central computer facility.The new center was named UNI-C and succeeded in helping...... Danish frontline research to use innovative computing techniques and have major breakthroughs using the first massively parallel computer architectures, but the greatest impact of UNI-C on Danish society was the successful early roll out of the Internet to universities with a follow-up of establishing...... the first Danish Internet service to ordinary PC users. This very first Internet service became a great success and helped to put Denmark on the international map as one of the very early Internet adopters. It also meant that UNI-C was tasked by the Ministry of Education with delivering a number...

  20. Distributed Data Mining using a Public Resource Computing Framework

    Science.gov (United States)

    Cesario, Eugenio; de Caria, Nicola; Mastroianni, Carlo; Talia, Domenico

    The public resource computing paradigm is often used as a successful and low cost mechanism for the management of several classes of scientific and commercial applications that require the execution of a large number of independent tasks. Public computing frameworks, also known as “Desktop Grids”, exploit the computational power and storage facilities of private computers, or “workers”. Despite the inherent decentralized nature of the applications for which they are devoted, these systems often adopt a centralized mechanism for the assignment of jobs and distribution of input data, as is the case for BOINC, the most popular framework in this realm. We present a decentralized framework that aims at increasing the flexibility and robustness of public computing applications, thanks to two basic features: (i) the adoption of a P2P protocol for dynamically matching the job specifications with the worker characteristics, without relying on centralized resources; (ii) the use of distributed cache servers for an efficient dissemination and reutilization of data files. This framework is exploitable for a wide set of applications. In this work, we describe how a Java prototype of the framework was used to tackle the problem of mining frequent itemsets from a transactional dataset, and show some preliminary yet interesting performance results that prove the efficiency improvements that can derive from the presented architecture.
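    The mining task itself, independent of the distributed framework, can be illustrated with a small level-wise (Apriori-style) frequent-itemset count on a toy transactional dataset; the P2P scheduling and data-caching layers described above are not sketched here, and the data are invented.

      # Level-wise (Apriori-style) frequent-itemset counting on one node.
      from collections import Counter

      def frequent_itemsets(transactions, min_support):
          """Return {itemset (frozenset): support count} for all frequent itemsets."""
          transactions = [frozenset(t) for t in transactions]
          current = sorted({frozenset([item]) for t in transactions for item in t},
                           key=sorted)
          frequent, k = {}, 1
          while current:
              counts = Counter()
              for t in transactions:
                  for cand in current:
                      if cand <= t:              # candidate contained in transaction
                          counts[cand] += 1
              level = {c: n for c, n in counts.items() if n >= min_support}
              frequent.update(level)
              keys = sorted(level, key=sorted)
              # Join frequent k-itemsets to build (k+1)-item candidates.
              current = sorted({a | b for i, a in enumerate(keys)
                                for b in keys[i + 1:] if len(a | b) == k + 1},
                               key=sorted)
              k += 1
          return frequent

      data = [{"bread", "milk"}, {"bread", "beer", "eggs"},
              {"milk", "beer", "bread"}, {"bread", "milk", "eggs"}]
      for itemset, count in frequent_itemsets(data, min_support=2).items():
          print(sorted(itemset), count)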

  1. The M7 - A computer for on-line track-finding

    International Nuclear Information System (INIS)

    The M7 is a small fast computer with an instruction set designed to match the on-line pattern recognition problems of a high energy physics experiment. It has been used as a trigger processor in a photoproduction experiment at Fermilab. The full off-line track-finding program is being moved to the M7 to allow on-line selection of events based on detailed information and to relieve the burden on the central laboratory computer facility. The hardware, support software, and applications programs are discussed

  2. Optimizing Watershed Management by Coordinated Operation of Storing Facilities

    Science.gov (United States)

    Anghileri, Daniela; Castelletti, Andrea; Pianosi, Francesca; Soncini-Sessa, Rodolfo; Weber, Enrico

    2013-04-01

    Water storing facilities in a watershed are very often operated independently of one another to meet specific operating objectives, with no information sharing among the operators. This uncoordinated approach might result in upstream-downstream disputes and conflicts among different water users, or in inefficiencies in watershed management when looked at from the viewpoint of an ideal central decision-maker. In this study, we propose a two-step approach to design coordination mechanisms at the watershed scale, with the ultimate goal of enlarging the space for negotiated agreements between competing uses and improving the overall system efficiency. First, we compute the multi-objective centralized solution to assess the maximum potential benefits of a shift from a sector-by-sector to an ideal, fully coordinated perspective. Then, we analyze the Pareto-optimal operating policies to gain insight into suitable strategies to foster cooperation or impose coordination among the involved agents. The approach is demonstrated on an Alpine watershed in Italy where a long-lasting conflict exists between upstream hydropower production and downstream irrigation water users. Results show that a coordination mechanism can be designed that drives the current uncoordinated structure towards the performance of the ideal centralized operation.
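    The first step, computing the multi-objective centralized solution, yields a set of non-dominated (Pareto-optimal) operating policies. The sketch below shows only the non-domination filter on a handful of invented policies with two objectives to be maximized; it is not the optimization method used in the study.

      # Minimal non-domination (Pareto) filter over candidate operating policies.
      def pareto_front(policies):
          """policies: {name: (hydropower_benefit, irrigation_supply)}, both maximized."""
          front = {}
          for name, obj in policies.items():
              dominated = any(all(o2 >= o1 for o1, o2 in zip(obj, other)) and other != obj
                              for other in policies.values())
              if not dominated:
                  front[name] = obj
          return front

      candidates = {
          "hydro_priority":   (95.0, 40.0),
          "irrigation_first": (55.0, 90.0),
          "coordinated":      (85.0, 80.0),
          "uncoordinated":    (70.0, 60.0),   # dominated by "coordinated"
      }
      print(pareto_front(candidates))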

  3. North Slope, Alaska ESI: FACILITY (Facility Points)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains data for oil field facilities for the North Slope of Alaska. Vector points in this data set represent oil field facility locations. This data...

  4. Computers and neurosurgery.

    Science.gov (United States)

    Shaikhouni, Ammar; Elder, J Bradley

    2012-11-01

    At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century. PMID:22985531

  5. How to Bill Your Computer Services.

    Science.gov (United States)

    Dooskin, Herbert P.

    1981-01-01

    A computer facility billing procedure should be designed so that the full costs of a computer center operation are equitably charged to the users. Design criteria, costing methods, and management's role are discussed. (Author/MLF)

  6. Bioinformatics and Computational Core Technology Center

    Data.gov (United States)

    Federal Laboratory Consortium — Services provided by the Computer Core Facility: evaluation, purchase, set-up, and maintenance of the computer hardware and network for the 170 users in the research...

  7. Air Quality Facilities

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Facilities with operating permits for Title V of the Federal Clean Air Act, as well as facilities required to submit an air emissions inventory, and other...

  8. Basic Research Firing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Basic Research Firing Facility is an indoor ballistic test facility that has recently transitioned from a customer-based facility to a dedicated basic research...

  9. Analysis of removal of residual decay heat from interim storage facilities by means of the CFD program FLUENT

    International Nuclear Information System (INIS)

    Within the scope of nuclear licensing procedures of on-site interim storage facilities for dual purpose casks it is necessary, among other things, to provide proof of sufficient removal of the residual decay heat emitted by the casks. The results of the analyses performed for this purpose define e.g. the boundary conditions for further thermal analyses regarding the permissible cask component temperatures or the maximum permissible temperatures of the fuel cladding tubes of the fuel elements stored in the casks. Up to now, for the centralized interim storage facilities in Germany such analyses were performed on the basis of experimental investigations using scaled-down storage geometries. In the engineering phase of the Lingen on-site interim storage facility, proof was furnished for the first time using the CFD (computational fluid dynamics) program FLUENT. The program FLUENT is an internationally recognized and comprehensively verified program for the calculation of flow and heat transport processes. Starting from a brief discussion of modeling and the different boundary conditions of the computation, this contribution presents various results regarding the temperatures of air, cask surfaces and storage facility components, the mass flows through the storage facility and the heat transfer at the cask surface. The interface point to the cask-specific analyses is defined to be the cask surface

  10. An inventory of aeronautical ground research facilities. Volume 4: Engineering flight simulation facilities

    Science.gov (United States)

    Pirrello, C. J.; Hardin, R. D.; Capelluro, L. P.; Harrison, W. D.

    1971-01-01

    The general purpose capabilities of government and industry in the area of real time engineering flight simulation are discussed. The information covers computer equipment, visual systems, crew stations, and motion systems, along with brief statements of facility capabilities. Facility construction and typical operational costs are included where available. The facilities provide for economical and safe solutions to vehicle design, performance, control, and flying qualities problems of manned and unmanned flight systems.

  11. Central Pain Syndrome

    Science.gov (United States)

    NINDS Central Pain Syndrome Information Page. Central pain syndrome is a neurological condition ...

  12. Development of a Computer Program for the Analysis Logistics of PWR Spent Fuels

    International Nuclear Information System (INIS)

    It is expected that the temporary storage facilities at the nuclear power plants will be full of spent fuel within 10 years. Provided that a centralized interim storage facility is constructed along the coast of the Korean peninsula to solve this problem, a substantial amount of spent fuel will have to be transported by sea or by land every year. In this paper we developed a computer program for the analysis of the transportation logistics of the spent fuels from 4 different nuclear power plant sites to the hypothetical centralized interim storage facility and the final repository. Mass balance equations were used to analyze the logistics between the nuclear power plants and the interim storage facility. To this end a computer program, CASK, was developed using the VISUAL BASIC language. The annual transportation rates of spent fuels from the four nuclear power plant sites were determined with the CASK program. The parameter study with the program illustrated the ease of the logistics analysis. The program could also be used for the cost analysis of spent fuel transportation.
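    The year-by-year mass balance underlying such a logistics analysis can be sketched in a few lines: each year a site's pool inventory grows by its discharges and shrinks by shipments to the interim store, subject to pool and shipping capacities. All site names and numbers below are hypothetical and are not taken from the CASK program.

      # Year-by-year spent-fuel mass balance between plant pools and an interim store.
      def simulate(years, sites, ship_rate, interim_capacity):
          """sites: {name: {'inventory': tU, 'capacity': tU, 'discharge': tU/yr}}."""
          interim = 0.0
          for year in range(1, years + 1):
              for name, s in sites.items():
                  s["inventory"] += s["discharge"]              # annual discharge into the pool
                  room = max(0.0, interim_capacity - interim)   # free space at the interim store
                  shipped = min(ship_rate, s["inventory"], room)
                  s["inventory"] -= shipped                     # mass balance at the pool
                  interim += shipped                            # mass balance at the interim store
                  if s["inventory"] > s["capacity"]:
                      print(f"year {year}: pool at {name} exceeds its capacity")
          return interim, {n: round(s["inventory"], 1) for n, s in sites.items()}

      # Four placeholder sites; inventories, capacities and rates are invented.
      sites = {"site_1": {"inventory": 500.0, "capacity": 600.0, "discharge": 40.0},
               "site_2": {"inventory": 450.0, "capacity": 550.0, "discharge": 35.0},
               "site_3": {"inventory": 400.0, "capacity": 500.0, "discharge": 35.0},
               "site_4": {"inventory": 300.0, "capacity": 400.0, "discharge": 30.0}}
      print(simulate(years=10, sites=sites, ship_rate=45.0, interim_capacity=2000.0))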

  13. Coping with distributed computing

    International Nuclear Information System (INIS)

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desktop is well known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent; he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given.

  14. SRMAFTE facility checkout model flow field analysis

    Science.gov (United States)

    Dill, Richard A.; Whitesides, Harold R.

    1992-07-01

    The Solid Rocket Motor Air Flow Equipment (SRMAFTE) facility was constructed for the purpose of evaluating the internal propellant, insulation, and nozzle configurations of solid propellant rocket motor designs. This makes the characterization of the facility internal flow field very important in assuring that no facility induced flow field features exist which would corrupt the model related measurements. In order to verify the design and operation of the facility, a three-dimensional computational flow field analysis was performed on the facility checkout model setup. The checkout model measurement data, one-dimensional and three-dimensional estimates were compared, and the design and proper operation of the facility was verified. The proper operation of the metering nozzles, adapter chamber transition, model nozzle, and diffuser were verified. The one-dimensional and three-dimensional flow field estimates along with the available measurement data are compared.

  15. Facility Registry Service (FRS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Facility Registry Service (FRS) provides an integrated source of comprehensive (air, water, and waste) environmental information about facilities across EPA,...

  16. Licensed Healthcare Facilities

    Data.gov (United States)

    California Department of Resources — The Licensed Healthcare Facilities point layer represents the locations of all healthcare facilities licensed by the State of California, Department of Health...

  17. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  18. Instrument Systems Analysis and Verification Facility (ISAVF) users guide

    Science.gov (United States)

    Davis, J. F.; Thomason, J. O.; Wolfgang, J. L.

    1985-01-01

    The ISAVF facility is primarily an interconnected system of computers, special-purpose real-time hardware, and associated generalized software systems, which will permit the Instrument System Analysts, Design Engineers and Instrument Scientists to perform trade-off studies, specification development, instrument modeling, and verification of instrument hardware performance. It is not the intent of the ISAVF to duplicate or replace existing special-purpose facilities such as the Code 710 Optical Laboratories or the Code 750 Test and Evaluation facilities. The ISAVF will provide data acquisition and control services for these facilities, as needed, using remote computer stations attached to the main ISAVF computers via dedicated communication lines.

  19. Central Virginia Community College 1999 Fact Book.

    Science.gov (United States)

    Hicks, Geoffrey

    The Fact Book is an annual publication of the Office of Research, Assessment and Planning at Central Virginia Community College (CVCC). It includes data and trends from 1994-1998 and is divided into six parts: (1) "The College" includes information on the College Board, institutional history, facilities, mission and goals, revenues and…

  20. The practice of central banking in other industrialized countries

    OpenAIRE

    Richard W. Kopcke

    2002-01-01

    Central banks in larger industrialized countries increasingly favor market operations, the buying and selling of securities, over standing facilities, such as lending and deposit facilities, in conducting their monetary policies. In their market operations, foreign central banks most commonly trade securities issued or guaranteed by their governments and repurchase agreements that are backed by a variety of assets, including private securities and securities denominated in foreign currencies....

  1. Nonlinear thermal and structural analysis of a brazed solar-central-receiver panel

    Energy Technology Data Exchange (ETDEWEB)

    Napolitano, L.M. Jr.; Kanouff, M.P.

    1981-07-01

    One part of the evaluation program for a molten sodium central receiver was to be a test of a reduced-scale panel at Sandia's Central Receiver Test Facility in Albuquerque. The panel incorporates a new way of joining tubes - brazing to intermediate filler strips - which can affect the panel's lifetime. To calculate the stresses and strains for the worst-case section of the experimental panel, we have done a nonlinear elastic-plastic analysis with the MARC finite element computer code, which takes the temperature dependence of the material properties into account. From the results, tube design lifetimes are predicted. The analysis shows that concerns for cracking and reduction in lifetime are warranted, but a more detailed fracture analysis is necessary to determine whether there is a stable-crack-growth problem.

  2. The CMS Computing Model

    International Nuclear Information System (INIS)

    The CMS experiment at the LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on achieving maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community.

  3. Guide to research facilities

    Energy Technology Data Exchange (ETDEWEB)

    1993-06-01

    This Guide provides information on facilities at US Department of Energy (DOE) and other government laboratories that focus on research and development of energy efficiency and renewable energy technologies. These laboratories have opened these facilities to outside users within the scientific community to encourage cooperation between the laboratories and the private sector. The Guide features two types of facilities: designated user facilities and other research facilities. Designated user facilities are one-of-a-kind DOE facilities that are staffed by personnel with unparalleled expertise and that contain sophisticated equipment. Other research facilities are facilities at DOE and other government laboratories that provide sophisticated equipment, testing areas, or processes that may not be available at private facilities. Each facility listing includes the name and phone number of someone you can call for more information.

  4. The Education Value of Cloud Computing

    Science.gov (United States)

    Katzan, Harry, Jr.

    2010-01-01

    Cloud computing is a technique for supplying computer facilities and providing access to software via the Internet. Cloud computing represents a contextual shift in how computers are provisioned and accessed. One of the defining characteristics of cloud software service is the transfer of control from the client domain to the service provider.…

  5. The role of micro size computing clusters for small physics groups

    International Nuclear Information System (INIS)

A small physics group (3-15 persons) might use a number of computing facilities for analysis/simulation, development/testing, and teaching. Different types of computing facilities are discussed: collaboration computing facilities, a group-owned local computing cluster (including colocation), and cloud computing. The author discusses the growing variety of computing options for small groups and emphasizes the role of a group-owned computing cluster of micro size.

  6. Coverage centralities for temporal networks*

    Science.gov (United States)

    Takaguchi, Taro; Yano, Yosuke; Yoshida, Yuichi

    2016-02-01

The structure of real networked systems, such as social relationships, can be modeled as temporal networks in which each edge appears only at the prescribed time. Understanding the structure of temporal networks requires quantifying the importance of a temporal vertex, which is a pair of vertex index and time. In this paper, we define two centrality measures of a temporal vertex based on the fastest temporal paths which use the temporal vertex. The definition is free from parameters and robust against the change in time scale on which we focus. In addition, we can efficiently compute these centrality values for all temporal vertices. Using the two centrality measures, we reveal that distributions of these centrality values of real-world temporal networks are heterogeneous. For various datasets, we also demonstrate that a majority of the highly central temporal vertices are located within a narrow time window around a particular time. In other words, there is a bottleneck time at which most information sent in the temporal network passes through a small number of temporal vertices, which suggests an important role of these temporal vertices in spreading phenomena. Contribution to the Topical Issue "Temporal Network Theory and Applications", edited by Petter Holme. Supplementary material in the form of one pdf file is available from the Journal web page at http://dx.doi.org/10.1140/epjb/e2016-60498-7
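
    To make the notion of a temporal-vertex centrality concrete, the sketch below brute-forces a simplified coverage measure on a toy temporal edge list: a temporal vertex (v, t) "covers" an ordered pair (s, d) if some earliest-arrival time-respecting path from s to d passes through it. This is a simplification under assumed data, not the authors' exact fastest-path definitions or their efficient algorithm.

```python
# Brute-force sketch of a coverage-style centrality for temporal vertices
# (vertex, time). Simplified from the record's idea: a temporal vertex
# "covers" a pair (s, d) if some earliest-arrival time-respecting path
# from s to d passes through it. NOT the authors' algorithm or definition.

from collections import defaultdict
from itertools import permutations

# Toy temporal edge list: (u, v, t) means u -> v is available at time t (assumed data).
edges = [("a", "b", 1), ("b", "c", 2), ("a", "c", 4), ("c", "d", 5), ("b", "d", 3)]
nodes = sorted({x for u, v, _ in edges for x in (u, v)})

def time_respecting_paths(src, dst):
    """Enumerate all time-respecting paths src -> dst as lists of (vertex, time)."""
    results = []

    def extend(path, last_time):
        here = path[-1][0]
        if here == dst:
            results.append(list(path))
            return
        for u, v, t in edges:
            # Follow an edge only if it departs from the current vertex,
            # respects time ordering, and does not revisit a vertex.
            if u == here and t >= last_time and all(v != p[0] for p in path):
                path.append((v, t))
                extend(path, t)
                path.pop()

    extend([(src, 0)], 0)
    return results

coverage = defaultdict(int)
pairs = 0
for s, d in permutations(nodes, 2):
    paths = time_respecting_paths(s, d)
    if not paths:
        continue
    pairs += 1
    best = min(p[-1][1] for p in paths)  # earliest arrival time at d
    # Temporal vertices used by at least one earliest-arrival path (source excluded).
    covered = {tv for p in paths if p[-1][1] == best for tv in p[1:]}
    for tv in covered:
        coverage[tv] += 1

for (v, t), c in sorted(coverage.items(), key=lambda kv: -kv[1]):
    print(f"temporal vertex ({v}, t={t}) covers {c}/{pairs} reachable pairs")
```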

  7. Coverage centralities for temporal networks

    CERN Document Server

    Takaguchi, Taro; Yoshida, Yuichi

    2015-01-01

The structure of real networked systems, such as social relationships, can be modeled as temporal networks in which each edge appears only at the prescribed time. Understanding the structure of temporal networks requires quantifying the importance of a temporal vertex, which is a pair of vertex index and time. In this paper, we define two centrality measures of a temporal vertex by the proportion of (normal) vertex pairs, the quickest routes between which can (or should) use the temporal vertex. The definition is free from parameters and robust against the change in time scale on which we focus. In addition, we can efficiently compute these centrality values for all temporal vertices. Using the two centrality measures, we reveal that distributions of these centrality values of real-world temporal networks are heterogeneous. For various datasets, we also demonstrate that a majority of the highly central temporal vertices are located within a narrow time window around a particular time. In other words, there is a bo...

  8. To centralize or not to centralize?

    OpenAIRE

    Campbell, Andrew; Kunisch, Sven; Müller-Stewens, Günter

    2011-01-01

The CEO's dilemma - were the gains of centralization worth the pain it could cause? - is a perennial one. Business leaders dating back at least to Alfred Sloan, who laid out GM's influential philosophy of decentralization in a series of memos during the 1920s, have recognized that badly judged centralization can stifle initiative, constrain the ability to tailor products and services locally, and burden business divisions with high costs and poor service. Insufficient centralization can deny bus...

  9. Evaluation of Visual Computer Simulator for Computer Architecture Education

    Science.gov (United States)

    Imai, Yoshiro; Imai, Masatoshi; Moritoh, Yoshio

    2013-01-01

This paper presents a trial evaluation of a visual computer simulator in 2009-2011, which has been developed to serve simultaneously as both an instruction facility and a learning tool. It also illustrates an example of Computer Architecture education for university students and the use of an e-Learning tool for Assembly Programming in order to…

  10. AOV Facility Tool/Facility Safety Specifications

    Data.gov (United States)

Department of Transportation — Develop and maintain authorizing documents, i.e., standards that facilities must follow. These standards are references to FAA regulations and are specific to the...

  11. Composite analysis E-area vaults and saltstone disposal facilities

    Energy Technology Data Exchange (ETDEWEB)

    Cook, J.R.

    1997-09-01

    This report documents the Composite Analysis (CA) performed on the two active Savannah River Site (SRS) low-level radioactive waste (LLW) disposal facilities. The facilities are the Z-Area Saltstone Disposal Facility and the E-Area Vaults (EAV) Disposal Facility. The analysis calculated potential releases to the environment from all sources of residual radioactive material expected to remain in the General Separations Area (GSA). The GSA is the central part of SRS and contains all of the waste disposal facilities, chemical separations facilities and associated high-level waste storage facilities as well as numerous other sources of radioactive material. The analysis considered 114 potential sources of radioactive material containing 115 radionuclides. The results of the CA clearly indicate that continued disposal of low-level waste in the saltstone and EAV facilities, consistent with their respective radiological performance assessments, will have no adverse impact on future members of the public.
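
    The composite nature of the analysis (summing contributions from many sources and many radionuclides) can be illustrated with a toy bookkeeping sketch: a total impact computed as a sum over (source, nuclide) pairs of inventory times a transport factor times a dose factor. All numbers and factors below are hypothetical placeholders, not the SRS methodology or data.

```python
# Toy illustration only (NOT the SRS Composite Analysis methodology):
# a composite result as a sum over sources and radionuclides of
# inventory * transport factor * dose conversion factor.
# All quantities below are hypothetical placeholders.

inventories = {                                   # Ci, assumed example inventories
    ("vault-1", "Tc-99"): 2.0,
    ("vault-1", "I-129"): 0.5,
    ("saltstone", "Tc-99"): 5.0,
}
transport = {"vault-1": 1e-6, "saltstone": 5e-7}   # fraction reaching receptor (assumed)
dose_per_ci = {"Tc-99": 1.2e-2, "I-129": 3.4e-1}   # mrem/yr per Ci at receptor (assumed)

total = sum(q * transport[src] * dose_per_ci[nuc]
            for (src, nuc), q in inventories.items())
print(f"composite dose at receptor: {total:.3e} mrem/yr")
```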

  12. Composite analysis E-area vaults and saltstone disposal facilities

    International Nuclear Information System (INIS)

    This report documents the Composite Analysis (CA) performed on the two active Savannah River Site (SRS) low-level radioactive waste (LLW) disposal facilities. The facilities are the Z-Area Saltstone Disposal Facility and the E-Area Vaults (EAV) Disposal Facility. The analysis calculated potential releases to the environment from all sources of residual radioactive material expected to remain in the General Separations Area (GSA). The GSA is the central part of SRS and contains all of the waste disposal facilities, chemical separations facilities and associated high-level waste storage facilities as well as numerous other sources of radioactive material. The analysis considered 114 potential sources of radioactive material containing 115 radionuclides. The results of the CA clearly indicate that continued disposal of low-level waste in the saltstone and EAV facilities, consistent with their respective radiological performance assessments, will have no adverse impact on future members of the public

  13. Manual for operation of the multipurpose thermalhydraulic test facility TOPFLOW (Transient Two Phase Flow Test Facility)

    International Nuclear Information System (INIS)

The Forschungszentrum Rossendorf (FZR) e. V. is constructing a new large-scale test facility, TOPFLOW, for thermalhydraulic single-effect tests. The acronym stands for transient two phase flow test facility. It will mainly be used for the investigation of generic and applied steady-state and transient two phase flow phenomena and for the development and validation of models for computational fluid dynamics (CFD) codes. The manual of the test facility must always be available to the staff in the control room; it is a prerequisite for operation of the facility by personnel and also for reconstruction of the facility. (orig./GL)

  14. The Cost of Supplying Segmented Consumers From a Central Facility

    DEFF Research Database (Denmark)

    Turkensteen, Marcel; Klose, Andreas

Organizations regularly face the strategic marketing decision of which groups of consumers they should target. A potential problem, highlighted in Steenkamp et al. (2002), is that the target consumers may be so widely dispersed that an organization cannot serve its customers cost-effectively. We...

  15. Reminder: Mandatory Computer Security Course

    CERN Multimedia

    IT Department

    2011-01-01

Just like any other organization, CERN is permanently under attack – even right now. Consequently it's important to be vigilant about security risks, protecting CERN's reputation - and your work. The availability, integrity and confidentiality of CERN's computing services and the unhindered operation of its accelerators and experiments come down to the combined efforts of the CERN Security Team and you. In order to remain on par with the attack trends, the Security Team regularly reminds CERN users about the computer security risks, and about the rules for using CERN’s computing facilities. Therefore, a new dedicated basic computer security course has been designed informing you about the “Do’s” and “Don’ts” when using CERN's computing facilities. This course is mandatory for all persons owning a CERN computer account and must be followed once every three years. Users who have never done the course, or whose course needs to be renewe...

  16. New Mandatory Computer Security Course

    CERN Multimedia

    CERN Bulletin

    2010-01-01

Just like any other organization, CERN is permanently under attack - even right now. Consequently it's important to be vigilant about security risks, protecting CERN's reputation - and your work. The availability, integrity and confidentiality of CERN's computing services and the unhindered operation of its accelerators and experiments come down to the combined efforts of the CERN Security Team and you. In order to remain on par with the attack trends, the Security Team regularly reminds CERN users about the computer security risks, and about the rules for using CERN’s computing facilities. Since 2007, newcomers have had to follow a dedicated basic computer security course informing them about the “Do’s” and “Don’ts” when using CERN’s computing facilities. This course has recently been redesigned. It is now mandatory for all CERN members (users and staff) owning a CERN computer account and must be followed once every three years. Members who...

  17. In-facility transport code review

    Energy Technology Data Exchange (ETDEWEB)

Spore, J.W.; Boyack, B.E.; Bohl, W.R. [and others]

    1996-07-01

The following computer codes were reviewed by the In-Facility Transport Working Group for application to the in-facility transport of radioactive aerosols, flammable gases, and/or toxic gases: (1) CONTAIN, (2) FIRAC, (3) GASFLOW, (4) KBERT, and (5) MELCOR. Based on the review criteria as described in this report and the versions of each code available at the time of the review, MELCOR is the best code for the analysis of in-facility transport when multidimensional effects are not significant. When multidimensional effects are significant, GASFLOW should be used.

  18. In-facility transport code review

    International Nuclear Information System (INIS)

The following computer codes were reviewed by the In-Facility Transport Working Group for application to the in-facility transport of radioactive aerosols, flammable gases, and/or toxic gases: (1) CONTAIN, (2) FIRAC, (3) GASFLOW, (4) KBERT, and (5) MELCOR. Based on the review criteria as described in this report and the versions of each code available at the time of the review, MELCOR is the best code for the analysis of in-facility transport when multidimensional effects are not significant. When multidimensional effects are significant, GASFLOW should be used

  19. Human factors engineering report for the cold vacuum drying facility

    International Nuclear Information System (INIS)

    The purpose of this report is to present the results and findings of the final Human Factors Engineering (HFE) technical analysis and evaluation of the Cold Vacuum Drying Facility (CVDF). Ergonomics issues are also addressed in this report, as appropriate. This report follows up and completes the preliminary work accomplished and reported by the Preliminary HFE Analysis report (SNF-2825, Spent Nuclear Fuel Project Cold Vacuum Drying Facility Human Factors Engineering Analysis: Results and Findings). This analysis avoids redundancy of effort except for ensuring that previously recommended HFE design changes have not affected other parts of the system. Changes in one part of the system may affect other parts of the system where those changes were not applied. The final HFE analysis and evaluation of the CVDF human-machine interactions (HMI) was expanded to include: the physical work environment, human-computer interface (HCI) including workstation and software, operator tasks, tools, maintainability, communications, staffing, training, and the overall ability of humans to accomplish their responsibilities, as appropriate. Key focal areas for this report are the process bay operations, process water conditioning (PWC) skid, tank room, and Central Control Room operations. These key areas contain the system safety-class components and are the foundation for the human factors design basis of the CVDF

  20. Human factors engineering report for the cold vacuum drying facility

    Energy Technology Data Exchange (ETDEWEB)

    IMKER, F.W.

    1999-06-30

    The purpose of this report is to present the results and findings of the final Human Factors Engineering (HFE) technical analysis and evaluation of the Cold Vacuum Drying Facility (CVDF). Ergonomics issues are also addressed in this report, as appropriate. This report follows up and completes the preliminary work accomplished and reported by the Preliminary HFE Analysis report (SNF-2825, Spent Nuclear Fuel Project Cold Vacuum Drying Facility Human Factors Engineering Analysis: Results and Findings). This analysis avoids redundancy of effort except for ensuring that previously recommended HFE design changes have not affected other parts of the system. Changes in one part of the system may affect other parts of the system where those changes were not applied. The final HFE analysis and evaluation of the CVDF human-machine interactions (HMI) was expanded to include: the physical work environment, human-computer interface (HCI) including workstation and software, operator tasks, tools, maintainability, communications, staffing, training, and the overall ability of humans to accomplish their responsibilities, as appropriate. Key focal areas for this report are the process bay operations, process water conditioning (PWC) skid, tank room, and Central Control Room operations. These key areas contain the system safety-class components and are the foundation for the human factors design basis of the CVDF.