WorldWideScience

Sample records for ground computer facilities

  1. Communication grounding facility

    International Nuclear Information System (INIS)

    Lee, Gye Seong

    1998-06-01

    This work concerns communication grounding facilities and is made up of twelve chapters. It covers: general grounding, including its purpose, materials and thermal insulating materials; construction of grounding and super-strength grounding methods; grounding facilities, including grounding methods and building insulation; switched grounding with No. 1A and LCR; grounding facilities for transmission lines; wireless facility grounding and grounding facilities in wireless base stations; grounding of power facilities and of low-tension interior power wiring; communication facilities for railroads; installation of arresters in apartments and houses; and an introduction to earth conductivity and the measurement of grounding resistance.

  2. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance ComputingThe ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  3. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  4. TUNL computer facilities

    International Nuclear Information System (INIS)

    Boyd, M.; Edwards, S.E.; Gould, C.R.; Roberson, N.R.; Westerfeldt, C.R.

    1985-01-01

    The XSYS system has been relatively stable during the last year, and most of our efforts have involved routine software maintenance and enhancement of existing XSYS capabilities. Modifications were made in the MBD program GDAP to increase the execution speed of key GDAP routines. A package of routines has been developed to allow communication between XSYS and the new Wien filter microprocessor. Recently the authors upgraded their operating system from VMS V3.7 to V4.1, which required numerous modifications to XSYS, mostly in the command procedures. A new, reorganized edition of the XSYS manual will be issued shortly. The TUNL High Resolution Laboratory's VAX 11/750 computer has been in operation for its first full year as a replacement for the PRIME 300 computer, which was purchased in 1974 and retired nine months ago. The data acquisition system on the VAX has been in use for the past twelve months performing a number of experiments.

  5. AMRITA -- A computational facility

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, J.E. [California Inst. of Tech., CA (US); Quirk, J.J.

    1998-02-23

    Amrita is a software system for automating numerical investigations. The system is driven using its own powerful scripting language, Amrita, which facilitates both the composition and archiving of complete numerical investigations, as distinct from isolated computations. Once archived, an Amrita investigation can later be reproduced by any interested party, and not just the original investigator, for no cost other than the raw CPU time needed to parse the archived script. In fact, this entire lecture can be reconstructed in such a fashion. To do this, the script: constructs a number of shock-capturing schemes; runs a series of test problems; generates the plots shown; outputs the LaTeX to typeset the notes; and performs a myriad of behind-the-scenes tasks to glue everything together. Thus Amrita has all the characteristics of an operating system and should not be mistaken for a common-or-garden code.

  6. Lightning and surge protection of large ground facilities

    Science.gov (United States)

    Stringfellow, Michael F.

    1988-04-01

    The vulnerability of large ground facilities to direct lightning strikes and to lightning-induced overvoltages on power distribution, telephone and data communication lines is discussed. Advanced electrogeometric modeling is used for the calculation of direct strikes to overhead power lines, buildings, vehicles and objects within the facility. Possible modes of damage, injury and loss are discussed. Appropriate protection methods for overhead power lines, structures, vehicles and aircraft are suggested. Methods to mitigate the effects of transients on overhead and underground power systems, as well as within buildings and other structures, are recommended. The specification and location of low-voltage surge suppressors for the protection of vulnerable hardware, such as computers, telecommunication equipment and radar installations, are considered. The advantages and disadvantages of commonly used grounding techniques, such as single-point, multiple and isolated grounds, are compared. An example is given of the expected distribution of lightning flashes to a large airport, its buildings, structures and facilities, as well as to vehicles on the ground.
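The electrogeometric modeling mentioned in the abstract rests on the idea of a striking distance that grows with stroke current. A minimal sketch, assuming the widely used approximation r = 10·I^0.65 (the rolling-sphere form; the function name and example currents are illustrative, not from the paper):

```python
def striking_distance_m(peak_current_ka: float) -> float:
    """Approximate striking distance in metres for a lightning stroke
    of the given peak current in kA, using r = 10 * I**0.65."""
    return 10.0 * peak_current_ka ** 0.65

# Weaker strokes have shorter striking distances, so they can slip
# past air terminals positioned to intercept stronger strokes.
for i_ka in (5, 10, 31, 100):
    print(f"{i_ka:>4} kA -> {striking_distance_m(i_ka):6.1f} m")
```

This is why exposure calculations for a facility depend on the assumed stroke-current distribution, not just geometry.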

  7. Computer Security at Nuclear Facilities

    International Nuclear Information System (INIS)

    Cavina, A.

    2013-01-01

    This series of slides presents the IAEA policy concerning the development of recommendations and guidelines for computer security at nuclear facilities. A document of the Nuclear Security Series dedicated to this issue is in the final stage prior to publication. It will be the first IAEA document specifically addressing computer security. This document was necessary for three main reasons: first, not all national infrastructures have recognized and standardized computer security; second, existing international guidance is not industry specific and fails to capture some of the key issues; and third, more or less interconnected digital systems are increasingly present in the design of nuclear power plants. The security of computer systems must be based on a graded approach: the assignment of computer systems to different levels and zones should be based on their relevance to safety and security, and the risk assessment process should be allowed to feed back into and influence the graded approach.
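The graded approach described above can be sketched as a mapping from systems to security levels, with a rule constraining data flow between levels. The system names, level numbers, and flow rule below are invented for illustration only, not taken from the IAEA guidance:

```python
# Illustrative graded approach: level 1 = most critical to safety
# and security, higher numbers = less critical. All entries are
# hypothetical examples.
SYSTEMS = {
    "reactor protection system": 1,
    "plant process computer":    2,
    "maintenance planning":      3,
    "office e-mail":             4,
}

def allowed_data_flow(src: str, dst: str) -> bool:
    """One common graded-approach rule: data may flow from a more
    protected level toward less protected ones, never the reverse."""
    return SYSTEMS[src] <= SYSTEMS[dst]

print(allowed_data_flow("reactor protection system", "office e-mail"))
print(allowed_data_flow("office e-mail", "reactor protection system"))
```

The point of the sketch is the asymmetry: assignments to levels drive which interconnections are even considered in the risk assessment.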

  8. Computer-Aided Facilities Management Systems (CAFM).

    Science.gov (United States)

    Cyros, Kreon L.

    Computer-aided facilities management (CAFM) refers to a collection of software used with increasing frequency by facilities managers. The six major CAFM components are discussed with respect to their usefulness and popularity in facilities management applications: (1) computer-aided design; (2) computer-aided engineering; (3) decision support…

  9. Quantum computing from the ground up

    CERN Document Server

    Perry, Riley Tipton

    2012-01-01

    Quantum computing - the application of quantum mechanics to information - represents a fundamental break from classical information and promises to dramatically increase a computer's power. Many difficult problems, such as the factorization of large numbers, have so far resisted attack by classical computers yet are easily solved with quantum computers. If they become feasible, quantum computers will end standard practices such as RSA encryption. Most of the books or papers on quantum computing require (or assume) prior knowledge of certain areas such as linear algebra or quantum mechanics. The majority of the currently-available literature is hard to understand for the average computer enthusiast or interested layman. This text attempts to teach quantum computing from the ground up in an easily readable way, providing a comprehensive tutorial that includes all the necessary mathematics, computer science and physics.
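The "necessary mathematics" such a ground-up tutorial starts from is quite small: a qubit is a 2-component complex vector and a gate is a 2×2 unitary matrix. A minimal sketch in plain Python (names and the choice of the Hadamard gate are illustrative, not taken from the book):

```python
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-component state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]   # Hadamard gate

ket0 = [1 + 0j, 0 + 0j]          # the |0> basis state
plus = apply_gate(H, ket0)       # (|0> + |1>) / sqrt(2)

# Measurement probabilities are squared amplitudes: 50/50 here.
probs = [abs(a) ** 2 for a in plus]
print(probs)
```

Everything beyond this (multi-qubit tensor products, entanglement, Shor's factoring algorithm) builds on the same linear algebra.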

  10. 2015 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  11. 2014 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  12. BUSTED BUTTE TEST FACILITY GROUND SUPPORT CONFIRMATION ANALYSIS

    International Nuclear Information System (INIS)

    Bonabian, S.

    1998-01-01

    The main purpose and objective of this analysis is to confirm the validity of the ground support design for the Busted Butte Test Facility (BBTF). The highwall stability and the adequacy of highwall and tunnel ground support are addressed in this analysis. The design of the BBTF, including the ground support system, was performed in a separate document (Reference 5.3). Both in situ and seismic loads are considered in the evaluation of the highwall and the tunnel ground support system. Only the ground support designed in Reference 5.3 is addressed in this analysis; the additional ground support installed (still work in progress) by the constructor is not. That additional ground support was evaluated by the A/E during a site visit, and the A/E's findings and recommendations are addressed in this analysis.

  13. Guidance on the Stand Down, Mothball, and Reactivation of Ground Test Facilities

    Science.gov (United States)

    Volkman, Gregrey T.; Dunn, Steven C.

    2013-01-01

    The development of aerospace and aeronautics products typically requires three distinct types of testing resources across research, development, test, and evaluation: experimental ground testing, computational "testing" and development, and flight testing. Over the last twenty-plus years, computational methods have replaced some physical experiments, and this trend is continuing. The resulting decreased utilization of ground test capabilities, along with market forces, industry consolidation, and other factors, has led to the stand down and oftentimes closure of many ground test facilities. Ground test capabilities are (and very likely will continue to be for many years) required to verify computational results and to provide information for regimes where computational methods remain immature. Ground test capabilities are very costly to build and maintain, so once constructed and operational it may be desirable to retain access to those capabilities even if they are not currently needed. One means of doing this while reducing ongoing sustainment costs is to stand down the facility into a "mothball" status - keeping it alive to bring it back when needed. Both NASA and the US Department of Defense have policies for accomplishing the mothball of a facility, but with little detail. This paper offers a generic process to follow that can be tailored based on the needs of the owner and the applicable facility.

  14. Conducting Computer Security Assessments at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    Computer security is increasingly recognized as a key component in nuclear security. As technology advances, it is anticipated that computer and computing systems will be used to an even greater degree in all aspects of plant operations including safety and security systems. A rigorous and comprehensive assessment process can assist in strengthening the effectiveness of the computer security programme. This publication outlines a methodology for conducting computer security assessments at nuclear facilities. The methodology can likewise be easily adapted to provide assessments at facilities with other radioactive materials

  15. Proposed design criteria for a fusion facility electrical ground system

    International Nuclear Information System (INIS)

    Armellino, C.A.

    1983-01-01

    Ground grid design considerations for a nuclear fusion reactor facility are no different from those for any other facility in that the basis for design must be safety first and foremost. Unlike a conventional industrial facility, the available fault energy comes not only from the utility source and in-house rotating machinery, but also from energy storage capacitor banks, collapsing magnetic fields and DC transmission lines. It is not inconceivable that a fault condition could occur in which all available energy is discharged. The ground grid must adequately shunt this sudden energy discharge in such a way that personnel are not exposed, by step and/or touch, to energy levels in excess of the maximum tolerable levels for humans. The fault energy discharge rate is a function of the ground grid's surge impedance characteristic. Closed-loop paths must be avoided in the ground grid design so that during energy discharge no stray magnetic fields or large voltage potentials between remote points can be created by circulating currents. Single-point connection of equipment to the ground grid will afford protection to personnel and sensitive equipment by reducing the probability of circulating currents. The overall ground grid system design is best illustrated as a wagon wheel with the fusion machine at the center. Radial branches, or spokes, reach out to the perimeter limits designated by step-and-touch high-risk areas based on soil resistivity criteria. Conventional methods for the design of a ground grid with all of its radial branches are still pertinent. The center of the grid could include a deep-well single ground rod element whose length is at least equivalent to the radius of an imaginary sphere that enshrouds the immediate machine area. Special facilities such as screen rooms or other shielded areas are part of the ground grid system by way of connection to radial branches.
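The step-and-touch criteria the abstract refers to are conventionally expressed as tolerable voltages that fall with fault duration. A hedged sketch patterned on the IEEE Std 80 formulas for a 50 kg body (the surface-layer derating factor Cs is taken as given, and all numeric inputs in the example are illustrative assumptions, not from the paper):

```python
import math

def tolerable_touch_v(rho_s: float, cs: float, t_s: float) -> float:
    """Tolerable touch voltage: 1000-ohm body in series with two
    parallel foot resistances (the 1.5 * Cs * rho_s term)."""
    return (1000 + 1.5 * cs * rho_s) * 0.116 / math.sqrt(t_s)

def tolerable_step_v(rho_s: float, cs: float, t_s: float) -> float:
    """Tolerable step voltage: body plus two feet in series
    (the 6 * Cs * rho_s term)."""
    return (1000 + 6 * cs * rho_s) * 0.116 / math.sqrt(t_s)

# Example: 3000 ohm-m crushed-rock surface layer, Cs = 0.8,
# 0.5 s fault clearing time (illustrative values).
print(round(tolerable_touch_v(3000, 0.8, 0.5)), "V touch")
print(round(tolerable_step_v(3000, 0.8, 0.5)), "V step")
```

Note how both limits scale with 1/sqrt(t): faster fault clearing directly relaxes the grid design problem.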

  16. Oak Ridge Leadership Computing Facility (OLCF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Oak Ridge Leadership Computing Facility (OLCF) was established at Oak Ridge National Laboratory in 2004 with the mission of standing up a supercomputer 100 times...

  17. Computing facility at SSC for detectors

    International Nuclear Information System (INIS)

    Leibold, P.; Scipiono, B.

    1990-01-01

    The RISC-based distributed computing facility for detector simulation being developed at the SSC Laboratory is described. The first phase of this facility is scheduled for completion in early 1991. Included are the status of the project, an overview of the concepts used to model and define the system architecture, networking capabilities for user access, plans for support of physics codes, and related topics concerning the implementation of this facility.

  18. Ground test facility for nuclear testing of space reactor subsystems

    International Nuclear Information System (INIS)

    Quapp, W.J.; Watts, K.D.

    1985-01-01

    Two major reactor facilities at the INEL have been identified as easily adaptable for supporting the nuclear testing of the SP-100 reactor subsystem. They are the Engineering Test Reactor (ETR) and the Loss of Fluid Test Reactor (LOFT). In addition, there are machine shops, analytical laboratories, hot cells, and the supporting services (fire protection, safety, security, medical, waste management, etc.) necessary for conducting a nuclear test program. This paper presents the conceptual approach for modifying these reactor facilities into the ground engineering test facility for the SP-100 nuclear subsystem. 4 figs

  19. 2016 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Jim [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-01-01

    The Argonne Leadership Computing Facility (ALCF) helps researchers solve some of the world’s largest and most complex problems, while also advancing the nation’s efforts to develop future exascale computing systems. This report presents some of the ALCF’s notable achievements in key strategic areas over the past year.

  20. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  1. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  2. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling has advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in the modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  3. Computational Science at the Argonne Leadership Computing Facility

    Science.gov (United States)

    Romero, Nichols

    2014-03-01

    The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer - Mira, an IBM Blue Gene/Q system - has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.

  4. Computer codes for ventilation in nuclear facilities

    International Nuclear Information System (INIS)

    Mulcey, P.

    1987-01-01

    In this paper the authors present some computer codes, developed in recent years, for ventilation and radiation protection. These codes are used for safety analysis in the design, operation and dismantlement of nuclear facilities. The authors present in particular: the DACC1 code, used for aerosol deposition in the sampling circuits of radiation monitors; the PIAF code, used for modeling complex ventilation systems; and the CLIMAT 6 code, used for optimization of air conditioning systems [fr]

  5. Solving satisfiability problems by the ground-state quantum computer

    International Nuclear Information System (INIS)

    Mao Wenjin

    2005-01-01

    A quantum algorithm is proposed to solve the satisfiability (SAT) problems by the ground-state quantum computer. The scale of the energy gap of the ground-state quantum computer is analyzed for the 3-bit exact cover problem. The time cost of this algorithm on the general SAT problems is discussed
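For the 3-bit exact cover problem the abstract analyzes, the classical baseline is an exhaustive search over assignments, exponential in the number of bits. A small illustrative sketch (the instance itself is invented, not from the paper):

```python
from itertools import product

def exact_cover_solutions(n_bits, clauses):
    """Return all assignments in which every clause, a triple of bit
    indices, has exactly one bit set to 1 (1-in-3 satisfiability)."""
    sols = []
    for bits in product((0, 1), repeat=n_bits):
        if all(sum(bits[i] for i in clause) == 1 for clause in clauses):
            sols.append(bits)
    return sols

# Hypothetical 3-bit instance with a single clause over bits 0, 1, 2.
print(exact_cover_solutions(3, [(0, 1, 2)]))
```

The ground-state quantum computer instead encodes these clause constraints in a problem Hamiltonian, so that satisfying assignments span the ground state; the algorithm's cost is then governed by the energy gap the paper analyzes.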

  6. Disposal facility in Olkiluoto, description of above ground facilities in tunnel transport alternative

    International Nuclear Information System (INIS)

    Kukkola, T.

    2006-11-01

    The above ground facilities of the disposal plant on the Olkiluoto site are described in this report as they will be when the operation of the disposal facility starts in the year 2020. The disposal plant is visualised on the Olkiluoto site. Parallel construction of the deposition tunnels and disposal of the spent fuel canisters constitute the principal design basis of the disposal plant. The annual production of disposal canisters for spent fuel amounts to about 40. Production of 100 disposal canisters has been used as the capacity basis. Fuel from the Olkiluoto plant and from the Loviisa plant will be encapsulated in the same production line. The disposal plant will require an area of about 15 to 20 hectares above ground level. The total building volume of the above ground facilities is about 75,000 m³. The purpose of the report is to provide the base for detailed design of the encapsulation plant and the repository spaces, as well as for coordination between the disposal plant and ONKALO. The dimensioning bases for the disposal plant are shown in the Tables at the end of the report. The report can also be used as a basis for comparison in deciding whether the fuel canisters are transported to the repository by a lift or by a vehicle along the access tunnel. (orig.)

  7. Disposal facility in Olkiluoto, description of above ground facilities in lift transport alternative

    International Nuclear Information System (INIS)

    Kukkola, T.

    2006-11-01

    The above ground facilities of the disposal plant on the Olkiluoto site are described in this report as they will be when the operation of the disposal facility starts in the year 2020. The disposal plant is visualised on the Olkiluoto site. Parallel construction of the deposition tunnels and disposal of the spent fuel canisters constitute the principal design basis of the disposal plant. The annual production of disposal canisters for spent fuel amounts to about 40. Production of 100 disposal canisters has been used as the capacity basis. Fuel from the Olkiluoto plant and from the Loviisa plant will be encapsulated in the same production line. The disposal plant will require an area of about 15 to 20 hectares above ground level. The total building volume of the above ground facilities is about 75,000 m³. The purpose of the report is to provide the base for detailed design of the encapsulation plant and the repository spaces, as well as for coordination between the disposal plant and ONKALO. The dimensioning bases for the disposal plant are shown in the Tables at the end of the report. The report can also be used as a basis for comparison in deciding whether the fuel canisters are transported to the repository by a lift or by a vehicle along the access tunnel. (orig.)

  8. Embracing Safe Ground Test Facility Operations and Maintenance

    Science.gov (United States)

    Dunn, Steven C.; Green, Donald R.

    2010-01-01

    Conducting integrated operations and maintenance in wind tunnel ground test facilities requires a balance of meeting due dates, efficient operation, responsiveness to the test customer, data quality, effective maintenance (relating to readiness and reliability), and personnel and facility safety. Safety is non-negotiable, so the balance must be an "and" with the other requirements and needs. Pressure to deliver services faster, at increasing levels of quality, in under-maintained facilities is typical. A challenge for management is to balance the "need for speed" with safety and quality. It is especially important to communicate this balance across the organization - workers, with a desire to perform, can be tempted to cut corners on defined processes to increase speed. Having a lean staff can extend the time required for pre-test preparations, and providing a safe work environment for facility personnel and good stewardship of expensive national capabilities can be put at risk by one well-intending person engaging in at-risk behavior. This paper documents a specific, though typical, operational environment and cites management and worker safety initiatives and tools used to provide a safe work environment. Results are presented and clearly show that the work environment is a relatively safe one, though still not good enough to prevent all injuries. So, the journey to a zero-injury work environment - both in measured reality and in the minds of each employee - continues. The intent of this paper is to provide a benchmark for others with similar operational environments and to stimulate additional sharing and discussion on having and keeping a safe work environment.

  9. Oak Ridge Leadership Computing Facility Position Paper

    Energy Technology Data Exchange (ETDEWEB)

    Oral, H Sarp [ORNL; Hill, Jason J [ORNL; Thach, Kevin G [ORNL; Podhorszki, Norbert [ORNL; Klasky, Scott A [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL

    2011-01-01

    This paper discusses the business, administration, reliability, and usability aspects of storage systems at the Oak Ridge Leadership Computing Facility (OLCF). The OLCF has developed key competencies in the architecting and administration of large-scale Lustre deployments as well as HPSS archival systems. Additionally, as these systems are architected, deployed, and expanded over time, reliability and availability factors are a primary driver. This paper focuses on the implementation of the Spider parallel Lustre file system as well as the implementation of the HPSS archive at the OLCF.

  10. Future aerospace ground test facility requirements for the Arnold Engineering Development Center

    Science.gov (United States)

    Kirchner, Mark E.; Baron, Judson R.; Bogdonoff, Seymour M.; Carter, Donald I.; Couch, Lana M.; Fanning, Arthur E.; Heiser, William H.; Koff, Bernard L.; Melnik, Robert E.; Mercer, Stephen C.

    1992-01-01

    Arnold Engineering Development Center (AEDC) was conceived at the close of World War II, when major new developments in flight technology were presaged by new aerodynamic and propulsion concepts. During the past 40 years, AEDC has played a significant part in the development of many aerospace systems. The original plans were extended through the years by some additional facilities, particularly in the area of propulsion testing. AEDC now has undertaken development of a master plan in an attempt to project requirements and to plan for ground test and computational facilities over the coming 20 to 30 years. This report was prepared in response to an AEDC request that the National Research Council (NRC) assemble a committee to prepare guidance for planning and modernizing AEDC facilities for the development and testing of future classes of aerospace systems as envisaged by the U.S. Air Force.

  11. Computer modeling of commercial refrigerated warehouse facilities

    International Nuclear Information System (INIS)

    Nicoulin, C.V.; Jacobs, P.C.; Tory, S.

    1997-01-01

    The use of computer models to simulate the energy performance of large commercial refrigeration systems typically found in food processing facilities is an area of engineering practice that has seen little development to date. Current techniques employed in predicting energy consumption by such systems have focused on temperature bin methods of analysis. Existing simulation tools such as DOE2 are designed to model commercial buildings and grocery store refrigeration systems. The HVAC and refrigeration system performance models in these simulation tools model equipment common to commercial buildings and groceries, and respond to energy-efficiency measures likely to be applied to these building types. The applicability of traditional building energy simulation tools to model refrigerated warehouse performance and analyze energy-saving options is limited. The paper will present the results of modeling work undertaken to evaluate energy savings resulting from incentives offered by a California utility to its Refrigerated Warehouse Program participants. The TRNSYS general-purpose transient simulation model was used to predict facility performance and estimate program savings. Custom TRNSYS components were developed to address modeling issues specific to refrigerated warehouse systems, including warehouse loading door infiltration calculations, an evaporator model, single-stage and multi-stage compressor models, evaporative condenser models, and defrost energy requirements. The main focus of the paper will be on the modeling approach. The results from the computer simulations, along with overall program impact evaluation results, will also be presented
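The loading-door infiltration term such custom components must supply can be illustrated with a simple enthalpy-difference estimate. This is not the paper's TRNSYS formulation; it is a minimal sketch under assumed air properties, with all names and numbers invented for illustration:

```python
RHO_AIR = 1.2  # kg/m^3, warm-side air density (assumed)

def infiltration_load_kw(flow_m3_s: float,
                         h_out_kj_kg: float,
                         h_in_kj_kg: float) -> float:
    """Sensible plus latent load carried by infiltrating air:
    Q = m_dot * (h_outside - h_inside), in kW for kJ/kg enthalpies."""
    m_dot = RHO_AIR * flow_m3_s  # mass flow of entering air, kg/s
    return m_dot * (h_out_kj_kg - h_in_kj_kg)

# Example: 0.5 m^3/s through an open door, 55 kJ/kg ambient air
# enthalpy versus 5 kJ/kg in a frozen-storage room (illustrative).
print(round(infiltration_load_kw(0.5, 55.0, 5.0), 1), "kW")
```

Using moist-air enthalpies rather than dry-bulb temperatures is what captures the latent (frost-forming) part of the door load, which bin methods based on temperature alone miss.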

  12. Ground Shock Resistant of Buried Nuclear Power Plant Facility

    International Nuclear Information System (INIS)

    Ornai, D.; Adar, A.; Gal, E.

    2014-01-01

    Nuclear Power Plant (NPP) facilities might be subjected to hostile attacks such as Earth Penetrating Weapons (EPW) carrying explosive charges. Explosions of these weapons near a buried NPP facility might cause collapse, breaching, spalling, deflection, shear, rigid body motion (depending upon the foundations), and in-structure shock. The occupants and the equipment in buried facilities are exposed to the in-structure motions, and if these are greater than their fragility values, occupants might be wounded or killed and the equipment might be damaged, unless protective measures are applied. Critical NPP equipment such as pumps is vital for normal safe operation, since constant water circulation is required between the nuclear reactor and the cooling system, including in the case of an immediate shutdown. This paper presents an analytical-semi-empirical formulation and analysis of the explosion of a penetrating weapon with a warhead of 100 kg of TNT (trinitrotoluene) that creates a ground shock effect on an underground NPP structure containing equipment, such as a typical pump. If the in-structure spectral shock is greater than the pump fragility values, then protective measures are required; otherwise a real danger to NPP safety might occur

  13. Ground breaking at Astrotech for a new facility

    Science.gov (United States)

    1999-01-01

    Dirt flies during a ground-breaking ceremony to kick off Astrotech Space Operations' construction of a new satellite preparation facility to support the Delta IV, Boeing's winning entrant in the Air Force Evolved Expendable Launch Vehicle (EELV) Program. Wielding shovels are (from left to right) Tom Alexico; Chet Lee, chairman, Astrotech Space Operations; Gen. Forrest McCartney, vice president, Launch Operations, Lockheed Martin; Richard Murphy, director, Delta Launch Operations, The Boeing Company; Keith Wendt; Toby Voltz; Loren Shriver, deputy director, Launch & Payload Processing, Kennedy Space Center; Truman Scarborough, Brevard County commissioner; U.S. Representative 15th Congressional District David Weldon; Ron Swank; and watching the action at right is George Baker, president, Astrotech Space Operations. Astrotech is located in Titusville, Fla. It is a wholly owned subsidiary of SPACEHAB, Inc., and has been awarded a 10-year contract to provide payload processing services for The Boeing Company. The facility will enable Astrotech to support the full range of satellite sizes planned for launch aboard Delta II, III and IV launch vehicles, as well as the Atlas V, Lockheed Martin's entrant in the EELV Program. The Atlas V will be used to launch satellites for government, including NASA, and commercial customers.

  14. An assessment of testing requirement impacts on nuclear thermal propulsion ground test facility design

    International Nuclear Information System (INIS)

    Shipers, L.R.; Ottinger, C.A.; Sanchez, L.C.

    1993-01-01

    Programs to develop solid core nuclear thermal propulsion (NTP) systems have been under way at the Department of Defense (DoD), the National Aeronautics and Space Administration (NASA), and the Department of Energy (DOE). These programs have recognized the need for a new ground test facility to support development of NTP systems. However, the different military and civilian applications have led to different ground test facility requirements. DOE, in its role as landlord and operator of the proposed research reactor test facilities, has initiated an effort to explore opportunities for a common ground test facility to meet both DoD and NASA needs. The baseline design and operating limits of the proposed DoD NTP ground test facility are described. The NASA ground test facility requirements are reviewed and their potential impact on the DoD facility baseline is discussed

  15. Joint ACE ground penetrating radar antenna test facility at the Technical University of Denmark

    DEFF Research Database (Denmark)

    Lenler-Eriksen, Hans-Rudolph; Meincke, Peter; Sarri, A.

    2005-01-01

    A ground penetrating radar (GPR) antenna test facility, established within the ACE network at the Technical University of Denmark (DTU), is described. Examples of results from the facility obtained from measurements of eight different GPR antennas are presented.

  16. Tissue Engineering of Cartilage on Ground-Based Facilities

    Science.gov (United States)

    Aleshcheva, Ganna; Bauer, Johann; Hemmersbach, Ruth; Egli, Marcel; Wehland, Markus; Grimm, Daniela

    2016-06-01

    Investigations under simulated microgravity offer the opportunity for a better understanding of the influence of altered gravity on cells and on scaffold-free three-dimensional (3D) tissue formation. To investigate the short-term influence, human chondrocytes were cultivated for 2 h, 4 h, 16 h, and 24 h on a 2D Fast-Rotating Clinostat (FRC) in DMEM/F-12 medium supplemented with 10 % FCS. We detected holes in the vimentin network and perinuclear accumulations of vimentin after 2 h, and changes in chondrocyte shape, visualised by F-actin staining, after 4 h of FRC exposure. Scaffold-free cultivation of chondrocytes for 7 d on the Random Positioning Machine (RPM), the FRC and the Rotating Wall Vessel (RWV) resulted in spheroid formation, a phenomenon already known from spaceflight experiments with chondrocytes (MIR Space Station) and thyroid cancer cells (SimBox/Shenzhou-8 space mission). The experiments enabled by the ESA-CORA-GBF programme gave us an optimal opportunity to study gravity-related cellular processes, validate ground-based facilities for our chosen cell system, and prepare long-term experiments under real microgravity conditions in space

  17. Computer Security at Nuclear Facilities (French Edition)

    International Nuclear Information System (INIS)

    2013-01-01

    This publication is in the Technical Guidance category of the IAEA Nuclear Security Series and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series was made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  18. Verifying a computational method for predicting extreme ground motion

    Science.gov (United States)

    Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, Brad T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

    2011-01-01

    In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.
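    A verification exercise of this kind ultimately compares time series produced by different codes for the same benchmark problem. One common form of such a comparison is a normalized RMS misfit; the sketch below is a generic illustration of that idea, not the actual metric used by the SCEC-USGS exercise, and the waveforms are made-up samples.

```python
import math

def nrms_misfit(a, b):
    """Normalized RMS difference between two equal-length time series:
    RMS of the pointwise difference divided by the RMS of the reference."""
    assert len(a) == len(b)
    num = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))
    den = math.sqrt(sum(y ** 2 for y in b) / len(b))
    return num / den

# Hypothetical ground-velocity samples from two codes on the same benchmark
code_a = [0.0, 0.50, 1.00, 0.50, 0.0]
code_b = [0.0, 0.48, 1.02, 0.51, 0.0]
print(nrms_misfit(code_a, code_b) < 0.05)  # small misfit: codes agree
```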

  19. Use of Ground Penetrating Radar at the FAA's National Airport Pavement Test Facility

    Science.gov (United States)

    Injun, Song

    2015-04-01

    The Federal Aviation Administration (FAA) in the United States has used a ground-coupled Ground Penetrating Radar (GPR) at the National Airport Pavement Test Facility (NAPTF) since 2005. One of the primary objectives of the testing at the facility is to provide full-scale pavement response and failure information for use in airplane landing gear design and configuration studies. During traffic testing at the facility, a GSSI GPR system was used to develop new procedures for monitoring Hot Mix Asphalt (HMA) pavement density changes that are directly related to pavement failure. After reviewing current setups for data acquisition software and procedures for identifying different pavement layers, dielectric constant and pavement thickness were selected as the dominant parameters controlling the HMA properties provided by GPR. A new methodology showing HMA density changes in terms of dielectric constant variations, called the dielectric sweep test, was developed and applied in full-scale pavement tests. The dielectric constant changes were successfully monitored with increasing airplane traffic numbers. The changes were compared to pavement performance data (permanent deformation). The measured dielectric constants, based on the known HMA thicknesses, were also compared with dielectric constants computed using an equation from ASTM D4748-98, Standard Test Method for Determining the Thickness of Bound Pavement Layers Using Short-Pulse Radar. Six-inch-diameter cylindrical cores were taken after construction and traffic testing to determine the bulk specific gravity of the HMA layer. The measured bulk specific gravity was also used to monitor HMA density changes caused by aircraft traffic conditions. Additionally, this presentation reviews applications of the FAA's ground-coupled GPR for embedded rebar identification in concrete pavement, sewer pipes in soil, and gage identification in 3D plots.
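    The thickness/dielectric relation underlying the ASTM D4748-style computation is straightforward: the radar pulse travels through the layer at c/sqrt(eps), so for a two-way travel time t and thickness h, h = c*t/(2*sqrt(eps)); with h known from a core, eps = (c*t/(2h))^2. A minimal sketch of both directions, with a hypothetical travel time:

```python
C = 0.3  # speed of light in free space, m/ns

def dielectric_from_thickness(h_m, t_ns):
    """Dielectric constant from known thickness h (m) and two-way time t (ns)."""
    return (C * t_ns / (2.0 * h_m)) ** 2

def thickness_from_dielectric(eps, t_ns):
    """Layer thickness (m) from dielectric constant and two-way time (ns)."""
    return C * t_ns / (2.0 * eps ** 0.5)

t = 2.5   # ns, hypothetical measured two-way travel time
h = 0.15  # m, HMA thickness known from a core (illustrative)
eps = dielectric_from_thickness(h, t)
print(round(eps, 2))                                # -> 6.25
print(round(thickness_from_dielectric(eps, t), 3))  # -> 0.15 (round trip)
```

Tracking eps over repeated passes at a fixed, cored location is what lets density changes show up as dielectric constant variations.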

  20. Computer Profile of School Facilities Energy Consumption.

    Science.gov (United States)

    Oswalt, Felix E.

    This document outlines a computerized management tool designed to enable building managers to identify energy consumption as related to types and uses of school facilities for the purpose of evaluating and managing the operation, maintenance, modification, and planning of new facilities. Specifically, it is expected that the statistics generated…

  1. High Performance Computing Facility Operational Assessment, CY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Bland, Arthur S Buddy [ORNL; Boudwin, Kathlyn J. [ORNL; Hack, James J [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL; Hudson, Douglas L [ORNL

    2012-02-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources. Of these, we report in this review the 300 that are consistent with the guidance provided. Scientific achievements by OLCF users cut across all scales, from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of Carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source and simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source and simulations on Jaguar to reveal the molecular structure of ligno-cellulosic material used in bioethanol production. This year, Jaguar has been used to perform billion-cell CFD calculations to develop shock wave compression turbomachinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbomachinery to learn what efficiencies the traditional steady-flow assumption is hiding from designers. Even a 1% improvement in turbine design can save the nation

  2. High Performance Computing Facility Operational Assessment 2015: Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Ashley D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bernholdt, David E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bland, Arthur S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Gary, Jeff D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Hack, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; McNally, Stephen T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Rogers, James H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Smith, Brian E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Straatsma, T. P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Sukumar, Sreenivas Rangan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Thach, Kevin G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Tichenor, Suzy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Vazhkudai, Sudharshan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Wells, Jack C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility

    2016-03-01

    Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others. The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example Titan enabled engineers in Los Angeles and the surrounding region to design and begin building improved critical infrastructure by enabling the highest-resolution Cybershake map for Southern

  3. High Performance Computing Facility Operational Assessment, FY 2010 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; White, Julia C [ORNL

    2010-08-01

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools

  4. Stormwater Pollution Prevention Plan TA-60 Roads and Grounds Facility and Associated Sigma Mesa Staging Area

    Energy Technology Data Exchange (ETDEWEB)

    Sandoval, Leonard Frank [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-01

    This Stormwater Pollution Prevention Plan (SWPPP) is applicable to operations at the Technical Area -60 (TA-60) Roads and Grounds Facility and Associated Sigma Mesa Staging Area off Eniwetok Drive, in Los Alamos County, New Mexico.

  5. Computer facilities for ISABELLE data handling

    International Nuclear Information System (INIS)

    Kramer, M.A.; Love, W.A.; Miller, R.J.; Zeller, M.

    1977-01-01

    The analysis of data produced by ISABELLE experiments will need a large system of computers. An official group of prospective users and operators of that system should begin planning now. Included in the array will be a substantial computer system at each ISABELLE intersection in use. These systems must include enough computing power to keep experimenters aware of the health of the experiment. This will require at least one very fast, sophisticated processor in the system, its size depending on the experiment. Other required features of the intersection systems are a good high-speed graphic display, the ability to record data on magnetic tape at 500 to 1000 kB/s, and a high-speed link to a central computer. The operating system software must support multiple interactive users. A substantially larger capacity computer system, shared by the six intersection region experiments, must be available with good turnaround for experimenters while ISABELLE is running. A computer support group will be required to maintain the computer system and to provide and maintain software common to all experiments. Special superfast computing hardware or special function processors constructed with microprocessor circuitry may be necessary both in the data gathering and data processing work. Thus both the local and central processors should be chosen with the possibility of interfacing such devices in mind

  6. Ground test facilities for evaluating nuclear thermal propulsion engines and fuel elements

    International Nuclear Information System (INIS)

    Allen, G.C.; Beck, D.F.; Harmon, C.D.; Shipers, L.R.

    1992-01-01

    Interagency panels evaluating nuclear thermal propulsion development options have consistently recognized the need for constructing a major new ground test facility to support fuel element and engine testing. This paper summarizes the requirements, configuration, and design issues of a proposed ground test complex for evaluating nuclear thermal propulsion engines and fuel elements being developed for the Space Nuclear Thermal Propulsion (SNTP) program. 2 refs

  7. Integration of small computers in the low budget facility

    International Nuclear Information System (INIS)

    Miller, G.E.; Crofoot, T.A.

    1988-01-01

    Inexpensive computers (PCs) are well within the reach of low-budget reactor facilities. Many uses can be envisaged that will both improve the capabilities of existing instrumentation and assist operators and staff with certain routine tasks. Both of these opportunities are important for survival at facilities with severe budget and staffing limitations. (author)

  8. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL

    2011-08-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaborating with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials, to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for the development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve the predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and

  9. Towards higher reliability of CMS computing facilities

    International Nuclear Information System (INIS)

    Bagliesi, G; Bloom, K; Brew, C; Flix, J; Kreuzer, P; Sciabà, A

    2012-01-01

    The CMS experiment has adopted a computing system where resources are distributed worldwide across more than 50 sites. The operation of the system requires stable and reliable behaviour of the underlying infrastructure. CMS has established procedures to extensively test all relevant aspects of a site and its capability to sustain the various CMS computing workflows at the required scale. The Site Readiness monitoring infrastructure has been instrumental in understanding how the system as a whole was improving towards LHC operations, measuring the reliability of sites when running CMS activities, and providing sites with the information they need to troubleshoot any problem. This contribution reviews the complete automation of the Site Readiness program, with a description of the monitoring tools and their inclusion in the Site Status Board (SSB), the performance checks, the use of tools like HammerCloud, and the impact on improving the overall reliability of the Grid from the point of view of the CMS computing system. These results are used by CMS to select good sites on which to conduct workflows, in order to maximize workflow efficiencies. The performance of the sites against these tests during the first years of LHC running is also reviewed.
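    The site-selection logic described above can be sketched as a threshold on a readiness metric. The metric used here (fraction of successful functional tests over a window) and the 80 % threshold are illustrative assumptions for the sketch, not the actual CMS Site Readiness criteria; the site names are hypothetical.

```python
def ready_sites(site_metrics, threshold=0.8):
    """Return the sites whose average test success rate meets the threshold."""
    return sorted(site for site, results in site_metrics.items()
                  if sum(results) / len(results) >= threshold)

# Hypothetical pass/fail results (1 = test passed) for three Tier-2 sites
metrics = {
    "T2_A": [1, 1, 1, 0, 1],  # 80% success
    "T2_B": [1, 0, 0, 1, 1],  # 60% success
    "T2_C": [1, 1, 1, 1, 1],  # 100% success
}
print(ready_sites(metrics))  # -> ['T2_A', 'T2_C']
```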

  10. National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Van Arsdall, P.J. LLNL

    1998-01-01

    The NIF design team is developing the Integrated Computer Control System (ICCS), which is based on an object-oriented software framework applicable to event-driven control systems. The framework provides an open, extensible architecture that is sufficiently abstract to construct future mission-critical control systems. The ICCS will become operational when the first 8 out of 192 beams are activated in mid 2000. The ICCS consists of 300 front-end processors attached to 60,000 control points coordinated by a supervisory system. Computers running either Solaris or VxWorks are networked over a hybrid configuration of switched fast Ethernet and asynchronous transfer mode (ATM). ATM carries digital motion video from sensors to operator consoles. Supervisory software is constructed by extending the reusable framework components for each specific application. The framework incorporates services for database persistence, system configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. More than twenty collaborating software applications are derived from the common framework. The framework is interoperable among different kinds of computers and functions as a plug-in software bus by leveraging a common object request brokering architecture (CORBA). CORBA transparently distributes the software objects across the network. Because of the pivotal role played, CORBA was tested to ensure adequate performance

  11. Computers in experimental nuclear power facilities

    International Nuclear Information System (INIS)

    Jukl, M.

    1982-01-01

    The CIS 3000 information system, used for monitoring the operating modes of large technological equipment, is described. The CIS system consists of two ADT computers, an external drum store, an analog input side, a bivalent input side, four control consoles with monitors and acoustic signalling, a print-out area with typewriters and punching machines, and linear recorders. Various applications of the installed CIS configuration are described, as is the general-purpose program for processing measured values into a protocol. The program operates in conversational mode. Different processing variants are shown on the display monitor. (M.D.)

  12. MLSOIL and DFSOIL - computer codes to estimate effective ground surface concentrations for dose computations

    International Nuclear Information System (INIS)

    Sjoreen, A.L.; Kocher, D.C.; Killough, G.G.; Miller, C.W.

    1984-11-01

    This report is a user's manual for MLSOIL (Multiple Layer SOIL model) and DFSOIL (Dose Factors for MLSOIL) and documentation of the computational methods used in those two computer codes. MLSOIL calculates an effective ground surface concentration to be used in computations of external doses. This effective ground surface concentration is equal to (the computed dose in air from the concentration in the soil layers)/(the dose factor for computing dose in air from a plane). MLSOIL implements a five-compartment linear-transfer model to calculate the concentrations of radionuclides in the soil following deposition on the ground surface from the atmosphere. The model considers leaching through the soil as well as radioactive decay and buildup. The element-specific transfer coefficients used in this model are a function of the K_d and environmental parameters. DFSOIL calculates the dose in air per unit concentration at 1 m above the ground from each of the five soil layers used in MLSOIL and the dose per unit concentration from an infinite plane source. MLSOIL and DFSOIL have been written to be part of the Computerized Radiological Risk Investigation System (CRRIS), which is designed for assessments of the health effects of airborne releases of radionuclides. 31 references, 3 figures, 4 tables
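    A five-compartment linear-transfer model of the kind MLSOIL implements can be sketched as a chain in which each layer gains activity leached from the layer above and loses activity to downward leaching and radioactive decay (the bottom layer leaches out of the modeled column). The rate constants below are illustrative placeholders, not MLSOIL's actual K_d-derived transfer coefficients.

```python
import math

def step(layers, leach, decay, dt):
    """One explicit Euler step of the compartment chain: inflow from the
    layer above, outflow to leaching and radioactive decay."""
    inflow = [0.0] + [leach * c for c in layers[:-1]]
    return [c + dt * (inflow[i] - (leach + decay) * c)
            for i, c in enumerate(layers)]

layers = [1.0, 0.0, 0.0, 0.0, 0.0]      # unit deposition in the top layer
leach, half_life, dt = 0.5, 30.0, 0.01  # 1/yr, yr, yr (all illustrative)
decay = math.log(2.0) / half_life

for _ in range(100):                    # advance one year
    layers = step(layers, leach, decay, dt)

print(sum(layers) < 1.0)  # activity lost to decay and deep leaching
print(layers[1] > 0.0)    # some activity has migrated to the second layer
```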

  13. Common ground, not a battle ground. Violence prevention at a detoxification facility.

    Science.gov (United States)

    Adamson, Mary A; Vincent, Audrey A; Cundiff, Jeff

    2009-08-01

    This article evaluates the results of a workplace violence prevention program implemented in a Colorado detoxification facility. The program interventions are modeled after federal Occupational Safety and Health Administration guidelines and use theories from both nursing and criminology for philosophy and direction. Serving as its own control, the detoxification facility shares data measured over a 4-year period, demonstrating a sharp decline in assault rates after program implementation. The importance of administrative controls, environmental adjustments, recordkeeping and evaluation, and education and training are emphasized as key components of success. Copyright (c) 2009, SLACK Incorporated.

  14. Brookhaven Reactor Experiment Control Facility, a distributed function computer network

    International Nuclear Information System (INIS)

    Dimmler, D.G.; Greenlaw, N.; Kelley, M.A.; Potter, D.W.; Rankowitz, S.; Stubblefield, F.W.

    1975-11-01

    A computer network for real-time data acquisition, monitoring and control of a series of experiments at the Brookhaven High Flux Beam Reactor has been developed and has been set into routine operation. This reactor experiment control facility presently services nine neutron spectrometers and one x-ray diffractometer. Several additional experiment connections are in progress. The architecture of the facility is based on a distributed function network concept. A statement of implementation and results is presented

  15. Survey of computer codes applicable to waste facility performance evaluations

    International Nuclear Information System (INIS)

    Alsharif, M.; Pung, D.L.; Rivera, A.L.; Dole, L.R.

    1988-01-01

    This study is an effort to review existing information that is useful to develop an integrated model for predicting the performance of a radioactive waste facility. A summary description of 162 computer codes is given. The identified computer programs address the performance of waste packages, waste transport and equilibrium geochemistry, hydrological processes in unsaturated and saturated zones, and general waste facility performance assessment. Some programs also deal with thermal analysis, structural analysis, and special purposes. A number of these computer programs are being used by the US Department of Energy, the US Nuclear Regulatory Commission, and their contractors to analyze various aspects of waste package performance. Fifty-five of these codes were identified as being potentially useful on the analysis of low-level radioactive waste facilities located above the water table. The code summaries include authors, identification data, model types, and pertinent references. 14 refs., 5 tabs

  16. COMPUTER ORIENTED FACILITIES OF TEACHING AND INFORMATIVE COMPETENCE

    Directory of Open Access Journals (Sweden)

    Olga M. Naumenko

    2010-09-01

    Full Text Available The article reviews the history of views on the tasks of education and on assessments of its effectiveness from the standpoint of forming basic, vitally important competences. Views on the problem in different countries and international organizations, along with the corresponding experience of the Ukrainian education system, are described. The need to form the informative competence of future teachers is substantiated, in the context of applying computer-oriented teaching facilities in the study of natural-science subjects at pedagogical colleges. Prognostic estimates concerning the development of methods for applying computer-oriented teaching facilities are presented.

  17. Neutronic computational modeling of the ASTRA critical facility using MCNPX

    International Nuclear Information System (INIS)

    Rodriguez, L. P.; Garcia, C. R.; Milian, D.; Milian, E. E.; Brayner, C.

    2015-01-01

    The Pebble Bed Very High Temperature Reactor is considered a prominent candidate among Generation IV nuclear energy systems. Nevertheless, it faces an important challenge due to the insufficient validation of the computer codes currently available for use in its design and safety analysis. In this paper, a detailed IAEA computational benchmark, announced in IAEA-TECDOC-1694 in the framework of the Coordinated Research Project 'Evaluation of High Temperature Gas Cooled Reactor (HTGR) Performance', was solved in support of the Generation IV computer-code validation effort using the MCNPX ver. 2.6e computational code. IAEA-TECDOC-1694 summarizes a set of four calculational benchmark problems performed at the ASTRA critical facility, including criticality experiments, control rod worth measurements and reactivity measurements. The ASTRA critical facility at the Kurchatov Institute in Moscow was used to simulate the neutronic behavior of nuclear pebble bed reactors. (Author)

  18. Centralized computer-based controls of the Nova Laser Facility

    International Nuclear Information System (INIS)

    Krammen, J.

    1985-01-01

    This article introduces the overall architecture of the computer-based Nova Laser Control System and describes its basic components. Use of standard hardware and software components ensures that the system, while specialized and distributed throughout the facility, is adaptable. 9 references, 6 figures

  19. Computer-Assisted School Facility Planning with ONPASS.

    Science.gov (United States)

    Urban Decision Systems, Inc., Los Angeles, CA.

    The analytical capabilities of ONPASS, an on-line computer-aided school facility planning system, are described by its developers. This report describes how, using the Canoga Park-Winnetka-Woodland Hills Planning Area as a test case, the Department of City Planning of the city of Los Angeles employed ONPASS to demonstrate how an on-line system can…

  20. A Grounded Theory Analysis of Introductory Computer Science Pedagogy

    Directory of Open Access Journals (Sweden)

    Jonathan Wellons

    2011-12-01

    Full Text Available Planning is a critical, early step on the path to successful program writing and a skill that is often lacking in novice programmers. As practitioners we are continually searching for or creating interventions to help our students, particularly those who struggle in the early stages of their computer science education. In this paper we report on our ongoing research of novice programming skills that utilizes the qualitative research method of grounded theory to develop theories and inform the construction of these interventions. We describe how grounded theory, a popular research method in the social sciences since the 1960s, can lend formality and structure to the common practice of simply asking students what they did and why they did it. Further, we aim to inform the reader not only about our emerging theories on interventions for planning but also how they might collect and analyze their own data in this and other areas that trouble novice programmers. In this way those who lecture and design CS1 interventions can do so from a more informed perspective.

  1. The ACE-DTU Planar Near-Field Ground Penetrating Radar Antenna Test Facility

    DEFF Research Database (Denmark)

    Lenler-Eriksen, Hans-Rudolph; Meincke, Peter

    2004-01-01

    The ACE-DTU planar near-field ground penetrating radar (GPR) antenna test facility is used to measure the plane-wave transmitting spectrum of a GPR loop antenna close to the air-soil interface by means of a probe buried in soil. Probe correction is implemented using knowledge about the complex...

  2. Concept of ground facilities and the analyses of the factors for cost estimation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Choi, H. J.; Choi, J. W.; Kim, S. K.; Cho, D. K

    2007-09-15

    Geologic disposal of the spent fuel generated by nuclear power plants is the only way to protect human beings and the surrounding environment, now and in the future. Direct disposal of spent fuel from the nuclear power plants is considered, and a Korean Reference HLW disposal System (KRS) suitable for the representative domestic geological conditions has been developed. In this study, the concept of the spent fuel encapsulation process, the key above-ground facility for deep geological disposal, was established. To do this, design requirements such as the functions and the spent fuel accumulations were reviewed, and the design principles and bases were established. Based on these requirements and bases, the encapsulation process was defined, from receiving spent fuel from the nuclear power plants to transferring canisters into the underground repository. A graphical simulation of the above-ground facility, based on the KRS design concept and spent nuclear fuel disposal scenarios, showed that the process is appropriate to the facility design concept, although further improvement through an actual demonstration test of the constructed facility is required. Finally, based on the concept of the above-ground facilities for the Korean Reference HLW disposal System, the factors for cost estimation were analyzed.

  3. Effluent Containment System for space thermal nuclear propulsion ground test facilities

    International Nuclear Information System (INIS)

    1995-08-01

    This report presents the research and development study work performed for the Space Reactor Power System Division of the U.S. Department of Energy on an innovative ECS that would be used during ground testing of a space nuclear thermal rocket engine. The effluent treatment and containment systems constitute a significant portion of the ground test facilities for a space nuclear thermal propulsion engine. The proposed ECS configuration recycles all engine coolant media and does not impact the environment by venting radioactive material. All coolant media, hydrogen and water, are collected, treated for removal of radioactive particulates, and recycled for use in subsequent tests until the end of the facility life. Radioactive materials removed by the treatment systems are recovered, stored for decay of short-lived isotopes, or packaged for disposal as waste. At the end of its useful life, the facility will be decontaminated and dismantled for disposal

  4. Computation and experiment results of the grounding model of Three Gorges Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Wen Xishan; Zhang Yuanfang; Yu Jianhui; Chen Cixuan [Wuhan University of Hydraulic and Electrical Engineering (China); Qin Liming; Xu Jun; Shu Lianfu [Yangtze River Water Resources Commission, Wuhan (China)

    1999-07-01

    A model for computing the grounding parameters of the grids of the Three Gorges Power Plant (TGPP) on the Yangtze River is presented in this paper. Using this model, computation and analysis of the grounding grids were carried out. The results show that the reinforcement grid of the dam is the main body for current dissipation and must be reliably welded to form a good grounding grid. The experimental results show that the computational method and program are correct. (UK)
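
    A first-order check on grid grounding parameters of this kind is often made with Sverak's simplified formula from IEEE Std 80; the sketch below is that generic approximation, not the paper's model, and the soil resistivity, grid area, conductor length and burial depth are illustrative assumptions.

```python
import math

def grid_resistance(rho, area, total_conductor_len, depth):
    """Sverak's simplified formula (IEEE Std 80) for the resistance
    of a buried grounding grid.
    rho  : soil resistivity (ohm-m)
    area : grid area (m^2)
    total_conductor_len : total buried conductor length (m)
    depth: burial depth (m)
    """
    return rho * (1.0 / total_conductor_len
                  + (1.0 / math.sqrt(20.0 * area))
                  * (1.0 + 1.0 / (1.0 + depth * math.sqrt(20.0 / area))))

# Illustrative example: 100 ohm-m soil, 70 m x 70 m grid,
# 1540 m of buried conductor, burial depth 0.5 m
print(round(grid_resistance(100.0, 4900.0, 1540.0, 0.5), 3))  # ohms
```

    A dam reinforcement grid welded into the grounding system would enter this estimate mainly through the total conductor length and effective area, which is consistent with the abstract's conclusion that it dominates current dissipation.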

  5. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  6. Modern computer hardware and the role of central computing facilities in particle physics

    International Nuclear Information System (INIS)

    Zacharov, V.

    1981-01-01

    Important recent changes in the hardware technology of computer system components are reviewed, and the impact of these changes assessed on the present and future pattern of computing in particle physics. The place of central computing facilities is particularly examined, to answer the important question as to what, if anything, should be their future role. Parallelism in computing system components is considered to be an important property that can be exploited with advantage. The paper includes a short discussion of the position of communications and network technology in modern computer systems. (orig.)

  7. COMPUTER ORIENTED FACILITIES OF TEACHING AND INFORMATIVE COMPETENCE

    OpenAIRE

    Olga M. Naumenko

    2010-01-01

    In the article it is considered the history of views to the tasks of education, estimations of its effectiveness from the point of view of forming of basic vitally important competences. Opinions to the problem in different countries, international organizations, corresponding experience of the Ukrainian system of education are described. The necessity of forming of informative competence of future teacher is reasonable in the conditions of application of the computer oriented facilities of t...

  8. Life-Cycle Assessments of Selected NASA Ground-Based Test Facilities

    Science.gov (United States)

    Sydnor, George Honeycutt

    2012-01-01

    In the past two years, two separate facility-specific life cycle assessments (LCAs) have been performed as summer student projects. The first project focused on 13 facilities managed by NASA's Aeronautics Test Program (ATP), an organization responsible for large, high-energy ground test facilities that accomplish the nation's most advanced aerospace research. A facility inventory was created for each facility, and the operational-phase carbon footprint and environmental impact were calculated. The largest impacts stemmed from electricity and natural gas used directly at the facility and to generate support processes such as compressed air and steam. However, in specialized facilities that use unique inputs like R-134a, R-14, jet fuels, or nitrogen gas, these sometimes had a considerable effect on the facility's overall environmental impact. The second LCA project was conducted on the NASA Ames Arc Jet Complex and also involved creating a facility inventory and calculating the carbon footprint and environmental impact. In addition, operational alternatives were analyzed for their effectiveness at reducing impact. Overall, the Arc Jet Complex impact is dominated by the natural-gas fired boiler producing steam on-site, but alternatives were provided that could reduce the impact of the boiler operation, some of which are already being implemented. The data and results provided by these LCA projects are beneficial to both the individual facilities and NASA as a whole; the results have already been used in a proposal to reduce carbon footprint at Ames Research Center. To help future life cycle projects, several lessons learned have been recommended as simple and effective infrastructure improvements to NASA, including better utility metering and data recording and standardization of modeling choices and methods. These studies also increased sensitivity to and appreciation for quantifying the impact of NASA's activities.

  9. Development and use of interactive displays in real-time ground support research facilities

    Science.gov (United States)

    Rhea, Donald C.; Hammons, Kvin R.; Malone, Jacqueline C.; Nesel, Michael C.

    1989-01-01

    The NASA Western Aeronautical Test Range (WATR) is one of the world's most advanced aeronautical research flight test support facilities. A variety of advanced and often unique real-time interactive displays has been developed for use in the mission control centers (MCC) to support research flight and ground testing. These displays consist of applications operating on systems described as real-time interactive graphics super workstations and real-time interactive PC/AT-compatible workstations. This paper reviews these two types of workstations and the specific applications operating on each display system. The applications provide examples that demonstrate overall system capability applicable for use in other ground-based real-time research/test facilities.

  10. Hanford facility dangerous waste permit application, low-level burial grounds

    International Nuclear Information System (INIS)

    Engelmann, R.H.

    1997-01-01

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, 'operating' treatment, storage, and/or disposal units, such as the Low-Level Burial Grounds (this document, DOE/RL-88-20)

  11. Prospects for studies of ground-state proton decays with the Holifield Radioactive Ion Beam Facility

    International Nuclear Information System (INIS)

    Toth, K.S.

    1994-01-01

    By using radioactive ions from the Holifield Radioactive Ion Beam Facility at Oak Ridge National Laboratory it should be possible to identify many new ground-state proton emitters in the mass region from Sn to Pb. During this production and search process the limits of stability on the proton-rich side of the nuclidic chart will be delineated for a significant fraction of medium-weight elements and our understanding of the proton-emission process will be expanded and improved

  12. Hanford environment as related to radioactive waste burial grounds and transuranium waste storage facilities

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D.J.; Isaacson, R.E.

    1977-06-01

    A detailed characterization of the existing environment at Hanford was provided by the U.S. Energy Research and Development Administration (ERDA) in the Final Environmental Statement, Waste Management Operations, Hanford Reservation, Richland, Washington, December 1975. Abbreviated discussions from that document are presented together with current data, as they pertain to radioactive waste burial grounds and interim transuranic (TRU) waste storage facilities. The discussions and data are presented in sections on geology, hydrology, ecology, and natural phenomena. (JRD)

  13. Hanford environment as related to radioactive waste burial grounds and transuranium waste storage facilities

    International Nuclear Information System (INIS)

    Brown, D.J.; Isaacson, R.E.

    1977-06-01

    A detailed characterization of the existing environment at Hanford was provided by the U.S. Energy Research and Development Administration (ERDA) in the Final Environmental Statement, Waste Management Operations, Hanford Reservation, Richland, Washington, December 1975. Abbreviated discussions from that document are presented together with current data, as they pertain to radioactive waste burial grounds and interim transuranic (TRU) waste storage facilities. The discussions and data are presented in sections on geology, hydrology, ecology, and natural phenomena

  14. Hanford facility dangerous waste permit application, low-level burial grounds

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, R.H.

    1997-08-12

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, 'operating' treatment, storage, and/or disposal units, such as the Low-Level Burial Grounds (this document, DOE/RL-88-20).

  15. Shieldings for X-ray radiotherapy facilities calculated by computer

    International Nuclear Information System (INIS)

    Pedrosa, Paulo S.; Farias, Marcos S.; Gavazza, Sergio

    2005-01-01

    This work presents a computer-aided methodology for calculating X-ray shielding in radiotherapy facilities. Even today, in Brazil, shielding calculations for X-ray radiotherapy are based on the NCRP-49 recommendation, which establishes the methodology required for elaborating a shielding project. With regard to high energies, where the construction of a labyrinth is necessary, NCRP-49 is not very clear; studies in this field resulted in an article that proposes a solution to the problem. A user-friendly program was developed in the Delphi programming language that, through manual entry of a basic architectural design and a few parameters, interprets the geometry and calculates the shielding of the walls, ceiling and floor of an X-ray radiotherapy facility. As its final product, the program displays a graphical screen with all the input data, the calculated shielding and the calculation memory. The program can be applied in the practical implementation of shielding projects for radiotherapy facilities and can be used didactically in comparison with NCRP-49.
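
    The core of an NCRP-49-style primary-barrier calculation can be sketched as follows; this is a generic illustration, not the authors' program, and the workload, distance, use and occupancy factors, dose limit and tenth-value layers below are assumed example values.

```python
import math

def barrier_thickness(P, d, W, U, T, tvl1, tvle):
    """Primary-barrier thickness from the broad-beam transmission
    factor B = P*d^2 / (W*U*T), expressed in tenth-value layers (TVLs).
    P : design dose limit at the occupied point (Gy/week)
    d : distance from source to occupied point (m)
    W : workload (Gy/week at 1 m)
    U : use factor, T : occupancy factor
    tvl1, tvle : first and equilibrium TVLs of the barrier material (m)
    """
    B = P * d**2 / (W * U * T)       # required transmission factor
    n = math.log10(1.0 / B)          # number of TVLs required
    return tvl1 + (n - 1.0) * tvle

# Assumed example: controlled area, P = 1e-4 Gy/wk, d = 5 m,
# W = 500 Gy/wk, U = 0.25, T = 1, concrete TVLs ~0.37 m / 0.33 m
t = barrier_thickness(1e-4, 5.0, 500.0, 0.25, 1.0, 0.37, 0.33)
print(round(t, 2))  # barrier thickness in metres
```

    A program like the one described would repeat this computation for each wall, the ceiling and the floor, using the geometry entered by the user.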

  16. Shielding Calculations for Positron Emission Tomography - Computed Tomography Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Baasandorj, Khashbayar [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Yang, Jeongseon [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-10-15

    Integrated PET-CT has been shown to be more accurate for lesion localization and characterization than PET or CT alone, or than PET and CT performed separately and interpreted side by side or after software-based fusion of the PET and CT datasets. At the same time, PET-CT scans can result in high patient and staff doses; therefore, careful site planning and shielding of this imaging modality have become challenging issues in the field. In Mongolia, the introduction of PET-CT facilities is currently being considered in many hospitals. Thus, additional regulatory legislation for nuclear and radiation applications is necessary, for example, in regulating licensing processes and ensuring radiation safety during operations. This paper aims to determine appropriate PET-CT shielding designs using numerical formulas and computer code. Since there are presently no PET-CT facilities in Mongolia, contact was made with radiological staff at the Nuclear Medicine Center of the National Cancer Center of Mongolia (NCCM) to obtain information about facilities where the introduction of PET-CT is being considered. Well-designed facilities do not require additional shielding, which should help cut down the overall costs of PET-CT installation. According to the results of this study, the barrier thicknesses of the NCCM building are not sufficient to keep radiation doses within the limits.

  17. Ground model and computer complex for designing underground explosions

    Energy Technology Data Exchange (ETDEWEB)

    Bashurov, V.V.; Vakhrameev, Yu.S.; Dem' yanovskii, S.V.; Ignatenko, V.V.; Simonova, T.V.

    1977-01-01

    A description is given of a ground model that accounts for large deformations and their irreversibility, loose rock, breakdown, resistance due to internal friction, and other factors. Calculations of the American Sulky explosion and of camouflet detonations of two spaced explosive charges are cited as examples illustrating the capability of the design methods and the suitability of the ground equations of state for describing underground detonations.

  18. An Experimental Facility to Validate Ground Source Heat Pump Optimisation Models for the Australian Climate

    Directory of Open Access Journals (Sweden)

    Yuanshen Lu

    2017-01-01

    Full Text Available Ground source heat pumps (GSHPs) are one of the most widespread forms of geothermal energy technology. They utilise the near-constant temperature of the ground below the frost line to achieve energy efficiencies two or three times that of conventional air-conditioners, consequently allowing a significant offset in electricity demand for space heating and cooling. Relatively mature GSHP markets are established in Europe and North America. GSHP implementation in Australia, however, is limited, due to high capital price, uncertainties regarding optimum designs for the Australian climate, and limited consumer confidence in the technology. Existing GSHP design standards developed in the Northern Hemisphere are likely to lead to suboptimal performance in Australia, where demand might be much more cooling-dominated. There is an urgent need to develop Australia's own GSHP system optimisation principles on top of the industry standards, to provide the confidence needed to bring the GSHP market out of its infancy. To assist in this, the Queensland Geothermal Energy Centre of Excellence (QGECE) has commissioned a fully instrumented GSHP experimental facility in Gatton, Australia, as a publicly accessible demonstration of the technology and a platform for systematic studies of GSHPs, including optimisation of design and operations. This paper presents a brief review of current GSHP use in Australia, the technical details of the Gatton GSHP facility, and an analysis of the observed cooling performance of this facility to date.

  19. Ground-glass opacity: High-resolution computed tomography and 64-multi-slice computed tomography findings comparison

    International Nuclear Information System (INIS)

    Sergiacomi, Gianluigi; Ciccio, Carmelo; Boi, Luca; Velari, Luca; Crusco, Sonia; Orlacchio, Antonio; Simonetti, Giovanni

    2010-01-01

    Objective: To comparatively evaluate ground-glass opacity using the conventional high-resolution computed tomography technique and volumetric computed tomography on a 64-row multi-slice scanner, verifying the advantages of the volumetric acquisition and post-processing techniques allowed by the 64-row CT scanner. Methods: Thirty-four patients, in whom a ground-glass opacity pattern had been assessed by previous high-resolution computed tomography during clinical-radiological follow-up of their lung disease, were studied by means of 64-row multi-slice computed tomography. Image quality from both CT modalities was evaluated comparatively. Results: Good inter-observer agreement (k value 0.78-0.90) was reported in the detection of ground-glass opacity with the high-resolution computed tomography technique and the volumetric computed tomography acquisition, with a moderate increase in intra-observer agreement (k value 0.46) using volumetric rather than high-resolution computed tomography. Conclusions: In our experience, volumetric computed tomography with a 64-row scanner shows good accuracy in the detection of ground-glass opacity, providing better spatial and temporal resolution and more advanced post-processing techniques than high-resolution computed tomography.
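
    The observer agreement quoted here is the kind of statistic computed as Cohen's kappa (the "k value"); a minimal sketch on hypothetical reader scores, not data from the study:

```python
from collections import Counter

def cohens_kappa(reader_a, reader_b):
    """Cohen's kappa for two readers' categorical ratings:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(reader_a) == len(reader_b)
    n = len(reader_a)
    p_obs = sum(a == b for a, b in zip(reader_a, reader_b)) / n
    ca, cb = Counter(reader_a), Counter(reader_b)
    p_exp = sum(ca[k] * cb[k] for k in ca) / n**2
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical: two readers scoring 10 scans for ground-glass
# opacity (1 = present, 0 = absent)
a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
b = [1, 1, 0, 1, 0, 1, 1, 1, 0, 0]
print(round(cohens_kappa(a, b), 2))  # → 0.58
```

    Values near 0.78-0.90, as reported for inter-observer detection, indicate substantial to almost-perfect agreement on the usual Landis-Koch scale.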

  20. The Argonne Leadership Computing Facility 2010 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Drugan, C. (LCF)

    2011-05-09

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers

  1. Preclosure radiological safety assessment for the ground support system in the exploratory studies facility

    International Nuclear Information System (INIS)

    Smith, A.J.; Tsai, F.C.

    1995-01-01

    An initial probabilistic safety assessment was performed for the exploratory studies facility underground opening to determine whether the ground support system should be classified as an item important to safety. The initiating event was taken to be a rock fall in an operational facility impacting a loaded waste transporter. Rock fall probability rates were estimated from data reported by commercial mining operations. This information was retrieved from the data base compiled by the Mining Safety and Health Administration from the mandatory reporting of incidents. The statistical distribution of the rock fall magnitude was estimated from the horizontal and vertical spacing fractures measured at the Yucca Mountain repository horizon. Simple models were developed to estimate container deformation and radionuclide releases arising from the projected distribution of impacts. Accepted techniques were used to calculate atmospheric dispersion and obtain the committed dose to individuals

  2. Modeling ground-based timber harvesting systems using computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2001-01-01

    Modeling ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...

  3. The Advance of Computing from the Ground to the Cloud

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  4. Correlation of horizontal and vertical components of strong ground motion for response-history analysis of safety-related nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Yin-Nan, E-mail: ynhuang@ntu.edu.tw [Dept. of Civil Engineering, National Taiwan University, No. 1, Sec. 4, Roosevelt Rd., Taipei 10617, Taiwan (China); Yen, Wen-Yi, E-mail: b01501059@ntu.edu.tw [Dept. of Civil Engineering, National Taiwan University, No. 1, Sec. 4, Roosevelt Rd., Taipei 10617, Taiwan (China); Whittaker, Andrew S., E-mail: awhittak@buffalo.edu [Dept. of Civil, Structural and Environmental Engineering, MCEER, State University of New York at Buffalo, Buffalo, NY 14260 (United States)

    2016-12-15

    Highlights: • The correlation of components of ground motion is studied using 1689 sets of records. • The data support an upper bound of 0.3 on the correlation coefficient. • The data support the related requirement in the upcoming edition of ASCE Standard 4. - Abstract: Design standards for safety-related nuclear facilities such as ASCE Standard 4-98 and ASCE Standard 43-05 require the correlation coefficient for two orthogonal components of ground motions for response-history analysis to be less than 0.3. The technical basis of this requirement was developed by Hadjian three decades ago using 50 pairs of recorded ground motions that were available at that time. In this study, correlation coefficients for (1) two horizontal components, and (2) the vertical component and one horizontal component, of a set of ground motions are computed using records from a ground-motion database compiled recently for large-magnitude shallow crustal earthquakes. The impact of the orientation of the orthogonal horizontal components on the correlation coefficient of ground motions is discussed. The rules in the forthcoming edition of ASCE Standard 4 for the correlation of components in a set of ground motions are shown to be reasonable.
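
    The quantity constrained by the 0.3 criterion is the zero-lag Pearson correlation coefficient of two acceleration time histories; a minimal sketch, using synthetic records as stand-ins for real ground motions:

```python
import numpy as np

def correlation_coefficient(acc_x, acc_y):
    """Zero-lag Pearson correlation coefficient between two
    acceleration time histories."""
    x = np.asarray(acc_x) - np.mean(acc_x)
    y = np.asarray(acc_y) - np.mean(acc_y)
    return float(np.sum(x * y) / np.sqrt(np.sum(x**2) * np.sum(y**2)))

# Synthetic example (assumption, not recorded data): two components
# with mild common content
rng = np.random.default_rng(1)
h1 = rng.standard_normal(4000)
h2 = 0.2 * h1 + rng.standard_normal(4000)
rho = correlation_coefficient(h1, h2)
print(abs(rho) < 0.3)  # whether the pair meets the 0.3 criterion
```

    For design use, the coefficient would be computed for each pair of components in a record set, and the paper notes it also depends on the orientation chosen for the horizontal axes.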

  5. Heat dissipation computations of a HVDC ground electrode using a supercomputer

    International Nuclear Information System (INIS)

    Greiss, H.; Mukhedkar, D.; Lagace, P.J.

    1990-01-01

    This paper reports on the temperature of the soil surrounding a High Voltage Direct Current (HVDC) toroidal ground electrode of practical dimensions, in both homogeneous and non-homogeneous soils, computed at incremental points in time using finite-difference methods on a supercomputer. Response curves were computed and plotted at several locations within the soil in the vicinity of the ground electrode for various values of the soil parameters
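
    A minimal explicit finite-difference sketch of this kind of heat-dissipation computation, reduced to 1-D diffusion with a fixed electrode-surface temperature; the soil diffusivity, grid spacing, time step and temperatures are illustrative assumptions, not values from the paper:

```python
import numpy as np

def diffuse(T0, alpha, dx, dt, steps):
    """Explicit finite-difference solution of the 1-D heat equation
    dT/dt = alpha * d2T/dx2 with fixed-temperature boundaries."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable for r > 0.5"
    T = T0.copy()
    for _ in range(steps):
        # interior update; boundary nodes stay fixed
        T[1:-1] = T[1:-1] + r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    return T

# Soil column with 1 m node spacing; the electrode surface (left
# boundary) is held hot, the far field stays at ambient temperature
T = np.full(51, 15.0)          # ambient soil temperature, deg C
T[0] = 80.0                    # electrode surface temperature
out = diffuse(T, alpha=1e-6, dx=1.0, dt=3600.0, steps=24 * 365)
print(round(float(out[0]), 1), out[1] > out[10] > out[-1])
```

    The full computation would instead use the toroidal geometry and layered (non-homogeneous) soil properties, but the time-stepping structure is the same.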

  6. High resolution muon computed tomography at neutrino beam facilities

    International Nuclear Information System (INIS)

    Suerfu, B.; Tully, C.G.

    2016-01-01

X-ray computed tomography (CT) has an indispensable role in constructing 3D images of objects made from light materials. However, limited by absorption coefficients, X-rays cannot deeply penetrate materials such as copper and lead. Here we show via simulation that muon beams can provide high resolution tomographic images of dense objects and of structures within the interior of dense objects. The effects of resolution broadening from multiple scattering diminish with increasing muon momentum. As the muon momentum increases, however, the contrast of the image decreases, so higher resolution in the muon spectrometer is required to resolve the image. The variance of the measured muon momentum reaches a minimum and then increases with increasing muon momentum. The impact of the increase in variance is to require a higher integrated muon flux to reduce fluctuations. The flux requirements and level of contrast needed for high resolution muon computed tomography are well matched to the muons produced in the pion decay pipe at a neutrino beam facility and to what can be achieved for momentum resolution in a muon spectrometer. Such an imaging system can be applied in archaeology, art history, engineering, and material identification, and whenever there is a need to image inside a transportable object constructed of dense materials

  7. Study of the unknown hemisphere of Mercury by ground-based astronomical facilities

    Science.gov (United States)

    Ksanfomality, L. V.

    2011-08-01

The short exposure method has proved very productive in ground-based observations of Mercury. Telescopic observations with short exposures, together with computer codes for processing data arrays of many thousands of original electronic photos, make it possible to improve the resolution of images from ground-based instruments to almost the diffraction limit. The resulting composite images are comparable with images from spacecraft approaching from a distance of about 1 million km. This paper presents images of the hemisphere of Mercury in longitude sectors 90°-180°W, 215°-350°W, and 50°-90°W, including, among others, areas not covered by spacecraft cameras. For the first time a giant S basin was discovered in the sector of longitudes 250°-290°W, which is the largest formation of this type on the terrestrial planets. Mercury exhibits strong phase effects; as a result, the view of the surface changes completely with the change in planetary phase. But the choice of phase in studies using spacecraft is limited by the orbital characteristics of the mission. Thus, ground-based observations of the planet provide valuable support.
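The selection-and-stacking step of the short-exposure technique can be sketched in a few lines. The frames below are synthetic (a toy scene, a crude box-blur "seeing" model, and an assumed gradient-energy sharpness metric); real processing involves thousands of electronic photos and careful registration before stacking.

```python
import numpy as np

rng = np.random.default_rng(1)

def box_blur(img, w):
    """Crude seeing model: separable box blur of width w."""
    if w <= 1:
        return img
    kernel = np.ones(w) / w
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, "same"), 0, out)

def sharpness(frame):
    """Gradient-energy metric used to rank the short exposures."""
    gy, gx = np.gradient(frame)
    return float((gx**2 + gy**2).mean())

# Synthetic "truth" scene and 200 short exposures with random seeing + noise.
truth = np.zeros((64, 64))
truth[28:36, 28:36] = 1.0
frames = []
for _ in range(200):
    w = int(rng.integers(1, 8))          # random seeing width for this frame
    f = box_blur(truth, w) + 0.02 * rng.standard_normal(truth.shape)
    frames.append((sharpness(f), w, f))

# Keep the sharpest 10% of exposures and average them into a composite.
frames.sort(key=lambda t: t[0], reverse=True)
composite = np.mean([f for _, _, f in frames[:20]], axis=0)
selected_w = float(np.mean([w for _, w, _ in frames[:20]]))
all_w = float(np.mean([w for _, w, _ in frames]))
print(f"mean seeing width: selected {selected_w:.2f} vs all {all_w:.2f}")
```

The point of the demonstration is that ranking by sharpness preferentially keeps the frames taken in moments of good seeing, which is what pushes the composite toward the diffraction limit.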

  8. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  9. Computer Security at Nuclear Facilities. Reference Manual (Arabic Edition)

    International Nuclear Information System (INIS)

    2011-01-01

    category of the IAEA Nuclear Security Series, and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series has been made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  10. Computer Security at Nuclear Facilities. Reference Manual (Russian Edition)

    International Nuclear Information System (INIS)

    2012-01-01

    category of the IAEA Nuclear Security Series, and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series has been made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  11. Computer Security at Nuclear Facilities. Reference Manual (Chinese Edition)

    International Nuclear Information System (INIS)

    2012-01-01

    category of the IAEA Nuclear Security Series, and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series has been made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  12. RCRA [Resource Conservation and Recovery Act] ground-water monitoring projects for Hanford facilities: Annual progress report for 1988

    International Nuclear Information System (INIS)

    Fruland, R.M.; Lundgren, R.E.

    1989-04-01

    This report describes the progress during 1988 of 14 Hanford Site ground-water monitoring projects covering 16 hazardous waste facilities and 1 nonhazardous waste facility (the Solid Waste Landfill). Each of the projects is being conducted according to federal regulations based on the Resource Conservation and Recovery Act (RCRA) of 1976 and the State of Washington Administrative Code. 21 refs., 23 figs., 8 tabs

  13. Modular Extended-Stay HyperGravity Facility Design Concept: An Artificial-Gravity Space-Settlement Ground Analogue

    Science.gov (United States)

    Dorais, Gregory A.

    2015-01-01

This document defines the design concept for a ground-based, extended-stay hypergravity facility as a precursor for space-based artificial-gravity facilities that extend the permanent presence of both human and non-human life beyond Earth in artificial-gravity settlements. Since the Earth's current human population is stressing the environment and the resources off-Earth are relatively unlimited, by as soon as 2040 more than one thousand people could be living in Earth-orbiting artificial-gravity habitats. Eventually, the majority of humanity may live in artificial-gravity habitats throughout this solar system as well as others, but little is known about the long-term (multi-generational) effects of artificial-gravity habitats on people, animals, and plants. In order to extend life permanently beyond Earth, it would be useful to create an orbiting space facility that generates 1g as well as other gravity levels to rigorously address the numerous challenges of such an endeavor. Before doing so, developing a ground-based artificial-gravity facility is a reasonable next step. Just as the International Space Station is a microgravity research facility, at a small fraction of the cost and risk a ground-based artificial-gravity facility can begin to address a wide variety of the artificial-gravity life-science questions and engineering challenges requiring long-term research to enable people, animals, and plants to live off-Earth indefinitely.

  14. Approach to developing a ground-motion design basis for facilities important to safety at Yucca Mountain

    International Nuclear Information System (INIS)

    King, J.L.

    1990-01-01

This paper discusses a methodology for developing a ground-motion design basis for prospective facilities at Yucca Mountain that are important to safety. The methodology utilizes a quasi-deterministic construct called the 10,000-year cumulative-slip earthquake that is designed to provide a conservative, robust, and reproducible estimate of ground motion that has a one-in-ten chance of occurring during the preclosure period. This estimate is intended to define a ground-motion level for which the seismic design would ensure minimal disruption to operations; engineering analyses to ensure safe performance are included.

  15. Academic Computing Facilities and Services in Higher Education--A Survey.

    Science.gov (United States)

    Warlick, Charles H.

    1986-01-01

    Presents statistics about academic computing facilities based on data collected over the past six years from 1,753 institutions in the United States, Canada, Mexico, and Puerto Rico for the "Directory of Computing Facilities in Higher Education." Organizational, functional, and financial characteristics are examined as well as types of…

  16. Public Computer Assisted Learning Facilities for Children with Visual Impairment: Universal Design for Inclusive Learning

    Science.gov (United States)

    Siu, Kin Wai Michael; Lam, Mei Seung

    2012-01-01

    Although computer assisted learning (CAL) is becoming increasingly popular, people with visual impairment face greater difficulty in accessing computer-assisted learning facilities. This is primarily because most of the current CAL facilities are not visually impaired friendly. People with visual impairment also do not normally have access to…

  17. An analytical model for computation of reliability of waste management facilities with intermediate storages

    International Nuclear Information System (INIS)

    Kallweit, A.; Schumacher, F.

    1977-01-01

High reliability is required of waste management facilities within the fuel cycle of nuclear power stations; this requirement can be met by providing intermediate storage facilities and reserve capacities. In this report a model based on the theory of Markov processes is described which allows computation of reliability characteristics of waste management facilities containing intermediate storage facilities. The application of the model is demonstrated by an example. (orig.) [de]
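The Markov-process approach can be illustrated with a toy model (not the report's actual model; all rates and the buffer size are assumptions): a treatment plant alternates between "up" and "down" states, and an intermediate store with capacity for two batches buffers incoming waste while the plant is down. The steady-state distribution of the resulting continuous-time Markov chain gives the probability that waste acceptance is blocked.

```python
import numpy as np

lam, mu = 0.02, 0.5   # per-day failure and repair rates (assumed)
fill = 0.25           # store fill rate while the plant is down (assumed)
drain = 1.0           # store drain rate while the plant is up (assumed)

# State index: 0:(up,0) 1:(up,1) 2:(up,2) 3:(down,0) 4:(down,1) 5:(down,2)
Q = np.zeros((6, 6))
for s in range(3):            # plant failure / repair transitions
    Q[s, 3 + s] = lam
    Q[3 + s, s] = mu
Q[1, 0] = Q[2, 1] = drain     # store drains while the plant is up
Q[3, 4] = Q[4, 5] = fill      # store fills while the plant is down
np.fill_diagonal(Q, -Q.sum(axis=1))

# Steady state: pi Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(6)])
b = np.zeros(7)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

blocked = pi[5]               # plant down AND store full: stream blocked
print(f"probability waste acceptance is blocked: {blocked:.2e}")
```

Without the store, the blocked fraction would simply be the plant's unavailability lam/(lam + mu); the buffer states reduce it, which is the effect the report quantifies.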

  18. Development of a low background test facility for the SPICA-SAFARI on-ground calibration

    Science.gov (United States)

    Dieleman, P.; Laauwen, W. M.; Ferrari, L.; Ferlet, M.; Vandenbussche, B.; Meinsma, L.; Huisman, R.

    2012-09-01

SAFARI is a far-infrared camera to be launched in 2021 onboard the SPICA satellite. SAFARI offers imaging spectroscopy and imaging photometry in the wavelength range of 34 to 210 μm with a detector NEP of 2 × 10⁻¹⁹ W/√Hz. A cryogenic test facility for SAFARI on-ground calibration and characterization is being developed. The main design driver is the required low background of a few attowatts per pixel. This prohibits optical access to room temperature, and hence all test equipment needs to be inside the cryostat at 4.5 K. The instrument parameters to be verified are interfaces with the SPICA satellite, sensitivity, alignment, image quality, spectral response, frequency calibration, and point spread function. The instrument sensitivity is calibrated by a calibration source providing a spatially homogeneous signal at the attowatt level. This low light intensity is achieved by geometrical dilution of a 150 K source into an integrating sphere. The beam quality and point spread function are measured by a pinhole/mask plate wheel, back-illuminated by a second integrating sphere. This sphere is fed by a stable wide-band source, providing spectral lines via a cryogenic etalon.

  19. Computer Security Incident Response Planning at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    The purpose of this publication is to assist Member States in developing comprehensive contingency plans for computer security incidents with the potential to impact nuclear security and/or nuclear safety. It provides an outline and recommendations for establishing a computer security incident response capability as part of a computer security programme, and considers the roles and responsibilities of the system owner, operator, competent authority, and national technical authority in responding to a computer security incident with possible nuclear security repercussions

  20. Coping with the Stigma of Mental Illness: Empirically-Grounded Hypotheses from Computer Simulations

    Science.gov (United States)

    Kroska, Amy; Har, Sarah K.

    2011-01-01

    This research demonstrates how affect control theory and its computer program, "Interact", can be used to develop empirically-grounded hypotheses regarding the connection between cultural labels and behaviors. Our demonstration focuses on propositions in the modified labeling theory of mental illness. According to the MLT, negative societal…

  1. Phenomenography and Grounded Theory as Research Methods in Computing Education Research Field

    Science.gov (United States)

    Kinnunen, Paivi; Simon, Beth

    2012-01-01

This paper discusses two qualitative research methods, phenomenography and grounded theory. We introduce both methods' data collection and analysis processes and the types of results you may obtain, using examples from computing education research. We highlight some of the similarities and differences between the aim, data collection and…

  2. National facility for advanced computational science: A sustainable path to scientific discovery

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  3. Recovery Act: Finite Volume Based Computer Program for Ground Source Heat Pump Systems

    Energy Technology Data Exchange (ETDEWEB)

    James A Menart, Professor

    2013-02-22

This report is a compilation of the work that has been done on the grant DE-EE0002805 entitled Finite Volume Based Computer Program for Ground Source Heat Pump Systems. The goal of this project was to develop a detailed computer simulation tool for GSHP (ground source heat pump) heating and cooling systems. Two such tools were developed as part of this DOE (Department of Energy) grant; the first is a two-dimensional computer program called GEO2D and the second is a three-dimensional computer program called GEO3D. Both of these simulation tools provide an extensive array of results to the user. A unique aspect of both these simulation tools is the complete temperature profile information calculated and presented. Complete temperature profiles throughout the ground, casing, tube wall, and fluid are provided as a function of time. The fluid temperatures from and to the heat pump, as a function of time, are also provided. In addition to temperature information, detailed heat rate information at several locations as a function of time is determined. Heat rates between the heat pump and the building indoor environment, between the working fluid and the heat pump, and between the working fluid and the ground are computed. The heat rates between the ground and the working fluid are calculated as a function of time and position along the ground loop. The heating and cooling loads of the building being fitted with a GSHP are determined with the computer program developed by DOE called ENERGYPLUS. Lastly, COP (coefficient of performance) results as a function of time are provided. Both the two-dimensional and three-dimensional computer programs developed as part of this work are based upon a detailed finite volume solution of the energy equation for the ground and ground loop. Real heat pump characteristics are entered into the program and used to model the heat pump performance. Thus these computer tools simulate the coupled performance of the ground loop and the heat pump.

  4. Finite Volume Based Computer Program for Ground Source Heat Pump System

    Energy Technology Data Exchange (ETDEWEB)

    Menart, James A. [Wright State University

    2013-02-22

This report is a compilation of the work that has been done on the grant DE-EE0002805 entitled "Finite Volume Based Computer Program for Ground Source Heat Pump Systems." The goal of this project was to develop a detailed computer simulation tool for GSHP (ground source heat pump) heating and cooling systems. Two such tools were developed as part of this DOE (Department of Energy) grant; the first is a two-dimensional computer program called GEO2D and the second is a three-dimensional computer program called GEO3D. Both of these simulation tools provide an extensive array of results to the user. A unique aspect of both these simulation tools is the complete temperature profile information calculated and presented. Complete temperature profiles throughout the ground, casing, tube wall, and fluid are provided as a function of time. The fluid temperatures from and to the heat pump, as a function of time, are also provided. In addition to temperature information, detailed heat rate information at several locations as a function of time is determined. Heat rates between the heat pump and the building indoor environment, between the working fluid and the heat pump, and between the working fluid and the ground are computed. The heat rates between the ground and the working fluid are calculated as a function of time and position along the ground loop. The heating and cooling loads of the building being fitted with a GSHP are determined with the computer program developed by DOE called ENERGYPLUS. Lastly, COP (coefficient of performance) results as a function of time are provided. Both the two-dimensional and three-dimensional computer programs developed as part of this work are based upon a detailed finite volume solution of the energy equation for the ground and ground loop. Real heat pump characteristics are entered into the program and used to model the heat pump performance. Thus these computer tools simulate the coupled performance of the ground loop and the heat pump.
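A back-of-the-envelope sketch in the spirit of the tools described above (this is NOT GEO2D/GEO3D, which solve the full finite-volume energy equation): a single vertical ground loop modeled with the classical infinite-line-source solution, giving the loop fluid temperature and an assumed linear heat-pump COP characteristic over a heating season. Every parameter value below is an assumption for illustration.

```python
from math import pi, log

k = 2.0          # ground thermal conductivity, W/(m K)
alpha = 1.0e-6   # ground thermal diffusivity, m^2/s
Tg = 12.0        # undisturbed ground temperature, deg C
q = -30.0        # heat per metre of borehole, W/m (negative = extraction)
rb = 0.06        # borehole radius, m
Rb = 0.10        # borehole thermal resistance, (m K)/W

def E1(x, terms=60):
    """Exponential integral E1(x) by power series (small-to-moderate x)."""
    s = -0.5772156649015329 - log(x)
    term = 1.0
    for n in range(1, terms):
        term *= -x / n
        s -= term / n
    return s

def fluid_temp(t):
    """Mean loop fluid temperature after extracting heat for time t (s)."""
    Tb = Tg + q / (4 * pi * k) * E1(rb**2 / (4 * alpha * t))
    return Tb + q * Rb

for days in (1, 30, 180):
    Tf = fluid_temp(days * 86400.0)
    cop = 3.5 + 0.1 * Tf   # assumed linear heat-pump characteristic
    print(f"day {days:3d}: fluid {Tf:5.1f} C, heating COP ~ {cop:.2f}")
```

The qualitative behavior matches what the report's tools resolve in detail: the ground around the loop cools over the season, the fluid temperature falls, and the COP degrades with it.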

  5. On-line satellite/central computer facility of the Multiparticle Argo Spectrometer System

    International Nuclear Information System (INIS)

    Anderson, E.W.; Fisher, G.P.; Hien, N.C.; Larson, G.P.; Thorndike, A.M.; Turkot, F.; von Lindern, L.; Clifford, T.S.; Ficenec, J.R.; Trower, W.P.

    1974-09-01

An on-line satellite/central computer facility has been developed at Brookhaven National Laboratory as part of the Multiparticle Argo Spectrometer System (MASS). This facility, consisting of a PDP-9 and a CDC-6600, has been successfully used in the study of proton-proton interactions at 28.5 GeV/c. (U.S.)

  6. Computer applications for the Fast Flux Test Facility

    International Nuclear Information System (INIS)

    Worth, G.A.; Patterson, J.R.

    1976-01-01

    Computer applications for the FFTF reactor include plant surveillance functions and fuel handling and examination control functions. Plant surveillance systems provide the reactor operator with a selection of over forty continuously updated, formatted displays of correlated data. All data are checked for limits and validity and the operator is advised of any anomaly. Data are also recorded on magnetic tape for historical purposes. The system also provides calculated variables, such as reactor thermal power and anomalous reactivity. Supplementing the basic plant surveillance computer system is a minicomputer system that monitors the reactor cover gas to detect and characterize absorber or fuel pin failures. In addition to plant surveillance functions, computers are used in the FFTF for controlling selected refueling equipment and for post-irradiation fuel pin examination. Four fuel handling or examination systems operate under computer control with manual monitoring and over-ride capability
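The limit-and-validity checking described above can be sketched in a few lines. This is purely illustrative: the FFTF surveillance system ran on 1970s plant computers, and the channel names and limits below are invented.

```python
# Assumed channel limits (lo, hi); invented for illustration only.
LIMITS = {
    "coolant_outlet_temp_C": (200.0, 550.0),
    "primary_flow_kg_s": (800.0, 1300.0),
    "reactor_power_MWt": (0.0, 400.0),
}

def check_scan(scan):
    """Check one scan of plant data; return a list of operator advisories."""
    advisories = []
    for channel, value in scan.items():
        lo, hi = LIMITS[channel]
        if value != value:                 # NaN marks a failed/invalid reading
            advisories.append(f"{channel}: invalid reading")
        elif not (lo <= value <= hi):
            advisories.append(f"{channel}: {value} outside [{lo}, {hi}]")
    return advisories

print(check_scan({"coolant_outlet_temp_C": 565.0,
                  "primary_flow_kg_s": 1100.0,
                  "reactor_power_MWt": 399.0}))
```

A production system would additionally log every scan for historical records and compute derived variables such as thermal power, as the record describes.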

  7. Implementation of computer security at nuclear facilities in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Lochthofen, Andre; Sommer, Dagmar [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    2013-07-01

In recent years, electrical and I and C components in nuclear power plants (NPPs) have been replaced by software-based components. With the increased number of software-based systems, the threat of malevolent interference and cyber-attacks on NPPs has also increased. In order to maintain nuclear security, conventional physical protection measures and protection measures in the field of computer security have to be implemented. Therefore, the existing security management process of the NPPs has to be expanded to cover computer security aspects. In this paper, we give an overview of computer security requirements for German NPPs. Furthermore, some examples of the implementation of computer security projects based on a GRS best-practice approach are shown. (orig.)

  8. Implementation of computer security at nuclear facilities in Germany

    International Nuclear Information System (INIS)

    Lochthofen, Andre; Sommer, Dagmar

    2013-01-01

In recent years, electrical and I and C components in nuclear power plants (NPPs) have been replaced by software-based components. With the increased number of software-based systems, the threat of malevolent interference and cyber-attacks on NPPs has also increased. In order to maintain nuclear security, conventional physical protection measures and protection measures in the field of computer security have to be implemented. Therefore, the existing security management process of the NPPs has to be expanded to cover computer security aspects. In this paper, we give an overview of computer security requirements for German NPPs. Furthermore, some examples of the implementation of computer security projects based on a GRS best-practice approach are shown. (orig.)

  9. Computer-aided system for cryogenic research facilities

    International Nuclear Information System (INIS)

    Gerasimov, V.P.; Zhelamsky, M.V.; Mozin, I.V.; Repin, S.S.

    1994-01-01

A computer-aided system has been developed for the more effective choice and optimization of the design and manufacturing technologies of the superconductor for the magnet system of the International Thermonuclear Experimental Reactor (ITER), with the aim of ensuring superconductor certification. The computer-aided system provides acquisition, processing, storage and display of data describing the tests in progress, the detection of any parameter deviations, and their analysis. In addition, it generates commands to switch off the equipment in emergency situations. (orig.)

  10. Ground facility for information reception, processing, dissemination and scientific instruments management setup in the CORONAS-PHOTON space project

    Science.gov (United States)

    Buslov, A. S.; Kotov, Yu. D.; Yurov, V. N.; Bessonov, M. V.; Kalmykov, P. A.; Oreshnikov, E. M.; Alimov, A. M.; Tumanov, A. V.; Zhuchkova, E. A.

    2011-06-01

This paper deals with the organizational structure of ground-based receiving, processing, and dissemination of scientific information, created by the Astrophysics Institute of the National Research Nuclear University, Moscow Engineering Physics Institute. The hardware structure and software features are described. The principles for forming sets of control commands for scientific equipment (SE) devices are given, and statistics are presented on the operation of the facility during flight tests of the spacecraft (SC) over the course of one year.

  11. Corrective Measures Study Modeling Results for the Southwest Plume - Burial Ground Complex/Mixed Waste Management Facility

    International Nuclear Information System (INIS)

    Harris, M.K.

    1999-01-01

Groundwater modeling scenarios were performed to support the Corrective Measures Study and Interim Action Plan for the southwest plume of the Burial Ground Complex/Mixed Waste Management Facility. The modeling scenarios were designed to provide data for an economic analysis of alternatives and, subsequently, to evaluate the effectiveness of the selected remedial technologies for tritium reduction to Fourmile Branch. Modeling scenarios assessed include no action; vertical barriers; pump, treat, and reinject; and vertical recirculation wells

  12. Engineering and Design. Guidelines on Ground Improvement for Structures and Facilities

    National Research Council Canada - National Science Library

    Enson, Carl

    1999-01-01

    .... It addresses general evaluation of site and soil conditions, selection of improvement methods, preliminary cost estimating, design, construction, and performance evaluation for ground improvement...

  13. A projection gradient method for computing ground state of spin-2 Bose–Einstein condensates

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hanquan, E-mail: hanquan.wang@gmail.com [School of Statistics and Mathematics, Yunnan University of Finance and Economics, Kunming, Yunnan Province, 650221 (China); Yunnan Tongchang Scientific Computing and Data Mining Research Center, Kunming, Yunnan Province, 650221 (China)

    2014-10-01

    In this paper, a projection gradient method is presented for computing ground state of spin-2 Bose–Einstein condensates (BEC). We first propose the general projection gradient method for solving energy functional minimization problem under multiple constraints, in which the energy functional takes real functions as independent variables. We next extend the method to solve a similar problem, where the energy functional now takes complex functions as independent variables. We finally employ the method into finding the ground state of spin-2 BEC. The key of our method is: by constructing continuous gradient flows (CGFs), the ground state of spin-2 BEC can be computed as the steady state solution of such CGFs. We discretized the CGFs by a conservative finite difference method along with a proper way to deal with the nonlinear terms. We show that the numerical discretization is normalization and magnetization conservative and energy diminishing. Numerical results of the ground state and their energy of spin-2 BEC are reported to demonstrate the effectiveness of the numerical method.
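The projection-gradient idea can be demonstrated on a heavily simplified case: a SINGLE-component condensate in 1D with only the normalization constraint (the paper treats spin-2 BEC, with multiple components and both normalization and magnetization constraints). Each iteration takes a gradient step on the discretized energy functional and then projects back onto the unit-norm constraint set; all parameter values are assumptions.

```python
import numpy as np

# Discretized energy: E = sum dx * (0.5|phi'|^2 + V|phi|^2 + 0.5*beta|phi|^4)
n, L = 256, 16.0
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]
V = 0.5 * x**2                        # harmonic trap
beta = 100.0                          # interaction strength (assumed)

phi = np.exp(-x**2 / 2)
phi /= np.sqrt((phi**2).sum() * dx)   # project onto the unit-norm constraint

dt = 1e-3
for _ in range(5000):
    lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
    grad = -0.5 * lap + V * phi + beta * phi**3  # 0.5 * dE/dphi
    phi = phi - dt * grad                        # gradient step
    phi /= np.sqrt((phi**2).sum() * dx)          # projection (renormalize)

E = ((0.5 * ((np.roll(phi, -1) - phi) / dx)**2
      + V * phi**2 + 0.5 * beta * phi**4).sum() * dx)
print(f"ground-state energy (dimensionless units): {E:.4f}")
```

This is exactly the "ground state as the steady state of a gradient flow" picture; the paper's method adds the conservative discretization and the extra magnetization constraint needed for the spin-2 case.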

  14. A projection gradient method for computing ground state of spin-2 Bose–Einstein condensates

    International Nuclear Information System (INIS)

    Wang, Hanquan

    2014-01-01

    In this paper, a projection gradient method is presented for computing ground state of spin-2 Bose–Einstein condensates (BEC). We first propose the general projection gradient method for solving energy functional minimization problem under multiple constraints, in which the energy functional takes real functions as independent variables. We next extend the method to solve a similar problem, where the energy functional now takes complex functions as independent variables. We finally employ the method into finding the ground state of spin-2 BEC. The key of our method is: by constructing continuous gradient flows (CGFs), the ground state of spin-2 BEC can be computed as the steady state solution of such CGFs. We discretized the CGFs by a conservative finite difference method along with a proper way to deal with the nonlinear terms. We show that the numerical discretization is normalization and magnetization conservative and energy diminishing. Numerical results of the ground state and their energy of spin-2 BEC are reported to demonstrate the effectiveness of the numerical method

  15. Operational facility-integrated computer system for safeguards

    International Nuclear Information System (INIS)

    Armento, W.J.; Brooksbank, R.E.; Krichinsky, A.M.

    1980-01-01

A computer system for safeguards in an active, remotely operated, nuclear fuel processing pilot plant has been developed. This system maintains (1) comprehensive records of special nuclear materials, (2) automatically updated book inventory files, (3) material transfer catalogs, (4) timely inventory estimations, (5) sample transactions, (6) automatic, on-line volume balances and alarms, and (7) terminal access and applications software monitoring and logging. Future development will include near-real-time SNM mass balancing as both a static, in-tank summation and a dynamic, in-line determination. It is planned to incorporate aspects of site security and physical protection into the computer monitoring

  16. Computer modeling of ground-water flow at the Savannah River Plant

    International Nuclear Information System (INIS)

    Root, R.W. Jr.

    1979-01-01

    Mathematical equations describing ground-water flow are used in a computer model being developed to predict the space-time distribution of hydraulic head beneath a part of the Savannah River Plant site. These equations are solved by a three-dimensional finite-difference scheme. Preliminary calibration of the hydraulic head model has been completed and calculated results compare well with water-level changes observed in the field. 10 figures, 1 table
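    The finite-difference idea behind such ground-water head models can be illustrated on the simplest case: steady-state head in a homogeneous 2-D aquifer obeys Laplace's equation, solved here by Jacobi iteration between two fixed-head boundaries. Grid size, boundary heads, and iteration count are illustrative assumptions, not values from the Savannah River model.

```python
import numpy as np

def solve_head(nx=30, ny=30, h_left=100.0, h_right=90.0, iters=5000):
    """Jacobi iteration for steady hydraulic head h (Laplace equation):
    each interior node relaxes to the average of its four neighbors."""
    h = np.full((ny, nx), (h_left + h_right) / 2)
    h[:, 0] = h_left        # constant-head (Dirichlet) boundary
    h[:, -1] = h_right      # e.g. river stage on the other side
    for _ in range(iters):
        h[1:-1, 1:-1] = 0.25 * (h[2:, 1:-1] + h[:-2, 1:-1] +
                                h[1:-1, 2:] + h[1:-1, :-2])
        h[0, 1:-1] = h[1, 1:-1]      # no-flow (Neumann) top boundary
        h[-1, 1:-1] = h[-2, 1:-1]    # no-flow bottom boundary
    return h
```

A 3-D transient model like the one described adds a storage term and a time-stepping loop, but the per-node neighbor-averaging stencil is the same.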

  17. Characterization of 618-11 solid waste burial ground, disposed waste, and description of the waste generating facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hladek, K.L.

    1997-10-07

    The 618-11 (Wye or 318-11) burial ground received transuranic (TRU) and mixed fission product solid waste from March 9, 1962, through October 2, 1962. It was then closed for 11 months so additional burial facilities could be added. The burial ground was reopened on September 16, 1963, and continued operating until it was closed permanently on December 31, 1967. The burial ground received wastes from all of the 300 Area radioactive material handling facilities. The purpose of this document is to characterize the 618-11 solid waste burial ground by describing the site, burial practices, the disposed wastes, and the waste generating facilities. This document provides information showing that kilogram quantities of plutonium were disposed to the drum storage units and caissons, making them transuranic (TRU). Also, kilogram quantities of plutonium and other TRU wastes were disposed to the three trenches, which were previously thought to contain non-TRU wastes. The site burial facilities (trenches, caissons, and drum storage units) should therefore be classified as TRU and the site plutonium inventory maintained at five kilograms. Other fissile wastes were also disposed to the site. Additionally, thousands of curies of mixed fission products were disposed to the trenches, caissons, and drum storage units. Most of the fission products have decayed over several half-lives and are now at more tolerable levels. Of greater concern, because of their release potential, are the TRU radionuclides Pu-238, Pu-240, and Np-237. The wastes also included slightly enriched (0.95 and 1.25%) U-235 from N-Reactor fuel, which adds to the fissile content. The 618-11 burial ground is located approximately 100 meters due west of Washington Nuclear Plant No. 2. The burial ground consists of three trenches, approximately 900 feet long, 25 feet deep, and 50 feet wide, running east-west. The trenches constitute 75% of the site area. There are 50 drum storage units (five 55-gallon steel drums welded together

  18. Characterization of 618-11 solid waste burial ground, disposed waste, and description of the waste generating facilities

    International Nuclear Information System (INIS)

    Hladek, K.L.

    1997-01-01

    The 618-11 (Wye or 318-11) burial ground received transuranic (TRU) and mixed fission product solid waste from March 9, 1962, through October 2, 1962. It was then closed for 11 months so additional burial facilities could be added. The burial ground was reopened on September 16, 1963, and continued operating until it was closed permanently on December 31, 1967. The burial ground received wastes from all of the 300 Area radioactive material handling facilities. The purpose of this document is to characterize the 618-11 solid waste burial ground by describing the site, burial practices, the disposed wastes, and the waste generating facilities. This document provides information showing that kilogram quantities of plutonium were disposed to the drum storage units and caissons, making them transuranic (TRU). Also, kilogram quantities of plutonium and other TRU wastes were disposed to the three trenches, which were previously thought to contain non-TRU wastes. The site burial facilities (trenches, caissons, and drum storage units) should therefore be classified as TRU and the site plutonium inventory maintained at five kilograms. Other fissile wastes were also disposed to the site. Additionally, thousands of curies of mixed fission products were disposed to the trenches, caissons, and drum storage units. Most of the fission products have decayed over several half-lives and are now at more tolerable levels. Of greater concern, because of their release potential, are the TRU radionuclides Pu-238, Pu-240, and Np-237. The wastes also included slightly enriched (0.95 and 1.25%) U-235 from N-Reactor fuel, which adds to the fissile content. The 618-11 burial ground is located approximately 100 meters due west of Washington Nuclear Plant No. 2. The burial ground consists of three trenches, approximately 900 feet long, 25 feet deep, and 50 feet wide, running east-west. The trenches constitute 75% of the site area. There are 50 drum storage units (five 55-gallon steel drums welded together

  19. Computer program for source distribution process in radiation facility

    International Nuclear Information System (INIS)

    Al-Kassiri, H.; Abdul Ghani, B.

    2007-08-01

    Computer simulation of dose distribution using Visual Basic has been done according to the arrangement and activities of Co-60 sources. This program provides the dose distribution in treated products depending on the product density and desired dose. The program is useful for optimizing the source distribution during the loading process. There is good agreement between data calculated by the program and experimental data. (Author)
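    The core calculation in such a source-loading program can be sketched with a bare point-kernel model: the dose rate at a product location is the sum of inverse-square contributions from each Co-60 source position. This sketch ignores buildup and attenuation, and the dose-rate constant and source list are illustrative assumptions, not values from the program above.

```python
# Assumed dose-rate constant for Co-60, in mGy*m^2/(h*GBq);
# an illustrative figure, not taken from the cited program.
GAMMA = 0.351

def dose_rate(point, sources):
    """Sum inverse-square dose-rate contributions at `point` from
    (x, y, z, activity_GBq) point sources. Returns mGy/h."""
    total = 0.0
    for sx, sy, sz, a in sources:
        r2 = (point[0] - sx)**2 + (point[1] - sy)**2 + (point[2] - sz)**2
        total += GAMMA * a / r2
    return total
```

Optimizing a loading pattern then amounts to choosing source positions and activities so that `dose_rate` stays within the desired dose window across the product volume.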

  20. NNS computing facility manual P-17 Neutron and Nuclear Science

    International Nuclear Information System (INIS)

    Hoeberling, M.; Nelson, R.O.

    1993-11-01

    This document describes basic policies and provides information and examples on using the computing resources provided by P-17, the Neutron and Nuclear Science (NNS) group. Information on user accounts, getting help, network access, electronic mail, disk drives, tape drives, printers, batch processing software, XSYS hints, PC networking hints, and Mac networking hints is given

  1. Convergence of Ground and Excited State Properties of Divacancy Defects in 4H-SiC with Computational Cell Size

    Science.gov (United States)

    2018-03-01

    By Ariana Beste and DeCarlos E Taylor. Approved for public release; distribution is unlimited.

  2. Federal Technology Alert: Ground-Source Heat Pumps Applied to Federal Facilities-Second Edition; FINAL

    International Nuclear Information System (INIS)

    Hadley, Donald L

    2001-01-01

    This Federal Technology Alert, which was sponsored by the U.S. Department of Energy's Office of Federal Energy Management Programs, provides the detailed information and procedures that a Federal energy manager needs to evaluate most ground-source heat pump applications. This report updates an earlier report on ground-source heat pumps that was published in September 1995. In the current report, general benefits of this technology to the Federal sector are described, as are ground-source heat pump operation, system types, design variations, energy savings, and other benefits. In addition, information on current manufacturers, technology users, and references for further reading are provided

  3. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, J; Sartirana, A

    2001-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on thei...

  4. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, Jose

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on the...

  5. Improving CMS data transfers among its distributed computing facilities

    International Nuclear Information System (INIS)

    Flix, J; Magini, N; Sartirana, A

    2011-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on their usage, customizing the topologies and improving their setup in order to keep CMS transferring data at the desired levels in a reliable and robust way.

  6. Implementation of the Facility Integrated Inventory Computer System (FICS)

    International Nuclear Information System (INIS)

    McEvers, J.A.; Krichinsky, A.M.; Layman, L.R.; Dunnigan, T.H.; Tuft, R.M.; Murray, W.P.

    1980-01-01

    This paper describes a computer system which has been developed for nuclear material accountability and implemented in an active radiochemical processing plant involving remote operations. The system possesses the following features: comprehensive, timely records of the location and quantities of special nuclear materials; automatically updated book inventory files on the plant and sub-plant levels of detail; material transfer coordination and cataloging; automatic inventory estimation; sample transaction coordination and cataloging; automatic on-line volume determination, limit checking, and alarming; extensive information retrieval capabilities; and terminal access and application software monitoring and logging

  7. Ground-glass opacity in diffuse lung diseases: high-resolution computed tomography-pathology correlation

    International Nuclear Information System (INIS)

    Santos, Maria Lucia de Oliveira; Vianna, Alberto Domingues; Marchiori, Edson; Souza Junior, Arthur Soares; Moraes, Heleno Pinto de

    2003-01-01

    Ground-glass opacity is a finding frequently seen in high-resolution computed tomography examinations of the chest, characterized by hazy increased attenuation of the lung without blurring of bronchial and vascular margins. Because it is nonspecific, association with other radiological, clinical and pathological findings must be considered for an accurate diagnostic interpretation. In this paper we reviewed 62 computed tomography examinations of patients with diffuse pulmonary diseases of 14 different etiologies in which ground-glass opacity was the only or the most remarkable finding, and correlated this finding with the pathological abnormalities seen in specimens obtained from biopsies or necropsies. In pneumocystosis, ground-glass opacities correlated histologically with alveolar occupation by a foamy material containing parasites; in bronchioloalveolar cell carcinoma, with thickening of the alveolar septa and occupation of the lumen by mucus and tumoral cells; in paracoccidioidomycosis, with thickening of the alveolar septa, areas of fibrosis and alveolar bronchopneumonic exudate; in sarcoidosis, with fibrosis or clustering of granulomas; and in idiopathic pulmonary fibrosis, with alveolar septal thickening due to fibrosis. Alveolar occupation by blood was found in cases of leptospirosis, idiopathic hemosiderosis, metastatic kidney tumor and invasive aspergillosis, whereas oily vacuoles were seen in lipoid pneumonia, proteinaceous and lipoproteinaceous material in silicoproteinosis and pulmonary alveolar proteinosis, and edematous fluid in cardiac failure. (author)

  8. Ground Water Monitoring Requirements for Hazardous Waste Treatment, Storage and Disposal Facilities

    Science.gov (United States)

    The groundwater monitoring requirements for hazardous waste treatment, storage and disposal facilities (TSDFs) are just one aspect of the Resource Conservation and Recovery Act (RCRA) hazardous waste management strategy for protecting human health and the

  9. NEGOTIATING COMMON GROUND IN COMPUTER-MEDIATED VERSUS FACE-TO-FACE DISCUSSIONS

    Directory of Open Access Journals (Sweden)

    Ilona Vandergriff

    2006-01-01

    Full Text Available To explore the impact of the communication medium on building common ground, this article presents research comparing learner use of reception strategies in traditional face-to-face (FTF) and in synchronous computer-mediated communication (CMC). Reception strategies, such as reprises, hypothesis testing and forward inferencing, provide evidence of comprehension and thus serve to establish common ground among participants. A number of factors, including communicative purpose or medium, are hypothesized to affect the use of such strategies (Clark & Brennan, 1991). In the data analysis, I (1) identify specific types of reception strategies, (2) compare their relative frequencies by communication medium, by task, and by learner, and (3) describe how these reception strategies function in the discussions. The findings of the quantitative analysis show that the medium alone seems to have little impact on grounding as indicated by use of reception strategies. The qualitative analysis provides evidence that participants adapted the strategies to the goals of the communicative interaction, as they used them primarily to negotiate and update common ground in their collaborative activity rather than to compensate for L2 deficiencies.

  10. Computer usage among nurses in rural health-care facilities in South Africa: obstacles and challenges.

    Science.gov (United States)

    Asah, Flora

    2013-04-01

    This study discusses factors inhibiting computer usage for work-related tasks among computer-literate professional nurses within rural healthcare facilities in South Africa. In the past two decades computer literacy courses have not been part of the nursing curricula. Computer courses are offered by the State Information Technology Agency. Despite this, there seems to be limited use of computers by professional nurses in the rural context. Focus group interviews were held with 40 professional nurses from three government hospitals in northern KwaZulu-Natal. Contributing factors were found to be lack of information technology infrastructure, restricted access to computers and deficits in technical and nursing management support. The physical location of computers within the health-care facilities and lack of relevant software emerged as specific obstacles to usage. Provision of continuous and active support from nursing management could positively influence computer usage among professional nurses. A closer integration of information technology and computer literacy skills into existing nursing curricula would foster a positive attitude towards computer usage through early exposure. Responses indicated that a change of mindset may be needed on the part of nursing management so that they begin to actively promote ready access to computers as a means of creating greater professionalism and collegiality. © 2011 Blackwell Publishing Ltd.

  11. Development of computer model for radionuclide released from shallow-land disposal facility

    International Nuclear Information System (INIS)

    Suganda, D.; Sucipta; Sastrowardoyo, P.B.; Eriendi

    1998-01-01

    Development of a 1-dimensional computer model for radionuclide release from a shallow land disposal facility (SLDF) has been carried out. This computer model is used for the SLDF facility at PPTA Serpong. The SLDF facility is 1.8 metres above the groundwater and 150 metres from the Cisalak river. A numerical method based on an implicit finite difference solution is chosen to predict the migration of radionuclide at any concentration. The migration starts vertically from the bottom of the SLDF down to the groundwater layer, then proceeds horizontally in the groundwater towards the critical population group. The radionuclide Cs-137 is chosen as a sample to study its migration. The result of the assessment shows that the SLDF facility at PPTA Serpong meets high safety criteria. (author)
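    The implicit finite-difference approach mentioned in this abstract can be sketched for 1-D advection-dispersion with first-order radioactive decay, dC/dt = D d²C/dx² - v dC/dx - λC, stepped backward in time so each step reduces to a small linear solve. All parameter values here are hypothetical placeholders, not the PPTA Serpong model's data.

```python
import numpy as np

def migrate(c0, D=1e-2, v=1e-3, lam=7.3e-10, dx=0.1, dt=3600.0, steps=24):
    """Backward-Euler (implicit, unconditionally stable) finite-difference
    step for 1-D advection-dispersion with decay; one linear solve per step."""
    n = len(c0)
    lo = -D * dt / dx**2 - v * dt / (2 * dx)    # sub-diagonal coefficient
    di = 1 + 2 * D * dt / dx**2 + lam * dt      # main diagonal
    up = -D * dt / dx**2 + v * dt / (2 * dx)    # super-diagonal
    A = (np.diag([di] * n) + np.diag([lo] * (n - 1), -1)
         + np.diag([up] * (n - 1), 1))
    A[0, :] = 0.0; A[0, 0] = 1.0                # fixed concentration at inlet
    A[-1, :] = 0.0; A[-1, -1] = 1.0; A[-1, -2] = -1.0   # zero-gradient outlet
    c = np.array(c0, dtype=float)
    for _ in range(steps):
        rhs = c.copy()
        rhs[-1] = 0.0                           # enforces c[n-1] - c[n-2] = 0
        c = np.linalg.solve(A, rhs)
    return c
```

Because the scheme is implicit, the time step is not limited by the grid spacing, which is why such models can march over the long times relevant to waste migration.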

  12. Predicting the Pullout Capacity of Small Ground Anchors Using Nonlinear Integrated Computing Techniques

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Full Text Available This study investigates prediction of the pullout capacity of small ground anchors using nonlinear computing techniques. Input-output prediction models based on the nonlinear Hammerstein-Wiener (NHW) model and a delayed-input adaptive neurofuzzy inference system (DANFIS) are developed and utilized to predict the pullout capacity. The results of the developed models are compared with previous studies that used artificial neural networks and least square support vector machine techniques for the same case study. In situ data collection and statistical performance measures are used to evaluate the models' performance. Results show that the developed models enhance the precision of predicting the pullout capacity when compared with previous studies. Also, the DANFIS model is proven to perform better than the other models used to predict the pullout capacity of ground anchors.

  13. Aespoe Hard Rock Laboratory. Ground magnetic survey at site for planned facility for calibration of borehole orientation equipment at Aespoe

    Energy Technology Data Exchange (ETDEWEB)

    Mattsson, Haakan (GeoVista AB (Sweden))

    2012-01-15

    This report presents the survey description and results of ground magnetic measurements carried out by GeoVista AB at Aespoe in December 2011. The purpose of the ground magnetic measurement was to measure variations in the Earth's magnetic field and to gain knowledge of the magnetization of the bedrock in an area where SKB plans to build a facility for calibration of equipment for measurements of borehole orientation. A total of 312 data points were collected along three survey lines, 104 points/profile. The data show smooth variations that appear to be natural. There is a clear consistency of the magnetic field variations between the three survey lines, which indicates that the variations in the magnetic field reflect geological variations related to lithology and content of magnetic minerals. There are no indications of artifacts or erroneous data. The anomaly field averages at -32 nT with peak values of Min = -1,016 nT and Max = +572 nT. The strongest anomalies occur at profile length c. 130-140 m. Adding the background field of 50,823 nT, measured at a base station located close to the survey area, the total magnetic field averages at 50,791+-226 nT. The ground magnetic measurement provides background information before the construction of the calibration facility. The magnetic anomaly at c. 130-140 m makes it possible to check for disturbances of magnetic-accelerometer based instruments. The magnetic measurements show that it is possible to construct the facility at the site

  14. Aespoe Hard Rock Laboratory. Ground magnetic survey at site for planned facility for calibration of borehole orientation equipment at Aespoe

    International Nuclear Information System (INIS)

    Mattsson, Haakan

    2012-01-01

    This report presents the survey description and results of ground magnetic measurements carried out by GeoVista AB at Aespoe in December 2011. The purpose of the ground magnetic measurement was to measure variations in the Earth's magnetic field and to gain knowledge of the magnetization of the bedrock in an area where SKB plans to build a facility for calibration of equipment for measurements of borehole orientation. A total of 312 data points were collected along three survey lines, 104 points/profile. The data show smooth variations that appear to be natural. There is a clear consistency of the magnetic field variations between the three survey lines, which indicates that the variations in the magnetic field reflect geological variations related to lithology and content of magnetic minerals. There are no indications of artifacts or erroneous data. The anomaly field averages at -32 nT with peak values of Min = -1,016 nT and Max = +572 nT. The strongest anomalies occur at profile length c. 130-140 m. Adding the background field of 50,823 nT, measured at a base station located close to the survey area, the total magnetic field averages at 50,791±226 nT. The ground magnetic measurement provides background information before the construction of the calibration facility. The magnetic anomaly at c. 130-140 m makes it possible to check for disturbances of magnetic-accelerometer based instruments. The magnetic measurements show that it is possible to construct the facility at the site

  15. Computer software configuration management plan for 200 East/West Liquid Effluent Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Graf, F.A. Jr.

    1995-02-27

    This computer software configuration management plan covers control of the software for the monitor and control system that operates the Effluent Treatment Facility and its associated truck load-in station, as well as some key aspects of the Liquid Effluent Retention Facility, which stores condensate to be processed. Also controlled are the Treated Effluent Disposal System's pumping stations; the system also monitors waste generator flows in this system as well as in the Phase Two Effluent Collection System.

  16. Computer software configuration management plan for 200 East/West Liquid Effluent Facilities

    International Nuclear Information System (INIS)

    Graf, F.A. Jr.

    1995-01-01

    This computer software configuration management plan covers control of the software for the monitor and control system that operates the Effluent Treatment Facility and its associated truck load-in station, as well as some key aspects of the Liquid Effluent Retention Facility, which stores condensate to be processed. Also controlled are the Treated Effluent Disposal System's pumping stations; the system also monitors waste generator flows in this system as well as in the Phase Two Effluent Collection System

  17. Computer control and data acquisition system for the R.F. Test Facility

    International Nuclear Information System (INIS)

    Stewart, K.A.; Burris, R.D.; Mankin, J.B.; Thompson, D.H.

    1986-01-01

    The Radio Frequency Test Facility (RFTF) at Oak Ridge National Laboratory, used to test and evaluate high-power ion cyclotron resonance heating (ICRH) systems and components, is monitored and controlled by a multicomponent computer system. This data acquisition and control system consists of three major hardware elements: (1) an Allen-Bradley PLC-3 programmable controller; (2) a VAX 11/780 computer; and (3) a CAMAC serial highway interface. Operating in LOCAL as well as REMOTE mode, the programmable logic controller (PLC) performs all the control functions of the test facility. The VAX computer acts as the operator's interface to the test facility by providing color mimic panel displays and allowing input via a trackball device. The VAX also provides archiving of trend data acquired by the PLC. Communications between the PLC and the VAX are via the CAMAC serial highway. Details of the hardware, software, and the operation of the system are presented in this paper

  18. Fast Computation of Ground Motion Shaking Map base on the Modified Stochastic Finite Fault Modeling

    Science.gov (United States)

    Shen, W.; Zhong, Q.; Shi, B.

    2012-12-01

    Rapid regional MMI mapping soon after a moderate-to-large earthquake is crucial to loss estimation, emergency services and the planning of emergency action by the government. Many countries pay varying degrees of attention to techniques for rapid estimation of MMI, and this technology has made significant progress in earthquake-prone countries. In recent years, numerical modeling of strong ground motion has been well developed with advances in computational technology and earthquake science. Computational simulation of the strong ground motion caused by earthquake faulting has become an efficient way to estimate the regional MMI distribution soon after an earthquake. In China, owing to the lack of strong motion observations in areas where the network is sparse or entirely absent, the development of strong ground motion simulation methods has become an important means of quantitative estimation of strong motion intensity. Among the many simulation models, the stochastic finite fault model is preferred for rapid MMI estimation because of its time-effectiveness and accuracy. In the finite fault model, a large fault is divided into N subfaults, and each subfault is considered a small point source. The ground motions contributed by each subfault are calculated by the stochastic point source method developed by Boore, and then summed at the observation point, with a proper time delay, to obtain the ground motion from the entire fault. Further, Motazedian and Atkinson proposed the concept of dynamic corner frequency; with this approach, the total radiated energy from the fault and the total seismic moment are conserved independent of subfault size over a wide range of subfault sizes. In the current study, the program EXSIM developed by Motazedian and Atkinson has been modified for local or regional computation of strong motion parameters such as PGA, PGV and PGD, which are essential for MMI estimation.
    To make the results more reasonable, we consider the impact of V30 for the
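    The subfault-summation step described in this abstract can be sketched minimally as follows. This is not the EXSIM implementation: the per-subfault trace is a decaying random wavelet standing in for a proper stochastic point-source simulation, and all geometry and velocity values are illustrative assumptions.

```python
import numpy as np

def finite_fault_sum(nsub=10, fs=100.0, dur=20.0, vrup=2.8, beta=3.5,
                     sub_len=2.0, dist=30.0, seed=0):
    """Divide a line fault into `nsub` subfaults, give each a synthetic
    point-source trace, and sum them at the receiver with the delay from
    rupture propagation plus wave travel time (km, km/s, seconds)."""
    rng = np.random.default_rng(seed)
    n = int(dur * fs)
    t = np.arange(n) / fs
    acc = np.zeros(n)
    for i in range(nsub):
        # rupture delay along the fault plus S-wave travel time to the site
        delay = i * sub_len / vrup + np.hypot(dist, i * sub_len) / beta
        k = int(delay * fs)
        # stand-in for a stochastic point-source acceleration trace
        wavelet = rng.standard_normal(n) * np.exp(-t / 2.0)
        acc[k:] += wavelet[:n - k]
    return t, acc
```

In a real implementation the wavelet would be band-limited noise shaped by the omega-squared spectrum with the (dynamic) corner frequency of each subfault; the delayed summation shown here is the part the finite-fault model adds on top of the point-source method.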

  19. DARA Solid Storage Facility evaluation and recommendations, Y-12 Bear Creek Burial Grounds, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    Barton, W.D. III; Hughey, J.C.

    1992-08-01

    The Disposal Area Remedial Action (DARA) Solid Storage Facility (SSF) is a rectangular concrete vault with two high-density polyethylene (HDPE) liners, covered by a metal building. The SSF was originally designed and constructed to receive saturated sediments from the excavation of the Oil Retention Ponds and Tributary 7 at the Oak Ridge Y-12 Plant. The sediments placed in the SSF were generally high-water-content soils contaminated with polychlorinated biphenyls (PCBs) and volatile organic compounds. The facility was intended to dewater the sediments by allowing the free water to percolate to a 6-in. sand layer covering the entire floor of the facility. The sand layer then drained into sumps located at the east and west ends of the facility. An application for a Part-B Permit was submitted to the state of Tennessee in February 1992 (MMES 1992a). This report is being submitted to support approval of that permit application and to address certain issues known to the regulators regarding this facility

  20. ATLAS experience with HEP software at the Argonne leadership computing facility

    International Nuclear Information System (INIS)

    Uram, Thomas D; LeCompte, Thomas J; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  1. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  2. Mixed Waste Management Facility (MWMF) Old Burial Ground (OBG) source control technology and inventory study

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.P.; Rehder, T.E.; Kanzleiter, J.P.

    1996-10-02

    This report has been developed to support information needs for wastes buried in the Burial Ground Complex. Information discussed is presented in a total of four individual attachments. The general focus of this report is to collect information on estimated source inventories, leaching studies, source control technologies, and to provide information on modeling parameters and associated data deficiencies.

  3. Mixed Waste Management Facility (MWMF) Old Burial Ground (OBG) source control technology and inventory study

    International Nuclear Information System (INIS)

    Flach, G.P.; Rehder, T.E.; Kanzleiter, J.P.

    1996-01-01

    This report has been developed to support information needs for wastes buried in the Burial Ground Complex. Information discussed is presented in a total of four individual attachments. The general focus of this report is to collect information on estimated source inventories, leaching studies, source control technologies, and to provide information on modeling parameters and associated data deficiencies

  4. Operational Circular nr 5 - October 2000 USE OF CERN COMPUTING FACILITIES

    CERN Multimedia

    Division HR

    2000-01-01

New rules covering the use of CERN Computing facilities have been drawn up. All users of CERN’s computing facilities are subject to these rules, as well as to the subsidiary rules of use. The Computing Rules explicitly address your responsibility for taking reasonable precautions to protect computing equipment and accounts. In particular, passwords must not be easily guessed or obtained by others. Given the difficulty of completely separating work and personal use of computing facilities, the rules define under which conditions limited personal use is tolerated. For example, limited personal use of e-mail, news groups or web browsing is tolerated in your private time, provided CERN resources and your official duties are not adversely affected. The full conditions governing use of CERN’s computing facilities are contained in Operational Circular N° 5, which you are requested to read. Full details are available at: http://www.cern.ch/ComputingRules Copies of the circular are also available in the Divis...

  5. Computer security at ukrainian nuclear facilities: interface between nuclear safety and security

    International Nuclear Information System (INIS)

    Chumak, D.; Klevtsov, O.

    2015-01-01

Active introduction of information technology, computer instrumentation and control systems (I and C systems) in the nuclear field leads to greater efficiency and management of technological processes at nuclear facilities. However, this trend brings a number of challenges related to cyber-attacks on the above elements, which violate computer security as well as nuclear safety and security of a nuclear facility. This paper considers regulatory support for computer security at nuclear facilities in Ukraine. The issue of computer and information security is considered in the context of physical protection, of which it is an integral component. The paper focuses on the computer security of I and C systems important to nuclear safety. These systems are potentially vulnerable to cyber threats and, in the case of cyber-attacks, the potential negative impact on normal operational processes can lead to a breach of nuclear facility security. Because computer security of I and C systems interacts with nuclear safety, the paper also considers an example of an integrated approach to the requirements of nuclear safety and security

  6. AIRDOS-II computer code for estimating radiation dose to man from airborne radionuclides in areas surrounding nuclear facilities

    International Nuclear Information System (INIS)

    Moore, R.E.

    1977-04-01

The AIRDOS-II computer code estimates individual and population doses resulting from the simultaneous atmospheric release of as many as 36 radionuclides from a nuclear facility. This report describes the meteorological and environmental models used in the code, their computer implementation, and the applicability of the code to assessments of radiological impact. Atmospheric dispersion and surface deposition of released radionuclides are estimated as a function of direction and distance from a nuclear power plant or fuel-cycle facility, and doses to man through inhalation, air immersion, exposure to contaminated ground, food ingestion, and water immersion are estimated in the surrounding area. Annual doses are estimated for total body, GI tract, bone, thyroid, lungs, muscle, kidneys, liver, spleen, testes, and ovaries. Either the annual population doses (man-rems/year) or the highest annual individual doses in the assessment area (rems/year), whichever are applicable, are summarized in output tables in several ways--by nuclides, modes of exposure, and organs. The location of the highest individual doses for each reference organ estimated for the area is specified in the output data
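Each of the pathway doses described above reduces, in its simplest screening form, to the product of an environmental concentration, an annual intake or exposure rate, and a nuclide- and organ-specific dose conversion factor. A minimal sketch of the inhalation pathway follows; all numerical values are purely illustrative assumptions, not AIRDOS-II library data:

```python
def annual_inhalation_dose_rem(air_conc_ci_per_m3, breathing_rate_m3_per_yr,
                               dcf_rem_per_ci):
    """Screening-level annual inhalation dose for one nuclide and one organ:
    air concentration x annual air intake x dose conversion factor."""
    return air_conc_ci_per_m3 * breathing_rate_m3_per_yr * dcf_rem_per_ci

# Illustrative only: 1e-12 Ci/m^3 in air, an annual breathing rate of
# about 8000 m^3/yr, and an assumed dose factor of 3e5 rem per Ci inhaled.
dose = annual_inhalation_dose_rem(1e-12, 8000.0, 3e5)
print(dose)  # about 2.4e-3 rem/yr
```

A code such as AIRDOS-II sums terms of this kind over nuclides and exposure modes and tabulates the results by organ, which is what the output tables described above contain.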

  7. Ground-water monitoring compliance projects for Hanford Site facilities: Annual progress report for 1987

    International Nuclear Information System (INIS)

    Hall, S.H.

    1988-09-01

    This report describes progress during 1987 of five Hanford Site ground water monitoring projects. Four of these projects are being conducted according to regulations based on the federal Resource Conservation and Recovery Act of 1976 and the state Hazardous Waste Management Act. The fifth project is being conducted according to regulations based on the state Solid Waste Management Act. The five projects discussed herein are: 300 Area Process Trenches; 183-H Solar Evaporation Basins; 200 Areas Low-Level Burial Grounds; Nonradioactive Dangerous Waste Landfill; Solid Waste Landfill. For each of the projects, there are included, as applicable, discussions of monitoring well installations, water-table measurements, background and/or downgradient water quality and results of chemical analysis, and extent and rate of movement of contaminant plumes. 14 refs., 30 figs., 13 tabs

  8. Evolution of facility layout requirements and CAD [computer-aided design] system development

    International Nuclear Information System (INIS)

    Jones, M.

    1990-06-01

    The overall configuration of the Superconducting Super Collider (SSC) including the infrastructure and land boundary requirements were developed using a computer-aided design (CAD) system. The evolution of the facility layout requirements and the use of the CAD system are discussed. The emphasis has been on minimizing the amount of input required and maximizing the speed by which the output may be obtained. The computer system used to store the data is also described

  9. Military Handbook. Grounding, Bonding, and Shielding for Electronic Equipments and Facilities. Volume 1. Basic Theory

    Science.gov (United States)

    1987-12-29

Covers basic theory of grounding, bonding, and shielding in all phases of design, construction, operation, and maintenance of electronic equipment and facilities. The surviving abstract fragments also touch on static charging by particulate-laden air or gas streams, pulverized materials passing through chutes or pneumatic conveyors, and surface transfer impedance in cable shielding design (Oakley, R.J., Wire Journal, Vol 4, No. 3, March 1971).

  10. The OSG Open Facility: an on-ramp for opportunistic scientific computing

    Science.gov (United States)

    Jayatilaka, B.; Levshina, T.; Sehgal, C.; Gardner, R.; Rynge, M.; Würthwein, F.

    2017-10-01

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  11. The OSG Open Facility: An On-Ramp for Opportunistic Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Jayatilaka, B. [Fermilab; Levshina, T. [Fermilab; Sehgal, C. [Fermilab; Gardner, R. [Chicago U.; Rynge, M. [USC - ISI, Marina del Rey; Würthwein, F. [UC, San Diego

    2017-11-22

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  12. A ground-up approach to High Throughput Cloud Computing in High-Energy Physics

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00245123; Ganis, Gerardo; Bagnasco, Stefano

The thesis explores various practical approaches to making existing high-throughput computing applications common in High Energy Physics work on cloud-provided resources, as well as opening the possibility of running new applications. The work is divided into two parts: first, we describe the work done at the computing facility hosted by INFN Torino to entirely convert former Grid resources into cloud ones, eventually running Grid use cases on top along with many others in a more flexible way. Integration and conversion problems are duly described. The second part covers the development of solutions for automating the orchestration of cloud workers based on the load of a batch queue, and the development of HEP applications based on ROOT's PROOF that can adapt at runtime to a changing number of workers.

  13. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    Science.gov (United States)

    du Plessis, Anton; le Roux, Stephan Gerhard; Guelpa, Anina

    2016-10-01

The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high-performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers, companies and also remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments, i.e. a micro-CT system as well as a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers with data analysis software packages, which are at the disposal of facility users, along with expert supervision, if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility has accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT as a means. This paper summarises the laboratory's first four years by way of selected examples, both from published and unpublished projects. In the process a detailed description of the capabilities and facilities available to users is presented.

  14. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Plessis, Anton du, E-mail: anton2@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa); Physics Department, Stellenbosch University, Stellenbosch (South Africa); Roux, Stephan Gerhard le, E-mail: lerouxsg@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa); Guelpa, Anina, E-mail: aninag@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa)

    2016-10-01

The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high-performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers, companies and also remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments, i.e. a micro-CT system as well as a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers with data analysis software packages, which are at the disposal of facility users, along with expert supervision, if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility has accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT as a means. This paper summarises the laboratory’s first four years by way of selected examples, both from published and unpublished projects. In the process a detailed description of the capabilities and facilities available to users is presented.

  15. Qualification of Coatings for Launch Facilities and Ground Support Equipment Through the NASA Corrosion Technology Laboratory

    Science.gov (United States)

    Kolody, Mark R.; Curran, Jerome P.; Calle, Luz Marina

    2014-01-01

    Corrosion protection at NASA's Kennedy Space Center is a high priority item. The launch facilities at the Kennedy Space Center are located approximately 1000 feet from the Atlantic Ocean where they are exposed to salt deposits, high humidity, high UV degradation, and acidic exhaust from solid rocket boosters. These assets are constructed from carbon steel, which requires a suitable coating to provide long-term protection to reduce corrosion and its associated costs.

  16. Atmospheric dispersion calculation for postulated accidents at nuclear facilities and the computer code PANDA

    International Nuclear Information System (INIS)

    Kitahara, Yoshihisa; Kishimoto, Yoichiro; Narita, Osamu; Shinohara, Kunihiko

    1979-01-01

Several calculation methods for the relative concentration (X/Q) and relative cloud-gamma dose (D/Q) of radioactive materials released from nuclear facilities in a postulated accident are presented. The procedure has been formulated as the computer program PANDA, and its usage is explained. (author)
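The record does not reproduce the calculation methods themselves; a common basis for X/Q in dispersion codes of this kind is the Gaussian plume model. Below is a sketch of the ground-level centreline relative concentration for an elevated continuous release, with illustrative dispersion parameters; the actual PANDA formulation may differ:

```python
import math

def chi_over_q(u_m_s, sigma_y_m, sigma_z_m, release_height_m):
    """Ground-level centreline relative concentration X/Q (s/m^3) for a
    continuous elevated release, in the standard Gaussian plume form:
    X/Q = exp(-h^2 / (2 sigma_z^2)) / (pi * sigma_y * sigma_z * u)."""
    return (math.exp(-release_height_m**2 / (2.0 * sigma_z_m**2))
            / (math.pi * sigma_y_m * sigma_z_m * u_m_s))

# Illustrative: 2 m/s wind, sigma_y = 75 m and sigma_z = 30 m (roughly
# neutral stability around 1 km downwind), 50 m effective release height.
print(chi_over_q(2.0, 75.0, 30.0, 50.0))  # on the order of 1.8e-5 s/m^3
```

Multiplying X/Q by the release rate of each nuclide then gives the air concentration used in downstream dose estimates.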

  17. Taking the classical large audience university lecture online using tablet computer and webconferencing facilities

    DEFF Research Database (Denmark)

    Brockhoff, Per B.

    2011-01-01

    During four offerings (September 2008 – May 2011) of the course 02402 Introduction to Statistics for Engineering students at DTU, with an average of 256 students, the lecturing was carried out 100% through a tablet computer combined with the web conferencing facility Adobe Connect (version 7...

  18. Stormwater Pollution Prevention Plan - TA-60 Roads and Grounds Facility and Associated Sigma Mesa Staging Area

    Energy Technology Data Exchange (ETDEWEB)

    Sandoval, Leonard Frank [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-31

    This Storm Water Pollution Prevention Plan (SWPPP) was developed in accordance with the provisions of the Clean Water Act (33 U.S.C. §§1251 et seq., as amended), and the Multi-Sector General Permit for Storm Water Discharges Associated with Industrial Activity (U.S. EPA, June 2015) issued by the U.S. Environmental Protection Agency (EPA) for the National Pollutant Discharge Elimination System (NPDES) and using the industry specific permit requirements for Sector P-Land Transportation and Warehousing as a guide. This SWPPP applies to discharges of stormwater from the operational areas of the TA-60 Roads and Grounds and Associated Sigma Mesa Staging Area at Los Alamos National Laboratory. Los Alamos National Laboratory (also referred to as LANL or the “Laboratory”) is owned by the Department of Energy (DOE), and is operated by Los Alamos National Security, LLC (LANS). Throughout this document, the term “facility” refers to the TA-60 Roads and Grounds and Associated Sigma Mesa Staging Area. The current permit expires at midnight on June 4, 2020.

  19. Integrated ground-water monitoring strategy for NRC-licensed facilities and sites: Case study applications

    Science.gov (United States)

    Price, V.; Temples, T.; Hodges, R.; Dai, Z.; Watkins, D.; Imrich, J.

    2007-01-01

This document discusses results of applying the Integrated Ground-Water Monitoring Strategy (the Strategy) to actual waste sites using existing field characterization and monitoring data. The Strategy is a systematic approach to dealing with complex sites. Application of such a systematic approach will reduce uncertainty associated with site analysis, and therefore uncertainty associated with management decisions about a site. The Strategy can be used to guide the development of a ground-water monitoring program or to review an existing one. The sites selected for study fall within a wide range of geologic and climatic settings, waste compositions, and site design characteristics and represent realistic cases that might be encountered by the NRC. No one case study illustrates a comprehensive application of the Strategy using all available site data. Rather, within each case study we focus on certain aspects of the Strategy, to illustrate concepts that can be applied generically to all sites. The test sites selected include: the Charleston, South Carolina, Naval Weapons Station; Brookhaven National Laboratory on Long Island, New York; the USGS Amargosa Desert Research Site in Nevada; Rocky Flats in Colorado; C-Area at the Savannah River Site in South Carolina; and the Hanford 300 Area. A Data Analysis section provides examples of detailed analysis of monitoring data.

  20. Horizontal Air-Ground Heat Exchanger Performance and Humidity Simulation by Computational Fluid Dynamic Analysis

    Directory of Open Access Journals (Sweden)

    Paolo Maria Congedo

    2016-11-01

Improving energy efficiency in buildings and promoting renewables are key objectives of European energy policies. Several technological measures are being developed to enhance the energy performance of buildings. Among these, geothermal systems present a huge potential to reduce energy consumption for mechanical ventilation and cooling, but their behavior under varying parameters, boundary and climatic conditions is not fully established. In this paper a horizontal air-ground heat exchanger (HAGHE) system is studied by the development of a computational fluid dynamics (CFD) model. Summer and winter conditions representative of the Mediterranean climate are analyzed to evaluate differences in operation and thermal performance. A particular focus is given to humidity variations, as this parameter has a major impact on indoor air quality and comfort. Results show the benefits that HAGHE systems can provide in reducing energy consumption in all seasons, particularly in summer, when free cooling can be implemented, avoiding post-treatment of the air using heat pumps.
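The paper's analysis is a full CFD model; for orientation, the thermal benefit of such a buried pipe can be sketched with a much simpler lumped effectiveness (NTU) model that assumes a constant pipe-wall temperature equal to the ground temperature. All parameter values below are illustrative assumptions, not values taken from the paper:

```python
import math

def haghe_outlet_temp_c(t_in_c, t_ground_c, ua_w_per_k, m_dot_kg_s,
                        cp_j_per_kg_k=1005.0):
    """Steady-state outlet air temperature of a buried pipe from an
    effectiveness (NTU) model with a constant wall temperature:
    T_out = T_g + (T_in - T_g) * exp(-UA / (m_dot * cp))."""
    ntu = ua_w_per_k / (m_dot_kg_s * cp_j_per_kg_k)
    return t_ground_c + (t_in_c - t_ground_c) * math.exp(-ntu)

# Summer free-cooling example: 32 C inlet air, 16 C ground,
# assumed UA = 150 W/K and air mass flow 0.10 kg/s.
print(haghe_outlet_temp_c(32.0, 16.0, ua_w_per_k=150.0, m_dot_kg_s=0.10))
```

Under these assumed values the exchanger pre-cools the supply air by roughly 12 K; the CFD model in the paper additionally resolves humidity and condensation, which this lumped sketch ignores.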

  1. The ground support computer and in-orbit survey data analysis program for the SEEP experiment

    International Nuclear Information System (INIS)

    Voss, H.D.; Datlowe, D.W.; Mobilia, J.; Roselle, S.N.

    1985-01-01

The ground support computer equipment (GSE) and production survey plot and analysis software are described for the Stimulated Emissions of Energetic Particles (SEEP) experiment on the S81-1 satellite. A general purpose satellite data acquisition circuit was developed based on a Z-80 portable microcomputer. By simply changing instrument control software and electrical connectors, automatic testing and control of the various SEEP instruments was accomplished. A new feature incorporated into the SEEP data analysis phase was the development of a correlative data base for all of the SEEP instruments. A CPU-efficient survey plot program (with ephemeris) was developed to display the approximately 3100 hours of data, with a time resolution of 0.5 sec, from the ten instrument sensors. The details of the general purpose multigraph algorithms and plot formats are presented. For the first time, new associations among simultaneous particle, X-ray, optical and plasma density satellite measurements are being investigated

  2. Soft Computing Approach to Evaluate and Predict Blast-Induced Ground Vibration

    Science.gov (United States)

    Khandelwal, Manoj

    2010-05-01

At the same excavation site, different predictors give different values of safe PPV vis-à-vis safe charge per delay. There is no uniformity in the results predicted by different predictors. All vibration predictor equations have their own site-specific constants; therefore, they cannot be used in a generalized way with confidence and zero level of risk. To overcome this limitation, soft computing tools such as the artificial neural network (ANN) have attracted attention because of their ability to learn from previously encountered patterns. An ANN is a highly interconnected network of a large number of processing elements called neurons, in an architecture inspired by the brain. ANNs can be massively parallel and hence are said to exhibit parallel distributed processing. Once the network has been trained with a sufficient number of sample data sets, it can make reliable and trustworthy predictions, on the basis of its previous learning, about the output related to a new input data set of similar pattern. This paper deals with the application of ANN to the prediction of ground vibration, taking into consideration the maximum charge per delay and the distance between the blast face and the monitoring point. To investigate the appropriateness of this approach, the ANN predictions have also been compared with other vibration predictor equations.
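The conventional predictor equations referred to above are typically power laws in scaled distance with site-fitted constants, for example the USBM form PPV = K (D/√Q)^(-B). A sketch with illustrative, not site-fitted, constants K and B:

```python
def ppv_usbm_mm_s(distance_m, charge_per_delay_kg, k=1140.0, b=1.6):
    """USBM-style vibration predictor: peak particle velocity (mm/s) as a
    power law of the scaled distance D / sqrt(Q). The constants k and b
    are site-specific and must be fitted from monitored blasts; the
    defaults here are illustrative only."""
    scaled_distance = distance_m / charge_per_delay_kg ** 0.5
    return k * scaled_distance ** (-b)

# 100 kg maximum charge per delay, monitoring point 200 m from the face.
print(ppv_usbm_mm_s(distance_m=200.0, charge_per_delay_kg=100.0))
```

An ANN replaces this fixed power-law form with a trained mapping from (charge per delay, distance) to PPV, which is why its predictions can differ from those of any single predictor equation.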

  3. Review of analytical results from the proposed agent disposal facility site, Aberdeen Proving Ground

    Energy Technology Data Exchange (ETDEWEB)

    Brubaker, K.L.; Reed, L.L.; Myers, S.W.; Shepard, L.T.; Sydelko, T.G.

    1997-09-01

Argonne National Laboratory reviewed the analytical results from 57 composite soil samples collected in the Bush River area of Aberdeen Proving Ground, Maryland. A suite of 16 analytical tests involving 11 different SW-846 methods was used to detect a wide range of organic and inorganic contaminants. One method (BTEX) was considered redundant, and two "single-number" methods (TPH and TOX) were found to lack the required specificity to yield unambiguous results, especially in a preliminary investigation. Volatile analytes detected at the site include 1,1,2,2-tetrachloroethane, trichloroethylene, and tetrachloroethylene, all of which probably represent residual site contamination from past activities. Other volatile analytes detected include toluene, tridecane, methylene chloride, and trichlorofluoromethane. These compounds are probably not associated with site contamination but likely represent cross-contamination or, in the case of tridecane, a naturally occurring material. Semivolatile analytes detected include three different phthalates and low part-per-billion amounts of the pesticide DDT and its degradation product DDE. The pesticide could represent residual site contamination from past activities, and the phthalates are likely due, in part, to cross-contamination during sample handling. A number of high-molecular-weight hydrocarbons and hydrocarbon derivatives were detected and were probably naturally occurring compounds. 4 refs., 1 fig., 8 tabs.

  4. Computational dosimetry for grounded and ungrounded human models due to contact current

    International Nuclear Information System (INIS)

    Chan, Kwok Hung; Hattori, Junya; Laakso, Ilkka; Hirata, Akimasa; Taki, Masao

    2013-01-01

This study presents the computational dosimetry of contact currents for grounded and ungrounded human models. The uncertainty of the quasi-static (QS) approximation of the in situ electric field induced in a grounded/ungrounded human body due to the contact current is first estimated. Different scenarios of cylindrical and anatomical human body models are considered, and the results are compared with the full-wave analysis. In the QS analysis, the induced field in the grounded cylindrical model is calculated by the QS finite-difference time-domain (QS-FDTD) method, and compared with the analytical solution. Because no analytical solution is available for the grounded/ungrounded anatomical human body model, the results of the QS-FDTD method are then compared with those of the conventional FDTD method. The upper frequency limit for the QS approximation in the contact current dosimetry is found to be 3 MHz, with a relative local error of less than 10%. The error increases above this frequency, which can be attributed to the neglect of the displacement current. The QS or conventional FDTD method is used for the dosimetry of induced electric field and/or specific absorption rate (SAR) for a contact current injected into the index finger of a human body model in the frequency range from 10 Hz to 100 MHz. The in situ electric fields or SAR are compared with the basic restrictions in the international guidelines/standards. The maximum electric field or the 99th percentile value of the electric fields appear not only in the fat and muscle tissues of the finger, but also around the wrist, forearm, and the upper arm. Some discrepancies are observed between the basic restrictions for the electric field and SAR and the reference levels for the contact current, especially in the extremities. These discrepancies are shown by an equation that relates the current density, tissue conductivity, and induced electric field in the finger with a cross-sectional area of 1 cm². (paper)
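The closing relation between current, tissue conductivity, and induced field follows from Ohm's law in local form, J = σE: for a current I through a cross-section of area A, E = I/(σA). A minimal numerical check, with an assumed muscle-like conductivity (illustrative, not a value from the paper):

```python
def induced_field_v_per_m(current_a, conductivity_s_per_m, area_m2):
    """In situ electric field in a conducting cross-section carrying a
    contact current: E = J / sigma = I / (sigma * A)."""
    return current_a / (conductivity_s_per_m * area_m2)

# 0.5 mA contact current through a finger cross-section of 1 cm^2,
# assumed muscle-like conductivity of 0.35 S/m (illustrative value).
print(induced_field_v_per_m(0.5e-3, 0.35, 1.0e-4))  # about 14.3 V/m
```

Because the induced field scales inversely with both conductivity and cross-sectional area, narrow extremities such as fingers concentrate the field, which is consistent with the discrepancies in the extremities noted above.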

  5. Computation of Ground-State Properties in Molecular Systems: Back-Propagation with Auxiliary-Field Quantum Monte Carlo.

    Science.gov (United States)

    Motta, Mario; Zhang, Shiwei

    2017-11-14

    We address the computation of ground-state properties of chemical systems and realistic materials within the auxiliary-field quantum Monte Carlo method. The phase constraint to control the Fermion phase problem requires the random walks in Slater determinant space to be open-ended with branching. This in turn makes it necessary to use back-propagation (BP) to compute averages and correlation functions of operators that do not commute with the Hamiltonian. Several BP schemes are investigated, and their optimization with respect to the phaseless constraint is considered. We propose a modified BP method for the computation of observables in electronic systems, discuss its numerical stability and computational complexity, and assess its performance by computing ground-state properties in several molecular systems, including small organic molecules.

  6. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    Science.gov (United States)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.

  7. Computations of Viking Lander Capsule Hypersonic Aerodynamics with Comparisons to Ground and Flight Data

    Science.gov (United States)

    Edquist, Karl T.

    2006-01-01

    Comparisons are made between the LAURA Navier-Stokes code and Viking Lander Capsule hypersonic aerodynamics data from ground and flight measurements. Wind tunnel data are available for a 3.48 percent scale model at Mach 6 and a 2.75 percent scale model at Mach 10.35, both under perfect gas air conditions. Viking Lander 1 aerodynamics flight data also exist from on-board instrumentation for velocities between 2900 and 4400 m/sec (Mach 14 to 23.3). LAURA flowfield solutions are obtained for the geometry as tested or flown, including sting effects at tunnel conditions and finite-rate chemistry effects in flight. Using the flight vehicle center-of-gravity location (trim angle approx. equals -11.1 deg), the computed trim angle at tunnel conditions is within 0.31 degrees of the angle derived from Mach 6 data and 0.13 degrees from the Mach 10.35 trim angle. LAURA Mach 6 trim lift and drag force coefficients are within 2 percent of measured data, and computed trim lift-to-drag ratio is within 4 percent of the data. Computed trim lift and drag force coefficients at Mach 10.35 are within 5 percent and 3 percent, respectively, of wind tunnel data. Computed trim lift-to-drag ratio is within 2 percent of the Mach 10.35 data. Using the nominal density profile and center-of-gravity location, LAURA trim angle at flight conditions is within 0.5 degrees of the total angle measured from on-board instrumentation. LAURA trim lift and drag force coefficients at flight conditions are within 7 and 5 percent, respectively, of the flight data. Computed trim lift-to-drag ratio is within 4 percent of the data. Computed aerodynamics sensitivities to center-of-gravity location, atmospheric density, and grid refinement are generally small. The results will enable a better estimate of aerodynamics uncertainties for future Mars entry vehicles where non-zero angle-of-attack is required.

  8. Money for Research, Not for Energy Bills: Finding Energy and Cost Savings in High Performance Computer Facility Designs

    Energy Technology Data Exchange (ETDEWEB)

    Drewmark Communications; Sartor, Dale; Wilson, Mark

    2010-07-01

    High-performance computing facilities in the United States consume an enormous amount of electricity, cutting into research budgets and challenging public- and private-sector efforts to reduce energy consumption and meet environmental goals. However, these facilities can greatly reduce their energy demand through energy-efficient design of the facility itself. Using a case study of a facility under design, this article discusses strategies and technologies that can be used to help achieve energy reductions.
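One standard way to quantify such facility-level savings (a metric of the field, not one named in the abstract) is Power Usage Effectiveness (PUE), the ratio of total facility power to IT power. A hedged sketch with illustrative numbers:

```python
def pue(total_facility_kw, it_load_kw):
    """Power Usage Effectiveness: total facility power divided by IT power
    (1.0 would mean every watt goes to computing)."""
    return total_facility_kw / it_load_kw

def annual_savings_usd(it_load_kw, pue_before, pue_after, usd_per_kwh=0.10):
    """Energy-cost savings from a facility-efficiency improvement, assuming
    a constant IT load running year-round (8760 hours)."""
    saved_kw = it_load_kw * (pue_before - pue_after)
    return saved_kw * 8760 * usd_per_kwh
```

For example, cutting an assumed 1 MW facility from a PUE of 1.8 to 1.3 at $0.10/kWh frees several hundred thousand dollars a year for research.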

  9. Ground-facilities at the DLR Institute of Aerospace Medicine for preparation of flight experiments

    Science.gov (United States)

    Hemmersbach, Ruth; Hendrik Anken, Ralf; Hauslage, Jens; von der Wiesche, Melanie; Baerwalde, Sven; Schuber, Marianne

    In order to investigate the influence of altered gravity on biological systems and to identify gravisensitive processes, various experimental platforms have been developed that simulate weightlessness or produce hypergravity. At the Institute of Aerospace Medicine, DLR Cologne, a broad spectrum of applications is offered to scientists: clinostats with one rotation axis and variable rotation speeds for the cultivation of small objects (including aquatic organisms) under simulated weightlessness, for online microscopic observations, and for online kinetic measurements. Our own research concentrates on comparative studies with other methods of simulating weightlessness that are also available at the institute: the Rotating Wall Vessel (RWV) for aquatic studies and the Random Positioning Machine (RPM; manufactured by Dutch Space, Leiden, The Netherlands). Correspondingly, various centrifuge devices are available to study different test objects under hypergravity conditions, such as NIZEMI, a slow-rotating centrifuge microscope, and MUSIC, a multi-sample centrifuge. Mainly for experiments with human test subjects (artificial gravity), but also for biological systems and for testing various kinds of (flight) hardware, the SAHC, a short-arm human centrifuge on loan from ESA, was installed in Cologne and completes our experimental scenario. Furthermore, owing to our specific tasks, such as providing laboratories for the German parabolic flight experiments starting from Cologne and serving as the Facility Responsible Center for BIOLAB, a science rack in the Columbus module aboard the ISS, scientists have the opportunity to optimally prepare their flight experiments.

  10. Specialized, multi-user computer facility for the high-speed, interactive processing of experimental data

    International Nuclear Information System (INIS)

    Maples, C.C.

    1979-01-01

    A proposal has been made to develop a specialized computer facility specifically designed to deal with the problems associated with the reduction and analysis of experimental data. Such a facility would provide a highly interactive, graphics-oriented, multi-user environment capable of handling relatively large data bases for each user. By conceptually separating the general problem of data analysis into two parts, cyclic batch calculations and real-time interaction, a multi-level, parallel processing framework may be used to achieve high-speed data processing. In principle such a system should be able to process a mag tape equivalent of data, through typical transformations and correlations, in under 30 sec. The throughput for such a facility, assuming five users simultaneously reducing data, is estimated to be 2 to 3 times greater than is possible, for example, on a CDC7600

  11. Specialized, multi-user computer facility for the high-speed, interactive processing of experimental data

    International Nuclear Information System (INIS)

    Maples, C.C.

    1979-05-01

    A proposal has been made at LBL to develop a specialized computer facility specifically designed to deal with the problems associated with the reduction and analysis of experimental data. Such a facility would provide a highly interactive, graphics-oriented, multi-user environment capable of handling relatively large data bases for each user. By conceptually separating the general problem of data analysis into two parts, cyclic batch calculations and real-time interaction, a multilevel, parallel processing framework may be used to achieve high-speed data processing. In principle such a system should be able to process a mag tape equivalent of data through typical transformations and correlations in under 30 s. The throughput for such a facility, for five users simultaneously reducing data, is estimated to be 2 to 3 times greater than is possible, for example, on a CDC7600. 3 figures
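The 30-second target can be put in concrete terms. Assuming a full 2400-foot, 6250-bpi reel holds roughly 140 MB (an assumed figure, not stated in the abstract), the implied sustained processing rates are:

```python
# Back-of-the-envelope check on the proposal's 30-second target.
TAPE_MB = 140.0      # assumed capacity of one full high-density reel
TARGET_S = 30.0      # "mag tape equivalent of data in under 30 s"
N_USERS = 5          # concurrent users assumed in the throughput estimate

per_user_mb_s = TAPE_MB / TARGET_S       # rate one user's pipeline must sustain
aggregate_mb_s = per_user_mb_s * N_USERS # aggregate demand on the facility
```

A few MB/s per user, and over 20 MB/s aggregate, which is why the proposal leans on a multi-level, parallel processing framework rather than a single machine.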

  12. Mechanisms which help explain implementation of evidence-based practice in residential aged care facilities: a grounded theory study.

    Science.gov (United States)

    Masso, Malcolm; McCarthy, Grace; Kitson, Alison

    2014-07-01

    The context for the study was a nation-wide programme in Australia to implement evidence-based practice in residential aged care, in nine areas of practice, using a wide range of implementation strategies and involving 108 facilities. The study drew on the experiences of those involved in the programme to answer the question: what mechanisms influence the implementation of evidence-based practice in residential aged care and how do those mechanisms interact? The methodology used grounded theory from a critical realist perspective, informed by a conceptual framework that differentiates between the context, process and content of change. People were purposively sampled and invited to participate in semi-structured interviews, resulting in 44 interviews involving 51 people during 2009 and 2010. Participants had direct experience of implementation in 87 facilities, across nine areas of practice, in diverse locations. Sampling continued until data saturation was reached. The quality of the research was assessed using four criteria for judging trustworthiness: credibility, transferability, dependability and confirmability. Data analysis resulted in the identification of four mechanisms that accounted for what took place and participants' experiences. The core category that provided the greatest understanding of the data was the mechanism On Common Ground, comprising several constructs that formed a 'common ground' for change to occur. The mechanism Learning by Connecting recognised the ability to connect new knowledge with existing practice and knowledge, and make connections between actions and outcomes. Reconciling Competing Priorities was an ongoing mechanism whereby new practices had to compete with an existing set of constantly shifting priorities. Strategies for reconciling priorities ranged from structured approaches such as care planning to more informal arrangements such as conversations during daily work. 
The mechanism Exercising Agency bridged the gap between

  13. Health workers' knowledge of and attitudes towards computer applications in rural African health facilities.

    Science.gov (United States)

    Sukums, Felix; Mensah, Nathan; Mpembeni, Rose; Kaltschmidt, Jens; Haefeli, Walter E; Blank, Antje

    2014-01-01

    The QUALMAT (Quality of Maternal and Prenatal Care: Bridging the Know-do Gap) project has introduced an electronic clinical decision support system (CDSS) for pre-natal and maternal care services in rural primary health facilities in Burkina Faso, Ghana, and Tanzania. To report an assessment of health providers' computer knowledge, experience, and attitudes prior to the implementation of the QUALMAT electronic CDSS. A cross-sectional study was conducted with providers in 24 QUALMAT project sites. Information was collected using structured questionnaires. Chi-squared tests and one-way ANOVA describe the association between computer knowledge, attitudes, and other factors. Semi-structured interviews and focus groups were conducted to gain further insights. A total of 108 providers responded; 63% were from Tanzania and 37% from Ghana. The mean age was 37.6 years, and 79% were female. Only 40% had ever used computers, and 29% had prior computer training. About 80% were computer illiterate or beginners. Educational level, age, and years of work experience were significantly associated with computer knowledge (p < 0.05), and most providers expressed positive attitudes towards computer use in the workplace. Given the low levels of computer knowledge among rural health workers in Africa, it is important to provide adequate training and support to ensure the successful uptake of electronic CDSSs in these settings. The positive attitudes to computers found in this study underscore that rural care providers, too, are ready to use such technology.

  14. Development of the computer code to monitor gamma radiation in the nuclear facility environment

    International Nuclear Information System (INIS)

    Akhmad, Y. R.; Pudjiyanto, M.S.

    1998-01-01

    Computer codes for gamma radiation monitoring in the vicinity of nuclear facilities have been developed and can be interfaced with a commercial portable gamma analyzer. The crucial milestone of the first-year activity was achieved: the codes were tested for transferring data files (pulse-height distributions) from the Micro NOMAD gamma spectrometer (an ORTEC product) and converting them into dosimetric and physical quantities. These computer codes are called GABATAN (Gamma Analyzer of Batan) and NAGABAT (Natural Gamma Analyzer of Batan). The GABATAN code can be used at various nuclear facilities for analyzing gamma fields up to 9 MeV, while NAGABAT can be used for analyzing the contribution of natural gamma rays to the exposure rate at a given location

  15. Computer program for storage of historical and routine safety data related to radiologically controlled facilities

    International Nuclear Information System (INIS)

    Marsh, D.A.; Hall, C.J.

    1984-01-01

    A method for tracking and quickly retrieving the radiological status of radiation and industrial safety systems in an active or inactive facility has been developed. The system uses a minicomputer, a graphics plotter, and mass storage devices. Software has been developed that allows input and storage of architectural details, radiological conditions such as exposure rates, current locations of safety systems, and routine and historical information on exposure and contamination levels. A blueprint-size digitizer is used for input. The computer program retains facility floor plans in three-dimensional arrays. The software drives an eight-pen color plotter for output. The plotter generates color plots of the floor plans and safety systems on 8 1/2 x 11 or 20 x 30 paper or on overhead transparencies for reports and presentations

  16. Maintenance of reactor safety and control computers at a large government facility

    International Nuclear Information System (INIS)

    Brady, H.G.

    1985-01-01

    In 1950 the US Government contracted the Du Pont Company to design, build, and operate the Savannah River Plant (SRP). At the time, it was the largest construction project ever undertaken by man. It is still the largest of the Department of Energy facilities. In the nearly 35 years that have elapsed, Du Pont has met its commitments to the US Government and set world safety records in the construction and operation of nuclear facilities. Contributing factors in achieving production goals and setting the safety records are a staff of highly qualified personnel, a well-maintained plant, and sound maintenance programs. There have been many ''first ever'' achievements at SRP. These ''firsts'' include: (1) computer control of a nuclear reactor, and (2) use of computer systems as safety circuits. This presentation discusses the maintenance program provided for these computer systems and all digital systems at SRP. An in-house computer maintenance program that started in 1966 with five persons has grown to a staff of 40, with investments in computer hardware increasing from $4 million in 1970 to more than $60 million in this decade. 4 figs

  17. Opportunities for artificial intelligence application in computer- aided management of mixed waste incinerator facilities

    International Nuclear Information System (INIS)

    Rivera, A.L.; Ferrada, J.J.; Singh, S.P.N.

    1992-01-01

    The Department of Energy/Oak Ridge Field Office (DOE/OR) operates a mixed waste incinerator facility at the Oak Ridge K-25 Site. It is designed for the thermal treatment of incinerable liquid, sludge, and solid waste regulated under the Toxic Substances Control Act (TSCA) and the Resource Conservation and Recovery Act (RCRA). This facility, known as the TSCA Incinerator, services seven DOE/OR installations. The incinerator was recently authorized for production operation in the United States for the processing of mixed (radioactively contaminated, chemically hazardous) wastes as regulated under TSCA and RCRA. Operation of the TSCA Incinerator is highly constrained as a result of regulatory, institutional, technical, and resource-availability requirements. These requirements affect the characteristics and disposition of incinerator residues, limit the quality of liquid and gaseous effluents, limit the characteristics and rates of waste feeds and operating conditions, and restrict the handling of the waste feed inventories. This incinerator facility presents an opportunity for applying computer technology as a technical resource for mixed waste incinerator operation, to promote and sustain a continuous performance-improvement process while demonstrating compliance. Demonstrated computer-aided management systems could be transferred to future mixed waste incinerator facilities

  18. Automation of a cryogenic facility by commercial process-control computer

    International Nuclear Information System (INIS)

    Sondericker, J.H.; Campbell, D.; Zantopp, D.

    1983-01-01

    To ensure that Brookhaven's superconducting magnets are reliable and their field quality meets accelerator requirements, each magnet is pre-tested at operating conditions after construction. MAGCOOL, the production magnet test facility, was designed to perform these tests, having the capacity to test ten magnets per five-day week. This paper describes the control aspects of MAGCOOL and the advantages afforded the designers by the implementation of a commercial process-control computer system

  19. A Computer Simulation to Assess the Nuclear Material Accountancy System of a MOX Fuel Fabrication Facility

    International Nuclear Information System (INIS)

    Portaix, C.G.; Binner, R.; John, H.

    2015-01-01

    SimMOX is a computer programme that simulates container histories as they pass through a MOX facility. It performs two parallel calculations: · the first quantifies the actual movements of material that might be expected to occur, given certain assumptions about, for instance, the accumulation of material and waste, and of their subsequent treatment; · the second quantifies the same movements on the basis of the operator's perception of the quantities involved; that is, they are based on assumptions about quantities contained in the containers. Separate skeletal Excel computer programmes are provided, which can be configured to generate further accountancy results based on these two parallel calculations. SimMOX is flexible in that it makes few assumptions about the order and operational performance of individual activities that might take place at each stage of the process. It is able to do this because its focus is on material flows, and not on the performance of individual processes. Similarly there are no pre-conceptions about the different types of containers that might be involved. At the macroscopic level, the simulation takes steady operation as its base case, i.e., the same quantity of material is deemed to enter and leave the simulated area, over any given period. Transient situations can then be superimposed onto this base scene, by simulating them as operational incidents. A general facility has been incorporated into SimMOX to enable the user to create an ''act of a play'' based on a number of operational incidents that have been built into the programme. By doing this a simulation can be constructed that predicts the way the facility would respond to any number of transient activities. This computer programme can help assess the nuclear material accountancy system of a MOX fuel fabrication facility; for instance the implications of applying NRTA (near real time accountancy). (author)
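The two parallel books that SimMOX keeps (actual movements vs. the operator's perception of them) are exactly the ingredients of a conventional material-balance test. A minimal sketch, with illustrative quantities that are not from any real facility:

```python
def material_balance(begin_inv, receipts, shipments, end_inv):
    """Material Unaccounted For (MUF) over one balance period:
    MUF = (beginning inventory + receipts) - (shipments + ending inventory)."""
    return (begin_inv + receipts) - (shipments + end_inv)

# Two parallel books, as in SimMOX: the actual movements vs. the operator's
# declared view of them (all values illustrative, in kg of heavy metal):
muf_actual   = material_balance(120.0, 40.0, 35.0, 124.2)
muf_declared = material_balance(120.0, 40.0, 35.0, 124.5)
book_gap = muf_declared - muf_actual  # divergence between the two books
```

Comparing the two MUF series over successive periods is the kind of accountancy result the skeletal Excel programmes described above could be configured to produce.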

  20. On the importance of a rich embodiment in the grounding of concepts: perspectives from embodied cognitive science and computational linguistics.

    Science.gov (United States)

    Thill, Serge; Padó, Sebastian; Ziemke, Tom

    2014-07-01

    The recent trend in cognitive robotics experiments on language learning, symbol grounding, and related issues necessarily entails a reduction of sensorimotor aspects from those provided by a human body to those that can be realized in machines, limiting robotic models of symbol grounding in this respect. Here, we argue that there is a need for modeling work in this domain to explicitly take into account the richer human embodiment even for concrete concepts that prima facie relate merely to simple actions, and illustrate this using distributional methods from computational linguistics which allow us to investigate grounding of concepts based on their actual usage. We also argue that these techniques have applications in theories and models of grounding, particularly in machine implementations thereof. Similarly, considering the grounding of concepts in human terms may be of benefit to future work in computational linguistics, in particular in going beyond "grounding" concepts in the textual modality alone. Overall, we highlight the overall potential for a mutually beneficial relationship between the two fields. Copyright © 2014 Cognitive Science Society, Inc.

  1. Integration of distributed plant process computer systems to nuclear power generation facilities

    International Nuclear Information System (INIS)

    Bogard, T.; Finlay, K.

    1996-01-01

    Many operating nuclear power generation facilities are replacing their plant process computer. Such replacement projects are driven by equipment obsolescence issues and associated objectives to improve plant operability, increase plant information access, improve man machine interface characteristics, and reduce operation and maintenance costs. This paper describes a few recently completed and on-going replacement projects with emphasis upon the application integrated distributed plant process computer systems. By presenting a few recent projects, the variations of distributed systems design show how various configurations can address needs for flexibility, open architecture, and integration of technological advancements in instrumentation and control technology. Architectural considerations for optimal integration of the plant process computer and plant process instrumentation ampersand control are evident from variations of design features

  2. A method to estimate characteristics of grounding systems considering experimental studies and computational simulations

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Andre Nunes de; Silva, Ivan Nunes da; Ulson, Jose Alfredo C.; Zago, Maria Goretti [UNESP, Bauru, SP (Brazil). Dept. de Engenharia Eletrica]. E-mail: andrejau@bauru.unesp.br

    2001-07-01

    This paper describes a novel approach for mapping characteristics of grounding systems using artificial neural networks. The network acts as an identifier of structural features of the grounding process, so that output parameters can be estimated and generalized from an input parameter set. The results obtained by the network are compared with those of other approaches used to model grounding systems under lightning conditions. (author)
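For context, the classical analytic counterpart of such a learned mapping is Dwight's formula for a single driven rod; a network like the one described would be trained to reproduce this kind of parameter-to-resistance relation from measured data. A sketch (the formula is standard, the numbers are illustrative):

```python
import math

def rod_resistance(rho_ohm_m, length_m, radius_m):
    """Dwight's formula for a single vertical ground rod:
    R = rho / (2*pi*L) * (ln(4L/a) - 1),
    where rho is soil resistivity, L the buried length, a the rod radius."""
    return rho_ohm_m / (2 * math.pi * length_m) * (
        math.log(4 * length_m / radius_m) - 1)

# 100 ohm-m soil, 3 m rod, 8 mm radius -> a few tens of ohms:
R = rod_resistance(rho_ohm_m=100.0, length_m=3.0, radius_m=0.008)
```

The appeal of a neural identifier is precisely that it can fit such relations directly from field measurements where closed-form expressions break down (layered soils, frequency-dependent behavior under lightning).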

  3. Detectability of molecular gas signatures on Jupiter’s moon Europa from ground and space-based facilities

    Science.gov (United States)

    Paganini, Lucas; Villanueva, Geronimo Luis; Hurford, Terry; Mandell, Avi; Roth, Lorenz; Mumma, Michael J.

    2017-10-01

    Plumes and their effluent material could provide insights into Europa’s subsurface chemistry and relevant information about the prospect that life could exist, or now exists, within the ocean. In 2016, we initiated a strong observational campaign to characterize the chemical composition of Europa’s surface and exosphere using high-resolution infrared spectroscopy. While several studies have focused on the detection of water, or its dissociation products, there could be a myriad of complex molecules released by erupting plumes. Our IR survey has provided a serendipitous search for several key molecular species, allowing a chemical characterization that can aid the investigation of physical processes underlying the surface. Since our tentative water detection, presented at the 2016 DPS meeting, we have continued observations of Europa during 2017, covering a significant extent of the moon’s terrain and orbital position (true anomaly) and accounting for over 50 hr on source. Current analyses of these data show spectral features that warrant further investigation. In addition to analysis algorithms tailored to the examination of Europan data, we have developed simulation tools to predict the possible detection of molecular species using ground-based facilities like the Keck Observatory, NASA’s Infrared Telescope Facility, and the Atacama Large Millimeter/submillimeter Array (ALMA). In this presentation we will discuss the detectability of key molecular species with these remote sensing facilities, as well as expected challenges and future strategies with upcoming spacecraft such as the James Webb Space Telescope (JWST), the Large UV/Optical/Infrared Surveyor (LUVOIR), and a possible gas spectrometer onboard an orbiter. This work is supported by NASA’s Keck PI Data Award (PI L.P.) and Solar System Observation Program (PI L.P.), and by the NASA Astrobiology Institute through funding awarded to the Goddard Center for Astrobiology (PI M.J.M.).

  4. Computer mapping and visualization of facilities for planning of D and D operations

    International Nuclear Information System (INIS)

    Wuller, C.E.; Gelb, G.H.; Cramond, R.; Cracraft, J.S.

    1995-01-01

    The lack of as-built drawings for many old nuclear facilities impedes planning for decontamination and decommissioning. Traditional manual walkdowns subject workers to lengthy exposure to radiological and other hazards. The authors have applied close-range photogrammetry, 3D solid modeling, computer graphics, database management, and virtual reality technologies to create geometrically accurate 3D computer models of the interiors of facilities. The required input to the process is a set of photographs that can be acquired in a brief time. They fit 3D primitive shapes to objects of interest in the photos and, at the same time, record attributes such as material type and link patches of texture from the source photos to facets of modeled objects. When they render the model as either static images or at video rates for a walk-through simulation, the phototextures are warped onto the objects, giving a photo-realistic impression. The authors have exported the data to commercial CAD, cost estimating, robotic simulation, and plant design applications. Results from several projects at old nuclear facilities are discussed
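Close-range photogrammetry of the kind described rests on the pinhole camera model: fitting 3D primitive shapes to photographs amounts to inverting this projection over many views. A minimal sketch of the forward model (lens distortion ignored):

```python
def project(point_xyz, focal_mm):
    """Ideal pinhole projection: image-plane coordinates (in mm) of a 3D
    point expressed in the camera frame (z is depth along the optical axis).
    Photogrammetric reconstruction inverts this mapping over many photos."""
    x, y, z = point_xyz
    return (focal_mm * x / z, focal_mm * y / z)

# A point 4 m in front of an assumed 35 mm lens, offset 1 m right, 0.5 m up:
u, v = project((1.0, 0.5, 4.0), focal_mm=35.0)
```

Given the same point identified in two or more photos taken from known (or jointly estimated) camera poses, its 3D position follows by triangulation, which is how geometrically accurate models can be built from a brief photo survey of a hazardous area.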

  5. Computer-aided mathematical analysis of probability of intercept for ground-based communication intercept system

    Science.gov (United States)

    Park, Sang Chul

    1989-09-01

    We develop a mathematical analysis model to calculate the probability of intercept (POI) for a ground-based communication intercept (COMINT) system. The POI is a measure of the effectiveness of the intercept system. We define the POI as the product of the probability of detection and the probability of coincidence. The probability of detection is a measure of the receiver's capability to detect a signal in the presence of noise. The probability of coincidence is the probability that an intercept system is available, actively listening in the proper frequency band, in the right direction, and at the same time that the signal is received. We investigate the behavior of the POI with respect to the observation time, the separation distance, antenna elevations, the frequency of the signal, and the receiver bandwidths. We observe that the coincidence characteristic between the receiver scanning parameters and the signal parameters is the key factor determining the time to obtain a given POI. This model can be used to find the optimal parameter combination to maximize the POI in a given scenario. We expand this model to multiple systems. The analysis is conducted on a personal computer to provide portability. The model is also flexible and can be easily implemented under different situations.
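The POI definition in the abstract can be sketched directly. The coincidence term below uses a simple scan-overlap model that is an assumption for illustration, not the thesis' actual formulation:

```python
def prob_coincidence(dwell_s, revisit_s, signal_s):
    """Chance a periodically scanning receiver is tuned to the signal's band
    while the signal is up: a simple overlap model for one scan period
    (assumed form, not the thesis' exact derivation)."""
    return min(1.0, (dwell_s + signal_s) / revisit_s)

def prob_intercept(p_detection, p_coincidence):
    """POI as defined in the abstract: probability of detection times
    probability of coincidence."""
    return p_detection * p_coincidence

# Illustrative scenario: 50 ms dwell per channel, 2 s to revisit the band,
# 0.5 s transmissions, 90% detection probability at this SNR:
poi = prob_intercept(0.9, prob_coincidence(dwell_s=0.05, revisit_s=2.0,
                                           signal_s=0.5))
```

Sweeping the scan and signal parameters through such a function is one way to reproduce the paper's study of how POI varies with observation time and receiver settings.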

  6. Evaluating Cloud Computing in the Proposed NASA DESDynI Ground Data System

    Science.gov (United States)

    Tran, John J.; Cinquini, Luca; Mattmann, Chris A.; Zimdars, Paul A.; Cuddy, David T.; Leung, Kon S.; Kwoun, Oh-Ig; Crichton, Dan; Freeborn, Dana

    2011-01-01

    The proposed NASA Deformation, Ecosystem Structure and Dynamics of Ice (DESDynI) mission would be a first-of-breed endeavor that would fundamentally change the paradigm by which Earth Science data systems at NASA are built. DESDynI is evaluating a distributed architecture where expert science nodes around the country all engage in some form of mission processing and data archiving. This is compared to the traditional NASA Earth Science missions where the science processing is typically centralized. What's more, DESDynI is poised to profoundly increase the amount of data collection and processing well into the 5 terabyte/day and tens of thousands of job range, both of which comprise a tremendous challenge to DESDynI's proposed distributed data system architecture. In this paper, we report on a set of architectural trade studies and benchmarks meant to inform the DESDynI mission and the broader community of the impacts of these unprecedented requirements. In particular, we evaluate the benefits of cloud computing and its integration with our existing NASA ground data system software called Apache Object Oriented Data Technology (OODT). The preliminary conclusions of our study suggest that the use of the cloud and OODT together synergistically form an effective, efficient and extensible combination that could meet the challenges of NASA science missions requiring DESDynI-like data collection and processing volumes at reduced costs.
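The quoted requirements translate into concrete rates. A back-of-the-envelope sketch, with the daily job count (20,000) assumed as a midpoint of "tens of thousands":

```python
# Scale of the proposed DESDynI data system, per the abstract:
TB_PER_DAY = 5.0        # stated data volume
JOBS_PER_DAY = 20_000   # assumed midpoint of "tens of thousands of jobs"

sustained_mb_s = TB_PER_DAY * 1e6 / 86_400    # average ingest rate, MB/s
mb_per_job = TB_PER_DAY * 1e6 / JOBS_PER_DAY  # average data per job, MB
```

Sustaining tens of MB/s around the clock across geographically distributed science nodes is what motivates evaluating elastic cloud resources alongside the OODT data-management layer.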

  7. Preliminary Results From a Heavily Instrumented Engine Ice Crystal Icing Test in a Ground Based Altitude Test Facility

    Science.gov (United States)

    Flegel, Ashlie B.; Oliver, Michael J.

    2016-01-01

    Preliminary results from the heavily instrumented ALF502R-5 engine test conducted in the NASA Glenn Research Center Propulsion Systems Laboratory are discussed. The effects of ice crystal icing on a full scale engine is examined and documented. This same model engine, serial number LF01, was used during the inaugural icing test in the Propulsion Systems Laboratory facility. The uncommanded reduction of thrust (rollback) events experienced by this engine in flight were simulated in the facility. Limited instrumentation was used to detect icing on the LF01 engine. Metal temperatures on the exit guide vanes and outer shroud and the load measurement were the only indicators of ice formation. The current study features a similar engine, serial number LF11, which is instrumented to characterize the cloud entering the engine, detect/characterize ice accretion, and visualize the ice accretion in the region of interest. Data were acquired at key LF01 test points and additional points that explored: icing threshold regions, low altitude, high altitude, spinner heat effects, and the influence of varying the facility and engine parameters. For each condition of interest, data were obtained from some selected variations of ice particle median volumetric diameter, total water content, fan speed, and ambient temperature. For several cases the NASA in-house engine icing risk assessment code was used to find conditions that would lead to a rollback event. This study further helped NASA develop necessary icing diagnostic instrumentation, expand the capabilities of the Propulsion Systems Laboratory, and generate a dataset that will be used to develop and validate in-house icing prediction and risk mitigation computational tools. The ice accretion on the outer shroud region was acquired by internal video cameras. The heavily instrumented engine showed good repeatability of icing responses when compared to the key LF01 test points and during day-to-day operation. Other noticeable

  8. Safety assessment for the above ground storage of Cadmium Safety and Control Rods at the Solid Waste Management Facility

    International Nuclear Information System (INIS)

    Shaw, K.W.

    1993-11-01

    The mission of the Savannah River Site is changing from radioisotope production to waste management and environmental restoration. As such, Reactor Engineering has recently developed a plan to transfer the safety and control rods from the C, K, L, and P reactor disassembly basin areas to the Transuranic (TRU) Waste Storage Pads for long-term, retrievable storage. The TRU pads are located within the Solid Waste Management Facilities at the Savannah River Site. An Unreviewed Safety Question (USQ) Safety Evaluation has been performed for the proposed disassembly basin operations phase of the Cadmium Safety and Control Rod Project. The USQ screening identified a required change to the authorization basis; however, the Proposed Activity does not involve a positive USQ Safety Evaluation. A Hazard Assessment for the Cadmium Safety and Control Rod Project determined that the above-ground storage of the cadmium rods results in no change in hazard level at the TRU pads. A Safety Assessment that specifically addresses the storage (at the TRU pads) phase of the Cadmium Safety and Control Rod Project has been performed. Results of the Safety Assessment support the conclusion that a positive USQ is not involved as a result of the Proposed Activity

  9. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Allcock, William; Beggio, Chris; Campbell, Stuart; Cherry, Andrew; Cholia, Shreyas; Dart, Eli; England, Clay; Fahey, Tim; Foertter, Fernanda; Goldstone, Robin; Hick, Jason; Karelitz, David; Kelly, Kaki; Monroe, Laura; Prabhat,; Skinner, David; White, Julia

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014 representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  10. Resource Conservation and Recovery Act ground-water monitoring projects for Hanford facilities: Progress Report for the Period April 1 to June 30, 1989

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.M.; Bates, D.J.; Lundgren, R.E.

    1989-09-01

    This report describes the progress of 13 Hanford ground-water monitoring projects for the period April 1 to June 30, 1989. These projects are for the 300 area process trenches (300 area), 183-H solar evaporation basins (100-H area), 200 areas low-level burial grounds, nonradioactive dangerous waste landfill (southeast of the 200 areas), 1301-N liquid waste disposal facility (100-N area), 1324-N surface impoundment and 1324-NA percolation pond (100-N area), 1325-N liquid waste disposal facility (100-N area), 216-A-10 crib (200-east area), 216-A-29 ditch (200-east area), 216-A-36B crib (200-east area), 216-B-36B crib (200-east area), 216-B-3 pond (east of the 200-east area), 2101-M pond (200-east area), grout treatment facility (200-east area).

  11. A personal computer code for seismic evaluations of nuclear power plant facilities

    International Nuclear Information System (INIS)

    Xu, J.; Graves, H.

    1991-01-01

    In the process of reviewing and evaluating licensing issues related to nuclear power plants, it is essential to understand the behavior of seismic loading, foundation and structural properties, and their impact on the overall structural response. In most cases, such knowledge can be obtained by using simplified engineering models which, when properly implemented, capture the essential parameters describing the physics of the problem. Such models do not require execution on large computer systems and can be implemented on a personal computer (PC). Recognizing the need for a PC software package that can perform the structural response computations required for typical licensing reviews, the US Nuclear Regulatory Commission sponsored the development of the PC-operated software package CARES (Computer Analysis for Rapid Evaluation of Structures). This development was undertaken by Brookhaven National Laboratory (BNL) during FYs 1988 and 1989. A wide range of computer programs and modeling approaches are often used to justify the safety of nuclear power plants, and it is often difficult to assess the validity and accuracy of the results submitted by various utilities without developing comparable computer solutions. Taking this into consideration, CARES was designed as an integrated computational system that can perform rapid evaluations of structural behavior and examine the capability of nuclear power plant facilities; it may thus be used by the NRC to determine the validity and accuracy of the analysis methodologies employed in structural safety evaluations of nuclear power plants. CARES has been designed to operate on a PC, to have a user-friendly input/output interface, and to provide quick turnaround. This paper describes the various features which have been implemented into the seismic module of CARES version 1.0

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and more computationally demanding simulation. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference, where a large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  13. Conceptual design of an ALICE Tier-2 centre. Integrated into a multi-purpose computing facility

    Energy Technology Data Exchange (ETDEWEB)

    Zynovyev, Mykhaylo

    2012-06-29

    This thesis discusses the issues and challenges associated with the design and operation of a data analysis facility for a high-energy physics experiment at a multi-purpose computing centre. In the spotlight is a Tier-2 centre of the distributed computing model of the ALICE experiment at the Large Hadron Collider at CERN in Geneva, Switzerland. The design steps examined in the thesis include analysis and optimization of the I/O access patterns of the user workload, integration of the storage resources, and development of techniques for effective system administration and operation of the facility in a shared computing environment. A number of I/O performance issues on multiple levels of the I/O subsystem, introduced by the use of hard disks for data storage, have been addressed by means of exhaustive benchmarking and thorough analysis of the I/O of the user applications in the ALICE software framework. Defining the requirements for the storage system, describing the potential performance bottlenecks and single points of failure, and examining possible ways to avoid them allows one to develop guidelines for how to integrate the storage resources. A solution for preserving an experiment-specific software stack in a shared environment is presented, along with its effects on user workload performance. A proposal for a flexible model to deploy and operate the ALICE Tier-2 infrastructure and applications in a virtual environment, through adoption of cloud computing technology and the 'Infrastructure as Code' concept, completes the thesis. Scientific software applications can be computed efficiently in a virtual environment, and there is an urgent need to adapt the infrastructure for effective usage of cloud resources.

  14. Conceptual design of an ALICE Tier-2 centre. Integrated into a multi-purpose computing facility

    International Nuclear Information System (INIS)

    Zynovyev, Mykhaylo

    2012-01-01

    This thesis discusses the issues and challenges associated with the design and operation of a data analysis facility for a high-energy physics experiment at a multi-purpose computing centre. In the spotlight is a Tier-2 centre of the distributed computing model of the ALICE experiment at the Large Hadron Collider at CERN in Geneva, Switzerland. The design steps examined in the thesis include analysis and optimization of the I/O access patterns of the user workload, integration of the storage resources, and development of techniques for effective system administration and operation of the facility in a shared computing environment. A number of I/O performance issues on multiple levels of the I/O subsystem, introduced by the use of hard disks for data storage, have been addressed by means of exhaustive benchmarking and thorough analysis of the I/O of the user applications in the ALICE software framework. Defining the requirements for the storage system, describing the potential performance bottlenecks and single points of failure, and examining possible ways to avoid them allows one to develop guidelines for how to integrate the storage resources. A solution for preserving an experiment-specific software stack in a shared environment is presented, along with its effects on user workload performance. A proposal for a flexible model to deploy and operate the ALICE Tier-2 infrastructure and applications in a virtual environment, through adoption of cloud computing technology and the 'Infrastructure as Code' concept, completes the thesis. Scientific software applications can be computed efficiently in a virtual environment, and there is an urgent need to adapt the infrastructure for effective usage of cloud resources.

  15. Computer software design description for the Treated Effluent Disposal Facility (TEDF), Project L-045H, Operator Training Station (OTS)

    International Nuclear Information System (INIS)

    Carter, R.L. Jr.

    1994-01-01

    The Treated Effluent Disposal Facility (TEDF) Operator Training Station (OTS) is a computer-based training tool designed to aid plant operations and engineering staff in familiarizing themselves with the TEDF Central Control System (CCS)

  16. A personal computer code for seismic evaluations of nuclear power plant facilities

    International Nuclear Information System (INIS)

    Xu, J.; Graves, H.

    1990-01-01

    A wide range of computer programs and modeling approaches are often used to justify the safety of nuclear power plants, and it is often difficult to assess the validity and accuracy of the results submitted by various utilities without developing comparable computer solutions. Taking this into consideration, CARES was designed as an integrated computational system that can perform rapid evaluations of structural behavior and examine the capability of nuclear power plant facilities; it may thus be used by the NRC to determine the validity and accuracy of the analysis methodologies employed in structural safety evaluations of nuclear power plants. CARES has been designed to operate on a PC, to have a user-friendly input/output interface, and to provide quick turnaround. The CARES program is structured in a modular format; each module performs a specific type of analysis. The basic modules of the system provide capabilities for static, seismic, and nonlinear analyses. This paper describes the various features which have been implemented into the Seismic Module of CARES version 1.0. In Section 2 a description of the Seismic Module is provided. The methodologies and computational procedures thus far implemented into the Seismic Module are described in Section 3. Finally, a complete demonstration of the computational capability of CARES in a typical soil-structure interaction analysis is given in Section 4, and conclusions are presented in Section 5. 5 refs., 4 figs

  17. Software quality assurance plan for the National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project

  18. Teaching ergonomics to nursing facility managers using computer-based instruction.

    Science.gov (United States)

    Harrington, Susan S; Walker, Bonnie L

    2006-01-01

    This study offers evidence that computer-based training is an effective tool for teaching nursing facility managers about ergonomics and increasing their awareness of potential problems. Study participants (N = 45) were randomly assigned into a treatment or control group. The treatment group completed the ergonomics training and a pre- and posttest. The control group completed the pre- and posttests without training. Treatment group participants improved significantly from 67% on the pretest to 91% on the posttest, a gain of 24%. Differences between mean scores for the control group were not significant for the total score or for any of the subtests.

  19. Neurons Forming Optic Glomeruli Compute Figure-Ground Discriminations in Drosophila

    OpenAIRE

    Aptekar, JW; Keles, MF; Lu, PM; Zolotova, NM; Frye, MA

    2015-01-01

    Many animals rely on visual figure–ground discrimination to aid in navigation, and to draw attention to salient features like conspecifics or predators. Even figures that are similar in pattern and luminance to the visual surroundings can be distinguished by the optical disparity generated by their relative motion against the ground, and yet the neural mechanisms underlying these visual discriminations are not well understood. We show in flies that a diverse array of figure–ground stimuli con...

  20. Neurons Forming Optic Glomeruli Compute Figure–Ground Discriminations in Drosophila

    Science.gov (United States)

    Aptekar, Jacob W.; Keleş, Mehmet F.; Lu, Patrick M.; Zolotova, Nadezhda M.

    2015-01-01

    Many animals rely on visual figure–ground discrimination to aid in navigation, and to draw attention to salient features like conspecifics or predators. Even figures that are similar in pattern and luminance to the visual surroundings can be distinguished by the optical disparity generated by their relative motion against the ground, and yet the neural mechanisms underlying these visual discriminations are not well understood. We show in flies that a diverse array of figure–ground stimuli containing a motion-defined edge elicit statistically similar behavioral responses to one another, and statistically distinct behavioral responses from ground motion alone. From studies in larger flies and other insect species, we hypothesized that the circuitry of the lobula—one of the four, primary neuropiles of the fly optic lobe—performs this visual discrimination. Using calcium imaging of input dendrites, we then show that information encoded in cells projecting from the lobula to discrete optic glomeruli in the central brain group these sets of figure–ground stimuli in a homologous manner to the behavior; “figure-like” stimuli are coded similar to one another and “ground-like” stimuli are encoded differently. One cell class responds to the leading edge of a figure and is suppressed by ground motion. Two other classes cluster any figure-like stimuli, including a figure moving opposite the ground, distinctly from ground alone. This evidence demonstrates that lobula outputs provide a diverse basis set encoding visual features necessary for figure detection. PMID:25972183

  1. FIRAC: a computer code to predict fire-accident effects in nuclear facilities

    International Nuclear Information System (INIS)

    Bolstad, J.W.; Krause, F.R.; Tang, P.K.; Andrae, R.W.; Martin, R.A.; Gregory, W.S.

    1983-01-01

    FIRAC is a medium-sized computer code designed to predict fire-induced flows, temperatures, and material transport within the ventilating systems and other airflow pathways in nuclear-related facilities. The code is designed to analyze the behavior of interconnected networks of rooms and typical ventilation system components. This code is one in a family of computer codes that is designed to provide improved methods of safety analysis for the nuclear industry. The structure of this code closely follows that of the previously developed TVENT and EVENT codes. Because a lumped-parameter formulation is used, this code is particularly suitable for calculating the effects of fires in the far field (that is, in regions removed from the fire compartment), where the fire may be represented parametrically. However, a fire compartment model to simulate conditions in the enclosure is included. This model provides transport source terms to the ventilation system that can affect its operation and in turn affect the fire

  2. Computer-based data acquisition system in the Large Coil Test Facility

    International Nuclear Information System (INIS)

    Gould, S.S.; Layman, L.R.; Million, D.L.

    1983-01-01

    The utilization of computers for data acquisition and control is of paramount importance on large-scale fusion experiments because they feature the ability to acquire data from a large number of sensors at various sample rates and provide for flexible data interpretation, presentation, reduction, and analysis. In the Large Coil Test Facility (LCTF) a Digital Equipment Corporation (DEC) PDP-11/60 host computer with the DEC RSX-11M operating system coordinates the activities of five DEC LSI-11/23 front-end processors (FEPs) via direct memory access (DMA) communication links. This provides host control of scheduled data acquisition and FEP event-triggered data collection tasks. Four of the five FEPs have no operating system

  3. Computer-model analysis of ground-water flow and simulated effects of contaminant remediation at Naval Weapons Industrial Reserve Plant, Dallas, Texas

    Science.gov (United States)

    Barker, Rene A.; Braun, Christopher L.

    2000-01-01

    In June 1993, the Department of the Navy, Southern Division Naval Facilities Engineering Command (SOUTHDIV), began a Resource Conservation and Recovery Act (RCRA) Facility Investigation (RFI) of the Naval Weapons Industrial Reserve Plant (NWIRP) in north-central Texas. The RFI has found trichloroethene, dichloroethene, and vinyl chloride, as well as chromium, lead, and other metallic residuum in the shallow alluvial aquifer underlying NWIRP. These findings and the possibility of on-site or off-site migration of contaminants prompted the need for a ground-water-flow model of the NWIRP area. The resulting U.S. Geological Survey (USGS) model: (1) defines aquifer properties, (2) computes water budgets, (3) delineates major flowpaths, and (4) simulates hydrologic effects of remediation activity. In addition to assisting with particle-tracking analyses, the calibrated model could support solute-transport modeling as well as help evaluate the effects of potential corrective action. The USGS model simulates steady-state and transient conditions of ground-water flow within a single model layer. The alluvial aquifer is within fluvial terrace deposits of Pleistocene age, which unconformably overlie the relatively impermeable Eagle Ford Shale of Late Cretaceous age. Over small distances and short periods, finer grained parts of the aquifer are separated hydraulically; however, most of the aquifer is connected circuitously through randomly distributed coarser grained sediments. The top of the underlying Eagle Ford Shale, a regional confining unit, is assumed to be the effective lower limit of ground-water circulation and chemical contamination. The calibrated steady-state model reproduces long-term average water levels within +5.1 or –3.5 feet of those observed; the standard error of the estimate is 1.07 feet with a mean residual of 0.02 foot. Hydraulic conductivity values range from 0.75 to 7.5 feet per day, and average about 4 feet per day. Specific yield values range from 0
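    The calibration statistics quoted above (a mean residual and a standard error of the estimate, derived from paired observed and simulated water levels) can be sketched in a few lines. The head values below are hypothetical, for illustration only, and are not the NWIRP data.

```python
# Illustrative sketch (not the USGS model itself): computing a mean residual
# and the standard error of the estimate from paired observed and simulated
# water levels, as reported in the calibration summary above.
import math

# Hypothetical observed vs. simulated heads, in feet (illustration only).
observed  = [431.2, 428.7, 430.1, 427.9, 429.4]
simulated = [430.9, 429.1, 429.8, 428.3, 429.5]

residuals = [obs - sim for obs, sim in zip(observed, simulated)]
mean_residual = sum(residuals) / len(residuals)
# Standard error of the estimate: root-mean-square of the residuals.
std_error = math.sqrt(sum(r * r for r in residuals) / len(residuals))

print(f"mean residual:  {mean_residual:+.2f} ft")
print(f"standard error: {std_error:.2f} ft")
```

A small mean residual with a larger standard error, as in the abstract (0.02 foot versus 1.07 feet), indicates errors that are individually noticeable but largely unbiased.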

  4. The HEPCloud Facility: elastic computing for High Energy Physics – The NOvA Use Case

    Energy Technology Data Exchange (ETDEWEB)

    Fuess, S. [Fermilab; Garzoglio, G. [Fermilab; Holzman, B. [Fermilab; Kennedy, R. [Fermilab; Norman, A. [Fermilab; Timm, S. [Fermilab; Tiradani, A. [Fermilab

    2017-03-15

    The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdowns, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial Clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project has demonstrated the use of the “elastic” provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores - a 38 percent increase - in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the local allocated resources and utilizing the local AWS S3 storage to optimize data handling operations and costs. NOvA was using the same familiar services used for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper

  5. The HEPCloud Facility: elastic computing for High Energy Physics - The NOvA Use Case

    Science.gov (United States)

    Fuess, S.; Garzoglio, G.; Holzman, B.; Kennedy, R.; Norman, A.; Timm, S.; Tiradani, A.

    2017-10-01

    The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdowns, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial Clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project has demonstrated the use of the “elastic” provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores - a 38 percent increase - in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the local allocated resources and utilizing the local AWS S3 storage to optimize data handling operations and costs. NOvA was using the same familiar services used for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper
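    The cost-driven provisioning idea described in these two records can be illustrated with a minimal sketch: given a core deficit, fill it from the cheapest available resource pools first. This is not HEPCloud's actual Decision Engine; the pool names, core counts, and per-core prices below are assumptions chosen only to mirror the figures in the abstract.

```python
# A minimal, hypothetical sketch of cost-minimizing elastic provisioning:
# satisfy a request for cores greedily from the cheapest pools first.

# Hypothetical pools: (name, available cores, price per core-hour in USD).
pools = [
    ("local_cluster", 150_000, 0.000),  # already-owned cores: no marginal cost
    ("aws_spot",       58_000, 0.012),  # spot-market instances, preemptible
    ("aws_on_demand", 100_000, 0.048),
]

def provision(cores_needed):
    """Greedily allocate cores from the cheapest pools; return the plan."""
    plan, remaining = [], cores_needed
    for name, avail, price in sorted(pools, key=lambda p: p[2]):
        take = min(avail, remaining)
        if take:
            plan.append((name, take, take * price))
            remaining -= take
        if remaining == 0:
            break
    return plan

for name, cores, cost in provision(200_000):
    print(f"{name}: {cores} cores, ${cost:.2f}/hour")
```

A real decision engine would also weigh preemption risk and job-interruption cost against the spot-market price, not just the price itself.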

  6. ENVIRONMENTAL RESEARCH BRIEF : ANALYTIC ELEMENT MODELING OF GROUND-WATER FLOW AND HIGH PERFORMANCE COMPUTING

    Science.gov (United States)

    Several advances in the analytic element method have been made to enhance its performance and facilitate three-dimensional ground-water flow modeling in a regional aquifer setting. First, a new public domain modular code (ModAEM) has been developed for modeling ground-water flow ...

  7. CSNI Integral Test Facility Matrices for Validation of Best-Estimate Thermal-Hydraulic Computer Codes

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Internationally agreed Integral Test Facility (ITF) matrices for validation of realistic thermal hydraulic system computer codes were established. ITF development is mainly for Pressurised Water Reactors (PWRs) and Boiling Water Reactors (BWRs). A separate activity was for Russian Pressurised Water-cooled and Water-moderated Energy Reactors (WWER). Firstly, the main physical phenomena that occur during considered accidents are identified, test types are specified, and test facilities suitable for reproducing these aspects are selected. Secondly, a list of selected experiments carried out in these facilities has been set down. The criteria to achieve the objectives are outlined. In this paper some specific examples from the ITF matrices will also be provided. The matrices will be a guide for code validation, will be a basis for comparisons of code predictions performed with different system codes, and will contribute to the quantification of the uncertainty range of code model predictions. In addition to this objective, the construction of such a matrix is an attempt to record information which has been generated around the world over the last years, so that it is more accessible to present and future workers in that field than would otherwise be the case.

  8. Enhanced computational infrastructure for data analysis at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Schissel, D.P.; Peng, Q.; Schachter, J.; Terpstra, T.B.; Casper, T.A.; Freeman, J.; Jong, R.; Keith, K.M.; McHarg, B.B.; Meyer, W.H.; Parker, C.T.

    2000-01-01

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available in between pulses giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator since the DIII-D National Team consists of scientists from nine national laboratories, 19 foreign laboratories, 16 universities, and five industrial partnerships. As a result of this work, DIII-D data is available on a 24x7 basis from a set of viewing and analysis tools that can be run on either the collaborators' or DIII-D's computer systems. Additionally, a web based data and code documentation system has been created to aid the novice and expert user alike

  9. Enhanced Computational Infrastructure for Data Analysis at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Schissel, D.P.; Peng, Q.; Schachter, J.; Terpstra, T.B.; Casper, T.A.; Freeman, J.; Jong, R.; Keith, K.M.; Meyer, W.H.; Parker, C.T.; McCharg, B.B.

    1999-01-01

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available in between pulses giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator since the DIII-D National Team consists of scientists from 9 national laboratories, 19 foreign laboratories, 16 universities, and 5 industrial partnerships. As a result of this work, DIII-D data is available on a 24 x 7 basis from a set of viewing and analysis tools that can be run either on the collaborators' or DIII-D's computer systems. Additionally, a Web-based data and code documentation system has been created to aid the novice and expert user alike

  10. Investigation of Storage Options for Scientific Computing on Grid and Cloud Facilities

    International Nuclear Information System (INIS)

    Garzoglio, Gabriele

    2012-01-01

    In recent years, several new storage technologies, such as Lustre, Hadoop, OrangeFS, and BlueArc, have emerged. While several groups have run benchmarks to characterize them under a variety of configurations, more work is needed to evaluate these technologies for the use cases of scientific computing on Grid clusters and Cloud facilities. This paper discusses our evaluation of the technologies as deployed on a test bed at FermiCloud, one of the Fermilab infrastructure-as-a-service Cloud facilities. The test bed consists of 4 server-class nodes with 40 TB of disk space and up to 50 virtual machine clients, some running on the storage server nodes themselves. With this configuration, the evaluation compares the performance of some of these technologies when deployed on virtual machines and on “bare metal” nodes. In addition to running standard benchmarks such as IOZone to check the sanity of our installation, we have run I/O intensive tests using physics-analysis applications. This paper presents how the storage solutions perform in a variety of realistic use cases of scientific computing. One interesting difference among the storage systems tested is found in a decrease in total read throughput with increasing number of client processes, which occurs in some implementations but not others.
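    The scaling behaviour noted at the end of this abstract, total read throughput versus the number of concurrent clients, is the kind of measurement such an evaluation produces. A minimal local sketch follows; the real tests used IOZone and physics-analysis applications against networked storage, whereas this toy version just re-reads a local scratch file from a thread pool.

```python
# A minimal sketch of a read-throughput scaling measurement: aggregate MB/s
# as the number of concurrent readers grows. Local-file toy version only.
from concurrent.futures import ThreadPoolExecutor
import os
import tempfile
import time

def read_file(path):
    """Sequentially read the whole file in 1 MiB chunks."""
    with open(path, "rb") as f:
        while f.read(1 << 20):
            pass

def throughput_mb_s(path, n_clients):
    """Aggregate MB/s when n_clients read the file concurrently."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=n_clients) as pool:
        list(pool.map(read_file, [path] * n_clients))
    elapsed = time.perf_counter() - start
    return os.path.getsize(path) * n_clients / elapsed / 1e6

if __name__ == "__main__":
    with tempfile.NamedTemporaryFile(delete=False) as f:
        f.write(os.urandom(4 << 20))          # 4 MiB scratch file
        scratch = f.name
    try:
        for n in (1, 2, 4, 8):
            print(f"{n:2d} clients: {throughput_mb_s(scratch, n):8.1f} MB/s")
    finally:
        os.unlink(scratch)
```

On a real distributed file system, plotting this curve per storage backend is what reveals the decrease in total read throughput with client count that the abstract describes.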

  11. FIRAC - a computer code to predict fire accident effects in nuclear facilities

    International Nuclear Information System (INIS)

    Bolstad, J.W.; Foster, R.D.; Gregory, W.S.

    1983-01-01

    FIRAC is a medium-sized computer code designed to predict fire-induced flows, temperatures, and material transport within the ventilating systems and other airflow pathways in nuclear-related facilities. The code is designed to analyze the behavior of interconnected networks of rooms and typical ventilation system components. This code is one in a family of computer codes that is designed to provide improved methods of safety analysis for the nuclear industry. The structure of this code closely follows that of the previously developed TVENT and EVENT codes. Because a lumped-parameter formulation is used, this code is particularly suitable for calculating the effects of fires in the far field (that is, in regions removed from the fire compartment), where the fire may be represented parametrically. However, a fire compartment model to simulate conditions in the enclosure is included. This model provides transport source terms to the ventilation system that can affect its operation and in turn affect the fire. A basic material transport capability that features the effects of convection, deposition, entrainment, and filtration of material is included. The interrelated effects of filter plugging, heat transfer, gas dynamics, and material transport are taken into account. In this paper the authors summarize the physical models used to describe the gas dynamics, material transport, and heat transfer processes. They also illustrate how a typical facility is modeled using the code
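    The lumped-parameter formulation both FIRAC records describe can be illustrated with a toy ventilation network in which each branch obeys a quadratic resistance law, dP = R * Q**2. This is a didactic sketch, not FIRAC itself; the branch names and resistance values are assumed.

```python
# Toy lumped-parameter ventilation network: three branches in series, each
# with a quadratic resistance law dP = R * Q**2, driven by a fixed total
# pressure difference. Branch resistances are assumed values.
import math

R_BRANCHES = {"inlet_duct": 2.0, "filter": 8.0, "outlet_duct": 2.0}

def series_flow(dp_total):
    """Flow through branches in series: dp_total = (sum of R_i) * Q**2."""
    return math.sqrt(dp_total / sum(R_BRANCHES.values()))

def branch_drops(dp_total):
    """Pressure drop across each branch at the series flow rate."""
    q = series_flow(dp_total)
    return {name: r * q * q for name, r in R_BRANCHES.items()}

q = series_flow(120.0)
print(f"flow: {q:.3f}")
for name, dp in branch_drops(120.0).items():
    print(f"{name}: dP = {dp:.1f}")
```

In a full code the filter resistance would grow with deposited material (filter plugging), coupling the material-transport and gas-dynamics solutions as the abstract notes.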

  12. Application of personal computer to development of entrance management system for radiating facilities

    International Nuclear Information System (INIS)

    Suzuki, Shogo; Hirai, Shouji

    1989-01-01

    The report describes a system for managing the entrance and exit of personnel to radiating facilities. A personal computer is applied to its development. Major features of the system are outlined first. The computer is connected to the gate and two magnetic card readers provided at the gate. The gate, which is installed at the entrance to a room under control, opens only for those who have a valid card. The entrance-exit management program developed is described next. The following three files are used: ID master file (random file of the magnetic card number, name, qualification, etc., of each card carrier), entrance-exit management file (random file of time of entrance/exit, etc., updated every day), and entrance-exit record file (sequential file of card number, name, date, etc.), which are stored on floppy disks. A display is provided to show various lists, including a list of workers currently in the room and a list of workers who left the room earlier in the day. This system is useful for entrance management of a relatively small facility. Though small in cost, it enables effective personnel management with only a few operators. (N.K.)
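As an illustration of the record-keeping scheme described (an ID master, a current-occupancy table, and a sequential record log), here is a minimal in-memory sketch; the card numbers, names, and the `swipe` helper are invented for the example and do not reflect the original implementation.

```python
from datetime import datetime

# Hypothetical in-memory analogue of the report's three files.
id_master = {"0001": "A. Suzuki", "0002": "S. Hirai"}  # card no. -> name
in_room = {}    # card no. -> time of entry (the "management file")
records = []    # (card no., name, event, timestamp)  (the "record file")

def swipe(card, now=None):
    """Toggle a card between entry and exit; reject unknown cards."""
    if card not in id_master:
        return False                      # gate stays closed
    now = now or datetime.now()
    if card in in_room:
        del in_room[card]
        records.append((card, id_master[card], "exit", now))
    else:
        in_room[card] = now
        records.append((card, id_master[card], "entry", now))
    return True                           # gate opens

swipe("0001"); swipe("0002"); swipe("0001")
currently_inside = [id_master[c] for c in in_room]
```

After the three swipes above, `currently_inside` lists only the second card holder, mirroring the "workers currently in the room" display.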

  13. ARADISH - Development of a Standardized Plant Growth Chamber for Experiments in Gravitational Biology Using Ground Based Facilities

    Science.gov (United States)

    Schüler, Oliver; Krause, Lars; Görög, Mark; Hauslage, Jens; Kesseler, Leona; Böhmer, Maik; Hemmersbach, Ruth

    2016-06-01

    Plant development strongly relies on environmental conditions. Growth of plants in Biological Life Support Systems (BLSS), which are a necessity for human survival during long-term space exploration missions, poses a particular problem, as in addition to the traditional environmental factors, microgravity (or reduced gravity, such as on the Moon or Mars) and limited gas exchange hamper plant growth. Studying the effects of reduced gravity on plants requires real or simulated microgravity experiments under highly standardized conditions, in order to avoid the influence of other environmental factors. Analysis of a large number of biological replicates, which is necessary for the detection of subtle phenotypical differences, can so far only be achieved in Ground Based Facilities (GBF). Besides differing experimental conditions, the usage of a variety of different plant growth chambers was a major factor that led to a lack of reproducibility and comparability in previous studies. We have developed a flexible and customizable plant growth chamber, called ARAbidopsis DISH (ARADISH), which allows plant growth from seed to seedling, realized either in a hydroponic system or on agar. By developing a special holder, the ARADISH can be used for experiments with Arabidopsis thaliana, or a plant with a similar habitus, on common GBF hardware, including 2D clinostats and Random Positioning Machines (RPM). The ARADISH growth chamber has a controlled illumination system of red and blue light-emitting diodes (LED), which allows the user to apply defined light conditions. As a proof of concept, we tested a prototype in a proteomic experiment in which plants were exposed to simulated microgravity or a 90° stimulus. We optimized the design and performed viability tests after several days of growth in the hardware that underline the utility of ARADISH in microgravity research.

  14. Neurons forming optic glomeruli compute figure-ground discriminations in Drosophila.

    Science.gov (United States)

    Aptekar, Jacob W; Keleş, Mehmet F; Lu, Patrick M; Zolotova, Nadezhda M; Frye, Mark A

    2015-05-13

    Many animals rely on visual figure-ground discrimination to aid in navigation and to draw attention to salient features like conspecifics or predators. Even figures that are similar in pattern and luminance to the visual surroundings can be distinguished by the optical disparity generated by their relative motion against the ground, and yet the neural mechanisms underlying these visual discriminations are not well understood. We show in flies that a diverse array of figure-ground stimuli containing a motion-defined edge elicit statistically similar behavioral responses to one another, and statistically distinct behavioral responses from ground motion alone. From studies in larger flies and other insect species, we hypothesized that the circuitry of the lobula, one of the four primary neuropiles of the fly optic lobe, performs this visual discrimination. Using calcium imaging of input dendrites, we then show that information encoded in cells projecting from the lobula to discrete optic glomeruli in the central brain groups these sets of figure-ground stimuli in a manner homologous to the behavior; "figure-like" stimuli are coded similar to one another and "ground-like" stimuli are encoded differently. One cell class responds to the leading edge of a figure and is suppressed by ground motion. Two other classes cluster any figure-like stimuli, including a figure moving opposite the ground, distinctly from ground alone. This evidence demonstrates that lobula outputs provide a diverse basis set encoding visual features necessary for figure detection.

  15. Computer simulation and implementation of defected ground structure on a microstrip antenna

    Science.gov (United States)

    Adrian, H.; Rambe, A. H.; Suherman

    2018-03-01

    Defected Ground Structure (DGS) is a method of etching away part of the ground plane of an antenna to form a desirable ground-field shape. This paper reports the impact of the method on microstrip antennas working at 1800 and 2400 MHz. These frequencies are important, as many radio network applications, such as mobile phones and wireless devices, work on these channels. The assessments were performed by simulating and fabricating the evaluated antennas. Both simulation data and implementation measurements show that DGS successfully improves antenna performance, increasing bandwidth by up to 19%, reducing return loss by up to 109%, and increasing gain by up to 33%.
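For readers unfamiliar with the figures of merit quoted above, return loss follows from the reflection coefficient magnitude as RL = -20·log10|S11|, and the percentage improvements are simple relative changes. The snippet below is a generic illustration with invented before/after values, not data from the paper.

```python
import math

def return_loss_db(s11_mag):
    # Return loss in dB from the linear magnitude of S11: RL = -20*log10|S11|.
    return -20.0 * math.log10(s11_mag)

def pct_improvement(before, after):
    # Relative change, in percent, between a before and after value.
    return 100.0 * (after - before) / before

rl = return_loss_db(0.1)                  # |S11| = 0.1 -> 20 dB return loss
bw_gain = pct_improvement(120.0, 142.8)   # hypothetical bandwidths in MHz
```

With the invented bandwidths above, `bw_gain` comes out at 19%, the kind of improvement the abstract reports for DGS.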

  16. GASFLOW: A computational model to analyze accidents in nuclear containment and facility buildings

    International Nuclear Information System (INIS)

    Travis, J.R.; Nichols, B.D.; Wilson, T.L.; Lam, K.L.; Spore, J.W.; Niederauer, G.F.

    1993-01-01

    GASFLOW is a finite-volume computer code that solves the time-dependent, compressible Navier-Stokes equations for multiple gas species. The fluid-dynamics algorithm is coupled to the chemical kinetics of combusting liquids or gases to simulate diffusion or propagating flames in complex geometries of nuclear containment or confinement and facilities' buildings. Fluid turbulence is calculated to enhance the transport and mixing of gases in rooms and volumes that may be connected by a ventilation system. The ventilation system may consist of extensive ductwork, filters, dampers or valves, and fans. Condensation and heat transfer to walls, floors, ceilings, and internal structures are calculated to model the appropriate energy sinks. Solid and liquid aerosol behavior is simulated to give the time and space inventory of radionuclides. The solution procedure of the governing equations is a modified Los Alamos ICE'd-ALE methodology. Complex facilities can be represented by separate computational domains (multiblocks) that communicate through overlapping boundary conditions. The ventilation system is superimposed throughout the multiblock mesh. Gas mixtures and aerosols are transported through the free three-dimensional volumes and the restricted one-dimensional ventilation components as the accident and fluid flow fields evolve. Combustion may occur if sufficient fuel and reactant or oxidizer are present and have an ignition source. Pressure and thermal loads on the building, structural components, and safety-related equipment can be determined for specific accident scenarios. GASFLOW calculations have been compared with large oil-pool fire tests in the 1986 HDR containment test T52.14, which is a 3000-kW fire experiment. The computed results are in good agreement with the observed data

  17. PRECONDITIONED CONJUGATE-GRADIENT 2 (PCG2), a computer program for solving ground-water flow equations

    Science.gov (United States)

    Hill, Mary C.

    1990-01-01

    This report documents PCG2: a numerical code to be used with the U.S. Geological Survey modular three-dimensional, finite-difference, ground-water flow model. PCG2 uses the preconditioned conjugate-gradient method to solve the equations produced by the model for hydraulic head. Linear or nonlinear flow conditions may be simulated. PCG2 includes two preconditioning options: modified incomplete Cholesky preconditioning, which is efficient on scalar computers; and polynomial preconditioning, which requires less computer storage and, with modifications that depend on the computer used, is most efficient on vector computers. Convergence of the solver is determined using both head-change and residual criteria. Nonlinear problems are solved using Picard iterations. This documentation provides a description of the preconditioned conjugate-gradient method and the two preconditioners, detailed instructions for linking PCG2 to the modular model, sample data inputs, a brief description of PCG2, and a FORTRAN listing.
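The preconditioned conjugate-gradient iteration PCG2 implements can be sketched compactly. The toy below uses a diagonal (Jacobi) preconditioner for brevity rather than the modified incomplete Cholesky or polynomial options the report documents, and a tiny tridiagonal system standing in for a 1-D flow problem.

```python
def pcg(A, b, tol=1e-10, max_iter=100):
    """Jacobi-preconditioned conjugate gradients for a small dense SPD A.

    Diagonal scaling is shown only as the simplest stand-in for PCG2's
    actual preconditioners.
    """
    n = len(b)
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    x = [0.0] * n
    r = b[:]                                   # residual for x = 0
    z = [r[i] / A[i][i] for i in range(n)]     # apply M^-1 = diag(A)^-1
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if max(abs(ri) for ri in r) < tol:     # residual criterion
            break
        z = [r[i] / A[i][i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x

# A 1-D [-1, 2, -1] stencil standing in for a ground-water flow matrix.
A = [[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]]
b = [1.0, 0.0, 1.0]
x = pcg(A, b)
```

For this system the exact solution is x = (1, 1, 1), which CG reaches in at most three iterations.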

  18. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies, and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and probabilistic seismic hazard. Using the PSHA computer program, the Cumulative Probability Density Functions (CPDF) and Probability Density Functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared to the results from the different input parameter spaces.
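Annual exceedance probabilities of the kind tabulated in a PSHA are commonly converted to multi-year probabilities with a Poisson assumption, P = 1 - exp(-λt). The snippet below illustrates this standard relationship only; it is not drawn from the report's code.

```python
import math

def exceedance_prob(annual_rate, years):
    # Poisson model: P(at least one exceedance in t years) = 1 - exp(-lambda*t).
    return 1.0 - math.exp(-annual_rate * years)

# The conventional design level "10% probability of exceedance in 50 years"
# corresponds to a mean return period of about 475 years.
p50 = exceedance_prob(1.0 / 475.0, 50)
```

Evaluating the expression gives p50 ≈ 0.10, recovering the familiar 475-year return period.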

  19. The Overview of the National Ignition Facility Distributed Computer Control System

    International Nuclear Information System (INIS)

    Lagin, L.J.; Bettenhausen, R.C.; Carey, R.A.; Estes, C.M.; Fisher, J.M.; Krammen, J.E.; Reed, R.K.; VanArsdall, P.J.; Woodruff, J.P.

    2001-01-01

    The Integrated Computer Control System (ICCS) for the National Ignition Facility (NIF) is a layered architecture of 300 front-end processors (FEP) coordinated by supervisor subsystems including automatic beam alignment and wavefront control, laser and target diagnostics, pulse power, and shot control timed to 30 ps. FEP computers incorporate either VxWorks on PowerPC or Solaris on UltraSPARC processors that interface to over 45,000 control points attached to VME-bus or PCI-bus crates, respectively. Typical devices are stepping motors, transient digitizers, calorimeters, and photodiodes. The front-end layer includes a further segment comprising an additional 14,000 control points for industrial controls, including vacuum, argon, synthetic air, and safety interlocks, implemented with Allen-Bradley programmable logic controllers (PLCs). The computer network is augmented by an asynchronous transfer mode (ATM) network that delivers video streams from 500 sensor cameras monitoring the 192 laser beams to operator workstations. Software is based on an object-oriented framework using CORBA distribution that incorporates services for archiving, machine configuration, graphical user interface, monitoring, event logging, scripting, alert management, and access control. Software coding, using a mixed-language environment of Ada95 and Java, is one-third complete at over 300 thousand source lines. Control system installation is currently under way for the first 8 beams, with project completion scheduled for 2008.

  20. Estimation of natural ground water recharge for the performance assessment of a low-level waste disposal facility at the Hanford Site

    International Nuclear Information System (INIS)

    Rockhold, M.L.; Fayer, M.J.; Kincaid, C.T.; Gee, G.W.

    1995-03-01

    In 1994, the Pacific Northwest Laboratory (PNL) initiated the Recharge Task, under the PNL Vitrification Technology Development (PVTD) project, to assist Westinghouse Hanford Company (WHC) in designing and assessing the performance of a low-level waste (LLW) disposal facility for the US Department of Energy (DOE). The Recharge Task was established to address the issue of ground water recharge in and around the LLW facility and throughout the Hanford Site as it affects the unconfined aquifer under the facility. The objectives of this report are to summarize the current knowledge of natural ground water recharge at the Hanford Site and to outline the work that must be completed in order to provide defensible estimates of recharge for use in the performance assessment of this LLW disposal facility. Recharge studies at the Hanford Site indicate that recharge rates are highly variable, ranging from nearly zero to greater than 100 mm/yr depending on precipitation, vegetative cover, and soil types. Coarse-textured soils without plants yielded the greatest recharge. Finer-textured soils, with or without plants, yielded the least. Lysimeters provided accurate, short-term measurements of recharge as well as water-balance data for the soil-atmosphere interface and root zone. Tracers provided estimates of longer-term average recharge rates in undisturbed settings. Numerical models demonstrated the sensitivity of recharge rates to different processes and forecast recharge rates for different conditions. All of these tools (lysimetry, tracers, and numerical models) are considered vital to the development of defensible estimates of natural ground water recharge rates for the performance assessment of a LLW disposal facility at the Hanford Site

  1. The FOSS GIS Workbench on the GFZ Load Sharing Facility compute cluster

    Science.gov (United States)

    Löwe, P.; Klump, J.; Thaler, J.

    2012-04-01

    Compute clusters can be used as GIS workbenches; their wealth of resources allows us to take on geocomputation tasks which exceed the limitations of smaller systems. Harnessing these capabilities requires a Geographic Information System (GIS) able to utilize the available cluster configuration/architecture, with a sufficient degree of user friendliness to allow for wide application. In this paper we report on the first successful porting of GRASS GIS, the oldest and largest Free Open Source (FOSS) GIS project, onto a compute cluster using Platform Computing's Load Sharing Facility (LSF). In 2008, GRASS 6.3 was installed on the GFZ compute cluster, which at that time comprised 32 nodes. Interaction with the GIS was limited to the command line interface, which required further development to encapsulate the GRASS GIS business layer and facilitate its use by users not familiar with GRASS GIS. During the summer of 2011, multiple versions of GRASS GIS (v 6.4, 6.5 and 7.0) were installed on the upgraded GFZ compute cluster, now consisting of 234 nodes with 480 CPUs providing 3084 cores. The GFZ compute cluster currently offers 19 different processing queues with varying hardware capabilities and priorities, allowing for fine-grained scheduling and load balancing. After successful testing of core GIS functionalities, including the graphical user interface, mechanisms were developed to deploy scripted geocomputation tasks onto dedicated processing queues. The mechanisms are based on earlier work by NETELER et al. (2008). A first application of the new GIS functionality was the generation of maps of simulated tsunamis in the Mediterranean Sea for the Tsunami Atlas of the FP-7 TRIDEC Project (www.tridec-online.eu). For this, up to 500 processing nodes were used in parallel. Further trials included the processing of geometrically complex problems, requiring significant amounts of processing time. The GIS cluster successfully completed all these tasks, with processing times

  2. [Elderlies in street situation or social vulnerability: facilities and difficulties in the use of computational tools].

    Science.gov (United States)

    Frias, Marcos Antonio da Eira; Peres, Heloisa Helena Ciqueto; Pereira, Valclei Aparecida Gandolpho; Negreiros, Maria Célia de; Paranhos, Wana Yeda; Leite, Maria Madalena Januário

    2014-01-01

    This study aimed to identify the facilities and difficulties encountered by older people living on the street or in social vulnerability in using computers and the internet. It is an exploratory qualitative study in which five elderly people attended by a non-governmental organization located in the city of São Paulo participated. The discourses were analyzed by the content analysis technique and showed, as facilities, among others, clarifying doubts with the monitors, the stimulus for new discoveries coupled with proactivity and curiosity, and developing new skills. The difficulties mentioned were related to physical or cognitive issues, lack of an instructor, and lack of knowledge of how to interact with the machine. Studies focusing on the elderly population living on the street or in social vulnerability may contribute evidence to guide the formulation of public policies for this population.

  3. Development of a personal computer based facility-level SSAC component and inspector support system

    International Nuclear Information System (INIS)

    Markov, A.

    1989-08-01

    Research Contract No. 4658/RB was conducted between the IAEA and the Bulgarian Committee on the Use of Atomic Energy for Peaceful Purposes. The contract required the Committee to develop and program a personal-computer-based software package to be used as a facility-level computerized State System of Accounting and Control (SSAC) at an off-load power reactor. The software delivered, called the National Safeguards System (NSS), keeps track of all fuel assembly activity at a power reactor and generates all ledgers, MBA material balances, and any required reports to national or international authorities. The NSS is designed to operate on a PC/AT or compatible equipment with a hard disk of 20 MB, a color graphics monitor or adapter, and at least one floppy disk drive (360 Kb). The programs are written in Basic (compiler 2.0) and are executed under MS DOS 3.1 or later.

  4. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality-affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  5. Lustre Distributed Name Space (DNE) Evaluation at the Oak Ridge Leadership Computing Facility (OLCF)

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, James S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Leverman, Dustin B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Hanley, Jesse A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Oral, Sarp [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences

    2016-08-22

    This document describes the Lustre Distributed Name Space (DNE) evaluation carried out at the Oak Ridge Leadership Computing Facility (OLCF) between 2014 and 2015. DNE is a development project funded by OpenSFS to improve Lustre metadata performance and scalability. The development effort was split into two parts: the first part (DNE P1) provided support for remote directories over remote Lustre Metadata Server (MDS) nodes and Metadata Target (MDT) devices, while the second phase (DNE P2) addressed split directories over multiple remote MDS nodes and MDT devices. The OLCF has been actively evaluating the performance, reliability, and functionality of both DNE phases. For these tests, an internal OLCF testbed was used. Results are promising, and OLCF is planning on a full DNE deployment in the mid-2016 timeframe on production systems.

  6. MONITOR: A computer model for estimating the costs of an integral monitored retrievable storage facility

    International Nuclear Information System (INIS)

    Reimus, P.W.; Sevigny, N.L.; Schutz, M.E.; Heller, R.A.

    1986-12-01

    The MONITOR model is a FORTRAN 77 based computer code that provides parametric life-cycle cost estimates for a monitored retrievable storage (MRS) facility. MONITOR is very flexible in that it can estimate the costs of an MRS facility operating under almost any conceivable nuclear waste logistics scenario. The model can also accommodate input data of varying degrees of complexity and detail (ranging from very simple to more complex), which makes it ideal for use in the MRS program, where new designs and new cost data are frequently offered for consideration. MONITOR can be run as an independent program, or it can be interfaced with the Waste System Transportation and Economic Simulation (WASTES) model, a program that simulates the movement of waste through a complete nuclear waste disposal system. The WASTES model drives the MONITOR model by providing it with the annual quantities of waste that are received, stored, and shipped at the MRS facility. Three runs of MONITOR are documented in this report. Two of the runs are for Version 1 of the MONITOR code, a simulation that uses the costs developed by the Ralph M. Parsons Company in the 2A (backup) version of the MRS cost estimate. In one of these runs MONITOR was run as an independent model, and in the other run MONITOR was run using an input file generated by the WASTES model. The two runs correspond to identical cases, and the fact that they gave identical results verified that the code performed the same calculations in both modes of operation. The third run was made for Version 2 of the MONITOR code, a simulation that uses the costs developed by the Ralph M. Parsons Company in the 2B (integral) version of the MRS cost estimate. This run was made with MONITOR being run as an independent model. The results of several cases have been verified by hand calculations.
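A parametric life-cycle cost model of the kind MONITOR implements can be caricatured in a few lines: a capital outlay plus annual charges driven by the yearly receipt, storage, and shipment quantities (the WASTES-style inputs). All cost coefficients below are invented for illustration and bear no relation to the Parsons estimates.

```python
# Hypothetical cost coefficients (NOT from any actual MRS cost estimate).
CAPITAL = 500.0     # M$, one-time construction cost
FIXED_OM = 20.0     # M$/yr fixed operations and maintenance
HANDLING = 0.05     # M$ per MTU received or shipped
STORAGE = 0.01      # M$ per MTU-year held in storage

def life_cycle_cost(received, shipped):
    """received/shipped: per-year MTU quantities (lists of equal length)."""
    cost, inventory = CAPITAL, 0.0
    for r, s in zip(received, shipped):
        inventory += r - s                            # running stored amount
        cost += FIXED_OM + HANDLING * (r + s) + STORAGE * inventory
    return cost

# Three operating years: receipts ramp down while shipments ramp up.
total = life_cycle_cost(received=[300, 300, 0], shipped=[0, 100, 200])
```

Driving the same function from a logistics simulation's yearly output is exactly the WASTES-to-MONITOR coupling the abstract describes.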

  7. Computational Simulations of the NASA Langley HyMETS Arc-Jet Facility

    Science.gov (United States)

    Brune, A. J.; Bruce, W. E., III; Glass, D. E.; Splinter, S. C.

    2017-01-01

    The Hypersonic Materials Environmental Test System (HyMETS) arc-jet facility, located at the NASA Langley Research Center in Hampton, Virginia, is primarily used for the research, development, and evaluation of high-temperature thermal protection systems for hypersonic vehicles and reentry systems. In order to improve testing capabilities and knowledge of the test article environment, an effort is underway to computationally simulate the flow-field using computational fluid dynamics (CFD). A detailed three-dimensional model of the arc-jet nozzle and free-jet portion of the flow-field has been developed and compared to calibration-probe Pitot pressure and stagnation-point heat flux for three test conditions at low, medium, and high enthalpy. The CFD model takes into account uniform pressure and non-uniform enthalpy profiles at the nozzle inlet as well as catalytic recombination efficiency effects at the probe surface. Comparing the CFD results and test data indicates an effective catalytic recombination efficiency of about 10% at the copper surface of the heat flux probe and a 2-3 kPa pressure drop from the arc heater bore, where the pressure is measured, to the plenum section prior to the nozzle. With these assumptions, the CFD results are well within the uncertainty of the stagnation pressure and heat flux measurements. The conditions at the nozzle exit were also compared with radial and axial velocimetry. This simulation capability will be used to evaluate various three-dimensional models that are tested in the HyMETS facility. An end-to-end aerothermal and thermal simulation of HyMETS test articles will follow this work to provide a better understanding of the test environment and test results, and to aid in test planning. Additional flow-field diagnostic measurements will also be considered to improve the modeling capability.

  8. The impact of CFD on development test facilities - A National Research Council projection. [computational fluid dynamics

    Science.gov (United States)

    Korkegi, R. H.

    1983-01-01

    The results of a National Research Council study on the effect that advances in computational fluid dynamics (CFD) will have on conventional aeronautical ground testing are reported. Current CFD capabilities include the depiction of linearized inviscid flows and a boundary layer, initial use of Euler codes on supercomputers with automatic grid generation, research and development on the Reynolds-averaged Navier-Stokes (N-S) equations, and preliminary research on solutions to the full N-S equations. Improvement in the range of CFD usage is dependent on the development of more powerful supercomputers, exceeding even the projected abilities of the NASA Numerical Aerodynamic Simulator (1 BFLOP/sec). Full representation of the Reynolds-averaged N-S equations will require over one million grid points, a computing level predicted to be available in 15 yr. Present capabilities allow identification of data anomalies, confirmation of data accuracy, and assessment of the adequacy of model design in wind tunnel trials. Account can be taken of wall effects and of the Reynolds number in any flight regime during simulation. CFD can actually be more accurate than instrumented tests, since all points in a flow can be modeled with CFD, while they cannot all be monitored with instrumentation in a wind tunnel.

  9. Software-based risk stratification of pulmonary adenocarcinomas manifesting as pure ground glass nodules on computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nemec, Ursula [Vienna General Hospital, Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Vienna (Austria); Heidinger, Benedikt H.; Bankier, Alexander A. [Harvard Medical School, Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States); Anderson, Kevin R.; VanderLaan, Paul A. [Harvard Medical School, Pathology, Beth Israel Deaconess Medical Center, Boston, MA (United States); Westmore, Michael S. [Imbio, Delafield, WI (United States)

    2018-01-15

    To assess the performance of the "Computer-Aided Nodule Assessment and Risk Yield" (CANARY) software in the differentiation and risk assessment of histological subtypes of lung adenocarcinomas manifesting as pure ground glass nodules on computed tomography (CT). 64 surgically resected and histologically proven adenocarcinomas manifesting as pure ground-glass nodules on CT were assessed using CANARY software, which classifies voxel densities into three risk components (low, intermediate, and high risk). Differences in risk components between histological adenocarcinoma subtypes were analysed. To determine the optimal threshold reflecting the presence of an invasive focus, sensitivity, specificity, negative predictive value, and positive predictive value were calculated. 28/64 (44%) were adenocarcinomas in situ (AIS); 26/64 (41%) were minimally invasive adenocarcinomas (MIA); and 10/64 (16%) were invasive ACs (IAC). The software showed significant differences in risk components between histological subtypes (P<0.001-0.003). A relative volume of 45% or less of low-risk components was associated with histological invasiveness (specificity 100%, positive predictive value 100%). CANARY-based risk assessment of ACs manifesting as pure ground glass nodules on CT allows the differentiation of their histological subtypes. A threshold of 45% of low-risk components reflects invasiveness in these groups. (orig.)
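The paper's decision rule, invasiveness flagged when low-risk components occupy 45% or less of the nodule volume, can be stated directly in code. The function below is an illustrative restatement of that reported threshold, not part of the CANARY software, and its inputs (the three risk-component volume percentages) are hypothetical.

```python
def canary_risk_call(low_pct, intermediate_pct, high_pct):
    """Flag probable invasiveness from CANARY-style risk-component volumes.

    Applies the reported rule: low-risk volume <= 45% suggests an
    invasive focus; the three percentages must sum to 100.
    """
    assert abs(low_pct + intermediate_pct + high_pct - 100.0) < 1e-6
    return "invasive focus likely" if low_pct <= 45.0 else "likely AIS"

call_a = canary_risk_call(30.0, 50.0, 20.0)
call_b = canary_risk_call(80.0, 15.0, 5.0)
```

A nodule with only 30% low-risk volume is flagged, while one dominated by low-risk voxels is not.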

  10. Software-based risk stratification of pulmonary adenocarcinomas manifesting as pure ground glass nodules on computed tomography

    International Nuclear Information System (INIS)

    Nemec, Ursula; Heidinger, Benedikt H.; Bankier, Alexander A.; Anderson, Kevin R.; VanderLaan, Paul A.; Westmore, Michael S.

    2018-01-01

    To assess the performance of the "Computer-Aided Nodule Assessment and Risk Yield" (CANARY) software in the differentiation and risk assessment of histological subtypes of lung adenocarcinomas manifesting as pure ground glass nodules on computed tomography (CT). 64 surgically resected and histologically proven adenocarcinomas manifesting as pure ground-glass nodules on CT were assessed using CANARY software, which classifies voxel densities into three risk components (low, intermediate, and high risk). Differences in risk components between histological adenocarcinoma subtypes were analysed. To determine the optimal threshold reflecting the presence of an invasive focus, sensitivity, specificity, negative predictive value, and positive predictive value were calculated. 28/64 (44%) were adenocarcinomas in situ (AIS); 26/64 (41%) were minimally invasive adenocarcinomas (MIA); and 10/64 (16%) were invasive ACs (IAC). The software showed significant differences in risk components between histological subtypes (P<0.001-0.003). A relative volume of 45% or less of low-risk components was associated with histological invasiveness (specificity 100%, positive predictive value 100%). CANARY-based risk assessment of ACs manifesting as pure ground glass nodules on CT allows the differentiation of their histological subtypes. A threshold of 45% of low-risk components reflects invasiveness in these groups. (orig.)

  11. Estimating and validating ground-based timber harvesting production through computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2003-01-01

    Estimating ground-based timber harvesting systems production with an object oriented methodology was investigated. The estimation model developed generates stands of trees, simulates chain saw, drive-to-tree feller-buncher, swing-to-tree single-grip harvester felling, and grapple skidder and forwarder extraction activities, and analyzes costs and productivity. It also...

  12. Data on the quantitative assessment pulmonary ground-glass opacification from coronary computed tomography angiography datasets

    DEFF Research Database (Denmark)

    Kühl, J Tobias; Kristensen, Thomas S; Thomsen, Anna F

    2017-01-01

    We assessed the CT attenuation density of the pulmonary tissue adjacent to the heart in patients with acute non-ST segment elevation myocardial infarction (J.T. Kuhl, T.S. Kristensen, A.F. Thomsen et al., 2016) [1]. This data was related to the level of ground-glass opacification evaluated by a r...

  13. Dynamic Design of Ground Transport With the Help of Computational Experiment

    Directory of Open Access Journals (Sweden)

    Kravets Victor

    2015-05-01

    Full Text Available Objectives of ground transport (motor transport vehicles) have been considered. A mathematical model of the nonlinear dynamics of the spatial motion of an asymmetric carriage, in the form of Euler-Lagrange equations represented as a symmetric block structure in quaternion matrices, has been developed. The kinematic equations and the partition matrices of external action are described by quaternionic matrices in which Rodrigues-Hamilton parameters are applied.

  14. Computer-Aided Segmentation and Volumetry of Artificial Ground-Glass Nodules at Chest CT

    NARCIS (Netherlands)

    Scholten, Ernst Th.; Jacobs, Colin; van Ginneken, Bram; Willemink, Martin J.; Kuhnigk, Jan-Martin; van Ooijen, Peter M. A.; Oudkerk, Matthijs; Mali, Willem P. Th. M.; de Jong, Pim A.

    OBJECTIVE. The purpose of this study was to investigate a new software program for semiautomatic measurement of the volume and mass of ground-glass nodules (GGNs) in a chest phantom and to investigate the influence of CT scanner, reconstruction filter, tube voltage, and tube current. MATERIALS AND

  15. A resource facility for kinetic analysis: modeling using the SAAM computer programs.

    Science.gov (United States)

    Foster, D M; Boston, R C; Jacquez, J A; Zech, L

    1989-01-01

    Kinetic analysis and integrated system modeling have contributed significantly to understanding the physiology and pathophysiology of metabolic systems in humans and animals. Many experimental biologists are aware of the usefulness of these techniques and recognize that kinetic modeling requires special expertise. The Resource Facility for Kinetic Analysis (RFKA) provides this expertise through: (1) development and application of modeling technology for biomedical problems, and (2) development of computer-based kinetic modeling methodologies concentrating on the computer program Simulation, Analysis, and Modeling (SAAM) and its conversational version, CONversational SAAM (CONSAM). The RFKA offers consultation to the biomedical community in the use of modeling to analyze kinetic data and trains individuals in using this technology for biomedical research. Early versions of SAAM were widely applied in solving dosimetry problems; many users, however, are not familiar with recent improvements to the software. The purpose of this paper is to acquaint biomedical researchers in the dosimetry field with RFKA, which, together with the joint National Cancer Institute-National Heart, Lung and Blood Institute project, is overseeing SAAM development and applications. In addition, RFKA provides many service activities to the SAAM user community that are relevant to solving dosimetry problems.
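The compartmental kinetic models that SAAM/CONSAM fit can be illustrated with a minimal sketch; the two-compartment structure and rate constants below are hypothetical, not taken from the RFKA or the SAAM documentation:

```python
import numpy as np

# Hypothetical two-compartment tracer model: exchange between plasma (1)
# and tissue (2), with irreversible elimination from plasma.
# Rate constants (per hour) are illustrative only.
k12, k21, k01 = 0.5, 0.3, 0.1   # plasma->tissue, tissue->plasma, elimination

# Linear compartmental system dx/dt = A @ x
A = np.array([[-(k12 + k01), k21],
              [k12,          -k21]])

def concentrations(x0, t):
    """Analytic solution x(t) = exp(A t) @ x0 via eigendecomposition."""
    w, V = np.linalg.eig(A * t)
    return ((V * np.exp(w)) @ np.linalg.inv(V) @ x0).real

x0 = np.array([1.0, 0.0])        # unit dose injected into plasma at t = 0
for t in (0.0, 1.0, 4.0):
    plasma, tissue = concentrations(x0, t)
    print(f"t = {t:3.1f} h  plasma = {plasma:.4f}  tissue = {tissue:.4f}")
```

Fitting, in the SAAM sense, means adjusting the rate constants so the model curves match measured tracer data; the sketch only shows the forward simulation step.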

  16. On a new method to compute photon skyshine doses around radiotherapy facilities

    Energy Technology Data Exchange (ETDEWEB)

    Falcao, R.; Facure, A. [Comissao Nacional de Energia Nuclear, Rio de Janeiro (Brazil)]; Xavier, A. [PEN/Coppe-UFRJ, Rio de Janeiro (Brazil)]

    2006-07-01

    Full text of publication follows: Nowadays, in a great number of situations, constructions are raised around radiotherapy facilities. In cases where the constructions would not be in the primary x-ray beam, 'skyshine' radiation is normally accounted for. The skyshine method is commonly used to calculate the dose contribution from scattered radiation in such circumstances, when the roof shielding is designed on the assumption that there will be no occupancy upstairs. In these cases, there is no need for the usual 1.5-2.0 m thick ceiling, and the construction costs can be considerably reduced. The existing expressions for computing these doses fail to explain mathematically the existence of a shadow area just outside the outer room walls, and its growth as one moves away from these walls. In this paper we propose a new method to compute photon skyshine doses, using geometrical considerations to find the maximum dose point. An empirical equation is derived, and its validity is tested using MCNP5 Monte Carlo calculations to simulate radiotherapy room configurations. (authors)
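The shadow-and-maximum behaviour the authors invoke can be illustrated with a toy line-of-sight calculation; the dimensions and the simple 1/r² weighting below are illustrative assumptions, not the empirical equation derived in the paper:

```python
# Toy skyshine geometry: a detector at height h, a horizontal distance d
# outside a wall of height H, can only "see" air-scatter points above the
# line of sight grazing the wall top. All dimensions are hypothetical.
H = 3.0      # outer wall height above the roof, m
h = 1.0      # detector height above ground, m
L = 4.0      # horizontal distance from the wall to the scatter column, m

def relative_dose(d, n=200, z_top=50.0):
    """Sum 1/r^2 contributions from a vertical air-scatter column above the
    source, counting only points visible to the detector over the wall."""
    total = 0.0
    dz = z_top / n
    for i in range(n):
        z = (i + 0.5) * dz
        # altitude at the column cut off by the line detector -> wall top
        z_min = h + (H - h) * (d + L) / d if d > 0 else float("inf")
        if z >= z_min:
            r2 = (d + L) ** 2 + (z - h) ** 2
            total += dz / r2
    return total

for d in (0.5, 2.0, 5.0, 10.0, 20.0, 40.0):
    print(f"d = {d:5.1f} m  relative dose = {relative_dose(d):.4f}")
```

The output rises from a near-shadow value at the wall to a maximum at an intermediate distance, then falls off with 1/r², which is the qualitative behaviour the paper's geometric method is built to locate.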

  17. Computer-guided facility for the study of single crystals at the gamma diffractometer GADI

    International Nuclear Information System (INIS)

    Heer, H.; Bleichert, H.; Gruhn, W.; Moeller, R.

    1984-10-01

    In the study of solid-state properties it is in many cases necessary to work with single crystals. The increased demand in industry and research, as well as the desire for better characterization by means of γ-diffractometry, made it necessary to improve and modernize the existing instrument. The advantages of a computer-guided facility over conventional, semiautomatic operation are manifold. Not only the process control, but also the data acquisition and evaluation are performed by the computer. By remote control the operator is able to quickly find a reflection and to drive the crystal to any desired measuring position. The complete logging of all important measuring parameters, the convenient data storage, and the automatic evaluation are very useful for the user. Finally, the measuring time can be extended to practically 24 hours per day. This puts routine characterization by means of γ-diffractometry on a completely new level. (orig.) [de

  18. A guide for the selection of computer assisted mapping (CAM) and facilities informations systems

    Energy Technology Data Exchange (ETDEWEB)

    Haslin, S.; Baxter, P.; Jarvis, L.

    1980-12-01

    Many distribution engineers are now aware that computer assisted mapping (CAM) and facilities information systems are probably the most significant breakthrough to date in computer applications for distribution engineering. The Canadian Electrical Association (CEA) recognized this and requested that engineers of B.C. Hydro study the state of the art in Canadian utilities and the progress of CAM systems internationally. The purpose was to provide a guide to assist Canadian utility distribution engineers faced with the problem of studying the application of CAM systems as an alternative to present methods, consideration being given to the long-term and other benefits that are perhaps not apparent to those approaching this field for the first time. It soon became apparent that the technology was developing at a high rate and that competition in the market was very strong. A number of publications produced by other sources also adequately covered the scope of this study. This report is thus a collection of references to reports, manuals, and other documents, with a few considerations provided for companies interested in exploring further the use of interactive graphics. 24 refs.

  19. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Pendergrass, J.H.

    1978-01-01

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data

  20. In Situ Production of Chlorine-36 in the Eastern Snake River Plain Aquifer, Idaho: Implications for Describing Ground-Water Contamination Near a Nuclear Facility

    International Nuclear Information System (INIS)

    Cecil, L. D.; Knobel, L. L.; Green, J. R.; Frape, S. K.

    2000-01-01

    The purpose of this report is to describe the calculated contribution to ground water of natural, in situ produced 36Cl in the eastern Snake River Plain aquifer and to compare these concentrations in ground water with measured concentrations near a nuclear facility in southeastern Idaho. The scope focused on isotopic and chemical analyses and associated 36Cl in situ production calculations on 25 whole-rock samples from 6 major water-bearing rock types present in the eastern Snake River Plain. The rock types investigated were basalt, rhyolite, limestone, dolomite, shale, and quartzite. Determining the contribution of in situ production to 36Cl inventories in ground water facilitated the identification of the source for this radionuclide in environmental samples. On the basis of calculations reported here, in situ production of 36Cl was determined to be insignificant compared to concentrations measured in ground water near buried and injected nuclear waste at the INEEL. Maximum estimated 36Cl concentrations in ground water from in situ production are on the same order of magnitude as natural concentrations in meteoric water

  1. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation Petascale-capable simulation codes towards the performance levels required for running on future Exascale systems.
One of the techniques pursued by ECMWF is to use Fortran2008 coarrays to overlap computations and communications and

  2. Description of NORMTRI: a computer program for assessing the off-site consequences from air-borne releases of tritium during normal operation of nuclear facilities

    International Nuclear Information System (INIS)

    Raskob, W.

    1994-10-01

    The computer program NORMTRI has been developed to calculate the behaviour of tritium released into the atmosphere during normal operation of nuclear facilities. It is possible to investigate the two chemical forms, tritium gas and tritiated water vapour. The conversion of tritium gas into tritiated water, followed by its re-emission back to the atmosphere, as well as the conversion into organically bound tritium, is considered. NORMTRI is based on the statistical Gaussian dispersion model ISOLA, which calculates the near-ground activity concentration in air and the ground contamination due to dry and wet deposition at specified locations in a polar grid system. ISOLA requires a four-parameter meteorological statistic derived from one or more years of synoptic recordings of 1-hour averages of wind speed, wind direction, stability class and precipitation intensity. An additional feature of NORMTRI is the possibility to choose among several dose calculation procedures, ranging from the equations of the German regulatory guidelines to a pure specific-equilibrium approach. (orig.)

  3. Compliance matrix for the mixed waste disposal facilities, Trenches 31 & 34, burial ground 218-W-5

    International Nuclear Information System (INIS)

    Carlyle, D.W.

    1994-01-01

    The purpose of the Trench 31 & 34 Mixed Waste Disposal Facility Compliance Matrix is to provide objective evidence of implementation of all regulatory and procedural-institutional requirements for the disposal facilities. This matrix provides a listing of the individual regulatory and procedural-institutional requirements that were addressed. Subject matter experts reviewed pertinent documents that had direct or indirect impact on the facility. Those found to be applicable were so noted and listed in Appendix A. Subject matter experts then extracted individual requirements from the documents deemed applicable and listed them in the matrix tables. The results of this effort are documented in Appendix B

  4. Arctic Atmospheric Measurements Using Manned and Unmanned Aircraft, Tethered Balloons, and Ground-Based Systems at U.S. DOE ARM Facilities on the North Slope Of Alaska

    Science.gov (United States)

    Ivey, M.; Dexheimer, D.; Roesler, E. L.; Hillman, B. R.; Hardesty, J. O.

    2016-12-01

    The U.S. Department of Energy (DOE) provides scientific infrastructure and data to the international Arctic research community via research sites located on the North Slope of Alaska and an open data archive maintained by the ARM program. In 2016, DOE continued investments in improvements to facilities and infrastructure at Oliktok Point, Alaska, to support operations of ground-based facilities and unmanned aerial systems for science missions in the Arctic. The Third ARM Mobile Facility, AMF3, now deployed at Oliktok Point, was further expanded in 2016. Tethered instrumented balloons were used at Oliktok to make measurements of clouds in the boundary layer, including mixed-phase clouds, and to compare measurements with those from the ground and from unmanned aircraft operating in the airspace above AMF3. The ARM facility at Oliktok Point includes Special Use Airspace. A Restricted Area, R-2204, is located at Oliktok Point. Roughly 4 miles in diameter, it facilitates operations of tethered balloons and unmanned aircraft. R-2204 and a new Warning Area north of Oliktok, W-220, are managed by Sandia National Laboratories for the DOE Office of Science/BER. These Special Use Airspaces have been successfully used to launch and operate unmanned aircraft over the Arctic Ocean and in international airspace north of Oliktok Point. A steady progression towards routine operations of unmanned aircraft and tethered balloon systems continues at Oliktok. Small unmanned aircraft (DataHawks) and tethered balloons were successfully flown at Oliktok starting in June of 2016. This poster will discuss how principal investigators may apply for use of these Special Use Airspaces, acquire data from the Third ARM Mobile Facility, or bring their own instrumentation for deployment at Oliktok Point, Alaska.

  5. Computing a ground appropriateness index for route selection in permafrost regions

    Directory of Open Access Journals (Sweden)

    Chi Zhang

    2017-10-01

    Full Text Available The reasonable calculation of a ground appropriateness index in permafrost regions is a precondition of highway route design in such regions. Knowledge-base theory and fuzzy mathematics are applied, and the damage effect of permafrost is considered in the paper. Based on the idea of protecting permafrost, a calculation method for the ground appropriateness index is put forward. Firstly, based on the actual environmental conditions, the paper determines the factors affecting road layout in permafrost areas by qualitative and quantitative analysis, including the slope, the average annual ground temperature of the permafrost, the ice content of the frozen soil, and interfering engineering works. Secondly, based on knowledge-base theory and the Delphi method, the paper establishes the knowledge base, the rule base for the permafrost region, and an inference mechanism. The method of route selection in permafrost regions is completed and realized on a software platform. Thirdly, taking the Tuotuo River to Kaixin Mountain section of the permafrost region as an example, the application of the method is studied using an ArcGIS platform. Results show that the route plan determined by this method can avoid areas of high temperature and high ice content, conform to terrain changes, and avoid thermal disturbance from existing engineering works. A reasonable route plan can be achieved, providing the basis for subsequent engineering construction.
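A fuzzy weighted combination of the four named factors can be sketched as follows; the membership functions, thresholds, and weights are hypothetical stand-ins for the paper's knowledge and rule bases:

```python
# Hypothetical sketch of a ground appropriateness index combining the four
# factors named in the abstract via fuzzy memberships and a weighted sum.
def ramp(x, bad, good):
    """Linear membership: 0 at 'bad', 1 at 'good' (works in either direction)."""
    t = (x - bad) / (good - bad)
    return max(0.0, min(1.0, t))

def appropriateness(slope_pct, ground_temp_c, ice_content_pct, disturbance):
    factors = {
        "slope":       (ramp(slope_pct, 25.0, 0.0), 0.2),       # flatter is better
        "temperature": (ramp(ground_temp_c, -0.5, -3.0), 0.3),  # colder = stabler
        "ice":         (ramp(ice_content_pct, 50.0, 0.0), 0.3), # less ice is better
        "disturbance": (ramp(disturbance, 1.0, 0.0), 0.2),      # 0..1 heat disturbance
    }
    return sum(mu * w for mu, w in factors.values())

# Cold, flat, ice-poor, undisturbed ground scores near 1;
# warm, ice-rich ground near existing works scores near 0.
print(appropriateness(2.0, -3.5, 10.0, 0.1))
print(appropriateness(20.0, -0.8, 45.0, 0.9))
```

In the paper the memberships and weights come from the Delphi-built rule base and the index is then mapped over a GIS grid; the sketch only shows the per-cell scoring step.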

  6. Exponential vanishing of the ground-state gap of the quantum random energy model via adiabatic quantum computing

    Science.gov (United States)

    Adame, J.; Warzel, S.

    2015-11-01

    In this note, we use ideas of Farhi et al. [Int. J. Quantum. Inf. 6, 503 (2008) and Quantum Inf. Comput. 11, 840 (2011)] who link a lower bound on the run time of their quantum adiabatic search algorithm to an upper bound on the energy gap above the ground state of the generators of this algorithm. We apply these ideas to the quantum random energy model (QREM). Our main result is a simple proof of the conjectured exponential vanishing of the energy gap of the QREM.
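The logic linking gap and run time can be written schematically; this is the standard adiabatic-theorem scaling, with illustrative constants rather than those derived in the paper:

```latex
% Adiabatic run time bounded by the inverse gap squared (schematic):
T \;\gtrsim\; \max_{s\in[0,1]} \frac{\|\partial_s H(s)\|}{\Delta(s)^{2}},
\qquad
\Delta_{\min} \;\le\; C\, e^{-cN} \;\; (c>0)
\;\;\Longrightarrow\;\;
T \;\gtrsim\; C'\, e^{2cN}.
```

So an exponentially small minimal gap above the ground state, as proved for the QREM, forces an exponentially long adiabatic run time in the number of spins N.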

  7. Exponential vanishing of the ground-state gap of the quantum random energy model via adiabatic quantum computing

    International Nuclear Information System (INIS)

    Adame, J.; Warzel, S.

    2015-01-01

    In this note, we use ideas of Farhi et al. [Int. J. Quantum. Inf. 6, 503 (2008) and Quantum Inf. Comput. 11, 840 (2011)] who link a lower bound on the run time of their quantum adiabatic search algorithm to an upper bound on the energy gap above the ground state of the generators of this algorithm. We apply these ideas to the quantum random energy model (QREM). Our main result is a simple proof of the conjectured exponential vanishing of the energy gap of the QREM

  8. Environmental Assessment - Construct a Ground-to-Air Transmitter and Receiver (GATR) Facility at Grand Forks Air Force Base

    National Research Council Canada - National Science Library

    2006-01-01

    ...) facility on Grand Forks Air Force Base (AFB), North Dakota. The Communication Squadron is preparing to install new GATR communication antennas and systems, for tactical aircraft control and commercial air traffic control...

  9. Status of the National Ignition Facility Integrated Computer Control System (ICCS) on the Path to Ignition

    International Nuclear Information System (INIS)

    Lagin, L J; Bettenhausen, R C; Bowers, G A; Carey, R W; Edwards, O D; Estes, C M; Demaret, R D; Ferguson, S W; Fisher, J M; Ho, J C; Ludwigsen, A P; Mathisen, D G; Marshall, C D; Matone, J M; McGuigan, D L; Sanchez, R J; Shelton, R T; Stout, E A; Tekle, E; Townsend, S L; Van Arsdall, P J; Wilson, E F

    2007-01-01

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility under construction that will contain a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. NIF is comprised of 24 independent bundles of 8 beams each using laser hardware that is modularized into more than 6,000 line replaceable units such as optical assemblies, laser amplifiers, and multifunction sensor packages containing 60,000 control and diagnostic points. NIF is operated by the large-scale Integrated Computer Control System (ICCS) in an architecture partitioned by bundle and distributed among over 800 front-end processors and 50 supervisory servers. NIF's automated control subsystems are built from a common object-oriented software framework based on CORBA distribution that deploys the software across the computer network and achieves interoperation between different languages and target architectures. A shot automation framework has been deployed during the past year to orchestrate and automate shots performed at the NIF using the ICCS. In December 2006, a full cluster of 48 beams of NIF was fired simultaneously, demonstrating that the independent bundle control system will scale to full scale of 192 beams. At present, 72 beams have been commissioned and have demonstrated 1.4-Megajoule capability of infrared light. During the next two years, the control system will be expanded to include automation of target area systems including final optics, target positioners and

  10. Status of the National Ignition Facility Integrated Computer Control System (ICCS) on the path to ignition

    International Nuclear Information System (INIS)

    Lagin, L.J.; Bettenhausen, R.C.; Bowers, G.A.; Carey, R.W.; Edwards, O.D.; Estes, C.M.; Demaret, R.D.; Ferguson, S.W.; Fisher, J.M.; Ho, J.C.; Ludwigsen, A.P.; Mathisen, D.G.; Marshall, C.D.; Matone, J.T.; McGuigan, D.L.; Sanchez, R.J.; Stout, E.A.; Tekle, E.A.; Townsend, S.L.; Van Arsdall, P.J.

    2008-01-01

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility under construction that will contain a 192-beam, 1.8-MJ, 500-TW, ultraviolet laser system together with a 10-m diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. NIF is comprised of 24 independent bundles of eight beams each using laser hardware that is modularized into more than 6000 line replaceable units such as optical assemblies, laser amplifiers, and multi-function sensor packages containing 60,000 control and diagnostic points. NIF is operated by the large-scale Integrated Computer Control System (ICCS) in an architecture partitioned by bundle and distributed among over 800 front-end processors and 50 supervisory servers. NIF's automated control subsystems are built from a common object-oriented software framework based on CORBA distribution that deploys the software across the computer network and achieves interoperation between different languages and target architectures. A shot automation framework has been deployed during the past year to orchestrate and automate shots performed at the NIF using the ICCS. In December 2006, a full cluster of 48 beams of NIF was fired simultaneously, demonstrating that the independent bundle control system will scale to full scale of 192 beams. At present, 72 beams have been commissioned and have demonstrated 1.4-MJ capability of infrared light. During the next 2 years, the control system will be expanded in preparation for project completion in 2009 to include automation of target area systems including final optics

  11. Investigation of Pharmaceutical Residues in Hospital Effluents, in Ground- and Drinking Water from Bundeswehr Facilities, and their Removal During Drinking Water Purification (Arzneimittelrueckstaende in Trinkwasser(versorgungsanlagen) und Krankenhausabwaessern der Bundeswehr: Methodenentwicklung - Verkommen - Wasseraufbereitung)

    National Research Council Canada - National Science Library

    Heberer, Th; Feldmann, Dirk; Adam, Marc; Reddersen, Kirsten

    1999-01-01

    ... by the German Ministry of Defense. The project had three defined objectives including the investigation of pharmaceutical residues in ground water wells used for drinking water supply at military facilities...

  12. Status of ground-water resources at U.S. Navy Support Facility, Diego Garcia; summary of hydrologic and climatic data, January 1994 through September 1996

    Science.gov (United States)

    Torikai, J.D.

    1996-01-01

    This report describes the status of ground-water resources at U.S. Navy Support Facility, Diego Garcia. Data presented are from January 1994 through September 1996, with a focus on data from July through September 1996 (third quarter of 1996). A complete database of ground-water withdrawals and chloride-concentration records since 1985 is maintained by the U.S. Geological Survey. Total rainfall for the period July through September 1996 was 8.94 inches, which is 60 percent less than the mean rainfall of 22.23 inches for the period July through September. July and August are part of the annual dry season, while September is the start of the annual wet season. Ground-water withdrawal during July through September 1996 averaged 1,038,300 gallons per day. Withdrawal for the same 3 months in 1995 averaged 888,500 gallons per day. Ground-water withdrawals have steadily increased since about April 1995. At the end of September 1996, the chloride concentrations of water from the elevated tanks at Cantonment and Air Operations were 68 and 150 milligrams per liter, respectively. The chloride concentration from all five production areas increased throughout the third quarter of 1996, having started an upward trend in about April 1995. Chloride concentrations of ground water in monitoring wells at Cantonment and Air Operations also increased throughout the third quarter of 1996, with the largest increases in water from the deepest monitoring wells. Chloride concentrations have not been at this level since the dry season of 1994. A fuel-pipeline leak at Air Operations in May 1991 decreased total islandwide withdrawals by 15 percent. This lost pumping capacity is being offset by increased pumpage at Cantonment. Six wells do not contribute to the water supply because they are being used to hydraulically divert fuel migration away from water-supply wells by a program of ground-water withdrawal and injection.

  13. New challenges for HEP computing: RHIC [Relativistic Heavy Ion Collider] and CEBAF [Continuous Electron Beam Accelerator Facility

    International Nuclear Information System (INIS)

    LeVine, M.J.

    1990-01-01

    We will look at two facilities: RHIC and CEBAF. CEBAF is in the construction phase; RHIC is about to begin construction. For each of them, we examine the kinds of physics measurements that motivated their construction, and the implications of these experiments for computing. Emphasis will be on on-line requirements, driven by the data rates produced by these experiments

  14. Surface Water Modeling Using an EPA Computer Code for Tritiated Waste Water Discharge from the heavy Water Facility

    International Nuclear Information System (INIS)

    Chen, K.F.

    1998-06-01

    Tritium releases from the D-Area Heavy Water Facilities to the Savannah River have been analyzed. The U.S. EPA WASP5 computer code was used to simulate surface-water transport for tritium releases from the D-Area Drum Wash, Rework, and DW facilities. The WASP5 model was qualified against the 1993 tritium measurements at U.S. Highway 301. At the maximum tritiated waste-water concentrations, the calculated tritium concentration in the Savannah River at U.S. Highway 301 due to concurrent releases from the D-Area Heavy Water Facilities varies from 5.9 to 18.0 pCi/ml, depending on the operating conditions of these facilities. The calculated concentration is lowest when the batch-release method for the Drum Wash Waste Tanks is adopted.
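Before running a transport code such as WASP5, a fully mixed steady-state dilution estimate provides a sanity check on the order of magnitude; the release rate and river flow below are hypothetical illustrations, not D-Area operating data:

```python
# Back-of-the-envelope dilution estimate of the kind a transport model
# like WASP5 refines with dispersion, decay, and time-varying releases.
CI_PER_PCI = 1e-12
ML_PER_CUBIC_FOOT = 28316.8
SECONDS_PER_DAY = 86400.0

def mixed_concentration_pci_per_ml(release_ci_per_day, river_flow_cfs):
    """Fully mixed downstream concentration, ignoring decay and dispersion."""
    ml_per_day = river_flow_cfs * ML_PER_CUBIC_FOOT * SECONDS_PER_DAY
    return release_ci_per_day / CI_PER_PCI / ml_per_day

# e.g. a hypothetical 100 Ci/day release into a 10,000 cfs river:
print(f"{mixed_concentration_pci_per_ml(100.0, 10000.0):.2f} pCi/mL")
```

The point of the estimate is only the scale: a Ci-per-day release into a river of this size lands in the pCi/mL range reported in the abstract.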

  15. Development of a computer code for shielding calculation in X-ray facilities

    International Nuclear Information System (INIS)

    Borges, Diogo da S.; Lava, Deise D.; Affonso, Renato R.W.; Moreira, Maria de L.; Guimaraes, Antonio C.F.

    2014-01-01

    The construction of an effective barrier against the ionizing radiation present in X-ray rooms requires consideration of many variables. The methodology used for specifying the thickness of primary and secondary shielding of a traditional X-ray room considers the following factors: use factor, occupancy factor, distance between the source and the wall, workload, air kerma, and distance between the patient and the receptor. With these data it was possible to develop a computer program that identifies and uses these variables in functions obtained through regressions of graphs given in NCRP Report No. 147 (Structural Shielding Design for Medical X-Ray Imaging Facilities) to calculate the shielding of the room walls as well as of the darkroom wall and adjacent areas. The program is validated by comparing its results with a base case provided by that report. The thickness values obtained cover various materials such as steel, wood, and concrete. After validation, the program is applied to a real radiographic room, whose visual construction is done with the help of indoor and outdoor modeling software. The barrier-calculation program resulted in a user-friendly tool for planning radiographic rooms that comply with the limits established by CNEN-NN-3.01, published in September 2011.
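Shielding calculations of this kind typically invert a broad-beam transmission fit such as the Archer equation used in NCRP Report No. 147; the fit parameters below are illustrative placeholders, not the report's tabulated values for any material or kVp:

```python
import math

# Archer-equation transmission fit and its analytic inversion for barrier
# thickness. ALPHA, BETA, GAMMA are hypothetical per-mm fit parameters.
ALPHA, BETA, GAMMA = 2.5, 15.3, 0.76

def transmission(x_mm):
    """Broad-beam transmission B(x) through a barrier of thickness x (mm)."""
    return ((1 + BETA / ALPHA) * math.exp(ALPHA * GAMMA * x_mm)
            - BETA / ALPHA) ** (-1.0 / GAMMA)

def required_thickness(b_target):
    """Invert B(x) analytically for the thickness giving transmission b."""
    return (1.0 / (ALPHA * GAMMA)) * math.log(
        (b_target ** (-GAMMA) + BETA / ALPHA) / (1 + BETA / ALPHA))

x = required_thickness(1e-3)   # barrier achieving a 10^-3 transmission goal
print(f"thickness = {x:.2f} mm, check B = {transmission(x):.2e}")
```

In a real room calculation the transmission goal B comes from the dose limit, workload, use and occupancy factors, and distance; the fit parameters come from the report's tables per material and beam quality.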

  16. Computational investigation of reshock strength in hydrodynamic instability growth at the National Ignition Facility

    Science.gov (United States)

    Bender, Jason; Raman, Kumar; Huntington, Channing; Nagel, Sabrina; Morgan, Brandon; Prisbrey, Shon; MacLaren, Stephan

    2017-10-01

    Experiments at the National Ignition Facility (NIF) are studying Richtmyer-Meshkov and Rayleigh-Taylor hydrodynamic instabilities in multiply-shocked plasmas. Targets feature two different-density fluids with a multimode initial perturbation at the interface, which is struck by two X-ray-driven shock waves. Here we discuss computational hydrodynamics simulations investigating the effect of second-shock ("reshock") strength on instability growth, and how these simulations are informing target design for the ongoing experimental campaign. A Reynolds-Averaged Navier-Stokes (RANS) model was used to predict motion of the spike and bubble fronts and the mixing-layer width. In addition to reshock strength, the reshock ablator thickness and the total length of the target were varied; all three parameters were found to be important for target design, particularly for ameliorating undesirable reflected shocks. The RANS data are compared to theoretical models that predict multimode instability growth proportional to the shock-induced change in interface velocity, and to currently-available data from the NIF experiments. Work performed under the auspices of the U.S. D.O.E. by Lawrence Livermore National Laboratory under Contract No. DE-AC52-07NA27344. LLNL-ABS-734611.

  17. EXPERIMENTAL AND COMPUTATIONAL ACTIVITIES AT THE OREGON STATE UNIVERSITY NEES TSUNAMI RESEARCH FACILITY

    Directory of Open Access Journals (Sweden)

    S.C. Yim

    2009-01-01

    A diverse series of research projects have taken place or are underway at the NEES Tsunami Research Facility at Oregon State University. Projects range from the simulation of the processes and effects of tsunamis generated by sub-aerial and submarine landslides (NEESR, Georgia Tech.), model comparisons of tsunami wave effects on bottom profiles and scouring (NEESR, Princeton University), model comparisons of wave induced motions on rigid and free bodies (Shared-Use, Cornell), numerical model simulations and testing of breaking waves and inundation over topography (NEESR, TAMU), structural testing and development of standards for tsunami engineering and design (NEESR, University of Hawaii), and wave loads on coastal bridge structures (non-NEES), to upgrading the two-dimensional wave generator of the Large Wave Flume. A NEESR payload project (Colorado State University) was undertaken that seeks to improve the understanding of the stresses from wave loading and run-up on residential structures. Advanced computational tools for coupling fluid-structure interaction including turbulence, contact and impact are being developed to assist with the design of experiments and complement parametric studies. These projects will contribute towards understanding the physical processes that occur during earthquake generated tsunamis including structural stress, debris flow and scour, inundation and overland flow, and landslide generated tsunamis. Analytical and numerical model development and comparisons with the experimental results give engineers additional predictive tools to assist in the development of robust structures as well as identification of hazard zones and formulation of hazard plans.

  18. Development of a computational code for calculations of shielding in dental facilities

    International Nuclear Information System (INIS)

    Lava, Deise D.; Borges, Diogo da S.; Affonso, Renato R.W.; Guimaraes, Antonio C.F.; Moreira, Maria de L.

    2014-01-01

    This paper addresses shielding calculations intended to minimize the exposure of patients and/or personnel to ionizing radiation. The work draws on the radiation protection report NCRP-145 (Radiation Protection in Dentistry), which establishes the calculations and standards to be adopted to ensure the safety of those who may be exposed to ionizing radiation in dental facilities, according to the dose limits established by the CNEN-NN-3.01 standard published in September 2011. The methodology comprises the use of a computer language for processing the data provided by that report, together with a commercial application used for creating residential and decoration projects. The FORTRAN language was adopted, and the method was applied to a real case. The result is a program capable of returning the thickness of materials such as steel, lead, wood, glass, plaster, acrylic, and leaded glass, which can be used for effective shielding against single-pulse or continuous beams. Several variables are used to calculate the thickness of the shield, namely: number of films used per week, film load, use factor, occupancy factor, distance between the wall and the source, transmission factor, workload, area definition, beam intensity, and intraoral versus panoramic examination. Before applying the methodology, the results were validated against examples provided by NCRP-145; the calculations, redone from those examples, give answers consistent with the report

  19. Developing a project-based computational physics course grounded in expert practice

    Science.gov (United States)

    Burke, Christopher J.; Atherton, Timothy J.

    2017-04-01

    We describe a project-based computational physics course developed using a backwards course design approach. From an initial competency-based model of problem solving in computational physics, we interviewed faculty who use these tools in their own research to determine indicators of expert practice. From these, a rubric was formulated that enabled us to design a course intended to allow students to learn these skills. We also report an initial implementation of the course and, by having the interviewees regrade student work, show that students acquired many of the expert practices identified.

  20. Thermal studies of the canister staging pit in a hypothetical Yucca Mountain canister handling facility using computational fluid dynamics

    International Nuclear Information System (INIS)

    Soltani, Mehdi; Barringer, Chris; Bues, Timothy T. de

    2007-01-01

    The proposed Yucca Mountain nuclear waste storage site will contain facilities for preparing the radioactive waste canisters for burial. One facility design previously considered was the Canister Handling Facility Staging Pit. This design is no longer used, but its thermal evaluation is typical of such facilities. Structural concrete can be adversely affected by the heat from radioactive decay; consequently, facilities must have heating, ventilation, and air conditioning (HVAC) systems for cooling. Concrete temperatures are a function of conductive, convective, and radiative heat transfer. The prediction of concrete temperatures under such complex conditions can only be adequately handled by computational fluid dynamics (CFD). The objective of the CFD analysis was to predict concrete temperatures under normal and off-normal conditions. Normal operation assumed steady-state conditions with constant HVAC flow and temperatures. Off-normal operation, however, was an unsteady scenario that assumed a total HVAC failure for a period of 30 days. This scenario was particularly complex in that the concrete temperatures would gradually rise and air flows would be buoyancy driven. The CFD analysis concluded that concrete wall temperatures would be at or below the maximum temperature limits in both the normal and off-normal scenarios. While this analysis was specific to a facility design that is no longer used, it demonstrates that such facilities can reasonably be expected to have satisfactory thermal performance. (author)

  1. A High Performance Computing Framework for Physics-based Modeling and Simulation of Military Ground Vehicles

    Science.gov (United States)

    2011-03-25

    cluster. The co-processing idea is the enabler of the heterogeneous computing concept advertised recently as the paradigm capable of delivering exascale ...Petascale to Exascale : Extending Intel’s HPC Commitment: http://download.intel.com/pressroom/archive/reference/ISC_2010_Skaugen_keynote.pdf in

  2. Water activities in Forsmark (Part II). The final disposal facility for spent fuel: water activities above ground

    International Nuclear Information System (INIS)

    Werner, Kent; Hamren, Ulrika; Collinder, Per; Ridderstolpe, Peter

    2010-09-01

    The construction of the repository for spent nuclear fuel in Forsmark is associated with a number of measures above ground that constitute water operations according to Chapter 11 in the Swedish Environmental Code. This report, which is an appendix to the Environmental Impact Assessment, describes these water operations, their effects and consequences, and planned measures

  3. A high performance computing framework for physics-based modeling and simulation of military ground vehicles

    Science.gov (United States)

    Negrut, Dan; Lamb, David; Gorsich, David

    2011-06-01

    This paper describes a software infrastructure made up of tools and libraries designed to assist developers in implementing computational dynamics applications running on heterogeneous and distributed computing environments. Together, these tools and libraries compose a so-called Heterogeneous Computing Template (HCT). The heterogeneous and distributed computing hardware infrastructure is assumed herein to be made up of a combination of CPUs and Graphics Processing Units (GPUs). The computational dynamics applications targeted to execute on such a hardware topology include many-body dynamics, smoothed-particle hydrodynamics (SPH) fluid simulation, and fluid-solid interaction analysis. The underlying theme of the solution approach embraced by HCT is that of partitioning the domain of interest into a number of subdomains that are each managed by a separate core/accelerator (CPU/GPU) pair. Five components at the core of HCT enable the envisioned distributed computing approach to large-scale dynamical system simulation: (a) the ability to partition the problem according to the one-to-one mapping (i.e., the spatial subdivision discussed above) during pre-processing; (b) a protocol for passing data between any two co-processors; (c) algorithms for element proximity computation; and (d) the ability to carry out post-processing in a distributed fashion. In this contribution, components (a) and (b) of the HCT are demonstrated via the example of the Discrete Element Method (DEM) for rigid-body dynamics with friction and contact. The collision detection task required in frictional-contact dynamics (task (c) above) is shown to achieve a two-order-of-magnitude efficiency gain on the GPU when compared to traditional sequential implementations. Note: Reference herein to any specific commercial products, process, or service by trade name, trademark, manufacturer, or otherwise does not imply its endorsement, recommendation, or favoring by the United States Army. The views and
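    The one-to-one body-to-subdomain mapping of component (a) can be illustrated with a minimal sketch. This is not the HCT code itself: the names, the uniform 2-D grid, and the halo-replication rule standing in for the inter-processor exchange of component (b) are all simplifying assumptions.

```python
from collections import defaultdict

def partition_bodies(positions, domain_min, domain_max, nx, ny, halo):
    """Assign each 2-D body to a cell of an nx-by-ny uniform grid of
    subdomains; bodies lying within `halo` of a subdomain boundary are
    also listed in the neighboring subdomain, mimicking the ghost data
    that co-processor pairs would exchange.  Illustrative only."""
    sx = (domain_max[0] - domain_min[0]) / nx
    sy = (domain_max[1] - domain_min[1]) / ny
    owner = defaultdict(list)
    for idx, (x, y) in enumerate(positions):
        i = min(int((x - domain_min[0]) / sx), nx - 1)
        j = min(int((y - domain_min[1]) / sy), ny - 1)
        owner[(i, j)].append(idx)
        # replicate into neighbors whose shared boundary lies within the halo
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if (di, dj) == (0, 0) or not (0 <= ni < nx and 0 <= nj < ny):
                    continue
                bx = domain_min[0] + (i + (di + 1) // 2) * sx
                by = domain_min[1] + (j + (dj + 1) // 2) * sy
                dx = abs(x - bx) if di else 0.0
                dy = abs(y - by) if dj else 0.0
                if max(dx, dy) <= halo:
                    owner[(ni, nj)].append(idx)
    return owner

parts = partition_bodies([(0.1, 0.1), (0.49, 0.5), (0.9, 0.9)],
                         (0.0, 0.0), (1.0, 1.0), nx=2, ny=2, halo=0.05)
```

In the example, the middle body sits within the halo of all four subdomains and is therefore replicated into each, while the other two bodies belong to a single subdomain.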

  4. Summary of ground water and surface water flow and contaminant transport computer codes used at the Idaho National Engineering Laboratory (INEL)

    International Nuclear Information System (INIS)

    Bandy, P.J.; Hall, L.F.

    1993-03-01

    This report presents information on computer codes for numerical and analytical models that have been used at the Idaho National Engineering Laboratory (INEL) to model ground-water and surface-water flow and contaminant transport. Organizations conducting modeling at the INEL include EG&G Idaho, Inc., the US Geological Survey, and Westinghouse Idaho Nuclear Company. The information provided for each computer code includes: the agency responsible for the modeling effort, the name of the computer code, the proprietor of the code (copyright holder or original author), validation and verification studies, applications of the model at INEL, the prime user of the model, a computer code description, computing environment requirements, and documentation and references for the computer code

  5. Image processing and computer controls for video profile diagnostic system in the ground test accelerator (GTA)

    International Nuclear Information System (INIS)

    Wright, R.; Zander, M.; Brown, S.; Sandoval, D.; Gilpatrick, D.; Gibson, H.

    1992-01-01

    This paper describes the application of video image processing to beam profile measurements on the Ground Test Accelerator (GTA). A diagnostic was needed to measure beam profiles in the intermediate matching section (IMS) between the radio-frequency quadrupole (RFQ) and the drift tube linac (DTL). Beam profiles are measured by injecting puffs of gas into the beam. The light emitted from the beam-gas interaction is captured and processed by a video image processing system, generating the beam profile data. A general purpose, modular and flexible video image processing system, imagetool, was used for the GTA image profile measurement. The development of both software and hardware for imagetool and its integration with the GTA control system (GTACS) is discussed. The software includes specialized algorithms for analyzing data and calibrating the system. The underlying design philosophy of imagetool was tested by the experience of building and using the system, pointing the way for future improvements. (Author) (3 figs., 4 refs.)
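    The core reduction a system like imagetool performs, collapsing a 2-D image of the beam-gas light into a 1-D transverse profile, can be sketched generically. This is not the GTA code; the function and the Gaussian test image are hypothetical illustrations.

```python
import numpy as np

def beam_profile(image, background=None):
    """Collapse a 2-D beam image into a 1-D transverse profile by
    integrating along the beam axis (rows), then return the profile
    together with its centroid and RMS width in pixel units."""
    img = image.astype(float)
    if background is not None:
        img = img - background          # fixed-pattern background removal
    profile = img.sum(axis=0)           # integrate light along the beam
    profile = np.clip(profile, 0.0, None)
    x = np.arange(profile.size)
    total = profile.sum()
    centroid = (x * profile).sum() / total
    rms = np.sqrt(((x - centroid) ** 2 * profile).sum() / total)
    return profile, centroid, rms

# synthetic Gaussian beam spot, centered at column 12 with sigma = 3 px
xx, yy = np.meshgrid(np.arange(32), np.arange(24))
img = np.exp(-((xx - 12) ** 2 + (yy - 10) ** 2) / (2 * 3.0 ** 2))
_, c, w = beam_profile(img)
print(f"centroid = {c:.2f} px, rms width = {w:.2f} px")
```

A real diagnostic would add the calibration step (pixels to millimeters) and the specialized analysis algorithms the abstract mentions on top of this reduction.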

  6. Burial ground as a containment system: 25 years of subsurface monitoring at the Savannah River Plant Facility

    International Nuclear Information System (INIS)

    Fenimore, J.W.

    1982-01-01

    At the Savannah River Plant (SRP), solid wastes containing small quantities of radionuclides are buried in shallow (20-ft-deep) trenches. The hydrogeology of the burial site is described, together with a variety of subsurface monitoring techniques employed to ensure the continued safe operation of this disposal facility. Conclusions from over two decades of data collection are presented

  7. X-ray facility for the ground calibration of the X-ray monitor JEM-X on board INTEGRAL

    DEFF Research Database (Denmark)

    Loffredo, G.; Pelliciari, C.; Frontera, F.

    2003-01-01

    We describe the X-ray facility developed for the calibration of the X-ray monitor JEM-X on board the INTEGRAL satellite. The apparatus allowed the scanning of the detector geometric area with a pencil beam of desired energy over the major part of the passband of the instrument. The monochromatic...

  8. Resource Conservation and Recovery Act ground-water monitoring projects for Hanford Facilities: Progress report, July 1--September 30, 1989

    International Nuclear Information System (INIS)

    Smith, R.M.; Bates, D.J.; Lundgren, R.E.

    1989-12-01

    This is Volume 1 of a two-volume document that describes the progress of 14 Hanford Site ground-water monitoring projects for the period July 1 to September 30, 1989. This volume discusses the projects; Volume 2 provides as-built diagrams, completion/inspection reports, drilling logs, and geophysical logs for wells drilled, completed, or logged during this period. Volume 2 can be found on microfiche in the back pocket of Volume 1. The work described in this document is conducted by the Pacific Northwest Laboratory under the management of Westinghouse Hanford Company for the US Department of Energy. Concentrations of ground-water constituents are compared to federal drinking water standards throughout this document for reference purposes. All drinking water supplied from the sampled aquifer meets regulatory standards for drinking water quality

  9. Computer based plant display and digital control system of Wolsong NPP Tritium Removal Facility

    International Nuclear Information System (INIS)

    Jung, C.; Smith, B.; Tosello, G.; Grosbois, J. de; Ahn, J.

    2007-01-01

    The Wolsong Tritium Removal Facility (WTRF) is an AECL-designed, first-of-a-kind facility that removes tritium from the heavy water used in systems of the CANDU reactors in operation at the Wolsong Nuclear Power Plant in South Korea. The Plant Display and Control System (PDCS) provides digital plant monitoring and control for the WTRF and offers the advantages of state-of-the-art digital control system technologies for operations and maintenance. The overall features of the PDCS are described, along with some of the specific approaches taken on the project to save construction time and costs, to reduce in-service life-cycle costs, and to improve quality. The PDCS consists of two separate computer sub-systems: the Digital Control System (DCS) and the Plant Display System (PDS). The PDS provides the computer-based Human Machine Interface (HMI) for operators, and permits efficient supervisory or device-level monitoring and control. A System Maintenance Console (SMC) is included in the PDS for software and hardware configuration and on-line maintenance. A Historical Data System (HDS) is also included in the PDS as a data server that continuously captures and logs process data and events for long-term storage and on-demand selective retrieval. The PDCS of WTRF has been designed and implemented based on an off-the-shelf PDS/DCS product combination, the DeltaV system from Emerson. The design includes fully redundant Ethernet network communications, controllers, and power supplies, and redundancy on selected I/O modules. The DCS provides fieldbus communications to interface with 3rd-party controllers supplied on specialized skids, and supports HART communication with field transmitters. The DCS control logic was configured using a modular and graphical approach. The control strategies are primarily device control modules implemented as autonomous control loops, using IEC 61131-3 Function Block Diagram (FBD) and Structured Text (ST)

  10. Computational Modeling in Support of High Altitude Testing Facilities, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in propulsion test facility design and development by assessing risks, identifying failure modes and predicting...

  11. Computational Modeling in Support of High Altitude Testing Facilities, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in rocket engine test facility design and development by assessing risks, identifying failure modes and predicting...

  12. Current status of ground motions evaluation in seismic design guide for nuclear power facilities. Investigation on IAEA and US.NRC

    International Nuclear Information System (INIS)

    Nakajima, Masato; Ito, Hiroshi; Hirata, Kazuta

    2009-01-01

    Recently, IAEA (International Atomic Energy Agency) and US.NRC (US Nuclear Regulatory Commission) published several standards and technical reports on seismic design and safety evaluation for nuclear power facilities. This report summarizes the current status of these international guidelines in order to identify future research topics. The main results are as follows. (1) IAEA: (a) the safety standard series defines two seismic design levels, with a design earthquake ground motion determined for each level; (b) a technical report discusses a new seismic design framework combining the conventional deterministic method with a risk-based method, although this framework has not been adopted in the safety guidelines. (2) USA: (a) US.NRC discusses a performance-based seismic design framework originally developed by a private organization (the American Society of Civil Engineers); (b) design earthquakes and earthquake ground motions are mainly evaluated and determined on the basis of probabilistic seismic hazard evaluations. (3) Future work: both IAEA and US.NRC have investigated implementing risk-based concepts into seismic design. Implementing risk-based concepts in regulation and seismic design makes it possible to account for various uncertainties and to improve accountability. Therefore, methods are needed for evaluating the seismic risk of structures and for quantitatively correlating seismic margin with seismic risk. Moreover, the probabilistic evaluation of earthquake ground motions required by risk-based design should be applied to sites in Japan. (author)

  13. Use of borehole and surface geophysics to investigate ground-water quality near a road-deicing salt-storage facility, Valparaiso, Indiana

    Science.gov (United States)

    Risch, M.R.; Robinson, B.A.

    2001-01-01

    Borehole and surface geophysics were used to investigate ground-water quality affected by a road-deicing salt-storage facility located near a public water-supply well field. From 1994 through 1998, borehole geophysical logs were made in an existing network of monitoring wells completed near the bottom of a thick sand aquifer. Logs of natural gamma activity indicated a uniform and negligible contribution of clay to the electromagnetic conductivity of the aquifer so that the logs of electromagnetic conductivity primarily measured the amount of dissolved solids in the ground water near the wells. Electromagnetic-conductivity data indicated the presence of a saltwater plume near the bottom of the aquifer. Increases in electromagnetic conductivity, observed from sequential logging of wells, indicated the saltwater plume had moved north about 60 to 100 feet per year between 1994 and 1998. These rates were consistent with estimates of horizontal ground-water flow based on velocity calculations made with hydrologic data from the study area.

  14. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year in which the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge (CSA06) last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations, and Data Operations. These groups work closely with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping up to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  15. A compressive sensing-based computational method for the inversion of wide-band ground penetrating radar data

    Science.gov (United States)

    Gelmini, A.; Gottardi, G.; Moriyama, T.

    2017-10-01

    This work presents an innovative computational approach for the inversion of wideband ground penetrating radar (GPR) data. The retrieval of the dielectric characteristics of sparse scatterers buried in a lossy soil is performed by combining a multi-task Bayesian compressive sensing (MT-BCS) solver and a frequency hopping (FH) strategy. The developed methodology is able to benefit from the regularization capabilities of the MT-BCS as well as to exploit the multi-chromatic informative content of GPR measurements. A set of numerical results is reported in order to assess the effectiveness of the proposed GPR inverse scattering technique, as well as to compare it to a simpler single-task implementation.
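    The sparsity-promoting inversion at the heart of this approach can be illustrated with a much simpler single-task greedy solver, orthogonal matching pursuit (OMP), applied to a generic linear scattering model. This is a stand-in for the actual MT-BCS solver and frequency-hopping strategy; the measurement operator and all names are synthetic assumptions.

```python
import numpy as np

def omp(A, y, n_nonzero):
    """Orthogonal Matching Pursuit: greedy sparse solution of y ~ A x.
    A far simpler relative of the MT-BCS solver, used here only to
    illustrate sparsity-regularized inversion of scattering data."""
    residual, support = y.copy(), []
    for _ in range(n_nonzero):
        corr = np.abs(A.conj().T @ residual)   # match residual to atoms
        corr[support] = 0.0                    # do not re-pick an atom
        support.append(int(np.argmax(corr)))
        As = A[:, support]
        x_s, *_ = np.linalg.lstsq(As, y, rcond=None)
        residual = y - As @ x_s                # re-fit on current support
    x = np.zeros(A.shape[1], dtype=A.dtype)
    x[support] = x_s
    return x

# synthetic example: 40 measurements, 120 unknowns, 3 sparse "scatterers"
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 120)) / np.sqrt(40)
x_true = np.zeros(120)
x_true[[7, 55, 90]] = [1.0, -0.8, 0.6]
y = A @ x_true
x_hat = omp(A, y, n_nonzero=3)
print("recovered support:", sorted(np.flatnonzero(x_hat)))
```

A frequency-hopping scheme would run such an inversion per frequency step, feeding each solution forward as prior information for the next; MT-BCS additionally shares a sparsity prior across the frequency tasks.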

  16. The Emergence of Large-Scale Computer Assisted Summative Examination Facilities in Higher Education

    NARCIS (Netherlands)

    Draaijer, S.; Warburton, W. I.

    2014-01-01

    A case study is presented of VU University Amsterdam where a dedicated large-scale CAA examination facility was established. In the facility, 385 students can take an exam concurrently. The case study describes the change factors and processes leading up to the decision by the institution to

  17. Computational rationalization for the observed ground-state multiplicities of fluorinated acylnitrenes.

    Science.gov (United States)

    Sherman, Matthew P; Jenks, William S

    2014-10-03

    Computational methods are used to investigate the mechanism by which fluorination of acetylnitrene reduces the stabilization of the singlet configuration. ΔEST is made more positive (favoring the triplet state) by 1.9, 1.3, and 0.7 kcal/mol by the addition of the first, second, and third fluorine, respectively, at the CR-CC(2,3)/6-311(3df,2p)//B3LYP/6-31G(d,p) level of theory. Smaller effects observed with substitution of β-fluorines in propanoylnitrene derivatives and examination of molecular geometries and orbitals demonstrate that the effect is due to inductive electron withdrawal by the fluorines, rather than hyperconjugation.

  18. Electrical Subsurface Grounding Analysis

    International Nuclear Information System (INIS)

    J.M. Calle

    2000-01-01

    The purpose and objective of this analysis is to determine the present grounding requirements of the Exploratory Studies Facility (ESF) subsurface electrical system and to verify that the actual grounding system and devices satisfy the requirements

  19. Potential applications of artificial intelligence in computer-based management systems for mixed waste incinerator facility operation

    International Nuclear Information System (INIS)

    Rivera, A.L.; Singh, S.P.N.; Ferrada, J.J.

    1991-01-01

    The Department of Energy/Oak Ridge Field Office (DOE/OR) operates a mixed waste incinerator facility at the Oak Ridge K-25 Site, designed for the thermal treatment of incinerable liquid, sludge, and solid waste regulated under the Toxic Substances Control Act (TSCA) and the Resource Conservation and Recovery Act (RCRA). Operation of the TSCA Incinerator is highly constrained as a result of the regulatory, institutional, technical, and resource-availability requirements. This presents an opportunity to apply computer technology as a technical resource for mixed waste incinerator operation, helping to promote and sustain a continuous performance-improvement process while demonstrating compliance. This paper describes mixed waste incinerator facility performance-oriented tasks that could be assisted by Artificial Intelligence (AI) and the requirements for AI tools that would implement these algorithms in a computer-based system. 4 figs., 1 tab

  20. Status of ground-water resources at U.S. Navy Support Facility, Diego Garcia; summary of hydrologic and climatic data, January 1993 through December 1995

    Science.gov (United States)

    Torikai, J.D.

    1996-01-01

    This report contains hydrologic and climatic data that describe the status of ground-water resources at U.S. Navy Support Facility, Diego Garcia. Data presented are from January 1993 through December 1995, although the report focuses on hydrologic events from October through December 1995 (fourth quarter of 1995). Cumulative rainfall for October through December 1995 was about 41 inches, which is 32 percent more than the mean cumulative rainfall of about 31 inches for October through December. The period October through December is within the annual wet season. Mean cumulative rainfall is calculated for the fixed base period 1951-90. Ground-water withdrawal during October through December 1995 averaged 931,000 gallons per day. Withdrawal for the same 3 months in 1994 averaged 902,900 gallons per day. Patterns of withdrawal during the fourth quarter of 1995 did not change significantly since 1993 at all five ground-water production areas. At the end of December 1995, the chloride concentration of the composite water supply was 60 milligrams per liter, well below the 250 milligrams per liter secondary drinking-water standard established by the U.S. Environmental Protection Agency. Chloride concentrations of the composite water supply from October through December 1995 ranged between 28 and 67 milligrams per liter. Chloride concentration of ground water in monitoring wells at Cantonment and Air Operations continued to decrease during the fourth quarter of 1995, with water from the deepest monitoring wells decreasing in chloride concentration by as much as 2,000 milligrams per liter. This trend follows increases in chloride concentration during the first half of 1995. A fuel leak at Air Operations caused the shutdown of ten wells in May 1991. Four of the wells resumed pumping for water-supply purposes in April 1992. The remaining six wells are being used to hydraulically divert fuel migration away from water-supply wells by recirculating about 150,000 gallons of water each day.

  1. Status of ground-water resources at U.S. Navy Support Facility, Diego Garcia; summary of hydrologic and climatic data, January 1993 through September 1995

    Science.gov (United States)

    Torikai, J.D.

    1996-01-01

    This report contains hydrologic and climatic data that describe the status of ground-water resources at U.S. Navy Support Facility, Diego Garcia. Data presented are from January 1993 through September 1995, although the report focuses on hydrologic events from July through September 1995. Cumulative rainfall for July through September 1995 was about 15 inches which is 32 percent less than the mean cumulative rainfall of about 22 inches for July through September. July and August are within the annual dry season, while September is the start of the annual wet season. Mean cumulative rainfall is calculated for the fixed base period 1951-90. Ground-water withdrawal during July through September 1995 averaged 888,500 gallons per day. Withdrawal for the same 3 months in 1994 averaged 919,400 gallons per day. Patterns of withdrawal during the third quarter of 1995 did not change significantly since 1993 at all five ground-water production areas. At the end of September 1995, the chloride concentration of the composite water supply was 51 milligrams per liter, well below the 250 milligrams per liter secondary drinking-water standard established by the U.S. Environmental Protection Agency. Chloride concentrations of the composite water supply from July through September 1995 ranged between 42 and 68 milligrams per liter. Chloride concentration of ground water in monitoring wells at Cantonment and Air Operations continued to increase since April 1995, with water from the deepest monitoring wells increasing in chloride concentration by as much as 2,000 milligrams per liter. A fuel leak at Air Operations caused the shutdown of ten wells in May 1991. Four of the wells resumed pumping for water-supply purposes in April 1992. The remaining six wells are being used to hydraulically divert fuel migration away from water-supply wells by recirculating about 150,000 gallons of water each day.

  2. Clinical, pathological, and radiological characteristics of solitary ground-glass opacity lung nodules on high-resolution computed tomography

    Directory of Open Access Journals (Sweden)

    Qiu ZX

    2016-09-01

    Zhi-Xin Qiu,1 Yue Cheng,1 Dan Liu,1 Wei-Ya Wang,2 Xia Wu,2 Wei-Lu Wu,2 Wei-Min Li1,2 1Department of Respiratory Medicine, 2Department of Pathology, West China Hospital, Sichuan University, Chengdu, People’s Republic of China Background: Lung nodules are being detected at an increasing rate year by year with high-resolution computed tomography (HRCT) being widely used. Ground-glass opacity nodule is one of the special types of pulmonary nodules that is confirmed to be closely associated with early stage of lung cancer. Very little is known about solitary ground-glass opacity nodules (SGGNs). In this study, we analyzed the clinical, pathological, and radiological characteristics of SGGNs on HRCT. Methods: A total of 95 resected SGGNs were evaluated with HRCT scan. The clinical, pathological, and radiological characteristics of these cases were analyzed. Results: Eighty-one adenocarcinoma and 14 benign nodules were observed. The nodules included 12 (15%) adenocarcinoma in situ (AIS), 14 (17%) minimally invasive adenocarcinoma (MIA), and 55 (68%) invasive adenocarcinoma (IA). No patients with recurrence to date have been identified. The positive expression rates of anaplastic lymphoma kinase and ROS-1 (proto-oncogene tyrosine-protein kinase ROS) were only 2.5% and 8.6%, respectively. The specificity and accuracy of HRCT for invasive lung adenocarcinoma were 85.2% and 87.4%. The standard uptake values of only two patients determined by 18F-FDG positron emission tomography/computed tomography (PET/CT) were above 2.5. The size, density, shape, and pleural tag of nodules were significant factors that differentiated IA from AIS and MIA. Moreover, the size, shape, margin, pleural tag, vascular cluster, bubble-like sign, and air bronchogram of nodules were significant determinants for mixed ground-glass opacity nodules (all P<0.05). Conclusion: We analyzed the clinical, pathological, and radiological characteristics of SGGNs on HRCT and found that the size, density

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview: In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty, and Computing Run Coordination. At the Computing Resource Board (CRB) in October, a model to account for service work at Tier-2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations: Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested at full scale during this period and proved to be very efficient: 30 Computing Shift Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  4. Testing SLURM open source batch system for a Tier1/Tier2 HEP computing facility

    Science.gov (United States)

    Donvito, Giacinto; Salomoni, Davide; Italiano, Alessandro

    2014-06-01

    This work describes the testing activities carried out to verify whether the SLURM batch system could be used as the production batch system of a typical Tier1/Tier2 HEP computing center. SLURM (Simple Linux Utility for Resource Management) is an open-source batch system developed mainly by the Lawrence Livermore National Laboratory, SchedMD, Linux NetworX, Hewlett-Packard, and Groupe Bull. Testing focused both on verifying the functionalities of the batch system and on the performance that SLURM is able to offer. We first describe our initial set of requirements. Functionally, we started configuring SLURM so that it replicates all the scheduling policies already used in production in the computing centers involved in the test, i.e. INFN-Bari and the INFN-Tier1 at CNAF, Bologna. Currently, the INFN-Tier1 is using IBM LSF (Load Sharing Facility), while INFN-Bari, an LHC Tier2 for both CMS and Alice, is using Torque as resource manager and MAUI as scheduler. We show how we configured SLURM in order to enable several scheduling functionalities such as Hierarchical FairShare, Quality of Service, user-based and group-based priority, limits on the number of jobs per user/group/queue, job age scheduling, job size scheduling, and scheduling of consumable resources. We then show how different job types, such as serial, MPI, multi-thread, whole-node, and interactive jobs, can be managed. Tests on the use of ACLs on queues and on other resources are then described. A peculiar SLURM feature we also verified is event triggers, useful for configuring specific actions on each possible event in the batch system. We also tested highly available configurations for the master node. This feature is of paramount importance, since a mandatory requirement in our scenarios is to have a working farm cluster even in case of hardware failure of the server(s) hosting the batch system. Among our requirements there is also the possibility to deal with pre-execution and post
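The hierarchical fair-share scheduling mentioned in the abstract can be illustrated with a small sketch. This is our own illustrative code, not SLURM's implementation; the weight values and function names are hypothetical. It uses the classic fair-share formula of SLURM's multifactor priority plugin, F = 2^(-U/S), where U is an association's normalized historical usage and S its normalized share:

```python
# Illustrative sketch (not SLURM source code) of the classic fair-share
# factor used by SLURM's multifactor priority plugin: F = 2 ** (-U / S).
# F tends toward 1 for under-served associations and toward 0 for heavy
# users. The priority weights below are hypothetical examples.

def fairshare_factor(usage: float, share: float) -> float:
    """Classic SLURM fair-share factor, 2^(-usage/share)."""
    if share <= 0.0:
        return 0.0  # an association with no share gets no fair-share priority
    return 2.0 ** (-usage / share)

def job_priority(fs: float, age: float, size: float,
                 w_fs: int = 10000, w_age: int = 1000, w_size: int = 100) -> int:
    """Weighted sum of normalized factors, as in multifactor priority."""
    return int(w_fs * fs + w_age * age + w_size * size)

# A user who has consumed exactly their share lands at the midpoint:
print(fairshare_factor(0.25, 0.25))  # 0.5
```

In a real deployment these factors and weights are configured in slurm.conf rather than computed by hand; the sketch only shows why a user's priority decays smoothly as their consumed share grows.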

  5. Testing SLURM open source batch system for a Tier1/Tier2 HEP computing facility

    International Nuclear Information System (INIS)

    Donvito, Giacinto; Italiano, Alessandro; Salomoni, Davide

    2014-01-01

    This work describes the testing activities carried out to verify whether the SLURM batch system could be used as the production batch system of a typical Tier1/Tier2 HEP computing center. SLURM (Simple Linux Utility for Resource Management) is an open-source batch system developed mainly by the Lawrence Livermore National Laboratory, SchedMD, Linux NetworX, Hewlett-Packard, and Groupe Bull. Testing focused both on verifying the functionalities of the batch system and on the performance that SLURM is able to offer. We first describe our initial set of requirements. Functionally, we started configuring SLURM so that it replicates all the scheduling policies already used in production in the computing centers involved in the test, i.e. INFN-Bari and the INFN-Tier1 at CNAF, Bologna. Currently, the INFN-Tier1 is using IBM LSF (Load Sharing Facility), while INFN-Bari, an LHC Tier2 for both CMS and Alice, is using Torque as resource manager and MAUI as scheduler. We show how we configured SLURM in order to enable several scheduling functionalities such as Hierarchical FairShare, Quality of Service, user-based and group-based priority, limits on the number of jobs per user/group/queue, job age scheduling, job size scheduling, and scheduling of consumable resources. We then show how different job types, such as serial, MPI, multi-thread, whole-node, and interactive jobs, can be managed. Tests on the use of ACLs on queues and on other resources are then described. A peculiar SLURM feature we also verified is event triggers, useful for configuring specific actions on each possible event in the batch system. We also tested highly available configurations for the master node. This feature is of paramount importance, since a mandatory requirement in our scenarios is to have a working farm cluster even in case of hardware failure of the server(s) hosting the batch system. Among our requirements there is also the possibility to deal with pre-execution and post

  6. Stochastic modeling and control system designs of the NASA/MSFC Ground Facility for large space structures: The maximum entropy/optimal projection approach

    Science.gov (United States)

    Hsia, Wei-Shen

    1986-01-01

    In the Control Systems Division of the Systems Dynamics Laboratory of the NASA/MSFC, a Ground Facility (GF), in which the dynamics and control system concepts being considered for Large Space Structures (LSS) applications can be verified, was designed and built. One of the important aspects of the GF is to design an analytical model that matches the experimental data as closely as possible so that a feasible control law can be generated. Using Hyland's Maximum Entropy/Optimal Projection approach, a procedure was developed in which the maximum entropy principle is used for stochastic modeling and the optimal projection technique is used to design a reduced-order dynamic compensator for a high-order plant.

  7. Comparison of Stereo-PIV and Plenoptic-PIV Measurements on the Wake of a Cylinder in NASA Ground Test Facilities.

    Science.gov (United States)

    Fahringer, Timothy W.; Thurow, Brian S.; Humphreys, William M., Jr.; Bartram, Scott M.

    2017-01-01

    A series of comparison experiments have been performed using a single-camera plenoptic PIV measurement system to ascertain the system's performance capabilities in terms of suitability for use in NASA ground test facilities. A proof-of-concept demonstration was performed in the Langley Advanced Measurements and Data Systems Branch 13-inch (33-cm) Subsonic Tunnel to examine the wake of a series of cylinders at a Reynolds number of 2500. Accompanying the plenoptic-PIV measurements was an ensemble of complementary stereo-PIV measurements. The stereo-PIV measurements were used as a truth measurement to assess the ability of the plenoptic-PIV system to capture relevant 3D/3C flow field features in the cylinder wake. Six individual tests were conducted as part of the test campaign using three different cylinder diameters mounted in two orientations in the tunnel test section. This work presents a comparison of measurements with the cylinders mounted horizontally (generating a 2D flow field in the x-y plane). Results show that in general the plenoptic-PIV measurements match those produced by the stereo-PIV system. However, discrepancies were observed in extracted profiles of the fluctuating velocity components. It is speculated that spatial smoothing of the vector fields in the stereo-PIV system could account for the observed differences. Nevertheless, the plenoptic-PIV system performed extremely well at capturing the flow field features of interest and can be considered a viable alternative to traditional PIV systems in smaller NASA ground test facilities with limited optical access.

  8. Status of ground-water resources at U.S. Navy Support Facility, Diego Garcia; summary of hydrologic and climatic data, January 1994 through March 1996

    Science.gov (United States)

    Torikai, J.D.

    1996-01-01

    This report describes the status of ground-water resources at U.S. Navy Support Facility, Diego Garcia. Data presented are from January 1994 through March 1996, with a focus on data from January through March 1996 (first quarter of 1996). A complete database of ground-water withdrawals and chloride-concentration records since 1985 is maintained by the U.S. Geological Survey. Cumulative rainfall for January through March 1996 was about 30 inches, which is 9 percent less than the mean cumulative rainfall of about 33 inches for January through March. The period January through February is the end of the annual wet season, while March marks the start of the annual dry season. Ground-water withdrawal during January through March 1996 averaged 970,300 gallons per day. Withdrawal for the same 3 months in 1995 averaged 894,600 gallons per day. Withdrawal patterns during the first quarter of 1996 have not changed significantly since 1991, with the Cantonment and Air Operations areas supplying about 99 percent of total islandwide pumpage. At the end of March 1996, the chloride concentrations of water from the elevated tanks at Cantonment and Air Operations were 47 and 80 milligrams per liter, respectively. The chloride data from all five production areas showed no significant upward or downward trends throughout the first quarter of 1996. Potable levels of chloride concentrations have been maintained by adjusting individual pumping rates, and also because of the absence of long-term droughts. Chloride concentration of ground water in monitoring wells at Cantonment and Air Operations also showed no significant trends throughout the first quarter of 1996. Chloride concentrations have been about the same since the last quarter of 1995. A fuel-pipeline leak at Air Operations in May 1991 decreased total islandwide withdrawals by 15 percent. This lost pumping capacity is being offset by increased pumpage at Cantonment. Six wells do not contribute to the water supply because they

  9. Status of ground-water resources at U.S. Navy Support Facility, Diego Garcia; summary of hydrologic and climatic data, January 1994 through June 1996

    Science.gov (United States)

    Torikai, J.D.

    1996-01-01

    This report describes the status of ground-water resources at U.S. Navy Support Facility, Diego Garcia. Data presented are from January 1994 through June 1996, with a focus on data from April through June 1996 (second quarter of 1996). A complete database of ground-water withdrawals and chloride-concentration records since 1985 is maintained by the U.S. Geological Survey. Cumulative rainfall for April through June 1996 was 22.64 inches, which is 12 percent more than the mean cumulative rainfall of 20.21 inches for April through June. The period April through June is part of the annual dry season. Ground-water withdrawal during April through June 1996 averaged 1,048,000 gallons per day. Withdrawal for the same 3 months in 1995 averaged 833,700 gallons per day. Withdrawal patterns during the second quarter of 1996 have not changed significantly since 1991, with the Cantonment and Air Operations areas supplying about 99 percent of total islandwide pumpage. At the end of June 1996, the chloride concentrations of water from the elevated tanks at Cantonment and Air Operations were 52 and 80 milligrams per liter, respectively. The chloride data from all five production areas showed no significant upward or downward trends throughout the second quarter of 1996. Potable levels of chloride concentrations have been maintained by adjusting individual pumping rates, and also because of the absence of long-term droughts. Chloride concentration of ground water in monitoring wells at Cantonment and Air Operations also showed no significant trends throughout the second quarter of 1996. Chloride concentrations have been about the same since the last quarter of 1995. A fuel-pipeline leak at Air Operations in May 1991 decreased total islandwide withdrawals by 15 percent. This lost pumping capacity is being offset by increased pumpage at Cantonment. Six wells do not contribute to the water supply because they are being used to hydraulically divert fuel migration away from water

  10. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    International Nuclear Information System (INIS)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C.
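The straight-line Gaussian plume model underlying codes such as ANEMOS can be sketched in a few lines. This is the textbook plume equation, not the ANEMOS implementation; the power-law dispersion constants and function names below are illustrative placeholders that would in practice come from a stability-class parameterization:

```python
import math

# Textbook straight-line Gaussian plume (illustrative, not ANEMOS code).
# Q: source strength [Bq/s], u: wind speed [m/s], H: effective release
# height [m]; sigma_y/sigma_z are dispersion coefficients at downwind
# distance x. The (z + H) term is the image source modeling ground
# reflection of the plume.

def sigma(x: float, a: float, b: float) -> float:
    """Power-law dispersion coefficient, sigma = a * x**b (placeholder
    constants; real values depend on atmospheric stability class)."""
    return a * x ** b

def plume_concentration(Q, u, x, y, z, H,
                        ay=0.08, by=0.90, az=0.06, bz=0.85):
    """Air concentration [Bq/m^3] at (x, y, z) for a point release."""
    sy, sz = sigma(x, ay, by), sigma(x, az, bz)
    lateral = math.exp(-y**2 / (2 * sy**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sz**2)) +
                math.exp(-(z + H)**2 / (2 * sz**2)))  # ground reflection
    return Q / (2 * math.pi * u * sy * sz) * lateral * vertical

# Ground-level centerline concentration 1 km downwind of a 50-m stack:
c = plume_concentration(Q=1.0e6, u=5.0, x=1000.0, y=0.0, z=0.0, H=50.0)
```

Sector-averaged concentrations, deposition, plume rise, and daughter in-growth, as described in the abstract, are layered on top of this basic kernel.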

  11. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C.

  12. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground Based Computation and Control Systems and Human Health and Safety

    Science.gov (United States)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

    In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems and on human health and safety, as well as the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in Earth-surface, atmospheric flight, and space flight environments. Three twentieth century technological developments, 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools (e.g. ground-based test methods as well as high-energy particle transport and reaction codes) needed to design, test, and verify the safety and reliability of modern complex electronic systems, as well as to evaluate effects on human health and safety. The effects of primary cosmic ray particles, and secondary particle showers produced by nuclear reactions with spacecraft materials, can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground-based computational and controls systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Accumulation of both primary cosmic ray and secondary cosmic ray induced particle shower radiation dose is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO).

  13. Effects on radionuclide concentrations by cement/ground-water interactions in support of performance assessment of low-level radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Krupka, K.M.; Serne, R.J.

    1998-05-01

    The US Nuclear Regulatory Commission is developing a technical position document that provides guidance regarding the performance assessment of low-level radioactive waste disposal facilities. This guidance considers the effects that the chemistry of the vault disposal system may have on radionuclide release. The geochemistry of pore waters buffered by cementitious materials in the disposal system will be different from the local ground water. Therefore, the cement-buffered environment needs to be considered within the source term calculations if credit is taken for solubility limits and/or sorption of dissolved radionuclides within disposal units. A literature review was conducted on methods to model pore-water compositions resulting from reactions with cement, experimental studies of cement/water systems, natural analogue studies of cement and concrete, and radionuclide solubilities experimentally determined in cement pore waters. Based on this review, geochemical modeling was used to calculate maximum concentrations for americium, neptunium, nickel, plutonium, radium, strontium, thorium, and uranium for pore-water compositions buffered by cement and local ground water. Another literature review was completed on radionuclide sorption behavior onto fresh cement/concrete, where the pore-water pH will be greater than or equal to 10. Based on this review, a database was developed of preferred minimum distribution coefficient values for these radionuclides in cement/concrete environments.

  14. Effects on radionuclide concentrations by cement/ground-water interactions in support of performance assessment of low-level radioactive waste disposal facilities

    Energy Technology Data Exchange (ETDEWEB)

    Krupka, K.M.; Serne, R.J. [Pacific Northwest National Lab., Richland, WA (United States)

    1998-05-01

    The US Nuclear Regulatory Commission is developing a technical position document that provides guidance regarding the performance assessment of low-level radioactive waste disposal facilities. This guidance considers the effects that the chemistry of the vault disposal system may have on radionuclide release. The geochemistry of pore waters buffered by cementitious materials in the disposal system will be different from the local ground water. Therefore, the cement-buffered environment needs to be considered within the source term calculations if credit is taken for solubility limits and/or sorption of dissolved radionuclides within disposal units. A literature review was conducted on methods to model pore-water compositions resulting from reactions with cement, experimental studies of cement/water systems, natural analogue studies of cement and concrete, and radionuclide solubilities experimentally determined in cement pore waters. Based on this review, geochemical modeling was used to calculate maximum concentrations for americium, neptunium, nickel, plutonium, radium, strontium, thorium, and uranium for pore-water compositions buffered by cement and local ground water. Another literature review was completed on radionuclide sorption behavior onto fresh cement/concrete, where the pore-water pH will be greater than or equal to 10. Based on this review, a database was developed of preferred minimum distribution coefficient values for these radionuclides in cement/concrete environments.

  15. Research on advancement of method for evaluating aseismatic ability of rock discontinuity plane in ground and surrounding slopes of nuclear power facilities

    International Nuclear Information System (INIS)

    Kusunose, Kinichiro; Cho, Akio; Takahashi, Manabu; Kamai, Toshitaka

    1997-01-01

    The purpose of this research is to carry out the technical development required for exploring with high accuracy the distribution and shapes of the discontinuity planes in rocks in the ground and surrounding cut-off slopes of nuclear power facilities, and to advance the techniques for interpreting and quantitatively evaluating the stability of the discontinuity planes against earthquakes. This research consists of two themes: research on the method of investigating the three-dimensional distribution of crevices in the ground, and research on the method of evaluating the aseismatic ability of slopes. As for the first theme, tomography, one of the techniques for exploring underground structure with elastic waves, is explained, and the development of the 12-channel receiver and the program for multi-channel analysis and processing of waveforms is reported. As for the second theme, stability analysis was carried out on three actual cases of landslide. The equation for the stability analysis is shown, and the results are reported. The strength at the time of formation of the separation plane gives the most appropriate result. (K.I.)

  16. Status of ground-water resources at U.S. Navy Support Facility, Diego Garcia; summary of hydrologic and climatic data, January 1992 through September 1994

    Science.gov (United States)

    Torikai, J.D.

    1995-01-01

    This report contains hydrologic and climatic data that describe the status of ground-water resources at U.S. Navy Support Facility, Diego Garcia. Data are presented from January 1992 through September 1994. This report concentrates on data from July through September 1994, and references historic data from 1992 through June 1994. Total rainfall for the first nine months of 1994 was about 77 inches, which is 72 percent of the mean annual rainfall of 106 inches. In comparison, total rainfall for the first nine months of 1992 and 1993 was 67 inches and 69 inches, respectively. Annual rainfall totals in 1992 and 1993 were 93 inches and 95 inches, respectively. Ground-water withdrawal during July through September 1994 averaged 919,400 gallons per day, while annual withdrawals in 1992 and 1993 averaged 935,900 gallons per day and 953,800 gallons per day, respectively. At the end of September 1994, the chloride concentration of the composite water supply was 56 milligrams per liter, well below the 250 milligrams per liter secondary drinking-water standard established by the U.S. Environmental Protection Agency. Chloride concentrations of the composite water supply from July through September 1994 ranged between 51 and 78 milligrams per liter. Chloride concentration of ground water in monitoring wells at Cantonment and Air Operations increased in July and August but leveled off or decreased in September. There has been a general trend of increasing chloride concentrations in the deeper monitoring wells since the 1992 dry season, which began in March 1992. A fuel leak at Air Operations caused the shutdown of ten wells in May 1991. Four of the wells resumed pumping for water-supply purposes in April 1992. The remaining six wells are being used to hydraulically contain and divert fuel migration by recirculating 150,000 gallons of water each day.

  17. Status of ground-water resources at U.S. Navy Support Facility, Diego Garcia; summary of hydrologic and climatic data, January 1993 through March 1995

    Science.gov (United States)

    Torikai, J.D.

    1995-01-01

    This report contains hydrologic and climatic data that describe the status of ground-water resources at U.S. Navy Support Facility, Diego Garcia. Data presented are from January 1993 through March 1995, although the report focuses on hydrologic events from January through March 1995. Cumulative rainfall for January through March 1995 was about 42 inches, which is higher than the mean cumulative rainfall of about 33 inches for the same 3 months in a year. January and February are part of the annual wet season and March is the start of the annual dry season. Rainfall for each month was above the respective mean monthly rainfall. Ground-water withdrawal during January through March 1995 averaged 894,600 gallons per day. Withdrawal for the same 3 months in 1994 averaged 999,600 gallons per day. At the end of March 1995, the chloride concentration of the composite water supply was 26 milligrams per liter, well below the 250 milligrams per liter secondary drinking-water standard established by the U.S. Environmental Protection Agency. Chloride concentrations of the composite water supply from January through March 1995 ranged between 19 and 49 milligrams per liter. Chloride concentration of ground water in monitoring wells at Cantonment and Air Operations decreased since November 1994. The deepest monitoring wells show declines in chloride concentration by as much as 4,000 milligrams per liter. A fuel leak at Air Operations caused the shutdown of ten wells in May 1991. Four of the wells resumed pumping for water-supply purposes in April 1992. The remaining six wells are being used to hydraulically contain and divert fuel migration by recirculating about 150,000 gallons of water each day.

  18. Status of ground-water resources at U.S. Navy Support Facility, Diego Garcia; summary of hydrologic and climatic data, January 1993 through June 1995

    Science.gov (United States)

    Torikai, J.D.

    1995-01-01

    This report contains hydrologic and climatic data that describe the status of ground-water resources at U.S. Navy Support Facility, Diego Garcia. Data presented are from January 1993 through June 1995, although the report focuses on hydrologic events from April through June 1995. Cumulative rainfall for April through June 1995 was about 14 inches, which is 70 percent of the mean cumulative rainfall of about 20 inches for the same 3 months in a year. April through June is within the annual dry season. Rainfall for each month was below the respective mean monthly rainfall. All mean rainfall values are calculated for the fixed base period 1951-90. Ground-water withdrawal during April through June 1995 averaged 833,700 gallons per day. Withdrawal for the same 3 months in 1994 averaged 950,000 gallons per day. At the end of June 1995, the chloride concentration of the composite water supply was 57 milligrams per liter, well below the 250 milligrams per liter secondary drinking-water standard established by the U.S. Environmental Protection Agency. Chloride concentrations of the composite water supply from April through June 1995 ranged between 26 and 62 milligrams per liter. Chloride concentration of ground water in monitoring wells at Cantonment and Air Operations increased since April 1995, with water from the deepest monitoring wells increasing in chloride concentration by about 1,000 milligrams per liter. A fuel leak at Air Operations caused the shutdown of ten wells in May 1991. Four of the wells resumed pumping for water-supply purposes in April 1992. The remaining six wells are being used to hydraulically contain and divert fuel migration away from water-supply wells by recirculating about 150,000 gallons of water each day.

  19. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground-Based Computation and Control Systems, and Human Health and Safety

    Science.gov (United States)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

    Three twentieth century technological developments, 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools needed to design, test, and verify the safety and reliability of modern complex technological systems. The effects of primary cosmic ray particles and secondary particle showers produced by nuclear reactions with the atmosphere can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground-based computational and controls systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Finally, accumulation of both primary cosmic ray and secondary cosmic ray induced particle shower radiation dose is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO). In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems as well as on human health, and the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in ground-based, atmospheric flight, and space flight environments. Ground test methods applied to microelectronic components and systems are used in combination with radiation transport and reaction codes to predict the performance of microelectronic systems in their operating environments. Similar radiation transport

  20. Norovirus contamination levels in ground water treatment systems used for food-catering facilities in South Korea.

    Science.gov (United States)

    Lee, Bo-Ram; Lee, Sung-Geun; Park, Jong-Hyun; Kim, Kwang-Yup; Ryu, Sang-Ryeol; Rhee, Ok-Jae; Park, Jeong-Woong; Lee, Jeong-Su; Paik, Soon-Young

    2013-07-02

    This study aimed to investigate norovirus contamination of groundwater treatment systems used in food-catering facilities located in South Korea. A nationwide study was performed in 2010. Water samples were collected and, for the analysis of water quality, the temperature, pH, turbidity, and residual chlorine content were assessed. To detect norovirus genotypes GI and GII, RT-PCR and semi-nested PCR were performed with specific NV-GI and NV-GII primer sets, respectively. The PCR products amplified from the detected strains were then subjected to sequence analyses. Of 1,090 samples collected in 2010, seven (0.64%) were found to be norovirus-positive. Specifically, one norovirus strain was identified to have the GI-6 genotype, and six GII strains had the GII, GII-3, GII-4, and GII-17 genotypes. The very low detection rate of norovirus most likely reflects the preventative measures used. However, this virus can spread rapidly from person to person in crowded, enclosed places such as the schools investigated in this study. To promote better public health and sanitary conditions, it is necessary to periodically monitor noroviruses that frequently cause epidemic food poisoning in South Korea.

  1. Ground-water flow and transport modeling of the NRC-licensed waste disposal facility, West Valley, New York

    International Nuclear Information System (INIS)

    Kool, J.B.; Wu, Y.S.

    1991-10-01

    This report describes a simulation study of groundwater flow and radionuclide transport from disposal at the NRC-licensed waste disposal facility in West Valley, New York. A transient, precipitation-driven flow model of the near-surface fractured till layer and underlying unweathered till was developed and calibrated against observed inflow data into a recently constructed interceptor trench for the period March--May 1990. The results suggest that lateral flow through the upper, fractured till layer may be more significant than indicated by previous, steady-state flow modeling studies. A conclusive assessment of the actual magnitude of lateral flow through the fractured till could not, however, be made. A primary factor contributing to this uncertainty is the unknown contribution of vertical infiltration through the interceptor trench cap to the total trench inflow. The second part of the investigation involved simulation of the migration of Sr-90, Cs-137 and Pu-239 from one of the fuel hull disposal pits. A first-order radionuclide leach rate with a rate coefficient of 10^-6/day was assumed to describe radionuclide release into the disposal pit. The simulations indicated that for wastes buried below the fractured till zone, no significant migration would occur. However, under the assumed conditions, significant lateral migration could occur for radionuclides present in the upper, fractured till zone. 23 refs., 68 figs., 12 tabs
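The first-order leach model assumed in this study reduces to a cumulative release fraction of 1 - exp(-kt). The sketch below is illustrative only: it uses the 10^-6/day rate coefficient quoted in the abstract, but the function name and sample times are invented for the example.

```python
import math

def cumulative_release_fraction(k_per_day: float, t_days: float) -> float:
    """Fraction of the buried inventory released by time t under a
    first-order leach model with rate coefficient k (per day)."""
    return 1.0 - math.exp(-k_per_day * t_days)

k = 1e-6  # per day, the rate coefficient assumed in the study
for years in (10, 100, 1000):
    frac = cumulative_release_fraction(k, years * 365.25)
    print(f"{years:>5} y: {frac:.4%} released")
```

At this rate less than a third of the inventory leaches out even after a thousand years, consistent with the slow-release assumption behind the migration results.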

  2. Draft of diagnostic techniques for primary coolant circuit facilities using control computer

    International Nuclear Information System (INIS)

    Suchy, R.; Procka, V.; Murin, V.; Rybarova, D.

    A method is proposed for in-service on-line diagnostics of selected primary circuit parts by means of a control computer. Computer processing will involve the measurements of neutron flux, pressure difference in pumps and in the core, and the vibrations of primary circuit mechanical parts. (H.S.)

  3. National Ignition Facility sub-system design requirements computer system SSDR 1.5.1

    International Nuclear Information System (INIS)

    Spann, J.; VanArsdall, P.; Bliss, E.

    1996-01-01

    This System Design Requirement document establishes the performance, design, development and test requirements for the Computer System, WBS 1.5.1, which is part of the NIF Integrated Computer Control System (ICCS). This document responds directly to the requirements detailed in ICCS (WBS 1.5), which is the document directly above

  4. A computational test facility for distributed analysis of gravitational wave signals

    International Nuclear Information System (INIS)

    Amico, P; Bosi, L; Cattuto, C; Gammaitoni, L; Punturo, M; Travasso, F; Vocca, H

    2004-01-01

    In the gravitational wave detector Virgo, the in-time detection of a gravitational wave signal from a coalescing binary stellar system is an intensive computational task. A parallel computing scheme using the message passing interface (MPI) is described. Performance results on a small-scale cluster are reported

  5. National Ignition Facility system design requirements NIF integrated computer controls SDR004

    International Nuclear Information System (INIS)

    Bliss, E.

    1996-01-01

    This System Design Requirement document establishes the performance, design, development, and test requirements for the NIF Integrated Computer Control System. The Integrated Computer Control System (ICCS) is covered in NIF WBS element 1.5. This document responds directly to the requirements detailed in the NIF Functional Requirements/Primary Criteria, and is supported by subsystem design requirements documents for each major ICCS Subsystem

  6. Laser performance operations model (LPOM): a computational system that automates the setup and performance analysis of the national ignition facility

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, M; House, R; Williams, W; Haynam, C; White, R; Orth, C; Sacks, R [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA, 94550 (United States)], E-mail: shaw7@llnl.gov

    2008-05-15

    The National Ignition Facility (NIF) is a stadium-sized facility containing a 192-beam, 1.8 MJ, 500-TW, 351-nm laser system together with a 10-m diameter target chamber with room for many target diagnostics. NIF will be the world's largest laser experimental system, providing a national center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. A computational system, the Laser Performance Operations Model (LPOM), has been developed and deployed that automates the laser setup process and accurately predicts laser energetics. LPOM determines the settings of the injection laser system required to achieve the desired main laser output, provides equipment protection, determines the diagnostic setup, and supplies post-shot data analysis and reporting.

  7. Advantages for the introduction of computer techniques in centralized supervision of radiation levels in nuclear facilities

    International Nuclear Information System (INIS)

    Vialettes, H.; Leblanc, P.

    1980-01-01

    A new computerized information system at the Saclay Center comprising 120 measuring channels is described. The advantages offered by this system with respect to the systems in use up to now are presented. Experimental results are given which support the argument that the system can effectively supervise the radioisotope facility at the Center. (B.G.)

  8. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    Energy Technology Data Exchange (ETDEWEB)

    Kostin, Mikhail [Michigan State Univ., East Lansing, MI (United States); Mokhov, Nikolai [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Niita, Koji [Research Organization for Information Science and Technology, Ibaraki-ken (Japan)

    2013-09-25

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90, or C. The module is largely independent of the radiation transport codes it is used with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and networks of workstations, where interference from other users is possible.
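The checkpoint facility described in this record can be illustrated with a toy tally loop. This is a hedged sketch of the general save-and-restart pattern only, not the actual C++/MPI module; the file name, function name, and seed are all invented for the example.

```python
import os
import pickle
import random

CKPT = "mc_checkpoint.pkl"  # hypothetical checkpoint file name

def run(n_histories: int, batch: int = 1000) -> float:
    """Accumulate a toy Monte Carlo tally, checkpointing after every batch
    so an interrupted run can resume from the saved state."""
    if os.path.exists(CKPT):                      # resume an interrupted run
        with open(CKPT, "rb") as f:
            done, tally, rng_state = pickle.load(f)
        random.setstate(rng_state)
    else:                                         # fresh start
        done, tally = 0, 0.0
        random.seed(42)
    while done < n_histories:
        m = min(batch, n_histories - done)
        for _ in range(m):
            tally += random.random()              # stand-in for one history
        done += m
        with open(CKPT, "wb") as f:               # save progress + RNG state
            pickle.dump((done, tally, random.getstate()), f)
    os.remove(CKPT)                               # clean up on completion
    return tally / n_histories

print(f"mean score: {run(10_000):.4f}")
```

Saving the random-number-generator state along with the tallies is the essential detail: a resumed run then produces the same statistics as an uninterrupted one.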

  9. Simple computational modeling for human extracorporeal irradiation using the BNCT facility of the RA-3 Reactor

    International Nuclear Information System (INIS)

    Farias, Ruben; Gonzalez, S.J.; Bellino, A.; Sztenjberg, M.; Pinto, J.; Thorp, Silvia I.; Gadan, M.; Pozzi, Emiliano; Schwint, Amanda E.; Heber, Elisa M.; Trivillin, V.A.; Zarza, Leandro G.; Estryk, Guillermo; Miller, M.; Bortolussi, S.; Soto, M.S.; Nigg, D.W.

    2009-01-01

    We present a simple computational model of the RA-3 reactor developed using the Monte Carlo transport code MCNP. The model parameters are adjusted in order to reproduce experimentally measured points in air, and the source validation is performed in an acrylic phantom. Performance analysis is carried out using computational models of animal extracorporeal irradiation in liver and lung. Analysis is also performed inside a neutron-shielded receptacle used for the irradiation of rats with a model of hepatic metastases. The computational model reproduces the experimental behavior in all the analyzed cases with a maximum difference of 10 percent. (author)

  10. A personal computer code for seismic evaluations of nuclear power plants facilities

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulos, A.J.; Graves, H.

    1990-01-01

    The program CARES (Computer Analysis for Rapid Evaluation of Structures) is an integrated computational system being developed by Brookhaven National Laboratory (BNL) for the U.S. Nuclear Regulatory Commission. It is specifically designed to be a personal computer (PC) operated package which may be used to determine the validity and accuracy of analysis methodologies used for structural safety evaluations of nuclear power plants. CARES is structured in a modular format. Each module performs a specific type of analysis, i.e., static or dynamic, linear or nonlinear, etc. This paper describes the various features which have been implemented into the Seismic Module of CARES

  11. GRASP [GRound-Water Adjunct Sensitivity Program]: A computer code to perform post-SWENT [simulator for water, energy, and nuclide transport] adjoint sensitivity analysis of steady-state ground-water flow: Technical report

    International Nuclear Information System (INIS)

    Wilson, J.L.; RamaRao, B.S.; McNeish, J.A.

    1986-11-01

    GRASP (GRound-Water Adjunct Sensitivity Program) computes measures of the behavior of a ground-water system and the system's performance for waste isolation, and estimates the sensitivities of these measures to system parameters. The computed measures are referred to as ''performance measures'' and include weighted squared deviations of computed and observed pressures or heads, local Darcy velocity components and magnitudes, boundary fluxes, and travel distance and time along travel paths. The sensitivities are computed by the adjoint method and are exact derivatives of the performance measures with respect to the parameters for the modeled system, taken about the assumed parameter values. GRASP presumes steady-state, saturated ground-water flow, and post-processes the results of a multidimensional (1-D, 2-D, 3-D) finite-difference flow code. This document describes the mathematical basis for the model, the algorithms and solution techniques used, and the computer code design. The implementation of GRASP is verified with simple one- and two-dimensional flow problems, for which analytical expressions of performance measures and sensitivities are derived. The linkage between GRASP and multidimensional finite-difference flow codes is described. This document also contains a detailed user's manual. The use of GRASP to evaluate nuclear waste disposal issues has been emphasized throughout the report. The performance measures and their sensitivities can be employed to assist in directing data collection programs, expedite model calibration, and objectively determine the sensitivity of projected system performance to parameters
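The weighted-squared-deviation performance measure described in this record is straightforward to write down. The toy below uses a one-dimensional Darcy head profile (invented for the example, not from GRASP) and checks the sensitivity dJ/dK by central finite difference; GRASP itself obtains such derivatives exactly from the adjoint solution rather than by perturbation.

```python
def heads(K, x, H0=100.0, q=0.5):
    """Heads for steady 1-D Darcy flow, h(x) = H0 - (q/K)*x (toy model)."""
    return [H0 - q * xi / K for xi in x]

def performance(K, x, h_obs, w):
    """Weighted squared deviation of computed vs. observed heads."""
    return sum(wi * (hc - ho) ** 2
               for wi, hc, ho in zip(w, heads(K, x), h_obs))

x = [10.0, 20.0, 30.0]          # observation locations
h_obs = [99.0, 98.1, 96.9]      # observed heads
w = [1.0, 1.0, 1.0]             # weights
K = 5.0                         # hydraulic conductivity

dK = 1e-4                       # central finite-difference step
dJdK = (performance(K + dK, x, h_obs, w)
        - performance(K - dK, x, h_obs, w)) / (2 * dK)
print(f"J = {performance(K, x, h_obs, w):.4f}, dJ/dK = {dJdK:.4f}")
```

The finite-difference estimate requires one extra model run per parameter; the appeal of the adjoint method used by GRASP is that it delivers the derivative with respect to every parameter from a single adjoint solve.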

  12. Specific features of organizing the computer-aided design of radio-electronic equipment for electrophysical facilities

    International Nuclear Information System (INIS)

    Mozin, I.V.; Vasil'ev, M.P.

    1985-01-01

    Problems of developing systems for computer-aided design (CAD) of radioelectronic equipment for large electrophysical facilities, such as charged particle accelerators of a new generation, are discussed. The PLATA subsystem, a part of the CAD system used for printed-circuit design, is described. The PLATA subsystem is utilized to design, on the average, up to 150 types of circuits a year, 100-120 of which belong to circuits of increased complexity. As a result, the labour productivity of a designer preparing documentation almost doubles

  13. Automated Computer-Based Facility for Measurement of Near-Field Structure of Microwave Radiators and Scatterers

    DEFF Research Database (Denmark)

    Mishra, Shantnu R.; Pavlasek, Tomas J. F.; Muresan, Letitia V.

    1980-01-01

    An automatic facility for measuring the three-dimensional structure of the near fields of microwave radiators and scatterers is described. The amplitude and phase for different polarization components can be recorded in analog and digital form using a microprocessor-based system. The stored data ... are transferred to a large high-speed computer for bulk processing and for the production of isophot and equiphase contour maps or profiles. The performance of the system is demonstrated through results for a single conical horn, for interacting rectangular horns, for multiple cylindrical scatterers ...

  14. Requirements Report Computer Software System for a Semi-Automatic Pipe Handling System and Fabrication Facility

    National Research Council Canada - National Science Library

    1980-01-01

    .... This report is to present the requirements of the computer software that must be developed to create Pipe Detail Drawings and to support the processing of the Pipe Detail Drawings through the Pipe Shop...

  15. Computing Facilities for AI: A Survey of Present and Near-Future Options

    OpenAIRE

    Fahlman, Scott

    1981-01-01

    At the recent AAAI conference at Stanford, it became apparent that many new AI research centers are being established around the country in industrial and governmental settings and in universities that have not paid much attention to AI in the past. At the same time, many of the established AI centers are in the process of converting from older facilities, primarily based on Decsystem-10 and Decsystem-20 machines, to a variety of newer options. At present, unfortunately, there is no simple an...

  16. Interactive simulation of nuclear power systems using a dedicated minicomputer - computer graphics facility

    International Nuclear Information System (INIS)

    Tye, C.; Sezgen, A.O.

    1980-01-01

    The design of control systems and operational procedures for large-scale nuclear power plant poses a difficult optimization problem requiring a lot of computational effort. Plant dynamic simulation using digital minicomputers offers the prospect of relatively low-cost computing and, when combined with graphical input/output, provides a powerful tool for studying such problems. The paper discusses the results obtained from a simulation study carried out at the Computer Graphics Unit of the University of Manchester using a typical station control model for an Advanced Gas Cooled reactor. Particular emphasis is placed on the use of computer graphics for information display, parameter and control system optimization, and techniques for using graphical input for defining and/or modifying the control system topology. Experience gained from this study has shown that a relatively modest minicomputer system can be used for simulating large-scale dynamic systems and that highly interactive computer graphics can be used to advantage to relieve the designer of many of the tedious aspects of simulation, leaving him free to concentrate on the more creative aspects of his work. (author)

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  18. A stand alone computer system to aid the development of mirror fusion test facility RF heating systems

    International Nuclear Information System (INIS)

    Thomas, R.A.

    1983-01-01

    The Mirror Fusion Test Facility (MFTF-B) control system architecture requires the Supervisory Control and Diagnostic System (SCDS) to communicate with a LSI-11 Local Control Computer (LCC) that in turn communicates via a fiber optic link to CAMAC based control hardware located near the machine. In many cases, the control hardware is very complex and requires a sizable development effort prior to being integrated into the overall MFTF-B system. One such effort was the development of the Electron Cyclotron Resonance Heating (ECRH) system. It became clear that a stand alone computer system was needed to simulate the functions of SCDS. This paper describes the hardware and software necessary to implement the SCDS Simulation Computer (SSC). It consists of a Digital Equipment Corporation (DEC) LSI-11 computer and a Winchester/Floppy disk operating under the DEC RT-11 operating system. All application software for MFTF-B is programmed in PASCAL, which allowed us to adapt procedures originally written for SCDS to the SSC. This nearly identical software interface means that software written during the equipment development will be useful to the SCDS programmers in the integration phase

  19. Three-dimensional coupled Monte Carlo-discrete ordinates computational scheme for shielding calculations of large and complex nuclear facilities

    International Nuclear Information System (INIS)

    Chen, Y.; Fischer, U.

    2005-01-01

    Shielding calculations of advanced nuclear facilities such as accelerator-based neutron sources or fusion devices of the tokamak type are complicated due to their complex geometries and their large dimensions, including bulk shields of several meters thickness. While the complexity of the geometry in the shielding calculation can hardly be handled by the discrete ordinates method, the deep penetration of radiation through bulk shields is a severe challenge for the Monte Carlo particle transport technique. This work proposes a dedicated computational scheme for coupled Monte Carlo-Discrete Ordinates transport calculations to handle this kind of shielding problem. The Monte Carlo technique is used to simulate the particle generation and transport in the target region with both complex geometry and reaction physics, and the discrete ordinates method is used to treat the deep penetration problem in the bulk shield. The coupling scheme has been implemented in a program system by loosely integrating the Monte Carlo transport code MCNP, the three-dimensional discrete ordinates code TORT, and a newly developed coupling interface program for the mapping process. Test calculations were performed with comparison to MCNP solutions. Satisfactory agreements were obtained between these two approaches. The program system has been chosen to treat the complicated shielding problem of the accelerator-based IFMIF neutron source. The successful application demonstrates that the coupling scheme, as implemented in the program system, is a useful computational tool for the shielding analysis of complex and large nuclear facilities. (authors)
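The deep-penetration difficulty for analog Monte Carlo can be seen in a few lines: the chance of an uncollided particle crossing a slab of t mean free paths is exp(-t), so ever fewer histories score as the shield thickens. The sketch below (not the MCNP/TORT coupling itself; all names are invented) simply compares the analog estimate with the analytic value.

```python
import math
import random

def transmission_mc(mfp_thickness: float, n: int, seed: int = 1) -> float:
    """Analog Monte Carlo estimate of uncollided transmission through a
    slab whose thickness is given in mean free paths."""
    rng = random.Random(seed)
    # sample an exponential free path -ln(u) for each history;
    # the particle transmits if the path exceeds the slab thickness
    hits = sum(1 for _ in range(n)
               if -math.log(1.0 - rng.random()) > mfp_thickness)
    return hits / n

for t in (1.0, 3.0):
    est = transmission_mc(t, 200_000)
    print(f"{t} mfp: MC {est:.5f}  analytic {math.exp(-t):.5f}")
```

For a bulk shield many tens of mean free paths thick the analog score rate collapses entirely, which is exactly where the discrete ordinates leg of the coupled scheme takes over.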

  20. Workplan/RCRA Facility Investigation/Remedial Investigation Report for the Old Radioactive Waste Burial Ground 643-E, S01-S22 - Volume I - Text and Volume II - Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Conner, K.R.

    2000-12-12

    This document presents the assessment of environmental impacts resulting from releases of hazardous substances from the facilities in the Old Radioactive Waste Burial Ground 643-E, including Solvent Tanks 650-01E to 650-22E, also referred to as Solvent Tanks at the Savannah River Site, Aiken, South Carolina.

  1. Status of ground-water resources at U.S. Navy Support Facility, Diego Garcia; summary of hydrologic and climatic data, January 1992 through December 1994

    Science.gov (United States)

    Torikai, J.D.

    1995-01-01

    This report contains hydrologic and climatic data that describe the status of ground-water resources at U.S. Navy Support Facility, Diego Garcia. Data presented are from January 1992 through December 1994. This report concentrates on data from October through December 1994, and references previous data from 1992 through 1994. Cumulative rainfall for October through December 1994 was 55 inches which is higher than the mean cumulative rainfall of about 31 inches for the same 3 months. Total rainfall for 1994 was 131 inches which is 24 percent higher than the mean annual rainfall of 106 inches. In comparison, total rainfall in 1992 and 1993 was 93 inches and 95 inches, respectively. Ground-water withdrawal during October through December 1994 averaged 903,000 gallons per day, while the annual withdrawal in 1994 was 942,700 gallons per day. Annual withdrawals in 1992 and 1993 averaged 935,900 gallons per day and 953,800 gallons per day, respectively. At the end of December 1994, the chloride concentration of the composite water supply was 28 milligrams per liter, well below the 250 milligrams per liter secondary drinking-water standard established by the U.S. Environmental Protection Agency. Chloride concentrations of the composite water supply from October through December 1994 ranged between 28 and 86 milligrams per liter. Chloride concentration of ground water in monitoring wells at Cantonment and Air Operations decreased in November and December, and seems to have leveled off by the end of the year. Although chloride concentrations have decreased during the fourth quarter of 1994, there has been a general trend of increasing chloride concentrations in the deeper monitoring wells since the 1992 dry season, which began in March 1992. A fuel leak at Air Operations caused the shutdown of ten wells in May 1991. Four of the wells resumed pumping for water-supply purposes in April 1992. 
The remaining six wells are being used to hydraulically contain and divert fuel

  2. Application of positron emission tomography-computed tomography in the diagnosis of pulmonary ground-glass nodules.

    Science.gov (United States)

    Hu, Lili; Pan, Yuanwei; Zhou, Zhigang; Gao, Jianbo

    2017-11-01

    The aim of the present study was to investigate the value of positron emission tomography-computed tomography (PET-CT) using 18F-fluorodeoxyglucose in the clinical diagnosis of pulmonary ground-glass nodules (GGNs). In total, 54 patients with pulmonary GGNs that were identified by PET-CT examination were selected and confirmed by pathology and clinical diagnosis in hospital between April 2014 and April 2015. The association between PET-CT findings and pathology, and the value of PET-CT, were then evaluated. In the 54 patients, solitary pulmonary GGNs with a nodule diameter of between 0.6 and 2.0 cm were detected. Amongst them, the PET-CT examination of 42 patients revealed hypermetabolic nodules, which were all mixed-type GGNs with a diameter >1 cm. The PET-CT examination of the remaining 12 patients demonstrated no evidence of metabolic abnormalities, and the nodules in these patients were pure or mixed GGNs with a diameter <1 cm (except 2 cases with a diameter ≥1 cm). Furthermore, the diagnoses for all patients were pathologically confirmed by CT-guided needle biopsy or thoracoscopic surgical resection. Amongst them, there were 41 cases of lung adenocarcinoma, 4 cases of fungal infection, 7 cases of inflammation and 2 cases of adenomatoid hyperplasia. Additionally, PET-CT has a lower detection rate for smaller GGNs and exhibits no clear advantage for pure GGNs, but has a higher detection rate for larger GGNs. In conclusion, to a certain extent, PET-CT makes up for the shortcomings of traditional imaging and has some clinical value for the diagnosis of GGNs.

  3. Burnup calculations for KIPT accelerator driven subcritical facility using Monte Carlo computer codes-MCB and MCNPX

    International Nuclear Information System (INIS)

    Gohar, Y.; Zhong, Z.; Talamo, A.

    2009-01-01

    Argonne National Laboratory (ANL) of USA and Kharkov Institute of Physics and Technology (KIPT) of Ukraine have been collaborating on the conceptual design development of an electron accelerator driven subcritical (ADS) facility, using the KIPT electron accelerator. The neutron source of the subcritical assembly is generated from the interaction of a 100 kW electron beam with a natural uranium target. The electron beam has a uniform spatial distribution and electron energy in the range of 100 to 200 MeV. The main functions of the subcritical assembly are the production of medical isotopes and the support of the Ukraine nuclear power industry. Neutron physics experiments and material structure analyses are planned using this facility. With the 100 kW electron beam power, the total thermal power of the facility is ∼375 kW, including the fission power of ∼260 kW. The burnup of the fissile materials and the buildup of fission products continuously reduce the reactivity during the operation, which reduces the neutron flux level and consequently the facility performance. To preserve the neutron flux level during the operation, fuel assemblies should be added after long operating periods to compensate for the lost reactivity. This process requires accurate prediction of the fuel burnup, the decay behavior of the fission products, and the introduced reactivity from adding fresh fuel assemblies. The recent developments of the Monte Carlo computer codes, the high-speed capability of the computer processors, and the parallel computation techniques made it possible to perform three-dimensional detailed burnup simulations. A full detailed three-dimensional geometrical model is used for the burnup simulations, with continuous energy nuclear data libraries for the transport calculations and 63-multigroup or one-group cross-section libraries for the depletion calculations. The Monte Carlo computer codes MCNPX and MCB are utilized for this study. MCNPX transports the electrons and the
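The reactivity-loss mechanism described here can be caricatured with a one-group, constant-flux depletion estimate. The flux and cross-section values below are generic illustrative numbers, not KIPT design data, and the study itself relies on full MCNPX/MCB burnup simulations rather than anything this simple.

```python
import math

def fissile_fraction(phi: float, sigma_a_barns: float, t_days: float) -> float:
    """Remaining fissile fraction N/N0 = exp(-sigma_a * phi * t) for a
    one-group, constant-flux depletion estimate."""
    sigma_a_cm2 = sigma_a_barns * 1e-24   # barns -> cm^2
    t_s = t_days * 86400.0                # days -> seconds
    return math.exp(-sigma_a_cm2 * phi * t_s)

phi = 1e13        # n/cm^2/s, assumed assembly-average flux
sigma_a = 680.0   # barns, U-235 thermal absorption (illustrative)
for d in (100, 300):
    print(f"after {d} d: {fissile_fraction(phi, sigma_a, d):.4f} of fissile atoms remain")
```

Even a few percent loss of fissile inventory translates into a reactivity drop that, in a subcritical assembly, shows up directly as a lower neutron flux for the same beam power, which is why periodic fuel additions are foreseen.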

  4. 78 FR 18353 - Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility...

    Science.gov (United States)

    2013-03-26

    ... SUPPLEMENTARY INFORMATION section for electronic access to the guidance document. Submit electronic comments on... document entitled ``Guidance for Industry: Blood Establishment Computer System Validation in the User's... document to http://www.regulations.gov or written comments to the Division of Dockets Management (see...

  5. Navier-Stokes Simulation of the Air-Conditioning Facility of a Large Modern Computer Room

    Science.gov (United States)

    2005-01-01

    NASA recently assembled one of the world's fastest operational supercomputers to meet the agency's new high-performance computing needs. This large-scale system, named Columbia, consists of 20 interconnected SGI Altix 512-processor systems, for a total of 10,240 Intel Itanium-2 processors. High-fidelity CFD simulations were performed for the NASA Advanced Supercomputing (NAS) computer room at Ames Research Center. The purpose of the simulations was to assess the adequacy of the existing air handling and conditioning system and make recommendations for changes in the design of the system if needed. The simulations were performed with NASA's OVERFLOW-2 CFD code, which utilizes overset structured grids. A new set of boundary conditions was developed and added to the flow solver for modeling the room's air-conditioning and proper cooling of the equipment. Boundary condition parameters for the flow solver are based on cooler CFM (flow rate) ratings and some reasonable assumptions of flow and heat transfer data for the floor and central processing units (CPUs). The geometry modeling from blueprints and grid generation were handled by the NASA Ames software package Chimera Grid Tools (CGT). This geometric model was developed as a CGT-scripted template, which can be easily modified to accommodate any changes in the shape and size of the room, and the locations and dimensions of the CPU racks, disk racks, coolers, power distribution units, and mass-storage system. The compute nodes are grouped in pairs of racks with an aisle in the middle. High-speed connection cables connect the racks with overhead cable trays. The cool air from the cooling units is pumped into the computer room from a sub-floor through perforated floor tiles. The CPU cooling fans draw cool air from the floor tiles, which run along the outside length of each rack, and eject warm air into the center aisle between the racks. This warm air is eventually drawn into the cooling units located near the walls of the room. 
One
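The cooler CFM ratings that feed such boundary conditions are tied to heat load by the standard sea-level air heat-balance rule Q[BTU/hr] ≈ 1.08 × CFM × ΔT[°F]. The sketch below applies that rule of thumb; the 20 kW rack load and 20 °F air-temperature rise are invented example numbers, not values from the NAS study.

```python
def required_cfm(heat_kw: float, delta_t_f: float) -> float:
    """Airflow (CFM) needed to remove a heat load, using the standard-air
    rule Q[BTU/hr] = 1.08 * CFM * dT[F]."""
    btu_per_hr = heat_kw * 3412.14        # kW -> BTU/hr
    return btu_per_hr / (1.08 * delta_t_f)

# e.g. a rack pair dissipating 20 kW with a 20 F rise across the CPUs
print(f"required airflow: {required_cfm(20.0, 20.0):.0f} CFM")
```

A CFD study refines this bulk balance by resolving where the cool air actually goes, since recirculation and bypass around the racks can leave hot spots even when the total CFM is nominally sufficient.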

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  7. Progress toward the development of a ground-water velocity model for the radioactive waste management facility, Savannah River Plant, South Carolina: Quarterly report

    International Nuclear Information System (INIS)

    Parizek, R.R.; Root, R.W. Jr.

    1984-01-01

    This report presents the status and results of work performed to develop a numerical groundwater velocity model for the radioactive waste management facility at the Savannah River Plant (SRP). Work dealt with developing a hydrologic budget for the McQueen Branch drainage basin. Two hydrologic budgets were developed, covering two periods of time. The first period was from November 1, 1982 to May 19, 1984; the second period was from March 1, 1983 to March 31, 1984. Total precipitation for this period was 52.48 inches, all as rainfall. Water levels measured in wells in the basin quarterly, monthly, and continuously showed basically the same response over the period of the study. Maximum fluctuation of water levels of wells in the basin was five to seven feet during the study. Stream discharge measurements in McQueen Branch showed base flow varying between 1.5 and 5.7 cfs. Lowest base flow occurred during the summer, when evapotranspiration was greatest. Some impact of daily ground-water evapotranspiration from the Branch floodplain was seen in continuous stream records. These daily effects peaked in magnitude during the summer, disappeared during winter, and gradually returned during spring. Underflow past the Branch gauging station out of the basin was determined to be negligible. Leakage downward through the Green Clay is difficult to determine but is believed to be small, based on the overall results of the budget study

  8. Reliability Lessons Learned From GPU Experience With The Titan Supercomputer at Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gallarno, George [Christian Brothers University]; Rogers, James H. [ORNL]; Maxwell, Don E. [ORNL]

    2015-01-01

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large scale. The world's second fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  11. PROFEAT Update: A Protein Features Web Server with Added Facility to Compute Network Descriptors for Studying Omics-Derived Networks.

    Science.gov (United States)

    Zhang, P; Tao, L; Zeng, X; Qin, C; Chen, S Y; Zhu, F; Yang, S Y; Li, Z R; Chen, W P; Chen, Y Z

    2017-02-03

    The studies of biological, disease, and pharmacological networks are facilitated by the systems-level investigations using computational tools. In particular, the network descriptors developed in other disciplines have found increasing applications in the study of the protein, gene regulatory, metabolic, disease, and drug-targeted networks. Facilities are provided by the public web servers for computing network descriptors, but many descriptors are not covered, including those used or useful for biological studies. We upgraded the PROFEAT web server http://bidd2.nus.edu.sg/cgi-bin/profeat2016/main.cgi for computing up to 329 network descriptors and protein-protein interaction descriptors. PROFEAT network descriptors comprehensively describe the topological and connectivity characteristics of unweighted (uniform binding constants and molecular levels), edge-weighted (varying binding constants), node-weighted (varying molecular levels), edge-node-weighted (varying binding constants and molecular levels), and directed (oriented processes) networks. The usefulness of the network descriptors is illustrated by the literature-reported studies of the biological networks derived from the genome, interactome, transcriptome, metabolome, and diseasome profiles. Copyright © 2016 Elsevier Ltd. All rights reserved.
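Two of the simplest topological descriptors of the kind discussed above (node degree and the local clustering coefficient of an unweighted network) can be computed by hand; a minimal sketch follows. The toy edge list is illustrative only and does not reflect PROFEAT's actual input format or its full descriptor set.

```python
# Degree and local clustering coefficient for an unweighted, undirected
# network, e.g. a small protein-protein interaction graph.
from itertools import combinations

edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]  # toy PPI edges

# Build a symmetric adjacency map.
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def degree(node):
    """Number of interaction partners of a node."""
    return len(adj[node])

def clustering(node):
    """Fraction of a node's neighbour pairs that are themselves connected."""
    nbrs = adj[node]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return 2.0 * links / (k * (k - 1))

print(degree("C"), clustering("A"))  # 3 1.0
```

Edge-weighted and node-weighted variants, as in PROFEAT, would replace the simple counts above with sums over binding constants or molecular levels.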

  12. Computer control and data acquisition system for the Mirror Fusion Test Facility Ion Cyclotron Resonant Heating System (ICRH)

    International Nuclear Information System (INIS)

    Cheshire, D.L.; Thomas, R.A.

    1985-01-01

    The Lawrence Livermore National Laboratory (LLNL) large Mirror Fusion Test Facility (MFTF-B) will employ an Ion Cyclotron Resonant Heating (ICRH) system for plasma startup. As the MFTF-B Industrial Participant, TRW has responsibility for the ICRH system, including development of the data acquisition and control system. During MFTF-B operation, the system will be run from the MFTF-B Supervisory Control and Diagnostic System (SCDS). For subsystem development and checkout at TRW, and for verification and acceptance testing at LLNL, the system will be run from a stand-alone computer system designed to simulate the functions of SCDS. The ''SCDS Simulator'' was developed originally for the MFTF-B ECRH System; descriptions of the hardware and software are updated in this paper. The computer control and data acquisition functions implemented for ICRH are described, including development status and the test schedule at TRW and at LLNL. The application software is written for the SCDS Simulator, but it is programmed in PASCAL and designed to facilitate conversion for use on the SCDS computers.

  13. Hazard-to-Risk: High-Performance Computing Simulations of Large Earthquake Ground Motions and Building Damage in the Near-Fault Region

    Science.gov (United States)

    Miah, M.; Rodgers, A. J.; McCallen, D.; Petersson, N. A.; Pitarka, A.

    2017-12-01

    We are running high-performance computing (HPC) simulations of ground motions for large (magnitude M=6.5-7.0) earthquakes in the near-fault region, along with the response of steel moment frame buildings throughout the near-fault domain. For ground motions, we are using SW4, a fourth-order summation-by-parts finite difference time-domain code running on 10,000-100,000's of cores. Earthquake ruptures are generated using the Graves and Pitarka (2017) method. We validated ground motion intensity measurements against Ground Motion Prediction Equations. We considered two events (M=6.5 and 7.0) for vertical strike-slip ruptures with three-dimensional (3D) basin structures, including stochastic heterogeneity. We have also considered M7.0 scenarios for a Hayward Fault rupture, which affects the San Francisco Bay Area and northern California, using both 1D and 3D earth structure. Dynamic, inelastic response of canonical buildings is computed with NEVADA, a nonlinear, finite-deformation finite element code. Canonical buildings include 3-, 9-, 20- and 40-story steel moment frame buildings. Damage potential is tracked by the peak inter-story drift (PID) ratio, which measures the maximum displacement between adjacent floors of the building and is strongly correlated with damage. PID ratios greater than 1.0 generally indicate non-linear response and permanent deformation of the structure. We also track roof displacement to identify permanent deformation. PID (damage) for a given earthquake scenario (M, slip distribution, hypocenter) is spatially mapped throughout the SW4 domain with 1-2 km resolution. Results show that in the near-fault region building damage is correlated with peak ground velocity (PGV), while farther away (> 20 km) it is better correlated with peak ground acceleration (PGA). We also show how simulated ground motions have peaks in the response spectra that shift to longer periods for larger magnitude events and for locations of forward directivity, as has been reported by
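The PID ratio described above can be sketched directly: the maximum over time and over adjacent floor pairs of the relative displacement, normalised by the story height (the text's threshold of 1.0 presumably refers to percent drift). The displacement histories and story height below are hypothetical, not output from the SW4/NEVADA workflow.

```python
# Peak inter-story drift (PID) ratio from per-floor displacement histories.

def peak_interstory_drift_ratio(floor_disp, story_height):
    """floor_disp: list of per-floor displacement time histories (m),
    ordered from the ground floor upward; story_height in metres.
    Returns the dimensionless peak drift ratio (multiply by 100 for %)."""
    n_steps = len(floor_disp[0])
    pid = 0.0
    for lower, upper in zip(floor_disp, floor_disp[1:]):
        for t in range(n_steps):
            drift = abs(upper[t] - lower[t]) / story_height
            pid = max(pid, drift)
    return pid

# Three floors, four time steps (metres); 4 m story height -- toy values.
histories = [
    [0.00, 0.01, 0.02, 0.01],   # ground floor
    [0.00, 0.03, 0.05, 0.02],   # 2nd floor
    [0.00, 0.04, 0.09, 0.03],   # 3rd floor
]
print(round(peak_interstory_drift_ratio(histories, story_height=4.0), 6))  # 0.01
```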

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  15. System Requirements Analysis for a Computer-based Procedure in a Research Reactor Facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jaek Wan; Jang, Gwi Sook; Seo, Sang Moon; Shin, Sung Ki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    This can address many of the routine problems related to human error in the use of conventional, hard-copy operating procedures. An operation support system is also required in a research reactor. A well-made CBP can address the staffing issues of a research reactor and reduce human errors by minimizing the operator's routine tasks. A CBP for a research reactor has not yet been proposed. Also, CBPs developed for nuclear power plants have powerful and varied technical functions to cover complicated plant operation situations, but many of these functions may not be required for a research reactor. Thus, it is not reasonable to apply such a CBP to a research reactor directly, and customizing it would not be cost-effective. Therefore, a compact CBP should be developed for a research reactor. This paper introduces the high-level requirements derived by the system requirements analysis activity as the first stage of system implementation. Operation support tools are under consideration for application to research reactors. In particular, as part of the full digitalization of the main control room, the application of a computer-based procedure system has been required as a part of the man-machine interface system because it has an impact on the operating staffing and human errors of a research reactor. To establish computer-based system requirements for a research reactor, this paper addresses international standards and previous practices in nuclear plants.

  16. On dosimetry of radiodiagnosis facilities, mainly focused on computed tomography units

    International Nuclear Information System (INIS)

    Ghitulescu, Zoe

    2008-01-01

    This talk addresses the dosimetry of computed tomography units and is structured in three parts: 1) basics of image acquisition using the computed tomography technique; 2) effective dose calculation for a patient and its assessment using the BERT concept; 3) recommended actions for reaching a good compromise between delivered dose and image quality. The aim of the first part is to acquaint the reader with the CT technique, so that the worked example of effective dose calculation and its conversion into time units using the BERT concept can be followed. The conclusion drawn is that the effective dose, calculated by the medical physicist (using dedicated software for the CT scanner and the exam type) and converted into time units through the BERT concept, could then be communicated by the radiologist together with the diagnostic notes. A minimum of information for patients regarding the nature and type of radiation is therefore clearly necessary, for instance with the help of leaflets. The third part discusses the factors that lead to good image quality while taking into account the ALARA principle of radiation protection, which states that the dose should be 'as low as reasonably achievable'. (author)
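The BERT (Background Equivalent Radiation Time) conversion mentioned above expresses an effective dose as the time needed to accrue the same dose from natural background. A minimal sketch, assuming the often-quoted 2.4 mSv/yr world-average background and an illustrative 8 mSv CT dose (neither value is from this talk):

```python
# BERT: effective dose expressed as equivalent natural-background time.

def bert_months(effective_dose_msv, background_msv_per_year=2.4):
    """Return the background-equivalent time in months.

    background_msv_per_year defaults to the world-average natural
    background (an assumption, not a value from the source)."""
    return 12.0 * effective_dose_msv / background_msv_per_year

# An 8 mSv exam (illustrative) corresponds to about 40 months of background.
print(f"{bert_months(8.0):.0f} months")  # 40 months
```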

  17. A computer code to estimate accidental fire and radioactive airborne releases in nuclear fuel cycle facilities: User's manual for FIRIN

    International Nuclear Information System (INIS)

    Chan, M.K.; Ballinger, M.Y.; Owczarski, P.C.

    1989-02-01

    This manual describes the technical bases and use of the computer code FIRIN. This code was developed to estimate the source term release of smoke and radioactive particles from potential fires in nuclear fuel cycle facilities. FIRIN is a product of a broader study, Fuel Cycle Accident Analysis, which Pacific Northwest Laboratory conducted for the US Nuclear Regulatory Commission. The technical bases of FIRIN consist of a nonradioactive fire source term model, compartment effects modeling, and radioactive source term models. These three elements interact with each other in the code affecting the course of the fire. This report also serves as a complete FIRIN user's manual. Included are the FIRIN code description with methods/algorithms of calculation and subroutines, code operating instructions with input requirements, and output descriptions. 40 refs., 5 figs., 31 tabs

  18. Computational design of high efficiency release targets for use at ISOL facilities

    CERN Document Server

    Liu, Y

    1999-01-01

    This report describes efforts made at the Oak Ridge National Laboratory to design high-efficiency-release targets that simultaneously incorporate the short diffusion lengths, high permeabilities, controllable temperatures, and heat-removal properties required for the generation of useful radioactive ion beam (RIB) intensities for nuclear physics and astrophysics research using the isotope separation on-line (ISOL) technique. Short diffusion lengths are achieved either by using thin fibrous target materials or by coating thin layers of selected target material onto low-density carbon fibers such as reticulated-vitreous-carbon fiber (RVCF) or carbon-bonded-carbon fiber (CBCF) to form highly permeable composite target matrices. Computational studies that simulate the generation and removal of primary beam deposited heat from target materials have been conducted to optimize the design of target/heat-sink systems for generating RIBs. The results derived from diffusion release-rate simulation studies for selected t...

  19. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  20. Universal Drive Train Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This vehicle drive train research facility is capable of evaluating helicopter and ground vehicle power transmission technologies in a system level environment. The...

  1. Computational Analysis Supporting the Design of a New Beamline for the Mines Neutron Radiography Facility

    Science.gov (United States)

    Wilson, C.; King, J.

    The Colorado School of Mines installed a neutron radiography system at the United States Geological Survey TRIGA reactor in 2012. An upgraded beamline could dramatically improve the imaging capabilities of this system. This project performed computational analyses to support the design of a new beamline, with the major goals of minimizing beam divergence and maximizing beam intensity. The new beamline will consist of a square aluminum tube with an 11.43 cm (4.5 in) inner side length and 0.635 cm (0.25 in) thick walls. It is the same length as the original beam tube (8.53 m) and is composed of 1.22 m (4 ft) and 1.52 m (5 ft) flanged sections which bolt together. The bottom 1.22 m of the beamline is a cylindrical aluminum pre-collimator which is 0.635 cm (0.25 in) thick, with an inner diameter of 5.08 cm (2 in). Based on Monte Carlo model results, when a pre-collimator is present, the use of a neutron absorbing liner on the inside surface of the beam tube has almost no effect on the angular distribution of the neutron current at the collimator exit. The use of a pre-collimator may result in a non-uniform flux profile at the image plane; however, as long as the collimator is at least three times longer than the pre-collimator, the flux distortion is acceptably low.
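Beam divergence in a radiography beamline of this kind is conventionally characterised by the L/D ratio (aperture-to-image distance over effective aperture diameter). The 8.53 m length and 5.08 cm pre-collimator inner diameter below are taken from the text; treating them directly as L and D, and the 2 cm object standoff, are simplifying assumptions for illustration.

```python
# Geometric collimation figures for a neutron radiography beamline.
import math

L = 8.53        # aperture-to-image distance, m (beamline length from text)
D = 0.0508      # effective aperture diameter, m (pre-collimator ID from text)

l_over_d = L / D                                    # collimation ratio
half_angle_mrad = math.atan(D / (2.0 * L)) * 1000.0  # beam half-divergence

# Geometric unsharpness for an object 2 cm off the image plane (assumed).
unsharpness_um = 0.02 / l_over_d * 1e6

print(f"L/D = {l_over_d:.0f}")  # L/D = 168
print(f"half-divergence = {half_angle_mrad:.2f} mrad")
print(f"unsharpness = {unsharpness_um:.0f} um")
```

A higher L/D sharpens the image (smaller unsharpness) at the cost of beam intensity, which is the design trade-off the text's Monte Carlo analysis explores.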

  2. Animal facilities

    International Nuclear Information System (INIS)

    Fritz, T.E.; Angerman, J.M.; Keenan, W.G.; Linsley, J.G.; Poole, C.M.; Sallese, A.; Simkins, R.C.; Tolle, D.

    1981-01-01

    The animal facilities in the Division are described. They consist of kennels, animal rooms, service areas, and technical areas (examining rooms, operating rooms, pathology labs, x-ray rooms, and 60Co exposure facilities). The computer support facility is also described. The advent of the Conversational Monitor System at Argonne has launched a new effort to set up conversational computing and graphics software for users. The existing LS-11 data acquisition systems have been further enhanced and expanded. The divisional radiation facilities include a number of gamma, neutron, and x-ray radiation sources with accompanying areas for related equipment. There are five 60Co irradiation facilities; a research reactor, Janus, is a source for fission-spectrum neutrons; two other neutron sources in the Chicago area are also available to the staff for cell biology studies. The electron microscope facilities are also described.

  3. Recommended practice for the design of a computer driven Alarm Display Facility for central control rooms of nuclear power generating stations

    International Nuclear Information System (INIS)

    Ben-Yaacov, G.

    1984-01-01

    This paper's objective is to explain the process by which design can prevent human errors in nuclear plant operation. Human factor engineering principles, data, and methods used in the design of computer-driven alarm display facilities are discussed. A ''generic'', advanced Alarm Display Facility is described. It considers operator capabilities and limitations in decision-making processes, response dynamics, and human memory limitations. Highlighted are considerations of human factor criteria in the design and layout of alarm displays. Alarm data sources are described, and their use within the Alarm Display Facility is illustrated.

  4. Environmental Assessment and Finding of No Significant Impact: Interim Measures for the Mixed Waste Management Facility Groundwater at the Burial Ground Complex at the Savannah River Site

    Energy Technology Data Exchange (ETDEWEB)

    N/A

    1999-12-08

    The U. S. Department of Energy (DOE) prepared this environmental assessment (EA) to analyze the potential environmental impacts associated with the proposed interim measures for the Mixed Waste Management Facility (MWMF) groundwater at the Burial Ground Complex (BGC) at the Savannah River Site (SRS), located near Aiken, South Carolina. DOE proposes to install a small metal sheet pile dam to impound water around and over the BGC groundwater seepline. In addition, a drip irrigation system would be installed. Interim measures will also address the reduction of volatile organic compounds (VOCs) from ''hot-spot'' regions associated with the Southwest Plume Area (SWPA). This action is taken as an interim measure for the MWMF in cooperation with the South Carolina Department of Health and Environmental Control (SCDHEC) to reduce the amount of tritium seeping from the BGC southwest groundwater plume. The proposed action of this EA is being planned and would be implemented concurrently with a groundwater corrective action program under the Resource Conservation and Recovery Act (RCRA). On September 30, 1999, SCDHEC issued a modification to the SRS RCRA Part B permit that adds corrective action requirements for four plumes that are currently emanating from the BGC. One of those plumes is the southwest plume. The RCRA permit requires SRS to submit a corrective action plan (CAP) for the southwest plume by March 2000. The permit requires that the initial phase of the CAP prescribe a remedy that achieves a 70-percent reduction in the annual amount of tritium being released from the southwest plume area to Fourmile Branch, a nearby stream. Approval and actual implementation of the corrective measure in that CAP may take several years. As an interim measure, the actions described in this EA would manage the release of tritium from the southwest plume area until the final actions under the CAP can be implemented. This proposed action is expected to reduce the

  5. Assessment of the integrity of structural shielding of four computed tomography facilities in the greater Accra region of Ghana

    International Nuclear Information System (INIS)

    Nkansah, A.; Schandorf, C.; Boadu, M.; Fletcher, J. J.

    2013-01-01

    The structural shielding thicknesses of the walls of four computed tomography (CT) facilities in Ghana were re-evaluated to verify the shielding integrity using the new shielding design methods recommended by the National Council on Radiation Protection and Measurements (NCRP). The shielding thicknesses obtained ranged from 120 to 155 mm using default DLP values proposed by the European Commission, and from 110 to 168 mm using DLP values derived from the four CT manufacturers. These values are within the accepted standard concrete wall thickness range of 102 to 152 mm prescribed by the NCRP. The ultrasonic pulse testing of all walls indicated that they are of good quality and free of voids, since the estimated pulse velocities were within the range of 3.496±0.005 km/s. The average dose equivalent rate estimated for supervised areas is 3.4±0.27 μSv/week and that for the controlled area is 18.0±0.15 μSv/week, which are within acceptable values. (authors)
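The style of barrier check behind such assessments can be sketched with the NCRP broad-beam method: compute the required barrier transmission B from a design goal, then convert to thickness via tenth-value layers (TVLs). All numerical values below (design goal, weekly unshielded air kerma, distance, occupancy, concrete TVL) are illustrative assumptions, not the parameters used for the Ghanaian facilities.

```python
# NCRP-style barrier sizing: required transmission, then TVL thickness.
import math

P = 0.1          # shielding design goal at the occupied point, mGy/week
K_week = 50.0    # unshielded weekly air kerma at 1 m from the scanner, mGy
d = 3.0          # source-to-occupied-point distance, m
T = 1.0          # occupancy factor of the adjacent area

B = P * d**2 / (K_week * T)          # required barrier transmission
n_tvl = math.log10(1.0 / B)          # number of tenth-value layers needed
tvl_concrete_mm = 120.0              # illustrative TVL for CT spectra in concrete
thickness_mm = n_tvl * tvl_concrete_mm

print(f"B = {B:.3f}, thickness = {thickness_mm:.0f} mm")  # B = 0.018, thickness = 209 mm
```

The inverse-square factor d² and the occupancy factor T are what tie the weekly workload to the dose goal at the point being protected.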

  6. Assessment of the structural shielding integrity of some selected computed tomography facilities in the Greater Accra Region of Ghana

    International Nuclear Information System (INIS)

    Nkansah, A.

    2010-01-01

    The structural shielding integrity was assessed for four CT facilities (Trust Hospital, Korle-Bu Teaching Hospital, the 37 Military Hospital, and Medical Imaging Ghana Ltd.) in the Greater Accra Region of Ghana. From the shielding calculations, the concrete wall thicknesses computed are 120, 145, 140 and 155 mm for Medical Imaging Ghana Ltd., 37 Military Hospital, Trust Hospital and Korle-Bu Teaching Hospital respectively, using default DLP values. The wall thicknesses using derived DLP values are 110, 110, 120 and 168 mm for Medical Imaging Ghana Ltd., 37 Military Hospital, Trust Hospital and Korle-Bu Teaching Hospital respectively. These values are within the accepted standard concrete thickness of 102-152 mm prescribed by the National Council on Radiation Protection and Measurements. The ultrasonic pulse testing indicated that all the sandcrete walls are of good quality and free of voids, since the estimated pulse velocities were approximately equal to 3.45 km/s. The average dose rate measured for supervised areas is 3.4 μSv/wk and for controlled areas is 18.0 μSv/wk. These dose rates were below the acceptable levels of 100 μSv per week for the occupationally exposed and 20 μSv per week for members of the public provided by the ICRU. The results mean that the structural shielding thicknesses are adequate to protect members of the public and occupationally exposed workers (au).

  7. Research Facilities | Wind | NREL

    Science.gov (United States)

    NREL's state-of-the-art wind research facilities include the Structural Research Facilities, where turbine blades undergo structural testing, and computer simulation capabilities.

  8. Scientific workflow and support for high resolution global climate modeling at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V.; Mayer, B.; Wang, F.; Hack, J.; McKenna, D.; Hartman-Baker, R.

    2012-04-01

    The Oak Ridge Leadership Computing Facility (OLCF) facilitates the execution of computational experiments that require tens of millions of CPU hours (typically using thousands of processors simultaneously) while generating hundreds of terabytes of data. A set of ultra high resolution climate experiments in progress, using the Community Earth System Model (CESM), will produce over 35,000 files, ranging in sizes from 21 MB to 110 GB each. The execution of the experiments will require nearly 70 Million CPU hours on the Jaguar and Titan supercomputers at OLCF. The total volume of the output from these climate modeling experiments will be in excess of 300 TB. This model output must then be archived, analyzed, distributed to the project partners in a timely manner, and also made available more broadly. Meeting this challenge would require efficient movement of the data, staging the simulation output to a large and fast file system that provides high volume access to other computational systems used to analyze the data and synthesize results. This file system also needs to be accessible via high speed networks to an archival system that can provide long term reliable storage. Ideally this archival system is itself directly available to other systems that can be used to host services making the data and analysis available to the participants in the distributed research project and to the broader climate community. The various resources available at the OLCF now support this workflow. The available systems include the new Jaguar Cray XK6 2.63 petaflops (estimated) supercomputer, the 10 PB Spider center-wide parallel file system, the Lens/EVEREST analysis and visualization system, the HPSS archival storage system, the Earth System Grid (ESG), and the ORNL Climate Data Server (CDS). The ESG features federated services, search & discovery, extensive data handling capabilities, deep storage access, and Live Access Server (LAS) integration. The scientific workflow enabled on

  9. Computed Tomography Scanning Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION:Advances research in the areas of marine geosciences, geotechnical, civil, and chemical engineering, physics, and ocean acoustics by using high-resolution,...

  10. Facile formation of dendrimer-stabilized gold nanoparticles modified with diatrizoic acid for enhanced computed tomography imaging applications.

    Science.gov (United States)

    Peng, Chen; Li, Kangan; Cao, Xueyan; Xiao, Tingting; Hou, Wenxiu; Zheng, Linfeng; Guo, Rui; Shen, Mingwu; Zhang, Guixiang; Shi, Xiangyang

    2012-11-07

    We report a facile approach to forming dendrimer-stabilized gold nanoparticles (Au DSNPs) through the use of amine-terminated fifth-generation poly(amidoamine) (PAMAM) dendrimers modified by diatrizoic acid (G5.NH(2)-DTA) as stabilizers for enhanced computed tomography (CT) imaging applications. In this study, by simply mixing G5.NH(2)-DTA dendrimers with gold salt in aqueous solution at room temperature, dendrimer-entrapped gold nanoparticles (Au DENPs) with a mean core size of 2.5 nm formed spontaneously. Following an acetylation reaction to neutralize the dendrimers' remaining terminal amines, Au DSNPs with a mean size of 6 nm were formed. The formed DTA-containing [(Au(0))(50)-G5.NHAc-DTA] DSNPs were characterized via different techniques. We show that the Au DSNPs are colloidally stable in aqueous solution under different pH and temperature conditions. In vitro hemolytic assay, cytotoxicity assay, flow cytometry analysis, and cell morphology observation reveal that the formed Au DSNPs have good hemocompatibility and are non-cytotoxic at concentrations up to 3.0 μM. X-ray absorption coefficient measurements show that the DTA-containing Au DSNPs have enhanced attenuation intensity, much higher than that of [(Au(0))(50)-G5.NHAc] DENPs without DTA or Omnipaque at the same molar concentration of the active element (Au or iodine). The formed DTA-containing Au DSNPs can be used for CT imaging of cancer cells in vitro as well as for blood pool CT imaging of mice in vivo with significantly improved signal enhancement. With the two radiodense elements of Au and iodine incorporated within one particle, the formed DTA-containing Au DSNPs may be applicable for CT imaging of various biological systems with enhanced X-ray attenuation and detection sensitivity.

  11. Agustin de Betancourt’s wind machine for draining marshy ground: analysis of its construction through computer-aided engineering

    Directory of Open Access Journals (Sweden)

    J. I. Rojas-Sola

    2018-04-01

    The objective of this research is to analyze the construction of the wind machine for draining marshy ground designed by Agustin de Betancourt y Molina in 1789. To do this, a static analysis by the finite element method was performed on the three-dimensional model obtained with Autodesk Inventor Professional. The results show that the greatest stresses in the mechanism occur where the main shaft meshes with the cogwheel, namely at the point of contact between the worm screw and the cogwheel. However, the maximum displacements and the greatest deformations take place in the blades. In addition, the mechanism is oversized, at no point reaching the tensile strength of the material, confirming the sound construction of this historical invention.

  12. Dynamic Thermal Loads and Cooling Requirements Calculations for VAC Systems in Nuclear Fuel Processing Facilities Using Computer Aided Energy Conservation Models

    International Nuclear Information System (INIS)

    EL Fawal, M.M.; Gadalla, A.A.; Taher, B.M.

    2010-01-01

    In terms of nuclear safety, the most important function of ventilation and air conditioning (VAC) systems is to maintain safe ambient conditions for components and structures important to safety inside the nuclear facility and to maintain appropriate working conditions for the plant's operating and maintenance staff. As part of a study aimed at evaluating the performance of the VAC system of a nuclear fuel cycle facility (NFCF), a computer model was developed and verified to evaluate the thermal loads and cooling requirements for different zones of a fuel processing facility. The program is based on the transfer function method (TFM) and is used to calculate the dynamic heat gain through various multilayer wall constructions and windows, hour by hour, at any orientation of the building. The developed model was verified by comparing the calculated solar heat gain of a given building with the corresponding values calculated using the finite difference method (FDM) and the total equivalent temperature difference method (TETD). As an example, the developed program was used to calculate the cooling loads of the different zones of a typical nuclear fuel facility; the results showed that the cooling capacities of the different cooling units in each zone of the facility meet the design requirements according to safety regulations for nuclear facilities.
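The transfer function method named above computes hourly wall heat gain as a discrete convolution of past sol-air temperatures and past heat gains with conduction transfer function (CTF) coefficients. A minimal sketch follows; the coefficients and temperature series are hypothetical placeholders, not values from the paper.

```python
# Hourly conduction heat gain per unit area via a CTF-style recurrence:
#   q[t] = sum_j b[j]*Te[t-j] - sum_{j>=1} d[j]*q[t-j] - T_room*sum(c)
# (d[0] is the normalisation term and is not used in the sum).

def wall_heat_gain(t_solair, b, d, t_room=24.0, c_sum=None):
    """Return a list of hourly heat gains (W/m^2) for sol-air series t_solair."""
    if c_sum is None:
        c_sum = sum(b)  # steady-state consistency assumption for the c terms
    q = []
    for t in range(len(t_solair)):
        gain = sum(b[j] * t_solair[t - j] for j in range(len(b)) if t - j >= 0)
        gain -= sum(d[j] * q[t - j] for j in range(1, len(d)) if t - j >= 0)
        gain -= t_room * c_sum
        q.append(gain)
    return q

te = [30.0, 34.0, 38.0, 36.0, 32.0]   # sol-air temperatures, deg C (toy data)
b = [0.005, 0.020, 0.010]             # hypothetical CTF b coefficients
d = [1.0, -0.60, 0.05]                # hypothetical CTF d coefficients
print([round(x, 2) for x in wall_heat_gain(te, b, d)])
```

The history terms are what give the method its "dynamic" character: a heavy wall's heat gain lags the sol-air temperature peak by several hours.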

  13. Cathare2 V1.3E post-test computations of SPE-1 and SPE-2 experiments at PMK-NVH facility

    International Nuclear Information System (INIS)

    Belliard, M.; Laugier, E.

    1994-01-01

    This paper presents the first CATHARE2 V1.3E simulations of the SPE-2 transients at the PMK-NVH loop. Concerning the SPE-1 and SPE-2 experiments at PMK-NVH, it contains a description of the facility and the transients, as well as the different conditions of use. The paper also presents the CATHARE2 model and the different types of computation, such as the steady-state computation and the SPE-1 and SPE-2 transient (TEC) calculations. 4 refs., 12 figs., 4 tabs

  14. Materials and Life Science Experimental Facility at the Japan Proton Accelerator Research Complex III: Neutron Devices and Computational and Sample Environments

    Directory of Open Access Journals (Sweden)

    Kaoru Sakasai

    2017-08-01

    Full Text Available Neutron devices such as neutron detectors, optical devices including supermirror devices and 3He neutron spin filters, and choppers have been successfully developed and installed at the Materials and Life Science Experimental Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC), Tokai, Japan. Four software components of the MLF computational environment, namely instrument control, data acquisition, data analysis, and a database, have been developed and deployed at the MLF. The MLF also provides a wide variety of sample environment options, including high and low temperatures, high magnetic fields, and high pressures. This paper describes the current status of the neutron devices and the computational and sample environments at the MLF.

  15. Forward Modeling and validation of a new formulation to compute self-potential signals associated with ground water flow

    Directory of Open Access Journals (Sweden)

    A. Bolève

    2007-10-01

    Full Text Available The classical formulation of coupled hydroelectrical flow in porous media is based on a linear formulation of two coupled constitutive equations, for the electrical current density and for the seepage velocity of the water phase, obeying Onsager's reciprocity. This formulation shows that the streaming current density is controlled by the gradient of the fluid pressure of the water phase and a streaming current coupling coefficient that depends on the so-called zeta potential. Recently, a new formulation has been introduced in which the streaming current density is directly connected to the seepage velocity of the water phase and to the excess of electrical charge per unit pore volume in the porous material. The advantages of this formulation are numerous. First, this new formulation is more intuitive, not only in terms of establishing a constitutive equation for the generalized Ohm's law, but also in specifying boundary conditions for the influence of the flow field upon the streaming potential. With the new formulation, the magnitude of the streaming potential coupling coefficient decreases with permeability, in agreement with published results. The new formulation has been extended to the inertial laminar flow regime and to unsaturated conditions, with applications to the vadose zone. This formulation is suitable for modeling self-potential signals in the field. We investigate infiltration of water from an agricultural ditch, vertical infiltration of water into a sinkhole, and preferential horizontal flow of ground water in a paleochannel. For the three cases reported in the present study, a good match is obtained between the finite element simulations and the field observations. Thus, this formulation could be useful for the inverse mapping of the geometry of groundwater flow from self-potential field measurements.
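
    The two formulations contrasted above can be written side by side: the classical form drives the streaming current with the pressure gradient through a coupling coefficient, while the new form multiplies the seepage velocity by the excess charge per unit pore volume. The numerical values below are assumed for illustration, not figures from the paper.

```python
# Side-by-side sketch of the two streaming-current formulations.
# All numerical values are illustrative assumptions, not from the paper.

def classical_streaming_current(L, grad_p):
    """Classical form: j_s = -L * grad(p), with L the streaming current
    coupling term (A Pa^-1 m^-1) and grad_p the pressure gradient (Pa/m)."""
    return -L * grad_p

def charge_density_streaming_current(Qv, u):
    """New form: j_s = Qv * u, with Qv the excess electrical charge per
    unit pore volume (C/m^3) and u the seepage (Darcy) velocity (m/s)."""
    return Qv * u

# Hypothetical values: Qv decreases with permeability, so low-permeability
# media carry more excess charge per unit pore volume.
Qv = 0.5          # C/m^3
u = 1e-5          # m/s
js_new = charge_density_streaming_current(Qv, u)

L = 1e-9          # A Pa^-1 m^-1
grad_p = 1e4      # Pa/m
js_classical = classical_streaming_current(L, grad_p)
```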

  16. Functional requirements document for the Earth Observing System Data and Information System (EOSDIS) Scientific Computing Facilities (SCF) of the NASA/MSFC Earth Science and Applications Division, 1992

    Science.gov (United States)

    Botts, Michael E.; Phillips, Ron J.; Parker, John V.; Wright, Patrick D.

    1992-01-01

    Five scientists at MSFC/ESAD have EOS SCF investigator status. Each SCF has unique tasks which require the establishment of a computing facility dedicated to accomplishing those tasks. An SCF Working Group was established at ESAD with the charter of defining the computing requirements of the individual SCFs and recommending options for meeting these requirements. The primary goal of the working group was to determine which computing needs can be satisfied using either shared resources or separate but compatible resources, and which needs require unique individual resources. The requirements investigated included CPU-intensive vector and scalar processing, visualization, data storage, connectivity, and I/O peripherals. A review of computer industry directions and a market survey of computing hardware provided information regarding important industry standards and candidate computing platforms. It was determined that the total SCF computing requirements might be most effectively met using a hierarchy consisting of shared and individual resources. This hierarchy is composed of five major system types: (1) a supercomputer-class vector processor; (2) a high-end scalar multiprocessor workstation; (3) a file server; (4) a few medium- to high-end visualization workstations; and (5) several low- to medium-range personal graphics workstations. Specific recommendations for meeting the needs of each of these types are presented.

  17. Resource Conservation and Recovery Act ground-water monitoring projects for Hanford Facilities: Progress report for the period July 1 to September 30, 1989 - Volume 1 - Text

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.M.; Bates, D.J.; Lundgren, R.E.

    1989-12-01

    This is Volume 1 of a two-volume document that describes the progress of 14 Hanford Site ground-water monitoring projects for the period July 1 to September 30, 1989. This volume discusses the projects; Volume 2 provides as-built diagrams, completion/inspection reports, drilling logs, and geophysical logs for wells drilled, completed, or logged during this period. Volume 2 can be found on microfiche in the back pocket of Volume 1. The work described in this document is conducted by the Pacific Northwest Laboratory under the management of Westinghouse Hanford Company for the US Department of Energy. Concentrations of ground-water constituents are compared to federal drinking water standards throughout this document for reference purposes. All drinking water supplied from the sampled aquifer meets regulatory standards for drinking water quality.

  18. Design concepts for the Cherenkov Telescope Array CTA: an advanced facility for ground-based high-energy gamma-ray astronomy

    International Nuclear Information System (INIS)

    Allekotte, I.; Arnaldi, H.; Asorey, H.; Gomez Berisso, M.; Sofo Haro, M.; Cillis, A.; Rovero, A.C.; Supanitsky, A.D.; Actis, M.; Antico, F.; Bottani, A.; Ochoa, I.; Ringegni, P.; Vallejo, G.; De La Vega, G.; Etchegoyen, A.; Videla, M.; Gonzalez, F.; Pallota, J.; Quel, E.; Ristori, P.; Romero, G.E.; Suarez, A.; Papyan, G.; Pogosyan, L.; Sahakian, V.; Bissaldi, E.; Egberts, K.; Reimer, A.; Reimer, O.; Shellard, R.C.; Santos, E.M.; De Gouveia Dal Pino, E.M.; Kowal, G.; De Souza, V.; Todero Peixoto, C.J.; Maneva, G.; Temnikov, P.; Vankov, H.; Golev, V.; Ovcharov, E.; Bonev, T.; Dimitrov, D.; Hrupec, D.; Nedbal, D.; Rob, L.; Sillanpaa, A.; Takalo, L.; Beckmann, V.; Benallou, M.; Boutonnet, C.; Corlier, M.; Courty, B.; Djannati-Atai, A.; Dufour, C.; Gabici, S.; Guglielmi, L.; Olivetto, C.; Pita, S.; Punch, M.; Selmane, S.; Terrier, R.; Yoffo, B.; Brun, P.; Carton, P.H.; Cazaux, S.; Corpace, O.; Delagnes, E.; Disset, G.; Durand, D.; Glicenstein, J.F.; Guilloux, F.; Kosack, K.; Medina, C.; Micolon, P.; Mirabel, F.; Moulin, E.; Peyaud, B.; Reymond, J.M.; Veyssiere, C.

    2011-01-01

    Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes. Ground-based gamma-ray astronomy has huge potential in astrophysics, particle physics and cosmology. CTA is an international initiative to build the next-generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV-10 TeV range and extension to energies well below 100 GeV and above 100 TeV. CTA will consist of two arrays (one in the north, one in the south) for full-sky coverage and will be operated as an open observatory. The design of CTA is based on currently available technology. This document reports on the status and presents the major design concepts of CTA. (authors)

  19. International standard problem (ISP) No. 41. Containment iodine computer code exercise based on a radioiodine test facility (RTF) experiment

    International Nuclear Information System (INIS)

    2000-04-01

    International Standard Problem (ISP) exercises are comparative exercises in which predictions of different computer codes for a given physical problem are compared with each other or with the results of a carefully controlled experimental study. The main goal of ISP exercises is to increase confidence in the validity and accuracy of the tools used in assessing the safety of nuclear installations. Moreover, they enable code users to gain experience and demonstrate their competence. The ISP No. 41 exercise, a computer code exercise based on a Radioiodine Test Facility (RTF) experiment on iodine behaviour in containment under severe accident conditions, is one such ISP exercise. The ISP No. 41 exercise was born of a recommendation made at the Fourth Iodine Chemistry Workshop held at PSI, Switzerland, in June 1996: 'the performance of an International Standard Problem as the basis of an in-depth comparison of the models as well as contributing to the database for validation of iodine codes' [Proceedings NEA/CSNI/R(96)6, Summary and Conclusions NEA/CSNI/R(96)7]. COG (the CANDU Owners Group), comprising AECL and the Canadian nuclear utilities, offered to make the results of a Radioiodine Test Facility (RTF) test available for such an exercise. The ISP No. 41 exercise was endorsed in turn by the FPC (PWG4's Task Group on Fission Product Phenomena in the Primary Circuit and the Containment), PWG4 (the CSNI Principal Working Group on the Confinement of Accidental Radioactive Releases), and the CSNI. The OECD/NEA Committee on the Safety of Nuclear Installations (CSNI) has sponsored forty-five ISP exercises over the last twenty-four years, thirteen of them in the area of severe accidents. The criteria for the selection of the RTF test as a basis for the ISP-41 exercise were: (1) complementarity with other RTF tests available through the PHEBUS and ACE programmes; (2) simplicity for ease of modelling; and (3) good quality data. 
A simple RTF experiment performed under controlled

  20. Ground-water monitoring compliance projects for Hanford Site Facilities: Progress report for the period April 1--June 30, 1988: Volume 1, Text

    International Nuclear Information System (INIS)

    1988-09-01

    This is Volume 1 of a two-volume set of documents that describes the progress of 10 Hanford Site ground-water monitoring projects for the period April 1 to June 30, 1988. This volume discusses the projects; Volume 2 provides as-built diagrams, drilling logs, and geophysical logs for wells drilled during this period in the 100-N Area and near the 216-A-36B Crib

  1. A facility for training Space Station astronauts

    Science.gov (United States)

    Hajare, Ankur R.; Schmidt, James R.

    1992-01-01

    The Space Station Training Facility (SSTF) will be the primary facility for training the Space Station Freedom astronauts and the Space Station Control Center ground support personnel. Conceptually, the SSTF will consist of two parts: a Student Environment and an Author Environment. The Student Environment will contain trainers, instructor stations, computers and other equipment necessary for training. The Author Environment will contain the systems that will be used to manage, develop, integrate, test and verify, operate and maintain the equipment and software in the Student Environment.

  2. Prediction of ground motion from underground nuclear weapons tests as it relates to siting of a nuclear waste storage facility at NTS and compatibility with the weapons test program

    International Nuclear Information System (INIS)

    Vortman, L.J. IV.

    1980-04-01

    This report assumes reasonable criteria for NRC licensing of a nuclear waste storage facility at the Nevada Test Site where it would be exposed to ground motion from underground nuclear weapons tests. Prediction equations and their standard deviations have been determined from measurements on a number of nuclear weapons tests. The effect of various independent parameters on standard deviation is discussed. That the data sample is sufficiently large is shown by the fact that additional data have little effect on the standard deviation. It is also shown that coupling effects can be separated out of the other contributions to the standard deviation. An example, based on certain licensing assumptions, shows that it should be possible to have a nuclear waste storage facility in the vicinity of Timber Mountain which would be compatible with a 700 kt weapons test in the Buckboard Area if the facility were designed to withstand a peak vector acceleration of 0.75 g. The prediction equation is a log-log linear equation which predicts acceleration as a function of yield of an explosion and the distance from it
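
    The log-log linear prediction equation described above has the generic form log10(a) = c0 + c1·log10(W) + c2·log10(R), predicting peak acceleration a (in g) from yield W (kt) and distance R. A hedged sketch follows; the coefficients are illustrative placeholders, not Vortman's fitted values.

```python
import math

# Generic log-log linear ground-motion prediction of the form used above:
#   log10(a) = c0 + c1*log10(W) + c2*log10(R)
# Coefficients here are ILLUSTRATIVE, not the report's fitted values.

def predict_acceleration(yield_kt, distance_km, c0=-1.0, c1=0.8, c2=-1.6):
    """Predict peak acceleration (g) from yield (kt) and distance (km)."""
    log_a = c0 + c1 * math.log10(yield_kt) + c2 * math.log10(distance_km)
    return 10.0 ** log_a

# Hypothetical case: a 700 kt test observed at 30 km
a = predict_acceleration(700.0, 30.0)
```

    In the licensing context described above, one would also carry the fitted standard deviation through such a prediction to obtain a design value at a chosen confidence level.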

  3. In situ analysis of soil at an open burning/open detonation disposal facility: J-Field, Aberdeen Proving Ground, Maryland

    International Nuclear Information System (INIS)

    Martino, L.; Cho, E.; Wrobel, J.

    1994-01-01

    Investigators have used a field-portable X-Ray Fluorescence (XRF) Analyzer to screen soils for a suite of metals indicative of the open burning and open detonation (OB/OD) activities that occurred at the J-Field site at Aberdeen Proving Ground, Maryland. The field XRF results were incorporated into a multiphase investigation of contaminants at the Toxic Burning Pits Area of Concern at J-Field. The authors determined that the field-portable XRF unit used for the study and the general concept of field XRF screening are invaluable tools for investigating an OB/OD site where intrusive sampling techniques could present unacceptable hazards to site workers

  4. Optical studies in the holographic ground station

    Science.gov (United States)

    Workman, Gary L.

    1991-01-01

    The Holographic Ground System (HGS) Facility in rooms 22 & 123, Building 4708, has been developed to provide for ground-based research in determining pre-flight parameters and analyzing the results from space experiments. The University of Alabama in Huntsville (UAH) has researched the analysis aspects of the HGS and reports its findings here. Some of the results presented here also appear in the Facility Operating Procedure (FOP), which contains instructions for power-up, operation, and power-down of the Fluid Experiment System (FES) Holographic Ground System (HGS) Test Facility for the purpose of optically recording fluid and/or crystal behavior in a test article during ground-based testing, through the construction of holograms and the recording of videotape. The alignment of the optical bench components and the holographic reconstruction and microscopy alignment sections were also included in the document for continuity, even though they are not used until after optical recording of the test article, setup of the support subsystems, and the Automated Holography System (AHS) computer. The HGS provides optical recording and monitoring during GCEL runs or development testing of potential FES flight hardware or software. This recording/monitoring can be via 70mm holographic film, standard videotape, or digitized images on computer disk. All optical bench functions necessary to construct holograms are under the control of the AHS personal computer (PC). These include the type of exposure, time intervals between exposures, exposure length, film frame identification, film advancement, film platen evacuation and repressurization, light source diffuser introduction, and control of real-time video monitoring. The completed sequence of hologram types (single exposure, diffuse double exposure, etc.) and their times of occurrence can be displayed, printed, or stored on floppy disk post-test by the user.

  5. Robotics Research Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This 60 feet x 100 feet structure on the grounds of the Fort Indiantown Gap Pennsylvania National Guard (PNG) Base is a mixed-use facility comprising office space,...

  6. Aircraft Test & Evaluation Facility (Hush House)

    Data.gov (United States)

    Federal Laboratory Consortium — The Aircraft Test and Evaluation Facility (ATEF), or Hush House, is a noise-abated ground test sub-facility. The facility's controlled environment provides 24-hour...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months, activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created to support the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, which collects user experience and feedback during analysis activities and develops tools to increase efficiency. The development plan for DMWM for 2009/2011 was drawn up at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  8. Computer aided detection in prostate cancer diagnostics: A promising alternative to biopsy? A retrospective study from 104 lesions with histological ground truth.

    Directory of Open Access Journals (Sweden)

    Anika Thon

    Full Text Available Prostate cancer (PCa) diagnosis by means of multiparametric magnetic resonance imaging (mpMRI) is a current challenge for the development of computer-aided detection (CAD) tools. An innovative CAD software (Watson Elementary™) was proposed to achieve high sensitivity and specificity, as well as to provide a correlate to Gleason grade. To assess the performance of Watson Elementary™ in automated PCa diagnosis in our hospital's database of MRI-guided prostate biopsies, a retrospective evaluation was performed on 104 lesions (47 PCa, 57 benign) from 79 patients (64.61±6.64 years old) using 3T T2-weighted imaging, apparent diffusion coefficient (ADC) maps, and dynamic contrast enhancement series. Watson Elementary™ utilizes signal intensity, diffusion properties and kinetic profile to compute a proportional Gleason grade predictor, termed the Malignancy Attention Index (MAI). The analysis focused on (i) the CAD sensitivity and specificity in classifying suspect lesions and (ii) the MAI correlation with the histopathological ground truth. The software revealed a sensitivity of 46.80% for PCa classification. The specificity for PCa was found to be 75.43%, with a positive predictive value of 61.11%, a negative predictive value of 63.23% and a false discovery rate of 38.89%. CAD classified PCa and benign lesions with equal probability (P = 0.06, χ2 test). Accordingly, receiver operating characteristic analysis suggests a poor predictive value for MAI, with an area under the curve of 0.65 (P = 0.02), which is not superior to the performance of board-certified observers. Moreover, MAI revealed no significant correlation with Gleason grade (P = 0.60, Pearson's correlation). The tested CAD software for mpMRI analysis was a weak PCa biomarker in this dataset. Targeted prostate biopsy and histology remain the gold standard for prostate cancer diagnosis.
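
    The rates reported above follow from a standard 2×2 confusion matrix. Back-deriving the counts from the abstract's 47 PCa and 57 benign lesions (an inference; the paper does not state the counts explicitly) gives TP=22, FN=25, TN=43, FP=14, which reproduce the published percentages to rounding.

```python
# Confusion-matrix counts back-derived from the abstract's reported rates
# over 47 PCa and 57 benign lesions (an inference, not stated in the paper):
# TP=22, FN=25, TN=43, FP=14.

def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-test metrics from 2x2 confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "fdr": fp / (tp + fp),           # false discovery rate
    }

m = diagnostic_metrics(tp=22, fp=14, fn=25, tn=43)
# sensitivity ~ 0.468, specificity ~ 0.754, ppv ~ 0.611,
# npv ~ 0.632, fdr ~ 0.389 -- matching the abstract to rounding
```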

  9. Validation of Advanced Computer Codes for VVER Technology: LB-LOCA Transient in PSB-VVER Facility

    Directory of Open Access Journals (Sweden)

    A. Del Nevo

    2012-01-01

    Full Text Available The OECD/NEA PSB-VVER project provided unique and useful experimental data for code validation from the PSB-VVER test facility. This facility represents the scaled-down layout of the Russian-designed pressurized water reactor, namely VVER-1000. Five experiments were executed, dealing with loss of coolant scenarios (small, intermediate, and large break loss of coolant accidents), a primary-to-secondary leak, and a parametric study (natural circulation test) aimed at characterizing the VVER system at reduced mass inventory conditions. The comparative analysis presented in the paper concerns the large break loss of coolant accident experiment. Four participants from three different institutions were involved in the benchmark and applied their own models and setups for four different thermal-hydraulic system codes. The benchmark demonstrated the performance of these codes in predicting phenomena relevant to safety on the basis of fixed criteria.

  10. High-resolution computed tomography to differentiate chronic diffuse interstitial lung diseases with predominant ground-glass pattern using logical analysis of data

    International Nuclear Information System (INIS)

    Martin, Sophie Grivaud; Brauner, Michel W.; Rety, Frederique; Kronek, Louis-Philippe; Brauner, Nadia; Valeyre, Dominique; Nunes, Hilario; Brillet, Pierre-Yves

    2010-01-01

    We evaluated the performance of high-resolution computed tomography (HRCT) to differentiate chronic diffuse interstitial lung diseases (CDILD) with predominant ground-glass pattern by using logical analysis of data (LAD). A total of 162 patients were classified into seven categories: sarcoidosis (n = 38), connective tissue disease (n = 32), hypersensitivity pneumonitis (n = 18), drug-induced lung disease (n = 15), alveolar proteinosis (n = 12), idiopathic non-specific interstitial pneumonia (n = 10) and miscellaneous (n = 37). First, 40 CT attributes were investigated by the LAD to build up patterns characterising a category. From the association of patterns, LAD determined models specific to each CDILD. Second, data were recomputed by adding eight clinical attributes to the analysis. The 20 x 5 cross-folding method was used for validation. Models could be individualised for sarcoidosis, hypersensitivity pneumonitis, connective tissue disease and alveolar proteinosis. An additional model was individualised for drug-induced lung disease by adding clinical data. No model was demonstrated for idiopathic non-specific interstitial pneumonia and the miscellaneous category. The results showed that HRCT had a good sensitivity (≥64%) and specificity (≥78%) and a high negative predictive value (≥93%) for diseases with a model. Higher sensitivity (≥78%) and specificity (≥89%) were achieved by adding clinical data. The diagnostic performance of HRCT is high and can be increased by adding clinical data. (orig.)
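
    The "20 x 5 cross-folding" validation mentioned above is, presumably, 20 repetitions of 5-fold cross-validation. A generic sketch of that splitting scheme (not the authors' LAD implementation), assuming shuffled repetitions over the study's 162 patients:

```python
import random

# Generic repeated k-fold splitter: 20 shuffled repetitions of 5-fold
# cross-validation, as one reading of the abstract's "20 x 5 cross-folding".
# This is an illustrative sketch, not the authors' LAD implementation.

def repeated_kfold_indices(n, k=5, repeats=20, seed=0):
    """Yield (train, test) index lists for each fold of each repetition."""
    rng = random.Random(seed)
    for _ in range(repeats):
        idx = list(range(n))
        rng.shuffle(idx)
        folds = [idx[i::k] for i in range(k)]      # k near-equal folds
        for i in range(k):
            test = folds[i]
            train = [j for f in folds[:i] + folds[i + 1:] for j in f]
            yield train, test

# 162 patients in the study -> 20 repeats x 5 folds = 100 train/test splits
splits = list(repeated_kfold_indices(n=162))
```

    Per-category sensitivity and specificity would then be averaged over all 100 held-out test sets.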

  11. Loss of cellular viability in areas of ground-glass opacity on computed tomography images immediately after pulmonary radiofrequency ablation in rabbits

    International Nuclear Information System (INIS)

    Kuroki, Masaomi; Nakada, Hiroshi; Yamashita, Atsushi; Sawaguchi, Akira; Uchino, Noriko; Sato, Shinya; Asada, Yujiro; Tamura, Shozo; Asanuma, Taketoshi

    2012-01-01

    The purpose of this study was to determine cellular viability of lung parenchyma and neoplastic cells in areas of ground-glass opacity (GGO) on computed tomography (CT) images immediately after pulmonary radiofrequency ablation (RFA) in rabbits. A LeVeen RFA electrode was placed percutaneously into rabbit lungs with or without metastatic VX2 tumors. Five minutes later, seven isolated lungs were imaged by use of a multi-detector row CT scanner, and the images were compared with histological features. The cellular viability of the lung tissues was assessed by nicotinamide adenine dinucleotide hydrogen (NADH) staining in eight normal lungs and in three lungs with multiple metastatic tumors. All lung lesions appeared as bilayered structures with a central, dense, attenuated area and an outer area of GGO on CT images, and as three-layered structures on macroscopic and microscopic images 5 min after RFA. The GGO areas approximately corresponded to the outer two layers in macroscopic images that were exudative and congestive on microscopic images. Staining for NADH was significantly reduced in the GGO and densely attenuated areas with or without tumor tissue staining compared with the non-ablated area. Our results suggest that an area of GGO that appears on CT immediately after RFA can be effectively treated by RFA. (author)

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team has successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  13. Impact of revised 10 CFR 20 on existing performance assessment computer codes used for LLW disposal facilities

    International Nuclear Information System (INIS)

    Leonard, P.R.; Seitz, R.R.

    1992-04-01

    The US Nuclear Regulatory Commission (NRC) recently announced a revision to Title 10 of the Code of Federal Regulations, Part 20 (10 CFR 20), ''Standards for Protection Against Radiation,'' which incorporates recommendations contained in Publications 26 and 30 of the International Commission on Radiological Protection (ICRP), issued in 1977 and 1979, respectively. The revision to 10 CFR 20 was also developed in parallel with Presidential Guidance on occupational radiation protection published in the Federal Register. This study concludes that the issuance of the revised 10 CFR 20 will not affect calculations using the computer codes considered in this report: in general, the computer codes, and the EPA and DOE guidance on which they are based, were developed in a manner consistent with the guidance provided in ICRP 26/30, well before the revision of 10 CFR 20

  14. Predictive value of mutant p53 expression index obtained from nonenhanced computed tomography measurements for assessing invasiveness of ground-glass opacity nodules

    Directory of Open Access Journals (Sweden)

    Wang W

    2016-03-01

    Full Text Available Wei Wang,1 Jian Li,2 Ransheng Liu,1 Aixu Zhang,1 Zhiyong Yuan1 1Department of Radiation Oncology, Tianjin Medical University Cancer Institute and Hospital, National Clinical Research Center of Cancer, Key Laboratory of Cancer Prevention and Therapy, Tianjin, People’s Republic of China; 2Department of Radiology, Tianjin Hospital, Tianjin, People’s Republic of China Purpose: To predict the p53 expression index (p53-EI) based on measurements from computed tomography (CT) for preoperatively assessing the pathologies of nodular ground-glass opacities (nGGOs). Methods: Information on 176 cases with nGGOs on high-resolution CT that were pathologically confirmed as adenocarcinoma was collected. Diameters, total volumes (TVs), and the maximum (MAX), average (AVG), and standard deviation (STD) of CT attenuations within the nGGOs were measured. p53-EI was evaluated through immunohistochemistry with Image-Pro Plus 6.0. A multiple linear stepwise regression model was established to calculate the p53-EI prediction from the CT measurements. Receiver-operating characteristic curve analysis was performed to compare the diagnostic performance of the variables in differentiating preinvasive adenocarcinoma (PIA), minimally invasive adenocarcinoma (MIA), and invasive adenocarcinoma (IAC). Results: Diameters, TVs, MAX, AVG, and STD showed significant differences among PIAs, MIAs, and IACs (all P-values <0.001), with only MAX being unable to differentiate MIAs from IACs (P=0.106). The mean p53-EIs of PIAs, MIAs, and IACs were 3.4±2.0, 7.2±1.9, and 9.8±2.7, with significant intergroup differences (all P-values <0.001). An equation was established by multiple linear regression as: p53-EI prediction = 0.001 × TV + 0.012 × AVG + 0.022 × STD + 9.345, through which p53-EI predictions were calculated to be 4.4%±1.0%, 6.8%±1.3%, and 8.5%±1.4% for PIAs, MIAs, and IACs (Kruskal–Wallis test P<0.001; Tamhane’s T2 test: PIA vs MIA P<0.001, MIA vs IAC P<0.001, respectively). Although not significant, p53-EI prediction
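
    The regression equation reported above can be applied directly to CT measurements. The nodule values below are illustrative inputs chosen for demonstration, not data from the study.

```python
# Evaluating the abstract's reported regression:
#   p53-EI prediction = 0.001*TV + 0.012*AVG + 0.022*STD + 9.345
# with TV the total nodule volume, and AVG/STD the mean and standard
# deviation of CT attenuation within the nodule (negative HU values are
# typical for ground-glass opacities). Inputs below are illustrative only.

def predict_p53_ei(tv, avg, std):
    """Return the predicted p53 expression index (%) from CT measurements."""
    return 0.001 * tv + 0.012 * avg + 0.022 * std + 9.345

# A hypothetical ground-glass nodule: 500 mm^3, mean -600 HU, STD 120 HU
p53 = predict_p53_ei(tv=500.0, avg=-600.0, std=120.0)
# p53 ~ 5.3%, falling between the reported PIA (4.4%) and MIA (6.8%) means
```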

  15. Development of application program and building database to increase facilities for using the radiation effect assessment computer codes

    International Nuclear Information System (INIS)

    Hyun Seok Ko; Young Min Kim; Suk-Hoon Kim; Dong Hoon Shin; Chang-Sun Kang

    2005-01-01

    The current radiation effect assessment system requires skillful application of various codes and a high level of specialized knowledge in each field. As a result, it is very difficult for radiation users who lack such specialized knowledge to assess or interpret radiation effects properly. For this purpose, we have already developed five Windows-based computer codes constituting the radiation effect assessment system for radiation-utilizing fields, including nuclear power generation. A computer program is needed so that non-specialists can easily use these five codes. We therefore implemented an AI-based expert system that can infer the appropriate assessment approach by itself, according to the characteristics of the given problem. The expert program can guide users, search data, and forward inquiries directly to the administrator. Conceptually, considering the circumstances a user applying the five computer codes may actually encounter, we addressed the following aspects. First, the accessibility of the necessary concepts and data must be improved. Second, acquiring the underlying theory and using the corresponding computer code must be easy. Third, a Q&A function is needed to resolve user questions not considered previously. Finally, the database must be updated continuously. To meet these needs, we developed a client program to organize reference data, built an access methodology (queries) for the organized data, and added functions for the visual display of retrieved data. An instruction method (an effective theory-acquisition procedure and methodology) was implemented for learning the theory underlying the five computer codes, and a database management program (DBMS) was developed to update the data easily and continuously. For the Q&A function, a Q&A board was embedded in the client program so that users can search the contents of questions and answers. (authors)

  16. Modeling bubble condenser containment with computer code COCOSYS: post-test calculations of the main steam line break experiment at ELECTROGORSK BC V-213 test facility

    International Nuclear Information System (INIS)

    Lola, I.; Gromov, G.; Gumenyuk, D.; Pustovit, V.; Sholomitsky, S.; Wolff, H.; Arndt, S.; Blinkov, V.; Osokin, G.; Melikhov, O.; Melikhov, V.; Sokoline, A.

    2005-01-01

    Containment of the WWER-440 Model 213 nuclear power plant features a Bubble Condenser, a complex passive pressure suppression system intended to limit the pressure rise in the containment during accidents. Owing to the lack of experimental evidence of its successful operation in the original design documentation, the performance of this system during accidents involving ruptures of large high-energy pipes on the primary and secondary sides remains a known safety concern for this containment type. A number of research and analytical studies have therefore been conducted in recent years by the countries operating WWER-440 reactors and their Western partners to verify Bubble Condenser operation under accident conditions. Comprehensive experimental research at the Electrogorsk BC V-213 test facility, commissioned in 1999 at the Electrogorsk Research and Engineering Centre (EREC), constitutes an essential part of these efforts. At present this is the only operating large-scale facility enabling integral tests of Bubble Condenser performance. Several large international research projects conducted at this facility in 1999-2003 covered a spectrum of pipe break accidents. These experiments substantially improved understanding of the overall system performance and the thermal-hydraulic phenomena in the Bubble Condenser Containment, and provided valuable information for validating containment codes against experimental results. One of the recent experiments, denoted SLB-G02, simulated a steam line break. The results of this experiment are of special value for engineers working in the area of computer code application to WWER-440 containment analyses, giving an opportunity to verify the validity of code predictions and to identify possibilities for model improvement. This paper describes the results of the post-test calculations of the SLB-G02 experiment, conducted as a joint effort of GRS, Germany and Ukrainian technical support organizations for

  17. NuSTAR calibration facility and multilayer reference database: Optic response model comparison to NuSTAR on-ground calibration data

    DEFF Research Database (Denmark)

    Brejnholt, Nicolai

    The Nuclear Spectroscopic Telescope ARray (NuSTAR) is a NASA Small Explorer mission carrying the first focusing hard X-ray telescope (5-80 keV) to orbit. NuSTAR is slated for launch in 2012. Through a leap in sensitivity, the realization of focusing optics holds promise of heralding in a golden… the optic response for both on- and off-axis NuSTAR observations, detailed knowledge of the as-coated multilayer is required. The purpose of this thesis is to establish a multilayer reference database. As an integral part of this effort, a hard X-ray calibration facility was designed and constructed. Each… To couple the as-coated multilayer to the actual optics, ray tracing is carried out in a detailed geometric model of the optic, including in-situ measured figure error for the mounted substrates. The effective area as a function of energy estimated from ray tracing is compared to NuSTAR on…

  18. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride. Edited by M-C. Sawley, with contributions from: P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  19. Grounded theory.

    Science.gov (United States)

    Harris, Tina

    2015-04-29

    Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.

  20. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  4. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables

  5. Implementation of the Principal Component Analysis onto High-Performance Computer Facilities for Hyperspectral Dimensionality Reduction: Results and Comparisons

    Directory of Open Access Journals (Sweden)

    Ernestina Martel

    2018-06-01

    Full Text Available Dimensionality reduction is a critical preprocessing step for increasing the efficiency and performance of many hyperspectral imaging algorithms. However, dimensionality reduction algorithms such as Principal Component Analysis (PCA) are computationally demanding, which makes their implementation on high-performance computer architectures advisable for applications under strict latency constraints. This work presents the implementation of the PCA algorithm on two different high-performance devices, namely an NVIDIA Graphics Processing Unit (GPU) and a Kalray manycore processor, uncovering a highly valuable set of tips and tricks for taking full advantage of the inherent parallelism of these high-performance computing platforms and hence reducing the time required to process a given hyperspectral image. Moreover, the results obtained with different hyperspectral images are compared with those obtained with a recently published field-programmable gate array (FPGA)-based implementation of the PCA algorithm, providing, for the first time in the literature, a comprehensive analysis that highlights the pros and cons of each option.
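As a single-threaded reference for the accelerated implementations discussed above, the PCA band reduction itself can be sketched in plain NumPy. The cube shape, component count, and function name below are illustrative, not taken from the paper:

```python
import numpy as np

def pca_reduce(cube, n_components):
    """Reduce a hyperspectral cube of shape (rows, cols, bands) to n_components bands.

    Minimal PCA via eigendecomposition of the band-covariance matrix.
    """
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(np.float64)    # one pixel per row
    X -= X.mean(axis=0)                               # center each band
    cov = (X.T @ X) / (X.shape[0] - 1)                # bands x bands covariance
    eigvals, eigvecs = np.linalg.eigh(cov)            # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_components]  # keep top components
    return (X @ eigvecs[:, order]).reshape(rows, cols, n_components)

# Tiny synthetic cube: 10x10 pixels, 8 bands, reduced to 3 components.
cube = np.random.default_rng(0).normal(size=(10, 10, 8))
reduced = pca_reduce(cube, 3)
print(reduced.shape)  # (10, 10, 3)
```

The GPU and manycore versions in the paper parallelize exactly these steps (centering, covariance, eigendecomposition, projection); real hyperspectral cubes have hundreds of bands, which is what makes the covariance and projection stages worth accelerating.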

  6. Using computer graphics to analyze the placement of neutral-beam injectors for the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    Horvath, J.A.

    1977-01-01

    To optimize the neutral-beam current incident on the fusion plasma and limit the heat load on exposed surfaces of the Mirror Fusion Test Facility magnet coils, impingement of the neutral beams on the magnet structure must be minimized. Also, placement of the neutral-beam injectors must comply with specifications for neutral-current heating of the plasma and should allow maximum flexibility to accommodate alternative beam aiming patterns without significant hardware replacement or experiment down-time. Injector placements and aimings are analyzed by means of the Structural Analysis Movie Post Processor (SAMPP), a general-purpose graphics code for the display of three-dimensional finite-element models. SAMPP is used to visually assemble, disassemble, or cut away sections of the complex three-dimensional apparatus, which is represented by an assemblage of 8-node solid finite elements. The resulting picture is used to detect and quantify interactions between the structure and the neutral-particle beams

  7. Pure ground glass nodular adenocarcinomas: Are preoperative positron emission tomography/computed tomography and brain magnetic resonance imaging useful or necessary?

    Science.gov (United States)

    Cho, Hyoun; Lee, Ho Yun; Kim, Jhingook; Kim, Hong Kwan; Choi, Joon Young; Um, Sang-Won; Lee, Kyung Soo

    2015-09-01

    The utility of (18)F-Fluorodeoxyglucose positron emission tomography/computed tomography (FDG PET/CT) scanning and brain magnetic resonance imaging (MRI) as a staging workup for lung adenocarcinoma manifesting as pure ground-glass opacity (GGO) is unknown. The purpose of this study was to determine the utility of these 2 tests for preoperative staging of pure GGO nodular lung adenocarcinoma. The study included 164 patients (male:female, 73:91; mean age, 62 years) with pure GGO nodular lung adenocarcinoma who underwent PET/CT (136 patients) and/or brain MRI (109 patients) before surgery. Pathologic N staging and dedicated standard imaging or follow-up imaging findings for M staging were used as reference standards. The median follow-up time was 47.9 months. On PET/CT scan, abnormal FDG uptake in lymph nodes was found in 2 of 136 patients (1.5%); both were negative on final pathology. Abnormal FDG uptake in the liver was detected in 1 patient, which was also confirmed to be negative by dedicated abdominal CT. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of PET/CT in detecting metastases were not applicable, 98% (95% confidence interval [CI], 94%-100%), 0% (95% CI, 0%-71%), 100% (95% CI, 97%-100%), and 98% (95% CI, 94%-100%), respectively. No brain metastasis was found on preoperative brain MRI in the 109 patients. Of these 109 patients, 1 (0.9%) developed brain metastasis 30 months after surgical resection. PET/CT and brain MRI are not necessary in the staging of pure GGO nodular lung adenocarcinoma. Copyright © 2015 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
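The performance figures quoted above all follow from a standard 2×2 confusion matrix. A minimal sketch (the function name is ours; the counts are the nodal PET/CT findings reported above: 2 abnormal scans out of 136, both negative on pathology, with no true metastases):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening-test metrics from a 2x2 confusion matrix."""
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn) if tp + fn else None,  # undefined with no true positives
        "specificity": tn / (tn + fp) if tn + fp else None,
        "ppv":         tp / (tp + fp) if tp + fp else None,
        "npv":         tn / (tn + fn) if tn + fn else None,
        "accuracy":    (tp + tn) / total,
    }

# Nodal findings from the study: 2 false positives, 134 true negatives, no metastases.
m = diagnostic_metrics(tp=0, fp=2, fn=0, tn=134)
print(m)  # sensitivity None (not applicable), specificity ~0.985, PPV 0.0, NPV 1.0
```

Note how this reproduces the pattern in the abstract: with no true positives, sensitivity is undefined ("not applicable"), PPV is 0%, and NPV is 100%.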

  8. Investigation of optimal display size for detecting ground-glass opacity on high resolution computed tomography using a new digital contrast-detail phantom

    International Nuclear Information System (INIS)

    Yamaguchi, Michihiro; Fujita, Hideki; Bessho, Yuichi; Inoue, Tatsuro; Asai, Yoshiyuki; Murase, Kenya

    2011-01-01

    The purpose of this study was to clarify the relationship between the display size of high resolution computed tomography (HRCT) images used for detecting ground-glass opacity (GGO) and observer performance, using a digital contrast-detail (d-CD) phantom. The structure of the d-CD phantom was determined on the basis of actual images of GGOs and background noise from 22 patients diagnosed with GGO on chest HRCT. The d-CD phantom has a 512 × 512 matrix and a total of 100 holes: the diameter of the holes increases stepwise from 2 to 20 pixels at 2-pixel intervals in the vertical direction, and the CT value varies stepwise from 2 to 200 HU in the horizontal direction. The observer performance study was carried out for three display sizes (30 cm × 30 cm enlarged, 13 cm × 13 cm original, and 7 cm × 7 cm reduced) using a 2-megapixel LCD monitor, and was analyzed using Friedman and Wilcoxon statistical tests. Observer performance at the original and reduced display sizes was superior to that at the enlarged size (P = 0.006 and 0.037 for the original and reduced sizes, respectively), whereas there was no significant difference between the original and reduced sizes (P = 0.77). The d-CD phantom enables a short-term evaluation of observer performance and is useful in analyzing the relationship between display size and observer performance.
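The phantom layout described above (a 512 × 512 image with a 10 × 10 grid of circular holes, diameter stepping along one axis and CT contrast along the other) can be generated programmatically. In the sketch below, the grid spacing, the linear 22-HU contrast steps, and the lung-like background value are assumptions for illustration; the paper's exact layout may differ:

```python
import numpy as np

def make_dcd_phantom(size=512, background=-800):
    """Digital contrast-detail phantom: 100 circular holes on a 10x10 grid.

    Rows vary hole diameter 2..20 px in 2-px steps; columns vary contrast
    2..200 HU (assumed linear steps) above an assumed background HU.
    """
    img = np.full((size, size), background, dtype=np.int16)
    cell = size // 10                      # one grid cell per hole
    yy, xx = np.mgrid[0:size, 0:size]
    for i in range(10):                    # row index -> diameter 2, 4, ..., 20 px
        for j in range(10):                # col index -> contrast 2, 24, ..., 200 HU
            d = 2 * (i + 1)
            hu = 2 + 22 * j
            cy = cell * i + cell // 2
            cx = cell * j + cell // 2
            mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= (d / 2) ** 2
            img[mask] = background + hu
    return img

phantom = make_dcd_phantom()
print(phantom.shape, int(phantom.max() - phantom.min()))  # (512, 512) 200
```

Such a purely digital phantom is what makes the "short-term evaluation" possible: no scanner time is needed, and the same pixel data can be rendered at each display size under test.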

  9. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  11. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  12. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are being completed and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focused on preparations for Run 2 and on improvements in data access and flexibility of resource use. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   Tape utilisation was a focus for the operations teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  15. Steam condensation induced water hammer in a vertical up-fill configuration within an integral test facility. Experiments and computational simulations

    Energy Technology Data Exchange (ETDEWEB)

    Dirndorfer, Stefan

    2017-01-17

    Condensation induced water hammer is a source of danger and unpredictable loads in pipe systems. Studies of condensation induced water hammer have predominantly addressed horizontal pipes; studies of vertical pipe geometries are quite rare. This work presents a new integral test facility and an analysis of condensation induced water hammer in a vertical up-fill configuration. Thanks to state-of-the-art instrumentation, the phenomenology of vertical condensation induced water hammer can be analysed by means of sufficiently highly sampled experimental data. The system code ATHLET is used to simulate the UniBw condensation induced water hammer experiments. A newly developed and implemented direct contact condensation model enables ATHLET to calculate condensation induced water hammer. The modified ATHLET system code is validated against selected experiments. A sensitivity analysis in ATHLET, together with the experimental data, makes it possible to assess the performance of ATHLET in computing condensation induced water hammer in a vertical up-fill configuration.

  16. Steam condensation induced water hammer in a vertical up-fill configuration within an integral test facility. Experiments and computational simulations

    International Nuclear Information System (INIS)

    Dirndorfer, Stefan

    2017-01-01

    Condensation induced water hammer is a source of danger and unpredictable loads in pipe systems. Studies of condensation induced water hammer have predominantly addressed horizontal pipes; studies of vertical pipe geometries are quite rare. This work presents a new integral test facility and an analysis of condensation induced water hammer in a vertical up-fill configuration. Thanks to state-of-the-art instrumentation, the phenomenology of vertical condensation induced water hammer can be analysed by means of sufficiently highly sampled experimental data. The system code ATHLET is used to simulate the UniBw condensation induced water hammer experiments. A newly developed and implemented direct contact condensation model enables ATHLET to calculate condensation induced water hammer. The modified ATHLET system code is validated against selected experiments. A sensitivity analysis in ATHLET, together with the experimental data, makes it possible to assess the performance of ATHLET in computing condensation induced water hammer in a vertical up-fill configuration.

  17. Radiological Risk Assessments for Occupational Exposure at Fuel Fabrication Facility in AlTuwaitha Site Baghdad – Iraq by using RESRAD Computer Code

    Science.gov (United States)

    Ibrahim, Ziadoon H.; Ibrahim, S. A.; Mohammed, M. K.; Shaban, A. H.

    2018-05-01

    The purpose of this study is to evaluate the radiological risks to workers from one year of their activities at the Fuel Fabrication Facility (FFF), so that the necessary protection can be provided to prevent or minimize the risks resulting from these activities; this site is now under the Iraqi decommissioning program (40). Surface and subsurface soil samples were collected from different positions in this facility and analyzed by gamma-ray spectroscopy using a high-purity germanium (HPGe) detector. A mixture of radioactive isotopes (232Th, 40K, 238U, 235U, 137Cs) was found; according to the laboratory results, the highest values were (975758) for 238U, (21203) for 235U, (218) for 232Th, (4046) for 40K and (129) for 137Cs, in Bq/kg. The annual total radiation dose and risks were estimated using the RESRAD (onsite) 70 computer code. The highest total radiation dose was (5617μSv/year) in the area represented by soil sample (S7), and the radiological risks (morbidity and mortality) were (118E02, 8661E03), respectively, in the same area.
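RESRAD performs full exposure-pathway analysis internally, but the headline conversion from an annual effective dose to a nominal stochastic risk is a single multiplication by a risk coefficient. A sketch using the ICRP 103 nominal whole-population coefficient (an assumed textbook value, not RESRAD's internal model, and the dose figure is illustrative rather than taken from this study):

```python
# ICRP 103 nominal stochastic risk coefficient for the whole population,
# ~5.5e-2 per Sv (assumed textbook value; RESRAD's internal pathway and
# morbidity/mortality modeling is far more detailed than this multiplication).
RISK_PER_SV = 5.5e-2

def annual_risk(dose_usv_per_year):
    """Nominal stochastic risk from an annual effective dose given in microsievert."""
    return dose_usv_per_year * 1e-6 * RISK_PER_SV

print(f"{annual_risk(100):.2e}")  # 100 uSv/year -> 5.50e-06
```

This kind of back-of-the-envelope check is a useful sanity test of code output: the reported risk should scale linearly with the reported dose under a linear no-threshold assumption.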

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  19. Computational fluid dynamics as a virtual facility for R and D in the IRIS project: an overview

    International Nuclear Information System (INIS)

    Colombo, E.; Inzoli, F.; Ricotti, M.; Uddin, R.; Yan, Y.; Sobh, N.

    2004-01-01

    The pressurized light water cooled, medium power (1000 MWt) IRIS (International Reactor Innovative and Secure) has been under development for four years by an international consortium of over 21 organizations from ten countries. The plant conceptual design was completed in 2001 and the preliminary design is nearing completion. The pre-application licensing process with NRC started in October, 2002 and IRIS is one of the designs considered by US utilities as part of the ESP (Early Site Permit) process. The development of a new nuclear plant concept presents the opportunity and potential for a significant usage of computational fluid dynamics (CFD) in the design process, as it is in many conventional applications related to power generation. A CFD-Group of international scientists has been given the mission of investigating the many application opportunities for CFD related to the IRIS project and to verify the support that the IRIS design process may gain from CFD in terms of time, costs, resource saving, and visibility. The key objective identified is the use of CFD as a design tool for virtual tests in order to simplify the optimization effort for the nuclear plant's components and support the IRIS testing program. In this paper, the CFD-Group is described in terms of their resources and capabilities. A program of activities with identified goals and a possible schedule is also presented.(author)

  20. Computing facilities available to final-year students at 3 UK dental schools in 1997/8: their use, and students' attitudes to information technology.

    Science.gov (United States)

    Grigg, P; Macfarlane, T V; Shearer, A C; Jepson, N J; Stephens, C D

    2001-08-01

    To identify computer facilities available in 3 dental schools where 3 different approaches to the use of technology-based learning material have been adopted and assess dental students' perception of their own computer skills and their attitudes towards information technology. Multicentre cross sectional by questionnaire. All 181 dental students in their final year of study (1997-8). The overall participation rate was 80%. There were no differences between schools in the students' self assessment of their IT skills but only 1/3 regarded themselves as competent in basic skills and nearly 50% of students in all 3 schools felt that insufficient IT training had been provided to enable them to follow their course without difficulty. There were significant differences between schools in most of the other areas examined which reflect the different ways in which IT can be used to support the dental course. 1. Students value IT as an educational tool. 2. Their awareness of the relevance of a knowledge of information technology for their future careers remains generally low. 3. There is a need to provide effective instruction in IT skills for those dental students who do not acquire these during secondary education.

  1. Vehicle Test Facilities at Aberdeen Proving Ground

    Science.gov (United States)

    1981-07-06

    warehouse and rough terrain forklifts. Two 5-ton-capacity manual chain hoists at the rear of the table regulate its slope from 0 to 40 percent. The overall... Capacity at 24-Inch Load Center. 5. TOP/HTP 2-2-608, Braking, Wheeled Vehicles, 15 January 1971. 6. TOP 2-2-603, Vehicle Fuel Consumption, 1 November 1977.

  2. Ground-glass opacity in diffuse lung diseases: high-resolution computed tomography-pathology correlation; Opacidades em vidro fosco nas doencas pulmonares difusas: correlacao da tomografia computadorizada de alta resolucao com a anatomopatologia

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Maria Lucia de Oliveira; Vianna, Alberto Domingues; Marchiori, Edson [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Dept. de Radiologia; Souza Junior, Arthur Soares [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Disciplina de Radiologia; Moraes, Heleno Pinto de [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Dept. de Patologia]. E-mail: edmarchiori@zipmail.com.br

    2003-12-01

    Ground-glass opacity is a finding frequently seen in high-resolution computed tomography examinations of the chest and is characterized by hazy increased attenuation of the lung, without blurring of bronchial and vascular margins. Because of its nonspecificity, association with other radiological, clinical and pathological findings must be considered for an accurate diagnostic interpretation. In this paper, 62 computed tomography examinations of patients with diffuse pulmonary diseases of 14 different etiologies, in which ground-glass opacity was the only or the most remarkable finding, were reviewed, and these findings were correlated with pathology abnormalities seen on specimens obtained from biopsies or necropsies. In pneumocystosis, ground-glass opacities correlated histologically with alveolar occupation by a foaming material containing parasites; in bronchioloalveolar cell carcinoma, with thickening of the alveolar septa and occupation of the lumen by mucus and tumoral cells; in paracoccidioidomycosis, with thickening of the alveolar septa, areas of fibrosis and alveolar bronchopneumonic exudate; in sarcoidosis, with fibrosis or clustering of granulomas; and in idiopathic pulmonary fibrosis, with alveolar septal thickening due to fibrosis. Alveolar occupation by blood was found in cases of leptospirosis, idiopathic hemosiderosis, metastatic kidney tumor and invasive aspergillosis, whereas oily vacuoles were seen in lipoid pneumonia, proteinaceous and lipoproteinaceous material in silicoproteinosis and pulmonary alveolar proteinosis, and edematous fluid in cardiac failure. (author)

  3. A three-dimensional ground-water-flow model modified to reduce computer-memory requirements and better simulate confining-bed and aquifer pinchouts

    Science.gov (United States)

    Leahy, P.P.

    1982-01-01

    The Trescott computer program for modeling groundwater flow in three dimensions has been modified to (1) treat aquifer and confining bed pinchouts more realistically and (2) reduce the computer memory requirements needed for the input data. Using the original program, simulation of aquifer systems with nonrectangular external boundaries may result in a large number of nodes that are not involved in the numerical solution of the problem, but require computer storage. (USGS)
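    The memory saving described — not allocating storage for nodes outside the aquifer — can be sketched as a compact index map that assigns equation numbers only to active cells. This is a hypothetical illustration of the idea, not the Trescott code's actual data structure:

```python
def build_active_map(nx, ny, nz, is_active):
    """Assign compact equation numbers 0..n-1 to active cells only,
    so pinched-out or out-of-boundary cells consume no storage."""
    index = {}
    for k in range(nz):          # layers
        for j in range(ny):      # rows
            for i in range(nx):  # columns
                if is_active(i, j, k):
                    index[(i, j, k)] = len(index)
    return index

# Hypothetical wedge-shaped system: the lower layer (k = 1) pinches
# out for columns i >= 3.
active = build_active_map(4, 4, 2, lambda i, j, k: i + k < 4)
```

    Here 28 of the 32 rectangular-grid cells are active; the four pinched-out cells receive no equation number and hence no storage, which is the effect the modification aims at.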

  4. Support facilities

    International Nuclear Information System (INIS)

    Williamson, F.S.; Blomquist, J.A.; Fox, C.A.

    1977-01-01

    Computer support is centered on the Remote Access Data Station (RADS), which is equipped with a 1000 lpm printer, 1000 cpm reader, and a 300 cps paper tape reader with 500-foot spools. The RADS is located in a data preparation room with four 029 key punches (two of which interpret), a storage vault for archival magnetic tapes, card files, and a 30 cps interactive terminal principally used for job inquiry and routing. An adjacent room provides work space for users, with a documentation library and a consultant's office, plus file storage for programs and their documentations. The facility has approximately 2,600 square feet of working laboratory space, and includes two fully equipped photographic darkrooms, sectioning and autoradiographic facilities, six microscope cubicles, and five transmission electron microscopes and one Cambridge scanning electron microscope equipped with an x-ray energy dispersive analytical system. Ancillary specimen preparative equipment includes vacuum evaporators, freeze-drying and freeze-etching equipment, ultramicrotomes, and assorted photographic and light microscopic equipment. The extensive physical plant of the animal facilities includes provisions for holding all species of laboratory animals under controlled conditions of temperature, humidity, and lighting. More than forty rooms are available for studies of the smaller species. These have a potential capacity of more than 75,000 mice, or smaller numbers of larger species and those requiring special housing arrangements. There are also six dog kennels to accommodate approximately 750 dogs housed in runs that consist of heated indoor compartments and outdoor exercise areas

  5. Ground water in Fountain and Jimmy Camp Valleys, El Paso County, Colorado with a section on Computations of drawdowns caused by the pumping of wells in Fountain Valley

    Science.gov (United States)

    Jenkins, Edward D.; Glover, Robert E.

    1964-01-01

    The part of Fountain Valley considered in this report extends from Colorado Springs to the Pueblo County line. It is 23 miles long and has an area of 26 square miles. The part of Jimmy Camp Valley discussed is 11 miles long and has an area of 9 square miles. The topography is characterized by level flood plains and alluvial terraces that parallel the valley and by rather steep hills along the valley sides. The climate is semiarid, average annual precipitation being about 13 inches. Farming and stock raising are the principal occupations in the valleys; however, some of the agricultural land near Colorado Springs is being used for housing developments. The Pierre Shale and alluvium underlie most of the area, and mesa gravel caps the shale hills adjacent to Fountain Valley. The alluvium yields water to domestic, stock, irrigation, and public-supply wells and is capable of yielding large quantities of water for intermittent periods. Several springs issue along the sides of the valley at the contact of the mesa gravel and the underlying Pierre Shale. The water table ranges in depth from less than 10 feet along the bottom lands to about 80 feet along the sides of the valleys; the saturated thickness ranges from less than a foot to about 50 feet. The ground-water reservoir in Fountain Valley is recharged by precipitation that falls within the area, by percolation from Fountain Creek, which originates in the Pikes Peak, Monument Valley, and Rampart Range areas, and by seepage from irrigation water. This reservoir contains about 70,000 acre-feet of ground water in storage. The ground-water reservoir in Jimmy Camp Valley is recharged from precipitation that falls within the area, by percolation from Jimmy Camp Creek during periods of streamflow, and by seepage from irrigation water. The Jimmy Camp ground-water reservoir contains about 25,000 acre-feet of water in storage. 
Ground water is discharged from the area by movement to the south, by evaporation and transpiration in
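    The drawdown computations named in the report's title are conventionally based on the Theis nonequilibrium formula; a minimal sketch follows, assuming the standard Theis well function (not necessarily Glover's exact procedure) and evaluating W(u) from its convergent series:

```python
import math

def well_function(u, terms=60):
    """Theis well function W(u) from its convergent series:
    W(u) = -gamma - ln(u) + sum_{n>=1} (-1)**(n+1) * u**n / (n * n!)."""
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    total = -gamma - math.log(u)
    for n in range(1, terms + 1):
        total += (-1) ** (n + 1) * u ** n / (n * math.factorial(n))
    return total

def drawdown(Q, T, S, r, t):
    """Theis drawdown s = Q / (4*pi*T) * W(u), u = r**2 * S / (4*T*t).
    Q pumping rate, T transmissivity, S storativity, r distance from
    the well, t time; units must be consistent (e.g., m and days)."""
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * well_function(u)
```

    For illustrative values Q = 500 m3/day, T = 1000 m2/day, S = 2e-4, the predicted drawdown 100 m from the well after one day of pumping is about 0.28 m; drawdown decreases with distance, as expected.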

  6. Electrical Ground System Design of PEFP

    International Nuclear Information System (INIS)

    Mun, Kyeong Jun; Jeon, Gye Po; Park, Sung Sik; Min, Yi Sub; Nam, Jung Min; Cho, Jang Hyung; Kim, Jun Yeon

    2010-01-01

    Since the host site was selected as Gyeong-ju city in January 2006, a design revision of the Proton Accelerator Research Center is needed to reflect host site characteristics and several other conditions. In this paper, the electrical grounding and lightning protection design scheme is introduced. In the electrical grounding system design of PEFP, electrical facilities were classified into 4 groups: equipment grounding (Type A), instrument grounding (Type A), high-frequency instrument grounding (Type C) and lightning arrestor grounding (Type D). The lightning protection system is designed for all buildings of the Proton Accelerator Research Center of PEFP, including the switchyard

  7. Electrical Ground System Design of PEFP

    Energy Technology Data Exchange (ETDEWEB)

    Mun, Kyeong Jun; Jeon, Gye Po; Park, Sung Sik; Min, Yi Sub; Nam, Jung Min; Cho, Jang Hyung; Kim, Jun Yeon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    Since the host site was selected as Gyeong-ju city in January 2006, a design revision of the Proton Accelerator Research Center is needed to reflect host site characteristics and several other conditions. In this paper, the electrical grounding and lightning protection design scheme is introduced. In the electrical grounding system design of PEFP, electrical facilities were classified into 4 groups: equipment grounding (Type A), instrument grounding (Type A), high-frequency instrument grounding (Type C) and lightning arrestor grounding (Type D). The lightning protection system is designed for all buildings of the Proton Accelerator Research Center of PEFP, including the switchyard

  8. SU-E-T-531: Performance Evaluation of Multithreaded Geant4 for Proton Therapy Dose Calculations in a High Performance Computing Facility

    International Nuclear Information System (INIS)

    Shin, J; Coss, D; McMurry, J; Farr, J; Faddegon, B

    2014-01-01

    Purpose: To evaluate the efficiency of multithreaded Geant4 (Geant4-MT, version 10.0) for proton Monte Carlo dose calculations using a high performance computing facility. Methods: Geant4-MT was used to calculate 3D dose distributions in 1×1×1 mm3 voxels in a water phantom and a patient's head with a 150 MeV proton beam covering approximately 5×5 cm2 in the water phantom. Three timestamps were measured on the fly to separately analyze the required time for initialization (which cannot be parallelized), the processing time of individual threads, and the completion time. Scalability of the averaged processing time per thread was calculated as a function of thread number (1, 100, 150, and 200) for both 1 M and 50 M histories. The total memory usage was recorded. Results: Simulations with 50 M histories were fastest with 100 threads, taking approximately 1.3 hours and 6 hours for the water phantom and the CT data, respectively, with better than 1.0% statistical uncertainty. The calculations show 1/N scalability in the event loops for both cases. The gains from parallel calculation started to decrease with 150 threads. Memory usage increases linearly with the number of threads. No critical failures were observed during the simulations. Conclusion: Multithreading in Geant4-MT decreased simulation time for proton dose distribution calculations by factors of 64 and 54 at a near-optimal 100 threads for the water phantom and the patient data, respectively. Further simulations will be done to determine the efficiency at the optimal thread number. Considering the trend of computer architecture development, utilizing Geant4-MT for radiotherapy simulations is an excellent cost-effective alternative to a distributed batch queuing system. However, because the scalability depends highly on simulation details, i.e., the ratio of the processing time of one event to the waiting time to access the shared event queue, a performance evaluation as described is recommended
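    The pattern reported above — a serial initialization phase followed by an event loop with 1/N scalability — is the classic Amdahl's-law structure. A toy model, with made-up timing numbers rather than the abstract's measurements, makes the saturation behavior explicit:

```python
def wall_time(n_threads, t_init, t_events):
    """Total wall time: serial initialization plus an event loop
    whose duration scales as 1/N across threads."""
    return t_init + t_events / n_threads

def speedup(n_threads, t_init, t_events):
    """Speedup over a single thread for the same workload."""
    return wall_time(1, t_init, t_events) / wall_time(n_threads, t_init, t_events)
```

    With, say, 60 s of serial setup and 100 h of single-thread event processing, this model predicts roughly 98× speedup at 100 threads. The measured factors of 64 and 54 are lower because shared-event-queue contention, which the abstract identifies and this simple model ignores, also grows with thread count.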

  9. Development of a computational code for calculations of shielding in dental facilities; Desenvolvimento de um codigo computacional para calculos de blindagem em instalacoes odontologicas

    Energy Technology Data Exchange (ETDEWEB)

    Lava, Deise D.; Borges, Diogo da S.; Affonso, Renato R.W.; Guimaraes, Antonio C.F.; Moreira, Maria de L., E-mail: deise_dy@hotmail.com, E-mail: diogosb@outlook.com, E-mail: raoniwa@yahoo.com.br, E-mail: tony@ien.gov.br, E-mail: malu@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    This paper addresses shielding calculations to minimize the exposure of patients and/or personnel to ionizing radiation. The work draws on the radiation protection report NCRP-145 (Radiation Protection in Dentistry), which establishes calculations and standards to be adopted to ensure the safety of those who may be exposed to ionizing radiation in dental facilities, according to the dose limits established by the CNEN-NN-3.1 standard published in September 2011. The methodology comprises the use of a computer language for processing the data provided by that report, and a commercial application used for creating residential and decoration projects. The FORTRAN language was adopted as the method for application to a real case. The result is a program capable of returning data related to the thickness of material, such as steel, lead, wood, glass, plaster, acrylic, and leaded glass, which can be used for effective shielding against single or continuous pulse beams. Several variables are used to calculate the thickness of the shield, such as: number of films used per week, film load, use factor, occupancy factor, distance between the wall and the source, transmission factor, workload, area definition, beam intensity, and intraoral and panoramic exams. Before applying the methodology, the results were validated against examples provided by NCRP-145. The calculations, redone from those examples, provide answers consistent with the report.
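    NCRP-145 itself works from tabulated transmission data; as a hedged sketch of the general workflow, the required barrier transmission factor and a corresponding thickness under a simple tenth-value-layer attenuation model (an assumption for illustration, not the report's actual use of the NCRP curves) can be computed as:

```python
import math

def transmission_factor(P, d, W, U, T):
    """Required barrier transmission B = P * d**2 / (W * U * T):
    P design dose limit, d source-to-barrier distance, W workload,
    U use factor, T occupancy factor (NCRP-style parameters)."""
    return P * d * d / (W * U * T)

def barrier_thickness(B, tvl):
    """Thickness giving transmission B for a material whose
    tenth-value layer is `tvl`, assuming simple exponential
    (broad-beam) attenuation: x = tvl * log10(1/B)."""
    return tvl * math.log10(1.0 / B) if B < 1.0 else 0.0
```

    For example, a required transmission of 0.01 calls for exactly two tenth-value layers of the chosen material, whatever its TVL happens to be; if B >= 1 no shielding is needed.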

  10. Facility effluent monitoring plan determinations for the 200 Area facilities

    International Nuclear Information System (INIS)

    Nickels, J.M.

    1991-11-01

    The following facility effluent monitoring plan determinations document the evaluations conducted for the Westinghouse Hanford Company 200 Area facilities (chemical processing, waste management, 222-S Laboratory, and laundry) on the Hanford Site in south central Washington State. These evaluations determined the need for facility effluent monitoring plans for the 200 Area facilities. The facility effluent monitoring plan determinations have been prepared in accordance with A Guide for Preparing Hanford Site Facility Effluent Monitoring Plans, WHC-EP-0438 (WHC 1991). The Plutonium/Uranium Extraction Plant and UO 3 facility effluent monitoring plan determinations were prepared by Los Alamos Technical Associates, Richland, Washington. The Plutonium Finishing Plant, Transuranic Waste Storage and Assay Facility, T Plant, Tank Farms, Low Level Burial Grounds, and 222-S Laboratory determinations were prepared by Science Applications International Corporation of Richland, Washington. The B Plant Facility Effluent Monitoring Plan Determination was prepared by ERCE Environmental Services of Richland, Washington

  11. Nuclear physics accelerator facilities

    International Nuclear Information System (INIS)

    1985-01-01

    The Department of Energy's Nuclear Physics program is a comprehensive program of interdependent experimental and theoretical investigation of atomic nuclei. Long range goals are an understanding of the interactions, properties, and structures of atomic nuclei and nuclear matter at the most elementary level possible and an understanding of the fundamental forces of nature by using nuclei as a proving ground. Basic ingredients of the program are talented and imaginative scientists and a diversity of facilities to provide the variety of probes, instruments, and computational equipment needed for modern nuclear research. Approximately 80% of the total Federal support of basic nuclear research is provided through the Nuclear Physics program; almost all of the remaining 20% is provided by the National Science Foundation. Thus, the Department of Energy (DOE) has a unique responsibility for this important area of basic science and its role in high technology. Experimental and theoretical investigations are leading us to conclude that a new level of understanding of atomic nuclei is achievable. This optimism arises from evidence that: (1) the mesons, protons, and neutrons which are inside nuclei are themselves composed of quarks and gluons and (2) quantum chromodynamics can be developed into a theory which both describes correctly the interaction among quarks and gluons and is also an exact theory of the strong nuclear force. These concepts are important drivers of the Nuclear Physics program

  12. Ground and excited state behavior of 1,4-dimethoxy-3-methyl-anthracene-9,10-dione in silver nanoparticles: Spectral and computational investigations

    Energy Technology Data Exchange (ETDEWEB)

    Umadevi, M., E-mail: ums10@yahoo.com [Department of Physics, Mother Teresa Women' s University, Kodaikanal 624101, Tamil Nadu (India); Kavitha, S.R. [Department of Physics, Mother Teresa Women' s University, Kodaikanal 624101, Tamil Nadu (India); Vanelle, P.; Terme, T.; Khoumeri, O. [Laboratoire de Pharmaco-Chimie Radicalaire, Faculté de Pharmacie, Aix-Marseille Univ, CNRS, Institut de Chimie Radicalaire ICR, UMR 7273, 27 Boulevard Jean Moulin, 13385 Marseille Cedex 05 (France)

    2013-10-15

    Silver nanoparticles (Ag NPs) of various sizes have been successfully synthesized by the simple and convenient Creighton method using sodium borohydride as the reducing agent under microwave irradiation. Optical absorption and fluorescence emission spectroscopic techniques were employed to investigate the effect of silver nanoparticles on the ground and excited state of 1,4-dimethoxy-3-methylanthracene-9,10-dione (DMMAD). The surface plasmon resonance (SPR) peak of the prepared silver colloidal solution was observed at 400 nm. Fluorescence quenching of DMMAD by silver nanoparticles has been found to increase with increase in the size of Ag. The fluorescence quenching has been explained by Forster Resonance Energy Transfer (FRET) theory between DMMAD and silver nanoparticles. The Stern–Volmer quenching constant and Benesi–Hildebrand association constant for the above system were calculated. DFT calculations were also performed to study the charge distribution of DMMAD in Ag both in ground and excited states. -- Highlights: • Silver nanoparticles (Ag NPs) have been synthesized using the Creighton method. • Effect of Ag NPs on the ground state of DMMAD was studied. • Influence of Ag NPs on the excited state of DMMAD was investigated. • Fluorescence quenching has been explained by Forster Resonance Energy Transfer. • Quenching and binding constants were also calculated.
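    The Stern–Volmer quenching constant mentioned above comes from the linear relation F0/F = 1 + Ksv[Q]. A minimal least-squares estimate of Ksv, forced through the theoretical intercept of 1 and run on synthetic data (not the paper's measurements), looks like this:

```python
def stern_volmer_constant(concentrations, intensities, f0):
    """Least-squares slope of (F0/F - 1) versus quencher concentration,
    forced through the origin, since F0/F = 1 + Ksv * [Q]."""
    num = sum(c * (f0 / f - 1.0) for c, f in zip(concentrations, intensities))
    den = sum(c * c for c in concentrations)
    return num / den

# Synthetic quenching data generated with Ksv = 2000 L/mol, F0 = 100.
conc = [1e-4, 2e-4, 3e-4]
obs = [100.0 / (1.0 + 2000.0 * c) for c in conc]
```

    On this synthetic set, `stern_volmer_constant(conc, obs, 100.0)` recovers the generating constant of 2000; with real fluorescence data the same fit yields the experimental Ksv.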

  13. Ground water

    International Nuclear Information System (INIS)

    Osmond, J.K.; Cowart, J.B.

    1982-01-01

    The subject is discussed under the headings: background and theory (introduction; fractionation in the hydrosphere; mobility factors; radioisotope evolution and aquifer classification; aquifer disequilibria and geochemical fronts); case studies (introduction; (a) conservative, and (b) non-conservative, behaviour); ground water dating applications (general requirements; radon and helium; radium isotopes; uranium isotopes). (U.K.)

  14. Ground water

    International Nuclear Information System (INIS)

    Osmond, J.K.; Cowart, J.B.

    1992-01-01

    The great variations in concentrations and activity ratios of 234 U/ 238 U in ground waters and the features causing elemental and isotopic mobility in the hydrosphere are discussed. Fractionation processes and their application to hydrology and other environmental problems such as earthquake, groundwater and aquifer dating are described. (UK)

  15. Safety of High Speed Ground Transportation Systems : Analytical Methodology for Safety Validation of Computer Controlled Subsystems : Volume 2. Development of a Safety Validation Methodology

    Science.gov (United States)

    1995-01-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety-critical functions in high-speed rail or magnetic levitation ...

  16. Computing, Environment and Life Sciences | Argonne National Laboratory

    Science.gov (United States)

    Computing, Environment and Life Sciences research divisions: Biosciences (BIO), Computational Science (CPS), and Data (DSL); related units include the Biosciences Division, the Environmental Science Division, the Mathematics and Computer Science Division, and the Argonne Leadership Computing Facility.

  17. Ground testing and simulation. II - Aerodynamic testing and simulation: Saving lives, time, and money

    Science.gov (United States)

    Dayman, B., Jr.; Fiore, A. W.

    1974-01-01

    The present work discusses in general terms the various kinds of ground facilities, in particular wind tunnels, which support aerodynamic testing. Since not all flight parameters can be simulated simultaneously, an important problem is matching parameters. It is pointed out that there is a lack of wind tunnels for complete Reynolds-number simulation. Using a computer to simulate flow fields can considerably reduce the wind-tunnel hours required to develop a given flight vehicle.

  18. Large Spatial Scale Ground Displacement Mapping through the P-SBAS Processing of Sentinel-1 Data on a Cloud Computing Environment

    Science.gov (United States)

    Casu, F.; Bonano, M.; de Luca, C.; Lanari, R.; Manunta, M.; Manzo, M.; Zinno, I.

    2017-12-01

    Since its launch in 2014, the Sentinel-1 (S1) constellation has played a key role in SAR data availability and dissemination all over the world. Indeed, the free and open access data policy adopted by the European Copernicus program, together with the global coverage acquisition strategy, makes the Sentinel constellation a game changer in the Earth Observation scenario. With SAR data becoming ubiquitous, the technological and scientific challenge is to maximize the exploitation of such a huge data flow. In this direction, the use of innovative processing algorithms and distributed computing infrastructures, such as Cloud Computing platforms, can play a crucial role. In this work we present a Cloud Computing solution for the advanced interferometric (DInSAR) processing chain based on the Parallel SBAS (P-SBAS) approach, aimed at processing S1 Interferometric Wide Swath (IWS) data for the generation of large spatial scale deformation time series in an efficient, automatic and systematic way. Such a DInSAR chain ingests Sentinel-1 SLC images and carries out several processing steps, to finally compute deformation time series and mean deformation velocity maps. Different parallel strategies have been designed ad hoc for each processing step of the P-SBAS S1 chain, encompassing both multi-core and multi-node programming techniques, in order to maximize the computational efficiency achieved within a Cloud Computing environment and cut down the relevant processing times. The presented P-SBAS S1 processing chain has been implemented on the Amazon Web Services platform, and a thorough analysis of the attained parallel performance has been carried out to identify and overcome the major bottlenecks to scalability. The presented approach is used to perform national-scale DInSAR analyses over Italy, involving the processing of more than 3000 S1 IWS images acquired from both ascending and descending orbits. Such an experiment confirms the big advantage of
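    The multi-core side of the strategy described — mapping an embarrassingly parallel per-tile step across a pool of workers — can be sketched as follows. The `tile_step` body is a hypothetical stand-in; the real P-SBAS steps (coregistration, unwrapping, inversion) are far more involved:

```python
from concurrent.futures import ThreadPoolExecutor

def tile_step(tile):
    """Stand-in for one per-tile DInSAR operation (here just a mean)."""
    return sum(tile) / len(tile)

def process_tiles(tiles, workers=4):
    """Run the per-tile step across a worker pool; results keep the
    input order, as Executor.map guarantees."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(tile_step, tiles))
```

    In a multi-node setting the same split is applied one level up, with each node taking a subset of tiles or acquisitions before the per-node pool takes over.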

  19. Facilities & Leadership

    Data.gov (United States)

    Department of Veterans Affairs — The facilities web service provides VA facility information. The VA facilities locator is a feature that is available across the enterprise, on any webpage, for the...

  20. Ground Pollution Science

    International Nuclear Information System (INIS)

    Oh, Jong Min; Bae, Jae Geun

    1997-08-01

    This book deals with ground pollution science and soil science: classification of soil and fundamentals; ground pollution and humans; ground pollution and organic matter; ground pollution and the city environment; environmental problems of the earth and ground pollution; soil pollution and development of geological features of the ground; ground pollution and landfill of waste; and cases of measurement of ground pollution.

  1. Evaluation of ground level concentration of pollutant due to gas flaring by computer simulation: A case study of Niger - Delta area of Nigeria

    Directory of Open Access Journals (Sweden)

    A. S. ABDULKAREEM

    2005-01-01

    Full Text Available The disposal of associated gases through flaring has been a major problem for the Nigerian oil and gas industries, and most of these gases are flared due to the lack of commercial outlets. The resultant effects of gas flaring are damage to the environment through acid rain formation, the greenhouse effect, global warming and ozone depletion. This write-up is aimed at evaluating the ground level concentrations of CO2, SO2, NO2 and total hydrocarbon (THC), which are products of the gas flared in oil producing areas. Volumes of gas flared at different flow stations were collected, as well as geometrical parameters. The results of simulating a model developed on the principles of Gaussian gaseous dispersion showed good agreement with the dispersion pattern. The results showed that the dispersion pattern of pollutants at ground level depends on the volume of gas flared, the wind speed, the velocity of discharge and nearness to the source of flaring. They also show that continuous gas flaring, irrespective of the quantity deposited in the immediate environment, will in the long run lead to changes in the physicochemical properties of the soil.
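    The Gaussian dispersion principle such models are built on gives the ground-level (z = 0) concentration, with full ground reflection, as C = Q/(π u σy σz) · exp(−y²/2σy²) · exp(−H²/2σz²). A direct transcription is sketched below; σy and σz are supplied by the caller, since in practice they depend on downwind distance and atmospheric stability class:

```python
import math

def ground_level_concentration(Q, u, y, H, sigma_y, sigma_z):
    """Gaussian plume concentration at ground level with full ground
    reflection: Q emission rate, u wind speed, y crosswind offset,
    H effective stack height, sigma_y/sigma_z dispersion parameters."""
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y * y / (2.0 * sigma_y ** 2))
            * math.exp(-H * H / (2.0 * sigma_z ** 2)))
```

    The formula reproduces the qualitative findings above: concentration scales linearly with the volume flared (Q), falls with wind speed, and falls off both crosswind and, through σy and σz, with distance from the flare.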

  2. Biochemistry Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Biochemistry Facility provides expert services and consultation in biochemical enzyme assays and protein purification. The facility currently features 1) Liquid...

  3. Ground Control System Description Document

    International Nuclear Information System (INIS)

    Eric Loros

    2001-01-01

    The Ground Control System contributes to the safe construction and operation of the subsurface facility, including accesses and waste emplacement drifts, by maintaining the configuration and stability of the openings during construction, development, emplacement, and caretaker modes for the duration of preclosure repository life. The Ground Control System consists of ground support structures installed within the subsurface excavated openings, any reinforcement made to the rock surrounding the opening, and inverts if designed as an integral part of the system. The Ground Control System maintains stability for the range of geologic conditions expected at the repository and for all expected loading conditions, including in situ rock, construction, operation, thermal, and seismic loads. The system maintains the size and geometry of operating envelopes for all openings, including alcoves, accesses, and emplacement drifts. The system provides for the installation and operation of sensors and equipment for any required inspection and monitoring. In addition, the Ground Control System provides protection against rockfall for all subsurface personnel, equipment, and the engineered barrier system, including the waste package during the preclosure period. The Ground Control System uses materials that are sufficiently maintainable and that retain the necessary engineering properties for the anticipated conditions of the preclosure service life. These materials are also compatible with postclosure waste isolation performance requirements of the repository. 
The Ground Control System interfaces with the Subsurface Facility System for operating envelopes, drift orientation, and excavated opening dimensions, Emplacement Drift System for material compatibility, Monitored Geologic Repository Operations Monitoring and Control System for ground control instrument readings, Waste Emplacement/Retrieval System to support waste emplacement operations, and the Subsurface Excavation System

  4. Denuded Data! Grounded Theory Using the NUDIST Computer Analysis Program: In Researching the Challenge to Teacher Self-Efficacy Posed by Students with Learning Disabilities in Australian Education.

    Science.gov (United States)

    Burroughs-Lange, Sue G.; Lange, John

    This paper evaluates the effects of using the NUDIST (Non-numerical, Unstructured Data Indexing, Searching and Theorising) computer program to organize coded, qualitative data. The use of the software is discussed within the context of the study for which it was used: an Australian study that aimed to develop a theoretical understanding of the…

  5. Computing many-body wave functions with guaranteed precision: the first-order Møller-Plesset wave function for the ground state of helium atom.

    Science.gov (United States)

    Bischoff, Florian A; Harrison, Robert J; Valeev, Edward F

    2012-09-14

    We present an approach to compute accurate correlation energies for atoms and molecules using an adaptive discontinuous spectral-element multiresolution representation for the two-electron wave function. Because of the exponential storage complexity of the spectral-element representation with the number of dimensions, a brute-force computation of two-electron (six-dimensional) wave functions with high precision was not practical. To overcome the key storage bottlenecks we utilized (1) a low-rank tensor approximation (specifically, the singular value decomposition) to compress the wave function, and (2) explicitly correlated R12-type terms in the wave function to regularize the Coulomb electron-electron singularities of the Hamiltonian. All operations necessary to solve the Schrödinger equation were expressed so that the reconstruction of the full-rank form of the wave function is never necessary. Numerical performance of the method was highlighted by computing the first-order Møller-Plesset wave function of a helium atom. The computed second-order Møller-Plesset energy is precise to ~2 microhartrees, which is at the precision limit of the existing general atomic-orbital-based approaches. Our approach does not assume special geometric symmetries, hence application to molecules is straightforward.
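    The storage argument is easy to quantify: a full grid representation grows exponentially with dimension, while a rank-r separable (SVD-like) form stores only factor vectors plus singular values. A back-of-envelope count — illustrative only, since the paper's adaptive multiresolution representation is far more sophisticated — is:

```python
def storage_counts(n, d, rank):
    """Entries needed for a full d-dimensional grid with n points per
    dimension versus a rank-`rank` separable representation keeping
    `rank` factor vectors per dimension plus the singular values."""
    full = n ** d
    compressed = rank * d * n + rank
    return full, compressed

# Six-dimensional two-electron case, 100 points per dimension, rank 10.
full, compressed = storage_counts(100, 6, 10)
```

    Here the full six-dimensional grid needs 10^12 entries while the rank-10 form needs about 6×10^3 — the kind of reduction that makes a brute-force-infeasible computation tractable.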

  6. Water Activities in Laxemar Simpevarp. The final disposal facility for spent nuclear fuel - removal of groundwater and water activities above ground; Vattenverksamhet i Laxemar-Simpevarp. Slutfoervarsanlaeggning foer anvaent kaernbraensle - bortledande av grundvatten samt vattenverksamheter ovan mark

    Energy Technology Data Exchange (ETDEWEB)

    Werner, Kent (EmpTec (Sweden)); Hamren, Ulrika; Collinder, Per (Ekologigruppen AB (Sweden))

    2010-12-15

    operations would include a bridge across Laxemaraan and measures in the vicinity of the surface facility (the industrial area) for the repository, in Laxemaraan and in a ditch (Oxhagsbaecken). During construction of the bridge, measures would be taken to reduce the consequences of turbid water, for instance for spawning fish. No intermediate support in the stream would be required, and the bridge would be constructed so as not to influence the flow conditions of the stream and not to form a migration obstacle for people and animals. Other water operations above ground would be executed for the handling of drainage water from the underground part of the repository and leachate from a rock dump. These waters would be diverted to Laxemaraan via a constructed 'lake' adjacent to the stream. The leachate would also be treated in a broad irrigation area with a recirculation and detention pond (Laxemarkaerren).

  7. Adaption of the radiation dose for computed tomography of the body - back-ground for the dose adaption programme OmnimAs

    International Nuclear Information System (INIS)

    Nyman, Ulf; Kristiansson, Mattias; Leitz, Wolfram; Paahlstorp, Per-Aake

    2004-11-01

    When performing computed tomography examinations, the exposure factors are hardly ever adapted to the patient's size. One reason for this may be the lack of simple methods. This report describes the computer programme OmnimAs, which calculates how the exposure factors should vary with the patient's perimeter (easily measured with a measuring tape). The first approximation is to calculate exposure values that give the same noise level in the image irrespective of the patient's size. A clinical evaluation has shown that this relationship has to be modified. One chapter describes the physical background behind the programme. Results calculated with OmnimAs are in good agreement with a number of published studies, and clinical experience shows the usability of the programme. Finally, the correlation between several parameters and image quality/dose is discussed, along with how this correlation can be exploited to optimise CT examinations
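As a hedged illustration of the kind of first approximation described (constant image noise across patient sizes), one can assume exponential X-ray attenuation, so that the tube current-time product (mAs) must grow exponentially with patient perimeter. The function name, reference values and "halving length" below are illustrative assumptions, not OmnimAs's actual calibration:

```python
def mas_for_constant_noise(perimeter_cm, ref_perimeter_cm=80.0,
                           ref_mas=100.0, halving_cm=8.0):
    """First-approximation scaling: each additional `halving_cm` of patient
    perimeter halves the transmitted X-ray intensity, so the mAs must double
    to keep image noise constant. All reference values are illustrative."""
    return ref_mas * 2.0 ** ((perimeter_cm - ref_perimeter_cm) / halving_cm)

print(mas_for_constant_noise(80.0))  # reference patient -> 100.0 mAs
print(mas_for_constant_noise(96.0))  # two halving-lengths larger -> 400.0 mAs
```

The report's clinical evaluation implies the real relationship is flatter than this pure constant-noise rule, since some extra noise is acceptable in larger patients.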

  8. Waste migration studies at the Savannah River Plant burial ground

    International Nuclear Information System (INIS)

    Stone, J.A.; Oblath, S.B.; Hawkins, R.H.; Grant, M.W.; Hoeffner, S.L.; King, C.M.

    1985-01-01

    The low-level radioactive waste burial ground at the Savannah River Plant is a typical shallow-land-burial disposal site in a humid region. Studies of waste migration at this site provide generic data for designing other disposal facilities. A program of field, laboratory, and modeling studies for the SRP burial ground has been conducted for several years. Recent results of lysimeter tests, soil-water chemistry studies, and transport modeling are reported. The lysimeter experiments include ongoing tests with 40 lysimeters containing a variety of defense wastes, and recently concluded lysimeter tests with tritium and plutonium waste forms. The tritium lysimeter operated 12 years. In chemistry studies, measurements of soil-water distribution coefficients (K_d) were concluded. Current emphasis is on identification of trace organic compounds in groundwater from the burial site. Development of the dose-to-man model was completed, and the computer code is available for routine use. 16 refs., 2 figs., 2 tabs
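The distribution coefficient K_d measured in such studies typically enters transport modeling through the retardation factor R = 1 + (ρ_b/θ)·K_d, which expresses how much slower a sorbing species moves than the groundwater. A minimal sketch with illustrative values (not SRP measurements):

```python
def retardation_factor(kd_ml_per_g, bulk_density_g_per_cm3, porosity):
    """R = 1 + (rho_b / theta) * K_d: ratio of groundwater velocity to the
    transport velocity of a sorbing radionuclide."""
    return 1.0 + (bulk_density_g_per_cm3 / porosity) * kd_ml_per_g

# Illustrative sandy-soil values (hypothetical, not SRP data):
# K_d = 5 mL/g, bulk density 1.6 g/cm^3, porosity 0.4.
R = retardation_factor(5.0, 1.6, 0.4)
print(R)  # -> 21.0: the nuclide moves 21 times slower than the water
```

A strongly sorbed species (large K_d, e.g. plutonium) yields a very large R, while tritium (K_d ≈ 0) travels essentially with the water, consistent with the contrast between the two lysimeter waste forms above.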

  9. Dance Facilities.

    Science.gov (United States)

    Ashton, Dudley, Ed.; Irey, Charlotte, Ed.

    This booklet represents an effort to assist teachers and administrators in the professional planning of dance facilities and equipment. Three chapters present the history of dance facilities, provide recommended dance facilities and equipment, and offer some adaptations of dance facilities and equipment, for elementary, secondary and college level…

  10. Implementation of computer codes for performance assessment of the Republic repository of LLW/ILW Mochovce

    International Nuclear Information System (INIS)

    Hanusik, V.; Kopcani, I.; Gedeon, M.

    2000-01-01

    This paper describes selection and adaptation of computer codes required to assess the effects of radionuclide release from Mochovce Radioactive Waste Disposal Facility. The paper also demonstrates how these codes can be integrated into performance assessment methodology. The considered codes include DUST-MS for source term release, MODFLOW for ground-water flow and BS for transport through biosphere and dose assessment. (author)

  11. Staff experiences within the implementation of computer-based nursing records in residential aged care facilities: a systematic review and synthesis of qualitative research.

    Science.gov (United States)

    Meißner, Anne; Schnepp, Wilfried

    2014-06-20

    Since the introduction of electronic nursing documentation systems, their implementation has increased rapidly in Germany in recent years. The objectives of such systems are to save time, to improve information handling and to improve quality. When integrating IT into daily working processes, the employee is the pivotal element. It is therefore important to understand nurses' experiences with IT implementation. At present the literature shows a lack of studies exploring staff experiences of the implementation process. A systematic review and meta-ethnographic synthesis of primary studies using qualitative methods was conducted in PubMed, CINAHL, and Cochrane, adhering to the principles of the PRISMA statement. The studies were original, peer-reviewed articles from 2000 to 2013, focusing on computer-based nursing documentation in Residential Aged Care Facilities. The use of IT requires a different form of information processing. Some staff experience this new form of information processing as a benefit, while others do not; the latter find it more difficult to enter data, which results in poor clinical documentation. Improvement in the quality of residents' records leads to an overall improvement in the quality of care; however, if the quality of those records is poor, some residents do not receive the necessary care. Furthermore, the length of time necessary to complete the documentation is a prominent theme within that process. Those who are more efficient with the electronic documentation demonstrate improved time management, whereas for those who are less efficient the information processing is perceived as time consuming. In general, staff can experience benefits when using IT, but this depends on promoting or hindering factors such as ease of use and the ability to use it, equipment availability and technical functionality, as well as attitude. 
In summary, the findings showed that members of staff experience IT as a benefit when

  12. Ground Motion Characteristics of Induced Earthquakes in Central North America

    Science.gov (United States)

    Atkinson, G. M.; Assatourians, K.; Novakovic, M.

    2017-12-01

    The ground motion characteristics of induced earthquakes in central North America are investigated based on empirical analysis of a compiled database of 4,000,000 digital ground-motion records from events in induced-seismicity regions (especially Oklahoma). Ground-motion amplitudes are characterized non-parametrically by computing median amplitudes and their variability in magnitude-distance bins. We also use inversion techniques to solve for regional source, attenuation and site response effects. Ground motion models are used to interpret the observations and compare the source and attenuation attributes of induced earthquakes to those of their natural counterparts. Significant conclusions are that the stress parameter that controls the strength of high-frequency radiation is similar for induced earthquakes (depth h ~5 km) and shallow (h < 5 km) natural earthquakes. By contrast, deeper natural earthquakes (h > 10 km) have stronger high-frequency ground motions. At distances close to the epicenter, a greater focal depth (which increases distance from the hypocenter) counterbalances the effects of a larger stress parameter, resulting in motions of similar strength close to the epicenter, regardless of event depth. The felt effects of induced versus natural earthquakes are also investigated using USGS "Did You Feel It?" reports; 400,000 reports from natural events and 100,000 reports from induced events are considered. The felt reports confirm the trends that we expect based on ground-motion modeling, considering the offsetting effects of the stress parameter versus focal depth in controlling the strength of motions near the epicenter. Specifically, felt intensity for a given magnitude is similar near the epicenter, on average, for all event types and depths. At distances more than 10 km from the epicenter, deeper events are felt more strongly than shallow events. 
These ground-motion attributes imply that the induced-seismicity hazard is most critical for facilities in
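The non-parametric characterization described above (median amplitudes and their variability in magnitude-distance bins) can be sketched on synthetic records; the ground-motion coefficients below are arbitrary stand-ins, not the study's values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic records: magnitude, hypocentral distance (km), log10 amplitude.
# The model coefficients are illustrative, not fitted values.
n = 10_000
mag = rng.uniform(3.0, 5.0, n)
dist = rng.uniform(1.0, 200.0, n)
log_amp = -1.0 + 0.8 * mag - 1.3 * np.log10(dist) + rng.normal(0.0, 0.3, n)

mag_edges = np.arange(3.0, 5.5, 0.5)                          # 4 magnitude bins
dist_edges = np.array([1.0, 10.0, 30.0, 60.0, 100.0, 200.0])  # 5 distance bins

# Median amplitude and its variability (sigma) in each magnitude-distance bin.
mi = np.digitize(mag, mag_edges) - 1
di = np.digitize(dist, dist_edges) - 1
medians = np.full((len(mag_edges) - 1, len(dist_edges) - 1), np.nan)
sigmas = np.full_like(medians, np.nan)
for i in range(medians.shape[0]):
    for j in range(medians.shape[1]):
        sel = (mi == i) & (di == j)
        if sel.any():
            medians[i, j] = np.median(log_amp[sel])
            sigmas[i, j] = log_amp[sel].std()

print(medians.round(2))  # amplitudes grow with magnitude, decay with distance
```

Binned medians of this kind make no functional-form assumption, which is why they are useful as a check on parametric ground-motion models.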

  13. Implementation is crucial but must be neurobiologically grounded. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    Science.gov (United States)

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias; Small, Steven L.

    2014-09-01

    From the perspective of language, Fitch's [1] claim that theories of cognitive computation should not be separated from those of implementation surely deserves applauding. Recent developments in the Cognitive Neuroscience of Language, leading to the new field of the Neurobiology of Language [2-4], emphasise precisely this point: rather than attempting to simply map cognitive theories of language onto the brain, we should aspire to understand how the brain implements language. This perspective resonates with many of the points raised by Fitch in his review, such as the discussion of unhelpful dichotomies (e.g., Nature versus Nurture). Cognitive dichotomies and debates have repeatedly turned out to be of limited usefulness when it comes to understanding language in the brain. The famous modularity-versus-interactivity and dual route-versus-connectionist debates are cases in point: in spite of hundreds of experiments using neuroimaging (or other techniques), or the construction of myriad computer models, little progress has been made in their resolution. This suggests that dichotomies proposed at a purely cognitive (or computational) level without consideration of biological grounding appear to be "asking the wrong questions" about the neurobiology of language. In accordance with these developments, several recent proposals explicitly consider neurobiological constraints while seeking to explain language processing at a cognitive level (e.g. [5-7]).

  14. 'Grounded' Politics

    DEFF Research Database (Denmark)

    Schmidt, Garbi

    2012-01-01

    play within one particular neighbourhood: Nørrebro in the Danish capital, Copenhagen. The article introduces the concept of grounded politics to analyse how groups of Muslim immigrants in Nørrebro use the space, relationships and history of the neighbourhood for identity-political statements. The article further describes how national political debates over the Muslim presence in Denmark affect identity-political manifestations within Nørrebro. By using Duncan Bell's concept of mythscape (Bell, 2003), the article shows how some political actors idealize Nørrebro's past to contest the present ethnic and religious diversity of the neighbourhood and, further, to frame what they see as the deterioration of genuine Danish identity.

  15. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground-Based Computation and Control Systems, Exploration, and Human Health and Safety

    Science.gov (United States)

    Koontz, Steve

    2015-01-01

    In this presentation a review of galactic cosmic ray (GCR) effects on microelectronic systems and human health and safety is given. The methods used to evaluate and mitigate unwanted cosmic ray effects in ground-based, atmospheric flight, and space flight environments are also reviewed. However, not all GCR effects are undesirable. We will also briefly review how observation and analysis of GCR interactions with planetary atmospheres and surfaces can reveal important compositional and geophysical data on Earth and elsewhere. About 1000 GCR particles enter every square meter of Earth's upper atmosphere every second, roughly the same number striking every square meter of the International Space Station (ISS) and every other low-Earth orbit spacecraft. GCR particles are high energy ionized atomic nuclei (90% protons, 9% alpha particles, 1% heavier nuclei) traveling very close to the speed of light. The GCR particle flux is even higher in interplanetary space because the geomagnetic field provides some limited magnetic shielding. Collisions of GCR particles with atomic nuclei in planetary atmospheres and/or regolith as well as spacecraft materials produce nuclear reactions and energetic/highly penetrating secondary particle showers. Three twentieth century technology developments have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools needed to design, test, and verify the safety and reliability of modern complex technological systems and to assess effects on human health and safety. The key technology developments are: 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems. Space and geophysical exploration needs drove the development of the instruments and analytical tools needed to recover compositional and structural data from GCR induced nuclear reactions and secondary particle showers. Finally, the

  16. Significant RF-EMF and thermal levels observed in a computational model of a person with a tibial plate for grounded 40 MHz exposure.

    Science.gov (United States)

    McIntosh, Robert L; Iskra, Steve; Anderson, Vitas

    2014-05-01

    Using numerical modeling, a worst-case scenario is considered when a person with a metallic implant is exposed to a radiofrequency (RF) electromagnetic field (EMF). An adult male standing on a conductive ground plane was exposed to a 40 MHz vertically polarized plane wave field, close to whole-body resonance where maximal induced current flows are expected in the legs. A metal plate (50-300 mm long) was attached to the tibia in the left leg. The findings from this study re-emphasize the need to ensure compliance with limb current reference levels for exposures near whole-body resonance, and not just rely on compliance with ambient electric (E) and magnetic (H) field reference levels. Moreover, we emphasize this recommendation for someone with a tibial plate, as failure to comply may result in significant tissue damage (increases in the localized temperature of 5-10 °C were suggested by the modeling for an incident E-field of 61.4 V/m root mean square (rms)). It was determined that the occupational reference level for limb current (100 mA rms), as stipulated in the 1998 guidelines of the International Commission on Non-Ionizing Radiation Protection (ICNIRP), is satisfied if the plane wave incident E-field levels are no more than 29.8 V/m rms without an implant and 23.4 V/m rms for the model with a 300 mm implant. © 2014 Wiley Periodicals, Inc.
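Assuming, as the stated thresholds imply, that the induced limb current scales linearly with the incident E-field, the compliant field can be back-computed from a simulated current; the 206 mA figure below is inferred from the reported 61.4 V/m and 29.8 V/m values, not quoted from the paper:

```python
def compliant_e_field(limit_ma, simulated_ma, simulated_e_v_per_m):
    """Scale the simulated incident E-field to the level at which the
    induced limb current just meets the reference level, assuming the
    current is proportional to the incident field strength."""
    return simulated_e_v_per_m * limit_ma / simulated_ma

# A 61.4 V/m exposure inducing ~206 mA (inferred, no-implant case) scales
# to ~29.8 V/m for the 100 mA rms ICNIRP occupational limb-current limit.
print(round(compliant_e_field(100.0, 206.0, 61.4), 1))  # -> 29.8
```

The same scaling applied to the 300 mm implant case (compliant at 23.4 V/m) implies a proportionally larger induced current for a given field, which is why the implant tightens the limit.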

  17. Outline of NUCEF facility

    International Nuclear Information System (INIS)

    Takeshita, Isao

    1996-01-01

    NUCEF is a multipurpose research facility for the safety and advanced technology of the nuclear fuel cycle back-end. Various experiment facilities and their supporting installations, in which nuclear fuel materials, radioisotopes and TRU elements can be handled, are arranged in more than one hundred rooms of two experiment buildings. Construction was completed in mid-1994 and hot experiments have been under way since then. NUCEF is located on a site (30,000 m²) in the southeastern part of the Tokai Research Establishment of JAERI, facing the Pacific Ocean. Taking the aseismic design into consideration, the bases of Experiment Buildings A and B were founded directly on the rock lying 10-15 m below ground level. The two buildings are almost the same size, each comprising one basement and three floors with a total area of 17,500 m². In the basement are the exhaust facilities of the ventilation system, the treatment systems for solution fuel and radioactive waste solution, and their storage tanks. Major experiment facilities are located on the first and second floors of each building. An air-inlet facility of the ventilation system for each building is installed on the third floor. Most of the experiment facilities for criticality safety research, including the two critical facilities Static Experiment Critical Facility (STACY) and Transient Experiment Critical Facility (TRACY), are installed in Experiment Building A. Experiment equipment for research on advanced fuel reprocessing processes and on TRU waste management, named BECKY (Back-End Fuel Cycle Key Elements Research Facility), is installed in laboratories and alpha-gamma cells in Experiment Building B. (J.P.N.)

  18. Wind Energy Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Laurie, Carol

    2017-02-01

    This book takes readers inside the places where daily discoveries shape the next generation of wind power systems. Energy Department laboratory facilities span the United States and offer wind research capabilities to meet industry needs. The facilities described in this book make it possible for industry players to increase reliability, improve efficiency, and reduce the cost of wind energy -- one discovery at a time. Whether you require blade testing or resource characterization, grid integration or high-performance computing, Department of Energy laboratory facilities offer a variety of capabilities to meet your wind research needs.

  19. Waste Facilities

    Data.gov (United States)

    Vermont Center for Geographic Information — This dataset was developed from the Vermont DEC's list of certified solid waste facilities. It includes facility name, contact information, and the materials...

  20. Health Facilities

    Science.gov (United States)

    Health facilities are places that provide health care. They include hospitals, clinics, outpatient care centers, and specialized care centers, ... psychiatric care centers. When you choose a health facility, you might want to consider How close it ...

  1. Fabrication Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Fabrication Facilities are a direct result of years of testing support. Through years of experience, the three fabrication facilities (Fort Hood, Fort Lewis, and...

  2. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, high both in dollar expenditure and in elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  3. Future Computer Requirements for Computational Aerodynamics

    Science.gov (United States)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  4. Measurement of ground motion in various sites

    International Nuclear Information System (INIS)

    Bialowons, W.; Amirikas, R.; Bertolini, A.; Kruecker, D.

    2007-04-01

    Ground vibrations may affect low-emittance beam transport in linear colliders, Free Electron Lasers (FEL) and synchrotron radiation facilities. This paper is an overview of a study program to measure ground vibrations at various sites, which can be used for site characterization in relation to accelerator design. Commercial broadband seismometers have been used to measure ground vibrations, and the resulting database is available to the scientific community. The methodology employed is to use the same equipment and data analysis tools at every site for ease of comparison. This database of ground vibrations, taken at 19 sites around the world, is the first of its kind. (orig.)
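A common way to compare such site measurements (a standard technique, though not necessarily the exact pipeline used in this study) is the power spectral density of the seismometer signal; a minimal periodogram sketch on a synthetic record:

```python
import numpy as np

fs = 100.0   # Hz, sampling rate of the digitized seismometer channel
n = 60_000   # 10 minutes of data
t = np.arange(n) / fs

# Synthetic ground velocity: a microseismic peak near 0.2 Hz plus
# broadband noise, standing in for a real record.
rng = np.random.default_rng(1)
v = 1e-6 * np.sin(2.0 * np.pi * 0.2 * t) + 1e-7 * rng.standard_normal(n)

# One-sided periodogram estimate of the PSD in (m/s)^2 / Hz.
spec = np.abs(np.fft.rfft(v)) ** 2 / (fs * n)
spec[1:-1] *= 2.0  # fold negative frequencies into the one-sided spectrum
f = np.fft.rfftfreq(n, d=1.0 / fs)

peak_hz = f[np.argmax(spec[1:]) + 1]  # skip the DC bin
print(round(peak_hz, 3))  # -> 0.2, the dominant microseismic frequency
```

Because every site is processed with the same tools, PSD curves like this can be overlaid directly for site-to-site comparison.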

  5. Regional analysis of ground and above-ground climate

    Science.gov (United States)

    1981-12-01

    The regional suitability of underground construction as a climate control technique is discussed with reference to (1) a bioclimatic analysis of long-term weather data for 29 locations in the United States to determine appropriate above ground climate control techniques, (2) a data base of synthesized ground temperatures for the coterminous United States, and (3) monthly dew point ground temperature comparisons for identifying the relative likelihood of condensation from one region to another. It is concluded that the suitability of earth tempering as a practice and of specific earth-sheltered design stereotypes varies geographically; while the subsurface almost always provides a thermal advantage on its own terms when compared to above ground climatic data, it can, nonetheless, compromise the effectiveness of other, regionally more important climate control techniques. Reviews of above and below ground climate mapping schemes related to human comfort and architectural design, and a detailed description of a theoretical model of ground temperature, heat flow, and heat storage in the ground are included. Strategies of passive climate control are presented in a discussion of the building bioclimatic analysis procedure which has been applied in a computer analysis of 30 years of weather data for each of the 29 locations in the United States.
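Theoretical ground-temperature models of the kind mentioned are commonly a damped, phase-lagged annual wave derived from one-dimensional heat conduction in a semi-infinite solid; a sketch under that assumption, with illustrative climate parameters (not values from the report):

```python
import math

def ground_temp(depth_m, day_of_year, t_mean=12.0, amplitude=11.0,
                phase_day=35, diffusivity_m2_per_day=0.05):
    """Damped, phase-lagged annual temperature wave at depth z:
    T(z,t) = T_mean - A * exp(-z/d) * cos(2*pi*(t - t0)/365 - z/d),
    with damping depth d = sqrt(2 * alpha / (2*pi/365)).
    All parameter values here are illustrative assumptions."""
    d = math.sqrt(2.0 * diffusivity_m2_per_day * 365.0 / (2.0 * math.pi))
    return t_mean - amplitude * math.exp(-depth_m / d) * \
        math.cos(2.0 * math.pi * (day_of_year - phase_day) / 365.0 - depth_m / d)

# Annual swing shrinks and lags with depth: large at the surface, small at 4 m.
print(round(ground_temp(0.0, 35), 1), round(ground_temp(0.0, 217), 1))
print(round(ground_temp(4.0, 35), 1), round(ground_temp(4.0, 217), 1))
```

The exponential damping term is what gives the subsurface its "thermal advantage" in the passage above: at a few meters' depth the annual swing is a fraction of a degree to a few degrees, lagged by weeks to months behind the surface.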

  6. Regional analysis of ground and above-ground climate

    Energy Technology Data Exchange (ETDEWEB)

    1981-12-01

    The regional suitability of underground construction as a climate control technique is discussed with reference to (1) a bioclimatic analysis of long-term weather data for 29 locations in the United States to determine appropriate above ground climate control techniques, (2) a data base of synthesized ground temperatures for the coterminous United States, and (3) monthly dew point ground temperature comparisons for identifying the relative likelihood of condensation from one region to another. It is concluded that the suitability of earth tempering as a practice and of specific earth-sheltered design stereotypes varies geographically; while the subsurface almost always provides a thermal advantage on its own terms when compared to above ground climatic data, it can, nonetheless, compromise the effectiveness of other, regionally more important climate control techniques. Also contained in the report are reviews of above and below ground climate mapping schemes related to human comfort and architectural design, and detailed description of a theoretical model of ground temperature, heat flow, and heat storage in the ground. Strategies of passive climate control are presented in a discussion of the building bioclimatic analysis procedure which has been applied in a computer analysis of 30 years of weather data for each of 29 locations in the United States.

  7. Water activities in Forsmark (Part II). The final disposal facility for spent fuel: water activities above ground; Vattenverksamhet i Forsmark (del II). Slutfoervarsanlaeggningen foer anvaent kaernbraensle: Vattenverksamheter ovan mark

    Energy Technology Data Exchange (ETDEWEB)

    Werner, Kent [EmpTec (Sweden); Hamren, Ulrika; Collinder, Per [Ekologigruppen AB (Sweden); Ridderstolpe, Peter [WRS Uppsala AB (Sweden)

    2010-09-15

    The construction of the repository for spent nuclear fuel in Forsmark is associated with a number of measures above ground that constitute water operations according to Chapter 11 in the Swedish Environmental Code. This report, which is an appendix to the Environmental Impact Assessment, describes these water operations, their effects and consequences, and planned measures

  8. Comparison of Knowledge and Attitudes Using Computer-Based and Face-to-Face Personal Hygiene Training Methods in Food Processing Facilities

    Science.gov (United States)

    Fenton, Ginger D.; LaBorde, Luke F.; Radhakrishna, Rama B.; Brown, J. Lynne; Cutter, Catherine N.

    2006-01-01

    Computer-based training is increasingly favored by food companies for training workers due to convenience, self-pacing ability, and ease of use. The objectives of this study were to determine if personal hygiene training, offered through a computer-based method, is as effective as a face-to-face method in knowledge acquisition and improved…

  9. Cleanup Verification Package for the 618-2 Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    W. S. Thompson

    2006-12-28

    This cleanup verification package documents completion of remedial action for the 618-2 Burial Ground, also referred to as Solid Waste Burial Ground No. 2; Burial Ground No. 2; 318-2; and Dry Waste Burial Site No. 2. This waste site was used primarily for the disposal of contaminated equipment, materials and laboratory waste from the 300 Area Facilities.

  10. Cleanup Verification Package for the 618-2 Burial Ground

    International Nuclear Information System (INIS)

    Thompson, W.S.

    2006-01-01

    This cleanup verification package documents completion of remedial action for the 618-2 Burial Ground, also referred to as Solid Waste Burial Ground No. 2; Burial Ground No. 2; 318-2; and Dry Waste Burial Site No. 2. This waste site was used primarily for the disposal of contaminated equipment, materials and laboratory waste from the 300 Area Facilities

  11. OFF-Stagnation point testing in plasma facility

    Science.gov (United States)

    Viladegut, A.; Chazot, O.

    2015-06-01

    Reentry space vehicles face extreme heat fluxes when interacting with the atmosphere at hypersonic velocities. Stagnation-point heat flux is normally used as a reference for Thermal Protection System (TPS) design; however, many critical phenomena also occur at off-stagnation points. This paper addresses the implementation of an off-stagnation-point methodology able to duplicate in a ground facility the hypersonic boundary layer over a flat-plate model. A first analysis using two-dimensional (2D) computational fluid dynamics (CFD) simulations is carried out to understand the limitations of this methodology when applying it in a plasma wind tunnel. Results from the testing campaign at the VKI Plasmatron are also presented.

  12. 6th July 2010 - United Kingdom Science and Technology Facilities Council W. Whitehorn signing the guest book with Head of International relations F. Pauss, visiting the Computing Centre with Information Technology Department Head Deputy D. Foster, the LHC superconducting magnet test hall with Technology Department P. Strubin,the Centre Control Centre with Operation Group Leader M. Lamont and the CLIC/CTF3 facility with Project Leader J.-P. Delahaye.

    CERN Multimedia

    Teams : M. Brice, JC Gadmer

    2010-01-01

    6th July 2010 - United Kingdom Science and Technology Facilities Council W. Whitehorn signing the guest book with Head of International relations F. Pauss, visiting the Computing Centre with Information Technology Department Head Deputy D. Foster, the LHC superconducting magnet test hall with Technology Department P. Strubin,the Centre Control Centre with Operation Group Leader M. Lamont and the CLIC/CTF3 facility with Project Leader J.-P. Delahaye.

  13. Viability of infrared FEL facilities

    International Nuclear Information System (INIS)

    Schwettman, H.A.

    2004-01-01

    Infrared FELs have broken important ground in optical science in the past decade. The rapid development of optical parametric amplifiers and oscillators, and THz sources, however, has changed the competitive landscape and compelled FEL facilities to identify and exploit their unique advantages. The viability of infrared FEL facilities depends on targeting unique world-class science and providing adequate experimental beam time at competitive costs

  14. Thin-section computed tomography–histopathologic comparisons of pulmonary focal interstitial fibrosis, atypical adenomatous hyperplasia, adenocarcinoma in situ, and minimally invasive adenocarcinoma with pure ground-glass opacity

    Energy Technology Data Exchange (ETDEWEB)

    Si, Ming-Jue, E-mail: smjsh@hotmail.com [Department of Radiology, Shanghai Ninth People’s Hospital, Shanghai Jiao Tong University School of Medicine, No. 280, Mohe Road, Shanghai 201999 (China); Tao, Xiao-Feng, E-mail: taoxiaofeng1963@hotmail.com [Department of Radiology, Shanghai Ninth People’s Hospital, Shanghai Jiao Tong University School of Medicine, No. 280, Mohe Road, Shanghai 201999 (China); Du, Guang-Ye, E-mail: 715376158@qq.com [Department of Pathology, Shanghai Ninth People’s Hospital, Shanghai Jiao Tong University School of Medicine, No. 280, Mohe Road, Shanghai 201999 (China); Cai, Ling-Ling, E-mail: caill_00@163.com [Department of Radiology, Shanghai Ninth People’s Hospital, Shanghai Jiao Tong University School of Medicine, No. 280, Mohe Road, Shanghai 201999 (China); Han, Hong-Xiu, E-mail: hanhongxiu@hotmail.com [Department of Pathology, Shanghai Ninth People’s Hospital, Shanghai Jiao Tong University School of Medicine, No. 280, Mohe Road, Shanghai 201999 (China); Liang, Xi-Zi, E-mail: liangxizish@hotmail.com [Department of Pathology, Shanghai Ninth People’s Hospital, Shanghai Jiao Tong University School of Medicine, No. 280, Mohe Road, Shanghai 201999 (China); Zhao, Jiang-Min, E-mail: zhaojiangmin1962@hotmail.com [Department of Radiology, Shanghai Ninth People’s Hospital, Shanghai Jiao Tong University School of Medicine, No. 280, Mohe Road, Shanghai 201999 (China)

    2016-10-15

    Objective: To retrospectively compare focal interstitial fibrosis (FIF), atypical adenomatous hyperplasia (AAH), adenocarcinoma in situ (AIS), and minimally invasive adenocarcinoma (MIA) presenting with pure ground-glass opacity (GGO) using thin-section computed tomography (CT). Materials and methods: Sixty pathologically confirmed cases were reviewed, including 7 cases of FIF, 17 of AAH, 23 of AIS, and 13 of MIA. All nodules retained a pure ground-glass appearance before surgical resection, and the last thin-section CT imaging data before operation were collected. Differences in patient demographics and CT features were compared among the four types of lesions. Results: FIF occurred more frequently in males and smokers, while the others occurred more frequently in female nonsmokers. Nodule size was significantly larger in MIA (P < 0.001, cut-off value = 7.5 mm). Nodule shape (P = 0.045), margin characteristics (P < 0.001), the presence of pleural indentation (P = 0.032), and vascular ingress (P < 0.001) were significant factors differentiating the 4 groups. A concave margin was demonstrated only in FIF, and there in a high proportion of cases (85.7%; P = 0.002). There were no significant differences (all P > 0.05) in age, malignant history, attenuation value, location, or presence of bubble-like lucency. Conclusion: A nodule size >7.5 mm increases the possibility of MIA. A concave margin could be useful for differentiating FIF from the other malignant or pre-malignant GGO nodules. The presence of spiculation or pleural indentation may preclude the diagnosis of AAH.

  15. Ground Truth Annotation in T Analyst

    DEFF Research Database (Denmark)

    2015-01-01

    This video shows how to annotate the ground truth tracks in the thermal videos. The ground truth tracks are produced to be able to compare them to tracks obtained from a Computer Vision tracking approach. The program used for annotation is T-Analyst, which is developed by Aliaksei Laureshyn, Ph...

  16. Optimizing Engineering Tools Using Modern Ground Architectures

    Science.gov (United States)

    2017-12-01

    Master's thesis: OPTIMIZING ENGINEERING TOOLS USING MODERN GROUND ARCHITECTURES, by Ryan P. McArdle, December 2017. Thesis Advisor: Marc Peters; Co-Advisor: I.M. Ross. ... engineering tools. First, the effectiveness of MathWorks' Parallel Computing Toolkit is assessed when performing somewhat basic computations in

  17. Design basis ground motion (Ss) required on new regulatory guide

    International Nuclear Information System (INIS)

    Kamae, Katsuhiro

    2013-01-01

    The new regulatory guide was enforced on July 8. This article introduces how the design basis ground motion (Ss) for the seismic design of nuclear power reactor facilities was revised in the new guide. Ss is formulated as two types of earthquake ground motion: ground motions with a site-specific earthquake source, and ground motions with no specific source location. The latter is going to be revised based on recently observed near-source ground motions. (author)

  18. Facilities Programming.

    Science.gov (United States)

    Bullis, Robert V.

    1992-01-01

    A procedure for physical facilities management written 17 years ago is still worth following today. Each of the steps outlined for planning, organizing, directing, controlling, and evaluating must be accomplished if school facilities are to be properly planned and constructed. However, lessons have been learned about energy consumption and proper…

  19. Nuclear facilities

    International Nuclear Information System (INIS)

    Anon.

    2000-01-01

    Here is given the decree (2000-1065) of the 25. of October 2000 reporting the publication of the convention between the Government of the French Republic and the CERN concerning the safety of the LHC (Large Hadron Collider) and the SPS (Proton Supersynchrotron) facilities, signed in Geneva on July 11, 2000. By this convention, the CERN undertakes to ensure the safety of the LHC and SPS facilities and those of the operations of the LEP decommissioning. The French legislation and regulations on basic nuclear facilities (concerning more particularly the protection against ionizing radiations, the protection of the environment and the safety of facilities) and those which could be decided later on apply to the LHC, SPS and auxiliary facilities. (O.M.)

  20. Patient-specific radiation dose and cancer risk in computed tomography examinations in some selected CT facilities in the Greater Accra Region of Ghana

    International Nuclear Information System (INIS)

    Osei, R. K.

    2012-01-01

    The effective dose and cancer risk were determined for patients undergoing seven different types of CT examinations in two CT facilities in the Greater Accra region of Ghana. The two facilities, namely the Diagnostic Centre Ltd and Cocoa Clinic, were chosen because of their significant patient throughput. The effective dose was estimated from patient data, namely age, sex, height and weight, and technique factors, namely scan length, kVp (kilovolts peak), mAs (milliampere-seconds) and CTDIv from the control console of the CT machines. The effective dose was also estimated using the dose length product (DLP) and k coefficients, which are the anatomic region-specific conversion factors. The cancer risk for each patient for a particular examination was determined from the effective dose, age and sex of each patient with the help of BEIR VII. In all, a total of 800 adult patients, 400 from each of the two CT facilities, were compiled. From Diagnostic Centre Ltd, the average effective dose was 5.61 mSv in the range of 1.41 mSv to 13.34 mSv, with an average BMI of 26.19 kg/m² in the range of 16.90 kg/m² to 48.28 kg/m², for all types of examinations. The average cancer risk was 0.0458 Sv⁻¹ for 400 patients, in the range of 0.0001 Sv⁻¹ to 0.3036 Sv⁻¹, compared with a population of 900 patients undergoing CT examination per year. From Cocoa Clinic, the average effective dose was 3.91 mSv in the range of 0.54 mSv to 27.32 mSv, with an average BMI of 25.59 kg/m² in the range of 17.18 kg/m² to 35.34 kg/m², and the average cancer risk was 0.0371 Sv⁻¹ in the range of 0.0001 Sv⁻¹ to 0.7125 Sv⁻¹. Some of the values were within the range of typical effective doses for CT examinations reported by the ICRP. It was evident from this study that the variations in scanning parameters had a significant impact on the effective doses to patients for similar CT examinations between the two facilities. (au)
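    The DLP-based estimate described in the abstract can be sketched as E = DLP × k. The k values below are illustrative adult conversion coefficients for a few body regions, assumed for the example; they are not the coefficients used in the study.

```python
# Effective dose from a CT dose-length product: E = DLP * k,
# where k is an anatomic-region-specific conversion coefficient.
# Coefficients below are illustrative adult values in mSv per (mGy*cm),
# assumed for this sketch rather than taken from the study.
K_COEFF = {
    "head": 0.0021,
    "chest": 0.014,
    "abdomen_pelvis": 0.015,
}

def effective_dose_msv(dlp_mgy_cm: float, region: str) -> float:
    """Estimate effective dose (mSv) from DLP (mGy*cm) for a body region."""
    return dlp_mgy_cm * K_COEFF[region]
```

    Under these assumed coefficients, a chest examination with a DLP of 400 mGy·cm would correspond to roughly 5.6 mSv.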

  1. Ground water '89

    International Nuclear Information System (INIS)

    1989-01-01

    The proceedings of the 5th biennial symposium of the Ground Water Division of the Geological Society of South Africa are presented. The theme of the symposium was ground water and mining. Papers were presented on the following topics: ground water resources; ground water contamination; chemical analyses of ground water; and mining and its influence on ground water. Separate abstracts were prepared for 5 of the papers presented. The remaining papers were considered outside the subject scope of INIS.

  2. Unique life sciences research facilities at NASA Ames Research Center

    Science.gov (United States)

    Mulenburg, G. M.; Vasques, M.; Caldwell, W. F.; Tucker, J.

    1994-01-01

    The Life Science Division at NASA's Ames Research Center has a suite of specialized facilities that enable scientists to study the effects of gravity on living systems. This paper describes some of these facilities and their use in research. Seven centrifuges, each with its own unique abilities, allow testing of a variety of parameters on test subjects ranging from single cells through hardware to humans. The Vestibular Research Facility allows the study of both centrifugation and linear acceleration on animals and humans. The Biocomputation Center uses computers for 3D reconstruction of physiological systems, and interactive research tools for virtual reality modeling. Psychophysiological, cardiovascular, exercise physiology, and biomechanical studies are conducted in the 12-bed Human Research Facility, and samples are analyzed in the certified Central Clinical Laboratory and other laboratories at Ames. Human bedrest, water immersion and lower body negative pressure equipment are also available to study physiological changes associated with weightlessness. These and other weightlessness models are used in specialized laboratories for the study of basic physiological mechanisms, metabolism and cell biology. Visual-motor performance, perception, and adaptation are studied using ground-based models as well as short-term weightlessness experiments (parabolic flights). The unique combination of Life Science research facilities, laboratories, and equipment at Ames Research Center is described in detail in relation to their research contributions.

  3. Space and Ground-Based Infrastructures

    Science.gov (United States)

    Weems, Jon; Zell, Martin

    This chapter deals first with the main characteristics of the space environment, outside and inside a spacecraft. Then the space and space-related (ground-based) infrastructures are described. The most important infrastructure is the International Space Station, which holds many European facilities (for instance the European Columbus Laboratory). Some of them, such as the Columbus External Payload Facility, are located outside the ISS to benefit from external space conditions. There is only one other example of orbital platforms, the Russian Foton/Bion Recoverable Orbital Capsule. In contrast, non-orbital weightless research platforms, although limited in experimental time, are more numerous: sounding rockets, parabolic flight aircraft, drop towers and high-altitude balloons. In addition to these facilities, there are a number of ground-based facilities and space simulators, for both life sciences (for instance: bed rest, clinostats) and physical sciences (for instance: magnetic compensation of gravity). Hypergravity can also be provided by human and non-human centrifuges.

  4. Computer Operating System Maintenance.

    Science.gov (United States)

    1982-06-01

    The Computer Management Information Facility (CMIF) system was developed by Rapp Systems to fulfill the need at the CRF to record and report on computer center resource usage and utilization. The foundation of the CMIF system is a System 2000 data base (CRFMGMT) which stores and permits access

  5. Simulated earthquake ground motions

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.; Gasparini, D.A.

    1977-01-01

    The paper reviews current methods for generating synthetic earthquake ground motions. Emphasis is on the special requirements demanded of procedures to generate motions for use in nuclear power plant seismic response analysis. Specifically, very close agreement is usually sought between the response spectra of the simulated motions and prescribed, smooth design response spectra. The features and capabilities of the computer program SIMQKE, which has been widely used in power plant seismic work, are described. Problems and pitfalls associated with the use of synthetic ground motions in seismic safety assessment are also pointed out. The limitations and paucity of recorded accelerograms, together with the widespread use of time-history dynamic analysis for obtaining structural and secondary systems' response, have motivated the development of earthquake simulation capabilities. A common model for synthesizing earthquakes is that of superposing sinusoidal components with random phase angles. The input parameters for such a model are, then, the amplitudes and phase angles of the contributing sinusoids as well as the characteristics of the variation of motion intensity with time, especially the duration of the motion. The amplitudes are determined from estimates of the Fourier spectrum or the spectral density function of the ground motion. These amplitudes may be assumed to be varying in time or constant for the duration of the earthquake. In the nuclear industry, the common procedure is to specify a set of smooth response spectra for use in aseismic design. This development and the need for time histories have generated much practical interest in synthesizing earthquakes whose response spectra 'match', or are compatible with, a set of specified smooth response spectra.
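    The sinusoid-superposition model described in the abstract can be sketched as follows. This is a minimal illustration of the general approach, not the SIMQKE implementation; the frequency/amplitude inputs and the optional intensity envelope are placeholders that would, in practice, come from a target spectral density and a duration model.

```python
import math
import random

def synthetic_accelerogram(freqs, amplitudes, duration, dt, envelope=None, seed=0):
    """Superpose sinusoids with random phase angles to form a ground motion.

    freqs, amplitudes: contributing frequencies (Hz) and their amplitudes,
    e.g. derived from a target spectral density function.
    envelope: optional function of time shaping the variation of motion
    intensity (and hence the effective duration) of the record.
    """
    rng = random.Random(seed)
    n = int(round(duration / dt))
    t = [i * dt for i in range(n)]
    # one random phase angle per contributing sinusoid
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in freqs]
    accel = []
    for ti in t:
        a = sum(amp * math.sin(2.0 * math.pi * f * ti + phi)
                for f, amp, phi in zip(freqs, amplitudes, phases))
        if envelope is not None:
            a *= envelope(ti)
        accel.append(a)
    return t, accel
```

    In an iterative spectrum-matching scheme, the amplitudes would then be adjusted until the response spectrum of the simulated motion is acceptably close to the prescribed smooth design spectrum.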

  6. Meteorological instrumentation for nuclear facilities

    International Nuclear Information System (INIS)

    Costa, A.C.L. da.

    1983-01-01

    The main requirements of regulatory agencies, concerning the meteorological instrumentation needed for the licensing of nuclear facilities are discussed. A description is made of the operational principles of sensors for the various meteorological parameters and associated electronic systems. An analysis of the problems associated with grounding of a typical meteorological station is presented. (Author) [pt

  7. DRY TRANSFER FACILITY SEISMIC ANALYSIS

    International Nuclear Information System (INIS)

    EARNEST, S.; KO, H.; DOCKERY, W.; PERNISI, R.

    2004-01-01

    The purpose of this calculation is to perform a dynamic and static analysis of the Dry Transfer Facility and to determine the response-spectrum seismic forces for the design basis ground motions. The resulting seismic forces and accelerations will be used in a subsequent calculation to complete the preliminary design of the concrete shear walls, diaphragms, and basemat.

  8. Ground Vehicle Convoying

    Science.gov (United States)

    Gage, Douglas W.; Pletta, J. Bryan

    1987-01-01

    Initial investigations into two different approaches for applying autonomous ground vehicle technology to the vehicle convoying application are described. A minimal capability system that would maintain desired speed and vehicle spacing while a human driver provided steering control could improve convoy performance and provide positive control at night and in inclement weather, but would not reduce driver manpower requirements. Such a system could be implemented in a modular and relatively low cost manner. A more capable system would eliminate the human driver in following vehicles and reduce manpower requirements for the transportation of supplies. This technology could also be used to aid in the deployment of teleoperated vehicles in a battlefield environment. The needs, requirements, and several proposed solutions for such an Attachable Robotic Convoy Capability (ARCC) system will be discussed. Included are discussions of sensors, communications, computers, control systems and safety issues. This advanced robotic convoy system will provide a much greater capability, but will be more difficult and expensive to implement.
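    The "maintain desired speed and vehicle spacing" function described above can be sketched as a simple feedback loop. This is a minimal illustration assuming a PD control law on the spacing error; the gains, limits, and scenario values are hypothetical and are not taken from the ARCC system.

```python
def follower_accel(gap, gap_rate, desired_gap, kp=0.5, kd=0.8, a_max=2.0):
    """PD control on the spacing error; positive output closes the gap.

    gap: distance to the lead vehicle (m); gap_rate: its time derivative
    (m/s); output is a commanded acceleration clipped to +/- a_max.
    """
    cmd = kp * (gap - desired_gap) + kd * gap_rate
    return max(-a_max, min(a_max, cmd))

def simulate(steps=600, dt=0.1):
    """Leader at a constant 10 m/s; follower starts slower and 30 m behind."""
    x_lead, v_lead = 30.0, 10.0
    x_fol, v_fol = 0.0, 5.0
    for _ in range(steps):
        gap = x_lead - x_fol
        a = follower_accel(gap, v_lead - v_fol, desired_gap=20.0)
        v_fol += a * dt          # follower obeys the commanded acceleration
        x_fol += v_fol * dt
        x_lead += v_lead * dt
    return x_lead - x_fol, v_fol
```

    With these gains the closed loop behaves as a damped oscillator in the spacing error, so the follower settles at the desired 20 m gap while matching the leader's speed.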

  9. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...
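    The diagonalization argument mentioned in the abstract can be sketched informally: suppose an oracle could decide whether the "diagonal" program halts; the program is built to do the opposite of whatever the oracle claims, so no such oracle can exist. The toy case analysis below is an illustration of that contradiction, not a formal proof.

```python
def diagonal_behavior(oracle_says_halts: bool) -> str:
    """What the diagonal program actually does, given the oracle's claim:
    if the oracle says it halts, it loops forever; otherwise it halts."""
    return "loops" if oracle_says_halts else "halts"

def oracle_is_consistent(oracle_says_halts: bool) -> bool:
    """Check whether the oracle's claim matches the actual behavior."""
    claimed = "halts" if oracle_says_halts else "loops"
    return diagonal_behavior(oracle_says_halts) == claimed
```

    Both possible oracle answers turn out to be inconsistent with the program's actual behavior, which is the contradiction at the heart of the undecidability of the halting problem.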

  10. ANS main control complex three-dimensional computer model development

    International Nuclear Information System (INIS)

    Cleaves, J.E.; Fletcher, W.M.

    1993-01-01

    A three-dimensional (3-D) computer model of the Advanced Neutron Source (ANS) main control complex is being developed. The main control complex includes the main control room, the technical support center, the materials irradiation control room, computer equipment rooms, communications equipment rooms, cable-spreading rooms, and some support offices and breakroom facilities. The model will be used to provide facility designers and operations personnel with capabilities for fit-up/interference analysis, visual "walk-throughs" for optimizing maintainability, and human factors and operability analyses. It will be used to determine performance design characteristics, to generate construction drawings, and to integrate control room layout, equipment mounting, grounding equipment, electrical cabling, and utility services into ANS building designs. This paper describes the development of the initial phase of the 3-D computer model for the ANS main control complex and plans for its development and use.

  11. Enhanced Research Opportunity to Study the Atmospheric Forcing by High-Energy Particle Precipitation at High Latitudes: Emerging New Satellite Data and the new Ground-Based Observations in Northern Scandinavia, including the EISCAT_3D Incoherent Scatter Facility.

    Science.gov (United States)

    Turunen, E. S.; Ulich, T.; Kero, A.; Tero, R.; Verronen, P. T.; Norberg, J.; Miyoshi, Y.; Oyama, S. I.; Saito, S.; Hosokawa, K.; Ogawa, Y.

    2017-12-01

    Recent observational and model results on particle precipitation as a source of atmospheric variability challenge us to implement better, continuously monitoring observational infrastructure for middle and upper atmospheric research. An example is the effect of high-energy electron precipitation during pulsating aurora on mesospheric ozone, the concentration of which may be reduced by several tens of percent, similarly to some solar proton events, which are known to occur more rarely than pulsating aurora. So far the Assessment Reports by the Intergovernmental Panel on Climate Change did not explicitly include the particle forcing of the middle and upper atmosphere in their climate model scenarios. This will appear for the first time in the upcoming climate simulations. We review recent results related to atmospheric forcing by particle precipitation via effects on chemical composition. We also show the research potential of new ground-based radio measurement techniques, such as spectral riometry and incoherent scatter by new phased-array radars, such as EISCAT_3D, which will be a volumetric, 3-dimensionally imaging radar distributed in Norway, Sweden, and Finland. It is expected to be operational from 2020 onwards, surpassing all the current IS radars of the world in technology. It will be able to produce continuous information on ionospheric plasma parameters in a volume, including 3D-vector plasma velocities. For the first time we will be able to map the 3D electric currents in the ionosphere, and we will have continuous vector wind measurements in the mesosphere. The geographical area covered by the EISCAT_3D measurements can be expanded by suitably selected other continuous observations, such as optical and satellite tomography networks. A new 100 Hz all-sky camera network was recently installed in Northern Scandinavia in order to support the Japanese Arase satellite mission.
In near future the ground-based measurement network will also include new

  12. User's manual of a computer code for seismic hazard evaluation for assessing the threat to a facility by fault model. SHEAT-FM

    International Nuclear Information System (INIS)

    Sugino, Hideharu; Onizawa, Kunio; Suzuki, Masahide

    2005-09-01

    To establish a reliability evaluation method for aged structural components, we developed a probabilistic seismic hazard evaluation code, SHEAT-FM (Seismic Hazard Evaluation for Assessing the Threat to a facility site - Fault Model), using a seismic motion prediction method based on a fault model. In order to improve the seismic hazard evaluation, this code takes the latest knowledge in the field of earthquake engineering into account. For example, the code involves the group delay time of observed records and an update-process model of active faults. This report describes the user's guide of SHEAT-FM, including an outline of the seismic hazard evaluation, the specification of input data, a sample problem for a model site, system information and the execution method. (author)

  13. Decommissioning Facility Characterization DB System

    International Nuclear Information System (INIS)

    Park, S. K.; Ji, Y. H.; Park, J. H.; Chung, U. S.

    2010-01-01

    Basically, when decommissioning is planned for a nuclear facility, an investigation into the characterization of the nuclear facility is first required. The results of such an investigation are used for calculating the quantities of dismantled waste and estimating the cost of the decommissioning project. In this paper, a computer system for the characterization of nuclear facilities, called DEFACS (DEcommissioning FAcility Characterization DB System), is presented. This system consists of four main parts: a management coding system for grouping items, a data input system, a data processing system and a data output system. All data are processed in a simplified and formatted manner in order to provide useful information to the decommissioning planner. For the hardware, PC-grade computers running Oracle software on Microsoft Windows OS were selected. The characterization data for the nuclear facility under decommissioning will be utilized by the work-unit productivity calculation system and the decommissioning engineering system as basic sources of information.

  14. Decommissioning Facility Characterization DB System

    Energy Technology Data Exchange (ETDEWEB)

    Park, S. K.; Ji, Y. H.; Park, J. H.; Chung, U. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    Basically, when decommissioning is planned for a nuclear facility, an investigation into the characterization of the nuclear facility is first required. The results of such an investigation are used for calculating the quantities of dismantled waste and estimating the cost of the decommissioning project. In this paper, a computer system for the characterization of nuclear facilities, called DEFACS (DEcommissioning FAcility Characterization DB System), is presented. This system consists of four main parts: a management coding system for grouping items, a data input system, a data processing system and a data output system. All data are processed in a simplified and formatted manner in order to provide useful information to the decommissioning planner. For the hardware, PC-grade computers running Oracle software on Microsoft Windows OS were selected. The characterization data for the nuclear facility under decommissioning will be utilized by the work-unit productivity calculation system and the decommissioning engineering system as basic sources of information.

  15. Mammography Facilities

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Mammography Facility Database is updated periodically based on information received from the four FDA-approved accreditation bodies: the American College of...

  16. Canyon Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — B Plant, T Plant, U Plant, PUREX, and REDOX (see their links) are the five facilities at Hanford where the original objective was plutonium removal from the uranium...

  17. Methodology to evaluate the site standard seismic motion to a nuclear facility

    International Nuclear Information System (INIS)

    Soares, W.A.

    1983-01-01

    For the seismic design of nuclear facilities, the input motion is normally defined by the predicted maximum ground horizontal acceleration and the free-field ground response spectrum. This spectrum is computed on the basis of records of strong-motion earthquakes. The pair of maximum acceleration and response spectrum is called the site standard seismic motion. An overall view of the subjects involved in the determination of the site standard seismic motion for a nuclear facility is presented. The main topics discussed are: basic principles of seismic instrumentation; dynamic and spectral concepts; design earthquake definitions; fundamentals of seismology; empirical curves developed from prior seismic data; and available methodologies and recommended procedures to evaluate the site standard seismic motion. (Author) [pt
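    The response spectrum mentioned above is obtained by running a recorded accelerogram through a family of damped single-degree-of-freedom oscillators and recording each one's peak response. A minimal sketch follows; the integration scheme, damping ratio, and parameters are illustrative assumptions, not a production implementation.

```python
import math

def response_spectrum(accel, dt, periods, damping=0.05):
    """Pseudo-acceleration response spectrum of a ground accelerogram.

    For each oscillator period T, integrate the relative-motion equation
    u'' + 2*z*wn*u' + wn^2*u = -ag  (semi-implicit Euler) and record the
    peak relative displacement, then report wn^2 * peak (pseudo-Sa).
    """
    spectrum = []
    for T in periods:
        wn = 2.0 * math.pi / T          # natural circular frequency
        u, v, peak = 0.0, 0.0, 0.0
        for ag in accel:
            a = -ag - 2.0 * damping * wn * v - wn * wn * u
            v += a * dt
            u += v * dt
            peak = max(peak, abs(u))
        spectrum.append(wn * wn * peak)  # pseudo-spectral acceleration
    return spectrum
```

    An oscillator tuned near the dominant period of the record responds far more strongly than one far off resonance, which is exactly what the design spectrum captures.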

  18. Brayton Isotope Power System (BIPS) facility specification

    International Nuclear Information System (INIS)

    1976-01-01

    General requirements for the Brayton Isotope Power System (BIPS)/Ground Demonstration System (GDS) assembly and test facility are defined. The facility will include provisions for a complete test laboratory for GDS checkout, performance, and endurance testing, and a contamination-controlled area for assembly, fabrication, storage, and storage preparation of GDS components. Specifications, schedules, and drawings are included

  19. Brayton Isotope Power System (BIPS) facility specification

    Energy Technology Data Exchange (ETDEWEB)

    1976-05-31

    General requirements for the Brayton Isotope Power System (BIPS)/Ground Demonstration System (GDS) assembly and test facility are defined. The facility will include provisions for a complete test laboratory for GDS checkout, performance, and endurance testing, and a contamination-controlled area for assembly, fabrication, storage, and storage preparation of GDS components. Specifications, schedules, and drawings are included.

  20. Designing Facilities for Collaborative Operations

    Science.gov (United States)

    Norris, Jeffrey; Powell, Mark; Backes, Paul; Steinke, Robert; Tso, Kam; Wales, Roxana

    2003-01-01

    A methodology for designing operational facilities for collaboration by multiple experts has begun to take shape as an outgrowth of a project to design such facilities for scientific operations of the planned 2003 Mars Exploration Rover (MER) mission. The methodology could also be applicable to the design of military "situation rooms" and other facilities for terrestrial missions. It was recognized in this project that modern mission operations depend heavily upon the collaborative use of computers. It was further recognized that tests have shown that the layout of a facility exerts a dramatic effect on the efficiency and endurance of the operations staff. The facility designs (for example, see figure) and the methodology developed during the project reflect this recognition. One element of the methodology is a metric, called effective capacity, that was created for use in evaluating proposed MER operational facilities and may also be useful for evaluating other collaboration spaces, including meeting rooms and military situation rooms. The effective capacity of a facility is defined as the number of people in the facility who can be meaningfully engaged in its operations. A person is considered to be meaningfully engaged if the person can (1) see, hear, and communicate with everyone else present; (2) see the material under discussion (typically data on a piece of paper, computer monitor, or projection screen); and (3) provide input to the product under development by the group. The effective capacity of a facility is less than the number of people that can physically fit in the facility. For example, a typical office that contains a desktop computer has an effective capacity of 4, while a small conference room that contains a projection screen has an effective capacity of around 10.
Little or no benefit would be derived from allowing the number of persons in an operational facility to exceed its effective capacity: At best, the operations staff would be underutilized

  1. Ground water and energy

    Energy Technology Data Exchange (ETDEWEB)

    1980-11-01

    This national workshop on ground water and energy was conceived by the US Department of Energy's Office of Environmental Assessments. Generally, OEA needed to know what data are available on ground water, what information is still needed, and how DOE can best utilize what has already been learned. The workshop focused on three areas: (1) ground water supply; (2) conflicts and barriers to ground water use; and (3) alternatives or solutions to the various issues relating to ground water. (ACR)

  2. Large mass storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Peskin, Arnold M.

    1978-08-01

    This is the final report of a study group organized to investigate questions surrounding the acquisition of a large mass storage facility. The programmatic justification for such a system at Brookhaven is reviewed. Several candidate commercial products are identified and discussed. A draft of a procurement specification is developed. Some thoughts on possible new directions for computing at Brookhaven are also offered, although this topic was addressed outside of the context of the group's deliberations. 2 figures, 3 tables.

  3. Leaders break ground for INFINITY

    Science.gov (United States)

    2008-01-01

    Community leaders from Mississippi and Louisiana break ground for the new INFINITY at NASA Stennis Space Center facility during a Nov. 20 ceremony. Groundbreaking participants included (l to r): Gottfried Construction representative John Smith, Mississippi Highway Commissioner Wayne Brown, INFINITY board member and Apollo 13 astronaut Fred Haise, Stennis Director Gene Goldman, Studio South representative David Hardy, Leo Seal Jr. family representative Virginia Wagner, Hancock Bank President George Schloegel, Mississippi Rep. J.P. Compretta, Mississippi Band of Choctaw Indians representative Charlie Benn and Louisiana Sen. A.G. Crowe.

  4. Planning School Grounds for Outdoor Learning

    Science.gov (United States)

    Wagner, Cheryl; Gordon, Douglas

    2010-01-01

    This publication covers the planning and design of school grounds for outdoor learning in new and existing K-12 facilities. Curriculum development as well as athletic field planning and maintenance are not covered although some references on these topics are provided. It discusses the different types of outdoor learning environments that can be…

  5. Irradiation facilities in JRR-3M

    International Nuclear Information System (INIS)

    Ohtomo, Akitoshi; Sigemoto, Masamitsu; Takahashi, Hidetake

    1992-01-01

    Irradiation facilities have been installed in the upgraded JRR-3 (JRR-3M) at the Japan Atomic Energy Research Institute (JAERI). There are hydraulic rabbit facilities (HR), pneumatic rabbit facilities (PN), a neutron activation analysis facility (PN3), a uniform irradiation facility (SI), a rotating irradiation facility and capsule irradiation facilities to carry out neutron irradiation in the JRR-3M. These facilities are operated using a process control computer system to centralize the process information. Some of the characteristics of the facilities were satisfactorily measured at the same time as the reactor performance test in 1990. During reactor operation, some of the tests are continued to confirm the basic characteristics of the facilities; for example, PN3 was confirmed to have sufficient performance for activation analysis. Measurement of neutron flux at all irradiation positions has been carried out for the equilibrium core. (author)

  6. Service facilities

    International Nuclear Information System (INIS)

    Lestyan, Ernoe

    1988-01-01

    Major structural features are given of the supplementary buildings where, among others, the cloakrooms, laundries, and the dosimetric and radiochemical laboratories are situated. Additional buildings not in close connection with the technology, such as the central office, chemical water pretreatment, boiler, heat centre, transducer station, kitchen, canteen etc., are listed. Ground plans and photos are presented. (V.N.) 11 figs

  7. Space shuttle/food system study. Volume 2, Appendix G: Ground support system analysis. Appendix H: Galley functional details analysis

    Science.gov (United States)

    1974-01-01

    The capabilities for preflight feeding of flight personnel and the supply and control of the space shuttle flight food system were investigated to determine ground support requirements; and the functional details of an onboard food system galley are shown in photographic mockups. The elements which were identified as necessary to the efficient accomplishment of ground support functions include the following: (1) administration; (2) dietetics; (3) analytical laboratories; (4) flight food warehouse; (5) stowage module assembly area; (6) launch site module storage area; (7) alert crew restaurant and dispersed crew galleys; (8) ground food warehouse; (9) manufacturing facilities; (10) transport; and (11) computer support. Each element is discussed according to the design criteria of minimum cost, maximum flexibility, reliability, and efficiency consistent with space shuttle requirements. The galley mockup overview illustrates the initial operation configuration, food stowage locations, meal assembly and serving trays, meal preparation configuration, serving, trash management, and the logistics of handling and cleanup equipment.

  8. Computer surety: computer system inspection guidance. [Contains glossary

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  9. SRS Burial Ground Complex: Remediation in Progress

    International Nuclear Information System (INIS)

    Griffin, M.; Crapse, B.; Cowan, S.

    1998-01-01

    Closure of the various areas in the Burial Ground Complex (BGC) represents a major step in the reduction of risk at the Savannah River Site (SRS) and a significant investment of resources. The Burial Ground Complex occupies approximately 195 acres in the central section of the SRS. Approximately 160 acres of the BGC consist of hazardous and radioactive waste disposal sites that require remediation. Of these source acres, one-third have been remediated while two-thirds are undergoing interim or final action. These restoration activities have been carried out in a safe and cost-effective manner while minimizing impact to operating facilities. Successful completion of these activities is in large part due to the teamwork demonstrated by the Department of Energy, contractor/subcontractor personnel, and the regulatory agencies. The experience and knowledge gained from the closure of these large disposal facilities can be used to expedite closure of similar facilities

  10. Visitor's Computer Guidelines | CTIO

    Science.gov (United States)


  11. Facile synthesis of silver nanoparticles and its antibacterial activity against Escherichia coli and unknown bacteria on mobile phone touch surfaces/computer keyboards

    Science.gov (United States)

    Reddy, T. Ranjeth Kumar; Kim, Hyun-Joong

    2016-07-01

    In recent years, there has been significant interest in the development of novel metallic nanoparticles using various top-down and bottom-up synthesis techniques. Kenaf is an abundant biomass product and a potential component for industrial applications. In this work, we investigated the green synthesis of silver nanoparticles (AgNPs) using kenaf (Hibiscus cannabinus) cellulose extract and sucrose, which act as stabilizing and reducing agents in solution. With this method, by changing the pH of the solution as a function of time, we studied the optical, morphological and antibacterial properties of the synthesized AgNPs. In addition, these nanoparticles were characterized by ultraviolet-visible spectroscopy, transmission electron microscopy (TEM), field-emission scanning electron microscopy, Fourier transform infrared (FTIR) spectroscopy, energy-dispersive X-ray spectroscopy (EDX) and X-ray diffraction (XRD). As the pH of the solution varies, the surface plasmon resonance peak also varies. A faster rate of reaction at pH 10 compared with that at pH 5 was identified. TEM micrographs confirm that the shapes of the particles are spherical and polygonal. Furthermore, the average size of the nanoparticles synthesized at pH 5, pH 8 and pH 10 is 40.26, 28.57 and 24.57 nm, respectively. The structure of the synthesized AgNPs was identified as face-centered cubic (fcc) by XRD. The compositional analysis was determined by EDX. FTIR confirms that the kenaf cellulose extract and sucrose act as stabilizing and reducing agents for the silver nanoparticles. Meanwhile, these AgNPs exhibited size-dependent antibacterial activity against Escherichia coli (E. coli) and two other unknown bacteria from mobile phone screens and computer keyboard surfaces.

  12. The ground based plan

    International Nuclear Information System (INIS)

    1989-01-01

    The paper presents a report of ''The Ground Based Plan'' of the United Kingdom Science and Engineering Research Council, a plan for research in astronomy and planetary science by ground-based techniques. The report contains a description of: the scientific objectives and technical requirements (the basis for the Plan), the present organisation and funding for the ground-based programme, the Plan itself, its main scientific features, and its further objectives. (U.K.)

  13. 40 CFR 257.3-4 - Ground water.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Ground water. 257.3-4 Section 257.3-4... and Practices § 257.3-4 Ground water. (a) A facility or practice shall not contaminate an underground drinking water source beyond the solid waste boundary or beyond an alternative boundary specified in...

  14. Fire hazards analysis for solid waste burial grounds

    International Nuclear Information System (INIS)

    McDonald, K.M.

    1995-01-01

    This document comprises the fire hazards analysis for the solid waste burial grounds, including TRU trenches, low-level burial grounds, radioactive mixed waste trenches, etc. It analyzes the fire potential and fire damage potential for these facilities. Fire scenarios may be utilized in future safety analysis work, or for increasing the understanding of where hazards may exist in the present operation

  15. Constructivist Grounded Theory?

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, PhD, Hon. PhD

    2012-06-01

    Full Text Available. Abstract: I refer to and use as scholarly inspiration Charmaz’s excellent article on constructivist grounded theory as a tool of getting to the fundamental issues on why grounded theory is not constructivist. I show that constructivist data, if it exists at all, is a very, very small part of the data that grounded theory uses.

  16. Communication, concepts and grounding

    NARCIS (Netherlands)

    van der Velde, Frank

    2015-01-01

    This article discusses the relation between communication and conceptual grounding. In the brain, neurons, circuits and brain areas are involved in the representation of a concept, grounding it in perception and action. In terms of grounding we can distinguish between communication within the brain

  17. Cryogenic Fluid Management Facility

    Science.gov (United States)

    Eberhardt, R. N.; Bailey, W. J.

    1985-01-01

    The Cryogenic Fluid Management Facility is a reusable test bed which is designed to be carried within the Shuttle cargo bay to investigate the systems and technologies associated with the efficient management of cryogens in space. Cryogenic fluid management consists of the systems and technologies for: (1) liquid storage and supply, including capillary acquisition/expulsion systems which provide single-phase liquid to the user system, (2) both passive and active thermal control systems, and (3) fluid transfer/resupply systems, including transfer lines and receiver tanks. The facility contains a storage and supply tank, a transfer line and a receiver tank, configured to provide low-g verification of fluid and thermal models of cryogenic storage and transfer processes. The facility will provide design data and criteria for future subcritical cryogenic storage and transfer system applications, such as Space Station life support, attitude control, power and fuel depot supply, resupply tankers, external tank (ET) propellant scavenging, and ground-based and space-based orbit transfer vehicles (OTV).

  18. Systems management of facilities agreements

    International Nuclear Information System (INIS)

    Blundell, A.

    1998-01-01

    The various types of facilities agreements, the historical obstacles to implementation of agreement management systems and the new opportunities emerging as industry is beginning to make an effort to overcome these obstacles, are reviewed. Barriers to computerized agreement management systems (lack of consistency, lack of standards, scarcity of appropriate computer software) are discussed. Characteristic features of a model facilities agreement management system and the forces driving the changing attitudes towards such systems (e.g. mergers) are also described

  19. Cleanup Verification Package for the 118-F-2 Burial Ground

    International Nuclear Information System (INIS)

    Capron, J.M.; Anselm, K.A.

    2008-01-01

    This cleanup verification package documents completion of remedial action, sampling activities, and compliance with cleanup criteria for the 118-F-2 Burial Ground. This burial ground, formerly called Solid Waste Burial Ground No. 1, was the original solid waste disposal site for the 100-F Area. Eight trenches contained miscellaneous solid waste from the 105-F Reactor and one trench contained solid waste from the biology facilities

  20. A Bioinformatics Facility for NASA

    Science.gov (United States)

    Schweighofer, Karl; Pohorille, Andrew

    2006-01-01

    Building on an existing prototype, we have fielded a facility with bioinformatics technologies that will help NASA meet its unique requirements for biological research. This facility consists of a cluster of computers capable of performing computationally intensive tasks, software tools, databases and knowledge management systems. Novel computational technologies for analyzing and integrating new biological data and already existing knowledge have been developed. With continued development and support, the facility will fulfill NASA's strategic bioinformatics needs in astrobiology and space exploration. As a demonstration of these capabilities, we will present a detailed analysis of how spaceflight factors impact gene expression in the liver and kidney for mice flown aboard shuttle flight STS-108. We have found that many genes involved in signal transduction, cell cycle, and development respond to changes in microgravity, but that most metabolic pathways appear unchanged.