WorldWideScience

Sample records for computing facility manual

  1. Computer Security at Nuclear Facilities. Reference Manual (Arabic Edition)

    International Nuclear Information System (INIS)

    2011-01-01

    This publication is in the Technical Guidance category of the IAEA Nuclear Security Series and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series was made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants' meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  2. Computer Security at Nuclear Facilities. Reference Manual (Russian Edition)

    International Nuclear Information System (INIS)

    2012-01-01

    This publication is in the Technical Guidance category of the IAEA Nuclear Security Series and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series was made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants' meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  3. Computer Security at Nuclear Facilities. Reference Manual (Chinese Edition)

    International Nuclear Information System (INIS)

    2012-01-01

    This publication is in the Technical Guidance category of the IAEA Nuclear Security Series and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series was made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants' meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  4. NNS computing facility manual P-17 Neutron and Nuclear Science

    International Nuclear Information System (INIS)

    Hoeberling, M.; Nelson, R.O.

    1993-11-01

    This document describes basic policies and provides information and examples on using the computing resources provided by P-17, the Neutron and Nuclear Science (NNS) group. Information on user accounts, getting help, network access, electronic mail, disk drives, tape drives, printers, batch processing software, XSYS hints, PC networking hints, and Mac networking hints is given

  5. A computer code to estimate accidental fire and radioactive airborne releases in nuclear fuel cycle facilities: User's manual for FIRIN

    International Nuclear Information System (INIS)

    Chan, M.K.; Ballinger, M.Y.; Owczarski, P.C.

    1989-02-01

    This manual describes the technical bases and use of the computer code FIRIN. This code was developed to estimate the source term release of smoke and radioactive particles from potential fires in nuclear fuel cycle facilities. FIRIN is a product of a broader study, Fuel Cycle Accident Analysis, which Pacific Northwest Laboratory conducted for the US Nuclear Regulatory Commission. The technical bases of FIRIN consist of a nonradioactive fire source term model, compartment effects modeling, and radioactive source term models. These three elements interact with each other in the code affecting the course of the fire. This report also serves as a complete FIRIN user's manual. Included are the FIRIN code description with methods/algorithms of calculation and subroutines, code operating instructions with input requirements, and output descriptions. 40 refs., 5 figs., 31 tabs
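
    The abstract above describes three model elements that interact during the course of a fire. The short Python sketch below is purely illustrative of how such a coupling can be organized as a time-stepping loop; the model functions and numbers are placeholders invented for illustration, not FIRIN's actual algorithms or data.

        # Illustrative coupling of a fire source term, compartment response, and
        # airborne radioactive release (placeholder models, not FIRIN's).
        def burning_rate(o2_fraction, peak_kg_s=0.05):
            # Placeholder: burning rate falls off as oxygen is depleted.
            return peak_kg_s * max(o2_fraction - 0.10, 0.0) / 0.11

        def simulate_fire(dt=1.0, t_end=600.0, arf=1e-4, specific_activity_bq_kg=5.0e6):
            """Return (smoke released in kg, airborne activity in Bq) for a toy compartment fire."""
            o2, temperature = 0.21, 298.0        # compartment state: O2 fraction, gas temperature (K)
            smoke, airborne = 0.0, 0.0
            for _ in range(int(t_end / dt)):
                m_dot = burning_rate(o2)                                 # nonradioactive fire source term
                smoke += 0.10 * m_dot * dt                               # assumed 10% smoke yield
                o2 = max(o2 - 1.0e-4 * m_dot * dt, 0.0)                  # compartment effect: O2 depletion
                temperature += 50.0 * m_dot * dt                         # compartment effect: heat-up
                airborne += arf * specific_activity_bq_kg * m_dot * dt   # radioactive source term
            return smoke, airborne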

  6. MELCOR computer code manuals

    Energy Technology Data Exchange (ETDEWEB)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L. [Sandia National Labs., Albuquerque, NM (United States); Hodge, S.A.; Hyman, C.R.; Sanders, R.L. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  7. MELCOR computer code manuals

    International Nuclear Information System (INIS)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package

  8. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  9. Manual for operation of the multipurpose thermalhydraulic test facility TOPFLOW (Transient Two Phase Flow Test Facility)

    International Nuclear Information System (INIS)

    Beyer, M.; Carl, H.; Schuetz, H.; Pietruske, H.; Lenk, S.

    2004-07-01

    The Forschungszentrum Rossendorf (FZR) e. V. is constructing a new large-scale test facility, TOPFLOW, for thermal-hydraulic single-effect tests. The acronym stands for Transient Two Phase Flow test facility. It will mainly be used for the investigation of generic and applied steady-state and transient two-phase flow phenomena and for the development and validation of models for computational fluid dynamics (CFD) codes. The manual of the test facility must always be available to the staff in the control room and is binding for the personnel during operation as well as during reconstruction of the facility. (orig./GL)

  10. Manual on internal dose computation and reporting

    International Nuclear Information System (INIS)

    Sawant, Pramilla D.; Sawant, Jyoti V.; Gurg, R.P.; Rudran, Kamala; Gupta, V.K.; Abani, M.C.

    1999-05-01

    Whole body counting and bioassay measurements are carried out for estimation of the radioactivity content in the whole body or in a particular organ/tissue of interest. These measurements are routinely carried out for occupational workers at nuclear power plants, reprocessing plants, radiochemical laboratories, radioisotope laboratories and radioactive waste management facilities to evaluate individual internal dose due to ³H, ⁶⁰Co, ⁹⁰Sr, ¹³⁷Cs, transuranics and other isotopes of interest. This manual is prepared to provide guidelines for computation of intake, committed equivalent dose and committed effective dose from direct measurement of tissue and/or body content of radioactivity for ⁶⁰Co, ¹³¹I and ¹³⁷Cs employing in-vivo monitoring procedures and/or bioassay measurements only. Bioassay measurements are used for determination of ⁹⁰Sr in the body since it is a pure beta emitter. This manual can be used as a ready reckoner for assessment of radiation dose due to internal contamination of occupational workers, as estimated using the above techniques, in the middle and back end of nuclear fuel cycle operations. The methodology used in the computation of dose is based on the principles and biokinetic models given by the ICRP. The recording level recommended in the manual is 0.6 mSv for both routine and special monitoring, which is lower than the 1 mSv recommended by ICRP (ICRP-75, 1997) for individual routine monitoring and 0.66 mSv for special monitoring. The Annual Limit on Intake is taken as equivalent to the annual effective dose limit of 20 mSv prescribed by the Atomic Energy Regulatory Board (AERB), India. (author)
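
    A minimal numerical sketch of the workflow outlined above (measured body content, intake via a retention function, committed effective dose compared with the recording level) is given below. The retention function and dose coefficient are placeholder values chosen for illustration, not the ICRP figures used in the manual.

        # Sketch of intake and committed effective dose estimation (placeholder data).
        RECORDING_LEVEL_SV = 0.6e-3   # 0.6 mSv recording level quoted in the manual
        ANNUAL_LIMIT_SV = 20.0e-3     # annual effective dose limit used to set the ALI

        def intake_and_dose(measured_bq, days_since_intake, retention, dose_coeff_sv_per_bq):
            """retention(t): fraction of a unit intake retained in the body at t days."""
            intake_bq = measured_bq / retention(days_since_intake)
            committed_dose_sv = intake_bq * dose_coeff_sv_per_bq
            return intake_bq, committed_dose_sv

        # Hypothetical Cs-137 whole-body counting case with a toy one-compartment retention model.
        retention_cs137 = lambda t: 0.9 * 2.0 ** (-t / 110.0)
        intake, e50 = intake_and_dose(5.0e4, 30, retention_cs137, 6.7e-9)
        if e50 >= RECORDING_LEVEL_SV:
            print(f"record: intake {intake:.3g} Bq, E(50) = {e50 * 1e3:.3g} mSv")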

  11. TUNL computer facilities

    International Nuclear Information System (INIS)

    Boyd, M.; Edwards, S.E.; Gould, C.R.; Roberson, N.R.; Westerfeldt, C.R.

    1985-01-01

    The XSYS system has been relatively stable during the last year, and most of our efforts have involved routine software maintenance and enhancement of existing XSYS capabilities. Modifications were made in the MBD program GDAP to increase the execution speed in key GDAP routines. A package of routines has been developed to allow communication between the XSYS and the new Wien filter microprocessor. Recently the authors have upgraded their operating system from VMS V3.7 to V4.1. This required numerous modifications to XSYS, mostly in the command procedures. A new reorganized edition of the XSYS manual will be issued shortly. The TUNL High Resolution Laboratory's VAX 11/750 computer has been in operation for its first full year as a replacement for the PRIME 300 computer, which was purchased in 1974 and retired nine months ago. The data acquisition system on the VAX has been in use for the past twelve months performing a number of experiments.

  12. Clean Lead Facility Inventory System user's manual

    International Nuclear Information System (INIS)

    Garcia, J.F.

    1994-12-01

    The purpose of this user's manual is to provide instruction and guidance needed to enter and maintain inventory information for the Clean Lead Facility (CLF), PER-612. Individuals responsible for maintaining and using the system should study and understand the information provided. The user's manual describes how to properly use and maintain the CLF Inventory System. Annual, quarterly, monthly, and current inventory reports may be printed from the Inventory System for reporting purposes. Profile reports of each shipment of lead may also be printed for verification and documentation of lead transactions. The CLF Inventory System was designed on Microsoft Access version 2.0. Similar inventory systems are in use at the Idaho National Engineering Laboratory (INEL) to facilitate site-wide compilations of mixed waste data. The CLF Inventory System was designed for inventorying the clean or non-radioactive contaminated lead stored at the CLF. This data, along with the mixed waste data, will be compiled into the Idaho Mixed Waste Information (IMWI) system for reporting to the Department of Energy Idaho Office, Department of Energy Headquarters, and/or the State of Idaho

  13. Operating manual for the Tower Shielding Facility

    International Nuclear Information System (INIS)

    1985-12-01

    This manual provides information necessary to operate and perform maintenance on the reactor systems and all equipment or systems which can affect their operation or the safety of personnel at the Tower Shielding Facility. The first four chapters consist of introductory and descriptive material of benefit to personnel in training, the qualifications required for training, the responsibilities of the personnel in the organization, and the procedures for reviewing proposed experiments. Chapter 8, Emergency Procedures, is also a necessary part of the indoctrination of personnel. The procedures for operation of the Tower Shielding Reactor (TSR-II), its water cooling system, and the main tower hoists are outlined in Chapters 5, 6, and 7. The Technical Specification surveillance requirements for the TSR-II are summarized in Chapter 9. The maintenance and calibration schedule is spelled out in Chapter 10. The procedures for assembly and disassembly of the TSR-II are outlined in Chapter 11

  14. Operating manual for the critical experiments facility

    International Nuclear Information System (INIS)

    1986-01-01

    The operation of the Critical Experiments Facility (CEF) requires careful attention to procedures in order that all safety precautions are observed. Since an accident could release large amounts of radioactivity, careful operation and strict enforcement of procedures are necessary. To provide for safe operation, detailed procedures have been written for all phases of the operation of this facility. The CEF operating procedures are not to be construed to constitute a part of the Technical Specifications. In the event of any discrepancy between the information given herein and the Technical Specifications, limits set forth in the Technical Specifications apply. All normal and most emergency operation conditions are covered by procedures presented in this manual. These procedures are designed to be followed by the operating personnel. Strict adherence to these procedures is expected for the following reasons. (1) To provide a standard, safe method of performing all operations, the procedures were written by reactor engineers experienced in supervising the operation of reactors and were reviewed by an organization with over 30 years of reactor operating experience. (2) To have an up-to-date description of operating techniques available at all times for reference and review, it is necessary that the procedures be written.

  15. Operating manual for the critical experiments facility

    Energy Technology Data Exchange (ETDEWEB)

    1986-01-01

    The operation of the Critical Experiments Facility (CEF) requires careful attention to procedures in order that all safety precautions are observed. Since an accident could release large amounts of radioactivity, careful operation and strict enforcement of procedures are necessary. To provide for safe operation, detailed procedures have been written for all phases of the operation of this facility. The CEF operating procedures are not to be construed to constitute a part of the Technical Specifications. In the event of any discrepancy between the information given herein and the Technical Specifications, limits set forth in the Technical Specifications apply. All normal and most emergency operation conditions are covered by procedures presented in this manual. These procedures are designed to be followed by the operating personnel. Strict adherence to these procedures is expected for the following reasons. (1) To provide a standard, safe method of performing all operations, the procedures were written by reactor engineers experienced in supervising the operation of reactors and were reviewed by an organization with over 30 years of reactor operating experience. (2) To have an up-to-date description of operating techniques available at all times for reference and review, it is necessary that the procedures be written.

  16. Facility Interface Capability Assessment (FICA) user manual

    International Nuclear Information System (INIS)

    Pope, R.B.; MacDonald, R.R.; Massaglia, J.L.; Williamson, D.A.; Viebrock, J.M.; Mote, N.

    1995-09-01

    The US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management (OCRWM) is responsible for developing the Civilian Radioactive Waste Management System (CRWMS) to accept spent nuclear fuel from commercial facilities. The objective of the Facility Interface Capability Assessment (FICA) project was to assess the capability of each commercial spent nuclear fuel (SNF) storage facility, at which SNF is stored, to handle various SNF shipping casks. The purpose of this report is to describe the FICA computer software and to provide the FICA user with a guide on how to use the FICA system. The FICA computer software consists of two executable programs: the FICA Reactor Report program and the FICA Summary Report program (written in the CA-Clipper version 5.2 development system). The complete FICA software system is contained on either a 3.5 in. (double density) or a 5.25 in. (high density) diskette and consists of the two FICA programs and all the database files (generated using dBASE III). The FICA programs are provided as "stand-alone" systems, and neither the CA-Clipper compiler nor dBASE III is required to run the FICA programs. The steps for installing the FICA software system and executing the FICA programs are described in this report. Instructions are given on how to install the FICA software system onto the hard drive of the PC and how to execute the FICA programs from the FICA subdirectory on the hard drive. Both FICA programs are menu driven, with the up-arrow and down-arrow keys used to move the cursor to the desired selection.

  17. Spent Nuclear Fuel Project Cold Vacuum Drying Facility Operations Manual

    International Nuclear Information System (INIS)

    IRWIN, J.J.

    1999-01-01

    This document provides the Operations Manual for the Cold Vacuum Drying Facility (CVDF). The Manual was developed in conjunction with HNF-553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex B, Cold Vacuum Drying Facility; HNF-SD-SNF-DRD-002, 1999, Cold Vacuum Drying Facility Design Requirements, Rev. 4; and the CVDF Final Design Report. The Operations Manual contains general descriptions of all the process, safety and facility systems in the CVDF, a general CVD operations sequence and references to the CVDF System Design Descriptions (SDDs). This manual has been developed for the SNFP Operations Organization and shall be updated, expanded, and revised in accordance with future design, construction and startup phases of the CVDF until the CVDF final ORR is approved.

  18. SAFE users manual. Volume 4. Computer programs

    International Nuclear Information System (INIS)

    Grady, L.M.

    1983-06-01

    Documentation for the Safeguards Automated Facility Evaluation (SAFE) computer programs is presented. The documentation is in the form of subprogram trees, program abstracts, flowcharts, and listings. Listings are provided on microfiche

  19. Manual for Accessibility: [Conference, Meeting, and Lodging Facilities]. Revised.

    Science.gov (United States)

    National Rehabilitation Association, Alexandria, VA.

    This illustrated manual and survey forms are designed to be used by organizations, hotel and restaurant associations, interested individuals and others as a guide for selecting accessible conference, meeting, and lodging facilities. The guidelines can also be used with existing facilities to identify specific modifications and accommodations. The…

  20. MANUAL OF STANDARDS FOR REHABILITATION CENTERS AND FACILITIES.

    Science.gov (United States)

    Caniff, Charles E.; and others

    A 5-year project to specify standards for rehabilitation centers and facilities resulted in three publications. This manual includes the characteristics and goals of rehabilitation facilities. The standards for organization, services that should be provided, personnel included, records and reports, fiscal management, and the physical plant are…

  1. Spent nuclear fuel project cold vacuum drying facility operations manual

    International Nuclear Information System (INIS)

    IRWIN, J.J.

    1999-01-01

    This document provides the Operations Manual for the Cold Vacuum Drying Facility (CVDF). The Manual was developed in conjunction with HNF-SD-SNF-SAR-002, Safety Analysis Report for the Cold Vacuum Drying Facility, Phase 2, Supporting Installation of Processing Systems (Garvin 1998), and HNF-SD-SNF-DRD-002, 1997, Cold Vacuum Drying Facility Design Requirements, Rev. 3a. The Operations Manual contains general descriptions of all the process, safety and facility systems in the CVDF, a general CVD operations sequence, and has been developed for the SNFP Operations Organization and shall be updated, expanded, and revised in accordance with future design, construction and startup phases of the CVDF until the CVDF final ORR is approved.

  2. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  3. User's manual of a computer code for seismic hazard evaluation for assessing the threat to a facility by fault model. SHEAT-FM

    International Nuclear Information System (INIS)

    Sugino, Hideharu; Onizawa, Kunio; Suzuki, Masahide

    2005-09-01

    To establish a reliability evaluation method for aged structural components, we developed a probabilistic seismic hazard evaluation code, SHEAT-FM (Seismic Hazard Evaluation for Assessing the Threat to a facility site - Fault Model), using a seismic motion prediction method based on a fault model. In order to improve the seismic hazard evaluation, this code takes the latest knowledge in the field of earthquake engineering into account; for example, it incorporates the group delay time of observed records and an update process model of active faults. This report describes the user's guide of SHEAT-FM, including the outline of the seismic hazard evaluation, specification of input data, a sample problem for a model site, system information and the execution method. (author)
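
    As a reading aid, the sketch below shows the general shape of a fault-based probabilistic seismic hazard computation: the annual frequency of exceeding a ground-motion level, summed over fault sources with a lognormal ground-motion model. The functional forms and numbers are assumptions for illustration; they are not the models implemented in SHEAT-FM.

        # Toy fault-model hazard curve: annual frequency of exceeding a PGA level.
        import math

        def p_exceed(pga_g, magnitude, distance_km, sigma_ln=0.6):
            # Placeholder ground-motion model: ln(median PGA in g) vs. magnitude and distance.
            ln_median = -3.5 + 0.8 * magnitude - 1.1 * math.log(distance_km + 10.0)
            z = (math.log(pga_g) - ln_median) / sigma_ln
            return 0.5 * math.erfc(z / math.sqrt(2.0))      # lognormal exceedance probability

        def annual_exceedance(pga_g, faults):
            """faults: iterable of (annual activity rate, characteristic magnitude, distance km)."""
            return sum(rate * p_exceed(pga_g, mag, dist) for rate, mag, dist in faults)

        faults = [(1.0e-3, 7.0, 25.0), (5.0e-3, 6.2, 60.0)]             # hypothetical sources
        hazard_curve = {a: annual_exceedance(a, faults) for a in (0.1, 0.2, 0.4, 0.8)}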

  4. Portable Digital Radiography and Computed Tomography Manual

    Energy Technology Data Exchange (ETDEWEB)

    2007-11-01

    This user manual describes the function and use of the portable digital radiography and computed tomography (DRCT) scanner. The manual gives a general overview of x-ray imaging systems along with a description of the DRCT system. An inventory of all the system components, organized by shipping container, is also included. In addition, detailed, step-by-step procedures are provided for all of the exercises necessary for a novice user to successfully collect digital radiographs and tomographic images of an object, including instructions on system assembly, detector calibration, and system alignment. There is also a short section covering the limited system care and maintenance needs. Descriptions of the included software packages, the DRCT Digital Imager used for system operation, and the DRCT Image Processing Interface used for image viewing and tomographic data reconstruction are given in the appendixes. The appendixes also include a cheat sheet for more experienced users, a listing of known system problems and how to mitigate them, and an inventory check-off sheet suitable for copying and including with the machine for shipment purposes.

  5. AMRITA -- A computational facility

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, J.E. [California Inst. of Tech., CA (US); Quirk, J.J.

    1998-02-23

    Amrita is a software system for automating numerical investigations. The system is driven using its own powerful scripting language, Amrita, which facilitates both the composition and archiving of complete numerical investigations, as distinct from isolated computations. Once archived, an Amrita investigation can later be reproduced by any interested party, and not just the original investigator, for no cost other than the raw CPU time needed to parse the archived script. In fact, this entire lecture can be reconstructed in such a fashion. To do this, the script constructs a number of shock-capturing schemes; runs a series of test problems; generates the plots shown; outputs the LaTeX to typeset the notes; and performs a myriad of behind-the-scenes tasks to glue everything together. Thus Amrita has all the characteristics of an operating system and should not be mistaken for a common-or-garden code.
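
    The key idea described above, scripting a complete investigation so that anyone can re-run it later, can be pictured with the short Python sketch below. This is not Amrita or its scripting language; the function names and archive format are invented for illustration.

        # Sketch of an "archived numerical investigation": run every scheme on every
        # test problem and store enough metadata to reproduce the study later.
        import json

        def run_investigation(schemes, problems, archive_path="investigation.json"):
            results = []
            for scheme_name, solver in schemes.items():          # solver: a callable (assumed)
                for problem_name, problem in problems.items():
                    results.append({"scheme": scheme_name,
                                    "problem": problem_name,
                                    "output": solver(problem)})
            record = {"schemes": sorted(schemes), "problems": sorted(problems), "results": results}
            with open(archive_path, "w") as f:
                json.dump(record, f, indent=2)                   # the archive *is* the investigation
            return record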

  6. The Radiological Safety Analysis Computer Program (RSAC-5) user's manual

    International Nuclear Information System (INIS)

    Wenzel, D.R.

    1994-02-01

    The Radiological Safety Analysis Computer Program (RSAC-5) calculates the consequences of the release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory from either reactor operating history or nuclear criticalities. RSAC-5 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated through the inhalation, immersion, ground surface, and ingestion pathways. RSAC+, a menu-driven companion program to RSAC-5, assists users in creating and running RSAC-5 input files. This user's manual contains the mathematical models and operating instructions for RSAC-5 and RSAC+. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-5 and RSAC+. These programs are designed for users who are familiar with radiological dose assessment methods
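
    The pathway arithmetic sketched in the abstract (release, decay during transport, atmospheric dispersion, then a dose conversion factor) can be illustrated for the inhalation pathway as follows. The single-nuclide, single-pathway treatment and all numbers are placeholders for illustration only and are far simpler than RSAC-5.

        # Simplified inhalation-pathway dose from an atmospheric release (not RSAC-5 itself).
        import math

        def inhalation_dose_sv(release_bq, half_life_s, transit_s, chi_q_s_m3,
                               breathing_m3_s, dcf_sv_per_bq):
            """chi_q_s_m3: atmospheric dispersion factor chi/Q; dcf: inhalation dose coefficient."""
            decayed_bq = release_bq * math.exp(-math.log(2.0) * transit_s / half_life_s)
            time_integrated_conc = decayed_bq * chi_q_s_m3          # Bq*s/m^3 at the receptor
            return time_integrated_conc * breathing_m3_s * dcf_sv_per_bq

        # Hypothetical case: 1 GBq of an 8-day nuclide, 10 min transit, chi/Q = 1e-5 s/m^3,
        # adult breathing rate 3.3e-4 m^3/s, dose coefficient 7e-9 Sv/Bq.
        dose = inhalation_dose_sv(1.0e9, 8 * 86400.0, 600.0, 1.0e-5, 3.3e-4, 7.0e-9)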

  7. Computer Security at Nuclear Facilities

    International Nuclear Information System (INIS)

    Cavina, A.

    2013-01-01

    This series of slides presents the IAEA policy concerning the development of recommendations and guidelines for computer security at nuclear facilities. A document of the Nuclear Security Series dedicated to this issue is in the final stage prior to publication and will be the first IAEA document specifically addressing computer security. This document was necessary for three main reasons: first, not all national infrastructures have recognized and standardized computer security; second, existing international guidance is not industry specific and fails to capture some of the key issues; and third, the presence of more or less connected digital systems is increasing in the design of nuclear power plants. The security of computer systems must be based on a graded approach: the assignment of computer systems to different levels and zones should be based on their relevance to safety and security, and the risk assessment process should be allowed to feed back into and influence the graded approach.
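
    A minimal sketch of what such a graded assignment could look like in practice is given below; the level names, zone labels, and decision rule are assumptions made for illustration and are not taken from the IAEA guidance.

        # Toy graded approach: place a system in a security level/zone according to its
        # relevance to safety and to security (higher relevance -> more restrictive level).
        RANK = {"high": 3, "medium": 2, "low": 1, "none": 0}
        LEVELS = {3: "Level 1 (most restrictive zone)",
                  2: "Level 2 (restricted zone)",
                  1: "Level 3 (controlled zone)",
                  0: "Level 4 (general business zone)"}

        def assign_level(safety_relevance, security_relevance):
            return LEVELS[max(RANK[safety_relevance], RANK[security_relevance])]

        print(assign_level("high", "medium"))   # e.g. a protection-system computer
        print(assign_level("none", "low"))      # e.g. an office workstation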

  8. Computer-Aided Facilities Management Systems (CAFM).

    Science.gov (United States)

    Cyros, Kreon L.

    Computer-aided facilities management (CAFM) refers to a collection of software used with increasing frequency by facilities managers. The six major CAFM components are discussed with respect to their usefulness and popularity in facilities management applications: (1) computer-aided design; (2) computer-aided engineering; (3) decision support…

  9. Health physics manual of good practices for accelerator facilities

    International Nuclear Information System (INIS)

    Casey, W.R.; Miller, A.J.; McCaslin, J.B.; Coulson, L.V.

    1988-04-01

    It is hoped that this manual will serve both as a teaching aid and as a useful adjunct for program development. In the context of application, this manual addresses good practices that should be observed by management, staff, and designers, since the achievement of a good radiation program involves a combined effort. Ultimately, radiation safety and good work practices become the personal responsibility of the individual. The practices presented in this manual are not to be construed as mandatory; rather, they are to be used as appropriate for the specific case in the interest of radiation safety. As experience is accrued and new data are obtained in the application of this document, ONS will update the guidance to assure that at any given time it reflects optimum performance consistent with current technology and practice. The intent of this guide therefore is to: define common health physics problems at accelerators; recommend suitable methods of identifying, evaluating, and managing accelerator health physics problems; set out the established safety practices at DOE accelerators that have been arrived at by consensus and, where consensus has not yet been reached, give examples of safe practices; introduce the technical literature in the accelerator health physics field; and supplement the regulatory documents listed in Appendix D. Many accelerator health physics problems are no different from those at other kinds of facilities, e.g., ALARA philosophy, instrument calibration, etc. These problems are touched on very lightly or not at all. Similarly, this document does not cover other hazards such as electrical shock, toxic materials, etc. This does not in any way imply that these problems are not serious. 160 refs.

  10. Introduction to Computing: Lab Manual. Faculty Guide [and] Student Guide.

    Science.gov (United States)

    Frasca, Joseph W.

    This lab manual is designed to accompany a college course introducing students to computing. The exercises are designed to be completed by the average student in a supervised 2-hour block of time at a computer lab over 15 weeks. The intent of each lab session is to introduce a topic and have the student feel comfortable with the use of the machine…

  11. Sandia National Laboratories Facilities Management and Operations Center Design Standards Manual

    Energy Technology Data Exchange (ETDEWEB)

    Fattor, Steven [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2014-06-01

    The manual contains general requirements that apply to nonnuclear and nonexplosive facilities. For design and construction requirements for modifications to nuclear or explosive facilities, see the project-specific design requirements noted in the Design Criteria.

  12. 2015 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  13. 2014 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, James R. [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  14. Computer Processing 10-20-30. Teacher's Manual. Senior High School Teacher Resource Manual.

    Science.gov (United States)

    Fisher, Mel; Lautt, Ray

    Designed to help teachers meet the program objectives for the computer processing curriculum for senior high schools in the province of Alberta, Canada, this resource manual includes the following sections: (1) program objectives; (2) a flowchart of curriculum modules; (3) suggestions for short- and long-range planning; (4) sample lesson plans;…

  15. Citham-2 computer code-User manual

    International Nuclear Information System (INIS)

    Batista, J.L.

    1984-01-01

    The procedures and the input data for the Citham-2 computer code are described. Citham-2 is a subroutine that modifies nuclide concentrations to take their burnup into account and prepares a cross-section library in 2, 3 or 4 energy groups to be used by the CITATION program. (E.G.)

  16. The Harwell TAILS computer program user's manual

    International Nuclear Information System (INIS)

    Rouse, K.D.; Cooper, M.J.

    1980-11-01

    The Harwell TAILS computer program is a versatile program for crystal structure refinement through the analysis of neutron or X-ray diffraction data from single crystals or powders. The main features of the program are described and details are given of the data input and output specifications. (author)

  17. User's manual for computer program BASEPLOT

    Science.gov (United States)

    Sanders, Curtis L.

    2002-01-01

    The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.
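
    A compact sketch of the stage-discharge arithmetic that underlies such rating-curve and V-shift plots is given below; the log-log interpolation and the way the shift and datum correction are applied are a generic illustration, not code taken from BASEPLOT.

        # Apply a datum correction and shift to a gauge height, then interpolate a
        # stage-discharge rating on log-log axes (generic illustration, not BASEPLOT).
        import math

        def rated_discharge(stage_ft, rating, shift_ft=0.0, datum_corr_ft=0.0, gzf_ft=0.0):
            """rating: list of (gauge height ft, discharge cfs) pairs sorted by height;
            gzf_ft is the gauge height of zero flow."""
            head = stage_ft + datum_corr_ft + shift_ft - gzf_ft
            for (h1, q1), (h2, q2) in zip(rating, rating[1:]):
                e1, e2 = h1 - gzf_ft, h2 - gzf_ft
                if e1 <= head <= e2:
                    frac = (math.log(head) - math.log(e1)) / (math.log(e2) - math.log(e1))
                    return math.exp(math.log(q1) + frac * (math.log(q2) - math.log(q1)))
            raise ValueError("stage is outside the rating table")

        rating = [(2.0, 15.0), (4.0, 120.0), (8.0, 900.0)]       # hypothetical rating table
        q = rated_discharge(5.3, rating, shift_ft=-0.12, gzf_ft=1.0)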

  18. User manual of FRAPCON-I computer code

    International Nuclear Information System (INIS)

    Chia, C.T.

    1985-11-01

    This manual describes the use of the FRAPCON-I code as implemented by the Reactor Department of the Brazilian CNEN, which converted the code from IBM FORTRAN to FORTRAN 77 on a Honeywell Bull computer. The FRAPCON-I code describes the steady-state behaviour of fuel rods of PWR-type reactors during long burnup periods. (M.C.K.)

  19. Conducting Computer Security Assessments at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    Computer security is increasingly recognized as a key component in nuclear security. As technology advances, it is anticipated that computer and computing systems will be used to an even greater degree in all aspects of plant operations including safety and security systems. A rigorous and comprehensive assessment process can assist in strengthening the effectiveness of the computer security programme. This publication outlines a methodology for conducting computer security assessments at nuclear facilities. The methodology can likewise be easily adapted to provide assessments at facilities with other radioactive materials

  20. Study of developing nuclear fabrication facility's integrated emergency response manual

    International Nuclear Information System (INIS)

    Kim, Taeh Yeong; Cho, Nam Chan; Han, Seung Hoon; Moon, Jong Han; Lee, Jin Hang; Min, Guem Young; Han, Ji Ah

    2016-01-01

    The public has begun to pay attention to emergency management, and a consensus has emerged that the national emergency management system should be raised to the level of advanced countries. In this climate, the manual is considered a key factor in preventing accidents and securing business continuity. We therefore first define the crises that could occur at KEPCO Nuclear Fuel (hereinafter KNF) and prepare a 'Reaction List' for each crisis situation from an information-design point of view. To achieve this, we analyze the crisis response manuals of several countries, derive their components, and indicate duties and roles from the information-design point of view. From this, we suggest a guideline for producing an 'Integrated Emergency Response Manual (IERM)'. The manuals used previously had several problems: they were difficult to apply at the site and difficult to use for delivering information. To address these problems, we examined the manual elements from the information-design point of view and, as a result, developed an administrative manual. This manual could nevertheless be regarded as fragmentary, because it is confined to specific agencies/organizations and disaster types.

  1. Oak Ridge Leadership Computing Facility (OLCF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Oak Ridge Leadership Computing Facility (OLCF) was established at Oak Ridge National Laboratory in 2004 with the mission of standing up a supercomputer 100 times...

  2. Computing facility at SSC for detectors

    International Nuclear Information System (INIS)

    Leibold, P.; Scipiono, B.

    1990-01-01

    The RISC-based distributed computing facility for detector simulation being developed at the SSC Laboratory is described. The first phase of this facility is scheduled for completion in early 1991. Included are the status of the project, an overview of the concepts used to model and define the system architecture, networking capabilities for user access, plans for support of physics codes, and related topics concerning the implementation of this facility.

  3. Operation and Maintenance Manual for the Central Facilities Area Sewage Treatment Plant

    Energy Technology Data Exchange (ETDEWEB)

    Norm Stanley

    2011-02-01

    This Operation and Maintenance Manual lists operator and management responsibilities, permit standards, general operating procedures, maintenance requirements and monitoring methods for the Sewage Treatment Plant at the Central Facilities Area at the Idaho National Laboratory. The manual is required by the Municipal Wastewater Reuse Permit (LA-000141-03) for the sewage treatment plant.

  4. Spent Nuclear Fuel (SNF) Cold Vacuum Drying (CVD) Facility Operations Manual; FINAL

    International Nuclear Information System (INIS)

    IRWIN, J.J.

    1999-01-01

    This document provides the Operations Manual for the Cold Vacuum Drying Facility (CVDF). The Manual was developed in conjunction with HNF-553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex B, Cold Vacuum Drying Facility; HNF-SD-SNF-DRD-002, 1999, Cold Vacuum Drying Facility Design Requirements, Rev. 4; and the CVDF Final Design Report. The Operations Manual contains general descriptions of all the process, safety and facility systems in the CVDF, a general CVD operations sequence and references to the CVDF System Design Descriptions (SDDs). This manual has been developed for the SNFP Operations Organization and shall be updated, expanded, and revised in accordance with future design, construction and startup phases of the CVDF until the CVDF final ORR is approved.

  5. Spent Nuclear Fuel (SNF) Cold Vacuum Drying (CVD) Facility Operations Manual

    Energy Technology Data Exchange (ETDEWEB)

    IRWIN, J.J.

    1999-07-02

    This document provides the Operations Manual for the Cold Vacuum Drying Facility (CVDF). The Manual was developed in conjunction with HNF-553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex B, Cold Vacuum Drying Facility; HNF-SD-SNF-DRD-002, 1999, Cold Vacuum Drying Facility Design Requirements, Rev. 4; and the CVDF Final Design Report. The Operations Manual contains general descriptions of all the process, safety and facility systems in the CVDF, a general CVD operations sequence and references to the CVDF System Design Descriptions (SDDs). This manual has been developed for the SNFP Operations Organization and shall be updated, expanded, and revised in accordance with future design, construction and startup phases of the CVDF until the CVDF final ORR is approved.

  6. 2016 Annual Report - Argonne Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Jim [Argonne National Lab. (ANL), Argonne, IL (United States); Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Cerny, Beth A. [Argonne National Lab. (ANL), Argonne, IL (United States); Coffey, Richard M. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-01-01

    The Argonne Leadership Computing Facility (ALCF) helps researchers solve some of the world’s largest and most complex problems, while also advancing the nation’s efforts to develop future exascale computing systems. This report presents some of the ALCF’s notable achievements in key strategic areas over the past year.

  7. CRYSNET manual. Informal report. [Hardware and software of crystallographic computing network

    Energy Technology Data Exchange (ETDEWEB)

    None,

    1976-07-01

    This manual describes the hardware and software which together make up the crystallographic computing network (CRYSNET). The manual is intended as a users' guide and also provides general information for persons without any experience with the system. CRYSNET is a network of intelligent remote graphics terminals that are used to communicate with the CDC Cyber 70/76 computing system at the Brookhaven National Laboratory (BNL) Central Scientific Computing Facility. Terminals are in active use by four research groups in the field of crystallography. A protein data bank has been established at BNL to store in machine-readable form atomic coordinates and other crystallographic data for macromolecules. The bank currently includes data for more than 20 proteins. This structural information can be accessed at BNL directly by the CRYSNET graphics terminals. More than two years of experience has been accumulated with CRYSNET. During this period, it has been demonstrated that the terminals, which provide access to a large, fast third-generation computer, plus stand-alone interactive graphics capability, are useful for computations in crystallography, and in a variety of other applications as well. The terminal hardware, the actual operations of the terminals, and the operations of the BNL Central Facility are described in some detail, and documentation of the terminal and central-site software is given. (RWR)

  8. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  9. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  10. Spent Nuclear Fuel (SNF) Project Cold Vacuum Drying (CVD) Facility Operations Manual

    Energy Technology Data Exchange (ETDEWEB)

    IRWIN, J.J.

    2000-02-03

    This document provides the Operations Manual for the Cold Vacuum Drying Facility (CVDF). The Manual was developed in conjunction with HNF-SD-SNF-SAR-002, Safety Analysis Report for the Cold Vacuum Drying Facility, Phase 2, Supporting Installation of the Processing Systems (Garvin 1998), and HNF-SD-SNF-DRD-002, 1997, Cold Vacuum Drying Facility Design Requirements, Rev. 3a. The Operations Manual contains general descriptions of all the process, safety and facility systems in the CVDF, a general CVD operations sequence, and has been developed for the spent nuclear fuel project (SNFP) Operations Organization and shall be updated, expanded, and revised in accordance with future design, construction and startup phases of the CVDF until the CVDF final ORR is approved.

  11. Spent Nuclear Fuel (SNF) Project Cold Vacuum Drying (CVD) Facility Operations Manual

    International Nuclear Information System (INIS)

    IRWIN, J.J.

    2000-01-01

    This document provides the Operations Manual for the Cold Vacuum Drying Facility (CVDF). The Manual was developed in conjunction with HNF-SD-SNF-SAR-002, Safety Analysis Report for the Cold Vacuum Drying Facility, Phase 2, Supporting Installation of the Processing Systems (Garvin 1998), and HNF-SD-SNF-DRD-002, 1997, Cold Vacuum Drying Facility Design Requirements, Rev. 3a. The Operations Manual contains general descriptions of all the process, safety and facility systems in the CVDF, a general CVD operations sequence, and has been developed for the spent nuclear fuel project (SNFP) Operations Organization and shall be updated, expanded, and revised in accordance with future design, construction and startup phases of the CVDF until the CVDF final ORR is approved.

  12. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling has advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in the modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans.

  13. Manual vs. computer-assisted sperm analysis: can CASA replace manual assessment of human semen in clinical practice?

    Science.gov (United States)

    Talarczyk-Desole, Joanna; Berger, Anna; Taszarek-Hauke, Grażyna; Hauke, Jan; Pawelczyk, Leszek; Jedrzejczak, Piotr

    2017-01-01

    The aim of the study was to check the quality of a computer-assisted sperm analysis (CASA) system in comparison to the reference manual method, as well as the standardization of computer-assisted semen assessment. The study was conducted between January and June 2015 at the Andrology Laboratory of the Division of Infertility and Reproductive Endocrinology, Poznań University of Medical Sciences, Poland. The study group consisted of 230 men who gave sperm samples for the first time in our center as part of an infertility investigation. The samples underwent manual and computer-assisted assessment of concentration, motility and morphology. A total of 184 samples were examined twice: manually, according to the 2010 WHO recommendations, and with CASA, using the program settings provided by the manufacturer. Additionally, 46 samples underwent two manual analyses and two computer-assisted analyses. Statistically significant differences were found between samples assessed with CASA and manually. In the group of patients where all analyses with each method were performed twice on the same sample, we found no significant differences between the two assessments of the same sample, neither in the samples analyzed manually nor with CASA, although the standard deviation was higher in the CASA group. Our results suggest that computer-assisted sperm analysis requires further improvement for a wider application in clinical practice.

  14. SNL/CA Facilities Management Design Standards Manual

    Energy Technology Data Exchange (ETDEWEB)

    Rabb, David [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Clark, Eva [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2014-12-01

    At Sandia National Laboratories in California (SNL/CA), the design, construction, operation, and maintenance of facilities is guided by industry standards, a graded approach, and the systematic analysis of life cycle benefits received for costs incurred. The design of the physical plant must ensure that the facilities are "fit for use," and provide conditions that effectively, efficiently, and safely support current and future mission needs. In addition, SNL/CA applies sustainable design principles, using an integrated whole-building design approach, from site planning to facility design, construction, and operation to ensure building resource efficiency and the health and productivity of occupants. The safety and health of the workforce and the public, any possible effects on the environment, and compliance with building codes take precedence over project issues, such as performance, cost, and schedule.

  15. Health physics manual of good practices for tritium facilities

    International Nuclear Information System (INIS)

    Blauvelt, R.K.; Deaton, M.R.; Gill, J.T.

    1991-12-01

    The purpose of this document is to provide written guidance defining the generally accepted good practices in use at Department of Energy (DOE) tritium facilities. A "good practice" is an action, policy, or procedure that enhances the radiation protection program at a DOE site. The information selected for inclusion in this document should help readers achieve an understanding of the key radiation protection issues at tritium facilities and provide guidance as to what characterizes excellence from a radiation protection point of view. The ALARA (As Low As Reasonably Achievable) program at DOE sites should be based, in part, on following the good practices that apply to their operations.

  16. Materials and Fuels Complex Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    International Nuclear Information System (INIS)

    Harvego, Lisa; Bennett, Brion

    2011-01-01

    Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Materials and Fuels Complex facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool for developing the radioactive waste management basis.

  17. Materials and Security Consolidation Complex Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    International Nuclear Information System (INIS)

    2011-01-01

    Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Materials and Security Consolidation Center facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool for developing the radioactive waste management basis.

  18. Computational Science at the Argonne Leadership Computing Facility

    Science.gov (United States)

    Romero, Nichols

    2014-03-01

    The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer - Mira, an IBM Blue Gene/Q system - has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.

  19. Shieldings for X-ray radiotherapy facilities calculated by computer

    International Nuclear Information System (INIS)

    Pedrosa, Paulo S.; Farias, Marcos S.; Gavazza, Sergio

    2005-01-01

    This work presents a computer-aided methodology for calculating X-ray shielding in radiotherapy facilities. In Brazil, shielding for X-ray radiotherapy is still calculated on the basis of the NCRP-49 recommendation, which establishes the methodology required to prepare a shielding design. For high energies, where a maze (labyrinth) must be built, NCRP-49 is not very clear, and studies in this area have produced an article proposing a solution to the problem. A user-friendly program was developed in the Delphi programming language that, from manual entry of a basic architectural layout and a few parameters, interprets the geometry and calculates the shielding of the walls, ceiling, and floor of an X-ray radiotherapy facility. As its final product, the program provides a graphical screen showing all the input data, the calculated shielding, and the calculation memory. The program can be applied to practical shielding projects for radiotherapy facilities and can also be used didactically in comparison with NCRP-49.
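
    As a rough illustration of the transmission-factor approach that NCRP-49 describes (and that the Delphi program automates), the sketch below estimates a primary-barrier thickness from the design dose limit, workload, use and occupancy factors, and tenth-value layers. The function name and the numeric values are illustrative assumptions, not the program's actual code or data.

    ```python
    import math

    def primary_barrier_thickness(P, d, W, U, T, tvl1, tvle):
        """NCRP-49-style primary barrier estimate (illustrative sketch only).

        P    : design dose limit beyond the barrier (mSv/week)
        d    : distance from the target to the point of interest (m)
        W    : weekly workload expressed as dose at 1 m (mSv/week)
        U    : use factor (fraction of beam-on time aimed at this barrier)
        T    : occupancy factor of the area beyond the barrier
        tvl1 : first tenth-value layer of the barrier material (m)
        tvle : equilibrium tenth-value layer (m)
        """
        B = P * d ** 2 / (W * U * T)        # required transmission factor
        if B >= 1.0:
            return 0.0                      # no barrier needed
        n = -math.log10(B)                  # number of tenth-value layers
        return tvl1 + max(n - 1.0, 0.0) * tvle

    # Illustrative numbers only (concrete barrier, assumed TVLs):
    print(primary_barrier_thickness(P=0.1, d=5.0, W=1.0e5, U=0.25, T=1.0,
                                    tvl1=0.37, tvle=0.33))
    ```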

  20. Health physics manual of good practices for tritium facilities

    Energy Technology Data Exchange (ETDEWEB)

    Blauvelt, R.K.; Deaton, M.R.; Gill, J.T. [and others]

    1991-12-01

    The purpose of this document is to provide written guidance defining the generally accepted good practices in use at Department of Energy (DOE) tritium facilities. A "good practice" is an action, policy, or procedure that enhances the radiation protection program at a DOE site. The information selected for inclusion in this document should help readers achieve an understanding of the key radiation protection issues at tritium facilities and provide guidance as to what characterizes excellence from a radiation protection point of view. The ALARA (As Low As Reasonably Achievable) program at DOE sites should be based, in part, on following the good practices that apply to their operations.

  1. Operation manual for the INEL on-line mass-separator facility

    International Nuclear Information System (INIS)

    Anderl, R.A.

    1984-06-01

    This report is an operation manual for an on-line mass-separator facility which is located in Building 661 at the Test Reactor Area of the Idaho National Engineering Laboratory. The facility provides mass-separated sources of short-lived fission-product radionuclides whose decay properties can be studied using a variety of nuclear spectroscopic techniques. This facility is unique in that it utilizes the gas-jet technique to transport fission products from a 252Cf source located in a hot cell to the ion source of the mass separator. This document includes the following: (a) a detailed description of the facility, (b) identification of equipment hazards and safety controls, (c) detailed operating procedures for startup, continuous operation and shutdown, (d) operating procedures for the californium hot cell, and (e) an operator's manual for the automated moving tape collector/data acquisition system. 7 references, 16 figures, 8 tables

  2. Computer codes for ventilation in nuclear facilities

    International Nuclear Information System (INIS)

    Mulcey, P.

    1987-01-01

    In this paper the authors present several computer codes, developed in recent years, for ventilation and radiation protection. These codes are used for safety analysis in the design, operation, and dismantling of nuclear facilities. The authors present in particular: the DACC1 code, used for aerosol deposition in the sampling circuits of radiation monitors; the PIAF code, used for modeling complex ventilation systems; and the CLIMAT 6 code, used for optimization of air conditioning systems.

  3. CAISSE (Computer Aided Information System on Solar Energy) technical manual

    Energy Technology Data Exchange (ETDEWEB)

    Cantelon, P E; Beinhauer, F W

    1979-01-01

    The Computer Aided Information System on Solar Energy (CAISSE) was developed to provide the general public with information on solar energy and its potential uses and costs for domestic consumption. CAISSE is an interactive computing system which illustrates solar heating concepts through the use of 35 mm slides, text displays on a screen and a printed report. The user communicates with the computer by responding to questions about his home and heating requirements through a touch sensitive screen. The CAISSE system contains a solar heating simulation model which calculates the heating load capable of being supplied by a solar heating system and uses this information to illustrate installation costs, fuel savings and a 20 year life-cycle analysis of cost and benefits. The system contains several sets of radiation and weather data for Canada and USA. The selection of one of four collector models is based upon the requirements input during the computer session. Optimistic and pessimistic fuel cost forecasts are made for oil, natural gas, electricity, or propane; and the forecasted fuel cost is made the basis of the life cycle cost evaluation for the solar heating application chosen. This manual is organized so that each section describes one major aspect of the use of solar energy systems to provide energy for domestic consumption. The sources of data and technical information and the method of incorporating them into the CAISSE display system are described in the same order as the computer processing. Each section concludes with a list of future developments that could be included to make CAISSE outputs more regionally specific and more useful to designers. 19 refs., 1 tab.
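
    For readers unfamiliar with the 20-year life-cycle comparison the abstract mentions, the following minimal sketch shows the present-worth arithmetic involved: an up-front installation cost plus a discounted stream of escalating annual fuel purchases. The rates and costs are assumed values, not CAISSE data or the CAISSE simulation model.

    ```python
    def life_cycle_cost(install_cost, annual_fuel_cost, fuel_escalation,
                        discount_rate, years=20):
        """Present-worth life-cycle cost of a heating option (sketch only)."""
        pw_fuel = sum(annual_fuel_cost * (1 + fuel_escalation) ** y
                      / (1 + discount_rate) ** y
                      for y in range(1, years + 1))
        return install_cost + pw_fuel

    # Hypothetical comparison: conventional heating vs. solar with backup fuel.
    conventional = life_cycle_cost(install_cost=2000, annual_fuel_cost=1200,
                                   fuel_escalation=0.08, discount_rate=0.06)
    solar = life_cycle_cost(install_cost=12000, annual_fuel_cost=400,
                            fuel_escalation=0.08, discount_rate=0.06)
    print(round(conventional), round(solar))
    ```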

  4. Manual of a suite of computer codes, EXPRESS (EXact PREparedness Supporting System)

    International Nuclear Information System (INIS)

    Chino, Masamichi

    1992-06-01

    The emergency response supporting system EXPRESS (EXact PREparedness Supporting System) has been constructed at JAERI for low-cost engineering workstations running under UNIX. The purpose of the system is real-time prediction of the areas affected by radioactive material discharged into the atmosphere from nuclear facilities. The computational models in EXPRESS are the mass-consistent wind field model EXPRESS-I and the particle dispersion model EXPRESS-II for atmospheric dispersion. To attain quick response even when the codes are run on a small-scale computer, a high-speed iteration method, MILUCR (Modified Incomplete Linear Unitary Conjugate Residual), is applied to EXPRESS-I and a kernel density method to EXPRESS-II. This manual describes the model configurations, code structures, related files, namelists, and sample outputs of EXPRESS-I and -II. (author)
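
    To make the kernel density step concrete, the sketch below turns a cloud of dispersion-model particles into a gridded concentration field with a Gaussian kernel. It illustrates the general technique only and is an assumption-laden stand-in for, not a transcription of, the EXPRESS-II implementation.

    ```python
    import numpy as np

    def kernel_concentration(xp, yp, q, xg, yg, h):
        """Gaussian kernel density estimate of a 2-D concentration field.

        xp, yp : particle coordinates (1-D arrays)
        q      : activity carried by each particle
        xg, yg : grid coordinates (2-D arrays from np.meshgrid)
        h      : kernel bandwidth (same length unit as the coordinates)
        """
        conc = np.zeros_like(xg, dtype=float)
        for x, y, w in zip(xp, yp, q):
            r2 = (xg - x) ** 2 + (yg - y) ** 2
            conc += w * np.exp(-r2 / (2.0 * h ** 2)) / (2.0 * np.pi * h ** 2)
        return conc

    # Hypothetical usage: two particles of 1 MBq each on a 1 km square grid.
    xg, yg = np.meshgrid(np.linspace(0, 1000, 51), np.linspace(0, 1000, 51))
    c = kernel_concentration(np.array([100.0, 120.0]), np.array([500.0, 520.0]),
                             np.array([1.0e6, 1.0e6]), xg, yg, h=50.0)
    print(c.max())
    ```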

  5. User's manual for the NEFTRAN II computer code

    International Nuclear Information System (INIS)

    Olague, N.E.; Campbell, J.E.; Leigh, C.D.; Longsine, D.E.

    1991-02-01

    This document describes the NEFTRAN II (NEtwork Flow and TRANsport in Time-Dependent Velocity Fields) computer code and is intended to provide the reader with sufficient information to use the code. NEFTRAN II was developed as part of a performance assessment methodology for storage of high-level nuclear waste in unsaturated, welded tuff. NEFTRAN II is a successor to the NEFTRAN and NWFT/DVM computer codes and contains several new capabilities. These capabilities include: (1) the ability to input pore velocities directly to the transport model and bypass the network fluid flow model, (2) the ability to transport radionuclides in time-dependent velocity fields, (3) the ability to account for the effect of time-dependent saturation changes on the retardation factor, and (4) the ability to account for time-dependent flow rates through the source regime. In addition to these changes, the input to NEFTRAN II has been modified to be more convenient for the user. This document is divided into four main sections consisting of (1) a description of all the models contained in the code, (2) a description of the program and subprograms in the code, (3) a data input guide and (4) verification and sample problems. Although NEFTRAN II is the fourth generation code, this document is a complete description of the code and reference to past user's manuals should not be necessary. 19 refs., 33 figs., 25 tabs
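
    The effect of saturation changes on the retardation factor, noted in capability (3), can be illustrated with the standard linear-sorption relation in which the moisture content is porosity times saturation. The sketch below uses that generic textbook relation with assumed parameter values; it is not NEFTRAN II source code.

    ```python
    def retardation_factor(bulk_density, kd, porosity, saturation):
        """Linear-sorption retardation factor, R = 1 + rho_b*Kd/theta, with
        theta = porosity * saturation, so a drop in saturation raises R.

        bulk_density : dry bulk density (g/cm^3)
        kd           : sorption distribution coefficient (mL/g)
        """
        theta = porosity * saturation       # volumetric moisture content
        return 1.0 + bulk_density * kd / theta

    def retarded_travel_time(path_length, pore_velocity, R):
        """Travel time of a sorbing radionuclide along one network leg."""
        return path_length * R / pore_velocity

    # Assumed values: tuff-like material, 100 m leg, 1 cm/yr pore velocity.
    R = retardation_factor(bulk_density=1.6, kd=0.5, porosity=0.3, saturation=0.8)
    print(R, retarded_travel_time(100.0, 0.01, R), "years")
    ```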

  6. Health physics manual of good practices for plutonium facilities. [Contains glossary

    Energy Technology Data Exchange (ETDEWEB)

    Brackenbush, L.W.; Heid, K.R.; Herrington, W.N.; Kenoyer, J.L.; Munson, L.F.; Munson, L.H.; Selby, J.M.; Soldat, K.L.; Stoetzel, G.A.; Traub, R.J.

    1988-05-01

    This manual consists of six sections: Properties of Plutonium, Siting of Plutonium Facilities, Facility Design, Radiation Protection, Emergency Preparedness, and Decontamination and Decommissioning. While not the final authority, the manual is an assemblage of information, rules of thumb, regulations, and good practices to assist those who are intimately involved in plutonium operations. An in-depth understanding of the nuclear, physical, chemical, and biological properties of plutonium is important in establishing a viable radiation protection and control program at a plutonium facility. These properties of plutonium provide the basis and perspective necessary for appreciating the quality of control needed in handling and processing the material. Guidance in selecting the location of a new plutonium facility may not be directly useful to most readers. However, it provides a perspective for the development and implementation of the environmental surveillance program and the in-plant controls required to ensure that the facility is and remains a good neighbor. The criteria, guidance, and good practices for the design of a plutonium facility are also applicable to the operation and modification of existing facilities. The design activity provides many opportunities for implementation of features to promote more effective protection and control. The application of "as low as reasonably achievable" (ALARA) principles and optimization analyses are generally most cost-effective during the design phase. 335 refs., 8 figs., 20 tabs.

  7. Oak Ridge Leadership Computing Facility Position Paper

    Energy Technology Data Exchange (ETDEWEB)

    Oral, H Sarp [ORNL; Hill, Jason J [ORNL; Thach, Kevin G [ORNL; Podhorszki, Norbert [ORNL; Klasky, Scott A [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL

    2011-01-01

    This paper discusses the business, administration, reliability, and usability aspects of storage systems at the Oak Ridge Leadership Computing Facility (OLCF). The OLCF has developed key competencies in the architecture and administration of large-scale Lustre deployments as well as HPSS archival systems. Additionally, as these systems are architected, deployed, and expanded over time, reliability and availability factors are a primary driver. This paper focuses on the implementation of the Spider parallel Lustre file system as well as the implementation of the HPSS archive at the OLCF.

  8. Computer program user's manual for FIREFINDER digital topographic data verification library dubbing system

    Science.gov (United States)

    Ceres, M.; Heselton, L. R., III

    1981-11-01

    This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS), and will assist in the maintenance of these programs. The manual contains detailed flow diagrams and associated descriptions for each computer program routine and subroutine. Complete computer program listings are also included. This information should be used when changes are made in the computer programs. The operating system has been designed to minimize operator intervention.

  9. Training manual for process operation and management of radioactive waste treatment facility

    Energy Technology Data Exchange (ETDEWEB)

    Shon, J. S.; Kim, K. J.; Ahn, S. J. [and others]

    2004-12-01

    The Radioactive Waste Treatment Facility (RWTF) has been operating for the safe and effective treatment of radioactive wastes generated at the Korea Atomic Energy Research Institute (KAERI). The RWTF comprises evaporation, bituminization, and solar evaporation processes for liquid waste, a solid waste treatment process, and a laundry process. As at other radioactive waste treatment facilities abroad, emergency situations such as fire or overflow of liquid waste can occur during operation and lead to the spread of radioactive contamination, so clear and definite operating procedures are necessary for the safe operation of the facility. This manual serves as easy and concise training material for new employees and for workers dispatched from service agencies. In particular, if an emergency occurs during operation, everyone working in the facility can quickly shut down the facility by following this procedure.

  10. Training manual for process operation and management of radioactive waste treatment facility

    International Nuclear Information System (INIS)

    Shon, J. S.; Kim, K. J.; Ahn, S. J.

    2004-12-01

    The Radioactive Waste Treatment Facility (RWTF) has been operating for the safe and effective treatment of radioactive wastes generated at the Korea Atomic Energy Research Institute (KAERI). The RWTF comprises evaporation, bituminization, and solar evaporation processes for liquid waste, a solid waste treatment process, and a laundry process. As at other radioactive waste treatment facilities abroad, emergency situations such as fire or overflow of liquid waste can occur during operation and lead to the spread of radioactive contamination, so clear and definite operating procedures are necessary for the safe operation of the facility. This manual serves as easy and concise training material for new employees and for workers dispatched from service agencies. In particular, if an emergency occurs during operation, everyone working in the facility can quickly shut down the facility by following this procedure.

  11. Computer modeling of commercial refrigerated warehouse facilities

    International Nuclear Information System (INIS)

    Nicoulin, C.V.; Jacobs, P.C.; Tory, S.

    1997-01-01

    The use of computer models to simulate the energy performance of large commercial refrigeration systems typically found in food processing facilities is an area of engineering practice that has seen little development to date. Current techniques employed in predicting energy consumption by such systems have focused on temperature bin methods of analysis. Existing simulation tools such as DOE2 are designed to model commercial buildings and grocery store refrigeration systems. The HVAC and refrigeration system performance models in these simulation tools model equipment common to commercial buildings and groceries, and respond to energy-efficiency measures likely to be applied to these building types. The applicability of traditional building energy simulation tools to model refrigerated warehouse performance and analyze energy-saving options is therefore limited. The paper presents the results of modeling work undertaken to evaluate energy savings resulting from incentives offered by a California utility to its Refrigerated Warehouse Program participants. The TRNSYS general-purpose transient simulation model was used to predict facility performance and estimate program savings. Custom TRNSYS components were developed to address modeling issues specific to refrigerated warehouse systems, including warehouse loading door infiltration calculations, an evaporator model, single-stage and multi-stage compressor models, evaporative condenser models, and defrost energy requirements. The main focus of the paper is on the modeling approach. The results from the computer simulations, along with overall program impact evaluation results, are also presented.
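
    As a simple illustration of the loading-door infiltration term mentioned above, the sketch below uses a generic steady air-exchange approximation: the volumetric exchange through the opening times the moist-air enthalpy difference. It is a textbook-style placeholder with assumed inputs, not the custom TRNSYS component developed for the study.

    ```python
    def door_infiltration_load(door_area, exchange_velocity, open_fraction,
                               rho_out, h_out, h_in):
        """Approximate infiltration load through a loading door (W).

        door_area         : door opening area (m^2)
        exchange_velocity : mean air exchange velocity through the opening (m/s)
        open_fraction     : fraction of time the door stands open
        rho_out           : outside air density (kg/m^3)
        h_out, h_in       : outside / inside moist-air enthalpies (J/kg)
        """
        volume_flow = door_area * exchange_velocity * open_fraction   # m^3/s
        return volume_flow * rho_out * (h_out - h_in)                 # W

    # Assumed conditions: 6 m^2 door, open 10% of the time, warm humid outside air.
    print(door_infiltration_load(6.0, 0.5, 0.10, 1.2, 55_000.0, 5_000.0))
    ```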

  12. Integration of a browser based operator manual in the system environment of a process computer system

    International Nuclear Information System (INIS)

    Weber, Andreas; Erfle, Robert; Feinkohl, Dirk

    2012-01-01

    The integration of a browser-based operator manual in the system environment of a process computer system optimizes the operating procedure in the control room and enhances safety through faster and error-free access to the manual contents. Several requirements of the authorities have to be fulfilled: the operating manual has to be available as a hard copy, the format has to be true to the original, protection against manipulation has to be provided, the content of the browser-based version and of the hard copy have to be identical, and the display presentation has to be consistent with ergonomic principles. The integration of the on-line manual in the surveillance process computer system provides the operator with the comments relevant to the surveillance signal. The described integration of the on-line manual is an optimization of the operator's everyday job with respect to ergonomics and safety (human performance).

  13. User manual for PACTOLUS: a code for computing power costs

    International Nuclear Information System (INIS)

    Huber, H.D.; Bloomster, C.H.

    1979-02-01

    PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results with the updated version of PACTOLUS. 11 figures, 2 tables
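
    The present-worth balance stated in the last sentences amounts to a levelized busbar cost: the discounted expense stream divided by the discounted generation. The minimal sketch below shows that arithmetic with assumed cash flows; it is not the PACTOLUS Fuel Model or Discounted Cash Flow Model.

    ```python
    def levelized_busbar_cost(annual_costs, annual_energy_kwh, discount_rate):
        """Levelized power cost (mills/kWh) such that the present worth of
        revenues equals the present worth of expenses over the project life.

        annual_costs      : total yearly expenses ($), years 1..N
        annual_energy_kwh : net yearly generation (kWh), years 1..N
        """
        pw_cost = sum(c / (1 + discount_rate) ** (y + 1)
                      for y, c in enumerate(annual_costs))
        pw_energy = sum(e / (1 + discount_rate) ** (y + 1)
                        for y, e in enumerate(annual_energy_kwh))
        return 1000.0 * pw_cost / pw_energy   # dollars -> mills

    # Assumed 30-year plant: $80M/yr total costs, 6 billion kWh/yr generation.
    print(levelized_busbar_cost([80e6] * 30, [6e9] * 30, discount_rate=0.08))
    ```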

  14. User manual for PACTOLUS: a code for computing power costs.

    Energy Technology Data Exchange (ETDEWEB)

    Huber, H.D.; Bloomster, C.H.

    1979-02-01

    PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results. (RWR)

  15. Materials and Fuels Complex Facilities Radioactive Waste Management Basis and DOE Manual 435.1-1 Compliance Tables

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Brion Bennett

    2011-09-01

    Department of Energy Order 435.1, 'Radioactive Waste Management,' along with its associated manual and guidance, requires development and maintenance of a radioactive waste management basis for each radioactive waste management facility, operation, and activity. This document presents a radioactive waste management basis for Idaho National Laboratory's Materials and Fuels Complex facilities that manage radioactive waste. The radioactive waste management basis for a facility comprises existing laboratory-wide and facility-specific documents. Department of Energy Manual 435.1-1, 'Radioactive Waste Management Manual,' facility compliance tables also are presented for the facilities. The tables serve as a tool for developing the radioactive waste management basis.

  16. National Ignition Facility TestController for automated and manual testing

    Energy Technology Data Exchange (ETDEWEB)

    Zielinski, Jason, E-mail: fishler2@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA 94551 (United States)

    2012-12-15

    The Controls and Information Systems (CIS) organization for the National Ignition Facility (NIF) has developed controls, configuration and analysis software applications that combine for several million lines of code. The team delivers updates throughout the year, from major releases containing hundreds of changes to patch releases containing a small number of focused updates. To ensure the quality of each delivery, manual and automated tests are performed using the NIF TestController test infrastructure. The TestController system provides test inventory management, test planning, automated and manual test execution, release testing summaries and results search, all through a web browser interface. As part of the three-stage software testing strategy, the NIF TestController system helps plan, evaluate and track the readiness of each release to the NIF production environment. After several years of use in testing NIF software applications, the TestController's manual testing features have been leveraged for verifying the installation and operation of NIF Target Diagnostic hardware. The TestController recorded its first test results in 2004. Today, the system has recorded the execution of more than 160,000 tests and continues to play a central role in ensuring that NIF hardware and software meet the requirements of a high reliability facility. This paper describes the TestController system and discusses its use in assuring the quality of software delivered to the NIF.

  17. National Ignition Facility TestController for automated and manual testing

    International Nuclear Information System (INIS)

    Zielinski, Jason

    2012-01-01

    The Controls and Information Systems (CIS) organization for the National Ignition Facility (NIF) has developed controls, configuration and analysis software applications that combine for several million lines of code. The team delivers updates throughout the year, from major releases containing hundreds of changes to patch releases containing a small number of focused updates. To ensure the quality of each delivery, manual and automated tests are performed using the NIF TestController test infrastructure. The TestController system provides test inventory management, test planning, automated and manual test execution, release testing summaries and results search, all through a web browser interface. As part of the three-stage software testing strategy, the NIF TestController system helps plan, evaluate and track the readiness of each release to the NIF production environment. After several years of use in testing NIF software applications, the TestController's manual testing features have been leveraged for verifying the installation and operation of NIF Target Diagnostic hardware. The TestController recorded its first test results in 2004. Today, the system has recorded the execution of more than 160,000 tests and continues to play a central role in ensuring that NIF hardware and software meet the requirements of a high reliability facility. This paper describes the TestController system and discusses its use in assuring the quality of software delivered to the NIF.

  18. Computer Security at Nuclear Facilities (French Edition)

    International Nuclear Information System (INIS)

    2013-01-01

    category of the IAEA Nuclear Security Series, and deals with computer security at nuclear facilities. It is based on national experience and practices as well as publications in the fields of computer security and nuclear security. The guidance is provided for consideration by States, competent authorities and operators. The preparation of this publication in the IAEA Nuclear Security Series has been made possible by the contributions of a large number of experts from Member States. An extensive consultation process with all Member States included consultants meetings and open-ended technical meetings. The draft was then circulated to all Member States for 120 days to solicit further comments and suggestions. The comments received from Member States were reviewed and considered in the final version of the publication.

  19. SHEAT for PC. A computer code for probabilistic seismic hazard analysis for personal computer, user's manual

    International Nuclear Information System (INIS)

    Yamada, Hiroyuki; Tsutsumi, Hideaki; Ebisawa, Katsumi; Suzuki, Masahide

    2002-03-01

    The SHEAT code developed at the Japan Atomic Energy Research Institute is for probabilistic seismic hazard analysis, one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. SHEAT was first developed as a large-computer version; a personal computer version was provided in 2001 to improve the operating efficiency and generality of the code. Earthquake hazard analysis, display, and print functions can be performed through the Graphical User Interface. With the SHEAT for PC code, seismic hazard, defined as the annual frequency with which earthquake ground motions of various intensity levels are exceeded at a given site, is calculated in the following two steps, as with the large-computer version. The first is the modeling of earthquake generation around a site: future earthquake generation (locations, magnitudes, and frequencies of postulated earthquakes) is modeled based on historical earthquake records, active fault data, and expert judgment. The second is the calculation of the probabilistic seismic hazard at the site: an earthquake ground motion is calculated for each postulated earthquake using an attenuation model that takes its standard deviation into account, and the seismic hazard at the site is then obtained by summing the exceedance frequencies of ground motions over all the earthquakes. This document is the user's manual of the SHEAT for PC code. It includes: (1) an outline of the code, covering the overall concept, logical process, code structure, data files used, and special characteristics of the code; (2) the functions of the subprograms and the analytical models in them; (3) guidance on input and output data; (4) a sample run result; and (5) an operational manual. (author)
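
    The two-step calculation described above reduces, in its simplest form, to summing each postulated earthquake's annual rate weighted by the probability that its (lognormally scattered) ground motion exceeds a given level. The sketch below shows that summation with assumed rates, median accelerations, and scatter; it is illustrative only and is not the SHEAT code.

    ```python
    import math

    def exceedance_frequency(a_target, sources, log_sigma):
        """Annual frequency that peak ground acceleration exceeds a_target.

        sources   : list of (annual_rate, median_pga) per postulated earthquake,
                    rates in 1/yr and PGA in g (assumed values in the example)
        log_sigma : standard deviation of ln(PGA) in the attenuation model
        """
        def p_exceed(median):
            z = (math.log(a_target) - math.log(median)) / log_sigma
            return 0.5 * math.erfc(z / math.sqrt(2.0))   # 1 - Phi(z)
        return sum(rate * p_exceed(median) for rate, median in sources)

    # Three hypothetical earthquake sources around the site:
    sources = [(1e-3, 0.30), (5e-3, 0.15), (2e-2, 0.05)]
    print(exceedance_frequency(0.25, sources, log_sigma=0.5))
    ```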

  20. Computer Profile of School Facilities Energy Consumption.

    Science.gov (United States)

    Oswalt, Felix E.

    This document outlines a computerized management tool designed to enable building managers to identify energy consumption as related to types and uses of school facilities for the purpose of evaluating and managing the operation, maintenance, modification, and planning of new facilities. Specifically, it is expected that the statistics generated…

  1. Viscous wing theory development. Volume 2: GRUMWING computer program user's manual

    Science.gov (United States)

    Chow, R. R.; Ogilvie, P. L.

    1986-01-01

    This report is a user's manual which describes the operation of the computer program, GRUMWING. The program computes the viscous transonic flow over three-dimensional wings using a boundary layer type viscid-inviscid interaction approach. The inviscid solution is obtained by an approximate factorization (AFZ) method for the full potential equation. The boundary layer solution is based on integral entrainment methods.

  2. Self-study manual for introduction to computational fluid dynamics

    OpenAIRE

    Nabatov, Andrey

    2017-01-01

    Computational Fluid Dynamics (CFD) is the branch of Fluid Mechanics and Computational Physics that plays a significant role in the modern Mechanical Engineering design process, owing to such advantages as the relatively low cost of simulation compared with conducting a real experiment, the opportunity to easily correct the design of a prototype prior to manufacturing the final product, and a wide range of applications: mixing, acoustics, cooling and aerodynamics. This makes CFD particularly and Computation...

  3. Operators manual for a computer controlled impedance measurement system

    Science.gov (United States)

    Gordon, J.

    1987-02-01

    Operating instructions are given for a computer-controlled impedance measurement system based on Hewlett Packard instrumentation. Hardware details, program listings, flowcharts, and a practical application are included.

  4. High Performance Computing Facility Operational Assessment, CY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Bland, Arthur S Buddy [ORNL; Boudwin, Kathlyn J. [ORNL; Hack, James J [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL; Hudson, Douglas L [ORNL

    2012-02-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources. Of these we report the 300 in this review that are consistent with guidance provided. Scientific achievements by OLCF users cut across all range scales from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of Carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source and simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source and simulations on Jaguar to reveal the molecular structure of ligno-cellulosic material used in bioethanol production. This year, Jaguar has been used to do billion-cell CFD calculations to develop shock wave compression turbo machinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbo machinery to learn what efficiencies the traditional steady flow assumption is hiding from designers. Even a 1% improvement in turbine design can save the nation

  5. High Performance Computing Facility Operational Assessment 2015: Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Ashley D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bernholdt, David E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bland, Arthur S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Gary, Jeff D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Hack, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; McNally, Stephen T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Rogers, James H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Smith, Brian E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Straatsma, T. P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Sukumar, Sreenivas Rangan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Thach, Kevin G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Tichenor, Suzy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Vazhkudai, Sudharshan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Wells, Jack C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility

    2016-03-01

    Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others. The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example Titan enabled engineers in Los Angeles and the surrounding region to design and begin building improved critical infrastructure by enabling the highest-resolution Cybershake map for Southern

  6. High Performance Computing Facility Operational Assessment, FY 2010 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; White, Julia C [ORNL

    2010-08-01

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools

  7. The computer code EURDYN-1M (release 2). User's manual

    International Nuclear Information System (INIS)

    1982-01-01

    EURDYN-1M is a finite element computer code developed at J.R.C. Ispra to compute the response of two-dimensional coupled fluid-structure configurations to transient dynamic loading for reactor safety studies. This report gives instructions for preparing input data to EURDYN-1M, release 2, and describes a test problem in order to illustrate both the input and the output of the code

  8. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users’ Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Bradley J Schrader

    2009-03-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.
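
    "Decay and ingrowth during transport" can be pictured with the two-member Bateman relation below: the parent activity decays over the transport delay while the daughter grows in. This is a generic textbook relation with hypothetical inputs, not the RSAC-7 source code or its nuclide library.

    ```python
    import math

    def decay_and_ingrowth(a_parent0, lam_p, lam_d, t):
        """Parent and daughter activities after a delay t, for a two-member
        chain with no initial daughter (requires lam_p != lam_d).

        a_parent0 : initial parent activity (Bq)
        lam_p     : parent decay constant (1/s)
        lam_d     : daughter decay constant (1/s)
        """
        a_parent = a_parent0 * math.exp(-lam_p * t)
        a_daughter = (a_parent0 * lam_d / (lam_d - lam_p)
                      * (math.exp(-lam_p * t) - math.exp(-lam_d * t)))
        return a_parent, a_daughter

    # Hypothetical pair: 8-day parent, 2-day daughter, 1-day transport delay.
    lam_p = math.log(2) / (8 * 86400)
    lam_d = math.log(2) / (2 * 86400)
    print(decay_and_ingrowth(1.0e9, lam_p, lam_d, t=86400))
    ```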

  9. Radiological Safety Analysis Computer (RSAC) Program Version 7.2 Users’ Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Bradley J Schrader

    2010-10-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.2 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.

  10. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users Manual

    International Nuclear Information System (INIS)

    Schrader, Bradley J.

    2009-01-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods

  11. Computer facilities for ISABELLE data handling

    International Nuclear Information System (INIS)

    Kramer, M.A.; Love, W.A.; Miller, R.J.; Zeller, M.

    1977-01-01

    The analysis of data produced by ISABELLE experiments will need a large system of computers. An official group of prospective users and operators of that system should begin planning now. Included in the array will be a substantial computer system at each ISABELLE intersection in use. These systems must include enough computer power to keep experimenters aware of the health of the experiment. This will require at least one very fast sophisticated processor in the system, the size depending on the experiment. Other features of the intersection systems must be a good, high speed graphic display, ability to record data on magnetic tape at 500 to 1000 KB, and a high speed link to a central computer. The operating system software must support multiple interactive users. A substantially larger capacity computer system, shared by the six intersection region experiments, must be available with good turnaround for experimenters while ISABELLE is running. A computer support group will be required to maintain the computer system and to provide and maintain software common to all experiments. Special superfast computing hardware or special function processors constructed with microprocessor circuitry may be necessary both in the data gathering and data processing work. Thus both the local and central processors should be chosen with the possibility of interfacing such devices in mind

  12. Health and Safety Research Division manual for the x-ray facility in Building 2008

    International Nuclear Information System (INIS)

    Stansbury, P.S.

    1977-11-01

    The facility in the east end of Building 2008, ORNL, consists of an x-ray machine and a shielded enclosure. The x-ray machine is of the constant potential type and can be operated continuously at generating potentials up to 125 kVcp with tube currents ranging from 0.1 μA to 10 mA. Both the generating potential and the tube current are highly regulated and stabilized. The machine is installed and operated within a large shielded enclosure. In addition to the x-ray machine and its ancillary equipment, the shielded enclosure contains many of the features of a general chemistry and physics laboratory including work benches, sinks, storage space, and electrical and gas service. This manual contains instructions for the safe operation of the x-ray machine

  13. Program user's manual: cryogen system for the analysis for the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    1979-04-01

    The Mirror Fusion Test Facility being designed and constructed at the Lawrence Livermore Laboratory requires a liquid helium liquefaction, storage, distribution, and recovery system and a liquid nitrogen storage and distribution system. To provide a powerful analytical tool to aid in the design evolution of this system through hardware, a thermodynamic fluid flow model was developed. This model allows the Lawrence Livermore Laboratory to verify that the design meets desired goals and to play "what if" games during the design evolution: for example, if the helium flow rate is changed in the magnet liquid helium flow loop, how does this affect the temperature, fluid quality, and pressure? This manual provides all the information required to run all or portions of this program as desired. In addition, the program is constructed in a modular fashion so changes or modifications can be made easily to keep up with the evolving design.

  14. Integration of small computers in the low budget facility

    International Nuclear Information System (INIS)

    Miller, G.E.; Crofoot, T.A.

    1988-01-01

    Inexpensive computers (PC's) are well within the reach of low budget reactor facilities. It is possible to envisage many uses that will both improve capabilities of existing instrumentation and also assist operators and staff with certain routine tasks. Both of these opportunities are important for survival at facilities with severe budget and staffing limitations. (author)

  15. Computer technology: its potential for industrial energy conservation. A technology applications manual

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-01-01

    Today, computer technology is within the reach of practically any industrial corporation regardless of product size. This manual highlights a few of the many applications of computers in the process industry and provides the technical reader with a basic understanding of computer technology, terminology, and the interactions among the various elements of a process computer system. The manual has been organized to separate process applications and economics from computer technology. Chapter 1 introduces the present status of process computer technology and describes the four major applications - monitoring, analysis, control, and optimization. The basic components of a process computer system also are defined. Energy-saving applications in the four major categories defined in Chapter 1 are discussed in Chapter 2. The economics of process computer systems is the topic of Chapter 3, where the historical trend of process computer system costs is presented. Evaluating a process for the possible implementation of a computer system requires a basic understanding of computer technology as well as familiarity with the potential applications; Chapter 4 provides enough technical information for an evaluation. Computer and associated peripheral costs and the logical sequence of steps in the development of a microprocessor-based process control system are covered in Chapter 5.

  16. Spent fuel management fee methodology and computer code user's manual

    International Nuclear Information System (INIS)

    Engel, R.L.; White, M.K.

    1982-01-01

    The methodology and computer model described here were developed to analyze the cash flows for the federal government taking title to and managing spent nuclear fuel. The methodology has been used by the US Department of Energy (DOE) to estimate the spent fuel disposal fee that will provide full cost recovery. Although the methodology was designed to analyze interim storage followed by spent fuel disposal, it could be used to calculate a fee for reprocessing spent fuel and disposing of the waste. The methodology consists of two phases. The first phase estimates government expenditures for spent fuel management. The second phase determines the fees that will result in revenues such that the government attains full cost recovery assuming various revenue collection philosophies. These two phases are discussed in detail in subsequent sections of this report. Each of the two phases constitute a computer module, called SPADE (SPent fuel Analysis and Disposal Economics) and FEAN (FEe ANalysis), respectively
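
    The fee-setting idea in the second phase can be reduced, in its simplest flat-fee form, to solving for the charge that makes the present worth of fee revenue equal the present worth of projected expenditures. The sketch below shows that balance with assumed cost and fuel-receipt schedules; it is not the SPADE or FEAN module.

    ```python
    def full_cost_recovery_fee(annual_costs, annual_fuel_mtu, discount_rate):
        """Flat per-MTU fee giving full cost recovery in present-worth terms.

        annual_costs    : projected government expenditures ($), years 1..N
        annual_fuel_mtu : spent fuel received each year (metric tons of uranium)
        """
        pw_cost = sum(c / (1 + discount_rate) ** (y + 1)
                      for y, c in enumerate(annual_costs))
        pw_fuel = sum(m / (1 + discount_rate) ** (y + 1)
                      for y, m in enumerate(annual_fuel_mtu))
        return pw_cost / pw_fuel   # $ per MTU

    # Assumed 25-year program: $300M/yr costs, 2000 MTU/yr received.
    print(full_cost_recovery_fee([3.0e8] * 25, [2000.0] * 25, discount_rate=0.07))
    ```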

  17. Sandia National Laboratories Facilities Management and Operations Center Design Standards Manual

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, Timothy L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-09-01

    protection, mechanical, electrical, telecommunications, and security features is expected to ensure compatibility with planned functional equipment and to facilitate constructability. If portions of the design are subcontracted to specialists, delivery of the finished design documents must not be considered complete until the subcontracted portions are also submitted for review. You must, along with support consultants, perform functional analyses and programming in developing design solutions. These solutions must reflect coordination of the competing functional, budgetary, and physical requirements for the project. During design phases, meetings between you and the SNL/NM Project Team to discuss and resolve design issues are required. These meetings are a normal part of the design process. For specific design-review requirements, see the project-specific Design Criteria. In addition to the design requirements described in this manual, instructive information is provided to explain the sustainable building practice goals for design, construction, operation, and maintenance of SNL/NM facilities. Please notify SNL/NM personnel of design best practices not included in this manual, so they can be incorporated in future updates. You must convey all documents describing work to the SNL/NM Project Manager in both hard copy and in an electronic format compatible with the SNL/NM-prescribed CADD and other software packages, and in accordance with a SNL/NM approved standard format. Print all hard copy versions of submitted documents (excluding drawings and renderings) double-sided when practical.

  18. Measurement of mesothelioma on thoracic CT scans: A comparison of manual and computer-assisted techniques

    International Nuclear Information System (INIS)

    Armato, Samuel G. III; Oxnard, Geoffrey R.; MacMahon, Heber; Vogelzang, Nicholas J.; Kindler, Hedy L.; Kocherginsky, Masha; Starkey, Adam

    2004-01-01

    Our purpose in this study was to evaluate the variability of manual mesothelioma tumor thickness measurements in computed tomography (CT) scans and to assess the relative performance of six computerized measurement algorithms. The CT scans of 22 patients with malignant pleural mesothelioma were collected. In each scan, an initial observer identified up to three sites in each of three CT sections at which tumor thickness measurements were to be made. At each site, five observers manually measured tumor thickness through a computer interface. Three observers repeated these measurements during three separate sessions. Inter- and intra-observer variability in the manual measurement of tumor thickness was assessed. Six automated measurement algorithms were developed based on the geometric relationship between a specified measurement site and the automatically extracted lung regions. Computer-generated measurements were compared with manual measurements. The tumor thickness measurements of different observers were highly correlated (r≥0.99); however, the 95% limits of agreement for relative inter-observer difference spanned a range of 30%. Tumor thickness measurements generated by the computer algorithms also correlated highly with the average of observer measurements (r≥0.93). We have developed computerized techniques for the measurement of mesothelioma tumor thickness in CT scans. These techniques achieved varying levels of agreement with measurements made by human observers
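
    The "95% limits of agreement for relative inter-observer difference" reported above are a Bland-Altman-style statistic; the sketch below computes them for hypothetical paired measurements. It illustrates the statistic only and is not the study's analysis code or data.

    ```python
    import numpy as np

    def relative_limits_of_agreement(obs_a, obs_b):
        """95% limits of agreement for the percent difference between two
        observers' tumor-thickness measurements (Bland-Altman style)."""
        a = np.asarray(obs_a, dtype=float)
        b = np.asarray(obs_b, dtype=float)
        rel_diff = 100.0 * (a - b) / ((a + b) / 2.0)    # percent difference
        mean, sd = rel_diff.mean(), rel_diff.std(ddof=1)
        return mean - 1.96 * sd, mean + 1.96 * sd

    # Hypothetical paired thickness measurements (mm):
    print(relative_limits_of_agreement([12.0, 8.5, 20.1, 5.4],
                                       [11.2, 9.0, 18.9, 5.9]))
    ```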

  19. The computer code Eurdyn - 1 M. (Release 1) Part 2: User's Manual

    International Nuclear Information System (INIS)

    Donea, J.; Giuliani, S.

    1979-01-01

    This report is the user's manual for the computer code Eurdyn-1 M developed at the J.R.C. Ispra for use in containment and fuel subassembly analyses for fast reactor safety studies. The input data are defined and a test problem is presented to illustrate both the input and the output of results

  20. Operation and maintenance manual for the temporary septic holding tank at the 100-D remedial action support facility. Revision 2

    International Nuclear Information System (INIS)

    Kelty, G.G.

    1996-10-01

    This manual was prepared to provide detailed information for the operation and maintenance of the sanitary wastewater holding system at the 100-D Remedial Action Support Facility located in the 100-DR-1 Operable Unit at the Hanford Site. This document describes operations, including the type and frequency of required maintenance, and system failure response procedures

  1. Operation and maintenance manual for the temporary septic holding tank at the 100-D remedial action support facility. Revision 1

    International Nuclear Information System (INIS)

    Kelty, G.G.

    1996-09-01

    This manual provides detailed information for the operation and maintenance of the sanitary wastewater holding system at the 100-D Remedial Action Support Facility located in the 100-DR-1 Operable Unit of the Hanford Site. This document describes operations, including the type and frequency of required maintenance, and system failure response procedures

  2. Environmental Compliance and Pollution Prevention Training Manual for Campus-Based Organizations--Operational and Facility Maintenance Personnel.

    Science.gov (United States)

    New York State Dept. of Environmental Conservation, Albany.

    This manual was designed to be used as part of the Workshop on Environmental Compliance and Pollution Prevention for campus-based facilities. It contains basic information on New York state and federal laws, rules, and regulations for protecting the environment. The information presented is a summary with emphasis on those items believed to be…

  3. User's manual for BINIAC: A computer code to translate APET bins

    International Nuclear Information System (INIS)

    Gough, S.T.

    1994-03-01

    This report serves as the user's manual for the FORTRAN code BINIAC. BINIAC is a utility code designed to format the output from the Defense Waste Processing Facility (DWPF) Accident Progression Event Tree (APET) methodology. BINIAC inputs the accident progression bins from the APET methodology, converts the frequency from occurrences per hour to occurrences per year, sorts the progression bins, and converts the individual dimension character codes into facility attributes. Without BINIAC, this process would have to be done manually at great expense of time. BINIAC was written under the quality assurance control of IQ34 QAP IV-1, revision 0, section 4.1.4. Configuration control is established through the use of a proprietor and a cognizant users list
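    The post-processing steps listed above (per-hour to per-year conversion, sorting, and decoding of dimension characters) are generic enough to sketch. The snippet below illustrates that kind of utility only; it is not the BINIAC code, and the bin strings, frequencies, and attribute map are invented for the example:

        # Hypothetical post-processing of accident progression bins: convert
        # frequencies from per-hour to per-year, sort by frequency, and map
        # single-character dimension codes to readable facility attributes.
        HOURS_PER_YEAR = 8760.0  # 365-day year

        # (bin string, frequency in occurrences per hour) -- invented example data
        bins = [("AFC", 2.0e-9), ("BFC", 7.5e-11), ("AGD", 4.1e-10)]

        # One decoding table per dimension position -- also invented
        decode = [
            {"A": "seismic initiator", "B": "fire initiator"},
            {"F": "filtration available", "G": "filtration failed"},
            {"C": "confinement intact", "D": "confinement bypassed"},
        ]

        converted = [(code, freq * HOURS_PER_YEAR) for code, freq in bins]
        converted.sort(key=lambda item: item[1], reverse=True)

        for code, freq_per_year in converted:
            attributes = ", ".join(decode[i][ch] for i, ch in enumerate(code))
            print(f"{code}: {freq_per_year:.2e} /yr  ({attributes})")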

  4. A data acquisition computer for high energy physics applications DAFNE:- hardware manual

    International Nuclear Information System (INIS)

    Barlow, J.; Seller, P.; De-An, W.

    1983-07-01

    A high-performance stand-alone computer system based on the Motorola 68000 microprocessor has been built at the Rutherford Appleton Laboratory. Although the design was strongly influenced by the requirement to provide a compact data acquisition computer for the high energy physics environment, the system is sufficiently general to find applications in a wider area. It provides colour graphics and tape and disc storage together with access to CAMAC systems. This report is the hardware manual of the data acquisition computer, DAFNE (Data Acquisition For Nuclear Experiments), and as such contains a full description of the hardware structure of the computer system. (author)

  5. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL

    2011-08-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and

  6. URSULA2 computer program. Volume 3. User's manual. Final report

    International Nuclear Information System (INIS)

    Singhal, A.K.

    1980-01-01

    This report is intended to provide documentation for the users of the URSULA2 code so that they can appreciate its important features, such as code structure, flow chart, grid notations, coding style, usage of secondary storage, and its interconnection with the input preparation program (Reference H3201/4). Subroutines and subprograms have been divided into four functional groups. The functions of all subroutines have been explained, with particular emphasis on the control subroutine (MAIN) and the data input subroutine (BLOCK DATA). Computations for flow situations similar to the reference case can be performed simply by making alterations in BLOCK DATA. Separate guides for the preparation of input data and for the interpretation of program output have been provided. Furthermore, two appendices, one for the URSULA2 listing and the other for the glossary of FORTRAN variables, are included to make this report self-sufficient

  7. Towards higher reliability of CMS computing facilities

    International Nuclear Information System (INIS)

    Bagliesi, G; Bloom, K; Brew, C; Flix, J; Kreuzer, P; Sciabà, A

    2012-01-01

    The CMS experiment has adopted a computing system where resources are distributed worldwide in more than 50 sites. The operation of the system requires a stable and reliable behaviour of the underlying infrastructure. CMS has established procedures to extensively test all relevant aspects of a site and their capability to sustain the various CMS computing workflows at the required scale. The Site Readiness monitoring infrastructure has been instrumental in understanding how the system as a whole was improving towards LHC operations, measuring the reliability of sites when running CMS activities, and providing sites with the information they need to troubleshoot any problem. This contribution reviews the complete automation of the Site Readiness program, with the description of monitoring tools and their inclusion into the Site Status Board (SSB), the performance checks, the use of tools like HammerCloud, and their impact on improving the overall reliability of the Grid from the point of view of the CMS computing system. These results are used by CMS to select good sites to conduct workflows, in order to maximize workflow efficiency. The performance of the sites against these tests during the first years of LHC running is also reviewed.

  8. National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Van Arsdall, P.J. LLNL

    1998-01-01

    The NIF design team is developing the Integrated Computer Control System (ICCS), which is based on an object-oriented software framework applicable to event-driven control systems. The framework provides an open, extensible architecture that is sufficiently abstract to construct future mission-critical control systems. The ICCS will become operational when the first 8 out of 192 beams are activated in mid 2000. The ICCS consists of 300 front-end processors attached to 60,000 control points coordinated by a supervisory system. Computers running either Solaris or VxWorks are networked over a hybrid configuration of switched fast Ethernet and asynchronous transfer mode (ATM). ATM carries digital motion video from sensors to operator consoles. Supervisory software is constructed by extending the reusable framework components for each specific application. The framework incorporates services for database persistence, system configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. More than twenty collaborating software applications are derived from the common framework. The framework is interoperable among different kinds of computers and functions as a plug-in software bus by leveraging a common object request brokering architecture (CORBA). CORBA transparently distributes the software objects across the network. Because of the pivotal role played, CORBA was tested to ensure adequate performance

  9. Computers in experimental nuclear power facilities

    International Nuclear Information System (INIS)

    Jukl, M.

    1982-01-01

    The CIS 3000 information system, used for monitoring the operating modes of large technological equipment, is described. The CIS system consists of two ADT computers, an external drum store, an analog input side, a bivalent input side, 4 control consoles with monitors and acoustic signalling, a print-out area with typewriters and punching machines, and linear recorders. Various applications of the installed CIS configuration are described, as is the general-purpose program for processing measured values into a protocol. The program operates in the conversational mode. Different processing variants are shown on the display monitor. (M.D.)

  10. Manual on quality assurance for computer software related to the safety of nuclear power plants

    International Nuclear Information System (INIS)

    1988-01-01

    The objective of the Manual is to provide guidance in the assurance of quality of specification, design, maintenance and use of computer software related to items and activities important to safety (hereinafter referred to as safety related) in nuclear power plants. This guidance is consistent with, and supplements, the requirements and recommendations of Quality Assurance for Safety in Nuclear Power Plants: A Code of Practice, 50-C-QA, and related Safety Guides on quality assurance for nuclear power plants. Annex A identifies the IAEA documents referenced in the Manual. The Manual is intended to be of use to all those who, in any way, are involved with software for safety related applications for nuclear power plants, including auditors who may be called upon to audit management systems and product software. Figs

  11. Computer applications for the Fast Flux Test Facility

    International Nuclear Information System (INIS)

    Worth, G.A.; Patterson, J.R.

    1976-01-01

    Computer applications for the FFTF reactor include plant surveillance functions and fuel handling and examination control functions. Plant surveillance systems provide the reactor operator with a selection of over forty continuously updated, formatted displays of correlated data. All data are checked for limits and validity and the operator is advised of any anomaly. Data are also recorded on magnetic tape for historical purposes. The system also provides calculated variables, such as reactor thermal power and anomalous reactivity. Supplementing the basic plant surveillance computer system is a minicomputer system that monitors the reactor cover gas to detect and characterize absorber or fuel pin failures. In addition to plant surveillance functions, computers are used in the FFTF for controlling selected refueling equipment and for post-irradiation fuel pin examination. Four fuel handling or examination systems operate under computer control with manual monitoring and over-ride capability

  12. Brookhaven Reactor Experiment Control Facility, a distributed function computer network

    International Nuclear Information System (INIS)

    Dimmler, D.G.; Greenlaw, N.; Kelley, M.A.; Potter, D.W.; Rankowitz, S.; Stubblefield, F.W.

    1975-11-01

    A computer network for real-time data acquisition, monitoring and control of a series of experiments at the Brookhaven High Flux Beam Reactor has been developed and has been set into routine operation. This reactor experiment control facility presently services nine neutron spectrometers and one x-ray diffractometer. Several additional experiment connections are in progress. The architecture of the facility is based on a distributed function network concept. A statement of implementation and results is presented

  13. VARIABILITY OF MANUAL AND COMPUTERIZED METHODS FOR MEASURING CORONAL VERTEBRAL INCLINATION IN COMPUTED TOMOGRAPHY IMAGES

    Directory of Open Access Journals (Sweden)

    Tomaž Vrtovec

    2015-06-01

    Objective measurement of coronal vertebral inclination (CVI) is of significant importance for evaluating spinal deformities in the coronal plane. The purpose of this study is to systematically analyze and compare manual and computerized measurements of CVI in cross-sectional and volumetric computed tomography (CT) images. Three observers independently measured CVI in 14 CT images of normal and 14 CT images of scoliotic vertebrae by using six manual and two computerized measurements. Manual measurements were obtained in coronal cross-sections by manually identifying the vertebral body corners, which served to measure CVI according to the superior and inferior tangents, left and right tangents, and mid-endplate and mid-wall lines. Computerized measurements were obtained in two dimensions (2D) and in three dimensions (3D) by manually initializing an automated method in vertebral centroids and then searching for the planes of maximal symmetry of vertebral anatomical structures. The mid-endplate lines were the most reproducible and reliable manual measurements (intra- and inter-observer variability of 0.7° and 1.2° standard deviation (SD), respectively). The computerized measurements in 3D were more reproducible and reliable (intra- and inter-observer variability of 0.5° and 0.7° SD, respectively), but were most consistent with the mid-wall lines (2.0° SD and 1.4° mean absolute difference). The manual CVI measurements based on mid-endplate lines and the computerized CVI measurements in 3D resulted in the lowest intra-observer and inter-observer variability; however, the computerized CVI measurements reduce observer interaction.

  14. Survey of computer codes applicable to waste facility performance evaluations

    International Nuclear Information System (INIS)

    Alsharif, M.; Pung, D.L.; Rivera, A.L.; Dole, L.R.

    1988-01-01

    This study is an effort to review existing information that is useful for developing an integrated model for predicting the performance of a radioactive waste facility. A summary description of 162 computer codes is given. The identified computer programs address the performance of waste packages, waste transport and equilibrium geochemistry, hydrological processes in unsaturated and saturated zones, and general waste facility performance assessment. Some programs also deal with thermal analysis, structural analysis, and special purposes. A number of these computer programs are being used by the US Department of Energy, the US Nuclear Regulatory Commission, and their contractors to analyze various aspects of waste package performance. Fifty-five of these codes were identified as being potentially useful in the analysis of low-level radioactive waste facilities located above the water table. The code summaries include authors, identification data, model types, and pertinent references. 14 refs., 5 tabs

  15. COMPUTER ORIENTED FACILITIES OF TEACHING AND INFORMATIVE COMPETENCE

    Directory of Open Access Journals (Sweden)

    Olga M. Naumenko

    2010-09-01

    In the article, the history of views on the tasks of education and of estimates of its effectiveness is considered from the point of view of forming basic, vitally important competences. Opinions on the problem in different countries and international organizations, and the corresponding experience of the Ukrainian system of education, are described. The necessity of forming the informative competence of future teachers is substantiated for the application of computer-oriented teaching facilities in the study of natural-science subjects in pedagogical colleges. Prognostic estimates concerning the development of methods for applying computer-oriented teaching facilities are presented.

  16. Neutronic computational modeling of the ASTRA critical facility using MCNPX

    International Nuclear Information System (INIS)

    Rodriguez, L. P.; Garcia, C. R.; Milian, D.; Milian, E. E.; Brayner, C.

    2015-01-01

    The Pebble Bed Very High Temperature Reactor is considered a prominent candidate among Generation IV nuclear energy systems. Nevertheless, the Pebble Bed Very High Temperature Reactor faces an important challenge due to the insufficient validation of the computer codes currently available for use in its design and safety analysis. In this paper, a detailed IAEA computational benchmark announced in IAEA-TECDOC-1694 in the framework of the Coordinated Research Project 'Evaluation of High Temperature Gas Cooled Reactor (HTGR) Performance' was solved in support of the Generation IV computer code validation effort using the MCNPX ver. 2.6e computational code. IAEA-TECDOC-1694 summarizes a set of four calculational benchmark problems performed at the ASTRA critical facility. The benchmark problems include criticality experiments, control rod worth measurements and reactivity measurements. The ASTRA Critical Facility at the Kurchatov Institute in Moscow was used to simulate the neutronic behavior of nuclear pebble bed reactors. (Author)

  17. Automated computation of femoral angles in dogs from three-dimensional computed tomography reconstructions: Comparison with manual techniques.

    Science.gov (United States)

    Longo, F; Nicetto, T; Banzato, T; Savio, G; Drigo, M; Meneghello, R; Concheri, G; Isola, M

    2018-02-01

    The aim of this ex vivo study was to test a novel three-dimensional (3D) automated computer-aided design (CAD) method (aCAD) for the computation of femoral angles in dogs from 3D reconstructions of computed tomography (CT) images. The repeatability and reproducibility of three methods (manual radiography, manual CT reconstruction and the aCAD method) for the measurement of three femoral angles were evaluated: (1) anatomical lateral distal femoral angle (aLDFA); (2) femoral neck angle (FNA); and (3) femoral torsion angle (FTA). Femoral angles of 22 femurs obtained from 16 cadavers were measured by three blinded observers. Measurements were repeated three times by each observer for each diagnostic technique. Femoral angle measurements were analysed using a mixed effects linear model for repeated measures to determine the levels of intra-observer agreement (repeatability) and inter-observer agreement (reproducibility). Repeatability and reproducibility of measurements using the aCAD method were excellent (intra-class coefficients, ICCs≥0.98) for all three angles assessed. Manual radiography and CT exhibited excellent agreement for the aLDFA measurement (ICCs≥0.90). However, FNA repeatability and reproducibility were poor. Overall, the 3D aCAD method provided the highest repeatability and reproducibility among the tested methodologies. Copyright © 2017 Elsevier Ltd. All rights reserved.
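    Intra-class correlation coefficients of the kind reported above can be computed from a two-way ANOVA decomposition. The sketch below evaluates an absolute-agreement ICC(2,1) by hand from a small, invented subjects-by-raters matrix; it does not reproduce the mixed effects model actually used in the study:

        import numpy as np

        def icc_2_1(data):
            """Absolute-agreement ICC(2,1) from an (n subjects x k raters) array,
            computed from the two-way ANOVA mean squares."""
            n, k = data.shape
            grand = data.mean()
            row_means = data.mean(axis=1)      # per-subject means
            col_means = data.mean(axis=0)      # per-rater means

            ss_rows = k * ((row_means - grand) ** 2).sum()
            ss_cols = n * ((col_means - grand) ** 2).sum()
            ss_total = ((data - grand) ** 2).sum()
            ss_err = ss_total - ss_rows - ss_cols

            msr = ss_rows / (n - 1)             # between-subjects mean square
            msc = ss_cols / (k - 1)             # between-raters mean square
            mse = ss_err / ((n - 1) * (k - 1))  # residual mean square

            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        # Invented example: 6 femurs, each measured by 3 observers (degrees)
        angles = np.array([
            [97.1, 96.8, 97.4],
            [94.0, 94.5, 93.8],
            [99.2, 99.0, 99.5],
            [95.6, 95.9, 95.4],
            [98.3, 98.0, 98.6],
            [96.7, 96.5, 97.0],
        ])
        print(f"ICC(2,1) = {icc_2_1(angles):.3f}")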

  18. The manual of a computer software 'FBR Plant Planning Design Prototype System'

    International Nuclear Information System (INIS)

    2003-10-01

    This is the manual of the computer software 'FBR Plant Planning Design Prototype System', which enables users to conduct case studies of FBR design concepts that deviate from the 'MONJU' design. The calculations proceed simply as the user clicks the displayed buttons, so a step-by-step explanation is not considered necessary. The following pages introduce only particular features of the software, i.e., the interactive screens, the functions of the buttons and the consequences of clicking them, and the quitting procedure. (author)

  19. Centralized computer-based controls of the Nova Laser Facility

    International Nuclear Information System (INIS)

    Krammen, J.

    1985-01-01

    This article introduces the overall architecture of the computer-based Nova Laser Control System and describes its basic components. Use of standard hardware and software components ensures that the system, while specialized and distributed throughout the facility, is adaptable. 9 references, 6 figures

  20. Computer-Assisted School Facility Planning with ONPASS.

    Science.gov (United States)

    Urban Decision Systems, Inc., Los Angeles, CA.

    The analytical capabilities of ONPASS, an on-line computer-aided school facility planning system, are described by its developers. This report describes how, using the Canoga Park-Winnetka-Woodland Hills Planning Area as a test case, the Department of City Planning of the city of Los Angeles employed ONPASS to demonstrate how an on-line system can…

  1. A Manual of Simplified Laboratory Methods for Operators of Wastewater Treatment Facilities.

    Science.gov (United States)

    Westerhold, Arnold F., Ed.; Bennett, Ernest C., Ed.

    This manual is designed to provide the small wastewater treatment plant operator, as well as the new or inexperienced operator, with simplified methods for laboratory analysis of water and wastewater. It is emphasized that this manual is not a replacement for standard methods but a guide for plants with insufficient equipment to perform analyses…

  2. Study of developing nuclear fabrication facility's integrated emergency response manual

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Taeh Yeong; Cho, Nam Chan; Han, Seung Hoon; Moon, Jong Han; Lee, Jin Hang [KEPCO, Daejeon (Korea, Republic of); Min, Guem Young; Han, Ji Ah [Dongguk Univ., Daejeon (Korea, Republic of)

    2016-05-15

    The public has begun to pay attention to emergency management, and a consensus has emerged that the level of the emergency management system should be raised to that of advanced countries. In this climate, a manual is considered a key factor in preventing accidents and securing business continuity. We therefore first define the possible crises at KEPCO Nuclear Fuel (hereinafter KNF) and prepare a 'Reaction List' for each crisis situation from an information-design point of view. To achieve this, we analyze the crisis response manuals of several countries and derive their components, duties, and roles from the information-design perspective. From this, we suggest a guideline for preparing an 'Integrated Emergency Response Manual (IERM)'. The manual previously in use had several problems: it was difficult to apply at the site and difficult to use for delivering information. To address these problems, we identified the required manual elements from the information-design point of view and developed an administrative manual. This manual may still be considered fragmentary, however, because it is confined to a few specific agencies, organizations, and disaster types.

  3. GASFLOW: A Computational Fluid Dynamics Code for Gases, Aerosols, and Combustion, Volume 2: User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, B. D.; Mueller, C.; Necker, G. A.; Travis, J. R.; Spore, J. W.; Lam, K. L.; Royl, P.; Wilson, T. L.

    1998-10-01

    Los Alamos National Laboratory (LANL) and Forschungszentrum Karlsruhe (FzK) are developing GASFLOW, a three-dimensional (3D) fluid dynamics field code as a best-estimate tool to characterize local phenomena within a flow field. Examples of 3D phenomena include circulation patterns; flow stratification; hydrogen distribution mixing and stratification; combustion and flame propagation; effects of noncondensable gas distribution on local condensation and evaporation; and aerosol entrainment, transport, and deposition. An analysis with GASFLOW will result in a prediction of the gas composition and discrete particle distribution in space and time throughout the facility and the resulting pressure and temperature loadings on the walls and internal structures with or without combustion. A major application of GASFLOW is for predicting the transport, mixing, and combustion of hydrogen and other gases in nuclear reactor containment and other facilities. It has been applied to situations involving transporting and distributing combustible gas mixtures. It has been used to study gas dynamic behavior in low-speed, buoyancy-driven flows, as well as sonic flows or diffusion dominated flows; and during chemically reacting flows, including deflagrations. The effects of controlling such mixtures by safety systems can be analyzed. The code version described in this manual is designated GASFLOW 2.1, which combines previous versions of the United States Nuclear Regulatory Commission code HMS (for Hydrogen Mixing Studies) and the Department of Energy and FzK versions of GASFLOW. The code was written in standard Fortran 90. This manual comprises three volumes. Volume I describes the governing physical equations and computational model. Volume II describes how to use the code to set up a model geometry, specify gas species and material properties, define initial and boundary conditions, and specify different outputs, especially graphical displays. Sample problems are included. Volume III

  4. The CAIN computer code for the generation of MABEL input data sets: a user's manual

    International Nuclear Information System (INIS)

    Tilley, D.R.

    1983-03-01

    CAIN is an interactive FORTRAN computer code designed to overcome the substantial effort involved in manually creating the thermal-hydraulics input data required by MABEL-2. CAIN achieves this by processing output from either of the whole-core codes, RELAP or TRAC, interpolating where necessary, and by scanning RELAP/TRAC output in order to generate additional information. This user's manual describes the actions required in order to create RELAP/TRAC data sets from magnetic tape, to create the other input data sets required by CAIN, and to operate the interactive command procedure for the execution of CAIN. In addition, the CAIN code is described in detail. This programme of work is part of the Nuclear Installations Inspectorate (NII)'s contribution to the United Kingdom Atomic Energy Authority's independent safety assessment of pressurized water reactors. (author)
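    The core task described above, re-sampling whole-core thermal-hydraulic output onto the time grid needed by a downstream fuel-behaviour code, is easy to illustrate. The sketch below is a generic stand-in for that idea with invented data and file names; it is not the CAIN code and does not reproduce the RELAP/TRAC or MABEL formats:

        import numpy as np

        # Invented whole-core output: (time [s], coolant pressure [MPa]) pairs
        # as might be extracted from a thermal-hydraulics calculation.
        source_time = np.array([0.0, 5.0, 12.0, 30.0, 60.0, 120.0])
        source_pressure = np.array([15.5, 14.9, 12.3, 7.8, 4.1, 1.2])

        # Time grid required by the downstream fuel-behaviour calculation.
        target_time = np.arange(0.0, 120.0 + 1e-9, 10.0)

        # Linear interpolation onto the target grid, as a stand-in for the
        # scanning/interpolation step described in the abstract.
        target_pressure = np.interp(target_time, source_time, source_pressure)

        # Write a simple fixed-width input block (format invented for illustration).
        with open("thermal_hydraulics_input.txt", "w") as handle:
            handle.write("* time (s)   pressure (MPa)\n")
            for t, p in zip(target_time, target_pressure):
                handle.write(f"{t:10.2f} {p:14.4f}\n")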

  5. Users manual for CAFE-3D : a computational fluid dynamics fire code

    International Nuclear Information System (INIS)

    Khalil, Imane; Lopez, Carlos; Suo-Anttila, Ahti Jorma

    2005-01-01

    The Container Analysis Fire Environment (CAFE) computer code has been developed to model all relevant fire physics for predicting the thermal response of massive objects engulfed in large fires. It provides realistic fire thermal boundary conditions for use in design of radioactive material packages and in risk-based transportation studies. The CAFE code can be coupled to commercial finite-element codes such as MSC PATRAN/THERMAL and ANSYS. This coupled system of codes can be used to determine the internal thermal response of finite element models of packages to a range of fire environments. This document is a user manual describing how to use the three-dimensional version of CAFE, as well as a description of CAFE input and output parameters. Since this is a user manual, only a brief theoretical description of the equations and physical models is included

  6. Computer mapping and visualization of facilities for planning of D and D operations

    International Nuclear Information System (INIS)

    Wuller, C.E.; Gelb, G.H.; Cramond, R.; Cracraft, J.S.

    1995-01-01

    The lack of as-built drawings for many old nuclear facilities impedes planning for decontamination and decommissioning. Traditional manual walkdowns subject workers to lengthy exposure to radiological and other hazards. The authors have applied close-range photogrammetry, 3D solid modeling, computer graphics, database management, and virtual reality technologies to create geometrically accurate 3D computer models of the interiors of facilities. The required input to the process is a set of photographs that can be acquired in a brief time. They fit 3D primitive shapes to objects of interest in the photos and, at the same time, record attributes such as material type and link patches of texture from the source photos to facets of modeled objects. When they render the model as either static images or at video rates for a walk-through simulation, the phototextures are warped onto the objects, giving a photo-realistic impression. The authors have exported the data to commercial CAD, cost estimating, robotic simulation, and plant design applications. Results from several projects at old nuclear facilities are discussed

  7. FORIG: a computer code for calculating radionuclide generation and depletion in fusion and fission reactors. User's manual

    International Nuclear Information System (INIS)

    Blink, J.A.

    1985-03-01

    In this manual we describe the use of the FORIG computer code to solve isotope-generation and depletion problems in fusion and fission reactors. FORIG runs on a Cray-1 computer and accepts more extensive activation cross sections than ORIGEN2 from which it was adapted. This report is an updated and a combined version of the previous ORIGEN2 and FORIG manuals. 7 refs., 15 figs., 13 tabs
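    Isotope generation and depletion problems of the kind FORIG solves reduce to coupled linear ODEs (Bateman-type equations with production and decay terms). A minimal sketch for a two-member chain with invented decay constants and a constant production rate, solved with SciPy rather than the FORIG solver:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Invented parameters for a toy chain:  source -> N1 -> N2 -> (stable)
        production_rate = 1.0e10   # atoms of N1 produced per second
        lambda_1 = 1.0e-4          # decay constant of N1 (1/s)
        lambda_2 = 5.0e-6          # decay constant of N2 (1/s)

        def depletion_rhs(_t, n):
            n1, n2 = n
            dn1 = production_rate - lambda_1 * n1   # production minus decay of N1
            dn2 = lambda_1 * n1 - lambda_2 * n2     # fed by N1 decay, decays itself
            return [dn1, dn2]

        solution = solve_ivp(depletion_rhs, t_span=(0.0, 1.0e6), y0=[0.0, 0.0],
                             dense_output=True, rtol=1e-8)

        for t in (1.0e4, 1.0e5, 1.0e6):
            n1, n2 = solution.sol(t)
            print(f"t = {t:8.0f} s   N1 = {n1:.3e}   N2 = {n2:.3e}")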

  8. Molecular Science Computing Facility Scientific Challenges: Linking Across Scales

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Windus, Theresa L.

    2005-07-01

    The purpose of this document is to define the evolving science drivers for performing environmental molecular research at the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and to provide guidance associated with the next-generation high-performance computing center that must be developed at EMSL's Molecular Science Computing Facility (MSCF) in order to address this critical research. The MSCF is the pre-eminent computing facility, supported by the U.S. Department of Energy's (DOE's) Office of Biological and Environmental Research (BER), tailored to provide the fastest time-to-solution for current computational challenges in chemistry and biology, as well as providing the means for broad research in the molecular and environmental sciences. The MSCF provides integral resources and expertise to emerging EMSL Scientific Grand Challenges and Collaborative Access Teams that are designed to leverage the multiple integrated research capabilities of EMSL, thereby creating a synergy between computation and experiment to address environmental molecular science challenges critical to DOE and the nation.

  9. Modern computer hardware and the role of central computing facilities in particle physics

    International Nuclear Information System (INIS)

    Zacharov, V.

    1981-01-01

    Important recent changes in the hardware technology of computer system components are reviewed, and the impact of these changes assessed on the present and future pattern of computing in particle physics. The place of central computing facilities is particularly examined, to answer the important question as to what, if anything, should be their future role. Parallelism in computing system components is considered to be an important property that can be exploited with advantage. The paper includes a short discussion of the position of communications and network technology in modern computer systems. (orig.)

  10. COMPUTER ORIENTED FACILITIES OF TEACHING AND INFORMATIVE COMPETENCE

    OpenAIRE

    Olga M. Naumenko

    2010-01-01

    In the article, the history of views on the tasks of education and of estimates of its effectiveness is considered from the point of view of forming basic, vitally important competences. Opinions on the problem in different countries and international organizations, and the corresponding experience of the Ukrainian system of education, are described. The necessity of forming the informative competence of future teachers is substantiated for the application of the computer oriented facilities of t...

  11. Operating procedures: Fusion Experiments Analysis Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lerche, R.A.; Carey, R.W.

    1984-03-20

    The Fusion Experiments Analysis Facility (FEAF) is a computer facility based on a DEC VAX 11/780 computer. It became operational in late 1982. At that time two manuals were written to aid users and staff in their interactions with the facility. This manual is designed as a reference to assist the FEAF staff in carrying out their responsibilities. It is meant to supplement equipment and software manuals supplied by the vendors. Also this manual provides the FEAF staff with a set of consistent, written guidelines for the daily operation of the facility.

  12. Operating procedures: Fusion Experiments Analysis Facility

    International Nuclear Information System (INIS)

    Lerche, R.A.; Carey, R.W.

    1984-01-01

    The Fusion Experiments Analysis Facility (FEAF) is a computer facility based on a DEC VAX 11/780 computer. It became operational in late 1982. At that time two manuals were written to aid users and staff in their interactions with the facility. This manual is designed as a reference to assist the FEAF staff in carrying out their responsibilities. It is meant to supplement equipment and software manuals supplied by the vendors. Also this manual provides the FEAF staff with a set of consistent, written guidelines for the daily operation of the facility

  13. Shielding Calculations for Positron Emission Tomography - Computed Tomography Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Baasandorj, Khashbayar [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Yang, Jeongseon [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-10-15

    Integrated PET-CT has been shown to be more accurate for lesion localization and characterization than PET or CT alone, or than results obtained from PET and CT performed separately and interpreted side by side or after software-based fusion of the PET and CT datasets. At the same time, PET-CT scans can result in high patient and staff doses; therefore, careful site planning and shielding of this imaging modality have become challenging issues in the field. In Mongolia, the introduction of PET-CT facilities is currently being considered in many hospitals. Thus, additional regulatory legislation for nuclear and radiation applications is necessary, for example, in regulating licensee processes and ensuring radiation safety during operations. This paper aims to determine appropriate PET-CT shielding designs using numerical formulas and computer code. Since there are presently no PET-CT facilities in Mongolia, contact was made with radiological staff at the Nuclear Medicine Center of the National Cancer Center of Mongolia (NCCM) to get information about facilities where the introduction of PET-CT is being considered. Well-designed facilities do not require additional shielding, which should help cut down the overall costs related to PET-CT installation. According to the results of this study, the barrier thicknesses of the NCCM building are not sufficient to keep radiation doses within the limits.
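    As a rough illustration of the kind of numerical formula involved, the sketch below estimates the broad-beam transmission a barrier must provide and converts it to a lead thickness using tenth-value layers. All numbers (dose-rate constant, TVL, workload, occupancy, dose limit) are assumed, illustrative values and must not be used for an actual facility design:

        import math

        # --- Assumed, illustrative parameters (not facility-specific data) ---
        patients_per_week = 40      # administered patients per week
        activity_mbq = 370.0        # administered F-18 activity per patient (MBq)
        dose_rate_const = 0.143     # uSv*m^2/(MBq*h), approx. for 511 keV photons
        exposure_time_h = 1.0       # effective exposure time per patient (h)
        decay_reduction = 0.85      # average decay correction over that time (assumed)
        distance_m = 3.0            # source-to-occupied-point distance (m)
        occupancy = 1.0             # occupancy factor of the adjacent area
        weekly_limit_usv = 20.0     # design dose limit behind the barrier (uSv/week)
        tvl_lead_mm = 16.0          # assumed tenth-value layer of lead at 511 keV

        # Unshielded weekly dose at the occupied point (inverse-square law).
        unshielded = (patients_per_week * activity_mbq * dose_rate_const *
                      exposure_time_h * decay_reduction * occupancy) / distance_m**2

        # Required broad-beam transmission and corresponding lead thickness.
        transmission = weekly_limit_usv / unshielded
        lead_mm = 0.0 if transmission >= 1.0 else tvl_lead_mm * math.log10(1.0 / transmission)

        print(f"unshielded dose: {unshielded:.0f} uSv/week")
        print(f"required transmission: {transmission:.3f}")
        print(f"approximate lead thickness: {lead_mm:.1f} mm")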

  14. PLEXFIN a computer model for the economic assessment of nuclear power plant life extension. User's manual

    International Nuclear Information System (INIS)

    2007-01-01

    The IAEA developed PLEXFIN, a computer model analysis tool aimed at assisting decision makers in the assessment of the economic viability of a nuclear power plant life/licence extension. This user's manual was produced to facilitate the application of the PLEXFIN computer model. It is widely accepted in the industry that the operational life of a nuclear power plant is not limited to a pre-determined number of years, sometimes established on non-technical grounds, but by the capability of the plant to comply with nuclear safety and technical requirements in a cost effective manner. The decision to extend the licence/life of a nuclear power plant involves a number of political, technical and economic issues. Economic viability is a cornerstone of the decision-making process. In a liberalized electricity market, the economic justification of a nuclear power plant life/licence extension decision requires a more complex evaluation. This user's manual was elaborated in the framework of the IAEA's programmes on Continuous process improvement of NPP operating performance, and on Models for analysis and capacity building for sustainable energy development, with the support of four consultants meetings
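    The economic core of such an assessment is a discounted cash-flow comparison of the refurbishment investment against the net revenues earned over the extension period. A minimal net-present-value sketch with invented figures, offered only as an illustration and not as the PLEXFIN methodology:

        # Hypothetical life-extension case: pay an up-front refurbishment cost,
        # then earn net operating revenue for each year of extended operation.
        refurbishment_cost = 500.0e6   # USD, spent at year 0 (assumed)
        annual_net_revenue = 90.0e6    # USD/year: revenue minus O&M and fuel (assumed)
        extension_years = 10
        discount_rate = 0.07           # real discount rate (assumed)

        npv = -refurbishment_cost
        for year in range(1, extension_years + 1):
            npv += annual_net_revenue / (1.0 + discount_rate) ** year

        print(f"NPV of the life-extension project: {npv / 1e6:,.1f} million USD")
        print("extension is economically attractive" if npv > 0
              else "extension is not economically attractive")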

  15. Mass Balance. Operational Control Tests for Wastewater Treatment Facilities. Instructor's Manual [and] Student Workbook.

    Science.gov (United States)

    Carnegie, John W.

    This module describes the process used to determine solids mass and location throughout a waste water treatment plant, explains how these values are used to determine the solids mass balance around single treatment units and the entire system, and presents calculations of solids in pounds and sludge units. The instructor's manual contains a…
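    The solids calculations referred to above typically rest on the standard 'pounds formula', mass (lb/day) = flow (MGD) x concentration (mg/L) x 8.34, applied to each stream entering and leaving a treatment unit. A small sketch of a mass balance around a hypothetical primary clarifier (all flows and concentrations invented):

        LB_PER_MG_PER_MGD = 8.34  # lb per (mg/L * million gallons)

        def solids_lb_per_day(flow_mgd, tss_mg_per_l):
            """Solids mass flow using the standard pounds formula."""
            return flow_mgd * tss_mg_per_l * LB_PER_MG_PER_MGD

        # Invented primary-clarifier streams
        influent = solids_lb_per_day(flow_mgd=2.0, tss_mg_per_l=220.0)
        effluent = solids_lb_per_day(flow_mgd=1.98, tss_mg_per_l=90.0)
        sludge = solids_lb_per_day(flow_mgd=0.02, tss_mg_per_l=12000.0)

        imbalance = influent - (effluent + sludge)
        print(f"in: {influent:.0f} lb/d, out: {effluent + sludge:.0f} lb/d, "
              f"imbalance: {imbalance:+.0f} lb/d "
              f"({100.0 * imbalance / influent:+.1f}% of influent)")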

  16. National Uranium Resource Evaluation Program. Hydrogeochemical and Stream Sediment Reconnaissance Basic Data Reports Computer Program Requests Manual

    International Nuclear Information System (INIS)

    1980-01-01

    This manual is intended to aid those who are unfamiliar with ordering computer output for verification and preparation of Uranium Resource Evaluation (URE) Project reconnaissance basic data reports. The manual is also intended to help standardize the procedures for preparing the reports. Each section describes a program or group of related programs. The sections are divided into three parts: Purpose, Request Forms, and Requested Information

  17. Licence applications for low and intermediate level waste predisposal facilities: A manual for operators

    International Nuclear Information System (INIS)

    2009-07-01

    This publication covers all predisposal waste management facilities and practices for receipt, pretreatment (sorting, segregation, characterization), treatment, conditioning, internal relocation and storage of low and intermediate level radioactive waste, including disused sealed radioactive sources. The publication contains an Annex presenting an example of a safety assessment for a small radioactive waste storage facility. Facilities dealing with both short lived and long lived low and intermediate level waste generated from nuclear applications and from operation of small nuclear research reactors are included in the scope. Processing and storage facilities for high activity disused sealed sources and sealed sources containing long lived radionuclides are also covered. The publication does not cover facilities processing or storing radioactive waste from nuclear power plants or any other industrial scale nuclear fuel cycle facilities. Disposal facilities are excluded from the scope of this publication. The authorization process can be implemented in several stages, which may start at the site planning and feasibility study stage and continue through the preliminary design, final design, commissioning, operation and decommissioning stages. This publication covers primarily the authorization needed to take the facility into operation

  18. Comparison of 3D computer-aided with manual cerebral aneurysm measurements in different imaging modalities

    International Nuclear Information System (INIS)

    Groth, M.; Buhk, J.H.; Schoenfeld, M.; Goebell, E.; Fiehler, J.; Forkert, N.D.

    2013-01-01

    To compare intra- and inter-observer reliability of aneurysm measurements obtained by a 3D computer-aided technique with standard manual aneurysm measurements in different imaging modalities. A total of 21 patients with 29 cerebral aneurysms were studied. All patients underwent digital subtraction angiography (DSA), contrast-enhanced (CE-MRA) and time-of-flight magnetic resonance angiography (TOF-MRA). Aneurysm neck and depth diameters were manually measured by two observers in each modality. Additionally, semi-automatic computer-aided diameter measurements were performed using 3D vessel surface models derived from CE- (CE-com) and TOF-MRA (TOF-com) datasets. Bland-Altman analysis (BA) and intra-class correlation coefficient (ICC) were used to evaluate intra- and inter-observer agreement. BA revealed the narrowest relative limits of intra- and inter-observer agreement for aneurysm neck and depth diameters obtained by TOF-com (ranging between ±5.3 % and ±28.3 %) and CE-com (ranging between ±23.3 % and ±38.1 %). Direct measurements in DSA, TOF-MRA and CE-MRA showed considerably wider limits of agreement. The highest ICCs were observed for TOF-com and CE-com (ICC values, 0.92 or higher for intra- as well as inter-observer reliability). Computer-aided aneurysm measurement in 3D offers improved intra- and inter-observer reliability and a reproducible parameter extraction, which may be used in clinical routine and as objective surrogate end-points in clinical trials. (orig.)

  19. In vitro comparative study of manual and mechanical rotary instrumentation of root canals using computed tomography.

    Science.gov (United States)

    Limongi, Orlando; de Albuquerque, Diana Santana; Baratto Filho, Flares; Vanni, José Roberto; de Oliveira, Elias P Motcy; Barletta, Fernando Branco

    2007-01-01

    This in vitro study compared, using computed tomography (CT), the amount of dentin removed from root canal walls by manual and mechanical rotary instrumentation techniques. Forty mandibular incisors with dental crown and a single canal were selected. The teeth were randomly assigned to two groups, according to the technique used for root canal preparation: Group I - manual instrumentation with stainless steel files; Group II - mechanical instrumentation with RaCe rotary nickel-titanium instruments. In each tooth, root dentin thickness of the buccal, lingual, mesial and distal surfaces in the apical, middle and cervical thirds of the canal was measured (in mm) using a multislice CT scanner (Siemens Emotion, Duo). Data were stored in SPSS v. 11.5 and SigmaPlot 2001 v. 7.101. After crown opening, working length was determined, root canals were instrumented and new CT scans were taken for assessment of root dentin thickness. Pre- and post-instrumentation data were compared and analyzed statistically by ANOVA and Tukey's post-hoc test for significant differences (p=0.05). Based on the findings of this study, it may be concluded that regarding dentin removal from root canal walls during instrumentation, neither of the techniques can be considered more effective than the other.

  20. The Argonne Leadership Computing Facility 2010 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Drugan, C. (LCF)

    2011-05-09

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers

  1. User's manual for SPLPLOT-2: a computer code for data plotting and editing in conversational mode

    International Nuclear Information System (INIS)

    Muramatsu, Ken; Matsumoto, Kiyoshi; Kohsaka, Atsuo; Maniwa, Masaki.

    1985-07-01

    The computer code SPLPLOT-2 for plotting and data editing has been developed as part of the code package SPLPACK-1. The SPLPLOT-2 code has capabilities for both conversational and batch processing. This report is the user's manual for SPLPLOT-2. The following improvements have been made in SPLPLOT-2: (1) it has capabilities for both conversational and batch processing; (2) a function for converting the input SPL (Standard PLotter) files to internal work files has been implemented to reduce the number of time-consuming accesses to the input SPL files; (3) user-supplied subroutines can be assigned for data editing from the SPL files; (4) in addition to two-dimensional graphs, streamline graphs, contour line graphs and bird's-eye view graphs can be drawn. (author)
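    The graph types listed in item (4), contour plots and bird's-eye (surface) views of a two-dimensional field, are easy to illustrate with matplotlib as a modern stand-in for the original plotting library; the field below is invented:

        import numpy as np
        import matplotlib.pyplot as plt
        from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (registers the 3d projection on older matplotlib)

        # Invented 2D scalar field on a regular grid
        x = np.linspace(0.0, 4.0, 81)
        y = np.linspace(0.0, 3.0, 61)
        X, Y = np.meshgrid(x, y)
        Z = np.exp(-((X - 1.5) ** 2 + (Y - 1.0) ** 2)) + 0.3 * np.sin(2.0 * X)

        fig = plt.figure(figsize=(10, 4))

        # Contour-line graph
        ax1 = fig.add_subplot(1, 2, 1)
        contours = ax1.contour(X, Y, Z, levels=10)
        ax1.clabel(contours, inline=True, fontsize=7)
        ax1.set_title("contour lines")

        # Bird's-eye (surface) view
        ax2 = fig.add_subplot(1, 2, 2, projection="3d")
        ax2.plot_surface(X, Y, Z, cmap="viridis")
        ax2.set_title("bird's-eye view")

        fig.tight_layout()
        plt.show()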

  2. Theoretical background and user's manual for the computer code on groundwater flow and radionuclide transport calculation in porous rock

    International Nuclear Information System (INIS)

    Shirakawa, Toshihiko; Hatanaka, Koichiro

    2001-11-01

    In order to prepare a basic manual covering the input data, output data, and execution of a computer code for groundwater flow and radionuclide transport calculation in heterogeneous porous rock, we investigated the theoretical background of the geostatistical computer codes and the usage of the computer code that calculates three-dimensional groundwater flow, the paths of migrating radionuclides, and one-dimensional radionuclide migration. Based on this investigation, this report describes the geostatistical background for simulating a heterogeneous permeability field, the construction of the files, the input and output data, and an example calculation with the programs that simulate the heterogeneous permeability field and calculate groundwater flow and radionuclide transport. The information in this report can therefore be used to model heterogeneous porous rock and to analyze groundwater flow and radionuclide transport. (author)
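    The geostatistical step described above, simulating a spatially correlated heterogeneous permeability field, can be illustrated with a simple log-normal random field built by smoothing white noise. This is only a conceptual stand-in for the geostatistical method in the manual, with all parameters invented:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(seed=1)

        # Grid and assumed statistics of log10(permeability) [m^2]
        nx, ny = 128, 128
        log_k_mean = -13.0        # assumed mean of log10(K)
        log_k_sigma = 0.5         # assumed standard deviation of log10(K)
        correlation_cells = 8     # assumed correlation length in grid cells

        # Correlated Gaussian field: smooth white noise, then rescale to the
        # target mean and standard deviation.
        field = gaussian_filter(rng.standard_normal((nx, ny)), sigma=correlation_cells)
        field = (field - field.mean()) / field.std()
        log_k = log_k_mean + log_k_sigma * field

        permeability = 10.0 ** log_k
        print(f"K range: {permeability.min():.2e} to {permeability.max():.2e} m^2")
        print(f"geometric mean: {10.0 ** log_k.mean():.2e} m^2")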

  3. Operating manual for the High Flux Isotope Reactor. Description of the facility

    Energy Technology Data Exchange (ETDEWEB)

    None

    1965-06-01

    This report contains a comprehensive description of the High Flux Isotope Reactor facility. Its primary purpose is to supplement the detailed operating procedures, providing the reactor operators with background information on the various HFIR systems. The detailed operating procedures are presented in another report.

  4. Operating manual for the High Flux Isotope Reactor. Volume I. Description of the facility

    International Nuclear Information System (INIS)

    1982-09-01

    This volume contains a comprehensive description of the High Flux Isotope Reactor Facility. Its primary purpose is to supplement the detailed operating procedures, providing the reactor operators with background information on the various HFIR systems. The detailed operating procedures are presented in another report

  5. Operating manual for the High Flux Isotope Reactor. Volume I. Description of the facility

    Energy Technology Data Exchange (ETDEWEB)

    1982-09-01

    This volume contains a comprehensive description of the High Flux Isotope Reactor Facility. Its primary purpose is to supplement the detailed operating procedures, providing the reactor operators with background information on the various HFIR systems. The detailed operating procedures are presented in another report.

  6. Ferry Terminals and Small Craft Berthing Facilities. Design Manual 25.5.

    Science.gov (United States)

    1981-07-01

    become water-logged and sink in a few years; use is generally not recommended. (2) Extruded polystyrene (Styrofoam): available in several sizes of precast... Supplemental facilities, such as a boardroom, coffee-break room or snack bar, engineering room, and storage room, for large installations.

  7. Use of toxicity assessment to develop site specific remediation criteria for oil and gas facilities : guidance manual

    International Nuclear Information System (INIS)

    1996-01-01

    The results of a two year study into the evaluation of toxicity-based methods to develop site-specific, risk-based cleanup objectives for the decommissioning of oil and gas facilities were compiled into a manual of guidance. The two basic approaches used in determining remediation criteria for contaminated sites are: (1) comparison of the concentrations of chemicals found on-site with broad regional or national soil and water quality objectives developed for the chemicals involved, and (2) site-specific risk assessment. Toxicity tests are used to test organisms such as earthworms, lettuce seeds, or larval fish directly in the soil, water or sediment suspected of being contaminated. The effects of any contamination on the survival, growth, reproduction, and behaviour of the test organisms are then evaluated. The manual provides guidance in: (1) using toxicity assessments within the regulatory framework of site decommissioning, (2) performing a toxicity assessment, and (3) developing site-specific criteria for a risk assessment. 18 refs., 3 tabs., 5 figs

  8. Assessing the effects of manual dexterity and playing computer games on catheter-wire manipulation for inexperienced operators.

    Science.gov (United States)

    Alsafi, Z; Hameed, Y; Amin, P; Shamsad, S; Raja, U; Alsafi, A; Hamady, M S

    2017-09-01

    To investigate the effect of playing computer games and manual dexterity on catheter-wire manipulation in a mechanical aortic model. Medical student volunteers filled in a preprocedure questionnaire assessing their exposure to computer games. Their manual dexterity was measured using a smartphone game. They were then shown a video clip demonstrating renal artery cannulation and were asked to reproduce this. All attempts were timed. Two-tailed Student's t-test was used to compare continuous data, while Fisher's exact test was used for categorical data. Fifty students aged 18-22 years took part in the study. Forty-six completed the task at an average of 168 seconds (range 103-301 seconds). There was no significant difference in the dexterity score or time to cannulate the renal artery between male and female students. Students who played computer games for >10 hours per week had better dexterity scores than those who did not play computer games: 9.1 versus 10.2 seconds (p=0.0237). Four of 19 students who did not play computer games failed to complete the task, while all of those who played computer games regularly completed the task (p=0.0168). Playing computer games is associated with better manual dexterity and ability to complete a basic interventional radiology task for novices. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
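    The two comparisons described, a two-tailed t-test on dexterity scores and Fisher's exact test on task completion, are straightforward to reproduce with SciPy; the groups below are invented stand-ins for the study data:

        import numpy as np
        from scipy import stats

        # Invented dexterity scores (seconds; lower is better) for two groups
        gamers = np.array([8.9, 9.4, 8.7, 9.3, 9.0, 9.2, 8.8, 9.5])
        non_gamers = np.array([10.1, 10.4, 9.9, 10.6, 10.0, 10.3, 10.2, 10.5])

        t_stat, p_t = stats.ttest_ind(gamers, non_gamers)  # two-tailed by default
        print(f"t = {t_stat:.2f}, p = {p_t:.4f}")

        # Invented 2x2 completion table: rows = gamer / non-gamer,
        # columns = completed / failed the cannulation task
        table = [[31, 0],
                 [15, 4]]
        _, p_fisher = stats.fisher_exact(table)
        print(f"Fisher's exact test: p = {p_fisher:.4f}")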

  9. Micro computed tomography features of laryngeal fractures in a case of fatal manual strangulation.

    Science.gov (United States)

    Fais, Paolo; Giraudo, Chiara; Viero, Alessia; Miotto, Diego; Bortolotti, Federica; Tagliaro, Franco; Montisci, Massimo; Cecchetto, Giovanni

    2016-01-01

    Cases of subtle fatal neck compression are often complicated by the lack of specificity of the post-mortem signs of asphyxia and by the lack of clear signs of neck compression. Herein we present a forensic case of a 45-year-old schizophrenic patient found on the floor of the bedroom of a psychiatric ward in cardiopulmonary arrest and who died after two days in a vegetative state. The deposition of the roommate of the deceased, who claimed responsibility for the killing of the victim by neck compression, was considered unreliable by the prosecutor. Autopsy, toxicological analyses, and multi-slice computed tomography (MSCT), micro computed tomography (micro-CT) and histology of the larynx complex were performed. Particularly, micro-CT analysis of the thyroid cartilage revealed the bilateral presence of ossified triticeous cartilages and the complete fragmentation of the right superior horn of the thyroid, but it additionally demonstrated a fracture on the contralateral superior horns, which was not clearly diagnosable at MSCT. On the basis of the evidence of intracartilaginous laryngeal hemorrhages and bilateral microfracture at the base of the superior horns of the larynx, the death was classified as a case of asphyxia due to manual strangulation. Micro-CT was confirmed as a useful tool in cases of subtle fatal neck compression, for the detection of minute laryngeal cartilage fractures, especially in complex cases with equivocal findings on MSCT. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Exercise manual for the Augmented Computer Exercise for Inspection Training (ACE-IT) software

    Energy Technology Data Exchange (ETDEWEB)

    Dobranich, P.R.; Widney, T.W.; Goolsby, P.T. [Sandia National Labs., Albuquerque, NM (United States). Cooperative Monitoring Center and Regional Security; Nelson, J.D.; Evanko, D.A. [Ogden Environmental and Energy Services, Inc., Albuquerque, NM (United States)

    1997-09-01

    The on-site inspection provisions in many current and proposed arms control agreements require extensive preparation and training on the part of both the Inspected Party and the Inspection Team. Current training techniques include table-top inspections and practice inspections. The Augmented Computer Exercise for Inspection Training (ACE-IT), an interactive computer training tool, increases the utility of table-top inspections. ACE-IT has been designed to provide training for a hypothetical challenge inspection under the Chemical Weapons Convention (CWC); however, this training tool can be modified for other inspection regimes. Although ACE-IT provides training from notification of an inspection through post-inspection activities, the primary emphasis of ACE-IT is in the inspection itself--particularly with the concept of managed access. ACE-IT also demonstrates how inspection provisions impact compliance determination and the protection of sensitive information. The Exercise Manual supplements the ACE-IT software by providing general information on on-site inspections and detailed information for the CWC challenge inspection exercise. The detailed information includes the pre-inspection briefing, maps, list of sensitive items, medical records, and shipping records.

  11. User's Manual for FOMOCO Utilities-Force and Moment Computation Tools for Overset Grids

    Science.gov (United States)

    Chan, William M.; Buning, Pieter G.

    1996-01-01

    In the numerical computations of flows around complex configurations, accurate calculations of force and moment coefficients for aerodynamic surfaces are required. When overset grid methods are used, the surfaces on which force and moment coefficients are sought typically consist of a collection of overlapping surface grids. Direct integration of flow quantities on the overlapping grids would result in the overlapped regions being counted more than once. The FOMOCO Utilities is a software package for computing flow coefficients (force, moment, and mass flow rate) on a collection of overset surfaces with accurate accounting of the overlapped zones. FOMOCO Utilities can be used in stand-alone mode or in conjunction with the Chimera overset grid compressible Navier-Stokes flow solver OVERFLOW. The software package consists of two modules corresponding to a two-step procedure: (1) hybrid surface grid generation (MIXSUR module), and (2) flow quantities integration (OVERINT module). Instructions on how to use this software package are described in this user's manual. Equations used in the flow coefficients calculation are given in Appendix A.
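
    A minimal sketch of the double-counting issue the package addresses (illustrative Python under assumed panel data, not the actual MIXSUR/OVERINT algorithms): each surface panel carries a weight of 1 if it is retained in the hybrid, non-overlapping surface and 0 if it duplicates a region already covered by another grid, so overlapped panels contribute to the integrated pressure force only once.

      import numpy as np

      def pressure_force(panel_areas, panel_normals, panel_pressures, weights):
          """Sum F = -sum_i w_i * p_i * A_i * n_i over all surface panels."""
          A = np.asarray(panel_areas, dtype=float)[:, None]      # (N, 1) panel areas
          n = np.asarray(panel_normals, dtype=float)             # (N, 3) outward unit normals
          p = np.asarray(panel_pressures, dtype=float)[:, None]  # (N, 1) panel pressures
          w = np.asarray(weights, dtype=float)[:, None]          # (N, 1) 0/1 overlap weights
          return -(w * p * A * n).sum(axis=0)                    # (3,) force vector

      # Two grids cover the same strip: counting both copies would double the force,
      # so the weight of the duplicated panel is set to zero.
      areas     = [1.0, 1.0, 1.0, 1.0]
      normals   = [[0, 0, 1]] * 4
      pressures = [2.0, 2.0, 2.0, 2.0]
      weights   = [1, 1, 1, 0]          # last panel lies in an overlapped region
      print(pressure_force(areas, normals, pressures, weights))  # approximately [0, 0, -6]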

  12. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E. [Sandia National Labs., Albuquerque, NM (United States); Tills, J. [J. Tills and Associates, Inc., Sandia Park, NM (United States)

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  13. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    International Nuclear Information System (INIS)

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E.; Tills, J.

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions

  14. Computer analysis of digital sky surveys using citizen science and manual classification

    Science.gov (United States)

    Kuminski, Evan; Shamir, Lior

    2015-01-01

    As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to be able to efficiently analyze these data. An effective way to do this is through manual analysis; however, this may be insufficient considering the extremely vast pipelines of astronomical images generated by the present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual or small groups of scientists can. While these citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot scale to the masses of data collected by digital sky surveys, it is clear that in order to keep up with the growing databases some form of automation of the data analysis will be required, and will work either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed to analyze some of the morphological features such as the number of spiral arms, and provided an accuracy of just ~36%.
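
    The reported trend, i.e. the algorithm agreeing more often on galaxies with a cleaner citizen-science consensus, can be illustrated with a toy experiment on synthetic labels (invented data, not the Galaxy Zoo 2 catalogue):

      import numpy as np

      rng = np.random.default_rng(0)
      n = 10000
      agreement = rng.uniform(0.5, 1.0, n)          # fraction of votes for the majority class
      human_label = rng.integers(0, 2, n)           # 0 = elliptical, 1 = spiral (hypothetical)
      # in this toy model the classifier errs more often on low-consensus objects
      flip = rng.random(n) > agreement
      machine_label = np.where(flip, 1 - human_label, human_label)

      for thr in (0.5, 0.7, 0.9, 0.95):
          mask = agreement >= thr
          acc = (machine_label[mask] == human_label[mask]).mean()
          print(f"agreement >= {thr:.2f}: {mask.sum():5d} galaxies, accuracy {acc:.3f}")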

  15. A guide for the selection of computer assisted mapping (CAM) and facilities information systems

    Energy Technology Data Exchange (ETDEWEB)

    Haslin, S.; Baxter, P.; Jarvis, L.

    1980-12-01

    Many distribution engineers are now aware that computer assisted mapping (CAM) and facilities information systems are probably the most significant breakthrough to date in computer applications for distribution engineering. The Canadian Electrical Association (CEA) recognized this and requested that engineers of B.C. Hydro make a study of the state of the art in Canadian utilities and the progress of CAM systems on an international basis. The purpose was to provide a guide to assist Canadian utility distribution engineers faced with the problem of studying the application of CAM systems as an alternative to present methods, consideration being given to the long-term and other benefits that were perhaps not apparent to those approaching this field for the first time. It soon became apparent that technology was developing at a high rate and competition in the market was very strong. Also, a number of publications were produced by other sources which adequately covered the scope of this study. This report is thus a collection of references to reports, manuals, and other documents with a few considerations provided for those companies interested in exploring further the use of interactive graphics. 24 refs.

  16. Evaluation of the discrete vortex wake cross flow model using vector computers. Part 2: User's manual for DIVORCE

    Science.gov (United States)

    Deffenbaugh, F. D.; Vitz, J. F.

    1979-01-01

    The user's manual for the Discrete Vortex Cross flow Evaluator (DIVORCE) computer program is presented. DIVORCE was developed in FORTRAN 4 for the CDC 6600 and CDC 7600 machines. Optimal calls to a NASA vector subroutine package are provided for use with the CDC 7600.

  17. High resolution muon computed tomography at neutrino beam facilities

    International Nuclear Information System (INIS)

    Suerfu, B.; Tully, C.G.

    2016-01-01

    X-ray computed tomography (CT) has an indispensable role in constructing 3D images of objects made from light materials. However, limited by absorption coefficients, X-rays cannot deeply penetrate materials such as copper and lead. Here we show via simulation that muon beams can provide high resolution tomographic images of dense objects and of structures within the interior of dense objects. The effects of resolution broadening from multiple scattering diminish with increasing muon momentum. As the momentum of the muon increases, the contrast of the image goes down and therefore requires higher resolution in the muon spectrometer to resolve the image. The variance of the measured muon momentum reaches a minimum and then increases with increasing muon momentum. The impact of the increase in variance is to require a higher integrated muon flux to reduce fluctuations. The flux requirements and level of contrast needed for high resolution muon computed tomography are well matched to the muons produced in the pion decay pipe at a neutrino beam facility and what can be achieved for momentum resolution in a muon spectrometer. Such an imaging system can be applied in archaeology, art history, engineering, material identification and whenever there is a need to image inside a transportable object constructed of dense materials
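
    The claim that scattering-induced blur shrinks with momentum can be illustrated with the standard PDG/Highland estimate of the RMS multiple-scattering angle; the sketch below assumes a 10 cm copper absorber and is not the paper's simulation:

      import math

      MUON_MASS_MEV = 105.66

      def highland_theta0(p_mev, thickness_cm, rad_length_cm):
          """theta_0 ~ (13.6 MeV / (beta*p)) * sqrt(x/X0) * [1 + 0.038 ln(x/X0)]."""
          beta = p_mev / math.hypot(p_mev, MUON_MASS_MEV)    # beta = p/E for the muon
          t = thickness_cm / rad_length_cm
          return (13.6 / (beta * p_mev)) * math.sqrt(t) * (1 + 0.038 * math.log(t))

      for p in (1000.0, 3000.0, 10000.0, 30000.0):           # momentum in MeV/c
          theta_mrad = 1e3 * highland_theta0(p, 10.0, 1.44)  # X0(Cu) ~ 1.44 cm
          print(f"p = {p / 1000:5.1f} GeV/c  ->  theta_0 ~ {theta_mrad:.1f} mrad")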

  18. Argonne Leadership Computing Facility 2011 annual report : Shaping future supercomputing.

    Energy Technology Data Exchange (ETDEWEB)

    Papka, M.; Messina, P.; Coffey, R.; Drugan, C. (LCF)

    2012-08-16

    The ALCF's Early Science Program aims to prepare key applications for the architecture and scale of Mira and to solidify libraries and infrastructure that will pave the way for other future production applications. Two billion core-hours have been allocated to 16 Early Science projects on Mira. The projects, in addition to promising delivery of exciting new science, are all based on state-of-the-art, petascale, parallel applications. The project teams, in collaboration with ALCF staff and IBM, have undertaken intensive efforts to adapt their software to take advantage of Mira's Blue Gene/Q architecture, which, in a number of ways, is a precursor to future high-performance-computing architecture. The Argonne Leadership Computing Facility (ALCF) enables transformative science that solves some of the most difficult challenges in biology, chemistry, energy, climate, materials, physics, and other scientific realms. Users partnering with ALCF staff have reached research milestones previously unattainable, due to the ALCF's world-class supercomputing resources and expertise in computational science. In 2011, the ALCF's commitment to providing outstanding science and leadership-class resources was honored with several prestigious awards. Research on multiscale brain blood flow simulations was named a Gordon Bell Prize finalist. Intrepid, the ALCF's BG/P system, ranked No. 1 on the Graph 500 list for the second consecutive year. The next-generation BG/Q prototype again topped the Green500 list. Skilled experts at the ALCF enable researchers to conduct breakthrough science on the Blue Gene system in key ways. The Catalyst Team matches project PIs with experienced computational scientists to maximize and accelerate research in their specific scientific domains. The Performance Engineering Team facilitates the effective use of applications on the Blue Gene system by assessing and improving the algorithms used by applications and the techniques used to

  19. Follow 1.1 - a program for visualization of Thermal-Hydraulic computer simulations. User's manual

    International Nuclear Information System (INIS)

    Hyvarinen, J.

    1990-04-01

    FOLLOW is a computer program designed to function as an analyst's aid when performing large thermal-hydraulic and related safety calculations using the well-known simulation codes RELAP5, MELCOR, SMABRE and TRAB. The code is a by-product of the effort to improve the analysis capabilities of the Finnish Centre for Radiation and Nuclear Safety (STUK). FOLLOW's most important application is as an on-line 'window' into the progress of the simulation calculation. The thermal-hydraulic analyses related to nuclear safety routinely require very long calculation times. FOLLOW makes it possible to follow the course of the simulation and thus to observe the results while the simulation is still running. FOLLOW's various outputs have been designed to mimic those available at a nuclear power plant operator's console. Thus FOLLOW can also be used much like a nuclear power plant simulator. This manual describes the usage, features and input requirements of FOLLOW version 1.1, including a sample problem input and various outputs. (orig.)

  20. Spent Nuclear Fuel (SNF) Project Cold Vacuum Drying (CVD) Facility Operations Manual

    International Nuclear Information System (INIS)

    IRWIN, J.J.

    2000-01-01

    The mission of the Spent Nuclear Fuel (SNF) Project Cold Vacuum Drying Facility (CVDF) is to achieve the earliest possible removal of free water from Multi-Canister Overpacks (MCOs). The MCOs contain metallic uranium SNF that have been removed from the 100K Area fuel storage water basins (i.e., the K East and K West Basins) at the U.S. Department of Energy Hanford Site in Southeastern Washington state. Removal of free water is necessary to halt water-induced corrosion of exposed uranium surfaces and to allow the MCOs and their SNF payloads to be safely transported to the Hanford Site 200 East Area and stored within the SNF Project Canister Storage Building (CSB). The CVDF is located within a few hundred yards of the basins, southwest of the 165KW Power Control Building and the 105KW Reactor Building. The site area required for the facility and vehicle circulation is approximately 2 acres. Access and egress are provided by the main entrance to the 100K inner area using existing roadways. The CVDF will remove free water from the MCOs to reduce the potential for continued fuel-water corrosion reactions. The cold vacuum drying process involves the draining of bulk water from the MCO and subsequent vacuum drying. The MCO will be evacuated to a pressure of 8 torr or less and backfilled with an inert gas (helium). The MCO will be sealed, leak tested, and then transported to the CSB within a sealed shipping cask. (The MCO remains within the same shipping cask from the time it enters the basin to receive its SNF payload until it is removed from the cask by the CSB MCO handling machine.) The CVDF subproject acquired the required process systems, supporting equipment, and facilities. The cold vacuum drying operations result in an MCO containing dried fuel that is prepared for shipment to the CSB by the cask transportation system. The CVDF subproject also provides equipment to dispose of solid wastes generated by the cold vacuum drying process and transfer process water removed

  1. Operation and maintenance manual of the accelerator installed in the facility of radiation standards

    International Nuclear Information System (INIS)

    Fujii, Katsutoshi; Kawasaki, Katsuya; Kowatari, Munehiko; Tanimura, Yoshihiko; Kajimoto, Yoichi; Shimizu, Shigeru

    2006-08-01

    A 4 MV Van de Graaff accelerator was installed in the Facility of Radiation Standards (FRS) in June 2000, and monoenergetic neutron calibration fields and high-energy γ-ray calibration fields have been developed. The calibration fields are provided for R and D on dosimetry, and for the calibration and type-test of radiation protection instruments. This article describes the operational procedure, the maintenance work and the operation of the related apparatuses of the accelerator. It focuses on ensuring sufficient safety and radiation control for the operators, on maintaining the performance of the accelerator, and on preventing malfunctions due to operator mistakes. The article targets inexperienced engineers in charge of operation and maintenance of the accelerator. (author)

  2. A user's manual of Tools for Error Estimation of Complex Number Matrix Computation (Ver.1.0)

    International Nuclear Information System (INIS)

    Ichihara, Kiyoshi.

    1997-03-01

    'Tools for Error Estimation of Complex Number Matrix Computation' is a subroutine library which aids the users in obtaining the error ranges of the solutions of complex number linear systems or of the eigenvalues of Hermitian matrices. This library contains routines for both sequential computers and parallel computers. The subroutines for linear system error estimation calculate norms of residual vectors, matrices' condition numbers, error bounds of solutions and so on. The error estimation subroutines for Hermitian matrix eigenvalues derive the error ranges of the eigenvalues according to the Korn-Kato formula. This user's manual contains a brief mathematical background of error analysis in linear algebra and usage of the subroutines. (author)
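
    A minimal sketch of the kind of a posteriori estimate such routines return, namely the textbook residual/condition-number bound for a complex linear system (this is not the library's own algorithm): with r = b - A*x_hat, the relative error satisfies ||x - x_hat||/||x|| <= cond(A) * ||r||/||b||.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 50
      A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
      x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
      b = A @ x_true

      x_hat = np.linalg.solve(A, b)                 # computed solution
      r = b - A @ x_hat                             # residual vector
      cond = np.linalg.cond(A)                      # condition number of A
      bound = cond * np.linalg.norm(r) / np.linalg.norm(b)
      actual = np.linalg.norm(x_true - x_hat) / np.linalg.norm(x_true)
      print(f"cond(A) = {cond:.2e}, error bound = {bound:.2e}, actual error = {actual:.2e}")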

  3. Academic Computing Facilities and Services in Higher Education--A Survey.

    Science.gov (United States)

    Warlick, Charles H.

    1986-01-01

    Presents statistics about academic computing facilities based on data collected over the past six years from 1,753 institutions in the United States, Canada, Mexico, and Puerto Rico for the "Directory of Computing Facilities in Higher Education." Organizational, functional, and financial characteristics are examined as well as types of…

  4. Public Computer Assisted Learning Facilities for Children with Visual Impairment: Universal Design for Inclusive Learning

    Science.gov (United States)

    Siu, Kin Wai Michael; Lam, Mei Seung

    2012-01-01

    Although computer assisted learning (CAL) is becoming increasingly popular, people with visual impairment face greater difficulty in accessing computer-assisted learning facilities. This is primarily because most of the current CAL facilities are not friendly to visually impaired users. People with visual impairment also do not normally have access to…

  5. Integration of a browser based operator manual in the system environment of a process computer system; Integration eines browserbasierten Betriebshandbuchs in die Systemumgebung einer Prozessrechneranlage

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Andreas [Westinghouse Electric Germany GmbH (Germany); Erfle, Robert [DOSCO GmbH, Heidelberg (Germany); Feinkohl, Dirk [E.ON Kernkraft GmbH (Germany). Kernkraftwerk Unterweser

    2012-11-01

    The integration of a browser based operator manual in the system environment of a process computer system is an optimization of the operating procedure in the control room and a safety enhancement due to faster and error-free access to the manual contents. Several requirements by the authorities have to be fulfilled: the operating manual has to be available as hard copy, the format has to be true to the original, protection against manipulation has to be provided, the manual content of the browser-based version and the hard copy have to be identical, and the display presentation has to be consistent with ergonomic principles. The integration of the on-line manual in the surveillance process computer system provides the operator with the comments relevant to the surveillance signal. The described integration of the on-line manual is an optimization of the operator's everyday job with respect to ergonomics and safety (human performance).

  6. Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0

    Science.gov (United States)

    Knox, J. C.

    1996-01-01

    The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A modeling provides a graphical and command-driven interface for the user. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connect the components via flow streams and define their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set to run the simulation based on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluation of the simulation results over time. Additionally, users have the capability to control the simulation and extract information at various times in the simulation (e.g., control equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A version 5.0 software runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.

  7. Computer modelling of the UK wind energy resource: UK wind speed data package and user manual

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Ravenscroft, F

    1993-12-31

    A software package has been developed for IBM-PC or true compatibles. It is designed to provide easy access to the results of a programme of work to estimate the UK wind energy resource. Mean wind speed maps and quantitative resource estimates were obtained using the NOABL mesoscale (1 km resolution) numerical model for the prediction of wind flow over complex terrain. NOABL was used in conjunction with digitised terrain data and wind data from surface meteorological stations for a ten year period (1975-1984) to provide digital UK maps of mean wind speed at 10m, 25m and 45m above ground level. Also included in the derivation of these maps was the use of the Engineering Science Data Unit (ESDU) method to model the effect on wind speed of the abrupt change in surface roughness that occurs at the coast. With the wind speed software package, the user is able to obtain a display of the modelled wind speed at 10m, 25m and 45m above ground level for any location in the UK. The required co-ordinates are simply supplied by the user, and the package displays the selected wind speed. This user manual summarises the methodology used in the generation of these UK maps and shows computer generated plots of the 25m wind speeds in 200 x 200 km regions covering the whole UK. The uncertainties inherent in the derivation of these maps are also described, and notes given on their practical usage. The present study indicated that 23% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these 'first order' resource estimates represent a substantial improvement over the presently available 'zero order' estimates. (18 figures, 3 tables, 6 references). (author)
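
    A hypothetical sketch of the kind of lookup the package performs, i.e. returning a mean wind speed for user-supplied coordinates by bilinear interpolation on a gridded map (the grid values below are invented, not NOABL output):

      import numpy as np

      def wind_speed_at(grid, x0, y0, dx, easting, northing):
          """Bilinearly interpolate a regular grid (rows = northing, cols = easting)."""
          fx = (easting - x0) / dx
          fy = (northing - y0) / dx
          i, j = int(fy), int(fx)
          ty, tx = fy - i, fx - j
          return ((1 - ty) * (1 - tx) * grid[i, j] + (1 - ty) * tx * grid[i, j + 1] +
                  ty * (1 - tx) * grid[i + 1, j] + ty * tx * grid[i + 1, j + 1])

      grid = np.array([[5.8, 6.1, 6.4],
                       [6.0, 6.5, 7.0],
                       [6.2, 6.9, 7.6]])   # m/s at 25 m above ground (invented values)
      print(wind_speed_at(grid, x0=0.0, y0=0.0, dx=1000.0, easting=1500.0, northing=500.0))  # 6.5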

  8. Accuracy of computer-calculated and manual QRS duration assessments: Clinical implications to select candidates for cardiac resynchronization therapy.

    Science.gov (United States)

    De Pooter, Jan; El Haddad, Milad; Stroobandt, Roland; De Buyzere, Marc; Timmermans, Frank

    2017-06-01

    QRS duration (QRSD) plays a key role in the field of cardiac resynchronization therapy (CRT). Computer-calculated QRSD assessments are widely used; however, inter-manufacturer differences have not been investigated in CRT candidates. QRSD was assessed in 377 digitally stored ECGs: 139 narrow QRS, 140 LBBB and 98 ventricular paced ECGs. Manual QRSD was measured as global QRSD, using digital calipers, by two independent observers. Computer-calculated QRSD was assessed by Marquette 12SL (GE Healthcare, Waukesha, WI, USA) and SEMA3 (Schiller, Baar, Switzerland). Inter-manufacturer differences of computer-calculated QRSD assessments vary among different QRS morphologies: narrow QRSD: 4 [2-9] ms (median [IQR]), p=0.010; LBBB QRSD: 7 [2-10] ms, p=0.003 and paced QRSD: 13 [6-18] ms, p=0.007. Interobserver differences of manual QRSD assessments measured: narrow QRSD: 4 [2-6] ms, p=non-significant; LBBB QRSD: 6 [3-12] ms, p=0.006; paced QRSD: 8 [4-18] ms, p=0.001. In LBBB ECGs, intraclass correlation coefficients (ICCs) were comparable for inter-manufacturer and interobserver agreement (ICC 0.830 versus 0.837). When assessing paced QRSD, manual measurements showed higher ICC compared to inter-manufacturer agreement (ICC 0.902 versus 0.776). Using the guideline cutoff of 130 ms, up to 15% of the LBBB ECGs would be misclassified as <130 ms or ≥130 ms by at least one method. Using a cutoff of 150 ms, this number increases to 33% of ECGs being misclassified. However, by combining LBBB-morphology and QRSD, the number of misclassified ECGs can be decreased by half. Inter-manufacturer differences in computer-calculated QRSD assessments are significant and may compromise adequate selection of individual CRT candidates when using QRSD as sole parameter. Paced QRSD should preferentially be assessed by manual QRSD measurements. Copyright © 2017 Elsevier B.V. All rights reserved.
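
    The style of comparison reported above can be sketched on invented measurements (not the study's ECG data): paired differences between two QRSD readings summarised as median [IQR], plus the share of ECGs that switch classification at a 130 ms cutoff.

      import numpy as np

      rng = np.random.default_rng(2)
      qrsd_a = rng.normal(150, 20, 200)            # ms, e.g. one manufacturer's algorithm
      qrsd_b = qrsd_a + rng.normal(0, 6, 200)      # ms, e.g. another algorithm or observer

      diff = np.abs(qrsd_a - qrsd_b)
      q1, med, q3 = np.percentile(diff, [25, 50, 75])
      discordant = np.mean((qrsd_a >= 130) != (qrsd_b >= 130))
      print(f"median difference {med:.0f} ms [IQR {q1:.0f}-{q3:.0f}]")
      print(f"discordant at the 130 ms cutoff: {100 * discordant:.1f}% of ECGs")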

  9. Computer-based guidelines for concrete pavements : HIPERPAV III : user manual

    Science.gov (United States)

    2009-10-01

    This user manual provides guidance on how to use the new High PERformance PAVing (HIPERPAV) III software program for the analysis of early-age Portland cement concrete pavement (PCCP) behavior. HIPERPAV III includes several improvements over prev...

  10. An analytical model for computation of reliability of waste management facilities with intermediate storages

    International Nuclear Information System (INIS)

    Kallweit, A.; Schumacher, F.

    1977-01-01

    High reliability is required of waste management facilities within the fuel cycle of nuclear power stations; this requirement can be met by providing intermediate storage facilities and reserve capacities. In this report a model based on the theory of Markov processes is described which allows the computation of reliability characteristics of waste management facilities containing intermediate storage facilities. The application of the model is demonstrated by an example. (orig.) [de
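
    A toy sketch of the Markov-process approach (invented states and rates, not the report's model): steady-state probabilities follow from the generator matrix Q via pi*Q = 0 with sum(pi) = 1, and availability is the probability mass of the non-failed states.

      import numpy as np

      lam, mu = 0.01, 0.2    # failure and repair rates of the processing unit (1/h)
      fill = 0.05            # rate at which the intermediate store fills while the unit is down (1/h)

      # states: 0 = unit up, 1 = unit down with buffer space left, 2 = unit down and buffer full
      Q = np.array([[-lam,          lam,   0.0],
                    [  mu, -(mu + fill),  fill],
                    [  mu,          0.0,   -mu]])

      A = np.vstack([Q.T, np.ones(3)])    # pi*Q = 0 plus the normalisation row
      b = np.array([0.0, 0.0, 0.0, 1.0])
      pi, *_ = np.linalg.lstsq(A, b, rcond=None)
      print("steady-state availability (states 0 and 1):", pi[0] + pi[1])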

  11. TrueAllele casework on Virginia DNA mixture evidence: computer and manual interpretation in 72 reported criminal cases.

    Directory of Open Access Journals (Sweden)

    Mark W Perlin

    Mixtures are a commonly encountered form of biological evidence that contain DNA from two or more contributors. Laboratory analysis of mixtures produces data signals that usually cannot be separated into distinct contributor genotypes. Computer modeling can resolve the genotypes up to probability, reflecting the uncertainty inherent in the data. Human analysts address the problem by simplifying the quantitative data in a threshold process that discards considerable identification information. Elevated stochastic threshold levels potentially discard more information. This study examines three different mixture interpretation methods. In 72 criminal cases, 111 genotype comparisons were made between 92 mixture items and relevant reference samples. TrueAllele computer modeling was done on all the evidence samples, and documented in DNA match reports that were provided as evidence for each case. Threshold-based Combined Probability of Inclusion (CPI) and stochastically modified CPI (mCPI) analyses were performed as well. TrueAllele's identification information in 101 positive matches was used to assess the reliability of its modeling approach. Comparison was made with 81 CPI and 53 mCPI DNA match statistics that were manually derived from the same data. There were statistically significant differences between the DNA interpretation methods. TrueAllele gave an average match statistic of 113 billion, CPI averaged 6.68 million, and mCPI averaged 140. The computer was highly specific, with a false positive rate under 0.005%. The modeling approach was precise, having a factor of two within-group standard deviation. TrueAllele accuracy was indicated by having uniformly distributed match statistics over the data set. The computer could make genotype comparisons that were impossible or impractical using manual methods. TrueAllele computer interpretation of DNA mixture evidence is sensitive, specific, precise, accurate and more informative than manual
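
    For reference, the threshold-based CPI statistic mentioned above follows a simple textbook formula; the sketch below uses invented allele frequencies and is neither TrueAllele's probabilistic model nor the laboratory protocol. At each locus, PI = (sum of the population frequencies of every allele observed in the mixture)^2, CPI is the product over loci, and 1/CPI is quoted as the inclusion statistic.

      from math import prod

      # hypothetical mixture profile: locus -> frequencies of the alleles observed there
      mixture = {
          "D8S1179": [0.10, 0.17, 0.31],
          "D21S11":  [0.25, 0.18],
          "TH01":    [0.23, 0.19, 0.08, 0.12],
      }

      def combined_probability_of_inclusion(profile):
          return prod(sum(freqs) ** 2 for freqs in profile.values())

      cpi = combined_probability_of_inclusion(mixture)
      print(f"CPI = {cpi:.3e}; inclusion statistic 1/CPI = {1 / cpi:,.0f}")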

  12. Computer Security Incident Response Planning at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    The purpose of this publication is to assist Member States in developing comprehensive contingency plans for computer security incidents with the potential to impact nuclear security and/or nuclear safety. It provides an outline and recommendations for establishing a computer security incident response capability as part of a computer security programme, and considers the roles and responsibilities of the system owner, operator, competent authority, and national technical authority in responding to a computer security incident with possible nuclear security repercussions

  13. National facility for advanced computational science: A sustainable path to scientific discovery

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  14. User's manual for the computer-aided plant transient data compilation

    International Nuclear Information System (INIS)

    Langenbuch, S.; Gill, R.; Lerchl, G.; Schwaiger, R.; Voggenberger, T.

    1984-01-01

    The objective of this project is the compilation of data for nuclear power plants needed for transient analyses. The concept has already been described. This user's manual gives a detailed description of all functions of the dialogue system that supports data acquisition and retrieval. (orig.) [de

  15. On-line satellite/central computer facility of the Multiparticle Argo Spectrometer System

    International Nuclear Information System (INIS)

    Anderson, E.W.; Fisher, G.P.; Hien, N.C.; Larson, G.P.; Thorndike, A.M.; Turkot, F.; von Lindern, L.; Clifford, T.S.; Ficenec, J.R.; Trower, W.P.

    1974-09-01

    An on-line satellite/central computer facility has been developed at Brookhaven National Laboratory as part of the Multiparticle Argo Spectrometer System (MASS). This facility, consisting of a PDP-9 and a CDC-6600, has been successfully used in the study of proton-proton interactions at 28.5 GeV/c. (U.S.)

  16. Implementation of computer security at nuclear facilities in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Lochthofen, Andre; Sommer, Dagmar [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    2013-07-01

    In recent years, electrical and I and C components in nuclear power plants (NPPs) have been replaced by software-based components. With the increased number of software-based systems, the threat of malevolent interference and cyber-attacks on NPPs has also increased. In order to maintain nuclear security, conventional physical protection measures and protection measures in the field of computer security have to be implemented. Therefore, the existing security management process of the NPPs has to be expanded to computer security aspects. In this paper, we give an overview of computer security requirements for German NPPs. Furthermore, some examples for the implementation of computer security projects based on a GRS-best-practice-approach are shown. (orig.)

  17. Implementation of computer security at nuclear facilities in Germany

    International Nuclear Information System (INIS)

    Lochthofen, Andre; Sommer, Dagmar

    2013-01-01

    In recent years, electrical and I and C components in nuclear power plants (NPPs) have been replaced by software-based components. With the increased number of software-based systems, the threat of malevolent interference and cyber-attacks on NPPs has also increased. In order to maintain nuclear security, conventional physical protection measures and protection measures in the field of computer security have to be implemented. Therefore, the existing security management process of the NPPs has to be expanded to computer security aspects. In this paper, we give an overview of computer security requirements for German NPPs. Furthermore, some examples for the implementation of computer security projects based on a GRS-best-practice-approach are shown. (orig.)

  18. Computer-aided system for cryogenic research facilities

    International Nuclear Information System (INIS)

    Gerasimov, V.P.; Zhelamsky, M.V.; Mozin, I.V.; Repin, S.S.

    1994-01-01

    A computer-aided system has been developed for the more effective choice and optimization of the design and manufacturing technologies of the superconductor for the magnet system of the International Thermonuclear Experimental Reactor (ITER), with the aim of ensuring superconductor certification. The computer-aided system provides acquisition, processing, storage and display of data describing the tests in progress, as well as the detection and analysis of any parameter deviations. In addition, it generates commands to switch off the equipment in emergency situations. ((orig.))

  19. GMS 3: A users manual for an improved generalised multigroup scheme using WDSN MK2 for the KDF9 computer

    International Nuclear Information System (INIS)

    Barnett, M.R.; Ward, J.A.

    1968-08-01

    GMS 3 is an extension of GMS1 written for the KDF9 computer. An improved DSN routine has been incorporated, with consequent reduction in running time in many instances. Three alternative nuclear data libraries are now available, the latest of these being entirely consistent with that of WIMS. A treatment of neutron streaming in simple voids is now incorporated in the routine which prepares few-group macroscopic parameters. This report is primarily a Users Manual, but in addition includes details of theory and programming. (author)

  20. A Computer Assisted Program for the Management of Acute Dental Pain: Programmer’s Manual

    Science.gov (United States)

    1990-02-06

    (The abstract of this record consists of fragments of the program's BASIC source listing, including prompt-handling code and DATA statements enumerating diagnoses such as Osteitis (Dry Socket), Osseous Sequestrum, Abscess/Infection/Cellulitis, Periodontal Abscess, Reversible Pulpitis, Irreversible Pulpitis, Acute Apical Abscess, Acute Apical Periodontitis, and Carious Lesion.)

  1. Computer aided optimum design of rubble-mound breakwater cross-sections : Manual of the RUMBA computer package, release 1

    NARCIS (Netherlands)

    De Haan, W.

    1989-01-01

    The computation of the optimum rubble-mound breakwater cross-section is executed on a micro-computer. The RUMBA computer package consists of two main parts: the optimization process is executed by a Turbo Pascal programme, while the second part consists of editing functions written in AutoLISP. AutoLISP is

  2. Biosafety Manual

    Energy Technology Data Exchange (ETDEWEB)

    King, Bruce W.

    2010-05-18

    Work with or potential exposure to biological materials in the course of performing research or other work activities at Lawrence Berkeley National Laboratory (LBNL) must be conducted in a safe, ethical, environmentally sound, and compliant manner. Work must be conducted in accordance with established biosafety standards, the principles and functions of Integrated Safety Management (ISM), this Biosafety Manual, Chapter 26 (Biosafety) of the Health and Safety Manual (PUB-3000), and applicable standards and LBNL policies. The purpose of the Biosafety Program is to protect workers, the public, agriculture, and the environment from exposure to biological agents or materials that may cause disease or other detrimental effects in humans, animals, or plants. This manual provides workers; line management; Environment, Health, and Safety (EH&S) Division staff; Institutional Biosafety Committee (IBC) members; and others with a comprehensive overview of biosafety principles, requirements from biosafety standards, and measures needed to control biological risks in work activities and facilities at LBNL.

  3. Operational facility-integrated computer system for safeguards

    International Nuclear Information System (INIS)

    Armento, W.J.; Brooksbank, R.E.; Krichinsky, A.M.

    1980-01-01

    A computer system for safeguards in an active, remotely operated, nuclear fuel processing pilot plant has been developed. This system maintains (1) comprehensive records of special nuclear materials, (2) automatically updated book inventory files, (3) material transfer catalogs, (4) timely inventory estimations, (5) sample transactions, (6) automatic, on-line volume balances and alarming, and (7) terminal access and applications software monitoring and logging. Future development will include near-real-time SNM mass balancing as both a static, in-tank summation and a dynamic, in-line determination. It is planned to incorporate aspects of site security and physical protection into the computer monitoring

  4. Computer program for source distribution process in radiation facility

    International Nuclear Information System (INIS)

    Al-Kassiri, H.; Abdul Ghani, B.

    2007-08-01

    A computer simulation of dose distribution has been written in Visual Basic, based on the arrangement and activities of the Co-60 sources. The program provides the dose distribution in treated products as a function of product density and desired dose, and is useful for optimizing the source distribution during the loading process. There is good agreement between the data calculated by the program and experimental data. (Author)
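
    A toy version of the underlying geometry calculation (invented activities and positions; production software of this kind also accounts for attenuation, build-up and product density): approximate each Co-60 pencil as a point source and sum inverse-square contributions at sample points in the product, then inspect dose uniformity.

      import numpy as np

      sources = np.array([[0.0, 0.0, z] for z in (-0.5, 0.0, 0.5)])   # source rack positions (m)
      activity = np.array([3.7e14, 3.7e14, 3.7e14])                   # Bq per source (hypothetical)
      gamma_const = 3.6e-13                                           # Gy*m^2/(h*Bq), approximate Co-60 value

      xs = np.linspace(0.5, 1.5, 5)                                   # product positions (m)
      points = np.array([[x, 0.0, z] for x in xs for z in (-0.4, 0.0, 0.4)])

      r2 = ((points[:, None, :] - sources[None, :, :]) ** 2).sum(axis=2)   # squared source-point distances
      dose_rate = (gamma_const * activity / r2).sum(axis=1)                # Gy/h at each sample point

      print(f"min {dose_rate.min():.1f} Gy/h, max {dose_rate.max():.1f} Gy/h, "
            f"uniformity ratio {dose_rate.max() / dose_rate.min():.2f}")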

  5. MELCOR computer code manuals: Primer and user's guides, Version 1.8.3 September 1994. Volume 1

    International Nuclear Information System (INIS)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the US Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users' Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package

  6. Estimating pressurized water reactor decommissioning costs: A user's manual for the PWR Cost Estimating Computer Program (CECP) software

    International Nuclear Information System (INIS)

    Bierschbach, M.C.; Mencinsky, G.J.

    1993-10-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit decommissioning plans and cost estimates to the US Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR plant stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning

  7. Fire Protection Program Manual

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, J A

    2012-05-18

    This manual documents the Lawrence Livermore National Laboratory (LLNL) Fire Protection Program. Department of Energy (DOE) Order 420.1B, Facility Safety, requires LLNL to have a comprehensive and effective fire protection program that protects LLNL personnel and property, the public and the environment. The manual provides LLNL and its facilities with general information and guidance for meeting DOE 420.1B requirements. The recommended readers for this manual are: fire protection officers, fire protection engineers, fire fighters, facility managers, directorate assurance managers, facility coordinators, and ES and H team members.

  8. BPACK -- A computer model package for boiler reburning/co-firing performance evaluations. User's manual, Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Wu, K.T.; Li, B.; Payne, R.

    1992-06-01

    This manual presents and describes a package of computer models uniquely developed for boiler thermal performance and emissions evaluations by the Energy and Environmental Research Corporation. The model package permits boiler heat transfer, fuels combustion, and pollutant emissions predictions related to a number of practical boiler operations such as fuel-switching, fuels co-firing, and reburning NOx reductions. The models are adaptable to most boiler/combustor designs and can handle burner fuels in solid, liquid, gaseous, and slurried forms. The models are also capable of performing predictions for combustion applications involving gaseous-fuel reburning, and co-firing of solid/gas, liquid/gas, gas/gas, slurry/gas fuels. The model package is conveniently named BPACK (Boiler Package) and consists of six computer codes, three of which are main computational codes and the other three are input codes. The three main codes are: (a) a two-dimensional furnace heat-transfer and combustion code; (b) a detailed chemical-kinetics code; and (c) a boiler convective passage code. This user's manual presents the computer model package in two volumes. Volume 1 describes in detail a number of topics which are of general users' interest, including the physical and chemical basis of the models, a complete description of the model applicability, options, input/output, and the default inputs. Volume 2 contains a detailed record of the worked examples to assist users in applying the models, and to illustrate the versatility of the codes.

  9. GO software version 3.0: Volume 1, Overview: Computer Code Manual

    International Nuclear Information System (INIS)

    1988-06-01

    The GO Methodology is a success-oriented probabilistic system performance analysis technique. The methodology can be used to quantify system reliability and availability, identify and rank critical components and the contributors to system failure, construct event trees, perform statistical uncertainty analysis, and evaluate the effects of external events and common cause failures on system performance. This Overview Manual provides a description of the GO methodology, how it can be used, the benefits of using it in the analysis of complex systems, and a comparison of the methodology with fault tree analysis. 14 refs., 35 figs., 9 tabs

  10. New FORTRAN computer programs to acquire and process isotopic mass spectrometric data: Operator's manual

    International Nuclear Information System (INIS)

    Smith, D.H.; McKown, H.S.

    1993-09-01

    This TM is one of a pair that describes ORNL-developed software for acquisition and processing of isotope ratio mass spectral data. This TM is directed at the laboratory analyst. No technical knowledge of the programs and programming is required. It describes how to create and edit files, how to acquire and process data, and how to set up files to obtain the desired results. The aim of this TM is to serve as a utilitarian instruction manual, a "how to" approach rather than a "why?"

  11. Demand and problems of computer-aided implementation of operating, emergency and quality assurance manuals

    International Nuclear Information System (INIS)

    Oppermann, W.; Rempe, W.; Boehme, P.

    1994-01-01

    Considerations and software solutions are presented for converting manuals for use in electronic data processing. It is demonstrated that consistent use of electronic data processing can contribute to reducing the probability of human failure in technical installations. However, the representation of information differs considerably from paper documentation, because the ergonomic visualization possibilities offered by data processing systems are exploited. In nuclear engineering in particular, several questions and problems arise in this context concerning the structuring of the content of electronic documentation and its verifiability by experts. (orig./DG) [de

  12. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, J; Sartirana, A

    2001-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on thei...

  13. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, Jose

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on the...

  14. Improving CMS data transfers among its distributed computing facilities

    International Nuclear Information System (INIS)

    Flix, J; Magini, N; Sartirana, A

    2011-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on their usage, customizing the topologies and improving their setup in order to keep CMS transferring data at the desired levels in a reliable and robust way.

  15. TME (Task Mapping Editor): tool for executing distributed parallel computing. TME user's manual

    International Nuclear Information System (INIS)

    Takemiya, Hiroshi; Yamagishi, Nobuhiro; Imamura, Toshiyuki

    2000-03-01

    At the Center for Promotion of Computational Science and Engineering, a software environment, PPExe, has been developed to support scientific computing on a parallel computer cluster (distributed parallel scientific computing). TME (Task Mapping Editor) is one of the components of PPExe and provides a visual programming environment for distributed parallel scientific computing. Users can specify data dependence among tasks (programs) visually as a data flow diagram and map these tasks onto computers interactively through the GUI of TME. The specified tasks are processed by other components of PPExe such as Meta-scheduler, RIM (Resource Information Monitor), and EMS (Execution Management System) according to the execution order of these tasks determined by TME. In this report, we describe the usage of TME. (author)
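
    The execution-ordering idea behind such a data-flow tool can be sketched with a topological sort (hypothetical task names, not TME's implementation): each task runs only after the tasks that produce its inputs.

      from graphlib import TopologicalSorter

      # hypothetical workflow: a preprocessing task feeds two solvers whose results are merged
      dependencies = {
          "solver_a":    {"preprocess"},
          "solver_b":    {"preprocess"},
          "postprocess": {"solver_a", "solver_b"},
      }

      order = list(TopologicalSorter(dependencies).static_order())
      print("execution order:", order)   # e.g. ['preprocess', 'solver_a', 'solver_b', 'postprocess']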

  16. A computational platform to maintain and migrate manual functional annotations for BioCyc databases.

    Science.gov (United States)

    Walsh, Jesse R; Sen, Taner Z; Dickerson, Julie A

    2014-10-12

    BioCyc databases are an important resource for information on biological pathways and genomic data. Such databases represent the accumulation of biological data, some of which has been manually curated from the literature. An essential feature of these databases is the continuing data integration as new knowledge is discovered. As functional annotations are improved, scalable methods are needed for curators to manage annotations without detailed knowledge of the specific design of the BioCyc database. We have developed CycTools, a software tool which allows curators to maintain functional annotations in a model organism database. This tool builds on existing software to improve and simplify imports of user-provided annotation data into BioCyc databases. Additionally, CycTools automatically resolves synonyms and alternate identifiers contained within the database into the appropriate internal identifiers. Automating steps in the manual data entry process can improve curation efforts for major biological databases. The functionality of CycTools is demonstrated by transferring GO term annotations from MaizeCyc to matching proteins in CornCyc, both maize metabolic pathway databases available at MaizeGDB, and by creating strain specific databases for metabolic engineering.
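
    The synonym-resolution step described above amounts to mapping user-supplied names onto internal identifiers through an alias table; a minimal sketch with invented identifiers (not CycTools' code or the BioCyc API):

      # internal frame ID -> known names and aliases (hypothetical entries)
      synonyms = {
          "G-1234": {"ZmPK1", "GRMZM2G000001", "protein kinase 1"},
          "G-5678": {"ZmADH1", "GRMZM2G442658"},
      }
      lookup = {name.lower(): frame for frame, names in synonyms.items() for name in names}

      user_annotations = [("zmadh1", "GO:0004022"), ("ZmPK1", "GO:0004672"), ("unknownX", "GO:0005515")]
      for name, go_term in user_annotations:
          frame = lookup.get(name.lower())
          print(f"{name:10s} -> {frame or 'UNRESOLVED':10s} {go_term}")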

  17. Implementation of the Facility Integrated Inventory Computer System (FICS)

    International Nuclear Information System (INIS)

    McEvers, J.A.; Krichinsky, A.M.; Layman, L.R.; Dunnigan, T.H.; Tuft, R.M.; Murray, W.P.

    1980-01-01

    This paper describes a computer system which has been developed for nuclear material accountability and implemented in an active radiochemical processing plant involving remote operations. The system possesses the following features: comprehensive, timely records of the location and quantities of special nuclear materials; automatically updated book inventory files on the plant and sub-plant levels of detail; material transfer coordination and cataloging; automatic inventory estimation; sample transaction coordination and cataloging; automatic on-line volume determination, limit checking, and alarming; extensive information retrieval capabilities; and terminal access and application software monitoring and logging

  18. EURDYN: computer programs for the nonlinear transient analysis of structures submitted to dynamic loading. EURDYN (Release 3): users' manual

    International Nuclear Information System (INIS)

    Halleux, J.P.

    1983-01-01

    The EURDYN computer codes are mainly designed for the simulation of nonlinear dynamic response of fast-reactor components submitted to impulse loading due to abnormal working conditions. Two releases of the structural computer codes EURDYN 01 (2-D beams and triangles and axisymmetric conical shells and triangular tores), 02 (axisymmetric and 2-D quadratic isoparametric elements) and 03 (triangular plate elements) have already been produced. They include material (elasto-plasticity using the classical flow theory approach) and geometrical (large displacements and rotations treated by a corotational technique) nonlinearities. The new features of Release 3 roughly consist of: full large strain capability for 9-node isoparametric elements, generalized array dimensions, introduction of the radial return algorithm for elasto-plastic material modelling, extension of the energy check facility to the case of prescribed displacements, and a possible interface to a post-processing package including time plot facilities

  19. Computer usage among nurses in rural health-care facilities in South Africa: obstacles and challenges.

    Science.gov (United States)

    Asah, Flora

    2013-04-01

    This study discusses factors inhibiting computer usage for work-related tasks among computer-literate professional nurses within rural healthcare facilities in South Africa. In the past two decades computer literacy courses have not been part of the nursing curricula. Computer courses are offered by the State Information Technology Agency. Despite this, there seems to be limited use of computers by professional nurses in the rural context. Focus group interviews were held with 40 professional nurses from three government hospitals in northern KwaZulu-Natal. Contributing factors were found to be a lack of information technology infrastructure, restricted access to computers, and deficits in technical and nursing management support. The physical location of computers within the health-care facilities and the lack of relevant software emerged as specific obstacles to usage. Provision of continuous and active support from nursing management could positively influence computer usage among professional nurses. A closer integration of information technology and computer literacy skills into existing nursing curricula would foster a positive attitude towards computer usage through early exposure. Responses indicated that a change of mindset may be needed on the part of nursing management so that they begin to actively promote ready access to computers as a means of creating greater professionalism and collegiality. © 2011 Blackwell Publishing Ltd.

  20. Development of computer model for radionuclide released from shallow-land disposal facility

    International Nuclear Information System (INIS)

    Suganda, D.; Sucipta; Sastrowardoyo, P.B.; Eriendi

    1998-01-01

    Development of a 1-dimensional computer model for radionuclide release from a shallow land disposal facility (SLDF) has been carried out. This computer model is used for the SLDF facility at PPTA Serpong. The SLDF facility lies 1.8 metres above the groundwater and 150 metres from the Cisalak river. An implicit finite-difference numerical method is chosen to predict the migration of radionuclides at any concentration. The migration starts vertically from the bottom of the SLDF down to the groundwater layer, then proceeds horizontally in the groundwater towards the critical population group. The radionuclide Cs-137 is chosen as a sample to trace its migration. The result of the assessment shows that the SLDF facility at PPTA Serpong meets high safety criteria. (author)
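
    For readers unfamiliar with the approach, the following is a minimal sketch (not the SLDF model itself) of an implicit finite-difference solution of 1-D advection-dispersion transport with retardation and decay; the grid, boundary conditions and parameter names are assumptions made only for illustration.

        import numpy as np

        def implicit_transport(c0, v, D, R, lam, dx, dt, steps):
            """Backward-Euler finite-difference solution of
            R*dc/dt = D*d2c/dx2 - v*dc/dx - lam*R*c  on a 1-D column,
            with fixed-concentration (Dirichlet) boundaries."""
            n = len(c0)
            A = np.zeros((n, n))
            for i in range(1, n - 1):
                A[i, i - 1] = -D * dt / dx**2 - v * dt / (2 * dx)
                A[i, i]     = R + 2 * D * dt / dx**2 + lam * R * dt
                A[i, i + 1] = -D * dt / dx**2 + v * dt / (2 * dx)
            A[0, 0] = A[-1, -1] = 1.0
            c = np.asarray(c0, dtype=float)
            for _ in range(steps):
                b = R * c
                b[0], b[-1] = c0[0], c0[-1]            # hold boundary concentrations
                c = np.linalg.solve(A, b)
            return c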

  1. Computer software configuration management plan for 200 East/West Liquid Effluent Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Graf, F.A. Jr.

    1995-02-27

    This computer software configuration management plan covers the control of the software for the monitor and control system that operates the Effluent Treatment Facility and its associated truck load-in station, as well as some key aspects of the Liquid Effluent Retention Facility, which stores condensate to be processed. The software also controls the Treated Effluent Disposal System's pumping stations and monitors waste generator flows in that system as well as in the Phase Two Effluent Collection System.

  2. Computer software configuration management plan for 200 East/West Liquid Effluent Facilities

    International Nuclear Information System (INIS)

    Graf, F.A. Jr.

    1995-01-01

    This computer software configuration management plan covers the control of the software for the monitor and control system that operates the Effluent Treatment Facility and its associated truck load-in station, as well as some key aspects of the Liquid Effluent Retention Facility, which stores condensate to be processed. The software also controls the Treated Effluent Disposal System's pumping stations and monitors waste generator flows in that system as well as in the Phase Two Effluent Collection System

  3. DESIGN OF MANUAL MATERIAL HANDLING SYSTEM THROUGH COMPUTER AIDED ERGONOMICS: A CASE STUDY AT BDTSC TEXTILE FIRM

    Directory of Open Access Journals (Sweden)

    Amare Matebu

    2014-12-01

    Designing lifting, pushing and pulling activities around the physical and physiological capabilities of the operators is essential. The purpose of this study is to analyze the manual material handling (MMH) working postures of operators using the 3D Static Strength Prediction Program (3DSSPP) software and to identify the major areas causing long-lasting injuries to operators. The research investigated the fit between the demands of tasks and the capabilities of operators. For the existing situation, the actual capabilities of operators were computed with the help of the 3DSSPP software and compared with NIOSH standards. Accordingly, the operators' working posture is in an unacceptable position that exposes them to musculoskeletal disorders. After improvement of the design of the MMH device (the cart's rollers), the results showed that the force required by the operators to push and pull the sliver cans was reduced from 931.77 Newtons to 194.23 Newtons. Furthermore, improvement of the MMH cart's rollers reduced the awkward postures of operators and the risk of musculoskeletal disorders. The improved manual material handling design also saves about 1828.40 ETB per month for the company.
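
    As a rough illustration of why better rollers lower push and pull forces, the steady-state force needed to keep a loaded cart rolling on level ground is approximately the rolling-resistance coefficient times the weight. The sketch below uses an assumed mass and assumed coefficients chosen only to reproduce the order of magnitude reported in the study; it is not the 3DSSPP calculation.

        def push_force(mass_kg, rolling_coeff, g=9.81):
            """Approximate steady push force (N) on level ground,
            ignoring acceleration, grade and handle geometry."""
            return rolling_coeff * mass_kg * g

        # Assumed values for illustration only:
        print(push_force(500, 0.19))   # ~932 N with stiff, worn rollers
        print(push_force(500, 0.04))   # ~196 N with free-running rollers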

  4. Inter-observer and inter-examination variability of manual vertebral bone attenuation measurements on computed tomography

    International Nuclear Information System (INIS)

    Pompe, Esther; Lammers, Jan-Willem J.; Jong, Pim A. de; Jong, Werner U. de; Takx, Richard A.P.; Eikendal, Anouk L.M.; Willemink, Martin J.; Mohamed Hoesein, Firdaus A.A.; Oudkerk, Matthijs; Budde, Ricardo P.J.

    2016-01-01

    To determine the inter-observer and inter-examination variability of manual attenuation measurements of the vertebrae in low-dose unenhanced chest computed tomography (CT). Three hundred and sixty-seven lung cancer screening trial participants who underwent baseline and repeat unenhanced low-dose CT after 3 months because of an indeterminate lung nodule were included. The CT attenuation value of the first lumbar vertebra (L1) was measured in all CTs by one observer to obtain inter-examination reliability. Six observers performed measurements in 100 randomly selected CTs to determine agreement, using limits of agreement and Bland-Altman plots, and reliability, using intraclass correlation coefficients (ICCs). Reclassification analyses were performed using a threshold of 110 HU to define osteoporosis. Inter-examination reliability was excellent, with an ICC of 0.92 (p < 0.001). Inter-examination limits of agreement ranged from -26 to 28 HU with a mean difference of 1 ± 14 HU. Inter-observer reliability ICCs ranged from 0.70 to 0.91. Inter-examination variability led to reclassification of 11.2 % of participants and inter-observer variability led to reclassification of 22.1 %. Vertebral attenuation values can be manually quantified with good to excellent inter-examination and inter-observer reliability on unenhanced low-dose chest CT. This information is valuable for early detection of osteoporosis on low-dose chest CT. (orig.)
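
    A minimal sketch of the two agreement statistics used above (Bland-Altman limits of agreement, and the fraction of participants reclassified at the 110 HU osteoporosis threshold) is given below; the function and variable names are illustrative, not taken from the study.

        import numpy as np

        def bland_altman(m1, m2):
            """Mean difference (bias) and 95% limits of agreement for paired measurements."""
            d = np.asarray(m1, float) - np.asarray(m2, float)
            bias, sd = d.mean(), d.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd

        def reclassified_fraction(m1, m2, threshold=110.0):
            """Fraction of subjects whose classification (attenuation < threshold) changes."""
            return np.mean((np.asarray(m1) < threshold) != (np.asarray(m2) < threshold))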

  5. Computer control and data acquisition system for the R.F. Test Facility

    International Nuclear Information System (INIS)

    Stewart, K.A.; Burris, R.D.; Mankin, J.B.; Thompson, D.H.

    1986-01-01

    The Radio Frequency Test Facility (RFTF) at Oak Ridge National Laboratory, used to test and evaluate high-power ion cyclotron resonance heating (ICRH) systems and components, is monitored and controlled by a multicomponent computer system. This data acquisition and control system consists of three major hardware elements: (1) an Allen-Bradley PLC-3 programmable controller; (2) a VAX 11/780 computer; and (3) a CAMAC serial highway interface. Operating in LOCAL as well as REMOTE mode, the programmable logic controller (PLC) performs all the control functions of the test facility. The VAX computer acts as the operator's interface to the test facility by providing color mimic panel displays and allowing input via a trackball device. The VAX also provides archiving of trend data acquired by the PLC. Communications between the PLC and the VAX are via the CAMAC serial highway. Details of the hardware, software, and the operation of the system are presented in this paper

  6. May Day: A computer code to perform uncertainty and sensitivity analysis. Manuals

    International Nuclear Information System (INIS)

    Bolado, R.; Alonso, A.; Moya, J.M.

    1996-07-01

    The computer program May Day was developed to carry out uncertainty and sensitivity analysis in the evaluation of radioactive waste storage. May Day was developed by the Polytechnic University of Madrid. (Author)

  7. Missouri Highway Safety Manual Recalibration

    Science.gov (United States)

    2018-05-01

    The Highway Safety Manual (HSM) is a national manual for analyzing the highway safety of various facilities, including rural roads, urban arterials, freeways, and intersections. The HSM was first published in 2010, and a 2014 supplement addressed fre...

  8. ATLAS experience with HEP software at the Argonne leadership computing facility

    International Nuclear Information System (INIS)

    Uram, Thomas D; LeCompte, Thomas J; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  9. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  10. Operational Circular nr 5 - October 2000 USE OF CERN COMPUTING FACILITIES

    CERN Multimedia

    Division HR

    2000-01-01

    New rules covering the use of CERN Computing facilities have been drawn up. All users of CERN’s computing facilities are subject to these rules, as well as to the subsidiary rules of use. The Computing Rules explicitly address your responsibility for taking reasonable precautions to protect computing equipment and accounts. In particular, passwords must not be easily guessed or obtained by others. Given the difficulty of completely separating work and personal use of computing facilities, the rules define under which conditions limited personal use is tolerated. For example, limited personal use of e-mail, news groups or web browsing is tolerated in your private time, provided CERN resources and your official duties are not adversely affected. The full conditions governing use of CERN’s computing facilities are contained in Operational Circular N° 5, which you are requested to read. Full details are available at : http://www.cern.ch/ComputingRules Copies of the circular are also available in the Divis...

  11. Computer security at ukrainian nuclear facilities: interface between nuclear safety and security

    International Nuclear Information System (INIS)

    Chumak, D.; Klevtsov, O.

    2015-01-01

    Active introduction of information technology, computer instrumentation and control systems (I and C systems) in the nuclear field leads to greater efficiency and better management of technological processes at nuclear facilities. However, this trend brings a number of challenges related to cyber-attacks on the above elements, which violate computer security as well as the nuclear safety and security of a nuclear facility. This paper considers regulatory support for computer security at nuclear facilities in Ukraine. The issue of computer and information security is considered in the context of physical protection, because it is an integral component of it. The paper focuses on the computer security of I and C systems important to nuclear safety. These systems are potentially vulnerable to cyber threats and, in case of cyber-attacks, the potential negative impact on normal operational processes can lead to a breach of nuclear facility security. Because ensuring the security of I and C systems interacts with nuclear safety, the paper considers an example of an integrated approach to the requirements of nuclear safety and security

  12. Hanford general employee training: Computer-based training instructor's manual

    Energy Technology Data Exchange (ETDEWEB)

    1990-10-01

    The Computer-Based Training portion of the Hanford General Employee Training course is designed to be used in a classroom setting with a live instructor. Future references to "this course" refer only to the computer-based portion of the whole. This course covers the basic Safety, Security, and Quality issues that pertain to all employees of Westinghouse Hanford Company. The topics that are covered were taken from the recommendations and requirements for General Employee Training as set forth by the Institute of Nuclear Power Operations (INPO) in INPO 87-004, Guidelines for General Employee Training, applicable US Department of Energy orders, and Westinghouse Hanford Company procedures and policy. Besides presenting fundamental concepts, this course also contains information on resources that are available to assist students. It does this using Interactive Videodisk technology, which combines computer-generated text and graphics with audio and video provided by a videodisk player.

  13. SHEAT: a computer code for probabilistic seismic hazard analysis, user's manual

    International Nuclear Information System (INIS)

    Ebisawa, Katsumi; Kondo, Masaaki; Abe, Kiyoharu; Tanaka, Toshiaki; Takani, Michio.

    1994-08-01

    The SHEAT code, developed at the Japan Atomic Energy Research Institute, is for probabilistic seismic hazard analysis, which is one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. Seismic hazard is defined as an annual exceedance frequency of occurrence of earthquake ground motions at various levels of intensity at a given site. With the SHEAT code, seismic hazard is calculated in the following two steps: (1) Modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modelled based on the historical earthquake records, active fault data and expert judgement. (2) Calculation of probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model, taking into account its standard deviation. Then the seismic hazard at the site is calculated by summing the frequencies of ground motions from all the earthquakes. This document is the user's manual of the SHEAT code. It includes: (1) outlines of the code, covering the overall concept, logical process, code structure, data files used and special characteristics of the code, (2) functions of subprograms and the analytical models in them, (3) guidance on input and output data, and (4) sample run results. The code has been widely used at JAERI to analyze seismic hazard at various nuclear power plant sites in Japan. (author)
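
    The two-step procedure described above reduces, in its simplest form, to summing the annual rates of the postulated earthquakes weighted by the probability that each produces ground motion above a given level. The sketch below is an illustrative simplification, not the SHEAT code; a lognormal attenuation scatter and a pre-computed median ground motion per source are assumptions.

        import numpy as np
        from scipy.stats import norm

        def hazard_curve(a_levels, sources, sigma_ln_a):
            """Annual exceedance frequency of ground motion for each level in a_levels.
            sources: iterable of (annual_rate, median_a) for postulated earthquakes,
            where median_a would come from an attenuation model."""
            a = np.asarray(a_levels, float)
            H = np.zeros_like(a)
            for rate, median_a in sources:
                p_exceed = 1.0 - norm.cdf(np.log(a / median_a) / sigma_ln_a)
                H += rate * p_exceed
            return H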

  14. User's manual of SECOM2: a computer code for seismic system reliability analysis

    International Nuclear Information System (INIS)

    Uchiyama, Tomoaki; Oikawa, Tetsukuni; Kondo, Masaaki; Tamura, Kazuo

    2002-03-01

    This report is the user's manual of seismic system reliability analysis code SECOM2 (Seismic Core Melt Frequency Evaluation Code Ver.2) developed at the Japan Atomic Energy Research Institute for systems reliability analysis, which is one of the tasks of seismic probabilistic safety assessment (PSA) of nuclear power plants (NPPs). The SECOM2 code has many functions such as: Calculation of component failure probabilities based on the response factor method, Extraction of minimal cut sets (MCSs), Calculation of conditional system failure probabilities for given seismic motion levels at the site of an NPP, Calculation of accident sequence frequencies and the core damage frequency (CDF) with use of the seismic hazard curve, Importance analysis using various indicators, Uncertainty analysis, Calculation of the CDF taking into account the effect of the correlations of responses and capacities of components, and Efficient sensitivity analysis by changing parameters on responses and capacities of components. These analyses require the fault tree (FT) representing the occurrence condition of the system failures and core damage, information about response and capacity of components and seismic hazard curve for the NPP site as inputs. This report presents the models and methods applied in the SECOM2 code and how to use those functions. (author)
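
    Conceptually, the core damage frequency is obtained by convolving the conditional core-damage probability with the seismic hazard curve. The following is a minimal numerical sketch of that final step only, with assumed array inputs on a common ground-motion grid; it is not the SECOM2 implementation.

        import numpy as np

        def core_damage_frequency(hazard, fragility):
            """CDF = -integral P_cd(a) dH(a), evaluated with a midpoint rule.
            hazard: annual exceedance frequencies H(a), monotonically decreasing;
            fragility: conditional core-damage probabilities P_cd(a) on the same grid."""
            dH = -np.diff(np.asarray(hazard, float))        # occurrence frequency per bin
            p_mid = 0.5 * (np.asarray(fragility)[:-1] + np.asarray(fragility)[1:])
            return float(np.sum(p_mid * dH))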

  15. User Delay Cost Model and Facilities Maintenance Cost Model for a Terminal Control Area : Volume 3. User's Manual and Program Documentation for the Facilities Maintenance Cost Model

    Science.gov (United States)

    1978-05-01

    The Facilities Maintenance Cost Model (FMCM) is an analytic model designed to calculate expected annual labor costs of maintenance within a given FAA maintenance sector. The model is programmed in FORTRAN IV and has been demonstrated on the CDC Krono...

  16. Evolution of facility layout requirements and CAD [computer-aided design] system development

    International Nuclear Information System (INIS)

    Jones, M.

    1990-06-01

    The overall configuration of the Superconducting Super Collider (SSC), including the infrastructure and land boundary requirements, was developed using a computer-aided design (CAD) system. The evolution of the facility layout requirements and the use of the CAD system are discussed. The emphasis has been on minimizing the amount of input required and maximizing the speed with which the output may be obtained. The computer system used to store the data is also described

  17. User's manual for the vertical axis wind turbine performance computer code DARTER

    Energy Technology Data Exchange (ETDEWEB)

    Klimas, P. C.; French, R. E.

    1980-05-01

    The computer code DARTER (DARrieus Turbine, Elemental Reynolds number) is an aerodynamic performance/loads prediction scheme based upon the conservation of momentum principle. It is the latest evolution in a sequence which began with a model developed by Templin of NRC, Canada and progressed through the Sandia National Laboratories-developed SIMOSS (SImple MOmentum, Single Streamtube) and DART (DARrieus Turbine) to DARTER.

  18. Manual of phosphoric acid fuel cell power plant cost model and computer program

    Science.gov (United States)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

    Cost analysis of a phosphoric acid fuel cell power plant includes two parts: a method for estimation of system capital costs, and an economic analysis which determines the levelized annual cost of operating the system used in the capital cost estimation. A FORTRAN computer program has been developed for this cost analysis.
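
    The levelized annual cost referred to above is conventionally obtained by converting the capital cost into an equivalent uniform annual charge with a capital recovery factor and adding the annual operating costs. The sketch below illustrates that standard calculation; it is not the NASA FORTRAN program, and the cost categories and numbers are assumptions used only for illustration.

        def capital_recovery_factor(i, n):
            """Annual charge per unit of capital for discount rate i over n years."""
            return i * (1 + i)**n / ((1 + i)**n - 1)

        def levelized_annual_cost(capital, i, n, annual_om, annual_fuel):
            """Levelized annual cost = annualized capital + O&M + fuel."""
            return capital * capital_recovery_factor(i, n) + annual_om + annual_fuel

        # Illustrative numbers only:
        print(levelized_annual_cost(capital=2.0e6, i=0.08, n=20, annual_om=5.0e4, annual_fuel=1.2e5))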

  19. Computer control versus manual control of systemic hypertension during cardiac surgery

    NARCIS (Netherlands)

    Hoeksel, S.A.A.P.; Blom, J.A.; Jansen, J.R.C.; Maessen, J.G.; Schreuder, J.J.

    2001-01-01

    Keywords: cardiac surgery; hypertension; closed-loop control. Background: We recently demonstrated the feasibility of computer-controlled infusion of vasoactive drugs for the control of systemic hypertension during cardiac surgery. The objective of the current study was to investigate the effects of

  20. The OSG Open Facility: an on-ramp for opportunistic scientific computing

    Science.gov (United States)

    Jayatilaka, B.; Levshina, T.; Sehgal, C.; Gardner, R.; Rynge, M.; Würthwein, F.

    2017-10-01

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  1. The OSG Open Facility: An On-Ramp for Opportunistic Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Jayatilaka, B. [Fermilab; Levshina, T. [Fermilab; Sehgal, C. [Fermilab; Gardner, R. [Chicago U.; Rynge, M. [USC - ISI, Marina del Rey; Würthwein, F. [UC, San Diego

    2017-11-22

    The Open Science Grid (OSG) is a large, robust computing grid that started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero, but has evolved in recent years to a much larger user and resource platform. In addition to meeting the US LHC community’s computational needs, the OSG continues to be one of the largest providers of distributed high-throughput computing (DHTC) to researchers from a wide variety of disciplines via the OSG Open Facility. The Open Facility consists of OSG resources that are available opportunistically to users other than resource owners and their collaborators. In the past two years, the Open Facility has doubled its annual throughput to over 200 million wall hours. More than half of these resources are used by over 100 individual researchers from over 60 institutions in fields such as biology, medicine, math, economics, and many others. Over 10% of these individual users utilized in excess of 1 million computational hours each in the past year. The largest source of these cycles is temporary unused capacity at institutions affiliated with US LHC computational sites. An increasing fraction, however, comes from university HPC clusters and large national infrastructure supercomputers offering unused capacity. Such expansions have allowed the OSG to provide ample computational resources to both individual researchers and small groups as well as sizable international science collaborations such as LIGO, AMS, IceCube, and sPHENIX. Opening up access to the Fermilab FabrIc for Frontier Experiments (FIFE) project has also allowed experiments such as mu2e and NOvA to make substantial use of Open Facility resources, the former with over 40 million wall hours in a year. We present how this expansion was accomplished as well as future plans for keeping the OSG Open Facility at the forefront of enabling scientific research by way of DHTC.

  2. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    Science.gov (United States)

    du Plessis, Anton; le Roux, Stephan Gerhard; Guelpa, Anina

    2016-10-01

    The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high-performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers, companies and also remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments, i.e. a micro-CT system as well as a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers with data analysis software packages, which are at the disposal of the facility users, along with expert supervision, if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility has accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT as means. This paper summarises the laboratory's first four years by way of selected examples, both from published and unpublished projects. In the process a detailed description of the capabilities and facilities available to users is presented.

  3. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Plessis, Anton du, E-mail: anton2@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa); Physics Department, Stellenbosch University, Stellenbosch (South Africa); Roux, Stephan Gerhard le, E-mail: lerouxsg@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa); Guelpa, Anina, E-mail: aninag@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa)

    2016-10-01

    The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high-performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers, companies and also remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments, i.e. a micro-CT system as well as a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers with data analysis software packages, which are at the disposal of the facility users, along with expert supervision, if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility has accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT as means. This paper summarises the laboratory’s first four years by way of selected examples, both from published and unpublished projects. In the process a detailed description of the capabilities and facilities available to users is presented.

  4. Software documentation and user's manual for fish-impingement sampling design and estimation method computer programs

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-11-01

    This report contains a description of three computer programs that implement the theory of sampling designs and the methods for estimating fish-impingement at the cooling-water intakes of nuclear power plants as described in companion report ANL/ES-60. Complete FORTRAN listings of these programs, named SAMPLE, ESTIMA, and SIZECO, are given and augmented with examples of how they are used

  5. Atmospheric dispersion calculation for postulated accident of nuclear facilities and the computer code: PANDA

    International Nuclear Information System (INIS)

    Kitahara, Yoshihisa; Kishimoto, Yoichiro; Narita, Osamu; Shinohara, Kunihiko

    1979-01-01

    Several calculation methods for the relative concentration (X/Q) and relative cloud-gamma dose (D/Q) of radioactive materials released from nuclear facilities in a postulated accident are presented. The procedure has been formulated as a computer program, PANDA, and its usage is explained. (author)
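
    The relative concentration X/Q for a postulated release is commonly evaluated with a Gaussian plume model. The sketch below shows that textbook formulation (with ground reflection), not the PANDA code itself; the dispersion parameters sigma_y and sigma_z are assumed to be supplied for the downwind distance of interest.

        import numpy as np

        def chi_over_q(y, z, u, sigma_y, sigma_z, H):
            """Gaussian-plume relative concentration X/Q (s/m^3) at crosswind offset y
            and height z, for wind speed u and effective release height H."""
            lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
            vertical = (np.exp(-(z - H)**2 / (2.0 * sigma_z**2)) +
                        np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))   # ground reflection term
            return lateral * vertical / (2.0 * np.pi * u * sigma_y * sigma_z)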

  6. Taking the classical large audience university lecture online using tablet computer and webconferencing facilities

    DEFF Research Database (Denmark)

    Brockhoff, Per B.

    2011-01-01

    During four offerings (September 2008 – May 2011) of the course 02402 Introduction to Statistics for Engineering students at DTU, with an average of 256 students, the lecturing was carried out 100% through a tablet computer combined with the web conferencing facility Adobe Connect (version 7...

  7. Reliability and reproducibility analysis of the Cobb angle and assessing sagittal plane by computer-assisted and manual measurement tools.

    Science.gov (United States)

    Wu, Weifei; Liang, Jie; Du, Yuanli; Tan, Xiaoyi; Xiang, Xuanping; Wang, Wanhong; Ru, Neng; Le, Jinbo

    2014-02-06

    Although many studies on the reliability and reproducibility of measurement have been performed for the coronal Cobb angle, few results on reliability and reproducibility are reported for sagittal alignment measurements that include the pelvis. We usually use SurgimapSpine software to measure the Cobb angle in our studies; however, there are no reports to date on the reliability and reproducibility of its measurements. Sixty-eight standard standing posteroanterior whole-spine radiographs were reviewed. Three examiners carried out the measurements independently, using manual measurement on X-ray radiographs and the SurgimapSpine software on the computer. Parameters measured included pelvic incidence, sacral slope, pelvic tilt, lumbar lordosis (LL), thoracic kyphosis, and coronal Cobb angle. SPSS 16.0 software was used for statistical analyses. The means, standard deviations, intraclass and interclass correlation coefficients (ICC), and 95% confidence intervals (CI) were calculated. There was no notable difference between the two tools (P = 0.21) for the coronal Cobb angle. For the sagittal plane parameters, the ICC of intraobserver reliability for the manual measures varied from 0.65 (T2-T5 angle) to 0.95 (LL angle), whereas for the SurgimapSpine tool the ICC ranged from 0.75 to 0.98. No significant difference in intraobserver reliability was found between the two measurements (P > 0.05). As for interobserver reliability, measurements with the SurgimapSpine tool had better ICCs (0.71 to 0.98 vs 0.59 to 0.96) and Pearson's coefficients (0.76 to 0.99 vs 0.60 to 0.97). The reliability of SurgimapSpine measures was significantly higher for all parameters except the coronal Cobb angle, where the difference was not significant (P > 0.05). Although the differences between the two methods are very small, the results of this study indicate that the SurgimapSpine measurement is an equivalent measuring tool to the traditional manual method for the coronal Cobb angle, but is advantageous in spino
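
    For reference, the intraclass correlation coefficient reported above can be computed from a two-way ANOVA decomposition. The sketch below implements the common ICC(2,1) single-measure form on an n-subjects-by-k-raters array; it is a generic illustration, not the SPSS procedure used in the study.

        import numpy as np

        def icc_2_1(ratings):
            """Two-way random-effects, absolute-agreement, single-measure ICC(2,1).
            ratings: array of shape (n_subjects, k_raters)."""
            Y = np.asarray(ratings, float)
            n, k = Y.shape
            subj = Y.mean(axis=1, keepdims=True)
            rater = Y.mean(axis=0, keepdims=True)
            grand = Y.mean()
            ms_subj  = k * np.sum((subj - grand)**2) / (n - 1)
            ms_rater = n * np.sum((rater - grand)**2) / (k - 1)
            ms_err   = np.sum((Y - subj - rater + grand)**2) / ((n - 1) * (k - 1))
            return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err + k * (ms_rater - ms_err) / n)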

  8. User's manual for QUERY: a computer program for retrieval of environmental data

    Energy Technology Data Exchange (ETDEWEB)

    Nyholm, R.A.

    1979-03-06

    QUERY is a computer program used for the retrieval of environmental data. The code was developed in support of the Imperial Valley Environmental Project of the Environmental Sciences division at Lawrence Livermore Laboratory to handle a multitude of environmentally related data. The program can run in either an interactive mode or a production mode to retrieve these data. In either case, the user specifies a set of search constraints and then selects an output format from a menu of output options or specifies the output format according to his immediate needs. Basic data statistics can be requested. Merging of disparate data bases and subfile extraction are also provided as elementary operations.

  9. Manual cross check of computed dose times for motorised wedged fields

    International Nuclear Information System (INIS)

    Porte, J.

    2001-01-01

    If a mass of tissue equivalent material is exposed in turn to wedged and open radiation fields of the same size, for equal times, it is incorrect to assume that the resultant isodose pattern will be effectively that of a wedge having half the angle of the wedged field. Computer programs have been written to address the problem of creating an intermediate wedge field, commonly known as a motorized wedge. The total exposure time is apportioned between the open and wedged fields, to produce a beam modification equivalent to that of a wedged field of a given wedge angle. (author)
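
    One common first-order hand check apportions the dose between the open and wedged segments using the ratio of wedge-angle tangents, then converts the wedged-segment dose into time (or monitor units) through the wedge transmission factor. The sketch below illustrates that approximation only; it is not the computer programs described in the paper, and all function and parameter names are assumptions.

        import math

        def wedged_dose_fraction(target_angle_deg, physical_wedge_angle_deg):
            """First-order approximation: fraction of the dose delivered through the
            wedged segment to mimic the requested effective wedge angle."""
            return (math.tan(math.radians(target_angle_deg)) /
                    math.tan(math.radians(physical_wedge_angle_deg)))

        def segment_times(total_open_equivalent_time, target_angle_deg,
                          physical_wedge_angle_deg, wedge_transmission):
            """Split the exposure into open and wedged segment times."""
            w = wedged_dose_fraction(target_angle_deg, physical_wedge_angle_deg)
            t_open = total_open_equivalent_time * (1.0 - w)
            t_wedge = total_open_equivalent_time * w / wedge_transmission
            return t_open, t_wedge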

  10. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II) user's manual

    International Nuclear Information System (INIS)

    David P. Colton

    2007-01-01

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time

  11. Comparison of Intensity-Modulated Radiotherapy Planning Based on Manual and Automatically Generated Contours Using Deformable Image Registration in Four-Dimensional Computed Tomography of Lung Cancer Patients

    International Nuclear Information System (INIS)

    Weiss, Elisabeth; Wijesooriya, Krishni; Ramakrishnan, Viswanathan; Keall, Paul J.

    2008-01-01

    Purpose: To evaluate the implications of differences between contours drawn manually and contours generated automatically by deformable image registration for four-dimensional (4D) treatment planning. Methods and Materials: In 12 lung cancer patients intensity-modulated radiotherapy (IMRT) planning was performed for both manual contours and automatically generated ('auto') contours in mid and peak expiration of 4D computed tomography scans, with the manual contours in peak inspiration serving as the reference for the displacement vector fields. Manual and auto plans were analyzed with respect to their coverage of the manual contours, which were assumed to represent the anatomically correct volumes. Results: Auto contours were on average larger than manual contours by up to 9%. Objective scores, D2% and D98% of the planning target volume, homogeneity and conformity indices, and coverage of normal tissue structures (lungs, heart, esophagus, spinal cord) at defined dose levels were not significantly different between plans (p = 0.22-0.94). Differences were statistically insignificant for the generalized equivalent uniform dose of the planning target volume (p = 0.19-0.94) and normal tissue complication probabilities for lung and esophagus (p = 0.13-0.47). Dosimetric differences >2% or >1 Gy were more frequent in patients with auto/manual volume differences ≥10% (p = 0.04). Conclusions: The applied deformable image registration algorithm produces clinically plausible auto contours in the majority of structures. At this stage clinical supervision of the auto contouring process is required, and manual interventions may become necessary. Before routine use, further investigations are required, particularly to reduce imaging artifacts

  12. Distributed project scheduling at NASA: Requirements for manual protocols and computer-based support

    Science.gov (United States)

    Richards, Stephen F.

    1992-01-01

    The increasing complexity of space operations and the inclusion of interorganizational and international groups in the planning and control of space missions lead to requirements for greater communication, coordination, and cooperation among mission schedulers. These schedulers must jointly allocate scarce shared resources among the various operational and mission oriented activities while adhering to all constraints. This scheduling environment is complicated by such factors as the presence of varying perspectives and conflicting objectives among the schedulers, the need for different schedulers to work in parallel, and limited communication among schedulers. Smooth interaction among schedulers requires the use of protocols that govern such issues as resource sharing, authority to update the schedule, and communication of updates. This paper addresses the development and characteristics of such protocols and their use in a distributed scheduling environment that incorporates computer-aided scheduling tools. An example problem is drawn from the domain of Space Shuttle mission planning.

  13. User's manual for computer code RIBD-II, a fission product inventory code

    International Nuclear Information System (INIS)

    Marr, D.R.

    1975-01-01

    The computer code RIBD-II is used to calculate inventories, activities, decay powers, and energy releases for the fission products generated in a fuel irradiation. Changes from the earlier RIBD code are: the expansion to include up to 850 fission product isotopes, input in the user-oriented NAMELIST format, and run-time choice of fuels from an extensively enlarged library of nuclear data. The library that is included in the code package contains yield data for 818 fission product isotopes for each of fourteen different fissionable isotopes, together with fission product transmutation cross sections for fast and thermal systems. Calculational algorithms are little changed from those in RIBD. (U.S.)
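
    The build-up and decay of a fission-product chain is governed by the Bateman equations. The fragment below is a generic matrix-exponential solution for a linear decay chain, offered only as an illustration of the underlying mathematics; it is not the RIBD-II algorithm and ignores production from fission and transmutation.

        import numpy as np
        from scipy.linalg import expm

        def decay_chain(n0, lambdas, t):
            """Atom inventories of a linear decay chain A -> B -> C ... after time t.
            n0: initial atoms, lambdas: decay constants (1/s), 100% branching assumed."""
            k = len(n0)
            A = np.zeros((k, k))
            for i in range(k):
                A[i, i] = -lambdas[i]
                if i > 0:
                    A[i, i - 1] = lambdas[i - 1]       # feed from the parent nuclide
            return expm(A * t) @ np.asarray(n0, float)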

  14. User's manual for the G.T.M.-1 computer code

    International Nuclear Information System (INIS)

    Prado-Herrero, P.

    1992-01-01

    This document describes the GTM-1 (Geosphere Transport Model, release 1) computer code and is intended to provide the reader with enough detailed information to use the code. GTM-1 was developed for the assessment of radionuclide migration by groundwater through geologic deposits whose properties can change along the pathway. GTM-1 solves the transport equation by the finite-difference method (Crank-Nicolson scheme). It was developed for specific use within Probabilistic System Assessment (PSA) Monte Carlo method codes; in this context the first application of GTM-1 was within the LISA (Long Term Isolation System Assessment) code. GTM-1 is also available as an independent model, which includes various submodels simulating a multi-barrier disposal system. The code has been tested with the PSACOIN (Probabilistic System Assessment Codes Intercomparison) benchmark exercises from the PSAC User Group (OECD/NEA). 10 refs., 6 Annex., 2 tabs

  15. RECON: a computer program for analyzing repository economics. Documentation and user's manual

    International Nuclear Information System (INIS)

    Clark, L.L.; Cole, B.M.; McNair, G.W.; Schutz, M.E.

    1983-05-01

    From 1981 through 1983 the Pacific Northwest Laboratory has been developing a computer model named RECON to calculate repository costs from parametric data input. The objective of the program has been to develop the capability to evaluate the effect on costs of changes in repository design parameters and operating scenario assumptions. This report documents the development of the model through March of 1983. Included in the report are: (1) descriptions of model development and the underlying equations, assumptions and definitions; (2) descriptions of data input either using card images or an interactive data input program; and (3) detailed listings of the program and definitions of program variables. Cost estimates generated using the model have been verified against independent estimates and good agreement has been obtained

  16. RECON: a computer program for analyzing repository economics. Documentation and user's manual. Revision 1

    International Nuclear Information System (INIS)

    Clark, L.L.; Schutz, M.E.; Luksic, A.T.

    1985-07-01

    From 1981 through 1984 the Pacific Northwest Laboratory has been developing a computer model named RECON to calculate repository costs from parametric data input. The objective of the program has been to develop the capability to evaluate the effect on costs of changes in repository design parameters and operating scenario assumptions. This report documents the development of the model through September of 1984. Included in the report are: (1) descriptions of model development and the underlying equations, assumptions and definitions; (2) descriptions of data input using either card images or an interactive data input program; and (3) detailed listings of the program and definitions of program variables. Cost estimates generated using the model have been verified against independent estimates and good agreement has been obtained. 2 refs

  17. Rotorcraft Flight Simulation Computer Program C81 with Datamap Interface, Volume 2. Programmer’s Manual

    Science.gov (United States)

    1981-10-01

    An overview of the computer program capabilities and the principal mathematical models incorporated in the program are given in Volume I of the documentation... blank card must still be placed in the appropriate place in the Model Data Set. For example, the mathematical model of a UH-1H would not need a wing...

  18. Users manual for the pursuit of the radiological status of the nuclear and radioactive facilities of the ININ

    International Nuclear Information System (INIS)

    Sotelo B, D.; Villarreal, J.E.

    1992-05-01

    This program consists of a database that tracks the radiation levels in laboratories and facilities that use radioactive material or generate ionizing radiation; measurements made in the different departments are entered into it for later analysis. (Author)

  19. Operations and Maintenance Manual for the Temporary Septic Holding Tank at the 100-C Remedial Action Restroom Facility

    International Nuclear Information System (INIS)

    Palmquist, C.A.

    1997-11-01

    The purpose of this document is to provide detailed information regarding the operations and maintenance of the septic holding tank system at the 100-C Remedial Action Restroom Facility. Specific information provided in this document includes the type and frequency of required maintenance and failure response procedures

  20. Operations and Maintenance Manual for the Temporary Septic Holding Tank at the 100-C Remedial Action Support Facility

    International Nuclear Information System (INIS)

    Palmquist, C.A.

    1997-12-01

    The purpose of this document is to provide detailed information regarding the operations and maintenance of the septic holding tank system at the 100-C Remedial Action Restroom Facility. Specific information provided in this document includes the type and frequency of required maintenance and failure response procedures

  1. Physical fitness training reference manual for security force personnel at fuel cycle facilities possessing formula quantities of special nuclear materials

    International Nuclear Information System (INIS)

    Arzino, P.A.; Caplan, C.S.; Goold, R.E.

    1991-09-01

    The recommendations contained throughout this NUREG are being provided to the Nuclear Regulatory Commission (NRC) as a reference manual which can be used by licensee management as they develop a program plan for the safe participation of guards, Tactical Response Team members (TRTs), and all other armed response personnel in physical fitness training and in physical performance standards testing. The information provided in this NUREG will help licensees to determine if guards, TRTs, and other armed response personnel can effectively perform their normal and emergency duties without undue hazard to themselves, to fellow employees, to the plant site, and to the general public. The recommendations in this NUREG are similar in part to those contained within the Department of Energy (DOE) Medical and Fitness Implementation Guide which was published in March 1991. The guidelines contained in this NUREG are not requirements, and compliance is not required. 25 refs

  2. Physical fitness training reference manual for security force personnel at fuel cycle facilities possessing formula quantities of special nuclear materials

    Energy Technology Data Exchange (ETDEWEB)

    Arzino, P.A.; Caplan, C.S.; Goold, R.E. (California State Univ., Hayward, CA (United States). Foundation)

    1991-09-01

    The recommendations contained throughout this NUREG are being provided to the Nuclear Regulatory Commission (NRC) as a reference manual which can be used by licensee management as they develop a program plan for the safe participation of guards, Tactical Response Team members (TRTs), and all other armed response personnel in physical fitness training and in physical performance standards testing. The information provided in this NUREG will help licensees to determine if guards, TRTs, and other armed response personnel can effectively perform their normal and emergency duties without undue hazard to themselves, to fellow employees, to the plant site, and to the general public. The recommendations in this NUREG are similar in part to those contained within the Department of Energy (DOE) Medical and Fitness Implementation Guide which was published in March 1991. The guidelines contained in this NUREG are not requirements, and compliance is not required. 25 refs.

  3. SAFSIM theory manual: A computer program for the engineering simulation of flow systems

    Energy Technology Data Exchange (ETDEWEB)

    Dobranich, D.

    1993-12-01

    SAFSIM (System Analysis Flow SIMulator) is a FORTRAN computer program for simulating the integrated performance of complex flow systems. SAFSIM provides sufficient versatility to allow the engineering simulation of almost any system, from a backyard sprinkler system to a clustered nuclear reactor propulsion system. In addition to versatility, speed and robustness are primary SAFSIM development goals. SAFSIM contains three basic physics modules: (1) a fluid mechanics module with flow network capability; (2) a structure heat transfer module with multiple convection and radiation exchange surface capability; and (3) a point reactor dynamics module with reactivity feedback and decay heat capability. Any or all of the physics modules can be implemented, as the problem dictates. SAFSIM can be used for compressible and incompressible, single-phase, multicomponent flow systems. Both the fluid mechanics and structure heat transfer modules employ a one-dimensional finite element modeling approach. This document contains a description of the theory incorporated in SAFSIM, including the governing equations, the numerical methods, and the overall system solution strategies.
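
    As an illustration of the point reactor dynamics module mentioned above, the sketch below integrates the standard one-delayed-group point kinetics equations for a constant reactivity step. It is a generic textbook model with assumed kinetics parameters, not the SAFSIM implementation (which also includes reactivity feedback and decay heat).

        from scipy.integrate import solve_ivp

        def point_kinetics(t_end, rho, beta=0.0065, Lambda=1.0e-4, lam=0.08):
            """One-delayed-group point kinetics: n = relative power, c = precursors.
            rho is the (constant) reactivity inserted at t = 0."""
            def rhs(t, y):
                n, c = y
                return [((rho - beta) / Lambda) * n + lam * c,
                        (beta / Lambda) * n - lam * c]
            y0 = [1.0, beta / (Lambda * lam)]          # critical steady state at t = 0
            sol = solve_ivp(rhs, (0.0, t_end), y0, method="LSODA", max_step=0.01)
            return sol.t, sol.y[0]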

  4. WISDAAM software programmer's manual

    International Nuclear Information System (INIS)

    Ball, J.R.

    1992-10-01

    The WISDAAM system was developed to provide quality control over test data associated with in situ testing at the Waste Isolation Pilot Plant (WIPP). Assurance of data quality is of critical importance as these tests supply the information which will be used for development and verification of the technology required for repository implementation. The amount of data collected from the tests, which are some of the largest ever fielded in an underground facility, prompted the undertaking of a major project task to address data processing. The goal was to create a conceptual umbrella under which all of the activities associated with processing WIPP data (i.e., data reduction, archiving, retrieval, etc.) could be grouped. The WISDAAM system was the product of this task. The overall system covers electronic as well as manual data processing; however, this document deals primarily with those operations implemented by software running on a VAX computer

  5. A users manual for a computer program which calculates time optical geocentric transfers using solar or nuclear electric and high thrust propulsion

    Science.gov (United States)

    Sackett, L. L.; Edelbaum, T. N.; Malchow, H. L.

    1974-01-01

    This manual is a guide for using a computer program which calculates time-optimal trajectories for high- and low-thrust geocentric transfers. Either SEP or NEP may be assumed, and a one- or two-impulse, fixed total delta-V, initial high-thrust phase may be included. A single impulse of specified delta-V may also be included after the low-thrust phase. The low-thrust phase utilizes equinoctial orbital elements to avoid the classical singularities and Kryloff-Boguliuboff averaging to help ensure more rapid computation. The program is written in FORTRAN 4 in double precision for use on an IBM 360 computer. The manual includes a description of the problem treated, input/output information, examples of runs, and source code listings.
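
    The equinoctial elements mentioned above remove the singularities of the classical set at zero eccentricity and zero inclination. The conversion below (the modified equinoctial variant) is a generic sketch for reference, not code from the program described in the manual.

        import math

        def classical_to_equinoctial(a, e, i, raan, argp, nu):
            """Classical elements (angles in radians) -> modified equinoctial elements."""
            p = a * (1.0 - e**2)                       # semi-latus rectum
            f = e * math.cos(argp + raan)
            g = e * math.sin(argp + raan)
            h = math.tan(i / 2.0) * math.cos(raan)
            k = math.tan(i / 2.0) * math.sin(raan)
            L = raan + argp + nu                       # true longitude
            return p, f, g, h, k, L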

  6. Money for Research, Not for Energy Bills: Finding Energy and Cost Savings in High Performance Computer Facility Designs

    Energy Technology Data Exchange (ETDEWEB)

    Drewmark Communications; Sartor, Dale; Wilson, Mark

    2010-07-01

    High-performance computing facilities in the United States consume an enormous amount of electricity, cutting into research budgets and challenging public- and private-sector efforts to reduce energy consumption and meet environmental goals. However, these facilities can greatly reduce their energy demand through energy-efficient design of the facility itself. Using a case study of a facility under design, this article discusses strategies and technologies that can be used to help achieve energy reductions.

  7. RAMONA-4B a computer code with three-dimensional neutron kinetics for BWR and SBWR system transient - user's manual

    International Nuclear Information System (INIS)

    Rohatgi, U.S.; Cheng, H.S.; Khan, H.J.; Mallen, A.N.; Neymotin, L.Y.

    1998-03-01

    This document is the User's Manual for the Boiling Water Reactor (BWR) and Simplified Boiling Water Reactor (SBWR) systems transient code RAMONA-4B. The code uses a three-dimensional neutron-kinetics model coupled with a multichannel, nonequilibrium, drift-flux, two-phase flow model of the thermal hydraulics of the reactor vessel. The code is designed to analyze a wide spectrum of BWR core and system transients. Chapter 1 gives an overview of the code's capabilities and limitations; Chapter 2 describes the code's structure, lists major subroutines, and discusses the computer requirements. Chapter 3 covers the code, auxiliary codes, and instructions for running RAMONA-4B on Sun SPARC and IBM workstations. Chapter 4 contains component descriptions and detailed card-by-card input instructions. Chapter 5 provides samples of the tabulated output for the steady-state and transient calculations and discusses the plotting procedures for the steady-state and transient calculations. Three appendices contain important user and programmer information: lists of plot variables (Appendix A), listings of the input deck for the sample problem (Appendix B), and a description of the plotting program PAD (Appendix C). 24 refs., 18 figs., 11 tabs

  8. User's manual of a supporting system for treatment planning in boron neutron capture therapy. JAERI computational dosimetry system

    International Nuclear Information System (INIS)

    Kumada, Hiroaki; Torii, Yoshiya

    2002-09-01

    Boron neutron capture therapy (BNCT) with an epithermal neutron beam is expected to treat malignant tumors located deep in the brain effectively. To perform epithermal-neutron-beam BNCT, it is indispensable to estimate the irradiation dose in the patient's brain in advance. Thus, the JAERI Computational Dosimetry System (JCDS), which can calculate the dose distributions in the brain, has been developed. JCDS is a software system that creates a three-dimensional head model of a patient from CT and MRI images, automatically generates an input data file for calculating the neutron flux and gamma-ray dose distribution in the brain with the Monte Carlo code MCNP, and displays the dose distribution on the head model for dosimetry using the MCNP calculation results. JCDS has several advantages: by processing CT and MRI data, which are medical images, a detailed three-dimensional model of the patient's head can be made easily. The three-dimensional head image can be edited to simulate the state of the head after surgical procedures such as skin flap opening and bone removal for the BNCT with craniotomy being performed in Japan. JCDS can also provide information to the Patient Setting System so that the patient can be set in the actual irradiation position swiftly and accurately. This report describes the basic design and dosimetry procedure, operation manual, and data and library structure for JCDS (ver.1.0). (author)

  9. User manual for version 4.3 of the Tripoli-4 Monte-Carlo method particle transport computer code

    International Nuclear Information System (INIS)

    Both, J.P.; Mazzolo, A.; Peneliau, Y.; Petit, O.; Roesslinger, B.

    2003-01-01

    This manual relates to Version 4.3 of the TRIPOLI-4 code. TRIPOLI-4 is a computer code simulating the transport of neutrons, photons, electrons and positrons. It can be used for radiation shielding calculations (long-distance propagation with flux attenuation in non-multiplying media) and neutronics calculations (fissile medium, criticality or sub-criticality basis). This makes it possible to calculate keff (for criticality), flux, currents, reaction rates and multi-group cross-sections. TRIPOLI-4 is a three-dimensional code that uses the Monte-Carlo method. It allows for a point-wise description of cross-sections in energy as well as multi-group homogenized cross-sections, and features two modes of geometrical representation: surface-based and combinatorial. The code uses cross-section libraries in ENDF/B format (such as JEF2-2, ENDF/B-VI and JENDL) for the point-wise description, and cross-sections in APOTRIM format (from the APOLLO2 code) or a format specific to TRIPOLI-4 for the multi-group description. (authors)
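
    As a toy illustration of the Monte Carlo transport principle (emphatically not TRIPOLI-4 or its physics), the sketch below tracks particles through a 1-D homogeneous slab with sampled free flights, isotropic scattering and absorption, and tallies the transmitted fraction; all cross-section values and names are assumptions.

        import math, random

        def slab_transmission(sigma_t, sigma_s, thickness, histories=100_000, seed=1):
            """Fraction of particles transmitted through a slab of given thickness (cm),
            total cross-section sigma_t (1/cm) and scattering cross-section sigma_s."""
            random.seed(seed)
            transmitted = 0
            for _ in range(histories):
                x, mu = 0.0, 1.0                       # start at the surface, moving inward
                while True:
                    x += mu * (-math.log(1.0 - random.random()) / sigma_t)   # free flight
                    if x >= thickness:
                        transmitted += 1
                        break
                    if x < 0.0:
                        break                          # leaked back out of the entry face
                    if random.random() < sigma_s / sigma_t:
                        mu = random.uniform(-1.0, 1.0) # isotropic scatter
                        if mu == 0.0:
                            mu = 1e-9
                    else:
                        break                          # absorbed
            return transmitted / histories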

  10. Computer Program for Calculation of Complex Chemical Equilibrium Compositions and Applications II. Users Manual and Program Description. 2; Users Manual and Program Description

    Science.gov (United States)

    McBride, Bonnie J.; Gordon, Sanford

    1996-01-01

    This users manual is the second part of a two-part report describing the NASA Lewis CEA (Chemical Equilibrium with Applications) program. The program obtains chemical equilibrium compositions of complex mixtures with applications to several types of problems. The topics presented in this manual are: (1) details for preparing input data sets; (2) a description of output tables for various types of problems; (3) the overall modular organization of the program with information on how to make modifications; (4) a description of the function of each subroutine; (5) error messages and their significance; and (6) a number of examples that illustrate various types of problems handled by CEA and that cover many of the options available in both input and output. Seven appendixes give information on the thermodynamic and thermal transport data used in CEA; some information on common variables used in or generated by the equilibrium module; and output tables for 14 example problems. The CEA program was written in ANSI standard FORTRAN 77. CEA should work on any system with sufficient storage. There are about 6300 lines in the source code, which uses about 225 kilobytes of memory. The compiled program takes about 975 kilobytes.

  11. Specialized, multi-user computer facility for the high-speed, interactive processing of experimental data

    International Nuclear Information System (INIS)

    Maples, C.C.

    1979-01-01

    A proposal has been made to develop a specialized computer facility specifically designed to deal with the problems associated with the reduction and analysis of experimental data. Such a facility would provide a highly interactive, graphics-oriented, multi-user environment capable of handling relatively large data bases for each user. By conceptually separating the general problem of data analysis into two parts, cyclic batch calculations and real-time interaction, a multi-level, parallel processing framework may be used to achieve high-speed data processing. In principle such a system should be able to process a mag tape equivalent of data, through typical transformations and correlations, in under 30 sec. The throughput for such a facility, assuming five users simultaneously reducing data, is estimated to be 2 to 3 times greater than is possible, for example, on a CDC7600

  12. Specialized, multi-user computer facility for the high-speed, interactive processing of experimental data

    International Nuclear Information System (INIS)

    Maples, C.C.

    1979-05-01

    A proposal has been made at LBL to develop a specialized computer facility specifically designed to deal with the problems associated with the reduction and analysis of experimental data. Such a facility would provide a highly interactive, graphics-oriented, multi-user environment capable of handling relatively large data bases for each user. By conceptually separating the general problem of data analysis into two parts, cyclic batch calculations and real-time interaction, a multilevel, parallel processing framework may be used to achieve high-speed data processing. In principle such a system should be able to process a mag tape equivalent of data through typical transformations and correlations in under 30 s. The throughput for such a facility, for five users simultaneously reducing data, is estimated to be 2 to 3 times greater than is possible, for example, on a CDC7600. 3 figures
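
    The two records above propose splitting data analysis into cyclic batch calculations and real-time interaction. The sketch below illustrates that decomposition only in a generic way: a worker pool performs the batch transformations on blocks of data while the calling, interactive layer submits work and inspects results. The transformation and block sizes are arbitrary; this is not the proposed LBL design.

```python
from multiprocessing import Pool

import numpy as np

def batch_transform(block: np.ndarray) -> float:
    """Cyclic batch part: a stand-in for a typical transformation or correlation."""
    centred = block - block.mean()
    return float(np.sqrt((centred ** 2).mean()))   # e.g. an RMS value per block

def interactive_session(blocks):
    """Real-time part: dispatch blocks to the worker pool, then examine results."""
    with Pool() as pool:
        return pool.map(batch_transform, blocks)

if __name__ == "__main__":
    # Stand-in for the contents of a data tape, split into blocks.
    blocks = [np.random.rand(100_000) for _ in range(8)]
    results = interactive_session(blocks)
    print([round(r, 4) for r in results])
```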

  13. Health workers' knowledge of and attitudes towards computer applications in rural African health facilities.

    Science.gov (United States)

    Sukums, Felix; Mensah, Nathan; Mpembeni, Rose; Kaltschmidt, Jens; Haefeli, Walter E; Blank, Antje

    2014-01-01

    The QUALMAT (Quality of Maternal and Prenatal Care: Bridging the Know-do Gap) project has introduced an electronic clinical decision support system (CDSS) for pre-natal and maternal care services in rural primary health facilities in Burkina Faso, Ghana, and Tanzania. To assess health providers' computer knowledge, experience, and attitudes prior to the implementation of the QUALMAT electronic CDSS, a cross-sectional study was conducted with providers in 24 QUALMAT project sites. Information was collected using structured questionnaires. Chi-squared tests and one-way ANOVA describe the association between computer knowledge, attitudes, and other factors. Semi-structured interviews and focus groups were conducted to gain further insights. A total of 108 providers responded, 63% from Tanzania and 37% from Ghana. The mean age was 37.6 years, and 79% were female. Only 40% had ever used computers, and 29% had prior computer training. About 80% were computer illiterate or beginners. Educational level, age, and years of work experience were significantly associated with computer knowledge, and attitudes towards the use of computers in the workplace were largely positive. Given the low levels of computer knowledge among rural health workers in Africa, it is important to provide adequate training and support to ensure the successful uptake of electronic CDSSs in these settings. The positive attitudes to computers found in this study underscore that rural care providers, too, are ready to use such technology.

  14. Development of the computer code to monitor gamma radiation in the nuclear facility environment

    International Nuclear Information System (INIS)

    Akhmad, Y. R.; Pudjiyanto, M.S.

    1998-01-01

    Computer codes for gamma radiation monitoring in the vicinity of a nuclear facility have been developed and can be coupled to a commercial portable gamma analyzer. The crucial stage of the first-year activity was completed: the codes have been tested to transfer data files (pulse-height distributions) from the Micro NOMAD gamma spectrometer (ORTEC product) and then convert them into dosimetric and physical quantities. These computer codes are called GABATAN (Gamma Analyzer of Batan) and NAGABAT (Natural Gamma Analyzer of Batan). The GABATAN code can be used at various nuclear facilities for analyzing gamma fields up to 9 MeV, while NAGABAT can be used for analyzing the contribution of natural gamma rays to the exposure rate at a given location

  15. Computer program for storage of historical and routine safety data related to radiologically controlled facilities

    International Nuclear Information System (INIS)

    Marsh, D.A.; Hall, C.J.

    1984-01-01

    A method for tracking and quickly retrieving the radiological status of radiation and industrial safety systems in an active or inactive facility has been developed. The system uses a minicomputer, a graphics plotter, and mass storage devices. Software has been developed which allows input and storage of architectural details, radiological conditions such as exposure rates, current locations of safety systems, and routine and historical information on exposure and contamination levels. A blueprint-size digitizer is used for input. The computer program retains facility floor plans in three-dimensional arrays. The software accesses an eight-pen color plotter for output. The plotter generates color plots of the floor plans and safety systems on 8 1/2 x 11 or 20 x 30 paper or on overhead transparencies for reports and presentations

  16. Maintenance of reactor safety and control computers at a large government facility

    International Nuclear Information System (INIS)

    Brady, H.G.

    1985-01-01

    In 1950 the US Government contracted the Du Pont Company to design, build, and operate the Savannah River Plant (SRP). At the time, it was the largest construction project ever undertaken by man. It is still the largest of the Department of Energy facilities. In the nearly 35 years that have elapsed, Du Pont has met its commitments to the US Government and set world safety records in the construction and operation of nuclear facilities. Contributing factors in achieving production goals and setting the safety records are a staff of highly qualified personnel, a well maintained plant, and sound maintenance programs. There have been many 'first ever' achievements at SRP. These 'firsts' include: (1) computer control of a nuclear reactor, and (2) use of computer systems as safety circuits. This presentation discusses the maintenance program provided for these computer systems and all digital systems at SRP. An in-house computer maintenance program that was started in 1966 with five persons has grown to a staff of 40, with investments in computer hardware increasing from $4 million in 1970 to more than $60 million in this decade. 4 figs

  17. Opportunities for artificial intelligence application in computer- aided management of mixed waste incinerator facilities

    International Nuclear Information System (INIS)

    Rivera, A.L.; Ferrada, J.J.; Singh, S.P.N.

    1992-01-01

    The Department of Energy/Oak Ridge Field Office (DOE/OR) operates a mixed waste incinerator facility at the Oak Ridge K-25 Site. It is designed for the thermal treatment of incinerable liquid, sludge, and solid waste regulated under the Toxic Substances Control Act (TSCA) and the Resource Conservation and Recovery Act (RCRA). This facility, known as the TSCA Incinerator, services seven DOE/OR installations. The incinerator was recently authorized for production operation in the United States for the processing of mixed (radioactively contaminated, chemically hazardous) wastes as regulated under TSCA and RCRA. Operation of the TSCA Incinerator is highly constrained as a result of regulatory, institutional, technical, and resource-availability requirements. These requirements impact the characteristics and disposition of incinerator residues, limit the quality of liquid and gaseous effluents, limit the characteristics and rates of waste feeds and operating conditions, and restrict the handling of the waste feed inventories. This incinerator facility presents an opportunity for applying computer technology as a technical resource for mixed waste incinerator operation, to facilitate promoting and sustaining a continuous performance-improvement process while demonstrating compliance. Demonstrated computer-aided management systems could be transferred to future mixed waste incinerator facilities

  18. Automation of a cryogenic facility by commercial process-control computer

    International Nuclear Information System (INIS)

    Sondericker, J.H.; Campbell, D.; Zantopp, D.

    1983-01-01

    To ensure that Brookhaven's superconducting magnets are reliable and their field quality meets accelerator requirements, each magnet is pre-tested at operating conditions after construction. MAGCOOL, the production magnet test facility, was designed to perform these tests, having the capacity to test ten magnets per five-day week. This paper describes the control aspects of MAGCOOL and the advantages afforded the designers by the implementation of a commercial process control computer system

  19. TJ-II Library Manual (Version 2)

    International Nuclear Information System (INIS)

    Tribaldos, V.; Milligen, B. Ph. van; Lopez-Fraguas, A.

    2001-01-01

    This is a manual for the use of the TJ2 Numerical Library, which has been developed for numerical computations of different TJ-II configurations. It is a new version of the earlier manual, CIEMAT report 806. (Author)

  20. Operations and maintenance manual for the temporary septic holding tank at the 300-FF-1 Remedial Action Support Facility

    International Nuclear Information System (INIS)

    Gilkeson, D.E.; Jackson, G.J.

    1997-02-01

    This document provides detailed information regarding the operation and maintenance of the septic holding tank system at the 300-FF-1 Remedial Action Support Facility, located in the 300 Area. It includes the type and frequency of required maintenance, failure response procedures, and reporting requirements. Sanitary wastewater and raw sewage enter the holding tank via a sloped 102 mm polyvinyl chloride (PVC) line from the office trailers. The septic holding tank will be emptied as required by system demands. During normal usage, it is estimated that the tank will require pumping every 3 working days; approximately 834 gallons of sanitary wastewater and raw sewage will be disposed of into the septic system during this time

  1. A Computer Simulation to Assess the Nuclear Material Accountancy System of a MOX Fuel Fabrication Facility

    International Nuclear Information System (INIS)

    Portaix, C.G.; Binner, R.; John, H.

    2015-01-01

    SimMOX is a computer programme that simulates container histories as they pass through a MOX facility. It performs two parallel calculations: the first quantifies the actual movements of material that might be expected to occur, given certain assumptions about, for instance, the accumulation of material and waste and their subsequent treatment; the second quantifies the same movements on the basis of the operator's perception of the quantities involved, that is, on assumptions about the quantities contained in the containers. Separate skeletal Excel programmes are provided, which can be configured to generate further accountancy results based on these two parallel calculations. SimMOX is flexible in that it makes few assumptions about the order and operational performance of the individual activities that might take place at each stage of the process. It is able to do this because its focus is on material flows, not on the performance of individual processes. Similarly, there are no preconceptions about the different types of containers that might be involved. At the macroscopic level, the simulation takes steady operation as its base case, i.e., the same quantity of material is deemed to enter and leave the simulated area over any given period. Transient situations can then be superimposed onto this base scene by simulating them as operational incidents. A general facility has been incorporated into SimMOX to enable the user to create an 'act of a play' based on a number of operational incidents that have been built into the programme. By doing this, a simulation can be constructed that predicts the way the facility would respond to any number of transient activities. This computer programme can help assess the nuclear material accountancy system of a MOX fuel fabrication facility, for instance the implications of applying NRTA (near real time accountancy). (author)
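
    The abstract describes two parallel calculations: the actual movement of material and the operator's book values for the same containers. The sketch below illustrates, with made-up masses and measurement errors, how an accountancy check might compare the two streams through a simple material balance (material unaccounted for, MUF). It shows the bookkeeping idea only, not SimMOX's internals or its Excel interface.

```python
import random

def simulate_containers(n: int = 50, nominal_kg: float = 2.0):
    """Return (actual, declared) masses per container, with assumed spreads."""
    actual, declared = [], []
    for _ in range(n):
        true_mass = random.gauss(nominal_kg, 0.02)       # what is really in the container
        measured = true_mass + random.gauss(0.0, 0.01)   # what the operator books
        actual.append(true_mass)
        declared.append(measured)
    return actual, declared

actual, declared = simulate_containers()
# Simplest form of the balance: book inventory minus physical inventory.
muf = sum(declared) - sum(actual)
print(f"book inventory     {sum(declared):8.3f} kg")
print(f"physical inventory {sum(actual):8.3f} kg")
print(f"MUF                {muf:+8.3f} kg")
```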

  2. SNAP operating system reference manual

    International Nuclear Information System (INIS)

    Sabuda, J.D.; Polito, J.; Walker, J.L.; Grant, F.H. III.

    1982-03-01

    The SNAP Operating System (SOS) is a FORTRAN 77 program which provides assistance to the safeguards analyst who uses the Safeguards Automated Facility Evaluation (SAFE) and the Safeguards Network Analysis Procedure (SNAP) techniques. Features offered by SOS are a data base system for storing a library of SNAP applications, computer graphics representation of SNAP models, a computer graphics editor to develop and modify SNAP models, a SAFE-to-SNAP interface, automatic generation of SNAP input data, and a computer graphic post-processor for SNAP. The SOS Reference Manual provides detailed application information concerning SOS as well as a detailed discussion of all SOS components and their associated command input formats. SOS was developed for the US Nuclear Regulatory Commission's Office of Nuclear Regulatory Research and the US Naval Surface Weapons Center by Pritsker and Associates, Inc., under contract to Sandia National Laboratories

  3. Integration of distributed plant process computer systems to nuclear power generation facilities

    International Nuclear Information System (INIS)

    Bogard, T.; Finlay, K.

    1996-01-01

    Many operating nuclear power generation facilities are replacing their plant process computer. Such replacement projects are driven by equipment obsolescence issues and associated objectives to improve plant operability, increase plant information access, improve man-machine interface characteristics, and reduce operation and maintenance costs. This paper describes a few recently completed and on-going replacement projects with emphasis upon the application of integrated distributed plant process computer systems. By presenting a few recent projects, the variations of distributed systems design show how various configurations can address needs for flexibility, open architecture, and integration of technological advancements in instrumentation and control technology. Architectural considerations for optimal integration of the plant process computer and plant process instrumentation and control are evident from variations of design features

  4. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

    This user's manual contains all the necessary information concerning the use of the SEVERO code. This computer code deals with the statistics of extremes: extreme winds, extreme precipitation, and flooding hazard risk analysis. (A.C.A.S.)

  5. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Allcock, William; Beggio, Chris; Campbell, Stuart; Cherry, Andrew; Cholia, Shreyas; Dart, Eli; England, Clay; Fahey, Tim; Foertter, Fernanda; Goldstone, Robin; Hick, Jason; Karelitz, David; Kelly, Kaki; Monroe, Laura; Prabhat; Skinner, David; White, Julia

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Computing Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  6. A personal computer code for seismic evaluations of nuclear power plant facilities

    International Nuclear Information System (INIS)

    Xu, J.; Graves, H.

    1991-01-01

    In the process of reviewing and evaluating licensing issues related to nuclear power plants, it is essential to understand the behavior of seismic loading, foundation and structural properties and their impact on the overall structural response. In most cases, such knowledge can be obtained by using simplified engineering models which, when properly implemented, capture the essential parameters describing the physics of the problem. Such models do not require execution on large computer systems and can be implemented through a personal computer (PC) based capability. Recognizing the need for a PC software package that can perform the structural response computations required for typical licensing reviews, the US Nuclear Regulatory Commission sponsored the development of the PC-operated computer software package CARES (Computer Analysis for Rapid Evaluation of Structures). This development was undertaken by Brookhaven National Laboratory (BNL) during FY 1988 and 1989. A wide range of computer programs and modeling approaches are often used to justify the safety of nuclear power plants. It is often difficult to assess the validity and accuracy of the results submitted by various utilities without developing comparable computer solutions. Taking this into consideration, CARES is designed as an integrated computational system which can perform rapid evaluations of structural behavior and examine the capability of nuclear power plant facilities; thus CARES may be used by the NRC to determine the validity and accuracy of analysis methodologies employed for structural safety evaluations of nuclear power plants. CARES has been designed to operate on a PC, have a user-friendly input/output interface, and have quick turnaround. This paper describes the various features which have been implemented into the seismic module of CARES version 1.0
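
    CARES itself is not reproduced here; as a stand-in for the kind of simplified seismic-response model the abstract refers to, the sketch below integrates a damped single-degree-of-freedom oscillator driven by a ground-acceleration record using the Newmark average-acceleration method. The frequency, damping ratio, and synthetic ground motion are arbitrary choices for the example.

```python
import numpy as np

def newmark_sdof(accel_g, dt, freq_hz=4.0, damping=0.05):
    """Relative displacement response of a damped SDOF oscillator to ground
    acceleration accel_g (m/s^2), via the Newmark average-acceleration method."""
    w = 2.0 * np.pi * freq_hz
    m, c, k = 1.0, 2.0 * damping * w, w ** 2            # unit mass
    beta, gamma = 0.25, 0.5
    u, v = 0.0, 0.0
    a = -accel_g[0]                                     # initial acceleration
    k_eff = k + gamma / (beta * dt) * c + m / (beta * dt ** 2)
    disp = np.zeros(len(accel_g))
    for i, ag in enumerate(accel_g[1:], start=1):
        p_eff = (-m * ag
                 + m * (u / (beta * dt ** 2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
                 + c * (gamma * u / (beta * dt) + (gamma / beta - 1) * v
                        + dt * (gamma / (2 * beta) - 1) * a))
        u_new = p_eff / k_eff
        v_new = (gamma / (beta * dt) * (u_new - u) + (1 - gamma / beta) * v
                 + dt * (1 - gamma / (2 * beta)) * a)
        a_new = (u_new - u) / (beta * dt ** 2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
        u, v, a = u_new, v_new, a_new
        disp[i] = u
    return disp

dt = 0.01
t = np.arange(0.0, 10.0, dt)
quake = 0.2 * 9.81 * np.sin(2.0 * np.pi * 2.0 * t)      # toy 2 Hz ground motion
print(f"peak relative displacement: {np.abs(newmark_sdof(quake, dt)).max():.4f} m")
```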

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  8. RELAP4/MOD5: a computer program for transient thermal-hydraulic analysis of nuclear reactors and related systems. User's manual. Volume II. Program implementation

    International Nuclear Information System (INIS)

    1976-09-01

    This portion of the RELAP4/MOD5 User's Manual presents the details of setting up and entering the reactor model to be evaluated. The input card format and arrangement are presented in depth, including not only cards for data but also those for editing and restarting. Problem initialization, including pressure distribution and energy balance, is discussed. A section entitled ''User Guidelines'' is included to provide modeling recommendations, analysis and verification techniques, and resolution of computational difficulties. The section concludes with a discussion of the computer output form and format

  9. CSTEM User Manual

    Science.gov (United States)

    Hartle, M.; McKnight, R. L.

    2000-01-01

    This manual is a combination of a user manual, theory manual, and programmer manual. The reader is assumed to have some previous exposure to the finite element method. This manual is written with the idea that the CSTEM (Coupled Structural Thermal Electromagnetic-Computer Code) user needs to have a basic understanding of what the code is actually doing in order to properly use the code. For that reason, the underlying theory and methods used in the code are described to a basic level of detail. The manual gives an overview of the CSTEM code: how the code came into existence, a basic description of what the code does, and the order in which it happens (a flowchart). Appendices provide a listing and very brief description of every file used by the CSTEM code, including the type of file it is, what routine regularly accesses the file, and what routine opens the file, as well as special features included in CSTEM.

  10. Conceptual design of an ALICE Tier-2 centre. Integrated into a multi-purpose computing facility

    Energy Technology Data Exchange (ETDEWEB)

    Zynovyev, Mykhaylo

    2012-06-29

    This thesis discusses the issues and challenges associated with the design and operation of a data analysis facility for a high-energy physics experiment at a multi-purpose computing centre. In the spotlight is a Tier-2 centre of the distributed computing model of the ALICE experiment at the Large Hadron Collider at CERN in Geneva, Switzerland. The design steps examined in the thesis include analysis and optimization of the I/O access patterns of the user workload, integration of the storage resources, and development of techniques for effective system administration and operation of the facility in a shared computing environment. A number of I/O access performance issues on multiple levels of the I/O subsystem, introduced by the use of hard disks for data storage, have been addressed by means of exhaustive benchmarking and thorough analysis of the I/O of the user applications in the ALICE software framework. Defining the set of requirements for the storage system, describing the potential performance bottlenecks and single points of failure, and examining possible ways to avoid them allows one to develop guidelines for selecting how to integrate the storage resources. A solution for preserving a specific software stack for the experiment in a shared environment is presented along with its effects on user workload performance. The proposal of a flexible model to deploy and operate the ALICE Tier-2 infrastructure and applications in a virtual environment through adoption of cloud computing technology and the 'Infrastructure as Code' concept completes the thesis. Scientific software applications can be efficiently computed in a virtual environment, and there is an urgent need to adapt the infrastructure for effective usage of cloud resources.

  11. Conceptual design of an ALICE Tier-2 centre. Integrated into a multi-purpose computing facility

    International Nuclear Information System (INIS)

    Zynovyev, Mykhaylo

    2012-01-01

    This thesis discusses the issues and challenges associated with the design and operation of a data analysis facility for a high-energy physics experiment at a multi-purpose computing centre. In the spotlight is a Tier-2 centre of the distributed computing model of the ALICE experiment at the Large Hadron Collider at CERN in Geneva, Switzerland. The design steps examined in the thesis include analysis and optimization of the I/O access patterns of the user workload, integration of the storage resources, and development of techniques for effective system administration and operation of the facility in a shared computing environment. A number of I/O access performance issues on multiple levels of the I/O subsystem, introduced by the use of hard disks for data storage, have been addressed by means of exhaustive benchmarking and thorough analysis of the I/O of the user applications in the ALICE software framework. Defining the set of requirements for the storage system, describing the potential performance bottlenecks and single points of failure, and examining possible ways to avoid them allows one to develop guidelines for selecting how to integrate the storage resources. A solution for preserving a specific software stack for the experiment in a shared environment is presented along with its effects on user workload performance. The proposal of a flexible model to deploy and operate the ALICE Tier-2 infrastructure and applications in a virtual environment through adoption of cloud computing technology and the 'Infrastructure as Code' concept completes the thesis. Scientific software applications can be efficiently computed in a virtual environment, and there is an urgent need to adapt the infrastructure for effective usage of cloud resources.

  12. Computer software design description for the Treated Effluent Disposal Facility (TEDF), Project L-045H, Operator Training Station (OTS)

    International Nuclear Information System (INIS)

    Carter, R.L. Jr.

    1994-01-01

    The Treated Effluent Disposal Facility (TEDF) Operator Training Station (OTS) is a computer-based training tool designed to aid plant operations and engineering staff in familiarizing themselves with the TEDF Central Control System (CCS)

  13. A personal computer code for seismic evaluations of nuclear power plant facilities

    International Nuclear Information System (INIS)

    Xu, J.; Graves, H.

    1990-01-01

    A wide range of computer programs and modeling approaches are often used to justify the safety of nuclear power plants. It is often difficult to assess the validity and accuracy of the results submitted by various utilities without developing comparable computer solutions. Taking this into consideration, CARES is designed as an integrated computational system which can perform rapid evaluations of structural behavior and examine the capability of nuclear power plant facilities; thus CARES may be used by the NRC to determine the validity and accuracy of analysis methodologies employed for structural safety evaluations of nuclear power plants. CARES has been designed to operate on a PC, have a user-friendly input/output interface, and have quick turnaround. The CARES program is structured in a modular format. Each module performs a specific type of analysis. The basic modules of the system are associated with capabilities for static, seismic, and nonlinear analyses. This paper describes the various features which have been implemented into the Seismic Module of CARES version 1.0. In Section 2 a description of the Seismic Module is provided. The methodologies and computational procedures thus far implemented into the Seismic Module are described in Section 3. Finally, a complete demonstration of the computational capability of CARES in a typical soil-structure interaction analysis is given in Section 4 and conclusions are presented in Section 5. 5 refs., 4 figs

  14. Software quality assurance plan for the National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project

  15. Teaching ergonomics to nursing facility managers using computer-based instruction.

    Science.gov (United States)

    Harrington, Susan S; Walker, Bonnie L

    2006-01-01

    This study offers evidence that computer-based training is an effective tool for teaching nursing facility managers about ergonomics and increasing their awareness of potential problems. Study participants (N = 45) were randomly assigned into a treatment or control group. The treatment group completed the ergonomics training and a pre- and posttest. The control group completed the pre- and posttests without training. Treatment group participants improved significantly from 67% on the pretest to 91% on the posttest, a gain of 24%. Differences between mean scores for the control group were not significant for the total score or for any of the subtests.

  16. FIRAC: a computer code to predict fire-accident effects in nuclear facilities

    International Nuclear Information System (INIS)

    Bolstad, J.W.; Krause, F.R.; Tang, P.K.; Andrae, R.W.; Martin, R.A.; Gregory, W.S.

    1983-01-01

    FIRAC is a medium-sized computer code designed to predict fire-induced flows, temperatures, and material transport within the ventilating systems and other airflow pathways in nuclear-related facilities. The code is designed to analyze the behavior of interconnected networks of rooms and typical ventilation system components. This code is one in a family of computer codes that is designed to provide improved methods of safety analysis for the nuclear industry. The structure of this code closely follows that of the previously developed TVENT and EVENT codes. Because a lumped-parameter formulation is used, this code is particularly suitable for calculating the effects of fires in the far field (that is, in regions removed from the fire compartment), where the fire may be represented parametrically. However, a fire compartment model to simulate conditions in the enclosure is included. This model provides transport source terms to the ventilation system that can affect its operation and in turn affect the fire
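
    To make the 'lumped-parameter' phrasing above concrete, the sketch below solves a steady-state flow network of the kind such codes assemble: node pressures connected by branches with linearized conductances, with mass conservation enforced at every node. The geometry, conductances, and boundary pressures are invented; real FIRAC models are compressible, transient, and far richer.

```python
import numpy as np

# Branches as (from_node, to_node, conductance); conductance is a linearized 1/R (assumed).
branches = [(0, 1, 2.0), (1, 2, 1.0), (1, 3, 1.5), (2, 3, 0.5)]
n_nodes = 4
fixed_pressure = {0: 100.0, 3: 0.0}     # e.g. supply plenum and exhaust stack (Pa)

A = np.zeros((n_nodes, n_nodes))
b = np.zeros(n_nodes)
for i, j, g in branches:                # assemble mass conservation at each node
    A[i, i] += g
    A[j, j] += g
    A[i, j] -= g
    A[j, i] -= g
for node, p in fixed_pressure.items():  # impose known boundary pressures
    A[node, :] = 0.0
    A[node, node] = 1.0
    b[node] = p

pressures = np.linalg.solve(A, b)
flows = {(i, j): g * (pressures[i] - pressures[j]) for i, j, g in branches}
print("node pressures:", np.round(pressures, 2))
print("branch flows:  ", {k: round(q, 2) for k, q in flows.items()})
```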

  17. Computer-based data acquisition system in the Large Coil Test Facility

    International Nuclear Information System (INIS)

    Gould, S.S.; Layman, L.R.; Million, D.L.

    1983-01-01

    The utilization of computers for data acquisition and control is of paramount importance on large-scale fusion experiments because they feature the ability to acquire data from a large number of sensors at various sample rates and provide for flexible data interpretation, presentation, reduction, and analysis. In the Large Coil Test Facility (LCTF) a Digital Equipment Corporation (DEC) PDP-11/60 host computer with the DEC RSX-11M operating system coordinates the activities of five DEC LSI-11/23 front-end processors (FEPs) via direct memory access (DMA) communication links. This provides host control of scheduled data acquisition and FEP event-triggered data collection tasks. Four of the five FEPs have no operating system

  18. HVAC system operation manual of IMEF

    International Nuclear Information System (INIS)

    Baek, Sang Yeol; Park, Dae Kyu; Ahn, Sang Bok; Ju, Yong Sun.

    1997-06-01

    This manual contains the operating procedures for the HVAC (Heating, Ventilation and Air Conditioning) system of the IMEF (Irradiated Material Examination Facility). General operating procedures and test methods for the IMEF HVAC system are described. The manual consists of the following: 1. HVAC system operation manual 2. HVAC system management guide 3. HVAC system maintenance manual 4. HVAC system air velocity and flowrate measurement manual 5. HVAC system HEPA filter leak test manual 6. HVAC system charcoal filter leak test manual 7. HVAC system HEPA and charcoal filter exchange manual. (author). 8 tabs

  19. The HEPCloud Facility: elastic computing for High Energy Physics – The NOvA Use Case

    Energy Technology Data Exchange (ETDEWEB)

    Fuess, S. [Fermilab]; Garzoglio, G. [Fermilab]; Holzman, B. [Fermilab]; Kennedy, R. [Fermilab]; Norman, A. [Fermilab]; Timm, S. [Fermilab]; Tiradani, A. [Fermilab]

    2017-03-15

    The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdowns, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial Clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project has demonstrated the use of the “elastic” provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores - a roughly 38 percent increase - in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the local allocated resources and utilizing the local AWS S3 storage to optimize data handling operations and costs. NOvA was using the same familiar services used for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper
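
    As a toy illustration of the cost-capped bursting decision described in this record (and repeated in the next), the sketch below checks whether pending demand exceeds local capacity and rents spot capacity only while the quoted price stays under a ceiling. The function, prices, and thresholds are hypothetical; they are not the HEPCloud Decision Engine's interfaces or policy.

```python
def provision(pending_cores: int, local_free_cores: int,
              spot_price: float, price_ceiling: float,
              cores_per_instance: int = 8) -> dict:
    """Decide how many cloud instances to rent for demand that does not fit locally."""
    overflow = max(0, pending_cores - local_free_cores)
    if overflow == 0 or spot_price > price_ceiling:
        instances = 0                                   # stay local, or wait for a better price
    else:
        instances = -(-overflow // cores_per_instance)  # ceiling division
    return {
        "cores_run_locally": min(pending_cores, local_free_cores),
        "cloud_instances": instances,
        "estimated_hourly_cost": round(instances * spot_price, 2),
    }

print(provision(pending_cores=20_000, local_free_cores=6_000,
                spot_price=0.12, price_ceiling=0.20))
```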

  20. The HEPCloud Facility: elastic computing for High Energy Physics - The NOvA Use Case

    Science.gov (United States)

    Fuess, S.; Garzoglio, G.; Holzman, B.; Kennedy, R.; Norman, A.; Timm, S.; Tiradani, A.

    2017-10-01

    The need for computing in the HEP community follows cycles of peaks and valleys mainly driven by conference dates, accelerator shutdowns, holiday schedules, and other factors. Because of this, the classical method of provisioning these resources at providing facilities has drawbacks such as potential overprovisioning. As the appetite for computing increases, however, so does the need to maximize cost efficiency by developing a model for dynamically provisioning resources only when needed. To address this issue, the HEPCloud project was launched by the Fermilab Scientific Computing Division in June 2015. Its goal is to develop a facility that provides a common interface to a variety of resources, including local clusters, grids, high performance computers, and community and commercial Clouds. Initially targeted experiments include CMS and NOvA, as well as other Fermilab stakeholders. In its first phase, the project has demonstrated the use of the “elastic” provisioning model offered by commercial clouds, such as Amazon Web Services. In this model, resources are rented and provisioned automatically over the Internet upon request. In January 2016, the project demonstrated the ability to increase the total amount of global CMS resources by 58,000 cores from 150,000 cores - a 38 percent increase - in preparation for the Rencontres de Moriond. In March 2016, the NOvA experiment also demonstrated resource burst capabilities with an additional 7,300 cores, achieving a scale almost four times as large as the local allocated resources and utilizing the local AWS S3 storage to optimize data handling operations and costs. NOvA was using the same familiar services used for local computations, such as data handling and job submission, in preparation for the Neutrino 2016 conference. In both cases, the cost was contained by the use of the Amazon Spot Instance Market and the Decision Engine, a HEPCloud component that aims at minimizing cost and job interruption. This paper

  1. CSNI Integral Test Facility Matrices for Validation of Best-Estimate Thermal-Hydraulic Computer Codes

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Internationally agreed Integral Test Facility (ITF) matrices for the validation of realistic thermal-hydraulic system computer codes were established. ITF development is mainly for Pressurised Water Reactors (PWRs) and Boiling Water Reactors (BWRs); a separate activity addressed Russian Pressurised Water-cooled and Water-moderated Energy Reactors (WWERs). Firstly, the main physical phenomena that occur during the considered accidents are identified, test types are specified, and test facilities suitable for reproducing these aspects are selected. Secondly, a list of selected experiments carried out in these facilities has been set down. The criteria used to achieve the objectives are outlined. In this paper some specific examples from the ITF matrices are also provided. The matrices will be a guide for code validation, a basis for comparisons of code predictions performed with different system codes, and a contribution to the quantification of the uncertainty range of code model predictions. In addition to this objective, the construction of such a matrix is an attempt to record information which has been generated around the world over the last years, so that it is more accessible to present and future workers in the field than would otherwise be the case.

  2. Enhanced computational infrastructure for data analysis at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Schissel, D.P.; Peng, Q.; Schachter, J.; Terpstra, T.B.; Casper, T.A.; Freeman, J.; Jong, R.; Keith, K.M.; McHarg, B.B.; Meyer, W.H.; Parker, C.T.

    2000-01-01

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available in between pulses giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator since the DIII-D National Team consists of scientists from nine national laboratories, 19 foreign laboratories, 16 universities, and five industrial partnerships. As a result of this work, DIII-D data is available on a 24x7 basis from a set of viewing and analysis tools that can be run on either the collaborators' or DIII-D's computer systems. Additionally, a web based data and code documentation system has been created to aid the novice and expert user alike

  3. Enhanced Computational Infrastructure for Data Analysis at the DIII-D National Fusion Facility

    International Nuclear Information System (INIS)

    Schissel, D.P.; Peng, Q.; Schachter, J.; Terpstra, T.B.; Casper, T.A.; Freeman, J.; Jong, R.; Keith, K.M.; Meyer, W.H.; Parker, C.T.; McCharg, B.B.

    1999-01-01

    Recently a number of enhancements to the computer hardware infrastructure have been implemented at the DIII-D National Fusion Facility. Utilizing these improvements to the hardware infrastructure, software enhancements are focusing on streamlined analysis, automation, and graphical user interface (GUI) systems to enlarge the user base. The adoption of the load balancing software package LSF Suite by Platform Computing has dramatically increased the availability of CPU cycles and the efficiency of their use. Streamlined analysis has been aided by the adoption of the MDSplus system to provide a unified interface to analyzed DIII-D data. The majority of MDSplus data is made available in between pulses, giving the researcher critical information before setting up the next pulse. Work on data viewing and analysis tools focuses on efficient GUI design with object-oriented programming (OOP) for maximum code flexibility. Work to enhance the computational infrastructure at DIII-D has included a significant effort to aid the remote collaborator since the DIII-D National Team consists of scientists from 9 national laboratories, 19 foreign laboratories, 16 universities, and 5 industrial partnerships. As a result of this work, DIII-D data is available on a 24 x 7 basis from a set of viewing and analysis tools that can be run either on the collaborators' or DIII-D's computer systems. Additionally, a Web-based data and code documentation system has been created to aid the novice and expert user alike
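
    The MDSplus system mentioned in the two DIII-D records above has a Python binding; the sketch below shows, in general terms, how a remote collaborator might pull one analyzed signal and its timebase through an MDSplus connection. The server, tree, shot number, and node name are placeholders, not actual DIII-D addresses, and the snippet assumes the MDSplus Python package is installed.

```python
# Assumes the MDSplus Python bindings are installed; names below are placeholders.
from MDSplus import Connection

def fetch_signal(server: str, tree: str, shot: int, node: str):
    """Read one analyzed signal and its timebase from a remote MDSplus server."""
    conn = Connection(server)          # thin-client connection to the data server
    conn.openTree(tree, shot)
    y = conn.get(node).data()
    t = conn.get(f"dim_of({node})").data()
    return t, y

# Example call (hypothetical server and signal):
# t, y = fetch_signal("mdsplus.example.org", "analysis", 123456, "\\DENSITY")
```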

  4. Investigation of Storage Options for Scientific Computing on Grid and Cloud Facilities

    International Nuclear Information System (INIS)

    Garzoglio, Gabriele

    2012-01-01

    In recent years, several new storage technologies, such as Lustre, Hadoop, OrangeFS, and BlueArc, have emerged. While several groups have run benchmarks to characterize them under a variety of configurations, more work is needed to evaluate these technologies for the use cases of scientific computing on Grid clusters and Cloud facilities. This paper discusses our evaluation of the technologies as deployed on a test bed at FermiCloud, one of the Fermilab infrastructure-as-a-service Cloud facilities. The test bed consists of 4 server-class nodes with 40 TB of disk space and up to 50 virtual machine clients, some running on the storage server nodes themselves. With this configuration, the evaluation compares the performance of some of these technologies when deployed on virtual machines and on “bare metal” nodes. In addition to running standard benchmarks such as IOZone to check the sanity of our installation, we have run I/O intensive tests using physics-analysis applications. This paper presents how the storage solutions perform in a variety of realistic use cases of scientific computing. One interesting difference among the storage systems tested is found in a decrease in total read throughput with increasing number of client processes, which occurs in some implementations but not others.
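
    The evaluation above used IOZone and physics-analysis applications; the snippet below is only a minimal, generic way to observe the effect the record mentions, namely how aggregate read throughput changes as the number of concurrent client processes grows. The test-file path and block size are assumptions, and the file must already exist on the storage system under test.

```python
import time
from multiprocessing import Pool

TEST_FILE = "/mnt/shared/testfile.bin"   # path on the storage system under test (assumed)
BLOCK = 4 * 1024 * 1024                  # 4 MiB per read

def read_whole_file(_):
    total = 0
    with open(TEST_FILE, "rb") as f:
        while chunk := f.read(BLOCK):
            total += len(chunk)
    return total

def aggregate_throughput(clients: int) -> float:
    """Aggregate read rate in MB/s with `clients` concurrent reader processes."""
    start = time.time()
    with Pool(clients) as pool:
        total_bytes = sum(pool.map(read_whole_file, range(clients)))
    return total_bytes / (time.time() - start) / 1e6

if __name__ == "__main__":
    for clients in (1, 2, 4, 8, 16):
        print(f"{clients:2d} clients: {aggregate_throughput(clients):8.1f} MB/s")
```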

  5. FIRAC - a computer code to predict fire accident effects in nuclear facilities

    International Nuclear Information System (INIS)

    Bolstad, J.W.; Foster, R.D.; Gregory, W.S.

    1983-01-01

    FIRAC is a medium-sized computer code designed to predict fire-induced flows, temperatures, and material transport within the ventilating systems and other airflow pathways in nuclear-related facilities. The code is designed to analyze the behavior of interconnected networks of rooms and typical ventilation system components. This code is one in a family of computer codes that is designed to provide improved methods of safety analysis for the nuclear industry. The structure of this code closely follows that of the previously developed TVENT and EVENT codes. Because a lumped-parameter formulation is used, this code is particularly suitable for calculating the effects of fires in the far field (that is, in regions removed from the fire compartment), where the fire may be represented parametrically. However, a fire compartment model to simulate conditions in the enclosure is included. This model provides transport source terms to the ventilation system that can affect its operation and in turn affect the fire. A basic material transport capability that features the effects of convection, deposition, entrainment, and filtration of material is included. The interrelated effects of filter plugging, heat transfer, gas dynamics, and material transport are taken into account. In this paper the authors summarize the physical models used to describe the gas dynamics, material transport, and heat transfer processes. They also illustrate how a typical facility is modeled using the code

  6. Application of personal computer to development of entrance management system for radiating facilities

    International Nuclear Information System (INIS)

    Suzuki, Shogo; Hirai, Shouji

    1989-01-01

    The report describes a system for managing the entrance and exit of personnel at radiating facilities. A personal computer is applied to its development. The major features of the system are outlined first. The computer is connected to the gate and to two magnetic card readers provided at the gate. The gate, which is installed at the entrance to a room under control, opens only for those who have a valid card. The entrance-exit management program developed is described next. The following three files are used: the ID master file (a random file of the magnetic card number, name, qualification, etc., of each card carrier), the entrance-exit management file (a random file of the time of entrance/exit, etc., updated every day), and the entrance-exit record file (a sequential file of card number, name, date, etc.), which are stored on floppy disks. A display is provided to show various lists, including a list of workers currently in the room and a list of workers who left the room at earlier times of the day. This system is useful for entrance management of a relatively small facility. It is low in cost and requires only a few operators to perform effective personnel management. (N.K.)
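
    A minimal sketch of the gate logic described above: look the magnetic-card number up in the ID master file, open the gate only for valid holders, and append each event to the entrance-exit record file. The CSV layouts and field names are assumptions for the example; the original system used random and sequential files on floppy disk.

```python
import csv
import datetime

def load_id_master(path: str = "id_master.csv") -> dict:
    """ID master file: card number, name, qualification (assumed CSV layout)."""
    with open(path, newline="") as f:
        return {row["card_no"]: row for row in csv.DictReader(f)}

def handle_card(card_no: str, direction: str, master: dict,
                log_path: str = "entrance_exit_record.csv") -> bool:
    """Return True (open the gate) for a valid card and log the passage."""
    holder = master.get(card_no)
    if holder is None:
        return False                                   # unknown card: keep the gate closed
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    with open(log_path, "a", newline="") as f:
        csv.writer(f).writerow([card_no, holder["name"], direction, stamp])
    return True

# master = load_id_master()
# if handle_card("000123", "IN", master):
#     print("gate opened")
```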

  7. GASFLOW: A computational model to analyze accidents in nuclear containment and facility buildings

    International Nuclear Information System (INIS)

    Travis, J.R.; Nichols, B.D.; Wilson, T.L.; Lam, K.L.; Spore, J.W.; Niederauer, G.F.

    1993-01-01

    GASFLOW is a finite-volume computer code that solves the time-dependent, compressible Navier-Stokes equations for multiple gas species. The fluid-dynamics algorithm is coupled to the chemical kinetics of combusting liquids or gases to simulate diffusion or propagating flames in complex geometries of nuclear containment or confinement and facilities' buildings. Fluid turbulence is calculated to enhance the transport and mixing of gases in rooms and volumes that may be connected by a ventilation system. The ventilation system may consist of extensive ductwork, filters, dampers or valves, and fans. Condensation and heat transfer to walls, floors, ceilings, and internal structures are calculated to model the appropriate energy sinks. Solid and liquid aerosol behavior is simulated to give the time and space inventory of radionuclides. The solution procedure of the governing equations is a modified Los Alamos ICE'd-ALE methodology. Complex facilities can be represented by separate computational domains (multiblocks) that communicate through overlapping boundary conditions. The ventilation system is superimposed throughout the multiblock mesh. Gas mixtures and aerosols are transported through the free three-dimensional volumes and the restricted one-dimensional ventilation components as the accident and fluid flow fields evolve. Combustion may occur if sufficient fuel and reactant or oxidizer are present and have an ignition source. Pressure and thermal loads on the building, structural components, and safety-related equipment can be determined for specific accident scenarios. GASFLOW calculations have been compared with large oil-pool fire tests in the 1986 HDR containment test T52.14, which is a 3000-kW fire experiment. The computed results are in good agreement with the observed data

  8. GASFLOW-MPI. A scalable computational fluid dynamics code for gases, aerosols and combustion. Vol. 2. Users' manual (Revision 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Jianjun; Travis, Jack; Royl, Peter; Necker, Gottfried; Svishchev, Anatoly; Jordan, Thomas

    2016-07-01

    Karlsruhe Institute of Technology (KIT) is developing the parallel computational fluid dynamics code GASFLOW-MPI as a best-estimate tool for predicting transport, mixing, and combustion of hydrogen and other gases in nuclear reactor containments and other facility buildings. GASFLOW-MPI is a finite-volume code based on proven computational fluid dynamics methodology that solves the compressible Navier-Stokes equations for three-dimensional volumes in Cartesian or cylindrical coordinates.

  9. QRev—Software for computation and quality assurance of acoustic doppler current profiler moving-boat streamflow measurements—Technical manual for version 2.8

    Science.gov (United States)

    Mueller, David S.

    2016-06-21

    The software program QRev applies common and consistent computational algorithms, combined with automated filtering and quality assessment of the data, to improve the quality and efficiency of streamflow measurements, and helps ensure that U.S. Geological Survey streamflow measurements are consistent, accurate, and independent of the manufacturer of the instrument used to make the measurement. Software from different manufacturers uses different algorithms for various aspects of the data processing and discharge computation. The algorithms used by QRev to filter data, interpolate data, and compute discharge are documented and compared to the algorithms used in the manufacturers' software. QRev applies consistent algorithms and creates a data structure that is independent of the data source. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field notes software. This report is the technical manual for version 2.8 of QRev.
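
    QRev's actual discharge algorithms for moving-boat ADCP data (cross products of boat and water velocity, edge and top/bottom extrapolation) are documented in the manual itself. Purely to show the quantity being computed, the sketch below applies the textbook mid-section method, summing velocity × depth × width over verticals; the station data are invented.

```python
def midsection_discharge(stations, depths, velocities):
    """Classic mid-section method: sum v_i * d_i * w_i over verticals, where each
    width is half the distance to the neighbouring verticals."""
    q = 0.0
    n = len(stations)
    for i, (d, v) in enumerate(zip(depths, velocities)):
        left = stations[i] - stations[i - 1] if i > 0 else 0.0
        right = stations[i + 1] - stations[i] if i < n - 1 else 0.0
        q += v * d * 0.5 * (left + right)
    return q

stations = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]        # distance from the bank, m
depths = [0.0, 1.2, 1.8, 1.7, 1.0, 0.0]           # m
velocities = [0.0, 0.4, 0.7, 0.6, 0.3, 0.0]       # m/s
print(f"Q = {midsection_discharge(stations, depths, velocities):.2f} m^3/s")
```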

  10. The Overview of the National Ignition Facility Distributed Computer Control System

    International Nuclear Information System (INIS)

    Lagin, L.J.; Bettenhausen, R.C.; Carey, R.A.; Estes, C.M.; Fisher, J.M.; Krammen, J.E.; Reed, R.K.; VanArsdall, P.J.; Woodruff, J.P.

    2001-01-01

    The Integrated Computer Control System (ICCS) for the National Ignition Facility (NIF) is a layered architecture of 300 front-end processors (FEPs) coordinated by supervisor subsystems including automatic beam alignment and wavefront control, laser and target diagnostics, pulse power, and shot control timed to 30 ps. FEP computers incorporate either VxWorks on PowerPC or Solaris on UltraSPARC processors that interface to over 45,000 control points attached to VME-bus or PCI-bus crates, respectively. Typical devices are stepping motors, transient digitizers, calorimeters, and photodiodes. The front-end layer includes a further segment comprising an additional 14,000 control points for industrial controls, including vacuum, argon, synthetic air, and safety interlocks, implemented with Allen-Bradley programmable logic controllers (PLCs). The computer network is augmented by asynchronous transfer mode (ATM) links that deliver video streams from 500 sensor cameras monitoring the 192 laser beams to operator workstations. Software is based on an object-oriented framework using CORBA distribution that incorporates services for archiving, machine configuration, graphical user interface, monitoring, event logging, scripting, alert management, and access control. Software coding, using a mixed-language environment of Ada95 and Java, is one-third complete at over 300 thousand source lines. Control system installation is currently under way for the first 8 beams, with project completion scheduled for 2008

  11. Dosimetry and health effects self-teaching curriculum: illustrative problems to supplement the user's manual for the Dosimetry and Health Effects Computer Code

    International Nuclear Information System (INIS)

    Runkle, G.E.; Finley, N.C.

    1983-03-01

    This document contains a series of sample problems for the Dosimetry and Health Effects Computer Code to be used in conjunction with the user's manual (Runkle and Cranwell, 1982) for the code. This code was developed at Sandia National Laboratories for the Risk Methodology for Geologic Disposal of Radioactive Waste program (NRC FIN A-1192). The purpose of this document is to familiarize the user with the code, its capabilities, and its limitations. When the user has finished reading this document, he or she should be able to prepare data input for the Dosimetry and Health Effects code and have some insights into interpretation of the model output

  12. The FOSS GIS Workbench on the GFZ Load Sharing Facility compute cluster

    Science.gov (United States)

    Löwe, P.; Klump, J.; Thaler, J.

    2012-04-01

    Compute clusters can be used as GIS workbenches; their wealth of resources allows us to take on geocomputation tasks which exceed the limitations of smaller systems. Harnessing these capabilities requires a Geographic Information System (GIS) able to utilize the available cluster configuration/architecture and a sufficient degree of user friendliness to allow for wide application. In this paper we report on the first successful porting of GRASS GIS, the oldest and largest Free Open Source (FOSS) GIS project, onto a compute cluster using Platform Computing's Load Sharing Facility (LSF). In 2008, GRASS 6.3 was installed on the GFZ compute cluster, which at that time comprised 32 nodes. The interaction with the GIS was limited to the command line interface, which required further development to encapsulate the GRASS GIS business layer and facilitate its use by users not familiar with GRASS GIS. During the summer of 2011, multiple versions of GRASS GIS (v 6.4, 6.5 and 7.0) were installed on the upgraded GFZ compute cluster, now consisting of 234 nodes with 480 CPUs providing 3084 cores. The GFZ compute cluster currently offers 19 different processing queues with varying hardware capabilities and priorities, allowing for fine-grained scheduling and load balancing. After successful testing of core GIS functionalities, including the graphical user interface, mechanisms were developed to deploy scripted geocomputation tasks onto dedicated processing queues. The mechanisms are based on earlier work by NETELER et al. (2008). A first application of the new GIS functionality was the generation of maps of simulated tsunamis in the Mediterranean Sea for the Tsunami Atlas of the FP-7 TRIDEC Project (www.tridec-online.eu). For this, up to 500 processing nodes were used in parallel. Further trials included the processing of geometrically complex problems requiring significant amounts of processing time. The GIS cluster successfully completed all these tasks, with processing times
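
    A minimal sketch of dispatching one scripted GRASS task to an LSF queue from Python is given below. The queue name, GRASS location/mapset path, and script are placeholders; `bsub -q` is standard LSF usage, while the `grass --exec` batch interface shown here belongs to later GRASS 7 releases (the versions named in the record used the GRASS_BATCH_JOB environment variable instead).

```python
import subprocess

def submit_grass_job(queue: str, location_mapset: str, script: str) -> str:
    """Submit one scripted GRASS geocomputation task to an LSF queue via bsub."""
    grass_cmd = f"grass --text {location_mapset} --exec python {script}"
    bsub = ["bsub", "-q", queue, "-n", "1", "-J", "grass_task", grass_cmd]
    result = subprocess.run(bsub, capture_output=True, text=True, check=True)
    return result.stdout.strip()       # e.g. "Job <12345> is submitted to queue <short>."

# Hypothetical usage on a cluster front end:
# print(submit_grass_job("short", "/data/grassdata/med_tsunami/PERMANENT",
#                        "simulate_scenario.py"))
```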

  13. [Elderlies in street situation or social vulnerability: facilities and difficulties in the use of computational tools].

    Science.gov (United States)

    Frias, Marcos Antonio da Eira; Peres, Heloisa Helena Ciqueto; Pereira, Valclei Aparecida Gandolpho; Negreiros, Maria Célia de; Paranhos, Wana Yeda; Leite, Maria Madalena Januário

    2014-01-01

    This study aimed to identify the advantages and difficulties encountered by older people living on the streets or in social vulnerability when using the computer or the internet. It is an exploratory qualitative study, in which five elderly people attended by a non-governmental organization located in the city of São Paulo participated. The discourses were analyzed by the content analysis technique and showed, as facilitating factors, among others, the chance to clarify doubts with the instructors, the stimulus for new discoveries coupled with proactivity and curiosity, and the development of new skills. The difficulties mentioned were related to physical or cognitive issues, the lack of an instructor, and the lack of knowledge of how to interact with the machine. Studies focusing on the elderly population living on the streets or in social vulnerability may contribute evidence to guide the formulation of public policies for this population.

  14. Development of a personal computer based facility-level SSAC component and inspector support system

    International Nuclear Information System (INIS)

    Markov, A.

    1989-08-01

    Research Contract No. 4658/RB was conducted between the IAEA and the Bulgarian Committee on Use of Atomic Energy for Peaceful Purposes. The contract required the Committee to develop and program a personal computer based software package to be used as a facility-level computerized State System of Accounting and Control (SSAC) at an off-load power reactor. The software delivered, called the National Safeguards System (NSS), keeps track of all fuel assembly activity at a power reactor and generates all ledgers, MBA material balances and any required reports to national or international authorities. The NSS is designed to operate on a PC/AT or compatible equipment with a hard disk of 20 MB, a color graphics monitor or adaptor and at least one floppy disk drive, 360 Kb. The programs are written in Basic (compiler 2.0). They are executed under MS DOS 3.1 or later

  15. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  16. Lustre Distributed Name Space (DNE) Evaluation at the Oak Ridge Leadership Computing Facility (OLCF)

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, James S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Leverman, Dustin B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Hanley, Jesse A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences; Oral, Sarp [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Center for Computational Sciences

    2016-08-22

    This document describes the Lustre Distributed Name Space (DNE) evaluation carried out at the Oak Ridge Leadership Computing Facility (OLCF) between 2014 and 2015. DNE is a development project funded by OpenSFS to improve Lustre metadata performance and scalability. The development effort has been split into two parts: the first part (DNE P1) provides support for remote directories over remote Lustre Metadata Server (MDS) nodes and Metadata Target (MDT) devices, while the second phase (DNE P2) addresses split directories over multiple remote MDS nodes and MDT devices. The OLCF has been actively evaluating the performance, reliability, and functionality of both DNE phases. For these tests, an internal OLCF testbed was used. Results are promising, and OLCF is planning a full DNE deployment on production systems in the mid-2016 timeframe.
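
    For readers unfamiliar with DNE, a phase 1 remote directory is created by pinning a new directory to a specific MDT with the lfs client utility. The Python sketch below simply wraps that invocation; the mount point and MDT index are hypothetical examples, and the exact lfs options should be checked against the installed Lustre release.

      import subprocess

      def make_remote_dir(path, mdt_index):
          """Create a DNE phase 1 remote directory on the MDT with the given index.

          Assumes the 'lfs' client utility is on PATH; path and index are examples only.
          """
          subprocess.run(["lfs", "mkdir", "-i", str(mdt_index), path], check=True)

      if __name__ == "__main__":
          # Place a metadata-heavy project directory on MDT0001 instead of the default MDT0000.
          make_remote_dir("/lustre/project/metadata_heavy", 1)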

  17. MONITOR: A computer model for estimating the costs of an integral monitored retrievable storage facility

    International Nuclear Information System (INIS)

    Reimus, P.W.; Sevigny, N.L.; Schutz, M.E.; Heller, R.A.

    1986-12-01

    The MONITOR model is a FORTRAN 77 based computer code that provides parametric life-cycle cost estimates for a monitored retrievable storage (MRS) facility. MONITOR is very flexible in that it can estimate the costs of an MRS facility operating under almost any conceivable nuclear waste logistics scenario. The model can also accommodate input data of varying degrees of complexity and detail (ranging from very simple to more complex), which makes it ideal for use in the MRS program, where new designs and new cost data are frequently offered for consideration. MONITOR can be run as an independent program, or it can be interfaced with the Waste System Transportation and Economic Simulation (WASTES) model, a program that simulates the movement of waste through a complete nuclear waste disposal system. The WASTES model drives the MONITOR model by providing it with the annual quantities of waste that are received, stored, and shipped at the MRS facility. Three runs of MONITOR are documented in this report. Two of the runs are for Version 1 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2A (backup) version of the MRS cost estimate. In one of these runs MONITOR was run as an independent model, and in the other run MONITOR was run using an input file generated by the WASTES model. The two runs correspond to identical cases, and the fact that they gave identical results verified that the code performed the same calculations in both modes of operation. The third run was made for Version 2 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2B (integral) version of the MRS cost estimate. This run was made with MONITOR run as an independent model. The results of several cases have been verified by hand calculations
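
    The parametric life-cycle costing idea, in which annual receipts, storage inventory and shipments drive annual facility costs, can be sketched in a few lines. The Python routine below is purely illustrative of that structure, with invented unit costs; it does not reproduce the FORTRAN 77 MONITOR code or the Parsons cost estimates.

      def lifecycle_cost(annual_received, annual_shipped,
                         unit_receive_cost=1.0e5, unit_ship_cost=8.0e4,
                         storage_cost_per_canister_year=2.0e3, fixed_annual_cost=5.0e6):
          """Accumulate a simple parametric life-cycle cost for an MRS-like facility.

          All unit costs are placeholders; a real analysis would take them from a
          detailed design estimate such as the Parsons 2A/2B submittals.
          """
          inventory = 0
          total = 0.0
          for received, shipped in zip(annual_received, annual_shipped):
              inventory += received - shipped
              total += (received * unit_receive_cost
                        + shipped * unit_ship_cost
                        + inventory * storage_cost_per_canister_year
                        + fixed_annual_cost)
          return total

      # Example: ten years of receipts ramping up, shipments starting in year 6.
      print(lifecycle_cost([0, 50, 100, 150, 200, 200, 200, 200, 200, 200],
                           [0, 0, 0, 0, 0, 100, 150, 200, 200, 200]))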

  18. Computational Simulations of the NASA Langley HyMETS Arc-Jet Facility

    Science.gov (United States)

    Brune, A. J.; Bruce, W. E., III; Glass, D. E.; Splinter, S. C.

    2017-01-01

    The Hypersonic Materials Environmental Test System (HyMETS) arc-jet facility located at the NASA Langley Research Center in Hampton, Virginia, is primarily used for the research, development, and evaluation of high-temperature thermal protection systems for hypersonic vehicles and reentry systems. In order to improve testing capabilities and knowledge of the test article environment, an effort is underway to computationally simulate the flow-field using computational fluid dynamics (CFD). A detailed three-dimensional model of the arc-jet nozzle and free-jet portion of the flow-field has been developed and compared to calibration probe Pitot pressure and stagnation-point heat flux for three test conditions at low, medium, and high enthalpy. The CFD model takes into account uniform pressure and non-uniform enthalpy profiles at the nozzle inlet as well as catalytic recombination efficiency effects at the probe surface. Comparison of the CFD results and test data indicates a catalytic recombination efficiency of about 10% at the copper surface of the heat flux probe and a 2-3 kPa pressure drop from the arc heater bore, where the pressure is measured, to the plenum section prior to the nozzle. With these assumptions, the CFD results are well within the uncertainty of the stagnation pressure and heat flux measurements. The conditions at the nozzle exit were also compared with radial and axial velocimetry. This simulation capability will be used to evaluate various three-dimensional models that are tested in the HyMETS facility. An end-to-end aerothermal and thermal simulation of HyMETS test articles will follow this work to provide a better understanding of the test environment and test results, and to aid in test planning. Additional flow-field diagnostic measurements will also be considered to improve the modeling capability.

  19. Site security personnel training manual

    International Nuclear Information System (INIS)

    1978-10-01

    As required by 10 CFR Part 73, this training manual provides guidance to assist licensees in the development of security personnel training and qualifications programs. The information contained in the manual typifies the level and scope of training for personnel assigned to perform security related tasks and job duties associated with the protection of nuclear fuel cycle facilities and nuclear power reactors

  20. Transportation security personnel training manual

    International Nuclear Information System (INIS)

    1978-11-01

    Objective of this manual is to train security personnel to protect special nuclear materials and nuclear facilities against theft and sabotage as required by 10 CFR Part 73. This volume contains the introduction and rationale

  1. Computer-aided and manual quantifications of MRI synovitis, bone marrow edema-like lesions, erosion and cartilage loss in rheumatoid arthritis of the wrist

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Haitao [The First Affiliated Hospital of Chongqing Medical University, Department of Radiology, Chongqing (China); University of California, San Francisco (UCSF), Department of Radiology and Biomedical Imaging, San Francisco, CA (United States); Rivoire, Julien; Hoppe, Michael; Link, Thomas M.; Li, Xiaojuan [University of California, San Francisco (UCSF), Department of Radiology and Biomedical Imaging, San Francisco, CA (United States); Srikhum, Waraporn [University of California, San Francisco (UCSF), Department of Radiology and Biomedical Imaging, San Francisco, CA (United States); Thammasat University, Department of Radiology, Pathumthani (Thailand); Imboden, John [San Francisco General Hospital, University of California, Department of Medicine, San Francisco and Division of Rheumatology, San Francisco, CA (United States)

    2014-12-10

    To investigate the reliability and validity of computer-aided automated and manual quantification, as well as semiquantitative analysis, of MRI synovitis, bone marrow edema-like lesions, erosion and cartilage loss of the wrist in rheumatoid arthritis (RA), compared to the OMERACT-RAMRIS. Wrist MRI was performed at 3 T in 16 patients with RA. Synovial volume and perfusion, bone marrow edema-like lesion (BMEL) volume, signal intensity and perfusion, and erosion dimensions were measured manually and using an in-house-developed automated software algorithm; findings were correlated with the OMERACT-RAMRIS gradings. In addition, a semiquantitative MRI cartilage loss score system was developed. Intraclass correlation coefficients (ICCs) were used to test the reproducibility of these quantitative and semiquantitative techniques. Spearman correlation coefficients were calculated between lesion quantifications and RAMRIS and between the MRI cartilage score and radiographic Sharp van der Heijde joint space narrowing scores. The intra- and interobserver ICCs were excellent for synovial, BMEL and erosion quantifications and cartilage loss grading (all >0.89). The synovial volume, BMEL volume and signal intensity, and erosion dimensions were significantly correlated with the corresponding RAMRIS (r = 0.727 to 0.900, p < 0.05). The synovial perfusion parameter maximum enhancement (Emax) was significantly correlated with the synovitis RAMRIS (r = 0.798). BMEL perfusion parameters were not correlated with the RAMRIS BME score. Cartilage loss gradings from MRI were significantly correlated with the Sharp joint space narrowing scores (r = 0.635, p = 0.008). The computer-aided, manual and semiquantitative methods presented in this study can be used to evaluate MRI pathologies in RA with excellent reproducibility. Significant correlations with standard RAMRIS were found in the measurements using these methods. (orig.)
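
    As an illustration of the validation step, correlating an automated lesion quantification with the ordinal RAMRIS grading reduces to a rank correlation. The Python sketch below uses synthetic values and scipy; it does not reproduce the in-house segmentation software or the study data.

      import numpy as np
      from scipy.stats import spearmanr

      # Synthetic example data: automated synovial volumes (ml) and RAMRIS synovitis scores
      # for a handful of hypothetical patients (values are invented for illustration).
      synovial_volume = np.array([1.2, 3.4, 0.8, 5.1, 2.2, 4.0, 6.3, 1.9])
      ramris_synovitis = np.array([1, 4, 1, 6, 2, 5, 7, 2])

      rho, p_value = spearmanr(synovial_volume, ramris_synovitis)
      print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")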

  2. CESARR V.2 manual: Computer code for the evaluation of surface storage of low and medium level radioactive waste

    International Nuclear Information System (INIS)

    Moya Rivera, J.A.; Bolado Lavin, R.

    1997-01-01

    CESARR (Code for the safety evaluation of low- and medium-level radioactive waste storage) was developed for probabilistic safety evaluations of facilities for the storage of low- and medium-level radioactive waste

  3. A resource facility for kinetic analysis: modeling using the SAAM computer programs.

    Science.gov (United States)

    Foster, D M; Boston, R C; Jacquez, J A; Zech, L

    1989-01-01

    Kinetic analysis and integrated system modeling have contributed significantly to understanding the physiology and pathophysiology of metabolic systems in humans and animals. Many experimental biologists are aware of the usefulness of these techniques and recognize that kinetic modeling requires special expertise. The Resource Facility for Kinetic Analysis (RFKA) provides this expertise through: (1) development and application of modeling technology for biomedical problems, and (2) development of computer-based kinetic modeling methodologies concentrating on the computer program Simulation, Analysis, and Modeling (SAAM) and its conversational version, CONversational SAAM (CONSAM). The RFKA offers consultation to the biomedical community in the use of modeling to analyze kinetic data and trains individuals in using this technology for biomedical research. Early versions of SAAM were widely applied in solving dosimetry problems; many users, however, are not familiar with recent improvements to the software. The purpose of this paper is to acquaint biomedical researchers in the dosimetry field with RFKA, which, together with the joint National Cancer Institute-National Heart, Lung and Blood Institute project, is overseeing SAAM development and applications. In addition, RFKA provides many service activities to the SAAM user community that are relevant to solving dosimetry problems.
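
    The class of problems SAAM and CONSAM address is compartmental kinetic modeling. A minimal two-compartment tracer model, written here in Python with scipy rather than SAAM, illustrates the kind of system being fitted; the rate constants and dose are invented.

      import numpy as np
      from scipy.integrate import solve_ivp

      def two_compartment(t, q, k12, k21, k01):
          """dq/dt for a two-compartment exchange model with loss from compartment 1."""
          q1, q2 = q
          dq1 = -(k01 + k12) * q1 + k21 * q2
          dq2 = k12 * q1 - k21 * q2
          return [dq1, dq2]

      # Hypothetical rate constants (1/h) and a unit bolus into compartment 1.
      sol = solve_ivp(two_compartment, (0.0, 24.0), [1.0, 0.0],
                      args=(0.5, 0.2, 0.1), t_eval=np.linspace(0.0, 24.0, 25))
      print(sol.y[0])  # tracer content of compartment 1 over time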

  4. On a new method to compute photon skyshine doses around radiotherapy facilities

    Energy Technology Data Exchange (ETDEWEB)

    Falcao, R.; Facure, A. [Comissao Nacional de Eenrgia Nuclear, Rio de Janeiro (Brazil); Xavier, A. [PEN/Coppe -UFRJ, Rio de Janeiro (Brazil)

    2006-07-01

    Full text of publication follows: Nowadays, in a great number of situations, buildings are raised around radiotherapy facilities. In cases where these constructions would not be in the primary x-ray beam, 'skyshine' radiation is normally accounted for. The skyshine method is commonly used to calculate the dose contribution from scattered radiation in such circumstances, when the roof shielding is designed considering there will be no occupancy upstairs. In these cases, there is no need for the usual 1.5-2.0 m thick ceiling, and the construction costs can be considerably reduced. The existing expressions for computing these doses do not explain mathematically the existence of a shadow area just outside the outer room walls, nor its growth as one moves away from these walls. In this paper we propose a new method to compute photon skyshine doses, using geometrical considerations to find the maximum dose point. An empirical equation is derived, and its validity is tested using MCNP 5 Monte Carlo calculations to simulate radiotherapy room configurations. (authors)

  5. Computer-guided facility for the study of single crystals at the gamma diffractometer GADI

    International Nuclear Information System (INIS)

    Heer, H.; Bleichert, H.; Gruhn, W.; Moeller, R.

    1984-10-01

    In the study of solid-state properties it is in many cases necessary to work with single crystals. The increased demand in industry and research, as well as the desire for better characterization by means of γ-diffractometry, made it necessary to improve and modernize the existing instrument. The advantages of a computer-guided facility over conventional, semiautomatic operation are manifold. Not only the process control, but also the data acquisition and evaluation, are performed by the computer. Using a remote control, the operator is able to quickly find a reflection and to drive the crystal to any desired measuring position. The complete recording of all important measuring parameters, the convenient data storage, and the automatic evaluation are very useful for the user. Finally, the measuring time can be increased to practically 24 hours per day. In this way, characterization by means of γ-diffractometry is put on a completely new level. (orig.) [de]

  6. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Pendergrass, J.H.

    1978-01-01

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data

  7. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    International Nuclear Information System (INIS)

    Bierschbach, M.C.

    1994-12-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit decommissioning plans and cost estimates to the U.S. Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning

  8. QRev—Software for computation and quality assurance of acoustic doppler current profiler moving-boat streamflow measurements—User’s manual for version 2.8

    Science.gov (United States)

    Mueller, David S.

    2016-05-12

    The software program QRev computes the discharge from moving-boat acoustic Doppler current profiler measurements using data collected with any of the Teledyne RD Instruments or SonTek bottom-tracking acoustic Doppler current profilers. The computation of discharge is independent of the manufacturer of the acoustic Doppler current profiler because QRev applies consistent algorithms independent of the data source. In addition, QRev automates filtering and quality checking of the collected data and provides feedback to the user on potential quality issues with the measurement. Various statistics and characteristics of the measurement, in addition to a simple uncertainty assessment, are provided to the user to assist them in properly rating the measurement. QRev saves an extensible markup language file that can be imported into databases or electronic field notes software. The user interacts with QRev through a tablet-friendly graphical user interface. This report is the manual for version 2.8 of QRev.
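
    The computation underlying any moving-boat discharge program is the standard cross-product method, in which the discharge through each depth cell is the cross product of water and boat velocities integrated over cell height and ensemble duration. The Python sketch below shows that textbook relation with synthetic arrays; it is not QRev's implementation, which adds extensive filtering, extrapolation of the unmeasured zones, and quality checks.

      import numpy as np

      def crossproduct_discharge(u_water, v_water, u_boat, v_boat, cell_height, dt):
          """Measured-portion discharge from a moving-boat ADCP transect.

          u_*, v_* are east/north velocity components with shape (n_cells, n_ensembles);
          cell_height is the depth-cell size (m) and dt the ensemble duration (s).
          """
          # z-component of the cross product of water and boat velocity, per cell and ensemble.
          q_cell = (u_water * v_boat - v_water * u_boat) * cell_height * dt
          return q_cell.sum()

      # Tiny synthetic example: 3 depth cells, 4 ensembles.
      u_w = np.full((3, 4), 0.8); v_w = np.full((3, 4), 0.1)   # water velocity (m/s)
      u_b = np.full((3, 4), 0.0); v_b = np.full((3, 4), 1.0)   # boat velocity (m/s)
      print(crossproduct_discharge(u_w, v_w, u_b, v_b, cell_height=0.25, dt=1.0))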

  9. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation Petascale-capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and

  10. ERWIN2: User's manual for a computer model to calculate the economic efficiency of wind energy systems

    International Nuclear Information System (INIS)

    Van Wees, F.G.H.

    1992-01-01

    During the last few years the Business Unit ESC-Energy Studies of the Netherlands Energy Research Foundation (ECN) developed calculation programs to determine the economic efficiency of energy technologies; these programs support several studies for the Dutch Ministry of Economic Affairs. Together they form the so-called BRET programs. One of these programs is ERWIN (Economische Rentabiliteit WINdenergiesystemen, or in English: Economic Efficiency of Wind Energy Systems), for which an updated manual (ERWIN2) is presented in this report. An outline is given of the possibilities and limitations of carrying out calculations with the model
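
    The kind of figure of merit such a model produces can be illustrated with a simple levelized-cost-of-energy calculation. The Python sketch below is a generic discounted-cash-flow formulation with invented numbers, not the ERWIN2 methodology itself.

      def levelized_cost_of_energy(capital_cost, annual_om_cost, annual_energy_kwh,
                                   discount_rate=0.05, lifetime_years=20):
          """Levelized cost of energy (currency per kWh) for a wind energy system.

          A generic discounted-cash-flow formulation; all inputs are placeholders.
          """
          discounted_costs = capital_cost
          discounted_energy = 0.0
          for year in range(1, lifetime_years + 1):
              factor = (1.0 + discount_rate) ** -year
              discounted_costs += annual_om_cost * factor
              discounted_energy += annual_energy_kwh * factor
          return discounted_costs / discounted_energy

      # Example: a 500 kW turbine with a 30% capacity factor.
      print(levelized_cost_of_energy(capital_cost=600_000, annual_om_cost=15_000,
                                     annual_energy_kwh=500 * 8760 * 0.30))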

  11. Manual Therapy

    OpenAIRE

    Hakgüder, Aral; Kokino, Siranuş

    2002-01-01

    Manual therapy has been used in the treatment of pain and dysfunction of spinal and peripheral joints for more than a hundred years. Manual medicine includes manipulation, mobilization, and postisometric relaxation techniques. The aim of manual therapy is to enhance restricted movement caused by blockage of joints while keeping postural balance, to restore function, and to maintain optimal body mechanics. Anatomic, biomechanical, and neurophysiological evaluation of the locomotor system is essential for...

  12. Use of manual alveolar recruitment maneuvers to eliminate atelectasis artifacts identified during thoracic computed tomography of healthy neonatal foals.

    Science.gov (United States)

    Lascola, Kara M; Clark-Price, Stuart C; Joslyn, Stephen K; Mitchell, Mark A; O'Brien, Robert T; Hartman, Susan K; Kline, Kevin H

    2016-11-01

    OBJECTIVE To evaluate use of single manual alveolar recruitment maneuvers (ARMs) to eliminate atelectasis during CT of anesthetized foals. ANIMALS 6 neonatal Standardbred foals. PROCEDURES Thoracic CT was performed on spontaneously breathing anesthetized foals positioned in sternal (n = 3) or dorsal (3) recumbency when foals were 24 to 36 hours old (time 1), 4 days old (time 2), 7 days old (time 3), and 10 days old (time 4). The CT images were collected without ARMs (all times) and during ARMs with an internal airway pressure of 10, 20, and 30 cm H2O (times 2 and 3). Quantitative analysis of CT images measured whole lung and regional changes in attenuation or volume with ARMs. RESULTS Increased attenuation and an alveolar pattern were most prominent in the dependent portion of the lungs. Subjectively, ARMs did not eliminate atelectasis; however, they did incrementally reduce attenuation, particularly in the nondependent portion of the lungs. Quantitative differences in lung attenuation attributable to position of foal were not identified. Lung attenuation decreased significantly (times 2 and 3) and lung volume increased significantly (times 2 and 3) after ARMs. Changes in attenuation and volume were most pronounced in the nondependent portion of the lungs and at ARMs of 20 and 30 cm H2O. CONCLUSIONS AND CLINICAL RELEVANCE Manual ARMs did not eliminate atelectasis but reduced attenuation in nondependent portions of the lungs. Positioning of foals in dorsal recumbency for CT may be appropriate when pathological changes in the ventral portion of the lungs are suspected.

  13. IAEA safeguards technical manual

    International Nuclear Information System (INIS)

    1982-03-01

    Part F of the Safeguards Technical Manual is being issued in three volumes. Volume 1 was published in 1977 and revised slightly in 1979. Volume 1 discusses basic probability concepts, statistical inference, models and measurement errors, estimation of measurement variances, and calibration. These topics, of general interest in a number of application areas, are presented with examples drawn from nuclear materials safeguards. The final two chapters in Volume 1 deal with problem areas unique to safeguards: calculating the variance of MUF and of D, respectively. Volume 2 continues where Volume 1 left off with a presentation of topics of specific interest to Agency safeguards. These topics include inspection planning from a design and effectiveness evaluation viewpoint, on-facility-site inspection activities, variables data analysis as applied to inspection data, preparation of inspection reports with respect to statistical aspects of the inspection, and the distribution of inspection samples to more than one analytical laboratory. Volume 3 covers generally the same material as Volumes 1 and 2 but with much greater unity and cohesiveness. Further, the cook-book style of the previous two volumes has been replaced by one that makes use of equations and formulas as opposed to computational steps, and that also provides the bases for the statistical procedures discussed. Hopefully, this will help minimize the frequency of misapplications of the techniques
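
    For orientation, the two quantities singled out in Volume 1 can be written compactly in the textbook form (this is not a reproduction of the manual's worked procedures). With PB the beginning physical inventory, R receipts, S shipments and PE the ending physical inventory, and assuming independent measurement errors,

      \mathrm{MUF} = PB + R - S - PE, \qquad
      \sigma^{2}_{\mathrm{MUF}} \approx \sigma^{2}_{PB} + \sigma^{2}_{R} + \sigma^{2}_{S} + \sigma^{2}_{PE}.

    Correlated measurement errors (for example, measurements sharing a calibration) add covariance terms to the right-hand side, which is the kind of complication a full variance-of-MUF calculation has to treat.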

  14. Status of the National Ignition Facility Integrated Computer Control System (ICCS) on the Path to Ignition

    International Nuclear Information System (INIS)

    Lagin, L J; Bettenhausen, R C; Bowers, G A; Carey, R W; Edwards, O D; Estes, C M; Demaret, R D; Ferguson, S W; Fisher, J M; Ho, J C; Ludwigsen, A P; Mathisen, D G; Marshall, C D; Matone, J M; McGuigan, D L; Sanchez, R J; Shelton, R T; Stout, E A; Tekle, E; Townsend, S L; Van Arsdall, P J; Wilson, E F

    2007-01-01

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility under construction that will contain a 192-beam, 1.8-Megajoule, 500-Terawatt, ultraviolet laser system together with a 10-meter diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. NIF is comprised of 24 independent bundles of 8 beams each using laser hardware that is modularized into more than 6,000 line replaceable units such as optical assemblies, laser amplifiers, and multifunction sensor packages containing 60,000 control and diagnostic points. NIF is operated by the large-scale Integrated Computer Control System (ICCS) in an architecture partitioned by bundle and distributed among over 800 front-end processors and 50 supervisory servers. NIF's automated control subsystems are built from a common object-oriented software framework based on CORBA distribution that deploys the software across the computer network and achieves interoperation between different languages and target architectures. A shot automation framework has been deployed during the past year to orchestrate and automate shots performed at the NIF using the ICCS. In December 2006, a full cluster of 48 beams of NIF was fired simultaneously, demonstrating that the independent bundle control system will scale to full scale of 192 beams. At present, 72 beams have been commissioned and have demonstrated 1.4-Megajoule capability of infrared light. During the next two years, the control system will be expanded to include automation of target area systems including final optics, target positioners and

  15. Status of the National Ignition Facility Integrated Computer Control System (ICCS) on the path to ignition

    International Nuclear Information System (INIS)

    Lagin, L.J.; Bettenhausen, R.C.; Bowers, G.A.; Carey, R.W.; Edwards, O.D.; Estes, C.M.; Demaret, R.D.; Ferguson, S.W.; Fisher, J.M.; Ho, J.C.; Ludwigsen, A.P.; Mathisen, D.G.; Marshall, C.D.; Matone, J.T.; McGuigan, D.L.; Sanchez, R.J.; Stout, E.A.; Tekle, E.A.; Townsend, S.L.; Van Arsdall, P.J.

    2008-01-01

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility under construction that will contain a 192-beam, 1.8-MJ, 500-TW, ultraviolet laser system together with a 10-m diameter target chamber with room for multiple experimental diagnostics. NIF is the world's largest and most energetic laser experimental system, providing a scientific center to study inertial confinement fusion (ICF) and matter at extreme energy densities and pressures. NIF's laser beams are designed to compress fusion targets to conditions required for thermonuclear burn, liberating more energy than required to initiate the fusion reactions. NIF is comprised of 24 independent bundles of eight beams each using laser hardware that is modularized into more than 6000 line replaceable units such as optical assemblies, laser amplifiers, and multi-function sensor packages containing 60,000 control and diagnostic points. NIF is operated by the large-scale Integrated Computer Control System (ICCS) in an architecture partitioned by bundle and distributed among over 800 front-end processors and 50 supervisory servers. NIF's automated control subsystems are built from a common object-oriented software framework based on CORBA distribution that deploys the software across the computer network and achieves interoperation between different languages and target architectures. A shot automation framework has been deployed during the past year to orchestrate and automate shots performed at the NIF using the ICCS. In December 2006, a full cluster of 48 beams of NIF was fired simultaneously, demonstrating that the independent bundle control system will scale to full scale of 192 beams. At present, 72 beams have been commissioned and have demonstrated 1.4-MJ capability of infrared light. During the next 2 years, the control system will be expanded in preparation for project completion in 2009 to include automation of target area systems including final optics

  16. FINAS. Example manual. 2

    International Nuclear Information System (INIS)

    Iwata, Koji; Tsukimori, Kazuyuki; Ueno, Mutsuo

    2003-12-01

    FINAS is a general purpose structural analysis computer program which was developed by Japan Nuclear Cycle Development Institute for the analysis of static, dynamic and thermal responses of elastic and inelastic structures by the finite element method. This manual contains typical analysis examples that illustrate applications of FINAS to a variety of structural engineering problems. The first part of this manual presents fundamental examples in which numerical solutions by FINAS are compared with some analytical reference solutions, and the second part of this manual presents more complex examples intended for practical application. All the input data images and principal results for each problem are included in this manual for beginners' convenience. All the analyses are performed by using the FINAS Version 13.0. (author)

  17. New challenges for HEP computing: RHIC [Relativistic Heavy Ion Collider] and CEBAF [Continuous Electron Beam Accelerator Facility]

    International Nuclear Information System (INIS)

    LeVine, M.J.

    1990-01-01

    We will look at two facilities: RHIC and CEBAF. CEBAF is in the construction phase; RHIC is about to begin construction. For each of them, we examine the kinds of physics measurements that motivated their construction, and the implications of these experiments for computing. Emphasis will be on on-line requirements, driven by the data rates produced by these experiments

  18. BEACON/MOD: a computer program for thermal-hydraulic analysis of nuclear reactor containments - user's manual

    International Nuclear Information System (INIS)

    Broadus, C.R.; Doyle, R.J.; James, S.W.; Lime, J.F.; Mings, W.J.

    1980-04-01

    BEACON is an advanced containment code designed to perform a best-estimate analysis of the flow of a mixture of air, water, and steam in a nuclear reactor containment system under loss-of-coolant accident conditions. The code can simulate two-component, two-phase fluid flow in complex geometries using a combination of two-dimensional, one-dimensional, and lumped-parameter representations for the various parts of the system. The current version of BEACON, which is designated BEACON/MOD3, contains mass and heat transfer models for wall film and wall conduction. It is suitable for the evaluation of short-term transients in dry-containment systems. This manual describes the models employed in BEACON/MOD3 and specifies code implementation requirements. It provides application information for input data preparation and for output data interpretation

  19. A technique for manual definition of an irregular volume of interest in single photon emission computed tomography

    International Nuclear Information System (INIS)

    Fleming, J.S.; Kemp, P.M.; Bolt, L.

    1999-01-01

    A technique is described for manually outlining a volume of interest (VOI) in a three-dimensional SPECT dataset. Regions of interest (ROIs) are drawn on three orthogonal maximum intensity projections. Image masks based on these ROIs are backprojected through the image volume and the resultant 3D dataset is segmented to produce the VOI. The technique has been successfully applied in the exclusion of unwanted areas of activity adjacent to the brain when segmenting the organ in SPECT imaging using 99mTc-HMPAO. An example of its use for segmentation in tumour imaging is also presented. The technique is of value for applications involving semi-automatic VOI definition in SPECT. (author)
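
    The backprojection-and-intersection step lends itself to a compact array formulation: each 2D ROI mask is broadcast along its projection axis and the three resulting volumes are intersected. The Python/numpy sketch below illustrates only the geometry, under the assumption of axis-aligned projections; the clinical tool adds the maximum-intensity-projection display and interactive ROI drawing described above.

      import numpy as np

      def voi_from_orthogonal_rois(mask_xy, mask_xz, mask_yz):
          """Intersect three orthogonal 2D ROI masks into a 3D volume of interest.

          mask_xy has shape (nx, ny), mask_xz has shape (nx, nz), mask_yz has shape (ny, nz);
          the returned boolean VOI has shape (nx, ny, nz).
          """
          vol_xy = mask_xy[:, :, np.newaxis]   # backproject along z
          vol_xz = mask_xz[:, np.newaxis, :]   # backproject along y
          vol_yz = mask_yz[np.newaxis, :, :]   # backproject along x
          return vol_xy & vol_xz & vol_yz

      # Tiny example on an 8x8x8 grid with square ROIs.
      nx = ny = nz = 8
      m_xy = np.zeros((nx, ny), dtype=bool); m_xy[2:6, 2:6] = True
      m_xz = np.zeros((nx, nz), dtype=bool); m_xz[2:6, 1:5] = True
      m_yz = np.zeros((ny, nz), dtype=bool); m_yz[2:6, 1:5] = True
      print(voi_from_orthogonal_rois(m_xy, m_xz, m_yz).sum())  # number of voxels in the VOI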

  20. WYLBUR reference manual. [For interactive text editing

    Energy Technology Data Exchange (ETDEWEB)

    Krupp, R.F.; Messina, P.C.; Peavler, J.M.; Schustack, S.; Starai, T.

    1977-04-01

    WYLBUR is a system for manipulating various kinds of text, such as computer programs, manuscripts, letters, forms, articles, or reports. Its on-line interactive text-editing capabilities allow the user to create, change, and correct text, and to search and display it. WYLBUR also has facilities for job submission and retrieval from remote terminals that make it possible for a user to inquire about the status of any job in the system, cancel jobs that are executing or awaiting execution, reroute output, raise job priority, or get information on the backlog of batch jobs. WYLBUR also has excellent recovery capabilities and a fast response time. This manual describes the WYLBUR version currently used at ANL. It is intended primarily as a reference manual; thus, examples of WYLBUR commands are kept to a minimum. (RWR)

  1. Structures manual

    Science.gov (United States)

    2001-01-01

    This manual was written as a guide for use by design personnel in the Vermont Agency of Transportation Structures Section. This manual covers the design responsibilities of the Section. It does not cover other functions that are a part of the Structu...

  2. Quality Manual

    Science.gov (United States)

    Koch, Michael

    The quality manual is the “heart” of every management system related to quality. Quality assurance in analytical laboratories is most frequently linked with ISO/IEC 17025, which lists the standard requirements for a quality manual. In this chapter examples are used to demonstrate, how these requirements can be met. But, certainly, there are many other ways to do this.

  3. Surface Water Modeling Using an EPA Computer Code for Tritiated Waste Water Discharge from the Heavy Water Facility

    International Nuclear Information System (INIS)

    Chen, K.F.

    1998-06-01

    Tritium releases from the D-Area Heavy Water Facilities to the Savannah River have been analyzed. The U.S. EPA WASP5 computer code was used to simulate surface water transport for tritium releases from the D-Area Drum Wash, Rework, and DW facilities. The WASP5 model was qualified with the 1993 tritium measurements at U.S. Highway 301. At the maximum tritiated waste water concentrations, the calculated tritium concentration in the Savannah River at U.S. Highway 301 due to concurrent releases from the D-Area Heavy Water Facilities varies from 5.9 to 18.0 pCi/ml as a function of the operating conditions of these facilities. The calculated concentration is lowest when the batch-release method for the Drum Wash Waste Tanks is adopted

  4. Development of a computer code for shielding calculation in X-ray facilities

    International Nuclear Information System (INIS)

    Borges, Diogo da S.; Lava, Deise D.; Affonso, Renato R.W.; Moreira, Maria de L.; Guimaraes, Antonio C.F.

    2014-01-01

    The construction of an effective barrier against the ionizing radiation present in X-ray rooms requires consideration of many variables. The methodology used for specifying the thickness of the primary and secondary shielding of a traditional X-ray room considers the following factors: use factor, occupancy factor, distance between the source and the wall, workload, air kerma, and distance between the patient and the receptor. With these data it was possible to develop a computer program that identifies and uses the variables in functions obtained through regressions of the graphs provided in NCRP Report No. 147 (Structural Shielding Design for Medical X-Ray Imaging Facilities) for the calculation of the shielding of the room walls, as well as of the darkroom walls and adjacent areas. With the methodology built, the program is validated by comparing results with a base case provided by that report. The computed thicknesses cover various materials such as steel, wood and concrete. After validation, an application is made to a real case of a radiographic room. Its visual construction is done with the help of software used for modeling indoor and outdoor spaces. The barrier-calculation program resulted in a user-friendly tool for planning radiographic rooms in compliance with the limits established by CNEN-NN-3:01, published in September 2011
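
    The first step behind such shielding programs is the required barrier transmission factor from the NCRP methodology; the thickness is then read from material-specific transmission curves, which is the part the code above automates through its fitted functions. A minimal Python sketch of that first step is shown below, with P, d, W, U and T the shielding design goal, distance, workload, use factor and occupancy factor; the numerical values are illustrative only.

      def required_transmission(P_mGy_per_week, d_m, W_mGy_m2_per_week, U, T):
          """Required barrier transmission B = P * d^2 / (W * U * T) (NCRP-style formulation).

          The thickness itself would then be obtained from fitted transmission curves
          for the chosen material, which is what the program described above automates.
          """
          return P_mGy_per_week * d_m**2 / (W_mGy_m2_per_week * U * T)

      # Illustrative numbers: design goal 0.1 mGy/week, 2.5 m to the barrier,
      # workload 250 mGy*m^2/week, use factor 0.25, full occupancy.
      print(required_transmission(0.1, 2.5, 250.0, 0.25, 1.0))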

  5. Computational investigation of reshock strength in hydrodynamic instability growth at the National Ignition Facility

    Science.gov (United States)

    Bender, Jason; Raman, Kumar; Huntington, Channing; Nagel, Sabrina; Morgan, Brandon; Prisbrey, Shon; MacLaren, Stephan

    2017-10-01

    Experiments at the National Ignition Facility (NIF) are studying Richtmyer-Meshkov and Rayleigh-Taylor hydrodynamic instabilities in multiply-shocked plasmas. Targets feature two different-density fluids with a multimode initial perturbation at the interface, which is struck by two X-ray-driven shock waves. Here we discuss computational hydrodynamics simulations investigating the effect of second-shock ("reshock") strength on instability growth, and how these simulations are informing target design for the ongoing experimental campaign. A Reynolds-Averaged Navier Stokes (RANS) model was used to predict motion of the spike and bubble fronts and the mixing-layer width. In addition to reshock strength, the reshock ablator thickness and the total length of the target were varied; all three parameters were found to be important for target design, particularly for ameliorating undesirable reflected shocks. The RANS data are compared to theoretical models that predict multimode instability growth proportional to the shock-induced change in interface velocity, and to currently-available data from the NIF experiments. Work performed under the auspices of the U.S. D.O.E. by Lawrence Livermore National Laboratory under Contract No. DE-AC52-07NA27344. LLNL-ABS-734611.
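
    The theoretical scaling referred to above is, for a single mode, the classic impulsive (Richtmyer) model, in which the post-shock growth rate is proportional to the velocity jump imparted to the interface; the multimode RANS treatment generalizes this. In the usual notation, with k the perturbation wavenumber, a_0^+ and A^+ the post-shock amplitude and Atwood number, and Δu the shock-induced change in interface velocity,

      \frac{da}{dt} \approx k\, a_0^{+} A^{+}\, \Delta u .

    Here Δu plays the role of the reshock-strength parameter varied in the simulations.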

  6. EXPERIMENTAL AND COMPUTATIONAL ACTIVITIES AT THE OREGON STATE UNIVERSITY NEES TSUNAMI RESEARCH FACILITY

    Directory of Open Access Journals (Sweden)

    S.C. Yim

    2009-01-01

    A diverse series of research projects have taken place or are underway at the NEES Tsunami Research Facility at Oregon State University. Projects range from the simulation of the processes and effects of tsunamis generated by sub-aerial and submarine landslides (NEESR, Georgia Tech.), model comparisons of tsunami wave effects on bottom profiles and scouring (NEESR, Princeton University), model comparisons of wave-induced motions on rigid and free bodies (Shared-Use, Cornell), numerical model simulations and testing of breaking waves and inundation over topography (NEESR, TAMU), structural testing and development of standards for tsunami engineering and design (NEESR, University of Hawaii), and wave loads on coastal bridge structures (non-NEES), to upgrading the two-dimensional wave generator of the Large Wave Flume. A NEESR payload project (Colorado State University) was undertaken that seeks to improve the understanding of the stresses from wave loading and run-up on residential structures. Advanced computational tools for coupling fluid-structure interaction, including turbulence, contact and impact, are being developed to assist with the design of experiments and to complement parametric studies. These projects will contribute towards understanding the physical processes that occur during earthquake-generated tsunamis, including structural stress, debris flow and scour, inundation and overland flow, and landslide-generated tsunamis. Analytical and numerical model development and comparisons with the experimental results give engineers additional predictive tools to assist in the development of robust structures as well as identification of hazard zones and formulation of hazard plans.

  7. Development of a computational code for calculations of shielding in dental facilities

    International Nuclear Information System (INIS)

    Lava, Deise D.; Borges, Diogo da S.; Affonso, Renato R.W.; Guimaraes, Antonio C.F.; Moreira, Maria de L.

    2014-01-01

    This paper addresses shielding calculations to minimize the exposure of patients and personnel to ionizing radiation. The work makes use of the radiation protection report Radiation Protection in Dentistry (NCRP Report No. 145), which establishes calculations and standards to be adopted to ensure the safety of those who may be exposed to ionizing radiation in dental facilities, according to the dose limits established by the CNEN-NN-3.1 standard published in September 2011. The methodology comprises the use of a computer language to process the data provided by that report, together with a commercial application used for creating residential and decoration projects. The FORTRAN language was adopted for application to a real case. The result is a program capable of returning the required thickness of materials such as steel, lead, wood, glass, plaster, acrylic and leaded glass, which can be used for effective shielding against single-pulse or continuous beams. Several variables are used to calculate the shield thickness, such as the number of films used per week, film load, use factor, occupancy factor, distance between the wall and the source, transmission factor, workload, area definition, beam intensity, and intraoral or panoramic examination type. Before the methodology is applied, the results are validated against examples provided by NCRP-145. The calculations redone from the examples provide answers consistent with the report

  8. Thermal studies of the canister staging pit in a hypothetical Yucca Mountain canister handling facility using computational fluid dynamics

    International Nuclear Information System (INIS)

    Soltani, Mehdi; Barringer, Chris; Bues, Timothy T. de

    2007-01-01

    The proposed Yucca Mountain nuclear waste storage site will contain facilities for preparing the radioactive waste canisters for burial. A previous facility design considered was the Canister Handling Facility Staging Pit. This design is no longer used, but its thermal evaluation is typical of such facilities. Structural concrete can be adversely affected by the heat from radioactive decay. Consequently, facilities must have heating, ventilation and air conditioning (HVAC) systems for cooling. Concrete temperatures are a function of conductive, convective and radiative heat transfer. The prediction of concrete temperatures under such complex conditions can only be adequately handled by computational fluid dynamics (CFD). The objective of the CFD analysis was to predict concrete temperatures under normal and off-normal conditions. Normal operation assumed steady-state conditions with constant HVAC flow and temperatures. However, off-normal operation was an unsteady scenario which assumed a total HVAC failure for a period of 30 days. This scenario was particularly complex in that the concrete temperatures would gradually rise, and air flows would be buoyancy driven. The CFD analysis concluded that concrete wall temperatures would be at or below the maximum temperature limits in both the normal and off-normal scenarios. While this analysis was specific to a facility design that is no longer used, it demonstrates that such facilities are reasonably expected to have satisfactory thermal performance. (author)

  9. CASKS (Computer Analysis of Storage casKS): A microcomputer based analysis system for storage cask design review. User's manual to Version 1b (including program reference)

    International Nuclear Information System (INIS)

    Chen, T.F.; Gerhard, M.A.; Trummer, D.J.; Johnson, G.L.; Mok, G.C.

    1995-02-01

    CASKS (Computer Analysis of Storage casKS) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent-fuel storage casks. The bulk of the complete program and this user's manual are based upon the SCANS (Shipping Cask ANalysis System) program previously developed at LLNL. A number of enhancements and improvements were added to the original SCANS program to meet requirements unique to storage casks. CASKS is an easy-to-use system that calculates global response of storage casks to impact loads, pressure loads and thermal conditions. This provides reviewers with a tool for an independent check on analyses submitted by licensees. CASKS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests

  10. JASMINE-pro: A computer code for the analysis of propagation process in steam explosions. User's manual

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yanhua; Nilsuwankosit, Sunchai; Moriyama, Kiyofumi; Maruyama, Yu; Nakamura, Hideo; Hashimoto, Kazuichiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2000-12-01

    A steam explosion is a phenomenon in which a high-temperature liquid gives up its internal energy very rapidly to another, low-temperature volatile liquid, causing a very strong pressure build-up due to rapid vaporization of the latter. In the field of light water reactor safety research, steam explosions caused by the contact of molten core and coolant have been recognized as a potential threat which could cause failure of the pressure vessel or the containment vessel during a severe accident. A numerical simulation code, JASMINE, was developed at the Japan Atomic Energy Research Institute (JAERI) to evaluate the impact of steam explosions on the integrity of reactor boundaries. The JASMINE code consists of two parts, JASMINE-pre and -pro, which handle the premixing and propagation phases of steam explosions, respectively. The JASMINE-pro code simulates the thermo-hydrodynamics of the propagation phase of a steam explosion on the basis of a multi-fluid model for multiphase flow. This report, the 'User's Manual', gives the usage of the JASMINE-pro code as well as information on the code structure which should help users understand how the code works. (author)

  11. BEACON/MOD2A: a computer program for subcompartment analysis of nuclear reactor containment. A user's manual

    International Nuclear Information System (INIS)

    Wells, R.A.

    1979-03-01

    The BEACON code is a Best Estimate Advanced Containment code which is being developed by EG&G Idaho, Inc., at the Idaho National Engineering Laboratory. The program is designed to perform a best-estimate analysis of the flow of a mixture of air, water, and steam in a nuclear reactor containment system under loss-of-coolant accident conditions. The code can simulate two-component, two-phase fluid flow in complex geometries using a combination of two-dimensional, one-dimensional, and lumped-parameter representations for the various parts of the system. The current version of BEACON, which is designated BEACON/MOD2A, contains mass and heat transfer models for wall film and for wall conduction. It is suitable for the evaluation of short-term transients in PWR dry containment systems. This manual describes the models employed in BEACON/MOD2A and specifies code implementation requirements. It provides application information for input data preparation and for output data interpretation

  12. JASMINE-pro: A computer code for the analysis of propagation process in steam explosions. User's manual

    International Nuclear Information System (INIS)

    Yang, Yanhua; Nilsuwankosit, Sunchai; Moriyama, Kiyofumi; Maruyama, Yu; Nakamura, Hideo; Hashimoto, Kazuichiro

    2000-12-01

    A steam explosion is a phenomenon in which a high-temperature liquid gives up its internal energy very rapidly to another, low-temperature volatile liquid, causing a very strong pressure build-up due to rapid vaporization of the latter. In the field of light water reactor safety research, steam explosions caused by the contact of molten core and coolant have been recognized as a potential threat which could cause failure of the pressure vessel or the containment vessel during a severe accident. A numerical simulation code, JASMINE, was developed at the Japan Atomic Energy Research Institute (JAERI) to evaluate the impact of steam explosions on the integrity of reactor boundaries. The JASMINE code consists of two parts, JASMINE-pre and -pro, which handle the premixing and propagation phases of steam explosions, respectively. The JASMINE-pro code simulates the thermo-hydrodynamics of the propagation phase of a steam explosion on the basis of a multi-fluid model for multiphase flow. This report, the 'User's Manual', gives the usage of the JASMINE-pro code as well as information on the code structure which should help users understand how the code works. (author)

  13. VASCOMP 2. The V/STOL aircraft sizing and performance computer program. Volume 6: User's manual, revision 3

    Science.gov (United States)

    Schoen, A. H.; Rosenstein, H.; Stanzione, K.; Wisniewski, J. S.

    1980-01-01

    This report describes the use of the V/STOL Aircraft Sizing and Performance Computer Program (VASCOMP II). The program is useful in performing aircraft parametric studies in a quick and cost-efficient manner. Problem formulation and data development were performed by the Boeing Vertol Company and reflect present preliminary design technology. The computer program, written in FORTRAN IV, has a broad range of input parameters to enable investigation of a wide variety of aircraft. User-oriented features of the program include minimized input requirements, diagnostic capabilities, and various options for program flexibility.

  14. CANAL user's manual

    International Nuclear Information System (INIS)

    Faya, A.; Wolf, L.; Todreas, N.

    1979-11-01

    CANAL is a subchannel computer program for the steady-state and transient thermal-hydraulic analysis of BWR fuel rod bundles. The purpose of this manual is to introduce the user to the mechanics of running the code by providing information about the input data and options

  15. Computer based plant display and digital control system of Wolsong NPP Tritium Removal Facility

    International Nuclear Information System (INIS)

    Jung, C.; Smith, B.; Tosello, G.; Grosbois, J. de; Ahn, J.

    2007-01-01

    The Wolsong Tritium Removal Facility (WTRF) is an AECL-designed, first-of-a-kind facility that removes tritium from the heavy water that is used in systems of the CANDU reactors in operation at the Wolsong Nuclear Power Plant in South Korea. The Plant Display and Control System (PDCS) provides digital plant monitoring and control for the WTRF and offers the advantages of state-of-the-art digital control system technologies for operations and maintenance. The overall features of the PDCS will be described, and some of the specific approaches taken on the project to save construction time and costs, to reduce in-service life-cycle costs and to improve quality will be presented. The PDCS consists of two separate computer sub-systems: the Digital Control System (DCS) and the Plant Display System (PDS). The PDS provides the computer-based Human Machine Interface (HMI) for operators, and permits efficient supervisory or device-level monitoring and control. A System Maintenance Console (SMC) is included in the PDS for the purpose of software and hardware configuration and on-line maintenance. A Historical Data System (HDS) is also included in the PDS as a data server that continuously captures and logs process data and events for long-term storage and on-demand selective retrieval. The PDCS of WTRF has been designed and implemented based on an off-the-shelf PDS/DCS product combination, the Delta-V System from Emerson. The design includes fully redundant Ethernet network communications, controllers and power supplies, and redundancy on selected I/O modules. The DCS provides fieldbus communications to interface with third-party controllers supplied on specialized skids, and supports HART communication with field transmitters. The DCS control logic was configured using a modular and graphical approach. The control strategies are primarily device control modules implemented as autonomous control loops, and implemented using IEC 61131-3 Function Block Diagram (FBD) and Structured

  16. SMACS: a system of computer programs for probabilistic seismic analysis of structures and subsystems. Volume I. User's manual

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Johnson, J.J.; Tiong, L.W.; Mraz, M.J.; Bumpus, S.; Gerhard, M.A.

    1985-03-01

    The SMACS (Seismic Methodology Analysis Chain with Statistics) system of computer programs, one of the major computational tools of the Seismic Safety Margins Research Program (SSMRP), links the seismic input with the calculation of soil-structure interaction, major structure response, and subsystem response. The seismic input is defined by ensembles of acceleration time histories in three orthogonal directions. Soil-structure interaction and detailed structural response are then determined simultaneously, using the substructure approach to SSI as implemented in the CLASSI family of computer programs. The modus operandi of SMACS is to perform repeated deterministic analyses, each analysis simulating an earthquake occurrence. Parameter values for each simulation are sampled from assumed probability distributions according to a Latin hypercube experimental design. The user may specify values of the coefficients of variation (COV) for the distributions of the input variables. At the heart of the SMACS system is the computer program SMAX, which performs the repeated SSI response calculations for major structure and subsystem response. This report describes SMAX and the pre- and post-processor codes, used in conjunction with it, that comprise the SMACS system
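
    The Latin hypercube step described above can be illustrated with a short, hedged sketch. This is not SMACS or SMAX source code; the variable names, medians and coefficients of variation below are invented for illustration, and lognormal distributions are assumed. One stratified uniform sample per equal-probability bin is drawn for each input variable, the strata are shuffled independently across variables, and each resulting row of parameter values would drive one deterministic earthquake-response simulation.

        # Hedged sketch of a Latin hypercube experimental design -- not SMACS code.
        import numpy as np
        from statistics import NormalDist

        rng = np.random.default_rng(42)

        # Hypothetical input variables: (median, coefficient of variation) for each.
        variables = {
            "soil_shear_modulus": (1.0e8, 0.5),
            "structure_damping":  (0.05, 0.35),
            "concrete_modulus":   (3.0e10, 0.3),
        }

        def latin_hypercube(n_sims, n_vars, rng):
            """One stratified uniform sample per equal-probability bin, per variable."""
            u = (np.arange(n_sims)[:, None] + rng.random((n_sims, n_vars))) / n_sims
            for j in range(n_vars):
                rng.shuffle(u[:, j])          # decouple the strata across variables
            return u                          # shape (n_sims, n_vars), values in (0, 1)

        def lognormal_from_cov(median, cov, u_column):
            """Map uniform quantiles to a lognormal with the given median and COV."""
            u_column = np.clip(u_column, 1e-12, 1.0 - 1e-12)
            sigma = np.sqrt(np.log(1.0 + cov ** 2))
            z = np.array([NormalDist().inv_cdf(float(q)) for q in u_column])
            return median * np.exp(sigma * z)

        n_sims = 30                           # one deterministic SSI analysis per row
        u = latin_hypercube(n_sims, len(variables), rng)
        samples = {name: lognormal_from_cov(m, c, u[:, j])
                   for j, (name, (m, c)) in enumerate(variables.items())}
        for i in range(n_sims):
            params = {k: float(v[i]) for k, v in samples.items()}
            # each parameter set would drive one simulated-earthquake response analysis
            print(i, params)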

  17. Computational Modeling in Support of High Altitude Testing Facilities, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in propulsion test facility design and development by assessing risks, identifying failure modes and predicting...

  18. Computational Modeling in Support of High Altitude Testing Facilities, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in rocket engine test facility design and development by assessing risks, identifying failure modes and predicting...

  19. THEAP-I: A computer program for thermal hydraulic analysis of a thermally interacting channel bundle of complex geometry. The micro computer version user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Megaritou, A; Bartzis, J G

    1987-09-01

    In the present report the micro computer version of the code is described. More emphasis is given to the new features of the code (i.e. the input data structure). A set of instructions for running on an IBM-AT2 computer with Microsoft FORTRAN V.4.0 is also included, together with a sample problem referring to the Greek Research Reactor.

  20. Dynamic link: user's manual

    International Nuclear Information System (INIS)

    Harada, Hiroo; Asai, Kiyoshi; Kihara, Kazuhisa.

    1981-09-01

    The purpose of the dynamic link facility is to link a load module dynamically, only when it is actually used at execution time. The facility is very useful for the development, execution and maintenance of a large-scale computer program that is too big to be held in main memory as a single load module, or that is uneconomical to hold there because many subroutines remain unused for a given input. It is also useful for the standardization and common utilization of programs. Standard usage of the dynamic link facility of the FACOM M-200 computer system, a software tool which analyzes the effect of the dynamic link facility, and the application of dynamic linking to nuclear codes are described. (author)
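
    As a rough present-day analogue of such a dynamic link facility (not the FACOM M-200 implementation itself; the module name "th_solver" is a placeholder), the following Python sketch loads a subroutine package only when an input option actually calls for it, instead of binding everything into one large load module:

        # Rough analogue of a dynamic-link facility: a package is loaded only when
        # an input option actually calls for it.  Module names are hypothetical.
        import importlib

        _loaded = {}

        def call_dynamic(module_name, entry_point, *args, **kwargs):
            """Load `module_name` on first use, then call `entry_point` in it."""
            mod = _loaded.get(module_name)
            if mod is None:
                mod = importlib.import_module(module_name)   # deferred load
                _loaded[module_name] = mod
            return getattr(mod, entry_point)(*args, **kwargs)

        # Example: only load the thermal-hydraulics package if the input requests it.
        # (The module "th_solver" is a placeholder, not a real library.)
        # if input_options.get("thermal_hydraulics"):
        #     result = call_dynamic("th_solver", "run", input_options)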

  1. REBA experimenters' manual

    International Nuclear Information System (INIS)

    Schuch, R.L.

    1977-04-01

    The REBA is a high-energy, pulsed electron beam or bremsstrahlung x-ray generator whose operational purpose is to provide an energy source of short duration for conducting experiments, primarily to determine material responses to rapid surface and in-depth deposition of energy. The purpose of this manual is to serve as a basic source of information for prospective users of REBA. Included is a brief discussion of the design and operation of the facility as well as a summary of output characteristics for electron beam modes and environmental data for x-ray operation. The manual also contains a description of the REBA experimental facilities, including geometry of the test cell, instrumentation and data collection capabilities, and services and support available to experimenters

  2. The environmental survey manual

    International Nuclear Information System (INIS)

    1987-08-01

    The purpose of this manual is to provide guidance to the Survey and Sampling and Analysis teams that conduct the one-time Environmental Survey of the major US Department of Energy (DOE) operating facilities. This manual includes a discussion of DOE's policy on environmental issues; a review of statutory guidance as it applies to the Survey; the procedures and protocols to be used by the Survey teams; criteria for the use of the Survey teams in evaluating existing environmental data for the Survey effort; generic technical checklists used in every Survey; health and safety guidelines for the personnel conducting the Survey, including the identification of potential hazards, prescribed protective equipment, and emergency procedures; the required formats for the Survey reports; guidance on identifying environmental problems that need immediate attention by the Operations Office responsible for the particular facility; and procedures and protocols for the conduct of sampling and analysis

  3. NIF ICCS Test Controller for Automated and Manual Testing

    International Nuclear Information System (INIS)

    Zielinski, J S

    2007-01-01

    The National Ignition Facility (NIF) Integrated Computer Control System (ICCS) is a large (1.5 MSLOC), hierarchical, distributed system that controls all aspects of the NIF laser [1]. The ICCS team delivers software updates to the NIF facility throughout the year to support shot operations and commissioning activities. In 2006, there were 48 releases of ICCS: 29 full releases and 19 patches. To ensure the quality of each delivery, thousands of manual and automated tests are performed using the ICCS Test Controller test infrastructure. The TestController system provides test inventory management, test planning, automated test execution and manual test logging, release testing summaries and test results search, all through a web browser interface. Automated tests include command-line-based framework server tests and Graphical User Interface (GUI) based Java tests. Manual tests are presented as a checklist-style web form to be completed by the tester. The results of all tests, automated and manual, are kept in a common repository that provides data to dynamic status reports. As part of the 3-stage ICCS release testing strategy, the TestController system helps plan, evaluate and track the readiness of each release to the NIF facility
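
    A minimal sketch of the common-repository idea follows. It is not the ICCS TestController; the table layout, column names and sample entries are invented for illustration. It keeps automated and manual results in one store, keyed by release, that a status report can query:

        # Hedged sketch of a shared test-results repository (invented schema).
        import sqlite3
        from datetime import datetime, timezone

        conn = sqlite3.connect("test_results.db")
        conn.execute("""
            CREATE TABLE IF NOT EXISTS results (
                release   TEXT,
                test_name TEXT,
                kind      TEXT CHECK (kind IN ('automated', 'manual')),
                outcome   TEXT CHECK (outcome IN ('pass', 'fail', 'blocked')),
                tester    TEXT,
                logged_at TEXT
            )""")

        def log_result(release, test_name, kind, outcome, tester):
            conn.execute(
                "INSERT INTO results VALUES (?, ?, ?, ?, ?, ?)",
                (release, test_name, kind, outcome, tester,
                 datetime.now(timezone.utc).isoformat()))
            conn.commit()

        def release_summary(release):
            """Pass/fail counts per test kind for one release."""
            return conn.execute(
                "SELECT kind, outcome, COUNT(*) FROM results "
                "WHERE release = ? GROUP BY kind, outcome", (release,)).fetchall()

        log_result("12.3-patch2", "gui_shot_setup", "manual", "pass", "jsz")
        log_result("12.3-patch2", "framework_server_smoke", "automated", "pass", "ci")
        print(release_summary("12.3-patch2"))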

  4. User's manual for the program tracking the radiological status of the nuclear and radioactive facilities of the ININ; Manual del usuario programa de seguimiento del estado radiologico de las instalaciones nucleares y radiactivas del ININ

    Energy Technology Data Exchange (ETDEWEB)

    Sotelo B, D [ITESM, Servicio Social, Monterrey (Mexico); Villarreal, J E

    1992-05-15

    This program consists of a database that tracks the radiation levels in laboratories and facilities that use radioactive material or generators of ionizing radiation; measurements made in the different departments are entered into it for later analysis. (Author)

  5. Development and validation of rear impact computer simulation model of an adult manual transit wheelchair with a seated occupant.

    Science.gov (United States)

    Salipur, Zdravko; Bertocci, Gina

    2010-01-01

    It has been shown that ANSI WC19 transit wheelchairs that are crashworthy in frontal impact exhibit catastrophic failures in rear impact and may not be able to provide stable seating support and thus occupant protection for the wheelchair occupant. Thus far only limited sled test and computer simulation data have been available to study rear impact wheelchair safety. Computer modeling can be used as an economic and comprehensive tool to gain critical knowledge regarding wheelchair integrity and occupant safety. This study describes the development and validation of a computer model simulating an adult wheelchair-seated occupant subjected to a rear impact event. The model was developed in MADYMO and validated rigorously using the results of three similar sled tests conducted to specifications provided in the draft ISO/TC 173 standard. Outcomes from the model can provide critical wheelchair loading information to wheelchair and tiedown manufacturers, resulting in safer wheelchair designs for rear impact conditions. (c) 2009 IPEM. Published by Elsevier Ltd. All rights reserved.

  6. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  7. The Emergence of Large-Scale Computer Assisted Summative Examination Facilities in Higher Education

    NARCIS (Netherlands)

    Draaijer, S.; Warburton, W. I.

    2014-01-01

    A case study is presented of VU University Amsterdam where a dedicated large-scale CAA examination facility was established. In the facility, 385 students can take an exam concurrently. The case study describes the change factors and processes leading up to the decision by the institution to

  8. Potential applications of artificial intelligence in computer-based management systems for mixed waste incinerator facility operation

    International Nuclear Information System (INIS)

    Rivera, A.L.; Singh, S.P.N.; Ferrada, J.J.

    1991-01-01

    The Department of Energy/Oak Ridge Field Office (DOE/OR) operates a mixed waste incinerator facility at the Oak Ridge K-25 Site, designed for the thermal treatment of incinerable liquid, sludge, and solid waste regulated under the Toxic Substances Control Act (TSCA) and the Resource Conservation and Recovery Act (RCRA). Operation of the TSCA Incinerator is highly constrained as a result of the regulatory, institutional, technical, and resource availability requirements. This presents an opportunity for applying computer technology as a technical resource for mixed waste incinerator operation to facilitate promoting and sustaining a continuous performance improvement process while demonstrating compliance. This paper describes mixed waste incinerator facility performance-oriented tasks that could be assisted by Artificial Intelligence (AI) and the requirements for AI tools that would implement these algorithms in a computer-based system. 4 figs., 1 tab

  9. MANUAL LOGIC CONTROLLER (MLC)

    OpenAIRE

    Claude Ziad Bayeh

    2015-01-01

    The “Manual Logic Controller” also called MLC, is an electronic circuit invented and designed by the author in 2008, in order to replace the well known PLC (Programmable Logic Controller) in many applications for its advantages and its low cost of fabrication. The function of the MLC is somewhat similar to the well known PLC, but instead of doing it by inserting a written program into the PLC using a computer or specific software inside the PLC, it will be manually programmed in a manner to h...

  10. TRUBA User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Tereshchenko, M. A.; Castejon, F.; Cappa, A.

    2008-04-25

    The TRUBA (pipeline in Russian) code is a computational tool for studying the propagation of Gaussian-shaped microwave beams in a prescribed equilibrium plasma. This manual covers the basic material needed to use the implementation of TRUBA (version 3.4) interfaced with the numerical library of the TJ-II stellarator. The manual provides a concise theoretical background of the problem, specifications for setting up the input files and interpreting the output of the code, and some information useful in modifying TRUBA. (Author) 13 refs.

  11. TRUBA User Manual

    International Nuclear Information System (INIS)

    Tereshchenko, M. A.; Castejon, F.; Cappa, A.

    2008-01-01

    The TRUBA (pipeline in Russian) code is a computational tool for studying the propagation of Gaussian-shaped microwave beams in a prescribed equilibrium plasma. This manual covers the basic material needed to use the implementation of TRUBA (version 3.4) interfaced with the numerical library of the TJ-II stellarator. The manual provides a concise theoretical background of the problem, specifications for setting up the input files and interpreting the output of the code, and some information useful in modifying TRUBA. (Author) 13 refs

  12. Effects of a manualized short-term treatment of internet and computer game addiction (STICA): study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Jäger Susanne

    2012-04-01

    Full Text Available Abstract Background In the last few years, excessive internet use and computer gaming have increased dramatically. Salience, mood modification, tolerance, withdrawal symptoms, conflict, and relapse have been defined as diagnostic criteria for internet addiction (IA) and computer addiction (CA) in the scientific community. Despite a growing number of individuals seeking help, there are no specific treatments of established efficacy. Methods/design This clinical trial aims to determine the effect of the disorder-specific manualized short-term treatment of IA/CA (STICA). The cognitive behavioural treatment combines individual and group interventions with a total duration of 4 months. Patients will be randomly assigned to STICA treatment or to a wait list control group. Reliable and valid measures of IA/CA and co-morbid mental symptoms (for example social anxiety, depression) will be assessed prior to the beginning, in the middle, at the end, and 6 months after completion of treatment. Discussion A treatment of IA/CA will establish efficacy and is desperately needed. As this is the first trial to determine efficacy of a disorder specific treatment, a wait list control group will be implemented. Pros and cons of the design were discussed. Trial Registration ClinicalTrials (NCT01434589)

  13. Effects of a manualized short-term treatment of internet and computer game addiction (STICA): study protocol for a randomized controlled trial

    Science.gov (United States)

    2012-01-01

    Background In the last few years, excessive internet use and computer gaming have increased dramatically. Salience, mood modification, tolerance, withdrawal symptoms, conflict, and relapse have been defined as diagnostic criteria for internet addiction (IA) and computer addiction (CA) in the scientific community. Despite a growing number of individuals seeking help, there are no specific treatments of established efficacy. Methods/design This clinical trial aims to determine the effect of the disorder-specific manualized short-term treatment of IA/CA (STICA). The cognitive behavioural treatment combines individual and group interventions with a total duration of 4 months. Patients will be randomly assigned to STICA treatment or to a wait list control group. Reliable and valid measures of IA/CA and co-morbid mental symptoms (for example social anxiety, depression) will be assessed prior to the beginning, in the middle, at the end, and 6 months after completion of treatment. Discussion A treatment of IA/CA will establish efficacy and is desperately needed. As this is the first trial to determine efficacy of a disorder specific treatment, a wait list control group will be implemented. Pros and cons of the design were discussed. Trial Registration ClinicalTrials (NCT01434589) PMID:22540330

  14. Effects of a manualized short-term treatment of internet and computer game addiction (STICA): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Jäger, Susanne; Müller, Kai W; Ruckes, Christian; Wittig, Tobias; Batra, Anil; Musalek, Michael; Mann, Karl; Wölfling, Klaus; Beutel, Manfred E

    2012-04-27

    In the last few years, excessive internet use and computer gaming have increased dramatically. Salience, mood modification, tolerance, withdrawal symptoms, conflict, and relapse have been defined as diagnostic criteria for internet addiction (IA) and computer addiction (CA) in the scientific community. Despite a growing number of individuals seeking help, there are no specific treatments of established efficacy. This clinical trial aims to determine the effect of the disorder-specific manualized short-term treatment of IA/CA (STICA). The cognitive behavioural treatment combines individual and group interventions with a total duration of 4 months. Patients will be randomly assigned to STICA treatment or to a wait list control group. Reliable and valid measures of IA/CA and co-morbid mental symptoms (for example social anxiety, depression) will be assessed prior to the beginning, in the middle, at the end, and 6 months after completion of treatment. A treatment of IA/CA will establish efficacy and is desperately needed. As this is the first trial to determine efficacy of a disorder specific treatment, a wait list control group will be implemented. Pros and cons of the design were discussed. ClinicalTrials (NCT01434589).

  15. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-IV. User's manual

    International Nuclear Information System (INIS)

    2001-01-01

    generated by each plant and the user specified characteristics of fuels used. Expanded dimensions allow handling up to 90 types of plants and a larger number of configurations (up to 500 per year and 5000 for the study period). The present manual supports the use of the WASP-IV version and illustrates the capabilities of the model. This manual contains 13 chapters. Chapter 1 gives a summary description of the WASP-IV computer code and its modules and file system. Chapter 2 explains the hardware requirements and the installation of the package; the sequence of execution of WASP-IV is also briefly introduced in this chapter. Chapters 3 to 9 explain, in detail, how to execute each module of the WASP-IV package, the organisation of the input files, and the output from a run of the model. Special attention is paid to the description of the linkage of modules. Chapter 10 specifically guides the user on how to search effectively for an optimal solution. Chapter 11 describes the execution of sensitivity analyses that can be (and are recommended to be) performed with WASP-IV. To ease debugging while running the software, Chapter 12 provides technical details of the new features incorporated in this version. Chapter 13 provides a list of the error and warning messages produced by each module of WASP. The reader of this manual is assumed to have experience in the field of power generation expansion planning and to be familiar with all concepts related to this type of analysis; therefore these aspects are not treated in this manual. Additional information on power generation expansion planning can be found in the IAEA publication 'Expansion Planning for Electrical Generating Systems, A Guidebook', Technical Reports Series No. 241 (1984), or in the User's Manual of WASP-IV Plus, Computer Manual Series No. 8 (1995)

  16. MINTEQ user's manual

    International Nuclear Information System (INIS)

    Peterson, S.R.; Hostetler, C.J.; Deutsch, W.J.; Cowan, C.E.

    1987-02-01

    This manual will aid the user in applying the MINTEQ geochemical computer code to model aqueous solutions and the interactions of aqueous solutions with hypothesized assemblages of solid phases. The manual provides a basic understanding of how the MINTEQ code operates and of the important principles incorporated into it, and instructs the user on how to create input files to simulate a variety of geochemical problems. Chapters 2 through 8 are for the user who has some experience with, or wishes to review, the principles important to geochemical computer codes. These chapters include information on the methodology MINTEQ uses to incorporate these principles into the code. Chapters 9 through 11 are for the user who wants to know how to create input data files to model various types of problems. 35 refs., 2 figs., 5 tabs

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  18. Nuclear fuel cycle facility accident analysis handbook

    International Nuclear Information System (INIS)

    Ayer, J.E.; Clark, A.T.; Loysen, P.; Ballinger, M.Y.; Mishima, J.; Owczarski, P.C.; Gregory, W.S.; Nichols, B.D.

    1988-05-01

    The Accident Analysis Handbook (AAH) covers four generic facilities: fuel manufacturing, fuel reprocessing, waste storage/solidification, and spent fuel storage; and six accident types: fire, explosion, tornado, criticality, spill, and equipment failure. These are the accident types considered to make major contributions to the radiological risk from accidents in nuclear fuel cycle facility operations. The AAH will enable the user to calculate source term releases from accident scenarios manually or by computer. A major feature of the AAH is development of accident sample problems to provide input to source term analysis methods and transport computer codes. Sample problems and illustrative examples for different accident types are included in the AAH
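
    The kind of manual source-term calculation the handbook supports can be sketched as follows, with the caveat that the factor names and values below are generic illustrations rather than the AAH's own data: an airborne respirable release is estimated as the product of the material at risk and a chain of release and transport fractions.

        # Generic source-term sketch; all factor names and values are illustrative.
        def source_term(material_at_risk_g, damage_ratio, airborne_release_fraction,
                        respirable_fraction, leak_path_factor):
            """Respirable quantity (grams) released to the environment."""
            return (material_at_risk_g * damage_ratio * airborne_release_fraction
                    * respirable_fraction * leak_path_factor)

        # Example: a hypothetical fire involving powder in a glovebox.
        release_g = source_term(
            material_at_risk_g=500.0,        # powder inventory involved
            damage_ratio=0.1,                # fraction of inventory affected by the fire
            airborne_release_fraction=6e-3,  # fraction made airborne
            respirable_fraction=0.01,        # respirable share of the airborne material
            leak_path_factor=0.1,            # attenuation through the ventilation path
        )
        print(f"respirable release: {release_g:.3e} g")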

  19. Testing SLURM open source batch system for a Tier1/Tier2 HEP computing facility

    Science.gov (United States)

    Donvito, Giacinto; Salomoni, Davide; Italiano, Alessandro

    2014-06-01

    In this work the testing activities that were carried on to verify if the SLURM batch system could be used as the production batch system of a typical Tier1/Tier2 HEP computing center are shown. SLURM (Simple Linux Utility for Resource Management) is an Open Source batch system developed mainly by the Lawrence Livermore National Laboratory, SchedMD, Linux NetworX, Hewlett-Packard, and Groupe Bull. Testing was focused both on verifying the functionalities of the batch system and the performance that SLURM is able to offer. We first describe our initial set of requirements. Functionally, we started configuring SLURM so that it replicates all the scheduling policies already used in production in the computing centers involved in the test, i.e. INFN-Bari and the INFN-Tier1 at CNAF, Bologna. Currently, the INFN-Tier1 is using IBM LSF (Load Sharing Facility), while INFN-Bari, an LHC Tier2 for both CMS and Alice, is using Torque as resource manager and MAUI as scheduler. We show how we configured SLURM in order to enable several scheduling functionalities such as Hierarchical FairShare, Quality of Service, user-based and group-based priority, limits on the number of jobs per user/group/queue, job age scheduling, job size scheduling, and scheduling of consumable resources. We then show how different job typologies, like serial, MPI, multi-thread, whole-node and interactive jobs can be managed. Tests on the use of ACLs on queues or in general other resources are then described. A peculiar SLURM feature we also verified is triggers on event, useful to configure specific actions on each possible event in the batch system. We also tested highly available configurations for the master node. This feature is of paramount importance since a mandatory requirement in our scenarios is to have a working farm cluster even in case of hardware failure of the server(s) hosting the batch system. Among our requirements there is also the possibility to deal with pre-execution and post

  20. Testing SLURM open source batch system for a Tier1/Tier2 HEP computing facility

    International Nuclear Information System (INIS)

    Donvito, Giacinto; Italiano, Alessandro; Salomoni, Davide

    2014-01-01

    In this work the testing activities that were carried on to verify if the SLURM batch system could be used as the production batch system of a typical Tier1/Tier2 HEP computing center are shown. SLURM (Simple Linux Utility for Resource Management) is an Open Source batch system developed mainly by the Lawrence Livermore National Laboratory, SchedMD, Linux NetworX, Hewlett-Packard, and Groupe Bull. Testing was focused both on verifying the functionalities of the batch system and the performance that SLURM is able to offer. We first describe our initial set of requirements. Functionally, we started configuring SLURM so that it replicates all the scheduling policies already used in production in the computing centers involved in the test, i.e. INFN-Bari and the INFN-Tier1 at CNAF, Bologna. Currently, the INFN-Tier1 is using IBM LSF (Load Sharing Facility), while INFN-Bari, an LHC Tier2 for both CMS and Alice, is using Torque as resource manager and MAUI as scheduler. We show how we configured SLURM in order to enable several scheduling functionalities such as Hierarchical FairShare, Quality of Service, user-based and group-based priority, limits on the number of jobs per user/group/queue, job age scheduling, job size scheduling, and scheduling of consumable resources. We then show how different job typologies, like serial, MPI, multi-thread, whole-node and interactive jobs can be managed. Tests on the use of ACLs on queues or in general other resources are then described. A peculiar SLURM feature we also verified is triggers on event, useful to configure specific actions on each possible event in the batch system. We also tested highly available configurations for the master node. This feature is of paramount importance since a mandatory requirement in our scenarios is to have a working farm cluster even in case of hardware failure of the server(s) hosting the batch system. Among our requirements there is also the possibility to deal with pre-execution and post
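
    As a hedged illustration of exercising the job typologies mentioned above (serial, MPI and whole-node jobs) from Python, the sketch below wraps sbatch submissions. The partition and QOS names are placeholders, and the actual fair-share, QOS and limit policies would live in the SLURM configuration rather than in this wrapper:

        # Hedged sketch of submitting different job typologies through sbatch.
        # Partition/QOS names and executables are placeholders.
        import subprocess

        def submit(command, *, job_name, partition, qos, ntasks=1, exclusive=False,
                   walltime="01:00:00"):
            args = ["sbatch", "--parsable",
                    f"--job-name={job_name}",
                    f"--partition={partition}",
                    f"--qos={qos}",
                    f"--ntasks={ntasks}",
                    f"--time={walltime}",
                    "--wrap", command]
            if exclusive:
                args.insert(1, "--exclusive")        # whole-node job
            out = subprocess.run(args, capture_output=True, text=True, check=True)
            return out.stdout.strip()                # job id

        # serial job
        print(submit("./serial_analysis", job_name="serial", partition="main",
                     qos="normal"))
        # MPI job across 64 tasks
        print(submit("mpirun ./reco", job_name="mpi_reco", partition="main",
                     qos="high", ntasks=64))
        # whole-node job
        print(submit("./multithread_fit", job_name="wholenode", partition="main",
                     qos="normal", exclusive=True))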

  1. The minimal manual: is less really more?

    NARCIS (Netherlands)

    Lazonder, Adrianus W.; van der Meij, Hans

    1993-01-01

    Carroll, Smith-Kerker, Ford and Mazur-Rimetz (The minimal manual, Human-Computer Interaction, 3, 123-153, 1987) have introduced the minimal manual as an alternative to standard self-instruction manuals. While their research indicates strong gains, only a few attempts have been made to validate

  2. User's manual for the CC3 computer models of the concept for disposal of Canada's nuclear fuel waste

    International Nuclear Information System (INIS)

    Dougan, K.D.; Wojciechowski, L.C.

    1995-06-01

    Atomic Energy of Canada Limited (AECL) is assessing a concept for disposing of CANDU reactor fuel waste in a vault deep in plutonic rock of the Canadian Shield. A computer program called the Systems Variability Analysis Code (SYVAC) has been developed as an analytical tool for the postclosure (long-term) assessment of the concept, and for environmental assessments of other systems. SYVAC3, the third generation of the code, is an executive program that directs repeated simulation of the disposal system, which is represented by the CC3 (Canadian Concept, generation 3) models comprising a design-specific vault, a site-specific geosphere and a biosphere typical of the Canadian Shield. (author). 23 refs., 7 tabs., 21 figs
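
    The repeated-simulation idea behind such an executive can be sketched as follows. This is not the SYVAC3/CC3 code; the three model functions and parameter distributions are placeholders chosen only to show the sampling-and-chaining structure (vault release, geosphere transport, biosphere dose):

        # Hedged sketch of a systems-variability executive; all models are placeholders.
        import random

        def vault_release(params):
            # placeholder: release from the engineered vault
            return params["inventory"] * params["release_fraction"]

        def geosphere_transport(release, params):
            # placeholder: attenuation along the groundwater pathway
            return release * params["geosphere_attenuation"]

        def biosphere_dose(discharge, params):
            # placeholder: dose conversion in a Shield-type biosphere
            return discharge * params["dose_conversion"]

        def sample_parameters(rng):
            return {
                "inventory": rng.lognormvariate(0.0, 0.5),
                "release_fraction": rng.uniform(1e-6, 1e-4),
                "geosphere_attenuation": rng.uniform(1e-4, 1e-2),
                "dose_conversion": rng.lognormvariate(-2.0, 0.3),
            }

        def run_simulations(n_sims, seed=1):
            rng = random.Random(seed)
            doses = []
            for _ in range(n_sims):
                p = sample_parameters(rng)
                dose = biosphere_dose(geosphere_transport(vault_release(p), p), p)
                doses.append(dose)
            return doses

        doses = run_simulations(1000)
        print("mean dose estimate:", sum(doses) / len(doses))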

  3. COBRA-SFS [Spent Fuel Storage]: A thermal-hydraulic analysis computer code: Volume 2, User's manual

    International Nuclear Information System (INIS)

    Rector, D.R.; Cuta, J.M.; Lombardo, N.J.; Michener, T.E.; Wheeler, C.L.

    1986-11-01

    COBRA-SFS (Spent Fuel Storage) is a general thermal-hydraulic analysis computer code used to predict temperatures and velocities in a wide variety of systems. The code was refined and specialized for spent fuel storage system analyses for the US Department of Energy's Commercial Spent Fuel Management Program. The finite-volume equations governing mass, momentum, and energy conservation are written for an incompressible, single-phase fluid. The flow equations model a wide range of conditions including natural circulation. The energy equations include the effects of solid and fluid conduction, natural convection, and thermal radiation. The COBRA-SFS code is structured to perform both steady-state and transient calculations; however, the transient capability has not yet been validated. This volume contains the input instructions for COBRA-SFS and an auxiliary radiation exchange factor code, RADX-1. It is intended to aid the user in becoming familiar with the capabilities and modeling conventions of the code

  4. GRACE manual

    International Nuclear Information System (INIS)

    Ishikawa, T.; Kawabata, S.; Shimizu, Y.; Kaneko, T.; Kato, K.; Tanaka, H.

    1993-02-01

    This manual is composed of three kinds of material: the theoretical background for calculating the cross section of an elementary process, the usage of the GRACE system, and its technical details. Throughout this manual we take the tree-level process e⁺e⁻ → W⁺W⁻γ as an example, including the e±-scalar boson interactions. The actual FORTRAN source code for this process is attached in the relevant sections, as well as the results of the calculation, which should be a great help for understanding the practical use of the system. (J.P.N.)

  5. BUCLASP 3: A computer program for stresses and buckling of heated composite stiffened panels and other structures, user's manual

    Science.gov (United States)

    Tripp, L. L.; Tamekuni, M.; Viswanathan, A. V.

    1973-01-01

    The use of the computer program BUCLASP3 is described. The code is intended for thermal stress and instability analyses of structures such as unidirectionally stiffened panels. There are two types of instability analyses that can be effected by PAINT: (1) thermal buckling, and (2) buckling due to a specified in-plane biaxial loading. Any structure that has a constant cross section in one direction and that may be idealized as an assemblage of beam elements and laminated flat and curved plate strip-elements can be analyzed. The two parallel ends of the panel must be simply supported, whereas arbitrary elastic boundary conditions may be imposed along either one or both external longitudinal sides. Any variation in the temperature rise (from ambient) through the cross section of a panel is considered in the analyses, but the temperature field must be assumed to be constant in the longitudinal direction. Load distributions for the externally applied in-plane biaxial loads are similar in nature to the permissible temperature field.

  6. User's manual of a supporting system for treatment planning in boron neutron capture therapy. JAERI computational dosimetry system

    CERN Document Server

    Kumada, H

    2002-01-01

    Boron neutron capture therapy (BNCT) with an epithermal neutron beam is expected to treat effectively malignant tumors located deep in the brain. To perform epithermal neutron beam BNCT, it is indispensable to estimate in advance the irradiation dose in the brain of the patient. Thus, the JAERI Computational Dosimetry System (JCDS), which can calculate the dose distributions in the brain, has been developed. JCDS is a software system that creates a 3-dimensional head model of a patient from CT and MRI images, automatically generates an input data file for calculating the neutron flux and gamma-ray dose distribution in the brain with the Monte Carlo code MCNP, and displays the dose distribution on the head model for dosimetry using the MCNP calculation results. JCDS has several advantages, as follows. By treating CT and MRI data, which are medical images, a detailed three-dimensional model of the patient's head can be made easily. The three-dimensional head image is editable to ...

  7. Hydrogen Mixing Studies (HMS), user's manual

    International Nuclear Information System (INIS)

    Lam, K.L.; Wilson, T.L.; Travis, J.R.

    1994-12-01

    Hydrogen Mixing Studies (HMS) is a best-estimate analysis tool for predicting the transport, mixing, and combustion of hydrogen and other gases in nuclear reactor containments and other facilities. It can model geometrically complex facilities having multiple compartments and internal structures. The code can simulate the effects of steam condensation, heat transfer to walls and internal structures, chemical kinetics, and fluid turbulence. The gas mixture may consist of components included in a built-in library of 20 species. HMS is a finite-volume computer code that solves the time-dependent, three-dimensional (3D) compressible Navier Stokes equations. Both Cartesian and cylindrical coordinate systems are available. Transport equations for the fluid internal energy and for gas species densities are also solved. HMS was originally developed to run on Cray-type supercomputers with vector-processing units that greatly improve the computational speed, especially for large, complex problems. Recently the code has been converted to run on Sun workstations. Both the Cray and Sun versions have the same built-in graphics capabilities that allow 1D, 2D, 3D, and time-history plots of all solution variables. Other code features include a restart capability and flexible definitions of initial and time-dependent boundary conditions. This manual describes how to use the code. It explains how to set up the model geometry, define walls and obstacles, and specify gas species and material properties. Definitions of initial and boundary conditions are also described. The manual also describes various physical model and numerical procedure options, as well as how to turn them on. The reader also learns how to specify different outputs, especially graphical display of solution variables. Finally sample problems are included to illustrate some applications of the code. An input deck that illustrates the minimum required data to run HMS is given at the end of this manual
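
    The finite-volume transport idea underlying such codes can be shown with a deliberately minimal sketch; nothing here is taken from HMS itself, and the grid, velocity field and time step are arbitrary. A single gas species is advected along a closed 1D row of cells with first-order upwind fluxes:

        # Minimal 1D finite-volume advection sketch (illustrative only, not HMS).
        import numpy as np

        n_cells, dx, dt = 100, 0.1, 0.01
        u = 0.5 * np.ones(n_cells + 1)          # face velocities (m/s), constant here
        rho_h2 = np.zeros(n_cells)              # hydrogen density per cell
        rho_h2[:10] = 1.0                       # initial slug near the source

        for _ in range(500):
            flux = np.zeros(n_cells + 1)        # closed boundaries: zero end fluxes
            # upwind: each interior face carries the density of the upstream cell
            flux[1:-1] = np.where(u[1:-1] > 0, rho_h2[:-1], rho_h2[1:]) * u[1:-1]
            rho_h2 -= dt / dx * (flux[1:] - flux[:-1])

        print("total mass (conserved):", rho_h2.sum() * dx)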

  8. Manual on decontamination of surfaces

    International Nuclear Information System (INIS)

    1979-01-01

    The manual is intended for those who are responsible for the organization and implementation of decontamination programmes for facilities where radioactive materials are handled mainly on a laboratory scale. It contains information and guidelines on practical methods for decontaminating working spaces, equipment, laboratory benches and protective clothing. Useful information is also provided on the removal of loose skin contamination from personnel by mild, non-medical processes. Methods of removing skin contamination needing medical supervision, or of internal decontamination, which is entirely a medical process, are not covered in this manual. Large-scale decontamination of big nuclear facilities is also considered as outside its scope

  9. Waste Management Technical Manual

    Energy Technology Data Exchange (ETDEWEB)

    Buckingham, J.S. [ed.]

    1967-08-31

    This Manual has been prepared to provide a documented compendium of the technical bases and general physical features of Isochem Incorporated's Waste Management Program. The manual is intended to be used as a means of training and as a reference handbook for use by personnel responsible for executing the Waste Management Program. The material in this manual was assembled by members of Isochem's Chemical Processing Division, Battelle Northwest Laboratory, and Hanford Engineering Services between September 1965 and March 1967. The manual is divided into the following parts: Introduction, contains a summary of the overall Waste Management Program. It is written to provide the reader with a synoptic view and as an aid in understanding the subsequent parts; Feed Material, contains detailed discussion of the type and sources of feed material used in the Waste Management Program, including a chapter on nuclear reactions and the formation of fission products; Waste Fractionization Plant Processing, contains detailed discussions of the processes used in the Waste Fractionization Plant with supporting data and documentation of the technology employed; Waste Fractionization Plant Product and Waste Effluent Handling, contains detailed discussions of the methods of handling the product and waste material generated by the Waste Fractionization Plant; Plant and Equipment, describes the layout of the Waste Management facilities, arrangement of equipment, and individual equipment pieces; Process Control, describes the instruments and analytical methods used for process control; and Safety, describes process hazards and the methods used to safeguard against them.

  10. User's manual of a supporting system for treatment planning in boron neutron capture therapy. JAERI computational dosimetry system

    Energy Technology Data Exchange (ETDEWEB)

    Kumada, Hiroaki; Torii, Yoshiya [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2002-09-01

    Boron neutron capture therapy (BNCT) with an epithermal neutron beam is expected to treat effectively malignant tumors located deep in the brain. To perform epithermal neutron beam BNCT, it is indispensable to estimate in advance the irradiation dose in the brain of the patient. Thus, the JAERI Computational Dosimetry System (JCDS), which can calculate the dose distributions in the brain, has been developed. JCDS is a software system that creates a 3-dimensional head model of a patient from CT and MRI images, automatically generates an input data file for calculating the neutron flux and gamma-ray dose distribution in the brain with the Monte Carlo code MCNP, and displays the dose distribution on the head model for dosimetry using the MCNP calculation results. JCDS has several advantages, as follows. By treating CT and MRI data, which are medical images, a detailed three-dimensional model of the patient's head can be made easily. The three-dimensional head image can be edited to simulate the state of the head after surgical procedures such as skin flap opening and bone removal, for the BNCT with craniotomy that is being performed in Japan. JCDS can provide information to the Patient Setting System to set the patient in the actual irradiation position swiftly and accurately. This report describes the basic design and dosimetry procedure, the operation manual, and the data and library structure of JCDS (ver.1.0). (author)

  11. RAMONA-4B a computer code with three-dimensional neutron kinetics for BWR and SBWR system transients - user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Rohatgi, U.S.; Cheng, H.S.; Khan, H.J.; Mallen, A.N.; Neymotin, L.Y.

    1998-03-01

    This document is the User's Manual for the Boiling Water Reactor (BWR) and Simplified Boiling Water Reactor (SBWR) systems transient code RAMONA-4B. The code uses a three-dimensional neutron-kinetics model coupled with a multichannel, nonequilibrium, drift-flux, phase-flow model of the thermal hydraulics of the reactor vessel. The code is designed to analyze a wide spectrum of BWR core and system transients. Chapter 1 gives an overview of the code's capabilities and limitations; Chapter 2 describes the code's structure, lists major subroutines, and discusses the computer requirements. Chapter 3 covers the code, auxiliary codes, and instructions for running RAMONA-4B on Sun SPARC and IBM workstations. Chapter 4 contains component descriptions and detailed card-by-card input instructions. Chapter 5 provides samples of the tabulated output for the steady-state and transient calculations and discusses the plotting procedures for both. Three appendices contain important user and programmer information: lists of plot variables (Appendix A), listings of the input deck for the sample problem (Appendix B), and a description of the plotting program PAD (Appendix C). 24 refs., 18 figs., 11 tabs.

  12. FTMP (Fault Tolerant Multiprocessor) programmer's manual

    Science.gov (United States)

    Feather, F. E.; Liceaga, C. A.; Padilla, P. A.

    1986-01-01

    The Fault Tolerant Multiprocessor (FTMP) computer system was constructed using the Rockwell/Collins CAPS-6 processor. It is installed in the Avionics Integration Research Laboratory (AIRLAB) of NASA Langley Research Center. It is hosted by AIRLAB's System 10, a VAX 11/750, for the loading of programs and experimentation. The FTMP support software includes a cross compiler for a high-level language called the Automated Engineering Design (AED) System, an assembler for the CAPS-6 processor assembly language, and a linker. Access to this support software is through an automated remote access facility on the VAX which relieves the user of the burden of learning how to use the IBM 4381. This manual is a compilation of information about the FTMP support environment. It explains the FTMP software and support environment along with many of the finer points of running programs on FTMP. This will be helpful to the researcher trying to run an experiment on FTMP and even to the person probing FTMP with fault injections. Much of the information in this manual can be found in other sources; we are only attempting to bring together the basic points in a single source. If the reader should need points clarified, there is a list of support documentation in the back of this manual.

  13. Draft of diagnostic techniques for primary coolant circuit facilities using control computer

    International Nuclear Information System (INIS)

    Suchy, R.; Procka, V.; Murin, V.; Rybarova, D.

    A method is proposed for the in-service on-line diagnostics of selected primary circuit parts by means of a control computer. Computer processing will involve measurements of the neutron flux, the pressure differences in the pumps and in the core, and the vibrations of primary circuit mechanical parts. (H.S.)

  14. National Ignition Facility sub-system design requirements computer system SSDR 1.5.1

    International Nuclear Information System (INIS)

    Spann, J.; VanArsdall, P.; Bliss, E.

    1996-01-01

    This System Design Requirement document establishes the performance, design, development and test requirements for the Computer System, WBS 1.5.1 which is part of the NIF Integrated Computer Control System (ICCS). This document responds directly to the requirements detailed in ICCS (WBS 1.5) which is the document directly above

  15. A computational test facility for distributed analysis of gravitational wave signals

    International Nuclear Information System (INIS)

    Amico, P; Bosi, L; Cattuto, C; Gammaitoni, L; Punturo, M; Travasso, F; Vocca, H

    2004-01-01

    In the gravitational wave detector Virgo, the in-time detection of a gravitational wave signal from a coalescing binary stellar system is an intensive computational task. A parallel computing scheme using the message passing interface (MPI) is described. Performance results on a small-scale cluster are reported
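
    A hedged sketch of such an MPI scheme follows, using mpi4py rather than the Virgo pipeline, with a synthetic data stretch and a placeholder matched-filter statistic: the template bank is split across ranks and the best match is reduced back to rank 0.

        # Hedged mpi4py sketch of splitting a template bank across ranks.
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # Every rank filters the same synthetic data stretch against its share of templates.
        data = np.sin(np.linspace(0.0, 100.0, 4096))
        data += 0.1 * np.random.default_rng(0).standard_normal(data.size)

        n_templates = 10_000
        my_templates = range(rank, n_templates, size)      # round-robin split of the bank

        def match(template_id, data):
            """Placeholder for the matched-filter statistic of one template."""
            template = np.sin(np.linspace(0.0, 100.0 + 1e-3 * template_id, data.size))
            return float(abs(np.dot(template, data)) / np.linalg.norm(template))

        local_best = max((match(t, data), t) for t in my_templates)
        best = comm.reduce(local_best, op=MPI.MAX, root=0)  # lexicographic max of (score, id)

        if rank == 0:
            score, template_id = best
            print(f"best-matching template {template_id} with statistic {score:.2f}")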

  16. National Ignition Facility system design requirements NIF integrated computer controls SDR004

    International Nuclear Information System (INIS)

    Bliss, E.

    1996-01-01

    This System Design Requirement document establishes the performance, design, development, and test requirements for the NIF Integrated Computer Control System. The Integrated Computer Control System (ICCS) is covered in NIF WBS element 1.5. This document responds directly to the requirements detailed in the NIF Functional Requirements/Primary Criteria, and is supported by subsystem design requirements documents for each major ICCS Subsystem

  17. Laser performance operations model (LPOM): a computational system that automates the setup and performance analysis of the national ignition facility

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, M; House, R; Williams, W; Haynam, C; White, R; Orth, C; Sacks, R [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA, 94550 (United States)], E-mail: shaw7@llnl.gov

    2008-05-15

    The National Ignition Facility (NIF) is a stadium-sized facility containing a 192-beam, 1.8 MJ, 500-TW, 351-nm laser system together with a 10-m diameter target chamber with room for many target diagnostics. NIF will be the world's largest laser experimental system, providing a national center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. A computational system, the Laser Performance Operations Model (LPOM), has been developed and deployed that automates the laser setup process and accurately predicts laser energetics. LPOM determines the settings of the injection laser system required to achieve the desired main laser output, provides equipment protection, determines the diagnostic setup, and supplies post-shot data analysis and reporting.

  18. User Delay Cost Model and Facilities Maintenance Cost Model for a Terminal Control Area : Volume 2. User's Manual and Program Documentation for the User Delay Cost Model

    Science.gov (United States)

    1978-05-01

    The User Delay Cost Model (UDCM) is a Monte Carlo simulation of certain classes of movement of air traffic in the Boston Terminal Control Area (TCA). It incorporates a weather module, an aircraft generation module, a facilities module, and an air con...

  19. Advantages for the introduction of computer techniques in centralized supervision of radiation levels in nuclear facilities

    International Nuclear Information System (INIS)

    Vialettes, H.; Leblanc, P.

    1980-01-01

    A new computerized information system at the Saclay Center comprising 120 measuring channels is described. The advantages offered by this system with respect to the systems in use up to now are presented. Experimental results are given which support the argument that the system can effectively supervise the radioisotope facility at the Center. (B.G.)

  20. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    Energy Technology Data Exchange (ETDEWEB)

    Kostin, Mikhail [Michigan State Univ., East Lansing, MI (United States); Mokhov, Nikolai [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Niita, Koji [Research Organization for Information Science and Technology, Ibaraki-ken (Japan)

    2013-09-25

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran77, Fortran 90 or C. The module is largely independent of the radiation transport codes it can be used with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and networks of workstations, where interference from other users is possible.
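
    The checkpoint idea can be illustrated with a short, hedged sketch that is unrelated to the MARS15/PHITS interface itself: a long particle-history loop periodically saves its random-number state and accumulated tallies so that a later run can resume from the last checkpoint file. The file name and tally are placeholders.

        # Hedged sketch of a checkpoint/restart loop (not the framework described above).
        import os
        import pickle
        import random

        CHECKPOINT = "transport.ckpt"

        def run(n_histories, checkpoint_every=100_000):
            if os.path.exists(CHECKPOINT):
                with open(CHECKPOINT, "rb") as f:
                    state = pickle.load(f)               # resume from last checkpoint
                start, tally, rng_state = state["next"], state["tally"], state["rng"]
                random.setstate(rng_state)
            else:
                start, tally = 0, 0.0
                random.seed(12345)

            for history in range(start, n_histories):
                tally += random.expovariate(1.0)         # stand-in for one particle history
                if (history + 1) % checkpoint_every == 0:
                    with open(CHECKPOINT, "wb") as f:
                        pickle.dump({"next": history + 1, "tally": tally,
                                     "rng": random.getstate()}, f)
            return tally / n_histories

        print("mean tally per history:", run(1_000_000))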

  1. Simple computational modeling for human extracorporeal irradiation using the BNCT facility of the RA-3 Reactor

    International Nuclear Information System (INIS)

    Farias, Ruben; Gonzalez, S.J.; Bellino, A.; Sztenjberg, M.; Pinto, J.; Thorp, Silvia I.; Gadan, M.; Pozzi, Emiliano; Schwint, Amanda E.; Heber, Elisa M.; Trivillin, V.A.; Zarza, Leandro G.; Estryk, Guillermo; Miller, M.; Bortolussi, S.; Soto, M.S.; Nigg, D.W.

    2009-01-01

    We present a simple computational model of the reactor RA-3 developed using the Monte Carlo transport code MCNP. The model parameters are adjusted in order to reproduce experimentally measured points in air, and the source validation is performed in an acrylic phantom. Performance analysis is carried out using computational models of animal extracorporeal irradiation of the liver and lung. Analysis is also performed inside a neutron-shielded receptacle used for the irradiation of rats with a model of hepatic metastases. The computational model reproduces the experimental behavior in all the analyzed cases with a maximum difference of 10 percent. (author)

  2. A personal computer code for seismic evaluations of nuclear power plants facilities

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulos, A.J.; Graves, H.

    1990-01-01

    The program CARES (Computer Analysis for Rapid Evaluation of Structures) is an integrated computational system being developed by Brookhaven National Laboratory (BNL) for the U.S. Nuclear Regulatory Commission. It is specifically designed to be a personal computer (PC) operated package which may be used to determine the validity and accuracy of analysis methodologies used for structural safety evaluations of nuclear power plants. CARES is structured in a modular format. Each module performs a specific type of analysis i.e., static or dynamic, linear or nonlinear, etc. This paper describes the various features which have been implemented into the Seismic Module of CARES

  3. Specific features of organizng the computer-aided design of radio-electronic equipment for electrophysical facilities

    International Nuclear Information System (INIS)

    Mozin, I.V.; Vasil'ev, M.P.

    1985-01-01

    Problems of developing systems for the computer-aided design (CAD) of radioelectronic equipment for large electrophysical facilities, such as new-generation charged particle accelerators, are discussed. The PLATA subsystem, which forms part of the CAD system and is used for printed circuit design, is described. The PLATA subsystem is used to design, on average, up to 150 types of circuits a year, 100-120 of which are circuits of increased complexity. In this case the documentation productivity of a designer almost doubles

  4. Automated Computer-Based Facility for Measurement of Near-Field Structure of Microwave Radiators and Scatterers

    DEFF Research Database (Denmark)

    Mishra, Shantnu R.;; Pavlasek, Tomas J. F.;; Muresan, Letitia V.

    1980-01-01

    An automatic facility for measuring the three-dimensional structure of the near fields of microwave radiators and scatterers is described. The amplitude and phase for different polarization components can be recorded in analog and digital form using a microprocessor-based system. The stored data ... are transferred to a large high-speed computer for bulk processing and for the production of isophot and equiphase contour maps or profiles. The performance of the system is demonstrated through results for a single conical horn, for interacting rectangular horns, for multiple cylindrical scatterers...

  5. Requirements Report Computer Software System for a Semi-Automatic Pipe Handling System and Fabrication Facility

    National Research Council Canada - National Science Library

    1980-01-01

    .... This report is to present the requirements of the computer software that must be developed to create Pipe Detail Drawings and to support the processing of the Pipe Detail Drawings through the Pipe Shop...

  6. Metrication manual

    International Nuclear Information System (INIS)

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

    In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual

  7. Computing Facilities for AI: A Survey of Present and Near-Future Options

    OpenAIRE

    Fahlman, Scott

    1981-01-01

    At the recent AAAI conference at Stanford, it became apparent that many new AI research centers are being established around the country in industrial and governmental settings and in universities that have not paid much attention to AI in the past. At the same time, many of the established AI centers are in the process of converting from older facilities, primarily based on Decsystem-10 and Decsystem-20 machines, to a variety of newer options. At present, unfortunately, there is no simple an...

  8. Nuclear electronics laboratory manual

    International Nuclear Information System (INIS)

    1984-05-01

    The Nuclear Electronics Laboratory Manual is a joint product of several electronics experts who have been associated with IAEA activity in this field for many years. The manual does not include experiments of a basic nature, such as characteristics of different active electronics components. It starts by introducing small electronics blocks, employing one or more active components. The most demanding exercises instruct a student in the design and construction of complete circuits, as used in commercial nuclear instruments. It is expected that a student who completes all the experiments in the manual should be in a position to design nuclear electronics units and also to understand the functions of advanced commercial instruments which need to be repaired or maintained. The future tasks of nuclear electronics engineers will be increasingly oriented towards designing and building the interfaces between a nuclear experiment and a computer. The manual pays tribute to this development by introducing a number of experiments which illustrate the principles and the technology of interfacing

  9. Interactive simulation of nuclear power systems using a dedicated minicomputer - computer graphics facility

    International Nuclear Information System (INIS)

    Tye, C.; Sezgen, A.O.

    1980-01-01

    The design of control systems and operational procedures for large-scale nuclear power plants poses a difficult optimization problem requiring a lot of computational effort. Plant dynamic simulation using digital minicomputers offers the prospect of relatively low-cost computing and, when combined with graphical input/output, provides a powerful tool for studying such problems. The paper discusses the results obtained from a simulation study carried out at the Computer Graphics Unit of the University of Manchester using a typical station control model for an Advanced Gas Cooled reactor. Particular emphasis is placed on the use of computer graphics for information display, for parameter and control-system optimization, and on techniques for using graphical input to define and/or modify the control system topology. Experience gained from this study has shown that a relatively modest minicomputer system can be used for simulating large-scale dynamic systems and that highly interactive computer graphics can be used to advantage to relieve the designer of many of the tedious aspects of simulation, leaving him free to concentrate on the more creative aspects of his work. (author)

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  11. Davis PV plant operation and maintenance manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    This operation and maintenance manual contains the information necessary to run the Photovoltaics for Utility Scale Applications (PVUSA) test facility in Davis, California. References to more specific information available in drawings, data sheets, files, or vendor manuals are included. The PVUSA is a national cooperative research and demonstration program formed in 1987 to assess the potential of utility scale photovoltaic systems.

  12. Quality assurance manual: Volume 1

    International Nuclear Information System (INIS)

    Oijala, J.E.

    1988-06-01

    Stanford Linear Accelerator Center (SLAC) is a DOE-supported research facility that carries out experimental and theoretical research in high energy physics and developmental work in new techniques for particle acceleration and experimental instrumentation. The purpose of this manual is to describe SLAC quality assurance policies and practices in various parts of the Laboratory

  13. A stand alone computer system to aid the development of mirror fusion test facility RF heating systems

    International Nuclear Information System (INIS)

    Thomas, R.A.

    1983-01-01

    The Mirror Fusion Test Facility (MFTF-B) control system architecture requires the Supervisory Control and Diagnostic System (SCDS) to communicate with a LSI-11 Local Control Computer (LCC) that in turn communicates via a fiber optic link to CAMAC based control hardware located near the machine. In many cases, the control hardware is very complex and requires a sizable development effort prior to being integrated into the overall MFTF-B system. One such effort was the development of the Electron Cyclotron Resonance Heating (ECRH) system. It became clear that a stand alone computer system was needed to simulate the functions of SCDS. This paper describes the hardware and software necessary to implement the SCDS Simulation Computer (SSC). It consists of a Digital Equipment Corporation (DEC) LSI-11 computer and a Winchester/Floppy disk operating under the DEC RT-11 operating system. All application software for MFTF-B is programmed in PASCAL, which allowed us to adapt procedures originally written for SCDS to the SSC. This nearly identical software interface means that software written during the equipment development will be useful to the SCDS programmers in the integration phase

  14. Prostate positioning using cone-beam computer tomography based on manual soft-tissue registration. Interobserver agreement between radiation oncologists and therapists

    Energy Technology Data Exchange (ETDEWEB)

    Jereczek-Fossa, B.A.; Pobbiati, C.; Fanti, P. [European Institute of Oncology, Department of Radiation Oncology, Milan (Italy); University of Milan, Milan (Italy); Santoro, L. [European Institute of Oncology, Department of Epidemiology and Biostatistics, Milan (Italy); Fodor, C.; Zerini, D. [European Institute of Oncology, Department of Radiation Oncology, Milan (Italy); Vigorito, S. [European Institute of Oncology, Department of Medical Physics, Milan (Italy); Baroni, G. [Politecnico di Milano, Department of Electronics Information and Bioengineering, Milan (Italy); De Cobelli, O. [European Institute of Oncology, Department of Urology, Milan (Italy); University of Milan, Milan (Italy); Orecchia, R. [European Institute of Oncology, Department of Radiation Oncology, Milan (Italy); National Center for Oncological Hadrontherapy (CNAO) Foundation, Pavia (Italy); University of Milan, Milan (Italy)

    2014-01-15

    To check the interobserver agreement between radiation oncologists and therapists (RTT) using an on- and off-line cone-beam computer tomography (CBCT) protocol for setup verification in the radiotherapy of prostate cancer. The CBCT data from six prostate cancer patients treated with hypofractionated intensity-modulated radiotherapy (IMRT) were independently reviewed off-line by four observers (one radiation oncologist, one junior and two senior RTTs) and benchmarked with on-line CBCT positioning performed by a radiation oncologist immediately prior to treatment. CBCT positioning was based on manual soft-tissue registration. Agreement between observers was evaluated using weighted Cohen's kappa statistics. In total, 152 CBCT-based prostate positioning procedures were reviewed by each observer. The mean (± standard deviation) of the differences between off- and on-line CBCT-simCT registration translations along the three directions (antero-posterior, latero-lateral and cranio-caudal) and rotation around the antero-posterior axis were - 0.7 (3.6) mm, 1.9 (2.7) mm, 0.9 (3.6) mm and - 1.8 (5.0) degrees, respectively. Satisfactory interobserver agreement was found, being substantial (weighted kappa > 0.6) in 10 of 16 comparisons and moderate (0.41-0.60) in the remaining six comparisons. CBCT interpretation performed by RTTs is comparable to that of radiation oncologists. Our study might be helpful in the quality assurance of radiotherapy and the optimization of competencies. Further investigation should include larger sample sizes, a greater number of observers and validated methodology in order to assess interobserver variability and its impact on high-precision prostate cancer IGRT. In the future, it should enable the wider implementation of complex and evolving radiotherapy technologies. (orig.)
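
    The agreement statistic used in this study can be reproduced with standard tools. The sketch below is a minimal illustration of weighted Cohen's kappa applied to two observers' categorized shift differences; the category thresholds and example shifts are hypothetical, not values taken from the study.

        # Minimal sketch: weighted Cohen's kappa for two observers' shift categories.
        # Thresholds and example shifts are hypothetical illustrations only.
        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        def categorize(shifts_mm, bins=(-3.0, 3.0)):
            # 0: shift < -3 mm, 1: within +/-3 mm, 2: shift > +3 mm (assumed categories)
            return np.digitize(shifts_mm, bins)

        obs_a = np.array([0.5, -4.2, 2.9, 6.1, -0.7])   # observer A, AP shifts in mm
        obs_b = np.array([1.0, -3.5, 3.4, 5.0, -1.1])   # observer B, AP shifts in mm

        kappa = cohen_kappa_score(categorize(obs_a), categorize(obs_b), weights="linear")
        print(f"weighted kappa = {kappa:.2f}")  # > 0.6 would indicate substantial agreement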

  15. Three-dimensional coupled Monte Carlo-discrete ordinates computational scheme for shielding calculations of large and complex nuclear facilities

    International Nuclear Information System (INIS)

    Chen, Y.; Fischer, U.

    2005-01-01

    Shielding calculations of advanced nuclear facilities such as accelerator based neutron sources or fusion devices of the tokamak type are complicated due to their complex geometries and their large dimensions, including bulk shields of several meters thickness. While the complexity of the geometry in the shielding calculation can hardly be handled by the discrete ordinates method, the deep penetration of radiation through bulk shields is a severe challenge for the Monte Carlo particle transport technique. This work proposes a dedicated computational scheme for coupled Monte Carlo-Discrete Ordinates transport calculations to handle this kind of shielding problem. The Monte Carlo technique is used to simulate the particle generation and transport in the target region with both complex geometry and reaction physics, and the discrete ordinates method is used to treat the deep penetration problem in the bulk shield. The coupling scheme has been implemented in a program system by loosely integrating the Monte Carlo transport code MCNP, the three-dimensional discrete ordinates code TORT and a newly developed coupling interface program for the mapping process. Test calculations were performed with comparison to MCNP solutions. Satisfactory agreements were obtained between these two approaches. The program system has been chosen to treat the complicated shielding problem of the accelerator-based IFMIF neutron source. The successful application demonstrates that the coupling scheme, as implemented in the program system, is a useful computational tool for the shielding analysis of complex and large nuclear facilities. (authors)

  16. AIRDOS-II computer code for estimating radiation dose to man from airborne radionuclides in areas surrounding nuclear facilities

    International Nuclear Information System (INIS)

    Moore, R.E.

    1977-04-01

    The AIRDOS-II computer code estimates individual and population doses resulting from the simultaneous atmospheric release of as many as 36 radionuclides from a nuclear facility. This report describes the meteorological and environmental models used in the code, their computer implementation, and the applicability of the code to assessments of radiological impact. Atmospheric dispersion and surface deposition of released radionuclides are estimated as a function of direction and distance from a nuclear power plant or fuel-cycle facility, and doses to man through inhalation, air immersion, exposure to contaminated ground, food ingestion, and water immersion are estimated in the surrounding area. Annual doses are estimated for total body, GI tract, bone, thyroid, lungs, muscle, kidneys, liver, spleen, testes, and ovaries. Either the annual population doses (man-rems/year) or the highest annual individual doses in the assessment area (rems/year), whichever are applicable, are summarized in output tables in several ways--by nuclides, modes of exposure, and organs. The location of the highest individual doses for each reference organ estimated for the area is specified in the output data
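
    As a rough illustration of the kind of calculation such a code chains together (not the actual AIRDOS-II models or data), the sketch below combines a simple ground-level Gaussian plume dilution factor with an inhalation dose conversion. The dispersion coefficients, release rate, breathing rate and dose conversion factor are all hypothetical placeholder values.

        # Illustrative sketch only: ground-level Gaussian plume + inhalation dose.
        # Not the AIRDOS-II implementation; all parameter values are placeholders.
        import math

        def chi_over_q(sigma_y, sigma_z, u, y=0.0):
            """Ground-level atmospheric dilution factor (s/m^3) for a ground release."""
            return math.exp(-y**2 / (2.0 * sigma_y**2)) / (math.pi * sigma_y * sigma_z * u)

        Q = 1.0e6                        # release rate, Bq/s (hypothetical)
        u = 3.0                          # wind speed, m/s
        sigma_y, sigma_z = 80.0, 40.0    # dispersion coefficients at this distance, m (assumed)
        breathing_rate = 2.66e-4         # m^3/s (adult, about 23 m^3/day)
        dcf_inhalation = 1.0e-9          # Sv per Bq inhaled (hypothetical nuclide)

        conc = Q * chi_over_q(sigma_y, sigma_z, u)          # Bq/m^3 at the receptor
        dose_rate = conc * breathing_rate * dcf_inhalation  # Sv/s
        print(f"annual inhalation dose ~ {dose_rate * 3.15e7 * 1e3:.3f} mSv")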

  17. Burnup calculations for KIPT accelerator driven subcritical facility using Monte Carlo computer codes-MCB and MCNPX

    International Nuclear Information System (INIS)

    Gohar, Y.; Zhong, Z.; Talamo, A.

    2009-01-01

    Argonne National Laboratory (ANL) of USA and Kharkov Institute of Physics and Technology (KIPT) of Ukraine have been collaborating on the conceptual design development of an electron accelerator driven subcritical (ADS) facility, using the KIPT electron accelerator. The neutron source of the subcritical assembly is generated from the interaction of a 100 kW electron beam with a natural uranium target. The electron beam has a uniform spatial distribution and electron energy in the range of 100 to 200 MeV. The main functions of the subcritical assembly are the production of medical isotopes and the support of the Ukraine nuclear power industry. Neutron physics experiments and material structure analyses are planned using this facility. With the 100 kW electron beam power, the total thermal power of the facility is ∼375 kW including the fission power of ∼260 kW. The burnup of the fissile materials and the buildup of fission products reduce the reactivity continuously during the operation, which reduces the neutron flux level and consequently the facility performance. To preserve the neutron flux level during the operation, fuel assemblies should be added after long operating periods to compensate for the lost reactivity. This process requires accurate prediction of the fuel burnup, the decay behavior of the fission products, and the reactivity introduced by adding fresh fuel assemblies. The recent developments of the Monte Carlo computer codes, the high speed capability of the computer processors, and the parallel computation techniques have made it possible to perform three-dimensional detailed burnup simulations. A full detailed three-dimensional geometrical model is used for the burnup simulations with continuous energy nuclear data libraries for the transport calculations and 63-multigroup or one group cross sections libraries for the depletion calculations. The Monte Carlo computer codes MCNPX and MCB are utilized for this study. MCNPX transports the electrons and the
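
    The core bookkeeping in such burnup simulations is solving the depletion equations under the computed flux. As a greatly simplified, hypothetical illustration (a single fissile nuclide at constant one-group flux, ignoring the production and decay chains that MCB/MCNPX track in full), the sketch below estimates the fraction of fuel consumed over an operating period.

        # Greatly simplified single-nuclide depletion under constant flux.
        # Illustration only; real MCB/MCNPX burnup tracks full production/decay chains.
        import math

        sigma_a_barn = 680.0                 # one-group absorption cross-section, barns (assumed)
        sigma_a_cm2 = sigma_a_barn * 1e-24   # convert barns to cm^2
        phi = 1.0e13                         # neutron flux, n/cm^2/s (hypothetical)
        t_days = 180.0                       # operating period
        t_s = t_days * 86400.0

        burned_fraction = 1.0 - math.exp(-sigma_a_cm2 * phi * t_s)
        print(f"fraction of initial fissile atoms consumed: {burned_fraction:.3%}")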

  18. 78 FR 18353 - Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility...

    Science.gov (United States)

    2013-03-26

    ... SUPPLEMENTARY INFORMATION section for electronic access to the guidance document. Submit electronic comments on... document entitled ``Guidance for Industry: Blood Establishment Computer System Validation in the User's... document to http://www.regulations.gov or written comments to the Division of Dockets Management (see...

  19. Navier-Stokes Simulation of Air-Conditioning Facility of a Large Modern Computer Room

    Science.gov (United States)

    2005-01-01

    NASA recently assembled one of the world's fastest operational supercomputers to meet the agency's new high performance computing needs. This large-scale system, named Columbia, consists of 20 interconnected SGI Altix 512-processor systems, for a total of 10,240 Intel Itanium-2 processors. High-fidelity CFD simulations were performed for the NASA Advanced Supercomputing (NAS) computer room at Ames Research Center. The purpose of the simulations was to assess the adequacy of the existing air handling and conditioning system and make recommendations for changes in the design of the system if needed. The simulations were performed with NASA's OVERFLOW-2 CFD code, which utilizes overset structured grids. A new set of boundary conditions was developed and added to the flow solver for modeling the room's air-conditioning and proper cooling of the equipment. Boundary condition parameters for the flow solver are based on cooler CFM (flow rate) ratings and some reasonable assumptions of flow and heat transfer data for the floor and central processing units (CPU). The geometry modeling from blueprints and grid generation were handled by the NASA Ames software package Chimera Grid Tools (CGT). This geometric model was developed as a CGT-scripted template, which can be easily modified to accommodate any changes in shape and size of the room, locations and dimensions of the CPU racks, disk racks, coolers, power distribution units, and mass-storage system. The compute nodes are grouped in pairs of racks with an aisle in the middle. High-speed connection cables connect the racks with overhead cable trays. The cool air from the cooling units is pumped into the computer room from a sub-floor through perforated floor tiles. The CPU cooling fans draw cool air from the floor tiles, which run along the outside length of each rack, and eject warm air into the center aisle between the racks. This warm air is eventually drawn into the cooling units located near the walls of the room. One
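
    As a small, hypothetical illustration of how a cooler's CFM rating could be turned into a boundary-condition value of the kind described above (this is not the OVERFLOW-2 implementation), the sketch below converts a flow-rate rating into an average inflow velocity through the perforated floor tiles.

        # Hypothetical conversion of a cooler CFM rating to an average tile inflow velocity.
        # Not the OVERFLOW-2 boundary-condition code; all numbers are placeholders.
        CFM_TO_M3S = 0.000471947           # 1 cubic foot per minute in m^3/s

        cooler_rating_cfm = 12000.0        # assumed cooler flow rating
        tile_area_m2 = 0.6096 ** 2         # one 2 ft x 2 ft perforated tile
        open_area_fraction = 0.25          # assumed perforation open-area ratio
        n_tiles = 20                       # tiles served by this cooler (assumed)

        volume_flow = cooler_rating_cfm * CFM_TO_M3S                  # m^3/s
        effective_area = n_tiles * tile_area_m2 * open_area_fraction  # m^2
        inflow_velocity = volume_flow / effective_area                # m/s, used as an inlet value
        print(f"average inflow velocity ~ {inflow_velocity:.2f} m/s")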

  20. REFLA-1D/MODE 1: a computer program for reflood thermo-hydrodynamic analysis during PWR-LOCA user's manual

    International Nuclear Information System (INIS)

    Murao, Yoshio; Sugimoto, Jun; Okubo, Tsutomu

    1981-01-01

    This manual describes the REFLA-1D/MODE 1 reflood system analysis code. This code can solve the core thermo-hydrodynamics under forced flooding conditions and gravity feed conditions in a system similar to FLECHT-SET phase A. This manual describes the REFLA-1D/MODE 1 models and provides application information required to utilize REFLA-1D/MODE 1. (author)

  1. Sodium safety manual

    International Nuclear Information System (INIS)

    Hayes, D.J.; Gardiner, R.L.

    1980-09-01

    The sodium safety manual is based upon more than a decade of experience with liquid sodium at Berkeley Nuclear Laboratories (BNL). It draws particularly from the expertise and experience developed in the course of research work into sodium fires and sodium water reactions. It draws also on information obtained from the UKAEA and other sodium users. Many of the broad principles will apply to other Establishments but much of the detail is specific to BNL and as a consequence its application at other sites may well be limited. Accidents with sodium are at best unpleasant and at worst lethal in an extremely painful way. The object of this manual is to help prevent sodium accidents. It is not intended to give detailed advice on specific precautions for particular situations, but rather to set out the overall strategy which will ensure that sodium activities will be pursued safely. More detail is generally conveyed to staff by the use of local instructions known as Sodium Working Procedures (SWP's) which are not reproduced in this manual although a list of current SWP's is included. Much attention is properly given to the safe design and operation of larger facilities; nevertheless evidence suggests that sodium accidents most frequently occur in small-scale work particularly in operations associated with sodium cleaning and special care is needed in all such cases. (U.K.)

  2. Reliability Lessons Learned From GPU Experience With The Titan Supercomputer at Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Gallarno, George [Christian Brothers University; Rogers, James H [ORNL; Maxwell, Don E [ORNL

    2015-01-01

    The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large-scale. The world's second fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage since GPUs have only recently been deployed at large-scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  5. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 1: Chapters 1-11

    International Nuclear Information System (INIS)

    1995-01-01

    (FIXSYS plants); user control of the distribution of capital cost expenditures during the construction period (if required to be different from the general 'S' curve distribution used as default). The present document has been produced to support use of the WASP-III Plus computer code and to illustrate the capabilities of the program. This Manual is organized in two separate volumes. This first one includes 11 main chapters describing how to use the WASP-III Plus computer program. Chapter 1 gives a summary description and some background information about the program. Chapter 2 introduces some concepts, mainly related to the computer requirements imposed by the program, that are used throughout the Manual. Chapters 3 to 9 describe how to execute each of the various programs (or modules) of the WASP-III Plus package. The description for each module shows the user how to prepare the Job Control statements and input data needed to execute the module and how to interpret the printed output produced. The iterative process that should be followed in order to obtain the 'optimal solution' for a WASP case study is covered in Chapters 6 to 8. Chapter 10 explains the use of an auxiliary program of the WASP package which is mainly intended for saving computer time. Lastly, Chapter 11 recapitulates the use of WASP-III Plus for executing a generation expansion planning study; describes the several phases normally involved in this type of study; and provides the user with practical hints about the most important aspects that need to be verified at each phase while executing the various WASP modules

  6. PROFEAT Update: A Protein Features Web Server with Added Facility to Compute Network Descriptors for Studying Omics-Derived Networks.

    Science.gov (United States)

    Zhang, P; Tao, L; Zeng, X; Qin, C; Chen, S Y; Zhu, F; Yang, S Y; Li, Z R; Chen, W P; Chen, Y Z

    2017-02-03

    The studies of biological, disease, and pharmacological networks are facilitated by the systems-level investigations using computational tools. In particular, the network descriptors developed in other disciplines have found increasing applications in the study of the protein, gene regulatory, metabolic, disease, and drug-targeted networks. Facilities are provided by the public web servers for computing network descriptors, but many descriptors are not covered, including those used or useful for biological studies. We upgraded the PROFEAT web server http://bidd2.nus.edu.sg/cgi-bin/profeat2016/main.cgi for computing up to 329 network descriptors and protein-protein interaction descriptors. PROFEAT network descriptors comprehensively describe the topological and connectivity characteristics of unweighted (uniform binding constants and molecular levels), edge-weighted (varying binding constants), node-weighted (varying molecular levels), edge-node-weighted (varying binding constants and molecular levels), and directed (oriented processes) networks. The usefulness of the network descriptors is illustrated by the literature-reported studies of the biological networks derived from the genome, interactome, transcriptome, metabolome, and diseasome profiles. Copyright © 2016 Elsevier Ltd. All rights reserved.
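
    For readers unfamiliar with what such network descriptors look like in practice, the sketch below computes a few basic unweighted descriptors (density, average clustering, average shortest path length) for a toy interaction network using networkx. It only illustrates the flavour of these quantities and does not reproduce PROFEAT's own descriptor set or definitions; the example edges are hypothetical.

        # Toy illustration of simple network descriptors with networkx.
        # Not PROFEAT's implementation; the example edges are hypothetical.
        import networkx as nx

        edges = [("P1", "P2"), ("P2", "P3"), ("P3", "P1"),
                 ("P3", "P4"), ("P4", "P5")]        # hypothetical protein-protein interactions
        G = nx.Graph(edges)

        descriptors = {
            "n_nodes": G.number_of_nodes(),
            "n_edges": G.number_of_edges(),
            "density": nx.density(G),
            "avg_clustering": nx.average_clustering(G),
            "avg_shortest_path": nx.average_shortest_path_length(G),
        }
        for name, value in descriptors.items():
            print(f"{name}: {value:.3f}" if isinstance(value, float) else f"{name}: {value}")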

  7. Computer control and data acquisition system for the Mirror Fusion Test Facility Ion Cyclotron Resonant Heating System (ICRH)

    International Nuclear Information System (INIS)

    Cheshire, D.L.; Thomas, R.A.

    1985-01-01

    The Lawrence Livermore National Laboratory (LLNL) large Mirror Fusion Test Facility (MFTF-B) will employ an Ion Cyclotron Resonant Heating (ICRH) system for plasma startup. As the MFTF-B Industrial Participant, TRW has responsibility for the ICRH system, including development of the data acquisition and control system. During MFTF-B operation, the ICRH system will be controlled by the MFTF-B Supervisory Control and Diagnostic System (SCDS). For subsystem development and checkout at TRW, and for verification and acceptance testing at LLNL, the system will be run from a stand-alone computer system designed to simulate the functions of SCDS. The "SCDS Simulator" was developed originally for the MFTF-B ECRH System; descriptions of the hardware and software are updated in this paper. The computer control and data acquisition functions implemented for ICRH are described, including development status and the test schedule at TRW and at LLNL. The application software is written for the SCDS Simulator, but it is programmed in PASCAL and designed to facilitate conversion for use on the SCDS computers

  8. Methane measurements manual; Handbok metanmaetningar

    Energy Technology Data Exchange (ETDEWEB)

    Holmgren, Magnus Andreas (SP Technical research institute of Sweden, Boraas (Sweden))

    2011-02-15

    Emissions to air in different parts of the system may arise in biogas plants, where there is biological treatment of organic matter by anaerobic degradation, and during upgrading of biogas to vehicle fuel. There are mainly four reasons why these emissions must be minimized: safety, greenhouse gas emissions, economy and smell. This manual gathers several years of experience with the measurement of methane emissions from biogas and upgrading facilities. This work has been done mainly in the context of Swedish Waste Management's system of voluntary commitment. The purpose of this manual is to standardize methods and procedures when methane measurements are carried out so that the results are comparable between different providers. The main target group of the manual is measurement consultants performing such measurements. A calculation template in Excel is part of the manual, which further contributes to the measurements being evaluated in a standardized way. The manual contains several examples which have been calculated in the accompanying Excel template. The handbook also contains a chapter mainly intended for facility staff, which describes how to carry out accurate leak detection and outlines a system of so-called intermediate inspections to detect leaks in time
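
    To give a flavour of the kind of calculation the accompanying Excel template standardizes (a hypothetical sketch, not the template itself), the code below converts a measured leak concentration and flow into a mass flow of methane and expresses the emission as a percentage of the plant's methane production. All input values are assumed for illustration.

        # Hypothetical sketch of a methane-loss calculation; not the manual's Excel template.
        RHO_CH4 = 0.717                 # kg per normal m^3 of methane (0 deg C, 1 atm)

        # One measured leak point (assumed values)
        leak_flow_nm3_per_h = 15.0      # extracted air flow through the measurement hood
        leak_conc_ppm = 8000.0          # methane concentration in that flow

        leak_kg_per_h = leak_flow_nm3_per_h * (leak_conc_ppm * 1e-6) * RHO_CH4

        # Plant production (assumed values)
        production_nm3_ch4_per_h = 250.0
        production_kg_per_h = production_nm3_ch4_per_h * RHO_CH4

        loss_percent = 100.0 * leak_kg_per_h / production_kg_per_h
        print(f"methane loss from this source: {loss_percent:.3f} % of production")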

  9. ARIES NDA Robot operators' manual

    International Nuclear Information System (INIS)

    Scheer, N.L.; Nelson, D.C.

    1998-05-01

    The ARIES NDA Robot is an automation device for servicing the material movements for a suite of Non-destructive assay (NDA) instruments. This suite of instruments includes a calorimeter, a gamma isotopic system, a segmented gamma scanner (SGS), and a neutron coincidence counter (NCC). Objects moved by the robot include sample cans, standard cans, and instrument plugs. The robot computer has an RS-232 connection with the NDA Host computer, which coordinates robot movements and instrument measurements. The instruments are expected to perform measurements under the direction of the Host without operator intervention. This user's manual describes system startup, using the main menu, manual operation, and error recovery

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  11. Pdap Manual

    DEFF Research Database (Denmark)

    Pedersen, Mads Mølgaard; Larsen, Torben J.

    Pdap, Python Data Analysis Program, is a program for post processing, analysis, visualization and presentation of data, e.g. simulation results and measurements. It is intended for, but not limited to, the domain of wind turbines. It combines an intuitive graphical user interface with Python scripting that allows automation and implementation of custom functions. This manual gives a short introduction to the graphical user interface, describes the mathematical background for some of the functions, describes the scripting API and finally presents a few examples of how to automate analysis via scripting. The newest version, as well as more documentation and help on how to use, extend and automate Pdap, can be found at the webpage www.hawc2.dk

  12. System Requirements Analysis for a Computer-based Procedure in a Research Reactor Facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jaek Wan; Jang, Gwi Sook; Seo, Sang Moon; Shin, Sung Ki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    A computer-based procedure (CBP) can address many of the routine problems related to human error in the use of conventional, hard-copy operating procedures. An operation support system is also required in a research reactor. A well-made CBP can address the staffing issues of a research reactor and reduce human errors by minimizing the operator's routine tasks. A CBP for a research reactor has not been proposed yet. CBPs developed for nuclear power plants have powerful and varied technical functions to cover complicated plant operation situations, but many of these functions may not be required for a research reactor. Thus, it is not reasonable to apply such a CBP to a research reactor directly, and customizing it is not cost-effective. Therefore, a compact CBP should be developed for a research reactor. This paper introduces the high-level requirements derived from the system requirements analysis activity as the first stage of system implementation. Operation support tools are under consideration for application to research reactors. In particular, as part of the full digitalization of the main control room, a computer-based procedure system is required as part of the man-machine interface system because it affects the operating staff levels and human errors of a research reactor. To establish computer-based system requirements for a research reactor, this paper reviews international standards and previous practices at nuclear plants.

  13. The impact of CFD on development test facilities - A National Research Council projection. [computational fluid dynamics

    Science.gov (United States)

    Korkegi, R. H.

    1983-01-01

    The results of a National Research Council study on the effect that advances in computational fluid dynamics (CFD) will have on conventional aeronautical ground testing are reported. Current CFD capabilities include the depiction of linearized inviscid flows and a boundary layer, initial use of Euler coordinates using supercomputers to automatically generate a grid, research and development on Reynolds-averaged Navier-Stokes (N-S) equations, and preliminary research on solutions to the full N-S equations. Improvements in the range of CFD usage are dependent on the development of more powerful supercomputers, exceeding even the projected abilities of the NASA Numerical Aerodynamic Simulator (1 BFLOP/sec). Full representation of the Re-averaged N-S equations will require over one million grid points, a computing level predicted to be available in 15 yr. Present capabilities allow identification of data anomalies, confirmation of data accuracy, and assessment of the adequacy of model design in wind tunnel trials. Wall effects and the Reynolds number in any flight regime can be accounted for during simulation. CFD can actually be more accurate than instrumented tests, since all points in a flow can be modeled with CFD, while they cannot all be monitored with instrumentation in a wind tunnel.

  14. On dosimetry of radiodiagnosis facilities, mainly focused on computed tomography units

    International Nuclear Information System (INIS)

    Ghitulescu, Zoe

    2008-01-01

    The presentation addresses the dosimetry of computed tomography units and is structured in three parts: 1) Basics of image acquisition using the computed tomography technique; 2) Effective dose calculation for a patient and its assessment using the BERT concept; 3) Recommended actions for achieving a good compromise between the delivered dose and image quality. The aim of the first part is to acquaint the reader with the CT technique so that the worked example of the effective dose calculation, and its conversion into time units using the BERT concept, can be followed. The conclusion drawn is that the effective dose, calculated by the medical physicist (using dedicated software for the CT scanner and the exam type) and converted into time units through the BERT concept, could then be communicated by the radiologist together with the diagnostic notes. A minimum of information for patients regarding the nature and type of radiation is therefore clearly necessary, for instance by means of leaflets. The third part discusses the factors that lead to good image quality while taking into account the ALARA principle of radiation protection, which states that the dose should be 'as low as reasonably achievable'. (author)
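
    A minimal worked example of the BERT conversion mentioned above is given below: the effective dose of an examination is expressed as the time over which natural background radiation would deliver the same dose. The background level and the CT dose used here are typical assumed values, not figures from the talk.

        # BERT (Background Equivalent Radiation Time) conversion sketch.
        # The background level and CT dose below are typical assumed values, not from the talk.
        ANNUAL_BACKGROUND_MSV = 2.4      # world-average natural background, mSv/year (assumed)

        def bert_months(effective_dose_msv, annual_background_msv=ANNUAL_BACKGROUND_MSV):
            """Express an effective dose as months of natural background exposure."""
            return 12.0 * effective_dose_msv / annual_background_msv

        ct_abdomen_dose_msv = 8.0        # hypothetical effective dose for an abdominal CT
        print(f"BERT ~ {bert_months(ct_abdomen_dose_msv):.1f} months of natural background")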

  15. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 1: Chapters 1-11

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    (FIXSYS plants); user control of the distribution of capital cost expenditures during the construction period (if required to be different from the general 'S' curve distribution used as default). The present document has been produced to support use of the WASP-III Plus computer code and to illustrate the capabilities of the program. This Manual is organized in two separate volumes. This first one includes 11 main chapters describing how to use the WASP-III Plus computer program. Chapter 1 gives a summary description and some background information about the program. Chapter 2 introduces some concepts, mainly related to the computer requirements imposed by the program, that are used throughout the Manual. Chapters 3 to 9 describe how to execute each of the various programs (or modules) of the WASP-III Plus package. (abstract truncated)

  16. Accuracy Evaluation of The Depth of Six Kinds of Sperm Counting Chambers for both Manual and Computer-Aided Semen Analyses

    Directory of Open Access Journals (Sweden)

    Jin-Chun Lu

    2016-12-01

    Background: Although the depth of the counting chamber is an important factor influencing sperm counting, no research has yet been reported on the measurement and comparison of the depth of the chamber. We measured the exact depths of six kinds of sperm counting chambers and evaluated their accuracy. Materials and Methods: In this prospective study, the depths of six kinds of sperm counting chambers for both manual and computer-aided semen analyses, including Makler (n=24), Macro (n=32), Geoffrey (n=34), GoldCyto (n=20), Leja (n=20) and Cell-VU (n=20), were measured with the Filmetrics F20 Spectral Reflectance Thin-Film Measurement System. The mean depth, the range and the coefficient of variation (CV) of each chamber, and the mean depth, relative deviation and acceptability of each kind of chamber were then calculated, with acceptability judged by the closeness to the nominal value. Among the 24 Makler chambers, 5 were new and 19 were used; the other five kinds were all new chambers. Results: The depths (mean ± SD, μm) of the Makler (new), Macro and Geoffrey chambers were 11.07 ± 0.41, 10.19 ± 0.48 and 10.00 ± 0.28, respectively, while those of the GoldCyto, Leja and Cell-VU chambers were 23.76 ± 2.15, 20.49 ± 0.22 and 24.22 ± 2.58, respectively. The acceptability of the Geoffrey chambers was the highest (94.12%), followed by Macro (65.63%), Leja (35%) and Makler (20%), while that of the other two kinds and the used Makler chambers was zero. Conclusion: There was some difference between the actual depth and the corresponding nominal value for sperm counting chambers, and the overall acceptability was very low. Moreover, abrasion caused by long use, as with the Makler chamber, may render a chamber unacceptable. In order to ensure the accuracy and repeatability of sperm concentration results, the depth of the sperm counting chamber must be checked regularly.
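
    The summary statistics reported above follow from simple definitions; the sketch below recomputes the coefficient of variation, the relative deviation from the nominal depth, and an acceptability flag for a hypothetical batch of chambers. The example depths and the assumed ±10% tolerance are illustrations only, not values or criteria from the study.

        # Sketch of the chamber-depth statistics; example depths and the +/-10% tolerance
        # are assumptions for illustration, not values or criteria from the study.
        import numpy as np

        nominal_um = 10.0
        depths_um = np.array([10.1, 9.8, 10.4, 11.2, 9.6, 10.0])   # hypothetical measured depths

        mean = depths_um.mean()
        cv_percent = 100.0 * depths_um.std(ddof=1) / mean
        rel_dev_percent = 100.0 * (mean - nominal_um) / nominal_um
        acceptable = np.abs(depths_um - nominal_um) <= 0.10 * nominal_um
        acceptability_percent = 100.0 * acceptable.mean()

        print(f"mean depth {mean:.2f} um, CV {cv_percent:.1f} %, "
              f"relative deviation {rel_dev_percent:+.1f} %, "
              f"acceptability {acceptability_percent:.0f} %")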

  17. A manual for implementing residual radioactive material guidelines

    International Nuclear Information System (INIS)

    Gilbert, T.L.; Yu, C.; Yuan, Y.C.; Zielen, A.J.; Jusko, M.J.; Wallo, A. III; Argonne National Lab., IL; Dames and Moore, West Valley, NY; Argonne National Lab., IL; USDOE Assistant Secretary for Nuclear Energy, Washington, DC

    1989-06-01

    This manual presents information for implementing US Department of Energy (DOE) guidelines for residual radioactive material at sites identified by the Formerly Utilized Sites Remedial Action Program (FUSRAP) and the Surplus Facilities Management Program (SFMP). It describes the analysis and models used to derive site-specific guidelines for allowable residual concentrations of radionuclides in soil and the design and use of the RESRAD computer code for calculating guideline values. It also describes procedures for implementing DOE policy for reducing residual radioactivity to levels that are as low as reasonably achievable. 36 refs., 16 figs, 22 tabs

  18. Wien Automatic System Package (WASP). A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 2: Appendices

    International Nuclear Information System (INIS)

    1995-01-01

    With several Member States, the IAEA has completed a new version of the WASP program, which has been called WASP-III Plus since it follows quite closely the methodology of the WASP-III model. The major enhancements in WASP-III Plus with respect to the WASP-III version are: increase in the number of thermal fuel types (from 5 to 10); verification of which configurations generated by CONGEN have already been simulated in previous iterations with MERSIM; direct calculation of combined Loading Order of FIXSYS and VARSYS plants; simulation of system operation includes consideration of physical constraints imposed on some fuel types (i.e., fuel availability for electricity generation); extended output of the resimulation of the optimal solution; generation of a file that can be used for graphical representation of the results of the resimulation of the optimal solution and cash flows of the investment costs; calculation of cash flows allows the user to include the capital costs of plants firmly committed or in construction (FIXSYS plants); user control of the distribution of capital cost expenditures during the construction period (if required to be different from the general 'S' curve distribution used as default). This second volume of the document to support use of the WASP-III Plus computer code consists of 5 appendices giving some additional information about the WASP-III Plus program. Appendix A is mainly addressed to the WASP-III Plus system analyst and supplies some information which could help in the implementation of the program on the user computer facilities. This appendix also includes some aspects about WASP-III Plus that could not be treated in detail in Chapters 1 to 11. Appendix B identifies all error and warning messages that may appear in the WASP printouts and advises the user how to overcome the problem. Appendix C presents the flow charts of the programs along with a brief description of the objectives and structure of each module. Appendix D describes the

  19. Computational design of high efficiency release targets for use at ISOL facilities

    CERN Document Server

    Liu, Y

    1999-01-01

    This report describes efforts made at the Oak Ridge National Laboratory to design high-efficiency-release targets that simultaneously incorporate the short diffusion lengths, high permeabilities, controllable temperatures, and heat-removal properties required for the generation of useful radioactive ion beam (RIB) intensities for nuclear physics and astrophysics research using the isotope separation on-line (ISOL) technique. Short diffusion lengths are achieved either by using thin fibrous target materials or by coating thin layers of selected target material onto low-density carbon fibers such as reticulated-vitreous-carbon fiber (RVCF) or carbon-bonded-carbon fiber (CBCF) to form highly permeable composite target matrices. Computational studies that simulate the generation and removal of primary beam deposited heat from target materials have been conducted to optimize the design of target/heat-sink systems for generating RIBs. The results derived from diffusion release-rate simulation studies for selected t...

  20. Electronic Commerce user manual

    Energy Technology Data Exchange (ETDEWEB)

    1992-04-10

    This User Manual supports the Electronic Commerce Standard System. The Electronic Commerce Standard System is being developed for the Department of Defense by the Technology Information Systems Program at the Lawrence Livermore National Laboratory, operated by the University of California for the Department of Energy. The Electronic Commerce Standard System, or EC as it is known, provides the capability for organizations to conduct business electronically instead of through paper transactions. Electronic Commerce and Computer Aided Acquisition and Logistics Support are two major projects under the DoD's Corporate Information Management program, whose objective is to make DoD business transactions faster and less costly by using computer networks instead of paper forms and postage. EC runs on computers that use the UNIX operating system and provides a standard set of applications and tools that are bound together by a common command and menu system. These applications and tools may vary according to the requirements of the customer or location and may be customized to meet the specific needs of an organization. Local applications can be integrated into the menu system under the Special Databases & Applications option on the EC main menu. These local applications will be documented in the appendices of this manual. This integration capability provides users with a common environment of standard and customized applications.

  1. Electronic Commerce user manual

    Energy Technology Data Exchange (ETDEWEB)

    1992-04-10

    This User Manual supports the Electronic Commerce Standard System. The Electronic Commerce Standard System is being developed for the Department of Defense by the Technology Information Systems Program at the Lawrence Livermore National Laboratory, operated by the University of California for the Department of Energy. The Electronic Commerce Standard System, or EC as it is known, provides the capability for organizations to conduct business electronically instead of through paper transactions. Electronic Commerce and Computer Aided Acquisition and Logistics Support are two major projects under the DoD's Corporate Information Management program, whose objective is to make DoD business transactions faster and less costly by using computer networks instead of paper forms and postage. EC runs on computers that use the UNIX operating system and provides a standard set of applications and tools that are bound together by a common command and menu system. These applications and tools may vary according to the requirements of the customer or location and may be customized to meet the specific needs of an organization. Local applications can be integrated into the menu system under the Special Databases & Applications option on the EC main menu. These local applications will be documented in the appendices of this manual. This integration capability provides users with a common environment of standard and customized applications.

  2. ASSERT-4 user's manual

    International Nuclear Information System (INIS)

    Judd, R.A.; Tahir, A.; Carver, M.B.; Stewart, D.G.; Thibeault, P.R.; Rowe, D.S.

    1984-09-01

    ASSERT-4 is an advanced subchannel code being developed primarily to model single- and two-phase flow and heat transfer in horizontal rod bundles. This manual is intended to facilitate the application of this code to the analysis of flow in reactor fuel channels. It contains a brief description of the thermalhydraulic model and ASSERT-4 solution scheme, and other information required by users. This other information includes a detailed discussion of input data requirements, a sample problem and solution, and information describing how to access and run ASSERT-4 on the Chalk River computers

  3. RADTRAN 6 Technical Manual

    Energy Technology Data Exchange (ETDEWEB)

    Weiner, Ruth F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neuhauser, Karen Sieglinde [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Heames, Terence John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); O' Donnell, Brandon M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dennis, Matthew L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-01-01

    This Technical Manual contains descriptions of the calculation models and mathematical and numerical methods used in the RADTRAN 6 computer code for transportation risk and consequence assessment. The RADTRAN 6 code combines user-supplied input data with values from an internal library of physical and radiological data to calculate the expected radiological consequences and risks associated with the transportation of radioactive material. Radiological consequences and risks are estimated with numerical models of exposure pathways, receptor populations, package behavior in accidents, and accident severity and probability.
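
    The risk arithmetic underlying such an assessment reduces to expectation values over accident-severity categories. The sketch below is a conceptual illustration only, with hypothetical probabilities and consequences; it does not reproduce RADTRAN 6's exposure-pathway, package-behavior, or incident-free dose models.

        # Conceptual expected-risk sketch; probabilities and consequences are hypothetical
        # and this is not RADTRAN 6's actual model set.
        severity_categories = [
            # (probability of an accident in this category per shipment, population dose in person-Sv)
            (1.0e-4, 0.001),
            (1.0e-6, 0.5),
            (1.0e-8, 20.0),
        ]

        shipments_per_year = 200   # assumed
        risk_per_shipment = sum(p * c for p, c in severity_categories)
        annual_risk = shipments_per_year * risk_per_shipment
        print(f"expected radiological risk ~ {annual_risk:.2e} person-Sv per year")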

  4. RADTRAN 6 technical manual.

    Energy Technology Data Exchange (ETDEWEB)

    Weiner, Ruth F.; Neuhauser, Karen Sieglinde; Heames, Terence John; O' Donnell, Brandon M.; Dennis, Matthew L.

    2014-01-01

    This Technical Manual contains descriptions of the calculation models and mathematical and numerical methods used in the RADTRAN 6 computer code for transportation risk and consequence assessment. The RADTRAN 6 code combines user-supplied input data with values from an internal library of physical and radiological data to calculate the expected radiological consequences and risks associated with the transportation of radioactive material. Radiological consequences and risks are estimated with numerical models of exposure pathways, receptor populations, package behavior in accidents, and accident severity and probability.

  5. NASCAP programmer's reference manual

    Science.gov (United States)

    Mandell, M. J.; Stannard, P. R.; Katz, I.

    1993-05-01

    The NASA Charging Analyzer Program (NASCAP) is a computer program designed to model the electrostatic charging of complicated three-dimensional objects, both in a test tank and at geosynchronous altitudes. This document is a programmer's reference manual and user's guide. It is designed as a reference to experienced users of the code, as well as an introduction to its use for beginners. All of the many capabilities of NASCAP are covered in detail, together with examples of their use. These include the definition of objects, plasma environments, potential calculations, particle emission and detection simulations, and charging analysis.

  6. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  7. TA-55 change control manual

    International Nuclear Information System (INIS)

    Blum, T.W.; Selvage, R.D.; Courtney, K.H.

    1997-11-01

    This manual is the guide for initiating change at the Plutonium Facility, which handles the processing of plutonium as well as research on plutonium metallurgy. It describes the change and work control processes employed at TA-55 to ensure that all proposed changes are properly identified, reviewed, approved, implemented, tested, and documented so that operations are maintained within the approved safety envelope. All Laboratory groups, their contractors, and subcontractors doing work at TA-55 follow requirements set forth herein. This manual applies to all new and modified processes and experiments inside the TA-55 Plutonium Facility; general plant project (GPP) and line item funded construction projects at TA-55; temporary and permanent changes that directly or indirectly affect structures, systems, or components (SSCs) as described in the safety analysis, including Facility Control System (FCS) software; and major modifications to procedures. This manual does not apply to maintenance performed on process equipment or facility SSCs or the replacement of SSCs or equipment with documented approved equivalents

  8. Computational Analysis Supporting the Design of a New Beamline for the Mines Neutron Radiography Facility

    Science.gov (United States)

    Wilson, C.; King, J.

    The Colorado School of Mines installed a neutron radiography system at the United States Geological Survey TRIGA reactor in 2012. An upgraded beamline could dramatically improve the imaging capabilities of this system. This project performed computational analyses to support the design of a new beamline, with the major goals of minimizing beam divergence and maximizing beam intensity. The new beamline will consist of a square aluminum tube with an 11.43 cm (4.5 in) inner side length and 0.635 cm (0.25 in) thick walls. It is the same length as the original beam tube (8.53 m) and is composed of 1.22 m (4 ft) and 1.52 m (5 ft) flanged sections which bolt together. The bottom 1.22 m of the beamline is a cylindrical aluminum pre-collimator which is 0.635 cm (0.25 in) thick, with an inner diameter of 5.08 cm (2 in). Based on Monte Carlo model results, when a pre-collimator is present, the use of a neutron absorbing liner on the inside surface of the beam tube has almost no effect on the angular distribution of the neutron current at the collimator exit. The use of a pre-collimator may result in a non-uniform flux profile at the image plane; however, as long as the collimator is at least three times longer than the pre-collimator, the flux distortion is acceptably low.
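
    A standard figure of merit for the divergence goal mentioned above is the collimation ratio L/D. The sketch below computes it from the stated aperture diameter and an assumed aperture-to-image distance, and estimates the resulting geometric unsharpness; apart from the 5.08 cm aperture quoted in the abstract, the distances are assumptions for illustration.

        # Collimation ratio (L/D) and geometric unsharpness sketch.
        # Aperture diameter is from the abstract; the aperture-to-image distance and
        # object-to-detector gap are assumed for illustration.
        aperture_diameter_cm = 5.08          # pre-collimator inner diameter (2 in)
        aperture_to_image_m = 7.3            # assumed: beamline length minus pre-collimator
        object_to_detector_mm = 25.0         # assumed sample-to-imaging-plane gap

        l_over_d = (aperture_to_image_m * 100.0) / aperture_diameter_cm
        unsharpness_mm = object_to_detector_mm / l_over_d
        print(f"L/D ~ {l_over_d:.0f}, geometric unsharpness ~ {unsharpness_mm:.2f} mm")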

  9. Animal facilities

    International Nuclear Information System (INIS)

    Fritz, T.E.; Angerman, J.M.; Keenan, W.G.; Linsley, J.G.; Poole, C.M.; Sallese, A.; Simkins, R.C.; Tolle, D.

    1981-01-01

    The animal facilities in the Division are described. They consist of kennels, animal rooms, service areas, and technical areas (examining rooms, operating rooms, pathology labs, x-ray rooms, and 60Co exposure facilities). The computer support facility is also described. The advent of the Conversational Monitor System at Argonne has launched a new effort to set up conversational computing and graphics software for users. The existing LS-11 data acquisition systems have been further enhanced and expanded. The divisional radiation facilities include a number of gamma, neutron, and x-ray radiation sources with accompanying areas for related equipment. There are five 60Co irradiation facilities; a research reactor, Janus, is a source for fission-spectrum neutrons; two other neutron sources in the Chicago area are also available to the staff for cell biology studies. The electron microscope facilities are also described

  10. Caltrans : construction manual

    Science.gov (United States)

    2009-08-01

    Caltrans intends this manual as a resource for all personnel engaged in contract administration. The manual establishes policies and procedures for the construction phase of Caltrans projects. However, this manual is not a contract document. It impos...

  11. Recommended practice for the design of a computer driven Alarm Display Facility for central control rooms of nuclear power generating stations

    International Nuclear Information System (INIS)

    Ben-Yaacov, G.

    1984-01-01

    This paper's objective is to explain the process by which design can prevent human errors in nuclear plant operation. Human factor engineering principles, data, and methods used in the design of computer driven alarm display facilities are discussed. A "generic", advanced Alarm Display Facility is described. It considers operator capabilities and limitations in decision-making processes, response dynamics, and human memory limitations. Highlighted are considerations of human factor criteria in the design and layout of alarm displays. Alarm data sources are described, and their use within the Alarm Display Facility is illustrated

  12. Evaluation of left atrial function by multidetector computed tomography before left atrial radiofrequency-catheter ablation: Comparison of a manual and automated 3D volume segmentation method

    International Nuclear Information System (INIS)

    Wolf, Florian; Ourednicek, Petr; Loewe, Christian; Richter, Bernhard; Goessinger, Heinz David; Gwechenberger, Marianne; Plank, Christina; Schernthaner, Ruediger Egbert; Toepker, Michael; Lammer, Johannes; Feuchtner, Gudrun M.

    2010-01-01

    Introduction: The purpose of this study was to compare a manual and automated 3D volume segmentation tool for evaluation of left atrial (LA) function by 64-slice multidetector-CT (MDCT). Methods and materials: In 33 patients with paroxysmal atrial fibrillation an MDCT scan was performed before radiofrequency-catheter ablation. Atrial function (minimal volume (LAmin), maximal volume (LAmax), stroke volume (SV), ejection fraction (EF)) was evaluated by two readers using a manual and an automatic tool and measurement time was evaluated. Results: Automated LA volume segmentation failed in one patient due to low LA enhancement (103HU). Mean LAmax, LAmin, SV and EF were 127.7 ml, 93 ml, 34.7 ml, 27.1% by the automated, and 122.7 ml, 89.9 ml, 32.8 ml, 26.3% by the manual method with no significant difference (p > 0.05) and high Pearson's correlation coefficients (r = 0.94, r = 0.94, r = 0.82 and r = 0.85, p < 0.0001), respectively. The automated method was significantly faster (p < 0.001). Interobserver variability was low for both methods with Pearson's correlation coefficients between 0.98 and 0.99 (p < 0.0001). Conclusions: Evaluation of LA volume and function with 64-slice MDCT is feasible with a very low interobserver variability. The automatic method is as accurate as the manual method but significantly less time consuming, permitting routine use in clinical practice before RF-catheter ablation.
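
    The functional indices compared above follow directly from the segmented volumes: stroke volume is the difference between maximal and minimal atrial volume, and ejection fraction is that difference relative to the maximal volume. The sketch below reproduces the arithmetic using the mean values quoted for the automated method; the formulas are the standard definitions, not code from the study.

        # Left atrial function from segmented volumes (values are the automated-method
        # means quoted in the abstract; the formulas are the standard definitions).
        la_max_ml = 127.7
        la_min_ml = 93.0

        stroke_volume_ml = la_max_ml - la_min_ml
        ejection_fraction_pct = 100.0 * stroke_volume_ml / la_max_ml
        print(f"SV = {stroke_volume_ml:.1f} ml, EF = {ejection_fraction_pct:.1f} %")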

  13. Computer-based automated left atrium segmentation and volumetry from ECG-gated coronary CT angiography data. Comparison with manual slice segmentation and ultrasound planimetric methods

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, R.W.; Kraus, B.; Kerl, J.M.; Lehnert, T.; Vogl, T.J. [Universitaetsklinikum Frankfurt (Germany). Inst. fuer Diagnostische und Interventionelle Radiologie; Bernhardt, D.; Vega-Higuera, F. [Siemens AG, Healthcare Sector, Forchheim (Germany). Computed Tomography; Ackermann, H. [Universitaetsklinikum Frankfurt (Germany). Inst. fuer Biostatistik und Mathematische Modellierung

    2010-12-15

    Purpose: Enlargement of the left atrium is a risk factor for cardiovascular or cerebrovascular events. We evaluated the performance of prototype software for fully automated segmentation and volumetry of the left atrium. Materials and Methods: In 34 retrospectively ECG-gated coronary CT angiography scans, the end-systolic (LAVsys) and end-diastolic (LAVdia) volume of the left atrium was calculated fully automatically by prototype software. Manual slice segmentation by two independent experienced radiologists served as the reference standard. Furthermore, two independent observers calculated the LAV utilizing two ultrasound planimetric methods ('area length' and 'prolate ellipse') on CTA images. Measurement periods were compared for all methods. Results: The left atrial volumes calculated with the prototype software were in excellent agreement with the results from manual slice segmentation (r = 0.97 - 0.99; p < 0.001; Bland-Altman) with excellent interobserver agreement between both radiologists (r = 0.99; p < 0.001). Ultrasound planimetric methods clearly showed a higher variation (r = 0.72 - 0.86) with moderate interobserver agreement (r = 0.51 - 0.79). The measurement period was significantly lower with the software (267 ± 28 sec; p < 0.001) than with ultrasound methods (431 ± 68 sec) or manual slice segmentation (567 ± 91 sec). Conclusion: The prototype software showed excellent agreement with manual slice segmentation with the least time consumption. This will facilitate the routine assessment of the LA volume from coronary CTA data and therefore risk stratification. (orig.)
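
    As a complement to the correlation figures above, agreement between automated and manual volumes can also be expressed as a Bland-Altman bias and limits of agreement. The sketch below is illustrative only; the arrays are placeholder values, not the study data.

      # Bland-Altman agreement sketch (placeholder volumes, not study data).
      import numpy as np

      auto   = np.array([98.0, 112.5, 87.3, 120.1, 105.6])   # software volumes (ml)
      manual = np.array([100.2, 110.9, 89.0, 118.7, 107.1])  # slice-segmentation volumes (ml)

      diff = auto - manual
      bias = diff.mean()                    # mean difference between methods
      half_width = 1.96 * diff.std(ddof=1)  # 95% limits of agreement half-width
      print(f"bias = {bias:.2f} ml, limits of agreement = "
            f"{bias - half_width:.2f} .. {bias + half_width:.2f} ml")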

  14. Building Condition and Suitability Evaluation Manual.

    Science.gov (United States)

    MGT of America, Inc., Tallahassee, FL.

    This educational facility evaluation manual contains the overall building condition rating form and the supporting check sheets which have been field tested in several states and, where appropriate, modified for use in the Idaho School Facilities Needs Assessment. The exterior building condition form examines the foundation, structure, walls,…

  15. Development of an environmental safety case guidance manual

    International Nuclear Information System (INIS)

    Wellstead, Matthew John

    2014-01-01

    NDA RWMD is currently considering the scope, purpose and structure of a safety case manual that covers the development of nuclear operational, transport and environmental safety cases for a geological disposal facility in the United Kingdom. This paper considers the Environmental Safety Case (ESC) input into such a manual (herein referred to as the 'ESC Manual'), looking at the drivers and benefits that a guidance manual in this area may provide. (authors)

  16. SES2D user's manual

    International Nuclear Information System (INIS)

    Johnson, J.D.; Lyon, S.P.

    1982-04-01

    SES2D is an interactive graphics code designed to generate plots of equation of state data from the Los Alamos National Laboratory Group T-4 computer libraries. This manual discusses the capabilities of the code. It describes the prompts and commands and illustrates their use with a sample run

  17. Hanford whole body counting manual

    International Nuclear Information System (INIS)

    Palmer, H.E.; Brim, C.P.; Rieksts, G.A.; Rhoads, M.C.

    1987-05-01

    This document, a reprint of the Whole Body Counting Manual, was compiled to train personnel, document operation procedures, and outline quality assurance procedures. The current manual contains information on: the location, availability, and scope of services of Hanford's whole body counting facilities; the administrative aspect of the whole body counting operation; Hanford's whole body counting facilities; the step-by-step procedure involved in the different types of in vivo measurements; the detectors, preamplifiers and amplifiers, and spectroscopy equipment; the quality assurance aspect of equipment calibration and recordkeeping; data processing, record storage, results verification, report preparation, count summaries, and unit cost accounting; and the topics of minimum detectable amount and measurement accuracy and precision. 12 refs., 13 tabs

  18. LCS Users Manual

    International Nuclear Information System (INIS)

    Redd, A.J.; Ignat, D.W.

    1998-01-01

    The Lower Hybrid Simulation Code (LSC) is a computational model of lower hybrid current drive in the presence of an electric field. Details of geometry, plasma profiles, and circuit equations are treated. Two-dimensional velocity space effects are approximated in a one-dimensional Fokker-Planck treatment. The LSC was originally written to be a module for lower hybrid current drive called by the Tokamak Simulation Code (TSC), which is a numerical model of an axisymmetric tokamak plasma and the associated control systems. The TSC simulates the time evolution of a free boundary plasma by solving the MHD equations on a rectangular computational grid. The MHD equations are coupled to the external circuits (representing poloidal field coils) through the boundary conditions. The code includes provisions for modeling the control system, external heating, and fusion heating. The LSC module can also be called by the TRANSP code. TRANSP represents the plasma with an axisymmetric, fixed-boundary model and focuses on calculation of plasma transport to determine transport coefficients from data on power inputs and parameters reached. This manual covers the basic material needed to use the LSC. If run in conjunction with TSC, the ''TSC Users Manual'' should be consulted. If run in conjunction with TRANSP, on-line documentation will be helpful. A theoretical background of the governing equations and numerical methods is given. Information on obtaining, compiling, and running the code is also provided

  19. ARDS User Manual

    Science.gov (United States)

    Fleming, David P.

    2001-01-01

    Personal computers (PCs) are now used extensively for engineering analysis. their capability exceeds that of mainframe computers of only a few years ago. Programs originally written for mainframes have been ported to PCs to make their use easier. One of these programs is ARDS (Analysis of Rotor Dynamic Systems) which was developed at Arizona State University (ASU) by Nelson et al. to quickly and accurately analyze rotor steady state and transient response using the method of component mode synthesis. The original ARDS program was ported to the PC in 1995. Several extensions were made at ASU to increase the capability of mainframe ARDS. These extensions have also been incorporated into the PC version of ARDS. Each mainframe extension had its own user manual generally covering only that extension. Thus to exploit the full capability of ARDS required a large set of user manuals. Moreover, necessary changes and enhancements for PC ARDS were undocumented. The present document is intended to remedy those problems by combining all pertinent information needed for the use of PC ARDS into one volume.

  20. APUD - User's manual

    International Nuclear Information System (INIS)

    Vamanu, D.

    1989-04-01

    APUD is a computer code designed for rough, expeditious assessments of the dispersal of airborne radioactivity discharged, either normally or accidentally, from nuclear facilities. In particular, codes such as APUD may aptly complement the tool-kit required in the adoption of informed contingency plans in case of abnormal nuclear occurrences that are accompanied by atmospheric releases of radioactivity. Apart from such extreme circumstances, APUD may profitably be used as a simulation/drill facility. Given adequate inputs, APUD may also work on any sort of industrial atmospheric emissions. (author)

  1. Building a capacity building manual

    CSIR Research Space (South Africa)

    Clinton, DD

    2010-02-01

    Full Text Available. Presentation on building a capacity building manual by the WFEO Capacity Building Committee (Daniel D. Clinton, Jr., P.E., F.NSPE, Chair), with contributions from Dr Andrew Cleland, FIPENZ (Chief Executive, IPENZ, NZ) and Eng David Botha, FSAICE (Executive Director, SAICE, SA). Topics outlined include tertiary-level university curricula, coaches and mentors, facilities, remuneration of academics, experiential training, outreach to students, student chapters, and young members forums.

  2. Evaluation of left atrial function by multidetector computed tomography before left atrial radiofrequency-catheter ablation: Comparison of a manual and automated 3D volume segmentation method

    Energy Technology Data Exchange (ETDEWEB)

    Wolf, Florian, E-mail: florian.wolf@meduniwien.ac.a [Department of Radiology, Medical University of Vienna, Vienna (Austria); Ourednicek, Petr [Philips Medical Systems, Prague (Czech Republic); Loewe, Christian [Department of Radiology, Medical University of Vienna, Vienna (Austria); Richter, Bernhard; Goessinger, Heinz David; Gwechenberger, Marianne [Department of Cardiology, Medical University of Vienna, Vienna (Austria); Plank, Christina; Schernthaner, Ruediger Egbert; Toepker, Michael; Lammer, Johannes [Department of Radiology, Medical University of Vienna, Vienna (Austria); Feuchtner, Gudrun M. [Department of Radiology, Innsbruck Medical University, Innsbruck (Austria); Institute of Diagnostic Radiology, University Hospital Zurich (Switzerland)

    2010-08-15

    Introduction: The purpose of this study was to compare a manual and automated 3D volume segmentation tool for evaluation of left atrial (LA) function by 64-slice multidetector-CT (MDCT). Methods and materials: In 33 patients with paroxysmal atrial fibrillation, an MDCT scan was performed before radiofrequency-catheter ablation. Atrial function (minimal volume (LAmin), maximal volume (LAmax), stroke volume (SV), ejection fraction (EF)) was evaluated by two readers using a manual and an automatic tool, and measurement time was evaluated. Results: Automated LA volume segmentation failed in one patient due to low LA enhancement (103 HU). Mean LAmax, LAmin, SV and EF were 127.7 ml, 93 ml, 34.7 ml, 27.1% by the automated, and 122.7 ml, 89.9 ml, 32.8 ml, 26.3% by the manual method, with no significant difference (p > 0.05) and high Pearson's correlation coefficients (r = 0.94, r = 0.94, r = 0.82 and r = 0.85, p < 0.0001), respectively. The automated method was significantly faster (p < 0.001). Interobserver variability was low for both methods, with Pearson's correlation coefficients between 0.98 and 0.99 (p < 0.0001). Conclusions: Evaluation of LA volume and function with 64-slice MDCT is feasible with very low interobserver variability. The automatic method is as accurate as the manual method but significantly less time consuming, permitting routine use in clinical practice before RF-catheter ablation.

  3. RADTRAN 4: Volume 4, Programmer's manual

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1992-07-01

    The RADTRAN 4 computer code is designed to analyze radiological consequences and accident risks of transporting radioactive material. This manual provides information useful for interpreting, troubleshooting, or debugging components of the code during development or revision of the program

  4. Zooplankton Methodology, Collection & identyification - A field manual

    Digital Repository Service at National Institute of Oceanography (India)

    Goswami, S.C.

    and productivity would largely depend upon the use of correct methodology which involves collection of samples, fixation, preservation, analysis and computation of data. The detailed procedures on all these aspects are given in this manual....

  5. User's manual to the ICRP Code: a series of computer programs to perform dosimetric calculations for the ICRP Committee 2 report

    International Nuclear Information System (INIS)

    Watson, S.B.; Ford, M.R.

    1980-02-01

    A computer code has been developed that implements the recommendations of ICRP Committee 2 for computing limits for occupational exposure of radionuclides. The purpose of this report is to describe the various modules of the computer code and to present a description of the methods and criteria used to compute the tables published in the Committee 2 report. The computer code contains three modules: (1) one computes specific effective energy; (2) one calculates cumulated activity; and (3) one computes dose and the series of ICRP tables. The description of the first two modules emphasizes the new ICRP Committee 2 recommendations in computing specific effective energy and cumulated activity. For the third module, the complex criteria are discussed for calculating the tables of committed dose equivalent, weighted committed dose equivalents, annual limit of intake, and derived air concentration
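
    As a hedged illustration of how the third module's quantities combine in the ICRP 30 formalism referred to above, the committed dose equivalent to a target organ is obtained by summing, over all source organs, the cumulated activity multiplied by the specific effective energy. The sketch below uses placeholder numbers, not ICRP tabulations or the code's actual routines.

      # Committed dose equivalent in the ICRP 30 style:
      #   H_50,T [Sv] = 1.6e-10 * sum_S( U_S * SEE(T <- S) )
      # U_S : cumulated activity in source organ S (number of transformations)
      # SEE : specific effective energy (MeV per gram per transformation)
      def committed_dose_equivalent(us_see_pairs):
          return 1.6e-10 * sum(u_s * see for u_s, see in us_see_pairs)

      # Two hypothetical source organs contributing to one target organ:
      print(committed_dose_equivalent([(3.2e12, 1.4e-4), (8.0e11, 2.9e-5)]))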

  6. User's manual to the ICRP Code: a series of computer programs to perform dosimetric calculations for the ICRP Committee 2 report

    Energy Technology Data Exchange (ETDEWEB)

    Watson, S.B.; Ford, M.R.

    1980-02-01

    A computer code has been developed that implements the recommendations of ICRP Committee 2 for computing limits for occupational exposure of radionuclides. The purpose of this report is to describe the various modules of the computer code and to present a description of the methods and criteria used to compute the tables published in the Committee 2 report. The computer code contains three modules: (1) one computes specific effective energy; (2) one calculates cumulated activity; and (3) one computes dose and the series of ICRP tables. The description of the first two modules emphasizes the new ICRP Committee 2 recommendations in computing specific effective energy and cumulated activity. For the third module, the complex criteria are discussed for calculating the tables of committed dose equivalent, weighted committed dose equivalents, annual limit of intake, and derived air concentration.

  7. Assessment of the integrity of structural shielding of four computed tomography facilities in the greater Accra region of Ghana

    International Nuclear Information System (INIS)

    Nkansah, A.; Schandorf, C.; Boadu, M.; Fletcher, J. J.

    2013-01-01

    The structural shielding thicknesses of the walls of four computed tomography (CT) facilities in Ghana were re-evaluated to verify the shielding integrity using the new shielding design methods recommended by the National Council on Radiological Protection and Measurements (NCRP). The shielding thickness obtained ranged from 120 to 155 mm using default DLP values proposed by the European Commission and 110 to 168 mm using derived DLP values from the four CT manufacturers. These values are within the accepted standard concrete wall thickness ranging from 102 to 152 mm prescribed by the NCRP. The ultrasonic pulse testing of all walls indicated that these are of good quality and free of voids since pulse velocities estimated were within the range of 3.496±0.005 km s⁻¹. An average dose equivalent rate estimated for supervised areas is 3.4±0.27 μSv week⁻¹ and that for the controlled area is 18.0±0.15 μSv week⁻¹, which are within acceptable values. (authors)
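
    The NCRP-style barrier estimate behind such a re-evaluation reduces to comparing the unshielded weekly scattered dose (proportional to workload expressed as DLP) with the design limit, and converting the required transmission factor into a thickness via tenth-value layers. The sketch below is illustrative only: the scatter coefficient and concrete TVL are assumed round numbers, not values taken from the paper or from NCRP tables.

      # Hedged sketch of an NCRP-style CT barrier estimate (all constants assumed).
      import math

      def required_thickness_mm(P_uSv_wk, d_m, N_patients_wk, DLP_mGycm, T=1.0,
                                kappa_uSv_per_mGycm=0.3,  # assumed scatter air kerma at 1 m per unit DLP
                                TVL_mm=70.0):             # assumed broad-beam concrete TVL
          K_unshielded = kappa_uSv_per_mGycm * DLP_mGycm * N_patients_wk * T / d_m**2
          if K_unshielded <= P_uSv_wk:
              return 0.0                                  # no barrier needed
          B = P_uSv_wk / K_unshielded                     # required transmission factor
          return -math.log10(B) * TVL_mm                  # thickness in mm of concrete

      # e.g. 100 patients/week, 1000 mGy*cm per scan, wall 3 m away, 20 uSv/week goal:
      print(required_thickness_mm(20, 3.0, 100, 1000))    # ~155 mm with these assumptions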

  8. Assessment of the structural shielding integrity of some selected computed tomography facilities in the Greater Accra Region of Ghana

    International Nuclear Information System (INIS)

    Nkansah, A.

    2010-01-01

    The structural shielding integrity was assessed for four CT facilities, at Trust Hospital, Korle-Bu Teaching Hospital, the 37 Military Hospital and Medical Imaging Ghana Ltd. in the Greater Accra Region of Ghana. From the shielding calculations, the concrete wall thicknesses computed are 120, 145, 140 and 155 mm for Medical Imaging Ghana Ltd., 37 Military Hospital, Trust Hospital and Korle-Bu Teaching Hospital, respectively, using default DLP values. The wall thicknesses using derived DLP values are 110, 110, 120 and 168 mm for Medical Imaging Ghana Ltd., 37 Military Hospital, Trust Hospital and Korle-Bu Teaching Hospital, respectively. These values are within the accepted standard concrete thickness of 102-152 mm prescribed by the National Council on Radiological Protection and Measurements. The ultrasonic pulse testing indicated that all the sandcrete walls are of good quality and free of voids, since the estimated pulse velocities were approximately equal to 3.45 km/s. An average dose rate measurement for supervised areas is 3.4 μSv/wk and for controlled areas is 18.0 μSv/wk. These dose rates were below the acceptable levels of 100 μSv per week for the occupationally exposed and 20 μSv per week for members of the public provided by the ICRU. The results mean that the structural shielding thicknesses are adequate to protect members of the public and occupationally exposed workers. (au)

  9. Research Facilities | Wind | NREL

    Science.gov (United States)

    NREL's state-of-the-art wind research facilities include structural research facilities for turbine blade testing. (Page photo captions: blade-end structural testing; a computer simulation display.)

  10. Scientific workflow and support for high resolution global climate modeling at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V.; Mayer, B.; Wang, F.; Hack, J.; McKenna, D.; Hartman-Baker, R.

    2012-04-01

    The Oak Ridge Leadership Computing Facility (OLCF) facilitates the execution of computational experiments that require tens of millions of CPU hours (typically using thousands of processors simultaneously) while generating hundreds of terabytes of data. A set of ultra high resolution climate experiments in progress, using the Community Earth System Model (CESM), will produce over 35,000 files, ranging in size from 21 MB to 110 GB each. The execution of the experiments will require nearly 70 million CPU hours on the Jaguar and Titan supercomputers at OLCF. The total volume of the output from these climate modeling experiments will be in excess of 300 TB. This model output must then be archived, analyzed, distributed to the project partners in a timely manner, and also made available more broadly. Meeting this challenge would require efficient movement of the data, staging the simulation output to a large and fast file system that provides high volume access to other computational systems used to analyze the data and synthesize results. This file system also needs to be accessible via high speed networks to an archival system that can provide long term reliable storage. Ideally this archival system is itself directly available to other systems that can be used to host services making the data and analysis available to the participants in the distributed research project and to the broader climate community. The various resources available at the OLCF now support this workflow. The available systems include the new Jaguar Cray XK6 2.63 petaflops (estimated) supercomputer, the 10 PB Spider center-wide parallel file system, the Lens/EVEREST analysis and visualization system, the HPSS archival storage system, the Earth System Grid (ESG), and the ORNL Climate Data Server (CDS). The ESG features federated services, search & discovery, extensive data handling capabilities, deep storage access, and Live Access Server (LAS) integration. The scientific workflow enabled on

  11. Computed Tomography Scanning Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Advances research in the areas of marine geosciences, geotechnical, civil, and chemical engineering, physics, and ocean acoustics by using high-resolution,...

  12. WAM-E user's manual

    International Nuclear Information System (INIS)

    Rayes, L.G.; Riley, J.E.

    1986-07-01

    The WAM-E series of mainframe computer codes have been developed to efficiently analyze the large binary models (e.g., fault trees) used to represent the logic relationships within and between the systems of a nuclear power plant or other large, multisystem entity. These codes have found wide application in reliability and safety studies of nuclear power plant systems. There are now nine codes in the WAM-E series, with six (WAMBAM/WAMTAP, WAMCUT, WAMCUT-II, WAMFM, WAMMRG, and SPASM) classified as Type A Production codes and the other three (WAMFTP, WAMTOP, and WAMCONV) classified as Research codes. This document serves as a combined User's Guide, Programmer's Manual, and Theory Reference for the codes, with emphasis on the Production codes. To that end, the manual is divided into four parts: Part I, Introduction; Part II, Theory and Numerics; Part III, WAM-E User's Guide; and Part IV, WAMMRG Programmer's Manual

  13. Operating manual for the Bulk Shielding Reactor

    International Nuclear Information System (INIS)

    1983-04-01

    The BSR is a pool-type reactor. It has the capabilities of continuous operation at a power level of 2 MW or at any desired lower power level. This manual presents descriptive and operational information. The reactor and its auxiliary facilities are described from physical and operational viewpoints. Detailed operating procedures are included which are applicable from source-level startup to full-power operation. Also included are procedures relative to the safety of personnel and equipment in the areas of experiments, radiation and contamination control, emergency actions, and general safety. This manual supersedes all previous operating manuals for the BSR

  14. Operating manual for the Bulk Shielding Reactor

    International Nuclear Information System (INIS)

    1987-03-01

    The BSR is a pool-type reactor. It has the capabilities of continuous operation at a power level of 2 MW or at any desired lower power level. This manual presents descriptive and operational information. The reactor and its auxiliary facilities are described from physical and operational viewpoints. Detailed operating procedures are included which are applicable from source-level startup to full-power operation. Also included are procedures relative to the safety of personnel and equipment in the areas of experiments, radiation and contamination control, emergency actions, and general safety. This manual supersedes all previous operating manuals for the BSR

  15. Operating manual for the Bulk Shielding Reactor

    Energy Technology Data Exchange (ETDEWEB)

    1987-03-01

    The BSR is a pool-type reactor. It has the capabilities of continuous operation at a power level of 2 MW or at any desired lower power level. This manual presents descriptive and operational information. The reactor and its auxiliary facilities are described from physical and operational viewpoints. Detailed operating procedures are included which are applicable from source-level startup to full-power operation. Also included are procedures relative to the safety of personnel and equipment in the areas of experiments, radiation and contamination control, emergency actions, and general safety. This manual supersedes all previous operating manuals for the BSR.

  16. Operating manual for the Bulk Shielding Reactor

    Energy Technology Data Exchange (ETDEWEB)

    1983-04-01

    The BSR is a pool-type reactor. It has the capabilities of continuous operation at a power level of 2 MW or at any desired lower power level. This manual presents descriptive and operational information. The reactor and its auxiliary facilities are described from physical and operational viewpoints. Detailed operating procedures are included which are applicable from source-level startup to full-power operation. Also included are procedures relative to the safety of personnel and equipment in the areas of experiments, radiation and contamination control, emergency actions, and general safety. This manual supersedes all previous operating manuals for the BSR.

  17. DOE explosives safety manual. Revision 7

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    This manual prescribes the Department of Energy (DOE) safety rules used to implement the DOE safety policy for operations involving explosives. This manual is applicable to all DOE facilities engaged in operations of development, manufacturing, handling, storage, transportation, processing, or testing of explosives, pyrotechnics and propellants, or assemblies containing these materials. The standards of this manual deal with the operations involving explosives, pyrotechnics and propellants, and the safe management of such operations. The design of all new explosives facilities shall conform to the requirements established in this manual and implemented in DOE 6430.1A, "General Design Criteria Manual." It is not intended that existing physical facilities be changed arbitrarily to comply with these provisions, except as required by law. Existing facilities that do not comply with these standards may continue to be used for the balance of their functional life, as long as the current operation presents no significantly greater risk than that assumed when the facility was originally designed and it can be demonstrated clearly that a modification to bring the facility into compliance is not feasible. However, in the case of a major renovation, the facility must be brought into compliance with current standards. The standards are presented as either mandatory or advisory. Mandatory standards, denoted by the words "shall," "must," or "will," are requirements that must be followed unless written authority for deviation is granted as an exemption by the DOE. Advisory standards denoted by "should" or "may" are standards that may be deviated from with a waiver granted by facility management.

  18. Ziptrack manual

    International Nuclear Information System (INIS)

    Ito, A.; Bosworth, W.; Rutherford, J.; Lynch, A.; Tung, L.; Yang, W.

    1983-07-01

    A phenolic cart holding 3 mutually perpendicular search coils is moved through an aluminum beam into the magnetic field. Coil voltages are integrated and digitized through a Data Translation 1712 ADC. The cart is controlled and the data are processed by an on-line PDP-11/05 computer. The results are displayed on a Tektronix 4010 terminal, stored on 9-track, 800-bpi tapes, and optionally recorded on hard copy. Software on a floppy disk controls the system. The position of the cart is located by an encoder and is checked along the beam line by optical switches. One encoder count equals 0.01945 in. The X and Y positions are changed by manipulators at each end of the beam. 625 horizontal or 500 vertical counts equal 1 in. The desired field mapping can be automatically set up by programming a grid of encoder counts on the "SHOW STATUS" chart. A Ziptrack command summary is given. Following that is a typical procedure for Ziptrack operations. Also attached are time constants for the integrators and coil calibrations for 30 ft. and 100 ft. long cables
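
    The position calibration quoted above amounts to three fixed conversion factors; a minimal sketch (constants copied from the text, function name invented for illustration) is:

      # Convert Ziptrack encoder counts to inches using the constants quoted above.
      BEAM_IN_PER_COUNT = 0.01945   # along the beam: inches per encoder count
      X_COUNTS_PER_IN   = 625       # horizontal manipulator counts per inch
      Y_COUNTS_PER_IN   = 500       # vertical manipulator counts per inch

      def position_inches(beam_counts, x_counts, y_counts):
          return (beam_counts * BEAM_IN_PER_COUNT,
                  x_counts / X_COUNTS_PER_IN,
                  y_counts / Y_COUNTS_PER_IN)

      print(position_inches(10000, 1250, 1000))   # -> (194.5, 2.0, 2.0) inches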

  19. SYVAC3 manual

    International Nuclear Information System (INIS)

    Andres, T.H.

    2000-01-01

    SYVAC3 (Systems Variability Analysis Code, generation 3) is a computer program that implements a method called systems variability analysis to analyze the behaviour of a system in the presence of uncertainty. This method is based on simulating the system many times to determine the variation in behaviour it can exhibit. SYVAC3 specializes in systems representing the transport of contaminants, and has several features to simplify the modelling of such systems. It provides a general tool for estimating environmental impacts from the dispersal of contaminants. This report describes the use and structure of SYVAC3. It is intended for modellers, programmers, operators and reviewers who deal with simulation codes based on SYVAC3. From this manual they can learn how to link a model with SYVAC3, how to set up an input file, and how to extract results from output files. The manual lists the subroutines of SYVAC3 that are available for use by models, and describes their argument lists. It also gives an overview of how routines in the File Reading Package, the Parameter Sampling Package and the Time Series Package can be used by programs outside of SYVAC3. (author)
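
    The "simulate the system many times" idea behind systems variability analysis can be illustrated with a generic Monte Carlo sketch. The toy model and the parameter distributions below are invented placeholders, not SYVAC3 internals.

      # Generic systems-variability sketch: sample uncertain parameters, run the
      # model repeatedly, and summarize the spread of outcomes.
      import random, statistics

      def toy_transport_model(release_rate, retardation):
          # stand-in for a contaminant transport/dose estimate
          return release_rate / (1.0 + retardation)

      results = []
      for _ in range(10_000):
          release_rate = random.lognormvariate(0.0, 0.5)   # sampled source term
          retardation  = random.uniform(1.0, 10.0)         # sampled retardation factor
          results.append(toy_transport_model(release_rate, retardation))

      print("median:", statistics.median(results),
            "95th percentile:", statistics.quantiles(results, n=20)[-1])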

  20. SYVAC3 manual

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H.

    2000-07-01

    SYVAC3 (Systems Variability Analysis Code, generation 3) is a computer program that implements a method called systems variability analysis to analyze the behaviour of a system in the presence of uncertainty. This method is based on simulating the system many times to determine the variation in behaviour it can exhibit. SYVAC3 specializes in systems representing the transport of contaminants, and has several features to simplify the modelling of such systems. It provides a general tool for estimating environmental impacts from the dispersal of contaminants. This report describes the use and structure of SYVAC3. It is intended for modellers, programmers, operators and reviewers who deal with simulation codes based on SYVAC3. From this manual they can learn how to link a model with SYVAC3, how to set up an input file, and how to extract results from output files. The manual lists the subroutines of SYVAC3 that are available for use by models, and describes their argument lists. It also gives an overview of how routines in the File Reading Package, the Parameter Sampling Package and the Time Series Package can be used by programs outside of SYVAC3. (author)

  1. TAP 1, Training Program Manual

    International Nuclear Information System (INIS)

    1991-01-01

    Training programs at DOE nuclear facilities should provide well-trained, qualified personnel to safely and efficiently operate the facilities in accordance with DOE requirements. A need has been identified for guidance regarding analysis, design, development, implementation, and evaluation of consistent and reliable performance-based training programs. Accreditation of training programs at Category A reactors and high-hazard and selected moderate-hazard nonreactor nuclear facilities will assure consistent, appropriate, and cost-effective training of personnel responsible for the operation, maintenance, and technical support of these facilities. Training programs that are designed and based on systematically determined job requirements, instead of subjective estimation of trainee needs, yield training activities that are consistent and develop or improve knowledge, skills, and abilities that can be directly related to the work setting. Because the training is job-related, the content of these programs more efficiently meets the needs of the employee. Besides a better trained work force, a greater level of operational reactor safety can be realized. This manual is intended to provide an overview of the accreditation process and a brief description of the elements necessary to construct and maintain training programs that are based on the requirements of the job. Two companion manuals provide additional information to assist contractors in their efforts to accredit training programs

  2. Facile formation of dendrimer-stabilized gold nanoparticles modified with diatrizoic acid for enhanced computed tomography imaging applications.

    Science.gov (United States)

    Peng, Chen; Li, Kangan; Cao, Xueyan; Xiao, Tingting; Hou, Wenxiu; Zheng, Linfeng; Guo, Rui; Shen, Mingwu; Zhang, Guixiang; Shi, Xiangyang

    2012-11-07

    We report a facile approach to forming dendrimer-stabilized gold nanoparticles (Au DSNPs) through the use of amine-terminated fifth-generation poly(amidoamine) (PAMAM) dendrimers modified by diatrizoic acid (G5.NH(2)-DTA) as stabilizers for enhanced computed tomography (CT) imaging applications. In this study, by simply mixing G5.NH(2)-DTA dendrimers with gold salt in aqueous solution at room temperature, dendrimer-entrapped gold nanoparticles (Au DENPs) with a mean core size of 2.5 nm formed spontaneously. Following an acetylation reaction to neutralize the remaining terminal amines of the dendrimers, Au DSNPs with a mean size of 6 nm were formed. The formed DTA-containing [(Au(0))(50)-G5.NHAc-DTA] DSNPs were characterized via different techniques. We show that the Au DSNPs are colloid stable in aqueous solution under different pH and temperature conditions. In vitro hemolytic assay, cytotoxicity assay, flow cytometry analysis, and cell morphology observation reveal that the formed Au DSNPs have good hemocompatibility and are non-cytotoxic at a concentration up to 3.0 μM. X-ray absorption coefficient measurements show that the DTA-containing Au DSNPs have enhanced attenuation intensity, much higher than that of [(Au(0))(50)-G5.NHAc] DENPs without DTA or Omnipaque at the same molar concentration of the active element (Au or iodine). The formed DTA-containing Au DSNPs can be used for CT imaging of cancer cells in vitro as well as for blood pool CT imaging of mice in vivo with significantly improved signal enhancement. With the two radiodense elements of Au and iodine incorporated within one particle, the formed DTA-containing Au DSNPs may be applicable for CT imaging of various biological systems with enhanced X-ray attenuation property and detection sensitivity.

  3. Dynamic Thermal Loads and Cooling Requirements Calculations for V ACs System in Nuclear Fuel Processing Facilities Using Computer Aided Energy Conservation Models

    International Nuclear Information System (INIS)

    EL Fawal, M.M.; Gadalla, A.A.; Taher, B.M.

    2010-01-01

    In terms of nuclear safety, the most important function of ventilation air conditioning (VAC) systems is to maintain safe ambient conditions for components and structures important to safety inside the nuclear facility and to maintain appropriate working conditions for the plant's operating and maintenance staff. As part of a study aimed at evaluating the performance of the VAC system of a nuclear fuel cycle facility (NFCF), a computer model was developed and verified to evaluate the thermal loads and cooling requirements for different zones of a fuel processing facility. The program is based on the transfer function method (TFM) and is used to calculate the dynamic heat gain through various multilayer wall constructions and windows, hour by hour, at any orientation of the building. The developed model was verified by comparing the calculated solar heat gain of a given building with the corresponding values calculated using the finite difference method (FDM) and the total equivalent temperature difference method (TETD). As an example, the developed program was used to calculate the cooling loads of the different zones of a typical nuclear fuel facility; the results showed that the cooling capacities of the different cooling units of each zone of the facility meet the design requirements according to safety regulations for nuclear facilities.
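
    The transfer function method mentioned above expresses the hourly conductive heat gain through a wall as a weighted sum of past sol-air temperatures and past heat gains (the ASHRAE conduction-transfer-function form). The sketch below uses invented coefficients and temperatures purely for illustration; they are not the wall constructions analyzed in the paper.

      # Minimal conduction-transfer-function (TFM) sketch for hourly wall heat gain.
      b = [0.005, 0.030, 0.025, 0.008]   # W/(m2*K), weights on past sol-air temperatures (assumed)
      d = [1.0, -0.95, 0.20]             # weights on past heat gains, d[0] = 1 (assumed)
      c_sum = sum(b)                     # steady-state identity: sum(c) == sum(b)
      T_room = 24.0                      # constant room air temperature, deg C (assumed)

      def heat_gain_series(sol_air_temps):
          q = []                         # heat gain per unit area, W/m2, hour by hour
          for t in range(len(sol_air_temps)):
              gain = sum(b[n] * sol_air_temps[t - n] for n in range(len(b)) if t - n >= 0)
              gain -= sum(d[n] * q[t - n] for n in range(1, len(d)) if t - n >= 0)
              gain -= T_room * c_sum
              q.append(gain)
          return q

      print(heat_gain_series([30, 35, 40, 42, 38, 33]))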

  4. GANDALF: users' manual

    International Nuclear Information System (INIS)

    Strout, R.E. II; Beach, J.L.

    1977-01-01

    The GANDALF computer code was written to calculate neutron dose equivalent given the pulse-height data obtained by using a Linear Energy Transfer (LET) proportional counter. The code also uses pre- and/or post-calibration spectra, from an alpha source, to determine a calibration factor in keV/μ/channel. Output from the code consists of the effective radius of the detection chamber in microns, a calibration factor in keV/μ/channel, and the total dose and dose equivalent in rad or rem between any two LET energies by using the equations by Attix and Roesch [Radiation Dosimetry, 1, 71 (1968)]. This report is a user's manual and is not intended as anything else, and assumes that the user has a basic knowledge of the LLL Octopus timesharing system. However, a very brief description of how the code operates is included
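
    The reduction the code performs can be pictured as summing, channel by channel, the energy deposited in the chamber and weighting it by a quality factor to obtain dose equivalent. The sketch below is a hedged illustration only: the calibration factor, mean chord length, chamber mass, spectrum, and the ICRP 60-style Q(L) relation are assumptions, not GANDALF's actual equations or data.

      # Turn a calibrated LET pulse-height spectrum into dose and dose equivalent
      # (illustrative assumptions throughout).
      def quality_factor(let_keV_um):
          # ICRP 60-style Q(L) relation, used here as a stand-in
          if let_keV_um < 10:
              return 1.0
          if let_keV_um <= 100:
              return 0.32 * let_keV_um - 2.2
          return 300.0 / let_keV_um**0.5

      def dose_and_dose_equivalent(counts, cal_keV_um_per_channel, mean_chord_um, mass_g):
          dose_rad, de_rem = 0.0, 0.0
          for channel, n in enumerate(counts):
              let = (channel + 0.5) * cal_keV_um_per_channel   # LET at channel midpoint
              energy_keV = n * let * mean_chord_um             # energy deposited by n events
              d = energy_keV * 1.602e-9 / mass_g / 100.0       # keV -> erg, erg/g -> rad
              dose_rad += d
              de_rem += d * quality_factor(let)
          return dose_rad, de_rem

      print(dose_and_dose_equivalent([0, 120, 340, 80, 5], 2.0, 1.8, 0.15))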

  5. STAIRS User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Gadjokov, V; Dragulev, V; Gove, N; Schmid, H

    1976-10-15

    The STorage And Information Retrieval System (STAIRS) of IBM is described from the user's point of view. The description is based on the experimental use of STAIRS at the IAEA computer, with INIS and AGRIS data bases, from June 1975 to May 1976. Special attention is paid to what may be termed the hierarchical approach to retrieval in STAIRS. Such an approach allows for better use of the intrinsic data-base structure and, hence, contributes to higher recall and/or relevance ratios in retrieval. The functions carried out by STAIRS are explained and the communication language between the user and the system outlined. Details are given of the specific structure of the INIS and AGRIS data bases for STAIRS. The manual should enable an inexperienced user to start his first on-line dialogues by means of a CRT or teletype terminal. (author)

  6. STAIRS User's Manual

    International Nuclear Information System (INIS)

    Gadjokov, V.; Dragulev, V.; Gove, N.; Schmid, H.

    1976-10-01

    The STorage And Information Retrieval System (STAIRS) of IBM is described from the user's point of view. The description is based on the experimental use of STAIRS at the IAEA computer, with INIS and AGRIS data bases, from June 1975 to May 1976. Special attention is paid to what may be termed the hierarchical approach to retrieval in STAIRS. Such an approach allows for better use of the intrinsic data-base structure and, hence, contributes to higher recall and/or relevance ratios in retrieval. The functions carried out by STAIRS are explained and the communication language between the user and the system outlined. Details are given of the specific structure of the INIS and AGRIS data bases for STAIRS. The manual should enable an inexperienced user to start his first on-line dialogues by means of a CRT or teletype terminal. (author)

  7. Automation of electromagnetic compatability (EMC) test facilities

    Science.gov (United States)

    Harrison, C. A.

    1986-01-01

    Efforts to automate electromagnetic compatibility (EMC) test facilities at Marshall Space Flight Center are discussed. The present facility is used to accomplish a battery of nine standard tests (with limited variations) designed to certify EMC of Shuttle payload equipment. Prior to this project, some EMC tests were partially automated, but others were performed manually. Software was developed to integrate all testing by means of a desk-top computer-controller. Near real-time data reduction and onboard graphics capabilities permit immediate assessment of test results. Provisions for disk storage of test data permit computer production of the test engineer's certification report. Software flexibility permits variation in the test procedure, the ability to examine more closely those frequency bands which indicate compatibility problems, and the capability to incorporate additional test procedures.

  8. Nuclear medicine resources manual

    International Nuclear Information System (INIS)

    2006-02-01

    Over the past decade many IAEA programmes have significantly enhanced the capabilities of numerous Member States in the field of nuclear medicine. Functional imaging using nuclear medicine procedures has become an indispensable tool for the diagnosis, treatment planning and management of patients. However, due to the heterogeneous growth and development of nuclear medicine in the IAEA's Member States, the operating standards of practice vary considerably from country to country and region to region. This publication is the result of the work of over 30 international professionals who have assisted the IAEA in the process of standardization and harmonization. This manual sets out the prerequisites for the establishment of a nuclear medicine service, including basic infrastructure, suitable premises, reliable supply of electricity, maintenance of a steady temperature, dust exclusion for gamma cameras and radiopharmacy dispensaries. It offers clear guidance on human resources and training needs for medical doctors, technologists, radiopharmaceutical scientists, physicists and specialist nurses in the practice of nuclear medicine. The manual describes the requirements for safe preparation and quality control of radiopharmaceuticals. In addition, it contains essential requirements for maintenance of facilities and instruments, for radiation hygiene and for optimization of nuclear medicine operational performance with the use of working clinical protocols. The result is a comprehensive guide at an international level that contains practical suggestions based on the experience of professionals around the globe. This publication will be of interest to nuclear medicine physicians, radiologists, medical educationalists, diagnostic centre managers, medical physicists, medical technologists, radiopharmacists, specialist nurses, clinical scientists and those engaged in quality assurance and control systems in public health in both developed and developing countries

  9. Shielding design for positron emission tomography facility

    International Nuclear Information System (INIS)

    Abdallah, I.I.

    2007-01-01

    With the recent advent of readily available tracer isotopes, there has been a marked increase in the number of hospital-based and free-standing positron emission tomography (PET) clinics. PET facilities employ relatively large activities of high-energy photon emitting isotopes, which can be dangerous to the health of humans and animals. This, coupled with the current dose limits for radiation workers and members of the public, can result in shielding requirements. This research contributes to the calculation of the appropriate shielding to keep the level of radiation within an acceptable recommended limit. Two different methods were used, including measurements made at selected points of an operating PET facility and computer simulations using a Monte Carlo transport code. The measurements mainly concerned the radiation exposure at different points around the facility using survey meter detectors and thermoluminescent dosimeters (TLD). A set of manual calculation procedures was then used to estimate the shielding requirements for a newly built PET facility. The results from the measurements and the computer simulation were compared to the results obtained from the manual calculation procedures. In general, the estimated weekly dose at the points of interest is lower than the regulatory limits for the Little Company of Mary Hospital. Furthermore, the density and the HVL for normal-strength concrete and clay bricks are almost similar. In conclusion, PET facilities present somewhat different design requirements and are more likely to require additional radiation shielding. Therefore, the existing shields at the Little Company of Mary Hospital are in general found to be adequate and satisfactory, and additional shielding was found necessary at the new PET facility in the department of Nuclear Medicine of the Dr. George Mukhari Hospital. By use of appropriate design, by applying specific shielding requirements and by maintaining good operating practices, radiation doses to
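
    A manual PET barrier estimate of the kind referred to above typically starts from the unshielded 511 keV dose at the point of interest and converts the required transmission into a thickness using tenth-value layers. The sketch below is purely illustrative: the dose-rate constant, the neglect of decay and patient attenuation, and the concrete TVL are simplifying assumptions, not the hospital's design values.

      # Hedged sketch of a PET barrier estimate at 511 keV (all constants assumed).
      import math

      GAMMA_511 = 0.143   # assumed dose-rate constant, uSv*m2/(MBq*h), F-18 order of magnitude

      def barrier_mm(activity_MBq, patients_per_week, dwell_h, d_m,
                     weekly_limit_uSv, occupancy=1.0,
                     TVL_mm=176.0):      # assumed concrete TVL at 511 keV
          unshielded = (GAMMA_511 * activity_MBq * dwell_h * patients_per_week
                        * occupancy / d_m**2)
          if unshielded <= weekly_limit_uSv:
              return 0.0
          B = weekly_limit_uSv / unshielded   # required transmission factor
          return -math.log10(B) * TVL_mm

      # e.g. 370 MBq per patient, 40 patients/week, 1 h dwell, wall 3 m away, 20 uSv/week goal:
      print(barrier_mm(370, 40, 1.0, 3.0, weekly_limit_uSv=20))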

  10. Cathare2 V1.3E post-test computations of SPE-1 and SPE-2 experiments at PMK-NVH facility

    International Nuclear Information System (INIS)

    Belliard, M.; Laugier, E.

    1994-01-01

    This paper presents the first CATHARE2 V1.3E simulations of the SPE-2 transients at the PMK-NVH loop. Concerning the SPE-1 and SPE-2 experiments at PMK-NVH, it contains a description of the facilities and the transients, as well as the different conditions of use. The paper also includes a presentation of the CATHARE2 model and the different types of computation, such as the steady-state computation and the SPE-1 and SPE-2 transients (TEC). 4 refs., 12 figs., 4 tabs

  11. Materials and Life Science Experimental Facility at the Japan Proton Accelerator Research Complex III: Neutron Devices and Computational and Sample Environments

    Directory of Open Access Journals (Sweden)

    Kaoru Sakasai

    2017-08-01

    Full Text Available Neutron devices such as neutron detectors, optical devices including supermirror devices and 3He neutron spin filters, and choppers have been successfully developed and installed at the Materials and Life Science Experimental Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC), Tokai, Japan. Four software components of the MLF computational environment (instrument control, data acquisition, data analysis, and a database) have been developed and deployed at the MLF. The MLF also provides a wide variety of sample environment options, including high and low temperatures, high magnetic fields, and high pressures. This paper describes the current status of neutron devices and of the computational and sample environments at the MLF.

  12. HERMES II experimenters' manual (revised)

    International Nuclear Information System (INIS)

    Schuch, R.L.

    1977-04-01

    The HERMES II is a high-intensity laboratory photon source for gamma-ray radiation effects experiments as well as a high-energy pulsed electron beam generator for a variety of potential applications. The purpose of this manual is to serve as a basic source of information for prospective users of HERMES. Included is a brief discussion of the design and operation of the accelerator system as well as a summary of environmental data for x-ray operation and output characteristics for electron beam modes. The manual also contains a description of the HERMES experimental facilities, including geometry of the test cell, instrumentation and data collection capabilities, and services and support available to experimenters

  13. TRANSWRAP II: problem definition manual

    International Nuclear Information System (INIS)

    Knittle, D.E.

    1981-02-01

    The TRANSWRAP II computer code, written in Fortran IV and described in this Problem Definition Manual, was developed to analytically predict the magnitude of pressure pulses from large-scale sodium-water reactions in LMFBR secondary systems. It is currently being used for the Clinch River Breeder Reactor Program. The code provides the options, flexibility and features necessary to consider any system configuration. The code methodology has been validated with the aid of extensive sodium-water reaction test programs

  14. A Manual of Style.

    Science.gov (United States)

    Nebraska State Dept. of Education, Lincoln.

    This "Manual of Style" is offered as a guide to assist Nebraska State employees in producing quality written communications and in presenting a consistently professional image of government documents. The manual is not designed to be all-inclusive. Sections of the manual discuss formatting documents, memorandums, letters, mailing…

  15. Procedures for economic distribution of radionuclides in research facilities

    International Nuclear Information System (INIS)

    Perry, N.A.

    1979-01-01

    A radionuclide accountability system for use in a research facility is described. It can be operated manually or adapted for computer use. All radionuclides are ordered, received, distributed and paid for by the Radiological Control Office, which keeps complete records of date of order, receipt, calibration, use, transfer and/or disposal. Wipe leak tests, specific activity and lot number are also recorded. The procedure provides centralized total accountability records, including financial records, of all radionuclide orders, and the economic advantages of combined purchasing. The use of this system in two medical facilities has resulted in considerable financial savings in the first year of operation. (author)
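
    The record-keeping scheme described above maps naturally onto a simple structured record, whether kept manually or on a computer. The sketch below is illustrative only; the field names are invented and do not reproduce the office's actual forms.

      # Minimal accountability record mirroring the items listed in the abstract
      # (order, receipt, calibration, use, transfer, disposal, wipe tests,
      # specific activity, lot number). Python 3.10+.
      from dataclasses import dataclass, field
      from datetime import date

      @dataclass
      class RadionuclideRecord:
          nuclide: str
          lot_number: str
          specific_activity_MBq_per_mg: float
          ordered: date
          received: date | None = None
          wipe_tests: list[tuple[date, float]] = field(default_factory=list)  # (date, removable Bq)
          history: list[tuple[date, str]] = field(default_factory=list)       # calibration/use/transfer/disposal

      rec = RadionuclideRecord("I-125", "LOT-0421", 600.0, date(2024, 3, 1))
      rec.history.append((date(2024, 3, 8), "distributed to Lab 2"))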

  16. Functional requirements document for the Earth Observing System Data and Information System (EOSDIS) Scientific Computing Facilities (SCF) of the NASA/MSFC Earth Science and Applications Division, 1992

    Science.gov (United States)

    Botts, Michael E.; Phillips, Ron J.; Parker, John V.; Wright, Patrick D.

    1992-01-01

    Five scientists at MSFC/ESAD have EOS SCF investigator status. Each SCF has unique tasks which require the establishment of a computing facility dedicated to accomplishing those tasks. A SCF Working Group was established at ESAD with the charter of defining the computing requirements of the individual SCFs and recommending options for meeting these requirements. The primary goal of the working group was to determine which computing needs can be satisfied using either shared resources or separate but compatible resources, and which needs require unique individual resources. The requirements investigated included CPU-intensive vector and scalar processing, visualization, data storage, connectivity, and I/O peripherals. A review of computer industry directions and a market survey of computing hardware provided information regarding important industry standards and candidate computing platforms. It was determined that the total SCF computing requirements might be most effectively met using a hierarchy consisting of shared and individual resources. This hierarchy is composed of five major system types: (1) a supercomputer class vector processor; (2) a high-end scalar multiprocessor workstation; (3) a file server; (4) a few medium- to high-end visualization workstations; and (5) several low- to medium-range personal graphics workstations. Specific recommendations for meeting the needs of each of these types are presented.

  17. International standard problem (ISP) No. 41. Containment iodine computer code exercise based on a radioiodine test facility (RTF) experiment

    International Nuclear Information System (INIS)

    2000-04-01

    International Standard Problem (ISP) exercises are comparative exercises in which predictions of different computer codes for a given physical problem are compared with each other or with the results of a carefully controlled experimental study. The main goal of ISP exercises is to increase confidence in the validity and accuracy of the tools which were used in assessing the safety of nuclear installations. Moreover, they enable code users to gain experience and demonstrate their competence. The ISP No. 41 exercise, a computer code exercise based on a Radioiodine Test Facility (RTF) experiment on iodine behaviour in containment under severe accident conditions, is one such ISP exercise. The ISP No. 41 exercise arose from a recommendation at the Fourth Iodine Chemistry Workshop held at PSI, Switzerland, in June 1996: 'the performance of an International Standard Problem as the basis of an in-depth comparison of the models as well as contributing to the database for validation of iodine codes'. [Proceedings NEA/CSNI/R(96)6, Summary and Conclusions NEA/CSNI/R(96)7]. COG (CANDU Owners Group), comprising AECL and the Canadian nuclear utilities, offered to make the results of a Radioiodine Test Facility (RTF) test available for such an exercise. The ISP No. 41 exercise was endorsed in turn by the FPC (PWG4's Task Group on Fission Product Phenomena in the Primary Circuit and the Containment), PWG4 (CSNI Principal Working Group on the Confinement of Accidental Radioactive Releases), and the CSNI. The OECD/NEA Committee on the Safety of Nuclear Installations (CSNI) has sponsored forty-five ISP exercises over the last twenty-four years, thirteen of them in the area of severe accidents. The criteria for the selection of the RTF test as a basis for the ISP-41 exercise were: (1) complementarity with other RTF tests available through the PHEBUS and ACE programmes, (2) simplicity for ease of modelling, and (3) good quality data. A simple RTF experiment performed under controlled

  18. DIMAC program user's manual

    International Nuclear Information System (INIS)

    Lee, Byoung Oon; Song, Tae Young

    2003-11-01

    DIMAC (A DIspersion Metallic fuel performance Analysis Code) is a computer program for simulating the behavior of dispersion fuel rods under normal operating conditions of HYPER. It computes the one-dimensional temperature distribution and the thermo-mechanical characteristics of the fuel rod under steady-state operating conditions, including swelling and rod deformation. DIMAC was developed based on experience with research reactor fuel. DIMAC consists of the temperature calculation module, the mechanical swelling calculation module, and the fuel deformation calculation module, in order to predict the deformation of a dispersion fuel as a function of power history. Because little characterization data is available for U-TRU-Zr or TRU-Zr, the material data of U-Pu-Zr or Pu-Zr are used in their place. This report is mainly intended as a user's manual for the DIMAC code. The general description of this code, the description of the input parameters, the description of each subroutine, the sample problem, and the sample input and partial output are written in this report
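
    For orientation, the one-dimensional radial temperature distribution in a cylindrical fuel element with uniform volumetric heat generation has the familiar closed form T(r) = T_s + q'''(R^2 - r^2)/(4k). The sketch below evaluates that textbook expression with invented property values; it does not reproduce DIMAC's dispersion-fuel models or correlations.

      # Illustrative 1-D radial temperature profile for a cylindrical fuel element
      # with uniform heat generation (placeholder properties, not DIMAC data).
      def radial_profile(q_vol_W_m3, radius_m, k_W_mK, T_surface_C, n=5):
          points = []
          for i in range(n + 1):
              r = radius_m * i / n
              T = T_surface_C + q_vol_W_m3 * (radius_m**2 - r**2) / (4.0 * k_W_mK)
              points.append((r, T))
          return points

      for r, T in radial_profile(3.0e8, 3.0e-3, 15.0, 380.0):
          print(f"r = {r*1000:.2f} mm  T = {T:.1f} C")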

  19. DIMAC program user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Byoung Oon; Song, Tae Young

    2003-11-01

    DIMAC (A DIspersion Metallic fuel performance Analysis Code) is a computer program for simulating the behavior of dispersion fuel rods under normal operating conditions of HYPER. It computes the one-dimensional temperature distribution and the thermo-mechanical characteristics of the fuel rod under steady-state operating conditions, including swelling and rod deformation. DIMAC was developed based on experience with research reactor fuel. DIMAC consists of the temperature calculation module, the mechanical swelling calculation module, and the fuel deformation calculation module, in order to predict the deformation of a dispersion fuel as a function of power history. Because little characterization data is available for U-TRU-Zr or TRU-Zr, the material data of U-Pu-Zr or Pu-Zr are used in their place. This report is mainly intended as a user's manual for the DIMAC code. The general description of this code, the description of the input parameters, the description of each subroutine, the sample problem, and the sample input and partial output are written in this report.

  20. User's manual of Tokamak Simulation Code

    International Nuclear Information System (INIS)

    Nakamura, Yukiharu; Nishino, Tooru; Tsunematsu, Toshihide; Sugihara, Masayoshi.

    1992-12-01

    The user's manual for the Tokamak Simulation Code (TSC), which simulates the time evolution of the deformable motion of an axisymmetric toroidal plasma, is summarized here. For use on the JAERI computer system, the TSC is linked with the data management system GAEA. This manual focuses on the procedures for input and output using the GAEA system. Model equations describing the axisymmetric motion, an outline of the code system, and the optimal method to obtain a well-converged solution are also described. (author)

  1. PFP MICON maintenance manual. Revision 1

    International Nuclear Information System (INIS)

    Silvan, G.R.

    1995-01-01

    This manual covers the use of maintenance displays, maintenance procedures, system alarms and common system failures. This manual is intended to supplement the MICON maintenance training, not replace it. It also assumes that the user is familiar with the normal operation of the MICON A/S system. The MICON system is a distributed control computer and, among other things, controls the HVAC system for the Plutonium Finishing Plant

  2. Program management system manual

    International Nuclear Information System (INIS)

    1989-08-01

    OCRWM has developed a program management system (PMS) to assist in organizing, planning, directing and controlling the Civilian Radioactive Waste Management Program. A well defined management system is necessary because: (1) the Program is a complex technical undertaking with a large number of participants, (2) the disposal and storage facilities to be developed by the Program must be licensed by the Nuclear Regulatory Commission (NRC) and hence are subject to rigorous quality assurance (QA) requirements, (3) the legislation mandating the Program creates a dichotomy between demanding schedules of performance and a requirement for close and continuous consultation and cooperation with external entities, (4) the various elements of the Program must be managed as parts of an integrated waste management system, (5) the Program has an estimated total system life cycle cost of over $30 billion, and (6) the Program has a unique fiduciary responsibility to the owners and generators of the nuclear waste for controlling costs and minimizing the user fees paid into the Nuclear Waste Fund. This PMS Manual is designed and structured to facilitate strong, effective Program management by providing policies and requirements for organizing, planning, directing and controlling the major Program functions

  3. PROCESS DESIGN MANUAL FOR SLUDGE TREATMENT AND DISPOSAL

    Science.gov (United States)

    The purpose of this manual is to provide the engineering community and related industry with a new source of information to be used in the planning, design, and operation of present and future wastewater pollution control facilities. This manual supplements this existing knowledg...

  4. PUNCH. GENIE MK.2.2 manual

    International Nuclear Information System (INIS)

    David, W.I.F.; Johnson, M.W.; Knowles, K.J.; Crosbie, G.D.; Graham, S.P.; Campbell, E.P.; Lyall, J.S.

    1986-01-01

    GENIE is a language for spectrum manipulation and display that has been developed to satisfy the data-analysis requirements for all the neutron-scattering instruments at the spallation neutron source. The manual contains: the GENIE 'keyboard' commands, the GENIE 'GCL' commands, command file examples, and the addition of non-standard facilities. (U.K.)

  5. The environmental survey manual: Appendix D

    International Nuclear Information System (INIS)

    1987-08-01

    The purpose of this manual is to provide guidance to the Survey and Sampling and Analysis teams that conduct the one-time Environmental Survey of the major US Department of Energy operating facilities. This appendix contains procedures for chemical analysis of organics, inorganics, and radioisotopes

  6. Environmental Measurements Laboratory (EML) procedures manual

    International Nuclear Information System (INIS)

    Chieco, N.A.; Bogen, D.C.; Knutson, E.O.

    1990-11-01

    Volume 1 of this manual documents the procedures and existing technology that are currently used by the Environmental Measurements Laboratory. A section devoted to quality assurance has been included. These procedures have been updated and revised and new procedures have been added. They include: sampling; radiation measurements; analytical chemistry; radionuclide data; special facilities; and specifications. 228 refs., 62 figs., 37 tabs. (FL)

  7. 33 CFR 385.28 - Operating Manuals.

    Science.gov (United States)

    2010-07-01

    ... Processes § 385.28 Operating Manuals. (a) General provisions. (1) The Corps of Engineers and the non-Federal... emergencies that can be expected to occur at a project are: drowning and other accidents, failure of the operation facilities, chemical spills, treatment plant failures and other temporary pollution problems...

  8. Thermal model of laser-induced skin damage: computer program operator's manual. Final report, September 1976--April 1977

    Energy Technology Data Exchange (ETDEWEB)

    Takata, A.N.

    1977-12-01

    A user-oriented description is given of a computer program for predicting temperature rises, irreversible damage, and degree of burns caused to skin by laser exposures. This report describes the parameters necessary to run the program and provides suggested values for the parameters. Input data are described in detail as well as the capabilities and limitations of the program. (Author)
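
    Thermal skin-damage models of this kind commonly quantify irreversible damage with an Arrhenius damage integral; whether this particular program uses exactly that formulation is an assumption here, and the coefficients in the sketch below are placeholders rather than values from the report.

        # Sketch of the Arrhenius damage-integral approach commonly used in
        # thermal skin-damage models: Omega = integral of A*exp(-Ea/(R*T(t))) dt.
        # Whether this program uses exactly this form is an assumption, and the
        # coefficients are placeholders, not values taken from the report.
        import numpy as np

        A_FREQ = 3.1e98   # Arrhenius frequency factor, 1/s (placeholder)
        EA = 6.28e5       # activation energy, J/mol (placeholder)
        R_GAS = 8.314     # gas constant, J/(mol*K)

        def damage_integral(times_s, temps_k):
            """Trapezoidal integration of the Arrhenius damage rate over a
            temperature history; Omega >= 1 is often read as irreversible damage."""
            t = np.asarray(times_s, dtype=float)
            rate = A_FREQ * np.exp(-EA / (R_GAS * np.asarray(temps_k, dtype=float)))
            return float(np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t)))

        # Hypothetical exposure: skin held at 331 K (about 58 C) for 5 seconds.
        t = np.linspace(0.0, 5.0, 500)
        T = np.full_like(t, 331.0)
        print(f"Omega = {damage_integral(t, T):.2f}")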

  9. STAT, GAPS, STRAIN, DRWDIM: a system of computer codes for analyzing HTGR fuel test element metrology data. User's manual

    Energy Technology Data Exchange (ETDEWEB)

    Saurwein, J.J.

    1977-08-01

    A system of computer codes has been developed to statistically reduce Peach Bottom fuel test element metrology data and to compare the material strains and fuel rod-fuel hole gaps computed from these data with HTGR design code predictions. The codes included in this system are STAT, STRAIN, GAPS, and DRWDIM. STAT statistically evaluates test element metrology data yielding fuel rod, fuel body, and sleeve irradiation-induced strains; fuel rod anisotropy; and additional data characterizing each analyzed fuel element. STRAIN compares test element fuel rod and fuel body irradiation-induced strains computed from metrology data with the corresponding design code predictions. GAPS compares test element fuel rod-fuel hole heat transfer gaps computed from metrology data with the corresponding design code predictions. DRWDIM plots the measured and predicted gaps and strains. Although specifically developed to expedite the analysis of Peach Bottom fuel test elements, this system can be applied, without extensive modification, to the analysis of Fort St. Vrain or other HTGR-type fuel test elements.
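
    The strain reduction and measured-versus-predicted comparison performed by STAT and STRAIN can be pictured with a minimal sketch; the field names and numbers below are hypothetical and do not reflect the actual Peach Bottom data format or the codes' statistical treatment.

        # Minimal sketch of the kind of reduction STAT/STRAIN perform:
        # irradiation-induced strain from pre- and post-irradiation diameters,
        # then a comparison against a design-code prediction.  All values and
        # names are hypothetical, not the actual Peach Bottom data format.
        pre_irradiation_mm = [12.700, 12.698, 12.702]   # measured rod diameters
        post_irradiation_mm = [12.760, 12.755, 12.762]  # same locations, post-test
        predicted_strain = 0.0042                       # design-code prediction

        strains = [(post - pre) / pre
                   for pre, post in zip(pre_irradiation_mm, post_irradiation_mm)]
        mean_strain = sum(strains) / len(strains)

        print(f"mean measured strain : {mean_strain:.4%}")
        print(f"predicted strain     : {predicted_strain:.4%}")
        print(f"measured/predicted   : {mean_strain / predicted_strain:.2f}")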

  10. The highway capacity manual a conceptual and research history

    CERN Document Server

    Roess, Roger P

    2014-01-01

    Since 1950, the Highway Capacity Manual has been a standard used in the planning, design, analysis, and operation of virtually any highway traffic facility in the United States. It has also been widely used abroad, and has spurred the development of similar manuals in other countries. The twin concepts of capacity and level of service have been developed in the manual, and methodologies have been presented that allow highway traffic facilities to be designed on a common basis, and allow for the analysis of operational quality under various traffic demand scenarios. The manual also addresses related pedestrian, bicycle, and transit issues. This book details the fundamental development of the concepts of capacity and level of service, and of the specific methodologies developed to describe them over a wide range of facility types. The book comprises two volumes. Volume 1 (this book) focuses on the development of basic principles, and their application to uninterrupted flow facilities: freeways, multila...

  11. EQ3NR, a computer program for geochemical aqueous speciation-solubility calculations: Theoretical manual, user's guide, and related documentation (Version 7.0); Part 3

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T.J.

    1992-09-14

    EQ3NR is an aqueous solution speciation-solubility modeling code. It is part of the EQ3/6 software package for geochemical modeling. It computes the thermodynamic state of an aqueous solution by determining the distribution of chemical species, including simple ions, ion pairs, and complexes, using standard state thermodynamic data and various equations which describe the thermodynamic activity coefficients of these species. The input to the code describes the aqueous solution in terms of analytical data, including total (analytical) concentrations of dissolved components and such other parameters as the pH, pHCl, Eh, pe, and oxygen fugacity. The input may also include a desired electrical balancing adjustment and various constraints which impose equilibrium with special pure minerals, solid solution end-member components (of specified mole fractions), and gases (of specified fugacities). The code evaluates the degree of disequilibrium in terms of the saturation index (SI = log Q/K) and the thermodynamic affinity (A = -2.303 RT log Q/K) for various reactions, such as mineral dissolution or oxidation-reduction in the aqueous solution itself. Individual values of Eh, pe, oxygen fugacity, and Ah (redox affinity) are computed for aqueous redox couples. Equilibrium fugacities are computed for gas species. The code is highly flexible in dealing with various parameters as either model inputs or outputs. The user can specify modification or substitution of equilibrium constants at run time by using options on the input file.
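
    The two quantities named in the abstract, the saturation index SI = log Q/K and the thermodynamic affinity A = -2.303 RT log Q/K, are straightforward to evaluate once Q and K are known. The sketch below uses hypothetical values of Q and K for a single reaction; it is not drawn from the EQ3NR data files.

        # Saturation index and thermodynamic affinity as defined in the abstract:
        # SI = log10(Q/K) and A = -2.303*R*T*log10(Q/K).
        # Q and K values below are hypothetical inputs, not EQ3NR data.
        import math

        R_GAS = 8.314462618  # J/(mol*K)

        def saturation_index(q, k):
            return math.log10(q / k)

        def affinity_j_per_mol(q, k, temp_k):
            return -2.303 * R_GAS * temp_k * math.log10(q / k)

        q_activity_product = 2.5e-9   # hypothetical ion activity product
        k_equilibrium = 1.0e-8        # hypothetical equilibrium constant
        T = 298.15                    # temperature, K

        si = saturation_index(q_activity_product, k_equilibrium)
        aff = affinity_j_per_mol(q_activity_product, k_equilibrium, T)
        print(f"SI = {si:.3f}  (negative => undersaturated)")
        print(f"A  = {aff / 1000.0:.2f} kJ/mol")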

  12. EQ6, a computer program for reaction path modeling of aqueous geochemical systems: Theoretical manual, user's guide, and related documentation (Version 7.0); Part 4

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T.J.; Daveler, S.A.

    1992-10-09

    EQ6 is a FORTRAN computer program in the EQ3/6 software package (Wolery, 1979). It calculates reaction paths (chemical evolution) in reacting water-rock and water-rock-waste systems. Speciation in aqueous solution is an integral part of these calculations. EQ6 computes models of titration processes (including fluid mixing), irreversible reaction in closed systems, irreversible reaction in some simple kinds of open systems, and heating or cooling processes, as well as solving 'single-point' thermodynamic equilibrium problems. A reaction path calculation normally involves a sequence of thermodynamic equilibrium calculations. Chemical evolution is driven by a set of irreversible reactions (i.e., reactions out of equilibrium) and/or changes in temperature and/or pressure. These irreversible reactions usually represent the dissolution or precipitation of minerals or other solids. The code computes the appearance and disappearance of phases in solubility equilibrium with the water. It finds the identities of these phases automatically. The user may specify which potential phases are allowed to form and which are not. There is an option to fix the fugacities of specified gas species, simulating contact with a large external reservoir. Rate laws for irreversible reactions may be either relative rates or actual rates. If any actual rates are used, the calculation has a time frame. Several forms for actual rate laws are programmed into the code. EQ6 is presently able to model both mineral dissolution and growth kinetics.
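
    The statement that actual rates give the calculation a time frame can be illustrated with a toy explicit time integration of a single dissolving mineral under a simplified rate law of the form r = k S (1 - Q/K). This is not one of the rate-law forms actually programmed into EQ6, and all constants below are hypothetical.

        # Toy reaction-path step: explicit time integration of one mineral
        # dissolving under the simplified rate law r = k*S*(1 - Q/K).
        # Not an EQ6 rate-law form; every constant here is hypothetical.
        rate_const = 1.0e-9    # rate constant, mol/(m^2*s)
        surface_area = 0.5     # reactive surface area, m^2 per kg of water
        k_eq = 1.0e-4          # equilibrium constant
        dt = 3600.0            # time step, s
        n_steps = 24

        dissolved = 1.0e-6     # mol/kg of the dissolved component
        for step in range(n_steps):
            q_over_k = dissolved / k_eq            # crude activity-product proxy
            rate = rate_const * surface_area * (1.0 - q_over_k)
            dissolved = max(dissolved + rate * dt, 0.0)

        print(f"dissolved after {n_steps} h: {dissolved:.3e} mol/kg")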

  13. EQ3NR, a computer program for geochemical aqueous speciation-solubility calculations: Theoretical manual, user's guide, and related documentation (Version 7.0)

    International Nuclear Information System (INIS)

    Wolery, T.J.

    1992-01-01

    EQ3NR is an aqueous solution speciation-solubility modeling code. It is part of the EQ3/6 software package for geochemical modeling. It computes the thermodynamic state of an aqueous solution by determining the distribution of chemical species, including simple ions, ion pairs, and complexes, using standard state thermodynamic data and various equations which describe the thermodynamic activity coefficients of these species. The input to the code describes the aqueous solution in terms of analytical data, including total (analytical) concentrations of dissolved components and such other parameters as the pH, pHCl, Eh, pe, and oxygen fugacity. The input may also include a desired electrical balancing adjustment and various constraints which impose equilibrium with special pure minerals, solid solution end-member components (of specified mole fractions), and gases (of specified fugacities). The code evaluates the degree of disequilibrium in terms of the saturation index (SI = log Q/K) and the thermodynamic affinity (A = -2.303 RT log Q/K) for various reactions, such as mineral dissolution or oxidation-reduction in the aqueous solution itself. Individual values of Eh, pe, oxygen fugacity, and Ah (redox affinity) are computed for aqueous redox couples. Equilibrium fugacities are computed for gas species. The code is highly flexible in dealing with various parameters as either model inputs or outputs. The user can specify modification or substitution of equilibrium constants at run time by using options on the input file.

  14. EQ6, a computer program for reaction path modeling of aqueous geochemical systems: Theoretical manual, user's guide, and related documentation (Version 7.0)

    International Nuclear Information System (INIS)

    Wolery, T.J.; Daveler, S.A.

    1992-01-01

    EQ6 is a FORTRAN computer program in the EQ3/6 software package (Wolery, 1979). It calculates reaction paths (chemical evolution) in reacting water-rock and water-rock-waste systems. Speciation in aqueous solution is an integral part of these calculations. EQ6 computes models of titration processes (including fluid mixing), irreversible reaction in closed systems, irreversible reaction in some simple kinds of open systems, and heating or cooling processes, as well as solving 'single-point' thermodynamic equilibrium problems. A reaction path calculation normally involves a sequence of thermodynamic equilibrium calculations. Chemical evolution is driven by a set of irreversible reactions (i.e., reactions out of equilibrium) and/or changes in temperature and/or pressure. These irreversible reactions usually represent the dissolution or precipitation of minerals or other solids. The code computes the appearance and disappearance of phases in solubility equilibrium with the water. It finds the identities of these phases automatically. The user may specify which potential phases are allowed to form and which are not. There is an option to fix the fugacities of specified gas species, simulating contact with a large external reservoir. Rate laws for irreversible reactions may be either relative rates or actual rates. If any actual rates are used, the calculation has a time frame. Several forms for actual rate laws are programmed into the code. EQ6 is presently able to model both mineral dissolution and growth kinetics.

  15. Manual for reactor produced radioisotopes

    International Nuclear Information System (INIS)

    2003-01-01

    numerous new developments that have taken place since then. Hence in this manual it was decided to focus only on reactor produced radioisotopes. This manual contains procedures for 48 important reactor-produced isotopes. These were contributed by major radioisotope producers from different parts of the world and are based on their practical experience. In the case of widely used radioisotopes such as 131I, 32P and 99Mo, information from more than one centre is included so that the users can compare the procedures. As in the earlier two versions, a general introductory write-up is included covering basic information on related aspects such as target irradiation, handling facilities, radiation protection and transportation, but in less detail. Relevant IAEA publications on such matters, particularly related to radiation protection and transportation, should be referred to for guidelines. Similarly, the nuclear data contained in the manual are only indicative and the relevant databases should be referred to for more authentic values. It is hoped that the manual will be a useful source of information for those working in radioisotope production laboratories as well as those intending to initiate such activities.

  16. TAP 2: Performance-Based Training Manual

    International Nuclear Information System (INIS)

    1993-08-01

    The cornerstone of safe operation of DOE nuclear facilities is personnel performing the day-to-day functions that accomplish the facility mission. Performance-based training is fundamental to that safe operation. This manual has been developed to support the Training Accreditation Program (TAP) and to assist contractors in their efforts to develop performance-based training programs. It provides contractors with narrative procedures on performance-based training that can be modified and incorporated for facility-specific application. It is divided into sections dealing with analysis, design, development, implementation, and evaluation.

  17. PETSc Users Manual Revision 3.7

    Energy Technology Data Exchange (ETDEWEB)

    Balay, S.; Brune, P.; Buschelman, K.; Gropp, W.; Karpeyev, D.; Kaushik, D.; Knepley, M.; McInnes, L. Curfman; Rupp, K.; Smith, B.; Zhang, H.; Abhyankar, S.; Adams, M.; Dalcin, L.; Zampini, S.; Zhang, H.

    2016-04-01

    This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication.
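
    A minimal linear solve in the spirit of the building blocks the manual describes is sketched below using the petsc4py Python bindings (an assumption made for illustration; the manual itself documents the C API). It assembles a 1-D tridiagonal Laplacian and solves A x = b with a default Krylov solver on a single process.

        # Minimal PETSc linear solve via the petsc4py bindings (illustrative;
        # the manual documents the C API).  Builds a 1-D tridiagonal Laplacian
        # and solves A x = b with a default KSP solver, serially.
        from petsc4py import PETSc

        n = 10
        # Sparse AIJ matrix with three nonzeros preallocated per row.
        A = PETSc.Mat().createAIJ([n, n], nnz=3, comm=PETSc.COMM_SELF)
        for i in range(n):
            A.setValue(i, i, 2.0)
            if i > 0:
                A.setValue(i, i - 1, -1.0)
            if i < n - 1:
                A.setValue(i, i + 1, -1.0)
        A.assemble()

        b = PETSc.Vec().createSeq(n)   # right-hand side, all ones
        b.set(1.0)
        x = b.duplicate()              # solution vector

        ksp = PETSc.KSP().create(comm=PETSc.COMM_SELF)
        ksp.setOperators(A)
        ksp.setFromOptions()           # e.g. pass -ksp_type cg on the command line
        ksp.solve(b, x)
        print("residual norm:", ksp.getResidualNorm())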

  18. PETSc Users Manual Revision 3.8

    Energy Technology Data Exchange (ETDEWEB)

    Balay, S. [Argonne National Lab. (ANL), Argonne, IL (United States); Abhyankar, S. [Argonne National Lab. (ANL), Argonne, IL (United States); Adams, M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Brown, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Brune, P. [Argonne National Lab. (ANL), Argonne, IL (United States); Buschelman, K. [Argonne National Lab. (ANL), Argonne, IL (United States); Dalcin, L. D. [King Abdullah Univ. of Science and Technology, Thuwal (Saudi Arabia); Eijkhout, V. [Univ. of Texas, Austin, TX (United States); Gropp, W. [Argonne National Lab. (ANL), Argonne, IL (United States); Kaushik, D. [Argonne National Lab. (ANL), Argonne, IL (United States); Knepley, M. [Argonne National Lab. (ANL), Argonne, IL (United States); May, D. [ETH Zurich (Switzerland); McInnes, L. Curfman [Argonne National Lab. (ANL), Argonne, IL (United States); Munson, T. [Argonne National Lab. (ANL), Argonne, IL (United States); Rupp, K. [Argonne National Lab. (ANL), Argonne, IL (United States); Sanan, P. [Univ. of Italian Switzerland, Lugano (Switzerland); Smith, B. [Argonne National Lab. (ANL), Argonne, IL (United States); Zampini, S. [King Abdullah Univ. of Science and Technology, Thuwal (Saudi Arabia); Zhang, H. [Illinois Inst. of Technology, Chicago, IL (United States); Zhang, H. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-09-01

    This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication.

  19. THEAP-I: A computer program for thermal hydraulic analysis of a thermally interacting channel bundle of complex geometry. Code description and user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Bartzis, J G; Megaritou, A; Belessiotis, V

    1987-09-01

    THEAP-I is a computer code developed at NRCPS 'DEMOCRITUS' with the aim of contributing to the safety analysis of open pool research reactors. THEAP-I is designed for three-dimensional, transient thermal/hydraulic analysis of a thermally interacting channel bundle totally immersed in water or air, such as the reactor core. In the present report the mathematical and physical models and the methods of solution are given, as well as the code description and the input data. A sample problem is also included, referring to the analysis of the Greek Research Reactor under a hypothetical severe loss-of-coolant accident.

  20. RELAP4/MOD5: a computer program for transient thermal-hydraulic analysis of nuclear reactors and related systems. User's manual. Volume II. Program implementation

    International Nuclear Information System (INIS)

    1976-06-01

    A discussion is presented of the use of the RELAP4/MOD5 computer program in simulating the thermal-hydraulic behavior of light-water reactor systems when subjected to postulated transients such as a LOCA, pump failure, or nuclear excursion. The volume is divided into main sections which cover: (1) program description, (2) input data, (3) problem initialization, (4) user guidelines, (5) output discussion, (6) source program description, (7) implementation requirements, (8) data files, (9) description of PLOTR4M, (10) description of STH20, (11) summary flowchart, (12) sample problems, (13) problem definition, and (14) problem input